MacBook Pro (2018): Apple finds, fixes performance BUG!

Apple has sent me the following statement to address the recent controversy surrounding the 2018 MacBook Pro and how it manages power and performance:

"Following extensive performance testing under numerous workloads, we've identified that there is a missing digital key in the firmware that impacts the thermal management system and could drive clock speeds down under heavy thermal loads on the new MacBook Pro. A bug fix is included in today's macOS High Sierra 10.13.6 Supplemental Update and is recommended. We apologize to any customer who has experienced less than optimal performance on their new systems. Customers can expect the new 15-inch MacBook Pro to be up to 70% faster, and the 13-inch MacBook Pro with Touch Bar to be up to 2X faster, as shown in the performance results on our website."

The controversy started with Dave Lee, Dave2D on YouTube, who experienced really bad, and as it now turns out really buggy, performance on the new 2018 MacBook Pro with a specific Adobe Premiere workload.

While Apple couldn't initially reproduce the results, the company spent the last few days working with him to try and figure out what was going wrong.

The fix, which Apple will make available via Software Update around the time this video goes live, and follow up on with a push notification, won't just benefit people with those worst-case-scenario workloads. It should help with all workloads on the Coffee Lake MacBook Pro.

That's despite Apple insisting its own benchmarks, run prior to release and touting up to 70% improvements in some tasks, weren't affected by the bug and are still accurate. The same goes for the workloads and results of the video, photography, music, science, and developer experts the company hosted and made available to media during the MacBook Pro launch.

Real-world tests

My own tests with my own workloads, which skew heavily towards video, showed about as much of a performance increase over the 2017 MacBook Pro as the 2017 showed over the 2016. Maybe a little more in some cases.

That aligns with Jonathan Morrison of TLD, who put the new machines through the most comprehensive real-world tests I've seen so far.

In some cases, it's up to 50% better. In other cases, just a few minutes here, a few minutes there. That might not seem like much to someone who only renders a couple of videos a week. To someone who renders a couple of videos every 15 minutes — which isn't uncommon at a production house — it makes all the difference in the world.

Especially when you have a director, artist, or client on the other end with far less time than money chomping at the bit to iterate and sign off on every shot.

That's what makes real-world testing so important. Downloading Intel Power Gadget and throwing up a video, blog post, or Reddit thread without understanding benchmarks, CPU vs. GPU load, what's hitting an accelerator versus what's hitting AVX2, what's being measured and how frequently, whether the tool is up to date or tuned for the system it's running on, or how the tool itself might affect the results, ends up contributing to the noise, not to the useful pool of data points. That goes double if it's just done to get attention or spout off conspiracy theories.
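To make that concrete, here's a minimal, hypothetical sketch of the difference between grabbing an instantaneous frequency reading and actually measuring sustained throughput over time. It isn't Apple's methodology or Dave2D's, just a plain Swift loop that times fixed windows of made-up busy work (the burnCycles helper, the 5-second windows, and the 12-window run are all arbitrary choices for illustration); if the machine throttles under sustained load, the work-per-window number drops as the run goes on.

    import Foundation

    // Hypothetical throttling probe (not Apple's or Dave2D's methodology):
    // rather than reading an instantaneous frequency, time how much busy work
    // the CPU finishes in fixed windows. If sustained load forces clocks down,
    // the work-per-window count drops across the run.
    func burnCycles(_ iterations: Int) -> Double {
        var x = 1.000001
        for _ in 0..<iterations {
            x = (x * 1.000001).truncatingRemainder(dividingBy: 10.0) + 1.0
        }
        return x
    }

    let windowSeconds = 5.0
    for window in 1...12 {                       // roughly one minute of sustained load
        let start = Date()
        var unitsOfWork = 0
        while Date().timeIntervalSince(start) < windowSeconds {
            _ = burnCycles(1_000_000)
            unitsOfWork += 1
        }
        print("Window \(window): \(unitsOfWork) work units in \(Int(windowSeconds))s")
    }

It's single-core and deliberately dumb. Real workloads also hit the GPU, accelerators, and AVX2, which is exactly why context matters when reading anyone's numbers, including these.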

And, by the way, if all of that sounded like a bunch of jargon… or Dothraki to you, that's because it is. It's the stuff old-fashioned computer geeks lived for, but it's increasingly meaningless to modern mainstream customers.

The new silicon normal

We're living in an age where Moore's law — or more often House's Law — which predicted performance would double roughly every 18 months, is dead or dying. And, as computing continues to become more mobile, and pro-level computing more mainstream, aggressive thermal management in constrained enclosures is something we're all going to have to come to terms with.

It's the reason why Apple doesn't break out things like RAM or frequencies on iPhone or iPad and why, I think, Apple is increasingly viewing the Intel chips inside the Mac as an implementation detail. At least until it's ready with an alternative.

Sure, in a perfect world, I think Apple and everyone else would have far preferred it if Intel weren't so far behind on its roadmap: if Cannon Lake had actually followed Skylake, if the process shrink had happened on schedule, if the tick had continued to tock, and if we'd never had optimization cycles like Kaby Lake, Coffee Lake, and whatever else gets crammed in between, or more cores used as a fallback for performance gains.

Given that, I totally get how a few people who prefer power over portability, and don't really get how Apple's product development process works, would have vastly preferred the MacBook Pro go thick, with a couple of F-22 Raptor vents welded onto a 17-inch chassis, so frequencies never fell below base.

But Apple seems to think the iMac, and especially the iMac Pro, better cover those requirements, and it wants to keep its pro portable… really portable.

More to come

I was originally planning to post my review yesterday, but, turns out it's going to take a couple more days. In the meantime, I'd love to know what you think. Hit the comments below.
