Consumer Reports Fails to Earn MacBook Pro Credibility

MacBook Pro setup (Image credit: iMore)

Update, January 12, 2017: Consumer Reports now recommends the MacBook Pro.

Consumer Reports has now finished retesting the battery life on Apple MacBook Pro laptops, and our results show that a software update released by Apple on January 9 fixed problems we'd encountered in earlier testing. With the updated software, the three MacBook Pros in our labs all performed well, with one model running 18.75 hours on a charge. We tested each model multiple times using the new software, following the same protocol we apply to hundreds of laptops every year.

No specific author is credited with the update, which seems strange. Consumer Reports also still doesn't seem to acknowledge the problem with the original article, which raised as many questions about their testing and publishing methodology as it did about MacBook Pro battery life.

Also: Can I get a MacBook Pro that runs 18.75 hours? Pretty please?

Update, January 10, 2017: Apple provided me with the following statement on Consumer Reports' test:

"We appreciate the opportunity to work with Consumer Reports over the holidays to understand their battery test results," Apple told iMore. "We learned that when testing battery life on Mac notebooks, Consumer Reports uses a hidden Safari setting for developing web sites which turns off the browser cache. This is not a setting used by customers and does not reflect real-world usage. Their use of this developer setting also triggered an obscure and intermittent bug reloading icons which created inconsistent results in their lab. After we asked Consumer Reports to run the same test using normal user settings, they told us their MacBook Pro systems consistently delivered the expected battery life. We have also fixed the bug uncovered in this test. This is the best pro notebook we've ever made, we respect Consumer Reports and we're glad they decided to revisit their findings on the MacBook Pro."

Update, December 23, 2016: Apple's head of worldwide marketing, Phil Schiller, posted the following statement on Twitter:

"Working with CR to understand their battery tests. Results do not match our extensive lab tests or field data. #MacBookPro"

Here's hoping Consumer Reports shares their test method with Apple so the results can be vetted and, if necessary, fixes can be made. That's what's best for the consumer.

There's been a lot of discussion around MacBook Pro battery life. For some people it's been fine. For others, problematic. Some reviewers have had a great time. Others have had a bad experience. Apple is sticking to their original estimates, but given the ongoing debate, it's something that needs a deeper look. Sadly, Consumer Reports hasn't done that. If anything, they've only increased confusion.

In a series of three consecutive tests, the 13-inch model with the Touch Bar ran for 16 hours in the first trial, 12.75 hours in the second, and just 3.75 hours in the third. The 13-inch model without the Touch Bar worked for 19.5 hours in one trial but only 4.5 hours in the next. And the numbers for the 15-inch laptop ranged from 18.5 down to 8 hours.

What was the test?

For the battery test, we download a series of 10 web pages sequentially, starting with the battery fully charged, and ending when the laptop shuts down. The web pages are stored on a server in our lab, and transmitted over a WiFi network set up specifically for this purpose. We conduct our battery tests using the computer's default browser—Safari, in the case of the MacBook Pro laptops.
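That protocol boils down to a rundown loop. Here's a minimal sketch of one in Python, assuming hypothetical lab URLs, a 30-second think time, and a log file whose last entry approximates runtime; none of those specifics come from Consumer Reports:

```python
# Sketch of a web-browsing battery rundown loop, in the spirit of the
# protocol Consumer Reports describes. The page list, fetch interval,
# and log path are assumptions for illustration, not CR's actual setup.
import time
import urllib.request
from itertools import cycle

PAGES = [f"http://lab-server.local/page{i}.html" for i in range(10)]  # hypothetical lab URLs
LOG = "rundown.log"  # last line written before shutdown ~= runtime

start = time.monotonic()
for url in cycle(PAGES):            # loop until the battery dies
    with urllib.request.urlopen(url) as resp:
        resp.read()                 # actually pull the page down
    with open(LOG, "a") as log:     # reopen each time so the entry survives power loss
        hours = (time.monotonic() - start) / 3600
        log.write(f"{hours:.2f}h {url}\n")
    time.sleep(30)                  # assumed think time between pages
```

Simple enough, which is exactly why wildly different runtimes from the same loop on the same hardware should prompt a hunt for the variable, not a headline.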

Was it because some tests used Chrome instead of Safari, which previous tests have shown can greatly reduce battery life?

Once our official testing was done, we experimented by conducting the same battery tests using a Chrome browser, rather than Safari. For this exercise, we ran two trials on each of the laptops, and found battery life to be consistently high on all six runs. That's not enough data for us to draw a conclusion, and in any case a test using Chrome wouldn't affect our ratings, since we only use the default browser to calculate our scores for all laptops. But it's something that a MacBook Pro owner might choose to try.

If I were running the tests, that right there would be a red flag. A huge, glowing, neon red flag.

Those results make very little sense and I'd take apart my chain, link by link, until I found out what was going on. I'd check and re-check my tests, I'd watch the systems like a hawk, and I'd do everything possible to find what was causing the variance. I'd even — gasp — try testing different machines and something other than web pages to see if that revealed more information.

Inconsistent results from battery life tests aren't, for responsible publications, a reason to rush out a headline in time for the holidays. They're a reason to start questioning everything, and to diligently retrace every step along the way, until you can get repeatable, reputable results.
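To put a number on "repeatable," here's a quick sanity check over the trial figures Consumer Reports published above; the relative-spread threshold is my own arbitrary choice, not any lab's standard:

```python
# Repeatability check on the trial numbers CR published. The 25%
# relative-spread threshold is an illustrative cutoff, not an
# industry standard.
from statistics import mean, stdev

trials = {
    '13" Touch Bar':    [16.0, 12.75, 3.75],
    '13" no Touch Bar': [19.5, 4.5],
    '15"':              [18.5, 8.0],
}

for model, hours in trials.items():
    m = mean(hours)
    spread = stdev(hours) / m  # coefficient of variation
    flag = "RETEST" if spread > 0.25 else "ok"
    print(f"{model}: mean {m:.1f}h, relative spread {spread:.0%} -> {flag}")
```

Every one of those models fails that check by a mile. That's the point at which a test stops measuring the product and starts measuring the test.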

What did Consumer Reports do?

However, with the widely disparate figures we found in the MacBook Pro tests, an average wouldn't reflect anything a consumer would be likely to experience in the real world. For that reason, we are reporting the lowest battery life results, and using those numbers in calculating our final scores. It's the only time frame we can confidently advise a consumer to rely on if he or she is planning to use the product without access to an electrical outlet.
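Just to show how much that choice of statistic moves the headline number, compare the minimum against the mean for the Touch Bar trials quoted earlier:

```python
# How the choice of statistic changes the headline number, using the
# Touch Bar trial figures CR published.
from statistics import mean

touch_bar_trials = [16.0, 12.75, 3.75]  # hours, from CR's quoted results

print(f"reported (minimum): {min(touch_bar_trials):.2f}h")
print(f"average:            {mean(touch_bar_trials):.2f}h")
# minimum 3.75h vs. average ~10.83h: the score rests on one outlier run
```

Reporting the worst run isn't conservative when the worst run is the anomaly; it just bakes the unexplained variance into the score.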

As someone who's been using a new MacBook Pro since the event back in October, and seldom with an outlet nearby, I'd laugh at that if I wasn't so busy crying. Then again, I know how to use Activity Monitor... My anecdote isn't data, though, and neither is Consumer Reports'.

Sadly, we now live in a world filled with manufactured controversies and, quite often, fake news. It's fake claims about real sapphire, cancelled watch apps that ship on time, and the perpetual rush not just to find the next "gate" but, in many cases, to create it.

"Bendgate" and "chipgate" showed there was blood in the pageview water, so now the click sharks are circling.

Now, I don't think Consumer Reports is faking news here, but I do think they're after attention more than they are answers. Otherwise, I think they would have taken the time to figure out what happened, why, and presented something truly useful. Sadly, I don't think that's their primary concern anymore. And it's why I stopped reading Consumer Reports years ago. (Yes, even their Samsung Galaxy waterproofing report.)

These days, if I'm interested in battery life tests, I go to AnandTech or Ars Technica, where they show their work, explain their methods, and often take whatever time is required to get real answers before hitting publish. Same for other areas. I look to the experts who don't settle for confusion but demand clarity.

If there is something wrong with the MacBook Pro battery, then I want to know about it. Just saying you got inconsistent results is as valuable as telling me it takes 1, 4, or 12 hours to cook a turkey – not at all. I can get food poisoning or burn a bird on my own, thanks.

Rene Ritchie
Contributor

Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.