Recently, Consumer Reports’ MacBook Pro tests were in the news
Lab tests rarely correlate with real-world usage
Standardised tests are only useful for rough comparisons
Apple’s new MacBook Pros received a fair amount of criticism for making life difficult for professionals – removing useful features like the SD card slot and prioritising thinness over performance. The criticism was reinforced when the new 2016 MacBook Pro was marked down by Consumer Reports for exhibiting erratic battery life in its tests. Consumer Reports is an 80-year-old independent, non-profit organisation that tests everything from electronics and cars to health products and kitchenware, and its results are taken pretty seriously.
That’s why this rating was shocking for both Apple and Mac users around the world, as MacBooks have generally not shown a huge discrepancy between Apple’s claims and real-world testing in the past. It was the first time Consumer Reports had declined to recommend an Apple MacBook Pro.
The Cupertino hardware maker was quick to work with the organisation to figure out whether there was a problem with the way the laptops were tested. It turned out that a Safari bug – triggered by a hidden developer setting Consumer Reports uses to disable caching during tests – caused the erratic battery life figures. Apple released a fix, Consumer Reports re-ran its tests, and the laptops finally earned the publication’s recommendation.
But if you look at the updated results, Consumer Reports says the 13-inch MacBook Pro with Touch Bar ran for an average of 15.75 hours, the 13-inch MacBook Pro without Touch Bar for 18.75 hours, and the 15-inch MacBook Pro for 17.25 hours. These figures are far beyond the 10-hour ‘wireless browsing’ battery life Apple claims on its official website for all three models. So what is going on here?
The answer lies in Consumer Reports’ standardised process for checking battery life on all laptops it tests. Here’s a quote from its blog post:
Here’s how our battery test works: We download a series of 10 web pages repeatedly, starting with the battery fully charged, and ending when the laptop shuts down. The web pages are stored on a server in our lab and transmitted over a dedicated WiFi network. We conduct our battery tests using the browser that is native to the computer’s operating system—Safari, in the case of the MacBook Pro laptops.
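A rundown test like the one Consumer Reports describes can be sketched roughly as follows. This is a hypothetical illustration, not the organisation’s actual harness: the battery is simulated so the loop terminates, whereas a real test would load each page in the native browser and run until the laptop shuts down.

```python
# Hypothetical sketch of a rundown battery test in the style Consumer
# Reports describes: cycle through 10 lab-hosted pages until the battery
# gives out. The battery drain is simulated here so the sketch terminates;
# a real harness would fetch and render each URL in the native browser
# and poll the OS battery status instead.

PAGES = [f"http://lab-server.local/page{i}.html" for i in range(1, 11)]

def run_battery_test(battery_pct=100.0, drain_per_page=0.5):
    """Count page loads completed before the simulated battery is empty."""
    loads = 0
    while battery_pct > 0:
        for url in PAGES:
            if battery_pct <= 0:
                break
            # A real test would load and render `url` here.
            battery_pct -= drain_per_page
            loads += 1
    return loads

# With these made-up drain numbers, a full charge lasts 200 page loads,
# i.e. 20 complete passes over the 10-page set.
print(run_battery_test())
```

The key property of such a test is not realism but repeatability: the same pages, the same server, the same browser, every time.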
In comparison, this is the fine print at the bottom of Apple’s specifications page for the new MacBook Pros:
“The wireless web test measures battery life by wirelessly browsing 25 popular websites with display brightness set to 12 clicks from bottom, or 75%.”
This sheds some light on why Consumer Reports got such high readings. For one, in its blog post, Consumer Reports says the display brightness was kept at 100 nits. Given that the new MacBook Pros have a maximum brightness of 500 nits, that works out to roughly 20 percent brightness – much lower than the 75 percent Apple used in its tests.
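The back-of-envelope brightness comparison works out as follows, assuming for illustration that the brightness scale maps linearly to nits (a simplification – macOS brightness steps are not perfectly linear):

```python
# Rough comparison of the two test brightness settings, assuming a
# linear brightness-to-nits scale (a simplification for illustration).

MAX_NITS = 500  # rated peak brightness of the 2016 MacBook Pro panel

cr_nits = 100                        # Consumer Reports' setting
cr_pct = cr_nits / MAX_NITS * 100    # 20.0 percent of maximum

apple_pct = 75                           # Apple's stated setting
apple_nits = apple_pct / 100 * MAX_NITS  # 375.0 nits

print(f"CR tests at about {cr_pct:.0f}% brightness; "
      f"Apple tests at roughly {apple_nits:.0f} nits")
```

On this rough reckoning, Apple’s test screen is putting out nearly four times the light of Consumer Reports’, which alone goes a long way towards explaining the gap.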
Next, the discrepancy could also be down to the choice of sites being loaded, as different sites consume battery at very different rates. Lastly, in Consumer Reports’ test the pages were served by a local server in the lab. Apple’s methodology is more vaguely defined, so its tests are likely fetching pages over the Internet rather than from a local server, which would draw somewhat more power.
But neither set of figures is close to what people experience in the real world. Why is that? Because in the real world, your mileage will always vary.
Consumer Reports’ figures aren’t meant to predict what a typical customer will experience – the organisation is trying to create a fair test that allows battery life comparisons across a wide range of laptops from different manufacturers. At Gadgets 360, we do something similar in our reviews – the only difference is that our reviews also include a lot of real-world testing, along with details about how we used the devices and why.
Lab tests exist to make comparisons between devices fairer. But once you’re actually using a device, the real figures can differ hugely from what a lab test produces, and both rating processes and consumers need to take this into account.
Here’s one example – battery life can vary wildly depending on which browser you use. From personal, anecdotal experience, Chrome on macOS is not as well-optimised as Safari: you can visibly notice the difference in battery life between the two browsers on a Mac. Surface Book owners see the same behaviour when comparing Microsoft’s Edge browser with Chrome.
In the real world, people have their own browser preferences. Maybe Mac owners use Chrome because they want their browsing history synced with Chrome on their smartphone. Maybe they’re compelled to use Chrome because certain extensions aren’t available for Safari. Maybe they prefer Safari over Chrome for features like Reader mode. Or maybe a Windows 10 user prefers Microsoft Edge for its annotation feature. Or maybe they use Firefox.
In the real world, people are probably using more than one browser, more than 25 browser tabs, and far more apps running background processes than any simulated test. No wonder real-world usage of any computing device isn’t on par with lab test figures. Some companies like Apple choose to create simulations that are closer to the real world, while others like LG use a 10-year-old battery testing app to claim their laptop runs for a whopping 23 hours. But none of these figures will give you a clear idea of how long the battery will last for you – because that depends on what you do with the computer.