Quick! What do laptops and the Chevy Volt have in common? If you answered “bogus battery claims,” you got it. Batteries are more capable than ever, but their merits are being wildly oversold. Nine hours of battery life? 230 MPG? Who authorizes this crap? Marketing departments have made battery life a race to the top of Bullshit Mountain, and the unaware consumer is paying for it with disappointment. We think it’s time consumers were offered more reasonable expectations, and we have a solution to do just that.
A growing chorus
The issue of bizarro-land battery marketing was brought to the fore when AMD’s SVP and CMO Nigel Dessau called shenanigans in March.
“Most PC battery time metrics are achieved by looking at how long the battery lasted running a benchmark called MobileMark® 2007 (MM07). This is a rating of battery life when your PC is running on average less than 5% utilized – or fundamentally idle,” he wrote. “Most PC makers don’t even turn Wi-Fi on for this test. Is this realistic based on how you use your PC?”
“If I want to know how long my battery is going to last, I want to know how long it’s going to last with me using it, not with it idle or doing nothing.”
Icrontic’s own Cliff Forster echoed Dessau’s sentiments when he wondered why consumers are willing to put up with the bogus numbers shoveled onto the market.
“Why is this happening? The truth will baffle you: OEMs test laptop battery life with critical components shut down. That means no WiFi and no advanced graphics,” he wrote. “To put it simply, today’s battery life claims are based on a bogus low power user profile. How do we allow this to go unchecked?”
Icrontic was certainly not alone in its opinion; the Wall Street Journal, the New York Times, Computerworld and even Lenovo acknowledged Dessau’s observations as a legitimate problem.
“We don’t really like the fact that something is supposed to get four hours and users routinely say, ‘We divide that number by two and that’s what we get,'” said Lenovo segment marketing manager David Critchley.
Gathering some data
We wanted to go a step beyond punditry and tackle the issue for ourselves. Could we develop a battery life rating that more accurately depicted the consumer’s experience? After discussion, we thought it best to create three separate ratings for gaming, multimedia, and productivity.
To see if we were on the right track, we hit the Internet with a poll (now closed) that asked a simple question: What kind of laptop user are you? We let our readers, Twitter users, and corporate folk tell us what kind of user they are, and the answer surprised us.
Next, we asked respondents to rate the importance of eleven distinct tasks on a scale of 1-5, with one being the least important. We made sure to cover a wide spectrum: gaming, music, movies, browsing, blogging, pictures and writing are just some of the tasks we asked users to tell us about. We asked these questions to ensure that our users’ usage habits reflected the categories they selected: Did productivity tasks outweigh multimedia tasks? Was gaming really that unimportant? This is what we found:
Interpreting the data
Now that we’ve presented the numbers as they are, we wanted to draw a few conclusions about what we found and highlight a few interesting results in the data:
- Internet browsing is far and away the most important task. With 245 votes of 5, or “most important,” it is more than twice as important to users as document creation, which received the next-highest share of fives at 113 votes.
- Of the five tasks rated “most important” by those polled, Internet browsing and document creation (productivity) together received more votes than the remaining three combined.
- Even if you slant the numbers by adding all four multimedia options (CD music, digital music, DVD/Blu-ray movies and digital movies) together, the productivity category still receives more “most important” votes.
- Gaming is a complete bust on the notebook. It received more votes of 1, or “not at all important,” than it did of 4 or 5 (“very important” and “most important”).
- Interesting: Digital music (iTunes/other) was deemed “most important” by users at a rate of more than 2:1 over music stored on a CD.
- Interesting: We didn’t figure people would be doing much file sharing on their notebook, but nearly half of those polled gave it a 4 or a 5.
Most importantly, the specific usage habits offered by users in the importance polls reinforce the results of the “What type of laptop user are you?” question. Users clearly prefer productivity to multimedia, and multimedia to gaming. This is what we expected going into our analysis, and we’re pleased to see strong indicators that endorse the point.
Reacting to the data
Having sufficient cause to believe that notebook users fit into one of three archetypes, our next goal was to create a testing methodology that convincingly replicates their usage habits. In addition to a few benchmarks we feel are qualified to do that, we also have a few ideas about how the test results could be marketed to consumers.
The gamer
No matter which way you slice it, 3D gaming on a notebook was seriously unimportant to our respondents. In our data, it was 1.5-8 times more likely to receive a “one” than any other task we inquired about. Even so, the 22.8% of our respondents who awarded it a 4 or a 5 need to be considered, and gaming notebooks alone make the case for battery tests that accommodate them.
Simple battery mechanics tell us that a battery’s run time is a function of load, which means the particular workload that produces the load is largely immaterial. Accepting that, we propose the following:
- Screen brightness 90%.
- WiFi enabled.
- Bluetooth enabled.
- Loop 3DMark Vantage (3DMark 2006 for Windows XP Netbooks) on default settings until the battery expires.
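The “run time is a function of load” point above can be sketched with back-of-the-envelope arithmetic. The helper below is purely illustrative, and the capacity and draw figures in the example are hypothetical, not measurements from any specific notebook:

```python
def estimated_runtime_minutes(capacity_wh: float, avg_load_w: float) -> float:
    """Naive single-charge estimate: stored energy divided by average draw.

    capacity_wh: battery capacity in watt-hours
    avg_load_w:  average system power draw in watts under the test workload
    """
    if capacity_wh <= 0 or avg_load_w <= 0:
        raise ValueError("capacity and load must be positive")
    return capacity_wh / avg_load_w * 60.0


# Hypothetical figures: a 56 Wh pack under a sustained 28 W 3D load
print(estimated_runtime_minutes(56, 28))  # 120.0 minutes
```

The same hypothetical pack under a 9 W near-idle load would come out to more than 370 minutes, which is exactly how an idle-heavy benchmark run can triple the figure a gamer will actually see.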
The rationale for these test choices is as follows:
Firstly, an informal poll conducted by Neowin reveals that 62% of 1,163 respondents run their display at 80-100% brightness. We have decided that splitting the difference at 90% is the most appropriate course of action.
Secondly, the importance people placed on browsing the Internet makes an active WiFi radio an indispensable part of the laptop experience.
Thirdly, we have previously identified that the current method of expressing a battery’s single-charge run time as an “up to X minutes” statement is inappropriate. We feel that a more honest testing ecosystem would make it into a “more than X minutes” statement. That is to say, testing should reflect a worst-case scenario. While Bluetooth is by no means prevalent, we feel it is appropriate to leave it enabled given this philosophy.
Lastly, we have chosen 3DMark because it creates a repeatable, consistent GPU load across a wide variety of GPUs and operating systems. Given that all new systems are shipping with Windows Vista and will soon come with Windows 7, a DirectX 10.x test seems most appropriate when considering the “more than” battery life philosophy. We make a lone exception for the Netbook with Windows XP, which cannot run DirectX 10 code; in these cases, 3DMark 2006 is the right choice.
The multimedia user
We have established that the multimedia user likes movies, music and pictures, so it would make sense to select a test that includes these tasks. To that end, we believe that the “Memories,” “TV and Movies” and “Music” suites from PCMark Vantage provide a consistent and repeatable platform that focuses on these activities. These three suites should be looped until the battery expires.
We continue to maintain that an active WiFi radio, an active Bluetooth radio, and a screen brightness of 90% are fundamental to breathing honesty into battery marketing.
The drawback to PCMark Vantage is that it is not compatible with systems that run Windows XP, as the Netbook may for quite some time. We are interested in PCMark 2005’s text, picture, video and 2D tests, but we are open to alternatives that may offer a superior routine.
The productivity user
We have established that the productivity user likes writing, reading and browsing. Once again we believe that PCMark Vantage can provide the appropriate tests with the “Productivity” and “Communications” suite.
These two PCMark suites specifically focus on networking, document creation and browsing, all of which we know to be fundamental tasks to the productivity user. The tests should be looped until the battery expires.
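The loop-until-expiry step can be harnessed with a small wrapper. The sketch below is a hypothetical harness, assuming the benchmark suite exposes a command-line launcher (`cmd` stands in for whatever invocation the suite provides); on battery the loop simply runs until power loss cuts it off, and the last logged line approximates the single-charge run time:

```python
import subprocess
import time


def run_battery_loop(cmd, max_runs=None, log_path="battery_log.txt"):
    """Rerun a benchmark command back to back, logging elapsed minutes.

    In a real battery test max_runs stays None: the loop ends only when the
    battery dies mid-run. max_runs exists so the harness can be dry-run on
    mains power. Returns total elapsed minutes.
    """
    start = time.time()
    runs = 0
    while max_runs is None or runs < max_runs:
        subprocess.run(cmd, check=True)  # e.g. the suite's CLI launcher
        runs += 1
        minutes = (time.time() - start) / 60.0
        with open(log_path, "a") as log:
            log.write(f"pass {runs}: {minutes:.1f} min\n")
    return (time.time() - start) / 60.0
```

Because the log is appended after every pass, the harness loses at most one benchmark iteration of precision when the machine dies, which fits the worst-case “more than X minutes” philosophy.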
We continue to maintain that an active WiFi radio, an active Bluetooth radio, and a screen brightness of 90% are fundamental to breathing honesty into battery marketing.
Lastly, we are once again open to suggestions on an alternative that will appropriately test systems configured with Windows XP.
Selling the tests to users
If anything, Apple’s “it just works” mantra has proven that consumers don’t care about the “how” so much as they care about the “what.” Users clearly care about what they’re getting, but they aren’t interested in the minutiae of getting from A to B. In short, developing a new testing methodology means diddly if marketing doesn’t work to create more realistic expectations.
To that end, we propose a sticker that not only offers the battery performance for each type of user, but also offers a “combined” rating for users who may engage in a mixture of activities.
In the example logo to the right, we made the combined rating the mean of the three sub-scores, but the midpoint between the gaming and multimedia scores may be an appropriate choice as well.
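For concreteness, here is how either combined number could be computed. The per-profile minute values are made up for illustration and are not results from any tested machine:

```python
from statistics import mean, median

# Hypothetical per-profile battery results, in minutes
scores = {"gaming": 90, "multimedia": 150, "productivity": 240}

# Mean of the three sub-scores, as on the example sticker
combined_mean = mean(scores.values())

# Alternative: the midpoint between the gaming and multimedia scores
# (median of two values is simply their average)
combined_mid = median([scores["gaming"], scores["multimedia"]])
```

With these sample numbers the mean comes out to 160 minutes, while the gaming/multimedia midpoint yields a more conservative 120 minutes.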
Going forward
Many of the Icrontic staff are equipped with MSI Wind U100 or U120 units for trade shows, and I can guarantee you that none of us has ever, even with SpeedStep and 10% brightness, come within a stone’s throw of the six hours scribbled on the box. If users with handcrafted battery profiles and modded BIOS ROMs cannot get the advertised performance, something is seriously rotten in Denmark.
That stink has overstayed its welcome. Consumers have been bamboozled and dazzled by increasingly outrageous battery claims for years, and it’s time to make a change. We’ve proven that it’s possible, and now we’re calling on OEMs to make it happen: Use realistic tests, leash your marketing teams, and let’s work together for a more realistic battery rating.
Correction: The original run of this article indicated that one respondent, or .003% of those polled, reported that they did not fit in any of the offered categories. The correct number is 0.35%, but this reference has been removed for accuracy and clarity. Please also note that our study has a margin of error of approximately 5.83%. We apologize for the confusion.
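The stated margin of error is consistent with the standard 95%-confidence formula for a sampled proportion. The sample size below (~283 respondents) is our own inference from the figures above, not a published number:

```python
import math


def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% confidence half-width for a proportion (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)


# Roughly 283 respondents reproduces the article's ~5.83% figure
print(round(margin_of_error(283) * 100, 2))  # 5.83
```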