PC Makers Still Not Being Totally Truthful With Battery Life Claims

It's a topic we cover often because it's near and dear to our hearts.
It's battery life, and it's finally starting to get the attention that
it has long deserved in the industry. We've seen article after article
point out that battery life claims on laptops are flawed in one way or
another, and AMD's own Pat Moorhead is stepping forward this week with
some facts and figures to back that up.

For whatever reason, AMD has taken a strong, vocal stance against
shoddy battery life claims. The company has had it with PC manufacturers
pumping out notebook after notebook with outlandish claims that the
machines can't possibly deliver on. Take any laptop of your own, for example. Has it ever
fully lived up to the claims on the box? Have you ever considered
returning it because it didn't? The answer to both questions is
probably "no" considering that you know very well that any machine you
received in return would have the same problem: lofty promises,
lackluster delivery.

AMD has been calling for manufacturers to settle on a standardized
testing process for battery life in order to give the public a better
idea of exactly how long one machine will last compared to another. As
it stands, consumers have no real idea how a PC maker tests battery
life, and thus can't accurately compare one machine to another.
You can pretty much bet that a Gateway with five hours of claimed
battery life and a Dell with five hours of claimed battery life will
actually run out of gas at different points, even if they're both
tested the same way at home.

The company has also pushed for a new way of displaying
battery life. Pat likens the "single battery life figure" you see on
boxes to buying a car with only a city MPG figure; the latter would
never fly, so why does the former? Pat also took the time to really
examine some of the newer back-to-school circulars floating about while
paying particularly close attention to the verbiage used to describe
battery life. Below are his observations in full:

23% increase over the prior two weeks in the number of SKUs advertised
with battery life (34 to 42 SKUs). I observed 23 SKUs advertising
battery life, or an inference to it, during the week of 8/10/09, and 19
the week of 8/17/09. (See raw data at very end of blog.)

He points out that in all of his research, he still found just a
single battery life figure being advertised, as if a computer drains
at the same rate whether it's playing a DVD or running a screensaver.
He also found that Apple notebooks "never list battery life," though we
will say that it's easy to find that figure on the company's website,
and since Apple's machines are sold in only a few places, anywhere you
buy one will have knowledgeable staff to fill you in. Still, no
listing at all on the box is ludicrous.


He closes things out by restating that a single battery life test is
needed across the industry. Getting PC makers to buy in, however,
remains a tall task. After all, each maker wants their machine to seem
like the best, and by running their own tests under their own
supervision, they can basically reach any conclusion they want. What do
you think of all this? Do you just automatically adjust whatever figure
you see down by 10 to 30 percent to get a more realistic view? And
should consumers really have to do that just to get a solid grasp on
battery life?