Apple responds to battery life concerns with its A9 SoCs

This site may earn affiliate commissions from the links on this page. Terms of use.

Yesterday, we covered reports from concerned iPhone 6s and 6s Plus owners, who have seen markedly different results between those devices built on Samsung’s 14nm node and those using TSMC’s 16nm. Apple has since released a statement covering these concerns in greater detail than we initially alluded to yesterday, and it’s worth considering how the company’s statements fit into the overall picture. Apple’s statement is reprinted below:

With the Apple-designed A9 chip in your iPhone 6s or iPhone 6s Plus, you are getting the most advanced smartphone chip in the world. Every chip we ship meets Apple’s highest standards for providing incredible performance and deliver great battery life, regardless of iPhone 6s capacity, color, or model.

Certain manufactured lab tests which run the processors with a continuous heavy workload until the battery depletes are not representative of real-world usage, since they spend an unrealistic amount of time at the highest CPU performance state. It’s a misleading way to measure real-world battery life. Our testing and customer data show the actual battery life of the iPhone 6s and iPhone 6s Plus, even taking into account variable component differences, vary within just 2-3% of each other.

Of benchmarks and battery life

Apple has a point when it says that benchmarks often don’t track the real-world experience of actually using a device. The primary purpose of most benchmarks is to gather performance data, and modern benchmarking has its roots firmly in the pre-smartphone era, when battery life wasn’t a concern for desktops and workstations. Even now, many battery life tests amount to “Repeat this workload until the phone dies.”

Whether you use a light or heavy workload on a phone can have a profound impact on its battery life — and, by extension, on how the phone tests in comparison to other devices. Anandtech made this point in their own investigation:

Compare the iPhone 5s against the iPhone 6. The iPhone 6’s battery is 16% larger than the iPhone 5s’s, but the iPhone 6’s light usage run-time is almost 30% longer than the iPhone 5s. Clearly, the later silicon is more power efficient. Under heavy load, however, the iPhone 6’s larger battery only manages to equal the iPhone 5s’s total run-time — not exceed it. Meanwhile, the iPhone 6 Plus’s heavy run time is worse than the Galaxy Note 5’s, but more than 90 minutes better in light usage.

This is why it’s impossible to dismiss Apple’s response as “You’re holding it wrong,” despite the tone-deaf way the company communicated its statement. If a battery test doesn’t accurately capture the way people use the phone, it’s a bad benchmark. It may accurately measure power consumption between two devices in a stated workload, but the entire point of such workloads is to actually capture real-world conditions.

Thus far, the battery tests that have been floated involve looping a JavaScript test and Geekbench’s fixed-load test, which apparently stresses the iPhone 6s Plus at a fairly constant 30% load. Neither of these is particularly representative of real-world conditions. In fact, in the one test we’ve seen where a real-world load was used (60 minutes of video playback), both iPhones lost the same amount of battery. This implies that in at least some conditions, power consumption between the two devices is essentially identical.

Heat and variability

There are two potential factors that could cause the Samsung-fabbed devices to exhibit shorter run times under load than their TSMC equivalents. The first, which we alluded to in our initial article, is heat. Transistors that are packed together more tightly naturally concentrate more heat into smaller areas. There’s a clear and known relationship between heat and power consumption — leakage current rises with temperature — and while the exact relationship varies from chip to chip and node to node, it’s well established that temperature has a significant impact.

The second factor that comes into play here is variability. It’s important to understand that while we talk about Apple building an A9 processor the same way we might discuss Ford building an engine, there are some critical differences between the two. When TSMC, Intel, or Samsung builds a wafer of chips, it doesn’t automatically “know” what kind of chips it has. Each company tests its silicon to determine how good (or bad) the wafer is. Good chips are those that can run at the target voltage and clock speeds within desired power consumption levels. Great chips are those that run at dramatically lower power consumption or hit higher clock speeds, while bad chips are those that consume too much power or simply can’t reach target frequencies.

Each company has different methods of recovering useful dies from poor samples, whether that means disabling some of the cache or one of the cores, or using the chip in a desktop system where battery power isn’t such a concern. The important thing to understand is that variability has been getting steadily worse with every product generation. To understand why, consider a hypothetical scenario in which a “good” transistor contains between 100 and 200 atoms of a dopant material, a “great” transistor contains between 140 and 160 atoms, and a bad transistor (one that won’t meet desired specifications) has either fewer than 100 or more than 200. In this example, these numbers correspond to an older process node — say, 45nm.

Now, imagine this same situation with very different numbers. In our second example, a good transistor contains between 20 and 40 atoms of a doping material, a great transistor has between 28 and 32 atoms, and a bad transistor is any transistor with fewer than 20 or more than 40. It’s much, much harder to control the distribution of 20 atoms than it is to control the distribution of 100. And since 14nm chips have many more transistors than 45nm chips, it’s not just a question of tighter control — you have to be closer to perfect to keep failure rates under control. This is why modern chips are sometimes designed with built-in logic redundancy — if one component of a chip doesn’t pass muster, you’ve got a duplicate unit ready to go.
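Rough numbers illustrate why the second scenario is so much harder. If dopant counts fluctuate randomly — roughly a Poisson process, so the spread grows like the square root of the mean — the relative variation at ~30 atoms is more than twice what it is at ~150, and far more transistors land outside the spec window. A quick simulation using the hypothetical windows above (all figures illustrative, not real process data):

```python
import math
import random

random.seed(0)

def out_of_spec_rate(mean_atoms, lo, hi, trials=100_000):
    """Model dopant counts as randomly fluctuating around a target
    (normal approximation to Poisson: sigma = sqrt(mean)) and count
    the fraction of transistors that fall outside the spec window."""
    sigma = math.sqrt(mean_atoms)
    bad = sum(1 for _ in range(trials)
              if not (lo <= random.gauss(mean_atoms, sigma) <= hi))
    return bad / trials

# Older node: target ~150 atoms, spec window 100-200
old = out_of_spec_rate(150, 100, 200)
# Newer node: target ~30 atoms, spec window 20-40
new = out_of_spec_rate(30, 20, 40)

# Relative spread: sqrt(150)/150 = 8.2% vs sqrt(30)/30 = 18.3%
print(f"relative spread old: {math.sqrt(150)/150:.1%}, new: {math.sqrt(30)/30:.1%}")
# The old window sits ~4 sigma from target; the new one under 2 sigma,
# so the newer node fails far more often.
print(f"out-of-spec old: {old:.2%}, new: {new:.2%}")
```

Per transistor, the old window sits about four standard deviations out (essentially zero failures), while the new window sits under two — and a modern chip multiplies that per-transistor failure rate across billions of devices.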

Here’s what this means, in aggregate: While we are certain that Apple still strictly targets certain ranges for its parts, we’d expect to see greater variation in run-time and battery life between TSMC and Samsung hardware, because even a company as legendarily strict as Apple has to accept the laws of physics.

What does this mean for TSMC vs. Samsung?

Thus far, Apple’s official position is that there is no difference between TSMC and Samsung devices. We suspect that if the company breaks from this stance, it will be because of heat differences between the two devices rather than performance metrics. There are subtle ways to adjust performance to cut down on skin temperature, and it may be possible to create power rules for the Samsung devices that differ from those used for TSMC.

The one thing we’ll stand by is that this variation is almost certainly why Apple was forced to dual-source its hardware in the first place. What will be interesting is seeing whether or not this issue continues with later iterations of the phone. Samsung and TSMC are both consistently improving yields on 16/14nm, which means we’ll see those improvements reflected in devices — even if Apple never announces that its later products have better power consumption or lower temperatures than the earlier ones.

It seems like users are seeing poor battery life no matter which fab made the chip, rendering the benchmarks less than, say, useful. Not good enough for what is supposed to be a premium device.

Reginald Peebottom

I own a 6 Plus and before that a 5 and a 4s. They were and are great devices. But…there are a lot of issues that they have and that Apple seemingly can skate away from without much trouble. Either Apple ignores the issue and says something crazy like the infamous “you’re holding it wrong” dismissal, or the issue appears to get no play at all (or very little) in the mainstream media. That latter phenomenon is puzzling given the ubiquity of the iPhone – it’s hyper-popular, it’s everywhere. And yet, nary a bad word is said.

The email search function has been buggy since at least iOS 7. I haven’t upgraded to 9 yet but I presume it’s still not fixed.

Joel Hruska

I’m curious. What have you seen w/r/t email search?

Joel Hruska

I mean, I think that’s very relative. Like you, I’ve wished for a long time that manufacturers would make battery-optimized versions of devices. I understand that Apple / Samsung / Whoever might not want to do this for everything, but I’d love to have an iPhone 5S shrunk down to 14nm — just a straight die shrink, no performance boosts, but all the benefits of lower power consumption and longer battery life. Don’t focus on making the device thinner, add back a few millimeters and give me a longer-lasting battery.

With rare exception, however, companies just don’t do this. So I bought a 10,000 milliamp battery that can charge my phone 3-4x over, and I carry that instead.

Bruce Wayne

You’re charging it wrong…

jimv1983

The problem with the benchmarks for battery life is that they are in no way an actual indicator of real world battery life.

Even on Wi-Fi with the cell radios turned off and the screen brightness at the lowest setting, there is no way a 2,750mAh battery is going to power a 5.5″ 1080p IPS LCD (and everything else) for almost 14 hours. If that were true, then after one hour of web browsing on an iPhone 6s Plus the battery would still be at like 95 or 96%. That means the phone would have only used 110-137.5mAh in an entire hour. The screen probably uses that much all by itself in half the time.
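The comment’s arithmetic is easy to sanity-check. Taking the 2,750mAh capacity and the ~14-hour benchmark figure at face value:

```python
# Back-of-envelope check of the figures in the comment above: a
# 2,750 mAh battery and a claimed ~14-hour web-browsing runtime.
battery_mah = 2750
claimed_hours = 14

# Average current the phone could draw and still last 14 hours.
avg_current_ma = battery_mah / claimed_hours
print(f"implied average draw: {avg_current_ma:.0f} mA")  # ~196 mA

# Battery consumed by one hour of browsing at that average rate.
drain_pct = avg_current_ma / battery_mah * 100
print(f"per hour: {drain_pct:.1f}% of capacity")  # ~7.1%
```

A 14-hour runtime implies roughly a 196mA average draw, or about 7% of the battery per hour — so after an hour the phone would read closer to 93% than the 95-96% the comment cites, but the qualitative objection stands: that average has to cover the display as well as the SoC.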

The biggest power draw on any phone is the screen (and, indirectly, the GPU). By comparison, the CPU uses almost no power at all.

Rik MaxSpeed

I’m currently working on an app for iPhone, and what I’ve discovered is that Apple adjusts the clock speed depending on the system load.

If you use a low-CPU app, for example reading an e-book, the iPhone will automatically reduce the CPU’s speed until you turn a page, then ramp it up to cope with the animated graphics.

This also explains Apple’s response to what it claims are unrepresentative battery tests. It’s a bit as though you measured a car’s miles per gallon only by driving it up Mount Washington!
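The behavior this commenter describes is standard dynamic voltage and frequency scaling (DVFS): the OS picks a clock speed based on recent load, so an idle reader sips power while a page-turn animation briefly runs flat out. A toy governor illustrates the idea (frequency steps and the 80% utilization threshold are invented for illustration, not Apple’s actual policy):

```python
def pick_frequency(load, freqs=(400, 800, 1200, 1850)):
    """Toy DVFS governor: given a load in [0, 1] measured at peak
    frequency, choose the lowest frequency step (MHz) that keeps
    estimated utilization under ~80%. Illustrative numbers only."""
    demand_mhz = load * max(freqs)  # work demanded, in MHz-equivalents
    for f in freqs:
        if demand_mhz <= 0.8 * f:
            return f
    return max(freqs)  # saturated: run flat out

# Reading an e-book: tiny load -> lowest step
print(pick_frequency(0.05))  # 400
# Page-turn animation: brief spike -> top step
print(pick_frequency(0.95))  # 1850
```

This is also why a fixed heavy-load benchmark pins the chip at its top step for hours — a state a real user only visits in bursts.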
