The A15 is a much better, more advanced architecture than the A9. It is so power efficient and fast that even a dual-core A15 can easily beat a quad-core A9 in every aspect. TI released a video comparing the A15 and quad-core A9: http://www.youtube.com/watch?v=neayNcAQIXY

Not quite. It's somewhere between A9 and A15, much like Scorpion was somewhere between A8 and A9. That said, dual-core Krait @ 1.5 GHz is still much more than twice as fast as dual-core A9 @ 800 MHz, so even taking Apple's statements at face value, this isn't going to top the frustratingly ubiquitous MSM8960.

What's to steal? Apple is a fabless licensee. They buy pre-designed CPU cores from ARM and pre-designed GPU cores from Imagination Technologies and slap them together. They aren't like Qualcomm, who design their own architecture that implements the ARMv7 instruction set; all they "design" is the interconnects.

Or as I saw it put once: they don't "design" the Ax SoCs, they just create "layouts." The same is true of Samsung and TI, of course.

Anand, the Cortex A15 is significantly larger than the A9 (it's out-of-order, after all). Even with the shrink to 32nm, I don't see how Apple could double graphics performance (which almost always requires roughly doubling the graphics transistor count) AND move to Cortex A15, all while making the die _22%_ smaller. Not possible.

They were unclear on the 22% figure. It could be the package or the die that has shrunk, or both. All they said was that the "chip" is 22% smaller. We won't know for a while yet; Chipworks needs to get one so they can tear the A6 out and analyze it.

However, a straight shrink from 45 nm to 32 nm would give a 28.9% reduction in each linear dimension (assuming the architecture is identical). Since a die shrink scales pretty much linearly in both directions, the die would end up at about 50.5% of its original area. See the iPad 2,4 review: http://www.anandtech.com/show/5789/the-ipad-24-rev...
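The shrink arithmetic above can be sketched in a few lines. This is idealized linear scaling; real designs never shrink perfectly (analog blocks and I/O pads scale poorly), so treat the numbers as an upper bound:

```python
# Idealized die-shrink arithmetic for a 45 nm -> 32 nm process move.
# Assumes every feature scales linearly with the process node, which
# real chips only approximate.

old_node = 45.0  # nm
new_node = 32.0  # nm

linear_scale = new_node / old_node      # fraction of each dimension kept
linear_reduction = 1.0 - linear_scale   # shrink per dimension
area_scale = linear_scale ** 2          # area scales with the square

print(f"linear reduction per dimension: {linear_reduction:.1%}")  # 28.9%
print(f"area remaining after shrink:    {area_scale:.1%}")        # 50.6%
```

By this estimate a perfectly shrunk A5 would be roughly half its 45 nm size, which is why a die that's only 22% smaller implies something on it grew substantially.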

Given that, SOMETHING got way bigger here if we're talking about a 22% smaller die. Either the GPU exploded, which is possible if they used an SGX543MP4 like the iPad 3, or the A15 cores are far larger than the A9s. I can't venture a guess on which of those is correct since we know nothing about the GPU yet.

So, basically: yes, it's entirely possible the cores are A15s. Anand wouldn't have said they were A15s unless he was sure; he doesn't make a habit of speculating on anything.

I looked at the iPad 2,4 review just before you posted this, and given that the 32 nm A5 is about 57% of the size of the 45 nm version, it does seem plausible that this is both dual-core A15 and an upgrade to the SGX543MP4.

Brian just said on his Twitter that he's pretty sure it's an SGX543MP3, which actually makes a lot of sense: it saves on die area and idle power, and you can clock the GPU higher to reach the 2x performance claim. I'd never thought of using a 3-core GPU, but it's not that far-fetched an idea. Very plausible.
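As a rough sanity check on the MP3 theory (assuming GPU throughput scales roughly linearly with both core count and clock, which is an idealization), the clock bump a 3-core part would need over the 4S's 2-core part to hit a 2x claim:

```python
# Hypothetical: clock increase needed for a 3-core SGX543 to deliver
# 2x the throughput of the 2-core part at its original clock.
# Assumes perfectly linear scaling with cores and clock.

old_cores, new_cores = 2, 3
target_speedup = 2.0

# throughput ~ cores * clock, so:
# new_cores * clock_ratio = target_speedup * old_cores
clock_ratio = target_speedup * old_cores / new_cores

print(f"required clock increase: {clock_ratio:.2f}x")  # 1.33x
```

A ~33% GPU clock bump is well within reach on a node shrink, which is why the MP3-at-higher-clock explanation is plausible.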

It's a Samsung-made chip; I think that much is undisputed. It's worth noting, then, that even Samsung hasn't yet announced (let alone shipped) a product with Cortex A15 processors in it. The Galaxy Note II is quad-core A9.

In this article, it was written: "The GPU side isn't entirely clear at this point, but the 2x gains could be had through a move to 4 PowerVR SGX543 cores up from 2 in the iPhone 4S." Well, the same logic could be applied to a move to quad-core Cortex A9s, which is exactly what the Exynos 4412 is. Apple didn't give the benchmarks used for their 2x claim and have been known to fudge things in their favor. Maybe they claim 4 cores is double the speed of 2?

Application launch times are 95% about the speed of your storage and memory, which makes it a fantastically bullshit benchmark to talk about CPU performance with. A 1.5x increase in app launch speed could easily be accomplished by improvements in NAND and controller performance. You don't buy a faster CPU to make your programs launch faster, you buy a faster SSD. While I realize that current ARM CPUs are dog slow compared to even the lowest of low-end current x86 CPUs, I'm not going to believe the marketing noise until actual fully-CPU-dependent benchmarks can be run.

Shipping Passbook but no NFC in the iPhone 5 is a massive setback to NFC adoption. The problem is that retailers need a large installed base of phones before NFC is worth supporting, and if the iPhone had supported it, it would have become a viable solution.

Apple could also gain, because they could create a Google Wallet-style service on the back of iTunes that allows Apple to control the NFC transactions.

Nevertheless, NFC is a feature the iPhone 5 should have had, and it's pretty stupid that it doesn't, as its absence is to everyone's detriment, including Apple's.

As for the rest of the iPhone 5? Beyond the A6 and LTE, it's rather underwhelming, and iOS 6 really only serves to lock in customers even more, tbh.

I also think the new iPod touch should have had the same 8 MP camera as the iPhone at the very least, if not *a better* camera than the iPhone, to make it a de facto point-and-shoot device that is also a gaming platform. I don't understand why Apple continues to make the iPhone the 'better' device and won't allow the iPod touch to carve out its own niche as a fully-enabled device.

"Apple could also gain, because they could create a Google Wallet-style service on the back of iTunes that allows Apple to control the NFC transactions."

Let's live in the real world here, as opposed to fairy land.

The credit card companies (which means all the banks) make money off transactions right now. They aren't willing to give up that control and reduce their take by helping NFC move forward. The phone companies aren't interested in NFC either and would rather implement their own half-assed, unsuccessful solutions (see how VZW has treated Google Wallet).

So how do you move forward? Either:
- Apple works through the credit card companies and makes no money on the transactions (so why bother?), OR
- Apple makes money on the transactions (so they cost more than credit card transactions).

Neither of these seem like a particularly winning strategy. Google may be willing to lose a little money on Google Wallet because it fits in with their model of learning everything about everyone (in this case their financial transactions) so they can sell more targeted advertising, but that doesn't fit Apple's business model.

There are alternative models:
* Apple could essentially become a bank --- you deposit some amount with them, and then use NFC as a debit card against that amount. Yeah, you think Apple wants all the hassle and regulatory scrutiny that would invite, just to get into a low-margin business?
* Apple works with money-market companies to withdraw money from those accounts rather than bank accounts. This MIGHT work, but (a) only a fraction of the population has money market accounts, AND (b) many of those are held either AT banks, or at institutions with such close ties to/dependencies on banks that they aren't going to go against the banks' wishes.

Last I checked, the credit card companies AND the carriers have mostly lined up behind the ISIS initiative to push NFC... It's taking forever, but it's got the most backers AFAIK. Then there's Google Wallet and half a dozen other competing players, because apparently we enjoy birthing new products and standards in the most painful and competitive way possible!

Does anyone have any insight into how the three microphones are used for noise reduction on the iPhone 5?

Since one of the microphones is located so close to the earpiece, I believe that one is used for the ANC (Active Noise Cancellation) function and drives the earpiece to cancel some of the ambient noise. The remaining two microphones are for beamforming-type uses in voice communication.

Normally, you do not put a microphone so close to the earpiece, because the earpiece signal acoustically couples back into the microphone, and the resulting acoustic echo is then difficult to cancel.