Why Apple’s Move to a 64-bit iPhone 5s is So Darn Clever

Some observers underestimate what's to be gained by Apple's 64-bit A7 in the iPhone 5s. Is it really just a marketing stunt? Or an inadvisable plan to put 64-bit ARM CPUs into MacBooks? I want to look at why the move to marry iOS with 64-bits is so darn clever, now and for the long term.

__________________________

First, I need to talk about innovation. And this move to 64-bits in an iPhone is an innovation that took time. So while pundits claimed that Apple was devoid of ideas and new technology, Apple was working behind the scenes to really put the screws to the competition.

So much for the idea that Apple has been drifting.

So how does a company innovate? All we have to do is look at how Apple positioned itself on the Mac and OS X side.

Over the years, those savvy enough and lucky enough to attend WWDC saw how Apple made a very careful, well-thought-out transition to 64-bits on the Mac. The first rule is to have a solid, graceful development roadmap that makes life easy for developers and customers. Apple did that by architecting the OS X frameworks so that 32-bit and 64-bit apps could run simultaneously on the Mac. Seamlessly. Apple engineers made sure Xcode would allow developers to gracefully move to 64-bits and invoke advanced Apple technologies with the latest compilers.

The second rule is to ruthlessly, relentlessly throw away technologies of the past so that the architecture isn't tied down with legacy dead weight. I recall the anger and resentment by developers when Apple announced, many years ago now, that there would be no 64-bit Carbon framework. Apple was single-minded, and focused on 64-bit Cocoa so that they could move both developers and customers forward.

If you do things like that, you can be poised to exploit technology when the time is ripe. If I may, I'll strain a famous saying: Serendipity is when preparation meets opportunity.

Phil Schiller introduces the 64-bit A7 SoC. (Image credit: Apple)

Seizing the Opportunity

The advantages of a 64-bit architecture are clear. Math on very large numbers can be done natively rather than synthesized in software. More data can be gobbled from memory in larger chunks, and that means more processing speed. Registers are larger and do more work in a single cycle. The addressable memory space is vastly larger, and while that doesn't matter now, it will down the road with other Apple platforms, like the iPad, that may need lots of RAM.

It's easy for Apple to dig into Android and look at its architecture. Seeing what the competition has done and what the limitations are with Android and 32-bits probably led Apple's engineers to surmise that their well-planned, seamless transition to 64-bits on the Mac/OS X side could be leveraged to give them a competitive advantage with iOS, too. For example, Carl Howe told John Paczkowski at AllThingsD:

“Because Apple makes the development environment and has updated those tools for 64-bit architectures, a developer only really needs to recompile their application to make it 64-bit compatible — assuming they haven’t done anything non-standard with their code,” said Howe. “This will not be true with Android, by the way. The Android Java app and native app environment will need support from Oracle, who owns the Java environment, as well as 64-bit support from the Android kernel. Android has a lot more moving pieces to coordinate, and will take longer to go to 64-bit.”
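In Xcode 5 terms, the recompile Howe describes largely comes down to one build setting. A sketch, assuming a project with no pointer-size tricks (the file name is hypothetical; ARCHS_STANDARD_INCLUDING_64_BIT is the Xcode 5 value that adds the arm64 slice):

```
// MyApp.xcconfig (hypothetical)
// Build a fat binary: 32-bit slices for the A6 and earlier, arm64 for the A7.
ARCHS = $(ARCHS_STANDARD_INCLUDING_64_BIT)
```

The resulting app carries both slices, and each device simply loads the one it can run — the same trick Apple used for the Mac transition.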

Apple traditionally loves to gain a deep understanding of how the software it has developed can be leveraged into a technical advantage -- and then spring it on the world. While observers may, at times, revel in smug criticism of Apple's perceived lack of innovation, Apple is able to quietly, in secret, dig deep into its technological expertise and then make life difficult for other companies, especially those who don't integrate their own OS with their own hardware. Apple has done this over and over. How soon we forget.

The Baloney Files

What I don't see Apple doing, as some have suggested, is developing a 64-bit iOS and A7 so as to someday move all that technology into Macs. Right now, Windows isn't dead (yet) and many Apple customers need to run Windows on Mac hardware. To do that, they need the virtualization hardware in Intel chips, exploited by, say, Parallels Desktop or VMware Fusion.

Apple would have to include virtualization hardware in its future ARM A(x) System on a Chip (SoC), and I don't see that happening for chips designed to run in a low-power, mobile environment.

Plus, at WWDC, Apple Senior Vice President Craig Federighi gave every indication that Apple has realized, with renewed vigor, that while OS X and iOS should interoperate, they still do different things for different users.

Possibilities

The 64-bit architecture will extend to iPads and perhaps even an enhanced Apple TV set-top-box that can run 64-bit apps. We've already seen how iPads have been embraced in aviation and the sciences, so 64-bit computing is a natural evolution there. The powerful A7 also comes into play with Apple's camera system, speeding up image stabilization and processing.

For those who've been wondering what Apple will be doing with TV, 64-bit processing and the ability to address more memory are probably prerequisites for the kinds of advanced graphics and gaming Apple has in mind. For example, as Ben Bajarin astutely pointed out, "First, 64 bit will dramatically increase the performance of more intensive and demanding applications. Things like audio and video encoding/decoding and any graphically intense applications including games and other visually complex applications." You saw that advantage come to life in the September 10 Keynote in the Infinity Blade 3 demo.

All in all, Apple's clever move to 64-bits has outflanked the competition, given Apple a competitive edge right now, and prepared the foundation for technologies to come. And it sure beats tapping your phones together with an NFC link.

You do realise that the move to 64 bit was carried out by ARM and not Apple don’t you? As was the increase in the number of registers etc. All those benefits will be in the next generation of every other phone, so there isn’t anything clever that Apple has done. And this has been the primary driver behind the move. There wasn’t anything innovative or strategic that motivated Apple, it was simply the next iteration of CPU core available from ARM.

Nor was Apple unique in the smooth switch to 64bit. Linux and the open source community beat Apple by more than a year, and whilst Microsoft did their level best to make a meal of it with WinXP, by the time they rolled out Win7 they had caught up.

With the way ARM CPUs are gaining performance I feel it is quite likely that sooner or later we’ll see an ARM based laptop being released by Apple. The gains in battery life will be too tempting as well as the opportunity to unify some more of OS X and iOS at a core level.

I do agree with you, however, that virtualisation of Windows is very important to many of those who throw money at Apple’s top end laptops and also support for legacy software. WinRT shows what happens when you ignore an established code base. To get around that I expect Apple to again look to history (whilst hailing the never seen before magical “innovation”) and to copy the RiscPC made by Acorn. These were ARM based systems at their core but included an Intel CPU on an acceleration board that could run DOS and Windows.

So my prediction is that we’ll see the core of OS X running on ARM and being responsible for the visual display and compositing of the desktop environment, with some apps running on that ARM CPU but with an Intel x86 CPU in there as well that can be powered on when needed by an OS X app that requires more grunt or backwards compatibility, or if virtualisation of Windows is needed.

If iPhones and iPads are not using more than 4 gig of RAM… what is the point of 64-bit CPUs?

Have people forgotten the whole point that adding bits is nothing more than adding potential memory addresses for the CPU to access in RAM? If they are not building systems that have more than 4 gig of RAM, it is totally useless.

I hope Apple never does replace Intel with ARM, it’s short-sighted for pro users or anyone that wants to also be able to run Windows/Linux etc.
If Apple does that it will make MacBook Pros even less appealing, it’s bad enough that Apple is treating them like iPhones where nothing is upgradeable!!
It’s funny, no one is really talking about the iPhonification of the Pro line (the OS people are making a little noise, but that’s easy to fix with a simple Finder like OS 9 but different).

Just to clarify, moving the iPhone to 64-bit is a great idea (it’s more than the memory addressing); moving MacBooks to ARM is the worst idea….

My choice for a MacBook Pro would be a 17-inch Retina using Haswell, with upgradable RAM up to 32GB and a replaceable hard drive so I could throw a 960GB SSD in there (and in the spring or summer replace it with a 2TB). Add in 3x Thunderbolt connectors that support PCI expansion chassis with high-end graphics cards (all programs should be OpenCL). That’s Pro!

@ted keefe - with ARM there is actually another benefit in that the instruction set can now support more registers. With the way the instructions are represented in binary that requires more bits to represent the source and target registers for a given operation.

The trade-off is that if your instructions are twice as big then you need twice as much memory bandwidth to transmit them. Moving the CPU to 64 bit does nothing to change the memory bandwidth, that is an entirely separate change and one that I’ve not seen specified for Apple’s A7. 32 bit ARM CPUs already have a special mode (called ‘Thumb mode’) where instructions are actually 16 bit and cover a limited range of operations. This helps improve code density, reducing the amount of memory bandwidth consumed for a given piece of code. Presumably this mode remains in ARMv8 (the 64 bit CPU line), so you can choose code density or a wider range of registers for any given block of code (the CPU can be switched between modes at will).

Do you actually know what Apple has done with the A7 or are you just commentating on speculation?

The fact of the matter is… NO ONE KNOWS what Apple has done with their SoC. It’s speculation and assumption. It IS 64-bit, but so what? That just means it can process LARGER NUMBERS. Anyone who says anything against it is just showing their complete and utter bias against Apple, because they know nothing about it yet.

They have obviously forgotten that Apple bought P.A. Semi ... a company that produced custom extremely low power 64-bit PowerPC based chips for the freaken military and Intrinsity, the company that helped Samsung and Apple produce the extremely efficient A4.

I’m going out on a limb and saying that Apple might actually KNOW what they’re doing and the rest of you are just arm-chair quarterbacks trying to comment on it. And actually that’s fine, I can completely understand that, but give me a freaken break…it DOES NOT matter what the ARMv8 ISA offers… Apple can always modify it and extend it as they did in the A6 with the ARMv7 ISA.

And for the idiot that thinks instructions are twice as wide (myurr)... do a little research… all instructions are still 32-bit. And that has nothing to do with being a 64-bit CPU. 64-bit means data can be processed at that rate per cycle.

“This will not be true with Android, by the way. The Android Java app and native app environment will need support from Oracle, who owns the Java environment, as well as 64-bit support from the Android kernel.”

Umm, no.

#1 Java is already 64-bit and has been for several years. Moreover, Java “pointers” are register-width agnostic, they can be any size, programs don’t care.

#2 Android’s Java environment (Dalvik) needs no support from Oracle, it is 100% clean room implemented.

#3 Oracle has nothing to do with the Android kernel, which is based on Linux, which is already 64-bit. In fact, the Android-x86 project already has Android running on 64-bit processors, going all the way back to Android 4.0/Ice Cream Sandwich.

As early as 2011, ARMv8 was announced, and in October 2012, the ARM A53 and A57 were announced, with predicted SoCs shipping with them in 2014. That is, 64-bit in mobile was preordained to happen in 2014, regardless of whether Apple went 64-bit. The industry moves according to what’s available. Apple doesn’t make their own displays or camera sensors, they license them. The GPUs come from PowerVR and the CPUs are based on IP licensed from ARM.

Apple does a great job in packaging all this stuff together and tuning it to run smoothly; that is what Apple innovation is. But to claim that 64-bit, or the M7, or other such features are somehow unique to Apple is a distortion.

Pretty much all of the Snapdragon 800 based phones coming out are going to have M7/MotoX-like DSPs in them, because that’s what Qualcomm has included in the 800, so the entire industry will soon have it, and pretty much all Android phones shipping in 2014 with Qualcomm SoCs will have motion tracking and speech hotwording because of it.

If you want to know what phones will have in 2014 or 2015, don’t look at what Apple or Google are doing, look at what the chip vendors are publishing, because that is the technology everyone else licenses off the shelf for their designs.

Lots of untrue statements all over the web about what Apple has done, or hasn’t done. The simple facts are that they have smartly packaged a more advanced mobile device system and have released it earlier than the other suppliers. It’s the future of mobile computing, and they have put their competition in catchup mode. It’s always been true that Apple has an easier method of doing this because they smartly maintained tight in-house control over the hardware-OS and OS-app interfaces, and integration is easier for them than for other suppliers because of this. Apple has always been among the industry’s best at choosing the architecture that’s best suited for the task at hand and getting it out to the customer.

They have enabled a new generation of applications that can be designed to do more without sacrificing battery life. All of the derogatory remarks about there being no benefit to this design over the previous generation are thinking only about the past and present. It’ll do the present as well as the old model. It’ll do the future, and the old ones won’t. IMHO the apps to take advantage of the new design will come much more quickly than when the desktop world transitioned.

All of the slamming of Apple saying it’s nothing but a marketing ploy is pathetic. In two years, the older models will be obsolete because they will be incapable of doing what people want. And some of those wants we can’t conceive yet because there’s nothing like it today on mobile devices. I do think the advantages will really be more evident on the iPad down the line, as the added screen real estate will allow the developers to better design for the human interface. Again, Apple has developed something significant for the future and is first to the marketplace.

“Apple’s 64-bit A7 in the iPhone 5s. Is it really just a marketing stunt?”

It’s amazing (and laughable) that those Apple-haters are calling the first 64-bit smartphone processor, running the first 64-bit smartphone apps, on the first 64-bit operating system, a “gimmick”.

But then Samsung follows and says it is “planning” to have a future smartphone with a 64-bit processor, but without any planned 64-bit Android OS* or Android apps that can run on it… and those same trolls DON’T call this a “gimmick”???

Those people live in Crazy Land.

*(Carl Howe, VP of research and data sciences at the Yankee Group recently spoke with AllThingsD about how difficult it will be for Android to go 64-bit: “This will not be true with Android, by the way. The Android Java app and native app environment will need support from Oracle, who owns the Java environment, as well as 64-bit support from the Android kernel. Android has a lot more moving pieces to coordinate, and will take longer to go to 64-bit.”).

Very nicely presented. I think your basic thesis is sound, and well argued.

In reviewing some of the comments that follow, and that have followed similar articles specifically about Apple’s innovation, it’s becoming increasingly apparent that there is not simply a lack of consensus, but a genuine difference, in what people understand by the term ‘innovation’.

While some appear to argue that, at least if it’s Apple, a thing does not qualify as innovation unless a technology is developed, in toto, by an Apple employee in Apple’s own labs; others appear to argue innovation, at least as done by Apple, consists simply in their (Apple) being first to market. The first definition conflates ‘innovation’, which includes even a ‘new idea’, with invention, which includes making something that ‘has not existed before’. Innovation and invention are not the same thing. Moreover, regarding invention specifically, if by that term one means that the inventor must develop the thing entirely on their own, with no external input, nor building upon others’ work as a foundation, then there are no inventions in the modern world. All inventions have been built upon prior related work.

As to innovation, I am reminded of Mozart, who is known to have occasionally taken popular and folk music of his day, and using these as a foundation, produced new masterpieces. This is truly innovative, as none of his contemporaries had either that insight or his creative genius to do similarly prior to him, and whilst his output has survived to this day, those popular pieces which he built upon have not, and today live on only in Mozart’s work. Mozart’s innovation was the idea, the vision if you will, that these simple tunes could be woven into something greater, and his summoning the skill to make it so.

That said, I see two common responses in the blogosphere to any new product release by Apple, especially when the question of innovation arises.

One is that it is dismissed as nothing new, where the industry was already headed, or even as myurr avers, the act of Apple reaching into the past. Precedent does not invalidate innovation of a new product; indeed the whole point of recording history is to edify the present and constructively shape the future. As for something being nothing new, and where the industry was already headed, this is undoubtedly true in a number of instances, but even when so, does not diminish the distinction of being first to summit, particularly when the industry as a whole is focussed elsewhere. When Apple announced the A7 and that it was 64-bit, this took press and public, even if not the competition (although there is no indication that it did not) by surprise. This has not been, apart from the primary literature myurr cites above, a core discussion amongst industry followers. On the other hand, if what cromwellian says is true (and I have no reason to doubt it) then we should expect Android to sport 64-bit chips within a few weeks to months - I’d argue by this holiday season, in order to respond not just to Apple, but to shareholders who will punish any failure to capitalise on sales opportunities.

The second is critics’ argument that Apple is merely following what others have done, and is playing catchup. My observation of this latter assertion is that, more often than not, it is made in reference to specs or features on a device, e.g. the camera on the iPad or the making of the iPad mini to compete with smaller tablets. Specs and features are generally not cited by Apple as ‘innovations’, but new product lines are, e.g. the iPod Nano, designed to compete with that end of the spectrum. To dismiss these as unremarkable or pathetic attempts at innovation is a straw man argument.

Finally, my observation of what constitutes innovation, by Apple or anyone else in any field, is something that constitutes a new platform upon which a future product or service line can be built. As critics correctly point out, Apple did not invent either the smartphone or the tablet. Rather, they created the first ones that people wanted to use, so much so that Google scrambled to respond to Apple, ditching their focus on Blackberry (then RIM), and companies that failed to respond (e.g. Nokia) sank into single-digit marketshare. Apple’s model of these devices is now the platform around which the entire industry is organised.

That’s innovation - a compelling idea that becomes a foundation upon which to build the future.

I’m curious, as I’m not a bthead like you guys, whether the 64-bit environment is not so much to migrate the A7 to the mac, but to allow apps written for OSX to be ported more easily for iOS? Is there something about the 64bit addressing that perhaps makes other aspects of software integration easier? Seems like there is - as some posit - more to creating a unified 64bit hardware environment that makes a bunch of sense somehow…?

Anovelli: I don’t think it’s the unified hardware environment as much as the software environment. For reasons stated above, the ability to run Win code on Apple’s computers is a strong factor in Apple’s desktop and laptop designs. Also mentioned above, there is a design option to include a separate Intel processor for that purpose, but processors that are new and capable are expensive. When you consider that huge numbers of Apple’s computer buyers will never run Windows on their machines, that extra processor really is a design negative that would likely never make it to reality. A scaled-down laptop that could run both iOS and OS X apps and not Windows, and have an extremely good battery life might become a reality, but a touchscreen laptop IMO isn’t a good tool for all-day use, and I don’t really see Apple pursuing it. A lot depends on whether Intel can produce more energy-efficient high-speed processors though. If the ARM machines’ capabilities continue to grow, it’s hard today to say that a desktop or laptop ARM-powered computer won’t ever be made. An ARM computer running iOS with a decent-sized trackpad and the ability to run OS X is an interesting-sounding beast. But it seems like the trackpad would have to be as big as an iPad or at least an iPad Mini to not be a pain to use like it is on a phone compared to an iPad.

I’m curious, as I’m not a bthead like you guys, whether the 64-bit environment is not so much to migrate the A7 to the mac, but to allow apps written for OSX to be ported more easily for iOS?

Unlikely.

The interface difference of going from Keyboard/Mouse to On-screen Multitouch is substantial, and is going to be the biggest issue for developers creating for tablets. Playing around with Windows 8 tablets with both apps done for touch screen, and apps done for the desktop mode really makes this obvious.

The core architecture of ARM and Intel is also considerably different, even beyond bit-depth issues. Microsoft had to develop Windows RT for ARM devices, which has led to customer confusion and splintering of the installed base. Developers have to recompile their apps for ARM, which is why the app selection for Windows RT devices is much worse than for Windows 8 tablets. Microsoft will probably drop Windows RT altogether if PC makers start shifting to Intel Atom for cheap, energy-efficient Windows tablets.

I suppose I really am grateful to the angry android hordes: listening to them is like listening to talk-radio types explaining how their disdain for brown people somehow isn’t racist; however, reflecting as they do a merely technical and ahistorical cluelessness, the cranks who crank up the Apple-hate-machine are much funnier.

“myurr” also gives us the hilarious party-crasher-meets-the-host moment in which it decides that directly insulting John is appropriate. Comedy Gold!

@myurr - ARM did not carry out ‘the move’ to 64bit. The move to 64bit was done by Apple - not Google, not Microsoft, not Nokia, not Motorola, not Samsung or anyone else.

ARM does design the 64bit chips that Apple uses - at least in part - but Apple builds the 64bit OS, the 64bit dev tools and 64bit apps for the phone .... and puts all the parts together, kind of like Google, or Samsung or Microsoft, only Apple is ahead of them, if they weren’t ahead already.

And as others have pointed out, 64bit is not just about allowing more RAM to be used ... or more than 3 or 4GB of RAM to be used; there are lots of other benefits for processing, video, photo manipulation ... and even for games.