Observers puzzled by the relatively scarce 512MB of RAM built into Apple's A5 processor used in this year's iPad 2 and iPhone 4S have received an explanation of sorts from an unlikely source: Microsoft.

Apple itself has never announced the amount of RAM built into the iPad or iPhone, forcing curious parties to poke its software and examine X-rays of its chips to arrive at the conclusion that the company's latest A5 chip has the same 512MB as the last generation A4 chip of the original iPad and last year's iPhone 4 (albeit RAM of the faster LPDDR2 type).

Competing smartphones and tablets commonly claim a full GB or more of RAM, which some pundits have identified as an area of superiority over Apple's offerings. In keeping with its tightly controlled, highly secretive design process, Apple has never acknowledged or commented on why the A5 (shown below) has so little RAM on board.

Given the direct historical relationship between available RAM and the sophistication of software a system can run, this has appeared to some as either an engineering failure by Apple, or alternatively an effort to boost margins at the cost of building a better product.

Answer from an unlikely source

However, a Microsoft blog post by Windows lead Steven Sinofsky highlights an interesting answer to a subject Apple itself has left unaddressed. In it, Sinofsky mentions "a key engineering tenet of Windows 8" that involved efforts "to significantly reduce the overall runtime memory requirements of the core system."

While the idea that the operating system should be as efficient as possible with available RAM is certainly not new, Sinofsky then introduced a detailed explanation, authored by Bill Karagounis, group program manager of Microsoft's Performance team, of exactly why using less RAM is so critically important.

Karagounis points out that "minimizing memory usage on low-power platforms can prolong battery life," noting that "In any PC, RAM is constantly consuming power. If an OS uses a lot of memory, it can force device manufacturers to include more physical RAM. The more RAM you have on board, the more power it uses, the less battery life you get.

"Having additional RAM on a tablet device can, in some instances, shave days off the amount of time the tablet can sit on your coffee table looking off but staying fresh and up to date," Karagounis wrote.

Windows 8 in 1 GB

In describing Microsoft's goals for getting Windows 8 to run on the same hardware as today's Windows 7, Karagounis compares the same system running each OS with just 1GB of RAM installed (the minimum required for Windows), and shows that a base install of Windows 8 consumes just 281MB of RAM, compared to 404MB for Windows 7.

Karagounis also notes that Windows 8 is actually doing more with its smaller footprint, because it includes Microsoft's "Windows Defender" software, intended to catch and block malware. He specifically notes that antivirus software consumes a significant amount of RAM, something Microsoft had to address in working out how to prioritize memory allocation. That means Microsoft's malware issues raise RAM-consumption challenges unique to its platform.

Apple faces similar RAM constraints with its Mac OS X systems, which now also include basic malware protection, although far less extensive and sophisticated than that required on Windows, where viruses not only exist in the wild but are very prevalent.

However, Microsoft's comments also pertain to Apple's iOS mobile environment, which is both scaled down and optimized to a much greater extent than even Mac OS X. While Apple's Macs now ship with a minimum of 2GB, Apple's iOS devices currently max out at just 512MB, and battery life is clearly a prime consideration.

Apple's iOS strategy also makes the concept of viruses essentially obsolete, not only because the devices can only obtain signed software from the App Store, but also because the operating system confines apps to their own sandbox. Even if an app were to pack a virulent payload, it simply couldn't deliver it in a way that infects other apps or documents because of iOS' security design. iCloud maintains a similar sandbox between the stored files of various apps.

The mythical half-tablet, half-PC

The comments of Microsoft's engineers also highlight the fallacy of thinking that Windows 8 will be able to usher in a new wave of tablets that can do double duty as lightweight, long-life iPad replacements and then switch into high gear to work as full-power Windows desktop machines at the owner's whim.

As Karagounis points out, having enough RAM to run a full Windows environment simply makes it impossible to match the efficiency profile of a system designed to use less RAM, because the installed RAM chips will be bleeding battery life whether or not they are fully utilized. Simply having RAM installed means less battery life, a fact punctuated by the battery-sapping tablets running Android 3.0 Honeycomb on 1GB of RAM.

Microsoft's explanation effectively endorses Apple's strategy for designing hardware purpose-built for the task at hand, either with minimal RAM when designed to coast for days as a low power device like the iPad or iPhone, or with enough RAM to do full work, but requiring a recharge after several hours of operation, like Apple's MacBook line of notebooks.

This also further clarifies the idea that next year's ARM-based Windows 8 tablets will need minimal RAM to compete with the iPad, and therefore won't be able to run full-scale desktop software, regardless of whether developers recompile it for ARM chips. Adding more RAM would not only hurt battery life but also make the systems more expensive and, like Android 3.0 licensees, preclude them from matching Apple's iPad in price.

The Samsung-built, laptop-class tablet hardware Microsoft recently issued to demonstrate its newest build of Windows 8 for developers was initially chided for needing a fan to cool its Intel Core i5 processor, but the most telling detail of the system was that it packed 4GB of RAM, eight times as much as the iPad.

iPhone sure, but the iPad has volumes of unused space that could accommodate a much larger battery to provide the needed power for the additional RAM. It might make it too heavy, but more RAM is better, in my opinion, as long as you can still deliver the long battery life that we currently enjoy on iPad.

"relatively scarce" RAM??? According to whom? Microsoft and the Android community whose half-baked OS's require higher horsepower hardware that suck more battery juice to perform the exact same processes that iOS can do effortlessly with less RAM?

It looks like the PC/Android community is trying to keep an obsolete hardware business model going by using more and more power-hungry components to keep an industry afloat.

Quote:

iPhone sure, but the iPad has volumes of unused space that could accommodate a much larger battery to provide the needed power for the additional RAM. It might make it too heavy, but more RAM is better, in my opinion, as long as you can still deliver the long battery life that we currently enjoy on iPad.

Why is more RAM "better"?

It's clearly not needed today and I don't want to lug around a bigger battery just so spec-oriented geeks can resolve their inadequacy issues

A funny thing happens with resources - applications grow to consume them

By forcing all iOS devices to have the same amount of RAM, programmers can't get lazy - "Well, I'll target it at the iPhone 4S and if it happens to work on the iPhone 4 or iPad 1, great"

That just isn't an option.

It's very similar to the video game consoles - the processing, GPU and memory in them is pitiful compared to full-blown desktop PCs - yet stuff looks amazingly good on them. One aspect is the limited resources force developers to actually code efficiently. The other benefit is with consistent hardware developers can afford to get "down and dirty" with it and learn all the little tricks to extract the maximum performance. It's stable, so there is little concern that their effort will be wasted.

iOS isn't as stable as consoles - there are a couple of new hardware platforms a year, but it's certainly less than the tens of thousands of new permutations of hardware and software for general purpose computers like PCs and Macs, and certainly less than the thousands of devices on Android.

Apple's annual upgrade cycle represents a compromise of engineering, supply chain management and marketing and I think they have struck an excellent balance.

Hardware platform stability is a perfect example of an intangible that looks trivial at first glance, but when you start digging deeper, reveals there is far more to it than just casually tossing updates on a whim...

This is basically the opposite strategy that Android phone and tablet manufacturers take. The final user experience and overall performance is not a primary concern.

It's quite simple to build and manufacture a new Android phone.

(1) Just use whatever new components that you can get your hands on so that you can play leapfrog with the last Android phone that just came out 2 weeks ago. In 2 weeks time, your new Android phone will be leapfrogged by another newly introduced Android phone, but who cares about that.

(2) Whatever screen size they used in their previous phone, just make it at least an inch or two larger. Don't ever stop. Bigger is better. Just imagine that all of your customers are freaks of nature born with gorilla sized hands and that they're not really humans.

(3) Putting in a lot of RAM and the latest dual-core CPUs is a must. This is of course done to mask the horrible performance of the Android OS, and even then it doesn't operate smoothly. It doesn't matter that your new Android phone is actually slower than other phones; you now have higher specs to boast about on forums, not to mention the disgustingly horrible battery life that your phone will have.

Quote:

iPhone sure, but the iPad has volumes of unused space that could accommodate a much larger battery to provide the needed power for the additional RAM. It might make it too heavy, but more RAM is better, in my opinion, as long as you can still deliver the long battery life that we currently enjoy on iPad.

Did you read and understand the whole story? The conclusion is that you can't have the same battery life with more RAM. It is not possible with current technology.

And it even stands to reason that future tech will consume less power if it is run with less RAM. Lean and mean is the best way. Nobody seems to understand this except Apple and believe it or not Microsoft. (Although Microsoft may have just copied the idea from Apple.)

The obvious reason for more RAM is so that the phone has to spend less energy re-fetching costly resources over a 3G data connection. It is important to note that desktop operating systems differ from their mobile counterparts in that desktops can page data from RAM out to the hard disk to be used later. Sure, there's a penalty when you have to grab that data from disk again instead of RAM, but it's MUCH better than having to access network resources (and on a mobile device, using 3G is the largest penalty there is).

As far as I know, none of the mobile operating systems page memory out to disk, because they don't want to wear out their flash chips with continuous read/write cycles. Desktops and laptops don't really have this problem. Even Windows 8 tablets will page data out to virtual memory.

So I don't really buy the "less RAM = more power savings" argument on a mobile device. Any savings gained from having less RAM would quickly be wiped out by having to re-download images and other media assets from the web.
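The commenter's tradeoff can be put into rough numbers. The radio and DRAM power figures below are illustrative assumptions, not measurements from any real device; the point is only that a single 3G re-download can cost as much energy as hours of keeping the same data resident in RAM.

```python
# Sketch comparing the energy cost of keeping an asset cached in RAM versus
# re-downloading it over 3G. All figures are rough assumptions for illustration.

RADIO_J_PER_MB = 10.0    # assumed 3G energy cost per megabyte transferred
DRAM_MW_PER_512MB = 5.0  # assumed self-refresh power for 512MB of DRAM

def refetch_joules(asset_mb):
    """Energy spent re-downloading an asset over the 3G radio."""
    return asset_mb * RADIO_J_PER_MB

def cache_joules(hours, cache_fraction_of_512mb=1.0):
    """Energy spent keeping (a fraction of) 512MB refreshed for some hours."""
    # mW * seconds = mJ; divide by 1000 to get joules
    return DRAM_MW_PER_512MB * cache_fraction_of_512mb * hours * 3600 / 1000

# Re-downloading a 5MB asset (~50 J) costs about as much energy as refreshing
# the full 512MB of DRAM for roughly 2.8 hours under these assumptions.
print(refetch_joules(5))         # 50.0
print(round(cache_joules(2.8)))  # 50
```

Under these assumed figures both sides of the debate have a point: idle DRAM does drain the battery continuously, but a cache miss that hits the radio is far more expensive per event.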

Quote:

It's clearly not needed today and I don't want to lug around a bigger battery just so spec-oriented geeks can resolve their inadequacy issues

A funny thing happens with resources - applications grow to consume them

By forcing all iOS devices to have the same amount of RAM, programmers can't get lazy - "Well, I'll target it at the iPhone 4S and if it happens to work on the iPhone 4 or iPad 1, great"

That just isn't an option.

It's very similar to the video game consoles - the processing, GPU and memory in them is pitiful compared to full-blown desktop PCs - yet stuff looks amazingly good on them. One aspect is the limited resources force developers to actually code efficiently. The other benefit is with consistent hardware developers can afford to get "down and dirty" with it and learn all the little tricks to extract the maximum performance. It's stable, so there is little concern that their effort will be wasted.

iOS isn't as stable as consoles - there are a couple of new hardware platforms a year, but it's certainly less than the tens of thousands of new permutations of hardware and software for general purpose computers like PCs and Macs, and certainly less than the thousands of devices on Android.

Apple's annual upgrade cycle represents a compromise of engineering, supply chain management and marketing and I think they have struck an excellent balance.

Hardware platform stability is a perfect example of an intangible that looks trivial at first glance, but when you start digging deeper, reveals there is far more to it than just casually tossing updates on a whim...

Let programmers be lazy. Targeting the smallest market isn't a good way to make money.

"relatively scarce" RAM??? According to whom? Microsoft and the Android community whose half-baked OS's require higher horsepower hardware that suck more battery juice to perform the exact same processes that iOS can do effortlessly with less RAM?

It looks like the PC/Android community is trying to keep an obsolete hardware business model going by using more and more power-hungry components to keep an industry afloat.

Good for Apple that they can do more with less than the competition.

Good God man, next time read the freaking article before you go off on a rant. MS was applauding Apple for their memory decisions, not bashing them.

RAM is very important in iOS because you do not have demand paging of user data. This effectively puts constraints on what a developer can do and often forces the developer to use different techniques - techniques that may or may not impact performance.

A classic example is viewing PDFs, which the iPhone 3G could not handle well at all. The iPhone 4 does a much better job of handling large and complex PDFs, due directly to having more RAM in the device. Going to 512MB was a huge improvement in usability just considering the viewing of PDFs. The concept applies to all nontrivial software, though.
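The kind of technique this commenter alludes to can be sketched as a memory-bounded page cache: rather than decoding a whole document into RAM, keep only the most recently viewed pages resident and re-render everything else on demand. This is an illustrative sketch, not Apple's actual implementation; `PageCache` and its sizes are hypothetical.

```python
# Minimal sketch of memory-bounded document viewing: an LRU cache of rendered
# pages. Only `max_pages` decoded pages are ever resident in RAM at once.
from collections import OrderedDict

class PageCache:
    def __init__(self, max_pages):
        self.max_pages = max_pages
        self._pages = OrderedDict()  # page number -> rendered bytes

    def get(self, n, render):
        if n in self._pages:
            self._pages.move_to_end(n)           # mark as most recently used
        else:
            self._pages[n] = render(n)           # decode the page on demand
            if len(self._pages) > self.max_pages:
                self._pages.popitem(last=False)  # evict least recently used
        return self._pages[n]

cache = PageCache(max_pages=3)
for page in [1, 2, 3, 4, 1]:
    cache.get(page, render=lambda n: b"page-%d" % n)
print(sorted(cache._pages))  # only the 3 most recently used pages stay resident
```

The tradeoff is exactly the one described: less RAM means more frequent re-rendering (cache misses), which costs CPU time and responsiveness, while more RAM lets large documents stay decoded.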

Quote:

It's clearly not needed today and I don't want to lug around a bigger battery just so spec-oriented geeks can resolve their inadequacy issues

No, you don't need it, but that is not the same thing as saying that in general it isn't needed. It has nothing at all to do with spec-oriented geeks, whoever they are; it is all about functionality and moving the platform forward. Admittedly this is a bigger deal on iPad than iPhone, though. It really depends upon what you expect to get out of the platform.

Quote:

A funny thing happens with resources - applications grow to consume them

The problem you run into is what happens if a new service significantly impacts the amount of RAM available to the device. It will be very interesting to see how much iOS 5 impacts the RAM free to the app. One can argue whether services like Siri are really needed, but we still come back to the fact that platforms grow no matter what.

Quote:

By forcing all iOS devices to have the same amount of RAM, programmers can't get lazy - "Well, I'll target it at the iPhone 4S and if it happens to work on the iPhone 4 or iPad 1, great"

That is an extremely negative way to look at the issue. A platform will become stagnant if it doesn't increase in capacity or capability over time. Look at what the first Mac Plus did and what you can do on a modern Mac Pro.

Quote:

That just isn't an option.

What isn't an option?

Quote:

It's very similar to the video game consoles - the processing, GPU and memory in them is pitiful compared to full-blown desktop PCs - yet stuff looks amazingly good on them. One aspect is the limited resources force developers to actually code efficiently. The other benefit is with consistent hardware developers can afford to get "down and dirty" with it and learn all the little tricks to extract the maximum performance. It's stable, so there is little concern that their effort will be wasted.

How much game/app innovation is coming to consoles?

Quote:

iOS isn't as stable as consoles - there are a couple of new hardware platforms a year, but it's certainly less than the tens of thousands of new permutations of hardware and software for general purpose computers like PCs and Macs, and certainly less than the thousands of devices on Android.

Apple's annual upgrade cycle represents a compromise of engineering, supply chain management and marketing and I think they have struck an excellent balance.

On the iPhone it isn't a big deal. Part of that is due to systems like Android relying upon a Java-like VM, which itself can consume a great deal of RAM. In the end the apps aren't that bad off on iOS, at least when looking at cell phone platforms. However, when discussing tablets it is a much bigger deal, as more RAM can enable a far wider array of software and capabilities.

Quote:

Hardware platform stability is a perfect example of an intangible that looks trivial at first glance, but when you start digging deeper, reveals there is far more to it than just casually tossing updates on a whim...

Meaningless comments. The iPhone is no more a stable platform than any other; the iPhone 4S, for example, has dual cores and a whole new GPU relative to previous iPhones. Stability is not about RAM. RAM is a straitjacket; sometimes such jackets are needed for the crazy, but more often than not they are put on the dreamers and innovators.

It takes power to operate RAM. The more RAM, the more power required. Did anything that Microsoft said really shed any light on this? This clearly is the main point of the article, and I find that very strange.

There were other strange notions in the article. The claim is made that Apple's iOS strategy makes viruses obsolete. This is obviously a goal of the iOS strategy, and if the intended meaning of the claim is only this, then I get confused, as I always do whenever someone points out something obvious and makes it seem that they have said something else. Or perhaps there is some way to prove that the implementation has no holes, and this is common knowledge to everyone but me. If the author knows this, then he should shed some light on it, rather than write something that implies it is true while providing no evidence in support of the claim.

In fact, as I think about it, it is very likely that Apple screens applications submitted for distribution through the App Store to make certain that they do not try to get out of the sandbox - i.e., that they do not, for example, use low-level calls to the kernel to open the file system at the root. I think that I read something about Apple doing this, and if by chance this is correct, then what it means is that the sandbox critically relies on the screening process to have the net effect that it endeavors to have. But Daniel Dilger wrote that the "iOS strategy makes the concept of viruses essentially obsolete, not only because [apps are screened] but also because [apps live in a sandbox]."

No?! iOS developers make considerably more than strictly Android ones.

I should clarify. I really meant: "Targeting a small percentage of devices on a platform is not a good way to make the most money." For example, I wouldn't expect the world to go wild for the next greatest Kinect game when only 20% at most of the 360 install base has one.

But the CPU and RAM are on different dies. So the other guy is right and you're not.

How do you figure? The article says in the first line that it's built into the processor. It's all contained in the same "chip" though there are multiple dies within that package. It's a semantic issue at best.

Do you realize that fluoridation is the most monstrously conceived and dangerous Communist plot we have ever had to face? - Jack D. Ripper

The primary issue (more RAM = more power) is accurate to an extent. Dynamic RAM is especially an issue here as you can't stop refreshing it. However there are extensions and considerations that can tip the scales in a different direction. For example if an app has to spend extensive amounts of time accessing secondary store it can have a greater impact on battery life than limiting RAM. Also pressure on RAM can impact responsiveness of the platform.

Now realize that this is an engineering issue. Each time a process shrink comes along the engineering team has to evaluate where the best place is to get additional advantage from the power savings. That may mean more cores, a better GPU or more RAM. Obviously Apple decided that the weak points in their system architecture was in the CPU and GPU this time around. That might not be the case in the next revision.
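The engineering tradeoff described above can be reduced to a back-of-envelope calculation. All the numbers below (battery capacity, base standby drain, DRAM self-refresh power) are illustrative assumptions, not measured figures for any real device; what matters is the shape of the result, namely that doubling RAM measurably shortens idle time.

```python
# Back-of-envelope sketch of the "more RAM = more standby power" tradeoff.
# All figures are illustrative assumptions, not measured values for any device.

def standby_days(battery_wh, base_mw, ram_mw_per_512mb, ram_mb):
    """Days a device can idle, given a fixed base drain plus DRAM self-refresh."""
    total_mw = base_mw + ram_mw_per_512mb * (ram_mb / 512)
    hours = (battery_wh * 1000) / total_mw  # Wh -> mWh, divided by mW draw
    return hours / 24

# Hypothetical numbers: 25 Wh tablet battery, 30 mW base standby drain,
# 5 mW of self-refresh power per 512MB of LPDDR2.
print(round(standby_days(25, 30, 5, 512), 1))   # ~29.8 days with 512MB
print(round(standby_days(25, 30, 5, 1024), 1))  # ~26.0 days with 1GB
```

Real self-refresh currents vary by part, process node and temperature, but the direction of the effect is the one Karagounis describes: idle RAM drains the battery whether or not it is being used.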

Quote:

How do you figure? The article says in the first line that it's built into the processor. It's all contained in the same "chip" though there are multiple dies within that package. It's a semantic issue at best.

You're basing your argument on ... the article?

In today's semiconductor world, packaging is moot.

The CPU module and the memory modules are mounted together using what is known as package on package (PoP). The memory module typically sits on top and the interface is via an array of "balls" in the same way that the CPU is connected to the motherboard.

The two modules are typically designed, manufactured and tested separately (and are replaceable), and come together only in the last step of manufacturing. You may consider it semantics for the sake of being intransigent, but if you're really interested in learning how this works, then understand that the meanings of the terms "chip" and "IC" are changing.

You are aware that the A5 does not contain 512MB of RAM within the SoC... You seem to be implying this in your article, and that is incorrect.

The 512MB of DDR SDRAM is off-die, hence the DDR memory interfaces.

Where did he say the RAM was "on the SoC chip"? He did say it was "on the A5 chip," and while technically the A5 is a PoP, not a chip, I think his communication is pretty clear: "the thing that says 'A5' on it only has 512MB of RAM in it."

"relatively scarce" RAM??? According to whom? Microsoft and the Android community whose half-baked OS's require higher horsepower hardware that suck more battery juice to perform the exact same processes that iOS can do effortlessly with less RAM?

It looks like the PC/Android community is trying to keep an obsolete hardware business model going by using more and more power-hungry components to keep an industry afloat.

Good for Apple that they can do more with less than the competition.

I remember upgrading my Mac Plus from 256KB of RAM to an amazing 1MB.

I did a science degree with that thing, complete with a 20MB hard drive that cost $3000.

Better than my Bose, better than my Skullcandy's, listening to Mozart through my LeBron James limited edition PowerBeats by Dre is almost as good as my Sennheisers.

Call it what you like; when Apple releases the version of iOS* which has what the rest of the industry calls true multitasking, they'll explain it in a way you'll like.

* It may not be in iOS 6 and may have to wait for iOS 7, depending on whether they can get the next generation of batteries in place in time to support the change.

You mean like listening to music while retrieving and reading emails, which I was doing with my iPhone 3G over three years ago and others were doing on their 1st generation iPhones over four years ago, that type of doing more than one thing at once multitasking?

I can't recall ever hearing the term "true multitasking" before the iPhone came along, just goes to show how influential Apple is on society as a whole.


If you watch the 2007 All Things D video of Steve Jobs being interviewed, the topic of RAM comes up. I was struck by his tip-of-the-tongue knowledge of the exact RAM of the early Macs, and got the impression he had been thinking about RAM a lot.
Evidently, immediately before the iPhone launch, he must have been thinking about this topic, as there was so little RAM, and yet even the original iPhone could do a lot (for so little RAM).
I think the amount of RAM is a very well-thought-out compromise, inspired by what an original Mac could do with even less RAM. Running so well on so little RAM is one of the great feats of iOS.
And yes, more RAM is better for computing, but what counts is the user experience, and longer battery life significantly alters the user experience in a positive way.
Steve Jobs always talked about the user experience as being the most important thing. Everything else is just what it takes to create an AMAZING experience....

Is this really even an issue about whether or not more RAM is better? It seems to me that engineering complex devices is always a series of compromises. Performance, weight, power consumption, size, and cost constraints all shape the contours of design. At the end of the day, it would be a ridiculous understatement to call the iOS line of devices "successful," as they have defined an entire genre of products. I would say then, that to the extent that Apple's goal was to create a successful product, their hardware choices have been superb.

what the hell is true multitasking? i play music while using Pages; browse while talking on the phone. isn't that true multitasking? i am confused.

Ahh, this takes me back to high school... the endless debates as to whether the Mac had multitasking,
which of course it didn't, because you could hold the mouse button down and halt the system (no pre-emptive multitasking),
but of course it did, because apps written to take advantage of (co-operative) multitasking could.

Bottom line: if you can do what you want, who really cares what it's called.
If devs want to play nice in the environment, it works well - better now than then, even.