Ranked the #1 tech industry analyst in the U.S., U.K., and Europe (Apollo Research, January 2014) and one of the most cited tech analysts in the world. Disruptive consumer technologies and usage models are what really change the landscape; they determine the tech winners and losers, and that's what I will be covering. I am Founder and President of Moor Insights & Strategy, an independent technology analyst firm that has been cited as an expert globally in the media. Unique in the analyst business, I actually worked for high-tech companies for 21 years, leading strategy, products, and marketing for AT&T ('90), Compaq ('95), AltaVista ('99) and Advanced Micro Devices ('01). I departed AMD in 2011, where I served as Corporate Vice President and Corporate Fellow in the strategy group. There, I developed long-term strategies for mobile computing devices and personal computers.

The Real Reasons Microsoft, Sony Chose AMD For The Xbox One And PS4

It has been two weeks since E3, the world's largest gaming show, and the final pieces of the game console puzzle are starting to fall into place. The public knows what the Xbox One and the PlayStation 4 look like, what they will run, what they won't, their digital rights management, and their price. Ironically, I have yet to read or hear exactly why Microsoft and Sony chose AMD silicon to power their new consoles, and my goal here is simply to lay it out.

To get at why Microsoft and Sony chose AMD, you need to start with the content needs. Both makers were looking for a way to increase the console "footprint", increase the number of apps, and lower the cost of software development. The Xbox One and the PS4 are designed to do a lot more than games: they are meant to be the future hub for all home entertainment, home automation, and control. To do this effectively, they will need hundreds of complex apps that are relatively straightforward to code. Therefore, you need to start with an application processor architecture that supports this, and it's not the Power architecture.

The application processors that power today's Xbox 360 and PS3 are based on the Power architecture. They delivered decent performance seven years ago, but Power is much more difficult to program for than ARM (ARM Holdings PLC), MIPS (Imagination Technologies Group PLC), or X86 (AMD and Intel). Additionally, the technological investment in the ARM, MIPS and X86 architectures and ecosystems has dwarfed PowerPC over the last decade, rendering Power obsolete on the required performance per watt. In a world where your console needs to have as many apps as your smartphone, the only answer was ARM, MIPS or X86.

My sources have confirmed for me that both Sony and Microsoft felt MIPS had neither the right-sized developer ecosystem nor the horsepower to power the new consoles. That left ARM versus X86 architecture. I am told there was a technical "bake-off", where prototype silicon was tested head to head across a myriad of application-based and synthetic benchmarks. At the end of the bake-off, ARM was deemed not to have the right kind of horsepower, and its 64-bit architecture wasn't going to be ready soon enough. 64-bit mattered because it maximizes memory addressability, and the next-gen consoles needed to run multiple apps, operating systems, and hypervisors. ARM-based designs will soon get as powerful as AMD's Jaguar cores, but not when Sony and Microsoft needed them for their new consoles.
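A quick back-of-the-envelope sketch (my own illustration, not anything from either console maker) of why 32-bit addressing falls short once a console carries 8 GB of unified memory:

```python
# Why 32-bit addressing is too small for an 8 GB console (rough sketch).
GIB = 1024 ** 3

addressable_32bit = 2 ** 32          # bytes a 32-bit pointer can reach
addressable_64bit = 2 ** 64          # effectively unlimited for this purpose

console_ram = 8 * GIB                # both new consoles ship with 8 GB of unified memory

print(addressable_32bit // GIB)      # 4 -- only 4 GiB, less than the installed RAM
print(console_ram > addressable_32bit)   # True: a 32-bit space cannot map it all
print(console_ram <= addressable_64bit)  # True: comfortably within 64-bit reach
```

With multiple apps, operating systems, and a hypervisor all resident at once, the gap only widens, which is why the 64-bit readiness of each architecture mattered in the bake-off.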

Then there is the matter of implementation. Both Sony and Microsoft wanted a custom X86-based SOC, or system on a chip. An SOC is similar in design to the silicon in a smartphone, where every major piece of functionality is driven off a single chip: the application processor, graphics, video, image processing, audio, security, memory controller and I/O. The only things separate inside the consoles are storage and some power management. The desire for an SOC makes sense for many reasons. First is low power: generally, an SOC draws less power than multiple pieces of silicon. One piece of silicon also takes less space and generates less heat than two or three chips. The requirement for an X86-based SOC ostensibly removed Nvidia from the running.

As for custom, Microsoft and Sony each had different requirements for graphics, video processing, content security, and even memory. They could have attempted to pull it off on their own, but they just didn't have the experience or the IP required to put five to seven billion transistors on one piece of silicon. They also could have contracted a third party like Open Silicon, but frankly, this is far too complex a project, and the stakes too high, to go with anyone who hasn't done leading-edge design. And who can forget Microsoft's "red ring of death", which cost it billions and tarnished the Xbox brand. The requirement for a custom SOC removed Intel, and its graphics, from the running.

Every one of the factors above contributed to AMD getting the nod. AMD won this business because it has the advanced IP, know-how, experience and commitment to make this happen. It has leading-edge IP in CPU, GPU, memory, video, audio, and I/O. It also designed the first quad-core X86 SOC, and it's not a giant leap to take that to eight cores. Finally, AMD built an entire product division to support the effort, a commitment the others weren't prepared to make. It was a clear-cut win.

Disclosure: My firm, Moor Insights & Strategy, provides research, analysis, advising, and consulting to many high-tech companies, including AMD, ARM, Intel, and Nvidia, which are cited in this article. No employees at the firm hold any equity positions in any companies cited in this column.


Comments

Thanks for not being a tool and saying something like those sore losers at Nvidia have been, as if AMD was the only one willing to work for pennies. We have yet to actually see what AMD is being paid for each chip, but I am willing to bet it's about $100, seeing as it's the whole SoC and they had previously paid IBM about $90 for just their CPU and Nvidia about $125 for just their GPU in the consoles. And this one also has the controllers for the hard disk, USB, RAM, etc… Combined with its being custom designed, that has to come with a price tag. And with that $100 AMD probably makes out with good margins. It would be nice, though, if it were more like the $125 they gave Nvidia before for just a GPU.

BS article. The writer seems to be an ex-AMD employee who got canned and now works as a freelance consultant. I had been a huge fan of AMD for the past 9 years, but after years of pathetic customer support and false advertising I have just given up. The stock has not moved up, and while the company is in the crapper employees are still milking the benefits that were granted during the time of Jerry Sanders.

I know for a fact Rick Bergman made an emergency flight to Japan to negotiate with Sony to keep the contract for PS4 as AMD was struggling with its IP, heat dissipation, power consumption etc. I also believe he gave some hefty discounts to keep the contract. I also believe that somewhere IBM too is funding this program because of certain clues which I am picking up.

AMD has no silicon IP that is not available elsewhere. They are a disunited lot who want to milk this PS4 and Xbox win and survive another year. The whole semi-custom business unit was created after the Xbox and PS4 wins; is that a strategy, or a desperate company clutching at straws? Their employees call the environment at AMD "The Hunger Games". All we should do is wish them well and hope the employees join Samsung, Intel, Marvell, BRCM, etc.

Patrick, try to focus on other companies. It's easy to go back to the mothership, but work on something new.

@Mark… wow, nothing at all in your response makes even the smallest bit of sense.

Conspiracy theories about emergency flights to Japan to negotiate and keep contracts are asinine. Not because they do not happen, but because you should expect them to happen. That is part of the corporate world. Who cares?

The criticism that the semi-custom BU was created after the win… again, who cares? They agreed to give the customer what they wanted. The only question from AMD's perspective is whether they made the smart move here and whether it will be profitable, and the author in no way argues one way or the other. The author was clearly highlighting some of the reasons AMD won, one of them being their willingness to customize a chip for Sony/MS. Nowhere in this article is an argument that this was a good move for AMD.

You sound as if you have an axe to grind. I was waiting to be informed, but you provided no convincing information. AMD has awesome intellectual property and incredible execution. Joining the ATI graphics powerhouse with AMD's microprocessor design prowess makes for a potent combination. The PlayStation 4's graphics performance is ten times (10X) that of the PS3, with AMD proving that it has integrated its purchase of ATI quite successfully. This is the best decision Sony and Microsoft could have made. Unfortunately, though, the Xbox One with its slower DDR3 memory has nothing like the super-fast 8 GB of GDDR5 memory that is propelling the graphics performance of the PS4 to 1.84 TFLOPS.
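For what it's worth, that 1.84 TFLOPS figure follows from the standard peak-throughput formula: shader ALUs × clock × 2 ops per fused multiply-add. A quick sketch using the widely reported PS4 GPU specs (1152 ALUs at 800 MHz; figures from press coverage, not from this article):

```python
# Peak single-precision throughput: shader ALUs x clock x 2 (an FMA counts as two ops).
# PS4 GPU figures as widely reported in press coverage: 1152 ALUs at 800 MHz.
shader_alus = 1152
clock_hz = 800e6
ops_per_alu_per_cycle = 2        # one fused multiply-add = two floating-point ops

peak_tflops = shader_alus * clock_hz * ops_per_alu_per_cycle / 1e12
print(peak_tflops)               # ~1.84 TFLOPS, the figure quoted above
```

This is a theoretical peak, of course; real games never sustain it, but it's the number both camps quote when comparing the consoles.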


Wait… are you now saying there is no dedicated GPU in these consoles? It's an all-in-one gimmick? If that's so, I will cancel my pre-order TODAY and just invest another $500 into my tower PC lol… That's insane.

That means nothing. It's not a gimmick. The CPU and GPU are manufactured on the same die to help with power consumption, heat dissipation, noise, space saving, and speed, among other things. There are many advantages to this. The PS4 will be as powerful (graphically) as a Radeon 7850 desktop card (which is ~$180). The Xbox One has 33% less graphics power than the PS4 and is the equivalent of a $100 card. Microsoft was caught using Nvidia graphics cards in a PC to show its tech demos of Xbox One games.

So… one die will contain 8 processing cores for data and a 9th GPU-focused core? Or will the 8 cores share the GPU workload? In which case, a separate GPU has proven time and time again to be superior performance-wise. Temperature-wise, it's entirely subjective based on the application. Power-wise… meh, I'll throw a few more bones if they had to get a few more watts from a power source. Just my opinion. I will hold off and tweak my tower based on this. It's already statistically superior, and it's three years old.

The beginning of your post was correct, pointing out the parent's gimmick nonsense for what it is, but the rest of it is way off. The PS4 has superior graphics, but the Xbox is superior to any PC card because the CPU and video share the same memory space. It's just as silly to say the Xbox compares to a $100 card as it is to say that integrating graphics with the CPU is a gimmick.

Hmmm… one does have to wonder just where you have been these last three years since AMD released the first on-die GPU, and just how dialed in you really are to console or PC gaming. In other words, your ignorance is showing.

The 8-way Jaguar APU designed for the PS4 and Xbox will eliminate the need for a mid-range discrete GPU, or as you like to call it, "a dedicated GPU". An on-die GPU, or APU as AMD has named it, works for many reasons, including but not limited to lower latency and on-die memory controllers, and the amount of motherboard real estate is seriously diminished, so it's cheaper to produce. As far as graphics performance goes, it is based on the next-gen Radeon 7000 GPU.

Basically, if you were to take an 8-way Intel CPU at about $300 per copy, an Nvidia latest-gen GPU at about $1,000 per copy, and a 500-watt power supply, then yes, for $1,800 you would have a bleeding-edge gaming rig. But not too many families are going to purchase that to babysit Johnny, or yourself. No, what AMD has done is take an 8-core CPU, marry it to the latest Radeon designs, and present that to developers so they can optimize for AMD hardware and produce multithreaded games all based on a single platform. Now that will change gaming, as currently 99% of games are NOT MULTITHREADED.

There IS a dedicated GPU. The difference, in home PC language, is that the GPU isn't connected to the motherboard by a restrictive interface like the PCI-e we know and love. Instead, the processor and the GPU are both manufactured directly on the same die. This allows for a highly engineered system (fewer "bottlenecks"), and should let the hardware achieve potential it never practically could, even in a top-of-the-line PC build. Those builds are always limited by the weakest link in the system: because they are made to be generic, and therefore compatible with a wide variety of components, they aren't as finely tuned as they might be. In this case, with everything on the same die, it should run like a well-oiled machine.

Granted, I’m no expert, but this is what the reading I’ve done has told me.

I think Intel wasn't interested in a low-margin product that didn't require high-performance silicon. They can make more money elsewhere. Intel was in the original Xbox; they sold a lot of chips but didn't make much money.

AMD is notorious for driver issues, as a large portion of headline gaming companies run their benchmarks on and optimize for Nvidia architecture. This may owe something to more creative marketing on Nvidia's side, but I have personally had massive issues with driver optimization in many PC games involving AMD/ATI, and I now specifically use AMD CPUs with Nvidia GPUs in my PC builds.

Factual errors:

- The Cell processor in the PS3 is based on the Power Architecture, while the Xenon processor in the Xbox 360 uses PowerPC instructions.
- Sony, with its partners, developed the Cell processor specifically for the PS3, demonstrating that it would have the experience and IP to pursue the option of once again making a bespoke chip for its new console.

Grammatical/typographical errors:

- Xbox One, Xbox 360, x86: capitalize as given.
- "but it is much more difficult to program for *the* ARM …": "the" should be "than".
- "SOCs are silicon similar in design *that is* in a smartphone": should be "to those found".
- Many tense problems, including "64-bit *was* important as it maximized memory addressability": should be "is". Also, "addressability" is jargon-y.

Content problems: This article doesn't answer "Why MS and Sony chose AMD", but rather why they chose x86. By mentioning AMD, one would naturally expect a discussion of why nVidia and Intel were not considered. Instead, there was just one sentence about nVidia.

And, this is just my opinion, but this discussion also ignores the fact that by moving to an x86 platform, both consoles have now made life easier for developers, who can work entirely in an x86 workflow and don't have to worry about "porting" their code to suit the idiosyncrasies of each console's processor. This translates to better game performance and helps to keep development time and costs down. The flip side is that the x86 architecture could possibly lead to more rampant piracy of console games (of course, that's measured against the already rampant piracy of PC games).

Regarding the RROD problem that plagued the Xbox 360, IIRC, that was a production problem, not a design problem (although they did modify the heatsink/fan design on later iterations of the console). Related to this point, the lower power usage isn't in and of itself the key attraction of SOCs. Rather, it's that they generate less heat during operation. Less heat requires less active cooling (i.e. fans; cf. complaints of the fan noise in the Xbox 360) and allows for a smaller chassis (cf. comparisons of the PS3 to a fridge/George Foreman grill).

And I'm no expert. I'm certainly no analyst. But I find it difficult to believe that the author had yet to read or hear about why the companies chose AMD for their upcoming consoles. It's been talked about in a variety of sources since the consoles were announced back in February. All one needs to do is google "PS4 AMD nVidia". One such article suggests the reason was, unsurprisingly, cost.

Ah, yes. I apologize for the tone of my comment earlier; I get a bit cranky before coffee. It’s interesting that you’ve listed MIPS as one of the architectures they considered, because I haven’t heard anything about them in recent years. Actually, I was surprised by the amount of technical terminology and concepts that you drew on in the piece. Since we’re drowning in TLAs anyway, you could also toss in APU in reference to the AMD SOC. It’s a good term to use in searches about the topic, especially given some of the other comments regarding discrete graphics (or the lack thereof).