PS4 reverse engineered

The PlayStation 4 and Xbox One may not offer an enormous jump over last-generation graphics out of the gate, but the SoCs inside the consoles — semi-custom APUs made by AMD — are a huge leap forward in terms of integration and capability. Both consoles integrate functionality that was previously broken out into multiple chips, using the most advanced 28nm process currently available. A new teardown of the PS4 gives us a look at how the SoC itself was assembled — and how Sony chose to hedge its bets a little when it came to yield.

Chipworks has published multiple die photos of the SoC, and they show us some very interesting things about how the chip was built.

First off, this should settle, beyond any question, exactly which CPU architecture the processor is based on. Despite continuing claims from some that the SoC must contain a higher-end Kaveri/Steamroller-class CPU, the tiny x86 cores implemented here are clearly based on Jaguar/Kabini. Each core is roughly 3.1mm² — exactly the size AMD gave for that design. The large (rather plaid) boxes in each quad-core arrangement are the L2 cache. Memory I/O wraps three sides of the die, which makes sense — the SoC uses a GPU-style memory layout. It's not clear from this diagram whether HSA is implemented on the chip or not, though it might be possible to identify the IOMMU that HSA requires with a close analysis.
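That core-size figure also makes the transistor-budget split concrete. A back-of-the-envelope sketch, using AMD's published ~3.1mm² Jaguar core size and Chipworks' 328mm² die estimate (L2 caches and interconnect excluded, so the real CPU cluster is somewhat larger):

```python
# Rough CPU area budget: eight Jaguar cores at ~3.1 mm^2 each,
# against Chipworks' 328 mm^2 estimate for the whole SoC die.
core_area = 3.1   # mm^2 per Jaguar core (AMD's published figure)
cores = 8
die_area = 328.0  # mm^2, per Chipworks

cpu_cores_area = cores * core_area
print(f"CPU cores: ~{cpu_cores_area:.1f} mm^2 "
      f"({cpu_cores_area / die_area:.0%} of the die)")  # ~24.8 mm^2, ~8%
```

Even before counting the GPU, the CPU cores themselves claim well under a tenth of the die.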

Speaking of the SoC, this is one big puppy.

Die size on the chip is 328mm², and the GPU actually contains 20 compute units — not the 18 that are specified. This is likely a yield-boosting measure (disabling two CUs lets chips with minor defects still ship as full-spec parts), but it also means AMD implemented a full HD 7870-class GPU in silicon. According to Chipworks, the GPU is 88mm² and takes up about a third of the total die. Looking at AMD's published figures for the HD 7870, however, the Pitcairn GPU core is a 228mm² part. So which is correct? Probably both. Chipworks is only counting the shader cores as part of the GPU, whereas the full Pitcairn die contains memory controllers, audio processing blocks, video encode/decode hardware, the PCIe 3.0 bus interface, and a number of other low-level silicon blocks that increase total die size. The actual streaming processors are only one component.
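Those figures hang together on a quick check. A rough sketch, taking Chipworks' 88mm² and 328mm² estimates at face value:

```python
# Rough area arithmetic from Chipworks' published estimates.
die_area = 328.0       # total SoC die, mm^2
gpu_cores_area = 88.0  # shader cores only, per Chipworks, mm^2

print(f"GPU cores: {gpu_cores_area / die_area:.0%} of the die")  # ~27%

# Area cost of the two spare CUs kept around for yield:
cu_count, spare = 20, 2
per_cu = gpu_cores_area / cu_count
print(f"~{per_cu:.1f} mm^2 per CU; the spares cost ~{spare * per_cu:.1f} mm^2")
```

At roughly 4.4mm² per compute unit, the two spare CUs are a cheap insurance policy against defects.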

If you ever wondered how Chipworks reverse-engineers chips to get these die shots, incidentally, read this story: How to crack open some computer chips and take your own die shots.

The Wii U’s design was a bit prophetic

One of the interesting things about the PS4's design is how closely it echoes the Wii U's. While the PS4 is far, far more powerful than the Wii U, Sony made very similar decisions about where to spend its transistor budget. As some of you may recall, the Wii U's GPU is far larger than its CPU — the latter is a triple-core version of the Broadway core that powered the Wii (which was, itself, a higher-clocked version of the GameCube CPU). The GPU, while still underpowered by current standards, integrated far more computational horsepower than the old Hollywood GPU inside the Wii.

If you look back at die sizes for the PS3 and Xbox 360 era, the CPU and GPU were very nearly equal partners. The PS4 on 28nm is still a third larger than the integrated PS3 chip on 40nm — if both were built at 28nm, the PS4 would be roughly twice the size of its predecessor. This also helps explain the tension between pushing more graphics horsepower into the next generation and the need to ship a product, period. Beefing up the PS4 further would have cost more and made the chip more complex. Waiting for 20nm, when costs would be more manageable, would have meant waiting a year or more while TSMC ironed out the bugs in its process. Giving the Xbox that kind of lead time was likely untenable to Sony executives.
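The die-size comparison above follows from simple shrink arithmetic. A sketch under an idealized model (area scales with the square of the feature-size ratio; real shrinks fall short of this, and the ~246mm² PS3 figure is derived from the article's "a third larger" claim, not measured):

```python
# Idealized die shrink: area scales with the square of the feature-size
# ratio. Treat the result as a best case, not a guarantee.
shrink = (28 / 40) ** 2
print(f"40nm -> 28nm ideal area factor: {shrink:.2f}")  # 0.49, i.e. half

ps4_area = 328.0                    # mm^2 at 28nm, per Chipworks
ps3_area_40nm = ps4_area / (4 / 3)  # ~246 mm^2, derived from "a third larger"
ps3_area_28nm = ps3_area_40nm * shrink
print(f"PS3 chip shrunk to 28nm: ~{ps3_area_28nm:.0f} mm^2 "
      f"vs. the PS4's {ps4_area:.0f} mm^2")
```

An ideal 40nm-to-28nm shrink halves the area, which is where the "twice the size of its predecessor" comparison comes from.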

Sony will almost certainly take the PS4 through the same die shrink process that characterized the PS3, but not for at least a year — TSMC is set to begin volume production of 20nm starting in Q1 2014, but won’t be ready to transition a part as complex as this one for quite some time.

So all the companies went with an APU. Quite interesting; it should yield some good performance.

Also, can someone who knows better clarify: do the eight Jaguar cores have the same floating-point throughput as an i7-2600 (assuming all cores get used)? I'm merely pondering the potential. The CPU is vastly underpowered, and that particular i7 was a very good gaming CPU, so that would be some saving grace.

So would this mean the XB1 and Wii U have similar GPUs, 5770 vs. 7770? (I'm just guessing.)

Floating-point ops aren't all that pertinent to CPU performance. FLOPS are a poor metric because they only matter within a narrow context — not the general-purpose work a CPU should be doing.

For instance, Cell had an enormous floating-point rating that murdered everything around it, so it was ungodly fast at certain functions. But if you were to try to run, say, Windows on it? Dead in the water.

On paper, FLOPS-wise, Cell is somewhat competitive with an i7. But in 99% of workloads the i7 would murder it.
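For what it's worth, the raw peaks from the earlier question can be estimated as cores × clock × FLOPs per cycle. A back-of-the-envelope sketch, assuming a 1.6GHz PS4 CPU clock (widely reported but not confirmed by Sony), Jaguar's 128-bit FPU issuing one 4-wide single-precision multiply plus one 4-wide add per cycle, and Sandy Bridge's 256-bit AVX units doing the same at 8-wide; real workloads land far below these peaks, for exactly the reasons above:

```python
def peak_sp_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak single-precision GFLOPS: cores x clock x FLOPs/cycle."""
    return cores * clock_ghz * flops_per_cycle

# Eight Jaguar cores at an assumed 1.6 GHz; 128-bit FPU gives
# 4-wide MUL + 4-wide ADD = 8 SP FLOPs per cycle per core.
jaguar = peak_sp_gflops(8, 1.6, 8)

# i7-2600: four cores at 3.4 GHz base (Turbo ignored); 256-bit AVX gives
# 8-wide MUL + 8-wide ADD = 16 SP FLOPs per cycle per core.
i7_2600 = peak_sp_gflops(4, 3.4, 16)

print(f"8x Jaguar: {jaguar:.1f} GFLOPS")  # 102.4
print(f"i7-2600:  {i7_2600:.1f} GFLOPS")  # 217.6
```

Under these assumptions the i7-2600's peak is roughly double the eight Jaguar cores', so "same floating point as an i7-2600" is generous even on paper.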

Yeah, that was basically what I was looking for with regard to FLOPS: just a general standing for the CPU. Nothing major; I wasn't going to start screaming from the rooftops or anything.

Wow, that is a very nice card for a console to use, considering past consoles were way behind. The PS4 is certainly at the right level to deliver the performance and graphics us console gamers expect.

That's not technically true. The PS3 and 360 pushed amazing hardware at launch, and as Vulgotha said, the Cell is still "strong" (term used loosely) today against high-end CPUs, given the right conditions (read his post).

A Cell + Xenos console would still be relevant today, for example.

This time the consoles are nowhere near the bleeding edge; instead they've gone for removing the bottlenecks inherent in PC design. Hence the APU.

The APU can't compete with a discrete GPU and CPU, but one of the threads here ages ago detailed how the PS3's setup created natural bottlenecks in the system that devs had to work around.

If the PS4 were to share a similar launch price with its predecessor, I think you'd be seeing a card similar to the Titan being used, or at the very least a 7970.

THAT being said, the 7870 is still a mega chip to use in that APU! Now we just have to hope the CPU can last!

Yeah, if the performance were boosted, I would agree with that. I do agree about the Cell in a sense: it was good for offloading certain tasks, but when given simpler things to do it was just awful. Certain processes, such as the browser, were horrible, in fact really bad, though that could be because of the PS3's low RAM. The problem with the Cell, I believe, was largely the difficulty of using it, though it wasn't just that. Even so, I consider a non-alienating CPU and GPU a much, much better choice; with the Cell you sacrificed a lot, such as ease of development.

PlayStation Universe

Copyright 2006-2014 7578768 Canada Inc. All Rights Reserved.
