160 Comments

AMD-based CPUs are, perhaps, a generation or more behind Intel's current-gen CPUs in many benchmarks, including gaming. Do you think AMD's poor CPU performance will be a major limitation for the PS4, or will it not matter that much (i.e. future games won't be very CPU-bound)?

Might hurt it a bit, but it depends on the clock rate and whether they balance across cores properly. With GPGPU tech being utilized, that should take a good bit of load off the CPU. Also depends on the resolution it's going to be running at.

My bet is that it wouldn't matter much. In multi-threaded scenarios Intel tends to still be ahead of AMD, but the gap is much smaller. And seeing as there are 8 threads available, hopefully developers will start putting more effort into multi-threading their games to take full advantage.
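The hope in that comment, that engines fan independent work out across all eight cores, can be sketched in miniature. This is a toy illustration, not real engine code; the subsystem names and workloads are made up:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-frame subsystems that can run independently.
def update_physics(dt):
    return ("physics", dt * 2)

def update_ai(dt):
    return ("ai", dt * 3)

def update_audio(dt):
    return ("audio", dt * 4)

def run_frame(dt, workers=8):
    # Fan the independent subsystem updates out across a thread pool,
    # then join before rendering -- the basic shape of a jobified engine.
    subsystems = [update_physics, update_ai, update_audio]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(lambda f: f(dt), subsystems))
    return dict(results)

print(run_frame(1.0))  # {'physics': 2.0, 'ai': 3.0, 'audio': 4.0}
```

The point is structural: the more a game's frame decomposes into independent jobs like this, the less the per-core speed deficit matters.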

I don't see it as an issue, especially considering this will be a vast leap over the Cell BE and the GeForce 7900-class GPU of the PS3. Besides, this combo is still driving just 1080p and eventually 4K displays. I think the hardware will be more than sufficient to produce a good gaming experience. Granted, this upcoming generation of console will probably run for 10 years.

You are talking about desktop GPUs. The GPU inside the PS4 is as good as it gets for the money. ~2 TFLOPS with 176GB/sec suggests a GPU that's extremely close to the HD7970M, and the HD7970M delivers 95% of the performance of a GTX680M for $350 less.

Since you cannot fit a 200W GPU inside a console, not even a 150W one, an HD7970M-style GPU was hands down the best possible high-end choice for the PS4. You simply cannot do better for the price/performance and power consumption right now.
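The "~2 TFLOPS with 176GB/sec" figure can be sanity-checked with back-of-envelope arithmetic, assuming the widely rumored configuration (18 GCN compute units at 800 MHz, 256-bit GDDR5 at 5.5 GT/s); these inputs are reported specs, not official numbers:

```python
# Back-of-envelope check of the "~2 TFLOPS with 176 GB/s" claim,
# assuming 18 GCN compute units (64 lanes each) at 800 MHz and a
# 256-bit GDDR5 bus running at 5.5 GT/s. Assumed, not confirmed.
cus, lanes, flops_per_lane, clock_hz = 18, 64, 2, 800e6  # 2 FLOPs = fused multiply-add
tflops = cus * lanes * flops_per_lane * clock_hz / 1e12
print(round(tflops, 2))  # 1.84

bus_bits, transfer_rate = 256, 5.5e9  # transfers/sec per pin
gbytes_per_s = bus_bits / 8 * transfer_rate / 1e9
print(gbytes_per_s)  # 176.0
```

Both results line up with the numbers quoted in the thread, which is why the HD7970M (a very similar GCN part) keeps coming up as the nearest desktop-class comparison.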

The original XBox 360 shipped with a 203W power supply, and the Xenos GPU was believed to have a TDP of about a hundred watts, so 150 or 200W in a console isn't impossible.

Heck, my Shuttle XPC has a 500W PSU that's much smaller than the 360's power brick. I suspect the large size of the 360's PSU had more to do with it being passively cooled than anything else. If the PS4 (or 720) had an internal GPU that could leverage the active cooling system inside the console, fitting a 150W GPU inside the thing would not be terribly difficult.

Remember, a console can dedicate all its resources to graphics, so a 7800 GTX running a PC game won't compare to PS3 graphics; just look at Uncharted 3, for example. So the PS4 will likely outmuscle anything you can do on a 7970M.

I never said it's impossible, but it would make the console a lot larger, more expensive to cool, etc. I realize that PS3/360 consoles drew 200-240W of power when they launched. For cost reasons the CPU+GPU are on the same die. That alone means going with a discrete GPU option like the HD7870 would have cost a lot more money. At the end of the day, Sony is not the company it used to be. It cannot afford to sell an $800 (original BOM for PS3) console for $600 in this market. Not only can they not afford such losses on the hardware, but the market will refuse to buy a $599 console.

A generation behind? If you are referencing Titan, it is in no way comparable to what goes into consoles. Looking past the fact that NVIDIA doesn't have an integrated GPU solution (as they do not make CPUs), NVIDIA's mobile chips are certainly not a generation ahead of what AMD has. I would actually give AMD the edge in mobile GPUs.

I've never understood the obsession with backward compatibility. For one, unless you have a library of games, it's not important, and if you have a library of games you already have a PS3. If you already have a PS3, just keep it to play your old library of games. Problem solved. Two, I would much rather have $100-$200 less in parts, and, by extension, retail price of the console, with no backward compatibility, than vice versa. And no, you are not going to run PS3 games on the PS4 in emulation, so that is not an option.

So you don't have 3-4 consoles plugged into the TV. Personally, in my household we have a Wii on the kids' TV and a PS2 on the main TV. We are waiting for the PS4... I don't see the point of spending $250 for a PS3 when that can go towards the PS4. But we would love to play some PS3 titles on the same PS4 console.

That would be better than buying a used PS3 for $150 and sucking up space... both with the console itself, the controllers, cabling, etc.

It's bad enough we'll still have the PS2 for now.

The horsepower of the PS4 should allow PS2/PS3 gaming through emulation.

How are they behind Intel exactly? Have you seen the newest game engines? In BF3 and MoH Warfighter, AMD and Intel are neck and neck; it's only in older single-threaded games that Intel beats AMD. For any new game that's multi-threaded it doesn't matter. I imagine with all PS4 games using similar modern engines like DICE's Frostbite 2, AMD and Intel would be neck and neck in CPU performance.

AMD may be head to head with, and sometimes even ahead of, same-price Intel Core processors, but not nearly so with a competitor at the same TDP. Apparently Sony decided the price is more important, and will live with the bigger/noisier design that the higher power consumption of the AMD cores brings with it.

Or they just know that AMD needed this deal badly, and like the leverage this gives them.

C'mon, get your facts right. Jaguar is a low-power core. It has excellent performance/watt and performance/mm^2. There is nothing from Intel which can compete with it. Atom is too slow, SB/IB is too big and too expensive. We are NOT talking about the big server design from AMD, Orochi. And even the Bulldozer cores are not "bigger/noisier" than comparable Intel cores.

It's not even one of the high-end CPUs though. It's Jaguar cores, which are the successor of Bobcat (the competitor to Atom). Jaguar seems about as powerful as ARM's Cortex-A15 to me. Not sure why they'd prefer 8 of those over 4 high-end cores. Just because of price?

For Bobcat it is very much about the size and speed of the cache. The E-350 has its L2 running at half speed, which is much too slow; full speed is a must. 8 cores at 2GHz with full-speed 4MB L2. The L1 cache for Bobcat is much better than Bulldozer's. A separate L1 for every core must be the optimum. Bulldozer should have 64i+64d for every core...

Remember that the devs will be a lot closer to the 'wire' and won't have Windows in the way, so AMD's disadvantage under Windows might not be as significant. Carmack has repeatedly pushed for MS to allow game devs closer access to the hardware.

Tasks and threads aren't APIs. But there are plenty of OS-specific APIs that would allow someone to figure out the OS just by the API. I haven't been able to find anything online saying the PS3 OS is Linux-based.

Just as Quadro and FirePro cards use the same silicon as the ordinary gaming cards do, yet produce a considerable performance advantage over the gaming cards, the custom-made CPU and GPU in this unit will benefit from the same kind of treatment. Rest assured that AMD can tweak its products to deliver next-gen performance with this year's silicon. It is only a matter of the volume that is guaranteed to be purchased that teases a company like AMD or NVIDIA into doing so or not, and the potential sales volume for a console, any console, is always very high. At least when it comes to companies with an established developer and software base like Sony and Microsoft. In short, don't worry. The performance will most probably blow your mind.

Jaguar is a low-power core and the successor of Bobcat. Everything that Intel has in this market is Atom. And Atom is pure sh** compared to Jaguar. Btw, benchmarks say nothing most of the time. Even the bigger AMD processors are good, but most of the time they are slowed by poor software optimization under Windows.

Most software, libraries, and other tools are written with Intel x86 in mind, not AMD's x86. And software optimization is key to performance. I guess with much closer-to-the-metal programming and tuning, AMD's CPU isn't as bad as some of those benchmarks made it out to be.

(That doesn't mean it will outperform Intel; it's just that there is still quite a bit more to squeeze out of AMD.)

It's not a problem. The PS4 is designed to offload data-parallel workloads to the iGPU. A throughput-optimized processor (like a GCN multiprocessor) is better suited for physics and AI calculation, or any kind of sorting, culling, or asset decompression. You can even simulate physics on the iGPU and copy the results back to the CPU in the same frame, so you can create smoke particles that affect the AI.
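The sort of data-parallel job described above (culling, sorting, decompression) is easy to picture: every element is independent, so each one maps onto its own GPU lane. A minimal sketch, with plain Python standing in for a compute kernel and invented example data:

```python
# Illustration of a data-parallel job: cull objects by distance from
# the camera. Every element is independent, so on a GPU each iteration
# of the loop below would run on its own lane in parallel; the
# per-element function plays the role of the "shader".
def cull_by_distance(positions, camera, max_dist):
    def dist2(p):
        # Squared distance avoids a sqrt per object.
        return sum((a - b) ** 2 for a, b in zip(p, camera))
    return [i for i, p in enumerate(positions) if dist2(p) <= max_dist ** 2]

objs = [(0, 0, 0), (10, 0, 0), (3, 4, 0)]
print(cull_by_distance(objs, camera=(0, 0, 0), max_dist=5))  # [0, 2]
```

Because no element depends on any other, the same shape of code scales from one core to thousands of GPU lanes, which is exactly why these workloads are good candidates for the iGPU.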

AMD's APU is several generations ahead of Intel's, which makes up for it nicely, doesn't it? So who went for AMD's APU:

1) Sony
2) Microsoft
3) Valve (SteamBox)

It's also a huge win for AMD on another front: games (nearly the only application we run on PC that needs much computing power) will tend to be multi-threaded from the beginning, so there goes Intel's single-thread performance advantage.

NO. Look at the larger picture here. Regardless of Intel or AMD, Sony has chosen an x86 multicore CPU, which most games today are not even close to being optimized for. This new console will mean that game developers will specifically start coding for multicore x86 architectures, and we will likely see huge leaps in performance for games.

This is a win for PC and console gamers; heck, it is a win for the entire gaming and software industry.

PC processors have been fast enough that many games still are poorly optimized for more than a few cores. As a software engineer, I can understand why -- it's a pain in the ass.

Consoles are a different world, though. Your game has to compete with the other guy's game on hardware that won't change for years, so there's a strong push to get every bit of performance you can get. This is a huge win for all platforms (even mobile phones have 4+ cores), and will greatly strengthen the market for a highly thread-optimized ecosystem.

Which gives consoles their pros and cons... like the Amiga computers from the '80s-'90s. The Amiga 500 sold from 1987-1991 with no change other than systemboard tweaks and shrinks.

The games hit the hardware very, very tightly; so much that most games didn't allow the multi-tasking OS to run. Remember, this is a 7.14 MHz single-core computer... so they needed as much as they could get from it.

This hurt the Amiga as a computer... even with my 25 MHz high-end Amiga 3000, I had to run "emulation" of the Amiga 500 with ADOS 1.3, rather than my full 32-bit ADOS 2.x or 3.0 OS.

From what I (and many others) have experienced, gaming will for the most part always be bound by the resources and tools of the company, as well as the hardware. CPU-bound gaming? Well, from what AMD has planned, the APU is a CPU and a GPU. ;) Does this make any sense?

Lol. But SSDs in a console are needless cost imho; games load from optical media anyways, unless you want an SSD large enough to fit every game on it, which would make the console so much more expensive.

512 MB was on the low side when they released the last gen. 8 gigs is on the high side now. 4 gigs of RAM is really all a gaming PC needs; 8 gigs is nice tho. I think we won't see RAM size as a problem in these consoles for 10 years.

4GB is all a gaming PC needs right now because the vast majority of AAA releases have been pegged for consoles with their paltry 512MB. Now that devs have 8GB to play with, you better believe that 8GB will be bog-standard by the end of 2015.

The fact that the PS4 has 8GB of graphics memory to play with is going to have an interesting effect on PC graphics cards too.

Was gonna say the same but you beat me to it. Now that they've passed the 4GB/64-bit mark, 8GB is gonna be paltry by 2015-16. By then most systems should have 64GB RAM (heck, I have 16GB RAM now, and I upgraded to that in 2012... it only cost me $70!)

Always surprises me when people forget how tech has exponential growth. Seriously, TEN YEARS? PC already has texture packs that demand over 1GB of video RAM, let alone system RAM, and as one guy said, it's mainly due to porting from the current rag consoles that have been outdated since 2 years before their release.

Current PCs have 4GB of DDR3 for the CPU, and another 2GB of GDDR5 for the GPU.

They do not share, though. So the GPU only has 2GB to work with, while the CPU only has 4GB of slower RAM to work with.

The PS4 will have 8GB of unified RAM. So if a developer makes a game and the CPU only needs 1GB of GDDR5 for its tasks, then the GPU can access the rest, the 6.5GB or so of GDDR5 that the OS is not using in the background.
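The budget arithmetic behind that comment is simple, with the OS reservation treated as an assumed figure rather than anything Sony has confirmed:

```python
# Rough budget arithmetic for the unified-pool scenario above.
# All numbers are illustrative assumptions, not confirmed figures.
total_gb = 8.0
os_reserved_gb = 0.5   # assumed OS/background footprint
cpu_game_gb = 1.0      # what the game's CPU-side code needs
gpu_available_gb = total_gb - os_reserved_gb - cpu_game_gb
print(gpu_available_gb)  # 6.5
```

The contrast with a split-pool PC is that neither side is capped in advance: whatever the CPU and OS don't claim, the GPU can use.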

The rumours/leaks pegged the PS4 at 4GB of faster GDDR5, whereas the next Xbox is thought to have 8GB of slower DDR3. It's good that the PS4 now turns out to have 8GB after all. It might cost more now, but the cost will drop over the console's life-cycle, while the extra RAM will no doubt prove useful to developers.

Yup, I thought it would either be 4GB GDDR5 or 8GB DDR3. Got the best of both worlds. Nice. Even the fanciest of graphics cards usually don't have half that much GDDR5 on them, and none of us can get it for system memory.

The PS4 is locked down. Assuming Sony didn't have a sudden change of heart (a completely massive one...) they'll likely try even harder to lock this thing down than they did with even the PS3.

People will eventually break that (like all prior systems), but then there will be the challenge of getting a regular operating system to actually run on it (getting graphics fully functional will likely be the hardest problem... even leveraging the open source radeon drivers it likely won't be easy). Eventually people might be able to have Steam up and running on it... but by that point, the hardware in the PS4 will likely look fairly antiquated by normal PC standards (just like the PS3/360's hardware is very low end by today's standards).

I can't believe how often idiocy like this gets parroted. You cannot compare the specs of a PC to the specs of a console. Even when relatively similar base hardware is used, it is like comparing apples to tires. The console is highly optimized to drive graphics and other game-related calculations. A PC using an HD 7870 will get crushed by a console if the console is based on the hardware Anand says it is.

According to AMD, the same hardware can be 5 or 6 times faster when the DirectX or OpenGL layers can be bypassed. Coding for the GPU's native instructions, and not going through an abstraction layer, can bring big performance improvements.

They don't match up to what we see in real life: a GPU similar to what the consoles have delivers similar results (I know, I've got a GeForce 9650M GT I still use daily).

For another, the Xbox is already using DirectX, and the PlayStation 3 OpenGL. If that was really killing performance that much, the PlayStation 2, PlayStation Portable, Wii, and 3DS could all run the same games as the PlayStation 3 and Xbox 360.

For another, if this were true, someone like Carmack would have been talking about it YEARS ago. He's certainly railed against dumb moves in APIs before. All sorts of developers would be talking about it and demanding action from Microsoft.

Maybe there's some bizarre situation where some graphics function really is that much slower through DirectX than OpenGL or something, but there's just no way it's really making things 10 or 100x slower.

No, the facts are kind of correct. The reason performance on a PC is so slow (comparatively) versus the same hardware in a console is all about those layers and other overheads.

First you have Windows' Direct3D (or OpenGL), which is a general 3D API, plus lots of other performance-hogging systems, and then you have the graphics card drivers themselves.

The PS3 in particular does a really good job (nowadays) of getting rid of those things with the GPU command buffer generation library (libgcm). It's a better way than OpenGL or DirectX when the hardware is known. An OpenGL layer will likely be written on top for the PS4 (just like the PS3) so as to ease developers into it.

There is so much capability that cannot be used with standard APIs, especially in these new GPUs, and particularly with AMD's new tech and the new APU tech that will be in the PS4.

I'll point you to a great article on why this PS4 will be pretty astonishing in the years ahead, as they are working on a new libgcm.

Jesus, who says it's slower on PC??? Ever seen how, say, Dragon Age looks on PC vs PS3/Xbox?

What is vastly different with consoles is that IT MAKES SENSE TO OPTIMIZE FOR PARTICULAR HARDWARE. Now, how much you can win from such optimization depends, of course, on how much the generic code and API it was based on sucked. But there is no way a mature API would slow something down 5-6 times. Even 1.5 times slower is hard to imagine.

I've never seen an article like that and just assumed console development was also done with a 3D API. It seems misleading, then, to compare the "performance" of the PS4 GPU to an HD 7870 when the actual code and graphics that get rendered on the PS4 are at a much higher performance level. I understand that the comparison in an article like this is of the hardware and low-level performance, but most people probably just extrapolate that to in-game performance.

What pariah said. A 7870 in a PC is not optimized for, because developers have to develop for it and two dozen other configurations, so a lot of efficiency is lost; whereas in a console they know EXACTLY what it's capable of and can optimize specifically for it. A 7870 in a console would be similar to a 7950 in a PC. :)

I can't believe how often this stupidity gets repeated. The NUMBER ONE reason PCs require a lot of power to push games on max is because devs take the cheap way out and use effects like HBAO, HDAO, and SSAO that require a lot of GPU power for almost no return. For example, GOW played at medium settings at 800x600 on a 9700 Pro. If PC hardware were SO inefficient, that wouldn't be possible.

There are some efficiencies to be had, but not on the level that you're suggesting or that has been suggested.

Not a chance. Developers have access to the 'metal' on the PS4. Certain functions will work 10-100x faster because there is no API overhead. Some of the graphics we've seen at the unveiling are already approaching Crysis 3 level, and an HD 7870 cannot even come close to maxing out Crysis 3. In 4-5 years, the PS4 will be able to run types of games that an HD 7870 would not, and certainly not a $550 PC, because that means a Core i3 or a low-end FX-4000.

Are you currently using PSN on your PS3? Compare PSN prices of multi-plat games to the Steam prices. Most of them match up; thanks to an EU PSN sale, Dishonored and Darksiders 2 are actually cheaper on PSN than on Steam. I am quite sure that a few games are actually cheaper on Steam than on PSN.

However, the way Sony is pushing for aggressive PSN pricing gives me hope that the PC-console digital game download price difference will shrink considerably over the coming years.

That's a great story. Unfortunately it falls apart because it assumes PCs in general, and budget PCs specifically, will still be running AMD 7870-class GPUs in 4-5 years.

They of course will not be. The PS4, on the other hand, will. Even if the claims of speed gains are true, by the time devs are comfortable with the hardware and cranking out highly optimized games on PS4 hardware, the PC will be 3-5 generations past the silicon in the PS4.

The PS4 will cost $400 and have better performance than your $600 PC with a 7870. My i5-3570K, 4GB RAM, and GTX 660 system cost $800 to build last year. The PS4 will be half that and offer better performance.

Yep, the 8GB of GDDR5 is the real monster here! A lot of people seem to forget how much faster this memory is than normal DDR3 or the upcoming DDR4! This really allows the machine to stretch its muscles. The GPU is also much faster than I expected. And 8 cores, even smaller ones, should be sufficient to feed those GPU cores, so this can really fly for a console! 1080p seems possible, and good 720p games should be just fine.

I think this is unlikely to work. These still aren't off the shelf parts, I doubt you'll actually be able to build your own PS4 from Newegg. And short of some amazing reverse-engineering by hackers, we won't have drivers for anything except the official hardware.

The opposite is much more likely to be true: Windows, OS X, or Linux being installed on this x64 CPU (provided Sony allows it or the console is hacked).

Yes, you might be right; however, this thing is built mostly with off-the-shelf parts. I doubt OS X will be installed on this; Linux and Windows most likely, though I don't know why this would be done (unless it is possible to create partitions for dual-boot without bricking the device).

I don't play console games anymore, but they have clear advantages over something like a PC. Consoles are simple and easier to operate, great for multi-player with people in the same room, and have exclusives that you can't get on the PC (especially if you're a fan of Japanese developers). They are more family-friendly and you don't have to tweak any settings within games. Also, crashes and freezes are far less frequent on consoles.

If I was still into gaming like I was when I was a teenager, I'd buy a PS4 or Xbox 720.

I'd have to disagree. A PC game without intrusive DRM can be a hell of a lot easier to operate. Most of my PC games just load: I don't have to sign in, I don't have to select my save location. I just click on the game icon, wait ten seconds, click continue, and I'm on my way.

As for crashes, the only ones I've had often were in Fallout 3 and NV, and even then that was using mods.

My main console gaming experience is Forza 4. A minute just to load the game. Last time I checked, I had more than 50 hours in menus (not including upgrading/tuning). That was for ~250 hours of play all up. Four days of my life just waiting.

I was looking everywhere for detailed info on the specs, thanks for putting up this info and your thoughts on it!!!! Really more than I expected, as some people were saying it was gonna be 4GB RAM (which would've been terrible if it's supposed to last 8 years). The thing is, these specs aren't anywhere near future-proof... in 2 years they'll be completely low end, let alone 5 years! Anyway, from the games I saw being previewed, the baseline for graphics will be like BF3 Ultra on PC, which is still fantastic!

For PC building, at least, there is no option to install GDDR5 RAM; that kind of RAM is only in GPUs to my knowledge. Is it equivalent to, say, the system's DDR3 RAM? I also realize DDR3 is used in the worse-off GPUs, but I guess my question is about differentiating GPU RAM from system RAM. They say the PS4 has 8GB of GDDR5 RAM, but it's like everyone is acting like it's the system RAM. Can the whole system run off the GPU RAM? Kind of a two-fer? I hope I phrased this well enough. Just so I'm clear (don't need any wasted replies explaining that DDR3 is worse): I know DDR3 RAM in video cards sucks and GDDR5 RAM rocks. GDDR5 is better.

Using GDDR5 throughout the system got me thinking as well. I've never really known the latency involved with GDDR5 but it sure means that Jaguar has a huge amount of bandwidth.

As others have said, a strong CPU isn't really too necessary here, and as more and more games are multithreaded, I expect even two Jaguar cores can spit out a decent amount of work (they are, in essence, as powerful as K8 cores, albeit with a huge list of ISA extensions available to them). I do wonder if HSA comes into any of this. Regardless, perhaps it'll convince developers to spend more time on properly threading their games with so many cores available to them, and with x86, porting should be easier.

As for "built by AMD", I thought they were just licensing the tech out for somebody else to make it?

I know I'm asking much, but some more technical details regarding the cpu/gpu/ram would be nice... :-)

8 GB of GDDR5 and 7850 performance levels is nice (hopefully not comparing to a mobile GPU part). Of course, it also depends on the GDDR5 speed. A unified fast cache would be nice too.

But the CPU is very, very slow by today's (PC gaming) standards. Even with 8 cores, unless it's 3 GHz, the performance won't be enough to push the graphics part to its limit. Don't forget that a game is not quite the optimal task to spread across more than 4 cores, so single-threaded performance is still important. Hell, I've touched notebooks with this kind of CPU and they are painfully slow compared to the cheapest Core i3. They are much better than Atoms, I agree, but that's not the comparison I would make. We need at least desktop Trinity CPU performance here.

Anyway, I know it's impossible, but I would like to see XBMC on this someday... :-)

Do you feel that the CPU performance of existing consoles (Cell and the PPC in the Xbox 360) is lacking? Jaguar is much, much better for integer computation than either of those two machines, even when clocked at a far lower rate. For example, the Power architecture in the 360 has *horrible* IPC, as it is in-order with many pipeline hazards. Clock for clock, Jaguar should be on the order of 3-4 times as fast as the PPE in Cell and the Xbox 360, as it is out-of-order, has extensive register renaming, and a very good branch predictor. Couple that with an increase in real core count, and it seems like a sensible decision for Sony to have made.

I also have a 1GHz C-50 Bobcat that completely smokes my old 2GHz AMD 3000+ (which in turn was faster than the 3GHz P4s of the same era). I also have an i7-2630QM laptop. I only notice the extra power when simulating clocks.

Modern CPUs are so much faster in part because they include all the other chips that used to make up a motherboard (like most of the north bridge and the memory controller). Think about grabbing the pen next to you, compared to sending someone to the shops to buy a new one. Both pens write just as fast; you're just going to have to wait a while for the one from the shops.

The stuff Xorrax talks about is the other part. A great branch predictor and a good memory hierarchy do great things for IPC.

Pick your comparison: I'm comparing to existing consoles, which these are replacing, not PC gaming.

The size, cost, and power benefits of 8 Jaguar cores over something else don't seem like a bad decision. As others have mentioned, it will motivate developers to optimize for multi-core more than they have historically.

Links to benchmarks? Jaguar isn't in the market yet, so I'd be surprised to find anything meaningful out there other than data from engineering samples, which may be suspect.

I think an 8-core 1.6GHz Jaguar is fine. No, it's not the highest end you can get on PC, but they have costs to consider. Plus it will have some modifications that we don't know of too, and no Jaguar on the PC side has access to GDDR5 memory. Developers will have to work around its limitations; that's just par for the course for consoles.

I'm not really savvy to this stuff. Can someone do any sort of preliminary cost analysis to guess how much the components would be? Any sort of general idea would be fine. Just wondering how heavy of a loss Sony is expected to take on this.

First we have to estimate the transistor count. My estimate is 3.2 billion: 2.8 billion for the GPU and 400 million for the CPU. For a chip that size, Sony is probably paying $80-$100 apiece for the first million, and who knows what after that. I can't find the price of an HD 7950M GPU die, but it can't be much more than $100, so this chip isn't going to be much more than that either. Since the volume is higher, I would expect it to be closer to $70 than $100. I'm sure AMD is eating it compared to the margins they get from discrete GPU sales.
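For what it's worth, the usual way such per-die costs get estimated is the dies-per-wafer approximation. A sketch, where the die area, wafer cost, and yield are all illustrative guesses rather than known figures for this chip:

```python
import math

# Sketch of the standard dies-per-wafer / cost-per-die estimate the
# comment above is gesturing at. Every input here is an assumption
# for illustration, not a known Sony or AMD figure.
def cost_per_good_die(die_area_mm2, wafer_diam_mm=300,
                      wafer_cost=5000.0, yield_rate=0.7):
    r = wafer_diam_mm / 2
    # Classic approximation: gross dies = wafer area / die area,
    # minus an edge-loss correction term.
    gross = (math.pi * r * r / die_area_mm2
             - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2))
    good = int(gross) * yield_rate
    return wafer_cost / good

# A ~350 mm^2 die with these guessed inputs comes out around $43/die.
print(round(cost_per_good_die(350), 2))
```

Real costs also fold in packaging, test, and the foundry's margin, which is why per-chip prices quoted in the thread run higher than this raw silicon estimate.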

Really, AMD...? A company with a $2.70 stock price; that tells you just how capable this company is... not. I guess when you go with the worst, you get what you pay for. Give me an Intel i7 processor anytime; I'll pay the extra for performance...

I don't mind them using AMD, but Jaguar? Really? Ouch. It's Cell craziness all over again, only I'm not sure 8x Jaguar will really provide that much more performance than the Cell BE. At 1.6GHz, Jaguar is basically something like 30% faster than Atom at 1.6GHz, which means something like a 1.6GHz Ivy Bridge core or a Piledriver core would be three times faster. I would have much rather seen a 3GHz quad-core Piledriver or similar than octa-core Jaguar.

Of course a CPU like the FX-8350 would often be bored when coupled with an HD 7850 feeding 1080p. So it would also be a good idea to replace the GPU with something in the 7950-or-better range. And if Sony still manages to sell that box for <$500, everybody would be happy.

LOL, good one, but you misspelled your name. And 8 Jaguar cores are about the same size as a single Core i7 core. Add a far cheaper process, and you could probably get a quarter to half of a single Intel core for the same money. Try gaming on that :)

I'm not sure about that. Had you said Piledriver "module", then I wouldn't be quibbling, but as Jaguar should be roughly equivalent to K8 in performance per core, and Phenom was only a modest step up, I couldn't see how a Piledriver core on its own would outperform it so massively, especially considering they're smaller than Stars cores themselves. Even so, if you meant module, I'd agree 100%.

I'm actually quite disappointed with the PS4. Apart from using hardware that's barely current-gen, the new services seem to be blatant ripoffs of what NVIDIA and Nintendo are doing, and nothing actually new seems to be included in the package. The streaming service produces horrible image quality, from what I've seen of the Killzone video. Also, the lack of any next-gen processing like tessellation in the video worries me.

Anyone who was expecting a sub 600 dollar console to be better than PCs costing more was setting themselves up for disappointment anyways. The rumored specs were pretty much dead on, with a pleasant surprise in the 8GB GDDR5.

None of us can get GDDR5 as system memory, and most of us don't have half that, most not even a quarter of it, in a GPU.

And every time a new console launches there are articles about which inexpensive PCs are better, but those kind of miss the point imo; let's see which runs games better in 7 years. Developers won't care about your 7-year-old PC, but they will care about the current generation of console hardware and will ensure their game runs well on it.

The reason a PC won't last as long, even if it's more powerful, is because PC graphics improve more over time. Even many current console ports already look significantly better on PC. So yes, a console will obviously always run its games well, since its hardware stays the same. But if PC games stayed at the same level of graphics for many years at a time, then PCs would easily be able to do the same.

Personally I'd prefer the better graphics, and if it means upgrading now and again, I don't have a problem with that.

That is going to be huge for home consoles. Bigger than anyone can even imagine. Who the hell needs saves when you can just suspend the console at a point and then start up right where you left off?

No logos, no menus, no copyrights, no waiting. You sit down. You pick up your controller. Your PS4 with its camera detects you and the game just starts up. Before you have even sat down, the game is sitting staring at you paused. Waiting for you to hit the Options button to continue.

You need to go to the potty, then the grocery store to pick up more Red Bull, nachos, and a card for apologizing to your girlfriend for forgetting her birthday while playing the new Killzone? You come back an hour later, you sit down. It's on, waiting for you again.

Especially if it can leverage HDMI-CEC to manage your HDTV and receiver on and off, too.

Huh? I don't see why this is a big deal. You can just pause the game and turn the TV off or do something else. Computers stay on all day and night, so why not the PS3 too? I've left it on for days and never had any issues.Reply

It's not so much that the PS3 can't stay on all day and night; it can, easily. I've left mine on Folding@Home for weeks at a time. It's just that every time I think to pause the game and turn off the TV, I have a little mental battle: how long till I'm back at the TV? How much electricity is the system going to draw needlessly? How much time is going to be wasted waiting for the game to reload?

All of that is now gone, because it just suspends. It's not that it wasn't possible before, just that the console is built to do it now. You can turn it off without thinking, whether you're coming back in 15 minutes or 5 hours, which will be nice for a console, especially if the game only lets you save at particular points. Reply

What I'm missing in most "previews" I've seen so far is the fact that the Jaguar cores are the first-generation cores using HSA. For sure that'll have an effect on performance when comparing against current-generation CPUs, making comparisons really hard to do. This PS4 will pack much more punch than the specs suggest when compared to current PCs. Reply

People aren't thinking about some things when they comment about the CPU being an AMD CPU.

1: These are low-power Jaguar cores, not Piledriver/Bulldozer. That means a power draw in the 5-25W range, using TSMC's 28nm HKMG process.

2: You can't expect the performance of a standard Jaguar APU to equal this APU, due to the difference in memory controller. A standard Jaguar uses DDR3, whereas this one will have access to much faster GDDR5. It should be interesting to see if this gets around the memory-controller bottleneck present since the Phenom.

3: This should change the landscape for gaming, since all new games will be written for multithreaded PC-style hardware. And in this coming gen's case, it should benefit AMD, with most games being pre-optimized for AMD hardware. Reply

4: These Jaguar CPU cores are all full-function x86-64 CPU cores, which means all new games for these consoles will be programmed to run in native x86-64 mode. Most PC ports should therefore come able to run in native x86-64 mode as well, which should allow much better visual quality and give programmers access to more RAM. Reply

GDDR5 actually has much higher latency than DDR3. I wonder how that will play out. With GPUs, latency doesn't matter as much, because bandwidth is the most important thing, but with CPUs, beyond a point bandwidth isn't as important, and much higher latency compared to DDR3 could hurt. Reply

I'm aware; maybe it won't be a big deal, which is why I said I was just wondering about it. But I think the latency difference across DDR1/2/3 is smaller than the jump from DDR3 to GDDR5. Maybe the clock rate cancels some or most of that out. Again, I'm unsure, just speculating. Reply
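The latency-versus-bandwidth trade-off being debated above can be put in back-of-envelope numbers. A minimal sketch, with illustrative assumed figures (the latency and bandwidth values below are not measured numbers for any real DDR3 or GDDR5 part):

```python
# Why latency matters for CPU-style access but bandwidth matters for GPU-style
# streaming. All latency/bandwidth figures here are illustrative assumptions.

def transfer_ns(size_bytes, latency_ns, bandwidth_gbs):
    """Time for one memory transaction: fixed latency plus streaming time.
    1 GB/s is treated as 1 byte/ns."""
    return latency_ns + size_bytes / bandwidth_gbs

ddr3  = dict(latency_ns=50,  bandwidth_gbs=25)    # assumed DDR3-ish numbers
gddr5 = dict(latency_ns=100, bandwidth_gbs=176)   # assumed GDDR5-ish numbers

# CPU-style access: a single 64-byte cache line -- latency dominates.
print(transfer_ns(64, **ddr3))   # ~52.6 ns
print(transfer_ns(64, **gddr5))  # ~100.4 ns, slower despite ~7x the bandwidth

# GPU-style access: streaming a 4 MB texture -- bandwidth dominates.
print(transfer_ns(4 * 2**20, **ddr3) / 1000)   # ~167.8 us
print(transfer_ns(4 * 2**20, **gddr5) / 1000)  # ~23.9 us, far faster
```

Under these assumptions the commenter's point holds: for small, latency-bound CPU accesses the higher-latency GDDR5 loses, while for large streaming transfers its bandwidth wins easily. Caches would hide much of the small-access penalty in practice.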

People here are talking about how AMD CPUs are slower than Intel CPUs. This is true in a Windows environment. What we all must remember is that this system will have an OS and software specifically optimized for AMD's x86 multi-threaded processors. AMD CPUs in Windows are victims of poor optimization: AMD doesn't put out a widely used compiler for its CPUs the way Intel does. The PS4 simply will not have that problem.

Methinks it is because Intel's chips are simply too expensive, and Intel is unlikely to negotiate on price. It would be pretty silly to spend half of the budget on the CPU (i.e. $200 for an i7 in a $400-odd console) when the money could be better spent elsewhere. Reply

Seems to me that hardware-wise, the new Xbox and the PS4 will be very similar, and they will have similar performance. The main difference between them will be their UI and online features. I don't really see the need to buy new systems yet, as I have too many unfinished games on both the 360 and PS3. Granted, the graphics will be nicer, but will the gameplay be that different? The social stuff is not important to gameplay to me. Eventually I will get both new systems once something comes out that I must have; Halo 5 and the new Killzone would interest me. Hard to justify the expense of new systems when the current ones still have life in them.Reply

Many people here seem to have unrealistic expectations for the PS4. The supposed "high end" has moved a lot from where it used to be. There is no way you are going to get a $1000, 300-watt graphics card in a console targeted at approximately half that price (or less?). How many people actually buy these ridiculously expensive, power-hungry cards anyway? I wouldn't buy a video card for more than $300 (and even that seems high); the price/performance ratio just isn't there for these super high-end offerings.

Also, does anyone think that Intel or Nvidia was ever really an option? Going with a separate CPU and GPU increases cost and power significantly (off-chip communication wastes power and reduces performance). Intel has plenty of CPUs that could have been an option, but it does not have a powerful enough GPU for a single-chip solution. Nvidia isn't really an option either: it has the GPU, but no good CPUs. Would you rather have an Nvidia GPU with some integrated ARM cores? AMD is really the only one who can easily provide both a powerful GPU and sufficient CPU power on one chip. AMD wins on price, performance, and power, since neither Nvidia nor Intel can offer all of this on a single chip.

Direct comparison with the amount of memory in a PC is not relevant either. Most of the memory in a current PC is wasted; the GPU memory is really only a high-bandwidth cache for stuff that is in system memory. The 8 GB of GDDR5 in the PS4 should be significantly more efficient compared to the PC architecture. Hopefully it is all accessible from both the GPU and the CPUs, and is all in a single, cache-coherent memory space; it seems like this could be figured out from dev kits...?

It would be nice if we could get a similar architecture in the PC space. With how small current CPU cores are, and with the memory controller on the CPU, it really doesn't make sense to have CPUs and GPUs separate any more. The actual integer cores are tiny these days. Most of the CPU is taken up with caches and FP units. If you have a GPU on the same die, then the CPU FP units are a lot less important. You still need to have FP units for mixed code, but for serious FP processing, just execute it on the GPU. Although, without using super fast on-board memory, the bandwidth to the GPU will be lacking. It sounds like Intel may be trying to solve this with more on-die memory; most people don't actually need more CPU cores anyway. I was expecting the PS4 to use a large on-die memory, but this probably isn't necessary since they went with such large and fast external memory.

For a PC, though, I would want the ability to use multiple GPU/CPU/memory cards.Reply

"GPU memory is really only a high-bandwidth cache for stuff that is in system memory"

That's not correct. There's no need to keep textures in system RAM once they're in graphics-card RAM. Also, you completely ignore virtual memory. When a game is run on the PC, if other software is using RAM the game needs, a lot of it will be paged out.

The console, on the other hand, will likely not have virtual memory and will keep at least 1GB if not 2GB of that RAM for its operating system and other software (streaming, social, ...).

And BTW, we do have a similar architecture on the PC side. AMD APUs have been available for a while, and AMD's plan has been to offer more and more integration over time. The main differences here are a skew towards more graphics power and less CPU power (which isn't necessarily good for a general-purpose PC chip) and a lot more memory bandwidth. It will be interesting to see if any of this makes it to the PC space in some form.Reply
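The cost of the staging copy that a split CPU-RAM/VRAM design needs, and that a unified pool avoids, can be sketched in rough numbers. The PCIe bandwidth and per-frame upload size below are illustrative assumptions, not measurements of any real system:

```python
# Rough arithmetic for the system-RAM-to-VRAM copy a discrete GPU needs,
# versus a unified pool where the GPU reads data where the CPU wrote it.
# All figures here are made-up-for-illustration assumptions.

PCIE_GBS = 8.0  # assumed usable PCIe link bandwidth, GB/s

def staging_copy_ms(megabytes, link_gbs=PCIE_GBS):
    """Time to copy freshly generated data from system RAM to VRAM."""
    return megabytes / 1024 / link_gbs * 1000

# Uploading 256 MB of new textures/geometry in one burst:
print(staging_copy_ms(256))  # 31.25 ms over the link; in a unified,
                             # cache-coherent pool this copy simply disappears
```

That is the efficiency argument in miniature: the unified design saves both the duplicate allocation and the time spent moving bytes over the link, at the price of CPU and GPU contending for the same memory controller.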

It is a shame the bottleneck will be physical media. We may end up discussing the long load screens on initial loads. I love the massive capacity Blu-ray affords, but it is the slowest kid on the block. Then the mechanical hard drive is going to be an issue. I hope that Sony allows the ability to swap out the hard drive (SATA 3) and supports SSDs through the firmware, which would boost overall performance. Heck, the option to load almost everything onto an SSD off of Blu-ray would be an incredible option; the Xbox 360 was allowing this, if I am not mistaken. Overall it is appearing to be one heck of a system for pure gaming.Reply

A lot of odd comments are being made about the future of PC gaming, and about its current state.

Someone said that most games need 4GB of RAM to play? Really? I can play multiple PC games like SimCity and Far Cry 3 without even touching the 4GB mark on my system, and that is with Windows 7 64-bit in the background (which alone only uses 1.3GB of DDR3).

I have yet to even get close to 100% RAM usage, no matter what I'm doing (except if I'm purposely doing it with a prime64 stress test).

8GB of GDDR5 will be matched by 2015 at a reasonable price?! That is a funny thing to consider. PCs don't even use GDDR5 for anything but GPUs. There isn't a CPU even in the works right now that will use GDDR5, not that I've read or heard about, anyway. As for a PC GPU with 8GB of GDDR5 by 2015 at a mass-market price? Hell no. The only ones I could even find are well into the $2,000 range. It seems for the moment PCs are generally going to use 3GB of GDDR5 as a standard within the next few years.

There are plenty of PC exclusives out there, like the Total War series, Metro, and others, that do not worry about the console world. Not sure why people think developers will actually hold back on their titles simply for the sake of the consoles. All they have to do is create the engine for the PC and then basically turn everything down to Normal/High settings for the consoles: turn off AA, turn on DX9, etc. This type of thing is done on the PC all day long, and it takes all of 10 seconds to change your settings around to match your system's capabilities.Reply