After bagging chip-supply deals for all three new-generation consoles -- Xbox One, PlayStation 4, and Wii U -- things are looking up for AMD. While the Wii U uses older-generation hardware, the Xbox One and PlayStation 4 use the very latest AMD has to offer: the "Jaguar" 64-bit x86 CPU micro-architecture and the Graphics Core Next GPU architecture. The chips that power the two consoles have a lot in common, but also a few less-than-subtle differences.

The PlayStation 4 chip, which came to light this February, is truly an engineer's fantasy. It combines eight "Jaguar" 64-bit x86 cores clocked at 1.60 GHz with a fairly well-specced Radeon GPU, which features 1,152 stream processors, 32 ROPs, and a 256-bit wide unified GDDR5 memory interface clocked at an effective 5.50 GHz. At these speeds, the system gets a memory bandwidth of 176 GB/s. Memory is handled UMA (unified memory architecture) style: there's no partition between system and graphics memory. The two are treated as items in the same 8 GB pool, and either can use up a majority of it.
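
The 176 GB/s figure checks out from the bus width and data rate alone. A quick sanity check (assuming the quoted 5.50 GHz refers to the effective GDDR5 data rate, i.e. 5.5 GT/s):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate in GT/s)
bus_width_bits = 256   # unified GDDR5 memory interface
data_rate_gtps = 5.5   # 5.50 GHz effective GDDR5 transfer rate

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gtps
print(bandwidth_gb_s)  # 176.0 -> matches the quoted 176 GB/s
```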

The Xbox One chip is a slightly different beast. It uses the same eight 1.60 GHz "Jaguar" cores, but a slightly smaller Radeon GPU that packs 768 stream processors, and a quad-channel DDR3-2133 memory interface, which offers a memory bandwidth of 68.3 GB/s across 8 GB of memory. Memory between the two subsystems is shared in a similar way to the PlayStation 4, with one small difference: the Xbox One chip uses a large 32 MB SRAM cache, which operates at 102 GB/s, at far lower latency than GDDR5. This cache cushions data transfers for the GPU. Microsoft engineers are spinning this as "200 GB/s of memory bandwidth," by somehow adding up the bandwidths of the various memory types in the system.
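
The DDR3 figure can be verified the same way, and the arithmetic also shows why the "200 GB/s" claim raises eyebrows: even naively summing the SRAM cache's bandwidth with the DDR3 interface only reaches about 170 GB/s. A sketch, assuming four 64-bit channels as implied by the quad-channel spec above:

```python
# Quad-channel DDR3-2133: four 64-bit channels at 2133 MT/s effective
channels = 4
channel_width_bits = 64
transfer_rate_gtps = 2.133

ddr3_bw = channels * (channel_width_bits / 8) * transfer_rate_gtps
print(round(ddr3_bw, 1))          # 68.3 -> matches the quoted 68.3 GB/s

# Naively adding the 32 MB SRAM cache's 102 GB/s still falls short of 200 GB/s
print(round(ddr3_bw + 102.0, 1))  # 170.3
```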

The two consoles also differ in software. While the PlayStation 4 runs a Unix-derived operating system with the OpenGL 4.2 API, the Xbox One uses software that developers are more familiar with: a 64-bit Windows NT 6.x kernel-based operating system running the DirectX 11 API. Despite these differences, the chips in the two consoles should greatly reduce multi-platform production costs for game studios, as the two consoles together have a lot in common with the PC.

Source: Heise.de

by: ManofGod
Got 3 words into that video and realized he had absolutely nothing worth listening to. :slap: Clicked the X and moved on, not even worth reading the comments.

What I find funny about it is how much they went on about watching movies on it. Like, sorry, the last thing I would get is an Xbox to watch movies, paying MS to go online and then paying other people as well to use the service.

You'd just be better off with a PS4, but they need exclusives to make people like me even think about getting an Xbox.

And if movies and stuff are what you're mostly going to do with it, get a frigging Roku 3, as that will beat the pants off it in every way, even more so on power usage: the unit only draws 3.2 W under load and supports third-party stuff too.

I would have to go PS4 for a few reasons, like Heavy Rain (if there is ever another one of those) and Uncharted, and then the free online play, so no monthly fees.

Again, if you're just watching movies, the Roku 3 has more than enough to keep you happy for a long time.

by: Jstn7477
I have no problem paying for hardware that makes my games run smoothly, considering I have a $300 monitor that functions best at 120 Hz. Playing games at 40 FPS was something I did years ago, with an X2 4400+ and a 7800 GS in 2008, then an X4 9750 and a 9800 GT, and then a 4 GHz 955BE and an HD 5770, before I got my 2600K and HD 6950 in late 2011. My minimum framerate in TF2 almost doubled when I got the i7 (before you call out the video card differences, my 5770 was never fully stressed in TF2 to begin with). Without VSYNC, TF2 runs in the 200s, but in the largest fights on 24-28 player servers my framerate dips down to around 100 with shadows off, sometimes less in extreme situations. My main work computer with a 2.5 GHz Phenom X3 8550 and a 3850 AGP hangs around in the mid 30s-40s in the same situations with an under-utilized GPU.

Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120 Hz monitor doesn't mean your experience is diminished if 120 FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120 Hz. I'm not saying 25-30 FPS is something PC gamers should be accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4-year-old CPU/GPU. I'm not going to drop money just to see Fraps report 120 FPS instead of 70 FPS at the same detail settings, for a negligible difference.

by: Jacez
Now, the Xbox One has an underclocked (850 MHz -> 800 MHz) HD 6770 inside and the PS4 has an underclocked (900 MHz -> 800 MHz) HD 6870. The memory bandwidth appears to be accurate for both examples.

All in all, it's an entry-level/mid-range computer.

What? The PS4's GPU is a 1,152-core GCN v2.0 part @ 800 MHz with shared system memory of colossal size and bandwidth.

And I'm pretty sure the XBO's GPU side is no slouch either.

The PS4's CPU part is @ 2 GHz; it's been confirmed time and time again over the weeks after its launch. Not sure about the XBO. All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results way above what the i7 can do on a straight-up PC platform... of course, in the following years, not at launch. However... taking into account AMD's HSA... the CPU part plus the GPU grunt work... its computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than a 4-CPU 10-core Ivy Bridge server... (abstracting out the fact that the GPU as a pure graphics processing unit will be starved of resources in that scenario).

All in all, it's more like a mid-range or above gaming system, with vast untapped capabilities we aren't aware of yet... (cue AMD's Kaveri/Kaveri+ HSA demonstrations...)

by: Dent1
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120 Hz monitor doesn't mean your experience is diminished if 120 FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120 Hz. I'm not saying 25-30 FPS is something PC gamers should be accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4-year-old CPU/GPU. I'm not going to drop money just to see Fraps report 120 FPS instead of 70 FPS at the same detail settings, for a negligible difference.

Good point, but what exactly does 120 Hz/FPS gaming have to do with consoles anyway? This is an argument for 2025-2030, when the PS5 is due.

by: theoneandonlymrk
Good point, but what exactly does 120 Hz/FPS gaming have to do with consoles anyway? This is an argument for 2025-2030, when the PS5 is due.

Not likely... more like 3840x2160@30Hz... or 60, if "we're" lucky... Seeing as how they don't see the point of 60 FPS when they (Sony/the developers) "think" they'd rather squeeze in some extra eye-candy mumbo-jumbo and stay at 30 FPS... I don't see 120 Hz ever happening in the console world... but then again, HDTV/PC display tech might have a big revolution down the road; who knows what will make more sense then.

by: NeoXF
All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results of something way above what the i7 can do on a straight-up PC platform... of course, this in the following years, not at launch.

Way above? Really? For it to be true, it would mean that more than half of i7-2600 performance is regularly lost on OS + DirectX + Driver overheads, which is simply ridiculous.

by: NinkobEi
PlayStation historically has always had the best exclusive titles. This generation will be no different. It has to do with which continent the console is developed on. MS can try to appeal to Japanese developers, but in the end there will always be that language barrier.

If you read the AnandTech report, the PS4 should run a lot hotter/at higher power. It seems unlikely that its fan will be quieter.

by: Dent1
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120 Hz monitor doesn't mean your experience is diminished if 120 FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120 Hz. I'm not saying 25-30 FPS is something PC gamers should be accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4-year-old CPU/GPU. I'm not going to drop money just to see Fraps report 120 FPS instead of 70 FPS at the same detail settings, for a negligible difference.

I will agree that some games do play fine at 60 FPS (which is what the more poorly optimized games tend to go down to on my machine on some occasions despite none of my CPU cores or GPU being maxed out), but if you play multiplayer FPS games competitively e.g. Team Fortress 2, Counter-Strike, Quake, etc. there is quite a difference in smoothness between 60Hz and 100-120Hz. Some people have hung onto their CRTs for years and play at stupid resolutions like 1024*768 and 100Hz for these particular games (if they have yet to purchase a 120Hz 1080p LCD) because your local frame rate determines how many snapshots are sent back to the server. Again, many single player games (especially slower paced ones) at 60 Hz play nicely, but in multiplayer FPS games where people tweak the hell out of their net settings, reduce their interp settings and whatnot, it's hard to be part of the 60Hz norm. Call of Duty (not that I play it) supposedly has the best hit registration at 125 or 250 client FPS from what I heard as well.

It really is.
If I had to pick the best exclusives, it's Nintendo without a doubt. Sony tends to keep many exclusives in the Japanese market (and there are some very good ones that end up being unknown). MS tends to do multi-platform games better, but when there is an exclusive it's usually pretty good, minus the Kinect games.

by: Vinska
I'd say OGL is only a disadvantage when on Windoze. And it's not even OpenGL's fault per se.
I'd best describe it in the words one game developer said to me not long ago (not exact words; greatly shortened): "Working with OpenGL is great. OpenGL is also lighter on the CPU and helps keep the framerate up when running on weaker CPUs. But OpenGL implementations on Windows just suck and are much slower than they could be."

Also, what midnightoil said.

Apples to apples, Direct3D will always be faster because it's hardware + software, not just software. Case in point: Direct3D created the unified shader model, and OpenGL adopted it into its own specification. Whenever there is a performance hit on Direct3D, it is because it is doing something extra (e.g. post-processing).

As to what midnightoil said, bear in mind that Windows on Xbox isn't the same as Windows on IBM PC compatibles. Xbox developers likely have direct access to the hardware resources to squeeze every drop of performance from the hardware. On consumer Windows, developers have to go through layer after layer of software to reach the hardware, which means it is slower, but less likely to crash the computer (or cause other undesirable outcomes). The reason there isn't direct access to the hardware in consumer Windows is that it has to account for the hundreds of graphics devices out there.

I have no doubt that Sony would have used DirectX if they didn't have to license it from Microsoft.

On the other hand, comparing D3D to OGL in real-life usage scenarios as "apples to apples" is not possible. D3D is only implemented on Windoze [and MS devices]. There, OGL is either greatly neglected by the implementers or is simply non-existent. Say what You want, but comparing D3D to the pitiful excuses for OGL implementations found on Windoze simply cannot be called "apples to apples".
On *nix, for example, the implementations are much better.
But comparing between D3D on Windoze and OGL on *nix cannot be called "apples to apples" due to external factors [an obvious one: a different friggin' OS].

by: FordGT90Concept
Direct3D will always be faster because it's hardware + software, not just software.

Direct3D is emulated on *nix. OpenGL and Direct3D both have to go through the layers of protection on Windows. Windows is the closest to apples-to-apples available. The fact that most professional software is rendered using OpenGL attests to the fact that it is well implemented on Windows.

There are a lot of engines out there which run on Windows that support Direct3D and OpenGL render paths and the performance is more or less the same when trying to achieve the same degree of visuals.

A lot of EA titles (The Sims 3 and Spore, for example) are DirectX on Windows and OpenGL on Mac OS X. If DirectX was as terrible as you claim it is, why would EA go out of their way to use DirectX on Windows instead of OpenGL on both?

You do know that virtually all professional software (e.g. AutoCAD, 3ds Max, Photoshop, etc.) uses an OpenGL renderer, correct? OpenGL's implementation on Windows is good (much better than you claim), it just isn't up to par with the purpose-built DirectX. Most x86-compatible games are released on Windows because of DirectX, not in spite of it. DirectX was created because Bill Gates wasn't satisfied with OpenGL at the time. Hell, about the only game developer that loves him some OpenGL is John Carmack (id Tech engine). That's mostly because he resents the Microsoft empire.

And don't expect a further reply from me on this topic. The discussion is circular.

by: NeoXF
...
All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results of something way above what the i7 can do on a straight-up PC platform... of course, this in the following years, not at launch. However... taking into account AMD's HSA... the CPU part plus the GPU grunt work... it's computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than on a 4-CPU 10-core Ivy Bridge server... (abstracting out the fact that the GPU as a pure graphics processing unit will be starved of resources in that scenario).

All in all, it's more like a mid-range or above gaming system, with vast untapped capabilities we aren't aware of yet... (cue AMD's Kaveri/Kaveri+ HSA demonstrations...)

:roll: Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while. :laugh:

Come on, man! :shadedshu You seriously think these shitty consoles can beat any current PC, let alone the next-generation ones? A Haswell + Titan will crush any of these so-called "gaming machines" hands down. Even the developers themselves (from both platforms) have said they won't take on high-end PCs head-on, instead focusing on "good" (read: not great, let alone the best) middle-of-the-road performance for a gaming console. This time they focused more on entertainment, making the consoles a "media hub" etc. for the living room, not raw power.

By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that. :rockout:

by: 1c3d0g
By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that.

Consoles aren't here to take over PC gaming. They did that already without having superior graphics, so what's your point? PC gamers lately have been a slowly dying niche, which is a shame.

Go ahead and pay 1,000 USD for your Titan. Some person who couldn't care less will probably get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you, and not know the difference, because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.

by: 1c3d0g
Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while.

by: Aquinus
Consoles aren't here to take over PC gaming. They did that already without having superior graphics, so what's your point? PC gamers lately have been a slowly dying niche, which is a shame.

Go ahead and pay 1,000 USD for your Titan. Some person who couldn't care less will probably get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you, and not know the difference, because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.

I bet some of us TPU'ers could build a sub-$600 budget PC that is on par with the Xbone. This will be doubly true in a couple of years. And the best thing about PC is we don't have to spend $60 per game. It is very common to buy last year's triple-A titles for $10 during a sale.

As far as PC gaming being a dying niche goes, never has that statement been less true than right now.

The PC gaming market reached $20 billion in 2012, a healthy increase of eight percent over the previous year, the PC Gaming Alliance (PCGA) revealed this week at a news conference held in San Francisco.

Now let us compare that to console sales in 2012

Video game and console sales plunged 22 percent in 2012, according to NPD Group data published by Home Media Magazine. As consumers focused their dollars on a few high-profile titles and opted for new digital services, and publishers just released fewer titles, revenue for the year totaled $13.3 billion compared to $17.0 billion in 2011. The decline more than doubled the 9 percent decrease between 2010 and 2011, reported the Los Angeles Times.

Sales are bound to drop due to the planned release of the new consoles. And for people like myself who only play a few classy games, it's more than $60 you're paying with an Xbox, as you have to tack the monthly fee on top, and that would be like paying $100+. That's why they need to get the new models out the door ASAP.

So if I get one of the newer systems, it would have to not have a monthly fee.

It would also be nice to know the power usage of these newer ones; how much better are they?