Xbox One gets a CPU speed boost to go with its faster GPU

The pre-launch tweaks continue with a 150MHz clock speed bump.

Last month, Microsoft announced that the Xbox One's GPU would be getting a slight clock speed bump, and now it sounds like the CPU is getting some of the same love. Polygon reports that the console's AMD-produced CPU cores will run at 1.75GHz in the shipping version of the console instead of the 1.6GHz that they've run at in development kits so far. Microsoft VP of marketing and strategy Yusuf Mehdi mentioned the clock speed increase at the Citi Global Financial Conference today and reaffirmed Microsoft's intent to release the console in November (though he declined to be more specific).

"We'll announce a launch date shortly," said Mehdi. "We recently just went into full production, so we're now producing en masse Xbox One consoles. We've had real good progress on the system. In fact, we just updated the CPU performance to 1.75 GHz on top of the graphics performance improvement, so the system is really going to shine [and] the games look pretty incredible."

Both the Xbox One and PlayStation 4 use eight-core CPUs based on AMD's "Jaguar" architecture—in PCs, Jaguar is intended to be AMD's low-cost, low-power CPU architecture, which makes it a decent fit for a more GPU-centric device like a game console. We have yet to see a reliable source comment on the PlayStation 4's CPU clock speed, which makes it difficult to say how the two consoles' relative CPU performance stacks up.

Other information about the Xbox One has continued to trickle out from Microsoft as its eventual (as-yet-unknown) launch date draws nearer: the software won't initially support external storage, for example, and the console's beefy cooling system was apparently designed to last for its entire projected 10-year lifespan. The Xbox One will also now include a headset, a reversal of a position that Microsoft took earlier in the summer.

Keeping in mind that consoles generally use hardware that is 2-3+ years old at launch, are these days expected to last about 10 years, and determine the limits of games for that period (Skyrim was designed with the limits of the 360 and PS3 in mind: 32-bit, able to fit on a disc, etc.), this is a good sign.

The hardware limitations of consoles /do/ confine AAA games, and I'd rather Fallout 4 etc be built to the limitations of hardware like this, rather than the PS3/360.

It seems unlikely with 5 or 6 cores devoted to the game that the CPU will end up being the bottleneck.

Once again it seems like MS is making actual customers into beta testers; at this stage in the product's release this is probably being managed through software, and there's no way it can improve the life of the system.

I look forward to the announcement the PS4 runs at 1.76 GHz.

Edit: brain fart. Wrote ps3.

Games respond pretty well to higher single threaded performance, at least to a point - multithreading is hard. You probably also want to save the "making customers beta testers" speech until it's actually released - there are no actual customers currently.
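The diminishing returns from multithreading that this exchange debates can be made concrete with Amdahl's law: if a fraction p of the work parallelizes, the best-case speedup on n cores is 1/((1-p) + p/n). A minimal sketch (the 70% parallel fraction below is an assumed figure, not a measurement of any real game):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: upper bound on speedup when only a fraction
    of the work can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical game loop where 70% of frame time parallelizes:
for cores in (2, 4, 6, 8):
    print(cores, round(amdahl_speedup(0.7, cores), 2))
```

With 30% of the frame stuck on one thread, even eight cores top out below a 2.6x speedup, which is why per-core clock speed still matters.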

I must admit, despite the fact that I don't regret pre-ordering my Wind Waker HD Wii U bundle... I'm growing increasingly concerned about the disparity between the PS4/Xbox One and the Wii U. I had high hopes that Nintendo would at least even out the playing field a bit... but it appears that there's once again a large chasm between the power of Nintendo's console and the other two.

Don't get me wrong, I'm a big PC gamer and look to the PC for the maximum visual experience. I don't demand the highest visual fidelity from my consoles, but I do want Nintendo to get back into the "core market." I had hoped that this gen would bring them back into the mainstay of gaming, rather than being a (enormously successful) fringe. I still think the Wii U will succeed due to Nintendo exclusives, but I really feel that this may be another under-powered console in comparison.

What worries me is that with these console CPUs' clock speeds so low per core, will game developers have to make up the difference by heavily optimizing games for six cores or more while making no changes on the PC side, resulting in poorly optimized code that shows up as microstutter on PCs with dual-core or quad-core processors?

It seems unlikely with 5 or 6 cores devoted to the game that the CPU will end up being the bottleneck.

Once again it seems like MS is making actual customers into beta testers; at this stage in the product's release this is probably being managed through software, and there's no way it can improve the life of the system.

I look forward to the announcement the PS4 runs at 1.76 GHz.

Edit: brain fart. Wrote ps3.

Games respond pretty well to higher single threaded performance, at least to a point - multithreading is hard. You probably also want to save the "making customers beta testers" speech until it's actually released - there are no actual customers currently.

Obviously single-threaded performance will speed things up - assuming there are no other bottlenecks. The "multithreading is hard" argument, though - the PS2, PS3, and Xbox 360 were all multithreaded. Game developers have experience with this.

It's hard to break a general computing problem into parallel paths for multithreading, and it may be that the extra threads yield diminishing returns, but seriously - you think the CPU is going to end up as the bottleneck?

As to making their customers beta testers - Xbox 360s started failing after a few months. This announcement is being made a couple of months after the previous release of the specs, and it seems more like the marketing department is now controlling the technology. They've been working on this thing for years, and suddenly they need to increase the CPU speed?

What worries me is that with these console CPUs' clock speeds so low per core, will game developers have to make up the difference by heavily optimizing games for six cores or more while making no changes on the PC side, resulting in poorly optimized code that shows up as microstutter on PCs with dual-core or quad-core processors?

The latter. By making it difficult to port to PC, they can keep the sales and 'best play experience' on the console, and retain more console exclusives.

What worries me is that with these console CPUs' clock speeds so low per core, will game developers have to make up the difference by heavily optimizing games for six cores or more while making no changes on the PC side, resulting in poorly optimized code that shows up as microstutter on PCs with dual-core or quad-core processors?

It's not likely to be an issue - sharing cores among multiple threads isn't really anything new. There's really no reason to think that PCs with a few very fast cores (in comparison here) will handle games designed for 6-8 slow cores badly.

Obviously single-threaded performance will speed things up - assuming there are no other bottlenecks. The "multithreading is hard" argument, though - the PS2, PS3, and Xbox 360 were all multithreaded. Game developers have experience with this.

It's hard to break a general computing problem into parallel paths for multithreading, and it may be that the extra threads yield diminishing returns, but seriously - you think the CPU is going to end up as the bottleneck?

As to making their customers beta testers - Xbox 360s started failing after a few months. This announcement is being made a couple of months after the previous release of the specs, and it seems more like the marketing department is now controlling the technology. They've been working on this thing for years, and suddenly they need to increase the CPU speed?

Uh yeah, they were "multithreaded," sure, but there's still typically one main thread doing much more work than the rest. As you noted, more cores/threads run into serious diminishing returns in most cases. As for the CPU being a potential bottleneck - uh, yes? When has it ever not been? It's only rarely an issue on the PC side currently because most games are still being designed to work on console hardware an order of magnitude or two slower, but it still definitely puts limits on games (e.g. BF3 player count, AI in general).

Right now they're finding out what yields are really like, because they're only now physically making said chips - and it turns out those yields are better than expected. This isn't unusual at all, given they probably picked a conservative number, and it doesn't indicate that marketing is controlling the technology.

I can't help but imagine this kind of PR is out because they are slipping on their production schedule. Sony has had most of the press since Gamescom, when they announced their release date. MS said they were waiting to make their announcement. Still nothing, but wait, here's an extra 150MHz on the CPU clock? At a financial conference two days after PAX? Who gives two squirts of piss - why haven't they announced a release date? The third-console curse would be complete if they don't beat Sony to the shelves, and it'd be a complete tidal wave of shit if they miss Black Friday. I hope there are two strong platforms out of the gate, but goddamn if MS isn't trying to crack their skull open at the first hurdle.

"A major advantage to PS4's GPU is that while being of the same base architecture as Xbox One's, it has 50% more shaders; there are 768 shaders on Xbox One's GPU, and 1152 on PS4's."

If this is accurate, I don't see why anyone would opt for the Xbox One. 50% is a MASSIVE degree of difference.

I wouldn't call it a massive difference until software is actually released. The PS3 should have had better-performing games than the 360. In reality, multiplatform games were virtually identical unless the developers botched one version.

It wouldn't surprise me at all if the same thing happened this generation even though the PS4 seems more capable. Personally, I'll let the early adopters place their bets and see what looks like a better deal two years from now.

I wonder, will they ever stop improving near-pointless things and do something worthwhile? Like, say, reducing the number of packets it drops so their multiplayer games stop having so much lag?

Packet losses are handled at the application level, not the console level. Microsoft and Sony have little direct control over how third parties handle their networking, aside from making higher-level APIs to handle that stuff themselves. Except that neither Microsoft nor Sony will describe their API publicly, so you won't hear about it.

Or maybe add VAC support so I can go one game in some of the titles I enjoy most without facing down aimbots on 30-0 sprees that will shoot you when you're behind them... through their own backs?

VAC is a Valve proprietary technology to ensure executable integrity. Consoles have been doing it for years with code signatures. Microsoft's system fared better, as the PS3's LV0 keys were leaked last year.

In the grand scheme of things 150MHz isn't anything to get excited about. Seems more like PR spin.

Overclocking for the sake of putting out bigger numbers, from a company that has had overheating problems with its initial-release hardware in the past, right before going into production is concerning, too.

Wish MS would just stick to their guns instead of trying to 1-up Sony. All these last minute changes feel like trouble waiting to happen.
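For scale, the bump the comment above dismisses works out to under ten percent, using the clocks reported in the article:

```python
# Clock figures from the article: dev kits ran at 1.6GHz,
# shipping consoles will run at 1.75GHz.
old_clock, new_clock = 1.6e9, 1.75e9  # Hz
uplift = (new_clock - old_clock) / old_clock
print(f"CPU clock uplift: {uplift:.1%}")  # CPU clock uplift: 9.4%
```

A ~9.4% clock increase is a best-case ceiling on CPU-bound gains; real frame times would improve by less.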

"A major advantage to PS4's GPU is that while being of the same base architecture as Xbox One's, it has 50% more shaders; there are 768 shaders on Xbox One's GPU, and 1152 on PS4's."

If this is accurate, I don't see why anyone would opt for the Xbox One. 50% is a MASSIVE degree of difference.

It is - and the PS4 GPU is definitely faster, but there's more to a GPU than shaders (ie this doesn't make it 50% faster by any means). The memory interface difference is what I'd keep an eye on.

Well, likely both of them will be 128-bit-wide interfaces, with the DDR3 placed in a dual-channel configuration. The only thing that brings the system's throughput close to the PS4's GDDR5 memory is the 32MB ESRAM buffer that Microsoft is including to offset the inherent limitations of DDR3.

The question that hasn't been answered is the speed of the DDR3 and GDDR5 memory the two companies are using. Is Microsoft using the fastest DDR3-2133, or are they opting for a slower clock speed, say 1600MHz or 1866MHz? I'm really hoping it's not DDR3-1066 or 1333MHz; that'd just be really sad lol. The clock speed of the GDDR5 is more of a wildcard, since no one knows how Sony will clock it, so the actual bandwidth is unknown - but the throughput would be higher than the DDR3 found in the Xbox One. Also, I'm not sure how well the ESRAM will perform, since it's an extra layer that programmers have to take into account in order to optimize for the console.
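The bandwidth question above reduces to simple arithmetic: peak bandwidth is bus width in bytes times effective transfer rate. The widths and speeds below are illustrative scenarios, not confirmed specs for either console:

```python
def peak_bandwidth_gbs(bus_bits: int, transfer_rate_mts: float) -> float:
    """Peak theoretical memory bandwidth in GB/s.

    bus_bits: interface width in bits
    transfer_rate_mts: effective rate in megatransfers per second
    """
    return (bus_bits / 8) * transfer_rate_mts / 1000.0

# Illustrative scenarios (assumed widths/speeds, not confirmed specs):
print(peak_bandwidth_gbs(128, 2133))  # 128-bit DDR3-2133     -> ~34.1 GB/s
print(peak_bandwidth_gbs(256, 2133))  # 256-bit DDR3-2133     -> ~68.3 GB/s
print(peak_bandwidth_gbs(256, 5500))  # 256-bit GDDR5 5.5GT/s -> 176.0 GB/s
```

The spread between these scenarios is why the unknown memory speeds matter far more than a 150MHz CPU bump.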

I wonder, will they ever stop improving near-pointless things and do something worthwhile? Like, say, reducing the number of packets it drops so their multiplayer games stop having so much lag? Or maybe add VAC support so I can go one game in some of the titles I enjoy most without facing down aimbots on 30-0 sprees that will shoot you when you're behind them... through their own backs?

Yeah, no reason to support an Xbox One when it appears that the multiplayer support, despite not being free, will still be as laggy and cheat riddled as ever.

I'd like to not have players shooting through walls with sniper rifles but... oh wait, that would require taking the focus off of the shiny graphics.

This is not about PC gaming. On European BF3 and Halo servers I've never encountered people with aimbots.

LOL @ "heat problems" type of posts - I thought we could leave that nonsense to IGN and GS, where the fan-trolls and children play.

Given the way that console is designed, a +150MHz bump to the CPU and the ~50MHz bump to the GPU isn't going to make the thing melt, or probably even make the fan spin up much more than usual.

On the other hand, given that the platform is a fixed target (vs. the way PC games are created), these minor bumps could make a difference (based on the synthetic tests I've done when bumping up my own PC hardware similarly).

Also - I highly doubt that this was a PR move. It is a yield-based move that actually means the rollout is going better than expected, not worse!

=============

Keep in mind there are details concerning both consoles that both companies seem cagey to answer - and last gen, the xbox 360 had the better gpu and that didn't mean squat at the end of the day.

but but 50% more bluh bluh bluh

And here I thought Ars-ians were supposed to be smarter when they judged hardware... What about RAM speeds? System latency? And blah-blah-blah...

Well, likely both of them will be 128-bit wide interfaces. The DDR3 will be placed in a dual channel configuration is the likely thing. The only thing to make the system's throughput close to the PS4's GDDR5 memory is through the use of the 32 MB ESRAM buffer that Microsoft's including to offset the inherent limitation of DDR3 memory.

The question that hasn't been answered is the speed of the DDR3 and GDDR5 memories that both companies are using. Is Microsoft using the fastest DDR3-2133 memory, or are they opting for a slower clockspeed say 1600 MHz or 1866 MHz? I'm really hoping it's not DDR3-1066 or 1333 MHz, that'll be just really sad lol. The clockspeed for the GDDR5 would be more of a wildcard since no one knows how Sony's going to clock the GDDR5 memory, so the actual bandwidth is unknown but the throughput would be higher than the DDR3 found in the XBOX One. Also, not sure how well the ESRAM would perform since it's an extra layer that programmers have to take into account in order to optimize for the console.

Memory separation is a hassle, no matter the platform. From a system-architecture standpoint, the PS4 is doing a lot of things right, and I'm not talking just about having fast RAM: the fact that it's one big, single pool that can be easily accessed by both the GPU and CPU is a huge, huge benefit. As you say, having small "pockets" of memory with different speeds involves quite a lot of management. While it's better than nothing, you have to make your code very smart to keep that pocket full so it isn't wasted, and since it's kinda small you can't just drop random data in - you have to put in the things you access the most. (I remember having debug code on a PSP to check the access counts of each data chunk, to get better insight into what I should tell my engine to keep in eDRAM at all times.)

Also, there's the threading problem: you either limit ESRAM access to a single thread or write housekeeping code, maybe even involving expensive context-switching procedures that will also put pressure on your main RAM (because that contextual ESRAM data has to go somewhere during the switch).

These things are pretty common on consoles, so developers do know how to deal with them, but not having to do it would be much better and would let us put those CPU/GPU cycles to better use - like giving you the most accurate brown shade for your next shooter.
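The "keep the pocket full" problem the poster describes is essentially a knapsack: given profiled access counts and sizes, decide which data to pin in the small fast memory. A toy greedy sketch, in the spirit of the PSP debug code mentioned above (all chunk names and numbers are hypothetical):

```python
def pick_for_scratchpad(chunks, capacity_bytes):
    """Greedy placement: favor chunks with the most accesses per byte
    until the fast-memory budget (e.g. a 32MB ESRAM) is exhausted.

    chunks: list of (name, size_bytes, access_count) tuples.
    Returns the names chosen for the scratchpad, hottest first.
    """
    ranked = sorted(chunks, key=lambda c: c[2] / c[1], reverse=True)
    chosen, used = [], 0
    for name, size, _accesses in ranked:
        if used + size <= capacity_bytes:
            chosen.append(name)
            used += size
    return chosen

# Hypothetical per-frame data, profiled the way the poster describes:
frame_data = [
    ("gbuffer",      16 << 20, 5000),  # hot render target
    ("shadow_map",    8 << 20, 1200),
    ("particle_vb",   4 << 20,  900),
    ("terrain_cache", 24 << 20,  300),  # big but cold
]
print(pick_for_scratchpad(frame_data, 32 << 20))
# ['gbuffer', 'particle_vb', 'shadow_map']
```

Greedy by access density isn't optimal for knapsack in general, but it captures the trade-off: the large, cold terrain cache stays in slow main RAM.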

Any performance boost is always welcome and AMD's current offerings don't excel in performance per core. Also, you have to keep in mind that in a console those 150 MHz won't just get lost in a myriad of applications / services like they'd probably do on a PC. It's nothing to write home about, but it's a welcome improvement.

However, from my perspective, the PS4 still has the upper hand when it comes to raw hardware power. But it isn't big enough to make it very noticeable IMO, and most titles will be cross-platform so they're likely to look the same on both PS4 / X1.

All I want to see is if AMD will capitalize on their unique opportunity here. For the first time, two major consoles are using almost the same hardware, which also happens to be very PC-like hardware.

If AMD can design a replacement to DirectX that's lower level and closer to console APIs, AMD can gain the benefits of console optimization on the PC too on their own hardware. This will make life VERY difficult for nVidia and Intel's GPU division, since now they're the only ones with shitty console ports.

All I want to see is if AMD will capitalize on their unique opportunity here. For the first time, two major consoles are using almost the same hardware, which also happens to be very PC-like hardware.

If AMD can design a replacement to DirectX that's lower level and closer to console APIs, AMD can gain the benefits of console optimization on the PC too on their own hardware. This will make life VERY difficult for nVidia and Intel's GPU division, since now they're the only ones with shitty console ports.

Like OpenGL? The PS3 already had a somewhat rough implementation, and considering that the PS4 runs a FreeBSD derivative, it's likely to have full OpenGL support. Of course, that's not going to happen with the X1, though.

"A major advantage to PS4's GPU is that while being of the same base architecture as Xbox One's, it has 50% more shaders; there are 768 shaders on Xbox One's GPU, and 1152 on PS4's."

If this is accurate, I don't see why anyone would opt for the Xbox One. 50% is a MASSIVE degree of difference.

Clocks also matter, as do other units on the GPU (TMUs and ROPs). If the Xbox One has the same number of ROPs (AMD decouples these from shaders) as the PS4, but higher clocks, some workloads will run faster even if it pushes fewer theoretical FLOPS.

Then there's memory bandwidth, which can't really be compared before we get our hands on them (completely different tech).
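The "clocks also matter" point is easy to quantify: theoretical shader throughput is shaders x 2 FLOPs per cycle (a fused multiply-add) x clock. The shader counts come from the quoted comment; the clocks below are the widely reported pre-launch figures and are used only for illustration:

```python
def theoretical_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak single-precision TFLOPS: shaders x 2 FLOPs/cycle (FMA) x clock."""
    return shaders * 2 * clock_ghz / 1000.0

# Assumed clocks: 853MHz Xbox One (post-bump), 800MHz PS4 (reported).
xbox_one = theoretical_tflops(768, 0.853)
ps4 = theoretical_tflops(1152, 0.800)
print(f"Xbox One: {xbox_one:.2f} TFLOPS, PS4: {ps4:.2f} TFLOPS")
```

Under these assumptions the PS4's 50% shader advantage shrinks to roughly a 41% throughput advantage once the Xbox One's higher clock is factored in, which is exactly the caveat the reply is making.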

All I want to see is if AMD will capitalize on their unique opportunity here. For the first time, two major consoles are using almost the same hardware, which also happens to be very PC-like hardware.

If AMD can design a replacement to DirectX that's lower level and closer to console APIs, AMD can gain the benefits of console optimization on the PC too on their own hardware. This will make life VERY difficult for nVidia and Intel's GPU division, since now they're the only ones with shitty console ports.

AMD isn't writing the software for either console (it's MS/Sony that are choosing the APIs to expose), and Microsoft certainly has no interest in trying to kill DirectX.

Andrew Cunningham / Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue.