Rumors about the next-generation Xbox have been circulating for years. The console, which some call the Xbox 720, is expected to launch later this year. There are some indications that Microsoft might unveil it ahead of E3 2013, and other rumors have put its price at around $400.

While most details of the next-generation console remain unknown, leaked specs surfaced this week describing the processor that will be the brains of the machine. The chip uses the x64 architecture and has eight cores running at 1.6 GHz. Each core has its own 32 kB L1 instruction cache and 32 kB L1 data cache, and each module of four cores shares a 2 MB L2 cache, giving the processor a total of 4 MB of L2.
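Taking the leaked figures at face value, the cache totals are easy to sanity-check. A quick Python sketch (the two-module grouping comes from the leak and is unconfirmed):

```python
# Leaked CPU cache figures (per the rumor; unconfirmed)
cores = 8
l1_instruction_kb = 32   # per core
l1_data_kb = 32          # per core
modules = 2              # two modules of four cores each
l2_per_module_mb = 2     # shared within each module

total_l1_kb = cores * (l1_instruction_kb + l1_data_kb)
total_l2_mb = modules * l2_per_module_mb

print(f"Total L1 cache: {total_l1_kb} kB")  # 512 kB
print(f"Total L2 cache: {total_l2_mb} MB")  # 4 MB
```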

VGLeaks reports that each core has one fully independent hardware thread and doesn't share execution resources. Each hardware thread is also reportedly able to issue two instructions per clock cycle. The next-generation Xbox GPU is reportedly a custom D3D 11.1 class unit running at 800 MHz with 12 shader cores and 768 total threads.

Each of those threads is reportedly able to perform one scalar multiply and one add per clock. An always-on natural user interface sensor is also present. The processor is reportedly paired with 8 GB of DDR3 RAM and 32 MB of fast embedded SRAM.
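If each of the 768 GPU threads really retires one scalar multiply and one add per 800 MHz clock, the implied peak throughput works out as follows (a back-of-the-envelope estimate built entirely on the leaked numbers):

```python
# Peak GPU throughput implied by the leaked specs (unconfirmed)
threads = 768        # across 12 shader cores
ops_per_clock = 2    # one multiply + one add per thread per cycle
clock_hz = 800e6     # 800 MHz

peak_flops = threads * ops_per_clock * clock_hz
print(f"Peak throughput: {peak_flops / 1e9:.1f} GFLOPS")  # 1228.8 GFLOPS
```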

The machine is also said to pack a 6x Blu-ray drive, gigabit Ethernet, Wi-Fi, Wi-Fi Direct, and various hardware accelerators for image, video, and audio codecs. It is also tipped to include a Kinect multichannel echo cancellation hardware chip and cryptography engines for encrypting and decrypting content.

quote: dual phenoms with an older radeon GPU? smartphones will have comparable power in two years... hook them up to your TV with miracast, get a bluetooth controller, and why do i need a console again??

A smartphone won't be able to touch this processor for a long time. Although smartphones are advancing at a rapid rate, they're still only performing about as fast as a five-year-old Intel Atom, and that's just in single-threaded performance. This eight-core chip is likely based on the AMD FX architecture, if the buzz is true.

Throw in a decent GPU that alone will consume more power than several dozen smartphones and has a lot of execution resources and co-processors...

Paired up with lots of relatively fast memory by smartphone standards...

And I don't see a phone touching it any time soon, especially considering a lot of developers will be coding to the metal and not through various APIs with an OS in the middle.

Mind you, the desktop PC still reigns supreme (and always will) from a gaming performance standpoint, as it doesn't really have any TDP or size limitations.

AMD FX - or do you expect Intel to release special underclocked Sandy/Ivy Bridge CPUs at a significant discount? If true, buying this console is similar to spending $400 on a five-year-old PC. That makes the GPU extremely important, since it will define the platform's 3D performance. We're already used to having games designed for ancient CPUs on modern PCs. The big news is that it has eight cores. Games and multi-core/threaded programming are not exactly on the best of terms at the moment.

Summary: get ready for another decade of games with limited physics and minimal processing requirements, marketed on how well they can squeeze frames from the cut-down, years-old budget GPU under the hood. Go play Mass Effect 1 on a PC to see what I mean - it took devs years of learning to squeeze more pixels out of the 360 and PS3.

You're gonna have to be more specific. Which Atom do you call the best current Atom? Which A15-based SoC?

Anyway, in a larger power envelope like a console, the ARM chips don't currently have the IPC. If these clocks are true, they're not Piledriver-based. Maybe Jaguar? Jaguar does boost IPC significantly over Bobcat, and it would be extremely affordable - both in terms of actual cost and power/thermals.

Never know. But I agree. This is definitely a win on the developer side for Microsoft if their console is essentially an optimized PC. It will allow insanely easy porting (if you even have to call it that). Even if the PS4 were more powerful (both are essentially using the same GPU, since the 6670 and 7670 are the same thing), developers being able to just optimize things for lower-end PC specs and go means two major platforms with hardly any extra development time.

You must be too young to remember the original Xbox. Not only was it x86, but through emulation many of the popular titles could be played on the 360. Hopefully that will be the case once again - perhaps even with better emulation than last time? Well, one can hope.

Oh, and we don't really know for sure what either one is using. Sony could very well be using x86 chips too; there are certainly rumors to this effect. In that case, depending on the APIs used (OpenGL?), it might also make for some easy ports - at least between PC and PS4. Not so much for a cross-platform title that includes PC, PS4, and Xbox Next. They'd have to have two different render paths at the very minimum.

Well... rumors currently point to x64 for the next Xbox. The article above states as much, though they have a habit of 'correcting' their articles here without mention.

Not to mention, the only differences between the instruction sets come down to memory addressing... You act as if there is some magical improvement in moving from x86 to x64 (double the bits means double the performance, right? lol), when in reality it's just an easier way to accommodate the higher memory demands of more demanding applications.

The move to x64 is common sense, especially given the backwards compatibility with PC chips. Devs can choose to code for x86 or x64 -- and hopefully we'll see a few more x64 optimized PC games come out of it.
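The point above, that x64 is mostly about address space rather than raw speed, can be checked from inside a running process. A minimal Python sketch (Python here is purely illustrative; console dev kits obviously use other toolchains):

```python
import struct

# Pointer width tells you whether this process is a 32-bit (x86)
# or 64-bit (x64) build; it says nothing about per-instruction speed.
pointer_bits = struct.calcsize("P") * 8
print(f"This process uses {pointer_bits}-bit pointers")

# The practical win of x64 is address space: a 32-bit process tops
# out at 4 GB, which is why a console with 8 GB of RAM needs x64.
if pointer_bits == 32:
    print(f"Address space capped at {2 ** 32 // 2 ** 30} GB")
else:
    print("Address space comfortably exceeds the console's 8 GB of RAM")
```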

Really good game artists and marketers have successfully duped the public into thinking that cell phones are the equal of consoles and PCs when it comes to graphics, but it's just plain not true. Want proof for yourself? Go get the Unity3D game engine, develop a cutting-edge PC or console game, and convert it to iPhone or Android. You'll be doing TONS of scaling back and corner cutting on both the CPU and GPU side to get it to run.
