Rumors about the next-generation Xbox, which some call the Xbox 720, have been circulating for years. The console is expected to launch later this year, there are indications that Microsoft might unveil it ahead of E3 2013, and other rumors have put its price at around $400.

While most details of the next-generation console remain to be seen, specs leaked this week give some hardware details for the processor that will be the brains of the next-generation Xbox. The processor uses the x64 architecture and has eight cores running at 1.6 GHz. Each core has its own 32 kB L1 instruction cache and 32 kB L1 data cache, and each four-core module has its own 2 MB L2 cache, giving the processor a total of 4 MB of L2 cache.
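As a quick sanity check, the quoted cache figures add up as follows. This is a back-of-the-envelope sketch based only on the numbers above; the variable names are mine, not from the leak:

```python
# Totals implied by the leaked CPU cache figures (illustrative only).
cores = 8
l1_icache_kb = 32        # per-core L1 instruction cache
l1_dcache_kb = 32        # per-core L1 data cache
modules = 2              # eight cores arranged as two four-core modules
l2_per_module_mb = 2     # L2 cache per module

total_l1_kb = cores * (l1_icache_kb + l1_dcache_kb)
total_l2_mb = modules * l2_per_module_mb

print(total_l1_kb)  # 512 kB of L1 cache across all cores
print(total_l2_mb)  # 4 MB of L2, matching the leak's total
```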

VGLeaks reports that each core has one fully independent hardware thread that doesn't share execution resources, and each hardware thread is reportedly able to issue two instructions per clock cycle. The next-generation Xbox GPU is reportedly a custom D3D 11.1-class unit running at 800 MHz with 12 shader cores and 768 total threads.
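Taken at face value, those CPU figures imply a simple upper bound on instruction throughput. This is rough arithmetic of mine, not a figure from the leak, and real workloads won't hit the peak:

```python
# Peak instruction throughput implied by the leaked CPU figures.
hardware_threads = 8          # one independent hardware thread per core
instructions_per_clock = 2    # issue width per thread, per the report
clock_hz = 1.6e9              # 1.6 GHz

peak_instructions_per_sec = hardware_threads * instructions_per_clock * clock_hz
print(peak_instructions_per_sec / 1e9)  # 25.6 billion instructions/s, peak
```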

Each of those threads is reportedly able to perform one scalar multiplication and one addition operation per clock. A natural user interface sensor is also said to be always present. The processor is reportedly paired with 8 GB of DDR3 RAM and 32 MB of fast embedded SRAM.
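If each of the 768 GPU threads really does one multiply and one add per 800 MHz clock, the peak arithmetic rate works out to roughly 1.23 TFLOPS. This is my arithmetic, not a number from the leak, and it assumes the 768 threads are split evenly across the 12 shader cores (64 each):

```python
# Peak GPU arithmetic rate implied by the leaked figures.
shader_cores = 12
threads_per_core = 64                  # assumption: 768 threads / 12 cores
total_threads = shader_cores * threads_per_core
flops_per_thread_per_clock = 2         # one multiply plus one add
clock_hz = 800e6                       # 800 MHz

peak_flops = total_threads * flops_per_thread_per_clock * clock_hz
print(peak_flops / 1e12)  # 1.2288, i.e. ~1.23 TFLOPS peak
```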

The machine is also paired with a 6x Blu-ray drive, gigabit Ethernet, Wi-Fi, Wi-Fi Direct, and various hardware accelerators for image, video, and audio codecs. The machine is also tipped to include a Kinect multichannel echo cancellation hardware chip and cryptography engines for encrypting and decrypting content.

It has nothing to do with how well consoles hold up... they cut a LOT of corners to do that holding up.

It's how they are HOLDING US BACK. Our machines are leaps and bounds ahead of console technology even on a new console's launch day.

And yet, we are stuck with the majority of games being shoddy console ports, developed at the console hardware's expense, and thus our systems sit and suffer, missing out on 90% of the performance and improvements in game technology that could be used.

FOR YEARS ON END. All because it's a cheaper alternative that a monkey could run. There's no innovation in it. When I was growing up, I had to build, configure, customize, network, learn, etc. There is none of that in a console. It's just making prodigies that could be in their prime technological years dumb as rocks. Just sofa zombies. Nothing productive.

And unfortunately, since it's so widespread, it's become the primary cash cow, and that's where developers will develop for... making them lazy in the process... not looking into exploring a wider range of features and technology.

And sadly, a lot of great developers have migrated there with no intention of turning back to the PC as their primary platform.

Eventually... you'll see the hardware manufacturers start slacking if they have not already. Why put out a new CPU or GPU anymore when less and less are buying them? Why not just sit back and rest for a few years before hardware releases since developers aren't even really investing time into it?

Problem is, when you and I were growing up, games were still relatively simple to develop and sold to a relatively small audience. I don't know about the early and mid 90's, but even by 1999 the average cost to develop a game was $1 to $4 million. Fast forward to today and we're seeing games like GTA 4 cost $100 million. On top of that, considering inflation and cost of living, the cost of buying games has actually gone down over time. It takes economies of scale to fund that HUGE increase in development cost. That invariably means a watering-down effect will take place, because only the mass market has enough consumers to fuel those demands. The mass market is NEVER on the cutting edge, nor is it targeted at enthusiasts.

We gamers/techno geeks who grew up with this stuff are not the target audience anymore. It's not going to change. C'est la vie.

That is called the free market. A developer releases a multiplatform game on Xbox, PlayStation, and PC. If the console versions sell more (and it seems they usually do), the developer will put more focus/time/resources into the console version next time, and the time after that.

It is naïve to think that the disappearance of consoles would bring everyone back to the PC. The majority of people simply want to play games, not tinker with hardware. More people would move from consoles to tablets/smartphones/iPods… than to PCs, and that would hurt gaming even more.

I’ve been building my own computers since the early ’90s, and I really enjoy it. A few days ago I upgraded my gaming PC from a C2Q to a new iCore, and had my first Z77 chipset MoBo. On system shutdown, the NIC would remain active and, for some reason, confuse my router so that other computers on the LAN would drop network and Internet connectivity. While the gaming PC was up and running, the other machines would run without network issues. I think of myself as reasonably clued about hardware, yet it still took me a solid two hours to figure out how to disable WoL, as the feature doesn’t exist in the motherboard’s EFI. The EFI has “wake on kbd” and “wake on mouse” options, but no wake-on-LAN. However, if you ENABLE the ErP feature, it will in return disable all wake-on… features, including wake-on-LAN. Logical, isn’t it? I actually enjoyed the challenge, but for someone who just wants to run the machine and play a game, it would be mighty frustrating. And then there are driver updates; game, OS, AV, PunkBuster… patches; and Steam, Origin, uPlay… updates. All in all, hardly a user-friendly environment for someone who just wants to play a game.

And then, of course, MS and Sony go the extra mile to secure exclusive titles and developers. I was hoping that the MS initiative “Games for Windows” would bring some order (and fine-tuned, optimized, extraordinary titles) to PC gaming, but it didn’t happen. Probably due to MS dedication to Xbox… but still, it is a shame that PC manufacturers like Intel, AMD, nVidia, Creative, TurtleBeach, Cooler Master, Thermaltake and all the others, especially the high-end enthusiast brands, didn’t find it important to create some kind of consortium and put some money into securing more exclusive developers and titles for the platform.

That's a good point. Image quality on consoles is so dumbed down it's not even funny. It appears to be holding up for 7 years because that is the hardware being developed to. Oh well. I guess a 1+ year old mid-range card is still a hell of a lot better than a 7 year old mid-range card.

A lot of times, when I get home from work, I don't WANT to "build, configure, customize, network, learn, etc.". I want to sit down in my living room, turn my TV on, press play, and start blowing things up.

Other times, I will turn to my PC and do whatever is necessary to download, install, patch, get the latest drivers, adjust video options, or anything else I might POSSIBLY have to do to play a PC game. Not like any of that is a big deal, but I can see how some might not want to deal with it.

That being said, I don't think consoles are holding PC gaming back at all; I game equally on both and have no qualms with either platform. Also, it's already been stated elsewhere in this thread: consoles don't NEED to have the latest and greatest tech, because developers can squeeze more out of dedicated hardware than it is cost effective to go cutting edge. They may have cut a lot of corners to get Far Cry 3 to look as good as it did on the 360, but the gaming experience (audio and video) was still superb...

...and it looked even better on the PC, so everyone is a happy camper. I hope both platforms continue to press forward because I would like to enjoy both. That doesn't make me a sofa zombie, it makes me a choice zombie.