7xxxD are the onboard GPUs of desktop Trinity
7xxxG are the onboard GPUs of mobile Trinity.
The 7660D is the onboard GPU of the 5800K.

The highest card Trinity can do hybrid graphics (Dual Graphics) with is the HD 6670.

Blarg, yeah, that's right. Sorry about that. Hmm, interesting if that's the actual game performance. Still gonna say I'm waiting for the APUs to mature a bit. I think it was premature for the consoles to pick them; there could have been a better solution. Maybe not as cheap, guess we'll see how it plays out. They did make crap chips (360/PS3) run nice, so these should perform amazingly in comparison.

1. Proprietary: continue using exotic CPUs that can't be found in any PC, such as the PS3's one-main-core-plus-seven-auxiliary-core architecture that many developers hated.

2. ARM

3. x86

For ARM, no CPU can deliver the needed performance, and most of them are 32-bit. Sure, one could OC a 64-bit ARM CPU to absurd levels, but bottlenecks may occur since the architecture is pushed far out of spec (such as cache misses from the cache system having to handle much more data). Porting the games to PC would also be a pain, which is something almost all developers were complaining about. And the idea of porting a console game to a smartphone/tablet? Unheard of.

For x86, only Intel and AMD provide such CPUs. Intel, given their hard-fisted attitude toward ultrabook manufacturers who complained that Intel's CPUs were priced too high even for bulk purchases, would've demanded a pretty penny and only given something like a 5% discount on 10 million CPUs. After all, why bend over if you're the king of the hill?

For AMD, their CPUs were somewhat lackluster. But they had something that would reduce mobo costs to make up for that: a combined CPU+GPU.

Ever noticed how single-CPU socket motherboards are cheaper than the dual-CPU socket types? That's because the more components you stick onto a PCB, the more wiring between the components, and the more complexity. That quickly adds up in manufacturing.

By shifting the extra wiring into the APU, MS and Sony can get away with reduced costs. That means they can splurge on extras, such as a bigger hard drive, a lower console price, or higher profits.

As for the GPU, Nvidia most likely offered a solution. The only issue was that their solution required a separate socket, thus more wiring and complexity, so they would've had to either take an even smaller profit margin than AMD's APU, or convince MS/Sony to accept lower graphics performance.

I think Sony (and maybe MS) saw the future in GPU computing. Why else would Sony splurge on 8 GB of GDDR5 and an APU design with HSA support? MS, I'm not so sure, as I still don't understand their designs.

Remember, you can't jam an i5/i7 and a GTX 680 into a box priced at $400.

AMD designs an APU that can handle higher graphical loads, and then people expect it to do as well as a dedicated CPU and GPU? Oh please. Maybe most of us on this forum will continue buying high-end, but I know for a fact that your average person buying an all-in-one or HTPC will not care. This is perfect for that market.

Honestly I do though for Kaveri, if rumours are correct that it would have dedicated GDDR5. The issue with the A10s now is that they're memory limited, so faster RAM will definitely let them keep up with lower mainstream cards.

As a $130 CPU+GPU package I have no real complaints about it, and it certainly lets me play my games and build funky, compact ITX systems.

Not this AMD user excuse of "if you haven't owned it, you don't know how fast it is yet, even though reviews have been published all over the net." I heard that many times during the Bulldozer release. Give it a break. If you want to get over 30 FPS with the fastest APU (still not smooth at all) in any somewhat recent game, you need to either lower the resolution below 1080p or drop the settings to the lowest with no AA, or both.
Just facts. Some facts hurt when you're emotionally attached.

You're missing the plot, it's meant to be a budget gaming solution, get off your high horse and realize that not everyone can/wants to throw money at their computer.

I know exactly what they are: excellent alternatives for people into extremely light gaming, not for recent titles at 1080p with the settings the game was designed for. If you just use the APU alone and it fits your needs, then it's a great purchase, better than anything Intel has for that situation. These guys defending them don't seem to get that. They also seem to have them in their signatures, too; didn't notice that before.

You say budget solution, but a real gamer's budget solution would be an i3/FX-6300 + 7770/7790. Then you might actually be able to enjoy the games you play.

I can post plenty of benchmarks like 2010 did, but you APU freaks won't accept it; the truth is never valid unless it fits into your alternate reality. The excuse that these reviews don't use top-of-the-line RAM for the most budget system you can buy is extremely comical too.

*snip*

No AA. No AF. Everything low. Nice FPS.

Yet again, using RAM that bottlenecks the APU.

Tom's Hardware actually had a good article on this topic; the numbers are not "the best," but they're nowhere near as bad as your handpicked performance charts.

This is with DDR3-1866. OC the RAM, and the gaming performance gap between the i3 and the APUs widens further. Before the RAM price inflation, it was fairly easy to pick up an 1866 MHz kit for $40-$60. Now? That would be a bargain price.
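To put rough numbers on why RAM speed matters so much here: the APU's iGPU shares the dual-channel DDR3 bus with the CPU, so its peak bandwidth is whatever the memory can theoretically deliver. A quick back-of-the-envelope sketch (the helper name is mine; the figures are standard DDR3 ratings, 64-bit bus per channel):

```python
def ddr3_bandwidth_gbs(transfer_rate_mts, channels=2, bus_bytes=8):
    """Theoretical peak bandwidth in GB/s for a DDR3 setup.

    transfer_rate_mts: mega-transfers per second, e.g. 1866 for DDR3-1866.
    channels: memory channels (Trinity boards are dual channel).
    bus_bytes: 64-bit bus per channel = 8 bytes per transfer.
    """
    return transfer_rate_mts * 1e6 * bus_bytes * channels / 1e9

for speed in (1333, 1600, 1866, 2133, 2400):
    # e.g. DDR3-1866 dual channel works out to about 29.9 GB/s
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
```

Compare that ~30 GB/s (shared with the CPU cores) against the roughly 72 GB/s of dedicated GDDR5 on a card like the HD 7770, and it's clear why the iGPU scales with RAM speed: the memory bus is its tightest constraint.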

Once a dedicated GPU is added, either the game can take advantage of CrossFire at 100% without a CPU bottleneck, or you go with the i3 + GPU.