So for 98% of the market, AMD's APUs are the best choice and a great value.

For those who don't understand: 22nm does not offer a tangible performance advantage over 28nm/32nm. While 22nm could offer slightly lower power consumption ***if properly implemented***, so far AMD's 32nm APUs are as capable of low power consumption in actual use as Intel's 22nm products. Thus Intel does not enjoy any advantage from its 22nm process other than increased profits at the consumer's expense, should you choose to buy an over-priced Intel processor.

With Richland shipping, AMD has upped the ante even further, and OEMs are jumping on these APUs because they offer the best performance and value proposition.

What about the A10-5700? Why look at the least power efficient models? Looking at CPU and GPU performance, the A10-5700 is the sweet spot hands down, not the A10-5800K series. The A10-5700 is mopping the floor with the i3-3225 in games while drawing just 12W more (about 11%): 117W vs. 105W.

Since AMD is at least one full node behind, they are not going to have a product that's competitive on both the CPU and the GPU side. That's just not how physics works. You can continue to expect Intel to offer an APU with a faster CPU, and AMD to naturally offer a faster GPU but a weaker CPU. A sacrifice has to be made. Right now the GPUs inside AMD's APUs are really too weak to play modern games, which is why most people choose an Intel i3 over the A10 series. We'll see what happens once AMD brings out Steamroller cores and adds a lot more power to its APUs over the next 2-3 years.

That comparison makes no sense whatsoever, since the price of an i3-3220/3225 + HD7750 is way more than the price of the A10-5700. Also, people buying a $130 APU are not going to cross-shop that setup against a $220-230 setup.

i3 3220 = $130
i3 3225 = $145
HD7750 = $90-100

i3 3220 + HD7750 = $220 minimum
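The price arithmetic above can be sketched as a quick sanity check (street prices are the ones quoted in this thread and will vary):

```python
# Street prices (USD) as quoted in the comment above; these fluctuate.
prices = {
    "i3-3220": 130,
    "i3-3225": 145,
    "HD7750": 90,      # low end of the $90-100 range
    "A10-5700": 130,
}

# Cheapest Intel CPU + discrete GPU combo vs. the APU alone.
cheapest_combo = prices["i3-3220"] + prices["HD7750"]
premium = cheapest_combo - prices["A10-5700"]

print(f"i3-3220 + HD7750 = ${cheapest_combo} minimum")
print(f"That is ${premium} ({premium / prices['A10-5700']:.0%}) over the A10-5700")
```

So the discrete setup carries a roughly 70% price premium over the APU, which is the cost factor the comment is pointing at.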

You cannot start comparing an i3 + discrete GPU to an A10 and ignore the cost factor. No one in the world is cross-shopping a $220 CPU+GPU combo against a $130 A10-5700, because the A10 and i3 target budget systems where cost is a big factor.

If you can afford to purchase an i3 + a discrete GPU worth $100, then of course none of AMD's APUs makes sense. But that's not the market APUs serve at all.

Again, you seem to be confused about which target market the APUs serve. It is evident from this statement you made:

"Its sounds that Ilya Gavrichenkov is lying on his article, if you said "While 22nm could offer slightly lower power consumption""

He is not lying. The power consumption differential between the similarly priced i3-3220 and A10-5700 is only 12W, and yet the A10-5700 mops the floor with it in gaming and GPU-related tasks. Therefore, someone who is interested in a budget gaming APU is better off with the A10-5700.
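As a quick check on the wattage figures quoted in this thread:

```python
# Full-system power under load as quoted: i3 system at 105 W, A10-5700 system at 117 W.
intel_w = 105
amd_w = 117

delta = amd_w - intel_w          # absolute difference in watts
relative = delta / intel_w       # difference relative to the Intel system

print(f"A10-5700 draws {delta} W more, i.e. about {relative:.1%}")  # 12 W, ~11.4%
```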

How do you count these 98% of the market?
Do you think gamers are 98%?
I still think 98% of the processes a user runs are CPU-bound, not GPU-bound.
Therefore an i3 is the winner in these cases.
And if you still want both high x86 performance and some GPU performance, there is still no better solution than buying an i3 or i5 and a (cheap?) graphics card.

2.

I am very surprised by the 4K video result. My PC is a Q6600 + 7500LE. Yet when I play MP4 1080p video encoded at about 12 Mbps, CPU usage rarely shoots above 25%. I would say 99% of the time it stays at 17-25%. The 4K video used in your test uses the same codec and is only 34 Mbps. How could we see such a significant frame drop with faster CPUs + much faster GPUs?
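A rough back-of-the-envelope, assuming decode cost scales with pixel throughput rather than bitrate (a simplification; real decoders differ), suggests why 4K is much heavier than the bitrate alone implies:

```python
# Pixels decoded per second for each clip (frame rate assumed 24 fps for both).
def pixel_rate(width, height, fps=24):
    return width * height * fps

p1080 = pixel_rate(1920, 1080)   # the commenter's 1080p clip
p4k   = pixel_rate(3840, 2160)   # the 4K test clip

print(f"4K decodes {p4k / p1080:.0f}x the pixels of 1080p")
# Bitrate only rose ~2.8x (12 -> 34 Mbps), but per-frame work tracks
# pixel count, so roughly 4x the decode load is plausible.
```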

5.

Beautiful review.
It showed something I have been thinking for a while now: AMD and Intel are becoming incomparable.
Like x86, PowerPC or ARM.
Because we use computers for longer than a year, I would like to see the same comparison, but exactly the same, between Llano and Sandy Bridge, to see how they have matured with the latest drivers.
Really excellent review.

6.

I have enabled FULL hardware acceleration and got Quick Sync working on a G2020!! (It will only work on Ivy Bridge CPUs.)
You need the 15.26.1.64.2618 driver to enable Quick Sync in MediaEspresso.
I checked it, and GPU-Z shows the GPU usage :-)
Same effect with Sony Vegas: choose the AUTOMATIC or GPU encode mode and you will see 50-80% GPU load during rendering.
Please, Xbit Labs, check it and write up the findings!

I am using an HP laptop with an A8-4500M and a 1600x900 display. While playing those 4K clips, the CPU stays at 11-17% most of the time. I'm not sure if the codec is playing any tricks, as it may know my display has a very low resolution. I don't know if CPU utilization would be higher if this PC were connected to a 4K monitor.

Hi Xbit Labs, could you describe in detail how you did that 4K video test? If your results hold up, they could have major implications for the whole PC industry in 2-3 years.

8.

Are we past the days of Quick Sync not working if you have another GPU in your PC?

I must admit I didn't read every word, so you may have covered this in the article, but back in the Sandy Bridge days Quick Sync only worked when you had a monitor attached to one of its outputs. Is this still the case?

I will be buying a Haswell setup when it's out but will also be using a dedicated GPU. Will the GPU on the CPU still function?

What would be better (mainly for quality but also speed): a new AMD card, Nvidia, sticking with Quick Sync, or disabling all GPU acceleration and using pure x86?

9.

After reading reviews in December last year, I put together an i3-3220 machine for one of my grandsons; there was already an AMD A6-3670 here, about a year old, and this was to stop two kids trying to play on one machine. Now the newer i3-3220 is hardly ever used, as the AMD machine appears to be the preferred unit for playing games on.
I even put an old HD5570 video card into the Intel unit to try and make up for the woeful graphics; this did not improve game play at all. It cost me a newer HD7770 card just to get it to match the graphics of an older AMD CPU.
It is still not the preferred unit for the kids to play on. The only thing it appears faster at is using a USB stick. Other than that, the Intel computer appears slower and less user friendly. This was even more apparent when, in my ignorance, I changed both units to Win 8: a complete turn-off for both the 12- and 14-year-old, who went back to playing on tablets. Once I ran the recovery and went back to Win 7, they started to use the units again.
They both have a dislike for the Intel toy, even though they don't know what CPUs are in the machines, as they only visit on weekends. I see no reason to change to Intel for my own use or for the 10 units in the business.

I wrote something about this a while ago, and tell me if I'm wrong, but I feel that for small things like browsing files, music or pictures, my AMD machine feels sort of snappier. The AMD is an unlocked Phenom II 550, while my main computer is an i7-2600K. I'm not talking about video editing here, just normal, everyday operations. If you ask me, this really does count for a "light" user like my wife, for example.

10.

Why do you always insist on testing Pentiums and Celerons with a Z-series chipset and 1866MHz RAM? My guess is that most buyers will combine these cheap CPUs with a B75- or H77-based mobo and be limited to 1333MHz, with quite a bit lower performance depending on the benchmark. To be fair, you should also test the i3 with 1600MHz, since this is the max for any buyer choosing a lower-spec mobo. Ideally, you should test 1333/1600/1866 so that we can all see the influence of memory speed.
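For reference, the theoretical peak bandwidth of the speed grades in question can be computed directly (assuming dual-channel DDR3 with a 64-bit, i.e. 8-byte, channel):

```python
# Peak bandwidth: transfers/s * 8 bytes per 64-bit channel * number of channels.
def ddr3_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (1333, 1600, 1866):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
```

The step from 1333 to 1866 is a roughly 40% bandwidth jump, which is why memory-sensitive benchmarks (and integrated graphics in particular) can diverge so much between the two configs.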

I would assume it was to keep the same RAM across the board, because 1866 is the upper limit an AMD APU can use without any overclocking. That way the gaming results are more comparable. As someone else said, for everyday usage AMD builds do feel snappier (web, movies, light gaming); for professional work, Intel all the way. As for power usage for a home user, unless it's in a notebook I stopped caring. I have 3 PCs in my house, 3 TVs and an AC, and the highest my electric bill got last year was $87. Heck, the ice maker in my fridge used more power than all 3 of my PCs. For any friends who build a new PC, regardless of whether it's AMD or Intel, I put a link to the power options on the desktop and just tell them: full performance mode when gaming, balanced all other times.
But great article; it shows the strengths and weaknesses of each platform very well.

11.

Great review! And thanks for highlighting a very important point again:

when the system includes a discrete graphics core, the integrated GPU has no effect on 3D or heterogeneous computing performance, which means that the computing performance of x86 cores remains the main factor for choosing CPUs for classic discrete PC configurations

14.

To me it doesn't make sense to go with AMD unless your budget is around $300 for the whole system but you still want to play games. As stated above, the best AMD APU is only equal to entry-level discrete graphics (~$50), so if you can stretch that budget even to $400, you can have the best of both worlds, computing and graphics, and still use less power than the AMD solution.

15.

Is it me, or do those idle power consumption numbers look a tad too high? People have been building HTPCs with G530s and G620s from the last generation and getting 20-25W idle measured at the wall. But here you are measuring after the PSU, using current-gen 22nm IVB Pentiums, and still getting 36+ watts? What gives? Your test config doesn't seem to use a graphics card either, so what exactly is consuming so much power?

Cheap power meters can sometimes give odd results. They are made to handle loads in the kilowatt range and can be unreliable with small loads, as the A/D converter can't discriminate small changes. The other consideration is power factor, which can throw off measurements of true power, especially with a cheap PSU that has poor PF correction. It's worth looking at what the power meter indicates for current and phase as well as power. I'm not a great user of these things, but one time I used a power meter while adding components to an A/V setup, and I swear the power consumption went down, according to the meter, when I added one more component. Switch-mode PSUs can do weird things.
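The power-factor point can be illustrated with a toy calculation (the voltage, current, and phase angle below are made-up numbers for illustration, not measurements):

```python
import math

# A meter that reports only Vrms * Irms is showing apparent power (VA);
# true power is Vrms * Irms * cos(phase angle).
v_rms = 230.0       # mains voltage (assumed)
i_rms = 0.30        # measured current in amps (assumed)
phase_deg = 50.0    # phase angle for a poorly PF-corrected PSU (assumed)

apparent = v_rms * i_rms
real = apparent * math.cos(math.radians(phase_deg))

print(f"Apparent power: {apparent:.1f} VA")
print(f"Real power:     {real:.1f} W")
```

With a phase angle that large, the meter could overstate the real draw by more than 50%, which is enough to explain a 20-25W system reading as 36+W.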

And while I am here, a big thank you for an excellent review which answered the questions that were worrying me.