From the preview it looks like the A10 is an i3-class CPU. If you all think Trinity is a POS, then the same is true for the entire Intel i3 line. The difference is that for anything GPU-related, the A10 easily beats an i7-3770K with the highest-end Intel IGP.

Let's not dismiss OpenCL and WebGL just yet; both are areas where the Intel APUs are pretty deficient.

With Trinity you get an i3-class CPU but an IGP twice as fast as the HD 4000, for the price of an i3. It's not what everyone is looking for (if you need a discrete GPU, Trinity makes no sense), but you'd need to be an Intel fanboy to buy an i3 when Trinity is available.

There are still a ton of reasons to buy an i3, many of them pertinent to desktop users. There's really only a very thin slice of people that Trinity works for.

Trinity A8 is faster, but an Ivy Bridge i3 core does have a 1.2% higher IPC in this test.

I can see people slamming Bulldozer. But here we have a chip that decimates Intel in GPU tasks, gets the same or better idle power, is comparable (wins and losses) to an i3 on the CPU front, and gets very similar IPC in CPU-intensive tasks.

Unless you plan to buy a discrete GPU, an i3-based system makes no sense.

No contest on energy efficiency? "Trinity has a 100W TDP!! But the i3 only has a 55W one... 55W is better..." Sigh.

What do you prefer: system A, which idles at 40W but can consume 115W to deliver 40 fps in games, or system B, which also idles at 40W but only consumes 70W to deliver 15 fps? [...]
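The comparison above is really about energy per frame rather than peak draw. A quick sketch of the arithmetic, using the hypothetical wattage and fps figures from the post (not measured data):

```python
# Energy per rendered frame for the two hypothetical systems above.
# Watts are joules per second, so watts / fps = joules per frame.

def joules_per_frame(load_watts: float, fps: float) -> float:
    """Energy consumed per rendered frame at full gaming load."""
    return load_watts / fps

system_a = joules_per_frame(115, 40)  # higher peak draw, 40 fps
system_b = joules_per_frame(70, 15)   # lower peak draw, 15 fps

print(f"System A: {system_a:.2f} J/frame")  # ~2.88 J/frame
print(f"System B: {system_b:.2f} J/frame")  # ~4.67 J/frame
```

Despite the higher TDP, system A actually spends less energy per frame delivered, which is the poster's point.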

We all get it, BUT not all the people in the market for a CPU in the i3/Trinity price range game on the IGP. In that case Trinity doesn't make sense. So the percentage of the DIY market interested in gaming on IGPs is quite small (I don't have numbers; it's just my opinion from anecdotal observations).

The chip is good, but it's best suited to pre-built systems, again the kind that have no discrete card. Regardless, the elephant in the room is the abysmally performing x86 cores. Most gamers who start out with an IGP will upgrade later to a discrete card, and then Trinity will again be a handicap.

So can we stop this senseless back and forth? "Oh, it's awesome compared to i3s when not used with a discrete card" versus "Yeah, but how many people build such systems for gaming and don't buy a discrete card that sports X times better performance than IGPs?"

We all know that. The market for Trinity is out there, sure, but given that we are mostly enthusiasts here... only a few of us will consider this chip.

An abysmal core? It has the same performance as an Intel i3 in most situations; check the benchmarks.

I already stated my view that Trinity makes no sense if your goal is to use a discrete GPU, so you can't make that argument.

What is unclear is how Trinity will scale in the future as games include more compute tasks. You can't count on it, but it's possible that Trinity might get better over time because of its solid IGP.

From all I've seen, the A10 is an i3-class CPU (at least from the Anand review and others), but it's often over 2x faster than the HD 4000 paired with a 3770K (and up to 5 times faster than the i3's IGP), etc.

Unless you plan on gaming with a discrete GPU, or you're an Intel fanboy, I still don't see how an i3 makes any sense over an A10.

And seriously, if you're building a gaming PC with a discrete GPU, go with an i5.

I checked the benchmarks, and I've seen cases where the APU drops under 30 fps while the Intel Ivy Bridge stays well above 60 (minimum frames). I'm certain we will find out more about this in the TR review; I can't wait to read it.

As for "if you're building a gaming PC with a discrete GPU, go with an i5": not everyone has a budget that can accommodate an i5. For smaller budgets, putting the bigger chunk of the money into the discrete card makes a lot more sense. For example, when I built my rig almost two years ago, I chose a Phenom II 955 and a GTX 560 Ti. Had I gone for the i5-2500K, the budget constraint would have forced me to buy a card in the league of a GTX 450/HD 5770. Which rig do you think would pump out more frames at 1920x1080?

I've only read Anandtech's review but the CPU is a solid 'not bad.' That's different from being good, but seeing how it performs makes me wonder why AMD was so worried about showing only IGP performance first.

What I still haven't seen anyone do is a graphics review of older games at higher resolutions. We know these chips will run newer games decently (or maybe not, for strenuous games) at low resolution, but what about 4-5 year old games that are still popular and fun at 1920x1200 or 1920x1080? That's another niche where these processors could show their value, but no one does testing with old games.

Last edited by MadManOriginal on Tue Oct 02, 2012 9:16 am, edited 1 time in total.

After seeing the reviews, it looks like an i3 plus practically any discrete GPU from the last two years in the $70+ range will be the better choice for playing games. Trinity's high clocks and quad integer pipelines do give it a few wins in selected CPU benchmarks over a much lower-clocked i3, but at the cost of doubling the power consumption.

According to the Anand review "part 2" it's about as expected. Trinity is not much better than Llano in terms of CPU performance, but is better in graphics and idle power consumption. So Trinity is essentially the perfect CPU for someone that doesn't do much and really leaves their computer idling all day. For gamers, it's better to go Intel + discrete. For non-gamers who still need computational strength, it's Intel with either the IGP or a discrete GPU.

I don't get why AMD did that thing of releasing half the specs for review; the CPU isn't bad, it just sits where everyone expected, no big deal. Overall an OK platform/APU, and probably the replacement for my HTPC (X2 240, GF8200, HDTV). The Piledriver improvements (very small tweaks, mostly to power) paint a picture of what to expect from Vishera: a mild improvement in CPU performance, mostly from optimizations and another speed bin allowed by the lower power consumption, and some tweaks here and there.

Hope to see the Damage Labs reviews of both soon.

Last edited by juampa_valve_rde on Tue Oct 02, 2012 9:13 pm, edited 1 time in total.

Perhaps we can just all agree that the A10 is great for people who fit in the demographic of:

Gamers who want:
- More than an i3
- Less than an i5
- Will never have a discrete GPU

For these criteria, the A10 is a perfect chip... for the 0.02% of computer users. I'm sure AMD will sell tens of them.

That's pretty narrow-minded; they are aiming at OEMs, not gamers. I'd be surprised if sales of you-build-it parts are more than a few percent, period, whether it be Intel, AMD, Nvidia, or whoever. It's a sad truth, but obvious that DIY/enthusiast is slowly circling the drain amid a flood of tablets, laptops, and sealed-box desktops that are "good enough" for everyone else.

Regardless, the A10-5700 and A8-5500 are pretty much ideal for a small HTPC, even though AMD doesn't try to sell 65W CPUs to end users, just to OEMs that shovel them into terrible configurations to dump on unsuspecting consumers.

FWIW, I would only consider the i3-3__5 in an HTPC without another GPU, but the extra performance from the HD 2500 to the HD 4000 doesn't help sub-par drivers.

The other "option" is a cheap Pentium/Celeron plus a low-end discrete GPU, but it's rather kludgy, a massive waste of a slot on an ITX board (particularly if you use a tuner card), and probably draws more total system power.

A lot of this has to do with proper GPU drivers that mostly work correctly across a lot of specific scenarios, as opposed to Intel's, which only occasionally work without glitches (Nvidia is fine, but discrete only). Too many reviews gloss over this, just posting a benchmark; whatever happened to even simple screenshots of render quality? I think there is a lot of understatement when comparing AMD vs. Intel iGPUs: not only is AMD drawing more frames per second, it is drawing more accurate frames.

I was at a Borderlands 2 LAN this weekend, and I needed a laptop that could run it.

My previous laptop had a downclocked 4650 with DDR3, which is basically the same as a Llano A6, right down to the architecture, shader count, and clock speed. It ran the original Borderlands very well: 30 fps with everything turned on, or 60 fps if you disabled just dynamic shadows.

Having read the Ivy Bridge reviews and seen on paper how close the HD 4000 is to a Llano A8, I picked up one of our two i7-3612QM machines and set about installing Borderlands 2, expecting it to be similar to, if not slightly better than, my old laptop, which was the equivalent of a lowly 320-shader A6.

I couldn't have been more wrong: a complete slideshow at 2-3 fps. Granted, this was 1080p at default settings.

I know Borderlands 2 might be a little more demanding than its predecessor, but it seems to run exactly the same on all my other hardware. I dropped it down to 720p and turned everything to off or low. Still no good: 20-30 fps, but only in the high teens in combat, rather than the 30+ I would call multiplayer-friendly. Basically, this is Xbox performance, not PC performance; I suppose it's still technically better than an Xbox, since that doesn't even run at 720p. Anyway, that was the lowest supported resolution.

I dug out an old i3 with a 16-shader GeForce 210M, expecting similarly poor performance. Bear in mind that this is only 16 downclocked shaders of an obsolete architecture, last seen in the 240-shader GTX 280 a long time ago. Based on shaders and clock speeds, I was expecting this thing to be 15x slower than a rather old desktop card, and therefore not up to the task.

I couldn't have been more wrong: completely acceptable at 15-ish fps on the default settings, using the silly 1600x900 native resolution. So I cranked everything down to minimum at 720p, and it ran at maybe 40 fps for the weekend, dropping to about 25 fps during combat. Bearable for something with such underwhelming specs.

The moral of my rather long-winded story is this: never underestimate Intel's incompetence when it comes to drivers, especially for games that are "plays best on AMD" titles or carry Nvidia's TWIMTBP support.

Odd.

I've played Borderlands 2 on a bone-stock i7-2600 with the integrated graphics and, while it wasn't pretty, it played perfectly smooth throughout at 1280x720.

OEMs don't care about gaming performance, for the most part. Those that do will ship a discrete card. Even the HTPC crowd will find this platform only OK, due to power use. I see no compelling reason to use something like Trinity when my i3-2120T does everything at 35W. The sad truth of the matter is that AMD is pinning its hopes on APUs when it's a very small demographic. AMD has to get something going to be relevant to the mainstream; Trinity simply isn't it.

Last edited by TheEmrys on Tue Oct 02, 2012 2:43 pm, edited 1 time in total.

Trinity is quite balanced for its price. You really need a discrete card at ~$80-100 to get something much better than Trinity (otherwise don't bother), and that added discrete card is not pocket change. Obviously, Trinity is not for me, because I'll probably get a 7970 or something like that, but when you have a $500-600 budget, the $80 graphics card is not trivial. Compare this with i7-3770K buyers: why the hell would I care about lousy HD 4000 graphics when I'll be paying $300 for a CPU? I mean, who uses a 3770K with onboard graphics?

I agree with both: it's perfect for OEMs, but not for the DIY crowd. 0.02% is about right.

What I get out of this, though, is that AMD is headed in the right direction while dragging Intel along with them. Integrated GPUs are moving higher and higher up the discrete-GPU food chain, and while CPU performance is essentially "stuck", with no real way to increase clock speeds and few desktop programs actually benefiting from more than two cores, AMD and Intel (and everyone that uses ARM, etc.) can devote more and more of the transistors each die shrink provides to graphics/compute. This is a good trend, I think.

Looks great for the HTPC crowd, with graphics performance like a 6570 with DDR3, and it's pretty dang close to the i3 series in x86 performance. In a household with three or more HDTVs like mine, where you already have an i5 or i7 Sandy or Ivy system for video transcoding and file serving, I think they will make fine, inexpensive HTPCs, with enough graphics power for madVR or whatever you prefer, along with i3-class CPU power for the most part. You can even do some decent gaming on them, especially with older titles and Blizzard games like D3 and earlier.

What I really find pathetic is how AMD maintains its website. They really gotta fire the guys who maintain it. Near the bottom of the product page (just above the disclaimers), under Additional Information, click on 'Specs' and you'll be presented with a spec table that doesn't even include the new Trinity parts! No A10-5800K, no A10-5700, none. So sloppy. Is the guy responsible for maintaining and updating the AMD website tired of his job?

Trinity's GPU is weaker than a 6570 (which makes sense; it is more or less a lower-specced one), and the 6570 starts at $55 before MIR. Given that Bulldozer/Piledriver isn't as good as the i3 for gaming, you can save a few bucks going with the i3. Note that the linked benchmarks use DDR3-1866 memory; you can save more with DDR3-1600, which is the standard speed for i3s. Or have even less performance if you pair Trinity with DDR3-1600.

I agree that any money at the budget end is non-trivial, but this only reinforces what other posters have said: Trinity only makes sense if you can't compromise on money or space at all. Basically, if you have absolutely no wiggle room on price, or you're building a mini-ITX HTPC. Of course, there currently are no FM2 mini-ITX motherboards (at least not on Newegg). Even if/when they do arrive, the majority of mini-ITX cases still have room for at least a low-profile card, which the 6570 is.

I'd like to see a budget showdown between Trinity and a Pentium and/or i3 paired with a 6570.

I say pay a little more to get an i5. The difference in performance should carry you a couple years longer at least, giving more time for the power savings to make up the price difference.

And even if the numbers don't actually work out, just choose to believe that while you enjoy the extra performance of your i5.
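The "power savings make up the price difference" claim is easy to sanity-check. A rough sketch of the break-even arithmetic; the $60 premium, 30 W saving, and $0.12/kWh rate are made-up illustrative numbers, not figures from the thread:

```python
# Hours of use before an efficiency advantage repays a purchase premium.

def breakeven_hours(price_delta_usd: float, watts_saved: float,
                    usd_per_kwh: float = 0.12) -> float:
    """Usage hours needed for the energy savings to equal the extra cost."""
    savings_per_hour = (watts_saved / 1000.0) * usd_per_kwh  # USD per hour
    return price_delta_usd / savings_per_hour

hours = breakeven_hours(60, 30)  # ~16,700 hours
years = hours / (5 * 365)        # at 5 hours of use per day, ~9 years
print(f"{hours:.0f} h, about {years:.1f} years")
```

At those assumed numbers the payback takes the better part of a decade, which is roughly the joke the post above is making.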

The integrated graphics don't even come into play. The A10 or A8 is probably comparable to mainstream graphics cards of four years ago. Where's the benchmark comparing it to a 9800 GTX? Trinity might be okay for indie games and stuff running on the Source engine, but I wouldn't want to run Diablo 3 on it. I'm too spoiled by high-res discrete graphics, I guess.

Actually, I've been disappointed by AMD's website for a long time now. When I search for specific CPU specs (max temperature, voltages, revision codes), I have to go through countless pages that make no sense to finally find this one: http://products.amd.com/en-us/DesktopCPUResult.aspx

They really need to hire someone to make sense of their site. When I go there as a buyer, I want to see the juicy info about specific models, not endless pages of marketing technobabble.

Intel's site used to be great back in the Core 2 era. Now it's pretty close to AMD's clusterf*ck. Actually, I just visited Intel's site (haven't seen it in a while), and it's actually quite good. You go to Menu > Intel products > 3rd Gen Intel Core i... > then select between the "Laptops, Desktops, Servers" tabs, then the Core i category (e.g. Core i3), and it shows all the SKUs. There you click on the name of a particular SKU and another page opens that shows all the info you need (revisions, TJ max, etc.).

For AMD... I don't even want to describe how many pages you have to go through.

Honestly, I think Intel's site is quite professional, except for the "What can we help you find today?" search box (which is a bit of a nuisance, though it could be useful), and the pull-down menu is kind of overkill with everything in there. AMD's site looks like it's put together by a bunch of amateurs. I don't even want to visit it nowadays because I only get disappointed.