This thread isn't meant to rehash the controversy, but to talk about the CPU itself if anyone is interested.

I will say that, based upon Anandtech's numbers, AMD has addressed what has, in my opinion, been Bulldozer's weakest link: power consumption. Sure, the performance isn't as high as Sandy / Ivy, but performance is good enough for most things. The problem has been that power consumption has been atrocious. Anandtech shows an idle power consumption of 37.2 watts for the A10-5800K. That is extremely acceptable.

I'm not a user concerned with power consumption at all, but I just wish I had good things to say about AMD to someone who asks for a buying recommendation. I miss the old old days when the Athlon 64 was actually a better performer than anything Intel had available.

Flip, you've already pointed out that AMD is good enough for all of those casual buyers looking to purchase a $400 laptop, and while I agree with that statement, the bad press that follows AMD for every so-called misstep in their recent chip endeavors makes me feel guilty for recommending AMD even for those budget $400 laptop purchases. Trust me, I still pass along as much relevant info to a prospective buyer as I can, but I just don't feel quite right recommending AMD for most buyers' needs. I hope this discussion stays civil, as it always stirs up some piss and vinegar around here to discuss AMD vs Intel.

PS - I own more AMD-powered rigs than I do Intel, but Intel is in my main gaming rig, so that was a no-brainer.

What Anandtech is a little unclear on is: are the benchmarks for discrete GPUs run on the Trinity rig or on an Intel rig? What I'm getting at is, does Trinity's iGPU gaming performance suffer at all due to poor x86 performance, or is the GPU being maxed out? It's a VLIW4 (Radeon 6900-series-type) GPU, so it's more efficient than Llano's GPU, plus by default it's clocked at 800MHz. It's faster than Llano by quite a bit, but I wonder if there's not more available. It might be memory-bandwidth constrained as well.

All else aside and based on the chip itself, it's not a bad solution for a low-power user. It offers quad-core performance with a very decent IGP in a package that idles at 37.2 watts. An added bonus is the fact that AMD includes certain features throughout its entire lineup - I'm specifically thinking of virtualization support, but there may be others (registered memory?) too. The mobos are generally cheaper for a given level of quality, too.
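On the virtualization point: one way to check whether a given chip actually exposes hardware virtualization is to look at the CPU feature flags Linux publishes in /proc/cpuinfo (AMD-V shows up as `svm`, Intel VT-x as `vmx`). A minimal sketch, Linux-only:

```python
import os

def has_hw_virt(cpuinfo_text):
    """True if any CPU flags line advertises AMD-V ("svm") or Intel VT-x ("vmx")."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return bool(flags & {"svm", "vmx"})

if os.path.exists("/proc/cpuinfo"):  # only present on Linux
    with open("/proc/cpuinfo") as f:
        print("hardware virtualization:", has_hw_virt(f.read()))
```

Note that a BIOS toggle can still disable the feature even when the flag is present, so this only tells you the silicon supports it.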

I don't see much reason to recommend AMD over Intel at this point, but it's still worth recognizing that the IGP is pretty great, and the power consumption (separate from whatever the TDP may be rated at) is super respectable. Depending upon how AMD prices these, they may still end up being very decent econobox / HTPC solutions.

Desktop Trinity, if I recall correctly, will eventually see 45W versions (I can't remember if they will be dual- or quad-core). Those would be the only versions I would be interested in, as the only application I could think to use one in would be a home server. It should use about 15W less than my current server, which uses chipset-based integrated graphics (BE-2400 & AMD 690G/SB600). Plus it would bring the benefit of DDR3, which would slightly drop the power requirement for the memory, which is currently DDR2-800.
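For what it's worth, a 15 W saving on an always-on box is easy to quantify. A back-of-the-envelope sketch (the electricity rate is an assumed figure, not from this thread):

```python
# Back-of-the-envelope savings from shaving ~15 W off an always-on home server.
# The 15 W delta is from the post above; the $/kWh rate is an assumption.
watts_saved = 15
hours_per_year = 24 * 365
kwh_saved = watts_saved * hours_per_year / 1000  # watt-hours -> kilowatt-hours
rate_usd_per_kwh = 0.12  # assumed average residential rate
print(f"{kwh_saved:.1f} kWh/year, about ${kwh_saved * rate_usd_per_kwh:.2f}/year")
# → 131.4 kWh/year, about $15.77/year
```

Not huge money, but over the life of a server it adds up, and it scales directly with your local rate.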

It's not a bad solution for a low-power user. It offers quad-core performance with a very decent IGP in a package that idles at 37.2 watts.

Indeed. Its IGP is its strongest point, and if you could couple it with Intel's CPU ... wow.

I guess if I paid attention to my electric bill each month, I'd be more concerned, but as it is I can ignore it and let my wife pay it while my GTX580 sucks upon the wall outlet like a college chug champ.

"We measured the average power drawn at the wall under different conditions." The 37 watts is for the entire system, at the wall.

How much is a barebones Ivy Bridge i3 system? At around ~40 watts, I think. An Ivy Bridge i7-3770K system idles at ~80 watts (that's what I find wrong). Also, looking back at the Sandy Bridge review, those seem to be at ~75 watts idle.

So I would say this is more than respectable.

Also, there is a HUGE, HUGE misconception that a high TDP = high power usage. You can have a 100-watt-TDP CPU use less power than a 65-watt-TDP CPU when playing video, browsing the web, idling, etc.

But when you crank up the workload, the higher TDP usually means more horsepower. So the 65-watt part might get 15 FPS in a game, the 100-watt part 40 FPS. Is it bad to have a system run 3 times faster for just an extra 35 watts? Does this mean the 100-watt chip is not power efficient? To the contrary, the 100-watt part is WAY more efficient.
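To put rough numbers on that argument: efficiency is energy per unit of work, not the watts on the label. A quick sketch using the hypothetical 15 fps / 40 fps figures above:

```python
# Joules per rendered frame for the two hypothetical chips above.
# Watts are joules per second, so J/frame = watts / fps; lower is better.
low_tdp  = {"watts": 65,  "fps": 15}
high_tdp = {"watts": 100, "fps": 40}

for name, chip in (("65 W part", low_tdp), ("100 W part", high_tdp)):
    print(f"{name}: {chip['watts'] / chip['fps']:.2f} J/frame")
# → 65 W part: 4.33 J/frame
# → 100 W part: 2.50 J/frame
```

By that measure the 100 W part spends roughly 40% less energy per frame, despite the bigger TDP number on the box.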

Anyway, Trinity, in the preview, seems to have great idle power usage. Hopefully TR will do a power graph, and not just for Cinebench; Trinity is not a CPU designed for the raytracing market, but for gaming.

Indeed. Its IGP is its strongest point, and if you could couple it with Intel's CPU ... wow.

What are you talking about? Trinity's CPU is a match for the latest Intel i3 Ivy Bridge model. And if you get an i5 or i7, don't you think an IGP makes little sense and will be underpowered?

What do you think would happen if the i7-3770K lost a core and used AMD's GPU? It wouldn't be any faster than Trinity at gaming, as it seems games are GPU-limited.

We might see this come out more clearly next week, but Trinity's CPU seems perfectly balanced for the target market. (No, it's not an i7-3770K killer, but it doesn't hurt that it runs games like Minecraft 4x faster.)

How much is a barebones Ivy Bridge i3 system? At around ~40 watts, I think. An Ivy Bridge i7-3770K system idles at ~80 watts (that's what I find wrong). Also, looking back at the Sandy Bridge review, those seem to be at ~75 watts idle.

I think you should look around for some more data. Ivy may still consume less power at idle than an A10-5800, but I think it is close enough not to matter now. TR's review showed the 3770K at 45 watts for the whole system, including a Radeon 7970. As far as I'm concerned, it doesn't matter if one is a little better than the other as long as they're close.

If my girlfriend's PC died on October 2nd, I would definitely build her a Trinity-based desktop. She runs some lighter games every now and again (think WoW, Portal 2, etc.), but mostly does web browsing, watching movies, etc. Trinity, or even Llano, would be perfect for her.

Currently she's running a 7 year old Dell with a P4 @ 3.0GHz, a 7600GT, and Windows 7 (the latter two I added).

My conservationist side says her current PC is good enough and doesn't need replacing, but my IT side can't believe her entry-level Dell has lasted this long -- and thinks it needs to die in a fire.

I think we all know what Trinity brings to the table. What's brilliant about it is that AMD knew what most users are looking for, and they are well able to meet that need. Not everyone is willing or able to spend a grand on a high-powered setup; most folks aren't looking for Core i7- or HD 7990-level performance, nor are they willing to own a PC that sucks the power outlet dry. For most folks, Trinity is 'just right', as Goldilocks would've put it. Priced competitively (read: equally) with i3 chips, with enough quantities to meet demand, and with enough marketing dollars, AMD has a winner here.

I was at a Borderlands 2 LAN this weekend, and I needed a laptop that could run it.

My previous laptop had a downclocked 4650 with DDR3, which is basically the same as a Llano A6, right down to the architecture, shader count, and clockspeed. It ran the original Borderlands very well: 30fps with everything turned on, or 60fps if you disabled just dynamic shadows.

Having read the Ivy Bridge reviews and seen on paper how close the HD 4000 is to a Llano A8, I picked up one of our two i7-3612QM machines and set about installing Borderlands 2 with the expectation that it would be similar to, if not slightly better than, my old laptop, which was the equivalent of a lowly 320-shader A6.

I couldn't have been more wrong; complete slideshow at 2-3 fps. Granted, this was 1080p at default settings.

I know Borderlands 2 might be a little more demanding than its predecessor, but it seems to run exactly the same on all my other hardware. I dropped it down to 720p and turned everything to off or low. Still no good: 20-30 fps, but obviously only in the high teens in combat, rather than the 30+ I would call multiplayer-friendly. Basically, this is Xbox performance, not PC performance. I suppose it's still technically better than an Xbox, since that doesn't even run at 720p. Anyway, that was the lowest supported resolution.

I dug out an old i3 with a 16-shader Geforce 210M, expecting similarly poor performance. Bear in mind that this is only 16 downclocked shaders of an obsolete architecture, which was last seen in the 240-shader GTX 280 a long time ago. Based on shaders and clockspeeds, I was expecting this thing to be 15x slower than a rather old desktop card, and therefore not up to the task.

I couldn't have been more wrong; Completely acceptable at 15-ish fps on the default settings, using the silly 1600x900 native resolution. So I cranked everything down to minimum@720p and it ran at maybe 40fps for the weekend, dropping to about 25fps during combat. Bearable for something with such underwhelming specs.

The moral of my rather long-winded story is this: never underestimate the incompetence of Intel when it comes to drivers, especially with games that are "plays best on AMD" titles or have Nvidia's TWIMTBP support.


I'm not sure the HD 4000 is close to Llano. From the reviews I've seen, the HD 4000 is often half the speed.

I personally don't find Intel CPUs for laptops that interesting (I'm into graphics apps & games) unless you can get a discrete GPU to go with them. I used an A8-3500M-based laptop for a while and that was impressive (low noise/heat and high overall performance).

Since the (p)reviews weren't allowed to discuss pricing, it's hard to judge value...

Thinking from a platform perspective...
- For Trinity, you'll want faster DDR3-1866 memory, which is maybe $15 more than the DDR3-1600 wanted/needed for i3s (for 8 GB).
- Motherboard pricing? No idea what FM2 boards cost. FM1/A75 boards start roughly $15 more than 1155/H61 boards.
- CPU differential. This does affect graphics performance to some degree (remember TR's Inside the Second CPU research?). You might be able to make do with an i3 (SB or IB, maybe even a Pentium) and use those savings (with the $30 I listed above) to cover the cost (at least partially) of a decent discrete video card.
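Adding up those deltas makes the comparison concrete. Everything below just restates the rough guesses from this post in one place (these are not confirmed prices):

```python
# Rough Trinity platform premiums vs. an i3 build, per the guesses above.
trinity_extras = {
    "DDR3-1866 over DDR3-1600 (8 GB)": 15,
    "FM2 board over 1155/H61 board (extrapolated from FM1/A75)": 15,
}
premium = sum(trinity_extras.values())
print(f"Trinity platform premium: ~${premium}")  # → ~$30
```

So the question becomes whether ~$30 plus the CPU-price difference buys a discrete card that beats Trinity's IGP.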

The A10 is estimated to be $130. From what I can tell, the 6670 is faster than the A10's GPU for gaming (when paired with the A10 CPU).

Do you care to count all these pennies? Don't know... Do you have the space for a discrete card? Don't know...

From what I've seen, the A10 is great for an IGP, but IMO it still falls short of my threshold of 1080p gaming at high quality, no AA/AF. I make concessions in terms of quality after my hardware has aged, but not for new stuff.

*EDIT* Did not know there are FM2/A55 boards... Anyway, we'll know about pricing soon enough.

Last edited by PixelArmy on Mon Oct 01, 2012 1:13 pm, edited 1 time in total.

Edit: after reading the full article, it seems the Piledriver architecture still fails to close the gaming gap with a discrete card against Intel's yesteryear or current CPUs. Actually, the Phenom II 980 beats both the FX-8150 and the 5800K when paired with a discrete card. What a disgrace.


AMD A10-5800K officially launching tomorrow. Is it worth upgrading from an AMD 1055T, or wait for Piledriver?

Trinity is Piledriver - at least the CPU parts of it are. Will Piledriver see its own 4/8-core implementation on the desktop minus a GPU, or are they going straight to Steamroller? I don't know. Llano was a Phenom II with a GPU attached, and Trinity is an optimized Bulldozer (aka Piledriver) with a slightly better GPU attached. Right?

AMD A10-5800K officially launching tomorrow. Is it worth upgrading from an AMD 1055T, or wait for Piledriver?

None of the above, if brand loyalty does not apply to you. But we might discuss more if you can tell us your budget and what kind of applications your future CPU will be used for.


AMD A10-5800K officially launching tomorrow. Is it worth upgrading from an AMD 1055T, or wait for Piledriver?

Wait for Piledriver, unless you are obsessed with having an IGP at all costs (meaning that even a two-year-old 6570 GPU is too expensive or too loud for your use, since the 6570 easily beats the Trinity GPU).

There is a very good reason that AMD didn't allow review sites to post reviews about the Trinity CPU. The performance just isn't there and AMD is hoping that you only care about the IGP as a way to paper over Trinity's weaknesses.

Definitely worth upgrading a 1055T to an SB or IB. Other than that, it's just taking giant steps... sideways.

That is, unless you're not going to overclock your 1055T true six-core. If you need cores, I would go with an i5 or i7, but the i3s are no slouches when it comes to gaming... Now, for Handbrake and re-encoding video and such, the extra cores on that Thuban will come in handy over an i3.

I agree 100%. The i3s with an inexpensive dedicated graphics card are the way to go for a fast but cheap gaming or HTPC system that will use minimal power, depending on what graphics card you put in it. I think the 7850 is a great buy these days for good 1080p gaming. The new GTX 660 is no slouch either, but I really like the 7800 series. They overclock like champs, and you can give AMD some of your money for something that performs worth its price.

Still an AMD fan here. Plus, I think AMD cards are better for HTPC work along with good gaming. I have to admit I have not personally tried a GTX 600-series card for media playback, but I know my AMD cards give me a nicer picture in media playback than my GTX 560 Tis.

I would have owned two 7870s, but I built my 2600K / 560 Ti SLI gaming rig over 1.5 years ago. Plus I was going to get a 3D monitor and all that, but it never happened. I do have a top-of-the-line 3D TV, though, but the HDMI spec limits you to 24fps at 1080p and 60fps at 720p in 3D. HDMI does have the bandwidth for well over 60fps in 1080p 3D; they just never included it. But it supports 4K resolution now, go figure.
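The bandwidth claim is easy to sanity-check with rough arithmetic. The sketch below counts only active pixels (HDMI blanking intervals add a fair bit on top) and assumes frame-packed 3D sends a full 24-bit frame per eye; for reference, HDMI 1.4's video payload is roughly 8.16 Gbit/s:

```python
# Approximate uncompressed bandwidth for frame-packed 1080p 3D video.
# Active pixels only; HDMI blanking overhead would add more on top.
def gbps_3d(width, height, fps, bits_per_pixel=24, eyes=2):
    return width * height * eyes * fps * bits_per_pixel / 1e9

print(f"1080p24 3D: {gbps_3d(1920, 1080, 24):.2f} Gbit/s")  # what the spec mandates
print(f"1080p60 3D: {gbps_3d(1920, 1080, 60):.2f} Gbit/s")  # what it left out
# → 1080p24 3D: 2.39 Gbit/s
# → 1080p60 3D: 5.97 Gbit/s
```

So the raw pixel rate for 1080p60 3D does fit under the link's payload capacity, though once blanking overhead is added the margin gets thin, which may be part of why the spec stopped at 24fps.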

I would have owned two 7870s, but I built my 2600K / 560 Ti SLI gaming rig over 1.5 years ago. Plus I was going to get a 3D monitor and all that, but it never happened. I do have a top-of-the-line 3D TV, though, but the HDMI spec limits you to 24fps at 1080p and 60fps at 720p in 3D. HDMI does have the bandwidth for well over 60fps in 1080p 3D; they just never included it. But it supports 4K resolution now, go figure.

You totally ought to put that pumped epeen in your sig.... oh, wait...

I have to admit I have not personally tried a GTX 600-series card for media playback, but I know my AMD cards give me a nicer picture in media playback than my GTX 560 Tis.

I am making the assumption that Trinity's CPU performs worse than Llano's CPU. Upgrading from a 1055T is hard, because AMD haven't managed to make anything better than an X6 in the last few years.

What Piledriver does is fix the power-guzzling habits of Bulldozer, and little else; it's still not a good architecture for current software. Bulldozer's 8 'cores' were worse in most situations than the four Stars cores used in the Athlon II/Phenom II, because they have such low per-thread IPC and most software isn't that well threaded. Trinity only has 4 Piledriver cores, which means it is more accurately classed as a dual-core CPU that can run four threads.

If you want to know how bad Trinity's CPU will be, have a look at the FX-4100, specifically how it gets trashed by the i3 in single-threaded programs that represent most of the software on the market. The purpose of Trinity's CPU component is to be adequate enough to keep the IGP busy, and nothing more than that. The very concept of "upgrading" to something with such low aspirations is pointless; Trinity is for new builds that don't have space or budget for a discrete gaming card.


If you want to know how bad Trinity's CPU will be, have a look at the FX-4100, specifically how it gets trashed by the i3 in single-threaded programs that represent most of the software on the market. The purpose of Trinity's CPU component is to be adequate enough to keep the IGP busy, and nothing more than that. The very concept of "upgrading" to something with such low aspirations is pointless; Trinity is for new builds that don't have space or budget for a discrete gaming card.

And here I was thinking that Trinity was supposed to have 8 'cores' when people were calling it a quad-core CPU. Wish they'd just call them 'modules' like AMD did before their marketing department botched the whole thing.

If it's just a four-module 'Piledriver' setup, then it's going to get its ass kicked. By i3s. And you're right, this is literally just going to be good for those who can't afford/don't need a real GPU and need faster graphics than the HD 4000 (which is already competent for many things).

If you want to know how bad Trinity's CPU will be, have a look at the FX-4100, specifically how it gets trashed by the i3 in single-threaded programs that represent most of the software on the market. The purpose of Trinity's CPU component is to be adequate enough to keep the IGP busy, and nothing more than that. The very concept of "upgrading" to something with such low aspirations is pointless; Trinity is for new builds that don't have space or budget for a discrete gaming card.

And here I was thinking that Trinity was supposed to have 8 'cores' when people were calling it a quad-core CPU. Wish they'd just call them 'modules' like AMD did before their marketing department botched the whole thing.

No, Trinity is a four-core (2-module) chip in its highest-end incarnation (lower-end models only have one module). On the CPU side, comparing it to a hyper-threaded Core i3 is reasonable in both performance and price. The i3 (especially an Ivy Bridge version) will definitely take the win on power efficiency, though. Trinity on the desktop is interesting because AMD opened up the GPU, but the CPU is nothing particularly amazing and mostly beats the older Llanos because Trinity comes pre-overclocked and the Llanos were intentionally underclocked...