I partially disagree (personal preference). I'd like to see the 65W A10-5700 reviewed as opposed to the A10-5800K, since a 65W part makes much more sense for an HTPC than a 100W part. By extension, the A8-5500 would be interesting as well, though I'm curious how much of a difference the number of Radeon cores makes in terms of HTPC usage.

I transcode on my HTPC, but I just use Quicksync on my i3 with HD 3000 graphics. I use Arcsoft media converter 7 and rip HD TV recordings down to a manageable size to play on my Iconia tablet. Considering the fact that it only takes 20-30 minutes to take a 1080p show down to 720p at 1/6 the original file size, I can't complain about the results. Intel offers an HD 4000 i3, and that would be my HTPC CPU of choice if I had to buy today.

The features you are testing are never obvious from a spec sheet, so a targeted hands-on review like this is very important. At least it is to me, because my next laptop choice will be based on its capabilities for media viewing and gaming. And battery life, followed by weight.

I'm glad that Anandtech has explained to us that this is a staged release and has offered its review around that by looking to past performance. This is better reporting, not the immature, biased reporting being done by Tech Report. If Intel did this, it's almost a sure thing TechReport.com would not have said a thing about a staged release and would have gone ahead with its review the same way Anandtech did here.

Isn't giving you 23.977 what you'd actually want over 23 Hz? I can't think of when you'd want 23 Hz (whereas 24 Hz, 25 Hz and 30 Hz are all useful), whereas 23.976 is what you'd want from telecined material.

Hmmm.. all vendors tag 23.976 Hz as 23 Hz in the monitor / GPU control panel settings. So, when I set the panel to 23 Hz, I am actually expecting 23.976 Hz. However, this platform gives me 23.977 Hz, which is a departure from the usually accurate AMD cards that I have seen so far.

In short, with the 0.001 Hz difference, the renderer might need to repeat a frame every ~17 minutes. I am NOT saying that this is a serious issue for everyone, but there are some readers who do care about this (as evidenced by the range of opinions expressed in this thread: http://www.avsforum.com/t/1333324/lets-set-this-st...
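The arithmetic behind that ~17 minute figure can be sketched out (assuming the content runs at the exact NTSC film rate of 24000/1001 fps, and that the display really refreshes at a flat 23.977 Hz):

```python
from fractions import Fraction

content_fps = Fraction(24000, 1001)   # 23.976... fps, the NTSC film rate
display_hz = Fraction(23977, 1000)    # the 23.977 Hz this platform outputs

# Extra display refresh cycles accumulated per second of playback
drift_per_sec = display_hz - content_fps

# Time until a whole extra cycle accumulates and a frame must be repeated
seconds_per_repeat = 1 / drift_per_sec
minutes = float(seconds_per_repeat) / 60
print(minutes)  # roughly 17 minutes
```

With a correct 23.976 Hz output the drift would be zero and no frame would ever need repeating.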

"The video industry is pushing 4K and it makes more sense to a lot of people compared to the 3D push. 4K will see a much faster rate of adoption compared to 3D, but Trinity seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."

Although this statement is technically correct, it has no real-world relevance. At this time, people who can afford 4K TVs (if there are any commercially available ones at this time) won't be messing around with cheap HTPCs. It's an inconsequential statement made just to detract from AMD's overall superiority with this product in the HTPC market.

If I were in AMD's shoes, why would I dedicate resources to a nonexistent market? Has anyone actually tested NVIDIA's or Intel's 4K output over HDMI to see whether it actually works? In the early days of HDCP, all the video card manufacturers were claiming compliance, but real-world compatibility was a different matter.

More importantly, do you have any 4K films to watch? No. Will you in the immediate future? No. Even then, when will *most* new films coming out be available in 4K? Probably in 5 years' time, when you'd build a new HTPC anyway.

The 4K thing is absolutely irrelevant at this point (unlike 3D I'd argue because you can go into plenty of shops and buy actual 3D media).

After Hi Def came out, hardware (TVs) was available quickly, but it took a *long* time before there was plenty of 1080p material anyway (note use of the word 'plenty'). Hell, most people I know are still watching stuff in SD. Laughably, 4K isn't even close to being out yet, let alone the content.

There is an appropriate CPU/APU model for every budget these days. Virtually any current model APU/CPU will perform just fine for 98% of consumers. Most consumers buy what fits their needs and budget, not the overpriced, overhyped top-of-the-line models.

AMD's new Trinity APUs and Vishera desktop FX processors offer more performance for less, which is good for consumers.

We don't know about Vishera, not yet anyway. We don't know what the improvements over Bulldozer will yield as a whole, only what a couple of benchmarks showed in a brief Toms comparison between Trinity and Zambezi. There are plenty of scenarios to consider.

Yes, some of us do know the results... Comparing Trinity to Vishera is incorrect. Vishera is to be compared to Zambezi.

AMD has hit their projected 10-15% gains for Vishera compared to Zambezi. Some people already know the results but the NDA doesn't expire for a few weeks so they can't print them yet. Most folks will be happy with Vishera except the haters.

I'd find it hard to believe you were personally under NDA (please prove me wrong). I also believe the gains were per clock, which should theoretically, given the assumption you stated, result in a slightly larger performance gap between the 8150 and the 8350 as the latter has a higher base clock and is more likely to hit max turbo speed.

Like I said though, two benchmarks in the public domain aren't gospel, regardless of whether we're comparing Vishera OR Trinity to Zambezi. Remember that L3 cache doesn't always help, but when it does, the gains can be significant, meaning the A10-5800K could occasionally be outperformed by a similarly clocked 41x0 CPU, but the flip side is that it could occasionally perform on par with a similarly clocked 43x0 CPU.

"AMD was a little late in getting to the CPU - GPU party. Their first endeavour, the Llano APU"

Aren't Zacate and Ontario APUs? They were released in 01/2011, half a year before Llano. Or aren't you counting low power APUs? :) Thanks for the article!

What's the big deal with 4K at THIS moment? There are no 4K TVs out, are there? By the time they're out, or by the time they're actually affordable for a decent number of consumers, we will have had several generations of new APUs.

These HTPC-perspective articles are consistently some of the most useful and interesting content that AT puts up. As far as I can tell, there really aren't any other tech sites that delve this deep into this kind of functionality - most reviews settle for playing a 1080p Blu-ray and posting a screenshot of the CPU usage in Task Manager. While it may only be a relatively small audience for whom this stuff is relevant, we are a very interested audience, and I personally appreciate every detail and statistic included. Thanks Ganesh!

At this point, it seems 4K is more marketing hype. I'll link this article: http://reviews.cnet.com/8301-33199_7-57491766-221/... For the typical Anandtech readers (probably much more technically gifted than me), I also recall reading a similar article/post on avsforums explaining that for any display size <~100 inches, the 4K standard is hard to justify. Also, while I know that future-proofing is sound, there is very little content or ability to play back said content at that resolution. As a previous poster mentioned, by the time 4K becomes a standard, the current platforms will seem antiquated. Anyway, Anandtech is the best tech site around by far; I read it every morning.

Meaning, setting it to 16-235 discards 0-15 and 236-255 and expands the remainder to full RGB.

Obviously I don't have a Trinity setup so I'm just speculating, but on my HD6400 there is a different parameter on the display configuration section to tweak screen output range - which I set to RGB full range.

I think you are referring to the pixel format output which is YCbCr 4:4:4 / YCbCr 4:2:2 / RGB Limited / RGB Full

The dynamic range aspect is orthogonal to the pixel format output over HDMI.

The screenshot posted is that of a video playing in the background. Sorry if that wasn't clear. I am not sure about AMD's terminology here, but any user setting the dynamic range to 16-235 would expect NOT to see values 0 - 15 and 236 - 255.

Yes I was referring to pixel format output. I use RGB Full. I was under the impression that YCbCr cannot display the ranges 0-15 and 236-255 but I think I might be wrong on this one. It is YV12 / YUY2 colorspaces that lack these ranges.

And what you're saying about dynamic range is exactly what I'm saying is happening. If you select 16-235, then 0-15 and 236-255 from the video are filtered out and the remainder is expanded back to 0-255. Thus, a video decoded to YV12 / YUY2 space played on a full range display would have greyish blacks or whites without selecting the 16-235 range. Meaning, the wording in AMD's UI is correct; just the whole idea behind it is confusing.
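The expansion being described can be sketched as follows (my own illustration of the standard limited-to-full levels mapping, not AMD's actual driver code): values below 16 and above 235 are clipped, and the 16-235 span is stretched to 0-255.

```python
def limited_to_full(value):
    # Clip below-black (0-15) and above-white (236-255) values,
    # then stretch the 16-235 range linearly onto 0-255.
    clipped = min(max(value, 16), 235)
    return round((clipped - 16) * 255 / (235 - 16))

print(limited_to_full(16))   # video black -> 0
print(limited_to_full(235))  # video white -> 255
```

Without this expansion on a full-range display, video black (16) would be shown as a dark grey instead of true black, which is the "greyish black" effect described above.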

- HTPC Box (passively cooled): an A10-5700 would work great in there and be a nice upgrade!

- Office/Workstation Box: GPU acceleration can make a lot of difference, not to mention people have different needs.

- Gaming Box: for someone who wants to game but doesn't want to shell out the money needed to get 1080p Ultra graphics, or as I see it, a gaming starter kit.

Well? APUs have plenty of point if you're not an out-of-touch Intel fanatic. Also, did you even read the review? There was encoding and decoding that the APU did really well.

BTW, I have a passively cooled HTPC and a laptop I use for office work, both based on APUs (currently Llano; the HTPC is getting a Trinity upgrade though), and I wouldn't want them any other way.

I'd like to see AMD trying a bit harder to keep their power consumption down, because in the end the reason for me to choose an i5-3570K was that like AMD it offered 'enough' GPU power, but at a much lower max power. My computer runs at well under 10W idle and about 75W max (OCCT+Furmark), more like 45W in normal use. I wouldn't be able to get near that kind of power consumption with equally-featured Trinity parts (aside from the lower CPU performance, which isn't really a big deal tbh).

Yup, I've been working on my own little HTPC project (although not as cool as yours :D). The Streacom FC5-OD is surprisingly good at cooling down even a 100W APU. Right now I'm using a 3870K, and I'm planning on getting the A10-5700 ASAP. The final touch I plan on adding is the 6670, connected to the opposite cooling ribs. However, right now I'm running into a PSU limit, which I plan on countering by getting a slightly better PSU (a 250W CarPC PSU instead of a 150W picoPSU).

But yeah, despite the slightly higher load power, the fact is that at idle, and most likely on average, AMD has really brought down power consumption with Trinity. I like your setup, and will probably borrow a few ideas from there.

Nope :) 29 Hz is 'control panel speak' for 29.97 Hz and 59 Hz is 'control panel speak' for 59.94 Hz. So, if you have a file at 29.97 fps, it can be played back without any dropped frames or asymmetric repetition at 59.94 Hz, since each frame just has to be 'painted' twice at that refresh rate.
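The exact-fraction arithmetic behind this is worth spelling out (a small illustration using the true NTSC rates, which share the same 1001 denominator):

```python
from fractions import Fraction

content_fps = Fraction(30000, 1001)   # '29 Hz' in control-panel speak: 29.97 fps
refresh_hz = Fraction(60000, 1001)    # '59 Hz' in control-panel speak: 59.94 Hz

# The ratio is exactly 2, so every frame is painted a whole number
# of times and no frame ever has to be dropped or repeated unevenly.
print(refresh_hz / content_fps)  # 2
```

This is the same reason a true 23.976 Hz output matters for 23.976 fps film content: an exact integer ratio (there, 1:1) is what avoids judder.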

I remain completely bewildered that chip manufacturers cannot get the frame rates right. It may be an odd frame rate, but it is a standard rate that has remained the same forever.

However, the problem for AMD remains the TDP of the processors. Heat needs to be dealt with, usually by fans, and that means noise. An HTPC needs to be as close to silent as possible.

A TDP of 65W is simply too high. You can (as I have) buy a ridiculously overpowered i7-3770T, which has a TDP of 45W. AMD needs to reduce the TDP to no more than 35-45W. At that point there are various HTPC cases which can cool that completely passively.

Overall this is yet another step forward toward the ideal HTPC, but we are still short of the promised land.

Put one of these on a mini-itx board and cram it into something the size of the Shuttle HX61 that I just got and I am interested. I am so spoiled by having a small, silent, cool HTPC I will never go back to anything louder or bigger than a 360.

"Intel started the trend of integrating a GPU along with the CPU in the processor package with Clarkdale / Arrandale. The GPU moved to the die itself in Sandy Bridge. Despite having a much more powerful GPUs at its disposal (from the ATI acquisition), AMD was a little late in getting to the CPU - GPU party."

According to my reading, it was AMD, not Intel, that first talked about and initiated the APU (CPU + GPU). Intel saw the threat, used its manpower and resources, and came out with a CPU + GPU chip first.