Flash acceleration has traditionally worked without issues with AMD and NVIDIA drivers, unlike Intel's. Intel and Adobe finally got it right with Ivy Bridge. Fortunately, things look good with Trinity too. As the screenshot below indicates, we have full GPU acceleration for both decoding and rendering. AMD's System Monitor shows how the CPU and GPU resources are balanced when playing H.264 Flash videos.

Netflix streaming, on the other hand, uses Microsoft's Silverlight technology. Unlike Flash, hardware acceleration for the video decode process is not controlled by the user; it is up to the server-side code to attempt GPU acceleration. Thankfully, Netflix does try to take advantage of the GPU's capabilities.

This is evident from the A/V stats recorded while streaming a Netflix HD video at the maximum possible bitrate of 3.7 Mbps. The high GPU usage in the AMD System Monitor also points to hardware acceleration being utilized.

One point that deserves mention is that Flash and Silverlight acceleration works without hiccups on Trinity, unlike what we saw on the Brazos-based machines (where the CPU was too weak to keep up despite the availability of hardware decode acceleration through the GPU).

49 Comments

Hmmm.. all vendors tag 23.976 Hz as 23 Hz in the monitor / GPU control panel settings. So, when I set the panel to 23 Hz, I am actually expecting 23.976 Hz. However, this platform gives me 23.977 Hz, which is a departure from the usually accurate AMD cards that I have seen so far.

In short, with the 0.001 Hz difference, the renderer might need to repeat a frame every ~17 minutes. I am NOT saying that this is a serious issue for everyone, but there are some readers who do care about this (as evidenced by the range of opinions expressed in this thread: http://www.avsforum.com/t/1333324/lets-set-this-st...)
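The ~17-minute figure above follows from simple arithmetic on the refresh-rate error. A quick sketch of the calculation (using the exact 24000/1001 film cadence for the source, and the 23.977 Hz figure reported for this platform):

```python
# Frame-repeat interval from a refresh-rate mismatch.
# A 23.976 fps video shown on a display running slightly fast
# accumulates surplus refreshes; each surplus refresh forces
# one repeated frame.

source_fps = 24000 / 1001   # exact "23.976" film cadence
display_hz = 23.977         # measured refresh rate on this platform

drift_hz = display_hz - source_fps       # surplus refreshes per second
seconds_per_repeat = 1 / drift_hz        # time until one full extra refresh
minutes_per_repeat = seconds_per_repeat / 60

print(f"drift: {drift_hz:.6f} Hz")
print(f"one repeated frame every ~{minutes_per_repeat:.1f} minutes")
```

With these numbers the drift works out to roughly 0.001 Hz, i.e. one repeated frame every ~17 minutes, matching the estimate in the comment.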

"The video industry is pushing 4K and it makes more sense to a lot of people compared to the 3D push. 4K will see a much faster rate of adoption compared to 3D, but Trinity seems to have missed the boat here. AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."

Although this statement is technically correct, it has no real-world relevance. At this time, people who can afford 4K TVs (if there are any commercially available ones at this time) won't be messing around with cheap HTPCs. It's an inconsequential statement made just to detract from AMD's overall superiority with this product in the HTPC market.

If I was in AMD's shoes, why would I dedicate resources to a nonexistent market? Has anyone actually tested NVIDIA's or Intel's 4K output over HDMI to see whether it actually works? In the early days of HDCP, all the video card manufacturers were claiming compliance, but real-world compatibility was a different matter.

More importantly, do you have any 4K films to watch? No. Will you in the immediate future? No. Even then, when will *most* new films coming out be available in 4K? Probably in 5 years' time, when you'd build a new HTPC anyway.

The 4K thing is absolutely irrelevant at this point (unlike 3D I'd argue because you can go into plenty of shops and buy actual 3D media).

After Hi Def came out, hardware (TVs) was available quickly, but it took a *long* time before there was plenty of 1080p material (note use of the word 'plenty'). Hell, most people I know are still watching stuff in SD. Laughably, 4K isn't even close to being out yet, let alone the content.