From an HTPC perspective, GPUs over the last two generations have done little to tempt users into upgrading since HD audio bitstreaming became a commodity feature. The arrival of 3D TVs prompted some updates from the GPU vendors, but the technology never picked up at the pace the industry hoped for.

With the 3D craze having been milked dry, it is now time for a new buzzword: 4K. Retina displays have become the focus of much talk thanks to Apple's promotion, and at Computex we saw the introduction of products with 11" and 13" screens at 1080p resolution. It is not hard to imagine 4K panels becoming commonplace in TVs 32" and larger, and even in monitors 24" and larger.

The one aspect 4K has going for it is that the higher resolution (for video, at least) is unlikely to have any ill effects on viewers' health. Unlike 3D (which caused discomfort for a number of consumers), we expect 4K to have much smoother sailing in gaining marketplace acceptance. In addition, 4K is the natural step towards a more immersive experience. As such, we are more positive about 4K, from both a consumer and an industry perspective, than we ever were about the 3D initiative.

In terms of early adoption, the current issue with the 4K ecosystem is that HDMI officially supports only up to 4096 x 2160 @ 24 Hz and 3840 x 2160 @ 30 Hz. For a smooth desktop experience at 4K resolution, it is imperative that we get 60 Hz refresh at 4096 x 2160; that is scheduled to arrive in the next update to the HDMI specification. It is also unfortunate that we are restricted to 4096 x 2160 as the maximum resolution, when a full 16:9 frame at that width (as used by some 4K content) would be slightly taller at 4096 x 2304.
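A back-of-the-envelope pixel-clock calculation shows why 60 Hz at 4096 x 2160 is out of reach for the current HDMI specification. The ~340 MHz single-link TMDS limit and the ~10% blanking overhead used below are approximations for illustration; exact CVT-RB timings differ slightly.

```python
HDMI_1_4_MAX_PIXEL_CLOCK_MHZ = 340  # approximate single TMDS link limit

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.10):
    """Approximate pixel clock in MHz, assuming ~10% blanking overhead."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h, hz in [(4096, 2160, 24), (3840, 2160, 30), (4096, 2160, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= HDMI_1_4_MAX_PIXEL_CLOCK_MHZ else "exceeds HDMI limit"
    print(f"{w}x{h} @ {hz} Hz -> ~{clk:.0f} MHz ({verdict})")
```

The 24 Hz and 30 Hz modes come in comfortably under the limit, while 4096 x 2160 @ 60 Hz requires well over 500 MHz, explaining why it has to wait for the next HDMI revision.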

In any case, the Zotac GT 640 we are looking at today is compliant with the current HDMI 4K specifications. 4K resolution is available over all three ports (using an appropriate DVI to HDMI converter for the DVI outputs). Both DVI ports are also capable of carrying audio.

AMD's GCN lineup is also compliant with the HDMI 4K specifications, but it is the GT 640 that has excited us enough to discuss this in detail. While AMD's 4K hardware decode remains unusable for the general consumer right now, NVIDIA's Kepler implementation fares much better.

Due to the aforementioned issues with the mini-HDMI port on Zotac's card, we tested the 4K output over the dual-link DVI port, connected to the Sony VPL-VW1000ES through a DL-DVI to HDMI adapter.

Currently, native DXVA mode implementations tend to crash the system. However, using LAV Filters 0.50.5 in CUVID mode or DXVA2 Copy-Back mode, we are able to decode H.264 streams at resolutions greater than 1080p on the GPU.

The screenshot above (click on the picture for full 4K resolution) shows the playback of a 4096 x 2304 H.264 stream at 24 fps.

We see that the GPU's VPU is at approximately 60% load. EVR-CP doesn't load the GPU core much (less than 50% core utilization). Note that the maximum refresh rate possible at 4096 x 2160 is only 24 Hz, as indicated by the EVR-CP statistics. Also note that the LAV Video Decoder is operating in CUVID mode.

CUVID acceleration is also possible for videos of arbitrary resolutions greater than 1080p. The screenshot below (again, click for full Quad FHD resolution) shows flawless decode acceleration of a 3412 x 1920 video at 25 fps. At 3840 x 2160 (Quad FHD), the GPU is able to drive the desktop at a refresh rate of 29.97 Hz. In this case, the VPU load is a bit lower (around 45%), as expected.

How well do 4K decode and rendering work with other combinations of decoders and renderers? The usage graphs below present the CPU package power, GPU core load, memory controller load, video engine load and video bus load when playing our 4K test clip (the original version of this YouTube video) on a 1080p display (which is probably how most consumers will enjoy 4K content for some time to come). As usual, we accept no quality tradeoffs in madVR and go with the high quality settings we have used in previous reviews.

It is immediately obvious that the GT 640 is in no way up to the task of madVR processing on 4K content, even when it is just downscaling to 1080p. As evident from the graph above, the core is maxed out whenever we choose madVR as the renderer, irrespective of the decoder used. Our suggestion is to retain EVR-CP as the renderer for all 4K content.


60 Comments

This is the type of review that other hardware sites can't even imagine, let alone write. Thanks for putting this and the other HTPC articles together. It's great to see a hardware review site taking HTPC enthusiasts and their needs seriously. Excellent review.

What specifically are you looking for? Gaming performance or HTPC functionality? Gaming performance isn't likely to improve; even with the newer architecture it's not Kepler that's the limiting factor. HTPC functionality on the other hand can easily be improved with drivers.

If they're going to release a DDR3 version, why not just offer a version with no onboard memory and two DIMM slots so that users can add their own? You can get a DDR3-2133 kit which would boost bandwidth limited scenarios by roughly 15%. While I don't see the need, such a card could be upgraded all the way to 16 GB of memory.
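The bandwidth claim in the comment above is easy to sanity-check: peak DDR3 bandwidth is simply the effective transfer rate times the bus width in bytes. The ~1800 MT/s stock figure below is an assumption for the DDR3 GT 640; depending on the actual stock clock, the uplift from DDR3-2133 works out to roughly 15-20%.

```python
BUS_WIDTH_BITS = 128  # GT 640 memory bus width

def bandwidth_gbs(mts):
    """Peak memory bandwidth in GB/s for a given effective transfer rate (MT/s)."""
    return mts * 1e6 * (BUS_WIDTH_BITS // 8) / 1e9

stock = bandwidth_gbs(1800)     # assumed stock DDR3 speed
upgraded = bandwidth_gbs(2133)  # DDR3-2133 kit
print(f"stock: {stock:.1f} GB/s, DDR3-2133: {upgraded:.1f} GB/s "
      f"(+{(upgraded / stock - 1) * 100:.0f}%)")
```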

Sockets:
- are unconventional (I don't think nVidia likes this word)
- introduce a little cost (GPU manufacturer doesn't like it)
- make the board larger (GPU manufacturer doesn't like it)
- make the bus timing worse, so it's harder to clock them as high as directly soldered chips (wouldn't matter with DDR3, though)
- introduce another point of failure (GPU manufacturer doesn't like higher RMA rates)
- add cost to the overall product, as the end user wouldn't get as sweet a deal on RAM as the GPU manufacturer (this would eat into the GPU manufacturer's profit margin)

It is too noisy, and the HDMI socket is an epic design fail. As a card for an HTPC, what were Zotac thinking? This is so badly wrong.

Now onto frame rates. Nvidia, AMD and Intel really are total and utter idiots, or they have decided that we the customers are total and utter idiots. There is simply no excuse for any IGP or video card not to be able to lock onto the correct frame rate with absolute precision. It is not as though the frame rate specs for film have changed recently. I cannot decide whether it is sloppiness, arrogance, or that they simply do not give a rat's a##e for the customer experience.
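The comment above is about refresh-rate accuracy, and a quick calculation shows why even a small mismatch is visible: NTSC-derived film content runs at 24000/1001 fps, so a display refreshing at exactly 24.000 Hz drifts out of step with it at a predictable rate.

```python
from fractions import Fraction

# Film frame rate as used by NTSC-derived content: 24000/1001 ≈ 23.976 fps
content_fps = Fraction(24000, 1001)
display_hz = Fraction(24)

# The two rates drift apart by this many frames per second...
drift_fps = display_hz - content_fps
# ...so a frame must be repeated (or dropped) once every 1/drift seconds.
seconds_per_glitch = 1 / drift_fps
print(f"one repeated/dropped frame every ~{float(seconds_per_glitch):.1f} s")
```

With exact 24.000 Hz output the glitch lands roughly every 42 seconds, which is exactly the kind of periodic judder that precise refresh-rate locking avoids.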