Will Your UHDTV Chip Decode Every HEVC Bit Stream?

The indispensable element in the next-generation set-top box is a decoder chip capable of handling all the highly compressed video streams.

MADISON, Wis. — Is the semiconductor industry prepared to profit from the emerging 4K UHDTV market? And if so, just how ready is anyone?

Once consumers start looking for 4K Ultra HDTV (assuming they do), the indispensable element in the next-generation set-top box will be a decoder chip capable of handling all the highly compressed video streams.

Compression will be carried out using the complex but flexible "toolkit" defined by the High Efficiency Video Coding (HEVC) standard, a successor to H.264/MPEG-4 AVC (Advanced Video Coding).

Leading video chip companies such as Samsung, Broadcom, and ViXS Systems are all said to have developed their own HEVC chips. But these devices are still in early versions.

The nightmare secretly anticipated by many chip designers is the long and arduous task of verifying whether a video chip is truly capable of decoding every HEVC-encoded video stream.

HEVC, also known as H.265, is said to halve the bandwidth requirement for video, thus enabling migration to 4K video.

The issue is the sheer number of optional tools embedded in the standard's toolkit. Each encoder designer is free to use any tricks available in the toolkit to wring the best compression performance out of the standard.

That flexibility, however, poses a huge headache for decoder chip designers, since they need to make sure that their chip "understands" all the variables present in different HEVC-compressed video streams, explained Alan Scott, CEO of Argon Design.

HEVC in multi-core chips
To complicate the matter further, in HEVC, multi-core chips can encode streams in independent tiles, while hardware implementations may choose to minimize cache sizes by using wavefront encoding. However, decoders need to support all options.
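The wavefront option mentioned above imposes a distinctive diagonal processing order. As a rough sketch (the function name and the 3x4 picture size are illustrative, not from any HEVC implementation), each coding tree unit (CTU) in wavefront parallel processing can start only after its left neighbor and the CTU above-right are done, so each row trails the row above by two columns:

```python
# Sketch of HEVC wavefront parallel processing (WPP) dependencies.
# Each row of CTUs may trail the row above by two CTU columns, so
# CTUs on the same diagonal "wave" can be processed in parallel.
def wavefront_schedule(rows, cols):
    """Group CTU (row, col) coordinates into parallelizable waves."""
    waves = {}
    for r in range(rows):
        for c in range(cols):
            # Row r cannot start column c until row r-1 has finished
            # column c+1, which yields a two-column stagger per row.
            wave = c + 2 * r
            waves.setdefault(wave, []).append((r, c))
    return [waves[w] for w in sorted(waves)]

# A toy 3x4-CTU picture: CTUs listed together are independent.
for i, wave in enumerate(wavefront_schedule(3, 4)):
    print(i, wave)
```

A decoder has to honor this dependency pattern whenever an encoder chose wavefronts, whether or not the decoder itself exploits the parallelism.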

Also, "HEVC has moved from a fixed partitioning scheme to a quad-tree decomposition scheme. This means there are orders of magnitude more variations of block sizes for coding/transform/prediction," Scott told me.
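To get a feel for the "orders of magnitude" Scott describes, consider a toy count of quad-tree decompositions. The sizes below (a 64x64 coding tree unit splitting recursively down to 8x8) are illustrative of HEVC's typical range; the function is a sketch, not part of any codec:

```python
# Why quad-tree partitioning explodes the number of block layouts a
# decoder must handle: each block may stay whole or split into four
# quadrants, recursively, down to a minimum size.
def count_layouts(size, min_size=8):
    """Count distinct quad-tree decompositions of a size x size block."""
    if size == min_size:
        return 1          # smallest block: no further split allowed
    sub = count_layouts(size // 2, min_size)
    return 1 + sub ** 4   # leave whole, or split into four subtrees

print(count_layouts(16))  # -> 2
print(count_layouts(64))  # -> 83522
```

A single 64x64 coding tree unit alone admits tens of thousands of decompositions, and that is before counting the prediction and transform choices layered on top of each leaf block.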

If you've been around long enough to remember when the MPEG-1 and MPEG-2 standards were developed (in the 1990s), you probably also remember a test sequence showing tulip fields against Dutch windmills. The clip panned slowly across a beautiful landscape. But pretty soon, you might have wearied of tulip fields. You might have seen tulips in your dreams.

In those days, you could validate your video decoder by looking at the screen, says Scott. But that visual check won't cut it anymore, he believes, because "HEVC is many times more complex than MPEG-2."

He notes that the new standard "has so many options, variables and settings. It's hard to cope with all that complexity."
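One systematic alternative to eyeballing the screen is a conformance harness: run the decoder under test over a library of bit streams and compare a checksum of its output against reference checksums shipped with the suite. The sketch below assumes such a setup; `decode_fn` and the stream/checksum layout are hypothetical placeholders, not official HEVC conformance tooling:

```python
# Sketch of a decoder conformance harness: decode every stream and
# compare an MD5 of the output pixels against a reference checksum.
import hashlib

def verify(streams, decode_fn):
    """streams: list of (name, bitstream_bytes, reference_md5) tuples."""
    failures = []
    for name, bitstream, ref_md5 in streams:
        yuv = decode_fn(bitstream)            # decoder under test
        got = hashlib.md5(yuv).hexdigest()
        if got != ref_md5:
            failures.append(name)             # flag any mismatch
    return failures

# Toy usage with an identity "decoder" and a matching reference hash:
streams = [("toy", b"\x00\x01", hashlib.md5(b"\x00\x01").hexdigest())]
print(verify(streams, lambda b: b))  # -> []
```

The hard part, of course, is not the harness but assembling a stream library that actually exercises every tool in HEVC's toolkit.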

VP8 was introduced long after H.264, so it was at a disadvantage from the start. It seems that this time around, H.265 and VP9 will emerge at about the same time, and both should be available for UHDTV. It would be nice to see them compared for compression efficiency, measured as image quality vs. bit rate and, for any given image quality, the CPU resources required for encoding and for decoding.

I'm not sure HEVC is the only option worth discussing this time around.

The biggest issue is really frame rate. Current implementations only support 50/60Hz, when in reality the large frame size doesn't suit low frame rates. There is research going on into higher frame rates (120/150Hz), but I would really like to see the industry take a longer-term view and build for 300Hz.

I am glad I am not alone in being traumatized (no, not really) by the tulips and windmills video! (Well, as a reporter covering the field, at every MPEG demo I attended, I was shown the same video over and over...)

I always thought the standards body's "tool box" strategy was genius. But oh, boy, there appears to be so much flexibility in HEVC encoding.

It may take some time for vendors to "settle on a few subsets," though.

Ah yes, the tulips and windmills! There were a few other streams I watched so many times that they are forever etched into my memory.

On a more serious note, a standard verification suite will help, even if it isn't yet an official standard. Eventually, within the vast space of options on the encoder side, people will hopefully settle on a subset of tools that work well for most UHDTV delivery schemes.

...In HEVC, multi-core chips can encode streams in independent tiles, while hardware implementations may choose to minimize cache sizes by using wavefront encoding. However, decoders need to support all options.

...HEVC has moved from a fixed partitioning scheme to a quad-tree decomposition scheme. This means there are orders of magnitude more variations of block sizes for coding/transform/prediction.

Now both of the things pointed out above tell me that yes, verifying your decoder against your own encoder may work, but it certainly doesn't guarantee that your decoder can support the various encoders that will soon be out there! That sure sounds like "unintended consequences" waiting to happen.