UHD: 8 Bits vs 10

Finally realizing that more pixels alone are not enough, the UHD Alliance has come out with specs for what it calls “Ultra HD Premium.” Besides HDR and a wide color gamut, the performance metrics require a minimum video bit depth of 10 bits. (I assume they are talking about the luminance channel here, as most video is encoded as luminance plus color-difference signals.) However, if you look inside almost any broadcast/cable/satellite transmission facility, you’ll see that most are using 8-bit mezzanine formats such as XDCAM HD 50 for server storage. Why is this so bad? Imagine paying $1K or more for a 4K monitor and seeing banded background colors like the thumbnail above. (Or at left, depending on the screen size of your device.) And no, the graphic artist did not design it that way.
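The banding comes down to how many code values a subtle gradient can span at a given bit depth. Here’s a minimal Python sketch of the arithmetic; the 0.40–0.45 brightness range is an arbitrary assumption chosen to stand in for a dark background wash, not a measurement from any actual image:

```python
# Illustrative sketch of bit-depth quantization, not actual encoder code.

def quantize(value, bits):
    """Map a normalized 0.0-1.0 brightness to an integer code at the given bit depth."""
    levels = (1 << bits) - 1   # 255 codes for 8-bit, 1023 for 10-bit
    return round(value * levels)

# A subtle background gradient spanning just 5% of the brightness range
# (hypothetical values, roughly the kind of wash that shows banding).
start, end = 0.40, 0.45

steps_8 = quantize(end, 8) - quantize(start, 8)     # distinct 8-bit codes available
steps_10 = quantize(end, 10) - quantize(start, 10)  # distinct 10-bit codes available

print(f"8-bit codes across the gradient:  {steps_8}")   # 13 -> visible bands
print(f"10-bit codes across the gradient: {steps_10}")  # 51 -> far smoother
```

With only about a dozen codes to draw a wide, slow gradient, each step lands on a visible contour line; four times as many codes pushes the steps below the visibility threshold, which is the whole point of the 10-bit requirement.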

While the upcoming 4K Blu-ray Disc format (which requires a new 4K Blu-ray Disc player) and some streaming services (like Netflix) will be capable of delivering Ultra HD Premium content, don’t expect any from broadcast or cable services any time soon. The future over-the-air ATSC 3.0 standard will support Ultra HD Premium services, but you’ll need some future ‘to be announced’ converter box to make it work with today’s 4K displays. Gee, it’s fun being on the bleeding edge of technology, isn’t it?