September 2014

Hot Button Discussion

HEVC Slowly Rolls Out
By Michael Goldman

Since SMPTE Newswatch last reported on the then-impending formal arrival of the High-Efficiency Video Coding compression standard (HEVC, also known as H.265 or MPEG-H Part 2) in 2012, much has transpired in terms of the standard's direction and its potential impact on a wide range of improvements in the efficient broadcasting of high-quality video content direct to consumers. On one hand, suggests Matthew Goldman, senior vice president of TV compression technology at Ericsson, HEVC has clearly established itself as the enabling compression standard that will eventually make the transmission of ultra high-definition television (UHDTV) content to consumers possible. On the other hand, he points out that fully rolling the new spec into the professional and consumer hardware systems necessary to make all this work efficiently could take several more years; the horse, in essence, is only now leaving the starting gate.

"For broadcasters, HEVC provides a practical mechanism to deliver 4K UHDTV [content] to the consumer's home environment or even mobile devices, because the leap in bandwidth efficiency gives us around a 50% bit rate savings over [the previous video compression standard] AVC [also known as H.264 or MPEG-4, Part 10]," Goldman says. "For all the standards organizations, for both professional and consumer applications, therefore, the standard for the coming decade or more will clearly be HEVC, but the question is when, since it all depends on when the hardware technology supporting HEVC becomes mature. Typically, mature implementations come about two years after a standard is published. You always have early adopters, but they are not yet mature implementations. The other thing to keep in mind is that HEVC, unlike its predecessors--AVC and MPEG-2--because of its higher level of efficiency, addresses a wider breadth of applications. For instance, MPEG-2 only addressed high-definition and standard-definition video, and didn't do very well in cases where lower bit rates were needed, such as Web video, Internet television or over-the-top (OTT). AVC did a better job, but HEVC does a much better job on the low end, to the point where OTT or Internet video can get much higher resolutions at lower bit rates than in the past. That level of efficient compression even makes it practical when broadcasting video content over 4G LTE cellular networks for the first time, as Verizon is now planning to do. At the same time, HEVC addresses high efficiency at the high end--the 4K UHD level. But the industry needs to get it into hardware, and not just software implementations, before we see it really start to take off, and that will take a while."
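Goldman's "around a 50% bit rate savings" claim can be made concrete with a back-of-envelope calculation. The AVC reference bit rates below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope illustration of HEVC's ~50% bit rate savings over AVC.
# The AVC bit rates below are hypothetical reference points for a few
# service types, chosen only for illustration.

def hevc_bitrate(avc_bitrate_mbps: float, savings: float = 0.5) -> float:
    """Estimate the HEVC bit rate for comparable perceptual quality,
    given an AVC bit rate and a fractional savings (about 50%)."""
    return avc_bitrate_mbps * (1.0 - savings)

# Hypothetical AVC bit rates in Mbit/s (assumed values, not from the article).
avc_rates = {"OTT HD": 5.0, "Broadcast HD": 8.0, "4K UHD": 30.0}

for service, avc in avc_rates.items():
    print(f"{service}: AVC {avc} Mbit/s -> HEVC ~{hevc_bitrate(avc):.1f} Mbit/s")
```

The point of the sketch is simply that halving the bit rate brings 4K UHD delivery into the range that broadband and even cellular links can realistically sustain.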

"Therefore, there will be no single date for when the industry will officially 'adopt' HEVC. It will roll out over several years, and it could be beyond 2018 or 2019 before [it becomes] fully ubiquitous. That is not an official date, just my best guess, based on what I'm seeing and past experience. In particular application spaces, such as video over the Internet, we already are starting to see it happen with software encoding. However, it won't be a complete or meaningful rollout until it is in the hardware for market applications such as traditional broadcast over direct-broadcast satellite, cable TV, and terrestrial, where the highest picture quality at lower bit rates is required, and for UHD content to the consumer. These rollouts take longer."

Goldman points out that HEVC's status is such that, as industry developments continually push the envelope of picture quality with improved dynamic range, higher frame rates, and so on, the standard will continually need to evolve to address ongoing demands to wring even more bandwidth out of the transmission sponge. Indeed, when HEVC was formally finalized by the Joint Collaborative Team on Video Coding (JCT-VC) (a joint effort involving coding experts from both the ITU-T Video Coding Experts Group and the ISO/IEC Moving Picture Experts Group) in January of 2013, its initial version specifically defined a new standardized codec for consumer-related direct-to-home applications--for things like digital cable, direct broadcast satellite transmissions, IPTV delivery, Blu-ray encoding, and so on. Goldman says this first step "is affectionately known as version 1 of HEVC, because it is a living standard, and we always intended to have later versions follow."

Version 2 followed in April of this year. Known as the Range Extensions of the original standard, or RExt for short, it is analogous to the Fidelity Range Extensions (FRExt) that were built into the previous AVC standard.

"The Range Extensions cover mainly professional applications, particularly content acquisition, exchange, and primary distribution," Goldman explains. Essentially, the idea is to permit higher bit rates for even more pristine picture quality while still achieving a significant reduction in bit rate compared to uncompressed video, which is the objective of the standard. "Since UHDTV-1 [also known as 4K UHDTV] is four times the resolution of HDTV, and after that, eventually, we will have UHDTV-2 [also known as 8K UHDTV], which is 16 times the resolution of HDTV, you can imagine how much more bandwidth professional plants will have to have to [process] such content, both video and audio," he says. RExt is designed to address that issue for professional applications. Because it was only finalized in April, hardware incorporating the new profiles is not yet available. A typical application would be an uplink truck at a venue doing live content acquisition, which today is handled with the AVC compression standard using 4:2:2 profiles. When RExt hardware becomes available, such operations will be able to raise picture quality within the confines of the same bandwidth. That is an exciting development, but once again, Goldman says, it will be some time before it is deployed practically in the field.
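The bandwidth pressure Goldman describes follows directly from the pixel counts. A rough sketch of uncompressed payloads, assuming 4:2:2 10-bit sampling at 60 frames/s (illustrative parameter choices, not figures from the article), shows the 4x and 16x scaling he cites:

```python
# Rough uncompressed video bandwidth for HDTV, UHDTV-1 (4K), and UHDTV-2 (8K),
# illustrating why professional plants need far more capacity for UHD.
# Assumes 4:2:2 10-bit sampling at 60 frames/s; these are illustrative
# assumptions, not parameters mandated by the article.

def uncompressed_gbps(width, height, fps=60, bits_per_sample=10,
                      samples_per_pixel=2.0):
    """4:2:2 sampling averages 2 samples per pixel
    (1 luma + 0.5 Cb + 0.5 Cr)."""
    bits_per_second = width * height * samples_per_pixel * bits_per_sample * fps
    return bits_per_second / 1e9

formats = {"HDTV": (1920, 1080),
           "UHDTV-1": (3840, 2160),
           "UHDTV-2": (7680, 4320)}

for name, (w, h) in formats.items():
    print(f"{name}: ~{uncompressed_gbps(w, h):.1f} Gbit/s uncompressed")
```

Because UHDTV-1 carries four times the pixels of HDTV and UHDTV-2 sixteen times, the uncompressed payload scales by the same factors, which is exactly the gap RExt's higher-fidelity compression is meant to narrow.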

Meanwhile, even before the first two versions of HEVC have fully pushed their way into hardware that will eventually replace AVC legacy equipment around the industry, a third version of HEVC is now "hot off the presses," Goldman states. That version is the Scalable High Efficiency Video Coding (SHVC) extension of the codec. Goldman says SHVC was finalized in July and, at press time, was expected to be published shortly.

The notion behind SHVC is to build into the spec the ability for encoded video streams to carry a "base layer" that is decodable by all devices, along with "enhancement layers" of data that end-user devices can use or ignore, depending on the device in question, to improve the consumer's viewing experience.

"Legacy devices would just discard the enhancement layers because they can't decode them," Goldman explains, for example because they do not have enough memory or processing power. Newer devices, however, would understand the enhancement layers, decode them, and combine them with the base layer to create a much better picture on newer displays. For instance, a television or handheld device that does not support UHD could take HD information from the base layer and display HD, while a device that can handle UHD could grab the additional information from the enhancement layer and combine it with the HD base layer to give the user a UHD picture. This kind of application is known as "spatial scalability," one of the types of layered coding that SHVC defines.
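The decode-or-discard behavior Goldman describes can be sketched in a few lines. The layer names and the capability flag below are hypothetical simplifications for illustration, not actual SHVC bit stream syntax:

```python
# Sketch of spatial scalability: one scalable stream serves both legacy
# HD devices and newer UHD devices. Layer labels and the capability flag
# are hypothetical simplifications, not real SHVC syntax.

def layers_to_decode(stream_layers, device_supports_uhd):
    """Return the layers a device would actually decode.

    A UHD-capable device keeps the base layer plus the spatial
    enhancement layer and combines them; a legacy device simply
    discards the enhancement layers it cannot decode."""
    if device_supports_uhd:
        return [l for l in stream_layers
                if l in ("base_hd", "enhancement_uhd")]
    return [l for l in stream_layers if l == "base_hd"]

stream = ["base_hd", "enhancement_uhd"]
print(layers_to_decode(stream, device_supports_uhd=False))  # legacy HD set-top
print(layers_to_decode(stream, device_supports_uhd=True))   # UHD television
```

The design appeal is that one encoded stream serves every class of receiver, rather than simulcasting separate HD and UHD streams.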

Another type of scalable data that could travel in the HEVC stream is "temporal scalability," Goldman adds. In that case, the stream carries layers of data for content that can be displayed at different frame rates, depending on the end user's device. Other types of scalability, as well, would benefit from the SHVC version of the new compression standard.

"For example," he elaborates, "a legacy display may only handle 50 or 60 frames per second, and that would be coded in the base layer, while a newer device could decode an enhancement layer that doubles the frame rate to 100 or 120 fps."
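The frame-rate-doubling example Goldman gives can be sketched the same way. The frame-tagging scheme below is a simplification for illustration, not HEVC bit stream syntax:

```python
# Sketch of temporal scalability: frames alternate between a base
# temporal layer (e.g., 60 fps) and an enhancement layer that doubles
# the rate to 120 fps. The tagging scheme is an illustrative
# simplification, not real HEVC syntax.

def frames_for_device(frames, max_fps, base_fps=60):
    """Keep only base-layer frames (temporal_id 0) on a legacy display;
    keep both layers on a display that supports the doubled rate."""
    if max_fps >= 2 * base_fps:
        return frames  # decode base + enhancement layers
    return [f for f in frames if f["temporal_id"] == 0]

# One second of a 120 fps stream: even-indexed frames form the base
# layer, odd-indexed frames the temporal enhancement layer.
stream = [{"index": i, "temporal_id": i % 2} for i in range(120)]

print(len(frames_for_device(stream, max_fps=60)))   # legacy display: 60 frames
print(len(frames_for_device(stream, max_fps=120)))  # newer display: 120 frames
```

As with spatial scalability, the legacy device loses nothing by ignoring the enhancement layer; it simply plays the base-rate content it would have received anyway.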

Goldman quickly adds, however, that adding scalable capabilities to a compression codec is not a new concept; it was tried before with both MPEG-2 and AVC. While the concept has always been technically feasible, there were always practical implementation difficulties that outweighed any benefits scalable coding could have provided, according to Goldman. Since SHVC has only just been finalized, and has not yet been widely deployed into modern hardware and used in practical situations, he says it is too soon to know whether the concept will fare better this time around under the HEVC umbrella.

"In the past, by the time we mathematically calculated how to do it and tried to implement it practically and economically for a real commercial deployment, scalable coding either was impossible to process in real time, or the complexity was so high that it was more economical to just simulcast two separate streams--like a standard-def and a high-def stream, for instance," Goldman states. "The savings in bit rates that could have resulted did not justify the extra complexity or economic costs of implementing the concept," he says.

"But it is important to note that the concept is not new. Today, engineers are hoping that, thanks to Moore's Law and the many improvements they have made to the algorithms, it is now technologically possible to do. They hope that will make scalable coding viable in real time versus simulcasting. Until it is deployed and tried out in real-world situations, we really won't know for sure. So that part of the [HEVC] equation has not yet been proven. I do believe that the direct-to-consumer version--the first version of the standard--and the Range Extensions are here to stay."

In fact, as he stated in Newswatch in 2012, Goldman continues to believe that video compression leaps occur about every decade. What may end up being different with HEVC, he suggests, is that it may not be wholly replaced by a new standard in a decade, as was the case with MPEG-2 and AVC. For the sake of interoperability, flexibility, and a myriad of business reasons, a rush to continually evolve into something radically different would serve no good purpose for the industry in the long run, he believes. Rather, HEVC may simply be updated in almost modular fashion to address the best ways to transmit the various new picture-quality offerings the industry comes up with as the UHDTV era dawns.

"There will be a call for technology to address higher dynamic range (HDR) for MPEG to put into the HEVC standard, likely early next year," Goldman says. "But it will probably be another ten years before you would want to have another major leap beyond HEVC. It has to be orderly; otherwise you would just confuse the industry. But I suppose there is always the desire to keep searching for that Holy Grail--what MPEG called years ago reconfigurable video coding, or RVC. That would be to actually download the spec to two-way receiving devices to allow them to continually reconfigure themselves using the latest algorithm, whatever it might be. But for now, that is still a Holy Grail--the technology just isn't there to do something like that yet. For now, you move forward in steps, and with HEVC, we've taken a pretty big step," he says.

News Briefs

Satellite Survives
A recent analysis on the Cable & Satellite International (CSI) website explores the future of direct-to-home (DTH) satellite broadcast technology in light of the ongoing interactive media revolution and over-the-top (OTT) video delivery technologies. The article suggests that interactive media delivery systems can exploit a key weakness of DTH satellite broadcasting: unlike satellite, they are, by definition, two-way and interactive, which modern consumers increasingly expect in the new media world. Therefore, the article suggests, in regions of the world where cable, fiber, and other high-end broadband transmission systems can penetrate, it makes sense to expect satellite transmission to recede over time. On the other hand, the article states that many parts of the world simply won't be among those regions any time soon. Rural and remote areas that currently don't make economic or logistical sense for broadband companies will continue to rely on DTH as their primary broadcast transmission method. Indeed, the article points out that, globally, the number of satellite channels and the number of homes that can receive satellite signals continue to rise, and that major satellite players like SES and Intelsat have scheduled a half-dozen new broadcast satellite launches between now and the end of 2016. The article also examines new developments in satellite IP applications that further add to DTH's evolving value. For these and other reasons, the article postulates, the world appears very much headed for a hybrid future, with DTH and newer video transmission technologies continuing to co-exist.

New Media, Traditional Content
Television traditionalists should take heart in a great irony presented in a recent article from the TechNewsWorld site, which suggests that the rapid proliferation of OTT video service providers has rejuvenated the spirit and pursuit of critically acclaimed, original broadcast content creation. The article, by Peter Suciu, titled "The New Golden Age of TV," suggests that the recent awards, critical success, and high viewership for shows like "House of Cards" from OTT service Netflix have ignited something of a race among other OTT providers like Hulu, Amazon Prime, and more recently, even Yahoo, to find and produce original content. This development is creating new work for writers, directors, producers, actors, and others across the content creation industry, the piece points out. Netflix and the OTT paradigm generally have changed how and when people view such content, as well as the technology used to make, distribute, access, and watch it, and how traditional viewing seasons and ratings are calculated and analyzed, the article states. However, these providers are following a grand tradition of the television industry in that they are finding that original, creatively satisfying programming remains the form of entertainment consumers most want to access, even as the medium they choose continues to evolve.

Home Atmos Arriving
As a follow-up to recent coverage in Newswatch about Dolby's plans to offer a consumer version of its object-based Atmos immersive cinema sound system, The Hollywood Reporter recently reported that the company is so far along with its plans that the technology will be ready to be supported by partners offering Blu-ray players and OTT services some time this fall. The report states that Dolby has developed authoring tools that permit studios to create Atmos theatrical mixes for home theater playback, and that studio partners will start offering such titles, playable on Blu-ray players that use Dolby's TrueHD codec or through OTT services that use various Dolby encoding schemes to offer audio tracks in the Atmos format. The article also states that consumers will have a few options for configuring home theaters to play out the immersive sound experience; central to the concept is a new rendering technology that steers sound to a particular height plane, replicating the experience of hearing a ceiling speaker.