Posted by timothy on Saturday January 26, 2013 @09:24AM
from the looks-so-much-like-his-brother dept.

An anonymous reader writes "The H.265 codec standard, the successor of H.264, has been approved, promising support for 8k UHD and lower bandwidth, but the patent issues plaguing H.264 remain." Here's the announcement from the ITU. From the article: "Patents remain an important issue as it was with H.264, Google proposing WebM, a new codec standard based on VP8, back in 2010, one that would be royalties free. They also included it in Chrome, with the intent to replace H.264, but this attempt never materialized. Mozilla and Opera also included WebM in their browsers with the same purpose, but they never discarded H.264 because most of the video out there is coded with it.
MPEG LA, the owner of a patent pool covering H.264, promised that H.264 internet videos delivered for free will be forever royalty free, but who knows what will happen with H.265? Will they request royalties for free content or not? It remains to be seen. In the meantime, H.264 remains the only codec with wide adoption, and H.265 will probably follow on its steps."

Nobody "won". Companies weren't making proposals for complete replacements for h.264. They were making proposals for incremental improvements on h.264. h.265 is a collection of those different improvements. Each one is small in itself, but they add up.

The H.264 patent pool has 30 licensors and the list of patents is 59 pages long, so the short answer is: most of the industry. Apart from Google with WebM and previously Microsoft with VC-1, there is surprising unity. My predictions are as follows: HEVC is as dominant in hardware as H.264, there will be an open source encoder like xvid/x264, and those who can't or won't use that will use WebM despite the somewhat larger size, because Google will probably fight to back it as a free codec. Anything else will never go anywhere outside geek circles, like Vorbis or Theora.

You're being very kind by saying WebM is "less effective" compared to H.264. I'd put it closer to "why in the hell would I want crummy looking compression unless I use at least twice the data rate?" This from someone who's livelihood partially comes from putting compressed streams on the Internet. WebM isn't good enough and just got lapped again.

Well if you're rendering some cutscenes for a game and want a codec that is free to use and better than MPEG1 - MPEG2 and all newer ones are still patented - then WebM might fit the bill. I'll agree it doesn't take much to win that category though.

WebM completely failed to gain any traction whatsoever against h.264, so why should it do any better against h.265?

Well, if WebM were as good as h.265, then we'd be in a place where no hardware supports either standard and new hardware could support both standards. Right now, h.264 has hardware support and WebM doesn't, putting it at a large disadvantage.

Amazing how that works. Company does a bunch of research work, and then says "hey, you can use the research work we did if you pay us". It's almost like the people there are normal human beings who want to live and eat!

But if the ITU is going to make it a standard, it really should require patent grants for any patents that would cover that standard, and require all entities that are participating to sign agreements that they are agreeing that any patents they currently hold or will hold in the future will not be used against those implementing this standard.

Google is also in a patent nuclear war with Apple and some of those patents should be thrown out entirely. Like any war, you have to do dirty things to survive because you can't count on the other side acting civilized.

Patents more and more seem to be nothing but a pointless burden. Both classes should be done away with (the bogus ones and the ones that standards depend on).

What would be better would be requiring a Free implementation of said standard, and if $BIGCO doesn't want to (or can't due to other obligations) make their resulting product Free as well, they can pay a license fee. Just like all the other dual licensed software/source out there....

That's a popular opinion, but it's not part of the ITU charter. The ITU (and ISO) make official standards that aren't free. The ITU/ISO don't have an open source/free only perspective. Nor do they disallow competing standards.

> Company does a bunch of research work, and then says "hey, you can use the research work we did if you pay us".

Patenting and/or charging to do math is idiotic. The point of having a standard is that _anyone_ could read it and implement a working version. Standards _need_ to be free, or else society literally pays the price of "progress ransom".

You don't have to pay a fee to write HTML, Javascript, etc. You shouldn't have to pay a fee just to shuffle numbers around - i.e. to encode video.

Very true, but I'd go as far as to say you can charge money to /encode/ video, but should have no say over /decoding/ it. Therefore, the person encoding can choose to use a better, paid-for encoder, or a free one. The user won't really care, because they can decode it either way.

You don't have to pay a license just because you want to implement (fancy) math on a computer, which is ALL a codec is.

If companies want to license their encoder for a fee I don't have a problem with that AS LONG as it becomes free (donated to the public domain for the benefit of everyone) in 5 - 10 years. This "disease of greed and screw the public benefit" needs to stop at some point.

Company does a bunch of research work, and then says "Hey, we're now going to force everyone in the world to use our research work and pay us to do that, even if they'd prefer to use someone else's work for cheaper or free"

I forget, are we for or against authoritarianism? Depends who's paying the corporate standard committees, I guess.

You don't choose to "use" standards. You are forced to implement them either by government regulation or interoperability needs. See what happens with the FAT file system: it's the result of an insignificant research effort, it is itself extremely poor technology, yet every device manufacturer is currently forced to implement it, and therefore needs to pay money to Microsoft.

This adds a sunk cost to the barriers to entry into the device market, in favour of the established market dominators (which is what patents are all about), and to the detriment of free market, consumers and technological progress.

FAT/FAT32 isn't a poor technology, it's a simple technology. It's not very complicated, but the implementation has evolved over nearly 40 years.

Secondly, you don't have to pay royalties to Microsoft for using FAT/FAT32 itself. You have to pay Microsoft if you use the same exact algorithm for storing larger than 8.3 filenames on FAT. You are free to use a different algorithm, and not pay any royalties, or stick to 8.3 filenames as the original FAT/FAT32 did.

It's only been hacked to support larger disks and longer file names, and still it does that poorly (high internal fragmentation, small maximum file size, no support for extended attributes, poor performance on optical storage and flash, and let's not talk about missing features).

Hacked -- Improved. Those words are pretty much interchangeable depending on your own view and biases. Also, systems using FAT can use extended attributes if they wish. OS/2 does just fine with extended attributes on FAT, and so does cygwin. Just because FAT doesn't explicitly say this is where you stick them doesn't mean you can't write a file system driver on top of it that puts them wherever you want. Yes, FAT has poor performance on optical storage, but why would you use FAT on it in the first place?

The point is that there is not much choice if it is part of an interoperability standard. You simply cannot view a H.264 video on the web with a browser that only supports WebM, just as you'll have no luck to watch NTSC broadcasts with a PAL-only TV. Of course you are free to try to sell that PAL-only TV in the US, but you won't succeed, not because it is bad (the same TV may sell like crazy in Europe), but because it doesn't work with US broadcasts.

Oh, that is not the way to go. You ought to see what you can do with VHS. Capture with MJPEG to Cinepak, then export to VCD. Rip that to Indeo 4, then convert to Indeo 5. Import into Adobe Premiere and run some filters on it. Export to SVCD. Rip and upconvert the SVCD into MPEG2 720x480 and export to a DIVX file. Convert to WMV with Windows Media Encoder. Import back into Adobe Premiere, add a few more filters, export to Quicktime. Upload to Youtube, using their video stabilizers and automatic filters. Use

Why? The visually perceivable differences between the source and high bit/pixel H.264 are almost non-existent.

There are generally more differences between the actual source (film/captured video/etc.) and the adjusted-before-encoding (filtering, color-"correction", etc.) source than those caused by lossy encoding.

Repeatedly transcoding between lossy formats degrades quality each time. *You* may not perceive differences, but encoders do, and they tend to amplify those differences until they become very noticeable visual artifacts - no matter how many bits you use.

If you need to work with original content to edit it before the final compression for consumption, then you need lossless. But, there are plenty of lossless formats that will work for that, and some are compressed. If you dedicate 1TB or so to your edit workspace, you won't need to use a compressed format unless you are working with the entirety of a modern 2-hour movie. I learned this when I accidentally forgot to choose a compression method when re-encoding some TV shows at 720p, and it took over 4 hours

There are already lossless video codecs out there. Lagarith is a recent and popular one. The problem is that they only cut maybe 2/3 off your raw file size. Ten seconds of raw 1080p video is over a gigabyte. There's just too much information there -- you have to throw some away to get reasonable compression ratios. Waiting for lossless video to be as small as H.264 is like waiting for a 200MB download for a DVD-sized Linux ISO. Sadly, it's just not going to happen.
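A quick back-of-the-envelope check of those numbers (a sketch assuming 8-bit RGB at 3 bytes per pixel and 30 fps; real capture formats such as 4:2:0 YUV carry less data per pixel, so treat these as upper bounds):

```python
def raw_size_bytes(width, height, fps, seconds, bytes_per_pixel=3):
    """Uncompressed frame data for a clip of the given length."""
    return width * height * bytes_per_pixel * fps * seconds

# "Ten seconds of raw 1080p video is over a gigabyte":
ten_sec_1080p = raw_size_bytes(1920, 1080, 30, 10)
print(f"10 s of raw 1080p30: {ten_sec_1080p / 1e9:.2f} GB")      # 1.87 GB

# A lossless codec cutting ~2/3 off the raw size still leaves:
print(f"after ~3:1 lossless: {ten_sec_1080p / 3 / 1e9:.2f} GB")  # 0.62 GB
```

So even with a good lossless codec, ten seconds of 1080p still occupies more than half a gigabyte, which is the gap the post is describing.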

I don't think anyone is waiting for lossless codecs to get smaller, they are waiting for the hardware to get bigger. It happened to compressed formats for music in the 90s and video in the 00s, now the teens may start to see losslessly compressed formats rule.

The storage is already here - 4TB drives can hold a useful amount of lossless video. A 1080p video frame is around 6MB uncompressed, at 30fps that's 180MB/sec. If you want true 1080p60, that's 360MB/sec, or about 3 seconds a gigabyte. A minute takes 20GB, 1TB can hold 50 minutes. 4TB can hold 200 minutes, or just over 3 hours worth of uncompressed 1080p60 video.

And most cameras don't use lossless to begin with - a 4K frame quadruples the data rate (turning our 4TB drive into a still-useful 50 minutes of video storage), but we're talking about a massive 1.4GB/sec. The ever-popular RED cameras use SSDs, and proprietary REDcode codecs in order to be able to keep datarates down enough for an SSD.
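Sanity-checking the data rates in the two posts above (a rough sketch assuming 3 bytes/pixel; the posts' round figures of 20GB/min and 50 minutes use coarser rounding, so the results land close but not exactly on them):

```python
def data_rate(width, height, fps, bytes_per_pixel=3):
    """Uncompressed video data rate in bytes per second."""
    return width * height * bytes_per_pixel * fps

r_1080p60 = data_rate(1920, 1080, 60)
print(f"1080p60: {r_1080p60 / 1e6:.0f} MB/s")       # 373 MB/s (~360 quoted)

r_4k60 = data_rate(3840, 2160, 60)
print(f"4K60: {r_4k60 / 1e9:.2f} GB/s")             # 1.49 GB/s (~1.4 quoted)

# How long a 4 TB drive lasts at those rates:
print(f"4 TB of 1080p60: {4e12 / r_1080p60 / 60:.0f} min")   # 179 min
print(f"4 TB of 4K60: {4e12 / r_4k60 / 60:.0f} min")         # 45 min
```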

Even with that bandwidth, having 3 hours on a 4 TB drive isn't going to cut it. I have nearly 1000 movies and TV shows on my 3TB drive. Once we have PB-sized drives with GB/sec transfer rates, then we will be talking.

BD+ in Blu-ray Disc muddies this a bit, as it allows transforming the decompressed image based on whether or not other authenticity checks pass.

Although "transform the audio and video output" is listed as an option of BD+, it doesn't work the same way as most humans would parse that description. Based on this [wikipedia.org], it's just another way to encrypt the full .m2ts stream.

If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection, as you would need to decode the H.264 stream, apply the BD+ operations, then re-encode those frames to put back into the ripped stream. Note t

> If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection

Exactly as planned.

In addition, in order to alter the uncompressed data, it would require that every Blu-Ray player use exactly the same H.264 decoder with exactly the same options and only apply video alterations after BD+ is done with the data.

I was under the impression that the transformations didn't need to depend on bit-perfect output from the video decoder. Just guessing, but they could involve color space modification, rotation, flipping, cutting and pasting, bending (remember old scrambled channels from the VideoCipher II era?), and the like.

> > If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection
>
> Exactly as planned.

Since this would effectively stop ripping, I'm pretty sure if it were possible while still letting Blu-Ray players play the movie, it would already have been done.

> I was under the impression that the transformations didn't need to depend on bit-perfect output from the video decoder. Just guessing, but they could involve color space modification, rotation, flipping, cutting and pasting, bending (remember old scrambled channels from the VideoCipher II era?), and the like.

First, they have to be simple, because of the limited power of the BD+ virtual machine, so anything that involved serious memory moves would be out. Color and pixel value would be pretty much the limit.

Depending on how much the decoding varies from the reference, there might be some seriously noticeable artifacts, especially if the scrambling was enough

Go jack off in a flower pot, anonymous wanker. HD-DVD would have surely been an annoying piece of crap as well; with Microsoft driving it you can count on it. The only difference is, we *know* Blu-Ray is a piece of crap, which we would not have known had it lost the fight and remained in obscurity.

Once a standard becomes good enough, people will hang on to it for a long long time. Why bother re-encoding a complete music library from mp3 even if vorbis/aac is clearly the superior codec? Apple has enough difficulties pushing aac through, and not many hardware producers are including vorbis support. I guess the same could be said for windows xp and desktop hardware.

MP3-files are small enough to be streamable perfectly well even on really slow connections, but video files ain't small. A 2-hour, 1080p video file with any kind of a remotely-acceptable quality will weigh in at 4GB+, and well, it sure ain't streamable over very slow connections. Not to mention the fact that bandwidth costs money. Ergo, any developments that result in higher quality at the same size or similar quality at a smaller size are certainly welcome, both for consumers and for content-producers.
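To put that 4GB figure in bitrate terms (a rough illustration; the 1 Mbit/s line speed below is a made-up example for comparison, not from the post):

```python
# Average bitrate of a 4 GB, 2-hour 1080p file.
size_bits = 4e9 * 8
duration_s = 2 * 3600
bitrate_mbps = size_bits / duration_s / 1e6
print(f"average bitrate: {bitrate_mbps:.1f} Mbit/s")   # 4.4 Mbit/s

# Streaming it in real time needs at least that much sustained
# downstream; on a hypothetical 1 Mbit/s line the full download takes:
hours = size_bits / 1e6 / 3600
print(f"download at 1 Mbit/s: {hours:.1f} h")          # 8.9 h
```

So anything below roughly 4-5 Mbit/s of sustained throughput rules out streaming that file at all, which is the "ain't streamable over very slow connections" point in numbers.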

As a thought-experiment, let's assume that this or that TV-series I was watching on Netflix weighed in at 1.5GB for a 1h episode, and I watched 15 episodes in a month. That'd be 22.5GB of data. Now, if the move to a new codec reduced filesizes by 5% we'd end up with ~21.4GB of data -- that's already one gigabyte in savings. Now, multiply this with e.g. 200,000 users, what do you see?
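The thought-experiment above, worked through (all figures -- 1.5GB/episode, 15 episodes, a 5% reduction, 200,000 users -- are the post's hypotheticals):

```python
gb_per_user = 1.5 * 15                  # 22.5 GB per user per month
saved_per_user = gb_per_user * 0.05     # 5% codec saving -> ~1.1 GB
total_saved_tb = saved_per_user * 200_000 / 1000
print(f"per user: {saved_per_user:.3f} GB/month saved")              # 1.125 GB
print(f"across 200,000 users: {total_saved_tb:.0f} TB/month saved")  # 225 TB
```

Even a modest 5% efficiency gain adds up to hundreds of terabytes of transfer per month at that scale, which is why streaming services care about codec improvements far more than individual viewers do.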

Apparently you don't remember it, but at one time, MP3 files weren't small either. I remember it taking about an hour to download a good quality MP3. And there was streaming, too. Things like Real Player provided lower quality, higher compressed versions that were more suitable for streaming. Then do you know what happened next? Did Real Player and stuff like it win out? Nope. I'll give you a hint...the MP3 files didn't get any smaller.

Connections got faster, and bandwidth got cheaper. Much like those days for MP3, today good quality h264 files are a bit cumbersome, but I can easily download them in an hour or 2 with a typical (not even high end) consumer level internet connection. And today there are ways to get lower quality, more highly compressed version that can stream a fairly good quality HD video in real time. Give it another 5 years and the problem will easily solve itself without replacing every single piece of hardware and re-encoding every existing file.

Really? Must be your cable provider. In the last few years, my cable provider (WideOpenWest) has given a free upgrade from 8Mbps to 16Mbps (and in the 5 years before that we went from 4Mbps to 8Mbps), and introduced new 30Mbps and 50Mbps plans. Comcast has introduced a 100Mbps plan in many areas. Google has their first Gigabit city. I've heard a number of stories of municipalities setting up their own internet service with speeds between 20Mbps and 100Mbps. Verizon Fios has new plans of 50Mbps, 75Mbps, 150Mbps,

Another thing that plays into it: are you in a major metro area, or more rural? Rural areas are always going to lag behind. The first player in can justify it because they get 100% of the market. After that, additional service providers will only see a fraction of that return, thus it's not worth it to start up there, and thus the first player gets to enjoy their monopoly for a long time (so no incentive to upgrade). So, yeah, they'll always stay behind, but in the context of this thread (whether there's a

> Why bother re-encoding a complete music library from mp3 even if vorbis/aac is clearly the superior codec?

You're asking the wrong question; the right question is how many have FLAC copies of all their MP3s? Because I hope you weren't seriously suggesting they should re-encode from the MP3 files. I think you will find that many people have never even heard of FLAC, and even if they did, few tools have made it easy to create dual FLAC/MP3 rips of a CD, least not any the average person would have heard about. Assuming he didn't just download those MP3s in the first place and isn't about to chase down different copies.

Old content will stay in h264, new content will be released in h265. When that switch happens depends on the market. Anime fansubs have been early adopters for pretty much all new technologies relating to non-streamed video. I expect them to start using it pretty much right after some kind of x265 comes out. Other markets will make the switch slower (or they will just keep using both) as it requires upgrading the consumers' hardware/software.

> They also included it in Chrome, with the intent to replace H.264, but this attempt never materialized.

Apart from the awful English, WebM has been quite successful, too: a lot of software packages use WebM because they don't need to license H.264, and not just open source software.

Video standards aren't replaced overnight, and in fact, in a lot of places can't be replaced at all. The best way of dealing with these kinds of compatibility issues is to offer an alternative when people need to upgrade and change hardware/software anyway. So, let's hope that WebM can compete with H.265, because then we have a real chance of largely getting rid of proprietary video standards.

> So, let's hope that WebM can compete with H.265, because then we have a real chance of largely getting rid of proprietary video standards.

WebM is a distribution codec for YouTube. H.264 is core technology in digital television.

Theatrical production. Cable, broadcast and satellite distribution. Home video. Industrial applications... The list goes on and on and on.

The licensors of H.264/HEVC are global giants in R&D and manufacturing: Philips, Samsung, Mitsubishi, Panasonic, Toshiba. The 1181 H.264 licensees operate on more or less the same scale. The standards they adopt are the standards which stick.

The standard IS open in that during definition of it anyone (paying to be a member) can contribute, provide feedback, and vote. If you meant free as in beer, they could have required that, but then none of the corporations that did the R&D would have participated and we'd have many "standards" and not just one.

It's because they want something that is near the best technology possible at this time, and that means dealing with the people that actually do have technology that is near the best possible at this time. H.265 is a very large improvement over H.264 (about 50% of the bit rate for equal quality) and nobody in the "open" world can do that.

This is a huge upgrade for any business pushing digital video through wires and radio waves. Even in the case where encoder and decoder licenses are a large cost, they will

All of those patents are most likely incredibly trivial, and all the companies and organizations that successfully lobbied them in did so not for their technological benefits but to make sure their patents were as widely used as possible.

If the ITU were to demand patent-free standards, they would be just as good but without the royalties.

Apparently you don't know what the word average means, or how significant digits are used. The thing being measured has no impact on the number of significant digits you can use. It's purely determined by the precision of your measurements.

Given how widespread H.264 hardware implementations are, and the fact that Blu-ray does not have H.265, I'd expect to see adoption first in the video conferencing world (SIP, H.323... Cisco/Tandberg, Polycom, etc.)

For real time encoding H.265 can provide a 30% reduction in bandwidth at the same quality. Transcoded content like what you might do at home will get some benefit, but not as much as the real time stuff (streaming will benefit a lot too).

Changing codecs is going to require outfits like Dish and DirecTV to replace all of the end user hardware. I'm not convinced that h265 is enough of an improvement for them to consider this.

The more legacy users you have, the harder it is for you to get buy-in on a new digital format. Incremental improvements will continue to be a harder and harder sell to people with legacy content and legacy equipment.

Because H.265 will have (I believe) half the bandwidth requirements of H.264, DirecTV or Dish Network could either cram in more channels or keep the current channel allocations but at MUCH higher video quality.

First adoption, as usual, will be by HQ rip groups and anime fansubbers. These people pride themselves on being on the cutting edge and implementing stuff that isn't implemented anywhere in hardware yet. They were the guys who moved from h.264 high profile to h.264 10-bit high profile when h.264 hardware support started to become prevalent. They were the ones who moved to h.264 when divx hardware support became prevalent. Etc.

Funnily enough, it was the same for h.264, divx/xvid and so on. Frankly I wouldn't be surprised if many of the guys encoding that stuff actually work in the industry and use their "hobby" as a testbed for new encoding techniques and methods before they go to mass production.

I expect we'll see services like Netflix jump on the bandwagon pretty fast. They already produce multiple copies of their videos in different codecs to cater to different device capabilities. If memory serves, they do VC-1 for the desktop client, low bitrate h.264 for the mobile clients, and high bitrate h.264 for the STB/console clients.

Migrating platforms which can support it to h.265 will provide them with immediate savings. There aren't that many of them, but the PS3 happens to be their flagship and dev

The 'H' video encoding standards have NOTHING to do with free-to-use codecs. They are a COMMERCIAL industrial standard, designed to be reasonable and safe to license, because of the patent pool.

Complaining that H265 will include some royalty mechanisms is like complaining that the sky is blue! Even the document that will detail the final H265 standard will NOT be free, just as today you have to pay to get a copy of the H264 standard.

The open-source movement is not the same as demanding "death to capitalism" or the end of profit, as some very stupid people here seem to think. The 'H' standards have nothing to do with open-source. However, because the 'H' standards are not industrial secrets, open-source developers can and will develop open-source encoders and decoders.

Talk of WebM is pure garbage, since the key developers of x264 looked at the source Google released, and discovered that VP8 had illegally ripped off the H264 standard (badly), taking advantage of the fact that VP8 was originally closed-source. In other words, Google was conned (actually, this isn't true - Google knew full well that VP8 infringed hundreds of patents, but simply wanted to transfer millions to the owners of the company).

If people want to be activists over the royalty situation, it should be with this goal. Encoders, and encoded video (including streamed) should be royalty free. Only the decoders (hardware or software) should pay a royalty. This way, once you own your tablet, laptop, phone, or Windows, you have already paid for the licence to decode H265, allowing all apps to use this format freely.

The advantage of H265 (and H264) to end users is clear. Tiny, extremely energy efficient, hardware circuits can handle the video decoding, providing first quality video services on devices of all kinds. The standards allow software teams (like those behind x264) to produce insanely efficient, ultra-high-quality encoding solutions, and also allow work to progress on very fast (although low quality or very high bandwidth) hardware encoders.

H265 promises (if the encoding efficiency shown by x264 is possible for H265) 4K films on existing Bluray technology - which is essential, since the collapsing market for disks means that it is most unlikely a new disk standard will ever replace Bluray.
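A rough feasibility check of that claim (illustrative numbers only: I'm assuming a 50GB dual-layer disk, a typical ~25 Mbit/s H.264 bitrate for 1080p Blu-ray video, 4x the pixels for 4K, and the ~50% HEVC saving mentioned earlier in the thread):

```python
bd_capacity_bits = 50 * 8e9            # 50 GB dual-layer Blu-ray disc
h264_1080p_mbps = 25                   # assumed typical BD video bitrate
# 4K has 4x the pixels of 1080p; HEVC roughly halves the bitrate needed:
hevc_4k_mbps = h264_1080p_mbps * 4 * 0.5
print(f"estimated 4K HEVC bitrate: {hevc_4k_mbps:.0f} Mbit/s")  # 50

runtime_h = bd_capacity_bits / (hevc_4k_mbps * 1e6) / 3600
print(f"fits ~{runtime_h:.1f} h of video on the disc")          # ~2.2 h
```

Under those assumptions a 50GB disc holds a bit over two hours of 4K video - about a feature film - which is consistent with the post's claim.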

To conclude. Standards are good, and some standards will involve royalties.

> The advantage of H265 (and H264) to end users is clear. Tiny, extremely energy efficient, hardware circuits can handle the video decoding

That's all well and good, but that's supposed to be what the advantages of h264 are and we've already got that and tons of legacy equipment and content.

On the other hand, most people are going to be hard pressed to notice any reason to want 4K given that BluRay is already a tough sell with anything much beyond a 1:1 viewing distance to screen size ratio.