
An anonymous reader writes "AMD's latest feature added to their open-source Radeon DRM graphics driver is VCE video encoding via a large code drop that happened this morning. A patched kernel and Mesa will now allow the Radeon driver with their latest-generation hardware to provide low-latency H.264 video encoding on the GPU rather than CPU. The support is still being tuned and it only supports the VCE2 engine but the patches can be found on the mailing list until they land in the trunk in the coming months."

They have decoding support, but at least as recently as Google Summer of Code 2013 [multimedia.cx] they didn't have hardware encoding support. That seems to be the fault of the ffmpeg project, though: encoding was added to the VA API back in June 2009. Lack of interest?

Encoding was added to the VAAPI interface, but was never supported by Intel hardware. There's not much sense implementing a protocol when there's no hardware to interface with. You may be looking for the Intel Media SDK [intel.com], which wasn't made publicly available until the middle of last year.

That's not what he's saying and you know it. He's staying on the topic of this article, which is that Intel has provided open-source h.264 encoding (VAAPI) for years while AMD is only now releasing code. You could do a little research [01.org]. As for ffmpeg, it doesn't support VAAPI, but that's a choice of the ffmpeg developers not to include it; through the external libx264 library, you can still get ffmpeg to encode h.264. This has nothing to do with Intel, as they released the code. That's like complaining that it's
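To make the libx264 route concrete, here's a minimal sketch that builds an ffmpeg command line using the external libx264 encoder. The filenames are placeholders, and this assumes an ffmpeg binary built with --enable-libx264; it's an illustration, not anything from the article.

```python
# Sketch: software H.264 encoding through ffmpeg's external libx264 support.
# Assumes an ffmpeg built with --enable-libx264 on PATH; filenames are
# placeholders.
import subprocess

def x264_encode_cmd(src, dst, crf=23, preset="medium"):
    """Build an ffmpeg command that encodes via libx264 (CPU, not VAAPI)."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",   # the external x264 library does the encoding
        "-crf", str(crf),    # constant-rate-factor quality target
        "-preset", preset,   # speed vs. compression-efficiency trade-off
        dst,
    ]

cmd = x264_encode_cmd("input.mkv", "output.mp4")
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

The point being: the encoding work here happens entirely on the CPU in x264, with no VAAPI or vendor hardware involved.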

I'm sure he means Intel's Quick Sync hardware codec, which is integrated on Intel's CPUs and does not use the integrated GPU.

My understanding of AMD's VCE is that it is also a fully separate codec which does not use any GPU compute power, though they do have optimized paths to copy the framebuffer into VCE for low-latency screen capture.

That's interesting. For anyone who, like me, wasn't familiar with Quick Sync, it seems to be a dedicated video codec chip on the CPU. In testing, it was much faster than CPU or even GPU encoding, but at lower picture quality per megabyte.

I don't think it's quite nonsense. A motorcycle has lower latency than a truck (you can get there faster), but a truck has higher throughput (it can deliver 1000 boxes quicker). That's a useful distinction.
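Putting the motorcycle/truck distinction into numbers (toy figures of my own, not from any benchmark):

```python
# Toy latency-vs-throughput comparison; all numbers are invented.
def latency_hours(trip_hours):
    return trip_hours                    # time until the FIRST box arrives

def throughput_boxes_per_hour(boxes, trip_hours):
    return boxes / trip_hours            # sustained delivery rate

moto_latency = latency_hours(1.0)                 # motorcycle: 1 box, 1 h trip
truck_latency = latency_hours(5.0)                # truck: 1000 boxes, 5 h trip
moto_rate = throughput_boxes_per_hour(1, 1.0)     # 1 box/hour
truck_rate = throughput_boxes_per_hour(1000, 5.0) # 200 boxes/hour
```

The motorcycle wins on latency (first box in 1 hour vs. 5), the truck wins on throughput (200 boxes/hour vs. 1) — two different axes, both meaningful.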

Quick Sync in particular can easily encode IN REAL TIME, so it's useful for DVRs, etc. (think instant replay). An unassisted CPU will struggle with real-time encoding. Being able to encode even multiple streams in real time beats not being able to at all.
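The real-time constraint is just a per-frame time budget. A rough sketch, with made-up per-frame encode costs standing in for a software encoder and a hardware block:

```python
# Real-time check: an encoder keeps up iff its per-frame encode time fits
# inside the frame period. Millisecond figures are illustrative, not measured.
def is_realtime(encode_ms_per_frame, fps=30):
    frame_budget_ms = 1000.0 / fps       # ~33.3 ms per frame at 30 fps
    return encode_ms_per_frame <= frame_budget_ms

def max_simultaneous_streams(encode_ms_per_frame, fps=30):
    return int((1000.0 / fps) // encode_ms_per_frame)

software_ok = is_realtime(50.0)   # CPU at 50 ms/frame: falls behind
hardware_ok = is_realtime(8.0)    # dedicated block at 8 ms/frame: keeps up
streams = max_simultaneous_streams(8.0)  # how many 30 fps streams fit
```

With the assumed 8 ms/frame, the hardware block could sustain several 30 fps streams at once, which is exactly the DVR/instant-replay case above.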

Unless the binary mentioned is different from the various others in the directory, it's firmware that the driver needs to load onto the card on startup (which has been the case for Radeons for some time now) rather than a binary driver component that runs within the host OS.

Distribution will be a hassle, as always; but it's not architecturally much different from just adding a chunk of flash to the card and storing the firmware there instead.

From the mailing list, it appears you still need to link this all to a closed source binary...

No, it's firmware/microcode. The driver sends it to the GPU at boot as a blob, and it lives inside the card, hidden from everything. The alternative would be an EEPROM and a firmware-flashing utility; the blob would still exist and still be closed source, it just wouldn't ship with the driver. It's not really part of the programming model; it's hardware initialization/configuration/tweaks to make the chip work correctly according to that model.

It is still a step backward, a last-ditch attempt to rescue the patent-encumbered codec before it becomes extinct. They should let it die, for the good of progress. Who wants a codec backed by a group that sued a mom over patent licensing for publishing a birthday video online?

that's the thing... these new codecs, they don't specify exactly how you should encode.

That's the point. Codec quality is highly dependent on the encoder's choices (the tables you use, for instance), and that freedom is the main selling point of these codecs. In other words, the quality of the final output is determined by the quality of the encoder.

The decoder rarely adds quality loss itself; it just reconstructs the signal from what it's given, and few decoders have any say in quality decisions.
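One way to see this: the spec only pins down how bits decode, so two conforming encoders can feed the same decoder and land at very different quality. A toy quantization sketch, with uniform quantizers standing in for a real codec (step sizes and signal values are arbitrary):

```python
# Toy model: the "standard" fixes only the decoder (dequantization), so a
# careful encoder (fine step) and a lazy one (coarse step) both emit valid
# streams, but with very different reconstruction error.
signal = [0.13, 0.58, 0.91, 0.34, 0.77]

def encode(samples, step):           # the encoder's choice: quantizer step
    return [round(s / step) for s in samples]

def decode(indices, step):           # fixed by the "spec"; no say in quality
    return [i * step for i in indices]

def max_error(samples, step):
    rec = decode(encode(samples, step), step)
    return max(abs(a - b) for a, b in zip(samples, rec))

lazy_err = max_error(signal, 0.5)    # coarse quantizer: large loss
smart_err = max_error(signal, 0.05)  # fine quantizer: small loss
```

Both bitstreams decode through the exact same `decode` function; only the encoder's decisions separate the good result from the bad one.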