Posted
by
CmdrTaco
on Tuesday October 12, 2010 @11:36AM
from the keep-'em-accelerated dept.

An anonymous reader writes "Microsoft has just received a patent that grants the company the rights to GPU-accelerated video encoding, which may be the primary technology that takes advantage of the horsepower of the GPU in today's consumer applications. The broad patent covers techniques for performing motion estimation in videos, using the depth buffer of the GPU, collocating video frames, mapping pixels to texels, processing frames on the GPU, and outputting the data to the CPU."

This will make it almost impossible for smaller companies to make fast video encoding applications. They will have to start paying royalties if they want to encode video using the GPU in applications such as FRAPS or any video converter. Their products will either have to become more expensive or remain inferior to products made by larger companies.

It seems that the idea was apparent long before the patent came about. I think the underlying reason we hadn't seen it yet is that the tradeoff wasn't worth it yet: the GPU has to beat the CPU by a sufficiently wide price and performance margin on the workload before anybody bothers writing specialized code for it.

Why would it be obvious that hardware designed to accelerate 3d rendering - transformation, lighting and rasterisation - can accelerate the compression of video frames?

It seems that you are 'obviously' wrong.

It seems incredibly obvious to me. Of course, I've worked on FFT code for Cray vector units, which were around a long time before 2004. If you can't see the relationship between vector processing, FFTs, and any form of video compression/display, then perhaps you shouldn't be in charge of determining what is "obvious" regarding this particular patent.
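To make that mapping concrete, here's a minimal NumPy sketch (my own toy code, not anything taken from the patent) of block-matching motion estimation. Every candidate comparison is element-wise arithmetic plus a reduction, which is exactly the shape of work that vector units and GPUs are built for:

```python
import numpy as np

def sad_block_search(ref, cur, by, bx, bs=8, radius=4):
    """Exhaustive block-matching motion estimation for one block.

    For an 8x8 block of the current frame, compute the sum of
    absolute differences (SAD) against every candidate position in a
    small search window of the reference frame. Each SAD is pure
    element-wise arithmetic plus a reduction -- the data-parallel
    workload that vector hardware accelerates.
    """
    block = cur[by:by + bs, bx:bx + bs].astype(np.int32)
    best, best_sad = (0, 0), np.iinfo(np.int32).max
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = by + dy, bx + dx
            if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
                continue  # candidate block falls outside the frame
            cand = ref[y:y + bs, x:x + bs].astype(np.int32)
            sad = int(np.abs(block - cand).sum())  # vectorized inner loop
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad

# A synthetic reference frame, and a copy shifted right by 2 pixels:
rows, cols = np.arange(64)[:, None], np.arange(64)[None, :]
ref = ((7 * rows + 13 * cols) % 251).astype(np.uint8)
cur = np.roll(ref, 2, axis=1)

mv, sad = sad_block_search(ref, cur, 16, 16)
print(mv, sad)  # recovers the shift: motion vector (0, -2), SAD 0
```

A real encoder would run this search for thousands of blocks per frame, all independent of one another, which is why it parallelizes so naturally.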

I have long felt that our patent system is ridiculous because it allows such silly patents. If something is obvious to an expert in the field, then it shouldn't be patentable.

This patent is nothing more than a description of how to use a general-purpose processor to perform specific tasks. On top of that, it describes a way to use computers to handle video. And using GPUs to do work is fundamentally very old technology, as they are basically glorified vector processors. So how can such an obvious and overreaching patent pass regarding such fundamental technology? Is this not an obvious application of this particular technology?

What is confusing? Microsoft does something. Microsoft applies for a patent on that thing. A patent lawyer who knows very little about the tech in question, has about 600 applications he's currently supposed to be processing, has been instructed that he can't work overtime this week by his boss, but also that he is too far behind on his portfolio and needs to catch up, and who doesn't make near as much as his buddies from law school do to begin with, looks at it. He thinks "I don't even know what half of these words *mean*", then notices that Microsoft filed the patent. Through his haze of pain and frustration he dimly remembers that Microsoft is an "innovative and economy-driving company" and says "fuck it." He hits the "Approve" button.

His boss is happy because his numbers are better this week, and there is no real penalty for approving patents that later get overturned. Even assuming that Microsoft ever attempts to defend the patent rather than just threatening small companies with it in hopes that they'll cave without a court battle.

The things currently wrong with the patent system which this story demonstrates:

1) Patent attorneys often don't understand the tech they are expected to review. This is less of a problem with "real" patents, since the device being patented is just that. A device. If it does what it says it does, in the way it says it does, understanding why isn't all that important. Software is essentially algorithms. If you don't understand them, then judging their uniqueness is difficult.

2) The reviewers in the patent office are phenomenally overworked right now. There are literally tens of thousands of applications backed up. I saw some patent official guy at the end of the Bush administration say that if all applications stopped, right then, he could maybe catch up in a year or two. I don't imagine it's gotten better. Both Bush and Obama have authorized more reviewers, but it seems to be like filling the ocean with a teaspoon.

3) Patent reviewers make a fraction of what patent attorneys in private practice make. This means that they're always looking to get out and get into private firms. Probably not all of them, but like any rational human, most want to make more money and get more respect.

4) There is no real penalty for screwing up. Most patents never get defended in court, because the companies that own them mainly use them as bargaining chips, or to threaten smaller, defenseless companies. Even if the patent does go to court, it'll take years to invalidate, and no repercussions fall on the approver.

Eliminating software patents would, in one stroke, alleviate or eliminate two of these four problems. Probably the most serious two. It'd be awful nice if it happened. The alternative is probably the whole system collapsing under its own weight eventually.

The patent application was received in October of 2004 according to the article. So I assume Badaboom would have to precede that, or produce some form of prior art preceding that date, to defend themselves should Microsoft resort to litigation after failing to reach a licensing deal with Badaboom's creators. Regardless, a cursory glance suggests that Microsoft could out-lawyer them whether they are right or not, so I believe with 98% confidence that Badaboom is facing some serious liabilities.

The annoying thing is that there won't be a lot of actual prior art to fight the patent. GPUs at that point weren't very good at general-purpose computation, so there wasn't a lot of generally available software that did it, and less so for video encoding specifically. OTOH, a GeForce 3 was somewhat programmable, and it was released in 2001. People were abusing even simpler GPUs for general-purpose computation over ten years ago using texture combiners and compositing modes. Even before then, people used quite general-purpose processors as GPUs, and probably could have executed video encoding code on them.

Sadly, even though you could travel back to 2003, shout "video encoding on the GPU will be practical in a few years," and nobody would really be shocked, it is still patentable in our current system because nobody specifically published doing this exact process on this exact type of chip. I expect it'll basically be another land rush of "X on the Internet" and "(the exact same) X on the wireless" type patents, this time on the GPU.

Though, apparently, their use of the depth buffer is kind of an interesting hack. That's less obvious from my glance at it, though in what I'd think of as an ideal world it still wouldn't be patentable. It probably doesn't really apply on modern hardware with arbitrary memory buffers and programming in OpenCL. But it'll still be close enough in behavior to something more sensible on modern hardware that it'll be scary as part of a giant stack of patents carried by a lawyer looking for rent on your work.
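For anyone curious, the depth-buffer trick is a known GPGPU idiom from that fixed-function era: render each candidate's matching cost as a depth value, and let the z-test keep the per-pixel minimum. I'm only guessing at the patent's specifics from the summary, but here's a CPU-side NumPy simulation of the general idea:

```python
import numpy as np

# Simulate the depth-test-as-argmin idiom on the CPU. Each "pass"
# renders one candidate's matching cost as a depth value; the depth
# test keeps the smallest depth seen so far, and a second buffer
# records which candidate won -- i.e. a per-pixel argmin computed
# entirely with fixed-function compare-and-keep hardware.
rng = np.random.default_rng(0)
h, w, n_candidates = 4, 4, 9
costs = rng.integers(0, 100, size=(n_candidates, h, w))

depth = np.full((h, w), np.inf)          # the "depth buffer"
winner = np.full((h, w), -1, dtype=int)  # which candidate passed the z-test

for i in range(n_candidates):            # one render pass per candidate
    passed = costs[i] < depth            # the depth test (strict less-than)
    depth[passed] = costs[i][passed]
    winner[passed] = i

# Same result as an explicit argmin over all candidates:
assert np.array_equal(winner, costs.argmin(axis=0))
assert np.array_equal(depth, costs.min(axis=0))
print("depth-test argmin matches explicit argmin")
```

On a 2004-era GPU you'd get this compare-and-keep behavior for free from the z-buffer hardware, with no programmable reduction needed, which is what makes the hack clever.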

You're being sarcastic, right? Why would a glorified vector engine be useful for doing video compression, which is basically lots and lots of vector math? It's so obvious that anybody with even basic knowledge of video compression would immediately understand how the two problem spaces map onto one another with no instruction whatsoever.

It's so obvious that ATI released software to do it within a year of when that patent was first filed, which means they were working on it at least a year before that, which means that multiple people independently came up with the idea at the same time, which means it is obvious.

Heck, other companies had already been doing this, and even held patents on it [espacenet.com] five years earlier. Okay, so texture compression and video compression aren't quite the same thing. One deals with a single image, one deals with compressing a series of images.... Yeah, that's not obvious to anyone who has never seen someone make a flipbook during class in elementary school.

Except that conceptually, it's a trivial extension of texture compression, which video cards have supported natively since at least the late 1990s. The only reason we weren't doing video compression is that the video cards weren't fast enough and/or were too power hungry to offer an advantage over CPUs. The patent office should not be awarding patents for discoveries, and that's all GPU-based video compression really is: discovering that suddenly GPUs are faster than CPUs and that things which were impractical (but widely discussed) years before are now practical.

I've only skimmed the patent, so it's possible that the summary sucks and that there's something novel and non-obvious here, but at a glance this patent really does look like an explanation of a straightforward mapping of video compression onto a GPU which, with the possible exception of the motion estimation, would have been obvious a decade ago. For that matter, if you had asked somebody "how would you do motion estimation on a GPU?" a decade ago, they probably would have come up with a similar solution.

Then again, it's a software patent, and the design process for nearly all software is obvious to someone with suitable skills in the field, which is why these patents are almost universally crap anyway.

Running code on a coprocessor is not novel in any way. It's what graphics coprocessors are for. This is like patenting a lemonade stand because it is located on the corner rather than in the middle of the block.

No, but providing someone with a unit to calculate (for example) a transformation doesn't mean you've given them an efficient way of computing FFTs. Of course, if you give them a general-purpose matrix-vector multiplier, then it does. In 2004 a graphics card used fixed-function units; today it uses general-purpose ones. Assumptions about the obviousness of other applications don't carry back to previous generations of the hardware.

Because while an early FPU may be drastically simpler than a modern GPU, that *is* what a GPU is... It's a big floating-point maths unit, and it's particularly good at matrix/vector floating-point maths.
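And the matrix/vector fit isn't hypothetical. As a toy example of my own (assuming nothing about any particular GPU): the 8x8 DCT at the core of MPEG-style codecs is nothing but matrix multiplication, so hardware that is good at matrix/vector floating-point maths is good at a core codec kernel almost by definition:

```python
import numpy as np

# The 2-D DCT used in MPEG-style codecs is just matrix math:
# Y = C @ X @ C.T, where C is the orthonormal 8-point DCT-II basis.
N = 8
k, n = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
C[0, :] /= np.sqrt(2.0)  # orthonormal scaling for the DC row

X = np.arange(64, dtype=float).reshape(8, 8)  # a toy 8x8 pixel block
Y = C @ X @ C.T          # forward 2-D DCT: two matrix multiplies
X_back = C.T @ Y @ C     # inverse: the basis matrix is orthonormal

assert np.allclose(X, X_back)
print(round(Y[0, 0], 3))  # DC coefficient = 8 * block mean = 252.0
```

Two small matrix multiplies per block, thousands of independent blocks per frame: exactly the workload a matrix/vector engine is built for.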

That's because they are almost all of the type 'use a [general-purpose device] to do [specific thing].'

No, they're not. That's what the eyeball-grabbing-summary says. The actual patents say "use these 10 steps to achieve this one result". For example: Slashdot ran a story that Microsoft patented page-turning on an e-book reader. There was a big dog-pile as people leapt in to claim prior art, citing a million Flash apps that allowed page turning. Nobody bothered to read the bit that described, in detail, how it was a gesture input that included things like how many pages to turn, what multiple fingers would do, and how the UI would respond graphically to let the user know how all that was going to go down.

This is not an occasional occurrence. It happens every time. You lot are so busy looking for reasons to say software patents are bad that you don't check on any of the things you're arguing about. You're shooting your own credibility in the foot, and arguing with me isn't going to fix that.