Viacom’s true desire: one copyright filter to rule them all

At last week's Web 2.0 Summit, Viacom's CEO made it clear that what the company really wants is an industry-standard copyright filtering system, not a proprietary one that benefits a single company.

Last week, Google finally unveiled the long-awaited copyrighted content filter for popular video-sharing destination YouTube. While Viacom is presumably happy to see video fingerprinting technology finally arrive at YouTube, the entertainment giant says it would be much happier with an industry-standard system. In other words, Viacom is looking for universal filtering regardless of context and purpose.

"What no one wants is a proprietary [filtering] system that benefits one company," Viacom CEO Philippe Dauman told attendees at last week's Web 2.0 Summit, according to PC World. "It is a big drain to a company like ours to have to deal with incompatible systems."

What Dauman wants is a universal scheme that benefits his company, something along the lines of an open-source, open-standard filtering system. Presumably, that's what members of the new copyright consortium unveiled last week will be using. The group, which includes Viacom, CBS, Disney, News Corp., NBC, Microsoft, and a couple of smaller video-sharing sites, will require members to proactively filter infringing video using "effective content identification technology."

Viacom—and many other media conglomerates—would love to see something along the lines of The One Filter emerge—something that would instantly swat any video on any site on the Internet that Big Content's filters believe is infringing. Upload an entire episode of Aqua Teen Hunger Force? Swat! A 10-minute snippet from Oprah? Swat! A clip of your kid dancing to the latest chart-topping song? Swat! A 20-second clip from The Colbert Report as part of a five-minute piece on fair use? Swat!

It's not a pretty picture, because, as we pointed out last week, automated filters don't understand fair use. Although an open-standard, automated system that relies on digital fingerprints may do a bang-up job of flagging and removing an uploaded clip, it's not going to understand when a copyrighted video is being used for educational purposes, criticism, or parody. Big Content already issues bad takedown requests when humans are involved; we shudder to imagine what happens when the task is handed over to software. Do you think it will be programmed to play it safe? Neither do we.
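
The flagging half of that job is mechanically simple, which is exactly the problem. Here's a deliberately naive sketch (our own illustration, not any real system's algorithm) of fingerprint matching: hash fixed-size segments of a rights holder's reference copy, then block any upload sharing a segment. Notice that the filter sees only matching bytes, never context, so a short fair-use excerpt gets swatted just like a full pirated episode.

```python
# Toy fingerprint filter (illustrative only). Real systems fingerprint
# audio/video frames perceptually; we hash raw byte segments instead.
import hashlib

SEGMENT = 4  # bytes per segment in this toy example

def fingerprints(data: bytes) -> set[str]:
    """Hash each fixed-size segment of the stream."""
    return {
        hashlib.sha256(data[i:i + SEGMENT]).hexdigest()
        for i in range(0, len(data) - SEGMENT + 1, SEGMENT)
    }

# Catalog built from a rights holder's reference copy (hypothetical data).
episode = b"colbert-report-full-episode-reference-stream"
catalog = fingerprints(episode)

def is_blocked(upload: bytes) -> bool:
    """Block the upload if any segment matches the catalog."""
    return not catalog.isdisjoint(fingerprints(upload))

full_rip = episode                                    # whole episode: blocked
fair_use = episode[:8] + b" quoted in a commentary"   # short excerpt: also blocked
home_movie = b"my-kid-dancing-original-footage-xyz"   # unrelated: allowed
```

The filter returns the same verdict for `full_rip` and `fair_use` because it matches content, not purpose; nothing in the pipeline can weigh criticism, parody, or education.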

It's understandable that media companies want to maintain as much control over their content as possible. Content filters would give them nearly absolute control, more than the Copyright Act ever envisioned. Any technological solution will inevitably bake in the rights holders' own interpretation of "fair use." Somehow, we suspect this won't be very "fair" at all.

Although the DMCA provides a means of redress for mistaken takedown notices, the type of filtering system envisioned by Dauman might preempt it entirely. It's simple: there's no DMCA counternotification to be made if the video in question never makes it to the site because it was blocked by a filter.