This week's case is especially maddening. "Sintel" is an amazing demonstration of the power of the open source Blender 3D software. Created by a volunteer community and funded by mass donations, the 15-minute short demonstrates that you don't have to be a big movie studio to make high-quality animated movies. As part of a series of such films, it challenges Big Media's assertion that the only route to high-quality content is draconian usage controls. "Sintel" is released under a Creative Commons license allowing unrestricted use for any purpose -- including uploading to YouTube -- without any individual permission, EULA (end-user license agreement), or digital restrictions.

Why did Sony demand the film be taken down? In this case, it probably didn't explicitly do so -- an overcautious algorithm did it instead. The current best guess is that "Sintel" was accidentally claimed by Sony when the company added the movie to its pool of demos for its ultra-HD televisions. But the error prevented all YouTube users from exercising their freedom under the license to share the movie with others.

I tested this myself, uploading a copy to my own account, and was immediately told it was blocked because of Sony. I appealed the block, explaining that I had a license to use, upload, and even monetize the film, but the block remained until, a few days later, it was mysteriously removed without notice or apology. Even now I'm unable to display ads with the movie, suggesting that my account has suffered long-term reputational harm. Google offers no recourse here, not even a way to file a problem ticket; it clearly buys into the view that small users are disposable.

The movie and music conglomerates have created a world where automated systems constantly take potshots at media posted by independents and individuals, forcing them off the Web. They do this by leveraging the safe-harbor clause of the DMCA, which encourages hosting services to take down first and ask questions afterward. Services that don't comply become party to the infringement claim; those that slavishly respond, even when there's no prima facie case, gain legal protection. That mechanism rests on the assumption that anything good was probably created by a conglomerate, so there's no harm in treating takedown claims as truth.

The Sony example above is even more sinister. Google reduces its workload and legal risk on YouTube by offering media conglomerates an algorithmic approach to takedowns, called Content ID. Corporations provide reference clips that Google's system matches against uploads to spot supposed copyright infringements and block them, without the need for a legal complaint. As the EFF points out:

If your video was removed thanks to a Content ID match, it's quite possible that no human at the copyright owner's office has ever seen your video.

However, the process this triggers is biased toward the supposition that you're a criminal rather than a falsely accused citizen. Content ID takedowns happen automatically, without anyone first asking whether there's actually a problem. You can dispute the match, which can reinstate your video, but by then it will probably have been invisible for some time. In my case, the appeal did not reinstate public access to the movie. Innocent or not, you lose.