On Friday, April 7, 2017, the Ninth Circuit published its opinion in Mavrix Photographs v. LiveJournal, a potentially significant decision touching on a number of provisions in Section 512 of the Copyright Act, which was enacted in 1998 as part of the Digital Millennium Copyright Act (DMCA) and provides safe harbors for online service providers against claims of copyright infringement under certain circumstances.

The court reversed the district court’s ruling that LiveJournal qualified for the safe harbor, and while some of its holdings are positive for copyright owners—who have long advocated that the DMCA safe harbors have failed to fulfill Congress’s goal of cooperation between service providers and copyright owners in detecting and deterring online infringement—other parts of the decision may create perverse incentives for service providers to do less to prevent infringement on their sites.

I want to focus in particular on the Ninth Circuit’s suggestion that LiveJournal’s use of a technological tool to automatically block posts from certain sources may be a factor that determines whether the service had “the right and ability to control infringements”, and, consequently, whether it would be disqualified from protection under §512’s safe harbor. I think courts should resist this suggestion, as it is contrary to the text and purpose of the statute, and could disincentivize service providers from employing tools that are critical for detecting and deterring infringement.

The Case

But first, a bit of background. LiveJournal was one of the earliest social media networks, and anyone of a certain age will deny ever having posted a personal blog there. The site features a number of “communities”, and at issue here is a community called “Oh No They Didn’t!” (ONTD), which is focused on providing celebrity gossip. Community members submit photos, videos, links, and other media concerning up-to-date celebrity news. Originally, like other LiveJournal communities, posts were moderated by volunteers, but as the community became one of the most popular on LiveJournal, the company took a more active role in curating posts to attract and maximize advertising revenue. At the time of the complaint, LiveJournal employed a full-time community leader, who oversaw a team of volunteer moderators charged with screening and approving user submissions.

Mavrix Photographs is a photo agency specializing in photos of celebrities, which it licenses to celebrity magazines. It alleges that between 2010 and 2014, twenty of its photos were posted on ONTD without authorization, and it sued LiveJournal for copyright infringement. LiveJournal asserted it was immune from liability under the §512 safe harbor and moved for summary judgment, which the district court granted. Mavrix appealed to the Ninth Circuit.

Section 512(c) shields service providers from liability “for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider”, and the bulk of the Ninth Circuit’s decision hinged upon whether, given the volunteer moderators’ role in screening and approving the content, that content was stored on LiveJournal “at the direction of a user.” The court concluded that agency law governs whether such volunteer moderators are acting on behalf of LiveJournal and remanded for the factfinder to determine the ultimate question. But even if the content is stored at the direction of a user, and LiveJournal meets this threshold requirement for safe harbor protection, §512 outlines other requirements a service provider must meet. Two in particular were in dispute, and the Ninth Circuit addressed each of those in its opinion “to provide guidance to the district court” on remand.

“Right and ability to control” and technological tools

I want to focus here on one of those two in particular—the requirement expressed in § 512(c)(1)(B), which provides that a service provider cannot qualify for the safe harbor if it “receive[s] a financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity.”

This “right and ability to control”, as the Ninth Circuit observed, “involves ‘something more than the ability to remove or block access to materials posted on a service provider’s website.’”

The service provider does “something more” when it exerts “high levels of control over activities of users.” The service provider exerts “high levels of control,” for example, when it, “prescreens sites, gives them extensive advice, prohibits the proliferation of identical sites,” provides “detailed instructions regard[ing] issues of layout, appearance, and content,” and ensures “that celebrity images do not oversaturate the content.”

The Ninth Circuit went on to explain:

LiveJournal’s rules instruct users on the substance and infringement of their posts. The moderators screen for content and other guidelines such as infringement. Nearly two-thirds of submitted posts are rejected, including on substantive grounds. ONTD maintains a list of sources that have complained about infringement from which users should not submit posts. LiveJournal went so far as to use a tool to automatically block any posts from one source. In determining whether LiveJournal had the right and ability to control infringements, the fact finder must assess whether LiveJournal’s extensive review process, infringement list, and blocker tool constituted high levels of control to show “something more.”

Among the facts that the Ninth Circuit indicated could count as “something more” under the “right and ability to control” prong, it is the inclusion of the last one—the use of automated blocking technology—that is especially problematic for creators and copyright owners. Given the scale of online infringement, technological tools that can block, filter, or identify infringing materials are critical to addressing online piracy. Many major platforms already incorporate automated tools to minimize infringement. And a number of creators and copyright owners have urged greater adoption of such tools, recognizing that efforts to enforce copyright online are largely ineffective in their absence.

But if the Ninth Circuit is correct, and the implementation of automated technology to block potentially infringing material can kick a service provider out of the safe harbor, it would create a powerful disincentive for the adoption of such tools. Service providers would likely decline to adopt them altogether rather than run the risk of losing safe harbor protection and being exposed to liability.

While the Ninth Circuit correctly articulated the governing legal standard, I think its suggestion that blocking technologies give rise to “something more” under the “right and ability to control” test is contrary to the text and purpose of §512. And I get there by using the same reasoning that led to the “something more” standard in the first place.

Something more

The phrase “something more” is not found in the statute—rather, its genesis comes from case law. As the District Court for the Central District of California explained in 2001:

[T]he “right and ability to control” the infringing activity, as the concept is used in the DMCA, cannot simply mean the ability of a service provider to remove or block access to materials posted on its website or stored in its system. To hold otherwise would defeat the purpose of the DMCA and render the statute internally inconsistent. The DMCA specifically requires a service provider to remove or block access to materials posted on its system when it receives notice of claimed infringement. The DMCA also provides that the limitations on liability only apply to a service provider that has “adopted and reasonably implemented … a policy that provides for the termination in appropriate circumstances of [users] of the service provider’s system or network who are repeat infringers.” Congress could not have intended for courts to hold that a service provider loses immunity under the safe harbor provision of the DMCA because it engages in acts that are specifically required by the DMCA.

Hendrickson v. eBay, Inc., 165 F. Supp. 2d 1082, 1093–94 (C.D. Cal. 2001).

The same is true when it comes to implementation of technological tools for blocking or filtering content. Section 512(i)(1)(B) provides that the limitations on liability only apply to a service provider that “accommodates and does not interfere with standard technical measures”, which are technical measures used “to identify or protect copyrighted works.” Assuming for the sake of this argument that LiveJournal’s blocking tool otherwise qualified as a standard technical measure under the statute, it would render the statute internally inconsistent to say that a service provider would lose immunity by employing a blocking tool that is specifically required by the statute. (The tool doesn’t so qualify, for reasons unrelated to this argument. Under §512(i)(2), standard technical measures “(A) have been developed pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process; (B) are available to any person on reasonable and nondiscriminatory terms; and (C) do not impose substantial costs on service providers or substantial burdens on their systems or networks.” I’m not aware of any technical measure that has been developed pursuant to the process laid out in the statute.)

This interpretation is also consistent with the overall purpose of §512. Congress drafted §512 with the intent to provide incentives for service providers and copyright owners to detect and deter online infringement. S. Rep. No. 105-190, at 20 (1998). It would run contrary to this intent if service providers were discouraged from employing automated tools that do just that in order to preserve their safe harbors.

Conclusion

Instead of looking at steps taken to prevent infringing activity, courts should focus on facts that show the ability to control infringing activity. That’s not to say the use of automated technology can never be disqualifying—a service provider might tout its use of filtering or blocking tools merely as a pretense while continuing to facilitate and profit from infringement. Courts could also, as the Ninth Circuit noted here in a footnote, find the right and ability to control where the facts show a service provider intentionally induced infringement. Similarly, as the MPAA argued in its amicus brief, if the facts show that a service provider actively reviews and prescreens each user submission for the purpose of curating likely infringing content intended to draw visitors to the site, then it necessarily exercises higher levels of control over the activities of users, which would kick the service provider out of the safe harbor under this prong.
About

Copyhype provides news and info on current developments relating to copyright law, the media industries, and the digital economy. It cuts through the hype to bring reasoned discussion aimed at both legal and nonlegal audiences.

Terry Hart is currently VP Legal Policy and Copyright Counsel at the Copyright Alliance. Any opinions expressed on this site remain his own and not necessarily those of his present or any past employers.