Yet Another Lesson from the Cambridge Analytica Fiasco: Remove the Barriers to User Privacy Control

Last weekend’s Cambridge Analytica news—that the company was able to access tens of millions of users’ data by paying low-wage workers on Amazon’s Mechanical Turk to take a Facebook survey, which gave Cambridge Analytica access to Facebook’s dossier on each of those turkers’ Facebook friends—has hammered home two problems: first, that Facebook’s default privacy settings are woefully inadequate to the task of really protecting user privacy; and second, that ticking the right boxes to make Facebook less creepy is far too complicated. Unfortunately for Facebook, regulators in the U.S. and around the world are looking for solutions, and fast.

But there’s a third problem, one that platforms and regulators themselves helped create: the plethora of legal and technical barriers that make it hard for third parties—companies, individual programmers, free software collectives—to give users tools that would help them take control of the technologies they use.

Ad-blockers are nearly as old as the web itself. In the web’s early days, they broke the deadlock over pop-up ads, letting users directly shape their online experience. The result was the death of the pop-up, as advertisers realized that serving one was a near-guarantee that no one would see their ad. We—the users—decided what our computers would show us, and businesses had to respond.

Web pioneer Doc Searls calls the current generation of ad-blockers “the largest consumer revolt in history.” The users of technology have availed themselves of tools that give them the web they want, not the web that corporations want them to have. The corporations that survive this revolt will be the ones that can deliver services users are willing to use without add-ons that challenge their business-models.

Under ideal conditions, companies that do bad things with technology are shamed and embarrassed by bad press (norms); they face lawsuits and regulatory action (law); they lose customers and their share-price dips (markets); and then toolsmiths make add-ons for their products that allow us all to use them safely, without giving up our personal information, or being locked into their software store, or having to get repairs or consumables from the manufacturer at whatever price it demands (code).

But an increasing slice of the web is off-limits to the “code” response to bad behavior. When a programmer at Facebook makes a tool that allows the company to harvest the personal information of everyone who visits a page with a “Like” button on it, another programmer can write a browser plugin that blocks that button on the pages you visit.
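To make that concrete, here is a minimal sketch of what such a plugin’s content script might look like. It assumes a standard WebExtension setup; the selectors are illustrative guesses rather than a vetted blocklist, and a production blocker (uBlock Origin, Privacy Badger) would also block the underlying tracking requests instead of just hiding the button:

```typescript
// content-script.ts — illustrative WebExtension content script that hides
// embedded Facebook "Like" buttons. The selectors below are assumptions for
// this sketch, not a maintained blocklist.

// Patterns that typically match Facebook's social-plugin embeds.
const LIKE_BUTTON_SELECTORS = [
  'iframe[src*="facebook.com/plugins/like"]',
  'div.fb-like',
];

function removeLikeButtons(root: ParentNode): void {
  for (const selector of LIKE_BUTTON_SELECTORS) {
    root.querySelectorAll(selector).forEach((el) => el.remove());
  }
}

// Strip buttons present at load, then watch for any injected later.
removeLikeButtons(document);
new MutationObserver(() => removeLikeButtons(document)).observe(
  document.documentElement,
  { childList: true, subtree: true }
);
```

Hiding the button is only half the job: the tracking happens when the embed loads, so a real extension would also intercept those requests with the browser’s webRequest API before they ever leave your machine.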

This week, we made you a tutorial explaining the tortuous process by which you can change your Facebook preferences to keep the company’s “partners” from seeing all your friends’ data. But what many programmers would really like to do is give you a tool that does it for you: go through the tedious work of figuring out Facebook’s inscrutable privacy dashboard, and roll that expertise up in a self-executing recipe—a piece of computer code that autopilots your browser, logging into Facebook on your behalf and ticking all the right boxes for you, with no need for you to do the fiddly work.
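In code, such a recipe might run only a few dozen lines. Here is a hypothetical sketch using the Puppeteer browser-automation library; every selector and settings URL below is a placeholder assumption, since Facebook’s pages change constantly and nothing here reflects its actual markup:

```typescript
// privacy-autopilot.ts — a hypothetical sketch of the "self-executing recipe"
// described above, driving a real browser with Puppeteer. All selectors and
// URLs are made-up placeholders, not Facebook's actual page structure.
import puppeteer from 'puppeteer';

async function lockDownPrivacySettings(email: string, password: string) {
  const browser = await puppeteer.launch({ headless: false });
  const page = await browser.newPage();

  // Log in on the user's behalf.
  await page.goto('https://www.facebook.com/login');
  await page.type('#email', email);
  await page.type('#pass', password);
  await Promise.all([
    page.waitForNavigation(),
    page.click('button[name="login"]'),
  ]);

  // Visit the (hypothetical) app-platform settings page and untick every
  // box that shares your friends' data with "partners".
  await page.goto('https://www.facebook.com/settings?tab=applications');
  const boxes = await page.$$('input[type="checkbox"][data-share-setting]');
  for (const box of boxes) {
    if (await (await box.getProperty('checked')).jsonValue()) {
      await box.click(); // turn the sharing setting off
    }
  }

  await browser.close();
}
```

The hard part isn’t writing this code; it’s that the moment you ship it, you inherit both a maintenance treadmill and the legal threats described next.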

But they can’t. Not without risking serious legal consequences, at least. A series of court decisions—often stemming from the online gaming world, sometimes about Facebook itself—has made fielding code that fights for the user into a legal risk that all too few programmers are willing to take.

That's a serious problem. Programmers can swiftly make tools that let us express our moral preferences, pushing back against bad behavior long before any government official can be convinced to take an interest—and if your government never takes an interest, or if you are worried about the government's use of technology to interfere in your life, you can still push back, with the right code.

When we talk about “walled gardens,” we focus on the obvious harms: an App Store makes one company the judge, jury and executioner of whose programs you can run on your computer; apps can’t be linked into the way web pages can, and when they’re pulled they disappear from our references; platforms get to spy on you when you use them; opaque algorithms decide what you hear (and thus who gets to be heard).

But more profoundly, the past decade’s march to walled gardens has limited what we can do about all these things. We still have ad-blockers (but not for “premium video” anymore, because writing an ad-blocker that bypasses DRM is a potential felony), but we can’t avail ourselves of tools to auto-configure our privacy dashboards, or snoop on our media players to see if they’re snooping on us, or any of a thousand other useful and cunning improvements to our technologically mediated lives.

Because in the end, the real risk of a walled garden isn’t how badly it can treat us: it’s how helpless we are to fight back against it with our own, better code. If you want to rein in Big Tech, it would help immensely to have lots of little tech in use showing how things might be if the giants behaved themselves. If you want your friends to stop selling their private information for a mess of pottage, it would help if you could show them how to have an online social life without surrendering their privacy. If you want the people who bet big on the surveillance business-model to go broke, there is no better way to punish them in the marketplace than by turning off the data-spigot with tools that undo every nasty default they set in the hopes that we'll give up and use their products their way, not ours.
