more heinous than you might think, and that contrary to popular belief, there is something you can do about it.

Part philosophical treatise and part rousing how-to, Obfuscation reads at times as an urgent call to arms.

“Machines don’t forget.”

“We mean to start a revolution with this book,” its authors declare. “Although its lexicon of methods can be, and has been, taken up by tyrants, authoritarians, and secret police, our revolution is especially suited for use by the small players, the humble, the stuck, those not in a position to decline or opt out or exert control.”

“One of the tricky things about online tracking is that it’s so complex and invisible that we aren’t necessarily cognizant of it happening,” says Finn Brunton, coauthor and professor at New York University. “Part of the goal of Obfuscation is to draw attention to precisely that problem.”

Consider the trick by which, in loading a single (practically invisible) pixel onto a website you’re visiting, an ad server can, without your knowledge, collect all kinds of information about the browser and device that you’re using—information that could then be used down the line to, say, jack up the price on a plane ticket the next time you’re making travel arrangements, serve up a selection of higher-end goods the next time you search on an online retailer’s site, or, on the flip side, make it tougher for you to get a loan, if something about your data gets flagged as a credit risk.
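The mechanics are mundane: the pixel is just an image request, but every image request carries identifying headers. The sketch below is hypothetical and simplified (the field names and the `profile_from_request` helper are illustrative, not from the book), but it shows roughly what a single invisible-pixel request hands to an ad server:

```python
# A minimal 1x1 transparent GIF -- small enough to be invisible on a page.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

def profile_from_request(headers, client_ip):
    """Collect the identifying data that one pixel request leaks.
    `headers` is a dict of HTTP request headers, as any web server sees them."""
    return {
        "ip": client_ip,                                # coarse location
        "user_agent": headers.get("User-Agent"),        # browser, OS, device
        "referer": headers.get("Referer"),              # the page you were reading
        "language": headers.get("Accept-Language"),     # locale
        "cookie": headers.get("Cookie"),                # cross-visit identifier
    }
```

Stitched together across thousands of sites that embed the same pixel, these few fields are enough to follow one browser around the web.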

This is a clear example of what Brunton and coauthor Helen Nissenbaum, also a professor at NYU, describe as “information asymmetry,” where, as they write, the companies collecting data “know much about us, and we know little about them or what they can do.”

The surveillance background

It’s not just that we haven’t agreed to having our personal information collected; it’s that the invisible processes of dossier building are so complex, and their consequences so difficult to predict, that it would be virtually impossible to understand exactly what we’re being asked to consent to.

Whereas NSA snooping makes headlines, other forms of quiet surveillance go unnoticed (and unregulated), to the benefit of shadowy entities making bank in the data economy—or even police using software to calculate citizens’ threat “scores.”

“Machines don’t forget,” Brunton says. Suppose you have an agreement with one company, “the best company run by the best people,” he says, “but then they go bankrupt, or get subpoenaed, or acquired. Your data ends up on the schedule of assets,” and then you don’t know where it might end up.

To be clear, the authors—whose manifesto irked critics who argue that these kinds of transactions are what finance the “free” internet—aren’t against online advertising per se.

“Before ad networks started the surveillance background,” Nissenbaum explains, “there was traditional advertising, where Nike could buy an ad space on, say, the New York Times [website], or contextual advertising, where Nike would buy space on Sports Illustrated. There were plenty of ways of advertising that didn’t involve tracking people.”

Nowadays, though, Brunton says, “Many online sites that produce content you use and enjoy don’t get that much money out of the advertising, and yet there’s a whole galaxy of third-party groups on the back end swapping data back and forth for profit, in a way that’s not necessarily more effective for the merchant, the content provider, or you.

“Then add on top of it all that the data can be misused, and you have a network that is less secure and built around surveillance. I think that starts to shift the balance in favor of taking aggressive action.”

That’s where obfuscation—defined in the book as “the production of noise modeled on an existing signal in order to make a collection of data more ambiguous, confusing, harder to exploit, more difficult to act on, and therefore less valuable”—comes in.

TrackMeNot, for example, one of several elegant obfuscation tools designed by Nissenbaum and NYU computer science colleagues, serves up bogus queries to thwart search engines’ efforts to build a profile on you, so that when you search, say, “leather boots,” it also sends along “ghost” terms like “Tom Cruise,” “Spanish American War,” and “painters tape” (which don’t affect your search results). Another tool, AdNauseam, quietly registers a click on every ad your ad blocker blocks, rendering futile any attempt to build a profile of your preferences from the ads you click.
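The core idea of TrackMeNot, stated as the book defines obfuscation, is noise modeled on the real signal. A toy sketch of that idea (the decoy pool and function below are this article’s illustration, not TrackMeNot’s actual code, which draws decoys from evolving query lists):

```python
import random

# Hypothetical pool of plausible decoy queries; the real tool refreshes
# its pool dynamically so the ghosts resemble genuine search traffic.
DECOY_POOL = [
    "Tom Cruise", "Spanish American War", "painters tape",
    "sourdough starter", "used pickup trucks", "knee pain remedies",
]

def obfuscated_queries(real_query, n_decoys=3, rng=random):
    """Hide the real query among n_decoys ghost queries, in random order.
    The profiler sees all of them; only the user knows which one mattered."""
    batch = [real_query] + rng.sample(DECOY_POOL, n_decoys)
    rng.shuffle(batch)
    return batch
```

From the search engine’s side, all four queries arrive from the same browser and look equally genuine, which is exactly what makes the resulting profile “more ambiguous, confusing, harder to exploit.”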

History lessons

Even as they look to future battles, Brunton and Nissenbaum draw inspiration from the past, offering a compendium of examples of obfuscation tactics used throughout history.

People worried that their private conversations may be being recorded can play a “babble tape” in the background—an update to the classic mobster strategy of meeting in noisy bathrooms to safeguard against FBI audio surveillance.

Shoppers can swap loyalty cards with strangers to prevent brick-and-mortar stores from building a record of their purchases. The orb-weaving spider, vulnerable to attacks by wasps, builds spider decoys to position around its web.

Brunton and Nissenbaum are often asked in interviews about what simple steps even technophobes can take to protect their privacy. The answer: It depends on what scares you most.

“Are you worried about Google?” Brunton asks. “About your insurance company? Where are the places that you want to push back?” A theme that emerges in the book is that obfuscation tactics, while often similar in principle, vary a lot in practice; each unique threat requires a unique defense.

“The ideal world for me is one where you don’t need to obfuscate.”

“Camouflage is often very specific,” Nissenbaum explains. “This animal is worried about these particular predators with this particular eyesight. It’s a general thing but in the instance, it is quite specialized.”

That makes for a big challenge, since there are so many threats—and the notion of “opting out” of all types of surveillance has become so impractical as to be nearly nonsensical. (In the book, Brunton and Nissenbaum quip that it would mean leading “the life of an undocumented migrant laborer of the 1920s, with no internet, no phones, no insurance, no assets, riding the rails, being paid off the books for illegal manual work.”)

Brunton, for example, refuses to use E-ZPass (which, in addition to enabling your cashless commute, announces your location to readers that could be waiting anywhere—not just in tollbooths), but can’t resist the convenience of Google Maps. And Nissenbaum declined to share her location with acquaintances using the iPhone’s “Find My Friends” app, but lamented that there’s no box to check to keep Apple from knowing her whereabouts.

Brunton and Nissenbaum stress that obfuscation isn’t a solution to the problem of constant surveillance, but rather a stopgap to draw attention to the issue and the need for better regulation.

“The ideal world for me,” Nissenbaum says, is “one where you don’t need to obfuscate.”

She draws an analogy between our time and the moment when, soon after telephones became mainstream, the US passed laws forbidding phone companies from listening in on their customers’ conversations.

“You could imagine a different route, where they could eavesdrop and say, ‘Oh, I can hear you discussing with your mom that you would like to go to Mexico in the summer, why don’t we send you a few coupons for Mexican travel?’” Until we pass similar laws to address our current predicament, we’ll be stuck with “the information universe eavesdropping on everything we do.”

Brunton draws an even bolder comparison—between the dawn of the information age and the (much) earlier transition from agrarian to industrial life. Indeed, history is a testament to how societies can and do find equilibrium with transformative new technologies.

The bad news, in the case of the Industrial Revolution, though, is that “in the middle of that shift, horrific things happened to huge populations of people,” Brunton says. Today, he argues, we have the opportunity to prevent the digital equivalent of such horrors. “Can we find ways to prevent the worst outcomes for vulnerable populations?”

This text is published here under a Creative Commons License.
Author: Eileen Reynolds-NYU
See the article’s original source for the exact terms of the license to reproduce it on your own website.