Hide and Seek: The Problem with Obfuscation

HERE’S WHAT WE KNOW. By various means of seduction, coercion, and co-optation, everyday life has been irresistibly colonized by forces collectively known as Big Data: the corporations and state agencies that use communications networks and digital surveillance to amass huge quantities of information on the activities of individuals and groups in hopes of predicting and containing their next moves. Short of a total renunciation of the now routine conveniences of contemporary life and voluntary exile from the spaces where almost all social belonging and recognition take place, you cannot escape.

Improvements in technology have made Big Data’s recent conquest of our culture seem inevitable. “The proximate reasons for the culture of total surveillance are clear,” the software developer Maciej Cegłowski explained in a recent talk. “Storage is cheap enough that we can keep everything. Computers are fast enough to examine this information, both in real time and retrospectively. Our daily activities are mediated with software that can easily be configured to record and report everything it sees upstream.” But behind those proximate causes, as Cegłowski points out, are resilient, long-standing motives that have guided the direction of technological development for centuries: “State surveillance is driven by fear. And corporate surveillance is driven by money.”

Obfuscation: A User’s Guide for Privacy and Protest, by Finn Brunton and Helen Nissenbaum, two professors in New York University’s Department of Media, Culture, and Communication, starts by treating the dystopian situation described above as more or less a given. The dominion of Big Data leaves us vulnerable at all times to states and corporations that cannot be made accountable for their actions. “Data about us are collected in circumstances we may not understand, for purposes we may not understand, and are used in ways we may not understand,” Brunton and Nissenbaum write. We don’t know how much data is collected, nor exactly who is collecting it, nor the names of the parties they’re selling it to, nor how long any of them are planning on keeping it. Worse, they point out, “we don’t know what near-future algorithms, techniques, hardware, and databases will be able to do with our data.” This means that apparently inconsequential data about where we go, who we know, what we search, and what we consume “can potentially provide more powerful, more nearly complete, and more revealing analyses than ordinary users could have anticipated — more revealing, in fact, than even the engineers and analysts could have anticipated.” A data profile that seems innocent enough now could combine with future information and as-yet-untested analytical approaches to become a vector of persecution.

To counter the advance of these inscrutable institutions, Brunton and Nissenbaum suggest we generate some opacity of our own. To this end, they outline a variety of techniques of obfuscation — which they define as “the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection” — that ordinary people can deploy to camouflage themselves. Despite its subtitle, though, Obfuscation is not really a how-to book. It has little practical advice for would-be digital saboteurs. While several techniques of obfuscation are described and recommended, the authors don’t provide much detail on how to go about implementing them oneself. There’s not even a list of browser extensions one could download to get started. (The book’s first half does offer a brief compendium of historical examples of obfuscation in action, ranging from the use of aluminum-lined paper “chaff” to defeat World War II–era radar to Tor relays that can be used to mask the sources and destinations of internet traffic.)
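The core idea — adding plausible but misleading signals rather than blocking collection — can be made concrete with a minimal sketch in the spirit of tools like TrackMeNot (one of Nissenbaum’s own projects, which floods search engines with decoy queries). Everything here is illustrative: the decoy vocabulary and function names are hypothetical, and a real tool would draw decoys from live sources and actually dispatch them over the network.

```python
import random

# Hypothetical decoy vocabulary; a real obfuscation tool would draw
# plausible phrases from news feeds and popular-query lists so the
# decoys blend in with genuine traffic.
DECOY_TERMS = [
    "weather tomorrow", "pasta recipes", "used bikes for sale",
    "movie showtimes", "diy bookshelf plans", "local election results",
]

def obfuscate(real_query, n_decoys=5, rng=None):
    """Return the real query hidden inside a shuffled batch of decoys.

    An observer who logs the whole batch cannot tell which query was
    intentional: the signal is drowned in ambiguous noise, which is
    exactly what Brunton and Nissenbaum mean by obfuscation.
    """
    rng = rng or random.Random()
    batch = rng.sample(DECOY_TERMS, n_decoys) + [real_query]
    rng.shuffle(batch)
    return batch
```

Each batch contains the real query exactly once, indistinguishable (to the logger) from the decoys around it — the data is polluted rather than withheld.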

Instead, Obfuscation expends the bulk of its energy defending its own existence. It is haunted at nearly every turn by the voices of the Big Data propagandists who insist mass data collection is always and ever for our own good, and that only selfish people would try to meddle with the reliability of omnipresent surveillance by being intentionally dishonest. Brunton and Nissenbaum do their best to take such claims seriously, committing much of the book’s second half to hand-wringing discussions of how obfuscation could be ethically justified in Rawlsian terms. In line with Nissenbaum’s earlier work on privacy and “contextual integrity,” they stress the importance of context for determining when obfuscation is appropriate.

In making the case for obfuscation, Brunton and Nissenbaum freely admit the ultimate impotence of these techniques, and the structural weakness of those who have recourse to them. As the authors are quick to concede, the tactics of obfuscation do nothing to decrease the asymmetries of power and information they are designed to disrupt. At best, they only “pollute” the databases of the powerful, and spur them to do a better job justifying their data collection and analysis and clarifying exactly what good they are doing. More often, though, the “troublemaking” tactics of obfuscation merely buy some time, as the obfuscators are quickly outwitted by their more powerful adversaries and must struggle to come up with some other provisional evasive measure.

Obfuscation consists of ways the weak can temporarily elude the predations of the powerful, and it is justified with reference to that disparity. “Sometimes we don’t have a choice about having our data collected,” Brunton and Nissenbaum write, “so we may as well (if we feel strongly about it) put a little sand in the gears.” They draw explicit inspiration from political scientist James C. Scott’s concept of “weapons of the weak.” In Scott’s 1985 book of that title, he analyzed how oppressed Malaysian peasants without political agency found other, less overtly political means to resist: squatting, poaching, petty theft, work slowdowns, furtive complaints, and so on. Obfuscation, in Brunton and Nissenbaum’s account, works similarly, granting a limited form of agency to those who can’t rely on “wealth, the law, social sanction, and force” to protect their interests.

The evasions of obfuscation are contingent on acceptance of the impossibility of genuine escape. They provide means of getting along under conditions of enemy occupation, not strategies of revolution or resistance. They consolidate rather than dismantle the asymmetries of power; they are fugitive, rearguard actions in the midst of an ongoing collective surrender. As clever and satisfyingly creative as obfuscation’s false flags, diversions, and decoys can be, they do not speak truth to power so much as mock it behind its back. Weapons of the weak are for those who have become resigned to remaining weak.

¤

What is obfuscation good for, then? Brunton and Nissenbaum argue that it “offers some measure of resistance, obscurity, and dignity, if not a permanent reconfiguration of control or an inversion of the entrenched hierarchy.” In other words, it permits gestures of independence that feel satisfying, without changing the actual conditions of domination. We can flood Google with false search queries to disguise our real ones, but we can’t effectively agitate for an ad-free search engine, let alone for a cleaner cultural separation of information and advertising. Presumably, if enough acts of “data disobedience” are registered, entities with actual power might take it upon themselves to devise the “firmer and fairer approaches to regulating appropriate data practices,” as Brunton and Nissenbaum put it. But more often, the tactics of obfuscation don’t scale: the more people use them, the more incentive corporations and governments have to devote their superior resources to developing countertactics, thus quickly closing off whatever vulnerabilities were revealed. A preoccupation with thwarting surveillance only emphasizes the futility of the tactics left open to us, ultimately confirming the system’s oppressive strength.

Nor is obfuscation an effective foundation for a collective democratic politics. As Brunton and Nissenbaum recognize, obfuscation relies on secrecy and deception, not transparency, making it much more useful for individual protection than for public advocacy. In certain respects, privacy-based obfuscation negates the tendency to protest: it caters to a self-protectiveness that runs counter to the self-sacrifice that civic engagement often requires. Obfuscation tends to configure protest as an assertion of personal privacy rather than collective public risk-taking.

Then, too, Big Data has developed its own forms of compensation to offer the populations it has conquered. For the sorts of people who are conditioned to believe (by dint of habitus, wealth, or some other type of cultural privilege) that they have nothing to hide and thus nothing to fear from being closely watched, constant surveillance can serve as a flattering form of social recognition. Many of us have learned to consume surveillance — even the surveillance of our own bodies — as entertainment. It can feel good to be at the center of the action. Who doesn’t derive pleasure from attention, even if it is only from a social network or a fitness tracker or an algorithm? The shopping recommendation services, the spurious quizzes, the awkward year-in-review documentaries on Facebook: these track our every move and model our anticipated future behavior for our amusement. They help posit a sense of our “real self,” the self that can be revealed only after massive amounts of data about us are processed and sorted in conjunction with that of millions of others. They encourage us to think of ourselves as enthralling yet solvable mysteries, stories we can enjoy vicariously.

In this way, surveillance becomes both a means to self-knowledge and a precondition for it. Surveillance calls out to us, hailing us like rideshares. Obfuscation, for all its potential usefulness, never escapes this logic. When surveillance makes us seek lines of flight, it sharpens our awareness of ourselves as selves, as people having selves worth protecting or concealing from view. We rely on surveillance to supply a sense of stakes for the self. The mechanisms that Big Data relies upon to infer relevance — the likes, the patterns of communication among other people in our networks, geographic and biological cues — are the same ones we rely on as well. They bring us into focus not only for our adversaries but for each other and ourselves.

Not only is obfuscation a weapon of the weak, then: it’s a very messy and imprecise one — a crude pipe bomb rather than a targeted drone. While obfuscation intends to spread disinformation at the expense of data collectors, it also undermines the information channels we rely on for social cohesion. The same asymmetry that creates the need for subterfuge also means we can’t know how our disinformation will circulate throughout our networks and which of our friends and associates will experience collateral damage. Given the way some predictive algorithms work — drawing inferences from who we are connected to and what their preferences are, correlating patterns in our data with that of others to draw conclusions about what we want and who we are — we can’t limit the effects of muddying the waters to our own online tide pools. Our phony data may feed into algorithmic models determining whether other people will receive discounts, get offered loans, or end up on government watch lists.
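The collateral-damage problem follows directly from how network-based inference works. A toy model makes it visible — all of the names and numbers below are hypothetical, and real systems are far more elaborate, but the structural point survives: when predictions about one person are computed from their contacts’ data, one contact’s injected noise shifts conclusions about people who never obfuscated at all.

```python
def predict(user, scores, contacts):
    """Toy network-based inference: estimate a user's score as the
    mean of their contacts' recorded scores (standing in for a
    credit, risk, or relevance signal inferred from a social graph)."""
    neighbors = contacts[user]
    return sum(scores[n] for n in neighbors) / len(neighbors)

# Hypothetical social graph: dana is scored via her three contacts.
contacts = {"dana": ["alice", "bob", "carol"]}

# Honest world: everyone's recorded score is 0.5.
honest_scores = {"alice": 0.5, "bob": 0.5, "carol": 0.5}

# Polluted world: alice alone injects a wildly false score of 0.9.
polluted_scores = {"alice": 0.9, "bob": 0.5, "carol": 0.5}

honest = predict("dana", honest_scores, contacts)      # 0.5
polluted = predict("dana", polluted_scores, contacts)  # rises above 0.5
# dana never obfuscated, yet the model's estimate of her moves
# with alice's noise -- the collateral damage the essay describes.
```

The asymmetry the essay identifies is also visible here: alice cannot see the model, so she cannot predict or contain where her fake data propagates.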

If obfuscation does nothing more than preserve the illusion of freedom and agency in the midst of total surveillance, it may be more fruitful to adopt a different approach. Obfuscation assumes that the autonomy of the individual self is something precious that needs to be protected from violation. But in making the unmonitored self the ultimate prize, obfuscation colludes with existing systems of power, which rely on our isolation and our eagerness to assume responsibility for circumstances we can’t entirely control. Surveillance is more effective the more we are guided by the threat it seems to represent to our personal integrity.

But a merely illusory integrity, rooted in paranoia, may not be worth holding on to at all. So what if, instead of obfuscating, we stayed alert to the potential solidarities that are articulated by the very schemes designed to control us? What if, instead of trying to fly under the enemy’s radar, we let that radar help us find allies with whom we can fly in formation? If we can find a way to repurpose the resources of surveillance toward ends it can’t perceive, we could begin to consume the system rather than be consumed by it.