Surveillance and Trust

Worried about where your kids are or what they're up to? Relax. All you need is PhoneSheriff on their iPhone. It logs users' activities and sends the information back to your own device. Or try mSpy, MamaBear, or FiLIP. The latter is a colourful, watch-like device for kids that functions like the electronic bracelets used for offenders. Using GPS, it sets safe zones and alerts parents when boundaries are crossed.

Surveillance has gone domestic. Is this okay? All sorts of issues might cross your mind, to do with normalizing surveillance or relying on technological "solutions" for relational problems. Judith Shulevitz, writing in the New Republic, says the really insidious problem may be this: "It sends the message that nothing and no one is to be trusted: not them, not us, and especially not the rest of the world." I agree.

In what ways does surveillance undermine trust? How and why does this happen today? Is it inevitable? Are there alternatives? You don't have to say that surveillance—or the technologies that support it—is bad in itself to recognize that we easily rely on machines rather than on each other. The world of email and texting is simpler and quicker than finding a moment for a face-to-face encounter. How much more might bonds of trust be broken when we rely on hidden cameras in toys or on phones to track our kids? We'll explore this here, including the conviction that, at every level, there are other ways forward.

Surveillance boom

Surveillance is an ancient—and ambivalent—practice. Its agents may be seen in the "watchmen" of biblical times, or those sent to "spy out the land," and its more shadowy Shakespearian moments occur when characters hide behind curtains or peep through keyholes. Devices for surreptitious watching or listening are also of antique origin. Checking up on others has long been common, but historically it was not usually an organized and generalized activity.

Until the twentieth century. Factories, offices, schools, hospitals, and armed forces all became more "rationally" organized under modern bureaucratic rule in the nineteenth century. This flowered—if that's the right word—in terrifying regimes such as czarist then Soviet Russia, the Third Reich in Germany, or, for that matter, tendencies within the "liberal democracies" that George Orwell worried about in Nineteen Eighty-Four.

Surveillance in multiple contexts has been expanding exponentially since the mid-twentieth century, especially as information technologies were applied in one context after another. The digital computer was a product of wartime national intelligence, so spy agencies were an early beneficiary. Possibilities in the office and factory followed, for monitoring by camera or keystroke. Toward the turn of the century police became, essentially, information workers, and developing digital methods helped to transform older techniques of advertising and marketing.

Customer data had been collected for decades, but software and statistical techniques made it possible for this to morph into the loyalty cards and then the social media surveillance that we know today. Now, when you search on Amazon, check Facebook, or mention a product in an email, in a matter of minutes you start to see banner ads for other, similar items. They know.

Surveillance 2.0

All this means that today's watching is not like old Orwellian or Stasi surveillance—except that in these cases, too, trust was undermined. Today, its means are digital and they're found in every context. Recall that the digital computers of Alan Turing's era were huge and heavy. Today, little, lightweight devices linked wirelessly to global networks mean that there really is nowhere to hide, if disappearing seems like something you'd fancy. We all depend on the digital, moment by moment, mobile or static.

Another big difference is that surveillance in Orwell's fictional Oceania or all-too-real East Germany was state-run. While governments are still thoroughly involved in monitoring and tracking, corporations are also profoundly implicated. Don't forget that what outraged citizens in the initial Edward Snowden scandal was the fact that the US National Security Agency obtained data from the phone company Verizon and the Internet giants Apple, Google, Microsoft, Yahoo, and others. Now, marketing means surveillance.

User-generated content became a possibility with Web 2.0, but ironically the freedom to contribute your tweets, pics, and posts has more than one downside—in this case, that without encryption, all of these are also "fair game" for marketers and, by extension, for spy agencies. We use the platforms as if they're free, but the price we pay is that we become the target for all kinds of personal data collection. We easily become both commodity and suspect.

Surveillance as solvent

One problem with the so-called mass surveillance that Snowden exposed is that it creates suspects. The data dragnet that scoops up millions of information fragments from a vast array of sources—just in case—produces many "false positives." It makes mistakes that finger innocents. And this works both ways. We feel uneasy, uncertain, because we wonder when we might be stopped at the border—or wherever—as a suspect. And we start to doubt the system that is supposed to protect us. Trust is broken.
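The arithmetic behind those false positives is worth spelling out. Here is a minimal sketch in Python, using invented numbers purely for illustration (a real dragnet's error rates are classified and unknown): even a screening system that is 99 percent accurate, applied to a population in which genuine threats are vanishingly rare, flags a crowd made up almost entirely of innocent people.

```python
# Hypothetical figures, for illustration only: a population of
# 300 million containing 3,000 actual threats (0.001 percent),
# screened by a system that correctly flags 99% of real threats
# but also wrongly flags 1% of innocents.
population = 300_000_000
threats = 3_000
innocents = population - threats

true_positives = 0.99 * threats      # real threats flagged
false_positives = 0.01 * innocents   # innocents flagged by mistake

flagged = true_positives + false_positives
share_innocent = false_positives / flagged

# About 3 million people are flagged, and roughly 99.9% of them
# are innocent: the "false positives" vastly outnumber the threats.
print(f"{flagged:,.0f} flagged; {share_innocent:.1%} innocent")
```

This is the base-rate problem: when the thing being hunted is rare, even a small error rate produces a sea of wrongly fingered suspects.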

All too often, surveillance is a solvent of trust. Imagine how your teen feels when you install Verizon's Hum in the car she's borrowing—it alerts you, at home, when she's out of the agreed range or is speeding. Or recall how millions of phone company customers felt completely betrayed when they discovered from Edward Snowden that their personal details were shared with security agencies. And by the way, don't say that metadata—details like whom you contacted, for how long, and from where—does not count as "personal data." It's exactly the kind of information that private detectives seek. Not personal? Hardly. Trust is hard to build, but easily destroyed, at any level. As we turn with greater frequency to our devices, we may be turning from trust to technology. Our world too readily substitutes robotics for relationships.

In her 2002 Reith Lectures on today's "crisis of trust," Onora O'Neill observed that in a world of rising suspicion where—against the evidence—more people are thought to be untrustworthy, we end up working out more stringent ways to control this dubious category rather than seeking to rebuild trust. While she spoke mainly of institutional trust, this also applies to children and parents. Fears of "stranger danger" foster further fears, whereas appropriate development of trust enables not only basic relationships to flourish but also confidence, wisdom, and discretion among children.

Of course there's an element of risk in all trust, and in a less-than-perfect world it will always be so. But the need to trust is vital to any and all healthy relationships, so to substitute this with devices is, paradoxically, the real danger. As Tonya Rooney says, unless children are "able to place themselves in the position of trusting the 'other' and exposing themselves to whatever risk this may entail, then they also have little basis for understanding the 'other.'" Here, surveillance replaces trust because of our cultural tendency to be risk averse and security obsessed.

Surveillance troubles trust

The difficult task of understanding the "other" can also be seen on a much larger scale. Not long ago I was in Birmingham, UK, where a large Muslim population lives, especially in the rather deprived neighbourhood of Sparkbrook. In 2010 police cameras sprouted all around the perimeter of this area, under a "community safety" initiative. The shock came when it was discovered that the surveillance was in fact the product of an "anti-terrorist" scheme and the cameras were recording the comings and goings of every single driver.


Like every other big city, Birmingham has to contend with issues of drug-related gang violence. However, in this mainly Pakistani-British district, residents were horrified to find that the whole community was uniquely under scrutiny for violent extremism; in essence, untrusted.

I met an Anglican priest from Sparkbrook, Richard Sudworth, who explained that many neighbourhood meetings are held in his church hall.

He says that Sam Wells's motif of "being with" suggests a model of Christian presence that is relational and not subject to ulterior motives or utilitarian ends. This flows from our understanding that "the Word became flesh and blood and moved into the neighborhood" (John 1:14, The Message). This "being with" complements the Catholic social teaching that values the nests of human relationships in public life underscoring the "common good." In a talk I heard, he concluded that "the rise of mass surveillance in western societies reveals a cultural crisis of trust for which only a politics of the common good, of 'being with,' is remedy."

Going beyond the family or community level, we may also see the negative effects of surveillance and suspicion on the very broad canvas of post-9/11 journalism and social media use. In this case, agencies charged with protecting national security hope that citizens will trust them, and their use of secret, classified information, to act in ways that citizens would approve of. The problem is that, especially since the Cold War years, they have frequently shown themselves to be unaccountable and far from transparent. They all too easily become a law unto themselves.

One example of this is the "chilling effect" on the very core of democratic life—open communication and information. Journalists, whistleblowers, and of course any responsible and concerned citizens who also use social media worry that their activities may be monitored, and thus closed down. Several careful studies show that both regular social media users and, especially, journalists have pulled back their online discussions or their writing about "sensitive" issues since the Edward Snowden disclosures.

For example, in the United States, where 87 percent were aware of Snowden's work, there was a 20 percent decline in views of Wikipedia pages concerning terrorism—pages using words such as "Taliban," "car bomb," or "al Qaeda." By 2015, Google users were entering fewer search keywords that they believed might get them into trouble with government. And a study of PEN writers showed that one in six curbed their content out of fear of surveillance; in other words, they were self-censoring. Here is the chilling effect, which is also a means of control when it produces collective conformity.

Surveillance we can trust?

So, does surveillance always act as a solvent of trust, or can it perform a role that fosters flourishing, not fear? Frankly, this is a difficult question today, precisely because of the combination of corporate and governmental power behind the major currents of surveillance, the invisibility of the software and statistics on which it mainly depends, the media amplification of mistrust and fear-mongering, and the domestication of surveillance in our phones, Fitbits, and, yes, FiLIPs. It's complicated!

However, surveillance can be restrained when necessary and, better, redirected. To redirect surveillance is to connect it with appropriate purposes, to work out how it can foster flourishing, not fear. My friend in Sparkbrook believes in the need for community policing, but within relations of trust where, for example, "decisions over mass surveillance [would] be qualified by the priority of human seeing, watching, and listening in mutually challenging ways."

That's the key—human priorities first, rather than technological, political, or commercial ones. Eric Stoddart puts it nicely when he says that "surveillance of people has dominated our culture of technologized risk. . . . Surveillance ought first and foremost to be for people." Uniquely in the field, Stoddart also insists that, so far from working out of some abstract notions of "privacy," an appropriate perspective comes from considering the experiences and activities of the "crucified God." For Jesus, intense surveillance was such a constant reality in his life that he understands profoundly what it is like to be spied on, monitored, and tracked, and inspires efforts to watch carefully and to hold surveillance to account.

This resonates with what John Hall says: "Bluntly, social life per se depends on the ability to trust, to so regularize encounters that fear and uncertainty are contained." The latter are expected, but we need trust in order to continue living our lives and ordering our societies for the common good. What are the steps here?

Relational, reciprocal trust

Trust is necessarily relational; we trust others, beginning in the family. This can expand to trusting groups or institutions. Trust is also reciprocal. It is a bond that fosters fairness in the relationship, so each party feels obliged to trust the other. The notion of covenant takes this further; the obligations are expressed in promises. Such promises may be implicit—as in a parent-child relationship—or explicit, such as when an Internet company makes every effort to protect personal data.

The relevance for surveillance is clear. We may watch each other, or institutions may monitor us, in ways that inspire trust or that produce suspicion and fear. We would like to think that if we have nothing to hide, we have nothing to fear. But in today's surveillance climate, sad to say, such naïve faith is misplaced. This is not necessarily because of duplicity or deceit but because of the methods used: profiling and prediction based on data fragments pulled together statistically from so-called bulk collection. "Mistakes" and system bias are common.

Suspicion spirals unless it is halted by new, credible promises and obligations.

One way this has been approached in the Western world is through the idea of privacy. While far from adequate in many ways, "privacy" may be cast as a right or a civil liberty, and thus has a place in law and regulation. But, as Priscilla Regan and others have shown, it may do much more. It can be mobilized to foster trust, social cohesion, and solidarity. Privacy thus supports democratic participation, which is so easily undermined when, for instance, surveillance produces a chilling effect.

Interestingly, such sentiments also appear in some official statements of those charged with maintaining privacy. Canada's federal privacy commissioner, for example, notes that trust and social cohesion are perhaps the first casualties as people put aside either privacy or security in favour of the other. Trust between citizens and their neighbours, as well as between citizens and the state, hinges on a mutual understanding or consensus about the need to provide security protection, the need to respect rights like privacy, and the need to preserve the free and democratic society which we all cherish. Picking up on the point about promises, the commissioner also states that privacy is "a vital component of the social contract between Canadians and their government. Without privacy, without protective boundaries between government and citizens, trust begins to erode." Mutual trust between state and citizen is essential to avoid that enervating sense of alienation, inequality, and unfairness.

A culture of trust?

So where to turn? Without doubt, we need something beyond liberal democracy, which itself is eroded by surveillance-as-lack-of-trust (stranger danger, suspicion, fear). Those who believe that trust is crucial for social relationships may be inspired by the broader triangle of trust between God and humans, whose third part is the earth itself.

With the rest of creation, humans trust that our food and means of sustenance for life on earth will be provided. But this in turn demands that we engender trust in our "horizontal" relationships, which yields the solidarity and mutuality discussed above—and ensures that all have access to the good things of creation.

We started with a discussion of personal devices and systems used domestically—nannycams and electronic security bracelets that support parental surveillance. Apart from the question of the necessity or appropriateness of FiLIPs or teen-driver monitors, there's an issue of what kind of world we are making when we rely on such gizmos. A culture of surveillance is emerging in which, rather than questioning or maybe even resisting the rise of ubiquitous surveillance, we embrace it ourselves.

This may be problematic on a small scale, in families or at school. But its bigger impact is to normalize many other kinds of surveillance and to teach the next generations that high-tech monitoring of everyday life is okay. How did we come to this? Some of the steps are outlined above. Can we do anything about it now?

Yes, we can. Let's start to place our thinking about and practices of surveillance in a larger frame in which trust features prominently. From everyday life in homes and families, where using our devices to check on others may so easily dissolve trust, to our working lives, to our clubs, churches, and communities, to our lives as urban and global citizens—we can raise questions about and offer alternatives to surveillance when it seems to be displacing trust. Many online groups offer help here—some are listed in a book that my research group put together on surveillance in Canada. (Check out the free PDF at go.cardus.ca/surveillance.) We've made this digital world; now we have to learn "digital discipleship." Unavoidably, one of our first moves must be to rethink our relations of trust. Within this, we may discover ways of curbing inappropriate surveillance and shaping surveillance for the common good.

David Lyon directs the Surveillance Studies Centre, is a Professor of Sociology, holds a Queen's Research Chair, and is cross-appointed as a Professor in the Faculty of Law at Queen's University in Kingston, Ontario.