The Right Deed for the Wrong Reason

Until a few days ago, I didn't know who Britt McHenry was. Now I do—not through her day job at ESPN, but rather through her surveillance-enabled, Web-driven disgrace. If you don't know her story, you know one very like it: McHenry was brutally rude to a tow-pound employee. A surveillance camera caught her tirade and some footage of it ended up on a website. The Internet pounced. McHenry later tweeted: “in an intense and stressful moment, I allowed my emotions to get the best of me and said some insulting and regrettable things,” which sounds about right to me. Who hasn't done that? However, this is 2015, and McHenry didn't get the time to scrutinize and evaluate herself in private. Instead, she was hoisted up onto the virtual pillory of Internet scorn.

That's life in the 21st century. Surveillance now is not just imposed by the state on the citizenry, à la 1984. It's also a practice citizens impose on one another (and on agents of the state), with cameras and social media. More and more of what we do and say—to say nothing of what we tweet and post—is available for others to see and (more importantly) to judge instantly.

Pondering this obviously huge shift in the way people now live their lives (and, specifically, McHenry's story), Megan Garber made an argument the other day that puzzled me. We behave better when we know we are being watched, she wrote, therefore being watched is not all bad. Woe unto the two-faced and the slackers and sliders, because technology makes it, as Garber wrote, “harder to differentiate between the people we perform and the people we are.” Wealthy celebrities will have to think twice about insulting lowly service workers. More importantly, cops will, we hope, hesitate to abuse prisoners when body cams are recording their every move. Who would say that's not good?

Twenty or thirty years ago, a lot of people would have. The assumption that underlies Garber's claim would have been, at the very least, debatable. But in 2015 it is considered to be obviously true, and she spends no time examining it. Surveillance has been around so long that we accept its premises even when we argue about it.

That assumption is this: All that matters is what people do, not why they do it. That is the justification when we use monitoring to ensure compliance with any rule, be it basic courtesy, professional standards, adherence to the law or obedience to a moral code. If a viral video of my bad behavior subjects me to global contempt, you can be fairly sure that I won't make that mistake again. But you can't be sure that I won't want to. You won't know if I have reflected on my behavior and understood that I “let my emotions get the best of me,” or if I'm just avoiding an unpleasant ordeal. I myself may not understand why it is so important that I comply. All I need to know is that nonconformity will be revealed and punished.

That is what works, without the murky, unmeasurable complications that would ensue if you had to get me to reflect and decide for myself. And what works is what is being deployed all around us. At the office there are keystroke monitors to make sure employees stay on task. Online there is insta-shaming to make sure you don't use any word or phrase that your tweeps consider un-PC. Even in the privacy of your own life, there are thousands of apps you can use to monitor and shame yourself into eating less, exercising more, saving money, or spending less time on Facebook.

These technologies are oriented toward measurable results: hours saved, pounds lost, cigarettes unsmoked, clients contacted and so on. In that, they express the ideology of our time, which can also be seen driving the turn in government away from explicit appeals to reason in favor of “nudges,” and a similar turn in business toward marketing via big-data prediction, social media or other avenues that bypass conscious reflection. It doesn't matter what you think or feel, it only matters what you do.

Now, this assumption can be justified in a variety of ways. One is that in some circumstances, where life and limb are at hazard, it is entirely appropriate not to care what people are thinking. It is so important that police not violate civil liberties, for example, that we can reasonably say we don't care if they're cool with the concept. Don't Get Caught (And You Will Be) is a crude but effective way of ensuring as little death and damage as possible. But this claim doesn't justify the Internet shaming of celebrities or the use of software to make sure employees don't bounce over to eBay in the office. The cost of a violation there is too low.

For the vast majority of other situations in which we accept monitoring tech to guarantee courtesy or conscientiousness, the justification is the same as you hear for most tech: It just makes life easier, you know? Why struggle with yourself about going the extra mile at work, when a social app that reveals your performance to colleagues is sure to motivate you? For that matter, why agonize about eating too much when you can use a special fork-gadget to let you know you are eating too fast? As Evan Selinger has put it, letting the monitors decide is a form of outsourcing. And outsourcing is about making life “seamless” and “frictionless,” to use the developer buzzwords.

The problem with this justification, of course, is that when we remove work and friction from life, we lose as well as gain. Selinger has criticized apps that “outsource intimacy” on this basis. When you set up an app to text your significant other, you save time and effort that you actually needed to spend to be engaged with that person. You shouldn't avoid the work because the work is the point. In these cases, it most certainly does matter what people think and feel as they perform an act. These are the times when doing the “right thing” without insight or self-awareness is a moral catastrophe, as T.S. Eliot famously put it:

The last temptation is the greatest treason:
To do the right deed for the wrong reason.

I think our fast-evolving methods of surveillance and shaming have the same flaw as the apps that outsource intimacy. When we monitor others to make sure they behave—as when we monitor ourselves to make sure we behave—we are outsourcing the work of self-government.

Instead of asking people to decide for themselves, imperfectly as ever, what they should and should not do in carrying out their jobs, we trust the cameras. Instead of affording McHenry her chance to examine her own behavior and come to terms with her conscience, we shame her into an apology. Did she mean it? Does she even know? Her chance to figure that out was taken from her. I can't speak for her, but if that had happened to me, I know I would be the poorer for it. My sense that I am different from the person people can see—that I have in me mysteries, hope and surprise—would be diminished. That is what it means to no longer “differentiate between the person I perform and the person I am.” And it is a terrible thing.

Guess who knew that? Back in the bad old days, when only governments had the power to engage in mass surveillance, the spymasters of oppressive states understood it very well.

When there was still a Czechoslovakia and it was run by Communists, the security forces there tapped phones and bugged apartments of dissidents. One day, to torment the writer Jan Prochazka, they took recordings of his chats with friends and family and broadcast them on the radio. Prochazka was devastated. After all, as Milan Kundera wrote of the incident, “that we act different in private than in public is everyone's most conspicuous experience, it is the very ground of the life of the individual.” Is that worth giving up, to be sure semi-celebrities behave themselves?