I teach a freshman writing class called Digital Culture and Counterculture, part of the purpose of which you might call “consciousness raising.” This meant something once, and I’d like to think it can mean something still. But lately I find that my students don’t quite fit my agenda. The agenda, that is, of teaching subversion.

We start the semester out with Wendell Berry’s “Why I Am Not Going to Buy a Computer,” penned (literally) in 1987. Berry despises what he calls “technological fundamentalism,” the tendency to assume, by virtue of unconscious indoctrination, that everything innovative is good. We hear the voices of this fundamentalism everywhere, Berry charges. And it leads to a sickening superciliousness whereby everything old appears outdated and subject to revision. What about sunlight, pen and paper, and the standard model Royal typewriter he bought in 1956! Berry cries out. What about the sanctity of existing human relationships (his wife served as his editor) and the glorious tradition of writing by hand? At the end of his essay, Berry offers that “when somebody has used a computer to write work that is demonstrably better than Dante’s, and when this better is demonstrably attributable to the use of a computer, then I will speak of computers with a more respectful tone of voice, though I still will not buy one.”

All to no avail. When his essay was published in Harper’s, it generated several heated responses, which the magazine printed perhaps to highlight the fury of computer proponents even then. Berry is a hypocrite, most charged. Berry should recognize the wonderful new possibilities of digital technologies and stop wasting everyone’s time with his crusty, quasi-Luddite critiques. To the magazine’s great delight, Berry responded and put his finger in the dike. He knows he is a hypocrite. The problem of being “a person of this century,” to use his elegant phrase, is that there is no way not to be a hypocrite. We are all plugged into the energy corporations, Berry admits, and most of us guzzle petroleum products in our homes and on the roads outside them like there’s no tomorrow. (Eventually, perhaps, there won’t be one.) All we can do is choose where to draw the line and stick to it.

Berry drew the line at buying a computer. Yet many of Harper’s readers found this attempt at setting a principled example unsatisfactory. They saw his moral scrupulousness as self-indulgent, and his critique of wanton consumption as out of touch. Berry took special issue with this last charge. The root of technological fundamentalism, he argued, lay in his respondents’ passionate, almost fanatical, defense of the status quo:

At the slightest hint of a threat to their complacency, they repeat, like a chorus of toads, the notes sounded by their leaders in industry. The past was gloomy, drudgery-ridden, servile, meaningless, and slow. The present, thanks only to purchasable products, is meaningful, bright, lively, centralized, and fast. The future, thanks only to more purchasable products, is going to be even better. Thus consumers become salesmen, and the world is made safer for corporations.

When we read this passage in class I like to look around the room and notice my students’ responses. Do they identify with Berry’s critics? Are they moved by the ire that animates his eloquent rebuttal? Typically they seem unmoved, gazing forward at me as if I’m giving a TED Talk. Judging by the papers I receive a few weeks after this opening discussion, they find Berry’s argument unconvincing, partly for good reason. Berry was writing before the Internet and had no idea how significant computers would soon become. On a certain reading, his critique is myopic, unimaginative, and flat-out wrong in light of recent history.

One glaring error students often point to is Berry’s insistence that computers lack any political utility. “I do not see that computers are bringing us one step nearer to anything that does matter to me: peace, economic justice, ecological health, political honesty, family and community stability, good work.” Naturally, college freshmen evaluating this claim in 2013 have plenty of ammunition with which to gun it down. They seem to take great relish in highlighting Berry’s inaccuracies, as if invalidating him validates some unknown voice in the back of their heads which they know must be right.

Very few students take issue with technology in the terms Berry provides; instead they prefer the more up-to-date Douglas Adams and his 1999 essay “How to Stop Worrying and Learn to Love the Internet.” Adams himself is great at highlighting the unsightly myopia that tends to afflict writers like Wendell Berry. But his argument essentially turns on lauding all innovations as if they’re equal:

I suppose earlier generations had to sit through all this huffing and puffing with the invention of television, the phone, cinema, radio, the car, the bicycle, printing, the wheel and so on, but you would think we would learn the way these things work, which is this:

1) everything that’s already in the world when you’re born is just normal;

2) anything that gets invented between then and before you turn thirty is incredibly exciting and creative and with any luck you can make a career out of it;

3) anything that gets invented after you’re thirty is against the natural order of things and the beginning of the end of civilisation as we know it until it’s been around for about ten years when it gradually turns out to be alright really.

Apply this list to movies, rock music, word processors and mobile phones to work out how old you are.

Yes, isn’t that cute. We’re all indebted to the prejudices of our time. Perfectly natural that our parents and grandparents distrust the Internet and still worry about “privacy concerns.” They’ll be dead soon, anyway.

It would be nice if my students could synthesize Berry’s moralism with Adams’ pragmatism and come up with something more durable than either of them did. But most side with the pragmatists’ argument. After all, what choice do they have? None of them could get their schoolwork done without computers. And social life would be unimaginable without all their friends on Facebook. To preserve their sense of self—to preserve their sense of how the world works and how it should work—they have to argue against Wendell Berry; they have to resist his old-fashioned moralism even as they sense him breathing down their necks.

We came to a possible turning point last week when we discussed online dating. I assigned a 2011 New Yorker article by Nick Paumgarten called “Looking for Someone: Sex, Love, and Loneliness on the Internet,” thinking it would spur a good conversation. At first they were reticent as usual. We talked about the positives and negatives of this quintessential hallmark of digital culture, and the big sociological shifts that enabled its formation. According to Paumgarten and biological anthropologist Helen Fisher, the rise of Internet dating rests on three major turning points: 1) the massive influx of women into the workforce, 2) the introduction of the Pill, and 3) rising divorce rates, all of which came to a head in the U.S. after 1945. As Fisher puts it, “Our social and sexual patterns have changed more in the last fifty years than in the last ten thousand.” Consequently, “our courtship rituals are rapidly changing, and we don’t know what to do.”

I hoped the existential implications of this dilemma would be manifest as we surveyed the contemporary dating scene. Match, OkCupid, PlentyOfFish, JDate, eHarmony, Chemistry (Fisher started this one under the auspices, and on the payroll, of Match’s parent company, InterActiveCorp), HowAboutWe, ScientificMatch…the list is nearly endless. All of these sites use different algorithms and presumably cater to different market niches. But the underlying principle is the same. According to Paumgarten, ScientificMatch “attempts to pair people according to their DNA, and claims that this approach leads to a higher rate of female orgasms.” Yet this only takes the approach of tamer (less ambitious?) sites to its outer limits.

What online dating is all about, I impress upon my students, is the principle of scientific management. We are all familiar with how this works in practice. When we find ourselves in the toothpaste aisle at the grocery store (likely a supermarket), we know that the available brands and accompanying brushes have all been vetted by multiple experts. This same knowledge applies to every consumer product: to cars, televisions, and of course, our personal computers. To live in the modern world, it seems we have to learn to depend on experts and the principle of scientific management. Otherwise we’ll be left behind in a fog of bad smells and other inefficiencies.

But where do we draw the line? At what point do we stop turning our lives over to scientists and their unimpeachably useful index of algorithms?

To dramatize the stakes I like to pose the following scenario (I’ve used it twice now, this semester and last). Imagine that some time in the not-too-distant future a new online service has been developed. If you choose to use it, this service guarantees you a detailed account of how and when you will meet each of your romantic partners for the rest of your life. Names, dates, descriptions of physical proportions and breakups—everything is there, and upon reading it your fate is sealed. It is up to you whether or not to use this service. But the technology is available. The algorithm has been perfected. Instead of the messy, haphazard process of sorting your way through lived experience, going down this path blindly with this person, going down that path blindly with another, you can have complete and total certainty. There is no longer any margin of error.

After presenting this scenario in the eeriest tone I can muster, I ask my students by a show of hands how many of them would choose to use such a service. Their answer, at least as late as February 2013, always depends.

“H.G. Wells once said, coming out of a political meeting where they had been discussing social change, that this great towering city was a measure of the obstacle, of how much must be moved if there was to be any change. I have known this feeling, looking up at great buildings that are the centres of power, but I find I do not say ‘There is your city, your great bourgeois monument, your towering structure of this still precarious civilisation’ or I do not only say that; I say also ‘This is what men have built, so often magnificently, and is not everything then possible?'”

-Raymond Williams, The Country and the City

Ever since becoming a graduate student, I’ve slowly forgotten how to alchemize thoughts into an intriguing set of words. This is troubling given the nature of my obligations, and I’m grateful I can still submit crap to my professors that entails a deceptive gloss of accomplishment. But when it comes to my extracurricular scribbles, I’ve descended into a state of confusion. It’s as if my head has erupted in civil war, where all parties are desperate for an armistice or ceasefire, and yet the hostilities endure. I’m still not clear as to the cause of the conflict, but I have my suspicions. Consider this reportage from the wreckage — a kind of gonzo journalism as applied to a warring mind.

I.

I enter that grotesque simulacrum of society known as Facebook about four thousand clicks a day. This is largely due to my study routine, which involves hours on end cozying up to three to seven books a week. My laptop is my portal into the effects of reality, where reality is now defined by the anxious escape from reality, a virtual stage crowded with digitally cropped anthropoids performing various boasts, vents, and poses. I occasionally take part in the mad rush, because I’m just as implicated in the lonely sociability as everyone else, but my regular preference is just to sit still and watch. (Lately I’ve been participating beyond my comfort zone.) I have a hunch we’ve always been escaping reality, and the most updated medium for doing so only makes the point as sharp as a blade. After all, my reality of three to seven books a week can scarcely be deemed more real.

Since Sandy Hook, my feed has been bombarded with pro-gun propaganda, as well as a variety of other right-wing hysterics. About a third of my friends are Marines, most of whom I served with in Afghanistan. They’re not happy. President Obama is compared to Hitler and Mao. A few push a website that encourages active-duty personnel to stop obeying orders from a tyrannical regime. Proud visuals of personal arsenals come to the fore, along with tips on where to acquire the cheapest or most badass AR-15, presumably before the liberal junta rolls in with the tanks. One posts a link implying the shooting was a hoax, a government “false flag” operation intended to muster the emotions required in order to pass more aggressive gun laws. Someone “likes” it, and then someone else “shares” it. Comment threads are born, and fresh paranoia is exchanged.

While I normally avoid engaging in such fare, I feel a responsibility to talk it out with my erstwhile comrades. I really do like these guys, and I don’t want them to waste their energies chasing red herrings, especially since most of them are working-class. I can’t stand seeing them obsess over modest gun-control regulations once supported by the likes of Ronald Reagan and Richard Nixon, just as they buy into plutocratic lies blaming liberals, union members, and poor people for our nation’s dearth of employment and educational opportunity, not to mention their increasing sense of disempowerment. (Although the Democratic Party is no doubt complicit in the arrangement.) I so want to leap over the electronic canyon and share a pitcher with them on the other side. I want to shoot the shit like we did in Helmand a couple of years back. I want to complicate their prejudices while prodding them away from a politics of resentment to a politics of thoughtful resistance. I want them to recognize the inextricable link between economic power and political power, and the necessity of attacking (and claiming) both, not through a naive and escapist libertarianism but by way of the kind of workplace democracy and community autonomy put forth in Gar Alperovitz’s America Beyond Capitalism or David Schweickart’s After Capitalism. In other words, I want them to read what I read, and I want them to think what I think.

Instead, I spend hours dissecting each and every claim made by a Sandy Hook “truther,” a young man who once served as one of my top sergeants. Every time I knock down an absurdity my interlocutor replaces it with another. Despite the effort, nothing is conceded. I remind him I’m a gun-owner, something I originally avowed in Afghanistan, before we set off on a mission. (My Marines always enjoyed hearing I owned a gun. Or rather I enjoyed telling them.) Since I’m a Connecticut native, I’m asked if I knew anyone involved in the massacre. I tell him my mother’s co-worker lost a son, and another acquaintance a niece. He proceeds without acknowledgment. My final words are conciliatory. “I get it,” I say. “It’s very hard to trust anyone these days. Everything is MASS — our government, our corporations, our media.” I go on to attribute the distrust more to technology, inhumane scales of social interaction, and anomie than to Hollywood-style villains. “But you’re right,” I say. “A lot of people in power get away with a lot of terrible things. I feel ya man.” He responds in kind, thanking me for a worthwhile debate. In the weeks ahead, he stops with the conspiracy theories. The fury persists.

II.

In late January 2011, I find myself back in California, after nearly a year’s duty in Afghanistan. That summer, I return to the northeast as a civilian, for the first time in five years. I am disillusioned with the war and my service more broadly, but I’m also just beginning to recoil from an upper-middle-class milieu that strikes me as self-satisfied and clueless. I realize the self-importance of the reaction, but I can’t help it. I attend a high school reunion, and while I drink myself to a state of ostensible equilibrium, I’m pissed. There are a handful of confidants who know what’s up, but the bulk of the young professionals saunter across the marble as if the world were as deep as their pockets and as wide as their gaze.

That fall, Occupy erupts. I’m not an activist, but my sympathies have no doubt shifted to the left. The slide began at boot camp, when I learned what social inequality means. Its meaning is ugly and angry, and not the stuff of a sustainable republic. I’m already on amiable terms with left writers involved in the Wall Street protests, so I make my way to events in the city. I donate multicolored duct tape for the occupiers, at their request. I write a couple pieces online. I debate right-wing friends from college to the point of exhaustion. That’s the extent of my activism.

Now I’m too busy in school to do much of anything. I maintain a blog, just barely, and follow up on the latest from periodicals like Dissent, The Baffler, n+1, The New Inquiry, and Jacobin. I even have a Twitter account, which I sometimes scroll when Facebook’s procrastinatory utility is expended. I follow a handful of radicals. They’re brilliant, and there’s a contingent that’s serious about reaching larger audiences. But the greater part of the tweets makes me nervous. There’s a 140-character limit, and a recklessness of cool pervades every syllable, replicating the capitalist status anxiety they’re so intent on subverting. Though I agree with eighty percent of what they have to say, they’re not saying it to America. They’re saying it to themselves. And if they say it to America the way they say it to themselves, they might as well not say anything at all.

III.

There’s a small town an hour south of Atlanta. I know it well. My grandmother had twin sisters who slunk below the Mason-Dixon line after World War Two. They fell in love with returning sailors in Manhattan, both of whom were southerners. The twins exchanged their Jewish tenements for the Bible Belt, and never looked back. One of the two couples ended up in Williamson, Georgia, and one of the daughters from that marriage stayed in the area, married a soldier, and raised two sons of her own. She also reconnected with her relatives up north. When I went to college at Emory University, I savored weekends and holidays with her family. I fished with the neighbors, partied with the young adults, and frequented events at the local church. I even got invited to the “Caboose Club,” made up of a group of older men (including the mayor) who meet once a week in the abandoned car of a freight train, to nibble on eggs and sip on coffee at the break of dawn. A Confederate flag presides across one wall, and an elk head obtrudes from the other.

This past visit I go horseback riding with a gentleman who recently lost his wife to cancer. The two of them, close friends of my family in Williamson, attended my boot camp graduation at Parris Island back in 2006. They greeted me with photographs of the husband as a recruit on the same island, right when Vietnam was trudging along. He won the company’s highest shooter award then. I barely passed the minimum requirement, and the three of us chuckled about the disparity. In the pickup truck, on the way to the campground and park, with the horse trailer rumbling behind, I nervously deliver my condolences. “I know she always wanted to teach me how to ride,” I say. “I’m just grateful you’re going out of your way to fulfill the promise, especially at such a difficult time.” He nods his head. There’s a moment of silence. “No,” he says. “This is exactly where I want to be.”

About an hour later, on the trail, after a good forty-five minutes of traipsing through rolling woods, he turns back to me. “Hey young man, you ready to run?” “Yes, sir,” I say. “All right, hold fast to the bridle with one hand and grip the back of the saddle with the other.” “Got it, Sir.” “And watch your head.” And then we’re off, cutting sharp turns and ducking hanging limbs. (On Facebook, later that day, I’m quick to brag about it as my CLINT EASTWOOD MOMENT.) We stop at a burger place on the drive home. “Sleepless in Seattle” booms from the television. Plaques adorn the restaurant, most of them bearing religious aphorisms. My elder reads off one of them. “Love is the key to happiness.” He grunts. I take it as a cue. “I tend to think life is about something more complicated than happiness,” I say. “If I’m forced to say what that is, I guess I’d say it’s about struggle.” He seems to agree. We finish our burgers.

That evening, I attend a birthday party at the church. It’s for another one of my dear friends in Williamson, an 87-year-old man who taught me to fish my very first stay. The space is packed with over 170 people. I’m seated across from a garrulous schoolteacher who discovers I’m a PhD candidate. He asks me what I study, and I tell him American history. He lights up. He says American history is his favorite subject, and we Christians ought to cherish our national heritage, especially now that it’s under attack by a Marxist president. I try to meet him halfway by waxing romantic about community. Meanwhile, dozens of lively personalities ascend to the stage to reminisce about the honored guest. They speak in thick drawls, stringing one charming colloquialism to the next. I’m overwhelmed by their ease of speech and theatrical force, as if they were all studied actors or comics. At the same time, there’s a sincere flood of emotion.

When I climb up to the platform with my cousin, she’s already in tears. She had it rough through the years, with dysfunctional parents and brothers addicted to drugs. The birthday boy served as one of her great ballasts as she established a glowing clan of her own, like an angel dispatched from the heavens. This is a running theme with a number of the speakers. I follow her moving words with Jewish self-deprecation, which half the room seems to appreciate. I talk about the city versus the country, and how Williamson taught me about the virtues of the latter. I tell the audience I love my family down south and that I love their town. I tell the celebrated octogenarian, just a few feet below me, that I love him too. I almost cry. He beams. Everyone claps. I return to my seat.

I have long been surrounded by, interested in, and comfortable with people who hold themselves to standards that would seem, to any jury of fellow human beings, to be almost impossibly high. At one point, the drive for excellence in all things became such an annoyance that the “p” word was banned. I’m not talking only about the people who come to the public’s attention for doing so, like someone who wins an Olympic medal or attains the stature of Albert Einstein, but about people in all walks of life, whether or not any part of that walk ever takes place in or anywhere near the public spotlight.

For the sake of this little essay I’m thinking of a kind of everyday, rather humble, self-critical kind of perfectionist.

The standards this perfectionist invokes are of every variety, especially those involving what might seem to others matters of almost minute importance. Meeting these internalized standards can turn on what appear to be small decisions, but keeping them (or not) can be of enormous importance to the person involved or, so they think at least, to the people around them. These decisions can have long-term consequences and in the meantime greatly affect everyday life in the present. Or they can have little real effect in the outside world and remain matters solely of internal importance.

Of course we must admit it is not really accurate to say there are such people. Freud gave us clear terms for understanding the predicament at the heart of our social connection to other people. In Civilization and Its Discontents, he laid out the situation. For collective life, we give up so much of what we by nature and instinct think we want. Our conscience becomes the arbiter, where we come to harbor the ideals for our behavior, and the cost of violating the internalized social standard is, as Philip Rieff, one of Freud’s interpreters, put it, the “terrible cost of guilt.”

So, to some degree everyone–at least everyone trying to live in some degree of relation to fellow human beings in anything close to a peaceful, functioning social order–has a strain of perfectionism. This notoriously both helps and hinders us. Only the finest of lines separates the kind of perfectionism that allows someone to live from day to day with smooth sailing from the kind that causes our most wrenching inner agonies.

So much of what occupies us from day to day–whether it is in our work or entertainments, idle gossip or philosophy of life, struggle with addiction or search for spiritual understanding, adventures and woeful misadventures in love–can go either way, at any time. It can call up those inner ideals, as if mustering so many soldiers for duty, and find us seriously wanting. Punishment ensues, with luck only after judicious consideration but often in actuality by kangaroo court.

To think of all of the cultural production–the books, poems, paintings, buildings, operas, plays, films, meals–that has something to do with the oft-troubled nexus of desire and ideal is, most likely, to gaze upon the entire history of mankind’s expression. In a recent article on Huffington Post’s blog (12/21/2012), journalist Jim Sleeper, referring to novelist D.H. Lawrence, called it “the eternal tension between impulsive, selfish desires and deeper strivings toward a common good.” In Lawrence’s word to the wise, quoted by Sleeper: “It is the business of our Chief Thinkers to tell us of our own deeper desires, not to keep shrilling our little desires in our ears.”

Perfectionism’s taproot is sunk deep here, if anywhere. And rightly so, to remarkable ends. From a painted ceiling that tells the entire human story to glorious church spires reaching to the sky, our striving toward the ideal is responsible for much, if not all, of the beauty and majesty that exists in the man-made portion of the world.

Yet we know that even the mortals by whose hands immortal-seeming creations are delivered often traversed inner struggles of unseen magnitude. Glimpsing our ideals, let alone aspiring to them, can bring us to the very edge of the abyss.

The urge toward perfection is responsible for so much that is good, but at least in equal proportion, so much that is bad. We see this in the utopian impulse. Who wouldn’t want every single aspect of human life to be reformed and put in line with the highest ideal we can imagine? Only those who know the horrors of totalizing movements to remake all of existence, even at the cost of human life, both physical and spiritual. It is sheer heroism that makes it possible for us even to continue to speak of ideals in the aftermath of such destruction.

Though it must be put in an entirely different light, perfectionism on the more modest scale of the individual can also have devastating effects. One of Freud’s main preoccupations was, after all, the debilitation and suffering of the human person. Not only can instinctual drives force us to veer off course, but so can the very ideals inculcated by our upbringing and encounter with the outside world. A superego out of control can damage everything around it.

*

Everyday, rather humble, self-critical perfectionism takes as many forms as there are people so inclined: a father faults himself for not spending more time with his children when they were young; a wife suffers inner torment for having violated her wedding vows; a graduate student painfully regrets missing a deadline; feeling impure or rejected, a teenager commits suicide; a child erupts in anger for accidentally coloring outside the lines. The results of this kind of perfectionism can be, in some cases, insignificant and even laughable, but in other cases nothing short of fatal.

In the less significant cases, we get a lot of mileage from the details. While the serious illness involved in a true, full-blown obsessive-compulsive disorder is another thing entirely, the milder tendency to keep things a bit too orderly is often something we find amusing. The television show “The Odd Couple,” about two men sharing an apartment, one sloppy and one obsessively neat, is a well-known example. While some people have these tendencies more strongly than others, we are all acquainted with them to some degree. So when we laugh, we are laughing at ourselves.

But much of the time, the people closest to me whose perfectionist tendencies are a bit too strong to be considered mild, if (blessedly) too well controlled, most of the time, to be anywhere close to fatal, are not laughing. In their (our) struggle to adhere to the standards for behavior humans set for themselves, they must face tremendous, sometimes ever-renewing disappointment. Intellectually, we might grasp the idea of human limits and shortcomings. If we fall short of what we think is good behavior, we can repeat several mantras: we are all flawed; we all make mistakes; we are not gods; we are not God.

Does this bring comfort? This intellectual proposition seems like all we have, but to derive any comfort from it we need the emotional content of the acknowledgment of our limits, and this must be a renewable source for those whose capacity to feel self-disappointment seems sometimes to know no bounds. Is there any time when our limited being, flaws, mistakes, and all the rest of the things we are, have, and do, can actually be a comfort? This is one focal point of religious and spiritual tradition, of course, but even then, teachings must be borne out in experience for them to provide succor.

Not emanating from the same place as arrogance or delusions of grandeur, everyday perfectionism does not necessarily prevent us from functioning in the world on a day-to-day basis, however much it threatens to do so and even manages to in some periods. It can be everything from a constant nagging distraction to a source of anxiety, inner turmoil, and disconnection from others. Even when one manages to refrain from holding other people up to one’s list of standards, which can be no small feat in itself, one can cut oneself off from living in true relation with others, lost in this inner realm of standards and disappointments.

If examined in isolation, the individual can look so meager and inadequate, especially in light of his or her own ideals. The flaws are all too readily apparent.

But isn’t it the case that when examined in connection with another person, it is precisely what we internally determine to be a shortcoming that might have some role, purpose, or meaning? What if it turns out to be just what another person needs, even for mere survival? What if an error in one light is the source of another’s completion in another? There are so many other possibilities.

If we change the vantage point and stop looking merely at ourselves, as if in a mirror, but broaden our perspective to include other people–sometimes even just one other person–we might find the perfection we crave.

*

On a light note, as partial illustration perhaps, a couple of years ago, I was sitting with a dear friend late at night at a New York City diner of the humblest sort (both the friend and the diner). We ordered some of the stock diner fare on the menu and ate and talked. When the waitress saw we were not eating anymore, though there were a few fries and things left, she began removing the dishes. My friend asked her to leave everything, though we had been there a very long time. We were surrounded by empty tables, and she made it clear we were welcome to stay and kindly poured us more coffee. After the past couple of decades of eating in setting after setting in which the unspoken rush to get the dishes off the table sets the tone of things, I was curious. “Why do you want her to leave everything?” I asked. His answer was sweet and moving. He said that in times like these he wanted to be able to see all of the “remains” as long as possible, a beautiful sign of our enjoyable conversation and time together.

What might have appeared to another person, whose cleanliness drive is near the top of their inner standards, to be a mess in dire need of removal was now to me a gift.

In the case of internalized standards and the vexations of our own perceived shortcomings, in certain particular cases (not all, of course) the very thing for which you are raking yourself over the coals might be the thing that gives me joy, interest, hope, or wholeness. I can think of other examples, large and small, of how what looks like a shortcoming from one vantage point is the very essence of perfection from another. Can you?

The fact that we’re currently engaged in a serious debate over gun control is a very good thing for this country. No one can deny the issue’s urgency, or the stakes involved. But the fact that this debate has emerged as the main response to the tragedy at Sandy Hook last month is telling of another problem. To be sure, political thinking is indispensable for addressing political problems like gun control, and Americans have spent a long time training themselves in the arts, idiocies, and nuances of politics. Yet not all problems are merely political. Not everything can be solved by legislation, and we cannot expect something like the high rate of gun deaths in this country to drop simply as a result of better laws and policies. The root is culture.

Many liberals distrust this line of argument because they think it’s a cover for the main tenet of the NRA: “guns don’t kill people, people kill people.” But my purpose here is not to endorse one side in a debate. I don’t agree with the NRA on gun control; I don’t agree with their interpretation of the Second Amendment; I don’t agree with them on anything, really, except for the obvious sense in which the claim above is true. People die from guns because one person uses a gun to kill another person. If guns possessed the power to wreak havoc outside of human hands altogether, we would be in a lot more trouble than we are. They would indeed be mystical fire sticks, which they are not. The issue, then, is how we should regard the human capacity, desire, and intention to obtain and use guns. Our concern—liberals, conservatives, everyone—should be with the cultural logic and psychology of gun use, not with any misplaced superstition about the demonic character of this quintessentially human tool and artifact.

In what follows I want to propose that the tragedy that occurred at Sandy Hook Elementary School is part of a larger cultural dilemma, which dwarfs the present political debate over gun control in both its complexity and stakes. We can begin to grasp this dilemma by asking what precisely was tragic about the event. Was it the death of so many innocent people? The fact that so many of them were small children? Or that any person could be warped enough to commit such a heinous act? All of this is devastating, and I too wanted to cry when I saw the news on TV. But for some reason I couldn’t. I was with my family in Phoenix, Arizona; that afternoon we’d taken my mom to see a lunchtime ballet called “The Snow Queen” for her birthday. We went to a nearby Ethiopian restaurant afterward, and the place was empty except for a few people sitting at one table staring at a television. We sat down at our table and looked at our menus. Out of the corner of my eye I noticed that the TV was reporting breaking news. It was unclear at first whether the story was local Phoenix news or national, but when I stood up and walked toward the screen I saw the headline: 26 dead at school massacre in Newtown, CT.

Alongside the Ethiopian family at the other table, we all reeled in sadness and disgust. How could someone do this? What possible cause could explain something this atrocious? More details flooded in as we struggled to order our meal. Meanwhile one of the family’s children, probably a four- or five-year-old, walked between the empty tables smiling at whoever would make eye contact. Of course the gunman killed himself. My step-dad said something about the need for more gun control and I immediately shot back that this has nothing to do with guns, or that it only has something to do with guns in a secondary or tertiary sense. What this incident tells us, I began to argue to my family, is that there are deep-lying psychopathologies in our society and culture. As Christy Wampole wrote in a New York Times op-ed (“Guns and the Decline of the Young Man”, 12/17/12), which I read some time later, “there is something about life in the United States, it seems, that is conducive to young men planning and executing large-scale massacres. But the reasons elude us.”

When my parents asked if we could isolate the cause, I repeated a refrain I often use to explain my view of history and culture: it’s like a complicated physics equation. The forces are so many and so interrelated that we can’t isolate any particular one as the root cause or agent. I disagree with Wampole that the underlying issue is the decline of “the heroic young man,” or any other easily definable change. An event like Sandy Hook was the product of much larger forces, forces almost as mysterious and complex as the Big Bang. There is no simple explanation or solution; appealing to the modest efforts of legislators to fix this problem is a lot like tossing a coin into a fountain and wishing for world peace.

*

Since that afternoon in Phoenix I’ve watched the response and ensuing debate unfold with scarcely any attention to the deeper layer of culture. In a front page story five days after the shooting (“Virginia Tech, Fort Hood, Aurora, Sandy Hook…”, 12/19/12), USA Today reported that on average one mass killing occurred somewhere in America every two weeks between 2006 and 2010—and that the number was likely to rise after records for the last two years became available. This finding was based on FBI statistics and the agency’s definition of mass killings. Yet on the whole the article provided little beyond shock value. It ended with a quote from Rep. Carolyn McCarthy, a Democrat from New York and “a leading proponent of tighter gun laws,” then the following disclaimer: “For all the attention they receive, mass killings still accounted for only about 1% of all murders over those five years. More died from migraines and falls from chairs than mass murders, according to death records kept by the Centers for Disease Control and Prevention.”

I didn’t happen to catch the Sunday morning talk shows a few days earlier, but I imagine that by comparison this was in-depth analysis.

I had a more substantive experience when I decided to see Quentin Tarantino’s new film “Django Unchained” on Christmas night. Since its release the film has generated commentary from all manner of critics. But few have noted the unintended symmetry between its form and content and what happened in Newtown, Connecticut. In fact, most reviewers seem to have missed entirely what I take to be the film’s central moral message: that culture is the root of all evil, American and otherwise.

Their collective myopia is perhaps best summed up by Anthony Lane of The New Yorker, who wrote in the 1/7/13 issue that “Django” remains caught in its director’s familiar “tangle of morality and style”:

Tarantino is dangerously in love with the look of evil, and all he can counter it with is cool—not strength of purpose, let alone goodness of heart, but simple comeuppance, issued with merciless panache.

Lane made this tone-deaf accusation about a film that tackles what is perhaps the most morally reprehensible, and ambiguous, fact of American history—slavery—and does so in vivid, grisly, meticulous detail. In another New Yorker article (“Tarantino Unchained”, 1/2/13), historian Jelani Cobb puts his finger on the real issue that seems to irk many of the film’s critics, social and otherwise: “not even an entertaining alternative history can erase our actual conceptions of the past.” Fellow filmmaker Spike Lee agrees with this sentiment so totally that he condemned “Django” for historical inaccuracy and refused to see it. What accounts for the vitriol?

Tarantino has elicited strong reactions from viewers for a long time, not least because the prevalence of violence in his films makes him a poster child for the unending controversy over how disparate media affect society. But the violence in “Django” is far from merely stylized or aesthetic. In nearly every instance, it is attached to a series of historical facts and actors that dramatize the ways in which American slavery, among other institutions, depended upon a perverse system of violence, organized according to its own logic. In rich and distinctive ways, each of Tarantino’s three main characters partakes in this rationalized violence, and in the end justice comes only in a bloodbath.

It’s true that this is not a pretty image, especially at Christmas time. But neither was the Civil War. What Tarantino’s critics appear to miss is that he is a modernist artist through and through, intent on transgressing established rules and binaries at the same time he envelops us in his abstract depictions of the real and the imaginary. As the writer-director himself put it in a recent series of interviews, “I’ve always aligned myself with that hip hop idea of taking things that already exist and riffing on them and creating something new.” He wanted to deal with slavery “in an operatic way,” within a specific genre—the spaghetti Western—but in a way that still “tell[s] the truth.” He admitted that he was troubled by many aspects of “Django” during the writing and filming process; but he also felt that he was serving a higher social purpose: “We still can’t deal with the sin of slavery,” he explained to one interviewer. “That’s why we have to lie about it by omission. I think people need to look at things like this. That would be the beginning of healing.”

Indeed, part of “Django” was shot in old slave quarters, and according to Tarantino “you felt the ghosts, bearing witness.” He reined in the most graphic violence in the final cut, making his “the PG version” of slavery. The film is rife with anachronisms and historical inaccuracy; its opening even announces that it takes place in 1858, “two years before the Civil War” (which started in 1861). But the point is that this is a historical parable, cast at one of the original sites of American identity and at the nexus of all its cultural complexity. The ugly fact of Tarantino’s ultimate moral message (in the end, vengeance rules the day—that’s the American way!) comes close to the truth by virtue of its dexterity and its visceral quality. Whatever the director’s explicit intentions, this is what his film evokes. It is not history per se, but so what?

As Adam Serwer acknowledged in Mother Jones (1/7/13), “perhaps lovers of Westerns may even walk out understanding what some of their most memorable ‘heroes’ were actually fighting for.” Yet many of Tarantino’s critics remain unwilling to recognize his achievement. They decry his aesthetics, his bad history, and his lack of social conscience, all of which may conceal a different sort of politics that operates according to its own logic. Much as it is in the interests of USA Today and CNN to stay within the staid format that promises high sales and ratings no matter how badly this affects the quality of public information, Tarantino’s vested critics must do everything they can to invalidate his claim to cultural resonance. As was the case in earlier modernist epochs, the critics are the most adroit protectors of the status quo, for they are often among its prime beneficiaries.

In Tarantino’s case, it seems to work best to identify him as a “stylist,” clearly not a responsible historian or social critic; to suggest that his rendering of 1858 Mississippi—drenched in equal parts Black Power fantasy and antebellum realism, nothing approaching the humility or homage of Spielberg-like propriety—is ahistorical, and offensive to boot. By this logic, the director and his scurrilous vision of art must be wrong. He couldn’t possibly be telling us anything important about our culture or society, certainly not in the wake of a tragedy like Sandy Hook. After all, shouldn’t we be arguing about gun control?

*

The vagueness of what Tarantino’s critics think he got wrong is mirrored in the imprecision with which many good liberals are trying to pinpoint the problem Sandy Hook evinces. In “The Talk of the Town” (The New Yorker, 1/7/13), Hendrik Hertzberg writes that “America is alone among the advanced democracies of the world in suffering from an unending epidemic of gun mayhem. Are our politicians so much more cowardly, our legislators so much more corruptible than theirs? Or is our creaking, clanking political and governmental machinery so clogged with perverse incentives and exploitable bottlenecks that getting anything done requires our elected leaders to be more courageous (and our citizens more engaged) than theirs ever have to be?”

If only the question—and its presumed answer—were so simple.

The root is culture. The rest lies atop its enigmatic substratum. If we’re to address our social problems with any chance of understanding them deeply, we need to know our limitations and start thinking more like Quentin Tarantino.

Why is it that we—our hearts, anyway—become so set on loving not just other people but other particular people? Doesn’t a huge portion of our trouble in life come from this simple, but seemingly unavoidable reality? Why do our hearts have to be so discriminating? Who are they to know, anyway? Why is it not possible for our minds to play a simple trick and insert a different face into the frame of our affections?

After all, we are living in the “age of mechanical reproduction,” as Walter Benjamin wrote about art, but as now seems to apply to nearly everything. The whole concept of the Internet and allied technologies is that they can create a virtual world. If we can do that, what will be left to desire? And what desire will be left with which to desire it?

In huge swaths of our lives, the new technologies have insinuated themselves in mind-bending ways. As they change everything from how we conduct our personal lives to how we perform our work, they do so in very precise ways the merits of which we can pinpoint, analyze, and debate. Yet they also do so in more sweeping ways that are harder to assess. One of the underlying assumptions driving so much of the breathtaking innovation and many people’s open-armed embrace of what the new technologies have to offer is that more and more parts of life can be simulated. It is reminiscent of—and not unrelated to—the drive, in robotics, to simulate the human being. Tellingly, such efforts at mechanical cloning always falter when it comes to the replication of human emotion.

In the realm of the affections, there are concrete reasons why we might search for someone to stand in for someone else, yet it is this precise realm that seems to resist all such efforts. We might understandably wish to come up with a way to simulate the strong feelings that we feel for a particular individual so as not to be so beholden to him or her. The reasons include the usual suspects, ones that have been with us long before the microchip: inaccessibility of the object of our affections; grief at the loss of a lover or companion through separation, death, or another twist of fate; or just the waning of passion caused by a mundane series of disappointments, mishaps, or misunderstandings. And the attempts to achieve such emotional simulation reveal a history of inventiveness that most likely maps directly onto the entire span of human history itself. But just as impressive as these efforts is the mountain of evidence we have accumulated for what they add up to: naught.

Unless we find a way to fool ourselves better, it is highly improbable that we can ever manage to disabuse our hearts of their true commitment to a particular person. And it is also questionable whether we should even try.

In this series of posts, I have been mulling over the completely moot question of why we should wish to desire at all, given all of the foreseeable and unforeseeable pain that it inevitably causes. The pain I had in mind was the searing-cold-knife-blade-into-the-chest variety that can follow upon what is or is thought to be the permanent loss of the beloved; the kind that seems to empty one’s inner self of everything that fills life with joy and meaning; the kind that drove Puccini’s Tosca to leap to her death after she found out that her beloved Mario had been wrenched forever from her loving arms by a death squad. His staged execution, the intended act of simulation, turned out to be excruciatingly real.

But there are other painful emotions that fall far short of this extreme of tragic loss. The full experience of these other affective states can be extremely difficult and unpleasant and can also make desire seem undesirable. Our culture, while pretending to be all about desire, actually removes its possibility in the name of our supposed protection and therapeutic healing, making all strong emotions that are the basis of exclusive attachments suspect.

One of these painful emotions is jealousy, an emotion that would not exist if there were any chance of a kind of saving reproducibility of love. Jealousy is something that seems to have no place in today’s world of self-possession. At best it is a sign of immaturity and insecurity; at worst it goes against one of the shibboleths of today’s consumer culture, the commandment to smile for the camera and appear to be having fun at all times. It smacks of grasping, smothering, restricting, controlling, possessing. It seems unnatural, a sign that self-interest has taken over. Or it is pathetic, a sign of weakness, suggesting that one should get a life (presumably not this one).

*

This morning, just as I was opening my eyes to the new day, I was flanked by my dog, who had taken advantage of my sleep state, as he does each morning, to settle in where it was warm and cozy in my arms rather than in his own dog bed. My daughter approached to wish me good morning and, seeing my canine companion, greeted him affectionately. At that point I greeted my daughter. As recurs in this situation with unbroken regularity, a loud and pained sound, half squeal of delight and half agonized moan, issued involuntarily from deep within my dog’s throat. The sound is as unpleasant as the proverbial fingernails on a chalkboard: at once irritatingly loud and ridiculous. Normally perfectly well-behaved and the model of true gentlemanliness, this animal comes closest to objectionable behavior at these moments. It is obvious why he reacts to them the way he does. He’s jealous.

This is a member of one of the dog breeds best known for getting along in households with multiple members, for making a deep bond with each and every member of a family, from infants to octogenarians. There are other breeds that have a tendency not to get along as well with children as others do, but this is not one of them. And some breeds are notorious (or famous–this is part of the point I aim to make shortly) for developing a primary bond with only one person. He is not of that ilk either.

Yet, even when this little dog is ensconced in the very embrace of one member of his human pack, a moment of potentially divided affections can nearly rip out his heart from the sound of it. Only a major show of affection on the part of all of the parties involved in this sudden emotional crisis, this paroxysm of panic, can reassure him enough to get him to calm down. There’s no question of falling back into the trance state of unbroken connection. That will have to await another time. It’s as though we have to reassure him that he will still be loved if one of us shows affection for another.

Canine jealousy suggests that the constellation of emotions provoked by the feeling of possible or impending loss of the affections of the beloved to someone else is not, as so many communes and other reform movements in the nineteenth- and twentieth-century U.S. have made it out to be, just another bourgeois claim to ownership stemming from our indoctrination into the wholesale system of private property. Dogs don’t own property, last I looked. Is it possible that those emotions–the fear and unease, the anxiety and panic–are part of the experience of loving and being loved when that love is at its most natural, not least?

*

This is not at all an endorsement of jealousy. It can and should provoke the emotions of fear and worry that lead toward prevention; we should be vigilant in keeping it from wreaking the total destruction of which it is capable. At its worst, it can be fatal, as our notions of “jealous rage” and “crime of passion” suggest. At its best, even, it can make us look and feel ridiculous. My dog always looks a little sheepish when his panic subsides and he realizes the interloper that was out to snatch away his blissful connection was a phantom.

When jealousy is untrammeled by countervailing resources at our disposal, phantoms can be as destructive as real threats. Jealousy is as often as not unfounded. In a foreshadowing of the terrible loss she was to sustain, an irrational jealousy concerning Mario had plagued Tosca. She continually harbored suspicions that he must be making love to other women; yet, in fact, his love for Tosca was as ultimate—as inimitable—as her love for him.

But here is where we can glimpse the role jealousy can sometimes play in our most intense human attachments. It teaches us, in no uncertain terms, the answer to that adrenaline-pumping wolf howl of George Thorogood’s, “Who Do You Love?” Without being aware of that—without knowing, even at a deeply unconscious level, who it is we can’t bear to do without—we might live altogether differently. Tosca turned out to have all too short a time on this earth with her beloved. Thus, her jealousy may have been what ensured that she sought out and treasured all moments of love and intimacy they were to know together. Her jealousy may have angered Mario, whose passion for Tosca did not deserve to be questioned, but maybe it also riveted his attention so that vital vocalizations and enactments took place and precious time was not wasted. Wasn’t this how he learned of the extreme vulnerability even this beautiful, beloved woman could experience when it came to a single, irreplaceable person—himself? When they both could see how unfounded her jealousy was and how his assurances of love dispensed with it, mirth resulted and their intimacy deepened. Giving a (limited) hearing to jealousy, an emotion that seems negative by definition, ended in something else entirely.

Of course this doesn’t always happen. When it does turn out that true love is equally shared, the discomfort and piercing pain and dread of jealousy are still something that people who have experienced them would no doubt prefer to avoid. When it does not turn out that the love is equally shared, of course, jealousy can be the precursor to even greater agony. In either case, such a feeling can also remind us of the sheer fragility of what it is to have experienced another person in such a way that he or she becomes inextricably lodged in our very psyche for the present and foreseeable future. It is understandable why we might try to dodge or deny the whole range of emotional states that can arise when a particular person stands out from all the rest as unique, irreplaceable, and all too real.

*

Jealousy’s critics, its main detractors, like all those who would simply have us do away with inconvenient or complicated emotions by not feeling them, seem to me to be spokespeople from the party of reproducibility, and it is that party that is in the ascendancy. It is much easier to let someone recede from vision if we never felt that powerful force that is human desire at its apogee. The party of uniqueness, after all, knows that if we lose someone, the splendor that is that person can never find a substitute, just as we don’t have a prayer of simulating the very particular love he or she inspires in us.

This is another reason why it might seem sometimes to be a mistake to desire and exult in the ways only possible in a world in which people—and our experiences with them and of them—are not interchangeable. We can be “cool” and “chill” now in the glorious age of “whatever.” In the calculus so often regnant today, no particular individual counts too much.

I don’t remember when my mom and I first started sending text messages. Like all communications revolutions, it was probably out of utility at first. We might have started exchanging basic information like when we could schedule a time to talk, or when my flight was supposed to arrive. The messages never carried much emotional valence. As I imagine that early period now, text messages were a minor appendage to our actually-existing-relationship in person or on the phone. But then something changed. Slowly but surely texting assumed the status of a new autonomous sphere of relations between us. It became a way of communicating without actually talking, and now its charms are hard to deny.

I like to think that I have mixed feelings about this development because I have mixed feelings about all revolutions in communications technology. Any new medium that allows us to flatten our interactions into bits of digitized data necessarily reduces the scope and complexity of our relationships. I’m well aware that there is an obvious rejoinder to this argument, and it’s true that few technologies (just how many is a worthwhile thought experiment) exercise total influence over our behavior. But the important point still stands: at the very least, a medium like text messaging permits a process of reduction that can lead to flatter, ultimately less recognizable human relationships. In every case, the tool is the problem.

I’ve been making this argument for as long as I can remember (roughly since I’ve identified as the sort of person who says things like “for as long as I can remember.”) But since I purchased a smart phone last summer I’ve noticed my techno-hypocrisy deepening. If I’m to be honest, a slow slide has been ongoing for years. Until June 2009 I resisted joining Facebook because I thought my private boycott represented a laudable example of cultural conservatism. When I finally gave in, I planned to enter this new online world the way I imagined an anthropologist entered a foreign culture. I started an essay called “Re-Entering A World of Text,” in which I justified my project (and my secret political agenda) in grossly highfalutin terms. Looking back at that essay now, it’s clear that I was suffering from more than a few illusions:

My experiment is to make my participation based solely on expressions of authentic conscience. From what I can tell, this is an uncommon use of the medium, but it may be one best suited to democratic ends. If people are disturbed or put off by my comments and declarations, they can ignore them—but they remain changed by the contact I have initiated. Slowly, these sorts of dynamic interactions will filter through the community of users (the body of citizens), and what is persuasive will have lasting effect…. Facebook opens up manifold possibilities for democracy, culture, and human community, but such advancement can only begin when new cultural influences encourage a discussion about right use of the new medium.

I could quote more, but it would be too painful knowing how my experiment ended.

For the next two years I recorded nearly every interaction I had on Facebook. No matter how interesting or mundane, I copied and pasted what transpired on my wall into two Word documents, one called “Facebook logs,” and a later version called “Toward a theory of right use.” During my first few days on the ground I chose the profile picture (of me attentively reading Richard Yates’ “Eleven Kinds of Loneliness”) that still graces my page and began posting status updates that I hoped would spur intellectual debate. Intermittently, I worried whether people thought I was making a fool of myself.

7/5/09: “Democratic nations will therefore cultivate the arts that serve to render life easy in preference to those whose object is to adorn it. They will habitually prefer the useful to the beautiful, and they will require that the beautiful should be useful.”- Tocqueville

7/9/09: “Life is made of marble and mud. And, without all the deeper trust in a comprehensive sympathy above us, we might hence be led to suspect the insult of a sneer, as well as an immitigable frown, on the iron countenance of fate.”-Hawthorne

“Soooo dreamy,” a friend posted after that last one. To which I responded,

“Yeah? Well allow me to continue: ‘What is called poetic insight is the gift of discerning, in this sphere of strangely mingled elements, the beauty and the majesty which are compelled to assume a garb so sordid.’ Also, don’t you think it’s time someone forged an archetype on Facebook that involves foisting such quotations on one’s friends?”

I did succeed at provoking some discussion. But over the succeeding months and years I also felt myself adjusting to the limits of the medium. I experienced all the advantages of a world of text. It felt good to make new friends and connect with old ones in new ways; and it became clear that Facebook could, and often did, enhance real-world relationships. If I was testing a hypothesis, it appeared that I had my answer. Slowly but surely, using Facebook became ordinary, and I lost interest in my ambitious project around the same time my love life began to improve.

It was a long road from the Facebook logs to the purchase of my Samsung Galaxy SIII. But now that I’ve caved in on the smart phone front too, certain patterns have revealed themselves. Before each great leap forward toward greater investment in the technological habits and habitats that clog our world, I seem to need to voice some misgiving, some social critique of existing cultural practices lest I succumb to “the disease of modern times” (Paul Goodman). I don’t know where this need comes from, or whose criteria I’m trying to satisfy. But the plain fact is that I’m trying to avoid complicity.

Complicity in what exactly I can’t say. The problem is difficult to pinpoint. I feel its contours every time I notice myself gazing covetously at my friends’ iPhones, wishing I’d done more comparison-shopping before I renewed my contract with T-Mobile and bought the Samsung. (No matter what anyone says, I’m convinced that iPhones are better than Androids; I just don’t have the courage to post this on Facebook.) I talk about symptoms constantly; I’m even teaching a class this academic year called “Digital Culture and Counterculture.” Still, the cycle continues.

At 12:38 am on 11/24/12, I sent my mom a photo of my sister and her boyfriend sitting by the fire in my living room.

Mom: Sweet pic really looking forward to meeting him
i thought you guys would be too tired to stay up
ps is it cold in your house? krista looks all bundled up

Me: We’re all tired. We made a fire and now they’re going to bed.

Mom: Sounds like yall [she’s from Louisiana] are having a great time connecting
can we talk in afternoon/eve tomorrow?
When does a [my girlfriend] return?

Me: We’re having dinner with her tomorrow. Let’s see how the day goes.

Mom: K g night

Me: By the way, I’m not happy to see you adopting this abbreviated text speak, Mom. The people who are fighting for that side in the battle over the future shape of culture deserve to lose. Make us read more. Even if it takes longer.

Mom: Sorry for not responding last night
i fell asleep reading your text tome elevating cellphone language to the heights of great literature
Usually im more careful but I didnt want to take too much of your time
Were headed to valley of Fire now [in northern AZ]
Lots! Of rock 🙂
Love your maternal ancestor of the first degree

6/26/13: I’m in Cupertino again talking to Apple about how to incorporate some of my wryer protestations into one or two of their web commercials. This time we managed to come up with a few lines for a jingle and an interesting graphic. On the elevator ride back down to the lobby I send my mom a snide text about corporate complacency via my new iPhone 6. She gets the joke, but she isn’t surprised or all that impressed. I’m lucky she knows I’m a little slow.

Recently, I attended a lecture by the esteemed political philosopher and public intellectual Michael Sandel. The second of this year’s Tanner Lectures on Ethics, Citizenship, and Public Responsibility, Sandel’s talk centered on the theme of his most recent book, What Money Can’t Buy: The Moral Limits of Markets. I haven’t read the book, though judging by the talk (and a few reviews) it seems perfectly clear what the book is about, and it’s a reiteration of a story that’s been told in more or less the same way since at least the late 19th century: We’ve transitioned from a market economy to a market society, wherein more and more areas of life are subject to market mechanisms; the market erodes existing values that foster a sense of debate about larger questions of morality and the public good; our political discourse suffers from this trend, reduced now to a series of shouting matches that lead, at best, to political stalemate.

In the world of political philosophy, Sandel is best known for his communitarian critique of liberalism, along with the likes of Alasdair MacIntyre and, to a lesser extent, Michael Walzer, all of whom launched what seemed like a collective expression of dissatisfaction with liberal thought in the late 70s and 80s. In his book Liberalism and the Limits of Justice, Sandel took issue with one of the signal pronouncements of liberal (and socialist) political philosophy in the twentieth century, John Rawls’s A Theory of Justice. He criticized Rawls’s “original position” for abstracting the human being from its community and for arguing, on Kantian grounds, that the self (along with other selves), shorn of its commitments to social values, will necessarily act in accordance with the dictates of rationality, without any contingent or situated laws or norms. For Rawls, the unsituated self in the original position, deliberating alongside others in the same position, would reach an equilibrium in which participants act out of rational self-interest to arrive at principles mutually agreeable to all. The logic of Rawls’s view was that people acting out of self-interest, and without a sense of situatedness or contingency, will inherently agree on the principle of liberty, but will also be necessarily risk-averse, ideally causing them to assent to provisions for the least well off.

On Sandel’s account, this was wholly unrealistic, because the “original position” assumes that we could ever be “unsituated,” that is, unconditioned by circumstance. In other words, we can never escape, as Heidegger put it, our “thrownness.” Like Alasdair MacIntyre, Sandel argued that without an a priori conception of the public good (which is not to say the content of that good), public deliberation and public life are unthinkable. Add Sandel’s vision of a “market society” into the equation, and the demise of public life seems like a foregone conclusion.

Sandel cajoled the audience into considering why it is that the encroachment of market forces into more and more areas of life deemed “off-limits” is in fact Bad. With a capital B. Fair enough. By way of public discussion (the model of the town-hall forum wistfully invoked by those longing for a so-called “revival” of civic life), Sandel wanted the audience to reach a point at which, by considering examples such as putting a fifty-thousand-dollar price tag on immigration in order to solve a social problem by market mechanisms, they would realize that, “Hey, the market can’t solve all of our problems; we’re human beings, damn it, and social problems can’t be solved with money.” But what Sandel cavalierly leaves out of the story, aside from the fact that it is at this point a hopeless and sentimental paean, is just what he means by “markets” and what their existence and proliferation might say about us as human beings.

If in fact we take an industrialized market economy as the underwriter of the “social-self,” freed from the moorings of traditional family and work lives, then Sandel may have been right to cast such strong doubt on Rawls’s deontological self. But now that we find ourselves in a globalized world underwritten by financial capitalism, we must take that society on its own terms, and allow it to disclose to us something we keep hidden from ourselves, something we don’t really want to know about ourselves. Many on the left decry this society and the attendant self-interestedness, self-absorption, and “narcissism” it has wrought. Others, like James Livingston, see in that same society the creation of new resources through which that society might become more just. If in fact these represent two ends of a spectrum on which our political imaginary rests, then it’s safe to say that Sandel would find himself at home with the former. What’s more interesting, however, and arguably a more vital resource for political change, is precisely what the polarization itself obscures. But can we see the clearing?

Sandel’s goal is a more robust public life, a view of the good that values deliberative democracy and debate over large questions of public morality, as the key to a vibrant political and moral life. But insofar as the markets he decries have made issues of the good into fungible assets and commodities, then the question becomes this: Who is to say that such an idyllic public forum won’t result in a stalemate reflective of a larger culture of nihilism?

You might say that the over-extension of markets is, arguably, one of the causes of such nihilism, and so if you rein in markets then agreement on what constitutes the good might become a more realistic possibility. But, in fact, our nihilism has nothing to do with markets. It seems, rather, that it is the collective expression and culmination of what Reinhold Niebuhr described as our divided self. That self contains two tensions perennially at odds with each other, and it explains both the beauty and wonder of late capitalism and the horror and injustice it has wrought. On one side of that self is the fact that, as human beings, our highest degree of self-realization can only be found in the lives of others, a fundamental paradox because the self that seeks fulfillment in others is a self that can no longer be. Freud called this the death drive. Along with sex, he rightly noted that its sublimation lies at the heart of every human vitality, both individual and collective, political and social, artistic and cultural. There can in fact be no limitations placed on these vitalities, except for the fact that, while they are the most beautiful parts of human being-in-the-world, they are, as the other side of the self, also the most destructive. At the individual level, Niebuhr attributed this to the fundamental fact of self-interest contained in every human being, her will-to-power, which, when expressed on the collective level, can be, and usually is, profoundly destructive. Thus it is that human history contains the most awesome triumphs along with the most destructive and evil forces. This is, I think, what Joseph Schumpeter had in mind when he described capitalism as a process of “creative destruction.”

The human being divided against herself is expressed in the nihilistic atmosphere of late capitalism: it contains the richest, most varied artistic and intellectual cultures, yet it is underwritten by the profoundest injustice and exploitation. As Walter Benjamin put it, “There is no document of civilization which is not at the same time a document of barbarism.”

The forces of privatization and the general creep of market forces into domains that should remain sacred are facts of the last four decades that should be reckoned with. But how? Recourse to a public good of deliberative democracy by condemning “markets” outright fails to consider the ways in which the market itself is controlled by the indeterminate nature of human desire which, if left unchecked, leaves us in the situation we find ourselves in today. That is to say, the reality of a perpetual conflict of interests and desires, our clashing of wills-to-power, soberly diagnosed by Niebuhr and, for that matter, James Madison, is a more fundamental issue today than the forces of markets themselves.

For Niebuhr, the arbiter of such conflicts was reason, which is “organically related to a particular center of vitality, individual and collective,” and which serves to “limit expansive impulses by coercion.” Herein lies a crucial point. Reason is nothing but the claim to community, and it requires coercion to keep the destructive side of our inordinate desires, individual and collective, from wreaking destruction within and without. It is a claim that recognizes human vitality, mutability, and proliferation as the source of both our ills and our triumphs, and it necessitates coercion as a mediating factor between the two. Insofar as reason, in this sense, is bound up with the acknowledgment of human frailty and finitude, which is in fact fallibility, and thus recognizes that the “claim of reason” rests on nothing more than our fragile and multiple claims to community, the use of coercion to maintain elements of community necessarily translates into competing claims on what community is. Power will only meet power. And the claims of power in this sense will always represent the competition of interests enshrined by Madison in the Constitution, and articulated by Niebuhr.

In the case of combating privatization, neo-liberalism, and the encroachment of the market, then, what is required is a sober recognition that the market does not itself stand at odds with the public good, but that we stand at odds with ourselves. In the throes of late capitalism, acknowledging this condition means acknowledging that we can in fact be “in” and not “of” this world (a possibility James Livingston denies), an acknowledgment that the desire for transcendence and sublimation is contained in our proliferating technological culture fostered by financial and global capitalism: our Facebooks and iPhones, the virtual worlds we create for ourselves, are not simply utilities but expressions of the desire to be elsewhere, in other worlds. Insofar as we remain in “this” world, however, a world that we in fact create and amend, coming to terms with it does not mean waiting to be saved in the next life, but acknowledging that there is no real limit placed upon the human imagination, her desire to be elsewhere, her drive toward virtual self-effacement. Insofar as this is a fact of fallibility, it should prompt a sober reassessment of the human capacity for evil, for destruction, and for injustice.

Recourse to public morality and the “Good,” as opposed to the nebulous “market,” fails to take this fact into account. And it is that failure that speaks to the sober reality that reason’s claim to community makes coercion necessary for justice, a fact that makes democracy both necessary and, as we see in our own case, perilous. As Niebuhr put it, “Man’s capacity for justice makes democracy possible; but man’s inclination to injustice makes democracy necessary.”

When I asked “why exult” in my post under that title a few weeks ago, I really did mean it as a question. I was so plagued by that question for so long—some eon-filled years—that I finally wrote about it here on a wing and a prayer. I was hoping against hope that from somewhere an answer would be given.

My question was “Why exult in true connectedness, given the inevitability of loss?”

The post concerned the almost intolerable pain that can flood in upon the ending of times of utmost connection with another human being—the kind of connection that halts time and instead seamlessly unfolds something else entirely, as though minutes, hours, and days were replaced by a new unending measure of pure bliss. When one’s entire being has reverberated with this rhapsody, who can calmly accept anything less?

No one. The problem is we have to. Or do we?

We have tried so hard, we mere mortals, to find a way not to. One of the ways is to do away with desire. If only we did not experience so much longing, so much yearning, so much desiring, we would not feel so much pain. Triumph over desire and the self will be at peace.

That might be true. But if so, what kind of peace is this? Is this the blissful eternity-drenched peace of union and communion with the beloved other or a soul-deadening solipsism? Perpetual re-enchantment at the font of the one desired or willed de-enchantment for the sake of an end to want?

So much of our culture seems—seems—to be about desire; signs everywhere seem to be encouraging it. The marketing-purchasing perpetual motion machine says it is right and good to want this thing, that thing, and everything. Even people are things, also to be desired. But of course collapsing the distance between desire and the thing or person desired can actually serve to kill desire. Immediate gratification moves in too quickly, before desire can even assert itself.

Instead, the exultation of true connection might actually require the germination of longing, in the same way that it might be suffering that prepares the self for the deepest possible experience of shared bliss.

Why long? Because we have to. That is our story. But the message these days often seems to be that desire is undesirable, that it would be best not to have to long. Only attainment of the desired object brings happiness. But in fact, when we do not have to long, we often lose sight of what is longed-for altogether. Isn’t that a worse fate–permanent and irrevocable and soul-destroying?

This culture’s ambivalence about longing currently stands at epic proportions. It is not sure whether longing is good or not. It provides myths to tell us about where longing goes. Desire is present in the phase of infatuation when the experience of being with someone is new; it fades into oblivion in time. It is helpful, isn’t it, that we have discovered mathematical formulas for such things. Desire is attached to the rise and fall of hormones over the life course; this explains intense longing and its demise. Fixation on flaws in the particular human object itself, with a consumer culture’s cruel calculus, is but a thinly disguised rationale.

Such ideas are symptoms of a culture that does everything possible to starve desire of the very conditions it needs to thrive.

In Walden, Henry David Thoreau wrote, “Society is commonly too cheap. We meet at very short intervals, not having had time to acquire any new value for each other….The value of a man is not in his skin, that we should touch him.”

This may not apply to all cases of exultation and loss, at least with equal salience. But something can be taken from this idea about the need for a certain perspective or vantage point, which can at least help us think about temporary partings or externally imposed limits or absences. Seeing something too close up at all moments might rule out the kind of contemplation of the loved one and the cherished connection that establishes an indelible impression of his or her value on the mind, heart, and soul. The time that lapses without the best is a painful reminder that the best is the best.

The desirability of allowing desire full play seems clear if desire is what allows the bond to be understood as a sacred one. This is a different kind of desire, thus it allows for a different kind of fulfillment. The enchantment and exultation that result from this rare kind of connection might only be possible because of the awareness of the possibility of separation as well as the experience of loss.

*

The other day I read a note from a dear friend whose company I miss, to say the least. The riptide of feeling felt nearly unbearable, the pain temporarily as bad as some of the worst physical hurts my body has borne. As it subsided, it was replaced immediately by gratitude at getting beyond this pain’s fearsome apogee, but simultaneously a desire for it not to stop. I didn’t understand that other desire until, in the days afterward, the memory of the painful emotion that had gripped me returned—now in fondness. It was an experience of exquisiteness, of supreme poignancy, of fullness. It brought a kind of completion, through a moment at the very heart of which, by its very definition, was separation, partiality, incompletion. But the completion was now in the longing, at least for the moment.

Just as longing can provide for the fulsome memory of the precise melodies of genuine conversation, conversing–the old term was, so aptly, intercourse–and all that is possible between human beings, the experience of those unending echoes initiated in time yet continuing out of it transformed the moment of longing into an experience of the presence of the longed-for. Who is to say that this apprehension of the one who was missed was any less real than the real-life observances of those who are lucky enough to share time and space but who have lost the capacity to experience each other in full? Very few seem to know how to experience someone anew in each new moment rather than taking him or her for granted and thus ceasing to notice the individual as someone unfolding, growing, circling back, despairing, exulting, doubting, becoming, believing, living, and dying. How is “togetherness” under these circumstances anything like union? How is it even close to what can exist between two souls rendered as vulnerable as humanly possible by the pain of desire and the hope for satisfaction in the full knowledge there is only one fragile path to this release? While time and space and other logistical considerations can interfere with the glorious experience of the “skin” of another, as in Thoreau’s quote, the experience of oneness, when the other is really understood to be the other and thus able to be loved and capable of loving in return, can break out of those bounds into a less time-bound realm.

Isn’t it longing–albeit a painful path to this infinite immersion–that makes this so?

The objects in this room: the green pillow I rest this notebook on, couch and further cushion behind me. My smartphone sits symmetrically along the edge of the glass coffee table, a clean black half inch over the line. Sharp. Next to my data box so many books and notebooks, magazines and periodicals spew their words. Sentences in combination from Yorker to Lears then back to Jackson again. If I’m to raise sense up from this rubble I’ll have to stumble just right. Evenly, to make the granules of dilapidated speech look smooth. Sifting for the right interpretation—like shuffling cards with less surety that they’ll still deal straight. Inspiration shifts to machination, then back to sifting again. Sheets of paper stacked to summon the right muse.

I say I’m carving out a monograph bit by bit. The guitar to my left sulks in front of an empty heating-pad box to remind me of my progress. My roommate’s dirty jeans hang glumly over the banister; he said he’s taking them to the cleaners today. Soon? Light pours onto his face as he sits fastened to his Kindle at the dining room table. To his left, bookcase #1 with so many books I can’t remember. To his right, bookcase #2 with so many books I never read. Between them an empty chair glows in the evanescent morning sunshine. The day is now well underway.

This is how my brain works. But just yesterday I heard of an alternative.

The way she explained it, certain drugs called SSRIs (Selective Serotonin Reuptake Inhibitors) adjust serotonin levels to absorb cortisol and reduce cognitive symptoms like anxiety and unwanted compulsive thoughts. It’s like taking a vitamin, she said. Only this vitamin can really help take the edge off.

The more she spoke the more I believed it might be this simple. I remembered the slogan from the Sixties, “Better Living Through Chemistry,” and thought about the prospect all day. That afternoon a friend and I imagined starting prescriptions together as a kind of scientific experiment in self-management. Clearly our brains weren’t operating at peak performance, so why not adjust the levels of certain naturally occurring chemicals to make ourselves feel more efficient, more productive, and less prone to non-optimal states of being? So what if we did this by synthetic means. Maybe the boost was what we needed to jumpstart our dissertations. Maybe we would never get them done without these drugs!

Pragmatic logic never seemed so justified or exhilarating as we talked past our initial reluctance, intent on delivering ourselves from the reality of our lives without the benefits of SSRIs. But that night the charm began to wear off. I was sitting in a loft listening to a bad poetry reading when the first wave of serious misgivings hit me. I pictured myself months later, having started the drugs and taken a liking to them. Inevitably there would come a time when my prescription was running low and I was unable to refill it immediately. I imagined rationing my pills and calculating the possible effects of denying my brain its dose of artificial solace even for one day. I could already feel the faint symptoms of relapse brewing. Returning to myself without the proper admixture of chemical engineering was terrifying. I would have to start over, maybe go on another drug. Either way I’d be trapped.

My heart was pounding as another bad poet left the stage to what seemed like uproarious applause. How could these people possibly think that poem was worth clapping for? I remained reticent, insisting through my stiff body language that this was the proper aesthetic response to what we’d just heard. I wondered how I would experience this reading if I were on the drugs, and then it hit me: the decision to tweak my brain chemistry would change everything. Even on a low dose it would mean a new level of dependency on forces outside my control and a permanently new relationship to my own conscious mind. Earlier in the day the prospect seemed liberating. But the dread I felt sitting there envisioning my future on SSRIs reminded me how easy it is to sacrifice agency when the promise of convenient relief presents itself as guaranteed, and mostly cost-free.

*

The costs are hard to see. This is the problem of technologies ranging from agriculture to combustion engines to smartphones. In each case we might hope to remain in control by appealing to some notion of conscious, responsible use. But the dynamics of autonomy and dependence are fundamentally altered when we allow our tools to affect the ways we think and behave, and virtually no tool is without effect. The often obscured question is whether the effects of a given tool really constitute net benefits, or whether our uses require us to compromise our ideas of what it is to live well. In which cases should we not accept the transaction? Is bad poetry worth recognizing even if it doesn’t feel good?

The ideology of efficacy that underwrites technologies from personal computers to SSRIs is certainly difficult to oppose consistently. I drive a car, use email, shop at the grocery store, etc. And I can’t imagine life without these conveniences. But perhaps we gain something more valuable than intellectual consistency when we choose to preserve select realms of our lives from the intoxicating promises and hidden costs of modern technology.

Opting for the ethics of an eccentric hypocrite in this sense probably does not make life any easier. But depending on who you ask, it may reveal blind spots that are painfully inefficient to realize yet necessary to see.

There was a buzz in the air—a pack of assistants and security staff stood around tensely—that indicated the presence of Justin Bieber, who was slated to make an appearance on “The Voice,” to promote his new album. Bieber, who had just turned eighteen, wore a white T-shirt, tight black jeans sagging low, and unlaced Timberland boots. His hair was swept up into a James Dean pompadour, and a black bandanna with skulls on it dangled from his back pocket. He was much smaller than the young men in the Wanted, and he looked frail and skittish. (At one point, Braun reminded me, “That skinny kid you just met is the most Googled person on the planet by like two hundred million hits.”)

So these are the culture makers, I thought lying on my couch several thousand miles from Los Angeles. Scooter Braun is at the epicenter and I am at the periphery trying to find a window in.

I was only reading the New Yorker that afternoon because I couldn’t walk. While Braun was out signing potential young pop stars to his label, Schoolboy Records, I had succumbed to a basketball injury two days earlier under semi-heroic circumstances and was out of commission. (I darted too quickly for a rebound while waiting for someone else on my team to score the winning point in our game. I was lucky to have incurred only soft tissue damage.) Housebound and reduced to hobbling around on crutches, I let images of Justin Bieber on the set of “The Voice” pool inside my head.

The young men immediately began comparing tattoos. George lifted up his shirt to reveal some song lyrics: “We try / we fall / we live another day.” “Dope,” Bieber said, and pulled up his pant leg to show, on his calf, a large tattoo of Jesus with hands clasped in prayer. (Bieber and his mother are devout Christians.) The Wanted members looked a little stunned.

Carson Daly, the host of “The Voice,” walked by. Braun called out, “Hey Carson!” Daly and Braun began to review a script detailing stage patter. Bored, Bieber started a game, playfully jabbing everyone in the crotch with his fist. First, he jabbed at Braun, who, without looking up from the script, dropped his hands to block. Daly did the same. When Bieber jabbed at Siva Kaneswaran, a member of the Wanted, he connected. He called out, “Got you, bro.” Kaneswaran balled his fist but seemed unsure how to respond. “I don’t want to hurt his pretty face,” he said. Braun said, “Just get him in the pretty balls. It’s fair game.” “No, it’s not,” Bieber said. Braun took a firm tone. “Justin, it is—fair game,” he said. “You hit him in the balls, fair game.” Bieber was peeved. “Where’re we going?” He asked. “Where’s my dressing room?”

What hope lies in historical monographs, one wonders, when these are the people who control the gears?

“Ten years ago, a pop star might not have a fragrance that does a hundred and twenty million dollars in business in a year.” He went on, “My job is to make sure a client doesn’t have any ‘what if’s—to make sure, when you look back, you don’t say, ‘What if I had done this? What if I had done that?’” Among Bieber’s other revenue streams: “Never Say Never,” a 2011 movie that Braun produced about Bieber’s life, which was the highest-grossing concert film in U.S. history; a line of watches, backpacks, and singing dolls; a ‘home’ collection that includes comforter sets and shower curtains; and an endorsement deal with Proactiv, a purveyor of acne remedies. All this has made Bieber rich—his annual income is estimated to exceed fifty million dollars—and has given Braun a unique economic power. A big part of a manager’s job, one industry veteran told me, is “getting an artist to say yes to things.”

He cuts a nice figure, my rival. The photo on the first page of Lizzie Widdicombe’s article (“Teen Titan,” September 3, 2012) shows Braun posing nobly before a drum set overlooking the majestic Hollywood hills. His left eyebrow is raised slightly, but the look on his face is dignified, almost Napoleonic. One senses that he knows he commands a vast empire, and that it will continue to grow larger. This summer, Universal Music Group named Braun, 31, its first technology “entrepreneur in residence.” According to Lucian Grainge, Universal’s C.E.O., “He understands the entertainment business…. The company likes hits, the fans like hits, and that’s what he’s there to do—make hits. We’re not in the art business.”

Envy works in mysterious ways. If I’m to be perfectly honest, I find the dream of wielding this kind of cultural power almost irrepressibly seductive. Even the pools and the palm trees exert a quiet charm I can’t deny. Yet I also recognize Braun and his ilk as The Enemy. His minion army of pop star-entrepreneurs threatens to tarnish and distort nearly everything I hold dear. They are the barbarians who rushed past the gates long ago, and now we’re at their mercy much as I was on the couch that day.

From where I sat this did not seem like a cognitive distortion. By some curious postmodern logic it seemed there was even something faintly heroic in Braun’s victory. Part of me wanted to congratulate him, to venerate him as the new emperor. What else could I do in my invalid condition but admit defeat?

*

When confronted with the Biebers and the Brauns of the world, those of us who endeavor to heal this culture because we believe we have some ability to restore health can’t help but feel defeated. Even if we achieve some modest influence, we know our best efforts will likely remain drops in the empty bucket of our desiccated civilization. We are on the losing side of both History and history, everything confirms that; and the prospect of another revolution doesn’t look promising. This, I suspect, is why many of us spend our days treading water in academic spas far away from the mainlines of contemporary popular culture: we are trying to avoid the evidence that the war is over. We would rather not think that we’re working in vain.

Los Angeles glistening in the distance, I hobbled up to get a fresh icepack out of the freezer knowing self-indulgence only explains so much.