Bitter Pill: The Modern Medicine Lament


“My best friend, my doctor, won’t even say what it is I’ve got.” —Bob Dylan

I recently became aware of a trend, the Modern Medicine Lament, in which American writers struggle to make an uneasy peace with a system from which they feel alienated. And it raises the question: has it always been this way?

Doctors have enjoyed a colorful depiction in books and letters over the years. Kafka’s brilliant short story “A Country Doctor” is still read and taught frequently. Boris Pasternak’s Dr. Zhivago was a man of principle in any language, in any time. Chekhov was a trained physician. I should also mention my favorite doctor in literature, Dr. Livesey, from Robert Louis Stevenson’s Treasure Island. Stevenson, you’ll recall, sketched another doctor, Dr. Jekyll, whose enthusiasm for chemicals took him off the rails (if Jekyll lived in America today he would surely declaim in a basement recovery meeting about the social transgressions committed by his intoxicated self). Mary Shelley’s Frankenstein, written almost 200 years ago, offers a remarkable foreshadowing of the moral and ethical challenges inherent to the practice of medicine, which has always had one ultimate goal: triumph over death. It is telling that we so often mistake the name of the title character for that of his monster.

In 1885 Louis Pasteur, a Frenchman, administered the first rabies vaccine to a human, a child bitten by a rabid dog. The treatment was successful. It was not an insignificant moment in human history. Giant scientific leaps forward like Pasteur’s continue to inject health and medicine into the lives of everyday people. Vaccines and antibiotics changed the world, though today their administration is practically mundane. In America, where good health has always been considered something of a birthright, we resent doctors. They are a necessary evil, a reminder of the basic infirmity of our bodies and the inevitability of their decline. Sure, Americans love watching fictional doctors treat fictional patients on television, but in reality aren’t doctors society’s consummate whipping boys? After all, that goal – sticking it to death – has never yet been achieved. Good news from a doctor cannot amount to more than “you will live for maybe a few more years, all things being equal.” And anyway, Americans don’t want to live forever, they simply want their life on earth to be pain-free, and believe it should be.

Pills that govern the chemical workings of the brain are now at the forefront of our ever-advancing medical knowledge. They treat disorders like depression, schizophrenia, autism, addiction, panic, mania, and garden-variety anxiety. Neurochemistry remains among the least understood fields in medicine, but the sales figures of these drugs have exploded in the past twenty years. Pharmaceutical manufacturers employ direct advertising – and also work more quietly through doctors – to encourage the public to treat any psychological state short of bliss as a disorder. Comparatively little attention is paid to the fresh array of stresses and overload of stimuli that burden the modern brain, or to how the very ease of modern life abets them, leaving us at leisure to explore exactly how we feel rather than spending all of our energy on mere survival. Effexor, Wellbutrin, Paxil, Zoloft, Prozac: ugly mash-ups, yes, but also household words. The drugs have brought relief to millions of people suffering from mental duress.

But the rise of Psychotropic Nation has created a cultural preoccupation with pills here in the U.S., one that has in turn given rise to questions about the efficacy of our medical system (actually just one of many aspects of our system that provoke such questions). If you are writing a novel, say, and wish to introduce recreational drug use into the plot (you may want the characters to seem more subversive, irrational, hedonistic, or edgy), you might shy away from the ho-hum world of Schedule I drugs – your pot, your cocaine and heroin – in favor of those that can be obtained with a doctor’s note: pain pills, sedatives, amphetamines. The irony payoff is just too great, and writers love irony. The companies that make these drugs want you to want them, but as soon as you do, you probably should not have them. And maybe you, the writer (or the characters for that matter), don’t have health insurance, or went through a period when you weren’t covered – that just adds to the irony. Without insurance you’re not seeing a doctor, making it a whole lot easier for you to go Schedule I than to buy a bottle of Valium. And, given the cost of such pills, cheaper too.

Jonathan Franzen wrote extensively on this aspect of American life (see also Ben Kunkel’s Indecision, in which psychopharmacology plays no small role). In The Corrections the drug is called Aslan, and its effects are somewhere between Prozac and ecstasy. At least two Lamberts use the drug, Chip during an unfortunate weekend sex binge, and Enid, Chip’s mother, whose little helper gets her rather strung out over a longer period. Franzen’s treatment is made more complete as Gary, eldest of the Lambert kids (and hilariously aware of the ebb and flow of his own serotonin and dopamine levels) invests money in the drug company that makes Aslan. Meanwhile, the pill is pushed by a leonine doctor with a creepy, guru-like aspect. And, of course, the one individual who could really use a pick-me-up, the crushingly depressed father Alfred, gets none. Collective dysphoria has never been so amusing.

Life imitates art, but it’s no barrel of laughs. That said, the cover story of this month’s Harper’s, “Manufacturing Depression: A Journey into the Economy of Melancholy”, by Gary Greenberg, does deliver the odd ironic chortle. Mr. Greenberg, a psychotherapist, is writing a book about the “misuses of medical diagnoses,” and if his magazine piece is any indication, it may be worth reading. The piece opens with Mr. Greenberg cataloging the failures and dissatisfactions of his life to a kindly psychiatrist, Dr. George Papakostas, in order to see if he qualifies for an experimental drug study at the Depression Clinical and Research Program of Massachusetts General Hospital. And, after checking some boxes, the doctor delivers his diagnosis: Mr. Greenberg has Major Depression. Would he like to try Celexa, Lexapro, Mirapex, or omega-3 fish oil?

“It was hard to believe that Papakostas really thought I had major depression,” writes Mr. Greenberg. Mr. Greenberg does feel bad sometimes, inadequate, feckless, and yes, his hair is thinning. His life is not blissful. But what is made abundantly clear to him is that the clinical criteria for a diagnosis of Depression, codified in the psychiatrist-developed Structured Clinical Interview, are bunk. Your score on this questionnaire, determined by the doctor, is totally subjective, the questions laughably interpretive. Dr. Papakostas, looking for subjects for a drug study driven by new medicines from Forest Laboratories, Inc. and paid for by the federal government, is predisposed towards a diagnosis of Clinical Depression. That’s really what someone looking to join such a study wants to hear, right? “‘Are you content with the amount of happiness that you get doing things that you like...?'” It is a standardized question asked by the doctor at one of Mr. Greenberg’s weekly follow-ups. “‘I’m not big on contentment,’ I said. Is anyone? I wondered. Is anyone ever convinced that his or her pursuit of happiness has reached its goal? And what would happen to the consumer economy if we began to believe that any amount of happiness is enough?”

The uncomfortable intersection of the consumer economy and medicine is at the heart of an article by Bruce Stutz that appeared in the May 6 issue of the NY Times Magazine. Unlike Mr. Greenberg, who never believes that he is clinically depressed even as he dutifully takes his Mass General fish oil, Mr. Stutz begins from a different point of view: he, like millions of Americans, went through a period of debilitating depression for which he sought medical treatment. Talk therapy and a prescribed serotonin-norepinephrine reuptake inhibitor, Effexor, worked for him. Three years and a more positive outlook on life later, Mr. Stutz found himself shaking hands with his psychiatrist at the conclusion of his final session. But there was no mention of going off the drug.

“Somehow I couldn’t believe I had to take this pill for the rest of my life,” he writes. How many people taking such medication have had that thought? It’s not just the side effects, the occasional bouts of impotence, the weight gain, the dulled sensory perceptions and emotions, and it’s not just the monetary cost of the pills. It is also living with a stigmatizing reminder that one is sick and will never be well. But Mr. Stutz was well: he felt better; he was able to go on with his life. The stresses that had precipitated his mental slide, the death of a parent, the breakup of a marriage, the loss of a job, were in the rearview. So he tapered his meds and hunkered down. Fierce withdrawal symptoms followed: mental torpor, physical discomfort, and the frightening “brain zaps,” blinding, incapacitating insta-headaches. With the help of some experts in clinical biology, Mr. Stutz does an admirable job of elucidating the chemical processes that were at work in his brain, which was, without the help of the meds, running a serotonin deficit. What Mr. Stutz did not experience during that period was a return of his depression symptoms. And so he wonders, “does our long-term reliance on these drugs become more of a convenience than a cure?”

Drug companies and doctors have about as much interest in helping people go off their psych meds as tobacco execs have in helping people quit cigarettes. Still, the medical industry is simply giving us what we want, a quick fix. What happens when the quick fix goes bad? The title of Ann Bauer’s May 18 article on Salon.com, “Psych Meds Drove My Son Crazy”, is inelegant but to the point. Her story is gripping, horrifying, and ultimately infuriating. Mrs. Bauer’s eldest son was born with autism. At the age of 17 this high-functioning kid living in Minnesota became depressed, and his mother took him to a psychiatrist who prescribed an anti-depressant, which, she was assured, would not only snap him out of his funk, but also help control some of his autism-related obsessive tendencies. Instead, his condition grew worse. Doctors at a “respected neuropsychology clinic” reevaluated Mrs. Bauer’s son, now 30 pounds heavier and sleeping 16 hours a day, and changed the original diagnosis: in addition to his autism, her son was experiencing “‘psychomotor slowing’ – a form of schizophrenia.” And so a different drug was prescribed, Abilify, which was new (and, Mrs. Bauer notes, had been marketed direct-to-consumer in Time and Newsweek). Still her son’s condition worsened, “humming, shifting foot to foot, screaming if anyone touched him or tried to move him.” He carried on dialogues with voices that Mrs. Bauer could not hear. She tapered him off the Abilify.

Two days later he “got out of bed and stood in one place for a solid hour.” When Mrs. Bauer placed a hand on him, he beat her up.

Amazingly, the doctors managed to convince Mrs. Bauer to try yet another drug, a powerful anti-psychotic, Geodon. Her son took to living on the street after that. Only by conducting her own research, and getting a lucky referral to the Mayo Clinic from a retired doctor in Stony Brook, N.Y., an expert in a little known condition called autistic catatonia, did Mrs. Bauer find her son proper medical care. It took two years. Five days after checking him into Mayo, Mrs. Bauer read a front-page story in the NY Times “about psychiatrists in Minnesota who were collecting money from drug manufacturers for prescribing atypical antipsychotics, including Abilify and Geodon.” The article cited some hefty payout numbers, and also some serious risk factors for the drugs. It did not mention a fact that the doctors at Mayo confirmed: administered to an individual suffering from autistic catatonia, which they determined was the root cause of her son’s initial decline, neuroleptics like Abilify and Geodon only amplify the effects of the disorder, and they can cause permanent neurological damage.

She doesn’t say so, but I really hope Mrs. Bauer sued the pants off some folks. I would be interested to know.

There will be more Modern Medicine Laments to come. We will read them, and we will also watch with interest TV shows like “The Sopranos”, in which the writers have taken an increasingly critical line on the treatment of depression in America, and films like Michael Moore’s upcoming documentary about the ills of the American health care system. We will see more legal settlements against drug manufacturers like Purdue Pharma (OxyContin) and Pfizer (Celebrex) for misrepresenting the effects of their products to the consumer public. And, of course, we will continue to pop pills. We are a nation of armchair doctors. Sometimes it seems like a prescription pad is the only thing separating us from the real thing.

Update: The Libra in me desires balance. I do not want this post to seem an ad hoc dismissal of the medical profession as a whole. So I would steer folks to a book, Mountains Beyond Mountains, by Tracy Kidder, that had a profound impact on me when I read it. The book is about Dr. Paul Farmer, whose work battling T.B. while bringing basic medical care to corners of the world like Haiti and Peru where none existed before makes him something of a medical superhero. Kidder’s profile of Dr. Farmer proves that modern medicine is still changing the world for the better.

“They seem to have things under control,” I said.
“Who?”
“Whoever’s in charge out there.”
“Who’s in charge?”
“Never mind.”
—White Noise
Despite having closely followed the disastrous events in the Gulf for over a month with something akin to self-flagellatory devotion, growing increasingly angry and disillusioned with each failed attempt to contain the stricken oil well, I recently booked a South Caribbean cruise for my honeymoon in January. It was only after the plans had been finalized that I realized how little the oil spill had actually affected me: I operated under the assumption that someone—the government, BP, someone—would have the “situation” resolved, cleaned up, and concluded before it could intrude on my vacation. I had blithely researched and planned the cruise, never considering that the worst manmade environmental disaster in our nation’s history might have real repercussions for me. This naïve self-assurance gave me pause and, like many avid readers, I turned to what literature might teach me about such hubris.
Don DeLillo’s 1985 novel White Noise narrates the events of a manmade disaster so eerily similar to the Gulf oil spill in some of its details that it has an aura of prognostication. The novel is narrated by Jack Gladney, a professor of Hitler studies at College-on-the-Hill in Blacksmith, a quiet town somewhere in the U.S. Jack is an unexpectedly sympathetic character. Contrary to what we might predict for the professor who founded an academic discipline devoted to studying the most heinous figure in modern history, Jack is a good husband and father, kind to his coworkers, and generally affable. Even his idiosyncrasies are endearing: he wears dark-tinted sunglasses on campus, changes his professional name to J. A. K. Gladney, and gains weight to bulk out his frame, each pose an attempt to acquire the gravitas expected of him by students and fellow professors. The careful cultivation of his public persona is matched by his need to provide answers for his family, to be a source of knowledge and assurance to his adolescent son, and to appear to have control over events outside his field of expertise.
When an accident in a nearby train yard spills 35,000 gallons of “Nyodene Derivative” (a fictional, highly toxic byproduct of commercial insecticides), creating an amorphous black cloud quickly named an “airborne toxic event,” Jack assures his family that they will be safe without fleeing home: “These things happen to poor people who live in exposed areas. Society is set up in such a way that it’s the poor and the uneducated who suffer the main impact of natural and man-made disasters. People in low-lying areas get the floods, people in shanties get the hurricanes and tornados. I’m a college professor. Did you ever see a college professor rowing a boat down his own street in one of those TV floods?” Even as the air currents threaten to send the toxic cloud toward his neighborhood, Jack insists that alarm would be out of step with his professional position, saying “I don’t see myself fleeing an airborne toxic event.”
Jack’s self-assurance can be maintained only through an illusion of control. He assumes that the weather, government, and his socio-economic status will all contrive to protect him from the threatening black cloud. But this illusion is wrested from him after he learns that his two-minute exposure to the toxin will likely jeopardize his health, though it will be fifteen years before the symptoms begin to manifest. “Scheduled to die,” Jack’s fear of death encroaches upon his ability to see himself among the living. Confiding to a fellow professor, he speaks of the trap he finds himself in: “It’s almost as though our fear is what brings it on. If we could learn not to be afraid, we could live forever.” Caught between the living and the dead, fear and uncertainty drive all of Jack’s actions after the exposure.
The victims of the Gulf oil spill are now trapped in the same epistemic gap in which Jack finds himself. Possibly the most confounding aspect of the disaster is that after two months there is still no certainty as to the extent of the damage. It is not merely a problem of tracking the massive, miles-long invisible plumes of oil that are suspected to be floating below the surface. A more essential problem is that the government and BP have been unable to determine how much oil is leaking from the well. There are only best and worst case scenarios separated by tens of thousands of barrels per day (as of this writing, it was estimated that between 12,600 and 40,000 barrels per day were bleeding into the Gulf before the riser was cut, and between 35,000 and 60,000 barrels per day afterwards).
Being unable to fathom such quantities, we are in a situation similar to Jack’s: things are bad, danger is lurking, but we don’t know its full extent. Like Jack’s, our exposure has been complete, and fatal for the health and economic stability of many, but the final tally is not yet in.
Much of the novel’s pathos derives from Jack’s attempts to regain control of his life while living in the gap—living with the uncertainty of certain death. First, he alters his routine and begins to obsessively see his doctor and search for a miracle cure for his fear of death, a drug called Dylar. In the end, he violently steals the drug, consciously plotting his movements, the effort to superimpose order on his actions altering his narrative voice from the avuncular professor to the conniving criminal. The reversal of Jack’s fortunes is classically tragic, resulting from his flawed self-assurance. He both fears and longs for a conclusion to the uncertainty, desiring the resolution inevitable at the conclusion of any plot. It is as if he had read Aristotle’s Poetics and now awaits the catharsis available at the ending.
Keeping in mind E.M. Forster’s comments in Aspects of the Novel on the difference between “plot” in drama and the modern novel—the latter of which gives much greater emphasis to character development and action which derives organically from that development— Aristotle’s well-known emphasis on the unity and parts of a plot reveals what we as readers seek in narrative. Turning on either (though ideally both) a recognition on the part of a character or a reversal of his fortunes, the best plots are those which elicit sympathy and pity for the characters, resulting in catharsis for the audience. But the emotional payoff can come only at the conclusion, the result of both identifying with the characters and realizing that though you could be in their situation, you are not.
Not only does DeLillo masterfully plot White Noise; his characters also speak eloquently of “plots.” Lecturing to his class, Jack opines that “All plots tend to move deathward. This is the nature of plots. Political plots, terrorist plots, lovers’ plots, narrative plots, plots that are part of children’s games. We edge nearer death every time we plot. It is like a contract that all must sign, the plotters as well as those who are the targets of the plot.” In other words, plotting is a way of reaching an end, the conclusion, and resolving whatever degree of mystery is left in a narrative or life. A plot gives structure to messy and meaningless facts by tying them together but in so doing, requires that the telling be curtailed, sometimes prematurely (for instance, the litigation and environmental cleanup from the oil spill will undoubtedly be with us for years to come, but the “narrative” of events that our culture will construct—in the media and in court—will likely provide an ending that doesn’t account for these lingering signs of the spill).
Aware that death is growing inside him, Jack has essentially short-circuited his life’s “plot.” There is no mystery left. Asked if he would like to know the exact date of his death, he says “Absolutely not. It’s bad enough to fear the unknown. Faced with the unknown, we can pretend it isn’t there. Exact dates would drive many to suicide, if only to beat the system.”
As writers and readers, we are bound to what Forster called the “tyranny of the plot.” Obligated to tie up loose ends, the writer must often sacrifice true characterization, curtailing the organic development of his characters (often with a “contrived” death or marriage, though obvious exceptions are the modernist ambiguous ending and the postmodern fragmented narrative). Forster questions the necessity of false endings: “Why is there not a convention which allows a novelist to stop as soon as he feels muddled or bored? Alas, he has to round things off, and usually the characters go dead while he is at work, and our final impression of them is through deadness.” Why must all things move “plotwards”? How can the “deadness” of the characters (both creatively and in the plot) be accounted for? It is as if writers are compelled to sacrifice their characters to the reader’s need for catharsis and redemption, found in the resolution of the plot. This, I believe, is the answer given by Aristotle. We need endings to reassert our own humanity and to find life even in death.
In this way, there is something life affirming in even the greatest disasters. But only after they have ended: only after the tale of survival has been concluded and can be retold, filling in the gaps in a way that brings logic to bear on the messiness of life, creating a narrative that allows those not directly affected (the “audience” of the disaster) to live with fear by rehearsing disaster through its displacement. As stated by one of the characters in the novel, “The more we rehearse disaster, the safer we’ll be from the real thing.”
But we live in the gap, in that middle section of the novel where nothing is resolved and everything is at stake. Rereading White Noise, I recognized that plotting and planning are just ways in which I try to project order onto chaos. This is where fiction departs most drastically from life. In reading fiction, we must learn to willingly suspend disbelief. But the beauty of living in the middle is the ability to will ourselves to believe that in these moments of suspension there is opportunity for human action.

What may seem like a frontal attack on standard written English is in fact something quite different: the rise of a new public language, heavily influenced by oral speech, that, supercharged by online and television discourse, does much of the actual persuading in modern life while leaving standard, university-taught English unscathed.

When I found Jim Thompson’s The Transgressors at a recent used book sale, I became cartoonishly excited. Thompson is one of my favorite pulp novelists, and there was nothing to dislike about my find: its cover depicted an old coupe stuck in rutted mud, with someone rushing, I guessed, to check a captive in its trunk. The back-cover copy described a reliably sadistic tale; an old New Republic blurb promised “a tour of hell.” It was called The Transgressors, for Christ’s sake. I couldn’t have asked for more.
My excitement stemmed from The Nothing Man, a 1954 Thompson novel that I discovered a decade ago, having never before heard of the book or the author. It told the story of a sexually disfigured veteran who goes on a killing spree to cover up his secret -- as if being a serial killer was the lesser shame. It was a bizarre, unsettling book, angry with energy, that seemed to have been written yesterday, not during the Dwight D. Eisenhower years.
Unfortunately, it’s been downhill from there. In the years since, I’ve returned to Thompson again and again, and I’ve never recaptured the feeling that jumped from The Nothing Man. The Getaway -- made famous by Steve McQueen, and later, Alec Baldwin and Kim Basinger -- came close, but was derailed by an outré epilogue that seemed imported from a different book. The Killer Inside Me, The Golden Gizmo, Pop. 1280 -- all were fine, but none supplied the blast of that first discovery. The Transgressors turned out to be readable, but was the worst Thompson I’ve read -- and it led me to ask myself a question common to fading relationships: Is it him or is it me?
I’ve experienced this pattern -- manic love, followed by a futile attempt to regain said manic love -- with others: I fell for T.C. Boyle after my brother slipped me The Tortilla Curtain, and despite Drop City, When the Killing’s Done, The Harder They Come, and others, it remains my favorite Boyle. Portnoy’s Complaint, The Sun Also Rises, and Billy Bathgate are my favorite Philip Roth, Ernest Hemingway, and E.L. Doctorow novels -- and were the first of each I read. These are legendary figures, responsible for a raft of classics. Is it possible that out of all of their works, those three are the best? Did I just happen to choose each one’s greatest effort right out of the gate?
I would say that I didn’t. My experience with Thompson, and to a lesser degree with Boyle, has led me to believe that the discovery of a new style -- Thompson’s turbo-charged dissolution; Boyle’s burbling streams of words -- eclipses the storytelling that the style supports. Reading the stylistically unfamiliar -- be it Evie Wyld or Lauren Groff or Patrick deWitt -- can be so pleasingly disorienting that it leaves the reader giddy: This is incredible, we think as we flip on through. This is a totally new experience. As a reader, it’s the moment you seek -- but that euphoria can also distort your inner Michiko Kakutani and set future expectations impossibly high.
Breakfast of Champions might not have been Kurt Vonnegut’s hands-down masterpiece, but I read it before any of his others -- and for that, it’s elevated in my mind. With its drawings of sphincters and cows and general jokiness, it’s no Cat’s Cradle or Slaughterhouse-Five -- but it doesn’t need to be, because the Vonnegut basics were there. And most importantly, for me, it came first. When you see a beautiful stranger across a crowded room, it doesn’t matter that he or she might not be having the most attractive day of his or her life. The spark fires either way, and you won’t forget the moment.
This isn’t to say that quality doesn’t matter. The first Danielle Steel novel you read will be as crummy as your last, and no matter when you read Tough Guys Don’t Dance, it won’t top The Naked and the Dead. But the timing of a work’s discovery can, at the very least, introduce a welcome uncertainty into an overbearing consensus. In everything I’ve read about Jim Thompson, I haven’t seen much mention of The Nothing Man. This isn’t all that surprising, since he wrote more than 30 books. But critics consistently cite The Killer Inside Me and Pop. 1280 as two of his best. Maybe they are. But maybe those critics would think differently if they’d read The Nothing Man first. As for me, I’ll keep trying: there are still a few unread Thompson novels sitting on my shelf.
Image Credit: Flickr/Steven Guzzardi

4 comments:

A note on vaccination: In her beautiful and fascinating Turkish Letters, Lady Mary Wortley Montagu describes the Turkish practice of "engrafting" against the small pox in a letter dated 1717. This was vaccination without injection, rather by making a small cut or scratch on the arm into which what Montagu calls "venom" was rubbed. Montagu (better known as a one-time friend and then bitter enemy of Alexander Pope) had her infant son vaccinated and brought the practice back to England in the 1720s.

The great contemporary British novelist William Boyd gave us a novel called "The Blue Afternoon" in which one of the central characters is a Scottish doctor, circa 1900, living in (and attempting to bring modern medicine to) the Philippines – when assorted murders and pioneering air-flights aren't leaping off the page. But I digress: it's a fine book. I highly recommend it.

In one of his essays, the late Nigerian writer Chinua Achebe stated that “no one be fooled by the fact that we write in English, for we intend to do unheard-of things with it.” That “we” is, in essence, an authoritative oratorical posture that casts him as a representative of a group, a kindred of writers who -- either by design or fate -- have adopted English as the language of literary composition. With these words, it seems that to Achebe the intention to do “unheard-of” things with language is a primary factor in literary creation. He is right. And this should be the most important factor.
Achebe was, however, not merely speaking about the intention of his contemporaries alone, but also of writers who wrote generations before him. Among them would be, ironically, Joseph Conrad, whose prose he sometimes queried, but who embodied that intention to the extent that he was described by Virginia Woolf as one who “had been gifted, so he had schooled himself, and such was his obligation to a strange language wooed characteristically for its Latin qualities rather than its Saxon that it seemed impossible for him to make an ugly or insignificant movement of the pen.” That “we” also includes writers like Vladimir Nabokov of whom John Updike opined: “Nabokov writes prose the way it should be written: ecstatically;” Arundhati Roy; Salman Rushdie; Wole Soyinka; and a host of other writers to whom English was not the only language. The encompassing “we” could also be expanded to include prose stylists whose first language was English like William Faulkner, Shirley Hazzard, Virginia Woolf, William Golding, Ian McEwan, Cormac McCarthy, and all those writers who, in most of their works, float enthusiastically on blasted chariots of prose, and whose literary horses are high on poetic steroids. But these writers, it seems, are the last of a dying breed.
The culture of enforced literary humility, encouraged in many writing workshops and promoted by a rising culture of unobjective literary criticism, is chiefly to blame. It is the melding voice of a crowd that shouts down those who aspire to belong to Achebe’s “we” from their ladder by seeking to enthrone a firm -- even regulatory -- rule of creative writing. The enthroned style is dished out in the schools under the strict dictum: “Less is more.” Literary critics, on the other hand, do the damage by leveling variations of the accusation of writing “self-conscious (self-important; self-aware...) prose” on writers who attempt to do “unheard-of” things with their prose. The result, by and large, is the crowning of minimalism as the cherished form of writing, and the near rejection of other stylistic considerations. In truth, minimalism has its qualities and suits the works of certain writers like Ernest Hemingway, Raymond Carver, John Cheever, and even, for the most part, Chinua Achebe himself. With it, great writings have been produced, including masterpieces like A Farewell to Arms. But it is its blind adoption in most contemporary novels as the only viable style in the literary universe that must be questioned, if we are to keep the literary culture healthy.
One of the insightful critics still around, Garth Risk Hallberg, describes this phenomenon in his 2012 New York Times review of A.M. Homes’s May We Be Forgiven with these apt observations:
The underlying problem here is style. Homes’s ambitions may have grown in the quarter-century since The Safety of Objects was published, but her default mode of narration remains mired in the minimalism of that era: an uninflected indicative voice that flattens everything it touches. Harry gets some upsetting news: 'Two days later, the missing girl is found in a garbage bag. Dead. I vomit.' Harry gets a visitor: 'Bang. Bang. Bang. A heavy knocking on the door. Tessie barks. The mattress has arrived.'
Hallberg goes on to describe, in the next two paragraphs, the faddist nature of the style:
Style may be, as Truman Capote said, 'the mirror of an artist’s sensibility,' but it is also something that develops over time, and in context. When minimalism returned to prominence in the mid-80s, its power was the power to negate. To record yuppie hypocrisies like some sleek new camera was to reveal how scandalous the mundane had become, and how mundane the scandalous. But deadpan cool has long since thinned into a manner. Its reflexive irony is now more or less the house style of late capitalism. (How awesome is that?)
As a non-Western writer, I find it comforting to know the origin of this fad. But as Hallberg pointed out, context, not tradition, is what should decide or generate the style of any work of fiction. Paul West noted in his essay “In Praise of Purple Prose,” written around the heyday of minimalism in 1985, that the “minimalist vogue depends on the premise that only an almost invisible style can be sincere, honest, moving, sensitive and so forth, whereas prose that draws attention to itself by being revved up, ample, intense, incandescent or flamboyant turns its back on something almost holy -- the human bond with ordinariness.” This rationale, I dare say, misunderstands what art is and what art is meant to do. The essential work of art is to magnify the ordinary, to make that which is banal glorious through artistic exploration. Thus, fiction must be different from reportage; painting from photography. And this difference should be reflected in the language of the work -- in its deliberate constructiveness, its measured adornment of thought, and in the arrangement of representative images, so that the fiction about a known world becomes an elevated vision of that world. That is, the language acts to give the “ordinary” the kind of artistic clarity that is the equivalent of special effects in film. While the special effect can be achieved by manipulating various aspects of the novel -- the structure, the voice, the setting, and others -- the language is the most malleable of them all. None of this can be achieved with sparse, stripped-down prose that mimics silence.
The sinuous texture of language, its snakelike meandering, and its eloquent intensity are the only suitable way of telling the multi-dimensional and tragic double Bildungsroman of the “egg-twin” protagonists of Arundhati Roy’s The God of Small Things. Roy’s narrator, invested with unquestionable powers of insight and a deliberative lens, is able to maintain a concentrated force of focus on a very specific instance, scene, place, or action. Hence, the writer -- like a witness of such a scene -- is able to move with the sweeping prose that will at once appear gorgeous and at the same time be significant and memorable. Since Nabokov’s slightly senile narrator in Lolita posits that “you can always trust a murderer for a fancy prose style,” we are able to understand why Humbert Humbert would describe his lapsed sexual preference for Dolores while in bed with her mum in this way: “And when, by means of pitifully ardent, naively lascivious caresses, she of noble nipple and massive thigh prepared me for the performance of my nightly duty, it was still a nymphet’s scent that in despair I tried to pick up, as I bayed through the undergrowths of dark decaying forests.” Even though the playfulness of Humbert’s elocution is apparent, one cannot deny the aptness -- and originality -- of the description of Humbert’s response to the pleasure his victim gives him.
It is not, however, that the “less is more” nugget is wrong; it is that it makes a blanket pronouncement, treating as taboo any writing that tends to make its language artful. When sentences must be only a few words long, it becomes increasingly difficult to execute the kind of flowery prose that can establish a piece of writing as art. The dictum also rests on a sandcastle logic, which should crash in the face of even the lightest scrutiny. For the truth remains that more can also be more, and that less is often inevitably less. What writers must be conscious of, then, is not long sentences, but the control of flowery prose. As with anything in this world, excess is excess, and inadequacy is inadequacy. A writer must know when the weight of the words used to describe a scene is bearing down on the scene itself. A writer should develop the judgment to know when to describe characters’ thoughts in long sentences and when not to. But a writer, above all, should aim to achieve artistry with language, which, like the painter’s canvas, is the only medium we have. Writers should realize that the novels that are remembered, that become monuments, will in fact be those that err on the side of audacious prose, that occasionally allow excess, rather than those that package a story -- no matter how affecting -- in inadequate prose.
In the same vein, it is not that describing a writer’s prose as “self-conscious” is wrong; it is that it misallocates blame to the wrong part of a writer’s work. Self-consciousness is a term that mostly describes the metafictional qualities of a work; it cannot, in effect, describe the use of language. “The hand of the writer” can appear in the framing of a story, in its structure, in the characterization, in the form of experimental works and frame narratives, but it cannot appear in its language. “Self-consciousness” cannot be applied to the use of words on the page, just as Wolfgang Amadeus Mozart cannot be accused of writing a self-conscious tune or Yinka Shonibare of making self-conscious art. Self-consciousness or pomposity cannot be reflected in a piece of writing, except in its tone, and in fiction, this is even harder to detect. What can be reflected in a piece of writing is excess and lack of control, which can stand in the way of anything at all in life. What critics should be calling out is pretentious, unsuccessful gloss that lacks measure and control. They should call out images that might be inexact, ineffective, or superfluous. When critics plunge head-on against great writers (Don DeLillo, Cormac McCarthy, etc.) in the manner of B.R. Myers’s agitated fracking masquerading as “criticism,” they only end up scaring other writers from attempting to pen artistic prose. Fear may be what many writers today are showing when they indulge in seemingly artless prose. Authorial howls of artful prose of the kind created by James Joyce, Faulkner, Nabokov, Cormac McCarthy, and Shirley Hazzard are becoming increasingly rare -- sacrificed on the altar of minimalism. Hence, it is becoming more and more difficult to differentiate between literary fiction and mass-market commercial genre pieces, which, more often than not, are couched in plain language.
The gravest danger in conforming to this prevailing norm is that contemporary fiction writers are unknowingly becoming complicit in the ongoing disempowering of language -- a phenomenon that the Internet and social media are fueling. Words were once so powerful, so revered, that, as culture critic Sandy Kollick once observed, “to speak the name of something was in fact to invoke its existence, to feel its power as fully present. It was not then as it is now, where a metaphor or a simile merely suggests something else. To identify your totem for a preliterate gatherer-hunter was to be identical with it, and to feel the presence of your clan animal within you.” But no more. So many words are now being produced in print and visual media that the power of words is diminishing. There are now simply too many newspapers, too many books, too many blogs, too many Twitter accounts for words to maintain their ancestral sacredness. And as writers adjust the language of prose fiction to conform to this era of powerless words, language is disempowered, leading -- as Kollick further points out -- to the inexorable “emptying out of the human experience,” the very object fiction was meant to preserve in hardbacks and paperbacks.
It is therefore necessary that writers everywhere see it as their ultimate duty to preserve the artfulness of language by writing audacious prose. Our prose should be the Noah’s ark that preserves language in a world being apocalyptically flooded with trite and weightless words. “The truest writers,” Derek Walcott said, “are those who see language not as a linguistic process, but as a living element.” By undermining the strongest element of our art, we are becoming unconscious participants in the gradual choking of this “living element,” the lifeblood of our art. This we must not do. Rather, we must take a stand in confirmation of the one incontestable truth: that great works of fiction should succeed not only on the strength of their plots or dialogue or character development, but also by the audacity of their prose.
Image Credit: Wikipedia.


1.
The year is 1984, and in the quiet center of a declining Midwestern city, the Indians start to appear. They loiter on skybridges over otherwise dead downtown streets. They pose for snapshots in front of the train station, gather in saris for picnics on the hill beneath the art museum. An Indian princess suddenly marries the heir to a local brewery. At the annual Veiled Prophet Ball, where the city’s elite honors one of its own, the Prophet’s throne stands empty. Most mysteriously of all, after the city’s longstanding police chief retires, he passes over local candidates to select an unknown woman from Bombay as his successor. “The city was appalled,” the novel begins, “but the woman -- one S. Jammu -- assumed the post before anyone could stop her.”
The Twenty-Seventh City was published twenty-five years ago this month by a young writer named Jonathan Franzen. The book’s cover reflected the soaring ambitions of its author, an antiquated skyline dominated by an outsized Gateway Arch and a female face staring out intensely from under her bindi, sometimes called a third eye. The city was St. Louis -- once the fourth largest city in the U.S., it had dropped to twenty-seventh by 1988 -- helpfully rendered on a map inside the front cover as if it were a fantasy novel, the Midwest as Middle Earth. And in some ways it was a fantasy, the dark twisted fantasy of a native son.
Wasting little time, S. Jammu begins reconfiguring the political landscape. Her immediate goal is to restore St. Louis to its former glory by reintegrating the city with the more affluent and powerful county, from which it split off in the late 19th century. To this end, she funnels millions of foreign dollars into real-estate speculation on the city’s north side. She quickly converts the mayor, gains traction with the black community, and co-opts prominent business and governmental leaders to her cause. Along with her accomplices, most notably a decadent radical named Singh, she enacts a subversive program inspired by Indira Gandhi’s martial-law-like crackdown, the Emergency. The homes of prominent St. Louisans are bugged. When coercion and bribery fail, the arrivistes are not afraid to resort to car bombs, roadblocks, and paramilitary strikes -- what might be called limited acts of terror.
The only man who stands in Jammu’s way is Martin Probst, a contractor from Webster Groves, the inner-ring suburb where Franzen grew up. The builder of the iconic Arch, and the widely respected leader of the civic-improvement organization Municipal Growth, Probst is a noble capitalist Ayn Rand could almost love (he defeated the unions but probably treats his employees too well). Probst distrusts Jammu and leads the opposition to her takeover of the city. This drives Jammu and Singh to extraordinary measures: they will attempt to induce “the State” in Probst. The State is a shattered, vulnerable condition “in which a subject’s consciousness became extremely limited.” Singh’s account of the operation is chilling:
As a citizen of the West, Probst was...sentimental. In order to induce the State in him, it might be necessary only to accelerate the process of bereavement, to compress into three or four months the losses of twenty years. The events would be unconnected accidents, a “fatal streak”...lasting only as long as it took Probst to endorse Jammu publicly and direct Municipal Growth to do likewise.
Probst’s “fatal streak” begins with the death of the family dog, and escalates to the choreographed estrangement of his teenage daughter, who moves into the apartment of a young photographer. When Probst refuses to bend, Singh kidnaps his wife, Barbara. From its premise the novel extracts a ruthless set of consequences, spelled out in technocratic and emotionless prose -- a technique that very effectively creates sympathy for the Probst family and its embattled patriarch. Probst is a flawed but decent man, devoted to his family and his privacy: his most characteristic expression is an awkward “well!” Even as Probst’s family falls apart, the peripheral characters in his life close in, such as his old and pitiable high school friend Jack DuChamp, the excellently unhinged gardener Mohnwirbel, and the right-wing lunatic General Norris (in this book, Norris has it all right). These characters seem like the repressed specters haunting Probst’s orderly American mind. What is stripped away by the conspiracy against him, and by extension the novel itself, is his “wellness,” his comforts and psychic embankments. It is not until his memorably germ-infested visit to a shopping mall on Christmas Eve that he recognizes what has happened to him: “He was sick, and the city was sick on the inside too, choking on undigested motives, racked by lies.”
2.
It was a long, dense, problematic novel about a city not exactly at the center of the nation’s consciousness, then or now. Nevertheless, Franzen’s debut was widely reviewed and, for the most part, highly praised. Richard Eder’s rave in the Los Angeles Times was titled “America’s History May Not Be Written by Americans.” In the New York Times, Michiko Kakutani was more ambivalent, noting that “the storyline about a charismatic, Marxist-indoctrinated woman’s attempt to seize control of an American city by using terrorist tactics...sounds like a red-baiting, paranoid nightmare come true.” Neither response fully captured the anger of the novel or the extent of Franzen’s imaginative allegiance with the outsiders.
The local media saw it differently. The St. Louis Post-Dispatch ran a defensive article about Franzen entitled “Don’t Judge by Cover: Author Likes His Hometown.” Referring to the first edition’s cover art, but implicitly to the novel itself, the Post asked: “Why so much distortion? Why would a son of St. Louis be so hard on his hometown?”
Franzen’s deeply ambivalent portrait of the city provokes these questions, and also exposes the bind of the first-time Midwestern novelist: even while the speculative plot unleashes chaos on St. Louis, the city itself is rendered with a wealth of local detail that I imagine must be exhausting to many coastal readers. Franzen builds up and dismantles the city at once, using a sinuous omniscient voice that glides between the locals and the plotting Indians (Jammu and Singh evoke the city’s imperial past when they attribute their terrorist acts to a front group called the Osage Warriors). It’s interesting to learn that the character Jammu was imported from a play Franzen wrote at Webster Groves High School. Behind the Pynchonesque conspiracy, there is an adolescent revenge fantasy at the novel’s heart, which produces some of its most inspired scenes: a suburban family taking cover as their windows shatter with gunfire, an explosion in a TV station parking lot, mass panic at a pro football game. Franzen reimagines the Midwest as an oddly theatrical war zone where terror is a fact of life. But the novel also makes us feel the loss of the Probsts’ rich, cluttered domestic life in Webster Groves, a history that readers must infer almost archeologically from its ruins. If it were possible to write a book of violent nostalgia, Franzen had succeeded.
3.
My wife and I were surprised to find how much we liked St. Louis, after we moved here in the fall of 2004. We knew very little beyond the ominous reports that had filtered through the national media. “All cities are ideas,” Franzen writes. “They create themselves, and the rest of the world apprehends them or ignores them as it chooses.” By the time we arrived, the twenty-seventh city had fallen to the fifty-second (it is now the fifty-eighth). What we encountered was a vexed landscape, a crumbling but also rebuilding city which welcomed us into its project of rehabilitation. I read Franzen’s novel as a primer, a narrative of tragic decline, from the eclipse of St. Louis by Chicago in the 1870 census and the city’s shining moment at the 1904 World’s Fair, to de-industrialization, white flight, and the demolition of the notorious Pruitt-Igoe housing complex in the 1970s. Still, we’d never seen structures of such peculiar spectral beauty as the looming red-brick buildings that seemed to line every St. Louis street. While the city’s inequalities could be disorienting, a single wrong turn taking you from stable neighborhoods to areas of surreal devastation, it was also a fascinating place. We felt like we were living someplace where we could matter. After graduating with her master’s degree in urban planning, my wife found work managing data and making maps for a nonprofit that revitalized low-income neighborhoods. Despite the city’s rumored insularity, we grew connected and invested here, and within a few years we bought a house, adopting the city and its problems as our own.
In March 2008, on her way home from work, my wife was attacked on a quiet street just blocks from our house. What began as a mugging devolved into sexual assault. (She later brilliantly documented how the attack altered her mental map of the city on her blog.) A few days later, the police caught up to the perpetrator and arrested him in the bird sanctuary of a nearby park. He pled guilty to all charges, sparing my wife from testifying at his trial, so in this limited, legal sense, everything was resolved. Yet at the same time, over the months and years to follow, she was haunted by the experience in State-like ways. And while her experience remained fundamentally unimaginable to me, no matter how many times I replayed her description in my head, my confusion and anger became its own kind of State, so that I would join her there. It was impossible not to think of her as I reread the passages about Barbara Probst’s captivity in a desolate East St. Louis warehouse. To maintain the charade that Barbara has left Probst for him, Singh dictates her weekly phone calls to her husband, and as artificial as they are, these scenes do actually capture the distortion, the brittleness that can enter a relationship after a trauma. It never felt like we were alone in those days, as if our conversations were being filtered through an interpreter. We could feel, with Probst, that “the whole city [was] a thing of foreignness and menace.” We turned off the news: every report of violence -- and these were violent post-recession years in St. Louis -- resounded with suddenly personal import. My wife carried a timetable of civil twilight so that we would never be caught outside after dark; in the dark we stayed home and watched TV, something safely fictional. Guilt filtered into our daily lives, leading us to question our most basic acts, until we felt culpable in our mere presence. We wondered if our earlier enthusiasm for St. Louis wasn’t naive. 
At one point, Franzen writes of Barbara Probst: “This was the worst pain of all, that the world seethed with motives she could never grasp.” While we eventually emerged, and saw the attacker as an individual rather than a malign force, his crime something that could have occurred anywhere, the city never looked exactly the same.
4.
It was another St. Louisan, T.S. Eliot, who wisely said that humankind cannot bear very much reality. I certainly can’t. Books serve me both as a way to confront and avoid real difficulty, and my wrenching ambivalence about The Twenty-Seventh City probably results from the ways it hits too close to home and doesn’t allow me to escape. There is something unsettling about the novel’s tentacular hold on my own experience in the city it depicts. Books can become essential to us in strange and invasive ways, almost against our will.
Franzen continues to have a remarkable ability, both as a writer and a persona, to touch nerves, and his divisiveness is surely a sign of his strength. While I’ve enjoyed all of Franzen’s subsequent work and recognize the technical gains he has made as a storyteller, nothing has moved me personally like his first novel. “I was trying to write an uncanny book,” Franzen told The Paris Review. “A book about making strange a familiar place...that was the feeling I was after...what kind of weird, surreal world have I fallen into here, in the most boring of Midwestern cities?” Well, I disagree about the boring part, and I think The Twenty-Seventh City succeeds, insofar as it does, not only by making St. Louis strange but by drawing out the latent strangeness in the city’s history. The audacity of Franzen’s project still resonates in the city today -- a local developer’s north-side regeneration project bears an uncomfortable resemblance to Jammu’s land grab -- and its visionary streak stands as something of an unfulfilled promise in his later work. It will be reissued in November as the first Picador Modern Classic.
“Only St. Louis knew,” Franzen writes. “Its fate was sealed within it, its special tragedy nowhere else.” The narrative of tragic decline is seductive in its own way, partly because it relieves the mourner from the responsibility of forming new conspiracies to make the city better. All cities are ideas, and St. Louis’s struggle, as in other Midwestern cities, is partly the mental one of convincing itself that it is not specially doomed. Looking closely, there are definite signs of progress: new residents downtown, an undersung art scene, community development on the north side, consolidation of chambers of commerce and law-enforcement functions. There is even some renewed talk of a Great Reconciliation between the city and the county. The Twenty-Seventh City itself ends darkly in a series of ironic anticlimaxes, reflecting the growing cynicism of the young man from Webster Groves. After almost a decade here, I understand how this city could have driven Franzen nuts and broken his heart. It’s hard to say how long we’ll stay in St. Louis, but despite all its obvious issues, despite everything, we’ll always be rooting for this town. It’s harder to say what I think of The Twenty-Seventh City. Reading it again, I experience its pervasive uncanniness, the sense of being somewhere close to home, but not quite. It also makes me a bit sad, almost as if I’m reading a posthumous work. That St. Louis kid is long gone.