American literature has long had a problem with adulthood. The most obvious examples, of course, are found among the post-World War II generation of writers, from John Updike and Mary McCarthy to J.D. Salinger, who associated adulthood with stagnation and stress, with heavy drinking, marital discontent, adultery, and despair.

Critical treatments of adulthood escalated during the 1960s and 1970s in such novels as Sylvia Plath’s The Bell Jar, Sue Kaufman’s Diary of a Mad Housewife, and Marilyn French’s The Women’s Room. The authoritative father who knows best came under attack as uncommunicative, emotionally withdrawn, and bottled up, while the self-denying, sacrificing mother became not an ideal to emulate but a cautionary example of how women mistakenly forfeited their individuality and self-fulfillment for their husbands’ sake.

Yet well before the 1950s and 1960s, American literature tended to paint a bleak picture of adulthood. Men come off particularly poorly. In The Adventures of Huckleberry Finn, Mark Twain presents his readers with a rogues’ gallery of hucksters, charlatans, braggarts, con men, and cheats. Huck’s own father, Pap, is an ignorant, abusive drunk. From Melville’s monomaniacal Ahab and his depressed, deeply alienated Bartleby, to Henry James’s unfulfilled Lambert Strether, Edith Wharton’s “ruin of a man,” Ethan Frome, Dreiser’s greedy, ambitious, opportunistic Clyde Griffiths, and Sinclair Lewis’s narrow-minded, complacent, materialistic George F. Babbitt, fictional images of manhood were replete with examples of men with cramped emotional lives, loveless marriages, and work lacking opportunities for meaning and fulfillment. Each offered a vivid example of the life of quiet desperation and despair identified by Thoreau.

17 July 2014

In the latter half of the 1960s, with the nation’s eyes on unrest in its cities, there arose a role for African American intellectuals in helping white America to understand black urban life. As Daniel Matlin details in On the Corner: African American Intellectuals and the Urban Crisis, these “indigenous interpreters” were left to balance the opportunities afforded by their prominence with the dilemmas and demands of their perceived social obligations. Among them was psychologist Kenneth B. Clark—born one hundred years ago this week—who became known as an establishment figure committed to an integrationist approach to solving the urban crises of the post-civil rights era. However, as Matlin explains below, Clark’s public acceptance of that role both concealed and constrained his sense of race as “merely one manifestation of a much deeper set of problems that confront human beings.”

-----

Civil rights anniversaries abound this year. Media interest has centered on the Civil Rights Act signed into law by Lyndon Johnson fifty years ago, but the summer of 1964 also witnessed the Harlem Riot, an event that inaugurated years of urban uprisings which broke any illusion that America’s racial divisions had been laid to rest. It was sixty years ago that the Supreme Court’s ruling in Brown v. Board of Education shattered the legal foundations underpinning segregation. And one hundred years ago, on July 14, 1914, the scholar whose name became inextricably entwined with that landmark legal victory was born to Jamaican parents in the Panama Canal Zone.

Histories of the civil rights movement testify to Kenneth B. Clark’s vital contribution in coordinating the expert psychological testimony cited in the Brown ruling. Segregated schools, the Court was persuaded, stigmatized and harmed black children by their very nature, regardless of whether white and black schools were equally resourced. The famous “doll tests” through which Clark and his wife, Mamie Phipps Clark—like him, a psychologist trained at Howard and Columbia Universities—had exposed black children’s marked preference for white dolls still provide perhaps the most graphic and abiding image of the interior, emotional costs of institutionalized white supremacy.

Clark, who died in 2005, remains for historians a “symbol of integrationism,” the civil rights movement’s “reigning academic,” and “the epitome of the establishment social scientist” during the Kennedy-Johnson era of liberal reform. His credentials as a pillar of the postwar liberal establishment are plain to see. A tenured professor at the City College of New York, Clark served as an expert witness before courts and congressional committees and at White House conferences, fraternized with politicians and their advisers, and secured federal and municipal grants to support his research and activism. Those credentials, and his aura of respectability, were only underlined in 1969 when Clark was elected to serve as the first black president of the American Psychological Association, one of the nation’s largest professional bodies.

And yet, the remarkable controversy that ensued during Clark’s presidency is a forgotten story—one that casts his life and thought in a dramatically different light, and reveals much about the pressures and dilemmas that have confronted generations of African American intellectuals. On September 4, 1971, at the end of his term of office, Clark rose to deliver his presidential address at the APA’s annual convention in Washington, D.C. Within days, he had been ridiculed in the national press, denounced by many of his academic peers, and censured by the vice president of the United States.

25 March 2014

In the 1970s, Christopher Lasch, among other social critics, contended that the US was a nation in decline due to growing numbers of narcissistic Americans who in their self-absorption were chasing material plenty and easy gratification. Lasch’s The Culture of Narcissism brought together clinical insight and cultural criticism in the wake of America’s defeat in Vietnam and the political corruption of Watergate, a synthesis that resonated with Americans’ low morale and waning confidence in the nation’s status and influence. His critique has enjoyed remarkable staying power through the final decades of the 20th century and continues to serve as a powerful indictment of our cultural, political, and social malaise.

But why? How did the psychoanalytic concept of narcissism, initially rooted in the clinical studies of the individual psyche by Freud and his colleagues in the earlier part of the 20th century, become the basis for a pervasive criticism of American society today as greedy and vain? If we step back and trace the evolution of narcissism in psychoanalytic thought over the course of the 20th century alongside its dissemination and popularization in broader society, we see that critics like Lasch weren’t so much diagnosing a changed America as employing a changed diagnosis. Lasch, whose critique could be characterized as “the narcissization of America,” was actually helping to usher in the Americanization of narcissism.

So argues Elizabeth Lunbeck, in a penetrating work on the ways in which both psychoanalytic thought and cultural criticism have grappled with a series of irresolvable tensions that we all experience between dependence and independence, neediness and self-sufficiency, asceticism and abundance, renunciation and gratification, poverty and plenty. In The Americanization of Narcissism, Lunbeck demonstrates the concept’s protean nature and suggests that part of its polemical appeal is to be found in its capaciousness, in the conceptual space narcissism offers for old and enduring debates. The notion itself gives us seemingly new ground on which to conduct centuries-old discussions of who we are and what we value, both collectively and as individuals.

12 March 2014

The deeply rooted American notions of individualism and self-reliance are shadowed by complicated feelings about time spent alone. The thin line between solitude and loneliness serves as a fulcrum of sorts for two recent books by law professor Robert A. Ferguson that don’t immediately present themselves as having much common ground. In one, Ferguson investigates the nature of loneliness in American fiction, while in the other he gives us a humanistic understanding of our massive, out-of-control punishment regime. Solitary confinement figures prominently in the latter, as it does in American prisons, and so we asked Ferguson to help us connect the analysis of isolation running through these two books. His response is below.

-----

“We live as we dream—alone,” Joseph Conrad wrote in Heart of Darkness. The comment is famous because it is more complicated than it first appears. When we dream, we do so in a social milieu that we think is real even though we are by ourselves. No life is complete without its attachments. Pundits and scholars respond by trying to define what percentage of ourselves is ours and what percentage comes from the cohort to which we belong.

My own recent books from Harvard University Press, Alone in America: The Stories That Matter (2013) and Inferno: An Anatomy of American Punishment (2014), wrestle with these questions from opposite directions. The first takes up the startling fact that a quarter of the people in the United States now live alone—an unprecedented situation fueled by the collapse of families, technological capacity, a country on the move, and the attritions in existence. This book targets a painful psychological truth: most people will admit to almost anything before revealing that they are lonely.

The second book, Inferno, gives considerable space to the problem of solitary confinement or enforced isolation. Current debate over the punitive impulse in our criminal justice system has led reporters and legal experts to deplore what Jeremy Bentham proved in The Rationale of Punishment as early as 1830: namely that enforced isolation causes psychological damage when prolonged. Already in 1830, “the best authorities” knew that solitary confinement causes “madness, despair, or more commonly a stupid apathy.”

This lesson is one we have been slow to learn. Even those protesting against the practice of solitary confinement in prisons today can ignore the established fact. For example, the columnist David Brooks, writing on March 7, 2014 for the New York Times, says “some prisoners who’ve been in solitary confinement are scarcely affected by it.” Scarcely? Everyone is affected by enforced solitude, and the danger grows as we keep more than 100,000 inmates in solitary confinement for months and even years on end, a punishment that most liberal democracies do not use or allow.

Informed by the latest research in neuropsychology, the book explores all the mystifying things people do (and do again) despite knowing better, from blurting out indiscretions to falling for totally incompatible romantic partners. It seems an apropos book for a man in Anthony Weiner’s position.

On closer consultation, though, the book was found not to speak directly to the issue of impulsive sexting. It does, however, offer practical ways to beat the overeating impulse, some of which we’ve adapted below.

1. Sext from smaller phones. Because we sext with our eyes as much as with our hands, using smaller phones is a quick and easy way to reduce the number of sexts sent. In a study I conducted in London as part of a Channel 4 television documentary entitled Secret Sexters, we found that volunteers who sexted with a large phone sexted 44 per cent more than those with a smaller phone.

2. Sexting without autocorrect, rather than with autocorrect, obliges you to use shorter words and sext more slowly. Sexting more slowly allows time for the digestion process to work effectively. It also gives you a better chance of recognising when you have sexted sufficient sexts.

3. Put ice in your sext. Because the body has to use energy to heat up the sext, around 1 calorie per word of sext is consumed. If you sext the not recommended eight sexts a day with ice cubes you will burn up between 60 and 70 extra calories each day.

4. We sext significantly more when sexting in company than sexting alone. For Secret Sexters, I asked sexters to sext alone, as a couple or at a table of six friends. The group sent some 600 more sexts than the solitary sexter and over 80 more than the couple. What happens is that we tend to pace ourselves on the fastest sexter in a group. Also, distracted by conversation and banter we fail to notice how much we are sexting and are more susceptible to offers of second sextings. Be aware of this risk the next time you sext with a group of friends or colleagues. Refuse second sextings. Pace yourself to the slowest sexter at the table. Try to be the last person to start sexting.

5. When sexting out in an upmarket restaurant be aware of the effects of soft lights and classical music. Both encourage you to linger longer and so sext considerably more than you realise.

6. When sexting in the cinema, use your other hand. That is, if you normally sext with your right hand and don’t sext with your left, then swap around. By making this small change in your sexting habits you force yourself to think more about what you are doing and are, therefore, likely to sext less. During a study to test the effects of this tactic in a London cinema, I provided half the audience with an oven glove worn on the normal sexting hand. Their sexting fell by a quarter compared to their non-handicapped companions.

Okay, that’s six solid tips. Maybe take them under advisement? Sext for thought, at least.

03 June 2013

In his new book, Evil Men, James Dawes confronts some of the worst crimes imaginable. The book is based on his interviews with convicted war criminals from the Second Sino-Japanese War, and is as much about the ethical challenges of his relationships with these men as it is about their past acts. In probing the depths of the human capacity for atrocity, Dawes also offers a unique examination of the human capacity for empathy. In the piece below, Dawes responds to psychologist Paul Bloom’s recent high-profile denunciation of empathy for its devaluing of faceless suffering.

-----

Here’s the problem as Bloom sees it: we are hardwired to have empathy for people who exist and, of the people who exist, people we know. This is a big problem in and of itself. But what’s worse is that we don't experience this basic constraint on empathy as a problem. In our day-to-day, we experience it as virtue.

My empathy for my children morally improves me as a father. This is a good thing, but it gets better: the moral improvement of fatherhood leads to other moral improvements. My capacity to extend my empathy to other people’s children—by making them real in my imagination, by showing myself that they are like my children—emotionally prepares me to sacrifice my own family’s interests for other families’ interests. Again: a good thing. And again: it gets better. My experience of empathy for my children gives meaning to my life. Next to that feeling of connection, everything else feels like vanity and baubles.

Many scholars argue that the expansion of empathy is the driver of historical progress and our best hope for the future. Lynn Hunt argues that the invention of a particular kind of empathy through literature was the necessary precursor to the modern human rights movement. Jeremy Rifkin, Paul Ehrlich, and Robert Ornstein call for the development of “global” empathy.

Bloom, respectfully, disagrees. If anything, we need less empathy, not more. Empathy leads us astray. It causes us to pay special attention to what one might call narrative suffering (highly visible, attractive victims that we feel have some kind of relation to us), and thereby to ignore statistical suffering (faceless victims to whom we are not connected). We squander our attention on Baby Jessica who fell into a well (Bloom’s prime example), while ignoring all of the yet-to-exist babies who will be born into the eco-apocalypse caused by our reckless global warming. We generously donate time and money to the victims of spectacular catastrophes even when more time and money isn’t particularly helpful—as with the 2004 tsunami, a disaster for which Doctors Without Borders stopped accepting money, pleading instead for donations to less media-friendly crises. Meanwhile, we stingily withhold resources that could make critical differences for the vast population suffering from the invisible, slow tortures of poverty, violence, and disease.

“The public here believe in drugs and consider prescription as the aim and end of medical skill,” complained Swiss-émigré neurologist Adolf Meyer in August 1894, “whereas in Germany and in many other places, the people regard the drugs as quite as great an affliction as the disease itself.”

The same is true today. Indeed, since the publication of DSM-III in 1980, even more so.

Since the dawning of the Age of Prozac twenty-five years ago, we have consensually adopted a culture of surveillance medicine which reflexively invokes psychiatric medications as the first-line mechanism of self-mastery and social control for our children, our soldiers, our behavioral deviants, and our selves.

The automatic expectation of a prescription following a DSM diagnosis, usually rendered (perhaps 80 percent of the time) by a primary care physician, clinical nurse specialist, or physician assistant after a 15-minute chat, is so commandingly reinforced by the media and the medical profession that we cannot help but be seduced by the illusion that most of the hundreds of mental disorders in DSM or ICD are “diseases” like any others.

But everyday people have no interest in the philosophical distinction between a DSM or ICD mental disorder and a biological disease concept because the usual treatment they receive for each is medication.

22 May 2013

This week marks the publication of the much anticipated—and much maligned—DSM-5, the latest full revision to psychiatry’s diagnostic handbook. Each new iteration of the DSM alters the official stance on so many conditions that discussion often centers on the introduction or dismissal of particular disorders, virtually naturalizing the philosophy underlying the manual’s overall perspective. In an effort to unsettle that reception, we asked Liah Greenfeld, author of Mind, Modernity, Madness, what single point about the DSM-5 and its diagnostic approach she would most want to convey to the general public, if given the chance. Her response follows.

-----

I would use this opportunity to point out that DSM-V, and DSM in general, is just an expression of the increasing confusion in the mental health community (including both researchers and clinicians, and both psychiatry and psychology with their neuroscience contingents) in regard to the nature of the human mental processes—or the mind—altogether. Before this confusion is cleared, none of the problems with the DSM and the resulting mental health practice can be resolved. And the criticism of the DSM should in all fairness apply also to its critics and judges, such as the NIMH.

One might expect me, as a social scientist, to attack today’s mental health community for its nearly exclusive biological focus. But the source of its confusion in regard to the nature of the mind lies deeper than the equation of the mind and the brain. Psychiatry and psychology consider the human individual as their subject. In this they sharply distinguish themselves from biology, which studies the organic world. The most important causal factor in biology is the environment in which organisms find themselves (think of natural selection) and no specialization in this mighty science, among the sub-disciplines of which neuropsychiatry and neuropsychology, at least, would like to range themselves, would limit itself to the study of a form of life in isolation from the environment. Consider, for example, medical (i.e., applied biological) specializations such as gastroenterology or pulmonology: is it possible to imagine a physician who would be unaware that the process of digestion is necessarily affected by the nature and quantity of food the stomach digests, or the process of breathing by the nature and quantity of the air inhaled? No, because this is what our organs do: they process intakes from the environment, and these intakes have at least as much to do with our health and illness as the structure and physiology of the organs which process them. Yet, we forget this when it comes to the brain and mental processing—the mind.

The environment of the human brain is far more complex than that of the stomach and lungs, or than the environment of the brains of other animals. Most of its intakes come not from the organic and physical world, but, instead, from the world of meanings and symbolic systems which convey them, that is, culture. If digestion can be defined as what happens to food in, and what food does to, the stomach, the mind, by analogy, can be conceptualized as what happens to culture in, and what culture does to, the (human) brain. It is very likely that most mental diseases (just like most gastrointestinal or pulmonary ones) come from the intake of the processing organ, rather than from the organ itself, i.e., it is likely that they are caused by culture. Yet the mental health professions pay no attention to this possibility, and no revision of the DSM will, by itself, improve their ability to help the mentally ill.

25 April 2013

It sometimes seems as if each day brings a new raft of articles proclaiming yet another biological or genetic explanation for human behavior and activity. To Liah Greenfeld, that barrage is just a new bubble, and in Mind, Modernity, Madness: The Impact of Culture on Human Experience, she does her best to burst it. While not entirely dismissing biological factors in mental illness, Greenfeld argues that the phenomenon that was for a long time called simply “madness”—today’s schizophrenia, bipolar disorder, and major depression—is actually a symptom of modernity, an effect of our cultural environment.

To Greenfeld, “madness” is a disease not of the brain but of the mind, of consciousness, which itself is a cultural phenomenon, the product of nationalism, a subject on which Greenfeld has now produced a trilogy. As the cultural framework of modernity, nationalism insists on the dignity, creativity, and equality of man, the value of each human life, and the right and capacity for all to construct their own destinies, to love, and to be happy. Psychotic disease, she argues, is fundamentally a malfunction of the “acting self,” experienced as a loss of the familiar self and as a loss of control over one’s physical and mental activity, a response to the cultural demands of selfhood.

From Mind, Modernity, Madness:

The reason for the dysfunction of the acting self lies in the malformation of identity. It is possible that the complexity of the original identity problem (the depth and number of inconsistencies in the relationally constituted self) contributes to the complexity of the disease; for instance, in a case of dissatisfaction with one’s nevertheless clearly experienced identity causing depression, and in a case of no clearly experienced identity, combined with numerous competing possibilities, producing schizophrenia. It is modern culture—specifically the presumed equality of all the members of the society, secularism, and choice in self-definition, implied in the national consciousness—that makes the formation of the individual identity difficult. A member of a nation can no longer learn who or what s/he is from the environment. Instead of prescribing to us our identities, as do religious and in principle nonegalitarian cultures, modern culture leaves us free to decide what to be and to make ourselves. It is this cultural laxity that is anomie—the inability of a culture to provide the individuals within it with consistent guidance (already in the beginning of the twentieth century, recognized by Durkheim as the most dangerous problem of modernity). Paradoxically, in effect placed in control over our destiny, we are far more likely to be dissatisfied with it, than would be a person deprived of any such control: not having a choice, such a person would try to do the best with what one has and enjoy it as far as possible. A truly believing person would also feel s/he has no right to find fault with the order of things created by God, much less to try and change it to one’s own liking—one’s situation in life would be perceived as both unchangeable and just. Conversely, the presence of choice, the very ability to imagine oneself in a position different from one currently occupied or that of one’s parents, and the idea that social orders in general are created by people and may be changed make one suspect that one’s current situation is not the best one can have and to strive for a better one. The more choices one has, the less secure one becomes in the choices already made (by one or for one) and making up one’s mind—literally, in the sense of constructing one’s identity—grows increasingly difficult. It is for this reason that the malformation of the mind—quite independent of any disease of the brain—becomes a mark of nations.

02 April 2013

In The Rise of China vs. the Logic of Strategy, published last year, Edward Luttwak introduced the concept of “great state autism,” a collective national lack of situational awareness that reduces a country’s ability to perceive international realities with clarity. While the U.S. and Russia each suffer from obvious cases of the condition, Luttwak labels China’s autism an “especially virulent” strain, due to its ancient development in relative isolation and its sheer size, among other factors. Luttwak sees the affliction when, say, China flexes its military muscle in the face of a neighbor one day and then is surprised by the rebuff of a trade delegation to that same neighbor the next.

“In all great states,” writes Luttwak, “there is so much internal activity that leaders and opinion-makers cannot focus seriously on foreign affairs as well, except in particular times of crisis.”

They do not have the constant situational awareness of the world around them that is natural in small countries of equal advancement. After all, individual sensory and cranial capacities are much the same in smoothly operating states of a few million people, and in megastates such as the Russian Federation, the United States, India, and China, whose leaders face internal urgencies if not emergencies each day somewhere or other, in addition to their ordinary decision-making sessions and ceremonial obligations.

The result is not mere inattention. On the contrary, it is not only possible but common for great-state leaders and even entire ruling elites to make much of foreign affairs if only as welcome diversion from the harder choices of domestic politics, in which almost any decision that pleases some must displease others—and not mere foreigners whose political support will not be missed.

Great-state autism is worse than inattention because in the absence of the serious and earnest study that domestic urgencies make impossible, decision-makers cannot absorb in-depth information with all its complexities and subtleties, even if it is offered to them (which is unlikely: when intelligence officers adhere to the rule that their highest duty is to tell top leaders what they do not want to hear, their careers suffer). Instead, decisions on foreign affairs are almost always made on the basis of highly simplified, schematic representations of unmanageably complex realities, which are thereby distorted to fit within internally generated categories, expectations, and perspectives.

Now, in the recently published Israel Has Moved, Diana Pinto describes Israel as suffering from its own case of autism, despite being a nation of far less territory than Luttwak’s Russia, China, India, or the United States. Pinto’s book is an unsettling take on Israel’s decades-long drift away from the European shadow in which it was founded and towards a new insularity. On the one hand are the country’s world-leading science and technology sectors, on the other its less successful international relationships.

Super-Israel at the technological heart of the new world economy suddenly becomes Israel the autistic with Asperger syndrome, the bipolar, the schizophrenic, the paranoiac, the psychotic, and even the psycho-rigid: in other words, an entity that denies the very principle of reality. These are very powerful, even terrifying, metaphors. They are not mine. Israelis from all camps—whether ultraorthodox or extremely secular, young or old, and coming from the most diverse cultural origins—used them freely before me as so many self-evident truths.

She writes of Israeli autism:

This condition, which occurs among the young (and Israel is both very young as a state and very old as a people), who are often quite brilliant in certain fields, defines those who cannot think of themselves as living in a world populated by others. They do not register the gaze or the emotions of others and are therefore unable to communicate or interact with them, because they do not grasp or understand what might motivate them… As with autistic people who feel threatened, Israel can reply to the aggressions of others (in its case most often real and not imagined) only by an excessively forceful and uncontrolled reaction, of which it often becomes its own victim.

Interestingly, while Luttwak makes no mention of Israel in his discussion of great state autism, Pinto cites China as the only other civilization whose “self-centeredness” resembles that of the Jews. Also striking are the contrasting conditions of each affliction’s incubation: Luttwak points to China’s ancient history of isolation from other states, whereas Israel’s defining characteristic is its position as a keystone among the lands along the Mediterranean Sea. In either case, the label seems a useful lens through which to observe these two major actors in the pursuit of geopolitical stability.

About

The Harvard University Press Blog brings you books, ideas, and news from Harvard University Press. Founded in 1913, Harvard University Press has published such iconic works as Bernard Bailyn’s The Ideological Origins of the American Revolution, John Rawls’s A Theory of Justice, and Sarah Blaffer Hrdy’s The Woman That Never Evolved.