Barbara Ehrenreich comments on working in America

August 09, 2011

I completed the manuscript for Nickel and Dimed in a time of seemingly boundless prosperity. Technology innovators and venture capitalists were acquiring sudden fortunes, buying up McMansions like the ones I had cleaned in Maine and much larger. Even secretaries in some high-tech firms were striking it rich with their stock options. There was loose talk about a permanent conquest of the business cycle, and a sassy new spirit infecting American capitalism. In San Francisco, a billboard for an e-trading firm proclaimed, “Make love not war,” and then -- down at the bottom -- “Screw it, just make money.”

When Nickel and Dimed was published in May 2001, cracks were appearing in the dot-com bubble and the stock market had begun to falter, but the book still evidently came as a surprise, even a revelation, to many. Again and again, in that first year or two after publication, people came up to me and opened with the words, “I never thought...” or “I hadn’t realized...”

To my own amazement, Nickel and Dimed quickly ascended to the bestseller list and began winning awards. Criticisms, too, have accumulated over the years. But for the most part, the book has been far better received than I could have imagined it would be, with an impact extending well into the more comfortable classes. A Florida woman wrote to tell me that, before reading it, she’d always been annoyed at the poor for what she saw as their self-inflicted obesity. Now she understood that a healthy diet wasn’t always an option. And if I had a quarter for every person who’s told me he or she now tipped more generously, I would be able to start my own foundation.

Even more gratifying to me, the book has been widely read among low-wage workers. In the last few years, hundreds of people have written to tell me their stories: the mother of a newborn infant whose electricity had just been turned off, the woman who had just been given a diagnosis of cancer and has no health insurance, the newly homeless man who writes from a library computer.

At the time I wrote Nickel and Dimed, I wasn’t sure how many people it directly applied to -- only that the official definition of poverty was way off the mark, since it defined an individual earning $7 an hour, as I did on average, as well out of poverty. But three months after the book was published, the Economic Policy Institute in Washington, D.C., issued a report entitled “Hardships in America: The Real Story of Working Families,” which found an astounding 29% of American families living in what could be more reasonably defined as poverty, meaning that they earned less than a barebones budget covering housing, child care, health care, food, transportation, and taxes -- though not, it should be noted, any entertainment, meals out, cable TV, Internet service, vacations, or holiday gifts. Twenty-nine percent is a minority, but not a reassuringly small one, and other studies in the early 2000s came up with similar figures.

The big question, 10 years later, is whether things have improved or worsened for those in the bottom third of the income distribution, the people who clean hotel rooms, work in warehouses, wash dishes in restaurants, care for the very young and very old, and keep the shelves stocked in our stores. The short answer is that things have gotten much worse, especially since the economic downturn that began in 2008.

Post-Meltdown Poverty

When you read about the hardships I found people enduring while I was researching my book -- the skipped meals, the lack of medical care, the occasional need to sleep in cars or vans -- you should bear in mind that those occurred in the best of times. The economy was growing, and jobs, if poorly paid, were at least plentiful.

In 2000, I had been able to walk into a number of jobs pretty much off the street. Less than a decade later, many of these jobs had disappeared and there was stiff competition for those that remained. It would have been impossible to repeat my Nickel and Dimed “experiment,” had I been so inclined, because I would probably never have found a job.

For the last couple of years, I have attempted to find out what was happening to the working poor in a declining economy -- this time using conventional reporting techniques like interviewing. I started with my own extended family, which includes plenty of people without jobs or health insurance, and moved on to trying to track down a couple of the people I had met while working on Nickel and Dimed.

This wasn’t easy, because most of the addresses and phone numbers I had taken away with me had proved to be inoperative within a few months, probably due to moves and suspensions of telephone service. Over the years I had kept in touch with “Melissa,” who was still working at Wal-Mart, where her wages had risen from $7 to $10 an hour, but in the meantime her husband had lost his job. “Caroline,” now in her 50s and partly disabled by diabetes and heart disease, had left her deadbeat husband and was subsisting on occasional cleaning and catering jobs. Neither seemed unduly afflicted by the recession, but only because they had already been living in what amounts to a permanent economic depression.

Media attention has focused, understandably enough, on the “nouveau poor” -- formerly middle- and even upper-middle-class people who lost their jobs, their homes, and/or their investments in the financial crisis of 2008 and the economic downturn that followed it. But the brunt of the recession has been borne by the blue-collar working class, which had already been sliding downwards since de-industrialization began in the 1980s.

In 2008 and 2009, for example, blue-collar unemployment was increasing three times as fast as white-collar unemployment, and African American and Latino workers were three times as likely to be unemployed as white workers. Low-wage blue-collar workers, like the people I worked with in this book, were especially hard hit for the simple reason that they had so few assets and savings to fall back on as jobs disappeared.

How have the already-poor attempted to cope with their worsening economic situation? One obvious way is to cut back on health care. The New York Times reported in 2009 that one-third of Americans could no longer afford to comply with their prescriptions and that there had been a sizable drop in the use of medical care. Others, including members of my extended family, have given up their health insurance.

Food is another expenditure that has proved vulnerable to hard times, with the rural poor turning increasingly to “food auctions,” which offer items that may be past their sell-by dates. And for those who like their meat fresh, there’s the option of urban hunting. In Racine, Wisconsin, a 51-year-old laid-off mechanic told me he was supplementing his diet by “shooting squirrels and rabbits and eating them stewed, baked, and grilled.” In Detroit, where the wildlife population has mounted as the human population ebbs, a retired truck driver was doing a brisk business in raccoon carcasses, which he recommends marinating with vinegar and spices.

The most common coping strategy, though, is simply to increase the number of paying people per square foot of dwelling space -- by doubling up or renting to couch-surfers.

It’s hard to get firm numbers on overcrowding, because no one likes to acknowledge it to census-takers, journalists, or anyone else who might be remotely connected to the authorities.

In Los Angeles, housing expert Peter Dreier says that “people who’ve lost their jobs, or at least their second jobs, cope by doubling or tripling up in overcrowded apartments, or by paying 50 or 60 or even 70 percent of their incomes in rent.” According to a community organizer in Alexandria, Virginia, the standard apartment in a complex occupied largely by day laborers has two bedrooms, each containing an entire family of up to five people, plus an additional person laying claim to the couch.

No one could call suicide a “coping strategy,” but it is one way some people have responded to job loss and debt. There are no national statistics linking suicide to economic hard times, but the National Suicide Prevention Lifeline reported more than a four-fold increase in call volume between 2007 and 2009, and regions with particularly high unemployment, like Elkhart, Indiana, have seen troubling spikes in their suicide rates. Foreclosure is often the trigger for suicide -- or, worse, murder-suicides that destroy entire families.

“Torture and Abuse of Needy Families”

We do of course have a collective way of ameliorating the hardships of individuals and families -- a government safety net that is meant to save the poor from spiraling down all the way to destitution. But its response to the economic emergency of the last few years has been spotty at best. The food stamp program has responded to the crisis fairly well, to the point where it now reaches about 37 million people, up about 30% from pre-recession levels. But welfare -- the traditional last resort for the down-and-out until it was “reformed” in 1996 -- only expanded by about 6% in the first two years of the recession.

The difference between the two programs? There is a right to food stamps. You go to the office and, if you meet the statutory definition of need, they help you. For welfare, the street-level bureaucrats can, pretty much at their own discretion, just say no.

Take the case of Kristen and Joe Parente, Delaware residents who had always imagined that people turned to the government for help only if “they didn’t want to work.” Their troubles began well before the recession, when Joe, a fourth-generation pipe-fitter, sustained a back injury that left him unfit for even light lifting. He fell into a profound depression for several months, then rallied to ace a state-sponsored retraining course in computer repair -- only to find that those skills are no longer in demand. The obvious fallback was disability benefits, but -- catch-22 -- when Joe applied he was told he could not qualify without presenting a recent MRI scan. This would cost $800 to $900, which the Parentes do not have; nor has Joe, unlike the rest of the family, been able to qualify for Medicaid.

When they married as teenagers, the plan had been for Kristen to stay home with the children. But with Joe out of action and three children to support by the middle of this decade, Kristen went out and got waitressing jobs, ending up, in 2008, in a “pretty fancy place on the water.” Then the recession struck and she was laid off.

Kristen is bright, pretty, and to judge from her command of her own small kitchen, probably capable of holding down a dozen tables with precision and grace. In the past she’d always been able to land a new job within days; now there was nothing. Like 44% of laid-off people at the time, she failed to meet the fiendishly complex and sometimes arbitrary eligibility requirements for unemployment benefits. Their car started falling apart.

So the Parentes turned to what remains of welfare -- TANF, or Temporary Assistance for Needy Families. TANF does not offer straightforward cash support like Aid to Families with Dependent Children, which it replaced in 1996. It’s an income supplementation program for working parents, and it was based on the sunny assumption that there would always be plenty of jobs for those enterprising enough to get them.

After Kristen applied, nothing happened for six weeks -- no money, no phone calls returned. At school, the class of the Parentes’ seven-year-old daughter, Brianna, was asked to write out what wish they would present to a genie, should a genie appear. Brianna’s wish was for her mother to find a job because there was nothing to eat in the house, an aspiration that her teacher deemed too disturbing to be posted on the wall with the other children’s requests.

When the Parentes finally got into “the system” and began receiving food stamps and some cash assistance, they discovered why some recipients have taken to calling TANF “Torture and Abuse of Needy Families.” From the start, the TANF experience was “humiliating,” Kristen says. The caseworkers “treat you like a bum. They act like every dollar you get is coming out of their own paychecks.”

The Parentes discovered that they were each expected to apply for 40 jobs a week, although their car was on its last legs and no money was offered for gas, tolls, or babysitting. In addition, Kristen had to drive 35 miles a day to attend “job readiness” classes offered by a private company called Arbor, which, she says, were “frankly a joke.”

Nationally, according to Kaaryn Gustafson of the University of Connecticut Law School, “applying for welfare is a lot like being booked by the police.” There may be a mug shot, fingerprinting, and lengthy interrogations as to one’s children’s true paternity. The ostensible goal is to prevent welfare fraud, but the psychological impact is to turn poverty itself into a kind of crime.

How the Safety Net Became a Dragnet

The most shocking thing I learned from my research on the fate of the working poor in the recession was the extent to which poverty has indeed been criminalized in America.

Perhaps the constant suspicions of drug use and theft that I encountered in low-wage workplaces should have alerted me to the fact that, when you leave the relative safety of the middle class, you might as well have given up your citizenship and taken residence in a hostile nation.

Most cities, for example, have ordinances designed to drive the destitute off the streets by outlawing such necessary activities of daily life as sitting, loitering, sleeping, or lying down. Urban officials boast that there is nothing discriminatory about such laws: “If you’re lying on a sidewalk, whether you’re homeless or a millionaire, you’re in violation of the ordinance,” a St. Petersburg, Florida, city attorney stated in June 2009, echoing Anatole France’s immortal observation that “the law, in its majestic equality, forbids the rich as well as the poor to sleep under bridges...”

In defiance of all reason and compassion, the criminalization of poverty has actually intensified as the weakened economy generates ever more poverty. So concludes a recent study from the National Law Center on Homelessness and Poverty, which finds that the number of ordinances against the publicly poor has been rising since 2006, along with the harassment of the poor for more “neutral” infractions like jaywalking, littering, or carrying an open container.

The report lists America’s ten “meanest” cities -- the largest of which include Los Angeles, Atlanta, and Orlando -- but new contestants are springing up every day. In Colorado, Grand Junction’s city council is considering a ban on begging; Tempe, Arizona, carried out a four-day crackdown on the indigent at the end of June. And how do you know when someone is indigent? As a Las Vegas statute puts it, “an indigent person is a person whom a reasonable ordinary person would believe to be entitled to apply for or receive” public assistance.

That could be me before the blow-drying and eyeliner, and it’s definitely Al Szekeley at any time of day. A grizzled 62-year-old, he inhabits a wheelchair and is often found on G Street in Washington, D.C. -- the city that is ultimately responsible for the bullet he took in the spine in Phu Bai, Vietnam, in 1972.

He had been enjoying the luxury of an indoor bed until December 2008, when the police swept through the shelter in the middle of the night looking for men with outstanding warrants. It turned out that Szekeley, who is an ordained minister and does not drink, do drugs, or cuss in front of ladies, did indeed have one -- for “criminal trespassing,” as sleeping on the streets is sometimes defined by the law. So he was dragged out of the shelter and put in jail.

“Can you imagine?” asked Eric Sheptock, the homeless advocate (himself a shelter resident) who introduced me to Szekeley. “They arrested a homeless man in a shelter for being homeless?”

The viciousness of the official animus toward the indigent can be breathtaking. A few years ago, a group called Food Not Bombs started handing out free vegan food to hungry people in public parks around the nation. A number of cities, led by Las Vegas, passed ordinances forbidding the sharing of food with the indigent in public places, leading to the arrests of several middle-aged white vegans.

One anti-sharing law was just overturned in Orlando, but the war on illicit generosity continues. Orlando is appealing the decision, and Middletown, Connecticut, is in the midst of a crackdown. More recently, Gainesville, Florida, began enforcing a rule limiting the number of meals that soup kitchens may serve to 130 people in one day, and Phoenix, Arizona, has been using zoning laws to stop a local church from serving breakfast to homeless people.

For the not-yet-homeless, there are two main paths to criminalization, and one is debt. Anyone can fall into debt, and although we pride ourselves on the abolition of debtors’ prison, in at least one state, Texas, people who can’t pay fines for things like expired inspection stickers may be made to “sit out their tickets” in jail.

More commonly, the path to prison begins when one of your creditors has a court summons issued for you, which you fail to honor for one reason or another, such as that your address has changed and you never received it. Okay, now you’re in “contempt of court.”

Or suppose you miss a payment and your car insurance lapses, and then you’re stopped for something like a broken headlight (about $130 for the bulb alone). Now, depending on the state, you may have your car impounded and/or face a steep fine -- again, exposing you to a possible court summons. “There’s just no end to it once the cycle starts,” says Robert Solomon of Yale Law School. “It just keeps accelerating.”

The second -- and by far the most reliable -- way to be criminalized by poverty is to have the wrong color skin. Indignation runs high when a celebrity professor succumbs to racial profiling, but whole communities are effectively “profiled” for the suspicious combination of being both dark-skinned and poor. Flick a cigarette and you’re “littering”; wear the wrong color T-shirt and you’re displaying gang allegiance. Just strolling around in a dodgy neighborhood can mark you as a potential suspect. And don’t get grumpy about it or you could be “resisting arrest.”

In what has become a familiar pattern, the government defunds services that might help the poor while ramping up law enforcement. Shut down public housing, then make it a crime to be homeless. Generate no public-sector jobs, then penalize people for falling into debt. The experience of the poor, and especially poor people of color, comes to resemble that of a rat in a cage scrambling to avoid erratically administered electric shocks. And if you should try to escape this nightmare reality into a brief, drug-induced high, it’s “gotcha” all over again, because that of course is illegal too.

One result is our staggering level of incarceration, the highest in the world. Today, exactly the same number of Americans -- 2.3 million -- reside in prison as in public housing. And what public housing remains has become ever more prison-like, with random police sweeps and, in a growing number of cities, proposed drug tests for residents. The safety net, or what remains of it, has been transformed into a dragnet.

It is not clear whether economic hard times will finally force us to break the mad cycle of poverty and punishment. With even the official level of poverty increasing -- to over 14% in 2010 -- some states are beginning to ease up on the criminalization of poverty, using alternative sentencing methods, shortening probation, and reducing the number of people locked up for technical violations like missing court appointments. But others, diabolically enough, are tightening the screws: not only increasing the number of “crimes,” but charging prisoners for their room and board, guaranteeing they’ll be released with potentially criminalizing levels of debt.

So what is the solution to the poverty of so many of America’s working people? Ten years ago, when Nickel and Dimed first came out, I often responded with the standard liberal wish list -- a higher minimum wage, universal health care, affordable housing, good schools, reliable public transportation, and all the other things we, uniquely among the developed nations, have neglected to do.

Today, the answer seems both more modest and more challenging: if we want to reduce poverty, we have to stop doing the things that make people poor and keep them that way. Stop underpaying people for the jobs they do. Stop treating working people as potential criminals and let them have the right to organize for better wages and working conditions.

Stop the institutional harassment of those who turn to the government for help or find themselves destitute in the streets. Maybe, as so many Americans seem to believe today, we can’t afford the kinds of public programs that would genuinely alleviate poverty -- though I would argue otherwise. But at least we should decide, as a bare minimum principle, to stop kicking people when they’re down.

July 12, 2011

For a book about the all-too-human “passions of war,” my 1997 work Blood Rites ended on a strangely inhuman note: I suggested that, whatever distinctly human qualities war calls upon -- honor, courage, solidarity, cruelty, and so forth -- it might be useful to stop thinking of war in exclusively human terms. After all, certain species of ants wage war and computers can simulate “wars” that play themselves out on-screen without any human involvement.

More generally, then, we should define war as a self-replicating pattern of activity that may or may not require human participation. In the human case, we know it is capable of spreading geographically and evolving rapidly over time -- qualities that, as I suggested somewhat fancifully, make war a metaphorical successor to the predatory animals that shaped humans into fighters in the first place.

A decade and a half later, these musings do not seem quite so airy and abstract anymore. The trend, at the close of the twentieth century, still seemed to be one of ever more massive human involvement in war -- from armies containing tens of thousands in the sixteenth century, to hundreds of thousands in the nineteenth, and eventually millions in the twentieth century world wars.

It was the ascending scale of war that originally called forth the existence of the nation-state as an administrative unit capable of maintaining mass armies and the infrastructure -- for taxation, weapons manufacture, transport, etc. -- that they require. War has been, and we still expect it to be, the most massive collective project human beings undertake. But it has been evolving quickly in a very different direction, one in which human beings have a much smaller role to play.

One factor driving this change has been the emergence of a new kind of enemy, so-called “non-state actors,” meaning popular insurgencies and loose transnational networks of fighters, none of which are likely to field large numbers of troops or maintain expensive arsenals of their own. In the face of these new enemies, typified by al-Qaeda, the mass armies of nation-states are highly ineffective, cumbersome to deploy, difficult to maneuver, and from a domestic point of view, overly dependent on a citizenry that is both willing and able to fight, or at least to have their children fight for them.

Yet just as U.S. military cadets continue, in defiance of military reality, to sport swords on their dress uniforms, our leaders, both military and political, tend to cling to an idea of war as a vast, labor-intensive effort on the order of World War II. Only slowly, and with a reluctance bordering on the phobic, have the leaders of major states begun to grasp the fact that this approach to warfare may soon be obsolete.

Consider the most recent U.S. war with Iraq. According to then-president George W. Bush, the casus belli was the 9/11 terror attacks. The causal link between that event and our chosen enemy, Iraq, was, however, imperceptible to all but the most dedicated inside-the-Beltway intellectuals. Nineteen men had hijacked airplanes and flown them into the Pentagon and the World Trade Center -- 15 of them Saudi Arabians, none of them Iraqis -- and we went to war against… Iraq?

Military history offers no ready precedents for such wildly misaimed retaliation. The closest analogies come from anthropology, which provides plenty of cases of small-scale societies in which the death of any member, for any reason, needs to be “avenged” by an attack on a more or less randomly chosen other tribe or hamlet.

Why Iraq? Neoconservative imperial ambitions have been invoked in explanation, as well as the American thirst for oil, or even an Oedipal contest between George W. Bush and his father. There is no doubt some truth to all of these explanations, but the targeting of Iraq also represented a desperate and irrational response to what was, for Washington, an utterly confounding military situation.

We faced a state-less enemy -- geographically diffuse, lacking uniforms and flags, invulnerable to invading infantries and saturation bombing, and apparently capable of regenerating itself at minimal expense. From the perspective of Secretary of Defense Donald Rumsfeld and his White House cronies, this would not do.

Since the U.S. was accustomed to fighting other nation-states -- geopolitical entities containing such identifiable targets as capital cities, airports, military bases, and munitions plants -- we would have to find a nation-state to fight, or as Rumsfeld put it, a “target-rich environment.” Iraq, pumped up by alleged stockpiles of “weapons of mass destruction,” became the designated surrogate for an enemy that refused to play our game.

The effects of this atavistic war are still being tallied: in Iraq, we would have to include civilian deaths estimated at possibly hundreds of thousands, the destruction of civilian infrastructure, and devastating outbreaks of sectarian violence of a kind that, as we should have learned from the dissolution of Yugoslavia, can readily follow the death or removal of a nationalist dictator.

But the effects of war on the U.S. and its allies may end up being almost as tragic. Instead of punishing the terrorists who had attacked the U.S., the war seems to have succeeded in recruiting more such irregular fighters, young men (and sometimes women) willing to die and ready to commit further acts of terror or revenge. By insisting on fighting a more or less randomly selected nation-state, the U.S. may only have multiplied the non-state threats it faces.

Unwieldy Armies

Whatever they may think of what the U.S. and its allies did in Iraq, many national leaders are beginning to acknowledge that conventional militaries are becoming, in a strictly military sense, almost ludicrously anachronistic. Not only are they unsuited to crushing counterinsurgencies and small bands of terrorists or irregular fighters, but mass armies are simply too cumbersome to deploy on short notice.

In military lingo, they are weighed down by their “tooth to tail” ratio -- a measure of the number of actual fighters in comparison to the support personnel and equipment the fighters require. Both hawks and liberal interventionists may hanker to airlift tens of thousands of soldiers to distant places virtually overnight, but those soldiers will need to be preceded or accompanied by tents, canteens, trucks, medical equipment, and so forth. “Flyover” rights will have to be granted by neighboring countries; air strips and eventually bases will have to be constructed; supply lines will have to be created and defended -- all of which can take months to accomplish.

The sluggishness of the mass, labor-intensive military has become a constant source of frustration to civilian leaders. Irritated by the Pentagon’s hesitation to put “boots on the ground” in Bosnia, Madeleine Albright, then the U.S. ambassador to the United Nations, famously demanded of Joint Chiefs of Staff Chairman Colin Powell, “What good is this marvelous military force if we can never use it?” In 2009, the Obama administration unthinkingly proposed a troop surge in Afghanistan, followed by a withdrawal within a year and a half that would have required some of the troops to start packing up almost as soon as they arrived. It took the U.S. military a full month to organize the transport of 20,000 soldiers to Haiti in the wake of the 2010 earthquake -- and they were only traveling 700 miles to engage in a humanitarian relief mission, not a war.

Another thing hobbling mass militaries is the increasing unwillingness of nations, especially the more democratic ones, to risk large numbers of casualties. It is no longer acceptable to drive men into battle at gunpoint or to demand that they fend for themselves on foreign soil. Once thousands of soldiers have been plunked down in a “theater,” they must be defended from potentially hostile locals, a project that can easily come to supersede the original mission.

We may not be able clearly to articulate what American troops were supposed to accomplish in Iraq or Afghanistan, but without question one part of their job has been “force protection.” In what could be considered the inverse of “mission creep,” instead of expanding, the mission now has a tendency to contract to the task of self-defense.

Ultimately, the mass militaries of the modern era, augmented by ever-more expensive weapons systems, place an unacceptable economic burden on the nation-states that support them -- a burden that eventually may undermine the militaries themselves. Consider what has been happening to the world’s sole military superpower, the United States. The latest estimate for the cost of the wars in Iraq and Afghanistan is, at this moment, at least $3.2 trillion, while total U.S. military spending equals that of the next 15 countries combined, and adds up to approximately 47% of all global military spending.

To this must be added the cost of caring for wounded and otherwise damaged veterans, which has been mounting precipitously as medical advances allow more of the injured to survive. The U.S. military has been sheltered from the consequences of its own profligacy by a level of bipartisan political support that has kept it almost magically immune to budget cuts, even as the national debt balloons to levels widely judged to be unsustainable.

The hard right, in particular, has campaigned relentlessly against “big government,” apparently not noticing that the military is a sizable chunk of this behemoth. In December 2010, for example, a Republican senator from Oklahoma railed against the national debt with this statement: “We're really at war. We're on three fronts now: Iraq, Afghanistan, and the financial tsunami [arising from the debt] that is facing us.” Only in recent months have some Tea Party-affiliated legislators broken with tradition by declaring their willingness to cut military spending.

How the Warfare State Became the Welfare State

If military spending remains for the most part sacrosanct, yet ever more spending cuts are required to shrink “big government,” what remains is the cutting of domestic spending, especially social programs for the poor, who lack the means to finance politicians and all too often the incentive to vote as well. From the Reagan years on, the U.S. government has chipped away at dozens of programs that had helped sustain people who are underpaid or unemployed, including housing subsidies, state-supplied health insurance, public transportation, welfare for single parents, college tuition aid, and inner-city economic development projects.

Even the physical infrastructure -- bridges, airports, roads, and tunnels -- used by people of all classes has been left at dangerous levels of disrepair. Antiwar protestors wistfully point out, year after year, what the cost of our high-tech weapon systems, our global network of more than 1,000 military bases, and our various “interventions” could buy if applied to meeting domestic human needs. But to no effect.

This ongoing sacrifice of domestic welfare for military “readiness” represents the reversal of a historic trend. Ever since the introduction of mass armies in Europe in the seventeenth century, governments have generally understood that to underpay and underfeed one's troops -- and the class of people that supplies them -- is to risk having the guns pointed in the opposite direction from that which the officers recommend.

In fact, modern welfare states, inadequate as they may be, are in no small part the product of war -- that is, of governments' attempts to appease soldiers and their families. In the U.S., for example, the Civil War led to the institution of widows' benefits, which were the predecessor of welfare in its Aid to Families with Dependent Children form. It was the bellicose German leader Otto von Bismarck who first instituted national health insurance.

World War II spawned educational benefits and income support for American veterans and led, in the United Kingdom, to a comparatively generous welfare state, including free health care for all. Notions of social justice and fairness, or at least the fear of working class insurrections, certainly played a part in the development of twentieth century welfare states, but there was a pragmatic military motivation as well: if young people are to grow up to be effective troops, they need to be healthy, well-nourished, and reasonably well-educated.

In the U.S., the steady withering of social programs that might nurture future troops even serves, ironically, to justify increased military spending. In the absence of a federal jobs program, Congressional representatives become fierce advocates for weapons systems that the Pentagon itself has no use for, as long as the manufacture of those weapons can provide employment for some of their constituents.

With diminishing funds for higher education, military service becomes a less dismal alternative for young working-class people than the low-paid jobs that otherwise await them. The U.S. still has a civilian welfare state consisting largely of programs for the elderly (Medicare and Social Security). For many younger Americans, however, as well as for older combat veterans, the U.S. military is the welfare state -- and a source, however temporary, of jobs, housing, health care, and education.

Eventually, however, the failure to invest in America’s human resources -- through spending on health, education, and so forth -- undercuts the military itself. In World War I, public health experts were shocked to find that one-third of conscripts were rejected as physically unfit for service; they were too weak and flabby or too damaged by work-related accidents.

Several generations later, in 2010, the U.S. Secretary of Education reported that “75 percent of young Americans, between the ages of 17 to 24, are unable to enlist in the military today because they have failed to graduate from high school, have a criminal record, or are physically unfit.” When a nation can no longer generate enough young people who are fit for military service, that nation has two choices: it can, as a number of prominent retired generals are currently advocating, reinvest in its “human capital,” especially the health and education of the poor, or it can seriously reevaluate its approach to war.

The Fog of (Robot) War

Since the rightward, anti-“big government” tilt of American politics more or less precludes the former, the U.S. has been scrambling to develop less labor-intensive forms of waging war. In fact, this may prove to be the ultimate military utility of the wars in Iraq and Afghanistan: if they have gained the U.S. no geopolitical advantage, they have certainly served as laboratories and testing grounds for forms of future warfare that involve less human, or at least less governmental, commitment.

One step in that direction has been the large-scale use of military contract workers supplied by private companies, which can be seen as a revival of the age-old use of mercenaries. Although most of the functions that have been outsourced to private companies -- including food services, laundry, truck driving, and construction -- do not involve combat, they are dangerous, and some contract workers have even been assigned to the guarding of convoys and military bases.

Contractors are still men and women, capable of bleeding and dying -- and surprising numbers of them have indeed died. In the initial six months of 2010, corporate deaths exceeded military deaths in Iraq and Afghanistan for the first time. But the Pentagon has little or no responsibility for the training, feeding, or care of private contractors. If wounded or psychologically damaged, American contract workers must turn, like any other injured civilian employees, to the Workers’ Compensation system, hence their sense of themselves as a “disposable army.” By 2009, the trend toward privatization had gone so far that the number of private contractors in Afghanistan exceeded the number of American troops there.

An alternative approach is to eliminate or drastically reduce the military’s dependence on human beings of any kind. This would have been an almost unthinkable proposition a few decades ago, but technologies employed in Iraq and Afghanistan have steadily stripped away the human role in war. Drones, directed from sites up to 7,500 miles away in the western United States, are replacing manned aircraft.

Video cameras, borne by drones, substitute for human scouts or information gathered by pilots. Robots disarm roadside bombs. When American forces invaded Iraq in 2003, no robots accompanied them; by 2008, there were 12,000 participating in the war. Only a handful of drones were used in the initial invasion; today, the U.S. military has an inventory of more than 7,000, ranging from the familiar Predator to tiny Ravens and Wasps used to transmit video images of events on the ground. Far stranger fighting machines are in the works, like swarms of lethal “cyborg insects” that could potentially replace human infantry.

These developments are by no means limited to the U.S. The global market for military robotics and unmanned military vehicles is growing fast, and includes Israel (a major pioneer in the field), Russia, the United Kingdom, Iran, South Korea, and China. Turkey is reportedly readying a robot force for strikes against Kurdish insurgents; Israel hopes to eventually patrol the Gaza border with “see-shoot” robots that will destroy people perceived as transgressors as soon as they are detected.

It is hard to predict how far the automation of war and the substitution of autonomous robots for human fighters will go. On the one hand, humans still have the advantage of superior visual discrimination. Despite decades of research in artificial intelligence, computers cannot make the kind of simple distinctions -- as in determining whether a cow standing in front of a barn is a separate entity or a part of the barn -- that humans can make in a fraction of a second.

Thus, as long as there is any premium on avoiding civilian deaths, humans have to be involved in processing the visual information that leads, for example, to the selection of targets for drone attacks. If only as the equivalent of seeing-eye dogs, humans will continue to have a role in war, at least until computer vision improves.

On the other hand, the human brain lacks the bandwidth to process all the data flowing into it, especially as new technologies multiply that data. In the clash of traditional mass armies, under a hail of arrows or artillery shells, human warriors often found themselves confused and overwhelmed, a condition attributed to “the fog of war.” Well, that fog is growing a lot thicker. U.S. military officials, for instance, put the blame on “information overload” for the killing of 23 Afghan civilians in February 2010, and the New York Times reported that:

“Across the military, the data flow has surged; since the attacks of 9/11, the amount of intelligence gathered by remotely piloted drones and other surveillance technologies has risen 1,600 percent. On the ground, troops increasingly use hand-held devices to communicate, get directions and set bombing coordinates. And the screens in jets can be so packed with data that some pilots call them “drool buckets” because, they say, they can get lost staring into them.”

When the sensory data coming at a soldier is augmented by a flood of instantaneously transmitted data from distant cameras and computer search engines, there may be no choice but to replace the sloppy “wet-ware” of the human brain with a robotic system for instant response.

War Without Humans

Once set in place, the cyber-automation of war is hard to stop. Humans will cling to their place “in the loop” as long as they can, no doubt insisting that the highest level of decision-making -- whether to go to war and with whom -- be reserved for human leaders. But it is precisely at the highest levels that decision-making may most need automating. A head of state faces a blizzard of factors to consider, everything from historical analogies and satellite-derived intelligence to assessments of the readiness of potential allies. Furthermore, as the enemy automates its military, or in the case of a non-state actor, simply adapts to our level of automation, the window of time for effective responses will grow steadily narrower. Why not turn to a high-speed computer? It is certainly hard to imagine a piece of intelligent hardware deciding to respond to the 9/11 attacks by invading Iraq.

So, after at least 10,000 years of intra-species fighting -- of scorched earth, burned villages, razed cities, and piled up corpses, as well, of course, as all the great epics of human literature -- we have to face the possibility that the institution of war might no longer need us for its perpetuation. Human desires, especially for the Earth’s diminishing supply of resources, will still instigate wars for some time to come, but neither human courage nor human bloodlust will carry the day on the battlefield.

Computers will assess threats and calibrate responses; drones will pinpoint enemies; robots might roll into the streets of hostile cities. Beyond the individual battle or smaller-scale encounter, decisions as to whether to match attack with counterattack, or one lethal technological innovation with another, may also be eventually ceded to alien minds.

This should not come as a complete surprise. Just as war has shaped human social institutions for millennia, so has it discarded them as the evolving technology of war rendered them useless. When war was fought with blades by men on horseback, it favored the rule of aristocratic warrior elites. When the mode of fighting shifted to action-at-a-distance weapons like bows and guns, the old elites had to bow to the central authority of kings, who, in turn, were undone by the democratizing forces unleashed by new mass armies.

Even patriarchy cannot depend on war for its long-term survival, since the wars in Iraq and Afghanistan have, at least within U.S. forces, established women’s worth as warriors. Over the centuries, human qualities once deemed indispensable to war fighting -- muscular power, manliness, intelligence, judgment -- have one by one become obsolete or been ceded to machines.

What will happen then to the “passions of war”? Except for individual acts of martyrdom, war is likely to lose its glory and luster. Military analyst P.W. Singer quotes an Air Force captain musing about whether the new technologies will “mean that brave men and women will no longer face death in combat,” only to reassure himself that “there will always be a need for intrepid souls to fling their bodies across the sky.”

Perhaps, but in a 2010 address to Air Force Academy cadets, an under secretary of defense delivered the “bad news” that most of them would not be flying airplanes, which are increasingly unmanned. War will continue to be used against insurgencies as well as to “take out” the weapons facilities, command centers, and cities of designated rogue states. It may even continue to fascinate its aficionados, in the manner of computer games. But there will be no triumphal parades for killer nano-bugs, no epics about unmanned fighter planes, no monuments to fallen bots.

And in that may lie our last hope. With the decline of mass militaries and their possible replacement by machines, we may finally see that war is not just an extension of our needs and passions, however base or noble. Nor is it likely to be even a useful test of our courage, fitness, or national unity. War has its own dynamic or -- in case that sounds too anthropomorphic -- its own grim algorithms to work out. As it comes to need us less, maybe we will finally see that we don’t need it either. We can leave it to the ants.

December 02, 2009

Has feminism been replaced by the pink-ribbon breast
cancer cult? When the House of Representatives passed the Stupak
amendment, which would take abortion rights away even from women who
have private insurance, the female response ranged from muted to
inaudible.

A few weeks later, when the United States Preventive Services Task
Force recommended that regular screening mammography not start until
age 50, all hell broke loose. Sheryl Crow, Whoopi Goldberg, and Olivia
Newton-John raised their voices in protest; a few dozen non-boldface
women picketed
the Department of Health and Human Services. If you didn’t look too
closely, it almost seemed as if the women’s health movement of the
1970s and 1980s had returned in full force.

Never mind that Dr. Susan Love, author of what the New York Times
dubbed “the bible for women with breast cancer,” endorses the new
guidelines along with leading women’s health groups like Breast Cancer
Action, the National Breast Cancer Coalition,
and the National Women’s Health Network (NWHN). For years, these groups
have been warning about the excessive use of screening mammography in
the U.S., which carries its own dangers and leads to no detectable
lowering of breast cancer mortality relative to less mammogram-happy
nations.

Nonetheless, on CNN last week, we had the unsettling spectacle of
NWHN director and noted women’s health advocate Cindy Pearson speaking
out for the new guidelines, while ordinary women lined up to attribute
their survival from the disease to mammography. Once upon a time,
grassroots women challenged the establishment by figuratively burning
their bras. Now, in some masochistic perversion of feminism, they are
raising their voices to yell, “Squeeze our tits!”

When the Stupak anti-choice amendment passed, and so entered the
health reform bill, no congressional representative stood up on the
floor of the House to recount how access to abortion had saved her life
or her family’s well-being. And where were the tea-baggers when we
needed them? If anything represents the true danger of “government
involvement” in health care, it’s a health reform bill that – if the
Senate enacts something similar -- will snatch away all but the
wealthiest women’s right to choose.

It’s not just that abortion is deemed a morally trickier issue than mammography. To some extent, pink-ribbon culture has
replaced feminism as a focus of female identity and solidarity. When a
corporation wants to signal that it’s “woman friendly,” what does it
do? It stamps a pink ribbon on its widget and proclaims that some
minuscule portion of the profits will go to breast cancer research.
I’ve even seen a bottle of Shiraz called “Hope” with a pink ribbon on
its label, but no information, alas, on how much you have to drink to
achieve the promised effect. When Laura Bush traveled to Saudi Arabia
in 2007, what grave issue did she take up with the locals? Not women’s
rights (to drive, to go outside without a man, etc.), but “breast
cancer awareness.” In the post-feminist United States, issues like
rape, domestic violence, and unwanted pregnancy seem to be too edgy for
much public discussion, but breast cancer is all apple pie.

So welcome to the Women’s Movement 2.0: Instead of the proud female
symbol -- a circle on top of a cross -- we have a droopy ribbon.
Instead of embracing the full spectrum of human colors -- black, brown,
red, yellow, and white -- we stick to princess pink. While we used to
march in protest against sexist laws and practices, now we race or walk
“for the cure.” And while we once sought full “consciousness” of all
that oppresses us, now we’re content to achieve “awareness,” which has
come to mean one thing -- dutifully baring our breasts for the annual
mammogram.

Look, the issue here isn’t health-care costs. If the current levels
of screening mammography demonstrably saved lives, I would say go for
it, and damn the expense. But the numbers are increasingly insistent:
Routine mammographic screening of women under 50 does not
reduce breast cancer mortality in that group, nor do older women
necessarily need an annual mammogram. In fact, the whole dogma about
“early detection” is shaky, as Susan Love reminds us:
the idea has been to catch cancers early, when they’re still small,
but some tiny cancers are viciously aggressive, and some large ones
aren’t going anywhere.

One response
to the new guidelines has been that numbers don’t matter -- only
individuals do -- and if just one life is saved, that’s good enough. So
OK, let me cite my own individual experience. In 2000, at the age of
59, I was diagnosed with Stage II breast cancer on the basis of one
dubious mammogram followed by a really bad one, followed by a biopsy.
Maybe I should be grateful that the cancer was detected in time, but
the truth is, I’m not sure whether these mammograms detected the tumor
or, along with many earlier ones, contributed to it: One known
environmental cause of breast cancer is radiation, in amounts easily
accumulated through regular mammography.

And why was I bothering with this mammogram in the first place? I
had long ago made the decision not to spend my golden years undergoing
cancer surveillance, but I wanted to get my Hormone Replacement Therapy
(HRT) prescription renewed, and the nurse practitioner wouldn’t do that
without a fresh mammogram.

As for the HRT, I was taking it because I had been convinced, by the
prevailing medical propaganda, that HRT helps prevent heart disease and
Alzheimer’s. In 2002, we found out that HRT is itself a risk factor for
breast cancer (as well as being ineffective at warding off heart
disease and Alzheimer’s), but we didn’t know that in 2000. So did I get
breast cancer because of the HRT -- and possibly because of the
mammograms themselves -- or did HRT lead to the detection of a cancer I
would have gotten anyway?

I don’t know, but I do know that that biopsy was followed by the
worst six months of my life, spent bald and barfing my way through
chemotherapy. This is what’s at stake here: Not only the possibility
that some women may die because their cancers go undetected, but that
many others will lose months or years of their lives to debilitating
and possibly unnecessary treatments.

You don’t have to be suffering from “chemobrain”
(chemotherapy-induced cognitive decline) to discern evil, iatrogenic,
profit-driven forces at work here. In a recent column
on the new guidelines, patient-advocate Naomi Freundlich raises the
possibility that “entrenched interests -- in screening, surgery,
chemotherapy and other treatments associated with diagnosing more and
more cancers -- are impeding scientific evidence.” I am particularly
suspicious of the oncologists, who saw their incomes soar starting in
the late 80s when they began administering and selling chemotherapy
drugs themselves in their ghastly, pink-themed, “chemotherapy suites.”
Mammograms recruit women into chemotherapy, and of course, the
pink-ribbon cult recruits women into mammography.

What we really need is a new women’s health movement, one that’s
sharp and skeptical enough to ask all the hard questions: What are the
environmental (or possibly life-style) causes of the breast
cancer epidemic? Why are existing treatments like chemotherapy so toxic
and heavy-handed? And, if the old narrative of cancer’s progression
from “early” to “late” stages no longer holds, what is the
course of this disease (or diseases)? What we don’t need, no matter how
pretty and pink, is a ladies’ auxiliary to the cancer-industrial
complex.

November 03, 2009

If you can't find any swine flu vaccine for your kids, it won't be for
a lack of positive thinking. In fact, the whole flu snafu is being
blamed on "undue optimism" on the part of both the Obama administration
and Big Pharma.

Optimism is supposed to be good for our health. According to the
academic "positive psychologists," as well as legions of unlicensed
life coaches and inspirational speakers, optimism wards off common
illnesses, contributes to recovery from cancer, and extends longevity.
To its promoters, optimism is practically a miracle vaccine, so
essential that we need to start inoculating Americans with it in the
public schools -- in the form of "optimism training."

But optimism turns out to be less than salubrious when it comes to public
health. In July, the federal government promised to have 160 million
doses of H1N1 vaccine ready for distribution by the end of October.
Instead, only 28 million doses are now ready to go, and optimism is the
obvious culprit. "Road to Flu Vaccine Shortfall, Paved With Undue
Optimism," was the headline of a front page article in the October 26th
New York Times. In the conventional spin, the vaccine shortage
is now "threatening to undermine public confidence in government." If
the federal government couldn't get this right, the pundits are already
asking, how can we trust it with health reform?

But let's stop a minute and also ask: Who really screwed up here --
the government or private pharmaceutical companies, including
GlaxoSmithKline, Novartis, and three others that had agreed to
manufacture and deliver the vaccine by late fall? Last spring and
summer, those companies gleefully gobbled up $2 billion worth of
government contracts for vaccine production, promising to have every
American, or at least every American child and pregnant woman, supplied
with vaccine before trick-or-treating season began.

According to Health and Human Services Secretary Kathleen Sebelius, the
government was misled by these companies, which failed to report
manufacturing delays as they arose. Her department, she says, was
"relying on the manufacturers to give us their numbers, and as soon as
we got numbers we put them out to the public. It does appear now that
those numbers were overly rosy."

If, in fact, there's a political parable here, it's about Big
Government's sweetly trusting reliance on Big Business to safeguard the
public health: Let the private insurance companies manage health
financing; let profit-making hospital chains deliver health care; let
Big Pharma provide safe and affordable medications. As it happens,
though, all these entities have a priority that regularly overrides the
public's health, and that is, of course, profit -- which has led
insurance companies to function as "death panels," excluding those who
might ever need care, and for-profit hospitals to turn away the
indigent, the pregnant, and the uninsured.

As
for Big Pharma, the truth is that they're just not all that into
vaccines, traditionally preferring to manufacture drugs for such
plagues as erectile dysfunction, social anxiety, and restless leg
syndrome. Vaccines can be tricky and less than maximally profitable to
manufacture. They go out of style with every microbial mutation, and
usually it's the government, rather than cunning direct-to-consumer
commercials, that determines who gets them. So it should have been no
surprise that Big Pharma approached the H1N1 problem ploddingly, using
a 50-year-old technology involving the production of the virus in
chicken eggs, a method long since abandoned by China and the European
Union.

Chicken eggs are fine for omelets, but they have quickly proved to
be a poor growth medium for the viral "seed" strain used to make H1N1
vaccine. There are alternative "cell culture" methods
that could produce the vaccine much faster, but in complete defiance of
the conventional wisdom that private enterprise is always more
innovative and resourceful than government, Big Pharma did not demand
that they be made available for this year's swine flu epidemic. Just
for the record, those alternative methods have been developed with
government funding, which is also the source of almost all our basic
knowledge of viruses.

So, thanks to the drug companies, optimism has been about as
effective in warding off H1N1 as amulets or fairy dust. Both the
government and Big Pharma were indeed overly optimistic about the
latter's ability to supply the vaccine, leaving those of us who are
involved in the care of small children with little to rely on but hope
-- hope that the epidemic will fade out on its own, hope that our loved
ones have the luck to survive it.

And contrary to the claims of the positive psychologists, optimism
itself is neither an elixir, nor a life-saving vaccine. Recent studies
show that optimism -- or positive feelings -- does not affect recovery
from a variety of cancers, including those of the breast, lungs, neck,
and throat. Furthermore, the evidence that optimism prolongs life has
turned out to be shaky at best: one study of nuns frequently cited as
proof positive of optimism's healthful effects turned out, in fact,
only to show that nuns who wrote more eloquently about their vows in
their early twenties tended to outlive those whose written statements
were clunkier.

Are we ready to abandon faith-based medicine of both the individual and
public health variety? Faith in private enterprise and the market has
now left us open to a swine flu epidemic; faith alone -- in the form of
optimism or hope -- does not kill viruses or cancer cells. On the
public health front, we need to socialize vaccine manufacture as well
as its distribution. Then, if the supply falls short, we can always
impeach the president. On the individual front, there's always soap and
water.

A TomDispatch audio interview with Ehrenreich accompanies this piece.

October 13, 2009

Feminism made women miserable. This, anyway, seems to be the most
popular takeaway from "The Paradox of Declining Female Happiness," a recent study by Betsey Stevenson and Justin Wolfers which purports to show that women have become steadily unhappier since 1972. Maureen Dowd and Arianna Huffington greeted the news with somber perplexity, but the more common response has been a triumphant: I told you so.

On Slate's DoubleX website,
a columnist concluded from the study that "the feminist movement of the
1960s and 1970s gave us a steady stream of women's complaints disguised
as manifestos… and a brand of female sexual power so promiscuous that
it celebrates everything from prostitution to nipple piercing as a
feminist act -- in other words, whine, womyn, and thongs." Or as
Phyllis Schlafly put it,
more soberly: "[T]he feminist movement taught women to see themselves
as victims of an oppressive patriarchy in which their true worth will
never be recognized and any success is beyond their reach...
[S]elf-imposed victimhood is not a recipe for happiness."

But it's a little too soon to blame Gloria Steinem for our
dependence on SSRIs. For all the high-level head-scratching induced by
the Stevenson and Wolfers study, hardly anyone has pointed out (1) that
there are some issues with happiness studies in general, (2) that there
are some reasons to doubt this study in particular, or (3) that, even
if you take this study at face value, it has nothing at all to say
about the impact of feminism on anyone's mood.

For starters, happiness is an inherently slippery thing to measure
or define. Philosophers have debated what it is for centuries, and even
if we were to define it simply as a greater frequency of positive
feelings than negative ones, when we ask people if they are happy, we
are asking them to arrive at some sort of average over many moods and
moments. Maybe I was upset earlier in the day after I opened the bills,
but then was cheered up by a call from a friend, so what am I really?

In one well-known psychological experiment, subjects were asked to
answer a questionnaire on life satisfaction, but only after they had
performed the apparently irrelevant task of photocopying a sheet of
paper for the experimenter. For a randomly chosen half of the subjects,
a dime had been left for them to find on the copy machine. As two
economists summarize the results: "Reported satisfaction with life was
raised substantially by the discovery of the coin on the copy machine
-- clearly not an income effect."

As for the particular happiness study under discussion, the red
flags start popping up as soon as you look at the data. Not to be
anti-intellectual about it, but the raw data on how men and women
respond to the survey reveal no discernible trend to the naked eyeball.
Only by performing an occult statistical manipulation called "ordered
probit estimates," do the authors manage to tease out any trend at all,
and it is a tiny one: "Women were one percentage point less likely than
men to say they were not too happy at the beginning of the sample
[1972]; by 2006 women were one percentage point more likely to report being
in this category." Differences of that magnitude would be stunning if
you were measuring, for example, the speed of light under different
physical circumstances, but when the subject is as elusive as happiness
-- well, we are not talking about paradigm-shifting results.

Furthermore, the idea that women have been sliding toward despair is contradicted by the one objective
measure of unhappiness the authors offer: suicide rates. Happiness is,
of course, a subjective state, but suicide is a cold, hard fact, and
the suicide rate has been the gold standard of misery since sociologist
Emile Durkheim wrote the book on it in 1897. As Stevenson and Wolfers
report -- somewhat sheepishly, we must imagine -- "contrary to the
subjective well-being trends we document, female suicide rates have
been falling, even as male suicide rates have remained roughly constant
through most of our sample [1972-2006]." Women may get the blues; men
are more likely to get a bullet through the temple.

Another distracting little data point that no one, including the
authors, seems to have much to say about is that, while "women" have
been getting marginally sadder, black women have been getting happier
and happier. To quote the authors: "…happiness has trended quite
strongly upward for both female and male African Americans… Indeed, the
point estimates suggest that well-being may have risen more strongly
for black women than for black men." The study should more accurately
be titled "The Paradox of Declining White Female Happiness," only that
might have suggested that the problem could be cured with melanin and
Restylane.

But let's assume the study is sound and that (white) women have become
less happy relative to men since 1972. Does that mean that feminism
ruined their lives?

Not according to Stevenson and Wolfers, who find that "the relative
decline in women's well-being... holds for both working and
stay-at-home mothers, for those married and divorced, for the old and
the young, and across the education distribution" -- as well as for
both mothers and the childless. If feminism were the problem, you might
expect divorced women to be less happy than married ones and employed
women to be less happy than stay-at-homes. As for having children, the
presumed premier source of female fulfillment: They actually make women
less happy.

And if the women's movement was such a big downer, you'd expect the
saddest women to be those who had some direct exposure to the noxious
effects of second wave feminism. As the authors report, however, "there
is no evidence that women who experienced the protests and enthusiasm
in the 1970s have seen their happiness gap widen by more than for those
women who were just being born during that period."

What this study shows, if anything, is that neither marriage nor
children make women happy. (The results are not in yet on nipple
piercing.) Nor, for that matter, does there seem to be any problem with
"too many choices," "work-life balance," or the "second shift." If you
believe Stevenson and Wolfers, women's happiness is supremely
indifferent to the actual conditions of their lives, including poverty
and racial discrimination. Whatever "happiness" is...

So why all the sudden fuss about the Wharton study, which first
leaked out two years ago anyway? Mostly because it's become a launching
pad for a new book by the prolific management consultant Marcus
Buckingham, best known for First, Break All the Rules and Now, Discover Your Strengths. His new book, Find Your Strongest Life: What the Happiest and Most Successful Women Do Differently,
is a cookie-cutter classic of the positive-thinking self-help genre:
First, the heart-wrenching quotes from unhappy women identified only by
their email names (Countess1, Luveyduvy, etc.), then the stories of
"successful" women, followed by the obligatory self-administered test
to discover "the role you were bound to play" (Creator, Caretaker,
Influencer, etc.), all bookended with an ad for the many related
products you can buy, including a "video introduction" from Buckingham,
a "participant's guide" containing "exercises" to get you to happiness,
and a handsome set of "Eight Strong Life Plans" to pick from. The Huffington Post has given Buckingham a column in which to continue his marketing campaign.

It's an old story: If you want to sell something, first find the
terrible affliction that it cures. In the 1980s, as silicone implants
were taking off, the doctors discovered "micromastia" -- the "disease"
of small-breastedness. More recently, as big pharma searches furiously
for a female Viagra, an amazingly high 43% of women have been found to
suffer from "Female Sexual Dysfunction," or FSD. Now, it's unhappiness,
and the range of potential "cures" is dazzling: Seagram's, Godiva, and
Harlequin, take note.

Think Before You Pink™, a project of Breast Cancer Action, launched in 2002 in response to the growing concern about the number of pink ribbon products on the market. The campaign calls for more transparency and accountability by companies that take part in breast cancer fundraising, and encourages consumers to ask critical questions about pink ribbon promotions.

August 04, 2009

To judge from most of the commentary on the Gates-Crowley affair, you would think that a "black elite" has gotten dangerously out of hand. First Gates (Cambridge, Yale, Harvard) showed insufficient deference to Crowley, then Obama (Occidental, Harvard) piled on to accuse the police of having acted "stupidly." Was this "the end of white America" which the Atlantic had warned of in its January/February cover story? Or had the injuries of class – working class in Crowley’s case – finally trumped the grievances of race?

Left out of the ensuing tangle of commentary on race and class has been the increasing impoverishment – or, we should say, re-impoverishment – of African Americans as a group. In fact, the most salient and lasting effect of the current recession may turn out to be the decimation of the black middle class. According to a study by Demos and the Institute for Assets and Social Policy, 33 percent of the black middle class was already in danger of falling out of the middle class at the start of the recession. Gates and Obama, along with Oprah and Cosby, will no doubt remain in place, but millions of the black equivalents of Officer Crowley – from factory workers to bank tellers and white-collar managers – are sliding down toward destitution.

For African Americans – and to a large extent, Latinos – the recession is over. It occurred between 2000 and 2007, as black employment decreased by 2.4 percent and incomes declined by 2.9 percent. During the seven-year-long black recession, one third of black children lived in poverty and black unemployment – even among college graduates – consistently ran at about twice the level of white unemployment. That was the black recession. What’s happening now is a depression.

Black unemployment is now at 14.7 percent, compared to 8.7 percent for whites. In New York City, black unemployment has been rising four times as fast as that of whites. Lawrence Mishel, president of the Economic Policy Institute, estimates that 40 percent of African Americans will have experienced unemployment or underemployment by 2010, and this will increase child poverty from one-third of African-American children to slightly over half. No one can entirely explain the extraordinary rate of job loss among African Americans, though factors may include the relative concentration of blacks in the hard-hit retail and manufacturing sectors, as well as the lesser seniority of blacks in better-paying white-collar positions.

But one thing is certain: The longstanding racial "wealth gap" makes African Americans particularly vulnerable to poverty when job loss strikes. In 1998, the net worth of white households on average was $100,700 higher than that of African-Americans. By 2007, this gap had increased to $142,600. The Survey of Consumer Finances, which is supported by the Federal Reserve Board, collects this data every three years -- and every time it has been collected, the racial wealth gap has widened. To put it another way: in 2004, for every dollar of wealth held by the typical white family, the African American family had only 12 cents. In 2007, it had exactly a dime. So when an African American breadwinner loses a job, there are usually no savings to fall back on, no well-heeled parents to hit up, no retirement accounts to raid.

All this comes on top of the highly racially skewed subprime mortgage calamity. After decades of being denied mortgages on racial grounds, African Americans made a tempting market for bubble-crazed lenders like Countrywide, with the result that high-income blacks were almost twice as likely as low-income whites to receive high-interest subprime loans. According to the Center for Responsible Lending, Latinos will end up losing between $75 billion and $98 billion in home-value wealth from subprime loans, while blacks will lose between $71 billion and $92 billion. United for a Fair Economy has called this family net-worth catastrophe the "greatest loss of wealth for people of color in modern U.S. history."

Yet in the depths of this African American depression, some commentators, black as well as white, are still obsessing about the supposed cultural deficiencies of the black community. In a December op-ed in the Washington Post, Kay Hymowitz blamed black economic woes on the fact that 70 percent of black children are born to single mothers, not noticing that the white two-parent family has actually declined at a faster rate than the black two-parent family. The share of black children living in a single-parent home increased by 155 percent between 1960 and 2006, while the share of white children living in single-parent homes increased by a staggering 229 percent.

Just last month on NPR, commentator Juan Williams dismissed the NAACP by saying that more up-to-date and relevant groups focus on "people who have taken advantage of integration and opportunities for education, employment, versus those who seem caught in generational cycles of poverty," which he went on to characterize by drug use and crime. The fact that there is an ongoing recession disproportionately affecting the African American middle class – and brought on by Wall Street greed rather than "ghetto" values – seems to have eluded him.

We don’t need any more moralizing or glib analyses of class and race that could have just as well been made in the 70s. The recession is changing everything. It’s redrawing the class contours of America in ways that will leave us more polarized than ever, and, yes, profoundly hurting the erstwhile white middle and working classes. But the depression being experienced by people of color threatens to do something on an entirely different scale, and that is to eliminate the black middle class.

Barbara Ehrenreich is the president of United Professionals and author, most recently, of "This Land Is Their Land: Reports From a Divided Nation."

Dedrick Muhammad is a Senior Organizer and Research Associate of the Institute for Policy Studies.

June 15, 2009

THE human side of the recession, in the new media genre that's been called "recession porn," is the story of an incremental descent from excess to frugality, from ease to austerity. The super-rich give up their personal jets; the upper middle class cut back on private Pilates classes; the merely middle class forgo vacations and evenings at Applebee's. In some accounts, the recession is even described as the "great leveler," smudging the dizzying levels of inequality that characterized the last couple of decades and squeezing everyone into a single great class, the Nouveau Poor, in which we will all drive tiny fuel-efficient cars and grow tomatoes on our porches.

But the outlook is not so cozy when we look at the effects of the recession on a group generally omitted from all the vivid narratives of downward mobility - the already poor, the estimated 20 percent to 30 percent of the population who struggle to get by in the best of times. This demographic, the working poor, have already been living in an economic depression of their own. From their point of view, "the economy," as a shared condition, is a fiction.

This spring, I tracked down a couple of the people I had met while working on my 2001 book, "Nickel and Dimed," in which I worked in low-wage jobs like waitressing and housecleaning, and I found them no more gripped by the recession than by "American Idol"; things were pretty much "same old." The woman I called Melissa in the book was still working at Wal-Mart, though in nine years, her wages had risen to $10 an hour from $7. "Caroline," who is increasingly disabled by diabetes and heart disease, now lives with a grown son and subsists on occasional cleaning and catering jobs. We chatted about grandchildren and church, without any mention of exceptional hardship.

As with Denise Smith, whom I recently met through the Virginia Organizing Project and whose bachelor's degree in history qualifies her for seasonal $10-an-hour work at a tourist site, the recession is largely an abstraction. "We were poor," Ms. Smith told me cheerfully, "and we're still poor."

But then, at least if you inhabit a large, multiclass extended family like my own, there comes that e-mail message with the subject line "Need your help," and you realize that bad is often just the stage before worse. The note was from one of my nephews, and it reported that his mother-in-law, Peg, was, like several million other Americans, about to lose her home to foreclosure.

It was the back story that got to me: Peg, who is 55 and lives in rural Missouri, had been working three part-time jobs to support her disabled daughter and two grandchildren, who had moved in with her. Then, last winter, she had a heart attack, missed work and fell behind in her mortgage payments. If I couldn't help, all four would have to move into the cramped apartment in Minneapolis already occupied by my nephew and his wife.

Only after I'd sent the money did I learn that the mortgage was not a subprime one and the home was not a house but a dilapidated single-wide trailer that, as a "used vehicle," commands a 12-percent mortgage interest rate. You could argue, without any shortage of compassion, that "Low-Wage Worker Loses Job, Home" is nobody's idea of news.

In late May I traveled to Los Angeles - where the real unemployment rate, including underemployed people and those who have given up on looking for a job, is estimated at 20 percent - to meet with a half-dozen community organizers. They are members of a profession, derided last summer by Sarah Palin, that helps low-income people renegotiate mortgages, deal with eviction when their landlords are foreclosed and, when necessary, organize to confront landlords and bosses.

The question I put to this rainbow group was: "Has the recession made a significant difference in the low-income communities where you work, or are things pretty much the same?" My informants - from Koreatown, South Central, Maywood, Artesia and the area around Skid Row - took pains to explain that things were already bad before the recession, and in ways that are disconnected from the larger economy. One of them told me, for example, that the boom of the '90s and early 2000s had been "basically devastating" for the urban poor. Rents skyrocketed; public housing disappeared to make way for gentrification.

But yes, the recession has made things palpably worse, largely because of job losses. With no paychecks coming in, people fall behind on their rent and, since there can be as long as a six-year wait for federal housing subsidies, they often have no alternative but to move in with relatives. "People are calling me all the time," said Preeti Sharma of the South Asian Network. "They think I have some sort of magic."

The organizers even expressed a certain impatience with the Nouveau Poor, once I introduced the phrase. If there's a symbol for the recession in Los Angeles, Davin Corona of Strategic Actions for a Just Economy said, it's "the policeman facing foreclosure in the suburbs." The already poor, he said - the undocumented immigrants, the sweatshop workers, the janitors, maids and security guards - had all but "disappeared" from both the news media and public policy discussions.

Disappearing with them is what may be the most distinctive and compelling story of this recession. When I got back home, I started calling up experts, like Sharon Parrott, a policy analyst at the Center on Budget and Policy Priorities, who told me, "There's rising unemployment among all demographic groups, but vastly more among the so-called unskilled."

How much more? Larry Mishel, the president of the Economic Policy Institute, offers data showing that blue-collar unemployment is increasing three times as fast as white-collar unemployment. The last two recessions - in the early '90s and in 2001 - produced mass white-collar layoffs, and while the current one has seen plenty of downsized real-estate agents and financial analysts, the brunt is being borne by the blue-collar working class, which has been sliding downward since deindustrialization began in the '80s.

When I called food banks and homeless shelters around the country, most staff members and directors seemed poised to offer press-pleasing tales of formerly middle-class families brought low. But some, like Toni Muhammad at Gateway Homeless Services in St. Louis, admitted that mostly they see "the long-term poor," who become even poorer when they lose the kind of low-wage jobs that had been so easy for me to find from 1998 to 2000. As Candy Hill, a vice president of Catholic Charities U.S.A., put it, "All the focus is on the middle class - on Wall Street and Main Street - but it's the people on the back streets who are really suffering."

What are the stations between poverty and destitution? Like the Nouveau Poor, the already poor descend through a series of deprivations, though these are less likely to involve forgone vacations than missed meals and medications. The Times reported earlier this month that one-third of Americans can no longer afford to comply with their prescriptions.

There are other, less life-threatening, ways to try to make ends meet. The Associated Press has reported that more women from all social classes are resorting to stripping, although "gentlemen's clubs," too, have been hard-hit by the recession. The rural poor are turning increasingly to "food auctions," which offer items that may be past their sell-by dates.

And for those who like their meat fresh, there's the option of urban hunting. In Racine, Wis., a 51-year-old laid-off mechanic told me he's supplementing his diet by "shooting squirrels and rabbits and eating them stewed, baked and grilled." In Detroit, where the wildlife population has mounted as the human population ebbs, a retired truck driver is doing a brisk business in raccoon carcasses, which he recommends marinating with vinegar and spices.

The most common coping strategy, though, is simply to increase the number of paying people per square foot of dwelling space - by doubling up or renting to couch-surfers. It's hard to get firm numbers on overcrowding, because no one likes to acknowledge it to census-takers, journalists or anyone else who might be remotely connected to the authorities. At the legal level, this includes Peg taking in her daughter and two grandchildren in a trailer with barely room for two, or my nephew and his wife preparing to squeeze all four of them into what is essentially a one-bedroom apartment. But stories of Dickensian living arrangements abound.

In Los Angeles, Prof. Peter Dreier, a housing policy expert at Occidental College, says that "people who've lost their jobs, or at least their second jobs, cope by doubling or tripling up in overcrowded apartments, or by paying 50 or 60 or even 70 percent of their incomes in rent." Thelmy Perez, an organizer with Strategic Actions for a Just Economy, is trying to help an elderly couple who could no longer afford the $600 a month rent on their two-bedroom apartment, so they took in six unrelated subtenants and are now facing eviction. According to a community organizer in my own city, Alexandria, Va., the standard apartment in a complex occupied largely by day laborers contains two bedrooms, each housing a family of up to five people, plus an additional person laying claim to the couch.

Overcrowding - rural, suburban and urban - renders the mounting numbers of the poor invisible, especially when the perpetrators have no telltale cars to park on the street. But if this is sometimes a crime against zoning laws, it's not exactly a victimless one. At best, it leads to interrupted sleep and long waits for the bathroom; at worst, to explosions of violence. Catholic Charities is reporting a spike in domestic violence in many parts of the country, which Candy Hill attributes to the combination of unemployment and overcrowding.

And doubling up is seldom a stable solution. According to Toni Muhammad, about 70 percent of the people seeking emergency shelter in St. Louis report they had been living with relatives "but the place was too small." When I asked Peg what it was like to share her trailer with her daughter's family, she said bleakly, "I just stay in my bedroom."

The deprivations of the formerly affluent Nouveau Poor are real enough, but the situation of the already poor suggests that hard times do not necessarily presage a greener, more harmonious future with a flatter distribution of wealth. There are no data yet on the effects of the recession on measures of inequality, but historically the effect of downturns is to increase, not decrease, class polarization.

The recession of the '80s transformed the working class into the working poor, as manufacturing jobs fled to the third world, forcing American workers into the low-paying service and retail sector. The current recession is knocking the working poor down another notch - from low-wage employment and inadequate housing toward erratic employment and no housing at all. Comfortable people have long imagined that American poverty is far more luxurious than the third world variety, but the difference is rapidly narrowing.

Maybe "the economy," as depicted on CNBC, will revive again, restoring the kinds of jobs that sustained the working poor, however inadequately, before the recession. Chances are, though, that they still won't pay enough to live on, at least not at any level of safety and dignity. In fact, hourly wage growth, which had been running at about 4 percent a year, has undergone what the Economic Policy Institute calls a "dramatic collapse" in the last six months alone. In good times and grim ones, the misery at the bottom just keeps piling up, like a bad debt that will eventually come due.

Barbara Ehrenreich is the author, most recently, of "This Land Is Their Land: Reports From a Divided Nation."

February 23, 2009

I like to think that some of the things I write cause discomfort in those readers who deserve to feel it. Ideally, they should squirm, they should flinch, they might even experience fleeting gastrointestinal symptoms. But I have always drawn the line at torture. It may be unpleasant to read some of my writings, especially if they have been assigned by a professor, but it should not result in uncontrollable screaming, genital mutilation or significant blood loss.

With such stringent journalistic ethics in place, I was shocked to read in the February 14th Daily Mail Online a brief article headed "Food writer's online guide to building an H-bomb...the 'evidence' that put this man in Guantanamo." The "food writer" was identified as me, and the story began:

A British 'resident' held at Guantanamo Bay was identified as a terrorist after confessing he had visited a 'joke' website on how to build a nuclear weapon, it was revealed last night. Binyam Mohamed, a former UK asylum seeker, admitted to having read the 'instructions' after allegedly being beaten, hung up by his wrists for a week and having a gun held to his head in a Pakistani jail.

While I am not, and have never been, a "food writer," other details about the "joke" rang true, such as the names of my co-authors, Peter Biskind and physicist Michio Kaku. Rewind to 1979, when Peter and I were working for a now-defunct left-wing magazine named Seven Days. The government had just suppressed the publication of another magazine, The Progressive, for attempting to print an article called "The H-Bomb Secret." I don't remember that article, and the current editor of The Progressive recalls only that it contained a lot of physics and was "Greek to me." Both in solidarity with The Progressive and in defense of free speech, we at Seven Days decided to do a satirical article entitled "How to Make Your Own H-Bomb," offering step-by-step instructions for assembling a bomb using equipment available in one's own home.

The satire was not subtle. After discussing the toxicity of plutonium, we advised that to avoid ingesting it orally, "Never make an A-bomb on an empty stomach." My favorite section dealt with the challenge of enriching uranium hexafluoride:

First transform the gas into a liquid by subjecting it to pressure. You can use a bicycle pump for this. Then make a simple home centrifuge. Fill a standard-size bucket one-quarter full of liquid uranium hexafluoride. Attach a six-foot rope to the bucket handle. Now swing the rope (and attached bucket) around your head as fast as possible. Keep this up for about 45 minutes. Slow down gradually, and very gently put the bucket on the floor. The U-235, which is lighter, will have risen to the top, where it can be skimmed off like cream. Repeat this step until you have the required 10 pounds of uranium. (Safety note: Don't put all your enriched uranium hexafluoride in one bucket. Use at least two or three buckets and keep them in separate corners of the room. This will prevent the premature build-up of a critical mass.)

Our H-bomb cover story created a bit of a stir at the time, then vanished into the attics and garages of former Seven Days staffers, only to resurface, at least in part, on the Internet in the early 2000s. Today, you can find it quoted on the blog of a University of Dayton undergraduate (http://port80.blogsome.com/2005/03/13/how-not-to-build-a-thermonuclear-bomb), along with the flattering comment: "This forum post is priceless. It is one of the best pieces of scientific satire I have ever seen. I can only hope and pray that terrorist groups attempt to construct an atomic bomb using these instructions - if they survive the attempt, they'll have at least wasted months of effort."

Enter Binyam Mohamed, an Ethiopian refugee and legal resident of Britain who had found work as a janitor after drug problems derailed his college career. According to his lawyer, Clive Stafford Smith of the human rights group Reprieve, Mohamed traveled to Afghanistan in 2001, attracted by the Taliban's drug-free way of life - which, from my point of view, was a little like upgrading from bronchitis to lung cancer. War soon drove him out of Afghanistan and to Karachi, from where he sought to return to the U.K. But, as a refugee, he lacked a proper passport and was using a friend's, which led to his apprehension at the airport. Smith says the Pakistanis turned him over to the FBI, who were obsessed at the time with the possibility of an Al Qaeda nuclear attack on the U.S. After repeated beatings and the above-mentioned hanging by the wrists, Mohamed "confessed" to having read an article on how to make an H-bomb on the Internet, insisting to his interrogators that it was a "joke."

But post-9/11 America was an irony-free zone, and it's still illegal to banter about bombs in the presence of airport security staff. It's not clear how the news of Mohamed's H-bomb knowledge was conveyed to Washington - many documents remain classified or have not been released - but Smith speculates that the part about the H-bomb got through, although not the part about the joke. The result, anyhow, was that Mohamed was thrust into a world of unending pain - tortured at the U.S. prison at Bagram, rendered to Morocco for 18 months of further torture, including repeated cutting of his penis with a scalpel, and finally landing in Guantanamo for almost five years of more mundane abuse. He was just released and returned to Britain today.

As if that were not enough for a satirist to have on her conscience, the U.S. seems to have attributed Mohamed's presumed nuclear ambitions to a second man, an American citizen named Jose Padilla, aka the "dirty bomber." The apparent evidence? Padilla had been scheduled to fly on the same flight out of Karachi that Mohamed had a ticket for, so obviously they must have been confederates. Commenting on Padilla's apprehension in 2002, the Chicago Sun-Times editorialized: "We castigate ourselves for failing to grasp the reality of what they're [the alleged terrorists are] trying to do, but perhaps that is a good thing. We should have difficulty staring evil in the face."

I am not histrionic enough to imagine myself in any way responsible for the torments suffered by Mohamed and Padilla - at least no more responsible than any other American who failed to rise up in revolutionary anger against the Bush terror regime. No, I'm too busy seething over another irony: Whenever I've complained about my country's torturings, renderings, detentions, etc., there's always been some smug bastard ready to respond that these measures are what guarantee smart-alecky writers like myself our freedom of speech. Well, we had a government so vicious and impenetrably stupid that it managed to take my freedom of speech and turn it into someone else's living hell.

January 12, 2009

Ever on the lookout for the bright side of hard times, I am tempted to delete “class inequality” from my worry list. Less than a year ago, it was one of the biggest economic threats on the horizon, with even hard-line conservative pundits grousing that wealth was flowing uphill at an alarming rate, leaving the middle class stuck with stagnating incomes while the new super-rich ascended to the heavens in their personal jets. Then the whole top-heavy structure of American capitalism began to totter, and – poof! – inequality all but vanished from the public discourse. A financial columnist in the Chicago Sun-Times has just announced that the recession is a “great leveler,” serving to “democratize the agony,” as we all tumble into “the Nouveau Poor…”

The media have been pelting us with heart-wrenching stories about the neo-suffering of the Nouveau Poor, or at least the Formerly Super-rich among them: Foreclosures in Greenwich CT! A collapsing market for cosmetic surgery! Sales of Gulfstream jets declining! Neiman Marcus and Saks Fifth Avenue on the ropes! We read of desperate measures, like having to cut back the personal trainer to two hours a week. Parties have been canceled; dinner guests have been offered, gasp, baked potatoes and chili. The New York Times relates the story of a New Jersey teenager whose parents were forced to cut her $100 a week allowance and private Pilates classes. In one of the most pathetic tales of all, New Yorker Alexandra Penney relates how she lost her life savings to Bernie Madoff and is now faced with having to lay off her three-day-a-week maid, Yolanda. “I wear a classic clean white shirt every day of the week. I have about 40 white shirts. They make me feel fresh and ready to face whatever battles I may be fighting …” she wrote, but without Yolanda, “How am I going to iron those shirts so I can still feel like a poor civilized person?”

But hard times are no more likely to abolish class inequality than Obama’s inauguration is likely to eradicate racism. No one actually knows yet whether inequality has increased or decreased during the last year of recession, but the historical precedents are not promising. The economists I’ve talked to – like Biden’s top economic advisor, Jared Bernstein – insist that recessions are particularly unkind to the poor and the middle class. Canadian economist Armine Yalnizyan says, “Income polarization always gets worse during recessions.” It makes sense. If the stock market has shrunk your assets from $500 million to a mere $250 million, you may have to pass on a third or fourth vacation home. But if you’ve just lost an $8-an-hour job, you’re looking at no home at all.

All right, I’m a journalist, and I understand how the media work. When a millionaire cuts back on his crème fraiche and caviar consumption, you have a touching human interest story. But pitch a story about a laid-off roofer who loses his trailer home and you’re likely to get a big editorial yawn. “Poor Get Poorer” is just not an eye-grabbing headline, even when the evidence is overwhelming. Food stamp applications, for example, are rising toward a historic record; calls to one DC-area hunger hotline have jumped 248 percent in the last six months, most of them from people who have never needed food aid before. And for the first time since 1996, there’s been a marked upswing in the number of people seeking cash assistance from TANF (Temporary Assistance for Needy Families), the exsanguinated version of welfare left by welfare “reform.” Too bad for them that TANF is essentially a wage-supplement program based on the assumption that the poor would always be able to find jobs, and that it pays, at most, less than half the federal poverty level.

Why do the sufferings of the poor and the downwardly mobile class matter more than the tiny deprivations of the rich? Leaving aside all the soft-hearted socialist, Christian-type arguments, it’s because poverty and the squeeze on the middle class are a big part of what got us into this mess in the first place. Only one thing kept the sub-rich spending in the ’00s, and hence kept the economy going, and that was debt: credit card debt, home equity loans, car loans, college loans and of course the now famously “toxic” subprime mortgages, which were bundled and sliced into “securities” and marketed to the rich as high-interest investments throughout the world. The gross inequality of American society wasn’t just unfair or aesthetically displeasing; it created a perilously unstable situation.

Which is why any serious government attempt to get the economy going again -- and I leave aside the unserious attempts like bank bailouts and other corporate welfare projects -- has to start at the bottom. Obama is promising to generate three million new jobs in “shovel ready” projects, and let’s hope they’re not all jobs for young men with strong backs. Until those jobs kick in, and in case they leave out the elderly, the single moms and the downsized desk-workers, we’re going to need an economic policy centered on the poor: more money for food stamps, for Medicaid, unemployment insurance, and, yes, cash assistance along the lines of what welfare once was, so that when people come tumbling down they don’t end up six feet under. For those who think “welfare” sounds too radical, we could just call it a “right to life” program, only one in which the objects of concern have already been born.

If that sounds politically unfeasible, consider this: When Clinton was cutting welfare and food stamps in the 90s, the poor were still an easily marginalized group, subjected to the nastiest sorts of racial and gender stereotyping. They were lazy, promiscuous, addicted deadbeats, as whole choruses of conservative experts announced. Thanks to the recession, however -- and I knew there had to be a bright side -- the ranks of the poor are swelling every day with failed business owners, office workers, salespeople, and long-time homeowners. Stereotype that! As the poor and the formerly middle-class Nouveau Poor become the American majority, they will finally have the clout to get their needs met.

December 08, 2008

44, The Prequel

By Tom Engelhardt

Did you know that the IBM Center for the Business of Government hosts a "Presidential Transition" blog; that the Council on Foreign Relations has its own "Transition Blog: The New Administration"; and that the American University School of Communication has a "Transition Tracker" website? The National Journal offers its online readers a comprehensive "Lost in Transition" site to help them "navigate the presidential handover," including a "short list," offering not only the president-elect's key recent appointments, but also a series of not-so-short lists of those still believed to be in contention for as-yet-unfilled jobs. Think of all this as Entertainment Weekly married to People Magazine for post-election political junkies.

Newsweek features "powering up" ("blogging the transition"); the policy-wonk website Politico.com offers Politico 44 ("a living diary of the Obama presidency"); and Public Citizen has "Becoming 44," with the usual lists of appointees, possible appointees, but -- for the junkie who wants everything -- "bundler transition team members" and "lobbyist and bundler appointees" as well. (For those who want to know, for instance, White House Social Secretary-designate Desirée Rogers bundled at least $200,000 for the Obama campaign.)

The New York Times has gone whole hog at "The New Team" section of its website, where there are scads of little bios of appointees, as well as prospective appointees -- including what each individual will "bring to the job," how each is "linked to Mr. Obama," and what negatives each carries as "baggage." Think of it as a scorecard for transition junkies. The Washington Post, whose official beat is, of course, Washington D.C. über alles, has its "44: The Obama Presidency, A Transition to Power," where, in case you're planning to make a night of it on January 20th, you can keep up to date on that seasonal must-subject, the upcoming inaugural balls. And not to be outdone, the transitioning Obama transition crew has its own mega-transition site, Change.gov.

Earliest, Biggest, Fastest

And that, of course, only begins to scratch the surface of the media's transition mania -- I haven't even mentioned the cable news networks -- which has followed, with hardly a breath, nearly two years of presidential campaign mania. Let's face it, whether or not the Obama transition is the talk of Main Street and the under-populated malls of this American moment, it's certainly the talk of medialand -- and at what can only be termed historic levels, as befits a "historic" transition period.

Believe me, no one's sparing the adjectives right now. This transition is the earliest, biggest, fastest, best organized, most efficient on record, even as Obama himself has "maintained one of the most public images of any president-elect." It's cause for congratulations all around, a powerful antidote, we're told, to Bill Clinton's notoriously chaotic transition back in 1992. In fact, we can't, it seems, get enough of a transition that began to gather steam many months before November 4th and has been plowing ahead for more than a post-election month now.

It's kind of exhausting, really, just thinking about that awesomely humongous transition line-up. Check out the list of transition review teams and advisors at Change.gov and you'll find that it goes over the horizon. According to the Washington Post, 135 transition team members, organized into 10 groups, all wearing yellow badges, backed by countless transition advisers, "have swarmed into dozens of government offices, from the Pentagon to the National Council on Disability" preparing the way for the new administration. This, like so much else, has been "unprecedented."

And don't get anyone started on the veritable "army" of volunteer lawyers giving "unprecedented scrutiny" to possible administration appointees in a vetting process that began at the moment of Obama's nomination, not election. As the Washington Post's Philip Rucker described it:

"Embarrassing e-mails, text messages, diary entries and Facebook profiles? Gifts worth more than $50? Relatives linked to Fannie Mae, Freddie Mac, AIG or another company getting a federal bailout? Obama is conducting the vetting much as he managed his campaign: methodically, thoroughly and on a prodigious scale."

That process includes a distinctly unprecedented invasion of privacy via a seven-page, 63-question form that all potential appointees have had to fill out. Imagine, for instance, that after 62 "penetrating" questions on every aspect of your life, you faced this catch-all 63rd question: "Please provide any other information, including information about other members of your family, that could suggest a conflict of interest or be a possible source of embarrassment to you, your family, or the president-elect." (For anyone worried about privacy issues, what this means practically -- as Barton Gellman explained in his book Angler on the vice-presidential 200-question vetting process by which Dick Cheney chose himself as candidate and then used private information sent in by the other candidates for his own purposes -- is major dossiers on about 800 people.)

Everything in this "transition" is, in fact, more prodigious and more invasive than in any previous transition, including, of course, the ongoing media fascination with all those positions Obama is filling with "the best and the brightest." We're not just talking about his vast economic team or his national security team, but the presidential liaison to Capitol Hill, the White House press secretary, the president's speechwriter, his communications director, and his White House staff secretary, not to speak of the First Lady's deputy chief of staff and, of course, that White House social secretary. And then there's always that bout of "fantasy football for foodies," the speculation over who will be the new White House chef.

The Transition Bulks Up

Talk about confident and organized, Peter Baker and Helene Cooper of the New York Times report that Obama invited former Marine Corps Commandant Gen. James Jones to meet with him and all but offered him a key national security post "a full 13 days before the election." (He clearly felt that he had a pretty good idea of who was going to be president-elect by then.) And the rest of his transition, so efficiently organized by former Clinton White House Chief of Staff John Podesta, has been on a (steam)roll ever since. Post-November 4th, it has been rolling out the key appointments at a historically "unprecedented" pace.

Five weeks past victory, according to the Times, Obama had announced 13 of the 24 "most important positions in a new administration," including Jones as his national security adviser. At the equivalent moment in their transitions, Jimmy Carter had filled two of these positions; Ronald Reagan, two; George H.W. Bush, eight (but his was largely a carry-over administration); Bill Clinton, one; and George W. Bush (distracted by an electoral battle wending its fateful way to the Supreme Court), one.

Bated breath hardly catches the media mood, facing the thrilling almost daily roll-outs of new appointments and record numbers of president-elect press conferences against a backdrop of enough American flags to outfit a parade and announced from a White-House press-room-style podium carefully -- not to say ornately -- labeled "The Office of the President-Elect." At such moments, the Obama transition can seem anything but transitional.

Given the overwhelming, largely congratulatory focus on specific appointments and their attendant drama -- will the strong personalities of Hillary, Bob, and Jim clash? Are the Obama-ites in a desperate scramble for a new CIA Director? Is Larry Summers next in line for the Fed? -- the larger architecture of this moment, and what it portends for the presidency to come, is ignored.

Think of it this way: After the Imperial Campaign -- that two-year extravaganza of bread and circuses (and money) -- comes the Imperial Transition. Everything in these last weeks, like the preceding two years, has been bulked up, like Schwarzenegger's Conanesque pecs. In other words, since November 5th, what we've been experiencing in the midst of one of the true crisis periods in our history has essentially been an unending celebration of super-sized government. Consider it an introduction to what will surely be the next Imperial Presidency.

As the transition events indicate, whatever its specific policies of change, the administration-to-come is preparing to move, and in force, into an empty executive branch as it already exists. Wherever there's an opening, that is, Podesta's guys are rushing to fill it.

The particular transition moment that caught my eye occurred two weeks ago when the chief strategist of the Obama election campaign, David Axelrod, was appointed senior adviser to the president. To be more specific, he was given Karl Rove's old slot (and, assumedly, office) in the White House. As the Boston Globe's Peter Canellos wrote:

"[I]t's now obvious that there's one part of George W. Bush's political legacy that Obama and Axelrod aren't eager to change: the very dubious notion of having the president's campaign strategist rubbing elbows with all the policy wonks in the West Wing."

True, presidents have often wanted trusted advisors near at hand, but the institutionalization of that urge in an actual office in the White House is a new development that Obama could easily, as well as painlessly, have reversed (and many would have cheered him for it). So consider it a signal.

Barack Obama -- thank goodness -- isn't George Bush. He doesn't arrive in office with a crew wedded to a "unitary executive theory" of the presidency, or an urge to loose the executive from the supposed "chains" of the Watergate-era Congress, or to "take off the gloves" globally. He doesn't have strange, twisted, oppressive ideas about how the Constitution should work, nor assumedly do visions of a "commander-in-chief presidency" (or vice presidency) dance in his head like so many sugar plums.

But don't ignore the architecture, the deep structure of the American political system. Make no mistake, Obama is moving full-speed ahead into an executive mansion rebuilt and endlessly expanded by the national security state over the last half-century-plus, and then built up in major ways by George W.'s "team." Despite the prospect of a new dog and a mother-in-law in the White House, the president-elect and his transition team show no signs of wanting to change the basic furniture, let alone close up a few wings of the imperial mansion (other, perhaps, than the elaborate prison complex at Guantanamo).

With so many catastrophes impending and so many pundits and journalists merrily applauding the most efficient transition in American history, no one, it seems, is even thinking about the architecture.

The GM of Governments

The New York Times's David Sanger recently reported on what happened when Obama's mini-transition teams of ex-Clintonistas ventured into the heart of our post-9/11 imperial bureaucracy. Many of the team members had worked in the very same departments in the 1990s. On returning, however, they found themselves to be so many Alices in a labyrinthine new Wonderland of national security. Sanger writes:

"[S]everal say they feel more like political archaeologists. 'The buildings look the same,' one said over coffee, 'but everything inside is unrecognizable.' And as they dig, they have tripped across a few surprises… [F]ew can contain their amazement, chiefly at the sheer increase in the size of the defense and national-security apparatus."

"'For a bunch of small-government Republicans,' [said] one former denizen of the White House who has now stepped back inside for the first time in eight years, 'these guys built a hell of an empire.' Eight years ago, there were two deputy national security advisers; today there are a half-dozen, each with staff."

And don't think for a second that most or all of those half-dozen posts aren't likely to be filled by the new administration, or that, four or eight years later, we'll be back to two deputy national security advisers; nor should you imagine that the Homeland Security Department that Arizona Governor Janet Napolitano is to run, a vast, lumpy, inefficient, ineffective post-9/11 creation of the Bush administration (which now has its own embedded mini-homeland-industrial complex), will be gone in those same years, any more than that most un-American of words "homeland" is likely to leave our lexicon; nor will Barack Obama not appoint a Director of National Intelligence, another of those post-9/11 creations that added yet one more layer of bureaucracy to the 18 departments, agencies, and offices which make up the official U.S. Intelligence Community.

Don't hold your breath for that labyrinthine mess to be reduced to a more logical two or three intelligence agencies; nor will that 2002 creation of the Bush administration, the U.S. Northern Command, another militarization of "the homeland" now in the process of bulking up, be significantly downsized or abolished in the coming years.

On all of this, the Bush administration has gone out of its way to lend a hand to Obama's transition team and, in the process, help institutionalize the imperial transition itself. Like the new money arrangements pioneered in the 2008 elections, it surely will remain part of the political landscape for the foreseeable future. From such developments in our world, it seems, there's never any turning back.

There's nothing strange about all this, of course, if you're already inside this system. It seems, in fact, too obvious to mention. After all, what president wouldn't move into the political/governmental house he's inheriting as efficiently and fully as possible?

The unprecedented size of this imperial pre-presidency, however, signals something else: that what is to come -- quite aside from the specific policies adopted by a future Obama administration -- will be yet another imperial presidency. (And, by the way, those who expect Congress to suddenly become the player it hasn't been, wielding power long ceded, are as likely to be disappointed as those who expect a Hillary Clinton State Department renaissance under the budgetary shadow of the Pentagon.)

On January 20th, Barack Obama will be more prepared than any president in recent history to move in and, as everyone now likes to write, "hit the ground running." But that ground -- the bloated executive and the vast national security apparatus that goes with it (as well as the U.S. military garrisons that dot the planet), all further engorged by George W., Dick, and pals -- is anything but fertile when it comes to "change."

Maybe if the imperial presidency and the national security state worked, none of this would matter. But how can they, given the superlatives that apply to them? They're oversized, over-muscled, overweight, overly expensive, overly powerful, and overly intrusive.

Bottom line: they are problem creators, not problem solvers. To expect one genuine "decider," moving in at the top, to put them on a diet-and-exercise regimen is asking a lot. After all, at the end of the George Bush era, what we have is the GM of governments, and when things start to go wrong, who's going to bail it out?

Tom Engelhardt, co-founder of the American Empire Project, runs the Nation Institute's TomDispatch.com. He is the author of The End of Victory Culture, a history of the American Age of Denial. The World According to TomDispatch: America in the New Age of Empire (Verso, 2008), a collection of some of the best pieces from his site and an alternative history of the mad Bush years, has recently been published.