Archive for the ‘Warner’ Category

Judith Warner, in “The Lure of Opulent Desolation,” says the “baby busters” of the ’60s never rebelled against the trappings of domesticity represented by their images of the 1950s. Many of them, deep down, yearn for it. Mr. Herbert says we should “Add Up the Damage.” He says he doesn’t think George W. Bush should be allowed to slip quietly out of town. There should be a loud, collective angry howl and demonstrations over the damage he’s done to this country. Amen, brother! Here’s Ms. Warner:

About seven years ago, not long after settling into a little house on a tree-lined street in a city neighborhood all but indistinguishable from the suburbs surrounding it, I developed a brief obsession with mid-20th-century American anomie. I read “The Man in the Gray Flannel Suit” and “The Organization Man.” I re-read “The Feminine Mystique.”

And I devoured Richard Yates’s “Revolutionary Road,” a then largely overlooked book that I found one day among the paperbacks in our local bookstore, snatching it up for what its jacket promised would be “the most evocative portrayal” of suburban “opulent desolation.” (“What in God’s name was the point or the meaning or the purpose of a life like this?” was the sort of gratifying payoff I found within its pages.)

I approached these books, I’ll admit, with a kind of prurient interest, a combination of revulsion and irresistible attraction, thoroughly enjoying the sad and sordid sexual repression, the infantilization of women, the cookie-cutter conformity imposed upon men. I couldn’t get enough of the miserable domestic underbelly of life in the period we like to call “the Fifties,” an era that spans the late ’40s to the mid-’60s. Some of the fascination was a kind of exoticism. More, however, came from the fact that, I found, in our era of “soccer moms,” “surrendered wives” and “new traditionalism,” the look and sound of the opulent desolation was eerily familiar.

I soon had a steady stream of new material to feed my craving for Lucky Strike- and martini-scented domestic disturbances. The films “Far From Heaven” and “The Hours.” The TV series “Mad Men.” And now, of course, “Revolutionary Road,” the movie, repackaging what USA Today recently called “the savagery of post-war domesticity” for the Oscars.

Why is there such a desire, even a hunger, to recreate images from such an unhappy past? A past characterized by every possible form of bigotry? A past, furthermore, that people like the “Mad Men” creator Matthew Weiner and the directors of “Revolutionary Road,” “Far From Heaven” and “The Hours” can’t possibly remember, having been born, like me, in the 1960s?

“Part of the show is trying to figure out,” Weiner told The Times’s Alex Witchel last June, “what is the deal with my parents. Am I them? Because you know you are.”

There’s some of that, I think. But there’s also much more.

Unlike the baby boomers before us, we “baby busters” of the ’60s never rebelled against the trappings of domesticity represented by our images of the 1950s. Many of us, deep down, yearn for it, having experienced divorce or other sorts of family dislocation in the 1970s. We keep alive a secret dream of “a model of routine and order and organization and competence,” a life “where women kept house, raised kids and kept their eyebrows looking really good,” as the writer Lonnae O’Neal Parker once described it in The Washington Post Magazine.

But that order and routine and competence in our frenetic world prove forever elusive, a cruel ideal we can never reach.

The fact is: as an unrebellious, cautious, anxious generation, many of us are living lives not all that different from those of the parents of the early 1960s, yet without the seeming ease, privileges and benefits. Husbands have been stripped of the power perks of their gender, wives of the anticipation that they’ll be taken care of for life.

How we seem to love and hate those men and women we never knew. What we would give to know their secrets: how Dad managed to come home at 5 p.m. to read the paper or watch TV while Mom fixed dinner and bathed the kids. How Mom turned up at school, every day, unrumpled, coiffed, unflappable. And more to the point: how they managed to afford the lives that they led, on one salary, without hocking their homes to pay for college, without worrying about being bankrupted by medical bills.

How we make them pay now, when we breathe them back into life. Our cultural representations of them are punishing. We defile the putative purity of the housewives — those doe-eyed, frivolous, almost simple-minded depressives — by assigning them drunken, cheating, no-good mates. We discredit the memory of the organization men by filling them with self-loathing and despair. Each gender invites its downfall, and fully deserves the comeuppance that history, we know, will ultimately deal it.

That’s where the pleasure comes in. No matter how lost we are, no matter how confused, no matter how foolish we feel, we can judge ourselves the winners.

Whatever… Here’s Mr. Herbert:

Does anyone know where George W. Bush is?

You don’t hear much from him anymore. The last image most of us remember is of the president ducking a pair of size 10s that were hurled at him in Baghdad.

We’re still at war in Iraq and Afghanistan. Israel is thrashing the Palestinians in Gaza. And the U.S. economy is about as vibrant as the 0-16 Detroit Lions.

But hardly a peep have we heard from George, the 43rd.

When Mr. Bush officially takes his leave in three weeks (in reality, he checked out long ago), most Americans will be content to sigh good riddance. I disagree. I don’t think he should be allowed to slip quietly out of town. There should be a great hue and cry — a loud, collective angry howl, demonstrations with signs and bullhorns and fiery speeches — over the damage he’s done to this country.

This is the man who gave us the war in Iraq and Guantánamo and torture and rendition; who turned the Clinton economy and the budget surplus into fool’s gold; who dithered while New Orleans drowned; who trampled our civil liberties at home and ruined our reputation abroad; who let Dick Cheney run hog wild and thought Brownie was doing a heckuva job.

The Bush administration specialized in deceit. How else could you get the public (and a feckless Congress) to go along with an invasion of Iraq as an absolutely essential response to the Sept. 11 attacks, when Iraq had had nothing to do with the Sept. 11 attacks?

Exploiting the public’s understandable fears, Mr. Bush made it sound as if Iraq was about to nuke us: “We cannot wait,” he said, “for the final proof — the smoking gun that could come in the form of a mushroom cloud.”

He then set the blaze that has continued to rage for nearly six years, consuming more than 4,000 American lives and hundreds of thousands of Iraqis. (A car bomb over the weekend killed two dozen more Iraqis, many of them religious pilgrims.) The financial cost to the U.S. will eventually reach $3 trillion or more, according to the Nobel laureate economist Joseph Stiglitz.

A year into the war Mr. Bush was cracking jokes about it at the annual dinner of the Radio and Television Correspondents Association. He displayed a series of photos that showed him searching the Oval Office, peering behind curtains and looking under the furniture. A mock caption had Mr. Bush saying: “Those weapons of mass destruction have got to be somewhere.”

And then there’s the Bush economy, another disaster, a trapdoor through which middle-class Americans can plunge toward the bracing experiences normally reserved for the poor and the destitute.

Mr. Bush traveled the country in the early days of his presidency, promoting his tax cut plans as hugely beneficial to small-business people and families of modest means. This was more deceit. The tax cuts would go overwhelmingly to the very rich.

The president would give the wealthy and the powerful virtually everything they wanted. He would throw sand into the regulatory apparatus and help foster the most extreme income disparities since the years leading up to the Great Depression. Once again he was lighting a fire. This time the flames would engulf the economy and, as with Iraq, bring catastrophe.

If the U.S. were a product line, it would be seen now as deeply damaged goods, subject to recall.

There seemed to be no end to Mr. Bush’s talent for destruction. He tried to hand the piggy bank known as Social Security over to the marauders of the financial sector, but saner heads prevailed.

In New Orleans, the president failed to intervene swiftly and decisively to aid the tens of thousands of poor people who were very publicly suffering and, in many cases, dying. He then compounded this colossal failure of leadership by traveling to New Orleans and promising, in a dramatic, floodlit appearance, to spare no effort in rebuilding the flood-torn region and the wrecked lives of the victims.

He went further, vowing to confront the issue of poverty in America “with bold action.”

It was all nonsense, of course. He did nothing of the kind.

The catalog of his transgressions against the nation’s interests — sins of commission and omission — would keep Mr. Bush in a confessional for the rest of his life. Don’t hold your breath. He’s hardly the contrite sort.

He told ABC’s Charlie Gibson: “I don’t spend a lot of time really worrying about short-term history. I guess I don’t worry about long-term history, either, since I’m not going to be around to read it.”

The president chuckled, thinking — as he did when he made his jokes about the missing weapons of mass destruction — that there was something funny going on.

Judith Warner, in “Living the Off-Label Life,” says our brain function is enhanced by drinking coffee, by eating nutritious food, even by getting a good night’s sleep. Should taking brain-enhancing drugs be another step along that continuum? Charles Blow, in “Heaven for the Godless,” says that in a recent poll a majority of respondents said that religions other than theirs could lead to eternal life — and that people who are atheist and nonreligious could achieve it. Mr. Herbert urges us to “Stop Being Stupid.” He says that over the past few decades the American way has been to pay for things with money that wasn’t there. Americans must resolve to be smarter going forward. Here’s Ms. Warner:

What if you could just take a pill and all of a sudden remember to pay your bills on time? What if, thanks to modern neuroscience, you could, simultaneously, make New Year’s Eve plans, pay the mortgage, call the pediatrician, consolidate credit card debt and do your job — well — without forgetting dentist appointments or neglecting to pick up your children at school?

Would you do it? Tune out the distractions of our online, on-call, too-fast A.D.D.-ogenic world with focus and memory-enhancing medications like Ritalin or Adderall? Stay sharp as a knife — no matter how overworked and sleep-deprived — with a mental-alertness-boosting drug like the anti-narcolepsy medication Provigil?

I’ve always said no. Fantasy aside, I’ve always rejected the idea of using drugs meant for people with real neurological disorders to treat the pathologies of everyday life.

Most of us, viscerally, do. Cognitive enhancement — a practice typified by the widely reported abuse of psychostimulants by college students cramming for exams, and by the less reported but apparently growing use of mind-boosters like Provigil among in-the-know scientists and professors — goes against the grain of some of our most basic beliefs about fairness and meritocracy. It seems to many people to be unnatural, inhuman, hubristic, pure cheating.

That’s why when Henry Greely, director of Stanford Law School’s Center for Law and the Biosciences, published an article, with a host of co-authors, in the science journal Nature earlier this month suggesting that we ought to rethink our gut reactions and “accept the benefits of enhancement,” he was deluged with irate responses from readers.

“There were three kinds of e-mail reactions,” he told me in a phone interview last week. “ ‘How much crack are you smoking? How much money did your friends in pharma give you? How much crack did you get from your friends in pharma?’ ”

As Americans, our default setting on matters of psychotropic drugs — particularly when it comes to medicating those who are not very ill — tends to be, as the psychiatrist Gerald Klerman called it in 1972, something akin to “pharmacological Calvinism.” People should suffer and endure, the thinking goes, accept what hard work and their God-given abilities bring them and hope for no more.

But Greely and his Nature co-authors suggest that such arguments are outdated and intellectually dishonest. We enhance our brain function all the time, they say — by drinking coffee, by eating nutritious food, by getting an education, even by getting a good night’s sleep. Taking brain-enhancing drugs should be viewed as just another step along that continuum, one that’s “morally equivalent” to such “other, more familiar, enhancements,” they write.

Normal life, unlike sports competitions, they argue, isn’t a zero-sum game, where one person’s doped advantage necessarily brings another’s disadvantage. A surgeon whose mind is extra-sharp, a pilot who’s extra alert, a medical researcher whose memory is fine-tuned to make extraordinary connections, are able to work not just to their own benefit, but for that of countless people. “Cognitive enhancement,” they write, “unlike enhancement for sports competitions, could lead to substantive improvements in the world.”

I’m not convinced of that. I’m not sure that pushing for your personal best — all the time — is tantamount to truly being the best person you can be. I have long thought that a life so frenetic and fractured that it drives “neuro-normal” people to distraction, leaving them sleep-deprived and exhausted, demands — indeed, screams for — systemic change.

But now I do wonder: What if the excessive demands of life today are creating ever-larger categories of people who can’t reach their potential due to handicaps that in an easier time were just quirks? (Absent-minded professor-types were, for generations, typically men who didn’t need to be present — organized and on-time — for their kids.) Is it any fairer to saddle a child with a chronically overwhelmed parent than with one suffering from untreated depression?

And, furthermore, how much can most of us, on a truly meaningful scale, change our lives? At a time of widespread layoffs and job anxiety among those still employed, can anyone but the most fortunate afford to cut their hours to give themselves time to breathe? Can working parents really sacrifice on either side of the wage-earning/life-making equation? It’s disturbing to think that we just have to make do with the world we now live in. But to do otherwise is for most people an impossible luxury.

For some of us, saddled with brains ill-adapted to this era, and taxed with way too many demands and distractions, pharmacological Calvinism may now be a luxury, too.

Here’s Mr. Blow:

In June, the Pew Forum on Religion and Public Life published a controversial survey in which 70 percent of Americans said that they believed religions other than theirs could lead to eternal life.

This threw evangelicals into a tizzy. After all, the Bible makes it clear that heaven is a velvet-roped V.I.P. area reserved for Christians. Jesus said so: “I am the way, the truth and the life: no man cometh unto the Father, but by me.” But the survey suggested that Americans just weren’t buying that.

The evangelicals complained that people must not have understood the question. The respondents couldn’t actually believe what they were saying, could they?

So in August, Pew asked the question again. (They released the results last week.) Sixty-five percent of respondents said — again — that other religions could lead to eternal life. But this time, to clear up any confusion, Pew asked them to specify which religions. The respondents essentially said all of them.

And they didn’t stop there. Nearly half also thought that atheists could go to heaven — dragged there kicking and screaming, no doubt — and most thought that people with no religious faith also could go.

What on earth does this mean?

One very plausible explanation is that Americans just want good things to come to good people, regardless of their faith. As Alan Segal, a professor of religion at Barnard College, told me: “We are a multicultural society, and people expect this American life to continue the same way in heaven.” He explained that in our society, we meet so many good people of different faiths that it’s hard for us to imagine God letting them go to hell. In fact, in the most recent survey, Pew asked people what they thought determined whether a person would achieve eternal life. Nearly as many Christians said you could achieve eternal life by just being a good person as said that you had to believe in Jesus.

Also, many Christians apparently view their didactic text as flexible. According to Pew’s August survey, only 39 percent of Christians believe that the Bible is the literal word of God, and 18 percent think that it’s just a book written by men and not the word of God at all. In fact, on the question in the Pew survey about what it would take to achieve eternal life, only 1 percent of Christians said living life in accordance with the Bible.

Now, there remains the possibility that some of those polled may not have understood the implications of their answers. As John Green, a senior fellow at the Pew Forum, said, “The capacity of ignorance to influence survey outcomes should never be underestimated.” But I don’t think that they are ignorant about this most basic tenet of their faith. I think that they are choosing to ignore it … for goodness sake.

And now here’s Mr. Herbert:

I’ve got a new year’s resolution and a new slogan for the country.

The resolution may be difficult, but it’s essential. Americans must resolve to be smarter going forward than we have been for the past several years.

Look around you. We have behaved in ways that were incredibly, astonishingly and embarrassingly stupid for much too long. We’ve wrecked the economy and mortgaged the future of generations yet unborn. We don’t even know if we’ll have an automobile industry in the coming years. It’s time to stop the self-destruction.

The slogan? “Invest in the U.S.” By that I mean we should stop squandering the nation’s wealth on unnecessary warfare overseas and mindless consumption here at home and start making sensible investments in the well-being of the American people and the long-term health of the economy.

The mind-boggling stupidity that we’ve indulged in was hammered home by a comment almost casually delivered by, of all people, Bernie Madoff, the mild-mannered creator of what appears to have been a nuclear-powered Ponzi scheme. Madoff summed up his activities with devastating simplicity. He is said to have told the F.B.I. that he “paid investors with money that wasn’t there.”

Somehow, over the past few decades, that has become the American way: to pay for things — from wars to Wall Street bonuses to flat-screen TVs to video games — with money that wasn’t there.

Something for nothing became the order of the day. You want to invade Iraq? Convince yourself that oil revenues out of Baghdad will pay for it. (Meanwhile, carve out another deficit channel in the federal budget.) You want to pump up profits in the financial sector? End the oversight and let the lunatics in the asylum run wild.

For those who wanted a bigger house in a nicer neighborhood, there were mortgages with absurdly easy terms. Credit-card offers came in the mail like confetti, and we used them like there was no tomorrow. For students stunned by the skyrocketing cost of tuition, there were college loans that could last a lifetime.

Money that wasn’t there.

Plenty of people managed their credit wisely. But much of the country, including many of the top government officials and financial titans who were supposed to be guarding the nation’s wealth, acted as if there would never be a day of reckoning, a day when — inevitably — the soaring markets would crash and the bubbles explode.

We were stupid in so many ways. We shipped American jobs overseas by the millions and came up with the fiction that this was a good deal for just about everybody. We could have and should have taken the time and made the effort to think globalization through, to be smarter about it and craft ways to cushion its more harmful effects and to share its benefits more equitably.

We bought into the dopey idea that you could radically cut taxes and still maintain critical government services — and fight two wars to boot!

We were living in a dream world. The general public, and to a great extent the press, closed its eyes to the increasingly complex and baffling machinations of the financial industry, which kept screaming that oversight would ruin everything.

We should have known better. It didn’t require a genius (or even an economics degree) to understand a crucial point that popped up some years ago in a front-page article in The Wall Street Journal: “Markets are a great way to organize economic activity, but they need adult supervision.”

Did Alan Greenspan not understand that? Bob Rubin? Larry Summers?

Now that the reality of a stunning economic downturn has so roughly intervened, we at least have the option of being smarter going forward. There is broad agreement that we have no choice but to go much more deeply into debt to jump-start the economy. But we have tremendous choices as to how we use that debt.

We should use it to invest in the U.S. — in a world-class infrastructure (in its broadest sense) to serve as the platform for a world-class, 21st-century economy, and in a system of education that actually prepares American youngsters to deal successfully with the real world they will be encountering.

We need to invest in a health care system that improves the quality of American lives, enhances productivity, puts large numbers of additional people to work and eases the competitive burden of U.S. corporations.

We need to care for our environment (if long-term survival means anything to us) and get serious about weaning ourselves from foreign oil.

And, finally, we need to start living within our means and get past the nauseating idea that the essence of our culture and the be-all and end-all of the American economy is the limitless consumption of trashy consumer goods.

Merry Christmas! Happy Hanukkah! Bobo is off today — savor the gift. Ms. Warner asks “Do You Believe?” She says there is too much reason and self-awareness. Too little wonder, marvel and faith — in the largest and vaguest sense of the word. Mr. Herbert, in “A Race to the Bottom,” says it’s time we refocused our lens on American workers and tried to see them in a fairer, more appreciative light. Gee, ya think? Here’s Ms. Warner:

Last Thursday, I put the finishing touches on my column and then, throwing all caution and reason to the wind, I began to bake cookies. For teacher gifts. At 9:30 p.m.

This was a very stupid idea.

I am a terrible baker, under the best of circumstances. I am good for nothing after 10 p.m., when the preventive meds I take for migraines kick in. I should have known that embarking on intricate melting and mixing and measuring at the end of a long day, with an overeager and overtired 8-year-old at my elbow (“Vacation starts tomorrow, Mommy! Who cares about sleep?”), was a recipe for disaster.

But it’s the season of miracles, and I was not to be deterred.

For I am resolved, this Christmas-and-Hanukkah-time, to be cheery and happy and hopeful. This despite the bad economy. This despite the string of late-night mess-ups (It’s not my fault! The dog ate my gifts!) that netted the teachers a mere four cookies apiece, separated by a very generous amount of brightly colored tissue.

And this despite — and, indeed, in large part because of — the steady stream of e-mail news roundups I’ve been receiving for the past couple of weeks, all sending me the message that for sensitive, thoughtful, decent souls, really, the holidays are no time to be happy at all.

The miserable missives began before Thanksgiving, when the Council on Contemporary Families sent out a helpful list of sources and story ideas on topics like “Not Home for the Holidays” (for parents estranged from their adult children), “How to Keep Your Sex Life Alive During Hard Economic Times and Holiday Stress” (a two-fer), “The Gift That Keeps on Giving: Gender Traps and the Holidays,” and, from a person called a “cyberpsychology specialist,” “Long-Distance Couples and Happy Holidays: Digital Lifesavers and Land Mines.”

More recently, Psych Central, a news consolidator that has the gall to call itself “Your Mental Health Break,” sent out an eight-page compilation of holiday-themed links to stories like “Ho-Ho-Ho-Holiday (Office) Parties: Bah Humbug?,” “Seasonal Stepfamily Stress,” “Managing Children’s Expectations” and “Transforming Holiday Angst Into Gratitude.”

At first, I greeted the e-mails with a smile. ADDitude magazine kicked off the season with an article on The Holiday Husband (“he can only handle things for so long”), which was accompanied by a picture of a man trying to read his morning newspaper as his wife anxiously stood over him trying to Make Plans. This made me laugh so hard that I spilled coffee all over my desk and immediately forwarded the article to my husband, Max, with the thought that we might send it out instead of a family Christmas card. (I received no response.)

But, after weeks of “coping” tips, “de-stressing” guides and how-to’s on “beating the holiday blues,” I ended up kind of sad. And vaguely resentful.

It wasn’t that I was tempted to make light of people’s difficulties, cast doubt upon their mental-health issues or deny that the holidays can be stressful for all the reasons that they’re supposed to be fun. It’s not as though I am any stranger to “Managing Unhappy Relatives at Holiday Time,” as Psych Central puts it. I know that for many people, anticipating happiness as family gatherings loom is an act of faith on a par with expecting Santa Claus to come down the chimney.

But without some belief in the possibility of happiness, without some willful suspension of our attunement to the dreariness of reality, the holiday season really is nothing more than a forced march of shopping wrapped in a laundry list of neuroses.

One of the greatest risks, I think, of living in a secular world — as I, semi-regretfully, do — is something I might call the Woody Allenization of everything. Too much reason. Too much self-awareness. Too much blah-blah. Too little wonder, and marvel and faith — in the largest and vaguest sense of the word.

You have, in this climate, to carve out whatever little islands of belief that you can. My 11-year-old daughter, Julia, resolutely refuses not to believe in Santa Claus. No number of mall Santas, varying in face and demeanor, no amount of presents delivered straight to the door by UPS, can shake her from her faith. This is, I suppose, her way of preserving a sense of mystery and miracle in her otherwise desiccatedly secular world, where chocolate Santas at school have gone the way of the Pledge of Allegiance. And I admire her greatly for it.

I haven’t been able to believe in Santa Claus since first grade. But I will, this year, approach our extended family Christmas gathering with an anticipation of joy and peace.

Miracles, after all, can happen, they say. If you believe.

Here’s Mr. Herbert:

Toward the end of an important speech in Washington last month, the president of the American Federation of Teachers, Randi Weingarten, said to her audience:

“Think of a teacher who is staying up past midnight to prepare her lesson plan… Think of a teacher who is paying for equipment out of his own pocket so his students can conduct science experiments that they otherwise couldn’t do… Think of a teacher who takes her students to a ‘We, the People’ debating competition over the weekend, instead of spending time with her own family.”

Ms. Weingarten was raising a cry against the demonizing of teachers and the widespread, uninformed tendency to cast wholesale blame on teachers for the myriad problems with American public schools. It reminded me of the way autoworkers have been vilified and blamed by so many for the problems plaguing the Big Three automakers.

But Ms. Weingarten’s defense of her members was not the most important part of the speech. The key point was her assertion that with schools in trouble and the economy in a state of near-collapse, she was willing to consider reforms that until now have been anathema to the union, including the way in which tenure is awarded, the manner in which teachers are assigned and merit pay.

It’s time we refocused our lens on American workers and tried to see them in a fairer, more appreciative light.

Working men and women are not getting the credit they deserve for the jobs they do without squawking every day, for the hardships they are enduring in this downturn and for the collective effort they are willing to make to get through the worst economic crisis in the U.S. in decades.

In testimony before the U.S. Senate this month, the president of the United Auto Workers, Ron Gettelfinger, listed some of the sacrifices his members have already made to try to keep the American auto industry viable.

Last year, before the economy went into free fall and before any talk of a government rescue, the autoworkers agreed to a 50 percent cut in wages for new workers at the Big Three, reducing starting pay to a little more than $14 an hour.

That is a development that the society should mourn. The U.A.W. had traditionally been a union through which workers could march into the middle class. Now the march is in the other direction.

Mr. Gettelfinger noted that his members “have not received any base wage increase since 2005 at G.M. and Ford, and since 2006 at Chrysler.”

Some 150,000 jobs at General Motors, Ford and Chrysler have vanished outright through downsizing over the past five years. And like the members of Ms. Weingarten’s union (and other workers across the country, whether unionized or not), the autoworkers are prepared to make further sacrifices as required, as long as they are reasonably fair and part of a shared effort with other sectors of the society.

We need some perspective here. It is becoming an article of faith in the discussions over an auto industry rescue that unionized autoworkers should be taken off their high horses and shoved into a deal in which they would not make significantly more in wages and benefits than comparable workers at Japanese carmakers like Toyota.

That’s fine if it’s agreed to by the autoworkers themselves in the context of an industry bailout at a time when the country is in the midst of a financial emergency. But it stinks to high heaven as something we should be aspiring to.

The economic downturn, however severe, should not be used as an excuse to send American workers on a race to the bottom, where previously middle-class occupations take a sweatshop’s approach to pay and benefits.

The U.A.W. has been criticized because its retired workers have had generous pensions and health coverage. There’s a horror! I suppose it would have been better if, after 30 or 35 years on the assembly line, those retirees had been considerate enough to die prematurely in poverty, unable to pay for the medical services that could have saved them.

Randi Weingarten and Ron Gettelfinger know the country is going through a terrible period. Their workers, like most Americans, are already getting clobbered and worse is to come.

But there is no downturn so treacherous that it is worth sacrificing the long-term interests — or, equally important — the dignity of their members.

Teachers and autoworkers are two very different cornerstones of American society, but they are cornerstones nonetheless. Our attitudes toward them are a reflection of our attitudes toward working people in general. If we see teachers and autoworkers as our enemies, we are in serious need of an attitude adjustment.

Judith Warner, in “Getting Beyond Camelot,” says Caroline Kennedy is a very capable woman, but that doesn’t mean she should be handed Hillary Clinton’s soon-to-be-vacated United States Senate seat. Mr. Krugman, in “The Madoff Economy,” says the vast riches achieved by those who managed other people’s money have had a corrupting effect on our society as a whole. Here’s Ms. Warner:

Caroline Kennedy is, by all accounts, a smart, decent and very capable woman. There is no reason why she shouldn’t enter politics and why she couldn’t have a good shot at winning an election.

That doesn’t mean she should be handed Hillary Clinton’s soon-to-be-vacated United States Senate seat.

Running for office and getting a high-class government handout are two very different things.

I suppose Caroline can’t be blamed entirely for having a bit of a blind spot when it comes to the fine line between deserving accomplishment and political entitlement. In 1960, when her father was elected president, vacating his seat in the Senate, he wanted his brother, Ted, to take his place. But Ted was too young to serve. So the Kennedy camp convinced Foster Furcolo, the Massachusetts governor, to appoint Benjamin Smith, an old college friend of Jack’s, “to keep the seat warm” until Ted turned 30 and could run in 1962.

But J.F.K. did sometimes worry, presidential historian Doris Kearns Goodwin told me this week, that the public might sour on the idea of a Kennedy family dynasty.

The early 1960s were more indulgent times. In 2008, however, I’m not sure we can afford to extend excessive amounts of public generosity to the wealthy and well-connected. It doesn’t strike me as desirable or — for New York Democrats in particular, and even for Caroline herself — very wise.

We are living in a moment when all the machinations, the corner-cutting, the inside deals, mutual back-scratching and indifference to the larger world of our nation’s wealthiest and most interconnected have led us straight into the ground. We’ve just elected a president who’s sworn to clean things up. We’re in the middle of a political-appointment fiasco in Illinois.

With lawmakers and taxpayers eyeing bonuses and corporate jets with angry incredulity, we’ve arrived, after years of worshipping the very wealthy, at what could be a very positive time of reckoning. This change could go a long way toward restoring people’s faith in the fairness and decency of our leaders and institutions.

In keeping with the times, it would be an appealing act of humility if Caroline Kennedy aimed her first shot at politics a bit lower — say, at the House of Representatives. Perhaps she could run for Representative Carolyn Maloney’s seat if Maloney were, as she and her supporters would like her to be, named to Clinton’s Senate seat.

A number of major national women’s groups have endorsed Maloney for that position. Their leadership has been uncharacteristically quiet since Caroline entered the fray. But Marcia Pappas, president of the New York State chapter of the National Organization for Women, told me, politely: “We can have a number of people who are qualified and we have to be very respectful of people and their talents, but when it comes down to it, we have to be grown-ups. We think this position should go to someone who’s paid her dues, who’s done the work.”

That’s the operative word: “work.” I do think that the next United States senator from New York ought to be someone who has worked for the honor. Clearly, Caroline can’t, for the sake of her political viability — or her likability with people like me — suddenly remake herself into someone who has worked for a living. But at this point, with so many people struggling so arduously just to get by, any effort at all would be appreciated. True campaigning — at the appropriate time — would prove her mettle pretty fast.

It might even be liberating. It can’t be fun to live your life defined — in the public eye at least — by your tragic past. At age 51, still having to be the “lucky little girl with a pony and an impossibly handsome father,” our “tragic national princess,” as The Washington Post’s Ruth Marcus put it last week, can’t be too great, no matter how many strangers say they like you. (“I like her … There’s magic in the Kennedy name that attracts power and support and love,” wrote radio host Rob Kall on The Huffington Post.)

Such love is a bit unseemly. Even embarrassing. “Confusing Politics With the Lifetime Channel,” Andrew Sullivan’s Daily Dish blog headlined the subject this week.

Caroline doesn’t have to be a fairy-tale princess anymore. She can be her own white knight, vaulting the Kennedys proudly into the 21st century, if only she plays by the rules and waits her turn.

Here’s Prof. Krugman:

The revelation that Bernard Madoff — brilliant investor (or so almost everyone thought), philanthropist, pillar of the community — was a phony has shocked the world, and understandably so. The scale of his alleged $50 billion Ponzi scheme is hard to comprehend.

Yet surely I’m not the only person to ask the obvious question: How different, really, is Mr. Madoff’s tale from the story of the investment industry as a whole?

The financial services industry has claimed an ever-growing share of the nation’s income over the past generation, making the people who run the industry incredibly rich. Yet, at this point, it looks as if much of the industry has been destroying value, not creating it. And it’s not just a matter of money: the vast riches achieved by those who managed other people’s money have had a corrupting effect on our society as a whole.

Let’s start with those paychecks. Last year, the average salary of employees in “securities, commodity contracts, and investments” was more than four times the average salary in the rest of the economy. Earning a million dollars was nothing special, and even incomes of $20 million or more were fairly common. The incomes of the richest Americans have exploded over the past generation, even as wages of ordinary workers have stagnated; high pay on Wall Street was a major cause of that divergence.

But surely those financial superstars must have been earning their millions, right? No, not necessarily. The pay system on Wall Street lavishly rewards the appearance of profit, even if that appearance later turns out to have been an illusion.

Consider the hypothetical example of a money manager who leverages up his clients’ money with lots of debt, then invests the bulked-up total in high-yielding but risky assets, such as dubious mortgage-backed securities. For a while — say, as long as a housing bubble continues to inflate — he (it’s almost always a he) will make big profits and receive big bonuses. Then, when the bubble bursts and his investments turn into toxic waste, his investors will lose big — but he’ll keep those bonuses.

O.K., maybe my example wasn’t hypothetical after all.

So, how different is what Wall Street in general did from the Madoff affair? Well, Mr. Madoff allegedly skipped a few steps, simply stealing his clients’ money rather than collecting big fees while exposing investors to risks they didn’t understand. And while Mr. Madoff was apparently a self-conscious fraud, many people on Wall Street believed their own hype. Still, the end result was the same (except for the house arrest): the money managers got rich; the investors saw their money disappear.

We’re talking about a lot of money here. In recent years the finance sector accounted for 8 percent of America’s G.D.P., up from less than 5 percent a generation earlier. If that extra 3 percent was money for nothing — and it probably was — we’re talking about $400 billion a year in waste, fraud and abuse.

But the costs of America’s Ponzi era surely went beyond the direct waste of dollars and cents.

At the crudest level, Wall Street’s ill-gotten gains corrupted and continue to corrupt politics, in a nicely bipartisan way. From Bush administration officials like Christopher Cox, chairman of the Securities and Exchange Commission, who looked the other way as evidence of financial fraud mounted, to Democrats who still haven’t closed the outrageous tax loophole that benefits executives at hedge funds and private equity firms (hello, Senator Schumer), politicians have walked when money talked.

Meanwhile, how much has our nation’s future been damaged by the magnetic pull of quick personal wealth, which for years has drawn many of our best and brightest young people into investment banking, at the expense of science, public service and just about everything else?

Most of all, the vast riches being earned — or maybe that should be “earned” — in our bloated financial industry undermined our sense of reality and degraded our judgment.

Think of the way almost everyone important missed the warning signs of an impending crisis. How was that possible? How, for example, could Alan Greenspan have declared, just a few years ago, that “the financial system as a whole has become more resilient” — thanks to derivatives, no less? The answer, I believe, is that there’s an innate tendency on the part of even the elite to idolize men who are making a lot of money, and assume that they know what they’re doing.

After all, that’s why so many people trusted Mr. Madoff.

Now, as we survey the wreckage and try to understand how things could have gone so wrong, so fast, the answer is actually quite simple: What we’re looking at now are the consequences of a world gone Madoff.

Today the Times has opened the floodgates and we have 9 columnists, so I’m breaking them down into 2 sections. Judith Warner writes about “No Ordinary Woman,” and says Sarah Palin is a woman who is able to be promoted on the kinds of attributes that were once the exclusive province of unremarkable white men. I think by “unremarkable” she’s being polite and not saying “ignorant.” She does have lovely manners. Mr. Friedman ponders “If Larry and Sergey Asked for a Loan….” and suggests we can only grow our way out of this crisis — with more innovation and entrepreneurship, which create new businesses and better jobs. Mr. Egan writes about “The Party of Yesterday,” and says Republicans have been insinuating for years that some of the brightest, most productive communities in the United States are fake American. Mr. Kristof, in “The Endorsement From Hell,” says the endorsement of John McCain by a Web site linked to Al Qaeda isn’t a surprise. Four more years of blindness to nuance in the Muslim world is an excellent recruiting tool. Here’s Ms. Warner:

In 1977, Bella Abzug, the former congresswoman and outspoken feminist, said, “Our struggle today is not to have a female Einstein get appointed as an assistant professor. It is for a woman schlemiel to get as quickly promoted as a male schlemiel.”

In other words: women will truly have arrived when the most mediocre among us will be able to do just as well as the most mediocre of men.

By this standard, the watershed event for women this year was not Hillary Clinton’s near ascendancy to the top of the Democratic ticket, but Sarah Palin’s nomination as the Republicans’ No. 2.

For Clinton was a lifelong overachiever, a star in a generational vanguard who clearly took to heart the maxim that women “must do twice as well as men to be thought half as good,” and in so doing divorced herself from the world of the merely average. In that, she was not unlike Barack Obama — taxed by his race to be twice as reassuring, twice as un-angry, twice as presidential as any white candidate.

Mediocrity, after all, is the privilege of those who have arrived.

Palin is a woman who has risen to national prominence without, apparently, even remotely being twice as good as her male competitors. On the contrary, her claim to fame lies in her repudiation of Clinton-type exceptionalism.

She speaks no better — and no worse — than many of her crowd-pleasing male peers, dropping her g’s, banishing “who” in favor of “that,” issuing verbal blunders that linger just long enough to make their mark in the public mind before they’re winked away in staged apologies.

She is a woman who is able to not only get by but also be quickly promoted on the kinds of attributes that were once the exclusive province of unremarkable white men: rapport, the right looks or connections, an easy sort of familiarity.

In the days leading up to Palin’s pick as vice-presidential nominee, according to an article in The New York Times Magazine today, Rick Davis, who is John McCain’s campaign manager, said a friend had told him how best to choose a running mate: “You get a frame of Time magazine, and you put the pictures of the people in that frame. You look at who fits that frame best — that’s your V.P.”

Donny Deutsch, the ad executive turned talk show host, put it less elegantly on CNBC right after the Republican convention. “Women want to be her, men want to mate with her,” he said, describing Palin as a “new creation that the feminist movement has not figured out in 40 years.”

And this was the crux of the Palin Phenomenon: she was a breakthrough woman who threatened no one.

The McCain crowd would have you believe that Palin is the perfect representation of the post-feminist woman, a candidate whose very existence marks the end of feminism — of the old “liberal feminist agenda,” as McCain himself has put it — and the start of a more global kind of triumph for the great mass of women.

Just as some young women in recent years have argued that appearing topless on “Girls Gone Wild” is an act of sexual liberation, putting an untested Alaskan governor on the road to the White House was spun as a sign of the arrival of real, hot-blooded women into the mainstream of power.

But the finer points of what it takes for real women to make progress in seizing power don’t seem much to trouble Palin.

“Someone called me a ‘redneck woman’ once, and you know what I said back? ‘Why, thank you,’” she told the country singer Gretchen Wilson at a recent Republican rally.

I guess Palin has never seen Wilson’s “Redneck Woman” music video, which, in addition to images of an attractive Wilson driving a variety of fuel-inefficient vehicles, features a couple of stripper-styled babes dancing in cages, one of which is made of chains.

With her five children, successful political career, $1.2 million net worth and beauty pageant looks, Sarah Palin is really not an average woman, much less the worthy schlemiel envisioned by Abzug. She’s actually, as Colin Powell carefully said, quite “distinguished” — for her looks, her grace and charm, her ability to connect with an audience, her ambition and her drive. Those are admirable, even enviable qualities. But the American public, defecting from the McCain ticket in a slow bleed, is clearly not convinced that they amount to vice-presidential qualifications.

Seems like “real America” wants something more than a wife, mother or girlfriend in a female political leader.

Here’s Mr. Friedman:

The hardest thing about analyzing the Bush administration is this: Some things are true even if George Bush believes them.

Therefore, sifting through all his steps and missteps, at home and abroad, and trying to sort out what is crazy and what might actually be true — even though George Bush believes it — presents an enormous challenge, particularly amid this economic crisis.

I felt that very strongly when listening to President Bush and Treasury Secretary Hank Paulson announce that the government was going to become a significant shareholder in the country’s major banks. Both Bush and Paulson were visibly reluctant to be taking this step. It would be easy to scoff at them and say: “What do you expect from a couple of capitalists who hate any kind of government intervention in the market?”

But we should reflect on their reluctance. There may be an important message in their grimaces. The government had to step in and shore up the balance sheets of our major banks. But the question I am asking myself, and I think Paulson and Bush were asking themselves, is this: “What will this government intervention do to the risk-taking that is at the heart of capitalism?”

There is a fine line between risk-taking and recklessness. Risk-taking drives innovation; recklessness drives over a cliff. In recent years, we had way too much of the latter. We are paying a huge price for that, and we need a correction. But how do we do that without becoming so risk-averse that start-ups and emerging economies can’t get capital because banks with the government as a shareholder become exceedingly cautious?

Let’s imagine this scene: You are the president of one of these banks in which the government has taken a position. One day two young Stanford grads walk in your door. One is named Larry, and the other is named Sergey. They each are wearing jeans and a T-shirt. They tell you that they have this thing called a “search engine,” and they are naming it — get this — “Google.” They tell you to type in any word in this box on a computer screen and — get this — hit a button labeled “I’m Feeling Lucky.” Up comes a bunch of Web sites related to that word. Their start-up, which they are operating out of their dorm room, has exhausted its venture capital. They need a loan.

What are you going to say to Larry and Sergey as the president of the bank? “Boys, this is very interesting. But I have the U.S. Treasury as my biggest shareholder today, and if you think I’m going to put money into something called ‘Google,’ with a key called ‘I’m Feeling Lucky,’ you’re fresh outta luck. Can you imagine me explaining that to a Congressional committee if you guys go bust?”

And then what happens if the next day the congressman from Palo Alto, who happens to be on the House banking committee, calls you, the bank president, and says: “I understand you turned down my boys, Larry and Sergey. Maybe you haven’t been told, but I am one of your shareholders — and right now, I’m not feeling very lucky. You get my drift?”

Maybe nothing like this will ever happen. Maybe it’s just my imagination. But maybe not …

“Government bailouts and guarantees, while at times needed, always come with unintended consequences,” notes the financial strategist David Smick. “The winners: the strong, the big, the established, the domestic and the safe — the folks who, relatively speaking, don’t need the money. The losers: the new, the small, the foreign and the risky — emerging markets, entrepreneurs and small businesses not politically connected. After all, what banker in a Capitol Hill hearing now would want to defend a loan to an emerging market? Yet emerging economies are the big markets for American exports.”

Don’t get me wrong. I am not criticizing the decision to shore up the banks. And we must prevent a repeat of the reckless bundling and securitizing of mortgages, and excessive leveraging, that started this mess. We need better regulation. But most of all, we need better management.

The banks that are surviving the best today, the ones that are buying others and not being bought — like JPMorgan Chase or Banco Santander, based in Spain — are not surviving because they were better regulated than the banks across the street but because they were better run. Their leaders were more vigilant about their risk exposure than any regulator required them to be.

Bottom line: We must not overshoot in regulating the markets just because they overshot in their risk-taking. That’s what markets do. We need to fix capitalism, not install socialism. Because, ultimately, we can’t bail our way out of this crisis. We can only grow our way out — with more innovation and entrepreneurship, which create new businesses and better jobs.

So let’s keep our eyes on the prize. Save the system, install smart regulations and get the government out of the banking business as soon as possible so that the surviving banks can freely and unabashedly get back into their business: risk-taking without recklessness.

Here’s Mr. Egan, writing from Seattle:

Two years ago, a list of the nation’s brainiest cities was put together from Census Bureau reports — that is, cities with the highest percentage of college graduates, which is not the same as smart, of course.

These are vibrant, prosperous places where a knowledge economy and cool things to do after hours attract people from all over the country. Among the top 10, only two of those metro areas — Raleigh, N.C., and Lexington, Ky. — voted Republican in the 2004 presidential election.

This year, all 10 are likely to go Democratic. What’s more, with Colorado, New Hampshire and Virginia now trending blue, Republicans stand to lose the nation’s 10 best-educated states as well.

It would be easy to say these places are not the real America, in the peculiar us-and-them parlance of Sarah Palin. It’s easy to say because Republicans have been insinuating for years now that some of the brightest, most productive communities in the United States are fake American — a tactic that dates to Newt Gingrich’s reign in the Capitol.

Brainy cities have low divorce rates, low crime, high job creation, ethnic diversity and creative capitalism. They’re places like Pittsburgh, with its top-notch universities; Albuquerque, with its surging Latino middle class; and Denver, with its outdoor-loving young people. They grow good people in the smart cities.

But in the politically suicidal greenhouse that Republicans have constructed for themselves, these cities are not welcome. They are disparaged as nests of latte-sipping weenies, alt-lifestyle types and “other” Americans, somehow inauthentic.

If that’s what Republicans want, they are doomed to be the party of yesterday.

Not only are we becoming more urban as a nation, but we’re headed for an ethnic muddle that could further shrink the party of small-mindedness. By 2023, more than half of all American children will be minority, the Census Bureau projects.

Ronald Reagan was lashed by liberals for running a “Morning in America” campaign, but he knew this country, at heart, was always tomorrow-looking — and he fared very well in educated cities as well as small towns. “Whatever else history may say about me when I’m gone,” said Reagan, “I hope it will record that I appealed to your best hopes, not your worst fears.” Barack Obama, who brings that music to the stage, leads by 30 points on the “hope and optimism” question in polls.

Spurning the Reagan lesson, John McCain made a fatal error in turning his campaign over to the audience of Rush Limbaugh and Sean Hannity. In so doing, he chose the unbearable lightness of being Sarah Palin, trotted out Paris Hilton and labeled Obama a socialist who associates with terrorists.

At a recent Palin rally, the crowd started chanting, “We want Fox!” McCain has given them just that. But how isolated and out-of-touch is this audience? At the end of each debate, a sure-fire way to decide who won was to look at the Fox viewers poll — typically showing a landslide for McCain. Within a day, scientific surveys found big wins for Obama.

Whether Americans are real or fake, they can see through Palin, a woman who couldn’t correctly answer a third grader a few days ago when asked to explain the duties of vice president. Somewhere, between the shuffling to costume and accessorize Palin with a $150,000 wardrobe, her handlers never handed her a copy of the Constitution.

Republicans blow off the smart cities with the counterargument that they win the exurbs — the frontier of new homes, young families and the fresh middle class. And it’s true, in 2004, George Bush won 97 of the 100 fastest-growing counties in America.

That will not happen this year. Polls show McCain is losing 20 percent of self-described moderate Republicans. And new registration figures and other polls indicate that Obama will likely win such iconic exurban centers as Washoe County, Nev., Loudoun County, Va., and Wake County, N.C.

But in the kind of pattern that has held true since McCain went over to the stupid side, his brother recently referred to suburban northern Virginia as “communist country” and a top adviser, Nancy Pfotenhauer, said it was not “real Virginia.”

Here in Seattle, it’s become a one-party city, with a congressman for life and nodding-head liberals who seldom challenge a tax-loving city government. It would be nice, just to keep the philosophical debate sharp, if there were a few thoughtful Republicans around.

That won’t happen so long as Republicans continue to be the party of yesterday. They’ve written the cities off. Fake Americans don’t count, but this Election Day, for once, they will not feel left out.

And now here’s Mr. Kristof:

John McCain isn’t boasting about a new endorsement, one of the very, very few he has received from overseas. It came a few days ago:

“Al Qaeda will have to support McCain in the coming election,” read a commentary on a password-protected Islamist Web site that is closely linked to Al Qaeda and often disseminates the group’s propaganda.

The endorsement left the McCain campaign sputtering, and noting helplessly that Hamas appears to prefer Barack Obama. Al Qaeda’s apparent enthusiasm for Mr. McCain is manifestly not reciprocated.

“The transcendent challenge of our time [is] the threat of radical Islamic terrorism,” Senator McCain said in a major foreign policy speech this year, adding, “Any president who does not regard this threat as transcending all others does not deserve to sit in the White House.”

That’s a widespread conservative belief. Mitt Romney compared the threat of militant Islam to that from Nazi Germany or the Soviet Union. Some conservative groups even marked “Islamofascism Awareness Week” earlier this month.

Yet the endorsement of Mr. McCain by a Qaeda-affiliated Web site isn’t a surprise to security specialists. Richard Clarke, the former White House counterterrorism director, and Joseph Nye, the former chairman of the National Intelligence Council, have both suggested that Al Qaeda prefers Mr. McCain and might even try to use terror attacks in the coming days to tip the election to him.

“From their perspective, a continuation of Bush policies is best for recruiting,” said Professor Nye, adding that Mr. McCain is far more likely to continue those policies.

An American president who keeps troops in Iraq indefinitely, fulminates about Islamic terrorism, inclines toward military solutions and antagonizes other nations is an excellent recruiting tool. In contrast, an African-American president with a Muslim grandfather and a penchant for building bridges rather than blowing them up would give Al Qaeda recruiters fits.

During the cold war, the American ideological fear of communism led us to mistake every muddle-headed leftist for a Soviet pawn. Our myopia helped lead to catastrophe in Vietnam.

In the same way today, an exaggerated fear of “Islamofascism” elides a complex reality and leads us to overreact and damage our own interests. Perhaps the best example is one of the least-known failures in Bush administration foreign policy: Somalia.

Today, Somalia is the world’s greatest humanitarian disaster, worse even than Darfur or Congo. The crisis has complex roots, and Somali warlords bear primary blame. But Bush administration paranoia about Islamic radicals contributed to the disaster.

Somalia has been in chaos for many years, but in 2006 an umbrella movement called the Islamic Courts Union seemed close to uniting the country. The movement included both moderates and extremists, but it constituted the best hope for putting Somalia together again. Somalis were ecstatic at the prospect of having a functional government again.

Bush administration officials, however, were aghast at the rise of an Islamist movement that they feared would be uncooperative in the war on terror. So they gave Ethiopia, a longtime rival in the region, the green light to invade, and Somalia’s best hope for peace collapsed.

“A movement that looked as if it might end this long national nightmare was derailed, in part because of American and Ethiopian actions,” said Ken Menkhaus, a Somalia expert at Davidson College. As a result, Islamic militancy and anti-Americanism have surged, partly because Somalis blame Washington for the brutality of the Ethiopian occupiers.

“There’s a level of anti-Americanism in Somalia today like nothing I’ve seen over the last 20 years,” Professor Menkhaus said. “Somalis are furious with us for backing the Ethiopian intervention and occupation, provoking this huge humanitarian crisis.”

Patrick Duplat, an expert on Somalia at Refugees International, the Washington-based advocacy group, says that during his last visit to Somalia, earlier this year, a local mosque was calling for jihad against America — something he had never heard when he lived peacefully in Somalia during the rise of the Islamic Courts Union.

“The situation has dramatically taken a turn for the worse,” he said. “The U.S. chose a very confrontational route early on. Who knows what would have happened if the U.S. had reached out to moderates? But that might have averted the disaster we’re in today.”

The greatest catastrophe is the one endured by ordinary Somalis who now must watch their children starve. But America’s own strategic interests have also been gravely damaged.

The only winner has been Islamic militancy. That’s probably the core reason why Al Qaeda militants prefer a McCain presidency: four more years of blindness to nuance in the Muslim world would be a tragedy for Americans and virtually everyone else, but a boon for radical groups trying to recruit suicide bombers.

Bobo must be having some sort of existential meltdown. He has a rumination on baby names today. Well, I guess it’s better than his usual run of stuff. Judith Warner has some truth to relay on the “Partial Birth” ban. Here’s Bobo:

Names matter. People named Dennis and Denise are disproportionately likely to become dentists. People named Lawrence or Laurie are disproportionately likely to become lawyers. People named Louis are disproportionately likely to live in St. Louis, and people named Georgia are disproportionately likely to move to the state that served as home in “Gone With the Wind.”

As Brett Pelham of the State University of New York at Buffalo has shown in dozens of different ways, people are drawn to professions, places and people that remind them of themselves. A thing as seemingly superficial as a name can influence, even if slightly, the course of a whole life (which is why I’ve named my own children President, Laureate and Hedge Fund Manager).

Nevertheless, I didn’t become aware of the true import of names until I read Laura Wattenberg. She has taken her obsession with names — which in other hands could be regarded as an eccentricity — and has transformed it into a window on American society.

On her blog, The Baby Name Wizard, Wattenberg tracks the rise and fall of naming fashions. One of her mega-observations, which isn’t that surprising, is that we are living in the age of the long tail when it comes to naming our kids. In 1880, just 10 names — William, John, Mary, George, etc. — accounted for 20 percent of all babies. Now those 10 names account for just 2 percent of American babies.

Name conformity peaked around World War II. Since then parents have been more and more likely to seek out the unusual. “Across regions, races and classes,” Wattenberg writes, “many thousands of American parents are united by a common bond: their mutual determination to be nothing like each other.”

This observation is merely a jumping-off point. Between 1890 and 1920, as America was urbanizing, parents gave names that were paved with gold, Wattenberg observes. Girls were often named after gems — Amber, Ruby, Jewel and Opal.

In the 1950s, some surge of naming testosterone produced a lot of swaggering male names ending in the letter K: Jack, Mark and Frank, not to mention Rock, Dirk and Buck. But over the past few decades, K has moved to the front of names: Kyle, Kaitlyn and Kayla. “If any letter defines modern American name style, K is it,” Wattenberg notes.

The most astonishing change concerns the ending of boys’ names. In 1880, most boys’ names ended in the letters E, N, D and S. In 1956, the chart of final letters looked pretty much the same, with more names ending in Y. Today’s chart looks nothing like the charts of the past century. In 2006, a huge (and I mean huge) percentage of boys’ names ended in the letter N. Or as Wattenberg put it, “Ladies and gentlemen, that is a baby-naming revolution.”

Wattenberg observes a new formality sweeping nursery schools. Thirty years ago there would have been a lot of Nicks, Toms and Bills on the playground. Now they are Nicholas, Thomas and William. In 1898, the name Dewey had its moment (you should be able to figure out why). Today, antique-sounding names are in vogue: Hannah, Abigail, Madeline, Caleb and Oliver.

In the late 19th century, parents sometimes named their kids after prestigious jobs, like King, Lawyer, Author and Admiral. Now, children are more likely to bear the names of obsolete proletarian professions: Cooper, Carter, Tyler and Mason.

Wattenberg uses her blog to raise vital questions, such as whether you should give your child an unusual name that is Googleable or a conventional one that is harder to track. But what’s most striking is the sheer variability of the trends she describes.

Naming fashion doesn’t just move a little. It swings back and forth. People who haven’t spent a nanosecond thinking about the letter K get swept up in a social contagion and suddenly they’ve got a Keisha and a Kody. They may think they’re making an individual statement, but in fact their choices are shaped by the networks around them.

Furthermore, if you just looked at names, you would conclude that American culture once had a definable core — signified by all those Anglo names like Mary, Robert, John and William. But over the past few decades, that Anglo core has become harder to find. In the world of niche naming, there is no clearly identifiable mainstream.

For the past few decades, the White House has been occupied by George, William, George, Ronald, James and Richard. Those pillars are crumbling. Pluralism is here.

Well. That was special… Here’s Ms. Warner:

When the Supreme Court voted 5 to 4 to uphold the federal Partial-Birth Abortion Ban Act this spring, the ambivalently pro-choice public was largely quiescent, believing, as Congress had previously ruled, that the procedure was “gruesome and inhuman,” medically unnecessary, highly controversial in the medical community and so rare as to be little missed.

What’s clear, however, as the ban has become a reality, is that fetuses will be spared no brutality. Second trimester abortion is still legal and the most common method for it — dismembering a fetus inside the womb before removing it in pieces — is no less awful to contemplate than the outlawed procedure, in which an intact fetus’s skull was punctured and collapsed to ease its removal. But women are now more at risk. And doctors have been forced into a danger zone where they must weigh what they believe to be best medical practices against the need to protect themselves from the threat of prosecution.

This kind of ethical tightrope walk, this sort of judicial meddling into standard medical practices, is unprecedented — and poisonous. An article in this month’s issue of the trade journal Obstetrics & Gynecology treats the dilemma with a mocking tone. “At our recent Annual Clinical Meeting in San Diego, I asked several colleagues if they intended to make referrals to the Supreme Court. All said ‘No’ because the court is not available for telephone consultations and makes rounds infrequently,” it says.

But in truth, dealing with the ban is no laughing matter. You see, as it turns out, the Supreme Court didn’t just outlaw “partial-birth” abortions (known in the medical community as “intact dilation and extraction,” or D & X) when it upheld Congress’s ban. It criminalized any second trimester abortion that begins with a live fetus and where “the fetal head or the fetal trunk past the navel is outside the body of the mother.”

The big problem with this, doctors say, is that, due to the unpredictability of how women’s bodies react to medical procedures, when you set out to do a legal second trimester abortion, something looking very much like a now-illegal abortion can occur. Once you dilate the cervix — something that must be done sufficiently in order to avoid tears, punctures and infection — a fetus can start to slip out. And if this happens, any witness — a family member, a nurse, anyone in the near vicinity with an ax to grind against a certain physician — can report that the ban has been breached, bringing on stiff fines, jail time and possible civil lawsuits.

Justice Anthony Kennedy, writing for the court’s majority, asserted that prosecution for accidental partial births won’t occur; there has to be “intent” for there to be a crime. But as doctors now understand it, intent could be inferred by the degree of dilation they induce in their patients. What, then, do they do? Dilate the cervix sufficiently and risk prosecution, or dilate less and risk the woman’s health? And if they dilate fully, how do they prove it wasn’t their intent to deliver an intact fetus?

This dilemma is the latest product of the awful algorithm that, in anti-choice rhetoric of the past few decades, has increasingly pitted the “interests” of the fetus against the health of the woman. It makes the true intent of the partial-birth abortion ban clear: the point is not (in the short term) to stop seemingly brutal fetal deaths, but rather to make all abortions as burdensome, as difficult and as emotionally and physically trying for women — and for doctors — as possible.

To escape having to choose between their patients’ interests and their own, physicians who perform abortions around the country now are taking steps to ensure that doctors won’t find themselves accidentally allowing a live fetus to be partially “born” in the course of a second trimester abortion. The Planned Parenthood Federation of America and other independent providers are now making it policy in abortions that could become legally risky for doctors to use digoxin — a cardiac drug — to kill the fetus up to one day in advance of the procedure. The upshot for women will be more time-consuming and costly abortion services, additional rounds of amniocentesis, more pain and more risk of infection.

And the outcome for the fetus won’t change.

Judith Warner is a contributing columnist for TimesSelect and a guest Op-Ed columnist. Bob Herbert is off today.

Most high-profile politicians acquire weird little bits of biography that you just cannot shake out of your mind. A reporter once told me that he sat next to a member of Congress on a trip, while said lawmaker kept eating mayonnaise out of those little packets they give you at fast-food restaurants. Even if this guy someday single-handedly resurrects the Equal Rights Amendment and shepherds it through 37 State Legislatures, when I look at him, a corner of my brain will always think condiments.

Then there is Mitt Romney, a candidate most of us don’t really know well yet. (A disconcerting number of well-informed people seem to believe his name is “Mort.”) Yet he could become the Republican presidential nominee. It behooves us to pay attention, to mull his Iran plan and deconstruct his position on health care.

But every time I see him, all I can think about is Seamus the dog.

Seamus, in case you missed the story, was the Romneys’ Irish setter back in the early 1980s. Mitt used to drive the family from Boston to Ontario every summer for a vacation, with the dog strapped to the roof in a crate.

As The Boston Globe reported this summer, Romney had the entire trip planned so rigidly that every gas station stop was predetermined before departure. During the fatal trip of ’83, Seamus apparently needed one more than the schedule allowed. When evidence of the setter’s incontinence came running down the back windshield, Romney abandoned his itinerary and drove to the closest gas station, where he got a hose and sprayed both dog and station wagon clean.

“It was a tiny preview of a trait he would grow famous for in business: emotion-free crisis management,” The Globe said.

Well, you could spin it that way. Imagine George W. Bush staring blankly at the windshield, the way he did during his My Pet Goat moment. But how many people out there are troubled by the idea that we might have a president who wouldn’t let his kids go to the bathroom unless it was time for a preauthorized rest stop?

Romney has already come under considerable fire from animal rights groups over the Seamus incident. “They’re not happy that my dog loves fresh air,” Romney snapped back. He said that just recently, in Pittsburgh, although Seamus had actually long since shuffled off this mortal coil.

Is it possible that Romney is trying to dodge the Republican YouTube debate because he’s afraid someone will ask him about his method of transporting dogs across long distances? Perhaps we could have one sponsored by the A.S.P.C.A. instead.

Most of the candidates from both parties have pets. In fact, so many of them have golden retrievers or Labradors you can’t help but wonder if they rent them. (John Edwards, ever the conspicuous consumer, has one of each.) This could be an excellent opportunity for John McCain to catch a break, since he seems to have the largest menagerie. Although counting each of the fish individually was a bit much.

McCain also has a ferret, which could provide ample opportunity for lively discussion with Rudy Giuliani, a well-known ferret-hater. Few of us who lived in New York City during his ferret-banning crusade can forget the day a ferret owner confronted the mayor on a radio call-in show. Giuliani, in tones of Dr. Phil on steroids, urged him to seek psychiatric care. (“This excessive concern with little weasels is a sickness.”)

Animal-lovers around the nation may also be interested to know that Giuliani’s second wife once asked for $1,140 a month in dog support for Goalie, the family retriever. Or that the third Mrs. Giuliani is a former saleswoman for surgical staplers — a profession that involves demonstrations of how well the product works during unnecessary surgery on dogs.

The Giuliani campaign has dodged the question of whether Judith Nathan Giuliani ever was involved in this kind of activity, which usually ends badly for the dog in question. This week a spokesman said he didn’t know, adding: “In the 1970s that was an acceptable medical technique,” which I think we can probably take for a yes.

Once we settle all these issues we can get back to health care. Although every time Mitt Romney walks on stage, a sodden Irish setter is going to flash before my eyes.

I guess it hasn’t dawned on her yet that she’s part of the problem… Maybe if she (and all the rest of her ilk) would stop writing things like this we could focus on the issues. Hmmmph. The folks at The Daily Howler have their beady little eyes focused on her. Now here’s Ms. Warner:

Have you followed the series of articles in The Times about Joshua Komisarjevsky, the Cheshire, Conn., 26-year-old who, on early parole for a long string of late-night home robberies, teamed up with an accomplice and broke into a nearby house, sexually assaulted a woman and at least one of her young daughters, beat the father with a baseball bat and left them all to die in a fire? (The father alone survived.)

Buried in a report on Tuesday was a sinister detail that piled on a broad insult to all the gruesome injuries, victimizing a whole new set of people who should have had no link whatsoever with Komisarjevsky’s crimes. It was that, while pleading for leniency for his client’s earlier break-ins, Komisarjevsky’s lawyer, William T. Gerace, had in 2002 told a judge that the young man had suffered as a child from attention deficit hyperactivity disorder and the learning disabilities dyslexia and dysgraphia.

A.D.H.D., dyslexia and dysgraphia — invoked as logical potential causes for home invasions and theft? I don’t know if you all find this as appalling, offensive and cruel as I do. Perhaps you shrug it off as the work of a defense lawyer doing his job. I just can’t do that, because I know that Gerace isn’t alone in supporting and promulgating the view that kids with problems like A.D.H.D. — and depression and perhaps soon, thanks to this case, learning disabilities — pose real dangers to society.

Call it the Columbine Syndrome. Ever since the news got out that school shooter Eric Harris was taking Luvox, an antidepressant, kids’ mental illness and eventual mass murder have been linked in the public mind. This past May, the journal Psychiatric Services published the results of the first large-scale nationally representative survey of public attitudes about children’s mental health. Eighty-one percent of respondents said they thought children with major depression would be dangerous to themselves or others; 33 percent said they believed children with A.D.H.D. were likely to be dangerous.

This despite the fact that scientific studies have shown only a modest relationship between mental health issues and violence, “a relationship that is largely attributable to co-occurring substance abuse,” wrote a team of authors led by Bernice A. Pescosolido, a sociologist at Indiana University. “Unfortunately,” they concluded, “public perceptions that mental illness and violence go hand in hand may be more important than the evidence.”

Another study released in March found about one in five parents saying they would not want children with A.D.H.D. or depression as their neighbors, in their child’s classroom or as their child’s friends.

It’s deeply ironic that at a time when more than ever is known about children’s mental health needs and more methods than ever exist to help kids with behavioral or emotional issues, the stigma attached to those problems won’t budge. Instead, our brave new world of diagnosis and treatment has spurred new kinds of myth-making and prejudice. Chief among them is the idea that a diagnosis of A.D.H.D. is an escape hatch for selfish and permissive modern parents who are too lazy to discipline their badly behaved kids and prefer instead to medicate them into compliance.

There are very serious consequences of trivializing conditions like A.D.H.D. There is real harm done by instrumentalizing disorders — whether it’s in the service of a legal defense, as in Komisarjevsky’s case, or more generally to buttress ideological arguments about the decline of the American family. The more the disorders are banalized or made ridiculous, the more parents and kids dealing with them are stigmatized. The net result of this stigma, according to numerous studies, is that families don’t seek the help they need. And children with A.D.H.D. need help — not because they’re at risk of becoming rapists and arsonists but because, untreated, they’re likely to be in for a lifetime of frustration and unhappiness.

Health officials at a local psychiatric hospital apparently tried once to put Komisarjevsky on antidepressants, but, according to The Times, his parents refused, saying their son needed to deal with his problems “on a spiritual level.” I don’t know whether Komisarjevsky’s behavior stems from sickness or from evil. But I do know there’s something sick, in general, about turning kids with difficulties into actors in the morality play about family life that’s forever being staged in our time.

Judith Warner is the author of “Perfect Madness” and a contributing columnist for TimesSelect. She is a guest Op-Ed columnist.
Bob Herbert is off today.

Bobo has a piece on Sen. Obama’s and Sen. Edwards’ differing plans on how to address poverty. Ms. Warner has a piece that tries to tell people that the television show “24” is, ahem, fiction. Here’s Bobo:

Suppose you were going to decide your vote for president entirely on the issue of who could best reduce poverty. Who would you vote for?

You’d start by focusing your attention on the candidates who have invested the most time in the issue, John Edwards and Barack Obama.

You’d find that both have a multilayered view of poverty. We used to have debates in which liberals emphasized the lack of jobs and conservatives emphasized personal behavior. But in the post-welfare-reform world, it’s pretty obvious that everything feeds into everything else. For Edwards and Obama, poverty flows from a lack of jobs and broken families, bad schools and bad role models, no training and no self-control.

For both candidates, you have to attack everything at once. You have to holistically change the environment that structures behavior. The question is how to do it.

Obama and Edwards agree on a lot, but in this matter they emphasize different things. As Alec MacGillis of The Washington Post observed, Edwards emphasizes programs that help people escape from concentrated poverty. Obama emphasizes programs that fix inner-city neighborhoods. One helps people find better environments, the other seeks to strengthen the environment they are already in.

Edwards would create a million housing vouchers for working families. These would, he argues, “enable people to vote with their feet to demand safe communities with good schools.” They’d help people move to where the jobs are and foster economic integration.

The problem with his approach is that past efforts at dispersal produced disappointing results. Families who were given the means to move from poor neighborhoods to middle-class areas did not see incomes rise. Girls in those families did a little better, but boys did worse. They quickly formed subcultures in the new communities that replicated patterns of the old ones. Male criminality rose, but test scores did not.

Obama, by contrast, builds his approach around the Harlem Children’s Zone, what he calls “an all-encompassing, all-hands-on-deck anti-poverty effort.” The zone takes an area in Harlem and saturates it with childcare, marriage counseling, charter schools and job counselors and everything else you can think of. Obama says he’ll start by replicating the program in 20 cities around the country.

The problem here is that there are few historical examples of neighborhoods being lifted up at once. There are 4,000 community development corporations around the country and they have not lifted residents out of poverty. The positive influences in the center get overwhelmed by the negative peer influences all around.

The organizations that do appear to work, like the Harlem Children’s Zone (there’s no firm data yet), tend to have charismatic leaders like Geoffrey Canada who are willing to fight teachers’ unions and take on bureaucracies. It’s not clear whether their success is replicable, let alone by the federal government.

What we have, then, is two divergent approaches, both of which have problems and low odds of producing tremendous success. If you find that discouraging, welcome to the world of poverty policy.

If I had to choose between the two, I guess I’d go with the Obama plan. I’d lean that way because Obama seems to have a more developed view of social capital. Edwards offers vouchers, job training and vows to create a million temporary public-sector jobs. Obama agrees, but takes fuller advantage of home visits, parental counseling, mentoring programs and other relationship-building efforts.

The Obama policy provides more face-to-face contact with people who can offer praise or disapproval. Rising out of poverty is difficult — even when there are jobs and good schools. It’s hard to focus on a distant degree or home purchase. But human beings have a strong desire for approval and can accomplish a lot with daily doses of praise and censure. Standards of behavior are contagious that way.

A neighborhood is a moral ecosystem, and Obama, the former community organizer, seems to have a better feel for that. It’s not only policies we’re looking for in selecting a leader, it’s a sense of how the world works. Obama’s plan isn’t a sure-fire cure for poverty, but it does reveal an awareness of the supple forces that can’t be measured and seen.

•

Last week I cited data on rising earnings among the working poor. I should have made it clear that the data referred to poor households with children, since poor households without children did not enjoy those gains.

Here’s Ms. Warner:

“I hope people will make the distinction between television and reality.”

Jack Bauer stood with his back to the sea, the variegated light of early evening playing upon the features of his careworn face. Pondering the future, he lifted a cigarette to his lips, its golden ember a searing reminder of his perpetual courtship of death …

Sorry, I got confused.

Let me start again.

Kiefer Sutherland was smoking a cigarette and fielding reporters’ queries at a Fox TV party on the Santa Monica pier last week, when the issue was raised of how, well, freaky it is that his show’s first female president will make her debut just in time for the Iowa caucuses.

There is a difference, he suggested, between “24” and real life. “But,” he went on, “I can tell you one thing. We had the first African-American president on television, and now Barack Obama is a serious candidate. That wasn’t going to happen eight years ago. Television is an incredibly powerful medium, and it can be the first step in showing people what is possible.”

I giggled a bit nastily over this at first. What was next — claims that fingering China as a one-nation axis of evil on “24” had presaged the country’s exposure this spring as the source of all perishables tainted and fatal? That screen first lady Martha Logan’s descent into mini-madness anticipated Laura Bush’s increasingly beleaguered late-term demeanor? (Has anyone but me noticed her astounding resemblance to Dolores Umbridge in “Harry Potter and the Order of the Phoenix”?) That foolish Vice President Noah Daniels’s narrowly averted war with the Russians had its real-life equivalent in recent Bush-Putin wrangling over Eastern European missile defense systems?

Silliness upon silliness. But still, something about this idea of “24” as a political crystal ball spoke to me. So, eager to get some advance notice on what we might one day see in a woman president (What to Expect if You’re Expecting Another Clinton), I went to the show’s Web site, looking for season seven clues.

I didn’t find any. Instead, I spent a marvelous afternoon browsing through “research” files on Joint Direct Attack Munition missiles, suitcase nukes, hyoscine-pentothal (a fictional drug), C-4 explosives, A.A. sponsors and Air Force Two (not technically a plane). I learned, to my surprise, that Jack Bauer has a bachelor’s in English literature, and that Audrey Raines — not surprisingly — is a product of Brown and Yale. It was like shopping in a mall without windows, gambling in a casino without clocks — a total, disorienting departure into a self-contained alternate reality.

Kiefer Sutherland and I may both be silly, but we’re not the only people guilty of blurring the boundaries when it comes to “24.” In recent weeks, a surprising number of journalists have seemed ready to play along with the conceit that the fictional creation of the show’s first female chief executive could actually have some bearing on the American political scene. The Hollywood Reporter, for one, proclaimed this change “could become a self-fulfilling prophecy.”

I don’t remember people holding their breath for major political developments every time a new season began on “The West Wing.” There’s something different, I think, about “24” that gives its cartoonishness a bizarrely compelling sense of reality.

The past six or so years — the years of the show’s existence — have given us a parade of imagery seemingly tailor-made for Bauer’s TV world. The crumbling of the World Trade Center, Saddam Hussein in a hole, stress-deranged U.S. soldiers-turned-prison-block-pornographers — the dividing line between what’s believable and what’s not, between fantasy and reality, has become utterly permeable.

What was once unimaginable, or imagined only for entertainment value in “Die Hard”-type thrillers, is now all too real. Anything is possible in a world of falling towers and Abu Ghraib. Kiefer Sutherland’s magical beliefs about his show’s potential impact on politics are forgivable. Even quaint.

The big difference, unfortunately, between real life and small-screen fiction is that, on “24,” Jack Bauer actually catches the bad guys and saves the world. Good guys are incorruptible; fatuous politicians are made to pay for their sins. There is redemption; there is comeuppance.

Oh, and torture works.

Judith Warner is the author of “Perfect Madness” and a contributing columnist for TimesSelect. She is a guest Op-Ed columnist.

Gail Collins is everywhere, every day, all the time, it seems. Today she writes about Gov. Spitzer of New York, with a small swipe of the claws at Sen. Clinton. Judith Warner addresses the importance of “The Cleavage Conundrum.” Here’s Ms. Collins:

Eliot Spitzer accidentally dialed my number the other day, and when he identified himself I automatically said, “How are you doing?”

Really not the right greeting.

“Oh … great,” the governor said, in a tone that conveyed both deep sarcasm and near-bewilderment.

He cannot believe what has happened to him. Our brand-new, hot-shot governor, armed with his mighty mandate, laid low. The star of the Democratic Party of New York, whose sway is currently only slightly less sweeping than the Democratic Party of Turkmenistan, has been totally rolled by the 78-year-old leader of the State Senate Republicans.

Spitzer went to war with Joe Bruno, the Senate majority leader, over his reform agenda — the one that people elected him with nearly 70 percent of the vote to accomplish. He clearly got carried away with the governor-as-warrior metaphor. The Republicans claim that he called Bruno “senile” in a particularly nasty way. And although Spitzer denied it, the senator and his supporters reacted with such high-pitched wailing that voters must have had the impression that their governor was committing elder abuse.

Worse, members of the governor’s staff decided to collect evidence that Bruno was using the state helicopter and planes for trips that were not really about state business. It was a fool’s errand. Given the vagueness of the limits on this kind of perk, for Bruno to break the law, he’d have to buy a delicatessen in Times Square and fly down from Albany every day at 11 a.m. to handle the lunchtime rush.

The Republicans quickly turned the tables, claiming the Spitzer administration was using the State Police to spy on their leader. Before you knew it, the governor and his aides were on their way to testify before the State Ethics Commission. Meanwhile, the formerly dispirited Republicans were mouthing the dreaded question, “What did the governor know and when did he know it?”

Spitzer had promised to “bring passion back to Albany.” This was not what we thought he had in mind.

Now in a sane world, Joseph Bruno would not be taking taxpayer-funded helicopters to begin with. He’s a state senator for heaven’s sake. How much critical business can he have in Manhattan?

Bruno’s spokesman said the senator needs to get back and forth to the city to consult with Mayor Michael Bloomberg and with downstate businesses that find the Republicans “much more sympathetic to the needs of the business community.”

Can we have a show of hands? How many people think that the businesses would be willing to make a trip upstate for an opportunity to pour their wishes and hopes and dreams into a sympathetic Republican ear? And if Michael Bloomberg needs to see Joe Bruno, he can afford to hire his own helicopter. He can hire his own litter and have people carry him to Albany if he feels like it.

Bruno now sees himself as being on a one-man mission to protect traditional Albany from Eliot the Hun. “He’s an official with statewide responsibilities. Especially now in light of what’s come out,” said the spokesman, who referred to the helicopter investigation as “an attempt to annihilate us, to wipe us off the face of the earth, to kill him, do whatever it takes. … ”

This is all very sad.

Spitzer is probably going to recover, perhaps as a chastened and more pleasant person. But he has lost the moment. When a new chief executive arrives, legislators are usually unsure of themselves for a while, and this is the precious soft spot when they can be pushed into doing big, bold things. If you screw it up, they’ll instantly revert to their preference for doing small, expensive things instead. (One of Hillary Clinton’s great pluses as a presidential candidate is that having been part of the great screwing up of the beginning of her husband’s administration in 1993, she may have figured out how not to do it again.)

The centerpiece of Spitzer’s first legislative session was supposed to be a campaign finance reform bill. Now, the price tag for passing it is escalating by the moment. Legislative pay raises! More construction projects!

It is a great tradition in Albany that no important bill ever emerges by itself. It gets mixed with pork and pet projects and lobbyists’ to-do lists until Bruno, Democratic Assembly Leader Sheldon Silver and the current governor sit down to create one huge hairball of a deal.

Spitzer was supposed to change that. But now here we are. Last week Senator Bruno was striding around the state like Rocky Balboa, while the governor was telling Danny Hakim and Nicholas Confessore of The Times that his wife had started asking, “What was wrong with going into the family business?” (High-end real estate.)

The hairball is back.

It turns out I’m not the only one who thought her Thursday offering left something to be desired — http://www.dailyhowler.com/dh072707.shtml. I’m looking forward to their future coverage of her stuff. Now here’s Ms. Warner:

The Washington Post’s penetrating exposé of Hillary Clinton’s “surreptitious” show of cleavage on the Senate floor last week (“To display cleavage in a setting that does not involve cocktails and hors d’oeuvres is a provocation”) sent me trawling on the Internet, digging through sites like eBay and Hijabs-R-Us, desperate to buy a burqa.

I’d come upon the article on a very bad day, one in which I’d made the fatal error of wearing a sundress that had shrunk at the dry cleaners. Zipping up the top required a fair amount of exhaling and spousal assistance and a certain compression of body parts. All of which meant that, when I dropped my eyes down from the computer screen where I was reading the piece and turned them in the direction of my ever-contemplatable navel, I was confronted by an unmistakable bit of, well, “provocative” décolleté.

It wasn’t — I ran to check in the mirror — discernible from head-on or from the side. In fact, you pretty much had to be looking straight down to see it. Still, I didn’t want to take any chances. I did not want to run the risk, as Clinton had, according to The Post’s Robin Givhan, of giving passers-by the impression that they were “catching a man with his fly unzipped.” (“Just look away!”)

I have a hard enough time making friends around the office.

And so, I spent the rest of the 90-degree day buttoned up in a warm jacket. Grumbling and muttering all the way.

You see, I’d always thought that, when you reached a certain age or a certain stage in life, you sort of bought your way out of the sexual rat race. You could be a “person of cleavage,” to borrow a Pulitzer-worthy phrase from Ruth Marcus, a Post columnist, but you could nonetheless make it through your day without having to give the matter much thought.

After all, isn’t every woman past a certain age, at a certain weight and after a certain amount of breast-feeding, a “person of cleavage”? And aren’t you allowed, at a certain time of life, to escape from the world of at least my youth, where you couldn’t walk down the street licking an ice cream cone without inviting a stream of leering commentary?

I always thought that middle age afforded some kind of protection from prying eyes and personal remarks. I thought this was the silver lining to growing up and growing older. Clearly, I was wrong.

Funny that it took another woman to drive that point home to me. Funny, too, that when I looked closely at the photo that accompanied Givhan’s article, I couldn’t see anything vaguely resembling cleavage. I guess you had to have just the right angle.

Givhan is a fashion writer, which means she spends a great deal of time in the company of professional anorexics, sunken-chested young women whose attempts at cleavage are jury-rigged for the cameras. These are women whose images are tightly controlled, for whom every millimeter of flesh shown or unshown is a matter of careful planning and styling and aesthetic hand-wringing.

Normal women are different. Normal women — real women, dare I say? — women who have other, more important things on their minds than their looks, women who have other people on their minds than themselves, the kinds of women, in other words, whom you’d probably want to have running the country, aren’t likely to be so “perfect” in appearance. Their flesh (minus the knife) will bulge or sag; their clothes will pull and shift, showing this or that lunch stain, this or that wrinkle, this or that unbidden bit of skin. It’s a mark of how stunted we are as a society that, no matter what their age, accomplishments or stature, we still expect these women to maintain a level of image control worthy of a professional beauty.

The difference between a real female commander in chief and a woman who plays one on TV will one day prove to be that the former is not always shot at a flattering angle. She will have bad shirt days and the occasional run in a stocking.

At least, I hope she’ll be the kind of woman who would permit that to happen. I hope, at the very least, that predatory eyes won’t force her to spend a chunk of precious work time every day being packaged into an impenetrable, invulnerable suit of professionally styled armor.

But I wouldn’t count on it.

Judith Warner is the author of “Perfect Madness” and a contributing columnist for TimesSelect. She is a guest Op-Ed columnist.
Bob Herbert is off today.

Bobo talks about the “neopopulist story line” about rising income inequality, and calls it “simple minded.” Well, if anyone knows simple-minded it’s Bobo. Judith Warner writes about setting priorities for American work/family policy. Here’s Bobo:

If you’ve paid attention to the presidential campaign, you’ve heard the neopopulist story line. C.E.O.’s are seeing their incomes skyrocket while the middle class gets squeezed. The tides of globalization work against average Americans while most of the benefits go to the top 1 percent.

This story is not entirely wrong, but it is incredibly simple-minded. To believe it, you have to suppress a whole string of complicating facts.

The first complicating fact is that after a lag, average wages are rising sharply. Real average wages rose by 2 percent in 2006, the second fastest rise in 30 years.

The second complicating fact is that according to the Congressional Budget Office, earnings for the poorest fifth of Americans are also on the increase. As Ron Haskins of the Brookings Institution noted recently in The Washington Post, between 1991 and 2005, “the bottom fifth increased its earnings by 80 percent, compared with around 50 percent for the highest-income group and around 20 percent for each of the other three groups.”

The third complicating fact is that despite years of scare stories, income volatility is probably not trending upward. A study by the C.B.O. has found that incomes are no more unstable now than they were in the 1980s and 1990s.

The fourth complicating fact is that recent rises in inequality have less to do with the grinding unfairness of globalization than with the reality that the market increasingly rewards education and hard work.

A few years ago, the rewards for people earning college degrees seemed to flatten out. But more recent data from the Bureau of Labor Statistics suggests that the education premium is again on the rise.

Fifth, companies are getting more efficient at singling out and rewarding productive workers. A study by the economists Thomas Lemieux, Daniel Parent and W. Bentley MacLeod suggests that as much as 24 percent of the increase in male wage inequality is due to performance pay.

Sixth, inequality is also rising in part because people up the income scale work longer hours. In 1965, less educated Americans and more educated Americans worked the same number of hours a week. But today, many highly educated people work like dogs while those down the income scale have seen their leisure time increase by a phenomenal 14 hours a week.

Seventh, it’s not at all clear that the big winners in this economy are self-dealing corporate greedheads who are bilking shareholders. A study by Steven N. Kaplan and Joshua Rauh finds that it’s not corporate honchos who are filling up the ranks of the filthy rich. It’s hedge fund managers. Or, as Kaplan and Rauh put it, “the top 25 hedge fund managers combined appear to have earned more than all 500 S.&P. 500 C.E.O.’s combined.” The hedge fund guys are profiting not because there’s been a shift in social norms favoring the megarich. It’s just that a few superstars are now handling so much capital.

Eighth, to the extent that C.E.O. pay packets have thickened (and they have), there may be good economic reasons. The bigger a company gets, the more a talented C.E.O. can do to increase earnings. Over the past two and a half decades, the value of top U.S. companies has increased 500 percent, according to Xavier Gabaix and Augustin Landier. The compensation for the C.E.O.’s of those companies has also increased 500 percent.

Ninth, we’re in the middle of one of the greatest economic eras ever. Global poverty has declined at astounding rates. Globalization boosts each American household’s income by about $10,000 a year. The U.S. economy, despite all the bad-mouthing, is chugging along. Thanks to all the growth, tax revenues are at 18.8 percent of G.D.P., higher than the historical average. The deficit is down to about 1.5 percent of G.D.P., below the historical average.

All of this is not to say everything is hunky-dory. Inequality is obviously increasing. There’s evidence that global trade is producing more losers.

Instead, the main point is that the Democratic campaign rhetoric is taking on a life of its own, and drifting further away from reality. Feeding off pessimism about the war and anger at Washington, candidates now compete to tell dark, angry and conspiratorial stories about the economy.

I doubt there’s much Republicans can do to salvage their fortunes by 2008. But over the long term a G.O.P. rebound can be built by capturing the Bill Clinton/Democratic Leadership Council ground that the Democrats are now abandoning. Whoever gets globalization right will have a bright future, and in the long run, the facts matter.

Here’s Ms. Warner:

The news from the Pew Research Center this month — that 60 percent of working mothers say they’d prefer to work part time — was barely out before it was sucked up into the fetid air of the mommy wars, with all the usual talk of “opting out” and guilting out, and the usual suspects lining up to slug it out on morning talk TV.

But the conversation we should be having these days really isn’t one about What Mothers Want. (This has been known for years; surveys dating back to the early 1990s have shown that up to 80 percent of mothers — working and at-home alike — consistently say they wish they could work part time.) The interesting question is, rather, why they’re not getting it.

Only 24 percent of working mothers now work part time. The reason so few do isn’t complicated: most women can’t afford to. Part-time work doesn’t pay.

Women on a reduced schedule earn almost 18 percent less than their full-time female peers with equivalent jobs and education levels, according to research by Janet Gornick, a professor of sociology and political science at City University of New York, and the labor economist Elena Bardasi. Part-time jobs rarely come with benefits. They tend to be clustered in low-paying fields like the retail and service industries. And in better-paid professions, a reduced work schedule very often can mean cutting down from 50-plus hours a week to 40-odd — hardly a “privilege” worth paying for with a big pay cut.

It doesn’t have to be this way. In Europe, significant steps have been taken to make part-time work a livable reality for those who seek it. Denying fair pay and benefits to part-time workers is now illegal. Parents in Sweden have the right to work a six-hour day at prorated pay until their children turn 8 years old. Similar legislation helps working parents in France, Austria, and Belgium and any employee in Germany and the Netherlands who wants to cut back.

Even Britain has a (comparatively tame) pro-family law that guarantees parents and other caregivers the right to request a flexible schedule from their employers. European employers have the right to refuse workers’ requests, but research shows that very few actually do. And workers have the right to appeal the denials.

None of this creates a perfect world. Feminists have long been leery of part-time work policies, which tend to be disproportionately used by women, mommy-tracking them and placing them at an economic disadvantage within their marriages and in society. The American model of work-it-out-for-yourself employment is Darwinian, but women’s long working hours have gone a long way toward helping them advance up the career ladder.

“We know that family-friendly policies encourage work force participation,” says Professor Gornick, who has extensively studied family policy on both sides of the Atlantic. “But do they lower the glass ceiling or make it thicker? That’s the million-euro question.”

I think that when it comes to setting priorities for (currently nonexistent) American work-family policy, we ought to go for the greatest good for the greatest number.

The place to start, ideally, would be universal health care, which is really the necessary condition for making freedom of choice a reality for working parents. European-style regulations outlawing wage and benefit discrimination against part-time workers would be nice, too, though it’s not a terribly realistic goal for the U.S., where even unpaid family leave is still a hot-button issue for employers.

A British-style “soft touch” law could, however, be within the realm of the possible. Senator Edward Kennedy and Representative Carolyn Maloney are circulating draft legislation modeled on the British workplace flexibility law that would give employees — all workers, not just moms or parents — the right to request a flexible schedule. The legislation — which would require employers to discuss flexibility with workers who request it, but wouldn’t require them to honor the requests — has a little bit of something for everyone: protection from retaliation for workers who fear letting on that they’re eager to cut back, protection from “unfunded mandates” for businesses.

Critics might say the proposed legislation’s touch is so soft as to be almost imperceptible, but it’s a start. At the very least, it’s a chance to stop emoting about maternal love and war and guilt and have a productive conversation.
