There seems to be an unwritten rule of blogging that you need to post about something as soon as it emerges as the issue of the day. It's looked down on if you blog about it a week later, or you'll at least need to excuse it with an apology ("I'm sorry I'm late to this, but I just noticed it, and..."). Blogging about a story that's a month old is unthinkable.

[This video] aired a couple of weeks ago, but it's a pretty good look at the drinking age debate, with lots of camera time for Amethyst Initiative founder John McCardell. [emphasis added]

"But" it's a pretty good look at the drinking age debate? Why not "and"? Did the debate over the legal drinking age really change at all in the few weeks since the 60 minutes segment aired?

I know it's petty to pick on this one person's word choice, but it's symptomatic of a general problem in blogging. Sure, there'll be an issue here and there that's so timely it only really makes sense to blog it right away. But don't most things worth talking about stay interesting for a while?

If something that's a couple weeks "old" is too out of date for you, you certainly wouldn't have much interest in, say, reading a book.

So I plan to keep breaking this unwritten rule of the blogosphere; that is, I'll blog something that doesn't cross the "new" threshold as long as it crosses the "interesting" threshold. Even a subject that's centuries old isn't off limits; a fortiori, it's OK to blog something from last year.

2. "People get a hit of energy when they are negative about something."

Monday, March 30, 2009

John McWhorter gave an extensive answer to that question in a recent interview with Bill Moyers on PBS.

When I originally posted this, I was using just a YouTube clip that McWhorter had posted to his own blog. I typed up my own transcription because I didn't know there was any existing transcript. Since then, the YouTube clip has been taken down, but the video and transcript are on PBS's website.

Basically, I can't imagine the playing field ever being completely level. I don't know how you can create that.

And this is the crucial thing: I think that descendants of African slaves in the United States are the only group in human history who have insisted that we can only achieve under perfect or near-perfect conditions.

Now, for me to have said that in 1950, now that would have been pushing it. But in terms of the way it is now? Life is never fair.

And I want to stress once again: I know there is racism in the United States. Tulia, Texas . . . anyone can Google that.

But the fact of the matter is that when you look at the problems that the black community has, tracing them to racism has gotten so abstract that, really, I don't see, and maybe I'm missing something, but I don't see how we could convince a significant proportion of the American population that racism was the main issue anymore.

And more to the point, if you could take away racism right now . . . Racism's gone! The problems we are talking about would still be there. So I think that racism is important, but not as important in 2009 as helping people who need help. . . .

Sometimes I think we have to learn not only from history 100 years ago, but from history a few news cycles ago. The common wisdom . . . not long ago was that Barack Obama couldn't win. That there was still enough racism "out there" that these people in diners talking about how they would never vote for a black man were going to tip the election. And you know, they didn't, and they didn't even come close. We have to learn from that in thinking about what's "out there," as it's often put.

And what's most important? Not what's "out there." We all know what's "out there." You can find it. Look for it, and you'll find it. It might be next door to you. . . . Does it matter in 2009? And I'm suggesting that these days, it's not one of our larger problems. . . .

Were we ever thinking that there was going to be an America where there was nothing that you could call racism? Because, we are homo sapiens, and we're wired in certain ways. The idea that we could never have any biases, that we would never process people according to group, that there would never be some people who are more troglodytic on this thing than others -- I don't think that that corresponds to any kind of reality. We have made amazing strides. But the idea that we can ever have none? I don't know, we'd have to be a different species. . . .

The black president was elected. That settles it. Many people don't like that.

MOYERS: With a lot of white votes, by the way.

McWHORTER: With a lot of white votes. And so many people don't like that that really settles the position in the middle. But that's where the conversation is always going to be.

If you play this performance back to back with the album version, you'll hear how she finesses each little piece of melody in the live version as if she were just creating the tune on the spot. She's never on autopilot.

But you can make the problem of evil particularly acute by focusing on animal suffering. Let's assume there's no human being around to observe the animal, so that we rule out even a theoretical possibility that a human might learn some sort of lesson. For example, before humans even existed, there were animals experiencing pain. Can you reconcile this fact with the existence of a benevolent god?

A couple of logical but unappealing possibilities spring to mind. One is: "Animals simply don't have any awareness, feelings, etc., so their suffering isn't bad -- or, rather, it doesn't even make sense to talk about them suffering, just as it doesn't make sense to talk about a rock suffering."

At the other extreme: "No, animals are conscious, they have free will, and they have souls. So, if God and evil are compatible on the theory that God gave us free will (so that we could be virtuous), then the same thing applies to animals."

I don't think most people find either of these extremes plausible. Most people seem to think animals are at least minimally conscious in that they can feel pain (for instance), but aren't as robustly conscious as humans -- they don't have free will or souls. (Of course, many philosophers prefer not to talk about anyone having free will or souls, but I'm trying to approach this in Christian-ish terms because of the problem of evil's salience within Christianity.)

OK, let's put all that to the side for now. Let's assume for the sake of argument that Leibniz's best-of-all-possible-worlds theory is correct -- that is, suffering is justified in the long run by the existence of free will, because free will is a precondition for virtue, and freedom entails the freedom to cause harm. Let's also assume (since I think most people agree) that animals don't operate at such a sophisticated level: unlike humans, they aren't capable of attaining virtue by exercising free will.

Doesn't it follow that animal suffering is a greater evil than human suffering?

In a typical debate over the moral status of animals, someone on the pro-animal side will make the point: "Animals, like humans, can feel pain. That gives them moral status -- even if they don't have human intelligence, humans still have a responsibility to avoid cruelty to animals when possible."

The response is then going to be: "Even if you're right that both humans and animals can feel those initial stabs of pain, that overlooks a crucial distinction. Only humans can intellectually reflect on the experience over time. We have this profound experience that animals don't have."

Those who make this latter point often seem to assume it's an argument for caring more about human beings. On the contrary, though, our ability to reflect and "build character" seems to mitigate our suffering. Meanwhile, animals are left merely having suffered, without gaining anything from the experience.

I'm self-plagiarizing this post from one of my earliest posts, from April 21, 2008, which went up before this blog even went public. I started that post by admiring Bertrand Russell's concise refutations of influential philosophical arguments in his book History of Western Philosophy.

I cannot accept this; I think that particular events are what they are, and do not become different by absorption into a whole. Each act of cruelty is eternally a part of the universe; nothing that happens later can make that act good rather than bad, or can confer perfection on the whole of which it is a part.

Now here's his refutation of Leibniz's argument that there's a benevolent God who made this "the best of all possible worlds." Leibniz said the best possible world would contain free will, so God created a world with free will, which explains why bad things happen: they're human acts of free will. There are many obvious problems with this argument -- for instance, there's a lot of bad stuff in the world that's not caused by human action. But Russell's refutation is particularly clever:

A Manichaean might retort that this is the worst of all possible worlds, in which the good things that exist serve only to heighten the evils. The world, he might say, was created by a wicked demiurge [i.e. a demon], who allowed free will, which is good, in order to make sure of sin, which is bad, and of which the evil outweighs the good of free will. The demiurge, he might continue, created some virtuous men, in order that they might be punished by the wicked; for the punishment of the virtuous is so great an evil that it makes the world worse than if no good men existed.

It's a commonplace to ridicule Leibniz's view that God has ensured that we live in "the best of all possible worlds." I mean, Voltaire made fun of it in his novel Candide, so it must be wrong. I'm guessing that people will balk at the "best of all possible worlds" idea when phrased like that, but if you phrase it more gently -- "Things work out for the best" -- it's still hugely influential.

I certainly agree with Russell's response to Spinoza: cruel acts aren't transformed into good by being absorbed into the whole universe. This might be why I'm so indifferent to religion.

Unlike many secularists, though, I don't believe that cruelty and suffering are "just there" and don't have any larger meaning in the grand scheme of things. I don't have any more interest in an "It's all meaningless" view than in an "It's all for the best" view. What I think is that even if things that happen in the world do have some kind of ultimate meaning, the suffering is still there, and you can't rationalize it away.

This explains the overwhelming instinct, cutting across political lines, that torture is just wrong, period. Even those who argue for exceptions to society's general "don't torture people" rule tend to rely on scenarios where the suffering caused by torture is far outweighed by preventing others from suffering -- the classic "ticking bomb," etc. This still implies that suffering itself is the basic unit that we're looking at in making moral assessments. So people are quibbling over a very narrow exception -- maybe an important exception, but not one that calls into question the fundamental "torture is bad" consensus.

And so, no one takes the position: "Hey, go ahead and torture as much as you like! It's sure to be a net plus in the end -- it'll be a learning experience, or it will be a ringing affirmation of our own free will, or something." Well ... no one applies this to human beings. But it's regularly applied to God. Bizarrely, God is held to lower moral standards than humans are.

Tuesday, March 24, 2009

- President Obama says we need to take on "drug cartels that have gotten completely out of hand." As opposed to the reasonable drug cartels?

- He seemed to get an extra (uncharacteristic?) burst of enthusiasm when he said: "That whole philosophy of persistence is one that I'm going to be emphasizing in the years to come when I'm in office." (Rough quote from live broadcast.)

"This is a big ocean liner. It's not a speed boat. It doesn't turn around immediately."

UPDATE: Mickey Kaus points out that one of the reporters who asked Obama a question stated a blatant falsehood: "1 in 50 children are now homeless in America." As Kaus says, "This is one of those statistical assertions that you know is BS before you even set out to show it's BS." But click here if you're interested in the details.

As a writer, I enjoy listening to people speak and, when they’re in the middle of a particularly interesting sentence, I try to imagine how I’d like to see it finished.

Usually I am disappointed. But with some select people, the payoff is far greater than I could have imagined. They have something to say that’s remarkably insightful or unexpected or even just articulate in a way that takes your breath away.

I could do without the overstated "takes your breath away" rhetoric, but aside from that, it seems like a useful test, though there's an obvious risk of becoming overly judgmental.

Dubner gives 3 examples: President Obama, classical pianist Glenn Gould, and some sports guy who wrote a book about some other sports guy. Click the above link if you're interested in the specific explanations.

Dubner adds:

In each case, the subject spoke with what I can only characterize as total intelligence — a lot of mental horsepower, to be sure, but also nuance, precision, conceptual and practical elements combined in the same sentence, and psychological astuteness.

I guess, therefore, that if I were asked to define what it means to be “smart” in this day and age, those are the characteristics I’d list. I know a lot of super-brainy people who don’t express themselves well; I know a lot of psychologically astute people who haven’t a whiff of organization or precision about them; I know a lot of articulate people who can’t see the big picture. But if I were friends with either Obama, [that sports guy], or Gould, I’d have to say that they were the smartest people I know. (Sadly, I’m not.)

I don't understand why he focuses on "President Obama's first press conference" (the prime-time address in early February where he pitched the economic stimulus plan). I'm fine with Dubner using Obama as an example of a smart person. But as for that specific press conference, I agree with this review — it was unusually long-winded and meandering. ("Obama seemed like he was channeling a particularly loquacious combination of Joe Biden, Bill Clinton, and the ghost of Hubert Humphrey. . . . Obama radiated the sense of a leader who has digested too many economic briefings and memorized too many talking points in preparation for his primetime rendezvous with the public.")

It's also weird that he starts out equating intelligence with finishing your sentence with unexpected insight, but he never gives any example of a sentence by any of those 3 people (or anyone else) that ended unexpectedly. So I was disappointed with how he finished his blog post.

As for Dubner's analysis of what makes someone intelligent — "nuance," "precision," "conceptual"/"practical," and "psychologically astute" — I'm not sure. Someone who had all those qualities in their conversation would probably be interesting to listen to. But do they add up to "intelligence"? Maybe they add up to "how to sound intelligent."

whether a person, if asked to explain himself, is capable of doing so in different, clearer terms than he used the first time.

Unfortunately, it's all too common to see people trying to project intelligence through precisely the opposite approach: persistently explaining things in obfuscatory terms. I keep coming back to this passage from an essay by John Kenneth Galbraith (previously blogged):

Complexity and obscurity have professional value—they are the academic equivalents of apprenticeship rules in the building trades. They exclude the outsiders, keep down the competition, preserve the image of a privileged or priestly class. The man who makes things clear is a scab. He is criticized less for his clarity than for his treachery.

Monday, March 23, 2009

[H]alf of the world's almost 7,000 remaining languages may disappear by 2100, experts say.

A language is considered extinct when the last person who learned it as his or her primary tongue dies. Last month, the United Nations Educational, Scientific and Cultural Organization (UNESCO) launched an online atlas of endangered languages, labeling more than 2,400 at risk of extinction.

Languages typically die when speakers of a small language group come in contact with a more dominant population. That happened first when hunter-gatherers transitioned to agriculture, then during periods of European colonial expansion, and more recently with global migration and urbanization. The spread of English, Spanish and Russian wiped out many small languages. ...

Extinct languages can be revived, especially when they have been recorded.

"But when you skip a generation, it's hard to pick a language back up again," said Douglas Whalen, president of the Endangered Language Fund, which gives grants to language-preservation projects. "You need a community that is really committed and will bring children up from birth in the second language, even if they themselves are not the most fluent speakers." ...

The Living Tongues Institute recruits youth who are not fluent in their traditional tongue to become "language activists," using digital equipment to document their elders' voices and learn the language themselves. This creates a record and builds pride in the language.

But wait a second — what's the point of all this? What exactly is so bad about thousands of languages becoming extinct?

The most concrete argument for preserving languages seems to be that you can preserve esoteric knowledge that's exclusive to these languages. This may be a general familiarity with the culture that speaks the language, or it may be more specific — the WaPo article talks (somewhat credulously) about secret medicinal remedies passed down from generation to generation.

But if that's really our concern, does it apply only to the past? What about the future? If more people speak a larger number of obscure languages, doesn't that make it more likely that useful information will remain confined to tiny subcultures, depriving the rest of the world of the benefits? So doesn't this concern actually cut against putting moribund languages on life support?

The language revivalists yearn for — surprise — diversity. What they miss is that language death is a healthy outcome of diversity.

If people truly come together, then they speak a common language. We can muse upon a "salad bowl" ideal in which people go home and use their nice "diverse" language with "their own." But in reality, almost always the survival of that "diverse" language means that the people are segregated in some way, which in turn is almost always due to an unequal power relationship — i.e., precisely what "diversity" fans otherwise consider such a scourge.

Jews in shtetls, for example, spoke Yiddish at home and Russian elsewhere because they lived under an apartheid system, not because they delighted in being bilingual. The Amish still speak German only because they live in isolation from modern life, which few of us would consider an ideal for indigenous groups to strive for.

In the end, the proliferation of languages is an accident: a single original language morphed into 6,000 when different groups of people emerged. I hope that dying languages can be recorded and described. I hope that many persist as hobbies, taught in schools and given space in the press, as Irish, Welsh, and Hawaiian have.

However, the prospect we are taught to dread — that one day all the world's people will speak one language — is one I would welcome. Surely easier communication, while no cure-all, would be a good thing worldwide. There's a reason the Tower of Babel story is one of havoc rather than creation.

If people confined themselves to their "traditional" languages (and what does that mean? proto-Indo-European?) I'd be wearing today a long black coat, a felt hat, long curly earlocks, and a fringed garment, and speaking English haltingly, with a heavy accent, as a second language. I wouldn't like that. John wouldn't exist, and I wouldn't like that either.

Saturday, March 21, 2009

Liberal blogger men are thrilled with the New York Times's appointment of 29-year-old Atlantic blogger Ross Douthat to replace William Kristol on the op-ed page.

Why "liberal blogger men"? Because Pollitt thinks that these "men" wouldn't be so "thrilled" about Douthat if they were women. Why? Because Douthat says things like this (quoted by Pollitt — I couldn't find the original source):

[I]t . . . makes adaptive sense for women to have a certain amount of difficulty having orgasms, because then they're more likely to seek out a long-term monogamous partner who knows their body well, which in turn dovetails nicely with the general female interest in having only one partner, the better to keep that partner around when the children come along.

If you Google this quote, you'll see it's been frequently ridiculed on the internet. But I haven't seen anyone, including Katha Pollitt (who sarcastically calls it "thoughtful commentary"), actually explain what's wrong with it.

That passage is an attempt to give an evolutionary explanation for a human trait. I don't know if Douthat is right or wrong, but you don't show that he's wrong unless you give some kind of specific argument beyond sarcastic sneering.

But are the liberal blogger men somehow slanted against women in praising Douthat? I'd say no. A few points:

1. By observing a fact about the world and then providing an evolutionary explanation of it, you're not approving of that state of affairs.

For instance, I can observe that a lot of men commit rape, and I can give an obvious evolutionary explanation for it: men, unlike women, have the physical potential to reproduce as many times as they can have sex. Since women tend to resist consenting to sex, men will have an easier time reproducing if they're willing to violate women's lack of consent. But does giving this explanation mean that I in any way approve of rape? Of course not.

[W]e're all puppets, and our best hope for even partial liberation is to try to decipher the logic of the puppeteer. . . . Just because natural selection created us doesn't mean we have to slavishly follow its peculiar agenda. (If anything, we might be tempted to spite it for all the ridiculous baggage it's saddled us with.)

As we try to make sense of the brave new world that VHS and streaming video have built, we might start by asking a radical question: Is pornography use a form of adultery? . . .

[A]dultery is inevitable, but it’s never been universal in the way that pornography has the potential to become—at least if we approach the use of hard-core porn as a normal outlet from the rigors of monogamy, and invest ourselves in a cultural paradigm that understands this as something all men do and all women need to live with. In the name of providing a low-risk alternative for males who would otherwise be tempted by “real” prostitutes and “real” affairs, we’re ultimately universalizing, in a milder but not all that much milder form, the sort of degradation and betrayal that only a minority of men have traditionally been involved in.

Now, I'm a liberal blogger man, as Pollitt might put it. And I disagree with Douthat's take on pornography. Yet, reading the pornography article makes me glad the NYT chose him as their new columnist. It's possible to learn from people you disagree with, or at least admire their willingness to take unpopular positions.

3. As Matthew Yglesias (one of the liberal blogger men Pollitt criticizes) points out, he can endorse Douthat as the columnist without endorsing everything Douthat has written:

I think that conservatives such as Ross Douthat are regularly wrong about a wide variety of important topics. Thus, instances of them being wrong can be easily produced. . . . That said, . . . I don’t think it makes one a traitor to progressive politics . . . to think it’s a good thing when conservatives-who-offer-more replace conservatives-who-offer-less.

Pollitt openly disagrees with Yglesias on that point: she wishes Bill Kristol had stayed at the NYT because he's "a dull, complacent apparatchik who set forth the Bush line in all its fact-free glory." The problem is, it's hard to have a useful debate when one side doesn't make the best possible case for itself. If you're a liberal and you really believe in the merit of your liberal views, you should want them to be pitted against the most thoughtful conservatives, even if (especially if) they sometimes take positions that make you uncomfortable.

Friday, March 20, 2009

Facebook friend status update, referring to the old punk band Operation Ivy:

_______ is on an Op Ivy bender.

That makes me want to listen to some "Knowledge" -- an Operation Ivy song covered here by Green Day (whose first live show ever using the name Green Day was also Operation Ivy's last show, according to Wikipedia):

For this fear of death is indeed the pretence of wisdom, and not real wisdom, being the appearance of knowing the unknown; since no one knows whether death, which they in their fear apprehend to be the greatest evil, may not be the greatest good. Is there not here conceit of knowledge, which is a disgraceful sort of ignorance?

And this is the point in which, as I think, I am superior to men in general, and in which I might perhaps fancy myself wiser than other men -- that whereas I know but little of the world below, I do not suppose that I know.

Back to the modern day ... don't forget about Operation Ivy's bassist, Matt Freeman, who went on to be the bassist for Rancid. The greatest punk bassist in the world (parental advisory: explicit lyrics):

Thursday, March 19, 2009

And I'm not using "99.9%" loosely to mean "the vast majority." That's the actual figure: the total of the bonuses paid to AIG employees is about one one-thousandth of the AIG bailout. (The bonuses are just under $200 million; the bailout is just under $200 billion.)
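The arithmetic behind that "99.9%" is simple enough to check. A quick sketch, using the round figures cited above (the exact totals were reported slightly differently in places):

```python
# Sanity check: what fraction of the AIG bailout went to bonuses?
# Figures are the round numbers cited in the post, not exact totals.
bonuses = 200e6   # just under $200 million in bonuses
bailout = 200e9   # just under $200 billion in bailout funds

ratio = bonuses / bailout
print(f"{ratio:.1%} of the bailout went to bonuses")  # 0.1%
print(f"{1 - ratio:.1%} went elsewhere")              # 99.9%
```

So "99.9%" is the literal share of the bailout that had nothing to do with the bonuses.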

I'm starting to find the obsessive "what did Geithner/Obama know and when did he know it" line of questioning a little tedious. Yes, it's worth establishing a rough chronology so we know if public officials are telling us the truth. But the endless preoccupation by my colleagues in the media--when did the Fed tell Treasury, when did Treasury tell [Tim] Geithner, when did Geithner tell Obama--is getting a little ridiculous. This just wasn't a huge substantive mistake. It was a small substantive mistake--we're talking about a tiny fraction of the $200 billion we're floating AIG here....

He then quotes a passage from a recent Washington Post article that "gets at the absurdity of it all" (boldface added by Scheiber):

During this period, Geithner's primary concern was keeping the financial system from collapsing, a source said. The compensation packages for AIG employees were hardly, if ever, brought up, another source said. Other staff members at the Fed and Treasury were in charge of the compensation issues and only briefed Geithner, sources familiar with the matter said. Once nominated for the Treasury post in December, Geithner recused himself from affairs related to specific firms.

Scheiber adds:

[H]ow much would we even want a Treasury secretary to focus on $165 million in bonus money while there were hundreds of billions of dollars in bailout money flowing to AIG and other companies? Doesn't seem like that would be a particularly good use of his time beyond a certain point.

The problem, of course, is that if you don't manage the politics of a situation like this, it can quickly cripple your efforts to do anything else. But, again, that's a reason for the press to treat it like a political fiasco ("please tell the American people what you're doing to make this right") not a substantive fiasco (Watergate-style badgering). The coverage seems to me a lot more in line with the latter than the former.

I don't have much to add since I think Scheiber is pretty clearly correct about all this.

Just one more point. I remember partaking in the outrage over the big investment banks' CEOs "taking private jets to hold out a tin cup to Congress" last year. For a moment there, it felt really important. Then I saw some story about CEOs making a point of not taking their private jets. Now, you can say that's too little too late. But let's say, for the sake of argument, that you actually admire the CEOs' abnegation. It occurred to me: does that really change anything important? I mean, I think it's good to avoid taking private jets because of the monstrous financial and environmental costs. But it's not as if I was going to be any less outraged about the need to bail them out no matter how ascetic their lifestyle choices were. Unfortunately, there's such an enormous amount of money at stake that a few million dollars swishing around here and there just doesn't change very much. I care very little about these people's moral purity; I care a lot more about what's going to happen to everyone else.

A council on men and boys would promote stable marriage as the best avenue to improve the lives and living conditions of America's women and families. A council on men and boys would address the crisis in American manhood that results in the scourge of infidelity, divorce, lack of commitment and fatherhood with multiple partners.

If Ms. Hicks wonders why men have no interest in a "stable marriage," or commitment, she need only look as far as her own dripping disdain for men and her lack of insight into a culture that holds men responsible, portrays women as victims, and then sets up a "council" to correct a problem that women spent over 30 years in the making. A council on women is about expanding their opportunities. A council on men is about controlling them.

I don't buy that explanation for why men screw up society. Men have been screwing up society forever. It's not the result of a few decades of anti-male feminists.

But I want to get back to the thing that actually happened -- the "White House Council on Women and Girls." What I want to know is: why is this really necessary?

Unequal pay? Obama loves to say that women are paid only 78 cents on the dollar relative to men. It's not at all clear to me that women are paid unequally. But that's a complicated enough topic that I want to leave it for a whole other blog post. (If you have actual evidence on this question that controls for all the relevant variables, please let me know in the comments. I cannot emphasize that italicized part enough.)

I'm particularly wondering about the "girls" part. They pointedly chose to put "girls" in the title, not just "women." Why do we need a White House Council on girls rather than, say, boys? This is not a rhetorical question -- I'd be interested to see any answers in the comments.

The children who are suffering in this country, who are having trouble in school, and for whom the murder and suicide rates and economic dropout rates are high, are boys — especially boys of color, for whom the whole educational system, starting in kindergarten, often feels like a form of exile, a system designed by and for white girls.

In the progressive Midwestern city where I live, the high school dropout rate for these alienated and written-off boys is alarmingly high. Some are even middle-class, but many are just hanging on, their families torn apart by harsh economics and a merciless criminal justice system. Why does it seem to be the Republicans who are more vocal about reforming our drug laws? Why has no one in the Democratic Party campaigned to have felons who have served their time made full citizens again? Their continued disenfranchisement is a foolhardy strike against these men and their families.

[T]he message here seems to be that no matter how bad things get for boys/men, well, they're men, so they can look after themselves.

Women, on the other hand, need a Presidential Council to make sure they're doing all right, even if by many metrics they are outperforming men.

I think I'd be vaguely insulted if I were a woman.

Another commenter has this ground-breaking idea:

The White House Council on Women and Girls should be merged with the White House Council on Men and Boys to create the White House Council on People, the goal of which is to promote policies that help people. President Obama should head the Council, and its name can be shortened to the White House.

The percentage of Americans who call themselves Christians has dropped dramatically over the past two decades, and those who do are increasingly identifying themselves without traditional denomination labels....

But I don't think those are the most important results from this poll.

It's hard to get at people's true thoughts and feelings by asking them to put a label on themselves, because you dredge up all the connotations surrounding the labels in addition to the literal meaning. There are plenty of atheists, for instance, who don't want to go around proclaiming, "I'm an atheist," because that's just not polite. And this can cut both ways: I have a couple friends who deeply believe in Christianity and call themselves "Christians," but will refer to themselves as "not religious" or even opposed to "religion." For them, "religion" means human-made institutions that distort and distract from the founding principles of Christianity.

The answers to these questions are more revealing, because they're about real life rather than labels:

Were you married in a religious ceremony? (ever married respondents only)

When you die, do you expect to have a religious funeral or service?

The answers were 69% "yes" vs. 30% "no" for weddings, and 66% "yes" vs. 27% "no" for funerals (with a few percent not answering). So that's roughly a 70-30 split for both questions. (Here's a PDF of the complete results.)

I'm surprised at how secular we've turned out to be. I would have expected those numbers maybe in Europe; I would have thought the percentage of secular weddings or funerals here in America would be more like 10%. Even to me, the idea of having a wedding or funeral without the trappings of religion feels ruthlessly antisocial.

There seems to be a strong notion (though maybe not as strong as I thought) that even if you aren't particularly religious, you should still publicly go through the motions to be nice. If you're, say, getting married, well, "families" are going to "expect" it.

People sometimes talk about "family" as if it's this dictatorial entity controlling your life. But if you're over 18, unless you're bound by some specific debt or contractual condition to a family member, you're allowed to do what you want.

2. I've been face-to-face with Bill Clinton several dozen times. I was born in Hope, Arkansas and my dad was friends with Hillary soon after I was born, and later worked on the campaign and for the first year and a half in the White House. The funniest / weirdest Clinton story I have, though, happened when Hillary Clinton came to BookPeople [in Austin, Texas]. I was still relatively new at the time, I guess, or at least quieter about things than I am now, and leading up to the event I didn't really tell anyone about the family connection. I volunteered to work the event cause I thought it would be fun but it had been years and years since I'd seen the Clintons and I had gone from being a pre-adolescent to a sort-of grown-up in that time so I assumed she wouldn't remember me. She wouldn't have, but my dad had lunch with her (and several other people) that day and let her know that I worked at BookPeople. So whenever she was finishing up signing people's books and we were all kind of milling around the 3rd floor events room she stands up and kind of yells "Where's Summer? I need to see Summer!" At which point I became so embarrassed and overwhelmed that when I hugged her I called her Hillary instead of Senator Clinton, which — just for the record — you Don't Do even if you're an old family friend. My dad lectured me about it forever. . . .

10. When my dad worked in the White House I had a huge crush on George Stephanopoulos — shutup, I was like 12 — something that my dad apparently thought was okay to tell George all about, culminating in me actually being at my dad's office one day when he came in and my dad reminded him that I was the one who thought he looked like Tom Cruise. Worst. moment. of. my. life.

Monday, March 16, 2009

Large blocks of government offices sit unfilled and critical jobs--those involved in managing the global economy, for example--go unperformed. Talk to those administration recruiters, and they'll complain about the difficulty of finding bodies to fill the posts. ... [T]hey have been deterred by one of the most rigorous human-resources operations in history. ...

As the [Washington] Post reported last week, the "intensified vetting process has left dozens of President Obama's picks to run the government mired in a seemingly endless confirmation limbo, frustrated and cut off from the departments they are waiting to serve and unable to perform their new duties."

Why the confirmation constipation? For one, there are more than a thousand presidential appointees that require Senate approval. Each has to pass a rigorous security check by the FBI, which conducts copious interviews not only with the nominee, but also with myriad acquaintances and colleagues. (One college president who was vetted for a part-time advisory-board position during the Clinton administration recalls FBI agents knocking on the doors of his fraternity-house neighbors to verify his character.)

Obama's insistence on having the most transparent administration in U.S. history has slowed the process even further, and his high-minded ethics standards have limited his ability to consider the full scope of talent. While the revolving door should be closed to lobbyists, there's something absurd about a process that treats human rights activists and consumer advocates as the ethical equivalent of K Street sharks, shutting them out of government. ...

President Obama needs to streamline this process before he erodes not only the willingness to serve, but his own ability to govern.

Way back in August 2007 I criticized Obama’s lobbyist pledge as “meaningless grandstanding.” That turns out to have been too optimistic....

[D]umb as the pledge was, it’s dumber still to stick with it. Flip-flopping will look bad, but nobody will care in 2012 about an old flip-flop. By contrast, lots of people will care by 2012 if we’re in the midst of a prolonged depression. The premium has to be on getting smart, effective people in place in order to frame and implement smart, effective policy. At the moment, Obama is still floating on a positive image and the fact that people rightly blame his predecessor for the current situation being so bad. But that brand isn’t going to be worth anything in a couple of years unless he brings back growth. He should admit that the initial pledge was ill-considered and a bit cynical and that it would be even more cynical to stick with a mistaken promise merely in order to avoid the need to admit a mistake.

Sunday, March 15, 2009

"...if they make no mention of prisoner re-entry programs like this one and you can't find a reference to same in the index of their books, then you know you have encountered someone who is more interested in the catharsis of complaining than in the work of solutions."

Now, surely everyone will be thrilled to hear about these success stories, right? Surely no one would actually get more pleasure out of believing that the criminal justice system is destroying blacks' lives, would they?

Friday, March 13, 2009

I've been on a classical-music kick lately. I was just listening to Haydn's "London" Symphony (No. 104), and longing for the days when the second movement ("Andante") could be considered the height of entertainment.

The first couple of minutes of the movement are about as ploddingly bland as you could imagine: slow, major key, regular meter, etc. But the blandness is worth it for the section starting at 2:45, where he incongruously switches the main theme to a minor key as a way to segue into the energizing pay-off.

Then, at 3:20, the energetic section gets abruptly cut off, and there's an awkward pause -- one of Haydn's wonderful trademarks. For the next few seconds, you feel like it's going to go back to the plodding blandness -- but no, it's still energetic for a while. Then, at 3:50-4:00, he transitions back to the slow theme from the beginning of the movement -- a smooth, gentle transition that's the perfect foil to the earlier awkward pause.

At that point, we really are back to the slow, quiet music that we started out with, but -- this is the beauty of classical music -- the original theme takes on a new, richer character now that we've been through the adventure in the middle.

I find it hard to imagine something like this being written today, with this much trust in the listener's attention span. I don't mean by a contemporary "classical" composer (since classical music has stopped having much cultural relevance), but by a popular songwriter. There'd be too much concern about the audience losing interest before the pay-off.

That seems to be the question everyone in the blogosphere is talking about, due to this Washington Post article, which is based on 13 interviews with parents who killed their children in the same way: driving to the parent's destination, forgetting to drop the kid off somewhere, and leaving the kid in the hot car for a long time.

(There should really be a warning on that article for the benefit of those who don't feel like running into a graphic description of a child slowly dying.)

More details:

"Death by hyperthermia" is the official designation. When it happens to young children, the facts are often the same: An otherwise loving and attentive parent one day gets busy, or distracted, or upset, or confused by a change in his or her daily routine, and just... forgets a child is in the car. It happens that way somewhere in the United States 15 to 25 times a year, parceled out through the spring, summer and early fall.

There may be no act of human failing that more fundamentally challenges our society's views about crime, punishment, justice and mercy. According to statistics compiled by a national child-safety advocacy group, in about 40 percent of cases authorities examine the evidence, determine that the child's death was a terrible accident -- a mistake of memory that delivers a lifelong sentence of guilt far greater than any a judge or jury could mete out -- and file no charges. In the other 60 percent of the cases, parsing essentially identical facts and applying them to essentially identical laws, authorities decide that the negligence was so great and the injury so grievous that it must be called a felony, and it must be aggressively pursued.

There's a lot of talk about how these things aren't "intentional"; they're "accidents." One parent "plainly" told Weingarten in an interview:

"I don't feel I need to forgive myself, because what I did was not intentional."

But that doesn't mean the act isn't illegal. Not all crimes are intentional. There are crimes based on "recklessness" or "negligence." As long as the state has a crime on the books defined as, say, negligently causing death (or, for that matter, negligently endangering the welfare of a child), the legal question actually seems pretty simple.

A lot of people seem to have a very strong emotional reaction that parents who do this shouldn't be prosecuted. By calling it "emotional," I'm not putting it down -- emotion can be a useful guide to what's right. But it's also important to step back and analyze it. Why do people have that reaction? Is it because the article starts out describing the agony of a particular father who did this, with a touching photo of the man holding his lost child's stuffed animal and looking overwhelmed with sadness?

Well, what if you were the parent ... and your babysitter did this to your child? Would you want the babysitter to be charged with a crime (assuming they're over 18)? It would hardly seem principled to have a rule that parents are legally allowed to do this to their children but babysitters aren't allowed to do it to other people's children. (I stole this point from this commenter.)

To me, the decisive factor is the "lifelong sentence of guilt far greater than any a judge or jury could mete out." No matter what technical argument might be made about how prosecutors could legally charge the parent with a crime, that doesn't mean they have to do so. Living with the guilt is punishment enough; it seems like a waste of resources to make the parent also serve a prison sentence, even a light one.

Another problem, though: this is a very well-crafted article that describes the parents as sympathetic and loving. But there are parents who routinely neglect their children because they're always drunk or high, or because they just don't care enough. I haven't seen any evidence to show that any such parents have caused their children to die by leaving them in cars, but it's entirely possible. Should prosecutors treat those parents differently from "normal" parents?

Many people are outraged at the parents: "How could you possibly forget your child in the car?!" I actually think it's disturbingly easy to imagine this. Can't you think of a time when you told yourself, "I have to do tasks A, B, C, and D in the next hour," then felt assured that you did everything you were supposed to, only to realize hours later that you never did task B? Well, imagine doing that, but task B is dropping your kid off at day care.

Megan McArdle has a good retort to the people who have that outraged reaction:

[T]he belief that you cannot possibly leave your kids in a car seat on a warm day is very dangerous to your kids. It is a virtual certainty that someone who read that article, and said to himself "That's BS--I could never leave my kid in a parked car"--will leave their kid in a parked car. It is the people who are afraid of it, who think that they could do the unthinkable, who are most likely to avoid that fate.

"We are vulnerable, but we don't want to be reminded of that. We want to believe that the world is understandable and controllable and unthreatening, that if we follow the rules, we'll be okay. So, when this kind of thing happens to other people, we need to put them in a different category from us. We don't want to resemble them, and the fact that we might is too terrifying to deal with. So, they have to be monsters."

Interestingly, the author admitted after publishing the piece that he had a blatant bias in writing the article. No, he didn't kill his child. But by his own admission (in this Q&A, published shortly after the article), he almost did. He would have left his infant in a locked car -- in the summer, in Miami -- but his daughter Molly happened to say something right before he was going to get out. Before then, he had no idea she was still there. He adds:

I did not tell my wife about that moment in the parking lot, not for years, not until half a year ago when I began working on this story and needed to explain why it was keeping me awake nights. And I didn't tell Molly about it until just a couple of months ago; oddly, I found that 25 years after the day no harm was done, I couldn't look her in the eye.

So, can this be stopped? Some people suggest car alarms. Well ...

there is the Chattanooga, Tenn., business executive who must live with this: His motion-detector car alarm went off, three separate times, out there in the broiling sun. But when he looked out, he couldn't see anyone tampering with the car. So he remotely deactivated the alarm and went calmly back to work.

The author (again, in the Q&A) has a suggestion that seems more useful:

[Make] sure that daycare centers ALWAYS call the parent if the child doesn't arrive one day.

Wednesday, March 11, 2009

[Steele:] The choice issue cuts two ways. You can choose life, or you can choose abortion. You know, my mother chose life. So, you know, I think the power of the argument of choice boils down to stating a case for one or the other.
[GQ:] Are you saying you think women have the right to choose abortion?

[Steele:] Yeah. I mean, again, I think that’s an individual choice.

[GQ:] You do?

[Steele:] Yeah. Absolutely.

Well, I admire the fact that he's so open-minded about what kind of positions on abortion are acceptable in the Republican party. I'm not saying that to tweak him -- he explicitly makes this point:

[GQ:] Do pro-choicers have a place in the Republican Party?

[Steele:] Absolutely!

[GQ:] How so?

[Steele:] You know, Lee Atwater said it best: We are a big-tent party. We recognize that there are views that may be divergent on some issues, but our goal is to correspond, or try to respond, to some core values and principles that we can agree on.

[GQ:] Do you think you’re more welcoming to pro-choice people than Democrats are to pro-lifers?

[Steele:] Now that’s a good question. I would say we are. Because the Democrats wouldn’t allow a pro-lifer to speak at their convention. We’ve had many a pro-choicer speak at ours—long before Rudy Giuliani. So yeah, that’s something I’ve been trying to get our party to appreciate.

Hm, maybe that's how it would be useful to me: I could use it to stay in touch with you, the blog reader ... if you're out there!

One of the things that's most kept me from using Twitter is the word "tweets." Anytime the topic of Twitter comes up, the word "tweets" isn't far behind. It sounds like a word you might come up with if you wanted to make a website appealing to 10-year-olds. I just feel embarrassed when I hear adults talking about their "tweets." So while I might keep using Twitter, I'm not going to talk about "tweets."

Here are a few posts I've seen on Twitter that went some way toward convincing me it's only a partial waste of time and not a complete waste of time:

michaelianblack Admitting that it's not the best thing in the world, can't we all agree that eating people isn't THAT bad?

Lileks With an electric razor, you can shave at your desk. You don't have to look at yourself in mirror. Great for guilty consciences!

So that's all kinda cool, I guess. But it's still not as useful as Facebook.

One other thing. Sooner or later everyone's going to get tired of the 140-character limit. I already am. I understand the benefit of forcing everyone to be concise. And it's kind of fun watching the number of remaining characters count down, which makes it "breezy and game-like." But they could keep all of that and make the limit, say, 300 characters. Cap it at a length that will let people write a couple decent sentences. Right now, it's hard to write even one sentence. If someone created a new website that was just like Twitter but a little less draconian, I'd switch.

I just found out one extra tidbit that makes this archbishop even more appalling. The rapist was the girl's stepfather, and the archbishop specifically refrained from excommunicating the stepfather. The archbishop tells us that the rape of the 9-year-old girl was not as bad as the abortion.

It seems like basically a blog service but with a word limit that's so restrictive that it's awkward to do links.

Or like IM without the spontaneity.

Or like Facebook's status updates without Facebook.

So, it's like a severely limited version of a bunch of other social media, with nothing extra to compensate? What's the point?

Since I wrote that, I've slowly started to see the point. It's sort of like IM but better: everyone's content gets merged together, so you can experience it as an ongoing stream rather than a discrete, private conversation. OK, that's nice.

But that's leaving something out: who is engaging in those conversations? Ideally, you'd be connected with huge numbers of friends on Twitter. Well, I know very few people who regularly use Twitter. Meanwhile, I'm on Facebook and have about 250 friends, which is not an unusually high number for Facebook. I'd be surprised if even 50 of them are on Twitter.

Why does that matter? Because Facebook already has a feature that's much like Twitter: you can write a "status" message at the top of your profile that will also temporarily show up on your friends' Facebook homepage. I think the Facebook status even has a more generous space limit than Twitter.

So here's the problem: let's assume I want to let people know what I'm doing -- say, making carrot-ginger soup (real example). Why would I post this to Twitter instead of Facebook? Even on the liberal assumption that 50 of my friends are on Twitter, that's just a tiny fraction of 250. So I can do the same thing on either site, but a lot more people will see it if I use Facebook.

A Slate article from yesterday gave an elaborate explanation of why Twitter isn't going to "kill Facebook." According to the article, Twitter has an edge because it gives users real-time content, right when it's posted, while Facebook manipulates the timing of its content. (The article then dismisses its own point by noting that Facebook is apparently going to overhaul the interface to be more like Twitter.) That may be good fodder for a Slate writer who needs to make a clever point in an article, but is that what's really going to determine whether Twitter "kills" Facebook (or Facebook "kills" Twitter)? I have to think most people choosing between the sites are going to care about the same thing I care about: how many people I'd want to be in contact with are using these sites?

Aside from all that, I have been warming up to Twitter. I see it as an alternative to Facebook for older people. Twitter might not make sense as an alternative to Facebook if you're in your 20s, but if you're over 40 or so, it allows you to partake in the joys of social media without having to figure out: "What's this Facebook thing all about?"

Maybe that's why the mainstream media are so obsessed with Twitter. They're informed enough to know that social-networking websites are a big enough craze to be worth reporting on. And it's clear what the most important such site is right now: Facebook. But either they don't get Facebook or it's too complicated to try to explain how it works in the limited space of media soundbites.* Twitter, by contrast, is entirely based on simplicity and brevity, so it's easy to convey it to a mass audience and seem like you're tuned into what the kids are up to these days.

* Of course, if that's the reason, then you have to wonder: if the mainstream media aren't adept at explaining something as trivial as Facebook, can we trust them to report on complicated issues that are actually serious?

One bill [in Montana] gets straight to the issue — promising to exempt hundreds of millions in economic stimulus projects from the state's landmark environmental policies. Environmentalists are ramping up lobbying efforts as a wave of measures eroding regulatory rules gain serious traction in the face of a recession and shrinking state coffers....

In California, lawmakers relaxed environmental laws for road projects and construction equipment in the name of economic stimulus as part of a recently approved budget package. In Idaho, lawmakers shut down new regulations for septic-tank drain fields because they feared it would hinder Idaho's economy, especially during a recession.

Utah is even considering a company's offer to take nuclear waste in exchange for needed cash. In Kansas, lawmakers are pushing for legislation that would pave the way for coal-fired power plants in the southwest part of the state....

Huffington Post gives this story the headline, "Some States Still Don't See Economy-Environment Connection." The implication is that this kind of thing is a strange aberration occurring in a few out-of-step states. But it seems like there's a pretty straightforward leap from "We need to stimulate the economy with big infrastructure projects and new blue-collar jobs or else the world as we know it is going to collapse," to "Hey, wouldn't it be worth loosening up a bunch of environmental regulations?"

Of course, one reaction is: don't worry, everything's fine, because you can stimulate the economy through "green jobs." I'm all in favor of stimulating the economy with green jobs. But that's not the only thing or even the main thing being done as part of the stimulus plan. The overall plan is clearly in tension with environmentalist goals, but people are just hoping we won't notice this.

We consumers are getting contradictory messages about spending. On the one hand, we are told that our overconsumption is polluting and cluttering up the earth with garbage, using up resources and showing insensitivity to all the needy people in the world. On the other hand, we are told that until we start buying more goods and services, the economy will be in the dumps and we will leave many of our fellow citizens jobless, homeless and hungry.

Friday, March 6, 2009

I usually try to stick to just blogging music I'm pretty well acquainted with on Music Fridays, not just random YouTube clips or MySpace profiles. I admit that this week is an exception -- I don't have any of Michel Petrucciani's albums, and I've mainly just listened to a few performances on YouTube. I have, however, visited his grave in Paris, reverently located right next to that of another pianist -- Chopin. And after watching some YouTube clips and reading up on his life, I'll need to pick up an album or two of his.

"He was in pain all the time," recalled his father, Tony Petrucciani, a part-time guitarist in the Grant Green mold. "He cried. I bought him a toy piano." The keyboard looked like a mouth to Michel, and he thought it was laughing at him, so he smashed it with a toy hammer, and his father got him an old full-size upright abandoned by British soldiers at a military base. From the age of four, Michel spent virtually all his free time, which was abundant, at the piano.

Petrucciani was twelve years old and looked like a toddler when his father started carrying him into jam sessions around the south of France. ...

I love this clip of him playing "Take the 'A' Train" with bassist Anthony Jackson and the acclaimed drummer Steve Gadd:

More from the article:

The first time I saw Michel Petrucciani, a friend of his was carrying him into Bradley's, the tiny piano-jazz club in Greenwich Village where I spent most of my nights and salary in the 1980s. ... The bar crowd cleared a path from the door to the piano, and Petrucciani screamed, "Get out of my way, motherfuckers!" ...

"I've never been around anyone who loved to live like Petrucciani--and live life to the fullest," says Mary-Ann Topper, his manager during his breakthrough years in New York. "He said to me, 'Mary-Ann, I want to have at least five women at once, I want to make a million dollars in one night'--things that were probably impossible. But had Michel ever thought that anything was impossible, he would have never done anything he did." As Petrucciani himself said, "I'm a brat. My philosophy is to have a really good time and never let anything stop me from doing what I want to do. It's like driving a car, waiting for an accident. That's no way to drive a car. If you have an accident, you have an accident--c'est la vie."

"Save Our Planet … Every day millions of gallons of water are used to wash towels that have only been used once … Please decide for yourself."

The author, Jill Hunter Pellettieri, says she likes to spite these signs by "request[ing] fresh towels and sheets every day."

So, what does Pellettieri have against hotels encouraging people to reuse towels? She says:

I'm all for saving the environment. But I don't want to be guilt-tripped into going green.

Now, she's right that it's guilt-tripping. But what's wrong with "guilt-tripping" if it's actually useful in making people more aware of the environmental effects of their actions?

One thing wrong with it is that guilt-tripping can backfire, so it's not always a smart tactic. You have to weigh the benefit of encouraging people to do good things against the potential for making them feel resentful. But she's not making that argument; she seems to have a problem with it in principle. And if it's the principle of the thing, well, consumers should feel guilt-tripped.

She says it's "your right when you pay for the room" to use a new towel every time. Well, the whole problem is people like her who think it's their "right." You may have the "right" in a narrow, contractual sense: you've paid for full hotel service, and you can do what you want and expect the hotel to continue serving you. But the hotel also has the right to encourage (not force) you to exercise that right responsibly.

In the service industry, it's the business that should take responsibility for being environmentally sound, not the customers. There are a number of ways hotels can do this: installing water-saving toilets and showers, replacing light bulbs with CFLs, using solar energy, eliminating Styrofoam coffee cups, substituting room key cards made of plastic with those made of recycled paper.

Why is this either/or? Shouldn't it be both? Customers (at hotels or anywhere else) have an enormous responsibility not to excessively use up resources.

You know, Slate is normally a publication you read if you want to have your preconceptions challenged with clear thinking that's not afraid to make the reader disturbed or uncomfortable, including about the environment. (Slate has a long-running weekly column on environmentally smart consumption, a striking contrast to this article.) Pellettieri, in contrast, assumes there must be an easy way out. We shouldn't need to give up the tiniest luxury. Big businesses should make their products "green" so we can use them guilt-free.

I'm all in favor of "green" products, and her specific suggestions along those lines might all be fine. But changing the characteristics of products is an intrinsically limited way of approaching environmental conservation. Green products may be better than the alternative, but they're not magical. I never know how much good supposedly green products are really going to accomplish, and it's hard to even know which products are greener than others, whatever that means. But I know that using only what I need is a whole lot better than using, say, three times what I need with barely any pay-off for myself.

Her own examples of things she'd like businesses to take care of just accentuate this fact: "installing water-saving toilets and showers, replacing light bulbs with CFLs, using solar energy, eliminating Styrofoam coffee cups." Water, cups, energy, and light are all consumer resources that we should be using just as much as we need, not wasting.

She does have a reasonable proposal:

If hotels really can't do without these opt-in laundry schemes, at least they could be transparent about their motives and reward the guests for their sacrifice. "Reusing your towel not only saves our precious natural resources; it also helps us save money. By participating in our linen-reuse program, we'll knock $10 off your room stay per night."

Now, I think offering a specific discount as an incentive is a fine idea. The problem is that she seems to think this is the only way a business could legitimately encourage its customers to conserve. If doing less washing saves businesses money, as she says, then we should expect that to eventually be reflected in lower prices (all other things being equal). That would be, in principle, the same sort of discount she's advocating. It would just be more diffuse and less visible, but that's no reason to disregard it.