Saturday, December 10, 2005

It seems absurd in retrospect, because I was only 10 years old in 1968 (although my prodigy status had gotten me into the 8th Grade), but I was very attracted to Eugene McCarthy's campaign. Had he or Scoop Jackson ever won the presidency, people like myself might well have remained Democrats.

McCarthy was an original. This article is fairly comprehensive. But to capture the mood of the time, you should read the James Jackson Kilpatrick write-up of that campaign in the National Review.

President Narcissistic Swine, aka Bill Clinton, now solemnly informs the world that "climate change is real... and caused by human activities." Not so real, of course, as to have induced him actually to submit the Kyoto Protocol to the Senate for ratification back in 1997; he would have received no more than ten votes in favor, and what's a little environmental destruction when his political interests are at stake?

Well, where Willie stands depends on where he (or someone) sits. So let us review the actual evidence on anthropogenic global warming, shall we?

1. Carbon dioxide concentrations in the atmosphere have increased from about 290 parts per million in 1900 to about 360 ppm today. Over 80 percent of that increase occurred after the surface temperature peak around 1940, a sequence of events inconsistent with the standard left-wing argument. Many of the same people now screaming about global warming were screaming about global cooling in the mid-1970s.

2. The evidence shows that surface temperatures 3000 years ago were about 2 degrees C higher than today, abnormally low 1500 years ago, and over a degree C warmer 1000 years ago, after which the earth entered the Little Ice Age, which lasted until about the year 1700 and from which surface and atmospheric temperatures are now emerging. Temperatures now appear to be at or a bit below the 3000-year average, and the evidence does not support the claim that temperatures in the 20th century were unusual compared with the previous 900 years.

3. Satellite and weather balloon (radiosonde) measurements since 1979, corrected for orbital drift, instrument calibration shifts, and other such measurement error, show an increase in lower tropospheric temperature of 0.06 degrees C per decade, or 0.6 degrees C if extrapolated for 100 years. Other recent work correcting the IPCC models yields a similar modest warming of about 1.5 degrees C over the next century.

4. Surface temperature measurements over the last century show an increase of about 0.27 degrees C; since 1940, the figure is about 0.09 degrees C if extrapolated for 100 years. We do not know if adjustments in the data for urbanization ("heat island") effects are complete.

5. Since 1979, surface temperatures have increased about 0.18 degrees C per decade. The figure for the lower troposphere is 0.06 degrees C; but the conventional IPCC models predict that the troposphere should warm more than the surface. This discrepancy suggests significant error in the conventional models.

6. The IPCC models predict larger effects from increased concentrations of carbon dioxide than actually observed in the satellite and weather balloon data, an outcome consistent with the hypothesis that the interactions among water vapor, carbon dioxide, and other atmospheric components tend to dampen the effects of increased concentrations of carbon dioxide.

7. Satellite measurements of global sea levels show a downward trend for most of the earth, with the exception of the eastern equatorial Pacific.

8. The data since 1940 show trend declines in the frequency and intensities (wind speeds) of hurricanes.

9. Both theory and evidence suggest that prospective anthropogenic warming will be modest and will occur for the most part in the coldest and driest air masses, particularly Siberia and western North America in the winter.
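The per-decade figures in points 3 and 5 scale linearly to the per-century numbers quoted. A quick sketch, purely to check the arithmetic; the linear extrapolation is the post's own simplifying assumption, not a climate model:

```python
# Linear extrapolation of the per-decade temperature trends quoted above.
# The figures (0.06 and 0.18 deg C per decade) are taken from the post;
# extending them linearly to a full century is the post's own assumption.

def extrapolate(rate_per_decade, years):
    """Scale a per-decade trend (deg C per decade) over a span of years."""
    return rate_per_decade * (years / 10)

# Point 3: lower troposphere, satellite/radiosonde trend since 1979.
tropo_century = extrapolate(0.06, 100)    # 0.6 deg C per century

# Point 5: surface record trend since 1979.
surface_century = extrapolate(0.18, 100)  # 1.8 deg C per century

# The IPCC models predict the troposphere should warm MORE than the
# surface; the measured ratio runs the other way.
ratio = tropo_century / surface_century   # about 0.33
```

The interesting quantity is the ratio: the models say it should exceed one, while the measured trends put it around a third.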

Basic global warming theory is correct: increase the carbon dioxide concentration in the atmosphere, and the earth will warm a bit. The problem is that the conventional models in essence are disequilibrium models: The warming will cause more ocean evaporation, the resulting increase in water vapor concentrations will warm the earth even more, so water vapor concentrations will increase further, etc. This story is objectively false: The warming 3000 years ago, not caused by capitalism generally or SUVs in particular, did not yield a permanent warming. Nor is it at all obvious that a warmer earth would be worse than a cooler earth; that depends on the decrease in the value of the existing capital stock, the cost of adjustments, etc. All a topic for another day.

Sam's probably got real reviews of The Lion, the Witch and the Wardrobe running in half a dozen places, but I couldn't resist offering my own scattershot impression. I just returned from a first-day showing, and I am blown away. This movie not only avoids every pitfall I dreaded upon hearing Disney was involved in the project; it is a gem. Not flawless, but a gem nonetheless.

There is too much in this movie to take in all at one go. Luckily I am guaranteed a return visit next weekend, as my husband, poor unlucky sod, was sent to Paris on business for five days and so missed tonight's outing.

The soundtrack is for the most part a work of artistic genius -- eclectic, original, and without cliche from beginning to end. The only choice I'm not sure about is Alanis Morissette over the closing credits, but we were the only ones left in the theater at that point anyway.

Technology has finally become so sophisticated that there is no longer a need for the audience to will belief in depictions of fantastic worlds. The centaurs were as convincing as the Pevensie children. Aslan is almost too perfect -- at one point I found myself marvelling at the way his mane rippled in the wind instead of paying attention to the plot.

There will be controversy over the beavers. I liked them, Rachel did not.

This is one of the few book-to-movie adaptations I have seen where I have agreed with the changes and omissions. At 150 minutes it is long for a movie aimed at a young audience, but there was no restlessness in the theater.

And now a review of the audience: there are now two full generations of people who have no earthly idea how to behave in public. I am accustomed to being surrounded by obnoxious morons in movie theaters. I am not yet accustomed to mother-and-son pairs offering non-stop commentary loud enough to drown out battle scenes. Sample dialog from the brace of mental giants directly behind us, on the appearance of a squadron of airborne war gryphons:

Hey! Is they Pegasus things?

You dumb%&@, they're cat-eagles.

They had to take time off from kicking the back of my head to think this stuff up.

Thursday, December 08, 2005

It occurred to me that the following excerpt might be of interest. The writing is mine (as I have permission to reveal) but the book's author is Guillermo Sostchin, a prominent Cuban-Jewish attorney here in Miami. The larger opus comprises his analyses of Biblical passages and themes. This particular selection is an essay dealing with an aspect of the story of Noah's Ark.

And Noah entered, along with his sons and his wife and their wives, into the Ark, away from the (advancing) waters of the Flood. From the pure animals and from the animals that are not pure and from the birds and all that crawls upon the ground, two at a time they came to Noah to the Ark, male and female, as the Lord commanded to Noah. (Genesis 7:7-9)

Away from the (advancing) waters of the flood. Noah, too, was of little faith, alternating between believing and not believing that the Flood was coming, so he did not enter the Ark until the waters were pressuring him. They came to Noah. On their own. (Rashi)

The Bible is drawing our attention to a fascinating contrast between the human response to the impending Flood and the animal reaction. The animals came of their own accord, indicating that they had some instinctive sense of the impending doom and knew to seek out some form of refuge. (Of course, many more than two would have shared the instinct to find a way out, but only two were given the extra sense that the Ark being built on a particular man’s property held the key to survival.) Human beings, on the other hand, do not seem to have sensed that anything was amiss. Noah had a prophecy, which he in turn relayed to others. All of his neighbors dismissed his warnings as sheerest fantasy. Even he, as Rashi deduces from the text, had difficulty achieving a full acceptance of this idea which ran so radically counter to the human perception that the world was very solid and durable. Only the actual beginnings of a powerful storm convinced him thoroughly of the literal accuracy of the prophecy. This, despite his accepting the message from G-d, communicating it to others and even putting into action the instruction to build an ark to specifications. Indeed we saw this phenomenon confirmed in our own day, when a great tsunami struck Indonesia and Thailand. Many people were killed while lounging casually on the beaches. When the authorities arrived, looking for survivors to evacuate and bodies to bury, they were amazed to find that although almost two hundred thousand people had perished, not a single denizen of the animal kingdom had lost its life. Zero: not one animal had died as a result of the tsunami. Clearly, they had been aware in advance that it was on its way and they had been able to find their way to the safety of the higher elevations. Why is it that animals are more responsive to portents of danger than humans; why would mankind find this message so difficult to process?

It seems to me that the Bible (and Nature, in its recent rumblings) is trying to show us the proof against the notion that human beings might have evolved from animals in some manner that was not guided by a Divine intelligence. Had there been a process that was random and achieved simply by nature taking its own course, there would have been a bridge that links the consciousness of animals and humans. There could not be a total shift from one system of processing environmental data to another without the slightest vestige remaining from the first system. Instead, we see that animals respond to a network of instinctual stimuli alerting them to ripples in the tranquility of Nature. Humans, however, have no access to this data bank. They can only process information by importation through the five senses followed by an intellectual examination and deliberation. Therefore, an animal senses a Flood by instinct; he immediately heads toward safety. A person, even a good person, even one who has been informed by prophetic means, finds it difficult to perceive danger through the intellect when the planet sits peaceful and solid beneath his feet.

For those upwardly mobile urban residents who live in Brentwood, the Upper East Side of New York, Chicago's Gold Coast, or wherever the aspiring masters of the universe set up house, children are not small flesh-and-blood people; they are commodities. Their value fluctuates like the gold market. What counts, of course, is whether they can enhance the reputation of their parents, and whether parents can live vicariously through the exploits of their kids.

At a recent dinner event several guests regaled me with stories of their children's achievements. That is well and good, since parents who have something positive to say about their kids might as well let other guests in on the success. But at one point a fellow said, "My son disappoints me; he didn't get into Harvard." I asked if he (the dad) went to Harvard. He said, "No, but I was counting on my son to get in." I innocently noted that this rejection would probably not have the slightest influence on the young man's future. Dad demurred, "Of course it will; I was counting on it."

This conversation has been repeated many times, in many places. Each time I come away perplexed. Why would parents be disappointed that a son or daughter didn't get into an Ivy League college, especially if they didn't get into one themselves? Moreover, why be disappointed in a child who didn't get admitted to an elite school? It is precisely because it is elite that not everyone gets in. This rejection doesn't ensure failure in life, just as acceptance doesn't ensure success.

The answer to this conundrum is that upper-middle-class kids are treated as commodities. It is what they do that matters, not who they are. The marketplace of conversation depends on conditions that allow one to boast about the children in a social game of one-upmanship. Here is reverse projection: the parents derive prestige from what their children achieve. I can remember a time when kids, who took pride in their parents' accomplishments, wanted to emulate them. How quaint that seems at the moment.

This children's commodities market has its ups and downs, just like the Mercantile Market. On some days Johnny's stock goes up; he won his tennis match or got a 1600 on the SAT. On other days his stock goes down; he didn't win a Merit Scholarship or he struck out in the 9th with the bases loaded. This rollercoaster effect is found in everyone's life, and surely boasting about children is not uncommon. What makes this condition odd is the lack of intrinsic value accorded the child. Kids must produce to have value, just as corporate value depends on earnings.

Not only does this put inordinate pressure on children; it is an attitude hostile to the very nature of the parent-child relationship. It dehumanizes the kid and grotesquely limns the parents. In this human calculus one weighs achievements against failures, using the most superficial of standards to render a judgment. Is Johnny less of a person because he didn't get into Harvard?

Fortunately, this slice of life is restricted to an affluent portion of the population that can afford to preoccupy itself with fantasies of its offspring's accomplishments. Very often, what Dad or Mom couldn't do themselves, they expect from their children. After all, they have offered every privilege money can provide; now results are expected.

Where this leads is already clear: psychiatrists treat more children of the wealthy than ever before. Children are driven to succeed and become depressed when unrealistic standards are not met. Parents, on the other hand, are frantic. If Mary isn’t always attentive in school, she becomes a candidate for Ritalin. If Johnny only scored 1500 on the SAT, Kaplan or Princeton Review sessions will be in his future. It is not merely the edge Mom and Dad want for their children; it’s the “stock price” of the offspring.

Children as commodities may seem a harsh idea, but it is, in my opinion, part of current reality. The problem is that kids often can't meet the expectations parents set for them, and the market suffers from irrational exuberance. Perhaps this market will also burst, like its analogue on Wall Street. That might release the pressure at home, but its consequences for society would be profound indeed.

Those who know me are aware that I could never be called "a man of the left," but maybe "a man of the heft" would be fitting.

Hitting age 35 while still carrying excess weight has landed me in the doghouse with my doctor. He threw the book at me and I'm now blogging about the experience as a way to stay accountable to dieting.

If anyone is interested in reading about that and maybe contributing their own comments, just slide over to I Might Be a Giant, a new weblog about cutting personal liabilities.

MEMPHIS, Tenn. - In an unusual case of mistaken identity, a woman who thought a block of white cheese was cocaine is charged with trying to hire a hit man to rob and kill four men. The woman also was mistaken about the hit man. He turned out to be an undercover police officer.

Sandy Booth, 18, was arrested over the weekend and remains in jail with bond set at $1 million on four charges of attempted murder and four counts of soliciting a murder.

According to police, Booth was in the Memphis home of the four intended victims last week when she mistook a block of queso fresco cheese for cocaine — inspiring the idea to hire someone to break into the home, take the drugs, and kill the men.

She told the undercover cop, whom she believed to be a hit man, that any children in the house old enough to testify would have to be killed.

To summarize: an eighteen-year-old girl decides to kill four men and their children, and presumably any spouses and girlfriends who chanced to be there, and any other innocent bystanders who might happen by, and take the men's imaginary cocaine money.

Wednesday, December 07, 2005

Now, like many or even most Americans, I have a soft spot for John McCain: a war hero, an ex-POW, and a man willing to cross his own party. Still, it's hard to tell whether he's grandstanding or following principle.

On his leadership against torture, since he was a victim of it, we shall give him the benefit of the doubt, though not to his allies on the other side of the aisle, or even to those on his own.

Torture is wrong, and it doesn't work, anyway.

Sweet. Grabbing the moral high ground, and anyone who disagrees is a sadist interested only in inflicting needless suffering. Cheney and Rumsfeld are Himmler and Heydrich.

But torture does work. Let's get that straight. The case of US Army Col. Allen West is easily as important as the Valerie Plame nonsense, but it has disappeared from the public discussion (if it was ever there) because it puts the lie to the framing of defenders of the "ticking bomb" scenario as immoral sadists.

Briefly, while serving in Iraq, Col. West uncovered a plot to ambush him and his men. He treated rather roughly a man who had knowledge of the plot, fired a few shots from his pistol in the man's close proximity while threatening to kill him, and got the information, saving both himself and his men.

As a coda, administrative action was taken against Col. West under the Uniform Code of Military Justice for what "amounted to torture." His career is over.

So, torture is not only already illegal, it also works. It can save lives. So much for the moral clarity that the current anti-torture argument claims. There's a real-world dilemma here.

But what of the "wrongness of torture" argument that remains? It claims a moral absolute, but it is in conflict with the first natural right: to survive. Was Col. West obligated to die because of this moral absolute of "wrongness"? To let his men perish?

The "wrongness" argument requires suicide. Let its proponents own it: I would rather die than have someone tortured to save me. Or to save my friends, my lover, my parents, or my children.

Further, I forbid anyone else from saving their own lives or those of friends and family in this way.

Strangely enough, like capital punishment, I'm personally opposed to torture for reasons that resonate from my religious beliefs. But if I'm to park all that at the door when we as a nation decide important things like this, then my reason admits that the arguments for both torture to save life (and for capital punishment) are the stronger.

And to throw both the moral and practical arguments into a blender, especially when neither can stand on its own, and use the resulting incomprehensible slime to pour on one's opponents as "supporters of torture?" No, that just won't do. John McCain gets a pass. The rest do not. This is the real world, where, if one washes his hands like Pilate and walks away, innocents die.

The Spectator has graciously run a musing of mine on how some of the subtler decisions required in Iraq may be getting drowned out by the Democrats' shrillness and hyperbole.

Here is the merest morsel to clean your palate:

Even if the military obstacles are eventually breached, we are caught in a subtle conflict that simultaneously challenges our political, governmental, legal, and moral sensibility. Say we determine, as hitherto we have, that the peculiar morphology of modern terrorism requires the suspension of certain precious mores. It allows, even demands, that we imprison people for years with less-than-due process, or torture people who have urgent knowledge of pending or impending horrors. What, then, do we tell the new government of Iraq? Can we allow it to behave in this manner?

Tuesday, December 06, 2005

Over at The American Spectator, it's considered a great honor if one's article is featured in the headlining picture. That honor is still mine on Dec. 6 until midnight. You could actually click on that picture and it links through to my article.

Now that midnight is a muddy memory, and sic transit gloria mundi (even on Tuesdi), we can only link to the specific article.

My subject today was the assassination by person or persons unknown of the much bewailed and bemoaned Mr. Rabia, #3 potentate of al-Qaeda.

Here is a strand culled from amid the arabesque:

The real War on Terror may be kicking in now. Now we have to get individual al Qaeda members who may be lurking in attics and cellars anywhere and everywhere. At this point the logic of war between the United States of America and a private-sector gang involves bestowing upon them a sort of honorary sovereignty. They are the government-in-exile of the sovereign nation of al Qaeda and every one of them is an ambassador. Their home, in whatever host country, is a piece of enemy territory. The principle of embassy status and diplomatic immunity is applied in reverse.

Look, they came here and bombed us with their Air Force. Does it really matter that their fleet was acquired through piracy of commercial aircraft? In the same way, we view Hamza Rabia's house in Pakistan as occupying a legal status distinct from the rest of that ally country. His house is an al Qaeda embassy with discrete sovereignty, and as long as we don't mess up Pakistani lawns too badly with shrapnel and body parts, we reserve the right to act on our declaration of war. Or better said, on our engaging of their declaration of war.

Also, please let me encourage you again to visit my new sub-blog for fun two-line comments on the day's news: http://twolinenewsviews.blogspot.com/

You know Catholics and Evangelicals have ceased hostilities when you read National Review's list of 15 Unsung Conservatives and find:

Carl F. H. Henry (1913–2003): Billy Graham was the greatest evangelical preacher of the 20th century; one of the greatest evangelical thinkers was Henry. An ordained Baptist minister, he gave the evangelical movement its intellectual heft through his books and, most notably, his editorship of Christianity Today, a magazine that he and Graham founded in 1956 to counteract the influence of the more liberal Christian Century. Although he defended traditional understandings of Scripture, he rejected fundamentalist rigidity and urged evangelicals to engage the wider world rather than to retreat from it — an encouragement that continues to motivate serious Christians to occupy the public square.

CFHH is one of my personal heroes, but he wouldn't have been on the radar of NR twenty years ago. The fact that he is included now shows the religious interpenetration of the two camps and how well Henry's legacy is wearing.

In the beginning, Hollywood discovered that there was a market for serious Christians who would enjoy entertainments based on a more orthodox view of the faith. The prophet Gibson showed them the way. Though he was despised for revealing this unpopular truth, his already significant fame grew and the actor became an icon. And it was good.

Having learned Gibson's truth, the behemoth company Disney did seek to dwell in the promised land and thus became the makers of a film based on a story by an older prophet Lewis. By all accounts from those who have seen the early results, it too was good.

However, it may be the case that all this marketing sometimes goes too far and that may not be good.

The two two-line blogs that I did here yesterday inspired me to create a separate blog for that purpose. This way I can comment, if briefly, on a number of news events each day - but in a maximum of two lines.

Larry White is an outstanding free-market monetary theorist at the University of Missouri–St. Louis. At the Division of Labor blog he notes that although the nominal price of gold is back to its 1987 level, it remains much lower in real terms after adjusting for the 75 percent rise in prices since then. But the real price of gold has nearly doubled over the past four years, which he interprets as hedging against inflation:

“The upsurge in gold over the last four years suggests that investor confidence may be slipping again – and not without good reason. As Bloomberg reports: So far this year, consumer prices are rising at a 4.9 percent annual rate compared with a 3.7 percent increase at the same time last year.”

Yes, but . . .

So far this year, consumer prices less energy are rising at only a 2.0 percent rate -- down from 2.2 percent at the same time last year. Energy prices in the CPI rose 12 percent in September alone, but fell slightly in October.

If we look at the superior chain-weighted CPI, prices were up only 1.7 percent over the past twelve months for all items less food and energy. Food is rarely a significant factor (I'd prefer to drop the "core" measure), and food prices were up only 2.1 percent over the year while energy prices soared by 26.3 percent. Leaving out energy alone, the chained CPI would be close to 1.8 percent over twelve months. Since even chained price indexes exaggerate inflation, because of quality improvements and hidden discounts, an inflation rate of 1.8 percent for everything except energy is really quite low.

The main reason this distinction matters is not that rising energy prices don't hurt, or even that global oil demand is only indirectly related to Fed policy. The key reason we absolutely must look at inflation without energy prices is that energy prices cannot and will not keep rising forever. When they stop rising, we'll see what the underlying rate of inflation really is.

If the chained CPI less energy remains around 1.8 percent, then total inflation will likewise drop to about 1.8 percent if energy prices merely stabilize, and to a rate below 1.8 percent if energy prices keep falling.
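The convergence arithmetic here is just a weighted average of energy and non-energy price changes. A minimal sketch, assuming a hypothetical 8 percent CPI weight for energy (the weight is my illustrative assumption; the 1.8 percent ex-energy rate is from the discussion above):

```python
# Total CPI inflation as a weighted average of the energy and ex-energy
# components. The 8% energy weight is an illustrative assumption, not a
# figure from the post; the 1.8% chained-CPI-less-energy rate is quoted
# in the discussion above.

ENERGY_WEIGHT = 0.08   # hypothetical CPI relative importance of energy
EX_ENERGY = 1.8        # percent, chained CPI less energy

def total_inflation(energy_inflation, weight=ENERGY_WEIGHT):
    """Weighted average of energy and non-energy price changes (percent)."""
    return weight * energy_inflation + (1 - weight) * EX_ENERGY

flat = total_inflation(0.0)      # energy prices merely stabilize
falling = total_inflation(-10.0) # energy prices fall 10 percent
```

With these assumed numbers, flat energy prices put total inflation near 1.7 percent, and a 10 percent energy decline pulls it below 1 percent, both under the 1.8 percent ex-energy rate, which is the point of the paragraph above.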

It is theoretically possible that non-energy prices might accelerate if energy prices fall, because cheaper energy frees up cash to spend on other things. In the past, however, spikes in energy prices in 1974-75, 1979-81, and 2000 were always followed by slower inflation in non-energy prices for at least a year or two. The Fed's notion that inflation spreads like a virus from energy to everything else is factually false.

Non-energy inflation is now lower than it was in any year from 1967 to 2001, and also lower than last year. So relax and enjoy a happy new year. But maybe it's time to trim those hedges.

I had to go out in the wee hours to get medicine for my infant at a Times Square pharmacy. The trip felt ultra-safe. I could have been walking through Disneyworld. One would not be able to say the same of Atlanta, Houston, or Birmingham. Message to those cities: try electing a Republican mayor every once in a while, even of the nominal type. Might improve your chances of attracting a little tourist revenue.

2. What You Get With Monopolies

We took taxis on a couple of occasions. Both times, one felt as though he were dealing with a mercenary instead of with a businessperson or a service provider. It's less "where do you want to go" and more "come with me if you want to live."

New York might consider dropping its system of authorizing only certain taxi services and letting everyone compete who is willing to honor safety regulations. The market is captive right now. And it shows.

3. The Democratization of Cuisine

It was once the case that you had to travel to a great metropolis or abroad to get outstanding food. No longer. I've had the opportunity to dine in a wide variety of locales, and it is clear to me that you can get really good food almost anywhere there is a market of reasonable size.

So the food alone may not make New York an attraction. What I think will keep NYC flowing with tourists is Broadway. You simply cannot get live theatre in such abundance and quality anywhere else. Broadway is a fabulous distinctive.

Tom Peters' team likes to point out how influential he is. I've noted before that I have enjoyed reading his books, but his trendiness and political correctness become a little insufferable at times. Still, when all is said and done, I think his work will not outlast that of Peter Drucker, who died recently after an amazing career.

Checking out the Peters website recently, I ran across this unsightly bit:

11.28 cover tribute to Peter Drucker, called him ... "THE MAN WHO INVENTED MANAGEMENT." Maybe he "invented" management—highly unlikely, since British trading companies among others have been doing it brilliantly for about half a millennium—but he sure as heck didn't "invent" leadership. (Nor say much about it, for that matter.)

Not very nice, Mr. Peters, especially when one is talking about the most eminent management theorist of the last half century and the gentleman with whom you like to think of yourself as competing.

Sunday, December 04, 2005

This very poignant - and classy - elegy by President Gerald Rudolph Ford for the late Hugh Sidey was published a week ago in the Washington Post. It only came to my attention on Friday, and I believe it is worth commemorating here, hardly less timely for being a week late.

Obviously, Ford has writers. But just as clearly, the sentiments are his, and they provide a rare window into the persona of our nonagenarian ex-President.

An apology is in order. It seems that in my haste I have made some waste, failing to provide a link for my fellow Clubsters to enjoy my column of Thursday last. This is a humorous exploration of the white lies that are woven into the colorful fabric of our lives.

A foretaste:

Imagine that we declare National Truth Day. Every husband will tell his secretary that his wife does understand him. In fact, having nursed him through various ailments and depressions, she understands him much better than you ever could sitting behind your desk with a People magazine.

And this:

Students will turn to professors to admit that the term paper about lowering crime by aborting black babies which was graded "chillingly racist but refreshingly irreverent" was bought for 100 dollars on the Internet and originally written by Bill Bennett as an undergraduate. (Just kidding, Bill.)