Friday, June 30, 2006

After sitting neglected in my library for two years, I just finished reading Edward Cline’s Sparrowhawk: Book II: Hugh Kenrick this afternoon. I regret every hour that I postponed reading Cline’s work, for he has crafted an utterly stupendous literary achievement—an achievement made all the sweeter to behold given our friendship and his frequent contributions to the Center’s advocacy.

I hold that after Ayn Rand, Edward Cline is perhaps the first true Objectivist artist—not in failed attempts or half-realized aspirations, but in actual concrete execution. And this is important to note, because despite all the philosophic power of Objectivism and Ayn Rand's artistic example, I often have a hard time finding value in much of what is attempted as Objectivist art today. This has nagged me for years: many of these artists seemingly have the right philosophy and motives, but simply lack the willingness and discipline to fully train themselves in their medium (be it paint, sculpture, or fiction). Their failures I cannot explain, other than to say that they seem to believe that philosophy alone can make up for a deficit in training, craftsmanship and skill.

Not so with Cline's efforts in Sparrowhawk: Book II. Here Cline offers a stunning portrait of a young English aristocrat who transforms himself into a man of mind, integrity and action in the years preceding the American Revolution. As I read his book, I found myself pausing to note how exquisitely it was constructed. Action, such as a sword-fight between the hero and a villain, impacts the reader with deliberate force. Dialogue, such as a discussion of philosophy among great-minded friends, illuminates the fundamental truths of the characters' (and our) existence. No character moves causelessly; hero or villain, the soul of each is given life and shapes the plot with alacrity and conviction. And as a historical novel set in a bygone era, it unabashedly integrates the language and ideas of the time to masterful effect. For years novelists and filmmakers have struggled and failed to capture the spirit of the American Revolution and the changes in men's minds that preceded it. Not anymore—not with this work by Edward Cline. It honors the rebels of the past, and provides fuel for those who would make our future.

And thus I simply cannot fathom why this achievement has not been heralded and proclaimed by Objectivists. I hope I can redress this failure by saying that you, reading these lines, must find and read this book, and discover with your own mind what I attempt to describe here. I think you will find yourself as inspired by it as I am. Fortunately for me, the rest of Cline's Sparrowhawk series is already in my possession, and I now look forward to a long holiday weekend enjoying each of them.

Thursday, June 29, 2006

So said Ellsworth Toohey to Peter Keating near the end of Ayn Rand's The Fountainhead in a speech in which he explains the method of his plan to rule the world.

I have commented in the past on the folly of Bill Gates spending his billions on fighting various "ills" without bothering to examine their causes. Now Warren Buffett will donate $37 billion to the Bill and Melinda Gates Foundation, or about 85 percent of his estimated worth of $44 billion. Bill Gates himself is worth an estimated $50 billion.

"Now worth $30 billion, the Gates foundation is one of the world's richest philanthropic organizations," says a Reuters article. "It has committed millions to fighting diseases such as malaria and tuberculosis in developing countries, and to education and library technology in the United States."

It is not just the wasted billions that cause the mind to reel. It is also, among other things, the utter futility of the gestures. "Developing countries"? Read "Third World" backwaters that will never develop beyond what they are now: incubators of poverty, starvation, disease, death and tyranny. Their various inhabitants and rulers are as clueless about the political and economic causes of malaria, tuberculosis and poverty as are, apparently, Gates and Buffett, who have far less excuse for their ignorance.

What is astonishing is that neither Gates nor Buffett grasps the origin of his wealth, nor apparently has either ever bothered to ask himself why there is a difference between the American standard of living and wealth, on the one hand, and the destitution responsible for the diseases they wish to combat and cure, on the other. Or, if they have sensed or identified the difference (and given their public statements, there is no evidence they place any importance on it), their altruistic premises trump any distinction.

The folly must be examined in order to understand what will and will not be accomplished by pouring billions of dollars into the bottomless pits of the needy around the globe, and by perpetuating the ever-deepening sinkholes of American public education.

Let us first note that charities produce nothing. They are eminently non-productive. The United States is rich because it is productive, because so much created wealth was invested in other productive enterprises, and only a fraction of the produced wealth was ever donated to charities. (And we will leave aside for the moment the incalculable wealth confiscated by the U.S., state and other governments, also non-productive entities, dedicated to such boondoggles as Social Security and rebuilding New Orleans.) Perhaps the fortunes Gates will invest in research to cure malaria (God forbid he advocate the application of DDT) and tuberculosis will actually produce the hoped-for cures. Fine. That will leave the cured to endure poverty, starvation and other diseases, not to mention the turmoil, anarchy and tyrannical brutality of the countries in which malaria and tuberculosis might be checked.

Let us also note that, in regard to the wealth Gates will donate to the public education system, the students who will be the immediate or direct beneficiaries of that money, for as long as they are hostages of that system, will not emerge brighter students or super achievers. By all the direct evidence of plummeting test scores and the inability of increasing numbers of students and young adults to think, read, do simple math, and write, learning how to use technology or some souped-up library or data system will not turn them into independent individuals capable of emulating Gates's business success.

What Gates overlooks or is oblivious to is that the education system is committed to turning young people into selfless individuals who defer to arbitrary authority and regard themselves as mere cogs in society, some more adept or skillful than others, tolerated as long as they remain obedient ciphers.

And for as long as Gates and Buffett are lauded as role models of "responsible citizens" and exemplars of sacrifice and "giving back," any given student will be discouraged from developing a personal, selfish ambition, and never encouraged to ask the question: Give back what, and to whom? This is a more potent consequence of their actions, more potent than any amount of money they may donate. Perhaps the most perilous thing Gates's folly will accomplish is the further "legitimization" of selflessness as a "noble" virtue.

These students will end up as shortsighted or blind as Gates and Buffett must be in any realm beyond their businesses, and the realm in which they are most blind is the moral foundation of capitalism and freedom. By announcing their intention to squander their wealth in a prolonged orgy of altruism, they betray the very vestiges of the morality that allowed them to succeed in their businesses. Obviously, throughout their entire careers, Gates and Buffett accepted the idea that greed and personal ambition were either evil or irrelevant. It was "practical" to make a profit, but not moral. Ideas? Principles? Free minds? Championing capitalism? No. Apparently, bridge games are the limit of their intellectual efforts.

Their altruist campaign to "do good" by "giving back" is evidence of what could be called moral autism. Webster's New Collegiate Dictionary defines autism as an "absorption in phantasy to the exclusion of reality." The Oxford Concise Dictionary adds a term directly related to altruism in its definition of autism: "Morbid absorption in fantasy," the term "morbid" medically indicating disease or an unhealthy condition, combined, in Gates's case, with an obsession to cure the ills of the world. And altruism is the progenitor of myriad fantasies. It requires leaving reason behind and focusing on ridding the world of an "ill" with no reference to reality.

I am not certain that experts have determined whether clinically defined autism is a consequence of physiological disorders or psychological ones, or a combination of them. But I am certain that moral autism is a consequence of a profound philosophical disorder: the automatic suspension of reason where moral values are in question and a departure from reality, a condition required by altruism.

A medically certified autistic person may not have any control over his condition, but the moral autism of Bill Gates, Warren Buffett, and countless individuals, is a matter of choice. If the mark of autism is a "disconnect" from reality in favor of a fantasy in which reason is neither applicable nor welcome, then Gates and Buffett are morally autistic. For them, there is no rational causal relationship between reality and morality. They can be brilliant in business, but become congenital idiots in the realm of morality.

Another observation is that all those who are praising Gates and Buffett are gloating in self-righteous vindication of the altruism they have been promoting all their lives and careers, happy that such enormous wealth will be consumed in altruist programs. What is obscene about this event is the glib sanctimony of Gates and Buffett, and the smug sanctimony of those who approve of the give-away.

It would be interesting to hear the reception these same altruists -- in the news media, in universities, by politicians, in churches -- would give Gates and Buffett if these men announced instead that they planned to devote their billions to educating Americans on the values of reason, capitalism, and liberty, and to rediscovering the America that the Founders intended this country to be -- a land of the free, not a home of the selfless.

Our nation was young when Charles Cotesworth Pinckney, an American minister to Republican France, replied to an official French request for graft, "Millions for defense, but not a cent for tribute." Altruism has so warped the character of our nation that now the reply is: "Billions for boondoggles, but not a cent for reason."

Moral autism is a disease that can be combated and cured only by advocating reason and capitalism. A nation that treats it as a normal, even exemplary, condition will not know why it is perishing.

Wednesday, June 28, 2006

The chum that is being thrown out this cycle is just so uninspiring. It goes without saying that I often disagree with the editorial stands of The New York Times, but the Republicans' recent charge of treason over the newspaper's disclosure of a secret government program to monitor international finances is simply absurd. Why? Because if information about a covert program was leaked to the public, it was leaked by an administration official—the administration official caused the leak to happen, and he bears the burden of responsibility for any negative fallout, not The New York Times. Furthermore, the existence of a monitoring program was common knowledge; according to the AP, it had been alluded to in a UN report as early as 2002.

So why then is The New York Times the focus of all this high and mighty criticism while the source of the government leak is hardly an afterthought? That's easy to answer—attacking The New York Times appeals to the Republican base during an election cycle and steers attention away from the government's own failure to police its ranks.

What bothers me is that we've seen this before. Remember the Martha Stewart case and the charge of insider trading? The "inside information" she was investigated for acting upon was the Food and Drug Administration's refusal to grant approval for the product of a pharmaceutical company. That information was leaked by a government official, but he certainly didn't pay for his leak; it was only Martha Stewart and the pharmaceutical company's CEO, Sam Waksal, who were sentenced to jail.

Will that be the story with The New York Times? It remains to be seen, but if I were them, I would start circling the wagons. When this administration wants to draw blood in order to score political points, it is utterly ruthless.

You have to hand it to the American Antitrust Institute, which has recently managed to take antitrust to a new low.

An antitrust probe into the explosives industry that started with a bang in the 1990s has ended quietly with an order signed by a federal judge in Utah.

U.S. District Judge David Sam wrapped up the matter earlier this month by distributing the remaining $48,887 of a $60 million settlement in the case to the Salt Lake Community Action Program (SLCAP) and the American Antitrust Institute in Washington, D.C. SLCAP advocate Karen Silver said her organization expects to use its share - $16,133 - for advocacy efforts involving utilities.

The amount adds up to pocket change for the mining industry, but it is a welcome addition to the SLCAP's coffers.

"We could always use the money," Silver said.

The American Antitrust Institute, a nonprofit education and advocacy organization whose mission is to increase the role of competition, was awarded $32,754.

The institute and SLCAP were among the groups that applied for the leftover settlement funds - a standard procedure when there is a small amount of money remaining after the plaintiffs' claims have been satisfied and dividing it up among a large number of plaintiffs would be impractical.

The case has its roots in a U.S. Department of Justice investigation that began in 1992 into whether the manufacturers and distributors of commercial explosives were conspiring to fix prices, rig bids and allocate customers among themselves, according to Washington, D.C., attorney Richard McMillan, who represented mining companies that alleged they were victims of an antitrust scheme. [Pamela Manson, The Salt Lake Tribune]

That's impressive--literally money for nothing. What do you say to CAC applying for leftover antitrust settlement funds, on the grounds that CAC’s mission is to increase the role of competition—by abolishing the antitrust laws?

Tuesday, June 27, 2006

"America is the land of the uncommon man. It is the land where man is free to develop his genius -- and to get its just rewards." ~ Ayn Rand

As Independence Day nears and with immigration a hotly debated issue, I'm reminded of how an atheist from the Soviet Union taught me what it means to be an American patriot.

Ayn Rand, author of Atlas Shrugged and The Fountainhead, wrote that America is "the greatest, the noblest and, in its original founding principles, the only moral country in the history of the world."

When I first read Rand's books and heard her lectures, many of which expressed equal adulation for America, I was a left-wing ideologue who questioned whether she knew that ours was a racist society that had enslaved blacks, stolen this land from the Indians, and exploited the poor, women and children. And yet, whenever I heard our national anthem, a prideful lump would inevitably form in my throat. Looking back, this tells me that I grasped, even as I bought into those vicious charges, that there was much, much more to America. So when I encountered Rand's bold, uncompromising praise and defense of the United States -- all made with arguments atypical of the average American patriot -- she struck a chord with me.

While conservatives claimed this land was "God's chosen country" to explain America's greatness, Rand asserted that this nation was the crowning achievement of the Enlightenment, the eighteenth century intellectual movement in which reason was championed, faith-based dogmas were challenged and broken, and religion's influence in all realms was substantially weakened. Thus our Founding Fathers, from Thomas Jefferson to George Washington to John Adams, Rand noted, were primarily pro-reason secularists or deists who founded, for the first time in history, a nation based on explicit philosophical ideas -- above all, that each individual has a right to his life, liberty and pursuit of happiness.

Rand recognized that what distinguished America from all nations, past and present, is its moral and political foundation: individual rights. That is, each man has a right to think for himself and pursue his chosen values in the pursuit of his own happiness. And so, no authority -- no gods, kings, popes, bureaucrats -- may dictate the course of any individual's life; that he may live for himself, "neither sacrificing himself to others nor sacrificing others to himself," as Rand wrote. She explicitly identified that America, at root, is a nation based on reason, individualism and rational self-interest, all ideas that she celebrated in her books.

Based on these rights and life-affirming values, and on its corollary capitalist economic system, America emerged as a nation of freethinking, hard-working, productive individuals. A land of scientists, inventors, entrepreneurs and businessmen who made possible an array of labor-, time- and life-saving advances or improvements -- including the steam engine, automobile, airplane, telephone, penicillin -- and thereby raised every man, woman and child's standard of living, prosperity and life-expectancy to unprecedented heights.

Rand's books taught me these facts, and also that what is fundamental to being an American is not any irrelevant characteristic, such as one's birthplace or race, but that one understands and chooses to live by the ideas unique to this country yet necessary to all men for their long-term survival, prosperity and happiness. Moreover, she taught that when evaluating historical figures, what is most relevant is not how they were like their predecessors, but how they distinguished themselves.

I therefore understood that our Founding Fathers represent a unique bridge between the irrationalities and injustices of the old world and the much greater heights that this nation has yet to achieve. So while some Founders owned slaves, for example, it is crucial to note that slavery, in some form, existed in virtually every pre-American society. And that what is most significant about figures like Jefferson or Washington is that they were the first in history to uphold the individual rights universal to all men, thus laying the moral and intellectual foundation for slavery's eventual abolition.

Rand understood that America could never have been a racist society and still risen to its unprecedented status, and she noted that inasmuch as racism existed, it was a force in the feudal-like, anti-capitalist, agrarian South, which lost the Civil War to the freer, capitalist, industrial North. She knew that America was not the backward, tribalist society some tried to paint it to be, asserted that this portrait was instead true of the native Indians, and contested the claim that they had a "right" to this land. "If a 'country' does not protect rights," she asked rhetorically, "if a group of tribesmen are the slaves of their tribal chief, why should you respect the 'rights' that they don't have or respect?"

In sum, Rand unabashedly countered the claims that America owes a God for its freedom and wealth, that we Americans must live for "the common good," and that our government must be a paternalistic redistributer of our wealth to provide others with everything from Medicaid/Medicare to Social Security. She taught that to be American, above all, means that one respect each individual's right to live as he sees fit and to keep what he produces and trades voluntarily with others to mutual advantage.

I'm thankful that Rand escaped the slave state of Soviet Russia, where millions of innocents were slaughtered based on such communist ideals as self-sacrifice, equality of results and an all-powerful state that dictated how others must think and live. I'm thankful Rand came to live in this nation, where she knew she was free to think independently and write books with innovative, challenging ideas, exemplified by the provocatively titled The Virtue of Selfishness. Finally, those books provide a foundation on which America can properly complete and ground her revolutionary principles and reach infinitely greater, unimagined heights.

Monday, June 26, 2006

Edward Rothstein of the New York Times critiques PBS's new weekly series "Bill Moyers on Faith and Reason." His depiction of author Mary Gordon (whom he reports placed thorns in her shoes as a child in order to school herself in Christian martyrdom) sent shivers up my spine.

Ms. Gordon suggests that "there are two major narratives in the world, the narrative of fundamentalism and the narrative of consumerism." Given her own religious faith, she explains, she is much more comfortable imagining the inner life of a suicide bomber "than I am of Donald Trump"; she finds the terrorist mind, with its belief in eternal truth, "much more comprehensible."

Ms. Gordon says that whenever she sees people driving Hummers, "I want to just drive them off the road" — or worse. She could "go out on quite a spree," she says. What stops her from becoming a roadside bomber fighting for eternal truth, she explains, is her Christian belief that these "greedy" materialists "are sacred and valuable in the eyes of God."

Notice the smear that Gordon makes in labeling the "narrative" of the West as "consumerism." A consumer qua consumer produces nothing; he is a parasite. And the West that Gordon finds herself uncomfortable imagining, however inconsistent it may be, thrives upon production. After all, in order to have your Hummer (and keep it), you must produce something of value and trade it with others—you cannot have a Hummer in a vacuum.

In contrast, the suicide bomber sacrifices his life and the lives of others in the name of the afterlife, and it's this mentality that Gordon says she understands and has sympathy for, only choosing not to actually commit murder and self-immolation in its name. And note that it's not because Gordon wants to keep her life, or because she believes others have a right to theirs; it's the thin rationalization that even SUV drivers "are sacred and valuable in the eyes of God." Lucky for them, because if Gordon had her way, they would be food for worms.

In little more than two paragraphs, Mary Gordon reveals the cancer growing upon the soul of the West. A hundred bucks says she gave her interview (to be bounced off satellites and broadcast all over the world) in a climate-controlled room after eating a square, well-balanced meal, yet this vermin has the audacity to decry "consumerism." She prospers in the land of plenty—yet it's the murderers and their witch-doctor leaders with whom she feels a spiritual kinship. Count me as sick to my stomach with disgust.

Monday, June 19, 2006

It was inevitable, almost predestined, that Frank Capra's cinematic paean to selflessness and self-sacrifice, "It's a Wonderful Life" (1946), would be voted the most inspiring American film out of one hundred candidates by the American Film Institute. In a culture that values altruism as a primary, uncontroversial, not-to-be-questioned virtue, it is almost an instance of determinism.

On its official website, the AFI's director and CEO, Jean Picker Firstenberg, explained the purpose of the program that aired the choices on national television on June 14:

"The past few years have not been easy in America -- from September 11th to the devastation of hurricanes Katrina, Rita and Wilma. AFI's 100 years....100 Cheers will celebrate the films that inspire us, encourage us to make a difference and send us from the theatre with a greater sense of possibility and hope for the future."

The website notes: "AFI distributed a ballot in November 2005 with 300 nominated inspiring movies to a jury of over 1,500 leaders from the creative community, including film artists (directors, screenwriters, actors, editors, cinematographers), critics and historians."

"To make a difference," in the context of the Capra film, is a euphemism for selfless efforts on behalf of others, for "giving back" to society, to the "community," to the world.

The AFI program, broadcast under the title "Cheers," elaborates on its moral criteria of the "most inspiring":

"Movies that inspire with characters of vision and conviction who face adversity and often make a personal sacrifice for the greater good. Whether these movies end happily or not, they are ultimately triumphant -- both filling audiences with hope and empowering them with the spirit of human potential."

And therein is the clincher: "sacrifice for the greater good."

In previous commentary, I cited Bill Gates's decision to "give back" his billions as an auspicious instance of craven selflessness in a commitment to "make a difference for the greater good." It is his money, and he has a right to dispose of it as he wishes. One can think of a number of "worthier" things he could spend the money on than on the insatiable demands of the needy, such as the endowment of a university fully staffed by advocates of reason and freedom.

However, one would like to ask him: "On the premise that you are giving back to society what you took from it, what exactly is it that you took? Ideas for software? Programming innovations? If you concede that you originated those things, and not society, why are you branding yourself as a thief or a repentant debtor? If you concede that you took your customers' money in trade, why do you believe that you don't deserve every penny of it? Haven't your products revolutionized men's lives and made an incalculable difference? If you concede that you gave the public a priceless value, why are you willing to believe that it was immoral, immaterial, or irrelevant, and that you must make amends?"

But it is nearly futile to argue with a convert to altruism. One's only weapon is reason, and altruism is reason-proof. It derogates the self and selfishness. It is a corrosive that eats away at a mind and renders it progressively impervious to rational persuasion. That is why I rarely attempt to persuade an otherwise rational person of the folly and impracticality of his altruist beliefs. The transition from an altruist morality to one of rational selfishness is too great a mental task for a person who at least senses the rightness of a refutation of altruism; he would see that he would need to repudiate nearly everything on which he has based his life. It is too frightening or traumatic a prospect, and the person will choose instead to "blank out" without pursuing the subject privately or in conversation.

This is not so much a digression as it is an elucidation. To the AFI, the term "inspiration," in a literary or artistic context, refers almost exclusively to the motivation to practice altruism and self-sacrifice. It has nothing to do with what Ayn Rand called "spiritual fuel" to pursue or fight for one's values. In her essay, "What is Romanticism?" in The Romantic Manifesto, she writes:

"The archenemy and destroyer of Romanticism was the altruist morality. Since Romanticism's essential characteristic is the projection of values, particularly moral values, altruism introduced an insolvable conflict into Romantic literature from the start. The altruist morality cannot be practiced (except in the form of self-destruction) and, therefore, cannot be projected or dramatized convincingly in terms of man's life on earth...."

In that same essay, she notes:

"Romanticism is a category of art based on the recognition of the principle that man possesses the faculty of volition.....If man possesses volition, then the crucial aspect of his life is his choice of values -- if he chooses values, then he must act to gain and/or keep them -- if so, then he must set his goals and engage in purposeful action to achieve them."

Some of the films that made the top 100 list are "inspiring" for the right reasons, that is, they do not inspire one to devote one's life to others' needs or to sacrifice anything, but dramatize the pursuit of personal values. The values they dramatize the pursuit of are as varied as the subjects and themes of the films. And some of them dramatize apparent sacrifices which are actually actions taken at risk to preserve values.

To cite an example from the AFI list, "Gunga Din" is about a water-carrier for the British army in India. He wants to be a regular soldier in that army, but is scoffed at for his ambition. He risks his life to warn the army of a trap, and is killed. This is not so much a "sacrifice" as his achieving his goal of being a soldier (and his knowing the risks of being one). The same could be said about "Glory," in which the principal characters die as soldiers risking their lives to fight for their values. About these and a few other films that feature the risks of warfare, the last thing one would want to hear is President Bush pontificating on the virtue of sacrifice in relation to collectivist or altruist goals. Bush and Hollywood, ostensibly enemies, have more in common than either would be willing to acknowledge.

I personally find these stories inspiring. On the other hand, as a teenager I found the deterministic, Shakespearean "Lawrence of Arabia" inspiring not only for its numerous production values (such as direction, cinematography, casting, and dialogue), but chiefly because it suggested what would be possible if those same production values were applied to Romantic stories.

The majority of the films on the AFI list, however, fall somewhere in between value-pursuit and value-sacrifice, or have little or nothing to do with either end, such as "2001: A Space Odyssey." The list is as mixed as an altruist's premises. One revolts against the presence of some films on the same list as others. "Shane" and "High Noon" should not be in the same company with "Harold and Maude" and "Dances with Wolves." It is also worth noting that "The Fountainhead" did not make it to the list.

There is no room here to discuss all one hundred films on the AFI list of the "most inspiring." That would require a book. But an Associated Press article on the AFI list is instructive about the moral esteem in which "It's a Wonderful Life" is held in modern culture. It is the story of George Bailey, who surrenders his personal ambition to the needs of his "community" and is about to commit suicide when, as the A.P. article describes it, he "got a chance to see how ugly the world would be without him," that is, had he never been born and been conned into relinquishing that ambition. At movie's end, George's brother, referring to all the people in Bedford Falls George has "helped," proclaims him the richest man in the town.

"We all connect to that story," said Bob Gazzale, producer of the AFI TV special. "We may not all connect to the story of a fighter from Philadelphia or a singing family in the Austrian Alps. But there's no way to get away from the inspiring story of George Bailey. It relates to us all."

No, it does not, if by "relate" he means that we all have the potential for selflessness or self-sacrifice, or the capacity to tolerate it for the sake of others' needs, as George Bailey chose to tolerate it. The first time I saw the Frank Capra film as a child, I was repelled by it, and for a long time wondered why it was so revered. As a novelist, I have always wanted to rewrite that story. But Ayn Rand beat me to it in Atlas Shrugged, the story about heroes who refuse to be George Baileys.

It would be interesting to speculate on whether or not Bill Gates, now the richest man in the world, found "It's a Wonderful Life" the most inspiring movie he ever saw, and whether or not he ever privately wondered, at the peak of his career, when he was being sued by rivals and hounded by the U.S. government and the European Union, what the world would be like had he not pursued his own selfish ambition to create Microsoft, or if he now withdrew the products of his mind.

That, however, would necessitate the self-esteem of a man proud of his achievements, together with a knowledge of the injustices perpetrated against him. Bill Gates lacks both that self-esteem and a sense of justice; he is motivated by humility and mercy, the twin enemies of justice. He meets the criteria of a sacrificer for the "greater good."

Friday, June 16, 2006

"Mitchell Layton had inherited a quarter of a billion dollars and had spent the thirty-three years of his life trying to make amends for it." (The Fountainhead, p. 579, Centennial edition)

The estimated personal worth of Bill Gates, age 49, chairman of Microsoft, the 4th largest company in the world, is $50 billion, all of it earned, not inherited, and he has proposed devoting the balance of his life to making amends for it. Bill Gates announced yesterday that he will be spending less time running Microsoft and more time on his charity work, "giving back" to the world.

Let us say that Gates somehow manages to conquer malaria without the benefit of using DDT. Fifty million children are saved. Then what? What are they going to do with their lives and health on a continent plagued by dictatorships, poverty, and corruption?

Gates believes in "giving back" his billions. His Bill and Melinda Gates Foundation is endowed with $30 billion. In another quarter, President Bush is motivated by the same altruist morality, that Americans should be willing to help Iraq achieve "democracy" (fallaciously equated with freedom) and sacrifice their lives in the effort. His program is tentatively projected to cost $500 billion, and rising. Making Iraq "safe for democracy" will somehow ensure America's security.

Pouring wealth down the bottomless pit of Africa or Iraq will not accomplish "good." Doing "good" without the least thought of the context and circumstances in which the "good" might actually have some tangible, beneficial results will have no results, or results that are inimical to all concerned parties. (The youth of Saudi Arabia, for example, benefit physically from billions in oil revenues; having no purpose in life, most turn into murdering jihadists.) Apparently Gates has devoted little or no thought to the necessary conditions that would ensure that children did not starve, contract AIDS, or succumb to malaria, just as Bush has devoted little or no thought to the conditions necessary to ensure any country's freedom, prosperity, and well-being.

One would think that such an elemental fact would occur to someone as bright as Bill Gates. But, one of the pernicious effects of an altruist morality in an otherwise rational and productive mind is that it necessarily, fundamentally, and incrementally dissolves the causal connections that lead to rational conclusions. Gates would not pour a fortune into the development of a software program that not only would not work, but would also damage or destroy an operating system or computer hardware. But he will spend his fortune to "do good" without the least consideration of what causes the "bad."

Altruism divorces the real world from the moral world, which is believed to be on a "higher plane" but somehow can influence the real world. To Gates, there is no connection between freedom, property rights, and the sanctity of the individual and the prosperity and well-being these things can make possible. He sees misery, starvation, and disease in Africa and other "undeveloped" regions of the world, and believes that money, not freedom, will eradicate them.

Gates's father, William Gates, recently appeared in the news to argue strenuously against the temporary repeal of the blatantly confiscatory estate tax by Congress, righteously claiming that wealth is a "privilege," that the wealthy actually have little right to arrange for the disposal of their fortunes on their decease, and that the government had a responsibility to tax it away for society's sake as a form of "giving back." One can imagine that Bill, his son, has been moral putty in the father's hands. One is tempted to cast him as an Ellsworth Toohey, and almost tempted to pity Bill Gates.

Asked in an interview which he would like to be remembered for, Microsoft or his charity work, Bill Gates replied that he didn't think it was important what he was remembered for, just as long as "good" was done. The question startled him, but he answered it almost immediately, with no evidence of offense or pride in his manner.

The news media stressed that Gates wishes to give his money away to programs that produce "results." The media also lauded Gates as a marvelous example to American children and young adults who are attracted to "volunteerism." I do not know the details of his purpose in pouring money into our ravenously wasteful and destructive education system (other than to introduce children to technology), but if that program is motivated by the same altruist spirit, the only lasting result will be the inculcation of more "selfless" ciphers, the Brown Shirts and self-sacrificers (and sacrificers of others) of tomorrow.

It is no cultural coincidence that the American Film Institute recently voted, out of one hundred candidates, that the most inspiring is "It's a Wonderful Life," the Frank Capra "classic" about a man, George Bailey, who surrenders his ambition to the needs of his "community." Bill Gates is another George Bailey. Reality emulates art again.

It is the daunting task of reason to destroy once and for all the myth that the selfless man is an exemplar of morality, beginning with Robin Hood and ending to date with Bill Gates. Ayn Rand was so right that Immanuel Kant and his numerous yea-sayers over the centuries are man's most evil nemeses.

Thursday, June 15, 2006

One of Britain's most prestigious art galleries put a block of slate on display, topped by a small piece of wood, in the mistaken belief it was a work of art.

The Royal Academy included the chunk of stone and the small bone-shaped wooden stick in its summer exhibition in London.

But the slate was actually a plinth -- a slab on which a pedestal is placed -- and the stick was designed to prop up a sculpture. The sculpture itself -- of a human head -- was nowhere to be seen.

"I think the things got separated in the selection process and the selectors presented the plinth as a complete sculpture," the work's artist David Hensel told BBC radio.

The academy explained the error by saying the plinth and the head were sent to the exhibitors separately.

"Given their separate submission, the two parts were judged independently," it said in a statement. "The head was rejected. The base was thought to have merit and accepted.

"The head has been safely stored ready to be collected by the artist," it added. "It is accepted that works may not be displayed in the way that the artist might have intended." [Reuters]

This story is poetic beyond compare. I wonder if the artist will see an increase in demand for his plinths, on the grounds that the Royal Academy found them to have artistic merit? And more fundamentally, will the artist leave modern art altogether, on the grounds that his plinth was recognized, but his head was ignored?

Clearly amused, Mr Hensel said: "Anything, even if it is not intended to be art, can still have a presence. I like the look of the plinth and support. I can recognize it as a nice object. But I never thought the selectors would choose it as an exhibit."

This story makes me both laugh and cry. Here, one sees plain evidence that the Royal Academy is corrupt and that it has reduced itself to an object of ridicule, yet all the artist can do is equivocate for them, saying he understands how they were taken in by his slab's "presence." He's so utterly blinded by modernism and the cult of the ugly, he can't even see the art world for the fraud it has allowed itself to become--even when it negatively impacts the presentation of his own work.

Wednesday, June 14, 2006

Not only do eighteen people die each day while waiting for an organ; many people also suffer as they wait. Why are so many people suffering and dying while waiting for organs? Because the acceptance of altruism has convinced Americans that it is better for some people to suffer and die than it would be for others to donate organs for a profit. Profit is plainly selfish; thus, according to altruism, it would sully the whole "beautiful" altruistic "ideal" of people giving away organs for free.

Moreover, on the premise of altruism, it is wrong for those who possess more wealth or more virtue to benefit from that fact; thus, the authorities must see to it that the distribution of organs has nothing to do with who can afford to purchase an organ or whether the recipient is an innocent child, or a heroic soldier, or a convicted murderer.

Altruism treats life as a hospice, and the more able or worthy you are, the more you deserve to suffer and die. As Biddle notes, "The solution to such atrocities is for people to repudiate altruism and embrace egoism."

Tuesday, June 13, 2006

All right. Two Saudis and one Yemeni committed suicide at the Guantanamo Bay prison. And? As one correspondent of mine remarked: "Since Muslims are committing suicide on a daily basis all over the world -- and killing as many others as is possible with themselves -- what is so hard to believe about three suicides in a jail?" Remember that every one of the 460 detainees at Gitmo was either taken in combat against U.S. forces in Afghanistan or Iraq or elsewhere, or taken as a suspect with terrorist or Taliban connections, and scheduled to be tried by a tribunal.

It is hard to believe if reality does not conform to one's wishes.

Remember that these are not "rockin'" fans of the Dixie Chicks or gentle Bono groupies or twittering sycophants of Muslim-patronizing Prince Charles of Britain, spirited away from Pennsylvania Avenue or the Strand and unlawfully incarcerated without charge. These are men who would just as soon cut the throats of American civilians with box-cutters, hijack another planeload of them and smash it into the U.S. Capitol in an act of suicidal jihad. Or at least stockpile bags of ammonium nitrate fertilizer to grow more piles of Western bodies and rubble.

No, if you listen to the news media, you are not to remember that. You are to buy the story that these three succumbed to the "stench of despair," as Mark Denbeaux, a law professor at Seton Hall University described the "plight" of prisoners at Gitmo. Denbeaux and his son represent two Tunisian prisoners there. It did not occur to the writer of the Associated Press report that quoted Denbeaux to wonder: Who is paying Denbeaux's retainer? CAIR? Or some other Islamic front organization funded by Saudi Arabia? Attorneys cost money. So does the judicial system, even for pro bono lawyers. But don't expect any investigative Pulitzer Prize-winning stories to result from that tidbit.

You are not to remember either the extent to which the American military has gone to accommodate Islamic customs at Gitmo in terms of prayer times and prayer rugs, food, free copies of the Koran, not to mention all the medical services, cleaner clothing than most of them ever wore, and other perks that no American prisoners of war ever enjoyed in any war of the 20th century. The U.S. has gone more than the whole nine yards to fend off accusations by "human rights" organizations that it is mistreating prisoners, even to the extent of calling them "detainees" and not "prisoners of war."

No, you are to empathize with their suffering, not your own or that of Americans whom these "detainees" have killed or would have killed if not captured. You are to forget that every one of them acted in the name of a totalitarian ideology that regards due process, individual rights, and freedom as the corrupt practices of men to be either killed or enslaved.

A measure of the news media's virulent hatred of the U.S. is how quickly and eagerly it will jump on any rumor of American misbehavior. Its malevolent glee at a chance to knock the Marines -- the proudest and least politically correct American military service -- over Haditha must sit in the craw of anyone who has ever been in combat against Islamic "insurgents" or lost a friend or relative to these "freedom fighters." You want to put your fist through the TV screen and wipe the sanctimony from the faces of Charles Gibson, Matt Lauer, Diane Sawyer and their patronizingly skeptical brothers and sisters elsewhere in the media.

The crime this time is that the American news media especially is willing to grant credence to our enemies first -- an enemy that knows how to work the West's multicultural and relativist premises to his full advantage -- before examining facts or even recognizing that there are such things as facts. Observe, for example, how the media dwells on the Israeli mortar shell dropped on a Gaza beach, killing some "innocent" Palestinian civilians. You watch the footage and if you have half a brain, you must ask yourself: Why does this look so staged? What was a cameraman doing there with a camcorder and audio? Why does the little girl behave like she is following directions?

You can almost hear the Hamas's verbal cues. "Now, run along the beach looking for your father. Don't look at the camera! Okay. Now you see him. I'll put the camera on him, and then you see him and flop into the sand and roll back and forth hysterically, screaming anguish and bloody murder. Try not to look up at the camera, or it'll look phony. Hey, great work, little one! Now we have an excuse to fire more rockets into Israel. To hell with their apologies. We want to kill Jews. Say, little one, how would you like to wear a pretty new vest?"

But, back to the Gitmo suicides. The Associated Press reports General John Craddock, commander of the U.S. Southern Command, saying that the "suicides were part of Islamic militants' holy war against the United States and its allies." "They're determined, intelligent, committed elements," said Craddock, "and they continue to do everything they can...to become martyrs in the jihad." "Militants"? Not prisoners of war?

Fine. Let more of them commit suicide and martyr themselves. Give them the bed sheets and maybe some nylon rope. It will mean fewer hostile mouths for U.S. taxpayers to feed. It's a thought, but all 130 Saudis at Gitmo could be freed by herding them onto Air Force transports for "release" over Riyadh, together with about a thousand 500-pound bombs targeted on various palaces of the sheiks, the mourning tents, and mosques.

The Associated Press reports of June 12th on the suicides read like an Islamic agony column. (A correspondent of mine queried whether or not the Associated Press and Reuters might be sub-cells of Al-Qaeda, which is not so wild a hypothesis, since rich Saudis are stealthily buying interests in Western news organizations.) Ample space was given to the likes of Denbeaux and his ilk in the European Union and Saudi Arabia, all of whom commiserate over the "detainees" and who call for the closing of Gitmo and release of the "detainees." Very little space was devoted to the American position. Most Saudis don't believe the deaths were the result of suicides, or if they believe they were suicides, they were brought on by "torture."

"A crime was committed here," Kateb al Shimri told the Associated Press, "and the U.S. authorities are responsible." The Associated Press went on to say that Shimri echoed "the general sentiment heard in the Saudi capital." Shimri is a Saudi lawyer representing relatives of Saudis held at Gitmo. He plans to sue the U.S. government for compensation on behalf of the relatives of the suicides. "Many Saudis denounced the suicide claims as a fabrication, and some accused the U.S. authorities of complicity in the inmates' deaths."

"They were killed; they were murdered," one mother of a Saudi prisoner of war wailed. "This was no suicide." This from a Muslim woman who would have celebrated her son's death had he wandered into an Israeli pizza parlor and blown himself and twenty people up. That action, presumably, would not qualify as killing and murder. Does anyone out there see the deadly double standard that is eroding the separation of Western and Muslim cultures, the moral chasm that divides the life-giving values of the West and the death-worshipping cult of the East?

Finally, I quote another Saudi whose veracity is demonstrably impeachable, Mufleh al-Qahtani, deputy director of the Saudi kingdom's Saudi Human Rights Group. "There are no independent monitors at the detention camp," he said to the Associated Press, "so it is easy to pin the crime on the prisoners, given that it's possible they were tortured." Also, the A.P. article reported that "The kingdom's semiofficial human rights organization called for an independent investigation into the deaths of the two Saudis."

I submit that a Saudi "human rights" organization is as much an oxymoron as a Mafia-run squad that offers crime victims trauma and bereavement counseling. It would be the stuff of satire were it not actually happening.

I submit also that it is beyond bizarre. Doubtless Shimri and al-Qahtani speak with the approval of the Saudi government. Saudi Arabia last week complained that the State Department included it in a list of twelve countries that deal in "human trafficking" (also known as slavery), and that this was "unfair" in light of American guilt. Prince Turki al-Faisal, Saudi ambassador to the U.S., speaking to a group of Nashville businessmen, said that "We read in American media and the press about the mistreatment of illegals who come to the U.S. seeking work and end up in brothels and gangs and unacceptable servitude, whether in factories or at farms, and yet that is not mentioned in the State Department report."

More sanctimony that invites a punch in the face. It cannot even be called "hypocrisy." You see how slyly and effectively the double standard of fatal altruist/pragmatist/multicultural Western premises can be used against us. You can see it; President Bush and Condoleezza Rice cannot. Or will not. Al-Faisal and his ilk know how to work a crowd of dupes and apologists and exploit our own double standard of good and bad premises. "You are altruists, but not perfect. Do not presume to throw stones at us. What you call crimes and abuses, you are guilty of committing." And the dupes and apologists and aging hippies in three-piece suits nod in sad concession.

Saudi Arabia is a medieval dinosaur that also respects honor killings, castrations of boys, the subjugation of women, and tribal vendettas, supports kindergartens for killers called "madrasas," funds the jihad against the West through various "charities," foundations, and oil revenues, and regularly practices extortion, especially against the U.S. Saudi and other Islamic mouthpieces in the U.S. call for Sharia law to replace the Constitution. (Since that is an assertion of a "religious belief," it cannot be defined as advocating treason, even though Islam makes no distinction between "church" and state.)

It would be interesting to listen to a debate on the subject of which Muslim country is our deadlier enemy: Saudi Arabia or Iran. If I were a judge of such a debate, I would be obliged to call a tie and give both sides an equal number of marks.

[L]et's do a quick recap:

- Iran is clearly trying to develop a nuclear bomb. Everyone knows it; no one disputes it (besides Iran).
- Iran is an indisputable enemy of the West. Weekly prayers include "Death to America" chants. Its president openly calls for the destruction of Israel, and openly expresses his goal that Islam should rule the world.
- Iran would use a nuclear bomb. Iran is ruled by Islamic fundamentalists with a messianic vision about the coming end of the world. These are not rational people. They "love death", as they openly tell us (and as Islamic suicide bombers prove weekly). They would be exhilarated by the chance to martyr themselves, as long as they could take us with them. A strategy of "nuclear deterrence" doesn't work with irrational people who think death is great.
- "Diplomacy" with an irrational life-hating dictatorship is dishonest and self-defeating. It is grotesquely irrational and immoral to seek to reward someone in exchange for not killing us. Isn't it blatantly obvious what behavior that encourages?
- Nothing we say or promise is going to stop Iran from developing nuclear weapons in any case. There is nothing in this world we could give them, or that they would want, that could persuade them to cease and desist. They don't care about "this world". Their focus is on the "next world"--which, according to their beliefs, a nuclear bomb will help to bring about.

Bottom line: Iran wants to destroy us. We don't want to be destroyed (well, I guess I can't speak for the Europeans). There is no middle ground here. There is nothing to discuss, debate, or negotiate.

There is only one "diplomatic message" that needs to be sent to Iran: Stop developing nuclear weapons, or we will destroy you. And we mean it.

Sadly we are still wasting time treating the Iranians with kid-gloves when open warfare has long been overdue. As the elimination of Zarqawi shows, this war is winnable, these murderers are not invincible, our military is more than capable of destroying them. All that is required is that we commit ourselves to American self-defense.

Continuing the Islamic theme, Jason Pappas offers the following observation about failure to call a spade a spade:

Mainstream political and intellectual writers are unable, on principle, to face the barbarian nature of the enemy’s culture. Instead, they blame America. Both Democrats and Republicans argue over who can engineer a better world in Iraq and win “the hearts and minds” of the Islamic world. It’s we that have to change, not Muslims. We’re the problem, according to this analysis. If they haven’t embraced the liberal democracy that we’ve patiently and generously offered, we must have done something wrong. (Too few troops, too many troops, not enough U.N. troops, too much humiliation, too little force, too soon, too late, etc.)

The complete blindness to the inherent failure of Arab societies is captured in Colin Powell’s quip on Iraq: “we broke it, we own it.” If Saddam’s Iraq was Colin’s idea of a working nation, let’s hope we never have Powell as a President.

Mike also doesn’t like what he is seeing in the recent negotiations with Iran:

So we are trying to get Iran to give up its nuclear bomb intentions by giving it a guaranteed supply of nuclear fuel! No wonder Iran is willing to "study" the package. They probably can't believe it either. That's like the homeowner offering the thief a guaranteed supply of crowbars in the hopes the thief will use them for "peaceful purposes."

Of course, the homeowner (West) refuses to identify the fact that such a policy will result in all other thieves (thugs) noticing what works and presenting the same demands to the homeowner (West) until one day he discovers that his money and silverware (freedom) and whatever else he had to negotiate away, are gone. Such is the logical result of ignoring the existence of, and compromising on, principles.

What I find laughable is a nation sitting upon a massive underground lake of oil claiming it “needs” to develop nuclear energy. Yeah, right—like Antarctica needs to develop ice.

Here, Andy Clarkson looks at news reports on “The College of Rational Education,” a project involving Eric Daniels and Gary Hull. This must be yet another “best kept secret” in Objectivism, because until I heard of it from Andy, I knew nothing of it.

Two good posts from the always good to read Gus: the first is on a recent call that Google be regulated by the government because it is so “large”—made by none other than a conservative lobby group—and the second describes Huey Long, Louisiana’s infamous populist governor, and the many parallels between his reign and today.

Pity the Poor Objectivist Center, now attempting to recast itself as the “Atlas Society” in a seeming attempt to be less Objectivist and more Objectivish. Diana Hsieh eviscerates them accordingly:

This change of name is good news -- and not just because it's yet another highly visible example of the organization's incompetent floundering. The name change distances the organization from Ayn Rand's philosophy of Objectivism. After all, the symbol of Atlas refers to far more than Atlas Shrugged. Given the origin of the symbol in Ancient Greek myth, the name "The Atlas Society" does not necessarily imply Ayn Rand.

Of course, this new "Atlas Society" will still claim to represent Ayn Rand's philosophy -- at least for a while. They've been explicitly distancing themselves from that prickly philosophy of Objectivism for some time now; it's just too uncompromising for Ed Hudgins. The new name will allow them to do that so much more easily. I wouldn't dignify that shift by calling it more honest, but it will be more accurate.

This has to be my favorite sculpture. Even in a photograph, I cannot look at it for long without being moved to tears. The woman reaches up for love. She touches him tenderly, bare of soul. He lifts her head to his lips, and they unite in a circle beneath his hopeful gaze. An exalted human experience, love and passion triumphant!

I agree. I recall that earlier this spring Sherri Tracinski attacked a similar sculpture by Daniel Chester French because it had wings and was allegedly named after a passage in the Holy Bible (a point that seems to be a matter of debate among art historians). Tracinski’s position was that French’s sculpture was an unreal representation of romantic love—and that no artist, save for Sandra Shaw, has been able to accurately capture love in art.

Um, yeah, right. If you look at art such as that depicted in the photo and all you see is an attack on existence, you need to tone it down a notch. A pair of wings ain't the enemy in art . . .

Although this was a relatively quick and somewhat experimental painting, I have to admit that I love the end result. It will take some time for the thick white paint in the brighter areas of the fireworks to dry completely so that the painting can be varnished and professionally photographed, but hopefully at that point I can make a better image available. Until then, enjoy New Year's Eve and please send in any last comments or questions.

For his age, Larsen is a deeply talented artist. He’s also a man seriously in love with portraying people’s back-sides. I think Larsen will take his art to the next level when he is able to master the human face—and can portray a face that is alive, intelligent, and shows the viewer things like that magnetic form of engagement that we see when we witness the greatest and the beautiful, or a heart that has found serenity. I think if he wants it, it's his for the taking . . .

Amanda Carlson recently celebrated the reasons for her love for Art Nouveau and Art Deco:

The wonderful thing that I think best characterizes both Nouveau and Deco is that it is functional art. They enliven menial everyday items with inspirational art, not by pasting art on top of things, but by making the style an integral/natural part of the structure of the things one creates. Nouveau does it in a flowing, curvy, often described as "whiplash" style (usually busy). Deco does it in a geometric, angular, bare-bones sort of way. But the same glorious idea that I adore applies to both: beauty and elegance are necessary in the structure of living, and not to be added as an after-thought.

“Border crossers” is, I guess, the new, official PC term of evasion. They used to be called wetbacks, but that was judged too harsh and even “racist.” After all, Americans wouldn’t want to hurt the feelings of those who flout our laws and national sovereignty. So, the new term became “illegal alien.” While it was an increase to two words and four syllables to say the same thing as one word, it was still accurate.

Accuracy was still a problem for the arbiters of language. Accordingly, the new, new term became “undocumented workers.” Now we are up to seven syllables to say nothing. “Undocumented,” as if the main problem with these invaders is a paper work hang-up. While shorter, “border crossers” is even more absurd. Millions cross our southern border legally every year. The purpose of the new PC term is to evade the distinction between the law abiding and the law breaker.

You almost have to admire it—the ability to reframe the debate by recasting the terms.

Just in case you missed it, Principles in Practice is the blog of The Objective Standard. There, Alan Germani writes about several women in Saudi Arabia who had female-to-male sex-change operations.

Not being able to drive cars or move freely are minor examples of the oppression women face in Saudi Arabia and other Islamic theocracies. Arranged marriages, domestic abuse, and honor killings are regular aspects of Muslim women's so-called lives. When their alternative is to become a man or to suffer a lifetime of psychological and physical abuse, the big surprise is that more Muslim women haven't had sex-change operations.

I’d suspect that if you are in a position to change your gender, you’re in a position to leave the country. What I would like to know is the number of women who attempt to flee Saudi Arabia in search of better environs.

At American Renaissance, Steven Brockerman offers a short biography of Ken Iverson, CEO of Nucor and pioneer of the American mini-steel mill.

Nucor planners, engineers, contractors and workers gather. A monumental struggle begins. Seemingly insurmountable obstacles arise, followed by spectacular failures—mounds of capital are expended at an alarming rate—a growing doubt spreads among Nucor investors—naysayers are popping off in the press left and right—and, silently, America’s industrial tycoons, for whom steel is their companies’ lifeblood, wait in agonizing suspense.

Then: heroic perseverance—brilliantly ingenious solutions—increasing successes—a muted but steadfast and growing determination—and, in the end, glorious, magnificent triumph! And above it all the while, leading the way—tough, certain, unflappable, his eyes ever focused on the goal—stands Kenneth Iverson.

While some Objectivist blogs are cool, this one is awesome, and here Mike rips a religionist’s attempt to say that the Ayn Rand Institute supports genocide. You’ll just have to see this one for yourself.

David Vekslar reports that chemistry sets and model rocketry are about to become illegal in the name of "Homeland Defense."

This is a sad development indeed, as many of America’s great inventors got into technology experimenting with chemicals and home-made fireworks.

Indeed. I for one loved my model rockets as a kid, and I look forward to introducing my future children to them and other “dangerous” hobbies as well—that is, if Congress doesn’t get in the way first.

Here’s a new quasi-blog that’s been brought to my attention. Here the “inspector” takes on the death tax:

Consider the very idea of a Death Tax, for a moment. The deceased has already paid whatever taxes were demanded in the first place when he earned his wealth. If he wanted to bequeath this money while he was still alive, he wouldn’t have to pay a tax on it first. (although unfortunately, the recipient might)

So why does he have to pay extra for being dead? Is there something wrong with dying, that it has to be punished or something? No, the answer is far more sinister: in the eyes of the taxman, he’s just collecting what was his all along.

You see, your property was never yours at all. “Your” property, and by extension your life, belonged to the state. They were just letting you use it. Everything you have is, in the end, their property.

Well, not if I can help it . . .

* * *

And there you have it—the third Objectivist blog carnival! Happy trails to you . . .

Tuesday, June 06, 2006

When I finally got around to taking basic micro- and macroeconomics in graduate school, it was in many ways a disappointment. I was certainly no expert, but by that time in my life I had been exposed to many ideas about how economies ought to work. The class I took, however, was nothing like the few books I had previously read. Those books argued for laissez-faire capitalism and criticized government intervention. The instructor and the textbook were united in believing that actual markets are “imperfect,” that they break down, and that government intervention is required to keep order and safety. At the time I had trouble coming up with arguments against market failure since, aside from large economics treatises that I did not have time to read, there seemed to be no concise refutations of such supposed circumstances.

Today the situation is quite different, thanks to Brian P. Simpson, Assistant Professor in the School of Business and Management at National University, La Jolla, California. As its name implies, Simpson’s book, Markets Don’t Fail! (Lexington Books, 2005), provides an antidote to the assertions about market failure found in almost every college economics textbook. In that respect, it is an excellent resource for those who want to understand the issues behind these claims.

In his text, Simpson addresses some of the most common claims of market failure, examining issues such as monopolization, externalities, environmentalism, and public goods, to name just a few. In each case, Simpson lays out the strongest case for the interventionist side—and then proceeds to utterly demolish it. An illustrative example is his coverage of externalities. He begins by clearly defining the term “externality”:

An “externality”… is a cost imposed, or benefit bestowed, on people other than those who purchase or sell a good or service. The recipient of the externality is neither compensated for the cost imposed on him, nor does he pay for the benefit bestowed upon him. These costs and benefits are labeled “externalities” because the people who experience them are outside or external to the transaction to buy and sell the good or service. (p. 85)

After further describing the difference between positive and negative externalities, Simpson explains why it is claimed that markets fail in this instance:

The alleged failure of the market occurs because, it is claimed, the market provides too many goods that produce negative externalities and too few goods that create positive externalities. Too many goods that create negative external effects are allegedly produced because the costs imposed on those who experience the negative externalities are not taken into account in the production of the goods creating the negative side effects. Remember, these costs are imposed on people who neither buy nor sell the goods. If these costs were accounted for in the production of such goods the cost of producing them, and therefore the price needed to purchase them, would be higher. Hence, fewer of them would be produced and purchased.

The “solution” … is government intervention into the market. …It is claimed that the government must take some action to restrict the production of these goods by, perhaps, imposing a tax on the producers of such goods so that these will experience the effects of all the costs they impose on others. (pp. 86-87)

Finally, Simpson proceeds to analyze and refute the economic arguments behind both positive and negative externalities, arguing that acting on “externality theory in a consistent manner and implement[ing] policies based on it . . . would lead to economic stagnation, a much lower standard of living, and thus a much lower level of individual satisfaction in the economy.” He then goes deeper, arguing that the entire concept of “externality” is philosophically invalid and absurd.

He concludes the chapter by writing that “[t]he externality argument does not provide any evidence of market failure. The only evidence of failure this argument provides, as with all the arguments against the market, is the failure of contemporary economists and other intellectuals to embrace sound concepts and ideas.”

Markets Don’t Fail! is about more than just economics. As in the case of externalities above, Simpson presents a multi-level refutation of each of the market-failure claims. In a separate chapter he also provides a good review of the positive case for capitalism, showing in detail how capitalism rests on an ethics of egoism and is the only moral social system. Simpson rewards the reader with a wealth of arguments that will help him understand the issues involved—and students of economics will finally have a resource with detailed answers to the false claims so often made in their textbooks.

Sunday, June 04, 2006

Today, one often hears it asked -- sometimes despairingly, sometimes jeeringly -- why, if classical music is so wonderful, uplifting, and timeless, it is no longer being composed. The stock answers are numerous, but unconvincing.

One is that classical music is peculiar to a period of European history dating approximately from the Renaissance through the nineteenth century, and thus is not the "voice" of our age. But that classical music remains valued by so many people in this age belies this assertion.

Another argument claims that classical composition has "evolved" beyond harmony, tonality, and melody to a "new plateau" of atonality. A variant of this argument charges that the public "ear," so habituated to the traditional forms of musicality, suffers from a sort of evolutionary tonal lag because it has not kept pace with the ever-evolving musical avant-garde, purportedly representative of an advanced species of humanity. Thus, the ear must be trained or "conditioned" to plumb the reputed depths of jumbles of random sounds, or, in some cases, no sounds at all.

This is the complaint of the modern artist who sneers that the public cannot appreciate his abstract rendering of, say, Perseus and Andromeda, as a canvas of blots, drippings, and sprinkled-on metal shavings. The public, with the notable exception of an aesthetically superior minority, is philistine, perhaps even artistically "reactionary"; it is confined to a reificatory, bourgeois aesthetic prison, and insists that art be -- Gads! Can you credit it? -- intelligible and that music be compatible with its inchoate psychology.

Modern "formal" music, like modern art, is devoted to addressing a "higher" consciousness, using a "logic" that transcends syllogisms, proportion, time, space dimension, sense perception, and other Euro- and/or logo-centric "constructs." In short, reality. It requires that listeners revise their expectations, discard the "prejudice" of the various centrisms, and passively receive logically ineffable droplets of pure essence, or pure being -- or deliberately unintegrated sense data.

Among the many demerits of the politically correct Webster's II New Riverside University Dictionary (1994) is its definition of music: "The art of arranging tones in an orderly sequence so as to produce a unified and continuous composition." This definition is a step backward from "The science or art of incorporating intelligible combinations of tones into a composition having structure and continuity," which is the definition found in Webster's Seventh New Collegiate Dictionary (1969). The Riverside definition replaces the key term intelligible with orderly, which can mean virtually anything, and the term structure with unified, which can also mean virtually anything. One can imagine that the next edition of the Riverside will shed the self-conscious air of its ambiguous qualifiers and offer an au courant, fashionably "deconstructed" definition: "The art of arranging tones in a sequence to produce a composition" -- which, of course, could be applied equally to Beethoven's "Symphony No. 5" or to the gruntings and squeals of a pig sty.

A musical composition is an identifiable sum of its parts. A composition that has no structure, that seems to fly apart, or worse, seems to be notes and rhythms randomly flung into the air to fall where they may on a blank music sheet, has no sum, no identity, and no theme but chaos and madness. A composition of jumbled sounds "represents" merely the modernist fixation with pseudo-aesthetics and artistic fraud.

In her explanation of the purpose and demands of music, novelist-philosopher Ayn Rand wrote:

"It is in terms of his fundamental emotions -- i.e., the emotions produced by his own metaphysical value judgments -- that man responds to music....The theme of a composition entitled 'Spring Song' is not spring, but the emotions which spring evoked in the composer....Liszt's 'St. Francis Walking on the Water' was inspired by a specific legend but what it conveys is a passionately dedicated struggle and triumph -- by whom and in the name of what, is for each individual to supply." 1

It was fashionable among early twentieth-century composers to write melodic music punctuated by stretches of dissonance. Ralph Vaughan Williams, Aaron Copland, Charles Ives, and Virgil Thomson all interspersed orchestrated "folk" melodies with dissonance. Even Edward Elgar, in his later work, resorted to the practice. They all helped to make madness and the irrational respectable. Copland's "Symphony No. 3," for example, uses his well-known "Fanfare for the Common Man" as a melody around which he weaves screeches, drum rolls that herald nothing, and other chaotic noise. And none but the musicians who must play it can remember the full score of Samuel Barber's "Adagio."

"Don't set out to raze all shrines -- you'll frighten men," says Ellsworth Toohey, the critic and arch-villain in Rand's novel, The Fountainhead. "Enshrine mediocrity -- and the shrines are razed."2 Toohey offers that advice in the course of explicating, for one of his willingly duped victims, his method of inculcating and promulgating collectivism in men's souls. He could have added: Elevate incompetence, and competence is irrelevant; sanctify the irrational, and the rational is emasculated; praise noise, and music is silenced. The principle behind Thomas Gresham's law, that bad money will drive out the good, is equally applicable to art and music, especially in a culture that is in a state of philosophical disintegration, and in which the destroyers are blithely sustained by the destroyed. Indeed, the idea that our culture, in its present state of anarchy, could generate classical music, seems almost oxymoronic.

"Doctors have this theory that if you play classical music for infants, they'll understand complex relationships, like math. They don't know what effect rock-and-roll would have. Well, we figure the world could do with one fewer accountant."

This message was spoken by a post-adolescent male voice in a smarmy drawl in an ad for a popular radio station, accompanied by a series of jerky, time-lapse close-ups of a smiling infant rolling its head back and forth on a pillow in seeming enjoyment of the dissonant "rock" being played in the background. The commercial's message is clear: It is not necessary for anyone to understand "complex relationships like math," or to develop much skill in any field of mental labor. It is okay to raise a child to be a cognitive troglodyte, unable to raise his consciousness beyond the immediately perceptible, impatient with music that demands conceptual integration or that addresses a soul he may never recognize he possesses, or could have possessed, indifferent or hostile to anything that "makes sense."

Whether or not there is any scientific truth to the theory that a particular genre of music can aid (or arrest) the development of a child's mental faculties, the ad implicitly endorses the stunting of children's minds. Accountant is doubtless used as a generic pejorative for all professionals who deal in facts, which includes the universe of Western science and technology that allows the intellectually slothful to exist in relative opulence without having to exert much mental effort. The ad is distinctly anti-mind.

Anyone who regularly attends classical music concerts must be familiar with the practice of conductors or music directors of inserting "new" (or even old) atonal compositions between "traditional" ones in a program. An orchestra might begin with, say, Mozart's "Impresario Overture," end with Prokofiev's "Classical Symphony," and sandwich in between them something like Peter Warlock's "Capriole Suite." The practice ensures that concertgoers hear something of the "new plateau" genre whether they want to or not. And they will hear it, chiefly because most concertgoers believe it would be rude to rise en masse, leave the hall, and return when the noise has subsided. Modern "formal" music is played to audiences held hostage by their own civility.

If an orchestra were to advertise an all-Warlock, or an all-John Cage, or an all-Schoenberg concert, attendance would be embarrassingly thin. Why conductors or music directors continue the practice of subjecting their audiences to aural torture is a matter of conjecture. Perhaps they feel duty-bound to be "fair" to the newer composers; perhaps they feel obligated to play the compositions of government- or foundation-subsidized artists.

The last possibility has some interesting implications. How many orchestras remain wholly supported by private donations and receipts, free of the pressures exerted by the byzantine mazes of public arts-funding bureaucracies? Very few. That they must resort to this brand of extortion underscores the bankruptcy of what they foist upon their audiences.

Surely conductors know the difference between Camille Saint-Saëns' "Phaeton" and Fritz Kreisler's "String Quartet." They must suspect that people attend live performances for many reasons, but that voluntary submission to what amounts to an enervating, auditory Rorschach test is not one of them. Whatever rationalizations have been offered by defenders of the practice, it is as purposeful as art galleries exhibiting kitsch or non-art together with genuine art. The unstated purpose of these exercises is to "enshrine mediocrity," to subvert and destroy values, to undercut man's capacity to formulate or sustain values, and to introduce doubt in men's minds about the values they do hold.

One regularly exposed to this practice, if he does not maintain the conviction that what is being committed is a fraud, will begin to think: "Perhaps there is something here, something important about these lead pipes welded together to make a stick man. It's right there next to Canova's "Cupid and Psyche." Perhaps I've missed the boat, and shouldn't be so smug (or certain) about these things."

This individual will not stop seeing the stick man as a bunch of pipes welded together, nor will he begin doubting the artistic value of the Canova, but he may begin to doubt the evidence of his senses, the certainty of his mind. Some part of his implicit certitude concerning right and wrong, good and bad, beautiful and ugly, reality and fantasy, will turn to mush, the certitude progressively softened by the miasma of a subjectivist, value-negating artistic nihilism.

This is an instance of retrogression, of the flaunting of primitivism as merely a "cultural difference." Among this country's black youth the results of this value negation have been especially sad. The enormity of the evil perpetrated on them by their parents and teachers defies description. "Cultural separatism" shares the same corrupting end as atonal "formal" composition: to be both A and non-A; that is, to live in a country whose high standard of living is made possible by Western values, but to hold conscious values that are hostile to or inimical to the West and civilized living.

Walter Grimes, reporting on a highly publicized debate between August Wilson, the Pulitzer-winning black playwright, and Robert Brustein, drama critic for The New Republic, wrote: "Mr. Wilson tried to explain that his insistence on a black theater was not limiting."3

"Why is white experience assumed to be universal, he asked, and black experience somehow particular? Why are black artists expected to become universal by transcending race and moving beyond black themes?"4

Grimes added:

"Black Americans, Mr. Wilson said, want to enter the American mainstream, but not at the price of shedding their African identity. Black artists have a duty to preserve and promote the thoughts and values of their ancestors, including their African ancestors. 'If we choose not to assimilate...this does not mean we oppose the values of the dominant culture, but rather we wish to champion our own causes, our own celebrations, our own values.'"5

Mr. Grimes did not broach such questions as: What is a "black theme"? What is it that Mr. Wilson wishes to perpetuate? Is it only black "angst"? Is it merely "white" experiences that the playwright wants segregated from the mainstream, or is it Western values in general? Are the concepts of individual rights and independent minds too universal or too peculiarly "white" to apply to blacks? How can one support individual freedoms, yet uphold a tribal (i.e., collectivist) consciousness at the same time?

"Separatism" may be achieved, but an "ethno-culture," burdened with such phenomena as "Ebonics" in language, will not send probes to Mars, invent open-heart surgery, or grow corn. The great black musicians who contributed to American culture, e.g., Scott Joplin, Duke Ellington, Lionel Hampton, and Louis Armstrong, have apparently been disowned in favor of the malevolent "dissing" and droning of "rap." Armstrong and company are now no more revered among Afro-centrists than are Thomas Sowell, J.C. Watts, Walter Williams, or Ward Connerly among thinkers, economists or educators, black or white.

Composers of film scores inherited the mantle of classical music composers. There is little distinction between what moved the latter and what can inspire the best creators of film scores: a story, a legend, an image, a tableau, a play, a need to express some inner conviction or truth. Once, much film music approached the symphonic or classical level. Many scores by composers such as William Walton, Arthur Bliss, John Barry, and Miklos Rozsa are as evocative and memorable as any opus from the nineteenth century, and can stand alone apart from their original inspiration. Walton's score for Henry V, Maurice Jarre's for Lawrence of Arabia, and James Horner's for Glory come to mind as instances of what is possible.

The best film scores were those written for grand-scale, larger-than-life epics. But such epics are no longer being produced. Great music cannot be written to dramatize triteness, or about psychotics, functional illiterates, criminals, perverts, predatory aliens, whales or dinosaurs. And great music cannot be indefinitely appropriated to accompany and elevate the depiction of the superficial, the witless, the stupid, or the banal, such as in Woody Allen's Manhattan.

The preferred and broadening cesspool of subject matter of most filmmakers today cannot serve as the genesis of magnificent, or even pleasant, music. Popular films have become little more than vehicles for "special effects"; their stories are superfluous appendages, flimsy excuses to exhibit the technological repertoire of their computer graphics artists and incendiary experts. "Serious" films today, such as Love! Valour! Compassion! and Female Perversions (dealing, respectively, with homosexual relationships and feminist existentialism), are not rich material for great music, either. Film scores are written now to be heard and promptly forgotten.

A word about bass in contemporary popular music. Were this a separate article, its title could well be "Technology in the Hands of Barbarians." The stress on "mega" bass (of 120 decibels or more, crowding the 180 decibel range of a NASA rocket launch) is especially revealing, for it confesses an attempt to compensate for vapidity of content in what passes for contemporary popular music. Bass, once considered a single musical element, has come to dominate "pop" music because this type of music requires the least amount of thought or imagination by either its composers or listeners. Its continual "thumping" -- in popular music and even in television commercials -- is used to arrest one's attention, deaden thought, and metaphorically beat listeners to a stupefied pulp. On dance floors and in bars, it imposes a nihilistic gestalt on everyone and everything it touches. It is not joy or happiness or even sorrow that this kind of bass seeks to evoke, but a temporary state of annihilation.

Bass is also employed now as a weapon against civilized existence by those who install expensive "mega bass" amplifiers, "woofers," and speakers in their vehicles. It is easy to name the motive of the owners of these throbbing machines: pure, unadulterated malice. The blasts that emanate from these vehicles are distracting not merely because of their volume; their peculiar, offensive, intrusive nature penetrates one's consciousness as a disruptive, often painful force. It is not joy that the perpetrators of the "mega bass" phenomenon wish to share with random passersby or residents, but hatred and the chance to torture without physically touching anyone. What such creatures are saying is: We're a revolting nuisance, but we're here, we're pumping up the volume, and there's nothing you can do about it.

"Rap," of course, cannot even be considered music. Taken together, its belligerent tone, its monotonous, metronomic beat, its obscene and homicidal "lyrics," and its confrontational delivery make it simply a species of malevolence.

Students attending the best music schools are no longer taught how to compose "classical" music. These schools, such as the Peabody in Baltimore, the Curtis in Philadelphia, and the Juilliard in New York, are turning out talented soloists, but their philosophy of composition is governed -- if modern "formal" music is any kind of gauge -- by the likes of Arnold Schoenberg, or worse. Consider the spirit of the nineteenth century, and one will understand the reasons why so much great music was written in that era. Consider the spirit of our time, and one will grasp the significance of music as a litmus test of general cultural well-being or decay.

A culture takes its cues from the top -- from the universities, from the intelligentsia, from the trendsetters of ideas. And if the message from the top is that anything goes, then all that is good will go. The rubbish, bile, and nihilism that pass for music today cannot be legislated out of existence. Conservatives such as William Bennett, the former Secretary of Education, have proposed silencing the barbarians, frauds, and nuisances, but even if they could be repressed or muffled, the appearance of a new Verdi, Brahms, or Chopin would not be the consequence.

What is true of politics is true of aesthetics. Just as a free nation will collapse into statism when the most rational elements of the political philosophy on which it was founded and sustained are subverted or negated by elements of their antipodes, the best in aesthetics will vanish when the irrational, the atonal, and the unintelligible are given equal time and equal approbation.

The sad truth is that we should not expect greatness in music to emerge from a decaying, rudderless culture.

Friday, June 02, 2006

In response to the mountain of criticism it received for its definition of racism, which included having “a future time orientation” and “emphasizing individualism as opposed to a more collective ideology” [blogged about at ROR here], Seattle Public Schools has issued the following statement:

In response to the numerous concerns voiced regarding definitions posted on the Equity & Race website, we have decided to revise our website in a way that will hopefully provide more context to readers around the work that Seattle Public Schools is doing to address institutional racism. The intended purpose of our work in the area of race and social justice is to bring communities together through open dialogue and honest reflection around what is meant by racism and the impact is has on our society and more specifically, our students. Our intention is not to put up additional barriers or develop an “us against them” mindset, nor is it to continue to hold onto unsuccessful concepts such as a melting pot or colorblind mentality. It is our hope that we can explore the work of leading scholars in the areas of race and social justice issues to help us understand the dynamics and realities of how racism permeate throughout our society and use their knowledge to help us create meaningful change. This difficult work is vital to the success of our students and families. Thank you for sharing your concerns.

I love how Hollins’ apology still manages to make a muck of it, this time attacking the “unsuccessful concept” of the “colorblind mentality.” Yeah, you know, that old chestnut that leads one to actually believe that race is immaterial to what one thinks or does. And I also love the ode to “open dialogue” and the desire to avoid an “us against them” mindset. Sure, your mentality may be failed, but we can still talk about it.

I take the above as proof that one can be an utterly flaming idiot who attracts national attention through his buffoonery and still not get fired from the government’s public school system.

"The world must be made safe for democracy," said President Woodrow Wilson to Congress on April 2nd, 1917, some months after he had proposed "peace without victory." Four days later Congress approved a declaration of war against Germany. Wilson could have asked for a declaration much earlier: German submarines were sinking neutral American shipping in a policy of unrestricted submarine warfare, with five merchant ships sunk in February and March of that year alone.

Wilson had been waiting for a more "overt" act of belligerence against the U.S. than the loss of American lives at sea at German hands. But the most recent sinkings, together with the Zimmermann note to the German minister in Mexico, forced him to face reality. The Zimmermann note pledged Germany to support Mexico in an invasion of the U.S. southwest, to deter certain American entry into the European conflict until after Germany had beaten Britain and France to exhaustion. If the U.S. declared war, German foreign minister Arthur Zimmermann instructed his minister in Mexico to assure Mexico that "we shall make war together and together make peace. We shall give generous financial support and it is understood that Mexico is to reconquer the lost territory in New Mexico, Texas and Arizona."

A declaration of war was not what Wilson had in mind as an altruist "tonic of a moral adventure," which editor and fellow Progressive Herbert Croly had prescribed for America years before. Rather, what he had in mind was the role of mediator and "peacemaker" in the conflicts and international disputes of the early 20th century.

Shuttle ahead ninety years to Georgetown University, where British Prime Minister Tony Blair, in remarks about the "new" global politics, proclaimed, "Idealism becomes the realpolitik." An essential part of that "idealism" is the introduction of "democracy" in regions of the world that have seen no legitimate governments in over a century, chiefly because their inhabitants did not know what to do with democracy, except to vote themselves new tyrants or tolerate old ones. Democracy, however, means mob rule, no matter how legitimate it sounds. It recognizes no individual rights that a majority cannot abridge or abrogate.

Even Wilson's contemporary, Vladimir Lenin, understood that. "Democracy is not identical with majority rule," he wrote. Off by one adverb in that statement, he elucidates the point in contradiction of himself: "Democracy is a State which recognizes the subjection of the minority to the majority, that is, an organization for the systematic use of force by one class against the other, by one part of the population against another." (Chapter 4, State and Revolution, 1919) That is why democracy was as much his enemy as "capitalistic" republicanism, to be ruthlessly crushed. After all, in terms of a nation's population, a totalitarian party's members are always in the minority.

The point here is that President Bush's and Mr. Blair's "idealism" does not fundamentally differ from Wilson's. Its moral core consists of blind duty and the sacrifice of wealth and of lives to accomplish the spread of democracy. Integral to the concept is that the U.S. should eschew its selfish isolationism and adopt a proactive, Kantian "moral" role to correct wrongs wherever it might see them. Our political leaders are ruled by the little Prussian's categorical imperative to "do the right thing" regardless of cost, self-interest, or even of consequence.

"Democracy" is invoked not merely for its populist appeal, or because it is easier for politicians to pronounce than "constitutional republic" (which is what the U.S. is becoming less and less); it complements such "idealist realpolitik." That is the true character of Mr. Blair's "realpolitik." It is the "idealism" of humility, retreat, and ultimate self-destruction.

In the conflict with Iran and its neo-Hitlerian President Mahmoud Ahmadinejad, Bush contends that the issue of Iran's nuclear weapons development can be resolved with "robust diplomacy." That was Wilson's premise behind his proposal for an international peace conference to end the fighting between the European powers, and the basis of Neville Chamberlain's negotiations with Nazi Germany.

Wilson also said, in April 1915, that "No nation is fit to sit in judgment upon any other nation." Both Bush and Blair have refined that idea, alleging that no religion is fit to sit in judgment of any other creed. Their altruist, Christian premises forbid them to condemn Islam, and allow them to claim that Islam is not the motivating force behind terrorism. It has been "hijacked," or "perverted."

Anyone who has read the Koran knows this is an absurd notion, as absurd as the notion that Hitler "hijacked" Nazism or that Stalin "perverted" communism. But, then, Bush and Blair believe in democracy, as well.

Some commentators may suspect that the May 31st news that the U.S. is willing to negotiate directly with Iran is a ruse to assure world opinion that it is not trying to bully Iran into giving up its nuclear enrichment program, and that it does not intend to employ force against Iran.

Given recent developments, we can believe that it is not a ruse. President Bush and Secretary of State Condoleezza Rice are willing to take both of Ahmadinejad's hands and personally lead him to the higher plateau of international amity, global peace, and pure "democracy," with Prime Minister Blair, Europe, Russia, China and others flinging confetti and flowers at them. Ahmadinejad can snarl and missile-rattle all he wishes; Bush and Rice are willing to forget dignity and take the abuse in the name of a higher cause.

Ahmadinejad is a beast, they agree. But he is there, a metaphysical given, and must be dealt with without igniting more conflict or exacerbating existing animosity. Ms. Rice acknowledges that Iran is a supporter of terrorism "in Lebanon and Palestinian territories," as she remarked at a news conference, according to an Associated Press report on May 31st. But, "Iran can and should be a responsible state." No mention by her or Bush of its support of terrorism in Iraq, where Iran's "insurgent" proxies and planners are picking off Americans and Iraqis by the busload. Apparently, that is not an "overt" enough act of war.

Nazi Germany and Imperial Japan were also metaphysical givens. Our policy more than half a century ago was to erase those givens, and that was the end of that. There were no negotiating "tables" to lure dictators to in the name of peace, just the burnt-out shell of the Reichstag and the wind-blown ashes of Hiroshima.

As John Lewis remarked in correspondence elsewhere in response to the AP report, "Note that Rice's admission that Iran has a right to nuclear energy is the same error the British made prior to WW II, when they accepted that Germany had the same 'right to self-determination' as other nations."

Iran's "self-determination," in light of its record and especially in view of Ahmadinejad's bellicose rantings, includes the "destiny" of ruling the Mideast by force or subversion, the annihilation of Israel, and setting the terms of peace with the rest of the world in a quest for a Pax Persia via nuclear payload.

In the staring contest between Ahmadinejad, Bush and Rice, the pragmatists blinked. So they must always blink when facing bellicosity. Their concept of ensuring national security is to offer the aggressor bribes, as the U.S. and Britain did in Vienna on June 1st, and to rule out military force.

The "realpolitik" of U.S. policy to date has been one of uncompromising pragmatism. Pragmatism as an "ideal" and as a policy must by its nature sacrifice the good to evil; otherwise it would not be pragmatism. Evil derives its strength from compromised and ultimately vanquished principles. Pragmatism discounts principles as a guide to moral conduct; they are forgotten in a rush to keep a nemesis at bay.

The principle left behind here is the right of the U.S. to its self-defense against a threatening rogue state. Reason and reality have no role in a policy of pragmatism. Yet, despite pragmatism's sorry and costly role in history, especially in the 20th century, current leaders are convinced that pragmatism is the only "moral" path to follow. They are determined to make it "work." But it works only to the benefit of the enemies of civilization.

The New York Times, under the chortling headline on June 1st, "Bush's Realization on Iran: No Good Choice Left Except Talk," reported that the president told Rice several months ago that he needed "a third option," a way to get beyond either a nuclear Iran or an American military action. The term "beyond" is eloquently appropriate; it suggests an excursion into fantasyland in search of a Star Trekian "Prime Directive." Bush has explicitly rejected an "either/or" in favor of an evasive, non-confrontational middle course.

One must wonder about the psychology of men who are so afraid of absolutes that they are willing to acknowledge a threat but never the rational course of action to take to remove it. According to the AP report, when Rice was asked about the possibility of the U.S. reestablishing diplomatic relations with Iran, she "ruled out a 'grand bargain.' However, she said a negotiated solution to the nuclear dispute could 'begin to change the relationship.'"

"Nobody is confused about the nature of this regime," said Rice at a news conference held to announce the alleged shift in policy. "We are not negotiating the terms of terrorism."

Were she and Bush genuinely confused about the nature of Iran's regime, it might be forgivable. But she names what she and Bush both know, and that makes the action an unforgivable betrayal. Their willingness to "come to the table" to talk is, in effect, a willingness to negotiate the terms of terrorism.

Is it any wonder that Ahmadinejad is so contemptuously confident that Islam will triumph? Even psychopaths like him can sense cowardice and smell blood. Ahmadinejad has mastered Hitler's playbook of the 1930's.

The overture to the U.S.'s creeping, inevitable capitulation on Iran was reported in the Los Angeles Times of May 26th under the appropriate headline, "The Tyranny Doctrine."

"Last week, Secretary of State...Rice announced resumption of full U.S. diplomatic relations with Libya, citing Tripoli's renunciation of terrorism and intelligence cooperation." The article asserts that this move "marks an effective end to the Bush doctrine."

Rather, it highlights a continuation of the Bush doctrine of non-judgmental pragmatism, which has been to take the path of least resistance and greatest expediency, to avoid confronting major threats, and to expend lives and treasure on incidentals, such as Iraq and Afghanistan. Not to mention, in this instance, the forgiving of Libyan dictator Qadhaffi for the murder of hundreds of Westerners by his own army of jihadists. There is "realpolitik" for you.

The Los Angeles Times article goes on to list Bush's record of non-achievements in his pursuit of global "democracy":

"The Bush administration has watched Egypt abrogate elections, ignored the collapse of the so-called Cedar Revolution in Lebanon and abandoned Chinese dissidents; now Washington is mulling a peace treaty with Stalinist North Korea."

The mare's nest of pragmatism and its consequences grows nastier, thicker and more perilous. When will Bush have his own "reality check" and grasp the true nature of our enemies? When we experience another September 11th?

Bush, at his second inauguration, stated: "The survival of liberty in our land increasingly depends on the success of liberty in other lands. The best hope for peace in our world is the expansion of freedom in all the world." The first half of this statement is not strictly true; liberty in America can succeed without liberty succeeding elsewhere in the world. But what if the rest of the world rejects the peace that freedom can bring, and chooses the "peace" of submission, tyranny or conquest?