The Strom Thurmond who emerges from these recollections is a mercenary character.
He had already been elected to the State Legislature and was laying the groundwork
for the campaigns that would land him in the governor's office and lead him
to the United States Senate. To see this plan through, he needed to prevent
news of the black daughter, well known in the black community, from jumping
into the white press.

When Ms. Washington-Williams said she wanted to go to college, Mr. Thurmond
naturally suggested South Carolina State, the segregated black college whose
budget he would later control as governor. As a state official, he could visit
there without fear of being outed. He ensured Ms. Washington-Williams' silence
by manipulating her emotionally and funneling to her the envelopes
full of cash that allowed her to pay the tuition.

Ms. Washington-Williams sees the meetings and cash transactions as proof
of affection. But while doling out the money, Mr. Thurmond sometimes asked
how she felt about having to keep their relationship secret and how she was
holding up under pressure from reporters who had gotten wind of the truth.
As an abandoned child, Ms. Washington-Williams made an understandable calculation:
she decided that a fraction of a father who met her in back rooms but disowned
her in public was preferable to no father at all.

Since Mr. Thurmond's black daughter came forward to claim him, his descendants
have been fretting about how people look at them in church and whether
they will be invited to the right parties. The tragedy of this case played
out in the life of a needy child who was abandoned by her father and then
misused for political purposes. If the Thurmonds are looking for something
to be ashamed of, this is it.

There may not have been a more lowly and vulnerable position in Edgefield, S.C., in 1925 than that of a teenage black maid.

But that was how Essie Mae Washington-Williams's mother, Carrie Butler, was employed when she and a young Strom Thurmond, the scion of a powerful white family, had what Mrs. Williams described as "an affair."

Affair? That's the language many people have used to refer to the liaison after Mrs. Williams broke a lifetime of silence last week and revealed that she was the mixed-race daughter of one of the South's most powerful and segregationist politicians.

But some historians argue that the word "affair" makes it too simple - that intimidation was organic to that time and place, and therefore part of any relationship of this sort, whether consensual or not. South Carolina was an apartheid state in 1925, completely segregated, where the racial code was enforced. At the time, Mr. Thurmond was an unmarried teacher and a high school coach in the little town of Edgefield. Mrs. Williams's mother swept floors and did dishes in the Thurmond family home.

"White men were king," said Valinda Littlefield, a professor of African-American history at the University of South Carolina. "She was basically a child. He can do with her what he wants. She's more or less the family's slave."

No one is saying that Mr. Thurmond forced the teenager to have sex. But at the time, Dr. Littlefield said, many black families who sent their daughters off to work as maids equipped them with straight razors or tried to get them placed in homes without young men. The fear of rape was very real for blacks, while white men saw early sexual experiences with black women as a coming-of-age ritual, an unspoken custom.

"There was this uncontrollable, unconscious attraction to the otherness of black people," said Edward Ball, author of the memoir "Slaves in the Family" (Farrar, Straus & Giroux, 1998). "I believe there was a little Strom Thurmond lurking in many white men's hearts."

Black men were powerless to stop these intrusions, knowing at the same time that if they even cast a glance toward a white woman, they could be swinging from the branches of the nearest magnolia tree.

Racial mixing was something that could never be acknowledged because Jim Crow society teetered on the shaky premise that blacks and whites were separate species, even if a look around proved otherwise.

"Everybody knew this was going on," said Jack Bass, co-author of "Ol' Strom" (Atlanta: Longstreet, 1999), a biography of Mr. Thurmond. "But what was talked about was the number of mulatto children on the next plantation, not those on your own."

There was another reason for the silence. The law. Often these "affairs" were illegal, under a number of provisions. And had Mr. Thurmond been caught and prosecuted, history might have been a little different. In 1925, a man convicted of illicit sex could have lost his right to vote - and hold office.

During long periods, America looks too pacific to be a threat to the likes of Hitler and Mussolini. Too much like Athens gone soft. But at times such as the present--with wars in Afghanistan and Iraq--the Spartan dimension of our civilization becomes visible to all doubters. The biggest thing that most Europeans don't know about America is its Spartan side. Our founders chose the eagle as the symbol for the nation because the eagle is supreme in war, seeing unblinkingly and at great distances. Once fixed on its prey, the eagle is not easily deterred.

Our founders well knew that democracy of itself softens manners, tames--even coddles--the human spirit, and pulls great spirits down to a lower common level. No democracy will long survive, they knew, that does not toughen itself to face adversity, to raise up warriors, and to keep ready a warlike spirit. A democratic army should be small, under civilian control, they insisted, kept safely away from political power, but committed to keeping those who serve in it fearless and invincible.

In a word, in order to survive and to prosper, democracies need to infuse a Spartan spirit into their Athenian thinking. To maintain the peace, prepare for war. A democracy too soft will soon perish.

In this respect, TIME magazine was wise to choose as its "Man of the Year" this last week of 2003 "The U.S. Soldier." What soldiers! Just two years ago, a mere one hundred of our best-trained "green berets," dropped stealthily into Afghanistan to hook up with the Afghan resistance, brought down entrenched Taliban power in a matter of fifty days. They were aided by spectacular air power, but what made that air power so deadly were the direct aiming devices focused on targets by the green berets. At times these most advanced of warriors rode about the Afghan countryside on horseback, in rough nineteenth-century cloaks and scarves, directing the airplanes with radar and targeting beams focused on enemy forces hidden in the mountains.

In the history of presidential politics, only Richard Nixon rivals George W. Bush's level of cynicism about the economy. Nixon used just about every economic trick in the book to ensure his 1972 reelection: Throughout 1971, he made good on the generous social spending he'd promised in that year's State of the Union address. Late that year, he took the dollar off the gold standard so the Fed could increase the money supply without restraint. By early 1972, Nixon was reportedly even ordering his Cabinet to spend money, which helped turn a $3 billion surplus into a $23 billion deficit in less than two years. The consequences were hard to miss once Nixon lifted wage and price controls in 1974: Inflation spiraled, and interest rates rocketed into the double digits. Before long, the United States was on the edge of one of the deepest recessions since the 1930s.

But even Nixon's irresponsibility pales in comparison with Bush's. What makes W. qualitatively worse is the larger fiscal dynamic at work. That is, to ensure his own reelection, Nixon needed to focus on only one objective: buying off swing voters. But, thanks to the rise of the conservative movement, Bush actually has two objectives: He must buy off swing voters but also appease his conservative base--often the people most incensed by the policies used to accomplish goal number one. How does Bush square that apparent circle? With lavish, long-term tax cuts that disproportionately benefit the wealthy.

Politically, the effect is positive: Swing-state voters are happy because they have their manufacturing jobs and their farm subsidies. Conservatives are happy because they have their tax cuts. The problem is that, economically, each tactic reinforces the negative effects of the other: Monkeying around with the dollar the way Snow has drives up long-term interest rates; so do the massive long-term deficits caused by the administration's upper-income tax cuts. Meanwhile, tariffs and quotas drive up prices for consumers, which, among other things, leads to inflation and ultimately higher interest rates as well. Keep heading in that direction and don't be surprised if, not long after 2005, Bush's strong recovery turns anemic. Of course, he could always trot out John Snow to redefine the word "strong" again.

[Dick] Cheney violated the Bush administration's policy of never saying the e-word [empire] in a Christmas card he and his wife sent out to various supporters and important Washingtonians. (Chatterbox did not receive one.) Along with their best wishes for this holiday season, the Cheneys included the following quotation from Benjamin Franklin:

And if a sparrow cannot fall to the ground without His notice, is it probable that an empire can rise without His aid?

Franklin said this at the Constitutional Convention in 1787 by way of suggesting that the proceedings begin each day with a prayer. It is a favorite touchstone for those, like Cheney, who believe that the separation of church and state has become overly fastidious. (These people seldom go on to mention that Franklin's suggestion was rejected by the other delegates.) For Cheney, though, it was a twofer, because it also allowed him to state (using the words of another) that America need not be ashamed of its empire. Although Chatterbox fears that Cheney's motive—in blazing past whatever warnings his aides likely extended about using the e-word—was fanaticism, he can't help but applaud Cheney's honesty. It's time for America's empire to come out of the closet.

Pedant's corner. Why did Franklin use the word "empire," when he so easily could have said "nation" or "republic" or some such? A mere decade after the Revolutionary War, wasn't "empire" a dirty word? Chatterbox posed this question to Franklin biographer Edmund S. Morgan. "It didn't carry the kind of freight that the word carries today," Morgan explained, adding that Franklin's use of the term probably reflected his Anglophilia and his desire to spread the new nation's dominion westward. (Doing so, of course, would subject various Native American tribes to foreign rule, but people didn't think that way at the time.) Franklin, Morgan said, did not mean to indicate any desire to conquer foreign lands, a notion he would have found distasteful. Walter Isaacson, another Franklin biographer, said much the same when Chatterbox caught up with him a few hours later, and pointed out that the negative connotations we attach to "empire" were in that time attached to the word "colonial":

He was very opposed to colonialism. That was a bad word. He believed that any nation or "empire" that had territories should treat all inhabitants, in the far-flung territories as well as near the center, as equal citizens with equal democratic and legislative and governing rights. In other words, he was against "colonialism" but he never used "empire" in a pejorative manner.

Dick Cheney may be fully aware of what Franklin meant when he uttered the word "empire," but even so it can't have escaped his notice that the word's contemporary meaning is much more provocative.

African-Americans and white Americans are so deeply entangled by blood that racial categories have become meaningless. When discussing the issue in public, I typically offer my own family as an example. We check "black" on the census and appear black to the naked eye, but we are also descended from white ancestors on both sides. Despite appearances, I told an audience not long ago, "I am as 'white' as anyone in this room."

White people -- mainly blank-faced and perplexed -- typically don't get it. But black people get it fine: they chuckle, cover their faces in mock embarrassment or nod in quiet agreement. Racial ambiguity is a theme they have heard discussed in their families and communities throughout their lives.

Black families have always talked openly about white ancestors and relatives. In hotbeds of race-mixing like New Orleans or Charleston, S.C., black and white branches of a family sometimes lived so close at hand that they ran into one another on the street, and black children were warned that their pale relatives could react violently if approached. Black parents who passed on news of white ancestry to their offspring were not trying to arrange family reunions. They were debunking racism by showing their children that black families and white families were more closely connected by ancestry than racists liked to admit.

White families, by contrast, were terrified by blackness in the family tree. Relationships that could not simply be ignored were deliberately buried. The cover-up hatched 200 years ago by Thomas Jefferson's family was blown away a few years back after genetic evidence showed that Jefferson almost certainly fathered Sally Hemings's final son, Eston, born in 1808. This led historians to conclude that Jefferson fathered all of her children in a relationship that lasted more than 35 years.

The big lesson for historians in the Hemings-Jefferson case was that the oral histories passed down by slaves and their descendants were more reliable than the official written record. This put historians on notice that they should give the oral tradition more credence, especially when working on issues of interracial intimacy....

The biographer Nadine Cohodas dismissed [the Thurmond story] as a "legend in the black community" a decade ago in her book "Strom Thurmond and the Politics of Southern Change." Another writer of the South described it as apparently without foundation -- a phrase that is used all the time to dismiss the black oral tradition as apocryphal.

In the 1998 biography, "Ol' Strom," however, a journalism professor, Jack Bass, and a Washington Post reporter, Marilyn Thompson, went back to the oral stories of black South Carolinians, some of whom knew the household, as well as the accounts of a black elevator operator who recalled seeing a light-skinned black woman riding the elevator to visit Mr. Thurmond when he was governor.

How could Mr. Thurmond, who sought the presidency on a segregationist platform in 1948, have lived publicly as a racist while secretly helping to support a black daughter? This was a common practice in the South, where slaveholders and their descendants produced mulatto children. While some white fathers treated their mixed-race children like dirt, others supported and educated them. They refused to acknowledge them to keep the nonexistent barrier between the races firmly intact.

Ideologically, next year's election in Indonesia is being viewed by some as a showdown between two recurring political forces in the country: the Old Order of Sukarno and the New Order of Suharto.

The Indonesian Democratic Party-Struggle (PDI-P) and Golkar - as well as a string of other smaller parties - represent the two major forces respectively.

But for families of the country's first two presidents, the competition may be a little more personal.

The daughters of Indonesia's first two presidents are vying not just for power but also to maintain their families' political turf by keeping alive the legacies left by their fathers.

Their involvement in the upcoming elections reflects the deeply rooted oligarchic nature of Indonesia's politics.

From the Sukarno clan, the trio of daughters dubbed 'Charlie's Angels' by the local media - comprising incumbent President Megawati Sukarnoputri, Ms Rachmawati Sukarnoputri and Ms Sukmawati Sukarnoputri - are competing next year.

The rival Suharto clan is represented by eldest daughter Siti Hardijanti Rukmana, better known as Tutut, who announced her political comeback last week.

The women lead four of the 24 political parties participating in next year's polls.

Except for Ms Sukmawati, they will likely run for the presidency, but only Ms Megawati is not considered a political lightweight compared with contenders that include Cabinet Minister Susilo Bambang Yudhoyono, former defence minister Wiranto and National Assembly Speaker Amien Rais.

But their presence in the poll reinforces the notion that when it comes to choosing their leaders, Indonesians are still drawn to familiar names from the past.

In the nearly four decades of Indonesia's history, the battle for power has continued to involve the same family names. Historian Hermawan Sulistyo said that is hardly a surprise.

He told The Straits Times: 'Our elites are the result of years of political inbreeding; outsiders will have a hard time breaking in.

'If we look at the elites' family trees in the last half of the century, the core of it comes from Menteng families,' he said, referring to the posh residential areas in Central Jakarta where most of the old money and power-holders live.

'The majority not only grew up with one another, they are related to one another possibly from long lines of marriages,' he said.

For example, former president Abdurrahman Wahid, who was impeached and replaced by Ms Megawati in 2001, is the son of Mr Sukarno's first religious minister as well as the grandson of the founder of the country's largest Islamic grouping, the Nahdlatul Ulama.

Now, one of his daughters, Zannuba Arifah Wahid, appears likely to follow in her father's footsteps in politics.

But although next year's election will be the first democratically held one involving the two clans, the Sukarno versus Suharto history stretches back to 1967, when then army general Suharto took over power from Mr Sukarno.

He then led Indonesia for the next 32 years, bringing economic prosperity and political stability while quashing political dissidents to maintain his power.

But his leadership was rife with allegations of corruption involving his children, abuse of power and human rights violations.

In May 1998, he quit in the midst of a reform movement led by students after his Cabinet ministers resigned en masse.

Analysts said Indonesians have a capacity to forget the sins of their former leaders once they become disillusioned with the current one.

The late Mr Sukarno is hailed as a visionary, a nation-builder and a captivating orator. But his leftist leanings resulted in a disastrous end to his career.

As a political movement, Clintonism arguably was born on May 6, 1991, when Bill Clinton delivered a seminal speech on his "New Democratic" vision to a conference of the Democratic Leadership Council in Cleveland.

Political historians may conclude that Clintonism was eclipsed as the dominant set of ideas in the Democratic Party on Tuesday, when Al Gore, Clinton's vice president, endorsed Howard Dean in the 2004 presidential race.

Dean has demonstrated many assets in his bid for the Democratic nomination. He's run a groundbreaking campaign that has changed forever the way candidates look at the Internet. He's shown the capacity to inspire great passion among Democratic activists. He speaks the way a boxer jabs, with sharp thrusts that strike many voters as heartfelt and uninfected by political calculation.

But whatever his other virtues, it's difficult to argue that Dean upholds the political philosophy that Clinton advanced. Indeed, Dean is probably the Democratic contender who most directly rejects Clinton's vision.

By endorsing Dean, Gore has continued the journey away from Clinton that began in Gore's own 2000 presidential campaign. More important, the former vice president's endorsement suggests that just three years after Clinton left office, key portions of the Democratic establishment most associated with him are willing to acquiesce, if not to help, as Dean moves to redirect the party.

Clinton and Dean offer diametrical visions of how the Democrats can capture the White House.

Clinton's overriding political assumption was that Democrats could not win solely by mobilizing their hard-core partisans. Instead, Clinton argued that Democrats had to craft policies that attracted swing voters while maintaining the allegiance of traditional Democrats.

In the central line of his 1991 speech, Clinton memorably declared that Democrats had to redesign their agenda to recapture middle-class voters who had abandoned the party since the 1960s. "Too many of the people who used to vote for us," he said, "the very burdened middle class we are talking about, have not trusted us in national elections to defend our national interests abroad, to put their values into social policy at home, or to take their tax money and spend it with discipline."

Dean starts from precisely the opposite perspective.

Throughout his campaign, he has disparaged the idea of targeting the Democratic message toward swing voters. Instead, he argues that Democrats must focus on mobilizing their base, and inspiring nonvoters, with language and an agenda that energizes traditional party constituencies such as labor, feminists and gay civil rights activists.

"We are going to take back the Democratic Party from the idea that the way to win elections is to neglect our base," Dean recently said.

The other day I found myself reading a leftist rag that made outrageous claims
about America. It said that we are becoming a society in which the poor tend
to stay poor, no matter how hard they work; in which sons are much more likely
to inherit the socioeconomic status of their father than they were a generation
ago.

The name of the leftist rag? Business Week, which published an article titled
"Waking Up From the American Dream." The article summarizes recent
research showing that social mobility in the United States (which was never
as high as legend had it) has declined considerably over the past few decades.
If you put that research together with other research that shows a drastic
increase in income and wealth inequality, you reach an uncomfortable conclusion:
America looks more and more like a class-ridden society.

And guess what? Our political leaders are doing everything they can to fortify
class inequality, while denouncing anyone who complains--or even points out
what is happening--as a practitioner of "class warfare."

Let's talk first about the facts on income distribution. Thirty years ago
we were a relatively middle-class nation. It had not always been thus: Gilded
Age America was a highly unequal society, and it stayed that way through the
1920s. During the 1930s and '40s, however, America experienced what the economic
historians Claudia Goldin and Robert Margo have dubbed the Great Compression:
a drastic narrowing of income gaps, probably as a result of New Deal policies.
And the new economic order persisted for more than a generation: Strong unions;
taxes on inherited wealth, corporate profits and high incomes; close public
scrutiny of corporate management--all helped to keep income gaps relatively
small. The economy was hardly egalitarian, but a generation ago the gross
inequalities of the 1920s seemed very distant.

Now they're back. According to estimates by the economists Thomas Piketty
and Emmanuel Saez--confirmed by data from the Congressional Budget Office--between
1973 and 2000 the average real income of the bottom 90 percent of American
taxpayers actually fell by 7 percent. Meanwhile, the income of the top 1 percent
rose by 148 percent, the income of the top 0.1 percent rose by 343 percent
and the income of the top 0.01 percent rose 599 percent. (Those numbers exclude
capital gains, so they're not an artifact of the stock-market bubble.) The
distribution of income in the United States has gone right back to Gilded
Age levels of inequality.

America was once a place of substantial intergenerational mobility: Sons often did much better than their fathers. A classic 1978 survey found that among adult men whose fathers were in the bottom 25 percent of the population as ranked by social and economic status, 23 percent had made it into the top 25 percent. In other words, during the first thirty years or so after World War II, the American dream of upward mobility was a real experience for many people.

Now for the shocker: The Business Week piece cites a new survey of today's adult men, which finds that this number has dropped to only 10 percent. That is, over the past generation upward mobility has fallen drastically. Very few children of the lower class are making their way to even moderate affluence. This goes along with other studies indicating that rags-to-riches stories have become vanishingly rare, and that the correlation between fathers' and sons' incomes has risen in recent decades. In modern America, it seems, you're quite likely to stay in the social and economic class into which you were born.

In a brief I will submit today to the Supreme Court, I argue on behalf of all 50 states that reciting "under God" in the Pledge in public schools is well within the confines of the First Amendment to the Constitution. In Texas, for example, schools must teach students to be "thoughtful, active citizens who understand the importance of patriotism." One way districts are accomplishing that goal is by having students voluntarily recite the Pledge each school day. Yet, an adverse ruling from the court would undermine that law and those of 42 other states that specifically provide for public-school children reciting the Pledge.

It's no secret that many founders of our nation and our states not only believed in God, but also sought divine guidance in fashioning our system of government. Many of our historical documents, speeches and even architecture acknowledge God. The Declaration of Independence alone contains four references to God, including its unambiguous statement that all persons are "endowed by their Creator with certain unalienable Rights." That sentiment also animated President Lincoln when he beseeched those gathered amid the carnage of Gettysburg to resolve "that this nation under God shall have a new birth of freedom."

The Pledge of Allegiance, too, is part of our common heritage. After an early form first appeared in a youth publication in 1892, it grew in acceptance and changed form until Congress officially adopted it in 1942. In 1954, Congress inserted the phrase "under God" to make the Pledge more reflective of the nation's character. Congressional committee reports from the time of that amendment echo the Declaration of Independence, noting that our government recognizes the importance of each person as being "endowed by [God] with certain inalienable rights which no civil authority may usurp." The addition of "under God" meant that the Pledge was simply the latest historical and patriotic acknowledgment of our nation's undeniable religious heritage.

I am encouraged by the fact that virtually every reference to the Pledge by the Supreme Court and by at least 12 individual justices over the decades has agreed that the Pledge is consistent with the First Amendment. Justice Sandra Day O'Connor, for example, expressed her view in Wallace v. Jaffree that the reference to God in the Pledge "serve[s] as an acknowledgment of religion with 'the legitimate secular purposes of solemnizing public occasions, [and] expressing confidence in the future.'" Justice William Brennan, one of the court's more liberal members, admitted in School District of Abington Township v. Schempp that "[t]he reference to divinity in the revised pledge of allegiance ... may merely recognize the historical fact that our Nation was believed to have been founded 'under God.'"

Forty years ago, an important emissary was sent to France by a beleaguered president of the United States. It was during the Cuban missile crisis, and the emissary was a tough-minded former secretary of state, Dean Acheson. His mission was to brief French President Charles de Gaulle and solicit his support in what could become a nuclear war involving not just the United States and the Soviet Union but the entire NATO alliance and the Warsaw Pact.

At the end of the briefing, Acheson said to de Gaulle, "I would now like to show you the evidence, the photographs that we have of Soviet missiles armed with nuclear weapons." The French president responded, "I do not wish to see the photographs. The word of the president of the United States is good enough for me. Please tell him that France stands with America."

Would any foreign leader today react the same way to an American emissary sent abroad to say that country X is armed with weapons of mass destruction that threaten the United States? It is unlikely. The recent conduct of U.S. foreign policy, by distorting the threats facing America, has isolated the United States and undermined its credibility. It has damaged our ability to deal with issues in North Korea, Iran, Russia and the West Bank. If a case ever needs to be made for action against a truly imminent threat, will any nation take us seriously?

Fifty-three years ago, following the Soviet-sponsored assault by North Korea on South Korea, the Soviet Union boycotted a resolution in the U.N. Security Council for a collective response to North Korea's act. That left the Soviet Union alone in opposition, stamping it as a global pariah.

Today it is the United States that finds itself alone. In recent weeks, there were two votes on the Middle East in the U.N. General Assembly. In one, the vote was 133 to 4, and in the other, it was 144 to 4 — the United States, Israel, the Marshall Islands and Micronesia. Japan and all of our NATO allies, including Great Britain and the so-called"new" Europe, voted with the majority.

The loss of U.S. international credibility and the growing U.S. isolation are aspects of a troubling paradox: American power worldwide is at its historic zenith, but American global political standing is at its nadir. Maybe we are resented because we are rich, and we are, or because we are powerful, and we certainly are. But I think anyone who thinks that this is the full explanation is taking the easy way out and engaging in a self-serving justification.

Since the tragedy of 9-11, our government has embraced a paranoiac view of the world summarized in a phrase President Bush used on Sept. 20, 2001: "Either you are with us or you are with the terrorists."

I suspect officials who have adopted the "with us or against us" formulation don't know its historical origins. It was used by Lenin to attack the social democrats as anti-Bolshevik and justify handling them accordingly. This phrase is part of our policymakers' defining focus, summed up by the words "war on terrorism." War on terrorism reflects, in my view, a rather narrow and extremist vision of foreign policy for a superpower and for a great democracy with genuinely idealistic traditions.

Our country suffers from another troubling condition, a fear that periodically verges on blind panic. As a result, we lack a clear perception of critical security issues such as the availability to our enemies of weapons of mass destruction. In recent months, we have experienced perhaps the most significant intelligence failure in U.S. history. That failure was fueled by a demagogy that emphasizes worst-case scenarios, stimulates fear and induces a dichotomous view of world reality.

Robert Parry, outlining the history of US/Saddam ties in an article first published
in February 2003 on the eve of war and republished by In
These Times (Dec. 16, 2003):

This intersection of Saddam's wars and U.S. foreign policy dates back
at least to 1980 when Iran's radical Islamic government held 52 Americans
hostage in Tehran and the sheiks of the oil-rich Persian Gulf feared that
Ruhollah Khomeini's radical breed of Islam might sweep them from power
just as it had the Shah of Iran a year earlier.

The Iranian government began its expansionist drive by putting pressure on
the secular government of Iraq, instigating border clashes and encouraging
Iraq's Shiite and Kurdish populations to rise up. Iranian operatives
sought to destabilize Saddam's government by assassinating Iraqi leaders.
[For details, see "An Unnecessary War," Foreign Policy, January/February
2003.]

On Aug. 5, 1980, as tensions mounted on the Iran-Iraq border, Saudi rulers
welcomed Saddam to Riyadh for the first state visit ever by an Iraqi president
to Saudi Arabia. During meetings at the kingdom's ornate palaces, the
Saudis feted Saddam, whose formidable Soviet-supplied army was viewed as a
bulwark against Iran.

Saudi leaders also say they urged Saddam to take the fight to Iran's
fundamentalist regime, advice that they say included a "green light"
for the invasion from President Carter.

Less than two months after Saddam's trip, with Carter still frustrated
by his inability to win release of the 52 Americans imprisoned in Iran, Saddam
invaded Iran on Sept. 22, 1980. The war would rage for eight years and kill
an estimated one million people.

The claim of Carters green light for the invasion was made
by senior Arab leaders, including King Fahd of Saudi Arabia, to President
Reagans first secretary of state, Alexander Haig, when Haig traveled
to the Middle East in April 1981, according to top secret talking
points that Haig prepared for a post-trip briefing of Reagan.

Haig wrote that he was impressed with "bits of useful intelligence"
that he had learned. "Both [Egypt's Anwar] Sadat and [Saudi then-Prince]
Fahd [explained that] Iran is receiving military spares for U.S. equipment
from Israel," Haig noted. "It was also interesting to confirm that
President Carter gave the Iraqis a green light to launch the war against Iran
through Fahd."

Haigs talking points were first disclosed at Consortiumnews.com
in 1995 after I discovered the document amid records from a congressional
investigation into the early history of the Reagan administration's contacts
with Iran. At that time, Haig refused to answer questions about the talking
points because they were still classified. Though not responding to
direct questions about the talking points, Carter has pooh-poohed
other claims that he gave Saddam encouragement for the invasion.

Senator Joseph I. Lieberman learned that he had been passed over by Al Gore
for a presidential endorsement in precisely the same way he learned that he
had been picked as Mr. Gore's running mate in the first place: from The Associated
Press. That lapse in manners - and the message it conveyed - set off cries
of "Betrayal!" from Mr. Lieberman's angry backers last week.

But from Judas to Brutus and beyond, limited loyalty has been an occupational
hazard of leadership, never more so than in the Darwinian world of electoral
politics. In a business in which strong ego, expediency and self-interest
are virtually hard-wired into the practitioners' DNA, disloyalty is not only
nothing new, it is an occasional necessity.

"Creative betrayal is the essence of successful statesmanship,"
said Richard Norton Smith, the director of the Abraham Lincoln Presidential
Library in Springfield, Ill. "The other side of that coin is that loyalty
is a very attractive quality, but it doesn't get you anything."...

In politics, a little disloyalty sometimes goes a long way. Presidents have
dumped running mates who looked to be a drag on the ticket for a second term
(Franklin D. Roosevelt with John Nance Garner and Henry Wallace; Gerald Ford
with Nelson Rockefeller). They have undermined loyal No. 2's who were striving
to succeed them with mixed success (Harry Truman with Alben Barkley; Lyndon
Johnson with Hubert Humphrey). After Richard M. Nixon salvaged his vice-presidential
campaign from a slush-fund scandal with the "Checkers" speech in
1952, Dwight Eisenhower declared, "You're my boy!" But he had let
Nixon dangle for days while he assessed public opinion.

By 1956, Eisenhower was sounding out Nixon about whether he wouldn't really
rather be a cabinet member than vice president, and when asked in 1960 to
cite an example of an important decision that Nixon had influenced in his
eight years in office, Eisenhower replied, "If you give me a week, I
might think of one."

Perhaps no feud between former patron and protégé was more
bitter than Theodore Roosevelt's turning against his vice president and successor,
William Howard Taft, in the campaign of 1912. Roosevelt believed that Taft
had failed to uphold his progressive legacy, calling him "dumber than
a guinea pig." Hurt and flummoxed at his mentor's taunts, Taft blurted
out in self-defense: "Even a cornered rat will fight."

In a recent interview with US News & World Report, Vietnam-era Secretary of Defense Robert S. McNamara explained that he doesn't feel it's appropriate for him to comment on the conduct of the war in Iraq. But it's not too hard to guess what he probably thinks, based on his own experiences and the conclusions he has drawn about them. In his unique 1995 book In Retrospect, McNamara listed eleven lessons from Vietnam that are very much worth reflecting on.

McNamara wrote, "It is sometimes said that the post-Cold War world will be so different from the world of the past that the lessons of Vietnam will be inapplicable or of no relevance to the twenty-first century. I disagree... There were eleven major causes for our disaster in Vietnam:"

We misjudged then--as we have since--the geopolitical intentions of our adversaries... and we exaggerated the dangers to the United States of their actions.

We viewed the people and leaders of South Vietnam in terms of our own experience. We saw in them a thirst for--and a determination to fight for--freedom and democracy. We totally misjudged the political forces within the country.

We underestimated the power of nationalism to motivate a people... to fight and die for their beliefs and values--and we continue to do so today in many parts of the world.

Our misjudgments of friend and foe alike reflected our profound ignorance of the history, culture, and politics of the people in the area, and the personalities and habits of their leaders...

We failed then--as we have since--to recognize the limitations of modern, high-technology equipment, forces and doctrine in confronting unconventional, highly motivated people's movements. We failed as well to adapt our military forces to the task of winning the hearts and minds of people from a totally different culture.

We failed to draw Congress and the American people into a full and frank discussion and debate of the pros and cons of a large-scale U.S. military involvement... before we initiated the action.

After the action got underway and unanticipated events forced us off our planned course, we failed to retain popular support in part because we did not explain fully what was happening and why we were doing what we did. We had not prepared the public to understand the complex events we faced and how to react constructively to the need for changes in course as the nation confronted uncharted seas and an alien environment. A nation's deepest strength lies not in military prowess but, rather, in the unity of its people. We failed to maintain it.

We did not recognize that neither our people nor our leaders are omniscient. Where our own security is not directly at stake, our judgment of what is in another people's or country's best interest should be put to the test of open discussion in international forums. We do not have the God-given right to shape every nation in our own image or as we choose.

We did not hold to the principle that U.S. military action--other than in response to direct threats to our own security--should be carried out only in conjunction with multinational forces supported fully (and not merely cosmetically) by the international community.

We failed to recognize that in international affairs, as in other aspects of life, there may be problems for which there are no immediate solutions... at times, we may have to live with an imperfect, untidy world.

Underlying many of these errors lay our failure to organize the top echelons of the executive branch to deal effectively with the extraordinarily complex range of political and military issues...

These were our major failures, in their essence. Though set forth separately, they are all in some way linked: failure in one area contributed to or compounded failure in another. Each became a turn in a terrible knot.

Pointing out these mistakes allows us to map the lessons of Vietnam, and places us in a position to apply them to the post-Cold War world.

Back in 1937, an economist named Ronald Coase realized something that helped explain the rise of modern corporations -- and which just might explain the coming decline of the American two-party political system.

Coase's insight was this: The cost of gathering information determines the size of organizations.

It sounds abstract, but in the past it meant that complex tasks undertaken on vast scales required organizational behemoths. This was as true for the Democratic and Republican parties as it was for General Motors. Choosing and marketing candidates isn't so different from designing, manufacturing and selling automobiles.

But the Internet has changed all that in one crucial respect that wouldn't surprise Coase one bit. To an economist, the "trick" of the Internet is that it drives the cost of information down to virtually zero. So according to Coase's theory, smaller information-gathering costs mean smaller organizations. And that's why the Internet has made it easier for small folks, whether small firms or dark-horse candidates such as Howard Dean, to take on the big ones.

For all Dean's talk about wanting to represent the truly "Democratic wing of the Democratic Party," the paradox is that he is essentially a third-party candidate using modern technology to achieve a takeover of the Democratic Party. Other candidates -- John Kerry, John Edwards, Wesley Clark -- are competing to take control of the party's fundraising, organizational and media operations. But Dean is not interested in taking control of those depreciating assets. He is creating his own party, his own lists, his own money, his own organization. What he wants are the Democratic brand name and legacy, the party's last remaining assets of value, as part of his marketing strategy. Perhaps that's why former vice president Al Gore's endorsement of Dean last week felt so strange -- less like the traditional benediction of a fellow member of the party "club" than a senior executive welcoming the successful leveraged buyout specialist. And if Dean can do it this time around, so can others in future campaigns.

Garrett Jones, a 1993 graduate of the U.S. Army War College and retired case officer with the CIA in Africa, Europe, and the Middle East, writing in the newsletter of the Foreign Policy Research Institute (Dec. 15, 2003):

Identify the following country: a mid-sized non-European country with a history going back to Biblical events. An ancient trading center, it figured in the works of ancient historians. It has an almost exclusively Muslim population, a troubled colonial history, and until recently was governed by an absolute dictator. The dictator, who ran a corrupt regime and aggressively attacked his neighbor, was ousted by force. The country, whose economy has since collapsed, is occupied by a multinational armed force that is the frequent target of violent attacks--a mix of locally inspired efforts and actions planned or inspired by the Al Qaeda terrorist organization. It is believed to possess significant untapped oil reserves.

No, it is not Iraq. It is Somalia in 1993.

Somalia, about the size of Texas, was called the "Land of Punt" by ancient Egyptian writers. The frankincense referred to in Biblical narratives probably originated in the land that is now Somalia, which was visited and written about by Arab historian Ibn Battuta in 1331. Chinese historians record visits of Chinese fleets beginning in the eleventh century and continuing intermittently until the 1400s. The 2003 CIA World Factbook lists Somalia as almost exclusively Sunni Muslim. It was colonized into separate protectorates by the British in 1886-1960 and the Italians in 1889-1960. The dictator Muhammad Siad Barre began a series of border wars starting in 1974 with Ethiopia over the Ogaden region. Barre was ousted in a 1991 coup. The attacks on the multinational forces of Operation Restore Hope include the infamous October 3, 1993 attack on Task Force Ranger in Mogadishu that was the subject of Mark Bowden's book Black Hawk Down (New York: Atlantic Monthly Press, 1999), the best general account of the event. As of 2003, oil exploration rights to Somalia were still carried as an asset by ConocoPhillips in SEC filings; exploration operations had been suspended due to "force majeure."

While one would not want to overreach in drawing a comparison between Somalia in 1993 and the current situation in Iraq, our experience in Somalia may be instructive in important respects. When I was serving in Somalia in 1993 as chief of CIA operations, some of the attacks against both multinational and U.S. forces were inspired, or at least assisted, by elements of the Al Qaeda terrorist organization. At the time, we did not know that. We did know that "foreigners" who had served in the Afghanis' jihad against Soviet forces were assisting the Somalis in their attacks. In fact, we believed we knew the names of some of the individuals. However, in 1993, Al Qaeda was an unknown organization and Osama bin Laden was simply a "person of interest" in the terrorist world. Subsequently, we discovered that bin Laden had sent members of his organization to Somalia from Sudan to assist local Somali warlord Muhammad Farrah Aidid. According to statements allegedly made by bin Laden, he is doing the same thing in Iraq today.

Examining what happened in Somalia in 1993 may provide us with clues to how Al Qaeda is operating today in Iraq: its structure, personnel, and targeting criteria. For the capture of Saddam Hussein does not end the threats to our troops there. It also may, by exclusion, permit us to distinguish between Al Qaeda-inspired attacks and cases of local Iraqis' "venting" through armed actions. Much as we did not know exactly what we were up against in Somalia in 1993, in Iraq today we have been trying to "drain the Babylonian swamp" without always knowing the enemy.

Surveying post-Hussein Baghdad, one is apt to think of a battered child bracing himself against the next blow. Gloom, anger, despair are all part of the city's psychological makeup.

Standing in the Mustansiriya's courtyard, I am approached by a former Baathist military officer who rages about the looting of Baghdad's landmarks. "Why do the Americans let this happen?" he asks. "I believe they do this in order to erase Baghdad's identity."

I hear this accusation constantly. And it is understandable. The U.S. military was slow to respond to looting here. And confronted with the current guerrilla war, the occupying authority has not made cultural issues a priority.

Months after the American invasion began, the grounds of the former Ottoman headquarters, the Kushle, are littered with rubble. Broken glass crunches under my feet as I pass through the entry hall. The empty shell of a cluster bomb lies in an otherwise gutted second-floor room. Such images are typical of Baghdad. And one quickly gets used to them, as if stepping through a construction site.

A moment later, in the courtyard, I watch as a frail old man methodically loads bricks onto the back of a horse-drawn cart. The bricks are piled 10 courses high, and as the horse lurches forward they spill out of the cart, crashing back down into the courtyard.

The man is one of the many looters who have eaten away at the city's abandoned ministries, palaces and office buildings. But in this case, there is a particular irony: Built in the late 19th century as a testament to Ottoman power, the Kushle is partly constructed from bricks that the Ottomans themselves pillaged from the Rusafa district's ancient fortified wall.

This sort of cannibalism is a particularly grotesque legacy of the breakdown of a city's social fabric. But there are other, more subtle ways the past has been diminished. In the 20th century, for example, the brutality of war was joined by another destructive force: the myth of progress. This was particularly true of the developing world, where modern architecture was not only embraced as a sign of technological achievement, but older structures were often reviled as emblems of a primitive past.

The result was a form of historical amnesia. And in Baghdad, that problem was compounded by a lack of durability in traditional building materials. Cairo's great landmarks were built in stone; most of Baghdad's ancient buildings were constructed of mud bricks. As such, without constant care, they were apt to melt away over time. Termites often devoured what was left.

For centuries, these structures were patched up or rebuilt. But modern architects found another solution: concrete. Many of the traditional courtyard houses that once gave the city its character were bulldozed to make way for new development — especially during Hussein's rule.

In 1984, a government-sponsored survey listed 3,089 historic structures worth preserving in the Rusafa district alone. A decade later, that figure had been reduced to fewer than 1,000. Today, the examples of that tradition are virtually nonexistent.

The callous indifference to the past in a city once rich with history is particularly glaring along a segment of elevated freeway in northwest Baghdad. Built in the 1980s, the freeway carves through the center of an ancient cemetery. The stele of the Zubayda tomb — built for Haroun al Rashid's wife — rises among a number of lesser-known graves on one side. On the other, more tombs are scattered along the base of a mud-brick tower — one of the few remaining fragments of the ancient wall that defined Baghdad's historic core.

It's hard to find a more potent symbol of disrespect for the past in the name of progress. One immediately thinks of Rome, with its inexhaustible storehouse of architectural treasures and the cultural pride those layers inspire. At the other extreme, Los Angeles' lack of an architectural tradition has been a form of liberation, pointing the city toward the future.

Baghdad has neither the benefit of an unbroken history nor the freedom that comes with youth.

In important ways, the "war on terrorism" has represented an impulse to undo violently precisely the humiliation of 9/11. To be sure, the acts of that day had a warlike aspect. They were certainly committed by men convinced that they were at war with us. In post-Nuremberg terms they could undoubtedly be considered a "crime against humanity." Some kind of force used against their perpetrators was inevitable and appropriate. The humiliation caused, together with American world ambitions, however, precluded dealing with the attacks as what they were--terrorism by a small group of determined zealots, not war. A more focused, restrained, internationalized response to Al Qaeda could have been far more effective without being a stimulus to expanded terrorism.

Unfortunately, our response was inseparable from our superpower status and the syndrome that goes with it. Any nation attacked in that way would have felt itself humiliated. But for the United States, with our national sense of being overwhelmingly powerful and unchallengeable, to have its major institutions violently penetrated created an intolerable breakdown of superpower invulnerability that was never supposed to happen, a contradiction that fed our humiliation.

We know from history that collective humiliation can be a goad to various kinds of aggressive behavior--as has been true of bin Laden and Al Qaeda. It was also true of the Nazis. Nazi doctors told me of indelible scenes, which they either witnessed as young children or were told about by their fathers, of German soldiers returning home defeated after World War I. These beaten men, many of them wounded, engendered feelings of pathos, loss and embarrassment, all amid national misery and threatened revolution. Such scenes, associated with strong feelings of humiliation, were seized upon by the Nazis to the point where one could say that Hitler rose to power on the promise of avenging them.

With both Al Qaeda and the Nazis, humiliation could, through manipulation but also powerful self-conviction, be transformed into exaggerated expressions of violence. That psychological transformation of weakness and shame into a collective sense of pride and life-power, as well as power over others, can release enormous amounts of aggressive energy. Such dangerous potential has been present from the beginning in the American "war" on terrorism.

The Federal Open Market Committee's decision Tuesday to keep the federal
funds rate at 1 percent is symptomatic of the Fed's current problem with prices:
it's looking in the wrong direction, worrying about deflation, not inflation.
The Bureau of Economic Analysis' long-awaited revision of economic data to
2002 also tells us much about where we've been. Conclusion: they're right,
it's not currently 1929. But it is 1973!

"The probability of an unwelcome fall in inflation has diminished in
recent months, and now appears almost equal to that of a rise in inflation,"
said the official statement from the FOMC meeting. "Almost equal" is an interesting
view -- with gold up 60 percent from its low at $406.60 per ounce at Wednesday's
close, the market appears to be taking the view that inflation is very much
more likely than deflation. Other commodities, too, have risen sharply during
2003 while even the oil price, which was expected to drop following the partial
return of Iraq production, has remained close to the levels prevailing before
the Iraq war. In this context, a sharp drop in U.S. crude oil supplies to
275.7 million barrels, reported Wednesday morning, is a further indicator
that the era of cheap oil is not about to return soon....

The overall unhealthiness of this position, at least in the United States,
is demonstrated by the Bureau of Economic Analysis' revised gross domestic
product data for the years to 2002. This is very difficult to interpret, because
the BEA has made so many changes to the basis on which figures are calculated
that it is impossible for analysts to determine the true picture. For example,
a "statistical adjustment," modest in the late 1990s, balloons to
$80 billion in 2001 and $120 billion in 2002, and, if deducted, would wipe
out the modest economic growth of 2001 and approximately halve that of 2002.

Even if we take the BEA's figures at face value, economic growth in 2002
has been revised down significantly from the figure first announced, and the
recession has been significantly prolonged, with the first quarter of negative
GDP growth having occurred as early as the third quarter of 2000. The size
of the revisions produced by the BEA is indeed shown by that quarter; the
figure initially announced for its growth, in October 2000, on which market
reaction was based, was a sturdy 2.7 percent. A figure that is initially announced
at 2.7 percent and finally (?) revised to MINUS 0.5 percent is needless to
say NOT particularly useful in making coherent investment decisions. The third
quarter 2003 growth, so loudly trumpeted by the administration at 8.2 percent,
may similarly be revised down; indeed, it is likely to be so, since it is
notable in the BEA's current revisions that it is the quarters of exuberant
growth, such as the first quarter of 2002, that disappear when the arithmetic
is done properly....

There is another period in U.S. history when retail sales dropped sharply,
while commodity prices soared, and the dollar weakened; it is not 1929 but
1973. In the years following 1973, the U.S. went through quite a severe recession,
accompanied by very substantial inflation, a sharp decline in productivity,
a collapse in inflation-adjusted terms in the stock market, a rise in unemployment
that did not begin to reverse for a decade and a sharp retraction in U.S.
power and influence worldwide.

If 2003 looked like 1973, then 2004 must have every likelihood of looking
like 1974. Adjust your strategies accordingly.

Whenever President George W. Bush ventures abroad to meet foreign officials
the question is not what will he get accomplished but whether or not he will
be murdered. The man cannot set foot outside the United States without a bodyguard
of thousands of armed men and women. He literally cannot make a public appearance
for fear of his life.

No other world public figure rivals George W. Bush in low esteem. The man
is despised everywhere. He is a universal hate object. You might think that
distinction would have been conferred on Bush's friend, the thug-ugly Vladimir
Putin, whose hands are red from his atrocities in Chechnya, but no, the globally
reviled politician is the American president.

In his recent visit to England Bush's hosts did not dare even invite him
to talk to the House of Commons lest he be heckled into silence. The
newspapers remarked that in this, only the second state visit by an American
president, he made no public appearances, for fear the normally law-abiding
English might have done something untoward to his presidential person.

For whatever reasons, no comparison between these two state visits was made,
one by the least and one by the most popular American president, Woodrow Wilson.
If Bush is the despair of most of the world, Wilson was its hope.

Wilson's bodyguard was the common people. "Everywhere he went he was
the idol of the masses," wrote journalist Mark Sullivan. "Never
since Peter the Hermit had Europe so blindly, so eagerly followed one leader.
It was frequently said during late December 1918 that Wilson could overturn
any government in Europe by an appeal to the people against their rulers."
Millions turned out for him. Historian E. Dodd wrote, "The masses of
European peasantry, shopkeepers and day laborers looked forward to his arrival
as men looked in medieval times to the Second Coming of Christ."

Wilson, as no other president in our history, had the ability to talk to
the people of the world in language that expressed their prayers for liberty,
independence and dignity. As popular as Franklin Roosevelt and John Kennedy
were with the humble people of the Earth, their appeal was nothing compared to Wilson's.
When President Wilson spoke, he reached the world.

When George W. Bush says, as he did in England, "We seek the advance
of freedom and the peace that freedom brings," nobody believes him.

[T]he mere fact that Jordan has a king who speaks English with an American
accent - he was at Georgetown University for a year - makes it different from
most Arab states. And, in fact, Jordan has carved out a separate identity.
Within five minutes of meeting you, just about every Jordanian wants to know
if you have visited Petra, the spectacular city-in-living-stone - seen in
"Raiders of the Lost Ark" - built by the pre-Muslim Nabateans some
2,000 years ago. The remaining discussion typically turns to ancient Roman
ruins. Anything Muslim comes much later.

As part of the same effort to "brand" the nation as unique, the
technical name is the "Hashemite Kingdom of Jordan." The Hashemites,
of course, are the royal family of Jordan. But, as dynasties go, they don't
have much of a history. The first king of Jordan, Abdullah, was born in Mecca
and spent much of his early life in Istanbul. But he backed the winning side
in World War I, and so, after the war, the victorious British rewarded him
with the mostly empty territory that lay east of Palestine, beyond the Jordan
River. As if to prove the area's afterthoughtness, the new entity was called
"Trans-Jordan." It would be as if New Yorkers dubbed America west
of the Hudson as simply, "Trans-Hudson."

Trans-Jordan fought against Israel in 1947-8. But as Israeli historian Avi
Shlaim demonstrated in his 1988 book, "Collusion Across the Jordan: King
Abdullah, the Zionist Movement, and the Partition of Palestine," Abdullah's
real goal was to grab as much of Palestine as he could, leaving the Israelis
with the rest. Jordan lost its Palestinian turf - the West Bank - in 1967,
and has since renounced any claim on the area.

Ever since, the Jordanians have posed as moderate conciliators. But what
that really means is that the Jordanians tell the various sides what they
want to hear. The government recently hosted a conference denouncing the Israeli
"security wall" going up in the West Bank as an Apartheid-like abomination.
Yet in a 1996 strategy document from a right-wing Israeli think tank closely
associated with both the Likud Party and American neoconservatives, Jordan
was described as an ally that would work with Israel to "roll back"
Syria and Iraq. So which is it? Probably both. Even more recently, King Abdullah
traveled last week to Washington to bask in official acclaim and to collect
still more American aid.

It's a tricky game for the Hashemites because many, if not most, of the 5.3
million people here hail from Palestine. Indeed, in a bloody 1970 civil war,
the Hashemite army crushed the Palestine Liberation Organization.

1969 A riot following a police raid of the Stonewall Inn, a gay bar in New
York, becomes the symbolic start of the American gay rights movement.

1973 The newly formed National Gay Task Force and Lambda Legal Defense Fund
successfully lobby the American Psychiatric Association to remove homosexuality
from its list of mental disorders.

1981 The first cases of a mysterious immune deficiency among gay men are
documented by the Centers for Disease Control. Acquired Immune Deficiency Syndrome,
or AIDS, gets its name the next year.

1982 Wisconsin is the first state to ban discrimination on the basis of sexual
orientation.

1988 Gay Americans celebrate the first national "Coming Out Day" on
Oct. 11.

1989 Denmark becomes the first country to recognize a registered partnership
status for same-sex couples. Over the next decade, Norway, Sweden, Greenland,
Iceland, and the Netherlands also move to grant partnership status to same-sex
couples.