IMAGINE
you live in a slum. Your many children are shoeless, and often there's no
food on the table, two or three days at a time. You have an immensely
wealthy neighbor who built a mansion: many fine cars in the circular
drive, many television sets gleaming through the windows, and you can
smell meat roasting every night. On weekends there's music through the PA
system and dress balls out in the garden; you hear corks popping and
laughter. You peek through a hole in the fence to watch -- almost without
willing to, you just can't help but see it. You wonder if there will ever
be any trickle-down, a basket of fruit left on your doorstep. There isn't,
and it makes your heart sad, because you know your kids are watching all
this. Then one day, imagine, you hear bullets, screams, furniture and
glass breaking, and you realize this country -- er, this neighbor -- is
absolutely DYSFUNCTIONAL! The man is a violent bully. He shoots his wife
and his kids, or he sends them off to die in wars; he even shoots up his
neighbors. You hear that every servant in the house has been found in an
advanced state of starvation with no medical plan; he never saw to them.
Now, to top it off, for no reason, he sends his bully bodyguards into YOUR
meager little yard and bang, bang, dead people everywhere! You MUST really
hate this guy, right? And then suddenly it's on the radio: it turns out he
stole all his money from everyone in town and wasted it on vanity and
show, and now he hasn't a dime. He's poor like you. His creditors are
screaming; they're going down too . . .

CHOMSKY on the UGLY AMERICAN:

Chomsky: What America's 'Crisis' Means to the Rest of the World
By Noam Chomsky, Boston Review. Posted September 10, 2009.

Perhaps I may begin with a few words about the title. There is too much
nuance and variety to make such sharp distinctions as theirs-and-ours,
them-and-us. And neither I nor anyone can presume to speak for “us.” But
I will pretend it is possible.

There is also a problem with the term “crisis.” Which one? There are
numerous very severe crises, interwoven in ways that preclude any clear
separation. But again I will pretend otherwise, for simplicity.

One way to enter this morass is offered by the June 11 issue of the
New York Review of Books. The front-cover headline reads “How to Deal With
the Crisis”; the issue features a symposium of specialists on how to do
so. It is very much worth reading, but with attention to the definite article.
For the West the phrase “the crisis” has a clear enough meaning: the financial
crisis that hit the rich countries with great impact, and is therefore
of supreme importance. But even for the rich and privileged that is by
no means the only crisis, nor even the most severe. And others see the
world quite differently. For example, in the October 26, 2008 edition of
the Bangladeshi newspaper The New Nation, we read:
It’s very telling that trillions have already been spent to patch up
leading world financial institutions, while out of the comparatively small
sum of $12.3 billion pledged in Rome earlier this year, to offset the food
crisis, only $1 billion has been delivered. The hope that at least extreme
poverty can be eradicated by the end of 2015, as stipulated in the UN’s
Millennium Development Goals, seems as unrealistic as ever, not due to
lack of resources but a lack of true concern for the world’s poor.

The article goes on to predict that World Food Day in October 2009 “will
bring . . . devastating news about the plight of the world’s poor . . .
which is likely to remain that: mere ‘news’ that requires little action,
if any at all.” Western leaders seem determined to fulfill these grim predictions.
On June 11 the Financial Times reported, “the United Nations’ World Food
Programme is cutting food aid rations and shutting down some operations
as donor countries that face a fiscal crunch at home slash contributions
to its funding.” Victims include Ethiopia, Rwanda, Uganda, and others.
The sharp budget cut comes as the toll of hunger passes a billion—with
over one hundred million added in the past six months—while food prices
rise, and remittances decline as a result of the economic crisis in the
West.

As The New Nation anticipated, the “devastating news” released by the
World Food Programme barely even reached the level of “mere ‘news.’” In
The New York Times, the WFP report of the reduction in the meager Western
efforts to deal with this growing “human catastrophe” merited 150 words
on page ten under “World Briefing.” That is not in the least unusual. The
United Nations also released an estimate that desertification is endangering
the lives of up to a billion people, while announcing World Desertification
Day. Its goal, according to the Nigerian newspaper THISDAY, is “to combat
desertification and drought worldwide by promoting public awareness and
the implementation of conventions dealing with desertification in member
countries.” The effort to raise public awareness passed without mention
in the national U.S. press. Such neglect is all too common.

It may be instructive to recall that when they landed in what today
is Bangladesh, the British invaders were stunned by its wealth and splendor.
It was soon on its way to becoming the very symbol of misery, and not by
an act of God.

As the fate of Bangladesh illustrates, the terrible food crisis is not
just a result of “lack of true concern” in the centers of wealth and power.
In large part it results from very definite concerns of global managers:
for their own welfare. It is always well to keep in mind Adam Smith’s astute
observation about policy formation in England. He recognized that the “principal
architects” of policy—in his day the “merchants and manufacturers”—made
sure that their own interests had “been most peculiarly attended to” however
“grievous” the effect on others, including the people of England and, far
more so, those who were subjected to “the savage injustice of the Europeans,”
particularly in conquered India, Smith’s own prime concern in the domains
of European conquest.
Smith was referring specifically to the mercantilist system, but his
observation generalizes, and as such, stands as one of the few solid and
enduring principles of both international relations and domestic affairs.
It should not, however, be over-generalized. There are interesting cases
where state interests, including long-term strategic and economic interests,
overwhelm the parochial concerns of the concentrations of economic power
that largely shape state policy. Iran and Cuba are instructive cases, but
I will have to put these topics aside here.

The food crisis erupted first and most dramatically in Haiti in early
2008. Like Bangladesh, Haiti today is a symbol of misery and despair. And,
like Bangladesh, when European explorers arrived, the island was remarkably
rich in resources, with a large and flourishing population. It later became
the source of much of France’s wealth. I will not run through the sordid
history, but the current food crisis can be traced directly to 1915, Woodrow
Wilson’s invasion: murderous, brutal, and destructive. Among Wilson’s many
crimes was dissolving the Haitian Parliament at gunpoint because it refused
to pass “progressive legislation” that would have allowed U.S. businesses
to take over Haitian lands. Wilson’s Marines then ran a free election,
in which the legislation was passed by 99.9 percent of the 5 percent of
the public permitted to vote. All of this comes down through history as
“Wilsonian idealism.”

Later, the United States Agency for International Development (USAID)
instituted programs to turn Haiti into the “Taiwan of the Caribbean,” by
adhering to the sacred principle of comparative advantage: Haiti must import
food and other commodities from the United States, while working people,
mostly women, toil under miserable conditions in U.S.-owned assembly plants.
Haiti’s first free election, in 1990, threatened these economically rational
programs. The poor majority entered the political arena for the first time
and elected their own candidate, a populist priest, Jean-Bertrand Aristide.
Washington adopted the standard operating procedures for such a case, moving
at once to undermine the regime. A few months later came the anticipated
military coup, and the resulting junta instituted a reign of terror, which
was backed by Bush senior and even more fully by Clinton, despite pretenses.
By 1994 Clinton decided that the population was sufficiently intimidated
and sent U.S. forces to restore the elected president, but on the strict
condition that he accept a harsh neoliberal regime. In particular, there
must be no protection for the economy. Haitian rice farmers are efficient,
but cannot compete with U.S. agribusiness that relies on huge government
subsidies, thanks largely to Reagan, anointed High Priest of free trade
with little regard to his record of extreme protectionism and state intervention
in the economy.

For working people, small farmers, and the poor, at home and abroad,
all of this spells regular disaster. One of the reasons for the radical
difference in development between Latin America and East Asia in the last
half century is that Latin America did not control capital flight, which
often approached the level of its crushing debt and has regularly been
wielded as a weapon against the threat of democracy and social reform.
In contrast, during South Korea’s remarkable growth period, capital flight
was not only banned, but could bring the death penalty.

Where neoliberal rules have been observed since the ’70s, economic performance
has generally deteriorated and social democratic programs have substantially
weakened. In the United States, which partially accepted these rules, real
wages for the majority have largely stagnated for 30 years, instead of
tracking productivity growth as before, while work hours have increased,
now well beyond those of Europe. Benefits, which always lagged, have declined
further. Social indicators—general measures of the health of the society—also
tracked growth until the mid-’70s, when they began to decline, falling
to the 1960 level by the end of the millennium. Economic growth found its
way into few pockets, increasingly in the financial industries. Finance
constituted a few percentage points of GDP in 1970, and has since risen
to well over one-third, while productive industry has declined, and with
it, living standards for much of the workforce. The economy has been punctuated
by bubbles, financial crises, and public bailouts, currently reaching new
highs. A few outstanding international economists explained and predicted
these results from the start. But mythology about “efficient markets” and
“rational choice” prevailed. This is no surprise: it was highly beneficial
to the narrow sectors of privilege and power that provide the “principal
architects of policy.”

The phrase “golden age of capitalism” might itself be challenged. The
period can more accurately be called “state capitalism.” The state sector
was, and remains, a primary factor in development and innovation through
a variety of measures, among them research and development, procurement,
subsidy, and bailouts. In the U.S. version, these policies operated mainly
under a Pentagon cover as long as the cutting edge of the advanced economy
was electronics-based. In recent years there has been a shift toward health-oriented
state institutions as the cutting edge becomes more biology-based. The
outcomes include computers, the Internet, satellites, and most of the rest
of the IT revolution, but also much else: civilian aircraft, advanced machine
tools, pharmaceuticals, biotechnology, and a lot more. The crucial state
role in economic development should be kept in mind when we hear dire warnings
about government intervention in the financial system after private management
has once again driven it to crisis, this time, an unusually severe crisis,
and one that harms the rich, not just the poor, so it merits special concern.
It is a little odd, to say the least, to read economic historian Niall
Ferguson in the New York Review of Books symposium on “The Crisis” saying
that “the lesson of economic history is very clear. Economic growth . .
. comes from technological innovation and gains in productivity, and these
things come from the private sector, not from the state”—remarks that were
probably written on a computer and sent via the Internet, which were substantially
in the state sector for decades before they became available for private
profit. His is hardly the clear lesson of economic history.

Large-scale state intervention in the economy is not just a phenomenon
of the post-World War II era, either. On the contrary, the state has always
been a central factor in economic development. Once they gained their independence,
the American colonies were free to abandon the orthodox economic policies
that dictated adherence to their comparative advantage in export of primary
commodities while importing superior British manufacturing goods. Instead,
the Hamiltonian economy imposed very high tariffs so that an industrial
economy could develop: textiles, steel, and much else. The eminent economic
historian Paul Bairoch describes the United States as “the mother country
and bastion of modern protectionism,” with the highest tariffs in the world
during its great growth period. And protectionism is only one of the many
forms of state intervention. Protectionist policies continued until the
mid-twentieth century, when the United States was so far in the lead that
the playing field was tilted in the proper direction—that is, to the advantage
of U.S. corporations. And when necessary, it has been tilted further, notably
by Reagan, who virtually doubled protectionist barriers among other measures
to rescue incompetent U.S. corporate management unable to compete with
Japan.

From the outset the United States was following Britain’s lead. The
other developed countries did likewise, while orthodox policies were rammed
down the throats of the colonies, with predictable effects. It is noteworthy
that the one country of the (metaphorical) South to develop, Japan, also
successfully resisted colonization. Others that developed, like the United
States, did so after they escaped colonial domination. Selective application
of economic principles—orthodox economics forced on the colonies while
violated at will by those free to do so—is a basic factor in the creation
of the sharp North-South divide. Like many other economic historians, Bairoch
concludes from a broad survey that “it is difficult to find another case
where the facts so contradict a dominant theory” as the doctrine that free
markets were the engine of growth, a harsh lesson that the developing world
has learned again in recent decades. Even the poster child of neoliberalism,
Chile, depends heavily on the world’s largest copper producer, Codelco,
nationalized by Allende.

In earlier years the cotton-based economy of the industrial revolution
relied on massive ethnic cleansing and slavery, rather severe forms of
state intervention in the economy. Though theoretically slavery was ended
with the Civil War, it emerged again after Reconstruction in a form that
was in many ways more virulent, with what amounted to criminalization of
African-American life and widespread use of convict labor, which continued
until World War II. The industrial revolution, from the late nineteenth
century, relied heavily on this new form of slavery, a hideous story that
has only recently been exposed in its shocking detail in a very important
study by Wall Street Journal bureau chief Douglas Blackmon. During the
post-World War II “golden age,” African Americans were able for the first
time to enjoy some level of social and economic advancement, but the disgraceful
post-Reconstruction history has been partially reconstituted during the
neoliberal years with the rapid growth of what some criminologists call
“the prison-industrial complex,” a uniquely American crime committed continuously
since the 1980s and exacerbated by the dismantling of productive industry.

The American system of mass production that astonished the world in
the nineteenth century was largely created in military arsenals. Solving
the major nineteenth-century management problem—railroads—was beyond the
capacity of private capital, so the challenge was handed over to the army.
A century ago the toughest problems of electrical and mechanical engineering
involved placing a huge gun on a moving platform to hit a moving target—naval
gunnery. The leaders were Germany and England, and the outcomes quickly
spilled over into the civilian economy.

Some economic historians compare that episode to state-run space programs
today. Reagan’s “Star Wars” was sold to industry as a traditional gift
from government, and was understood that way elsewhere too: that is why
Europe and Japan wanted to buy in. There was a dramatic increase in the
state role after World War II, particularly in the United States, where
a good part of the advanced economy developed in this framework.

State-guided modes of economic development require considerable deceit
in a society where the public cannot be controlled by force. People cannot
be told that the advanced economy relies heavily on their risk-taking,
while eventual profit is privatized, and “eventual” can be a long time,
sometimes decades. After World War II Americans were told that their taxes
were going to defense against monsters about to overcome us—as in the ’80s,
when Reagan pulled on his cowboy boots and declared a National Emergency
because Nicaraguan hordes were only two days from Harlingen, Texas. Or
twenty years earlier when LBJ warned that there are only 150 million of
us and 3 billion of them, and if might makes right, they will sweep over
us and take what we have, so we have to stop them in Vietnam.

For those concerned with the realities of the Cold War, and how it was
used to control the public, one obvious moment to inspect carefully is
the fall of the Berlin Wall twenty years ago and its aftermath. Celebration
of the anniversary in November 2009 has already begun, with ample coverage,
which will surely increase as the date approaches. The revealing implications
of the policies that were instituted after the fall have, however, been
ignored, as in the past, and probably will continue to be come November.

Reacting immediately to the Wall’s fall, the Bush senior administration
issued a new National Security Strategy and budget proposal to set the
course after the collapse of Kennedy’s “monolithic and ruthless conspiracy”
to conquer the world and Reagan’s “evil empire”—a collapse that took with
it the whole framework of domestic population control. Washington’s response
was straightforward: everything will stay much the same, but with new pretexts.
We still need a huge military system, but for a new reason: the “technological
sophistication” of Third World powers.

We have to maintain the “defense industrial base,” a euphemism for state-supported
high-tech industry. We must also maintain intervention forces directed
at the Middle East’s energy-rich regions, where the threats to our interests
that required military intervention “could not be laid at the Kremlin’s
door,” contrary to decades of pretense. The charade had sometimes been
acknowledged, as when Robert Komer—the architect of President Carter’s
Rapid Deployment Force (later Central Command), aimed primarily at the
Middle East—testified before Congress in 1980 that the Force’s most likely
use was not resisting Soviet attack, but dealing with indigenous and regional
unrest, in particular the “radical nationalism” that has always been a
primary concern throughout the world.

With the Soviet Union gone, the clouds lifted, and actual policy concerns
were more visible for those who chose to see. The Cold War propaganda framework
made two fundamental contributions: sustaining the dynamic state sector
of the economy (of which military industry is only a small part) and protecting
the interests of the “principal architects of policy” abroad.

The fate of NATO exposes the same concerns, and it is highly pertinent
today. Prior to Gorbachev NATO’s announced purpose was to deter a Russian
invasion of Europe. The legitimacy of that agenda was debatable right from
the end of World War II. In May 1945 Churchill ordered war plans to be
drawn up for Operation Unthinkable, aimed at “the elimination of Russia.”
The plans—declassified ten years ago—are discussed extensively in the major
scholarly study of British intelligence records, Richard Aldrich’s The
Hidden Hand. According to Aldrich, they called for a surprise attack by
hundreds of thousands of British and American troops, joined by one hundred
thousand rearmed German soldiers, while the RAF would attack Soviet cities
from bases in Northern Europe. Nuclear weapons were soon added to the mix.

The official stand also was not easy to take too seriously a decade
later, when Khrushchev took over in Russia, and soon proposed a sharp mutual
reduction in offensive weaponry. He understood very well that the much
weaker Soviet economy could not sustain an arms race and still develop.
When the United States dismissed the offer, he carried out the reduction
unilaterally. Kennedy reacted with a substantial increase in military spending,
which the Soviet military tried to match after the Cuban missile crisis
dramatically revealed its relative weakness. The Soviet economy tanked,
as Khrushchev had anticipated. That was a crucial factor in the later Soviet
collapse.

But the defensive pretext for NATO at least had some credibility. After
the Soviet disintegration, the pretext evaporated. In the final days of
the USSR, Gorbachev made an astonishing concession: he permitted a unified
Germany to join a hostile military alliance run by the global superpower,
though Germany alone had almost destroyed Russia twice in the century.
There was a quid pro quo, recently clarified. In the first careful study
of the original documents, Mark Kramer, apparently seeking to refute charges
of U.S. duplicity, in fact shows that it went far beyond what had been
assumed. It turns out, Kramer wrote this year in The Washington Quarterly,
that Bush senior and Secretary of State James Baker promised Gorbachev
that “no NATO forces would ever be deployed on the territory of the former
GDR . . . NATO’s jurisdiction or forces would not move eastward.” They
also assured Gorbachev “that NATO would be transforming itself into a more
political organization.” There is no need to comment on that promise. What
followed tells us a lot more about the Cold War itself, and the world that
emerged from its ending.

As soon as Clinton came into office, he began the expansion of NATO
to the east. The process accelerated with Bush junior’s aggressive militarism.
These moves posed a serious security threat to Russia, which naturally
reacted by developing more advanced offensive military capacities. Obama’s
National Security Advisor, James Jones, has a still-more expansive vision:
he calls for extending NATO further east and south, becoming in effect
a U.S.-run global intervention force, as it is today in Afghanistan—“Afpak”
as the region is now called—where Obama is sharply escalating Bush’s war,
which had already intensified in 2004.

NATO Secretary-General Jaap de Hoop Scheffer informed a NATO meeting
that “NATO troops have to guard pipelines that transport oil and gas that
is directed for the West,” and more generally have to protect sea routes
used by tankers and other “crucial infrastructure” of the energy system.
These plans open a new phase of Western imperial domination—more politely
called “bringing stability” and “peace.”

As recently as November 2007, the White House announced plans for a
long-term military presence in Iraq and a policy of “encouraging the flow
of foreign investments to Iraq, especially American investments.” The plans
were withdrawn under Iraqi pressure, the continuation of a process that
began when the United States was compelled by mass demonstrations to permit
elections. In Afpak Obama is building enormous new embassies and other
facilities, on the model of the city-within-a-city in Baghdad. These new
installations in Iraq and Afpak are like no embassies in the world, just
as the United States is alone in its vast military-basing system and control
of the air, sea, and space for military purposes.

While Obama is signaling his intention to establish a firm and large-scale
presence in the region, he is also following General Petraeus’s strategy
to drive the Taliban into Pakistan, with potentially quite serious consequences
for this dangerous and unstable state facing insurrections throughout its
territory. These are most extreme in the tribal areas crossing the British-imposed
Durand line separating Afghanistan from Pakistan, which the Pashtun tribes
on both sides of the artificial border have never recognized, nor did the
Afghan government when it was independent. In an April publication of the
Center for International Policy, one of the leading U.S. specialists on
the region, Selig Harrison, writes that the outcome of Washington’s current
policies might well be “what Pakistani ambassador to Washington Husain
Haqqani has called an ‘Islamic Pashtunistan.’” Haqqani’s predecessor had
warned that if the Taliban and Pashtun nationalists merge, “we’ve had it,
and we’re on the verge of that.”

Prospects become still more ominous as drone attacks that embitter the
population are escalated with their huge civilian toll. Also troubling
is the unprecedented authority just granted General Stanley McChrystal—a
special forces assassin—to head the operations. Petraeus’s own counter-insurgency
adviser in Iraq, David Kilcullen, describes the Obama-Petraeus-McChrystal
policies as a fundamental “strategic error,” which may lead to “the collapse
of the Pakistani state,” a calamity that would “dwarf” other current crises.

It is also not encouraging that Pakistan and India are now rapidly expanding
their nuclear arsenals. Pakistan’s were developed with Reagan’s crucial
aid, and India’s nuclear weapons programs got a major shot in the arm from
the recent U.S.-India nuclear agreement, which was also a sharp blow to
the Non-Proliferation Treaty. India and Pakistan have twice come close
to nuclear war over Kashmir, and have also been engaged in a proxy war
in Afghanistan. These developments pose a very serious threat to world
peace.

Returning home, it is worth noting that the more sophisticated are aware
of the deceit that is employed as a device to control the public, and regard
it as praiseworthy. The distinguished liberal statesman Dean Acheson advised
that leaders must speak in a way that is “clearer than truth.” Harvard
Professor of the Science of Government Samuel Huntington, who quite frankly
explained the need to delude the public about the Soviet threat 30 years
ago, urged more generally that power must remain invisible: “The architects
of power in the United States must create a force that can be felt but
not seen. Power remains strong when it remains in the dark; exposed to
the sunlight it begins to evaporate.” An important lesson for those who
want power to devolve to the public, a critical battle that is fought daily.

Whether the deceit about the monstrous enemy was sincere or not, if
Americans a half century ago had been given the choice of directing their
tax money to Pentagon programs to enable their grandchildren to have computers,
iPods, the Internet, and so on, or putting it into developing a livable
and sustainable socioeconomic order, they might have made the latter choice.
But they had no choice. That is standard. There is a striking gap between
public opinion and public policy on a host of major issues, domestic and
foreign, and public opinion is often more sane, at least in my judgment.
It also tends to be fairly consistent over time, despite the fact that
public concerns and aspirations are marginalized or ridiculed—one very
significant feature of the yawning “democratic deficit,” the failure of
formal democratic institutions to function properly. That is no trivial
matter. In a forthcoming book, the writer and activist Arundhati Roy asks
whether the evolution of formal democracy in India and the United States—and
not only there—“might turn out to be the endgame of the human race.” It
is not an idle question.

It should be recalled that the American republic was founded on the
principle that there should be a democratic deficit. James Madison, the
main framer of the Constitutional order, held that power should be in the
hands of “the wealth of the nation,” the “more capable set of men,” who
have sympathy for property owners and their rights. Possibly with Shays’
Rebellion in mind, he was concerned that “the equal laws of suffrage” might
shift power into the hands of those who might seek agrarian reform, an
intolerable attack on property rights. He feared that “symptoms of a levelling
spirit” had appeared sufficiently “in certain quarters to give warning
of the future danger.” Madison sought to construct a system of government
that would “protect the minority of the opulent against the majority.”
That is why his constitutional framework did not have coequal branches:
the legislature prevailed, and within the legislature, power was to be
vested in the Senate, where the wealth of the nation would be dominant
and protected from the general population, which was to be fragmented and
marginalized in various ways. As historian Gordon Wood summarizes the thoughts
of the founders: “The Constitution was intrinsically an aristocratic document
designed to check the democratic tendencies of the period,” delivering
power to a “better sort” of people and excluding “those who were not rich,
well born, or prominent from exercising political power.”

In Madison’s defense, his picture of the world was pre-capitalist: he
thought that power would be held by the “enlightened Statesman” and “benevolent
philosopher,” men who are “pure and noble,” a “chosen body of citizens,
whose wisdom may best discern the true interests of their country and whose
patriotism and love of justice would be least likely to sacrifice it to
temporary or partial considerations,” guarding the public interest against
the “mischiefs” of democratic majorities. Adam Smith had a clearer vision.

There has been constant struggle over this constrained version of democracy,
which we call “guided democracy” in the case of enemies: Iran right now,
for example. Popular struggles have won a great many rights, but concentrated
power and privilege clings to the Madisonian conception in ways that vary
as society changes. By World War I, business leaders and elite intellectuals
recognized that the population had won so many rights that they could not
be controlled by force, so it would be necessary to turn to control of
attitudes and opinions. Those are the years when the huge public relations
industry emerged—in the freest countries of the world, Britain and United
States, where the problem was most acute. The industry was devoted to what
Walter Lippmann approvingly called “a new art in the practice of democracy,”
the “manufacture of consent”—the “engineering of consent” in the phrase
of his contemporary Edward Bernays, one of the founders of the public relations
industry. Both Lippmann and Bernays took part in Wilson’s state propaganda
organization, the Committee on Public Information, created to drive a pacifist
population to jingoist fanaticism and hatred of all things German. It succeeded
brilliantly. The same techniques, it was hoped, would ensure that the “intelligent
minorities” would rule, undisturbed by “the trampling and the roar of a
bewildered herd,” the general public, “ignorant and meddlesome outsiders”
whose “function” is to be “spectators,” not “participants.” This was a
central theme of the highly regarded “progressive essays on democracy”
by the leading public intellectual of the twentieth century (Lippmann),
whose thinking captures well the perceptions of progressive intellectual
opinion: President Wilson, for example, held that an elite of gentlemen
with “elevated ideals” must be empowered to preserve “stability and righteousness,”
essentially the Madisonian perspective. In more recent years, the gentlemen
are transmuted into the “technocratic elite” and “action intellectuals”
of Camelot, “Straussian” neocons, or other configurations. But throughout,
one or another variant of the doctrine prevails, with its Leninist overtones.

And on a more hopeful note, popular struggle continues to clip its wings,
quite impressively so in the wake of 1960s activism, which had a substantial
impact on civilizing the country and raised its prospects to a considerably
higher plane.

Returning to what the West sees as “the crisis”—the financial crisis—it
will presumably be patched up somehow, while leaving the institutions that
created it pretty much in place. Recently the Treasury Department permitted
early TARP repayments, which reduce bank capacity to lend, as was immediately
pointed out, but allow the banks to pour money into the pockets of the
few who matter. The mood on Wall Street was captured by two Bank of New
York Mellon employees, who, as reported in The New York Times, “predicted
their lives—and pay—would improve, even if the broader economy did not.”

The chair of the prominent law firm Sullivan & Cromwell offered
the equally apt prediction that “Wall Street, after getting billions of
taxpayer dollars, will emerge from the financial crisis looking much the
same as before markets collapsed.” The reasons were pointed out by, among
others, Simon Johnson, former chief economist of the IMF: “Throughout the
crisis, the government has taken extreme care not to upset the interests
of the financial institutions, or to question the basic outlines of the
system that got us here,” and the elite business interests “[that] played
a central role in creating the crisis, making ever-larger gambles, with
the implicit backing of the government, until the inevitable collapse .
. . are now using their influence to prevent precisely the sorts of reforms
that are needed, and fast, to pull the economy out of its nosedive.”

Meanwhile “the government seems helpless, or unwilling, to act against
them.” Again no surprise, at least to those who remember their Adam Smith.

But there is a far more serious crisis, even for the rich and powerful.
It is discussed by Bill McKibben, who has been warning for years about
the impact of global warming, in the same issue of the New York Review
of Books that I mentioned earlier. His recent article relies on the British
Stern report, which is very highly regarded by leading scientists and a
raft of Nobel laureates in economics. On this basis McKibben concludes,
not unrealistically, “2009 may well turn out to be the decisive year in
the human relationship with our home planet.” In December a conference
in Copenhagen is “to sign a new global accord on global warming,” which
will tell us “whether or not our political systems are up to the unprecedented
challenge that climate change represents.” He thinks the signals are mixed.
That may be optimistic, unless there is a really massive public campaign
to overcome the insistence of the managers of the state-corporate sector
on privileging short-term gain for the few over the hope that their grandchildren
will have a decent future.

At least some of the barriers are beginning to crumble—in part because
the business world perceives new opportunities for profit. Even The Wall
Street Journal, one of the most stalwart deniers, recently published a
supplement with dire warnings about “climate disaster,” urging that none
of the options being considered may be sufficient, and it may be necessary
to undertake more radical measures of geoengineering, “cooling the planet”
in some manner.

As always, those who suffer most will be the poor. Bangladesh will soon
have a lot more to worry about than even the terrible food crisis. As the
sea level rises, much of the country, including its most productive regions,
might be under water. Current crises are almost sure to be exacerbated
as the Himalayan glaciers continue to disappear, and with them the great
river systems that keep South Asia alive. Right now, as glaciers melt in
the mountain heights where Pakistani and Indian troops suffer and die,
they expose the relics of their crazed conflict over Kashmir, “a pristine
monument to human folly,” Roy comments with despair.

The picture might be much more grim than even the Stern report predicts.
A group of MIT scientists have just released the results of what they describe
as the most comprehensive modeling yet carried out on the likelihood of
how much hotter the Earth’s climate will get in this century, [showing]
that without rapid and massive action, the problem will be about twice
as severe as previously estimated six years ago—and could be even worse
than that. It is worse because the model does not fully incorporate other
positive feedbacks
that can occur, for example, if increased temperatures caused a large-scale
melting of permafrost in arctic regions and subsequent release of large
quantities of methane.

The leader of the project says, “There’s no way the world can or should
take these risks,” and that “the least-cost option to lower the risk is
to start now and steadily transform the global energy system over the coming
decades to low or zero greenhouse gas-emitting technologies.” There is
far too little sign of that.

While new technologies are essential, the problems go well beyond. We
have to face up to the need to reverse the huge state-corporate social
engineering projects of the post-World War II period, which quite purposefully
promoted an energy-wasting and environmentally destructive fossil fuel-based
economy. The state-corporate programs, which included massive projects
of suburbanization along with destruction and then gentrification of inner
cities, began with a conspiracy by General Motors, Firestone, and Standard
Oil of California to buy up and destroy efficient electric public transportation
systems in Los Angeles and dozens of other cities; they were convicted
of criminal conspiracy and given a slap on the wrist. The federal government
then took over, relocating infrastructure and capital stock to suburban
areas and creating the massive interstate highway system, under the usual
pretext of “defense.” Railroads were displaced by government-financed motor
and air transport.

The programs were understood as a means to prevent a depression after
the Korean War. One of their Congressional architects described them as
“a nice solid floor across the whole economy in times of recession.” The
public played almost no role, apart from choice within the narrowly structured
framework of options designed by state-corporate managers. One result is
atomization of society and entrapment of isolated individuals with self-destructive
ambitions and crushing debt. These efforts to “fabricate consumers” (to
borrow Veblen’s term) and to direct people “to the superficial things of
life, like fashionable consumption” (in the words of the business press),
emerged from the recognition a century ago of the need to curtail democratic
achievements and to ensure that the “opulent minority” are protected from
the “ignorant and meddlesome outsiders.”

While state-corporate power was vigorously promoting privatization of
life and maximal waste of energy, it was also undermining the efficient
choices that the market does not provide—another destructive built-in market
inefficiency. To put it simply, if I want to get home from work, the market
offers me a choice between a Ford and a Toyota, but not between a car and
a subway. That is a social decision, and in a democratic society, would
be the decision of an organized public. But that is just what the dedicated
elite attack on democracy seeks to undermine.

The consequences are right before our eyes in ways that are sometimes
surreal. In May The Wall Street Journal reported:
U.S. transportation chief [Ray LaHood] is in Spain meeting with high-speed
rail suppliers. . . . Europe’s engineering and rail companies are lining
up for some potentially lucrative U.S. contracts for high-speed rail projects.
At stake is $13 billion in stimulus funds that the Obama administration
is allocating to upgrade existing rail lines and build new ones that could
one day rival Europe’s fastest. . . . [LaHood is also] expected to visit
Spanish construction, civil engineering and train-building companies.

Spain and other European countries are hoping to get U.S. taxpayer funding
for the high-speed rail and related infrastructure that is badly needed
in the United States. At the same time, Washington is busy dismantling
leading sectors of U.S. industry, ruining the lives of the workforce and
communities. It is difficult to conjure up a more damning indictment of
the economic system that has been constructed by state-corporate managers.
Surely the auto industry could be reconstructed to produce what the country
needs, using its highly skilled workforce—and what the world needs, and
soon, if we are to have some hope of averting major catastrophe. It has
been done before, after all. During World War II the semi-command economy
not only ended the Depression but initiated the most spectacular period
of growth in economic history, virtually quadrupling industrial production
in four years as the economy was retooled for war, and also laying the
basis for the “golden age” that followed.

Warnings about the purposeful destruction of U.S. productive capacity
have been familiar for decades and perhaps sounded most prominently by
the late Seymour Melman. Melman also pointed to a sensible way to reverse
the process. The state-corporate leadership has other commitments, but
there is no reason for passivity on the part of the “stakeholders”—workers
and communities. With enough popular support, they could take over the
plants and carry out the task of reconstruction themselves. That is not
a particularly radical proposal. One standard text on corporations, The
Myth of the Global Corporation, points out, “nowhere is it written in stone
that the short-term interests of corporate shareholders in the United States
deserve a higher priority than all other corporate ‘stakeholders.’”

It is also important to remind ourselves that the notion of workers’
control is as American as apple pie. In the early days of the industrial
revolution in New England, working people took it for granted that “those
who work in the mills should own them.” They also regarded wage labor as
different from slavery only in that it was temporary; Abraham Lincoln held
the same view.
And the leading twentieth-century social philosopher, John Dewey, basically
agreed. Much like nineteenth-century working people, he called for elimination
of “business for private profit through private control of banking, land,
industry, reinforced by command of the press, press agents and other means
of publicity and propaganda.” Industry must be changed “from a feudalistic
to a democratic social order” based on workers’ control, free association,
and federal organization, in the general style of a range of thought that
includes, along with many anarchists, G.D.H. Cole’s guild socialism and
such left Marxists as Anton Pannekoek, Rosa Luxemburg, Paul Mattick, and
others. Unless those goals are attained, Dewey held, politics will remain
“the shadow cast on society by big business, [and] the attenuation of the
shadow will not change the substance.” He argued that without industrial
democracy, political democratic forms will lack real content, and people
will work “not freely and intelligently,” but for pay, a condition that
is “illiberal and immoral”—ideals that go back to the Enlightenment and
classical liberalism before they were wrecked on the shoals of capitalism,
as the anarchosyndicalist thinker Rudolf Rocker put it 70 years ago.

There have been immense efforts to drive these thoughts out of people’s
heads—to win what the business world called “the everlasting battle for
the minds of men.” On the surface, corporate interests may appear to have
succeeded,
but one need not dig too deeply to find latent resistance that can be revived.
There have been some important efforts. One was undertaken 30 years ago
in Youngstown, Ohio, where U.S. Steel was about to shut down a major facility
at the heart of this steel town. First came substantial protests by the
workforce and community, then an effort led by Staughton Lynd to convince
the courts that stakeholders should have the highest priority. The effort
failed that time, but with enough popular support it could succeed.

It is a propitious time to revive such efforts, though it would be necessary
to overcome the effects of the concerted campaign to drive our own history
and culture out of our minds. A dramatic illustration of the challenge
arose in early February 2009, when President Obama decided to show his
solidarity with working people by giving a talk at a factory in Illinois.
He chose a Caterpillar plant, over objections of church, peace, and human
rights groups that were protesting Caterpillar’s role in providing Israel
with the means to devastate the territories it occupies and to destroy
the lives of the population. A Caterpillar bulldozer had also been used
to kill American volunteer Rachel Corrie, who tried to block the destruction
of a home. Apparently forgotten, however, was something else. In the 1980s,
following Reagan’s lead with the dismantling of the air traffic controllers’
union, Caterpillar managers decided to rescind their labor contract with
the United Auto Workers and seriously harm the union by bringing in scabs
to break a strike for the first time in generations. The practice was illegal
in other industrial countries apart from South Africa at the time; now
the United States is in splendid isolation, as far as I know.

Whether Obama purposely chose a corporation that led the way to undermine
labor rights I don’t know. More likely, he and his handlers were unaware
of the facts.

But at the time of Caterpillar’s innovation in labor relations, Obama
was a civil rights lawyer in Chicago. He certainly read the Chicago Tribune,
which published a careful study of these events. The Tribune reported that
the union was “stunned” to find that unemployed workers crossed the picket
line with no remorse, while Caterpillar workers found little “moral support”
in their community, one of the many where the union had “lifted the standard
of living.” Wiping out those memories is another victory for the highly
class-conscious American business sector in its relentless campaign to
destroy workers’ rights and democracy.

The union leadership had refused to understand. It was only in 1978
that UAW President Doug Fraser recognized what was happening and criticized
the “leaders of the business community” for having “chosen to wage a one-sided
class war in this country—a war against working people, the unemployed,
the poor, the minorities, the very young and the very old, and even many
in the middle class of our society,” and for having “broken and discarded
the fragile, unwritten compact previously existing during a period of growth
and progress.” Placing one’s faith in a compact with owners and managers
is suicidal. The UAW is discovering that again today, as the state-corporate
leadership proceeds to eliminate the hard-fought gains of working people
while dismantling the productive core of the American economy.

Investors are now wailing that the unions are being granted “workers’
control” in the restructuring of the auto industry, but they surely know
better. The government task force ensured that the workforce will have
no shareholder voting rights and will lose benefits and wages, eliminating
what was the gold standard for blue-collar workers.

This is only a fragment of what is underway. It highlights the importance
of short- and long-term strategies to build—in part resurrect—the foundations
of a functioning democratic society. An immediate goal is to pressure Congress
to permit organizing rights, the Employee Free Choice Act that was promised
but
seems to be languishing. One short-term goal is to support the revival
of a strong and independent labor movement, which in its heyday was a critical
base for advancing democracy and human and civil rights, a primary reason
why it has been subject to such unremitting attack in policy and propaganda.
A longer-term goal is to win the educational and cultural battle that has
been waged with such bitterness in the “one-sided class war” that the UAW
president perceived far too late. That means tearing down an enormous edifice
of delusions about markets, free trade, and democracy that has been assiduously
constructed over many years, and overcoming the marginalization and atomization
of the public so that they can become “participants,” not mere “spectators
of action,” as progressive democratic theoreticians have prescribed.

Of all of the crises that afflict us, the growing democratic deficit
may be the most severe. Unless it is reversed, Roy’s forecast may prove
accurate. The conversion of democracy to a performance with the public
as mere spectators—hardly a distant possibility—might have truly dire consequences.