There will come a moment when the most urgent threats posed by the credit crisis have eased and the larger task before us will be to chart a direction for the economic steps ahead. This will be a dangerous moment. Behind the debates over future policy is a debate over history -- a debate over the causes of our current situation. The battle for the past will determine the battle for the present. So it's crucial to get the history straight.

What were the critical decisions that led to the crisis? Mistakes were made at every fork in the road -- we had what engineers call a "system failure," in which not a single decision but a cascade of decisions produces a tragic result. Let's look at five key moments.

No. 1: Firing the Chairman

In 1987 the Reagan administration decided to remove Paul Volcker as chairman of the Federal Reserve Board and appoint Alan Greenspan in his place. Volcker had done what central bankers are supposed to do. On his watch, inflation had been brought down from more than 11 percent to under 4 percent. In the world of central banking, that should have earned him a grade of A+++ and assured his re-appointment. But Volcker also understood that financial markets need to be regulated. Reagan wanted someone who did not believe any such thing, and he found him in a devotee of the objectivist philosopher and free-market zealot Ayn Rand.

Greenspan played a double role. The Fed controls the money spigot, and in the early years of this decade, he turned it on full force. But the Fed is also a regulator. If you appoint an anti-regulator as your enforcer, you know what kind of enforcement you'll get. A flood of liquidity combined with the failed levees of regulation proved disastrous.

Greenspan presided over not one but two financial bubbles. After the high-tech bubble popped, in 2000-2001, he helped inflate the housing bubble. The first responsibility of a central bank should be to maintain the stability of the financial system. If banks lend on the basis of artificially high asset prices, the result can be a meltdown -- as we are seeing now, and as Greenspan should have known. He had many of the tools he needed to cope with the situation. To deal with the high-tech bubble, he could have increased margin requirements (the amount of cash people need to put down to buy stock). To deflate the housing bubble, he could have curbed predatory lending to low-income households and prohibited other insidious practices (the no-documentation -- or "liar" -- loans, the interest-only loans, and so on). This would have gone a long way toward protecting us. If he didn't have the tools, he could have gone to Congress and asked for them.

Of course, the current problems with our financial system are not solely the result of bad lending. The banks have made mega-bets with one another through complicated instruments such as derivatives, credit-default swaps, and so forth. With these, one party pays another if certain events happen -- for instance, if Bear Stearns goes bankrupt, or if the dollar soars. These instruments were originally created to help manage risk -- but they can also be used to gamble. Thus, if you felt confident that the dollar was going to fall, you could make a big bet accordingly, and if the dollar indeed fell, your profits would soar. The problem is that, with this complicated intertwining of bets of great magnitude, no one could be sure of the financial position of anyone else -- or even of one's own position. Not surprisingly, the credit markets froze.
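The mechanics of such a bet can be sketched in a few lines of code; the contract, names, and numbers below are purely illustrative, not a description of any real instrument:

```python
# Toy model of a credit-default-swap-style bet (illustrative numbers only).
# One party pays a periodic premium; the other pays out the notional amount
# if the named event (say, a bankruptcy) occurs.

def swap_position(notional, annual_premium, years_paid, event_occurred):
    """Net gain or loss of the party buying protection (or just betting)."""
    payout = notional if event_occurred else 0
    return payout - annual_premium * years_paid

# A speculator with no exposure to the underlying firm can use the same
# contract as a pure gamble on the event.
print(swap_position(10_000_000, 200_000, 2, True))   # event occurs: 9600000
print(swap_position(10_000_000, 200_000, 2, False))  # no event: -400000
```

The asymmetry is the point: a small stream of premiums buys a claim on an enormous payout, and no outsider can see how many such bets are stacked on top of one another.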

Here too Greenspan played a role. When I was chairman of the Council of Economic Advisers, during the Clinton administration, I served on a committee of all the major federal financial regulators, a group that included Greenspan and Treasury Secretary Robert Rubin. Even then, it was clear that derivatives posed a danger. We didn't put it as memorably as Warren Buffett -- who saw derivatives as "financial weapons of mass destruction" -- but we took his point. And yet, for all the risk, the deregulators in charge of the financial system -- at the Fed, at the Securities and Exchange Commission, and elsewhere -- decided to do nothing, worried that any action might interfere with "innovation" in the financial system. But innovation, like "change," has no inherent value. It can be bad (the "liar" loans are a good example) as well as good.

No. 2: Tearing Down the Walls

The deregulation philosophy would pay unwelcome dividends for years to come. In November 1999, Congress repealed the Glass-Steagall Act -- the culmination of a $300 million lobbying effort by the banking and financial-services industries, spearheaded in Congress by Senator Phil Gramm. Glass-Steagall had long separated commercial banks (which lend money) and investment banks (which organize the sale of bonds and equities); it had been enacted in the aftermath of the Great Depression and was meant to curb the excesses of that era, including grave conflicts of interest. For instance, without separation, if a company whose shares had been issued by an investment bank, with its strong endorsement, got into trouble, wouldn't its commercial arm, if it had one, feel pressure to lend it money, perhaps unwisely? An ensuing spiral of bad judgment is not hard to foresee. I had opposed repeal of Glass-Steagall. The proponents said, in effect, Trust us: we will create Chinese walls to make sure that the problems of the past do not recur. As an economist, I certainly possessed a healthy degree of trust -- trust in the power of economic incentives to bend human behavior toward self-interest -- toward short-term self-interest, at any rate, rather than Tocqueville's "self-interest rightly understood."

The most important consequence of the repeal of Glass-Steagall was indirect -- it lay in the way repeal changed an entire culture. Commercial banks are not supposed to be high-risk ventures; they are supposed to manage other people's money very conservatively. It is with this understanding that the government agrees to pick up the tab should they fail. Investment banks, on the other hand, have traditionally managed rich people's money -- people who can take bigger risks in order to get bigger returns. When repeal of Glass-Steagall brought investment and commercial banks together, the investment-bank culture came out on top. There was a demand for the kind of high returns that could be obtained only through high leverage and big risk-taking.

There were other important steps down the deregulatory path. One was the decision in April 2004 by the Securities and Exchange Commission, at a meeting attended by virtually no one and largely overlooked at the time, to allow big investment banks to increase their debt-to-capital ratio (from 12:1 to 30:1, or higher) so that they could buy more mortgage-backed securities, inflating the housing bubble in the process. In agreeing to this measure, the S.E.C. argued for the virtues of self-regulation: the peculiar notion that banks can effectively police themselves. Self-regulation is preposterous, as even Alan Greenspan now concedes, and as a practical matter it can't, in any case, identify systemic risks -- the kinds of risks that arise when, for instance, the models used by each of the banks to manage their portfolios tell all the banks to sell some security all at once.
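The arithmetic behind that 2004 decision is simple to sketch. The figures below are hypothetical, and the toy calculation ignores funding costs; it shows only how leverage magnifies gains and losses on a bank's own capital:

```python
# At a 12:1 debt-to-capital ratio, each $1 of a bank's own capital
# controls roughly $13 of assets; at 30:1, roughly $31.

def return_on_equity(assets_per_dollar_of_equity, asset_return):
    """Return on the bank's own capital, ignoring funding costs."""
    return assets_per_dollar_of_equity * asset_return

for leverage in (13, 31):
    for move in (0.04, -0.04):
        roe = return_on_equity(leverage, move)
        print(f"assets/equity {leverage}: asset move {move:+.0%} -> ROE {roe:+.0%}")
```

At the higher ratio, a 4 percent fall in asset values produces a loss larger than the bank's entire capital -- exactly the fragility the change created.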

As we stripped back the old regulations, we did nothing to address the new challenges posed by 21st-century markets. The most important challenge was that posed by derivatives. In 1998 the head of the Commodity Futures Trading Commission, Brooksley Born, had called for derivatives to be regulated -- a concern that took on urgency after the Fed, in that same year, engineered the bailout of Long-Term Capital Management, a hedge fund whose trillion-dollar-plus failure threatened global financial markets. But Secretary of the Treasury Robert Rubin, his deputy, Larry Summers, and Greenspan were adamant -- and successful -- in their opposition. Nothing was done.

No. 3: Applying the Leeches

Then along came the Bush tax cuts, enacted first on June 7, 2001, with a follow-on installment two years later. The president and his advisers seemed to believe that tax cuts, especially for upper-income Americans and corporations, were a cure-all for any economic disease -- the modern-day equivalent of leeches. The tax cuts played a pivotal role in shaping the background conditions of the current crisis. Because they did very little to stimulate the economy, real stimulation was left to the Fed, which took up the task with unprecedentedly low interest rates and floods of liquidity. The war in Iraq made matters worse, because it led to soaring oil prices. With America so dependent on oil imports, we had to spend several hundred billion dollars more to purchase oil -- money that otherwise would have been spent on American goods. Normally this would have led to an economic slowdown, as it had in the 1970s. But the Fed met the challenge in the most myopic way imaginable. The flood of liquidity made money readily available in mortgage markets, even to those who would normally not be able to borrow. And, yes, this succeeded in forestalling an economic downturn; America's household saving rate plummeted to zero. But it should have been clear that we were living on borrowed money and borrowed time.

The cut in the tax rate on capital gains contributed to the crisis in another way. It was a decision that turned on values: those who speculated (read: gambled) and won were taxed more lightly than wage earners who simply worked hard. But more than that, the decision encouraged leveraging, because interest was tax-deductible. If, for instance, you borrowed a million dollars to buy a home or took out a $100,000 home-equity loan to buy stock, the interest would be fully deductible every year. Any capital gains you made were taxed lightly -- and at some possibly remote day in the future. The Bush administration was providing an open invitation to excessive borrowing and lending -- not that American consumers needed any more encouragement.
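A toy calculation makes the incentive concrete; all the rates and amounts below are hypothetical, chosen only to show the direction of the tilt:

```python
# Interest on the loan is deducted against ordinary income, while the
# gain on the asset is taxed at the lower capital-gains rate.

def after_tax_profit(borrowed, interest_rate, asset_return,
                     income_tax_rate, cap_gains_rate):
    interest = borrowed * interest_rate
    tax_saved = interest * income_tax_rate        # deduction against income tax
    gain = borrowed * asset_return
    gain_after_tax = gain * (1 - cap_gains_rate)  # taxed at the low rate
    return gain_after_tax - interest + tax_saved

# A $100,000 home-equity loan at 6 percent, used to buy stock returning
# 10 percent, with a 35 percent income-tax rate and a 15 percent
# capital-gains rate:
print(round(after_tax_profit(100_000, 0.06, 0.10, 0.35, 0.15)))  # prints 4600
```

Had the same gain been taxed as ordinary income at 35 percent, the trade would have netted only $2,600 instead of $4,600 -- the low capital-gains rate made leveraged speculation that much more attractive.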

No. 4: Faking the Numbers

Meanwhile, on July 30, 2002, in the wake of a series of major scandals -- notably the collapse of WorldCom and Enron -- Congress passed the Sarbanes-Oxley Act. The scandals had involved every major American accounting firm, most of our banks, and some of our premier companies, and made it clear that we had serious problems with our accounting system. Accounting is a sleep-inducing topic for most people, but if you can't have faith in a company's numbers, then you can't have faith in anything about a company at all. Unfortunately, in the negotiations over what became Sarbanes-Oxley a decision was made not to deal with what many, including the respected former head of the S.E.C. Arthur Levitt, believed to be a fundamental underlying problem: stock options. Stock options have been defended as providing healthy incentives toward good management, but in fact they are "incentive pay" in name only. If a company does well, the C.E.O. gets great rewards in the form of stock options; if a company does poorly, the compensation is almost as substantial but is bestowed in other ways. This is bad enough. But a collateral problem with stock options is that they provide incentives for bad accounting: top management has every incentive to provide distorted information in order to pump up share prices.

The incentive structure of the rating agencies also proved perverse. Agencies such as Moody's and Standard & Poor's are paid by the very people they are supposed to grade. As a result, they've had every reason to give companies high ratings, in a financial version of what college professors know as grade inflation. The rating agencies, like the investment banks that were paying them, believed in financial alchemy -- that F-rated toxic mortgages could be converted into products that were safe enough to be held by commercial banks and pension funds. We had seen this same failure of the rating agencies during the East Asia crisis of the 1990s: high ratings facilitated a rush of money into the region, and then a sudden reversal in the ratings brought devastation. But the financial overseers paid no attention.

No. 5: Letting It Bleed

The final turning point came with the passage of a bailout package on October 3, 2008 -- that is, with the administration's response to the crisis itself. We will be feeling the consequences for years to come. Both the administration and the Fed had long been driven by wishful thinking, hoping that the bad news was just a blip, and that a return to growth was just around the corner. As America's banks faced collapse, the administration veered from one course of action to another. Some institutions (Bear Stearns, A.I.G., Fannie Mae, Freddie Mac) were bailed out. Lehman Brothers was not. Some shareholders got something back. Others did not.

The original proposal by Treasury Secretary Henry Paulson, a three-page document that would have provided $700 billion for the secretary to spend at his sole discretion, without oversight or judicial review, was an act of extraordinary arrogance. He sold the program as necessary to restore confidence. But it didn't address the underlying reasons for the loss of confidence. The banks had made too many bad loans. There were big holes in their balance sheets. No one knew what was truth and what was fiction. The bailout package was like a massive transfusion to a patient suffering from internal bleeding -- and nothing was being done about the source of the problem, namely all those foreclosures. Valuable time was wasted as Paulson pushed his own plan, "cash for trash," buying up the bad assets and putting the risk onto American taxpayers. When he finally abandoned it, providing banks with the money they needed, he did it in a way that not only cheated America's taxpayers but failed to ensure that the banks would use the money to re-start lending. He even allowed the banks to pour out money to their shareholders as taxpayers were pouring money into the banks.

The other problem not addressed involved the looming weaknesses in the economy. The economy had been sustained by excessive borrowing. That game was up. As consumption contracted, exports kept the economy going, but with the dollar strengthening and Europe and the rest of the world declining, it was hard to see how that could continue. Meanwhile, states faced massive drop-offs in revenues -- they would have to cut back on expenditures. Without quick action by government, the economy faced a downturn. And even if banks had lent wisely -- which they hadn't -- the downturn was sure to mean an increase in bad debts, further weakening the struggling financial sector.

The administration talked about confidence building, but what it delivered was actually a confidence trick. If the administration had really wanted to restore confidence in the financial system, it would have begun by addressing the underlying problems -- the flawed incentive structures and the inadequate regulatory system.

Was there any single decision which, had it been reversed, would have changed the course of history? Every decision -- including decisions not to do something, as many of our bad economic decisions have been -- is a consequence of prior decisions, an interlinked web stretching from the distant past into the future. You'll hear some on the right point to certain actions by the government itself -- such as the Community Reinvestment Act, which requires banks to make mortgage money available in low-income neighborhoods. (Defaults on C.R.A. lending were actually much lower than on other lending.) There has been much finger-pointing at Fannie Mae and Freddie Mac, the two huge mortgage lenders, which were originally government-owned. But in fact they came late to the subprime game, and their problem was similar to that of the private sector: their C.E.O.'s had the same perverse incentive to indulge in gambling.

The truth is that most of the individual mistakes boil down to just one: a belief that markets are self-adjusting and that the role of government should be minimal. Looking back at that belief during hearings this fall on Capitol Hill, Alan Greenspan said out loud, "I have found a flaw." Congressman Henry Waxman pushed him, responding, "In other words, you found that your view of the world, your ideology, was not right; it was not working." "Absolutely, precisely," Greenspan said. The embrace by America -- and much of the rest of the world -- of this flawed economic philosophy made it inevitable that we would eventually arrive at the place we are today.

Joseph Stiglitz, a Nobel laureate, is a professor of economics
at Columbia University.