Democratic Delusions

Richard Ellis always votes no. Since moving to Oregon in 1990 to teach political science at Willamette University, Ellis has been asked to pass judgment on 74 statewide initiatives, an average of more than 12 per election. Initiatives are proposed laws or constitutional amendments placed on a state's ballot by citizen petition (that's how they differ from referenda, which originate in the legislature). In the last decade alone, Oregonians have been required to make binding decisions on proposals to roll back property taxes, reduce public-employee benefits, impose term limits on legislators, and make prisoners work 40-hour weeks, among dozens of other mostly right-wing measures. This year there are serious efforts under way to make voters decide on such matters as "paycheck protection" (requiring individual workers' explicit permission to spend union dues on political causes), more term limits, and judicial elections with a "none-of-the-above" option.

Ellis adopted his "no matter what the issue, no matter what the measure, I vote no" policy because he hopes "to promote greater skepticism of the populist mantle with which the initiative process is invariably cloaked." Far from being a "pure, direct expression of the popular will," the initiative is a device used mostly by a dark trinity of wealthy individuals, special interests, and professional initiative activists to force their pet causes to the top of the political agenda. As such, initiatives enfeeble the normal processes of constitutional democracy, lure courts into dense political thickets, burden voters with decisions they would rather not have to make, and bind future generations by granting constitutional status to fleeting passions. It's a strong indictment, and Ellis makes his case with the expertise of an accomplished political scientist, the grace of a talented writer, and the fervor of a committed public citizen.

Americans aren't completely sold on initiatives as a way of conducting public business: Only 24 states (beginning with South Dakota in 1898) allow them. The roots of this ambivalence are in the clash between American constitutionalism and American political culture, a subject that Ellis has plumbed expertly in other books.

Two of the leading values in American political culture, Ellis argues, are populism and libertarianism. These values are in tension with each other on all sorts of issues. For example, our populist strain responds to calls for strict regulation of big business and is open to prayer in public schools; our libertarian strain exalts the unfettered market and prefers the dogma-free classroom. Yet when it comes to the initiative, there's been no tension at all between the two values. The populist faith in the people and the libertarian suspicion of political institutions have converged behind almost anything that smacks of direct democracy.

But Americans also admire, even exalt, constitutional government, which eschews direct democracy for what Ellis calls "deliberative democracy." One great advantage of the "long, drawn-out process by which a bill becomes a law" -- through the ordinary process of committee hearings and floor debates in a bicameral legislature -- is that the bill's design flaws, both political and technical, can be weeded out. Another advantage is that by the time the legislature gets done with a controversial bill, the result is seldom a complete victory for one side and a total defeat for the other. Instead, deliberative democracy usually works to "balance rival interests and needs rather than have one side trump the other." One side trumping the other is, of course, the inevitable outcome of the winner-take-all initiative process.

Through most of American history, Americans' ambivalence about the initiative manifested itself in occasional rather than steady use of the process. There was an initial burst of activity in the 1910s, when the initiative was new, but Ellis punctures the myth, perpetuated by modern initiative advocates and some scholars, that this was a latter-day Periclean age. He shows that in most states, it wasn't pink-cheeked, petition-wielding idealists who enacted Progressive Era reforms like primaries, direct election of U.S. senators, and women's suffrage over the opposition of corrupt politicians. Instead "the Progressive reformation happened the old-fashioned way: through electing representatives who did the people's bidding" in the legislature. Ellis notes that when initiatives gave the voters a voice on women's suffrage, they usually said no. What they liked instead in the 1910s was Prohibition.

By the 1940s, initiative use had fallen to half or less of its earlier peak. Ellis shows that from 1942 to 1971, the typical initiative state had one measure on the ballot every two years, a function of "demand-side" politics. Voters who were generally pleased with how their demands were being met by elected officials seldom saw the need to end-run them.

Recent decades have been different. The number of initiatives that reached state ballots rose from fewer than 100 nationwide in the 1960s to around 250 in the 1980s and nearly 400 in the 1990s. Seventy-six initiatives were voted on in 2000 alone. Yet this was no prairie fire of authentic grass-roots democracy. In 1998, for example, nearly half the initiative states had just one measure on the ballot, or none at all. Only one state, Mississippi, added the initiative to its constitution during the 1990s, and it shackled the process with several checks to keep it from being used rashly. The campaign for a national initiative amendment to the U.S. Constitution, which seemed so promising 20 years ago, "dropped almost completely off the national radar."

Far from being a national phenomenon, the recent outbreak of initiatives has been confined mostly to California and five other western states: Oregon, Colorado, North Dakota, Arizona, and Washington. Six in 10 of the initiatives that reached the ballot between 1990 and 2000 were in that handful of states. Thus, Ellis argues, the explanation for the recent outbreak of initiatives can't be the mood of the voters -- if it were, then initiative use would be expanding across the map, citizens in noninitiative states would be clamoring for the right to make laws by ballot, and Congress would be feeling pressure to create a national initiative.

Instead, the explanation for initiative fever lies in the realm of "supply-side" politics. "According to this theory," Ellis explains, "the number of initiatives on the ballot is determined not by the demands of the people but by the suppliers of initiatives."

Who are these suppliers? George Soros is one. Like several other wealthy individuals whom Ellis describes, Soros has turned the ballots of about a dozen states (so far) into his personal legislature on the issue of drug decriminalization -- sometimes in the form of medical-marijuana initiatives and sometimes with measures to limit the power of law-enforcement officers to seize assets from drug dealers. Soros is able to do this because he can hire professional signature-gathering firms to rustle up all the signed petitions it takes to secure his ideas a place on the ballot. Ironically, Ellis points out, states established these signature requirements in the first place to assure that voters would only have to decide on initiatives that have broad-based grass-roots support.

Special interests with deep pockets are the second prime supplier of initiatives. Nearly every initiative state recently has had at least one casino-financed gambling measure on its ballot, including a 1998 proposal to legalize tribal casinos in California that unleashed $97 million in campaign spending by the Indian gambling industry (pro) and a host of Las Vegas casino companies (con). That measure passed, but gambling initiatives usually lose. The financial stakes are so high, however, that casino companies seeking new markets may come back election after election. After all, they only have to win once.

Professional initiative activists are the third major supplier of state ballot measures. Oregon's Bill Sizemore, for example, placed six initiatives before his state's voters in 2000. Political life and financial livelihood, so often in tension for elected officials, happily coincide for Sizemore. The antitax, antiunion organization he heads, Oregon Taxpayers United, hires the signature-gathering firm that he owns, I&R Petition Services, Inc., to qualify measures for the ballot. Although all six of his 2000 initiatives lost, Sizemore managed to divert the political energies and resources of Oregon's unions and other liberal organizations into battling against his initiatives instead of promoting their own causes.

As Ellis shows, the pioneers of the initiative process in America were left-wing populists like Eugene Debs and Edward Bellamy and middle-class progressives like Woodrow Wilson, leaders who wanted to enhance popular control of government. Most contemporary practitioners of supply-side initiative politics, however, are unaccountable to the people. An elected official who persistently promotes causes that the voters despise can be tossed out of office. But Soros and Sizemore can just keep coming back with more ballot measures. When Sizemore ran for governor in 1998 as the Republican nominee, he received 30 percent of the vote, the worst defeat of a major party gubernatorial candidate in modern Oregon history. Yet two years later his six initiatives dominated the ballot. This year he is promoting several more antitax and antiunion initiatives.

One other electorally unaccountable group has seen its power enhanced by the initiative process: judges. Nowadays it's a dead-on certainty that any controversial proposal will face repeated challenges in court. Round One usually involves a lawsuit opposing the "ballot title" -- that is, the words used to describe the initiative on the ballot. The difference between banning "abortion of any fetus located wholly or partly in the birth canal" and banning "the killing of a child in the process of being born" (the choice faced by the Washington State Supreme Court in 1998) is, after all, worth fighting over.

A second round of legal challenges begins if the initiative passes. Ellis reports that about half of all voter-approved initiatives of the past 40 years have been attacked in court (the figure is two-thirds in California) -- and about half the challenges have been successful. Most judges don't like getting embroiled in the political process, but they're stuck. "In this environment," Ellis writes, "the election itself becomes almost an interlude between the main legal dramas that occur prior to a measure's qualification and immediately following the election."

Ellis's indictment of the initiative is so persuasive that one can forgive him for not adequately discussing the effects it has on governors and legislators. As <i>The New York Times</i> reported on March 3, a raft of successful ballot measures in states such as Colorado and Arizona, passed in isolation from each other, have imposed contradictory tax limitations and new spending requirements (usually for schools) on state governments that have rendered the normal processes of constitutional government almost unworkable.

Ellis accompanies his critique of the initiative process with a number of sensible suggestions for reforming it, most of which are already in place in one or more states. Why not borrow from Illinois, he asks, and stipulate that initiatives to amend a state constitution need a three-fifths majority to pass? After all, constitutional amendments enacted through the legislative process usually require a supermajority. Or why not adopt the Nevada model and insist that an initiative be approved in two consecutive general elections before taking effect, which would give the state legislature time to address the issue between votes?

But Ellis is smart enough to realize that the roots of the initiative are deeply embedded in American political culture, that is, in libertarianism and populism. What needs to change -- and Ellis hopes that his book, along with other recent anti-initiative works like Peter Schrag's <i>Paradise Lost: California's Experience, America's Future</i> and David Broder's <i>Democracy Derailed: Initiative Campaigns and the Power of Money</i>, will help -- is not our adherence to libertarianism and populism but rather our understanding of how these cultural values apply to the initiative. The libertarian in us will continue to distrust politicians, but maybe we can be convinced that entrusting political authority to wealthy individuals, special interests, and professional initiative activists who can't be voted out of office is worse. As for our populist faith in the people, what has that to do with a process that makes courts the ultimate arbiters of public policy? Americans need not abandon our values; we just need to apply them differently. By writing <i>Democratic Delusions</i>, Ellis has made a major contribution to fostering that new way of thinking. And in the meantime, he wants us to just vote no.

Fantasia: The Gospel According to C.S. Lewis

Last June, before Hobbits and Harry Potter began crowding out all other arts coverage, <i>The New York Times</i> ran a front-page story about <i>The Chronicles of Narnia</i>, the seven-volume series of children's fantasy books written by the English novelist C.S. Lewis in the 1950s. The article was called "Marketing 'Narnia' without a Christian Lion" -- and apparently the headline was as far as either Andrew Greeley or Charles Colson got before throwing down their newspapers in disgust. Greeley (who is a gadfly sociologist, priest, and romance novelist) and Colson (the famously born-again Watergate-era adviser to Richard Nixon) are widely published Christian commentators. Both took the <i>Times</i> headline to mean that, as Greeley huffed in a syndicated column, Lewis's publisher HarperCollins "intends to censor out of C.S. Lewis's masterpiece that which is most essential to it -- its Christian imagery -- because that imagery would be offensive to secularists." Readers who experience the bowdlerized versions of the stories, Colson complained in a radio commentary, "won't . . . really be experiencing Lewis at all." Moreover, "it won't do them any good," he asserted, as though literature were like vitamins or brussels sprouts. Urging his readers to boycott HarperCollins, Colson commended Zondervan Books for its plan "to continue publishing the Narnia books in their original form."

In truth, the dudgeon of Colson and Greeley was somewhat misdirected. HarperCollins is <i>already</i> republishing the Narnia series in its original format, both under its own imprint and under the Zondervan imprint, which is actually a HarperCollins subsidiary. And the <i>Times</i> story -- which neither Colson nor Greeley took the time to read carefully -- was not about publishing a new version of the <i>Chronicles</i> with the Christian elements excised, but rather about the new strategy HarperCollins had launched to market Narnia.

Still, the publisher's new, three-part marketing scheme did raise hackles, particularly among Christian conservatives. The first element of the new strategy raised no objections: to publish several editions of the complete <i>Chronicles</i>, ranging from cheap to deluxe and from one volume to seven, along with an audio edition read by famous actors. A second element -- to create a line of Narnia toys -- provoked alarums of tackiness, but not much more.

The final element of HarperCollins's campaign, however, aroused real concern: a new series of wholly secular Narnia novels and picture books that the publisher plans to commission for younger readers. Online Lewis discussion groups like <a href="http://members.aol.com/merelewis/">MereLewis</a> and <a href="http://ibiblio.org/usenet-i/groups-html/alt.books.cs-lewis.html">alt.books.cs-lewis</a> were flooded with angry and fearful comments about HarperCollins, mostly from Christian fans of Lewis. As <a href="http://www.beliefnet.com/">beliefnet.com</a> columnist Frederica Matthewes-Green summarized their laments, "to many [conservative Christians], downplaying Lewis's faith seems like one more in a string of insults."

But whose agenda are C.S. Lewis's defenders pursuing? Not Lewis's. As Douglas Gresham, Lewis's adopted stepson and a nondenominational Christian preacher in Ireland, argues, "the surest way to prevent secularists and their children from reading [the <i>Chronicles</i>] is to keep it in the 'Christian' or 'Religious' section of the bookstores." After all, the Narnia books have rarely been marketed as "Christian" literature; nor, surely, have they been read that way, especially by children. As <a href="http://slate.msn.com/?id=110460">Lauren Winner</a> recollected in <i>Slate</i>, when she and her non-Christian friends read the <i>Chronicles</i> in grammar school, "we just thought we were reading a riveting tale, one in which, as in so much children's literature, good triumphs over evil and a hero brings on a utopian reign of peace." That's the experience most young readers have, and it's the experience Lewis wanted them to have: "a pre-baptism of the child's imagination" that, years later, may draw them into faith.

This was the experience that Lewis himself had. Growing up in Belfast in the early 1900s, he felt that Christianity was boring; mythology, on the other hand, was interesting. Although Lewis was taken by his parents to the Anglican church on Sunday mornings, worship there was as much a political act as a religious one, a way for Irish Protestants to let it be known that they were loyal subjects of the crown, not Roman papists. What Lewis found in church was arid, sterile, and cold -- "the dry husks of religion," as he put it. In contrast, the Irish, Norse, and Greek myths he read in storybooks were filled with dash and color: gods, wars, exotic creatures, intrigue, and emotions. So taken was Lewis by mythology that as a child he created an imaginary country called Boxen and wrote stories about it. The stories "were an attempt to combine my two chief literary pleasures -- 'dressed animals' and 'knights in armour.' As a result, I wrote about chivalrous mice and rabbits who rode out in complete mail to kill not giants but cats."

Sent to England for his schooling, Lewis came under the influence of a tutor who was much enamored of <i>The Golden Bough</i>, a monumental new work about religion and mythology by Sir James Frazer. Frazer regarded religion as a human effort to make sense of the frightening and incomprehensible: thunder, pestilence, famine, death, and so on. In particular, Frazer found in human cultures a recurring story of a god whose death and resurrection saves his people. This god usually was associated with agriculture and fertility: Just as in the cycle of nature the plant is broken, the seed enters the ground, and life springs up, so was the god broken, buried, and restored.

Frazer was an atheist, and so, for many years, was Lewis. But Lewis never ceased to find the stories of dying and resurrected gods stirring. The thrill, he wrote, was akin to watching a diver "flashing for a moment in the air, and then down through the green, and warm, and sunlit water, into the pitch black, cold, freezing water, down into the mud and slime, then up again, his lungs almost bursting, back again to the green and warm and sunlit water, and then at last out into the sunshine, holding in his hand the dripping thing he went down to get."

Lewis's studies in English literature led to a faculty position at Oxford, where he quickly became close with the philologist and fantasy novelist J.R.R. Tolkien, who was a committed Christian. Whenever he encountered a story of a god dying to save his people in mythology, Lewis told Tolkien, he was "mysteriously moved, even though no one knows where [the mythological god] is supposed to have lived and died; he's not historical." Why, he wondered, was he not similarly moved by the Christian Gospel's avowedly historical accounts of Jesus' death and resurrection?

The answer, Tolkien told him, was to recognize that the Gospel story was mythic and should be appreciated as such -- "but with this tremendous difference: that it really happened." Lewis later wrote: "By becoming fact [the dying-god story] does not cease to be myth: that is the miracle." But "it is God's myth where the others are men's myths: i.e. the Pagan stories are God expressing Himself through the minds of poets, using such images as He found there, while Christianity is God expressing Himself through what we would call 'real things.'" The Christian dying-god story, Lewis came to believe, lay at the exact intersection of myth and history.

<i>The Chronicles of Narnia</i> was Lewis's attempt to bring children to that intersection in the hope that, with the passage of time, they would realize that Christianity stood there. The stale, stained-glass version of Jesus that churches typically presented was, Lewis believed, as much of a turnoff for other children as it had been for him. Instead of Bible stories, he'd give them adventure stories involving children and mythical creatures, including a powerful and tender lion named Aslan. <i>The Lion, the Witch, and the Wardrobe</i> featured a menagerie of familiar mythic characters, ranging from centaurs to Santa Claus. Indeed, it was this pastiche of mythologies that Tolkien most disliked about the <i>Chronicles</i> -- in his own <i>Lord of the Rings</i> trilogy, Tolkien was fastidious about creating a world with no stray elements.

The <i>Chronicles</i> fulfilled Lewis's intention of telling the entire Christian story -- from the Creation, to the Crucifixion and the Resurrection, to the end of time -- without ever mentioning Christianity. For example, the climax of <i>The Lion, the Witch, and the Wardrobe</i> comes when Aslan voluntarily dies in order to spare one of the English children from the full consequences of his behavior, only to rise from death in triumph over the diabolical White Witch. In <i>The Magician's Nephew</i>, Aslan sings the world into creation and then watches as evil enters it. <i>The Last Battle</i> brings the end of the world and the Last Judgment. Summarizing the books makes them sound more formulaic than they are. The chief pleasure of reading the <i>Chronicles</i> lies not in the Christian elements but rather in the stories and characters that make those elements seem -- in the course of things, and without bold allegorical labels -- appealing and exciting.

Lewis's influence is strongly evident in our present cultural moment. J.K. Rowling, for instance, based her famous "platform nine and three-quarters" -- the place at London's King's Cross Station where young wizards enter the world of the Hogwarts School in her <i>Harry Potter</i> series -- on the wardrobe through which English schoolchildren pass into the land of Narnia in <i>The Lion, the Witch, and the Wardrobe</i>. (Rowling has also apparently based the scope of the <i>Harry Potter</i> series -- a projected seven volumes -- on Lewis's books.) And though Tolkien, for his part, said that he didn't especially like the Narnia books, he also took inspiration from Lewis, his close friend and colleague on the Oxford University English faculty during the middle decades of the twentieth century. Throughout the 12 years that Tolkien spent writing the <i>Lord of the Rings</i> trilogy, with no confidence that the books were any good or would ever find an audience, Lewis was his faithful reader, critic, and cheerleader. "The unpayable debt that I owe to him," Tolkien wrote, was "sheer encouragement. He was for long my only audience. Only from him did I ever get the idea that my 'stuff' could be more than a private hobby."

Given all this, what accounts for Lewis's relative eclipse -- in the popular culture, anyway -- by Tolkien and Rowling? Does Narnia speak less directly to our time, or to the children of our time, than Middle Earth or Hogwarts? Or is it simply the case that Lewis's world adapts less readily to our Hollywoodized, secularized sensibility?

More likely it's the latter -- which makes it ironic that Rowling, like HarperCollins, has been pilloried recently by angry conservative Christians for writing playfully in the <i>Harry Potter</i> books about witchcraft and wizardry. Rowling doesn't understand the objections. Like the <i>Chronicles</i>, the <i>Harry Potter</i> books are infused with a Christian worldview: Both Lewis and Rowling celebrate courage, loyalty, friendship, compassion, forgiveness, persistence, and self-sacrifice with a compellingness that puts William Bennett's <i>Book of Virtues</i> to shame. Rowling is a member of the Church of Scotland and, whenever she's asked, says, "I believe in God, not magic." In fact, she initially was afraid that if people were aware of her Christian faith, she would give away too much of what's coming in the series. "If I talk too freely about that," she told a Canadian reporter, "I think the intelligent reader -- whether ten [years old] or sixty -- will be able to guess what is coming in the books." In truth, it's not much harder to find Gospel parallels in the <i>Harry Potter</i> stories than in the <i>Chronicles</i>. "Rejoice . . . ," says a wizard on the occasion of Harry's birth. "Even Muggles like yourself should be celebrating this happy, happy day!" Shooting stars streak across the heavens to mark the baby Harry's coming. "I wouldn't be surprised if today was known as Harry Potter Day in the future," says one of the teachers at Hogwarts when she hears the news. Substitute "Gentiles" for "Muggles," "star in the east" for "shooting stars," and "Christmas" for "Harry Potter Day" and you get the idea.

If any of this -- good versus evil, appealing young heroes who prevail by developing Christian virtues -- sounds like Tolkien's <i>Fellowship of the Ring</i> and its successor books, well, it should. "<i>The Lord of the Rings</i> is of course a fundamentally religious and Catholic work," Tolkien wrote to a Jesuit friend; "unconsciously so at first, but consciously in the revision. . . . For the religious element is absorbed into the story and the symbolism." Those who invent mythic worlds, Tolkien wrote in an essay called "On Fairy Stories," serve as "subcreators" who "make . . . because we are made: and not only made, but made in the image and likeness of a Maker."

Tolkien carefully avoided any hint of biblical allegory and injected no overtly religious elements into the <i>Ring</i> stories -- Middle Earth is devoid of temples, gods, and rituals. But what he did instead was even more deeply faithful. Tolkien created a world in which hope, the ultimate Christian virtue, is woven into the fundamental nature of reality -- in which, as Frodo and Sam approach the end of all things, it makes sense for them to renounce the power that would enslave and instead submit to the power that frees. In doing so, Tolkien, like Lewis and Rowling, offers his young readers "a pre-baptism of the child's imagination."

Where Have You Gone, Franklin Roosevelt?

<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><font class="nonprinting articlebody">&#13;<br />
&#13;</font></p>
<p>&#13;<br />
&#13;<br />
The November 1, 1948, issue of <i>Life</i> magazine is a collector's item because of a picture on page 37 that is captioned, "The next president travels by ferry over the broad waters of San Francisco bay." The picture is of Thomas E. Dewey. &#13;</p>
<p>&#13;<br />
&#13;<br />
Of greater significance is an article that begins on page 65 called "Historians Rate U.S. Presidents." The story was written by Professor Arthur M. Schlesinger, Sr., who had called on 55 of his fellow historians to grade each president (excluding the incumbent, Harry S. Truman) as either "great," "near great," "average,"&#13;<br />
"below average," or a "failure." When Schlesinger averaged each president's grades, Abraham Lincoln, George Washington, Franklin D. Roosevelt, Thomas Jefferson, Woodrow Wilson, and Andrew Jackson scored as great presidents, Ulysses S. Grant and Warren G. Harding were rated as failures, and the rest fell in between. &#13;</p>
Schlesinger followed his 1948 survey with another in 1962. The results were strikingly similar: the same pair of failures and nearly the same set of greats. Truman, now eligible for the ballot, ended up in ninth place as a near-great president in the company of John Adams and Theodore Roosevelt. Dwight D. Eisenhower, who had left office the previous year, earned a low C: He ranked 22nd, sandwiched between Chester A. Arthur and Andrew Johnson. Republicans howled that Schlesinger had packed the jury with Democrats; he replied that he had chosen the most eminent historians in the country. Both were right.

More important than the rankings themselves were the two related standards that historians used in assessing presidents: power and the desire to be powerful. "Washington aside," Schlesinger wrote, "none of [the great presidents] waited for the office to seek the man; they pursued it with all their might and main." Once in office, their greatness was established by the fact that "every one of [them] left the Executive branch stronger and more influential than he found it." When dealing with Congress, they knew "when to reason and to browbeat, to bargain and stand firm, ... and when all else failed, they appealed over the heads of the lawmakers to the people." Nor did the great presidents shy away from confrontations with the Supreme Court. They were, to be sure, inattentive to the administration of the bureaucracy (arguably a core responsibility for a chief executive), but Schlesinger sunnily explained that this freed them for the more important task of "moral leadership."

Historians still like to play the presidential greatness game. Indeed, one of the most recent rankings of presidents was commissioned in 1996 by Arthur Schlesinger, Jr., the eminent professor's equally eminent son. Eisenhower's stock has risen since the 1960s (he now regularly shows up among the top 10) but less because of any new appreciation of what long was thought to be his passive style of leadership than because of recent archival research that shows him to have been a deceptively strong "hidden-hand" leader. Ambition for power and success in wielding it remain the attributes that stir the admiration of most president-ranking historians. Explaining the high regard that Lincoln, Washington, FDR, and Company still enjoy among his academic colleagues, Schlesinger, Jr., noted that historians continue to view as great those presidents who "took risks ... provoked controversy ... [and] stood in Theodore Roosevelt's 'bully pulpit.'"

Now come two historically savvy political scientists, Marc Landy of Boston College and Sidney M. Milkis of the University of Virginia, to argue in <i>Presidential Greatness</i> that although historians may have gotten the roster of great presidents right, they have gotten the essence of presidential greatness wrong. The greatness of the presidents in Landy and Milkis's hall of fame -- Washington, Jefferson, Jackson, Lincoln, and FDR -- has less to do with power than with purpose. Great presidents are "conservative revolutionaries" who in uncertain times "teach the nation about the need for great change but also about how to reconcile change with American constitutional traditions and purposes." In addition, great presidents are both leaders and creatures of strong political parties, mobilizing their party to build a majority coalition yet restrained by its demand for fidelity to party principles and organization.

Washington established his greatness by working systematically to transfer his enormous personal authority to the new and still fragile Constitution. Jefferson, who like Washington would have occupied a prominent place in the American pantheon even if he had never been president, helped democratize the Constitution by creating the Democratic-Republican Party "and allowing it to flourish as a real party, not a vehicle for personal aggrandizement." Jackson furthered the cause of democratization within the Constitution by building the first mass-based political party. He also taught Jeffersonians how to combine their zeal for states' rights and limited government with a strong attachment to the Union.

Lincoln was in some ways the ultimate partisan. Virtually all of Lincoln's national stature at the time of his election owed to his being the nominee of the Republican Party. His use of political patronage during the Civil War put Jackson's spoils system to shame. But that did not stop Lincoln from preserving and renewing the Constitution by weaving into it the commitment to equality that is embedded in the Declaration of Independence. ("Four score and seven years ago" took his listeners back to 1776, not 1787.) Finally, FDR taught the American people -- and the Supreme Court -- that the Constitution allows the federal government, led by the president, to play an active and continuous role in domestic and international affairs.

The same historians who mistakenly equate greatness with ambition and power instead of with constitutional growth are guilty of a second fundamental error, Landy and Milkis imply. The profession's very preoccupation with presidential greatness suggests that greatness is still possible. But the political parties that are so essential to great leadership have waned in influence to the point that they can neither empower presidents for monumental accomplishment nor keep them faithful to broader purposes and interests. Landy and Milkis portray presidents today as being more visible than ever but also, because of the exposure that visibility brings, more vulnerable, awash in a sea of competing demands they cannot hope to satisfy from special-interest groups, the ever-critical media, and an impatient public.

Ironically, it was Roosevelt, the last of the great presidents, who, Landy and Milkis say, closed the door to greatness for his successors. He did so by assaulting the authority of political parties in order to be free of external restraints on presidential leadership. FDR's most enduring institutional legacy was the Executive Office of the President, which over the years has taken over the traditional party functions of linking the president to interest groups, staffing the administration, and developing policies. What Roosevelt did not foresee is that the same party weakness that unshackles presidents also leaves them bereft of reliable and strong organizational support when they need it.

Presidential debates, which place the candidates in the spotlight as solo acts rather than as featured players in a party ensemble, accentuate the problem Roosevelt created. When Al Gore and George W. Bush debate each other as individual candidates, and when their commercials are all about them and their opponent instead of the party ticket, the message to the voters is that political parties are institutions of little consequence.

In the absence of greatness, smaller virtues may deserve more of our attention. Princeton University political scientist Fred I. Greenstein focuses his book <i>The Presidential Difference</i> on "emotional intelligence" -- a president's ability to "manage his emotions and turn them to constructive purposes." (The opposite of emotional intelligence is "emotional obtuseness," which Greenstein connects to leaders who end up "not being masters of their own passions.") Greenstein also looks at five other leadership traits in assessing each of the presidents from Roosevelt to Bill Clinton. The five, none of which he thinks is as important as emotional intelligence, lie more in the realm of skill than of character: an aptitude for public communications, adroitness at motivating and organizing advisers (think Martin Sheen in <i>The West Wing</i>), surefootedness in bargaining with fellow politicians, a consistent vision of public policy, and an effective cognitive style of processing information.

Greenstein acknowledges that Roosevelt provides "endless positive lessons" about how to be an effective president, but, wittingly or not, he evens the score with Schlesinger's liberal historians by being otherwise kind to Republicans. He judges three of FDR's five Democratic successors to be deficient in emotional intelligence (Lyndon B. Johnson, Jimmy Carter, and Clinton), but faults only one of the five modern Republicans (Richard Nixon). Granted, Nixon's "deep-seated anger and feelings of persecution" take the cake for emotional obtuseness. But LBJ's insecurity-born "mood swings of near-clinical proportions" and his brutal bullying of subordinates also "impeded the conduct of [his] responsibilities." Carter stumbled because he was "fixed in his ideas and unwilling to brook disagreement." As for Clinton, his "psychic shortcomings were debilitating." Better a "patently emotionally stable" Gerald Ford, Greenstein seems to be saying, or a "generous, polite, and forbearing" George H.W. Bush.

Greenstein is no less kind to the Republican presidents when it comes to the five leadership skills. If you tally his assessment of the presidents' performance of each skill as +1 for success, -1 for failure, and 0 for somewhere in between, the Republicans come out pretty well. Nixon's overwhelming emotional deficiencies aside, he scores +5 (out of 5) for skill. Eisenhower, whose hidden-hand leadership style Greenstein was the first to uncover and celebrate in a 1982 book, and Gerald Ford both score +3. Ronald Reagan scores +2, and George Bush scores -1.

Democratic presidents fail the leadership skills test as badly as they fail the emotional test. John F. Kennedy barely passes with a +1. All the others score a good bit lower. Truman and Johnson are -1, Clinton is -2, and Carter comes in at -5.

Although Greenstein claims to have "avoided presidential rankings," part of the fun of his book is the same fun political and historical junkies have had over the years with the surveys by Schlesinger <i>père et fils</i> -- that is, unmasking their biases and disputing their judgments. Surely the historians were right to place Kennedy, Johnson, and Truman above Nixon and Ford, in contrast to Greenstein's arrangement. But they were wrong to rank Reagan with the average presidents (as they did in Schlesinger, Jr.'s 1996 survey), just as Greenstein is right to rate him more highly.

More serious than presidential rankings, however, is the question raised in <i>Presidential Greatness</i> (which, with its well-supported arguments, is the more substantive of the two books): Is the decline of presidential greatness such a bad thing? Landy and Milkis seem to think it is. They especially lament the loss of the teaching function that great presidents perform, leading the people to accept new policies by demonstrating their compatibility with enduring constitutional principles.

The negative on this question was perhaps best defended in a famous 1977 <i>Commentary</i> article by political scientist Nelson W. Polsby. His article -- appropriately called "Against Presidential Greatness" -- argued that the quest for greatness led presidents to act in ways that disserved themselves and the nation. "For fear of being found out and downgraded," he wrote, "there is the temptation to hoard credit rather than share it ... [and] to export responsibility away from the White House for the honest shortfalls of programs, thus transmitting to the government at large an expectation that loyalty upward will be rewarded with disloyalty down." The most dangerous temptation is "to offer false hopes and to claim spurious accomplishments to the public at large."

The worst of both worlds surely comes when a sitting president becomes obsessed with how historians will rate him. Carter, after studying the political scientist James David Barber's celebration of FDR, Truman, and other "active-positive" presidents in his 1972 book <i>The Presidential Character</i>, plaintively told a reporter that active-positive is "what I would like to be. That's what I hope I prove to be." In 1996 Clinton privately grouped his presidential predecessors into three tiers, then spent a long Sunday morning with consultant Dick Morris discussing what he could do to join the top group.

Nixon, as always, wins the prize for excess. He audiotaped much of what went on in the White House because he thought that if historians had an accurate record of what he said and did there, they would judge his presidency favorably. It is unlikely that Nixon will turn out to have been right about historians in the long term -- they are still pretty liberal, and he is still looking pretty bad. What is certain is that he could not have been more wrong about how Congress, the Supreme Court, the media, and the public would judge him in the short term. Lesson for Gore or Bush: If elected, just do the job.

The Curse of the Vice Presidency

Until the election of George Bush the elder in 1988, no incumbent vice president had been elected president since Martin Van Buren in 1836. (Bush opened his first post-election news conference by saying, "It's been a long time, Marty.") Yet it also is true that, starting with Harry S. Truman in 1945, five of the last 10 presidents have been former vice presidents: Truman, Lyndon B. Johnson, Richard Nixon, Gerald Ford, and Bush. Death or resignation accounts for the ascensions of Truman, Johnson, and Ford, but each of them except Ford subsequently won at least one presidential election on his own.

Does being vice president make Al Gore a stronger contender for president or a weaker one? Until Gore agreed to be Bill Clinton's running mate in 1992, he was pursuing a different route to an eventual run at the White House. After youthful dalliances with journalism and the ministry, Gore had ascended rapidly, winning his father's old House seat in central Tennessee in 1976, then moving up to the Senate in 1984 and winning re-election by a landslide in 1990, when he carried every county in the state. In 1988 he'd made a presentable if premature run at the Democratic nomination. Gore was the youngest serious contender for a major-party nomination in this century, finishing third in a field of eight.

The nature of Gore's springboard changed dramatically in May 1992. Clinton placed Gore on his list of 40 potential running mates, had him checked out by Democratic Party eminence Warren Christopher, then kept Gore on the list when he pared it down to five. On June 30, Clinton and Gore had the sort of two-souls-become-one meeting (incredibly, they had scarcely known each other before then) that is scheduled for one hour and lasts for three. The call to Gore's Carthage, Tennessee, home came shortly before midnight on July 8.

The roots of the vice presidency's uncertain political status are embedded deeply in the Constitution and in two centuries of history. The Constitutional Convention of 1787 created the vice presidency as a weak office, but also a prestigious one. The Constitution empowered the vice president only to be "president of the Senate, but shall have no Vote, unless they be equally divided." It was the election system that brought the prestige. Every four years, presidential electors were charged to cast two votes for president: The first-place finisher in the electoral college won the office, and the person who finished second became vice president. In awarding the vice presidency to the runner-up in the presidential election, the Constitution thus made the vice president the presumptive heir to the presidency. Not surprisingly, the nation's first two vice presidents, John Adams and Thomas Jefferson, were elected to be its second and third presidents.

The arrival of political parties nominating not just a candidate for president, but a vice presidential candidate as well, rendered this system unworkable. The breakdown came in 1800 when, as a result of all of the Democratic-Republican electors faithfully discharging their duty to vote for both Jefferson and his vice presidential running mate, Aaron Burr, a tie vote for president occurred between the two nominees, and it took the House of Representatives weeks to resolve in Jefferson's favor.

The 12th Amendment, which was passed in time for the 1804 election, solved this problem neatly by instructing electors to cast one vote for president and a separate vote for vice president. But the amendment had a disastrous unintended side effect on the vice presidency: It left the office weak and, by stripping the vice president of his claim to be the second-most qualified person in the country to be president, took away its prestige as well. From 1804 on, talented and ambitious politicians shied away from vice presidential nominations. "I do not propose to be buried until I am dead," sniffed Daniel Webster when he was offered the Whig Party nomination in 1848. Ancient has-beens (six vice presidents died in office, all of natural causes, between 1812 and 1899) and middle-aged never-wases (George M. Dallas? Daniel D. Tompkins?) took their place.

<p><font color="darkred"><b>Resurrecting a Dead Office</b></font></p>
<p></p><p>Although the vice presidency is still constitutionally weak, the contrast between the political prestige of the nineteenthcentury version of the office and the twentieth-century version is stark. Except for Van Buren, no nineteenth-century vice president was even renominated by his party's convention for a second term as vice president, much less nominated to run for president. Starting with William Howard Taft's vice president, James S. Sherman, however, every twentieth-century vice president who sought a second term has been renominated, and nine of them (nearly half) have gone on to receive a presidential nomination. Four nineteenth-century vice presidents succeeded to the presidency when the elected president died, but none of them was nominated to run for a full presidential term. The best of the four--Chester A. Arthur--was mediocre. The other three--John Tyler, Millard Fillmore, and Andrew Johnson--ran the gamut from bad to awful. In the twentieth century, not only were all five successor presidents--Theodore Roosevelt, Calvin Coolidge, Truman, Johnson, and Ford--renominated for president by their party, but all except Ford (who came very close) were elected. As a group, historians actually rank them higher than the century's elected presidents.</p>
The record of vice presidential prestige has been even more compelling since the end of World War II. Starting in 1948, the vice presidential candidate as often as not has been the more experienced member of the ticket in high government office, including recent nominees such as Walter F. Mondale in 1976, Bush in 1980, Lloyd Bentsen in 1988, and Gore in 1992. Vice presidents have become the presumptive front-runners for their party's presidential nomination. Starting with Nixon in 1960, every elected vice president except Dan Quayle has led in a majority of the Gallup polls that measure the party rank and file's pre-convention preferences for president. Again excepting Quayle, all eight of the postwar vice presidents who have sought their party's presidential nomination have won it.

The roles and resources of the vice presidency also have grown in recent years. The office is larger and more prominent than in the past -- in the terminology of political science, it has been "institutionalized." As recently as the mid-1970s, vice presidents hung their hats in the Capitol and the Old Executive Office Building, arranged their own housing, and were forced to crib speechwriters from the White House. Today they enjoy a large and professional staff, a West Wing office, a separate line item in the executive budget, and a grand official residence -- the Admiral's House at the Naval Observatory. The office also has been institutionalized in the broader sense that more -- and more substantial -- vice presidential activities are now taken for granted. These include regular private meetings with the president, a wide-ranging role as senior presidential adviser, membership on the National Security Council, full intelligence briefings, access to the Oval Office paper flow, public advocacy of the administration's programs and leadership, a leadership role in the party second only to the president's, sensitive diplomatic missions, attendance at cabinet meetings, and serving as a presidential liaison to congressional leaders and interest groups.

The reasons for the enhanced status of the vice presidency in government and politics are several. At the turn of the twentieth century, the rise of national news media (mass-circulation magazines and newspaper wire services) and a new style of active political campaigning elevated the visibility and prestige of the vice president, which made the office more appealing to a better class of political leaders. In the 1900 election, the Republican nominee, Theodore Roosevelt, won widespread publicity and accumulated political IOUs from local politicians in nearly every state by becoming the first vice presidential candidate in history to campaign vigorously across the country. During the 1920s and 1930s, the roster of vice presidents included a speaker of the House, a Senate majority leader, and a Nobel Prize-winning cabinet member.

In 1940 Franklin D. Roosevelt, who had run (and lost) for vice president himself in 1920, successfully claimed for presidential candidates the right to name their running mates. In the past, party leaders had made that decision. They typically used it to pair the nominee for president with a vice presidential candidate from the opposite wing of the party, thereby discouraging the president from ever trusting the vice president personally or entrusting him with useful responsibilities in office. Voters want vice presidents to be loyal to the president as much as presidents do, and Roosevelt's reform allows the president to choose a running mate virtually assured that such loyalty will be forthcoming.

Finally, after 1945, the combination of Truman's woefully unprepared succession to the presidency when Roosevelt died (Truman was at best dimly aware of the existence of the atom bomb and the Allies' plans for the postwar world) and the proliferation of nuclear weapons heightened public concern that the vice president be a leader who is ready and able to step into the presidency at a moment's notice.

<p><font color="darkred"><b>A Vice Presidential Constitution</b></font></p>
<p></p><p>As voters increasingly have come to judge vice presidential nominees by their fitness to succeed to the presidency, most candidates for president have learned that, in filling the second slot on the ticket, they can do well politically by doing good for the country. As Hamilton Jordan put it in a 1976 memo to his candidate, Jimmy Carter, "The best politics is to select a person who is accurately perceived by the American people as being qualified and able to serve as president if that should become necessary."</p>
<p></p><p>The Constitution has been altered during the last halfcentury in ways that have redounded to the benefit of vice presidents. The 25th Amendment, which was enacted in 1967, focused almost entirely on the vice presidency. The amendment declared, at last, that when the president dies, resigns, or is removed from office, "the Vice President shall become President" for the remainder of the four-year term. Vice presidents--nine in all (how's that for a stepping-stone to the presidency?)--had been doing exactly that since John Tyler, upon William Henry Harrison's untimely death (after one month in office) in 1841, declared himself president rather than acting president, ignoring the considerable congressional grumbling that ensued. At the time, this move had almost the character of a coup, since many thought the vice president had the right to serve only as interim chief executive until a special election could be called.</p>
<p>Indeed, until the 25th Amendment was enacted, the language of the Constitution remained vague enough to admit just that interpretation. James Madison's extensive notes of the debates at the Constitutional Convention indicate that a special presidential election was the framers' true intention. The key phrase that ended up in Article II of the original Constitution said that if the president dies, resigns, is removed by impeachment, or is unable "to discharge the Powers and Duties of the said Office, the Same shall devolve on the Vice President." The Same what? The president's "Powers and Duties" or "the said Office"--that is, the presidency itself? The framers meant only the powers and duties and only in a custodial capacity, but through careless drafting they did not say so clearly in the final text. Because Madison had embargoed his papers, his notes of the convention were not yet in circulation when Harrison died, and all the delegates were dead. Tyler's stubbornness constituted a successful fait accompli that set the precedent for all of his successors to follow. But it took the 25th Amendment to settle the succession question once and for all.</p>
<p>The amendment did more than tidy up a constitutional infelicity. It also made the vice president the crucial actor in determining whether a president is disabled: Unless the vice president agrees that the president is physically or mentally unable to serve, nothing can be done. Finally, the amendment provided that whenever the vice presidency becomes vacant (by 1967, this had happened 16 times during the nation's first 36 presidencies), the president will nominate a new vice president, subject to confirmation by both houses of Congress. So prestigious had the vice presidency become that in 1976, Americans barely noticed that their national bicentennial celebration was presided over by two men, President Ford and Vice President Nelson A. Rockefeller, who had attained their offices not through election but by being appointed vice president.</p>
<p>Equally significant in constitutional terms was the 22nd Amendment, which imposed a two-term limit on the president in 1951. Just as nobody had meant to damage the vice presidency politically with the enactment of the 12th Amendment in 1804, nobody was trying to enhance the vice president's political status when the 22nd Amendment limited presidential tenure. But the two-term limit made it possible for the vice president to step forward as a presidential candidate early in the president's second term, rather than wait in the wings until the president decided what he wanted to do. All three vice presidents who have served second-term presidents since the 22nd Amendment was enacted have made good use of this opportunity: Nixon in 1960, Bush in 1988, and now Gore.</p>
<p>In all, Gore inherited an impressive office when he became vice president in 1993. He has contributed to the power and prestige of the office as well: heading the administration's reinventing government initiative, serving as an important diplomatic channel to Russia and other former Soviet republics, filling the bureaucracy with political allies, deflating strong opposition to the North American Free Trade Agreement when he shredded Ross Perot in a televised debate, developing the Telecommunications Act of 1996 and persuading Congress to pass it, and stiffening the president's spine at crucial moments. "You can get with the goddamn program!" Gore famously told Clinton when the president was vacillating on his 1993 economic plan--and Clinton did. The conventional wisdom about the Gore vice presidency is absolutely true. No vice president in history has been more influential.</p>
<p><font size="+2" color="darkred"><b>S</b></font>till, the question remains: Is being vice president a blessing or a curse for a talented political leader like Gore who is trying to win the presidency? The answer comes in two parts, with the easy part first. Service as vice president is clearly the most direct route to winning a party's presidential nomination. There is a downside to the vice presidency, of course, especially the certain prospect of being a steady source of merriment for late-night television comedians. But consider what vice presidents seeking to be nominated for president have going for them.</p>
<p>In addition to the opportunity for early fundraising and organization-building that the 22nd Amendment affords and the likelihood that the vice president is already a leader of some stature, vice presidents derive two other benefits from the office in their pursuit of a presidential nomination. The first is that their ongoing activities as party leader--campaigning across the country during elections, raising funds at other times--and as public advocate of the administration and its policies uniquely situate them to win friends among the political activists who typically dominate the nominating process. (Such campaigning also is good experience for a presidential candidacy.) Second, the recent growth in the governmental responsibilities and resources of the vice presidency has made it a more prestigious position and thus a more plausible stepping-stone to the presidency. Substantive matters like international diplomacy and symbolic ones like the trappings of the office--not just the mansion and Air Force Two, but even the new vice presidential seal that displays an eagle, wings spread, with a claw full of arrows and a starburst at its head (the eagle in the old seal seemed rather sedentary)--attest to the prestige of the office.</p>
<p>Altogether, the modern vice president typically is an experienced and talented political leader who is loyal to the president and admired by the party--an ideal formula for securing a presidential nomination and one that Gore executed skillfully this spring. Exit surveys during the Democratic primaries and caucuses showed Gore winning overwhelming support from voters who approved of Clinton's performance as president. Needless to say, such voters made up the vast majority of those who turned out at the polls. Gore's worst moment in the nomination campaign was, in a sense, the exception that demonstrated the rule. The vice president's zeal as a fundraiser for Clinton and the Democratic National Committee in 1995 and 1996 ("Is it possible to do a reallocation for me to take more of the events and the calls?" he asked in a memo) gave former Senator Bill Bradley an opening among independent voters last fall. But it also strengthened Gore's bond with Democratic activists, which turned out to be much more important.</p>
<p><font color="darkred"><b>Loyal to a Fault </b></font></p>
<p>Winning the party's nomination for president is no small thing, but it is not the main thing. For all their advantages in getting nominated, vice presidents have had an unusually hard time closing the deal in November. To be sure, the so-called Van Buren syndrome can be overstated: Of the 34 vice presidents who served between Van Buren and Bush, only seven even tried to run for president, and two of them--Nixon in 1960 and Humphrey in 1968--came very close to winning. But vice presidents carry burdens into the fall campaign that are as firmly grounded in their office as the advantages they bring to a nominating contest.</p>
<p><font size="+2" color="darkred"><b>I</b></font>ndeed, some of the activities of the modern vice presidency that are most appealing to party activists may repel other voters. Days and nights spent fertilizing the party's grass roots with fervent, sometimes slashing rhetoric can alienate those who look to the presidency for leadership that unifies rather than divides. Gore's blurt to a post-impeachment rally of Democratic congressmen that Clinton "will be regarded in the history books as one of our greatest presidents" doubtless warmed the cockles of yellow-dog Democratic hearts, but it seemed wildly excessive to almost everyone else. The woodenness that many people attribute to Gore is partly an artifact of the hundreds of vice presidential moments he has spent standing motionless and silent in the background while Clinton has spoken animatedly to the cameras.</p>
<p>Certain institutional qualities of the modern vice presidency also handicap the vice president turned presidential candidate. Vice presidents seldom get to take credit for the successes of the administration: That is a presidential prerogative. But they can count on being attacked for all of the administration's shortcomings. Such attacks allow no effective response. A vice president who tries to stand apart from the White House will alienate the president and cause voters to wonder why the criticisms were not voiced earlier. Gore did himself no good, for example, when he spent the evening of his official announcement for president telling the <i>20/20</i> audience that Clinton's behavior in the Monica Lewinsky affair was "inexcusable" or when he later dissented from administration policy on Elián Gonzalez. A vice president's difficulties are only compounded when it comes to matters of substantive public policy. Let Gore offer a new proposal, and Bush demands to know why he has hidden it under his hat until now. </p>
<p>Vice presidents can always say that loyalty to the president forecloses public disagreement, but that course is no less perilous politically. The public that values loyalty in a vice president disdains that quality as soon as he bids to become president. Strength, vision, and independence are what people look for then--the very qualities that vice presidents almost never get to display. Polls that show Gore trailing Bush by around 20 percentage points in the category of leadership are less about Bush and Gore than about the vice presidency. Bush's father trailed Michael S. Dukakis by a similar margin in the summer of 1988.</p>
<p>The political handicaps that vice presidents carry into the general election are considerable, but they need not be insurmountable. As with all things vice presidential, much depends on the presidents they serve.</p>
<p>One of the main reasons that Nixon and Humphrey lost, for example, is that their presidents were so unhelpful. Every Poli Sci 100 student knows what Dwight D. Eisenhower said when a reporter asked him to name a single "major idea of [Nixon's] you had adopted" as president: "If you give me a week, I might think of one." (Less well-known is that a week later, Eisenhower still had nothing to say.) Johnson treated Humphrey with all the spitefulness of which he was capable as soon as it became clear that the Democratic convention was not going to draft Johnson for another term despite his earlier withdrawal from the race. In true vice presidential style, Humphrey carried Johnson's water on Vietnam for four years, only to have the president threaten repeatedly that if he broke even slightly with the administration line, there would be political hell to pay. When Humphrey, ignoring yet another Johnson warning, finally did speak out in favor of a bombing pause just five weeks before the election, his poll numbers began a steep ascent. As Humphrey later said, he didn't lose the election to Nixon; he just ran out of time.</p>
<p>In contrast, Van Buren benefited enormously from his association with President Andrew Jackson, who regarded his vice president's election to the presidency as validation of the transformation he had wrought in American politics. Ronald Reagan was equally committed to Bush's success, putting ego aside to praise (even inflate) the vice president's contributions to what the president began calling the "Reagan-Bush administration." Reagan's popularity was of even greater benefit to his vice president. Bush won the votes of 80 percent of those who approved of Reagan's performance as president; he lost nine-to-one among those who disapproved. Eighty percent of many is more than 90 percent of few: Bush was elected.</p>
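<p>(To make the arithmetic concrete with illustrative round numbers, not the actual exit-poll shares: if 60 percent of voters approved of Reagan and 40 percent disapproved, then 80 percent of the first group plus 10 percent of the second works out to 0.80 × 60 + 0.10 × 40 = 52 percent of the vote--a winning share, close to the 53 percent Bush actually received.)</p>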
<p>Clinton combines Jackson's belief that his legacy is closely tied to his vice president's political success with Reaganesque approval ratings. If there is such a thing as "Clinton fatigue," it must be the exhaustion felt by those who have always hated him but have never been able to persuade the rest of the country that they are right. Clinton's job approval rating has been in the 60 percent-plus range for nearly four years--the highest and most enduring numbers for a second-term president in the history of polling. He has made it clear that all of his vast political talents are at Gore's disposal from now until November--including his ability, not often seen, to shine the spotlight on someone other than himself. Much to Clinton's credit, he remained steadfast last fall when Gore, in full panic mode, sometimes went out of his way to distance himself from the president.</p>
<p>As much as they will help, though, Clinton's efforts and popularity will not be enough to elect Gore. At the end of the day, candidates for president win or lose their own elections. "You're number two," says Gore, "and whether it's in politics or business or the professions, you have to make a transition from being number two to number one." But the president's assistance, joined with full use of the advantages the vice president derives from his own office, suggests that the vice presidency, which Gore chose over staying in the Senate eight years ago, was his best available avenue to the White House.</p>
Michael Nelson

College for Dunces
http://prospect.org/article/college-dunces
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><font color="darkred" size="+2">T</font>he electoral college is a constitutional time bomb that has been ticking for more than a century. It finally exploded on election day. Unkind as it is to say so--hasn't Al Gore suffered enough?--it's only fitting that it blew up in the Democrats' face.&#13;</p>
<p>The explosion, of course, was Gore's apparent loss to George W. Bush in the electoral college even though he won the national popular vote by a margin of around 100,000--the same plurality by which John F. Kennedy surpassed Richard Nixon in 1960. Such an overturning of the voters' will has not occurred since 1888, when Republican challenger Benjamin Harrison unseated President Grover Cleveland despite Cleveland's having been favored by substantially more voters. Harrison's victory tainted his presidency and set the stage for a rematch four years later, which Cleveland handily won.</p>
<p>The reason this year's outcome is fitting is that the strategy for winning presidential elections that the Democrats have been using with disturbing frequency in recent years backfired on them. Of the past six Democratic presidential nominees, Gore is the third to lose the presidency in exactly the way he attempted to win it--namely, by conceding the popular vote in dozens of states to the Republicans and trying to squeak through to victory by winning enough big states to carry the electoral college.</p>
<p>George McGovern was the first modern Democrat to try this strategy. He is also the easiest to forgive. Trailing President Nixon in 1972 by double-digit margins in the polls, McGovern spent nearly all of his time and money during the final month of the campaign trying to win the dozen largest states, which together have more than enough electoral votes to choose the president. He failed dismally. In fact, when political scientists point to the electoral college's tendency to "magnify" the popular vote--that is, to give the winning presidential candidate a larger percentage of the electoral vote than of the national popular vote--the 1972 election is one of their prime examples. Nixon won 61 percent of the popular vote but 97 percent of the electoral vote.</p>
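<p>(The magnification is easy to check: Nixon carried 49 states for 520 of the 538 electoral votes, and 520/538 comes to roughly 97 percent--36 points above his share of the popular vote.)</p>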
<p>Michael Dukakis was next. As his 1988 presidential campaign foundered, Dukakis ended up taking a page out of McGovern's playbook. In mid-October, abandoning all hope of winning a broad popular majority against Vice President George Bush, Dukakis adopted an 18-state strategy to amass a narrow electoral college victory. Scarcely a moment of the Massachusetts governor's time and, more important, scarcely a dollar of his campaign fund went into the other 32 states from that point on.</p>
<p>Dukakis's strategy failed almost as badly as McGovern's, although not before giving the Bush campaign a major scare. But what would have happened if the strategy had succeeded? Would a president who had won by gaming the system have been able to govern? (Harrison wasn't; nor was John Quincy Adams, who was elected against Andrew Jackson with a minority of votes in 1824.) To be sure, candidates don't make the rules; they only play by them. But Americans have long expected that the right way to win a presidential election is to seek a majority of the people's votes and count on the electoral vote to follow.</p>
<p>Gore's prospects of winning the 2000 election in the accepted way--that is, by seeking votes everywhere (well, maybe not Idaho and Wyoming) and racking up both popular and electoral vote majorities--seemed much better than McGovern's or Dukakis's had been, especially from the vantage point of late summer and early fall. His acceptance speech at the Democratic convention had been a smash, and the debates loomed as the arena in which he would polish off his inexperienced and not terribly quick-witted opponent for good.</p>
<p>Things did not work out that way. Gore, ahead in the polls by around 5 percentage points before the three presidential debates, stumbled out of them 5 points behind. When this gap remained stubbornly unbridgeable, Gore strategists quietly embraced the McGovern-Dukakis strategy. By election eve, most of them were explaining, off the record, how they planned to pull out an electoral-vote majority by winning almost all of the big states narrowly while Bush was carrying the small states overwhelmingly. They brushed off questions about how such a victory could avoid being tarnished.</p>
<p>What no one expected, of course, is what happened. Gore won the national popular vote. Yet the likelihood is that on January 20--two weeks after the joint session of Congress at which Gore, as president of the Senate, will open the electoral votes submitted by the states and announce his own defeat--it is Bush who will take the oath of office as president.</p>
<p>The ironic--dare one say just?--roots of Gore's plight extend further back than his and McGovern's and Dukakis's defeats. In 1969, in response to a bipartisan initiative by Democratic Senator Birch Bayh of Indiana and the Nixon administration, Congress seriously considered a proposed constitutional amendment to replace the electoral college with a system of direct election by the voters. The sponsors' main concern, which flowed directly from the 1968 election, was that a George Wallace-style third-party extremist with a strong enough regional base to win several states' electoral votes could throw a presidential election into the House of Representatives or, perhaps worse, throw his electors to the Republican or Democratic nominee in return for concessions on policies and appointments.</p>
<p>As had been the case for years, public opinion polls at the time showed wide but shallow support for direct election. Then as now, around three-fourths of voters found the electoral college to be densely complicated and did not understand why they couldn't elect presidents with their votes the same way they elect congressmen and governors. But--sorry, civics teachers--electoral college reform has never been the sort of home-and-hearth issue that motivates many people to pick up the phone or take pen in hand. Members of Congress have thus been free to vote their own preferences without fear of reprisal at the polls.</p>
<p>In 1969 small-state legislators opposed direct election in the understandable but mistaken belief that every state's guaranteed three electoral votes (around half of 1 percent of the 538 total electoral votes) made small states more important in presidential elections than would their share of the popular vote. But candidates are not going to care much about Alaska, Delaware, and South Dakota no matter how presidents are elected.</p>
<p>A few conservatives joined the small-staters in opposition to Bayh's direct-election proposal, mostly on the grounds that state-by-state voting in the electoral college is a bulwark of states' rights and federalism. This argument is less mistaken than misguided. The federal principle is already deeply embodied in congressional elections, in which each state gets two senators just because it is a state. But what does federalism have to do with the presidency, the one part of the government that is designed to represent the nation as a whole rather than as an amalgam of states?</p>
<p>The margin of defeat for electoral college reform, however--both in 1969 and 10 years later, when Congress revisited the issue at the behest of President Jimmy Carter--was provided by big-state Democratic liberals. They argued (although seldom in public) that they liked the electoral college not only because their states' large blocs of electoral votes attract almost all of the presidential candidates' time and attention, but also because the system magnifies the influence of liberal constituencies that tend to be concentrated in the major cities: African Americans, Latinos, Jews, and unionized workers. Conservative supporters of the electoral college at least had a principled rationale for their position, however errant it may have been. Democratic liberals had little basis for opposing reform but political self-interest. When Bayh's and later Carter's proposals for direct election were killed by Congress, Democratic fingerprints were all over the murder weapon.</p>
<p>The case for allowing voters to elect the president in the same way they elect virtually every other official at every level of government is strong verging on self-evident. Indeed, liberal progressives won the theoretical argument years ago, when they successfully advanced the cause of direct election of senators (originally, state legislatures chose them) and of primaries to nominate candidates. To be sure, some problems with direct election of the president would need to be worked out. Votes would need to be tabulated through a national network, for example, so that recounts in close elections could be made expeditiously. To ensure that a president is not elected by a small minority of voters, a prompt run-off election between the top two vote getters would be needed when no candidate receives, say, 40 percent of the national popular vote.</p>
<p>Technical problems like these can be solved if the political will is there. The more difficult task will be getting Democratic candidates for president to stop gaming the system and to embrace the challenge of winning voter support. Even harder will be persuading Democratic liberals in Congress to put aside partisan interests and propose a direct-election constitutional amendment to the states for ratification.</p>
Michael Nelson

Flunking the Electoral College?
http://prospect.org/article/flunking-electoral-college
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p></p><center><font face="verdana,helvetica,arial" color="#004987" size="3"><b><a name="top" id="top">Should the United States abolish or alter the Electoral<br />
College system? If so, what should replace it?</a></b></font><br /><font color="darkred"><b>12.18.00</b></font><br /></center><br /><div align="center">
<p>
<a class="first" href="#michael">Michael Nelson</a> | <a class="first" href="#james">Representative James Clyburn</a> | <a class="first" href="#walter">Walter Berns</a> | <a class="first" href="#william">Representative William Delahunt </a> | <a class="first" href="#jamesr">James R. Whitson</a><br /></p>
</div>
<hr size="1" width="80%" /><p><font color="darkred" face="verdana, arial, helvetica"><b><a name="michael" id="michael">Michael Nelson: End It</a></b></font></p>
<p>
The Electoral College was no one's first choice at the Constitutional Convention of 1787. Various delegates had all sorts of ideas about who should elect the president, including Congress, the people, the governors of the states, even a randomly selected group of legislators. Unable to agree on any of these, weary after three months of rehearsing the same old arguments, and racing toward adjournment, the convention appointed a committee to deal with presidential selection and other "postponed matters." From this committee sprang the Electoral College, a mechanism so odd and complicated that it skirted the convention's established lines of division. The delegates accepted the Electoral College mostly as a stopgap measure, to be replaced with something better once the new government was up and running. They knew that George Washington would be the first president no matter how that president was chosen.</p>
<p>
It is also safe to say that the Electoral College is not the process for picking presidents that any constitutional convention would arrive at today if we were starting from scratch. Two centuries ago the idea of involving the voters only indirectly in elections was a familiar one. Most governors were chosen by their state legislatures, and no one thought it odd that U.S. senators would be elected the same way under the new constitution. Today, indirect election is strange bordering on perverse. Governors are chosen directly by the voters and, as a result of the Seventeenth Amendment, so are senators.</p>
<p>
So why has the Electoral College endured? Not because of any strong arguments in its favor. Some traditionalists cite the federal principle as the best reason for keeping the Electoral College. The states, they argue, not just the people, should be involved in choosing the president. But surely the federal principle applies more appropriately to Congress, which represents us in our variety, than to the presidency, which represents us in our unity. Another standard argument for the Electoral College is that it localizes the problems that arise from close elections: Imagine the whole country as one big Florida in 2000. But the whole country wouldn't be Florida if presidents were chosen by the people in a national election. Direct election would surely be accompanied not by the existing hodgepodge of chads in this state and touch-screens in that one, but by a uniform system of voting that reflects the best technology.</p>
<p>
The real basis for the Electoral College's endurance is less one of principled arguments than of special interest pleading, compounded by the inherent difficulty of amending the Constitution when the special interests themselves have to do the amending. The special interests include both the smallest states, which get a larger (but still tiny) voice in presidential elections than their populations would warrant, and the largest states, which under the electoral college occupy the center of the candidates' attention even more than they would if their votes weren't awarded in twenty- to fifty-four-vote clumps. Together, these states constitute many more than the 13 -- 1/4 plus one -- states that it takes to prevent a constitutional amendment from being ratified.</p>
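<p>(The arithmetic behind the 13: Article V requires ratification by three-fourths of the states, or 38 of the 50; if 13 states withhold approval, only 37 remain, and the amendment dies.)</p>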
<p><font color="darkred">Michael Nelson is professor of political science at Rhodes College and editor and coauthor of <i>The Elections of 2000</i>, which will be published in March by Congressional Quarterly Press.</font><br /><br /><a href="#top">[Top]</a></p>
<hr size="1" width="80%" /><p><font color="darkred" face="verdana, arial, helvetica"><b><a name="james" id="james">Representative James Clyburn: Mend It</a></b></font></p>
<p>In light of the presidential election debacle, I believe it is time for our country to reconsider how the highest office in the land is won. Many political observers are calling for the end of the Electoral College. I think that is a bad idea. However, I do believe the current system must be redesigned to reflect the modern complexity of our nation.</p>
<p>
For a long time, I have advocated changing our entire election method. My position has always been that winner-take-all elections trample on the variety of voices in our diverse country. Winner-take-all elections by their very nature mean that the highest vote getter wins, even if the margin of victory is only one vote. </p>
<p>
The Founding Fathers were concerned enough about this scenario to reject basing a presidential election solely on the popular vote. Instead they decided to implement an Electoral College method that uses each state's number of U.S. House and Senate members to determine the number of ballots each state casts in the Electoral College. However, winner-take-all is still at play here. Whoever wins the popular vote in a state gets all of its Electoral College votes, whether the margin of victory is one vote or one million votes. That is the crux of the problem.</p>
<p>
There are two states that have made an exception to this rule -- Maine and Nebraska. The legislatures in these states have determined that electors will be apportioned based on who wins each congressional district in the state. To me, this is a logical solution. </p>
<p>
Let me explain how this scenario would work in South Carolina. We have eight electoral votes because we have six House seats in addition to our two Senate seats. Since the Senate seats are elected statewide, those two electoral votes should be cast for the presidential candidate who won the popular vote in South Carolina. Using this year's presidential campaign as an example, George W. Bush would receive those two votes. Then presidential votes would have to be examined by congressional districts. I am certain, even without having the exact numbers, that Al Gore won the Sixth Congressional District. Therefore, he would receive at least one electoral vote from South Carolina. </p>
<p>
Using this method of selecting electors, Florida would not have become the winner-take-all, make-or-break state in this year's presidential election. A few hundred votes would not have been the difference between receiving all or none of its 25 electoral votes. Instead, 23 of those 25 votes would be divided up based on the outcome in each congressional district. </p>
<p>
Having said that, I can't say for sure who would have won the 2000 presidential campaign under this scenario. I have not seen the breakdown of the race by congressional districts. Consequently, I am not making this argument because it might have resulted in a Gore victory. Rather, this scenario represents the fairest reform of the Electoral College while holding true to the Founding Fathers' desire to avoid electing the president by popular vote.</p>
<p>
I admit that such proposals have been introduced in Congress and failed. With government as divided as it is at this time, it is unlikely that enough votes could be mustered to pass a constitutional amendment to this effect--and I don't believe we should pursue one. However, individual states have the right to determine how they select their Electoral College representatives. I urge South Carolina and other states to consider adopting the method used by Maine and Nebraska.</p>
<p><font color="darkred">Representative James Clyburn is a Democrat from South Carolina's Sixth District.</font><br /><br /><br /><a href="#top">[Top]</a></p>
<hr size="1" width="80%" /><p><font color="darkred" face="verdana, arial, helvetica"><b><a name="walter" id="walter">Walter Berns: Preserve It</a></b></font></p>
<p>There's no prospect that the Electoral College can be abolished. To abolish it would take a constitutional amendment. The amendment requires ratification by 3/4 of the states, and that means that 13 states could prevent ratification. The question then is whether 13 states have an interest in maintaining the Electoral College, and the answer to that is almost certainly yes. These are states with small populations but nevertheless real interests. The Constitution acknowledges those interests by giving them -- simply by virtue of the fact that they are states -- two United States senators and therefore, a minimum of three electoral votes.</p>
<p>
But the Electoral College has advantages over every proposed substitute -- and particularly the substitute most frequently proposed, namely, direct popular election. Consider the fix we would have been in this year if instead of the Electoral College, we had a system of direct popular election. We would have had Floridas all over the country. We had the Florida situation because of the close popular vote in Florida, and we therefore had to recount every vote in Florida. In a system of direct popular election, we would have to recount every vote everywhere. This year, we did not bother to recount votes in California or New York or Texas because in the first two, Mr. Gore had a great majority of the votes, and in Texas, Mr. Bush had a great majority. But in a system of direct popular elections, every vote everywhere matters and therefore has to be counted, and if necessary recounted. This in itself is a case for retaining the system we now have.</p>
<p>
In addition, however, I think it is a fact that in a system of direct popular election, there would be a proliferation of presidential candidates. The present system discourages what we call third-party candidates. In direct popular elections, there would be no discouragement. Again, to reflect on our current situation, in a system of direct popular election, not only would we have had Ralph Nader and Pat Buchanan, but we would have had a serious effort by Gary Bauer, whose support in the country, I suspect, far exceeds that of Nader or Buchanan. Then in the very likely event that no one would get the majority of the popular vote, there would have to be a runoff, and that would lead to a situation where Gary Bauer, Ralph Nader, and Pat Buchanan would begin to talk with the two major party candidates. The question would be, What would have to be offered them in exchange for their support? It is not unreasonable to suspect that one of the two major party candidates, and perhaps both, would offer Ralph Nader the head of the Environmental Protection Agency and Pat Buchanan the head of immigration; and what would Gary Bauer want, and what might he be offered? Perhaps control of nominations to the Supreme Court, Gary Bauer insisting on imposing the <i>Roe v. Wade</i> litmus test that only anti-abortion judges be appointed. Enough said.</p>
<p><font color="darkred">Walter Berns is a resident scholar at the American Enterprise Institute.</font><br /><br /></p>
<p><a href="#top">[Top]</a></p>
<hr size="1" width="80%" /><p><font color="darkred" face="verdana, arial, helvetica"><b><a name="william" id="william">Representative William Delahunt: End It</a></b></font></p>
<p>
For years, most Americans have ignored the Electoral College as a harmless nuisance. Not any more. The collision between the electoral and popular vote is no longer just a historical curiosity. It's time to abolish the Electoral College, and to count the votes of all Americans in presidential elections.</p>
<p>
This is about far more than any one candidate or the outcome of a particular election. At stake is public confidence in our electoral system.</p>
<p>
Two centuries ago, the Constitutional Convention considered many ways to select the president of an emerging republic, from popular election to assigning the decision to the Congress. The Electoral College was a political compromise that reflected a basic mistrust of the electorate -- the same mistrust that denied the vote to women, African-Americans and persons who did not own property.</p>
<p>
The Electoral College may or may not have made sense in 1787. But through 21st Century eyes, it is as anachronistic as the limitations on suffrage itself. Whether or not you like the results of a particular election, whether you voted for Bush or Gore or Nader, whether you live in Massachusetts or Montana, your vote should count.</p>
<p>
Some defend the Electoral College because it carries the weight of constitutional authority. As a member of the House Judiciary Committee, I approach the prospect of amending the Constitution with extreme caution. But the 12th and 22nd Amendments have already altered the system designed by the framers for electing the president; and until ratification of the 17th Amendment in 1913, the U.S. Senate was elected not by the people, but by state legislatures.</p>
<p>
For me, it is a straightforward proposition. If the Electoral College merely echoes the election results, then it is superfluous. And if it contradicts the voting majority, then why tolerate it?</p>
<p>
It is a remarkable and enduring virtue of our political system that our elections are credible and decisive -- and that power changes hands in a coherent and dignified manner. Many other nations watch with envy as a U.S. president welcomes his successor, often a political adversary, to the White House.</p>
<p>
The Electoral College threatens that stability. Even this election's crash civics course yields only a glimpse of the problems it can cause. Picture this: If Florida were won by a third-party or favorite-son candidate, depriving any candidate of the required 270 electoral votes, the presidential election would be thrown into the House of Representatives.</p>
<p>
That process would take months to resolve. The electors don't even cast their votes until December. It's another month until those votes are counted. And if a single candidate still lacked a majority, the nightmare would just be starting.</p>
<p>
The nation would be rudderless -- caught between a lame-duck president and two or more potential successors. Even in sleepy colonial times, this would create enormous national anxiety. Given today's 24-hour media frenzy and America's superpower status in the modern world, the impact is almost unimaginable. From financial markets to foreign terrorists, an extended period of confusion is an open invitation for trouble.</p>
<p>
The Electoral College is not worth these risks. For months, I have talked with congressional colleagues who shared my concern that this could be the year when the electoral vote contradicts the popular will. Now it is about to occur. For these reasons, my first act of the 107th Congress will be to introduce legislation to abolish the Electoral College outright.</p>
<p>
There is no partisan edge to this undertaking. It would not affect the outcome of the election; in fact, I first proposed it when the pundits were predicting a popular vote win by Governor Bush. I will seek support from Republican, Democratic, and Independent colleagues. The ultimate beneficiary could be a candidate of any party.</p>
<p>
I am under no illusion about the difficulty of enacting a constitutional amendment, which requires a congressional supermajority and ratification by three-fourths of the states. But now is the time to act -- while the sting of the contradiction is still fresh.</p>
<p>
Historically, the Congress has debated Electoral College reform only when faced with urgent public concern about specific elections. The Senate held hearings in 1992, when it appeared the Perot candidacy might deadlock the Electoral College. After the three-candidate 1968 election, the House actually approved a direct-election amendment, but it fell victim to Senate filibuster.</p>
<p>
Every other public official, from selectman to Senator, is chosen by majority vote. That's the way it's supposed to work in a democracy. For reasons both philosophical and practical, that's also how we should elect the president.</p>
<p><font color="darkred">William Delahunt is U.S. Representative from the 10th Congressional District in Massachusetts.</font><br /><br /><br /><a href="#top">[Top]</a></p>
<hr size="1" width="80%" /><p><font color="darkred" face="verdana, arial, helvetica"><b><a name="jamesr" id="jamesr">James R. Whitson: Preserve It</a></b></font></p>
<p>
The little understood, yet widely vilified, Electoral College is the best system for electing a president in a country as large and diverse as the United States. It prevents a big city from being more powerful than several states combined. It protects minority interests and opinions from being silenced by a simple majority. It requires candidates to build a national consensus by making their appeal as broad as possible, instead of pandering to a geographic region or running up the vote in states favorable to them. And finally, it ensures the federal nature of our government that the founders intended.</p>
<p>
The electoral college system prevents urban areas from overwhelming states in electoral clout. The total combined population of the 15 states of Alaska, Delaware, Hawaii, Idaho, Maine, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Dakota, Rhode Island, South Dakota, Vermont, and Wyoming is about 15.5 million. By comparison, the total combined population of New York City, Los Angeles, Chicago, and Houston is about 15.5 million.</p>
<p>
In the most popular alternative to the Electoral College, a direct election, these four cities would have the same electoral power as all 15 states. The people of a single state have wide and varied needs and issues because of their wide and varied geography. But the cities would be given preferential treatment by the candidates because it would be less expensive and more efficient for them to spend their time there rather than travel throughout an entire state.</p>
<p>
The electoral college system protects minority interests and opinions from being overpowered by a simple majority. Farmers, once a very influential constituency, now make up less than 4 percent of the population. Why would a candidate worry about this small group in a direct election? In the Electoral College system, since farmers do make up sizable parts of several states, their combined strength in a smaller pool of voters gives them more power. The same applies to minority opinions. For example, if 55 percent of the population is against abortion rights, and 45 percent are for them, a candidate in a direct election need not address the concerns of the minority view -- he can win without them. However, in the current system, ignoring a national minority opinion won't work because many states will disagree with the national opinion. This forces candidates to at least acknowledge the concerns of the minority.</p>
<p>
The electoral college system requires candidates to build a national consensus, rather than pad their lead in certain states or regions. In fact, the system averted just such an outcome in 1888, when Grover Cleveland based his campaign on one issue popular only in the South. He ended up barely winning the popular vote but losing the electoral vote handily to Benjamin Harrison. It turns out Cleveland had run up his vote total in six southern states, where he garnered 65 to 80 percent of the popular vote. In the other 32 states combined, he lost the popular vote! The Electoral College discourages candidates from pandering to one region at the expense of the rest of the country.</p>
<p>
And lastly, the electoral college system maintains the power of the states in our government. The Founding Fathers originally intended the federal and state governments to share power. They still do, but now the states have much less say in the federal government than they used to. In fact, as originally set up, the people voted only for the House of Representatives; the Senate was chosen by the state governments, so the states had a direct say in what laws were passed. The 17th Amendment took the states out of the federal legislature and indirectly out of the federal judiciary (where they had a vote in the Senate on judicial appointments). If the Electoral College were abolished, the states would lose their hold on the remaining branch of the government, the executive.</p>
<p><font color="darkred">James R. Whitson is editor of <a href="http://www.presidentelect.org">Presidentelect.org</a>.</font><br /><br /><br /><br /><a href="#top">[Top]</a></p>
Michael Nelson

The Lottery Gamble
http://prospect.org/article/lottery-gamble
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p> <span class="dropcap">H</span>ere's the best news to come out of the otherwise screwed-up 2000 election: The political juggernaut that during the last third of the twentieth century transformed the states from staunch foes of gambling into gambling's chief sponsors has slowed to a crawl. The voters of Arkansas rejected a lottery-casino ballot measure, joining the voters of Alabama, who turned down a lottery proposal in 1999. South Carolina voters were more ambivalent: They approved a lottery proposal, but they also elected a Republican House of Representatives that may refuse to pass the enabling legislation needed to put a lottery into effect.<br /></p><p><br /></p><p>
<p>What a contrast to the period that began in 1964, when New Hampshire became the first state ever to create, own, and operate a lottery. New Hampshire is one of only two states with neither an income tax nor a sales tax, and therein lies the tale. A lottery seemed to the state's voters a painless, voluntary tax.</p>
<p>Lotteries spread rapidly in this country during the 1970s and 1980s, when New Hampshire seemed a model to many states. In 1978 California voters passed Proposition 13, which placed severe restrictions on the state's taxing authority and inspired voters in some other states to enact similar measures. More important, Prop 13 and its progeny made politicians everywhere averse to new taxes. Only one state, Connecticut, has enacted a personal income tax or general sales tax since 1977. Ronald Reagan was elected president in 1980 on a promise to make substantial reductions in federal income tax rates. He not only accomplished this goal but also persuaded Congress to reduce spending on grant programs to the states.</p>
<p>To state governments caught in a vise between greater revenue needs and widespread opposition to taxes, the lottery seemed an appealing way out. During the late 1960s and the 1970s, 12 states (mostly in the Northeast) legalized lotteries. During the 1980s, 18 states--representing a majority of every region of the country except the South--followed suit. Six more states, including three in the South, legalized lotteries in the early 1990s. In all, 37 state governments and the District of Columbia--representing nearly 90 percent of the nation's population--now own and operate lotteries.</p>
<p>The desire for nontax revenues was not the only thing fueling the spread of lotteries; there also was competitive pressure on the states that didn't have a lottery. Once a critical mass of lottery states was reached, a race to the bottom began. In 1986, for example, John Carlin, the liberal Democratic governor of Kansas and an opponent of lotteries, saw how many dollars were flowing out of his state as people crossed the border to play in Missouri and Colorado. Carlin became a lottery convert, arguing that "not having one when your neighbor has one is like tying one hand behind your back." Kansas's story was repeated nearly everywhere. As the political scientists Frances Stokes Berry and William Berry found, the greater the number of lottery states that border a state without one, the more likely that state is to adopt a lottery.</p>
<p><span class="dropcap">W</span>hat a deal with the devil Carlin and his fellow governors struck. To begin with, lotteries are a wildly regressive way of raising revenue. Although members of nearly every demographic group bet the lottery in roughly equal numbers, some bet much more frequently than others do. "The heaviest players," Duke University economists Charles Clotfelter and Philip Cook have found, are "blacks, high-school dropouts, and people in the lowest income category." Yet state lotteries depend on the participation of these frequent players. "If all players spent the same as the median player, $75 a year," report Clotfelter and Cook, "[lottery ticket] sales would fall by 76 percent." Eighty-two percent of lottery bets are made by just 20 percent of players--and this group is disproportionately poor, black, and uneducated.</p>
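<p>(A rough calculation shows how lopsided that concentration is: if 20 percent of players place 82 percent of the bets, the average player in that group accounts for 82/20 = 4.1 units of betting against 18/80 = 0.225 units for the average player in the other group--roughly 18 times as much.)</p>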
<p>Despite laws to the contrary, minors bet the lottery, too. The presence of lottery tickets alongside candy, chips, and crackers in neighborhood convenience stores places children directly in contact with gambling. In lottery states, three-fourths of high school seniors report having bet in a lottery, according to the 1999 report of the National Gambling Impact Study Commission. In Massachusetts the attorney general found that children as young as age nine were able to buy lottery tickets in 80 percent of their attempts.</p>
<p>An additional problem with lotteries is that the money that states make from them seldom goes where the law says it should. Eighteen states earmark their lottery revenues for education; others, for transportation or programs for seniors. But economists have discovered that in most states little if any net increase in spending for the earmarked purpose actually occurs. Instead these states substitute lottery revenues for money they otherwise would have spent from their general funds.</p>
<p>Perhaps the worst thing about lotteries is that they put states into the business of gambling, which generates its own downward spiral of increasing regressivity and deception. States come to depend on the revenues from lottery games as part of their ongoing budgets. But people get bored betting the same games over and over again. Ticket sales and revenues to the state treasury drop. So state lottery agencies ramp up their advertising, much of which is designed to persuade those who already bet a great deal to bet a great deal more.</p>
<p>The federal government is no help. Although commercial sweepstakes operators like Publishers Clearinghouse are governed by the Federal Trade Commission's truth-in-advertising rules, Congress has exempted state lotteries from such restraints. With few exceptions, lottery agencies use their freedom from federal regulation to advertise their games misleadingly, thereby fostering the impression that the odds of winning a big prize are good and that playing the lottery is a sensible way to enhance one's financial status. In doing so, these agencies encourage luck--not hard work or saving and investment--as a strategy for success.</p>
<p>"When I was younger, I suppose I could have done more to plan my future," says a smiling young man in a commercial for the Connecticut lottery. "But I didn't. Or I could have made some smart investments. But I didn't. Heck, I could have bought a one-dollar Connecticut lotto ticket, won a jackpot worth millions, and gotten a nice big check every year for 20 years. And I did! I won!" The commercial ends with a voice-over saying, "Overall chance of winning is one in 30." But that is the chance of winning a small prize in an instant lottery, not "a jackpot worth millions."</p>
<p>Bettors may become less and less susceptible to commercials like this, but they are hardly immune to the epidemic at large. State lottery agencies, pressured by their governors and legislatures to keep the revenues coming, develop new, more enticing games. Over the years, the monthly drawing has given way to the daily drawing, the instant scratch-off game, and lotto. The five states that have recently decided to market slot machine-style video lottery terminals may represent the wave of the future. In the late 1990s, lottery revenues fell in nearly half the states; but the video states experienced annual growth rates ranging from 9 percent to 26 percent.</p>
<p>Until recently, political conditions seemed ripe for a new round of state lottery enactments. Except for Alaskans and Hawaiians, every American lives in a state that either has a lottery or shares a border with one or more lottery states. Ambitious politicians in nonlottery states have a strong incentive to urge such enactments. Lotteries are a normal activity of state government, they argue, pointing to the money the state loses when its people cross the border to bet in other states.</p>
<p>But the rejection of a lottery by the voters of Alabama and Arkansas, as well as South Carolinians' tepid approval of one, suggests that the political tide may have turned. Before 1999, referenda to create state-run lotteries were almost unbeatable: 32 passed, and only two (both in North Dakota) were defeated. Since 1999, lottery referenda have gone one for three. Voters in Maine, a lottery state since 1974, turned down a ballot measure in the 2000 election to allow video gambling at racetracks. Tennessee voters are far from certain to approve a lottery in 2002, when a referendum is scheduled to take place. That's a big change from just a few years ago, when easy passage would have been a sure thing.</p>
<p>Lotteries are on the political decline for several reasons. The recently formed National Coalition Against Legalized Gambling (NCALG), a grass-roots organization that can be counted on to set up shop in almost any state that is considering a lottery, deserves part of the credit. NCALG is especially good at rousing opponents from both ends of the political spectrum. Liberals are called to arms by the issues of social justice that a lottery raises. Conservatives are energized by their conviction that gambling is morally destructive.</p>
<p>Anyway, the promise of new revenues from a lottery is less alluring than it used to be. Now that lotteries have been around long enough for economists and other social scientists to study their effects, the word is out: They're bad news. They are regressive, deceptive, and--for both children and adults--enticing to the point of being addictive. The revenues they generate for a state are roughly equivalent to those that an increase in the sales tax of less than 1 percent would produce. But the main argument against lotteries should have been as apparent to New Hampshire 37 years ago as it is to Alabama and Arkansas today: It's not the place of government to encourage people to gamble.</p>
<table cellpadding="4" border="2" bgcolor="#ccccff"><tr><td>
<p align="center"><span class="subhead">Gambling Online</span></p>
<p><small><span class="dropcap">T</span>he political outlook isn't bleak for all forms of gambling--or even for all sectors of the dot-com economy. Distressingly, Internet gambling is on the rise.</small></p>
<p>Since 1995, when the first gambling Web site was launched, more than 500 sites have cropped up. Most of these are virtual casinos licensed in countries such as Antigua and Australia, where Internet gambling is legal. An analyst for Christiansen/Cummings Associates calculates that online gambling revenues more than doubled from $300 million in 1997 to $651 million in 1998, then redoubled to $1.2 billion in 1999 and nearly redoubled again to $2.0 billion in 2000. Other industry analysts offer higher estimates.</p>
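<p>(Those estimates imply annual growth of nearly 90 percent: $2.0 billion is about 6.7 times $300 million, and sustaining that over the three years from 1997 to 2000 requires multiplying by roughly 1.88 each year.)</p>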
<p>The U.S. government has not been oblivious to this pattern. When Janet Reno was attorney general, she declared: "You can't hide online and you can't run offshore." But about all she had to work with was the Wire Communications Act, a 40-year-old law designed to keep people from using telephones and telegraphs to place bets on horse races and football games. The law is, at best, tenuously applicable to online casinos, much less enforceable against an Internet industry that is moving from telephone wires to cellular, fiber-optic, and satellite technologies.</p>
<p>Surprisingly, not many people support Internet gambling. The only group that has lobbied on its behalf in Washington is the Interactive Gaming Council, an association of Web-based casinos and sports books that is located in Vancouver, Canada. As for the general public, a 1999 Gallup poll found that 75 percent of Americans disapproved of "legalized gambling or betting using the Internet"--by far the largest margin of opposition to any form of gambling.</p>
<p>And the public is right to disapprove. Internet betting offers a unique temptation to people who are prone to gamble destructively. For the large and rapidly increasing number of Americans who have online access through personal computers, gambling Web sites are a 24/7 presence. Young people, for whom playing online games is a major form of recreation, are especially vulnerable. Gambling Web sites typically accept the word of bettors who claim to be of legal age. Potentially addictive in their own right and a gateway to other, more dangerous forms of gambling, Internet casinos contribute to gambling disorders.</p>
<p>Online gambling also may foster crime in a distinctive and especially problematic way. Until recently, traditional bricks-and-mortar casinos were places where cash-rich criminal organizations could launder money: They simply bought chips with illegally obtained cash, played a little, then cashed in the chips for untainted currency. Congress closed this door in the mid-1980s by requiring casinos to report every transaction larger than $10,000. But Internet venues can evade this law simply by basing themselves outside the United States.</p>
<p>Another problem with the industry is how it affects government, especially at the state level. Web sites based in other countries pay no taxes in the United States. Yet the states are left to deal with the crime, bankruptcy, and gambling disorders that may result.</p>
The will to prevent Internet gambling is widespread in Washington, but considerable uncertainty remains about how to do it. Much of Congress's attention has been devoted to Arizona Senator Jon Kyl's proposed Internet Gambling Prohibition Act, a bill that would extend the coverage of the old wire act to include Internet-related technologies such as fiber-optic cable and microwave transmission.
The Kyl bill, which passed the Senate in 1999, has received strong support from an unusual political coalition: a broad spectrum of Christian groups, ranging from the liberal National Council of Churches, which regards gambling as a threat to the poor, to the conservative Southern Baptist Convention, which is concerned primarily with issues of morality. These groups are joined by the American Gaming Association, which represents the established and closely regulated casino industry, and by amateur and professional sports organizations such as the National Collegiate Athletic Association and the National Football League, which worry about the effects of Internet-based sports betting on the integrity of their games.

But even some who share the Kyl bill's goals doubt that it would be effective in achieving them. Policing activities in cyberspace--especially when they are legal in the countries where the Web sites are based--is no easy task.

Yet Congress can, as Century Foundation President Richard Leone says, "put sand in the wheels" of Internet gambling. In addition to passing the Kyl bill, it could follow the 1999 National Gambling Impact Study Commission's suggestions both to "prohibit wire transfers to known Internet gambling sites, or [to] the banks who represent them" and to render unrecoverable in U.S. courts "any credit card debts incurred while gambling on the Internet." To be sure, Americans who wanted desperately to gamble on the Internet would find a way to get their money to the necessary places--perhaps by opening an account in an overseas bank, then instructing that bank to transfer funds to an Internet gambling site. But the casual, experimental, or underage Internet gambler who had to go to more trouble than entering a credit card number would be deterred.
Michael Nelson | December 19, 2001

Chins Up, Liberals
http://prospect.org/article/chins-liberals
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><font color="darkred"><big>A</big></font>mericans are ideological conservatives and operational liberals. That was the finding of social psychologists Lloyd A. Free and Hadley Cantril, who based much of <i>The Political Beliefs of Americans, </i>their classic work about public opinion, on a massive survey they conducted during the fall of 1964. As ideological conservatives, Americans are skeptical about the "role and sphere of government in general and of the Federal Government in particular," the authors discovered. Yet as operational liberals, citizens have favored just about every "government program to accomplish social objectives since at least the days of the New Deal." </p>
Republican Senator Barry Goldwater lost the 1964 presidential election, Free and Cantril argued, because he was an in-your-face operational conservative. He traveled to Tennessee, for example, to make a speech blasting the Tennessee Valley Authority (TVA). "As long as Goldwater could talk ideology alone, he was high, wide, and handsome," they wrote. "But the moment he discussed issues and programs, he was finished." Although Free and Cantril didn't say so, it surely did not help Goldwater's cause that he was so relentlessly pessimistic about America. His view, which was shared by many conservative thinkers at the time, was that the United States and the other Western nations were in decline, the only uncertainty being the rate of their descent into a Soviet-dominated abyss.
One of the 44 states that Goldwater lost in 1964 was California, which President Lyndon B. Johnson carried by a margin of 1.3 million votes. Two years later, the man whom Goldwater had described as the likely heir to his political mantle, Ronald Reagan, was elected governor of California over two-term Democratic incumbent Pat Brown by nearly a million votes. Sixteen years after that, in 1980, Reagan was elected president in a 44-state landslide and promptly proclaimed in his inaugural address that "government is not the solution to our problems; government is the problem." And in 1996 Bill Clinton, the only Democratic president elected in the post-Reagan era, was handily re-elected after kicking off the year by declaring in his State of the Union address that "the era of big government is over."
What explains Reagan's political success? And what explains Clinton's? The answer to the first question, it turns out, is not much different from the answer to the second.
The story of Reagan's election as governor of California in 1966 is told engagingly by historian Matthew Dallek in <i>The Right Moment</i>. It's a familiar story in its main outline, but Dallek is masterful at providing the telling detail. For instance, although everyone knows that Reagan used to be a liberal Democrat, it will be news to most that he once made anti-Republican radio commercials for the International Ladies Garment Workers Union. Dallek also has an eye for the revealing memo. As an indicator of just how ideologically liberal California's Democratic activists were in the 1960s, consider this list of "magic words" that Brown campaign aide Fred Dutton told the governor would make the California Democratic Council swoon when he spoke at its convention: "sacrifice; India; people in need in the world; dangers of atomic testing."
Arcana aside, Dallek's thorough account provides the raw material for an explanation of Reagan's political success that might be called "Free and Cantril-plus." Reagan's first appearance in national politics came in October 1964, when he delivered a half-hour televised speech for Goldwater that columnist David Broder called at the time "the most successful national political debut since William Jennings Bryan electrified the 1896 Democratic convention with the 'Cross of Gold' speech." As soon as Reagan was done speaking, thousands of small donors sent contributions to the Goldwater campaign totaling $8 million.
"The Speech," as it came to be known among Reaganites, was one that Reagan had written, rewritten, and learned by heart in the course of delivering it dozens, perhaps hundreds, of times as an after-dinner speaker for General Electric. It resonated with Americans' ideological conservatism by celebrating the virtues of the free market and excoriating big-government planners. Nothing special about that: Goldwater could ring those chimes too. What Reagan left out that Goldwater usually put in, however, were attacks on Social Security and the TVA and the other federal programs that Americans cherished. What Reagan put in that Goldwater left out was optimism. We don't have to "choose between a left or right," Reagan declared. "There is only up or down: up to man's age-old dream--the ultimate in individual freedom consistent with law and order--or down to the ant heap of totalitarianism."
Reagan's speech spurred several major backers of the California Republican Party--most of them first-generation Americans who had become successful self-made businessmen--to recruit him to run for governor in 1966. The surest sign that Reagan would not repeat Goldwater's mistakes was that he hired the political consulting firm of Spencer-Roberts to direct his campaign. That was the firm that had managed liberal New York Governor Nelson Rockefeller's near-miss challenge to Goldwater in the 1964 California presidential primary. When Goldwater volunteered to help out in 1966, Reagan deftly replied with a private letter that thanked and praised the senator at length but said not a word about his offer.
Reagan had many things going for him in 1966. Brown may have scored a few points by charging that all Reagan had ever done was act in the movies and on television, but Reagan had the screen actor's gift of connecting with a mass audience through the lens of a camera. (Besides, movies and television are a major industry in California.) Brown's media advisers had little ability of their own, and less to work with in their candidate: One offered to lend the portly governor an exercise bike; another urged him to take off his glasses. More important, Californians were unimpressed with Brown's recent handling of student demonstrators at Berkeley and rioters in Watts. Indeed, Dallek argues that the key to Reagan's victory--and "the decisive turning point in American politics"--was his effective use of the law-and-order issue.
The real key to Reagan's political success, however, was a recipe that included massive doses of ideological conservatism and optimism while leaving all but a pinch of operational conservatism in the pantry. In his public appearances, Reagan waxed rhapsodic about the joys of free enterprise and California's bright future. When asked about changing abortion laws, however, he replied that he would wait until the legislature acted, then "go for the necessary advice to men of science, men of medicine, and men of God." He handled most other hot-button issues the same way.
Just as Ronald Reagan inherited a Republican Party that was on the ropes in 1966 (and, again, in 1980), Bill Clinton won the nomination of a reeling Democratic Party in 1992. Like his recent predecessors among Democratic presidential candidates--George McGovern, Walter Mondale, and Michael Dukakis--Clinton ran as an operational liberal promising a host of new federal programs designed to make the economy grow. But unlike them, Clinton leavened his rhetoric with two buzzwords that he'd appropriated from conservatives: "opportunity" and "responsibility."
Clinton also ran an upbeat, optimistic campaign in 1992, painting a future filled with technological wonders and an expanding economic pie rather than gas lines, oil spills, and nuclear winter. (It was Al Gore, his running mate, who talked about global warming.) He realized that optimism is liberalism's main contribution to politics. The idea--starry-eyed, by historical standards--that human beings can create the society they want to live in is liberal to the core. Clinton knew that it is liberals whose chins, like FDR's, ought to be jutting upward when they talk about the future.
Michael Waldman, who worked in Clinton's domestic-policy shop for two years before serving as chief speechwriter from 1995 to 1999, is understandably fixated on words in <i>POTUS Speaks</i>. (POTUS is govspeak for president of the United States.) Hilariously, given that he wrote the material, Waldman praises the president for "the effervescent power of his words" and marvels, "As I watched Clinton persuade [many fellow Democrats of the virtues of NAFTA], I found myself becoming persuaded too." But Waldman's larger point is correct: The president's political success was as much a function of what he said and how he said it as of what he did to change government.
The 1996 re-election campaign incorporated all the main elements of the Clinton formula for victory at the polls. Operational liberalism was one such element: Clinton demolished Republican congressional leaders in the battle over the 1996 budget by refusing to cut spending for popular domestic programs, even when that meant shutting down the government. On the campaign trail, he offered the litany of "Medicare, Medicaid, education, and the environment" so relentlessly that reporters began recording the phrase in their notebooks as "M2E2."
Optimism was another constant in the Clinton re-election campaign. On Clinton's behalf, U.S. Trade Representative Mickey Kantor instructed the speechwriters and his fellow cabinet members: "We're not the party of government; we're the party of the future." Clinton himself repeatedly referred to his presidency as a "bridge to the twenty-first century," promising the voters a bridge "big enough, strong enough, and wide enough for everybody to walk across" and asserting that "everyone has a right to walk on the bridge." Reporters and political junkies wearied of the image (by one count, Clinton uttered the word "bridge" nine times in an average campaign speech), but the voters liked hearing that the future was something to look forward to and that there was a place in it for them.
Clinton never forgot, however, that Americans are ideological conservatives. Waldman reports that in 1993 the new president ignored advice from George Stephanopoulos to answer Reagan's "government is the problem" pronouncement with a ringing defense of government in his own first inaugural address. Instead, three years later Clinton kicked off his campaign for re-election with the "big government is over" declaration. After he won, he began tempering that statement, but only by balancing it: "Government is not the problem, and government is not the solution," he said in his second inaugural address. In the 1998 State of the Union speech, he proclaimed, "We have moved past the sterile debate between those who say government is the enemy and those who say government is the answer." For Waldman, the "words that defined the Clinton presidency" most persuasively were those that defended government programs without defending government as a proposition.
Clinton is a controversial figure among liberals, and he is likely to remain so for many years--for all sorts of reasons. But surely all can agree that by reclaiming the cloak of optimism he did liberalism a great service. Gore's dreary 2000 campaign is a reminder that liberals can still sink into the dank morass inhabited by the political ghosts of McGovern, Mondale, and Dukakis. "You've never had it so good, and I'm mad as hell about it" is not, as journalist Michael Kinsley pointed out, a theme likely to electrify the nation. The Democratic standard-bearer in 2004 will have more to learn from the party's only recent occupant of the White House than from its several recent failed contenders.
Michael Nelson | November 7, 2001

The Essential Tip O'Neill
http://prospect.org/article/essential-tip-oneill
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><p><i>Tip O'Neill and the Democratic Century,</i> by John<br />
A. Farrell. Little, Brown and Company, 776 pages, $29.95.</p>
Jimmy Breslin called Tip O'Neill "a lovely spring rain of a man" and John A. Farrell proves Breslin right in <i>Tip O'Neill and the Democratic Century.</i> Farrell, a prizewinning veteran reporter for <i>The Boston Globe,</i> has written a book as lovely as its subject, and also as big and accomplished.
Yes, there are stories--including a few that you may not have heard before, like this one: O'Neill, who was blissfully disengaged from the popular culture, was chatting at a fundraiser with a handsome young man who seemed to think that O'Neill knew who he was. After the young man left, O'Neill asked a friend, "Who was that?" The answer: "Warren Beatty."
O'Neill looked blank for a second. "The lion tamer's son?"
Then there was the time when O'Neill heard that Barney Frank, his fellow Massachusetts congressman, was going to announce publicly that he was gay. O'Neill quietly began to inform a few colleagues. "Barney is coming out of the room," he told them. And once when O'Neill, as a freshman congressman, returned home from a trip to Nevada to see a hydrogen bomb test, he discovered a band of bruises around his belly. Worried that he might have radiation poisoning, he went to see a doctor. No, the doctor told him, you don't have radiation poisoning. You got the bruises by banging too hard against the craps tables in Las Vegas.
All the familiar O'Neill material is here, too. Farrell tells us, for example, about winning Mrs. O'Brien's vote ("People like to be asked") and reminds us that "All politics is local" (a hand-me-down from O'Neill's father, who was the Cambridge, Massachusetts, superintendent of sewers). But anecdotes are only one of this book's virtues. Farrell has written a knowing and engaging biography of O'Neill, a lucid chronicle of his times, and a wonderfully realized portrayal of the settings in which he spent his life: Boston during the first half of the twentieth century and Washington during the second half. The unexpected value of Farrell's book is the example it offers today's Democrats about how to survive and even thrive during a Republican presidency.
O'Neill, a model for today's Democrats? Old, rumpled, overweight, gruff-voiced, pre-television, pre-New Democrat Tip O'Neill?
Scoff if you like. Then try to name another nationally prominent Democratic leader within memory--executive or legislative, federal or state--who has retired from public life with flags flying and reputation intact.
Thomas P. O'Neill, Jr., appeared on the national scene so late in life--he was 64 when he became Speaker of the U.S. House of Representatives in 1977--that it's hard to imagine what an able young politician he was. As a senior at Boston College, O'Neill came within 229 votes of being elected to the Cambridge City Council. After winning a seat in the state legislature two years later, he rose through the ranks to become speaker of the Massachusetts House of Representatives. At age 37, he was the second-youngest house speaker in the history of the state, as well as the first Roman Catholic and the first Democrat. One reason O'Neill won the job was that his colleagues liked him so much--he was always good for a card game, a story, or a cigar. The other reason was that they respected him for his courage--for taking on the McCarthyites who wanted to force all of the state's teachers to swear a loyalty oath, for example.
It never occurred to O'Neill that politics wasn't the best career in the world or that government was anything other than a force for the good. As a boy, he lived the solid middle-class life that a family income rooted in public office afforded. He followed his father around Cambridge and watched him dispense jobs and buckets of coal to his working-class constituents. O'Neill turned 21 the year Franklin D. Roosevelt became president, and cheered when unemployed friends found jobs in New Deal programs and pensionless old neighbors began drawing Social Security. In Massachusetts, as Farrell points out, even the Republicans were liberals.
O'Neill's first major crusade as speaker at the Massachusetts statehouse involved health care for the mentally ill. The way he approached the issue reveals the approach to politics and government that marked his entire career. His interest in the state's decaying system of mental hospitals was piqued when a constituent with a Down's syndrome child sought his help in getting the child hospitalized. O'Neill drove the child to the state hospital in Belmont and was turned away: The waiting list already had 3,600 names on it. So he left the child in the waiting room and then phoned to say: "The child is in your hospital. Find a bed." But he also rammed through the biggest one-year capital outlay in state history in order to fund new hospital construction. Good politics in the form of constituent service is what got the child into the old hospital. Good government in the form of new legislation financed the building of new ones.
O'Neill never wore his religion on his sleeve, but Farrell leaves little doubt that O'Neill's political sensibility derived from his immersion in Catholicism. As a boy in parochial school, he was instructed in the gospel: blessed are the poor, the meek, those who mourn, and those who thirst for justice. "Other boys heard the sermons as well," Farrell points out, but other boys had not lost their mothers when they were infants, as O'Neill had. "O'Neill's intimate sense of loss made him an insistent, and powerful, tower of strength for the needy," according to Farrell. Later in O'Neill's political career, pundits would point to his faith to explain why he supported the Hyde Amendment restricting abortion access or opposed American intervention in Central America. What they missed was the O'Neill who told his son's senior class, "In everything you do, you must recall that Christ loved man and wished us, for our own sakes, to love Him. The method by which we exercise that love is by loving our fellow man, by seeing that justice is done, that mercy prevails."
O'Neill was elected to Congress in 1952 when John F. Kennedy left his 11th Congressional District seat to run for the Senate. O'Neill and the Kennedys never had an easy relationship: Joseph P. Kennedy pumped campaign funds to a state legislator named LoPresti who was running against Tip in the Democratic primary because he figured that his son, Jack Kennedy, had the Irish vote locked up but could use some help with the Italians. Being on the outs with the Kennedys didn't hurt O'Neill a bit in Congress. House Democratic leader John McCormack, another Massachusetts politician who had problems with the Kennedy family, was a more valuable patron in that setting than any Kennedy.
McCormack introduced Tip to Speaker Sam Rayburn's "board of education," an after-hours gathering of House insiders. When McCormack's retirement in 1971 created an opening for a big-city northerner in the House Democratic leadership, O'Neill was appointed party whip. Two years later, he was elected majority leader by his fellow Democrats, and four years after that he was elected Speaker. In both elections, the vote was unanimous. A potential rival for majority leader withdrew by saying in front of the House Democratic caucus, "Tip, I can tell you something that nobody else in this room can. You haven't got an enemy in the place."
Backslapping bonhomie was not the whole story of O'Neill's rise to power. The House Democrats who elected him Speaker were post-Vietnam, post-Watergate reformers: young, suburban, independent, and impatient with traditional ways of doing things. The O'Neill this generation liked kept his cigars in his pocket but also helped uproot conservative southern Democrats from their committee chairmanships and broke early with President Lyndon B. Johnson over Vietnam. O'Neill's shift on the war was "the most politically significant ... of all the congressional changes of position," according to Congress's official history of the Vietnam War.
O'Neill's fondest dream had been to serve as Speaker with a Democratic president. It came true in a those-whom-the-gods-would-punish kind of way when Jimmy Carter was elected in 1976. Substantively, Carter was a different kind of Democrat from O'Neill. The Speaker, as his former aide Chris Matthews summarized his political philosophy, "believed in the programs. Programs for people." O'Neill's heart for the poor and afflicted was boundless, as was his inclination to slip a few hundred million dollars in their direction. "I've been one of the big spenders of all time," he proclaimed to a group of reporters, then waxed rhapsodic about programs he had funded to cure knock-knee and help dwarves grow taller. Carter, for his part, believed in fiscal austerity and government reorganization.
These differences in substance were nothing compared with O'Neill and Carter's differences in style and temperament. O'Neill was, as the title of his memoir put it, a "Man of the House." Carter regarded Congress as part of the mess he had been sent to Washington to clean up. O'Neill's approach to legislating was to get everybody with a political stake in a decision into the same room and keep negotiating until there were enough votes to pass the bill. Carter's was to study a problem from every conceivable angle, arrive at the correct solution, and then tell O'Neill and Senate Majority Leader Robert C. Byrd to go pass it into law. In the end, O'Neill couldn't decide whether the Carter administration failed because it was unlucky ("You get a good hand, and the dealer drops the deck") or because it was made up of, as he put it, "a bunch of pricks."
In 1980 Carter lost his bid for re-election to Ronald Reagan and the Republicans took control of the Senate. Overnight and by default, O'Neill became the most prominent Democrat in the country. Reaganites were thrilled: They had run a commercial all through the election year that showed a fat, white-haired, cigar-chomping O'Neill look-alike driving a big car and running out of gas. Democrats worried that the Republicans would complete their electoral realignment by winning the House of Representatives in the midterm elections of 1982. Reagan already had a governing majority in the House: 190 Republicans plus 40 members of the southern-dominated Democratic Conservative Forum, the so-called boll weevils.
O'Neill adapted readily to his new role as leader of the opposition, both in the media and in Congress. Matthews helped him craft sound bites for his daily press briefings and taught him to repeat them until the evening news programs had no choice but to air them. O'Neill's public image gradually went from "big, fat, and out of control" (the epithet of a Republican congressman--who got beat in the next election) to "this big guy with a good heart and a lot of guts," everybody's favorite Uncle Tip.
In Congress, O'Neill crafted what Farrell calls a "give him rope" strategy for dealing with Reagan. Knowing that the American people would not turn their backs on Reaganomics until they had seen it fail, O'Neill watched patiently as the Republican and boll weevil coalition enacted the president's tax and budget cuts. But he pounced when Reagan tried to cut Social Security benefits for early retirees and when the economy slid into its deepest recession since the 1930s. Far from losing the House in 1982, the Democrats gained 26 seats. Far from losing programs, Farrell argues, O'Neill preserved the New Deal and the Great Society with all their "muscle and bone--and even some flab" intact. Reagan remained personally popular, but the Reagan Revolution was dead in the water.
"Give Bush rope" is not a strategy that Democrats who are still convinced that Al Gore won the presidential election are likely to embrace in 2001, and even in O'Neill's day it wasn't the approach taken by the party's liberal writers and thinkers. But for elected Democrats who believe that George W. Bush's policies are ill advised, as O'Neill believed that Reagan's were, a patient strategy based on a confident appraisal that the Bush administration will use that rope to hang itself is worth considering.
The more important lesson today's Democrats can learn from the late Speaker (he died in 1994) is about the place of religious faith in politics. The Democratic Party seems increasingly uncomfortable with religion. This may explain why references to "faith-based organizations" conjure up, for some liberals, images of Baptist preachers pressuring women not to have abortions instead of Reform Jews feeding the hungry and Catholic nuns nursing AIDS victims--even though there are a lot more of the latter two kinds of group than of the former. Church attendance is just one of several measures of how secular the Democrats have become. In the 2000 presidential election, Bush beat Gore by 20 percentage points among voters who attend religious services at least once a week, while Gore beat Bush by 17 points among those who seldom or never attend.
What Democrats can learn from O'Neill is that the social-justice strain in all of America's major-faith traditions can be both an anchor and a spur when it comes to liberal politics. No poll or election could ever dissuade O'Neill from thinking that the fundamental purpose of government is to help "the least of these"--he believed that deep in his soul. O'Neill's religion never let him lapse into the inertia or sullenness that is born of losing. For him the Cross was a sign of ultimate victory in the face of momentary defeat. Whether from religious conviction or some equally compelling wellspring of purpose, modern Democrats not only need to say the right things, they also need to believe that the stakes are high enough to do them.
Michael Nelson | November 7, 2001

From Rez to Riches
http://prospect.org/article/rez-riches
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><blockquote><p>
</p><p><i>Indian Gaming: Tribal Sovereignty and American Politics,</i> W. Dale Mason. University of Oklahoma Press, 330 pages, $29.95.
</p><p>
</p><p><i>The Revenge of the Pequots: How a Small Native American Tribe Created the World's Most Profitable Casino,</i> Kim Isaac Eisler. Simon and Schuster, 267 pages, $25.00.<br /></p><p>
</p><p>
</p><p><i>Without Reservation: The Making of America's Most Powerful Indian Tribe and Foxwoods, the World's Largest Casino,</i> Jeff Benedict. HarperCollins, 376 pages, $26.00.<br /></p><p></p></blockquote>
At a gathering of political scientists in 1997, W. Dale Mason's graduate adviser introduced him to an eminent scholar, noting that Mason's award-winning doctoral dissertation was about "Indian gaming." The scholar told Mason that there was someone in the room he had to meet; he called over a graduate student who studied game theory. The student was from India.
The term "Indian gaming" is less likely to be misunderstood these days, as casinos owned by American Indian tribes have become a prominent part of the national landscape. Only 13 years ago, there were no tribal casinos in this country; now these enterprises operate in 24 states. During the past few months, two major trade publishers have released books on the biggest of them all: the Foxwoods Resort Casino, which is owned by the Mashantucket Pequot Tribal Nation and located near Ledyard, Connecticut, just a two-hour drive down from Boston or up from New York City. Both of these books--Jeff Benedict's <i>Without Reservation</i> and Kim Isaac Eisler's <i>Revenge of the Pequots</i>--tell the less-than-rags to more-than-riches story of how a tribe that did not exist in the eyes of the federal government until 1983 created, as the subtitles have it, "the world's largest casino" and "the world's most profitable casino." Meanwhile, Mason's dissertation-turned-book provides the historical context that explains the rise of tribal casinos.

The story Mason tells in <i>Indian Gaming</i> begins nearly two centuries ago, when the Supreme Court established in a series of decisions that American Indian tribes enjoy what Mason calls a "diminished sovereignty." Although state governments can't, of their own authority, make tribes do anything, the federal government's authority over tribes is virtually unlimited. To be sure, Chief Justice John Marshall had a lot to say about how the federal government should exercise this authority. It was to act as a guardian would for a ward, offering "kindness," "relief," and "protection."

For years, the federal government fell short of that kindly role; it did one heinous thing to Native Americans after another. But in the 1970s and 1980s, a combination of forces began turning public policy in a more favorable direction. There was a new assertiveness by organizations such as the American Indian Movement. There also were changes in popular culture that fostered a <i>Dances with Wolves</i> romanticism, Republican presidents who wanted tribes to become economically self-sufficient so that federal subsidies could be cut, and pro-Indian Supreme Court decisions with teeth.

When tribes first turned to commercial gambling in this recent era, usually in the form of high-stakes bingo, the Court prevented state governments from clamping down on them. In a 6-to-3 decision that scrambled the justices' usual alignments (Antonin Scalia and John Paul Stevens were on one side, and William Rehnquist and Thurgood Marshall were on the other), the Court declared in the 1987 case <i>California v. Cabazon Band of Mission Indians</i> that states that permit or, in the case of lotteries, operate gambling in any form can't prevent tribes from having their own gambling facilities.

The states, which had opposed <i>Cabazon,</i> subsequently won a partial victory in Congress. The 1988 Indian Gaming Regulatory Act (IGRA) authorized tribes to sue states that tried to deny them the right to open casinos, but it also restricted the tribes' latitude. IGRA allowed tribes to have casinos only if the state already allowed similar forms of gambling. And it required that before opening a casino, a tribe had to negotiate a compact with the state that spelled out the terms of operation.

As Mason points out, in the 13 years since IGRA was enacted, 189 tribes have gotten into the business of what the law calls Class III gambling--casinos or, in a few cases, offtrack betting on horse races. These operations bring profits of more than $7 billion a year into tribal coffers, an astonishing amount compared with any other moneymaking enterprise that Native Americans have ever owned and operated.

But most of the nation's 557 tribes--indeed, around two-thirds of them--are not part of the casino economy, and many of the tribes that are part of it are barely getting by. Eight tribal casinos account for 40 percent of all the revenue. Geography is one reason why so few tribes have been able to make big money. Many tribal lands are too far from population centers ("feeder markets," in the industry vernacular) to attract enough gamblers from off the reservation. Culture is another reason. Some tribes, like the Navajo and the Seneca, have voted to stay out of the casino business for fear that it would defile their traditions and despoil their lands.
</p><p> <span class="dropcap">F</span>or tribes that are both fortunately located and willing to trade traditional culture for modern casinos, politics has been the key to whether they have been able to cash in--as the case of the Mashantucket Pequots illustrates. Politically, the Pequot tribe played its hand well in setting up Foxwoods. As both Eisler and Benedict tell the story, the Pequots benefited enormously from the groundwork laid by Tom Tureen, a white public-interest lawyer who in 1980 used the threat of litigation to persuade Congress to give two Maine tribes, the Passamaquoddies and the Penobscots, nearly $82 million. The basis of Tureen's legal claim was the requirement of the Indian Nonintercourse Act of 1790 that every sale of tribal land be approved by federal treaty. Maine had never sought such approval, which put two-thirds of all the land in the state at legal risk. Neither, Tureen discovered, had Connecticut when it sold off most of the Pequots' land in 1856.<br /></p><p><br /></p><p>
But were there any Pequots left? The tribe was, as Eisler points out, "the very first to be exterminated" by the English and Dutch during colonial days. In a bit of heavy-handed foreshadowing, Herman Melville named Captain Ahab's doomed ship the <i>Pequod</i> after the "celebrated tribe of Massachusetts Indians, now extinct as the ancient Medes." But a couple of hundred acres of Pequot land in southeastern Connecticut remained untouched and, well into the twentieth century, a woman who claimed Pequot ancestry lived on them. When she died in 1973, her grandson, Richard "Skip" Hayward, a knockabout welder, moved onto the land and encouraged all of his relatives to do the same. Eisler regards Hayward as being perhaps one-sixteenth Pequot on his mother's side. As far as Benedict is concerned, Hayward's--and his grandmother's--Pequot credentials are entirely bogus.

Tureen was able to parlay the Hayward family's claim to tribal status into federal recognition and a $900,000 settlement from Congress, which the tribe used to buy several hundred more acres. In 1987 Hayward benefited from the <i>Cabazon</i> ruling and from Connecticut's enactment of a law that allowed schools and charitable organizations to hold "Las Vegas Nights" offering casino games with noncash prizes. The push for the bill had come from Mothers Against Drunk Driving (MADD), which hoped that kids would stick around school on prom night instead of getting on the roads. But because of <i>Cabazon,</i> the MADD bill opened the door to full-scale casino gambling on Pequot land. When Connecticut governor Lowell Weicker refused to negotiate a casino compact with Hayward, Tureen invoked the provision of IGRA that allows tribes to sue recalcitrant states and won in federal court. Foxwoods opened in 1992.

As easy as the Pequots had it in their political dealings with the governments in Hartford and Washington, they haven't been spared the turmoil of politics within the tribe. The stakes, of course, have become enormous. With advice from veteran casino executives from Atlantic City and ample financing from a Malaysian billionaire who had always wanted a stake in an American casino, Hayward was able to take full advantage of his superb location and freedom from federal and state taxes. Foxwoods grew within a few years to include 24 restaurants, three hotels, 17 shops, a golf course, a state-of-the-art Pequot museum, and profits of more than $1 billion per year.

Eager to grow his tribe as well as its business, Hayward invited Narragansett Indians from Rhode Island, whose bloodlines were intertwined with the Pequots' in complex and ancient ways, to move onto the reservation. (Under federal law, each tribe gets to determine who its members are.) Hayward offered each of them the same share of Foxwoods's profits as his own Pequot family members received: free housing, day care, health care, and college tuition, along with an annual dividend of $50,000 and, if they wanted it, a job in the business. Needless to say, the Narragansetts accepted this invitation in great numbers, swelling the tribe's ranks from a few dozen to roughly 650. It wasn't long, however, before the Narragansetts' leader, Kenneth Reels, did a head count and decided he had the votes to unseat Hayward as tribal chairman. Both Eisler and Benedict regard Reels as small-minded and thuggish, but neither of them doubts his ability to count votes. He was elected in 1998.
The tribes in New Mexico and Oklahoma that Mason studied have had more difficult--and more typical--political experiences with their state governments. In both states, tribes not only had to act as sovereign governments pursuing their right to open casinos; they also had to act like interest groups. In the early 1990s, when Bruce King, New Mexico's Democratic governor, stonewalled a number of compact-seeking Pueblo tribes on the grounds that the only casino gambling his state allowed was the occasional charitable Las Vegas Night, the state's mostly Democratic tribes united in support of King's Republican opponent in the 1994 election. The Republican, Gary Johnson, won--thanks in no small measure to the tribes' $189,000 in campaign contributions. He quickly authorized them to open casinos on their lands. When the state supreme court ruled that the governor had exceeded his authority by not getting the legislature to approve the casino compacts, the tribes again pooled their resources, hired lobbyists, waged a paid media campaign, bought 10 of 40 tables at the Democratic legislators' annual fundraiser, and secured a new compact with bipartisan support that met the court's approval. Ten tribal casinos were soon up and running.
Mason's theme that sovereignty without politics isn't enough is illustrated in his chapter on Oklahoma. Oklahoma's tribes had almost as good a legal claim as New Mexico's to get into the casino business. What's more, Oklahoma has the largest Indian population in the country--more than 260,000 in a state of only 3.4 million people. But Oklahoma's 39 tribes remained disunited by long-standing feuds and rivalries, and a hostile state government was able to tie them in legal and political knots.

Mason's work certainly has the most intellectual heft of anything yet published on tribal gambling, but it also suffers a bit in its transmutation from dissertation to book. Sentences like the following are all too common: "While Henschen and Sidlow made a contribution to the field much beyond Kingdon, they did not go far enough in their own research." Well, yeah, and if you turn back 200 pages to the only other references in the book to Henschen, Sidlow, or Kingdon, you can sort of figure out what he means. But that's a bit much to ask of the reader.

Liveliness of presentation is certainly not the problem with either of the books about Foxwoods. Both Eisler and Benedict have written page-turners. They each tell essentially the same story, with Skip Hayward as the main character. For Eisler, Hayward is a lovable rogue. For Benedict, he's a scam artist. Both authors agree, however, that Hayward's recent toppling by Kenneth Reels threatens disaster for the Pequots. Reels won by promising million-dollar salaries to tribal council members and higher "dividend payments" to every member of the tribe, a pledge that has required the tribal government to squander much of its casino profits and borrow heavily.

Unfortunately, none of these books makes much of the Supreme Court's 1996 decision in <i>Seminole Tribe v. Florida.</i> In that case, the Court invoked the 11th Amendment to invalidate the section of IGRA that allows tribes to sue states that refuse to negotiate casino compacts--the very provision that gave the Pequots the legal leverage they needed to open Foxwoods. Stonewalling states are now virtually immune to legal challenges.

Politics, however, can still clear a path to additional tribal casinos. The tribes that got their casinos before the Court shut the legal door have plowed some of their profits into lobbying, advertising, and campaign contributions in order to protect their position. Hayward, for example, was identified by the Democratic National Committee as early as 1993 as one of the party's top 10 donors, and his continuing six-figure donations made him a repeat guest at Bill Clinton's White House coffees.

California tribes demonstrated in 2000 that sometimes states can be convinced to legalize Indian casino gambling of their own volition. With the support of Democratic Governor Gray Davis (and a campaign war chest in the tens of millions), the tribes played on Anglo guilt to persuade the state's voters to pass a proposition granting tribes exclusive authority to offer casino gambling in California. By this time next year, an estimated 40,000 to 100,000 tribe-owned slot machines are forecast to be up and clanging. Foxwoods--and Las Vegas--look to your laurels.
Michael Nelson | November 5, 2001

Have the People Spoken?
http://prospect.org/article/have-people-spoken

Michael Nelson | November 5, 2001

The President as Potentate
http://prospect.org/article/president-potentate
<div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even"><blockquote><p>
<b>President Nixon: Alone in the White House</b><br />
By Richard Reeves. Simon and Schuster, 672 pages, $28.00<br /></p><p><br /><b>No Peace, No Honor: Nixon, Kissinger, and Betrayal in Vietnam</b> By Larry Berman. Free Press, 334 pages, $27.50
</p></blockquote>
Liberals may wish it weren't so, but the last president not to give a let's-rein-in-big-government State of the Union address was Richard Nixon. Bill Clinton's most memorable line from a State of the Union was "the era of big government is over," and Jimmy Carter's was his charge that "the government has almost become like a foreign country, so strange and distant." All the other presidents since Nixon--Gerald Ford, Ronald Reagan, and the two George Bushes--have been conservative Republicans who slam big government for sport.

Nixon's 1971 address to Congress (and a prime-time national television audience) was different. He used it to outline the "great goals" of his "New American Revolution," which included welfare reform, "full prosperity in peacetime," restoring and enhancing the natural environment, and "improving health care and making it available more fairly to more people." By welfare reform, Nixon meant a modest guaranteed income for the poor, not shooing people off the rolls. By improved health care, he meant something like universal medical coverage.

Nixon lent his support to other liberal measures, too; Stewart Alsop began calling him "President Liberal" in his <i>Newsweek</i> column. Nixon approved the Philadelphia Plan, a quotas-based approach to increasing the number of blacks working on federally assisted construction projects. He explicitly embraced Keynesian economics and proposed "full employment" budgets--that is, budgets that would intentionally run deficits until the workforce was fully employed. In August 1971, Nixon imposed wage-and-price controls on virtually every product, service, and occupation in the American economy. He urged Congress to pass clean-air and clean-water legislation and created the Environmental Protection Agency. He signed the 1971 Federal Election Campaign Act and the 1972 bill that not only raised Social Security benefits by 20 percent but also indexed them to annual changes in the cost of living.

Richard Nixon, liberal icon? Hardly. As Richard Reeves shows in <i>President Nixon: Alone in the White House,</i> Nixon cared about two things and two things only: getting re-elected and foreign policy. Domestic policy was uninteresting to Nixon; he privately dismissed it as "building outhouses in Peoria." To the extent that he paid attention to the home front, he did so either to strengthen his hand in foreign policy or to help secure his re-election.

Nixon's foreign-policy preoccupations sometimes took him to the left on domestic matters, sometimes to the right. His main goal, Reeves argues, was to convince the Chinese, the Soviets, and the North Vietnamese that domestic disorder had not weakened the resolve, military or diplomatic, of the United States. One way of accomplishing that goal was to numb the roots of urban race riots and student protests; hence the proposal (dubbed the Family Assistance Plan) for a guaranteed income and the creation of a draft lottery and, later, of an all-volunteer army. Another way was to crack down on leaks that revealed the extent of disagreement within his administration. Perversely, however, it was the leaking of the so-called Pentagon Papers--which made John F. Kennedy and Lyndon B. Johnson (but not Nixon) look bad--that provoked Nixon to create the White House "plumbers unit," whose venal bungling helped bring down his presidency.

Reeves shows that when it came to Nixon's second concern--getting re-elected--the president was enormously impressed by Richard Scammon and Ben Wattenberg's 1970 book <i>The Real Majority.</i> Scammon and Wattenberg argued that the Republican and Democratic Parties were evenly matched because the Democrats owned the "Economic Issue" (actually a congeries of issues such as Social Security, the environment, and jobs) and the Republicans held title to the "Social Issue" (crime, drugs, and morality). Whichever party could neutralize the other party's attempts to play to its strength would predominate.

The lesson Nixon learned from Scammon and Wattenberg was to run left on economics, especially since the Democrats showed signs of running right on social issues. (The typical Democratic campaign commercial in the 1970 midterm election featured a candidate riding shotgun in a police car.) Nixon's real purpose in supporting popular liberal legislation was to keep the Democrats who dominated Congress from getting all the credit for enacting it. Sometimes, as with the Philadelphia Plan, he saw the added benefit of aggravating the political fault lines within the Democratic Party, in this case by pitting liberals and civil-rights groups against the nearly all-white construction unions.

Why Nixon devoted his career to electoral politics is a mystery that Reeves plumbs but does not pretend to solve. Reeves has been around a lot of politicians in his long and accomplished journalistic career and, as he points out, most of them "are men who can't stand to be alone. Nixon did not like to be with people." At times, Nixon's misanthropy was almost comic, as when he ordered that White House Christmas parties be scheduled for when he was out of town so that he wouldn't have to attend; or when he told his chief of staff to deflect an adoring group of fellow Whittier College alumni into "an Evening at the White House or a church service. . . . This would be much better than a reception for them alone where I would have to get into too much conversation."

But usually Nixon's dislike of people was, well, vile. On the strength of Nixon's own words, Reeves reveals the president's hatred for groups as various as Jews ("they are out to kill us"), civil servants ("they're bastards and they're out to screw us"), and Senate Republicans ("a bunch of jackasses. . . . Fuck the Senate!")--to cite just a few. Shortly after winning 60 percent of the popular vote against Democrat George McGovern in the 1972 election, Nixon told an interviewer that "the average American is just like the child in the family"--that is, "soft" and "spoiled." Within the White House, he decided early in his presidency that "I must build a wall around myself." By July 1969, Reeves shows, three close aides--H.R. Haldeman, John Ehrlichman, and Henry Kissinger--constituted "Nixon's environment." He was constantly writing memos complaining about all the people who wanted to see him, especially his cabinet.

Psychology no doubt takes us partway toward explaining what Nixon was doing in politics.
Reeves cites approvingly <i>The Presidential Character,</i> the 1972 book by political scientist James David Barber, who chronicled Nixon's subconscious need to compensate for low self-esteem by dominating others and predicted that he would bring down his own presidency through politically self-destructive behavior as soon as his hold on power was threatened. More tantalizingly, Reeves quotes from a number of to-do and, more important, to-be lists that Nixon wrote on yellow legal pads in the solitude of his hideaway in the Executive Office Building. These reveal aspirations for nobility of character so different from the reality of who Nixon was as to be poignant. One typical list included these items:
Each day a chance to do something memorable for someone.
Need to be good to do good.
Need for joy, serenity, confidence, inspirational.
Goals: Set example, inspire, instill pride.
Much as Nixon despised politics, he seems to have regarded it as the arena in which he could become something better than he knew himself to be.

But enough psychology. Surely the main reason Nixon sought the presidency so desperately is that he knew politics was the price he'd have to pay to get his hands on the reins of foreign-policy making. Reeves reminds us that Nixon devoutly admired Woodrow Wilson (another noble-minded but psychologically flawed individual, according to Barber and several Wilson biographers) and cared deeply about securing Wilson's great goal of a peaceful world led by the United States. Nixon also enjoyed foreign policy because, in dealing confidentially with other national leaders, he could sidestep Congress, the news media, interest groups, the bureaucracy--even his own secretary of state and secretary of defense--and get away with it.

Nixon especially liked dealing with the leaders of the major communist powers, China and the Soviet Union. In those countries, only a few people made all the decisions and they could keep secrets--Nixon's idea of Utopia. He was never happier than when secretly orchestrating the events that led to his February 1972 trip to China. Reeves quotes Kissinger aide Winston Lord as saying that Nixon "deliberately mirrored adversaries which were secretive. In China, only two or three people were involved in decision making." When Nixon met Mao, he was tickled by the chairman's comment "I like rightists." ("Those on the right can do what those on the left talk about," Nixon replied, neglecting to explain that whenever those on the left tried to do things like open a door to China, those on the right accused them of being comsymps.) And Nixon was so taken by Chou En-lai that, overlooking Chou's bloody career as a communist despot, he urged Kissinger to spin American reporters that "RN has similar character characteristics and background as Chou." He even gave Kissinger a list of nine Nixon-Chou talking points. Number 9 was as typical as it was preposterous: "Steely but . . . subtle and appears almost gentle."

Reeves concedes Nixon a certain measure of strategic brilliance: "His great intellectual strength was connecting the dots, seeing the world whole and from different angles." The China opening, for example, was aimed mostly at the Soviet Union; the idea was that fear of being outflanked by a U.S.-China alliance would spur the Soviets to accede to American goals on nuclear-arms control and other matters. Nixon was right about that. Three months after his trip to China, he traveled to Moscow to collect the Soviets' signature on the Strategic Arms Limitation Treaty, which placed history's first ceiling on nuclear deployment. When Nixon pressed Congress to approve a missile defense system, he had no doubt what its real purpose was: to give the United States a bargaining chip that he could negotiate away in return for concessions on Soviet nuclear submarines. Nixon was also shrewd enough to know that the Soviets would be inclined to embrace anything that enabled them to divert money from weapons development into their feeble civilian economy.

In view of Nixon's keen insights into China's and the Soviet Union's strategic and economic incentives to cooperate with the United States, it's hard to understand his conviction that unless he continued to prop up the South Vietnamese government of Nguyen Van Thieu, the entire structure of world peace that he was orchestrating would collapse. Yet, as political scientist Larry Berman shows in <i>No Peace, No Honor: Nixon, Kissinger, and Betrayal in Vietnam,</i> it was Nixon's blind spot concerning Vietnam that led him astray, at enormous cost to both the United States and, especially, its sad ally.

Nixon's sharp focus on the communist superpowers clouded his perception of the essentially local realities of the Vietnam War. Surely, Nixon reasoned, North Vietnam was as much a pawn of the Soviets and Chinese as South Vietnam was of the United States, so pressure from the communist superpowers would force their client to negotiate a withdrawal from the south. But, as Berman points out, the dominant historical memory of North Vietnam's leaders was of the Geneva peace conference of 1954, where they had given up land they had won from the French on the battlefield in response to diplomatic pressure from the Soviet Union and China. Ho Chi Minh and all of his ruling colleagues were from North Vietnam's own "greatest generation"; they had no intention of making the same mistake again. Indeed, Nixon's trip to China actually reduced Chinese influence over North Vietnam because it suggested that China now cared mostly about its relationship with the United States.

A lesson Nixon learned from his dealings with the leaders of the Soviet Union and China--namely, that they enjoyed negotiating different terms in secret than they were proclaiming in public--led him further astray in dealing with North Vietnam. Time and again, Berman shows, Kissinger would go into a secret meeting with North Vietnamese negotiator Le Duc Tho with a let's-deal attitude and find that Tho's private negotiating position was the same as his public one. In session after session, Tho would merrily (and perceptively) invoke the most recent evidence of congressional and public opposition to the war, take note of the latest reduction in the number of American troops that Nixon's policy of "Vietnamization" had left in the south, and then place the same old demand on the table: The United States must withdraw if it hoped to see its prisoners of war.

Nixon and Kissinger's response was to accuse the North Vietnamese of bad faith, launch occasional and massive bombing campaigns, and state their renewed resolve to settle for nothing less than "peace with honor." This preoccupation with honor was their mistake all along, like a boy fighting on the playground over nothing because he thinks he has to do it to impress the girls. In truth, the girls think he's stupid, just as the Soviets and Chinese thought Nixon's persistence in a losing war that he'd inherited from his predecessors was stupid. As for the North Vietnamese reaction to Nixon and Kissinger's bluster at the negotiating table, Berman unmasks it as so much hemming and hawing for all the effect it had.
At the end of the day, over Thieu's strenuous but futile objections (the "betrayal" in Berman's subtitle is of Thieu), Nixon agreed in January 1973 to a cease-fire in place: that is, a cease-fire that allowed North Vietnam to keep its 150,000 soldiers in South Vietnam when the United States withdrew. The terms were essentially the same--and in several cases, word-for-word the same--as those offered to the United States in 1969. They ensured that South Vietnam would fall to the communists. Not surprisingly, on the night the peace agreement was announced, there was rejoicing in the streets of Hanoi, silence in the streets of Saigon.<br /></p><p><br /></p><p><span class="dropcap">C</span>learly, the bar is high for any author who seeks to add to the already crowded shelves of books on Nixon and Vietnam. Both Reeves and Berman have cleared that bar easily by approaching their overlapping subjects in distinctively illuminating ways.<br /></p><p><br /></p><p>Berman, the author of two books about U.S. policy during the earlier stages of the war (<i>Planning a Tragedy</i> and <i>Lyndon Johnson's War</i>), is renowned among Vietnam scholars for his documents-based research, and in <i>No Peace, No Honor</i> he has uncovered much new documentary evidence. Although Nixon and Kissinger labored mightily and for the most part successfully to keep scholars out of their papers, Berman has ingeniously drawn on sources as varied as the notes Kissinger's assistants took at the peace talks and the transcripts of both the public and the secret negotiating sessions kept by the Vietnamese. Berman's only flaws are occasional lapses into breathless "now the real story can be told" prose and a tendency to see everything through the lens of his subject. (Worst example: "There very likely would have been no Watergate if not for Vietnam.")<br /></p><p><br /></p><p>Reeves's book has its own, equally minor defects. Inexplicably (unless he just ran out of gas), Reeves wraps up his story on April 30, 1973, with more than 15 months--nearly a quarter--of Nixon's presidency left to go. Is there a second volume in the works? One hopes so, but Reeves gives no indication that there is.<br /></p><p><br /></p><p>What makes <i>President Nixon: Alone in the White House</i> so good is that Reeves has once again executed with great skill the approach that he first used in his 1993 book <i>President Kennedy: Profile of Power</i>. Reeves took as his purpose "to reconstruct the Nixon presidency as it looked from the center" by uncovering "what he knew and when he knew it, sometimes hour by hour, sometimes minute by minute." As in the Kennedy book, Reeves "hoped to get close to knowing what it was like to be president."<br /></p><p><br /></p><p>One thing Reeves learned by following the calendar and the clock so closely is that noble and venal deeds not only can spring from the same person but often can do so at the same time. On May 28, 1972, for example, the very day that Nixon was negotiating the nuclear stand-down in Moscow, his henchmen were planting bugs in the Watergate offices of the Democratic National Committee.<br /></p><p><br /></p><p>Another important insight Reeves derived from his relentlessly chronological approach to Kennedy and Nixon is that neither man changed very much during the course of his presidency. "The office makes the man" is a familiar and comforting nostrum, supported for years by the claims of Kennedy biographers that their adored hero grew tremendously during his thousand days as president. Not so, argues Reeves.
Kennedy left the presidency as he entered it: a cool, dispassionate leader who "did not know what he was doing at the beginning, and in some ways never changed at all," a man who "substituted intelligence for ideas or idealism, questions for answers."<br /></p><p><br /></p><p>A reassuring corollary to this insight is that neither does the office destroy the person, as some presidential scholars fretted in the aftermath of the Johnson and Nixon presidencies. They looked at Johnson, the most effective Senate leader ever, and at Nixon, the most resilient politician in history, and wondered: What did the office do to these men that their presidencies should end in political catastrophe? President Nixon, Reeves shows, was just Richard Nixon with a title--a man alone, full of noble aspirations and riddled with deep insecurities, and, like Kennedy, preoccupied with power in ways that made him neither consistently liberal, consistently conservative, nor consistently anything else.<br /></p>
</div></div></div>Wed, 17 Oct 2001 19:03:38 +0000 142256 at http://prospect.org Michael Nelson