# Democracy-Proof

By George Scialabba | The American Prospect, June 18, 2002

> **How Democratic Is the American Constitution?** By Robert Dahl. Yale University Press, 198 pages, $19.95
In *The Frozen Republic: How the Constitution Is Paralyzing Democracy* (1996), Daniel Lazare points out that the U.S. Constitution was adopted unconstitutionally. The Articles of Confederation, our first governing compact, contained a provision that any amendment would require the consent of all 13 states. Yet the Articles were supplanted without unanimous consent of the states. That's because Article VII of the new Constitution provided that *it* would take effect if ratified by only nine of the 13 states. Wasn't Article VII therefore in violation of the original governing document?

In Federalist 40, Madison dismissed this objection. It would be "absurd," he declared, to "subject the fate of twelve States to the perverseness or corruption of a thirteenth." Surely this was obvious to "every citizen who has felt for the wounded honor and prosperity of his country." End of discussion. So much for the original intention of those who framed the Articles of Confederation.
The new Constitution contained an equally "absurd" provision. According to Article V, "no State, without its Consent, shall be deprived of its equal Suffrage in the Senate." Each state, regardless of population, was to have two senators. As a result, two centuries later half the U.S. population sends 18 senators to Washington while the other half sends 82. Twenty senators represent 54 percent of the population; another 20 represent less than 3 percent. California gets two senators; the 20 least populous states, which combined have roughly the same number of people as California, get 40 senators. Senators elected by 11 percent of the population can kill proposed legislation with a filibuster; senators elected by as little as 5 percent of the population can block a constitutional amendment. For two centuries this blatantly undemocratic institution -- "perhaps the most unrepresentative legislative body in the world," Lazare observes -- has survived without serious challenge, thanks partly to elite (especially slaveholder) self-interest and partly to popular Constitution-worship. But by the logic of Federalist 40, a people in earnest about equal representation for all, and therefore determined to reform the Senate, ought not be obstructed.
The composition of the Senate is not the only undemocratic feature of the Constitution, as Robert Dahl reminds us in *How Democratic Is the American Constitution?* Of the others, the most flagrant is the electoral college. There is nothing to be said for this institution. It has no other purpose or result than to frustrate equal representation for all citizens, and its effect on our political history has been calamitous. It was rejected several times at the Constitutional Convention until it slipped by on a last-minute vote. It has never functioned as intended, i.e., as a deliberative body. Within a dozen years it had caused a constitutional crisis (the deadlocked election of 1800). Several decades later, after another deadlocked election in 1876, electoral-college horse-trading resulted in the abandonment of federal efforts to enforce civil rights in the South. In four presidential elections, including the last one, the candidate with the greatest number of popular votes was not chosen as president. Overwhelming majorities regularly tell pollsters that the electoral college should be abolished. Seven hundred proposals to reform or abolish it have been introduced in the House, the most recent of which, in 1969, passed with an 83 percent majority. As always, the Senate blocked any action. *Quo usque tandem?*
*How Democratic Is the American Constitution?* is a short book, not only because Dahl is a masterly expositor but also because the case against Constitution-worship is not very difficult to make. To begin with, the early republic did not worship it. The framers were a gifted and experienced group -- though some, such as Alexander Hamilton and Gouverneur Morris, were not particularly well-disposed toward democracy. But divisions among them were sharp: Even Hamilton, for example, said that giving each state the same number of senators "shocks too much the ideas of justice and every human feeling." And some of the most eminent among them, such as Elbridge Gerry and Edmund Randolph, refused to sign or signed only with grave reservations. The debate in the country over ratification was extremely vigorous (see the two splendid Library of America volumes on the subject). The decision was not made by popular vote but by elected delegates, more than a third of whom voted against ratification. In short, our forebears did not in the least regard the Constitution as an inspired deliverance from heaven.
Moreover, there were some distinguished second thoughts. Conservatives endlessly cite Madison's Federalist 10 on the dangers of faction and the need to curb popular majorities. But as Dahl points out, Madison soon reconsidered. Within a few years he was writing in an anti-Federalist journal that "in every political society, parties are unavoidable" ("a natural offspring of Freedom," as he put it still later), and that political competition could be made fairer "by withholding unnecessary opportunities from a few to increase the inequality of property by an immoderate, and especially an unmerited, accumulation of riches," and "by the silent operation of the laws, which, without violating the rights of property, reduce extreme wealth towards a state of mediocrity and raise extreme indigence towards a state of comfort." Madison the radical!
Other democracies do not particularly admire our Constitution, at least to the extent of imitating it. In a chapter titled "The Constitution as a Model: An American Illusion," Dahl notes that "among the countries most comparable to the United States" -- he lists 22 -- "and where democratic institutions have long existed without breakdown, not one has adopted our American constitutional system." Our combination of an executive branch independent of the legislature, a "first-past-the-post" electoral system that practically rules out third parties and coalition governments, extensive judicial review of federal legislative enactments, and strong bicameralism with highly unequal representation in the upper chamber is unique.
So ours is an inefficient and undemocratic system. The Senate and the electoral college merit no further discussion. First-past-the-post, or strictly majoritarian, elections are also plainly unfair. In theory, at least, a party that gained a one-vote plurality in every election district would win 100 percent of the seats in the legislature -- an obvious absurdity. In practice, voters know that a vote for any except the two major parties is likely to be "wasted," producing no representation. This is, of course, convenient for the two major parties, but it leaves some (possibly many) voters unrepresented. A proportional system in which each party's legislative membership corresponds to its percentage of votes received would reflect the popular will more accurately without, Dahl contends, any loss of effectiveness. All the other established democracies except Canada and (for the time being) the United Kingdom have a proportional, or "consensus," system rather than a majoritarian one.
The steady growth of presidential powers is also an American exception, Dahl contends. Actually, the U.S. government was not designed to have such a powerful chief executive. For all the talk then and now about the separation of powers, Dahl writes, those who framed and ratified the Constitution believed that "the only legitimate representative of the popular will was the Congress, not the president." The "myth of the presidential mandate" is a subsequent creation. Policy -- including foreign policy -- was (is) supposed to issue from the deliberations of a body of elected lawmakers, not from one elected chief administrator in consultation with his appointees. No other mature democracy, Dahl points out, has a "single popularly elected chief executive with important constitutional powers."
Regardless, the main question remains: Does our constitutional system at least work well for us? Dahl is skeptical, though he has to acknowledge the uncertain relevance of political arrangements to social and economic indicators, as well as the difficulty of comparing countries that differ in size and homogeneity. There's not much data in the book, although its references are helpful on this score. From the work of Arend Lijphart and other social scientists Dahl cites, it is clear, at any rate, that majoritarian democracies such as ours do not generally outperform consensus democracies on such measures as voter satisfaction, accountability, macroeconomic management, or the control of violence.
In any case, Dahl has not come to bury the Constitution, only to undermine complacency about it. Besides, as he acknowledges in the book's sobering conclusion, there's not much we can do. The Constitution is virtually democracy-proof. A Supreme Court that promulgates and upholds a *Buckley v. Valeo* (not to mention a *Bush v. Gore*) is all too likely to find constitutional problems with any serious move in the direction of popular sovereignty. It is hard to imagine a Congress unified and determined enough to reassert its primacy over the executive branch. And however indefensible, the Senate in its present form is here to stay.
For all these reasons, Dahl avows a "measured pessimism" about the prospects for a more democratic political system any time soon. The only hope -- a long-term one -- is to help along the evolution of a more democratic political culture. How? By trying to "reduce the vast inequalities in the existing distribution of political resources." I presume that by "resources" he means information, experience, and money. Unfortunately this intriguing suggestion comes in the book's penultimate paragraph and receives no elaboration. That is disappointing; but then Dahl has spent much of his career elaborating it in other books, many of them as valuable as this one.
During that long career (he is now 87), Dahl has received nearly every accolade for which a political scientist is eligible. I can't forbear adding my mite of praise to the heap. Dahl's work seems to me an admirable, even inspiring blend of normative and analytical, citizenly and scholarly, generous and disinterested. *How Democratic Is the American Constitution?*, along with his other books, such as *Democracy and Its Critics* and *A Preface to Economic Democracy*, will continue for quite a while to remind the rest of us, gently but persistently, that our professed ideal of democratic equality requires a good deal more in the way of practice than we seem to have noticed.
# Our Posthuman Future

By George Scialabba | The American Prospect, May 21, 2002

As many astute observers have pointed out, controversial new ideas are assimilated in three stages. First they're false and pernicious, then they're true but trivial, and finally they're what everyone claims to have believed all along. I see nothing to disprove this time-tested formula in the case of Francis Fukuyama's thesis about the "end of history." According to Fukuyama, the evolution of social structure has come to a natural terminus in the combination of free markets and liberal democracy. Though once we scoffed, I'm sure that now, as long as everyone else is willing to accept a few modest qualifications -- markets require vigilant and impartial regulation, periodic free elections are a necessary but far from sufficient condition of robust democracy, unequal distribution of economic power can and usually does translate into unequal distribution of political power, present-day levels of solidarity and selfishness are not eternally fixed -- my fellow democratic socialists will join me in graciously acknowledging that Fukuyama's thesis is just what we've always had in mind, or close enough.
Qualifications, however, are for earthbound thinkers; Fukuyama has moved on. In *Our Posthuman Future*, he has looked past the end of history and descried the end of mankind. Markets and democracy may be the last word for the human nature of 2002, but what if human nature changes?
Until recently, this wouldn't have seemed to most people a compelling, or even an intelligible, question. But the last several decades have seen astonishing progress in the life sciences. Evolutionary psychology has supplemented or superseded Plato, Freud, and Skinner; molecular biology, meanwhile, seems about to usurp the prerogatives of an even more august personage, sometimes known as the Creator. The technology of Aldous Huxley's *Brave New World* is no longer fantastic or even remote. Of course no one currently wants to end up there, or anywhere similar. But could it happen? And if so, how do we prevent it?
The first part of *Our Posthuman Future* is an informative survey of contemporary bioscience and its political implications. It's no longer much disputed, Fukuyama writes, that heredity is involved to a nontrivial extent in determining intelligence, criminal behavior, and secondary sexual differences. (Homosexuality remains an open question.) It's equally certain that environmental factors such as diet, education, peers, parents, and social conventions also play an important role. A few diseases, such as cystic fibrosis, can be traced to a single gene, but "higher-level behaviors ... are likely to have far more complex genetic roots, being the product of multiple genes that interact both with each other and with the environment." The findings of the Human Genome Project -- that we have fewer genes than previously thought -- make this even more likely. Still, "it seems almost inevitable that we will know much more about genetic causation even if we never fully understand how behavior is formed."
Neuropharmacology is one possible Huxleyan technology. Though less than 15 years old, serotonin reuptake inhibitors (Prozac, Zoloft, Paxil) have been administered as antidepressants to a sizeable fraction of the population. So has Ritalin, a central nervous system stimulant that increases concentration and curbs hyperactivity in children. These are valuable drugs, but worrisome ones, too. Conservatives worry that opportunities for self-discipline and character development may be lost. Liberals worry that real-world causes of unhappiness and restlessness may be obscured.
Like the relief of emotional pain, the prolongation of life can be too dearly bought. Throughout the developed world, as life expectancy rises and birth rates fall, the median age is increasing sharply. Biomedical technology will probably accentuate this trend. Yet unless ways are found not only to postpone death but to extend the prime of life proportionately -- a much more difficult matter -- disturbing changes are likely. Alzheimer's and other incapacitating diseases could become epidemic. Pressures on younger workers and family members could become intolerable. The rhythms of generational succession could be disrupted.
Genetic engineering is the most tempting and troubling of biomedical prospects. Genetic screening of embryos is on the horizon; indeed it's already in use for single-gene diseases. First attempts at human cloning are just over the horizon. But germ-line modification, the alteration of DNA in the fertilized egg, is the real Pandora's box. Here the line between therapy and enhancement is blurred; the way is open to "designer babies" whose sex, IQ, physique, and other traits are predetermined.
Germ-line engineering may well be impossible; the scientific jury is still out. Like computer-based artificial intelligence, it may turn out to be a will-o'-the-wisp, at least as currently conceived. Gene/environment interaction has scarcely begun to be understood. The emergence of complexity -- in this case, of phenomes from genomes -- may be beyond the grasp of any information-processing technology, present or future. There may be no way to find out whether a particular genetic intervention has unanticipated lethal consequences, except by performing experiments that violate the right of informed consent. Fukuyama is rightly cautious in accepting claims about technical feasibility; in my (admittedly less well-informed) opinion, even stronger skepticism is warranted.
Still, it's only prudent to give some thought to what we should do if genetic engineering turns out to be feasible. The middle of *Our Posthuman Future* is an effort to lay ethical foundations for policy judgments. Why shouldn't we "seize the power," as one geneticist has put it, to "control what has been left to chance in the past"?
The simplest answer is religious: Genetic engineering would set God's will at naught. But Fukuyama respectfully sidesteps religion. Another possible answer is that genetic manipulation unfairly (because irreversibly) "embeds one generation's social preferences" -- about temperament, personality, looks -- "in the next." (Though couldn't such changes be reversed in the generation after next?) Another reason has to do with class and equality: Won't genetic engineering increase inequality if only the rich can afford it? (Perhaps, though eventually, Fukuyama contends, political pressure would force democratic governments to subsidize it for all.) The most compelling reason, I think, is ecological: The human organism is a miniature ecosystem, and "ecosystems are interconnected wholes whose complexity we frequently don't understand; building a dam or introducing a plant monoculture into an area disrupts unseen relationships and destroys the system's balance in totally unanticipated ways." (One hopes Fukuyama's admirers on *The Wall Street Journal* editorial page heard that.)
None of these reasons fully satisfies Fukuyama, however. He must have philosophical underpinnings. So we get an elaborate argument, over several chapters, to the effect that yes, there is a human nature, extreme social constructionists notwithstanding. It is the only plausible foundation for human rights and political equality and, therefore, we had better not tamper with it. Ultimately, "we want to protect the full range of our complex, evolved natures" -- our souls -- "against attempts at self-modification," attempts that may simplify us, blunting our suffering or elevating our IQ but robbing us of emotional depths or imaginative heights.
After this philosophical flight, the book descends to policy. Though skeptical of most governmental regulation, especially at the international level, Fukuyama recognizes that the biotechnology industry has "too many commercial interests chasing too much money for self-regulation to continue to work." Actually, "continue" is wrong: Self-regulation has not worked. Monsanto, Aventis, Eli Lilly, and many other pharmaceutical and agro-technical giants have acted unwisely or unethically, then bought, bullied, bamboozled, or otherwise successfully neutralized critics and government regulators. But while Fukuyama seems unaware of how frequently industry self-discipline has failed in similar cases, he is at least ready to acknowledge that it will likely fail in the case of human biotechnology.
What kind of regulatory regime, then, should exist, and what issues should it address? The FDA, EPA, USDA, and NIH can't do the job, Fukuyama argues persuasively. Human biotechnology is outside their mandate and beyond their competence. A new agency with enforcement powers is needed, perhaps modeled on Britain's Human Fertilisation and Embryology Authority. As other states follow suit, varying national policies can be gradually harmonized.
America's national policy, Fukuyama suggests, should roughly be: yes to therapy, no (or at best maybe) to enhancement. Stem-cell research and preimplantation screening would be presumptively legitimate when aimed at inherited diseases but doubtful when aimed at selecting traits or improving performance. Germ-line engineering, with its drastic and open-ended consequences, would be very doubtful. Human-reproductive cloning would be banned outright. These are wise suggestions. As Fukuyama observes: "The original purpose of medicine is, after all, to heal the sick, not to turn healthy people into gods."
Actually, if that observation were applied to politics as well, the question of genetic engineering might appear still more dubious. It might seem that productive as well as medical resources are better directed to meeting basic human needs than to expanding the opportunities of the comparatively healthy and wealthy. The mere possibility of designer babies for some, while others in the same society, or in other societies, can't afford proper care for *their* babies, might seem ... unnatural.
"Our greatest social philosopher," as his publicists call him, is not, alas, our most acute social critic. One good reason, not mentioned in *Our Posthuman Future*, for skepticism about biotechnological interventions is the "economy of means" principle. The right-wing version of this principle: Just say no. The left-wing version: Expensive, complicated, and dangerous solutions are usually more profitable but less effective. Safe, simple, effective, and relatively cheap ways to increase intelligence and longevity in the American population include mounting the equivalent of the antismoking educational campaign against the high levels of fat, sugar, and salt in our national diet; encouraging physical activity, for example by forcing ourselves, through gasoline taxes and urban rezoning, to walk more, if only to get to public transportation; implementing national health insurance, or at least universal free health care for children under five; and subsidizing public radio and television. Reducing economic insecurity and environmental degradation also offers virtually unlimited opportunities for enhancing our beleaguered selves. But these policies won't make investors a lot of money -- on the contrary -- while pharmaceuticals and genomics will.
The best reason of all not to press forward into the posthuman future also goes unmentioned in this book. It's that the enormous resources required could be put to much better use helping the many people who do not now enjoy a human present. According to the United Nations, roughly three billion people do not have access to safe sewers, 1.5 billion lack clean water, 1.25 billion don't have minimally adequate housing, one billion have no health care, and a half-billion don't get the minimum daily caloric requirement. An unknown but very large number are illiterate. Among the poorest fifth of the world's population, between 30,000 and 40,000 children under five die each day from the effects of malnutrition and infection; another 180 million barely hang on. The UN estimates that all these needs could be met, at a basic level, for a yearly expenditure equal to 10 percent of the recently proposed U.S. military budget -- or slightly less than Americans and Europeans spend annually on pet food and ice cream.
We -- the fortunate "we" of Fukuyama's title -- already inhabit a different world than these hapless billions. Their "human dignity" is moot; their "complex, evolved natures" don't have much scope for expression. It's understandable that the biotech industry and the political class take no notice of these nonvoting, nonconsuming, noninvesting multitudes. But shouldn't our greatest social philosopher have thought to put in a word for them?
# The Control of Ideas

By George Scialabba | The American Prospect, January 10, 2002

> **The Future of Ideas: The Fate of the Commons in a Connected World** By Lawrence Lessig. Random House, 352 pages, $30.00
>
> **Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity** By Siva Vaidhyanathan. New York University Press, 243 pages, $27.95
<p><span class="dropcap">A</span> specter is haunting culture: the specter of intellectual-property law. Soon every embodiment, however ephemeral, of thought or imagination may be defined as a "product," its every use commercially controlled. Thanks to digital technology, as Lawrence Lessig pointed out in his <i>Code and Other Laws of Cyberspace</i> (1999), a book is potentially no longer just a book. An online publisher, for example, will be able to specify </p>
> whether you could read the book once or one hundred times; whether you could cut and paste from it or simply read it without copying; whether you could send it as an attached document to a friend or simply keep it on your machine; whether you could delete it or not; whether you could use it in another work, for another purpose, or not; whether you could simply have it on your shelf or have it and use it as well.
Similar restrictions will apply to compact discs, videos, Web sites, databases, software applications, and anything else that is encoded or transmitted digitally. All of them will be packaged with programs that can monitor, and then either charge for or block, every imaginable use.
Why is this "propertization" of culture disturbing? One reason is its possible effect on equality. The present unmetered World Wide Web is analogous to the public library and broadcasting systems, which at least in principle foster social equality by making cultural resources available on equal terms to the rich and nonrich. Another reason is its effect on community. To adapt Cass Sunstein's perspective in [*Republic.com*](http://www.republic.com), increasingly individualized consumption packages may well increase cultural fragmentation.
In his new book, *The Future of Ideas*, Lessig emphasizes another reason: the likely ill effects on innovation. Whatever other harm the new law of intellectual property may do, he warns, it will probably also chill the remarkable creativity associated with the pre-commercial Internet. Up to now, new ways of connecting to the Net, communicating on the Net, and, perhaps most important, distributing art, ideas, and information across the Net have been devised at an amazing rate. By and large, this was not done for profit.
Consider the "architecture" of the Internet. The GNU/Linux operating system (the world's fastest-growing and a serious rival to Unix), the Apache server (two-thirds of all servers on the Web), the Perl programming language, the BIND (Berkeley Internet Name Domain) system, the "sendmail" program (which runs on 75 percent of all servers), and the protocols of the World Wide Web -- "these projects," Lessig writes, "together constitute the soul of the Internet." They *are* the revolution. All of them were developed as, and remain, "open code" projects. That is, the code in which they are written is unowned or is governed by a "General Public License," a permissive form of copyright that allows anyone to modify the code, provided he or she makes those modifications available free to everyone else. The code that enables the Internet is thus common property. It is, to use a traditional term that Lessig adapts to cyberspace with extraordinary rigor and originality, a "commons."
A commons is a resource that is available to everyone (or everyone in some community) without permission. The term will be familiar to many readers from Garrett Hardin's well-known argument about the "tragedy of the commons." To take Hardin's example: If a pasture is held in common, the benefits of adding to one's herd will accrue to oneself, while the costs will be shared. The result is overgrazing and a ruined pasture. The solution is exclusive property rights.
This little parable has played a large part in forming contemporary intuitions about political economy. The belief that private control almost invariably produces the most efficient use of scarce resources is part of the common sense of market societies and is regularly invoked in order to oppose state regulation or public ownership. But as Lessig makes clear, this maxim does not apply straightforwardly to intellectual resources -- a point that media studies professor Siva Vaidhyanathan also makes in *Copyrights and Copywrongs*.
Pasture is what economists call a "rivalrous" resource. One person's (or cow's) consumption leaves less for others. Culture is a nonrivalrous resource. One person's consumption leaves no less for others. Rivalrous resources can be depleted; nonrivalrous resources cannot. From the point of view of efficiency, it follows that different kinds of property rights should govern the two kinds of resources. In Lessig's formulation:
> If the resource is rivalrous, then a system of control is needed to assure that the resource is not depleted -- which means the system must assure that the resource is both produced and not overused. If the resource is nonrivalrous, then a system of control is needed simply to assure that the resource is created... . Once it is created, there is no danger that the resource will be depleted. By definition, a nonrivalrous resource cannot be used up.
With a nonrivalrous resource, one can have a commons without the tragedy. In *Code* and in *The Future of Ideas*, Lessig shows at great length that the precommercial Internet was the site of much rapid and fruitful innovation precisely because it was a commons. And he shows at even greater length that the evolution of intellectual-property law -- driven by corporate leviathans and their lawyer-gnomes, and articulated by free-market ideologues on the judicial bench -- is drastically changing the open character of the Internet, enclosing the commons.
What is the character of the Internet, and what does it have to do with innovation? "How the Internet was designed," Lessig claims, "intimately affected the freedoms it has enabled... . And arguably no principle of network architecture has been more important to the success of the Internet than this single principle of network design -- e2e." The term means "end-to-end." Between the edges, or ends, of the network (that is, individual users) was a simple, neutral data-transport system that would run whatever new applications were programmed in at the ends, no matter who owned the wires and machines in the middle. This meant that anyone could invent and distribute new applications or modify existing ones. And a great many people -- scruffy graduate students, lowly coders, bored engineers, and scientists diddling around on company time -- did. GNU/Linux, Apache, the Internet and Web protocols, and so on, were collective achievements.
This end-to-end "architecture of freedom" guaranteed progress but not profits. So, large, vertically integrated companies are moving to substitute an architecture of control, breaking up end-to-end by "layering onto the original code layer of the Internet new technologies that facilitate greater discrimination, and hence control, over the content and applications that can run on the Net." Microsoft has used its control over Windows' source code to prevent Windows users from switching to non-Microsoft Internet browsers, like Netscape Navigator. AOL Time Warner hopes to combine its vast resources of broadband cable pipe, network access, and content in ways that will handicap competitors and box in customers. Cable companies that provide Internet access artificially limit "video streaming," since streamed video is a potential competitor of cable programming. Freelancers compiled wonderfully comprehensive new archives of popular music that stimulated much musical experimentation among people without access to expensive equipment. But lawyers for the recording industry have pretty much choked off such unfettered innovation.
Besides these strategies for heading off novel technical uses of digital media, there are also powerful new surveillance techniques for clamping down on even the most casual uses of content. Intellectual-property advocates "obsess about the ability for content to be 'stolen,'" Lessig notes, "but we must also keep in view the potential for use to be more perfectly controlled." For example, computer programs called "bots" scan the Web for copyrighted content and report back to their corporate masters.
> The pattern here is extremely common. Copyright holders vaguely allege copyright violations; a hosting site, fearing liability and seeking safe harbor, immediately shuts down the site. The examples could be multiplied thousands of times over, and only then would you begin to have a sense of the regime of control that is slowly emerging over content posted by ordinary individuals in cyberspace. Yahoo!, MSN, and AOL have whole departments devoted to the task of taking down "copyrighted" content from any Web site, however popular, simply because the copyright holder demands it. [Bots] find this content; ISPs [Internet service providers] are ordered to remove it; fearing liability, and encouraged by a federal law that gives them immunity if they remove the content quickly, they move quickly to take down the content.
Filmmakers are also hard hit. Evocation by allusion is common to all art. But numerous movies have been held up because the director made passing use of something that someone had, however implausibly, copyrighted. Every image, every melody, every brand name, "every piece of artwork, any piece of furniture or sculpture, has to be cleared before you can use it," a director tells Lessig. Every shot has to go through the studio's legal department. Independent filmmakers, with no studio behind them, have to self-censor or take their chances. "The cost," one of them complains, "is creativity. Suddenly the world you're trying to create is completely generic and void of the elements you would normally [make use of]." As Lessig comments: "This is not a picture of copyrights imperfectly protected; this is a picture of copyright control out of control."
Lessig and Vaidhyanathan tell many such stories, depressingly similar, about the cable-TV, music, film, publishing, and software industries. In each case, new modes of creation and distribution enabled by the Internet threaten the market share of big players. The behemoths respond by "locking up" their products with encryption software, requiring users to sign away even traditionally protected rights of "fair use," using their ownership of source code or of supply-and-distribution networks to marginalize potential competitors, or simply threatening newcomers with ruinously expensive lawsuits. Only federal regulation can preserve the open environment that elicited such remarkable innovation in the recent past. But it won't happen. The courts -- where most federal judges are now appointees of Ronald Reagan or the two George Bushes -- forget to balance private claims against the public interest and instead give the behemoths what they want. Legislators, intensively lobbied and campaign-funded, also cave in.
This shrinking of the public domain is not at all what the nation's founders had in mind. Constitutionally speaking, intellectual property is not like other property. Section 8 of Article I reads: "The Congress shall have Power ... to promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries." It is clear, as Lessig and Vaidhyanathan show, that the founders did not intend to give creators (much less their corporate employers) unlimited ownership rights. Unlimited exclusive control, they recognized, would stifle progress. The purpose of copyrights and patents is to promote innovation; the proper goal of copyright and patent law is to strike a balance between incentive and access, between rewarding achievement and facilitating more achievement.
Besides, being sensible persons and not Chicago School doctrinaires, the founders understood that "Science and useful Arts" are to some extent a gift economy. Gratitude, the pleasure of discovery, the impulse to self-expression, and devotion to a common enterprise motivate creators quite as much as lucre. Of course, artists need to make a living, and even to get rich. Lessig, who clerked for Richard Posner and Antonin Scalia, is not an anarcho-communist. But the real point is to keep the tune flowing, the conversation alive, the gift in motion. Poets, jazz musicians, filmmakers, physicists, and coders know this. It's not their fault (and it's not primarily for their benefit) that the balance has been lost -- that, as Lessig laments, "the ability to propertize culture in America is [now] essentially unlimited ... even though the plain text of the Constitution speaks volumes against such expansive control."
In *Copyrights and Copywrongs*, Siva Vaidhyanathan covers much of the same ground as *The Future of Ideas* but pays more attention to history and sociology and less to technology and legal theory. Three chapters on the history of copyright in literature, film, and music (this last with fascinating material on blues and rap) are framed by two analytical chapters, one surveying the common-law roots and constitutional meanings of copyright, the other assessing the likely cultural consequences of the revolution in intellectual-property law. Smoothly written and equable in tone, it makes a valuable supplement to Lessig's brilliant but slightly hectic exposition.
<p>"<span class="dropcap">A</span> republic, if you can keep it," Benjamin Franklin is said to have answered someone in the crowd outside Independence Hall who asked what the deliberations inside had produced. We've done an indifferent job, as Christopher Lasch, Walter Karp, Robert Wiebe, and others have reminded us. An important feature of that republic was a culture of innovation made possible by laws that found a reasonable balance between commerce and creativity. This feature, like the culture of deliberation that briefly flourished in the early Republic, is being eroded by the pressures of competition and concentration. Lessig himself, as skeptical as Franklin, doubts that these pressures will be successfully resisted in the end. But at least, thanks to <i>Code</i> and <i>The Future of Ideas</i>, Vaidhyanathan's <i>Copyrights and Copywrongs,</i> Sunstein's <i><a href="http://www.republic.com" target="_blank">Republic.com</a></i>, Seth Shulman's <i>Owning the Future</i>, and a few other farsighted works, we need not be herded altogether passively into the global cyber-playpen.</p>
# The Social Recession

By George Scialabba | The American Prospect, November 9, 2001

> *The American Paradox: Spiritual Hunger in an Age of Plenty*, by David G. Myers. Yale University Press, 414 pages, $29.95.
>
> *The Loss of Happiness in Market Democracies*, by Robert Edwards Lane. Yale University Press, 465 pages, $35.00.
Over the portal of modernity is written Kant's famous definition: "What is Enlightenment? It is humankind's emergence from its self-imposed childhood." But that inspiring metaphor has a sobering implication. What follows childhood, after all, is adolescence; and folklore and social science agree that, whatever its attractions, this is also the most turbulent, violent, and unhappy stage of life. If Kant's metaphor holds, humankind ought to be suffering some colossal growing pains around now.

Well, we are. In the developed countries, it's true, more people are rich than ever before -- science, democracy, and capitalism have kept their promise. But more people are unhappy, too, at least by some measures, and unhappy in new ways. To simplify grandly, the traditional -- indeed, immemorial -- sources of unhappiness were scarcity and constraint; now, for the first time on a large scale, overstimulation and the erosion of constraint (i.e., of social bonds, which are inseparable from constraints) are also sources of unhappiness. This is not a new idea; Émile Durkheim and Max Weber first conceived it and more or less founded sociology on it. But as the social consequences of modernity broaden and ramify, fresh descriptions and analyses are required of social scientists. Several such works have appeared recently, most notably Francis Fukuyama's *Great Disruption* and Robert Putnam's *Bowling Alone*, as well as two less bold and brilliant but nonetheless valuable studies of contemporary American malaise, David Myers's *American Paradox* and Robert Lane's *Loss of Happiness in Market Democracies*.

Their common theme is that the way we live now has its costs. The legitimation of divorce, cohabitation, and single parenthood; the universal availability of contraception; the economic independence of women; the commodification of child care, elder care, and other personal services; the proliferation of consumer choice; the diversification of media; the superabundance of information; the increase of residential and occupational mobility; the displacement of loyalty and personal influence by merit and competition in the workplace; the decline of party identification and patronage; the undermining of tradition, hierarchy, familism, and governmental paternalism; and in general, the increasing prevalence of individual autonomy and economic rationality in more and more spheres of life: These are all goods. But not unmixed goods. The awkward truth appears to be that constraints are also supports, choices are also stresses, and breadth of experience may sometimes be the enemy of depth.
The data on our current "social recession" (Myers's term) are familiar, but it is useful to have them fully and clearly set out in the two books under review. Since 1960 the divorce rate has doubled. Cohabitation is seven times more frequent. Four out of 10 ninth-graders and seven out of 10 high school seniors report having had sexual intercourse. The average age of first marriage for men has increased from 23 to 27 and for women from 20 to 25. Births to unmarried teens have quadrupled; births to all unmarried parents have sextupled. The proportion of children not living with two parents has tripled. The number of children living with a never-married mother has increased by a factor of 13. Forty percent of all children do not live with their biological fathers.

Hours per week that parents spend with children have decreased by nearly half (30 to 17). The teenage suicide rate has tripled. The rate of violent crime has quadrupled; the rate of juvenile violent crime has septupled. Twelve million people, including 3 million teenagers, contract sexually transmitted diseases each year. Average television-watching hours per household have increased 40 percent; average SAT scores have declined 50 points. The number of survey respondents agreeing that "most people can be trusted" has dropped 40 percent, while the number asserting that "you can't be too careful in dealing with people" has risen 50 percent. And although personal income has more than doubled, the proportion of Americans calling themselves "pretty well off financially" has dropped 40 percent and the number who say they are "very happy" has dropped 15 percent, while the incidence of depression is, depending on the estimate, three to 10 times greater.
So far, just trends. But there are correlations in the data, too. Married people are happier and healthier than divorced or unmarried people. People are more likely to stay married if they are religious, are well-educated, grew up in a two-parent home, married after age 20, and married as virgins. Compared with married couples, cohabiting couples enjoy sex less, are more often unfaithful, and (if they eventually marry) are more likely to divorce.

And then there's the root of (not all, but much) developmental evil: father absence. Seven out of 10 delinquents are from father-absent homes. Teenage boys from such homes are three times as likely to be incarcerated by age 30; for each additional year a boy spends in a home without two parents, the risk of incarceration increases by 5 percent. Children from single-parent homes are more likely to be abused, to drop out of school, and (by a factor of five) to be poor. The presence of stepparents improves the numbers a little, but not much. Overall, the nonmarital birth rate predicts a society's violent crime rate with striking accuracy.

Even the most expertly parsed data are not self-explanatory, of course. Myers and Lane have rather different interpretive styles: the former analytical and hortatory, very ready to propose voluntary or legislative remedies; the latter abstract, rarely venturing more than a tentative suggestion. Myers, a social psychologist, emphasizes culture and values; Lane, a political scientist, emphasizes institutions and systems.

Both emphasize that poverty is not the prime cause of our "social recession." Myers forcefully criticizes economic inequality in contemporary America, advocating increased progressivity and a ceiling on executive compensation. More cautiously, Lane concludes that "by increasing unemployment to control inflation we are apparently increasing misery for no direct (though possibly some indirect) gain in well-being." For the most part, however, both books are less concerned with injustice or exploitation than with individualism and its unintended consequences.
The most obvious of these is marital instability. Myers quotes the historian Lawrence Stone: "The scale of marital breakdowns in the West since 1960 has no historical precedent... . There has been nothing like it for the last 2,000 years, and probably longer." *The American Paradox* thoroughly documents the effects of this change and offers many sensible suggestions for coping with it, such as mandating parental leave and flextime, indexing the dependent exemption to inflation, adjusting tax rates on married couples' income, reforming divorce laws, automating child-support payments, and directly subsidizing two-parent families in various ways. He also makes a plausible, nonsectarian case for teaching virtue in the public schools.

As for the causes of the marriage crisis, Myers wisely recognizes that they go beyond mere selfishness or shallow ideals of "personal growth" to include urbanization and the disappearance of neighborhoods, the changing nature of work and the resulting economic progress of women, the declining efficacy of religious sanctions, the sexualization of advertising and entertainment, and, of course, the wider availability of contraceptive technology. Individualism stands upon a vast historical scaffolding.

Lane describes a different portion of the scaffolding. In a long and subtle (sometimes excessively subtle) analysis of the incentive structure of market culture, he shows that the main sources of happiness ("intrinsic work enjoyment, family solidarity, social inclusion, sociability at the workplace") and unhappiness (chronic financial insecurity, overwork, etc.) are "externalities" -- matters respecting which markets are, in their normal workings, indifferent. "It does not pay," he concludes, "to devote resources to benefits accruing to workers but not to the firm's net income."

The consequences of all this elicit from Lane a rare burst of eloquence. "There is a kind of famine of warm interpersonal relations, of easy-to-reach neighbors, of encircling, inclusive memberships, and of solidary family life... . For people lacking in social support of this kind, unemployment has more serious effects, illnesses are more deadly, disappointment with one's children is harder to bear, bouts of depression last longer, and frustration and failed expectations of all kinds are more traumatic." Valuing intimacy more, income and power less, would counter this "loss of happiness." But how to bring that about, how to promote companionship rather than economic growth or procedural equality as a societal norm? Lane is chary of suggestions, pleading only that other social scientists give the matter more attention.
Statistically, the social recession has leveled off since the mid-1990s, though it's not clear why. What are our long-term prospects? Not everyone gracefully outlives his or her growing pains. Some adolescents do go haywire, after all. It doesn't seem altogether impossible that humankind -- or at any rate, American society -- will go haywire. Not convulsively, perhaps, but just drifting into a psychic minimalism. Michael Walzer has imagined the nightmarish result of "individualism with a vengeance" and of an equally untrammeled skepticism and hedonism: "a human being thoroughly divorced, freed of parents, spouse, and children, watching pornographic performances in some dark theater, joining (it may be his only membership) this or that odd cult, which he will probably leave in a month or two for one still odder ... dissociated, passive, lonely, ultimately featureless." This is not what emancipation was supposed to look like.

Doubt and freedom have been sacred to us since the Enlightenment, and rightly so. Whatever bonds and beliefs we hope to ground community on must be able to withstand all criticisms and temptations. Are there any such grounds? Myers recommends a nondoctrinaire theism ("leaving theologians and atheists to duke it out over the ancient what-is-truth question"), but it is practically contentless. Lane looks wistfully to social science for "a better theory of measured (and not just inferred) utility." That sounds to me like a straw in the hurricane of consumer marketing and global competition.

Let's hope for a prophet. As it happens, my favorite prophet, D.H. Lawrence, has some apposite words: "Man has a double set of desires, the shallow and the profound, the personal, superficial, temporary desires, and the inner, impersonal, great desires that are fulfilled in long periods of time. The desires of the moment are easy to recognize, but the other ones, the deeper ones, are difficult. It is the business of our Chief Thinkers to tell us of our deeper desires, not to keep shrilling our little desires in our ears."

Myers and Lane (along with Putnam, Fukuyama, and the late Christopher Lasch, another great twentieth-century prophet) have done what Chief Thinkers ought to do. But when one contemplates the multitude of opposing voices -- the market, TV and the movies, popular music, the advertising industry, and, increasingly, the Internet -- all ceaselessly shrilling our little desires in our ears, it's hard to be optimistic.
# Cherny Speaks

By George Scialabba | The American Prospect

> *The Next Deal: The Future of Public Life in the Information Age*, by Andrei Cherny. Basic Books, 268 pages, $24.00.
John Stuart Mill, in his essay on Coleridge, remarks that "a knowledge of the speculative opinions of men between twenty and thirty years of age is the great source of political prophecy." If Mill is right, then one should pay particular attention when a young opinionator comes along with 25-year-old Andrei Cherny's credentials -- speechwriter for Vice President Al Gore and author of the 2000 Democratic Party platform -- claiming, explicitly and insistently, to speak for his generation. As the Clinton administration wound down, Cherny enrolled in law school, though he keeps a hand in politics as a contributing editor for *The New Democrat* and senior policy adviser to the speaker of the California State Assembly. In a *New York Times* profile last summer, Cherny claimed to have set his sights on a career in criminal law. But it's hard to believe that *The Next Deal* wasn't conceived as a bid for a top domestic policy post in the Gore White House. By the luck of the chad, however, we are reading here about a future that is not to be, at least not for another four years.
Perhaps it's just as well. Cherny's premise is that public life should be drastically reshaped in response to what he calls the Choice Revolution: "the growing expectation among consumers that the world be customized to fit their preferences and the growing effort among businesses to meet this expectation." We currently have an industrial-age government, he argues, for an information-age society. We go online to order everything from individually tailored blue jeans to individually tailored stock portfolios. We work online: In America today, Cherny notes, "a quarter of workers are wired workers (working with networked computers in a flexible, team-oriented environment) and nearly a third are free agents (working for themselves and from their homes)" -- a new, knowledge-based workforce, "self-reliant and empowered to do their jobs as they think best." We find community online, in chat rooms and e-mail: communities "based on shared interests and not just shared geography." Yet we still "stand in endless lines to pick up forms at the Department of Motor Vehicles." The new economy is a Pegasus: swift, flexible, responsive, efficient, and nonhierarchical. Government, by contrast, is a dinosaur: "wasteful, corrupt, distant, and laden with bureaucracy."
<p>
Government must be reinvented before it can interest the "Choice Generation,"<br />
which has grown up with the Internet. This is a generation that "impatiently raps<br />
its fingers on the table when it takes more than a few seconds to download a web<br />
page from China, which expects packages sent from the other end of the continent<br />
to arrive by 10:00 a.m. the next morning, which finds it difficult to watch TV<br />
without a remote control in hand, which demands a piping hot pizza delivered to<br />
their front door in half an hour." To an older critic, even a prematurely older<br />
one like, say, Jedediah Purdy, these things might betoken a sadly limited<br />
attention span and a hankering for instant gratification. But to Cherny, they are<br />
the outward signs of empowerment.</p>
<p>
The Choice Generation has an anthem, Cherny writes. Its verses are the<br />
advertising slogans of dotcoms: "We're betting on ourselves" (Suretrade.com);<br />
"Bureaucracy Beware" (Homeloan.com); "Believe in yourself" (Ameritrade.com);<br />
"Power to the People" (DiscoverBrokerage.com). A young fogy like, say, Thomas<br />
Frank might take these phrases for mere marketing blather--the latest stage in<br />
the phony conquest of cool. To Cherny they compose a "haiku of choice,<br />
individualism, and self-reliance."</p>
<p>
We have struck camp, our young prophet tells us, and are on the open road:</p>
<blockquote><p>The Choice Generation is clearly an evolving group--its values<br />
are continually reinforced as technology continues to provide more choices, more<br />
personalization, and more individualized power. The big three TV networks have<br />
been challenged in turn by cable, by more networks, by a thousand channels of<br />
satellite and digital cable programming. The movie theater competed first with<br />
the VCR, which lets viewers control what they watch and when they watch from the<br />
comfort of their couches. It has now to deal with DVDs that let viewers skip to<br />
their favorite scenes and soundtracks. With increasing frequency, young people<br />
have begun to eschew prepackaged music albums in favor of custom-created CDs made<br />
up of the songs <i>they</i> choose. Pagers appeared and were followed by cellular<br />
phones. Personal computers transformed America and were followed by Palm Pilots.<br />
Young people traded in record players for personal Walkmans and then for even<br />
more control with Discmans. Everywhere one looks the technologies that shape the<br />
lives of the Choice Generation are constantly and consistently trending toward<br />
giving the individual more personal power. More than half of the nation's minors<br />
have a television and CD player in their bedroom. No one watches them over their<br />
shoulders; they have a previously unimagined amount of power to shape the<br />
environment they live in. </p></blockquote>
<p>
More fundamentally, the Internet has put them in control. They read whatever<br />
they want to read, buy whatever they want to buy, download photos of the hottest<br />
star of the moment. Moreover, they don't have to look at CNN's web page; they can<br />
look at their MyCNN page, which shows them only the topics they are interested<br />
in. They don't have to sift through the Web with the AltaVista search engine;<br />
they can search with MyAltaVista, which conforms itself to their "surfing"<br />
habits. At mybytes.com (with the slogan, "It's my web"), they can customize "my<br />
research tools," "my calendar," "my interests," "my life," and so on. At<br />
myway.com, where "you'll get all the information you need, the way you want it,"<br />
they are welcomed to "the Internet's distinct new personality. Yours." At<br />
Barbie.com, they can design their own doll to fit the specifications they choose.<br />
At Nike.com, they can design their own sneakers and even have their own name put<br />
on the shoes. The notion of "one-size-fits-all" is ever more obsolete in a world<br />
they are increasingly customizing to fit themselves.</p>
<p>O brave new world, that has such choices in it!</p>
<p>How to bring government into the information age? <i>The Next Deal</i> is<br />
surprisingly short on program. You might think that a book that practically<br />
posits computer literacy as a prerequisite of effective citizenship would<br />
consider how to help the many people--even young people--who are not computer<br />
literate (or print literate, for that matter) to become so. Not a word. You might<br />
think that a book that celebrates interactivity would spend many pages detailing<br />
ways to bring every citizen adequate information about public issues--about, say,<br />
the fiscal impact of massively regressive tax cuts, or the environmental impact<br />
of hog farming and cattle raising, or the moral impact of unreformed campaign<br />
finance--and ways, in turn, to bring citizens' opinions to bear on officials<br />
between elections. A few pages, not many details. <i>The Next Deal</i> would<br />
doubtless have made Cherny's reputation if it had pointed out, before the media<br />
herd, that industrial-age electoral technology was a disaster waiting to happen,<br />
not to mention a clear indicator of the political class's contempt for the<br />
underclass. No such luck. </p>
<p>So what <i>is</i> the program? "As a general rule and whenever possible,<br />
government power and funds should go directly to citizens--and only rarely to<br />
institutions. Americans should have personal control over the money that<br />
government spends on their behalf, thereby putting decisions about the direction<br />
of government programs into their hands instead of those of bureaucrats." In<br />
particular: "Parents should decide which public school their children attend and<br />
have the funds follow the children to the schools that deserve them; unemployed<br />
workers should decide how to spend their job training benefits and choose the<br />
services they feel would be most beneficial to them; young people should decide<br />
where to invest a portion of their Social Security retirement funds and accept a<br />
greater share of both the risk and reward." Health insurance should be purchased<br />
individually rather than through one's employer, and "a system of tax credits and<br />
grants should be put in place to help those who cannot afford to purchase<br />
coverage." </p>
<p>
To facilitate all this choice, Americans "will need access to even more<br />
information. That means tests in schools that allow parents to compare students,<br />
teachers, and schools against national and international benchmarks. It means<br />
rating health care plans, hospitals, and doctors. It means detailed, useful, and<br />
useable information on the history and results rendered by every provider of<br />
every service that Americans will be able to choose from." There is already, of<br />
course, plenty of information available to online retirement investors (though it<br />
won't do them much good in case of a meltdown or even a long recession).</p>
<p>
It's all rather sketchy, but there's something here. Why <i>shouldn't</i><br />
government be more like e-business? If only Cherny hadn't skimped on programmatic<br />
detail while larding the book with potted history, irrelevant anecdote, and<br />
speechwriterly platitude. But I suppose his publisher wanted it in time for the<br />
inauguration.</p>
<p>
<span class="dropcap">A</span>t the end of <i>The Next Deal</i> comes a curious and heartening reversal.<br />
Cherny proposes a universal-national-service scheme--a one-year Citizen Corps for<br />
18-year-olds. All those hip young consumers, he recognizes, need to shoulder a<br />
"New Responsibility" as a "necessary counterbalance to the individual autonomy of<br />
the Choice Revolution." Notwithstanding our (well, some people's) unprecedented<br />
prosperity, there's plenty of useful work begging to be done.</p>
<blockquote>Millions of the old could stay out of nursing homes for years if<br />
someone could come visit them once a day, making sure that they are all right and<br />
that small household chores are performed. Millions of children need extra<br />
reading and math tutoring; thousands of homeless are looking for help in moving<br />
off the streets and back into society; parents need, but often cannot afford,<br />
child care for their young children; after-school care is needed to keep older<br />
kids off the streets, away from the television, and in a classroom; an<br />
overburdened and expensive health care system needs an infusion of nurses' aides;<br />
the Peace Corps needs to expand its efforts in nations around the globe without<br />
reducing its quality; soil erosion needs to be battled; streams need to be<br />
cleaned; classrooms need more teacher aides; and the police need help in<br />
organizing neighborhoods against crime.</blockquote>
<p>
This is a splendid idea. It is also, as Cherny acknowledges, not a new idea.<br />
It is, come to think of it, a pretty old idea--pre-industrial-age, even. "From<br />
each according to his abilities, to each according to his needs" is only a recent<br />
restatement of it. Turning this ancient idea into a smart, up-to-the-minute<br />
information-age manifesto--now <i>that</i> would be a proper task for a talented<br />
and ambitious young wordsmith like Andrei Cherny. </p>
Wed, 07 Nov 2001 20:34:46 +0000 | George Scialabba

Free to Choose
http://prospect.org/article/free-choose
<p><i>Moral Freedom: The Search for Virtue in a World of Choice</i>, by Alan Wolfe. W.W. Norton, 224 pages, $24.95.</p>
<p>
As every parent knows, sometimes the only answer to "Why?" is "Because I say so." For a long time that was, at least in form, the most common answer to society's ultimate why question: "Why be moral?" And of course, for an even longer time that question rarely arose, which greatly simplified matters. </p>
<p>
From the moral standpoint, modernity may be defined as the unwillingness of the many, and no longer only a privileged or heroic few, to take "Because I (we) say so" for an answer. This fateful recalcitrance had many sources. One was a lightening of the burdens of daily life, thanks to the agricultural innovations of the late Middle Ages and the trickle-down effect of growing trade. Another was the evolution and differentiation of nation-states and national churches. Yet another was the success of natural philosophy, later called science, which made asking "Why?" seem in general a more promising thing to do. And perhaps most significant was the creation of a labor market, which eventually forced at least one urgent why question on everyone: "Why get my daily bread in this way rather than some other?" The result of these various but related historical developments was (here, a slight flourish of trumpets) the birth of the individual. </p>
<p>
This has not been an unmixed blessing. It's hard being an individual. It entails many more choices than we had two million years ago on the savanna, where our genetic endowment mostly took shape. Since, from the physiological point of view, every choice is a stress, it is possible, at least in theory, to have too many choices. At a certain point, "Because I say so" begins to look like an adaptive strategy.</p>
<p>
Nevertheless, the Pandora's box of modernity is wide open. The historical changes mentioned above are irreversible. Individuality may yet evaporate in the postmodern electronic cybercollective, but it will not sink back into the premodern organic mass. Individuality is our condition, and freedom is therefore our moral fate, especially here in America, where the four horsemen of modernity--abundance, pluralism, technology, and mobility--are most firmly in the saddle. </p>
<p>
As Alan Wolfe puts it: "Americans have come to accept the relevance of individual freedom, not only in their economic life, but in their moral life as well." Indeed, says Wolfe, the "defining characteristic of the moral philosophy of Americans" is the principle of moral freedom. "Moral freedom means that individuals should determine for themselves what it means to lead a good and virtuous life. Contemporary Americans find answers to the perennial questions asked by theologians and moral philosophers, not by conforming to strictures handed down by God or nature, but by considering who they are, what others require, and what consequences follow from acting in one way rather than another."</p>
<p>
<span class="dropcap">W</span>hat this means in practice is the subject of Wolfe's short but fascinating new book, <i>Moral Freedom.</i> As in his last book, <i>One Nation, After All,</i> Wolfe and his associates interviewed approximately 200 people in eight communities around the country; and once again he has fashioned the results into a seamless weave of narrative and interpretation, deftly alternating quotes and commentary. As before, he finds that by and large Americans are just trying to get by, good-humoredly but a little worriedly, in a world that's moving a bit too fast; eclectic, improvising, sometimes glancing wistfully at tradition and authority yet always wary of them as well.</p>
<p>
The organizing theme of the interviews was the meaning of virtue. Unsurprisingly, few of Wolfe's respondents had much to say about it in the abstract. Like most Americans on most subjects, they are uncomfortable with categorical statements. But they were more forthcoming--positively chatty, it appears--when asked for their attitudes and feelings about individual virtues like loyalty, honesty, self-discipline, and forgiveness. The resulting slightly artificial intimacy and occasionally banal earnestness lend <i>Moral Freedom</i> some of the flavor of soap opera and talk radio, which are perhaps where many of those attitudes and feelings came from. </p>
<p>
If Americans have a moral metaprinciple, it is flexibility. "No absolutes" is our watchword. One might say (with apologies to Barry Goldwater): Extremism in the pursuit of virtue is no virtue; moderation in the practice of virtue is no vice. One owes loyalty to an employer or a spouse, but also to one's career and even one's own happiness. Honesty is the best policy, as long as it's reciprocated; and honesty in dealing with big institutions (especially the IRS) doesn't really count. Self-discipline is essential to success, though pleasure is no less essential to emotional health. And so on, the rule being: Adapt every rule to the circumstances. "Americans," Wolfe observes, "are consequentialists."</p>
<p>
We have to be, he explains. For one thing, the old rule-givers no longer command automatic deference. Even the Christians in his sample seem to rely on their own readings of Scripture, not authoritative institutional ones. And the stakes involved in moral judgment are different now. Traditional morality was "narrow but deep; the individual had few choices to make, but all those choices were serious." Present-day morality, by contrast, is "shallow but broad; we have many more moral issues to consider, even if few of them will result in eternal damnation, social ostracism, or the poorhouse."</p>
<p>
Does this make for a certain moral squishiness? Wolfe, admirably but perhaps a tad excessively loyal to his respondents, denies it. It is true, he acknowledges, that Americans no longer "subject themselves to the severe and demanding tests of character imposed ... by one version or another of the Protestant ethic." This does not mean, however, "that the morality by which people live now makes them soft. It means only that the new morality is different, making up in the bewildering array of temptations it must face what it lacks in vigor and unswerving self-confidence." </p>
<p>
Sorry, but I'm afraid it sounds to me like we're soft. Certainly the extramoral evidence points that way. Americans gobble junk food, watch TV, and guzzle gas without much restraint, as our waistlines and smog levels testify. We can't say no to entitlements or yes to taxes. We want military superiority but no casualties. Once a self-reliant nation of tinkerers and inventors, we now have little idea what makes our fancy gadgets tick. A man's word may once have been his bond in these here parts, but now we lead the world in lawyers per capita. Aren't these character flaws? Wouldn't more "rigor" and less flexibility serve us well?</p>
<p>
Until now America's fabulous prosperity has blunted the force of such questions--and indeed, made the whole issue of character somewhat moot. In our postindustrial (if that's what it is) capitalist society, most people's economic well-being does not seem dependent on other people's virtuous behavior. (Or on one's own, for the fortunate minority enriched by the stock market.) Residential and commercial mobility make the behavior of individuals and businesses hard to monitor or sanction. Most of us are content to leave that to the experts anyway: the professional regulators, litigators, journalists, educators, and therapists who, in effect, administer our public morality--and who do so, for the most part, without reference to "virtue" or "character."</p>
<p>
Two other recent developments seem particularly to cut against the psychic stability and coherence that are the foundation of character. The first, mentioned by Wolfe but more fully explored by Richard Sennett in <i>The Corrosion of Character,</i> is the altered shape and rhythm of careers. We change jobs and even livelihoods today with unprecedented frequency; and companies, too, disappear or metamorphose rapidly. In these circumstances, adaptability and detachment are at a premium, strong ties and long-term commitments at a discount. </p>
<p>
More subtly but perhaps more fundamentally, television and the Internet erode our capacity for inwardness. The sheer volume of stimuli, their velocity and decibel level, our passivity (except for an index finger on the mouse or the remote)--psychologically, these and other features of the "electronic millennium" (Sven Birkerts's phrase) add up to a new evolutionary niche. Our attention, or at any rate our grandchildren's, can only become more diffuse, shallow, and restless. Continuity is part of the very definition of character; and continuity is gradually but inexorably fraying, within us and without.</p>
<p>
Wolfe's respondents are aware, at least peripherally, of these environmental changes but are on the whole resigned to them. "The world is an incredibly dynamic place. Nothing stays the same," a Silicon Valley entrepreneur asserts in a tone of finality. They are, however, uneasy about the decline of religious belief. This is, apparently, not because Christian or non-Christian religious beliefs are true, but because they're useful. Like most conservative social critics, these ordinary people praise religion but decline to argue it. They want the socializing effects of authoritative worldviews for their children, provided the kids grow out of those views at the appropriate time, and not too painfully. As Wolfe puts it, cogently: "It is as if they want some of the practice of old-time character formation--especially a greater respect for order and discipline--without the rest of old-time character formation, especially theological instruction and an emphasis on religious obedience."</p>
<p>
How entirely reasonable! But is it possible? Wolfe, impressed by his subjects' cheerful, humane pragmatism, seems to think so. But some of his readers will feel that the notion of "moral freedom" harbors, if not a contradiction, then at least a tension that might have, and perhaps ought to have, made <i>Moral Freedom</i> a darker, more pessimistic book. </p>
Wed, 07 Nov 2001 20:11:59 +0000 | George Scialabba

The God That Fails
http://prospect.org/article/god-fails
Tue, 07 Aug 2001 14:12:41 +0000 | George Scialabba

Thinking about Thinking
http://prospect.org/article/thinking-about-thinking
<blockquote><p><i>The Metaphysical Club: A Story of Ideas in America,</i> by Louis Menand. Farrar, Straus and Giroux, 480 pages, $27.00.</p></blockquote>
<p><span class="dropcap">N</span>ot long ago the philosopher Daniel Dennett called Darwinism--the theory of evolution by means of natural selection--"the best idea ever." It is certainly one of the most consequential. As an explanation of an enormous range of biological and cultural phenomena and, perhaps more important, as a mode of explanation, it has altered the intellectual landscape decisively. Even Karl Marx, only a few years after the publication of <i>On the Origin of Species,</i> sought permission (unsuccessfully) to dedicate <i>Das Kapital</i> to Charles Darwin. ("Organisms of the universe, unite... . "?)</p>
<p>One of Darwinism's consequences, as Louis Menand tells it, was a change in the character of philosophy in America. Before Darwin, philosophy was in effect an adjunct of religion. There were few philosophy professors in the young United States, and they were largely concerned with justifying the ways of God to their gentlemen students. The conflict between Calvinism and the Enlightenment had produced a temporary compromise: Unitarianism, which amounted to deism, or natural theology, plus New Testament morality. Emerson and the Transcendentalists had rejected this compromise, spurning deistic rationalism in favor of German idealism, and Christian morality in favor of a semimystical individualism. But it was Darwinism that, by stripping natural history of any evidence of divine purpose, kicked the props out from under supernatural religion altogether.</p>
<p>Contemporaneous with Darwinism, and equally shattering, was the Civil War. "As traumatic wars do," Menand writes, "the Civil War discredited the beliefs and assumptions of the era that preceded it." In this case, what was discredited was the belief that moral certainty is attainable or desirable. Moral certainty, on both sides, had after all helped bring about the bloodiest war the world had yet seen.</p>
<p>The generation that came to maturity in America after the Civil War and Darwin's <i>Origin of Species</i> is the subject of Menand's superb study <i>The Metaphysical Club.</i> Actually, "study" is almost too stuffy; it really is, as the subtitle asserts, a story, full of color, incident, and personality. The protagonists are William James, Oliver Wendell Holmes, Jr., Charles Sanders Peirce, and John Dewey. The first three were members of a short-lived Cambridge discussion group called, in a spirit of self-mockery, the Metaphysical Club. (All the members were decidedly anti-metaphysical.) Dewey, a generation later, synthesized and publicized their intellectual innovations. This is, as Menand rightly claims, the central lineage in our intellectual history: "Together they were more responsible than any other group for moving American thought into the modern world... . We are still living, to a great extent, in a country these thinkers helped to make."</p>
<p><span class="dropcap">A</span>nd what is their legacy? Pragmatism is its name, as most readers will know; but what is its substance? In Menand's summary:</p>
<blockquote>What these four thinkers had in common was not a group of ideas, but a single idea--an idea about ideas. They all believed that ideas are not "out there" waiting to be discovered, but are tools--like forks and knives and microchips--that people devise to cope with the world in which they find themselves. They believed that ideas are produced not by individuals, but by groups of individuals--that ideas are social. They believed that ideas do not develop according to some inner logic of their own, but are entirely dependent, like germs, on their human carriers and the environment. And they believed that since ideas are provisional responses to particular and unreproducible circumstances, their survival depends not on their immutability but on their adaptability.</blockquote>
<p>What was new about this? In the traditional, metaphysical view, abstractions such as "human nature," "soul," "truth," and "justice" had a real existence. Since such entities were immaterial, immutable, and eternal, they were, in a sense, even more real than physical objects or events. Philosophical discourse therefore ascended very quickly beyond mere fact to the loftier sphere of concepts, essences, and ideas.</p>
<p>This was all very well as long as not much of interest was happening down on the ground. Eventually, however, thanks to science, commerce, the revival of classical culture, and the decline of political absolutism, things down here got very interesting indeed. Pragmatism was an attempt to drag philosophy back to earth.</p>
<p>It did this by redefining the meaning of a statement or theory or idea as whatever can be done with it, or made of it. The proper judges of whether what someone has made of a statement is legitimate or useful are all those interested in the question; and the criterion they judge by is their own sense of what is plausible and important. This is rather general, but it captures the essential elements in Menand's definition: Inquiry is purposeful, social, and provisional. It is not and cannot be, in the strong, traditional sense, disinterested, solitary, or definitive.</p>
<p>The philosophical shift from dogmatism to skepticism, from finality to tentativeness, was clearly an imitation of modern scientific practice. It was also, Menand suggests, an adaptation to modern capitalism. Pragmatism inculcated "a kind of skepticism that helped people cope with life in a heterogeneous, industrialized, mass-market society"; an intellectual and psychological flexibility that "is what permits the continual state of upheaval that capitalism thrives on."</p>
<p>This is, so far, a fairly standard account of pragmatism's origin and significance. What is enthralling and illuminating about <i>The Metaphysical Club</i> is its portraits of individuals and their milieus. Menand is wonderfully deft at evoking a climate of ideas or a cultural sensibility, embodying it in a character, and moving his characters into and out of one another's lives. What might have been a jumble of intellectual movements and colorful minor figures--abolitionism, nineteenth-century race theory, the rise of statistics and probability theory, Social Darwinism, cultural pluralism, legal realism, anthropological relativism, experimental psychology, academic professionalism, progressive education, the settlement-house movement, the Pullman strike, Oliver Wendell Holmes, Sr., Louis Agassiz, Benjamin Peirce, Henry Livermore Abbott, Chauncey Wright, Hetty Robinson, Alain Locke--is instead a subtle weave of entertaining narrative and astute interpretation.</p>
<p>Nearly everything is known by now about James and Dewey, two of the book's four protagonists, but Menand's account of their development is lively and perceptive. He dwells on the young James's trip up the Amazon with Louis Agassiz, Harvard's celebrated naturalist. Agassiz was looking for evidence against Darwin and in support of polygenism, the theory that species and human racial groups were created separately, with distinct and unchanging attributes. James was initially impressed, like most of his contemporaries, by Agassiz's knowledge and flair but later reacted strongly, again like many contemporaries, against Agassiz's dogmatism. It was Agassiz's style of "typological and prescriptive thinking" that, Menand shows, taught James how science should <i>not</i> be done.</p>
<p>James was temperamentally undogmatic, almost to a fault. Menand has some fun with James's famous indecisiveness. The philosopher was unsure about whether to propose to his beloved, what to name his son, when to resign from Harvard. (It was not always so amusing: James suffered long afterward for his inability to decide whether to serve in the Civil War.) But he makes good use of this psychohistory, too, in characterizing James's thought. The programmatic open-endedness of pragmatism becomes more intelligible when we've been made acquainted with James's "quicksilver" personality.</p>
<p><span class="dropcap">J</span>ohn Dewey, by contrast, had no personality, or practically none. Still, Menand makes an interesting connection between an obscure episode in Dewey's life and his writings, which were uniquely influential among Americans in the first half of the twentieth century. Dewey had just arrived at the University of Chicago when the great Pullman strike of 1894 broke out. He and Jane Addams, founder of the Hull House settlement, discussed it. Dewey described himself at the time as "a good deal of an anarchist" and damned the upper classes. Addams, though equally disgusted with the rich, insisted that social antagonisms were fundamentally illusory. Her profession of this mystical-sounding notion caused the scales to fall from Dewey's eyes. "I never had anything take hold of me so," he wrote his wife the next morning. "I can see that I have always been interpreting the dialectic wrong end up--the unity as the reconciliation of opposites, instead of the opposites as the unity in its growth." Dewey's habitual attention to the intimate relations between the individual and the social may be partly traceable to this curious revelation.</p>
<p>The third of pragmatism's leading figures--and, according to James, its real originator--is not quite so familiar. Charles Peirce, the son of an eminent mathematician, was himself mathematically gifted and worked for 30 years on the U.S. Coast Survey. He was also a womanizer, spendthrift, drug addict, and, as he confessed in his Harvard class book, afflicted with "1 Vanity, 2 Snobbishness, 3 Incivility, 4 Recklessness, 5 Lazyness, 6 Ill-temper." For a decade or so, Peirce and Chauncey Wright--another unstable genius, revered by James and Holmes and known as "the Cambridge Socrates"--hashed out Darwin, logic, cosmology, and other philosophical matters, in and (mostly) out of the Metaphysical Club. Peirce's thought is difficult, and even Menand cannot render it altogether intelligible. He does, though, quote from Peirce's essay "How to Make Our Ideas Clear" what is probably the best one-sentence statement of pragmatism: "The opinion which is fated to be ultimately agreed to by all who investigate, is what we mean by the truth."</p>
<p>Oliver Wendell Holmes, Jr., is the tragic hero of <i>The Metaphysical Club.</i> Son of Doctor Holmes--the original Boston Brahmin, friend of Emerson, and "Autocrat of the Breakfast Table"--the young Oliver was brilliant, good-looking, and well-connected. Already a religious and philosophical skeptic, he dropped out of college to enlist in the military immediately after the shelling of Fort Sumter, in a burst of unionist and antislavery fervor. But the dogmatism of the abolitionists, the irrationality of war, and the paradoxical example of Henry Livermore Abbott, a fellow officer who disagreed emphatically with the Union cause but nonetheless fought with exceptional skill and bravery, turned him into a moral and political skeptic as well.</p>
<p>After the war, Holmes entered on a prodigiously successful career as a legal scholar and judge. His legal philosophy was the ne plus ultra of skepticism. Law is not based on first principles, according to Holmes; indeed, it is not based on anything: It is "nothing more or less than what judges do." All legal reasoning involves at least some interpretation and is therefore, at least to some degree, open to challenge and reinterpretation. Settled law is simply whatever seems pretty likely to survive all such challenges. By analogy with Peirce's sentence above, one might consider Holmes as saying: "The opinion which is fated to be ultimately agreed to by all who adjudicate, is what we mean by the law." Law, like philosophy, is contingent all the way down.</p>
<p>Holmes and James, once the best of friends, grew apart over the years. Holmes thought James softheaded; James thought Holmes hard-hearted. Present-day sympathies are mostly with James, but Menand's account maintains a careful balance. Holmes may sometimes have been "selfish, vain, [and] thoughtless of others," as one of his oldest friends remarked with some bitterness. But there was something admirable, too, about his grim, unyielding refusal of philosophical or moral solace. It was, in its way, an experimental life: Few people have dispensed with conventional belief and sentiment so utterly as Holmes. Those who aspire to live, as Nietzsche urged, "without metaphysical comfort" will take heart, or take heed, from his example, which, like much else, is figured delicately and poignantly in <i>The Metaphysical Club.</i></p>
Fri, 20 Jul 2001 17:05:15 +0000 | George Scialabba