How Have We Changed?

As part of the magazine’s look back over the past forty years, American Heritage asked a wide range of historians, journalists, writers, and public figures the following question: “What do you think is the most important, or interesting, or overlooked way in which America has changed since 1954, and why? And what does this change say about us as a people?” We knew this was a broad question, to say the least, but we were still surprised by the answers it elicited; they turned out to be as various and provocative and illuminating as the people they came from. An anthology follows.

The Terbul Deklin of Liturcy

The change from waxed paper to cling wrap says it all.

The degree of civilization at any time and place may be measured by the way in which particular acts are classified. Some are free, some forbidden or commanded by law, the rest abstained from, by habit and the sway of opinion. I think the chief change in American society between 1954 and today is the shrunken area occupied by the third kind of behavior.

The phenomenon has been called Permissiveness and credited to a bugbear called Relativism. This explanation ignores the underlying motive and the predisposing state of fact. What we see is not simply laxity, but a paradox that carries a message. Why the extensive lying, cheating, and stealing by the intelligent and well-to-do? Why the artist’s rage to disgust and serve up the obscene? Why the passion for “telling all” and for the conglomerate, not only in business but also in everyday life, eating at all times and places, wearing any kinds of clothes anywhere, and using dirty words—everything regardless of surrounding conditions?

I believe the answer lies in that last word. The intense purpose behind many seemingly disparate acts and beliefs comes from resentment against obstacles, against any condition set in the path of any creature’s doing what he, she, or it desires. A barrier is an affront to human nature. What all want is the Unconditional Life.

This ideal stimulates the imagination and, if need be, removes guilt. Ambition balked cheats with a clear conscience; and greed, seeing the arbitrariness of property rights, steals as it were on principle. With lines blurred and fences down, it is easy to be virtuous and never “discriminate” in any sense. In this light, blue jeans, a sweater, pearls, and gold evening shoes qualify as “a style.”

To explain the rise of the passion for the unconditioned would require a look at modern cultural history, taking in the wars, the social thought, and the arts of our time. But the immediate impelling force is the universal sense of oppression: too many contacts with too many people, thanks to multiplying means of communication; too many rules, warnings, limits, delays, duties, prohibitions, exclusions, conditions—a ubiquitous “zoning” of existence by which government tries to reconcile incessant claims. Lastly, the exit door is blocked by the mass of trivial details that force attention if one is to thread one’s way more or less unbruised through workaday life. To take every chance of breaking out is but reflex action.

The most important change since 1954 derives from the fact that until then America had never lost a war. The French, who had lost many wars, lost the Battle of Dien Bien Phu that year, which should have taught us a thing or two but didn’t. And so today we, too, are now a nation that has experienced defeat. Until the Indochinese events that began (for us) in 1954, Americans believed that every time the country undertook a major project—to defeat the British, to settle the West, to control the secessionist South and abolish slavery, to institute a variety of social reforms, to defeat the Second World War Axis—the project turned out well. But not Vietnam.

In the first years after the war, most Americans drew a very simple lesson—to wit, never get involved in someone else’s troubles unless it is absolutely unavoidable. Some people still feel that way, naturally. But I think most people have come to think today that it is good to send at least a few American soldiers to central Africa; that in spite of everything, we performed a good deed in Somalia; that we did well in preventing the expansion of Saddam Hussein’s power, even if success in the Gulf War was less than what it seemed at the time. So the simple lesson from Vietnam has turned out to be too simple.

All in all, we are an older nation than we were in 1954. We have learned that things can go deeply wrong, and we have learned that, even so, hiding in a cave is no alternative. We have learned the lesson that there is no lesson. George Washington is our national symbol, and on our heads, too, a few white hairs have sprouted.

Can we understand our own times? I’m not sure. Matters that seem urgent, forces that seem irresistible in their period may vanish like the once powerful Prohibition party. But here are some very provisional comments.

In 1954 and again in 1965 my husband and I drove to California and back, dawdling across the country. A couple of the roads we took in 1954 (U.S. 40, U.S. 66) had become historic by 1965. Sheer growth has been an obvious change. Now I hear that the roads, airports, and bridges built in the late 1950s are broke and no one’s fixing them. That’s like us: Americans, as a people, have been better at building great projects than at maintaining them. On the other hand, the past forty years have brought new strength to movements for historic preservation and conservation; if Americans ever became good housekeepers, that would be a change.

Self-righteousness has been a persistent American response to change. Fundamentalism revives as popular music grows noisier and corporations grow bigger; the 1990s are like the 1920s. Right now a lot of self-righteousness is directed against women. Yet my hunch is that the changing role of women, which has affected our intimate relations and our expectations of strangers, may be the most important change in the last forty years. Television—its ubiquitous influence is another significant change—helps by showing old movies. The films of 1954 put women in roles that now seem quaint. The change may be a reason that we hear the complaint that there are no roles for older actresses. The ditsy dowager has become less credible, and we haven’t yet figured out how to tell a story in which the judge is a grandmother. I think we will, though, because society nowadays seems to demand more intelligence and more participation by the intelligent of all ages and genders.

From my own self-centered perspective, the most extraordinary change in America over the past forty years is that I have somehow been transformed from a boy of sixteen to a man of fifty-six. I find this astonishing, and I don’t know how to account for it.


When I look at the Bigger Picture, two remarkable changes suggest themselves to me, and I have the sense that they are somehow related. First, the regionalism that was such a defining aspect of this country has been eroded beyond measure. When you drove across the country in 1954, bouncing along on bad roads, risking ptomaine in dubious diners, holing up nights in roadside cabins and tourist courts, you were rewarded with a constant change of scene that amounted to more than a change of landscape. There were no chain restaurants, no franchised muffler-repair shops, and even the brands of beer and gasoline were apt to change when you crossed a couple of state lines. Nowadays you take the Eisenhower administration’s most enduring legacy, the interstate highway system, and eat at Burger Kings and sleep in Days Inns, and when the scenery palls, you duck into a mall, walk past thirty franchised shops, and catch a movie at the fourplex theater. Even the local accents have softened, weathered away by forty more years of national television. We have become more nearly a single nation than we used to be.

And at the same time, the complexion of this nation is infinitely more varied than it was forty years ago. America was peopled by persons of Northern European stock. Most had been here for many generations. Immigration had slowed to a trickle, and the more recent arrivals were also European—Irish and Italians and Greeks and Armenians, and refugees from what we were still calling war-torn Europe. There were fewer blacks, and they were far less visible, found mostly in the largest Northern cities and the rural South. There were a few Mexicans in the Southwest, a handful of French Canadians in New England.

And now? Nearly a third of the population of my own city, New York, is foreign-born; immigrants are arriving in the same numbers as they were a hundred years ago. And the new immigrants come from every continent but Antarctica. You see it most vividly on both coasts, but it’s just as true in the heartland, where it’s more apt to surprise you—the Indian family operating a motel in rural Mississippi, the cluster of Vietnamese restaurants in Denver, the Hmong craftsmen in Minnesota and Wisconsin.

All changed, changed utterly. Or, in another light, not changed at all. America has spent the past forty years evolving, becoming more completely what it has been from its beginnings. It has taken one more step (or a series of steps, or a glide) in that ceaseless process called Self-Realization.

Recently I attended a reunion of the families of my youth, working-class families that in the early and mid-fifties had converged on a remote site along the Missouri River in South Dakota, formed a community, and built an enormous dam. It was a joyous get-together because it recalled such good times. Our parents came of age in the Great Depression and helped win World War II, and here on the South Dakota prairie they had been involved in a monumental project that paid good wages. It was for us, their children, an age of promise. Most of us lived in intact, nuclear families. Many of us were the first members of our families to attend college.

As I stood before my many friends from forty years earlier, I felt an enormous pride in their achievements. Many returned with advanced degrees and worldly experiences well beyond the reach of our parents. Those who continued the working-class tradition of their families reflected the long, rising curve of prosperity and material comforts unimagined in our youth.

There were other changes. Their daughters have ambitions well beyond marriage and childbearing. Almost no one at the reunion smoked, and forty years ago nearly everyone did. Now physical fitness is a way of life, and then it was eccentric behavior. Overall, however, our gathering was a tribute to a rare time in America when the can-do spirit prevailed, when the nation was an economic and political colossus unchallenged in the world arena, when all seemed possible and we were the direct beneficiaries.

Of course, we were all white.

The other America—African-American, Native American, Latino, Asian—the people of color, whatever color, were invisible to most of white America forty years ago, except when they were celebrated as entertainers or athletes. Forty years ago the face of the American political, economic, and cultural establishment was white only.

Four decades later all the primary colors are vivid and visible in the mosaic of America. We still have too far to go in resolving our complex feeling about race, but the fact of a multiracial society is no longer denied. I can think of no greater or more welcome change in forty years.

Beginning with simple servomechanisms that came out of World War II, followed by rapid advances in electronics through microcircuits and silicon chips, overwhelming changes have been brought about since 1954 in the way Americans live and think and view the world.

The key word is control. Our living spaces from bedrooms to bathrooms and kitchens are controlled by automatic devices that did not exist forty years ago. Our automobiles are no longer entirely controlled by the drivers but by numerous electronic thingamabobs. Office and professional workers are never out of the control of computers and ever-changing modes of communication. Writers and publishers are controlled by word processors. Patients and physicians in hospitals are controlled by electronic machines totally incomprehensible to the average human being. Computerized equipment has replaced the card catalogs of our libraries, speeding the use of, but homogenizing, information and our preserved folk culture. Intensive use of pictures and spoken words to instruct and entertain has started a decline in the exactness of the language in recording complex and abstract ideas.

It is too early to determine the full effect of these man-made controls upon American society, but thus far we appear to be more isolated from one another than we were.

I hate to say it, but feel the need to acknowledge the quiet triumph of secularism in the past thirty years. This is not an invitation to admire my sibylline powers, but in fact, I did venture in my first book that it was unreasonable to expect that there wouldn’t be consequences from the assault in the university on religious faith. In my day lusty agnostics would on the least invitation happily engage in trench warfare against Christianity. Okay. But it is worse now, or such is my reading. The evangelists of agnosticism no longer feel the need to move their armies against what, in their judgment, is nowadays only a derelict defense force. It isn’t that, of course, but the indifference to religion, reflected in the life of the university, is a development of great social consequence. If one listens, for instance as I recently did on the relevant anniversary, to the message by FDR when he communicated to the American people that D-day had happened and the reconquest of Europe was in prospect, one is starkly reminded of how our leaders then addressed us. “Almighty God: Our sons, pride of our nation, this day have set upon a mighty endeavor. . . . They will need Thy blessings . . . we know that by Thy grace, and by the righteousness of our cause, our sons will triumph. . . . Thy will be done, Almighty God.” That was common currency from the aristocratic and the ruling classes, in FDR’s case conjoined. He spoke language that suggested the ultimate dimension of the human experience, and this was the foundation of American idealism, liberty under God, as we came to phrase it. I cannot imagine a modern President speaking so, even though the incumbent and his predecessors are Christians by formal understanding. What all of it means is that the great regulator of days gone by is no longer vibrant, and the consequences hardly need to be enumerated. In other ages it was all there: crime, libertinism, self-centeredness, infidelity. 
But it was viewed as departure from the correct standards. Now we get such as the Surgeon General, whose answer to the question Is it wrong to conceive out of wedlock? was “No. Everyone has different moral standards.” That would include Pol Pot.

Forty years ago, a medical student, I could walk the streets of various American cities without great anxiety. I well remember leaving my college dorm room unlocked all the time, and similarly, my medical-school dorm room. Now fear (of robbers, of injury, even of death) hovers over many of us who walk city streets, drive city roads—black and white, well-to-do and poor: a significant and melancholy turn of events.

The awkward fact that is overlooked in much reporting and commentary is that the DNA that created America has suffered a mutation. It can no longer be assumed that immigrants want to be American. The 1965 Immigration Act, thought to be a modest reform, has produced a flood-tide of immigrants who will not assimilate as easily as earlier generations, many of them because they do not want to assimilate at all, and who claim rights not as Americans but as ethnic and linguistic minorities. The very identity of America is being challenged, in its political faith, its history, and its culture. A vague sentimentality about immigration and a fear of being accused of “racism” have suppressed proper discussion of this crucial issue.

The greatest change in America since the birth of American Heritage was the placing of electric motors inside typewriters. Sure, IBM did it sometime earlier, but by the mid-1950s the others were doing it, and it became possible for college and university teachers to participate in this revolution in typewriter technology. I remember the first one I bought, a boxy Underwood with a slightly futuristic decor, but inside was that mighty motor, and this meant that after most of a day of talking to students and listening to students it was possible to try to write something without feeling as if one were rowing a boat. There are people like David McCullough who have never appreciated this typewriter revolution and insist on using Remington portables circa 1941. David tests his machines by (so he says) throwing them against a wall, whereupon they simply bounce off, proving they are made of steel. He would never buy a machine with a motor, he says. But he likes to live in the typewriter Dark Ages.

There have been changes in machines since the 1950s, notably the introduction of electronics (31 moving parts versus 247, which means that they last only until their circuit boards break). Word processors were the next step, which I refused to take (spend my day looking at a screen?). But the typewriter motor was the principal change of our time.


As for what it says about us as a people, it says that we are a lot less tired than we used to be.

The most important change in this country in the last forty years took place in 1973, when the upheaval of Watergate triggered a shift from presidential government to congressional government.

The term congressional government was coined by a young history scholar named Woodrow Wilson as the title of his first book, an analysis of the way Congress seized control of the nation after the near impeachment of President Andrew Johnson in 1868. Wilson foresaw the danger of an aggrandizing legislature. “In proportion as you give it power it will inquire into everything, settle everything, meddle in everything.”

The ultimate danger, Wilson feared, was legislative tyranny, which would be a despotism far worse than that of a dictator. Congress could become a despot “who has unlimited time—who has unlimited vanity—who has or believes he has unlimited comprehension.”

Wilson put his astute finger on the fatal flaw in congressional government: “Nobody stands to sponsor the policy of government. A dozen men originate it; a dozen compromises twist and alter it; a dozen officers whose names are scarcely known outside Washington put it into execution.” The result is massive alienation among the citizenry.

In the nineteenth century—and in our own era—another by-product of congressional government has been massive corruption, corruption so pervasive Washington insiders no longer even recognize it. Mark Twain summed it up in his era when he said the United States had no distinctly criminal class “except Congress.” The modern Congress, bloated with perks and PACs, the creature of lobbyists and pressure groups, suggests a similar conclusion.

What was Wilson’s answer to congressional government? A strong Presidency. “The President is at liberty, both in law and in conscience, to be as big a man as he can.” Beginning with Theodore Roosevelt, twentieth-century Presidents restored the balance between the two branches. For present and future Presidents to regain this authority, the modern Congress will have to give up some of its power. This will require an epic political battle. But it must be fought—and won—as soon as possible.

Today’s curious tolerance of forked-tongue corporate labels for special-interest outfits and privately financed think tanks has long been overlooked. To define the gimmick by opposites: The venerable “Society for the Prevention of Cruelty to Animals” is irreproachable. “League of Women Voters” and “U.S.English” are candid enough. But even “Common Cause”—much as I respect most of what it does—is a bit slippery. That could just as well apply to a junta concocted by Jerry Falwell, Pat Robertson, et al. For more flagrant examples, consider “American Enterprise Institute” and “Center for National Policy Studies”—masterpieces of camouflage-ambiguity recently rivaled by “United We Stand.”

Not that this letterhead shell game is anything new. A century ago things like “Non-Partisan League” occasionally surfaced. But for every such creation in great-grandpa’s time hundreds are now churning out smothering masses of direct-mail come-ons, TV pitches, and “position papers.” My current favorite, succeeding “National Organization for Women,” is “Concerned Women for America”—narrowly red-hot against abortion and for school prayer.

Such hidden-ball tricks apparently work so well that we must expect more and better ones. The note in italics at the tail of the op-ed piece will say, “Dr. Pundit is a Fellow of the Social Responsibility Council,” but not that its fat-cat angel used to help finance the John Birch Society. Madison Avenue’s “Ad-Liberty League” will make awards for untrammeled creativity, yes, but its undercover mission will be gradually to soften up viewers for hard-core pornography in commercials . . .

Re change in America since 1954: I would have no doubt. It is the wonderfully greater sense of moral purpose with which the affluent and the comfortable (of whom, of course, I am one) defend their well-being and specifically their income against the claims of the unfortunate and the deprived. This holds for money and especially for leisure, and in a degree that might astonish even Thorstein Veblen. How good leisure for the fortunate, how depraved for the welfare mother.

The most interesting change in my life since 1954 has been the development of the computer. Of course we all have been affected by computers in hundreds of ways, but for me the computer’s most striking impact has been its effect on my typing. All my life I have been a two-finger typist. I know pretty well where the different letters are, and I can hit the keys at a reasonably rapid pace, but no matter how hard I concentrate I make lots and lots of typos. I do not believe I have ever typed an entire double-spaced page without hitting at least one wrong key. I was not even able to use an electric typewriter because when I tried, I could never get through a paragraph without inadvertently nicking the shift lock with the pinkie of my left hand when striking the a with my typing finger. The result, of course, would be several lines of capital letters unnoticed because of my slavish concentration on the keyboard in my futile effort to avoid errors.

Over the years I spent large sums hiring professional typists to produce “clean” copies of my messy manuscripts and equally large amounts of time checking over the typists’ work to make sure they had not left out anything or gotten a date, a page number in a footnote, or some other number wrong. In doing this, nine times out of ten I decided to change something that was typed just the way I wrote it. That meant more professional typing. Sooner or later I would have to give up and accept what the typist produced as final copy.

Today I am probably an even more inaccurate typist than I was back in 1954. But thanks to my trusty spelling checker, on paper I now look flawless.

I think the most important way America has changed since 1954 is that we are no longer a United States, but rather a crazy quilt of special interests in conflict.

How has America changed most since 1954? We are less likely to think of ourselves as a single people. Forty years ago it was more or less clear what it meant to “be an American”—now, who has any idea at all? Indeed, paradoxically, the very question seems now somewhat un-American. So enamored are we of diversity, of multiculturalism, of the sense of ourselves as many rather than one, that merely trying to define what an American is today is itself un-American.

This is more than a matter of shifting metaphors, replacing the melting pot with the mosaic, though that is part of it. The melting pot was always too innocent to be true; this country was always more of a mosaic, even when it pretended to the homogeneity of the melting pot. But the earnestness contained within that sugary metaphor, fairy tale though it may have been, served us well for generations, and was not nearly as reactionary a force as we now assume it to have been. The eagerness to think of ourselves as one people provided a kind of spiritual underpinning to such powerful forces for social change as the creation of social security in the 1930s or the civil rights battles of the 1960s.

Now, the dream of homogeneity lies shattered. We no longer aspire to it as a people; we no longer believe it has any connection to the hope of a better life. We are less innocent, vastly less trusting—and unable, at least so far, to find a way to bring out of our current infatuation with confrontation and differences any kind of vision of harmony and wholeness. It is not that we were once homogeneous and now we are not: it is that once, despite all our differences, we had a sense of common purpose.

My father spent the winter of 1954 struggling to overcome sentiment with reason. He was trying to abandon a lifelong allegiance to the New York Yankees and become a fan of the previously hated Brooklyn Dodgers (soon to be the Los Angeles Dodgers, a mobility that could supply another appropriate story for this issue). My father, a committed civil libertarian and political activist, was appalled that the Yanks had not yet hired a black player (Jackie Robinson had broken the color barrier with the Dodgers in 1947). He succeeded in his struggle, and the Yanks brought up their first black player, Elston Howard, in 1955. But too late for my dad, who remained a Dodger fan for the rest of his life.

We are still so far from obtaining full racial justice that we forget all too easily how immensely much has been accomplished in the past forty years. I well remember, as a child in one of the nation’s most diverse and liberal constituencies, my shock on the very few occasions that I saw an interracial couple walking down a Manhattan street. Black people never appeared as ordinary human beings in advertisements, but only as the risible stereotypes of fearful, bug-eyed sidekicks like Rochester (for Jack Benny) or Birmingham (for Charlie Chan) or as fat, happy cooks like Aunt Jemima. In my high school, integrated de jure, I met black students only in chorus and gym. William Faulkner said that integration, only a tiny first step to justice, would take generations, for minds cannot be changed by force. But minds are changed by living—the rationale for required integration—and my son not only feels no discomfort with people of any color or shape but simply cannot fathom why anyone ever would.

Race is a surrogate for all other bases of false and cruel separation and denigration: gender, religion, national origin. Therefore our move to greater ease and geniality on this front mirrors a growing acceptance of differences in all areas and marks the most salutary change in American social life during the past forty years.

My professions of evolutionary biology and paleontology have, during this same forty-year interval, discovered the basis for the striking similarities that unite all human peoples. Genetic differences among so-called races are trivial (the evolutionary finding), based on the surprising recency of common ancestry, about two hundred thousand years (the paleontological finding), for all modern humans. For once a cliché turns out to be literally true: Our differences are only skin deep.

The United States, decrying the population explosion in the so-called Third World, itself became overpopulated in the last forty years, with nary a mention of that pejorative word in the media or in the halls of government. From 1950 to 1990 the population increased by almost a hundred million souls—more than three times the nation’s inhabitants on the eve of the Civil War. The effects have been pervasive. To cite only a few: It has eroded the vaunted dignity of the individual by forcing us to use numbers rather than names as identifiers in the records of practically every institution of society; it has significantly encouraged illegal immigration because the crowded-up areas of the country more readily than ever before can slip the harness of government; it has notably accelerated the rise of crime by nourishing it with the important advantage of easy anonymity for its perpetrators. What all this says about us as a people is that in making a fetish of the idea that bigger is better, we have shortsightedly faced away from the protection of national self-interest.

The most decisive and deleterious change has been the shaming and decline of the American liberal tradition.

The range of speculation on the various topics you propose is enormous, and high on the list of major events for America must surely be the dissolution of the Soviet Union and the emancipation of its satellites. Nevertheless, it may be that the sudden appearance and spread of AIDS is among the greatest dangers and disasters of modern times. A young, sexually active generation had only just begun to celebrate a new and liberating mode of life—with pop-culture heroes, musicals like Hair, an astonishing freedom from old inhibitions and hang-ups—when the wild celebration of the Aquarius Revolution was brought to a dead halt. My generation had sometimes been shy, nervous, and timid about inaugurating sex (Philip Larkin’s “Annus Mirabilis” seemed to sum up our situation:

Sexual intercourse began
In nineteen sixty-three
(Which was rather late for me)—
Between the end of the Chatterley ban
And the Beatles’ first LP.
Up until then there’d only been
A sort of bargaining,
A wrangle for a ring,
A shame that started at sixteen
And spread to everything.

And we fumbled about awkwardly, making many grave mistakes and often feeling suitably foolish, if not worse.) But nothing that assailed us could match the terrible plague that now presents a continuing terror to any sexually active person, male or female, straight or gay. All that cheerful abandon has been lost and has been replaced by a constant anxiety. This blight upon youthful instincts is no providential retribution for unwarranted license. AIDS afflicts chastely married couples, their children, innocent recipients of blood transfusions. And it seems to have settled down for a long stay.

Revolutions come and go. Even the Russian Revolution has finally, mercifully, gone. But one revolution, I suspect, will remain with us for the foreseeable future: the sexual revolution.

In 1954 the illegitimacy ratio (the proportion of out-of-wedlock births to total births) was less than 4.5 percent. In 1991 (the last year for which we have definitive statistics), it was 29.5 percent; today it is certainly above 30 percent—a sevenfold increase in less than half a century. The white ratio rose from almost 2 percent to 22 percent; the black from almost 20 percent to 68 percent. In 1964, when Daniel Patrick Moynihan wrote his report about the breakdown of the black family, the black illegitimacy ratio was 24.5 percent; today the white illegitimacy ratio is 22 percent. For poor whites (below the official poverty line) the ratio is 44 percent. In 1990 one in ten teen-age girls got pregnant.

The illegitimacy figures are only (a very large “only”) the tip of the iceberg. They have to be seen in conjunction with such other factors as the doubling of the divorce rate, the quintupling of the cohabitation rate, and a vast increase in the number of “sexually active” (as the euphemism has it) teen-agers. (In 1970, 5 percent of fifteen-year-old girls had had sexual intercourse; in 1988, 25 percent had.) And looming over all these statistics is the well-documented correlation between single-parent families and welfare, crime, juvenile delinquency, drug abuse, school dropouts, illiteracy, and the rest of the familiar syndrome known as “social pathology.”

There is no question but that we are experiencing a sexual revolution that is nothing less than a major social revolution.

It was sometime back in 1954 when a distinct change came over the United States. It happened after the uprising of our North Korean prisoners, when scores who were thought by the hard-liners to be getting soft were executed by their fellow prisoners. The American press and general public were horrified that prisoners could be so dedicated to a lost cause that they would actually kill so many of their “brothers.”

I remember at the time thinking that there was nothing odd about the patriotic Koreans’ reactions, only the reaction of the American press. It seemed to me at that moment our nation had lost part of its will.

After the Korean War, gradually and imperceptibly, our classic American resolve, our vitality, our willingness to look boldly at the blackness of the human soul wilted away. Since then we seem to have succumbed to a collective gnashing of teeth or forgiveness sessions over every ill at home or abroad. We appear to have cast aside our common sense and exaggerated most solvable problems, making them all but insoluble. We have become a nation of brooders. We may have lost our national soul and fiber in the process.

Many of the men in my years at college enlisted in the Marine Corps convinced that our luck in having been educated meant that we had to repay society by becoming officers. Some of my colleagues were killed within minutes of coming on the line in Korea, others were crippled—the most gifted is still a mental vegetable—and one was wounded and buried by mortar shells behind the Chinese lines and lives today because several members of his platoon were wounded in retrieving what was thought would be his corpse.

Those of us who survived were profoundly saddened by the deaths and cripplings of our fellow men, but we didn’t dwell on it. We all had volunteered.

To me, the most overlooked and underreported way our country has changed is that since 1954 we have collectively bankrupted our nation’s toughness. We have become soft, sentimental, fat, complacent, too rich, monstrously selfish, cynical, disgustingly decadent, dedicatedly hypocritical—and, worst of all, perpetual whiners.

During the past forty years the country has moved out of the machine age into the information age. The insistence of many scholars and journalists that this transition was taking place became a self-fulfilling prophecy. Americans began to think and speak in information-age metaphors, and these metaphors now influence their activities. They send messages traveling in bits at the speed of light, not in words via snail mail. Instead of being caused by circumstances, events seem to arise from network interactions too complex to analyze. People meet by interface with computers instead of face-to-face with other persons. Information moves as bits along circuits instead of as components along assembly lines. No longer do mechanical engineers insist on uniformity; instead computer engineers celebrate heterogeneous networks. Local places with all their contingent characteristics give way to universal spaces sustained by electronic webs. These transformations are not to be explained by technological determinism. They result from countless Americans’ choosing to have affairs with computers rather than automobiles and restlessly exploring information networks instead of earthbound highways. Only an optimist would insist, however, that Internet surfers today are more fulfilled than Sunday drivers yesterday.

In the last decades there have been marked overturns of settled traditions and habits in American life—the growing power of women, the entry of many African-Americans into middle-class society, the openness of homosexuality, the national conflict over abortion, the nervous emphasis on health, sensible eating habits, and cancer, the anti-smoking crusade, the growing intrusiveness of the media, the replacement of learning in the universities by political correctness, the centrality of the computer.

But to my mind the most decisive and deleterious change has been the shaming and decline of the American liberal tradition—involving the increasing power and intolerance of fundamentalism in politics, the hollowness of public piety, the contempt for rational argument, the stress on matters of sexual conduct rather than on Christian love and charity. This reaction, led by ex-liberals from Reagan down and ex-radicals (I could name a hundred) in company with self-profiting evangelists, has produced a poisonously reactionary temper and as great a contempt for traditional give-and-take in politics as for the poor, the unemployed, and the homeless. The image of America I grew up with—as the last great hope of earth—has shrunk to the point where patriotism is understandably suspect. That is as great and harmful a phenomenon as the fact that, unlike Europeans, Americans run to fat.

There is less respect for human life—a hopelessness that devalues everything. This, combined with the barrage of violence on TV and in movies, desensitizes our children and glamorizes violence.

People are more cynical about government.

The family is evolving, with women in the work force and men taking on more parenting responsibility.

Divorce has become commonplace, and the stigma attached to unwed motherhood has disappeared, creating an underclass of fatherless children and poverty-stricken families.

Society is transient; more people are moving more frequently. There is no sense of permanence or obligation to community.

“Moral relativity” has impeded the teaching of values and ethics to our children. Society expects the schools to handle the upbringing of our children. Schools emphasize “self-esteem” at the expense of personal responsibility and community obligation.

Life is moving at a quicker pace. Computerization has enabled information to be transmitted in the blink of an eye around the world. Society has become technologically sophisticated—cellular phones, fax machines, computers, VCRs, et cetera, et cetera.

Democracy has replaced communism in Germany and Russia and signaled a new era in South Africa.

Polio has been eliminated, but AIDS has appeared.

We have become more sophisticated politically and are asking more questions. We sue doctors for malpractice, sue the police for incompetence, and sue bosses for sexual harassment. The lawyers are cleaning up, but their profession no longer enjoys the respect it once had.

Mechanized outdoor recreationists have discovered the spaciousness of the American West and are joyously cutting it down to democratic size; everyone can handle the mountains now. Four-wheel-drive vehicles and their waspy little cousins, all-terrain go-carts, clank, grind, spin their wheels, and lunge from one scenic overlook to the next. But are the occupants looking? Or just relishing the thrill of the gasoline thrusts that got them there?

Fat-tire mountain bikes startle Vibram-soled hikers, who in turn resent stepping into cow pies on trails that exist because cattle long ago put them there. The swaths that slice the steep forests of the Rockies used to be caused by avalanches; now most of the plunging pathways are torn out of the land for skiers. In no part of the Grand Canyon are you beyond the noise of aircraft loaded with effort-free tourists. And speaking of the Grand Canyon, each summer day something like six thousand cars and waddling motor homes compete for sixteen hundred parking spaces.

This pouring of people into our once wide-open spaces and their building of second homes and condos and service units (industrial tourism, Ed Abbey called the whole business) certainly seem to have created a new outdoor ambience. But aren’t the developments driven by the same exploitive urges that led—still lead—to open-pit mines, clear-cut forests, and eroded riverine habitats? People do whatever their expanding technologies enable them to do, and they do it with unflagging Yankee zest. I doubt that the urge will ever change very much.

The most interesting turn that American culture has taken between 1954 and 1994, I would argue, is the change from one in which citizens, often unwittingly, employed models of convergence, conformity, and consensus to the opposite, which features divergence, difference, and dissension, if not conflict.

While revisionists are finding undercurrents that will lead to more complex pictures of the mid-century years, I doubt whether they will be able to change the inherited framing images completely. Having had to paper over their differences to win a war and prosecute a cold war; having seen an impulse on the part of many, especially ex-GIs and their families, to settle down and be more or less like one another; having experienced the setting of cultural tones by a set of people whose ancestors had done it for centuries, the citizenry seemed to welcome centripetal impulses.

I think of the formation in those times of the United Nations, the World Federalists, the World Council of Churches; of “The Family of Man” photographic sequence; of Will Herberg’s famed Protestant-Catholic-Jew model of three-way American Way of Life religion that turned out to be one-way; of “under God” inserted before “indivisible” in the Pledge of Allegiance (in 1954, to be precise); of ecumenism and interfaith and interracial and integration as strivings.

Between then and now—let’s play with the summer of 1965 as a turning point, when troops were sent to Vietnam and Watts burned—there was a remarkable turn of the centrifugal. The movements—racial separation, feminist particularity, Afro-Native-Euro-Hispanic-Asian hyphenations of “American,” straight-versus-gay sexual differentiations, ideological polarizations, new sectarianisms, and other elements that produced what is now called multiculturalism—came to dominate.

There may be signs of backlash in 1994 against too much particularism and exclusivism. There are signals of widespread hunger for some return of national community and of concern for the common good. But meanwhile, I would still insist, the change of direction after 1965 created a vortex that consumes a great deal of citizen energy in matters secular and sacred and will provide subject matter for historians in decades ahead.

I do not know whether the change I have noted is the most important or most interesting change that has occurred in the last forty years in America, but it certainly is one of the most overlooked—namely, the change in the functions and relationships of the major institutions of the federal government. The Senate, beginning in the period of the leadership of that body by Lyndon Johnson, was changed into a kind of advanced House of Representatives, with emphasis on committee meetings and roll calls, discouragement of Senate debate, and inattention to its primary responsibilities for defense and foreign policy. In the same period, the years of the Vietnam War, the House of Representatives was encouraged to become affirmatively active in foreign and military policy, being asked and encouraged to act as equal to the Senate in these areas through the passage of resolutions and by placing riders on both authorization and appropriation bills. Reforms and reorganization of the House and of the Senate have significantly reduced the powers of both bodies, and also their responsibilities, leaving them more subject to bureaucratic and presidential domination (as in the current budget process). State and federal distinctions have been confused; Clinton, in this mode, ran for governor of the United States. The executive branch has usurped legislative functions. The courts have become executives—in running schools, savings and loans, and communications. And the Congress gradually becomes a judicial review agency.

African-Americans can use the bathrooms in any standard establishment along all of America’s major roads. This may sound like a flippant response, but it took the achieving of just so basic a right as this to begin building the still-incomplete record of other, loftier rights gained.

The recent change in American life that has been most surprising to me and most significant in our national life has been the radical revision of the time-honored relationships between the sexes. The wild changes in patterns of courtship and marriage have left me gasping. I am a strong supporter of women’s liberation and have written favorably about it, but being a man, I also have to consider the powerful effect these changes have had on American males. I grew up believing that boys should compliment girls and exchange the banter of adolescence. Now what I did is called sexual harassment, and as an adult who works with many women, I have to be constantly on guard lest I say something in friendship that could be construed as harassment.

Even so, I am glad to see women attain power in American life. I have a woman editor, a woman lawyer, a woman business counselor, three brilliant women assistants, and women in all other aspects of my life. I much prefer the present systems of courtship to the stupid, rigorous patterns to which I had to conform when I was young, but I am distressed to see the number of fine young men I know and teach who have opted not to marry because the new rules are so poorly defined and often so unfair to the fumbling husband. Friends tell me: “Jim, you’re too old. The young men coming along will be educated differently, and all will balance out.” I hope they prove to be right.

Four decades ago the United States was given the opportunity to vote for a presidential candidate who could follow in the tradition of Franklin D. Roosevelt, the one President in this century who led the nation in peace and war and set the moral compass for a majority of Americans.

In 1952 and again in 1956, Adlai E. Stevenson, the former governor of Illinois, was nominated by the Democratic party for President. He was defeated twice by General and then President Dwight D. Eisenhower. This was the greatest missed opportunity in our time for changing the course of American and world history.

During the Eisenhower years the Cold War continued and expanded, with the activist Secretary of State John Foster Dulles talking ceaselessly about the American duty to protect the “free world” from communism. It led to the most unpopular war and the greatest defeat for such a concept—the Vietnam War. Eisenhower’s doctrine of “falling dominoes” was continued by Presidents Kennedy and Johnson and then by Nixon and Kissinger; all are responsible for the names on the Vietnam Veterans Memorial. It must be remembered that Vice President Nixon was Eisenhower’s choice, which led to the Nixon Presidency and disgrace.

Of course, it was near impossible to defeat General Eisenhower; similarly, General Grant won on the coattails of war. Neither goes down in history for his legacy in the White House.

Would President Stevenson have made a difference? I strongly believe so. Do not judge him by his role as U.N. ambassador (though no one in that post was as much respected, before or since). President Kennedy throttled him and did not have the wisdom to make him Secretary of State. Judge him by his role as governor of Illinois, where his hard work and record shone. Domestically he had been one of the bright young lawyers who served in the New Deal; he might well have continued its ideas of helping the “other Americans”—the one-third of a nation that has not entered the “middle class,” with all that that signifies. And on the foreign front, Stevenson, who helped write the founding documents of the United Nations, would have had a different vision of the world. He was not a cold warrior, not a hawk, and the Vietnam War and the games of war that followed might not have damaged the United States in the eyes of the world and its own people.

I think the most important development of the past forty years for Americans has been the rise, both at home and abroad, of an aggressive, ethnic, religious, and racial consciousness, eventuating in demands for special rights, privileges, or simply official recognition for particular groups. Abroad this feeling has resulted in civil wars and the emergence of new states. At home it has been contained within the normal framework of politics but has transformed and in some ways transcended the collective identity of Americans as a people.

The most overlooked change in America started on December 23, 1947, when, at Bell Labs, William Shockley, Walter Brattain, and John Bardeen—three names unknown to the public—put a split gold-foil-covered chip of plastic, about an inch on a side, with two wires crudely soldered to it, onto a small slab of germanium. It was the world’s first transistor.

By 1953 transistors, at $6.00 a pop, had gone into the first commercial device, a hearing aid, and two years later (down to $2.50) they went into the first “solid-state” portable radio receiver.

That was the start of the high-tech revolution; by now millions of transistors can be built into a postage-stamp-size chip. We are now awash in a sea of calculators, personal computers (and their myriad peripherals), software, data storage banks (and the information superhighway), cable TV, VCRs, remotes, CD players, fancy telephone services, fax machines, copiers, camcorders, and a thousand other devices that trace back to December 23, 1947.

We’ve gone through wars and turbulent times in the last forty years; the political, social, and economic scenes have been in constant and increasing flux. But that’s been true throughout recorded history; only rarely do watershed breakthroughs come: the Renaissance, the Reformation, the Industrial Revolution. To these, add the high-tech revolution, the transistor era, which has changed every nation on earth, taken us on our first steps into space, and transformed every aspect of war and peace. A millennium down the pike, when two world wars and the rise and fall of the Soviet Union have faded into an obscure historical limbo, one with the fall of Troy and the Crusades, December 23, 1947, may well be the best-remembered date in the twentieth century.

William Shockley, Walter Brattain, and John Bardeen deserve a bronze statue, which they won’t get in our time.

I think the most interesting change in America since 1954 is the way in which attitudes about “life station” have evolved. When I was born, in 1947, the American dream was essentially defined in terms of the capacity of white males to challenge the social and economic class into which they had been born and to participate in a fluid class structure that was based on notions of a meritocracy. Women, blacks, the disabled, and gays and lesbians were for all intents and purposes “invisible people” and were considered the “exceptions” to the American dream. That is no longer the case. The combined effects of the “movements” for civil rights, for gender equity, for freedom of choice for abortion, for disability rights, and for gay rights all have altered irrevocably the notion of “station” and have given new meaning and breadth to the parameters of the “American dream.” No longer are some Americans consigned to limitations on the basis of the circumstances of birth; today the notion of a meritocracy is more inclusive than it was in 1954. This development has far-ranging consequences, reflected in the work force and otherwise, but it may well represent the most fundamental redefinition of American life of this century.

What is to be made of the fact that so little has been made of the fact that between 1954 and 1994 the cities listed in the next paragraph, plus numerous smaller municipalities and townships, have elected and re-elected “black” mayors, many of whom have been succeeded by other “black” mayors? And hardly any have been replaced by an all too obvious white backlash.

There is, to be sure, much to be said for taking such incredibly revolutionary changes in stride as if they were not really remarkable at all but rather only a normal eventuality in the open society that the United States is at last becoming. Just look at how casually sports fandom, that great representative cross section of national attitudes, has come to accept the superstar status (supersalaries, windfall endorsement contracts, and all) of an undeniably impressive number of their “black” fellow countrymen in baseball, pro football, and basketball during the past twenty-five years.

Perhaps there is something to be said for the benign neglect that Daniel Moynihan had in mind after all. It is certainly preferable to the crocodile tears of condescending do-gooders like one Andrew Hacker, whose book Two Nations is an exasperatingly obvious example of how American social science survey “findings” function as the folklore of white supremacy.

According to Hacker, race relations in the United States seem to be as god-awful as ever—if not worse—since in Hacker’s view self-improvement in conduct and proficiency has not made “black” U.S. citizens more acceptable to “white” folks. On the other hand, Hacker explains away the election of all those black mayors by saying that “black candidates who gain white support come from middle class backgrounds and display middle class demeanor.” So do most white candidates from upper- and lower-class backgrounds alike. Moreover, lower-class voters obviously prefer middle-class efficiency to upper-class “classiness” or lower-class anything. Middle-class efficiency with the common touch is the ticket in American politics. After all, aren’t politicians elected to improve things? Black politicians certainly are. So what the hell is Hacker implying? Lower-class people who are content to remain lower-class don’t vote. Nor did black Americans fight against school segregation in order to remain lower-class. Could it be that the professor doesn’t know that revolution is a middle-class, not a lower-class, thing?

Two nations black and white, separate, hostile, unequal, Hacker proclaims with his title. Two nations? Only two? What about the Asians, Mexicans, Puerto Ricans, Cubans, and other not very white U.S. citizens from Latin America and elsewhere? And as of this morning the Anti-Defamation League was still very much in the business of fighting domestic anti-Semitism. Nor does it become anything other than American by doing so. Nor have the passports held by black Americans become null and void during the last twenty-five years.

One thing that does not seem to have changed very much during the last twenty-five years is the pessimism of white academic experts on black prospects. Two nations? Who thinks the Democratic party would have been better off without Ron Brown as chairman during the last election? Probably not President Clinton! Two nations?

As one of the first women to serve in the U.S. Senate, I believe that one of the most important changes in the last forty years is the increased number of women who are running for and getting elected to public office. The struggle to get women into elected positions is by no means over, but with seven women in the U.S. Senate and forty-eight in the U.S. House of Representatives, the face of Congress has undoubtedly changed since 1954.

It has not been an easy road for those of us who are already here. We have had to struggle against stereotypes, against those who said that women belonged at home or were not capable of doing the same kind of work as their male colleagues. And now that we have proved able to legislate equally well, we still, like all working women, face the extra demand of being wives and mothers. But at the same time, we are making changes—changes that are deeper than just our bright-colored clothing among the dark suits. As women we bring a unique perspective to the kinds of challenges that our country faces right now, from welfare reform to health care to youth violence. We are making our voices heard, and happily we have begun to receive a good deal of respect from our male colleagues. Although there may be only a handful of women in office now, we have proved that together we can accomplish quite a bit.

Women are finding their way into the political system in ever-increasing numbers. It is not only the U.S. Congress that has seen this kind of change. In fact, as you look at state and local government, you find an even greater number of women holding elected positions. And though we may still be a minority among the sea of dark suits, I fully expect that the next forty years will show an even greater increase in the role women play in our government.

The changes that have swept America the past forty years, from the fruits of the civil rights movement to the cultural changes wrought by the women’s rights movement, have been dramatic indeed. It is hard to think of any other society in the history of the world that has been so profoundly changed in so short a time. But I’ll give you a small change that speaks to what might be called the national personality. Americans talk more than they used to. We are now such a garrulous nation! Talking on the talk shows, on talk radio, calling in to “America’s Talking” and Bob Grant and Joy Behar, chattering away live with astrologers on public access, telling absolute strangers that our mother is a lesbian biker, that we’re having an affair with our brother-in-law. . . . There’s a whole lot of sharing going on out there, and just one of the interesting things about it is it doesn’t seem authentic—i.e., the secrets people are sharing don’t seem like real secrets but like narratives constructed to give us a claim on the national microphone. One wonders also, Who’s listening? Who is learning, being heartened, instructed, shocked? Hemingway once said: Do not confuse movement with action. We are becoming a people who confuse chatter with communication. One can endlessly explore the implications of this change in the American personality, but I’ll offer only two. One is suggested by the image of a violence-inclined Iranian mullah replying to a query regarding potential U.S. responses to an act of aggression with the words “Americans—they talk.” The other is suggested by a scene from a Bergman movie, I think Scenes From a Marriage, in which a tired, aging sophisticate, a woman heavy into her middle years, says to a psychiatrist (I’m paraphrasing), “The life around me seems flat and dry, like this table.” And the psychiatrist, without saying it, thinks: Me too. This is how more and more Americans of a certain type—sensitive, of a certain intellectual refinement, resistant to the messages of the talk culture—are feeling, or will feel.

I don’t know where it went exactly, but sometime around 1954 America lost its sense of irony. This may have been sheer carelessness on our part. Or the H-bomb may have blown it away. Or irony may have decided to walk out on its own (without leaving a note) because the country was getting too serious about itself, while at the same time becoming splintered into intense little special-interest groups, to which the intrusion of something as expansive as irony would have been intolerable. In any case, irony took a permanent hike. And its absence has made life hell for writers and stupid for discourse in general. There are no wry jokes to be gotten, since there are no wry jokes to be made. Everyone is taken at his or her word. Not that people have to mean what they say; they merely have to sound that way. Thus in politics and other demonstrations of public life we are often confronted with the delightful combination of solemnity and insincerity. (The use of delightful here is ironic, by the way.) Why does the disappearance of irony matter? Because when people lose their sense of irony, they forfeit their ability to be teased out of adamancy; thus they also lose the chance to change their minds. So everybody is right, and everybody knows it, and God has a smile on His or Her face that seems part amused, part melancholy, part angry. What would you call that?

The cancellation of stack privileges at the Library of Congress and pretty much everywhere else, the death of hitchhiking, the overspreading of the urban landscape by graffiti, all have a common root—the increase in feral behavior. Predation, and fear of predation, color and torture our public existence far more than they used to. Only the most obtuse would dare pause to tousle the hair of a cute toddler in a supermarket these days. For some of us past a certain age, the awareness of predation can be accompanied by a guilty nostalgia for the peaceful feel of triumphalist but hypocritical (it was, in fact, racist, sexist, homophobic . . .) postwar patriarchalism.

Of course we had famous people forty years ago, people we looked up at admiringly or glanced at sidelong aghast. Each led a public life within his or her own field—politics, entertainment, art, sports, whatever—but each was entitled to a private life that was deemed uninstructive to the rest of us—anyway, none of our business—unless, somehow, it brought them into a court of law. In other words, their celebrity remained closely tied to their achievements, and as a group they did not constitute an alternative reality infinitely enviable, infinitely fascinating.

Television, which in 1954 was a black-and-white and sometime thing, quickly changed all that. In the wink of history’s eye it granted us instant access to, instant intimacy with these strangers. The rest of the media had no choice but to follow if they were to remain competitive. Suddenly there were no issues except those that could be personified, symbolized, by famous people. And most of these matters were eventually discussed in terms of their proponents’ personal psychology (and, very often, their personal travails). The trivialization of our public life, our inability to see beyond the images of the moment, has been, for me, the most astonishing development of the last four decades.

And it’s getting worse. For the omnivorous media now grant spokesman or expert status to any self-anointed leader no matter how lunatic or minuscule his following. Now the call-in programs (and more recently the Internet) grant the privilege of sounding off (and showing off) to everyone. The grotesque spectacle of the O. J. Simpson case, in which the public, with its banners and its sound bites, thrusts itself into the drama, is all too typical of the way we live now. In effect the citizenry has abandoned its natural place, the Screen Extras Guild, and taken up membership in the Screen Actors Guild. And why not? Forty years of intimacy have taught them that their idols are fashioned of the commonest clay, are every bit as unwise (and tormented) as they are.

Electronic populism is the form our government now takes. Electronic anarchy is the form it will soon assume. In these circumstances traditional governance has long since lost decisiveness. It is largely a showy carapace beneath which a self-perpetuating bureaucracy, alternately impotent and imperious, administers our sacred entitlements.

The fifteen minutes of fame that Andy Warhol proposed as everyone’s ultimate entitlement remains beyond most people’s reach, perhaps. But thirty seconds as a disembodied voice on “Larry King”? That’s within everyone’s grasp. And the alternative reality becomes, for many, a virtual reality—which is, possibly, a euphemism for insanity. In another forty years representative democracy, the Age of Reason’s most inspiring dream, may well be the road kill of three hundred million sputtering, muttering, stuttering Volkscomputers chugging implacably down the information superhighway, intent on their own mindless agendas.

Ever since the fifties turned overnight into the sixties and the gray flannel suits were traded in for love beads, I’ve given up pretending to understand this surprisingly inscrutable country. The cast is huge, and the stage is small, and so much of the action takes place off camera that it’s easy to miss the point of a whole era, or conceivably lifetime.

So, obviously, someone who didn’t understand the fifties has his work cut out comparing them with the nineties, which he probably doesn’t understand either. But the electronic revolution, at least, seems solid and irreversible, if only because it gets absorbed at every stage into the landscape, where it promptly disappears. Even the Luddites of the sixties didn’t take axes to their transistor radios or stereos. And they knew, to the minute, how to get on the six-o’clock television news.

Contrariwise, all revolutions involving the sexes may be assumed to be somewhat cyclical, like hemlines, including my current favorite—to wit, the apparent triumph of Philip Wylie’s Momism—as exemplified by P.C., hung juries, and the self-esteem movement—and the simultaneous disappearance of Mom herself. That kindly, artfully fuddled, and largely mythical creature, who took ten minutes to start the car and half an hour to put on her makeup crooked, has disappeared even from jokes. Especially from jokes. “Mom” would have got all the names of minorities wrong, but there would have been no doubting her goodwill; her replacements of both sexes never get anything wrong because they wouldn’t dare to. In fact, the combination of public politeness and nastiness will soon reach upper-class English levels if we keep this up.

But this clearly seems like a stage on the way to somewhere else; indeed, we may already be there for all I know. Your computer and its descendants, on the other hand, are here to stay.

What’s new since 1954? Well, let’s see. We’ve been through a few wars since then. They ended, and I was naive enough to think that when there is no war, there is peace. Men, women, and children will not die in peace, I thought, in the great numbers that were lost in war years. I could not predict such massacres as Bosnia, Somalia, Rwanda, Haiti, AIDS, drugs, drug cartels, drive-by shootings, riots, earthquakes, floods, pollution, the disappearance of the rain forests, serial killings, extinction of wild animals, oil spills, forest fires, terrorists blowing up passenger planes and office buildings, and the weight of most people in America. There was a hopeful World War II song sung in England in the forties, “When the lights go on again all over the world.” Well, the lights are on now, and all they’ve brought is a clearer vision of how we go about finding new ways to kill ourselves. In fact, we can watch it live on CNN. I am thankful for all the progress we’ve made in medicine, in science, and in education. Television is more accessible, but not necessarily much better. Movies are better paced and have more action, but if there aren’t thirty dead bodies in the first hour, we go out for popcorn, which until recently had enough coconut oil to kill us without having to leave the theater. Government is probably no more corrupt than it was forty years ago; we just get to see and hear about it now. Murders and major crimes are now committed by schoolchildren, grade-school children at that. I wouldn’t say it’s increased since 1954; I don’t remember its ever happening in 1954. Spousal abuse is finally getting its due recognition as a crime; it used to be just a sporting event. The cost of a theater ticket, musicals only, since straight plays seem to be vanishing from the marketplace, is now the same as you’d have paid to rent a small apartment for a month in 1954. Charges of sexual harassment have put the fear of God into men these days. But the charge is also used as a weapon of threat by both men and women. If we’d had it in 1954, I would have been more careful of how I wooed my wife and possibly would not have proposed for fear it might be misconstrued. Ask me the question forty years from now. Although I’m sixty-seven now, I will still be alive in 2034, only all the parts of my body will have been replaced with the exception of my right thumb, for DNA purposes. Also, in 2034 I’ll be anxious to hear how the O. J. Simpson trial turns out.

The most striking and positive change in American life since 1954 is the accomplishment of the civil rights revolution. The political, social, and cultural changes produced by that revolution have been of immense significance, affecting the whole spectrum of human relations in this country. The core change was the dismantling of official segregation and Jim Crow voting restrictions in the South, an overt and pervasive system, buttressed by social prejudice, which most liberals despaired of changing prior to 1954. Although racism is still very much with us, it is no longer a “given” of our cultural discourse; it is a “problem.” Expressions that once were norms of polite conversation now grate like chalk on a blackboard: “That’s white of you.” And the re-examination of prejudices and discriminatory practices initiated by the civil rights movement has energized the women’s movement and gay rights and, in general, made it possible for a wide range of hitherto submerged groups and identities to seek and find their place in American public life. The larger goals of the movement—the quest for social justice—remain unfulfilled, but its achievement has indeed been revolutionary. And if the desire for social justice remains an important motive in American politics, it is largely because of the continued vitality of the civil rights movement and its heirs.

On the negative side, American political life has been demoralized by the twin failures of liberal politics in the 1960s and 1970s: the war in Vietnam and the abandonment of the Great Society. In some ways 1994 resembles 1954: There is a sense that democratic politics is no longer usable by ordinary people as an instrument for making positive changes in society; that government has become too large, too conservative, too bureaucratic, too locked up by interest groups (they used to be called power elites) to be an instrument for reform. But the passion and the capacity for action that emerged in the 1960s were in fact latent in the silences of the gray-flannel fifties. Perhaps that is true today as well.

Around 1950, American nuclear strategists decided they would designate 1954 as “the year of maximum danger.” It was supposed to be the year when the Soviet Union would attain nuclear parity with the United States. The Americans’ prediction was wrong, but no one outside Russia—and few inside—knew it. In the meantime, nuclear anxieties grew. Remember, during the fifties America’s favorite piece of lawn furniture was the fallout shelter. For three decades these anxieties hardly abated at all. There were many years of maximum danger, as it happened, but somehow the anxieties were held in check.

Exactly how America has been changed by this experience is at the moment beyond the telling, but no one should be so bold as to argue that it has left us untouched. That fact is essential and immutable, and this one too: Were it not for those years of maximum danger, America would be a far different place from what it is today. Better or worse? Best not to say. The whole story of the atomic era is not over yet.

There is a tragic counterpoint to the unquestionable progress America has made since Brown v. Board of Education. The proliferation of guns and the rise of the drug culture in urban ghettos constitute one of the most devastating events in American society, and this is a change that has taken place since the mid-1950s. The change has been insidious. Slums and crime have existed ever since cities were built, and in Newport News, Virginia, where I grew up in the 1930s and 1940s, there was endemic violence. White people committed their share of the violence, but the city was nearly 40 percent black, and most of the mayhem was in the black section of town, in a particularly disorderly enclave known by the lurid name Blood Fields.

My uncle was an emergency room doctor, and he described to me some of the horrendous injuries that brought patients to the hospital. But I think it is important to note that his recollections were almost always of stabbings and knifings and that booze, not narcotics, was a precipitating factor in nearly every case. Naturally there were some deaths, but most knife assaults left survivable wounds. Gunshot killings and injuries were comparatively rare. Drugs were virtually nonexistent. At the same time, there was no real gang activity as it is known now, nor did alcohol, either legally acquired or bootleg, cause any outbreaks of criminal warfare. It was a pretty dreadful scene, but by today’s standards quite tame and on a manageable scale, if there be such a thing.

Of course poverty and neglect were at the heart of the matter then as now, and it could be argued that the present crisis differs from the past only in degree. Still, the bloody terror convulsing American cities today is largely due to the appearance of drugs and the vast spawn of high-powered weapons that slaughter people—mostly young and nearly all black or Hispanic—by the thousands. This exponential increase in murder is the worst social development in America in the last forty years. The inability of politicians to cope with the problem is partly due to their craven capitulation to the National Rifle Association, one of the most evil organizations to exist in any nation, past or present. It is a seemingly incurable situation that says that, as a people, we are at best immoral and at worst totally mad.

The most important change, I think, is that the United States has triumphed over its adversary, the Soviet Union, and emerged as the dominant power in the world. This is a precarious situation and probably will not last far into the twenty-first century. In this time of primacy it is America’s challenge to take the lead in meeting such international perils as nuclear proliferation, environmental deterioration, and runaway population growth.

In a longer perspective I venture to guess that three or four hundred years from now, historians will say that the most fateful developments of our time were two scientific events that occurred in the decade just preceding the start of American Heritage. One was the release of atomic energy, which put our civilization and our very existence under permanent threat of destruction. The other was the revelation of the genetic code, a discovery whose effects on human health and welfare are just beginning to be felt.

It seems to me that one of the most important ways in which America has changed since 1954 is that unfortunately we have lost some of the innocent and optimistic spirit that guided and enhanced our country. We have also added in those years a most unwelcome cynicism and distrust and disbelief in both our institutions and our leaders. It can, of course, be argued that many of these diminished views about our institutions and leadership are justified, and that many of those institutions and leaders have indeed failed us. We have had that feeling from time to time in the past. What is new and most unwelcome now seems to be a feeling that virtually denies our history. It is a feeling that is equally cynical about the optimism of the past and about the hopes of the future, a feeling that we are as bad as or worse than most countries, that we have ourselves to blame for most of our problems, and that there is very little any of us can do to change what is basically a bad outlook for the future.

One of the principal sneering criticisms of President Reagan’s eight years in the White House is “Oh, he didn’t do anything except make us feel good.” While I believe that assessment is basically wrong and simply states the conventional wisdom that held that President Reagan was bound to be a failure, it also seems to me that it overlooks the fact that it is quite an accomplishment to make the American people “feel good” about their country and themselves. I would hope that over the next half-century we might regain far more of that feeling. When we have it, there are very few limits to what each of us and the country can accomplish.

Indeed, there is little reason for pessimism and cynicism. Of course we have major problems. The agenda of a democracy is never finished. But we should study and take heart from our many triumphs of the past, the extraordinary way in which we achieved the leadership of the world, and the fact that nothing except our attitude and pride in ourselves and our accomplishments has changed in the last forty years.

We are still the same people with the same invention and productive genius and skills that enable us to do so much good for the world and ourselves.

That is perhaps the best reason of all for studying our history.

One of the remarkably unremarked differences between the mid-1950s and the mid-1990s is this: Back then, words, written and spoken, were important in ways that today seem startling.

“Robert Frost strode onto the stage at Carnegie Hall to a standing ovation from an overflow house. . . . One night in 1957, T. S. Eliot was reading his poems to an overflow audience in Columbia’s McMillin Theater. Even faculty members had difficulty getting tickets, and people were crowded into the windows and doors, and listening outside to Eliot over loudspeakers. . . . Dylan Thomas stood at the podium. . . . This was his third American tour in two years.” So writes Jeffrey Hart in his wonderful When the Going Was Good: American Life in the Fifties.

Today it is unimaginable that any poet could occasion such excitement on any American campus or any other American venue. Of course Frosts and Eliots and Thomases are thin on the ground today, but they were not really plentiful then.

Back then, even the impulse of youthful rebellion was apt to take a literary turn. Again, Hart: “Outside McMillin Theater there was a vast throng that had been unable to get in. They pounded on the doors and milled around. Ticketholders entered between lines of police.” The occasion was a reading, or perhaps a howling, by Allen Ginsberg and two other “beat” poets. Jack Kerouac was supposed to be there but, not uncharacteristically, wasn’t.

Nowadays the way to pack a campus auditorium is to invite a political extremist to deliver a rant or to book rock or rap musicians who advertise their arrested development by wearing their baseball caps backward. That fashion statement is, presumably, some sort of rebellious gesture. Back in the 1950s some of us preferred to dissent by declaiming e. e. cummings:

the Cambridge ladies who live in furnished souls
are unbeautiful and have comfortable minds
(also, with the church’s protestant blessings
daughters, unscented shapeless spirited)
they believe in Christ and Longfellow, both dead, . . .

On campuses, particularly, it is difficult for poetry, or literature generally, to be as important as it once was. This is because many teachers of literature teach in strange, off-putting ways. They treat great works as mere “texts” of indeterminate meaning. Books that once were passionately read as food for the soul have become mere fodder for teaching “strategies” to reveal all of life as a power struggle between the privileged and their victims. Words have become toys, not taken seriously. Earnestly, but not really seriously.

It is beyond the scope of this wee response to American Heritage’s query to try to explain all the reasons for the change in the status of words. Suffice it to say, whatever the change says about us, it cannot be encouraging.

The status of women has undergone greater changes in the last four decades than in the last four centuries. No change goes deeper into the social structure. It alters the relations of wife to husband, of mother to child, of women to other women. This of itself is enough to identify ours as a revolutionary period. Yet it has been a (largely) peaceful revolution, and a fruitful one. It tapped the resources of half the human race.

The most interesting change has been the rapid growth in ethnic diversity, the struggles of the nation to adjust to that diversity, and the way the diversity has made us less culturally monolithic, isolated, and unsophisticated.

We contemplate our navel so closely, we survey ourselves and write about ourselves so intensely, it is difficult to believe there can be any overlooked change. There may, however, be an undervalued change, and this is in our shortened attention span, our declining educational expectations, and our thirst for spectacle.