UKUUG

The approach of the new century offers an occasion to ponder the condition
of humanity and of the planet that sustains us. How many of the world's
nearly six billion people live well or in circumstances that are even
marginally agreeable? How many still suffer poverty, war, disease,
illiteracy, and the other scourges of our species? Will the policies of
global civilization merely magnify well-known ecological, economic, and
social ills? Or will the next century find ingenious remedies?

Alas, as the symbolic stroke of midnight speeds toward us, the opportunity
to rethink the situation of humankind and renew our sense of purpose is
rapidly being frittered away. When people hear about the year 2000 these
days, the first thing that springs to mind is the computer glitch that
threatens to disrupt computer systems and send our institutions careening
toward chaos. Because programmers in earlier decades economized on space
by cleverly dropping two digits, we are now obsessed with the problem and
the costly challenge of minimizing its possible damage.
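The mechanics of that glitch are simple enough to sketch. The toy function below is purely illustrative, not drawn from any actual legacy system: it mimics the kind of date arithmetic that goes wrong when years stored as two digits wrap past 99.

```python
# Illustrative sketch of the classic two-digit-year bug.
# Storing years as two digits saved scarce memory in early systems,
# but subtraction across the century boundary produces nonsense.

def age_from_two_digit_years(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation, as many legacy programs performed it."""
    return current_yy - birth_yy

# Someone born in 1970, checked in 1999: works as expected.
print(age_from_two_digit_years(70, 99))   # 29

# The same person checked in 2000 (stored as "00"): a negative age.
print(age_from_two_digit_years(70, 0))    # -70
```

The fix is as unglamorous as the bug: widen the field to four digits, or pick a pivot year and hope no record falls on the wrong side of it.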

Yes, the Y2K troubles are real. But there's a pungent irony here. Our
society has become so slavish in its dependence upon digital equipment
that it seems unwilling to face squarely the health of the planet and
humanity's future. To my way of thinking, this is the real millennium
"bug," the urgent "Year 2000 Problem" that our systems planners, corporate
elites, and political leaders have overlooked in recent years.

One indicator underscores how grave this deficiency has become. At a time
in which most societies around the world have committed themselves to
technology as the one true path to improvement, the common understanding
of what "technology" means and what it includes is now rapidly shrinking.
Not too long ago "technology" referred to the whole range of tools,
techniques and systems people use to achieve practical ends. This
definition arose during the nineteenth century, replacing earlier, more
limited definitions. While the concept was overly broad, it was far
richer than the one gaining prominence today, the notion that "technology"
is just information technology, nothing else. Other kinds of instruments,
methods, technical arrangements, and devices are grouped under more
specialized rubrics. Social issues, both promising developments and
gnawing problems, that involve the broader range of technical means, are
fading as matters for public attention and debate.

This warped view of technical matters first gained prominence on Wall
Street, where the category "technology stocks" has taken on a particular
significance. The technology stocks are, of course, shares in computing,
digital communications, Internet services, and the like. When one hears
that "technology" is soaring or sinking on the stock exchange, one knows
that we're talking about Microsoft, Dell Computer, Lucent Technologies,
Netscape, Seagate, Sun Microsystems, America Online, Cisco Systems, and
the like. In this context, the term no longer refers to automobiles,
airlines, chemicals, agriculture, or anything of the sort. The word
"information" has been dropped as a modifier, leaving "technology" as a
pure, seemingly self-evident label.

This innocuous linguistic convenience for busy stock traders has now
spread, infecting contemporary journalism and everyday speech, signaling a
narrowing of awareness and care. Oddly enough, this constriction of focus
happens at a time in which, to all appearances, there is an explosion in
sources of news coverage on "technology," hundreds of magazines,
newspapers, paperback books, television programs, and on-line sources
filled with stories about people's involvement with technical things. For
serious technology watchers, this would seem to be a godsend. But if one
looks closely at the content of this burgeoning news coverage, the vast
bulk of it is limited solely to the computer industry and the Net. What
first appears to be a wealth of useful information conceals a profound
poverty of outlook.

Within today's "technology" beat, the press typically follows stories of
just two kinds. First are reports on the activities of business firms in
the computer and communications field -- the latest deals, mergers,
acquisitions, new product introductions, and strategies of corporate
movers and shakers. News of this sort used to be confined to the pages of
Business Week, Fortune, and the financial section of your local newspaper.
But under the rubric of "technology" the machinations of CEOs,
managers, and lawyers in the information corporations have now been
elevated to a status and glamor not unlike that attached to sports heroes
and rock stars. Will Bill Gates stave off the Justice Department? Will
Steven Jobs stay on at Apple? Will the leaders of Bell Atlantic and GTE
bring off their corporate marriage? Apparently, the reading public has an
endless appetite for stories of this kind.

Also favored in this approach are reports about digital hemlines -
late-breaking fashion trends in the design, marketing, and consumption of
computer hardware and software. Which new gadgets does Silicon Valley
have in store for us this season? How much computing power will I need to
run the next-generation programs? Should I buy the latest Windows
upgrade? What colorful services and diversions can be found on the World
Wide Web? People who follow rapidly changing info-styles now find a great
torrent of chatter about such matters in both print and pixel.

Commitment to this approach seems all but universal. The "Technology
Alert" from the Wall Street Journal that arrives in my email each day is
never about anything other than computer and communications firms. If one
turns to the on-line version of the New York Times and clicks on
"technology," dozens of articles about the computer biz and digital
hemlines begin scrolling by. Much the same holds for the hundreds of
newspapers and magazines that print the latest gossip from the Internet
grapevine. Day by day, the dull uniformity of it all raises the question:
Why bother reading this dross at all? Here, for example, are some recent
items from the Times' predictable stream:

"Oracle Announces Online Challenge to Microsoft"

"Web Erotica Aims for Female Customers"

"Braindump on the Blue Badge: A Guide to Microspeak"

"Flat Screens: Good But Costly"

"Video-conferencing Stage Fright"

"PC's for the High End Crowd"

"Putting A Virtual Doggy in Your Window"

"Hurricane Watchers Clog Web Sites"

"Gossip Sites Target Music and Film Business"

Of course, the mood and outlook of such stories in the Times and
elsewhere are strictly "upside," often totally euphoric, Viagra for the
mind. In both the giddy writing and glitzy neo-neon illustrations, the
model for "technology" journalism in this mode is, of course, Wired
magazine. The recent sale of that publication to Condé Nast, publishers
of Vogue, confirmed what many of us had suspected all along: that the
magazine was less a serious discussion of the transition to a digital
society than a never-ending barrage of excited promotions for ephemeral
electronic products and the personalities who hawk them. Now that Wired
is owned by those adept at selling cosmetics and couture, its role is at
last thoroughly transparent. What's remarkable is that so many supposedly
respectable publications have decided to mimic the tawdry self-indulgence
that has become the hallmark of cyber journalism.

An obvious shortcoming of this odd focus for reporting and thinking is the
vast spectrum of interesting and important topics it systematically
neglects. If one is interested in solar electricity, for example, the
second fastest-growing energy source in the world, one can read for years
and never find it in today's "technology" coverage. Although the
bio-technically driven "second Green Revolution" will likely affect
billions of people in years to come, its arrival goes all but unnoticed.
If one is interested in the rapidly evolving techniques of flexible
production in global factories and offices, don't bother looking in the
local newspaper or its on-line edition; from all indications, "technology"
doesn't include such things anymore. How about the ecological disasters
caused by "advances" in the technologies of fishing and aquaculture?
What? Where? When? Why wasn't I informed?

An illustration of a significant piece of news that has gone all but
unnoticed amidst the hoopla of American "technology" coverage is the
raging controversy about the introduction of genetically modified food in
Great Britain. One study by British scientists, reported recently by BBC
and The Guardian, found that rats fed genetically engineered potatoes
suffered stunted growth and weakened immune systems. Whether or not the
study turns out to be reliable, concerns about it and about genetically
modified food have sparked citizen protest and disputes among the
political parties in Parliament. While you can be sure that the emerging
biotechnology firms around the world are closely watching this flap and
its possible ramifications, the American reading public is kept in the
dark, nourished by hundreds of Olestra-rich puff pieces about Internet fun
and frolic.

Perhaps aware of the growing vapidity of today's techno-news reporting,
some prominent publications have recently decided they need a larger
theme, a Big Picture within which to frame their topics. The startlingly
brash, unprecedented, and illuminating context many of them appear to have
settled upon is "Innovation." Yes, folks, here it comes! Out of the
research labs, into the hands of entrepreneurs, from there to the global
marketplace, and into your lives - technology! What matters in this
perspective is simply an appreciation of the dynamic flow and process.
Never mind the social contexts, broader consequences, or policy choices at
hand. Behold the surprisingly colorful people engaged in cutting-edge
university and corporate research (and you thought they were just cold and
grey!). Follow those far-sighted venture capitalists as they seed the
landscape with promising start-up companies. Be the first on your block
to catch a glimpse of all the gadgets and new media that will shape the
offices, homes and schools of the future.

Given the long history of campaigns to promote technologies of one kind or
another in this century, it's amusing that anybody would find this
emphasis on "innovation" the least bit novel. In one guise or another,
this idea has been the bread and butter of industrialists, advertisers and
reporters for eighty years. Ideas and images celebrating innovation were
already current in visions of modernity of the 1920s when automobiles and
electrical appliances (rather than Palm Pilots) were all the rage. From
its very first issue, Henry Luce's Fortune magazine (1930) regaled
readers with high-tone stories and photographs depicting links between
emerging technology, business initiatives, and social transformation.
Then as now, the arrow of causation always pointed in one direction. As
the motto of the International Exposition held in Chicago in 1933 boldly
proclaimed, "Science Finds -- Industry Applies -- Man Conforms."

As we receive our daily dose of this threadbare mythology, updated for the
age of cyber-space, the problem is not merely that the scope of reporting
on technology and human affairs is dwindling. Resourceful readers can
always search out diverse, substantive sources of news and information
about all kinds of technology-related events. The far more urgent problem
lies in the fact that, at a crucial moment in human history, public
discourse about matters of consequence has been reduced in its outlook,
trivialized in its grasp. Since people's awareness of what matters is
strongly influenced by what news sources highlight as current and
noteworthy, the shrinking perspective of technology journalism is a
serious loss.

Among the issues that cry out for attention as a new era dawns is the
widening gap of inequality that characterizes the world's population. Our
much heralded global economy has been very good at producing a handful of
billionaires and millionaires. But for roughly a third of the Earth's
people, especially children in the less developed countries, grinding
poverty is an everyday reality, a situation already evident even before
the economic crises of the past year. Can it be that we find the
suffering of hundreds of millions of our fellow human beings insignificant
when compared to the puzzle of finding a Y2K fix?

While we're at it, why not tackle some of the "bugs" that threaten the
environment we will hand to our children? How about fixing the
technologies that spew millions of tons of CO2 into the air each day,
exacerbating global warming? How about replacing the systems that pour
toxic chemicals into the air, water and land, slowly poisoning human
populations and other species? Let's eliminate the errors in our tax laws
that encourage energy waste and other ecologically destructive practices.
And let's fix the development bug that destroys good farmland and
devastates the world's forests. These are among the steps that would be
taken by those hopeful about Earth's future.

I'm told that if all goes well, if enough time, money, and effort are
invested, our computers will actually remember that a new millennium has
arrived. Alas, we humans may forget to update our spiritual clocks,
ignoring a momentous turning point and the challenge it presents.

Back in the early 1970s, Studs Terkel wrote his classic book, Working, based
on interviews with hundreds of Chicagoans. The picture he sketched was
not pretty. But in their Second Annual Big Issue (Dec., 1997) the editors
of Forbes ASAP assured us that things are different today:

Reading Terkel's Working now is like scanning an ancient text. If
there is one common emotion that emerges from the Babel of voices in
Terkel's book, it is boredom. Boredom is the leitmotiv of the
Industrial Age. Almost everyone, from the spot-welder to the CEO, is
deeply bored in Terkel's world. His people dream of a job that is
meaningful, challenging, and so fulfilling that they would never want
to leave it.

They got their wish. Today, in the information age, the world of work
is now so intellectually challenging, meaningful, and compelling that
we are never bored.

(http://www.forbes.com/asap/97/1201/index.htm)

On the other hand, if our evident need for distraction is any measure, we
may be just about the most bored people ever to walk the earth. Are
data-entry workers never bored? Or the customer service employees whose
official mission in life is to explain to anonymous callers how to plug in
their new printers? Or the growing legions of programmers responsible for
maintaining old code? And what about the armies of conscripts pressed
into mind-numbing duty against the Year 2000 bug?

As the Forbes ASAP editors see it, our salvation comes from the chip and
the Net. Okay. Look at the financial service vocations that have so
dramatically re-shaped themselves around the chip and the Net. How easy
would it be for the employee of a typical investment firm to place his
investments based on meaning and conviction - on a sense of personal
responsibility for what his funds do to the world - as opposed to the
dictates of number-crunching algorithms? Admittedly, making money for
its own sake can be a pleasurable distraction, assuming you don't think
too much about the nations or villages whose economy you could just as
easily be destroying as helping. But this empty mathematical exercise
hardly counts as an advance in the meaningfulness of work.

Then there's the farmer, enclosed in the cab of his huge tractor,
traversing thousands of acres while a computer tuned in to a Global
Positioning Satellite allocates varying doses of fertilizer to each small
sector of the farm's grid. The most likely result is that a concern for
abstract "total inputs and outputs" replaces meaningful contact with the
land. The farmer no longer feels directly responsible for the processes
of life, death, and resurrection going on in the soil. He no longer
experiences himself as intimately woven together with them. And, in any
case, these processes are most likely being rendered sterile by his
current fertilization practices. Does he really find this kind of work
more meaningful?

You pick a vocation, and I'll give you another example. The fact is that
the computer is an engine of abstraction, removing us - so far as we give
it free rein - from direct engagement with the sources of meaning in the
world. Certainly we can reach across the barriers of abstraction: the
investor can seek out real value behind the mathematical value, and the
farmer can take the time and trouble to know his land intimately and care
for it in a deeply satisfying manner. But it requires an effort that runs
across the grain of all those efficiently operating chips celebrated in
Forbes ASAP.

If the editors of that publication are convinced we've entered a new era
of meaningful work, it's because, as they put it,

command and control are dead. The chip and the Net have killed it.

But this misses the whole point. The issue is not centralization (with
its need for command and control) versus decentralization (with its
distributed intelligence). No, the real question has to do with the
overall balance between computation and the non-computational. That is,
it has to do with the balance between syntax and meaning - between frozen
forms of intelligence on the one hand, and our own fluid expressive
potentials on the other. It hardly matters whether the patterns of frozen
intelligence are centralized or not. As every spider knows, you can
immobilize your prey with a delicate web just as well as with a stinger.
This is an important issue, having a great deal to do with our seemingly
inevitable drive toward ever greater standardization. I'll have more to
say about it in the future.