TECHNOLOGY

The Eureka Factor,
etc.

Faster than a speeding
microchip

One characteristic of the industrial era, which began in
about 1800, is the increasing speed of technological change
compared with previous human history. Julius Caesar and
George Washington lived in quite similar worlds, points out
Philip Curtin, Hopkins historian, Africanist, and MacArthur
fellow. For both, long-distance travel was by horse,
long-distance communication by letter. Agriculture was the
dominant human enterprise, and technologies remained much the
same for centuries.

That had been true throughout the agricultural era,
beginning some 12,000 years ago, Curtin notes. For example,
the stone ax was "a great breakthrough, and it persisted for
thousands of years before the bronze ax replaced it." New
technologies and newly domesticated animals and plants
spread--but slowly. "Most people lived in bands of hunters or
fishermen, 20 to 30 people," says Curtin. "They didn't have
enough contact to bounce new ideas around very much."

After the development of literacy, faster communication
led to faster technological change, especially after 1800.
And today, with instant communication available around the
globe, a "generation" of computers may hold sway for as
little as five years.

Curtin finds the speed of change worrisome, because the
human being is much the same as ever. "Just because we have
computers and print out our papers doesn't mean we're any
smarter or more moral. We are neither." --EH

The Eureka Factor and
meta-tools

Paul Hazan works at the Applied Physics Laboratory,
where he is assistant to the director (read
big-thinker-in-residence) on advanced computer technology.
Twenty years ago, at a time when microprocessor, byte, and
chip were alien words even to many scientists, Hazan was
saying this new technology would change the world. He was
right.

Today, he's working on computer visualization (of which
virtual reality is one aspect), and he says that will change
the world, because it will allow scientists to explore the
utterly unknown. They'll be like Magellan--they won't have to
know what they're looking for, won't have to know enough to
ask a question. Rather, using computer-aided multimedia
visualization, they can simply sally forth and look for
something interesting. "This is where we can accelerate the
rate of discovery," Hazan says enthusiastically.

He calls this power to search the unknown the Eureka
Factor, and he bubbles about it. "Now there's a whole
generation of tools that help us visualize things. We're
actually expanding human creativity!"

He continues, "If you had to be an internal combustion
expert in order to drive a car, how many people would drive?
But in the early times, you pretty much did need to know,
because the car stalled all the time." The new computer
tools, says Hazan, are like the modern car--so advanced,
they're simple. These are what he calls meta-tools--"tools
that essentially raise the level of abstraction so that the
user has no need to understand the details." Today's
accountant, for instance, can concentrate on accounting and
think very little about how her computer does it.

While acknowledging the inherent difficulty of any
rapid change, Hazan emphasizes that today's change is
multifactorial: Everything is changing, all at the same time,
faster than we can understand what's happening. "The problem
is," he says, "things are getting so complex that you need a
long time to understand them, so you have to specialize. At
the same time, most changes involve multiple disciplines, so
we need to generalize--and you know the old saw. A specialist
is a person who knows more and more about less and less,
until he or she knows everything about nothing. Conversely, a
generalist is a person who gets to know less and less about
more and more, until he or she knows nothing about
everything. And that dilemma really is what we're facing."

Advancing computer technology will help, Hazan believes.
In medicine, technology, and business--in every field, in
fact--he says the new meta-devices do so much to maximize the
abilities of the user that people can now re-integrate their
thinking at a very high level. With computer help, they'll be
both generalists and specialists, and they'll do it well.

Hazan is not a blind partisan. Does he think computers
will be abused? No doubt. Can we expect privacy problems? You
bet. Furthermore, he says, the employment issues brought
about by computers are about to become global and
white-collar, because now that the world is interlinked,
programming and many other computerized chores can be done
anywhere in the world. Presumably, they'll be done wherever
it's cheapest. "India has a great many computer programmers,"
says Hazan.

Here again, however, the engineer thinks computers
leverage human ability enough to compensate. "What is the
greatest resource we have available to us? People! And when
we say people we mean people's minds. And all the things
we're talking about basically deal with cultivating and
communicating with--integrating if you like--people's minds. So
we're leveraging our ability to change, to cope, to enhance
the quality of life. That's what's on the other side of the
scale. So am I optimistic? Very." --EH

The information glut

The computer revolution has made the global transfer of
data easy. Too easy? Ron Brookmeyer, a biostatistician in the
School of Public Health, wonders about that. "Computers not
only put data in everyone's hands, but also give them the
power to massage it and try to reach some conclusions. And
that's good," he says. Yet Brookmeyer is concerned. Too
often, he says, one has no idea how data on the Internet was
collected, if or how it was checked, or whether it's been
peer-reviewed. "Numbers take on a life of their own," says
Brookmeyer, soberly. --EH

Entering the realm of virtual
reality

Imagine that you could shrink to a microscopic size that
allowed you to see and manipulate molecules. Mathematician
John Sadowsky recently did--or at least felt as if he had. At
a conference on synthetic environments, Sadowsky
donned a virtual reality headset, gripped a "spaceball" (a
three-dimensional "mouse"), and then "flew" through a field
of molecules, stopping as he went to try to fit together
molecules according to their shapes and valence charges.

The program could help biochemistry students studying
molecular structure, says Sadowsky, and similar applications
will soon dramatically transform the classroom and research
lab. The Office of Naval Research is developing virtual
reality systems for testing ship designs, he points out.
Virtual reality is also being used to grant mobility to
disabled children. And NASA has used virtual reality programs
to help design corrective optics for the Hubble Space
Telescope.

Sadowsky and several colleagues at the Johns Hopkins
Applied Physics Laboratory are creating "a three-dimensional
blackboard," a virtual reality headset that displays graphics
and video images of three- or four-dimensional objects. "When
I was studying topology, there were certain types of spaces I
had a very difficult time visualizing," says Sadowsky. Now
teachers will be able to show students three-dimensional
models rather than draw two-dimensional representations of
three-dimensional objects. And they'll be able to demonstrate
four-dimensional objects--like Klein bottles and lens
spaces--that often baffle students of topology.
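
Some of what baffles those students can at least be written
down. The Klein bottle is a closed, one-sided surface that
cannot sit in ordinary three-dimensional space without
passing through itself, which is precisely why a display
with a fourth dimension helps. A standard textbook embedding
in four dimensions (a classical formula, not one from
Sadowsky's project) is

\[
(u, v) \mapsto \big( (a + b\cos v)\cos u,\; (a + b\cos v)\sin u,\;
b\sin v\cos\tfrac{u}{2},\; b\sin v\sin\tfrac{u}{2} \big),
\qquad 0 \le u, v < 2\pi,\; a > b > 0.
\]

The half-angle u/2 supplies the twist that makes the surface
one-sided; drop either of the last two coordinates and the
surface is forced to cross itself, which is exactly what any
three-dimensional model or drawing must do.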

The "holy grail" for scientists in the field, says
Sadowsky, "is to eliminate the computer as we know it, or the
computer as interface, so that you're interacting with an
environment." Out go the objects that remind you you're using
a computer, like the keyboard, computer screen, and mouse.
Instead, explains Sadowsky, "the computer produces all the
signals necessary to sense the environment and picks up our
responses so we're interacting." In an architectural design
program featured at the conference, Sadowsky walked around a
virtual reality kitchen (actually an empty room), rearranging
cabinets, furniture, pots, and pans (that he visualized
through his headset) by manipulating a spaceball. His
location was tracked by three infrared cameras on his headset
that sensed light-emitting diodes in the ceiling; as he moved
through the empty room, his headset revealed different views
of the "kitchen." --MH

As higher education goes
higher-tech

Higher education is undergoing a "sea change" that
rivals in magnitude the one that occurred just over a century
ago when the modern research university was born, says Joseph
Cooper, university provost and vice president for academic
affairs. Then, lectures and seminars became the standard way
of imparting knowledge, and "we went 100 years without anyone
questioning those modes of delivery. Now," he says, "we're
going to have to invent a whole new set of pedagogies based
on technology."

For instance, says the political scientist, "what is
'remote' is going to have to be redefined." Already, students
can gain access to databases and library holdings across the
world from the (relative) comfort of their dorm rooms. Thanks
to the latest software packages for foreign languages,
students can experience what it's like to talk with native
speakers, while budding mathematicians can actually
visualize four-dimensional shapes. Ongoing advances in
CD-ROM, video conferencing, and (eventually) virtual reality
will only expand the possibilities, Cooper says.

He's optimistic, though cautiously so. "We have to be
sensitive to what it is we want to accomplish. It's not so
terrible if you don't have lectures or textbooks anymore. The
truth is, the lecture method and the seminar method never
fully worked as they were originally intended. The trick is to
preserve the face-to-face relations with faculty members that
are such an integral part of learning, while also getting
away from the disadvantages of trying to teach 400 students
in a lecture hall."

"If we do this right, we have a chance to re-customize
education to the individual," says Cooper. The new technology
means that students will be able to work at their own pace,
he points out. And they'll be able to interact with their
instructors in ways--and at times--that they never could
before. For instance, some Hopkins students have already
begun using e-mail to ask questions of professors and get
feedback on their papers. The electronic interaction offers
speeds of response that aren't available through once-a-week
office hours.

How will universities go about adapting to this sea
change? For one thing, thanks to advances in communication,
"size is not going to be the kind of critical factor it used
to be," Cooper says. Universities will collaborate more, he
predicts, so "it won't matter so much if your department has
15 faculty members rather than 40." Rather than maintain
broadly equipped (and expensive) foreign-language
departments, say, University X might share its instructor of
Cantonese with University Y, which in return would share its
instructor of Portuguese. "Universities will have to decide
where they're going to focus and spend their resources," says
the provost. Collaboration will also occur (and already is
occurring) among university libraries, so that "those with
huge holdings won't be at such an advantage anymore."

Further, Cooper foresees universities forging new
partnerships with businesses. These days, thanks to the speed
of technological change, keeping up with your field--or
learning a new one--has become a lifelong challenge. That
opens up whole new constituencies for universities, says
Cooper. At Stanford, for instance, the Instructional
Television Network offers over 250 courses annually to 5,000
enrollees at over 200 corporate sites. --SD

A "miracle" boomerangs

Since the 1950s, the middle has dropped out of U.S.
agriculture: few remain of the mid-sized family farms that
used to dominate. Rather, we have very, very big farms
(agribusiness), or very, very small ones (like the computer
programmer who keeps bees and grows 10 "heritage" fruit
trees).

Mechanization and agrichemicals, says Helen Wheatley
(PhD '93), were the new technologies that drove this speedy
shift, to the surprise of all. Wheatley, a historian at
Seattle University who wrote her thesis on 20th-century
cotton farming, says that mechanization was expected to ease
the farmer's labor, the miraculous new chemicals to multiply
his yield--as they did. But buying machines and chemicals
requires a lot of money, year after year after year, which
favored the deep pockets of business over the individual
farmer. Furthermore, the historian points out, the bigger a
farm, the bigger its subsidy. So ultimately, the farm subsidy
program benefited agribusiness more than the family farms for
which it had been established.

Another unexpected effect: agribusiness has proven
quite destructive to the land, which it is often said to
"mine." Says Wheatley, "You go in and use the land and use
the water until there's none left. Then you get out."

The new methods were especially problematic in cotton
farming, for two reasons. One has to do with pesticides.
Wheatley says, "The cotton farmers kept pouring on the
pesticides because they just thought it had to work, since
pesticides were so great for other types of farming."
Unfortunately, in cotton, the worst of the pests are tucked
up snug inside the boll, safe from sprays. Under the frequent
spraying, the pests quickly evolved resistance, and
everywhere that cotton has been grown, from the U.S. to
Australia, "extraordinary doses of pesticide show up in
entire watersheds."

The other unexpected evil came with irrigation. When
U.S. cotton farming moved into the arid Southwest, irrigation
became a must. What no one had thought about was that desert
soil, not having been leached by the frequent rain of other
areas, contains eons' worth of concentrated trace elements,
such as selenium. Washed out by irrigation, the excess
selenium began, to everyone's surprise, to poison stock and
waterfowl.

"We change our technology faster than we can understand
the ramifications," comments Wheatley. The sorry side effects
of modern cotton farming are in brutal contrast to the
buoyancy people first felt about this changing
technology--"the promise of it, the excitement people felt
about irrigation, these wonderful chemicals, new and
wonderful ways the state could intervene to help the farmer.
But they found that solving old problems just created new
ones." --EH

Presto! Genetic sequencing information
at your fingertips

In the olden days (meaning just a few years ago),
scientists would spend weeks searching the library for gene
sequence and protein information, says David Kingsbury, the
new associate dean for information science at the School of
Medicine and director of the William H. Welch Medical
Library. Now, however, thanks to the Internet--an
international network of computer networks--scientists can access at
least 15 databases of technical information on the structure,
sequence, and function of genes and proteins. They can get
the answers they need in minutes, rather than weeks.

Kingsbury explains how it works: Suppose you wanted to
know more about a particular gene that you have sequenced.
After typing in the sequence, you can ask Prot-Web (a Hopkins
application that allows you to navigate through the
databases) to retrieve other similar sequences from GenBank, a
database run out of the National Institutes of Health that
stores all identified genetic sequences. The system will tell
you how closely your sequence matches those, as well as the
names of proteins the genes code for.
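
To give a feel for what "matching" means computationally,
here is a minimal sketch in Python. It is not Prot-Web's or
GenBank's actual method (the sequences and the scoring rule
are invented for illustration); real searches first align
sequences, gaps and all, with far more sophisticated
algorithms, but the idea of ranking database entries by
similarity to a query is the same.

```python
# Toy sequence-similarity search: rank database entries by
# crude percent identity with a query. Illustration only.

def identity_score(query, subject):
    """Percent of positions that agree when the sequences are
    compared position-by-position (no gaps -- real tools
    align the sequences first)."""
    length = min(len(query), len(subject))
    if length == 0:
        return 0.0
    matches = sum(1 for a, b in zip(query, subject) if a == b)
    return 100.0 * matches / length

def search(query, database):
    """Return every database entry, best match first."""
    hits = [(name, identity_score(query, seq))
            for name, seq in database.items()]
    return sorted(hits, key=lambda hit: hit[1], reverse=True)

# A few made-up DNA fragments standing in for a database.
database = {
    "entry_A": "ATGGCGTACGTTAGC",
    "entry_B": "ATGGCGTACGATAGC",
    "entry_C": "TTTTTTTTTTTTTTT",
}
for name, score in search("ATGGCGTACGTTAGC", database):
    print(f"{name}: {score:.1f}% identity")
```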

You can then search the Protein Information Resource, a
database run out of Georgetown University in Washington,
D.C., for the amino acid sequences of those proteins. Or tap
into the Protein Data Bank, at Brookhaven National Laboratory
in New York, for an image
of the protein's three-dimensional structure. Each search
takes less than a minute.

"A number of people using Prot-Web have told me it has
accelerated information," says Kingsbury. "They've discovered
things they didn't know were out there, things they otherwise
wouldn't have seen. People check in every few days just to
see what's new. When you don't have to wait a year getting
published in a peer-reviewed journal, it helps."

Some people have expressed concern that without peer
review, the databases will perpetuate mistakes. But according
to Kingsbury, much of the information has already been
published in peer-reviewed journals. In any case, peer
reviewers don't usually check to see that the hundreds of
nucleotide bases in a genetic sequence are correct. "There's
no way a peer reviewer is going to know a sequence," he says.
"The literature is full of incorrect observations. The issue
is vexing, but it's nothing new." Furthermore, he says, the
electronic databases may even reduce errors because more
people have access to the information and can discover
inaccuracies. --MH

To know or not to know

"The spinoff has started already." These words, about
the Human Genome Project, come from Victor McKusick with some
quiet pride. The grand old man of genetics at Hopkins,
McKusick has been working in the field since the '40s--since
before Watson and Crick unraveled the double helix. His
genetic catalog, Mendelian Inheritance in Man, is
regarded as the classic text. And today, as a central figure
of the national Genome Data Base, McKusick is able to see
applications, like the genetic test for cystic fibrosis,
helping actual people. Tests for genes involved in many cases
of colon cancer are also available, "and a breast cancer
gene, on chromosome 17, is just around the corner." He says,
"People are pounding down the trail to isolating the gene."

Today's genetic tests are only the beginning. All fields
of medicine, McKusick says, are using the genomic approach to
their most puzzling ailments, an approach that--as he tells
it--is simple: "We map the gene, walk in on the gene, and find
out what it does normally. Then we find out what particular
derangement appears in this particular disease."

McKusick sees two generic hazards: "One is, the Human
Genome Project has the effect of increasing the gap between
what we know how to diagnose and what we know how to treat."
For example, physicians can now predict Huntington's--but
cannot prevent or treat its onset. "And this will be more and
more the case." Also, he believes the project "runs the risk
of increasing the gap between what we think we know and what
we really know." Issues of employability, insurability, and
confidentiality are also a subject of "rightful concern."

None of that diminishes his pride in the advances of his
life's work. "We'd better go into this with our eyes open,"
he says, "but it is a matter of faith with me that in the
long pull, it is better to know than not to know." He repeats
the words, with some emphasis. They matter to him: "In the
long pull." --EH

Changes in store for the concert
hall

Peabody Conservatory faculty member Geoffrey Wright sits
in a studio with a keyboard, which is unremarkable for a
composer. What is striking is the looming array of electronic
and computer components that dwarf the musical instrument.
They're all of a piece to Wright, though. In his hands and
mind, all of this stuff is a musical instrument.

Wright is a composer in Peabody's electronic and
computer music department, a man with a deep fascination for
the application of technology to music. For him, the most
significant change in music he is witnessing--and
participating in--is the rapid development of musical
intelligence in computers.

"Musicians are struggling very hard to find ways to
represent the entire musical process digitally," Wright says.
"We're trying to find a way to capture the essence of music
and store it on a computer."

Computers have been able to mimic the sounds of
instruments for a while now, he says. What artists and
researchers want to do is go further: to have computers
engage in what amounts to musical thought--to create a
computer that can respond to a human musician not like a
machine simply following instructions, but like another
musician.

"Over the years," Wright says, "people have attacked
technology in the arts as cold and dehumanizing." The early
creators of computer music did little to change that
impression, he says, because their compositions tended to
leave the performer out of the performance. Musicians didn't
"play" computer music; it existed only on magnetic tape,
which the composer then played for the audience. "The sounds
were often wonderful, but the audience got restless," he
says. People wanted to see musicians, not go to a concert
hall to listen to a tape deck.

Next, composers tried having a musician play along with
a tape, but the computer couldn't respond to the musician,
who was in effect playing along with a stereo system, albeit
an expensive one.

"Now," says Wright, "we're able to change that process
and have the computer 'watch' the performer, and respond."
For example, if the performer speeds up during one section of
a piece, so does the computer. If the performer skips a
passage, like a good accompanist the computer will skip
ahead, too. "What we're dealing with is real-time feedback,"
Wright says.
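
The kernel of that idea fits in a few lines of Python.
Everything below is invented for illustration (the note
names, the look-ahead window); real accompaniment systems
match probabilistically, but the ingredients are the same: a
pointer into the known score, a tolerance for skipped notes,
and a running tempo estimate.

```python
# Minimal sketch of real-time score following: the computer
# holds the score, "hears" performed notes one at a time, and
# keeps its place even if the performer jumps ahead.

SCORE = ["C4", "E4", "G4", "C5", "B4", "G4", "E4", "C4"]
LOOKAHEAD = 3  # how far ahead to search for a skipped-to note

class ScoreFollower:
    def __init__(self, score):
        self.score = score
        self.position = 0       # index of the next expected note
        self.last_onset = None  # arrival time of last matched note
        self.pace = None        # running seconds-per-note estimate

    def hear(self, note, onset_time):
        """Advance our place in the score to match a heard note."""
        window = self.score[self.position:self.position + LOOKAHEAD]
        if note in window:
            # Jump past any skipped notes, like a good accompanist.
            self.position += window.index(note) + 1
            if self.last_onset is not None:
                # Track the performer's tempo from note spacing.
                self.pace = onset_time - self.last_onset
            self.last_onset = onset_time
        return self.position

follower = ScoreFollower(SCORE)
# The performer plays two notes, skips ahead to B4, speeds up.
for note, t in [("C4", 0.0), ("E4", 0.5), ("B4", 1.3), ("G4", 1.7)]:
    pos = follower.hear(note, t)
    pace = f"{follower.pace:.1f}s/note" if follower.pace else "unknown"
    print(f"heard {note} at t={t}: score position {pos}, pace {pace}")
```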

Technological change is hardly new to music, Wright
points out, from the piano to the phonograph. And as a
composer working with computers, Wright does not feel
disconnected from the creators of string quartets or
concertos. "Some of the issues we face are the same issues
Bach faced," he points out. "What note comes next?"

Wright is seeing increasing numbers of students
enrolling in the electronic music department. "Students see
technology in music as creating new career paths"--career
paths that are desperately needed, he says, at a time when a
single orchestral opening routinely attracts 1,000
applicants. New forms of composition and performance may
create new opportunities for students as programmers,
engineers, and performers on new instruments. --DK

The way the future wasn't

What about technological revolutions that didn't happen?
asks John Sommerer, deputy director of the Milton S.
Eisenhower Research Center at the Applied Physics Laboratory.

Back in the 1950s, says Sommerer, many people envisioned
we'd be traveling to work in personal helicopters and riding
between cities over skywalks. Those dreams have fizzled.
Likewise the prediction that we'd have electricity from
nuclear power--electricity so cheap that it wouldn't be worth
metering. "We saw only one part of the equation," says
Sommerer. "We ignored the fact that it is quite expensive to
deal with the waste products of nuclear power." Also, if
electricity were free, people might use so much of it that
we'd suffer the environmental consequences of another
by-product of electric power--heat--which would amplify global
warming.

"It's very easy for people when they're trying to be
visionary to look only at small parts of a problem, and to
leap over other problems to say how the technological advance
is going to change society," he observes. --MH

Written by Sue De Pasquale, Elise Hancock, Melissa
Hendricks, and Dale Keiger.