The Digital General: Reflections on Leadership in the Post-Information
Age

PAUL T. HARIG

As I compose this introduction, one of my children is playing
on a hand-held video game that contains more memory than the first
computer I had access to in graduate school. My other child is
on-line to her friends somewhere on the Internet. At the same
time, 300 lieutenant colonels and colonels are converging on Carlisle,
Pennsylvania, to begin their formal education in senior leadership
at the US Army War College. I am imagining how these events intersect.
Perhaps 30 years from now, when the generation of digital colonels
in the Army War College Class of 2026 embarks on a well-worn path
in Eisenhower's footsteps, we will see the effects of the Information
Age that started for them when they were playing video games and
"surfing the net" as kids in 1996.

Perhaps the future will look much as it does today and did 500
years ago. A colleague reminds me that with the tremendous advancements
in technology, organizations, and doctrine, command is still very
personal.[1] He writes, "Consider Schwarzkopf, alone in his
bunker, making the decision to start the ground war in the Gulf,
and contrast the moment with Eisenhower on the eve of D-Day. The
modern general has state-of-the-art technology light years ahead
of what Eisenhower had available to him, yet both faced the same
questions--Is it time? Has enough been done to ensure the success
of the attack? Both had huge staffs manned with the best minds
they could assemble; both sought and used the best intelligence
they had available to them, but the decision to commit forces
to the offensive in the end was largely intuitive, personal, and
private--'In my judgment, the time is right.'" In the
end, it could be argued, all great commanders are the same. They
adapt the technology of their times in a highly personal, reflective
space where machines can extend, but never supplant, the human
dimension of their leadership.

My colleague's point is clear: there will always be a human dimension
to leadership. The most successful commanders will be those who
possess a few basic traits: courage, intellect, and a cultivated
sense of intuition. If that commander doesn't even know how to
turn on a computer, it matters not. That is a comforting speculation,
and there are abundant examples in the biographies of the great
captains to support it. Yet, I have a nagging concern for that
batch of "digital generals" 35 years from now.[2] I
wonder whether the cultural transformation between now and then might
be so profound, so revolutionary, that these future leaders will
not follow the path described above. I also wonder whether high technology
might have a corrosive effect upon their command, in the way that
"management" might have corrupted leadership to produce
the zero-defects mentality.

I am not a Luddite; I embrace technology for its extensions to
my senses, my muscles, and my intellect. But I am a child of the
late 1940s, and my view of technology is conditioned by my experience,
constructed in a Newtonian culture of linear, cause-and-effect
relationships and predictable consequences. I learned from television
as it developed, but spent as much time with radio and records,
books and newspapers to form a worldview. I have experienced (and
circumscribed) technology as an exciting stream of tools which
could improve life without necessarily perturbing the natural order
of things. Only recently has this relationship been
exploded by technology's capacity to interact with and remodel
human events in real time (exemplified by the effects of televised
combat and talk-show politics that shape public opinion and mold
policy), fulfilling Marshall McLuhan's prophecy that "the
medium is the message"[3] while shifting social attitudes
and behavior. From that perspective, one must ask whether technology's
ultimate effect upon future senior leaders could go far beyond
providing an enhanced warfighter toolbox. Is something else there
that will transform the human dimension of the "digital general"
too?

The Medium Might Rewrite the Message

It is difficult to understand a revolution from the inside-out.
We comprehend change through historical trends and visualize the
unfamiliar through familiar patterns. Thus, the "revolution
in military affairs," considered a qualitatively new capability
to see, understand, and respond to a changing military situation,
is being framed and understood in the "traditional"
language of computer systems. One view is that the microprocessor
is another powerful tool to integrate knowledge, a system-of-systems
to help dissipate the fog of war.[4] But what if the microprocessor
is but an icon for a more fundamental conceptual revolution that
is occurring? What if technology is shaping a real social and
cultural transformation? Could the Information Age be more than
improved tools? Can other changes be far behind?

In his essay on military decisionmaking in the Information Age,
Professor Thomas Czerwinski forecasts a transition to a style
of "command by influence" in which the military leader
would communicate mission-type orders through symbolic imagery
rather than voice or text, leveraging subordinates' initiative
to exploit chaos through greater situational awareness.[5] I speculate
that the digital general some 35 years from now might not just
communicate differently but actually think differently
from his or her predecessors, because conceptual behavior
itself is evolving during the Information Age.

The evidence that a threshold may have been crossed is found in
a steady worldwide rise in intelligence test performance. The
psychologist James Flynn first demonstrated that IQ has been going
up ever since testing began. The "Flynn effect" is now
well documented in many technologically advanced countries: the
average gain is about three IQ points per decade, more than a
full standard deviation since the last generation of military
leadership was born. According to the report of a task force established
by the American Psychological Association,[6] the consistent intelligence
score gains documented by Flynn seem much too large to be explained
by simple increases in test sophistication. Yet, their cause is
presently unknown. According to the task force, the most plausible
explanation is based upon striking cultural differences between
successive generations: the population is increasingly urbanized;
television exposes us to more information and more perspectives
on topics than ever before; children stay in school longer; and
everyone is exposed to new forms of experience. In short, complexity
of life may produce corresponding changes in complexity of mind.
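To put the gain in perspective -- a back-of-envelope calculation, assuming the conventional 15-point standard deviation of modern IQ scales:

$$3\ \text{points/decade} \times 5\ \text{decades} \approx 15\ \text{points} = 1\ \text{standard deviation}$$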

It will be no surprise to one who has watched school-age children
"surf the net" that information technology has jolted
our culture, promoting access to ideas and immediacy to events,
leading to mastery of resources. Without leaving his room, a 12-year-old
can "cyberchat" or correspond worldwide with e-mail
pals, download a computer game, compile references from university
libraries for homework papers, or view a music video.

Adults are often astounded at their children's facility with this
high technology. To someone who witnessed a social transformation
driven by television, this capability is considered revolutionary,
a "paradigm shift." But paradigms shift only on the
margin--to the youngster who played Super Mario Brothers®
or Sonic the Hedgehog® at age five, this new information
technology is just an incremental change, a step up from the Sega™
or Nintendo™, but hardly a revolution in his or her eyes;
something "cooler" will always come along. Keep in mind
that these children will be the senior leaders in the "Army After
Next"; their grandchildren will be even more dissimilar to
us.

Transformed Minds, Transformed Culture

As technology transforms the culture, it also shifts expectations
and perceptions, and this is the heart of the conceptual transformation
from pre- to post-microprocessor generations. Let me enumerate
a few elements of this shift:

Information management and manipulation are replacing knowledge
acquisition and inference. The exponential growth of information
and the methods for acquiring it have transformed the meaning
of expertise. In the past, an expert was the repository of facts.
Experts "learned" how to become experts by acquiring
those facts and by learning how to distinguish truth. But there
are now too many facts, too much stored information, too many
sources. Experts are now defined by their ability to recognize
underlying patterns so that new facts can be acquired and integrated.
Experts learn how to match these underlying patterns or heuristics
to new data sources in order to advance composite knowledge.

Within the Internet, for example, some of the most active enterprises
are "World Wide Web search engines." These programs
range from simple keyword filters which dredge related items
from the millions of pages of stored data, to semantic systems
which group and retrieve sources according to underlying patterns
and themes. As a cursory examination of these systems will demonstrate,
there is no shortage of information on any imaginable subject;
one such system boasts access to 50 million pages of information.
Hence, it is becoming ever more important to know what to ask
for, and increasing status is being accorded to those who can
efficiently frame a search strategy for a question while avoiding
becoming overwhelmed by the possible answers. Likewise, information
manipulation has become a central academic objective--rather than
presenting students with a static set of facts, instructors are
challenging their students to learn the interrogation process
themselves to harvest their own answers. The result is a generation
that is comfortable navigating numerous, complex data sets.

Of course, there are risks in this activity. On the Internet,
information is plentiful but not necessarily genuine or reliable.
Searching is a recursive process involving trial and error.
Most "search engines" score the accuracy of a selection,
or "hit," by the likelihood that the result matches
the search template. This is not, I would assert, analogous to
finding the truth. Accuracy of retrieval is generally determined
by how well a piece of information is indexed for retrieval, not
by its internal validity. Consequently, on the Internet it is
possible to retrieve erroneous information correctly, and to deceive
others with bogus "facts" or rumor freely distributed
throughout cyberspace.
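To make the distinction concrete, consider a toy retrieval scorer -- a minimal sketch in Python, not the algorithm of any actual search engine: it ranks documents purely by how well they match the query template, with no notion of whether their contents are true.

# Toy illustration (hypothetical ranking scheme, not a real engine's):
# documents are scored by keyword overlap with the query template,
# with no test of whether their contents are true.

def score(query: str, document: str) -> float:
    """Fraction of query keywords appearing in the document."""
    keywords = set(query.lower().split())
    words = set(document.lower().split())
    return len(keywords & words) / len(keywords) if keywords else 0.0

documents = [
    "the moon landing was faked on a sound stage",  # bogus, but keyword-rich
    "Apollo 11 reached the lunar surface in 1969",  # genuine, phrased differently
]

query = "was the moon landing faked"
for doc in sorted(documents, key=lambda d: score(query, d), reverse=True):
    print(f"{score(query, doc):.2f}  {doc}")
# The bogus page scores 1.00, the genuine one 0.20: retrieval accuracy
# reflects indexing, not internal validity.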

The real problem of going from inference to data management is
that as much emphasis can be placed on the process as on the product
of a search. That could engender a shaky practice: seek until
you find, but generally accept what meets your expectations. The
corollary, "If you didn't find it, you didn't ask for it
the right way," can shape the process of inquiry into merely
proving one's expectations, a cybernetic prejudice that suspends
critical analysis and evaluation of the results. A related pitfall
is uncritical acceptance of results, i.e., confusing process for
product: "If my search strategy returned an answer, it must
be a correct one." The perils of uncritical thinking are
compounded by some innate characteristics of the Internet: it
is voluminous, it is uncensored, and it can be counterfeit. With
so much information available, efficiency has priority over reliability,
hence the popularity of the search engines. Yet, there is rarely
a day that counterfeit material is not posted somewhere (or reposted,
multiplying its weight to the search engines). Unfortunately,
the Information Age provides no easy answer to this problem, but
presents an abundance of hazards for the uncritical thinker who
uses technology to make his decisions. "Let the buyer beware!"
will take on more significance in a data-intensive Information
Age.

The basis for learning is changing. In the pre-microchip
generation, lessons learned were generated from directly experienced
real-world consequences. The tutelage of mentors was prized because
they had learned the tricks and traps of the trade through personal
or vicarious experience and handed them down through education.
The most valuable mentors were guardians of institutional memory,
who could act like harbor pilots to guide novices to a solution
because they knew where the shallow straits were located, perhaps
because they had once run aground themselves. (One of the interesting
sidelights here is the pressure for increased self-reliance caused
by the disappearance of institutional memory, a result of force
realignments and the inevitable attrition of seasoned veterans
who stood in the past as mentors.)

By contrast, the Information Age has facilitated virtual reality--self-discovery
through simulation of actual situations, events, and problems.
Given data on the real-world parameters, computers can model sensory
cues and contingencies, support gaming of choice alternatives,
and simulate appropriate consequences. Moreover, these simulations
can harmlessly mimic reality so that actual disaster is never
experienced as a consequence. This is their key advantage, of
course. Incremental learning or proficiency can be developed by
stopping or restarting the process at will, completely erasing
the damage of a previous mistake.

As data storage and processing power increase, the boundary between
simulation and reality becomes even less distinguishable. Witness
the power of computer animation in such popular films as Jurassic
Park, Forrest Gump, and Twister, or the
realism of mechanical simulation in the new Boeing 777 training
module. In military affairs, this capability to simulate reality
has been employed to construct a "virtual theater of war"
which seamlessly combines real units and simulated ones to test
doctrinal concepts and materiel effectiveness through simulated
operational and tactical maneuvers.

When reality and simulation become indistinguishable, is there
an indelible effect on the player that desensitizes him to the
damage of real-world catastrophes? When harm and pain exist chiefly
in cyberspace as immaterial states, can a player fail to develop
sympathy for real-world consequences? The allure of virtual reality
could have a dark side--the blunting of the human sentiments to
real war that is fought abstractly, in a manner not too different
from counterpart recreational games. Could advanced technology,
the increasing digitization of the battlefield, and the automation
of combat systems transform the experience of war into another
video arcade game, an abstraction defined by the movement and
deletion of computer icons? In my opinion, the answers are not
clear-cut, but today's most popular computer games have martial
themes, and more are on the way. The story is told of an officer
who, in the pitch of virtual battle, swore at his terminal, "Damn,
I lost an icon!" as an overrun battalion was flagged by the
computer. Even if the story is apocryphal, the potential for the
response clearly exists in cyberwar.

Systematic decisionmaking is eclipsing intuition. Computers
that beat chess masters prove that artificial intelligence and
knowledge-based systems are capable of extremely sophisticated
decisionmaking. Programs that apply experts' collective rules
of thumb have even been shown to make more consistent, reliable
decisions than humans in similar problem-solving situations. Yet
most expert systems operate from data-hungry mathematical models.
They illustrate the inseparable relationship between knowledge
and intelligence--the more you know, the smarter you become; so
to become smarter, you must know even more.
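The rule-of-thumb approach can be sketched in a few lines of Python -- a hypothetical illustration with invented rules, not any fielded system: each heuristic fires consistently whenever its conditions are met, but it can conclude nothing until its required data are present.

# Minimal rule-based "expert system" sketch (hypothetical rules, for
# illustration only). Each rule of thumb needs certain facts before
# it can fire; given the same facts, it always fires the same way.

RULES = [
    # (required facts, condition, conclusion)
    (("fuel_pct",),
     lambda f: f["fuel_pct"] < 20,
     "refuel before advancing"),
    (("enemy_range_km", "own_range_km"),
     lambda f: f["own_range_km"] > f["enemy_range_km"],
     "engage at standoff distance"),
]

def advise(facts: dict) -> list:
    advice = []
    for required, condition, conclusion in RULES:
        if all(key in facts for key in required):  # data-hungry: no facts, no rule
            if condition(facts):
                advice.append(conclusion)
    return advice

print(advise({"fuel_pct": 15}))   # ['refuel before advancing']
print(advise({}))                 # [] -- without data, the system is silent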

If the quantity of evidence determines the certainty of a hypothesis,
then how much evidence is enough? The answer is determined by
the amount of ambiguity in the problem, because computers reason
in all-or-nothing terms and have limited tolerance for partial
evidence. Uncertainty is resolved by redundant observations: more
data must be collected until the ambiguity is squeezed out. Ironically,
systems that can scan a situation in great depth and analyze in
great precision can provide a decisionmaker with so much capability
that he becomes addicted to the information and consequently paralyzed
by it. Recently, the Army Times described a computer-assisted
exercise during which a battle staff failed to notice it was being
overrun by the enemy because the commander was preoccupied with
obtaining more data from his battlefield computer.[7]
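There is standard arithmetic behind this appetite for redundancy -- assuming independent observations with spread $\sigma$, the uncertainty of an estimate shrinks only with the square root of the sample size:

$$\text{standard error} = \frac{\sigma}{\sqrt{n}}$$

Halving the remaining uncertainty therefore requires roughly quadrupling the observations, a diminishing return that invites exactly the data fixation described above.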

Through technology, it is not only possible to suffer paralysis
by analysis, but also to neglect the intuitive skills that give
commanders an important advantage in ambiguous situations. The
author of a recent Military Review essay observes that
intuition allows a commander to focus rapidly on feasible solutions
when time for systematic analysis is unavailable.[8] This capacity
is particularly important in peacekeeping, where the traditional
combat decisionmaking model does not fit. But, as that writer
notes, intuition comes largely from experience with a broad base
of situations. Overreliance on structured systems might, indeed,
stunt the growth of this intellectual capacity and severely limit
a commander's options in unfamiliar scenarios. One of the worst
possible outcomes would be the erosion of a leader's ability to
use his own eyes and ears. While decision systems might present
an unparalleled opportunity to eliminate risks, they could obscure
a strategic leader's awareness of key inputs to the decisionmaking
process which exist outside the range of data available through
computers. Assessment of political and environmental conditions,
for example, will rarely be carried out through the systems of
sensors that will generate most of the input to Force XXI computers.[9]

A Hubris in the Information Age

One of the particular ironies of the Information Age is that the
shifts in expectations and perceptions cataloged here may create
and support superb battle staff officers, because these men and
women of the future will know how to leverage powerful analytical
tools for tremendous advantages in speed, precision, and effect.
Yet, these transformations also could supply a hubris for the
digital general because they make it more difficult to shift from
the operational to the strategic level of leadership.

In the worst case, an officer corps mesmerized by high technology
could produce senior leaders so insecure
without their computer models and decision systems that they could
not step beyond them. That could have dire consequences:

Reluctance to "break out of the box."
When any formal data system becomes a leader's primary commodity
for strategic decisions, the demand for hard evidence can become
the enemy of hunches, eventually suppressing new perspectives
on a situation. A senior leader's experience shouldn't be entrapped
by rigid analytical systems that force a choice among options when
ambiguity is the common circumstance. In fact, strategic
leaders need some personal distance from hard data in order to
sample other channels of reality, such as having face-to-face
discussions to sound out the feelings behind the pros and cons
of an issue.

Death of the metaphor. Just as there are plentiful examples
where critical scientific breakthroughs have occurred while the
right brain (our intuitive, pre-verbal cognitive resource) was
operating ahead of the pack, strategic vision requires an ability
to think in metaphors, to seek related patterns in unrelated objects,
situations, and events. True, our future senior leaders will have
access to more information. The successful ones will be those
who are best able to sort out the important from the interesting.
The development and testing of analogies--the patterns that allow
leaders to see the important under data overload--is a skill that
could waste away under a sterile diet of expert systems and virtual
reality simulations.

Fear of risk and error. I doubt that the best microchip
will ever exceed the value of "Kentucky windage" in
decisionmaking, but the illusion of omniscience from multisensory
information systems might make our leaders fear the "guesstimate,"
preferring to avoid risking mistakes by substituting certainty
models for their intuition.

The Way Ahead or the Way Out?

No one can reliably predict whether technology-driven changes
will necessarily impair the human dimension of senior leadership;
possible negative consequences are so adverse, however, that it
is important to plan to prevent them. That implies at least two
steps: first, to monitor the penetration of "digital thinking"
in our youth and assess its effects on the young soldiers who operate
our battle systems; second, to assert the value of intuition,
risk-taking, and creative thinking throughout our professional
military education process. The first step recognizes that we
may not yet see the phenomenon in our current commanders--they
come from a different generation of technology and culture. With
the real leading edge of the cultural revolution completing elementary
school, we must plan now to meet this generation at the doorway,
prepared to stress values that might be submerged in their experience
and prepared to cultivate abilities that will make them effective
"digital generals" in their own time.

NOTES

1. Colonel Len A. Fullencamp, Department of National Security
and Strategy, US Army War College, personal communication, 1996.

2. This is an arbitrary date, set to match the 45- to 50-year-olds
in 2026 who are today's elementary school students.

3. Marshall McLuhan, Understanding Media: The Extensions of Man (New York: McGraw-Hill, 1964).

5. Thomas Czerwinski, "Command and Control at the Crossroads," in this issue of Parameters, pp. 121-32.

6. Ulric Neisser et al., "Intelligence: Knowns and Unknowns," American Psychologist, 51 (February 1996), 77-101.

7. This was a sidebar to an article by Sean D. Naylor, "Digitized
Force: Better, but Not Smaller; Technology Promises to Change
the Way Battles are Fought," Army Times, 23 October
1995. See also Thomas Czerwinski's assessment of the three common
command systems--by direction, by plan, and by influence--in terms
of their reliance on varying mixes of data and intuition for decisionmaking.
He links the three in their approach to uncertainty and its influence
on the commander. ("Command and Control at the Crossroads,"
in this issue of Parameters, pp. 121-32.)

9. See, for example, the experience of the authors of "Declaring
Victory: Planning Exit Strategies for Peace Operations,"
in this issue (pp. 69-80). The analytical process they used to
plan for relinquishing full responsibility for the "stable,
secure environment" in Haiti to local police was highly structured
and carefully managed. Yet their greatest need was for what they
called "political situational awareness," which proved
very difficult to acquire.

Colonel Paul T. Harig is the Director of the Army Physical Fitness
Research Institute of the US Army War College. He is a graduate
of Fairleigh Dickinson University, the Army Command and General
Staff College, and the Army War College. He holds a doctorate
in clinical psychology from the University of Vermont and a military
proficiency designator A-prefix for expertise in psychology from
the Army Medical Department. Colonel Harig has served on the staff
of the Army Surgeon General and currently functions as an Army
consultant on health promotion policy and senior leader executive
wellness.