Looking ahead to the mid-to-late 1980s, it's becoming clear that
the successful organizations will be the ones which manage information
most effectively. Likewise, the nation that best marshals its
information resources will emerge as the dominant economic force in the
post-industrial age.

For the moment, the United States is ahead in the two areas
critical to this latter-day revolution: exploitation of the
semiconductor chip and development of a superior communications
infrastructure. However, Japan and Europe appear determined to catch up
and perhaps overtake the United States in both areas. European nations
were slow to focus on the importance of communications, but are now
making up for lost time by pushing through plans for an integrated
services digital network. Japan has been a leader in communications
technology for some time, and has recently moved ahead of the United
States in a vital segment of the semiconductor industry, capturing over
70 per cent of the market for 64K RAM chips.

Now, the Japanese have set out to leapfrog USA computer technology
by targeting the development of a "fifth-generation" system by
1990. In an effort reminiscent of the United States space program, Japan
has gathered the youngest and brightest under a charismatic leader and
backed the enterprise with considerable resources. The strategy is
spelled out in an impressive national plan of the Ministry of
International Trade and Industry (Miti), called Fifth Generation
Computer Systems. The plan documents a carefully staged ten-year
research and development program on Knowledge Information Processing Systems (Kips) . . . artificially intelligent machines that can reason,
draw conclusions, make judgments and even understand the written and
spoken word . . . with initial funding and new laboratories in Tokyo
provided by the Japanese government.

This ambitious undertaking is expected to cost as much as $1
billion over ten years. Compared with IBM's annual R&D budget
of over $1.5 billion, the Japanese outlay may not sound that impressive,
until one remembers that the money is for one project and a highly
innovative one at that. However, the rewards are equally grand. If the
plan is successful, Japan could not only dominate the traditional forms
of the computer industry, but establish a "knowledge" industry
in which knowledge itself would be a salable commodity.

In their book "The Fifth Generation: Artificial Intelligence
and Japan's Computer Challenge to the World," Edward
Feigenbaum and Pamela McCorduck claim that the wealth of nations, which
depended upon land, labor and capital during its agricultural and
industrial phases, will come in the future to depend upon information,
knowledge and intelligence.

"That isn't to say that traditional forms of wealth will
be unimportant," Feigenbaum and McCorduck state. "Humans must
eat, and they use up energy, and they like manufactured goods. But in
the control of all these processes will reside a new form of power,
which will consist of facts, skills, codified experience, and large
amounts of easily obtained data, all accessible in fast, powerful ways
to anybody who wants it . . . scholar, manager, policy maker,
professional or ordinary citizen. And it will be for sale."

Feigenbaum and McCorduck believe the Fifth Generation will be more
than a technological breakthrough. "The Japanese expect these
machines to change their lives . . . and everyone else's,"
they say. "Intelligent machines are not only to make Japan's
society a better, richer one by the 1990s, but they are explicitly
planned to be influential in other areas, such as managing energy or
helping deal with the problems of an aging society." In addition,
the new machines will help to increase productivity in the primary
industries such as agriculture and fishing, as well as tertiary
industries, such as services, design and general management, where
productivity improvements have been difficult to achieve.

On a global scale, the Fifth Generation project, even if only
partially successful, could vault Japan into a leadership position in
the world's information processing business.

Recognizing this, individual European nations such as Britain and
France have embarked on similar government-subsidized projects, while
the European Economic Community has funded its own cooperative program.
In the United States, the challenge is being met by the government in
the form of the Pentagon's Defense Advanced Research Projects
Agency, as well as by industry. One of the more unusual efforts is a
joint R&D venture, Microelectronics and Computer Technology
Corporation, created and funded privately by leading United States
firms to help maintain United States technological preeminence and
international competitiveness in microelectronics and computers. The
firm's most complex and ambitious project is a ten-year effort on
fifth generation systems.

Going Beyond von Neumann

The transition from information processing to
"knowledge" processing . . . from computers that calculate and
store data to computers that reason and inform . . . will require a
significant departure from conventional computer designs. The first
four generations of computers . . . vacuum tube, transistorized,
integrated circuit and very-large-scale integrated circuit (VLSI)
computers . . . all followed the same general architecture, based around
a central processor, memory, arithmetic unit and input/output devices.
Known as von Neumann machines, after the computer pioneer and
mathematician John von Neumann, these computers operate principally in
serial fashion, step by step. In contrast, the fifth generation systems
will use new parallel architectures (known collectively as non-von
Neumann architectures), as well as new memory organizations, new
programming languages, and new operations wired in for handling symbols
and not just numbers.

With the von Neumann architecture, the processor-memory bottleneck
ultimately creates a traffic jam that limits the speeds computers can
attain even with the fastest microelectronic circuits. For the fifth
generation systems, the Japanese are considering a "data flow"
computer championed by Jack Dennis at the Massachusetts Institute of
Technology. Data flow computers have a large number of processors, each
with their own memory, and a routing network so that the processors and
memories can communicate with each other and execute instructions
simultaneously.
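The firing discipline of a data flow machine can be sketched in a few lines of Python: an instruction executes as soon as all of its operands have arrived, rather than waiting its turn behind a program counter. This is a toy single-threaded simulation with an invented three-node graph, not a rendering of Dennis's actual design.

```python
# Toy simulation of data-flow firing: a node fires as soon as all
# its inputs are available, not in program order.
# The graph computes (a + b) * (a - b); its structure is illustrative.

import operator

values = {"a": 7, "b": 3}          # initial data tokens
graph = {
    "sum":  (operator.add, ["a", "b"]),
    "diff": (operator.sub, ["a", "b"]),
    "out":  (operator.mul, ["sum", "diff"]),
}

# Repeatedly fire every node whose operands have all arrived.
while len(values) < len(graph) + 2:
    for node, (op, inputs) in graph.items():
        if node not in values and all(i in values for i in inputs):
            values[node] = op(*(values[i] for i in inputs))

print(values["out"])  # (7 + 3) * (7 - 3) = 40
```

Here "sum" and "diff" both become ready in the same pass, which is exactly the opportunity a real data flow machine exploits by running them on separate processors.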

The Japanese are also aiming for chips with ten million
transistors, in contrast to today's limit of a few hundred thousand
transistors. Such processors are being developed in the course of
another Miti effort, the SuperSpeed Computing Project, and will be
adapted into the fifth generation machines. In addition, the Fifth
Generation depends on access to "knowledge" bases in many
locations, so its technology will ultimately be fused with the most
advanced communications technologies the Japanese can design.

Perhaps the biggest challenge the Japanese face is in the field of
artificial intelligence, or AI, a discipline which focuses on machines
that can deduce, infer and learn. In contrast to ordinary computers,
which must be programmed to perform every step involved in solving a
problem, AI machines can figure out how to solve problems on their own.
They need only know what the problem is, and they can find that out by
asking the user questions in languages that resemble human ones, in
contrast to the arcane programming languages needed by conventional
computers.

Artificial intelligence finds application in knowledge-based, or
"expert" systems, which use AI methods to solve problems and
to aid decision making by using a knowledge base along with rules of
inference that apply to the specific field of knowledge. These expert
systems not only replicate and multiply the value of human expertise,
but also capture it and perpetuate it in computerized form, making it
possible to pass on knowledge and experience from generation to
generation. Other AI work has given computers the ability to understand
natural languages such as English, making it easy for computer novices
to use the machines effectively and to develop new computer applications
without programming.

Evolution of Expert Systems

AI research dates from 1956 when the expression was coined by John
McCarthy, then assistant professor of mathematics at Dartmouth College in Hanover, New Hampshire. Early research looked for general
problem-solving solutions, which turned out to be overwhelming in scope.
For instance, if a chess-playing program used AI techniques to examine
every possible move before deciding on the best one, it would have to
look at 10^120 moves for each game. Obviously, the program would
run faster if it could decide which of the possible moves would be
considered "good" in a given context. When playing chess, or
dealing with business problems, people use their accumulated knowledge
of the world and their particular experience to solve problems rather
than trying all possible alternatives. AI researchers came to realize
that computers could be programmed to do the same and this gave birth to
expert systems.
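The scale of exhaustive search is easy to verify with rough arithmetic. Assuming a branching factor of about 30 legal moves per position and games of about 80 half-moves (both figures are illustrative assumptions, not from the article), the game tree already reaches the order of magnitude cited above:

```python
# Rough game-tree arithmetic under illustrative assumptions:
# ~30 legal moves per position, ~80 half-moves per game.
import math

branching_factor = 30
game_length = 80

positions = branching_factor ** game_length
print(f"positions ~ 10^{int(math.log10(positions))}")  # ~ 10^118
```

A program that prunes "bad" moves with accumulated knowledge only has to explore a vanishing fraction of this tree, which is the insight that led to expert systems.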

Today, expert systems are used as diagnostic aids in medicine, as
planning tools for manufacturing and as decision support systems for
dealing with oil drilling problems. According to Feigenbaum and
McCorduck, knowledge is the key factor in the performance of such
systems. They have impressive credentials to discuss the subject.
Feigenbaum, a professor of computer sciences at Stanford University is a
recognized pioneer in artificial intelligence; McCorduck, a science
writer who teaches at Columbia University, has been interested in AI for
two decades and recently authored a history of artificial intelligence
called "Machines Who Think."

They explain that the knowledge needed in an expert system is of
two types: the first comprises the widely shared knowledge that is
written in text books and journals; the second is the knowledge of good
practice and good judgment acquired by human experts over years of work.
In addition to this knowledge, an expert system needs an "inference
procedure," a method of reasoning used to understand and act upon
the combination of knowledge and problem data.

Feigenbaum and McCorduck explain that the knowledge in the
knowledge base must be represented in symbolic form and in memory
structures that can be used efficiently by the problem-solving and
inference subsystem (see figure). This representation can take many
forms. One of the most common is the object, a cluster of attributes
that describe a thing. Another common representation is the rule, which
consists of a collection of statements, called the "if" part,
and a conclusion or action to be taken, called the "then"
part. To find out if a rule is relevant to the reasoning task at hand,
the problem-solving program must scan over the store of "ifs"
in the knowledge base.
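The rule representation described above can be sketched in a few lines of Python (the facts and rules are invented for illustration; the actual Fifth Generation software is specified in Prolog): each rule pairs an "if" part with a "then" part, and the problem solver repeatedly scans the store of "ifs" against the known facts.

```python
# Minimal forward-chaining sketch of the if/then rule representation.
# The facts and rules are illustrative assumptions, not a real system.

facts = {"has_fever", "has_rash"}

rules = [
    # (if-part: set of required facts, then-part: fact to conclude)
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_isolation"),
]

# Scan the "ifs"; fire any rule whose conditions are all present,
# adding its "then" to the facts, until nothing new can be concluded.
changed = True
while changed:
    changed = False
    for if_part, then_part in rules:
        if if_part <= facts and then_part not in facts:
            facts.add(then_part)
            changed = True

print(sorted(facts))
```

Each firing of a rule is one "logical inference" in the sense used later in the article, so a machine rated in Lips is essentially rated on how many such if/then steps it completes per second.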

In the Fifth Generation plan, knowledge will be stored
electronically in a large file known as a relational data base. The job
of automatically updating the knowledge in the file and of organizing
appropriate searches for relevant knowledge will be performed by the
knowledge-base management software. The interaction between the
hardware file and the software file manager will be handled by a logical
language called a relational algebra.
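The relational algebra mentioned here reduces to a handful of operations over sets of tuples. A sketch of two of them in Python, selection and projection, with an invented example relation:

```python
# Sketch of two relational-algebra operations over a relation
# stored as a list of dicts. The example data is invented.

animals = [
    {"name": "sparrow", "class": "bird",   "can_fly": True},
    {"name": "penguin", "class": "bird",   "can_fly": False},
    {"name": "bat",     "class": "mammal", "can_fly": True},
]

def select(relation, predicate):
    """Selection: keep only the tuples satisfying a predicate."""
    return [row for row in relation if predicate(row)]

def project(relation, attributes):
    """Projection: keep only the named attributes of each tuple."""
    return [{a: row[a] for a in attributes} for row in relation]

fliers = project(select(animals, lambda r: r["can_fly"]), ["name"])
print(fliers)  # [{'name': 'sparrow'}, {'name': 'bat'}]
```

Because each operation takes relations in and yields a relation out, the knowledge-base manager can compose such queries freely, which is what makes the algebra a convenient interface between the hardware file and the software file manager.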

The Fifth Generation prototype knowledge-base subsystem will handle
thousands of rules and thousands of objects. Each object will be
allotted a thousand characters of file storage space. Within the
ten-year plan, the Japanese goal is to develop knowledge-base capacity
that will handle tens of thousands of inference rules and 100 million
objects . . . enough to store the entire Encyclopedia Britannica.
Japanese planners want their machines to handle millions of logical
inferences per second (Lips), where one logical inference is an
"if/then" sequence of reasoning. The Japanese have also chosen
Prolog as the language of interaction between the logic processing
hardware and the software that implements the various problem-solving
strategies.

Milestones in Japan's Agenda

The initial milestone in the Fifth Generation plan is a single-user
Prolog workstation, called a personal sequential-inference computer,
which will be capable of performing one million Lips. Feigenbaum and
McCorduck explain that it is intended to be both a prototype for later
development, as well as an intermediate product that may be on the
market by 1985. "This prototype would give an order of magnitude improvement over software-based Prolog implementations on today's
common mainframe computers," they say. The final target for the
subsystem is an inference supercomputer capable of performing 100
million to 1 billion Lips, a speed that Feigenbaum and McCorduck say can
only be achieved by the "insightful use of a great deal of parallel
processing in the computer hardware."

The work will be done by research teams at the Institute for New
Generation Computer Technology (ICOT), supplemented by contract work
done under ICOT's direction. ICOT was formed as an
"instant" institute in April 1982 and staffed by 40
researchers under director Dr. Kazuhiro Fuchi, former head of the
Information Sciences Division of Miti's Electrotechnical Laboratory
and the main architect of the Fifth Generation project. Fuchi assembled
the 40 researchers within two weeks of the start of the project, causing
a controversy in the seniority-conscious country by demanding that
everyone be under 35. Fuchi, who is in his mid-40s, justified his action
by arguing that revolutions aren't made by the elderly.

The young researchers come from the eight firms backing ICOT . . .
Fujitsu, Hitachi, Nippon Electric Corporation, Mitsubishi, Matsushita,
Oki, Sharp, and Toshiba . . . and the two national laboratories that are
also participating, the government-owned Nippon Telephone and
Telegraph's Musashino Laboratories and Miti's own
Electrotechnical Laboratory. After three or four years, the researchers
will be rotated out of ICOT back to their company laboratories.
Meanwhile, there are no proprietary considerations to limit
collaboration among them while they are at ICOT. In addition, the
researchers are routinely sent back to their firms to report on
progress.

The rotation and the routine reports are intended to seed ideas
throughout the participating firms systematically. "Such
cooperation might agitate a Washington antitrust regulator were it to
happen in the United States," note Feigenbaum and McCorduck,
"but ICOT's mission is to foster such cooperation and to
educate industrial scientists actively by joint project work."

Miti's announced commitment of $450 million over the ten-year
period is spread rather lightly over the first three-year phase ($45
million) and then budgeted heavily for the years of expensive
development engineering. The first phase will be funded fully by Miti.
In the second and third phases, Miti expects the funding will be matched
by the participating companies, bringing the total project budget to
about $850 million. Feigenbaum and McCorduck note that other
Miti-initiated national projects have seen higher ratios of
industry-to-government spending, sometimes two or three to one.
"It's very possible that if the project is meeting its
intermediate targets at the end of the first phase, and if the Japanese
economy is strong, the total budget could well escalate to more than $1
billion," the authors state.

The Fifth Generation project is structured over a ten-year period.
The first three-year phase is intended for building the research teams
and laboratories, learning the state-of-the-art, forming the concepts
that will be needed in the later work, and building hardware and
software tools for the later phases. The personal sequential-inference
(PSI) computer is one of these tools. The workstation will be a
prototype of later machines, as will be its problem-solving software.
Early expert system prototype applications will also be written,
according to Feigenbaum and McCorduck.

The second phase of four years is one of engineering
experimentation, prototyping, continuing experiments with significant
applications, and initial experiments at systems integration. The first
thrust at the major problem of parallel processing will be done in these
years. The final phase of three years will be devoted to advanced
engineering, building final major engineering prototypes, and further
systems integration work. During this phase, the R&D results will be
distilled into a set of production specifications for the commercial
products that are to be marketed by the participating companies.

Making Machines Man-Like

One of the major undertakings will be the development of
intelligent interfaces . . . the ability the machines will have to
listen, see, understand and reply to human users . . . which will
require extensive research and development in natural language
processing, speech understanding and graphics and image understanding.

Because computer novices will be the largest group of users,
natural language processing is one of the Fifth Generation
project's most important research goals. Natural language systems
shift the burden of understanding from the user onto the machine; the
natural language system must understand the idiosyncrasies of the user
and his language rather than forcing the user to understand the
idiosyncrasies of the computer. Such a system must accept and answer
requests expressed in the user's natural language, phrased any way
the user chooses to express himself. If any ambiguities exist in the
request, a natural language system must deduce them and ask the user for
clarification.

Natural language processing will also be put to use in the
development of a highly ambitious machine translation program, initially
between English and Japanese, with a vocabulary of 100,000 words,
Feigenbaum and McCorduck report. The goal is 90 per cent accuracy, with
the remaining 10 per cent to be processed by the user. Research in
natural language processing will proceed in three stages, beginning with
an experimental system, followed by a pilot model implementation stage
that is connected with the inference and knowledge base machines, and
concluding with prototype implementations. At that point, the machines
will be expected to understand continuous human speech with a vocabulary
of 50,000 words and 95 per cent accuracy with a few hundred or more
speakers. The speech understanding system is also expected to be
capable of running a voice-activated typewriter and of conducting a
dialogue with users by means of synthesized speech in Japanese or
English.

Picture and image processing are considered almost as important as
language processing, according to Feigenbaum and McCorduck, especially
as they contribute to computer-aided design and manufacture, and to the
effective analysis of aerial and satellite images, medical images and
the like. Here again the research will take place in three phases,
beginning with an experimental phase, followed by the introduction of a
pilot model and finally the implementation of the prototype and its
integration into the Fifth Generation machine. One potential
application is in robotics, where the goal would be to construct robots
that can see, understand and act under differing circumstances.

The Japanese surprised many industry observers by selecting Prolog
(for "programming in logic") as the language of interaction
between the logic processing hardware and software that implements the
various problem-solving strategies . . . effectively the machine
language of the logic processor. Developed in Europe in the late 1970s,
Prolog is used in AI work because its basic terms express logical
relationships among objects, and not just equations as most programming
languages do. However, few of the intelligent systems now in use rely
on Prolog; most of them have been programmed in Lisp (for list
processing), an older language which software engineers have developed
more fully.

Prolog is derived from formal deductive logic, whereas Lisp
comprises sets of equations through which a required function can be
defined in terms of simpler, more primitive functions. The idea is to
decompose the original statement into simpler tasks that can be easily
executed in parallel by the computer. In fact, both languages are
well-suited to parallel processing machines.
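The contrast between the two styles can be sketched in Python (the facts and functions are invented for illustration): a Prolog-style program states relationships and lets a search procedure find the answers, while a Lisp-style program defines the required function in terms of simpler ones.

```python
# Illustrative contrast between the two styles; the data is invented.

# Prolog style: state a relation as facts plus a rule, and let a
# backward-chaining search find every ancestor of a person.
parent = {("alice", "bob"), ("bob", "carol")}

def ancestors(person):
    """ancestor(X, P) holds if parent(X, P), or if parent(X, Y)
    and ancestor(Y, P) for some intermediate Y."""
    direct = {p for (p, c) in parent if c == person}
    return direct | {a for d in direct for a in ancestors(d)}

# Lisp style: define the required function via simpler, more
# primitive functions (here, recursion on the rest of a list).
def length(lst):
    return 0 if not lst else 1 + length(lst[1:])

print(sorted(ancestors("carol")))  # ['alice', 'bob']
print(length(["a", "b", "c"]))     # 3
```

In the Prolog-style half the programmer never says *how* to find the ancestors, only what the relation is; in the Lisp-style half the decomposition into subproblems is explicit, and independent subproblems in either style are natural candidates for parallel execution.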

While development of Prolog machines has been slow, there are a
number of computers and personal workstations to run Lisp, including
machines from Digital Equipment Corp., Fujitsu, Lisp Machines,
Incorporated, Symbolics, Incorporated, and Xerox Corporation. Apollo
Computer, Incorporated, recently extended its programming language base
with Domain Lisp, allowing AI applications to run on its workstations.

Among natural-language interfaces, the best known is Intellect from
Artificial Intelligence Corporation of Waltham, Massachusetts. Intellect
marked a major milestone for AI technology last summer with its
acceptance by IBM Corporation. Serving at present as an interface to
IBM data base products, Intellect could become the primary interface to
a host of software services ranging from communications to modeling to
spreadsheets, allowing relatively inexperienced users to create
sophisticated data base queries and other applications.

Admiral Leads America's Team

Within the United States, the most concerted R&D effort in AI
comes from Microelectronics & Computer Technology Corporation (MCC),
based in Austin, Texas. MCC was created explicitly as a United States
response to Japan's Fifth Generation project. Concerned about the
Japanese challenge, William C. Norris, chairman of Control Data
Corporation, convened a meeting of top computer and microelectronics
industry executives in Orlando, Florida in April, 1982. From the
meeting came a decision to create a joint-venture corporation to fund
research and also contribute researchers. Task forces then thrashed out
a research agenda focusing on four programs . . . computer-aided design
and manufacturing, software technology, packaging and advanced computer
architectures.

The venture partly follows the Japanese model: the companies will
donate scientists and researchers to MCC, loaning them for a minimum of
three years. However, whereas Miti is helping to finance the Japanese
effort, MCC will rely entirely on private funding. Each shareholder
company in MCC can invest money and personnel in any or all of the
programs. In return, a company that funds research gets a three-year
lead in being licensed at no cost to develop its own products and
package them for the marketplace. After that, any company, member or
not, can be licensed.

MCC is now owned by 15 American microelectronics and computer
companies: Advanced Micro Devices, Allied, BMC Industries, Control Data,
Digital Equipment, Harris, Honeywell, Martin Marietta Aerospace, Mostek,
Motorola, National Semiconductor, NCR, RCA, Rockwell and Sperry.
Despite initial antitrust fears, the Department of Justice has announced
that it does not object to the creation of MCC. However, the Department
will be looking at each of the four programs as they develop to see if
the combination of firms cooperating in them could constitute a breach
of antitrust laws.

The shortest program, dealing with the packaging of integrated
circuits, will be funded at about $5 million a year for six years. MCC
also plans to spend about $8 million annually on a seven-year program to
develop new techniques, procedures and tools for improving the
productivity of the software development process by one or two orders of
magnitude. A further $11 million a year is targeted for an eight-year
program to develop new computer-aided design and manufacturing tools for
the production of tomorrow's VLSI chips.

The largest, most expensive and longest running program deals with
advanced computer architectures and is the MCC equivalent to
Japan's Fifth Generation program. Over its ten-year lifetime, the
program will be funded at about $15 million annually for work in four
main areas:

Parallel Processing--to develop the languages and architectures to
allow computers to perform tasks simultaneously instead of sequentially,
with corresponding increases in processing speed.

Data base system management--to improve data base design and
storage methods and capacities to permit flexible storage and faster
retrieval of a broader range of more complex information.

Human factors technology--to improve the relationship between man
and computer by simplifying the use of computers through techniques such
as improved voice or character recognition or use of natural languages.

Knowledge-based systems--to realize the computer's
problem-solving potential by developing new ways to represent human
knowledge and thought concepts, as well as new engineering models and
tools to apply human expertise to a wide range of problems.

To run the corporation, MCC's directors chose retired Admiral
Bobby Ray Inman, former director of the National Security Agency and
former deputy director of the CIA. Inman believes the first three years
of the program will be spent "squeezing out what we really know in
all [four] areas . . . where we need additional basic research and where
to press on with advanced research." The hope is that VLSI
technology will have advanced to a stage by that time where it can be
applied across all four areas. "Defining what [VLSI technology]
will do for us, and what we decide on doing, will be a major effort
that's probably going to take two or three years," Inman says.
"So we'll probably be about the six-year point before we start
dealing with the task of designs we want for computer
architectures."

By next year, Inman envisions a staff of about 250 and a budget of
about $75 million a year. By then he also expects to have detailed
milestones for all four projects, though he has yet to decide whether to
announce them or keep them proprietary.

More than any other single agency in the world, the Pentagon's
Defense Advanced Research Projects Agency (DARPA) is responsible for the
current state-of-the-art in AI. When no USA corporation or foundation
chose to take AI seriously, or could afford to, DARPA supported it
through two decades of vital but highly risky research. A 1981 study by
the Defense Science Board ranked AI in the top ten military technologies
for the 1980s. Today, AI work is funded through the agency's
four-year, $600 million Strategic Computing Program, intended to put
artificial intelligence into military equipment.

Among the systems under development: an "autonomous land
vehicle," which will move around battlefields on legs, guided by a
vision system and a supercomputer to help it distinguish among objects
cluttering its path; a sophisticated command-control-and-communications
battlefield management system that will utilize AI technology to predict
battle scenarios and suggest appropriate strategies; and a pilot's
associate to help identify incoming hostile targets or equipment
malfunctions and to report them in a synthesized voice.

Europe Adds Spirit to Contest

Western Europe has several projects in various stages of
implementation and planning. In the United Kingdom, a fifth-generation
computer project called Alvey is underway based on the Japanese model
with the British government committed to funding more than half of the
five-year $550 million budget. A collaborative effort on the part of
government, industry, academia and other research organizations, the
Alvey program will focus on four research areas: knowledge-based
systems, software engineering, man-machine interface and VLSI
technology.

In addition, government and industry are collaborating on a
research institute devoted to artificial intelligence. Called the
Turing Institute, in honor of British mathematician and computer
theoretician Alan Turing, it will concentrate on fundamental research in
computer architectures, automatic programming, knowledge-based systems
and advanced robotics. Set up in collaboration with the University of
Strathclyde, the institute has a number of industrial sponsors,
including ICL, Sinclair Research, two Shell Oil Research laboratories
and two government agencies.

France has also been paying close attention to the Japanese Fifth
Generation Project. Inria, the French national information sciences
laboratory, formed a group of scientists and industrialists from both
the public and private sectors to plan a French response to the Japanese
challenge. Its overall thrust is a national effort to design and
manufacture software and hardware to compete with Japan's
knowledge-based systems. Meanwhile, Schlumberger, the French oil field
instrumentation specialist, considers artificial intelligence important
enough to have established its own AI group.

The European Economic Community (EEC) also has an R&D plan,
dubbed Esprit, for European Strategic Program for Research in
Information Technology. A $1.2 billion joint venture among the ten EEC
countries, Esprit will focus on three technologies . . . knowledge-based
systems, software and microelectronics . . . as well as two applications
areas . . . office automation and computer-integrated manufacturing.
Esprit was set up with the help of the leading European information
processing companies, and all projects require transnational cooperation
between researchers from more than one EEC country.

European Commission vice president Viscount Etienne Davignon,
architect of the Esprit program, managed to get the European Commission,
European Parliament and national governments to agree to the program in
record time . . . within a few months. However, although the level of
funding was also agreed in principle, the go-ahead was not given because
Esprit became entangled in the wider European community budgetary
crisis. Neither the German nor the United Kingdom governments would
agree to give long-term financial assurances to the Esprit program until
the long-term financial future of the community funding was agreed. In
February, however, the British and German governments finally gave the
financial go-ahead for the first five-year phase of the ten-year
program, with the $1.2 billion budget to be financed on an equal basis
by the EEC and by industry.

To support the Esprit program, the EEC is establishing an advanced
communications network, called the Esprit Information Exchange System.
A team of major vendors is developing software to allow the Esprit
participants not only to send each other files, documents, messages and
software, but also to use each other's computer systems.
Britain's ICL and GEC, West Germany's Siemens, France's
Compagnie Des Machines Bull and Italy's Olivetti won the three-year
contract to develop the network, whose aim is to bridge the
incompatibility that exists between various machines used by Esprit
participants and to provide a manufacturer-independent baseline for
other Esprit developments. Various universities and research centers
will also help to develop the network.

Meanwhile, not content with Esprit and the European national
fifth-generation projects, Bull, ICL and Siemens, Europe's three
largest computer companies, established a joint research institute last
September in southern Bavaria. The institute is owned and financed
equally by the three companies. Research will center on knowledge
processing.

Britain's Alvey program is designed to complement rather than
compete with Esprit. A direct response to the Japanese effort, the
program is Britain's first large-scale collaborative R&D
project between government and industry and represents a doubling of
Britain's research efforts in information technology.

It began when a British delegation attended the Tokyo conference at
which Japan's Fifth Generation program was unveiled. Spurred by
the delegation's report, the British government immediately
commissioned a working party under John Alvey, technical director of
British Telecom, to advise on the scope of a similar collaborative
R&D program. The result was the so-called Alvey program, a
five-year plan to coordinate government, industry and university
research nationwide. All industry work will be 50 per cent
government-funded, while university research will get 100 per cent aid.
To ensure that basic research leads to an end product, the program will
identify a number of "demonstration projects" covering
applications in industry, defense, medicine and the social services. To
coordinate the program, the Alvey Directorate has been established
within Britain's Department of Trade and Industry with Brian Oakley
serving as program director.

USA Choices for the 1990's

Faced with the government-backed Japanese and European challenge,
what should be the United States response? In their book, Feigenbaum
and McCorduck list various possibilities, such as joining with Japan,
forming industrial R&D consortia protected from antitrust
regulations, or relinquishing the hardware effort and concentrating
solely on software--akin to the razor blade company that gives away
razors because profits are in the blades. But what they would really
like to see is a national center for knowledge technology. "It
might be a mega-institute, like Los Alamos, embracing all forms of
knowledge technology," they say. "Or it might be a smaller
multiple-university-run laboratory, such as Brookhaven and Fermilab in
physics." Whatever form it takes, the national laboratory should
be newly created.

"We cannot look to the existing national laboratories for the
kind of innovations a knowledge technology laboratory must produce,
freighted as they are with tradition, stodginess and bureaucracy,"
they say. "Those three horsemen of the intellectual apocalypse
will eventually come to the new laboratory, but while it is still new,
it has at least a fighting chance to achieve brilliance."

Without some kind of plan, Feigenbaum and McCorduck conclude, the
United States should prepare "to become the first great agrarian
post-industrial society."

COPYRIGHT 1984 Nelson Publishing
No portion of this article can be reproduced without the express written permission from the copyright holder.