IS PARALLEL COMPUTING DEAD?
Ken Kennedy, Director, CRPC

The announcement that Thinking Machines would seek Chapter 11 bankruptcy
protection, although not unexpected, sent shock waves through the high-
performance computing community. Coupled with the well-publicized
problems of Kendall Square Research and the rumored problems of Intel
Supercomputer Systems Division, this event has led many people to
question the long-term viability of the parallel computing industry and
even parallel computing itself. Meanwhile, the dramatic strides in the
performance of scientific workstations continue to squeeze the market
for parallel supercomputing. On several recent occasions, I have been
asked whether parallel computing will soon be relegated to the trash
heap reserved for promising technologies that never quite make it.
Washington certainly seems to be looking in the other direction--agency
program managers, if they talk of high-performance computing at all,
seem to view it as a small and relatively unimportant subcomponent of
the National Information Infrastructure.

Is parallel computing really dead? At the very least, it is undergoing a
major transition. With the end of the cold war, there is less funding
for defense-oriented supercomputing, which has been the traditional
mainstay of the high-end market. If parallel computing is to survive in
the new environment, a much larger fraction of sales must be to
industry, which seems to be substantially less concerned with high-end
performance. Two factors bear on the size of the industrial market for
parallel computing. First, most engineering firms have recently moved
from mainframes to workstations. These companies
believe that if they need more computational power than they have on a
single workstation, they should be able to get it by using a network of
such machines. Whether or not this is true, it has substantially
affected sales of tightly coupled parallel systems and must be taken
into account when analyzing the needs of industry users.

A second factor affecting commercial sales of parallel computer systems
has been the reluctance of independent software vendors like MacNeal-
Schwendler to move their applications to parallel machines. I believe
the primary reason for this reluctance has been the absence of an
industry standard interface that supports machine-independent parallel
programming. Without such a standard, a software vendor's investment in
conversion to parallel machines is not protected--when a new parallel
computing architecture with a new programming interface emerges, the
application must be retargeted.

So is parallel computing on its last legs? Although good news has been
very limited over the past few months, there are several reasons why it
is too soon to give up:

First, no matter how powerful the workstation processor is, if it
is possible to write programs that scale from one processor to
thousands, there will be plenty of applications that can take advantage
of the additional computational power that parallelism provides. Even
high-end workstation companies acknowledge this by providing
multiprocessors and support for clusters.
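The scaling argument can be made concrete with a small sketch. The code
below uses Python's standard multiprocessing module purely as a modern
illustration (the names partial_sum and scalable_sum are hypothetical,
not from any system discussed here); the point is that the same program
text runs unchanged on one worker or many, producing the same answer:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # Each worker sums its own slice of the range independently.
    lo, hi = bounds
    return sum(range(lo, hi))

def scalable_sum(n, workers):
    # Split [0, n) into one contiguous chunk per worker; the last
    # chunk absorbs any remainder so the union covers the range.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    total = sum(range(100_000))
    # The result is independent of the degree of parallelism.
    for p in (1, 2, 4):
        assert scalable_sum(100_000, p) == total
```

Doubling the worker count changes only how the work is divided, not the
result -- precisely the property a scalable programming interface must
guarantee if one program is to run from a single processor to thousands.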

Second, the software and algorithm problems that must be solved
for scalable parallel systems to be usable (e.g., development of
scalable parallel programming interfaces and standard programming
languages) also need to be solved for networks of workstations if they
are to succeed as high-end computing engines.

Finally, I believe independent software vendors will adopt
parallelism when parallel programming interfaces become industry
standards. Recently, J.S. Nolan and Associates, a vendor of reservoir
analysis codes, has undertaken a major project to implement their very
popular code VIP in PVM. This has been made possible by the portability
of PVM, which has yielded implementations on most common parallel
computing systems, including networks of workstations.

From the outset, we have known that a lot of work was needed to make
scalable parallel computing truly useful. Projects at the Center for
Research on Parallel Computation and throughout the computational
science community are now beginning to bear fruit. Standards like High
Performance Fortran and Message Passing Interface, along with portable
systems like PVM, are beginning to make parallel computing more
palatable to commercial firms.
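What a standard messaging layer buys the software vendor is the
stability of a small set of operations: send, receive, spawn. As a
rough sketch only -- using Python pipes here rather than the actual PVM
or MPI library calls, which express the same send/receive pattern
through their own routines:

```python
from multiprocessing import Process, Pipe

def worker(conn):
    # Receive a work unit, compute, and send the result back --
    # the same pattern a PVM or MPI application expresses through
    # its message-passing library's pack/send/receive calls.
    data = conn.recv()
    conn.send(sum(data))
    conn.close()

if __name__ == "__main__":
    parent_end, child_end = Pipe()
    p = Process(target=worker, args=(child_end,))
    p.start()
    parent_end.send([1, 2, 3, 4])
    assert parent_end.recv() == 10
    p.join()
```

An application written against the messaging layer alone can follow
that layer to any machine on which it is implemented, which is the kind
of portability that makes a commercial port to a system like PVM a
protected investment rather than a one-machine gamble.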

So I assert that parallel computing is not dead--it is simply suffering
through a period of growing pains. We are coming to realize that
parallel computing is really a software problem. Therefore, we should
not be too distressed when a company drops out of the crowded hardware
market. The industry needs software and standards to make writing
parallel applications easier. If it gets them, the whole industry will
grow, creating more business for everyone.