This
paper presents the first incarnation of a public-key cryptosystem. The principal
computation used for encryption and decryption is exponentiation with respect
to a composite modulus. This paper, together with the papers of W. Diffie and
M. Hellman ("New Directions in Cryptography," IEEE Trans. Inf. Theory,
vol. IT-22, 1976) and R. Merkle ("Secure Communications Over Insecure Channels," Commun.
ACM, vol. 21, no. 4, 1978, pp. 294-299), is generally regarded as one of the seminal papers
in the field of public-key cryptography. The RSA system continues to occupy
a central place in both the theoretical and practical development of the field.
More than 400,000,000 copies of the RSA algorithm are currently installed, and
it is the primary cryptosystem used for security on the Internet and World Wide
Web.
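The principal operation described above, exponentiation modulo a composite, can be sketched in a few lines. This is a toy illustration only, with tiny assumed primes (61 and 53) and public exponent 17; real RSA requires large random primes and message padding:

```python
def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b).
    if b == 0:
        return (a, 1, 0)
    g, x, y = egcd(b, a % b)
    return (g, y, x - (a // b) * y)

def make_keys(p, q, e=17):
    # n = p*q is the composite modulus; d inverts e modulo phi(n).
    n = p * q
    phi = (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)
    assert g == 1, "e must be coprime to phi(n)"
    return (e, n), (d % phi, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)   # exponentiation with respect to the composite n

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

# Tiny demonstration parameters (61, 53 are assumed illustrative primes).
pub, priv = make_keys(61, 53)
c = encrypt(42, pub)
assert decrypt(c, priv) == 42
```

The same modular-exponentiation primitive serves both directions; only the exponent differs between the public and private keys.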

"The
problem of distinguishing prime numbers from composites, and of resolving composite
numbers into their prime factors, is one of the most important and useful in
all of arithmetic.... The dignity of science seems to demand that every aid
to the solution of such an elegant and celebrated problem be zealously cultivated."

- Carl Friedrich Gauss

Disquisitiones Arithmeticae, Art. 329 (1801)

(translation from D. E. Knuth's "The Art of Computer Programming,"
Vol. 2, second edition, Addison-Wesley, 1981, p. 398.)

The
problem of distinguishing prime numbers from composite numbers is ancient. Since
the time of Eratosthenes' sieve it has attracted the attention of many distinguished
researchers. This paper presents a 'nearly polynomial time' deterministic algorithm
for the problem. More specifically, there exists a positive real c such that
for sufficiently large n, the algorithm halts within (log n)^(c log log log n)
steps. The next best deterministic algorithm is strictly exponential. The primary
methods used in the algorithm are from algebraic number theory and class field
theory (higher reciprocity laws). H. W. Lenstra, A.K. Lenstra and H. Cohen were
able to simplify and implement the algorithm. Their implementation can test
the primality of numbers of hundreds of digits in a few minutes.
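For context on "strictly exponential": trial division is the classic deterministic baseline. It takes roughly sqrt(n) steps, which is exponential in the length (number of digits) of the input n. A minimal sketch:

```python
import math

def is_prime_trial(n):
    # Deterministic, but about sqrt(n) divisions in the worst case:
    # exponential in len(str(n)), hence useless for hundred-digit numbers.
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

assert is_prime_trial(97)
assert not is_prime_trial(91)   # 91 = 7 * 13
```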

It
appears that this is the first result in theoretical computer science ever published
in the Annals of Mathematics.

Note:
This result has since been superseded by the brilliant work of A. Wiles in settling
Fermat's last theorem.

'Fermat's
last theorem' is, of course, the conjecture that:

x^n + y^n = z^n
has no solutions in the positive integers when n > 2.

By
1976 enough was known that it was possible to establish by computation that
the so-called "first case" of Fermat's last theorem held for all primes p <
3×10^9 (see "13 Lectures on Fermat's Last Theorem," by P. Ribenboim,
Springer-Verlag, 1979).
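Such computations rest on classical criteria rather than direct search. One is Wieferich's theorem (1909): if the first case of Fermat's last theorem fails for an odd prime p, then 2^(p-1) ≡ 1 (mod p^2). So any prime that is not a Wieferich prime satisfies the first case. A minimal sketch of that check (the function name is my own):

```python
def first_case_criterion(p):
    # Wieferich (1909): if the first case of FLT fails for prime p,
    # then 2**(p-1) ≡ 1 (mod p**2).  A prime failing this congruence
    # (a non-Wieferich prime) therefore satisfies the first case.
    return pow(2, p - 1, p * p) != 1

# The only known Wieferich primes are 1093 and 3511.
assert not first_case_criterion(1093)
assert first_case_criterion(5)
```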

In
this paper (and the companion paper "Théorème de Brun-Titchmarsh; application
au théorème de Fermat" by E. Fouvry, Invent. Math. 79: 383-407, 1985) the
following was proved:

Theorem [Adleman, Fouvry, Heath-Brown]. There exist infinitely many primes for which the first
case of Fermat's last theorem holds.

The
existence of a random polynomial time algorithm for the set of primes is demonstrated.
The techniques used are from arithmetic algebraic geometry, algebraic number
theory and analytic number theory. The result complements the well known result
of Solovay and Strassen ("A Fast Monte-Carlo Test For Primality,"
SIAM J. Comput. 6 (1977), 84-85) that there exists a random polynomial time
algorithm for the set of composites.

If
one is willing to accept randomness in computation, then this result settles
the theoretical question of the existence of a polynomial time algorithm for
distinguishing prime numbers from composite numbers.
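The Solovay-Strassen test itself is short: for a prime n, every a satisfies the Euler criterion a^((n-1)/2) ≡ (a/n) (mod n), where (a/n) is the Jacobi symbol, while a composite n violates it for at least half of all a. A sketch under those facts (function names are illustrative):

```python
import random

def jacobi(a, n):
    # Jacobi symbol (a/n) for odd n > 0, by quadratic reciprocity.
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):   # (2/n) = -1 when n ≡ 3, 5 (mod 8)
                result = -result
        a, n = n, a               # reciprocity: flip sign if both ≡ 3 (mod 4)
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def solovay_strassen(n, rounds=20):
    # False: n is certainly composite.  True: n is probably prime;
    # a composite survives each round with probability at most 1/2.
    if n < 2 or n % 2 == 0:
        return n == 2
    for _ in range(rounds):
        a = random.randrange(2, n)
        j = jacobi(a, n) % n
        if j == 0 or pow(a, (n - 1) // 2, n) != j:
            return False          # a is a witness that n is composite
    return True

assert solovay_strassen(2 ** 61 - 1)   # a Mersenne prime
assert not solovay_strassen(561)       # Carmichael number, still detected
```

Note the asymmetry the surrounding text describes: a "composite" verdict is a proof, while a "prime" verdict is only probabilistic.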

A
small instance of the 'Hamiltonian path problem' is encoded in molecules of
DNA and solved in a test tube using the tools of molecular biology. This is
apparently the first example of computation carried out at the molecular level
and suggests the possibility of fundamental connections between biology and
computer science.
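For comparison with the molecular approach, a conventional brute-force solver for the same problem can be sketched as follows (the graph below is an assumed small instance, not the one from the experiment); the DNA computation explores the same space of candidate paths massively in parallel:

```python
from itertools import permutations

def hamiltonian_path(vertices, edges, start, end):
    # Brute force over all vertex orderings: exponentially many candidates,
    # which is exactly the search the test-tube chemistry parallelizes.
    edge_set = set(edges)
    for perm in permutations(vertices):
        if perm[0] != start or perm[-1] != end:
            continue
        if all((perm[i], perm[i + 1]) in edge_set
               for i in range(len(perm) - 1)):
            return list(perm)     # a path visiting every vertex once
    return None

# Assumed 4-vertex directed instance for illustration.
edges = [(0, 1), (1, 2), (2, 3), (1, 3), (0, 2)]
assert hamiltonian_path([0, 1, 2, 3], edges, 0, 3) == [0, 1, 2, 3]
```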