Definition

We shall start by giving the definition of the entropy of a dynamical system.
Consider dynamical systems with discrete
time. The phase space of the dynamical system is denoted by \(M\ .\) It is
equipped with a \(\sigma\)-algebra \(\mathcal M\) and a probability measure \(\mu\)
defined on \(\mathcal M\ .\) In general ergodic theory, the dynamics is given by a
measurable transformation \(T\) of \(M\) onto itself preserving the measure
\(\mu\ .\) For many applications it is enough to assume that \(M\) is a
Lebesgue space, i.e., that it is isomorphic (as a measure space) to the unit
interval with the usual Lebesgue measure. The concept of a Lebesgue space
was introduced in the works of Halmos, von Neumann, and Rokhlin and is now
widely used.
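
In standard form (see, e.g., [CFS]), the definition reads as follows. For a finite measurable partition \(\xi = \{ C_1, \ldots, C_k \}\) of \(M\) put
\[H ( \xi ) = - \sum_{i=1}^{k} \mu ( C_i ) \log \mu ( C_i )\ ,\]
and set
\[h ( T, \xi ) = \lim_{n \to \infty} \frac{1}{n} H \Bigl( \bigvee_{j=0}^{n-1} T^{-j} \xi \Bigr)\ ,\]
where \(\bigvee\) denotes the common refinement of partitions; the limit exists by subadditivity. The entropy of \(T\) is then
\[h ( T ) = \sup_{\xi} h ( T, \xi )\ ,\]
the supremum being taken over all finite measurable partitions \(\xi\ .\)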

It is clear from the definition that this entropy is a metric invariant
of a dynamical system. The following theorem is the main tool for
computing \(h ( T )\ .\) It uses the notion of a generating
partition.
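
A partition \(\xi\) is called generating for \(T\) if the partitions \(T^{n} \xi\ ,\) \(- \infty < n < \infty\ ,\) together generate the whole \(\sigma\)-algebra \(\mathcal M\) (mod 0). In its standard form, the theorem states that for any generating partition \(\xi\) with \(H ( \xi ) < \infty\)
\[h ( T ) = h ( T, \xi )\ .\]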

This theorem was proven for Bernoulli partitions by Kolmogorov in his lectures.
The proof was based upon the formula for the entropy of Bernoulli partitions.
The proof in the general case was given in [S1]. It used
an inequality for conditional entropies which becomes an equality in the
Bernoulli case. However, at the time when the notion of entropy appeared, the transition from
equality to inequality was a serious step.
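
The formula for the entropy of a Bernoulli shift mentioned above, \(h = - \sum_i p_i \log p_i\ ,\) is straightforward to evaluate. A minimal sketch in Python (the function name is illustrative):

```python
import math

def bernoulli_entropy(p):
    """Entropy of the Bernoulli shift with state probabilities p:
    h = -sum_i p_i * log(p_i), with the natural logarithm."""
    if not math.isclose(sum(p), 1.0):
        raise ValueError("probabilities must sum to 1")
    # Terms with p_i = 0 contribute nothing (0 * log 0 = 0 by convention).
    return -sum(q * math.log(q) for q in p if q > 0)

# The 2-shift (two equal symbols) and the 3-shift (three equal symbols)
# have entropies log 2 and log 3, hence are metrically non-isomorphic.
h2 = bernoulli_entropy([1/2, 1/2])
h3 = bernoulli_entropy([1/3, 1/3, 1/3])
```

Since \(\log 2 \neq \log 3\ ,\) the computation exhibits exactly the distinction between 2-shifts and 3-shifts discussed in the History section below.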

Entropy can also be defined for dynamical systems with continuous time
because
\[h ( T^t) = | t | h ( T^1)\ ,\] \(- \infty < t < \infty\ .\)
For special flows, the so-called Abramov formula connects the entropy of the flow with
the entropy of the base automorphism.
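
In its usual form, Abramov's formula states that for the special flow \(\{ S^t \}\) built over an automorphism \(T\) with ceiling function \(f > 0\)
\[h ( S^1 ) = \frac{h ( T )}{\int_M f \, d\mu}\ .\]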

The exposition of the entropy theory of dynamical systems can be found in
many monographs and textbooks; see, e.g., [B], [CFS], [P], [W]. Many
examples of dynamical systems with positive entropy are now known, even within
the class of deterministic dynamical systems.
A general theorem of Kushnirenko states that the entropy of a flow generated by a smooth vector field on a compact smooth manifold is finite.
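
A classical example of a deterministic system with positive entropy is the hyperbolic automorphism of the two-dimensional torus discussed in the History section: in its commonly stated form, the entropy of such an automorphism equals the sum of the logarithms of the moduli of the eigenvalues exceeding one. A minimal numerical sketch (the function name is illustrative):

```python
import math

def toral_automorphism_entropy(m):
    """Entropy of a hyperbolic automorphism of the 2-torus given by an
    integer 2x2 matrix m with |det m| = 1: the sum of log|lambda| over
    the eigenvalues lambda with |lambda| > 1."""
    (a, b), (c, d) = m
    tr = a + d
    det = a * d - b * c
    disc = tr * tr - 4 * det
    if disc <= 0:
        raise ValueError("matrix has no real eigenvalues off the unit circle")
    # Eigenvalues of a 2x2 matrix via the quadratic formula.
    eigenvalues = [(tr + s * math.sqrt(disc)) / 2 for s in (+1, -1)]
    return sum(math.log(abs(lam)) for lam in eigenvalues if abs(lam) > 1)

# The matrix [[2, 1], [1, 1]] has eigenvalues (3 +- sqrt(5)) / 2,
# so the entropy equals log((3 + sqrt(5)) / 2), which is positive.
h = toral_automorphism_entropy([[2, 1], [1, 1]])
```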

History

The notion of the metric entropy of a dynamical system, also known as measure-theoretic entropy, Kolmogorov entropy, Kolmogorov-Sinai entropy, or simply KS entropy, appeared in the paper by Kolmogorov ([K1]).
This was a time when Kolmogorov was interested in and worked on
several problems from information theory, the dimension of
functional spaces, and so on. The main theorem of KAM theory had
appeared in the works of Kolmogorov a little earlier.
In 1957, Kolmogorov led a seminar on dynamical systems which was attended by such
people as Alexeev, Arnold, Tikhomirov, Pinsker, Meshalkin, the author of
the present paper and others.

The problem of metric isomorphism of dynamical systems was certainly one
of the most often discussed. In his lectures accompanying the seminar,
Kolmogorov presented a probabilistic proof of the von Neumann theorem on
isomorphism of dynamical systems with pure point spectrum. The prevailing
point of view at that time was that dynamical systems arising in
probability theory are different, even from the metric point of view, from
dynamical systems generated by ODEs and PDEs. Motivated by all these
ideas, Kolmogorov proposed the notion of entropy, which, it was
believed, would allow one to distinguish "probabilistic" dynamical
systems from "deterministic" dynamical systems. The first announcement
of entropy was made by Kolmogorov in one of his lectures. It contained the metric
invariant for Bernoulli shifts and gave the proof that 2-shifts and
3-shifts are metrically non-isomorphic. However, in the text prepared
for publication the exposition was quite different. Kolmogorov
introduced a new class of dynamical systems, which he called
quasi-regular, and defined the notion of entropy only for quasi-regular
systems.

Quasi-regular dynamical systems resembled the regular stationary processes
studied earlier in the theory of random processes. Later,
for some time, quasi-regular dynamical systems were called Kolmogorov
systems. However, at some point Kolmogorov asked to change this
terminology because he hoped to work in this area and it would be
inconvenient for him to use words like "Kolmogorov system." The
change was made, and the term K-system is now commonly accepted.
However, Kolmogorov did not return to this field. The notion of a
K-system plays an important role in ergodic theory.

Having written the text and submitted it for publication, Kolmogorov
left Moscow for Paris, where he spent the whole semester.

At that time, the author of this paper was a graduate student and Kolmogorov
was his advisor. It was natural to start thinking about a notion of entropy that could be
applied to all dynamical systems. After several attempts, the definition
given above was invented. However, there were no non-trivial
examples of dynamical systems to which this definition could be applied.
During this time, V.A. Rokhlin became aware of Kolmogorov's paper and his general
definition of entropy. Rokhlin became very excited
and proposed computing the entropy of the automorphism of the
two-dimensional torus. Following the prevailing point of view at that time, the author
tried to prove that the entropy is zero, because the automorphism of the
torus is certainly a purely "deterministic" dynamical system. The proof did not go through.
Kolmogorov was the first to suggest that the entropy must be positive.
After that, it became possible to prove the needed result, and there were enough reasons to publish the general definition of entropy together with the formula for the entropy of the torus automorphism.
This was done in [S1]; the title of the paper was "On the notion of entropy of a dynamical
system." Some time later, Rokhlin found an example which showed that
the entropy introduced by Kolmogorov is not a metric invariant of a
dynamical system. This example was reproduced in the paper by
Kolmogorov (see [K2]). There are now many examples of this
kind (see the paper [LPS] by E. Lindenstrauss, Y. Peres, and W. Schlag).
However, the result proven by Kolmogorov in his lecture was
correct!

The financial support from the NSF, grant #DMS 0600996, is highly
appreciated.