Professor of Electrical Engineering

Making Complexity Simple

Outline

Arcane

Pity

Perspective

Information

Course

Status

The Second Law of Thermodynamics

Surely one of science's most glorious accomplishments
Also one of the most profound and mysterious
What is this thing called "entropy"?
Does the Second Law really tell us which way clocks run?
Deep related concepts
Complexity
Randomness
Fluctuations
Dissipation
Order
The public's imagination is captured
Nobody really understands it
At least that is what we believed as graduate students
It is too complex to be taught

Why is entropy considered arcane?

Because people think it is part of thermodynamics
Thermodynamics really is hard to understand.
Many concepts -- heat, work, temperature, ...
You have to care about things like ideal gases.
Most engineers don't need to know thermodynamics.
Most don't care.

This is a shame

Entropy and the Second Law are taught as part of
thermodynamics, so most people miss them
... and thereby miss something pretty important
All scientists, all engineers, indeed all educated people need
to understand that some operations are reversible and
some are not.
They need to be able to tell them apart.
They need to know there is a way to quantify reversibility.
They need a "monotonic model"
To complement models of conserved quantities
Both are helpful for understanding the world
Both are prototypes for other quantities
Can we satisfy these very real needs somehow?
They have nothing to do with thermodynamics.
It's just that thermodynamics is where they have traditionally
been taught.
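The reversible/irreversible distinction can be made concrete without any thermodynamics at all. Here is a minimal sketch (our own illustration, not material from the talk) contrasting two bit operations: NOT, which is reversible, and erasure, which is not.

```python
def invert(bit):
    """NOT gate: reversible, since the input can be recovered from the output."""
    return 1 - bit

def erase(bit):
    """Erasure: irreversible, since both inputs map to the same output."""
    return 0

# Applying NOT twice recovers the original bit, so no information is lost.
assert all(invert(invert(b)) == b for b in (0, 1))

# Erasure sends both 0 and 1 to the same value, so the input cannot be recovered.
assert erase(0) == erase(1)
```

A reversible operation is one whose input can always be reconstructed from its output; that is the property the course wants students to learn to recognize and quantify.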

Entropy is useful outside of thermodynamics

Thermodynamics always involves energy
Entropy need not
Outside of thermodynamics, without links to energy
Entropy is less complex
There are plenty of reversible and irreversible operations
The Second Law, or something much like it, exists
Monotonic models can be taught more easily
The more general the context, the simpler the concept
This is the secret of making complexity simple

War is too important to be left to the generals*

Information is simpler and more general

Start with information
Entropy is one kind of information.
Entropy is information we do not have
See reversible and irreversible data transformations
In computation and communications
Note that irreversible operations destroy information
This is the Second Law in this context
Apply to a physical system with energy
Use maximum-entropy principle
Voila, thermodynamics!
Temperature is energy per bit of entropy (sort of)
Intensive vs. extensive variables
Second Law in traditional setting
Carnot efficiency
The basic idea of reversibility is not difficult to understand
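The slogan "entropy is information we do not have" can be made quantitative with Shannon entropy measured in bits. The short sketch below is our illustration (the function name and example numbers are ours, not the course's):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the missing information, H = -sum p*log2(p)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit of missing information.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is partly predictable, so less information is missing.
print(entropy_bits([0.9, 0.1]))   # about 0.47

# A certain outcome leaves nothing unknown: zero entropy.
print(entropy_bits([1.0]))        # 0.0
```

Notice that the math is discrete sums and logarithms, no calculus of continuous processes, which is part of why this material is within reach of freshmen.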

We want to teach this stuff to freshmen

Why is this possible?
Today's students are different
Best to start from the known
Data, disks, Internet, packets, bits, ...
Consistent with the coming information age
Go toward the unknown
Thermodynamics, equilibrium, heat engines, refrigerators
Relevant to the current industrial age
Physical view of information . . . like energy, information
can be of many types
can be converted from one form to another
can exist in one place or another
can be sent from here to there
can be stored for later use
There are interesting applications
Biology (genetic code)
Communications
Quantum computing

Course material (cont)

11. Temperature
One of the Lagrange multipliers is temperature
Heat, work
Carnot efficiency
12. Myths
Order out of chaos
Miracle needed
Heat death
Evaporation of black holes
Difficulty of extensions to social science
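The temperature point in item 11 can be written out in a few lines; the following is the standard maximum-entropy sketch, not the course's own notes. Maximizing entropy subject to normalization and a fixed average energy introduces two Lagrange multipliers, and the one attached to the energy constraint is (up to a constant) the inverse temperature:

```latex
% Maximize  S = -k_B \sum_i p_i \ln p_i
% subject to  \sum_i p_i = 1  and  \sum_i p_i E_i = \langle E \rangle .
% Setting the gradient of the Lagrangian to zero gives the Boltzmann form
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
\qquad \beta = \frac{1}{k_B T}
% and reversibility bounds any heat engine by the Carnot efficiency:
\eta_{\mathrm{Carnot}} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}
```

This is the sense in which "temperature is energy per bit of entropy (sort of)": the multiplier fixes the exchange rate between energy and missing information.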

Can freshmen actually learn this stuff?

We are finding out
Fall 1999, course development
Faculty: Paul Penfield, Seth Lloyd, Sherra Kerns
Students: small set of freshmen serving as guinea pigs
Spring 2000, pilot offering
Limited to 50 freshmen
Of course we have a Web site -- everybody does
http://www-mtl.mit.edu/users/penfield/6.095-s00/
Fall 2000, revisions, note writing
Spring 2001, first full offering
The devil is in the details
E.g., which is the best statistical mechanics model to use?

Some of the subtle points (risks)

Entropy is the information we don't have
Therefore entropy is subjective (some people don't like that)
Math generally simple -- discrete, not continuous processes
But Lagrange multipliers are not easy
Skill in modeling is still important
No magic here
At the freshman level you cannot go very deeply
All we can do is provide simple ideas to be built on later
We want to be consistent with later learning in specific areas
Stay in touch -- we will let you know how it turns out
If we are successful
The course will be permanent
We will help other universities start similar courses
We will advocate it as a science exposure in the liberal arts