Paul Penfield, Jr.
D. C. Jackson Professor of Electrical Engineering
Also one of the most profound and mysterious
What is this thing called "entropy"?
Does the Second Law really tell us which way clocks run?
Deep related concepts
The public's imagination is captured
At least that is what we believed as graduate students
It is too complex to be taught
Thermodynamics really is hard to understand.
Many concepts -- heat, work, temperature, ...
You have to care about things like ideal gases.
Most engineers don't need to know thermodynamics.
Most don't care.
All scientists, all engineers, indeed all educated people need to understand that some operations are reversible and some are not.
They need to be able to tell them apart.
They need to know there is a way to quantify reversibility.
They need a "monotonic model"
To complement models of conserved quantities
Both are helpful for understanding the world
Both are prototypes for other quantities
But that is where they have traditionally been taught.
Entropy need not be taught there (at least at one level)
Entropy is less complex
There are plenty of reversible and irreversible operations
The Second Law, or something much like it, exists
Monotonic models can be taught more easily
This is the secret of making complexity simple
* Georges Clemenceau (1841 - 1929)
French Premier, 1906 - 1909, 1917 - 1920
Shannon believed this was not a fundamental identity
As far as I know Jaynes agreed
The question is, can we convert from one to the other?
Vastly different scales
They are the same concept
Information is physical
Entropy is subjective
Both are relative
One can be traded for the other (with experimental difficulty)
We should think about entropy conversion systems
The ideas are simple, the applications are arcane
Unifying the concepts simplifies them and lets freshmen in
Entropy is one kind of information -- information we don't have
In computation and communications
They also give meaning to causality and the direction of time
This is the Second Law in this context
Use principle of maximum entropy
Temperature is energy per bit of entropy (sort of)
Second Law in traditional setting
12 weeks (Spring 2001 outline):
1. Bits and Codes (perhaps next year qubits)
9. Physical Systems
12. Quantum Information
Introduction to Computing
Signals and streams
Binary code, gray code
Helps with low channel capacity
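A minimal sketch (my own Python, not from the slides) of the binary/Gray conversion; the point of a Gray code is that successive values differ in only one bit:

```python
def binary_to_gray(n: int) -> int:
    # Successive integers differ in exactly one bit of their Gray codes.
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    # Undo the XOR by folding the shifted bits back in.
    n = g
    while g:
        g >>= 1
        n ^= g
    return n
```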
Irreversible -- fidelity requirement
Run length encoding
The LZW patent issue
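Run-length encoding is the simplest of these lossless schemes; a sketch in Python (names are mine, for illustration only):

```python
def rle_encode(s: str) -> list:
    # Collapse each run of identical characters into a (char, count) pair.
    runs = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

def rle_decode(runs) -> str:
    # Expand each (char, count) pair back into a run.
    return "".join(ch * count for ch, count in runs)
```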
Physical sources of noise
coins, dice, cards
Probabilities are relative and subjective
Information can be quantified
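One standard way to quantify it is Shannon's entropy; a small Python illustration (mine, not the course's code):

```python
from math import log2

def entropy_bits(probs):
    # Shannon entropy H = -sum p*log2(p), in bits.
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit; a fair die carries log2(6) bits;
# a biased coin carries less than 1 bit.
```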
Model with source, coder, channel, decoder, receiver
Source coding theorem
TCP and IP
Strategies for recovery from lost packets
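The source coding theorem bounds the average code length below by the entropy; a sketch using Huffman coding (my own illustration; with dyadic probabilities the bound is met exactly):

```python
import heapq
from math import log2

def huffman_lengths(probs):
    # Build a Huffman tree; return the code length of each symbol.
    # The integer tiebreaker keeps heap comparisons away from the lists.
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1          # every merge adds one bit to these codes
        heapq.heappush(heap, (p1 + p2, count, s1 + s2))
        count += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
avg = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * log2(p) for p in probs)
# Source coding theorem: H <= avg < H + 1
```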
Discrete memoryless channel
M = I_in - L
  = I_out - N
Cascade inequalities in L, N, and M
L_1 <= L
L_1 + L_2 - N_1 <= L <= L_1 + L_2
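These identities can be checked numerically for a binary symmetric channel (a sketch; variable names are mine):

```python
from math import log2

def H(dist):
    # Entropy in bits of a list of probabilities.
    return -sum(p * log2(p) for p in dist if p > 0)

def bsc(p, q=0.5):
    # Binary symmetric channel: crossover probability p, P(input = 0) = q.
    # Joint distribution over (input, output) pairs (0,0), (0,1), (1,0), (1,1).
    joint = [q * (1 - p), q * p, (1 - q) * p, (1 - q) * (1 - p)]
    I_in = H([q, 1 - q])
    I_out = H([joint[0] + joint[2], joint[1] + joint[3]])
    L = H(joint) - I_out   # loss: what the output fails to tell us about the input
    N = H(joint) - I_in    # noise: what the channel adds to the input
    M = I_in - L           # mutual information
    return I_in, I_out, L, N, M

I_in, I_out, L, N, M = bsc(0.1)
# Check: M = I_in - L = I_out - N
```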
Given received signal, what is input
How much information have we learned?
Entropy is information we do not have
It is relative and subjective
Input probabilities consistent with constraints
Minimum bias means maximum entropy
Simple examples with analytic solution
Examples with one constraint, many states
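A numerically solved sketch (mine): with one expected-value constraint, the maximum-entropy distribution has the exponential form p_i proportional to exp(-beta * v_i), and beta can be found by bisection since the mean is decreasing in beta.

```python
from math import exp

def maxent(values, target, lo=-50.0, hi=50.0):
    # Maximum-entropy distribution subject to sum(p_i * v_i) = target.
    def mean(beta):
        w = [exp(-beta * v) for v in values]
        z = sum(w)
        return sum(wi * vi for wi, vi in zip(w, values)) / z
    for _ in range(200):
        mid = (lo + hi) / 2
        if mean(mid) > target:   # mean too high: need a larger beta
            lo = mid
        else:
            hi = mid
    beta = (lo + hi) / 2
    w = [exp(-beta * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes-style loaded die: faces 1..6 constrained to average 4.5
p = maxent([1, 2, 3, 4, 5, 6], 4.5)
```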
Microscopic vs. macroscopic
Energy per state
Expected value of energy
One of the Lagrange multipliers is temperature
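A sketch of that identification (my own example, with Boltzmann's constant set to 1): for a canonical distribution, a small change in mean energy divided by the change in entropy (in nats) recovers the temperature.

```python
from math import exp, log

def boltzmann(energies, T):
    # Canonical distribution p_i proportional to exp(-E_i / T), k_B = 1.
    w = [exp(-e / T) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

def mean_energy(energies, T):
    return sum(p * e for p, e in zip(boltzmann(energies, T), energies))

def entropy_nats(energies, T):
    return -sum(p * log(p) for p in boltzmann(energies, T) if p > 0)

# Two-level system: finite-difference check that dE/dS is close to T.
levels = [0.0, 1.0]
dE = mean_energy(levels, 1.001) - mean_energy(levels, 0.999)
dS = entropy_nats(levels, 1.001) - entropy_nats(levels, 0.999)
# dE / dS should be close to 1.0 (the temperature)
```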
You can trade one for the other
Difference in scales is being erased by Moore's law
What does that say about physics?
But Lagrange multipliers are not easy
No magic here
All we can do is provide simple ideas to be built on later
We want to be consistent with later learning in specific areas
It's disturbing to have unanswered questions close at hand
Fall 1999, course development
Faculty: Paul Penfield, Seth Lloyd, Sherra Kerns
Spring 2000, pilot offering
Limited to 50 freshmen
Of course we have a Web site -- everybody does
Spring 2001, second pilot offering
The course will be permanent starting in Spring 2002
We will help other universities start similar courses
We will advocate it as a science exposure in the liberal arts