
Boltzmann, Shannon, and (Missing) Information

Second Law of Thermodynamics.

Entropy of a gas.
Entropy of a message. Information?

B.B. (Before Boltzmann): Carnot, Kelvin, Clausius (19th century).


Second Law of Thermodynamics: The entropy of an isolated system never decreases. Entropy defined in terms of heat exchange:
Change in entropy = (heat absorbed)/(absolute temperature), i.e., ΔS = Q/T (positive if heat is absorbed, negative if emitted).

(Molecules unnecessary).

[Diagram: isolated system containing a hot body (Th) and a cold body (Tc).]

Isolated system, with some structure (ordered). Heat Q is extracted from the hot body and the same amount is absorbed by the cold body → energy is conserved (1st Law). Entropy of the hot body decreases by Q/Th; entropy of the cold body increases by Q/Tc > Q/Th → total entropy increases (2nd Law). In the fullness of time: lukewarm, no structure (no order).
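Spelling out the slide's bookkeeping as a single inequality (Q, Th, Tc as above; ΔS_total is just shorthand for the sum of the two entropy changes):

$$
\Delta S_{\mathrm{total}} = -\frac{Q}{T_h} + \frac{Q}{T_c} = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0 \quad \text{since } T_c < T_h .
$$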

Paul's entropy picture


Sun releases heat Q at high temperature → its entropy decreases.

Stuff releases heat q and gets more organized → its entropy decreases.

Surroundings absorb q and get more disorganized → their entropy increases.

Living stuff absorbs heat Q at a lower temperature → a larger entropy increase.

Overall, entropy increases.

The 2nd Law of Thermodynamics does not forbid the emergence of local complexity (e.g., life, brains, …). The 2nd Law of Thermodynamics does not require the emergence of local complexity (e.g., life, brains, …).

Boltzmann (1872)
Entropy of a dilute gas: N molecules obeying Newtonian physics (time-reversible). The state of each molecule is given by its position and momentum. Molecules may collide, i.e., transfer energy and momentum to each other.

Represent the system in a space whose coordinates are positions and momenta, p = mv (phase space).

[Diagram: phase space, with position on one axis and momentum on the other.]

Subdivide space into B bins.


pk = fraction of particles whose positions and momenta are in bin k.

Build a histogram of the pks.

The pks change because of: motion, collisions, external forces.
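As an illustration not taken from the slides, here is a minimal sketch of the binning step for a hypothetical 1-D gas (so phase space is just position × momentum); NumPy's histogram2d stands in for the B bins, and the particle numbers and bin count are arbitrary choices.

```python
import numpy as np

# Hypothetical 1-D gas: N molecules with random positions and momenta.
rng = np.random.default_rng(0)
N = 100_000
x = rng.uniform(0.0, 1.0, N)   # positions in a box of length 1
p = rng.normal(0.0, 1.0, N)    # momenta, p = m*v

# Subdivide phase space into 20 x 20 bins and count occupancy.
counts, _, _ = np.histogram2d(x, p, bins=20)
pk = counts.ravel() / N        # pk = fraction of molecules in bin k

print(pk.sum())                # 1.0: the fractions sum to one
```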

Given the pks, how much information do you need to locate a molecule in phase space?
All in 1 bin → highly structured, highly ordered → no missing information, no uncertainty.

Uniformly distributed → unstructured, disordered, random → maximum uncertainty, maximum missing information.

In-between case → intermediate amount of missing information (uncertainty). Any flattening of the histogram (the phase-space landscape) increases the uncertainty.

Boltzmann: the amount of uncertainty, or missing information, or randomness, of the distribution of the pks can be measured by

HB = Σk pk log(pk)

pk histogram revisited.
All in 1 bin → highly structured, highly ordered → HB = 0. Maximum HB.

Uniformly distributed → unstructured, disordered, random → HB = -log B. Minimum HB.

In-between case → intermediate amount of missing information (uncertainty) → in-between value of HB.
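A quick numerical check of the three cases, using the slide's convention HB = Σk pk log(pk) with natural logarithms; the bin count B = 8 and the "in-between" distribution are arbitrary choices for illustration.

```python
import numpy as np

def H_B(pk):
    """Boltzmann's H: sum of pk*log(pk), with 0*log(0) taken as 0."""
    pk = np.asarray(pk, dtype=float)
    nz = pk[pk > 0]
    return float(np.sum(nz * np.log(nz)))

B = 8                                       # arbitrary number of bins
all_in_one = np.eye(1, B)[0]                # [1, 0, 0, ..., 0]
uniform = np.full(B, 1.0 / B)
in_between = np.array([0.4, 0.3, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])

print(H_B(all_in_one))   #  0.0              -> maximum HB
print(H_B(uniform))      # -2.079 = -log(8)  -> minimum HB
print(H_B(in_between))   # about -1.54       -> in between
```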

Boltzmann's Famous H Theorem


Define: HB = Σk pk log(pk)

Assume: molecules obey Newton's laws of motion.

Show: HB never increases, i.e., -HB never decreases. AHA! It behaves like entropy!! Identify entropy with HB:

If it looks like a duck…

S = -kB HB     (kB = Boltzmann's constant)

New version of Second Law:


The phase-space landscape either does not change or it becomes flatter. (Life?)

It may peak locally provided it flattens overall.

Two paradoxes
1. Reversal (Loschmidt, Zermelo). Irreversible phenomena (the 2nd Law, the arrow of time) emerge from reversible molecular dynamics. (How can this be? Cf. Tony Rothman.)

2. Recurrence (Poincaré). Sooner or later, you are back where you started. (So what does "approach to equilibrium" mean?)

Graphic from: J. P. Crutchfield et al., "Chaos," Scientific American, Dec. 1986.

Well…
1. Interpret the H theorem probabilistically. Boltzmann's treatment of collisions is really probabilistic: molecular chaos, coarse-graining, indeterminacy, anticipating quantum mechanics? Entropy is the probability of a macrostate: is it something that emerges in the transition from the micro to the macro?
2. The Poincaré recurrence time is really very, very long for real systems: longer than the age of the universe, even.

Anyhow, entropy does not decrease!

…on to Shannon.

A.B. (After Boltzmann): Shannon (1949). Entropy of a message.


Message encoded in an alphabet of B symbols, e.g.,
English sentence (26 letters + space + punctuation)
Morse code (dot, dash, space)
DNA (A, T, G, C)

pk = fraction of the time that symbol k occurs (~ probability that symbol k occurs).

Pick a symbol, any symbol…


Shannon's problem: we want a quantity that measures

the missing information: how much information is needed to establish what the symbol is, or
the uncertainty about what the symbol is, or
how many yes-no questions need to be asked to establish what the symbol is.

Shannon's answer:

HS = -k Σk pk log(pk)     (k: a positive number)

Morse code example (3 symbols: dot, dash, space):

All dots: p1 = 1, p2 = p3 = 0. Take any symbol: it's a dot; no uncertainty, no question needed, no missing information. HS = 0.

50-50 chance that it's a dot or a dash: p1 = p2 = 1/2, p3 = 0. Given the ps, we need to ask one question (what question?): one piece of missing information. HS = log(2) ≈ 0.69.

Random: all symbols equally likely, p1 = p2 = p3 = 1/3. Given the ps, we need to ask as many as 2 questions: 2 pieces of missing information. HS = log(3) ≈ 1.1.
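A quick numerical check of the three Morse cases (natural logarithms and k = 1, matching the slide's 0, 0.69, and 1.1; base-2 logarithms would count the yes-no questions directly):

```python
import numpy as np

def H_S(pk, base=np.e):
    """Shannon entropy -sum pk*log(pk), ignoring zero-probability symbols."""
    pk = np.asarray(pk, dtype=float)
    nz = pk[pk > 0]
    return float(-np.sum(nz * np.log(nz)) / np.log(base))

print(H_S([1.0, 0.0, 0.0]))           # 0.0   -> no missing information
print(H_S([0.5, 0.5, 0.0]))           # 0.693 =  log(2)
print(H_S([1/3, 1/3, 1/3]))           # 1.099 =  log(3)
print(H_S([0.5, 0.5, 0.0], base=2))   # 1.0   -> exactly one yes-no question
```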

Two comments:
1. It looks like a duck, but does it quack?

There's no H theorem for Shannon's HS.


2. H is insensitive to meaning.
Shannon: "[The] semantic aspects of communication are irrelevant to the engineering problem."

On H theorems:
Q: What did Boltzmann have that Shannon didn't?

A: Newton (or equivalent dynamical rules for the evolution of the pks).
Does Shannon have rules for how the pks evolve?

In a communications system, the pks may change because of transmission errors. In genetics, is it mutation? Is the result always a flattening of the pk landscape, or, equivalently, an increase in missing information?
Is Shannon's HS just a metaphor? What about Maxwell's demon?
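One toy model of the transmission-error case (an illustrative assumption, not anything from Shannon or the slides): symmetric errors that replace a transmitted symbol by a uniformly random one with probability eps. Under this rule the pk landscape flattens and HS never decreases.

```python
import numpy as np

def H_S(pk):
    pk = np.asarray(pk, dtype=float)
    nz = pk[pk > 0]
    return float(-np.sum(nz * np.log(nz)))

def noisy_step(pk, eps=0.05):
    """With probability eps a transmitted symbol is replaced by a random one."""
    B = len(pk)
    return (1.0 - eps) * np.asarray(pk, dtype=float) + eps / B

pk = np.array([0.9, 0.08, 0.02])          # an initially peaked symbol distribution
for step in range(5):
    print(step, pk.round(3), round(H_S(pk), 3))
    pk = noisy_step(pk)
# HS creeps up toward log(3) as the pk landscape flattens.
```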

On dynamical rules.
Is a neuron like a refrigerator?

Entropy of fridge decreases.

Entropy of signal decreases.

The entropy of a refrigerator may decrease, but it needs electricity. The entropy of the message passing through a neuron may decrease, but it needs nutrients.
General Electric designs refrigerators. Who designs neurons?

Insensitive to meaning: Morse revisited.

X = {.... . .-.. .-.. --- .-- --- .-. .-.. -..}   ("H E L L O W O R L D")

Y = {.- -... -.-. -.. . ..-. --. .... .. .--- -.-}   ("A B C D E F G H I J K")

Same pks, same entropies → same missing information.

If X and Y are separately scrambled: still the same pks, the same missing information, the same entropy.
The message is in the sequence? What do geneticists say? Is information-as-entropy not a very useful way to characterize the genetic code?
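A sketch of the scrambling point (the message string below is an arbitrary stand-in, not the slide's X or Y): any permutation of the sequence leaves the symbol frequencies, and hence HS, exactly unchanged, while the meaning disappears.

```python
import random
from collections import Counter
from math import log

def H_S(message):
    """Shannon entropy of the symbol frequencies in a message."""
    counts = Counter(message)
    n = len(message)
    # sorted() so that equal multisets of counts give bit-identical sums
    return -sum(c / n * log(c / n) for c in sorted(counts.values()))

msg = ".... . .-.. .-.. ---"              # any dot/dash/space string will do
shuffled = list(msg)
random.shuffle(shuffled)
shuffled = "".join(shuffled)

print(H_S(msg) == H_S(shuffled))          # True: same pks, same entropy
print(msg, "->", shuffled)                # ...but the meaning is gone
```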

Do Boltzmann and Shannon mix?


Boltzmann's entropy of a gas: SB = -kB Σk pk log pk.

kB relates temperature to energy (E = kB T) and relates the temperature of a gas to PV (PV = N kB T).

Shannon's entropy of a message: SS = -k Σk pk log pk.

k is some positive constant; there is no reason for it to be kB.

Does SB + SS mean anything? Does the sum never decrease? Can an increase in one make up for a decrease in the other?

Maxwell's demon, yet once more.

The demon measures the velocity of a molecule by bouncing light off it and absorbing the reflected light; the process transfers energy to the demon and increases the demon's entropy → this makes up for the entropy decrease of the gas.
