Entropy of a gas.
Entropy of a message. Information?
(Molecules unnecessary).
Hot (Th)
Cold (Tc)
Isolated system. Has some structure (ordered). Heat, Q, is extracted from hot; the same amount is absorbed by cold: energy conserved, 1st Law. Entropy of hot decreases by Q/Th; entropy of cold increases by Q/Tc > Q/Th, 2d Law. In the fullness of time: lukewarm. No structure (no order).
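The entropy bookkeeping above can be checked numerically; a minimal Python sketch (not part of the lecture; the values of Q, Th, Tc are illustrative assumptions):

```python
# Heat Q flows from a hot reservoir at Th to a cold one at Tc.
Q = 100.0    # joules extracted from hot, absorbed by cold (1st Law; assumed value)
Th = 400.0   # temperature of hot reservoir, K (assumed value)
Tc = 300.0   # temperature of cold reservoir, K (assumed value)

dS_hot = -Q / Th            # hot loses entropy Q/Th
dS_cold = Q / Tc            # cold gains entropy Q/Tc > Q/Th
dS_total = dS_hot + dS_cold
print(dS_total)             # positive: total entropy increases (2d Law)
```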
The 2d Law of Thermodynamics does not forbid the emergence of local complexity (e.g., life, brain, …). The 2d Law of Thermodynamics does not require the emergence of local complexity (e.g., life, brain, …).
Boltzmann (1872)
Entropy of a dilute gas. N molecules obeying Newtonian physics (time reversible). State of each molecule given by its position and momentum. Molecules may collide, i.e., transfer energy and momentum among each other.
Represent the system in a space whose coordinates are positions and momenta p = mv (phase space).
[Figure: phase space, axes momentum vs. position, divided into bins; the pk's = fraction of molecules in each bin.]
Given the pk's, how much information do you need to locate a molecule in phase space?
All in 1 bin: highly structured, highly ordered. No missing information, no uncertainty.
Uniformly distributed: unstructured, disordered, random. Maximum uncertainty, maximum missing information.
In-between case: intermediate amount of missing information (uncertainty). Any flattening of the histogram (the phase-space landscape) increases uncertainty.
Boltzmann: the amount of uncertainty, or missing information, or randomness, of the distribution of the pk's can be measured by
HB = Σ_k pk log(pk)
pk histogram revisited.
All in 1 bin: highly structured, highly ordered; HB = 0. Maximum HB.
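HB can be evaluated for the three histogram cases above; a minimal Python sketch (not part of the lecture; the bin probabilities are illustrative assumptions):

```python
import math

def H_B(p):
    """Boltzmann's HB = sum of pk*log(pk) (natural log); empty bins contribute 0."""
    return sum(pk * math.log(pk) for pk in p if pk > 0)

all_in_one_bin = [1.0, 0.0, 0.0, 0.0]    # highly ordered
in_between     = [0.5, 0.25, 0.15, 0.10] # intermediate
uniform        = [0.25, 0.25, 0.25, 0.25]# completely disordered

print(H_B(all_in_one_bin))  # 0.0, the maximum of HB
print(H_B(in_between))      # intermediate (negative)
print(H_B(uniform))         # -log(4), the minimum for 4 bins
```

Flattening the histogram drives HB down, so S = -kB HB goes up.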
Assume:
Show: AHA!
S = -kB HB
(kB = Boltzmann's constant)
Two paradoxes
1. Reversal (Loschmidt, Zermelo). Irreversible phenomena (2d Law, arrow of time) emerge from reversible molecular dynamics. (How can this be? cf. Tony Rothman.)
2. Recurrence (Poincaré). Sooner or later, you are back where you started. (So, what does approach to equilibrium mean?)
Well …
1. Interpret the H theorem probabilistically. Boltzmann's treatment of collisions is really probabilistic: molecular chaos, coarse-graining, indeterminacy anticipating quantum mechanics? Entropy is the probability of a macrostate; is it something that emerges in the transition from the micro to the macro?
2. The Poincaré recurrence time is really very, very long for real systems: longer than the age of the universe, even.
On to Shannon …
pk = fraction of the time that symbol k occurs (~ probability that symbol k occurs).
missing information: how much information is needed to establish what the symbol is, or
uncertainty about what the symbol is, or
how many yes-no questions need to be asked to establish what the symbol is. Shannon's answer:
HS = -k Σ_k pk log(pk)
A positive number
Given the p's, you need to ask as many as 2 questions (2 pieces of missing information); HS = log(3) ≈ 1.1.
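A quick numerical check of Shannon's formula for three equally likely symbols; a minimal Python sketch (not part of the lecture):

```python
import math

def H_S(p, base=math.e):
    """Shannon entropy -sum pk*log(pk), with k = 1; natural log by default."""
    return -sum(pk * math.log(pk, base) for pk in p if pk > 0)

p = [1/3, 1/3, 1/3]      # three equally likely symbols
print(H_S(p))            # log(3) ~ 1.1 nats
print(H_S(p, base=2))    # log2(3) ~ 1.58 bits: up to 2 yes-no questions
```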
Two comments:
1. It looks like a duck but does it quack?
On H theorems:
Q: What did Boltzmann have that Shannon didn't?
A: Newton (or equivalent dynamical rules for the evolution of the pk's).
Does Shannon have rules for how the pk's evolve?
In a communications system, the pk's may change because of transmission errors. In genetics, is it mutation? Is the result always a flattening of the pk landscape, or an increase in missing information?
Is Shannon's HS just a metaphor? What about Maxwell's demon?
On dynamical rules.
Is a neuron like a refrigerator?
The entropy of a refrigerator may increase, but it needs electricity. The entropy of the message passing through a neuron may increase, but it needs nutrients.
General Electric designs refrigerators. Who designs neurons?
[Figure: two messages, X and Y, built from the symbols A and B with the same pk's.]
If X and Y are separately scrambled still same pks, same missing information same entropy.
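The scrambling claim can be demonstrated directly; a minimal Python sketch (not part of the lecture; the message X is an illustrative assumption):

```python
import math
import random
from collections import Counter

def H_S(msg):
    """Shannon entropy (nats) of the symbol frequencies in msg."""
    n = len(msg)
    return -sum((c / n) * math.log(c / n) for c in Counter(msg).values())

X = "ABABBABAAB"      # illustrative message over symbols A, B
Y = list(X)
random.shuffle(Y)     # scramble the sequence
Y = "".join(Y)

# Same pk's, hence same entropy: the ordering carries no weight in H_S.
print(H_S(X), H_S(Y))
```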
The message is in the sequence? What do geneticists say? Information as entropy is not a very useful way to characterize the genetic code?
Does SB + SS mean anything? Does the sum never decrease? Can an increase in one make up for a decrease in the other?
The demon measures the velocity of a molecule by bouncing light on it and absorbing the reflected light; the process transfers energy to the demon;