Is entropy quantized?
This quote sets the tone:

"There are many, many statements in the literature which say that information is the same as entropy. The reason for this was told by Myron Tribus. The story goes that Shannon didn't know what to call his measure, so he asked von Neumann, who said, 'You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage.'" (Tribus, 1971)

Information Is Not Uncertainty (Page on virginia.edu)
Written Nov 13, 2015. Answer requested by Ralph Maalouly.
To visualize entropy, albeit in a vague and not-really-accurate manner, let us consider the situation of a physics teacher, John, and his class of 35 students, shall we?
Let's say John wishes to teach the students the Boltzmann relation

S = k ln(p)

where k = 1.3806 × 10⁻²³ J/K is the Boltzmann constant and p is the thermodynamic probability, i.e., the number of possible microscopic states of the system.
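The Boltzmann relation is easy to evaluate numerically. A minimal sketch in Python, with my own illustrative assumption that the 35 students can be arranged in 35! distinct ways, each arrangement counting as one "microstate":

```python
import math

# Boltzmann constant (J/K)
k = 1.3806e-23

def boltzmann_entropy(w):
    """Entropy S = k ln(W) for a system with W equally likely microstates."""
    return k * math.log(w)

# Illustrative assumption: 35 students in 35 distinct seats
# give 35! arrangements, each treated as one microstate.
W = math.factorial(35)
S = boltzmann_entropy(W)
print(S)  # tiny in J/K, since k is of order 1e-23
```

Note that a single microstate (W = 1) gives S = 0, and S grows only logarithmically with W, which is why macroscopic entropies stay modest even though microstate counts are astronomically large.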
Therefore, from a microscopic point of view, the entropy of a system increases whenever the molecular randomness or uncertainty (i.e., molecular probability) of a system increases. Thus, entropy is a measure of molecular disorder, and the molecular disorder of an isolated system increases any time it undergoes an irreversible process.
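The claim that entropy rises when molecular probability rises can be made concrete with a textbook case: in the free expansion of a gas into double the volume, each of N molecules gains access to twice the space, so the microstate count multiplies by 2^N and ΔS = k ln(2^N) = N k ln 2. A short sketch (the choice of one mole, N = Avogadro's number, is my own example):

```python
import math

k = 1.3806e-23  # Boltzmann constant, J/K

def entropy_change_doubling(n_molecules):
    """ΔS = k ln(2^N) = N k ln 2 when each of N molecules
    doubles its accessible volume (free expansion)."""
    return n_molecules * k * math.log(2)

# One mole of gas (Avogadro's number of molecules):
N_A = 6.022e23
print(entropy_change_doubling(N_A))  # ≈ 5.76 J/K, i.e. R ln 2
```

The result, about 5.76 J/K per mole, is exactly R ln 2, and it is positive: the expansion is irreversible, and the entropy of the isolated gas has gone up.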
Read this great article to understand what exactly irreversibility means and how we stumbled upon something called entropy. It is quite a story:
Reversibility and Entropy
It is appropriate to say at this point that entropy is essentially a measure of the stability of a system of particles. By stability I mean that when a system of particles already has so much randomness to it, it cannot easily become more