Second Law of Thermodynamics

Heat Transfer

Entropy

What is the easiest definition of "entropy"?

8 ANSWERS

Henry K.O. Norman, Retired Programmer, Studying Hard to Learn Physics
361 Views · Most Viewed Writer in Entropy (physics) with 30+ answers

Entropy comes in a bewildering number of flavors, interpretations, and misconceptions (from energy flow, to disorder, to information content), but I will assume that you refer to Thermodynamic Entropy, and mention some of the other entropy definitions only in passing.
Entropy S is a consequence of the Second Law of Thermodynamics, one form of which states that in a closed or fully isolated system, unconstrained energy spontaneously tends to disperse, to spread out, from more concentrated forms to less concentrated forms. That is, energy spontaneously flows from hot to cold, and never the other way around.

Entropy is not a material substance: S is a number, a measure of energy dispersal, expressed as energy over temperature, in SI units of joule per kelvin (J/K), or in base units kg m^2 s^-2 K^-1 (when measured as the entropy content of some substance, as S m^-1, where m is either kg or mol). Simply put, entropy is a measure of the amount of unusable energy within a closed or isolated system (for example, our Universe).
To me, that gives us the easiest definition: Entropy is a measure, in units of joule per kelvin, of how energy in a closed or isolated system spontaneously disperses over time, and total entropy in the Universe always increases. Locally and temporarily decreasing entropy (as for example in Photosynthesis and Cellular Respiration) is always more than adequately compensated for by an increase of entropy elsewhere.
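To make that concrete with numbers (a sketch of my own, with illustrative values that are not in the answer): when heat leaves a hot body and enters a cold one, the cold body gains more entropy than the hot body loses, so the total goes up.

```python
# Entropy bookkeeping for 1000 J of heat flowing from a 500 K block to a
# 300 K block (illustrative values; both blocks assumed large enough that
# their temperatures stay effectively constant).
Q = 1000.0                      # heat transferred, J
T_hot, T_cold = 500.0, 300.0    # kelvin

dS_hot = -Q / T_hot             # hot block loses entropy:  -2.00 J/K
dS_cold = +Q / T_cold           # cold block gains entropy: +3.33 J/K
print(f"total dS = {dS_hot + dS_cold:+.2f} J/K")   # +1.33 J/K > 0
```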
You may have read that entropy is the measure of a system's disorder (or randomness). That is a stubborn misconception which is flat out wrong. For example: whether a deck of playing cards is perfectly ordered (the four suits, in succession, ordered from Ace (1) to King (13)) or in some random state of disorder (thoroughly shuffled) has nothing to do with entropy: these examples are simply two arbitrarily chosen orders out of the 52! (factorial 52 = 1 × 2 × 3 × ... × 52 ≈ 8 × 10^67) equally unique orders that a deck of cards can have.
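As a quick check of that number (my own one-liner, not part of the answer), Python's math.factorial reproduces the 8 × 10^67 figure:

```python
import math

n_orders = math.factorial(52)   # number of distinct orderings of a 52-card deck
print(f"{n_orders:.1e}")        # about 8.1e+67
```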

Again: Entropy is Not Disorder. This fact is very well argued by professor Frank Lambert, in his article Entropy Is Not Disorder, as well as in many other articles on Lambert's Entropysite, and by professor Harvey Leff in Removing the Mystery of Entropy and Thermodynamics (parts I to V, 2010), and many other articles on Leff's Entropysite.

Cartoon courtesy sciphilos.info


Deviation from the original thermodynamic definition of entropy, as a measure of energy dispersal, to a measure of information content appears to have begun with statistical mechanics, which states that entropy is the amount of additional information needed to specify the exact physical state of a system (additional, as in unknown).
Brief entropy evolution time-line (for details,
see linked articles):
1854: Thermodynamic Entropy (Thermodynamics (R. Clausius)): The change in a system's entropy is ΔS = ∫(dQ/T) ≥ 0, where dQ is a measure of energy (heat) transferred into a system at absolute temperature T. Absolute entropy S came later.
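A small worked example of that integral (my own sketch, not from the answer), for slowly heating 1 kg of water, where dQ = m c dT and so ΔS = m c ln(T2/T1):

```python
import math

m = 1.0                    # kg of water (illustrative)
c = 4186.0                 # specific heat of water, J/(kg K), approximate
T1, T2 = 293.15, 353.15    # heat from 20 C to 80 C (kelvin)

dS = m * c * math.log(T2 / T1)   # integral of dQ/T with dQ = m*c*dT
print(f"dS = {dS:.0f} J/K")      # roughly 780 J/K
```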
1875: Classical Entropy (Statistical Mechanics (L. Boltzmann)): For an ideal gas, with W the number of microstates corresponding to a specific macrostate, the entropy S = k log W, where k is Boltzmann's Constant (relating average individual particle energy to temperature). Also stated as S = kB log Ω (where Ω ≡ W).
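A toy illustration of S = k log W (mine, with made-up numbers): take 100 coins and the macrostate "exactly 50 heads"; W is then a binomial coefficient.

```python
import math

k_B = 1.380649e-23                       # Boltzmann constant, J/K
W = math.comb(100, 50)                   # microstates of the "50 heads" macrostate
S = k_B * math.log(W)                    # natural log, as in S = k ln W
print(f"W = {W:.2e}, S = {S:.2e} J/K")   # W ~ 1e29, S ~ 9e-22 J/K
```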
1878: Gibbs Entropy (Statistical Thermodynamics (J.W. Gibbs)): In a system (a set of microstates) where Ei is the energy of microstate i, and pi is the probability of that energy fluctuation, the entropy S = -kB Σ(i) (pi log pi).
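A direct transcription of the Gibbs formula into code (my own sketch): feed it the probabilities pi of the microstates and it returns S.

```python
import math

k_B = 1.380649e-23   # J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln p_i) over microstate probabilities p_i."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))   # uniform: k_B * ln 4
print(gibbs_entropy([0.97, 0.01, 0.01, 0.01]))   # sharply peaked: much smaller
```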
1930: von Neumann Entropy (Quantum Statistical Mechanics (J. von Neumann)): For a QM system described by a density matrix ρ, von Neumann's definition of entropy is S = -tr(ρ log ρ), where tr is the trace and log is the (natural) matrix logarithm. With ρ written in terms of its eigenvectors (with eigenvalues ηj), S = -Σ(j) (ηj log ηj). Confusingly, JvN entropy is also defined as the amount of randomness in the state S, as H(S) = -tr(S log S)... I do not claim that I understand this stuff...
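In practice the trace formula is evaluated through the eigenvalues of ρ, which is easy to show numerically (a sketch of mine using NumPy; the states are illustrative):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -tr(rho log rho), computed from the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)      # rho is Hermitian
    w = w[w > 1e-12]                 # discard numerical zeros
    return float(-np.sum(w * np.log(w)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed qubit: S = ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```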
1948: Information Entropy (Information Theory (C. Shannon)): In a message, as modeled by any flow of information, entropy is the average expected value of the information content of that message, stated as the average number of bits needed to store or communicate one symbol in the message. I.e., in a message X, the entropy H(X) = Σ(i) (P(xi) I(xi)) = -Σ(i) (P(xi) log P(xi)), where I is the information content of X. With the logarithm base = 2, the unit is the shannon (Sh, better known as the bit (binary digit)).
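The base-2 version is easy to compute directly (my own sketch): a fair coin carries exactly 1 bit per symbol, a biased one less.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(P(x) * log2 P(x)), in bits (shannons) per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bit
```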
Except for the similarity of the often recurring general formula X = Y log Z, this has nothing in common with thermodynamic entropy. Note also that bits are not in themselves information: bit strings (binary values) have to be mapped to the actual information they happen to represent. In itself, for example, the binary value 01101110 is meaningless unless its representation is known (among countless other possible mappings, the decimal value 110, and the ASCII character n).

1972: Black Hole Entropy (Black Hole Thermodynamics, J. Bekenstein): Black hole entropy SBH = A/(4 ℓP^2) = c^3 A/(4Gℏ), where A is the BH surface area (i.e., the area of the BH event horizon), ℓP is the Planck length, and c, G, and ℏ, respectively, are the speed of light, the Gravitational Constant, and the reduced Planck Constant. Note that this yields a dimensionless quantity (a pure number): to convert to thermodynamic entropy, some say that the result should be multiplied by kB (as in this cartoon):

Cartoon courtesy s3.amazonaws.com


Note also that there are at least ten (10) very different black hole entropy equations, with results varying over a hefty 39 orders of magnitude (!). Obviously an area of intense theoretical debate. For details, see Black Hole Properties.xlsx.
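For a sense of scale (a back-of-the-envelope sketch of my own, using the kB-converted form quoted above), here is the Bekenstein-Hawking entropy of a one-solar-mass black hole:

```python
import math

k_B, hbar = 1.380649e-23, 1.054571817e-34    # J/K, J s
G, c = 6.67430e-11, 2.99792458e8             # SI units
M = 1.989e30                                 # one solar mass, kg

r_s = 2 * G * M / c**2                       # Schwarzschild radius, ~2953 m
A = 4 * math.pi * r_s**2                     # horizon area, m^2
S_pure = c**3 * A / (4 * G * hbar)           # dimensionless, ~1e77
print(f"{S_pure:.1e} (pure number) = {k_B * S_pure:.1e} J/K")
```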

If the sources cited are somewhat correct, the


above should be a fair synopsis of the entropy
metamorphosis over the years.
For dessert, have a taste of Boltzmann's H-Theorem, and if you're a glutton for punishment, follow up with Generalized Entropies (F. Dupuis et al., 2013), Entropy in General Physical Theories (A.J. Short and S. Wehner, 2010), and Similarity Between Quantum Mechanics and Thermodynamics (S. Abe and S. Okuyama, 2010). All three are from Cornell's arXiv, and all are rather technical.
Confused yet? Worry not, you're not alone.
And thanks for asking!
Updated Jun 3 · Answer requested by Ralph Maalouly

RELATED QUESTIONS

What is the exact definition of entropy?

What are entropy and enthalpy?

Entropy: When did Entropy start?

What is entropy and why does it exist?

What does q(rev) mean in the definition of entropy?

How can we relate the three different definitions of entropy together?

Is a system that keeps its entropy low a good definition of a living organism?

Is there a generalized definition of entropy that covers its applications in both statistical thermodynamics and information theory? If so, what is it?

Is there any easy way to explain the definition of entropy? Is it some sort of diffusion of energy throughout a space?

How is Q defined in the physics definition of Entropy? Is it the total Kinetic Energy of the system at microscopic level?

Is entropy quantized?

What is free entropy?

What does "negative entropy" mean?

Why is entropy weird?

Why is there entropy?

OTHER ANSWERS

Tom McNamara, Sometimes I think maybe I know what entropy is
481 Views · Tom has 60+ answers in Physics

Entropy comes up in classical thermodynamics,


and it comes up again in statistical mechanics.
When both approaches work, they (should!) give
the same predictions. But they give two very
different points of view on the same predictions.
In statistical mechanics, entropy measures how
probable each outcome of an experiment is. I
gave it my best shot here:
Tom McNamara's answer to Why is it when I
drop a vase it smashes into a million pieces
however when I then drop the million pieces it
does not form a vase?

Footnote: Disorder vs Probability.

Traditionally, people often say entropy measures


"disorder." But disorder doesn't seem to have a
very precise definition (unless we use it to mean
the probability of a macro-state, which is back to
the interpretation I'm using above). So I've never
been able to really do much with "disorder." On
the other hand probability has a very clear
meaning.
Also ... well ... laypeople will talk about entropy.
And "the universe tends towards disorder"
seems not only ill-defined but weirdly
apocalyptic.
Joy Division - Disorder

On the other hand, "the universe tends towards


things that are likely to happen" seems betterdefined, easier to understand, and less dramatic.
Laypeople may say "duh" ... but I can accept that
Updated Nov 16, 2015

Mark Barton, PhD in Physics, The University of Queensland, physicist with National Astrono...
253 Views · Most Viewed Writer in Thermodynamics with 120+ answers

In classical thermodynamics, there's really only


one definition: entropy is a function of state that
never decreases (in total for the system and
environment), remains constant in reversible
processes, but increases in irreversible
processes. The "function of state" here looks
like unimportant fine print, but is actually doing a
lot of work. Everything else is just an attempt to
characterize in more memorable terms why such
an inscrutable thing is important, and for my
money none of them really does a good job. It's
not until you get to statistical mechanics that
you get something actually intuitive: entropy is
(Boltzmann's constant times the logarithm of)
the number of ways of rearranging the
microstate (the location of each particle and
quantum of energy) while still conforming to the
description of some particular macrostate (e.g.,
particular values of macroscopic quantities like
pressure, temperature, volume etc). So for
example, if you've got a bullet that's at absolute
zero but traveling at high speed, there's only one
microstate that is consistent with that, but if you
take the same energy and use it to have a hot
bullet with no bulk kinetic energy, then there's
lots of ways of doing that.
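One way to put rough numbers on that comparison (my own toy count, not Mark's; it borrows the standard Einstein-solid multiplicity W = C(q + N - 1, q) for q energy quanta shared among N oscillators):

```python
import math

q, N = 100, 50   # illustrative: 100 energy quanta, 50 internal vibrational modes

W_cold_fast_bullet = 1                         # all energy in the single bulk-motion mode
W_hot_still_bullet = math.comb(q + N - 1, q)   # ways to spread q quanta over N modes
print(W_cold_fast_bullet, f"{W_hot_still_bullet:.1e}")   # 1 vs ~7e39
```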

Written Nov 15, 2015 · Answer requested by Ralph Maalouly

John Bailey, My contribution to physics was showing how to correctly calculate the capacit...
111 Views · John has 390+ answers in Physics

Entropy is a measure of the degree of


organization or disorder of a system.
There are at least four contexts in which entropy
is used as a measure of that system
characteristic.
The contexts are:
thermodynamics
information theory
information physics
networks
In that order, examples are:
ice vs steam
my living room vs our storage basement
entanglement vs decoherence

quora vs twitter
This quote sets the tone:
There are many many statements in the
literature which say that information is the
same as entropy. The reason for this was told
by Myron Tribus . The story goes that
Shannon didn't know what to call his measure
so he asked von Neumann, who said `You
should call it entropy ... [since] ... no one
knows what entropy really is, so in a debate
you will always have the advantage'
(Tribus, 1971)
Information Is Not Uncertainty (Page on virginia.edu)
Written Nov 13, 2015 · Answer requested by Ralph Maalouly

Haridev Vaikundamoorthy, Mechanical Engineer
371 Views · Most Viewed Writer in Second Law of Thermodynamics

To visualize entropy, albeit in a vague and not-really-accurate manner, let us consider the situation of the physics teacher John and his class of 35 students, shall we?
Let's say John wishes to teach the students

what entropy means, in connection to


probability. Since it is also John's birthday, he
buys some chocolates to distribute to his
beloved students. But he intentionally buys only
25 chocolates, instead of 35. To teach the
students entropy, he throws chocolates in
random directions inside the class, and informs
the students that each one can keep only one
chocolate with them. His students are of course
not aware of his intent, so they eagerly catch
every chocolate that comes their way and
distribute it among their own friends-circles,
making sure to keep one for themselves. Once
the chocolates are all finished, the students
realize that some of them do not have
chocolates and complain to John about it,
thinking that a few students are keeping two or
three chocolates for themselves instead of just
one per student.
At this point, John reveals that he had
intentionally only bought 25 chocolates. The
students are furious because John was one of
their favorite teachers and they had not
expected this. They feel left-out and think John
intentionally threw chocolates to specific
students in the class. Then, John says, "IT
DOESN'T MATTER TO ME exactly which
students do not have the chocolates, for if I
had distributed it in any other way, there would
still have been 10 students without the
chocolate."

Let us think about this sentence for a while. To


John, all his students are equal. It really doesn't
matter to him which 10 students did not get the
chocolates. So if he had distributed the
chocolates in any other way, the situation would
have been the same for him. In any scenario, 10
students would have been left out.
But to the students, it does matter which 10 did
not get the chocolates. Then, can we ask
ourselves what are the possible combinations of
the 10 students who do not get the chocolate
out of the 35 students? That is exactly what
entropy talks about. There are two terms in
relation to entropy that we must understand at
this point - The macrostate and the microstate.
The macrostate is that 10 students do not have
the chocolate at the end of the exercise. The
microstate corresponds to each of the
combinations that are possible for forming the
particular group of 10 students to reach the
same macrostate. So, as you can figure out, there are several microstates which
correspond to the same macrostate. And to an
observer (John), each of the microstates is
exactly identical to another one.
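If you want to count those microstates explicitly (a one-liner of my own, not part of the answer): the macrostate "10 of the 35 students end up without chocolate" can be realized by any choice of 10 students out of 35.

```python
import math

microstates = math.comb(35, 10)   # ways to pick the 10 chocolate-less students
print(microstates)                # 183579396
```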
Now, let us forget what we talked about so far and delve into the technicalities, shall we?
TL;DR:

First, let us describe microstates and


macrostates as applied to thermodynamic
systems:
Classical thermodynamics describes systems in
terms of a few variables (functions of state) :
temperature, pressure, volume... But such a
system is really made of atoms, so a much richer
description must be possible in principle: we
could specify the quantum state of all the
atoms--the microstate .
If we see the atoms as being equivalent to the
students in the example above, then describing
which atom is at what quantum condition
becomes a microstate, just as we described
which specific student has a chocolate and
arrived at a microstate. But mind you, an atom
can have several quantum states which is unlike
our example where we defined a microstate with
just two quantum states - a student either has a
chocolate or doesn't.
Each microstate Mi of the system is a set of positions qi and velocities vi, for i = 1, 2, ..., n, which describe the position and velocity of each particle comprising the system.
Of course, as the atoms interact, this state changes very rapidly - perhaps 10^35 times a second. But the observed macrostate of the system doesn't change. We still measure the same pressure, temperature, etc. ----> we still have only 25 chocolates no matter how they are distributed.
Many different microstates all correspond to
the same macrostate.
As you can imagine, for large N (say, N=10^23),
this gets out of hand.
The probability that the system is in any particular microstate Mi is quite low, as there are many, many different microstates the system could occupy.
This is what we refer to as randomness. So,
randomness isn't really randomness, it is just a
term used to refer to the total number of
microstates. If randomness in a system is too
high, it means that the number of microstates of
that system is a huge number corresponding to
a particular macrostate.
This suggests we can derive thermodynamics
directly from the quantum behaviour of atoms
and molecules. Or, in other words, we can
calculate the macroscopic behaviour of the
system by averaging over the corresponding
microstates.
Now, Entropy:

The entropy of a system is related to the total number of possible microstates of the system, called the thermodynamic probability, p, by the Boltzmann relation:

S = k ln(p)

where k = 1.3806 × 10^-23 J/K.
Therefore, from a microscopic point of view, the
entropy of a system increases whenever the
molecular randomness or uncertainty (i.e.,
molecular probability) of a system increases.
Thus, entropy is a measure of molecular
disorder, and the molecular disorder of an
isolated system increases any time it undergoes
a process.
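To tie this back to the classroom example (my own sketch, not part of the answer): treat the C(35, 10) arrangements counted earlier as the thermodynamic probability p.

```python
import math

k = 1.380649e-23                      # Boltzmann constant, J/K
p = math.comb(35, 10)                 # 183,579,396 microstates
S = k * math.log(p)
print(f"S = k ln(p) = {S:.2e} J/K")   # ~2.6e-22 J/K; tiny, because 35 "particles" is tiny
```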
Read this great article for understanding what
exactly irreversibility means and how we
stumbled upon something called entropy. It is
quite a story:
Reversibility and Entropy
It is appropriate to say at this point that entropy
is essentially a measure of stability of a system
of particles. By stability I mean that when a
system of particles already has so much
randomness to it, it cannot easily become more

random. Hence a system that has high entropy is


essentially a stable system - it can occupy so
many microstates and still be in the same
macrostate.
At the same time, a system which has only one
microstate is also a stable system. Hence, we
have two types of stability: zero entropy and
infinite entropy, which correspond to the values
p = 1 and p = ∞ in the entropy relation which
we had discussed above.
This is what physicists mean when they say that the universe started from the big bang (before which the universe was stable, as it had only one microstate), and is now accelerating towards another state of stability with infinite entropy. This is an inference of the law that says the entropy of the universe always increases.
In a sense, we can say that before the big bang there existed an absolute unstable equilibrium, and in the future there will exist an absolute stable equilibrium.
Updated Mar 27
