Entropy and Chance
Thread - Entropy and Chance...
On Tue, 4 Feb 2003, Zero Sum replied to a posting
by Kevin Phyland:
> [1] Entropy is
the tendency for systems to go from "order" to
> "disorder"?
Yes.
> [2] The total
entropy in the Universe is increasing?
Yes.
> [3] Entropy is
however a statistical process?
Yes.
Also, macroscale and microscale differences come into consideration.
You are looking at the macroscale.
> [4] So...entropy
could actually increase in some parts of the
> Universe provided there
was a compensating extra decrease
> somewhere else?
Within a 'system', yes. Within
a 'locality', not unless it is part of the system being observed. You grow,
you live, "defying" entropy at the cost of the energy and materials you consume.
> [5] If entropy
actually decreased in some part of the Universe
> rather bizarre (i.e. highly unlikely
> statistically/probability-wise)
events might occur?
Umm...no. The measure of
the entropy would be the occurrence of those unlikely events. There
is no such abstract thing as entropy; it is not something that exists.
It is a word used to describe a rate of change in
information content.
For example, a magnetic tape has a higher entropy than a CD because the
information stored on it decays (cannot be read) faster (statistically).
An old magnetic tape will be in a more advanced state of entropy than a newly
recorded one.
> Where am I going
here?
Dunno. Entropy is not a physical
thing; it applies to information, or rather to the organisation of information.
It isn't even 'smooth'. Take a look at, say, a library in which the
books have a half-life and become unreadable after a semi-random interval.
If you graph the level of entropy of the library you will see it start low
with the library functioning well, and then as the books start to decay we
find it getting exponentially more useless. So its entropy follows
an exponential curve. But wait! Suppose one of those books was
the library index. That would put a large blip in the 'usability'
curve (which we are using to measure the entropy).
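{Aside: a minimal Python sketch of the library picture above. The half-life, library size and "usability" rule are all invented for illustration; the index is modelled crudely as one special book whose loss halves usability.}

import random

HALF_LIFE = 10.0   # years per half-life of a book; invented figure
N_BOOKS = 1000     # book 0 plays the role of the library index
YEARS = 50

random.seed(1)
# Chance a given book survives one more year, derived from the half-life.
p_survive = 0.5 ** (1.0 / HALF_LIFE)

alive = [True] * N_BOOKS
for year in range(YEARS + 1):
    readable = sum(alive)
    usability = readable / N_BOOKS
    if not alive[0]:
        usability *= 0.5   # losing the index puts a large blip in the curve
    if year % 10 == 0:
        print(f"year {year:2d}: {readable:4d} readable, usability {usability:.2f}")
    # Decay step: each surviving book independently survives with p_survive.
    for i in range(N_BOOKS):
        if alive[i] and random.random() > p_survive:
            alive[i] = False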
Hummm... I don't know if I have explained anything, maybe someone can
add to that or correct it where needed.
Paul
Williams added:
>
Where am I going here?
>
> Well...is
it possible that if you were in a region of the
> Universe
where temporarily entropy decreased, you could get
> things
like 100 rolls of a die showing six, or dropping a
> broken
biscuit and it gets back together?
Even a million
rolls of 6 - yes. A now-unbroken broken biscuit, certainly.
My understanding
(with the usual disclaimers) is: in a room filled with a diffuse gas, there is a finite
probability that all the atoms of this gas will end up in one tiny corner
of the room. (Not to forget the old thousand monkeys
typing on a thousand typewriters putting out the works of Shakespeare.)
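{Aside: the corner-of-the-room probability is easy to sketch. The fraction f of the room's volume is an invented figure; if each of N atoms independently has probability f of being in the corner, the chance that all are there at once is f^N.}

import math

f = 0.001   # corner occupies 0.1% of the room's volume; invented figure
for n in (10, 100, 6.02e23):   # a few atoms, then a couple of grams' worth
    log10_p = n * math.log10(f)   # log10 of f**n, which underflows if computed directly
    print(f"N = {n:.3g}: probability ~ 10^{log10_p:.4g}")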
The thing here
is having infinite time.
It appears unlikely
that there will be an infinite amount of time left to do work in the universe
though.
Shame really...
Never mind -
it's a long ways off.
Keep dropping
that biscuit - it will undrop one day - maybe :-)
Philosophy only:
Our universe's
*Time* obviously did not exist before our universe came into being.
There is a thought
that our universe had to come into being for whatever unlikely reason as there
was no time boundary constraining even an almost inestimably low probability.
*Eternity* existed
beforehand. Anything could happen and did.
The universe
appeared because it had to - there are no zero probabilities.
To which Zero Sum replied:
> Philosophy only:
> Our universe's *Time* obviously did not exist before our universe came
> into being.
Why is this obvious? And
what predisposes you to think it is true?
> There is a thought that our universe _had to_ come into being for
> whatever unlikely reason as there was no time boundary constraining even
> an almost inestimably low probability.
> *Eternity* existed beforehand. Anything could happen and did.
> The universe appeared because it had to - there are no zero
> probabilities.
That all follows given the first
point. But is the first point valid? (Devil's Advocate)
Paul
Williams replied:
> > Philosophy only:
> >
Our universe's *Time* obviously did not exist before our universe came
> >
into being.
> Why is
this obvious? And what predisposes you to think it is true?
General
Relativity says that there was no time before the big bang. I readily admit that
I'm not up to the maths, but I'm led to believe that most theories contain
in essence the coming into being of time tied with the appearance of matter.
Big bang cosmology
has slowly become a facet of inflationary cosmology.
"
The nothingness 'before' the creation of the universe is the most complete
void that we
can imagine -- no space, time or matter existed. It is a world without place, without
duration or eternity, without number...yet this unthinkable void converts
itself into the plenum of existence -- a necessary consequence of physical laws. Where
are these laws written into the void? It would seem that even the void is subject
to law, a logic that existed prior to time and space."
- Heinz Pagels
(from 'Perfect Symmetry')
According to
Jim Hartle and Stephen Hawking (some time ago):
"The
origin of the universe cannot be assigned to a particular moment, so
there is no
Singularity. The universe, in essence, emerged from a no-time state of pure space.
Quantum fluctuations in the spacetime metric changed the signature from being
purely Euclidean (1,1,1,1) to the familiar relativistic one (-1,1,1,1). The Euclidean
spacetime tunnelled into an inflationary de Sitter spacetime which
started the Big Bang."
(see "A Brief
History of Time")
"David Meyer,
in 1993, ... combined string theory with work in quantum gravity to show that
the appearance of time and ticking clocks may have occurred as a phase
transition between disordered acausal states and one in which causal connections
between states can be defined."
All the
above quotes from:
http://itss.raytheon.com/cafe/qadir/q791.html
"From
what I understand of the 'theory', quantum fluctuations are not 'causal' events that
occur in time. It is only by linking together a particular sequence
of these fluctuations that you end up with a time-like ordering and a process
happening in time."
- Sten
Odenwald
http://itss.raytheon.com/cafe/qadir/q1455.html
'Self
Reproducing Inflationary Universe'
"After
inflation the universe becomes divided into different exponentially
large domains
inside which properties of elementary particles and even dimension of space-time
may be different."
- Andrei
Linde
http://physics.stanford.edu/linde/
[Linde
(with Guth and Steinhardt) won the 2002 Dirac Medal for their work in
developing Inflationary
Cosmology.]
John Gribbin
(physicist and science writer) has written 'Cosmology for Beginners' which is
a good start for those who desire some background.
http://www.biols.susx.ac.uk/home/John_Gribbin/
> > There is a thought that our
universe _had to_ come in to being for
> >
whatever unlikely reason as there was no time boundary constraining
> >
even an almost inestimable
low probability.
> >
*Eternity* existed beforehand. Anything could happen and did.
> >
The universe appeared because it had to - there are no zero
> >
probabilities.
> That all
follows given the first point. But is the first point valid?
> (Devil's
Advocate)
I believe that
we can only talk about time in our universe after inflation. Things were happening
'before' this - birth if you like. I titled this 'Philosophy only' because
anything 'before' the epoch of inflation
is, I believe, forever unknowable. My current favourite is the 'Self Reproducing
Inflationary Universe':
"The first,
and main, problem (for cosmologists) is the very existence of the big bang. One may wonder:
What came before? If space-time did not exist then, how
could everything appear from nothing? What arose first: the universe or the
laws determining its evolution? Explaining this initial singularity,
where and when it all began, still remains the most intractable problem
of modern cosmology."
"One domain
of a smallest possible size of 10^-33 centimeters (Planck length) is more
than enough to produce everything we see now."
"Even
if the universe at the beginning of inflation was as small as 10-33
centimeter, after
10-35 second of inflation this domain acquires an unbelievable size. According
to some inflationary models, this size in centimeters can equal 10 1 000
000 000 000 ; that is, a 1 followed by a trillion zeros.
The size
of the observable universe is 10^28 centimetres" (1 followed by
28 zeros)
http://physics.stanford.edu/linde/1032226.pdf
I'd best
stop rabbiting on now...I failed many exams by forgetting the question. :-)
Anthony Morton responded:
> All this talk
of luck and chance and statistics has dragged me
> back to my pet topic - entropy.
Fascinating topic too.
But as Zero pointed out, entropy is neither a kind of 'stuff' nor is it a
process. It's simply a combinatorial artefact that stems from the way
we assign macroscopic parameters to complex systems.
In statistical mechanics, one deals with systems containing an enormous number
N of essentially identical particles. A very simple system might consist
of two grams of helium, containing N = 6 {correction: 3} x 10^23
helium atoms. Each atom has associated with it three position coordinates
and three momentum coordinates (leaving aside quantum complications).
To specify the state of the complete system requires specifying these six
coordinates for every single atom, or 6N coordinates in total.
So every possible state of our system corresponds to a point in a truly enormous
6N-dimensional space, called 'phase space'.
To keep things (just) manageable, we now split up this phase space into lots
of little boxes, so that at any given time the state of the system can (in
principle, if not in practice) be located within one of these boxes. These
little boxes are called 'microstates'.
As a simple example, consider ordinary three-dimensional space and suppose
we are tracking the position of a fly as it moves through the space.
If we measure each position coordinate to an accuracy of 1cm, we are locating
the fly within a cube of side length 1cm, and we call each such cube a microstate.
We can imagine these cubes stacking together to tile all of space.
Now returning to our two grams of helium gas, we cannot hope to measure the
position and momentum of every atom and thereby determine which microstate
the gas is in. The best we can do is describe the system
using 'macroscopic variables', the most familiar being temperature, pressure
and volume. If we use these three macroscopic variables to describe
the system, as in classical thermodynamics, then we are specifying a state
not with 6N coordinates but rather with just
3 coordinates.
So the state of the system is now a point moving around in a 3-dimensional
phase space defined by these macroscopic variables (T, P, V). Again,
we can divide this space into little boxes of dimension (dT, dP, dV).
We call one of these boxes a 'macrostate'.
The obvious question now is: what is the relationship between microstates
and macrostates? If we know the system is in a given macrostate, what
(if anything) does this tell us about its microstate?
This is precisely where entropy comes in.
Using statistical mechanics, we can determine the values of the macroscopic
variables (T, P, V) when the system is in a given microstate. Now,
what we find is that for some macrostates - some combinations (T, P, V) -
there are a relatively small number of microstates yielding that macrostate,
while for other macrostates there
are a relatively large number of microstates. The entropy of a macrostate
is defined, essentially, as the number of microstates that yield that macrostate.
A good analogy is with a bucket of 300 dice. Define a microstate of
this system as the set of values shown on each of the 300 dice, and a macrostate
as the average of the face values. Thus a microstate has 300 coordinates,
while a macrostate has 1 coordinate. Now, the macrostate with value
1 corresponds to just one microstate: the state where all dice show the value
1. Similarly, the macrostate with value 6 also corresponds to just
one microstate. But the macrostate with value 3.5 can be generated
by a very large number of microstates: the 10^227 or so microstates
with 50 ones, 50 twos, ..., 50 sixes are just a subset of these. The
macrostate 3.5 therefore has much higher entropy than the macrostates 1 or
6.
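{Aside: a minimal Python sketch of this counting, on a smaller bucket of 30 dice so the numbers stay printable. W is the number of microstates behind each macrostate, and ln W is the entropy up to a constant factor.}

import math

def microstate_counts(n_dice):
    """Map each possible total of n_dice six-sided dice to its number of
    microstates, by repeated polynomial multiplication (dynamic programming)."""
    counts = {0: 1}
    for _ in range(n_dice):
        new = {}
        for total, ways in counts.items():
            for face in range(1, 7):
                new[total + face] = new.get(total + face, 0) + ways
        counts = new
    return counts

n = 30
counts = microstate_counts(n)
for avg in (1.0, 2.0, 3.5, 6.0):
    W = counts[round(avg * n)]
    print(f"macrostate avg={avg}: W = {W}, ln W = {math.log(W):.1f}")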
Now try this experiment. Arrange your 300 dice on a table top each
with 6 uppermost - a state of minimum entropy. Then introduce some
thermal noise by pounding the table enough to make the dice jump.
Determine the new macrostate of the system by averaging the dice values.
Repeat this a few times. It's now a purely combinatorial fact that
you will almost certainly progress to macrostates of higher and higher entropy,
and eventually converge on the state of maximum entropy with value 3.5.
Because this maximum-entropy state can be realized in overwhelmingly more
ways than any other, you will never deviate very far from it once you've
approached it.
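{Aside: the table-pounding experiment simulates in a few lines. Assuming each pound rerolls a random fifth of the dice (an invented noise model), the average face drifts from 6 towards 3.5 and then stays there.}

import random

random.seed(0)
N = 300
dice = [6] * N   # the minimum-entropy starting macrostate

for step in range(51):
    if step % 10 == 0:
        print(f"after {step:2d} pounds: average face = {sum(dice) / N:.2f}")
    # Thermal noise with no preferred direction: reroll 20% of the dice.
    for i in random.sample(range(N), N // 5):
        dice[i] = random.randint(1, 6)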
In the realm of classical thermodynamics, the macrostates of maximum entropy
are those approximately satisfying the ideal-gas law PV = NkT, which describes
the behaviour of a gas in thermal equilibrium. A gas
can be brought into a transient state that disobeys this law, but thermal
fluctuations will always tend to bring it back into a state obeying the law,
simply because the vast majority of microstates obey it.
Of course, you can always imagine something reaching into the system and
driving it into a state with reduced entropy. Using the dice example,
you can simulate a 'Maxwell demon' by perturbing successive dice so as to
always leave 6 uppermost. Here you have the advantage of being able
to distinguish the microstates you prefer. Thermal fluctuations on
the other hand do not have a preferred direction, and so entropy increases
when a system is left to its own devices - in other words when a system is
'closed' and not subject to energy inputs
from other systems.
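{Aside: the demon version is a small change to the sketch above, because the demon can see microstates and so has a preferred direction.}

import random

random.seed(0)
N = 300
dice = [random.randint(1, 6) for _ in range(N)]   # start near equilibrium

# A 'Maxwell demon': perturb each die repeatedly, but only accept the
# outcome it prefers (6 uppermost), driving the system to minimum entropy.
for i in range(N):
    while dice[i] != 6:
        dice[i] = random.randint(1, 6)

print(f"average face after the demon's pass: {sum(dice) / N:.2f}")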
> [1] Entropy is
the tendency for systems to go from "order" to
> "disorder"?
Yes. Though 'order' and 'disorder'
are difficult to define, if you can restrict the microstates of a system
to a relatively small number then the system is in a sense more 'ordered'
than when it's possible for the microstates to range over all of phase space.
> [2] The total
entropy in the Universe is increasing?
Yes.
> [3] Entropy is
however a statistical process?
In the sense that it can decrease
locally at the same time as it increases globally, yes.
> [5] If entropy
actually decreased in some part of the Universe
> rather bizarre (i.e. highly
unlikely
> statistically/probability-wise)
events might occur?
Unfortunately not. You can
often argue the other way: if things occur that are unlikely to be the result
of thermal fluctuations alone, the result will usually be a decrease in entropy.
But something has to cause that decrease; it won't occur spontaneously.
As to whether a region of the Universe could attain a state where you could
roll a die 100 times in the ordinary way and get 6 every time:
there would have to be something strange happening to the laws of classical
mechanics were that so. Dice, after all, are completely deterministic
in their behaviour; it's only the extreme sensitivity of the die-rolling
process that makes them a good source of 'random' numbers.
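{Aside: that sensitivity can be illustrated with any chaotic map; here the logistic map stands in for the die-rolling dynamics (an illustration, not the actual mechanics of dice). Two 'throws' differing by one part in a million decorrelate completely within a few dozen steps.}

x, y = 0.400000, 0.400001   # two nearly identical initial conditions
for step in range(25):
    x = 4 * x * (1 - x)     # fully chaotic logistic map
    y = 4 * y * (1 - y)
print(f"after 25 steps: x = {x:.4f}, y = {y:.4f}")   # no longer close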
Andrew
Lock commented:
"...eventually
converge on the state of maximum entropy
with value 3.5. Because this maximum-entropy state can be realized
in
overwhelmingly more ways than any other, you will never deviate very
far from it once you've approached it."
Now this seems
to indicate that there is an end point to entropy, at least in this example
. . . In that there will come a point after a great many kickings
of the table when you will no longer be appreciably changing the entropy
of the macrostate. Now, my question is: does this hold true for all systems?
Is there a point which the Universe as a whole will reach whereby its entropy
is at maximum and will no longer increase? What would this mean if there
is?
Anthony Morton replied:
Sorry - brevity not my strong point, and all that.... :-)
When cosmologists speculate about the long-term future of the universe the
result is usually depressing. One possibility is just what we're talking
about here - it's usually called the 'heat death' scenario. Basically
the entropy of the universe goes up and up until the entire universe reaches
a maximum-entropy state where all that's left is diffuse heat energy.
There are no stars or planets because they're reduced-entropy phenomena.
Everything is just a uniform concentration of photons and dust.
Current thinking is that the only way we're likely to avoid the heat death
scenario is if the curvature of the universe prevents it from expanding forever.
The idea is that if the universe contains enough matter it will eventually
stop expanding and contract back to a 'big crunch'. Unfortunately the
big crunch scenario isn't much more palatable survival-wise than the heat
death scenario.....
Kevin
Phyland responded:
Lots to think
about THERE!!! :))
However, I'd better
check whether I've been telling my students the right thing about temperature
while I'm going...
I tell them that
temperature is a measure of the AVERAGE kinetic energy of the molecules in
a given substance (e.g. air) and that the reason that water
vapour can exist at room temperature is that statistically some molecules
will have much more energy than average and some
less...
Am I giving them
the "good oil" here? (Fingers crossed)
Ray added:
>>When cosmologists speculate about the long-term future of the universe
the result is usually depressing.
When all hypotheses are already constructed upon as-yet-unproven hypotheses
and conjectures as foundations, it may be said to be equally valid (until
maybe I can be shown why it is impossible) that multiverses are interlinked
by black holes and supernovae, wherein each respective universe recreates
itself upon destruction in another, and by virtue of procession patterns over time
- like reproduction in life - and thus provides for a limitless phenomenon.
Whatever may be the case in truth, one certainty is obvious. None of
us will be here to watch it.
:) but maybe that is just another kind of depressing notion?
David
Martin supplemented:
Anthony
Morton wrote:
> entropy is neither a kind of 'stuff' nor is it
> a process. It's simply a combinatorial artifact that stems from
the
> way we assign macroscopic parameters to complex systems.
I was going to reply in some detail to this thread but Tony beat me to it.
Tony, this is the best description of the meaning of entropy that I've ever
seen. I'd like to use it in my lectures sometime (although I don't teach
this topic at present, I probably will sometime). Do you want to maintain
copyright, or is that OK by you, with attribution of course?
The only thing I could add would be that entropy is actually a number, which
can be easily calculated for any system in two (equivalent) ways: in terms
of the system's temperature and heat energy, or in terms of the number of
ways the constituent particles can be arranged. This number always increases
(for so-called irreversible processes in closed systems) and is often
called "the arrow of time".
Why we only ever see entropy increase with time seems to me to be closely
related to the nature of time and is a mystery to me. Has anyone else on
the list thought about this?
Zero Sum posted:
>On Wed, 5 Feb 2003
18:47, David Martin wrote:
> Why we only ever see entropy increase with time seems to me to be closely
> related to the nature of time and is a mystery to me. Has anyone else
on
> the list thought about this?
If you will educate the ignorant (me!)...
I have two thoughts on this. From a classical point of view, you can view
the Universe as a process, like some massively parallel CPU. This then
fits the statistical model and time exists as the tick between one universal
microstate and the next. In this way, there is no past, no future,
just a continuously evolving present with overall entropy increase required
(if we live in a closed universe).
From a more 'modern'
perspective, the increase of entropy with time, indeed
the increase in time itself, is an anthropic artefact. Since all
microstates must be regarded as having equal existence, the increase in
entropy is what we perceive because _this_ is the universe that we are in.
I hope that is not the hokum it
sounds like.
David
Martin replied to Ray:
Ray
wrote:
> >When cosmologists speculate about the long-term future of the universe the
> >result is usually depressing.
>
> When all hypotheses are already constructed upon as-yet-unproven hypotheses
> and conjectures as foundations, it may be said to be equally valid (until
> maybe I can be shown why it is impossible) that multiverses are interlinked
> by black holes and supernovae, wherein each respective universe recreates
> itself upon destruction in another, and by virtue of procession patterns over
> time - like reproduction in life - and thus provides for a limitless
> phenomenon.
Hi Ray and others,
I'm not really sure what's meant by this, but if I've understood you correctly,
you seem to be saying that a hypothesis being unproved is equivalent
to saying that anything goes, i.e. all hypotheses are equally likely.
There are as many counterexamples as there are laws of physics but I'll just
give one (to your hypothesis :-)
Newton hypothesised that the gravitational force between two point masses
was proportional to the product of the masses and inversely proportional
to the separation *squared*. This hypothesis has turned out to be so accurate
on the "everyday" scale of things that it's called Newton's Law of Gravity.
It accounts for the acceleration due to gravity at the surface of the Earth
(and other planets) and the way this varies with height, the motion of the
planets and their moons, cometary orbits and so on.
It turns out that Newton's hypothesis is only an approximation and that a
better one is Einstein's general relativity, which needs to be used for very
accurate calculations (e.g. GPS systems) or near very dense objects (e.g.
neutron stars). However, for low mass objects, Einstein gives almost
identical results to Newton.
Einstein's hypothesis is itself almost certainly an approximation which will
have to be changed when quantum effects are taken into account, but no one
has figured out how to do this yet.
The point of all this is that laws of gravity based on an inverse cube law,
or which depend on the square of the mass, etc. etc. are absolutely wrong
and in no way equivalent to Newton / Einstein: they predict results which
are simply not observed e.g. planets spiralling into the sun after a few
orbits.
Chris Lawson responded:
I would agree exactly with Geoff's earlier comments, to wit:
>[1] Entropy is
the tendency for systems to go from "order" to
>"disorder"?
Sort of.
>[2] The total entropy
in the Universe is increasing?
Yes.
>[3] Entropy is
however a statistical process?
Yes.
>[4] So...entropy
could actually increase in some parts of the Universe
Yes.
>provided there
was a compensating extra decrease
>somewhere else?
No. There is no compensating mechanism.
Remember, it's just a statistical process. This is just the old dice fallacy
-- the idea that if you've rolled a bunch of sixes, you must be due for a
one. Same with what you're
suggesting here. Even if there is an extremely improbable decrease
in entropy somewhere in the Universe, the total entropy of the Universe will
still be increasing, because there is just so much damn Universe out there
for the statistical laws to work on that it will swamp local entropy reductions.
Same with the dice. Roll fifty sixes in a row and it looks quite impressive,
but if it's in a sequence of a googol rolls, then it's not so impressive
after all.
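{Aside: the googol arithmetic, sketched in Python; all figures follow from the statement above.}

import math

p_run = (1 / 6) ** 50        # chance that a given roll starts a run of 50 sixes
log10_expected = 100 + 50 * math.log10(1 / 6)   # a googol rolls = 10^100 chances
print(f"P(50 sixes from a given roll) ~ 10^{math.log10(p_run):.1f}")  # ~10^-38.9
print(f"expected number of such runs ~ 10^{log10_expected:.1f}")      # ~10^61.1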
>[5] If entropy
actually decreased in some part of the Universe
>rather bizarre (i.e. highly
unlikely
>statistically/probability-wise)
events might occur?
Not necessarily. Remember that
entropy can decrease (and does all the time) in small-scale systems. This
was demonstrated this year by a team at ANU, but the demonstration only showed
what physicists and chemists knew anyway. But if you take a thousand small-scale
systems, even if one or two lose entropy, overall there will almost certainly
be an increase in entropy.
There are cosmologists who have explored the idea of reverse-entropy Universes.
Their work remains highly conjectural.
And added:
At 12:06 5/02/03 +1000, Paul
Williams wrote:
>General Relativity
says that there was no time before the big bang.
General relativity has nothing
to say on the matter. You're talking cosmology here, which draws heavily
on GR, but is not the same thing.
>I readily admit
that I'm not up to the maths, but I'm led to believe that
>most theories contain in essence the coming into being of time tied with
the
>appearance of matter.
Ummm...the information I have (and
I readily admit it is indirect) is that most cosmologists feel that the concept
of "time" doesn't necessarily hold outside the Universe because time is a
property of the Universe. You say "there was no time before the big bang".
Most cosmology I read would say "the very concept of 'before' does not make
sense outside the Universe; there was no 'before the Big Bang.'" Most cosmologists
(I believe) would say that "space-time came into existence and the Big Bang
is the origin (in the maths sense) of that existence" or words to that effect.
But they would NOT say that it has anything necessarily to do with the appearance
of matter. For a start, what we usually consider matter didn't even come
into existence until some time after the Big One. I also feel that most cosmologists
would NOT say that Time DOES NOT exist outside the Universe, but that it
*may* not exist in any real sense outside the Universe. By definition, we
can't say a great deal about what lies beyond the Universe, so making definitive
statements is fraught with danger. It is one thing to say that the concept
of time means that there was no "before the Big Bang"; it is another to rule
out any sort of time-like properties in something we have no experience of
and no hope of getting any inkling into.
>All the above quotes
from:
>http://itss.raytheon.com/cafe/qadir/q791.html
A good source of quotes, but do
remember that these are very small snippets that do not carry all the precedent
argument and conditional analysis. Also note that many of the people quoted
disagreed with each other on significant points, so this does not so much
represent What We Know, as a selection of interesting tidbits from a range
of people with a range of opinions about a fascinating, but only minimally
knowable subject.
Ian
Musgrave posted:
>[1]
Entropy is the tendency for systems to go from "order" to
>"disorder"?
No. (Sorry
Zero.) Entropy has nothing to do with "order" in our commonsense understanding
of it; statistically, entropy is simply a matter of counting the number of
available states for a system with a given fixed amount of energy. However,
many texts illustrate entropy with comparisons of ideal gases vs solids,
which gives the illusion it is about order. As an example of the counterintuitive
aspects of entropy, rust has a lower entropy than the system of oxygen and
iron, yet rust is more "disorderly" to our common sense than iron. (Anthony
Morton tries to put this in perspective, but it's a trap to think of entropy
in terms of "order".)
See
JCE 1999
(76) 1385 [Oct] Shuffled Cards, Messy Desks, and Disorderly Dorm
Rooms
- Examples of Entropy Increase? Nonsense! :
<http://jchemed.chem.wisc.edu/Journal/Issues/1999/Oct/abs1385.html>
also
http://www.macatea.com/workshop/FAQ_2nd/FAQ_2nd.shtml#entropy
http://www.talkorigins.org/faqs/thermo/entropy.html
>[2]
The total entropy in the Universe is increasing?
Maybe, most probably.
However, this depends in part on certain boundary conditions of the Universe
that we don't know at the moment, like whether the Universe is isolated. Also, relativity
mucks up some of the entropy accounting.
See these
google discussion threads for an idea of the problem (the URLs will wrap,
so manual assembly is required)
http://groups.google.com/groups?q=entropy+universe+group:talk.origins+author:gans&hl=en&lr=&ie=UTF-8&selm=jjhW4.18%24_56.519%40typhoon.nyu.edu&rnum=2
http://groups.google.com/groups?q=entropy+universe+group:talk.origins+author:gans&hl=en&lr=&ie=UTF-8&selm=8g4tfl%2454e%241%40news.panix.com&rnum=3
http://groups.google.com/groups?q=entropy+universe+group:talk.origins+author:parson&hl=en&lr=&ie=UTF-8&selm=7cp6uv%246r4%40peabody.colorado.edu&rnum=2
http://groups.google.com/groups?q=entropy+universe+group:talk.origins+author:parson&hl=en&lr=&ie=UTF-8&selm=85lac1%24n05%241%40peabody.colorado.edu&rnum=4
>[3]
Entropy is however a statistical process?
Yes.
>[4]
So...entropy could actually increase in some parts of the
>Universe
provided there was a compensating extra decrease
>somewhere else?
Yes, happens
all the time. Photosynthesis is one such system (there was a recent article
in New Scientist on life as entropy "eaters"), as is planetary/star formation
from molecular clouds.
New Scientist
5/10/2002 (requires subscription or 7 day free trial)
http://archive.newscientist.com/secure/article/article.jsp?rp=1&id=mg17623635.600
Also,
an interesting article about the second law being "broken" (but repeats the
disorder nonsense)
http://www.newscientist.com/news/news.jsp?id=ns99992572
>[5]
If entropy actually decreased in some part of the Universe
>rather bizarre (i.e. highly unlikely
>statistically/probability-wise) events might occur?
No. (Unless you
think star formation is bizarre)
>Where
am I going here?
From
a state of high delta G to low delta G?
>Well...is
it possible that if you were in a region of the
>Universe where temporarily entropy decreased,
Like, say, a
snow storm?
>you
could get
>things like 100 rolls of a die showing six, or dropping a
>broken biscuit and it gets back together?
No (well,
you could get 100 sixes in a row, but that would not be due
to low entropy).
A journal
devoted to entropy (free but seriously hard)
http://www.mdpi.org/entropy/
Information
theory and entropy on the web
http://www.mdpi.org/entropy/entropyweb/entropyweb.htm
And Anthony,
can I pinch your example too?
Andrew Lock commented:
In regards to this -
> Yes, happens all
the time. Photosynthesis is one such system (there was a
> recent article in New Scientist on life as entropy "eaters"), as is
> planetary/star formation from molecular clouds.
Could it be said that the entropy
of the Universe as a complete system is static? Stars are exploding all the
time, but then new ones are made from the stuff left over. Two galaxies might
collide and disperse, but a new one would form from the detritus (not too
sure on that point btw). We look at small systems here on Earth, and in what
we can postulate, but does this give us an accurate representation of the
complete system?
These are actual questions, by the way, not abstract philosophising . . .
And while I'm here -
The concepts of "before" and "outside" the Universe are interesting ones.
The Universe is, by definition, everything. Not everything contained within
something, but everything. There can be no before or outside, because those
befores and outsides are already contained within the defining of the Universe.
It already is. Speaking of befores and outsides should be restricted to discussions
on what is within the universe, rather than the complete as it stands.
Another related point -
Given - The Universe is everything
Given - What we can see of everything (which is a lot) appears to have had
a definite "starting point" and a postulated "endpoint"
This leads us to say, what was there before? What will be there afterwards?
Which leads to a thought - Nothing.
There was nothing there before, because there was nothing there. There wasn't
a void from which the Big Bang sprung. It just happened. There will be nothing
afterwards for the same reason. The Universe contains everything. But
it wasn't always there.
Zero
Sum replied:
>...Anthony
Morton tries to put this in
> perspective, but it's a trap to think of entropy in terms of "order")
Which is why
everyone was using the word in quotes, I think.
It also seems that you don't explain something quite right. Suppose
we have our bucket of 500 dice and they are in a microstate all showing sixes.
The entropy of those dice is lower than it would be if they were showing random
faces, because there are more possible microstates that the dice may occupy.
I suggest that entropy is not "simply a matter of counting the number
of available states for a system with a given fixed energy" because that
makes no account of the microstate that the system is currently occupying.
With my bucket of 500 dice, the number of possible microstates never changes.
Simply summing those would tell you that entropy never changes.
>
See
> JCE 1999 (76) 1385 [Oct] Shuffled Cards, Messy Desks, and Disorderly
Dorm
> Rooms - Examples of Entropy Increase? Nonsense! :
> <http://jchemed.chem.wisc.edu/Journal/Issues/1999/Oct/abs1385.html>
> also
> http://www.macatea.com/workshop/FAQ_2nd/FAQ_2nd.shtml#entropy
>
> http://www.talkorigins.org/faqs/thermo/entropy.html
>
I'll look at
these later, but I am more concerned with straightening out what appears
to be a misconception between us (probably mine!).
>
>[2] The total entropy in the Universe is increasing?
>
> Maybe, most probably. However, this depends in part on certain boundary
> conditions about the Universe we don't know at the moment. Like is the
> Universe isolated. Also, relativity mucks up some of the entropy
> accounting.
I think we were
assuming that the universe is a closed system. But could you explain
more about this, please? Particularly how relativity messes it up...
Anthony Morton posted:
> I'd like to use
it in my lectures sometime (although I don't teach this
> topic at present, I probably will sometime), do you want to maintain
> copyright
> or is that OK by you, with attribution of course?
No worries. Anyone's welcome
to pinch that little essay as long as it's attributed. Just be sure
to correct my mistake in giving the number of coordinates in a classical
system of particles as 6^N rather than the correct number 6N. :-)
> The only thing
I could add would be that entropy is actually a number, which can
> be easily calculated for any system in two (equivalent) ways; in terms
of the
> systems temperature and heat energy, or in terms of the number of ways
the
> constituent particles can be arranged. This number always increases
(for
> so-called irreversible processes in closed systems) and is is often
called "the
> arrow of time".
Quite so.
> Why we only ever
see entropy increase with time seems to me to be closely
> related to the nature of time and is a mystery to me. Has anyone else
on the
> list thought about this?
Whenever I turn to thoughts about the nature of time it seems to tie my brain
in knots. I was quite taken by Hawking's identification of three 'arrows
of time': the psychological one relating to our perception of passing time,
the thermodynamic one by which entropy always increases, and the cosmological
one whereby the universe goes on expanding.
It seems to me that the basis for the thermodynamic arrow of time is related
to the observation that all systems with positive energy undergo thermal
motions with no preferred direction, leading essentially to 'chaotic' behaviour.
But how this might relate to the other two arrows of time is still mysterious
to me.
Donald
Lang wrote:
>
better check whether I've been telling my students
> the right
thing about temperature while I'm going...
>
> I tell
them that temperature is a measure of the AVERAGE
> kinetic
energy of the molecules in a given substance (e.g.
> air)
Nearly true.
Pretty good in the case of air, if you consider only the motion of the molecules. Quantum
mechanics usually allows you to forget about zero point motion in 'cold'
systems. The nucleons in oxygen and nitrogen nuclei are moving around at
quite high speeds, but no state is available for them to go slower. The electrons
likewise are whizzing round their atoms. The energy is not available
for other use. So the first caveat is that you must leave out any "zero
point" energies. Next you have to take a harder look at degrees of freedom.
In air the molecules have kinetic energy, the same on
average
associated with each space dimension. Molecules with two atoms can
also rotate.
They rotate about any axis perpendicular to the line joining them. That gives them
two extra degrees of freedom. You could imagine rotating about the line
joining them as well. There are a lot of ways of
saying
one simple thing: "The moment of inertia about that axis is too small
so the quantum
of energy involved in rotating about that axis is too big and that degree of freedom
is 'frozen out'." You can freeze the other two degrees of rotational
freedom if you lower the temperature enough. When you
freeze
out any degree of freedom the first effect that shows up is a drop in
the associated
specific heat.
The temperature
is a measure of the AVERAGE kinetic energy associated with the active degrees of
freedom of the system.
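{Aside: a minimal sketch of the numbers, assuming air is mostly N2 with three translational and two active rotational degrees of freedom at room temperature.}

import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # roughly room temperature, K
m_N2 = 28 * 1.6605e-27    # mass of an N2 molecule, kg

# Equipartition: each active degree of freedom carries (1/2) k_B T on average.
mean_trans_KE = 1.5 * k_B * T              # three translational dof
mean_total_KE = 2.5 * k_B * T              # plus two rotational dof
v_rms = math.sqrt(3 * k_B * T / m_N2)      # rms speed from translation alone
print(f"mean translational KE ~ {mean_trans_KE:.2e} J")
print(f"mean KE incl. rotation ~ {mean_total_KE:.2e} J")
print(f"rms speed of N2 at {T:.0f} K ~ {v_rms:.0f} m/s")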
> and
that the reason that water vapour can exist at room
> temperature
is that statistically some molecules will have
> much more
energy than average and some less...
'Bout
right.
>
> Am I giving
them the "good oil" here? (Fingers crossed)
Tempted
to say, "Straight from the serpent" , but I think it is in fact WD-40 and may loosen
up their mental joints a little.
>
P.S. And does the existence of water vapour at room
> temperature
have something to do with the
> macrostate/microstate
view?
Probably,
but I am not sure it is worth the effort to spell it out.
And Donald added:
Pardon me, while I pick nits a little. In general four grams of Helium -
not two. You could of course go buy something valuable and have three grams
of Helium with the same number of atoms.
It is the end point of the universe that is much more important. Not to us
personally, but in principle and a bit later.
>Basically the entropy of the universe
goes up and up until the entire
> universe reaches a maximum-entropy state where
all that's left is
> diffuse heat energy. There are no stars
or planets because they're
> reduced-entropy phenomena. Everything
is just a uniform concentration
> of photons and dust.
Not quite true, I understand.
One scenario eventually gets rid of atoms altogether. Protons might eventually
decay with a half-life near 10^31 years into smaller items. Eventually there
would be photons and maybe items like neutrinos [and WIMPs, maybe -
just maybe!] and nothing else. After ten half-lives the number of protons
has dropped by a factor of ~10^3, and if you stick around for a thousand
half-lives the number remaining has dropped to 10^-300 of the number now
around.
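{Aside: the factors check out; after h half-lives a fraction 2^-h survives, as a couple of lines of Python confirm.}

import math

for half_lives in (10, 1000):
    log10_frac = -half_lives * math.log10(2)
    print(f"{half_lives:4d} half-lives: ~10^{log10_frac:.0f} of the protons remain")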
Failing that scenario there is an earlier correction. A dust cloud with the
mass of the solar system and spread over a volume out to say the Oort radius
has less entropy than the same mass collected into a sun and planets, even
though the cloud might be cold and the sun is definitely hot.
I won't try to produce a proof. The standard assertion is something about
"exercise for the student". With luck you will find it both obvious and counter
intuitive.
Ian Musgrave wrote:
>> ...As
> > an example of the
counterintuitive aspects of entropy, rust has a lower
> > entropy than the
system of oxygen and iron, yet rust is more "disorderly"
> > to our common
sense than iron. Anthony Morton tries to put this in
> > perspective, but
it's a trap to think of entropy in terms of "order"
>Which is why everyone was
using the word in quotes, I think.
Even in those quotes, the kind of things that entropy does has poor relation
to "order" - remember the rust example.
>It also seems that
you don't explain something quite right. Suppose we have
>our bucket of 500 dice and
they are in a microstate all showing sixes.
>The entropy of those dice
is lower than it would be if they were showing random faces,
>because there are more possible
microstates that the dice may occupy.
No. To make it simpler the entropy of 5 dice sitting quietly on the table
is the same for the sequence
6 6 6 6 6 as it is for 4 3 5 1 6.
There are limits to Anthony's analogy, but these limits in fact help to
explain the "at a given fixed amount of energy".
In a real, physical system of dice, the dice sitting on the table have a
low entropy because at that energy level (sitting quietly on the table) they
have just one microstate. In the absence of energy input, the dice will sit
there, six face up (or 4 3 5 1 6 face up), until the sun goes into red giant
phase (or your mother/spouse/small child moves them). When we add energy
by thumping the table (and only then) do the dice acquire multiple microstates. You
may consider this analogous to the entropy of a cube of ice and the entropy
of the same volume of liquid water.
>I suggest that
entropy is _not_ "simply a matter of counting the number of
>available states for a system
with a given fixed energy" because that
>makes no account of the microstate
that the system is currently occupying.
That's what the "a given fixed
energy" clause is about. The state the system is currently occupying.
>With my bucket
of 500 dice, the number of possible microstates never
>changes. Simply summing
those would tell you that entropy never changes.
Sum the number of _possible_ microstates in a bucket of dice sitting on the
floor over a unit time T. There is only one, the current configuration. Unless
you add energy, those dice will sit in that configuration until proton decay
sets in. Now sum the number of possible microstates of the same bucket of
dice being shaken (you may need a stroboscopic camera to do so) over the
same unit time T (there are lots and lots and lots). Think ice (the number
of possible states at a given energy, zero deg C) and think water (the number
of possible states at 20 deg C).
(in Anthony's analogy, a number of dice sitting quietly on a table showing
all sixes has the same entropy as a number of dice sitting quietly on the
table showing any combination of faces, but for illustration purposes all
sixes shows the evolution of the states when energy is added rather than
some other combination. However, our sense of "order" and the naive statistical
concept that "all sixes" is a more unique sequence than 4 3 5 1
6 kicks in and messes it up)
[snip handy references]
> > >[2]
The total entropy in the Universe is increasing?
> > Maybe, most probably.
However, this depends in part on certain boundary
> > conditions about
the Universe we don't know at the moment. Like is the
> > Universe isolated.
Also, relativity mucks up some of the entropy
> > accounting.
>I think we were assuming
that the universe is a closed system.
Yes, but do we know that the universe is a closed (or in thermodynamic
terms isolated) system?
>But could
>you explain more about this,
please? Particularly how relativity messes it
>up...
To measure the entropy of the universe we have to account for all the radiation
in the universe as well as matter. Relativity messes up our accounting of
photons in different parts of the universe in ways that I do not understand
and cannot explain, but which are briefly covered in the URLs supplied and
also treated (not from a thermodynamics view though) in part two of the sci.physics
FAQ.
Put it this way, the problem is non-trivial. Serious physicists find it hard,
and there is currently no clear answer.
Zero
Sum responded:
> >Which is why everyone was using
the word in quotes, I think.
> Even in
those quotes, the kind of things that entropy does has poor
> relation
to "order" - remember the rust example.
Nolo contendere;
nevertheless it is commonly used, particularly in textbooks.
"Order" is capable of carrying various semantic baggage. I suspect
that we are using somewhat differing meanings here.
> >It also seems that you don't
explain something quite right. Suppose we
> >have
our bucket of 500 dice and they are in a microstate all showing
> >sixes.
The entropy of those dice is lower than it would be if they were showing
> >random
faces, because there are more possible microstates that the dice
> >may
occupy.
> No. To
make it simpler the entropy of 5 dice sitting quietly on the table
> is the
_same_ for the sequence
> 6 6 6
6 6 as it is for 4 3 5 1 6.
Obviously. And for 6 6 6 6 6 versus 3 4 5 1 6, and so on. I said
"showing random faces". Your argument would be valid
if I had said "showing particular random faces".
Perhaps I should have used totals and said that there is only one microstate
(with your five dice) that adds up to thirty, but far more that add up to
fourteen or fifteen.
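{Aside: enumerating the totals of five dice bears this out; one microstate sums to 30, while hundreds sum to fourteen or fifteen.}

from itertools import product

# Count ordered microstates of five six-sided dice for each total.
counts = {}
for faces in product(range(1, 7), repeat=5):
    counts[sum(faces)] = counts.get(sum(faces), 0) + 1

for total in (30, 15, 14):
    print(f"total {total}: {counts[total]} of 6^5 = 7776 microstates")
# total 30: 1, total 15: 651, total 14: 540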
> There are limits to Anthony's analogy, but these limits in fact help
> to explain the "at a given fixed amount of energy".
The "at a given
fixed amount of energy" is provided by the fact that we are taking about
closed systems. It is inherent and does not need further detailing.
[snip]
> >I suggest that entropy is _not_
"simply a matter of counting the number
> >of
available states for a system with a given fixed energy" because that
> >makes
no account of the microstate that the system is currently
> >occupying.
> That's
what the "a given fixed energy" clause is about. The state the
> system
is currently occupying.
We appear to be talking at cross purposes.
>
Sum the number of _possible_ microstates in a bucket of dice sitting on
> the floor
over a unit time T. There is only one, the current
> configuration.
Unless you add energy, those dice will sit in that
> configuration
until proton decay sets in. Now sum the number of possible
> microstates
of the same bucket of dice being shaken (you may need a
> stroboscopic
camera to do so) over the same unit time T (there are lots
> and lots
and lots). Think ice (the number of possible states at a given
> energy,
zero deg C and think water (the number of possible states at 20
> deg C).
>
> (in Anthony's
analogy, a number of dice sitting quietly on a table
> showing
all sixes has the same entropy as a number of dice sitting
> quietly
on the table showing any combination of faces, but for
> illustration
purposes all sixes shows the evolution of the states when
> energy
is added rather than some other combination. However, our sense of
> "order"
and the naive statistical concept that "all sixes" is a more
> unique
sequence than 4 3 5 1 6 kicks in and messes it up)
But it is
more unique if you cannot distinguish the dice. You have 6 6 6 6 6 vs
4 3 5 1 6, 4 3 5 6 1, 4 3 1 5 6, 4 3 1 6 5 ...
> > > Maybe, most probably.
However, this depends in part on certain
> >
> boundary conditions about the Universe we don't know at the moment.
> >
> Like is the Universe isolated. Also, relativity mucks up some of the
> >
> entropy accounting.
> >
>
> >I
think we were assuming that the universe is a closed system.
> Yes, but
do we _know_ that the universe is a closed (or in thermodynamic
> terms
isolated) system?
Given all the other assumptions we are making, I think that it is
a safe position at the moment - at least not likely to be disproved.
And we can't really have this conversation about a system that has an energy
input or drain.
> >But could you explain more
about this, please? Particularly how
> >
relativity messes it up...
> To measure
the entropy of the universe we have to account for all the
> radiation
in the universe as well as matter. Relativity messes up our
> accounting
of photons in different parts of the universe in ways that I
> do not
understand and cannot explain, but are briefly covered in the URLs
> supplied,
and is also treated (not from a thermodynamics view though) in
> part two
of the sci.physics FAQ.
>
> Put it
this way, the problem is non trivial. Serious physicists find it
> hard,
and there is currently no clear answer.
Can you add to
this? I'm not seeing this as a major problem, which may be more to do
with my philosophical stance on models than anything else.
Chris Lawson replied:
>"Order" is capable
of carrying various semantic baggage. I suspect that we
>are using somewhat differing
meanings here.
I agree with Ian here. "Order"
is a very bad term to use when discussing entropy because it causes great
confusion. A lot of textbooks and pop-sci treatments use the concept because
it gets a message across easily, but unfortunately it is a *simplification*
rather than a good analogy.
In reality, entropy is a thermodynamic concept that can be applied to information
theory. When we start talking about "order", it conjures up all sorts of images
in different people's heads. As you say, it carries lots of
semantic baggage.
One of my favourite misunderstandings of entropy was in a Doctor Who episode
(a real fan will probably be able to tell me which story it was), when the
evil Master had dragged the Doctor to the end of time when entropy was approaching
maximum. When the Doctor stepped out into the high-entropy world, everything
was falling apart. He couldn't walk past a cliff without thousands of rocks
falling down it. What the writers had failed to understand was that at near
maximal entropy -- everything has already fallen down. There are no rocks
left to fall. High-entropy states are notable for their utter lack
of interesting events. Nothing much happens to a high-entropy system unless
it is brought into contact with another system with a different entropy level.
Anyway, while "order" is OK as shorthand, it really doesn't work very well
in the particulars. Let me give just one example: a box full of gas molecules.
If all the molecules are aligned in an array and are moving in the same direction,
then they have very low entropy, and will shortly find themselves being
randomised, and then the entropy of the gas will inevitably increase over
time. Now imagine the same box with the same number of gas
molecules, only now the molecules are arranged in a complex 3D Fibonacci
spiral, with multiple different velocities. Now this system has a higher
entropy than the previous monostate, but it is clearly more ordered. So
"order" is only roughly correlated with low entropy in most common situations,
but it is not the same thing.
Any corrections would be appreciated. (Entropy throws me a bit, too.)
Donald
Lang added:
One extra measurement
to make Kevin.
It has been established
around here that a peach stone probably has negative mass while it is still
inside the ripe fruit. A ripe peach would then have small total mass,
and, by equipartition of energy, a considerable mean
square speed.
The experimental
evidence is that an open bowl of peaches evaporates very rapidly.