MATT O'DOWD: Can a demon
defeat the second law of thermodynamics? This is actually a very
serious scientific question. Entropy is sometimes
described as a measure of disorder or randomness. The second law of
thermodynamics, the law that entropy
must on average increase, has been interpreted
as the inevitability of the decay of structure. This is misleading. As we saw in our episode
on the physics of life, structure can
develop in one region even as the entropy
of the universe rises. Ultimately, entropy is a
measure of the availability of free energy, of energy
that isn't hopelessly mixed into thermal equilibrium. Pump energy into a small system
and complexity can thrive. But entropy is connected
to disorder and randomness in a very real way. See, entropy is a
measure of ignorance. Entropy very directly
measures how much information we don't have about a system. But with knowledge
comes power, literally. Know a system perfectly and
you can return it to order, or you can extract its energy. To show you how
I'm going to have to introduce you to a friend
of mine named Maxwell's demon. But before I do that, a
little recap is in order. This episode will build heavily
on our recent episode, linked here. Worth a watch, if you haven't. There, we show that
entropy is a direct measure of hidden information. It's defined by the number
of possible configurations of particles-- or microstates in
physics-speak-- that could produce
the same observed set of macroscopic observables,
or the same macrostate. More precisely, if we only know
the thermodynamic properties of a system-- temperature, pressure,
volume, et cetera-- how much extra
information would we need to perfectly describe
the current microscopic arrangement? Let's explore this using the
analogy we used last time, the Go board. Let's say we place 180 black
stones on random vertices of our Go board. In the vast majority
of such arrangements, stones are spread
evenly over the board. These all correspond to the same
smoothly distributed macrostate and to maximum entropy. Even if you knew you
were in this macrostate, you still wouldn't know much
about which vertex had a stone. There are around 10 to the power
of 108 possible microstates that give you this smoothly
distributed macrostate, and you'd need close to
361 bits of information, one for each of the
19 by 19 vertices, to fully describe the board.
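If you want to check those figures yourself, here's a quick back-of-the-envelope sketch in Python (our own illustration, not from the episode; it just counts the ways to place 180 indistinguishable stones on 361 vertices, and the exact answer lands a shade under the rounded numbers quoted above):

```python
# Counting microstates for 180 stones scattered over a 19x19 Go board.
import math

vertices = 19 * 19          # 361 vertices
stones = 180

microstates = math.comb(vertices, stones)   # ways to choose which vertices hold a stone
print(f"microstates ~ 10^{math.log10(microstates):.0f}")                  # ~ 10^107
print(f"bits to pin down one microstate: {math.log2(microstates):.0f}")   # ~ 356 bits
```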
On the other hand, what if the black stones all ended up on the right side? There are only a
few configurations that look like this, which means
knowing the macrostate almost tells you the exact microstate. You only need a few
more bits of information about that middle column. This is close to
minimum entropy. Now, those stones
all being on one side could represent particles
that all have high or all have low energies, or particles that
are all on one side of a room. Either way, this would be
far from thermal equilibrium. The particles would quickly flow
to fill the available space, and you could extract
energy from that flow. OK, so that was one
weird Go configuration. But what about this one,
where the stones are arranged in stripes? Again there's only one
arrangement like this, so it should also be
low entropy, right? Well, not quite. The particles are
pretty well mixed. They don't need to move far
to reach true equilibrium. How are you going
to extract energy? In fact, this
configuration isn't low entropy in the
thermodynamic sense. And this is the
critical distinction. Thermodynamic entropy
is related to the amount of hidden information, based on
thermodynamic knowledge only. It defines how far a system
is from thermal equilibrium, and it also defines
the availability of free energy,
energy that can be extracted as the system
moves back to equilibrium. Thermodynamic entropy
is low if there are differences in the average
thermodynamic properties from one macroscopic
region compared to another. But weird configurations
like these, where average thermodynamic
properties are the same everywhere, are just
weird microstates among the many microstates of
a very high entropy macrostate. So much for structure
and organization always meaning low
entropy, right? Well, actually, it turns out
that even these specific high entropy configurations can be
transformed to low entropy, as long as we have
information about the state. To understand this,
you're finally going to have to
meet Maxwell's demon. As well as unifying the
equations of electromagnetism, James Clerk Maxwell
was one of the founders of statistical mechanics. Understanding that entropy
was a statistical phenomenon, he came up with a
thought experiment to explore just how
fundamental the second law of thermodynamics really was. He imagined a box
with two halves, sealed by a wall between them. The wall has a tiny door large
enough for a single molecule to pass through. The air throughout the box
is the same temperature, so even if we open
the door, temperature would stay the same. The halves are in thermal
equilibrium with each other, and if the box is isolated
from its surroundings, then this is the state
of maximum entropy. Let's introduce the demon. The demon has the
ability to observe the speeds
of individual particles in the system. It can also open and close
the door between the sides. Every time the demon sees a
high speed particle approaching from the right,
it opens the door to let it pass to the left
side, and it lets lower speed particles pass left to right. Soon enough, the left side
is full of fast particles and is hot, while the right
contains slow particles and is cold. We are no longer in
thermal equilibrium, and the entropy is lower than
before the demon started.
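To make the sorting rule concrete, here's a minimal toy simulation in Python (a sketch of our own, not anything from the episode; the half-normal speed distribution, the fast/slow cutoff at the median, and the particle and step counts are all invented for illustration):

```python
# Toy Maxwell's demon: fast particles are only let through right-to-left,
# slow particles only left-to-right.
import random

random.seed(1)
N = 10_000
speeds = [abs(random.gauss(0, 1)) for _ in range(N)]   # stand-in for a thermal speed distribution
sides = [random.choice("LR") for _ in range(N)]        # start well mixed
cutoff = sorted(speeds)[N // 2]                        # the demon's notion of "fast"

for _ in range(200_000):
    i = random.randrange(N)                            # a particle wanders up to the door
    if sides[i] == "R" and speeds[i] >= cutoff:
        sides[i] = "L"                                 # open the door for a fast one heading left
    elif sides[i] == "L" and speeds[i] < cutoff:
        sides[i] = "R"                                 # and for a slow one heading right

def mean_energy(side):
    e = [v * v for v, s in zip(speeds, sides) if s == side]
    return sum(e) / len(e)

print("mean KE left :", round(mean_energy("L"), 3))    # hot side
print("mean KE right:", round(mean_energy("R"), 3))    # cold side
```

After the loop, essentially every fast particle has been herded to the left and every slow one to the right, so the left side's mean kinetic energy comes out well above the right's: a temperature difference built purely out of information.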
This temperature differential can then be used to drive a heat
engine and do work. But all of this
appears to have been done without exchanging
energy or entropy with the outside universe. This seems to violate the second
law of thermodynamics, which demands entropy remain
constant or increase, unless energy is exchanged
with the outside universe. Believe it or not,
Maxwell's demon was seen as a serious
problem for the second law for many years. Physicists tried to figure
out a flaw in the argument. It turns out that there
are plausible non-demonic-- or even non-intelligent-- mechanisms
to detect approaching particles and open the door, mechanisms
which, in principle, don't increase entropy. But it turns out,
there's one last step in the process of
sorting particles where the increase of
entropy is unavoidable. Weirdly, it's not
in the measurement of particle trajectories
or the motion of the door. It's in the storage
of information. It's in the memory of the demon. See, in order for the
demon to do its job, it must learn about
the particles. The demon, or the particle
sorting system it represents, must start in some known
predictable state, which is altered by interaction
with a particle into some unknown state. From our point of view, the
randomness of the particles decreases, but that
randomness is transferred to the memory of the demon. But even demons
have finite memory. At some point, the system
for sorting particles needs to be reset to
continue its work. To erase the information
of past sorting means to transform some
arbitrarily jumbled state into a single known state. That's a physical process
that reduces the demon's internal entropy, and that
takes an irreversible transfer of energy. The demon has to
radiate heat, which means transferring
entropy back into the box or to the universe. Either way the second
law is preserved. This resolution was proposed
by Rolf Landauer in 1961, nearly 100 years after Maxwell
proposed the conundrum. Landauer's principle states
that "any logically irreversible manipulation of information,
such as the erasure of a bit or the merging of two
computation paths, must be accompanied by
a corresponding entropy increase." And this has come to be
seen as a fundamental limit to the efficiency
of computation.
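For a sense of scale (a numerical aside of our own, using standard SI constants), the Landauer bound works out to a few zeptojoules per erased bit at room temperature:

```python
# Landauer's bound: erasing one bit at temperature T must dissipate
# at least k_B * T * ln(2) of heat.
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300                   # roughly room temperature, K

per_bit = k_B * T * math.log(2)
print(f"minimum heat per erased bit at {T} K: {per_bit:.2e} J")   # ~2.9e-21 J
print(f"per erased gigabyte: {per_bit * 8e9:.2e} J")              # ~2.3e-11 J
```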
Both Maxwell's conundrum and Landauer's resolution are fascinating,
because they highlight the fundamental link between
entropy and information. Firstly, they show that
information about a system allows us to
extract useful work. If you have perfect knowledge
of the state of all particles in the box, you
could open and close the door in exactly
the right sequence to sort the particles
any way you liked. You could turn any perfectly
known equilibrium state into literally any other
non-equilibrium state. You could turn information
into work or into structure. But ultimately, possessing
that information does increase entropy. You generate heat as you
compute, in particular as you reset your own memory. The entropy of the
universe must increase, and yet knowing the microstate
of a system, no matter how thermodynamically mixed, allows
you to decrease its entropy. As long as you can
power your computation, information is power. Claude Shannon, the father
of information theory, was deeply inspired by the
close connection between entropy and information. He defined a new type of
entropy, Shannon entropy. It can be interpreted as
the amount of previously hidden information
that can potentially be revealed by the
observation of a random event. In a sense, it's a measure of
the uncertainty of the event. The more uncertain or
more unusual an event, the more revealing it is than
a more predictable event. So for example,
the roll of a die has more entropy than
the flip of a coin. There are more
potential outcomes in the former, so
more uncertainty has been eliminated
when the die is cast. Shannon used the term
entropy for this measure, because his formula was almost
identical to the formula for thermodynamic entropy. Just multiply by the Boltzmann
constant to get the latter. If you take all possible
results of the die roll to be equally likely,
then Shannon's formula looks just like the
Boltzmann equation.
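Here's that comparison as a short Python check (illustration only, using the obvious fair-coin and fair-die probabilities):

```python
# Shannon entropy H = -sum(p * log2(p)), in bits: coin versus die,
# and the uniform-probability case that mirrors Boltzmann's formula.
import math

def shannon_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

coin = [0.5, 0.5]
die = [1 / 6] * 6
print(shannon_bits(coin))   # 1.0 bit
print(shannon_bits(die))    # ~2.585 bits

# With W equally likely outcomes, H = log2(W) -- the same form as
# Boltzmann's S = k_B * ln(W), up to a constant factor of k_B * ln(2) per bit.
W = 6
print(math.log2(W))         # matches shannon_bits(die)
```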
This isn't a coincidence. In fact, Shannon's entropy is, in a sense, just a generalization of
thermodynamic entropy. The former describes the
amount of hidden information in any system. The latter applies specifically
to thermodynamic systems. And there's a third, perhaps
even more fundamental, type of entropy. That's quantum entropy, also
known as Von Neumann entropy. It describes the hidden
information in quantum systems, but more accurately, it's a
measure of the entanglement within quantum systems. In fact, the evolution
of quantum entanglement may be the ultimate
source of entropy, the second law, the limits
of information processing, and even the arrow of time. We'll get tangled up in
all of that in an upcoming episode of "Space Time." Last time, we delved into
the true nature of entropy and the cause of the second
law of thermodynamics. Let's see what you had to say. Iago Silva criticizes
our assumption of the ergodic hypothesis. For the non-physicists
out there, the ergodic hypothesis is
basically the assumption that all microstates are equally
probable over long periods of time. Iago's criticism
is totally fair. Starting with this
assumption gets you to the Boltzmann equation, and
it's a nice, relatively simple way to understand entropy. It's valid for
idealized situations, but isn't necessarily general. If different microstates can
have different probabilities, then you need to include
those probabilities in your equation for entropy. The resulting formula, the
Gibbs entropy equation, was derived soon after
Boltzmann's equation, and is the more general
statistical mechanical definition for entropy. Nifelheim Mists notes that
Nifelheim Mists notes that if entropy is statistical, then it's wrong to say that
it must always increase. Rather, it's always
likely to increase. Well, Nifelheim Mists is right. However, the likelihood
of entropy decreasing for any macroscopic
isolated system is so overwhelmingly
small, that it never happens on any sane time scale. On the other hand,
if the universe lasts for infinite time,
then, in principle, entropy drops of all sizes should
eventually happen. Check out our Boltzmann brain
episode for more on that. Zombie Blood would like us
to walk through the math used in calculating the number
of possible microstates on the Go board. Sure, but that sounds like
a challenge episode to me. Stand by. David Durant wanted to let us
know that the Go board made him think of the game
of Life and wonder whether the universe is a giant
quantum cellular automaton, which led him to a Wikipedia
page on digital physics, and now he's reading about
pan-computationalism. Some say that David
Durant is still in that Wikipedia rabbit hole. Some say that his Wiki browsing
choices became Turing complete long ago, and
several new universes have been simulated in
David Durant's yes-no click decisions. Could we be one of
those universes? Can we ever know? Keep browsing, David Durant,
for all of our sakes.