There's a concept that's crucial
to chemistry and physics. It helps explain why physical processes
go one way and not the other: why ice melts, why cream mixes into coffee, why air leaks out of a punctured tire. It's entropy, and it's notoriously
difficult to wrap our heads around. Entropy is often described as
a measurement of disorder. That's a convenient image,
but it's unfortunately misleading. For example, which is more disordered: a cup of crushed ice or a glass
of room-temperature water? Most people would say the ice, but it actually has lower entropy. So here's another way of thinking
about it through probability. This may be trickier to understand,
but take the time to internalize it and you'll have a much better
understanding of entropy. Consider two small solids, each made up
of six atomic bonds. In this model, the energy in each solid
is stored in the bonds. Those can be thought of
as simple containers, which can hold indivisible units of energy
known as quanta. The more energy a solid has,
the hotter it is. It turns out that there are numerous
ways that the energy can be distributed in the two solids and still have the same
total energy in each. Each of these options
is called a microstate. For six quanta of energy in Solid A
and two in Solid B, there are 9,702 microstates.
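
That count is easy to verify. Here is a minimal Python sketch, assuming the standard stars-and-bars rule for placing indistinguishable quanta into bonds (the helper name and the six-bond default are ours, just for illustration):

from math import comb

def microstates(quanta, bonds=6):
    # Ways to distribute indistinguishable quanta among the bonds
    # ("stars and bars"): C(quanta + bonds - 1, quanta).
    return comb(quanta + bonds - 1, quanta)

w_a = microstates(6)   # 462 arrangements within Solid A
w_b = microstates(2)   # 21 arrangements within Solid B
print(w_a * w_b)       # 9702 microstates for this configuration
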
Of course, there are other ways our eight
quanta of energy can be arranged. For example, all of the energy
could be in Solid A and none in B, or half in A and half in B. If we assume that each microstate
is equally likely, we can see that some of the energy
configurations have a higher probability of occurring
than others. That's due to their greater number
of microstates. Entropy is a direct measure of each
energy configuration's probability. What we see is that the energy
configuration in which the energy
is most spread out between the solids has the highest entropy. So in a general sense, entropy can be thought of as a measurement
of this energy spread. Low entropy means
the energy is concentrated. High entropy means it's spread out.
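
Running the same count over every possible split makes that concrete. The sketch below lists each configuration's microstates and probability, and scores its entropy with Boltzmann's relation (entropy proportional to the logarithm of the microstate count, an ingredient we are adding here rather than one the lesson states):

from math import comb, log

def microstates(quanta, bonds=6):
    return comb(quanta + bonds - 1, quanta)

total = sum(microstates(q) * microstates(8 - q) for q in range(9))

for q_a in range(8, -1, -1):
    w = microstates(q_a) * microstates(8 - q_a)   # microstates for this split
    print(f"A={q_a} B={8 - q_a}: {w:6d} microstates, "
          f"probability {w / total:5.1%}, entropy ~ ln W = {log(w):.2f}")

The even split, four quanta in each solid, tops the list; the lopsided splits sit at the bottom.
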
To see why entropy is useful for explaining spontaneous processes, like hot objects cooling down, we need to look at a dynamic system
where the energy moves. In reality, energy doesn't stay put. It continuously moves between
neighboring bonds. As the energy moves, the energy configuration can change. Because of the distribution
of microstates, there's a 21% chance that the system
will later be in the configuration in which the energy is maximally
spread out, there's a 13% chance that it will
return to its starting point, and an 8% chance that A will actually
gain energy.
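
Those percentages follow from the same microstate counts, once we assume the wandering energy makes every microstate equally likely over time. A short sketch, with helper names that are again ours:

from math import comb

def microstates(quanta, bonds=6):
    return comb(quanta + bonds - 1, quanta)

def split_prob(q_a, total_quanta=8):
    # Probability of finding q_a quanta in A, with every microstate equally likely.
    total = sum(microstates(q) * microstates(total_quanta - q)
                for q in range(total_quanta + 1))
    return microstates(q_a) * microstates(total_quanta - q_a) / total

print(f"maximally spread out (4 and 4):  {split_prob(4):.0%}")                  # ~21%
print(f"back where it started (6 and 2): {split_prob(6):.0%}")                  # ~13%
print(f"A gains energy (7 or 8 in A):    {split_prob(7) + split_prob(8):.0%}")  # ~8%
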
Again, we see that because there are more ways to have dispersed energy and high entropy than concentrated energy, the energy tends to spread out. That's why if you put a hot object
next to a cold one, the cold one will warm up
and the hot one will cool down. But even in that example, there is an 8% chance that the hot object
would get hotter. Why doesn't this ever happen
in real life? It's all about the size of the system. Our hypothetical solids only had
six bonds each. Let's scale the solids up to 6,000 bonds
and 8,000 units of energy, and again start the system with
three-quarters of the energy in A and one-quarter in B. Now we find that the chance of A
spontaneously acquiring more energy is vanishingly small.
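
Just how small can be estimated with the same counting, done in logarithms so the huge numbers stay manageable. The sketch below reads "6,000 bonds" as 6,000 bonds per solid (an assumption on our part) and starts with 6,000 quanta in A and 2,000 in B; the probability it prints works out to roughly one chance in a number a few hundred digits long:

from math import lgamma, log, exp

BONDS, TOTAL = 6000, 8000   # 6,000 bonds per solid, 8,000 quanta overall
START_A = 6000              # three-quarters of the energy starts in Solid A

def log_w(quanta, bonds=BONDS):
    # Natural log of C(quanta + bonds - 1, quanta), via log-gamma.
    return lgamma(quanta + bonds) - lgamma(quanta + 1) - lgamma(bonds)

def log_split(q_a):
    # Log microstate count for the split with q_a quanta in A, the rest in B.
    return log_w(q_a) + log_w(TOTAL - q_a)

def log_sum_exp(values):
    m = max(values)
    return m + log(sum(exp(v - m) for v in values))

log_total = log_sum_exp([log_split(q) for q in range(TOTAL + 1)])
log_gain  = log_sum_exp([log_split(q) for q in range(START_A + 1, TOTAL + 1)])

# Chance that A spontaneously gains energy, reported as a power of ten.
print(f"about 10^{(log_gain - log_total) / log(10):.0f}")
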
Familiar, everyday objects have many, many times more particles than this. The chance of a hot object
in the real world getting hotter is so absurdly small, it just never happens. Ice melts, cream mixes in, and tires deflate because these states have more
dispersed energy than the originals. There's no mysterious force
nudging the system towards higher entropy. It's just that higher entropy is always
statistically more likely. That's why entropy has been called
time's arrow. If energy has the opportunity
to spread out, it will.