An introduction to entropy (or: how physics likes to gamble)

In this post we discuss the concept of entropy in the context of statistical mechanics. While it is self-contained, it serves as a primer to a follow-up post describing the results of a recent paper.

Take a look at the video clip below. It shows a number of particles bouncing around in two chambers connected by a tunnel. Each particle is unaware of the rest and reflects off the walls as if it were a billiard ball on a pool table. Initially, the collective behaviour of the particles looks unstructured and random. Keep looking, however, and something interesting happens…

Even though every single particle in the system behaves according to physical law, the fact that all particles congregate in one chamber seems unphysical, as if some other power were at work. The truth is that this clip is merely a backwards-played version of a system of particles that all start in the same chamber and move in random directions. If you’re curious, you can see the original clip here.
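The core of this puzzle, that the underlying dynamics are reversible, can be sketched in a few lines of Python. This is a toy one-dimensional billiard with made-up numbers, not the simulation behind the clips: run a particle forward, flip its velocity, apply the same bouncing rule again, and it retraces its path back to where it began.

```python
def step(pos, vel, dt=0.01, lo=0.0, hi=1.0):
    """One time step of a particle bouncing between two walls."""
    pos += vel * dt
    if pos > hi:                      # reflect off the right wall
        pos, vel = 2 * hi - pos, -vel
    elif pos < lo:                    # reflect off the left wall
        pos, vel = 2 * lo - pos, -vel
    return pos, vel

pos, vel = 0.234, 0.7071
for _ in range(1000):                 # play the clip forwards
    pos, vel = step(pos, vel)
vel = -vel                            # now play it backwards
for _ in range(1000):                 # the very same law, nothing special
    pos, vel = step(pos, vel)
print(abs(pos - 0.234) < 1e-9)        # True: back at the starting position
```

Nothing in the bouncing rule distinguishes the forward run from the reversed one, which is exactly why a single-particle clip carries no arrow of time.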

Let’s take a closer look at the relevance of the direction of time in this process. In the next clip, we look at a simulation of the same system containing only one particle.

This time it is impossible to tell if the clip is time-reversed or not. The original is here, and shows no qualitatively different behaviour.

This hints at an interesting paradox. If the motion of a single particle tells you nothing about the direction of time, how come a large number of (independent) particles tells you so much? The answer, rooted in the discipline of statistical physics, is entropy.

The basics: classical mechanics

All the matter around us is built up from molecules. For solid matter, these molecules are tightly interconnected in a grid-like structure. This is why, when you drop a mug, it either retains its shape or shatters into solid pieces. On the other hand, liquids and gases do not have this property. Their molecules are free to move in space and interact with each other through attraction, repulsion and collisions. For example: when you spill water, it will retain its volume, but adapt its shape to the surface it is spilled on. Gases behave in an even less structured way; gas molecules spread freely throughout unconfined spaces in a process called diffusion.

When we look at single molecules, we understand their motion and have done so since the late seventeenth century. The rules that govern these particles are the same ones that work for large objects like apples and bowling balls: Newton’s laws of motion. Among other things, these laws state that the acceleration of an object equals the sum of the forces acting on it, divided by its mass.
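As a minimal sketch (the mass, force, and time step are made-up values for illustration), that rule can be written in a few lines of Python:

```python
def step(pos, vel, force, mass, dt):
    """Advance one object by a small time step dt."""
    acc = force / mass       # Newton's second law: a = F / m
    vel += acc * dt          # acceleration changes the velocity
    pos += vel * dt          # velocity changes the position
    return pos, vel

# Example: a 1 kg ball, dropped from 10 m, falling under gravity for one second.
pos, vel = 10.0, 0.0
for _ in range(1000):
    pos, vel = step(pos, vel, force=-9.81, mass=1.0, dt=0.001)
print(round(vel, 2))  # -9.81: the familiar 9.81 m/s of speed after one second
```

Repeating such tiny steps is, in essence, how the particle clips above are simulated: the same law, applied over and over.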

However, there is a major difference between the motion of life-sized objects and the molecules of gases: the numbers in which they appear. The number of particles that makes up an observable amount of gas is so vast that Newton’s laws cannot reliably be applied to predict the motion of the individual particles, especially once you take into account the sheer number of collisions that occur among them and change their trajectories.

So if we cannot use the classical and well-understood tools that Isaac Newton left us, are there any alternatives? Well, yes. It turns out that a statistical approach can provide us with much more practical tools, if we are willing to give up the information we can get on individual particles.

Upscaling: statistical mechanics

Generally speaking, the goal of statistics is to deduce rules about the collective rather than about any single individual. As an example, consider the following thought experiment.

Take a six-sided die and throw it. The face-up number turns out to be even; not a very surprising result. Next, throw three dice, and all of them present an even number. The odds of that are one in eight: unlikely, but nothing special. Now simultaneously throw, say, 14,398 different dice, check the results and find that each and every one of them presents an even number. At this point, you’d probably start to wonder whether the dice were manipulated.

Although technically possible, the probability that such an event occurs is so low that you would never observe it in real life. In a similar way, Newton’s laws do not provide the full picture when we are looking at large numbers of particles. They do not weed out the unlikely possibility that all particles follow the same trajectory, because they do not take likelihood into account.
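Just how low is “so low”? A quick sketch, using the dice counts from the thought experiment above:

```python
import math
from fractions import Fraction

def prob_all_even(n_dice):
    """Probability that n fair six-sided dice all show an even number."""
    return Fraction(1, 2) ** n_dice

print(prob_all_even(3))  # 1/8: unlikely, but nothing suspicious

# For 14,398 dice the probability is far too small for a float,
# so we report its order of magnitude instead.
log10_p = 14398 * math.log10(1 / 2)
print(f"roughly 10^{round(log10_p)}")  # roughly 10^-4334
```

A probability of about one in 10⁴³³⁴: you could throw dice every second for the lifetime of the universe and never come close to seeing it.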

Statistically speaking, you would expect around half of the 14,398 dice to display even numbers. Moreover, around one in six dice should, for instance, display the number 5. This statistical mindset is one of the concepts that James Clerk Maxwell contributed to the field of mechanics: how likely are different statistical, or macroscopic, properties to be observed, and how can those likelihoods be coupled to the microscopic configurations of the particles the system consists of?
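These expectations are easy to check empirically. A small sketch, simulating one (random but reproducible) throw of all 14,398 dice:

```python
import random

random.seed(0)  # fix the seed so the "throw" is reproducible
rolls = [random.randint(1, 6) for _ in range(14398)]

evens = sum(1 for r in rolls if r % 2 == 0)
fives = sum(1 for r in rolls if r == 5)

print(f"even faces: {evens}, expected around {14398 // 2}")
print(f"fives:      {fives}, expected around {14398 // 6}")
```

The counts will not land exactly on the expected values, but with this many dice the relative deviations are tiny; that reliability is what statistical mechanics trades individual trajectories for.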

This question touches on the definition of entropy: a measure of how many particle configurations lead to the macroscopic state a system is in. Ludwig Boltzmann gave the Second Law of Thermodynamics its statistical form: in any closed system, over time, the macroscopic state the system is in will be one of the most likely ones, based on the number of configurations that lead to it. Or, as Max Planck posed it: entropy never decreases.

Back to our dice: imagine that someone went to the effort of placing every die face-up with the number 4. Starting from this situation, you randomly pick a number of dice and rethrow them. For a collection of fair six-sided dice, the overwhelming majority of configurations have roughly equal counts of each face. A configuration of dice displaying only 4’s therefore corresponds to a situation of very low entropy. Evolving the system by rethrowing dice and letting random motion take its course means that, slowly, the number of 4’s decreases and entropy increases, because that is the overwhelmingly likely outcome.
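This drift towards higher entropy is easy to simulate. A sketch under made-up assumptions (rethrowing a fifth of the dice per round is an arbitrary choice):

```python
import random

random.seed(1)
n = 14398
dice = [4] * n                     # the carefully prepared, low-entropy start
counts_of_fours = []

for _ in range(10):
    # "Random motion": pick a fifth of the dice at random and rethrow them.
    for i in random.sample(range(n), n // 5):
        dice[i] = random.randint(1, 6)
    counts_of_fours.append(dice.count(4))

print(counts_of_fours)  # falls from 14,398 towards n / 6, about 2,400
```

No individual rethrow prefers any face, yet the count of 4’s reliably drops: there are simply vastly more mixed configurations than near-uniform ones.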

The effects of a collective

This phenomenon is the reason tea spreads through a water-filled mug, balloons deflate, and the first video clip in this article looks weird. Mundane as these events may seem, entropy is deeply entrenched in their behaviour.

Of course, many other elements are in play here as well. The field of thermodynamics is notoriously daunting to start in. If you are interested, there are numerous introductions to the topic, both technical tutorials and more accessible books. Other applications of emergent behaviour on this blog can be found here and here.