Things tend to get messed up. We know this truth intuitively, but over the last 200 years, physicists have come to understand it mathematically, too. An enigmatic property known as entropy, which roughly describes how mixed-up a system's parts are, tends always to increase. In other words, things get messier and more mixed up as time goes on. Physicists call this tendency the second law of thermodynamics, but it's more a matter of statistics than physics. Entropy increases because there are more ways to be messy than to be tidy.

Take a drop of ink in a pool of water. Imagine all the possible ways of arranging the ink and water molecules. The arrangements in which the ink molecules all happen to sit together in a tiny teardrop shape are rare in comparison with all the configurations in which the ink molecules can sit anywhere throughout the pool. The second law is simply the statement that the molecules, as they jitter and push each other around, are bound to end up in one of the more likely arrangements (ink molecules dispersed evenly) rather than a special and rare one (ink molecules all clustered together). Essentially all known systems behave in a similar way. Sugar cubes dissolve into coffee. Scents disperse throughout a room. Waves mix a sandcastle's grains back into the beach.

Physicists first started getting to grips with entropy's rise in the mid-1800s, but there is now a more modern way of thinking about it. The way to understand entropy, as Zack Savitsky discussed in his long-form exploration of the concept last year, is as a measure of uncertainty or ignorance. This alternative definition traces to the American mathematician Claude Shannon's 1948 paper establishing a theory of communication. In Shannon's theory, a low-entropy message is one with recognizable structure, analogous to the ink drop. The message abababab…, for instance, is low entropy, and therefore low uncertainty: after learning the first few characters, you can guess how the message continues. A random jumble of characters like fkale93xh..., by contrast, has high entropy. You are much less certain how such a message will continue.

This more general notion of entropy as uncertainty provides further insight into why systems mix themselves up in accordance with the second law of thermodynamics. Consider a deck of cards. When it's fresh out of the package, you can predict the order with total certainty: ace, two, three, et cetera. But as you shuffle the deck, you're less likely to know what card will follow an ace. Your ignorance is bound to increase with each shuffle as the deck finds one of those vastly more numerous, patternless orderings.

The same is true of the universe overall. The most common arrangements of atoms and molecules are uniform ones, where no part contains any information that helps reveal the arrangement anywhere else. So as the universe changes, we're all but guaranteed to find ourselves in the more likely situations: bland, structureless states.
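To make the contrast between the structured and the jumbled message concrete, here is a minimal sketch in Python (not from the article; the function name and the frequency-based estimate are an illustrative simplification). It scores a string by the Shannon entropy of its character frequencies. Counting frequencies ignores the order of the characters, so it understates just how predictable the repeating message really is, but the structured string still comes out well below the jumbled one.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy, in bits per character, estimated from character frequencies."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over characters of p * log2(p), where p is each character's frequency
    return -sum((n / total) * log2(n / total) for n in counts.values())

# The repeating pattern from above, extended: only two characters, each equally
# common, so about 1 bit of uncertainty per character.
print(shannon_entropy("ab" * 40))    # 1.0

# The jumbled example: nine distinct characters, so about log2(9), or roughly
# 3.17 bits of uncertainty per character.
print(shannon_entropy("fkale93xh"))  # ~3.17
```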
What's New and Noteworthy

Some physicists feel dissatisfied with the looseness of the second law of thermodynamics. The present should lead to one future, they say, so why are we talking about the statistics of many possible futures? In recent years, these researchers have made progress on deriving the statistical nature of the second law from absolute quantum principles.

The inexorability and universality of the second law have focused attention on cases that seem to defy it. In 2017 physicists discovered "quantum scars," in which patterns in strings of particles break down but then spontaneously reappear. And other groups have recently constructed a strange type of magnetic order that persists at high temperatures, where entropy typically erases all patterns. Then there's life itself: a highly ordered arrangement of molecules.

But these examples merely exploit the second law's fine print. Quantum scars amount to a very special deck of cards and a very special shuffling technique that allows order to recur, not a general decrease in entropy. And the magnetic order persists by allowing another type of disorder to grow at a greater rate; it survives increasing entropy rather than defying it. Biophysicists say that living organisms do something similar, keeping their own entropy low by greatly increasing entropy in the world around them. Thus, the apparent exceptions help prove the law.

Entropy has also served as a crowbar to crack open one of the universe's blackest boxes: black holes. In the 1970s, the physicist Jacob Bekenstein noticed that black holes seemed to be second-law-violating machines. They could swallow teacups and planets, deleting the entropy of those objects and thereby reducing the entropy of the universe outside. But one should never bet against the second law, Bekenstein reasoned, so black holes must have an entropy of their own that grows as they grow. Many physicists take this to mean that black holes should be made of many pieces that can be rearranged, much as a gas is made of many molecules. Puzzlingly, however, those pieces appear to live on the black hole's surface rather than inside it, a mystery that many physicists consider their most promising clue to the quantum behavior of gravity.

But entropy's most enduring legacy may prove to be the way it helps us define the future itself. One can argue that the "arrow" of time points in the direction of increasing entropy. This insight has fueled research into the quantum origins of time and timekeeping.