High school science class has a tendency to boil the universe down into a series of immutable laws. Newton’s laws of motion, Kirchhoff’s circuit laws, the laws of thermodynamics, etc. — these principles govern the known universe without exception.
Of the aforementioned, one of the most often quoted is the second law of thermodynamics, which is usually given as “entropy always increases,” or some variation thereof. But what is entropy, and why must it always increase?
“Entropy,” like the second law of thermodynamics, is almost always given in poorly-defined, indefinite terms, like “the disorder of a system.” Indeed, Ludwig Boltzmann, one of the progenitors of statistical mechanics, originally characterized entropy as the “amount of chaos” in a closed thermodynamic system. However, he did not stop with such nebulous terminology.
In fact, Boltzmann was the first to quantify entropy in terms of a system’s microscopic states, penning the first derivation for its value in 1872. The exact formula (in classical physics) is S = k ln(W), where k is Boltzmann’s constant and W is generally interpreted as “the number of microstates which have the same prescribed macroscopic properties” for a given closed system.
To translate the above jargon: a system is simply a finite portion of space whose change over time is being studied. A system is closed if and only if no matter or energy is transferred from inside the system to the outside (what thermodynamics more strictly calls an isolated system). A macrostate describes the bulk, general state of a system, such as its temperature and the overall distribution of its particles, while a microstate specifies the state of every individual particle, including its particular energy and arrangement. Many different microstates can correspond to one and the same macrostate.
Boltzmann’s entropy formula, as it came to be known, ties the entropy of a system to its macroscopic properties by counting microstates; that is, entropy grows with the number of microstates consistent with a given macrostate of the system (specifically, with the logarithm of that number).
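To make this counting concrete, the sketch below uses a toy model that is not from the article: ten distinguishable particles, each sitting in either the left or the right half of a box. Each macrostate (“n particles on the left”) is compatible with many microstates, and Boltzmann’s formula turns that count into an entropy.

```python
from math import comb, log

# Boltzmann's constant in joules per kelvin
K_B = 1.380649e-23

def boltzmann_entropy(num_microstates: int) -> float:
    """Boltzmann's formula S = k * ln(W) for a macrostate with W microstates."""
    return K_B * log(num_microstates)

# Toy system (an illustrative assumption, not from the article): 10 distinguishable
# particles, each in the left or right half of a box. A macrostate is
# "n particles on the left"; its microstate count is the binomial coefficient C(10, n).
N = 10
for n_left in range(N + 1):
    W = comb(N, n_left)              # number of microstates for this macrostate
    S = boltzmann_entropy(W)
    print(f"{n_left:2d} on the left: W = {W:3d}, S = {S:.3e} J/K")

# The evenly spread macrostate (5 left, 5 right) has the most microstates,
# and therefore the highest entropy, which is why dispersal is favored.
```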
Herein lies the reason why entropy must always increase. Imagine releasing gas from a cylinder into a room. As the cylinder empties, the room fills, and the gas disperses. The number of microstates (in other words, the possible positions every gas particle can have) is lower for the macrostate of the gas confined to the cylinder than for the gas filling the entire room. Thus, the entropy of the system increases, and no amount of time or dispersive processes will get the gas back into the cylinder.
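As a rough numerical sketch of this argument (the volumes below are illustrative assumptions, not values from the article): for an ideal gas, the number of positional microstates scales roughly as the volume raised to the number of particles, so letting one mole of gas spread from a small cylinder into a large room raises the entropy by about N·k·ln(V_room / V_cylinder).

```python
from math import log

K_B = 1.380649e-23        # Boltzmann's constant, J/K
AVOGADRO = 6.02214076e23  # particles per mole

# Illustrative numbers (assumptions for this sketch): one mole of gas
# expanding from a 10-litre cylinder into a 30,000-litre room.
n_particles = 1 * AVOGADRO
v_cylinder = 0.010    # m^3
v_room = 30.0         # m^3

# For an ideal gas, positional microstates scale as V**N, so
# delta_S = k * ln(W_room / W_cylinder) = N * k * ln(V_room / V_cylinder).
delta_S = n_particles * K_B * log(v_room / v_cylinder)
print(f"Entropy increase: {delta_S:.1f} J/K")  # roughly 66.6 J/K
```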
It therefore becomes easy to imagine that this must always hold true: that entropy increases continuously without bound. However, this is simply not the case; as usual, these classical laws break down at the quantum level.
Quantum physics is the physics of quanta: finite, discrete energy states, usually understood as “packets” of energy, like photons. At the simplest level, it is the physics of systems whose energies exist in discrete units. This interpretation of the world is directly at odds with classical physics, in which values like energy exist on a continuum. At present, this model is the most widely accepted.
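To illustrate what “energies in discrete units” means, a standard textbook example (chosen here as an assumption, not drawn from the article) is a particle confined to a one-dimensional box: its allowed energies form a discrete ladder rather than a continuum.

```python
# Illustrative textbook example (an assumption for this sketch): the allowed
# energies of a particle confined to a 1-D box of width L are quantized as
# E_n = n^2 * h^2 / (8 * m * L^2), for n = 1, 2, 3, ...
H = 6.62607015e-34   # Planck's constant, J*s
M_E = 9.1093837e-31  # electron mass, kg
L = 1e-9             # box width: 1 nanometre

def energy_level(n: int) -> float:
    """Energy of the n-th level of a particle in a 1-D box, in joules."""
    return n**2 * H**2 / (8 * M_E * L**2)

# Only these discrete values are allowed; nothing in between.
for n in range(1, 5):
    print(f"n = {n}: E = {energy_level(n):.2e} J")
```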
Quantum physics has several odd (and interesting) ramifications, the most famous of which is superposition: quanta can exist in several states at once until they are directly “observed,” usually by laboratory apparatus.
Though superposition can theoretically manifest on the macroscopic level, the term is usually reserved for much smaller quanta, which can, for instance, exist in several places at once until directly observed, as demonstrated by single-particle versions of Thomas Young’s double-slit experiment.
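One minimal way to picture a superposition, sketched below with made-up amplitudes rather than anything from the article, is a two-state system described by two complex amplitudes; the squared magnitudes give the probabilities of each outcome when the system is observed, and an observation leaves it in just one definite state.

```python
import random
from math import sqrt

# Minimal sketch of a two-state superposition (the amplitudes are illustrative
# assumptions). The state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1.
alpha = complex(1 / sqrt(2), 0)   # amplitude for the first state (e.g. "slit A")
beta = complex(0, 1 / sqrt(2))    # amplitude for the second state (e.g. "slit B")

p_first = abs(alpha) ** 2   # Born rule: probability of observing the first state
p_second = abs(beta) ** 2   # probability of observing the second state
print(f"P(first) = {p_first:.2f}, P(second) = {p_second:.2f}")

def measure() -> str:
    """Simulate one observation: the superposition resolves to a single outcome."""
    return "first state" if random.random() < p_first else "second state"

# Before measurement the system is described by both amplitudes at once;
# each measurement yields exactly one definite outcome.
print([measure() for _ in range(5)])
```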
Superposition can affect not only position but also time. Certain interactions on very small time scales can, in fact, exist in a superposition of moving forward and backward through time simultaneously.
This is not to say, however, that particles can time travel. Time itself can be defined by entropy, with the direction of increasing entropy being the direction in which we define time to be moving forward. Nor is this to say that we may reverse entropy; the second law of thermodynamics, so far as we know, holds without exception.
Rubino et al. show that under very specific circumstances the flow of time may be muddled; that is, it is possible to put a particle into a superposition in which the direction of time’s flow is no longer definite. However, measuring entropy breaks the superposition and, invariably, time continues forward, with the particle carried along the path of increasing entropy.
While, on the microscopic level, entropy may become periodically suspended in time, it will eventually resume and continue in the direction of increasing entropy on the macroscopic level, preserving the second law of thermodynamics.
Thus, while not exactly immutable, the second law of thermodynamics still stands as a benchmark for the behavior of macroscopic thermodynamic systems; like all “universal laws,” it is derived from observation and matures through experimental study, all in the pursuit of understanding.