One of the most fundamental laws of thermodynamics, the second law, states that the entropy of an isolated system never decreases. But what exactly is entropy? Simply put, entropy is a measure of randomness, or of how much of a system’s energy is unavailable to do useful work.
One of the first formal treatments of this idea was published in the 19th century by Lazare Carnot (Sadi Carnot’s father) in his book titled “Fundamental Principles of Equilibrium and Movement.” In his book, Carnot argues that any moving engine loses “moment of activity,” that is, energy able to do useful work. Entropy can be decreased locally, meaning that, if some force or “agent” acts upon an open system, it can decrease the entropy of that system, but only at the cost of increasing the entropy of the surroundings, so the total entropy of the larger closed system still rises.
In the late 1860s, physicists such as James Clerk Maxwell and Ludwig Boltzmann began to use probability to explain the fundamental principles of entropy. Entropy was characterized in terms of the number of microstates, the distinct microscopic arrangements of particles consistent with a system’s macroscopic state. The probabilistic nature of this definition leads to some puzzling questions. If the increase of entropy is only statistical, over what length of time does the second law have to hold? By sheer chance, the entropy of a system can momentarily decrease, even though in the long run it will almost certainly increase. Can we arbitrarily define a period of time, or say that the laws governing entropy have been violated during such a fluctuation?
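To make the statistical picture concrete, here is a minimal Python sketch (my illustration, not from the original text) that lets particles wander between the two halves of a box and tracks the Boltzmann entropy S = k_B ln W of the left/right split. The particle count and step count are arbitrary; with only 20 particles, the entropy visibly dips from time to time, which is exactly the kind of chance fluctuation described above.

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_left, n_total):
    """S = k_B * ln(W), where W = C(n_total, n_left) counts the microstates
    (which particular particles sit on the left) for the macrostate n_left."""
    w = math.comb(n_total, n_left)
    return K_B * math.log(w)

def simulate(n_particles=20, n_steps=2000, seed=0):
    rng = random.Random(seed)
    left = [True] * n_particles          # start far from equilibrium: all particles on the left
    history = [boltzmann_entropy(sum(left), n_particles)]
    for _ in range(n_steps):
        i = rng.randrange(n_particles)   # a random particle wanders...
        left[i] = rng.random() < 0.5     # ...into a randomly chosen half
        history.append(boltzmann_entropy(sum(left), n_particles))
    return history

if __name__ == "__main__":
    s = simulate()
    print(f"initial entropy: {s[0]:.3e} J/K")
    print(f"final entropy:   {s[-1]:.3e} J/K")
    # Count the steps where entropy momentarily decreased: allowed by chance,
    # even though the long-run trend is upward.
    dips = sum(1 for a, b in zip(s, s[1:]) if b < a)
    print(f"steps with a momentary decrease: {dips} of {len(s) - 1}")
```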
Even more complications in the theory of entropy arose when James Clerk Maxwell proposed his now-famous thought experiment: Maxwell’s Demon. In this thought experiment, a gas is enclosed in a box with a partition that can be opened and closed at will by a “demon.” The all-knowing “demon” opens the partition at just the right times to allow only the fast particles into one side and only the slow particles into the other. Because temperature is directly related to the average speed of the particles, the hot side of the box gets hotter while the cold side gets colder. Taken at face value, this would violate the second law of thermodynamics by decreasing the entropy of the system.
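A toy version of the demon’s sorting can be written in a few lines. The sketch below is an illustration under simplifying assumptions, not Maxwell’s own formulation: it draws particle speeds from one well-mixed distribution, lets an all-knowing demon send the faster half to one side and the slower half to the other, and uses the mean squared speed as a stand-in for temperature.

```python
import random
import statistics

def demon_sort(n_particles=1000, seed=1):
    """A toy Maxwell's demon: it knows every particle's speed and lets only
    fast particles into side A and only slow particles into side B."""
    rng = random.Random(seed)
    # One well-mixed gas: every speed drawn from the same distribution.
    speeds = [abs(rng.gauss(0.0, 1.0)) for _ in range(n_particles)]
    cutoff = statistics.median(speeds)
    hot = [v for v in speeds if v > cutoff]    # the demon waves these into side A
    cold = [v for v in speeds if v <= cutoff]  # ...and these into side B

    def mean_sq(side):
        # Temperature is proportional to the mean squared speed (kinetic energy).
        return statistics.fmean(v * v for v in side)

    return mean_sq(speeds), mean_sq(hot), mean_sq(cold)

if __name__ == "__main__":
    t_all, t_hot, t_cold = demon_sort()
    print(f"well-mixed gas: T ~ {t_all:.3f}")
    print(f"hot side:       T ~ {t_hot:.3f}")   # hotter than the original gas
    print(f"cold side:      T ~ {t_cold:.3f}")  # colder than the original gas
```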
In 1929, Leo Szilard, a Hungarian physicist who later worked on the Manhattan Project, published a now-famous response to Maxwell’s Demon. In any conceivable real-world system, the demon would need to measure the speed and direction of the particles to know when to open the partition. The energy the demon expends taking those measurements means that entropy increases overall, even if the entropy of the box itself decreases. This raised a point that is now fundamental to thermodynamics and information theory: acquiring information is itself work and expends energy.
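Szilard’s argument, later sharpened into Landauer’s principle, attaches a concrete number to this cost: handling one bit of information requires at least k_B·T·ln 2 of energy. The quick calculation below evaluates that bound at an assumed room temperature of 300 K.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def min_energy_per_bit(temperature_kelvin):
    """Szilard/Landauer bound: the minimum thermodynamic cost of handling one bit."""
    return K_B * temperature_kelvin * math.log(2)

if __name__ == "__main__":
    t = 300.0  # roughly room temperature
    cost = min_energy_per_bit(t)
    print(f"At {t:.0f} K, one bit costs at least {cost:.2e} J")
    # ~2.87e-21 J per bit: tiny, but it guarantees the demon's bookkeeping
    # creates at least as much entropy as its sorting removes.
```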
Entropy is applicable to virtually everything we experience in our daily lives: boiling water, pistons in an engine, heat from the sun. It shapes the thermodynamic properties of all the substances we interact with. For example, when salt is added to water, the water boils at a higher temperature. This is because dissolving the salt increases the entropy of the liquid water, so the entropy difference between liquid water and water vapor shrinks. With less entropy to gain, there is less “incentive” for the water to vaporize.
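As a rough illustration of that lost “incentive,” the size of the effect can be estimated with the standard colligative-property formula ΔT_b = i·K_b·m. The sketch below assumes water’s ebullioscopic constant of about 0.512 K·kg/mol and treats table salt as dissociating completely into two ions; these are textbook values chosen for illustration, not figures from the article.

```python
# Estimate the boiling-point elevation of salt water with dT = i * K_b * m
# (van 't Hoff factor, ebullioscopic constant, molality).

K_B_WATER = 0.512        # ebullioscopic constant of water, K·kg/mol (textbook value)
MOLAR_MASS_NACL = 58.44  # g/mol

def boiling_point_elevation(grams_salt, kg_water, i=2):
    """Return the rise in boiling point (kelvin) for NaCl dissolved in water.
    i = 2 assumes the salt dissociates completely into Na+ and Cl- ions."""
    molality = (grams_salt / MOLAR_MASS_NACL) / kg_water  # mol solute per kg water
    return i * K_B_WATER * molality

if __name__ == "__main__":
    # A heavily salted pot: 58 g of salt in 1 kg (about a litre) of water.
    dt = boiling_point_elevation(58.0, 1.0)
    print(f"Boiling point rises by about {dt:.2f} K, to roughly {100 + dt:.2f} °C")
```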
But entropy is far more profound than just heating or cooling. In its purest sense, entropy describes an inherent, one-way decay in the universe. The universe began in a highly ordered, low-entropy state; as entropy rises, that order, and the ability to do useful work, is lost and can never be regained, and the universe will eventually degenerate into a featureless “heat death.” In fact, entropy gives time its arrow, distinguishing the past from the future.
The concept of entropy is still an area of intense research, and physicists continue to probe its limits and search for exceptions. Learning more about entropy has far-reaching implications, from the behavior of condensed matter to quantum computing.
Image Source: Piqsels