
Monday, May 18, 2020

Book Reviewed: Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature by Arieh Ben-Naim

Entropy is not associated with Time's Arrow

The second law of thermodynamics can be expressed in many ways; the simplest is that heat flows naturally from a hotter to a colder body. One outcome of this simple universal property is that the natural order of events is a transformation of a physical state from an ordered to a disordered state; from a state with more useful energy into one with less useful energy; from a system with more information to one with less information; and from a system with slow-moving molecules to one with fast-moving molecules. Therefore, for a closed system like this universe, the entropy, or chaos, is always increasing. This sets an arrow of time for events, which run from past to future and never in the reverse direction, a distinct feature of the second law that sets it apart from classical and quantum physics, which are time-reversible.
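To make that simplest statement concrete, here is a minimal sketch (my own illustration, not from the book) of why heat flowing from hot to cold raises the total entropy: the cold body gains more entropy (dQ/T_cold) than the hot body loses (dQ/T_hot), simply because T_cold < T_hot.

def total_entropy_change(T_hot, T_cold, dQ):
    # Entropy change of the combined system when heat dQ leaves the body
    # at T_hot and enters the body at T_cold (temperatures in kelvin, dQ in joules).
    return -dQ / T_hot + dQ / T_cold

# Heat flowing the natural way (hot -> cold) gives a positive total change...
print(total_entropy_change(400.0, 300.0, 1.0))   # ~ +8.3e-4 J/K
# ...while the reverse direction would lower the total entropy,
# which is never observed to happen spontaneously.
print(total_entropy_change(300.0, 400.0, 1.0))   # ~ -8.3e-4 J/K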

Boltzmann was the first to associate entropy with disorder, but he did not "equate" entropy with disorder. The second law is also known to be probabilistic, but Boltzmann did not elaborate on which probability it operates on: is it a state changing from lower to higher probability, or from a high probability towards a maximum value?
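To see what the probabilistic reading means (this is my example, not the author's), picture N particles, each free to be in the left or right half of a box. The macrostate with half the particles on each side has by far the most microstates, so a system prepared with all particles on one side is overwhelmingly likely to be found moving towards that maximum-probability state.

from math import comb

N = 100
W = [comb(N, n) for n in range(N + 1)]   # microstates with n particles on the left
total = 2 ** N                           # all microstates, each equally likely

print(W[N // 2] / total)   # ~0.08   -- the 50/50 macrostate, the single most probable
print(W[0] / total)        # ~7.9e-31 -- all particles on one side, astronomically rare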

The author makes a distinction between Shannon's measure of information (SMI) and entropy: entropy refers to macroscopic systems at equilibrium, while SMI applies to all other systems, containing either a small or large number of particles, near or far from equilibrium. Thus, instead of saying that the entropy increases with time and then reaches a maximum at equilibrium, it is shown that entropy is proportional to the maximum of the SMI over the probability distributions of the locations and momenta of the particles. The author argues that it is the SMI, not the entropy, that evolves to a maximum value as the system reaches equilibrium. His conjecture is that entropy's association with time is a misunderstanding: once entropy is redefined in terms of SMI, it becomes clear that it is not directly connected to the flow of time.
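As a rough sketch of what SMI measures (the numbers below are mine, for illustration only): for a discrete distribution over, say, which region of a box a particle occupies, SMI is the Shannon expression -sum(p_i * log2 p_i), and it is largest for the uniform distribution, the distribution that describes the system at equilibrium.

import math

def smi(p):
    # Shannon measure of information (in bits) for a discrete distribution p.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

non_equilibrium = [0.7, 0.1, 0.1, 0.1]       # particles bunched into one region
equilibrium     = [0.25, 0.25, 0.25, 0.25]   # uniform over the four regions

print(smi(non_equilibrium))   # ~1.36 bits
print(smi(equilibrium))       # 2.0 bits -- the maximum for four outcomes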

This is a thoroughly original idea, and the author has a fascinating way of narrating his theory. Very readable and stimulating.
