Entropy


Full Title or Meme

The variability that remains unknown to us at any one point in time.

Context

  • Entropy is often treated as synonymous with chaos and disorder. This page considers entropy to be the lack of information. Some of the structure of this section comes from Entropy is not disorder: A physicist's perspective.
  • The real progenitor of Entropy is the Second Law of Thermodynamics, which has been characterized as foretelling the heat death of the universe.
  • The fact that Entropy has been low in the past leads to the observation that relics of the past exist in the present, while the future is unknowable to us mere mortals. To leave a trace, the events of the past must have been frozen in place. This can happen only in irreversible processes which, once frozen, cannot be undone.
  • The real paradox is the struggle between order and chaos: between the Doing and the Undoing.

Understanding Entropy’s Definition

  • From the “invisible force” to the “harbinger of chaos,” you may have heard quite a few sensational phrases describing entropy.
  • Ludwig Boltzmann defined entropy as a measure of the number of possible microscopic states of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties, which constitute the overall state of the system.
  • Richard Feynman defined[1] entropy as a measure of the energy that is no longer available; in other words, entropy increases as the available energy decreases.

The definitions seem quite simple. However, the devil lies in the details: what do we mean by "the number of states," and how do we count that number? Intuitively, it seems like a system should only ever be in one state, so the count N would always be 1, implying that the entropy is always 0!

To have a useful definition of entropy, we need to look beyond our system. Instead of always counting one state, we count the number of states that are similar to our system in some ways. In other words, given the quantities that are relevant, entropy counts the number of states that share the same relevant quantities, while ignoring the irrelevant ones. Thus, to compute entropy, we must first separate the details of a system into two categories: relevant and irrelevant. Entropy then captures the amount of irrelevant detail in a system. This means entropy quantifies our ignorance!

Let's go through an example. Imagine our physical system is described by 100 digits:

7607112435 2843082689 9802682604 6187032954 4947902850 1573993961 7914882059 7238334626 4832397985 3562951413

These look like seemingly random numbers. So if the only relevant information is that there are 100 digits, and the precise digits are irrelevant, the entropy would simply be

S = log N = log(10^100) = 100 log 10, since there are N = 10^100 possible strings of 100 digits.

However, a careful reader might recognize that these seemingly random numbers are the first 100 digits of pi in reverse! If I tell you that our system is given exactly by the digits of pi, there would be only one possible state that can describe this system, and the entropy would be 0! Of course, the entropy computed for this system is rather useless, because I have not introduced any interaction that can cause the system to change. But the example serves as a nice illustration that entropy is ill-defined until we decide which quantities are relevant and which are irrelevant.
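
To make the counting explicit, here is a minimal Python sketch of the example above (the function name and the use of natural logarithms are my own choices, not something from this page):

  import math

  def entropy_of_digit_string(length, base=10):
      # Only the length of the string is "relevant"; the exact digits are not,
      # so every one of base**length strings counts as a possible state.
      num_states = base ** length
      return math.log(num_states)        # S = log N

  print(entropy_of_digit_string(100))    # 100 * log(10) ≈ 230.3 nats
  # If the exact digits are declared relevant ("the first 100 digits of pi,
  # reversed"), only one state matches, and the entropy collapses to zero:
  print(math.log(1))                     # 0.0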

Relevant vs. Irrelevant

So how do we draw this line between what is relevant and what is irrelevant? This is where physics comes in:

  • Relevant quantities: macroscopic quantities that don't change very much over time. They are typically measurable and controllable in an experiment (e.g., total energy, volume, number of particles).
  • Irrelevant quantities: microscopic quantities that change rapidly over time. These are typically not easily measurable and can appear seemingly random (e.g., the specific locations and velocities of each particle).

As it turns out, the properties of most systems can be cleanly broken down into these two categories. However, what counts as fast is subjective: intermolecular interactions happen faster than the blink of an eye, whereas intergalactic movements span millennia. In the few cases where we cannot cleanly separate the different physical quantities, we simply state that the system is not in thermal equilibrium and that entropy is ill-defined! In summary, the more sophisticated definition of entropy is: given a system, entropy measures our ignorance of the irrelevant quantities, given the relevant ones.
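
To make this split concrete, here is a toy sketch (Python; the coin-toss system is an invented example, not something from this page): take the total number of heads as the relevant macroscopic quantity, the exact sequence of tosses as the irrelevant microscopic detail, and count the microstates compatible with the macrostate.

  import math

  def macrostate_entropy(n_tosses, n_heads):
      # Relevant (macroscopic) quantity: the total number of heads.
      # Irrelevant (microscopic) detail: which particular tosses were heads.
      # Entropy = log of the number of microstates sharing that macrostate.
      num_microstates = math.comb(n_tosses, n_heads)
      return math.log(num_microstates)

  print(macrostate_entropy(100, 50))   # many compatible microstates: ≈ 66.8 nats
  print(macrostate_entropy(100, 0))    # a single compatible microstate: 0.0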

Confronting Misuses of Entropy

Armed with our newfound understanding of entropy, let's confront some common misuses.

Say there is a huge mess on the floor. Does a messy floor imply a high entropy? That is a trick question! Just like the digits-of-pi example above, the question is ill-defined. We simply cannot compute an entropy meaningfully until we know what the relevant and irrelevant variables are. Indeed, the mess on the floor could be a delicate piece of art, and who's to say a particular splotch of paint wasn't the result of a carefully planned masterpiece?

Here's another common misuse. Say my ceramics fell on the floor. Did entropy increase? The answer is probably yes, but not because the pieces are now arranged in a more chaotic way. Entropy is always increasing in everyday physical processes; the fact that the pieces of ceramics are separated instead of stuck together doesn't contribute much to the notion of entropy. Think back to the previous discussion of relevant vs. irrelevant quantities: are the ceramic pieces rapidly changing and rearranging in such a way that the system thermalizes? No. So it doesn't make much sense to associate entropy with the pattern of broken pieces of ceramics. If we were to imagine some strange simulation where the exact same piece of ceramics breaks over and over again, and we wanted to pretend there is some sort of thermalization process, we could try to create a notion of entropy, but it would be an ad-hoc notion at best.

So, we now know that entropy doesn't capture some objective, fundamental notion of disorder. Why, then, do we associate entropy with chaos and disorder?

Relationship to Statistics and Information

To get a better understanding, we need to make a connection to statistics. Given a probability distribution p, we can compute a quantity called the information entropy H, which measures how random the given probability distribution is. It is given by

H = − Σ_i p_i log p_i

where the sum runs over all N possible outcomes that the probability distribution describes.

How does this information entropy relate to the physicist's entropy? The information entropy is maximized when the distribution is uniform, so that each probability is 1/N. The maximum value is

H_max = log N

We see that the maximum information entropy is the same as the physicist's definition of entropy! This tells us that entropy captures the randomness of the irrelevant parts of a system when we pretend that those irrelevant parts are described by a uniform distribution. Armed with all this knowledge, we can summarize what entropy really computes:

  1. Take a system; we first divide all physical quantities into two categories: relevant and irrelevant
  2. We assume that the irrelevant quantities behave like random variables drawn from a uniform distribution
  3. The entropy captures our ignorance of these irrelevant variables

So that’s it! Entropy can only be computed when we enforce an approximate statistical view on a system. However, no physical system literally follows these statistical laws (not even quantum mechanical ones). So, entropy serves as a measure of the apparent “disorder” due to our incomplete knowledge of the world.
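
A small check of this claim (a minimal Python sketch; the example distributions are invented for illustration) shows that the information entropy of a uniform distribution over N outcomes equals log N, while any non-uniform distribution scores lower:

  import math

  def information_entropy(p):
      # H = -sum_i p_i log p_i over the outcomes of the distribution p.
      return -sum(pi * math.log(pi) for pi in p if pi > 0)

  N = 4
  uniform = [1 / N] * N              # maximal ignorance about the outcome
  skewed = [0.7, 0.1, 0.1, 0.1]      # partial knowledge about the outcome

  print(information_entropy(uniform))  # log 4 ≈ 1.386, the maximum
  print(information_entropy(skewed))   # ≈ 0.940, strictly below log N
  print(math.log(N))                   # the physicist's log N, for comparison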

Biological Diversity

Leinster's ebook brings new mathematical rigor to the ongoing vigorous debate on how to quantify biological diversity. The question "what is diversity?" has surprising mathematical depth, and breadth too: this book involves parts of mathematics ranging from information theory, functional equations and probability theory to category theory, geometric measure theory and number theory. It applies the power of the axiomatic method to a biological problem of pressing concern, but the new concepts and theorems are also motivated from a purely mathematical perspective. The main narrative thread requires no more than an undergraduate course in analysis. No familiarity with entropy or diversity is assumed.[2]
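
One standard point of contact between entropy and diversity (a minimal sketch with invented abundance data, not specific to Leinster's axiomatic development) is the "effective number of species": the exponential of the Shannon entropy of the relative abundances.

  import math

  def effective_number_of_species(abundances):
      # Shannon diversity: exponential of the Shannon entropy of the
      # relative abundances (one common index; the data below are invented).
      total = sum(abundances)
      p = [a / total for a in abundances]
      h = -sum(pi * math.log(pi) for pi in p if pi > 0)
      return math.exp(h)

  print(effective_number_of_species([25, 25, 25, 25]))  # four equally common species -> 4.0
  print(effective_number_of_species([97, 1, 1, 1]))     # one dominant species -> ≈ 1.18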

Energy Photons

There is little mention of the fact that radiation contains entropy as well as energy, with different spectral distributions. Whereas the energy function has been deeply studied, the radiation entropy distribution has not been analyzed at the same depth. The mode of the energy distribution is well known (Wien's law), and Planck's law has been analytically integrated recently, but no similar advances have been made for the entropy. This paper focuses on characterizing the entropy of the radiation distribution from a statistical perspective: obtaining a Wien-like law for the mode, integrating the entropy for the median and the mean in polylogarithms, and calculating the variance, skewness and kurtosis of the function. Once these features are known, the increasing importance of radiation entropy analysis is evidenced in three interdisciplinary applications: defining and determining the second-law Photosynthetically Active Radiation (PAR) region efficiency, measuring the entropy production in the Earth's atmosphere, and showing how the evolution of human vision was driven by the entropy content of radiation.[3]
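
For orientation, the well-known energy-side result the abstract refers to, Wien's displacement law, is easy to sketch (Python; the temperatures are arbitrary examples, and the analogous law the paper derives for the entropy spectrum is not reproduced here):

  # Wien's displacement law for the energy spectrum: lambda_max = b / T,
  # with b ≈ 2.898e-3 m*K.
  WIEN_B = 2.897771955e-3  # displacement constant, metre-kelvin

  def wien_peak_wavelength(temperature_kelvin):
      # Wavelength (in metres) at which black-body spectral radiance peaks.
      return WIEN_B / temperature_kelvin

  print(wien_peak_wavelength(5778))  # the Sun: ≈ 5.0e-7 m, i.e. visible light
  print(wien_peak_wavelength(300))   # room temperature: ≈ 9.7e-6 m, infrared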

An algorithm is said to take logarithmic time if T(n) = O(log n).

An algorithm is said to run in polylogarithmic time if T(n) = O((log n)^k), for some constant k.
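
As a standard illustration of logarithmic time (a generic sketch, not tied to the cited paper), binary search halves the remaining range at each step, so it needs O(log n) comparisons:

  def binary_search(sorted_items, target):
      # Each iteration halves the remaining interval, so the loop body
      # runs O(log n) times for a list of length n.
      lo, hi = 0, len(sorted_items) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if sorted_items[mid] == target:
              return mid
          if sorted_items[mid] < target:
              lo = mid + 1
          else:
              hi = mid - 1
      return -1

  print(binary_search(list(range(0, 1000, 2)), 358))  # -> index 179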

Problems

If an asymmetry in time does not arise from the fundamental dynamical laws of physics, it may be found in special Boundary Conditions. The argument normally goes that since thermodynamic entropy in the past is lower than in the future according to the Second Law of Thermodynamics, then tracing this back to the time around the Big Bang means the universe must have started off in a state of very low thermodynamic entropy: the Thermodynamic Past Hypothesis. In this paper, we consider another Boundary Condition that plays a similar role, but for the decoherent arrow of time, i.e. the quantum state of the universe is more mixed in the future than in the past. According to what we call the Entanglement Past Hypothesis, the initial quantum state of the universe had very low entanglement entropy. We clarify the content of the Entanglement Past Hypothesis, compare it with the Thermodynamic Past Hypothesis, and identify some challenges and open questions for future research.[4]
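
To make "entanglement entropy" concrete, here is a minimal sketch (Python with NumPy; the Bell-state example is a textbook standard, not drawn from the cited paper) computing the von Neumann entropy S = −Tr(ρ log ρ) of a qubit's reduced density matrix:

  import numpy as np

  def von_neumann_entropy(rho):
      # S = -Tr(rho log rho), computed from the eigenvalues of rho.
      eigenvalues = np.linalg.eigvalsh(rho)
      return -sum(x * np.log(x) for x in eigenvalues if x > 1e-12)

  # Reduced state of one qubit of the product state |00>: pure, entropy 0.
  rho_product = np.array([[1.0, 0.0], [0.0, 0.0]])
  print(von_neumann_entropy(rho_product))   # 0.0

  # Reduced state of one qubit of the Bell state (|00> + |11>)/sqrt(2):
  # maximally mixed, so the entanglement entropy is log 2.
  rho_bell = np.array([[0.5, 0.0], [0.0, 0.5]])
  print(von_neumann_entropy(rho_bell))      # ≈ 0.693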

Principle of Maximum Rate of Entropy Production

Garth W. Paltridge researched topics such as the optimum design of plants and the economics of climate forecasting, and worked on atmospheric radiation and the theoretical basis of climate. Paltridge introduced the subsequently disputed hypothesis that the earth/atmosphere climate system adopts a configuration that maximizes its rate of thermodynamic dissipation, i.e., entropy production. This suggests a governing constraint by a principle of maximum rate of entropy production. According to this principle, prediction of the broad-scale steady-state distribution of cloud, temperature, and energy flows in the ocean and atmosphere may be possible when one has sufficient data about the system for that purpose but does not have fully detailed data about every variable of the system.[5] This principle motivated Tim Palmer to switch from physics to meteorology, with the idea that it might turn into a fantastic unifying principle.[6]

References

  1. Richard Feynman, The Character of Physical Law, p 1212
  2. Tom Leinster, Entropy and Diversity: The Axiomatic Approach (2020) https://arxiv.org/abs/2012.02113
  3. Alfonso Delgado-Bonal, Entropy of radiation: the unseen side of light, Scientific Reports (2017-05-17) https://www.nature.com/articles/s41598-017-01622-6
  4. Jim Al-Khalili and Eddy Keming Chen, The Decoherent Arrow of Time and the Entanglement Past Hypothesis (2024-05-07) https://philpapers.org/archive/ALKTDA.pdf
  5. G. W. Paltridge, The steady-state format of global climate, Quarterly Journal of the Royal Meteorological Society 104 (442): 927–945 (1978). Bibcode:1978QJRMS.104..927P. doi:10.1002/qj.49710444206
  6. Tim Palmer, The Primacy of Doubt (2022) ISBN 9781541619715

Other Material