Entropy
Full Title or Meme
The variability that we do not know at any one point in time.
Context
- Entropy is often treated as synonymous with chaos and disorder. This page considers entropy to be the lack of information. Some of the structure of this section comes from Entropy is not disorder: A physicist's perspective.
- The real progenitor of entropy is the Second Law of Thermodynamics, which has been characterized as foretelling the heat death of the universe.
- The real paradox is the struggle between order and chaos - between the Doing and Undoing.
Understanding Entropy’s Definition
- From the “invisible force” to the “harbinger of chaos,” you may have heard quite a few sensational phrases describing entropy.
- Ludwig Boltzmann defined entropy as a measure of the number of possible microscopic states of a system in thermodynamic equilibrium, consistent with its macroscopic thermodynamic properties, which constitute the overall state of the system.
- Richard Feynman defined[1] entropy as a measure of unavailable energy; in other words, entropy increases as the available energy decreases.
The definitions seem quite simple. However, the devil lies in the details: What do we mean by “the number of states?” How do we count this number? Intuitively, it seems like a system should only be in one state, so N would always be 1, implying that entropy is always 0!
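For concreteness, Boltzmann's definition is often written as the formula below (a standard textbook form, not quoted from the sources above), where S is the entropy, k_B is Boltzmann's constant, and N is the number of microscopic states consistent with what we know about the system:

S = k_B \log N

If only one state is compatible with our knowledge, then log 1 = 0 and the entropy vanishes, which is exactly the puzzle raised above. The rest of this page drops the constant and simply writes the entropy of N equally likely states as log N.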
To have a useful definition of entropy, we need to look beyond our system. Instead of always counting one state, we count the number of states that are similar to our system in some ways. In other words, given the quantities that are relevant, entropy counts the number of states that share the same relevant quantities while ignoring the irrelevant ones. Thus, to compute entropy, we must first separate the details of a system into two categories: relevant and irrelevant. Entropy then captures the amount of irrelevant detail in a system. This means entropy quantifies our ignorance!

Let's go through an example. Imagine our physical system is described by the following 100 digits:

7607112435 2843082689 9802682604 6187032954 4947902850
1573993961 7914882059 7238334626 4832397985 3562951413

These look like seemingly random numbers. So if the only relevant information is that there are 100 digits, and the precise digits are irrelevant, the entropy would simply be
S = \log N = \log 10^{100} = 100 \log 10
However, a careful reader might recognize that these seemingly random numbers are the first 100 digits of pi in reverse! If I tell you that our system is given exactly by the reversed digits of pi, there is only one possible state that can describe it, and the entropy would be 0! Of course, the entropy computed for this system is rather useless, because I have not introduced any interaction that can cause the system to change. But this example serves as a nice illustration that entropy is ill-defined until we decide which quantities are relevant and which are irrelevant.
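To make the counting concrete, here is a minimal Python sketch of the two cases above (the use of the natural logarithm and the variable names are illustrative assumptions, not part of the original example):

import math

# The 100-digit string from the example above (the first 100 digits of pi, reversed).
digits = ("7607112435" "2843082689" "9802682604" "6187032954" "4947902850"
          "1573993961" "7914882059" "7238334626" "4832397985" "3562951413")

# Case 1: only "there are 100 digits" is relevant; the precise digits are irrelevant.
# Each digit could be anything from 0 to 9, so N = 10**100 states are compatible.
n_compatible = 10 ** len(digits)
print("digits irrelevant: S =", math.log(n_compatible))   # 100 * log(10), about 230.26 nats

# Case 2: the exact digit string is declared relevant, so only one state is compatible.
print("digits relevant:   S =", math.log(1))               # 0.0

The only thing that changes between the two cases is which details we declare relevant; the digit string itself never changes.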
Relevant vs. Irrelevant
So how do we draw this line between what is relevant versus irrelevant? This is where physics comes in:

- Relevant quantities: macroscopic quantities that do not change very much over time. They are typically measurable and controllable in an experiment (e.g., total energy, volume, number of particles).
- Irrelevant quantities: microscopic quantities that change rapidly over time. They are typically not easily measurable and can appear seemingly random (e.g., the specific position and velocity of each particle).

As it turns out, the properties of most systems can be cleanly broken down into these two categories. However, what counts as fast is subjective: intermolecular interactions happen faster than the blink of an eye, whereas intergalactic movements span millennia. In the few cases where we cannot cleanly separate the different physical quantities, we simply state that the system is not in thermal equilibrium and that entropy is ill-defined.

In summary, the more sophisticated definition of entropy is: given a system, entropy measures our ignorance of the irrelevant quantities once the relevant quantities are fixed.
Confronting Misuses of Entropy
Armed with our newfound understanding of entropy, let's confront some common misuses.

Say there is a huge mess on the floor, like the paint splatter pictured below. Is the entropy high or low?

(Image of paint splatter: does a messy floor imply a high entropy?)

That is a trick question! Just like the digits-of-pi example above, the question is ill-defined. We simply cannot compute an entropy meaningfully until we know what the relevant and irrelevant variables are. Indeed, the mess on the floor could be a delicate piece of art, and who's to say a particular splotch of paint wasn't the result of a carefully planned masterpiece?

Here's another common misuse. Say my ceramics fell on the floor and shattered. Did entropy increase? The answer is probably yes, but not because the pieces are arranged in a more chaotic way. Entropy increases in essentially every everyday physical process; the fact that the pieces of ceramic are separated instead of stuck together doesn't contribute much to it.

Think back to the discussion of relevant vs. irrelevant quantities. Are the ceramic pieces rapidly changing and rearranging in such a way that the system thermalizes? No. So it doesn't make much sense to associate entropy with the pattern of the broken pieces. If we were to imagine some strange simulation in which the same piece of ceramic breaks over and over again, and we pretended that there is some sort of thermalization process, we could try to construct a notion of entropy for the arrangement of pieces, but it would be an ad-hoc notion at best.

So we now know that entropy doesn't capture some objective, fundamental notion of disorder. Why, then, do we associate entropy with chaos and disorder?
Relationship to statistics and information
To get a better understanding, we need to make a connection to statistics. Given a probability distribution p, we can compute a quantity called the information entropy H, which measures how random the distribution is:

H = -\sum_{i=1}^{N} p_i \log p_i

where the sum runs over all N possible outcomes the distribution describes.

How does this information entropy relate to the physicist's entropy? The information entropy is maximized when the distribution is uniform, so that each probability is 1/N. In that case

H_{\max} = -\sum_{i=1}^{N} \frac{1}{N} \log \frac{1}{N} = \log N

We see that the maximum information entropy is the same as the physicist's entropy! This tells us that entropy captures the randomness of the irrelevant parts of a system when we pretend that those irrelevant parts are described by a uniform distribution.

Armed with all this knowledge, we can summarize what entropy really computes:
- Take a system; we first divide all physical quantities into two categories: relevant and irrelevant
- We assume that the irrelevant quantities behave like random variables drawn from a uniform distribution
- The entropy captures our ignorance of these irrelevant variables
So that’s it! Entropy can only be computed when we enforce an approximate statistical view on a system. However, no physical system literally follows these statistical laws (not even quantum mechanical ones). So, entropy serves as a measure of the apparent “disorder” due to our incomplete knowledge of the world.
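As a small illustration of the information entropy formula above, here is a minimal Python sketch (the natural logarithm, N = 10, and the example distributions are assumptions made for illustration, not part of the original text):

import math

def information_entropy(p):
    """Shannon entropy H = -sum_i p_i log p_i, using the natural log and skipping zero terms."""
    return -sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

N = 10

# A uniform distribution over N outcomes maximizes H and gives exactly log N.
uniform = [1 / N] * N
print(information_entropy(uniform))   # about 2.3026
print(math.log(N))                    # about 2.3026

# A distribution that encodes more knowledge (one outcome is very likely) has lower entropy.
peaked = [0.9] + [0.1 / (N - 1)] * (N - 1)
print(information_entropy(peaked))    # about 0.5448, strictly less than log N

The uniform distribution reaches H_max = log N exactly; any distribution that encodes extra knowledge about the outcomes has strictly lower entropy, which is the sense in which entropy measures ignorance.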
References
- See also the page on Chaos and Order