Information in Physics
Full Title or Meme
There doesn't seem to be a good definition of information in physics at the present time.
- How to Better Define Information in Physics is an article that does a good job of exploring the various definitions.
Context
Since the time of Boltzmann, information about physical objects has been important in some areas of physics, specifically in thermodynamics.
First published as Physics from Fisher Information in 1999, Roy Frieden's work claims that observable physics (Wave Function Collapse) can be described as the loss of Fisher information.[1] Fisher information can be interpreted as the inverse of the squared standard error, but only when the log-likelihood is quadratic (i.e., Gaussian). In the vast majority of cases we are not dealing with a Gaussian population, but the log-likelihood around the maximum likelihood estimate (MLE) often converges rapidly to a quadratic, especially within about ±2 standard deviations. In that case, treating the inverse of the Fisher information as the estimated precision of an estimate is approximately correct.
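A minimal numerical sketch of that relationship, in the simplest possible setting (estimating a Gaussian mean; the parameter values are illustrative, and none of this is from Frieden's text): the variance of the maximum-likelihood estimator matches the inverse of the Fisher information.

```python
import numpy as np

# Sketch: for n i.i.d. draws from N(mu, sigma^2), the Fisher information
# about mu is I = n / sigma^2, and the log-likelihood is exactly
# quadratic, so Var(MLE) = 1 / I holds exactly (up to sampling noise).
rng = np.random.default_rng(0)
mu, sigma, n, trials = 5.0, 2.0, 100, 20_000

# The MLE of mu is the sample mean; compute it on many synthetic datasets.
estimates = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)

fisher_info = n / sigma**2                             # I(mu) = 25
print("1 / Fisher information:", 1.0 / fisher_info)   # 0.04
print("empirical Var(MLE):    ", estimates.var())     # ~0.04
```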
Fisher Information and Shannon Information are two different concepts. Fisher Information is related to the asymptotic variability of a maximum likelihood estimator: higher Fisher Information is associated with lower estimation error. Fisher Information is used to investigate the precision of the neural code when studying large populations of neurons. Shannon Information refers to the content of the message or distribution, not its variability. Shannon mutual information is used to investigate the precision of the neural code when investigating very small populations of neurons.[2] In summary, Fisher information is related to estimation error, while Shannon information is related to message content or distribution.
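A hypothetical Bernoulli(p) example (not from the cited discussion) makes the contrast concrete: Shannon entropy peaks at p = 0.5, where each symbol carries the most information, while Fisher information is lowest there, since p is hardest to estimate precisely near 0.5.

```python
import numpy as np

# Hypothetical Bernoulli(p) illustration (values chosen for clarity).
def shannon_entropy(p):
    """Shannon entropy of a Bernoulli(p) source, in bits."""
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def fisher_information(p):
    """Fisher information of a single Bernoulli(p) observation about p."""
    return 1.0 / (p * (1.0 - p))

for p in (0.1, 0.5, 0.9):
    print(f"p={p}: H={shannon_entropy(p):.3f} bits, I={fisher_information(p):.2f}")
# H peaks at p=0.5 (1.000 bits) exactly where I is smallest (4.00):
# maximal message uncertainty, minimal estimation precision per sample.
```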
Despite the above, in 2010 Kevin Knuth declared: "Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science."[3]
The article How to Better Define Information in Physics explores these competing definitions further.[4]
- See wiki page on Complexity Theory
Types of Information
- Laplace - the clockwork universe is set in motion and runs deterministically by itself (Laplace's demon)
- Gaussian - error bounds prevent us from making any exact measurement
- Boltzmann - statistics of distinguishable (known) particles
- Planck & Bose - statistics of indistinguishable particles
- Pauli & Dirac - statistics of particles subject to the exclusion principle (the three particle statistics above are compared in the sketch after this list)
- Fisher - population statistics
- Shannon - information content
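As a sketch of the three particle statistics named above (taking k_B = 1 and chemical potential 0, both simplifying assumptions made here), the mean occupation numbers can be compared directly:

```python
import numpy as np

# Mean occupation number of a single-particle state with energy E at
# temperature T, with k_B = 1 and chemical potential mu = 0 (both
# simplifying assumptions for this sketch).
def boltzmann(E, T):
    """Maxwell-Boltzmann: distinguishable particles."""
    return np.exp(-E / T)

def bose_einstein(E, T):
    """Planck & Bose: indistinguishable particles, no exclusion."""
    return 1.0 / (np.exp(E / T) - 1.0)

def fermi_dirac(E, T):
    """Pauli & Dirac: indistinguishable particles with exclusion."""
    return 1.0 / (np.exp(E / T) + 1.0)

for E in (0.5, 1.0, 2.0):
    print(f"E={E}: MB={boltzmann(E, 1.0):.3f}, "
          f"BE={bose_einstein(E, 1.0):.3f}, FD={fermi_dirac(E, 1.0):.3f}")
# At E >> T all three converge to the Boltzmann factor exp(-E/T).
```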
Entropy
The entropy of the universe is constantly increasing. The second law of thermodynamics states that the entropy of the entire universe, as an isolated system, will always increase over time. Energy always flows downhill, and this causes an increase of entropy. Entropy is the spreading out of energy, and energy tends to spread out as much as possible. It flows spontaneously from a hot (i.e. highly energetic) region to a cold (less energetic) region. As a result, energy becomes evenly distributed across the two regions, and the temperature of the two regions becomes equal. The same thing happens on a much larger scale. The Sun and every other star are radiating energy into the universe. However, they can’t do it forever. Eventually the stars will cool down, and heat will have spread out so much that there won’t be warmer objects and cooler objects. Everything will be at the same very cold temperature. Once everything is at the same temperature, there’s no reason for anything to change what it’s doing. The universe will have run down completely, and the entropy of the universe will be as high as it is ever going to get.[5]
The change in entropy of an isolated system during an irreversible process is greater than zero; for a reversible process, it is zero.
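A numerical check of that statement, using illustrative values: heat flowing irreversibly from a hot reservoir to a cold one raises the total entropy.

```python
# Illustrative values: Q = 1000 J flows irreversibly from a hot
# reservoir at 500 K to a cold reservoir at 300 K.
Q, T_hot, T_cold = 1000.0, 500.0, 300.0

dS_hot = -Q / T_hot               # hot reservoir loses entropy: -2.00 J/K
dS_cold = Q / T_cold              # cold reservoir gains entropy: +3.33 J/K
dS_total = dS_hot + dS_cold       # net change for the isolated pair

print(f"dS_total = {dS_total:+.2f} J/K")   # +1.33 J/K > 0: irreversible
```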
The collapse of the wave function is generally considered to be irreversible, but there does not seem to be any way to prove that this is true.
References
- ↑ B. Roy Frieden, Science from Fisher Information (2004), Cambridge University Press, ISBN 0521009111
- ↑ Mathematics Stack Exchange, What are differences and relationship between shannon entropy and fisher information? https://math.stackexchange.com/questions/1523416/what-are-differences-and-relationship-between-shannon-entropy-and-fisher-informa
- ↑ Kevin H. Knuth Information Physics: The New Frontier (2010-09-27) https://arxiv.org/abs/1009.5161
- ↑ Dick Mills, How to Better Define Information in Physics (2018-06-20) https://www.physicsforums.com/insights/how-to-better-define-information-in-physics/
- ↑ Ernest Z. Why is entropy of the universe increasing? https://socratic.org/questions/why-is-entropy-of-universe-increasing
Other Material
- See wiki page on Dimensions in Physics