Information in Physics
Revision as of 15:51, 3 July 2023
Full Title or Meme
Context
Since the time of Boltzmann information about physical objects has been important in some areas of physics, specifically in thermodynamics.
First published as Physics from Fisher Information in 1999, Roy Frieden has claimed that observable physics (the collapse of the wave function) can be described as the loss of Fisher Information.[1] The Fisher Information can be interpreted as the inverse of the squared standard error, but only when the log-likelihood is quadratic (i.e., the Gaussian log-likelihood). In the vast majority of cases we are not dealing with a Gaussian population … but the log-likelihood around the MLE will often rapidly converge to a quadratic, especially within ±2 standard deviations. In that case, treating the inverse of the Fisher Information as the estimated precision of an estimate will be approximately correct.
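The quadratic-log-likelihood claim above can be checked numerically. The sketch below (an illustration, not taken from Frieden's book) uses a Bernoulli population, whose MLE is the sample mean, and compares the empirical variance of that estimator across many repeated experiments against the inverse Fisher Information I(p) = n / (p(1 − p)):

```python
import random


def bernoulli_fisher_demo(p=0.3, n=1000, trials=2000, seed=0):
    """Compare 1/I(p) with the empirical variance of the Bernoulli MLE."""
    rng = random.Random(seed)
    # The MLE for Bernoulli p is the sample mean; repeat the experiment
    # many times to estimate the variance of that estimator.
    estimates = []
    for _ in range(trials):
        phat = sum(1 for _ in range(n) if rng.random() < p) / n
        estimates.append(phat)
    mean = sum(estimates) / trials
    emp_var = sum((e - mean) ** 2 for e in estimates) / (trials - 1)
    # Fisher Information for n i.i.d. Bernoulli(p) samples: I(p) = n / (p(1-p))
    fisher_info = n / (p * (1 - p))
    return emp_var, 1.0 / fisher_info


emp_var, inv_info = bernoulli_fisher_demo()
```

Even though the population is not Gaussian, the two numbers agree closely, because the log-likelihood near the MLE is approximately quadratic at this sample size (this is the Cramér–Rao bound being attained asymptotically).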
Fisher Information and Shannon Information are two different concepts. Fisher Information is related to the asymptotic variability of a maximum likelihood estimator; higher Fisher Information is associated with lower estimation error. Fisher Information is used to investigate the precision of the neural code when studying large populations of neurons. Shannon Information refers to the content of the message or distribution, not its variability. Shannon mutual information is used to investigate the precision of the neural code when investigating very small populations of neurons.[2]
Despite the above, in 2010 Kevin Knuth declared: "Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science."[3]
How to Better Define Information in Physics[4]
- See wiki page on Complexity Theory
References
- ↑ B. Roy Frieden, Science from Fisher Information (2004) Cambridge UP ISBN 0521009111
- ↑ What are differences and relationship between shannon entropy and fisher information? https://math.stackexchange.com/questions/1523416/what-are-differences-and-relationship-between-shannon-entropy-and-fisher-informa
- ↑ Kevin H. Knuth Information Physics: The New Frontier (2010-09-27) https://arxiv.org/abs/1009.5161
- ↑ Disk Mills, How to Better Define Information in Physics https://www.physicsforums.com/insights/how-to-better-define-information-in-physics/