Information in Physics

From MgmtWiki
Revision as of 15:49, 3 July 2023 by Tom (talk | contribs) (Context)


Full Title or Meme

Context

Since the time of Boltzmann, information about physical objects has played an important role in some areas of physics, particularly thermodynamics.

In work first published as Physics from Fisher Information in 1999, B. Roy Frieden has claimed that observable physics (the collapse of the wave function) can be described as a loss of Fisher Information.[1]

Fisher Information and Shannon Information are distinct concepts. Fisher Information describes the asymptotic variability of a maximum likelihood estimator: higher Fisher Information corresponds to lower estimation error. Shannon Information, by contrast, measures the content of a message or distribution, not its variability. In neuroscience, for example, Fisher Information is used to investigate the precision of the neural code in large populations of neurons, while Shannon mutual information is used for very small populations.[2]
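The contrast above can be made concrete with a standard textbook example (not drawn from the article itself): for a Gaussian with known standard deviation sigma, the Fisher Information about the mean is 1/sigma^2 per sample, so by the Cramer-Rao bound the variance of the maximum likelihood estimate falls as 1/(n * I); the Shannon differential entropy of the same distribution, 0.5 * ln(2*pi*e*sigma^2), depends only on its spread, not on how well a parameter can be estimated. A minimal sketch, assuming these well-known formulas:

```python
import math
import random

def fisher_information_mu(sigma: float) -> float:
    """Fisher Information about the mean of N(mu, sigma^2), per sample."""
    return 1.0 / sigma ** 2

def shannon_entropy_gaussian(sigma: float) -> float:
    """Shannon differential entropy of N(mu, sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def mle_variance(mu: float, sigma: float, n: int, trials: int, seed: int = 0) -> float:
    """Empirical variance of the MLE (the sample mean) over many repeated
    experiments; by the Cramer-Rao bound this approaches 1 / (n * I(mu))."""
    rng = random.Random(seed)
    estimates = [
        sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        for _ in range(trials)
    ]
    mean_est = sum(estimates) / trials
    return sum((e - mean_est) ** 2 for e in estimates) / trials

sigma = 2.0
n = 50
info = fisher_information_mu(sigma)   # 0.25 per sample
bound = 1.0 / (n * info)              # Cramer-Rao lower bound: 0.08
empirical = mle_variance(mu=0.0, sigma=sigma, n=n, trials=2000)

print(f"Fisher Information per sample:   {info}")
print(f"Cramer-Rao bound on Var(mu_hat): {bound}")
print(f"Empirical Var(mu_hat):           {empirical:.4f}")
print(f"Shannon entropy (nats):          {shannon_entropy_gaussian(sigma):.4f}")
```

Note that doubling sigma quarters the Fisher Information (estimation gets harder) but only adds a constant ln 2 to the entropy, illustrating that the two quantities answer different questions.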

Despite these distinctions, in 2010 Kevin Knuth declared: "Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science."[3]

A further attempt to sharpen the concept appears in How to Better Define Information in Physics.[4]

References

  1. B. Roy Frieden, Science from Fisher Information (2004) Cambridge University Press, ISBN 0521009111
  2. What are differences and relationship between Shannon entropy and Fisher information? https://math.stackexchange.com/questions/1523416/what-are-differences-and-relationship-between-shannon-entropy-and-fisher-informa
  3. Kevin H. Knuth, Information Physics: The New Frontier (2010-09-27) https://arxiv.org/abs/1009.5161
  4. Disk Mills, How to Better Define Information in Physics https://www.physicsforums.com/insights/how-to-better-define-information-in-physics/