Uncertainty

From MgmtWiki
Latest revision as of 14:32, 23 July 2024

Full Title or Meme

Context

Karl Pearson, the English statistician and geneticist, is commonly credited with first describing the concept of uncertainty as a measure of data variability in the late 1800s.[1] Before Pearson, scientists realized that their measurements incorporated variability, but they assumed that this variability was simply due to error. For example, measurements of the orbits of planets around the sun taken by different scientists at different times varied, and this variability was thought to be due to errors caused by inadequate instrumentation. The French mathematician Pierre-Simon Laplace discussed a method for quantifying the error distributions of astronomical measurements caused by small instrument shortcomings as early as 1820. As technology improved through the 1800s, astronomers realized that they could reduce, but not eliminate, this error in their measurements.

Pearson put forward a revolutionary idea: Uncertainty, he proposed, was not simply due to the limits of technology in measuring certain events – it was inherent in nature. Even the most careful and rigorous scientific investigation (or any type of investigation for that matter) could not yield an exact measurement. Rather, repeating an investigation would yield a scatter of measurements that are distributed around some central value. This scatter would be caused not only by error, but also by natural variability. In other words, measurements themselves, independent of any human or instrument inaccuracies, exhibit scatter.

Whether it is the flight path of an arrow, the resting heart rate of an adult male, or the age of a historical artifact, measurements do not have exact values, but instead always exhibit a range of values, and that range can be quantified as uncertainty. This uncertainty can be expressed as a plot of the probability of obtaining a certain value, and the probabilities are distributed about some central, or mean, value.[2]
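As a toy illustration of this idea, the sketch below uses simulated data, not real measurements: an assumed true value of 10.0 and an assumed natural scatter of 0.3 are invented for the example. Repeated measurements spread out around a central value, and the width of that spread is reported as the uncertainty.

```python
import random
import statistics

# Simulated repeated measurements of the same quantity. The "true" value
# (10.0) and scatter (0.3) are invented for illustration; the scatter models
# natural variability, not just instrument error.
random.seed(42)
measurements = [random.gauss(10.0, 0.3) for _ in range(1000)]

mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)

# The result is reported as a central value plus an uncertainty.
print(f"measured value: {mean:.2f} +/- {stdev:.2f}")
```

No matter how many measurements are taken, the standard deviation converges to the natural variability rather than to zero, which is Pearson's point.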

Based on the early papers of Quantum Mechanics, written before Heisenberg formulated his Uncertainty principle, J. B. S. Haldane arrived at his own "suspicion that the universe is not only queerer than we suppose, but queerer than we can suppose."[3]

Prediction

Most of science is an attempt to create a set of equations that will predict the future. Knowing the future appears to be an obsession of the human species. Whether it will rain, or where to find the next meal, were questions important to the survival of the species, and so prediction has always consumed a great deal of the learning that humans pass on to their progeny. But we have known, for as long as we have had any records, that this effort was troublesome. William Shakespeare has Hamlet state that human knowledge is limited: "There are more things in heaven and Earth, Horatio, / Than are dreamt of in your philosophy [science]."

Probabilistic Learning

LLMs are not even trying to find a "best" answer – they are trying to guess what a human would most likely say, which is inherently probabilistic. An LLM estimates a probability distribution over possible next tokens, and then samples from that distribution when asked to provide a single response.
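A minimal sketch of that sampling step, with an invented four-word vocabulary and made-up logits standing in for the output of a real model (whose vocabulary would have on the order of 100,000 tokens):

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw model scores into a probability distribution."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy vocabulary and scores, invented for illustration.
vocab = ["rain", "sun", "snow", "fog"]
logits = [2.0, 1.0, 0.5, -1.0]

probs = softmax(logits)

def sample(vocab, probs, rng=random):
    """Draw one token; repeated calls can give different answers."""
    return rng.choices(vocab, weights=probs, k=1)[0]

print(dict(zip(vocab, [round(p, 3) for p in probs])))
print(sample(vocab, probs))
```

The distribution itself is fixed once the logits are known, but any single response is a random draw from it, which is why the same prompt can yield different answers.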

In this regard Machine Learning is like Quantum Mechanics: you can compute a probability cloud, but the actual response (or measurement) is not knowable in advance.

Quantum Mechanics

The first formal description of Uncertainty in Quantum Mechanics came from Heisenberg, in part as a response to the Schrödinger Equation. It showed that it is not possible to measure conjugate values (like position and momentum) at the same moment. This is related to what is sometimes called the Measurement Problem of Quantum Mechanics: the theory is exact, except at the moment an observation is made. In this wiki we assert that at any event where an interaction occurs, the "collapse of the wave function" also occurs.
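In standard notation, Heisenberg's relation puts a hard lower bound on the product of the spreads of position and momentum, where σ_x and σ_p are the standard deviations of repeated position and momentum measurements and ħ is the reduced Planck constant:

```latex
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
```

Narrowing the spread of one quantity necessarily widens the spread of the other; no state evades the bound.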

It should be noted that most of the physicists teaching in universities still believe in Determinism of the sort described by Laplace:[4] "An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom."

Fourier Transforms

Uncertainty principles are not formally defined. In general, uncertainty principles refer to a meta-theorem in Fourier analysis that states that a nonzero function and its Fourier transform cannot be localized to arbitrary precision.[5]
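This bound can be checked numerically. The sketch below (assuming NumPy) measures the time spread and frequency spread of Gaussian pulses of several invented widths; for Gaussians the product of the two spreads sits exactly at the lower bound 1/(4π), so narrowing a pulse in time necessarily widens its spectrum.

```python
import numpy as np

def spread(x, weights):
    """Standard deviation of x under (unnormalized) weights on a uniform grid."""
    p = weights / weights.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean) ** 2 * p).sum())

# Uniform time grid wide enough that the pulses decay to zero at the edges.
t = np.linspace(-50, 50, 4096)
dt = t[1] - t[0]
freq = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))

products = []
for a in (0.5, 1.0, 2.0):                    # pulse widths, chosen arbitrarily
    f = np.exp(-t**2 / (2 * a**2))           # Gaussian pulse of width ~a in time
    F = np.fft.fftshift(np.fft.fft(f))       # its spectrum
    sigma_t = spread(t, np.abs(f) ** 2)
    sigma_f = spread(freq, np.abs(F) ** 2)
    products.append(sigma_t * sigma_f)

# Gaussians achieve the lower bound sigma_t * sigma_f = 1/(4*pi) ~ 0.0796;
# no nonzero function can do better, which is the Fourier uncertainty principle.
for a, p in zip((0.5, 1.0, 2.0), products):
    print(f"a={a}: sigma_t * sigma_f = {p:.4f}")
```

The same computation run on a non-Gaussian pulse (a rectangle, say) would give a strictly larger product, since Gaussians are the unique minimizers.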

More on deriving the uncertainty principle from Fourier analysis can be found elsewhere.[6][7]

References

  1. D. Salsburg, The Lady Tasting Tea: How Statistics Revolutionized Science in the Twentieth Century. New York: W. H. Freeman (2001) ISBN 9780805071344
  2. Anthony Carpi et al., Uncertainty, Error, and Confidence https://www.visionlearning.com/en/library/Process-of-Science/49/Uncertainty-Error-and-Confidence/157
  3. J. B. S. Haldane, Possible Worlds (1927) https://jbshaldane.org/books/1927-Possible-Worlds/haldane-1927-possible-worlds.html#Page_260
  4. Tim Palmer, The Primacy of Doubt (2022) p. 12 ISBN 9781541619715
  5. Kabir Dubey, The Fourier Uncertainty Principles, University of Chicago https://math.uchicago.edu/~may/REU2021/REUPapers/Dubey.pdf
  6. Emanuele Pesaresi, Uncertainty Principle Derivation from Fourier Analysis https://www.linkedin.com/pulse/uncertainty-principle-derivation-from-fourier-emanuele-pesaresi?utm_source=share&utm_medium=member_android&utm_campaign=share_via
  7. Jason Baker & Christian Roberts, The Uncertainty Principle: Fourier Analysis in Quantum Mechanics (2016-11-20) https://math.unm.edu/~crisp/courses/wavelets/fall16/ChrisJasonUncertaintyPple.pdf