Computational Complexity Theory

From MgmtWiki
Revision as of 08:38, 2 July 2023 by Tom (talk | contribs) (Context)


Full Title or Meme

The meta-mathematics of determining how hard it is to solve a problem. This line of inquiry is the origin of what is now called computational complexity theory.
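As a concrete illustration (not part of the original page), "how hard" is typically measured by how the number of computational steps grows with the size of the input. A minimal Python sketch of a brute-force search whose cost grows exponentially, contrasted with a problem that is easy by comparison:

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Decide whether some subset of nums sums to target by
    checking every subset: up to 2**len(nums) candidates."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

# Membership testing is "easy": at most n steps for n items.
# Brute-force subset sum is "hard": 2**n subsets for n items,
# so adding one item doubles the worst-case work.
print(subset_sum_bruteforce([3, 34, 4, 12, 5, 2], 9))    # True (4 + 5 = 9)
print(subset_sum_bruteforce([3, 34, 4], 100))            # False
```

Whether problems like subset sum admit an essentially faster algorithm is the famous P versus NP question at the heart of the field.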

Context

  • The wiki page on Complexity begins to address Complexity from a computational point of view.
  • This page is also informed by the following discussion of how to better define information in physics:

https://www.physicsforums.com/insights/how-to-better-define-information-in-physics/


Between 1500 and 1700 there was an important and dramatic shift in the way that people in Europe perceived and understood the world. Through the works of thinkers such as Francis Bacon, Galileo Galilei, Johannes Kepler, René Descartes, and Isaac Newton, the Western world underwent a scientific revolution. This resulted in a shift from a worldview governed by the Church and Christian theology and ethics to that of an inanimate, machine-like material world governed by natural forces and exact mathematical rules.[1]

In 1900 David Hilbert posed the second of his famous problems: to prove the consistency of the axioms of arithmetic. In 1931 Kurt Gödel showed that no such proof can be carried out within the system itself.[2] This discovery came just a few years after Heisenberg showed that the mathematics of quantum mechanics imposes a fundamental limit on the accuracy of any measurements made on an individual quantum particle. Finally, in 1996 Ilya Prigogine published a book declaring "The End of Certainty."[3]

References

  1. Capra and Luisi, "The Newtonian world-machine," in The Systems View of Life: A Unifying Vision (pp. 19–34), Cambridge: Cambridge University Press (2014) doi:10.1017/CBO9780511895555.004
  2. Kurt Gödel, On Formally Undecidable Propositions of Principia Mathematica and Related Systems, Dover (original in German, 1931) ISBN 9780486669809
  3. Ilya Prigogine, The End of Certainty, Free Press (1996) ISBN 0684837059

Other Material