Analog Computer

Full Title or Meme

While the word "computer" in 2020 implies a digital computer, the first useful computers were analog.

Calling these analog machines "computers" conflates the digital nature of calculation with the analog nature of merging continuous quantities. For better or for worse, this confusion is now so pervasive that any attempt to fix the nomenclature would be fruitless.

Context

  • The first portable computer was the slide rule, which every engineer was once required to master.
  • The first (non-human) military computers were the analog fire-control systems on the battleship Missouri and the bombsights on aircraft during the Second World War.[1]
  • The brain was never a serial, digital machine to begin with: it is a massively parallel analog machine. Each neuron has hundreds or more inputs (a very large input vector) and a single analog output.[2]
  • Each neuron in the brain likewise fans its output out to all of the synapses that downstream neurons have established to read it; each synapse can have its own parameters, such as whether it is inhibitory or excitatory. A minimal sketch of this model follows the list.
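
The neuron model described above reduces to a weighted sum over a large input vector collapsed into one continuous output. Below is a minimal Python sketch of that idea; the names and numbers (500 inputs, tanh squashing) are illustrative assumptions, not anything taken from the cited sources.

    import math
    import random

    def neuron_output(inputs, weights, bias=0.0):
        # One "analog" neuron: a weighted sum of a large input vector
        # collapsed into a single continuous output. A negative weight
        # stands in for an inhibitory synapse, a positive one for excitatory.
        total = sum(x * w for x, w in zip(inputs, weights)) + bias
        return math.tanh(total)  # single analog (continuous) output value

    # Illustrative numbers only: the text says "hundreds or more" inputs.
    n_inputs = 500
    inputs = [random.uniform(0.0, 1.0) for _ in range(n_inputs)]
    weights = [random.uniform(-1.0, 1.0) for _ in range(n_inputs)]
    print(neuron_output(inputs, weights))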

The First Computer Network

In the late 1950s, a sophisticated continental air defense system was developed by the United States Air Force to protect North America from incoming Soviet nuclear bombers. The name of the system was SAGE, the Semi-Automatic Ground Environment. SAGE was a massive technical complex of 23 concrete bunkers in the U.S. and one in Canada, each connected to local and distant radar stations and controlled by the largest computer system of its day.[3]

The AN/FSQ-7 computer had already anticipated many techniques that appeared only later in commercially available computers: the modem, the light pen, interactive graphics, and more.

Problems

  • Artificial Intelligence runs on megabytes with megawatts in digital mega-data-centers. Natural intelligence runs on less than 20 watts of analog tissue in the human brain. Even so, that brain consumes a large fraction of the power needed by the human body: for the average adult in a resting state, about 20 percent of the body's energy,[4] a far larger share than in any other animal. Artificial Intelligence as presently constituted is unsustainable, and other features of modern life, like cryptocurrencies, are even worse energy hogs. A back-of-the-envelope check of these figures follows.
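
As a back-of-the-envelope check of the figures above, assume an average adult resting metabolic rate of roughly 100 watts; the 100 W figure is an assumption, while the 20 percent share comes from the text.

    # Sanity-check the brain-power numbers quoted above.
    body_resting_watts = 100.0   # assumed average adult resting metabolic rate
    brain_fraction = 0.20        # "about 20 percent", per the text
    brain_watts = body_resting_watts * brain_fraction
    print(f"Brain power: {brain_watts:.0f} W")        # ~20 W, matching the text

    # The same power expressed as energy per day, for scale.
    joules_per_day = brain_watts * 86_400             # seconds in a day
    kcal_per_day = joules_per_day / 4_184             # joules per kilocalorie
    print(f"~{joules_per_day / 1e6:.2f} MJ/day, ~{kcal_per_day:.0f} kcal/day")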

Back to the Future

When digital computers learned to shrink their geometries onto a chip of silicon, they rapidly overtook the clunky old analog computers.[5]

40 years ago, Nobel Prize-winner Richard Feynman argued that “nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical.”[6]

See the wiki discussion of Quantum Simulation for both analog and digital simulation.

References

  1. Bernd Ulmann, Analog Computing, 2nd extended ed. (de Gruyter Textbook) ISBN 978-3110787610
  2. John von Neumann, The Computer & the Brain, Yale University Press (1958) ISBN 9780300181111
  3. Bernd Ulmann, AN/FSQ-7: the computer that shaped the Cold War, ISBN 9783486727661
  4. How Much Energy Does the Brain Use? BrainFacts.org (2019-02-01) https://www.brainfacts.org/Brain-Anatomy-and-Function/Anatomy/2019/How-Much-Energy-Does-the-Brain-Use-020119
  5. Charles Platt, Back to the Analog Future, Wired (2023-05) https://www.wired.com/story/unbelievable-zombie-comeback-analog-computing/
  6. Gil Press, 27 Milestones in the History of Quantum Computing, Forbes https://www.forbes.com/sites/gilpress/2021/05/18/27-milestones-in-the-history-of-quantum-computing/?sh=7024e11b7b23