Analog Computer


Full Title or Meme

While the word computer in 2020 implies digital computer, the first useful computers were analog.

Calling these analog machines "computers" conflates the digital nature of calculation with the analog nature of merging continuous quantities. For better or for worse, this confusion is so pervasive that any attempt to fix the nomenclature now would be fruitless.

Context

  • The first portable computer was the slide rule, which all engineers were required to master until the late 1960s.
  • The first (non-human) military computers were the analog fire-control systems on battleships such as the Missouri and the bombsights on aircraft during the Second World War.[1]
  • The brain was never a serial, digital machine to begin with: it is a massively parallel analog machine. Each neuron has hundreds or more inputs (a very large input vector) and a single analog output.[2]
  • Each neuron also drives a similarly large output vector: the synapses that downstream neurons have established to read its output. Each synapse can have its own parameters, such as whether it is inhibitory or excitatory; a minimal sketch follows this list.
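
As a minimal sketch of the neuron model just described (the vector size, random weights, and sigmoid nonlinearity are illustrative assumptions, not biological detail):

    import math
    import random

    def neuron_output(inputs, weights):
        """One neuron: a large analog input vector reduced to a single analog output."""
        activation = sum(x * w for x, w in zip(inputs, weights))
        return 1.0 / (1.0 + math.exp(-activation))  # saturating nonlinearity

    # Hundreds of inputs; negative weights stand in for inhibitory synapses.
    inputs = [random.random() for _ in range(500)]
    weights = [random.gauss(0.0, 0.1) for _ in range(500)]
    print(neuron_output(inputs, weights))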

The First Useful Computer

The Differential Analyzer[3][4][5] had its first incarnation in the 1930s. By the time WWII started it was handling a variety of war-related projects, including support for Norbert Wiener's creation of the Wiener Filter.
One day in 1942, the Rockefeller Differential Analyzer was dedicated to winning the war. For the next several years this large mathematical machine, the centerpiece of MIT's Center of Analysis, labored over the calculation of firing tables and the profiles of radar antennas. Weighing almost a hundred tons and comprising some two thousand vacuum tubes, several thousand relays, a hundred and fifty motors, and automated input units, the analyzer was the most important computer in existence in the United States at the end of the war (IMHO). Wartime security prohibited its public announcement until after the war. ... The first demonstration of the still incomplete analyzer was held on 1941-12-13. By March 1943 the staff and facilities of the center had been converted entirely to war work.
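
The analyzer solved equations by chaining mechanical integrators so that shaft rotations accumulated the integrals. A minimal digital sketch of that integrator chaining, for the oscillator equation y'' = -y (the step size and initial conditions are illustrative assumptions):

    import math

    dt = 0.001
    y, dy = 1.0, 0.0                        # initial "shaft positions"
    for _ in range(int(2 * math.pi / dt)):  # run for one full period
        ddy = -y                            # adder/inverter stage supplies y''
        dy += ddy * dt                      # first integrator:  y'' -> y'
        y += dy * dt                        # second integrator: y'  -> y
    print(round(y, 3))                      # returns near 1.0 after ~2*pi seconds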

The First Computer Network

In the late 1950s, a sophisticated continental air defense system was developed by the United States Air Force to protect North America from incoming Soviet nuclear bombers. The system was named SAGE, the Semi-Automatic Ground Environment. SAGE was a massive technical complex of 23 concrete bunkers in the U.S. and one in Canada, each connected to local and distant radar stations and controlled by the largest computer system of its day.[6]

The AN/FSQ-7 (a vacuum-tube digital computer, despite the analog radar data it consumed) had already anticipated many techniques that appeared only later in commercially available computers: the modem, the light pen, graphics, etc.

SAGE was only the first of many networked radar-to-computer designs from MIT's Lincoln Labs, going back to Jay Forrester's work in 1944 with the 10-ton Whirlwind computer.[7]

Problems

  • Artificial Intelligence runs on megabytes with megawatts in digital computing mega-centers. Natural intelligence runs on less than 20 watts of analog tissue in the human brain. Even so, that brain consumes a large fraction of the power needed by a human body: for the average adult in a resting state, the brain consumes about 20 percent of the body's energy,[8] far more than in any other animal. Artificial Intelligence as presently constituted is unsustainable, and other features of modern life, like cryptocurrencies, are even worse energy hogs. (A back-of-envelope check of the 20-watt figure follows this list.)
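
The 20-watt figure checks out arithmetically, assuming a resting metabolic rate of roughly 2000 kcal/day (an illustrative assumption):

    KCAL_TO_JOULES = 4184
    SECONDS_PER_DAY = 86_400

    body_watts = 2000 * KCAL_TO_JOULES / SECONDS_PER_DAY   # about 97 W
    brain_watts = 0.20 * body_watts                        # about 19 W
    print(f"body: {body_watts:.0f} W, brain: {brain_watts:.0f} W")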

Back to the Future

When digital computers learned to shrink their geometries onto a chip of silicon, they rapidly overtook the clunky old analog computers.[9]

40 years ago, Nobel Prize-winner Richard Feynman argued that “nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical.”[10] What is very clear about Quantum Mechanics is that while the Schrödinger Equation is a continuous linear equation, it enters the world of discrete values when the wave function collapses. Many of the state variables are indeterminate before this collapse, but fixed at the time of collapse.
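
For reference, the time-dependent Schrödinger equation: the evolution of the wave function Ψ is linear and continuous until measurement forces a discrete outcome.

    i\hbar \,\frac{\partial}{\partial t}\,\Psi(x,t) \;=\; \hat{H}\,\Psi(x,t)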

See the wiki discussion of Quantum Simulation for both analog and digital simulation. What is apparent though is that the quantum world exists in a curious state between discrete and continuous that digital computers are not able to match.

Starting in 2018, the rise of interest in Artificial Intelligence prompted several companies to rethink the idea of emulating human neural systems in digital computers.[11]
Analog was the original way to model neural networks; digital came later. More recently, researchers (and companies like Mythic and Syntiant) have been looking at in-memory analog computation to reduce power compared to digital inference engines. By eliminating the digital memory transactions required in a typical inference engine, “you can potentially save lots of power and die area.” Mythic depends on digital input, simply using flash memory cells as voltage-variable conductance elements to replace digital multiply-accumulators (MACs). Aspinity, in contrast, processes analog inputs with a variety of parametrized analog circuits: amps, filters, adders/subtractors, etc.
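
As a minimal sketch of the in-memory analog multiply-accumulate idea (the voltages, conductances, and the analog_mac helper are illustrative assumptions, not either company's design): weights are stored as cell conductances, activations arrive as voltages, and Kirchhoff's current law sums the products on the column wire for free.

    def analog_mac(voltages, conductances):
        """One crossbar column: output current I = sum(G[i] * V[i])."""
        return sum(g * v for g, v in zip(conductances, voltages))

    # A 3-input "column" standing in for one digital MAC unit.
    V = [0.1, 0.4, 0.2]       # input activations encoded as volts
    G = [1e-4, 2e-4, 5e-5]    # weights programmed as conductances (siemens)
    print(analog_mac(V, G))   # summed column current, in amperes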

In 2024 several companies are trying to create an analog computer that will perform Artificial Intelligence functions faster and at lower operating cost than purely digital models have achieved heretofore. Analog computing, with its direct processing capabilities and energy efficiency, might just be the key to addressing the power challenges posed by the proliferation of smart sensors. As we look ahead, analog computers could play a crucial role in creating a smarter and more sustainable future.

  1. Aspinity: has developed the AML100 chip, which blends analog and digital computing. By doing so, it significantly lowers energy consumption while retaining the advantages of digital technology for smart sensors. These sensors are essential for various applications, from wearables to autonomous cars.
  2. Anabrid: founded in 2020, is shaping the future of analog computing. Their mission is to fit analog computers onto a chip, making analog computing ubiquitous. Their approach could revolutionize energy-efficient computing.[12][13]
  3. Mythic: had developed a "unique new technology" that promised to deliver performance-per-watt and performance-per-dollar gains for edge AI. However, that apparently wasn't enough to give the 10-year-old company the traction it needed to keep investor money flowing.[14]
  4. Intel and Texas Instruments: are at the forefront of analog design engineering. If you're an analog design engineer, these companies offer exciting opportunities. Additionally, Meta and Apple are known for their competitive salaries in this field.

Analog computers now just one step from digital, ScienceDaily (2021-12-09) https://www.sciencedaily.com/releases/2021/12/211209082557.htm

Example

Analog computers are making a comeback, with several companies offering designs that aim to solve Artificial Intelligence models faster at lower energy cost. What really got my interest was when I found my first project as principal researcher after leaving college recorded on Wikipedia. To determine the size of the furnace, I needed to calculate the diffusion profile of chromium into iron at high temperature, where the crystallographic structure of the iron changed as the level of chromium reached a particular concentration. The calculation ran on an Analog Devices AD/four hybrid coupled to an XDS Sigma 5. https://en.wikipedia.org/wiki/Scientific_Data_Systems
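
A minimal digital sketch of the kind of calculation that hybrid ran: chromium diffusing into iron per Fick's second law, using the semi-infinite-slab solution C(x,t) = Cs·erfc(x/(2√(Dt))). The diffusivity D and surface concentration Cs here are illustrative assumptions, and the sketch ignores the phase change the real problem had to track:

    import math

    def concentration(x_m, t_s, D=1e-14, Cs=1.0):
        """Relative Cr concentration at depth x_m (meters) after t_s seconds."""
        return Cs * math.erfc(x_m / (2.0 * math.sqrt(D * t_s)))

    # Profile after one day at temperature, at a few depths.
    for depth_um in (1, 5, 10):
        print(depth_um, "um:", round(concentration(depth_um * 1e-6, 86_400), 3))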

With Light

  • Metamaterials for Analog Optical Computing (2024-03-29): Novel metamaterial-based architectures offer a promising platform for building mass-producible, reprogrammable schemes that perform computing tasks with light.

References

  1. Bernd Ulmann, Analog Computing, 2nd extended ed. (de Gruyter Textbook) ISBN 978-3110787610
  2. John von Neumann, The Computer & the Brain, Yale Univ Press (1958) ISBN 9780300181111
  3. Larry Owens, Vannevar Bush and the Differential Analyzer: The Text and Context of an Early Computer, Technology and Culture (1986-01) https://www.jstor.org/stable/3104945?origin=crossref
  4. Kent H Lundberg, Vannevar Bush's Differential Analyzer https://www.mit.edu/~klund/analyzer/
  5. Haverford, Bush and the Differential Analyzer, list of references http://ds-wordpress.haverford.edu/bitbybit/resources/chapter-3/3-vannevar-bush-and-the-differential-analyzer/
  6. Bernd Ulmann, AN/FSQ-7: the computer that shaped the Cold War ISBN 9783486727661
  7. MIT Technology Review, MIT’s first divorce https://www.technologyreview.com/2023/06/27/1073782/mits-first-divorce/
  8. How Much Energy Does the Brain Use? (2019-02-01) https://www.brainfacts.org/Brain-Anatomy-and-Function/Anatomy/2019/How-Much-Energy-Does-the-Brain-Use-020119
  9. Charles Platt, Back to the Analog Future Wired (2023-05) https://www.wired.com/story/unbelievable-zombie-comeback-analog-computing/
  10. Gil Press, 27 Milestones in the History of Quantum Computing Forbes https://www.forbes.com/sites/gilpress/2021/05/18/27-milestones-in-the-history-of-quantum-computing/?sh=7024e11b7b23
  11. Mike Demler quoted in Aspinity Puts Neural Networks Back to Analog (2019-06-25) https://www.eetimes.com/aspinity-puts-neural-networks-back-to-analog/
  12. Analog computing is undergoing a resurgence - Big Think. https://bigthink.com/the-future/analog-computing-resurgence/
  13. Anabrid: Shaping the Future of Analog Computing. https://www.future-of-computing.com/anabrid-shaping-the-future-of-analog-computing/
  14. Dylan Martin, Mythic bet big on analog AI but has run out of cash, The Register (2022-11-09) https://www.theregister.com/2022/11/09/mythic_analog_ai_chips/