From MgmtWiki

==Full Title==
The Event-driven Universe is a model focused on the events that happen among objects in the universe.
 
===Author===
Thomas C. Jones
  
 
===Date===
2020-12-15, with additional details added 2022-10-15.
  
 
==Abstract==
Models of the universe typically focus on the extremes of the very large or the very small, like collisions of galaxies or of photons and electrons. These are both events between identified objects, which are called entities in computer science. But the existing standard models are created to establish a set of "facts" about one particular mathematical set of assumptions, such as continuous and well-defined space and time. In these models constructs like the Lagrangian or Hamiltonian are precise and deterministic. In contrast, this article will focus on information from the observed events themselves rather than the commonly accepted assumptions, at least so far as time is concerned. That means that no mathematical formula that includes time as a variable will be assumed to be a fundamental truth. Instead, time will arise from a sequence of events as postulated by Whitehead.<ref name=whitehead /> The major theme of this article is that if physics wants to learn about how self-organization can provide answers to open issues, then the best place to look is at places where evolution has been successfully applied. As one example: machine learning has already been enhanced by acknowledging that evolution can bring new techniques and insights to hard technological challenges.
  
 
==The Model==
"Scientists work from models acquired through education ... often without quite knowing or needing to know what characteristics have given these models the status of community paradigms."<ref>Thomas Kuhn, ''The Structure of Scientific Revolutions'' (1996-12-15) ISBN 978-0226458083</ref> Some, like Boltzmann, Einstein and Feynman, think outside of the models that they were given and create new models, which results in the Paradigm Shift that Thomas Kuhn described in the book from which that quote was taken. Breaking old models can be difficult; lack of acceptance and an incurable disease led Boltzmann to commit suicide.
* Euclidean three-space will be assumed for its simplicity.
 
* Two event models will be addressed (these can be thought of as Feynman diagrams, if that is helpful):
 
# the emission of a single photon by an excited electron and its subsequent absorption by another electron
# the emission of an [[Entangled]] photon pair and the subsequent absorption of one of those photons.
The goal is to create a partial ordering of events and then determine if it is possible to use this partial ordering of events to create a timeline that can become continuous in the limit. Each partial ordering would be local to the events and so would naturally lead to a relative timeline for each locality.
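As a small illustrative sketch of what "a timeline from a partial ordering" could mean computationally, the standard-library `graphlib` module can linearize a partial order of events into one admissible sequence. The event names here are hypothetical, chosen only to echo the two event models above:

```python
from graphlib import TopologicalSorter

# Hypothetical partial order: each event maps to the set of events that must
# precede it (e.g. a photon must be emitted before it can be absorbed).
precedes = {
    "emit_photon_1": set(),
    "absorb_photon_1": {"emit_photon_1"},
    "emit_pair": set(),
    "absorb_pair_member": {"emit_pair"},
}

# Any topological order is one admissible "timeline" consistent with the
# partial order; causally unrelated events may be interleaved arbitrarily,
# which is the sense in which each locality gets its own relative timeline.
timeline = list(TopologicalSorter(precedes).static_order())
print(timeline)
```

Note that many distinct timelines satisfy the same partial order; only the ordering constraints between causally connected events are invariant.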
===General Semantics===
This label is fully described in the wiki page on [[Semantics]]. It can be summarized as a description of how events translate into observations, which are then pigeon-holed into the existing names and labels that scientists know from their education and experience. What is really needed is a fresh look at events that is unencumbered by existing taxonomies.
===Axioms===
It was not at all clear to the early quantum physicists whether the universe could really be as strange a place as their laboratory experiments appeared to show. Bit by bit they pieced together the results of experiments and found rules obeyed by matter. In 1936 Birkhoff and von Neumann collected the rules together into the accepted axioms of quantum mechanics. Birkhoff and von Neumann showed that the axioms of [[Quantum Mechanics]] provide a consistent framework in which it is once again possible to predict the results of experiment, at least statistically. But, although these laws are mathematically consistent, most scientists agree that they are counterintuitive and do not have any satisfactory known physical interpretation.
===Graffiti===
Kant has been summarized as: "Time exists in order that everything doesn't happen all at once ... and space exists so that it doesn't all happen to you." This is a discussion of an event, or a "happening".
===The Process===
All of the physical reality that we know on this world was created as a process: a series of events followed by the adaptations of the climate and the biome to the new realities. In his book ''The Self-organizing Universe'' Erich Jantsch<ref>Erich Jantsch, ''The Self-organizing Universe'' Pergamon (1980-01-15) ISBN 0080243118</ref> proposes that the formation of the universe itself is a process that we are only now beginning to comprehend. This wiki has more at the [[Self-organization]] of identity ecosystems, but the general structure is the same: chaos leads inexorably to order. While even the Nobel Laureate Murray Gell-Mann<ref>Murray Gell-Mann, ''The Quark and the Jaguar'' Freeman (1994) ISBN 0716725819</ref> describes the concept of [[Self-organization]], he does not use it in creating physical models. Perhaps we can fill in some of that here.
  
 
==Irreversibility and Power Consumption==
As the Lagrangian is designed to find the path with the least power transfer (as measured by the rate of change of the difference between potential and kinetic energy), so too the attempt to find low-power computing solutions seeks to minimize the energy that must be dissipated.
The problem with these models is that they lead to the static Wheeler-DeWitt equation (''Ĥ''(x)|ψ> = 0), which tells us that the universe does not evolve.<ref>Bryce S. DeWitt, ''Quantum Theory of Gravity. I. The Canonical Theory'' (1967-08-25) Phys. Rev. 160, 1113 https://journals.aps.org/pr/abstract/10.1103/PhysRev.160.1113</ref> Many find that to be a fatal flaw of physics. It simply does not correspond in any way to reality.
Quantum computing is teaching us more about computing and about quantum mechanics, but it is not at all clear where this new knowledge is taking us. In the following we will investigate computing elements that consume something, though it is not clear what; perhaps it is the precursor to entropy. This proposition will not depend on existing knowledge, but rather will just explore what is possible if we are not completely bound by accepted knowledge. What is created in this endeavor is time itself.
'''Landauer's principle''' is a physical principle pertaining to the lower theoretical limit of energy consumption of [[Computation]]. It holds that "any logically irreversible manipulation of [[Information#As a property in physics|information]], such as the erasure of a [[bit]] or the merging of two [[Computation]] paths, must be accompanied by a corresponding [[entropy]] increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment".<ref name=bennett>Charles H. Bennett, ''Notes on Landauer's principle, Reversible Computation and Maxwell's Demon'' Studies in History and Philosophy of Modern Physics '''34''' No. 3 pp. 501–510 (2003) http://www.cs.princeton.edu/courses/archive/fall06/cos576/papers/bennett03.pdf DOI 10.1016/S1355-2198(03)00039-X</ref>
Charles H. Bennett at IBM taught us that computing does not need to consume much power.
===The Direction of Time===
Fundamental to the idea of irreversibility is that time flows from the past to the future. Quantum Mechanics does not appear to predict that one-directional flow. Leonard Susskind proposed that fractal flows determine time's arrow.<ref>Leonard Susskind, ''Fractal-Flows and Time's Arrow'' (2012) arXiv:1203.6440 https://arxiv.org/pdf/1203.6440.pdf</ref>
  
===Problem Solving===
Physics has focused on the problem of getting from point A to point B. For example, in determining the function of a machine we start with the first and second laws of thermodynamics. But in the case of a light beam we focus on the path with the "least action". In the first case we are dealing with something large, the machine, while in the second case we are dealing with something quite small, a photon. Similarly, in the former the Hamiltonian (total energy) is kept constant, while in the latter the Lagrangian (fungibility of energy) is kept to a minimum and the Hamiltonian does not care how the energy is distributed or changed in form. Landauer's principle is similar: just keep the fungibility of energy at a minimum and the computational power is unlimited, although the computation might take a long time.
===Is the Universe a Computer?===
  
What is the universe computing? As far as we can tell, it is not producing a single answer to a single question.... Instead the universe is computing itself. Powered by Standard Model software, the universe computes quantum fields, chemicals, bacteria, human beings, stars, and galaxies. As it computes, it maps out its own spacetime geometry to the ultimate precision allowed by the laws of physics. Computation is existence.<ref>Seth Lloyd as reported by Alexandra Churikova, ''Is the Universe Actually a Giant Quantum Computer?'' (2015) https://cmsw.mit.edu/angles/2015/is-the-universe-actually-a-giant-quantum-computer/</ref>
  
 
==What Purpose do Clocks Serve==
It is informative to consider why clocks are used in computer systems and to see that there are lessons to be learned. In a mechanized society where the time value of money is of the essence of every financial transaction, it has become a socially acknowledged fact that time management of every aspect of our lives is the highest goal for producing the greatest output of goods and services for the least expenditure of our precious time. Good timekeeping became important in our lives and in our science. In short, it works for us in a modern society, and that is fully reflected in the focus of our lives. Power, the expenditure of energy over time, has become the goal of man and of learning. Good clocks are important in business and in the synchronization of the GPS satellites that require them for establishing position on the planet.
  
Event-driven (aka on-demand) computing can be substantially less expensive than fully synchronous chips.<ref>Samuel Greengard, ''Neuromorphic Chips take Shape'' '''CACM 63''' No. 8 (2020-08) p. 9-11</ref> While some modern chips selectively turn cores on or off as they are needed for computations, the core capability of the chip is powered full time. One example of event-driven design is the IBM TrueNorth chip, which consumes only 70 mW of power while emulating the asynchronous and neural design of the human brain.<ref>P. A. Merolla + 12, ''A million spiking-neuron integrated circuit with a scalable communication network and interface'' Science '''345''' (6197): 668–73 (2014) doi:10.1126/science.1254642</ref> Intel followed with their Loihi chip.<ref>Kyung M. Song + 11, ''Skyrmion-based artificial synapses for neuromorphic computing'' Nature Electronics '''3''' (3): 148–155 (2020-03) arXiv:1907.00957 doi:10.1038/s41928-020-0385-0 https://en.wikichip.org/wiki/intel/loihi</ref>
  
 
Asynchronous (self-clocking) computer circuits are imagined to be low-power implementations as compared to synchronous (centralized clock) chip designs. About 700 million of us (in 2015) had one of these asynchronous computing devices in our passport or other portable ID card, so that they can function flawlessly on the energy created by waving the chip card over the reader.<ref>Steven M. Nowick, ''Asynchronous Design -- Part 1: Overview and Recent Advances'' (2015-05) IEEE Design and Test 32(3):5-18 DOI: 10.1109/MDAT.2015.2413759 https://www.researchgate.net/publication/331181563_Asynchronous_Design_--_Part_1_Overview_and_Recent_Advances</ref> As chips become larger and compute cores focus on particular parts of the compute load, some level of asynchrony appears to be the only viable integration technique. Often each core type will have its own clock and communicate asynchronously with other cores. But for high-speed digital links, the communications clock is established first and then all following data is synchronized to that clock. In short, there is growing interest in improving reliability and speed with asynchronous chip designs and interconnects, but all long-distance digital communications links appear to be destined to be clocked between the sender and the receiver. Most communications protocols blend the data and the clock into a single protocol. For multiple-access network protocols, asynchronous operation makes sense. For point-to-point protocols, synchronous operation appears to be the most effective since clock synchronization can be maintained at all times.
==Life Cycle of a Photon==
A [[Photon]] is born as an event when an electron loses some quantum of energy. It spends its entire life looking for a fulfilling way to die. It can be viewed as a computing device with a limited amount of storage and a limited computing capacity. It looks "forward" in space to the full extent of its computing capacity and remembers as much about its past as its storage will accommodate (aka the state vector). In the absence of any other matter, its computing capacity allows it to travel "at the speed of light", thus creating the concept of time as the inverse of the distance traveled in empty space during the compute cycle. (Time can thus be measured outside of the photon. Inside of the photon there is no memory of distance traversed, and hence no knowledge of time.) When more matter is around, more compute power is consumed in understanding the effect of that matter on the path length. This allows the photon to look forward a much shorter distance, and so it appears to go slower. In some cases the photon can be trapped in a place where it is unable to complete a computation cycle about where it wants to go, and so it appears to stop. So a photon appears to be just a Turing machine of limited capacity and storage, just like your own smartphone.
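The photon-as-computer picture above can be caricatured in a few lines of code. This is purely a toy model of this article's speculation, not established physics; the function name, the unit compute budget, and the "matter cost" numbers are all invented for illustration:

```python
# Toy model of the photon-as-Turing-machine speculation above: a photon has a
# fixed compute budget per cycle, and nearby matter consumes part of it,
# shrinking how far "forward" it can look in one cycle.
def apparent_speed(compute_budget: float, matter_cost: float) -> float:
    """Distance covered per compute cycle; full budget corresponds to c."""
    return max(compute_budget - matter_cost, 0.0)

vacuum = apparent_speed(1.0, 0.0)    # empty space: full budget, "speed of light"
medium = apparent_speed(1.0, 0.33)   # matter consumes budget: appears slower
trapped = apparent_speed(1.0, 1.0)   # cannot finish a cycle: appears to stop
print(vacuum, medium, trapped)
```

The point of the sketch is only that a single "budget" parameter reproduces the three behaviors the paragraph describes: full speed in vacuum, slower in a medium, and stopped when trapped.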
Bose-Einstein statistics produce accurate results because photons are indistinguishable from each other; one cannot treat any two photons having equal quantum numbers (e.g., polarization and momentum vector) as being two distinct identifiable photons. By analogy, if in an alternate universe coins were to behave like photons, the probability of producing two heads would be one-third, and so would the probability of getting a head and a tail (which equals one-half for conventional, distinguishable coins). Which is another way to say that only the outcome of an experiment can be statistically evaluated, not the path taken by an individual photon.
"In ordinary classical physics, the typical setup is to imagine that definite things happen, and that in a sense every system follows a definite thread of behavior through time. But the key idea of quantum mechanics is to imagine that many threads of possible behavior are followed, with a definite outcome being found only through a measurement made by an observer."<ref>Stephen Wolfram, ''The Wolfram Physics Project: A One-Year Update'' (2021-04-24) https://writings.stephenwolfram.com/2021/04/the-wolfram-physics-project-a-one-year-update/</ref> This quote from Stephen Wolfram was based on graph theory, in contrast to this paper's focus on discrete events being the only "real" thing of the universe. So, while events can, in principle, be determined, the same is not true for the position and speed of any particle at any discrete time. For a broader discussion see the wiki page [[Discrete or Continuous]].
===Some History===
In other words, we really knew all of this before, but just couldn't accept the facts in front of our faces.
*Aristarchus of Samos (c. 310 – c. 230 BC) was an ancient Greek astronomer and mathematician who presented the first known heliocentric model, which placed the Sun at the center of the known universe with the Earth revolving around it. He was influenced by Philolaus of Croton, but Aristarchus identified the "central fire" with the Sun, and he put the other planets in their correct order of distance around the Sun. Like Anaxagoras before him, he suspected that the stars were just other bodies like the Sun, albeit farther away from Earth. His astronomical ideas were mostly ignored in favor of the geocentric theories of Aristotle and Ptolemy. However, Nicolaus Copernicus did attribute the heliocentric theory to Aristarchus.<ref>George Kish, ''A Source Book in Geography'' (1978) Harvard University Press. p. 51. ISBN 978-0-674-82270-2</ref>
*Quote: "The [[Principle of Least Action]] just says that a particle sniffs out every possible path and takes the one that results in the least action."<ref name=Arkani-Hamed>Nima Arkani-Hamed, SUSY Conference YouTube (2018-08-30) https://www.youtube.com/watch?v=xNVZg694ct8</ref> The action principle is preceded by earlier ideas in optics. In ancient Greece, Euclid wrote in his ''Catoptrica'' that, for the path of light reflecting from a mirror, the angle of incidence equals the angle of reflection. Hero of Alexandria (c. 10 AD – c. 70 AD) later showed that this path was the one of shortest length and least time.<ref>Morris Kline, ''Mathematical Thought from Ancient to Modern Times'' New York: Oxford University Press (1972) pp. 167–68. ISBN 0-19-501496-0</ref>
* During the early part of the 20th century, Minkowski created the canonical model of the space-time continuum and did foundational work on convex solutions.<ref>Amir Beck, ''Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications'' https://archive.siam.org/books/mo19/</ref> The idea of "suitably convex" solutions was learned from the Arkani-Hamed<ref name=Arkani-Hamed /> talk referenced in the point above. Convex Optimization has evolved new methods, but the goal is consistent.<ref>Boyd and Vandenberghe, ''Convex Optimization'' Cambridge U. Press (2004) http://www.stanford.edu/~boyd/cvxbook/</ref>
* In Feynman's thesis (1942) he took Wheeler's advice to consider both the retarded and advanced contributions to the Lagrangian, which is based on the [[Principle of Least Action]]. That seems to imply that the location of a particle in both time and space is indeterminate in the most fundamental sense. The only events that can be located in time and space seem to be the emission and absorption of the particle.<ref>Richard Feynman, ''Principles of Least Action in Quantum Physics'' Princeton (1942) https://cds.cern.ch/record/101498/files/Thesis-1942-Feynman.pdf</ref>
  
 
==How Does Time Evolve?==
 
One possibility is to look at the differences between different photons. So far as we know, photons have energy and a projection of the "spin" on the direction of momentum (helicity, with values of positive and negative). The term spin is just a physicist's misuse of the term. Spin might be more appropriate for a determination of the energy of the photon, which is ϵ = hν, where ϵ is the energy and ν is the frequency, and which Feynman described as the rotation of a disk, allowing him to demonstrate interference to a lay audience. So a useful model would be that the rate of the rotation is ν and the direction of rotation is what physicists call spin.
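The relation ϵ = hν is simple to evaluate numerically. A sketch for a single photon of green light (the 500 nm wavelength is an illustrative choice):

```python
# Energy of a single photon via the Planck relation: epsilon = h * nu.
h = 6.62607015e-34   # Planck constant, J*s (exact in SI since 2019)
c = 299_792_458.0    # speed of light, m/s
wavelength = 500e-9  # green light, m (illustrative choice)

nu = c / wavelength              # frequency, Hz (~6e14 for visible light)
epsilon = h * nu                 # photon energy, J
ev = epsilon / 1.602176634e-19   # same energy in electron-volts
print(f"nu = {nu:.3e} Hz, epsilon = {epsilon:.3e} J ({ev:.2f} eV)")
```

A visible photon thus carries a few times 10<sup>-19</sup> J, i.e. a couple of electron-volts, which is the energy scale of the electronic transitions that emit and absorb it.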
  
To be addressed: What about polarization? We need to accommodate both linear and circular polarization of photons; probably we just need to add a term to the photon state vector.
 
 
 
  
 
==Wave Function of the Photon==
  
 
Wave function for the photon<ref>I. Bialynicki-Birula, ''On the Wave Function of the Photon'' (1994) Acta Physica Polonica A http://old.cft.edu.pl/~birula/publ/APPPwf.pdf</ref>
===Interference===
 
All electromagnetic waves can be superimposed upon each other without limit. The electric and magnetic fields simply add at each point. If two waves with the same frequency are combined there will be a constant interference pattern caused by their superposition. Interference can either be constructive, meaning the strength increases as a result, or destructive, where the strength is reduced. The amount of interference depends on the phase difference at a particular point. It can be shown that constructive interference occurs for phase differences of 0°–120° and 240°–360°, while destructive interference occurs from 120°–240°. For two identical waves, no phase shift results in total constructive interference, where the strength is maximum, and a 180° phase shift will create total destructive interference (no signal at all).
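The 120°/240° boundaries quoted above follow from the standard result that two equal-amplitude waves with phase difference Δφ sum to an amplitude 2A|cos(Δφ/2)|; the combined wave is stronger than a single wave exactly when Δφ is within 120° of zero. A short sketch:

```python
import math

def resultant_amplitude(a: float, delta_phi_deg: float) -> float:
    """Amplitude of the sum of two equal-amplitude waves with a phase offset:
    2*a*|cos(delta_phi/2)|, from the sum-to-product identity for cosines."""
    return 2 * a * abs(math.cos(math.radians(delta_phi_deg) / 2))

# Below 120° the sum exceeds one wave's amplitude (constructive); at exactly
# 120° it equals one wave; between 120° and 240° it is weaker (destructive).
for dphi in (0, 60, 120, 180, 240, 300):
    print(dphi, round(resultant_amplitude(1.0, dphi), 3))
```

Note 0° gives amplitude 2 (total constructive) and 180° gives 0 (total destructive), matching the limiting cases in the paragraph.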
Comments on this formula:
* The symbol ~ is used to indicate an approximate result, as opposed to an exact relationship. This result is only valid for relatively small angles, something less than about 20°.
 
* There is some ambiguity in what is meant by this angle. In actuality, the wave does not simply end at this boundary, but falls off smoothly. The exact point which defines the extent of the wave is a matter of definition, and there are two standard conventions: the one used here is the ½-power or -3 dB definition.
 
* The factor of 2 is also an approximation. A more accurate description requires detailed knowledge about the shape of the aperture.
 
* The angle θ is measured in radians!
===Photon polarization===
 
Photon polarization is the quantum mechanical description of the classical polarized sinusoidal plane electromagnetic wave. An individual photon can be described as having right or left circular polarization, or a superposition of the two. Equivalently, a photon can be described as having horizontal or vertical linear polarization, or a superposition of the two.
 
The description of photon polarization contains many of the physical concepts and much of the mathematical machinery of more involved quantum descriptions, such as the quantum mechanics of an electron in a potential well. Polarization is an example of a qubit degree of freedom, which forms a fundamental basis for an understanding of more complicated quantum phenomena. Much of the mathematical machinery of quantum mechanics, such as state vectors, probability amplitudes, unitary operators, and Hermitian operators, emerge naturally from the classical Maxwell's equations in the description. The quantum polarization state vector for the photon, for instance, is identical with the Jones vector, usually used to describe the polarization of a classical wave. Unitary operators emerge from the classical requirement of the conservation of energy of a classical wave propagating through lossless media that alter the polarization state of the wave. Hermitian operators then follow for infinitesimal transformations of a classical polarization state.
  
==Evolving Transformations System==
+
==The one Photon Hypothesis==
A similar event-based model was created path the university of New Brunswick. The class of processes, i.e. similarly structured processes—with a common generative structure— pervades all levels of consideration in ETS. In particular, the most basic concept, the primitive event, is defined via the classes of primal processes.<ref>L. Goldfarb + 4, ''What is a structural representation? Proposal for an event-based representational formalism, Sixth Variation, Tech. Rep. TR07-187'' (2007) Faculty of Computer Science, University of New Brunswick http://www.cs.unb.ca/~goldfarb/ETSbook/ETS6.pdf</ref> The ETS then go on to create a collection of primitive events, or transformations that are essentially state machines with a fixed set of inputs and outputs. The model assumes a time sequence for each primitive. In other words there is a predefined partial ordering of primitives where events occur, or not.
+
 
 +
Event: a bundle of energy emitted from an electron in an atom with particular direction or momentum vector and a spin (+ or -). The polarization will be ignored for the moment.
 +
 
 +
State: Point of no dimension at the place where it is emitted. The model is that of a compute engine with strictly limited memory and compute power.
 +
 
 +
Process: Look forward along the direction of momentum 2r units. Examine all space centered at r with a radius r and volume 4/3  π r looking for anything that might absorb the energy. For this cycle only the emitter is considered to be a potential absorber.
 +
 
 +
Goal: To find the shortest path. If should be obvious given dispersion or the two-slit experiment or other effects that the goal is not always achieved. This can be attributed to the limited processing capacity of the photon which gives the probabilistic results that we see.
 +
 
 +
State: no absorber found:
 +
 
 +
Process Look forward 3r units expending the same resources used to look at pervious space, but do not review anything in the circle examined in the previous step.
 +
 
 +
Create a formula for the radius of the circles created. This should match dispersion.
 +
 
 +
==The two Photon Hypothesis==
 +
For time to exist photons should be emitted (nearly) always in [[Entangled]] pairs. Before you get to hung up on reality, engage with me in a little suspension of disbelief.
 +
 
 +
These models assumes a time sequence for each primitive. In other words there is a predefined partial ordering of primitives where events occur, or not.
 +
 
 +
==Purpose==
 +
 
 +
Does.the universe have a purpose? Is it just a gigantic computer computing something like the meaning as suggested by The Hitchhiker's Guide suggests?
  
==Hypothesis==
 
For time to exist photons should be emitted (nearly) always in entangled pairs. Before you get to hung up on reality, engage with me in a little suspension of disbelief.
 
  
 
==Summary==
 
==Summary==
 +
 
There should no longer be any argument whether the universe is a computer simulation or not. Forget about it. The only significant difference between the universe and a computer is that the scope of useful computers is small enough that we can figure out how to program it to get useful answers. So go forth and make your own model universe. You have all of the capability you need on your desktop, with a little help from the cloud.
 
There should no longer be any argument whether the universe is a computer simulation or not. Forget about it. The only significant difference between the universe and a computer is that the scope of useful computers is small enough that we can figure out how to program it to get useful answers. So go forth and make your own model universe. You have all of the capability you need on your desktop, with a little help from the cloud.
 +
 +
==Appendix==
 +
 +
===Evolving Transformations System (ETS)===
 +
Event based solutions have been explored by philosophers like Alfred North Whitehead, <ref name=whitehead> Alfred North Whitehead,  ''Process and Reality. An Essay in Cosmology. Gifford Lectures Delivered in the University of Edinburgh During the Session 1927–1928'', (1929) Macmillan, New York, Cambridge University Press, Cambridge UK. </ref> and John Heil<ref> John Heil, ''The Universe as we find it'' Oxford (2015-06-01) ISBN 978-0198738978</ref>. This has been primarily an Ontological investigation with little in the way of physical details. Heil complains that since substance carries properties, but not events, that events are not a suitable ontology for physics. He seems not to be aware of the use of attributes of links in computer science which when faced with a deficient ontology, just changed the ontology.
 +
 +
A similar event-based model was created at the university of New Brunswick. The class of processes, i.e. similarly structured processes—with a common generative structure— pervades all levels of consideration in ETS. In particular, the most basic concept, the primitive event, is defined via the classes of primal processes.<ref>L. Goldfarb + 4, ''What is a structural representation? Proposal for an event-based representational formalism, Sixth Variation, Tech. Rep. TR07-187'' (2007) Faculty of Computer Science, University of New Brunswick http://www.cs.unb.ca/~goldfarb/ETSbook/ETS6.pdf</ref> The ETS then go on to create a collection of primitive events, or transformations that are essentially state machines with a fixed set of inputs and outputs. According to the ETS formalism, photon is not a "particle" but is a (spatially instantiated) stream of structured events. The last sentence applies equally well to the models described here. However, the event processing is quite different.
  
 
==References==
 
==References==
  
 
[[Category: Article]]
 
[[Category: Article]]
 +
[[Category: Physics]]
 +
[[Category: Model]]
 
[[Category: Ecosystem]]
 
[[Category: Ecosystem]]

Latest revision as of 17:40, 1 March 2024

Full Title

The Event-driven Universe is a model focused on the events that happen among objects in the universe.

Author

Thomas C. Jones

Date

2020-12-15; additional details added 2022-10-15.

Abstract

Models of the universe typically focus on the extremes of very large or very small elements, like collisions of galaxies or of photons and electrons. These are both events between identified objects, which are called entities in computer science. But the existing standard models are created to establish a set of "facts" about one particular mathematical set of assumptions like continuous and well-defined space and time. In these models constructs like the Lagrangian or Hamiltonian are precise and deterministic. In contrast this article will focus on information from the observed events themselves, rather than the commonly accepted assumptions, at least so far as time is concerned. That means that no mathematical formulas that include time as a variable will be assumed to be a fundamental truth. Instead time will arise from a sequence of events as postulated by Whitehead.[1] The major theme of this article is that if physics wants to learn about how self-organization can provide answers to open issues, then the best place to look is at places where evolution has been successfully applied. As one example: machine learning has already been enhanced by acknowledging that evolution can bring new techniques and insights to hard technological challenges.

The Model

"Scientists work from models acquired through education ... often without quite knowing or needing to know what characteristics have given these models the status of community paradigms" [2] Some, like Boltzmann, Einstein and Feynman, thought outside of the models that they were given and created new models, resulting in the Paradigm Shift that Thomas Kuhn described in the book from which that quote was taken. Breaking old models can be difficult: lack of acceptance and an incurable disease led Boltzmann to commit suicide.

  • Euclidean three space will be assumed for its simplicity.
  • There will be two event models addressed: (these can be thought of as Feynman diagrams if that is helpful)
  1. the emission of a single photon by an excited electron and its subsequent absorption by another electron
  2. the emission of an Entangled photon pair and the subsequent absorption of one of those photons.

The goal is to create a partial ordering of events and then determine if it is possible to use this partial ordering of events to create a timeline that can become continuous in the limit. Each partial ordering would be local to the events and so would naturally lead to a relative timeline for each locality.
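The program sketched above — derive a timeline from nothing but a partial ordering of events — can be illustrated in code. This is a toy model (the event names and causal graph are hypothetical): each event's "tick" is the length of the longest causal chain leading to it, so each chain of events carries its own relative time.

```python
from collections import defaultdict, deque

def local_timeline(events, causes):
    """Assign each event a 'tick': the length of the longest causal
    chain leading to it.  `causes` maps an event to the set of events
    that must precede it (the partial ordering)."""
    succ = defaultdict(set)
    indeg = {e: 0 for e in events}
    for e, priors in causes.items():
        for p in priors:
            succ[p].add(e)
            indeg[e] += 1
    tick = {e: 0 for e in events}
    ready = deque(e for e in events if indeg[e] == 0)
    while ready:                      # topological sweep over the order
        e = ready.popleft()
        for s in succ[e]:
            tick[s] = max(tick[s], tick[e] + 1)   # longest-chain depth
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return tick

# Two causal chains (a->b->d and a->c->d) that meet only at d:
# b and c get the same tick without any claim of global simultaneity.
ticks = local_timeline(["a", "b", "c", "d"],
                       {"b": {"a"}, "c": {"a"}, "d": {"b", "c"}})
```

In the limit of many fine-grained events, such per-locality tick counts are the sort of thing that could be smoothed into a continuous local time coordinate.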

General Semantics

This label is fully described in the wiki page on Semantics. It can be summarized as a description of how events translate into observations which are then pigeon-holed into existing names and labels that are known to scientists from their education and experience. What is really needed is a fresh look at events that is unencumbered by existing taxonomies.

Axioms

It certainly was not clear whether the universe could really be as strange a place as early laboratory experiments appeared to show. Bit by bit physicists pieced together the results of experiments and found rules obeyed by matter. In 1936 Birkhoff and Von Neumann collected the rules together into the accepted axioms of quantum mechanics. Birkhoff and Von Neumann showed that the axioms of Quantum Mechanics provide a consistent framework in which it is once again possible to predict the results of experiments, at least statistically. But, although these laws are mathematically consistent, most scientists agree that they are counterintuitive, and do not have any satisfactory known physical interpretation.

Graphiti

Kant has been summarized as saying: "Time exists in order that everything doesn't happen all at once . . . and space exists so that it doesn't all happen to you." This is a discussion of an event, or a "happening".

The Process

All of the physical reality that we know on this world was created as a process: a series of events followed by the adaptations of the climate and the biome to the new realities. In his book "The Self-organizing Universe" Erich Jantsch[3] proposes that the formation of the universe itself is a process that we are only now beginning to comprehend. This wiki has more at the Self-organization of identity ecosystems, but the general structure is the same: chaos leads inexorably to order. While even the Nobel Laureate Murray Gell-Mann[4] describes the concept of Self-organization, he does not use it in creating physical models. Perhaps we can fill in some of that here.

Irreversibility and Power Consumption

As the Lagrangian is designed to find the path with the least power transfer (as measured by the rate of change of the difference between potential and kinetic energy), so too does the attempt to find low-power computing solutions seek to minimize energy dissipation.

The problem with these models is that they lead to the static Wheeler-DeWitt equation (Ĥ(x)|ψ> = 0), which tells us that the universe does not evolve.[5] Many find that to be a fatal flaw of physics: it simply does not correspond in any way to reality.

Quantum computing is teaching us more about computing and about quantum mechanics, but it is not at all clear where this new knowledge is taking us. In the following we will investigate computing elements that consume something, but it is not clear what; perhaps it is the precursor to entropy. This proposition will not depend on existing knowledge, but rather just explore what is possible if we are not completely bound by accepted knowledge. What is created in this endeavor is time itself.

Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of Computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two Computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment".[6]
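Landauer's bound can be made concrete with a few lines of arithmetic (the temperature chosen here is just an illustrative assumption):

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact, SI 2019)
T = 300.0                   # assume room temperature, K

# Minimum energy dissipated by erasing one bit: k_B * T * ln 2
energy_per_bit = k_B * T * math.log(2)      # ~2.87e-21 J

# Erasing a gigabyte (8e9 bits) at the Landauer limit costs ~2.3e-11 J,
# many orders of magnitude below what real hardware dissipates today.
energy_per_gigabyte = energy_per_bit * 8e9
```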

Charles H. Bennett at IBM taught us that computing does not need to consume much power.

The Direction of Time

Fundamental to the idea of irreversibility is that time flows from the past to the future. Quantum Mechanics does not appear to predict that one-directional flow. Leonard Susskind has proposed fractal flows as a determinant of time's arrow.[7]

Problem Solving

Physics has focused on the problem of getting from point A to point B. For example, in determining the function of a machine we start with the first and second laws of thermodynamics. But in the case of the light beam we focus on the path with the "least action". In the first case we are dealing with something large, the machine, while in the second case we are dealing with something quite small, a photon. Similarly, in the former the Hamiltonian (total energy) is kept constant, while in the second the Lagrangian (fungibility of energy) is kept to a minimum and the Hamiltonian does not care how the energy is distributed or changed in form. Landauer's principle is similar: just keep the fungibility of energy at a minimum and the computational power is unlimited, although the computation might take a long time.

Is the Universe a Computer?

What is the universe computing? As far as we can tell, it is not producing a single answer to a single question.... Instead the universe is computing itself. Powered by Standard Model software, the universe computes quantum fields, chemicals, bacteria, human beings, stars, and galaxies. As it computes, it maps out its own spacetime geometry to the ultimate precision allowed by the laws of physics. Computation is existence.[8]

What Purpose do Clocks Serve

It is informative to consider why clocks are used in computer systems and to see that there are lessons to be learned. In a mechanized society where the time value of money is of the essence of every financial transaction, it has become a socially acknowledged fact that time management of every aspect of our lives is the goal for producing the greatest output of goods and services for the least expenditure of our precious time. Good timekeeping has become important in our lives and in our science. In short, it works for us in a modern society, and that is fully reflected in the focus of our lives. Power, the expenditure of energy over time, has become the goal of industry and of learning. Good clocks are important in business and in the synchronization of the GPS satellites that are required for establishing position on the planet.

Event-driven (aka on-demand) computing can be substantially less expensive than fully synchronous chips.[9] While some modern chips selectively turn cores on or off as they are needed for computations, the core capability of the chip is powered full time. One example of event-driven design is the IBM TrueNorth chip, which consumes only 70 mW of power while emulating the asynchronous neural design of the human brain.[10] Intel followed with their Loihi chip.[11]

Asynchronous (self-clocking) computer circuits are expected to be lower-power implementations compared to synchronous (centralized-clock) chip designs. About 700 million of us (in 2015) had one of these asynchronous computing devices in our passport or other portable ID card, so that it could function flawlessly on the energy captured by waving the chip card over a reader.[12] As chips become larger and compute cores focus on particular parts of the compute load, some level of asynchrony appears to be the only viable integration technique. Often each core type will have its own clock and communicate asynchronously with other cores. But for high speed digital links, the communications clock is established first and then all following data is synchronized to that clock. In short, there is a growing interest in improving reliability and speed with asynchronous chip designs and interconnects, but all long-distance digital communications links appear to be destined to be clocked between the sender and the receiver. Most communications protocols blend the data and the clock into a single protocol. For multiple-access network protocols, asynchrony makes sense. For point-to-point protocols, synchrony appears to be the most effective since clock synchronization can be maintained at all times.

Life Cycle of a Photon

A Photon is born as an event when an electron loses some quantum of energy. It spends its entire life looking for a fulfilling way to die. It can be viewed as a computing device with a limited amount of storage and a limited computing capacity. It looks "forward" in space to the full extent of its computing capacity and remembers as much about its past as its storage will accommodate (aka the state vector). In the absence of any other matter, its computing capacity allows it to travel "at the speed of light", thus creating the concept of time as the inverse of the distance traveled in empty space during the compute cycle. (Time can thus be measured outside of the photon. Inside of the photon there is no memory of distance traversed, and hence no knowledge of time.) When more matter is around, more compute power is consumed with understanding the effect of that extra computing effort on the path length. This allows the photon to look forward a much shorter distance, and so it appears to go slower. In some cases the photon can be trapped into a place where it is unable to complete a computation cycle about where it wants to go, and so it appears to stop. So a photon appears to be just a Turing machine of limited capacity and storage, just like your own smart phone.

Bose-Einstein statistics produce accurate results because photons are indistinguishable from each other: one cannot treat any two photons having equal quantum numbers (e.g., polarization and momentum vector) as being two distinct identifiable photons. By analogy, if in an alternate universe coins were to behave like photons, the probability of producing two heads would be one-third, and so would the probability of getting a head and a tail (which equals one-half for the conventional, distinguishable coins). Which is another way to say that only the outcome of an experiment can be statistically evaluated, not the path taken by an individual photon.
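The coin analogy can be checked by direct enumeration. This sketch (a deliberately unphysical toy) contrasts classical counting, where ordered outcomes are equally likely, with Bose-Einstein counting, where each unordered state is equally likely:

```python
from fractions import Fraction
from itertools import product

# Classical, distinguishable coins: four equally likely ordered outcomes.
ordered = list(product("HT", repeat=2))               # HH, HT, TH, TT
p_hh_classical = Fraction(ordered.count(("H", "H")), len(ordered))
p_mixed_classical = Fraction(
    sum(1 for o in ordered if set(o) == {"H", "T"}), len(ordered))

# Photon-like coins: only the unordered pair is a physical state, and
# Bose-Einstein counting makes each such state equally likely.
states = sorted({tuple(sorted(o)) for o in ordered})  # HH, HT, TT
p_hh_boson = Fraction(1, len(states))
p_mixed_boson = Fraction(1, len(states))
```

As the text says: two heads has probability 1/4 classically but 1/3 for boson-like coins, and a mixed result drops from 1/2 to 1/3.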

"In ordinary classical physics, the typical setup is to imagine that definite things happen, and that in a sense every system follows a definite thread of behavior through time. But the key idea of quantum mechanics is to imagine that many threads of possible behavior are followed, with a definite outcome being found only through a measurement made by an observer."[13] This quote from Stephen Wolfram was based on graph theory, in contrast to this paper's focus on discrete events being the only "real" thing of the universe. So, while events can, in principle, be determined, the same is not true for the position and speed of any particle at any discrete time. For a broader discussion see the wiki page Discrete or Continuous.

Some History

In other words, we really knew all of this stuff before, but just couldn't accept the facts in front of our faces.

  • Aristarchus of Samos (c. 310 – c. 230 BC) was an ancient Greek astronomer and mathematician who presented the first known heliocentric model that placed the Sun at the center of the known universe with the Earth revolving around it. He was influenced by Philolaus of Croton, but Aristarchus identified the "central fire" with the Sun, and he put the other planets in their correct order of distance around the Sun. Like Anaxagoras before him, he suspected that the stars were just other bodies like the Sun, albeit farther away from Earth. His astronomical ideas were mostly ignored in favor of the geocentric theories of Aristotle and Ptolemy. However, Nicolaus Copernicus did attribute the heliocentric theory to Aristarchus.[14]
  • Quote "The Principle of Least Action just says that a particle sniffs out every possible path and takes the one that results in the least action."[15] The action principle is preceded by earlier ideas in optics. In ancient Greece, Euclid wrote in his Catoptrica that, for the path of light reflecting from a mirror, the angle of incidence equals the angle of reflection. Hero of Alexandria (c. 10 AD – c. 70 AD) later showed that this path was the shortest length and least time.[16]
  • During the early part of the 20th century, Minkowski created the canonical model of the time-space continuum and did foundational work on convex bodies.[17] The idea of "Suitably Convex" solutions was learned from the Arkani-Hamed[15] paper referenced in the point above. Convex Optimization has evolved new methods, but the goal is consistent.[18]
  • In Feynman's thesis (1942) he took Wheeler's advice to consider both the retarded and advanced contributions to the Lagrangian, which is based on the Principle of Least Action. That seems to imply that the location of a particle in both time and space is indeterminate in the most fundamental sense. The only events that can be located in time and space seem to be the emission and absorption of the particle. [19]

How Does Time Evolve?

In asynchronous event handling, a rough sense of time arises from the delay imposed by completing the processing of any event that blocks progress on the overall program. We see that recorded all of the time in computer science as operations per second. These metrics are purely dimensionless quantities, as the operation itself is a determinant of time. If we apply that to the propagation of light, it seems that even in the vacuum of space some event processing is the limiting factor. A model for that event is what needs to be developed. If the model can show the impact of operation in a non-inertial frame, it would be able to merge quantum physics with general relativity.

One possibility is to look at the differences between different photons. So far as we know, photons have energy and a projection of the "spin" onto the direction of momentum (helicity, with values of positive and negative). The term spin is arguably a physicist's misuse of the word. Spin might be more appropriate for a determination of the energy of the photon, which is ϵ = hν, where ϵ is the energy and ν is the frequency, and which Feynman described as the rotation of a disk, allowing him to demonstrate interference to a lay audience. So a useful model would be that the rate of the rotation is ν and the direction of rotation is what physicists call spin.
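The relation ϵ = hν is simple enough to evaluate directly; the wavelength below is just an illustrative choice:

```python
# Photon energy epsilon = h * nu, with nu = c / lambda.
h = 6.62607015e-34      # Planck constant, J*s (exact, SI 2019)
c = 299792458.0         # speed of light, m/s (exact)

wavelength = 532e-9     # assume a green laser pointer, 532 nm
nu = c / wavelength     # frequency, ~5.64e14 Hz
epsilon = h * nu        # ~3.73e-19 J, about 2.33 eV
```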

To be addressed: What about polarization? Need to accommodate linear and circular polarization of photons. Probably just need to add term to the photon state vector.

Wave Function of the Photon

It seems that all time begins with the photon, the transport of quanta of energy.

Wave function for the photon[20]

Interference

All electromagnetic waves can be superimposed upon each other without limit. The electric and magnetic fields simply add at each point. If two waves with the same frequency are combined there will be a constant interference pattern caused by their superposition. Interference can either be constructive, meaning the strength increases as a result, or destructive, where the strength is reduced. The amount of interference depends on the phase difference at a particular point. It can be shown that constructive interference occurs for phase differences of 0–120° and 240–360°; thus destructive interference occurs from 120–240°. For two identical waves, no phase shift results in total constructive interference, where the strength is maximum, and a 180° phase shift will create total destructive interference (no signal at all).

Δφ = 2πΔx/λ
Δφ = 2πΔt/T

The phase shift that causes the interference can occur either due to a difference in path length, Δx, or a difference in the arrival time, Δt.
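These relations are easy to put into code. A minimal sketch (the function names are ours): the resultant of two equal-amplitude waves is 2A₀|cos(Δφ/2)|, with Δφ obtained from the path-length difference:

```python
import math

def phase_from_path(dx, wavelength):
    """Phase difference from a path-length difference: 2*pi*dx/lambda."""
    return 2 * math.pi * dx / wavelength

def superpose(a0, dphi):
    """Resultant amplitude of two equal waves with phase difference dphi."""
    return 2 * a0 * abs(math.cos(dphi / 2))

# Total constructive (0 deg), boundary (120 deg), total destructive (180 deg):
full = superpose(1.0, 0.0)                      # 2.0
edge = superpose(1.0, math.radians(120))        # 1.0, strength of one wave
none = superpose(1.0, math.radians(180))        # ~0.0

# A half-wavelength path difference gives the 180-degree destructive case:
dphi = phase_from_path(266e-9, 532e-9)          # = pi
```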

Diffraction

The idealized plane wave is actually infinite in extent. If this wave passes through an aperture, it will diffract, or spread out, from the opening. The degree to which the cropped wave will spread out depends on the size of the aperture relative to the wavelength. In the extreme case where the aperture is very large compared to the wavelength, the wave will see no effect and will not diffract at all. At the other extreme, if the opening is very small, the wave will behave as if it were at its origin and spread out uniformly in all directions from the aperture. In between, there will be some degree of diffraction.

First consider a circular aperture. If a wave with wavelength λ encounters an opening with diameter D, the amount of diffraction, as measured by the angle θ at which the new wave diverges from the opening, measured from edge to edge, will be approximated by θ ~ 2λ/D

Comments on this formula:

  • The symbol ~ is used to indicate an approximate result, as opposed to an exact relationship. This result is only valid for relatively small angles, something less than about 20°.
  • There is some ambiguity in what is meant by this angle. In actuality, the wave does not simply end at this boundary, but falls off smoothly. The exact point which defines the extent of the wave is a matter of definition and there are two standard conventions: the one used here is the ½ power or -3 dB definition.
  • The factor of 2 is also an approximation. A more accurate description requires detailed knowledge about the shape of the aperture.
  • The angle θ is measured in radians!
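A quick numerical check of the edge-to-edge formula (the wavelength and aperture below are illustrative assumptions, and the factor of 2 carries the caveat noted in the comments above):

```python
import math

def diffraction_angle(wavelength, aperture_diameter):
    """Approximate full divergence angle (radians) for a circular
    aperture, theta ~ 2*lambda/D; only valid for small angles."""
    return 2 * wavelength / aperture_diameter

# Green light (532 nm) through a 0.1 mm pinhole:
theta = diffraction_angle(532e-9, 1e-4)     # ~0.0106 rad
theta_deg = math.degrees(theta)             # ~0.61 degrees, safely "small"
```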

Photon polarization

Photon polarization is the quantum mechanical description of the classical polarized sinusoidal plane electromagnetic wave. An individual photon can be described as having right or left circular polarization, or a superposition of the two. Equivalently, a photon can be described as having horizontal or vertical linear polarization, or a superposition of the two.

The description of photon polarization contains many of the physical concepts and much of the mathematical machinery of more involved quantum descriptions, such as the quantum mechanics of an electron in a potential well. Polarization is an example of a qubit degree of freedom, which forms a fundamental basis for an understanding of more complicated quantum phenomena. Much of the mathematical machinery of quantum mechanics, such as state vectors, probability amplitudes, unitary operators, and Hermitian operators, emerge naturally from the classical Maxwell's equations in the description. The quantum polarization state vector for the photon, for instance, is identical with the Jones vector, usually used to describe the polarization of a classical wave. Unitary operators emerge from the classical requirement of the conservation of energy of a classical wave propagating through lossless media that alter the polarization state of the wave. Hermitian operators then follow for infinitesimal transformations of a classical polarization state.
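The machinery named in this paragraph — Jones vectors, unitary operators, probability amplitudes — fits in a few lines. A sketch using one common sign convention for the circular states (conventions vary) and a quarter-wave plate as the example unitary:

```python
import numpy as np

# Jones vectors: linear basis |H>, |V>; circular states as superpositions.
H = np.array([1, 0], dtype=complex)
V = np.array([0, 1], dtype=complex)
R = (H - 1j * V) / np.sqrt(2)      # right circular (one convention)
L = (H + 1j * V) / np.sqrt(2)      # left circular

# Quarter-wave plate, fast axis horizontal, as a unitary operator
# (lossless medium => energy conservation => unitarity):
QWP = np.array([[1, 0], [0, 1j]], dtype=complex)
unitary = np.allclose(QWP.conj().T @ QWP, np.eye(2))   # True

# Probability amplitude and Born rule: measuring H on a circular state
amp = np.vdot(H, R)                 # amplitude 1/sqrt(2)
p_H = abs(amp) ** 2                 # probability 1/2
```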

The one Photon Hypothesis

Event: a bundle of energy emitted from an electron in an atom, with a particular direction (momentum vector) and a spin (+ or −). The polarization will be ignored for the moment.

State: Point of no dimension at the place where it is emitted. The model is that of a compute engine with strictly limited memory and compute power.

Process: Look forward along the direction of momentum 2r units. Examine all space centered at r, with radius r and volume (4/3)πr³, looking for anything that might absorb the energy. For this cycle only the emitter is considered to be a potential absorber.

Goal: To find the shortest path. It should be obvious, given dispersion, the two-slit experiment, and other effects, that the goal is not always achieved. This can be attributed to the limited processing capacity of the photon, which gives the probabilistic results that we see.

State: no absorber found:

Process: Look forward 3r units, expending the same resources used to examine the previous space, but do not review anything in the region examined in the previous step.

Create a formula for the radius of the circles created. This should match dispersion.
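The cycle described above can be sketched as a loop. All parameter names, and the rule that each cycle extends the searched region by 2r, are our reading of the steps — a toy model, not a claim about real photons:

```python
def photon_cycles(absorber_distance, r=1.0, max_cycles=1000):
    """Count the compute cycles until an absorber is found.

    Each cycle the photon examines the next stretch of 2r units along
    its momentum, never revisiting space examined in a previous step.
    Returns None if the budget runs out (no absorption event)."""
    frontier = 0.0                    # far edge of examined space
    for cycle in range(1, max_cycles + 1):
        frontier += 2 * r             # constant advance per cycle
        if absorber_distance <= frontier:
            return cycle              # the absorption event: photon dies
    return None

# With nothing in the way, the distance examined grows linearly with the
# cycle count -- the constant advance per cycle is what reads, from the
# outside, as travelling "at the speed of light".
cycles = photon_cycles(7.0)           # absorber 7 units ahead
```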

The two Photon Hypothesis

For time to exist photons should (nearly) always be emitted in Entangled pairs. Before you get too hung up on reality, engage with me in a little suspension of disbelief.

These models assume a time sequence for each primitive. In other words, there is a predefined partial ordering of primitives where events occur, or not.

Purpose

Does the universe have a purpose? Is it just a gigantic computer, computing something like the meaning of everything, as The Hitchhiker's Guide to the Galaxy suggests?


Summary

There should no longer be any argument whether the universe is a computer simulation or not. Forget about it. The only significant difference between the universe and a computer is that the scope of useful computers is small enough that we can figure out how to program them to get useful answers. So go forth and make your own model universe. You have all of the capability you need on your desktop, with a little help from the cloud.

Appendix

Evolving Transformations System (ETS)

Event-based solutions have been explored by philosophers like Alfred North Whitehead[1] and John Heil.[21] This has been primarily an ontological investigation with little in the way of physical detail. Heil complains that since substances carry properties, but events do not, events are not a suitable ontology for physics. He seems not to be aware of the use of attributes on links in computer science, which, when faced with a deficient ontology, just changed the ontology.

A similar event-based model was created at the University of New Brunswick. The class of processes, i.e. similarly structured processes with a common generative structure, pervades all levels of consideration in ETS. In particular, the most basic concept, the primitive event, is defined via the classes of primal processes.[22] The ETS authors then go on to create a collection of primitive events, or transformations, that are essentially state machines with a fixed set of inputs and outputs. According to the ETS formalism, a photon is not a "particle" but a (spatially instantiated) stream of structured events. That last sentence applies equally well to the models described here; however, the event processing is quite different.

References

  1. 1.0 1.1 Alfred North Whitehead, Process and Reality. An Essay in Cosmology. Gifford Lectures Delivered in the University of Edinburgh During the Session 1927–1928, (1929) Macmillan, New York, Cambridge University Press, Cambridge UK.
  2. Thomas Kuhn, The Structure of Scientific Revolutions,, (1996-12-15) ISBN 978-0226458083
  3. Erich Jantsch, The Self-organizing Universe Pergamon (1980-01-15) ISBN 0080243118
  4. Murray Gell-mann, The Quark and the Jaguar Freeman (1994) ISBN 0716725819
  5. Bryce S. deWitt, Quantum Theory of Gravity. I. The Canonical Theory (1967-08-25) Phys. Rev. 160, 1113 https://journals.aps.org/pr/abstract/10.1103/PhysRev.160.1113
  6. Charles H. Bennett, Notes on Landauer's principle, Reversible Computation and Maxwell's Demon. Studies in History and Philosophy of Modern Physics volume=34 issue=3 pp. 501–510 (2003) http://www.cs.princeton.edu/courses/archive/fall06/cos576/papers/bennett03.pdf DOI 10.1016/S1355-2198(03)00039-X
  7. Leonard Susskind, Fractal-Flows and Time’s Arrow (2022-04-07) https://arxiv.org/pdf/1203.6440.pdf
  8. Seth Lloyd as reported by Alexandra Churikova, Is the Universe Actually a Giant Quantum Computer? (2015) https://cmsw.mit.edu/angles/2015/is-the-universe-actually-a-giant-quantum-computer/
  9. Samuel Greengard, Neuromorphic Chips Take Shape, CACM 63(8) (2020-08) pp. 9–11
  10. P. A. Merolla + 12, A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345(6197): 668–673. doi:10.1126/science.1254642 PMID 25104385
  11. Kyung M. Song + 11, Skyrmion-based artificial synapses for neuromorphic computing (2020-03) Nature Electronics 3(3): 148–155. arXiv:1907.00957 doi:10.1038/s41928-020-0385-0
  12. Steven M. Nowick, Asynchronous Design -- Part 1: Overview and Recent Advances (2015-05) IEEE Design and Test 32(3): 5–18 DOI: 10.1109/MDAT.2015.2413759 https://www.researchgate.net/publication/331181563_Asynchronous_Design_--_Part_1_Overview_and_Recent_Advances
  13. Stephen Wolfram The Wolfram Physics Project: A One-Year Update (2021-04-24) https://writings.stephenwolfram.com/2021/04/the-wolfram-physics-project-a-one-year-update/
  14. George Kish, A Source Book in Geography. (1978) Harvard University Press. p. 51. ISBN 978-0-674-82270-2.
  15. Nima Arkani-Hamed, SUSY conference talk, YouTube (2018-08-30) https://www.youtube.com/watch?v=xNVZg694ct8
  16. Morris Kline, Mathematical Thought from Ancient to Modern Times. New York: Oxford University Press (1972) pp. 167–68. ISBN 0-19-501496-0
  17. Amir Beck, Introduction to Nonlinear Optimization: Theory, Algorithms, and Applications https://archive.siam.org/books/mo19/
  18. Boyd and Vandenberghe, Convex Optimization, Cambridge University Press (2004) http://www.stanford.edu/~boyd/cvxbook/
  19. Richard Feynman, The Principle of Least Action in Quantum Mechanics, Princeton (1942) https://cds.cern.ch/record/101498/files/Thesis-1942-Feynman.pdf
  20. I. Bialynicki-Birula, On the Wave Function of the Photon (1994) Acta Physica Polonica A http://old.cft.edu.pl/~birula/publ/APPPwf.pdf
  21. John Heil, The Universe As We Find It, Oxford (2015-06-01) ISBN 978-0198738978
  22. L. Goldfarb + 4, What is a structural representation? Proposal for an event-based representational formalism, Sixth Variation, Tech. Rep. TR07-187 (2007) Faculty of Computer Science, University of New Brunswick http://www.cs.unb.ca/~goldfarb/ETSbook/ETS6.pdf