The Event-driven Universe is a model focused on the events that happen between objects in the universe.
Thomas C. Jones
Models of the universe typically focus on the extremes of the very large or the very small, like collisions of galaxies or of photons and electrons. These are both events between identified objects, which are called entities in computer science. But the existing standard models immediately establish a set of "facts" resting on one particular mathematical set of assumptions, such as continuous and well-defined space and time. In these models, constructs like the Lagrangian or Hamiltonian are precise. In contrast, this article will focus on the events themselves rather than on the commonly accepted assumptions, at least so far as time is concerned. That means that no mathematical formula that includes time as a variable will be assumed to be a fundamental truth. The major theme of this article is that if physics wants to learn how evolution can provide answers to open issues, then the best place to look is at places where evolution has been successfully applied. As one example, machine learning has already been enhanced by acknowledging that evolution can bring new techniques and insights to hard technological challenges.
- Euclidean three-space will be assumed.
- There will be two event models addressed (these can be thought of as Feynman diagrams if that is helpful):
- the emission of a single photon by an excited electron and its subsequent absorption by another electron
- the emission of an entangled photon pair and the subsequent absorption of one of those photons.
Irreversibility and Power Consumption
Just as the Lagrangian is designed to find the path with the least power transfer (as measured by the rate of change of the difference between potential and kinetic energy), so too does the search for low-power computing seek designs that minimize energy expenditure.
The problem with these models is that the result is the static Wheeler-DeWitt equation, which tells us that the universe does not evolve. I find that to be a fatal flaw of physics. It simply does not correspond in any way to reality.
What Purpose Do Clocks Serve?
It is informative to consider why clocks are used in computer systems, and to see that there are lessons to be learned. In a mechanized society where the time value of money is of the essence of every financial transaction, it has become a socially acknowledged fact that time management of every aspect of our lives is the highest goal for producing the greatest output of goods and services for the least expenditure of our precious time. Good timekeeping became important in our lives and in our science. In short, it works for us in a modern society, and that is fully reflected in the focus of our lives. Power, the expenditure of energy over time, has become the goal of man and of learning. Good clocks are important in business and in the synchronization of GPS satellites, which require them to establish position on the planet.
Event-driven (aka on-demand) computing can be substantially less expensive than fully synchronous chips. While some modern chips selectively turn cores on or off as they are needed for computations, the core capability of the chip is powered full time. One example of event-driven design is the IBM TrueNorth chip, which consumes only 70 mW of power by emulating the asynchronous neural design of the human brain. Intel followed with their Loihi chip.
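The power difference can be illustrated with a toy sketch. The numbers below (idle power, event probability) are arbitrary assumptions for illustration, not measurements of TrueNorth or any real chip; the point is only that a synchronous design pays an idle cost every cycle while an event-driven design pays only when an event arrives.

```python
import random

random.seed(42)  # deterministic run for reproducibility

CYCLES = 100_000
EVENT_PROB = 0.001   # events are rare (hypothetical workload)
ACTIVE_POWER = 1.0   # arbitrary units per cycle while handling an event
IDLE_POWER = 0.2     # a synchronous clock tree burns power even when idle

sync_energy = 0.0
event_energy = 0.0
for _ in range(CYCLES):
    has_event = random.random() < EVENT_PROB
    # Synchronous design: the clock runs every cycle regardless of work.
    sync_energy += ACTIVE_POWER if has_event else IDLE_POWER
    # Event-driven design: power is drawn only while handling an event.
    event_energy += ACTIVE_POWER if has_event else 0.0

print(f"synchronous:  {sync_energy:.0f} units")
print(f"event-driven: {event_energy:.0f} units")
```

With rare events, the idle clocking cost dominates the synchronous total, which is why on-demand designs win at low duty cycles.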
Asynchronous (self-clocking) computer circuits are expected to be lower-power implementations as compared to synchronous (centrally clocked) chip designs. About 700 million of us (in 2015) had one of these devices in our passports or mobile driver's licenses, where the chip must function flawlessly on the energy harvested by waving the card over the reader. As chips grow larger and compute cores specialize in particular parts of the compute load, some level of asynchrony appears to be the only viable integration technique. Often each core type will have its own clock and communicate asynchronously with the other cores. But for high-speed digital links, the communications clock is established first and all following data is synchronized to that clock. In short, there is growing interest in improving reliability and speed with asynchronous chip designs and interconnects, but all long-distance digital communications links appear to be destined to be clocked between the sender and the receiver. Most communications protocols blend the data and the clock into a single signal. For multiple-access network protocols, asynchronous operation makes sense; for point-to-point protocols, synchronous operation appears to be the most efficient, since clock synchronization can be maintained at all times.
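Blending data and clock into one signal can be made concrete with Manchester coding, one classic scheme (used, for example, in early Ethernet): every bit carries a mid-bit transition, so the receiver recovers the clock from the data itself. A minimal sketch, using the IEEE 802.3 convention (1 as low-then-high, 0 as high-then-low):

```python
def manchester_encode(bits):
    """Encode bits so each bit cell contains a guaranteed transition.
    IEEE 802.3 convention: 1 -> (low, high), 0 -> (high, low)."""
    out = []
    for b in bits:
        out.extend((0, 1) if b else (1, 0))
    return out

def manchester_decode(halves):
    """Recover bits; a missing mid-bit transition means the clock was lost."""
    bits = []
    for i in range(0, len(halves), 2):
        pair = (halves[i], halves[i + 1])
        if pair == (0, 1):
            bits.append(1)
        elif pair == (1, 0):
            bits.append(0)
        else:
            raise ValueError("no mid-bit transition: clock lost")
    return bits

data = [1, 0, 1, 1, 0]
line = manchester_encode(data)
recovered = manchester_decode(line)
```

The cost of the embedded clock is bandwidth: two line symbols per data bit, which is one reason higher-speed links moved to other self-clocking codes.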
How Does Time Evolve?
In asynchronous event handling, a rough sense of time arises from the delay imposed by completing the processing of any event that blocks progress of the overall program. We see that recorded all the time in computer science as operations per second. These metrics are purely dimensionless quantities, as the operation itself is the determinant of time. If we apply that to the propagation of light, it seems that even in the vacuum of space some event processing is the limiting factor. The nature of that event is what needs an explanation.
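The idea that time is nothing but the ordering and cost of event processing is exactly how a discrete-event simulator works: there is no background clock, and "time" advances only when an event completes. A minimal sketch (the event names and the fixed follow-up delay are illustrative assumptions):

```python
import heapq

def simulate(initial_events, horizon):
    """Discrete-event loop: the clock jumps from event to event;
    nothing 'happens' between events."""
    queue = list(initial_events)  # entries are (fire_time, name)
    heapq.heapify(queue)
    processed = []
    while queue:
        fire_time, name = heapq.heappop(queue)
        if fire_time > horizon:
            break
        processed.append((fire_time, name))
        # Each handled event schedules a follow-up after a fixed
        # processing delay; this delay is the only source of "time".
        heapq.heappush(queue, (fire_time + 3, name + "'"))
    return processed

trace = simulate([(0, "emit"), (1, "absorb")], horizon=7)
```

Here `trace` contains six events at times 0, 1, 3, 4, 6, 7; an observer inside the simulation could only ever measure ratios of event counts, never an absolute rate, which is the dimensionless character of operations-per-second noted above.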
One possibility is to look at the differences between different photons. So far as we know, photons have energy and a projection of the "spin" onto the direction of momentum (helicity, with values of positive and negative). The term spin is just a physicist's misuse of the term. Spin might be more appropriate for a determination of the energy of the photon, which is ϵ = hν, where ϵ is the energy and ν is the frequency, and which Feynman described as the rotation of a disk, allowing him to demonstrate interference to a lay audience. So a useful model would be that the rate of the rotation is ν and the direction of rotation is what physicists call spin.
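A quick worked instance of ϵ = hν, using the exact SI value of Planck's constant; in the rotating-disk picture above, the helicity picks the sign of the rotation but does not change the energy:

```python
H = 6.62607015e-34  # Planck constant in J*s (exact by the 2019 SI definition)

def photon_energy(nu_hz):
    """eps = h * nu: the photon's energy is set entirely by its rotation rate."""
    return H * nu_hz

# Green light at roughly 540 THz; helicity +1 or -1 would give the same energy.
eps = photon_energy(5.4e14)
print(f"{eps:.3e} J")  # on the order of 3.6e-19 J, i.e. about 2.2 eV
```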
What about polarization? Is that different?
Wave Function of the Photon
It seems that all time begins with the photon, the transport of quanta of energy.
For time to exist, photons should (nearly) always be emitted in entangled pairs. Before you get too hung up on reality, engage with me in a little suspension of disbelief.
There should no longer be any argument about whether the universe is a computer simulation or not. Forget about it. There really is no significant difference between the universe and a computer beyond the small issue of scale. So go forth and make your own universe. You have all of the capability you need on your desktop, with a little help from the cloud.
- P. A. Merolla et al., A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345 (6197): 668-673 (2014). doi:10.1126/science.1254642. PMID 25104385.
- Kyung M. Song et al. (March 2020). Skyrmion-based artificial synapses for neuromorphic computing. Nature Electronics 3 (3): 148-155. arXiv:1907.00957. doi:10.1038/s41928-020-0385-0
- Intel Loihi: https://newsroom.intel.com/editorials/intels-new-self-learning-chip-promises-accelerate-artificial-intelligence/#gs.m2dno0 and https://en.wikichip.org/wiki/intel/loihi
- Steven M. Nowick, Asynchronous Design -- Part 1: Overview and Recent Advances (May 2015). IEEE Design and Test 32 (3): 5-18. doi:10.1109/MDAT.2015.2413759. https://www.researchgate.net/publication/331181563_Asynchronous_Design_--_Part_1_Overview_and_Recent_Advances
- I. Bialynicki-Birula, On the Wave Function of the Photon (1994). Acta Physica Polonica A. http://old.cft.edu.pl/~birula/publ/APPPwf.pdf