# Eventful Universe

## Full Title

The Event-driven Universe is a model focused on the events that happen between objects in the universe.

Thomas C. Jones

2020-11-22

## The Model

• Euclidean three-space will be assumed.
• Two event models will be addressed (these can be thought of as Feynman diagrams, if that is helpful):
1. the emission of a single photon by an excited electron and its subsequent absorption by another electron
2. the emission of an entangled photon pair and the subsequent absorption of one of those photons.

## Irreversibility and Power Consumption

Just as the Lagrangian is designed to find the path with the least power transfer (as measured by the rate of change of the difference between kinetic and potential energy), so too is the attempt to find low-power computing solutions an attempt to minimize the energy expended per unit of useful work.
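The least-action statement invoked above can be written out explicitly:

$$ S[q] = \int_{t_1}^{t_2} L\,dt, \qquad L = T - V, \qquad \delta S = 0, $$

where $T$ is the kinetic energy and $V$ the potential energy; the physical path is the one that makes the action $S$ stationary.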

Charles Bennett (IBM) showed that computation can, in principle, be made logically reversible and performed with arbitrarily little energy dissipation.

The problem with these models is that they lead to the static Wheeler–DeWitt equation ($\hat{H}(x)\,|\psi\rangle = 0$), which tells us that the universe does not evolve. Many find that to be a fatal flaw of the physics: it simply does not correspond in any way to reality.
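The "frozen" conclusion follows in one step from the Schrödinger equation: if the Hamiltonian constraint annihilates the state, then

$$ i\hbar\,\frac{\partial}{\partial t}\,|\psi\rangle = \hat{H}\,|\psi\rangle = 0 \quad\Longrightarrow\quad |\psi\rangle \text{ is constant in time,} $$

so the universal wave function carries no time dependence at all.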

## What Purpose Do Clocks Serve?

It is informative to consider why clocks are used in computer systems and to see that there are lessons to be learned. In a mechanized society where the time value of money is of the essence of every financial transaction, it has become a socially acknowledged fact that time management of every aspect of our lives is the surest path to producing the greatest output of goods and services for the least expenditure of our precious time. Good timekeeping has become important in our lives and in our science. In short, it works for us in a modern society, and that is fully reflected in the focus of our lives. Power, the expenditure of energy over time, has become the goal of man and of learning. Good clocks are important in business and in the synchronization of the GPS satellites that require them for establishing position on the planet.

Event-driven (aka on-demand) computing can be substantially less expensive than fully synchronous chips. While some modern chips selectively turn cores on or off as they are needed for computations, the core capability of the chip is powered full time. One example of event-driven design is the IBM TrueNorth chip, which consumes only 70 mW of power while emulating the asynchronous, neural design of the human brain.[1] Intel followed with their Loihi chip.[2]

Asynchronous (self-clocking) computer circuits are imagined to be low-power implementations as compared to synchronous (centrally clocked) chip designs. About 700 million of us (in 2015) had one of these asynchronous computing devices in a passport or other portable ID card, where they function flawlessly on the energy created by waving the chip card over the reader.[3]

As chips become larger and compute cores focus on particular parts of the compute load, some level of asynchrony appears to be the only viable integration technique. Often each core type will have its own clock and communicate asynchronously with the other cores. But for high-speed digital links, the communications clock is established first and all following data is then synchronized to that clock. In short, there is growing interest in improving reliability and speed with asynchronous chip designs and interconnects, but all long-distance digital communications links appear destined to be clocked between the sender and the receiver. Most communications protocols blend the data and the clock into a single protocol. For multiple-access network protocols, asynchronous operation makes sense. For point-to-point protocols, synchronous operation appears to be the most effective, since clock synchronization can be maintained at all times.
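The power argument above can be caricatured in a few lines of code. This is a toy sketch, not a circuit model: it simply assumes each wake-up of a core costs one unit of energy, and the function names (`clocked`, `event_driven`) are illustrative, not drawn from any real chip's API.

```python
import random

def clocked(events, ticks, period=1):
    """Synchronous design: wake on every clock tick, whether or not work arrived."""
    wakeups = 0
    for t in range(0, ticks, period):
        wakeups += 1            # the clock fires regardless of demand
    return wakeups

def event_driven(events):
    """Event-driven design: wake only when an event actually arrives."""
    return len(events)          # one wake-up per event, zero idle cost

# Toy workload: 10 events scattered over 1000 clock ticks.
random.seed(0)
events = sorted(random.sample(range(1000), 10))
print("clocked wake-ups:     ", clocked(events, 1000))   # 1000
print("event-driven wake-ups:", event_driven(events))    # 10
```

The two-orders-of-magnitude gap in wake-ups is the intuition behind the TrueNorth and smart-card numbers cited above: idle cycles dominate the energy budget of a fully synchronous design with a sparse workload.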

## How Does Time Evolve?

In asynchronous event handling, a rough sense of time arises from the delay imposed by completing the processing of any event that blocks progress on the overall program. We see that recorded all the time in computer science as operations per second. These metrics are effectively dimensionless quantities, since the operation itself is the determinant of time. If we apply that to the propagation of light, it seems that even in the vacuum of space some event processing is the limiting factor. A model for that event is what needs to be developed.
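A minimal sketch of that idea is a discrete-event loop in which "time" is nothing but the accumulated delay of completed events; no external clock is consulted anywhere. The event names and delays here are hypothetical, chosen only for illustration.

```python
import heapq

def run(events):
    """Minimal discrete-event loop: 'time' is only the order and cost of
    completed events; there is no clock to consult."""
    queue = list(events)        # (delay, name) pairs
    heapq.heapify(queue)        # process cheapest pending event first
    elapsed = 0                 # emergent 'time': sum of processing delays
    log = []
    while queue:
        delay, name = heapq.heappop(queue)
        elapsed += delay        # progress advances only by completing events
        log.append((elapsed, name))
    return log

history = run([(3, "emit"), (1, "absorb"), (2, "scatter")])
print(history)  # [(1, 'absorb'), (3, 'scatter'), (6, 'emit')]
```

Notice that `elapsed` has no units of its own; it is denominated in "events completed," which is exactly the operations-per-second observation made above.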

One possibility is to look at the differences between photons. So far as we know, photons have energy and a projection of "spin" onto the direction of momentum (helicity, with values of positive and negative). The term spin here is something of a physicist's misuse of the word. Spin might be more appropriate as a description of the rotation that determines the energy of the photon, ϵ = hν, where ϵ is the energy and ν is the frequency, and which Feynman described as the rotation of a disk, allowing him to demonstrate interference to a lay audience. So a useful model would be that the rate of the rotation is ν and the direction of rotation is what physicists call spin.
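Feynman's rotating-arrow picture can be sketched numerically: treat the photon's contribution along each path as a little arrow (a phasor) rotating ν times per second, then add the arrows and square the length of the sum. This is an illustrative toy model, not a quantum-field calculation; the only physical inputs are the standard constants $h$ and $c$.

```python
import cmath
import math

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s

def photon_energy(nu):
    """epsilon = h * nu, as in the text."""
    return H * nu

def arrow(path_length, nu):
    """Feynman's 'little arrow': it rotates nu times per second while the
    photon traverses the path, so its final angle depends on path length."""
    phase = 2 * math.pi * nu * (path_length / C)
    return cmath.exp(1j * phase)

def intensity(paths, nu):
    """Add the arrows for every path, then square the length of the sum."""
    total = sum(arrow(p, nu) for p in paths)
    return abs(total) ** 2

nu = 5e14                # green light, ~500 THz
wavelength = C / nu
print(photon_energy(nu))                            # ~3.3e-19 J
print(intensity([1.0, 1.0], nu))                    # equal paths: arrows align, ~4
print(intensity([1.0, 1.0 + wavelength / 2], nu))   # half-wave offset: arrows cancel, ~0
```

The constructive and destructive cases are exactly the interference demonstration the text attributes to Feynman's disk, with the rotation rate ν doing all the work.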

What about polarization? Is that different?

In Feynman's thesis he took Wheeler's advice to consider both the retarded and advanced contributions to the Lagrangian. That seems to imply that the location of a particle in both time and space is indeterminate in the most fundamental sense. The only events that can be located in time and space seem to be the emission and absorption of the particle.
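For reference, the Wheeler–Feynman absorber theory that Feynman's thesis elaborated weights the retarded and advanced solutions equally, so the field acting on a charge is the time-symmetric combination

$$ A^{\mu} = \tfrac{1}{2}\left( A^{\mu}_{\mathrm{ret}} + A^{\mu}_{\mathrm{adv}} \right). $$

On this view emission and absorption enter the description symmetrically, consistent with the remark above that only those two events can be located in time and space.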

## Wave Function of the Photon

It seems that all time begins with the photon, the transport of quanta of energy.

Wave function for the photon[4]

## Hypothesis

For time to exist, photons should (nearly) always be emitted in entangled pairs. Before you get too hung up on reality, engage with me in a little suspension of disbelief.

## Summary

There should no longer be any argument about whether the universe is a computer simulation or not. Forget about it. The only significant difference between the universe and a computer is that the scope of useful computers is small enough that we can figure out how to program them to get useful answers. So go forth and make your own model universe. You have all of the capability you need on your desktop, with a little help from the cloud.

## References

1. P. A. Merolla et al., "A million spiking-neuron integrated circuit with a scalable communication network and interface," Science 345 (6197): 668–673 (2014). doi:10.1126/science.1254642. PMID 25104385.
2. Kyung M. Song et al., "Skyrmion-based artificial synapses for neuromorphic computing," Nature Electronics 3 (3): 148–155 (March 2020). arXiv:1907.00957. doi:10.1038/s41928-020-0385-0. On Loihi, see also https://newsroom.intel.com/editorials/intels-new-self-learning-chip-promises-accelerate-artificial-intelligence/#gs.m2dno0 and https://en.wikichip.org/wiki/intel/loihi
3. Steven M. Nowick, "Asynchronous Design -- Part 1: Overview and Recent Advances," IEEE Design and Test 32 (3): 5–18 (May 2015). doi:10.1109/MDAT.2015.2413759. https://www.researchgate.net/publication/331181563_Asynchronous_Design_--_Part_1_Overview_and_Recent_Advances
4. I. Bialynicki-Birula, "On the Wave Function of the Photon," Acta Physica Polonica A (1994). http://old.cft.edu.pl/~birula/publ/APPPwf.pdf