Eventful Universe


Full Title

The Event-driven Universe is a model focused on the events that happen among objects in the universe.

Author

Thomas C. Jones

Date

2020-12-15

Abstract

Models of the universe typically focus on the extremes of very large or very small elements, like collisions of galaxies or of photons and electrons. These are both events between identified objects, which are called entities in computer science. But the existing standard models immediately establish a set of "facts" based on one particular collection of mathematical assumptions, like continuous and well-defined space and time. In these models, constructs like the Lagrangian or the Hamiltonian are precise. In contrast, this article will focus on the events themselves rather than on the commonly accepted assumptions, at least so far as time is concerned. That means that no mathematical formula that includes time as a variable will be assumed to be a fundamental truth. The major theme of this article is that if physics wants to learn how evolution can provide answers to open issues, then the best place to look is at places where evolution has been successfully applied. As one example, machine learning has already been enhanced by acknowledging that evolution can bring new techniques and insights to hard technological challenges.

The Model

  • Euclidean three-space will be assumed.
  • Two event models will be addressed (these can be thought of as Feynman diagrams if that is helpful):
  1. the emission of a single photon by an excited electron and its subsequent absorption by another electron
  2. the emission of an entangled photon pair and the subsequent absorption of one of those photons.

The goal is to create a partial ordering of events and then determine whether that partial ordering can be used to construct a time line that becomes continuous in the limit. Each partial ordering would be local to the events involved and so would naturally lead to a relative time line for each locality.
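
As an illustration, the sketch below builds such a partial ordering in Python and derives a purely local "tick count" from it; the event names and causal constraints are hypothetical examples, not part of the model (requires Python 3.9+ for graphlib).

 # A minimal sketch, assuming events are recorded only as causal
 # constraints of the form "emission must precede absorption".
 from collections import defaultdict
 from graphlib import TopologicalSorter
 
 # Partial order: each pair (earlier, later) is one causal constraint.
 constraints = [
     ("emit_1", "absorb_1"),   # photon 1: emission precedes absorption
     ("emit_2", "absorb_2"),   # photon 2
     ("absorb_1", "emit_2"),   # the absorbing electron later re-emits
 ]
 
 preds = defaultdict(set)
 for earlier, later in constraints:
     preds[later].add(earlier)
 
 order = list(TopologicalSorter(preds).static_order())
 
 # A "local clock" is just the longest chain of constraints leading to an
 # event; events unrelated by any chain have no defined relative time.
 depth = {}
 for ev in order:
     depth[ev] = 1 + max((depth[p] for p in preds[ev]), default=0)
 
 for ev in order:
     print(f"{ev}: local tick {depth[ev]}")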

Irreversibility and Power Consumption

As the Lagrangian is designed to find the path with the least power transfer (as measured by the rate of change of the difference between potential and kinetic energy), so too does the attempt to find low-power computing solutions seek the path of least energy expenditure.

Charles Bennett of IBM taught us that computing does not, in principle, need to consume much power.

The problem with these models is that they lead to the static Wheeler-DeWitt equation (Ĥ(x)|ψ⟩ = 0), which tells us that the universe does not evolve.[1] Many find that to be a fatal flaw of the physics: it simply does not correspond in any way to reality.

Quantum computing is teaching us more about computing and about quantum mechanics, but it is not at all clear where this new knowledge is taking us. In the following we will investigate computing elements that consume something, though it is not clear what; perhaps it is the precursor to entropy. This proposition will not depend on existing knowledge, but rather just explore what is possible if we are not completely bound by accepted knowledge. What is created in this endeavor is time itself.

What Purpose Do Clocks Serve?

It is informative to consider why clocks are used in computer systems and to see that there are lessons to be learned. In a mechanized society where the time value of money is of the essence of every financial transaction, it has become a socially acknowledged fact that time management of every aspect of our lives is the highest goal for producing the greatest output of goods and services for the least expenditure of our precious time. Good timekeeping has become important in our lives and in our science. In short, it works for us in a modern society, and that is fully reflected in the focus of our lives. Power, the expenditure of energy over time, has become the goal of man and of learning. Good clocks are important in business and in the synchronization of the GPS satellites that require them for establishing position on the planet.

Event-driven (aka on-demand) computing can be substantially less expensive than fully synchronous chips. While some modern chips selectively turn cores on or off as they are needed for computations, the core capability of the chip is powered full time. One example of event-driven design is the IBM TrueNorth chip, which consumes only 70 mW while emulating the asynchronous, neural design of the human brain.[2] Intel followed with their Loihi chip.[3]

Asynchronous (self-clocking) computer circuits are imagined to be low-power implementations as compared to synchronous (centrally clocked) chip designs. About 700 million of us (in 2015) had one of these asynchronous computing devices in our passports or other portable ID cards, where they function flawlessly on the energy created by waving the chip card over the reader.[4] As chips become larger and compute cores focus on particular parts of the compute load, some level of asynchrony appears to be the only viable integration technique. Often each core type will have its own clock and communicate asynchronously with other cores. But for high-speed digital links, the communications clock is established first and then all following data is synchronized to that clock. In short, there is growing interest in improving reliability and speed with asynchronous chip designs and interconnects, but all long-distance digital communications links appear to be destined to be clocked between the sender and the receiver. Most communications protocols blend the data and the clock into a single protocol. For multiple-access network protocols, asynchrony makes sense. For point-to-point protocols, synchrony appears to be the most effective, since clock synchronization can be maintained at all times.

Life Cycle of a Photon

A photon is born as an event when an electron loses some quantum of energy. It spends its entire life looking for a fulfilling way to die. It can be viewed as a computing device with a limited amount of storage and a limited computing capacity. It looks "forward" in space to the full extent of its computing capacity and remembers as much about its past as its storage will accommodate. In the absence of any other matter, its computing capacity allows it to travel "at the speed of light", thus creating the concept of time as the inverse of the distance traveled during one compute cycle. When more matter is around, the compute power is consumed in understanding the effect of that matter on its path. This allows it to look forward a much shorter distance, and so it appears to go slower. In some cases the photon can be trapped in a place where it is unable to complete a computation of where it wants to go, and so it appears to stop. So it is just a Turing machine of limited capacity and storage, just like your own smart phone.
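
A toy simulation can make this picture concrete. The sketch below is a hypothetical illustration rather than a claim about real optics: the photon gets a fixed compute budget per cycle, nearby matter consumes part of that budget, "time" is simply the cycle count, and the apparent speed drops wherever matter is present.

 # A minimal sketch, assuming a fixed per-cycle compute budget and a
 # made-up cost function for nearby matter. All numbers are arbitrary.
 BUDGET = 1.0  # compute capacity per cycle; buys 1 unit of distance in vacuum
 
 def matter_cost(x: float) -> float:
     """Hypothetical compute cost of analyzing matter near position x:
     zero in vacuum, nonzero inside a slab of material between 10 and 20."""
     return 0.6 if 10.0 <= x <= 20.0 else 0.0
 
 position = 0.0
 for cycle in range(1, 31):
     # Whatever budget is left after dealing with matter becomes distance.
     step = max(BUDGET - matter_cost(position), 0.0)
     position += step
     print(f"cycle {cycle:2d}: position {position:6.2f}  apparent speed {step:.2f}")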

Some History

In other words, we really knew all of this stuff before, but just couldn't accept the facts in front of our faces.

  • Aristarchus of Samos (c. 310 – c. 230 BC) was an ancient Greek astronomer and mathematician who presented the first known heliocentric model that placed the Sun at the center of the known universe with the Earth revolving around it. He was influenced by Philolaus of Croton, but Aristarchus identified the "central fire" with the Sun, and he put the other planets in their correct order of distance around the Sun. Like Anaxagoras before him, he suspected that the stars were just other bodies like the Sun, albeit farther away from Earth. His astronomical ideas were often rejected in favor of the geocentric theories of Aristotle and Ptolemy. Nicolaus Copernicus did attribute the heliocentric theory to Aristarchus.[5]
  • Quote "The Principle of Least Action just says that a particle sniffs out every possible path and takes the one that results in the least action. [6] The action principle is preceded by earlier ideas in optics. In ancient Greece, Euclid wrote in his Catoptrica that, for the path of light reflecting from a mirror, the angle of incidence equals the angle of reflection. Hero of Alexandria later showed that this path was the shortest length and least time.[7]

How Does Time Evolve?

In asynchronous event handling, a rough sense of time arises from the delay imposed by completing the processing of any event that blocks progress on the overall program. We see this recorded all the time in computer science as operations per second. These metrics are purely dimensionless quantities, since the operation itself is the determinant of time. If we apply that to the propagation of light, it seems that even in the vacuum of space some event processing is the limiting factor. A model for that event is what needs to be developed. If the model can show the impact of operating in a non-inertial frame, it would be able to merge quantum physics with general relativity.

One possibility is to look at the differences between different photons. So far as we know, photons have energy and a projection of the "spin" on the direction of momentum (helicity, with values of positive and negative). The term spin is just a physicist's misuse of the term. Spin might be more appropriate as a description of the energy of the photon, which is ε = hν, where ε is the energy and ν is the frequency, and which Feynman described as the rotation of a disk, allowing him to demonstrate interference to a lay audience. So a useful model would be that the rate of the rotation is ν and the direction of rotation is what physicists call spin.
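
A worked numeric example of this rotating-disk picture follows (a sketch; the green-photon frequency and the sign convention for helicity are arbitrary assumptions):

 # A minimal numeric sketch of the "rotating disk" picture: the arrow
 # rotates at rate nu, and helicity (+1 or -1) sets the direction.
 import cmath
 
 h = 6.62607015e-34          # Planck constant, J*s
 c = 299_792_458.0           # speed of light, m/s
 
 nu = 5.4e14                 # frequency of a green photon, Hz (example)
 helicity = +1               # projection of spin on momentum: +1 or -1
 
 energy = h * nu             # epsilon = h * nu
 wavelength = c / nu
 
 def arrow(distance_m: float) -> complex:
     """Feynman's rotating arrow after traveling a given distance:
     phase = 2*pi*(distance/wavelength), with its sign set by helicity."""
     phase = 2 * cmath.pi * distance_m / wavelength * helicity
     return cmath.exp(1j * phase)
 
 print(f"energy     = {energy:.3e} J")           # ~3.58e-19 J
 print(f"wavelength = {wavelength*1e9:.1f} nm")  # ~555 nm
 print(f"arrow after half a wavelength: {arrow(wavelength/2):.3f}")  # ~ -1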

What about polarization? The model needs to accommodate both linear and circular polarization of photons.

In Feynman's thesis he took Wheeler's advice to consider both the retarded and advanced contributions to the Lagrangian. That seems to imply that the location of a particle in both time and space is indeterminate in the most fundamental sense. The only events that can be located in time and space seem to be the emission and absorption of the particle.

Wave Function of the Photon

It seems that all time begins with the photon, the transport of quanta of energy.

Wave function for the photon[8]

Interference

All electromagnetic waves can be superimposed upon each other without limit. The electric and magnetic fields simply add at each point. If two waves with the same frequency are combined, there will be a constant interference pattern caused by their superposition. Interference can either be constructive, meaning the strength increases as a result, or destructive, where the strength is reduced. The amount of interference depends on the phase difference at a particular point. It can be shown that constructive interference occurs for phase differences of 0°-120° and 240°-360°, while destructive interference occurs from 120°-240°. For two identical waves, no phase shift results in total constructive interference, where the strength is maximum, and a 180° phase shift will create total destructive interference (no signal at all).

Δφ = 2πΔx/λ
Δφ = 2πΔt/T

The phase shift that causes the interference can occur either due to a difference in path length, Δx, or a difference in the arrival time, Δt.
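
For two equal-amplitude waves the combined amplitude is 2|cos(Δφ/2)|, so the 120° and 240° boundaries quoted above are exactly where the combination equals a single wave. A short numeric check (a sketch):

 # A minimal sketch: superpose two equal-amplitude waves with phase
 # difference dphi and compare the combined amplitude to a single wave.
 import math
 
 def combined_amplitude(dphi_deg: float) -> float:
     """Amplitude of cos(wt) + cos(wt + dphi), which equals 2*|cos(dphi/2)|."""
     return 2 * abs(math.cos(math.radians(dphi_deg) / 2))
 
 for dphi in (0, 60, 120, 180, 240, 300, 360):
     a = round(combined_amplitude(dphi), 9)   # round away float fuzz
     kind = "constructive" if a > 1 else "destructive" if a < 1 else "boundary"
     print(f"dphi = {dphi:3d} deg: combined amplitude {a:.3f} ({kind})")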

Diffraction

The idealized plane wave is actually infinite in extent. If this wave passes through an aperture, it will diffract, or spread out, from the opening. The degree to which the cropped wave will spread out depends on the size of the aperture relative to the wavelength. In the extreme case where the aperture is very large compared to the wavelength, the wave will see no effect and will not diffract at all. At the other extreme, if the opening is very small, the wave will behave as if it were at its origin and spread out uniformly in all directions from the aperture. In between, there will be some degree of diffraction.

First consider a circular aperture. If a wave with wavelength λ encounters an opening with diameter D, the amount of diffraction, as measured by the angle θ at which the new wave diverges from the opening (measured from edge to edge), will be approximated by θ ≈ 2λ/D (see the numeric sketch after the notes below).

Comments on this formula:

  • The symbol ≈ is used to indicate an approximate result, as opposed to an exact relationship. This result is only valid for relatively small angles, something less than about 20°.
  • There is some ambiguity in what is meant by this angle. In actuality, the wave does not simply end at this boundary, but falls off smoothly. The exact point which defines the extent of the wave is a matter of definition, and there are two standard conventions: the one used here is the ½ power or -3 dB definition.
  • The factor of 2 is also an approximation. A more accurate description requires detailed knowledge about the shape of the aperture.
  • The angle θ is measured in radians!
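
As a quick numeric illustration of this approximation (a sketch; the wavelength and aperture sizes are arbitrary examples):

 # A minimal sketch of the edge-to-edge estimate theta ≈ 2*lambda/D.
 import math
 
 wavelength = 550e-9   # green light, m (arbitrary example)
 for D in (1e-3, 100e-6, 10e-6):          # aperture diameters, m
     theta = 2 * wavelength / D           # radians; valid for small angles
     print(f"D = {D*1e6:7.1f} um: theta ≈ {theta:.4f} rad "
           f"({math.degrees(theta):.2f} deg)")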

Photon Polarization

Photon polarization is the quantum mechanical description of the classical polarized sinusoidal plane electromagnetic wave. An individual photon can be described as having right or left circular polarization, or a superposition of the two. Equivalently, a photon can be described as having horizontal or vertical linear polarization, or a superposition of the two.

The description of photon polarization contains many of the physical concepts and much of the mathematical machinery of more involved quantum descriptions, such as the quantum mechanics of an electron in a potential well. Polarization is an example of a qubit degree of freedom, which forms a fundamental basis for an understanding of more complicated quantum phenomena. Much of the mathematical machinery of quantum mechanics, such as state vectors, probability amplitudes, unitary operators, and Hermitian operators, emerges naturally from the classical Maxwell's equations in this description. The quantum polarization state vector for the photon, for instance, is identical with the Jones vector usually used to describe the polarization of a classical wave. Unitary operators emerge from the classical requirement that a wave propagating through lossless media which alter its polarization state conserve energy. Hermitian operators then follow for infinitesimal transformations of a classical polarization state.
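
A small sketch of that machinery at work on Jones vectors; the sign convention for circular polarization and the quarter-wave-plate matrix (quoted up to a global phase) are common textbook choices assumed here, not taken from this article:

 # A minimal sketch: Jones vectors as 2-component complex state vectors,
 # with a unitary (quarter-wave plate, fast axis horizontal) acting on them.
 import numpy as np
 
 H = np.array([1, 0], dtype=complex)            # horizontal linear
 V = np.array([0, 1], dtype=complex)            # vertical linear
 R = (H - 1j * V) / np.sqrt(2)                  # right circular (one convention)
 L = (H + 1j * V) / np.sqrt(2)                  # left circular
 
 # Quarter-wave plate, fast axis horizontal, up to a global phase.
 QWP = np.array([[1, 0],
                 [0, 1j]], dtype=complex)
 
 assert np.allclose(QWP @ QWP.conj().T, np.eye(2))   # unitary: norm preserved
 
 diag45 = (H + V) / np.sqrt(2)                  # linear polarization at 45 deg
 out = QWP @ diag45                             # becomes circular
 print("overlap with L:", abs(np.vdot(L, out))**2)   # probability ~1.0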

The One-Photon Hypothesis

Event: a bundle of energy emitted from an electron in an atom, with a particular direction or momentum vector and a spin (+ or −). Polarization will be ignored for the moment.

State: Point of no dimension.

Process: Look forward along the direction of momentum 2r units. Examine all space centered at r with a radius r and volume 4/3 π r³, looking for anything that might absorb the energy. For this cycle, only the emitter is considered to be a potential absorber.

State: no absorber found.

Process: Look forward 3r units, expending the same resources used to search the previous space, but do not re-examine anything in the sphere examined in the previous step.

Create a formula for the radius of the spheres created; this should match dispersion. A numerical sketch of these radii follows below.
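
One reading of this process can be tested numerically. The sketch below assumes that cycle k's search sphere touches the forward point (k+1)r on the momentum axis and that each cycle examines the same new volume, 4/3 π r³; both readings are assumptions layered on the text above, and the radii are estimated by Monte Carlo bisection.

 # A minimal Monte Carlo sketch of the search-sphere radii. Assumptions
 # (not established by the model): cycle k's sphere touches the forward
 # point (k+1)*r, each cycle examines the same new volume V0 = 4/3*pi*r^3,
 # and previously examined space is excluded.
 import math
 import random
 
 random.seed(1)
 r = 1.0
 V0 = 4 / 3 * math.pi * r ** 3
 spheres = [(r, 1.0)]          # (center on the axis, radius): cycle-1 sphere
 
 def inside(x, y, z, cx, R):
     return (x - cx) ** 2 + y * y + z * z <= R * R
 
 def new_volume(R, k, samples=20_000):
     """Volume of the cycle-k candidate sphere not already examined."""
     cx = (k + 1) * r - R      # center placed so the forward edge is (k+1)*r
     hits = 0
     for _ in range(samples):
         x = cx + random.uniform(-R, R)
         y = random.uniform(-R, R)
         z = random.uniform(-R, R)
         if inside(x, y, z, cx, R) and not any(
                 inside(x, y, z, c, Rp) for c, Rp in spheres):
             hits += 1
     return hits / samples * (2 * R) ** 3   # hit fraction times cube volume
 
 for k in range(2, 6):
     lo, hi = 0.5 * r, 4.0 * r
     for _ in range(20):       # bisect: new volume grows with the radius
         mid = (lo + hi) / 2
         if new_volume(mid, k) < V0:
             lo = mid
         else:
             hi = mid
     Rk = (lo + hi) / 2
     spheres.append(((k + 1) * r - Rk, Rk))
     print(f"cycle {k}: radius ≈ {Rk:.2f} r")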

The Two-Photon Hypothesis

For time to exist, photons should (nearly) always be emitted in entangled pairs. Before you get too hung up on reality, engage with me in a little suspension of disbelief.

These models assume a time sequence for each primitive. In other words, there is a predefined partial ordering of primitives where events occur, or not.

Summary

There should no longer be any argument about whether the universe is a computer simulation or not. Forget about it. The only significant difference between the universe and a computer is that the scope of useful computers is small enough that we can figure out how to program them to get useful answers. So go forth and make your own model universe. You have all of the capability you need on your desktop, with a little help from the cloud.

Appendix

Evolving Transformations System

A similar event-based model, the Evolving Transformations System (ETS), was created at the University of New Brunswick. The class of similarly structured processes, those with a common generative structure, pervades all levels of consideration in ETS. In particular, the most basic concept, the primitive event, is defined via the classes of primal processes.[9] The ETS formalism then goes on to create a collection of primitive events, or transformations, that are essentially state machines with a fixed set of inputs and outputs. According to the ETS formalism, a photon is not a "particle" but a (spatially instantiated) stream of structured events.

The last sentence above applies equally well to the models described here. However, the event processing is quite different.

References

  1. Bryce S. DeWitt, Quantum Theory of Gravity. I. The Canonical Theory (1967-08-25) Phys. Rev. 160, 1113. https://journals.aps.org/pr/abstract/10.1103/PhysRev.160.1113
  2. P. A. Merolla + 12, A million spiking-neuron integrated circuit with a scalable communication network and interface (2014) Science 345 (6197): 668–673. doi:10.1126/science.1254642. PMID 25104385. S2CID 12706847.
  3. Kyung M. Song + 11, Skyrmion-based artificial synapses for neuromorphic computing (2020-03) Nature Electronics 3 (3): 148–155. arXiv:1907.00957. doi:10.1038/s41928-020-0385-0. On Loihi see also https://newsroom.intel.com/editorials/intels-new-self-learning-chip-promises-accelerate-artificial-intelligence/#gs.m2dno0 and https://en.wikichip.org/wiki/intel/loihi
  4. Steven M. Nowick, Asynchronous Design -- Part 1: Overview and Recent Advances (2015-05) IEEE Design and Test 32(3): 5-18. doi:10.1109/MDAT.2015.2413759. https://www.researchgate.net/publication/331181563_Asynchronous_Design_--_Part_1_Overview_and_Recent_Advances
  5. George Kish, A Source Book in Geography. (1978) Harvard University Press. p. 51. ISBN 978-0-674-82270-2.
  6. Nima Arkani-Hamed, Suzy Conference YouTube (2018-08-30) https://www.youtube.com/watch?v=xNVZg694ct8
  7. Kline, Morris, Mathematical Thought from Ancient to Modern Times. New York: Oxford University Press. (1972) pp. 167–68. ISBN 0-19-501496-0.
  8. I. Bialynicki-Birula, On the Wave Function of the Photon (1994) Acta Physica Polonica A. http://old.cft.edu.pl/~birula/publ/APPPwf.pdf
  9. L. Goldfarb + 4, What is a structural representation? Proposal for an event-based representational formalism, Sixth Variation, Tech. Rep. TR07-187 (2007) Faculty of Computer Science, University of New Brunswick http://www.cs.unb.ca/~goldfarb/ETSbook/ETS6.pdf