General Theory of Living Systems

==Full Title or Meme==

In order to understand or design any successful, continuing digital ecosystem, it will be highly useful first to have a general theory of living [[Ecosystem]]s that we can use as a template to build a theory for a digital ecosystem.
  
 
==Context==

In 1978 James Grier Miller attempted a [[General Theory of Living Systems]]<ref>James Grier Miller ''Living Systems'' 1978 ISBN 978-0070420151</ref> which proposed rigid taxonomies of the levels of [[Complexity]] and the common processes contained in each level. As with all taxonomies of living systems, this one necessarily has very fuzzy boundaries, which help a new learner but become an undue burden if carried to extremely fine detail. Miller understood all of this and selected the number of levels that best fit his data. Nonetheless, this page will continue in Miller's path and try the same taxonomic approach, but first a short description of alternate methods. Whereas Miller thought that living systems were a subset of all systems, this page takes the view that all complex, successful, enduring systems are subject to the laws of evolution (specifically survival of the fittest) and hence should be considered to be living systems.
  
In 1979 Douglas Hofstadter started making the case that natural thinking involved "chunking" small ideas together to make a smaller number of classes of objects<ref>Douglas R. Hofstadter, ''Gödel, Escher, Bach: An Eternal Golden Braid'' 1979 Vintage ISBN 0-39474502-7</ref> that hide the details, much as was then taking shape in "object-oriented programming." He continued this line of development into the use of fluid analogies in our own thought as a model for how computer networks should work.<ref>Douglas R. Hofstadter, ''Fluid Concepts and Creative Analogies: Computer Models Of The Fundamental Mechanisms Of Thought'' 1995 ISBN 978-0465024759</ref>
Analogies (which we make constantly, relentlessly and mostly unconsciously) are what allow categorization to happen. "Our minds are constructed with an unlimited quality for 'chunking' primordial concepts, which then become larger concepts." As an example, the word "hub," as in "Denver is the hub for United Airlines," brings to mind a large linked network of concepts that are "chunked" together to make up the commonly used term. Other examples range from basics like "wheel" and "node" to higher-order concepts like "spoke" and "network." Higher-order concepts are assembled from lower-order concepts. He believes there is no fundamental difference between thinking with basic concepts and with very large concepts, because we don't "see" inside them. We build concepts by putting several concepts together, putting a membrane around them, and giving that a name. Kind of miraculously, these [interior] concepts disappear into the named higher-order concept. In mathematics this is called [[Abstract Thinking]] and in computer science it is called [[Information Hiding]].
  
While high-level languages and object-oriented programming (OOP) were present in the early days of computing (e.g. MAD and Smalltalk), they did not capture a significant mind-share of working programmers until the release of Java in 1996. With OOP languages, encapsulation or information hiding became a recognized feature of computer system architectures. While this sounds like "chunking", one of its purposes was to hide details. Whereas living systems create abstract chunks, information hiding creates pieces that are sealed and do not expose details of the implementation. So information hiding does not allow for level crossing, and does not share the capability found in open systems for higher-level systems to directly interface with lower-level systems. Since this is a general theory, we cannot assume that information hiding is necessary or appropriate, although privacy advocates might disagree.
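
As a minimal sketch of what information hiding looks like in practice, consider the following Java fragment; the class name and its members are invented for illustration here and are not drawn from Miller or Hofstadter:
<pre>
// A minimal sketch of information hiding in an OOP language (Java).
// The "Cell" class and its members are illustrative assumptions.
public class Cell {
    // Internal state is sealed: no higher-level system can reach inside.
    private double energyStore = 0.0;

    // The only way in or out is through the public interface.
    public void ingest(double energy) {
        if (energy > 0) {
            energyStore += energy;
        }
    }

    public boolean canReproduce() {
        return energyStore > 100.0;
    }

    public static void main(String[] args) {
        Cell cell = new Cell();
        cell.ingest(150.0);
        // cell.energyStore = 0;  // will not compile: the detail is hidden
        System.out.println(cell.canReproduce()); // prints "true"
    }
}
</pre>
The sealed field is exactly the "level crossing" problem noted above: nothing outside the class, however important, may interact with the internal state directly.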
  
In 2007 Cleland and Chyba wrote a chapter in Planets and Life:<ref>Woodruff, T. Sullivan; John Baross. Planets and Life: The Emerging Science of Astrobiology. 2007-10-08 Cambridge University Press. ISBN 978-0521531023</ref> "In the absence of such a theory, we are in a position analogous to that of a 16th-century investigator trying to define 'water' in the absence of molecular theory." ... "Without access to living things having a different historical origin, it is difficult and perhaps ultimately impossible to formulate an adequately general theory of the nature of living systems."
  
 
[[Bayesian Identity Proofing]] provides the means for a collection of authentication and verification steps to be validated.
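
As a minimal sketch of the underlying arithmetic, assuming invented strengths for each verification step (the real values belong on the [[Bayesian Identity Proofing]] page), each step multiplies the odds that the claimant is genuine:
<pre>
// A minimal sketch of the Bayesian idea behind identity proofing: each
// authentication or verification step contributes evidence, combined as
// likelihood ratios over a prior. All step strengths below are invented
// for illustration.
public class IdentityProofing {
    // Update prior odds with one step's likelihood ratio (how much more
    // likely this evidence is for the true user than for an impostor).
    static double update(double odds, double likelihoodRatio) {
        return odds * likelihoodRatio;
    }

    public static void main(String[] args) {
        double odds = 0.01 / 0.99;   // prior: 1% chance the claimant is genuine
        odds = update(odds, 20.0);   // password check passed
        odds = update(odds, 30.0);   // verified email round-trip
        odds = update(odds, 10.0);   // SMS one-time code confirmed
        double probability = odds / (1.0 + odds);
        System.out.printf("P(genuine | evidence) = %.4f%n", probability); // ~0.98
    }
}
</pre>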

[[Homeostasis]] describes systems that can maintain themselves, and the difficulty of extending the [[Stability]] found at one level of a hierarchy of systems to a higher level.
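
A minimal sketch of homeostasis as a negative-feedback loop, with an invented set point and gain, shows how such a system returns to equilibrium after a disturbance:
<pre>
// A minimal sketch of homeostasis as negative feedback: the system senses
// deviation from a set point and acts to reduce it. The set point, gain,
// and disturbance are illustrative assumptions.
public class Homeostat {
    public static void main(String[] args) {
        double setPoint = 37.0;   // the value the system maintains
        double state = 40.0;      // perturbed starting state
        double gain = 0.5;        // corrective strength of the feedback

        for (int t = 0; t < 10; t++) {
            double error = setPoint - state;  // sensed deviation
            state += gain * error;            // corrective action
            System.out.printf("t=%d state=%.3f%n", t, state);
        }
        // The state converges back to the set point; note that stability at
        // this level does not automatically confer stability on a level
        // built above it.
    }
}
</pre>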
  
 
==Problems==

Human cognitive processes are limited and not nearly as logical as some believe.

===Cognitive Overload===

Human short-term memory is extremely limited.

Users come to a web site to get work done, and what are they faced with:
* Authentication
* Registration
* Choices for privacy, etc.
* The stress of the normal problems of life

===Cognitive Dissonance===

Human learning is based on causal relationships, primarily so that humans can know what to expect of any given situation. When something does not happen as their experience would indicate, they may find themselves at a loss as to how to react, resulting in confusion or a loss of context. When UX authors talk about tradition, or user expectations, they are really addressing the cognitive dissonance that occurs when things don't go the way a user expects. Continued cognitive dissonance will lead to anxiety or anger, which will drive people away.

===Tradition===

Karl Popper advanced a theory of how people rely on tradition in making sense of this world,<ref>Karl Popper, ''Conjectures and Refutations.'' (1961) Routledge chapter 4 ISBN 0-415-28594-1</ref> which is paraphrased here. A collection of features of our society, its institutions and traditions, gives a sense of order and discipline to our daily lives, without which we would face anxiety and even terror about our future. What we experience as a normal social life can exist only if we can know, and can have confidence, that there are things and events which must continue and will not be different in the near future. It is in meeting our need for some level of certainty in our lives that the role of tradition can be understood. We become anxious, frustrated and even terrified of our environment if it does not contain a considerable amount of order, a great number of regularities to which we can adjust ourselves. The mere existence of these regularities is perhaps more important than their particular merits or demerits. They are needed as regularities and are therefore handed on as traditions, whether or not they are in other respects rational or necessary or good or beautiful. There is a need for tradition in social life.

===Causal Relationships===

A new and unconfirmed theory suggests that perception, motor control, memory and other [[Brain]] functions all depend on comparisons between ongoing actual experiences and the brain's modeled expectations. This is the predictive coding explanation for how the brain works. The theory grew out of researchers trying to teach a computer to understand the images fed into it. The algorithm makes predictions about the environment and then compares the predictions to reality, just the way that humans learn to navigate in the real world, and now in the digital world. Designers need to be aware of this learning pattern and avoid changes that cause anxiety in users, by testing new designs to see how real users react. Nothing is more frustrating to a user than to have an update to an often-used interface suddenly make it behave in unexpected ways. The user's response is unlikely to match the expectations of the designers and developers.<ref>Jordana Cepelewicz. ''To Make Sense of the Present, Brains May Predict the Future''. Quanta (2018-07) https://www.quantamagazine.org/to-make-sense-of-the-present-brains-may-predict-the-future-20180710/</ref>
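
A minimal sketch of the prediction-update loop, with an invented learning rate and event stream (not a claim about the brain's actual algorithm), shows how surprise shrinks under regularity and spikes when an expectation is violated:
<pre>
// A minimal sketch of the predictive-coding idea: maintain a prediction,
// compare it to each new observation, and correct by the prediction error.
// The learning rate and the data stream are illustrative assumptions.
public class PredictiveCoder {
    private double prediction = 0.0;        // the model's current expectation
    private final double learningRate = 0.3;

    // Compare expectation to reality and nudge the model toward what happened.
    public double observe(double actual) {
        double error = actual - prediction; // "surprise"
        prediction += learningRate * error; // reduce future surprise
        return error;
    }

    public static void main(String[] args) {
        PredictiveCoder brain = new PredictiveCoder();
        double[] world = {10, 10, 10, 10, 25}; // a regularity, then a violation
        for (double event : world) {
            double surprise = brain.observe(event);
            System.out.printf("event=%.0f surprise=%.2f%n", event, surprise);
        }
        // Surprise shrinks while the world is regular, then spikes at the
        // change: the moment a user would experience cognitive dissonance.
    }
}
</pre>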

===Level Dependencies===

Each level of complexity depends upon the support of all the other levels, but especially on the levels below it.

Just as civilization depends on the good will of the large majority of its citizens, so too a personal computer depends on the existence of some ecosystem where it can successfully be deployed.

Yet even the least complex life form, the virus or the management chip buried deep in some data center, can be the portal for an infection that will have consequences at the highest level.

===Successful ecosystems===

For an open ecosystem to be successful it must be [[Resilience|Resilient]] to change in every one of its components at every level. The ecosystem must dynamically evolve as new versions of old components are constantly added, and the results of many of those changes are unpredictable.

It is not possible to design an ecosystem in advance, only to design components with the best knowledge available at the time and to introduce each new component gradually if a great impact is expected. For ecosystem changes, design thinking involves picking a way that the ecosystem should change and then designing a component that will help it move in that direction. Any design that depends on a completely new ecosystem is not likely to develop as expected. Before designing a change to an existing component, it is necessary first to peel back all of the assumptions that went into the existing design to see which are critical to the continued success of the ecosystem and which are available for alteration.<ref>Terry Winograd and Fernando Flores, ''Understanding Computers and Cognition'' 1986 ISBN 0-201-11297-3</ref>

===Open versus Closed Systems===

Analogies comparing social systems to biological organisms go back to Plato<ref>Plato, ''Republic.'' (380 BC) 462c</ref> or St. Paul,<ref>St. Paul. 1 Corinthians 12:12-17 (60)</ref> but it is with Popper<ref>Karl R. Popper, ''The Open Society and Its Enemies, Part 1 Plato.'' (original in 1943) Chapter 10, especially note 7 ISBN 0-691-01968-1</ref> that the ideas of the organic body, as they relate to society in the tribe and in the state, are examined in detail. We find that Popper believes that general living systems are closed and that only the closed, tribal societies compare well to the body. But we see today that most organic systems have an open relationship to their parts via both matter-energy and information. We find the same thing with information systems: those that were closed or enterprise-specific in the past are now open to the internet and suffer the pains of learning how to be secure and defensive in this new, open ecosystem, just as human societies needed to relearn how to be secure after the Athenian experiment with democracy starting with Solon<ref>Solon</ref>.
  
 
==Solutions==

===Taxonomy of Levels of connected Systems===

{|border="1" padding="2" width="799px"
| Level || Name || Typical use ||  User Experience
|-
| 1 || Chip || Management of Computer || Only by Administrators
|-
| 2 || Board Computer || Internet of Things || Room temperature or video surveillance
|-
| 3 || Single Processor Computer || Accessing Web Sites || Simple queries of the web
|-
| 4 || Multiple Processor Computer || Local processing & virtualization || Mobile or Desktop device that maintains user info
|-
| 5 || Data Center || Collection of computers in a single location || Only by Administrators
|-
| 6 || Cloud (single owner) || Social Network || Interaction of user searches and tracking
|-
| 7 || Internetwork of Clouds || Maintenance of Names || User cannot access desired resources for security reasons
|}
  
===Miller's "Processes" that work at all levels===

Miller had 19 processes; defense and timer were added later. The boxes highlighted with color are the ones that deal with energy or matter, not the primary focus of this presentation.

Two special cases are the reproducer, which might include cloning but is not further discussed here; and the boundary, which is important for identity and information hiding and so will be considered here.
  
 
{|border="1" padding="2" width="799px"
| No. || Name || Function ||  Importance
|-
| 1 || ingestor || Input food, parts, energy || bgcolor="KHAKI"| Energy Matter, important to computers
|-
| 2 || distributor || Move food, parts, energy ||  bgcolor="KHAKI"| Energy Matter, important to computers
|-
| 3 || converter || Change from one form to another ||  bgcolor="KHAKI"| Energy Matter, might be important to computers
|-
| 4 || producer || Establish relationships ||  bgcolor="KHAKI"| Energy Matter, important to interconnections
|-
| 5 || storage || Could include a variety of back-ups || bgcolor="KHAKI"| Energy Matter, important for reliability
|-
| 6 || extruder || Expel waste or products ||  bgcolor="KHAKI"| Energy Matter, waste heat is an issue
|-
| 7 || motor || Move the system or its parts ||  bgcolor="KHAKI"| Energy Matter, important for robots
|-
| 8 || supporter || How everything fits together || bgcolor="KHAKI"| Energy Matter, important for maintenance
|-
| 9 || reproducer || Create new copies of self || bgcolor="KHAKI"| Energy Matter and information, future concern
|-
| 10 || boundary || Maintains integrity ||  Energy Matter and information, allows a continuing identity
|-
| 11 || input transducer || Detect external changes or commands ||  All recorded external events
|-
| 12 || internal transducer || Internally created events || Tracks changes due to internal operations
|-
| 13 || channel and net || Looks externally in all directions ||  Ethernet, high-speed local nets
|-
| 14 || timer (added later) || Internally generated events || Could be considered a part of the internal transducer
|-
| 15 || decoder || Converts data to meaning || For processing at a more abstract level
|-
| 16 || associator || Learning ||  Intelligence, Artificial or otherwise
|-
| 17 || memory || Static storage and remembering ||  Allows active searching
|-
| 18 || decider || Executive decision maker || Flexibility, reliability, adaptability
|-
| 19 || encoder || Converts internal data for reuse || Translation, either for storage or for transmission
|-
| 20 || output transducer || Sends messages outbound ||  Display or speech (user experience)
|-
| 21 || defense (recent) || No component survives without it || Antigen in component or higher
|}

Here we will drill into only a few of these processes:

====3 Converter====

As a general rule, information systems delegate the problems of the supply of component materials or energy to experts. This section serves primarily to note the need for food inputs, in particular energy inputs. The human being is a converter, at the cell level, of matter and energy into information, but at low efficiency, except for the brain, which appears to have maximized the computation per unit of energy.<ref>Liqun Luo, ''Why Is the Human Brain So Efficient?'' '''Nautilus''' (2018-04-12) http://nautil.us/issue/59/connections/why-is-the-human-brain-so-efficient</ref> Similarly, in 1973 Charles Bennett, building on the work of IBM's Rolf Landauer, showed that general-purpose computation can be performed by a logically and thermodynamically reversible apparatus, which can operate with arbitrarily little energy dissipation per step because it avoids throwing away information about past logical states; and in 1982 he proposed a reinterpretation of Maxwell's demon, attributing its inability to break the second law to the thermodynamic cost of destroying, rather than acquiring, information.<ref>Charles Bennett, CV https://researcher.watson.ibm.com/researcher/view_person_subpage.php?id=2861</ref> In other words, it takes more energy to erase the blackboard than it does to fill it up.
  
'''Landauer's principle''' is a physical principle pertaining to the lower theoretical limit of energy consumption of [[computation]]. It holds that "any logically irreversible manipulation of [[Information#As a property in physics|information]], such as the erasure of a [[bit]] or the merging of two [[computation]] paths, must be accompanied by a corresponding [[entropy]] increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment".<ref name = bennett>Charles H. Bennett, ''Notes on Landauer's principle, Reversible Computation and Maxwell's Demon.'' Studies in History and Philosophy of Modern Physics 34(3) pp. 501–510 (2003) http://www.cs.princeton.edu/courses/archive/fall06/cos576/papers/bennett03.pdf DOI 10.1016/S1355-2198(03)00039-X</ref>
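
A quick calculation of that lower limit, at an assumed room temperature of 300 K:
<pre>
// A quick check of Landauer's bound: erasing one bit must dissipate at
// least k_B * T * ln 2 of energy. The Boltzmann constant is a standard
// physical constant; the 300 K temperature is an illustrative assumption.
public class LandauerLimit {
    public static void main(String[] args) {
        double kB = 1.380649e-23;     // Boltzmann constant, J/K
        double temperature = 300.0;   // room temperature, K
        double joulesPerBit = kB * temperature * Math.log(2.0);
        System.out.printf("Minimum energy to erase one bit: %.3e J%n", joulesPerBit);
        // ~2.87e-21 J, many orders of magnitude below what current
        // hardware dissipates per logical operation.
    }
}
</pre>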
  
'''Free Energy''' has been proposed as a measure of [[Evolution]]ary pressure. Pointing to the availability of energy to create the information reservoir needed to enable a civilized society, White argued in 1949<ref>L.A. White, ''The science of culture.'' (1949) Grove Press p 361 ASIN B000OMISVM</ref> that culture is "an elaborate thermodynamic mechanical system" designed to carry on the life processes of man by harnessing and controlling energy. It should be little surprise, then, that in 2006 Karl Friston postulated that Free Energy (E-TS)<ref>L.D. Landau and E.M. Lifshitz, Statistical Physics. (1958 in English) Pergamon Press p. 45</ref> was the necessary precursor of any artificial intelligence;<ref>Friston, K., Kilner, J., & Harrison, L. ''A free energy principle for the brain.'' (2006) '''J Physiol Paris. 100''' (1–3), 70–87.</ref> but the point was lost on computer scientists, who typically think about the speed of computation rather than the energy required to power their dreams. See also [https://en.wikipedia.org/wiki/Free_energy_principle Wikipedia on the Free Energy Principle], which makes the idea seem harder than it really is.
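
For reference, the (E-TS) above is the standard thermodynamic (Helmholtz) free energy from the Landau and Lifshitz text cited above:

<math>F = E - TS</math>

where E is the internal energy of the system, T its absolute temperature, and S its entropy; it is the energy still available to do useful work once the disorder term TS has been paid.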

===21 Defense===

The foundation of epidemiology was the tracking of cholera to a London water pump. The medical intervention was to remove the handle from the pump.<ref>''How Epidemiology Began with a London Water Pump''. Biofire https://www.biofiredx.com/blog/how-epidemiology-began-with-a-london-water-pump/</ref>
  
===Hardware as a Living System===

At least since the 1970s, computer boards have come with "self-aware" components, often independent computers that monitor the health of the main computer system. Since the 1990s, the computer chips themselves have been aware of the compute load that they are handling and adapt the number of circuits that are powered, or even the frequency of the clock used on individual cores, so that power consumption (and heat generation) is kept to a minimum.<ref>Noor Mubeen, ''Workload Frequency Scaling Law - Derivation and Verification.'' (2018-09) '''CACM 61''' No 9 pp. 43-47</ref> In the 2010s, data centers became fully virtualized, so that a lost compute cell would be automatically replaced by another and a signal raised to allow a human, at regular intervals, to replace all the faulty units with no down time. Now the data center is aware of all of the parameters that affect its performance, from the chip to the entire module of thousands of computers. The computer repairman operates under the direction of the data center itself, as just one component of this living system.
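
A minimal sketch of that kind of self-regulation, with invented thresholds and frequency steps (real frequency governors are far more sophisticated):
<pre>
// A minimal sketch of chip-level self-regulation: raise or lower the clock
// frequency to track load and cap heat. All thresholds and steps here are
// illustrative assumptions, not any vendor's actual governor.
public class FrequencyGovernor {
    private double frequencyGHz = 2.0;
    private static final double MIN_GHZ = 0.8, MAX_GHZ = 3.5;

    void adjust(double utilization) {   // utilization in [0, 1]
        if (utilization > 0.8 && frequencyGHz < MAX_GHZ) {
            frequencyGHz += 0.2;        // ramp up under heavy load
        } else if (utilization < 0.3 && frequencyGHz > MIN_GHZ) {
            frequencyGHz -= 0.2;        // save power (and heat) when idle
        }
    }

    public static void main(String[] args) {
        FrequencyGovernor core = new FrequencyGovernor();
        double[] load = {0.9, 0.95, 0.9, 0.2, 0.1, 0.1};
        for (double u : load) {
            core.adjust(u);
            System.out.printf("load=%.2f -> %.1f GHz%n", u, core.frequencyGHz);
        }
    }
}
</pre>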
  
 
===Software as a Living System===

In an example given by Lee,<ref name="lee">Edward A. Lee, ''Is Software the Result of Top-Down Intelligent Design or Evolution?'' (2018-09) '''CACM 61''' No 9 pp. 31-39</ref> an individual [[Entity]] is an executing computer program. While many programs are short-lived, others like Wikipedia seem to have a life of their own and even interbreed with other software systems, just like a living system. Such a system reacts to stimuli from its environment and requires energy and information inputs. It operates autonomously, defends itself from attack and replicates (the last three with help from humans at this point). Wikipedia is a good example of a symbiosis where people and programs inter-operate to the benefit of both. Certainly programs are written by humans with a purpose, but that purpose is driven by the current ecosystem and will adapt as the ecosystem adapts, or the program will just die out. The growth of [[Artificial Intelligence]] will make programs even more capable of self-adaptation to changes in the ecosystem, including to new attacks as they are deployed.
 
  
 
==References==
<references />

[[Category:Ecosystem]]
[[Category:Glossary]]
[[Category:User Experience]]
[[Category:Article]]
[[Category:Complexity]]
