General Theory of Living Systems

From MgmtWiki
Revision as of 10:54, 23 November 2018 by Tom (talk | contribs) (3 Converter)


Full Title or Meme

In order to understand or design any successful, continuing digital ecosystem, it is highly useful first to have a general theory of living ecosystems that we can use as a template to build a theory for a digital ecosystem.


In 1978 James Grier Miller attempted a General Theory of Living Systems[1] which proposed rigid taxonomies of levels of complexity and the common processes contained in each level. As with all taxonomies of living systems, this one necessarily has very fuzzy boundaries, which help a new learner but become an undue burden if carried to extremely fine detail. Miller understood all of this and selected the number of levels that best fit his data. Nonetheless, this page will continue on Miller's path and try the same taxonomic approach, but first a short description of alternate methods. Whereas Miller thought that living systems were a subset of all systems, this page takes the view that all complex, successful, enduring systems are subject to the laws of evolution (specifically survival of the fittest) and hence should be considered to be living systems.

In 1979 Douglas Hofstadter started making the case that natural thinking involves "chunking" small ideas together to make a smaller number of classes of objects[2] that hide the details, in ways that were then being developed in "object-oriented programming". He continued this line of development into the use of fluid analogies in our own thought as a model for how computer networks should work.[3] Analogies (which we make constantly, relentlessly and mostly unconsciously) are what allow categorization to happen. "Our minds are constructed with an unlimited capacity for 'chunking' primordial concepts, which then become larger concepts." As an example, the word "hub," as in "Denver is the hub for United Airlines," brings to mind a large linked network of concepts that are "chunked" together to make up the commonly used term. Other examples range from basics like "wheel" and "spoke" to higher-order concepts like "node" and "network." Higher-order concepts are assembled from lower-order concepts. He believes there is no fundamental difference between thinking with basic concepts and thinking with very large concepts, because we don't "see" inside them. We build concepts by putting several concepts together, putting a membrane around them, and giving that a name. Somewhat miraculously, these interior concepts disappear into the named higher-order concept.
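Hofstadter's "chunking" can be illustrated with a small sketch. The Concept class below is a hypothetical construct invented for this page, not anything from Hofstadter; it shows a named chunk that hides its interior concepts behind a single label while still allowing them to be unpacked.

```python
# Toy illustration of "chunking": a named concept wraps sub-concepts
# behind one label. Concept is a hypothetical class for illustration.

class Concept:
    def __init__(self, name, parts=None):
        self.name = name          # the label we actually think with
        self.parts = parts or []  # interior concepts, normally invisible

    def unpack(self):
        """Open the membrane: look inside the chunk when detail is needed."""
        return [p.name for p in self.parts]

# "Denver is the hub for United Airlines" -- the word "hub" chunks
# a whole network of lower-order concepts into one term.
wheel = Concept("wheel")
spoke = Concept("spoke")
node = Concept("node")
link = Concept("link")
network = Concept("network", [node, link])
hub = Concept("hub", [wheel, spoke, network])

print(hub.name)      # we reason with the single label...
print(hub.unpack())  # ...but the interior concepts are still there
```

The point of the sketch is that the higher-order concept is built from, and can be decomposed back into, lower-order ones.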

While high-level languages and object-oriented programming (OOP) were present in the early days of computing (e.g. MAD and Smalltalk), they did not capture a significant mind-share of working programmers until the release of Java in 1996. With OOP languages, encapsulation, or information hiding, became a recognized feature of computer system architectures. While this sounds like "chunking", one of its purposes was to hide details. Whereas living systems create abstract chunks that can still be opened, information hiding created pieces that were sealed and did not expose details of the implementation. So information hiding does not allow for level crossing and does not share the capability, found in open systems, for higher-level systems to directly interface with lower-level systems. Since this is a general theory, we cannot assume that information hiding is necessary or appropriate, although privacy advocates might disagree.
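The distinction drawn above can be made concrete with a minimal sketch. The class name and counter example are invented for illustration; the point is only that OOP-style information hiding seals the implementation, exposing nothing but the declared interface.

```python
# Sketch of OOP information hiding: state is sealed behind an
# interface. SealedCounter is an illustrative class, not from any
# source; Python enforces hiding only by convention (name mangling).

class SealedCounter:
    def __init__(self):
        self.__count = 0      # double underscore: name-mangled, hidden

    def increment(self):
        self.__count += 1

    def value(self):
        return self.__count   # only the exposed interface is visible

c = SealedCounter()
c.increment()
print(c.value())              # 1
# print(c.__count)            # would raise AttributeError: sealed
```

Unlike the "chunks" of the previous section, there is no sanctioned way for a caller to open this object and interface with its interior directly.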

In 2007 Cleland and Chyba wrote in a chapter of Planets and Life:[4] "In the absence of such a theory, we are in a position analogous to that of a 16th-century investigator trying to define 'water' in the absence of molecular theory." ... "Without access to living things having a different historical origin, it is difficult and perhaps ultimately impossible to formulate an adequately general theory of the nature of living systems."

Bayesian Identity Proofing provides the means for a collection of authentication and verification steps to be validated.
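A minimal sketch of how Bayesian Identity Proofing might combine several authentication and verification steps. The step names and likelihood ratios below are invented for illustration and would need calibration in any real deployment; the independence assumption is also a simplification.

```python
# Hedged sketch: combine identity-proofing signals by multiplying
# odds by each step's likelihood ratio (assuming independent steps).
# All numbers here are illustrative, not from any standard.

def posterior_odds(prior_odds, likelihood_ratios):
    """Bayes in odds form: posterior odds = prior odds x product of
    the likelihood ratios of the (assumed independent) steps."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

def odds_to_probability(odds):
    return odds / (1.0 + odds)

# Hypothetical evidence: password match, recognized device, SMS code
steps = [20.0, 5.0, 10.0]          # likelihood ratio of each step
odds = posterior_odds(1.0, steps)  # start from even (50/50) odds
print(odds_to_probability(odds))   # confidence after all steps
```

Each additional verification step raises (or, with a likelihood ratio below 1, lowers) the validated confidence in the claimed identity.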


Human cognitive processes are limited and not nearly as logical as some believe.

Cognitive Overload

Human short term memory is extremely limited.

Users come to a web site to get work done, yet this is what they are faced with:

  • Authentication
  • Registration
  • Choices for privacy, etc.
  • Stress of normal problems of life

Cognitive Dissonance

Human learning is based on causal relationships, primarily so that humans can know what to expect of any given situation. When something does not happen as their experience would indicate, they may find themselves at a loss as to how to react, resulting in confusion or loss of context. When UX authors talk about tradition, or user expectations, they are really addressing the cognitive dissonance that occurs when things don't go the way a user expects. Continued cognitive dissonance will lead to anxiety or anger, which will drive people away.


Karl Popper started a theory of how people rely on tradition in making sense of this world,[5] which is paraphrased here. A collection of features of our society, its institutions and traditions, gives a sense of order and discipline to our daily lives, without which we face anxiety and even terror about our future. What we experience as a normal social life can exist only if we can know, and can have confidence, that there are things and events which must continue and will not be different in the near future. It is in our need for some level of certainty in our lives that the role of tradition can be understood. We will become anxious, frustrated and even terrified by our environment if it does not contain a considerable amount of order, a great number of regularities to which we can adjust ourselves. The mere existence of these regularities is perhaps more important than their particular merits or demerits. They are needed as regularities and are therefore handed on as traditions, whether or not they are in other respects rational or necessary or good or beautiful. There is a need for tradition in social life.

Causal Relationships

A new and unconfirmed theory suggests that perception, motor control, memory and other brain functions all depend on comparisons between ongoing actual experiences and the brain's modeled expectations. This is the predictive coding explanation for how the brain works. The theory started with researchers trying to teach a computer to understand images fed into it. Their algorithm makes predictions about the environment and then compares the predictions to reality, just the way that humans learn to navigate in the real world, and now in the digital world. Designers need to be aware of this learning pattern and avoid changes that cause anxiety in users by testing new designs to see how real users react. Nothing is more frustrating to a user than to have an update to an often-used interface suddenly make it behave in unexpected ways. The user's response is unlikely to match what the designers and developers expected.[6]
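The core loop of predictive coding can be sketched in a few lines: an internal estimate is repeatedly corrected by the error between what the model predicts and what is observed. The learning rate and the observed value below are illustrative, not drawn from any particular model.

```python
# Minimal predictive-coding sketch: the model's estimate is nudged
# toward the observation by a fraction of the prediction error.
# rate and the observed signal (10.0) are illustrative choices.

def predictive_update(estimate, observation, rate=0.2):
    error = observation - estimate   # prediction error
    return estimate + rate * error   # correct the internal model

estimate = 0.0
for _ in range(50):
    estimate = predictive_update(estimate, 10.0)

print(round(estimate, 2))  # the estimate converges on the observation
```

When the environment changes (the observation jumps), the prediction error spikes until the model re-converges, which is one way to picture the user confusion described above when an interface suddenly changes.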

Level Dependencies

Each level of complexity depends upon the support of all the other levels, but most directly on the levels below it.

Just as civilization depends on the good will of the large majority of its citizens, so too a personal computer depends on the existence of some ecosystem where it can successfully be deployed.

Yet even the least complex life form, the virus, or the management chip buried deep in some data center, can be the portal for an infection that will have consequences at the highest level.

Successful ecosystems

For an open ecosystem to be successful it must be resilient to change in every one of its components at every level, since the ecosystem must dynamically evolve as new versions of old components are constantly added and the results of many of those changes are unpredictable.

It is not possible to design an ecosystem in advance, only to design components with the best knowledge available at the time and introduce each new component, gradually if a great impact is expected. For ecosystem changes, design thinking involves picking a way that the ecosystem should change and then designing a component that will help it move in that direction. Any design that depends on a completely new ecosystem is not likely to develop as expected. Before designing a change to an existing component, it is necessary first to peel back all of the assumptions that went into the existing design to see which are critical to the continued success of the ecosystem and which are available for alteration.[7]


Taxonomy of Levels of connected Systems

Level | Name | Typical use | User Experience
1 | Chip | Management of computer | Only by administrators
2 | Board | Computer in the Internet of Things | Room temperature or video surveillance
3 | Single-processor computer | Accessing web sites | Simple queries of the web
4 | Multiple-processor computer | Local processing & virtualization | Mobile or desktop device that maintains user info
5 | Data center | Collection of computers in a single location | Only by administrators
6 | Cloud (single owner) | Social network | Interaction of user searches and tracking
7 | Internetwork of clouds | Maintenance of names | User cannot access desired resources for security reasons

Miller's "Processes" that work at all levels

Miller originally described 19 processes; defense and timer were added later. The processes marked "Energy Matter" in the table below are the ones that deal with energy or matter, not the primary focus of this presentation. Two special cases are the reproducer, which might include cloning but is not further discussed here, and the boundary, which is important for identity and information hiding and so will be considered here.

No. | Name | Function | Importance
1 | ingestor | input food, parts, energy | Energy Matter, important to computers
2 | distributor | move food, parts, energy | Energy Matter, important to computers
3 | converter | change from one form to another | Energy Matter, might be important to computers
4 | producer | establish relationships | Energy Matter, important to interconnections
5 | storage | could include a variety of back-up | Energy Matter, important for reliability
6 | extruder | expel wastes or products | Energy Matter, waste heat is an issue
7 | motor | move the system or its parts | Energy Matter, important for robots
8 | supporter | how everything fits together | Energy Matter, important for maintenance
9 | reproducer | create new copies of self | Energy Matter and information, future concern
10 | boundary | maintains integrity | Energy Matter and information, allows a continuing identity
11 | input transducer | detect changes or commands | all recorded external events
12 | internal transducer | internally created events | tracks changes due to internal operations
13 | channel and net | looks externally in all directions | Ethernet, high-speed local nets
14 | timer (added later) | internally generated events | could be considered a part of internal transducer
15 | decoder | converts data to meaning | for processing at a more abstract level
16 | associator | learning | intelligence, artificial or otherwise
17 | memory | static storage and remembering | allows active searching
18 | decider | executive decision maker | flexibility, reliability, adaptability
19 | encoder | converts internal data for reuse | translation, either for storage or for transmission
20 | output transducer | sends messages outbound | display or speech (user experience)
21 | defense (recent) | no component survives without it | antigen in component or higher

Here we will drill into only a few of these processes:

3 Converter

As a general rule, information systems delegate the problems of the supply of component materials or energy to experts. This section serves primarily to acknowledge the need for food inputs, in particular energy inputs. At the cell level the human being is a converter of matter and energy into information, but at low efficiency, except for the brain, which appears to have maximized the computation per unit of energy.[8] Similarly, in 1973 Charles Bennett, building on the work of IBM's Rolf Landauer, showed that general-purpose computation can be performed by a logically and thermodynamically reversible apparatus, which can operate with arbitrarily little energy dissipation per step because it avoids throwing away information about past logical states; and in 1982 he proposed a reinterpretation of Maxwell's demon, attributing its inability to break the second law to the thermodynamic cost of destroying, rather than acquiring, information.[9] In other words, it takes more energy to erase the blackboard than it does to fill it up.

Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment".[10]
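The bound itself is easy to compute: erasing one bit must dissipate at least k_B·T·ln 2 joules, where k_B is the Boltzmann constant and T the absolute temperature. A quick calculation at room temperature:

```python
import math

# Landauer's bound: minimum energy to erase one bit is k_B * T * ln 2.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules dissipated by erasing one bit."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K) the bound is tiny, roughly 2.9e-21 J,
# many orders of magnitude below what today's logic gates dissipate.
print(landauer_limit(300.0))
```

The gap between this limit and the energy real gates spend per operation is why the converter process still matters so much to computing.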

Free energy has been proposed as a measure of evolutionary pressure. Using the availability of energy to create the information reservoir needed to enable a civilized society, White showed in 1949[11] that culture is "an elaborate thermodynamic mechanical system" designed to carry on the life processes of man by harnessing and controlling energy. It should be little surprise that in 2006 Karl Friston postulated that free energy (E−TS)[12] was the necessary precursor of any artificial intelligence;[13] but the idea was lost on computer scientists, who typically think about the speed of computation rather than the energy required to power their dreams. See also Wikipedia on the Free Energy Principle, which makes the idea seem harder than it really is.

Hardware as a Living System

At least since the 1970s computer boards have come with "self-aware" components, often independent computers that monitor the health of the main computer system. Since the 1990s the computer chips themselves have been aware of the compute load that they are handling and adapt the number of circuits that are powered, or even the frequency of the clock used on individual cores, so that power consumption (and heat generation) is kept to a minimum.[14] In the 2010s data centers became fully virtualized, so that the loss of a single compute cell is automatically made good by another and a signal is raised to allow a human, at regular intervals, to replace all the faulty units with no downtime. So now the data center is aware of all of the parameters that affect its performance, from the chip to the entire module of thousands of computers. The computer repairman operates under the direction of the data center itself, as just one component of this living system.
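The payoff of the chip-level adaptation described above can be sketched with the conventional model of dynamic CMOS power, P ≈ C·V²·f. The capacitance, voltage and frequency figures below are illustrative only, not taken from any real part.

```python
# Hedged sketch of why chips scale clock and voltage together:
# dynamic switching power in CMOS logic is conventionally modeled
# as P = C * V^2 * f. All parameter values below are illustrative.

def dynamic_power(capacitance_farads, voltage_volts, frequency_hz):
    """Classic switching-power model for CMOS logic."""
    return capacitance_farads * voltage_volts ** 2 * frequency_hz

full = dynamic_power(1e-9, 1.2, 3e9)    # full speed, full voltage
slow = dynamic_power(1e-9, 0.9, 1.5e9)  # halved clock, lowered voltage

print(slow / full)  # well under 0.5: the V^2 term makes the saving
                    # superlinear when f and V drop together
```

Because lowering the frequency also permits lowering the voltage, the power saved is much more than proportional to the slowdown, which is exactly the lever the self-aware chip pulls.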

Software as a Living System

Lee[15] gives an example in which an individual entity is an executing computer program (or programs). While many are short-lived, others like Wikipedia seem to have a life of their own and even interbreed with other software systems, just like a living system. Such a system reacts to stimulus from its environment and requires energy and information inputs. It operates autonomously, defends itself from attack and replicates (the last three with help from humans at this point). Wikipedia is a good example of a symbiosis where people and programs inter-operate to the benefit of both. Certainly programs are written by humans with a purpose, but that purpose is driven by the current ecosystem and will adapt as the ecosystem adapts, or it will just die out. The growth of Artificial Intelligence will make programs even more capable of self-adaptation to changes in the ecosystem, including to new attacks as they are deployed.


  1. James Grier Miller Living Systems 1978 ISBN 978-0070420151
  2. Douglas R. Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid 1979 Vintage ISBN 0-39474502-7
  3. Douglas R. Hofstadter, Fluid Concepts and Creative Analogies: Computer Models Of The Fundamental Mechanisms Of Thought 1995 ISBN 978-0465024759
  4. Woodruff, T. Sullivan; John Baross. Planets and Life: The Emerging Science of Astrobiology. 2007-10-08 Cambridge University Press. ISBN 978-0521531023
  5. Karl Popper, Conjectures and Refutations. (1961) Routledge chapter 4 ISBN 0-415-28594-1
  6. Jordana Cepelewicz. To Make Sense of the Present, Brains May Predict the Future. Quanta (2018-07)
  7. Terry Winograd and Fernando Flores, Understanding Computers and Cognition 1986 ISBN 0-201-11297-3
  8. Liqun Luo, Why Is the Human Brain So Efficient? Nautilus (2018-04-12)
  9. Charles Bennett, CV
  10. Charles H. Bennett, Notes on Landauer's principle, Reversible Computation and Maxwell's Demon. Studies in History and Philosophy of Modern Physics 34(3) pp. 501–510 (2003) DOI 10.1016/S1355-2198(03)00039-X
  11. L.A. White, The science of culture. (1949) Grove Press p 361 ASIN B000OMISVM
  12. L.D. Landau and E.M. Lifshitz, Statistical Physics. (1958 in English) Pergamon Press p. 45
  13. Friston, K., Kilner, J., & Harrison, L. A free energy principle for the brain. (2006) J Physiol Paris. , 100 (1–3), 70–87.
  13. Noor Mubeen, Workload Frequency Scaling Law - Derivation and Verification. (2018-09) CACM 61 No 9 pp. 43-47
  14. Edward A. Lee, Is Software the Result of Top-Down Intelligent Design or Evolution? (2018-09) CACM 61 No 9 pp. 31-39