Complexity

Full Title

Complexity seems to be an Emergent Behavior arising from the natural selection of small changes that make more efficient use of information and energy.

Context

This wiki page is about complex systems. Computational Complexity Theory is simple in comparison to this topic. Perhaps the best single definition of Computational Complexity Theory is Kolmogorov's: the complexity of a result is the length of the smallest program that can create it. By that definition perhaps the universe is just the smallest system able to create human life. While that statement is (in theory) Falsifiable, we certainly have not found a counterexample to it.
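
For reference, the usual formal statement of Kolmogorov's definition (a standard textbook formulation, not spelled out on this wiki page) measures the complexity of an output x as the length of the shortest program p that makes a universal machine U produce x:

    K_U(x) = \min \{\, |p| : U(p) = x \,\}

Read this way, the sentence above claims that the universe is close to a minimal program whose output includes human life.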

Another good definition of Complexity is "the study of the phenomena which emerge from a collection of interacting objects".[1]

Historical Context

  • What was once a simple Homeostatic ecosystem will try to become more efficient at using the resources available to it. Originally the resources were land and human labor. As humans learned more about nature the resources evolved to energy and information.
  • Archaeology has taught us that the rise and fall of complex societies[2] has been evident throughout human history. In general the organizational capability of the leaders in creating and maintaining complex societies became the measure of greatness in a civilization.
  • Recent historians[3] have described history as a series of peaks and valleys as societies rise and fall, but the overall trend is that both the peaks and the valleys keep getting higher.
  • Nassim Taleb has described[4] a more fractal view of the peaks and valleys as being completely unpredictable. His arguments may not imply that the peaks and valleys will not continue to rise, but only that the variation between a peak and a valley will rise as well.
  • Joseph Tainter observed[5] that each new level of social complexity is added at a cost, if only a cost in surplus production. As societies become more complex the marginal returns for increasing complexity diminish. Collapse is often a result of the diminishing value of complexity.
  • Modern technological society seems to value creative destruction[6] so that the peaks and valleys can continue, but much more rapidly and, hopefully, limited in scope.
  • Maximization of return on investment leads to more dependence on other organizations, both corporate and national, which leads to reduced resilience and loss of functionality when some level stops functioning as expected.
  • The Inequality that Piketty[7] says is inevitable in capitalism leads inexorably to instability. Boushey says[8] "Countries that have this deep inequality, like we do, are much more prone to financial crises in no small part because high wealth inequalities leads to more debt, which just makes your economy more fragile."

Mathematical Context

Also see the wiki page Computational Complexity Theory for more detailed information.

  • The first description of complex systems was generated by Edward Lorenz in the 1950s as he was trying to predict weather patterns. His equations introduced a time dependency which made them irreversible, which is inconsistent with the Determinism promoted by most physicists today. (A minimal numerical sketch of the sensitivity such systems exhibit follows this list.) It was rediscovered by [9]
  • Another major contributor to complexity theory is John Holland, a computer scientist and professor at the University of Michigan. Holland designed the genetic algorithm based on the idea that components of complex systems can be broken down into building blocks, whose characteristics can then be represented in code. [10] (A toy genetic-algorithm sketch also appears after this list.)
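
The sensitivity referred to above can be illustrated with the later and better-known Lorenz (1963) system; this is a generic numerical sketch, not code from the paper cited in [9], and the parameter values are simply the classic textbook ones. Two runs that start one part in a billion apart drift far away from each other even though both follow exactly the same deterministic rule:

    # Minimal sketch: sensitive dependence on initial conditions in the
    # Lorenz (1963) system, advanced with a simple Euler step.
    def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dx * dt, y + dy * dt, z + dz * dt)

    a = (1.0, 1.0, 1.0)           # one starting point
    b = (1.0, 1.0, 1.0 + 1e-9)    # the same point, perturbed by one part in a billion

    for step in range(40001):
        if step % 10000 == 0:
            gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
            print(f"t={step * 0.001:5.1f}  separation={gap:.3e}")
        a = lorenz_step(a)
        b = lorenz_step(b)

The separation grows by many orders of magnitude over a few tens of simulated seconds, which is why long-range weather prediction defeated Lorenz's equations.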
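
Holland's building-block idea can likewise be shown with a toy genetic algorithm; this is a generic textbook-style sketch (not Holland's own code), in which 20-bit strings are the building blocks and fitness is simply the number of 1 bits:

    import random

    # Toy genetic algorithm: evolve bit strings toward all ones using
    # selection, one-point crossover, and mutation.
    random.seed(1)
    LENGTH, POP, GENERATIONS = 20, 30, 40

    def fitness(genome):
        return sum(genome)                        # count of 1 bits

    def crossover(mother, father):
        cut = random.randrange(1, LENGTH)         # one-point crossover
        return mother[:cut] + father[cut:]

    def mutate(genome, rate=0.02):
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]

    for generation in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[: POP // 2]          # keep the fitter half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    print("best fitness:", fitness(max(population, key=fitness)), "out of", LENGTH)

Even this minimal version shows the point made in the bullet above: once the components are represented in code, recombining and selecting them is enough to climb toward better configurations.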

Levels of Complexity

  • Complexity started with division of labor. Each citizen had a set of tasks to perform. These were largely horizontal categorizations of tasks with perhaps a single chief.
  • The tribes of humans led by a chief were largely independent and often at war over resources.
  • The first step toward complex societies was the vertical separation of tasks that raised the importance of the chief and created a hierarchy of chiefs.
  • The apotheosis of vertical hierarchy was the feudal systems of one to two millennia ago.
  • War between tribes is not efficient. Eventually most people got the point and sought to eliminate war. That is still a work in progress.
  • A single chief was efficient, but only in the short run. (Power corrupts and absolute power corrupts absolutely.) Eventually most people got the point and tried a more decentralized approach to managing civilization. That is also still a work in progress.
  • Since Hobbes[11] the concept of individual liberty and agency has been a powerful paradigm for the growth of complexity. That has fostered a highly complex (horizontal) separation of functions among the workforce, but also fostered a "winner-take-all" system that recreates vertical levels of wealth very much like those in a feudal society.
  • Increased differentiation between people's work leads to more interdependence and complexity, which requires more cooperation at all levels.
  • The new levels of complexity are human organizations like IBM (last century) and Apple (this century). Note that these complex organizations seem to have short lives as leviathans, which is probably a good example of creative destruction.
  • This brings us to the levels of complexity within the ecosystems. It is not only the legal organizations like IBM and Apple that structure our lives, but other webs of interaction. So IBM and Apple depend upon the web of technology suppliers and standards that enables their dominance.
  • The cycle of complexity described above does not necessarily lead to a collapse. An alternative is a system that sheds some information so that it may grow to the next level of complexity. Studies reported in Nature Ecology and Evolution have shown cases where during evolution some animals, like the sea squirt, have lost genes (information) that were previously deemed necessary for brain function so that they could create new ways of building brains.[12] That means that collapse is not a requirement of growth if you can construct a system that sheds information that is not absolutely required. Unfortunately the only way to know if information is required is to eliminate it and see what happens.

Complexity as a Problem

Humans instinctively look for simple solutions to complex problems. But when standards teams get together, they try to accommodate all sorts of use cases. Consider the Software Bill of Materials (SBOM), which could be a simple list of dependencies, or a modular model that links other types of data, such as build tools and vulnerability data, into an SBOM (see the sketch after the quote below).[13]
“The risk in this approach is we’ve all seen the world littered with really great standards ideas that no one ever touches because they were too complex and they were trying to cover too broad [an area]. SBOM’s power is that it can apply to the entire world of software which is giant,” ranging from a cloud container to medical devices to 5G and 6G radio access networks. “Complexity delivers value,” but it can also be “the enemy of security and success.”
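
The contrast between the two shapes of SBOM mentioned above can be sketched as data; the field names here are invented for illustration and do not follow SPDX, CycloneDX, or any other real schema:

    # Illustration only: two shapes an SBOM can take.
    # Field names and versions are hypothetical, not from any real SBOM standard.

    # 1. The simple shape: a flat list of dependencies.
    simple_sbom = ["openssl 3.0.8", "zlib 1.2.13", "libcurl 8.1.2"]

    # 2. The modular shape: the same dependency list plus linked modules
    #    for build tooling and known-vulnerability data.
    modular_sbom = {
        "components": [
            {"name": "openssl", "version": "3.0.8"},
            {"name": "zlib", "version": "1.2.13"},
            {"name": "libcurl", "version": "8.1.2"},
        ],
        "build": {"compiler": "gcc 12.2", "pipeline": "ci.example.org/build/42"},
        "vulnerabilities": [
            {"component": "openssl", "id": "CVE-XXXX-NNNNN", "status": "patched"},
        ],
    }

    print(len(simple_sbom), "flat entries;", len(modular_sbom), "linked sections")

Every linked section adds value for some consumer of the SBOM, but each one also widens the specification that every producer must understand, which is the complexity trade-off the quote above describes.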

As M. Mitchell Waldrop states in Complexity, “The edge of chaos is the constantly shifting battle zone between stagnation and anarchy, the one place where a complex system can be spontaneous, adaptive, and alive.” And so, the outcomes will never be fully determined in advance.

Natural Lifecycle of Complex Ecosystems

Most studies of complex systems have focused on their rise rather than their fall. The most famous study of the fall of an empire was Edward Gibbon's[14] history, which was written before social structure was a topic of academic interest.[15] Joseph Tainter described how 18 different ancient civilizations collapsed and the implications this holds for current civilizations.[5]

Examples of an excess of complexity and the collapse of little ecosystems are rife throughout technology. Programming languages gain in complexity as more and more people seek to use those languages in more and more systems. Many ways are found to improve the functionality and performance of a language over time by application of the universal rule in computer systems that any problem in computer science can be solved by adding yet one more layer of abstraction. But complexity adds cost through the difficulty of using the language and the fragility of complex solutions, as ever more layers of abstraction make bugs both more common and more difficult to fix. Identity management solutions offer several examples of this. The example with the shortest lifecycle of complexity collapse was the "Directory Access Protocol" (DAP), which was created by hordes of the most dedicated technologists of computer identity. The resulting standard was so complex that it was never broadly adopted and quickly resulted in the "Lightweight Directory Access Protocol" (LDAP), which was adopted by the most ubiquitous solutions, like Microsoft's Active Directory. It too succumbed to complexity creep and is now being replaced by OpenID Connect, which has already evidenced symptoms of complexity creep. That example of complexity creep in standards prior to any wide-scale deployment is being revisited by the "Self-Sovereign Identity" standardization process, which cannot even get published due to a confusing welter of amendments that are raising its complexity before any wide-scale adoption.

Complex Adaptive Systems (CAS)

The emergence of complexity has been mathematically modeled as a CAS. The challenge in the real world is that quantitative measurements are not always available. One approach to CAS is to use Living Ecosystems to describe organizational processes. Dooley (1997) is perhaps the most widely cited example of this approach. CAS has also been used in health informatics research (e.g. Day & Norris, 2007; Ward, Stevens, Brentnall, & Briddon, 2008).

While the use of biological concepts such as Ecosystems and autopoiesis (self-organizing systems) is described and applied in Dooley's paper, there is an attempt to examine the underlying role of complexity and interdependence inherent in the CAS view. However, the biological ecosystem view of organisations has been justifiably criticised for its lack of clear connection between the biological concept of species and a corresponding unit of construction in human organisations (Young, 1988). Nonetheless, there is some recognition of the potential of CAS in the human sciences, as well as in the field of health informatics. Therefore, we will attempt to apply the CAS to our domain of study. We start with the assumption that while the biological approach to the analysis of organisations is informative, there are no direct correspondences. That is, we assume the underlying phenomenology of things like resource limitation and the unit of information (i.e. DNA in biology, an unknown entity in organisation studies) is sufficiently different between the two fields as to be not directly comparable. Ecosystems and organisations are both constrained by resource limitations, by the internal structure of their interacting components, and by their relationship to their external environment. However, the economics of the underlying resources are substantially different. While ecosystems are generally limited by nutrient availability, the resource limitations for human organisations are material, financial and human. What is common between the two systems is that the flows of these resources are important drivers of change and homeostasis. Therefore it appears that a direct analysis of the dynamic processes that underlie resource flows should be useful in defining a more robust conceptual basis for organisational ecology.

Baranger (2002) provides an excellent non-technical summary of complexity theory, which is outlined in the remainder of this section. Because Baranger's disciplinary perspective is theoretical physics, his writing remains close to the mathematical underpinnings of complexity theory, and his grounding in an application, along with his clear teaching skills, is very instructive: it provides a clear logical explanation of how to link the abstract mathematics of complexity to an applied dimension.

Complex Adaptive Systems are difficult to understand because of the interaction between two fundamental components: chaos and complexity. Chaos can be a property of simple systems (i.e. systems with few parameters), and the results of chaotic models are by definition intrinsically unpredictable. Baranger states that chaos is "that part of mathematics where calculus does not apply". One of the defining features of chaos is sensitive dependence on initial conditions (e.g. in our study it may be that the initial training approach varies between units in small ways, but these small differences might have dramatic consequences). Complexity is different from chaos. The human body, weather patterns, and ecosystems are all examples of complex systems where the individual constituents self-organise and the whole is greater than the sum of its parts. Emergence (as in emergent properties) is a phenomenon stemming from complexity where the organisation and interactions at one level of a system cause changes at another level. A system whose configuration is capable of changing over time is called a dynamic system.
A dynamic model is a mathematical model or a set of rules describing the time dependence of a point's position in space (either physical space or a more abstract idea of space). A simple example of a dynamical system, as would be described in any introductory physics book, is the swinging of a pendulum (a minimal numerical sketch follows below). Chaos has a close relationship with complexity. Complexity has the property of multiple interacting components, each of which may or may not be a chaotic subcomponent. The network of interactions is compounded by stochasticity (probabilistically determined variation). In thermodynamics, the statistical model of probabilistic variation is described by the concept of entropy.

An adaptive system is one which interacts with itself and its environment to achieve an end. A simple example of an adaptive system is Stevenson's governor, a mechanical mechanism that prevents excessive speeding of engines. A more complex example is that of homeostatic systems in living organisms, for example body temperature regulation.

Entropy is an important part of any system as it helps define whether a system is closed (independent) or open (dependent on other systems). In thermodynamics, the entropy (degree of disorder) of a closed system increases over time. High entropy systems have high levels of disorder, and the components of a high entropy system are generally seen as possessing disorder whose atomic configurations are uninteresting. However, the effects of a transient increase in entropy can be interesting. A substantial outage of the electronic documentation system of our study site is a good example of a transient increase in the rate of the accumulation of entropy, which will be discussed next.

One fascinating property of entropy is that even in the physical sciences it is a constructed concept, which is used to make "reality" more manageable. The smoothing procedure used for entropy analysis defines the scale beyond which the analyst is unable or unwilling to keep track of details. Smoothing represents a self-imposed (subjective) increase in the entropy of the system; the key to understanding this procedure is to optimise the level of analysis at which it is performed. As our data consist of individual interviews, we need to understand the nature and quality of the data we gather, and at what level we maximise its meaning. This in turn allows us to improve our understanding of the flow of resources within the organisation.
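
As a concrete companion to the pendulum example, the following is a minimal sketch (a generic textbook discretisation, not taken from Baranger) of a dynamic system: the state is a pair (angle, angular velocity), and a fixed rule advances it through time:

    import math

    # Minimal dynamic system: a frictionless pendulum. The state
    # (theta, omega) is advanced by a small-step update of
    # d(omega)/dt = -(g/L) * sin(theta),  d(theta)/dt = omega.
    g, L, dt = 9.81, 1.0, 0.001     # gravity, pendulum length, time step
    theta, omega = 0.5, 0.0         # initial angle (radians) and angular velocity

    for step in range(5001):
        if step % 1000 == 0:
            print(f"t={step * dt:4.1f}s  angle={theta:+.3f} rad")
        omega += -(g / L) * math.sin(theta) * dt
        theta += omega * dt

Unlike the Lorenz sketch earlier, a single frictionless pendulum is a dynamic system but not a chaotic one, illustrating that time dependence alone does not produce chaos.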

Complex Adaptive Systems' quantitative roots do not exclude their use for solving qualitative problems. For example, a quantitative problem in electronics would be to calculate the change in voltage in a lighting circuit when a change occurs. A qualitative equivalent would be to determine whether the light bulb becomes dimmer or brighter as a result of that change. It should be clear that where the number of parameters is high, or measurement is uncertain, or where a chaotic system is suspected, a qualitative solution will be more achievable and likely more desirable. This brief summary should illustrate that CAS as an ontology (a framework to generate meaning) is capable of bridging the divide between the positivist and post-positivist perspectives, that is, between the view that there is a "true" reality and the idea of a socially constructed reality (Lichtenstein, 2000). In the search for improved understanding in social research we need to evaluate this way of looking at things in order to determine how useful it is, and to determine whether this lower-level CAS view, compared to the organisation-as-ecosystem approach, is useful in providing explanations of change processes.

References

  1. Neil F. Johnson, Chapter 1: Two's company, three is complexity from Simply complexity: A clear guide to complexity theory. (2009) Oneworld Publications. ISBN 978-1780740492
  2. Greg Woolf, Archaeological Narratives of the Collapse of Complex Societies in Decline and Decline-Narratives in the Greek and Roman World, (2017) Oxford
  3. V. G. Childe, What Happened in History (1942) Penguin
  4. Nassim Nicholas Taleb, The Black Swan (2007) Random House ISBN 9781400063512
  5. Joseph Tainter, The Collapse of Complex Societies (1988) Cambridge University Press ISBN 0-521-34092-6 https://wtf.tw/ref/tainter.pdf
  6. Joseph Schumpeter, Capitalism, Socialism, and Democracy (1942) Harper & Bros
  7. Thomas Piketty, Capital in the Twenty First Century (2014-01-01) Belknap Press ISBN 978-0674430006
  8. Heather Boushey quoted by Katy Lederer Equality? That's What's Good for Growth (2020-08-30) New York Times p B1ff
  9. Edward N. Lorenz, "The statistical prediction of solutions of dynamic equations" (1960). Symposium on Numerical Weather Prediction in Tokyo. http://eaps4.mit.edu/research/Lorenz/The_Statistical_Prediction_of_Solutions_1962.pdf
  10. Reference for Business https://www.referenceforbusiness.com/management/Bun-Comp/Complexity-Theory.html#ixzz7nlrEHBIA
  11. Thomas Hobbes, Leviathan or The Matter, Forme and Power of a Commonwealth Ecclesiastical and Civil (1668)
  12. Viviane Callier, By Losing Genes, Life Often Evolved More Complexity (2020-09-01) Quanta https://www.quantamagazine.org/by-losing-genes-life-often-evolved-more-complexity-20200901/
  13. Sara Friedman, Federal officials warn against increasing complexity when developing standards, policies Inside Cybersecurity (2022-08-26) https://insidecybersecurity.com/share/13832?sf169739228=1
  14. Edward Gibbon, "The History of the Decline and Fall of the Roman Empire" (1776)
  15. Ben Ehrenreich, How Do You Know When Society Is About to Fall Apart? New York Times (2020-11-08) https://www.nytimes.com/2020/11/04/magazine/societal-collapse.html?action=click&module=Editors%20Picks&pgtype=Homepage

Other Material