
Complex Adaptive Systems
Si Guide Series
Overview
Complex adaptive systems are all around us, from financial markets to ecosystems to the
human immune system and even civilization itself. These systems consist of many agents
that act and react to each other's behavior; out of this often chaotic set of interactions
emerge global patterns of organization in a dynamic world of constant change and
evolution where nothing is fixed.

In this guide, we will begin by discussing adaptation itself, including cybernetics and
how systems regulate themselves to respond to change. We'll then proceed to
discuss the dynamics of cooperation and competition using game theory. We then look at
the process of evolution as a powerful and relentless force that shapes complex adaptive
systems on the macro-level. Finally, the idea of resilience is introduced and
contrasted with robustness.
Where am I?
This guide forms part of our set of 20 guides covering all things systems
innovation: Systems Thinking, Systems Awareness, Systems Theory, Complexity
Theory, Adaptive Systems, System Inquiry, Systems Modeling, Mapping, Actor
Mapping, Leverage Points, Transition Design, Transition Models, Futures &
Narratives, System Gardening, Co-Design, Ecosystem Building, Networks, Value
Network, Scaling Change, and Systemic Evaluation.
Guide Contents

Adaptive Systems: How systems adapt through regulatory processes and respond to their environment
Game Theory: We look at situations of cooperation & competition between adaptive agents
Evolution: How complex adaptive systems manage to adapt to a changing environment
Resilience: Learn the difference between robustness and resilience
Adaptive Systems
What is an Adaptive System?
We can define adaptation as the capacity for a system to change its
state in response to some change within its environment. An adaptive
system, then, is a system that can change given some external
perturbation, doing so in order to optimize or maintain its
condition within an environment by modifying its state.

Adaptation simply means the system can generate a number of
different responses to a set of changes within the state of its
environment. This adaptive capacity gives a system some flexibility
that improves its performance.

The growth of a plant or fungus towards a source of light, what is
called phototropism, is an example of adaptation: an external
influence creates a response within the
system. Entities that are capable of more advanced forms of
adaptation have specialized subsystems dedicated to regulating this
process of adaptation. We call these specialized components
regulatory or control systems.
What is Agency?
All adaptive systems regulate some process and they do this in order to
maintain and develop their structure and functioning. For example,
plants process light and other nutrients and their adaptive capacity
enables them to alter their state so as to intercept more of those
resources. The same is true for bacteria and animals, and likewise for
a basketball team or a business: they all have some conception of value
that represents whatever resource they require, whether that
is sunlight, fuel, food, money, etc.

This creates what we can call a value system, that is to say, whatever
structure or process they are trying to develop forms the basis for their
conception of value. Agents use their agency to act and make choices in
the world to improve their status with respect to whatever it is they
value; this value system may be very simple or very complex. All adaptive
systems have a degree of agency. Agency is the capacity to make
choices based upon information and act upon those choices
autonomously to affect the state of their environment.
Cybernetics
Cybernetics is the study of control, communications and
information processing within systems of all kinds, biological,
mechanical and social. Norbert Wiener - one of the founders
of the subject - defined cybernetics as “the scientific study of
control and communication in the animal and the machine.”
The word cybernetics comes from a Greek word meaning
“governance” or “to steer, navigate or govern.”

The primary objects of study within cybernetics are control
systems that are regulated by negative feedback loops. There
are essentially just three components to any given control
mechanism. Firstly, there needs to be some form of a sensor
for feeding information into the system. Secondly, there
needs to be a controller that contains the logic or set of
instructions for processing this information. Lastly, there is an actuator
that executes some action in order to affect the state of the
system or its environment.
Homeostasis
Many types of systems require both a continuous input of resources
from their environment and the capacity to export entropy back to the
environment in order to maintain their given level of functionality. A
business organization is one example, requiring a continuous revenue
stream to pay its employees and suppliers, while also producing a
certain amount of waste material that it must externalize.

In order for these systems to maintain their intended level of
functionality - what we might call their normal or equilibrium state -
they must have an environment that is conducive to providing them
with these required conditions. Within ecology and biology the term
homeostasis is used to describe this phenomenon.

The word homeostasis derives from the Greek words homos,
meaning "similar", and stasis, meaning "standing still". It is the state of a
system in which variables are regulated so that internal conditions
remain stable and relatively constant, despite changes within the
system's environment.
Regulation
In order for systems to maintain homeostasis, there needs to be
some kind of regulatory mechanism; what we also call a control
system. This control mechanism has to regulate both the
system’s internal and external environment to ensure that the
environmental conditions are within the given set of parameters
that will enable the internal processes of the system to function
at a normal, equilibrium state.

Cybernetics is the area of systems theory that studies these
regulatory mechanisms. Cybernetics comes from a Greek word
which means to steer or guide, and this is exactly what a control
system is designed to do. It is designed to guide the system in
the direction of the set of environmental parameters that are
best suited for it to maintain homeostasis. Systems use their
regulatory mechanisms to maintain their internal composition
within just the right parameters required for the needed
processes to take place.
Feedback Regulation
Feedback loops are a fundamental object of study within cybernetics
in that they are responsible for the process of regulation within all
control systems. Negative feedback is at the heart of processes of
regulation and control of all kinds.

An example of this might be the feedback loops that regulate the
temperature of the human body. Different body organs work to
maintain a constant temperature within the body by either conserving
or releasing more heat. Through sweating and capillary dilation, they
counterbalance the fluctuations in the external environment's
temperature.

Another example of a negative feedback control process would be
the relation between the supply and demand of a product. The greater the supply
of a product, the lower the price becomes, which feeds back
to induce the producers to lower production. This is a decentralized
form of regulatory process: without any one person directing the
market, the feedback prevents overpricing and overproduction.
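
This loop can be sketched in a few lines. Below is a toy Python simulation of the supply-demand feedback just described; the coefficients and starting values are illustrative assumptions rather than figures from this guide, but the negative feedback structure is the one described above.

```python
# Toy decentralized market: excess demand pushes the price up, and a
# high price induces producers to raise supply. Both feedbacks are
# negative, so the system settles toward equilibrium with no one
# directing it. All numbers are made up for illustration.

price, supply = 10.0, 50.0
for step in range(20):
    demand = 100.0 - 4.0 * price           # demand falls as price rises
    price += 0.05 * (demand - supply)      # excess demand raises the price
    supply += 0.5 * (price - 10.0)         # high prices induce production
print(f"price {price:.2f}, supply {supply:.1f}, demand {demand:.1f}")
```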
How It Works: Control Systems

Sensor: Takes in information about the current state of the system (e.g. Thermometer)
Controller: The information processing unit with a logical set of instructions
Actuator: Outputs instructions for adjusting the system to meet the desired state (e.g. Heater)
System: The system being regulated (e.g. Smart Home)
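
As a concrete, if simplified, illustration, the following Python sketch wires the three components into a thermostat-style loop; the function names and numeric values are our own assumptions for the example, not part of the original guide.

```python
# Minimal negative-feedback control loop: sensor -> controller -> actuator.
# Illustrative values only; a real thermostat is more involved.

def sensor(room_temp):
    """Feeds information about the current state into the controller."""
    return room_temp

def controller(measured, setpoint=21.0):
    """Holds the logic: compare the measured state to the desired state."""
    return setpoint - measured             # the error signal

def actuator(error):
    """Acts on the system: heat when too cold, idle otherwise."""
    return 0.5 if error > 0 else 0.0       # heating applied this step

room_temp = 17.0                           # perturbed away from the setpoint
for step in range(10):
    heating = actuator(controller(sensor(room_temp)))
    room_temp += heating - 0.1             # heating minus constant heat loss
    print(f"step {step}: room at {room_temp:.1f}")
```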
Second Order Cybernetics
New cybernetics or second-order cybernetics is sometimes
described as the "cybernetics of cybernetics". It investigates the
construction of models of cybernetic systems, looking beyond the
concerns of "first order" cybernetics and its emphasis on control,
recognizing that the investigator is also part of the system, and
stressing the importance of autonomy, self-referentiality, and the
self-organizing capabilities of complex systems.

This is a recognition that the investigators of a system can never see
how it works by standing outside it because the investigators are
always engaged themselves in a cybernetic process of observation
with the system being observed; that is, when investigators observe
a system, they necessarily affect it and are affected by it. Thus
second-order cybernetics introduces the observer's conceptual
system as a "second order" to the regulatory process, through
which it can be adjusted and adapted, and we can get such phenomena
as learning.
Game Theory
When two adaptive systems interact, there forms a dynamic
of cooperation and competition as the pursuit of the valued ends of
one agent becomes interdependent with that of others. Game theory
is the study of these interactions between adaptive agents. Game
theory studies the strategic interaction between agents engaged in
relations of cooperation and conflict.

Game theory is used in many different areas whenever there are
agents that act according to some form of logic to maximize their
pay-off within some rule-bound interaction, such as politics,
economics, business management or ecology. Agents typically do
this based upon the costs and payoffs of choosing one option or
another. This cost-benefit ratio varies depending on the scenario - the
game - they are engaged in with other agents.

Some scenarios, such as playing chess, have very low incentives for
cooperation and favor competition, while others have high
incentives toward cooperation. Studying the structure of these games
helps us to understand when and why people cooperate.
What is a Game?
In game theory, a game is any context within which adaptive agents
interact and, in so doing, become interdependent. Interdependence
means that the values associated with some property of one element
become correlated with those of another. In this context, it means
that the goal attainment of one agent becomes correlated with the
others. This gives us a game wherein agents have a value system; they
can make choices and take actions that affect others, and the
outcome of those interactions will have a certain payoff for all the
agents involved.

The trade negotiations between two nations can be modeled as a
game, the interaction of businesses within a market is a game, the
different strategies adopted by creatures in an ecosystem can be seen
as a game, the interaction between a seller and buyer as they haggle
over the price of an item is a form of game. The provision of public
goods and the formation of organizations can be seen as games.
Likewise, the routing of internet traffic and the interaction between
financial algorithms are games.
Cooperation vs Competition
To quickly take a simple concrete example of a game, let's think
about the current situation with respect to international politics and
climate change. In this game, we have all of the world's countries
and all countries will benefit from a stable climate, but it requires
them to cooperate and all pay the price of reducing emissions in
order to achieve this. Although this cooperative outcome would be
best for all, it is in fact in the interest of any nation state to defect
on their commitments as then they would get the benefit of others
reducing their pollution without having to pay the cost of reducing
their own emissions.

Because in this game it is in the private interests of each to defect,
in the absence of some overall coordination mechanism the best
strategy for an agent to adopt, given only their own cost-benefit
analysis, is to defect. Thus all will defect, and we get the worst
outcome for the overall system. This game captures in very
simplified terms the core dynamic, between cooperation and
competition, that is at the heart of almost all situations of
interdependence between adaptive agents.
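
This logic can be stated as a small payoff model. The Python sketch below uses made-up numbers under which each country's private cost of abatement exceeds its private share of the shared benefit; under that assumption, defection dominates even though universal cooperation beats universal defection.

```python
# N-player public-goods sketch of the climate game described above.
# Cooperating costs a country c; each cooperator adds b/N of benefit
# to every country. With b/N < c, defecting is privately better no
# matter what the others do. Numbers are illustrative only.

N, b, c = 10, 5.0, 2.0       # countries, shared benefit, private cost

def payoff(i_cooperates, others_cooperating):
    cooperators = others_cooperating + (1 if i_cooperates else 0)
    return (b / N) * cooperators - (c if i_cooperates else 0.0)

for k in range(N):           # whatever the other countries do...
    assert payoff(False, k) > payoff(True, k)  # ...defecting pays more
print("all cooperate:", payoff(True, N - 1))   # 3.0 each
print("all defect:   ", payoff(False, 0))      # 0.0 each
```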
Non-Cooperative Games
A non-cooperative game is one where an element of competition
exists and there are limited mechanisms for creating institutions for
cooperation. This may be because of the inherent nature of the game
we are playing, that is to say, it is a zero-sum game which is strictly
competitive and thus cooperation will add no value.

Noncooperation may be a function of isolation, a lack of the communication
and interaction with which to build up the trust that enables
cooperation. We see this within modern societies: as these societies
have grown in size they have transitioned from communal cooperative
systems based on the frequent interaction of members to requiring
formal third parties to ensure cooperation, because of the anonymity
and lack of interaction between members of large societies.

Likewise, there may simply be a lack of formal institutions to support
cooperation between members, an example of this might be what we
call a failed state where the government's authority is insufficiently
strong to impose sanctions and thus cannot work as the supporting
institutional framework for cooperation.
The Prisoner's Dilemma
The prisoner's dilemma game is a classic two-player game
that is often used to present the core dilemma at the heart of
non-cooperative games. Conceive of two prisoners detained
in separate cells, interrogated simultaneously and offered
deals in the form of lighter jail sentences for betraying the
other criminal. They have the option to "cooperate" with the
other prisoner by not telling on them, or "defect" by
betraying the other. However, if both players defect, then
they will both serve a longer sentence than if neither said
anything.

If one cooperates while the other defects, they face a high
cost. Thus, given the knowledge that the other player's best
decision is to "defect", each player improves their own
situation by switching from "cooperating" to "defecting".
The prisoner's dilemma thus has a single equilibrium: where
both players choose to defect. Without some means to
support cooperation, both will likely defect, resulting in the
worst outcome for both.
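
The dilemma is small enough to encode directly. Here is a minimal Python rendering using one common illustrative payoff matrix (the exact numbers vary between presentations); it confirms that defecting is each player's best response to either move.

```python
# Prisoner's dilemma: payoffs are (row player, column player), with
# higher numbers better. These particular values are one standard
# illustrative choice, not canonical.

C, D = "cooperate", "defect"
payoffs = {
    (C, C): (3, 3),   # both stay silent
    (C, D): (0, 5),   # row betrayed
    (D, C): (5, 0),   # row betrays
    (D, D): (1, 1),   # both betray: worse than mutual cooperation
}

def best_response(opponent_move):
    return max((C, D), key=lambda move: payoffs[(move, opponent_move)][0])

# Defect is the best response to either move, so mutual defection is
# the game's single equilibrium even though (C, C) pays both more.
print(best_response(C), best_response(D))   # -> defect defect
```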
The Social Dilemma
The prisoner's dilemma is a simple game that illustrates the
broader concept of what is called a "social dilemma". Social
dilemmas are characterized by two properties: the
payoff to each individual for defecting behavior is higher than
the payoff for cooperative behavior, regardless of what the
other group members do, yet all individuals in the group
receive a lower payoff if all defect than if all cooperate. It is a
situation where individually rational behavior leads to an outcome
where everyone is worse off.

Social dilemmas are of interest to many because they reveal
the core tension between the individual and the group that is
engendered in situations of cooperation. At their core, social
dilemmas are situations in which self-interest is at odds with
collective interests and they can be found in many situations of
interdependence; from resource management to relationship
development, to international politics, public goods provision
and business management.
Cooperation
A cooperative game is one in which there can be cooperation
between the players because their payoffs are aligned. Cooperative
games are an example of non-zero-sum games: in
cooperative games, the players tend to win or lose together. Cooperation
may be achieved through a number of different possibilities. It
may be built into the dynamics of the game, as would be the case
with a positive-sum game where payoffs are positively correlated.
In such a case the innate structure of the game creates an
attractor towards cooperation because it is both in the interest of
the individuals and the whole organization.

A good example of this is the mutually beneficial gains from
trade in goods and services between nations. If businesses or
countries can find terms of trade in which both parties benefit
then specialization and trade can lead to an overall improvement
in the economic welfare of both countries, with both sides seeing
it as in their interest to cooperate in this organization, because of
the extra value that is being generated.
Achieving Cooperation

In almost every situation of interaction between human beings,
social or economic, there is an optimal equilibrium where
everyone cooperates and all get a good payoff. But there is also
often a non-cooperative option that is not so good for all
involved, yet a better option if agents are only self-interested
and there is no way to enable cooperation between them. In
many ways, we can say it is the job of social and economic
institutions to try and provide the infrastructure that enables
trust between agents, so that everyone chooses the cooperative
outcome and all get the best payoff.

This cooperation may be achieved by external enforcement by
some authoritative third party such as governments and
contract law. Here we cooperate in a transaction because the
third party ensures that it is in our interest to do so by
creating punishments or rewards. Likewise, cooperation may be
achieved through peer-to-peer interaction and feedback
mechanisms, as covered in a future guide.
Evolution
Evolution is a process of adaptation that operates on the macro-
level of a system. Adaptation is the capacity to generate a response
to some change within the environment. Evolution is this same
process but operating on the macro scale, i.e. on the level of a
population of agents. Here again, it is the capacity for the system
to respond to changes within its environment. Evolution is the
adaptive response of a group of entities that occurs over a period
of many life cycles. Evolutionary changes reflect the response of the
collection of agents to their environment.

The concept of evolution has a strong association with its
application in biology. However, complexity theory deals with the
concept on a slightly more abstract level as it applies to all complex
adaptive systems from the development of civilization to financial
markets, cultures, and technologies. As such, we are trying to
understand evolution as a continuous and pervasive phenomenon
that occurs in all types of natural, social and engineered systems.
Evolution - How it Works

1. Change: A population of agents exists within some environment and must periodically adapt as a whole in order to survive and maintain functionality.
2. Variety: Responding to changes means selecting from a variety of different internal states or strategies, thus the need for diversity.
3. Selection: The different variants in the system are exposed to their operating environment to see which are best adapted and most successful within that context.
4. Replication: Those that prove "fittest" attract more resources and are duplicated, thus the system as a whole comes to express more of their characteristics.
Example - Politics

1. Change: Broader changes in culture and society lead to changes in socio-political beliefs.
2. Variety: New political parties form to represent the different perspectives of society.
3. Selection: Those which best match the political views of the society are selected for at the ballot box.
4. Replication: Those that are selected for become more prevalent in the political system.
Example - Economics

1. Change: Technology changes make possible new products and services.
2. Variety: Businesses create a set of products based upon the new technological possibilities.
3. Selection: Consumers select the product that best suits them in their given context.
4. Replication: Businesses whose products were selected have more revenue to grow while others don't.
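
These four steps can be expressed as a loop, in the style of a simple genetic algorithm. The sketch below is a hedged toy version in Python: agents are bare numbers, "fitness" is closeness to a slowly moving target, and every constant is an arbitrary assumption made for illustration.

```python
# Toy evolutionary loop: 1. change, 2. variety, 3. selection, 4. replication.
import random

population = [random.uniform(0.0, 10.0) for _ in range(30)]  # agent traits
for generation in range(50):
    target = 5.0 + 0.05 * generation                  # 1. environment changes
    population = [x + random.gauss(0.0, 0.3) for x in population]  # 2. variety
    population.sort(key=lambda x: abs(x - target))    # 3. selection by fitness
    survivors = population[:10]                       # the "fittest" variants
    population = [random.choice(survivors) for _ in range(30)]  # 4. replication
mean = sum(population) / len(population)
print(f"target {target:.2f}, mean trait {mean:.2f}")  # tracks the target
```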
Requisite Variety
Through the iterations of evolution over a prolonged period of time,
a system can go from being simple to becoming more complex. By
retaining functional variants, it can expand to become capable of
operating within broader, more complex environments.

The process of evolution illustrates the importance of variety in order
to be able to respond to changes within the environment. A system
with sufficient internal variety to represent the possible changes
within its environment is said to have requisite variety. When a system
has requisite variety, then it can be said to have control over itself
within that particular environment and be sustainable.

However, there is always a broader environment that will present the
system with a wider, more complex set of eventualities for it to deal
with. Thus evolution proceeds through a process of differentiation -
to produce variety - and integration as the system becomes more
complex and can operate in a broader more complex environment.
Law of Requisite Variety
An important theory within cybernetics, created by Ross Ashby, which states that if a
system is to be stable, the number of states of its regulatory mechanism must be
greater than or equal to the number of states in the system being regulated. Thus Ashby's
Law means that for a system - an organism, a government, a car - to be regulated or
under control, for any given change in state presented by its environment
the system must be able to generate a response so as to maintain functionality. The system has
to be able to respond to the diversity of alterations by choosing from some capacity
or response that it has internally to counteract them, thus enabling it to maintain
homeostasis and control.
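
Read numerically, the law says a regulator with fewer responses than its environment has disturbance types must leave some disturbances unanswered. The Python fragment below is only a back-of-the-envelope reading of that counting argument, not a formal statement of Ashby's result.

```python
# If the environment presents n distinct disturbance types and the
# regulator has r distinct responses, at most r types can be matched
# one-to-one and counteracted (a simple counting illustration).

def fraction_regulated(n_disturbances, r_responses):
    return min(r_responses, n_disturbances) / n_disturbances

print(fraction_regulated(10, 10))  # 1.0 -> requisite variety: full control
print(fraction_regulated(10, 4))   # 0.4 -> most disturbances go unanswered
```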
Resilience Theory
Resilience
Resilience is typically understood as the capacity of a system to
maintain functionality given some alteration. When a system is
subject to some sort of perturbation or disturbance, it responds
by moving away from its initial state. The tendency of a system to
remain close to its equilibrium state, despite disturbances, is
termed robustness. On the other hand, the speed with which it
adapts and finds a new viable state after disturbance may be
understood as its resilience.

Whereas resistance will try to remove all disturbances through
control and a strong boundary condition, adaptation comes with a
recognition of the importance of disturbance in testing the
system. This is in order to maintain the competency of the
system's components, as the diversity and effectiveness of its
constituent elements are the only things that will ensure its
preservation. This is illustrated by the development of the human
immune system or the resilience of a forest to fires as they both
need periodic testing in order to maintain their resilience.
Dependence
All systems operate within an environment and they are dependent
upon some set of input values from that environment in order to
operate successfully. Whether this is the human body requiring an
input of oxygen or a technology requiring the input of fuel or a
government requiring the cooperation of its people. When the
system goes outside of this range of required inputs, its
functionality and structure become degraded, in the way that a
human without food will cease to operate effectively or a
government without financial revenue will be rendered
dysfunctional over time.

Thus, in order for the system to maintain functionality, it has to
maintain itself within a given set of input parameters. There are
essentially just two different ways of achieving this. The system can
work to control the input values so that they do not change - thus
working to resist change - or it can work to enable its capacity to
adapt to a wide variety of input values in which case again it would
be able to stay within the required input range. This second
approach - where the system works to improve its capacity to
adapt - we call adaptive capacity.
Antifragile
Resilience is about allowing for changes while looking at the
system’s ability to endure despite these changes. Whereas
resistance and robustness are about achieving "success" -
i.e. successfully resisting change is what is valued - with
resilience failure becomes more valuable than success as it is
only through failing that we build up resilience; resilience is
something that is learned through failure. Resilience is
always about being aware and ready to adapt to
unexpected risks.

The resilience approach is about turning a crisis into an
opportunity by seeing it as the potential to develop new
capacity. For a resilient community, a change is a learning
opportunity; for a fragile community, change is a crisis to be
avoided. This is the concept of "antifragility" - a term
developed by Nassim Nicholas Taleb - which refers to a
property of systems that increase in their capability to thrive as a
result of disturbance, mistakes, shocks, or failures.
How systems respond to stress...

Fragility: The inability to respond to change
Robustness: The ability to resist change
Resilience: The ability to adapt and bounce back from change
Antifragility: The ability to grow through adversity
Robustness Strategy
Both robustness and resilience are trying to achieve the
same end, of maintaining functionality, but the first does it
through resistance while the second does it through
adaptation. As always the success of a strategy is
contingent upon the environment within which it is being
enacted. At low levels of change within relatively stable
environments, the robustness strategy works well, but as
the change becomes faster and systemic the advantage
shifts to the adaptive strategy.

We can see this within the business community: over the
past decades, with globalization and information technology
creating major changes in the business landscape, the
terms agility and adaptation have moved to the forefront of
management terminology.
Distribution
Robustness and resilience are general characteristics of self-organizing
systems both through their capacity to resist change and their capacity
to adapt to it. In centralized systems with top-down control, there are
specialized components required for regulating the system; these
represent largely irreplaceable hubs that will affect the whole system if
removed or degraded. Within complex systems, on the contrary, control is
distributed out on the local level, meaning there is much less
specialization; missing or damaged components can often be replaced
by others, and this gives them a much lower level of criticality.

For example, the internet is a distributed system that is not dependent
upon one node or route to deliver packets; if a node is damaged or
down, packets are simply sent through a different pathway.
Likewise, self-organizing systems are held within their current
configuration by a set of feedback loops that are also distributed out
across the system on the local level. A good example of this might be a
magnet, which consists of many tiny magnetic spins that are all aligned to
produce an overall magnetic force; what holds the system together is
not one or a few critical parts but distributed feedback relations.
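
The rerouting idea is easy to demonstrate on a toy network. In the Python sketch below (node names and topology invented for the example), a breadth-first search still finds a path from A to D after a node is taken down, which is the essence of the internet's distributed resilience described above.

```python
# Distributed routing sketch: remove a node and an alternate path is
# found, because no single hub is critical. Toy topology, made up here.
from collections import deque

network = {"A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"}, "D": {"B", "C"}}

def find_path(net, src, dst, down=frozenset()):
    """Breadth-first search that ignores any nodes in `down`."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in net[path[-1]] - seen - set(down):
            seen.add(nxt)
            queue.append(path + [nxt])
    return None                      # no route survives

print(find_path(network, "A", "D"))              # e.g. ['A', 'B', 'D']
print(find_path(network, "A", "D", down={"B"}))  # reroutes via C
```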
Resilience Strategy
Resilience becomes important when dealing with open
systems because, by definition, they involve more exchanges
with their environment, and the environment can never be fully
controlled; this means there is a greater requirement for
adaptive capacity and resilience. For example, as societies open
up to global processes - as connectivity proliferates and people
recognize their interdependence with processes and systems
outside of their borders - they must also give up some capacity
for control that may have previously existed.

In a globalized world, no nation is able to control the flows of
financial capital that affect their economy, no nation is fully able
to control the flow of people and goods. In a world where
processes that shape people's lives transcend the national
level, the strategy of resistance becomes less successful.
Resilience in navigating such larger processes of change - such as
globalization and environmental change - becomes of greater
importance to nations and people.
CC BY-NC 4.0 DEED

Version 2.0
A Systems Innovation Publication
www.systemsinnovation.network
Reference and Resources

TheFreeDictionary.com (2016). Adaptive system. Available at: https://encyclopedia2.thefreedictionary.com/adaptive+system [Accessed 21 Sep. 2020].

New England Complex Systems Institute (2014). Concepts: Adaptive. Available at: https://necsi.edu/adaptive [Accessed 21 Sep. 2020].

Wikiwand (2020). Phototropism. Available at: https://www.wikiwand.com/en/Phototropism [Accessed 21 Sep. 2020].

Wikiwand (2020). Control system. Available at: https://www.wikiwand.com/en/Control_system [Accessed 21 Sep. 2020].

Wikiwand (2020). Agency (sociology). Available at: https://www.wikiwand.com/en/Agency_(sociology) [Accessed 21 Sep. 2020].

Stockholm Resilience Centre (2020). What are complex adaptive systems? Retrieved 19 August 2020, from https://www.stockholmresilience.org/research/research-videos/2014-03-12-what-are-complex-adaptive-systems.html

Stockholm Resilience Centre (2020). Why Covid-19 and systemic risks are part of the hyper-connected world we live in. Retrieved 19 August 2020, from https://www.stockholmresilience.org/research/research-news/2020-05-29-why-covid-19-and-systemic-risks-are-part-of-the-hyper-connected-world-we-live-in.html

Wikiwand (2020). Top-down and bottom-up design. Retrieved 19 August 2020, from https://www.wikiwand.com/en/Top-down_and_bottom-up_design

Vanderstraeten, R. (2001). Observing Systems: a Cybernetic Perspective on System/Environment Relations. Journal for the Theory of Social Behaviour, 31(3), pp. 297-311.

Wiener, N. Cybernetics, Second Edition: or the Control and Communication in the Animal and the Machine. ISBN 9780262730099. Available at: https://www.amazon.com/Cybernetics-Second-Control-Communication-Machine/dp/026273009X [Accessed 21 Sep. 2020].

Contributors to Wikimedia projects (2005). Human Physiology/Homeostasis. Wikibooks.org. Available at: https://en.wikibooks.org/wiki/Human_Physiology/Homeostasis [Accessed 21 Sep. 2020].

Wikiwand (2020). Control theory. Available at: https://www.wikiwand.com/en/Control_theory [Accessed 21 Sep. 2020].

Wikiwand (2020). Actuator. Available at: https://www.wikiwand.com/en/Actuator [Accessed 21 Sep. 2020].

Wikiwand (2020). Variety (cybernetics). Available at: https://www.wikiwand.com/en/Variety_(cybernetics)#/The_Law_of_Requisite_Variety [Accessed 21 Sep. 2020].

Yumpu.com (2020). Encyclopedia of Evolution. Available at: https://www.yumpu.com/en/document/view/10345437/encyclopedia-of-evolutionpdf-online-reading-center [Accessed 21 Sep. 2020].

Heylighen, F. and Joslyn, C. (2001). Cybernetics and Second-Order Cybernetics. Academic Press. Available at: http://pespmc1.vub.ac.be/Papers/Cybernetics-EPST.pdf [Accessed 21 Sep. 2020].

Heylighen, F. (n.d.). The Science of Self-Organization and Adaptivity. Available at: http://pcp.vub.ac.be/Papers/EOLSS-Self-Organiz.pdf [Accessed 21 Sep. 2020].

Biologyreference.com (2020). Natural Selection - Biology Encyclopedia. Available at: http://www.biologyreference.com/Mo-Nu/Natural-Selection.html [Accessed 21 Sep. 2020].

Edge.org (2020). Available at: https://www.edge.org/response-detail/27150 [Accessed 21 Sep. 2020].

Mathworks.com (2020). Genetic Algorithm. Available at: https://uk.mathworks.com/discovery/genetic-algorithm.html [Accessed 21 Sep. 2020].
