1. Introduction

1.1 The Science of Self-Organizing Systems

The scientific study of self-organizing systems is
relatively new, although questions about how organization arises have
of course been raised since ancient times. The forms we identify around
us are only a small sub-set of those theoretically possible. So why
don't we see more variety ? Answering such questions is the reason we
study self-organization.

Many natural systems show organization (e.g. galaxies,
planets, chemical compounds, cells, organisms and societies).
Traditional scientific fields attempt to explain these features by
referencing the micro properties or laws applicable to their component
parts, for example gravitation or chemical bonds. Yet we can also
approach the subject in a very different way, looking instead for
system properties applicable to all such collections of parts,
regardless of size or nature. It is here that modern computers prove
essential, allowing us to investigate the dynamic changes that occur
over vast numbers of time steps and with a large number of initial
options.

Studying nature requires timescales appropriate for the
natural system, and this restricts our studies to identifiable
qualities that are easily reproduced, precluding investigations
involving the full range of possibilities that may be encountered.
However, mathematics deals easily with generalised and abstract systems
and produces theorems applicable to all possible members of a class of
systems. By creating mathematical models, and running computer
simulations, we are able to quickly explore large numbers of possible
starting positions and to analyse the common features that result. Even
small systems have an astronomical number of initial options, so even
with the fastest computer currently available we usually can only
sample the possibility space. Yet this is often enough for us to discover
interesting properties that can then be tested against real systems,
thus generating new theories applicable to complex systems and their
spontaneous organization.

1.2 Definition of Self-Organization

The essence of self-organization is that system structure
often appears without explicit pressure or involvement from outside the
system. In other words, the constraints on form (i.e. organization) of
interest to us are internal to the system, resulting from the
interactions among the components and usually independent of the
physical nature of those components. The organization can evolve in
either time or space, maintain a stable form or show transient
phenomena. General resource flows within self-organized systems are
expected (dissipation), although not critical to the concept itself.

The field of self-organization seeks general rules about
the growth and evolution of systemic structure, the forms it might
take, and finally methods that predict the future organization that
will result from changes made to the underlying components. The results
are expected to be applicable to all other systems exhibiting similar
network characteristics.

1.3 Definition of Complexity Theory

The main current scientific theory related to
self-organization is Complexity Theory, which states:

Critically interacting
components self-organize to form potentially evolving structures
exhibiting a hierarchy of emergent system properties.

Emergent System Properties - New features are evident
which require a new vocabulary

We explore and explain the terms comprising this
definition in this FAQ. The form of the definition given here is the
slightly rephrased result of a discussion on the SOS newsgroup, where
the editor of this FAQ offered an initial definition and the concept
was refined, but the elements included are found in most general
treatments of self-organization, although the emphasis may vary in
different approaches to the subject.

2. Systems

2.1 What is a system ?

A system is a group of interacting parts functioning as a
whole and distinguishable from its surroundings by recognizable
boundaries. There are many varieties of systems, on the one hand the
interactions between the parts may be fixed (e.g. an engine), at the
other extreme the interactions may be unconstrained (e.g. a gas). The
systems of most interest in our context are those in the middle, with a
combination both of changing interactions and of fixed ones (e.g. a
cell). The system function depends upon the nature and arrangement of
the parts and usually changes if parts are added, removed or
rearranged. The system has emergent properties if they are not
intrinsically found within any of the parts and exist only at a higher
level of description.

2.2 What is a system property ?

When a series of parts is connected into various
configurations, the resultant system no longer solely exhibits the
collective properties of the parts themselves. Instead, any additional
behaviour attributed to the system is an example of an emergent system
property. A configuration can be physical, logical or statistical; all
can show unexpected features that cannot be reduced to an additive
property of the individual parts. Crucial to such properties is the
fact that we cannot even describe them using the language applicable to
the parts: we need a new vocabulary, with new terms invented, e.g.
'laser' to denote the functional features of the entity (e.g. coherent
light producer).

2.3 What is emergence ?

The appearance of a property or feature not previously
observed as a functional characteristic of the system. Generally,
higher level properties are regarded as emergent. The transport function
of an automobile is an emergent property of its interconnected parts.
That property disappears
if the parts are disassembled and just placed in a heap. There are
three aspects involved here. First is the idea of 'supervenience', this
means that the emergent properties will no longer exist if the lower
level is removed (i.e. no 'mystically' disjoint properties are
involved). Secondly the new properties are not aggregates, i.e. they
are not just the predictable results of summing part properties (for
example when the mass of a whole is just the mass of all the parts
added together). Thirdly there should be causality - thus emergent
properties are not epiphenomenal (either illusions or descriptive
simplifications only). This means that the higher level properties
should have causal effects on the lower level ones - called 'downward
causation', e.g. an amoeba can move, causing all its constituent
molecules to change their environmental positions (none of which
however are themselves capable of such autonomous trajectories). This
implies also that the emergent properties 'canalize' (restrict) the
freedom of the parts (by changing the 'fitness landscape', i.e. by
imposing boundary conditions or constraints).

2.4 What is organization ?

The arrangement of selected parts so as to promote a
specific function. This restricts the behaviour of the system in such a
way as to confine it to a smaller volume of its state space. The
recognition of self-organizing systems can be problematical. New
approaches are often necessary to find order in what was previously
thought to be noise, e.g. in the recognition that a part of a system
looks like the whole (self-similarity) or in the use of phase space
diagrams.

2.5 What is state or phase space ?

This is the space of all behavioural combinations
available to the system. When tossing a single coin, this would be just
two states (either heads or tails). The number of possible states grows
rapidly with complexity. If we take 100 coins, then the combinations
can be arranged in over 1,000,000,000,000,000,000,000,000,000,000
different ways. We would view each coin as a separate parameter or
dimension of the system, so one arrangement would be equivalent to
specifying 100 binary digits (each one indicating a 1 for heads or 0
for tails for a specific coin). Generalizing, any system has one
dimension of state space for each variable that can change. Mutation
will change one or more variables and move the system a small distance
in state space. State space is frequently called phase space, the two
terms are interchangeable.
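
As a quick illustration, the exponential growth of state space with the
number of parts can be computed directly (a Python sketch; the coin
counts are the FAQ's own example):

```python
# State space size grows exponentially with the number of parts.
# Each coin is one binary dimension, so n coins give 2**n states.
def state_space_size(n_coins: int) -> int:
    return 2 ** n_coins

print(state_space_size(1))    # a single coin: 2 states
print(state_space_size(100))  # 100 coins: over 10**30 states
```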

2.6 What is self-organization ?

a) The evolution of a system into an organized form in the
absence of external pressures.

b) A move from a large region of state space to a
persistent smaller one, under the control of the system itself. This
smaller region of state space is called an attractor.

c) The introduction of correlations (pattern) over time or
space for previously independent variables operating under local rules.

Typical features include (in rough order of generality):

Absence of external control (autonomy)

Dynamic operation (time evolution)

Fluctuations (noise/searches through options)

Symmetry breaking (loss of freedom/heterogeneity)

Global order (emergence from local interactions)

Dissipation (energy usage/far-from-equilibrium)

Instability (self-reinforcing choices/nonlinearity)

Multiple equilibria (many possible attractors)

Criticality (threshold effects/phase changes)

Redundancy (insensitivity to damage)

Self-maintenance (repair/reproduction metabolisms)

Adaptation (functionality/tracking of external
variations)

Complexity (multiple concurrent values or objectives)

Hierarchies (multiple nested self-organized levels)

2.7 Can things self-organize ?

Yes, any system that takes a form that is not imposed from
outside (by walls, machines or forces) can be said to self-organize.
The term is usually employed however in a more restricted sense by
excluding physical laws (reductionist explanations), and suggesting
that the properties that emerge are not explicable from a purely
reductionist viewpoint. Examples include magnetism, crystallization,
lasers, Bénard cells, Belousov-Zhabotinsky and Brusselator reactions,
cellular autocatalysis, organism structures, bird & fish
flocking, immune system, brain, ecosystems, economies etc. An excellent
overview of this question can be found in Francis Heylighen's paper
'The Science of Self-Organization and Adaptivity' http://pespmc1.vub.ac.be/Papers/EOLSS-Self-Organiz.pdf

2.8 What is an attractor ?

A preferred position for the system, such that if the
system is started from another state it will evolve until it arrives at
the attractor, and will then stay there in the absence of other
factors. An attractor can be a point (e.g. the centre of a bowl
containing a ball), a regular path (e.g. a planetary orbit), a complex
series of states (e.g. the metabolism of a cell) or an infinite
sequence (called a strange attractor). All specify a restricted volume
of state space (a compression). The larger area of state space that
leads to an attractor is called its basin of attraction and comprises
all the pre-images of the attractor state. The ratio of the volume of
the basin to the volume of the attractor can be used as a measure of
the degree of self-organisation present. This Self-Organization Factor
(SOF) will vary from the total size of state space (for totally ordered
systems - maximum compression) to 1 (for ergodic systems - zero
compression).
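
The SOF can be illustrated by exhaustively mapping a toy deterministic
system (a Python sketch; the eight-state update table is an arbitrary
assumption, not a system from this FAQ):

```python
# State space {0..7}; the dynamics are given by an arbitrary lookup table.
step = {0: 1, 1: 2, 2: 1, 3: 2, 4: 5, 5: 5, 6: 5, 7: 4}

def find_attractor(state):
    """Iterate until a state repeats; return the attractor cycle."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step[state]
    return frozenset(seen[seen.index(state):])

# Group every state by the attractor its trajectory reaches.
basins = {}
for s in step:
    basins.setdefault(find_attractor(s), []).append(s)

for attractor, basin in basins.items():
    sof = len(basin) / len(attractor)  # basin volume / attractor volume
    print(sorted(attractor), sorted(basin), sof)
```

Here the cycle {1, 2} and the point {5} are attractors; each drains a
four-state basin, giving SOF values of 2.0 and 4.0 respectively.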

2.9 What is a pre-image ?

If a system is iterated (stepped in time) and moves from
state x to state y, then state x is a pre-image of state y. In other
words it is on the trajectory that leads into state y. A pre-image that
itself has no pre-image is called a Garden of Eden state, and is the
starting point for a trajectory. It is usual to exclude states on the
attractor itself from the pre-image list, to avoid circularity, since
these are all pre-images of each other.
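
Garden of Eden states can be found mechanically as the states that
appear nowhere as an image of the update map (a Python sketch over an
arbitrary toy map, not a system from this FAQ):

```python
# Garden of Eden states: states with no pre-image under the update map.
step = {0: 2, 1: 2, 2: 3, 3: 3, 4: 3, 5: 6, 6: 6, 7: 6}

images = set(step.values())  # every state that is reached from somewhere
garden_of_eden = sorted(s for s in step if s not in images)
print(garden_of_eden)  # states that can only start a trajectory
```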

2.10 How do attractors and self-organization relate ?

Any system that moves to a fixed structure can be said to
be drawn to an attractor. A complex system can have many attractors and
these can alter with changes to the system interconnections (mutations)
or parameters. Studying self-organization is equivalent to
investigating the attractors of the system, their form and dynamics.

2.11 What is the mechanism of self-organization ?

Random (or directed) changes can instigate
self-organization, by allowing the exploration of new state space
positions. These positions exist in the basins of attraction of the
system and are inherently unstable, putting the system under stress of
some sort, and causing it to move along a trajectory to a new
attractor, which forms the self-organized state. Noise (fluctuations)
can allow metastable systems (i.e. those possessing many attractors -
alternative stable positions) to escape one basin and to enter another,
thus over time the system can approach an optimum organization or may
swap between the various attractors, depending upon the size and nature
of the perturbations.

3. Edge of Chaos

3.1 What is criticality ?

A point at which system properties change suddenly, e.g.
where a matrix goes from non-percolating (disconnected) to percolating
(connected) or vice versa. This is often regarded as a phase change,
thus in critically interacting systems we expect step changes in
properties.

3.2 What is self-organized criticality (SOC) ?

The ability of a system to evolve in such a way as to
approach a critical point and then maintain itself at that point. If we
assume that a system can mutate, then that mutation may take it either
towards a more static configuration or towards a more changeable one (a
smaller or larger volume of state space, a new attractor). If a
particular dynamic structure is optimum for the system, and the current
configuration is too static, then the more changeable configuration
will be more successful. If the system is currently too changeable then
the more static mutation will be selected. Thus the system can adapt in
both directions to converge on the optimum dynamic characteristics.

3.3 What is the Edge of Chaos (EOC) ?

This is the name given to the critical point of the
system, where a small change can either push the system into chaotic
behaviour or lock the system into a fixed behaviour. It is regarded as
a phase change. It is at this point where all the really interesting
behaviour occurs in a 'complex' system, and it is where systems tend to
gravitate, given the chance to do so. Hence most ALife systems are
assumed to operate within this regime.

At this boundary a system has a correlation length
(connection between distant parts) that just spans the entire system,
with a power law distribution of shorter lengths. Transient
perturbations (disturbances) can last for very long times (infinity in
the limit) and/or cover the entire system, yet more frequently effects
will be local or short lived - the system is dynamically unstable to
some perturbations, yet stable to others.

3.4 What is a phase change ?

A point at which the appearance of the system changes
suddenly. In physical systems the change from solid to liquid is a good
example. Non-physical systems can also exhibit phase changes, although
this use of the term is more controversial. Generally we regard our
system as existing in one of three phases. If the system exhibits a
fixed behaviour then we regard it as being in the solid realm, if the
behaviour is chaotic then we assign it to the gas realm. For systems on
the Edge of Chaos the properties match those seen in liquid systems, a
potential for either solid or gaseous behaviour, or both.

3.5 How does percolation relate to SOC ?

Percolation is an arrangement of parts (usually visualised
as a matrix) such that a property can arise that connects the opposite
sides of the structure. This can be regarded as making a path in a
disconnected matrix or making an obstruction in a fully connected one.
The boundary at which the system goes from disconnected to connected is
a sudden one, a step or phase change in the properties of the system.
This is the same boundary that we arrive at in SOC and in physics is
sometimes called universality due to its general nature.
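
The suddenness of the transition can be seen in a small simulation (a
Python sketch; grid size, trial count and probability values are
arbitrary choices):

```python
import random

def percolates(grid):
    """True if open sites connect the top row to the bottom row."""
    n = len(grid)
    frontier = [(0, c) for c in range(n) if grid[0][c]]
    seen = set(frontier)
    while frontier:
        r, c = frontier.pop()
        if r == n - 1:
            return True
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < n and 0 <= nc < n and grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False

def percolation_rate(p, n=20, trials=200, seed=0):
    """Fraction of random n x n grids with open-site density p that percolate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# The connected fraction jumps sharply near the 2D site-percolation
# threshold (about 0.593), rather than rising smoothly.
for p in (0.4, 0.55, 0.6, 0.65, 0.8):
    print(p, percolation_rate(p))
```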

3.6 What is a power law ?

If we plot the logarithm of the number of times a certain
property value is found against the log of the value itself, and the
result is a straight line, then we have a power law.
Essentially what we are saying is that there is a distribution of
results such that the larger the effect the less frequently it is seen.

The mathematical form is: N(s) = s^(-τ)

where N(s) is the number of events with size s and τ (tau)
is the exponent (the minus sign indicates that the numbers fall with
increasing s).

Taking logs we have: log N(s) = -τ log s

A good example is earthquake activity where many small
quakes are seen but few large ones, the Richter scale is based upon
such a law. A system subject to power law dynamics exhibits the same
structure over all scales. This self-similarity or scale independent
(fractal) behaviour is typical of self-organizing systems.
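
The fitting procedure just described can be sketched in Python, using
synthetic counts generated with τ = 2 (an arbitrary illustrative
choice, not a value from the FAQ):

```python
import math

# Synthetic event counts following N(s) = 10000 * s**(-tau), tau = 2.
tau = 2.0
counts = {s: round(10000 * s ** -tau) for s in range(1, 51)}

# Fit the straight line log N(s) = -tau * log s + c by least squares.
xs = [math.log(s) for s in counts]
ys = [math.log(n) for n in counts.values()]
m = len(xs)
mx, my = sum(xs) / m, sum(ys) / m
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print("recovered exponent:", round(-slope, 2))  # close to tau = 2
```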

4. Selection

4.1 Isn't this just the same as selection ?

No, selection is a choice between competing options such
that one arrangement is preferred over another with reference to some
external criteria - this represents a choice between two stable systems
in state space. In self-organization there is only one system which
internally restricts the area of state space it occupies. In essence
the system moves to an attractor that covers only a small area of state
space, a dynamic pattern of expression that can persist even in the
face of mutation and opposing selective forces. Alternative stable
options are each self-organized attractors and selection may then
choose between them based upon their emergent phenotypic properties.

4.2 How does natural selection fit in ?

Selection is a bias to move through state space in a
particular direction, maximising some external fitness function -
choosing between mutant neighbours. Self-organization drives the system
to an internal attractor, we can call this an internal fitness
function. The two concepts are complementary and can either mutually
assist or oppose. In the context of self-organizing systems, the
attractors are the only stable states the system has, selection
pressure is a force on the system attempting to perturb it to a
different attractor. It may take many mutations to cause a system to
switch to a new attractor, since each simply moves the starting
position across the basin of attraction. Only when a boundary between
two basins is crossed will an attractor change occur, yet this shift
could be highly significant, a metamorphosis in system properties.

4.3 What is a mutant neighbour ?

In the world of possible systems (the state space for the
system) two possibilities are neighbours if a change or mutation to one
parameter can change the first system into the second or vice versa.
Any two options can then be classified by a chain of possible mutations
converting between them (via intermediate states). Note that there can
be many ways of doing this, depending on the order the mutations take
place. The process of moving from one possibility to another is called
an adaptive walk.

4.4 What is an adaptive walk ?

A process by which a system changes from one state to
another by gradual steps. The system 'walks' across the fitness
landscape, each step is assumed to lead to an improvement in the
performance of the system against some criteria (adaptation).

4.5 What is a fitness landscape ?

If we rate every option in state space by its achievement
against some criteria then we can plot that rating as a fitness value
on another dimension, a height that gives the appearance of a
landscape. The result may be a single smooth hill (a correlated
landscape), many smaller peaks (a rugged landscape) or something in
between.
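
An adaptive walk over such a landscape can be sketched directly
(Python; the uncorrelated random fitness values and 10-bit genotypes
are illustrative assumptions, giving a rugged landscape):

```python
import random

# Assign every 10-bit genotype a random fitness (an uncorrelated,
# rugged landscape), then hill-climb via one-bit mutant neighbours.
rng = random.Random(42)
N = 10
fitness = {g: rng.random() for g in range(2 ** N)}

def neighbours(g):
    return [g ^ (1 << i) for i in range(N)]  # all one-bit mutants

genotype = 0
path = [genotype]
while True:
    best = max(neighbours(genotype), key=fitness.get)
    if fitness[best] <= fitness[genotype]:
        break  # no neighbour improves: a local peak
    genotype = best
    path.append(genotype)

print(len(path), "steps, final fitness", round(fitness[genotype], 3))
```

On a rugged landscape like this the walk typically halts at a local
optimum well below the global one, as section 5.7 discusses.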

5. Interconnections

5.1 What are interactions ?

Influences between parts due to their interconnections.
These interconnections can be of many forms (e.g. wiring, gravitational
or electromagnetic fields, physical contact or logical information
channels). We assume that the influence can act in such a way as to
change the part state or to cause a signal to be propagated in some way
to other parts. Thus the extent of the interactions determines the
behavioural richness of the system.

5.2 How many parts are necessary for self-organization ?

As few as two (in magnetic or gravitational attraction)
can suffice, but generally we use the term to classify more complex
phenomena than point attractors. The richness of possible behaviour
increases rapidly with the number of interconnections and the level of
feedback. For small systems we are able to analyse the state
possibilities and discover the attractor structure. Larger systems
however require a more statistical approach where we sample the system
by simulation to discover the emergent properties.

5.3 What is feedback ?

A connection between the output of a system and its input,
in other words a causality loop - effect is fed back to cause. This
feedback can be negative (tending to stabilise the system - order) or
positive (leading to instability - chaos). Feedback results in
nonlinearities, constraints on the system behaviour leading to
unpredictability.

5.4 What interconnections are necessary ?

In general terms, for self-organization to occur, the
system must be neither too sparsely connected (so most units are
independent) nor too richly connected (so that every unit affects every
other). Most studies of Boolean Networks suggest that having about two
connections for each unit leads to optimum organisational and adaptive
properties. If more connections exist then the same effect can be
obtained by using canalizing functions or other constraints on the
interaction dynamics.

5.5 What is a Boolean Network or NK model ?

Taking a collection (N) of logic gates (AND, OR, NOT etc.)
each with K inputs and interconnecting them gives us a Boolean Network.
Depending upon the number of inputs (K) to each gate we can generate a
collection of possible logic functions that could be used. By
allocating these to the nodes (N) at random we have a Random Boolean
Network (RBN - also called a Kauffman Net or the Kauffman Model) and
this can be used to investigate whether organization appears for
different sets of parameters. Some possible logic functions are
canalizing and it seems that this type of function is the most likely
to generate self-organization. This arrangement is also referred to
biologically as an NK model where N is seen as the number of genes (with
2 alleles each - the output states) and K denotes their
inter-dependencies.
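
A minimal Random Boolean Network can be built and iterated in a few
lines (a Python sketch; N = 12, K = 2 and the random seed are arbitrary
choices):

```python
import random

# A Kauffman net: N nodes, each reading K = 2 inputs through a random
# Boolean function, iterated until the trajectory revisits a state.
rng = random.Random(7)
N, K = 12, 2
inputs = [rng.sample(range(N), K) for _ in range(N)]                     # wiring
tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]  # logic

def step(state):
    return tuple(
        tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
        for i in range(N)
    )

state = tuple(rng.randint(0, 1) for _ in range(N))
seen = {}
t = 0
while state not in seen:  # finite state space guarantees a repeat
    seen[state] = t
    state = step(state)
    t += 1
print("transient:", seen[state], "cycle length:", t - seen[state])
```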

5.6 What are canalizing functions and forcing structures ?

A function is canalizing if a single input being in a
fixed state is sufficient to force the output to a fixed state,
regardless of the state of any other input. For example, for an AND
gate if one input is held low then the output is forced low, so this
function is canalizing. An XOR gate, in contrast, is not, since the
output can always be changed by varying another input. The result of
connecting a series of canalizing functions can be to force chunks of
the network to a fixed state (an initial fixed input can ripple through
and lock up part of the network - a forcing structure). Such fixed
divisions (barriers to change) can break up the network into active and
passive structures and this can allow complex modular behaviours to
develop. Because the structure is canalizing, a single change can
switch the structure from passive to active or back again, this allows
the network to perform a series of regulatory functions.
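
The definition above translates directly into a brute-force test over a
function's truth table (a Python sketch using the AND and XOR examples
from the text):

```python
from itertools import product

def is_canalizing(f, k):
    """True if some input, held at some value, forces the output of f."""
    for i in range(k):          # candidate canalizing input
        for v in (0, 1):        # candidate canalizing value
            outs = {f(*bits) for bits in product((0, 1), repeat=k) if bits[i] == v}
            if len(outs) == 1:  # output fixed regardless of other inputs
                return True
    return False

print(is_canalizing(lambda a, b: a and b, 2))  # AND: True (an input held 0 forces 0)
print(is_canalizing(lambda a, b: a ^ b, 2))    # XOR: False
```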

5.7 How does connectivity affect landscape shape ?

In general the higher the connectivity the more rugged the
landscape becomes. Simply connected landscapes have a single peak, a
change to one parameter has little effect on the others so a smooth
change in fitness is found during adaptive walks. High connectivity
means that variables interact and we have to settle for compromise
fitnesses, many lower peaks are found and the system can become stuck
at local optima or attractors, rather than being able to reach the
global optimum.

5.8 What is an NKC Network ?

If we allow each node (N) to be itself a complex
arrangement of interlinked parts (K) then we can regard the connections
between nodes (C) as a further layer of control. This relates
biologically to a genome interacting with other genomes. K is the gene
interactions within the organism, C the genes outside the organism that
affect it. The overall fitness is derived from the combinations of the
interacting gene fitnesses.

5.9 What is an NKCS Network ?

An extension of the NKC model to add multiple species.
Each species is linked to S other species. This can best be seen by
visualising an ecosystem, where the nodes are species (assumed
genetically identical) each consisting of a collection of genes, and
the interactions between the species form the ecosystem. Thus the local
connection K specifies how the genes of one species interact with
themselves and the distant connections (C x S ) how the genes interact
with each of the other species. This model then allows co-evolutionary
development and organization to be studied.

5.10 What is an autocatalytic set ?

The entities in a collection often interact in
certain ways only, e.g. entity A may be able to affect B but not C, and
D may only affect E. For a sufficiently large collection of different
entities a situation may arise where a complete network of
interconnections can be established - the entities become part of one
coupled system. This is called an autocatalytic set, after the ability
of molecules to catalyse each other's formation in the chemical
equivalent of this arrangement.
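
The closure into one coupled system can be sketched by adding random
pairwise interactions until the interaction graph spans every entity
(Python; the entity count and random seed are arbitrary choices):

```python
import random

def fully_coupled(n_entities, interactions):
    """True if the interaction graph links all entities into one component."""
    adj = {e: set() for e in range(n_entities)}
    for a, b in interactions:
        adj[a].add(b)
        adj[b].add(a)
    frontier, seen = [0], {0}
    while frontier:  # breadth-unimportant graph search from entity 0
        for nxt in adj[frontier.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return len(seen) == n_entities

# Add random interactions one at a time until the set closes.
rng = random.Random(5)
n = 40
pairs = []
while not fully_coupled(n, pairs):
    pairs.append((rng.randrange(n), rng.randrange(n)))
print(len(pairs), "random interactions before all", n, "entities were coupled")
```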

6. Structure

6.1 What are levels of organization ?

The smallest parts of a system produce their own emergent
properties, these are the lowest 'system' features and form the next
level of structure in the system. Those system components then in turn
form the building blocks for the next higher level of organization,
with different emergent properties, and this process can proceed to
higher levels in turn. The various levels can all exhibit their own
self-organization (e.g. cell chemistry, organs, societies) or may be
manufactured (e.g. piston, engine, car). One measure of complexity is
that a complex system comprises multiple levels of description, the
more ways of looking at a system then the more complex it is, and more
extensive is the description needed to specify it (algorithmic
complexity).

6.2 How is energy related to these concepts ?

Energy considerations are often regarded as an explanation
for organization, it is said that minimising energy causes the
organization. Yet there are often alternative arrangements that require
the same energy. To account for the choice between these requires other
factors. Organization still appears in computer simulations that do not
use the concept of energy, although other criteria may exist. This
system property suggests that we still have much to learn in this area,
as to the effect of resource flows of various types on organizational
behaviour. The relationship between entropy and self-organization is
also studied, this tries to relate organization to the 2nd Law of
Thermodynamics and recent findings here suggest that order is a
necessary result of far-from-equilibrium (dissipative) systems trying
to maximise stress reduction. This suggests that the more complex the
organism then the more efficient it is at dissipating potentials, a
field of study sometimes called 'autocatakinetics' and related to what
has been called 'The Law of Maximum Entropy Production'. Thus
organization does not 'violate' the 2nd Law (as often claimed) but
seems to be a direct result of it.

6.3 How does it relate to chaos ?

In nonlinear studies we find much structure for very
simple systems, as seen in the self-similar structure of fractals and
the bifurcation structure seen in the logistic map. This form of system
exhibits complex behaviour from simple rules. In contrast, for
self-organizing systems we have complex assemblies generating simple
emergent behaviour, so in essence the two concepts are complementary.
For our collective systems, we can regard the solid state as equivalent
to the predictable behaviour of a formula, the gaseous state as
corresponding to the statistical or chaotic realm and the liquid state
as being the bifurcation or fractal realm.

6.4 What are dissipative systems ?

Systems that use energy flow to maintain their form are
said to be dissipative systems, these would include atmospheric
vortices, living systems and similar. The term can also be used more
generally for systems that consume energy to keep going e.g. engines or
stars. Such systems are generally open to their environment.

6.5 What is bifurcation ?

A phenomenon that results in a system splitting into two
possible behaviours (with a small change in one parameter), further
changes to the parameter then cause further splits at regular intervals
(the Feigenbaum constant, approx. 4.6692...) until finally the system
enters a chaotic phase. This sequence from stability, through
increasing complexity, to chaos has much in common with the observed
behaviour of complex systems, reflecting changes in attractor structure
with variations to parameters. On occasion, successive iterations in a
model of the system will cycle between the available behaviours.
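
The period-doubling sequence can be observed numerically (a Python
sketch of the logistic map; the sampled r values, transient length and
rounding tolerance are arbitrary choices):

```python
# Long-run behaviour of the logistic map x -> r*x*(1-x).
def attractor_cycle(r, x=0.5, transient=1000, max_period=64):
    for _ in range(transient):  # discard the transient approach
        x = r * x * (1 - x)
    orbit = []
    for _ in range(max_period):
        orbit.append(round(x, 6))
        x = r * x * (1 - x)
    return sorted(set(orbit))   # distinct values visited on the attractor

for r in (2.8, 3.2, 3.5, 3.9):
    cycle = attractor_cycle(r)
    label = "chaotic" if len(cycle) > 16 else f"period {len(cycle)}"
    print(r, label)
```

Sweeping r shows a fixed point at 2.8, a 2-cycle at 3.2, a 4-cycle at
3.5, and chaotic behaviour by 3.9.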

6.6 How does it relate to cybernetics ?

Cybernetics is the precursor of complexity thinking in the
investigation of dynamic systems and set the groundwork for the study
of self-maintaining systems, using feedback and control concepts. It
relates generally to systems isolated or closed in organizational
terms, in other words to self-contained systems. Complexity theory
includes some new concepts such as self-organization plus its various
specialisms, and adds more prominence to borrowed concepts like
emergence, phase space and fitness landscapes, but in essence it
relates systems to other systems. It includes the two way information
flows between them, their mutual reactions to their environment or
co-evolution. It also deals with systems that can evolve or adapt, that
can become quite different systems.

6.7 What is synergy ?

Synergy studies the additional benefit accruing to
collective systems. This relates to the idea that the whole is greater
(or less) than the sum of its parts. It includes the study of mergers,
organisational benefits of co-operation and more generally what is
referred to in complexity studies as emergence. Synergy includes
symbiotic effects, along with many other forms of co-operative or
combinatoric fitness enhancements. Where joint effects reduce fitness
(e.g. in destructive competition) the term 'dysergy' can be used. In
physical systems the term Synergetics is also employed [Haken,
Buckminster-Fuller].

6.8 What is autopoiesis ?

Autopoiesis is self-production - maintenance of a living
organism's form with time and flows. It is a special case of
homeostasis and relates to a systemic definition of life. The concept
is frequently applied to cognition, viewing the mind as a
self-producing system, with self-reference and self-regulation which
evolves using structural coupling.

6.9 What is structural coupling ?

This is the idea that a complex and autopoietic system
must relate to its environment, and the internal structure becomes
coupled to relevant features of that environment. In complexity terms
the environment selects which of the system's attractors becomes active
at any time, what is also called situated or selected
self-organization.

6.10 What is homeostasis ?

This is the regulation of critical variables to form an
equilibrium state in the face of perturbation. It relates to
cybernetics and to the EOC state in complexity, and concentrates on
automatic mechanisms of self-regulation.

6.11 What are extropy and homeokinetics ?

Several other terms are loosely used with regard to
self-organizing systems, many in terms of human behaviour. Extropy
(also variously called 'ectropy', 'negentropy' or 'syntropy') refers to
growing organizational complexity. Homeokinetics is connected with SOS
and relates to viewing complex systems from an atomic point of view as
collections of moving particles.

6.12 What is stigmergy ?

This is the use of the environment to enable agents to
communicate and interact, facilitating self-organization. It can occur
through deliberate storage of information (e.g. the WWW) or through
physical alterations to the landscape made by the lifeforms operating
there (e.g. pheromone trails, termite hills). The future choices made
by the agents are thus dynamically constrained or stimulated by the
changes they encounter.
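As a toy illustration of the pheromone-trail case, the following sketch (hypothetical parameters, loosely modelled on the classic 'double bridge' ant experiments) lets agents choose between two paths in proportion to pheromone levels; deposits on the shorter path accumulate faster, so the colony converges on it without any central control:

```python
import random

rng = random.Random(0)
pheromone = {"short": 1.0, "long": 1.0}   # start with no preference
length = {"short": 1, "long": 4}          # trip times (arbitrary units)

for trip in range(2000):
    # each agent chooses a path in proportion to its pheromone level
    total = pheromone["short"] + pheromone["long"]
    path = "short" if rng.random() < pheromone["short"] / total else "long"
    # shorter trips complete more often, so deposit more per unit time
    pheromone[path] += 1.0 / length[path]
    for p in pheromone:
        pheromone[p] *= 0.995             # evaporation forgets old choices

print(pheromone)  # the short path ends up with far more pheromone
```

The evaporation term matters: without it, whichever path happened to be reinforced first would lock in permanently, rather than the environment steering the collective choice.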

7.1 How can self-organization be studied ?

Since we are seeking general properties that apply to
topologically equivalent systems, any physical system or model that
provides those connections can be used. Much work has been done using
Cellular Automata and Boolean Networks, with Alife, Genetic Algorithms,
Neural Networks and similar techniques also widely used. In general we
start with a set of rules specifying how the interconnected nodes
behave; the network is then randomly initialised and iterated (stepped)
repeatedly under that ruleset. The stable patterns obtained (if any)
are noted and the sequence repeated. After many trials, generalisations
from the results can be attempted, with some statistical confidence.
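The procedure just described can be sketched in a few lines. This minimal example (illustrative choices of N=12, K=2, not drawn from any particular study) builds a random Boolean network, iterates it from a random initial state, and reports the transient and attractor (cycle) lengths found:

```python
import random

def random_boolean_network(n, k, seed=0):
    """Wire each node to K random inputs and give it a random
    Boolean function, stored as a 2**K entry lookup table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every node from its inputs' current values."""
    new = []
    for ins, table in zip(inputs, tables):
        index = 0
        for i in ins:
            index = (index << 1) | state[i]
        new.append(table[index])
    return tuple(new)

def find_attractor(state, inputs, tables):
    """Iterate until a state repeats; return (transient, cycle length)."""
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    return seen[state], t - seen[state]

inputs, tables = random_boolean_network(n=12, k=2, seed=1)
rng = random.Random(2)
start = tuple(rng.randint(0, 1) for _ in range(12))
transient, cycle = find_attractor(start, inputs, tables)
print("transient:", transient, "cycle length:", cycle)
```

Repeating the last four lines over many wirings and initial states is exactly the 'many trials' sampling of the possibility space described above.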

7.2 What results are there so far ?

Some of these results are very tentative (due to the
difficulties in analysing larger networks), and subject to change as
more research is undertaken and these systems become better understood.
Many of these results are expanded and justified by Stuart Kauffman in
his recent lecture notes. For a more philosophical
overview of the difficulties see CALResCo's Quantifying
Complexity Theory.

The attractors of a system are uniquely determined by
the state transition properties of the nodes (their logic) and the
actual system interconnections.

Attractors result in the merging of historical
trajectories, so irreversibility is inherent in the concept. Many
scenarios can lead to the same outcome, therefore a unique logical
inference that a state arose from one particular predecessor (backward
causality) is impossible, even in theory. Merging of world lines in
this way prevents, in general, determination of the specific pre-image
of any state.

The ratio of the basin of attraction size to attractor
size (called here the Self-Organizing Factor or SOF) varies from the
size of the whole state space (totally ordered, point attractor) down
to 1 (totally disordered, ergodic attractor).
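For very small systems the SOF can be computed exhaustively. This sketch uses a hypothetical 3-node network (wiring and rules invented purely for illustration), maps every one of the 2 ** 3 states to its attractor, and prints basin size, cycle length and their ratio:

```python
from itertools import product

# A hypothetical 3-node network: a' = b AND c, b' = a OR c, c' = NOT a
def step(s):
    a, b, c = s
    return (b & c, a | c, 1 - a)

def attractor_of(s):
    """Follow the trajectory until it repeats; return a canonical
    label (the minimum state) for the cycle reached."""
    seen = []
    while s not in seen:
        seen.append(s)
        s = step(s)
    return min(seen[seen.index(s):])

# Group all 8 states by the attractor their trajectories reach.
basins = {}
for s in product((0, 1), repeat=3):
    basins.setdefault(attractor_of(s), []).append(s)

for label, states in basins.items():
    cycle = [label]
    nxt = step(label)
    while nxt != label:
        cycle.append(nxt)
        nxt = step(nxt)
    print("attractor", label, "basin:", len(states),
          "cycle:", len(cycle), "SOF:", len(states) / len(cycle))
```

Since every basin contains its own attractor states, the SOF is always at least 1, with larger values indicating greater self-organization.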

Single connectivity mutations can considerably alter the
attractor structure of networks, allowing attractors to merge, split or
change sequences. Basins of attraction are also altered and initial
points may then flow to different attractors.

Single state mutations can move a system from one
attractor to another within the system. The resultant behaviour can
change between fixed, chaotic, periodic and complex in any combination
of the available attractors and the effect can be predicted if the
system details are fully known.

The mutation space of a system with 2 alleles at each
node is a Boolean Hypercube of dimension N (number of neighbours). The
number of adaptive peaks for random systems is 2 ** N /(N+1),
exponentially high.
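This prediction is easy to check numerically. The sketch below (illustrative parameters: N=10, 20 trials) assigns independent random fitnesses to the corners of a Boolean hypercube, counts corners fitter than all N one-bit neighbours, and compares the average to 2 ** N / (N+1):

```python
import random

def count_local_optima(n, seed):
    """Random fitness on the n-cube; count states fitter than
    every one of their n one-bit-flip neighbours."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(2 ** n)]
    return sum(1 for s in range(2 ** n)
               if all(fitness[s] > fitness[s ^ (1 << i)] for i in range(n)))

n, trials = 10, 20
mean = sum(count_local_optima(n, seed) for seed in range(trials)) / trials
print("observed:", mean, "predicted:", 2 ** n / (n + 1))
```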

The chance of finding a fitter neighbour halves with
each uphill step; after 30 steps it is about 1 in a billion (2 ** 30).
The time required scales in the same way. The mean length of an
adaptive walk to a nearby peak is ln N. Branching walks are common
initially, but most end on local optima (dead ends). This makes finding
the single 'maximum fitness' peak an NP-hard problem. Correlated
landscapes are necessary for sustained adaptive improvement.

For such systems with high connectivity (K close to N),
the median number of attractors is N/e (linear in N), whilst the median
number of states within an attractor averages 0.5 * root(2 ** N)
(exponentially large). These systems are highly sensitive to
disturbance, and swap easily amongst their attractors.

For K=0, there is a smooth landscape with one peak (the
global optimum). The expected length of an adaptive walk is N/2, with
the number of uphill directions decreasing by one at each step.

For K=1, the median number of attractors is exponential
in N, but attractor lengths increase only as root N; these systems are
again sensitive to disturbance and easily swap between attractors.

For K=2 we have a phase transition: the median number of
attractors drops to root N, and their average length is also root N
(more recent work has identified that sampling techniques tend to miss
small attractors, and that more generally the number increases at least
linearly with N). The system is stable to disturbance and has few paths
between the attractors. Most perturbations return to the same
attractor, since most affect only the frozen 'stable core' of nodes
outside the attractor.

Systems that are able to change their number of
connections (by mutation) are found to move spontaneously from the
chaotic (high K) or static (low K) regions to that of the phase
transition and stability - a self-organized criticality. Fitness is
found to peak at this point.

Natural genetic systems with high connectivity
K>2 have a higher proportion of canalizing functions than would
be the case if randomly assigned. This suggests a selective bias
towards functions that can support self-organization to the edge of
chaos.

The 'No Free Lunch' theorem states that, averaged over
all possible landscapes, no search technique is better than random
search. This suggests, if the theory of evolution is valid, that real
landscapes are correlated with the search technique used. In other
words the organisms create their own smooth landscapes - the landscape
is 'designed' by the agents...

If we measure the distance between two initially close
points in phase space, and plot it against time, then for chaotic
systems the distance will diverge, whilst for static systems it will
converge onto an attractor. The slope gives a measure of system
stability (+ve is chaotic) and a zero value corresponds to the edge of
chaos. This measure is the Lyapunov exponent (one for each dimension).
Other similar measures are also used (e.g. the Derrida plot for
discrete systems).
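For a one-dimensional map the exponent is just the long-run average of log |f'(x)| along an orbit. A sketch using the logistic map (parameter values chosen purely for illustration: r=4.0 is known to be chaotic, r=2.5 settles onto a fixed point):

```python
import math

def lyapunov_logistic(r, x0=0.3, burn=500, steps=5000):
    """Average log|f'(x)| along an orbit of the logistic map
    f(x) = r*x*(1-x), whose derivative is f'(x) = r*(1-2x)."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(steps):
        x = r * x * (1 - x)
        total += math.log(abs(r * (1 - 2 * x)))
    return total / steps

print(lyapunov_logistic(4.0))   # chaotic: positive (ln 2 for r=4)
print(lyapunov_logistic(2.5))   # fixed point: negative
```

A positive exponent flags divergence (chaos), a negative one convergence (order), and values near zero the edge of chaos.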

A network tends to contain an uneven distribution of
attractors. Some are large and drain large basins of attraction; others
are small, with few states in their corresponding basins.

The basins of attraction of higher fitness peaks tend to
be larger than those for lower optima at the critical point. Correlated
landscapes occur, containing few peaks and with those clustered
together.

As K increases, the height of the accessible peaks
falls. This is the 'Complexity Catastrophe', which in the limit drives
achievable fitness down towards the landscape mean.

Mutation pressure grows with system size. Beyond a
critical point (dependent upon rate, size and selection pressure) it is
no longer possible to achieve adaptive improvement. A 'Selection or
Error Catastrophe' sets in and the system inevitably moves down off the
fitness peak to a stable lower point, a sub-optimal shell. The limit
scales as 2 * mutation rate * N ** 2 / |selection pressure|.

For co-evolutionary networks, tuning K (local
interactions) to match or exceed C (couplings between species) brings
the system to its optimum fitness, another SOC. This tuning helps
optimise both species (symbiotic effects). Reducing the number S of
interacting species (breaking dependencies - e.g. opening new niches)
also improves overall fitness. K should be minimised, but needs to
increase for large S and C to obtain rapid convergence.

In the phase transition region the system is generally
divided into active areas of variable behaviour separated by fixed
barriers of static components (frozen nodes - the stable core).
Pathways or tendrils between the dynamic regions allow controlled
propagation of information across the system. The number of active
islands is low (less than root N) and together they comprise about a
fifth of the nodes (increasing with K).

At the critical point, any size of perturbation can
potentially cause any size of effect - it is impossible to predict the
size of the effect from the size of the perturbation (for large,
analytically intractable systems). A power law distribution is found
over time, but the timing and size of any particular perturbation is
indeterminate.
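The standard demonstration of this is the Bak-Tang-Wiesenfeld sandpile, sketched here on a hypothetical 20 x 20 grid: grains are dropped at random, any cell holding 4 or more grains topples one grain to each neighbour, and the resulting avalanche sizes range over many scales whilst individual events stay unpredictable:

```python
import random

def add_grain(grid, n, i, j):
    """Drop one grain; relax until stable. Return the avalanche size
    (total number of topplings triggered)."""
    grid[i][j] += 1
    size = 0
    unstable = [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4
        size += 1
        if grid[x][y] >= 4:
            unstable.append((x, y))
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < n and 0 <= ny < n:   # grains fall off the edge
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return size

rng = random.Random(1)
n = 20
grid = [[0] * n for _ in range(n)]
sizes = [add_grain(grid, n, rng.randrange(n), rng.randrange(n))
         for _ in range(20000)]

# Avalanche sizes span many scales - the signature of a power law.
print("largest avalanche:", max(sizes),
      "single topplings:", sizes.count(1))
```

A histogram of the non-zero sizes on log-log axes gives the roughly straight line characteristic of power-law behaviour.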

Plotting the input entropy of a system gives a high
value for chaotic systems, a low value for ordered systems and an
intermediate value for complex systems. The variance of the input
entropy is high for complex systems but low for both ordered and
chaotic ones, which can be used to identify EOC behaviour.

For a network of N nodes and E possible edges, as N
grows the number of edge combinations increases faster than the number
of nodes. Given some probability of meaningful interactions, there will
inevitably be a critical size at which the system will go from
subcritical to supracritical behaviour, a SOC or autocatalysis. The
relevant size is N = root(1 / (2 * probability)).
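The flavour of this transition can be seen in the closely related (though not identical) connectivity threshold of a random graph - a standard Erdos-Renyi result, used here purely as an illustration: below roughly one edge per node only small fragments exist, above it a single giant cluster spans the system.

```python
import random
from collections import Counter

def giant_fraction(n, p, seed=0):
    """Fraction of nodes in the largest connected component of a
    random graph G(n, p), found with a simple union-find."""
    rng = random.Random(seed)
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)   # merge the two clusters
    counts = Counter(find(i) for i in range(n))
    return max(counts.values()) / n

n = 400
print("subcritical  :", giant_fraction(n, 0.5 / n))  # small fragments
print("supracritical:", giant_fraction(n, 2.0 / n))  # giant cluster
```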

Since a metabolism is such an autocatalytic set, this
implies that life will emerge as a phase transition in any sufficiently
complex reaction system - regardless of chemical or other form.

Given the protein diversity in the biosphere, this
proves to be widely supracritical, yet the stability of cells requires
partitioning to a subcritical but still autocatalytic state. This
balance suggests a limit to cell biochemical diversity and a
self-organizing maintenance below that limit. It is related to the
Error Catastrophe: too high a rate of innovation cannot be controlled
by selection, and leads to information loss, chaos and breakdown of the
system.

Given a supracritical set of existing products M, and a
larger set of potential products M', equilibrium constant constraints
predict that the probability of forming members of the difference set
M' - M is non-zero. There will therefore be a gradient towards greater
diversity, in other words 'creativity', in any such system.

Evaluating the above for the diversity we find on this
planet shows that we have so far explored only an insignificant
fraction of state space during the time the universe has existed. Thus
the Universe is not yet in an equilibrium state and the standard
assumptions of equilibrium statistical mechanics do not apply (e.g. the
ergodic hypothesis).

Two or more interacting autocatalytic sets that increase
reproduction rates above that of either in isolation will grow
preferentially. This is a form of trade or mutual assistance, an
ecosystem in miniature.

Such interacting sets can generate components that are
in neither set, giving a higher level of joint operation - emergent
novelty.

If such innovation involves a cost, then the rate of
innovation will be constrained by payback period. This is seen in
economic analogues, where risk/profit forms a balance, as well as in
ecological systems. Interactions must be net positive sum to be
sustainable.

In spatially extended networks a wide variety of
different patterns are found, and these occur over a large fraction of
parameter or state space. Patterns form both by continuous gradient
(diffusion over space) and by discrete interaction (cell-cell induction
signalling) processes.

Patterns increase exponentially in frequency with the
number of units in the network. Inductive processes produce more stable
patterns, whilst diffusion processes produce more unstable ones,
suggesting that the former are more important in morphogenesis.
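The discrete induction case can be caricatured by lateral inhibition (a toy rule invented here for illustration): each cell in a ring switches on only if neither neighbour is on, updated asynchronously in random order. A stable, spaced-out activity pattern self-organizes from a blank start:

```python
import random

rng = random.Random(0)
n = 12
cells = [0] * n                # a ring of initially inactive cells

for _ in range(500):
    i = rng.randrange(n)       # asynchronous random updates
    # lateral inhibition: activate only if neither neighbour is active
    cells[i] = 1 if cells[(i - 1) % n] + cells[(i + 1) % n] == 0 else 0

print(cells)  # active cells end up spaced out, never adjacent
```

Note that the update scheme itself matters: applying the same rule synchronously merely makes the whole ring blink, whereas asynchronous updating settles into a fixed spacing pattern.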

7.3 How applicable is self-organization ?

The above results seem to indicate that such system
properties can be ascribed to all manner of natural systems, from
physical, chemical, biological, psychological to cultural. Much work is
yet needed to determine to what extent these system properties relate
to the actual features of real systems and how they vary with changes
to the constraints. Power laws are common in natural systems and an
underlying SOC cannot be ruled out as a possible cause of this
situation.

8. Resources

8.1 Is any software available to study self-organization ?

Few software packages relate to self-organization as such,
but many do show self-organized behaviour in the context of more
specialised topics. These include cellular automata (Game of Life),
neural networks (recurrent or Hopfield networks, and self-organizing
maps), genetic algorithms (evolution), artificial life (agent
behaviour), fractals (mathematical art) and physics (spin glasses).
These can be found via the relevant newsgroup FAQs.
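As a taste of the first category, the Game of Life itself fits in a dozen lines. This sketch tracks only the live cells as a set of coordinates and checks the well-known fact that the glider rebuilds itself one cell diagonally displaced every four generations:

```python
from collections import Counter

def life_step(cells):
    """One synchronous step of Conway's Game of Life on a set of
    live (x, y) cells: birth on 3 neighbours, survival on 2 or 3."""
    counts = Counter((x + dx, y + dy) for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, k in counts.items()
            if k == 3 or (k == 2 and c in cells)}

# The glider: a self-organized pattern that reproduces itself
# displaced by (1, 1) every four generations.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = life_step(g)
print(g == {(x + 1, y + 1) for (x, y) in glider})  # True
```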

8.3 What books can I read on this subject ?

Adami, Christoph. Introduction to Artificial Life (1998
Telos/Springer-Verlag). A good introduction with included Avida
software, covering the main concepts and maths.

Ashby, W. Ross. An Introduction to Cybernetics (1957
Chapman & Hall). The earliest introduction to the applicability
of cybernetics to biological systems, now reprinted on the Web.
Recommended - see http://pcp.vub.ac.be/books/IntroCyb.pdf

Goodwin, Brian. How the Leopard Changed Its Spots: The
Evolution of Complexity (1994 Weidenfield & Nicholson London).
Self-organization in the development of biological form
(morphogenesis), an excellent overview.

Langton, Christopher (ed.). Artificial Life -
Proceedings of the first ALife conference at Santa Fe (1989 Addison
Wesley). Technical (several later volumes are available but this is the
best introduction).

Levy, Steven. Artificial Life - The Quest for a New
Creation (1992 Jonathan Cape). Excellent popular introduction.

Lewin, Roger. Complexity - Life at the Edge of Chaos
(1993 Macmillan). An excellent introduction to the general field.

Mandelbrot, Benoit. The Fractal Geometry of Nature (1983
Freeman). A classic covering percolation and self-similarity in many
areas.

Stewart and Cohen. Figments of Reality: The Evolution of
the Curious Mind. (1997 Cambridge University Press).

Turchin, Valentin F. The Phenomenon of Science: A
Cybernetic Approach to Human Evolution (1977 Columbia University
Press). An online book covering similar concepts from an earlier
viewpoint - see http://pespmc1.vub.ac.be/PoS/

9. Miscellaneous

9.1 How does self-organization relate to other areas of
complex systems ?

Many studies of complex systems assume that the systems
self-organize into emergent states which are not predictable from the
parts. Artificial Life, Evolutionary Computation (incl Genetic
Algorithms), Cellular Automata and Neural Networks are the main fields
directly associated with this idea, all of which fall under the general
auspices of Complex Systems or Complexity Theory.

9.2 Which Newsgroups are relevant ?

All the newsgroup links formerly listed here are now dead.

9.3 Which Journals are relevant ?

Some journals (both online and printed) which relate to complexity and
self-organisation are:

9.4 Updates to this FAQ

This FAQ has been compiled and is maintained by Chris
Lucas of the CALResCo Group. Comments, suggestions, requests for
additions and particularly criticisms and corrections are warmly
welcomed. Please feel free to EMail me at CALResCo.

The usual get-out clauses apply: I take no
responsibility for any errors contained in the information presented
here, or for any damages resulting from its use. The information is,
however, accurate as far as I am aware.

This FAQ may be posted in any newsgroup, mail list or BBS
as long as it remains intact and contains the following copyright
notice. This document may not be used for financial gain or included in
commercial products without the express permission of the author.