There is a great similarity between the extremes of a whitewater river and those of a complex system. At one end of the scale, the water is dark, silent and still. At the other it is foaming – white, loud, and wildly chaotic. Whitewater paddlers need to use different techniques depending on the state of the river. In turbulence they have to read the waves and react in an instant. A flurry of paddle strokes moves their raft erratically, but safely, downstream – skirting boulders, plunging over sharp drops, jetting through narrowing canyon walls in a thunder of surging water.
Systems display comparable states. These can be characterized as Class I (stasis), Class II (order), Class III (chaos) and Class IV (complexity). A similar parallel is drawn with conditions at sea in a paper on marine safety, “Marine accident prevention: an evaluation of the ISM code by the fundamentals of the complexity theory,” by Alexander Goulielmos and Constantinos Giziakis, published in Disaster Prevention and Management: An International Journal (2002).
Class I systems are unchanging. Class II systems are periodic and predictable. Class III systems are unstable and unpredictable. Class IV systems are complex, serving up both order and chaos. The zone between order and chaos is often referred to as the ‘edge of chaos.’
The Dynamics of Change
Simple diagrams can be used to portray system behavior. Systems move from one state to another within a ‘phase space.’ If they gravitate toward a particular end state, that is called an ‘attractor.’
Class I systems are predisposed to be stable. They are said to have a point attractor. Regardless of where they start, they converge on a single stable state. A marble dropped into a bowl behaves like this, ending up at rest on the bottom.
Class II systems move through a repeated sequence of states, tracing an endless loop regardless of the starting point. They are said to have a periodic attractor. An orbiting planet is an example.
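The two well-behaved attractor types are easy to demonstrate with a pair of toy iterated maps (a hypothetical Python sketch; the maps are chosen purely for illustration, not taken from the text):

```python
# Point attractor: x -> x/2 + 1 converges to the fixed point x = 2
# from any starting value, like the marble settling at the bottom of a bowl.
def point_map(x):
    return x / 2 + 1

# Periodic attractor: x -> 1 - x traces an endless two-state loop,
# like an orbit repeating forever.
def periodic_map(x):
    return 1 - x

x = 37.0                        # an arbitrary starting point
for _ in range(60):
    x = point_map(x)
resting_point = x
print(round(resting_point, 6))  # -> 2.0, regardless of where we started

y, cycle = 0.3, []
for _ in range(6):
    cycle.append(round(y, 6))
    y = periodic_map(y)
print(cycle)                    # -> [0.3, 0.7, 0.3, 0.7, 0.3, 0.7]
```

Whatever value the first map starts from, it halves its distance to 2 on every step; the second map simply swaps between two states forever.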
Class III systems behave chaotically. They are said to have a strange attractor. They are very sensitive to initial conditions. Weather is a good example: a tiny difference in the initial state of the atmosphere can produce a completely different weather outcome days later.
MIT meteorologist Edward Lorenz discovered this while testing a weather model on a computer. A small rounding error in the inputs produced a major difference in the weather predictions. He presented the finding in a 1972 talk titled “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?”, and the phenomenon is now known as the ‘butterfly effect.’
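Lorenz’s observation can be reproduced with any chaotic system. The sketch below uses the standard logistic map rather than a weather model (an illustrative substitution), running two simulations whose starting points differ by one part in a billion, roughly the scale of a rounding error:

```python
# Butterfly effect: two runs of the chaotic logistic map x -> 4x(1 - x),
# started one billionth apart, soon disagree completely.
def step(x):
    return 4.0 * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9         # initial states differing by one part in 10^9
diffs = []
for _ in range(60):
    a, b = step(a), step(b)
    diffs.append(abs(a - b))

print(max(diffs[:4]))    # first few steps: the gap is still microscopic
print(max(diffs[30:]))   # later steps: the two runs have fully diverged
```

The gap roughly doubles on each iteration, so within a few dozen steps the two trajectories bear no resemblance to one another.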
Chaotic systems have multiple attractors. The marble analogy is helpful once again; this time, picture a surface with several basins separated by ridges. Where the marble ends up depends on where it is dropped. Dropped inside any one basin, it heads with certainty to that basin’s bottom. Dropped on a ridge, it will end up on one side or the other depending on a vanishingly small difference in the starting position. This is chaotic (unpredictable) behavior.
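The ridge-and-basin picture can be sketched numerically. In the hypothetical illustration below, a ‘marble’ rolls downhill on a two-basin landscape V(x) = (x² − 1)², whose bottoms sit at x = −1 and x = +1 with a ridge at x = 0:

```python
# A marble rolling downhill on a two-basin landscape V(x) = (x^2 - 1)^2,
# which has bottoms at x = -1 and x = +1 and a ridge at x = 0.
def settle(x, steps=500, rate=0.05):
    for _ in range(steps):
        x -= rate * 4.0 * x * (x * x - 1.0)   # follow the downhill slope -V'(x)
    return x

print(round(settle(0.3), 3))    # dropped right of the ridge -> 1.0
print(round(settle(-0.3), 3))   # dropped left of the ridge  -> -1.0
# A hair's width of difference at the ridge decides the outcome:
print(round(settle(1e-6), 3))   # -> 1.0
print(round(settle(-1e-6), 3))  # -> -1.0
```

Anywhere inside a basin the outcome is certain; exactly at the ridge, an arbitrarily small nudge selects which bottom the marble reaches.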
When dynamic systems are stressed, this chaotic behavior is multiplied. The number of attractors increases as each attractor splits into two. Each split is called a ‘bifurcation.’
Principia Cybernetica Web, an online encyclopedia on cybernetics, systems theory and complexity written by scientists, elegantly describes how bifurcation works:
“The system now has two stable patterns of behavior. For example, imagine that you let water run through a tap. When the water runs very slowly, it comes out in regular, periodically falling drops. When you open the tap a little bit more, it may happen that there are two patterns of running: either big drops succeeding each other quickly, or a thin, continuous stream. Sometimes you have the one pattern, sometimes a slight fluctuation in the pressure makes the water switch to the other pattern.
“When the stress is increased further, more bifurcations take place, and the attractors split up further. First, you have 4 possible regimes, then 8, then 16, then 32, and so on, ever more quickly. At a certain point, the number of attractors becomes infinite and the system is erratically jumping from the one to the other all the time. This is true chaos. The behavior of the system has become totally unpredictable. Coming back to the water tap, this is what happens when the tap is opened fully and the water is running out turbulently, in one big, irregular waterfall, with droplets spraying in all directions.”
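The doubling cascade described above can be watched numerically in the logistic map, a standard textbook example of period doubling (the parameter values below are conventional choices, not taken from the text):

```python
# Period-doubling in the logistic map x -> r*x*(1 - x): as the stress
# parameter r rises, one stable regime splits into 2, then 4, then chaos.
def regimes(r, transient=1000, sample=64):
    x = 0.5
    for _ in range(transient):          # let the system settle onto its attractor
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(sample):             # record the states it now visits
        x = r * x * (1.0 - x)
        seen.add(round(x, 6))
    return len(seen)

print(regimes(2.8))   # -> 1   a single steady state (point attractor)
print(regimes(3.2))   # -> 2   first bifurcation: two alternating states
print(regimes(3.5))   # -> 4   second bifurcation: four states
print(regimes(3.9))   # many distinct states: chaos
```

Raising r plays the role of opening the tap: each bifurcation doubles the number of regimes until, past a critical value, the behavior becomes erratic.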
Class IV complex systems, operating between order and chaos, demonstrate pattern emergence. They maximize stability while maintaining the capacity for change — creating order some of the time and chaos some of the time. Intermittent periods of stability are referred to as ‘long transients.’ Long periods of steady state are separated by short, disruptive bursts — a pattern called a ‘punctuated equilibrium.’
Physicist Per Bak modeled this behavior by dropping sand on a table, one grain at a time. As sand is added, the pile becomes steeper. Then avalanches start to occur. The size distribution of the avalanches follows a power law, the long-tailed curve that also describes the size distribution of earthquakes, lunar craters, solar flares, power outages, wars, species extinction events, and many other natural, social and economic phenomena. Small events vastly outnumber large ones, but the rare large events are orders of magnitude bigger than all the rest.
[Illustration by Elaine Eisenfeld, from Per Bak, How Nature Works: The Science of Self-Organized Criticality (1996)]
As successive grains are added, the sandpile reaches a critical state where it collapses spontaneously. Per Bak called this ‘self-organized criticality’ because it happens without any external intervention. He described it in his book How Nature Works: The Science of Self-Organized Criticality (1996).
“If we drop a single grain of sand at one place instead of another, this causes only a small local change in the configuration. There is no means by which the disturbance can spread system-wide. The response to small perturbations is small. In a noncritical world nothing dramatic ever happens. It is easy to be a weather (sand) forecaster in the flatland of a non-critical system. Not only can he predict what will happen, but he can also understand it, to the limited extent that there is something to understand. The action at some place does not depend on events happening long before at far-away places. Contingency is irrelevant.”
Things change when the sandpile becomes critical.
“A single grain of sand might cause an avalanche involving the entire pile. A small change in the configuration might cause what would otherwise be an insignificant event to become a catastrophe. The sand forecaster can still make short time predictions by carefully identifying the rules and monitoring his local environment. If he sees an avalanche coming, he can predict when it will hit with some degree of accuracy. However, he cannot predict when a large event will occur, since this is contingent on very minor details of the configuration of the entire sandpile.”
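Bak’s tabletop experiment can be imitated in a few lines of code. The sketch below is a minimal version of the Bak-Tang-Wiesenfeld sandpile model (grid size and grain count are arbitrary choices): grains are dropped one at a time, any cell holding four or more grains topples onto its neighbors, and the number of topplings each grain triggers is recorded:

```python
import random

# Minimal Bak-Tang-Wiesenfeld sandpile: drop grains on a grid; any cell
# holding 4+ grains topples, sending one grain to each neighbor (grains
# falling off the edge are lost). One grain can trigger a chain reaction.
N = 12
random.seed(1)
pile = [[0] * N for _ in range(N)]
avalanches = []

for _ in range(20000):
    pile[random.randrange(N)][random.randrange(N)] += 1
    topples = 0
    unstable = True
    while unstable:                     # relax the pile until no cell holds 4+
        unstable = False
        for i in range(N):
            for j in range(N):
                if pile[i][j] >= 4:
                    pile[i][j] -= 4
                    topples += 1
                    unstable = True
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        if 0 <= i + di < N and 0 <= j + dj < N:
                            pile[i + di][j + dj] += 1
    avalanches.append(topples)

small = sum(1 for s in avalanches if s < 10)
large = sum(1 for s in avalanches if s >= 100)
print(f"avalanches under 10 topplings: {small}")
print(f"avalanches of 100+ topplings:  {large}")
# Many small events, few large ones: the long-tailed signature of criticality.
```

Most dropped grains cause no avalanche at all, yet once the pile is critical, an occasional grain sets off a cascade spanning much of the grid, exactly the forecasting predicament Bak describes.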
Criticality marks a phase transition — a sudden and occasionally catastrophic change of state. In The Black Swan: The Impact of the Highly Improbable (2007), Nassim Nicholas Taleb calls these sudden, unexpected catastrophes ‘Black Swans.’
“A small number of Black Swans explain almost everything in our world, from the success of ideas and religions, to the dynamics of historical events, to elements of our personal lives. Ever since we left the Pleistocene, some ten millennia ago, the effect of these Black Swans has been increasing. It started accelerating during the industrial revolution, as the world started getting more complicated, while ordinary events, the ones we study and discuss and try to predict from reading the newspapers, have become increasingly inconsequential.”
Life is experienced, Taleb says, as “the cumulative effect of a handful of significant shocks.” The path of history is not foreseeable; therefore we are certain to be surprised. Yet we cling to a mistaken view of reality, where the future can be predicted and planned.