This article is about adopting a complex-systems view of societal phenomena. Every human society is a collective of individuals with coupled behaviours.
As a result, large-scale, society-wide information and local behaviours are coupled. This coupling can be appealing, but it also produces surprising dynamics. In complex systems, noise feeding back onto itself can drive transitions to orderly states. In practice, this means that slightly random individual decisions, each positive on its own, can combine into negative behaviour at the scale of a whole society.
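This amplification of small random choices through imitation can be sketched with a toy model, loosely in the spirit of herding models from economics. The model choice, parameters and the 90 per cent threshold below are illustrative assumptions, not taken from the text:

```python
import random

def simulate_herding(n_agents=50, steps=20_000, noise=0.002, seed=1):
    """Each step, one agent either flips its binary choice at random
    (probability `noise`) or copies a randomly chosen other agent.
    Tiny idiosyncratic decisions, amplified by imitation, push the
    whole population into long stretches of near-unanimity."""
    random.seed(seed)
    state = [random.choice([0, 1]) for _ in range(n_agents)]
    near_unanimous = 0  # steps on which more than 90% of agents agree
    for _ in range(steps):
        i = random.randrange(n_agents)
        if random.random() < noise:
            state[i] = random.choice([0, 1])               # individual noise
        else:
            state[i] = state[random.randrange(n_agents)]   # imitation
        share = sum(state) / n_agents
        if share > 0.9 or share < 0.1:
            near_unanimous += 1
    return near_unanimous / steps

frac = simulate_herding()
print(f"fraction of steps spent near unanimity: {frac:.2f}")
```

Even though no agent prefers either option, the feedback of copying on copying keeps the population locked in near-consensus for most of the run, with occasional noise-driven swings from one pole to the other.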
There is another issue with complex dynamical systems: their possible states follow heavy-tailed probability distributions. In collective settings where contagion shapes behaviour – a run on the banks, a scramble to buy toilet paper – extreme events are far more likely than bell-curve intuition suggests.
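The practical difference between thin and heavy tails can be seen in a few lines. The distributions and parameters here are illustrative assumptions: a half-normal stands in for a thin-tailed world, a Pareto with tail exponent near 1 for a heavy-tailed one:

```python
import random

random.seed(0)

# Compare a thin-tailed sample (half-normal) with a heavy-tailed one (Pareto).
# In the heavy-tailed case a single extreme draw can dominate the whole total:
# the statistical signature of one crisis dwarfing years of ordinary days.
n = 100_000
normal = [abs(random.gauss(0, 1)) for _ in range(n)]
pareto = [random.paretovariate(1.1) for _ in range(n)]  # tail exponent ~1.1

for name, xs in [("thin-tailed", normal), ("heavy-tailed", pareto)]:
    share_of_max = max(xs) / sum(xs)
    print(f"{name}: largest single draw is {share_of_max:.2%} of the total")
```

In the thin-tailed sample the largest draw is a negligible sliver of the sum; in the heavy-tailed sample one draw can account for a visible share of everything that happened.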
What’s more, once a rare but hugely significant ‘tail’ event takes place, the probability of further tail events rises. We might call these second-order tail events; they include stock-market gyrations after a big fall, and earthquake aftershocks. The initial probability of a second-order tail event is so tiny that it’s almost impossible to calculate – but once a first-order tail event occurs, the rules change, and that probability jumps.
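This clustering of rare events can be mimicked with a simple self-exciting process, in the spirit of the Hawkes processes used to model aftershocks. The discrete-time form and all parameters below are illustrative assumptions:

```python
import random

random.seed(42)

# Discrete-time sketch of a self-exciting process: events occur at a tiny
# baseline rate, but each event temporarily boosts the rate, so rare events
# arrive in clusters - first-order events breeding second-order ones.
baseline = 0.001   # per-step probability of a spontaneous tail event
boost = 0.05       # extra probability added right after an event
decay = 0.8        # the boost fades geometrically each step
steps = 50_000

excitement = 0.0
events = []
for t in range(steps):
    rate = min(1.0, baseline + excitement)
    if random.random() < rate:
        events.append(t)
        excitement += boost
    excitement *= decay

# Measure clustering: how often does an event follow another within 10 steps?
gaps = [b - a for a, b in zip(events, events[1:])]
clustered = sum(g <= 10 for g in gaps) / max(1, len(gaps))
print(f"{len(events)} events; {clustered:.0%} follow a previous one within 10 steps")
```

Under a constant rate of 0.001, an event would follow within 10 steps of another only about 1 per cent of the time; the self-excitation makes such quick follow-ups far more common, even though most of the run is quiet.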
This uncertainty makes predicting the results of one’s actions very hard. Many fiascoes have happened when central entities made top-down decisions from crude data, drawing easy conclusions about cause and effect.
There are better ways to make consequential, society-wide decisions. As the mathematician John Allen Paulos remarked about complex systems: ‘Uncertainty is the only certainty there is. And knowing how to live with insecurity is the only security.’ Instead of prioritising outcomes based on the last bad thing that happened – applying laser focus to terrorism or inequality, or putting vast resources into healthcare – we might take inspiration from complex systems in nature and design processes that foster adaptability and robustness for a range of scenarios that could come to pass.
Nature has come up with ways of coping with this uncertainty. One key strategy concerns timescales.
The timescales on which a system’s processes run have critical consequences for its ability to predict and adapt to the future. Prediction is easier when things change slowly – but if things change too slowly, it becomes hard to innovate and respond to change. To solve this paradox, nature builds systems that operate on multiple timescales.
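The trade-off between the two timescales can be illustrated with a pair of exponential-moving-average trackers following a noisy signal that suddenly shifts. The set-up and all numbers are illustrative assumptions:

```python
import random

random.seed(7)

# A fast tracker adapts quickly but is jerked around by noise; a slow
# tracker is robust to noise but lags when the world really changes.
# Running both at once is a minimal version of a multi-timescale system.
fast_rate, slow_rate = 0.5, 0.02
fast = slow = 0.0
signal = 0.0
fast_err = slow_err = 0.0
steps = 2_000

for t in range(steps):
    if t == steps // 2:
        signal = 5.0                       # abrupt regime shift
    obs = signal + random.gauss(0, 1)      # noisy observation
    fast += fast_rate * (obs - fast)       # fast timescale: reactive
    slow += slow_rate * (obs - slow)       # slow timescale: robust
    fast_err += (fast - signal) ** 2
    slow_err += (slow - signal) ** 2

print(f"fast-tracker mean squared error: {fast_err / steps:.2f}")
print(f"slow-tracker mean squared error: {slow_err / steps:.2f}")
```

Neither tracker dominates: the fast one pays a constant noise penalty, the slow one pays a lag penalty after the shift. A system that maintains both can respond to genuine change without being whipsawed by every fluctuation.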
This separation of timescales needs to be just right (neither too large nor too small) and itself adaptable.
We are just at the dawn of using complex-systems science to study societal and environmental phenomena, but it could lead to great innovations in policy-making.