Since I am on vacation, I have been catching up on some light reading, like Vannevar Bush’s 1949 Modern Arms and Free Men. I have also been reading an unhealthy amount of news. This has led me to reflect a bit more on how the main lessons of complexity theory for archaeology work out in the real world, in our present. How does being part of a complex system affect us in real terms, as individuals, groups, and as a species?
Bush’s main theses (:2) are “that the technological future is far less dreadful and frightening than many of us have been led to believe, and that the hopeful aspects of modern applied science outweigh by a heavy margin its threat to our civilization” and that “the democratic process is itself an asset with which, if we can find the enthusiasm and the skill to use it and the faith to make it strong, we can build a world in which all men can live in prosperity and peace”. Let’s forget for a moment how quaint or even misguided these sentiments may seem as the American and Russian Presidents meet today in Helsinki, in my beloved Finland.
Both theses have to do with complexity, because they have to do with intentions and consequences. If we all behaved as Bush advises, and if there were a clear connection between our good-faith efforts and their impacts on the world, we wouldn’t be in our current situation. I have argued that because we live in a complex system characterized by chaotic change, there is in fact a very weak long-term correlation between our intentions as agents and the consequences of our actions.
In the short term, there can in fact be a good correlation between what we intend to do and what we actually achieve. This is what gets us into so much trouble individually, as groups, and as a species. When our system is far from self-organized criticality (SOC), when it is stable and not subject to sudden, unpredictable changes, we can do pretty much anything we want without much fear of upsetting things too badly. Our actions have mostly local implications, and mostly the ones we intended. We can push a few grains of sand on the pile in our immediate vicinity, and others will barely notice. Our individual actions, in the final analysis, don’t matter much. While this can be frustrating, it is also relatively safe.
When our system approaches SOC, however, any one of our actions, regardless of our intentions, can have significant and global repercussions, sometimes leading to a complete, catastrophic rearrangement of the system. This is when our intentions no longer matter, and when we can have a devastating impact on others, and ourselves. Merely nudging one single grain of sand can lead to a complete collapse and reorganization of the pile.
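The sandpile image above comes from the Bak–Tang–Wiesenfeld model, the original toy model of self-organized criticality. As a concrete illustration, here is a minimal Python sketch of it; the grid size, toppling threshold, and number of grains are my own illustrative choices, not anything from the post. Grains are dropped one at a time; a site holding four or more grains topples, shedding one grain to each neighbour, possibly triggering a chain of further topplings.

```python
import random

def drop_grain(grid, size, threshold=4):
    """Drop one grain at a random site and topple until the pile is stable.
    Returns the avalanche size: the number of topplings triggered."""
    i, j = random.randrange(size), random.randrange(size)
    grid[i][j] += 1
    avalanche = 0
    unstable = [(i, j)] if grid[i][j] >= threshold else []
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < threshold:
            continue  # already relaxed by an earlier toppling
        grid[x][y] -= threshold
        avalanche += 1
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < size and 0 <= ny < size:  # grains fall off the edge
                grid[nx][ny] += 1
                if grid[nx][ny] >= threshold:
                    unstable.append((nx, ny))
    return avalanche

random.seed(1)
size = 20
grid = [[0] * size for _ in range(size)]
sizes = [drop_grain(grid, size) for _ in range(20000)]

# Early drops, far from criticality, barely topple anything; once the pile
# has self-organized, the same single-grain nudge occasionally triggers a
# system-wide avalanche.
print("largest avalanche in the first 500 drops:", max(sizes[:500]))
print("largest avalanche after self-organization:", max(sizes[5000:]))
```

Run repeatedly, this reproduces the qualitative point of the preceding paragraphs: the identical action (adding one grain) has mostly tiny, local consequences while the pile is far from criticality, and occasionally enormous ones once it is not, with no way to tell in advance which outcome a given grain will produce.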
Most of us are fairly risk-averse and not very enterprising. Most of us try to think beyond our narrow self-interest. To quote Bush (:8) again, most of us are “governed by a moral code which transcends expediency”. In most circumstances, especially when we are far away from SOC (which is most of the time), this serves us well.
Some of us, however, individuals and groups, are more tolerant of risk, more enterprising, and more self-interested. Some of us are all those things at once. When our system is far from SOC, not very subject to change regardless of our intentions and our efforts, that subset of us learns from experience that their actions, no matter how outrageous, have few serious consequences. No matter how much they try to push the limits, nothing globally destructive seems to happen. The tragedies they put in motion are limited in scope. They are local and circumscribed.
Proxy wars, for example, are unlikely to escalate into cataclysmic, world-ending nuclear exchanges. In fact, some of my other light holiday reading suggests that a system far from SOC is very resistant to this kind of escalation. Given the number of close calls we actually had, if the Cold War system had been unstable, we wouldn’t be here to discuss it.
Since a system far from SOC favours intended consequences over unintended ones, those enterprising, self-interested, risk-seeking people among us can learn that their outrageous actions consistently serve their narrow interests. They learn that brinkmanship pays.
When our system approaches SOC, these people’s behaviour becomes extremely dangerous. In a critically self-organized system, in a sandpile ready to slide into a system-wide catastrophic avalanche, any action can initiate the change. Unfortunately, humans are blind to SOC. We just don’t know what state our system is in.
Statistically, the change is likely to come from those enterprising, risk-tolerant, self-interested individuals and groups, because they have learned that the system tolerates their behaviour relatively intact. Until it can’t.
Then, all the intentions behind the technology and its application, all the good will of a democratic process don’t matter one bit. When the avalanche begins, all we can do is ride the wave and see what happens. The most shocked and surprised are usually those whose reckless behaviour initiated the change. The rest of us just pay the price.
Note to self: I need to think about whether, in human systems, the very learning of brinkmanship favoured by distance from SOC actually favours a movement of the system toward SOC. Maybe that will be a post next summer.
Bush, V. 1949. Modern Arms and Free Men: A Discussion of the Role of Science in Preserving Democracy. New York: Simon and Schuster.