I haven’t been writing here very actively recently. I think I got trapped in the question ‘what could I write that is meaningful and not already out there?’. Well, anyway, just a short post today with some thoughts I have carried around for a while. I just came across a post on the Aid on the Edge blog about South Africa and the uncertainty of its future:
We human beings do not like uncertainty. We seek to understand what events portend, taking comfort in coming up with an answer. (…) Yet sometimes there is more wisdom, and more comfort to be taken, in acknowledging a more humbling truth – that which of many alternative futures (including ones we cannot imagine) will come to pass is unknowable, is a product of decisions and actions that have not yet been made. This understanding of change as something ‘emergent’, evolving, which can unfold in far-reaching yet ex ante unpredictable directions, is the key insight of ‘complexity theory’ – an insight which can offer a useful dose of humility to governance prognosticators.
The question that comes to my mind when reading this is how to handle the tension between the uncertainty of the future and the deeply institutionalized need for planning in development institutions.
I have worked with a system dynamics approach that combines causal loop diagrams with a method called sensitivity analysis. It helps determine the relative importance of impact factors in a system and characterize them as active, critical, passive, or buffering. Together, these two tools allow us to select impact factors that development agencies could target in future projects.
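To make the classification step a bit more concrete, here is a minimal sketch in Python of the kind of cross-impact reasoning I mean (in the spirit of Vester’s sensitivity model). The factors, the influence ratings, and the median-based cut-offs are invented purely for illustration; a real analysis would use a carefully rated matrix and more nuanced thresholds.

```python
# Sketch: classify impact factors from a cross-impact matrix.
# influence[i][j] rates how strongly factor i influences factor j (0 = none, 3 = strong).
# All numbers and names here are illustrative assumptions, not real data.

factors = ["governance", "funding", "local ownership", "infrastructure"]

influence = [
    [0, 2, 3, 1],
    [1, 0, 2, 3],
    [2, 1, 0, 1],
    [0, 1, 1, 0],
]

def classify(influence, factors):
    n = len(factors)
    active_sums = [sum(influence[i]) for i in range(n)]                          # how much a factor drives others
    passive_sums = [sum(influence[i][j] for i in range(n)) for j in range(n)]    # how much it is driven by others

    # Split at the median of each sum; real applications use more refined cut-offs.
    a_cut = sorted(active_sums)[n // 2]
    p_cut = sorted(passive_sums)[n // 2]

    roles = {}
    for i, name in enumerate(factors):
        high_active = active_sums[i] >= a_cut
        high_passive = passive_sums[i] >= p_cut
        if high_active and high_passive:
            roles[name] = "critical"    # strongly drives and is strongly driven
        elif high_active:
            roles[name] = "active"      # promising lever: drives more than it is driven
        elif high_passive:
            roles[name] = "passive"     # mostly an indicator of change elsewhere
        else:
            roles[name] = "buffering"   # weakly coupled to the rest of the system
    return roles

if __name__ == "__main__":
    for factor, role in classify(influence, factors).items():
        print(f"{factor}: {role}")
```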
Now, what is the value of causal loop diagrams? Some people say that they are no more than an improved version of linear causal chains, still unable to reflect ‘real’ complexity, i.e., the unpredictability of complex systems. Loop diagrams still work with cause-and-effect relationships between pairs of factors, which can be chained together with other factors to eventually close a loop. Yet the complexity sciences tell us that cause and effect are hard to determine in complex systems, so why bother?
I think that the causal loop analysis and the sensitivity analysis allow us to evaluate the factors that are most relevant and focus on them. They further illustrate some of the more prominent feedback mechanisms of the system that could amplify or hamper our interventions, or that we could even use to change some of the dynamics of the system in our favor (a small sketch of how such loops can be surfaced follows below). They also cater to the need for planning, or at least establish a rational basis for it.
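As a rough illustration of what I mean by surfacing feedback mechanisms: if we represent a causal loop diagram as a signed directed graph, we can list its loops and check whether each one is reinforcing (amplifying) or balancing. The factor names and link polarities below are made up for illustration only; this is a sketch of the idea, not a finished method.

```python
# Sketch: find feedback loops in a causal loop diagram modelled as a signed digraph.
# Edge attribute 'sign' is +1 (change in the same direction) or -1 (opposite direction).
# The example diagram is invented for illustration.
import networkx as nx

G = nx.DiGraph()
G.add_edge("funding", "project activity", sign=+1)
G.add_edge("project activity", "local ownership", sign=+1)
G.add_edge("local ownership", "funding", sign=+1)
G.add_edge("project activity", "donor fatigue", sign=+1)
G.add_edge("donor fatigue", "funding", sign=-1)

for cycle in nx.simple_cycles(G):
    # Multiply the polarities along the loop: a positive product means reinforcing.
    product = 1
    for a, b in zip(cycle, cycle[1:] + cycle[:1]):
        product *= G[a][b]["sign"]
    kind = "reinforcing" if product > 0 else "balancing"
    print(" -> ".join(cycle), f"({kind})")
```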
But yeah, we have to avoid falling back into the ‘we can predict the future’ trap, trying to build a perfect model of a system (just remember, models of complex systems have to be as complex as the real thing to simulate it accurately). Complex systems remain inherently unpredictable, and our actions need to be tuned to the reactions of the system to any intervention. The above-mentioned tools help us make sense of the dynamics of a system and select the more promising interventions. They do not, however, release us from the need for an experimental (or may I call it evolutionary?) approach to solving real problems in real systems.
I would appreciate any thoughts on that in the comments!