
Thinking outside of the linear box

Denny Borsboom

20.04.2022

When people think of interventions, they tend to think in small linear chains: if you intervene on A, then this will cause B, perhaps resulting in C, but then there’s a full stop. However, few of the systems we typically intervene on can be fully analyzed in this way, because most contain feedback loops that should be factored in. Psychological systems, such as the networks of interacting symptoms that are now popular in clinical psychology and psychiatry, are plausible examples that require such an analysis. For instance, addiction can be a coping mechanism for dealing with problems, but these problems may also arise because of the addiction; think of the proverbial alcoholic who drinks to forget the problems they have because they drink. This type of feedback loop is very common in mental disorders. 


Feedback loops are tricky, especially when we are reasoning about interventions. An illustrative example is the so-called cobra effect. The story goes that, during British colonial rule, the authorities in India tried to bring down the number of cobras by offering a reward to anybody who brought them a dead cobra. Although the program was initially a success, before long local farmers were actively breeding cobras in order to collect the bounty. When the authorities learned of this, they discontinued the policy, and because the farmers had no other use for the animals, they released them into the wild, resulting in a net increase in the cobra population. So in the end, the policy achieved exactly the opposite of what it intended. 


While this example is often used to highlight the difficulty of intervening in complex systems, which it indeed does, it is worth noting that analyzing the example has also promoted understanding. It has allowed scholars to collect many similar examples, which have become categorized as perverse incentives, and to analyze the common structure of such situations. In the cobra example, the policy maker targets variable A (the number of wild cobras) through intervention B (issuing a bounty), but intervention B has side effect C (farming cobras) that renders the intervention ineffective in bringing down A; this then causes the policy maker to rescind the policy (intervention not-B), which leads to a new behavior D (releasing cobras) that actually increases A in the end. We can draw a diagram of this process, put it in a computer simulation, and use that to model similar cases. Thus, even though the story vividly illustrates the unwieldy nature of complex systems, it has also allowed us to learn about a particular type of effect that may occur in such systems. This allows policy makers attuned to the complexity perspective to keep these unintended consequences in mind, which is highly useful in many other similar situations1. 
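
To make that concrete, here is a minimal sketch of what such a simulation could look like, written in Python. It is a toy model: all parameter values (reduction rates, breeding rates, the time at which the bounty ends) are invented for illustration and are not estimates of the historical episode.

```python
# A toy simulation of the perverse-incentive structure sketched above.
# All parameter values are invented for illustration; they are not
# estimates of the historical episode.

def simulate_cobra_policy(steps=40, bounty_ends=25):
    wild = 1000.0    # A: wild cobra population
    farmed = 0.0     # C: cobras bred to collect the bounty
    history = []
    for t in range(steps):
        if t < bounty_ends:          # B: bounty in force
            wild *= 0.95             # bounty hunting reduces the wild population...
            farmed += 50.0           # ...but also rewards breeding
        elif farmed > 0:             # not-B: bounty rescinded
            wild += farmed           # D: farmed cobras released into the wild
            farmed = 0.0
        wild *= 1.02                 # natural population growth
        history.append((t, round(wild), round(farmed)))
    return history

for t, wild, farmed in simulate_cobra_policy():
    print(f"t={t:2d}  wild={wild:5d}  farmed={farmed:5d}")
```

Even with these crude assumptions, the simulated wild population ends up larger than where it started once the bounty is withdrawn, which is exactly the qualitative signature of the effect.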


Unintended consequences reminiscent of such effects are described regularly in the literature on mental disorders. They have been most extensively studied in the context of medication. For instance, it is well known that antipsychotics, while often effective, can have severe side effects. These side effects can lead users to quit taking the medication, and the new problems they bring (e.g., weight gain, nausea, fatigue) can in turn worsen the situation by inducing still other problems (e.g., lack of energy, loss of interest, depressed mood) that may be implicated in psychopathology networks. While it is exceptionally difficult to track such effects scientifically, it is not impossible that in some cases the intervention leads to a net worsening of the outcome in the long run, similar to the cobra effect. Note that this can be the case even if the medication is effective in mitigating its proximal target in the short run. 
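
To illustrate how such a chain could in principle be charted, here is a hypothetical toy sketch in the spirit of a symptom network simulation. The nodes, edge weights, and effect sizes are all made up; the point is only that an intervention can suppress its proximal target while raising the total symptom load of the network via its side effects.

```python
# All nodes, edge weights, and effect sizes below are invented for illustration;
# this is not an empirical symptom network.
edges = {                                        # directed influences between problems
    ("fatigue", "lack of energy"): 0.6,
    ("lack of energy", "loss of interest"): 0.6,
    ("loss of interest", "depressed mood"): 0.6,
    ("depressed mood", "psychosis"): 0.3,
}

def run(medication, steps=30):
    state = {node: 0.0 for pair in edges for node in pair}
    state["psychosis"] = 1.0                     # the proximal target of the medication
    for _ in range(steps):
        new = dict(state)
        for (src, dst), weight in edges.items():
            new[dst] = min(1.0, new[dst] + 0.1 * weight * state[src])
        if medication:
            new["psychosis"] *= 0.7                          # suppresses its target...
            new["fatigue"] = min(1.0, new["fatigue"] + 0.2)  # ...but induces a side effect
        state = new
    return state

for med in (False, True):
    state = run(med)
    print(f"medication={med}  target={state['psychosis']:.2f}  "
          f"total load={sum(state.values()):.2f}")
```

In this toy network, the medicated run ends with a much lower target symptom but a higher total symptom load, mirroring the worry described above.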


In addition to unintended consequences, complex systems perspectives can also suggest surprising and counterintuitive interventions that can be useful in practice. Perhaps the most famous example concerns the effect of pollution on shallow lakes, intensively studied by the Wageningen ecologist Marten Scheffer. In these lakes, increased nutrient loads produced by fertilizer pollution can cause a sudden transition from a clear-water state into a turbid state, in which the lake is covered with algae. Lake turbidity has many undesirable consequences, including degraded biodiversity. An intervention that can reverse this process in certain situations is, counterintuitively, to remove fish from the lake so as to drastically reduce their numbers. This interrupts a process in the ecosystem that sustains the turbid state, because certain types of fish both profit from and promote that state. As such, catching these fish can lead the lake to recover. The counterintuitive aspect is that the intervention, which at first glance seems to worsen the situation (it further degrades biodiversity), actually breaks a feedback process that can ultimately lead to improvement. It is also interesting to note that targeting the original cause of the problem (pollution leading to excess nutrients) may have no effect at all due to hysteresis: the nutrient threshold above which the lake switches from clear to turbid is much higher than the threshold below which it switches back to its clear state. This is essentially the same process as the one I described in my previous blog post on psychopathology traps. 
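
The hysteresis can be made visible in a few lines of code. The sketch below uses a textbook minimal equation for a system with alternative stable states, of the general form discussed in the literature on catastrophic shifts; the parameter values are chosen for illustration only and do not describe any particular lake.

```python
# A minimal sketch of hysteresis in a bistable system, in the spirit of the
# shallow-lake example. The equation dx/dt = a - b*x + r*x**p/(x**p + h**p)
# is a textbook form for a system with alternative stable states; the
# parameter values are illustrative only.

def equilibrium(a, x0, b=1.0, r=1.0, h=1.0, p=8, dt=0.01, steps=20000):
    """Integrate the model to a stable equilibrium, starting from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x + r * x**p / (x**p + h**p))
    return x

# Sweep the nutrient load `a` up and then back down, each time starting
# from the equilibrium reached at the previous load.
loads = [i * 0.05 for i in range(0, 21)]   # a from 0.0 to 1.0
x = 0.0
print("increasing nutrient load:")
for a in loads:
    x = equilibrium(a, x)
    print(f"  a={a:.2f}  turbidity={x:.2f}")
print("decreasing nutrient load:")
for a in reversed(loads):
    x = equilibrium(a, x)
    print(f"  a={a:.2f}  turbidity={x:.2f}")
```

Sweeping the load up and then back down shows that the jump to the turbid state and the recovery to the clear state occur at different nutrient levels; within that range, reducing pollution alone does not restore the lake, which is what makes a direct perturbation of the system (such as removing fish) attractive.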


Surprising, counterintuitive effects in the area of psychopathology certainly exist, although they do not seem to be systematically documented as such and are rarely backed by a systems analysis. A suggestive example is the beneficial effect of sleep restriction therapy on insomnia. In sleep restriction therapy, the time during which clients are allowed to sleep is restricted to a small window that is then gradually widened; this intervention can lead to improvement within weeks. The intervention is similar to the lake example, because at first glance sleep restriction seems to worsen the situation by artificially maintaining the wakeful state (which was exactly the problem). Sleep restriction therapy may also disrupt self-sustaining feedback processes that have insomnia as a stable outcome, for instance by intervening in the mechanisms that support the circadian rhythm or by changing maladaptive beliefs that are both causes and consequences of insomnia. 
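
The abstract logic can again be sketched in a toy model. The two variables below (insomnia severity and sleep-related arousal or maladaptive beliefs) and the way the "restriction" term enters the equations are purely hypothetical; the sketch only illustrates that a temporary perturbation of one variable in a mutually reinforcing loop can move the system permanently from a pathological attractor into a healthy one, which persists after the intervention ends.

```python
# A purely hypothetical two-variable loop: insomnia severity and sleep-related
# arousal/beliefs reinforce each other, creating a healthy and an insomnia
# attractor. Temporarily applying a "restriction" term stands in, very crudely,
# for the therapy phase.
import math

def step(insomnia, arousal, restriction=0.0, dt=0.1):
    # Each variable decays on its own but is pushed up by the other;
    # the steep, saturating "push" is what creates the two attractors.
    push = lambda x: 1.0 / (1.0 + math.exp(-10.0 * (x - 0.5)))
    d_insomnia = -insomnia + push(arousal)
    d_arousal = -arousal + push(insomnia) - restriction
    return (max(0.0, insomnia + dt * d_insomnia),
            max(0.0, arousal + dt * d_arousal))

# Start in the insomnia attractor, apply the restriction for a while, then stop.
insomnia, arousal = 0.95, 0.95
for t in range(600):
    restriction = 0.8 if 100 <= t < 300 else 0.0
    insomnia, arousal = step(insomnia, arousal, restriction)
    if t % 100 == 99:
        print(f"t={t + 1:3d}  insomnia={insomnia:.2f}  arousal={arousal:.2f}")
```

In this sketch the system stays near the healthy state even after the restriction is removed, which is the structural point: the intervention does not need to remain in place once the loop has been broken.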


Interestingly, I do not know of complex systems analyses that have been applied to these situations. This is surprising, because they seem to be exactly the kind of low-hanging fruit that scientists are usually eager to pick; in addition, the research needed is cheap compared to standard research in the field (experimental tests of interventions, neuroscience approaches, genetics). In my view, charting, studying, and simulating unintended and surprising intervention effects along the lines of complex systems should be a scientific discipline in its own right. It could help us understand and possibly mitigate adverse consequences of interventions, and it could help us discover counterintuitive intervention effects that we might not otherwise consider. It is high time to start thinking outside of the linear box.



Denny Borsboom (1973) is a Dutch psychologist and psychometrician. Since 2013, he has worked as a professor of psychology at the University of Amsterdam. His work focuses on applying network theory to the study of mental disorders and their symptoms.
