Thinking in Systems
posted on: Apr 12, 2018

Since reading Meditations by Marcus Aurelius, I hadn’t come across a book with such a high signal-to-noise ratio. I just finished Thinking in Systems by Donella Meadows, and it fits the bill. I had to pause, think, and ponder after each paragraph in order not to miss the profound underlying concepts. Very easy to read, this is one of those books that changes how you look at the world.

What is a system?

A system is a set of things - people, cells, molecules, or whatever - interconnected in such a way that they produce their own pattern of behaviour over time, to achieve something. A system must consist of three things: elements, interconnections, and a function or purpose. Some examples include the digestive system, a company, the solar system, etc.

How to identify a system?
  • Can you identify the parts?
  • Do the parts affect each other?
  • Do the parts together produce an effect that is different from the effect of each part on its own?
  • Does the effect, the behaviour over time, persist in a variety of circumstances?
Some properties of a system
  • Changing individual elements usually has the least effect on the system
  • If the interconnections change, the system may be greatly altered. It may even become unrecognizable, even though the same elements are in the system.
  • Changes in function (non-human systems) or purpose (human ones) can also be drastic.

Why Systems work so well

Resilience
  • Resilience is a measure of a system’s ability to survive and persist within a variable environment.
  • It arises from many feedback loops that work to restore a system even after a large deviation. Having redundancy is hugely important.
  • A set of feedback loops that can restore or rebuild feedback loops operates one level higher - that is self-organization.
  • Resilience is not the same thing as being static or constant over time. Resilient systems can be very dynamic. Short-term oscillations, or periodic outbreaks, or long cycles of succession, climax, and collapse may, in fact, be the normal condition, which resilience acts to restore. Conversely, systems that are constant over time can be very fragile.
  • Loss of resilience can come as a surprise because the system usually is paying much more attention to its play than to its playing space. One day it does something it has done a hundred times before and crashes.
Self-Organization
  • The most marvellous characteristic of some complex systems is their ability to learn, diversify, complexify, and evolve.
  • Self-organizing systems can arise from simple, basic rules.
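To convince myself of this, here is a tiny Python sketch (my own, not from the book) of Wolfram’s Rule 30 elementary cellular automaton: every cell follows one trivial local rule, looking only at itself and its two neighbours, yet the pattern that unfolds is strikingly complex.

```python
# A minimal sketch (not from the book): Rule 30, an elementary cellular
# automaton. Each cell looks only at itself and its two neighbours, yet the
# pattern that unfolds from one "on" cell is surprisingly complex.

RULE = 30  # the 8-bit lookup table encoded as an integer

def step(cells):
    """Apply the rule to every cell, wrapping around at the edges."""
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31 + [1] + [0] * 31  # a single "on" cell in the middle
for _ in range(20):
    print("".join("█" if c else " " for c in cells))
    cells = step(cells)
```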
Hierarchies
  • A brilliant systems invention that gives a system stability and resilience.
  • It reduces the amount of information that any part of the system has to keep track of, e.g. if you have a liver disease, a doctor can treat it without paying much attention to your heart or your personality. While programming in C#, I am not worried about binary or assembly code.

Why Systems surprise us

Linear minds in a Nonlinear world
  • Everything we think we know about the world is a model. Every word and every language is a model. So are the ways I picture the world in my head - my mental models. None of these is, or ever will be, the real world.
  • Our models usually have a strong congruence with the world. However, our models fall far short of representing the world fully. This is both a blessing and a curse.
  • One cannot navigate well in an interconnected, feedback-dominated, complex world unless one takes one’s eyes off short-term events and looks for long-term behaviour and structure; unless one is aware of false boundaries and bounded rationality; unless one takes into account limiting factors, nonlinearities, and delays.
  • Systems often show a sudden swing: exponential growth driven by a dominant reinforcing loop, followed by a sudden decline once a balancing loop becomes dominant (see the sketch after this list).
  • Side-Effects: effects one hadn’t foreseen or doesn’t want to think about. Black-swan events (Nassim Taleb).
  • Being less surprised by complex systems is mainly a matter of learning to expect, appreciate, and use the world’s complexity.
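Here is a quick Python sketch (my own, not from the book) of that swing, using logistic growth. The growth rate and carrying capacity are invented numbers; the shape is the point - the reinforcing loop dominates early and the curve looks exponential, then the balancing loop takes over and growth collapses.

```python
# A small sketch (not from the book): logistic growth. Early on the
# reinforcing loop (growth proportional to population) dominates; as the
# population approaches the carrying capacity, the balancing loop takes
# over and growth falls to zero.

capacity = 1000.0     # carrying capacity (the limit the balancing loop enforces)
growth_rate = 0.25    # fractional growth per time step
population = 10.0

for t in range(40):
    # reinforcing loop: more population -> more growth
    # balancing loop: the (1 - population/capacity) term shrinks near the limit
    growth = growth_rate * population * (1 - population / capacity)
    population += growth
    if t % 5 == 0:
        print(f"t={t:2d}  population={population:7.1f}  growth this step={growth:6.1f}")
```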
System Boundaries

If you wish to make an apple pie from scratch, you must first invent the universe - Carl Sagan.

  • There is no single, legitimate boundary to draw around a system. There are no separate systems. The world is a continuum. Where to draw the boundary around a system depends on the purpose of the discussion - the questions we want to ask.
  • Boundaries can produce problems when we forget that we’ve artificially created them. When we draw them too narrowly, the system surprises us. e.g. consider the sewage problem: if we draw the boundary around our own home, we can throw the waste into the river. But the towns downstream make it clear that the boundary for thinking about sewage has to include the whole river.
  • We get attached to the boundaries our minds happen to be accustomed to: national boundaries, trade boundaries, ethnic boundaries, boundaries between public and private responsibility, between rich and poor, between polluters and the polluted. It is a great art to remember that boundaries are of our own making and that they can and should be reconsidered for each new discussion, problem, or purpose.
Layers of Limits
  • At any given time, the input that is most important to a system is the one that is most limiting. (It took me some time to wrap my head around this statement)
  • Any system with multiple inputs and outputs is surrounded by layers of limits.
  • There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
Bounded Rationality
  • It would be nice if Adam Smith’s “Invisible Hand” were true, but most of the time it isn’t.
  • Unfortunately, the world presents us with multiple examples of people acting rationally in their short-term best interests and producing aggregate results that no one likes. (While reading this, I couldn’t help but reflect on how so many problems in India are caused by this Tragedy of the Commons).
  • Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
  • We are not omniscient, rational optimizers. Rather, we are blundering “satisfiers” attempting to meet our needs well enough before moving on to the next decision. We often don’t foresee (or choose to ignore) the impacts of our actions on the whole system.
  • Trading Places: Suppose you are for some reason lifted out of your accustomed place in society and put in the place of someone whose behaviour you have never understood. This works in both directions - not only could the rich become poor, but the poor could become rich. In your new position, you experience the information flows, the incentives and disincentives, the goals and discrepancies, the pressures that go with that position. If such transitions happened much more often, in all directions, it would broaden everyone’s horizons and increase empathy.
  • Seeing how individual decisions are rational within the bounds of the information available does not provide an excuse for narrow-minded behaviour. It provides an understanding of why that behaviour arises. Blaming the individual rarely helps create a more desirable outcome. Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview.
  • From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires.

Leverage Points: Places to intervene in a system

Places in the system where a small change could lead to a large shift in behaviour. Listed below in increasing order of importance, leverage, and impact.

Physical - Low Leverage

12: Numbers

  • Important, especially in the short term.
  • Changing these variables rarely changes the behaviour of the system. However, they become leverage points when they kick off one of the items higher on this list, e.g. an interest rate feeding the reinforcing loop of compounding can make a big difference.
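To make that concrete, here is a tiny Python sketch (mine, not from the book) of compounding. The structure is identical in every run; only the number changes. Because the rate feeds a reinforcing loop, small differences in the parameter snowball over time.

```python
# A quick illustration (my own, not from the book): the same compounding
# structure, only the number changes. Because the rate feeds a reinforcing
# loop, a small change in the parameter snowballs over time.

def balance_after(years, rate, principal=1000.0):
    """Compound a principal annually at the given fractional rate."""
    for _ in range(years):
        principal += principal * rate   # the reinforcing loop: balance feeds its own growth
    return principal

for rate in (0.02, 0.05, 0.10):
    print(f"rate={rate:.0%}  balance after 30 years = {balance_after(30, rate):10.2f}")
```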

11: Buffers

  • In chemistry, a stabilizing stock is known as a buffer.
  • You can often stabilize a system by increasing the capacity of a buffer. If a buffer is too big, the system gets inflexible. It reacts too slowly, and costs a lot to build or maintain.
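Here is a rough Python sketch (my own, not from the book) of a buffer at work: a reservoir with erratic inflow and steady demand. The capacities and rainfall numbers are invented; the sketch only counts shortfall days - the cost and sluggishness of an oversized buffer is the other half of the trade-off.

```python
# A rough sketch (my own, not from the book): a water reservoir as a buffer.
# Rainfall (the inflow) is erratic, demand (the outflow) is steady. A bigger
# reservoir rides out dry spells; a small one runs short.

import random

def shortfall_days(buffer_capacity, days=365, demand=10.0, seed=1):
    random.seed(seed)                    # same rainfall sequence for every capacity
    stock = buffer_capacity / 2          # start half full
    shortfalls = 0
    for _ in range(days):
        inflow = random.uniform(0, 20)   # erratic rainfall, averaging the demand
        stock = min(buffer_capacity, stock + inflow)
        supplied = min(demand, stock)
        stock -= supplied
        if supplied < demand:
            shortfalls += 1
    return shortfalls

for capacity in (20, 100, 500):
    print(f"capacity={capacity:4d}  days with shortfall = {shortfall_days(capacity)}")
```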

10: Stock and Flow structures

  • Stocks are the foundation of any system. They are the elements of the system that you can see, feel, count, or measure at any given time.
  • Stocks change over time through the actions of a flow.
  • The plumbing structure of the stocks and flows and their physical arrangements can have an enormous effect on how a system operates. But it’s rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place.
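A minimal Python sketch (mine, not from the book) of why the plumbing matters: the same bathtub stock drained two different ways behaves completely differently. The numbers are arbitrary; the contrast in structure is the point.

```python
# A minimal sketch (my own, not from the book): one bathtub stock with two
# different "plumbing" structures. With a constant drain the stock falls in
# a straight line and abruptly runs dry; with a drain proportional to the
# stock it decays exponentially and never quite empties. Same elements,
# different structure, different behaviour.

def simulate(drain, steps=30, stock=100.0):
    history = [stock]
    for _ in range(steps):
        stock = max(0.0, stock - drain(stock))
        history.append(stock)
    return history

constant_drain     = simulate(lambda s: 5.0)        # fixed outflow per step
proportional_drain = simulate(lambda s: 0.05 * s)   # outflow is 5% of the stock

for t in (0, 10, 20, 30):
    print(f"t={t:2d}  constant={constant_drain[t]:6.1f}  proportional={proportional_drain[t]:6.1f}")
```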

9: Delays

  • A delay in a feedback process is critical, relative to rates of change in the stocks that the feedback loop is trying to control. Delays that are too short cause overreaction. Delays that are too long cause damped, sustained, or exploding oscillations. Overlong delays in a system with a threshold, a danger point, a range past which irreversible damage can occur, cause overshoot and collapse.
  • Delays in feedback loops are critical determinants of system behaviour.
  • A system just can’t respond to short-term changes when it has long-term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.
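A small Python sketch (my own, not from the book) of a balancing loop acting on stale information. The gain and delay values are arbitrary, but they show how the very same loop that settles smoothly with fresh information overshoots and oscillates - eventually explosively - when its signal is delayed.

```python
# A small sketch (my own, not from the book): a balancing loop that tries to
# hold a stock at a target but acts on information that is several steps old.
# With no delay it settles smoothly; with a long delay the same loop
# overshoots and oscillates.

def run(delay, steps=30, target=100.0, gain=0.5):
    stock = 0.0
    history = [stock]
    for t in range(steps):
        # the loop sees the stock as it was `delay` steps ago
        perceived = history[max(0, t - delay)]
        stock += gain * (target - perceived)   # corrective flow toward the target
        history.append(stock)
    return history

for delay in (0, 4, 8):
    trace = ", ".join(f"{x:5.0f}" for x in run(delay)[:12])
    print(f"delay={delay}:  {trace}")
```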

Information and Control - High Leverage

8: Balancing Feedback Loops

  • A stabilizing, regulating feedback loop which opposes or reverses whatever direction of change is imposed on the system. Any balancing feedback loop needs a goal, a monitoring and signalling device to detect deviation from the goal, and a response mechanism.
  • A complex system usually has numerous balancing feedback loops to self-correct under different conditions and impacts. They might be inactive most of the time, but their presence is critical to the long-term welfare of the system. One of the big mistakes we make is to strip away these emergency response mechanisms because they aren’t often used and they appear to be costly.
  • The strength of a balancing feedback loop is important relative to the impact it is designed to correct.
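Here is a minimal Python sketch (mine, not from the book) of the three parts in action: a thermostat with a goal (the set point), a monitor (the temperature reading), and a response (heating proportional to the deviation). All the numbers are invented.

```python
# A minimal sketch (my own, not from the book): a thermostat as a balancing
# loop with its three required parts: a goal (the set point), a monitor
# (reading the room temperature), and a response (heating in proportion to
# the deviation). The loop opposes whatever pushes the room away from the goal.

set_point = 20.0        # the goal
room = 10.0             # current room temperature
outside = -5.0          # the disturbance the loop must fight

for hour in range(12):
    heat_loss = 0.1 * (room - outside)      # the room leaks heat outdoors
    heating   = 0.3 * (set_point - room)    # monitor + response: correct the deviation
    room += heating - heat_loss
    print(f"hour {hour:2d}: room = {room:5.1f} °C")
```

Notice that the room settles a little below the set point: the loop is always fighting the constant leak, so the correction never fully closes the gap - which is exactly why the strength of a balancing loop matters relative to the impact it has to correct.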

7: Reinforcing Feedback Loops

  • An amplifying or enhancing feedback loop, which reinforces the direction of change. These are vicious cycles and virtuous circles. The more it works, the more it gains the power to work some more, driving system behaviour in one direction.
  • Reinforcing feedback loops are sources of growth, explosion, erosion, and collapse in systems. A system with an unchecked reinforcing loop ultimately will destroy itself. Usually, a balancing feedback loop will kick in sooner or later.

6: Information Flows

  • The structure of who does and does not have access to information. A missing information flow is one of the most common causes of system malfunction.
  • Adding or restoring information can be a powerful intervention, delivering feedback to a place where it wasn’t going before.
  • This is a high leverage point, often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen.

5: Rules - Incentives, punishments, constraints

  • The rules of the system define its scope, its boundaries, its degrees of freedom.
  • Strong leverage points. Power over the rules is real power. If one wants to understand the deepest malfunctions of systems, pay attention to the rules and to who has power over them.

Crucial - Highest Leverage. These are more abstract but the most effective

4: Self-Organization

  • The ability of a system to structure itself, to create new structure, to learn, or diversify. The power to add, change, or evolve system structure.
  • The most stunning thing living systems and some social systems can do is change themselves utterly by creating whole new structures and behaviours. In biology, it’s called evolution.
  • The ability to self-organize is the strongest form of system resilience. A system that can evolve can survive almost any change, by changing itself.

3: Goals

  • The purpose or function of the system. All the leverage points above can be twisted to conform to the goal of the system.
  • Changing the actors in the systems is a low-level intervention, as long as the players fit into the same old system. The exception is at the top, where a single player can have the power to change the system’s goals.

2: Paradigms

  • The mindset out of which the system - its goals, structure, rules, delays, parameters - arises.
  • The shared ideas in the minds of society, the great big unstated assumptions, constitute the system’s paradigm, the deepest set of beliefs about how the world works.
  • Paradigms are the sources of systems. From them, arise the shared agreements about the nature of reality, system goals, and information flows, feedbacks, stocks, flows, and everything else about systems.
  • We might think that paradigms are harder to change than anything else. However, a paradigm shift can happen in a millisecond, in a single individual. All it takes is a new way of seeing. We change paradigms by building a model of the system, which takes us outside the system and forces us to see it whole. (My own paradigms have been changed that way, after leaving India and coming to Canada).

1: Transcending Paradigms

  • The highest leverage point is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true”, that every paradigm, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension.
  • It is to “get” at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny.
  • It is to let go into not knowing, to embrace uncertainty - into what the Buddhists call enlightenment.

Living in a World of Systems

  • It is one thing to understand how to fix a system and quite another to wade in and fix it. For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take.
  • Systems thinking leads to another conclusion as we stop being blinded by the illusion of control. It says that there is plenty to do, of a different sort of “doing”. The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them.
  • Before you disturb the system in any way, watch how it behaves. This simple guideline helps to avoid many wrong turns down the line. Starting with the behaviour of the system forces you to focus on facts, not theories. It keeps one from falling too quickly into one’s own beliefs or misconceptions, or those of others.
  • Watching what really happens, instead of listening to peoples’ theories of what happens, can explode many careless causal hypotheses.
  • Starting with the behaviour of the system directs one’s thoughts to dynamic, not static, analysis - not only to “What’s wrong?” but also to “How did we get there?”, “What other behaviour modes are possible?”, and “If we don’t change direction, where are we going to end up?”
  • Remember, always, that everything one knows and everything everyone knows, is only a model. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them to be plausible until you find some evidence that causes you to rule one out. This is the scientific method.
  • When one is walking along a tricky, curving, unknown, surprising, obstacle-strewn path (a complex world), one would be a fool to keep one’s head down and look only at the next step ahead. One would be equally a fool to peer only far ahead and never notice what’s immediately under one’s feet.