Dave Snowden

7 principles of intervention in complex systems


I may buy this T-Shirt, as it's a great slogan. As it happens, I was working on intervention strategy in Washington a week ago. We had completed a SenseMaker® project to understand employee attitudes and then spent one day with the senior management team. Not to simply report what we had found, but to allow them to explore the data and come up with a series of interventions that would hold out the possibility of achieving change or, through failure, allow learning to take place.

Now one of the basic Cynefin mantras is that of safe-fail interventions, and lots of them, if the problem is complex. One can only really understand a complex system by interacting with it (the probe of probe-sense-respond), and while people get that, getting them to take on board the full implications is another matter. Some time ago I decided that we needed a more rigid process that would force people to take the various aspects of a safe-to-fail experiment into account in design, so we created a form. It's still in use, but I have also been working on some basic principles to expand on it. So here they are:

1. You need multiple parallel experiments, and they should be based on different and competing theories/hypotheses.

2. They must be safe-to-fail, which (to state the obvious) means that if they fail you must be able to survive the consequences and recover.

3. A percentage must fail; if not, you are not stretching the boundaries enough, and your scanning range is reduced as a consequence.

4. Each experiment must be coherent, not just a stab in the dark (hence my liking of the T-Shirt). Ideally coherence should be based on evidence; at a minimum, ritual dissent should be used to test the ideas.

5. Actions speak louder than words. If you are trying to counter a negative story, then taking small, visible actions that make the story impossible to tell is the best policy. Countering stories with stories rarely works, and neither does countering them with facts. Doing things makes all the difference.

6. You don't start any experiment, safe-to-fail or otherwise, unless you can monitor its impact in real time, or at least within correction time of your damping or amplification strategy. Working both out in advance is key, so you are ready to respond quickly to either success or failure.

7. It's worth noting that an experiment that fails may provide a better route forwards than one which succeeds.

It is important to realise that a lot of conflict happens in the complex arena. The reason for this is that many theories can be coherent with the facts, so a right answer cannot be determined by further analysis (that is for complicated problems). By allowing anyone with a coherent theory to construct and implement a safe-to-fail experiment, you radically reduce conflict in decision making.

All of this is part of the new advanced Cynefin course that is being developed, and there is a short-term opportunity to be part of its first outing. A client asked for a day next week and said they were open to a public course rather than a private one. So if you are interested, you can book here.