The Input Insensitivity Illusion

Some things are simple. They are governed by well-understood rules; they can be measured, modelled and predicted with good accuracy. They seem to take care of themselves. These are simple systems and we don’t need to worry about them, right?

Take William Foster, Michael Douglas’s character in the film Falling Down. On the face of it, he is a walking simple system. He is a law-abiding citizen going about his day in a routine fashion. The same inputs, the same outputs. Sure, there have been some adjustments to his system recently - he is now unemployed and separated from his family - but he is coping with these changes and continues to function normally. We don’t need to worry about him, right?

Those of you who have seen the film [spoiler alert] will know that we definitely do need to worry about him. He ends up going on a cathartic shooting spree, railing against all the petty injustices he has begrudgingly tolerated for years. What gives the movie its power is the idea of an Everyday Joe going off the rails - a simple system acting unpredictably.

Marcus du Sautoy’s book, What We Cannot Know: Explorations at the Edge of Knowledge, provides some excellent examples of simple systems behaving chaotically. One of its central messages is that even incredibly simple systems, driven by totally deterministic rules, can exhibit seemingly unpredictable behaviour.

Think of a pendulum. We can predict its behaviour accurately: where it moves, at what velocity. It is governed by extremely well understood physical rules. This predictability has been successfully harnessed by time-keeping technology for centuries. Now, if we attach a second pendulum to the bottom of the first - making a double pendulum - it will still be governed by the same rules and we should be able to continue to predict its behaviour. The system is DETERMINISTIC - i.e. if you know the beginning state (input), you can calculate the state at any future time (output). In this case, if you could start the pendulum off in exactly the same conditions then it would follow the same path. Note that the 'crazy' pendulum movement is not the issue here - that bit is predictable.

[Image source: Wikipedia - 100Miezekatzen]

It turns out the system is very sensitive to tiny variations in its starting conditions. Slight differences in starting position, temperature and airflow (tiny nuclear forces caused by cosmic radiation events?) can result in large differences in the behaviour of the system: very different 'crazy' paths. These slight differences in the input are - in practical terms - imperceptible to us, but they have a very noticeable impact on the observable output. This means the double pendulum is, to all intents and purposes, both DETERMINISTIC and UNPREDICTABLE.
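If you want to see this sensitivity for yourself, the short Python sketch below (not from du Sautoy’s book - the parameter values and the one-billionth-of-a-radian nudge are purely illustrative) integrates the standard equal-mass double-pendulum equations twice from almost identical starting angles and prints how quickly the two runs part company. It needs numpy and scipy.

```python
# A minimal sketch: two double pendulums released from starting angles that
# differ by one billionth of a radian. Standard equal-mass equations of motion;
# masses, lengths and times are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

g, m1, m2, l1, l2 = 9.81, 1.0, 1.0, 1.0, 1.0

def derivs(t, y):
    th1, w1, th2, w2 = y          # angles and angular velocities of the two arms
    d = th1 - th2
    den = 2*m1 + m2 - m2*np.cos(2*d)
    dw1 = (-g*(2*m1 + m2)*np.sin(th1) - m2*g*np.sin(th1 - 2*th2)
           - 2*np.sin(d)*m2*(w2**2*l2 + w1**2*l1*np.cos(d))) / (l1*den)
    dw2 = (2*np.sin(d)*(w1**2*l1*(m1 + m2) + g*(m1 + m2)*np.cos(th1)
           + w2**2*l2*m2*np.cos(d))) / (l2*den)
    return [w1, dw1, w2, dw2]

t = np.linspace(0, 20, 2001)
run_a = solve_ivp(derivs, (0, 20), [np.pi/2, 0, np.pi/2, 0],
                  t_eval=t, rtol=1e-10, atol=1e-10)
run_b = solve_ivp(derivs, (0, 20), [np.pi/2 + 1e-9, 0, np.pi/2, 0],
                  t_eval=t, rtol=1e-10, atol=1e-10)

# Angular separation between the two 'identical' pendulums over time.
for sec in (1, 5, 10, 15, 20):
    i = np.argmin(abs(t - sec))
    diff = abs(run_a.y[2, i] - run_b.y[2, i])
    print(f"t = {sec:2d}s  |difference in lower-arm angle| = {diff:.2e} rad")
```

An imperceptible difference in the release angle grows, within a handful of swings, into completely different paths - deterministic rules, unpredictable output.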
The same is true of many systems. I won’t get into a discussion here regarding the validity of the distinction between aleatory and epistemic uncertainty (perhaps a topic for another blog post), but there are many things which are theoretically possible, yet practically impossible, to predict (e.g. the outcome of a dice throw). Most of these systems do not cause us a problem, because we view them as highly complex or even chaotic, treat them as random, and regard them as predictable only in terms of a probability distribution based on observed outcomes. In other words, we know they are unpredictable and take this into account in our decision-making.
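As a toy illustration of that last point (the die here is simulated rather than thrown, so treat it as a stand-in for real observations), the snippet below builds an empirical distribution from observed rolls; that distribution, not any attempt to model the physics of the throw, is what we would actually use in decision-making.

```python
# A toy illustration: a die throw is physically deterministic, but in practice
# we summarise it with a probability distribution built from observed outcomes.
import random
from collections import Counter

random.seed(1)
rolls = [random.randint(1, 6) for _ in range(10_000)]   # stand-in for observed throws
empirical = {face: count / len(rolls) for face, count in sorted(Counter(rolls).items())}
print(empirical)   # each face settles near 1/6; individual throws stay unpredictable
```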

It should be noted that a chaotic system is not necessarily entirely unpredictable; rather, such systems may have a ‘horizon’ beyond which it is not practically possible to predict their outputs. In the 1950s the prediction horizon for weather systems was about 18 hours; today it is about one week. This may sound like a huge improvement, but consider the roughly two-fold increase in transistor density every couple of years that Moore’s Law suggests, or the actual increase in the processing power of the supercomputers used to model the weather. In 1959 the Met Office’s Ferranti Mercury, nicknamed Meteor, was capable of doing 30,000 calculations a second; the modern-day Cray supercomputer is estimated to complete 16,000 trillion per second. It is thought there is a practical ceiling of about 14 days even with perfectly accurate input data and greater computing power. There is clearly a diminishing return in accuracy from ever more powerful machines and ever more accurate models. Nevertheless, the weather, despite being a massively complicated system, is still to some degree predictable.
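To see why throwing hardware at a chaotic system gives such diminishing returns, here is a back-of-the-envelope sketch. The error-doubling time and tolerance below are made-up illustrative numbers, not Met Office figures; the point is only that if forecast error grows exponentially, the usable horizon grows merely logarithmically as the inputs get more accurate.

```python
# A back-of-the-envelope sketch (illustrative numbers): if forecast error
# doubles roughly every 'doubling_time' days, the usable horizon grows only
# logarithmically as the initial measurement error shrinks.
import math

doubling_time = 1.5     # assumed error-doubling time in days
tolerance = 10.0        # error level (arbitrary units) at which a forecast is useless

for initial_error in (1.0, 0.1, 0.01, 0.001):
    horizon = doubling_time * math.log2(tolerance / initial_error)
    print(f"initial error {initial_error:6.3f} -> horizon ~ {horizon:4.1f} days")

# Each ten-fold improvement in the input (which may demand vastly more sensors
# and compute) buys the same fixed ~5 extra days of horizon.
```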

But, as I mentioned, it is not the known chaotic systems or highly complex systems that are the problem. Thinking back to our movie analogy, these systems are more like a Nicolas Cage character. We know they are crazy, we expect them to act erratically, so we take precautions or stay out of their way.

Our real problems arise when we view something as simple, we believe we can predict its outcome accurately and we act accordingly, and yet it actually possesses hidden complexity, which can lead to strategic surprise. This is the ‘Input Insensitivity Illusion’.

The balance of power, which held the peace (more-or-less) in Europe for the best part of a century, is a good example of a highly complex system, which some experts began to view as simple and deterministic. It has been argued that the resulting miscalculations led to a far greater catastrophe (the First World War) than would have occurred had the relevant decision-makers recognised the unpredictability of the system within which they were operating.

It is an easy trap to fall into. When you know your business, market, customers, processes and systems, it’s tempting to believe that you ‘totally get it’. Over years of experience you will have developed models (either statistical data-driven models or heuristic brain-based models) that give you a feel for what might happen next. Moreover, in systems which are driven by simple deterministic logic it’s easy to believe that you know how it all works, so you may think you can predict how it will behave. Unlike some decision-making mistakes, the ‘input insensitivity illusion’ is more likely to affect those with more experience of a given system, because they will have developed more confidence in their models of the system.

There is an inevitability about humans succumbing to this illusion. We are driven to minimise complexity and impose simplicity in our conceptualisation in order to get things done. In many cases reductionism serves us well, and we could not model most systems at all without some simplification. The problem is not the simplification itself; it is the amount of trust we place in that simplification and our overconfidence in the resulting predictions. The problem is distinct from oversimplification, another common decision-making failure mode. In this case, we are assuming that ‘negligible’ variations in input will result in only small variations in output.

So how can we maximise the gains of conceptually reducing complexity, while guarding against its inherent risks? The answer lies in remaining honest with ourselves.

There are a number of areas where organisations commonly forget to ask themselves questions about supposedly simple systems and predictive models. Avoiding these pitfalls can help mitigate the risk posed by the input insensitivity illusion:

  • Beware of prediction decay - As I’ve stated, simplicity in a system does not equate to predictability of output. Don’t simply extend the trend line and error bars for a predictive model over time. This may be fine for short-term or micro-level predictions, but longer-term predictions or forecasts that are affected by the surrounding macro system may be subject to significant prediction errors over time. Also understand that prediction decay may be catastrophic in nature, not simply fading in a linear fashion. If this is the case, you could end up not just being wrong, but being very wrong.

  • Search for false precision - Organisations are particularly susceptible to false precision given the explosion in new sources of data. The increase in quantification can give the impression that we can measure everything accurately. It is important to probe the validity of new sources of data: they may not measure what they claim to measure, in which case your model’s assumptions may be violated. Ultimately such data should be judged by its predictive power, but while it remains a relatively new and untested data source we should be particularly wary. Being Bayesian helps to protect against an over-reliance on new unverified data (see the small sketch after this list).

  • Know the limits of your knowledge - Be clear about what you can’t know. This shouldn’t stop you from trying to reduce uncertainty, but understand what the information currently available to you permits you to be sure about. You can waste valuable resources (not to mention make costly predictive errors) by trying to model the unknowable. Work within the chaos. Any chaotic deterministic system will be predictable up to a point for a given required level of fidelity: the ‘chaos prediction horizon’. For accurate weather forecasts, the expected maximum prediction horizon is approximately 14 days; for a double pendulum it’s probably the first couple of swings (if you release it by hand).

  • Simplicity + Certainty = Caution - Simplicity is an alluring concept, and so is certainty. We want to understand how things work and we want to be sure that our understanding is true. Unfortunately, the two coincide less frequently than we would like. Occam’s Razor may be a useful approach for choosing between hypotheses in some circumstances, but it is not a licence to stop questioning simple explanations - especially when they are expressed with absolute certainty.
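To make the Bayesian point above a little more concrete, here is a minimal sketch in Python. The Beta-Binomial model, the ‘conversion rate’ framing and all of the numbers are hypothetical; the idea is simply that a prior built from trusted history stops a small batch from a new, unverified data source dominating your estimate.

```python
# A minimal Bayesian sketch (hypothetical numbers): a Beta prior encodes what
# long-standing, trusted data says about a conversion rate; a small batch from
# a brand-new, unverified source shifts the estimate only modestly rather than
# being taken at face value.
prior_alpha, prior_beta = 30, 270      # trusted history: ~10% rate over 300 pseudo-trials
new_successes, new_failures = 15, 35   # shiny new source claims ~30% over 50 trials

naive_estimate = new_successes / (new_successes + new_failures)
posterior_mean = (prior_alpha + new_successes) / (prior_alpha + prior_beta
                                                  + new_successes + new_failures)

print(f"new source alone suggests : {naive_estimate:.1%}")
print(f"Bayesian posterior mean   : {posterior_mean:.1%}")  # pulled only part-way towards the new data
```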

All of the above are really just heuristics designed to prompt our scepticism. They can all be summed up by the maxim - Always ask: How sure am I? Remembering to pose this question at every step when building predictive models, particularly when the system you are modelling appears simple, should help you avoid being overconfident.

So don’t forget, next time you are standing in line behind someone muttering under their breath about poor service, just think: are they really the simple system they appear to be?