We see the world, not as it is, but as we are – or, as we are conditioned to see it. – Stephen R. Covey
It’s well understood that our prejudices colour our views. Perception interferes with objectivity. Behavioural scientists have come to conclude that we see what we believe more often than we believe what we see! Organisations have to contend with this problem every day even as they strive to mobilise the energies of a diverse set of people to achieve a common goal.
To differentiate between truth and perception, managers tend to rely on data. This faith in data is deep-set, as evidenced by the overused line, “In God we trust, and for everything else, get the data.” Unfortunately, data does not really help differentiate between truth and perception. For instance, two managers might have two different interpretations of the same data, or two opposing views can each have fully reliable data supporting them.
So, data at best helps validate the existence of an entity, but decisions are made based on the inferences (or deductions) that managers draw from the data. To illustrate this, let’s look at the specific issue of poor deliveries by a vendor, a common complaint of plant managers at manufacturing companies. It is very likely that an attempt to zero in on the root cause of this problem leads a manager to the conclusion that the cause is the vendor’s “bad attitude”. Consequently, he may proceed with decisions and actions like counselling the vendor or even replacing him. What is important to note here is that while there is data to support the fact that deliveries were late, the conclusion is purely an inference, which may or may not be accurate. There isn’t much by way of figures or numbers to evidence the “bad attitude”, so the decision was very likely coloured by the manager’s previous experiences, either with this vendor or with others!
Why are so many human decisions led by perception-based inferences? Behavioural scientists have pointed out that as a species, we can do two kinds of thinking: slow and fast. We are capable of slow, rigorous analytical evaluation. However, since the human species has often had to act quickly in the face of danger, we have also evolved the ability to make quick decisions relying on intuition and heuristics. The latter method is obviously not very logical and is subject to bias, but it sufficed to address immediate dangers.
While such urgent situations and dangers in our environment may have come down, human beings often prefer taking the ‘lazy’, shorter route to conclusions during decision making, hopping on biases and leaping off past experiences. A decision made in a rush can have long-standing consequences in organisations. So, it is important that managers are trained to use a more analytical approach to decision making. For this, managers have to escape the trap of erroneous mental models and biases. The good news is that our minds can be trained to do so if we keep three axioms in mind while analysing problems:
1. Organisations are interconnected systems
In an organisation, issues that crop up are very rarely isolated incidents or problems. They may not even have their root in the same department. Therefore, true power for a decision maker lies in the ability to appreciate the whole pattern of interlinkages in the organisation, along with the cause-effect connections between its myriad parts. Grasping interconnections becomes even more difficult when cause and effect form non-linear loops, especially when the two are separated by time and space. For example, when a cause in one department has an effect in another after a considerable amount of time, the connection between the two can remain undetected for a long while.
The ability to see interconnections is, in and of itself, far more valuable than intimate knowledge of the many individual parts of a system. Managers, however, are trained to be specialists in their respective departments and areas of work. Moreover, as bosses of their domains, their teams look up to them to rule and protect their turf and to shield them from blame. This culture of protecting self-interest compartmentalises the managers’ thinking and clouds their view of the big picture.
This failure of managers to take a systemic view can be a major hurdle to good decision making and can hurt not only the department but the whole organisation. A simple and effective way to train the mind is to use computer simulators: these can collapse the dimensions of time and space (a day of real life can be modelled as a second of computer time, and a manager on a simulator can visualise effects spanning departments), making managers aware of the non-linear loops of cause and effect across departments.
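To make this concrete, here is a minimal sketch of the kind of discrete-time simulation such a tool might run. The numbers, the four-week delay and the payment-cut scenario are all hypothetical, chosen only to show how a cause in one department surfaces as an effect in another several periods later.

```python
DELAY = 4  # assumed lag, in weeks, between a payment squeeze and its delivery impact

def simulate(weeks=12, payment_cut_week=2):
    """Toy simulation: purchasing cuts vendor payments; production
    only sees the drop in on-time deliveries DELAY weeks later."""
    history = []
    pipeline = [1.0] * DELAY  # reliability of deliveries already in transit
    for week in range(weeks):
        # Cause: purchasing squeezes vendor payments from payment_cut_week on.
        reliability = 0.6 if week >= payment_cut_week else 1.0
        pipeline.append(reliability)
        # Effect: production experiences this reliability DELAY weeks later.
        on_time_rate = pipeline.pop(0)
        history.append((week, on_time_rate))
    return history

results = dict(simulate())
# The cut happens in week 2, but on-time delivery only drops in week 6 –
# exactly the time-and-space separation that hides cause from effect.
```

Because the simulator compresses months into moments, a manager can watch the week-6 effect follow the week-2 cause directly, instead of living through the gap that normally obscures the link.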
2. Along with data, always comes noise
The other aspect which interferes with clear thinking is ‘noise’. While you need information to make decisions, differentiating between relevant information and irrelevant noise is not easy. Sorting between the two is made harder by the fact that our perception can amplify noise and make it seem relevant, depending on the circumstances. Here’s an example. When you’re running late for a meeting and are slowed down by traffic, you are likely to notice every stop sign and every bad, congested patch of road. When you reach the meeting, you will complain about the horrid condition of the roads. However, had you started out early, you would have paid scant attention to these minor obstacles along the way.
Similarly, managers struggling with poor on-time delivery performance can perceive routine issues such as quality problems, absenteeism and machine breakdowns as major roadblocks. When orders are running late, every ‘bump on the road’ created by a quality issue or a machine breakdown can suddenly loom larger. Getting distracted into solving these noise-level issues can lead to the real problems being completely ignored: long queues of raw material waiting to be processed and the consequent wastage of buffers. The longer the wait at upstream work centres, the shorter the time available to deal with natural uncertainties at the downstream work centres. Managers are not unaware of the problems with waiting time, but while making decisions, the real issue often gets drowned out by the amplified noise of the minor ones.
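The arithmetic behind that upstream-downstream trade-off can be sketched in a few lines. The lead time and processing time below are hypothetical figures, used only to show how queue time silently consumes the buffer that absorbs downstream uncertainty.

```python
PROMISED_LEAD_TIME = 20  # assumed: days promised from order to delivery
PROCESSING_TIME = 8      # assumed: days of actual touch time across work centres

def downstream_buffer(upstream_queue_days):
    """Days left downstream to absorb breakdowns, absenteeism and rework."""
    return PROMISED_LEAD_TIME - PROCESSING_TIME - upstream_queue_days

print(downstream_buffer(4))   # → 8: a 4-day queue still leaves 8 buffer days
print(downstream_buffer(12))  # → 0: a 12-day queue leaves no buffer at all
```

With no buffer left, every routine breakdown now makes an order late, which is why the minor issues suddenly look like the root cause while the queue itself goes unexamined.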
3. Falsification, not verification, is the true test of a hypothesis
Taking the system view allows you to develop hypotheses that span space and time. Being wary of ‘noise’ helps sort the relevant information that goes into these hypotheses from the irrelevant. However, it is also important to beware of confirmation bias while testing these hypotheses. The best way to avoid it is to take the approach of falsification: actively looking for a contrary observation that would prove a hypothesis wrong. No number of supporting observations can prove a hypothesis right, so the best way to check it against reality is to subject it to tests of failure. If a hypothesis has survived many such tests, it can be assumed to be right for the time being. As managers, we have to develop the ability to attempt to disprove our hypotheses rather than trying to confirm them.
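Returning to the earlier vendor example, the falsification habit can be sketched as a search for counterexamples rather than confirmations. The delivery records and the hypothesis below are invented for illustration.

```python
# Hypothetical delivery log for the plant.
deliveries = [
    {"vendor": "A", "late": True},
    {"vendor": "A", "late": True},
    {"vendor": "B", "late": True},   # the counterexample hiding in the data
    {"vendor": "A", "late": False},
]

def falsify(hypothesis, observations):
    """Return the first observation that contradicts the hypothesis,
    or None if the hypothesis survives this round of testing."""
    for obs in observations:
        if not hypothesis(obs):
            return obs
    return None

# Hypothesis: "every late delivery is vendor A's fault".
counterexample = falsify(
    lambda d: d["vendor"] == "A" or not d["late"], deliveries
)
# counterexample is the late vendor-B delivery, so the hypothesis is
# rejected – despite three observations that appeared to confirm it.
```

The manager who only counts the confirming rows would “prove” the bad-attitude theory; the manager who hunts for the one contradicting row is forced to look for a systemic cause instead.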
If managers keep these three axioms in mind when making decisions, they can anticipate and avoid the traps of incorrect causality, amplified noise and confirmation bias. Without appreciating and practising these principles, data alone is at best a crutch and at worst a handicap in distinguishing reality from perception. By providing the right training to think methodically and analytically, organisations can harness the wealth of intuition in their managers and use available data to provide the right signals.
It is time they invested in thought-ware.