
Jennifer Webster

06 June 2016

Reducing the number of accidents and near misses, or simply moving off that performance plateau, can be incredibly hard to achieve.

Given the effort and resources it takes to put a robust health and safety management system in place, it can sometimes be difficult to admit that more needs to be done.

So what do you do? Do you ignore the fact that there is an issue, or do you make the situation appear less problematic than it actually is? Neither approach is recommended.

As a tactic, the latter option can have serious repercussions. We only need to think back to the Columbia space shuttle disaster for an example of how badly things can go when factual data, particularly new data, is ignored. According to the incident report, there were at least 79 missed opportunities to re-examine whether foam striking the shuttle’s left wing during launch could pose a problem on re-entry. Even when the evidence suggested it might, decisions were made on previously held assumptions. As human beings, we do not like to deal with the uncertainty or ambiguity that comes with the ‘what if’ or the ‘might happen’ moments. These situations make us feel as though we are not in control and make our decision-making harder.

Why is it sometimes easy to ignore the evidence?

So how do we explain why it is sometimes easier to create certainty out of ambiguity, or why we might ignore evidence that suggests an accident might happen? In order to do this, we have to look at heuristics. Heuristics are mental short-cuts or rules of thumb that reduce some of the cognitive effort we expend each day in processing information to make decisions.

In his book ‘Thinking, Fast and Slow’, Daniel Kahneman (2011) describes the way we think and reason through a dual-processing model: System 1 and System 2. System 1 thinking is automatic, intuitive and effortless, a legacy of evolution. System 2 is controlled, conscious and requires effort to maintain.

We constantly flit between the two because we need both systems to function. We use System 2 processes when we have to make an important decision, when a decision is relevant to us, or when the decision makes us personally accountable. However, maintaining this level of conscious System 2 thinking requires effort, so we create mental short cuts or rules of thumb (heuristics). In other words, we subconsciously look for ways of simplifying how we process information, but this can introduce bias into our thinking. It is just the way the human brain is designed, and even the most seasoned professional is vulnerable to these processes.

The way we think influences health and safety practices

Heuristics and cognitive biases influence our decision-making in numerous ways. For instance, the information we receive early on in our careers often has an undue influence on how we subsequently think about a situation. This is known as the ‘anchoring effect’. It is not unusual for people to continue to do a job the way they have always done it, especially if they have done the job for a long time. This is why it is so important to set the tone about what good health and safety looks like in schools and colleges, and to look at refresher training for those with long tenure. Another type of bias, confirmation bias, leads people to rely on evidence that fits with pre-existing beliefs or thoughts, so, as in the shuttle disaster, decisions are based on what we know from the past without considering the unexpected.

Decision-making in the corporate world

The corporate world’s expectations of certainty, control and accountability also influence how decisions are made. To go to a senior manager with a ‘what if’ or a ‘might happen’ is the death knell for many a career. Instead, it is easier to blame individuals or teams who do the job on the ‘shop floor’. The ‘fix’ is usually further training, more processes and procedures, or some penalty. Consequently, nothing much changes, but what this scenario creates is a self-perpetuating cycle in which the flow of information up the chain of command, the ‘what if’ or the ‘might happen’, begins to stall. Cover-ups become the norm. Worse still, the probability of a major incident occurring is played down, and the focus shifts to short-term solutions that only tackle the tip of the iceberg.

Widening out the conversation

This is a very brief overview of the complex world of decision-making, and there is a great deal more that is worth exploring in relation to occupational health and safety. Just raising awareness that we are all subject to heuristics and biases is a start. What we can also do is look at opportunities to increase participation in health and safety decision-making within organisations.

Jennifer Webster MSc, CPsychol, AFBPsS is an occupational psychologist at the Health and Safety Laboratory, Buxton.
