Primitive, emotional, irrational – not exactly how most of us would characterise the workplace or indeed our thought processes and behaviours in it, but the best literature from the field of human and organisational factors tells us these adjectives may be more applicable than we think.
Humans, it turns out, are not the paragons of logic we believe ourselves to be. When required to make complex decisions – especially under pressure – we self-righteously believe we apply our modern, intelligent “higher brains”, when in fact our emotional, “primitive brains” play a much bigger part than we are aware.
With almost shocking predictability, we fall foul of a bewildering range of mental traps known as heuristics and biases.
It is these “mental rules of thumb”, rather than considered intellect, which rule many aspects of our working environment, from the workshop right to the boardroom – and we need to take action.
But what exactly is this field that throws up such controversial thinking and how do we learn from what it has to teach us?
At first glance, human and organisational factors appear highly theoretical and all-encompassing, but as a seasoned pragmatist I find two statements help define the topic.
The first is “the understanding of interactions between humans and other elements of a system”, followed by “the influence this has on organisational behaviours”.
If we then consider these human “elements” in terms of perception, cognition and behaviour, I feel we have taken the first steps in what Nobel Prize winner Daniel Kahneman refers to as “enriching the vocabulary”.
We begin to break down the psychological stages and mechanisms at play when employees make decisions, and we can start to develop systems with inbuilt tolerances for human error.
Human factors is not a new field of study. The Health and Safety Executive first published its guidance HSG48 – now titled “Reducing Error and Influencing Behaviour” – in 1989, and there is a wealth of excellent material on its website.
Step Change in Safety has a Human Factors Steering Group, which is “ready to assemble when necessary to address any specific issues which may arise” – high profile indeed.
However, it would be fair to say that the oil and gas industry is not leading the field in this area. In my experience, it would do well to take some lessons from other industries that design processes and provide training which allow for human error.
A good example of this is the CRM (crew resource management) training used within the aviation industry and currently being considered for the field of surgery.
An excellent opportunity to develop the dialogue and share understanding presented itself recently when the Society of Petroleum Engineers (SPE) hosted “Another Perspective on Risk”, a seminar on the human contribution to risk.
Opening the event, Lord Cullen made some very astute observations on the over-reliance on human intervention and the need for good system design to be error tolerant.
There was an impressive panel of non-industry speakers from both sides of the Atlantic, and at one point in the day a golden opportunity arose to discuss the human contribution to risk in the fields of both process safety and personal safety – two very different challenges. However, the debate appeared to focus on examples of the mismanagement of performance measures, and we lost an ideal chance to address the human contribution to these risks.
We consistently miss opportunities like these because we lack the critical understanding to ask the right questions.
For years, in order to understand root causes, we have been urged to ask “why?” (sometimes five times in a row).
The very question “why”, when applied to humans, suggests reason and conscious decision, whereas “how” addresses error mechanisms.
Yet the “availability heuristic” tells us we take mental shortcuts, judging the probability of something happening by how easily we can recall similar examples, while “confirmation bias” means we are programmed to argue away anything that doesn’t support our prejudice.
Would it be difficult to find evidence of these concepts in almost any risk assessment being done today?
Earlier in my career, I remember being told “people are our most important asset”, and today I see Investors in People (IIP) plaques and logos everywhere I go.
But my position now is that our investment in people should go beyond workplace satisfaction and career development, and focus instead on the vulnerability of individual employees as “elements” in our systems.
We should really understand how our employees make decisions and invest wisely around that.
Some organisations have a healthy, proactive approach to human factors, and some even consider the organisational aspects – I salute them.
Unfortunately, and to the apparent frustration of the competent and approval authorities, many others seem to be “too busy being busy!”
As a topic for a short article, human factors is far from simple yet ironically many of the strategic solutions which can be applied in our industry are simplicity itself.
I often see the question “Have you considered human factors?”, usually in risk assessments. I hope you can now share my wry smile when it is followed by a small box with a tick in it!
Dean Wiseman is a senior consultant at FQM Ltd, a health, safety, environment and quality consultancy and training organisation.