This conference is important in its focus on human risk.
It is a great privilege for me to give the opening address. I will, of course, be giving my personal comments, but illustrating them by reference to a number of major accidents in operations involving major hazards.
The Working Environment
In a safety investigation it is important to uncover the underlying factors which played a part in what happened. The board which investigated the loss of the Space Shuttle Columbia and its crew in 2003 made this remark: “When causal chains are limited to technical flaws and individual failures,” they said, “the ensuing responses aimed at preventing a similar event in the future are equally limited: they aim to fix the technical problem and replace or retrain the individual responsible. Such corrections lead to a misguided and potentially disastrous belief that the underlying problem has been solved.”
This brings me to the working environment, by which I mean the whole set-up in which operatives work – the physical design of the plant, the procedures, standards and practices with which they are expected to comply, and so on. In the event of an accident there may well be reasons for what otherwise could be written off as no more than the result of human error. The working environment, it seems to me, can have considerable influence on whether operatives make errors.
I take two well-known cases from the nuclear industry. The Chernobyl disaster in Ukraine in 1986 followed a runaway chain reaction and a large power surge. It illustrated the danger of designers leaving it to the operatives (whatever their faults may have been) to take corrective action, rather than designing an inherently safe solution. The Kemeny Commission which investigated the incident at the Three Mile Island nuclear plant in Pennsylvania, seven years before, found that the operating procedures which applied had been “at least very confusing and could be read in such a way as to lead the operators to take the incorrect actions they did”.
Personnel at the Esso gas plant at Longford, Australia, in 1998 restarted the pumping of warm lean oil to a heat exchanger, causing it to fracture and release hydrocarbon vapours and liquids. The training of the personnel had emphasized the knowledge they needed for the job, but not an understanding that the heat exchanger could not withstand low temperatures and thermal shock. The investigating commission rejected Esso’s contention that the accident was simply caused by the fault of personnel. The fact that none of those on duty at the time understood just how dangerous the situation was indicated a systemic training failure. Not even the plant manager understood the dangers of cold metal.
These examples, and no doubt others, demonstrate that the working environment may lead, or lend itself, to human error. It is up to management to see that the working environment does not do so. Part of that working environment is the place which is given to safety in the culture of an organisation.
Conflict of Priorities
There can be a conflict of priorities. At the oil storage depot at Buncefield in England in 2005, a sticking gauge and an inoperable high-level switch led to an overflow of petrol, the ignition of a vapour cloud, and a massive explosion and fire. The official investigation found that control room staff had limited control over the flow rates and timing of incoming fuel, and did not have sufficient information to manage its storage. Increasing throughput and a lack of engineering support had put more pressure on site management and staff. These pressures created a culture in which keeping the process operating was the primary focus, and process safety did not get the attention, resources and priority that it required.
Employees can be distracted by getting mixed messages from management about safety. A member of the board which investigated the Columbia Space Shuttle disaster in 2003 later wrote:
“Leaders must remember that what they emphasise can change an organisation’s stated goals and objectives. If reliability and safety are preached as ‘organisational bumper stickers,’ but leaders constantly emphasise keeping on schedule and saving money, workers will soon realize what is important and change accordingly.”
As regards the risk of a conflict over priorities, in my report on Piper Alpha, I said:
“Senior management must demonstrate to their organization…that safety is of the highest priority and that improvements in safety will, in addition to reducing injuries and incidents, result in improved business. The ‘noise around performance’ must be tempered to ensure it does not swamp the noise around safety.”
This means that throughout every organization there has to be effective leadership in safety, and that leaders must make their commitment to safety visible through their active involvement.
The implications of human behavior and attitudes are particularly significant on offshore installations where safety depends on the interaction of people who may come from different disciplines or employers. That puts a premium on collaboration and effective communications.
The Front Line
So far I’ve discussed the influence of the working environment in leading to errors. But it is also for management to see that the working environment is robust enough to allow for the fact that operatives are only human, with all the normal weaknesses, tendencies and limitations.
All operatives can be absent-minded, distracted, preoccupied, careless or poorly motivated. They can forget to take precautions, even when they have been fully trained and know the dangers. I take this example from a very different field, the rail accident at Clapham Junction in 1988. The accident happened after the signaling system showed a false signal. A signaling technician had failed to insulate a redundant wire. He was conscientious. He knew what he should do and wrongly thought he had done it. The author of the official report said: “Any worker will make mistakes during his working life. No matter how conscientious he is in preparing and carrying out his work, there will come a time when he makes a slip. It is those unusual and infrequent events that have to be guarded against by a system of independent checking of his work”.
Human factors can affect working practices. The Piper Alpha disaster was preceded by a number of shortcomings in the way in which the permit-to-work procedure was carried out. In my report I said that “over time there is an increasing probability that the procedure in practice will have departed from that originally laid down”. Hence, I said, management should pick up these changes in a timely way and decide what to do – such as modifying the system or providing additional training.
Not everything can be covered in advance by procedures or training, or picked up in due course by monitoring or auditing. Front-line supervisors can certainly play an important part by keeping an eye on what is happening and using their considerable hands-on experience on the spot, for example where unforeseen situations arise.
Personal Commitment to Safety
Supervisors can also form an important bridge between the management and the workforce. In my report on Piper Alpha I said that:
“It is essential that the whole workforce is committed to and involved in safe operations. The front-line supervisors are a key link in achieving that as each is personally responsible for ensuring that all employees, whether the company’s own or contractors, are trained to work safely and that they not only know how to perform their job safely, but are convinced that they have a responsibility to do so.”
Bringing home commitment to safety means not only creating motivation for good safety practices but also putting resources into the management of safety.
Safety awareness in the whole workforce is sometimes referred to as making them ‘error wise’, so that they can identify dangers and act accordingly. That calls for vigilance. That is where personal qualities do matter.
An incident, a near miss, an abnormality in the functioning of a plant, or a failure in communication could be the precursor to serious trouble. Safety awareness at all levels should guard against any tendency to tolerate, cut corners, forget, or fail to report, investigate and take lasting corrective action. That assumes, of course, a general commitment to safety which enables employees to report what has happened without fear of recrimination. So, how do we counter fear of reporting and reluctance to step forward to voice concerns, and cultivate a collaborative approach to risk?
Ultimately, the multiple barriers to prevent catastrophe include risk awareness, vigilance and real freedom to speak out.
I look forward to this conference providing insights into how better account can be taken of human risk in all aspects of offshore operations. I wish you all well in that endeavor.