Andy’s Almanac on Accidents, Part Eight: There’s Something in the Err

By Sphera’s Editorial Team | January 19, 2022

To err is human, but erring while working in a hazardous industry can be downright dangerous. Andy Bartlett explains the human factors part of the Operational Risk Management equation.

Listen to other episodes of Andy’s Almanac on Accidents.

 

The following transcript was edited for style, length and clarity.

James Tehrani:

Welcome to the SpheraNow ESG podcast, a program focused on safety, sustainability and productivity issues. I’m James Tehrani, Spark’s editor-in-chief. Today, we welcome back to the program Andy Bartlett, Sphera’s solution consultant for operational risk management, for part eight of Andy’s Almanac on Accidents. Today we’ll be discussing human errors—and I hope I don’t make any. Thank you so much for joining me today, Andy.

Andy Bartlett:

Thanks, James, for the introduction again. This is turning out to be a bit of fun. We started out planning to do three. Now we are on number eight, and I'm sure we'll have a few more in the bag this year.

James Tehrani:

Oh, definitely. Let's keep it rolling. I mean, we have a lot of momentum with 'Andy's Almanac.' It's been great getting to know you over the past couple of years. So before we begin: Human error is a broad subject, and in process safety we often refer to human factors. Can you explain that to the audience?

Andy Bartlett:

I think a basic explanation would be that poor human factors can lead to human error. Human error is what an individual or a team actually commits, and it always shows up as something somebody did or something they didn't do. Human factors are the reasons why those errors occur; 'human factors' is an umbrella term for the study of people's performance in a specific environment. Later on, we'll look at the model and explain it in more detail.

James Tehrani:

OK, great. Well, when we were planning this podcast, you shared a stat with me that I thought was pretty interesting: 99% of accidental losses, except for natural disasters, begin with a human error. Wow. That's pretty hard to believe. I mean, it's 99%. I understand you recently took part in a session at the Middle East CCPS conference. Can you briefly explain what that conference was and elaborate on the statistic you shared?

Andy Bartlett:

Yeah. This conference was organized by the American Institute of Chemical Engineers' Center for Chemical Process Safety, that's AIChE and CCPS. It's the Middle East Process Safety Conference, and this was the third one; I've been involved with all three of them. This last one was held online, and Sphera had two people speaking there. We had two papers accepted, so it was quite a good thing for the company. What CCPS is trying to do is share process safety learnings around the world. They hold these conferences in various places: the Far East, the USA and Europe. So it's good sharing, with the drive toward a zero-harm culture being the push.

So I had the privilege to share the session on human factors. William Bridges, a good friend of mine from the Process Improvement Institute, published a paper, 'Everything You Need to Know About Human Reliability for Process Safety,' where he mentioned that 99% of accidental losses, except for natural disasters, begin with a human error. Now, this data is supported by 50,000 investigations cataloged by PII plus hundreds of thousands of others, so he gave a good explanation of it. In the olden days, whenever there was an incident, they blamed the person. It was the train driver's fault, or the fault of the guy operating the shovel, or the fault of the man breaking containment. However, we know better today and, as we move on, we'll explain why.

James Tehrani:

Yeah. It almost sounds a little bit like in the medical profession where you hear, ‘First, do no harm.’ It’s almost similar to what you’re talking about here, I think.

Andy Bartlett:

Yes. Yes it is. We don’t want to hurt anybody on the job, and we don’t want to destroy facilities that would result in loss of jobs.

James Tehrani:

Yeah. OK. So were there any other highlights from the event regarding human error that you’d like to share with our audience?

Andy Bartlett:

Yeah. There were two other presenters. We'll start with Ibrahim Balharith of Saudi Aramco, who presented 'The Human Factor in Process Safety Management.' His point was that human error is a symptom of trouble deeper inside the system, which we touched on earlier. In my example, an incident I had to go and investigate, a permit to work was issued for furnace A, which was out of service for repairs, but the contractor receiving the permit to work actually went to work on furnace B, which was in service.

James Tehrani:

And how does an error like that take place?

Andy Bartlett:

Well, OK. We’ll go into the human factors for that in a minute.

James Tehrani:

OK.

Andy Bartlett:

So when he opened the flange, fuel gas escaped and was ignited by the furnace burner, causing burns to the contractor. An emergency shutdown was required to stop the fire, so we lost production and we injured a person. When you look at the human error model, there are four parts: slips, lapses, mistakes and violations. Under slips, the design of the pipe and equipment identification stenciling was not fit for purpose, which increased the likelihood of working on the wrong equipment. The lapse was the permit issuer not following site procedures, which required him to go to the furnace and say, 'This is the place I want you to work.' That's called a joint site visit with the contractor prior to the work starting. Then there was a mistake: The permit issuer made an error of judgment in not communicating the right location, actually pointing to the location from a distance.

James Tehrani:

Oh, I see.

Andy Bartlett:

Yeah. And then there was a violation, which is the fourth part of the human error model: a lack of supervision accountability and an employee ignoring training. When we looked deeper during the investigation, and there was a team of us, we found this particular person had several permit violations against him. He was a very lazy individual, and supervision had not held him to account. They had retrained him, but he hadn't listened to the training and had ignored it. So really, he shouldn't have been in that position issuing permits anymore. So that's the human error, isn't it? If you go back to the earlier question, every incident starts with a human error. The human error here was by the person issuing the permit. And I guess you could even say the person receiving the permit should have stood his ground and said, 'Look, I need you to come with me and show me exactly where you want me to work.'

James Tehrani:

I don't know why I have medical on my brain, but when you're having surgery on, say, your finger, they ask, 'Can you point to where the surgery is?' Even though they know where it is, they just want that reassurance that they're working on the right part of the body. It's almost similar to that.

Andy Bartlett:

Reinforcement of the command. Yeah. That was one. The other presentation was by Natasha Andrews: 'Planning for Human Factors Engineering in Projects.' That's quite interesting to me, because I had an example where we commissioned several gas plants. One of the things on a gas plant is that you have to take live samples while the plant is running. That's how you set up your automated sampling system: You take live samples to the lab, process them and see whether the automated system is giving you the same result. If it's not, then you have to adjust the automated system until it does.

It was put on the drawing: We need a sample point here. So when the people built the plant, they put the sample point where it was on the drawing. It just so happened it was up a ladder. So the person going to take the sample had to carry the equipment, which is a special spanner to undo the valve, and connect the piping to the sample container, which is normally referred to as a 'bomb' because it looks like one. And then you have to carry this full container back down the ladder. It's stainless steel, it's heavy and it's full of liquefied gas. If you dropped it, it would be like a bomb.

James Tehrani:

Wait, a sample bomb? What’s a sample bomb?

Andy Bartlett:

That’s what it’s called. It’s a container that looks like a bomb. It has a little valve at either end.

If it was carrying propane, which expands 11 to 1, quite a big explosion would happen. The first thing you do when you check out the plant is say, 'Well, that's no good. We're going to have problems taking that sample. We'll need to move the sample point somewhere else.' So human factors engineering means you want to make the task people are doing as easy and as safe as possible, not put them in a position where they could quite easily make an error by doing the job the way it was designed. That's the human factors part of it: You design out the possibility of human error. That's where that comes from.

James Tehrani:

I know you touched on this already, but can you go a little bit deeper into the human error model? I think that’s something that we really should focus on for this audience. I think that’s going to be interesting for them.

Andy Bartlett:

Yeah. If you remember what human error is, it's that someone did or didn't do something. You either do something the correct way or you don't; basically, you either do it or you don't do it. So a slip is something like a task designed wrongly, a lapse is when people don't follow the procedure, a mistake is when they make an error of judgment, and a violation is when they deliberately don't do something they should have done.

James Tehrani:

So the question I have for you is this: We have all this software, and obviously there's this fear of software, but with all this information in people's hands, why are these human errors still happening? It seems like there's so much information at their fingertips now that obviously wasn't available when you were starting your career in the 1970s, yet we're still seeing these kinds of things. Is it that people are ignoring the technology and information in front of them? Are they not using it? Are companies taking shortcuts?

Andy Bartlett:

Well, taking shortcuts is one of the human errors that happens, and it comes under violations. On the question you're asking me, I don't have figures showing whether human errors went down at companies that moved from paper-based systems to technology. I would think so, because it's difficult to make some of these human errors using technology. Take, for example, using mobile equipment to issue a permit: You know exactly where that person is and that they are in the right place. You know whether they did the joint site inspection, and when they went to issue the permit, you know that they followed the risk assessment that came with the permit because they've signed off to say they have done it.

Whereas with a paper system, they could just sit in an office and tick all the boxes, which is what happened in this particular incident. The person ticked all the boxes in the office and then left, and the fire actually reached the office where the permit issuer had been. So the question is: Is technology helping us to reduce human error? In my opinion, I would say yes. Do I have the statistics? Well, how many companies do we know who've looked at it that way? Reducing human error, reducing incidents, that's why they want to spend the money, right?
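[Editor's note: As an illustration of the kind of check Andy describes, here is a minimal sketch, in Python, of how an electronic permit-to-work record might enforce the location check, the joint site visit and the risk-assessment sign-off before a permit can be issued. The field names, the can_issue helper and the GPS tolerance are hypothetical and are not taken from Sphera's software or any other specific product.]

# Hypothetical sketch of an electronic permit-to-work check (not a real product's API).
from dataclasses import dataclass

@dataclass
class PermitToWork:
    equipment_tag: str            # e.g., "FURNACE-A"
    issuer_location: tuple        # (latitude, longitude) captured from the mobile device
    equipment_location: tuple     # (latitude, longitude) of the equipment on the plot plan
    joint_site_visit_done: bool   # issuer and contractor walked the job site together
    risk_assessment_signed: bool  # issuer signed off on the risk assessment

def within_tolerance(a: tuple, b: tuple, tol: float = 0.0005) -> bool:
    """Rough check that two coordinates are close enough to count as the same location."""
    return abs(a[0] - b[0]) <= tol and abs(a[1] - b[1]) <= tol

def can_issue(permit: PermitToWork) -> bool:
    """The permit can only be issued at the equipment, after the joint visit and sign-off."""
    return (
        permit.joint_site_visit_done
        and permit.risk_assessment_signed
        and within_tolerance(permit.issuer_location, permit.equipment_location)
    )

# Example: an issuer ticking the boxes from the office would fail the location check.
permit = PermitToWork(
    equipment_tag="FURNACE-A",
    issuer_location=(26.3000, 50.2000),      # sitting in the office
    equipment_location=(26.3050, 50.2100),   # furnace A on the plot plan
    joint_site_visit_done=True,
    risk_assessment_signed=True,
)
print(can_issue(permit))  # False: the issuer is not at the equipment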

James Tehrani:

Sure. Well, I know you were intricately involved in Sphera's recent safety report, which came out about a month before this recording. Are there any highlights from that report that you think are applicable to human error that you'd like to share?

Safety Report 2021
While Environmental, Social and Governance (ESG) goals are of prime importance, safety makes for a resilient and sustainable business model.

Andy Bartlett:

Yeah, well, I went and looked at the U.K. HSE, which issued a report on safety culture. It refers to the inclination of employees to comply with rules or to act safely or unsafely, which is the human error part. However, they found that the culture and style of management is even more significant. For example, there's the natural unconscious bias for production over safety, and we hear a lot about unconscious bias nowadays, plus the tendency to focus on short-term wins and to be highly reactive to what's happening: Let's fix it now and get on with it. Whereas sometimes it's better to take the long-term view and say, 'OK, this keeps happening. We can't live with this. We need to take a shutdown and fix it,' which costs money and, of course, loses production.

When you look at Sphera's latest safety survey, it has some interesting findings along these lines. It says 60% of respondents want to reduce their risk exposure. Forty-eight percent, up from 40%, want to comply with all regulations, which I would think they want to do because they could be fined if they don't. And 37% stated that it is now a corporate board priority to move safety in the right direction. For a strong safety culture, an organization must engage its frontline workers to complete safety activities on a regular basis, so that the importance given to health and safety guidelines makes itself felt in operations schedules and practices.

So what we're seeing is that the people on the front line are the ones who make, or don't make, the mistakes. We don't hear about it when they do things right, because production is running, the product is being sold, money's coming in and the stock price is stable. However, when there is an incident and you lose production, even if it was a bad decision by management, in the past it has normally been put down to a bad decision by the guy on the front line. That's the human error, but the human factor is, 'Let's see, what systems were in place that didn't prevent that person from making the error?'

And that's what we need to look at. With technology, we can help people not make errors, and we can also see errors being made as risk levels rise in the facility, so management can then say, 'OK, why is that risk rising in that particular unit, that particular organization? We need to go and look and fix it.' So safety culture is all about management and the workforce working together not to make bad decisions.

James Tehrani:

And that’s part of a HAZOP, right? When you’re doing a HAZOP study, you’re looking at what could happen if this pipe was out of service or things like that?

What Is a HAZOP?
A HAZOP is a systematic assessment tool used to identify and address potential hazards before an incident occurs.

Andy Bartlett:

Well, a HAZOP study looks at the other side of things: too much or too little temperature, pressure, flow, etc. You do a HAZOP study and you look at all the ways the plant can run. But I'm more inclined to look at day-to-day safety, where we have a job safety analysis that says, 'We're going to do this job. What are the potential hazards that could occur, and what controls need to be in place?' And of course, the human error would be that you've mapped all this out and they don't follow the controls. They take shortcuts, and that's where incidents happen. With the JSA, either you missed the hazard, or you missed the control, or they didn't put in place the control that was supposed to be there. With a paper-based system that's easily done. However, with the electronic technology we've got today, you can manage that a lot better.

James Tehrani:

Sure. I mean, it's human nature to try and find the quickest way to do things sometimes. And when it comes to safety, the quickest way is not usually the best way.

Andy Bartlett:

Now, what happens is people get away with doing things the wrong way; it becomes a habit, it becomes the norm, and the procedure itself is not followed. What I have seen is a young trainee comes in, and an older employee goes through the training manual with them as mentor and mentee. The young person has had the training: 'OK, I've got to do this this way.' And the older employee says, 'Don't bother with that way. We've always done it this way. It's quicker. Don't worry about it.' Then the older person retires, the young person takes over and doesn't know all the other things that can go wrong. They make the mistake of doing it the wrong way, but they don't have the experience to correct it when it goes wrong.

Actually, going back to my days in the chemical plant, a guy on the shift before me took a shortcut. I came on shift and it was all going wrong, and I had no idea how to fix it because what I'd been taught wasn't happening. I had to go and get the supervisor to come out and show him, 'Look, I've got all these things going wrong.' He said, 'Ah, that's because he's done that and he's done that.' And I said, 'Well, nobody ever showed me to do that.' 'No, you're not supposed to do it that way. It's just a quick way of doing it.' So that does happen. But with technology, hopefully we will iron out those cracks in the system.

James Tehrani:

I think we've talked about this before with shift handoffs: how dangerous it can be when you have somebody who's worked, say, an eight- or 10-hour shift. They're tired, they want to get home, and they don't necessarily relay all the information to the next person coming on the job. That's where it can become really dangerous.

Andy Bartlett:

Yeah. I always remember some of my shift handovers. I was coming on shift and the guy going off would say, ‘Everything as usual.’ And that was it.

James Tehrani:

That's not very helpful, is it?

Andy Bartlett:

No, no. And especially when everything wasn't 'as usual' that particular time. The guy got a warning and he got demoted. And they blamed me for not knowing the wrong way to do the job.

James Tehrani:

You can’t blame Andy Bartlett. Well, Andy, this was a pleasure. Were there any final thoughts you wanted to add about human errors before we end this program?

Andy Bartlett:

No, I think it's something we build into our programs as we develop them: ways to try to reduce human error and provide the information to keep people on the right track. That's one of our reasons for being.

James Tehrani:

Fantastic. Well, that's another great episode of 'Andy's Almanac on Accidents.' I can't wait for the next one, and I'll talk to you soon. Thank you so much.

 

 

 
