Human Factors

Errors In Diving Can Be Useful For Learning — ‘Human Error’ Is Not!

Errors are normal and can help us learn, as long as we understand the kind of error we made and how it made sense for us at the time. Conversely, simply attributing the cause of a diving accident to ‘human error’ is practically useless. Here Human Factors Coach Gareth Lock examines the various types of errors and violations and, importantly, the influence of the system in which they occur, and explains why simply citing ‘human error’ ain’t enough.

Text by Gareth Lock. Images by G. Lock unless noted. The Human Diver is a sponsor of InDepth.

The title of this article might appear to be an odd statement because there is a conflict; however, that conflict is intentional. There is a difference between the term “error,” which can help us learn, and the attribution of ‘human error’ that people apply to the cause of an accident, which doesn’t help anyone. In fact, if you see ‘human error’ or ‘diver error’ or ‘human factors’ cited as the cause of an adverse event, then the investigation stopped too early, and the learning opportunities will be extremely limited.

What is an error?

A few simple definitions might be:

  • the outcomes from a system that didn’t deliver the expected results
  • an unintended deviation from a preferred behaviour
  • something that is not correct; a wrong action or statement

To a certain extent, the precise definition of an error isn’t that important because whatever definition we use, we must recognise that ‘errors’ are never the cause of an accident, incident, or an adverse event, rather they are indicative of a weakness or failure somewhere in the wider system. If you are wondering what ‘system’ means in this context, it will be covered in the next section.

One of the key points to recognise is that an error can only be defined after an event. The reason is that if we knew we were going to make a mistake, a slip, or have a lapse, we would have done something about it to prevent the negative behaviour/event from occurring! The terms mistake, slip, and lapse were used deliberately, as these are recognised terms in the science of human factors that describe the different types of variability in our performance.

Some Definitions

  • A mistake is where you’ve done the ‘wrong’ thing but thought it was correct. Examples could be teaching an old skill because you weren’t aware that the training materials had changed, entering a wreck via the ‘wrong’ doorway thinking it was the correct one, or surveying the ‘wrong’ part of the reef because some cues/clues for the location had been misinterpreted.
  • A slip is an unintended action. This could be cross-clipping a bolt snap, writing the wrong gas analysis figures down because you transposed two numbers, or putting the diver tally number on the wrong peg on a boat board.
  • A lapse is when we forget something. This could be forgetting the additional weight pouch needed for saltwater, not doing up the drysuit zip, or not going through the process to count all the divers back on the boat.

By breaking ‘errors’ down into these categories, we can develop mitigations to reduce the likelihood of them occurring on dives. For mistakes, buddy checking, co-teaching, QC/QA processes, and debriefs all help us recognise when something isn’t quite right. For slips, we can design equipment and processes so that it is harder to do the wrong thing. For lapses, we can use a checklist, buddy check, or physical prompt/modification to reduce the likelihood that critical items are forgotten before it is too late. We can also look to modify the ‘system’ so that we are less likely to make an error, or if we do, we catch it before it is critical.

There is also an additional classification of errors called ‘violations’, which is where a rule of some sort is present and has not been followed. Violations are broken down into different categories too.

Photo by Barbara Leatham
  • Situational violation is where the context makes breaking the rule the ‘obvious’ answer. This could be where a dive center manager tells an instructor to break the standards for commercial reasons or risk losing their job, where a diver forgets their thermal undergarments and dives anyway because of the time/money invested, or where a wreck is penetrated without a line because the visibility appears fine.
  • Routine violation is where it is normal to break the rules and potentially where it is socially more difficult to comply. Examples include not using written pre-dive checklists on CCR because no one else does, not analysing gas prior to a dive because no one else does, or going below gas minimums because the boat crew make fun of divers who surface with ‘too much gas’.
  • Exceptional violation is where breaking the rule is potentially safer than compliance. This might be rescuing someone below the MOD of the gas being breathed or going below gas minimums because you were cutting someone from an entanglement.
  • Recklessness is where there is no thought or care for the outcome. In hindsight, many believe this is present in some cases. However, in my opinion, if divers genuinely thought that their dive would end in a serious injury or fatality, they wouldn’t do it. As such, what looks like recklessness usually points to weaknesses in education or self-awareness.

With the exception of recklessness, violations also provide opportunities for organisational learning. What is it about the rule that made it harder to follow? If rules are consistently being broken, it is unlikely to be a problem with the person, but rather a situational or contextual issue that needs to be addressed. Investigations shouldn’t stop at the rules that were broken, as those rules were likely broken before without an adverse event occurring.

Terms like ‘loss of situational awareness’, ‘complacency’ and ‘poor teamwork’ are just other ways of expressing a general term like ‘human error’. 

All these types of ‘errors’ within the system provide learning opportunities, but only if they are examined in detail and reflected upon to understand what was going on at the time.

What do you mean by ‘system’?

A system is a mental construct or idea used to describe how something works in an environment. In human factors terms, this is where humans are involved with technology, with paperwork, with processes, within physical, social, and cultural environments, and with other people. A diver who is on a boat using open circuit SCUBA, about to enter the water with their teammate to dive a wreck at 50 m/165 ft, is part of a system. The system they are within contains:

  • Equipment designed and tested against certain standards.
  • Compressed gas from a fill station which follows protocols that make it safe to breathe.
  • Training from multiple instructors using training materials from different agencies following high-level standards from a body like the RSTC or RTC, each of which has a QA/QC process in place.
  • Protocols from the boat’s captain based on national and local maritime requirements.
  • A dive computer that has decompression algorithms developed and refined over time.
  • A cultural and social environment that influences how to behave in certain circumstances.
  • A physical environment consisting of surface and underwater conditions.
  • A wreck description that was drawn on a whiteboard on the boat, with a brief delivered by a guide explaining what is where and when to end the dive.
  • Personal and team goals/drivers/constraints which have to be complied with or are shaping decisions.
  • Multiple other divers and dive teams on the same boat who have their own needs, training, goals/drivers/constraints.
  • Human bodies with their own individual physical, physiological and psychological requirements and constraints.
  • The boat crew and their competencies and experience.

This list is by no means exhaustive but gives you an idea of all the ‘things’ that go into a system and those ‘things’ that might have weaknesses within them. Addressing these weaknesses and developing strengths are respectively what the science of human factors and resilience engineering are about.

When something goes wrong, why shouldn’t we use ‘human error’ as a cause?

There are a few problems with the term ‘human error’ as a cause.

  • Firstly, it is a bucket into which we can put all the different sorts of performance variability without understanding what we can do to prevent future events. We will all make errors, so telling divers to ‘be careful’, ‘pay more attention’, or ‘be safe’ doesn’t help identify the factors that led to the errors. These are the error-producing or latent conditions that are always there but not always together and don’t always lead to a problem: e.g., time pressures, incomplete briefings, equipment modifications, equipment serviceability, inadequate communications, flawed assumptions, and so on.
  • Secondly, due to the fundamental attribution bias, we tend to focus on the performance of individual divers or instructors rather than look at the context, the wider system, in which they were operating. As there isn’t a structured way of investigating adverse events in the diving industry (coming soon!) or even a standard framework for working out ‘how it made sense to those involved’, divers often jump to the conclusion that the cause was just a part of being human and was ‘human error’ or ‘the human factor’. This is far from the case, as you can see in the ‘If Only…’ documentary.
  • Finally, adverse events like accidents, incidents, and near-misses are the outcomes of interactions within a system, and we can’t deconstruct the problem into single failures of individuals, hoping to ‘fix’ them. If that were the case and the problem were just down to errant individuals, we’d have far more adverse events than we do, because people are broadly wired the same way. Humans, as well as being a critical factor when it comes to failure, are also the heroes in many cases when the system they are within doesn’t provide all of the information needed to be ‘safe’. As such, we need to look at the wider system elements and their relationships to understand both success and failure.

We want divers to make errors…

Conversely, there are times when we don’t mind people making errors (slips, lapses, mistakes, and some types of violations) during a training course or even during exploration. In fact, in some cases, it is to be encouraged. The reason is that errors provide opportunities for learning. If we never make an error, we will never improve. Innovation means we are pushing boundaries against uncertainty, and with uncertainty comes the possibility that we end up with an outcome that wasn’t expected or wanted. By using trial and error, we get a better idea of where the boundary is. Self-discovery is one of the most powerful learning tools we have because there is personal buy-in. Errors in training also help us solve problems in the real world, because the real world rarely follows fixed boundaries, and it is much better to fail in the training environment where you have an instructor with you to provide a safety net. Note that instructors have to have a high level of skill and experience to act as this safety net!

Even when we don’t want divers or instructors to make errors, there will be times when things don’t go to plan. Experts are continually monitoring and adapting their performance, picking up small deviations and correcting them before they become a safety-critical event. This is why it is important to focus on and fix the small stuff before it snowballs and exceeds your capacity to manage the situation.

Fellow Human Diver instructor Guy Shockey uses the analogy of a Roomba (automated, robotic hoover) to explain this concept during his Fundamentals, Technical and Rebreather classes. The Roomba doesn’t know anything about its environment. However, over time, as it bounces its way around the house, it finds the locations of the walls, chair legs, table legs, doorways and other obstacles and creates a map of the boundaries. However, humans still have to help it when it comes to stairs as the top step doesn’t have a boundary, and it falls down! The ‘system’ solution is to put a ‘virtual wall’ in place so that the wall sensors can detect where the drops are. 

What we are looking to develop during training, fun diving, and exploration is resilience. This is the capacity to adapt our performance and skills to deliver positive results when situations are changing around us, even to the point where we have an adverse event, but fail safely. Resilience provides us with:

  • the ability to anticipate what might happen (positively and negatively),
  • the ability to identify and monitor the critical factors to ensure safety,
  • the ability to respond to what is going on around us, and finally,
  • the ability to learn from what has happened in the past and apply it to future dives.

Resilience is developed from direct experience and from learning from others’ activities and outcomes. We shouldn’t just focus on learning from negative outcomes, though; we also want to know what makes positive ones happen.

Note, if instructors are only taught to follow a standard ‘script’ of what to do when, then when they or their students encounter something novel, they are more likely to have an unexpected outcome. That outcome might be ‘lucky’ (good) or ‘unlucky’ (bad). Experience helps stack the odds in our favour by building resilience. At an organisational level, sharing near misses and workarounds amongst instructors provides other instructors with this learned knowledge so that they are more resilient when they teach. Resilience doesn’t just exist at the individual level, it applies at the system level too.

In Conclusion

Errors are normal, and they can help us learn as long as we understand what sort of error we made and how it made sense for us to do what we did. This reflection isn’t easy because it requires effort. It also requires us to understand how we create a shared mental model within our team of what is happening now and what is likely to happen in the future (the key goal of non-technical skills development programmes). 

We also need to understand the influence the system has on our behaviours and actions and look further back in time and space to see what else was present. Therefore, if we genuinely want to understand what led to the adverse events, we should spend time looking at what ‘normal’ looks like, the conditions that are normally present and what divers and instructors have to deal with, not just those that were present at the time of the adverse event. Accidents and incidents occur as deviations from normal, not just deviations from standards.

While ‘human error’ might be an easy bucket to throw the variability of human performance and its associated outcomes into, the attribution rarely improves safety or performance because it doesn’t look at the rationale behind the performance or the context associated with it. For that, we have to dig deeper and understand local rationality.


Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 350 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture.  

Diving Safety

The Cause Of The Accident? Loss Of Situation Awareness.

Because of our innate biases, it’s easy to blame an individual for causing an adverse event, even if that individual made good choices. In fact, according to research, in the majority of adverse events, individuals make good decisions informed by incomplete information and NOT the other way around. As Human Factors coach Gareth Lock explains, if we want to improve our outcomes, we need to understand how decisions or choices are made and how “situation awareness” fits into this. Mr. Lock proceeds to lay out his case.

Text by Gareth Lock. Images courtesy of G. Lock. Note that The Human Diver is a sponsor of InDepth.

The title of this blog comes from the numerous reports I have seen attributing the cause of an accident to a loss of situational awareness. This could be a direct statement like the title, or it could be something like ‘they should have paid more attention to…’, ‘they didn’t notice they had drifted off the wreck/reef’, ‘they weren’t focusing on their student’s gas remaining’, ‘they hadn’t noticed the panic developing’, or, one of the most prevalent, ‘it was obvious, the diver/instructor just lacked common sense’.

How many times have you heard or read these sorts of statements?

The problem is that it is easy to attribute causality to something after the event, even if it is the wrong cause! In a recent blog on The Human Diver site, a number of biases were explained that make it easy to blame an individual for the cause when, in fact, there are always external factors present that lead those involved to make ‘good choices’ based on their experiences, skills, and knowledge, yet those choices can still lead to a ‘bad outcome’.

Previous research has shown that the majority of adverse events are not due to ‘bad decisions’ or ‘bad choices’ with ‘good information’, rather they are ‘good decisions’ informed by incomplete information. If we want to improve the decision making of divers, we need to understand how decisions or choices are made and how situation awareness fits into this. You might notice that I have used situation awareness instead of the more common situational awareness. The reason is based on language – you can’t be aware of ‘situational’ but you can be aware of the situation.

Mental Models, Patterns and Mental Shortcuts

Our brains are really impressive. We take electrical signals from our nervous system, convert these into ‘something’ which is interpreted and matched against ‘patterns’ within our ‘memory’, and then based on memories of those experiences, we tell our bodies to execute an action, the results of which we perceive, and we go through the process again.

What I have described above is a model. It approximates something that happens, something that is far more complex than I can understand, but it is close enough to get the point across. Data comes in, it gets processed, it gets matched, a decision is made based on experiences, goals and rewards, and I then do something to create a change. The process repeats.

Models and patterns allow us to take mental shortcuts, and mental shortcuts save us energy. If we have a high-level model, we don’t need to look at the details. If we have a pattern that we recognise, we don’t have to think about the details, we just execute an action based on it. Think about the difference between the details contained within a hillwalker’s map, a road map and a SatNav. They have varying levels of detail, none of which match reality.

The following example will show how many different models and patterns are used to make decisions and ‘choices’.

Entering the Wreck

A group of three divers is swimming alongside an unfamiliar wreck, one which is rarely dived. They are excited to be there. The most experienced diver enters the wreck without checking that the others are okay, and the others follow. As the passageway is relatively open, they do not lay a line. They start moving along a passageway and into some other rooms. As they progress, the visibility drops until the diver at the back cannot see anything.

The reduced visibility was caused by two main factors: 

  1. Silt and rust particles falling down, having been lifted from the surfaces by exhaled bubbles.
  2. Silt stirred up from the bottom by poor in-water skills (i.e., finning technique, hand sculling, and buoyancy control), which stayed suspended due to a lack of current.

The divers lose their way in the wreck. Through luck, they make their way back out through a break in the wreck they were unaware of.

This story is not that uncommon, but because no one was injured or killed, it likely doesn’t make the media, thereby preventing others from learning from this event. Publicity or visibility of the event is one thing; learning from it is another.

They Lost Situation Awareness in the Wreck

We could say that they lost situation awareness in the wreck. However, that doesn’t help us learn because we don’t understand how the divers’ awareness was created or how it was being used. Normally, if such an account were to be posted online, the responses would contain lots of counterfactuals – ‘they should have…’, ‘they could have…’, ‘I would have…’ These counterfactuals are an important aspect of understanding situation awareness and how we use it, but they rarely help those involved because they didn’t have the observer’s knowledge of the outcome.

Toward a Theory of Situation Awareness in Dynamic Systems. Endsley. 1995.

Our situation awareness is developed based on our experiences, our knowledge, our learnings, our goals, our rewards, our skills, and many other factors relating to our mental models and the pattern matching that takes place. Our situation awareness is a construct of what we think is happening and what is likely to happen. Crucially, it does not exist in reality. This is why situation awareness is hard to teach and why significant reflection is needed if we want to improve it.

The Mental Models and Patterns that were Potentially Matched on this Dive

As described above, our brains are really good at matching patterns from previous experiences, and then coming up with a ‘good enough’ solution to make a decision. The following shows a number of models or patterns present in the event described above.

  • “the divers are excited…” – this reduces inhibitions and increases risk-taking behaviours. Emotions heavily influence the ‘logical’ decisions we make.
  • “the most experienced diver enters the wreck…” – social conformance and authority gradient. How easy is it to say no to progressing into the wreck? What happened the last time someone said no to a decision?
  • “passage is relatively open…” – previous similar experiences have ended okay, it takes time to lay line, and time is precious while diving. We become more efficient than thorough.
  • “diver at the back cannot see anything.” – the lead diver’s visibility is still clear ahead because the silt and particles issues are behind them.
  • “exhaled bubbles…poor in-water technique…” – none of the divers had been in a wreck like this before where silt and rust particles dropped from the ceiling. Their poor techniques hadn’t been an issue in open water where they could swim around or over the silting. They had never been given guidance on what ‘good’ can look like.
  • “they lose their way…” – they have no patterns to match which allows them to work out their return route. The imagery when looking backwards in a wreck (or cave) can be different from looking forward. They laid no line to act as the constant in a pattern.

I hope you can see that much of the divers’ behaviours were based on previous experiences and a lack of similar situations. The divers didn’t have the correct matching patterns to help them make the ‘good’ decisions they needed to. Even if they did have the patterns for this specific diving situation, was there the psychological safety that allowed the team members to question going into the wreck, or signal that things were deteriorating at the back before the visibility dropped to almost zero?

As described in the recent article, ‘Challenger Safety’, team members will look to see if learner safety is present before pushing the boundaries. Learner safety is where it is okay to make a mistake; this could be a physical mistake or a social one.

Staying ‘Ahead’ of the Problem

The following three models from David Woods (Chapter 3, Cognitive Systems Engineering, 2017) show how we can think about how a diver deals with an incident as it develops. Each incident is not one thing, it involves multiple activities that need to be dealt with in a timely manner. If they aren’t dealt with, there is a potential that the diver is trying to solve an old problem when the situation has moved on.

The image above shows how we used to think people dealt with problems. There was a single diagnosis of the problem, and the operator would start to align their thoughts and actions with ‘reality’ based on the feedback they were receiving.

This second image shows how a ‘perfect’ operator would track the changes as an incident developed and adapt their behaviour as a consequence. There would still be a lag, but the problems would be dealt with.

However, this image shows what happens when the operator falls behind the adaptation process, often still trying to solve an old problem, or lacking the models/patterns needed to pick up the changes and track the developing issue more closely.

Consider how this would apply to the divers in the wreck penetration scenario above. Accidents and incidents happen when those involved lose the capacity to adapt to the changes occurring before a catastrophic situation occurs. 

Divers holding a pre-dive briefing at the EDGE 2.0 workshop in Vancouver Island. Photo by Gareth Lock.

Building Situation Awareness

If situation awareness is developed internally, how do we improve it?

  • Build experience. The more experience you have, the more patterns you have to subconsciously match against. This means you are more likely to end up with an intentionally good outcome rather than a lucky one.
  • Dive briefs. Briefs set the scene so we understand who is going to do what and when, what the limits are for ending the dive, and the potential threats and enjoyable things to see. Dive briefs also help create psychological safety, so it is easier to question something underwater.
  • Tell stories. When things go well and when they go not so well, tell context-rich stories. Look at what influenced your decisions. What was going on in your mind? What communication/miscommunication took place?
  • Debriefs. Debriefs are a structured way of telling a story that allows team members to make better, more-informed decisions the next time around. They help identify the steps within an event that lead to capacity being developed.

Can We Assess Situation Awareness during Dives or Diver Training?

The simple answer is no, not directly.

As described above, situation awareness is a construct held by individuals, even when operating as a team, with the team model being an aggregation of the individuals’ models. The goal of The Human Diver human factors/non-technical skills programmes is to help divers create a shared mental model of what is going on now and what is going to happen, and to update it as the dive (task) progresses so that the ‘best’ decisions are made.

In the water, we can only look at behaviours and outcomes, not situation awareness directly. We can observe the outcomes of the decisions and the messages being communicated, which are used to share individual divers’ mental models with their teammates. This means that for the instructor to comprehend and then assess the ‘situation awareness’ behaviours/outcomes displayed by the team, the instructor must have a significant number of patterns (experiences) so that they can make sense of what they are perceiving. The more patterns, the more likely their perception will match what is happening in front of them and that their decision is ‘good’. Those patterns are context-specific too! An instructor who is very experienced in blue water with unlimited visibility will have a different set of patterns from a green-water diver with limited visibility, or a recreational instructor compared to a technical instructor.

The best way to assess how effective situation awareness was on the dive is via an effective debrief which focuses on local rationality. How did it make sense for you to do what you did? What were the cues and clues that led you to make that decision? Cues and clues that would have been based on experience and knowledge. Unfortunately, a large percentage of debriefs that are undertaken in classes focus on technical skill acquisition and not building non-technical skills.

Summary

Situation awareness is a construct, held in our heads, based on our experiences, training, knowledge, context, and the goals and rewards we are working toward. It does not exist in reality. You cannot lose situation awareness, but your attention can be pointing in the ‘wrong’ direction. You are constantly building up a mental picture of what is going on around you, using the limited resources you have, adapting the plan based on the patterns you are matching and the feedback you are getting. Therefore, to improve, you must learn what is important to pay attention to and how you will notice it. It isn’t enough to say, “Pay more attention.” We should be looking for the patterns, not the outcomes.

If we want to improve our own and our teams’ situation awareness, i.e., the number of, and the quality of, the patterns we hold in our brains, we need to build experience. That takes time, and it takes structured feedback and debriefs to understand not just that X leads to Y, but why X leads to Y, and what to do when X doesn’t lead to Y. The more models and patterns we have, the better the quality of our decisions.

