Diving Safety
The Cause Of The Accident? Loss Of Situation Awareness.
Because of our innate biases, it’s easy to blame an individual for causing an adverse event, even if that individual made good choices. In fact, according to research, in the majority of adverse events, individuals make good decisions informed by incomplete information and NOT the other way around. As Human Factors coach Gareth Lock explains, if we want to improve our outcomes, we need to understand how decisions or choices are made and how “situation awareness” fits into this. Mr. Lock proceeds to lay out his case.
Text by Gareth Lock. Images courtesy of G. Lock. Note that The Human Diver is a sponsor of InDepth.

The title of this blog comes from the numerous reports I have seen attributing the cause of an accident to a loss of situational awareness. This could be a direct statement like the title, or it could be something like ‘they should have paid more attention to…’, ‘they didn’t notice they had drifted off the wreck/reef’, ‘they weren’t focusing on their student’s gas remaining’, ‘they hadn’t noticed the panic developing’, or, one of the most prevalent, ‘it was obvious, the diver/instructor just lacked common sense’.
How many times have you heard or read these sorts of statements?
The problem is that it is easy to attribute causality to something after the event, even if it is the wrong cause! In the recent blog on The Human Diver site, a number of biases were explained that make it easy to blame an individual for the cause when, in fact, there are always external factors present that lead those involved to make ‘good choices’ based on their experiences, skills, and knowledge – choices that still lead to a ‘bad outcome’.
Previous research has shown that the majority of adverse events are not due to ‘bad decisions’ or ‘bad choices’ made with ‘good information’; rather, they are ‘good decisions’ informed by incomplete information. If we want to improve the decision making of divers, we need to understand how decisions or choices are made and how situation awareness fits into this. You might notice that I have used situation awareness instead of the more common situational awareness. The reason is based on language – you can’t be aware of ‘situational’, but you can be aware of the situation.
Mental Models, Patterns and Mental Shortcuts

Our brains are really impressive. We take electrical signals from our nervous system, convert these into ‘something’ which is interpreted and matched against ‘patterns’ within our ‘memory’, and then based on memories of those experiences, we tell our bodies to execute an action, the results of which we perceive, and we go through the process again.
What I have described above is a model. It approximates something that happens, something that is far more complex than I can understand, but it is close enough to get the point across. Data comes in, it gets processed, it gets matched, a decision is made based on experiences, goals and rewards, and I then do something to create a change. The process repeats.
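For those who find code a useful way to think about models, here is a minimal, hypothetical sketch of that perceive-match-decide-act loop. Everything in it (the toy ‘memory’, the cues, the actions) is invented for illustration; it simply shows how a ‘good enough’ pattern match drives the action, and how having no matching pattern leaves you with little to act on.

```python
# A minimal, illustrative sketch of the perceive -> match -> decide -> act loop
# described above. The 'memory', cues, and actions are invented for illustration;
# this is a simplification, not a validated cognitive model.

def best_match(cues, memory):
    """Return the stored pattern sharing the most cues with what was perceived, if any."""
    candidate = max(memory, key=lambda p: len(cues & p["cues"]), default=None)
    if candidate and cues & candidate["cues"]:
        return candidate
    return None                                        # nothing recognised

def decision_loop(observations, memory):
    for cues in observations:                          # data comes in from the senses
        pattern = best_match(cues, memory)             # matched against stored patterns
        if pattern:
            action = pattern["action"]                 # a 'good enough' response from experience
        else:
            action = "slow down and gather more information"   # no pattern to lean on
        print(f"perceived {sorted(cues)} -> {action}")

# Toy 'memory' built from previous experiences: each pattern links cues to an action.
memory = [
    {"cues": {"low visibility", "overhead environment"}, "action": "hold the line, slow down"},
    {"cues": {"buddy signalling", "low gas"}, "action": "begin the exit/ascent"},
]

decision_loop(
    observations=[{"low visibility", "overhead environment"}, {"unfamiliar noise"}],
    memory=memory,
)
```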
Models and patterns allow us to take mental shortcuts, and mental shortcuts save us energy. If we have a high-level model, we don’t need to look at the details. If we have a pattern that we recognise, we don’t have to think about the details, we just execute an action based on it. Think about the difference between the details contained within a hillwalker’s map, a road map and a SatNav. They have varying levels of detail, none of which match reality.
The following example will show how many different models and patterns are used to make decisions and ‘choices’.
Entering the Wreck

A group of three divers are swimming alongside an unfamiliar wreck, one which is rarely dived. They are excited to be there. The most experienced diver enters the wreck without checking that the others are okay, and the others follow. As the passageway is relatively open, they do not lay a line. They start moving along a passageway and into some other rooms. As they progress, the visibility drops until the diver at the back cannot see anything.
The reduced visibility was caused by two main factors:
- Silt and rust particles falling down, having been lifted from the surfaces by exhaled bubbles
- Silt stirred up from the bottom by poor in-water skills (finning technique, hand sculling, and buoyancy control), which then stayed suspended due to a lack of current.
The divers lose their way in the wreck. Through luck, they make their way back out through a break in the wreck they were unaware of.
This story is not that uncommon but, because no one was injured or killed, it likely doesn’t make the media, which means others don’t get the chance to learn from it. Publicity or visibility of the event is one thing, learning from it is another.
They Lost Situation Awareness in the Wreck
We could say that they lost situation awareness in the wreck. However, that doesn’t help us learn because we don’t understand how the divers’ awareness was created or how it was being used. Normally, if such an account were to be posted online, the responses would contain lots of counterfactuals – ‘they should have…’, ‘they could have…’, ‘I would have…’ These counterfactuals are an important aspect of understanding situation awareness and how we use it, but they rarely help those involved because, at the time, those involved didn’t have the observer’s knowledge of the outcome.

Our situation awareness is developed based on our experiences, our knowledge, our learnings, our goals, our rewards, our skills, and many other factors relating to our mental models and the pattern matching that takes place. Our situation awareness is a construct of what we think is happening and what is likely to happen. Crucially, it does not exist in reality. This is why situation awareness is hard to teach and why significant reflection is needed if we want to improve it.
The Mental Models and Patterns that were Potentially Matched on this Dive
As described above, our brains are really good at matching patterns from previous experiences, and then coming up with a ‘good enough’ solution to make a decision. The following shows a number of models or patterns present in the event described above.
- “the divers are excited…” – this reduces inhibitions and increases risk-taking behaviours. Emotions heavily influence the ‘logical’ decisions we make.
- “the most experienced diver enters the wreck…” – social conformance and authority gradient. How easy is it to say no to progressing into the wreck? What happened the last time someone said no to a decision?
- “passage is relatively open…” – previous similar experiences have ended okay, it takes time to lay line, and time is precious while diving. We become more efficient than thorough.
- “diver at the back cannot see anything.” – the lead diver’s visibility is still clear ahead because the silt and particle issues are behind them.
- “exhaled bubbles…poor in-water technique…” – none of the divers had been in a wreck like this before where silt and rust particles dropped from the ceiling. Their poor techniques hadn’t been an issue in open water where they could swim around or over the silting. They had never been given guidance on what ‘good’ can look like.
- “they lose their way…” – they have no patterns to match that would allow them to work out their return route. The imagery when looking backwards in a wreck (or cave) can be different from looking forward. They laid no line to act as the constant in a pattern.
I hope you can see that much of the divers’ behaviour was based on previous experiences and a lack of similar situations. The divers didn’t have the right patterns to match to help them make the ‘good’ decisions they needed to make. Even if they did have the patterns for this specific diving situation, was there the psychological safety that would have allowed the team members to question going into the wreck, or to signal that things were deteriorating at the back before the visibility dropped to almost zero?

As described in the recent article, ‘Challenger Safety’, team members will look to see if learner safety is present before pushing the boundaries. Learner safety is where it is okay to make a mistake; this could be a physical mistake or a social one.
Staying ‘Ahead’ of the Problem
The following three models from David Woods (Chapter 3, Cognitive Systems Engineering, 2017) show how we can think about the way a diver deals with an incident as it develops. An incident is not one thing; it involves multiple activities that need to be dealt with in a timely manner. If they aren’t, there is a risk that the diver is trying to solve an old problem when the situation has moved on.

The image above shows how we used to think people dealt with problems. There was a single diagnosis of the problem, and the operator would start to align their thoughts and actions with ‘reality’ based on the feedback they were receiving.

This second image shows how a ‘perfect’ operator would track the changes as an incident developed and adapt their behaviour as a consequence. There would still be a lag, but the problems would be dealt with.

However, this image shows what happens when the operator falls behind the adaptation process, often still trying to solve an old problem, or when they don’t have the models/patterns to pick up the changes and track the developing issue more closely.
Consider how this would apply to the divers in the wreck penetration scenario above. Accidents and incidents happen when those involved lose the capacity to adapt to the changes before the situation becomes catastrophic.
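To make that lag concrete, here is a small, hypothetical sketch in code – my own construction, not Woods’ actual model – of a situation that changes every time step and an operator who only re-assesses every few steps, and so ends up working on a problem that no longer exists.

```python
# A toy illustration (my own construction, not Woods' model) of falling behind a
# developing incident: the situation changes every time step, but the operator only
# refreshes their mental model every `update_every` steps, so their picture of the
# problem lags reality and they end up solving an old problem.

def track_incident(situation_changes, update_every):
    operators_model = situation_changes[0]
    for step, reality in enumerate(situation_changes):
        if step % update_every == 0:
            operators_model = reality              # the operator re-assesses
        behind = operators_model != reality        # still working an old problem?
        print(f"t={step}: reality='{reality}', model='{operators_model}', behind={behind}")

changes = [
    "all normal",
    "silt disturbed",
    "visibility dropping",
    "visibility near zero",
    "orientation lost",
]

track_incident(changes, update_every=1)   # the 'perfect' operator tracks every change
print("---")
track_incident(changes, update_every=3)   # re-assessing too rarely: stuck on an old diagnosis
```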

Building Situation Awareness
If situation awareness is developed internally, how do we improve it?
- Build experience. The more experience you have, the more patterns you have to subconsciously match against. This means you are more likely to end up with an intentionally good outcome rather than a lucky one.
- Dive briefs. Briefs set the scene so we understand who is going to do what and when, what the limits are for ending the dive, and what the potential threats and enjoyable sights are. Dive briefs also help create psychological safety, so it is easier to question something underwater.
- Tell stories. When things go well and when they go not so well, tell context-rich stories. Look at what influenced your decisions. What was going on in your mind? What communication/miscommunication took place?
- Debriefs. Debriefs are a structured way of telling a story that allows team members to make better, more-informed decisions the next time around. They help identify the steps within the events that lead to capacity being developed.
Can We Assess Situation Awareness during Dives or Diver Training?
The simple answer is no, not directly.
As described above, situation awareness is a construct held by individuals, even when operating as a team, with the team model being an aggregation of the individuals’ models. The goal of The Human Diver human factors/non-technical skills programmes is to help divers create a shared mental model of what is going on now and what is going to happen, and to update it as the dive (task) progresses so that the ‘best’ decisions are made.
In-water, we can only look at behaviours and outcomes, not situation awareness directly. We can observe the outcomes of the decisions and the messages being communicated, which are used to share individual divers’ mental models with their teammates. This means that for an instructor to comprehend and then assess the ‘situation awareness’ behaviours/outcomes displayed by the team, the instructor must have a significant number of patterns (experiences) so that they can make sense of what they are perceiving. The more patterns, the more likely it is that their perception will match what is in front of them and that their decision will be ‘good’. Those patterns are context-specific too! An instructor who is very experienced in blue water with unlimited visibility will have a different set of patterns from one who works in green water with limited visibility, just as a recreational instructor will differ from a technical instructor.

The best way to assess how effective situation awareness was on a dive is via an effective debrief which focuses on local rationality: How did it make sense for you to do what you did? What were the cues and clues that led you to make that decision – cues and clues that would have been based on experience and knowledge? Unfortunately, a large percentage of the debriefs undertaken in classes focus on technical skill acquisition and not on building non-technical skills.
Summary
Situation awareness is a construct, held in our heads, based on our experiences, training, knowledge, context, and the goals and rewards we are working toward. It does not exist in reality. You cannot lose situation awareness, but your attention can be pointing in the ‘wrong direction’. You are constantly building up a mental picture of what is going on around you, using the limited resources you have, adapting the plan based on the patterns you are matching and the feedback you are getting. Therefore, to improve, you must learn what is important to pay attention to and how you will notice it. It isn’t enough to say, “Pay more attention.” We should be looking for the patterns, not the outcomes.
If we want to improve our own and our teams’ situation awareness, i.e., the number of, and the quality of, the patterns we hold in our brains, we need to build experience. That takes time, and it takes structured feedback and debriefs to understand not just that X leads to Y, but why X leads to Y, and what to do when X doesn’t lead to Y. The more models and patterns we have, the better the quality of our decisions.

Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 350 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture.
Diving Safety
Does The Sport Diving Community Learn from Accidents?
Do we learn from accidents as a diving culture and, as a result, take the actions, where needed, to improve divers’ safety? Though we might like to think that’s the case, the reality is more complicated as human factors coach Gareth Lock explains in some detail. Lock offers a broad six-point plan to help the community boost its learning chops. We gave him an A for effort. See what you think.
by Gareth Lock

Learning is the ability to observe and reflect on previous actions and behaviours, and then modify or change future behaviours or actions, either to get a different result or to reinforce the current behaviours. It can be single-loop, whereby we only focus on the immediate actions and change those, e.g., provide metrics for buoyancy control during a training course, or double-loop, where the underlying assumptions are questioned, e.g., are we teaching instructors how to teach buoyancy and trim correctly? The latter has a greater impact but takes more time and, more importantly, requires a different perspective. Culture is ‘the way things are done around here’ and is made up of many different elements, as shown in this image from Rob Long. A learning culture is a subset of a wider safety culture.

Regarding a safety culture, in 2022 I wrote a piece for InDEPTH, “Can We Create A Safety Culture In Diving? Probably Not, Here’s Why,” about whether the diving industry could have a mature safety culture and concluded that it probably couldn’t happen for several reasons:
- First, ‘safe’ means different things to different people, especially when we are operating in an inherently hazardous environment. Recreational, technical, cave, CCR and wreck diving all have different types and severities of hazards, and there are varying levels of perception and acceptance of risk. The ultimate realisation of risk, death, was only acknowledged in the last couple of years by a major training agency in their training materials. Yet it is something that can happen on ANY dive.
- Second, given the loose training standards, multiple agencies, and instructors teaching for multiple agencies, there is a diffuse organisational influence across the industry, which means it is hard to change the compliance focus that is in place. From the outside looking in, there needs to be more evidence of leadership surrounding operational safety, as opposed to compliance-based safety, e.g., ensuring that the standards are adhered to even if the standards have conflicts or are not clear. This appears to be more acute when agencies have regional licensees who may not be active diving instructors and who are focused on revenue generation rather than the maintenance of skilled instructors. There is very little, if any, evidence that leadership skills, traits, or behaviours are taught anywhere in the diving industry as part of the formal agency staff or professional development processes. This impacts what happens in terms of safety culture development.
- Finally, the focus on standards and rules aligns with the lowest level of the recognised safety culture models – Pathological from Hudson. Rules and standards do not create safety. Rules facilitate the discussion around what is acceptably safe, but they rarely consider the context surrounding the activities at the sharp end, i.e., dive centres and diving instructors and how they manage their businesses. These are grey areas. There is a difference between ‘Work as Imagined’ and ‘Work as Done,’ and individual instructors and dive centre managers must both ‘complete the design’ because the manuals and guides are generic, and manage the tension between safety, financial pressures, and people (or other resources) to maintain a viable business. Fundamentally, people create safety not through the blind adherence to rules, but through developed knowledge and reflecting on their experiences, and then sharing that knowledge with others so that they, too, may learn and not have to make the same mistakes themselves.

The preceding discussion brings us to the main topics of this article: does the diving industry have a learning culture, and what is needed to support one?
What is a learning culture?
In the context of ‘safe’ diving operations, a learning culture could be defined as “the willingness and the competence to draw the right conclusions from its safety information system, and the will to implement major reforms when their need is indicated.” (Reason, 1997). Here we have a problem!
‘Willingness…’
The industry is based around siloed operations: equipment manufacturers, training agencies, dive centres/operations, and individual instructors. Adopting a genuine learning approach means that the barriers must be broken down and conversations must happen between and within the silos. This is very difficult because of the commercial pressures present. The consumer market is small, and many agencies and equipment manufacturers are competing for the same divers and instructors. Agencies and manufacturers also have competing goals. Agencies want to maximise the number of dive centres/instructors to generate revenue, and one way of doing that is to maximise the number of courses available and the number of courses that can be taught by individual instructors, e.g., different types of CCR units. Manufacturers don’t want the reputational risk that is realised when their equipment/CCR is involved in a fatal diving accident, but they also want to maximise their return on investment by making their equipment available to multiple agencies and instructors. The higher-level bodies (WRSTC, RTC, and RESA) are made up of the agencies and manufacturers that will inherit the standards set, so there is a vested interest in not making too much change. Furthermore, in some cases, there is a unanimous voting requirement, which means it is easy to veto something that impacts one particular agency but benefits many others.
‘Competence…’
This will be expanded in the section below relating to information systems as they are highly interdependent.
What safety information systems do we have in the diving community?
Training agencies each have their own quality assurance/control/management systems, with varying levels of oversight. This oversight is determined by the questions they ask, the feedback they receive, and the actions they take. These are closed systems, based around compliance with the standards set by the agency – sometimes those standards are not even available to be viewed by the students during or after their class! Research has been carried out on some of this quality data, but it appears to have focused on the wrong part: e.g., in 2018, a paper was published by Shreeves et al. which looked at violations outside the training environment involving 122 diving fatalities. While the data would have been available, a corresponding research project involving fatalities inside the training environment was not completed (or if it was, it wasn’t published in the academic literature).
As the ex-head of Quality Control of a training agency, I would have been more interested in what happened inside my agency’s training operations than what occurred outside, not from a retributive perspective, but to understand how the systemic failures were occurring. I also understand that undertaking such research would mean it would be open for ‘legal discovery’, and likely lead to the organisation facing criticism if a punitive approach was taken rather than a restorative one.
Safety organisations like Divers Alert Network collect incident data, but their primary focus is on quantitative data (numbers and types of incidents), not narrative or qualitative data – and it is the latter that helps learning because we can relate to it. The British Sub Aqua Club produces an annual report, but there is very limited analysis of the reported data, and there does not appear to be any attempt to look at contributory or influential factors when categorising events. The report lists the events based on the most serious outcome and not on the factors which may have influenced or contributed to the event, e.g., a serious DCI event could have been caused by a rapid ascent, following an out-of-gas situation, preceded by a buddy separation and inadequate planning. The learning is in the contributory factors, not in the outcome. In fairness, this is because the organisations do not have to undertake more detailed investigations, and because the information isn’t contained in the submitted reports.
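As a toy illustration of that point (the incident records below are invented, not taken from any real report), counting contributory factors across reports surfaces the recurring issues in a way that grouping events by their worst outcome cannot:

```python
# A hypothetical sketch of why factor-based categorisation is more informative than
# outcome-based categorisation. The incident records are invented examples.
from collections import Counter

incidents = [
    {"outcome": "DCI", "factors": ["rapid ascent", "out of gas", "buddy separation", "inadequate planning"]},
    {"outcome": "near miss", "factors": ["buddy separation", "inadequate planning"]},
    {"outcome": "DCI", "factors": ["rapid ascent", "task loading"]},
]

by_outcome = Counter(i["outcome"] for i in incidents)
by_factor = Counter(f for i in incidents for f in i["factors"])

print(by_outcome)   # Counter({'DCI': 2, 'near miss': 1}) - says little about *why* events happened
print(by_factor)    # surfaces the recurring contributory factors, which is where the learning sits
```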
Research from 2006 has shown that management in organisations often want quantitative data, whereas practitioners want narrative data about what happened, how it made sense, and what can be done to improve the situation. Statistical data in the diving domain regarding safety performance and the effectiveness of interventions, e.g., changes in the number of fatalities or buoyancy issues, is of poor quality and should not be relied upon to draw significant conclusions.

What is required to populate these systems?
There are several elements needed to support a safety information system.
- Learning-focused ‘investigations’.
- Competent ‘investigators’.
- Confidential and collaborative information management and dissemination systems.
- Social constructs that allow context-rich narratives to be told.
Learning-focused ‘investigations’. The diving industry does not have a structured or formal investigation or learning process, instead relying on law-enforcement and legal investigations. Consequently, investigations are not focused on learning; rather, they are about attributing blame and non-compliance. As Sidney Dekker said, “you can learn or blame; you can’t do both”. The evidence that could be used to improve learning – e.g., standards deviations, time pressures, adaptations, poor/inadequate rules, incompetence, and distractions… – is the same data that a prosecution would like to know about to hold people accountable. Rarely does the context come to the fore, and it is context that shapes the potential learning opportunities. “We cannot change the human condition, but we can change the conditions in which humans work.” (James Reason). Rather than asking ‘why did that happen?’ or even ‘who was to blame?’, we need to move to ‘how did it make sense for them to do what they did?’. ‘Why’ asks for a justification of the status quo; ‘how’ looks at the behaviour and the context, not the individual.
Competent ‘investigators’. As there isn’t any training in the diving domain for undertaking a learning-focused investigation, we shouldn’t be surprised that investigations focus on the individual’s errant behaviour. Even those ‘investigations’ undertaken by bodies like DAN, the NSS-CDS Accident Committee, or the BSAC do not involve individuals who have undertaken formal training in investigation processes or investigation tools. A comprehensive learning review is not quick, so who is going to pay for it? It is much easier to deflect the blame to an individual ‘at the sharp end’ than to look further up the tree where systemic and cultural issues reside. The education process for learning-focused investigations starts with understanding human error and human factors. The Essentials class, the 10-week programme, and the face-to-face programmes provide this initial insight, but the uptake across the industry, at a leadership level, is almost non-existent. Four free workshops are planned for Rebreather Forum 4.0 to help address this.
Confidential information management system. Currently, no system allows the storage of context-rich diving incident data outside the law-enforcement or legal system in a manner that can be used for learning. After discussions with senior training agency staff, it appears that as little as possible is written down following an incident. When it is, it is shared with the attorney to enable the ‘attorney-client’ privilege to be invoked and protected from discovery. If internal communications occur via voice, then the potential learning is retained in the heads of those involved but will fade over time. Furthermore, if they leave that role or organisation, then the information is almost guaranteed to be lost.
Social Constructs: Two interdependent elements are needed to support learning: psychological safety and a “Just Culture.” With the former, the majority of modern research strongly suggests that it is the presence of psychological safety that allows organisations to develop and learn (Edmondson, 1999). Edmondson describes numerous case studies where organisational and team performance was improved because incidents, problems, and near-misses were reported. Paradoxically, the more reports of failure, the greater the learning. It was not because the teams were incompetent; they wanted to share the learning and realised that they could get better faster with rapid feedback. They also knew that they wouldn’t be punished because psychological safety is about taking an interpersonal risk without fear of retribution or reprisal – this could be speaking up, it could be challenging the status quo, it could be saying “I don’t know”, or it could be about trying something new and coming up with an unexpected outcome.
The second requirement is a Just Culture, which recognises that everyone is fallible, irrespective of experience, knowledge, and skills. This fallibility applies even when rules are broken, although sabotage and gross negligence (a legal term) are exceptions. Neither a Just Culture nor psychological safety is widely visible in the diving industry, although some pockets are present. To support psychological safety (proactive/prospective) and a Just Culture (reactive), there is a need for strong, demonstrable leadership:
- Leaders who have integrity – they walk the talk.
- Leaders who show vulnerability – talking about their own mistakes, including the context and drivers; leaders who want to look at organisational issues inside their own organisation, not just point fingers at others’ problems.
- Leaders who recognise that human error is only the starting point for understanding something going wrong, not the end.
‘…the will to implement major reforms…’
This is probably the hardest part because learning involves change, and change is hard. It costs cognitive effort, time, and money, and it has an impact on commercial viability because of the need to generate new materials, to educate instructor trainers, instructors, and divers about the change, and to do so in multiple languages. Unless there is major external pressure, e.g., insurance companies threatening to withdraw support, things are unlikely to change because there aren’t enough people dying in a single event to trigger an emotional response for change. For example, in the General Aviation sector in the US, approximately 350 people die each year; if these deaths happened in airliners, it would mean two to three crashes per year, which would be considered unacceptable.
In 2022, more than 179 people died diving in the US. (personal communications with DAN)
The most radical changes happen when double-loop learning is applied.
NASA did not learn from the Challenger disaster because it focused on single-loop learning, and when Columbia was lost, the investigation unearthed a lack of organisational learning, i.e., double-loop learning. Chapter 8 of the Columbia Accident Investigation Board report provides many parallels with the diving industry. The recent changes to PADI drysuit training standards following a fatal dive on a training course provide an example of single-loop learning – fix the ‘broken instructor’ and clarify course training requirements. The double-loop approach would be to look at self-certification and the wider quality management across the agency/industry; however, such an approach has significant commercial disadvantages across the board.

Creating a Learning Culture
The previous paragraphs talk about many of the issues we’ve got, but how do we improve things?
- Move to using language that is learning-based, not ‘knowing’-based. This video from Crista Vesel covers the topic relatively quickly. This includes not using counterfactuals (could have, should have, would have, failed to…), which are informed by hindsight bias. Fundamentally, counterfactuals tell a story that never happened.
- Look to local rationality rather than judging others. Move from ‘who is to blame?’ and ‘why did you do that?’ to ‘how did it make sense for you to do that?’. Separate the individual from the actions/behaviours, and stop applying the fundamental attribution error, where we believe the failure is due to an individual issue rather than the context.
- Look to break down the barriers between the silos and share information. Ultimately, the stakeholders within the diving community should be looking to create a safe diving environment. Throwing rocks and stones at each other for ‘incompetence’ is not going to help.
- Adopt the Five Principles of Human and Organisational Performance as outlined in this blog.
- Build ‘If Only…’, or something similar produced for the recreational market, into training programmes at the instructor trainer, instructor, and diver level. This way the culture can slowly change by telling context-rich stories that have ‘stickiness’. However, this requires a fundamental shift in how stories are told and how risk is portrayed in the diving industry.
- Finally, recognise we are all fallible. Until we accept that all divers are fallible and are trying to do the best they can, with the knowledge they have, the money they have, the resources they have, the skills they’ve acquired, and the drivers and goals they are facing, then we are unlikely to move forward from where we are, and we’ll keep choosing the easy answer: ‘diver error’.
DIVE DEEPER
InDEPTH: Examining Early Technical Diving Deaths: The aquaCORPS Incident Reports (1992-1996) by Michael Menduno
InDEPTH: The Case for an Independent Investigation & Testing Laboratory by John Clarke

Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 450 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture.