Diving Safety
Drift is Normal. Being a Deviant is Normal. Here’s Why
What causes individuals and organizations to drift from acceptable standards and behavior? Is it an aberration or something to expect, and what can we do about it? Human Factors coach Gareth Lock takes us for a deep dive into human biases and our tendency to drift, and what that means for human performance.


by Gareth Lock
Header image: a deviant diver on the SMS Cöln, and other pictures courtesy of Gareth Lock, unless noted
In 1994, two US Army Black Hawk helicopters were shot down by two US Air Force F-15 fighter jets over northern Iraq, killing all 26 people on board the choppers. When the story hit the media, it was almost unbelievable that two highly professional aircrews, guided by other equally professional operators on the Airborne Warning and Control System (AWACS) aircraft, could mistake the Black Hawk helicopters for Mil Mi-24 Hind helicopters. But they did!
In his excellent book Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq, Scott Snook developed and demonstrated the concept of practical drift, a theory whereby each sub-organisation or team has a certain amount of leeway in how it undertakes its operations. This flexibility acknowledges that you can't follow the rules exactly to the letter all the time. The problem is that these small deviations compound across the wider system with potentially disastrous results; and, importantly, no one appears to recognize that the drift is occurring. Snook's event map describes a complicated web of relationships between multiple stakeholders—the tasking organisation, the aircrew in the Black Hawks, the F-15 aircrew, and the AWACS control team—all of whom were doing the best they could with their limited resources and quickly changing circumstances.
Practical drift is similar to the "Normalization of Deviance," a concept Diane Vaughan developed during her examination of the Challenger Shuttle disaster. Vaughan explored the idea in her 1996 book, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Normalization of deviance has been discussed in a number of recent diving blogs in an attempt to explore the acceptance of the (continued) breaking of a single rule.
Rather than focus on a single rule, we should consider Vaughan's definition more broadly than the individual level and look at the larger organisational scale. "Social normalization of deviance means that people within the organisation become so accustomed to a deviation that they don't consider it as deviant, despite the fact that they far exceed their own rules for elementary safety." Neil Richardson, a safety and human factors professional (and colleague of mine) operating primarily in the aviation domain, offers another perspective while addressing the same point: "The Shuttle programme was risk-managed right up until the point it wasn't and the Challenger and crew were lost."
Risk management vs. uncertainty management
Risk management is often mentioned in the “professional” arm of diving and diver training courses—such as dive master, instructor, and instructor trainer courses—but it is rarely covered in detail during “user” courses or sport diving. Despite this lack of formal content and process, we are constantly managing relevant uncertainties with the goal of providing an enjoyable dive for ourselves and our students and reducing the likelihood of having an adverse event.
The term "uncertainties" is used here deliberately instead of "risk" because of the way we normally make decisions in an uncertain environment. When managing risk, we are often comparing historical analyses of quantitative data to determine likelihood and consequence using the logical, or System 2, part of the brain. However, when we are managing uncertainties, we use a different part of the brain—often described as System 1—which relies on pattern matching, cognitive biases, and mental shortcuts. Importantly, System 1 is heavily influenced by our emotions, which is why we often react quickly rather than logically.
Equating "risk" with "uncertainty" is like conflating the "apple" type of decision-making with the "orange" type: they are both decision-making concepts, but they involve different processes and applications and can lead to different outcomes.
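To make the System 2 side of that distinction concrete, the short sketch below shows the kind of quantitative assessment a formal risk process relies on: score likelihood and consequence, multiply them, and compare the product against an acceptance threshold. The 5×5 scales, the scores, and the threshold are illustrative assumptions of mine, not figures taken from any agency's risk process.

```python
# A minimal sketch of System 2-style, quantitative risk assessment.
# The scales, scores, and acceptance threshold are illustrative assumptions only.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

ACCEPTABLE_SCORE = 8  # assumed threshold for this illustration


def assess(hazard: str, likelihood: str, consequence: str) -> None:
    """Score a hazard as likelihood x consequence and compare it to the threshold."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    verdict = "acceptable" if score <= ACCEPTABLE_SCORE else "needs mitigation"
    print(f"{hazard}: {likelihood} x {consequence} = {score} -> {verdict}")


# Hazards a dive plan might consider (examples made up for illustration)
assess("regulator free-flow in cold water", "possible", "major")
assess("delayed SMB deployment", "likely", "minor")
```

The point of the contrast is that System 1 never does this multiplication; it pattern-matches on what the situation feels like, which is why the two should not be treated as the same thing.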
We need to recognize that the uncertainties we deal with while diving aren’t just focused on physical safety/harm, but also cover legal, reputation, financial, psychological, and social uncertainties and their associated outcomes. Research has shown that the fear of psychological harm can be stronger than the fear of physical harm.
In the diving industry, when something goes wrong, the (social) media and “investigations” often focus on the proximal causes—those that are closest in time and space to the event—of what happened. There is a focus on violations, rule-breaking, human error, recklessness, or direct health issues, and only sometimes do supervisory/instructional factors come into the discussion. Furthermore, the media rarely examines “local rationality” (why it made sense for the individual to do what they did) or the immediate or wider organisational and cultural factors that may have been present.
Local rationality
If we start with local rationality, we know that the majority of the time we are operating in System 1 mode—fast, intuitive, pattern-matching thinking. We are not actively paying attention to everything we are sensing; instead, we pick what we think are the relevant or important factors based on our previous knowledge and experience, focused by our present goals and expectations, and use those elements of information to make a decision.
Despite what some would think, you can't pay 100% attention all the time! This means we are literally discarding billions of bits of sensory data each day because, in real time, we don't think those bits are relevant or important. When pressures prevent us from being more thorough—pressures related to time, money, peer pressure, fear of failure, fear of non-compliance, or fixation on goals and outcomes—we try to be as efficient as possible. However, the more we get "right" without thinking about all of the incoming stimuli, the more that pattern reinforces the decision and the more likely we are to repeat it. How often have you heard "We've always done it this way"?
Maybe an adverse event would provide a learning opportunity? Unfortunately, whether an adverse event serves as a cautionary tale depends entirely on the biases in our thinking and how those biases shape our interpretation of the event, adverse or otherwise. The following biases are the most relevant:
- Outcome bias describes the tendency to judge events with serious outcomes more critically than those with minor outcomes. This happens because we fail to separate the quality of the decision from the quality of the outcome. For example, those involved in a fatality will be judged more critically than those involved in an identical event without a fatality; a poorly performing regulator that free-flows in 10 m/33 ft of cold water will be treated differently from the same regulator that free-flows in 40 m/131 ft of cold water because the consequences are more severe.
- Fundamental attribution bias is the tendency to attribute an adverse event involving someone else to the individual rather than to the situation or context. When we personally experience failure, we do the opposite and blame the situation or context! Inversely, when we personally succeed, we credit our skills and behaviors; when others succeed, we tend to attribute their success to the "opportunities" they had.
- Distancing through differencing is the tendency to discount failures in others as being relevant to ourselves because we are different from the other party in some way, even if the general conditions and context are the same. A recreational OC diver may forget part of their pre-dive sequence because they were distracted, but an experienced OC technical diver may believe they wouldn't make that same mistake, even though the conditions were the same.
- Hindsight bias is the tendency to believe that, had we been in the adverse situation, we would have seen the event coming and responded differently. Part of this is because we are able to join the dots looking backwards in time, recognising a pattern that wasn't apparent in the moment.
Rewards
As a result of these biases, we aren’t very good at picking up small deviations in procedures because we experience “good enough” outcomes, and we are “rewarded” for gradual erosion of the safety margins that the original standards were created to address:
• We saved time (or weren’t late) as we skipped through the checks quickly.
• We saw more of the wreck or reef because we extended the bottom time and ate into our minimum gas margins (a sketch of what that margin protects follows this list).
• We managed to certify a few more students this month which helped pay the bills, even though we didn’t cover everything to the same level of detail that we normally do.
• We got some really great social media feedback because we took those divers somewhere they hadn’t been before—and shouldn’t have been either—but they loved it.
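For readers unfamiliar with the term, minimum gas (sometimes called rock bottom) is the reserve needed for two divers, sharing gas, to solve a problem at depth and ascend together. The sketch below shows one common way the figure is worked out; the stressed consumption rate, ascent speed, and one-minute problem-solving window are assumptions for illustration, not any agency's published numbers.

```python
# A minimal sketch of a minimum gas ("rock bottom") calculation.
# The consumption rate, ascent speed, and problem-solving window are
# illustrative assumptions, not agency-published figures.

def minimum_gas_litres(depth_m: float,
                       combined_sac_l_min: float = 40.0,  # two stressed divers (assumed)
                       ascent_rate_m_min: float = 9.0,    # assumed ascent speed
                       problem_time_min: float = 1.0) -> float:
    """Gas needed for two divers to sort a problem at depth and ascend together."""
    ata_at_depth = depth_m / 10.0 + 1.0        # approximate absolute pressure at depth
    avg_ata = (ata_at_depth + 1.0) / 2.0       # average pressure over the ascent
    ascent_time_min = depth_m / ascent_rate_m_min
    problem_gas = combined_sac_l_min * problem_time_min * ata_at_depth
    ascent_gas = combined_sac_l_min * ascent_time_min * avg_ata
    return problem_gas + ascent_gas


reserve = minimum_gas_litres(30.0)  # example: a 30 m/100 ft dive
print(f"Minimum gas at 30 m: {reserve:.0f} litres "
      f"(about {reserve / 24.0:.0f} bar in a 24-litre twinset)")
```

Extending the bottom time "just a few more minutes" doesn't change this reserve; it simply means a bigger slice of the gas you are carrying is no longer really yours to spend.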

Rewards come in all sorts of shapes and sizes, but the common factor is the dopamine rush: our brains are wired to favor the feel-good rush of a short-term gain over the prolonged reward of a long-term gain. On the other side of the coin, we are also willing to risk a potentially major loss in the future in order to avoid a guaranteed minor loss now. For instance, imagine that you're entering the water for the "dive of a lifetime" in cold water with a regulator setup that doesn't breathe too well; you weren't able to get it serviced because of time/money issues. At the end of this particular dive, you have to do a gas-sharing ascent because someone else is out of gas due to an equipment failure, and both of your second stages free-flow and freeze owing to poor regulator performance, increased gas flow, and the cold environmental conditions. The result: two divers out of gas and making a rapid ascent to the surface.
In hindsight, we can see where the failures occurred. But, in real time, the erosion of safety margins and subconscious acceptance of the increased “risk” are likely not considered. In mid-July 2021, I gave a presentation to Divers Alert Network Southern Africa (DAN SA) on the topic of setting and maintaining goals and how goal focus can reduce safety.
Organisations drift too
This article opened with the normalization of deviance as it related to NASA and the loss of the Challenger Shuttle. The gradual, imperceptible shift from an original baseline through a series of "risk managed" processes and activities resulted in a "new" baseline that was far from acceptable when measured against the original safety argument. This isn't the first time an organisation has drifted, nor will it be the last.
Organisations are made of people, and there are reward systems in place within organisations which lead to a conflict between safety, workload, and financial viability. The image below from Jens Rasmussen shows this tension and the “safety margins” that are perceived to be in place. The difficulty is that we don’t know how big the gap is between the margin and catastrophe, so we keep pushing the boundaries until we get some feedback (failure) and hope that it isn’t catastrophic.
Another way of looking at this tension and drift is to use a framework from the Human and Organisation Performance (HOP) domain called the Organisational Drift Model from Sidney Dekker.
The premise here is that safety is "created" by the development of rules, processes, procedures, and a culture which supports adherence to these standards or expectations. In the modern safety domain, these rules, processes, and procedures are called "Work as Imagined" or "Work as Prescribed." They rarely match exactly the operational environment in which they will be used. There are good reasons for that: you cannot document everything that you want your people (instructor trainers, instructors, dive masters, and divers) to do in every circumstance, so there will be gaps between what should be done and what is done. These gaps are filled in by experience and feedback. Some call this common sense, but you can't develop common sense without personal experience!
As time progresses, there is an increased gap between the “Work as Imagined” (black line) and “Work as Done” (blue line). This gap is risk or uncertainty to the organisation. Not all drift is bad though, because innovation can come from drift as long as it is recognized, debriefed, and intentionally fed back into the system for improvement.

At the same time as individual and team performance is drifting, the operational environment is changing too. There are accumulations which are adding uncertainty/risk to the system: old or outdated equipment, external requirements changing, legislation changes, change of purpose of equipment or accommodation/infrastructure, and many others. Often these accumulations are dealt with by different people in an organisation, so the compounding effect is not seen.
The gap between “Work as Done” and the “Accumulations” line is known as capacity within the system. This capacity is managed by individuals, taking into account their experience, knowledge, skills, and attitudes towards and within the diving environment. Safety does not reside in paperwork, equipment, or individuals; it is created by those within the diving system taking into account all of the resources they have and the pressures they face while balancing workload, money, and safety dynamically.
However, when the capacity runs out (when the Work as Done line crosses the Accumulations line), an adverse event occurs. This event is now under the spotlight because it is obvious and cannot be hidden, especially if it is very serious. Hindsight clouds our ability to learn because we think the gaps must have been obvious. Effective organisational learning to prevent drift doesn't need an adverse event; what it needs is a curious mind and the motivation to improve. If we stopped time five seconds before the lines crossed, while we still had capacity, all of the learning opportunities would still be present and we could examine them. We would be able to see which accumulations were occurring, we would be able to see what Work as Done actually looked like, and we would be able to increase the capacity of the system, thereby reducing the likelihood of an adverse event. But that requires organisations to recognize that adverse events are outcomes of a complex system with many interactions, a system in which they set and demonstrate the acceptable standards and expectations. The absence of adverse events does not mean that you are operating a 'safe' system.
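To make the drift model a little more tangible, here is a toy numerical caricature of it (mine, not Dekker's): Work as Done drifts a small, locally rewarded amount each week, accumulations creep up independently, and the "adverse event" is simply the step at which the two lines cross and the capacity runs out.

```python
import random

# A toy caricature of the organisational drift model described above.
# All of the numbers are arbitrary illustrations, not measurements of a real system.
random.seed(42)

margin = 100.0        # initial gap between Work as Imagined and catastrophe
work_as_done = 0.0    # drift of actual practice away from the written standard
accumulations = 0.0   # ageing kit, changed requirements, re-purposed equipment, etc.

for week in range(1, 201):
    work_as_done += random.uniform(0.0, 1.0)   # small, individually rewarded shortcuts
    accumulations += random.uniform(0.0, 0.6)  # slow build-up handled by different people
    capacity = margin - work_as_done - accumulations

    if week % 20 == 0:
        print(f"week {week:3d}: capacity remaining = {capacity:5.1f}")

    if capacity <= 0:
        print(f"week {week}: the lines cross -- an adverse event is now visible")
        break
```

Nothing inside the loop looks like a decision to be unsafe; every step is small and rewarded, and only the crossing point is visible from the outside, which is exactly why stopping "five seconds before" is where the learning lies.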
If drift is normal, what can I do about it?
First, recognize and acknowledge that drift exists; we all have a tendency to drift. If drift is occurring, look at the conditions that are causing it rather than focusing on the drifting individual. Those conditions could be time pressures, financial pressures from 'cheap' courses, lack of experience, high staff turnover, or low commitment to the sport by divers or dive professionals.
Second, create an environment where feedback, especially critical, context-rich feedback, is the norm. This has multiple benefits:
- Individuals find out where they are drifting from the standards/expectations which have been set.
- Organisations find out if their standards/expectations are fit for purpose and where issues about compliance are arising.
- Accumulations are identified in a timely manner and addressed.
There are a number of blogs on The Human Diver website and our Vimeo channel which help to develop a learning culture, to understand how drift occurs via human error, and to build both a psychologically safe environment and a Just Culture. In terms of having an immediate effect, a post-dive/post-project debrief is one of the best methods, and you can download the DEBRIEF framework I created to help facilitate critical, learning-focused debriefs here: www.thehumandiver.com/debrief
Remember, it is normal to err. It is what we do once we've made the error that matters when it comes to creating positive change in the future. If we focus on the individual and their behavior, things are unlikely to improve. However, if we look at the conditions and context, then we have the opportunity to reduce the chances of an adverse event in the future. And if we share those lessons, it isn't just our organisation or team that improves; the wider diving community can, too.
Dive Deeper
Be There or Be Deviant: HF In Diving Conference 24-25 September 2021

Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 350 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and a Just Culture. In September 2021, he will be opening the first ever Human Factors in Diving conference. His goal: to bring human factors practice and knowledge into the diving community to improve safety, performance, and enjoyment.
Diving Safety
Does The Sport Diving Community Learn from Accidents?
Do we learn from accidents as a diving culture and, as a result, take the actions, where needed, to improve divers’ safety? Though we might like to think that’s the case, the reality is more complicated as human factors coach Gareth Lock explains in some detail. Lock offers a broad six-point plan to help the community boost its learning chops. We gave him an A for effort. See what you think.
by Gareth Lock

Learning is the ability to observe and reflect on previous actions and behaviours, and then modify or change future behaviours or actions either to get a different result or to reinforce the current behaviours. It can be single-loop, whereby we focus only on the immediate actions and change those, e.g., providing metrics for buoyancy control during a training course, or double-loop, where the underlying assumptions are questioned, e.g., are we teaching instructors how to teach buoyancy and trim correctly? The latter has a greater impact but takes more time and, more importantly, requires a different perspective. Culture is "the way things are done around here" and is made up of many different elements, as shown in this image from Rob Long. A learning culture is a subset of a wider safety culture.

Regarding a safety culture, in 2022 I wrote a piece for InDEPTH, “Can We Create A Safety Culture In Diving? Probably Not, Here’s Why,” about whether the diving industry could have a mature safety culture and concluded that it probably couldn’t happen for several reasons:
- First, ‘safe’ means different things to different people, especially when we are operating in an inherently hazardous environment. Recreational, technical, cave, CCR and wreck diving all have different types and severities of hazards, and there are varying levels of perception and acceptance of risk. The ultimate realisation of risk, death, was only acknowledged in the last couple of years by a major training agency in their training materials. Yet it is something that can happen on ANY dive.
- Second, given the loose training standards, multiple agencies, and instructors teaching for multiple agencies, there is a diffuse organisational influence across the industry which makes it hard to change the compliance focus that is in place. From the outside looking in, there needs to be more evidence of leadership around operational safety, as opposed to compliance-based safety, e.g., ensuring that the standards are adhered to even if the standards conflict or are unclear. This appears to be more acute when agencies have regional licensees who may not be active diving instructors and are focused on revenue generation rather than the maintenance of skilled instructors. There is very little, if any, evidence that leadership skills, traits, or behaviours are taught anywhere in the diving industry as part of the formal agency staff or professional development processes. This impacts what happens in terms of safety culture development.
- Finally, the focus on standards and rules aligns with the lowest level of the recognised safety culture models – Pathological from Hudson. Rules and standards do not create safety. Rules facilitate the discussion around what is acceptably safe, but they rarely consider the context surrounding the activities at the sharp end, i.e., dive centres and diving instructors and how they manage their businesses. These are grey areas. There is a difference between ‘Work as Imagined’ and ‘Work as Done,’ and individual instructors and dive centre managers must both ‘complete the design’ because the manuals and guides are generic, and manage the tension between safety, financial pressures, and people (or other resources) to maintain a viable business. Fundamentally, people create safety not through the blind adherence to rules, but through developed knowledge and reflecting on their experiences, and then sharing that knowledge with others so that they, too, may learn and not have to make the same mistakes themselves.

The preceding discussion brings us to the main questions of this article: does the diving industry have a learning culture, and what is needed to support one?
What is a learning culture?
In the context of ‘safe’ diving operations, a learning culture could be defined as “the willingness and the competence to draw the right conclusions from its safety information system, and the will to implement major reforms when their need is indicated.” (Reason, 1997). Here we have a problem!
‘Willingness…’
The industry is based around siloed operations: equipment manufacturers, training agencies, dive centres/operations, and individual instructors. Adopting a genuine learning approach means that the barriers must be broken down and conversations must happen between and within the silos. This is very difficult because of the commercial pressures present. The consumer market is small, and many agencies and equipment manufacturers are competing for the same divers and instructors. Agencies and manufacturers also have competing goals. Agencies want to maximise the number of dive centres/instructors to generate revenue, and one way of doing that is to maximise the number of courses available and the courses that can be taught by individual instructors, e.g., different types of CCR units. Manufacturers don't want the reputational risk of their equipment/CCR being involved in a fatal diving accident, but they also want to maximise their return on investment by making it available to multiple agencies and instructors. The higher-level bodies (WRSTC, RTC, and RESA) are made up of the agencies and manufacturers that will inherit the standards set, so there is a vested interest in not making too much change. Furthermore, in some cases there is a unanimous voting requirement, which makes it easy to veto something that impacts one particular agency but benefits many others.
‘Competence…’
This will be expanded in the section below relating to information systems as they are highly interdependent.
What safety information systems do we have in the diving community?
Training agencies each have their own quality assurance/control/management systems, with varying levels of oversight. This oversight is determined by the questions they ask, the feedback they receive, and the actions they take. These are closed systems based around compliance with the standards set by the agency—sometimes standards that are not available to be viewed by the students during or after their class! Research has been carried out on some of this quality data, but it appears to have focused on the wrong part; e.g., in 2018, a paper was published by Shreeves et al. which looked at violations outside the training environment involving 122 diving fatalities. While the data would have been available, a corresponding research project involving fatalities inside the training environment was not completed (or, if it was, it wasn't published in the academic literature).
As the ex-head of Quality Control of a training agency, I would have been more interested in what happened inside my agency’s training operations than what occurred outside, not from a retributive perspective, but to understand how the systemic failures were occurring. I also understand that undertaking such research would mean it would be open for ‘legal discovery’, and likely lead to the organisation facing criticism if a punitive approach was taken rather than a restorative one.
Safety organisations like Divers Alert Network collect incident data, but their primary focus is on quantitative data (numbers and types of incidents), not narrative or qualitative data—and it is the latter that helps learning, because we can relate to it. The British Sub Aqua Club produces an annual report, but there is very limited analysis of the reported data, and there does not appear to be any attempt to look at contributory or influential factors when categorising events. The report lists events by the most serious outcome and not by the factors which may have influenced or contributed to the event; e.g., a serious DCI event could have been caused by a rapid ascent, following an out-of-gas situation, preceded by a buddy separation and inadequate planning. The learning is in the contributory factors, not in the outcome. In fairness, this is because the organisations do not have to undertake more detailed investigations, and because the information isn't contained in the submitted reports.
Research from 2006 has shown that management in organisations often want quantitative data, whereas practitioners want narrative data about what happened, how it made sense, and what can be done to improve the situation. Statistical data in the diving domain regarding safety performance and the effectiveness of interventions, e.g., changes to the number of fatalities or buoyancy issues, is of poor quality and should not be relied upon to draw significant conclusions.

What is required to populate these systems?
There are several elements needed to support a safety information system.
- Learning-focused ‘investigations’.
- Competent ‘investigators’.
- Confidential and collaborative information management and dissemination systems.
- Social constructs that allow context-rich narratives to be told.
Learning-focused 'investigations'. The diving industry does not have a structured or formal investigation or learning process, relying instead on law-enforcement and legal investigations. Consequently, investigations are not focused on learning; rather, they are about attributing blame and non-compliance. As Sidney Dekker said, "you can learn or blame; you can't do both." The evidence that could be used to improve learning—standards deviations, time pressures, adaptations, poor or inadequate rules, incompetence, and distractions—is the same data that a prosecution would want in order to hold people accountable. Rarely does the context come to the fore, and it is context that shapes the potential learning opportunities. "We cannot change the human condition, but we can change the conditions in which humans work." (James Reason). Rather than asking 'why did that happen' or even 'who was to blame', we need to move to 'how did it make sense for them to do what they did'. 'Why' asks for a justification of the status quo; 'how' looks at the behaviour and the context, not the individual.
Competent 'investigators'. As there isn't any training in the diving domain for undertaking a learning-focused investigation, we shouldn't be surprised that investigations focus on the individual's errant behaviour. Even those 'investigations' undertaken by bodies like DAN, the NSS-CDS Accident Committee, or the BSAC do not involve individuals who have undertaken formal training in investigation processes or investigation tools. A comprehensive learning review is not quick, so who is going to pay for it? It is much easier to deflect the blame to an individual 'at the sharp end' than to look further up the tree where systemic and cultural issues reside. The education process for learning-focused investigations starts with understanding human error and human factors. The Essentials class, the 10-week programme, and the face-to-face programmes provide this initial insight, but the uptake across the industry, at a leadership level, is almost non-existent. Four free workshops are planned for Rebreather Forum 4.0 to help address this.
Confidential information management system. Currently, no system allows the storage of context-rich diving incident data outside the law-enforcement or legal system in a manner that can be used for learning. After discussions with senior training agency staff, it appears that as little as possible is written down following an incident. When it is, it is shared with the attorney to enable the ‘attorney-client’ privilege to be invoked and protected from discovery. If internal communications occur via voice, then the potential learning is retained in the heads of those involved but will fade over time. Furthermore, if they leave that role or organisation, then the information is almost guaranteed to be lost.
Social Constructs: Two interdependent elements are needed to support learning: psychological safety and a “Just Culture.” With the former, the majority of modern research strongly suggests that it is the presence of psychological safety that allows organisations to develop and learn (Edmondson, 1999). Edmondson describes numerous case studies where organisational and team performance was improved because incidents, problems, and near-misses were reported. Paradoxically, the more reports of failure, the greater the learning. It was not because the teams were incompetent; they wanted to share the learning and realised that they could get better faster with rapid feedback. They also knew that they wouldn’t be punished because psychological safety is about taking an interpersonal risk without fear of retribution or reprisal – this could be speaking up, it could be challenging the status quo, it could be saying “I don’t know”, or it could be about trying something new and coming up with an unexpected outcome.
The second requirement is a Just Culture, which recognises that everyone is fallible, irrespective of experience, knowledge, and skills. This fallibility includes when rules are broken too, although sabotage and gross negligence (a legal term) are exceptions. Neither a Just Culture nor psychological safety is widely visible in the diving industry, although some pockets are present. To support psychological safety (proactive/prospective) and a Just Culture (reactive), there is a need for strong, demonstrable leadership:
- Leaders who have integrity – they walk the talk.
- Leaders who show vulnerability – talking about their own mistakes, including the context and drivers; leaders who want to look at organisational issues inside their own organisation, not just point fingers at others' problems.
- Leaders who recognise that human error is only the starting point to understand something going wrong, not the end.
‘…the will to implement major reforms…’
This is probably the hardest part because learning involves change, and change is hard. It costs cognitive effort, time, and money, and it impacts commercial viability because of the need to generate new materials, to educate instructor trainers, instructors, and divers about the change, and to do so in multiple languages. Unless there is major external pressure, e.g., insurance companies threatening to withdraw support, things are unlikely to change, because not enough people die in a single diving event to trigger an emotional response for change. For example, in the General Aviation sector in the US, approximately 350 people die each year; if those deaths happened in airliners, each carrying 150 or so passengers, it would mean two to three crashes per year, and that would be considered unacceptable.
In 2022, more than 179 people died diving in the US (personal communication with DAN).
The most radical changes happen when double-loop learning is applied.
NASA did not learn from the Challenger disaster because it focused on single-loop learning, and when Columbia was lost, the investigation unearthed a lack of organisational learning, i.e., double-loop learning. Chapter 8 of the Columbia Accident Investigation Board report provides many parallels with the diving industry. The recent changes to PADI drysuit training standards following a fatal dive on a training course provide an example of single-loop learning: fix the 'broken instructor' and clarify course training requirements. The double-loop approach would be to look at self-certification and the wider quality management across the agency/industry; however, such an approach has significant commercial disadvantages across the board.

Creating a Learning Culture
The previous paragraphs talk about many of the issues we’ve got, but how do we improve things?
- Move to using language that is learning-based, not 'knowing'-based. This video from Crista Vesel covers the topic relatively quickly. This includes not using counterfactuals (could have, should have, would have, failed to…), which are informed by hindsight bias. Fundamentally, counterfactuals tell a story about something that never happened.
- Look to local rationality rather than judging others. Move from who (is to blame) and ‘why did you do that?’, to ‘how did it make sense for you to do that?’. Separate the individual from the actions/behaviours and stop applying the fundamental attribution bias where we believe the failure is due to an individual issue rather than the context.
- Look to break down the barriers between the silos and share information. Ultimately, the stakeholders within the diving community should be looking to create a safe diving environment. Throwing rocks and stones at each other for ‘incompetence’ is not going to help.
- Adopt the Five Principles of Human and Organisational Performance as outlined in this blog.
- Build ‘If Only…’ or something produced for the recreational market, into training programmes at the instructor trainer, instructor, and diver level. This way the culture can slowly change by telling context-rich stories that have ‘stickiness’. However, this requires a fundamental shift in terms of how stories are told and how risk is portrayed in the diving industry.
- Finally, recognise we are all fallible. Until we accept that all divers are fallible and are trying to do the best they can, with the knowledge they have, the money they have, the resources they have, the skills they’ve acquired, and the drivers and goals they are facing, then we are unlikely to move forward from where we are, and we’ll keep choosing the easy answer: ‘diver error’.
DIVE DEEPER
InDEPTH: Examining Early Technical Diving Deaths: The aquaCORPS Incident Reports (1992-1996) by Michael Menduno
InDEPTH: The Case for an Independent Investigation & Testing Laboratory by John Clarke

Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 450 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture.