
Diving Safety

Do ‘Bad Apples’ Actually Exist?

Human factors coach Gareth Lock explores and analyses the theory of “bad apples” as it applies to dive training. Are system failures the fault of individual agents or are they a result of the system in which they operate, or both? Lock’s answers and potential remedies will enlighten—he gets the core.



By Gareth Lock. Header image by Julian Mühlenhaus. Photos courtesy of Gareth Lock unless noted.

A diving instructor delivered a class in which one of the students misconfigured their equipment before entering the water. This misconfiguration led to hypoxia and the student’s subsequent drowning.

A boat caught fire during the night because of allegedly overloaded electrical circuits, and then 34 people died when they couldn’t escape the boat, despite the boat being accepted as seaworthy and general specifications being adhered to.

A diving instructor and two students died while undertaking a dive in a submerged mine system when visibility was reduced and they couldn’t follow the guideline back to the surface.

Each one of these cases is real, and each will trigger a strong emotional response because we see, after the fact, that this event was likely to happen given the conditions, actions, and decisions at the time. As a rule, we have a need to assign blame for tragedies, especially when there is loss of life, and even more so when that loss strikes home. There is an impulse in us to hold someone accountable against a recognised standard of best practice or good behaviour. And yet, at the same time, we know that those involved did not suspect they were going to die; otherwise, they would have done something to prevent it.

We have some degree of confidence that the participants in these scenarios were balancing acceptable performance (safety), financial viability (affordability), and workload (which refers to resources and people rather than physical ability), and that they were managing risk and uncertainty and weighing them against the associated rewards. This tension exists in pretty much everything we do and, in the main, we are pretty good at managing it.

Within a system that goes right more often than wrong, bad events stick out like a sore thumb. Furthermore, when the adverse event is serious and ‘obvious’, we judge more harshly. We often believe that by getting rid of these ‘bad apples’ we can make the system safer. But as Sidney Dekker says, “Reprimanding bad apples may seem like a quick and rewarding fix, but it’s like peeing in your pants. You feel relieved and perhaps even nice and warm for a little while, but then it gets cold and uncomfortable. And you look like a fool.”

These negative or critical responses stem from well-known cognitive biases: severity bias, outcome bias, and the fundamental attribution error. What we often forget, though, is that our behaviour is a product of the system we operate in, and the choices we make in the here and now are influenced by past experiences (good and bad) and by the rewards (kudos, money, social media recognition) or punishments (disciplinary action, financial hardship, social media retribution) we face or perceive we face. The more we look into an event, the more we realise how important context is in understanding how it made sense for someone to do what they did, or what conditions were present that increased the likelihood of the adverse event occurring.

So, where do ‘bad apples’ come from?

The term ‘bad apples’ comes from the idea that one bad apple will spoil the barrel, so we should get rid of it before we lose the harvest. Technically, all apples eventually spoil through natural decay, even when isolated from other apples. The term has since been applied to humans in social and organisational contexts, where the worry is that a single individual will corrupt those around them (relative to a given standard), performance will deteriorate, or the company will go bust through a lack of financial viability. In the safety world, ‘bad apple theory’ states that your inherently safe system remains safe as long as you get rid of the bad apples. There is a certain irony to these ideas.

There have been numerous papers looking at bad apples across multiple domains (police corruption, financial corruption, and healthcare organisations) and their impact on safety and performance. In each case, the research recognised that those individuals did not start off as ‘bad apples’; they started as well-intentioned, sufficiently trained individuals who were gradually absorbed into the culture of the system in which they were immersed.

Many believe that individuals still need to be held accountable for their actions, and individual bad apples haven’t been ignored in the research; e.g., Shojania and Dixon-Woods showed that 3% of doctors accounted for 49% of patient complaints, and 1% accounted for a quarter of them. However, that paper also recognised that those individuals were the product of a failed system. At some point, disciplinary action may be needed, but it should only follow a learning-based investigation and not be the first tool out of the box.

The research recognises that it wasn’t so much that the apples were bad, but that the barrel in which they were stored did not have systems in place to stop the decay from developing. As such, we need to look more at the barrel than at the apples if we want to improve safety and performance in diving. As Professor James Reason said, “We cannot change the human condition, but we can change the conditions under which humans work.” Fundamentally, if the same ‘errors’ or ‘deviations’ keep happening at an individual level, it is likely to be a system problem, not an individual one.

Decay (drift) is normal

As divers, and as humans, we all have a tendency to drift. We want to find more efficient or effective ways of getting the dive done or completing the class, given the local pressures we face to achieve the goals we’ve set, based on previous outcomes (right or wrong). Our adherence to the rules depends on multiple factors: how easy the rules are to comply with, fear of social non-compliance, fear of litigation, financial constraints, who wrote the rules, how much value the rules hold, whether we will get caught, and what the worst thing that can happen is.

The context is driving the behaviour. However, we should be creating an environment where adherence happens for two reasons – the rules match the environment, and those involved want to comply because they understand the value, risks, and bigger picture. Adherence should not be because those involved have to comply and are fearful of the consequences. This can lead to gaming of the system and misplaced motivation.

The normalisation of deviance, the normalisation of risk, and practical drift (see the Normalisation sidebar below) are all terms used to describe the slow movement away from standards that have been decided upon, written down, and published, potentially without recognition of the developing gap. Consequently, feedback is required to identify the deviations and provide corrections back to the standard, or perhaps even to change the standard. At the training agency level, this should happen via the QC/QA processes, where the student provides some form of feedback to the agency about the standard of teaching and whether skills were taught or not. The problem is that unless the performance standards are available and briefed prior to the class, students don’t know what they don’t know and won’t be able to spot drift. Therefore, unless there is a major deviation (by which point significant drift will already have occurred), drift is hard to spot.


Normalisation of Deviance is “when people within an organization become so insensitive to deviant practice that it no longer feels wrong.” Studying the Challenger disaster, Vaughan “had found no evidence of rule violation and misconduct by individuals”; instead, the key to the acceptance of risk was what she called “the normalisation of deviance.” Normalisation of deviance is not about breaking rules; it is a social construct based on standards that are gradually eroded. This happens because output takes priority, and the secrecy associated with discrete or siloed operations means that other stakeholders don’t know, or can’t know, what else is going on. In diving, this could be the gradual reduction in the hours needed to undertake training in order to gain a competitive advantage, or the acceptance that deviations are happening and, as long as they don’t end in serious injury or fatality, they are okay. See: Vaughan, Diane, The Challenger Launch Decision, University of Chicago Press, Kindle Edition, Location 86.

Normalisation of Risk is the gradual process through which risky/dangerous practices or conditions become acceptable over time. In diving, these can be: reducing the amount of gas remaining at the end of a dive because nothing has gone wrong, using CCR cells beyond 12 months of manufacture because they still work, or not having a continuous line to the surface while cave diving. See: ‘Shit Happens’: The Selling of Risk in Extreme Sport

Practical Drift comes from the work of Scott Snook examining the Black Hawk shootdown in April 1994. “Practical drift is the slow steady uncoupling of local practice from written procedure. It is this structural tendency for subunits to drift away from globally synchronized rule-based logics of action toward locally determined task-based procedures that places complex organizations at risk.” See: Snook, Scott A., Friendly Fire (p. 24), Princeton University Press, Kindle Edition.

Instructors, are you a “Bad Apple”? Check the results of our survey that we conducted with the Business of Diving Institute: Bad Apple Survey

Another dimension is the constant fear that, if the student provides critical feedback to the agency about the instructor, the instructor will behave badly toward them in the future. As such, the student offers platitudes that add no value to learning. The problem is replicated at the Instructor Trainer (IT) and Course Director (CD) level. ITs and CDs drift because they are human too, but the consequences are more serious, as their deviations become more widespread.

Photo by Sean Romanowski

Some agencies deal with individual instructor drift by undertaking a regular check of the instructor’s performance in a live class, or by recommending co-teaching sessions where drift or variability in performance can be identified and corrected. However, we have to be careful of the ‘observer effect’ or ‘Hawthorne effect’ as well as the possibility of individuals ‘playing the game’ to pass, meaning they know how to adhere to standards but choose to cut corners when not being observed.

At GUE [Ed. note: G. Lock is on GUE’s Quality Control board], we have worked hard over the last four to five years to change the perception of feedback within the QC forms from something to be feared into something that is rewarding, especially when low scores or critical comments come in. I remember contacting an instructor from another agency, who had completed their internship and taught one of their early classes, because of a comment in a form. I wanted to understand the background behind the comment, with a view to them self-improving or getting them support.

The instructor’s initial response to me was very defensive, which confused me. Afterwards, they explained the reason for their reaction was that in the past the only reason QC contacted an instructor was because there was likely to be a lawsuit inbound, and so everything had to be documented as per the standards! 

If the diving industry and training agencies (barrels) want divers and diving instructors (apples) to improve, they need to provide an environment where variability in performance is visible, recognised, and not hidden. This means that there is a need for psychological safety to speak up before something happens, and a Just Culture so learning from adverse events can happen. 

Unfortunately, this need is not helped by litigation and the discovery process, where anything written down can be demanded in the case of a lawsuit. For example, at the top of one of the major training agency’s incident report forms, it says, “This form is being prepared in the event of litigation.” This guidance is not likely to help anyone understand how it made sense for people to do what they did, especially if they were deviating from standards to achieve certain goals. If it isn’t written down, then it didn’t ‘exist’ and therefore can’t be produced! However, the lack of documentation makes it difficult or even impossible to detect drift. Furthermore, the lack of clear and coherent standards across the industry—and the limited visibility of these—means that it is harder to spot drift developing. Fundamentally, what acceptable standards are you drifting from?

Understanding one of the ‘bad apples’ above

The following list looks at the conditions surrounding the first event and shows how variability at multiple levels caused this tragic event.

  • A number of instructors had filed complaints to their agency about the instructor involved. It is not clear what the agency did about these complaints, as nothing appeared to change in terms of the instructor’s behaviour. 
  • Students did not have easy access to the agency’s standards and, when located, said standards were difficult to understand and contained contradictions. 
  • The agency HQ staff (as with most agencies) was very small and so had limited opportunities to undertake QC checks. 
  • The financial margins in dive training are small, so efficiencies are sought. When instructors hold training certifications for multiple units, proficiency on any one unit cannot be as high as with specialisation; however, multiple unit certifications increase an instructor’s teaching opportunities in a limited market. 
  • The bespoke class was based on combining multiple classes and didn’t formally exist in the manner it was being taught. 
  • The class schedule was constantly changing due to the availability of staff and students.
  • The urgency to complete the task was driven by financial pressures for fear of handing grants back. 
  • There appeared to be a perception that photographic media was needed for the shop that the deceased student worked at, and the instructor was the manager/owner of that shop. 
  • The students in the class did not feel that they could challenge the developing situation on the boat, likely for reasons of social conformance and culture. 
  • There was no ‘team’ on the boat, with the perception of four students plus an instructor, rather than a learning team working together around a common purpose. 
  • There had been a very similar error made by the student on a previous dive two weeks prior, and this didn’t appear to have been picked up by the diver or dive team.

If we look at Snook’s definition of ‘Practical Drift’ (See Normalisation side box) we can see that over time each of the different parts of the system gradually drifted away from a standard and there was no effective check in place to bring those involved back to the expected, and possibly unclear, standards.

Opportunities for change

The following provides some opportunities for improvement.

  • If the financial viability of your dive business is struggling, you have two choices: you can cut corners to be more cost efficient, or you can fold and find another job. The problem with cutting corners is that you don’t know where the ‘accident line’ is until you step over it. Get an external view of how you teach and which margins are being eroded, and listen to that feedback. It might save you a significant amount in the future.
  • If you are a lone instructor, and you do not have any form of checking performance and don’t co-teach with others, then you will very likely drift. You need to be proactive in arresting this drift by involving others and accepting feedback from them. Own the likelihood of drift.
  • If you are an agency and your instructor trainers are not checked on a regular basis, do not be surprised when your instructors do not perform to the standards they should be following. When drift or deviations occur at the top, e.g., among ITs and CDs, the impact at the lower levels is magnified. There is a need to create a psychologically safe environment so that feedback is expected, can be provided, and is then shared amongst other instructors. This change starts with leaders.


There are two ways of looking at the question, “Do bad apples exist?”, in the diving industry (and in life in general). One answer is “No, they don’t exist,” because everyone has the potential to become a ‘bad apple’ given the context in which they are operating. The other is “Yes, because at some point good apples turn bad” – but the reason they turn bad is the system.

Learning comes about through exploring boundaries, making ‘errors’, and reflecting on them afterwards. The ability to learn from our own and others’ mistakes – looking not just at outcomes but also at local rationality – is relatively immature in the diving industry. We only have to look at social media to see the conflict and judgment that erupt when an adverse event is made public. Unfortunately, we have an innate bias to look for individual fault rather than systemic weakness, and we often ignore the context driving those behaviours. This is especially true in the US, with a litigious culture that looks to blame and sue rather than learn and understand. There is a long journey ahead to improve the orchard and the barrels, but we will get there more quickly if we stop focusing on the ‘bad apples’.


Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 350 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture. 


Diving Safety

Does The Sport Diving Community Learn from Accidents?

Do we learn from accidents as a diving culture and, as a result, take the actions, where needed, to improve divers’ safety? Though we might like to think that’s the case, the reality is more complicated as human factors coach Gareth Lock explains in some detail. Lock offers a broad six-point plan to help the community boost its learning chops. We gave him an A for effort. See what you think.




by Gareth Lock

Learning is the ability to observe and reflect on previous actions and behaviours, and then modify or change future behaviours or actions either to get a different result or to reinforce the current behaviours. It can be single-loop, whereby we focus only on the immediate actions and change those, e.g., providing metrics for buoyancy control during a training course; or double-loop, where the underlying assumptions are questioned, e.g., are we teaching instructors how to teach buoyancy and trim correctly? The latter has a greater impact but takes more time and, more importantly, requires a different perspective. Culture is ‘the way things are done around here’ and is made up of many different elements, as shown in this image from Rob Long. A learning culture is a subset of a wider safety culture.

Regarding safety culture: in 2022 I wrote a piece for InDEPTH, “Can We Create A Safety Culture In Diving? Probably Not, Here’s Why,” about whether the diving industry could have a mature safety culture, and concluded that it probably couldn’t for several reasons:

  • First, ‘safe’ means different things to different people, especially when we are operating in an inherently hazardous environment. Recreational, technical, cave, CCR and wreck diving all have different types and severities of hazards, and there are varying levels of perception and acceptance of risk. The ultimate realisation of risk, death, was only acknowledged in the last couple of years by a major training agency in their training materials. Yet it is something that can happen on ANY dive.
  • Second, given the loose training standards, multiple agencies, and instructors teaching for multiple agencies, organisational influence across the industry is diffuse, which makes it hard to change the compliance focus that is in place. From the outside looking in, there needs to be more evidence of leadership around operational safety, as opposed to compliance-based safety, e.g., ensuring that standards are adhered to even when they conflict or are unclear. This appears to be more acute when agencies have regional licensees who may not be active diving instructors and are focused on revenue generation rather than the maintenance of skilled instructors. There is very little, if any, evidence that leadership skills, traits, or behaviours are taught anywhere in the diving industry as part of formal agency staff or professional development processes. This impacts what happens in terms of safety culture development.
  • Finally, the focus on standards and rules aligns with the lowest level of the recognised safety culture models – the ‘Pathological’ level in Hudson’s model. Rules and standards do not create safety. Rules facilitate the discussion around what is acceptably safe, but they rarely consider the context surrounding the activities at the sharp end, i.e., dive centres and diving instructors and how they manage their businesses. These are grey areas. There is a difference between ‘Work as Imagined’ and ‘Work as Done’, and individual instructors and dive centre managers must both ‘complete the design’, because the manuals and guides are generic, and manage the tension between safety, financial pressures, and people (or other resources) to maintain a viable business. Fundamentally, people create safety not through blind adherence to rules, but by developing knowledge, reflecting on their experiences, and then sharing that knowledge with others so that they, too, may learn and not have to make the same mistakes themselves.

The preceding discussion brings us to the main questions of this article: does the diving industry have a learning culture, and what is needed to support one?

What is a learning culture?

In the context of ‘safe’ diving operations, a learning culture could be defined as “the willingness and the competence to draw the right conclusions from its safety information system, and the will to implement major reforms when their need is indicated.” (Reason, 1997). Here we have a problem! 


The industry is based around siloed operations: equipment manufacturers, training agencies, dive centres/operations, and individual instructors. Adopting a genuine learning approach means that the barriers must be broken down and conversations must happen between and within the silos. This is very difficult because of the commercial pressures present. The consumer market is small, and many agencies and equipment manufacturers are competing for the same divers and instructors. Agencies and manufacturers also have competing goals. Agencies want to maximise the number of dive centres/instructors to generate revenue, and one way of doing that is to maximise the number of courses available and the number of courses an individual instructor can teach, e.g., different types of CCR units. Manufacturers don’t want the reputational risk of their equipment/CCR being involved in a fatal diving accident, but they also want to maximise their return on investment by making it available to multiple agencies and instructors. The higher-level bodies (WRSTC, RTC, and RESA) are made up of the very agencies and manufacturers that will inherit the standards set, so there is a vested interest in not making too much change. Furthermore, in some cases there is a unanimous voting requirement, which makes it easy to veto something that impacts one particular agency but benefits many others.


This will be expanded in the section below relating to information systems as they are highly interdependent.

What safety information systems do we have in the diving community?

Training agencies each have their own quality assurance/control/management systems, with varying levels of oversight. This oversight is determined by the questions they ask, the feedback they receive, and the actions they take. These are closed systems, based around compliance with the standards set by the agency – sometimes standards that are not available for students to view during or after their class! Research has been carried out on some of this quality data, but it appears to have focused on the wrong part: e.g., in 2018, Shreeves et al. published a paper that examined violations outside the training environment in 122 diving fatalities. While the data would have been available, a corresponding research project on fatalities inside the training environment was not completed (or, if it was, it was not published in the academic literature).

As the ex-head of Quality Control of a training agency, I would have been more interested in what happened inside my agency’s training operations than what occurred outside, not from a retributive perspective, but to understand how the systemic failures were occurring. I also understand that undertaking such research would mean it would be open for ‘legal discovery’, and likely lead to the organisation facing criticism if a punitive approach was taken rather than a restorative one.

Safety organisations like Divers Alert Network collect incident data, but their primary focus is on quantitative data (numbers and types of incidents), not narrative or qualitative data – and it is the latter that helps learning, because we can relate to it. The British Sub Aqua Club produces an annual report, but there is very limited analysis of the reported data, and there does not appear to be any attempt to look at contributory or influential factors when categorising events. The report lists events based on the most serious outcome and not on the factors that may have influenced or contributed to the event; e.g., a serious DCI event could have been caused by a rapid ascent, following an out-of-gas situation, preceded by a buddy separation and inadequate planning. The learning is in the contributory factors, not in the outcome. In fairness, this is because the organisations do not have to undertake more detailed investigations, and because the information isn’t contained in the submitted reports.

Research from 2006 has shown that management in organisations often want quantitative data, whereas practitioners want narrative data about what happened, how it made sense, and what can be done to improve the situation. Statistical data in the diving domain regarding safety performance and the effectiveness of interventions e.g., changes to the number of fatalities or buoyancy issues is of poor quality and should not be relied upon to draw significant conclusions.

What is required to populate these systems?

There are several elements needed to support a safety information system.

  • Learning-focused ‘investigations’.
  • Competent ‘investigators’.
  • Confidential and collaborative information management and dissemination systems.
  • Social constructs that allow context-rich narratives to be told.

Learning-focused ‘investigations’. The diving industry does not have a structured or formal investigation or learning process, instead relying on law-enforcement and legal investigations. Consequently, investigations are not focused on learning; they are about attributing blame and non-compliance. As Sidney Dekker said, “you can learn or blame; you can’t do both.” The evidence that could be used to improve learning – standards deviations, time pressures, adaptations, poor or inadequate rules, incompetence, distractions… – is the same data a prosecution would want in order to hold people accountable. Rarely does context come to the fore, and it is context that shapes the potential learning opportunities. “We cannot change the human condition, but we can change the conditions in which humans work.” (James Reason). Rather than asking ‘why did that happen’, or even ‘who was to blame’, we need to move to ‘how did it make sense for them to do what they did’. ‘Why’ asks for a justification of the status quo; ‘how’ looks at the behaviour and the context, not the individual.

Competent ‘investigators’. As there isn’t any training in the diving domain for undertaking a learning-focused investigation, we shouldn’t be surprised that investigations focus on an individual’s errant behaviour. Even those ‘investigations’ undertaken by bodies like DAN, the NSS-CDS Accident Committee, or the BSAC do not involve individuals with formal training in investigation processes or tools. A comprehensive learning review is not quick, so who is going to pay for it? It is much easier to deflect blame onto an individual ‘at the sharp end’ than to look further up the tree, where systemic and cultural issues reside. The education process for learning-focused investigations starts with understanding human error and human factors. The Essentials class, the 10-week programme, and face-to-face programmes provide this initial insight, but uptake across the industry, at a leadership level, is almost non-existent. Four free workshops are planned for Rebreather Forum 4.0 to help address this.

Confidential information management system. Currently, no system allows the storage of context-rich diving incident data outside the law-enforcement or legal system in a manner that can be used for learning. After discussions with senior training agency staff, it appears that as little as possible is written down following an incident. When it is, it is shared with the attorney to enable the ‘attorney-client’ privilege to be invoked and protected from discovery. If internal communications occur via voice, then the potential learning is retained in the heads of those involved but will fade over time. Furthermore, if they leave that role or organisation, then the information is almost guaranteed to be lost.

Social Constructs: Two interdependent elements are needed to support learning: psychological safety and a “Just Culture.” With the former, the majority of modern research strongly suggests that it is the presence of psychological safety that allows organisations to develop and learn (Edmondson, 1999). Edmondson describes numerous case studies where organisational and team performance was improved because incidents, problems, and near-misses were reported. Paradoxically, the more reports of failure, the greater the learning. It was not because the teams were incompetent; they wanted to share the learning and realised that they could get better faster with rapid feedback. They also knew that they wouldn’t be punished because psychological safety is about taking an interpersonal risk without fear of retribution or reprisal – this could be speaking up, it could be challenging the status quo, it could be saying “I don’t know”, or it could be about trying something new and coming up with an unexpected outcome.

The second requirement is a Just Culture, which recognises that everyone is fallible, irrespective of experience, knowledge, and skills. This fallibility extends to rule-breaking too, although sabotage and gross negligence (a legal term) are exceptions. Neither a Just Culture nor psychological safety is widely visible in the diving industry, although some pockets are present. To support psychological safety (proactive/prospective) and a Just Culture (reactive), there is a need for strong, demonstrable leadership:

  • Leaders who have integrity – they walk the talk.
  • Leaders who show vulnerability – talking about their own mistakes, including the context and drivers; leaders who want to look at organisational issues inside their own organisation, not just point fingers at others’ problems.
  • Leaders who recognise that human error is only the starting point to understand something going wrong, not the end.

‘…the will to implement major reforms…’

This is probably the hardest part because learning involves change, and change is hard. It costs cognitive effort, time, and money, and it has an impact on commercial viability because of the need to generate new materials and to educate instructor trainers, instructors, and divers about the change – in multiple languages. Unless there is major external pressure, e.g., insurance companies threatening to withdraw support, things are unlikely to change because not enough people die in any single event to trigger an emotional response for change. For example, in the General Aviation sector in the US, approximately 350 people die each year; if those deaths happened in airliners, it would mean two to three crashes per year, which would be considered unacceptable.

In 2022, more than 179 people died while diving in the US (personal communication with DAN).

The most radical changes happen when double-loop learning is applied.

NASA did not learn from the Challenger disaster because it focused on single-loop learning, and when Columbia was lost, the investigation unearthed a lack of organisational learning, i.e., double-loop learning. Chapter 8 of the Columbia Accident Investigation Board report provides many parallels with the diving industry. The recent changes to PADI drysuit training standards following a fatality on a training course provide an example of single-loop learning: fix the ‘broken instructor’ and clarify course training requirements. A double-loop approach would look at self-certification and the wider quality management across the agency/industry; however, such an approach has significant commercial disadvantages across the board.

Creating a Learning Culture

The previous paragraphs describe many of the issues we have, but how do we improve things?

  1. Move to using language that is learning-based, not ‘knowing’-based. This video from Crista Vesel covers the topic relatively quickly. This includes not using counterfactuals (could have, should have, would have, failed to…), which are informed by hindsight bias. Fundamentally, counterfactuals tell a story that didn’t exist.
  2. Look to local rationality rather than judging others. Move from ‘who is to blame?’ and ‘why did you do that?’ to ‘how did it make sense for you to do that?’. Separate the individual from the actions/behaviours, and stop applying the fundamental attribution error, where we believe a failure is due to the individual rather than the context.
  3. Break down the barriers between the silos and share information. Ultimately, the stakeholders within the diving community should be looking to create a safe diving environment. Throwing stones at each other for ‘incompetence’ is not going to help.
  4. Adopt the Five Principles of Human and Organisational Performance as outlined in this blog.
  5. Build ‘If Only…’, or something produced for the recreational market, into training programmes at the instructor trainer, instructor, and diver levels. This way the culture can slowly change by telling context-rich stories that have ‘stickiness’. However, this requires a fundamental shift in how stories are told and how risk is portrayed in the diving industry.
  6. Finally, recognise that we are all fallible. Until we accept that all divers are fallible and are trying to do the best they can – with the knowledge, money, resources, and skills they have, and the drivers and goals they are facing – we are unlikely to move forward from where we are, and we’ll keep choosing the easy answer: ‘diver error’.


InDEPTH: Examining Early Technical Diving Deaths: The aquaCORPS Incident Reports (1992-1996) by Michael Menduno

InDEPTH: The Case for an Independent Investigation & Testing Laboratory by John Clarke

Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles, which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 450 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of human factors and a Just Culture.
