Header image by Derk Remmers
Not A Theory — A Fact! How NAUITEC Manages Isobaric Counter Diffusion
by Daniel Millikovsky
There is some confusion in the technical diving community as to whether isobaric counterdiffusion should be taken into account when planning gas switches, particularly on ascent. Here are some of the basics of this topic and how NAUI’s technical division, NAUITEC, has addressed the matter in training and diving operations since 1997.
Fact: Isobaric counterdiffusion is a real gas transport mechanism. We need to pay attention to it in mixed gas diving.
Fiction: Isobaric counterdiffusion is a theoretical laboratory concept and doesn’t affect divers at all.
From NAUI Technical Diver (textbook):
Isobaric counterdiffusion (ICD) describes a real gas transport mechanism in the blood and tissues of divers using helium and nitrogen. It is not just a theoretical concoction, and it has important implications for tech diving. A consequence of basic physical law, it was first observed in laboratory bubble experiments by Kunkle and Strauss, was first studied in divers by Lambertsen and Idicula, has been extensively reported in medical and physiology journals, and is accepted by the decompression science community worldwide.
Isobaric means “equal pressure.” Counterdiffusion means two or more gases diffusing in opposite directions. For divers, the gases concerned are the inert gases nitrogen and helium and not metabolic gases like oxygen, carbon dioxide, water vapor, or trace gases in the atmosphere. Specifically, ICD during mixed gas diving operations concerns the two inert gases moving in opposite directions under equal ambient pressure in tissues and blood. In order to understand this, we have to consider their relative diffusion speeds. Lighter gases diffuse faster than heavier gases. In fact, helium (He) is seven times lighter than nitrogen (N2) and diffuses 2.65 times faster.
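The 2.65 figure quoted above follows from Graham’s law of diffusion, under which diffusion rate scales with the inverse square root of molar mass. A minimal sketch of that arithmetic:

```python
# Sketch: Graham's law reproduces the relative diffusion rate quoted above.
# Diffusion rate is proportional to 1/sqrt(molar mass).
import math

M_HE = 4.0026    # molar mass of helium, g/mol
M_N2 = 28.0134   # molar mass of nitrogen (N2), g/mol

ratio = math.sqrt(M_N2 / M_HE)
print(round(ratio, 2))  # 2.65 -- helium diffuses ~2.65x faster than nitrogen
```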
If a diver has nitrogen-loaded tissue, and if their blood is loaded with helium, this will result in greater total gas loading because helium will diffuse into tissue and blood faster than nitrogen diffuses out, resulting in increased inert gas tensions. Conversely, if a diver has helium-loaded tissues, and their blood is loaded with nitrogen, this will produce the opposite effect: Helium will off-gas faster than nitrogen on-gases, and total inert gas tensions will be lower. This last case is what we can call in decompression planning a “Good ICD,” but we need to choose the fractions of N2 wisely on ascent.
Also, Doolette and Mitchell’s study of Inner Ear Decompression Sickness (IEDCS) shows that the inner ear may not be well-modelled by common (e.g. Bühlmann) algorithms. Doolette and Mitchell propose that a switch from a helium-rich mix to a nitrogen-rich mix, as is common in technical diving when switching from trimix to nitrox on ascent, may cause a transient supersaturation of inert gas within the inner ear and result in IEDCS. They suggest that breathing-gas switches from helium-rich to nitrogen-rich mixtures should be carefully scheduled either deep (with due consideration to nitrogen narcosis) or shallow to avoid the period of maximum supersaturation resulting from the decompression. Switches should also be made during breathing of the largest inspired oxygen partial pressure that can be safely tolerated with due consideration to oxygen toxicity.
In the case of dry suits filled with light gases while breathing heavier gases, the skin lesions resulting are a surface effect, and the symptomatology is termed “subcutaneous ICD.” Bubbles resulting from heavy-to-light breathing gas switches are called “deep-tissue ICD,” obviously not a surface-skin phenomenon. The bottom line is simple: don’t fill your exposure suits with a lighter gas than you are breathing and avoid heavy-to-light gas switches on a deco line. In both cases, the risk of bubbling increases with exposure time.
More simply, light-to-heavy gas procedures reduce gas loading, while heavy-to-light procedures increase it. Note, however, that none of these counter-transport issues come into play when diving a closed circuit rebreather.
The NAUITEC Way
ICD is not scientific theory, it is fact. Understanding and avoiding ICD is the way to reduce bubble formation and an increased risk of DCS, and to allow for a more efficient decompression practice in the long term.
Deep trimix dives require a high helium and low nitrogen mix [Note that NAUITEC mandates an equivalent narcotic depth (END) of 30 m, similar to Global Underwater Explorers (GUE)]. NAUITEC takes a hierarchical approach to trimix decompression based on risk reduction.
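The 30 m END limit can be checked with simple arithmetic. A minimal sketch, assuming the common convention that oxygen is counted as narcotic, so END depends only on the helium fraction:

```python
# Sketch: equivalent narcotic depth (END) for a trimix fill, in metres
# of seawater. Assumes oxygen is counted as narcotic, i.e.
#   END = (depth + 10) * (1 - f_he) - 10

def end_m(depth_m: float, f_he: float) -> float:
    """END in metres for a given depth and helium fraction."""
    return (depth_m + 10) * (1 - f_he) - 10

# Hypothetical example: Trimix 15/55 (55% helium) at 72 m
print(round(end_m(72, 0.55), 1))  # 26.9 m -- within the 30 m mandate
```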
In its preferred “Zero Order Rule” (zero risk from ICD), NAUITEC recommends that divers not switch from helium to nitrogen (nitrox) breathing mixtures upon ascent. Instead, divers decompress on their bottom gas (trimix) until reaching their 6 m/20 ft stop, and then decompress on pure oxygen (O2). This reduces task loading and minimizes gas switches.
If the diver wants to reduce their deco obligation and/or add a deep deco gas, they would switch to an intermediate deco mix, specifically a “hyperoxic” trimix, also called helitrox or triox, with an oxygen fraction greater than 23.5%. In practice this is accomplished by replacing the helium with oxygen and keeping the fraction of N2 the same, or ideally less. This avoids an N2 slam from ICD. Note that it is recommended that NAUI divers always maintain an equivalent narcotic depth (END) of no more than 30 m/100 ft.
This is what we recommend and practice, and we believe it offers less risk than switching from a trimix bottom gas to an enriched air nitrox (EAN) 50 (i.e., 50% O2, 50% N2) at 21 m/70 ft, which is a common community practice. The bottom line here is that in-gassing gradients for nitrogen have been minimized by avoiding an isobaric switch. THERE MUST BE A HIGH BENEFIT TO RISK RATIO to deviate from the Zero Order Rule!
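The nitrogen gradient at the switch can be illustrated with inspired PN2 arithmetic. A minimal sketch, with hypothetical gases chosen to match the scenarios described above (replacing helium with oxygen while keeping FN2 constant versus switching to EAN50):

```python
# Sketch: inspired nitrogen partial pressure at a 21 m gas switch.
# Ambient pressure is taken as depth/10 + 1 bar (seawater).

def pn2(depth_m: float, f_n2: float) -> float:
    """Inspired PN2 in bar at the given depth for a nitrogen fraction."""
    return (depth_m / 10 + 1) * f_n2

depth = 21                     # m, a common deco-switch depth
f_n2_bottom = 1 - 0.15 - 0.55  # Trimix 15/55 -> FN2 = 0.30
f_n2_helitrox = 0.30           # He replaced by O2, FN2 kept the same
f_n2_ean50 = 0.50              # EAN50

print(round(pn2(depth, f_n2_bottom), 2))    # 0.93 bar on bottom gas
print(round(pn2(depth, f_n2_helitrox), 2))  # 0.93 bar -> no new N2 gradient
print(round(pn2(depth, f_n2_ean50), 2))     # 1.55 bar -> +0.62 bar N2 jump
```

The isobaric switch to a hyperoxic trimix leaves the inspired PN2 unchanged, while the EAN50 switch raises it abruptly, which is the in-gassing gradient the Zero Order Rule avoids.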
The additional rules present increased risk. The First Order Rule: No switches from helium to nitrogen breathing mixes deeper than 30 m/100 ft. The Second Order Rule mandates no switches from helium to nitrogen mixes deeper than 21 m/70 ft.
The last rule seems to be common in technical diving, but it has certainly not been formally tested. Just say no when the risks outweigh the benefits. Many times, the benefit of a gas switch does not outweigh the risk. Risk reduction is always the primary goal.
GUE On Isobaric Counterdiffusion
By Richard Lundgren
GUE does not dispute Isobaric Counterdiffusion (ICD) as it’s a natural part of how we achieve decompression efficiency, i.e. maximizing the gradient between the different inert gases in a diver’s tissues and what is being respired. This is sometimes referred to as the positive ICD effect.
The flip side of the coin, the negative ICD effect, involves a potential increased risk for decompression illness (DCI), most commonly subclinical manifestations affecting the inner ear and causing inner ear decompression sickness (IEDCS).
Although the exact mechanics are not known, one potential aggravating factor could well be ICD when the gradient resulting from a switch from a helium-based mix to a nitrox mix is too large. This is sometimes called a “nitrogen slam.” It occurs when a gas with slow diffusivity is transported into a tissue more rapidly than a faster-diffusing gas is transported out, as when switching from a bottom gas, for example Trimix 15/55 (15% O2, 55% helium, balance N2), to a nitrox decompression gas like Nitrox 50 (50% O2, 50% N2) at 21 m/70 ft. This can result in supersaturation of some tissues and, consequently, bubble formation.
Based on ICD theory alone, one could draw the conclusion that any gas switch not containing helium after a trimix/heliox dive would be provocative and increase the decompression stress. This is where academics need to be tuned to the application and empirical evidence.
The practice of “getting off the mix early and deep,” which led some divers to switch to air at great depth in order to maximize the off gassing of helium, was a common early practice in the tech community. It was a practice that most likely resulted in elevated risk of not only DCI, but also inert gas narcosis and the problems it can engender. This practice, as most people likely know, was not subscribed to by GUE.
On the contrary, GUE was the first organization to call for helium-enriched gases when diving deeper than 30 m/100 ft, both for bottom gas and decompression gas. We were also early advocates for switching from helium-based bottom gas to nitrox 50 under special circumstances.
However, it should be made very clear that among the very active GUE dive community, we have seen no indications or significant statistics implying that the DCI risk or occurrence is elevated when switching to Nitrox 50 as the first deco gas after a 72 m/250 ft dive breathing Trimix 15/55. For deeper dives, additional deco gases are used. All of these contain helium.
Another possible issue could occur when divers switch to their helium-based back gas briefly after decompressing on Nitrox 50 but before switching to pure oxygen, and/or taking an oxygen break during their 6 m/20 ft O2 decompression. However, based on thousands of decompression dives in the GUE community, these gas breaks have not been reported to cause problems. Note that these switches occur at shallow depths, and therefore reduced pressure gradients.
Superficial ICD, i.e., when the body is surrounded by a less dense gas than what is being respired, is more of a theoretical problem for divers, as we don’t use helium mixes to inflate our dry suits for the obvious reason of thermal conductivity.
Interestingly, the concerns over ICD may at first glance seem irrelevant to rebreather (CCR) divers, assuming that their diluent remains the same throughout the dive. But remember most CCR divers rely on open circuit bailout, which may require gas switches.
Note: The British Sub Aqua Club (BSAC) recommends that divers allow for a maximum of 0.5 bar difference in PN2 at the point of the gas switch. According to former BSAC Tech lead Mike Rowley, “The recommendation isn’t an absolute, but a flexible advisory value so a 0.7 bar differential isn’t going to bring the Sword of Damocles down on you.”
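The BSAC advisory can be checked with the same PN2 arithmetic. A minimal sketch, using a hypothetical switch (Trimix 18/45, FN2 = 0.37, to EAN50 at 21 m) chosen only for illustration:

```python
# Sketch: checking the BSAC-recommended PN2 differential at a gas switch.
# Hypothetical example: Trimix 18/45 (FN2 = 0.37) to EAN50 at 21 m.

BSAC_LIMIT_BAR = 0.5  # advisory maximum PN2 difference at the switch

def pn2(depth_m: float, f_n2: float) -> float:
    """Inspired PN2 in bar; ambient pressure = depth/10 + 1 bar."""
    return (depth_m / 10 + 1) * f_n2

delta = pn2(21, 0.50) - pn2(21, 0.37)
print(round(delta, 2))                 # 0.4 bar
print(delta <= BSAC_LIMIT_BAR)         # True -- within the advisory value
```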
Not A Theory — A Fact! References:
NAUI Technical Diver. National Association of Underwater Instructors; 2000.
Wienke, B.R., O’Leary, T.R. “Isobaric Counterdiffusion, Fact And Fiction.” Advanced Diver Magazine.
Wienke, B.R. Technical Diving in Depth.
Lambertsen C. J., Bornmann R. C., Kent M. B. (eds). Isobaric Inert Gas Counterdiffusion. 22nd Undersea and Hyperbaric Medical Society Workshop. UHMS Publication Number 54WS(IC)1-11-82. Bethesda: Undersea and Hyperbaric Medical Society; 1979; 182 pages.
Doolette, David J., Mitchell, Simon J. (June 2003). “Biophysical basis for inner ear decompression sickness.” Journal of Applied Physiology, 94(6): 2145–50.
Daniel Millikovsky is a lifetime NAUI member (NAUI# 30750). He’s been a NAUI instructor exclusively for 22 years, a Course Director for 20 years, and in 2016, became a Course Director Trainer and Representative in Argentina. Daniel is a very active NAUI Technical Instructor Examiner (#30750L) for several courses including OC and CCR mixed gas diving. He has also been a member of the NAUI Training Committee since 2020. He owns Argentina Diving, a NAUI Premier, Pro Development, and Technical Training Center based in Buenos Aires, Argentina.
Daniel began diving in 1993 as a CMAS diver and then continued with his NAUI career, becoming an instructor in 1998. He opened his first NAUI Pro Scuba Center (DIVECOR) in Cordoba, Argentina. Daniel is enthusiastic about teaching and training and is a sought after presenter at numerous international dive conferences and shows. He can be reached at firstname.lastname@example.org, website: www.argentinadiving.com.
Richard Lundgren is the founder of Scandinavia’s Baltic Sea Divers and Ocean Discovery diving groups, and is a GUE Instructor Trainer, an Instructor Examiner, and a member of its Board of Directors. He has participated in numerous underwater expeditions worldwide and is one of Europe’s most experienced trimix divers. With more than 4000 dives to his credit, Richard Lundgren was a member of the GUE expeditions to dive the Britannic (sister ship of the ill-fated Titanic) in 1997 and 1999, and has been involved in numerous projects to explore mines and caves in Sweden, Norway, and Finland. In 1997, in arctic conditions, he performed the longest cave dive ever carried out in Scandinavia. Richard’s other exploration work has included the 1999 filming of the famous submarine, M1, for the BBC; the side scan sonar surveys of the Spanish gold galleons off Florida’s Key West in 2000; and the search for the Admiral’s Fleet, an ongoing project that has already led to the discovery of more than 40 virgin wrecks perfectly preserved in the cold waters of the Swedish Baltic Sea.
Is a Just Culture needed to support learning from near misses and diving accidents?
Human factors coach Gareth Lock delves deep into the meaning, impact, and need for “Just Culture” in diving, as well as creating a psychologically-safe environment that enables divers to highlight and challenge possible safety issues. Lock argues that both are essential, and offers practical suggestions for building the culture we deserve.
Text and photos courtesy of Gareth Lock
The heavily regulated aviation industry is often praised for its effective Just Culture, which has facilitated an ultra-safe operating environment. What purpose could a Just Culture serve in diving, and why should we care when we are in an unregulated environment? Aviation is ultra-safe—regulatory agencies implement safety targets and demand continuous improvement of safety performance. Nevertheless, adverse events still happen. Why? Because of the complex nature of operations and the variability of human performance. It isn’t because of ‘stupid pilots’ and/or ‘pilot error.’
The following two stories centre around a similar adverse event—a departure from a point on the runway the crew weren’t expecting—but have very different outcomes when it comes to the opportunities to learn and improve. As you read them, consider which one has more parallels with diving and the diving community.
A crew of four pilots taxi out at night to depart from Miami, Florida to Doha, Qatar on a Boeing 777 with 279 people onboard. Due to busy radio communications and an internal misunderstanding as they navigated around the airfield at night, the crew mistakenly entered a runway at a point that was further from the departure end than their performance planning assumed. This meant that the runway distance available to them was 2,610 m/8,563 ft instead of 3,610 m/11,844 ft. The crew had not noticed this problem before they asked for departure clearance.
After they received departure clearance, they accelerated down the runway to reach take-off speed. Very shortly after the crew ‘rotated’ (pulled back on the stick), the aircraft struck the approach lights of the opposite runway, causing damage including puncturing the skin of the aircraft. Fortunately, the puncture did not breach the pressure hull. The crew were unaware of the impact and carried on their flight to Doha without any further issues. On arrival, while taxiing in, the ground crew noticed the damage and informed the flight crew.
This event was widely reported in the media. It was also investigated internally and by local aviation authorities. The CEO’s response was, “We will not accept any kind of lapses by pilots because they have hundreds of passengers whom they risked,” and all four pilots were fired. Scarily, and I believe falsely, he also stated that “At no time was the aircraft or the passengers put in any harm’s way.” However, the aircraft was still on the ground when it left the runway, and if the crew had to abort their take-off just before rotate speed, the aircraft would have likely gone off the end of the runway and into the waterway—and probably the housing estate outside the airfield perimeter—with the loss of the aircraft and possibly passengers and crew.
The flight crew were operating from a Caribbean island with a single runway after they had arrived the night before as passengers. Neither the Captain nor the First Officer had been there before. They boarded their aircraft in the dark just before their passengers arrived. Once everyone was onboard, they left the parking area via the single entrance to the runway, and they turned to go to their departure end. The airport diagram they were using for navigation showed a single large concrete turning circle at the end of the runway. As they taxied down to where they thought the end was, they came across a large concrete circle and so started their turnaround process to line up facing the other way to depart.
They gained departure clearance and started to accelerate down the runway to reach their rotational speed. After rotating, they noticed that the lights at the other end of the runway passed under them more quickly than they expected for a runway of this length.
Once they arrived back in the UK, independently, both the Captain and First Officer looked online at Google Earth as they had an uneasy feeling about what had happened. What they found was that there were two concrete circles on the runway and not just the one as marked on their taxi diagram.
They immediately got a hold of their dispatch team and let them know that the airport diagram they had operated with wasn’t accurate and could cause a major problem. They also raised an Air Safety Report (ASR) within their airline so others could learn from the circumstances. The crew were congratulated for reporting this event, even though their departure safety margins had been reduced. The charts were amended as a consequence.
Given that pilots are reported to make between three and six errors per hour, which airline would you rather operate with—the one that welcomes and congratulates its operators for reporting mistakes, or the one that punishes them? Furthermore, how important do you think the perception (or illusion) of safety is for the first airline’s customer base? The absence of reported near misses, incidents or accidents does not mean that your system is safe. Paradoxically, those organisations who report more and learn from those reports have fewer adverse events, especially repeat events.
What is a Just Culture?
A Just Culture was first described in 1997 as part of James Reason’s work in Managing the Risks of Organisational Accidents, where he describes it as being part of a Safety Culture. Five sub-cultures make up a safety culture: a just culture, a reporting culture, an informed culture, a flexible culture, and a learning culture. The image below shows their interaction in more detail. My personal view is that Just Culture supports everything else, but it could be argued that if you don’t have a Reporting Culture, you don’t need a Just Culture.
Reason recognised that a wholly Just Culture is unattainable, but that there needs to be some line between errors or unwanted outcomes caused by system design and human performance variability, and those caused by gross negligence, sabotage, or reckless behaviour. As he highlights, “The difficulty is discriminating between these few truly ‘bad apples’ and the vast majority of unsafe acts to which the attribution of blame is neither appropriate nor useful.” Sidney Dekker, author of Just Culture, goes further and says, “It isn’t so much where the line is drawn, but who draws it.” In the world of diving, it is often online peers or the lawyers who draw the line. The former rarely have the full context and don’t understand human error and human factors, and the latter aren’t necessarily interested in wider organisational learning, as they are focused on their claim.
What does a Just Culture do?
A Just Culture facilitates the sharing of conditions or outcomes which aren’t expected or wanted—e.g., near misses, incidents, and accidents. The sharing happens because there is a recognition and acceptance within the team, organisation, or sector that human errors are part of normal operations and that professionals come to work to do a good job in spite of the limitations and constraints which are part of their job. These limitations and constraints create tensions and conflicts between doing what is written in manuals and procedures (Work as Imagined) and what really happens (Work as Done) to achieve the results for which they are being rewarded (e.g., productivity goals). Sometimes it isn’t possible to complete the job by following the rules because of these conflicts—conflicts which often have commercial drivers as their source. For example, pilots might not be able to do all their pre-departure checks in the correct order and at the right time because of limited ground time on turnarounds. For the aircrew, this can be business as usual, but in the event of an accident investigation, this would be picked up as deviant behaviour.
This is like dive centres who are commercially driven and face local competition. They would like to up their productivity (to make up for reduced costs), but this means that standards might be bent every now and again to make things work. This could mean diving deeper than maximum depths to make use of boats for multiple clients and/or courses, not completing the minimum dive time or the minimum number of dives which increases instructor availability, increasing the number of students in a class to maximise revenue or deal with a shortage of instructors/DMs, or not having surface cover when needed because they couldn’t be sourced or they cost too much to include in the course fees. There is also the very real issue that some standards are not valid or that organisations accept unauthorised protocol changes but don’t do anything about that.
Proceduralising a Culture? Isn’t that an Oxymoron?
Understanding this context is not easy, so part of Reason’s initial work was the production of a flow-diagram or process which showed managers how they could look at an event and determine where culpability might lie and how it should be dealt with. This framework has been reproduced by multiple different organisations like Shell, the Keil Centre, and Baines Simmons.
Each of these processes is a decision tree, with the outcome determining what sort of action should be taken against the individual—sometimes it is punitive. On the surface, this sounds like a good idea. The problem is that these processes rarely consider the rich context that is needed to understand how it made sense for someone to do what they did, with biases like hindsight bias, outcome bias, severity bias, and the fundamental attribution error getting in the way of understanding what really happened. This often leads to punishment when it is not applicable. Furthermore, which of the many individual contributory or causal factors do you examine as part of this ‘process’? Most of the time, the focus is on the ‘sharp end’—those doing the work—rather than further up the organisational chain and the conditions.
Some organisations have inserted two additional tests to help managers determine if this was an individual problem or a systemic one. The substitution test asks if someone with the same knowledge, skills, and experience, and under the same pressures and drivers, but without knowledge of the outcome, would do the same thing. If they would, then it is likely a system issue. The other test concerns whether this event/action has happened before to either the organisation or to the individual. In both cases, if it has, then it highlights organisational weakness either in system design or training/coaching of the individual after the first (or subsequent) event.
The problem with such process-based approaches is that they can’t create a culture, therefore they can’t be part of a Just Culture. Fundamentally, a culture can’t be proceduralised—a culture is based on the relationships and interactions between individuals of a group, team, organisation, sector, or nation. Ultimately, it is ‘how things are done around here,’ often without those involved knowing why! We love technology to make things easier, but in this case, a process flow chart doesn’t help create a Just Culture.
Build the organisational learning into the investigation process
Things have changed in some organisations though. BP decided to review and rewrite their Just Culture process because they realised it wasn’t working as intended—the goal of the policy was to facilitate learning from near-misses, incidents, and accidents, but they were missing too much. Their rework meant that they asked learning-focused questions as part of their investigation process. Rather than asking the Just Culture questions at the end, in isolation and with limited information, they required the investigators to ask the questions during the investigation to generate the rich context needed to understand the event. Consequently, they found that 90% of their incidents were systemic in nature and were not caused by human error of those operating at the sharp end. You can see an extract from their investigation flow-diagram below. While this is a process, the purpose is to facilitate discussion and learning during the investigation rather than using an isolated judgement at the end.
Does diving need a Just Culture?
The simple answer is yes. It also needs a psychologically-safe environment. In my opinion, psychological safety is needed before an event to allow latent conditions to be highlighted and challenged, whereas a Just Culture is needed after an event to allow discussions to take place around the context and how it occurred. Psychological safety supports the discussions under a Just Culture.
The diving community needs a culture that follows those same initial concepts from James Reason in 1997, the recognition that errors are normal, that the skills and experience of those involved within the current operating environment must be considered, and that only when gross negligence or sabotage are present should we look at punitive action. This means that when adverse events occur in diving and they are not subject to legal action, then the events should be used for learning. Unfortunately, the legal systems do not currently address the needs of a Just Culture because the goal is to find someone or some organisation to blame and which will facilitate damage claims. In my opinion, this poses a problem in the US where litigation is rife, particularly in diving, often for minor events. Unfortunately, in some cases, litigation is the only way to get some form of damages, or even an apology.
Diving learning from Healthcare and Wildland Firefighting?
Fortunately, it doesn’t have to be that way. Following a discussion with a colleague at Lund University, Sweden, where I am currently studying for an MSc in Human Factors and System Safety, I was told that there are hundreds of hospitals in the USA that have made inroads into reducing litigation claims and increasing patient safety by undertaking Communication and Resolution Programmes (CRPs). These CRPs have reduced litigation, increased individual and organisational learning, increased patient safety, and, ultimately, reduced insurance premiums because problems are resolved in a different manner. These programmes are based around a common understanding of human error, performance-shaping factors, and a Just Culture. Fundamentally, if you don’t understand human error, then you can’t create a Just Culture. Regarding Wildland Firefighting, I have previously written on that in InDepth.
In the sports diving sector, we don’t have a standardised, structured investigation process based around learning; most investigations look at proximal causes rather than systemic ones, and often look for blame. Even in the commercial sector, the HSE and OSHA are focused on non-compliance rather than learning opportunities. This means that building the ‘Just Culture’ questions into the learning process isn’t possible (at the moment). However, as a culture is based on the relationships and interactions between those within a team, organisation, or sector and the language they use, we can certainly start to develop one by changing the language we use to facilitate the sharing of, and learning from, near misses and incidents/accidents. Producing documentaries like ‘If Only…’ is a great way to get the concepts across by telling a story, and stories are how we learn.
I know of one GUE community who have set up their own ‘Learning Forum’ on Facebook which allows the open, critiquing (but not critical) discussion of adverse events and near misses—a forum I support. The Human Diver: Human Factors in Diving Facebook group also supports a learning-based approach with very mature conversations taking place when adverse events are discussed.
How can you help build this culture?
Start by being curious and limiting your judgments. While it is hard, try not to fall foul of the cognitive biases that I mentioned earlier; ask the simple question, “How did it make sense for them to do what they did?”, and you might be surprised at the answers you get back. Asking “How?” moves the focus to the context; “Why?” and “Who?” are all about the person, which invokes a blame response. At the organisational level, asking the same questions can highlight gaps in your own processes, procedures, expectations, and leadership. If instructors are consistently breaching standards, is it because the standard is unachievable or does not add the value you think it does to the operation? Firing them without understanding why they breached the standard is an example of a failed Just Culture. It can be uncomfortable to ask this question, but asking it is essential for improvement. Plus, in this scenario, you don’t need an accident to create learning because the latent conditions are already present. All you need is a genuine curiosity and a desire to learn and improve.
Human Diver: They broke the rules! So…?
InDEPTH: Human Factors page
Gareth Lock has been involved in high-risk work since 1989. He spent 25 years in the Royal Air Force in a variety of front-line operational, research and development, and systems engineering roles which have given him a unique perspective. In 2005, he started his dive training with GUE and is now an advanced trimix diver (Tech 2) and JJ-CCR Normoxic trimix diver. In 2016, he formed The Human Diver with the goal of bringing his operational, human factors, and systems thinking to diving safety. Since then, he has trained more than 450 people face-to-face around the globe, taught nearly 2,000 people via online programmes, sold more than 4,000 copies of his book Under Pressure: Diving Deeper with Human Factors, and produced “If Only…,” a documentary about a fatal dive told through the lens of Human Factors and A Just Culture.