
Listen to Your Ears: The Connection Between Eustachian Tube Dysfunction and Inner-Ear Barotrauma

Having ear problems? You’re not alone! According to DAN, Eustachian tubes and their associated ear injuries represent the single largest cause of dive injuries, bar none. Nearly 38% of diving injuries resulted from ear or sinus barotrauma, a prevalence roughly 130% greater than all cases of DCS. Though middle-ear barotrauma is the injury most commonly associated with Eustachian tube (ET) dysfunction, the much more serious inner-ear barotrauma (IEBT) remains a close second. DAN risk mitigation team leader Reilly Fogarty reviews the latest research and what it may mean for your ears. Listen up, people!

by Reilly Fogarty

Eustachian tubes are your ear’s version of pushing a thumb loop through a wrist seal to equalize your dry glove, but maybe a bit less exciting. The narrow passage that connects the pharynx to the cavity of the middle ear allows equalization between the ears and the sinus passages, and that’s about it. 

In the same way that cardiovascular disease poses the single greatest risk of death to most adults in the U.S., Eustachian tubes and their associated ear injuries represent the single largest cause of dive injuries, bar none. Nearly 38% of injuries resulted from ear or sinus barotrauma (2018 DAN Annual Diving Report), a prevalence nearly 130% greater than all cases of DCS.

In terms of ear injuries, middle-ear barotrauma is the one most commonly associated with Eustachian tube (ET) dysfunction, but the more serious inner-ear barotrauma (IEBT) remains a close second. While trauma to the middle ear typically heals quickly and without lasting effect, IEBT can cause permanent damage if not recognized and treated in a timely manner. Because of the nature of the inner ear and the delicate structures connected to it, damage is more likely to be irreversible even with surgery. Given the prevalence of ear issues and the potential severity of IEBT, the key to preserving our ability to dive (and minimizing injuries) is prevention. A group of researchers (Kitajima N et al., 2016) recently worked to correlate the function of the Eustachian tube with the incidence of IEBT.

Using replicable physical measurements and impedance tests, the researchers quantified ET function in 16 divers with a history of IEBT and 20 without. They measured the pressure required to open the ET, the maximum volume of air in the middle ear, and the speed at which equalization occurred. In an ideal situation, it should take 200-650 dekapascals (daPa) to open a healthy Eustachian tube, a pressure gradient roughly equivalent to an 8- to 26-inch depth change (a quick back-of-the-envelope conversion follows the list below). The paper sorts ET function in these divers into one of three categories:

  1. Patulous (open, or requiring less than 200 daPa to open)
  2. Normal (collapsed but requiring less than 650 daPa to open and filling or emptying instantaneously)
  3. Stenotic (collapsed and requiring up to 1200 daPa to open or filling and emptying very slowly)
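As a rough sanity check on that depth equivalence (my own arithmetic, assuming fresh water at about 1,000 kg/m³, not a figure from the paper), note that 1 daPa = 10 Pa, so 200-650 daPa corresponds to 2,000-6,500 Pa. The hydrostatic relationship $\Delta P = \rho g h$ then gives:

$$h = \frac{\Delta P}{\rho g}, \qquad \frac{2000\ \text{Pa}}{1000\ \text{kg/m}^3 \times 9.81\ \text{m/s}^2} \approx 0.20\ \text{m} \approx 8\ \text{in}, \qquad \frac{6500\ \text{Pa}}{1000\ \text{kg/m}^3 \times 9.81\ \text{m/s}^2} \approx 0.66\ \text{m} \approx 26\ \text{in}.$$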

These categories effectively describe an ET that functions well, moderately well, and poorly, respectively. Comparing these function measurements with the divers’ history of IEBT, the researchers found the following:

  1. In healthy divers without a history of IEBT, 30% equalized slowly but the pressure required to do so was within the normal range.
  2. Among divers with IEBT, most had notably stenotic ET, either requiring significant time to empty or fill, or requiring increased pressure to open.
  3. Divers with IEBT and a perilymph fistula (a tear in the round or oval window of the ear often caused by forceful equalization) had significantly worse ET dysfunction. It is suspected that pressure from forceful equalization may be the cause of IEBT in these divers. The paper presents 11 cases of IEBT with perilymphatic fistula.
  4. Some divers with IEBT did have normal ET function at the time of testing.

Like much frontline research, the results aren’t as clean or as groundbreaking as we’d like. The paper provides a strong argument for the correlation of ET dysfunction and IEBT, which seems reasonable, but then advocates a Eustachian tube function evaluation in divers to prevent these injuries. While IEBT can cause deafness and significant injury, its prevalence appears to be about 1.7% of dive injuries, or 4.3% of ear injuries among divers. Whether the prevalence or severity of the injury warrants an additional test before divers get in the water is part of a much larger discussion about fitness to dive and risk-based analysis.

What we do know now is that healthy divers and those with ET dysfunction can both experience IEBT and significant associated ear injuries from failed, too rapid, or too forceful equalization, but divers who have trouble equalizing due to stenotic Eustachian tubes are likely at significantly greater risk. These divers (like those with slow equalization in the group of divers without a history of IEBT) can minimize their risks and likely dive without much worry by equalizing slowly and often and listening to their ears during their dives. 

Additional resources:

  1. Kitajima N, Sugita-Kitajima A, Kitajima S. Quantitative analysis of inner ear barotrauma using a Eustachian tube function analyzer. Diving Hyperb Med. 2016;46(2):76-81. Available at: https://www.ncbi.nlm.nih.gov/pubmed/27334994
  2. May I Bend Your Ear? by Michael Menduno

Reilly Fogarty is a team leader for risk mitigation initiatives at Divers Alert Network (DAN). When not working on safety programs for DAN, he can be found running technical charters and teaching rebreather diving in Gloucester, MA. Reilly is a USCG-licensed captain whose professional background also includes surgical and wilderness emergency medicine as well as dive shop management.


Part Two: “Tech Divers, Deep Stops, and the Coming Apocalypse”

In part two of this four-part series on the history and development of GUE’s decompression protocols, GUE founder and president Jarrod Jablonski discusses the lack of appropriate mixed gas decompression tables in the early days of technical diving. He goes on to describe the initial development of “ratio decompression” and the early thinking and rapid adoption of “deep stops,” which have come under pressure as a result of new research that questions their efficacy and safety. He also discusses the emergence of the now nearly ubiquitous gradient factors (GF) used in most dive computers. Feel free to DIVE IN and share your thoughts.


If you have not read Part One of the “Decompression, Deep Stops, and the Pursuit of Precision in a Complex World” series, please find it here.

by Jarrod Jablonski

Header Photo by JP Bresser

Technical diving means different things to different people, opening legitimate arguments about different time periods and personalities involved in shaping this activity. Our purpose is more confined as I wish to focus mainly upon the development of decompression procedures. Leading figures like Jacques Cousteau (1910–1997) performed reasonably significant “technical” dives including both deep and overhead exposures. Yet, his dives were relatively short, often on air, and absent a community of fellow tech divers with whom to evolve varying strategies. Meanwhile, military or commercial activities reached significant depth with notable exposures but used different technologies and procedures. Hopefully, the following overview will be independently interesting—although I also hope to illustrate what I consider an intriguing wrinkle currently lacking in most debates over the best ascent schedule for a decompression dive.

For our purposes, I mark the 1980s as the period in which technical diving truly started to become a globally recognized activity. Of course, numerous significant projects occurred before this period, but the 1980s established a growing global awareness that amazing diving feats were possible for enthusiasts. The 1990s and the developing internet age brought these ideas progressively into the mainstream. Most importantly, these global communities could now easily communicate information and, of course, disagree about the best way to do one thing or another.

Jarrod Jablonski, George Irvine, and Brent Scarabin preparing for dives in Wakulla Springs (late 1990s). Photo from the GUE archives.

The 1980s and 1990s were also an interesting period of development because people were doing more aggressive dives but still lacked important support tools that would take another decade to materialize. For example, most early tech dives were run on some form of military, commercial, or scientific tables, such as those from the U.S. Navy, Oceaneering International, or the National Oceanic and Atmospheric Administration (NOAA), which were often not well suited to the dive at hand. This is because there were no decompression programs available. Divers were unable to calculate their own profiles and would trade whatever tables they could gather. In many cases, they were forced to choose from tables calculated for a different depth, time, and/or breathing mix than the planned dive.

George Irvine after a dive with the WKPP. Photo from the GUE archives.

Over time, the physiologist Dr. R.W. “Bill” Hamilton (1930–2011) began creating custom tables, but the lack of flexibility, as well as the cost, discouraged frequent use for most divers, especially when considering a range of depth profiles and their various time adjustments. This problem was especially prominent in deep cave diving, where variable profiles and long bottom times created numerous complications. This landscape encouraged, if not demanded, that exploration divers be creative with their decompression practices. For example, Woodville Karst Plain Project (WKPP) explorers George Irvine and I began to explore ways to extrapolate from existing tables. The birth of what is now called ratio decompression originated with this practice.

By exploring the way decompression schedules develop, we began outlining simple ratios that allowed divers to adjust their decompression based upon a ratio of time spent at depth. Then, in 1998, I went on to form Global Underwater Explorers (GUE), and such practices became part of the process for helping divers appreciate the structure of decompression while supporting adjustments to profiles when depth or time varied from that expected. Regrettably, some individuals took this too far and began promoting complex adjustments and marketed them as superior to the underlying algorithms from which they had been derived.

We should leave a more in-depth review of ratio deco for another time. For now, I intend to illustrate that “rules” such as ratio deco had their roots in the limited availability of decompression tables and evolved as a useful tool for understanding and estimating decompression obligations. The fact that early tech divers had limited ability to calculate profiles encouraged a “test-and-see” philosophy, further fueling the early popularity of ad hoc adjustments to decompression profiles. This was most notable in the DIR and GUE communities through ratio-style adjustments, and also prominent among non-DIR advocates who used a modification known as Pyle stops, originating from ichthyologist Richard Pyle’s early deep dives and his attempts to refine efficient ascent protocols. One common adjustment in these approaches involved a notable reduction in ascent speed, adding additional pauses known as “deep stops.”

Driven by self-discovery, the migration toward deep stops resulted in rare agreement among technical divers, and the practice developed a life of its own in conference symposiums, being advocated far and wide by a healthy share of technical divers. The earliest phases of these modifications were driven in part by limited access to decompression tables, though by the end of the 1990s there were a variety of decompression programs available. This was a big improvement, although it was still true that divers had a limited baseline of successful dives while planning mixed-gas dives in the 50 to 100 meter range and beyond. Today, most divers take for granted that technical dives planned with available resources and within today’s common use (a few hours around 90m/300 ft) are relatively safe in terms of decompression risk. When these programs first came to market, there remained many question marks about their efficacy. 

Jarrod Jablonski and Todd Kincaid using Doppler to evaluate decompression profiles in 1995. Photo from the GUE archives.

The availability of decompression programs was a big advantage for technical divers, as they now had more sophisticated tools at their disposal. Yet the output from many of these programs produced tables that were sometimes different from the profiles thought to be successful by some groups. Typically, the difference involved an increase in the total decompression time and/or a distribution of stops that varied from the developing consensus. These divers were keenly interested in the developing tools but reluctant to change from what appeared successful, especially when that change required additional hours in the water. A debate developed around the reason for these differences, bringing the already brewing interest in bubbles well into the mainstream.

At the time, all tables were based upon dissolved gas models like Buhlmann and thus did not directly model bubble development during an ascent. Dissolved gas models favor ascents that maintain a reasonably high gradient between the gas in tissues and the gas being breathed, which should support efficient elimination of gas from tissues. Dissolved gas models manage, but do not explicitly control for, bubbles and were thus labeled (probably unfairly) as “bend and mend” tables. In fact, dissolved gas models are designed to explicitly limit supersaturation, because this is supposed to limit bubble formation. Haldane references this control of supersaturation in his pioneering publication, quoted near the end of this article.
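For readers who like to see the constraint written down, here is the idea in Buhlmann-style notation (a simplified sketch, not the exact form used by any particular table): each hypothetical tissue compartment’s inert gas tension $p_{\text{tissue}}$ must stay below a maximum that grows with ambient pressure $P_{\text{amb}}$,

$$p_{\text{tissue}} \;\le\; M(P_{\text{amb}}) = \frac{P_{\text{amb}}}{b} + a,$$

where $a$ and $b$ are compartment-specific coefficients. Haldane’s original tables expressed essentially the same limit as a fixed ratio of roughly 2:1 between tissue tension and ambient pressure.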

Those advocating different ascent protocols imagined that a lack of deep stops creates more bubbles, which then need to be managed during a longer series of shallow decompression stops. In this scenario, a slower ascent from depth, including “deep stops,” would reduce the formation and growth of bubbles while reducing the time that might otherwise be needed to manage previously formed bubbles. Support for these ideas gained momentum in diving and professional communities, although some proponents were arguably conflicted by a vested interest in promoting deep stop models. Eventually, the practice received more critical consideration, but during the intervening years deep stops were largely treated as common knowledge.

Jarrod Jablonski and George Irvine at the last stop of a 15-hour decompression in 1998. Photo from the GUE archives.

The idea of controlling bubbles became extremely popular through the 1990s, encouraging deeper stops designed to limit bubble formation and growth. The “test-and-see” approach developed by early tech divers appears to have fueled the promulgation of deep stops, though the determining characteristics remained poorly defined. Early tech divers embraced the uncertainty of their activity, realized that nobody had the answers, and decided to “experiment” in search of their own answers. In fact, divers were experimenting with a lot more than deep stops, altering the gases breathed, the placement of various decompression stops, and the total amount of decompression time. This meant that divers were sometimes aggressively adjusting multiple factors simultaneously.

For example, divers using a decompression program might input significantly less helium than was present in their breathing mixture. This was done because the algorithm increased decompression time with elevated helium percentage, sometimes known as the helium penalty. In other cases, divers would completely change the structure of stop times. For example, some would invert the way a profile would normally be conducted, spending more time near a gas switch and less time at the stops leading up to the next switch, believing that time on the higher-oxygen gas only 3 m/10 ft away was better utilized.

I do not intend an exhaustive or detailed review but only to highlight the many, sometimes radical, approaches being taken by tech divers who often perceived success with these various strategies. In some cases, it appears these practices may have been leading the way toward improvements (eliminating the helium penalty), while other adjustments might prove disadvantageous. In both cases, it is important to note that divers understood, or should have understood, these actions to be potentially dangerous, accepting risk as a natural part of pushing into uncertain territory where definitive guidance and clear borders are rarely available.

None of this is to argue that divers should engage in aggressive decompression, challenge convention, or place themselves at risk of unknown complications. I merely wish to clarify the atmosphere in which these adjustments were conducted, while highlighting that conventional ideas of risk mitigation are inherently complicated against a backdrop of novel exploration. A sense of relative risk dominated most of these trials, since there are also risks associated with lengthy in-water exposure. Eliminating what might be an avoidable decompression obligation could reduce risk from other obvious factors, including changing weather, dangerous marine life, being lost at sea, hypothermia, and oxygen exposure. Decompression experimentation was but one of many attempts to establish new protocols during extensive exploration projects.

During this time, deep stops and similar adjustments were part of the “norm” for aggressive technical explorers, who were sometimes notably reducing in-water time as compared to available tables. It is interesting to note that individual differences in susceptibility seemed to be among the most prominent variables across the range of tested adjustments, and we will return to this in a later discussion. For now, we should acknowledge that the “success” being achieved (or imagined) is greatly complicated by a small sample size of self-selected individuals who were simultaneously experimenting with a range of variables. On the other hand, I do not want to entirely discount the results being seen by these divers. We should remember that development of safe decompression ascents for the general diving community was not the goal of most tech divers. These divers were interested in maximizing their personal and team efficiency during decompression. These strategies may or may not have been objectively successful or broadly applicable, but many teams imagined them so, at least within the narrow scope being considered. Later, we will return to these important distinctions with careful consideration for the potentially different interests being pursued by divers evaluating different decompression profiles.

Fortunately, a desire to create broadly useful tools was high on the list of priorities for some individuals in the technical diving community, leading to a relevant and important contribution. This inventive approach would introduce a new way to think about decompression, remaining to this day at the center of the debate about deep stops. 

Creating a New Baseline

Despite the popularity of deep stops and the other modifications presented previously, technical divers lacked a common language for comparing results, especially across different profiles. Compounding this problem was the variable way decompression programs considered a dive to be more or less “safe.” For example, some programs used “safety factors” that increased total decompression time by an arbitrary amount, e.g., making it 10% longer. Other programs used different strategies, though it was not clear whether any of the various safety factors actually made the decompression safer. Whether safer or not, these factors were typically inconsistent and added complications when comparing various profiles. Just as these debates were reaching a fever pitch, help arrived from an unlikely place. An engineer by trade and decompression enthusiast by choice, Erik Baker had developed a novel solution.

George Irvine passing time during a long decompression. Photo from the GUE archives.

Baker was seeking a way to establish consistency when considering the relative safety of various profiles. The term Baker applied was Gradient Factor (GF). I will not invest considerable time exploring the science behind GF, as many useful resources are widely available. For our purposes, it is sufficient to say that GFs allow a user to establish a lower threshold than the maximum recommended by a dissolved gas algorithm. This maximum pressure, or M-value, is assigned to “compartments” having an assumed amount of tolerance relative to the flow of blood they receive. Decompression theory is complicated by many factors, but when the pressure of gas in a compartment nears the M-value, the risk of decompression sickness is thought to become higher. By adjusting a profile through the use of GF, one presumably reduces that risk. However, this also means there is less gradient between the gas in tissues and the gas in blood, reducing the driving force for the removal of gas and probably requiring additional decompression time.
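To make the mechanics concrete, here is a minimal Python sketch of the gradient factor idea. It assumes a Buhlmann-style M-value of the form ambient/b + a; the coefficients, the GF-low/GF-high defaults, and the example stop depths are placeholders of my own rather than values from Baker’s papers or any planning software.

```python
# A minimal sketch of the gradient factor idea (illustrative only, not a
# dive-planning tool). The compartment a/b coefficients are placeholders.

def gf_at_depth(depth_m, first_stop_depth_m, gf_low=0.3, gf_high=0.8):
    """Linearly interpolate the gradient factor from GF-low at the first
    (deepest) stop to GF-high at the surface."""
    if first_stop_depth_m <= 0:
        return gf_high
    fraction = max(0.0, min(1.0, depth_m / first_stop_depth_m))
    return gf_high - (gf_high - gf_low) * fraction

def tolerated_tissue_pressure(ambient_bar, a, b, gf):
    """Buhlmann-style M-value (ambient/b + a) pulled back toward ambient
    pressure by the gradient factor: gf = 1.0 reproduces the raw M-value,
    smaller values permit less supersaturation."""
    m_value = ambient_bar / b + a
    return ambient_bar + gf * (m_value - ambient_bar)

# Example: a single compartment at a 6 m stop (ambient roughly 1.6 bar),
# with a hypothetical 21 m first stop.
ambient, a, b = 1.6, 0.58, 0.85          # placeholder coefficients
gf = gf_at_depth(6.0, 21.0)
print(round(gf, 2), round(tolerated_tissue_pressure(ambient, a, b, gf), 2))
```

The key point is the linear interpolation: GF-low constrains the deepest stop, GF-high constrains surfacing, and everything in between is scaled proportionally toward the raw M-value.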

Gradient factors were an important step toward a consistent language for talking about the variety of adjustments divers might make to their decompression. As part of his work on GF, Baker had developed a keen interest in the decompression adjustments being made by leading technical divers. Baker and I began working together while evaluating some of our most extreme diving profiles. This collaboration led to a number of productive developments, including GUE’s DecoPlanner, released in 1997 and among the first programs to utilize the GF methodology. These collaborations further highlighted what appeared to be a discrepancy between the decompressions expected by dissolved gas algorithms and those being conducted by many technical divers.

Decompression experimentation tended to cluster in small groups whose makeup was likely shaped by self-selection, since members would stop or reduce aggressive dives after experiencing decompression sickness. Yet it seemed possible that there was more to the story. These divers were doing several things that should have exposed them to notable risk and yet were repeatedly completing decompressions of more than 15 hours. Deep stops were only part of the story, as these divers were ignoring conventional wisdom in several areas while notably reducing decompression time. What was behind this discrepancy? Were deep stops right or wrong? Was the conservative approach to helium right or wrong? Were these individuals lucky? Were they unusually resistant to decompression sickness, or were there other factors lurking in the background?

Note: I will outline many of these developments in the upcoming Part Three, where we more directly consider the modern challenges to deep stops and most especially the assertion they are dangerous. In the interim, I hope to hear from our readers. Do you have different experiences from this period? Do you think such experimentation is reckless or inadvisable? Please let us know your thoughts.


Jarrod is an avid explorer, researcher, author, and instructor who teaches and dives in oceans and caves around the world. Trained as a geologist, Jarrod is the founder and president of GUE and CEO of Halcyon and Extreme Exposure while remaining active in conservation, exploration, and filming projects worldwide. His explorations regularly place him in the most remote locations in the world, including numerous world record cave dives with total immersions near 30 hours. Jarrod is also an author with dozens of publications, including three books.


In this video, GUE President Jarrod Jablonski and Technical Administrator Richard Lundgren discuss some of the early hurdles associated with decompressing from deep tech and cave dives. This discussion explores the lack of readily available deep-diving tables and most especially those capable of managing multi-level profiles. They also explore the early development of ratio deco and similar “experiments” conducted by early technical divers.

What is Ratio Deco?

The basic idea of ratio deco involves establishing a ratio between the time spent at a given depth and the associated decompression. This is possible because the curve describing the relationship between bottom time and total stop time (TST) at a single depth can be approximated by a straight line. The straight-line approximation breaks down beyond a certain range but is useful for estimating decompression time. For example, the ratio in a typical tech dive at 45 m/150 ft using the appropriate gases is 1:1, meaning that a dive to 45 m/150 ft for 30 minutes will result in a decompression of 30 minutes, assuming appropriate bottom and decompression gases.

We can also follow with adjustments for deeper dives, so that each 3 m/10 ft deeper than planned results in a decompression extension of five minutes. At some point the increase or decrease from the baseline parameter breaks down, leading to profiles that are too conservative or too liberal, and we need a new ratio, such as the GUE standard for 75 m/250 ft of 2:1, where 30 minutes of bottom time results in 60 minutes of decompression, assuming appropriate decompression gases arranged through a properly staged ascent.
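As a back-of-the-envelope illustration of the arithmetic above (and emphatically not a dive-planning tool), the 1:1 and 2:1 setpoint ratios and the five-minutes-per-3 m adjustment come straight from the examples in the text; the nearest-setpoint lookup is my own simplification.

```python
# Rough sketch of the ratio-deco estimate described above. The ratios and
# the 5 min per 3 m adjustment come from the text; the nearest-setpoint
# lookup is an assumption for illustration only.

RATIOS = {
    45: 1.0,   # 45 m/150 ft setpoint: deco time ~= 1 x bottom time
    75: 2.0,   # 75 m/250 ft setpoint: deco time ~= 2 x bottom time
}

def estimate_deco_minutes(depth_m, bottom_min):
    """Estimate total decompression from the nearest setpoint ratio,
    adding roughly 5 minutes for every 3 m deeper than that setpoint."""
    setpoint = min(RATIOS, key=lambda d: abs(d - depth_m))
    estimate = bottom_min * RATIOS[setpoint]
    if depth_m > setpoint:
        estimate += 5 * (depth_m - setpoint) / 3.0
    return estimate

print(estimate_deco_minutes(45, 30))   # 30.0 -- the 1:1 example
print(estimate_deco_minutes(75, 30))   # 60.0 -- the 2:1 example
print(estimate_deco_minutes(48, 30))   # 35.0 -- one 3 m / 5 min adjustment
```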

What exactly are deep stops?

Deep stops are technically any stops added below what would otherwise be established by a typical dissolved gas model. Dissolved gas models establish faster ascents than models that intend to reduce the development of bubbles; bubble-oriented models seek to reduce bubble development by ascending more slowly, usually by adding pauses or “deep stops.”

“The formation of gas bubbles in the living body during or shortly after decompression evidently depends on the fact that the partial pressure of the gas or gases dissolved in the blood and tissues is in excess of the external pressure. But it is a well-known fact that liquids, and especially albuminous liquids such as blood, will hold gas for long periods in a state of supersaturation, provided the supersaturation does not exceed a certain limit. In order to decompress safely it is evidently necessary to prevent this limit being exceeded before the end of decompression.”

Boycott AE, Damant GCC, Haldane JS. The prevention of compressed-air illness. J Hygiene (Lond). 1908;8:342-443.

The extra decompression time calculated by various algorithms when breathing a helium mix is a consequence of the long-held belief that helium, a much lighter gas, is taken up by the body faster than nitrogen (2.65 times faster in the case of the Buhlmann algorithm).
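For the curious, that 2.65 figure falls out of Graham’s law, which says diffusion speed scales with the inverse square root of molar mass. A quick Python sketch (the nitrogen half-times listed are a few representative ZH-L16 compartment values; treat the whole thing as an illustration rather than a reference):

```python
# Where the 2.65 figure comes from: helium half-times in Buhlmann-style
# models are conventionally derived from the nitrogen half-times via
# Graham's law (diffusion rate ~ inverse square root of molar mass).

from math import sqrt

M_N2, M_HE = 28.01, 4.00                     # molar masses, g/mol
ratio = sqrt(M_N2 / M_HE)                    # ~2.65

n2_half_times_min = [5.0, 8.0, 12.5, 18.5]   # representative nitrogen half-times
he_half_times_min = [t / ratio for t in n2_half_times_min]

print(round(ratio, 2))                               # 2.65
print([round(t, 2) for t in he_half_times_min])      # [1.89, 3.02, 4.72, 6.99]
```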

Compartments are hypothetical tissues intended to model how gas moves in and out of body tissues with varying blood flow. Fast compartments model tissues with a lot of blood flow, while slow compartments model tissues with very little.