Using Supercomputers to Understand the Echoes of Toothed Whales

By Kimberly Mann Bruch, SDSC
A sperm whale coming up for air while a free diver swims alongside

By emitting sounds and analyzing the resulting echoes, both humans and animals use active acoustic sensing to explore and comprehend their surroundings. Humans use high-frequency sonar systems, also known as echosounders or fishfinders, to observe fish and zooplankton in the ocean. Toothed whales and bats apply the same principle, called “echolocation,” to navigate and find food underwater and in the air.
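The core of active sensing is simple: the round-trip time of an echo, together with the speed of sound, gives the distance to the target. A minimal sketch of that calculation (the sound speed and delay values here are illustrative assumptions, not data from the study):

```python
# Illustrative sketch: estimating target range from the round-trip
# delay of an echo -- the basic principle shared by sonar systems
# and echolocating animals.

SPEED_OF_SOUND_WATER = 1500.0  # m/s, a typical seawater value (assumed)

def range_from_echo_delay(delay_s: float,
                          sound_speed: float = SPEED_OF_SOUND_WATER) -> float:
    """One-way distance to a target, given the two-way echo delay."""
    return sound_speed * delay_s / 2.0

# A 0.1 s round trip in seawater corresponds to a target 75 m away.
print(range_from_echo_delay(0.1))  # 75.0
```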

Specifically, toothed whales have evolved remarkable abilities to communicate, hunt and navigate using sound in diverse underwater environments. They face complex auditory scenes, from interacting with their own kind to detecting echoes bouncing off prey and underwater features. Yet a detailed understanding of how these whales perceive direction, and of the physical mechanisms behind it, remains limited.

Thanks to U.S. National Science Foundation (NSF) ACCESS allocations, Wu-Jung Lee, a principal oceanographer at the University of Washington, and her postdoc YeonJoon Cheong have been using Bridges-2 at the Pittsburgh Supercomputing Center (PSC) to address this gap, combining advanced modeling techniques with detailed three-dimensional representations of whale head anatomy obtained from computed tomography (CT) scans.

Head-related transfer functions (HRTFs) numerically simulated in the vicinity of the dolphin’s ears.
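An HRTF describes, for each sound direction and frequency, how the head and surrounding tissue filter a sound before it reaches an ear; differences between the two ears’ HRTFs are one cue for perceiving direction. A toy sketch of that idea (the gain values are made up for illustration, not the simulated dolphin HRTFs):

```python
import math

# Toy HRTF magnitudes (made-up values, NOT the simulated dolphin data):
# source direction -> (left-ear gain, right-ear gain) at one frequency.
toy_hrtf = {
    "left_30_deg":  (1.00, 0.50),
    "front_0_deg":  (0.80, 0.80),
    "right_30_deg": (0.50, 1.00),
}

def interaural_level_difference_db(left_gain: float, right_gain: float) -> float:
    """Level difference between the ears in dB; its sign cues direction."""
    return 20.0 * math.log10(left_gain / right_gain)

for direction, (left, right) in toy_hrtf.items():
    print(direction, round(interaural_level_difference_db(left, right), 2))
```

With these toy gains, a source 30 degrees to the left arrives about 6 dB louder at the left ear, a frontal source produces no level difference, and the pattern mirrors for a source on the right.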

“From our previous experiences with an ACCESS Explore project, we found that access to the high-performance memory nodes on the Bridges-2 system was essential for our work,” Lee explained. “This is particularly true for simulating high-frequency sounds propagating in large whale heads because of the vast number of model elements needed to analyze the sound’s interaction with biological structures in both time and frequency domains, which requires a high-performance computing resource like Bridges-2.” 
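A back-of-the-envelope calculation shows why memory becomes the bottleneck Lee describes. A common rule of thumb in acoustic modeling is to resolve each wavelength with several elements per dimension, so the element count grows rapidly as frequency rises or the head gets larger. This sketch uses assumed, illustrative numbers (not the study’s actual model parameters):

```python
# Back-of-the-envelope sketch (assumed numbers, not the study's model):
# why simulating high-frequency sound in a large head needs huge meshes.

SOUND_SPEED = 1500.0        # m/s; soft tissue approximated as water (assumption)
ELEMENTS_PER_WAVELENGTH = 6  # common rule-of-thumb mesh resolution

def element_count(frequency_hz: float, head_size_m: float) -> int:
    """Rough 3-D element count to resolve a given frequency in a cubic domain."""
    wavelength = SOUND_SPEED / frequency_hz
    elements_per_side = head_size_m / (wavelength / ELEMENTS_PER_WAVELENGTH)
    return round(elements_per_side ** 3)

# A 0.5 m head at a 100 kHz echolocation frequency:
print(f"{element_count(100_000, 0.5):,}")  # 8,000,000
```

Doubling either the frequency or the head size multiplies the element count by eight, which is why large whale heads at high frequencies quickly exceed ordinary workstation memory.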

Three-dimensional volumetric representation of the head of a bottlenose dolphin reconstructed using computed tomography (CT) scans.

Most recently, Lee and her team have used their ACCESS allocations to show how different anatomical structures in the heads of toothed whales could affect the sound features arriving at the animals’ ears.

They presented the findings at the 184th Meeting of the Acoustical Society of America. 

“By using supercomputers like Bridges-2, we are able to gain insights into the type of information the animals have access to when they echolocate,” Lee said. “In addition to PSC, we also have allocations on Jetstream2 at Indiana University and the Open Storage Network.” 

Lee said that her group’s work on Jetstream2 and the Open Storage Network involves separate research projects that have not yet been published. These projects focus on classifying echoes from different fish schools recorded by echosounders and on tracking bat activity in Seattle, and the team hopes to share its findings in upcoming papers.

Project Details

Resource Provider Institution(s): Pittsburgh Supercomputing Center (PSC), Indiana University (Jetstream2), Open Storage Network (OSN)
Affiliations: University of Washington
Funding Agency: NSF
Grant or Allocation Number(s): BIO240018

The science story featured here was enabled by the U.S. National Science Foundation’s ACCESS program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.
