Supercomputers Work Together for Big Science Simulations

By Megan Johnson, NCSA
A visualization of turbulence. Intertwining lines twist in chaotic swirls.

How many supercomputers does it take to simulate plasma turbulence? Quite a few, it turns out. Researchers from the University of Wisconsin-Madison (UW) utilized a number of ACCESS resources, including Stampede2 from the Texas Advanced Computing Center (TACC), Expanse from the San Diego Supercomputer Center (SDSC) and Bridges-2 from the Pittsburgh Supercomputing Center (PSC). But it was Anvil, from Purdue's Rosen Center for Advanced Computing (RCAC), that did some of the heaviest lifting. Fully half of Anvil's resources went into a single simulation: a whopping 512 nodes running at once.

The research project was spearheaded by Bindesh Tripathi under the joint supervision of his advisors, Paul Terry and Ellen Zweibel, professors in the Department of Physics at UW-Madison. It focuses on studying the plasma in our galaxy known as the interstellar medium (ISM). The ISM is the stuff between the stars: gases and particles that move about in the vast distances between large stellar objects.

Tripathi, who is working toward finishing his doctoral dissertation, is simulating plasma dynamics, with a particular focus on turbulence. These simulations are extremely complex: many factors, such as the magnetic fields created by electrical currents in the ISM, must be considered when building the model.

“It’s been difficult over the years to replicate observations made of large-scale magnetic fields,” said Terry. “Whenever we apply a model like magnetohydrodynamics to try and understand this process, we find that the system is rather turbulent. The motions are rather disordered and chaotic, which means the currents are rather disordered and chaotic, and so the magnetic fields themselves are rather disordered and chaotic. And it’s difficult to solve one of these mathematical models in such a way that you can recover these large-scale, ordered magnetic field structures. So this is something we are trying to understand with the simulations that we’re running.”

Tripathi's work builds on earlier discoveries by Terry's research group. To explore those results and uncover more refined details about stable-mode excitations, Tripathi works with specialized simulation software called Dedalus, an application designed to study fluid mechanics. Running these simulations requires huge amounts of computational power.
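Dedalus is a spectral solver: it represents fields by their coefficients in a Fourier (or similar) basis rather than by grid values, which makes scale-by-scale analysis of turbulence natural. A standard diagnostic in turbulence studies of this kind is the kinetic-energy spectrum, which shows how energy is distributed across wavenumbers. The sketch below is a generic NumPy illustration of that diagnostic on a 1D periodic velocity field; it is not code from Tripathi's simulations, and the function name is our own:

```python
import numpy as np

def energy_spectrum_1d(u, L=2 * np.pi):
    """Kinetic-energy spectrum E(k) of a 1D periodic velocity field u.

    By Parseval's theorem, E.sum() equals the mean kinetic energy
    0.5 * mean(u**2), so the spectrum shows where that energy lives
    in wavenumber space.
    """
    n = u.size
    u_hat = np.fft.rfft(u) / n            # Fourier coefficients of u
    energy = 0.5 * np.abs(u_hat) ** 2     # energy in each mode
    energy[1:] *= 2                       # fold in negative wavenumbers
    if n % 2 == 0:                        # Nyquist mode has no mirror pair
        energy[-1] /= 2
    k = np.arange(energy.size) * 2 * np.pi / L
    return k, energy

# A single sine mode puts all its energy at one wavenumber:
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(3 * x)
k, E = energy_spectrum_1d(u)
peak = int(np.argmax(E))   # dominant wavenumber index -> 3
```

In a turbulent flow, rather than a single peak, the spectrum spreads energy across many scales, and its shape is one of the properties such simulations are used to test against theory.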

“I ran the Dedalus code, and I found it running beautifully well,” said Tripathi. “Anvil has a large number of cores, and the queue time was relatively short, even for the very large resources that I was requesting, and the jobs would run quite fast. So it was a quick turnaround, and I got the output pretty quickly. I have had to wait a week or even longer on other machines, so Anvil has been quite useful and easy to run the code. Anvil has also generously provided us with storage of a large dataset, which now amounts to 125,000 gigabytes from my turbulence simulations.”

This research is the result of a Maximize ACCESS allocation. Researchers like Tripathi are able to run such large simulations thanks to the impressive portfolio of compute resources available through ACCESS. Whether you're an expert in research computing or new to incorporating high-performance computing (HPC) into your research plan, ACCESS has the resources to help you take your research to the next level. You can apply for an ACCESS allocation here.

You can read a deeper dive into the research in the original story posted here: Half of the entire Anvil supercomputer used to challenge traditional turbulence theory for space and climate modeling


Resource Provider Institution(s): Rosen Center for Advanced Computing (RCAC) at Purdue, Texas Advanced Computing Center (TACC), San Diego Supercomputer Center (SDSC), Pittsburgh Supercomputing Center (PSC)
Affiliations: University of Wisconsin (UW)-Madison
Funding Agency: NSF
Grant or Allocation Number(s): PHY130027

The science story featured here was enabled by the U.S. National Science Foundation’s ACCESS program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.

Sign up for ACCESS news and updates.

Receive our monthly newsletter with ACCESS program news in your inbox. Read past issues.