
Top Scientific & Engineering Use Cases That Earned 2025 HPCwire Readers’ Choice Awards at SC25

An overview of HPC use cases recognized by HPCwire readers in 2025.


In this article, we showcase several scientific and engineering simulation projects that recently received 2025 HPCwire Readers’ and Editors’ Choice Awards, to demonstrate what’s possible today with current petascale and exascale supercomputers.


The HPCwire Readers' Choice Awards began in 2003 at the Supercomputing conference in Phoenix, Arizona. They are an annual program in which the global high-performance computing community votes to recognize outstanding organizations, technologies, and use cases. The winners are determined through a poll of HPCwire's worldwide readership, conducted several months before the SC (Supercomputing) conference each year. At the first awards ceremony, held at SC03, the winners included Cray, IBM, SGI, Cisco, Sun, and Apple, among others. The awards celebrate achievements in HPC across science and industry.


Now in their third decade, the awards are nominated and voted on by the global HPC community, and the 2025 winners were announced and honored during the Supercomputing Conference (SC25) in St. Louis, Missouri, in November 2025.


For this article, we selected the Readers’ Choice Award winners for best use of HPC in Life Sciences, Physical Sciences, Societal Plight, Energy, and Industry (Automotive, Aerospace, Manufacturing, Chemical, etc.). The full list of 2025 winners is available at: https://www.hpcwire.com/2025-hpcwire-awards-readers-editors-choice.


Life Sciences: Characterization of Hemodynamics in Blood Vessel Sprouts

Researchers at the New Jersey Institute of Technology (NJIT) and the University of Florida have made a groundbreaking discovery about how blood flows through the tiniest new blood vessels in our bodies, with findings that could revolutionize our understanding of everything from wound healing to cancer treatment. Using powerful supercomputer simulations, they revealed that individual red blood cells create dramatic, ever-changing forces as they flow past newly sprouting blood vessels. The discovery was made using U.S. National Science Foundation (NSF) ACCESS allocations on the Expanse system at the San Diego Supercomputer Center (SDSC), part of the School of Computing, Information and Data Sciences (SCIDS) at UC San Diego.


“When blood flows through your vessels, it creates a force against the vessel walls, like water rushing through a garden hose pushing against the hose’s inner surface,” explained Peter Balogh, the study’s senior author and assistant professor of mechanical and industrial engineering at NJIT. “This is known as wall shear stress.”


Red blood cells flowing through blood vessels. Image courtesy of SDSC: https://www.sdsc.edu/news/2025/PR20250707_blood_vessels.html

But here's what surprised the research team: this force in the newly sprouted vessel isn't steady. Instead, it fluctuates dramatically each time a single red blood cell flows past the entrance. Balogh compared it to cars driving past a highway on-ramp—each car creates its own unique wind pattern and pressure at the entrance. Similarly, each red blood cell generates its own distinct pattern of force against the blood vessel wall. Read more at: https://www.sdsc.edu/news/2025/PR20250707_blood_vessels.html
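
For readers who want a feel for the quantity Balogh describes, the short sketch below computes a steady-flow (Poiseuille) wall shear stress baseline for a small vessel. The viscosity, radius, and velocity values are assumed, order-of-magnitude numbers, not figures from the NJIT/Florida study; the study's key point is that passing red blood cells make the actual stress fluctuate strongly around such a steady estimate.

```python
import math

# Illustrative, steady-flow baseline only: wall shear stress (WSS) for
# Poiseuille flow in a rigid cylindrical vessel, tau_w = 4 * mu * Q / (pi * R^3).
# All values below are assumed, order-of-magnitude numbers for a microvessel,
# not taken from the award-winning study.

mu = 1.2e-3           # plasma dynamic viscosity [Pa*s] (assumed)
R = 5e-6              # vessel radius [m] (~10 micron diameter capillary, assumed)
mean_velocity = 1e-3  # mean blood velocity [m/s] (assumed)

Q = mean_velocity * math.pi * R**2        # volumetric flow rate [m^3/s]
tau_w = 4.0 * mu * Q / (math.pi * R**3)   # wall shear stress [Pa]

print(f"Flow rate Q = {Q:.3e} m^3/s")
print(f"Steady Poiseuille wall shear stress = {tau_w:.2f} Pa "
      f"({tau_w * 10:.1f} dyn/cm^2)")
```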


Life Sciences: Simulations Paired with Experiments to Find New Targets for HIV Drugs or Vaccines

A University of Pittsburgh team has simulated the HIV-1 virus to show how a twist in a critical protein may help it squeeze into a host cell’s nuclear pore, a key step in infection. The team’s simulations yielded encouraging results, suggesting that simulations can pair with experiments to find new targets for HIV drugs or vaccines. The findings indicate how the capsid may deform to fit through nuclear pores, revealing a potential target for future AIDS therapies.


The method, to be reported in the Proceedings of the National Academy of Sciences, may also be useful for studying flexibility in other important proteins.

The year 2024 was a mixed bag in the fight against AIDS. Modern antiviral therapies have turned HIV into a chronic, but survivable, disease. Death rates are at a 20-year low. On the other hand, doctors are unlikely to meet their goal of eliminating AIDS as a health threat worldwide by 2030. In some populations, HIV infection is actually increasing. And we still don’t have a vaccine. That’s why scientists studying AIDS continue to search for weak spots in the virus’s infection cycle. Read more at: https://www.eatg.org/hiv-news/hiv-protein-switch-may-help-virus-squeeze-into-host-cell-nucleus/
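
As a generic illustration of how conformational change can be quantified in simulation data (not necessarily the Pittsburgh team's actual analysis), the sketch below computes the root-mean-square deviation (RMSD) between two sets of atomic coordinates after optimal superposition via the Kabsch algorithm, using synthetic coordinates in place of real capsid-protein trajectory frames.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal rotation of P onto Q."""
    P = P - P.mean(axis=0)                    # remove translation
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                               # covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # correct for possible reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation matrix
    P_rot = P @ R.T
    return np.sqrt(np.mean(np.sum((P_rot - Q) ** 2, axis=1)))

# Tiny synthetic example: rotate and perturb a point cloud, then measure the
# residual deformation that superposition cannot remove.
rng = np.random.default_rng(1)
frame_a = rng.normal(size=(100, 3))
theta = 0.3
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
frame_b = frame_a @ rot.T + 0.05 * rng.normal(size=(100, 3))
print(f"RMSD after superposition: {kabsch_rmsd(frame_a, frame_b):.3f}")
```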


Physical Sciences: A Real-time Tsunami Digital Twin

Lawrence Livermore National Laboratory (LLNL), UC San Diego/Scripps, and UT Austin built a digital twin system for real-time tsunami forecasting that could dramatically improve early warning capabilities for coastal communities near earthquake zones. Running on the HPE Cray supercomputer El Capitan and the TACC systems Frontera, Lonestar6, Stampede3, and Vista, the system turns ocean pressure sensor data and physics-based models into localized forecasts in under 0.2 seconds, about 10 billion times faster than traditional methods. This achievement cuts false alarms and speeds credible warnings.


Scientists at Lawrence Livermore National Laboratory have helped develop a real-time tsunami forecasting system that could dramatically improve early warning capabilities for coastal communities near earthquake zones (Images courtesy of Tzanio Kolev/LLNL).

The resulting tsunami “digital twin” models the effects of seafloor earthquake motion using real-time pressure sensor data and advanced physics-based simulations. This dynamic, data-driven system can infer the earthquake’s impact on the ocean floor and forecast the tsunami's behavior in real time — complete with uncertainty quantification. Read more at: https://www.llnl.gov/article/53276/llnl-scientists-explore-real-time-tsunami-warning-system-worlds-fastest-supercomputer.
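
Conceptually, the digital twin solves an inverse problem: infer what the seafloor did from what the pressure sensors saw, with uncertainty attached. The toy sketch below shows that idea for a linear-Gaussian model with made-up numbers; it is only a stand-in for the physics-based, supercomputer-scale workflow described above, not the actual LLNL/UT Austin method.

```python
import numpy as np

# Toy Bayesian linear inverse problem: infer a few seafloor "source"
# parameters m from noisy pressure-sensor readings d, assuming a linear
# forward model d = G m + noise. All quantities here are invented.
rng = np.random.default_rng(0)

n_sensors, n_params = 8, 3
G = rng.normal(size=(n_sensors, n_params))   # assumed linear forward model
m_true = np.array([1.5, -0.7, 0.3])          # hypothetical source parameters
sigma_noise = 0.05                           # sensor noise std (assumed)
d = G @ m_true + sigma_noise * rng.normal(size=n_sensors)

# Gaussian prior and Gaussian noise give a closed-form posterior, which is
# where the forecast's uncertainty quantification comes from in this sketch.
prior_cov = np.eye(n_params)
noise_cov_inv = np.eye(n_sensors) / sigma_noise**2

post_cov = np.linalg.inv(G.T @ noise_cov_inv @ G + np.linalg.inv(prior_cov))
post_mean = post_cov @ (G.T @ noise_cov_inv @ d)

print("posterior mean:", np.round(post_mean, 3))
print("posterior std :", np.round(np.sqrt(np.diag(post_cov)), 3))
```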


Physical Sciences: Identification and Characterization of the Failed Stars in a Globular Cluster in our Milky Way Galaxy

An international team of researchers used data from the new James Webb Space Telescope (JWST) and simulations on supercomputers, including Pittsburgh Supercomputing Center (PSC)’s Bridges-2 and San Diego Supercomputer Center (SDSC)’s Expanse, to identify and date three brown dwarfs in a globular cluster for the first time.


Collections of stars called globular clusters are some of the oldest known structures in our Galaxy, and their properties preserve a record of conditions in the early Universe. But estimating their exact ages is very challenging, and different dating techniques often disagree with each other. The newly identified brown dwarfs, too small to be stars and too large to be planets, cool off over time, providing a new way to estimate the age of the parent cluster.
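
The dating idea can be sketched very simply: for a given mass, a cooling model predicts how a brown dwarf's luminosity declines with age, so an observed luminosity can be mapped back to an age. The example below inverts a hypothetical, made-up cooling track by interpolation; it is not the team's actual evolutionary model or analysis pipeline.

```python
import numpy as np

# Hypothetical cooling track for one fixed mass: age [Gyr] vs. log10(L/Lsun).
# These numbers are placeholders for illustration, not a real model.
age_gyr = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 13.0])
logL    = np.array([-4.2, -4.6, -5.0, -5.3, -5.5, -5.7, -5.85, -5.9])

def estimate_age(logL_obs):
    """Interpolate the (monotonically dimming) cooling track to get an age."""
    # np.interp needs increasing x, so interpolate age as a function of -logL.
    return np.interp(-logL_obs, -logL, age_gyr)

print(f"Estimated age for logL = -5.6: {estimate_age(-5.6):.1f} Gyr")
```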


Illustration of a brown dwarf. Brown dwarfs are more massive and hotter than planets but lack the mass required to become sizzling stars. Their atmospheres can be similar to the giant planet Jupiter’s. Image courtesy: https://science.nasa.gov/asset/hubble/forecast-for-exotic-weather/

Scientists studying the origins of the Universe must glean clues about how stars and, well, everything formed, using weak signals from faraway astronomical structures. Globular clusters are an important source of this information. These are collections of stars that look a little like large spheres (“globes”) of light through a pair of binoculars. Most of them are almost as old as the Universe itself and may offer clues as to the formation and evolution of the earliest stars and galaxies. Read more here: https://www.psc.edu/globular-clusters/.


Societal Plight: Recently Founded Societal Computing and Innovation Lab (SCIL) Takes a Novel Approach to Meeting Complex Societal Challenges


The San Diego Supercomputer Center (SDSC)’s Societal Computing and Innovation Lab (SCIL) takes a novel approach to creating breakthrough technological innovations for complex societal challenges, with a commitment to real-world solutions that leverage integrated workflows, next-generation data and AI, cutting-edge science, and advanced digital infrastructure.


The Societal Computing and Innovation Lab (SCIL), housed at the San Diego Supercomputer Center (SDSC), develops and supports a suite of open, interoperable data platforms that serve as critical infrastructure for collaborative research, decision-making and real-world impact. These platforms include the National Data Platform, which has earned national recognition as a “Project to Watch,” as well as the National Science Foundation (NSF) Quantum Foundry’s Quantum Data Hub and the Wildfire Science & Technology Commons. Read more at: https://www.hpcwire.com/off-the-wire/sdsc-launches-societal-computing-and-innovation-lab/.


Energy: How Biochar and Optimized Nitrogen Improve Soil Health, Reduce Emissions, and Support Sustainable Biofuel Production


Two graduate students from Tennessee State University leveraged U.S. National Science Foundation ACCESS resources on Jetstream2 to reveal how biochar and optimized nitrogen use can improve soil health, reduce emissions, and support sustainable biofuel production. This research points toward scalable agricultural practices that can cut greenhouse gas emissions, reduce farmer reliance on costly fertilizers, and strengthen bioenergy and food security for society at large.


How the combination works (see the short illustrative sketch after this list):


  • Soil health: Biochar, a carbon-rich material from biomass pyrolysis, can improve soil health and nutrient retention, leading to increased crop yields and fertility. Its application can enhance microbial communities and strengthen the effectiveness of nitrogen fertilizers.

  • Nitrogen optimization: Combining biochar with optimized nitrogen application can reduce nitrogen losses from the soil compared to using synthetic nitrogen alone.

  • Emission reduction: By better retaining nutrients, this approach can help reduce the amount of nitrogen that is lost to the environment as a greenhouse gas. The biochar itself is also a stable carbon material that can be used for long-term carbon storage in the soil.

  • Biofuel support: The research also indicates that this approach can support sustainable biofuel production by improving soil health and fertility in areas where biofuel crops are grown. 
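
As a purely illustrative way to see the nutrient-retention argument in numbers, the sketch below compares how much applied nitrogen stays in the root zone with and without biochar under assumed retention fractions. The application rate and fractions are placeholders for illustration, not results from the Tennessee State University study.

```python
# Toy nitrogen mass balance: biochar's better nutrient retention means less
# applied nitrogen is lost (e.g., leached or emitted). All numbers are assumed.
applied_n_kg_per_ha = 150.0        # synthetic N application rate (assumed)
retention_without_biochar = 0.55   # fraction of N kept in the root zone (assumed)
retention_with_biochar = 0.70      # assumed improvement from biochar amendment

for label, retention in [("without biochar", retention_without_biochar),
                         ("with biochar", retention_with_biochar)]:
    retained = applied_n_kg_per_ha * retention
    lost = applied_n_kg_per_ha - retained
    print(f"{label:>16}: retained {retained:5.1f} kg/ha, lost {lost:5.1f} kg/ha")
```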


Read more at: https://access-ci.org/jetstream2-used-for-agriculture-research/.


HPC in Industry: High-Performance Spaceflight Computing (HPSC) Platform for Aerospace Missions


The NASA Jet Propulsion Laboratory selected Microchip’s PIC64-HPSC for its High-Performance Spaceflight Computing (HPSC) platform. The platform delivers deterministic and fault-tolerant HPC, onboard AI, advanced sensor fusion, and autonomous decision-making to aerospace missions.


Image courtesy: NASA Jet Propulsion Laboratory.

The objective of NASA’s project is to develop a next-generation flight computing system that addresses computational performance, power management, fault tolerance, and connectivity needs of NASA missions through 2040 and beyond. NASA is actively seeking to answer fundamental questions about life beyond Earth through groundbreaking science and exploration missions: Are we alone? What does tomorrow bring? How is our universe changing?  


For future space missions seeking to answer these questions, significant advances in onboard computing are needed. These advances span navigation and control systems, complex science instruments, robotic science sample acquisition and return, communications, autonomous robotic operations, crewed instrument health and safety monitoring, power generation and management, and autonomous fault handling and recovery. Read more at: https://www.nasa.gov/game-changing-development-projects/high-performance-spaceflight-computing-hpsc/.
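
As a toy illustration of one capability mentioned above, sensor fusion, the sketch below combines two noisy measurements of the same quantity by inverse-variance weighting (equivalent to a single Kalman-style update). It is a generic textbook example, not NASA's or Microchip's HPSC flight software, and the readings are hypothetical.

```python
def fuse(x1, var1, x2, var2):
    """Return the fused estimate and its variance for two independent readings."""
    w1, w2 = 1.0 / var1, 1.0 / var2      # weight each sensor by its precision
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)          # fused variance is smaller than either input
    return fused, fused_var

# Hypothetical altimeter readings from two sensors with different noise levels.
estimate, variance = fuse(x1=102.0, var1=4.0, x2=98.5, var2=1.0)
print(f"fused altitude ~ {estimate:.2f} m, std ~ {variance ** 0.5:.2f} m")
```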
