NVIDIA Supercharges Hawk Supercomputer for AI Work


NVIDIA GPUs will help the Hawk supercomputing system support even larger-scale deep learning projects that make use of AI.

Germany’s High-Performance Computing Center Stuttgart (HLRS) is already one of the world’s largest supercomputing centers, but its newest partnership with NVIDIA is taking that power one step further thanks to AI. Hawk, the 16th-largest supercomputing system in the world, will gain 192 NVIDIA Ampere architecture GPUs linked over the NVIDIA Mellanox InfiniBand network. The center is moving toward hybrid computing, a model in which CPUs and GPUs work together, combining traditional simulation with big data approaches.

The added GPUs will let the system support even larger-scale deep learning projects. While central processing units have traditionally been the standard for the diverse calculations required in workloads such as climate modeling or fluid dynamics, graphics processing units excel at the repetitive, highly parallel operations required to train AI models.

This expansion allows research groups to make greater use of AI-driven analysis in their simulations. Future projects will be able to run billions of simulations in parallel to train AI models quickly and efficiently.
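To make the hybrid pattern described above concrete, here is a minimal, hypothetical sketch in Python: a CPU-side routine stands in for a numerical simulation that generates training samples, while a small surrogate neural network is trained on the GPU. The toy physics, model, and all names are illustrative assumptions; they do not represent HLRS's actual simulation codes or workflows.

```python
# Hypothetical sketch of the hybrid CPU/GPU workflow: a CPU-bound
# "simulation" produces training data, and a GPU-trained surrogate
# network learns to approximate it. Illustrative only.
import torch
import torch.nn as nn

def run_simulation(params: torch.Tensor) -> torch.Tensor:
    """Stand-in for a CPU-bound numerical simulation (e.g., one solver run).
    Maps a batch of input parameters to a scalar quantity of interest."""
    return torch.sin(params[:, :1]) * torch.exp(-params[:, 1:2] ** 2)

# Small surrogate model that learns the simulation's input-output mapping.
surrogate = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

device = "cuda" if torch.cuda.is_available() else "cpu"
surrogate.to(device)
optimizer = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(1_000):
    # CPU side: sample parameters and run the (toy) simulation.
    params = torch.rand(256, 2)
    targets = run_simulation(params)

    # GPU side: train the surrogate on the freshly generated data.
    params, targets = params.to(device), targets.to(device)
    optimizer.zero_grad()
    loss = loss_fn(surrogate(params), targets)
    loss.backward()
    optimizer.step()

# Once trained, the surrogate can approximate new simulation outputs
# far faster than rerunning the solver itself.
```

In practice, the simulation side would be a full solver running across many CPU nodes and the training side would scale across the new GPUs, but the division of labor is the same: simulation generates data, and the GPUs turn that data into a model.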

See also: Fastest Supercomputer Adopts Real-Time Analytics

NVIDIA computing will tackle real-world issues

The supercomputer will work on a variety of projects. One researcher hopes to use the new processing power to train neural networks that evaluate metal alloys quickly and cost-effectively. Another will apply it to a fluid dynamics project aimed at improving the analysis of turbulence.

The center will also be part of a new project designed to better predict the trajectory of COVID-19 cases, notifying hospitals in advance when beds are likely to fill to capacity. Hospitals would have time to prepare, and policymakers would have a greater chance of adjusting public policy before it’s too late.

The initiative underscores a move to embed AI-driven analysis in simulations and bring augmented decision-making to the forefront. These projects let AI do the heavy lifting of spotting patterns among billions or trillions of data points so that humans can respond before a breaking point (such as during a pandemic) or make headway on thorny problems such as turbulence analysis.


About Elizabeth Wallace

Elizabeth Wallace is a Nashville-based freelance writer with a soft spot for data science and AI and a background in linguistics. She spent 13 years teaching language in higher ed and now helps startups and other organizations explain - clearly - what it is they do.
