NVIDIA: AI Still Has a Hardware Problem

The largest AI models can take months to train on today’s computing platforms. NVIDIA’s new offerings aim to address that issue.

At its annual GTC conference, NVIDIA announced a bevy of new AI-specific GPUs and CPUs, including the Hopper H100 GPU, which the company claims will dramatically speed up how companies deploy the most sophisticated AI applications, such as BERT and GPT-3, which are built on transformer models. But the announcement raises the question: can every challenge in AI be solved simply with more computing power?

See also: NVIDIA and Partners Target Artificial Intelligence at GTC

About Joel Hans

Joel Hans is a copywriter and technical content creator for open source, B2B, and SaaS companies at Commit Copy, bringing experience in infrastructure monitoring, time-series databases, blockchain, streaming analytics, and more. Find him on Twitter @joelhans.
