
Google Launches New AI Chips, Claims Faster Than NVIDIA


Google has launched the latest version of its TPU AI chip, taking a dig at Nvidia’s A100 in the research paper published alongside it.

Written By
David Curry
May 23, 2023

Google has announced the latest edition of its artificial intelligence chips, the fourth-generation Tensor Processing Unit (TPU v4), which it claims is faster than Nvidia’s A100 Tensor Core GPU.

According to Google, TPU v4 is 1.2 to 1.7 times faster than the Nvidia A100 and uses 1.3 to 1.9 times less power, although Nvidia has disputed some of these figures. Nvidia CEO Jensen Huang said in a blog post that the company’s newer H100 chips deliver four times the performance of the A100, and hinted that they are more powerful than Google’s TPU v4.

In the technical paper released alongside the announcement, Google notes that TPU v4 was built specifically for the supercomputer that trained the 540-billion-parameter PaLM large language model (LLM). That LLM underpins Bard, Google’s chatbot and ChatGPT competitor.

See also: Reports of the AI-Assisted Death of Prose are Greatly Exaggerated

This back-and-forth between Nvidia and Google has heated up over the past few years, although Nvidia remains firmly in the driver’s seat in terms of industry acclaim. That said, Google is a potent challenger and has some big-name generative AI developers on board, including Midjourney.

“We’re proud to work with Google Cloud to deliver a seamless experience for our creative community powered by Google’s globally scalable infrastructure,” said David Holz, founder and CEO of Midjourney. “From training the fourth version of our algorithm on the latest v4 TPUs with JAX, to running inference on GPUs, we have been impressed by the speed at which TPU v4 allows our users to bring their vibrant ideas to life.”
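As a rough illustration of the workflow Holz describes, the sketch below shows a minimal JAX training step of the kind that XLA compiles for whatever accelerator is attached, whether a TPU v4 pod or a GPU. The model, loss, and data names are hypothetical placeholders, not Midjourney’s actual code.

# Minimal sketch (assumed, not Midjourney's code): a JAX training step that
# XLA compiles for the attached accelerator (TPU, GPU, or CPU).
import jax
import jax.numpy as jnp

def loss_fn(params, batch):
    # Hypothetical linear model standing in for a large generative network.
    preds = batch["x"] @ params["w"] + params["b"]
    return jnp.mean((preds - batch["y"]) ** 2)

@jax.jit  # compiles the step once, then runs it on the available device
def train_step(params, batch, lr=1e-3):
    grads = jax.grad(loss_fn)(params, batch)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (8, 1)), "b": jnp.zeros((1,))}
batch = {"x": jnp.ones((32, 8)), "y": jnp.ones((32, 1))}
params = train_step(params, batch)
print(jax.devices())  # on a TPU v4 host this lists TPU devices

Because the same train_step runs unchanged on GPUs, the train-on-TPU, serve-on-GPU split described in the quote requires no code changes.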

Generative systems will most likely be the main talking point in tech this year, with new types of system coming online every month. ChatGPT remains the focal point for much of this activity, but we are also seeing AI-generated music featuring the likenesses of Drake and Kanye West, AI-generated videos reimagining Harry Potter and Lord of the Rings in the style of Balenciaga, and AI-generated ebooks appearing on Amazon.

To power all of this, GPU and TPU designers are working overtime to improve the performance of these chips, while lowering the amount of power each chip needs. Expect this industry to heat up over the next 12 months, as Amazon and Microsoft also look into building their own custom chips for AI and ML-related projects. 

David Curry

David is a technology writer with several years’ experience covering all aspects of IoT, from technology to networks to security.
