IBM Wants To Lead AI Hardware Development

IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

Written By
David Curry
Mar 30, 2020
2 minute read

Hardware acceleration is increasingly essential in applications that rely on real-time analytics and artificial intelligence. This has driven several industry efforts to find suitable solutions.

Last year, IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

According to Karl Freund, a senior analyst at Moor Insights & Strategy, improving AI performance requires reducing bit lengths while still preserving the accuracy of higher-precision calculations.
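To see the trade-off Freund describes, consider a dot product computed in full 32-bit precision and again in 8-bit integers with a per-tensor scale. This is a generic, minimal Python sketch of bit-length reduction, not IBM's method; the helper to_int8 and the per-tensor scaling scheme are our own illustration.

```python
import numpy as np

def to_int8(x):
    # Map the observed value range of x onto the int8 range [-127, 127].
    scale = np.abs(x).max() / 127.0
    return np.round(x / scale).astype(np.int8), scale

a = np.random.randn(1024).astype(np.float32)
b = np.random.randn(1024).astype(np.float32)

qa, sa = to_int8(a)
qb, sb = to_int8(b)

full = float(a @ b)                         # 32-bit reference result
# Accumulate in int32 (as int8 hardware does), then undo the scaling.
low = int(qa.astype(np.int32) @ qb.astype(np.int32)) * sa * sb
print(full, low)                            # low tracks full closely
```

Each operand now occupies a quarter of the memory and bandwidth of its 32-bit original, which is where the hardware speedup comes from.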

The common formats, 32-bit and 16-bit, are workable for most computation tasks, but for deep neural networks IBM may need to go even smaller than 8 bits to achieve its goal.

IBM recently published a paper on “Hybrid 8-bit Floating Point” (HFP8), a scheme that applies different 8-bit precisions to different parts of the computation. The company demonstrated accuracy comparable to 16-bit math, but at roughly a quarter of the hardware cost.
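In the HFP8 paper, the forward pass uses an 8-bit format with 4 exponent and 3 mantissa bits, while gradients in the backward pass get 5 exponent and 2 mantissa bits, trading resolution for range. The Python sketch below is a rough simulation of rounding to such formats, not IBM's implementation; it ignores subnormals and saturation behavior, and the helper name quantize_fp is our own.

```python
import numpy as np

def quantize_fp(x, exp_bits, man_bits):
    # Round each value to the nearest number representable with the
    # given exponent/mantissa widths (sign bit implied). Subnormals
    # and overflow saturation are ignored for simplicity.
    x = np.asarray(x, dtype=np.float64)
    bias = 2 ** (exp_bits - 1) - 1
    exp = np.floor(np.log2(np.abs(x), out=np.zeros_like(x), where=x != 0))
    exp = np.clip(exp, 1 - bias, bias)      # representable exponent range
    spacing = 2.0 ** (exp - man_bits)       # gap between neighboring values
    return np.round(x / spacing) * spacing

x = np.random.randn(5)
fwd = quantize_fp(x, exp_bits=4, man_bits=3)   # 1-4-3: forward pass
bwd = quantize_fp(x, exp_bits=5, man_bits=2)   # 1-5-2: gradients, more range
print(x, fwd, bwd, sep="\n")
```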

“This approach could theoretically enable someone to build a chip for training deep neural networks that would use ¼ the chip area, or perhaps deliver 4 times the performance at the same cost,” said Freund.

IBM is not the only technology company researching smaller bit formats; Google and Nvidia have both published papers on 8-bit formats for AI. Nvidia has been exploring 8-bit formats for five years now, stating in 2016 that the format would be enough for AI.

However, times have changed since 2016, and IBM is now working on 2-bit chips for AI that aim to meet the accuracy of 16-bit calculations. In a paper published in July last year, the company described its attempts to build an accurate 2-bit format, which would far outperform 8-bit in efficiency. While there is still a way to go, it shows that 8-bit is not the endpoint for everyone.
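At 2 bits, every value must snap to one of just four levels, which is why clipping ranges (learned per layer in research such as IBM's PACT work) matter so much. Below is a toy uniform 2-bit quantizer in Python; it is our own illustration of the general idea, not the method in IBM's paper, and the fixed clip value stands in for a learned threshold.

```python
import numpy as np

def quantize_2bit(x, clip=1.0):
    # Clip to [-clip, clip], then snap each value onto a uniform grid of
    # 2**2 = 4 levels. `clip` stands in for a learned per-layer threshold.
    steps = 2 ** 2 - 1                      # 3 gaps between 4 levels
    x = np.clip(x, -clip, clip)
    step = 2 * clip / steps
    return np.round((x + clip) / step) * step - clip

w = np.random.randn(8)
print(w)
print(quantize_2bit(w))   # only four distinct values survive
```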

David Curry

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.
