IBM Wants To Lead AI Hardware Development

IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

Written By David Curry
Mar 30, 2020

Hardware acceleration is increasingly essential in applications that rely on real-time analytics and artificial intelligence. This has driven several industry efforts to find suitable solutions.

Last year, IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

According to Karl Freund, a senior analyst at Moor Insights & Strategy, AI performance improvements require reducing bit lengths while still preserving the accuracy of higher bit-length calculations.

The common formats, 32-bit and 16-bit, are workable for most computation tasks, but for deep neural networks, IBM may need to go even smaller than 8 bits to achieve its goal.

IBM recently published a paper on "Hybrid 8-bit Floating Point," a format that applies different levels of precision to different parts of the computation. The company demonstrated accuracy comparable to 16-bit math, but at roughly one-quarter of the cost.
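The basic idea behind any reduced-precision float format is to shrink the exponent and mantissa budgets, trading accuracy for chip area and bandwidth. The sketch below is a simplified illustration of that trade-off, not IBM's published method: it rounds a value to the nearest number representable with a given exponent/mantissa split, ignoring subnormals, overflow handling, and the rounding-mode details of real 8-bit hardware.

```python
import math

def quantize_float(x, exp_bits, man_bits):
    """Round x to the nearest value representable with the given
    exponent/mantissa bit budget (illustrative only; ignores subnormals,
    overflow clamping, and hardware rounding modes)."""
    if x == 0.0:
        return 0.0
    sign = math.copysign(1.0, x)
    x = abs(x)
    bias = 2 ** (exp_bits - 1) - 1           # exponent bias, as in IEEE formats
    e = math.floor(math.log2(x))
    e = max(min(e, bias), 1 - bias)          # clamp to representable exponents
    scale = 2.0 ** (e - man_bits)            # spacing between neighboring values
    return sign * round(x / scale) * scale

# A 1-4-3 layout (1 sign, 4 exponent, 3 mantissa bits) keeps more precision;
# a 1-5-2 layout trades precision for wider dynamic range.
print(quantize_float(0.1, 4, 3))   # -> 0.1015625
print(quantize_float(0.1, 5, 2))   # -> 0.09375
```

With only 2 or 3 mantissa bits, even a simple constant like 0.1 lands noticeably off its true value, which is why mixing precisions across the computation, as the IBM paper proposes, matters for preserving training accuracy.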

“This approach could theoretically enable someone to build a chip for training deep neural networks that would use ¼ the chip area, or perhaps deliver 4 times the performance at the same cost,” said Freund.

IBM is not the only technology company researching smaller bit formats; Google and Nvidia have both published papers on 8-bit formats for AI. Nvidia has been exploring 8-bit formats for five years now, stating in 2016 that the format would be sufficient for AI.

However, times have changed since 2016, and IBM is now working on 2-bit chipsets for AI that can meet the accuracy requirements of 16-bit computation. In a paper published last July, the company showed its attempts to build an accurate 2-bit format that substantially outperformed 8-bit in efficiency. While there is still a way to go, it shows that 8-bit will not be enough for everyone.
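At 2 bits, each weight can only index one of four values, so everything hinges on choosing those four levels well. The sketch below shows the simplest possible scheme, uniform quantization within a fixed clipping range; it is an assumption-laden illustration, not IBM's technique, which involves learned clipping parameters and per-layer scaling that this omits.

```python
def quantize_2bit(weights, clip):
    """Map each weight to one of the 4 levels a 2-bit code can index,
    uniformly spaced in [-clip, clip]. Illustrative uniform scheme only;
    published 2-bit training methods use learned clipping and per-layer
    scales, which this sketch omits."""
    levels = 2 ** 2 - 1                       # 3 steps -> 4 representable values
    step = 2 * clip / levels
    out = []
    for w in weights:
        w = max(-clip, min(clip, w))          # clip to the quantization range
        q = round((w + clip) / step)          # nearest 2-bit code (0..3)
        out.append(q * step - clip)           # dequantized value
    return out

# Four inputs collapse onto the four available levels -1, -1/3, 1/3, 1.
print(quantize_2bit([-0.9, -0.2, 0.1, 0.8], clip=1.0))
```

The coarseness is obvious: -0.2 and 0.1 land on levels a third of the range apart, which is why making 2-bit training match 16-bit accuracy is a genuine research problem rather than a straightforward engineering step.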

David Curry

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.
