
IBM Wants To Lead AI Hardware Development


IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

Written By
David Curry
Mar 30, 2020

Hardware acceleration is increasingly essential in applications that rely on real-time analytics and artificial intelligence. This has driven several industry efforts to find suitable solutions.

Last year, IBM established the AI Hardware Center, a research and development lab aimed at increasing AI performance by 1000x in the next ten years.

According to Karl Freund, a senior analyst at Moor Insights & Strategy, improving AI performance requires reducing the bit length of calculations while still preserving the accuracy of higher bit-length math.
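To make that trade-off concrete, here is a minimal NumPy sketch (not from IBM's work) that runs the same dot product at reduced bit lengths and compares each result against a 64-bit reference. The goal of any reduced-precision format is to keep that error negligible while making the arithmetic cheaper.

```python
# Minimal sketch: the same dot product at different bit lengths, compared
# against a 64-bit reference. Illustration only, not IBM's method.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
w = rng.standard_normal(10_000)

reference = np.dot(x, w)  # float64 "higher bit-length" result

for dtype in (np.float32, np.float16):
    approx = np.dot(x.astype(dtype), w.astype(dtype))
    rel_err = abs(float(approx) - reference) / abs(reference)
    print(f"{np.dtype(dtype).name}: relative error = {rel_err:.2e}")
```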


The common formats, 32-bit and 16-bit, are workable for most computation tasks, but for deep neural networks, IBM may need to go even smaller than 8-bit to achieve its goal.

IBM recently published a paper on “Hybrid 8-bit Floating Point,” a format that applies different levels of precision to different parts of a computation, depending on their accuracy requirements. The company demonstrated accuracy comparable to 16-bit math, but at roughly 4x lower cost.
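The details are in the paper itself; the toy quantizer below is only an illustration of the general idea, showing how the same 8 bits can be split between exponent and mantissa fields in different ways, trading dynamic range against step size. The field widths and the quantize helper are assumptions chosen for illustration, not IBM's published scheme.

```python
# Toy quantizer for low-precision floating-point formats (illustration only,
# not IBM's HFP8 implementation). It rounds a value to the nearest number
# representable with 1 sign bit + exp_bits + man_bits, ignoring subnormals,
# infinities, and NaN for simplicity.
import math

def quantize(x: float, exp_bits: int, man_bits: int) -> float:
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    bias = 2 ** (exp_bits - 1) - 1
    exp = math.floor(math.log2(abs(x)))
    exp = max(1 - bias, min(bias, exp))         # clamp to the representable range
    scale = 2.0 ** exp
    frac = abs(x) / scale                        # roughly in [1, 2)
    frac = round(frac * 2 ** man_bits) / 2 ** man_bits
    return sign * frac * scale

value = 0.83
print(quantize(value, exp_bits=4, man_bits=3))  # finer steps, narrower range -> 0.8125
print(quantize(value, exp_bits=5, man_bits=2))  # wider range, coarser steps  -> 0.875
```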

“This approach could theoretically enable someone to build a chip for training deep neural networks that would use ¼ the chip area, or perhaps deliver 4 times the performance at the same cost,” said Freund.

IBM is not the only technology company researching smaller bit formats; Google and Nvidia have both published papers on 8-bit formats for AI. Nvidia has been exploring 8-bit formats for five years now, stating in 2016 that the format would be sufficient for AI.

However, times have changed since 2016, and IBM is now working on 2-bit chips for AI that can meet the accuracy requirements of 16-bit. In a paper published in July of last year, the company showed its attempts to build an accurate 2-bit format that heavily outperformed 8-bit. While there is still a way to go, it shows that 8-bit is not the end point for everyone.
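To give a sense of how aggressive 2-bit representation is, the sketch below (again an illustration, not IBM's scheme) rounds a set of weights onto a four-level codebook with a single scale factor and measures the resulting error. Real 2-bit training methods need far more machinery, such as learned scales and per-channel quantization, to hold accuracy.

```python
# Minimal sketch of 2-bit weight quantization: map each weight to one of
# four codebook levels (the 2-bit value hardware would store) and measure
# the reconstruction error. Illustration only, not IBM's 2-bit scheme.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(1_000).astype(np.float32)

levels = np.array([-1.5, -0.5, 0.5, 1.5], dtype=np.float32)  # 2-bit codebook
scale = np.abs(w).mean()                                      # crude per-tensor scale

# Nearest codebook entry per weight; 'indices' holds the 2-bit codes.
indices = np.abs(w[:, None] / scale - levels[None, :]).argmin(axis=1)
w_q = levels[indices] * scale

print("mean absolute quantization error:", float(np.abs(w - w_q).mean()))
```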

David Curry

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.
