Stanford Unveils New Flexible AI Chip


The new chip will use a compute-in-memory concept that handles processing within the memory component itself.

Researchers at Stanford University have unveiled a smaller, more flexible AI chip that could bring AI capabilities to the smallest edge devices. The work was described in a recent article in Nature. The chips could have a wide range of uses in edge environments, where the energy demands of processing must be balanced against available power. These new energy-efficient chips stretch battery life further while remaining no larger than a fingertip.

Right now, the separation of data processing and storage drains the battery as data continually moves back and forth between the two. The new chip reduces that data movement through resistive random-access memory (RRAM), a compute-in-memory design that handles processing within the memory component itself.

This is a more scalable option for edge devices. RRAM is nonvolatile, so it keeps data even when the device turns off. Doing calculations on the chip itself removes the need to send data packets to and from the cloud and improves the overall efficiency of even the smallest batteries. That energy efficiency means more devices could gain AI capabilities.
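To make the compute-in-memory idea concrete, here is a minimal, purely illustrative sketch (not the Stanford design, and the function name is a hypothetical one). An RRAM crossbar stores a neural network's weights as cell conductances; applying input voltages to the columns produces, by Ohm's and Kirchhoff's laws, row currents equal to the matrix-vector product, so the multiply-accumulate step happens inside the memory array itself, with no data shuttled to a separate processor.

```python
# Idealized RRAM crossbar: weights live in memory as conductances G[i][j]
# (siemens); inputs arrive as column voltages v[j] (volts). The current
# collected on row i is sum_j G[i][j] * v[j] -- a matrix-vector multiply
# performed "in memory", which is the core operation of a neural-network layer.

def crossbar_mvm(conductances, voltages):
    """Return the output current on each row of an idealized crossbar."""
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

# Example: a 2x3 weight matrix stored as conductances, driven by three inputs.
G = [[0.5, 0.1, 0.2],
     [0.3, 0.4, 0.1]]
v = [1.0, 0.0, 1.0]
currents = crossbar_mvm(G, v)
print(currents)  # row currents = the matrix-vector product G @ v
```

In a real device the analog currents are then digitized, and nonidealities such as conductance drift and wire resistance must be handled; the sketch only shows why keeping the weights where the computation happens eliminates the costly back-and-forth data movement described above.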



Building on existing chip technology

Compute-in-memory is already well established, and the researchers aren't the first to propose processing directly within memory. However, this is one of the first times researchers have integrated this much memory directly onto a neural network chip, and they have validated the design through hardware measurements, offering a glimpse of what edge computing could look like in the near future.

The team ran the chip against standard benchmarks and found it achieved:

  • 99% accuracy on handwritten-digit recognition (MNIST dataset)
  • 85.7% accuracy on image classification (CIFAR-10 dataset)
  • 84.7% accuracy on Google speech command recognition
  • 70% reduction in image-reconstruction error on a Bayesian image recovery task

These promising results point to real potential for the proof-of-concept chip, and the researchers are now working to ready it for production. If the chips reach that stage, they're expected to be cost-effective and adaptable, making them suitable for a wide range of uses, from home health monitoring to helping fight climate change.


About Elizabeth Wallace

Elizabeth Wallace is a Nashville-based freelance writer with a soft spot for data science and AI and a background in linguistics. She spent 13 years teaching language in higher ed and now helps startups and other organizations clearly explain what they do.
