
AI Model Mimics Brain Neurons to Reduce Energy Costs


Deployed for AI, e-prop would require only 20 watts, approximately one-millionth the energy a supercomputer uses.

Written By David Curry
Aug 3, 2020

Artificial intelligence models continue to grow in sophistication and complexity, adding to the need for more data, computation, and energy.

To help combat increasing energy costs, researchers at TU Graz’s Institute of Theoretical Computer Science have developed a new algorithm, called eligibility propagation (e-prop for short).


E-prop mimics the way neurons in the brain communicate through short electrical impulses, an approach that consumes far less energy than conventional machine learning.

“E-prop relies on two types of signals that are abundantly available in the brain, but whose precise roles in learning have not yet been understood: eligibility traces and learning signals,” said Wolfgang Maass and Robert Legenstein, authors of the e-prop paper.
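The interplay of those two signals can be sketched in a few lines of code. This is a minimal illustration, not the authors’ implementation: each synapse keeps a decaying, purely local eligibility trace of recent activity, and a per-neuron learning signal later gates which traces turn into weight changes. The decay factor, learning rate, and random stand-in for the task error are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post, n_steps = 4, 3, 200
alpha = 0.9   # assumed trace decay factor (not specified in the article)
lr = 1e-3     # assumed learning rate

w = rng.normal(0.0, 0.1, (n_post, n_pre))
trace = np.zeros_like(w)  # one eligibility trace per synapse

for t in range(n_steps):
    pre = (rng.random(n_pre) < 0.1).astype(float)  # presynaptic spikes
    post = np.tanh(w @ pre)                        # stand-in postsynaptic activity
    # Eligibility trace: a decaying, purely local record of recent
    # pre/post coincidence at each synapse.
    trace = alpha * trace + np.outer(post, pre)
    # Learning signal: a per-neuron error broadcast; here a random
    # stand-in for a real task error.
    learning_signal = rng.normal(0.0, 1.0, n_post)
    # e-prop-style update: the learning signal gates the eligibility trace.
    w += lr * learning_signal[:, None] * trace
```

Note that every quantity in the update is available locally at the synapse or broadcast to the neuron at the current timestep, which is what makes the rule biologically plausible and cheap to run.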

With this enormous reduction in energy usage, AI models could theoretically be trained and tested on hardware far less powerful and expensive than today’s world-class supercomputers.

Another difference is that e-prop works completely online: rather than storing activity data centrally and replaying it, it updates as data arrives, which improves efficiency by eliminating the separate memory and the constant data transfer that go with it.
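The memory advantage of learning online can be made concrete with some back-of-the-envelope arithmetic. The network size and sequence lengths below are assumptions chosen for illustration; the point is only the scaling: history-based training (backpropagation through time) must store activity for every timestep, while a trace-based online rule keeps a fixed amount of state regardless of sequence length.

```python
n = 512           # assumed number of recurrent neurons
synapses = n * n  # one running trace per synapse, independent of time

def bptt_floats(t_steps: int) -> int:
    # Backpropagation through time stores every neuron's activity at
    # every timestep before it can compute updates.
    return t_steps * n

def eprop_floats(t_steps: int) -> int:
    # An online trace-based rule keeps only the current traces;
    # memory does not grow with sequence length.
    return synapses

# BPTT memory grows with sequence length; the online rule's stays fixed:
print(bptt_floats(1_000), eprop_floats(1_000))      # 512000 262144
print(bptt_floats(100_000), eprop_floats(100_000))  # 51200000 262144
```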

“The theory of e-prop makes a concrete, experimentally testable prediction: that the time constant of the eligibility trace for a synapse is correlated with the time constant for the history-dependence of the firing activity of the postsynaptic neuron,” said the research team.

“It also suggests that the experimentally found diverse time constants of the firing activity of populations of neurons in different brain areas are correlated with their capability to handle corresponding ranges of delays in temporal credit assignment for learning.”
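The role of those time constants can be pictured with a simple exponential decay: the longer a trace’s time constant, the longer the delay over which a later learning signal can still assign credit to an earlier spike. The time constants below are illustrative values, not figures from the research.

```python
import math

def trace_remaining(delay_ms: float, tau_ms: float) -> float:
    """Fraction of an eligibility trace left `delay_ms` after the spike
    that created it, for a trace with time constant `tau_ms`."""
    return math.exp(-delay_ms / tau_ms)

# A slow trace (tau = 500 ms, an assumed value) still links a learning
# signal to a spike from half a second earlier; a fast trace (tau = 20 ms)
# has decayed to essentially nothing over the same delay.
print(round(trace_remaining(500, 500), 3))  # 0.368
print(trace_remaining(500, 20) < 1e-9)      # True
```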

David Curry

David is a technology writer with several years’ experience covering all aspects of IoT, from technology to networks to security.

RTInsights

Analysis and market insights on real-time analytics including Big Data, the IoT, and cognitive computing. Business use cases and technologies are discussed.

Property of TechnologyAdvice. © 2026 TechnologyAdvice. All Rights Reserved
