New Technology Deciphers Thoughts Using fMRI and AI Language Models

Combining the ability of fMRI to monitor neural activity with the predictive power of AI language models has resulted in a thought decoder.

Jun 29, 2023
2 minute read

Functional magnetic resonance imaging (fMRI) has transformed cognitive neuroscience, but it has a fundamental limitation: neuroscientists cannot look at a brain scan and tell what someone is seeing, hearing, or thinking in the scanner. Researchers are now one step closer to decoding internal experiences into words using fMRI and artificial intelligence language models, according to a new study published in Nature Communications. This technology could benefit people who cannot outwardly communicate, such as those who have suffered strokes or are living with amyotrophic lateral sclerosis.

The AI language system is an early relative of the model behind ChatGPT

Combining fMRI’s ability to monitor neural activity with the predictive power of AI language models has resulted in a decoder that can reproduce, with a high level of accuracy, the stories that a person listened to or imagined telling in the scanner. The decoder is still in its infancy, however: it requires extensive training for each user, and it doesn’t construct an exact transcript of the words heard or imagined.
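At a high level, this kind of decoder works by having a language model propose candidate word sequences and then scoring each candidate by how well a predicted brain response matches the measured fMRI signal. The sketch below illustrates only that candidate-scoring loop; the `language_model_continuations` and `encoding_model` functions are toy stand-ins invented for this example, not the study's actual models, which are trained per participant on many hours of scan data.

```python
import numpy as np

def language_model_continuations(prefix: str) -> list[str]:
    """Toy stand-in: propose candidate next words for a text prefix.
    A real system would query a neural language model."""
    return ["story", "music", "silence"]

def encoding_model(text: str) -> np.ndarray:
    """Toy stand-in: map a text to a simulated brain-response vector.
    Seeding by the text's hash just makes the toy deterministic
    within one run; a real encoding model is fit to fMRI data."""
    seed = abs(hash(text)) % (2**32)
    return np.random.default_rng(seed).normal(size=8)

def decode_step(prefix: str, measured_response: np.ndarray) -> str:
    """Pick the candidate word whose predicted response best
    correlates with the measured response."""
    best_word, best_score = None, -np.inf
    for word in language_model_continuations(prefix):
        predicted = encoding_model(prefix + " " + word)
        score = np.corrcoef(predicted, measured_response)[0, 1]
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# Simulate a measured response to a known sentence, then decode.
measured = encoding_model("she heard the story")
print(decode_step("she heard the", measured))
```

Because the decoder selects whichever candidate fits best rather than transcribing words directly, it tends to recover the gist of what was heard or imagined rather than an exact transcript, which matches the study's description of its output.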

The team also tested the technology to see what might happen if someone wanted to resist or sabotage the scans. Study participants who tried to trick it by telling another story in their heads produced gibberish results.

There are still obstacles, but researchers are cautiously optimistic

Researchers emphasize the need for proactive policies that protect the privacy of people’s internal mental processes. The decoder’s accuracy also dropped when it struggled with grammatical features such as pronouns and with proper nouns such as names and places.

The biggest roadblock is fMRI itself, which doesn’t directly measure the brain’s rapid firing of neurons but instead tracks the slow changes in blood flow that supply those neurons with oxygen. Despite the limitations, the ability to translate imagined speech into words is critical for designing brain-computer interfaces for people unable to communicate with language.

The technology is many years away from being used as a brain-computer interface in everyday life, as the scanning technology isn’t portable and requires extensive customization: the AI models must be trained to adapt to each user’s brain. Still, researchers hope that commonalities across people’s brains will be uncovered in the future to make the technology more accessible.

Elizabeth Wallace

Elizabeth Wallace is a Nashville-based freelance writer with a soft spot for data science and AI and a background in linguistics. She spent 13 years teaching language in higher ed and now helps startups and other organizations explain, clearly, what it is they do.
