
Artificial Intelligence Predicts Corn Yield Rates


Precision agriculture, through the use of artificial intelligence and analytics, could be a $12 billion industry by 2025.

Written By David Curry
Mar 11, 2020

A research group at the University of Illinois has developed a new approach to predict corn yield rates, using a machine learning algorithm that combines on-ground measurements with satellite images.

“We’re trying to change how people run agronomic research,” said assistant professor Nicolas Martin. “Instead of establishing a small field plot, running statistics, and publishing the means, what we’re trying to do involves the farmer far more directly. We are running experiments with farmers’ machinery in their own fields. We can detect site-specific responses to different inputs, and we can see whether there’s a response in different parts of the field.”


Typically, a machine learning algorithm follows a predetermined pattern and flags deviations from it. In this study, however, the research group had the algorithm learn the patterns itself, organize them, and identify more efficient ways of deploying nitrogen and seed.

“We don’t really know what is causing differences in yield responses to inputs across a field,” said Martin. “Sometimes people have an idea that a certain spot should respond strongly to nitrogen and it doesn’t, or vice versa. The [algorithm] can pick up on hidden patterns that may be causing a response and when we compared several methods, we found out that it was working very well to explain yield variation.”
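The core idea Martin describes, detecting that different parts of a field respond differently to the same input, can be illustrated with a toy example. The sketch below is not the Illinois team's actual model (which combines satellite imagery with on-ground measurements); it simply fits a separate nitrogen-response line for each hypothetical management zone of a field, showing how one zone can respond strongly to added nitrogen while another barely responds at all. All zone names and numbers are invented for illustration.

```python
# Hypothetical sketch: estimating site-specific yield response to nitrogen.
# Each record pairs an applied nitrogen rate (lb/ac) with an observed
# corn yield (bu/ac) for a given management zone. Data is simulated.

def fit_linear(points):
    """Ordinary least squares for y = a + b*x over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope: response to N
    a = (sy - b * sx) / n                          # intercept: base yield
    return a, b

# Two invented zones: zone_A responds strongly to nitrogen, zone_B barely.
field_trials = {
    "zone_A": [(120, 165), (160, 185), (200, 205), (240, 220)],
    "zone_B": [(120, 175), (160, 178), (200, 181), (240, 183)],
}

responses = {zone: fit_linear(obs)[1] for zone, obs in field_trials.items()}
for zone, slope in sorted(responses.items()):
    print(f"{zone}: +{slope:.2f} bu/ac per extra lb N/ac")
```

In practice a model like the one described in the article would learn such zone-level responses from satellite-derived features rather than from hand-labeled zones, but the output is conceptually similar: a map of where extra input actually pays off.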

The research was conducted between 2017 and 2018 in the US Midwest, Argentina, Brazil, and South Africa. Martin believes the approach can be extended beyond corn, potentially offering recommendations for many other crop and input combinations.

Precision agriculture, through the use of artificial intelligence and analytics, could be a $12 billion industry by 2025, according to Global Market Insights.

David Curry

David is a technology writer with several years' experience covering all aspects of IoT, from technology to networks to security.
