To address concerns about energy consumption, AI system providers are using research innovations and software and hardware improvements to increase energy efficiency.
Prompting has become the default way AI users across the globe query and compute, but it’s important to note that such services aren’t exactly “free.” All those prompts, covering everything from meeting notes to questions about cat food nutrition, require data center power at the backend, and, as it turns out, increasingly enormous amounts of it. Consider that tech providers have gone so far as to pursue acquiring nuclear power plants to keep their energy-hungry services running.
And AI prompting is everywhere: built into popular apps such as Microsoft Word, as well as search engines, mobile apps, and social media channels.
Earlier this year, MIT Technology Review began pursuing estimates of the energy usage and cost of prompts in the AI age, but was met with silence from tech providers, until now. Google has released data calculating how much energy its Gemini apps use per query.
“AI models use up energy in two phases: when they initially learn from vast amounts of data, called training, and when they respond to queries, called inference,” report Casey Crownhart and James O’Donnell in the earlier MIT Technology Review analysis. “When ChatGPT was launched a few years ago, training was the focus, as tech companies raced to keep up and build ever-bigger models. But now, inference is where the most energy is used.”
Google estimates the median Gemini app text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 milliliters (or about five drops) of water. These figures “are substantially lower than many public estimates,” the report’s authors, Amin Vahdat and Jeff Dean, both with Google, stated. “The per-prompt energy impact is equivalent to watching TV for less than nine seconds.”
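As a rough sanity check on that TV comparison, assuming a television that draws about 100 watts (an assumed figure for illustration, not one from Google’s report), the per-prompt energy converts to viewing time like this:

```python
# Back-of-the-envelope check of Google's per-prompt energy figure.
# The 100 W television draw is an assumption for illustration only;
# the 0.24 Wh per prompt comes from Google's published report.

ENERGY_PER_PROMPT_WH = 0.24  # median Gemini app text prompt (per Google)
TV_POWER_W = 100             # assumed power draw of a typical TV

seconds_of_tv = ENERGY_PER_PROMPT_WH / TV_POWER_W * 3600
print(f"One prompt is roughly {seconds_of_tv:.1f} seconds of TV time")
# Prints ~8.6 seconds, consistent with "less than nine seconds"
```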
The calculations are all-encompassing, measuring not only machine cycles but also overhead and water usage. However, the report “was also strictly limited to text prompts, so it doesn’t represent what’s needed to generate an image or a video,” reported Crownhart in a follow-up article in MIT Technology Review. “Other analyses show that these tasks can require much more energy.” Plus, she added, Google did not reveal the number of prompts it processes, which would provide a bigger picture of total energy usage.
For example, it’s estimated that OpenAI’s ChatGPT handles more than 2.5 billion requests daily. If OpenAI’s per-prompt energy footprint is similar to Google’s, that equates to roughly 600 million watt-hours (600 megawatt-hours) consumed per day. (And that’s just one AI provider among many!)
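Here’s a quick sketch of that scaling math, treating both inputs as the estimates they are; applying Google’s Gemini figure to ChatGPT’s volume is purely illustrative:

```python
# Scale a per-prompt energy estimate up to a daily prompt volume.
# Both numbers are estimates: 0.24 Wh/prompt is Google's median for
# Gemini text prompts; 2.5 billion requests/day is a third-party
# estimate for ChatGPT. Combining them is illustrative only.

ENERGY_PER_PROMPT_WH = 0.24
PROMPTS_PER_DAY = 2.5e9

daily_wh = ENERGY_PER_PROMPT_WH * PROMPTS_PER_DAY
print(f"{daily_wh:,.0f} Wh/day (= {daily_wh / 1e6:,.0f} MWh/day)")
# Prints 600,000,000 Wh/day (= 600 MWh/day)
```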
A Focus on Energy Efficiency
The Google researchers indicate that their company has been aggressively pursuing greater energy efficiency. “Our AI systems are becoming more efficient through research innovations and software and hardware efficiency improvements,” they state. “For example, over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt dropped by 33x and 44x, respectively, all while delivering higher quality responses. These results are built on our latest data center energy emissions reductions and our work to advance carbon-free energy and water replenishment.”
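Taken at face value, the quoted 33x energy improvement also implies what the median Gemini text prompt cost roughly a year earlier, a simple back-calculation (illustrative arithmetic only, not a figure Google published directly):

```python
# Back-calculate the per-prompt energy implied by the quoted 33x drop.
# Illustrative arithmetic only; Google did not publish the earlier figure.

current_wh = 0.24        # median Gemini text prompt today (per Google)
energy_improvement = 33  # quoted 12-month energy reduction factor

earlier_wh = current_wh * energy_improvement
print(f"Implied earlier median: ~{earlier_wh:.1f} Wh per prompt")
# Prints ~7.9 Wh per prompt
```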
Measuring AI energy usage is an extremely inexact science as well, Crownhart and O’Donnell cautioned in the earlier MIT Technology Review article. “Measuring the energy used by an AI model is not like evaluating a car’s fuel economy or an appliance’s energy rating. There’s no agreed-upon method or public database of values. There are no regulators who enforce standards, and consumers don’t get the chance to evaluate one model against another.”