
Restraining AI’s Energy-Guzzling Ways


Addressing AI’s energy-guzzling nature all comes down to using artificial intelligence thoughtfully, according to a recent Accenture report.

Written by Joe McKendrick
Jul 22, 2025

Artificial intelligence, which gobbles up data center power, is running away with available resources. “The infrastructure fueling the AI revolution is consuming unprecedented amounts of electricity and water at rates that rival entire nations,” according to a new analysis out of Accenture. Such energy-guzzling by AI is likely to grow if not addressed.

Over the next five years, AI data centers could consume up to 612 terawatt-hours, equivalent to Canada’s total annual electricity consumption, the report’s authors, led by Stephanie Jamison, state. AI’s share of global power consumption is set to rise from 0.2% in 2024 to 1.9% by 2030, an extraordinary 48% CAGR, outpacing the expected 1.5% growth in overall electricity demand over the same period.

There’s another critical resource at risk as well: “these data centers are predicted to guzzle more than 3 billion cubic meters of water annually – more than the total annual freshwater withdrawals of countries like Norway or Sweden.” 

The Accenture team recommends AI itself as an effective approach to reining in AI consumption. “With AI’s energy footprint growing relentlessly, AI governance-as-code — embedding sustainability into automated compliance systems — is key to reducing risks, lowering costs and building resilient, future-proof AI ecosystems,” they advise.

AI-driven automation “can help enforce sustainability policies and manage environmental risks in real time,” they continue. “Automation can also make it easier to select the most sustainable infrastructure for each model deployment.”

Tools and platforms such as the Cloud Native Computing Foundation’s Kepler project (Kubernetes-based Efficient Power Level Exporter) can help “support energy-aware AI scheduling.” In addition, intelligent workload orchestration tools such as Karmada (Kubernetes-based multi-cloud, multi-cluster orchestrator) can help “optimize AI workloads across regions based on carbon intensity.”
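The carbon-aware placement idea behind tools like Karmada can be sketched in a few lines. This is an illustrative toy, not Karmada’s actual API; the region names and carbon-intensity figures are hypothetical.

```python
# Hypothetical sketch of carbon-aware workload placement, in the spirit of
# multi-cluster orchestrators like Karmada. Values are illustrative only.

def pick_greenest_region(carbon_intensity: dict) -> str:
    """Return the region whose grid currently has the lowest carbon
    intensity (gCO2 per kWh), where a deferrable AI job should run."""
    return min(carbon_intensity, key=carbon_intensity.get)

# Illustrative snapshot of grid carbon intensity per region (gCO2/kWh)
regions = {"us-east": 420.0, "eu-north": 45.0, "ap-south": 710.0}
print(pick_greenest_region(regions))  # eu-north
```

A production scheduler would refresh these figures from a grid-data feed and weigh them against latency and data-residency constraints, but the core decision is this simple minimization.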

Accenture also developed what it calls a Sustainability-Adjusted Intelligence Quotient (SAIQ): a composite efficiency score that measures how efficiently AI systems convert money, electricity, water and carbon into actual performance, weighing factors such as cost, electricity consumption, carbon emissions and water intake.
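Accenture has not published SAIQ’s exact formula, so the sketch below is purely illustrative: it shows the general shape of such a composite score, performance divided by a weighted resource footprint, with made-up weights and units.

```python
# Illustrative only: the weights and normalization are hypothetical, not
# Accenture's published SAIQ methodology. The idea is a performance-per-
# footprint ratio across cost, energy, water and carbon.

def saiq_score(performance: float, cost_usd: float, kwh: float,
               litres_water: float, kg_co2: float,
               weights=(0.25, 0.25, 0.25, 0.25)) -> float:
    """Composite efficiency: performance divided by a weighted sum of
    the four resource inputs. Higher is better."""
    w_cost, w_energy, w_water, w_carbon = weights
    footprint = (w_cost * cost_usd + w_energy * kwh +
                 w_water * litres_water + w_carbon * kg_co2)
    return performance / footprint

# A model delivering the same performance on half the resources
# scores exactly twice as high.
base = saiq_score(90.0, 100.0, 50.0, 20.0, 30.0)
lean = saiq_score(90.0, 50.0, 25.0, 10.0, 15.0)
print(round(lean / base, 2))  # 2.0
```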

See also: AI Model Mimics Brain Neurons to Reduce Energy Costs

Curbing AI’s Energy-Guzzling

The Accenture authors offer numerous suggestions for bringing AI energy consumption under control, including the following:

  • Deploy AI at the edge: This “cuts down on cloud use and improves performance by reducing latency. Edge AI applications are especially well-suited to industries that rely on real-time data processing, such as manufacturing, healthcare, retail and financial services.”
  • Adopt dynamic scaling and smart load balancing: “Match energy use to AI workloads.” Design AI infrastructure “with energy proportionality in mind, drawing on principles like dynamic efficiency scaling to optimize power use across AI workloads. Reduce peak energy demands by using adaptive scheduling to shift AI processing to times when power is cheapest and cleanest.”
  • Choose right-size AI models: “Instead of defaulting to large-scale LLMs, deploy task-specific AI models. Techniques like Retrieval-Augmented Generation (RAG) can reduce inference costs by accessing data only when needed.”
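The adaptive-scheduling suggestion above can be sketched as a simple minimization: defer flexible AI jobs to the hour when power is both cheapest and cleanest. The hourly price and carbon figures here are invented for illustration, and the price-times-carbon score is one hypothetical way to combine the two goals.

```python
# A minimal sketch of adaptive scheduling: pick the hourly slot that
# minimizes a combined price x carbon score. All figures are made up.

def best_hour(price_per_kwh: list, carbon_gco2: list) -> int:
    """Return the index of the hour with the lowest price*carbon score,
    i.e. where a deferrable AI job is cheapest and cleanest to run."""
    scores = [p * c for p, c in zip(price_per_kwh, carbon_gco2)]
    return scores.index(min(scores))

prices = [0.30, 0.12, 0.10, 0.25]   # $/kWh for four hourly slots
carbon = [500, 120, 200, 450]       # gCO2/kWh for the same slots
print(best_hour(prices, carbon))  # 1 -- cheap AND clean beats cheapest alone
```

Note that hour 2 has the lowest price but hour 1 wins on the combined score, which is the point of scheduling on both signals rather than price alone.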

It all comes down to using AI thoughtfully, the Accenture team states. “The paradox of AI is that it can be used both more selectively and more broadly to reduce its impacts. Businesses that get it right can drive sustainability, profitability and competitiveness to new heights.”

Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and industry analyst focusing on artificial intelligence, digital, cloud and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. (full bio). Follow him on Twitter @joemckendrick.
