Artificial intelligence systems could account for nearly half of all power consumption in global datacentres by the end of this year, according to new research — fuelling growing concerns over the environmental impact of AI technologies.
The analysis, conducted by Alex de Vries-Gao, founder of the Digiconomist tech sustainability platform, suggests that AI could represent up to 49% of total datacentre energy use by the end of 2025. The study is due to be published in the energy journal Joule and comes just days after the International Energy Agency (IEA) forecast that AI could require nearly as much electricity by the end of the decade as Japan consumes today.
Based on the electricity drawn by chips from major AI hardware suppliers, including Nvidia, AMD and Broadcom, the research estimates that AI currently accounts for around 20% of total datacentre energy consumption, already a significant slice of the 415 terawatt hours (TWh) that the IEA says datacentres worldwide consumed last year, excluding cryptocurrency mining.
De Vries-Gao factored in variables such as hardware efficiency, cooling systems and workload intensity to estimate AI's growing share of demand. He warns that the pace of expansion in AI hardware and model training could soon push AI-specific power demand to 23 gigawatts (GW), more than twice the total power usage of the Netherlands.
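As a rough check on the arithmetic, the reported figures can be sketched in a few lines of Python. The 415 TWh, 20%, 49% and 23 GW values come from the research and the IEA as reported above; the hours-in-a-year conversion and the assumption of round-the-clock operation are added here purely for illustration.

# Back-of-envelope sketch of the reported figures (illustrative only).
# Reported: 415 TWh of global datacentre electricity use last year (IEA,
# excluding cryptocurrency mining), an estimated ~20% AI share today, up to
# 49% by the end of 2025, and a possible 23 GW of AI-specific demand.
# The hours-per-year conversion and continuous-operation assumption are
# added here for illustration, not taken from the study.

TOTAL_DATACENTRE_TWH = 415        # global datacentre electricity use last year
AI_SHARE_NOW = 0.20               # estimated current AI share
AI_SHARE_END_2025 = 0.49          # projected AI share by end of 2025
AI_DEMAND_GW = 23                 # projected AI-specific power draw
HOURS_PER_YEAR = 24 * 365         # 8,760 hours

ai_now_twh = TOTAL_DATACENTRE_TWH * AI_SHARE_NOW        # ~83 TWh
ai_2025_twh = TOTAL_DATACENTRE_TWH * AI_SHARE_END_2025  # ~203 TWh, if the total stayed flat
ai_gw_as_twh = AI_DEMAND_GW * HOURS_PER_YEAR / 1_000    # ~201 TWh if run continuously

print(f"AI today (20% of 415 TWh):       {ai_now_twh:.0f} TWh")
print(f"49% of last year's total:        {ai_2025_twh:.0f} TWh")
print(f"23 GW sustained for a full year: {ai_gw_as_twh:.0f} TWh")

On those assumptions, 23 GW running continuously would amount to roughly 200 TWh over a year, which is the same order of magnitude as the 49% share figure.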
“These innovations can reduce the computational and energy costs of AI,” said de Vries-Gao. “But efficiency gains can also encourage wider adoption, and ultimately more energy use.”