Global Data Centre Energy Use Set to Double Amid AI Boom, IEA Warns

The electricity required to power the world’s data centres is expected to double over the next five years, driven largely by the rapid growth and deployment of advanced AI models, according to a new report from the International Energy Agency (IEA).

By 2030, data centres could be consuming up to 945 terawatt-hours (TWh) of electricity annually—more than three times the current electricity consumption of the entire United Kingdom. This surge in demand is projected to be highly concentrated around global technology and population hubs, placing significant pressure on utility providers, grid infrastructure, and the environment.

“AI is one of the biggest stories in the energy world today,” said Fatih Birol, Executive Director of the IEA. He noted that in countries like the United States, Japan, and Malaysia, data centres are expected to drive a substantial portion of electricity demand growth—around half or more in some cases.

Despite the growing impact, major tech companies have offered limited transparency around the energy footprint of AI systems. The IEA estimates that training OpenAI’s GPT-4 model required 42 gigawatt-hours (GWh) over 14 weeks—roughly equivalent to the daily electricity use of 28,500 homes in developed countries, or 70,500 in lower-income nations.

Even routine AI operations, known as “inference”, are energy-intensive. Generating a six-second video clip using AI requires nearly eight times the electricity needed to charge a smartphone, and about twice that of a laptop.

In the U.S., data centres—many designed to support AI workloads—are projected to consume more electricity by 2030 than the country’s entire output of energy-intensive manufacturing sectors, including aluminium, steel, cement, and chemicals.

Yet the IEA also highlights AI’s potential to be part of the solution. AI technologies could help manage future energy demands, improve data centre efficiency, and speed the shift to cleaner energy sources.

The explosive rise of AI has been enabled by two major shifts: a 99% drop in the cost of compute infrastructure since 2006, and a 350,000-fold increase in the amount of compute used to train cutting-edge models over the past decade.
