Generative AI tools like ChatGPT, Midjourney and Sora make it seem effortless to ask computers to do clever and useful things. But such tasks have a hidden cost. The energy demands of training and running artificial-intelligence models are high, and growing with the sector’s rapid expansion. This is causing headaches for power grids and slowing down the transition to a green economy.
The energy cost of a single AI-powered query is about ten times that of a standard search on Google, one of the least energy-intensive online activities. Scaled to the trillions of queries made each year, AI searches alone could consume as much electricity as an entire country, such as Ireland.
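To see how such numbers stack up, consider the back-of-envelope sketch below. Every input is an assumed round figure chosen for illustration (the energy of a conventional search, the annual query volume, Ireland's demand), not a measurement.

```python
# Back-of-envelope sketch of the scaling claim above.
# All inputs are assumed round numbers, for illustration only.
WH_PER_SEARCH = 0.3                    # assumed: one conventional Google search, in Wh
WH_PER_AI_QUERY = 10 * WH_PER_SEARCH   # the "about ten times" claim
QUERIES_PER_YEAR = 10e12               # assumed: ten trillion queries a year
IRELAND_TWH = 30                       # assumed: Ireland's annual demand, in TWh

ai_twh = WH_PER_AI_QUERY * QUERIES_PER_YEAR / 1e12   # Wh -> TWh
print(f"AI queries: ~{ai_twh:.0f} TWh a year; Ireland: ~{IRELAND_TWH} TWh")
```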
A single hyperscale data centre that runs advanced AI models can be incredibly power-intensive. In America, home to many such facilities, data centres are estimated to have guzzled 4% of the nation's total electricity in 2024, roughly equivalent to the annual electricity demand of Pakistan. Researchers reckon that around 60% of this power goes to running the servers themselves. If current trends continue, America's Department of Energy projects that data centres could consume up to 12% of the country's electricity by 2028.
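The arithmetic behind those shares is simple enough to lay out; in the sketch below the national total is an assumed round figure (and is held constant for the 2028 projection), while only the percentages come from the estimates above.

```python
# Illustrative arithmetic for the shares quoted above. The US total is
# an assumed round figure; only the percentages come from the text.
US_TWH = 4_100                     # assumed: total US electricity use, in TWh
dc_total = 0.04 * US_TWH           # 4% share in 2024 -> ~164 TWh
servers = 0.60 * dc_total          # ~60% of that runs the servers -> ~98 TWh
dc_2028 = 0.12 * US_TWH            # up to 12% by 2028, holding the total constant
print(f"{dc_total:.0f} TWh for data centres, {servers:.0f} TWh on servers, "
      f"up to {dc_2028:.0f} TWh by 2028")
```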
Such high demand is due in part to the intense energy needs of training AI models. Training requires thousands of specialised graphics processing units (GPUs) to run continuously for weeks or months, consuming enormous amounts of power; training GPT-3 alone is estimated to have required 1,287 MWh. And since AI models have a short shelf-life and must be retrained every so often, total energy use adds up quickly. This hefty demand for electricity is also responsible for substantial carbon emissions: one bleak forecast predicts that 60% of the increase in data centres' electricity demand will be met by burning fossil fuels.
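As a sanity check on that figure, the reconstruction below shows where such an estimate comes from. The GPU count, per-chip power draw, training duration and overhead factor are assumptions drawn from published estimates, not official numbers.

```python
# Rough reconstruction of the GPT-3 training-energy figure cited above.
# Every input is an assumption drawn from published estimates.
N_GPUS = 10_000        # assumed: V100-class accelerators used in training
WATTS_PER_GPU = 330    # assumed: average draw per GPU, in W
DAYS = 14.8            # assumed: wall-clock training time
PUE = 1.10             # assumed: data-centre overhead (power usage effectiveness)

mwh = N_GPUS * WATTS_PER_GPU * DAYS * 24 * PUE / 1e6   # Wh -> MWh
print(f"Estimated training energy: {mwh:,.0f} MWh")    # ~1,290 MWh
```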
The good news is that much of this energy use can be avoided. The simplest and most immediate gains are to be had in software. Choosing the right model for the right job can have dramatic effects: one study of the accuracy-versus-energy trade-off found that 77% of a model's total energy consumption could be saved for a mere 1.1% drop in accuracy.
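The logic of that trade-off can be made concrete with a small sketch: pick the cheapest model whose accuracy stays within a tolerated drop. The model names and figures below are hypothetical, chosen to echo the study's numbers.

```python
# Minimal sketch of "the right model for the right job": choose the
# lowest-energy model within a tolerated accuracy drop. All models and
# figures are hypothetical, picked to echo the study cited above.
MODELS = [
    {"name": "large",  "accuracy_pct": 91.0, "joules_per_query": 100.0},
    {"name": "medium", "accuracy_pct": 89.9, "joules_per_query": 23.0},
    {"name": "small",  "accuracy_pct": 86.0, "joules_per_query": 8.0},
]

def cheapest_within(models, max_drop_pct=1.1):
    """Return the lowest-energy model within max_drop_pct of the best accuracy."""
    best = max(m["accuracy_pct"] for m in models)
    acceptable = [m for m in models if best - m["accuracy_pct"] <= max_drop_pct]
    return min(acceptable, key=lambda m: m["joules_per_query"])

choice = cheapest_within(MODELS)
saving = 1 - choice["joules_per_query"] / MODELS[0]["joules_per_query"]
print(f"{choice['name']}: {saving:.0%} energy saved")   # medium: 77% saved
```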
Methods like model distillation, in which a compact "student" model is trained to mimic the outputs of a much larger "teacher", are proving effective (a minimal sketch appears at the end of this section). In the longer term, efficiency gains will come from hardware. Chips designed specifically for AI workloads could bring another step-change, though they are not a near-term replacement for the GPUs that power large language models.
Perhaps most important of all is switching energy sources. But decarbonising AI data centres is not as simple as running them on electricity from renewable sources. Some argue that AI's constant, round-the-clock demand for power is fundamentally incompatible with the intermittent nature of renewables, creating a powerful case for other carbon-free, baseload power sources such as nuclear energy.
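To make the distillation idea concrete, the sketch below shows the standard training loss, assuming PyTorch. The temperature, loss weighting and placeholder models are illustrative choices following the common recipe, not any particular lab's implementation.

```python
# A minimal sketch of knowledge distillation, assuming PyTorch.
# The teacher, student and data are placeholders; the temperature and
# loss weighting follow the standard recipe, not a specific lab's setup.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets: the student matches the teacher's softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: the student still learns from the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Hypothetical training step: the large teacher is needed only during
# training; the small student then runs cheaply at inference time.
# loss = distillation_loss(student(x), teacher(x).detach(), y)
```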