The Green Web Foundation have published a thorough and insightful report into the sustainability of AI, and the results are pretty damning. Over the past couple of years of talking about web sustainability I haven’t really touched on AI, partly because there wasn’t much publicly available hard data about its environmental impact. Now, however, it’s become an issue too big to ignore. We can’t not talk about it.
Previously I was under the impression that the majority of AI’s energy use happens during the training stage. Although the vast amounts of energy and water used for model training are bad enough, recent evidence appears to show that a large proportion happens in the inference stage — that’s the actual use of AI models, via tools like ChatGPT and AI-powered search engines.
One example from the report highlights that a typical Google search with AI could use between 4 and 30 times more energy than a traditional search:
> A good example comes from looking towards the impact of incorporating AI into search engines such as Google. A single generative AI query could use 4 to 5 times more energy than a regular search engine query. Others found that average energy consumption per search could be 6.9–8.9 Wh, compared to 0.3 Wh for a standard Google search. This gives us an enormous range of 4-30 times larger. Whichever end of the scale the figures land, it’s a significant increase.
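If you want to see where that headline 4–30× range comes from, the arithmetic is simple enough to sketch. This is just a back-of-the-envelope check using the figures quoted above — the per-query multiplier from one study and the watt-hour estimates from another — not any new data of mine:

```python
# Back-of-the-envelope check of the "4-30 times" range quoted above.
# All figures come from the studies cited in the report.
standard_search_wh = 0.3          # standard Google search, in watt-hours
ai_search_wh = (6.9, 8.9)         # AI-assisted search estimates, in watt-hours
query_multipliers = (4.0, 5.0)    # the other study's per-query multiplier

# Multipliers implied by the watt-hour figures: roughly 23x to 30x
wh_multipliers = tuple(wh / standard_search_wh for wh in ai_search_wh)

# Pooling both studies gives the report's combined range
low = min(query_multipliers + wh_multipliers)
high = max(query_multipliers + wh_multipliers)
print(f"combined range: {low:.0f}-{high:.0f} times more energy")
```

The low end (4×) comes from the per-query study, and the high end (8.9 Wh ÷ 0.3 Wh ≈ 30×) from the watt-hour comparison — hence the enormous spread.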
It makes for pretty depressing reading that companies are determined to plough ahead with shoving AI into everything and to obfuscate their emissions figures, rather than focus their considerable resources on a sustainable future.
The report makes a great point that when AI use can’t be avoided, there are still ways to use it responsibly. Many developers may feel they have no choice, given that companies appear to view AI as the only thing that will keep them relevant, so it’s important to be able to advocate for responsible AI use and to insist on transparency.