With the rapid expansion of artificial intelligence, environmental watchdogs are beginning to sound the alarm about the amount of energy AI models consume. But such warnings need to be based on real numbers, and a new report by Digiconomist founder Alex de Vries attempts to quantify the growing environmental impact of AI.
De Vries, who also tracks the energy consumption of cryptocurrency mining, says the training phase of AI models is the most energy intensive: it is when the program is fed large datasets, before it answers a single prompt. But de Vries argues that the inference phase, when a trained model is put to work on real-world prompts, gets far less attention from environmental groups even though it may contribute significantly to the life-cycle cost of an AI model.
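A rough back-of-the-envelope comparison helps illustrate why de Vries thinks inference deserves more scrutiny. The numbers below are purely hypothetical placeholders, not figures from his report; the sketch only shows how a one-time training cost can be overtaken by ongoing inference at scale.

```python
# Illustrative life-cycle sketch: one-time training energy vs. ongoing inference energy.
# All figures are hypothetical placeholders, not numbers from de Vries' report.

training_energy_mwh = 1_000       # assumed one-time energy cost of the training phase (MWh)
energy_per_prompt_kwh = 0.003     # assumed energy per inference request (kWh)
prompts_per_day = 10_000_000      # assumed daily prompt volume once deployed

daily_inference_mwh = prompts_per_day * energy_per_prompt_kwh / 1_000
days_to_match_training = training_energy_mwh / daily_inference_mwh

print(f"Inference energy per day: {daily_inference_mwh:.0f} MWh")
print(f"Days of inference needed to match the training cost: {days_to_match_training:.0f}")
```

Under these assumptions, inference overtakes the one-time training cost in roughly a month, which is the kind of dynamic de Vries argues is being overlooked.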
It has been nine years since Digiconomist was first launched with the objective of “exposing the unintended consequences of digital trends.” For a big part of these nine years, the sustainability of digital assets such as Bitcoin has been a key focus of the research by…
— Digiconomist (@DigiEconomist) October 10, 2023
“Some numbers are floating out there on the energy use of AI’s carbon footprint, but despite some concerns, the content to support that, at least so far, wasn't there,” de Vries told Decrypt. “I figured that's where I could step in and try to shed some light on it, although it's very difficult.”
“It's very different than with cryptocurrencies to come up with numbers,” he added.
AI developers, de Vries said, often use the same narrative that cryptocurrency companies do, claiming to [influence utilities to] produce renewable energy. But the large data centers needed to house hundreds of computers or servers, despite the cost of building them, yield little for the local economy in the way of jobs.
“You don't need a lot of people to run a data center, and it also doesn't attract additional business because you don't have to be next to the data center,” de Vries said. “So you're left with a massive power hog.”
De Vries also highlighted how slowly the environmental community responds to industries that consume large amounts of energy, pointing to the gap between the ecological scrutiny of Bitcoin, which came only after the 2017 bubble burst, and the limited scrutiny of today's AI hype.
“The environmental community tends to be a bit slow—the first headlines with regard to Bitcoin’s electricity consumption were in like 2017, 2018, when the first Bitcoin bubble happened,” de Vries said. “It took until the second half of 2022 before environmental organizations started to get seriously involved.”
Earlier this month, a report by the Associated Press said data centers like those used for ChatGPT consume about 500 milliliters of water for every 5 to 50 prompts. The AP noted that OpenAI’s data centers are allegedly located near the same waters in Iowa that feed local corn fields.
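Taken at face value, the AP's figures work out to somewhere between 10 and 100 milliliters of water per prompt; the short sketch below simply divides out the reported range.

```python
# Per-prompt water use implied by the AP figures cited above:
# roughly 500 mL of water for every 5 to 50 prompts.
water_ml = 500
for prompts in (5, 50):
    print(f"{prompts} prompts -> about {water_ml / prompts:.0f} mL of water per prompt")
```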
As de Vries and others try to bring attention to AI's energy consumption, some say the concern will prove as overblown as it was for the energy consumption of blockchains.
“We'll see apocalyptic predictions about AI's energy use fall flat,” HIVE Research Director Adam Sharp previously told Decrypt, citing a 2017 Newsweek prediction that Bitcoin would consume 100% of the world’s energy by 2020.
“This tech is very new, and efficiency will improve dramatically over coming years,” Sharp said.
In his report, de Vries cautions against counting on AI technology and hardware improvements to solve the technology’s environmental issues.
“It is probably too optimistic to expect that improvements in hardware and software efficiencies will fully offset any long-term changes in AI-related electricity consumption,” de Vries wrote. “These advancements can trigger a rebound effect whereby increasing efficiency leads to increased demand for AI, escalating rather than reducing total resource use.”
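The rebound effect de Vries describes can be shown with a toy calculation; the percentages below are hypothetical and chosen only to illustrate the mechanism, not taken from his report.

```python
# Toy illustration of the rebound effect de Vries warns about:
# efficiency halves the energy per query, but cheaper AI quadruples demand.
# All numbers are hypothetical, chosen only to show the mechanism.
baseline_energy_per_query = 1.0   # arbitrary energy units
baseline_queries = 1_000_000

efficiency_gain = 0.5             # energy per query cut in half
demand_growth = 4.0               # query volume quadruples

before = baseline_energy_per_query * baseline_queries
after = (baseline_energy_per_query * efficiency_gain) * (baseline_queries * demand_growth)

print(f"Total energy before: {before:,.0f}  after: {after:,.0f}  (x{after / before:.1f})")
```

In this toy case, a 50% efficiency gain still leaves total consumption twice as high, the "escalating rather than reducing" outcome the report warns about.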
De Vries hopes that as more headlines on AI's electricity consumption emerge, environmental groups will pay attention, having learned from cryptocurrencies. However, he said the limited data available may lead groups to put the question of AI's energy consumption on the back burner.
“Right now, the numbers are small, so you can argue, ‘Why do we need to put this very high on our agenda if it's still small?’” de Vries said. “But the thing is not going to stay small for very long.”
De Vries hopes that people will treat sustainability as a key consideration when using AI, alongside the technology's other known limitations and concerns. As AI tools become more ubiquitous, researchers have sounded the alarm about data privacy, AI's pattern of producing biased or racist responses, and its tendency to generate false information, also known as hallucinating.
“There's a lot of hype, and I think that's the biggest thing with emerging technologies,” de Vries said. “Everyone always gets lost in the hype and the fear of missing out, and we have to do something, and we completely forget about the end user.”
Likening the hype around AI to the hype that surrounded blockchain, de Vries cautioned against thinking of AI as a silver bullet to solve all the world's problems.
“You really don't necessarily have to try something out to figure out it's not going to be useful,” he said. “There's going to be very clear cases when you need a distributed database and when you don't, and it's also going to be the case for AI: when you're going to need a very big model in your back end to help you do stuff versus when you don't.”