While misinformation and the threat of AI taking over human jobs continue to dominate the conversation about the dangers of artificial intelligence, a Boston University professor is sounding the alarm on another possible downside—the potentially sizable environmental impact of generative AI tools.
"As an AI researcher, I often worry about the energy costs of building artificial intelligence models," Kate Saenko, associate professor of computer science at Boston University, wrote in an article at The Conversation. "The more powerful the AI, the more energy it takes."
While the energy consumption of blockchains like Bitcoin and Ethereum has been studied and debated from Twitter to the halls of Congress, the effect of the rapid development of AI on the planet has not yet received the same spotlight.
Professor Saenko aims to change that, though she acknowledged in the article that there is limited data on the carbon footprint of a single generative AI query. Existing research, she said, puts it at four to five times the energy of a simple search engine query.
Citing a 2019 report, Saenko said that training a language model called Bidirectional Encoder Representations from Transformers (BERT), which has 110 million parameters, on graphics processing units (GPUs) consumed about as much energy as a round-trip transcontinental flight for one person.
In AI models, parameters are the variables learned from data that guide the model's predictions. More parameters generally mean a more complex model, which requires more data and computing power to train; during training, the parameters are adjusted to minimize errors.
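To make the idea of a parameter count concrete, here is a minimal sketch in Python using PyTorch. The layer sizes are arbitrary and chosen only for illustration; they are not taken from BERT or GPT-3.

```python
# Illustrative only: counting the learned parameters of a toy neural network.
# The layer sizes are arbitrary and not based on any model cited in the article.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(768, 3072),  # every weight and bias counts as a parameter
    nn.ReLU(),
    nn.Linear(3072, 768),
)

num_params = sum(p.numel() for p in model.parameters())
print(f"Parameters in this toy model: {num_params:,}")  # roughly 4.7 million
```

Even this small example has millions of parameters; BERT has about 110 million, and GPT-3 roughly 175 billion, which is why training them takes so much more compute and energy.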
Saenko noted that, by comparison, training OpenAI's GPT-3 model, with its 175 billion parameters, consumed roughly 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide, the equivalent of 123 gasoline-powered passenger vehicles driven for one year. She added that those figures cover only getting the model ready to launch, before any consumers started using it.
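As a back-of-envelope check of those two figures (not Saenko's own methodology), training emissions can be approximated as energy consumed multiplied by the carbon intensity of the electricity grid. The short sketch below simply works backward from the numbers she cites.

```python
# Back-of-envelope arithmetic: emissions ≈ energy consumed × grid carbon intensity.
# The two constants are the figures cited in the article for GPT-3's training run.
TRAINING_ENERGY_MWH = 1_287      # reported training energy
EMISSIONS_TONNES_CO2 = 552       # reported training emissions

implied_intensity = EMISSIONS_TONNES_CO2 / TRAINING_ENERGY_MWH  # tonnes CO2 per MWh
print(f"Implied grid intensity: {implied_intensity * 1000:.0f} kg CO2 per MWh")
# About 429 kg CO2 per MWh, roughly the carbon intensity of a typical mixed grid.
```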
"If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up," Saenko said, citing Microsoft's addition of ChatGPT to its Bing web browser earlier this month.

Not helping matters is the fact that more and more AI chatbots, like Perplexity AI and OpenAI's wildly popular ChatGPT, are releasing mobile applications. That makes them even easier to use and exposes them to a much broader audience.
Saenko highlighted a Google study that found that using a more efficient model architecture and processor, along with a greener data center, can considerably reduce the carbon footprint of training.
"While a single large AI model is not going to ruin the environment," Saenko wrote, “if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, then the energy use could become an issue."
Ultimately, Saenko concluded that more research is needed to make generative AI more efficient—but she’s optimistic.
"The good news is that AI can run on renewable energy," she wrote. "By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40 compared to using a grid dominated by fossil fuels."
Interested in learning more about AI? Check out our latest Decrypt U course, “Getting Started with AI.” It covers everything from the history of AI to machine learning, ChatGPT, and ChainGPT. Find out more here.