AI tech could require as much electricity as a small nation, study finds

AI has limitations and shouldn’t be used for everything, especially considering its privacy concerns and high energy demand.

Source: EcoWatch

As demand for artificial intelligence (AI) grows, a researcher has found that the technology could have a massive energy footprint, eventually requiring as much electricity as a small country.

Alex de Vries, a Ph.D. candidate at the Vrije Universiteit Amsterdam School of Business and Economics, is the founder of Digiconomist, a website that has long reported on the energy demands of technologies like Bitcoin. He has now published initial research on AI’s potential future energy demand in a commentary piece in the journal Joule.

Although AI has been around since the 1950s, the past two years have seen rapid growth of the technology, with tools like ChatGPT making headlines. But de Vries warns that AI requires a lot of electricity to train and run, and that demand is only going to grow as AI becomes more prevalent.

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” de Vries said, as reported by ScienceDaily.

According to de Vries, training AI is one of the most energy-consuming processes of using this technology. For instance, Hugging Face, an AI company, shared that one of its AI models alone, BigScience Large Open-Science Open-Access Multilingual (BLOOM), needed 433 MWh of electricity for training. That’s enough energy to power about 40 average U.S. households for a year, Euronews reported.
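The household comparison is easy to sanity-check. A minimal sketch, assuming an average U.S. household uses roughly 10,700 kWh of electricity per year (an assumed figure, not from the article):

```python
# Rough check of the BLOOM training-energy comparison.
bloom_training_mwh = 433          # training energy reported for BLOOM
household_kwh_per_year = 10_700   # assumed average U.S. household usage

# Convert MWh to kWh, then divide by one household's annual usage.
households_powered_for_a_year = bloom_training_mwh * 1_000 / household_kwh_per_year
print(round(households_powered_for_a_year, 1))  # roughly 40, matching the article
```

The result lands at about 40 households, consistent with the Euronews comparison quoted above.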

After training, AI goes through an inference phase, where users put in their questions and AI generates responses based on these new prompts. This also requires a lot of energy. De Vries wrote, “Research firm SemiAnalysis suggested that OpenAI required 3,617 of NVIDIA’s HGX A100 servers, with a total of 28,936 graphics processing units (GPUs), to support ChatGPT, implying an energy demand of 564 MWh per day.”
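Those server figures imply a plausible per-unit power draw, which can be checked with a little arithmetic. A sketch, using only the numbers quoted from SemiAnalysis:

```python
# Implied average power draw of the ChatGPT inference fleet.
daily_mwh = 564      # estimated daily energy demand
servers = 3_617      # NVIDIA HGX A100 servers
gpus = 28_936        # total GPUs (8 per server)

# Energy per day divided by 24 hours gives average power.
kw_per_server = daily_mwh * 1_000 / servers / 24    # about 6.5 kW per server
w_per_gpu = daily_mwh * 1_000_000 / gpus / 24       # about 800 W per GPU
```

Roughly 6.5 kW per 8-GPU server is in line with the rated power of that class of hardware, so the estimate is internally consistent.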

Although some AI-developing companies have expressed plans to make their tools more energy-efficient, de Vries warned that this could just increase how frequently AI is used, ultimately negating any benefits from improved efficiency.

“By 2027 worldwide AI-related electricity consumption could increase by 85.4–134.0 TWh of annual electricity consumption from newly manufactured servers. This figure is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina and Sweden,” de Vries said on Digiconomist. “While this would represent half a percent of worldwide electricity consumption, it would also represent a potential significant increase in worldwide data center electricity consumption. The latter has been estimated to represent one percent of worldwide electricity consumption.”
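The "half a percent" figure follows directly from the projection. A sketch, assuming worldwide electricity consumption of about 25,000 TWh per year (an assumed round figure, not stated in the article):

```python
# Projected AI electricity use as a share of world consumption.
ai_twh_low, ai_twh_high = 85.4, 134.0   # de Vries's 2027 range
world_twh = 25_000                      # assumed annual world consumption

share_low = ai_twh_low / world_twh * 100    # about 0.3 percent
share_high = ai_twh_high / world_twh * 100  # about 0.5 percent
```

The upper end of the range works out to roughly half a percent of world consumption, as de Vries says.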

De Vries further warned that AI has limitations and shouldn’t be used for everything, especially considering its privacy concerns and high energy demand. Instead, he said companies should be mindful of how they use the technology, deploying it only when it is truly necessary or when its benefits outweigh the costs. De Vries also recommended that policymakers consider requiring environmental disclosures for AI and AI supply chains.
