
AI drives up data centre power requirements

Business news | By Nick Flaherty



The power requirements of AI chips are driving up the use of electricity in data centres, according to an analysis by a Dutch researcher.

“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” says Alex de Vries, a Ph.D. candidate at Vrije Universiteit Amsterdam.

Training generative AI tools requires feeding the models a large amount of data, a process that is energy intensive. Hugging Face, an AI company based in New York, reported that its multilingual text-generating AI tool consumed about 433 MWh during training, enough to power 40 average American homes for a year.
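The household comparison is straightforward to reproduce. A minimal sketch in Python, assuming an average US household uses roughly 10.8 MWh of electricity a year (an assumption; the article does not give a per-home figure):

# Back-of-envelope check of the "40 homes" comparison.
TRAINING_ENERGY_MWH = 433     # reported training consumption
HOME_ANNUAL_MWH = 10.8        # assumed average US household use per year

homes = TRAINING_ENERGY_MWH / HOME_ANNUAL_MWH
print(f"~{homes:.0f} average US homes for a year")   # ~40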

Other LLMs, including GPT-3, Gopher and Open Pre-trained Transformer (OPT), reportedly used 1,287, 1,066, and 324 MWh, respectively, for training. Each of these LLMs was trained on terabytes of data and has 175 billion or more parameters.

The production of AI servers is projected to grow rapidly in the near future. By 2027, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, based on projections of AI server production.

The energy footprint does not end with training. De Vries’s analysis shows that when the tool is put to work, generating text or images in response to prompts, each request also uses a significant amount of computing power, and thus energy. ChatGPT, for example, could cost 564 MWh of electricity a day to run.
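For scale, converting that daily figure to an annual total is a one-line calculation; this sketch is illustrative only:

# Convert the estimated daily electricity use of ChatGPT to an annual total.
DAILY_MWH = 564                        # estimated daily consumption
annual_twh = DAILY_MWH * 365 / 1e6     # MWh -> TWh (1 TWh = 1,000,000 MWh)
print(f"~{annual_twh:.2f} TWh per year")   # ~0.21 TWh

At roughly 0.2 TWh a year, a single such service is still small against the 205 TWh historical data-centre figure cited below.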

De Vries says that an increase in machines’ efficiency often increases demand: in the end, technological advancements lead to a net increase in resource use, a phenomenon known as Jevons’ Paradox.

“The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it,” de Vries says.

However, the energy used by AI is still significantly smaller than the overall energy use of data centres, and it is this total demand that creates a bottleneck in supplying the power required.

Given its estimated 95% market share in 2023, Nvidia leads the AI server market. The company is expected to deliver 100,000 of its AI servers in 2023.

If operating at full capacity (i.e., 6.5 kW for Nvidia’s DGX A100 servers and 10.2 kW for DGX H100 servers), these servers would have a combined power demand of 650–1,020 MW. On an annual basis, these servers could consume up to 5.7–8.9 TWh of electricity. Compared to the historical estimated annual electricity consumption of data centres, which was 205 TWh, this is almost negligible, says de Vries.
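The 5.7–8.9 TWh range follows directly from multiplying fleet power by the hours in a year. A sketch of that conversion, assuming all 100,000 servers run continuously at full rated power:

# Annual electricity for 100,000 AI servers running flat out.
SERVERS = 100_000
POWER_KW = {"DGX A100": 6.5, "DGX H100": 10.2}    # rated power per server
HOURS_PER_YEAR = 24 * 365                         # 8,760 hours

for model, kw in POWER_KW.items():
    fleet_mw = SERVERS * kw / 1_000               # kW -> MW
    annual_twh = fleet_mw * HOURS_PER_YEAR / 1e6  # MWh -> TWh
    print(f"{model}: {fleet_mw:,.0f} MW, {annual_twh:.1f} TWh/year")
# DGX A100: 650 MW, 5.7 TWh/year
# DGX H100: 1,020 MW, 8.9 TWh/year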

Google, for example, has been incorporating generative AI in the company’s email service and is testing out powering its search engine with AI. The company currently processes up to 9 billion searches a day. Based on these figures, de Vries estimates that if every Google search used AI, it would need about 29.2 TWh of electricity a year, which is equivalent to the annual electricity consumption of Ireland.
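Dividing that estimate back out gives the implied energy per AI-assisted search; the per-search figure below is derived here, not quoted from the study:

# Implied energy per query if every Google search used AI.
ANNUAL_TWH = 29.2
SEARCHES_PER_DAY = 9e9

wh_per_search = ANNUAL_TWH * 1e12 / (SEARCHES_PER_DAY * 365)   # TWh -> Wh
print(f"~{wh_per_search:.1f} Wh per search")                   # ~8.9 Wh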

However, Google has ensured that its data centres are powered by renewable energy, and it is using machine learning to optimise power delivery in its data centres to reduce overall energy use.

That projected worldwide AI-related consumption is comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. Moreover, improvements in AI efficiency could also enable developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.

“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy intensive, so we don’t want to put it in all kinds of things where we don’t actually need it,” de Vries says.

www.cell.com/joule/fulltext/S2542-4351(23)00365-3

