
Tool predicts the carbon footprint of algorithms

News
By Jean-Pierre Joosting



Artificial intelligence, and particularly the subfield of deep learning, looks set to become a significant climate culprit if industry trends continue. In just six years, from 2012 to 2018, the compute required for deep learning grew by 300,000%. Yet the energy consumption and carbon footprint of developing these algorithms is rarely measured, even though numerous studies clearly demonstrate the growing problem.

This is important because deep learning algorithms churn whenever we use Siri or Alexa, when Netflix suggests movies and TV shows based upon our viewing histories, or when we communicate with a website’s customer service chatbot.

To address this issue, two students at the University of Copenhagen’s Department of Computer Science, Lasse F. Wolff Anthony and Benjamin Kanding, together with Assistant Professor Raghavendra Selvan, have developed a software programme they call Carbontracker. The programme can calculate and predict the energy consumption and CO2 emissions of training deep learning models.

“Developments in this field are going insanely fast and deep learning models are constantly becoming larger in scale and more advanced. Right now, there is exponential growth. And that means an increasing energy consumption that most people seem not to think about,” according to Lasse F. Wolff Anthony.


Deep learning training is the process by which a mathematical model learns to recognize patterns in large datasets. It is an energy-intensive process that runs on specialized, power-hungry hardware 24 hours a day.

“As datasets grow larger by the day, the problems that algorithms need to solve become more and more complex,” states Benjamin Kanding.

One of the biggest deep learning models developed thus far is the advanced language model known as GPT-3. In a single training session, it is estimated to use the equivalent of a year’s energy consumption of 126 Danish homes, and emit the same amount of CO2 as 700,000 kilometres of driving.

“Within a few years, there will probably be several models that are many times larger,” says Lasse F. Wolff Anthony.

“Should the trend continue, artificial intelligence could end up being a significant contributor to climate change. Jamming the brakes on technological development is not the point. These developments offer fantastic opportunities for helping our climate. Instead, it is about becoming aware of the problem and thinking: How might we improve?” explains Benjamin Kanding.


Carbontracker, which is a free programme, aims to give the field a foundation for reducing the climate impact of models. Among other things, the programme gathers information on how much CO2 is emitted to produce energy in whichever region the deep learning training is taking place. This makes it possible to convert energy consumption into predicted CO2 emissions.
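The conversion itself is simple arithmetic: multiply the energy drawn by the training job by the average carbon intensity of the local grid. A sketch, with illustrative (assumed) figures rather than values from the tool:

```python
def energy_to_co2(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """Convert energy consumption into CO2 emissions in kilograms.

    intensity_g_per_kwh is the grams of CO2 emitted per kWh of
    electricity produced in the region where training runs.
    """
    return energy_kwh * intensity_g_per_kwh / 1000.0  # grams -> kilograms


# Assumed grid intensities in g CO2 / kWh, for illustration only.
intensities = {"coal-heavy grid": 800.0, "hydro-heavy grid": 13.0}

energy_kwh = 500.0  # assumed energy for one training run
for region, intensity in intensities.items():
    print(f"{region}: {energy_to_co2(energy_kwh, intensity):.1f} kg CO2")
```

With these assumed numbers the same 500 kWh job emits 400 kg of CO2 on the coal-heavy grid but only 6.5 kg on the hydro-heavy one, which is the kind of region-dependent gap the tool surfaces.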

The two computer science students suggest that deep learning practitioners look at when their model trainings take place, as power is not equally green over a 24-hour period, as well as what type of hardware and algorithms they deploy.
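The timing suggestion can be sketched as a small scheduling helper: given an hourly forecast of grid carbon intensity, pick the start hour that minimizes the emissions of a fixed-length job. The forecast values below are made up for illustration.

```python
def greenest_start_hour(hourly_intensity: list, job_hours: int) -> int:
    """Return the start index whose window of `job_hours` consecutive
    hours has the lowest summed carbon intensity (g CO2 / kWh)."""
    best_start, best_total = 0, float("inf")
    for start in range(len(hourly_intensity) - job_hours + 1):
        total = sum(hourly_intensity[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start


# Assumed 24-hour forecast: greener (lower) at night, dirtier at daytime peak.
forecast = [120, 110, 100, 95, 90, 100, 150, 200, 250, 260, 255, 240,
            230, 220, 210, 205, 215, 240, 260, 250, 200, 160, 140, 130]
print(greenest_start_hour(forecast, job_hours=4))  # -> 2, i.e. 02:00-06:00
```

The same energy consumption then maps to fewer grams of CO2 simply because the job ran while the grid was greener.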

“It is possible to reduce the climate impact significantly. For example, it is relevant if one opts to train their model in Estonia or Sweden, where the carbon footprint of a model training can be reduced by more than 60 times thanks to greener energy supplies. Algorithms also vary greatly in their energy efficiency. Some require less compute, and thereby less energy, to achieve similar results. If one can tune these types of parameters, things can change considerably,” concludes Lasse F. Wolff Anthony.

www.ku.dk


