
Pepijn Stoop | Yes, Shell is polluting, but so is AI

Pepijn Stoop, 1 February 2023 - 10:09

Last week, dozens of UvA students partially occupied the Binnengasthuis in protest against our university's ties with Shell. That oil is bad for the environment, even people outside Science Park can figure out. But that artificial intelligence (AI) also has a huge environmental impact? Our own students hardly know it.

On January 1st, I stopped eating meat to reduce my own carbon footprint, and at a meeting with other artificial intelligence students I proudly talked about it. Then one of them gave a presentation on the environmental impact of AI, including that of language models. After that presentation, I felt a bit of a hypocrite. Let me explain why.

ChatGPT emits 3.82 tons of CO2 per day, more than what a household emits from electricity consumption in 2.5 years

When I started writing this piece, I found out that few people – both inside and outside AI – realize that models like ChatGPT are big polluters. A thousand pieces have already been written about ChatGPT: about how it can answer your existential life questions, write your code, and take your exams (see also my previous column). But one crucial and confrontational message is missing: ChatGPT emits 3.82 tons of CO2 per day, more than what a household emits from electricity consumption in 2.5 years. That estimate is based on one million users each asking 10 questions a day. Considering ChatGPT had already reached that number within five days of its launch and has grown rapidly since then, the emissions are probably higher.
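For the curious, the figures above can be sanity-checked with a few lines of arithmetic. The usage assumptions (one million users, 10 questions each, 3.82 tonnes of CO2 per day) are the estimate's, not measured data:

```python
# Back-of-envelope check of the column's ChatGPT figures.
users = 1_000_000            # assumed daily users
questions_per_user = 10      # assumed questions per user per day
daily_co2_tonnes = 3.82      # estimated daily emissions

queries_per_day = users * questions_per_user
grams_per_query = daily_co2_tonnes * 1_000_000 / queries_per_day
print(f"{grams_per_query:.3f} g CO2 per question")

# The 2.5-year comparison implies this annual household emission
# from electricity consumption:
implied_household_tonnes = daily_co2_tonnes / 2.5
print(f"{implied_household_tonnes:.3f} t CO2 per household per year")
```

So each individual question is cheap; it is the sheer volume of questions that makes the daily total so large.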


Pretty polluting for a convenient question-answering service. Why is that? Mainly because of the way these models are trained: the only way recent models learn language is by feeding them massive amounts of data. For ChatGPT, this involves at least eight million web pages. You don't train an algorithm like that during your lunch break: training BERT, a language model from Google, took four days. Nor can you train such models on your four-year-old Acer laptop: training GPT-3, the predecessor of ChatGPT, consumed an estimated 936 MWh. That could power several hundred households for a year.
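The household comparison is easy to verify yourself. As a sketch, I assume an average household uses roughly 2,700 kWh of electricity per year; that figure is my assumption, not one from the estimate itself:

```python
# Rough check of the GPT-3 training-energy comparison.
training_mwh = 936                # estimated GPT-3 training consumption
household_kwh_per_year = 2_700    # assumed average annual household use

households_for_a_year = training_mwh * 1_000 / household_kwh_per_year
print(f"~{households_for_a_year:.0f} households powered for a year")
```

With a lower or higher household average the count shifts, but it stays in the hundreds, not the millions.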


These are preposterous numbers, especially in the middle of a climate crisis. The saddest part is that the users of these models are not the people of Bangladesh, who have to deal with the floods, but rich Western countries. Worse still, I almost completed my bachelor's degree without ever hearing about this. Students know, indirectly, that models don't run on air. But knowing just how huge that consumption is provides an important counterargument against using certain "grey" models, such as a neural network, when a simpler and greener model will do. Future PhD students in AI, above all, need to know that.


In the meantime, I ask myself whether I am still being green if I don't eat meat but run models every week that do my footprint no good. Maybe I should ask ChatGPT about that after all.


Pepijn Stoop is a UvA student of artificial intelligence.