Artificial Intelligence, Carbon Footprints, and the Risk to the Environment

Image: Pixabay

Artificial intelligence consumes huge amounts of electricity and water, and the problems this causes are only going to worsen. Much like the earlier alarm over the environmental impact of bitcoin and cryptocurrency (bitcoin mining alone uses more electricity than Norway and Ukraine combined), there are now growing calls to reckon with the similar toll of the increased use of AI technology.

According to The Guardian, artificial intelligence tools run on GPUs, complex computer chips capable of handling the billions of calculations per second needed to power tools like ChatGPT and Google Bard. These chips require huge amounts of electricity.

Sasha Luccioni, a researcher in ethical and sustainable AI, said: “Fundamentally speaking, if you do want to save the planet with AI, you have to consider also the environmental footprint [of AI first]… It doesn’t make sense to burn a forest and then use AI to track deforestation.”

Several experts are currently trying to quantify AI’s environmental impact, which is difficult for many reasons, one being that the companies behind the most popular tools won’t share details of how much energy their systems use.

Another problem is that this cost is invisible to users: we cannot see the cloud-based servers, the chips working through processing tasks, or the huge volumes of water passing through pipes inside data centers to keep the computers powering AI tools cool.

According to the Guardian, when it comes to water use, ChatGPT consumed roughly 500 ml of water for every 20 questions and corresponding answers before GPT-4 was integrated into it, and researchers forecast that this figure will grow with the move to GPT-4.
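As a rough back-of-the-envelope illustration of what that cited figure implies per interaction, the sketch below divides it out and scales it up. The 100-million-exchanges-per-day volume is a hypothetical assumption chosen purely for illustration, not a figure from the report.

```python
# Back-of-the-envelope water estimate based on the figure cited above:
# roughly 500 ml of water per 20 questions and answers.
WATER_ML_PER_20_EXCHANGES = 500
ml_per_exchange = WATER_ML_PER_20_EXCHANGES / 20  # = 25 ml per question-and-answer

# Hypothetical daily query volume, assumed only for illustration.
DAILY_EXCHANGES = 100_000_000

daily_litres = ml_per_exchange * DAILY_EXCHANGES / 1000
print(f"~{ml_per_exchange:.0f} ml of water per exchange")
print(f"~{daily_litres:,.0f} litres per day at {DAILY_EXCHANGES:,} exchanges/day")
```

At that assumed volume, 25 ml per exchange works out to roughly 2.5 million litres of water a day, which is why the per-query figure matters despite sounding small.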

Energy use and carbon footprint are harder to calculate. Researchers estimate that training GPT-3 consumed 1,287 MWh of electricity and emitted more than 550 tonnes of carbon dioxide, the equivalent of flying between New York and San Francisco 550 times.
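These two figures can be cross-checked against each other using only the numbers cited above. The per-MWh and per-flight values below are implied averages derived from that estimate, not independently measured data.

```python
# Cross-checking the cited GPT-3 training figures against the flight comparison.
TRAINING_ENERGY_MWH = 1_287        # estimated energy used to train GPT-3
TRAINING_EMISSIONS_TONNES = 550    # "more than 550 tonnes" of CO2
NY_SF_FLIGHTS_EQUIVALENT = 550     # New York-San Francisco flights in the comparison

# Implied carbon intensity of the electricity used (tonnes of CO2 per MWh).
intensity = TRAINING_EMISSIONS_TONNES / TRAINING_ENERGY_MWH
print(f"~{intensity:.2f} tonnes CO2 per MWh (~{intensity * 1000:.0f} kg per MWh)")

# Implied emissions per flight in the comparison.
per_flight = TRAINING_EMISSIONS_TONNES / NY_SF_FLIGHTS_EQUIVALENT
print(f"~{per_flight:.1f} tonne of CO2 per New York-San Francisco flight")
```

The arithmetic implies roughly 0.43 tonnes of CO2 per MWh and about one tonne of CO2 per flight in the comparison, which is how the estimate and the flight analogy fit together.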

Attempts are being made to preserve AI’s capabilities without the huge energy use, but many of the solutions being suggested end up trading performance for environmental good.

It is unlikely that we will see performance sacrificed to reduce ecological impact, but the issue still needs to be addressed. Technology analysts believe that if nothing changes, then by 2025 AI-based tools will consume more energy than the entire human workforce.

So far there is no solution to this problem, but a starting point would be to treat AI more like cryptocurrency, with greater awareness of its harmful environmental impact.