The Dangers of AI Energy Consumption

Image provided by Pixabay

It is by now well known that artificial intelligence-powered systems consume huge amounts of electricity, so a recent study set out to calculate the energy use and carbon footprint of several recent large language models.

ChatGPT, for example, was found to consume 1,287 megawatt-hours of electricity, the equivalent of the energy used by 121 US homes in a year.
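As a rough sanity check on that comparison, the reported 1,287 MWh can be divided by an average US household's annual consumption. The household figure below (about 10,600 kWh per year) is an assumption based on typical published averages, not a number from the study itself:

```python
# Rough check of the homes-per-year comparison in the article.
# Assumed figure: an average US household uses roughly 10,600 kWh per year
# (an approximation, not a value taken from the study).
total_mwh = 1_287            # reported electricity consumption
home_kwh_per_year = 10_600   # assumed average US household consumption

homes = (total_mwh * 1_000) / home_kwh_per_year  # convert MWh to kWh, then divide
print(round(homes))  # → 121
```

With that assumed household average, the arithmetic lands on the same 121-home figure cited above.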

Some say it is now time to ask what the cost of this great technological development is. Alex de Vries, a researcher at the School of Business and Economics at the Vrije Universiteit Amsterdam, argues that the energy demands of AI tools may eventually exceed the power demands of some small nations.

Generative AI has been booming since the introduction of ChatGPT in late 2022, driving up demand for AI chips. More and more companies are now developing their own chips to meet heavy AI workloads: Google and Amazon already have theirs, with Microsoft not far behind.

All this is to say: the energy footprint of the AI industry is set to rise significantly.

According to Interesting Engineering, AI tools have an initial training phase followed by an inference phase. Training is the most energy-intensive and has been the focus of AI sustainability research to date. Inference is when these tools generate output based on the data they were trained on, and de Vries has called on the scientific community to pay more attention to this phase.

De Vries concluded by noting that it is too optimistic to expect improvements in hardware and software efficiency to fully offset long-term growth in AI-related electricity consumption. Still, efforts are being made: a team of MIT researchers at the Lincoln Laboratory Supercomputing Center reduced the energy consumption of an AI model by 12-15% by capping the power drawn by the GPUs running it.
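On NVIDIA hardware, the kind of power capping the MIT team describes is commonly applied with the `nvidia-smi` tool's power-limit flag. The sketch below only builds the command rather than running it, since actually changing a GPU's power limit requires administrator privileges; the wattage values are illustrative assumptions, not figures from the study:

```python
# Sketch of GPU power capping via nvidia-smi's --power-limit flag.
# The 250 W cap below is an illustrative assumption, not the setting
# used in the MIT Lincoln Laboratory work.

def power_cap_command(gpu_index: int, watts: int) -> list[str]:
    """Build the nvidia-smi invocation that caps one GPU's power draw.
    Executing it requires root/administrator privileges, so this sketch
    only constructs the command instead of running it."""
    return ["nvidia-smi", "-i", str(gpu_index), "--power-limit", str(watts)]

# Example: cap GPU 0 at an assumed 250 W.
cmd = power_cap_command(0, 250)
print(" ".join(cmd))  # → nvidia-smi -i 0 --power-limit 250
```

Lowering the cap trades a modest slowdown for a lower sustained power draw, which is how a double-digit energy saving can come out of a simple configuration change.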