Data scientist and researcher Alex De Vries from the School of Business and Economics at the Vrije Universiteit Amsterdam conducted a study on artificial intelligence's environmental impact. Because generative AI runs on powerful servers, there are concerns that the computing power it demands could drive up energy consumption and carbon emissions.

De Vries initially called attention to pollution from crypto mining with his website Digiconomist. As he turns his focus to AI, he reports that it is still too early to measure the amount of planet-heating pollution due to AI-driven applications such as ChatGPT. However, it is worth paying attention now to prevent runaway emissions, and users are urged to be mindful of their reason for using AI.

Environmental Impact of AI

The direct environmental impact of artificial intelligence is mostly tied to the physical infrastructure it requires: processors, data centers, and other specialized computing hardware. Most of the available evidence points to a negative environmental impact from this computing infrastructure.

The AI computing life cycle is divided into four stages: production, transport, operations, and end-of-life. The most significant carbon emissions come from the operational stage, which accounts for around 70% to 80% of the total.

AI servers are energy-hungry devices. For example, a single NVIDIA DGX A100 server can consume as much electricity as a handful of U.S. households combined. The electricity consumption of hundreds of thousands of such devices can add up quickly.
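
For a rough sense of scale, the back-of-envelope sketch below (not from De Vries' study) compares one such server running around the clock with typical household usage. The 6.5 kW figure is NVIDIA's rated maximum system power for the DGX A100, and the household average is an approximate figure; both are used only for illustration.

```python
# Back-of-envelope sketch: annual energy of one AI server vs. average
# U.S. household consumption. The wattage and household figures are
# illustrative assumptions, not data from the article.

SERVER_POWER_KW = 6.5            # assumed peak draw of one DGX A100 server
HOURS_PER_YEAR = 24 * 365        # continuous operation assumed
HOUSEHOLD_KWH_PER_YEAR = 10_600  # approximate average U.S. household usage

server_kwh_per_year = SERVER_POWER_KW * HOURS_PER_YEAR
households_equivalent = server_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

print(f"One server: ~{server_kwh_per_year:,.0f} kWh/year "
      f"(~{households_equivalent:.1f} average U.S. households)")
```

Run continuously at that rating, a single server lands at roughly five households' worth of electricity, consistent with the "handful of households" comparison above.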

Experts suggest that by 2027, newly manufactured AI servers could add 85.4 to 134.0 TWh to worldwide annual electricity consumption. That is roughly the annual electricity consumption of countries such as Sweden, Argentina, or the Netherlands. Although this amounts to only about half a percent of global electricity consumption, it would represent a substantial increase in global data center electricity consumption.
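
The "half a percent" framing can be reproduced with simple arithmetic, as in the sketch below; the roughly 25,000 TWh figure for total global electricity consumption is an assumed round number used only for illustration.

```python
# Illustrative sketch: the projected 85.4-134.0 TWh range as a share of
# global electricity consumption. The world total is an assumed round
# figure, used only to reproduce the "half a percent" comparison.

AI_PROJECTION_TWH = (85.4, 134.0)   # projected annual AI-related demand by 2027
WORLD_CONSUMPTION_TWH = 25_000      # assumed approximate global consumption

for twh in AI_PROJECTION_TWH:
    share = twh / WORLD_CONSUMPTION_TWH * 100
    print(f"{twh:.1f} TWh is about {share:.2f}% of global electricity use")
```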

Like blockchain before it, AI is an emerging technology often accompanied by hype and the fear of missing out. As a result, tech companies build applications that yield little to no benefit to end users. Because AI is an energy-intensive technology, this trend can result in significant wasted resources.

How Can the Damage Be Reduced?

Given the potential increase in electricity consumption related to artificial intelligence, users are urged to be mindful of how they use AI. Meanwhile, data center operators can also take key steps to reduce their operational environmental footprint.

The most basic starting point is powering data centers with renewable sources of energy. Operators can also use and fine-tune pre-trained models, rather than training new ones from scratch, to save time and energy in building AI systems.

Server virtualization is another method data centers can use to improve efficiency. This process allows multiple virtual servers to run on a single physical server. As servers and hardware resources are consolidated, the power needed to run and cool them is reduced.
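
The sketch below illustrates the idea with hypothetical numbers: ten lightly loaded physical servers are replaced by two busier virtualization hosts. All wattages and the cooling overhead factor are assumptions chosen for illustration, not figures from the article.

```python
# Hypothetical consolidation example: fewer, busier hosts draw less total
# power than many idle servers, and cooling load shrinks with them.

IDLE_SERVER_WATTS = 250        # assumed draw of a lightly loaded server
CONSOLIDATED_HOST_WATTS = 600  # assumed draw of a busier virtualization host
COOLING_OVERHEAD = 1.5         # assumed facility energy per watt of IT load

before_w = 10 * IDLE_SERVER_WATTS * COOLING_OVERHEAD
after_w = 2 * CONSOLIDATED_HOST_WATTS * COOLING_OVERHEAD

print(f"Before consolidation: {before_w:.0f} W, after: {after_w:.0f} W "
      f"({(1 - after_w / before_w) * 100:.0f}% reduction)")
```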

Lastly, nearly all of the electrical energy that goes into a data center ends up as waste heat, which must be rejected by an efficient cooling system. Data center cooling contributes 33% to 40% of overall energy usage and consumes hundreds of billions of liters of fresh water annually. Building data centers in locations with abundant renewable energy or cooler climates can therefore yield significant emissions savings.
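
One simplified way to read that cooling share is through the data center industry's power usage effectiveness (PUE) metric: total facility energy divided by the energy delivered to IT equipment. The sketch below treats everything that is not cooling as IT load, which is a simplification since lighting and power conversion also draw energy.

```python
# Simplified sketch: converting a cooling share of total energy into an
# implied PUE, under the assumption that all non-cooling energy is IT load.

for cooling_share in (0.33, 0.40):   # the 33%-40% range cited above
    it_share = 1.0 - cooling_share   # simplification: remainder goes to IT
    pue = 1.0 / it_share             # facility energy per unit of IT energy
    print(f"Cooling at {cooling_share:.0%} of total implies a PUE of about {pue:.2f}")
```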
