Three Strategies to Curb Artificial Intelligence’s Insatiable Energy Appetite

Training a single AI model can emit as much greenhouse gas as burning 300,000 pounds of coal, underscoring the need for ‘Green AI.’ This approach includes adopting efficiency improvements, regulatory frameworks, and mindful AI deployments to mitigate environmental impact.

Artificial intelligence (AI) models have increasingly been deployed across the globe. Businesses are particularly interested in how AI will revolutionize the workplace, with the five big tech firms—Alphabet, Amazon, Apple, Meta, and Microsoft—leading the way with an estimated $400 billion budget this year for capital expenditures on AI-related hardware and research and development.

As vast sums of money are poured into AI development, a parallel stream of research is looking beyond the financials and into the environmental costs of AI.

One study found that the computing resources needed to train a single natural-language processing AI model, a technology used in popular applications such as Siri, Alexa, Google Translate, and ChatGPT, could emit as much carbon dioxide (CO2) as burning 300,000 pounds of coal. It would require about 4,500 tree seedlings grown for 10 years to mitigate this impact.
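As a rough sanity check on those equivalences, the arithmetic below uses EPA-style emission factors; the exact per-pound and per-seedling figures are approximations I am assuming for illustration, not values taken from the study itself.

# Back-of-the-envelope check of the coal / tree-seedling equivalences.
# Both conversion factors are assumed approximations, not figures from the study.
KG_CO2_PER_LB_COAL = 0.905        # ~0.9 kg CO2 released per pound of coal burned (assumed)
KG_CO2_PER_SEEDLING_10YR = 60.0   # ~60 kg CO2 absorbed by one seedling grown for 10 years (assumed)

coal_lbs = 300_000
emissions_kg = coal_lbs * KG_CO2_PER_LB_COAL          # ~272 metric tons CO2
seedlings = emissions_kg / KG_CO2_PER_SEEDLING_10YR   # ~4,500 seedlings

print(f"Training emissions: ~{emissions_kg / 1000:.0f} metric tons CO2")
print(f"Seedlings needed (grown for 10 years): ~{seedlings:,.0f}")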

Another study found that generating a single image through AI consumed as much electricity as half a smartphone charge. While this may not seem as damaging as the footprint that training AI models create, the sheer volume of images created compounds its impact, calling for mindful use and deployment of the technology.
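To see how the per-image footprint compounds at scale, here is a quick sketch; the energy of a full smartphone charge and the daily image volume are illustrative assumptions, not figures from the study.

# Rough scale-up of per-image generation energy; all inputs are illustrative assumptions.
SMARTPHONE_CHARGE_KWH = 0.012                       # ~12 Wh for a full smartphone charge (assumed)
energy_per_image_kwh = SMARTPHONE_CHARGE_KWH / 2    # "half a smartphone charge" per generated image

images_per_day = 10_000_000                         # hypothetical daily volume of AI-generated images
daily_kwh = energy_per_image_kwh * images_per_day
print(f"~{daily_kwh:,.0f} kWh/day, i.e. ~{daily_kwh * 365 / 1e6:.1f} GWh/year")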

The rapid expansion of AI is also reflected in the rising demand for data centers and, in turn, the resources that power them. The International Energy Agency estimates that data center workloads grew by over 340% between 2015 and 2022, with electricity demand projected to double from its 2022 baseline to roughly 800 TWh by 2026. This increase, largely driven by the computing and cooling needs of these centers, is approximately equivalent to twice Sweden’s annual electricity consumption.

Water is another critical resource that data centers consume to cool servers and keep them operating efficiently. The Financial Times reported that in Virginia, in the United States, home to the world’s largest concentration of data centers, water consumption has jumped by almost two-thirds since 2019, reaching at least 7 billion liters in 2023. To put this into perspective, the water used by Virginia’s data center alley alone could meet the basic water needs of about 200,000 people for a year.
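That 200,000-person figure is consistent with standard basic-needs benchmarks; the per-person daily allowance below is an assumption (roughly the WHO’s 50-100 liters per person per day range), not a number from the report.

# Check of the water equivalence; the per-person basic daily need is an assumed benchmark.
total_litres = 7_000_000_000      # reported annual data center water use in Virginia (2023)
BASIC_NEED_L_PER_DAY = 95         # assumed basic daily water need per person (~WHO range)

people_supported = total_litres / (BASIC_NEED_L_PER_DAY * 365)
print(f"~{people_supported:,.0f} people's basic annual water needs")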

Big tech companies are also seeing marked increases in their greenhouse gas emissions after integrating AI into many of their core products. Google reported a 48% jump in emissions in 2023 compared with its 2019 baseline, an increase of approximately 4.6 million metric tons.

Likewise, Microsoft’s greenhouse gas emissions increased by about 30% from its 2020 baseline to 15.4 million metric tons in 2023. According to Nature, a scientific journal, a million metric tons of CO2 is roughly equivalent to the average annual emissions of 35 commercial airlines, 216,000 vehicles, and 115,000 homes in the US. Based on the numbers from these two tech giants alone, the pursuit of AI will clearly have a dramatic impact on climate goals.
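Working backwards from the reported percentage increases gives the implied baselines; this is simple arithmetic on the figures cited above, not additional reported data.

# Derive implied baseline emissions from the reported increases (arithmetic only).
google_increase_mt = 4.6                       # ~4.6 Mt CO2e jump, a 48% rise on the 2019 baseline
google_baseline_mt = google_increase_mt / 0.48             # implied 2019 baseline, ~9.6 Mt
google_2023_mt = google_baseline_mt + google_increase_mt   # ~14.2 Mt in 2023

microsoft_2023_mt = 15.4                       # reported 2023 emissions, a ~30% rise on 2020
microsoft_baseline_mt = microsoft_2023_mt / 1.30           # implied 2020 baseline, ~11.8 Mt

print(f"Google: ~{google_baseline_mt:.1f} Mt (2019) -> ~{google_2023_mt:.1f} Mt (2023)")
print(f"Microsoft: ~{microsoft_baseline_mt:.1f} Mt (2020) -> ~{microsoft_2023_mt:.1f} Mt (2023)")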

While these trends are alarming, one silver lining is that efficiency advancements, such as free air cooling, immersion cooling, sector coupling, and the use of waste heat and renewable energy, have helped limit the massive energy and water consumption of data centers and data transmission networks.

Nevertheless, as AI adoption expands, further government and private sector commitment to energy and water efficiency, renewable energy procurement, and research and development will be crucial to the sustainable growth of the sector.

Another comforting prospect is the emerging field of Green AI that advocates sustainable practices in model design, training, and deployment to reduce AI’s associated environmental cost and carbon footprint. It also examines AI’s role in the sustainability agenda by making sure that its benefits outweigh the resources it demands.

Literature suggests three strategies to enable Green AI adoption. These include:

1. Establishing standardized methodologies for quantifying, comparing, and tracking resource consumption and carbon emissions, together with greater transparency, reporting, and coverage across the AI lifecycle (a minimal estimation sketch follows this list).

2. Setting regulatory frameworks that legally define the transparency and accountability responsibilities of stakeholders in driving Green AI, alongside policy incentives that build environmental and social metrics into AI development and assessment.

3. Deploying AI mindfully, weighing the costs and benefits of developing and using the technology for a particular purpose. Not all problems require a machine-learning-based solution, and climate solutions need not be high-tech.
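For the first strategy, one widely used way to quantify operational emissions (followed, in spirit, by open-source trackers such as CodeCarbon) is to multiply measured hardware energy by the facility’s power usage effectiveness (PUE) and the local grid’s carbon intensity. The sketch below illustrates that formula; the function name and every input value are hypothetical assumptions, not reported figures.

# Minimal sketch of operational-carbon accounting for a training run.
# Formula: hardware energy x data center PUE x grid carbon intensity.
# All names and input values below are illustrative assumptions.

def training_co2_kg(gpu_count: int, avg_power_kw: float, hours: float,
                    pue: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimate operational CO2 (kg) for a training run."""
    it_energy_kwh = gpu_count * avg_power_kw * hours      # energy drawn by the hardware
    facility_energy_kwh = it_energy_kwh * pue             # scale up by facility overhead (PUE)
    return facility_energy_kwh * grid_kg_co2_per_kwh      # convert to emissions

# Hypothetical run: 64 GPUs at 0.4 kW each for two weeks, PUE 1.2, 0.4 kg CO2/kWh grid.
print(f"~{training_co2_kg(64, 0.4, 24 * 14, 1.2, 0.4):,.0f} kg CO2")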

As AI continues to advance and capture the imagination of stakeholders worldwide, we need to evaluate if it is being used responsibly. AI is a great tool with immense potential to solve humanity’s biggest problems, but it is also a massive resource consumer and significant emitter of carbon.

By acknowledging AI’s dual role in society and promoting its responsible use, we can ensure that its future does not lead to another problem that needs to be solved.

Source: blogs.adb.org
