AI can help tackle the climate crisis, but unless measures are put in place AI’s environmental footprint risks becoming part of the climate problem.
Hyperscalers, which include Google, Microsoft and Amazon, are driving the swift proliferation of electricity-guzzling data centers to expand their artificial intelligence and cloud computing technologies. As a result, the International Energy Agency (IEA) says that by 2026 data centers could use twice as much energy as they did two years earlier, an amount roughly equivalent to adding “at least one Sweden or at most one Germany.”
The increasing demand is not just putting a strain on the grid. The boom in data centers is expected to produce about 2.5 billion metric tons of carbon dioxide-equivalent emissions globally through the end of the decade, according to Morgan Stanley research published in early September.
In addition, data centers, which house thousands of servers, require substantial amounts of water for cooling. Training large-scale AI models can lead to a 10-fold increase in water usage compared to traditional computing tasks, according to a September 12 briefing paper published by bluegain, a German company that advises leaders on digital, business model, and sustainability transformations. These systems can use between 1.8 and 2.5 million gallons of water annually per megawatt of power consumed, depending on the technology and location. And as data centers expand to support larger and more complex AI models, their power consumption and the cooling it requires are expected to increase proportionally, further straining water resources.
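To put that per-megawatt range in perspective, here is a back-of-envelope sketch. The gallons-per-megawatt figures are the ones cited above; the 50 MW facility size is a hypothetical example chosen only for illustration.

```python
# Back-of-envelope water estimate for a hypothetical data center, using the
# 1.8-2.5 million gallons per MW per year range cited above.
GALLONS_PER_MW_LOW = 1.8e6
GALLONS_PER_MW_HIGH = 2.5e6
LITERS_PER_GALLON = 3.785

def annual_water_use_liters(megawatts: float) -> tuple[float, float]:
    """Return (low, high) estimated annual water use in liters for a given IT load."""
    low = megawatts * GALLONS_PER_MW_LOW * LITERS_PER_GALLON
    high = megawatts * GALLONS_PER_MW_HIGH * LITERS_PER_GALLON
    return low, high

# Example: a hypothetical 50 MW facility
low, high = annual_water_use_liters(50)
print(f"{low / 1e6:.0f}-{high / 1e6:.0f} million liters per year")
```

Even a mid-sized facility on this sketch lands in the hundreds of millions of liters per year, which is why cooling design choices matter so much.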
But there is another side to the story. The energy grids of the future will require more powerful analytical tools, and AI has a critical role to play.
In addition to better forecasting of energy supply and demand and predictive maintenance of physical assets, the IEA says AI applications in the energy sector could include managing and controlling grids, facilitating demand response, and providing improved or expanded consumer services. (Firms such as Octopus Energy and Oracle Utilities are already exploring this.)
Young companies like BrainBox AI, iGenius, and Crusoe are using AI to help companies become more sustainable, and others are using the technology to help protect the environment by doing everything from predicting forest fires to preventing illegal overfishing.
This raises thorny questions, says an article published on the World Economic Forum’s website. “Do the economic and societal benefits of AI outweigh the environmental cost of using it? And more specifically, do the benefits of AI for the energy transition outweigh its increased energy consumption?”
The Forum’s Artificial Intelligence Governance Alliance is applying a cross-industry and industry-specific lens to understand how AI can be leveraged to transform sectors and drive impact on innovation, sustainability and growth.
As part of this initiative, the Forum’s Centre for Energy and Materials and Centre for the Fourth Industrial Revolution have launched a dedicated workstream on AI and energy usage and strategies to manage it.
“We need to understand the advantages and the disadvantages,” says Roberto Bocca, the Forum’s Head, Centre for Energy and Materials and a member of the Executive Committee. “We are talking about more efficiency but also more consumption.”
The Forum will publish its initial findings at its annual meeting in Davos in January 2025.
“We are currently in the process of building a multi-stakeholder community,” says Thapelo Tladi, the Forum’s Lead, Energy Initiatives. “It involves ICT companies that are leading AI adoption but also a number of other companies across various industries like telecommunications, the chemical industry, health, etc. We have about 20 companies, and we are looking to grow the group further.”
The Forum is one of a number of players tackling AI climate issues. France’s GenAI Impact is creating tools to measure and reduce the environmental impact of using and deploying AI; cloud providers such as Scaleway and evroc are using new approaches to significantly lower the footprint of data centers; and consultants like Germany’s bluegain are advising companies on specific ways to reduce AI-related water consumption.
The topic will also be front and center at the XYZ Generative AI for business conference in Paris on September 27. “Assessing The Environmental Impact of Generative AI: The Challenges & Opportunities” is the title of a panel which will be moderated by The Innovator’s Editor-in-Chief. Interviews with the panelists and with the Forum shed light on AI’s climate conundrum and what companies can do now to lower AI’s environmental impact.
Sizing Up The Situation
The IEA estimates that data centers, cryptocurrencies, and AI consumed about 460 TWh of electricity worldwide in 2022, almost 2% of total global electricity demand. The ever-growing quantity of digital data requires an expansion and evolution of data centers to process and store it. Electricity demand in data centers comes mainly from two processes: computing accounts for about 40% of a data center’s electricity demand, and the cooling required to keep processing efficiency stable makes up roughly another 40%, says the IEA. The remaining 20% comes from other associated IT equipment.
Depending on the pace of deployment, the range of efficiency improvements, and artificial intelligence and cryptocurrency trends, the IEA expects global electricity consumption of data centers, cryptocurrencies and artificial intelligence to range between 620 and 1,050 TWh in 2026, with the base case for demand at just over 800 TWh, up from 460 TWh in 2022. This corresponds to an additional 160 TWh to 590 TWh of electricity demand in 2026 compared to 2022.
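That range is straightforward to reproduce. The quick sketch below uses only the IEA figures quoted in this section:

```python
# Reproducing the IEA projection arithmetic quoted above.
BASE_2022_TWH = 460
LOW_2026_TWH, HIGH_2026_TWH = 620, 1050

delta_low = LOW_2026_TWH - BASE_2022_TWH     # additional demand, low end
delta_high = HIGH_2026_TWH - BASE_2022_TWH   # additional demand, high end
growth_high = HIGH_2026_TWH / BASE_2022_TWH  # upper-end growth multiple

print(f"Additional 2026 demand vs 2022: {delta_low}-{delta_high} TWh")
print(f"Upper-end consumption is {growth_high:.1f}x the 2022 level")
```

The upper-end multiple of roughly 2.3x is consistent with the “twice as much energy” framing earlier in this article.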
It is difficult to get accurate estimates of the impact of GenAI, partly because machine learning models are incredibly variable and can be configured in ways that dramatically change their power consumption, and partly because organizations like Meta, Microsoft and OpenAI do not openly share relevant information. What’s more, there is no standard way of measuring energy and water consumption. Data on AI’s energy use and environmental impact is not systematically collected, and there is a need for greater transparency and tracking, especially as models grow and GenAI use becomes more prevalent, says Samuel Rince, another speaker at the XYZ Generative AI for business conference.
“Transparency on the models’ architecture, training process, inference energy consumption, etc. should be reported by providers, just like the PUE [Power Usage Effectiveness] is for data centers,” says Rince.
Rince is the co-founder and President of GenAI Impact, a non-profit organization dedicated to assessing and highlighting the environmental footprint of generative AI technologies. The French organization’s team, consisting of researchers, developers, and freelancers, collaborates on developing open-source methodologies and tools for environmental impact assessment.
Based on available information, GenAI Impact has developed tools for developers to measure the emissions of the third-party AI they are using so they can keep track of how much their company’s own usage of those technologies is impacting the environment.
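GenAI Impact’s actual tooling is open source; purely as an illustration of the idea, a minimal developer-side usage tracker might look like the sketch below. The per-token energy figure, the grid emission factor, and the class itself are hypothetical placeholders for this article, not GenAI Impact’s methodology or measured values.

```python
# Illustrative sketch of per-request emissions tracking for third-party AI
# calls. The energy-per-token figure and grid emission factor below are
# hypothetical placeholders, not measured values.
from dataclasses import dataclass

WH_PER_1K_TOKENS = 0.3    # hypothetical inference energy per 1,000 tokens
GRID_KGCO2_PER_KWH = 0.4  # hypothetical grid emission factor

@dataclass
class UsageTracker:
    total_tokens: int = 0

    def record(self, tokens: int) -> None:
        """Log the token count of one API call."""
        self.total_tokens += tokens

    def estimated_kwh(self) -> float:
        return self.total_tokens / 1000 * WH_PER_1K_TOKENS / 1000

    def estimated_kg_co2(self) -> float:
        return self.estimated_kwh() * GRID_KGCO2_PER_KWH

tracker = UsageTracker()
tracker.record(1_500_000)  # e.g. a month of chatbot traffic
print(f"{tracker.estimated_kg_co2():.3f} kg CO2eq")
```

The point of such a tracker is less the absolute number, which depends on the provider’s hardware and grid, than giving teams a running indicator they can watch as usage grows.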
For its part the Forum is seeking to get “an objective view on the information that is out there,” says Tladi, the Forum’s Lead, Energy Initiatives. “We want to find out where the energy is being consumed and we want to find out how they are measuring this and how industry can move together to improve transparency,” he says. “Then in the next phase we will drill down on the more practical elements of what companies can do in terms of strategies. It is great that we can do certain things to manage the energy consumption of AI but at the end of the day we will still need more energy so it is important that we also look at the clean power supply that will be needed.”
Reducing Data Center Resource Consumption and Emissions
A growing number of data centers are using green energy, and that is expected to increase. Indeed, Morgan Stanley points out in its September report that an upside of the build-out of energy-guzzling data centers is that it will create a large market for decarbonization solutions. The build-out of the giant computer warehouses will increase investments in clean power development, energy-efficient equipment, and so-called green building materials, Morgan Stanley said. Carbon capture, utilization, and sequestration technology and carbon dioxide removal processes are also expected to get a boost as tech companies try to keep their climate promises, the report said.
Scaleway, a cloud provider which is owned by Iliad Group and markets itself as an alternative to the U.S.-based hyperscalers, is reducing its energy consumption in other ways.
Centers operated by Iliad Group’s OpCore in which Scaleway servers are housed, use no air conditioning and minimal water. “To our knowledge our GPU cluster is the only one in the world not cooled by air conditioning, which typically accounts for 30% to 40% of energy use,” says James Martin, Scaleway’s head of content and sustainability communications, a speaker at the XYZ Paris conference.
While hyperscalers are not transparent about their water usage, Martin says that, based on available information, he believes Scaleway uses significantly less water than American and Chinese cloud providers.
In its annual sustainability report, Microsoft, a multibillion-dollar investor in OpenAI, divulged that its data centers in Iowa and other areas used nearly 1.7 billion gallons of water in 2022, 34% more than it used in 2021. While Microsoft hasn’t specifically said what led to the unusual surge, experts say it’s no coincidence that it occurred while the company’s data scientists were believed to be training the large language models (LLMs) that power ChatGPT’s intelligence, according to a September 12 bluegain briefing document.
That conclusion about AI water consumption would seem to make sense since Google also reportedly used more than 5.6 billion gallons of water in 2022, or 20% more than the previous year, while training LLMs for its generative AI tool, Bard, bluegain says.
Data centers use huge volumes of water for cooling towers. Scaleway’s sister company OpCore uses a different approach in its DC5 data center: It uses air from outside to pass over the servers most of the year. During the summer months when the air gets hot it uses an adiabatic cooling system – mimicking how the human body sweats to cool down. By evaporating a few grams of water into the air, a few hours per year, the air coming from the outside can be cooled by nearly 10°C.
Scaleway’s system has 2,200 sensors and measurement points, allowing the system to adapt, self-regulate, and optimize its processes every 17 milliseconds so that every sensor gets exactly the right energy and cooling needed to function at maximum efficiency, saving both energy and water.
Scaleway is also taking steps to cut energy consumption by making adjustments to the hardware and software it uses.
Generative AI requires considerably more computing power than standard calculations. A key reason is that generative AI model training calls for GPUs (graphics processing units) rather than CPUs (central processing units), the hardware components that serve as the core computational unit in a server. GPUs generally require around four times more energy than CPUs. (Case in point: Ampere’s CPUs for AI consume three to five times less energy than equivalent NVIDIA machines.)
GPUs also tend to generate 2.5 times more heat than CPUs. Standard CPUs used in cloud computing fall in the 250-350W TDP range, whereas GPUs are in the 750-800W range and require correspondingly more cooling power, says Scaleway. So the processors needed for generative AI training and inference are considerably more power-hungry than those used by pre-generative AI models.
The emissions from training, the process required to ‘educate’ a generative AI model by feeding it as much data as possible, vary widely depending on the model. A white paper published by French tech association Data For Good estimates GPT-3.5’s training emissions at around 500 tCO2eq.
Inference, or the everyday usage of a model, has its own impact, which has been estimated at 200 times higher than that of training. According to Data For Good, considering that ChatGPT has 100 million weekly users, that works out to roughly 100,000 tCO2eq per year for GPT-3.5.
Inference also uses a lot of water. It has been estimated that one conversation with ChatGPT uses half a liter of water in terms of the data center cooling resources required. This is on top of GPT-3’s training, which required 5.4 million liters of water. That’s a bit more than one liter per training hour (training GPT-3 took 4.6 million GPU hours, according to ChatGPT).
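These figures can be combined into a rough footprint sketch. Everything below comes from the numbers quoted in this section, except the one-conversation-per-user assumption, which is purely illustrative:

```python
# Back-of-envelope footprint arithmetic using the figures quoted above.
TRAINING_TCO2EQ = 500          # GPT-3.5 training emissions (Data For Good)
INFERENCE_MULTIPLIER = 200     # inference impact estimated at 200x training
LITERS_PER_CONVERSATION = 0.5  # cooling water per ChatGPT conversation
WEEKLY_USERS = 100_000_000

annual_inference_tco2eq = TRAINING_TCO2EQ * INFERENCE_MULTIPLIER
# Illustrative assumption: each weekly user holds a single conversation
weekly_water_liters = WEEKLY_USERS * LITERS_PER_CONVERSATION

print(f"Estimated annual inference emissions: {annual_inference_tco2eq:,} tCO2eq")
print(f"Weekly cooling water at one conversation per user: "
      f"{weekly_water_liters / 1e6:.0f} million liters")
```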
Given these elements, it’s not surprising that AI energy demand is set to outpace supply.
One academic, Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has suggested that if every Google search for a year used AI, it would use the equivalent amount of electricity used to power a small country like Ireland.
One of the ways to reduce GenAI’s water and energy demands is to use alternative hardware for inference. “We are working with Ampere; they provide CPUs that are capable of handling many inference workloads, and they can do it using 3x to 5x less energy than equivalent NVIDIA GPUs,” says Scaleway’s Martin.
Scaleway is currently rolling out its own “managed inference” service. Corporate customers will be able to choose from a number of open-source GenAI models that serve as alternatives to U.S. large language models, generate fewer emissions, and use less water and energy, says Martin. “We are providing an alternative that is more sustainable and also sovereign because the training and inference data stays in Europe,” he says.
Suggested Best Practices For Corporates
Scaleway’s Martin, GenAI Impact’s Rince, Dr. Shirin Dora, an assistant professor in the Computer Science Department at the UK’s Loughborough University, and bluegain have practical suggestions for what companies can do to decrease AI’s environmental footprint:
*Don’t train your own LLM from scratch, says Martin. If a model already exists, fine-tune it for your needs. The cost to your company and to the environment will be much lower.
*Use GenAI only when it is relevant rather than for simple tasks like writing emails or for translation or as a calculator, says Martin.
*Reduce inference energy consumption by replacing GPUs with CPUs, says Martin.
*When using GenAI don’t always rely on the biggest LLMs. Use small models or traditional deep learning for smaller tasks, says Rince.
*When possible, use open-source models and model compression and optimization to reduce consumption, says Rince.
*Be more open to switching to neuromorphic or quantum computing systems that are faster, more efficient, and consume less energy, says Loughborough University’s Dr. Dora.
*Seize the benefits of lifelong learning machines when they become available, urges Dr. Dora. Current large language model systems are limited to performing only those tasks for which they have been specifically programmed and trained. If you try to train them with new material, the risk is that everything previously learned will be erased, so when you want to add something new you need to restart the energy-guzzling task of training the model from scratch. Lifelong learning machines can learn continuously during execution, become increasingly expert while performing tasks, and apply previous skills and knowledge to new situations, without forgetting previous learning and without the need to retrain from scratch.
*Bluegain says companies that want to address the water footprint of digital technologies should consider:
- Investing in Advanced Cooling Technologies: Adopt cooling solutions that reduce water use. (Google says it is developing new climate-conscious cooling tech that can reduce a data center’s water use by as much as 50% while preserving energy efficiency.)
- Optimizing Cooling Processes: IoT-enabled building automation systems can monitor and control cooling systems to enhance water efficiency without compromising performance.
- Promoting Sustainable AI Development: Support the creation of AI models that are designed with water and energy efficiency in mind, working closely with technology providers to drive sustainable innovation.
- Encouraging Best Practice Exchange: Participate in collective efforts to share best practices and conduct joint research aimed at reducing water consumption across sectors.
- Setting Sustainable Targets for Water: Integrate the reduction of water consumption into the company’s sustainability goals and ensure transparent reporting and accountability against the targets set.
By adopting proactive measures to reduce energy and water consumption, organizations can not only advance their sustainability agendas but also enhance their operational efficiency, balancing technological innovation with responsible resource management, says bluegain.