Patricia Gestoso asks whether we can balance innovation and environmental responsibility

In 2021, van Wynsberghe proposed defining sustainable artificial intelligence (AI) as “a movement to foster change in the entire lifecycle of AI products (i.e., idea generation, training, re-tuning, implementation, governance) towards greater ecological integrity and social justice”. The concept comprised two key contributions: AI for sustainability and the sustainability of AI.

At the time, a growing effort was already underway exploring how AI tools could help address climate change challenges (AI for sustainability). However, studies had already shown that developing large Natural Language Processing (NLP) AI models results in significant energy consumption and carbon emissions, often because they rely on non-renewable energy. Van Wynsberghe therefore posited the need to focus on the sustainability of AI.

Four years later, the conversation about making AI sustainable has evolved considerably with the arrival of generative AI models. These models have popularised and democratised the use of artificial intelligence, especially as a productivity tool for generating content.

Another factor that has exponentially increased the resources dedicated to AI is the contested hypothesis that developing AI models with increasingly large datasets and algorithmic complexity will ultimately lead to Artificial General Intelligence (AGI)—a type of AI system that would match or surpass human cognitive capabilities.

Powerful businesses, governments, and academia consider AGI a competitive advantage. Tech leaders such as Eric Schmidt (former Google CEO) and Sam Altman (OpenAI CEO) have disregarded concerns about AI’s sustainability, as AGI will supposedly solve them in the future.

In this context, what do current trends reveal about the sustainability of AI?

Challenges

Typically, artificial intelligence models are developed and run in the cloud, which is powered by data centres. As a result, data centre construction has increased significantly over the past few years. McKinsey estimates that global demand for data centre capacity could rise by 19% to 22% annually from 2023 to 2030.

For example, the Trump administration announced Stargate, a $500 billion private-sector investment in AI infrastructure that includes the construction of up to 20 data centres; the UK expressly listed building more data centres as a priority in its AI Opportunities Action Plan; and France and the United Arab Emirates plan to spend $31–52 billion on an AI data centre and other AI investments.

Data centres require considerable amounts of electricity to run their servers and water for refrigeration. They also generate substantial carbon emissions.

Luccioni and co-workers estimated that the development of GPT-3, a generative AI model behind many chatbots, emitted about 500 metric tons of carbon, roughly equivalent to over a million miles driven by a car, as it used carbon-intensive energy sources like coal and natural gas. The data centre used 700,000 litres of water during the process.

Another study compared the carbon emissions associated with using generative AI models for different tasks. The most efficient text-generation model evaluated used as much energy for 1,000 queries as 9% of a full smartphone charge, whereas the least efficient image-generation model used around half a charge per image. Research has also estimated that a 100-word email generated by an AI chatbot using GPT-4 consumes about half a litre of water and as much electricity as powering 14 LED light bulbs for one hour. While those values appear small, they become massive in the context of millions of daily queries.
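To see how those per-query figures compound at scale, the back-of-envelope sketch below applies them to a large daily query volume. The smartphone battery capacity (~0.012 kWh) and the fleet sizes are illustrative assumptions, not figures from the studies cited above.

```python
# Back-of-envelope scaling of per-query AI energy use.
# All inputs are illustrative assumptions, not measured values:
# a full smartphone charge is taken here as ~0.012 kWh.

SMARTPHONE_CHARGE_KWH = 0.012                              # assumed battery capacity
TEXT_KWH_PER_1000_QUERIES = 0.09 * SMARTPHONE_CHARGE_KWH   # 9% of a charge
IMAGE_KWH_PER_IMAGE = 0.5 * SMARTPHONE_CHARGE_KWH          # half a charge per image

def daily_text_kwh(queries_per_day: float) -> float:
    """Daily energy for text generation, in kWh."""
    return queries_per_day / 1000 * TEXT_KWH_PER_1000_QUERIES

def daily_image_kwh(images_per_day: float) -> float:
    """Daily energy for image generation, in kWh."""
    return images_per_day * IMAGE_KWH_PER_IMAGE

# One query is negligible, but at fleet scale the totals add up:
print(f"text, 100M queries/day:  {daily_text_kwh(1e8):,.0f} kWh")
print(f"images, 100M images/day: {daily_image_kwh(1e8):,.0f} kWh")
```

Even with the most efficient text model, image generation at the same volume is three orders of magnitude more energy-hungry under these assumptions.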

Counterintuitively, many data centres are built in desert areas, such as the US Southwest. Removing heat is more effective in dry air than in moist air, and the region has access to cheap and reliable non-renewable energy from the country’s largest nuclear plant.

There is also evidence that data centres can affect the stability of the electric grid for those living nearby and impact electricity prices. In 2023, electricity prices in Ireland were a staggering 48% above the EU average and had increased by 41% compared to the second half of 2021. Part of that price increase is explained by data centres’ unparalleled demand on the electricity grid. Ireland’s data centres consumed more electricity in 2023 than all its urban homes combined.

Data centres are also noisy. The heating, ventilation, and air conditioning (HVAC) systems that cool servers and the backup power generators can generate noise outside a data centre, contributing to noise pollution in its neighbourhood.

Finally, there is often a lack of transparency. Many businesses build their data centres under shell companies to avoid community pushback or withhold critical information about the exact location and environmental impact.

Opportunities

In terms of algorithms, the recent release of DeepSeek-R1, an open-source generative AI model from a Chinese AI startup, has upended entrenched beliefs about how the performance of this type of AI system scales with hardware and development demands. Its creators have claimed that its performance is on par with OpenAI-o1, despite being developed with far fewer chips and at far lower cost ($6 million, compared with an estimated $100 million for GPT-4). Interestingly, initial evaluations of the cost of running the model suggest increased energy consumption per query.

Google and Prime Intellect have been exploring globally distributed AI model training, which spreads the environmental footprint. Recently, the AI Energy Score released by Hugging Face enables the comparison of AI models in terms of efficiency, promoting informed decisions about sustainability in AI development.

Microsoft announced last year a new design for data centres that optimises AI workloads and consumes zero water for cooling, as water is recirculated through a closed loop.

Regarding power sources, big tech companies have approached this challenge by matching their annual electricity use, including data centres, with renewable energy (Meta in 2020 and Amazon in 2023). Last year, Google committed $20 billion to renewable projects and Microsoft anchored a $9 billion renewable energy coalition.

Microgrids are localised energy systems that operate independently from the central grid. They can provide energy to data centres without shifting onto residential consumers the cost of the grid expansions those centres require.

Some operators are exploring ways to repurpose data centres’ substantial energy storage capacity for grid balancing. For example, working with electricity distribution companies to reallocate underutilised battery units from Uninterruptible Power Supplies (UPS) to contribute to grid stability, potentially generating additional revenue.

Data centres can also leverage AI insights to anticipate high- and low-demand periods for energy through historical data, real-time usage, and external events, and optimise utilisation. AI can also help forecast hardware needs, decreasing downtime. Virtual twins technology can simulate computer workloads and their impact on data centres’ footprint, reducing server and cooling consumption by up to 10% and 30%, respectively.
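As a minimal illustration of the forecasting idea (not any operator's actual system), the sketch below labels hours of the day as high- or low-demand from hypothetical historical load data; production systems would draw on real telemetry, external events, and far richer models.

```python
# Minimal sketch of demand-period classification from historical data.
# The hourly load figures (kW) are made up for illustration.

from statistics import mean

# Hypothetical average load per hour-of-day, from historical logs.
hourly_load = [310, 295, 290, 288, 292, 305, 340, 390,
               450, 480, 495, 500, 505, 500, 490, 470,
               455, 440, 420, 400, 380, 360, 340, 320]

overall = mean(hourly_load)

def classify(hour: int, band: float = 0.15) -> str:
    """Label an hour as high, low, or normal demand relative to the mean."""
    load = hourly_load[hour]
    if load > overall * (1 + band):
        return "high"
    if load < overall * (1 - band):
        return "low"
    return "normal"

# An operator could shift flexible batch workloads out of "high" hours.
schedule = {h: classify(h) for h in range(24)}
```

Even this crude banding shows where flexible workloads could be moved to smooth demand on the grid.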

The US state of Virginia exemplifies some of the tensions above. Northern Virginia hosts hundreds of digital infrastructure facilities, which annually contribute 74,000 jobs, $5.5 billion in labour income, and $9.1 billion in GDP. On the other hand, forecasts estimate that average monthly energy consumption could exceed 30,000 GWh by 2040, considerably surpassing supply. Over the last two years, several bills have aimed to regulate the impact of data centres on water and energy usage, as well as on other domains such as parks, historic sites, and forestland.

The EU Energy Efficiency Directive (EED) is another example of how regulation can help ensure data centres’ transparency, sustainability, and resilience. The EED requires data centre owners and operators to report data on energy and water usage annually to an EU database.

Power usage effectiveness (PUE) is a metric for data centres’ energy efficiency. It is the ratio between the total amount of energy used by a data centre facility and the energy delivered to its computing equipment. An ideal PUE is 1.0. China has decreased its average PUE to less than 1.5 and, starting on 1 July 2025, new data centres that provide services to Australia’s federal government must attain a PUE of 1.4 or below.
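The PUE arithmetic is straightforward; the sketch below computes it from hypothetical annual energy figures and checks the result against the 1.4 threshold mentioned above.

```python
# PUE = total facility energy / energy delivered to IT equipment.
# The energy values below are hypothetical annual figures in MWh.

def pue(total_facility_mwh: float, it_equipment_mwh: float) -> float:
    """Power usage effectiveness; 1.0 is the (unreachable) ideal."""
    if it_equipment_mwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_mwh / it_equipment_mwh

facility = 70_000   # servers + cooling + lighting + distribution losses
it_load = 50_000    # servers, storage, and network only

score = pue(facility, it_load)          # 1.4
meets_1_4_threshold = score <= 1.4      # just meets the bar in this example
```

A facility whose cooling and overheads equal 40% of its IT load sits exactly at 1.4; every reduction in non-IT energy pushes the score towards the ideal of 1.0.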

A Systemic Approach

As the analysis above demonstrates, making AI sustainable is a pressing, multi-stakeholder, and cross-border issue. This year, two global initiatives have been launched at the intersection of AI and sustainability.

The AI Action Summit 2025 released a statement on inclusive and sustainable artificial intelligence for people and the planet signed by 58 countries, which the UK and the US notably refused to endorse. Key points include fostering investments in sustainable AI systems, promoting an international discussion on AI and the environment, and welcoming an observatory on the energy impact of AI.

The Coalition for Sustainable AI is a multistakeholder global initiative to align hardware and software AI development with environmental considerations. Some of its goals are developing standardised methods and metrics for measuring AI’s environmental impacts, evaluating AI systems across their entire life cycle, establishing frameworks for reporting and disclosing AI’s environmental impact, and ensuring AI infrastructure and software are built and maintained in line with global environmental commitments.

As we begin to see efforts to curb AI’s footprint, it is paramount that we don’t forget the Jevons paradox: greater efficiency can drive higher overall consumption. To achieve AI sustainability, we must proactively develop strategies to counteract this rebound effect.

Image attribution: Nadia Piet & Archival Images of AI + AIxDESIGN / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/

Patricia Gestoso

Patricia is a scientific services leader and a diversity and inclusion tech evangelist. Throughout her career as global head of scientific support, training, and services, Patricia has worked with Fortune …
