
Camille Périssère
Senior Associate

Introduction

In January 2025, newly elected US President Donald Trump unveiled “Stargate,” a $500 billion initiative poised to become the most extensive AI infrastructure project in history. Backed by industry giants OpenAI, SoftBank, and Oracle, Stargate aims to solidify America’s leadership in artificial intelligence by constructing vast data centers and accompanying energy systems necessary to power next-generation AI technologies.​

While this ambitious endeavor underscores the strategic (and political) importance of AI, it also raises significant environmental concerns. Data centers are notorious for their substantial energy consumption, land use, and water requirements. Moreover, the rapid advancement of AI hardware accelerates electronic waste and intensifies the demand for rare raw materials. Notably, GenAI models are particularly resource-intensive.

According to estimates, the initial training phase of GPT-4 is estimated to have emitted between 1,200 and 15,000 tons of CO₂, depending on whether the model was trained in a data center in Canada East (the lowest-carbon Azure region [1], thanks to its predominantly hydroelectric grid) or on grid electricity in California, where natural gas still makes up a significant portion of the energy mix. Under the worst-case assumption, this would be comparable to the yearly energy consumption of 2,000+ US homes [2]. Moreover, the electricity consumption from training GPT-4 may be approximately 40 to 48 times higher than that required to train GPT-3, even though GPT-4’s total parameter count is believed to be only about 10 times greater. If these numbers seem staggering, they pale in comparison to the environmental impact of other stages in an AI model’s life cycle – particularly the deployment (inference) phase.
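The wide range above comes down to one multiplication: training energy times the grid's carbon intensity. A minimal back-of-the-envelope sketch, where the ~62 GWh training-energy figure and the two intensity values are illustrative assumptions (not official numbers), shows how region choice alone spans the 1,200–15,000-ton range:

```python
# Back-of-the-envelope training emissions:
# tonnes CO2 = energy (kWh) * grid carbon intensity (kg CO2 per kWh) / 1000

TRAINING_ENERGY_KWH = 62_000_000  # assumed ~62 GWh for GPT-4 training (illustrative)

# Illustrative grid carbon intensities, in kg CO2 per kWh
GRID_INTENSITY = {
    "canada_east_hydro": 0.02,  # predominantly hydroelectric grid
    "california_mixed": 0.24,   # significant natural-gas share
}

def training_emissions_tonnes(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """CO2 emissions in tonnes for a given energy draw and grid intensity."""
    return energy_kwh * intensity_kg_per_kwh / 1000

for region, intensity in GRID_INTENSITY.items():
    tonnes = training_emissions_tonnes(TRAINING_ENERGY_KWH, intensity)
    print(f"{region}: {tonnes:,.0f} t CO2")
```

Under these assumptions, the hydro-powered scenario lands at roughly 1,240 tonnes and the mixed-grid scenario at roughly 14,880 tonnes – consistent with the estimated range, and a reminder that where a model is trained matters as much as how big it is.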

Despite these challenges, GenAI also offers promising solutions to combat climate change. In agriculture, GenAI enhances precision farming by analyzing extensive datasets to optimize resource utilization. In the energy sector, companies like Google have employed AI to optimize data center operations, achieving significant reductions in energy consumption. DeepMind’s AI system, for instance, reduced cooling energy usage in Google’s data centers by up to 40%, translating to a 15% decrease in overall energy usage [3]. Furthermore, GenAI contributes to more accurate climate modeling, aiding in the understanding of, and adaptation to, climate change. Notable companies in this space include Mitiga Solutions, Eoliann, and Jua AI – a Swiss startup that has developed a proprietary large-scale model, known as a “Large Physics Model,” trained on petabytes of raw data to simulate atmospheric and environmental dynamics. These examples only scratch the surface of what GenAI can contribute to the fight against climate change – a topic substantial enough to deserve a white paper of its own.

Nevertheless, accurately measuring the environmental impact of AI remains a complex task. As Golestan Sally Radwan, Chief Digital Officer of the United Nations Environment Programme, aptly stated: “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.” This underscores the imperative to scrutinize not only the environmental footprint of AI but also to explore and implement solutions that can support a greener computing future.

Generative AI's Environmental Cost

While generative AI offers promising advancements, it also comes with a significant environmental cost. The development and operation of large-scale AI models – particularly the large language models behind GenAI – require immense energy for data processing, model training, and inference (the process of using a trained model to make predictions on new data).

Indeed, these models, often consisting of billions of parameters, like OpenAI’s GPT-4, demand extraordinary computational resources. The electricity needed for such training processes leads to considerable carbon dioxide emissions and adds significant pressure to the power grid.

And the energy consumption doesn’t end once a model is trained. Running these models in everyday applications, scaling them to millions of users, and continually refining their performance require ongoing and substantial energy input. In fact, a single query made through ChatGPT can use up to ten times more electricity than a standard Google search. Even generating a seemingly simple meme or “starter pack” image via ChatGPT could consume between two and five liters of water – you might think twice before cracking a joke to your colleague about that far-fetched career move.
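The per-query gap compounds quickly at scale. A minimal sketch of that multiplication, using commonly cited (and debated) per-query figures as assumptions – roughly 0.3 Wh for a standard search and ten times that for a ChatGPT query, per the claim above – with a purely hypothetical daily query volume:

```python
# Rough scale-up of the "10x per query" electricity claim.
# Per-query figures below are assumptions, not measured values.

GOOGLE_WH_PER_QUERY = 0.3   # assumed Wh per standard web search
CHATGPT_WH_PER_QUERY = 3.0  # assumed: ~10x a standard search

def daily_kwh(queries_per_day: int, wh_per_query: float) -> float:
    """Total electricity in kWh for one day's worth of queries."""
    return queries_per_day * wh_per_query / 1000

QUERIES_PER_DAY = 10_000_000  # hypothetical daily volume

print(f"Search : {daily_kwh(QUERIES_PER_DAY, GOOGLE_WH_PER_QUERY):,.0f} kWh/day")
print(f"ChatGPT: {daily_kwh(QUERIES_PER_DAY, CHATGPT_WH_PER_QUERY):,.0f} kWh/day")
```

Under these assumptions, ten million searches draw about 3,000 kWh per day, while the same volume of ChatGPT queries draws about 30,000 kWh – the difference alone exceeds the daily electricity use of hundreds of households.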

Let’s dive a bit deeper to understand why these processes are problematic for the environment.

The Power-Hungry Nature of Generative AI

The rapid acceleration of generative AI adoption has obviously triggered a surge in demand for power-hungry data centers. And even though it’s called “cloud computing,” data centers very much exist in the physical world. They require vast amounts of electricity as well as water, space, and scarce resources, with both direct and indirect effects on biodiversity.

A data center is a temperature-controlled building that houses computing infrastructure, such as servers, data storage drives, and network equipment. Amazon, a leading cloud provider, operates over 100 data centers globally, each typically containing around 50,000 servers used to power its cloud computing services.

Despite some cloud providers’ claims of utilizing renewable energy (such as Amazon [4]), many of these centers still largely depend on fossil fuels, compounding the technology’s carbon footprint.

Global demand for data center capacity could more than triple by 2030

AI (and especially GenAI) is the key driver of growth in demand for data center capacity

Source: McKinsey Data Center Demand model

Additionally, in some countries, concerns about the pressure data centers exert on electricity grids, as well as their impact on national climate targets, have brought the building of new ones to a halt. Ireland, for instance, has stopped issuing new grid connections to data centers in the Dublin area until 2028 due to concerns about grid pressure and climate goals. Ireland’s transmission system operator estimates that data centers will account for 28% of the country’s power use by 2031.

Traditionally, data centers have been built near population centers to reduce latency, often leveraging colocation strategies – where multiple providers form a cluster to enhance resilience and prevent outages. A notable example is Telehouse Paris Voltaire, one of the largest colocation facilities in Paris, hosting over 100 carriers and serving as a key hub for 80% of France’s live internet traffic [5]. However, when training AI models, low latency and network redundancy are less critical than during inference, when the model is actively serving users. As a result, AI training data centers are increasingly being located in more remote regions where power grids are less strained. Still, limited transmission infrastructure in these areas poses a growing risk of supply constraints as demand continues to increase.

Biggest Data Center Markets by Megawatts

Source: Voronoi by Visual Capitalist [6]

The US remains the biggest market by far, with 5,426 data centers as of April 2025 [7], hosting the biggest data-producing and data-consuming businesses. In terms of energy consumption, the Virginia market nearly doubled in one year. A titanic project like Stargate raises legitimate concerns about the additional pressure placed on the electric grid, as well as its significant environmental impact.

GenAI’s Hidden Water Consumption

Water usage is another concern. AI infrastructure consumes vast amounts of water to cool down data centers, with global AI-related operations projected to use six times more water than Denmark. This water demand is particularly alarming in regions already facing water scarcity, where the diversion of resources for cooling purposes could exacerbate existing shortages and strain local ecosystems. Moreover, the environmental impact of wastewater produced by these cooling processes, which often contains chemicals or pollutants, adds another layer of concern.

Rare Earth Reliance: GenAI’s Material Footprint and Geopolitical Risks

Beyond energy use, generative AI accelerates the need for high-performance hardware such as GPUs and other specialized processors. These chips rely on a variety of critical raw materials, including rare earth elements (REE) like neodymium, dysprosium, terbium, and others. These elements are essential for manufacturing permanent magnets, high-efficiency cooling systems, and precision components used in advanced computing and data center infrastructure.

However, the extraction and processing of rare earth elements come with significant environmental costs. Mining operations are often concentrated in a few regions, particularly in China, which controls a large portion of global REE supply. The extraction process typically involves open-pit mining, chemical separation, and radioactive waste generation, leading to soil and water contamination, deforestation, and greenhouse gas emissions. Additionally, rare earth mining has been associated with poor labor conditions and geopolitical risks, raising concerns about ethical sourcing and supply chain stability.

China produces 60% of all REE used as components in high technology devices

Source: Visual Capitalist [8]

The E-Waste Ripple Effect of GenAI

The relentless pace of AI innovation also contributes to mounting electronic waste. As newer, more powerful devices replace outdated hardware, this cycle of continual upgrades perpetuates a linear, unsustainable economic model, further straining ecological systems.

“When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in. There are much broader consequences that go out to a system level and persist based on actions that we take,” says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering and the lead of the Decarbonization Mission of MIT’s new Climate Project.

Conclusion

The promise of generative AI is immense, but so is its footprint. From the carbon emissions of model training to the rare earth materials buried in our devices, the full lifecycle of AI must be scrutinized. To move forward responsibly, we must not only innovate – but also mitigate. It will take collective action from policymakers, technologists, and consumers to chart a path where AI innovation aligns with planetary limits.

Stay tuned for our second piece in this two-part series where we will explore how we can shape an AI ecosystem that works with the planet, not against it.

Bibliography

  1. https://medium.com/data-science/the-carbon-footprint-of-gpt-4-d6c676eb21ae
  2. https://www.epa.gov/energy/greenhouse-gas-equivalencies-calculator#results
  3. https://quantumzeitgeist.com/deepmind-ai-cuts-google-data-center-cooling-bill-by-40-revolutionizing-energy-efficiency/#google_vignette
  4. https://www.datacenters.com/news/amazon-s-100-billion-data-center-expansion
  5. https://www.telehouse.fr/connectivite-data-center-telehouse/france/paris/telehouse-paris-voltaire/
  6. https://www.voronoiapp.com/markets/Biggest-Data-Center-Markets-by-Megawatts-3006
  7. https://www.statista.com/statistics/1228433/data-centers-worldwide-by-country/
  8. https://elements.visualcapitalist.com/visualizing-chinas-dominance-in-clean-energy-metals/

About AVP

AVP is an independent global investment platform dedicated to high-growth, tech (from deep-tech to tech-enabled) companies across Europe and North America, managing more than €3bn of assets across four investment strategies: venture, early growth, growth and fund of funds. Our multi-stage platform combines global research with local execution to drive investment. Since its establishment in 2016, AVP has invested in more than 60 technology companies and in more than 60 funds with the Fund of Funds investment strategy. Beyond providing equity capital, our expansion team works closely with founders, providing the expertise, connections and resources needed to unlock growth opportunities, and create lasting value through meaningful collaborations.