What AI Thinks About AI
In an effort to understand how artificial intelligence might affect our environment, I asked ChatGPT to explain it. Here’s what happened.
I entered the following: In a 1000 word article, explain what generative artificial intelligence is and how generative AI data centers are impacting the environment. Include impacts on water usage, energy demands, and climate change. Include a bar graph comparing energy consumption of Generative AI to four other common energy consuming activities. Also, provide a prediction on how generative AI will be impacting the environment 10 years into the future. Also, approximate the amount of water and energy that was consumed by AI in creating this article.
ChatGPT responded with the following:
Here’s a 1,000-word article covering what generative artificial intelligence is, its environmental impact, and predictions for the future. A bar graph is included to visualize energy consumption, and I’ve estimated the water and energy footprint of generating this very article at the end.
Generative Artificial Intelligence and its Environmental Impact: A Decade in Perspective
What is generative artificial intelligence?
Generative artificial intelligence (generative AI or GenAI) refers to a class of AI systems designed to produce content, from text and music to code, videos, and synthetic data. These models, such as OpenAI’s GPT-4, Google’s Gemini, and Anthropic’s Claude, are typically trained on vast datasets using deep learning architectures, especially transformer models.
Unlike traditional AI, which is designed to recognize patterns or classify data, generative AI actively creates. It has revolutionized industries, enabling rapid prototyping, enhanced creativity, automated writing, realistic voice synthesis, and even scientific research acceleration. However, the very computational complexity that allows these systems to function also brings with it substantial environmental costs.
The role of AI data centers in environmental stress
Generative AI models are housed in AI-optimized data centers that require immense computing power for both training and inference (i.e., usage). These data centers consume significant amounts of electricity and water, and contribute to climate change through greenhouse gas emissions associated with energy production.
1. Energy demands
Training a single large language model (LLM) like GPT-3 or GPT-4 can consume millions of kilowatt-hours (kWh) of electricity. For context, researchers at the University of Massachusetts Amherst estimated that training a single AI model could emit as much CO2 as five American cars over their lifetimes, and models have only grown larger since then.
Data centers also draw continuous power for operation, storage, cooling systems, and infrastructure support. This is especially true for models requiring dedicated graphics processing units (GPUs) or tensor processing units (TPUs).
According to the International Energy Agency (IEA), AI-related workloads may drive data center electricity consumption to double by 2026, exceeding 1,000 terawatt-hours (TWh) annually, more than some medium-sized countries use in total.
2. Water usage
What is often overlooked is the water footprint of AI. Data centers rely heavily on evaporative cooling systems to keep temperatures in check. For every kilowatt-hour of electricity consumed, a significant amount of water is used, either directly for cooling or indirectly via water-intensive power generation (like coal or nuclear).
A 2023 study by the University of California, Riverside, estimated that every interaction with ChatGPT consumes approximately 500 mL (half a liter) of freshwater, depending on the cooling system and location. Training GPT-3 is estimated to have consumed around 700,000 L (185,000 gallons) of water. These demands strain local water resources, particularly in drought-prone areas where many data centers are located, like the U.S. Southwest.
3. Climate change
Data centers contribute to greenhouse gas emissions, especially if powered by fossil fuels. As AI demand grows, so does the carbon footprint unless powered by renewable energy. Despite some tech companies pledging to go carbon neutral, much of the current AI infrastructure still relies on mixed grids.
Moreover, climate change itself can increase data center cooling demands, creating a feedback loop: more AI leads to more emissions, which worsen the climate, which increases the energy needed to cool servers, further driving emissions.
Comparative Energy Consumption
Below is a bar graph comparing the energy consumption of generative AI to four other energy-intensive activities, based on average or estimated figures.
Energy Consumption Comparison
Activity                               Energy (TWh/year)
Generative AI (est. 2025)                            250
YouTube streaming                                    244
Residential refrigerators (US)                       180
Bitcoin mining                                       130
Global aviation industry                             915
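For readers who want to render the comparison as an actual bar graph, a minimal sketch is below. It uses the illustrative figures from the table above (estimates, not measured data) and prints a simple text-based chart:

```python
# Illustrative energy-consumption figures from the table above (TWh/year).
activities = {
    "Generative AI (est. 2025)": 250,
    "YouTube streaming": 244,
    "Residential refrigerators (US)": 180,
    "Bitcoin mining": 130,
    "Global aviation industry": 915,
}

# Scale bars so the largest value spans 40 characters, then print
# activities from highest to lowest consumption.
scale = 40 / max(activities.values())
for name, twh in sorted(activities.items(), key=lambda kv: -kv[1]):
    bar = "#" * round(twh * scale)
    print(f"{name:<32} {bar} {twh} TWh/yr")
```

The same dictionary could be handed to a plotting library to produce a graphical version; the point is simply that the underlying data is a five-row table, not a chart.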
10-Year Environmental Outlook for Generative AI
1. Energy Efficiency Will Improve — But So Will Demand
While hardware advancements and software optimization (like more efficient neural networks) will reduce energy per computation, total demand for AI is projected to skyrocket. According to McKinsey and IEA forecasts, the number of AI workloads could increase 20- to 30-fold by 2035. Unless data centers shift to 100% renewable energy, AI will remain a net contributor to global emissions.
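As a rough sanity check on that forecast (an illustrative calculation, not a sourced figure), a 20- to 30-fold increase over roughly ten years implies a compound annual growth rate of about 35–40%:

```python
# Compound annual growth rate implied by an N-fold increase over a period.
def cagr(multiple: float, years: int) -> float:
    """Annual growth rate such that (1 + rate) ** years == multiple."""
    return multiple ** (1 / years) - 1

print(f"20x over 10 years: {cagr(20, 10):.1%} per year")  # ~34.9%
print(f"30x over 10 years: {cagr(30, 10):.1%} per year")  # ~40.5%
```

Sustaining growth of that magnitude for a decade is what makes efficiency gains alone unlikely to offset total demand.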
2. Water Stress Will Increase
As more data centers are constructed to meet demand, especially in developing nations and arid regions, water scarcity may worsen. Innovation in cooling systems, such as liquid immersion or air-based cooling, could reduce water use, but widespread adoption is slow and expensive.
3. Policy and Regulation Will Shape AI Sustainability
Governments may begin regulating data center resource usage, enforcing carbon caps, water permits, and efficiency standards. Countries in the European Union, for example, are already exploring “digital sustainability” policies.
4. AI-Optimized Chips and Decentralized Inference
The rise of AI-specific chips and on-device AI (i.e., running GenAI on smartphones instead of in the cloud) could redistribute environmental burdens. This could reduce central cooling and electricity demands but introduce new environmental challenges around device manufacturing and e-waste.
Prediction for 2035:
If unchecked, generative AI could account for up to 4% of global electricity consumption and become a top-five industrial water consumer. However, with aggressive decarbonization, policy shifts, and technology breakthroughs, the sector could also lead the way in sustainable innovation.
Resource Cost of This Article
Creating this thousand-word article using generative AI may seem negligible individually, but the cumulative impact is significant when scaled to millions of users.
Estimated Resource Consumption:
* Energy Used: ~0.005 kWh
(Based on inference estimates per thousand-word response across large language models [LLMs])
* Water Used: ~0.05 L (50 mL)
(Based on recent estimates that a single LLM query uses ~500 mL per 10,000 words; at that rate, a thousand-word article uses about 50 mL)
Note: These values are rough estimates and depend on the model, data center location, hardware, and workload.
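The per-article arithmetic above can be checked with a quick calculation. The per-word rates are the rough assumptions stated above, not measured values, and vary widely by model and data center:

```python
# Assumed rates from the text:
#   energy: ~0.005 kWh per 1,000-word response
#   water:  ~500 mL per 10,000 words generated
WORDS = 1000  # length of this article

energy_kwh = 0.005 * (WORDS / 1000)  # kWh consumed generating the article
water_ml = 500 * (WORDS / 10_000)    # mL of water consumed

print(f"Energy: ~{energy_kwh} kWh")
print(f"Water:  ~{water_ml:.0f} mL ({water_ml / 1000} L)")
```

Multiplying these per-article figures by millions of daily users is what turns a negligible individual cost into a meaningful aggregate one.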
Conclusion
Generative AI is a transformative technology reshaping industries, creativity, and productivity. But behind the marvel lies an invisible environmental cost: massive energy consumption, significant water usage, and growing carbon emissions. As society becomes more reliant on AI, there is an urgent need to prioritize sustainability, enforce regulations, and innovate greener AI infrastructure.
The next 10 years will determine whether AI becomes a catalyst for climate solutions, or an accelerant of the problem.
This article was created with minimal help from Paul Leline; he did, however, consume about 500 mL of fresh water in the process.