The Environmental Footprint of AI: Google Finally Lifts the Veil
- Natasha Tatta
- Aug 23

AI: Innovation or Environmental Threat?
Since the rise of ChatGPT in November 2022, generative AI has become omnipresent. Businesses use it to automate tasks, students to draft essays, and individuals to plan trips or create content in seconds.
But behind this revolution lies a pressing question: what is the environmental footprint of AI?
For two years, speculation ran wild. Some experts feared that every prompt might consume as much electricity as leaving a light bulb on for several minutes. Others imagined Big Tech’s massive data centres draining water supplies and straining global power grids.
In August 2025, Google decided to provide a clearer picture. For the first time, a major AI provider published detailed data on the environmental footprint of Gemini, Google's chatbot launched in 2023.
The Environmental Footprint of AI: Surprise!
According to Google, each Gemini prompt consumes:
0.24 watt-hours of electricity → about one second of microwave use or nine seconds of TV.
0.03 grams of CO₂ → roughly 1/150th of the carbon footprint of charging a smartphone.
0.26 milliliters of water → just five drops.
👉 These figures seem tiny, especially compared to the apocalyptic projections often cited in the media. But note: they apply only to one prompt.
Google also clarified that these numbers rely on a full-stack calculation, which includes not just AI processors but also unused capacity, cooling systems, and overall data centre operations. Interestingly, AI chips themselves account for only 58% of total energy use; the rest comes from the infrastructure that makes real-time responses possible.
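To make that accounting concrete, here is a minimal sketch of the full-stack split, using the 0.24 Wh per-prompt figure and the 58% share Google attributes to AI chips. How the remaining 42% is divided between host hardware, idle capacity, and cooling in the sketch is an illustrative assumption, not Google's published breakdown.

```python
# Back-of-the-envelope split of the full-stack energy behind one Gemini text prompt.
# The 0.24 Wh total and the 58% accelerator share come from Google's report;
# the division of the remaining 42% is an illustrative assumption.
TOTAL_WH_PER_PROMPT = 0.24

shares = {
    "AI accelerators": 0.58,                   # reported by Google
    "Host CPUs, memory, idle capacity": 0.34,  # assumed
    "Cooling and data-centre overhead": 0.08,  # assumed
}

for component, share in shares.items():
    print(f"{component}: {share * TOTAL_WH_PER_PROMPT:.3f} Wh")
```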

Major Efficiency Gains
Perhaps even more striking: Google claims it reduced per-prompt energy consumption by a factor of 33 in a single year.
Back in May 2024, each Gemini prompt required around 8 watt-hours, a figure much closer to the early alarmist estimates and the widespread fear of an ecological disaster.

In the space of about a year, the energy footprint per prompt shrank to a nearly negligible level, at least on an individual scale.
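A quick sanity check shows the two figures are consistent: dividing the May 2024 number by the reported factor of 33 lands almost exactly on today's per-prompt value.

```python
# Sanity check: an 8 Wh prompt reduced 33-fold matches the ~0.24 Wh figure above.
may_2024_wh = 8.0        # reported per-prompt energy, May 2024
reduction_factor = 33    # improvement Google claims over roughly one year

print(f"{may_2024_wh / reduction_factor:.2f} Wh per prompt")  # -> 0.24 Wh
```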
The Grey Areas: What Google Doesn’t Say
While this announcement is a welcome step toward transparency, it also raises important questions:
Total prompt volume
A minimal cost per prompt becomes massive when multiplied by billions of daily prompts. Without disclosure on total usage, the global impact remains unknown.
Other features
Gemini doesn’t only generate text. It also creates images, videos, and advanced analyses such as Deep Research, where a single response can cost as much as dozens of text prompts. Google hasn’t provided data for these more energy-intensive use cases.
Model training
Completely absent from the report is the training phase. Training a model as large as Gemini requires months of computation across thousands of processors, consuming astronomical amounts of energy. Some researchers estimate that training one large AI model can emit as much CO₂ as several hundred transatlantic flights.
Selective transparency
By publishing only favourable data (inference, or day-to-day use), Google controls the narrative. The lack of global figures makes it impossible to fairly compare Gemini’s footprint to rivals like OpenAI, Anthropic, or Meta.
Gemini in Context: Everyday Comparisons
To put the numbers into perspective:
0.24 Wh (Gemini) → one second of microwave use.
1 Wh → about one minute of fast-charging a smartphone.
8 Wh → running an 8W LED bulb for an hour.
60 Wh → one hour of TV.
200 Wh → one hour of heavy laptop use.
So, a single Gemini prompt costs almost nothing compared to an hour of Netflix or a laundry cycle. But scaled globally, the impact becomes significant.
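To see how quickly "almost nothing" adds up, here is a rough scaling sketch. The per-prompt values are Google's; the daily prompt volumes are hypothetical placeholders, since Google has not disclosed real usage figures.

```python
# Rough scaling of Google's per-prompt figures to hypothetical daily prompt volumes.
ENERGY_WH = 0.24   # energy per text prompt (Google)
WATER_ML = 0.26    # water per text prompt (Google)
CO2_G = 0.03       # CO2e per text prompt (Google)

for daily_prompts in (1e8, 1e9, 1e10):  # 100 million, 1 billion, 10 billion prompts/day
    energy_mwh = daily_prompts * ENERGY_WH / 1e6   # Wh -> MWh
    water_m3 = daily_prompts * WATER_ML / 1e6      # mL -> cubic metres
    co2_t = daily_prompts * CO2_G / 1e6            # g  -> tonnes
    print(f"{daily_prompts:>14,.0f} prompts/day: "
          f"{energy_mwh:,.0f} MWh, {water_m3:,.0f} m³ water, {co2_t:,.1f} t CO₂e")
```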
Data Centres: The Real Environmental Challenge
Google’s figures highlight a broader truth: AI cannot be separated from its infrastructure.
Data centres already consume about 2% of the world’s electricity. With AI growth, this share could rise sharply. Some experts predict that by 2030, data centre electricity demand could double if no major optimizations are made.
Water is another pressing issue. Servers require constant cooling, often achieved with water-based systems. In 2022, Google was criticized for high water use at U.S. data centres, particularly in drought-prone regions.
Google Under Global Pressure
It’s no coincidence that Google released these figures now. Pressure is mounting:
Environmental NGOs are raising alarms about AI’s impact.
Governments, especially in Europe, are demanding greater transparency.
Investors want assurances that AI growth is sustainable.
By disclosing extremely low per-prompt numbers, Google aims to prove it can balance AI innovation with energy efficiency.
The Long-Term Paradox
The paradox is clear:
On one hand, Google achieved spectacular technical progress, slashing the footprint per prompt.
On the other hand, mass adoption of AI could wipe out these gains—or even increase global consumption.
This is the classic rebound effect: the more efficient a technology becomes, the more it’s used, and the greater the total impact.
Toward Greener AI?
To truly reduce AI’s energy footprint, several strategies are key:
Further optimize AI chips.
Build data centres powered by renewable energy.
Recycle server heat.
Develop smaller, specialized AI models.
Transparency, But Not the Full Picture
By publishing detailed figures, Google has taken an important step toward greater transparency. The announcement suggests that the environmental footprint of AI may not be the disaster some predicted—at least for simple text prompts.
But the overall picture is incomplete. Without data on training, advanced features, and total usage, it’s impossible to draw a definitive conclusion.
🤔 So, the question remains: Will AI become a sustainable tool—or an accelerator of the environmental crisis? That depends on Big Tech’s transparency, their commitment to greener innovation, and our collective choices as users.
Source: Google Cloud Blog (August 2025). Measuring the environmental impact of delivering AI at Google Scale. Retrieved from https://cloud.google.com/blog/products/infrastructure/measuring-the-environmental-impact-of-ai-inference