
Google has released new data showing the resource cost of each Gemini AI query, highlighting both improved efficiency and ongoing environmental concerns.
At a Glance
- A typical Gemini text prompt uses 0.24 watt-hours of electricity, roughly equal to watching nine seconds of TV
- Each query produces about 0.03 grams of CO₂ equivalent
- Water use per query averages 0.26 milliliters, or about five drops
- Over the past year, energy demand per query dropped 33-fold, and carbon emissions fell 44-fold
- Critics warn that Google’s accounting may omit indirect water use and underestimate real impacts
Measuring the AI Footprint
Google’s report lays out for the first time the per-query costs of interacting with its Gemini AI system. According to the company, answering a text prompt requires just 0.24 watt-hours of electricity, 0.03 grams of CO₂ equivalent, and 0.26 milliliters of water. On an individual scale, these numbers appear trivial, comparable to nine seconds of television viewing or five drops of water.
The report emphasizes efficiency gains over the past year, with energy use per query falling 33-fold and carbon emissions 44-fold. Google credits infrastructure optimization, more efficient model design, and better utilization of its data centers.
Transparency Versus Blind Spots
Google’s disclosure is notable for its transparency. The company published a methodology describing how it calculates energy, water, and carbon usage across its AI infrastructure, including overhead such as cooling and idle loads. This makes it one of the few firms to publicly quantify environmental costs at the query level.
However, experts caution that Google’s chosen metrics may underrepresent the true footprint. Critics point to the exclusion of indirect water usage, such as the water consumed in generating the electricity that powers data centers, and to the reliance on market-based accounting for carbon emissions. This approach can downplay real-world environmental effects, particularly in regions dependent on fossil fuels.
Scaling Risks
While a single Gemini query may consume negligible resources, the impact multiplies at scale. AI adoption is surging worldwide, with millions of queries processed daily. Even with major efficiency gains, aggregate energy and water demands continue to rise. Data centers supporting AI workloads are already among the largest consumers of power and water, with some using as much electricity as tens of thousands of homes.
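To see how quickly per-query costs compound, here is a back-of-envelope sketch in Python. The per-query figures are taken from Google’s report; the daily query volume of one billion is a hypothetical assumption chosen purely for illustration, not a figure Google has disclosed.

```python
# Per-query figures from Google's report.
ENERGY_WH_PER_QUERY = 0.24   # watt-hours of electricity
CO2_G_PER_QUERY = 0.03       # grams of CO2 equivalent
WATER_ML_PER_QUERY = 0.26    # milliliters of water

# Hypothetical volume for illustration only (not a reported figure).
QUERIES_PER_DAY = 1_000_000_000

energy_mwh = ENERGY_WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
co2_tonnes = CO2_G_PER_QUERY * QUERIES_PER_DAY / 1e6      # g  -> metric tonnes
water_m3 = WATER_ML_PER_QUERY * QUERIES_PER_DAY / 1e6     # mL -> cubic meters

print(f"Energy: {energy_mwh:,.0f} MWh/day")    # 240 MWh/day
print(f"CO2e:   {co2_tonnes:,.0f} t/day")      # 30 tonnes/day
print(f"Water:  {water_m3:,.0f} m^3/day")      # 260 m^3/day
```

Even under this rough assumption, seemingly negligible per-query costs add up to hundreds of megawatt-hours and cubic meters of water every day, which is why aggregate demand keeps rising despite efficiency gains.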
Analysts warn that global energy demand from AI data centers could double by 2030, reaching levels comparable to entire industrialized nations. Water usage, too, is set to expand into the billions of liters annually. Without industry-wide standards for reporting and regulation, the true scale of AI’s environmental impact may remain difficult to measure.
Sources
Wall Street Journal
The Verge
CBS News
arXiv