Google has published detailed figures on the electricity, carbon emissions and water consumption associated with its Gemini artificial intelligence system, in one of the most transparent disclosures yet from a major technology company about AI’s environmental costs.
Although the per-query impact appears negligible, the rapid worldwide growth in AI usage means those small costs accumulate at enormous scale, which is what gives the figures their significance. The data provides fresh insight into the broader sustainability challenges posed by large-scale AI deployment.
According to Google, a single Gemini text prompt consumes roughly 0.24 watt-hours of electricity—equivalent to watching television for fewer than nine seconds—emits around 0.03 grams of carbon dioxide equivalent, and uses approximately 0.26 millilitres of water, or five drops. The calculations take into account not only the direct running costs of AI systems but also idle electricity use and wider data centre infrastructure.
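As a rough sanity check, the arithmetic behind those equivalences can be reproduced in a few lines. The television wattage and drop volume used here are assumed reference values for illustration, not figures from Google's disclosure.

```python
# Back-of-the-envelope check of Google's per-prompt figures.
# Assumed reference values (not from the article): a television drawing
# roughly 100 W and a water drop of roughly 0.05 mL.

ENERGY_PER_PROMPT_WH = 0.24   # watt-hours per Gemini text prompt (Google's figure)
WATER_PER_PROMPT_ML = 0.26    # millilitres of water per prompt (Google's figure)

TV_POWER_W = 100              # assumed television power draw
DROP_VOLUME_ML = 0.05         # assumed volume of a single drop

tv_seconds = ENERGY_PER_PROMPT_WH / TV_POWER_W * 3600
drops = WATER_PER_PROMPT_ML / DROP_VOLUME_ML

print(f"TV-watching equivalent: {tv_seconds:.1f} seconds")  # ~8.6 s, i.e. under nine
print(f"Water-drop equivalent:  {drops:.1f} drops")          # ~5 drops
```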
Comparisons with other models, such as Meta’s Llama 3.1, reveal substantial variation depending on measurement method, with reported efficiencies ranging from 580 to 3,600 prompts per kilowatt-hour. Gemini Apps, by contrast, showed a narrower, more consistent range. By disclosing these figures, Google aims to encourage an industry-wide standard for reporting AI’s environmental footprint.
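For context, energy per prompt and prompts per kilowatt-hour are simple reciprocals of one another. The sketch below converts between the two units, using Google's 0.24 Wh figure and the two ends of the reported Llama 3.1 range; the resulting Gemini value is derived here for comparison and is not a number quoted in the disclosure.

```python
# Convert per-prompt energy into prompts per kilowatt-hour, the unit used
# in the Llama 3.1 comparison (580-3,600 prompts/kWh depending on method).

WH_PER_KWH = 1000

def prompts_per_kwh(wh_per_prompt: float) -> float:
    """Number of prompts a single kilowatt-hour can serve."""
    return WH_PER_KWH / wh_per_prompt

# Gemini's disclosed 0.24 Wh per prompt implies roughly:
print(f"{prompts_per_kwh(0.24):,.0f} prompts per kWh")  # ~4,167

# The two ends of the reported Llama 3.1 range, expressed per prompt:
for rate in (580, 3600):
    print(f"{rate} prompts/kWh -> {WH_PER_KWH / rate:.2f} Wh per prompt")
```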
The company highlighted significant efficiency gains in Gemini, claiming that each query now uses roughly one thirty-third of the energy it required a year ago, thanks to advances in hardware, algorithms and data centre optimisation. Yet overall emissions continue to rise as demand for AI accelerates: since 2019, Google’s total greenhouse gas emissions have grown by 51 per cent, with AI identified as a major driver.
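Taken together with the 0.24 Wh figure above, the claimed thirty-three-fold gain implies a rough ballpark for last year's per-query energy. This is a derived estimate under that assumption, not a number Google has published.

```python
# Rough implication of the claimed 33x efficiency gain, assuming the
# current figure is the 0.24 Wh per prompt cited above (a derived
# estimate, not a published number).

current_wh = 0.24
gain = 33

year_ago_wh = current_wh * gain
print(f"Implied energy per prompt a year ago: ~{year_ago_wh:.1f} Wh")  # ~7.9 Wh
```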
Electricity consumption by Google’s data centres reached 30.8 million megawatt-hours in 2024, more than double that of 2020. Nevertheless, the company reported a 12 per cent reduction in direct emissions from its facilities through renewable energy contracts, operational efficiencies and improved cooling systems. Agreements with utilities in Indiana and Tennessee allow Google to lower data centre power use during peak grid demand, reducing pressure on local energy systems.
Beyond renewables, Google is also betting on nuclear energy to secure a constant, low-carbon supply for its power-hungry AI infrastructure. Partnerships with Kairos Power and the Tennessee Valley Authority aim to advance molten salt reactor technology. Alongside this, demand-response programmes and expanded clean energy contracts form part of Google’s broader effort to meet soaring AI energy needs.
The disclosure underscores both the rapid progress in AI efficiency and the paradox that rising demand is likely to keep pushing overall environmental costs upwards.