Google Lifts the Lid on How Much Energy a Single AI Prompt Uses

'Google released a report estimating that a median text prompt to Gemini uses 0.24 Wh, about one second of microwave power, plus 0.26 ml of water and 0.03 g CO2, and provided a detailed breakdown of infrastructure contributions.'

A rare look into per-prompt energy use

Google has published a technical report that estimates how much energy a single text prompt to its Gemini models consumes. The company reports a median prompt energy use of 0.24 watt-hours, roughly equivalent to running a typical microwave for about one second. The document also provides average estimates for associated water consumption and carbon emissions for a text prompt.

What exactly was measured

The report aims to be comprehensive. Google measured not only the energy used by the AI accelerators running the models, but also the supporting infrastructure: the host machine CPU and memory, idle backup equipment, and data center overhead such as cooling and power conversion. That wider scope gives a fuller picture of the total resources needed to answer a single prompt.

How the energy breaks down

According to Google, the median prompt's 0.24 Wh is split roughly as follows:

  • 58% from the AI chips (Google's custom TPUs)
  • 25% from the host machine's CPU and memory
  • 10% from idle backup machines that remain available in case of failure
  • 8% from data center overhead, including cooling and power conversion

This shows that the specialized AI hardware is the largest consumer, but other infrastructure contributes substantially to the total cost.
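The split above can be turned back into per-component energy with a quick back-of-envelope calculation (the percentages are rounded in the report, which is why they sum to 101%):

```python
# Approximate per-component energy for the median 0.24 Wh prompt,
# using the rounded shares from Google's report.
TOTAL_WH = 0.24
shares = {
    "AI accelerators (TPUs)": 0.58,
    "Host CPU and memory": 0.25,
    "Idle backup machines": 0.10,
    "Data center overhead": 0.08,
}
for component, share in shares.items():
    print(f"{component}: {TOTAL_WH * share:.3f} Wh")
```

By this arithmetic, the TPUs account for roughly 0.14 Wh of the median prompt, with the remaining ~0.10 Wh spread across the host machine, idle backups, and facility overhead.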

Emissions and water use

Google translates the energy number into greenhouse gas emissions and water consumption. Using a market-based emissions factor that accounts for Google's clean energy purchases, the company estimates 0.03 grams of CO2 per median prompt. For water, Google estimates 0.26 milliliters per prompt, about five drops.

Google uses a market-based approach rather than a simple grid average because, since 2010, the company has signed contracts for more than 22 gigawatts of renewable and other low-carbon power, which lowers its reported emissions per unit of electricity on paper.
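Dividing the per-prompt figures by the per-prompt energy gives the per-kilowatt-hour intensities implied by the report; this is a sanity-check calculation, not a number Google states directly:

```python
# Implied intensities from the reported median-prompt figures.
energy_wh = 0.24   # median prompt energy
co2_g = 0.03       # market-based CO2 per prompt
water_ml = 0.26    # water per prompt

energy_kwh = energy_wh / 1000
print(f"Implied carbon intensity: {co2_g / energy_kwh:.0f} gCO2/kWh")
print(f"Implied water intensity: {water_ml / energy_kwh / 1000:.2f} L/kWh")
```

This works out to roughly 125 gCO2/kWh and about 1.1 liters of water per kWh, which is how the market-based accounting manifests in the headline numbers.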

Variation, limits, and trends

The 0.24 Wh figure is a median and does not represent every query. Some prompts require far more compute and energy, for example feeding dozens of books into the model and asking for a detailed synopsis, or running more extensive reasoning models that take many internal steps. The report covers only text prompts and does not include image or video generation, which other analyses show can be much more energy intensive.

Google also reports rapid efficiency gains: the energy for a median Gemini prompt fell by a factor of 33 between May 2024 and May 2025, a drop the company attributes to model improvements and software optimizations.
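Taken at face value, the 33x reduction implies a much higher per-prompt figure a year earlier; the report does not state the May 2024 number, so this is simply the arithmetic consequence of the two figures it does give:

```python
# Implied May 2024 median prompt energy, working backward from
# the reported 0.24 Wh (May 2025) and the 33x reduction.
may_2025_wh = 0.24
reduction_factor = 33
may_2024_wh = may_2025_wh * reduction_factor
print(f"Implied May 2024 median: {may_2024_wh:.2f} Wh")
```

That puts the implied earlier figure near 8 Wh per prompt, underscoring how quickly these costs are moving and why a single point-in-time estimate ages fast.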

Why this matters

This disclosure is one of the most detailed industry contributions to date on AI resource use and will be useful to researchers and analysts who previously lacked access to internal measurements. It highlights both the importance of accounting for the full infrastructure footprint and the limits of company-provided data: Google still controls which details to publish, and the total daily query count for Gemini is not disclosed, making it hard to infer total platform-wide energy demand.

Researchers and advocates are calling for standardized metrics and an AI energy rating system similar to Energy Star so that users and policymakers can compare tools on a consistent basis. For now, Google's report is a significant step toward greater transparency about the real-world resource costs of interacting with AI.
