Google Reveals Gemini AI Prompt Uses Five Drops of Water and the Energy of 9 Seconds of TV
Quote from Alex Bobby on August 25, 2025, 6:04 AM
Using Google’s AI: Equivalent to Five Drops of Water or 9 Seconds of TV
The environmental impact of artificial intelligence (AI) has long been a point of contention. While AI models promise efficiency, productivity, and new avenues of innovation, the resources required to power them—especially at scale—remain under scrutiny. In a new technical paper, Google has shed some light on this debate, offering insights into how much energy and water are used by its Gemini AI chatbot with each prompt.
The findings are both eye-opening and reassuring. A single query on Gemini, according to Google, consumes roughly the same energy as watching nine seconds of television and uses the equivalent of just five drops of water. But while the per-prompt numbers appear modest, questions remain about the true scale of AI’s environmental footprint.
Breaking Down the Numbers
Google reports that:
- Energy use per Gemini prompt: ~0.24 watt-hours (Wh)
- Carbon emissions per prompt: ~0.03 grams of CO₂
- Water usage per prompt: ~0.26 millilitres, or about five drops
The company says these figures account not only for the power consumed by servers but also the idle energy of AI chips and the water needed to cool data centre equipment. By including these additional factors, Google argues that its numbers more accurately reflect “true operating efficiency at scale” rather than the theoretical consumption often cited in other studies.
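Since Google does not disclose daily query volume, the per-prompt figures only become meaningful once you scale them by an assumed number of prompts. The sketch below does exactly that; the one-billion-prompts-per-day input is purely illustrative and is not a figure from Google's paper:

```python
# Google's reported per-prompt figures, as cited in the article.
ENERGY_WH = 0.24   # watt-hours per prompt
CO2_G = 0.03       # grams of CO2 per prompt
WATER_ML = 0.26    # millilitres of water per prompt

def daily_footprint(prompts_per_day):
    """Scale the per-prompt figures to a daily total.

    Returns (energy in kWh, CO2 in kg, water in litres).
    """
    return (
        prompts_per_day * ENERGY_WH / 1000,   # Wh -> kWh
        prompts_per_day * CO2_G / 1000,       # g  -> kg
        prompts_per_day * WATER_ML / 1000,    # mL -> L
    )

# Illustrative only: Google does not publish Gemini's query volume.
kwh, kg_co2, litres = daily_footprint(1_000_000_000)
```

At that hypothetical volume, the "five drops" become roughly 260,000 litres of water and 240 MWh of energy per day, which is the scale question the rest of the article turns on.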
Comparisons With Other Platforms
To put this into perspective, the Electric Power Research Institute, a non-profit organisation, previously estimated that each prompt to OpenAI’s ChatGPT consumes about 2.9 Wh—more than ten times higher than Google’s estimate for Gemini. For comparison, a traditional Google search query consumes about 0.3 Wh, slightly higher than a single Gemini prompt but far lower than ChatGPT’s reported usage.
These disparities highlight both the complexity of measurement and the rapid efficiency gains being made by AI developers. Still, such comparisons also point to the larger issue: while the per-query impact may be small, the cumulative effect of billions of queries could be substantial.
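The comparison above is simple arithmetic; a quick check of the ratios, using the figures as reported:

```python
gemini_wh = 0.24    # Google's figure per Gemini prompt
chatgpt_wh = 2.9    # EPRI's earlier estimate per ChatGPT prompt
search_wh = 0.3     # traditional Google search query

# ChatGPT's reported draw versus Gemini's: "more than ten times higher"
chatgpt_ratio = chatgpt_wh / gemini_wh   # ~12.1

# A classic search sits just above a single Gemini prompt
search_ratio = search_wh / gemini_wh     # ~1.25
```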
The Broader Energy Picture
The International Energy Agency (IEA) has warned that AI’s hunger for power is set to grow dramatically. The agency forecasts that global demand for AI-related energy will double within five years, potentially reaching 945 terawatt-hours per year—comparable to the entire electricity consumption of Japan.
This projection underscores the challenge. Even if AI prompts are individually efficient, the aggregate demand across platforms and applications will be enormous as adoption spreads to everything from customer service to medical research.
Google’s Rising Emissions
The report arrives at a time when Google’s own environmental record is facing scrutiny. According to its latest sustainability report, the company’s total greenhouse gas emissions have increased by 51 percent since 2019. Much of this rise is linked not to data centre operations themselves but to the manufacturing and assembly of the specialised hardware required to run advanced AI models.
In other words, while per-prompt energy use may be declining, the supply chain impact of scaling up AI infrastructure is offsetting those efficiency gains. This raises questions about whether AI can be considered truly sustainable, especially when hardware refresh cycles and manufacturing processes are factored into the equation.
Google’s Efficiency Gains
Despite these concerns, Google insists that progress is being made. According to the company, energy use and carbon footprint per Gemini prompt have decreased significantly:
- Energy per prompt: a 33-fold reduction since August 2024
- Carbon footprint per prompt: a 44-fold reduction over the same period
Such dramatic reductions suggest that AI models are becoming far more efficient with each iteration. Improvements in hardware design, model optimisation, and cooling technology are contributing to these gains. Yet without knowing the total number of Gemini queries per day, it remains difficult to assess whether these efficiencies are enough to offset the overall rise in demand.
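Working backwards from the 33-fold claim gives a rough sense of where Gemini started: if a prompt now draws about 0.24 Wh, the implied figure in August 2024 would have been around 8 Wh. This is a back-of-the-envelope reading of the two reported numbers, not a figure Google states:

```python
current_wh = 0.24     # today's reported energy per prompt
energy_factor = 33    # reported reduction since August 2024

# Implied per-prompt energy in August 2024 (illustrative, not a Google figure)
implied_aug_2024_wh = current_wh * energy_factor   # ~7.9 Wh
```

Notably, that implied starting point is well above both a traditional search query (~0.3 Wh) and EPRI's ChatGPT estimate (~2.9 Wh), which is consistent with how quickly the company says efficiency has improved.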
Water Usage: Small But Symbolic
Perhaps the most striking figure in Google’s report is water consumption. At just 0.26 millilitres per prompt—roughly five drops of water—the number may seem trivial. However, in regions where water scarcity is a pressing issue, the cumulative impact of millions of prompts could add up.
Data centres rely heavily on water for cooling, and this has become a point of tension in communities where tech infrastructure competes with agriculture and households for limited resources. While Google stresses that its facilities are optimised for efficiency, questions remain about how sustainable such water use will be as AI demand scales up.
Transparency and Its Limits
Google’s willingness to publish detailed estimates is notable, especially given the reticence of many technology firms to disclose the environmental costs of AI. However, the study has limitations. The most glaring omission is the absence of data on total daily query volumes. Without this figure, it is impossible to calculate Gemini’s overall energy and water demand at scale.
Critics argue that focusing only on per-prompt consumption risks minimising the full picture. A single drop of water may not seem like much—but billions of drops quickly fill an ocean.
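To make the drops-fill-an-ocean point concrete, here is a rough illustration. The ~2.5-million-litre volume of an Olympic swimming pool is a commonly cited figure and an assumption of this sketch, not something from Google's report:

```python
water_ml_per_prompt = 0.26   # Google's reported figure
pool_litres = 2_500_000      # typical Olympic pool volume (assumption)

# Number of prompts whose cooling water would fill one pool
prompts_per_pool = pool_litres * 1000 / water_ml_per_prompt
# roughly 9.6 billion prompts
```

Under these assumptions, every ten billion or so prompts drains an Olympic pool's worth of cooling water, which is why the undisclosed query volume matters so much.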
Looking Forward
AI may be a tool for solving some of the world’s biggest challenges, but it must also avoid becoming one of its biggest environmental liabilities. As demand for AI services increases, companies like Google will need to balance innovation with sustainability, ensuring that efficiency gains keep pace with growth.
For now, one Gemini prompt may be as light as a handful of water drops or a few seconds of television—but the question remains: how many prompts will it take before the environmental cost can no longer be ignored?
Final Thought
Google’s findings show that an individual AI prompt may seem environmentally insignificant, but the true impact of AI lies in scale. With billions of prompts processed daily and the rapid expansion of AI infrastructure, even drops of water and seconds of energy add up. The challenge ahead is ensuring that efficiency gains and transparency keep pace with AI’s explosive growth—so that innovation doesn’t come at the expense of sustainability.
Conclusion
Google’s new data on Gemini’s per-prompt energy and water use provides a useful reference point in the ongoing debate about AI’s environmental footprint. On an individual level, the impact appears minimal: nine seconds of television, five drops of water, and a fraction of a gram of CO₂. But scale is everything.
As AI adoption accelerates globally, the environmental burden will grow—not only from energy consumption but also from hardware manufacturing and water use. While Google’s efficiency gains are encouraging, transparency about total usage and long-term sustainability strategies will be key.
Meta Description:
Google says each Gemini AI prompt consumes just 0.24 Wh of energy and five drops of water—far less than many estimates. But at scale, the environmental impact of AI remains a growing concern.
