Every query you send to Google’s Gemini takes about the same amount of energy as watching 9 seconds of TV. So says Google’s new report detailing the energy consumption, emissions and water use of the generative AI that users turn to every day for everything from writing tips to fact checking. A single Gemini text query emits 0.03 grams of carbon dioxide equivalent and consumes about 5 drops of water.

The tech giant appears to be trying to ease a brewing anxiety about AI: that frequent use of generative models such as Gemini is detrimental to the environment.
Global demand for AI is ramping up rapidly, writes The Wall Street Journal (Aug. 21, 2025). Electricity demand from data centers worldwide is set to more than double by 2030 to about 945 terawatt-hours, more than Japan’s total electricity consumption. A single AI-focused data center can use as much electricity as a small city of 100,000 people and as much water as a large neighborhood. The largest ones, which haven’t yet been completed, could consume 20 times as much. It’s a particular problem in the U.S., where data centers are expected to account for half of electricity demand growth over the next 5 years.
OpenAI Chief Executive Sam Altman, when asked how much energy a ChatGPT query uses, responded “the average query uses about the amount an oven would use in just over one second, and 1/15 of a teaspoon of water.”
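The TV and oven comparisons above can be turned into rough watt-hour figures with back-of-envelope arithmetic. The appliance wattages below are assumptions for illustration, not numbers from Google or OpenAI: a modern TV is taken to draw roughly 100 W and an electric oven roughly 2,400 W.

```python
# Back-of-envelope check of the per-query energy comparisons.
# Appliance wattages are ASSUMED for illustration, not sourced figures.
TV_WATTS = 100     # assumed power draw of a modern TV
OVEN_WATTS = 2400  # assumed power draw of an electric oven

def watt_seconds_to_wh(watts: float, seconds: float) -> float:
    """Convert a power draw sustained for some seconds into watt-hours."""
    return watts * seconds / 3600

gemini_query_wh = watt_seconds_to_wh(TV_WATTS, 9)     # "9 seconds of TV"
chatgpt_query_wh = watt_seconds_to_wh(OVEN_WATTS, 1)  # "oven for ~1 second"

print(f"Gemini query  ~ {gemini_query_wh:.2f} Wh")   # ~0.25 Wh
print(f"ChatGPT query ~ {chatgpt_query_wh:.2f} Wh")  # ~0.67 Wh
```

Under these assumptions, both companies are describing queries in the range of a fraction of a watt-hour, which is why the per-query figures sound small even as aggregate data-center demand soars.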
The type of query we feed to generative AI also matters. Much of the energy cost comes from conversational back and forth; trimming that exchange and writing prompts that are simpler and easier to parse reduces the work a model must do. Shorter, more concise prompts, along with smaller AI models, can dramatically reduce energy use.
Tech giants are announcing many new clean-energy power agreements to fuel their AI ambitions, including Google, which recently announced new power deals ranging from geothermal to hydropower. It also plans an advanced nuclear reactor project in Tennessee.
It’s also important for tech companies to divulge how frequently their AI is queried. One person’s queries generate little in the way of emissions, but billions of people querying models at 30 data centers across the world is a different story.
Classroom discussion questions:
- Why is the growth of AI searches an OM issue?
- How can this growth be contained or minimized?
Charles Render is founder and CEO of a Florida-based data analytics firm. He can be reached at