- Google’s Gemini Apps has more than 400 million active users
- As AI adoption grows, the efficiency of inference becomes increasingly important
- Google’s analysis only focuses on simple, text-based queries

Tech giant Google has said that a typical text prompt through its Gemini Apps uses less energy than it takes to watch TV for nine seconds, in its latest report looking at the environmental impact of artificial intelligence (AI).
In the ‘Measuring the environmental impact of delivering AI at Google scale’ report, Google said a single text prompt through Gemini uses 0.24 watt-hours (Wh) of energy and emits 0.03g of carbon dioxide equivalent (CO2e). It also said that a single text prompt consumes 0.26ml of water, the same as about five drops.
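The nine-seconds-of-TV comparison can be sanity-checked with simple arithmetic. The TV wattage used below (about 100W, a typical modern LED set) is an assumption for illustration, not a figure from Google's report:

```python
# Rough check of the "nine seconds of TV" comparison.
PROMPT_WH = 0.24   # Wh per median Gemini text prompt (Google's figure)
TV_WATTS = 100     # assumed draw of a typical LED TV (not from the report)

# Seconds of TV viewing that consume the same energy as one prompt.
tv_seconds = PROMPT_WH / TV_WATTS * 3600
print(f"{tv_seconds:.2f} seconds")  # ~8.64 seconds at 100W
```

A lower-powered set would stretch the equivalent viewing time; a large, bright screen would shrink it, so "about nine seconds" is consistent with a typical television.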
The report comes amid concerns that AI might be harmful to the planet because of the amount of energy it requires, particularly in the data centres where information is processed and stored, as industries use it to inform decisions and people increasingly rely on generative tools, such as ChatGPT, for information.
Some scientists have suggested that AI data centres are about to rapidly increase their energy usage and account for a greater share of the planet’s electricity consumption.
It is predicted that by 2026 data centres globally will consume 1,050 terawatt-hours (TWh) of electricity, more than the whole of Russia and slightly less than Japan. This would make data centres the fifth-biggest energy consumer in the world.
For context, as recently as 2022 that figure was 460TWh, which put data centres in eleventh place globally.

When it comes to water, a single large data centre uses about 2.5 billion litres every year, with global consumption at 560 billion litres a year, a figure that could double by 2030. AI data centres use water for cooling, absorbing heat from computing equipment. For every kilowatt-hour of energy a data centre uses, it needs about two litres of water for cooling.
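The article's water and energy figures can be combined to estimate what a large data centre's water use implies about its energy consumption. This is a back-of-the-envelope sketch that assumes all of the quoted water goes to cooling:

```python
# Implied annual energy use of a single large data centre,
# derived from the article's water figures.
WATER_L_PER_YEAR = 2.5e9  # litres of water per year (single large data centre)
L_PER_KWH = 2.0           # litres of cooling water per kWh of energy used

kwh_per_year = WATER_L_PER_YEAR / L_PER_KWH
print(f"{kwh_per_year / 1e9:.2f} TWh/year")  # 1.25 TWh/year
```

At roughly 1.25TWh a year, a single large facility would account for a small but non-trivial slice of the 1,050TWh projected for all data centres by 2026.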
“The transformative power of AI is undeniable,” Google said in its report. “But as user adoption accelerates, so does the need to understand and mitigate the environmental impact of AI serving.
“Through detailed instrumentation of Google’s AI infrastructure for serving the Gemini AI assistant, we find the median Gemini Apps text prompt consumes 0.24Wh of energy—a figure substantially lower than many public estimates,” the company explains.
It also claims that in one year it has cut the energy consumption and carbon footprint of a single prompt by 33% and 44% respectively, and it says that continuing to ease AI's effect on the environment needs “important attention”.
With that in mind Google has proposed a “comprehensive measurement” of AI’s effect on the environment, saying it would be “critical for accurately comparing different models” and to “properly incentivise efficiency gains”.
As of May 2025, Google’s Gemini Apps had more than 400 million active users. However, while its data on Gemini’s energy usage seems striking, by focusing only on text queries it has seemingly sidestepped other everyday functions that are also major drivers of energy usage, such as document analysis, image generation and live video, among much else.
Google’s AI also powers its search engine, which raises its emissions significantly. In 2023, Google announced that its emissions had risen by 48% in five years, driven mainly by AI.
How did Google calculate the environmental footprint of its AI?
The tech giant compared different AI models, factoring in the hardware they ran on and the energy they consumed, as well as system-wide efficiency optimisations, rather than measuring the models in isolation.
Google said that by sharing its methodology, it hoped to increase industry-wide consistency in calculating AI’s resource consumption and efficiency.
Its comprehensive approach included:
- Full system dynamic power: This spanned not just the energy and water used by the primary AI model during active usage, but also the actual achieved utilisation, which can be much lower than theoretical maximums.
- Idle machines: To ensure high availability and reliability, production systems used a degree of provisioned capacity that’s idle, but ready to handle traffic spikes. This was factored into the total energy footprint.
- CPU and RAM: The host CPU and RAM play a crucial role in serving AI and use energy.
- Data centre overhead: The energy consumed by the IT equipment running AI workloads is only part of the story. The supporting infrastructure, including cooling systems, power distribution and other overheads, also consumes energy.
- Data centre water consumption: Data centres often consume water for cooling. As AI systems become more optimised and energy-efficient, their water consumption falls too.
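The components above can be sketched as a per-prompt energy estimate. Every input figure below is a hypothetical placeholder chosen for illustration, not a value from Google's report; the point is how the pieces combine, with data-centre overhead expressed as a power usage effectiveness (PUE) multiplier:

```python
# Illustrative aggregation of the footprint components Google describes.
# All input figures are hypothetical placeholders, not values from the report.

def prompt_energy_wh(accelerator_wh, cpu_ram_wh, idle_share, pue):
    """Combine active accelerator energy, host CPU/RAM energy, provisioned
    idle capacity, and data-centre overhead (PUE) into one per-prompt figure."""
    it_energy = (accelerator_wh + cpu_ram_wh) * (1 + idle_share)
    return it_energy * pue

total = prompt_energy_wh(accelerator_wh=0.14, cpu_ram_wh=0.04,
                         idle_share=0.10, pue=1.09)
print(f"{total:.3f} Wh")  # ~0.216 Wh with these placeholder inputs
```

A narrower methodology that counted only the accelerator's active energy would report a much smaller number, which is why Google argues that comprehensive measurement matters when comparing models.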
Google says its comprehensive methodology, which yields 0.24Wh of energy, 0.03g of CO2e and 0.26ml of water per prompt, accounts for all critical elements of serving AI globally.