5 Things you did not know about AI's energy use, broken down into estimates for various Google search use cases.
Energy Consumption of Different Google Search Types
Every Google search – whether text, image, voice, or AI-generated – uses a small amount of energy. While each query's energy cost is tiny in isolation, the billions of searches per day add up to significant consumption. Google has invested heavily in efficient data centers to minimize per-query energy use, and consumption varies by the type of search.
Below we break down estimates for various Google search use cases, given in watt-hours (Wh) per query, along with relatable consumer-equivalent examples. We also note infrastructure and efficiency context where relevant.
Standard Text Search
Text search
A typical Google text search is very efficient, requiring on the order of 0.3 Wh of energy (0.0003 kWh) on Google's servers. This is a fraction of a watt-hour – for perspective, about the energy to power a 60 W light bulb for only ~17 seconds. In other words, one search uses roughly the same electricity as your body burns in 10 seconds, and it emits only ~0.2 grams of CO₂. Such a query's energy is also comparable to just a few percent of a smartphone's battery charge.
Footprint
Google achieves this low per-search footprint through extremely optimized infrastructure – its data centers are among the world's most energy-efficient (Power Usage Effectiveness ~1.1) and increasingly powered by carbon-free energy (64% on average in 2022). As a result, your computer often uses more energy to display search results than Google uses to find them. Still, at Google's scale even this small energy usage is significant: 3.5 billion searches a day × ~0.3 Wh each ≈ 1.05 GWh daily, about the electricity consumption of 30,000 US homes in a day.
- Consumer equivalent: ~0.3 Wh per query – about the energy to run a typical LED light bulb for 2–3 minutes, or roughly 1/80th of the energy needed to boil a cup of water (~25 Wh). In practical terms, ~33 text searches use about 10 Wh, which is akin to fully charging one smartphone.
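As a quick back-of-envelope check of the aggregate figures above, the following Python sketch reproduces the math; the ~30 kWh/day average US household figure is an assumption, which yields ~35,000 home-days, the same ballpark as the ~30,000 cited:

```python
# Rough aggregate-energy check. Inputs from the text: ~0.3 Wh per
# text search and ~3.5 billion searches per day. The ~30 kWh/day
# average US household figure is an assumption for comparison.

WH_PER_SEARCH = 0.3          # Wh per standard text search
SEARCHES_PER_DAY = 3.5e9     # queries per day
HOME_WH_PER_DAY = 30_000     # assumed average US home usage (Wh/day)

daily_wh = WH_PER_SEARCH * SEARCHES_PER_DAY
daily_gwh = daily_wh / 1e9               # ~1.05 GWh per day
homes = daily_wh / HOME_WH_PER_DAY       # ~35,000 home-days

print(f"{daily_gwh:.2f} GWh/day, roughly {homes:,.0f} US home-days")
```

Plugging in a different per-home assumption shifts the equivalence, but the ~1 GWh/day total depends only on the two figures quoted in the text.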
Image Search
Google images
Searching for images (“Google Images”) involves retrieving and transmitting visual content, which can require slightly more processing and data transfer than a text query. Google’s servers must not only query the index for relevant images but also fetch image thumbnails. However, much of the heavy lifting (crawling and indexing images) is done beforehand, and caching is used to serve popular images efficiently.
Consumption
Precise figures aren’t publicly broken out, but an image search is estimated to consume on the order of ~0.4 Wh per query (roughly 1.1–1.5× a standard text search). This marginal increase accounts for additional image-processing overhead and the larger data payload. The server-side energy remains very small – only a few tenths of a watt-hour – while the user’s device might actually expend more energy downloading and rendering image thumbnails than the Google data center expends to find them.
- Consumer equivalent: ~0.4 Wh per image query – on the order of 4–5% of a smartphone’s battery capacity per search, or about the same electricity as running a 10 W LED lamp for ~2.5 minutes. In other words, a handful of image searches uses about as much energy as charging a phone for a few minutes.
Voice Search
Assisted search
Voice searches (for example, using the Google Assistant or the voice input on the Google app) add an extra step: converting speech to text via an automatic speech recognition (ASR) model. This speech-processing step requires additional computation in Google’s cloud. Once transcribed, the query is handled like a normal search. Because of the ASR stage, a voice search tends to use slightly more energy than a text search – roughly 0.4–0.6 Wh per query by some estimates (perhaps about 1.5–2× a standard search).
Task breakdown
In effect, the servers are doing two tasks: decoding your voice with an AI model and then executing the search. Google’s ASR models are optimized for efficiency (and often run on specialized hardware), so the added energy per request is still well under 1 Wh.
Notably, if a voice query is handled entirely by an assistant (providing a spoken answer to a simple question), it might even save a bit of energy on the user’s side (by avoiding a full graphics load on a device), but on the server side it remains a bit higher than typing the query.
- Consumer equivalent: ~0.5 Wh per voice query – roughly twice the energy of a text search. This is comparable to keeping a 60 W lightbulb on for ~30 seconds, or about 5%–6% of a phone’s charge consumed in one spoken query. For example, two voice searches use about as much server energy as one charge of a wireless earbud case.
AI-Enhanced Search (Generative AI Results)
AI overviews
“AI overviews” or generative answers – as seen in Google’s Search Generative Experience (SGE) or potentially Gemini-powered results – are far more computationally intensive than traditional searches. Here, large machine learning models generate original text or context based on search results.
Several analyses indicate an order-of-magnitude jump in energy per query. Estimates range from about 3 Wh per AI-powered query up to 7–9 Wh in some cases, depending on the model size and how much text is generated. This is roughly 10–30× the energy of a standard search. Alex de Vries (Digiconomist) estimated that an AI “snapshot” answer on Google uses ~3 Wh, about ten times a normal search's 0.3 Wh.

In real terms
In real terms, that 3 Wh is “roughly equivalent to the amount of power used by a 100W lightbulb for two minutes”. Another study notes that each request to an AI language model can consume 7–9 Wh, ~25× a Google query's energy demand. (The higher estimates likely correspond to complex prompts or larger models; in practice Google's SGE has been optimizing to lower this cost.)
Google itself has stated that the “machine costs” (compute overhead) for generative AI in search dropped 80% since the initial launch in Labs thanks to model tuning and better hardware, so the current energy per AI search may be closer to a few watt-hours. Even so, AI-assisted searches clearly consume significantly more electricity than a lookup that simply retrieves existing results.
This extra energy largely goes into the inference computations of the neural network that composes the answer, which require powerful accelerators (TPUs/GPUs) working for a longer time per query.
- Consumer equivalent: ~3–5 Wh (typical AI-generated result) – on the order of running a 100 W light bulb for 2–3 minutes, or 10× a standard search's energy. Even a very large model's answer (7+ Wh) would draw only about 0.015% of a full EV charge (assuming an EV with a ~50 kWh battery), though Google is working to reduce this cost further. Another way to look at it: a single AI-powered query (a few Wh) consumes about as much energy as boiling ~0.05 liters (a few tablespoons) of water in a kettle.
In bulk terms, if Google were to apply generative AI to all searches, the total energy usage could rise dramatically – one analysis suggests it could equal the electricity consumption of an entire country (like Ireland) if every search used AI.
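That country-scale claim can be reproduced with rough numbers. Assuming 3–9 Wh per AI query (the estimates above) and daily search volumes between 3.5 and 9 billion (both volume bounds are assumptions, since public figures vary), the upper end of the range indeed approaches Ireland's roughly 30 TWh of annual electricity consumption:

```python
# Scenario math for "generative AI on every search".
# Per-query energy (from the text): 3 Wh (low) to 9 Wh (high).
# Daily search volume: 3.5-9 billion/day (assumed bounds).

LOW_WH, HIGH_WH = 3.0, 9.0
SEARCHES_LOW, SEARCHES_HIGH = 3.5e9, 9.0e9
DAYS = 365

annual_low_twh = LOW_WH * SEARCHES_LOW * DAYS / 1e12     # ~3.8 TWh/yr
annual_high_twh = HIGH_WH * SEARCHES_HIGH * DAYS / 1e12  # ~29.6 TWh/yr

print(f"{annual_low_twh:.1f}-{annual_high_twh:.1f} TWh per year")
```

The wide spread shows how sensitive these projections are to the per-query and volume assumptions.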
High-Volume & Real-Time Trending Queries
Real time trends
When a major event breaks (e.g. a World Cup final or trending news story), millions of people may search for the same thing simultaneously. Interestingly, the per-query energy for trending searches is not much higher than a normal search – and can even be lower – thanks to caching and efficiency in handling repeated queries. Google's systems will often serve popular queries from cache (storing recent results in memory) for a short period, which avoids recomputing the same answer each time.
This means that the first user to search a fresh news item might incur the full ~0.3 Wh cost, but many subsequent users get the answer with less work. In some cases (like rapidly evolving news), the index may be updated in near-real-time, but Google’s infrastructure is designed to handle that incrementally. Overall, you can assume a trending-news query still uses on the order of ~0.3 Wh per request (perhaps a bit more if it bypasses cache for the latest update).
Real impact
The real impact comes from volume: sudden surges in queries can drive up total energy use (and load on servers) even if each query is individually small. Google reported that during the 2022 World Cup final, Search saw its highest traffic in 25 years – essentially “the entire world was searching for one thing” at once. The infrastructure handled this by spreading the load across data centers worldwide.
If we consider a ballpark figure, one million trending searches at ~0.3 Wh each would consume about 300 kWh in total. That is as much electricity as an average U.S. home uses in 10 days, or enough to fully charge ~30,000 smartphones. Fortunately, Google’s data centers can absorb these spikes, and frequently-requested results are delivered efficiently to each user.
- Consumer equivalent: ~0.3 Wh per trending query (similar to any text search). However, in aggregate 1 million such searches ≈ 300 kWh – about what it takes to boil ~3,000 liters of water or to drive a typical electric car for 1,200–1,500 km. In other words, during major real-time events the collective energy for all searches can be huge (even though Google partially mitigates this with caching and fast indexing).
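The surge arithmetic can be verified in a few lines; the ~10 Wh per phone charge and ~30 kWh/day per home figures are the same assumptions used earlier in this post:

```python
# Sanity check on the trending-query surge figures.
WH_PER_QUERY = 0.3           # Wh per (mostly cached) query
QUERIES = 1_000_000          # queries during a surge

total_kwh = WH_PER_QUERY * QUERIES / 1e3   # ~300 kWh in total
phone_charges = total_kwh * 1e3 / 10       # assuming ~10 Wh per charge
home_days = total_kwh / 30                 # assuming ~30 kWh/day per home

print(f"{total_kwh:.0f} kWh, ~{phone_charges:,.0f} phone charges, "
      f"~{home_days:.0f} home-days")
```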
AI Image Creation & Other Very Heavy Tasks
Very heavy tasks (for instance image generation, long code generation, or fine-tuning a large language model) use at least ~5–20 Wh per request, while ultra-heavy tasks such as training a large language model from scratch use thousands to millions of Wh per training run.
- Consumer equivalent: very heavy tasks use roughly the energy of running a low-power laptop for 1–2 hours or driving an electric car ~30–100 meters. Ultra-heavy tasks can use up to a year of a U.S. household's energy consumption, depending on model size.
Summary Comparison
Comparison
The table below compares the estimated energy use per Google search request across different use cases, along with an illustrative consumer equivalent for a single query:

Sources for this comparison & recap
Data compiled from Google's public statements and sustainability reports, academic and industry research, and reputable tech analyses. Google's 2009 disclosure put a typical search at 0.0003 kWh (google.blogspot.com), and recent analyses consistently affirm ~0.2–0.3 Wh per search.
Enhanced searches using AI consume significantly more energy per query (wpr.org). Note that actual values can vary with query complexity, caching, and data center efficiency. Google continues to improve hardware, cooling, and algorithms to cut per-query energy use, especially for AI – for example, reducing generative AI serving costs by 80% since introduction (Jacobin.com).
Additionally, where a query is served (where the data center is and its power source) can influence the carbon footprint of that energy (renewable vs. grid mix), though the pure energy (Wh) remains roughly the same.
In summary, a simple search uses only a few hundred milliwatt-hours, an almost negligible amount of energy, whereas an AI-driven query can use a few watt-hours, which makes it important to deploy these advanced features judiciously and efficiently.
| Search Use Case | Energy per Query (Wh) | Equivalent Consumer Energy Use |
|---|---|---|
| Standard text search | ≈ 0.3 Wh | ~60 W lightbulb for 17 seconds (a few % of a phone charge) |
| Image search | 0.3–0.5 Wh (estimated) | A 10 W LED lamp for ~2–3 minutes (slightly more than a text search) |
| Voice search | 0.4–0.6 Wh (estimated) | ~2× a text query (adds speech-to-text); 60 W bulb for ~30 sec |
| AI-generated search | 3–7 Wh | ≈10–25× a normal search; ~1 hour on a landline phone; 100 W bulb for ~2–4 min |
| Trending/news query | ≈ 0.3 Wh (cached; slightly higher if fresh) | Similar to standard search; in aggregate, 1 million queries ≈ 300 kWh (≈10 days of one home's usage) |
Energy reduction strategies
Adopting energy-efficient machine learning operations, including optimized model architectures, helps reduce energy usage in AI workflows. Compressing large language models (LLMs) and other neural networks helps as well: compression makes models more efficient and accessible, and it can lower the energy footprint of LLMs by up to 45%.
Google, OpenAI, and other leading AI organizations are actively implementing strategies and researching options to reduce the energy consumption associated with AI operations.
- In 2016 Google integrated DeepMind’s AI to enhance the efficiency of its data center cooling systems, achieving up to a 40% reduction in energy used for cooling.
- To support AI growth sustainably, Google is investing €1 billion in expanding its Finnish data center, which operates on 97% carbon-free energy. Additionally, waste heat from this facility is redirected to local district heating systems.
- OpenAI is reportedly collaborating with semiconductor companies to design custom AI chips. This move aims to enhance computational efficiency and reduce energy consumption by tailoring hardware to specific AI workloads.
- OpenAI is also focusing on improving the efficiency of its AI models. For instance, newer models like GPT-4 are designed to perform complex reasoning tasks more efficiently, potentially reducing the energy required per query.
- To meet the substantial energy requirements of AI training, OpenAI, in partnership with Microsoft, is exploring sustainable energy options, including the use of nuclear power, to ensure a reliable and cleaner energy supply for their data centers.

Have you ever stopped to think about the energy behind the tools you use?
The hidden costs
Whether you’re a developer building with large language models or a marketer prompting AI to craft your next campaign, each query, each output, carries a hidden energy cost. We don’t often talk about it, but maybe we should.
As AI becomes part of our everyday workflow, what if we started factoring in not just performance and cost, but also energy use? This isn’t about guilt, it’s about awareness. About making smarter, more responsible choices. So here’s an invitation: let’s start asking not just what we’re building with AI, but how much it really costs to run. And whether there’s a lower-impact way to get the job done.
Creating awareness
We’ll start by sharing the hidden costs of using AI to help us write this post and future ones.
Total Estimated Energy Use of this post using ChatGPT: ~90–110 Wh (watt-hours). Consumer Equivalents:
- Driving an electric car ~500–800 meters
- Running a 100W lightbulb for ~1 hour
- Fully charging a modern smartphone 7–10 times
- Boiling ~1 liter of water in a kettle
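These equivalents can be cross-checked with explicit conversion factors. The factors below (EV efficiency ~180 Wh/km, ~12 Wh per full phone charge, ~110 Wh to boil a liter of water including kettle losses) are assumed typical values, not measurements:

```python
# Cross-check of the consumer equivalents for this post's ~90-110 Wh.
# All conversion factors are assumptions, chosen as typical values.
EV_WH_PER_KM = 180    # assumed EV consumption per km
PHONE_WH = 12         # assumed energy per full smartphone charge
BOIL_WH_PER_L = 110   # assumed electricity to boil 1 L (incl. losses)

for wh in (90, 110):
    km = wh / EV_WH_PER_KM
    charges = wh / PHONE_WH
    litres = wh / BOIL_WH_PER_L
    print(f"{wh} Wh: ~{km * 1000:.0f} m by EV, ~{charges:.0f} phone "
          f"charges, boils ~{litres:.1f} L of water")
```

With these factors the estimates land close to the bullet list above; different EV or kettle assumptions would shift each line by a few tens of percent.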
More resources:
https://blog.google/outreach-initiatives/environment/deepmind-ai-reduces-energy-used-for/
https://www.businessinsider.com/openai-chip-design-tsmc-broadcom-big-tech-nvidia-2024-10?
https://heatmap.news/technology/openai-o1-energy
https://www.weforum.org/stories/2025/03/on-device-ai-energy-system-chatgpt-grok-deepx/
https://arxiv.org/abs/2504.06307
https://arxiv.org/abs/2503.23934