The Energy Cost of Chatbots: 26 Queries = A Warm Lunch
Generative AI is transforming the technology world, bringing both improvements and new challenges. The recent case¹ of the UAE’s agreement with OpenAI to provide free ChatGPT Plus access across the country is just one example of its widespread adoption.
However, the rapid growth of this technology raises important concerns. According to recent research² by Surfshark, privacy and data security are among them, with AI chatbots collecting and sharing large numbers of users’ data points with third parties. Beyond these data concerns, experts also warn about another issue: energy consumption. Research finds that the energy used to make 26 chatbot queries is roughly the same as the energy needed to microwave a lunch.
Key insights
- One ChatGPT query consumes energy equivalent to running a 10W LED bulb for about 12 minutes. Similarly, a single query uses the same amount of energy as charging your phone with a 5W charger for 24 minutes. If we consider more powerful appliances, such as a 1000W microwave, one ChatGPT query equals about 7 seconds of microwave use. This means you could heat up your lunch (assuming a three-minute heating time) with the energy of approximately 26 queries. Finally, when compared to watching TV, a 50-minute episode of your favorite series (on a 100W TV) uses roughly the same amount of energy as 42 ChatGPT queries.
- If every person in the USA made a single query to ChatGPT, it would use an estimated 685MWh of energy. To put this into perspective, this amount of energy could power approximately 63 average American homes for an entire year, given that the average USA household consumes about 10.8MWh annually³.
- Each ChatGPT query produces an estimated 4.32 grams of CO₂⁴. This is because powering the data centers that run these queries requires electricity, much of which is still generated from fossil fuels that emit carbon dioxide. Multiplied by millions of queries daily, this results in significant carbon emissions. For instance, just one day of everyone in the US making a single query could emit around 1,479 metric tons of CO₂, roughly equivalent to the annual emissions of about 322 average gasoline cars⁵, or the same carbon footprint as 1,500 people flying from London to New York and back⁶.
- The global number of AI users is expected to reach approximately 378 million in 2025, marking a 20% increase from the previous year. This represents an addition of nearly 65 million new users in 2025 alone — the highest annual increase recorded so far⁷. As AI adoption grows, optimizing energy efficiency and carbon impact becomes increasingly critical.
- ChatGPT’s estimated energy consumption per simple query varies across studies, ranging from 0.3 watt-hours (Wh) (Epoch AI⁸, 2025) to around 3Wh (Alex de Vries⁹, 2023; 2.9Wh per BestBrokers/EPRI¹⁰, 2024). These differences reflect variations in model size, hardware efficiency, and measurement methods, and they highlight both ongoing improvements in AI infrastructure and the complexity of accurately measuring AI energy use. For this study, we used an average of 2Wh per ChatGPT query. At that rate, ChatGPT is nearly seven times more energy-demanding than Google Search (2Wh vs. 0.3Wh¹¹). The sketch after this list reproduces the arithmetic behind these and the other comparisons above.
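For readers who want to check the arithmetic, the short Python sketch below recomputes the figures above from the 2Wh-per-query assumption. The appliance wattages, the 4.32g CO₂ per query, the 10.8MWh household average, and the 4.6-ton car figure are the values quoted in this article; the US population value (~342.5 million) is our approximation, chosen to be consistent with the quoted 685MWh estimate rather than taken from the underlying sources.

```python
# Recomputes the back-of-the-envelope figures from the "Key insights" list.
# Assumptions taken from the article: 2 Wh per ChatGPT query, the quoted appliance
# wattages, ~342.5 million US residents, 10.8 MWh of electricity per household per
# year, 4.32 g of CO2 per query, and 4.6 metric tons of CO2 per car per year.

WH_PER_QUERY = 2.0  # average estimate used in this study (Wh)

# Appliance equivalents of a single query
led_minutes = WH_PER_QUERY / 10 * 60            # 10 W LED bulb      -> 12 min
charger_minutes = WH_PER_QUERY / 5 * 60         # 5 W phone charger  -> 24 min
microwave_seconds = WH_PER_QUERY / 1000 * 3600  # 1,000 W microwave  -> ~7 s

# Queries equivalent to everyday tasks
lunch_queries = (1000 * 3 / 60) / WH_PER_QUERY  # 3-min microwave run -> ~25 (rounded to 26 in the headline)
tv_queries = (100 * 50 / 60) / WH_PER_QUERY     # 50-min, 100 W TV    -> ~42

# One query per US resident
US_POPULATION = 342_500_000
total_mwh = US_POPULATION * WH_PER_QUERY / 1e6  # Wh -> MWh              -> ~685 MWh
homes_powered = total_mwh / 10.8                # 10.8 MWh per home/year -> ~63 homes

# Emissions of that one-query-each day
daily_tons_co2 = US_POPULATION * 4.32 / 1e6     # g -> metric tons       -> ~1,480 t
car_equivalents = daily_tons_co2 / 4.6          # 4.6 t per car per year -> ~322 cars

# Energy relative to a single Google Search
google_ratio = WH_PER_QUERY / 0.3               # -> ~6.7x, "nearly seven times"

print(f"{led_minutes:.0f} min LED, {charger_minutes:.0f} min charging, {microwave_seconds:.0f} s microwave")
print(f"{lunch_queries:.0f} queries per lunch, {tv_queries:.0f} per TV episode")
print(f"{total_mwh:.0f} MWh nationwide, ~{homes_powered:.0f} homes powered for a year")
print(f"{daily_tons_co2:.0f} t CO2, ~{car_equivalents:.0f} cars' annual emissions")
print(f"ChatGPT vs. Google Search: {google_ratio:.1f}x")
```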
Methodology and sources
The energy consumption estimates per ChatGPT query were compiled from multiple recent studies published between 2023 and 2025. Estimates derive from lifecycle assessments and hardware efficiency models, not direct measurements, due to limited transparency from AI companies. The low estimate of 0.3 watt-hours (Wh) per query comes from Epoch AI’s⁸ 2025 analysis, reflecting improvements in model optimization and infrastructure. The higher estimate of 3Wh per query is based on earlier work by Alex de Vries⁹ (2023) and corroborated by measurements from the Electric Power Research Institute (EPRI) and BestBrokers¹⁰ in 2024 (2.9Wh). Recent optimizations in GPT-4o reduced energy use to around 0.3Wh per query, whereas older models consumed significantly more due to inefficient hardware. Conversely, complex queries with very long inputs may exceed 3Wh. For this study, we took the average of these estimates, which works out to approximately 2Wh.
Carbon emissions per query, estimated at 4.32 grams of CO₂⁴, were derived from lifecycle analyses of data center electricity use. These analyses incorporate regional grid carbon intensity averages and assume a global average grid intensity of 1.44 kg CO₂/kWh; actual emissions vary by region, from roughly 0.144g to 9g of CO₂ per query. The EPA⁶ estimates that an average gasoline car emits about 4.6 metric tons of CO₂ annually (387 kg per month).
Appliance power ratings were sourced from publicly available manufacturer specifications representing typical household devices. Energy consumption for each scenario was calculated by multiplying power (in watts) by the duration as a fraction of an hour (for example, 5/60 for five minutes), yielding watt-hours (Wh).
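As a compact illustration of this methodology, the sketch below captures the two formulas used throughout: energy as power multiplied by time, and emissions as energy multiplied by grid carbon intensity. The helper names are ours and purely illustrative; the example values are the ones quoted in this article, and the final line simply shows that the 1.44 kg CO₂/kWh intensity reproduces the 4.32g per-query figure when applied to the upper 3Wh estimate.

```python
# Minimal helpers mirroring the methodology: energy = power x time,
# emissions = energy x grid carbon intensity. Names and examples are illustrative.

def energy_wh(power_watts: float, minutes: float) -> float:
    """Energy in watt-hours for a device drawing `power_watts` for `minutes`."""
    return power_watts * minutes / 60

def emissions_g(wh: float, intensity_kg_per_kwh: float = 1.44) -> float:
    """Grams of CO2 for `wh` watt-hours; kg/kWh is numerically equal to g/Wh."""
    return wh * intensity_kg_per_kwh

print(energy_wh(1000, 3))   # 3-minute, 1,000 W microwave run -> 50.0 Wh (about 25-26 queries at 2 Wh)
print(energy_wh(100, 50))   # 50-minute episode on a 100 W TV -> ~83.3 Wh (about 42 queries)
print(emissions_g(3))       # -> 4.32 g, the per-query estimate above, at the upper 3 Wh figure
```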
For the complete research material behind this study, visit here.