What’s the Carbon Footprint of a ChatGPT Query?
What is the environmental impact of a single ChatGPT query?
Since the launch of OpenAI’s groundbreaking AI chatbot and the rise of large language models (LLMs) more broadly, this question has sparked widespread curiosity.
Recently, OpenAI CEO Sam Altman addressed this topic in a detailed post on his personal blog, offering an initial estimate.
Altman wrote:
“People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency light bulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon.”
He added:
“As datacentre production gets automated, the cost of intelligence should eventually converge to near the cost of electricity.”
Yet Altman’s per-query estimate does not capture the rapidly growing, widespread adoption of these AI tools.
When asked directly, ChatGPT itself acknowledges that while the environmental cost of an individual query is modest compared with other common digital activities, the cumulative effect of billions of daily interactions adds up quickly.
At that scale, a seemingly small footprint becomes a significant environmental consideration, underscoring the need to evaluate AI’s long-term sustainability.
Here is what ChatGPT said when asked:
“The environmental impact of a single ChatGPT query is relatively small on its own, but when multiplied by billions of daily interactions, it contributes to a significant cumulative carbon footprint. Each query requires substantial computational power, which consumes energy—often sourced from data centers with varying environmental efficiency. As a result, while one query may have a modest energy cost, the widespread and growing use of AI models like ChatGPT raises important questions about the sustainability and environmental consequences of large-scale AI deployment.”
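Taken at face value, Altman’s per-query figures can be scaled to illustrate this cumulative point. The daily query volume below is an illustrative assumption, not an OpenAI figure:

```python
# Scaling Altman's per-query figures to an assumed fleet-wide volume.
# QUERIES_PER_DAY is an illustrative assumption, not an OpenAI figure.
WH_PER_QUERY = 0.34          # Altman's estimate, watt-hours per query
GAL_PER_QUERY = 0.000085     # Altman's estimate, gallons of water per query
QUERIES_PER_DAY = 1_000_000_000

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000   # Wh -> MWh
daily_gallons = GAL_PER_QUERY * QUERIES_PER_DAY

print(f"{daily_mwh:,.0f} MWh and {daily_gallons:,.0f} gallons of water per day")
# -> 340 MWh and 85,000 gallons of water per day
```

In other words, a footprint that is negligible per query becomes a utility-scale energy demand once the assumed daily volume is factored in.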
ChatGPT’s Carbon Footprint Could Match 4,300 Round-Trip Flights from Paris to NYC
Two years ago, Greenly, an app that enables companies to monitor their real-time CO2 emissions, estimated that the original version of ChatGPT had a carbon footprint of approximately 240 tonnes of CO2 equivalent—comparable to 136 round-trip flights between Paris and New York City.
Remarkably, the training phase alone accounted for 99% of these emissions, or about 238 tonnes annually.
Breaking down the footprint further, electricity consumption during operation represented three-quarters of the total (around 160 tonnes), followed by server manufacturing at 68.9 tonnes, and refrigerant gas leakage contributing 9.6 tonnes, according to the report.
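As a quick arithmetic check, the breakdown Greenly reports does add up to roughly the headline total (all figures in tonnes of CO2 equivalent, taken from the report):

```python
# Summing Greenly's reported breakdown (tonnes of CO2 equivalent).
electricity = 160.0   # electricity consumption during operation
servers = 68.9        # server manufacturing
refrigerant = 9.6     # refrigerant gas leakage

total = electricity + servers + refrigerant
print(f"Breakdown total: {total:.1f} tCO2e")
# -> Breakdown total: 238.5 tCO2e, close to the ~240-tonne estimate
```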
A more recent Greenly analysis assessed the environmental impact of ChatGPT’s latest iteration.
If an organisation were to use ChatGPT-4 to handle one million emails a month, the model’s training and usage would generate approximately 7,138 tonnes of CO2 equivalent annually, comparable to 4,300 round-trip flights between Paris and New York.
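For consistency, the per-flight emissions implied by the two Greenly equivalences can be compared; using only the figures quoted above, a Paris–New York round trip comes out at roughly 1.7 tonnes of CO2 equivalent in both cases:

```python
# Implied CO2e per Paris-NYC round-trip flight from the two Greenly analyses.
tonnes_v1, flights_v1 = 240.0, 136      # original ChatGPT estimate
tonnes_v4, flights_v4 = 7_138.0, 4_300  # ChatGPT-4, one million emails/month

per_flight_v1 = tonnes_v1 / flights_v1  # ~1.76 tCO2e
per_flight_v4 = tonnes_v4 / flights_v4  # ~1.66 tCO2e
print(f"{per_flight_v1:.2f} vs {per_flight_v4:.2f} tCO2e per round trip")
# -> 1.76 vs 1.66 tCO2e per round trip
```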
Complementing these findings, researchers at MIT estimate that training multiple AI language models produces emissions five times greater than those of an average American car over its entire life cycle, including manufacturing.
These staggering figures highlight the urgent environmental challenges posed by the rapid expansion of AI technologies.
In response, a growing movement toward developing smaller, more efficient, and less energy-intensive AI models is gaining momentum, aiming to balance innovation with sustainability.
One X (formerly Twitter) user asked ChatGPT to estimate the carbon emissions per query based on usage in Delhi.
Coinlive took the liberty of asking the same question, substituting Singapore for Delhi; the result is below.