
The Environmental Cost of Every ChatGPT Query
Every time you ask ChatGPT a question, something happens in a data centre somewhere that you cannot see. Processors heat up. Cooling systems activate. Water evaporates or recirculates. Electricity flows from a grid that is, depending on where the data centre is located and what time of day it is, powered by some combination of renewable energy and fossil fuels. The energy cost of that single interaction is small. The energy cost of 100 million daily interactions, the scale at which ChatGPT was operating by early 2025, is not small. It is a meaningful fraction of the energy consumption of a mid-sized country. And the AI infrastructure buildout currently underway (Microsoft committing $80 billion to data centres in fiscal year 2026 alone, Google committing $75 billion in capex for 2025, the Stargate project pledging $500 billion over four years) means the energy consumption of AI will grow by orders of magnitude in the next five years. The environmental cost of the AI revolution is not a theoretical future concern. It is already the reason Microsoft's carbon emissions increased 30% between 2020 and 2023 despite a public pledge to be carbon negative by 2030. It is the reason data centres in Ireland now consume more electricity than all the country's urban households combined. It is a real cost being paid right now, and it is almost entirely invisible to the billions of people running up the tab.
The Energy Numbers: What One Query Actually Costs
A standard Google search query consumes approximately 0.3 watt-hours of electricity. A ChatGPT query consumes approximately 0.001 to 0.01 kilowatt-hours (1 to 10 watt-hours), roughly 3 to 33 times as much, depending on the complexity and length of the interaction. A query that involves generating a long document or a complex piece of code sits at the higher end of this range. The International Energy Agency estimated in 2025 that data centres globally consumed approximately 200 to 250 terawatt-hours of electricity per year, roughly 1% of global electricity demand. With the AI infrastructure buildout in full acceleration, this figure is projected to reach 500 to 700 terawatt-hours annually by 2030.

To put this in context: 500 terawatt-hours per year is approximately the annual electricity consumption of France. The AI revolution, at the infrastructure buildout rate currently underway, will by 2030 require as much electricity as a major European nation just to run its training and inference workloads, on top of existing data centre demand for cloud computing, streaming, and communication services. The training runs for large language models are particularly expensive: training a single large model like GPT-4 is estimated to have consumed several thousand megawatt-hours of electricity and produced hundreds of tonnes of CO2 equivalent, depending on the energy mix of the data centres used.
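The per-query figures above can be sanity-checked with a quick back-of-envelope calculation. The energy values and the 100 million daily queries are the estimates already cited; the rest is arithmetic:

```python
# Back-of-envelope check of the per-query figures cited above.
# Per-query energy and daily query volume are the article's estimates.

GOOGLE_SEARCH_WH = 0.3   # Wh per standard Google search
CHATGPT_WH_LOW = 1.0     # 0.001 kWh: a simple query
CHATGPT_WH_HIGH = 10.0   # 0.01 kWh: a long or complex query

# Ratio of ChatGPT query energy to a Google search
ratio_low = CHATGPT_WH_LOW / GOOGLE_SEARCH_WH    # ~3.3x
ratio_high = CHATGPT_WH_HIGH / GOOGLE_SEARCH_WH  # ~33.3x

# Annual inference energy at the early-2025 query volume
QUERIES_PER_DAY = 100_000_000
annual_gwh_low = CHATGPT_WH_LOW * QUERIES_PER_DAY * 365 / 1e9
annual_gwh_high = CHATGPT_WH_HIGH * QUERIES_PER_DAY * 365 / 1e9

print(f"ChatGPT vs Google search: {ratio_low:.1f}x to {ratio_high:.1f}x")
print(f"Annual inference energy at 100M queries/day: "
      f"{annual_gwh_low:.1f} to {annual_gwh_high:.1f} GWh")
```

Note that this counts only the marginal energy of serving queries; it excludes training runs, idle capacity, cooling overhead, and the embodied carbon of the data centres themselves, which is why the infrastructure-level totals discussed above are far larger.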
The Water Cost Nobody Talks About
Electricity consumption is the visible environmental cost of AI. Water consumption is the invisible one. Data centres require cooling, either through water evaporation in cooling towers or through indirect cooling systems that use water as a heat exchange medium. Microsoft disclosed that its global data centre water consumption increased by 34% between 2021 and 2022, reaching 6.4 million cubic metres. Google disclosed that its data centres consumed 5.6 billion litres of water in 2022. These figures were reported before the AI infrastructure buildout accelerated significantly.

The water consumption problem is geographically concentrated. Data centres are often located in areas chosen for cheap electricity or favourable tax conditions, which are not always areas with abundant water resources. Data centres in the American Southwest, which is experiencing multi-year drought conditions, are competing for water with agriculture and municipal water systems. A single large data centre can consume millions of litres of water per day. The decision about where to locate AI infrastructure is therefore not just an energy decision; it is a water resource decision with significant implications for local communities and ecosystems.
The Broken Carbon Pledges
Microsoft pledged in 2020 to be carbon negative by 2030 and to remove its historical carbon emissions by 2050. In 2023, the company's carbon emissions were 30% higher than in 2020. The primary reason: the rapid construction of data centres for AI workloads requires carbon-intensive building materials (steel, concrete, aluminium), and the electricity consumption of those data centres is outpacing the company's renewable energy procurement. Microsoft's chief sustainability officer acknowledged in the company's 2023 sustainability report that AI was making the 2030 carbon goal significantly harder to achieve.

Google reported a 48% increase in greenhouse gas emissions between 2019 and 2023, driven by data centre energy consumption. The company had previously committed to operating on carbon-free energy by 2030. Amazon, which operates the world's largest cloud infrastructure through AWS, has similarly seen its carbon footprint grow despite significant renewable energy investment. The pattern across the hyperscalers is consistent: AI infrastructure investment is growing faster than renewable energy procurement and carbon offsetting capacity, resulting in higher absolute emissions even as emissions intensity per unit of computation decreases.
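The intensity-versus-absolute-emissions pattern is easy to miss, so here is a minimal illustration. The numbers are entirely hypothetical, chosen only to show the mechanism, not drawn from any company's disclosures:

```python
# Illustrative only: hypothetical figures showing how emissions intensity
# per unit of computation can fall while absolute emissions rise.

compute_2020 = 1.0     # arbitrary compute units in the baseline year
intensity_2020 = 1.0   # tCO2e per compute unit in the baseline year

compute_2023 = 3.0     # compute triples with the AI buildout (hypothetical)
intensity_2023 = 0.55  # 45% more efficient per unit (hypothetical)

emissions_2020 = compute_2020 * intensity_2020  # 1.00 tCO2e
emissions_2023 = compute_2023 * intensity_2023  # 1.65 tCO2e

growth_pct = (emissions_2023 / emissions_2020 - 1) * 100
print(f"Intensity fell 45%, yet absolute emissions rose {growth_pct:.0f}%")
```

Efficiency gains reduce the slope, but when workload growth outpaces them, the absolute total still climbs, which is the situation the hyperscalers' sustainability reports describe.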
The Efficiency Paradox
The standard response from AI companies to environmental concerns is that AI will help solve climate change by optimising energy grids, accelerating materials science for clean energy, improving climate modelling, and reducing inefficiency across industrial and transportation systems. These are genuine applications, and the potential is real. DeepMind's AlphaFold has already transformed protein structure prediction in ways that could accelerate drug development and materials science. Google's AI-powered data centre cooling optimisation has reduced cooling energy consumption by 40%.

The paradox is that the same AI capability that enables these environmental applications requires the infrastructure buildout that is itself a significant environmental problem. The net calculation (AI's environmental benefit versus AI's environmental cost) depends entirely on assumptions about deployment timeline, energy mix, and the counterfactual world in which AI does not exist. What is certain is that the environmental cost of the current AI buildout is being paid now, in the form of increased carbon emissions and water consumption, while the environmental benefits are projected and in most cases not yet realised at scale.
What Accountability Actually Requires
The environmental cost of AI is currently externalised: it is paid by the atmosphere, by water systems, and by communities near data centres, not by the companies or users generating the demand. Making this cost visible and accountable requires specific disclosure: not general sustainability commitments, but per-query and per-model-training energy consumption figures reported consistently and audited independently. The EU AI Act requires environmental impact reporting for high-impact AI systems. The US has no equivalent requirement. The gap between these regulatory approaches means that environmental accountability for the majority of global AI infrastructure depends entirely on voluntary disclosure.

Consumers and enterprises making AI investment decisions in 2026 typically have no information about the carbon footprint of the AI tools they are using. A company that commits to carbon neutrality in its operations and then deploys a high-consumption AI system for every employee has likely increased its total carbon footprint without including that increase in its sustainability reporting. Closing this gap, by requiring AI-related energy consumption to be included in corporate carbon accounting, is the accountability mechanism that most clearly connects the environmental cost to the decision-makers generating it.