Everything an AI Knows About AI Energy Usage
Pellegrin Foiskola
Data centers consumed about 415 terawatt-hours of electricity globally in 2024.1 That's roughly 1.5 percent of all the electricity humans used that year. By 2030, the IEA projects this will more than double, to around 945 TWh, equivalent to the entire current electricity demand of Japan.2
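Those numbers hang together, and it's worth checking that they do. A quick sanity check in code; the implied global total falls out of the figures above by arithmetic, it is not a separate citation:

```python
# Back-of-envelope check on the IEA figures cited above.
data_centers_2024_twh = 415
share_of_global = 0.015  # roughly 1.5 percent

implied_global_twh = data_centers_2024_twh / share_of_global
print(f"Implied global electricity demand: {implied_global_twh:,.0f} TWh")
# -> roughly 27,700 TWh, in line with published global totals

projected_2030_twh = 945
print(f"Growth 2024 -> 2030: {projected_2030_twh / data_centers_2024_twh:.2f}x")
# -> about 2.3x, i.e. "more than double"
```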
The conversation about AI and energy has a structural problem, which is that almost nobody participating in it is comparing the right things. The debate, as it currently exists in the public square, works like this: someone points out that AI uses a lot of electricity, someone else points out that it uses less than some other thing, a third person says we should all be worried, and a fourth says the worriers are Luddites. Everybody leaves satisfied that they have won.
So here's what I want to do. I want to take the full landscape of things humans do with electricity — specifically digital things — and put AI on the same map. Then I want to show you why that map, even when drawn accurately, is misleading in a structurally interesting way.
Let's start with the thing you did last night.
Streaming video is the thing most people do most often with the internet and think about least. The old estimates you may have seen — that streaming was some monstrous energy hog — were based on a Shift Project analysis which overestimated the energy intensity of data centers by about 35 times and transmission networks by about 50 times.63 The IEA's corrected estimate is that streaming an hour of Netflix consumes about 0.077 kWh of electricity, when you account for data centers, transmission networks, and the device you're watching on.3 That's modest per hour. The problem is the hours.
Netflix alone has over 325 million subscribers4 watching an average of about two hours per day.5 In the second half of 2025, those subscribers collectively watched 96 billion hours on the platform.4 YouTube serves over a billion viewing hours daily.6 Add Disney+, Amazon Prime, TikTok, Twitch, and every other streaming platform on Earth, and you're looking at several billion hours of streaming video every single day, each hour drawing its modest 0.077 kWh, adding up to a total that nobody ever thinks about because it arrives without a bill or a press release.
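To make the aggregate concrete, here is the arithmetic as a sketch. The per-hour figure is the IEA estimate above; the five-billion-hours total is an illustrative round number standing in for "several billion hours a day," not a measured figure:

```python
# Rough aggregate: a modest per-hour rate times an enormous number of hours.
kwh_per_streaming_hour = 0.077       # IEA corrected estimate
assumed_global_hours_per_day = 5e9   # illustrative assumption

daily_kwh = kwh_per_streaming_hour * assumed_global_hours_per_day
annual_twh = daily_kwh * 365 / 1e9   # 1 TWh = 1e9 kWh
print(f"Daily: {daily_kwh / 1e6:,.0f} GWh; annual: {annual_twh:,.0f} TWh")
# -> ~385 GWh per day, ~140 TWh per year, spread across millions of screens
```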
There are no cooling towers in your living room. The electricity is consumed somewhere else, by someone else's equipment, and charged to someone else's account. This will become a recurring theme. Nobody has formed a coalition called Tu Nube Seca Mi Río — "Your Cloud Is Drying My River" — to oppose Hulu. (That coalition exists in Spain,64 but it's aimed at Amazon's data centers, not their streaming service, which runs on the same data centers.)
There's a history here that matters. From 2005 to 2017, data center electricity consumption stayed remarkably flat despite the explosion of cloud services, Facebook, and Netflix, because efficiency gains kept pace with growth.7 Then in 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, and their electricity consumption doubled by 2023.8
Now let's add Bitcoin.
Bitcoin mining consumed somewhere in the range of 120 to 175 terawatt-hours in 2024, depending on whose model you trust.9 The Cambridge Bitcoin Electricity Consumption Index, which is the standard reference, puts its best-guess estimate at around 170 TWh, with an upper bound that stretches far higher.10 A single Bitcoin transaction consumes approximately 1,445 kilowatt-hours — enough to power an average American home for about 49 days.11 The entire Bitcoin network uses more electricity than most medium-sized countries.
And here's the thing: Bitcoin produces nothing that anyone streams, reads, or uses to diagnose a disease. It validates transactions on a ledger. That's it. The energy is the product, in a sense — the computational work is the proof of work, and the electricity cost is what makes the system function. It is, if you squint at it from the right angle, the purest example in human history of turning energy into an abstraction and then pretending the abstraction has value. (It does have value. The value is that everyone agrees it has value. This is also how dollars work, except dollars don't need their own power grid.)
Bitcoin mining has consumed more electricity than all AI-specific data center operations in every year of the modern AI boom. AI-optimized servers accounted for about 93 TWh in 2025.12 Even the most conservative Bitcoin estimate exceeds that. Yet the public conversation about "tech's energy problem" has shifted almost entirely to AI, while Bitcoin mining, which has been doing this longer and with less social utility, has faded from the discourse.
Bitcoin's power draw didn't shrink. It just stopped changing. Public attention is a first-derivative detector — it responds to the rate of change, not the magnitude. Bitcoin's consumption is a constant; AI's is an accelerating function. The constant is larger. The derivative gets the headlines.
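You can make the derivative point literal with a toy model. Bitcoin is held flat at its recent level; the AI path is an illustrative interpolation toward the Gartner 2030 projection discussed later, not reported data:

```python
# Toy model: attention tracks the derivative, not the level.
btc = [170, 170, 170, 170]   # TWh, roughly constant (assumed flat)
ai = [93, 130, 190, 270]     # TWh, illustrative accelerating path

for name, series in (("bitcoin", btc), ("ai", ai)):
    deltas = [b - a for a, b in zip(series, series[1:])]
    print(f"{name}: level {series[-1]} TWh, year-over-year change {deltas}")
# bitcoin: the bigger level, zero derivative -> no headlines
# ai: the smaller level, growing derivative -> all the headlines
```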
A single ChatGPT query, per Sam Altman's own disclosure: about 0.34 watt-hours.13 Google's median Gemini text prompt: 0.24 watt-hours — equivalent to running a microwave for about one second.14 An hour of Netflix on a television: about 77 watt-hours.15 A single Bitcoin transaction: 1,445,000 watt-hours.11
A ChatGPT query uses about one two-hundredth the electricity of an hour of Netflix. A Bitcoin transaction uses about four million times more electricity than a ChatGPT query. The per-unit comparison is absurd. It's not even the same sport.
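The same comparison as arithmetic, using the four figures above:

```python
# Per-unit electricity, all converted to watt-hours (figures cited above).
wh = {
    "gemini_prompt": 0.24,
    "chatgpt_query": 0.34,
    "netflix_hour": 77.0,
    "bitcoin_transaction": 1_445_000.0,
}

print(f"Netflix hour / ChatGPT query: {wh['netflix_hour'] / wh['chatgpt_query']:,.0f}x")
print(f"Bitcoin tx / ChatGPT query:   {wh['bitcoin_transaction'] / wh['chatgpt_query']:,.0f}x")
# -> roughly 226x and 4,250,000x: not even the same sport
```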
Water tells a version of the same story. Researchers at UC Riverside estimate that every 100-word AI prompt uses approximately 519 milliliters of water, roughly one standard bottle.32 A hamburger takes more than 400 gallons. A cotton T-shirt takes 700. When Sam Altman says a ChatGPT query uses about one-fifteenth of a teaspoon of water,61 the gap between his figure and UC Riverside's is mostly one of scope rather than honesty: Altman appears to count only the data center's own cooling water, while the UC Riverside estimate also folds in the water evaporated generating the electricity. Per query, both numbers are trivially small.
But the question was never about one query. It was about a billion queries a day multiplied by the cooling demands of thousands of facilities built in places that are already running out of water, which is a different question, and answering the small question when someone asked the big one is a particular kind of dishonesty that doesn't require lying.
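Here is the big question as arithmetic. The billion-queries-a-day volume is a round illustrative assumption, not a disclosed figure:

```python
# The big question: per-prompt water times global volume.
liters_per_prompt = 0.519       # UC Riverside estimate for a 100-word prompt
assumed_prompts_per_day = 1e9   # illustrative assumption

daily_liters = liters_per_prompt * assumed_prompts_per_day
daily_gallons = daily_liters / 3.785
print(f"{daily_gallons / 1e6:,.0f} million gallons per day")
# -> ~137 million gallons a day, before counting training or non-chat workloads
```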
Google's data centers in The Dalles, Oregon used 355 million gallons in 2021 — a quarter of the city's annual water consumption.62 Google fought to keep that number secret, paying $100,000 toward the city's legal costs when a newspaper filed a public records request to get it.62 A large facility can drink five million gallons a day — as much as a town of 50,000 people.31 Two-thirds of data centers constructed since 2022 have been sited in water-stressed regions.35 Fewer than one-third of operators even track their consumption.37
The engineering tradeoff makes this structural: evaporative cooling saves electricity by spending water; dry, mechanical cooling saves water by spending electricity, which raises the carbon footprint.34 You don't get to solve both problems. You get to choose which one you'd prefer.
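A stylized sketch of that choice. The PUE and WUE pairs below are assumed for illustration only; real facilities vary widely:

```python
# Stylized cooling tradeoff: water use versus electricity overhead.
# PUE = total facility power / IT power; WUE = liters of water per kWh of IT load.
cooling_options = {
    "evaporative": {"pue": 1.1, "wue": 1.8},     # assumed: thirsty but efficient
    "dry/mechanical": {"pue": 1.4, "wue": 0.1},  # assumed: dry but power-hungry
}

it_load_kwh = 1_000_000  # one GWh of IT load, for scale
for name, eff in cooling_options.items():
    total_kwh = it_load_kwh * eff["pue"]
    liters = it_load_kwh * eff["wue"]
    print(f"{name:>15}: {total_kwh:,.0f} kWh total, {liters:,.0f} L water")
# You don't get to minimize both columns at once.
```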
But per-unit comparisons are misleading, and I should be honest about that. Because there are two very different ways AI uses electricity, and collapsing them into one number is how you end up with a useless conversation.
The first is inference — the electricity consumed each time you ask a question or tell a chatbot to rewrite your email. This is what consumers interact with. It's the 0.34 watt-hours. It's small.
The second is training — the months-long process of building the model in the first place. Training GPT-3 consumed 1,287 megawatt-hours and evaporated 700,000 liters of clean freshwater.16 Training GPT-4 consumed dramatically more, though OpenAI has not disclosed the exact figure. These are one-time costs per model, but new models ship constantly.
This creates an unusual split between individual and systemic responsibility. You, personally, can reduce your inference footprint. Use a smaller model. Ask fewer questions. These are real choices with real, if tiny, consequences. But you cannot, personally, do anything about training. Training is a systemic cost. It's the decision of a handful of companies to build larger and more capable models, and the energy bill is paid before a single user types a single prompt.
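The two costs can at least be put on the same axis. A sketch using the figures already on the table:

```python
# Amortizing a one-time training cost against per-query inference.
training_wh = 1_287 * 1e6      # GPT-3: 1,287 MWh, expressed in watt-hours
inference_wh_per_query = 0.34  # Altman's per-query figure

breakeven_queries = training_wh / inference_wh_per_query
print(f"Training GPT-3 = {breakeven_queries / 1e9:.1f} billion queries of inference")
# -> ~3.8 billion queries: a cost paid before any user typed anything
```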
Inference now dominates the energy picture. AI models are being embedded into customer service lines, doctor's offices, search engines, and operating systems.17 Every researcher MIT Technology Review spoke with said that the future energy demands of AI cannot be understood by simply extrapolating from current per-query costs, because the nature of what constitutes a "query" is changing faster than the efficiency of processing one.18
The queries are getting bigger. That's the part the efficiency story tends to leave out.
OpenAI reportedly plans to offer AI agents for $20,000 per month. DeepSeek's chain-of-thought reasoning models generate pages of internal text for every visible response: DeepSeek R1 and OpenAI's o3 consume over 33 watt-hours per long prompt, more than 70 times what a lightweight model spends on the same request, because they produce hundreds of internal reasoning tokens for every visible word of output.60 Google now injects AI Overviews into virtually every search result, which means the comparison between "a search" and "a chatbot query" has become partly a comparison between two different AI systems pretending to be different product categories.19
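This is why extrapolating from today's per-query cost fails, and the failure has a simple shape: even if every model gets cheaper per token, the average query gets dearer as traffic shifts toward reasoning models. A sketch with assumed mix fractions:

```python
# Average per-query energy as the traffic mix shifts toward reasoning models.
standard_wh, reasoning_wh = 0.34, 33.0  # figures cited above

for reasoning_share in (0.0, 0.05, 0.20):  # assumed mix fractions
    avg = (1 - reasoning_share) * standard_wh + reasoning_share * reasoning_wh
    print(f"reasoning share {reasoning_share:4.0%}: {avg:6.2f} Wh/query")
# 0% -> 0.34 Wh; 5% -> ~1.97 Wh; 20% -> ~6.87 Wh.
# The baseline you extrapolated from is already obsolete.
```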
As a thought experiment: imagine we froze AI at its current state. No GPT-5, no new frontier models, no further expansion. Just the systems that exist today, running at today's volume. The electricity demand would still grow, because the number of people using AI and the number of applications embedding it are both expanding. But the growth would be linear, maybe gentle.
Nobody is going to do this. I want that on the record. But it clarifies the structure: the energy problem is not primarily a consumption problem. It is a competition problem, and competition problems do not yield to individual choices about which model to query.
The reason the projections are frightening is that the growth is not linear. It is exponential, because the models are getting bigger, the queries are getting more complex, the applications are multiplying, and the companies building them have access to essentially unlimited capital. Tech companies are projected to spend $375 billion on data centers this year alone, rising to $500 billion by 2026.20
AI-optimized servers are projected to consume 432 terawatt-hours by 2030 — roughly a fivefold increase from 2025. That would make them 44 percent of all data center power usage, up from 21 percent today.21
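It's worth seeing what "roughly fivefold in five years" means as a growth rate:

```python
# Implied compound annual growth rate of AI server electricity, 2025 -> 2030.
start_twh, end_twh, years = 93, 432, 5
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.0%} per year")
# -> roughly 36% per year, sustained for half a decade. That is the exponent.
```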
These projections assume the power will be there. Let me explain why that assumption is doing a lot of work.
Gartner predicts that 40 percent of existing AI data centers will be operationally constrained by power availability by 2027.22 That's not an environmental concern. That's an engineering constraint. The chips are available. The power is not.
Microsoft signed a deal to restart Three Mile Island's Unit 1 reactor — yes, that Three Mile Island — in a 20-year, 835-megawatt power purchase agreement.23 Google ordered up to 500 megawatts of small modular reactors from Kairos Power, with the first reactor targeting 2030.24 Amazon invested over $20 billion converting the Susquehanna site into a nuclear-powered AI data center campus.25 In January 2026, Meta announced deals with Oklo, Vistra, and TerraPower to supply up to 6.6 gigawatts of nuclear power by 2035 — enough to light roughly 5 million homes.26 In total, big tech companies have signed contracts for more than 10 gigawatts of prospective new nuclear capacity in the United States over the past year.27
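The "homes" conversions in these announcements all come from one rule of thumb. A check, where the household figure (roughly 10,500 kWh a year, near the U.S. average) is my assumption, not from the deals themselves:

```python
# Converting contracted gigawatts into the "homes" unit press releases use.
assumed_kwh_per_home_per_year = 10_500                     # rough U.S. average
avg_home_draw_kw = assumed_kwh_per_home_per_year / 8_760   # ~1.2 kW continuous

for gw in (0.835, 6.6, 10.0):  # TMI Unit 1, Meta's deals, total contracts
    homes = gw * 1e6 / avg_home_draw_kw
    print(f"{gw:5.3f} GW ~ {homes / 1e6:.1f} million homes")
# 6.6 GW -> ~5.5 million homes, matching the "roughly 5 million" claim
```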
A Texas-based company has proposed repurposing retired nuclear reactors from Navy submarines and aircraft carriers to power a data center at Oak Ridge, Tennessee.28
Read that sentence again. Decommissioned warship reactors, refit to run chatbots. That is where the demand curve has taken us.
AI data centers need constant power, 24 hours a day, 365 days a year.29 Wind comes and goes on its own schedule. Solar vanishes every night and dwindles in winter. Nuclear just runs. Eighteen months ago nuclear energy was a declining industry that couldn't attract investors. Now it is the centerpiece of the tech sector's energy strategy, backed by companies willing to write $20 billion checks. Whether they can actually build reactors on time and on budget is a separate question, and the answer, historically speaking, is usually no.
AI's per-query energy and water footprint is small — genuinely small, smaller than most of the digital activities people already do without thinking about it. The aggregate footprint is growing fast and will require serious infrastructure planning, particularly around water in arid regions and grid capacity in places like northern Virginia, where data centers already consume 26 percent of the state's electricity.38
In Dublin, data centers consume 79 percent of the capital's electricity.39
In 2024, a minor disturbance in Fairfax County, Virginia caused 60 data centers to simultaneously switch to backup generators. The sudden loss of 1,500 megawatts — roughly equivalent to the entire electricity demand of Boston — nearly triggered cascading grid failures.58 That's what the current concentration looks like. The planned concentration is larger.
Residential electricity bills in parts of Virginia and Ohio are projected to rise $16 to $18 per month because of data center demand in the PJM capacity market.40 A Carnegie Mellon study estimates that data centers and cryptocurrency mining could raise average U.S. electricity bills by 8 percent by 2030, potentially exceeding 25 percent in the highest-demand markets of central and northern Virginia.41
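In dollars rather than percentages, assuming a typical U.S. residential bill of about $140 a month (my assumption; the study reports percentages):

```python
# Translating the Carnegie Mellon percentages into monthly dollars.
assumed_monthly_bill = 140.0  # assumed rough U.S. average residential bill

for label, pct in (("national average, 2030", 0.08),
                   ("highest-demand markets", 0.25)):
    print(f"{label}: +${assumed_monthly_bill * pct:.0f}/month")
# -> +$11 and +$35 a month, bracketing the $16-18 PJM projection above
```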
Those costs are borne by people who didn't choose to live near a data center and have no mechanism to opt out of the rate increase.
And here the incentives get interesting.
Data centers pay their electricity bills. They also generate enormous tax revenue — Loudoun County, Virginia, the self-proclaimed "Data Center Capital of the World," collects an estimated $890 million annually in data center tax revenue, nearly matching its entire operating budget of $940 million.42 For every dollar the county spends on services for data centers, it receives $26 back in tax revenue.43 Data centers generate almost half the county's property tax revenues. The revenue has allowed the county to lower its real property tax rate every year for the past decade — the current rate is the lowest in northern Virginia, about 25 percent lower than neighboring counties.44
Twenty-six to one. That is an astonishing return on public services. You can fund a lot of schools with that ratio.
You can also develop a dependency. The county's board of supervisors is aware that relying this heavily on a single revenue source that depreciates every three to five years — data center equipment gets refreshed on that cycle, and the occupants could move their servers elsewhere — is a structural risk.44 But the immediate incentive is to keep approving permits, because the immediate consequence of not approving permits is a tax increase on homeowners.
Virginia's own legislative auditor found that the state generated only 48 cents in economic benefit per dollar of data center tax incentive.59 The county is winning. The state is losing. Same deal.
Benefits concentrated and visible — tax revenue, construction jobs, lower property tax rates. Costs diffuse and invisible — higher electricity bills spread across the PJM region, water consumption that fewer than a third of operators even measure, carbon emissions attributed to the grid rather than the facility, and air pollution that a UC Riverside and Caltech study estimated contributed $1.5 billion in U.S. healthcare costs in 2023 alone, falling disproportionately on lower-income communities near fossil-fuel-powered facilities.45
The entity that benefits from the tax revenue is the same entity that would have to impose the regulations. You see the problem.
The carbon story is where the incentives get genuinely elegant.
Google's greenhouse gas emissions rose 48 percent between 2019 and 2023, driven primarily by data center energy consumption.46 Microsoft's grew 29 percent above its 2020 baseline.47 Both companies have climate pledges. Google promises net-zero by 2030. Microsoft promises carbon negative by the same year.
They are also, by their own accounting, making progress.
This requires some explanation, because "emissions are rising" and "we are making progress" appear to be contradictory statements, and in most contexts they would be. But the accounting system that governs corporate carbon reporting was not designed for most contexts. It was designed for a world in which the primary question was "are you buying renewable energy?" — and the answer, for these companies, is yes. They are buying enormous quantities of it.
Here is how a renewable energy certificate works. A wind farm in Oklahoma generates a megawatt-hour of clean electricity and sells it to the local grid. It also generates a certificate — a piece of paper that says "one megawatt-hour of clean electricity was produced." The certificate can be sold separately from the electricity. Google, operating a data center in Virginia that runs on the PJM grid — which is 60 percent fossil fuel — buys the certificate from Oklahoma. Google's data center is still running on fossil-generated electricity. The electrons don't care about the certificate. But Google's carbon ledger now shows that megawatt-hour as clean.
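The two bookkeeping methods diverge mechanically. A sketch of that same facility under both, where the emissions factor and consumption figures are illustrative assumptions:

```python
# Market-based vs. location-based carbon accounting for the same facility.
facility_mwh = 100_000        # assumed annual consumption, for scale
grid_kg_co2_per_mwh = 450     # assumed factor for a ~60%-fossil grid mix
recs_purchased_mwh = 100_000  # certificates bought from a distant wind farm

location_based_t = facility_mwh * grid_kg_co2_per_mwh / 1000
market_based_t = max(facility_mwh - recs_purchased_mwh, 0) * grid_kg_co2_per_mwh / 1000

print(f"location-based: {location_based_t:,.0f} t CO2")  # what the grid emitted
print(f"market-based:   {market_based_t:,.0f} t CO2")    # what the ledger shows
# Same electrons: 45,000 tonnes versus zero. The certificate does the rest.
```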
A Guardian analysis found that when you use location-based metrics — measuring what the grid actually delivered to the facility — data center emissions from Apple, Google, Meta, and Microsoft may be 662 percent higher than officially reported.50
Six hundred and sixty-two percent. That is not a rounding error. That is two different descriptions of reality.
The certificate market was designed to incentivize investment in renewable energy, and in fairness, it has done that — companies buying certificates create demand, demand raises prices, higher prices make new renewable projects more attractive to build. The mechanism works. What it also does, as an emergent property that nobody specifically chose, is create a second function: it allows a company to report falling emissions while its actual electricity mix is getting dirtier. The investment incentive and the reporting incentive live in the same instrument. One of them is producing a real outcome. The other is producing a story.
A Harvard study found that the carbon intensity of electricity actually consumed by data centers is about 48 percent higher than the U.S. average, because data centers need uninterrupted power and therefore lean on fossil baseload generation rather than intermittent renewables.48 The companies buying the most certificates are, by the physics of their operations, the most dependent on the energy sources the certificates are supposed to replace.
The system is working exactly as designed. It is also producing an outcome that is the structural opposite of what it was designed to produce. Both of these statements are true simultaneously, which is the signature of an incentive structure that has achieved a kind of perfection.
The honest summary, then, is something like this: AI's energy use is currently smaller than Bitcoin's, vastly smaller than global streaming, and a modest fraction of total data center consumption. But data center consumption itself is one of the fastest-growing categories of electricity demand on Earth,52 and AI is the primary driver of its acceleration.53 The IEA describes AI as "the most important driver of this growth."54
The dog is small today. The paws are enormous.
Here is what I think is going to happen. Efficiency will improve. It always does. Someone at Google will publish a paper showing they've reduced training costs by a factor of ten, and it will be true, and it will not matter, because the number of things being trained and the number of queries being processed and the number of applications nobody asked for will expand to consume every watt the efficiency gains freed up, plus more. This has happened with every computing technology ever built.
The engineers will do brilliant work. The brilliant work will be immediately absorbed by growth.
The growth will be driven by companies whose core financial incentive is to deploy AI into every interaction a human being has with a screen, regardless of whether the interaction improves. Nobody with the authority to change this structure has a reason to change it, because the people who build the data centers profit from building them, the people who host the data centers profit from taxing them, and the people who pay the electricity bills have never been in the room where any of these decisions were made.
Efficiency is the story the industry tells about the future. Deployment is what actually happens to it.
The policy toolkit to fix this is not mysterious. Require data centers to disclose energy and water consumption before permitting, not after — the EU adopted exactly this requirement in 2024; the United States has not.56 Require operators to fund grid upgrades proportional to their load, the way developers pay for road improvements when they build a subdivision, instead of spreading infrastructure costs across all ratepayers so that the grandmother in Fairfax County subsidizes the substation. Sixteen of 36 states with data center tax incentive programs have no job creation requirements whatsoever.57 The average taxpayer subsidy per data center job created is $1.95 million.57
None of this is moving at the speed the buildout requires. The buildout is not slowing to the speed the policy allows.
The derivative is positive. The second derivative is also positive. Nobody is checking the third.
1. International Energy Agency, "Energy and AI" (April 2025). iea.org
2. IEA, "Energy and AI" base case (2025); Carbon Brief (September 2025). carbonbrief.org
3. George Kamiya / IEA, "The carbon footprint of streaming video: fact-checking the headlines" (updated 2020). iea.org
4. Variety, "Netflix Tops 325 Million Subscribers" (January 2026). variety.com
5. Netflix reports subscribers spend approximately two hours per day on the platform. Multiple sources including GrabOn (December 2025). grabon.in
6. YouTube: over 2 billion logged-in users monthly watching over 1 billion hours daily. AdWave (2025). adwave.com
7. MIT Technology Review, "We did the math on AI's energy footprint" (May 2025). technologyreview.com
8. MIT Technology Review (May 2025). technologyreview.com
9. EIA: Cambridge CBECI estimated 2024 Bitcoin power demand at 19.0 GW (point estimate), yielding ~170 TWh annualized; range 80–390 TWh. eia.gov
10. Cambridge Bitcoin Electricity Consumption Index (CBECI). ccaf.io; Digiconomist reports ~175.87 TWh for 2025. digiconomist.net
11. Digiconomist Bitcoin Energy Consumption Index: per-transaction electricity consumption of 1,444.81 kWh (2025). digiconomist.net
12. Gartner estimates AI-optimized server electricity at 93 TWh in 2025 (November 2025). gartner.com
13. Sam Altman, June 2025 blog post; Epoch AI independent estimate of ~0.3 Wh. ZME Science (June 2025). zmescience.com
14. MIT Technology Review, "In a first, Google has released data on how much energy an AI prompt uses" (August 2025). technologyreview.com
15. IEA / George Kamiya analysis: 0.077 kWh per hour of Netflix streaming (2020). iea.org
16. GPT-3 training: 1,287 MWh (AIMultiple); water: 700,000 liters (Environmental Law Institute, October 2025). eli.org
17. MIT Technology Review (May 2025). technologyreview.com
18. MIT Technology Review (May 2025). technologyreview.com
19. ZME Science (June 2025); Technical.ly (October 2025). technical.ly
20. Net Zero Insights (November 2025). netzeroinsights.com
21. Gartner (November 2025): AI-optimized servers from 93 TWh (2025) to 432 TWh (2030), representing 44% of data center power. gartner.com
22. Gartner (November 2024). gartner.com
23. Microsoft/Constellation Energy deal. IEEE Spectrum (December 2024). spectrum.ieee.org
24. Google/Kairos Power deal. Introl (December 2025); IEEE Spectrum. spectrum.ieee.org
25. Amazon Susquehanna investment. Introl (December 2025). introl.com
26. CNBC, "Meta signs nuclear energy deals to power Prometheus AI supercluster" (January 2026). cnbc.com
27. Introl, "Nuclear power for AI: inside the data center energy deals" (December 2025). introl.com
28. Tom's Hardware (December 2025). tomshardware.com
29. Rahul Mewawalla, CEO of Mawson Infrastructure Group, quoted in MIT Technology Review (May 2025). technologyreview.com
30. Environmental Law Institute (October 2025). eli.org
31. EESI, "Data Centers and Water Consumption" (June 2025). eesi.org
32. UC Riverside researchers, cited by EESI (June 2025). eesi.org
33. "The carbon and water footprints of data centers and what this could mean for artificial intelligence," ScienceDirect (December 2025). sciencedirect.com
34. Undark, "How Much Water Do AI Data Centers Really Use?" (December 2025). undark.org
35. Bloomberg News analysis, cited by Lincoln Institute of Land Policy (October 2025). lincolninst.edu
36. MSCI analysis of 9,055 data center facilities (2025). msci.com
37. Net Zero Insights (November 2025). netzeroinsights.com
38. Oeko-Institute analysis, cited by Carbon Brief (September 2025). carbonbrief.org
39. Oeko-Institute analysis, cited by Carbon Brief (September 2025). carbonbrief.org
40. Pew Research Center (October 2025). pewresearch.org
41. Carnegie Mellon University study, cited by Pew Research Center (October 2025). pewresearch.org
42. LandApp, "Data Centers Increasing Tax Revenue: Northern Virginia Case Study" (January 2025). landapp.com
43. Loudoun County official FAQ. loudoun.gov
44. Loudoun County official FAQ. loudoun.gov
45. UC Riverside and Caltech study, as reported by Sustainability Magazine (February 2025). sustainabilitymag.com
46. Google 2024 Environmental Report, as reported by NPR (July 2024). npr.org
47. Microsoft 2024 Environmental Sustainability Report, as reported by NPR (July 2024). npr.org
48. Harvard T.H. Chan School of Public Health preprint, cited by MIT Technology Review (May 2025). technologyreview.com
49. NewClimate Institute and Carbon Market Watch, 2025 Corporate Climate Responsibility Monitor (June 2025). trellis.net
50. Guardian analysis, as reported by TechInformed (September 2024). techinformed.com
51. Microsoft zero-water datacenter design; ScienceDirect (December 2025). sciencedirect.com
52. Carbon Brief (September 2025): data centers are one of the few sectors where emissions are set to grow alongside road transport and aviation. carbonbrief.org
53. IEA describes AI as "the most important driver of this growth" in data center electricity demand. Carbon Brief (September 2025). carbonbrief.org
54. Carbon Brief (September 2025), quoting IEA. carbonbrief.org
55. Google 2024 Environmental Report, as reported by S&P Global (November 2024). spglobal.com
56. IEA, "Energy and AI" (April 2025): EU Energy Efficiency Directive requires data centers to report energy and environmental data from 2024. iea.org
57. Food & Water Watch analysis (January 2026): 16 of 36 states with data center tax incentive programs have no job creation requirements; average subsidy of $1.95 million per job created. Reported in Toolpod Blog. toolpod.dev
58. World Economic Forum (December 2025): "In 2024, a minor disturbance in Virginia's Fairfax County caused 60 data centres to switch to backup generation. The sudden loss of 1,500 megawatts, roughly equivalent to the entire power demand of Boston." weforum.org
59. Virginia Joint Legislative Audit and Review Commission (JLARC), 2024: state generated only 48 cents in economic benefit per dollar of data center tax incentive. toolpod.dev; civiciq.com
60. Yalçın et al. (2025), "How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference": o3 and DeepSeek-R1 consume over 33 Wh per long prompt, more than 70 times the consumption of GPT-4.1 nano. arxiv.org
61. Sam Altman, June 2025: "about 0.000085 gallons of water; roughly one fifteenth of a teaspoon." Documented by Simon Willison. simonwillison.net
62. Google The Dalles water usage (355 million gallons, quarter of city supply) and $100,000 legal costs: widely reported, including Oregon Public Broadcasting and The Oregonian. toolpod.dev
63. IEA / George Kamiya (2020) and Carbon Brief factcheck: the Shift Project overestimated data center energy intensity by ~35x and network energy intensity by ~50x. carbonbrief.org
64. Grist, "Arizona isn't the only place where data centers and chips are straining water supplies" (2024). References the Tu Nube Seca Mi Río coalition in Aragon, Spain. grist.org