AI Data Centers Are Using So Much Power, Texas Might Run Out

Texas grid operators reportedly sent AI companies a blunt message in August: stop building data centers, or we can't guarantee power for everyone else.

This isn’t theoretical anymore. AI is breaking electrical grids.

A single NVIDIA DGX H100 server, packing eight of the GPUs used to train large language models, draws up to 10.2 kilowatts. Continuously. Scale that to a full training cluster and you’re burning megawatts 24/7 for months.

OpenAI’s GPT-4 training run reportedly drew around 25 megawatts of continuous power. For context, that’s enough to power roughly 20,000 homes. Google’s data centers now consume about 18 terawatt-hours annually, more electricity than some entire countries use.
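Those figures are easy to sanity-check. Here’s a minimal back-of-envelope sketch; the 10.2 kW per server matches NVIDIA’s published DGX H100 maximum, but the cluster size, run length, and average household draw are illustrative assumptions:

```python
# Back-of-envelope training-run power math.
# The 10.2 kW server draw matches NVIDIA's published DGX H100 max;
# cluster size, run length, and household draw are illustrative assumptions.

SERVER_POWER_KW = 10.2      # one 8-GPU DGX H100 server at max load
NUM_SERVERS = 2_500         # hypothetical 20,000-GPU training cluster
AVG_US_HOME_KW = 1.25       # rough average continuous US household draw

cluster_mw = SERVER_POWER_KW * NUM_SERVERS / 1_000
run_mwh = cluster_mw * 24 * 90                  # a 90-day training run
homes = cluster_mw * 1_000 / AVG_US_HOME_KW

print(f"Cluster draw:     {cluster_mw:.1f} MW")   # ~25.5 MW
print(f"90-day run:       {run_mwh:,.0f} MWh")    # ~55,000 MWh
print(f"Equivalent homes: {homes:,.0f}")          # ~20,000
```

A hypothetical 20,000-GPU cluster lands right around that 25 MW figure, and the homes comparison checks out at a typical household average.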

And it’s accelerating.

I talked to a utilities planner in Virginia. Data center power demand in their region has doubled every 18 months since 2021. “We build capacity as fast as we can. It’s never enough. AI companies show up asking for 500 megawatts like they’re ordering pizza.”
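To feel why that rate terrifies planners, run the compounding forward. A minimal sketch, assuming a hypothetical 1 GW regional baseline in 2021; only the 18-month doubling period comes from the planner’s claim:

```python
# Compounding a hypothetical 1 GW 2021 baseline at the planner's
# stated rate: demand doubles every 18 months.

BASELINE_GW = 1.0
DOUBLING_MONTHS = 18

for year in range(2021, 2031):
    months = (year - 2021) * 12
    demand_gw = BASELINE_GW * 2 ** (months / DOUBLING_MONTHS)
    print(f"{year}: {demand_gw:5.1f} GW")
# By 2030 the baseline has grown 64x. No utility builds that fast.
```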

The problem isn’t just volume—it’s timing. Power grids are designed around predictable demand curves. Morning spike, evening spike, overnight lull. Plan accordingly. AI training doesn’t care about human patterns. It runs flat out, 24/7, year-round.

That constant massive load destabilizes grids designed for variable demand. Grid operators are scrambling to adapt.
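A toy model makes the destabilization concrete. Everything below is an illustrative assumption (the sinusoidal daily curve, the megawatt figures); the point is what a constant load does to headroom and to the overnight recovery window:

```python
# Toy model: add a flat, always-on AI load to a traditional daily
# demand curve. All figures are illustrative assumptions, not grid data.
import math

PEAK_CAPACITY_MW = 1_000    # hypothetical regional generation capacity
AI_FLAT_LOAD_MW = 300       # hypothetical always-on data center load

def legacy_demand_mw(hour: float) -> float:
    """Traditional curve: ~400 MW overnight lull, ~800 MW evening peak."""
    return 600 + 200 * math.sin((hour - 12) * math.pi / 12)

headroom = [PEAK_CAPACITY_MW - (legacy_demand_mw(h) + AI_FLAT_LOAD_MW)
            for h in range(24)]

print(f"Worst-case headroom: {min(headroom):.0f} MW")  # -100 MW: demand
                                                       # exceeds capacity
print(f"Best-case headroom:  {max(headroom):.0f} MW")  # the lull shrinks too
```

In this toy setup, the flat load erases the overnight lull operators use for maintenance and pushes the evening peak past capacity.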

Microsoft announced a deal to restart a nuclear reactor at Three Mile Island to power AI data centers. That’s not a joke. To be precise, it’s Unit 1, the undamaged reactor at the site; Unit 2, the one that partially melted down in 1979, stays shut. Still, the most infamous name in American nuclear power is being recommissioned because AI needs that much electricity.

Amazon’s investing in small modular reactors to site next to data centers. Google’s exploring geothermal. Meta’s buying wind farms. Everyone’s desperate for carbon-free baseload power.

But here’s the ugly truth: most AI data centers draw from grids still dominated by fossil fuels. The renewable projects are, for now, mostly PR. In practice, new AI data centers overwhelmingly lean on natural gas plants because that’s what can come online fast enough.

A climate researcher I interviewed was blunt: “AI companies talk about sustainability while literally reversing decades of emissions reductions. ChatGPT queries have carbon costs. Nobody wants to admit it.”

Every AI interaction uses electricity. Every image generation, every code completion, every chatbot conversation. Multiply by billions of users and you get grid-threatening demand.
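Put rough numbers on it. Published estimates for a single chatbot query span roughly 0.3 to 3 watt-hours; the query volume below is hypothetical, but the scale of the result is the point:

```python
# Rough inference-energy scale. Per-query figures span published
# estimates (~0.3-3 Wh per chatbot query); query volume is hypothetical.

QUERIES_PER_DAY = 1_000_000_000     # hypothetical: one billion queries/day

for wh_per_query in (0.3, 3.0):
    daily_mwh = QUERIES_PER_DAY * wh_per_query / 1_000_000
    continuous_mw = daily_mwh / 24
    print(f"{wh_per_query} Wh/query -> {daily_mwh:,.0f} MWh/day "
          f"(~{continuous_mw:,.0f} MW continuous)")
# 0.3 Wh/query -> 300 MWh/day (~12 MW continuous)
# 3.0 Wh/query -> 3,000 MWh/day (~125 MW continuous)
```

And that’s inference alone, before a single training run.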

The economics are perverse. Power companies love AI data centers—massive reliable revenue. Local communities hate them—strain on infrastructure, environmental impact, minimal jobs created. It’s not like a factory that employs thousands. Data centers need maybe 50 people once operational.

Arizona rejected three proposed AI data centers this year because of water concerns. Yes, water. Data centers use millions of gallons daily for cooling. Arizona’s in a historic drought. The state said no.

Dublin has effectively stopped approving new data centers; Ireland’s grid operator said the capital region can’t handle more load. Companies are looking at Iceland, Norway, Canada, anywhere with cold climates and spare power.

The cooling problem is wild. Modern AI data centers are moving to liquid cooling because air can’t handle the heat density. Racks of H100s run hot enough that traditional air-cooled designs can’t keep up. You need specialized liquid cooling loops, which are expensive and complex.
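The density math is simple and brutal. A sketch with rule-of-thumb figures; only the 10.2 kW server draw is a published spec, the rest are illustrative assumptions:

```python
# Rack power density: why air cooling breaks down for AI hardware.
# The server draw is NVIDIA's published DGX H100 max; the rack
# packing and air-cooling ceiling are illustrative rules of thumb.

SERVER_KW = 10.2              # 8-GPU DGX H100 server at max load
SERVERS_PER_RACK = 4          # hypothetical dense AI rack
AIR_COOLED_LIMIT_KW = 15      # rough practical ceiling for air cooling

rack_kw = SERVER_KW * SERVERS_PER_RACK
print(f"AI rack: {rack_kw:.1f} kW vs ~{AIR_COOLED_LIMIT_KW} kW air-cooled ceiling")
# ~40.8 kW per rack, nearly 3x what air can comfortably remove.
```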

Google has used seawater cooling at coastal facilities; its Hamina, Finland site is the best-known example. It works but introduces corrosion issues, by some accounts cutting equipment lifecycles from roughly 10 years to 5.

A data center engineer told me horror stories: “We had a cooling system failure in our AI cluster. The room hit 140°F in four minutes. $50 million in hardware destroyed. The heat density is insane.”

The infrastructure demands are absurd. Cities that want AI data centers need: massive power substations, water treatment facilities, fiber backbone connectivity, and often new transmission lines. We’re talking billions in infrastructure for one building.

And AI companies expect cities to subsidize this. “Economic development,” they call it. Cities desperate for tax revenue agree, then realize they’re paying for grid upgrades to power machines that employ almost nobody.

The bottleneck is becoming real. NVIDIA can’t make chips fast enough. Even if they could, there isn’t enough power to run them all. Power is now the limiting factor in AI scaling.

Some researchers argue we’ve hit peak AI scaling. Not because the models can’t get bigger—because we literally can’t power bigger training runs without building new power plants.

That takes 5-10 years. AI companies want to scale now.

The industry’s favored fix: nuclear. Microsoft, Google, and Amazon are all pursuing nuclear partnerships. Small modular reactors, old plants recommissioned, anything for carbon-free baseload.

But nuclear takes years to deploy. In the meantime, AI is burning gas and straining grids.

The pessimistic take: we’ll hit a wall where power constraints limit AI progress. Training bigger models becomes physically impossible without decades of infrastructure build-out.

The optimistic take: this forces AI efficiency. Models that do more with less power. Better algorithms, not just bigger computers.

Either way, the age of unlimited AI scaling is over. Physics has entered the chat.