Start with the obvious: data centers need firm power, though not literally 24/7, since many run at ~50% capacity factors.
Solar and wind are the fuel; batteries are what actually provide the capacity.
So the assumption is gas & nuclear carry the load.
Big Tech energy portfolios tell a more nuanced story.
→ Amazon & Microsoft: 40+ GW of wind/solar each
→ Google & Meta: ~15 GW of wind/solar each
All four signed nuclear deals as well, though the nuclear numbers are much smaller.
In short: they're sticking to their clean-energy commitments and buying natural gas units as "capacity" for speed to power.
The "100% renewable" claims from NVIDIA, Google, et al?
Mostly RECs — Renewable Energy Credits purchased on paper.
Not physical electrons.
Google & Microsoft are moving toward stricter 24/7 carbon-free matching.
NVIDIA is still on annual accounting.
There's a gap between the headline and the actual electrons used.
Nuclear does produce power 24/7 but won't be run in a way that matches their load.
Restarting Palisades, Three Mile Island, and Duane Arnold doesn't rewrite the story.
SMRs won't meaningfully enter the mix until after 2030
So what actually fills the gap today?
Most people say gas, and new gas turbines are indeed effectively sold out right now.
But we already have ~600 GW of operating natural gas plants, and most of that fleet was built pre-2004.
Those older plants run inefficiently, so they don't dispatch unless the grid desperately needs them.
That changes everything about how you should read "gas powers data centers."
The efficiency gap is staggering:
→ Modern CCGT plants (post-2014): Super efficient <7,000 Btu/kWh
→ Plants from 1999–2013: ~7,500 Btu/kWh
→ Older simple-cycle/steam plants: >10,000 Btu/kWh
That's 30–40% more fuel burned per MWh on the older kit (quick check below).
Normally ~20 GW retires per year; right now it's under 3 GW. That deferral will catch up with everyone around 2030.
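A minimal sanity check on that figure, using the boundary heat rates quoted in the list above (real plants vary unit by unit, so treat this as a sketch):

```python
# Back-of-envelope check on the "30-40% more fuel per MWh" claim,
# using the boundary heat-rate figures from the list above (Btu per kWh).
MODERN_CCGT = 7_000    # post-2014 fleet, "<7,000 Btu/kWh"
MID_VINTAGE = 7_500    # 1999-2013 fleet
OLD_STEAM = 10_000     # simple-cycle / steam, ">10,000 Btu/kWh"

for label, newer in [("vs modern CCGT", MODERN_CCGT),
                     ("vs 1999-2013 fleet", MID_VINTAGE)]:
    extra_pct = (OLD_STEAM - newer) / newer * 100
    print(f"Old steam {label}: +{extra_pct:.0f}% fuel per MWh")
# -> +43% vs modern, +33% vs mid-vintage: roughly the cited 30-40% range.
```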
New gas builds for data centers are run like peakers — running 100–300 hours/year, not prime power.
That means their TWh contribution is actually small.
They're reliability insurance, not bulk energy.
The electrons are still mostly coming from renewables + existing aging NG plants.
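To see how small that peaker TWh contribution is, a rough sketch with an assumed 1 GW plant (only the 100–300 hours/year figure comes from this thread):

```python
# Illustrative: why a peaker-mode gas plant contributes few TWh
# even at meaningful nameplate capacity. Plant size is assumed.
HOURS_PER_YEAR = 8_760

def annual_twh(capacity_gw: float, run_hours: float) -> float:
    """Energy delivered in TWh for a plant at full output for run_hours."""
    return capacity_gw * run_hours / 1_000

peaker_gw = 1.0  # assumed 1 GW of new data-center-adjacent gas
print(f"Peaker @ 300 h/yr: {annual_twh(peaker_gw, 300):.2f} TWh")
print(f"Baseload @ 85% CF: {annual_twh(peaker_gw, 0.85 * HOURS_PER_YEAR):.2f} TWh")
# -> ~0.30 TWh vs ~7.45 TWh: same capacity, roughly 25x less energy.
```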
The EIA is clear on what fills incremental demand:
"primarily increased utilization of existing natural gas plants."
Not new plants.
Existing ones — many of them old, inefficient, already in the dispatch queue.
Run harder, at a steep marginal cost.
Amazon just paid $87M for empty land in Oregon:
→ 1.2 GW solar capacity permitted
→ 7.2 GWh battery storage
→ Adjacent to an existing large datacenter
They didn't pay that to ignore solar. This is the 2026 design language. Very different from 2024.
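Reading between the lines of that spec (the GW and GWh figures are from above; the interpretation is mine):

```python
# Quick read on the Amazon Oregon spec: what 7.2 GWh of storage
# buys against 1.2 GW of solar.
solar_gw = 1.2
battery_gwh = 7.2

duration_h = battery_gwh / solar_gw
print(f"Storage duration at full solar output: {duration_h:.1f} hours")
# -> 6 hours: enough to shift the midday solar peak into the evening,
# which is what a solar + storage block serving a data center needs.
```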
Globally, 2025 was an inflection point:
"100% of all electricity demand growth in 2025 came from solar, wind, and nuclear."
The macro trend and the US near-term story are both true simultaneously.
All incremental TWh are coming from low-carbon sources.
So what's the honest summary?
✅ The aging gas fleet (~300 GW, 22+ yrs old) is doing the heavy lifting, inefficiently
✅ New gas builds are mostly peakers, not prime power
✅ Renewables/nuclear/batteries are ~90% of what's being added globally, including in the USA
✅ Fleet modernization opportunity is enormous and underappreciated
This is early innings.
The least-discussed opportunity in the US energy transition:
Replace 300 GW of inefficient aging gas capacity with modern CCGTs + battery buffers + co-located solar.
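A rough sizing of that prize, assuming a ~30% average capacity factor for the aging fleet (my assumption, not a reported figure) and the boundary heat rates from earlier:

```python
# Rough sizing of the modernization opportunity. The 300 GW figure is
# from the thread; capacity factor and heat rates are assumptions.
AGING_GW = 300
ASSUMED_CF = 0.30               # assumed average utilization of aging fleet
OLD_HR, NEW_HR = 10_000, 7_000  # Btu/kWh, boundary figures from above

energy_twh = AGING_GW * 8_760 * ASSUMED_CF / 1_000
fuel_cut = (OLD_HR - NEW_HR) / OLD_HR
print(f"Energy from aging fleet: ~{energy_twh:.0f} TWh/yr")
print(f"Fuel (and roughly CO2) cut from re-powering: ~{fuel_cut:.0%} per MWh")
# -> ~790 TWh/yr served ~30% more efficiently: a large absolute saving.
```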
The emissions and cost reduction would rival going 100% renewable, and it keeps the lights on reliably. Data, not dogma. 🔋⚡☀️
• • •
.@EpochAIResearch has quietly assembled the most rigorous data set on AI infrastructure in existence. Here's what it tells us about how much compute we need by 2030, how many giant campuses are actually required, and where the real distributed inference opportunity lies. 🧵
The baseline: AI compute stock is growing at 3.4× per year, doubling every 7 months. Training compute for frontier models grows at 5× per year. US AI data center capacity will exceed 50 GW by 2030 — approaching 5% of total US generation capacity.
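Those two growth figures are mutually consistent; a one-line check:

```python
import math

# Sanity check on Epoch's figures: does 3.4x/year growth imply
# "doubling every 7 months"?
growth_per_year = 3.4
doubling_months = 12 * math.log(2) / math.log(growth_per_year)
print(f"Doubling time: {doubling_months:.1f} months")  # -> ~6.8 months, consistent
```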
This is not scary for the grid.
But not all of that 50 GW needs to be concentrated. Final frontier training runs — the kind that require extreme GPU synchrony — represent only ~10% of total R&D compute spend. The other 90% is experiments, fine-tuning, inference, and synthetic data. Distributable. epoch.ai/gradient-updat…
PJM just cried "uncle." They admit capacity prices are up 1,000%+ over two auctions, and that the money goes to existing generators: 3-year contracts don't send a signal to build anything new.
The prescription is pretty bold.
This isn't a market patch; it's a full rethink 🧵
Grid 1.0 was built on one assumption: demand is passive. Show up whenever. Take whatever you need. The grid serves you.
That worked for factories and homes. It doesn't work when a hyperscaler drops 500 MW peak on a node.
PJM is opening the door to changing its "must serve" requirement: figure out how reliable the grid can actually be, and promise only that. States decide how to fill the gaps, and ~$16B returns to them to implement the solution.
Everyone's talking about the AI data center buildout like it's just a money problem. It's not. There are 5 distinct hard constraints stacked on top of each other — and solving one just reveals the next.
Why there is no way the US can unlock 100 GW of AI compute by 2030.
First, the scale of the gap.
The US has ~50 GW of data center capacity online today. Bain and McKinsey both put demand at ~100 GW by 2030.
That means adding ~10 GW per year for 5 years straight. The record year so far? About 2.5 GW actually delivered. We need 4x that. Every year.
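The arithmetic, using the thread's own figures:

```python
# The buildout gap in one calculation (all figures from the thread).
online_gw, target_gw, years = 50, 100, 5
needed_per_year = (target_gw - online_gw) / years
record_year = 2.5  # best year of delivered capacity so far, GW
print(f"Required: {needed_per_year:.0f} GW/yr; record: {record_year} GW/yr; "
      f"multiple: {needed_per_year / record_year:.0f}x")
# -> 10 GW/yr needed, 4x the best year ever delivered.
```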
Constraint #1: Grid interconnection.
Let's assume this gets solved with an interruptible tariff.
Data centers get a "qualified yes" that requires them to curtail 100–200 hours a year. Match that with VPPs and you could see 10% rate decreases across the country. energyempirepodcast.substack.com/p/the-ai-power…
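Those curtailment hours are a tiny slice of the year, which is why the trade can work (hours from above; the framing is a sketch):

```python
# What an interruptible tariff actually costs the data center
# (curtailment hours from the thread).
HOURS_PER_YEAR = 8_760
for curtail_h in (100, 200):
    share = curtail_h / HOURS_PER_YEAR
    print(f"{curtail_h} h/yr curtailment = {share:.1%} of annual uptime")
# -> 1.1-2.3% of hours: small enough to bridge with on-site batteries
# or deferred workloads, which is why the "qualified yes" can pencil out.
```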
Just read a wild JPMorgan note on the current energy + war situation. Here are the spiciest takes 🧵
“US energy independence” is basically a myth.
Even as a net exporter, the US is still getting hit by global price shocks—sometimes worse than Europe.
Strait of Hormuz = ultimate leverage.
Iran may have figured out it can “hold the global economy hostage” cheaply. Potential toll revenues: $70–90B/year.