Infrastructure
Electricity is the New Cloud
Data center electricity demand, driven by AI workloads, will exceed 1,000 TWh by 2028. Power is becoming the new cloud constraint.
The IEA projects global data center electricity demand will exceed 1,000 TWh by 2028, roughly matching Japan's total consumption. Data center power demand is growing 20–25% annually. Unlike cloud compute — which can be provisioned in minutes — electricity requires physical infrastructure that takes years to build. Power purchase agreements (PPAs), grid interconnection, and utility relationships are becoming the new competitive moat.
What changed
Cloud compute taught companies that infrastructure could be elastic and on-demand. Electricity doesn't work that way. Grid capacity is finite, interconnection queues are years long, and power costs are increasingly location-dependent. The companies that built the most efficient AI training clusters are now discovering that their competitive advantage depends on access to cheap, reliable power — not better GPUs. Nuclear, geothermal, and long-term PPAs are becoming strategic investments, not sustainability initiatives.
What leaders should do
Treat power as a strategic asset class. Evaluate every data center location by power cost trajectory, grid reliability, interconnection timeline, and renewable availability. Build a 5-year power cost model per region. Explore on-site generation options (solar + battery, small modular reactors) for high-demand facilities. Negotiate PPAs with 10+ year horizons — short-term power contracts will not survive AI's demand curve.
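The "5-year power cost model per region" can start as a simple compound-escalation projection. A minimal sketch, assuming illustrative base rates and escalation figures (the regions and numbers below are placeholders, not market data):

```python
# A minimal sketch of a 5-year regional power cost model.
# All rates and escalation figures are illustrative placeholders.

def project_power_cost(base_rate_usd_mwh, annual_escalation, years=5):
    """Project a region's power cost trajectory with compound annual escalation."""
    return [round(base_rate_usd_mwh * (1 + annual_escalation) ** y, 2)
            for y in range(years + 1)]

# Hypothetical inputs: a grid-constrained region vs. a cheaper, looser one.
regions = {
    "region_constrained": {"base": 72.0, "escalation": 0.08},
    "region_flexible":    {"base": 48.0, "escalation": 0.05},
}

for name, r in regions.items():
    print(name, project_power_cost(r["base"], r["escalation"]))
```

A real model would layer in PPA hedges, demand charges, and congestion pricing; the point is to compare trajectories, not spot rates.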
What ZOAK wants to build
A power-as-strategy planning tool for AI companies: it models power cost trajectories by region, scores grid reliability and interconnection timelines, evaluates on-site generation options, and produces a 5-year infrastructure capital plan. The output is a power strategy, not an energy dashboard.
Operating analysis
The IEA's data is stark: data center electricity consumption roughly doubled between 2020 and 2025, and will double again by 2028. In the U.S., Virginia's "Data Center Alley" already strains regional grid capacity. New hyperscale facilities in Texas, Georgia, and Ohio face interconnection delays of 4–7 years. The economic math is shifting: power cost is becoming a larger percentage of total AI operating cost than compute, especially for inference workloads that run 24/7.
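The claim that power rivals compute cost for 24/7 inference can be sanity-checked with back-of-envelope arithmetic. All figures below are assumptions for illustration, not measured values:

```python
# Back-of-envelope: power cost per GPU-hour for a 24/7 inference fleet.
# Every figure here is an illustrative assumption.

gpu_power_kw = 0.7        # assumed average draw per GPU, incl. server overhead share
pue = 1.3                 # assumed facility power usage effectiveness
rate_usd_per_kwh = 0.08   # assumed blended power rate

power_cost_per_gpu_hour = gpu_power_kw * pue * rate_usd_per_kwh
print(f"${power_cost_per_gpu_hour:.4f} per GPU-hour")  # 0.7 * 1.3 * 0.08 = 0.0728
```

At these assumed rates a GPU running around the clock draws roughly $640/year in power alone, which is why the rate trajectory matters more than the sticker price of the hardware over a multi-year horizon.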
This creates a new class of competitive advantage: companies that lock in cheap, reliable, long-term power access will be able to run AI workloads at lower marginal cost than competitors. Power is the new cloud — and the companies that treat it as strategy will win.
| Signal | Why it matters | Action |
|---|---|---|
| Demand surge | IEA: 1,000+ TWh by 2028, equivalent to Japan's total electricity demand. | Model power demand curves per facility and workload type. |
| Cost advantage shift | Power cost is becoming a larger % of AI ops than compute for 24/7 inference. | Score regions by long-term power cost trajectory, not just current rates. |
| Nuclear renaissance | Tech companies are investing in nuclear (SMRs) for guaranteed baseload power. | Evaluate on-site generation options for power independence. |
What would we build first?
A regional power cost and capacity scoring tool for the top 30 global data center markets: grid headroom, interconnection queue status, average PPA rates, renewable availability, and planned capacity additions. The output is a ranked list of regions by AI infrastructure suitability.
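The ranking described above could start as a weighted score across the listed criteria. A sketch, assuming hypothetical weights and example markets (every name and number below is illustrative):

```python
# Sketch of a weighted regional suitability score.
# Weights and market data are hypothetical examples.

WEIGHTS = {
    "grid_headroom": 0.30,          # available capacity vs. demand
    "interconnection_speed": 0.25,  # inverse of queue length
    "ppa_rate": 0.25,               # inverse of average PPA cost
    "renewables": 0.20,             # renewable availability
}

def suitability(scores):
    """Weighted sum of per-criterion scores, each normalized to 0-1."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

markets = {
    "market_a": {"grid_headroom": 0.6, "interconnection_speed": 0.4,
                 "ppa_rate": 0.7, "renewables": 0.5},
    "market_b": {"grid_headroom": 0.5, "interconnection_speed": 0.5,
                 "ppa_rate": 0.6, "renewables": 0.6},
}

ranked = sorted(markets, key=lambda m: suitability(markets[m]), reverse=True)
print(ranked)
```

The hard part is not the scoring function but sourcing honest inputs: interconnection queue data and planned capacity additions vary widely in quality by market.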
Is nuclear realistic for AI companies?
Several hyperscalers have already signed agreements for small modular reactor (SMR) installations. The timeline is 5–8 years for first power delivery. For companies with 10+ year infrastructure horizons, nuclear is a credible strategy. For companies that need power in 2–3 years, solar + battery + grid PPAs are the pragmatic path.
How would we measure success?
A 50%+ reduction in power cost surprise events (unplanned rate increases exceeding 10%). A measurable decrease in time from site selection decision to interconnection agreement. Power cost per GPU-hour trending downward over 3 years for organizations using the planning tool.
```
ZOAK_BUILD_THESIS = {
  category: "Power infrastructure strategy",
  first_principle: "electricity is the new competitive moat for AI companies",
  target_lift: "+40% infrastructure planning speed",
  next_move: "score top 30 data center markets by power suitability"
}
```
Sources: IEA Electricity 2026, Brookings Institution — Data Center Energy
Related engagement
Building AI infrastructure that depends on power access?
Tell us about your capacity and location constraints — we'll scope a power strategy diagnostic.