Your electricity bill is climbing not because you’re using more power, but because AI data centers are driving infrastructure costs that utilities are passing directly to you. In PJM power markets spanning 13 states from Illinois to North Carolina, residential customers are paying $9-18 more per month in 2026. Baltimore residents got hit hardest: bills jumped $17/month in 2025, with another $4 increase landing this year. The culprit is a capacity market auction where data center demand pushed prices up 833%, creating a hidden “AI infrastructure tax” on 65 million Americans whether they use ChatGPT or not.
If you’re a developer building AI tools, here’s the infrastructure cost externality nobody’s talking about. This isn’t abstract energy policy—it’s real money out of real people’s pockets, and the meter’s still running. By 2028, the average family could face $70/month increases, all to subsidize an AI boom that profits Big Tech while consumers foot the infrastructure bill.
The $9.3B Cost Shift: How Data Centers Drove Capacity Prices Up 833%
PJM’s capacity auction for the 2025/26 delivery year cleared at $269.92 per megawatt-day, up from $28.92 for 2024/25, an 833% increase; the 2026/27 auction cleared higher still, at $329.17. Monitoring Analytics, PJM’s independent market monitor, attributes 63% of the 2025/26 auction’s cost directly to data center demand. That share works out to roughly $9.3 billion in infrastructure costs recovered from all 65 million PJM customers, not just the data centers consuming the power.
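A quick sanity check on those figures. The ~$14.7 billion total cost of the 2025/26 auction is an assumption drawn from public reporting, not from PJM’s filings directly:

```python
# Sanity check on the auction figures cited above (prices in $/MW-day).
price_2024_25 = 28.92        # 2024/25 delivery year clearing price
price_2025_26 = 269.92       # 2025/26 delivery year clearing price
total_cost_2025_26 = 14.7e9  # assumed ~$14.7B total auction cost (public reporting)

pct_increase = (price_2025_26 - price_2024_25) / price_2024_25 * 100
print(f"Capacity price increase: {pct_increase:.0f}%")  # -> 833%

# Monitoring Analytics attributes 63% of that cost to data center demand.
data_center_cost = 0.63 * total_cost_2025_26
print(f"Data-center-attributable cost: ${data_center_cost / 1e9:.1f}B")  # -> $9.3B
```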
Baltimore provides the starkest example. Wholesale electricity prices jumped 125% in 2025, driving Baltimore Gas & Electric residential bills up $17/month. Another $4 increase hit in 2026. Meanwhile, Washington D.C. residents on Pepco saw $21/month increases, with the D.C. Office of the People’s Counsel confirming $10/month came purely from capacity market costs. Western Maryland and Ohio customers face $18/month and $16/month increases respectively.
Here’s the mechanism: capacity markets are designed to ensure grid reliability by contracting power supply years in advance, with costs distributed across all customers. Data centers create massive new demand—individual facilities consuming 50-200 MW each—driving up capacity auction prices. The entire customer base pays, not just the data centers. It’s infrastructure cost socialization on a scale consumers have never experienced, and we’re only at the beginning.
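Here’s a minimal sketch of that socialization math. Every number below (system peak, residential share, customer count) is hypothetical, chosen only to show the shape of the mechanism, not to model any actual utility:

```python
def monthly_capacity_cost_per_customer(price_per_mw_day: float,
                                       system_peak_mw: float,
                                       residential_share: float,
                                       residential_customers: int) -> float:
    """Rough pro-rata allocation of annual capacity costs to one household."""
    annual_cost = price_per_mw_day * system_peak_mw * 365
    residential_cost = annual_cost * residential_share
    return residential_cost / residential_customers / 12

# Hypothetical utility: 10 GW peak, 40% residential allocation, 2M households.
before = monthly_capacity_cost_per_customer(28.92, 10_000, 0.40, 2_000_000)
# Add 500 MW of data center load, repriced at the new clearing price.
after = monthly_capacity_cost_per_customer(269.92, 10_500, 0.40, 2_000_000)
print(f"Before: ${before:.2f}/mo, after: ${after:.2f}/mo, "
      f"delta: ${after - before:.2f}/mo")  # delta ~ $15/mo
```

The delta in this toy example lands in the same $9-18/month range as the real bills above, and it comes almost entirely from repricing: the new load adds only 5% to peak demand, but every megawatt of existing demand now clears at the higher price.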
Why AI Chips Consume 10-20x More Power Than Traditional Computing
AI data centers require 162 kilowatts per square foot today, rising to 176 kW/sq ft by 2027, compared to traditional data centers at 40-60 kW/sq ft. NVIDIA’s 2027 “Kyber” system will pack 576 GPUs into a single rack consuming 600 kilowatts—equivalent to the electricity demand of 500 U.S. homes, all in the footprint of a filing cabinet.
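The 500-home comparison checks out against average household draw. The ~10,700 kWh/year figure below is an assumption based on the approximate U.S. residential average:

```python
kyber_rack_kw = 600                     # NVIDIA's 2027 Kyber rack, per the text
avg_home_kwh_per_year = 10_700          # assumed U.S. residential average
avg_home_kw = avg_home_kwh_per_year / (365 * 24)  # ~1.2 kW continuous draw
print(f"Homes per rack: {kyber_rack_kw / avg_home_kw:.0f}")  # ~490
```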
Goldman Sachs research shows rack-level power density more than doubled in two years, from 8 kW per rack to 17 kW today, with projections hitting 30 kW average by 2027. At the grid level, data center power demand is projected to grow 50% by 2027 (to 84 GW) and 165% by 2030 compared to 2023 levels, with AI workloads comprising 27% of total data center load by 2027.
This explains why infrastructure costs are exploding. Traditional data center economics assumed incremental load growth of 2-3% annually. AI chips (GPUs like NVIDIA’s H100 and Blackwell) consume 10-20x more power than CPUs and generate proportionally more heat, requiring upgraded cooling infrastructure. Utilities planned for one trajectory; AI delivered another, and the gap between forecasted and actual demand is what’s driving capacity shortages and price spikes.
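A sketch of that planning gap, using the growth rates cited above. The 56 GW baseline is an assumption implied by the 84 GW figure sitting 50% above 2023 levels:

```python
baseline_2023_gw = 56.0  # implied: 84 GW is +50% over the 2023 level

# Utility planning assumption: ~2.5% annual load growth.
planned_2027 = baseline_2023_gw * 1.025 ** 4
planned_2030 = baseline_2023_gw * 1.025 ** 7

# AI-driven trajectory, per the projections cited above.
actual_2027 = baseline_2023_gw * 1.50   # +50% by 2027 -> 84 GW
actual_2030 = baseline_2023_gw * 2.65   # +165% by 2030 -> ~148 GW

print(f"2027 gap: {actual_2027 - planned_2027:.0f} GW")  # ~22 GW unplanned
print(f"2030 gap: {actual_2030 - planned_2030:.0f} GW")  # ~82 GW unplanned
```

Twenty-plus gigawatts of demand that nobody planned for by 2027 is exactly the kind of shortfall a capacity auction reprices.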
1,000 TWh by 2026: Global Scale and the Ireland Warning
The International Energy Agency projects data centers will consume 1,000 terawatt-hours of electricity globally by 2026, equivalent to Japan’s entire national electricity consumption and double the 500 TWh consumed in 2022. That’s not a rounding error. It’s an infrastructure buildout on the scale of powering one of the world’s largest economies.
Ireland provides a cautionary precedent. Data centers accounted for 21% of national electricity in 2023 and are projected to reach 32% by 2026, triggering government concerns about rolling blackouts and a moratorium on new Dublin data centers until 2028. When data centers cross roughly 20% of grid capacity, you hit infrastructure saturation, consumer backlash, and regulatory moratoriums. PJM is early in this trajectory, but with data center clusters in Northern Virginia, Maryland, and Ohio already overwhelming local grids, the Ireland playbook may arrive faster than utilities expect.
McKinsey estimates $6.7 trillion in global data center infrastructure investment will be needed through 2030 to keep pace with compute demand, with the U.S. representing 40% of that ($2.7 trillion). The breakdown: 15% to builders (land, construction), 25% to energy systems (power generation, transmission, cooling), and 60% to technology (chips, GPUs, computing hardware). Energy systems alone represent roughly $1.7 trillion globally, a large share of which gets recovered from consumers through capacity markets and transmission rates.
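Running the stated shares against the $6.7 trillion total:

```python
total = 6.7e12  # McKinsey's global estimate through 2030
shares = {
    "builders (land, construction)": 0.15,
    "energy systems (power, transmission, cooling)": 0.25,
    "technology (chips, GPUs, hardware)": 0.60,
}
for bucket, share in shares.items():
    print(f"{bucket}: ${share * total / 1e12:.1f}T")
print(f"U.S. share (40%): ${0.40 * total / 1e12:.1f}T")  # -> ~$2.7T
```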
Who Pays, Who Profits: The Cost Allocation Fight
Here’s the fundamental tension: tech giants—Microsoft, Google, Meta, Amazon—build data centers and profit from AI services, while residential and commercial utility customers subsidize the infrastructure through capacity market auctions and transmission upgrades. Harvard Law School’s Environmental & Energy Law Program warns: “One of the largest concerns is the risk that other ratepayers could end up footing the bill for unneeded infrastructure designed for data centers.”
Oregon’s POWER Act represents the counter-model. The state created a separate rate class for large loads over 20 MW (data centers), with direct cost assignment, minimum demand charges of 80-85%, and contract terms of 12+ years to prevent cost shifting to other ratepayers. Most states haven’t acted, but bipartisan backlash is building: figures as far apart as Bernie Sanders and Ron DeSantis have publicly criticized the data center boom’s impact on consumer electricity costs. When those two agree, you’re looking at mainstream concern.
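Here’s a sketch of why a minimum demand charge matters. The tariff below is illustrative, not Oregon’s actual rate schedule:

```python
def monthly_demand_charge(contracted_mw: float,
                          actual_peak_mw: float,
                          rate_per_mw: float,
                          minimum_take: float = 0.80) -> float:
    """Bill the greater of actual peak or a floor tied to contracted capacity.

    Even if the facility idles, scales back, or exits early, it still pays
    for most of the infrastructure built on its behalf (illustrative only)."""
    billed_mw = max(actual_peak_mw, minimum_take * contracted_mw)
    return billed_mw * rate_per_mw

# A 100 MW facility that only ramps to 30 MW still pays for 80 MW.
print(monthly_demand_charge(contracted_mw=100, actual_peak_mw=30,
                            rate_per_mw=10_000))  # $800,000, not $300,000
```

That floor is the whole point: it shifts the stranded-asset risk from residential ratepayers back onto the party that requested the infrastructure.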
Traditional utility regulation wasn’t designed for ultra-large loads like 50-200 MW facilities with 10-20 year lifespans. Rate structures built for balanced residential/commercial/industrial mixes break down when a single data center equals the demand of an entire town. Developers building AI infrastructure need to understand: you’re creating political and economic consequences that will reshape energy policy, whether you intended to or not.
The 2028 Projection: $70/Month and What’s Next
The Natural Resources Defense Council projects that by 2028, the average American family could see electricity bills rise by $70 per month due to data center-driven infrastructure costs. And according to IEEFA, PJM’s capacity market is expected to remain at or near its $329.17/MW-day price cap for 5-10 years, as supply can’t keep pace with data center demand growth.
Infrastructure lead times explain the multi-year price pressure. New power generation takes 3-7 years (natural gas to nuclear), transmission line upgrades take 5-10 years (permitting and construction), while data centers build out in 18-24 months. This timing mismatch creates persistent supply shortages. Add transformer shortages, land availability constraints, and political opposition to new infrastructure (Ireland-style backlash), and the capacity crunch could last through the decade.
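The mismatch is easy to see on a timeline. Lead times below are taken from the ranges just cited; the 2025 start year is arbitrary, since only the relative timing matters:

```python
# Years to bring each resource online, from the ranges cited above.
lead_times_years = {
    "data center": 2,           # 18-24 months
    "natural gas plant": 3,     # low end of the 3-7 year generation range
    "nuclear plant": 7,         # high end of that range
    "transmission upgrade": 8,  # within the 5-10 year range
}
start_year = 2025
for resource, years in lead_times_years.items():
    print(f"{resource}: online ~{start_year + years}")
# New demand arrives ~2027; most new supply lands 2028-2033.
```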
That $9-18/month today becomes $70/month in two years—a material cost burden for households. It also creates political risk for the AI industry. If consumer bills spike 20-30% annually, expect regulatory backlash, data center moratoriums, and stricter cost allocation requirements. Tech companies betting big on AI infrastructure may face a very different regulatory environment by 2027-2028.
Key Takeaways
- PJM capacity market prices jumped 833% in two years, with data centers responsible for 63% of the increase ($9.3 billion cost shift to consumers across 13 states)
- AI power density (162-176 kW/sq ft) is 3-4x higher than traditional data centers, with NVIDIA’s 2027 Kyber rack consuming 600 kW—enough to power 500 U.S. homes in a filing cabinet footprint
- Global data center electricity consumption will hit 1,000 TWh by 2026 (equivalent to Japan’s entire consumption), requiring $6.7 trillion in infrastructure investment through 2030
- Baltimore residential bills increased $17/month in 2025 with another $4 in 2026; by 2028, the average family could face $70/month increases if current trends continue
- Oregon’s POWER Act shows an alternative path—direct cost assignment for data centers over 20 MW prevents residential cost shifting, but most states haven’t reformed tariffs yet
The AI boom isn’t free. Someone pays for the infrastructure, and right now, it’s consumers who never asked to subsidize Big Tech’s compute expansion. Whether that continues depends on how fast regulators catch up to the reality showing up in electricity bills.