While AI companies publicly battle over GPUs and model architectures, a quieter infrastructure war is brewing over transformers—not the neural network kind, but the 140-year-old electrical devices that power data centers. Three solid-state transformer startups raised $280 million combined in recent months, betting that power infrastructure, not compute, will determine whether AI scaling continues or hits a wall.
The funding wave tells a story most developers miss: NVIDIA is planning 1-megawatt racks—enough to power 1,000 homes—and traditional iron-copper transformers physically can’t handle the load. Heron Power raised $140 million, Amperesand pulled in $80 million, and DG Matrix secured $60 million between November 2025 and February 2026. That’s a quarter-billion dollars wagered on replacing infrastructure older than the airplane.
The Power Crisis Nobody Wants to Talk About
NVIDIA’s rack power density is exploding. Current generation racks consume 100+ kilowatts. The Kyber system shipping in 2027 demands 600 kilowatts per rack—roughly six times today’s density. Future Feynman GPUs will hit 1 megawatt per rack, producing the same heat output as 200 kitchen ovens running simultaneously.
Here’s the problem that makes investors nervous: At the industry-standard 54 volts DC, a single 1-megawatt rack requires 200 kilograms of copper busbar. Scale that to a 1-gigawatt data center—the size hyperscalers are planning—and you need 200,000 kilograms of copper just for rack busbars. That doesn’t include the rest of the electrical infrastructure. The physics don’t work. The economics don’t work. Traditional power delivery architecture simply doesn’t scale to megawatt-per-rack densities.
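The busbar arithmetic is easy to sanity-check. A minimal sketch, assuming a conservative busbar rating of about 2 A/mm² and standard copper density (both assumptions are mine, not from the reporting):

```python
# Back-of-envelope check of the 54 V busbar claim.
# Assumed (not from the article): busbars sized at ~2 A/mm^2,
# copper density 8,960 kg/m^3.

P_RACK_W = 1_000_000       # 1 MW rack
V_BUS = 54                 # industry-standard 54 V DC
CURRENT_DENSITY = 2.0      # A/mm^2, assumed conservative busbar rating
CU_DENSITY = 8_960         # kg/m^3

current_a = P_RACK_W / V_BUS                     # ~18,500 A
cross_section_mm2 = current_a / CURRENT_DENSITY  # required copper area
kg_per_meter = cross_section_mm2 * 1e-6 * CU_DENSITY

print(f"Current at 54 V: {current_a:,.0f} A")
print(f"Busbar cross-section: {cross_section_mm2:,.0f} mm^2")
print(f"Copper mass: {kg_per_meter:.0f} kg per metre of busbar")
```

At roughly 83 kg of copper per metre under these assumptions, a few metres of busbar per rack lands squarely in the 200-kilogram range cited above.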
Goldman Sachs projects data center power demand will surge 165% by 2030 from 2023 levels, with demand climbing from 55 gigawatts today to 122 gigawatts. AI workloads currently represent 14% of data center power consumption but will jump to 27% by 2027. Furthermore, data center occupancy is projected to peak above 95% in late 2026 before new capacity comes online. The bottleneck isn’t GPU availability—it’s the electrical grid’s ability to deliver power efficiently at unprecedented densities.
Silicon Replaces 140 Years of Iron and Copper
Solid-state transformers replace electromechanical iron-copper transformers with silicon-based power electronics. Instead of heavy magnetic coils, SSTs use wide-bandgap semiconductors—silicon carbide and gallium nitride—for high-frequency switching. A three-stage conversion process (input rectifier, high-frequency DC-DC isolation, output inverter) achieves 97.5-99% conversion efficiency versus 95% for traditional transformers.
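Cascaded stages multiply, which is why each stage must be extremely efficient to keep the end-to-end figure in the quoted band. A quick sketch with illustrative per-stage efficiencies (assumed values, not vendor specs):

```python
# Cascaded efficiency of the three-stage SST conversion chain.
# Per-stage efficiencies below are illustrative assumptions.

stages = {
    "input rectifier (AC-DC)": 0.995,
    "high-frequency DC-DC isolation": 0.990,
    "output inverter (DC-AC)": 0.995,
}

overall = 1.0
for name, eta in stages.items():
    overall *= eta  # losses compound stage by stage
    print(f"{name}: {eta:.1%}")

print(f"End-to-end efficiency: {overall:.2%}")
```

Three stages at 99–99.5% each land at about 98% overall, inside the 97.5–99% range quoted above; a single weak stage would drag the whole chain below it.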
That efficiency difference sounds marginal until you run the numbers. For example, a 1-megawatt data center operating at 20% load saves 87+ megawatt-hours annually by switching to SSTs. The footprint shrinks just as dramatically: Heron Power’s SST technology occupies 70% less space than traditional equipment, and Amperesand reports a greater-than-80% reduction in electrical equipment footprint compared to legacy infrastructure.
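The savings can be approximated directly from the efficiency gap. A simplified model, assuming a flat 20% load profile all year (real load curves, and likely different assumptions behind the 87+ MWh figure, will shift the result somewhat):

```python
# Rough annual-savings estimate for a 1 MW facility at 20% average load,
# moving from a 95%-efficient transformer to a 99%-efficient SST.
# Assumes a flat load profile, which real facilities won't have.

P_MW = 1.0
LOAD_FACTOR = 0.20
HOURS_PER_YEAR = 8_760

def annual_loss_mwh(efficiency: float) -> float:
    """Yearly transformer losses, given the power actually delivered."""
    delivered_mw = P_MW * LOAD_FACTOR
    return delivered_mw * (1 / efficiency - 1) * HOURS_PER_YEAR

savings = annual_loss_mwh(0.95) - annual_loss_mwh(0.99)
print(f"Annual savings: {savings:.1f} MWh")
```

Under these simplified assumptions the savings come out near 75 MWh per year, the same order of magnitude as the article’s figure.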
The breakthrough isn’t just efficiency—it’s the 800-volt DC architecture SSTs enable. Texas Instruments and NVIDIA announced a complete 800V DC reference design on March 16. Higher voltage reduces current demand, cutting copper requirements by 45%. Additionally, the architecture improves end-to-end efficiency by 5% while reducing total cost of ownership by 30%. You can scale the same infrastructure from 100-kilowatt racks to 1-megawatt monsters without redesigning power delivery.
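Ohm’s-law arithmetic shows why the voltage matters: at fixed power, current scales inversely with voltage, and for a fixed current density so does the required copper cross-section. A quick rack-level comparison:

```python
# Current drawn by a 1 MW rack at 54 V versus 800 V DC.
# At fixed power, current scales as 1/V, and conductor
# cross-section scales with current at a fixed current density.

P_W = 1_000_000  # 1 MW rack

amps_54 = P_W / 54    # ~18,500 A
amps_800 = P_W / 800  # 1,250 A

print(f" 54 V: {amps_54:,.0f} A")
print(f"800 V: {amps_800:,.0f} A")
print(f"Rack-level current ratio: {amps_54 / amps_800:.1f}x")
```

At the rack the current drops almost 15-fold; the facility-wide copper saving is smaller (the 45% cited above), presumably because much of the upstream distribution already runs at higher AC voltages.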
DigiTimes called 2026 “a critical transition period for accelerated adoption” of high-voltage DC and solid-state transformers. The SiC/GaN semiconductor market for 800V data center power systems is projected to hit $2.7 billion by 2030. This isn’t speculative research anymore—commercial deployments are starting this year.
Why $280 Million Flowed to Three Startups in Four Months
The funding timeline shows market urgency. Andreessen Horowitz’s American Dynamism fund and Breakthrough Energy Ventures co-led Heron Power’s $140 million Series B in February. Walden Catalyst Ventures and Temasek led Amperesand’s $80 million Series A in November 2025. Meanwhile, Engine Ventures, ABB, and Chevron Technology Ventures backed DG Matrix’s $60 million Series A in February.
That’s $280 million in four months chasing the same problem. Investors rarely move this fast on infrastructure plays unless they see an imminent bottleneck. TechCrunch ran two features in four weeks: “Why investors are going gaga over solid-state transformers” on February 20, followed by “The best AI investment might be in energy tech” on March 20. When TechCrunch declares energy infrastructure more important than AI models, something fundamental has shifted.
Amperesand is delivering 30 megawatts of commercial systems in 2026 to hyperscale AI data centers. Their first deployment goes live early this year at the Port of Singapore, supporting a mission-critical charging pilot with PSA International. Similarly, Navitas and EPFL demonstrated a 250-kilowatt SST at APEC 2026 on March 4. DG Matrix reports that 90% of its customer base is data centers, and the company recently partnered with Exowatt to provide power management for solar-powered data center containers.
An industry expert told TechCrunch bluntly: “It will actually slow down the progress in scaling up data centers if you don’t have solid-state transformers ready quite soon.” That quote explains the funding velocity. Investors see AI scaling stalling in 2027-2028 without new power infrastructure, and nobody wants to be the one who missed the bottleneck.
The Math Changes at Hyperscale
SSTs cost 3-10 times more than traditional transformers upfront. For instance, a 50 kVA solid-state transformer runs $60,000-$100,000 versus $6,000-$10,000 for a standard unit. That’s not a rounding error—it’s a business decision that separates hyperscale AI infrastructure from legacy data centers.
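Normalizing the quoted ranges to cost per kVA makes the gap concrete. A sketch using the midpoints of the article’s figures (actual quotes vary by vendor and rating):

```python
# Per-kVA cost comparison from the article's 50 kVA example,
# using the midpoints of the quoted price ranges.

KVA = 50
sst_cost = (60_000 + 100_000) / 2        # midpoint: $80,000
traditional_cost = (6_000 + 10_000) / 2  # midpoint: $8,000

print(f"SST: ${sst_cost / KVA:,.0f}/kVA")
print(f"Traditional: ${traditional_cost / KVA:,.0f}/kVA")
print(f"Premium: {sst_cost / traditional_cost:.0f}x at the midpoints")
```

At the midpoints the premium works out to 10x, the top of the 3-10x range, which is why the business case has to rest on density and efficiency rather than sticker price.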
Traditional transformers have proven 30-50 year lifespans. They’re bulletproof, commodity components with minimal maintenance requirements. In contrast, SSTs target 20+ year durability with SiC technology, but the track record is shorter. Complex power electronics introduce more potential failure points than simple iron-copper coils.
The ROI case works when power density and efficiency matter more than capital costs. For new 100+ megawatt AI data centers deploying 600-kilowatt or 1-megawatt racks, the 87+ megawatt-hours of annual energy savings, 80% footprint reduction, and 30% lower total cost of ownership justify the premium. However, for budget-constrained operators running <100-kilowatt legacy racks, traditional transformers remain the rational choice.
Cloud providers building cutting-edge AI infrastructure will adopt SSTs and 800V DC architecture. Existing data centers will stick with what works until equipment replacement cycles force the decision. The transition happens in new construction, not retrofits.
Watch Amperesand’s 30 Megawatts in 2026
Amperesand’s commercial deployments this year are the litmus test. If 30 megawatts of SST systems run reliably in hyperscale production environments, adoption accelerates. Conversely, if reliability issues surface, the market retreats to wait for Generation 2 technology.
NVIDIA’s Kyber system rollout in 2027 provides the second validation point. Six-hundred-kilowatt racks test whether 800V DC architecture scales beyond demos and pilot programs. By 2030, the industry will know if SSTs disrupted power infrastructure or remained a niche solution for bleeding-edge deployments.
For developers, the infrastructure battle determines cloud economics. If AWS, Google, and Microsoft adopt SSTs and 800V DC, you’ll see lower AI inference costs and better availability as power efficiency improvements flow through to pricing. However, if traditional transformers remain standard, expect AI infrastructure costs to stay elevated as operators pay the efficiency penalty.
Goldman Sachs estimates $720 billion of grid spending will be needed through 2030 to support AI data center growth. Some portion of that investment will flow to SSTs if early deployments prove reliable. Therefore, for investors, SST startups are high-risk bets that infrastructure bottlenecks matter as much as model architectures. For developers, it’s a reminder that the stack extends below the API layer into physical infrastructure you never think about—until it becomes the constraint.
Key Takeaways
- Power infrastructure, not GPUs, may limit AI scaling in 2027-2030 as NVIDIA pushes toward 1-megawatt racks
- $280 million raised across three SST startups (Heron, Amperesand, DG Matrix) in three months signals investor conviction that energy infrastructure is critical
- SSTs achieve 97.5-99% efficiency versus 95% traditional, enabling 800V DC architecture with 45% copper reduction and 80% footprint savings
- Amperesand’s 30 megawatts of commercial deployments in 2026 are the critical reliability test determining adoption velocity
- Watch cloud provider infrastructure commitments—AWS, Google, and Microsoft adopting 800V DC and SSTs means lower AI costs ahead

