SpaceX filed with the FCC on Friday requesting permission to launch up to one million satellites for a solar-powered orbital data center system to handle AI computing workloads. To put that in perspective: SpaceX has launched roughly 11,000 Starlink satellites to date. This request is nearly 100 times the size of Starlink, and dozens of times larger than ALL satellites currently orbiting Earth combined.
The eight-page filing submitted January 31, 2026, claims that “orbital data centers are the most efficient way to meet the accelerating demand for AI computing power,” citing solar energy with minimal operating and maintenance costs. Whether the FCC approves this or not, the sheer audacity of the request reveals how desperate the AI infrastructure situation has become.
The Scale Problem: 100x Everything
In the six decades before Starlink, humanity launched roughly 10,000 satellites in total. SpaceX has since launched about 11,000 Starlink satellites (with about 9,600 still in orbit) for internet connectivity. Now they’re asking for permission to deploy 1 million satellites specifically for AI compute—a constellation nearly 100 times larger than their current one.
The filing specifies satellites operating in low-Earth orbit between 500 and 2,000 kilometers in altitude, arranged in “narrow orbital shells spanning up to 50 km each.” Each satellite would run on solar power, theoretically providing continuous compute capacity without the energy costs plaguing terrestrial data centers.
Here’s the thing: This number isn’t realistic, and SpaceX knows it. More on that in a moment.
Why Space? The AI Energy Crisis
AI data centers are hitting hard limits on Earth. Global data center electricity consumption was 415 terawatt-hours in 2024—about 1.5% of all global electricity. That’s projected to nearly double to 945 TWh by 2030, with AI workloads driving 35-50% of that demand.
The numbers are staggering. Training OpenAI’s GPT-4 reportedly consumed roughly 50 gigawatt-hours of electricity—enough to power San Francisco for three days—at an estimated cost exceeding 100 million dollars. The PJM Interconnection, which serves 65 million people across 13 states, projects it will be six gigawatts short of its reliability requirements in 2027. Data centers already account for 26% of Virginia’s electricity consumption.
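The "three days of San Francisco" comparison can be sanity-checked with quick arithmetic: 50 GWh spread over three days implies an average city-scale power draw of roughly 700 megawatts, which is in the right range for a city of that size. A minimal sketch (the figures are the estimates quoted above, not precise measurements):

```python
# Rough consistency check on the GPT-4 training-energy comparison:
# 50 GWh consumed over three days implies a city-scale average power draw.
TRAINING_ENERGY_GWH = 50
DAYS = 3

# Convert GWh to MWh, then divide by elapsed hours to get average MW.
implied_draw_mw = TRAINING_ENERGY_GWH * 1_000 / (DAYS * 24)
print(f"Implied average draw: {implied_draw_mw:.0f} MW")
```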
Space offers what terrestrial grids can’t: unlimited solar power, vacuum cooling (no water or air conditioning required), and no grid constraints. The problem is getting there.
The Economics Are Brutal (For Now)
An economic analysis modeling a 1-gigawatt facility over five years found orbital solar power costs approximately 51 billion dollars versus 16 billion for terrestrial combined-cycle gas turbine power—more than three times the cost. At current launch costs of 1,500 to 3,000 dollars per kilogram, launching a 100-ton orbital data center module runs 150 to 300 million dollars per module before operational costs.
SpaceX’s Starship aims to reduce launch costs to 10 to 20 dollars per kilogram, which would drop module costs to 1 to 2 million dollars—a 99% reduction. Economic viability for orbital data centers depends entirely on achieving these dramatic launch cost reductions.
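The launch-cost arithmetic above is simple enough to verify directly. A minimal sketch using the per-kilogram prices quoted above (the prices themselves are estimates, not confirmed figures):

```python
# Back-of-the-envelope launch cost for a 100-ton orbital data center
# module, at the per-kilogram launch prices quoted in the text.
MODULE_MASS_KG = 100 * 1_000  # 100 metric tons

def module_launch_cost(cost_per_kg: float) -> float:
    """Launch cost in dollars for one module at a given $/kg price."""
    return MODULE_MASS_KG * cost_per_kg

# Current launch prices: $1,500-$3,000 per kg
current_low = module_launch_cost(1_500)   # $150,000,000
current_high = module_launch_cost(3_000)  # $300,000,000

# Starship target: $10-$20 per kg
starship_low = module_launch_cost(10)     # $1,000,000
starship_high = module_launch_cost(20)    # $2,000,000

# Reduction comparing the high ends of each range: ~99%
reduction = 1 - starship_high / current_high
print(f"Current:  ${current_low:,.0f} to ${current_high:,.0f} per module")
print(f"Starship: ${starship_low:,.0f} to ${starship_high:,.0f} per module")
print(f"Reduction: {reduction:.1%}")
```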
But orbital data centers aren’t science fiction. Starcloud, a startup backed by Nvidia and Y Combinator, launched its first satellite in November 2025 with an Nvidia H100 GPU—100 times more powerful than any GPU previously operated in space. In December 2025, Starcloud successfully trained the first AI model in space. Their second satellite, launching October 2026, will have 100 times the power generation of the first and integrate Nvidia’s Blackwell platform. Starcloud claims energy costs in space are 10 times cheaper than terrestrial options, even including launch expenses.
The concept works. It’s just not economically viable at scale yet.
This Is Classic Musk Negotiating
Here’s what SpaceX is really doing: anchoring high. It’s a known regulatory strategy they’ve used successfully before. When SpaceX applied for their Starlink Gen2 constellation, they requested permission for 29,988 satellites. The FCC approved 7,500 in 2022, then another 7,500 in January 2026—totaling 15,000 satellites, roughly 50% of the original request. The remaining 22,488 satellites are still deferred.
The FCC isn’t going to approve one million satellites. Industry consensus suggests this is an opening bid for negotiations, not a realistic target. SpaceX will likely settle for tens of thousands over a decade-plus timeline, not hundreds of thousands.
But there’s another angle here: space debris. Starlink currently performs a collision avoidance maneuver every two minutes on average. Orbital altitudes between 520 and 1,000 kilometers are already near the threshold for runaway collisions, according to recent studies. Adding one million more satellites to this environment raises serious Kessler syndrome concerns—the scenario where orbital debris density triggers a cascade of collisions that makes certain orbits unusable for decades.
SpaceX is proactively lowering Starlink orbits from 550 kilometers to 480 kilometers throughout 2026 to reduce collision risk. But scaling to one million satellites, even over many years, would require solving problems that don’t have solutions yet.
What This Really Signals
Whether the FCC approves this request or not, the filing reveals two critical truths about AI infrastructure in 2026.
First, the terrestrial power grid simply cannot keep up with AI compute demand. Companies are literally looking to space for solutions because Earth-based infrastructure is hitting physical limits. Carnegie Mellon researchers estimate data centers could increase average U.S. electricity bills by 8% by 2030, exceeding 25% in high-demand markets like Northern Virginia.
Second, Musk’s negotiation playbook is on full display. The question isn’t “Will they launch one million satellites?” It’s “What number will they settle on?” Based on Starlink Gen2 precedent, expect approval for a fraction—perhaps 50,000 to 100,000 satellites deployed over 10-15 years.
The future likely looks hybrid: orbital data centers handling batch processing and AI training workloads that can tolerate latency on the order of 20 milliseconds, while terrestrial facilities handle real-time inference and latency-sensitive applications. Space-based computing works—Starcloud proved it. But commercial viability at scale is still three to five years out, dependent on Starship achieving its promised roughly 99% reduction in launch costs.
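The 20-millisecond tolerance is consistent with light-travel arithmetic for the 500–2,000 km altitudes in the filing. The sketch below computes the physical lower bound on round-trip signal time to a satellite directly overhead; real latency would add slant range, routing, and processing time:

```python
# Minimum round-trip signal time to a satellite directly overhead,
# for the altitude range in SpaceX's filing. This is a physical lower
# bound: actual latency adds slant range, routing, and processing.
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def min_round_trip_ms(altitude_km: float) -> float:
    """Round-trip light time in milliseconds for a straight up-and-down path."""
    return 2 * altitude_km / C_KM_PER_S * 1_000

for alt in (500, 1_000, 2_000):
    print(f"{alt:>5} km altitude: >= {min_round_trip_ms(alt):.1f} ms round trip")
```

Even at the top of the filing's 2,000 km range, the floor is well under 20 ms, which is why training and batch workloads are the natural fit while tighter real-time inference stays on the ground.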
SpaceX just asked for the moon. They’ll settle for Mars. And the fact that they’re asking at all shows how intense the pressure on AI infrastructure has become.