Marvell Technology just bet $3.25 billion that AI’s biggest problem isn’t chips—it’s the wires connecting them. On December 2, 2025, the chipmaker announced it would acquire Celestial AI for up to $5.5 billion (including performance milestones), bringing “Photonic Fabric” technology that uses light instead of copper to connect AI accelerators and memory. The acquisition targets a critical bottleneck: modern AI chips spend more time waiting for data than computing.
Matt Murphy, Marvell’s CEO, called the deal “a transformative step” addressing a $10 billion market. “Copper-based interconnects are approaching their fundamental limits,” he said. Marvell expects $500 million in Celestial AI revenue by Q4 fiscal 2028, doubling to $1 billion by Q4 fiscal 2029.
The Memory Wall: Fast Chips, Slow Wires
AI compute performance has increased 60,000× over 20 years, while memory bandwidth improved just 100× and interconnects only 30×. Meanwhile, Transformer parameters grow 410× every two years, but GPU memory scales at 2× every two years. The math doesn’t work.
It’s like upgrading to a Ferrari but keeping the same traffic-clogged highway. High Bandwidth Memory (HBM) helps, but it’s expensive and capacity-limited. The real constraint isn’t memory itself; it’s the copper wires delivering data to processors.
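The growth figures above can be put on a common footing with some back-of-envelope arithmetic. The totals (60,000× compute vs. 100× memory bandwidth vs. 30× interconnect over 20 years; 410× parameters vs. 2× GPU memory every two years) come from the article; the annualization logic is just illustrative:

```python
# Back-of-envelope look at the "memory wall" gap, using the figures
# cited above. Only the growth factors are from the article; the
# projection math is a simple annualization.

def annualized(factor: float, years: float) -> float:
    """Convert a total growth factor over `years` into a per-year rate."""
    return factor ** (1 / years)

compute_per_year = annualized(60_000, 20)  # ~1.73x per year
membw_per_year = annualized(100, 20)       # ~1.26x per year
interconnect_per_year = annualized(30, 20) # ~1.19x per year

params_per_2yr = 410  # Transformer parameter growth every two years
hbm_per_2yr = 2       # GPU memory growth every two years

# How fast the model-size vs. memory gap widens:
gap_per_2yr = params_per_2yr / hbm_per_2yr  # 205x every two years

print(f"compute ~{compute_per_year:.2f}x/yr, "
      f"memory bandwidth ~{membw_per_year:.2f}x/yr, "
      f"interconnect ~{interconnect_per_year:.2f}x/yr")
print(f"model size outgrows GPU memory ~{gap_per_2yr:.0f}x every 2 years")
```

Compounded over a decade, a 1.73× vs. 1.19× annual gap is why interconnect, not raw compute, becomes the binding constraint.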
Photonics: Trading Copper for Light
Celestial AI’s Photonic Fabric replaces electrical signals with light. Data travels as photons through optical channels, delivering 16 terabits per second in a single chiplet—10× today’s 1.6 Tb/s ports.
Energy efficiency is where photonics shines. The technology consumes 6.2 picojoules per bit versus 62.5 picojoules for Nvidia’s NVLink—roughly a 10× improvement, which matters when AI clusters burn megawatts. It also delivers nanosecond-class latency and thermal stability for co-packaging with high-power chips.
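What a per-bit energy figure means in practice falls out of one identity: power equals energy per bit times throughput. The pJ/bit numbers and the 16 Tb/s chiplet rate are from the article; the calculation below is a simplification (it ignores laser overhead, FEC, and idle power):

```python
# Sustained link power = energy per bit (J) * throughput (bits/s).
# pJ/bit figures are from the article; lane/FEC overheads are ignored.

PJ = 1e-12  # one picojoule, in joules

def link_power_watts(pj_per_bit: float, tbps: float) -> float:
    """Sustained power for a link running at `tbps` terabits per second."""
    return pj_per_bit * PJ * tbps * 1e12

photonic = link_power_watts(6.2, 16)     # ~99 W at 16 Tb/s
electrical = link_power_watts(62.5, 16)  # ~1000 W if copper hit that rate

print(f"optical: {photonic:.0f} W, electrical-equivalent: {electrical:.0f} W")
```

At a single chiplet’s 16 Tb/s, the per-bit gap is the difference between roughly 100 W and a kilowatt of interconnect power—multiplied across every link in a cluster.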
A $5.5 Billion Gamble on Optical
Marvell is paying $1 billion cash and $2.25 billion in stock upfront, with another $2.25 billion contingent on hitting revenue targets by 2029. The deal positions Marvell as a full-stack AI infrastructure provider competing with Nvidia, Broadcom, and Intel.
Murphy’s pitch: Marvell will be a neutral interconnect supplier working with any custom AI chip. “We’re playing offense,” he said, outlining a vision spanning silicon photonics, packaging, and rack integration. If photonics becomes cheaper and more scalable, spending could shift from Nvidia and Broadcom to Marvell.
The Photonics Arms Race
Marvell isn’t alone. Nvidia unveiled silicon photonics at GTC 2025. Intel is working on optical solutions. Startups like Lightmatter claim 10× I/O bandwidth gains today, potentially 100× in five years. Every major player bets optical interconnects will become standard for next-gen AI clusters.
The Catch: Thermal Stability
Photonics isn’t guaranteed. Optical components are temperature-sensitive, requiring engineers to control roughly one million photonic elements with real-time feedback. Silicon micro-ring modulators are especially fragile: their resonant wavelength drifts as the chip heats up, so they need active compensation just to stay locked on channel.
Celestial AI’s differentiator is proprietary “thermally stable modulation technology,” but manufacturing at scale remains unproven. Previous optical attempts struggled with cost and reliability. The $2.25 billion earn-out suggests even Marvell acknowledges execution risk.
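To make the thermal problem concrete, here is a minimal sketch of the feedback loop each micro-ring needs. The coefficients are illustrative assumptions in the typical range for silicon photonics, not Celestial AI’s actual design:

```python
# Why micro-ring modulators need active thermal control: the ring's
# resonant wavelength drifts with temperature, so a control loop steers
# an on-chip heater to cancel the drift. Coefficients below are
# assumed, order-of-magnitude values, not any vendor's real numbers.

DRIFT_NM_PER_C = 0.08    # assumed resonance drift per degree C
HEATER_NM_PER_MW = 0.02  # assumed heater tuning efficiency

def heater_adjust_mw(temp_delta_c: float) -> float:
    """Heater power change (mW) needed to cancel a temperature excursion."""
    drift_nm = DRIFT_NM_PER_C * temp_delta_c
    return -drift_nm / HEATER_NM_PER_MW

# A 5 C hot spot shifts resonance ~0.4 nm; the loop sheds ~20 mW of
# heater power to compensate -- per ring, times ~1e6 rings per system.
print(f"{heater_adjust_mw(5.0):.1f} mW")
```

Running a million such loops continuously, next to ASICs that swing tens of degrees under load, is the scale problem Celestial AI’s “thermally stable modulation” claims to sidestep.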
What This Means for Developers
If photonics delivers, next-gen AI models will train on fundamentally different infrastructure. Optical interconnects could enable memory pooling, larger models, and 10-100× faster training. Cloud providers will shift to photonic racks.
The timeline: Celestial AI revenue starts H2 fiscal 2028, three years out. By then, developers may deploy on photonic infrastructure without thinking about it—like we moved from dial-up to fiber without rewriting apps.
The real question: Can Marvell execute before Nvidia, Intel, or a startup beats them? At $5.5 billion, this is Marvell’s bet that light solves the memory wall. The industry will know by 2028.