IBM and Cisco announced on November 20 a partnership to network quantum computers by 2030, betting they can scale computational power by linking systems rather than building bigger machines. It’s a fundamentally different approach from Google’s Willow chip, which scaled to 105 qubits through monolithic design. If the bet pays off, we get a quantum internet by the late 2030s. If it fails, IBM and Cisco waste a decade while Google and AWS sprint ahead on single-system efficiency.
The quantum race has three horses now. Google’s scaling up with surface codes. IBM’s optimizing efficiency with qLDPC. And IBM-Cisco are scaling out through networking. One strategy will dominate by 2035. The others will look foolish in hindsight.
The Networking Bet: Cisco’s Fabric, IBM’s Qubits
Cisco brings quantum networking infrastructure—a chip that generates 200 million entangled photon pairs per second at 99 percent fidelity, plus Q-Orchestrator software for circuit partitioning and resource scheduling. IBM delivers the quantum computers: Starling, planned for 2029, promises 100 million quantum gates across 200 logical qubits. The vision is linking multiple Starling-class systems in separate cryogenic environments, distributing workloads across a quantum network.
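To make "circuit partitioning" concrete, here is a hypothetical sketch of the core problem such an orchestrator must solve (the function name and logic are illustrative, not Cisco’s actual Q-Orchestrator API): split a circuit’s qubits across two QPUs and count the two-qubit gates that cross the cut, since each crossing gate consumes an entangled photon pair from the network fabric.

```python
# Hypothetical sketch of circuit partitioning across two networked QPUs.
# Names and logic are illustrative only, not the Q-Orchestrator API.
def crossing_gates(gates, partition_a):
    """Count two-qubit gates whose endpoints land on different QPUs.

    gates: list of (q1, q2) tuples, one per two-qubit gate
    partition_a: set of qubit indices assigned to QPU A
    """
    return sum(1 for q1, q2 in gates if (q1 in partition_a) != (q2 in partition_a))

# Four two-qubit gates; qubits 0 and 1 on QPU A, qubits 2 and 3 on QPU B.
gates = [(0, 1), (1, 2), (2, 3), (0, 3)]
print(crossing_gates(gates, {0, 1}))  # gates (1,2) and (0,3) cross -> 2
```

A good partition minimizes that count, because every crossing gate costs network bandwidth and adds a new opportunity for decoherence.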
Jay Gambetta, IBM’s Director of Research, frames it as the only path to scale beyond physical limits. Vijoy Pandey, Cisco’s GM for Outshift, echoes the pitch: connecting quantum computers is essential for useful scale. The timeline targets a 2030 proof-of-concept, an early-2030s operational network, and a late-2030s quantum internet foundation.
Competing Strategies: Three Paths to Quantum Supremacy
Google’s Willow uses surface codes—a 2D lattice where qubits monitor neighbors for errors. The chip achieved exponential error reduction as it scaled from 3×3 to 7×7 qubit grids, each step halving error rates. But surface codes demand brutal overhead: roughly 4,000 physical qubits to match what IBM claims to achieve with 288 using qLDPC.
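The "halving per step" claim can be sketched as simple arithmetic. This is a back-of-envelope extrapolation assuming a suppression factor of roughly 2 each time the code distance grows by 2 (the `base` rate below is a made-up starting point for illustration, not Willow’s published number):

```python
# Surface-code error suppression, assuming a suppression factor of ~2
# per code-distance step (d -> d+2), as in Willow's 3x3 -> 5x5 -> 7x7 runs.
def logical_error_rate(eps_d3: float, distance: int, lam: float = 2.0) -> float:
    """Extrapolated logical error per cycle at odd distance d, from the d=3 rate."""
    steps = (distance - 3) // 2
    return eps_d3 / lam ** steps

def physical_qubits(distance: int) -> int:
    """Physical qubits in a distance-d surface code patch: d^2 data + d^2-1 measure."""
    return 2 * distance * distance - 1

base = 0.0030  # hypothetical d=3 logical error rate, for illustration only
for d in (3, 5, 7):
    print(d, physical_qubits(d), logical_error_rate(base, d))
```

The catch is visible in `physical_qubits`: suppressing errors further means growing `d`, and the qubit bill grows quadratically with it.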
AWS took a different tack with Ocelot, leveraging cat qubits—named after Schrödinger’s thought experiment—that intrinsically suppress certain errors. The 9-qubit prototype achieves 90 percent cost reduction in error correction compared to traditional approaches, though it remains early-stage and requires near absolute zero cooling.
IBM’s Starling roadmap bets on quantum low-density parity-check codes, where each qubit monitors six others instead of surface code’s nearest-neighbor grid. The efficiency gain is dramatic: 90 percent fewer qubits for equivalent error correction. The trade-off is complexity—qLDPC requires modular chip design and more intricate connectivity.
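The overhead gap between the two codes follows directly from the figures quoted above (4,000 physical qubits for a comparable surface-code setup versus IBM’s 288-qubit qLDPC claim). Illustrative arithmetic only:

```python
# Rough overhead comparison from the figures quoted in the text:
# ~4,000 physical qubits (surface code) vs 288 (IBM's qLDPC claim).
SURFACE_PHYSICAL = 4_000
QLDPC_PHYSICAL = 288

savings = 1 - QLDPC_PHYSICAL / SURFACE_PHYSICAL
print(f"qLDPC uses {savings:.0%} fewer physical qubits")  # -> 93%
```

That is where the "90 percent fewer qubits" headline comes from, and why IBM accepts the harder connectivity problem in exchange.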
| Approach | Company | Strategy | Qubits | Timeline |
|---|---|---|---|---|
| Willow | Google | Surface code scale-up | 105 | Dec 2024 |
| Ocelot | AWS | Cat qubits (intrinsic error suppression) | 9 | Feb 2025 prototype |
| Starling | IBM | qLDPC efficiency | 200 logical | 2029 target |
| Network | IBM-Cisco | Distributed scale-out | Multiple systems | 2030 PoC |
The debate isn’t just technical—it’s strategic. Do you optimize single systems until they’re powerful enough, or do you network weaker systems together? IBM-Cisco are betting physical limits force the network approach. Google-AWS are betting single-system efficiency wins before networking matures.
The Decoherence Problem: Why 2030 May Be Wishful Thinking
Quantum states are absurdly fragile. Decoherence—the collapse of quantum information due to environmental interference—is quantum computing’s foundational challenge. Networking makes it worse. Every photon transmission, every nanosecond of latency, every network hop introduces new error sources. Current demonstrations managed 30 kilometers of urban fiber for 17 days, and 255 kilometers using room-temperature detectors. Impressive, but continental-scale quantum networks face exponentially harder challenges.
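The hop problem compounds multiplicatively. A rough model, assuming each link delivers entangled pairs at the 99 percent fidelity Cisco quotes (a simplification that ignores photon loss and latency-induced noise):

```python
# Why network hops hurt: entanglement fidelity degrades multiplicatively.
# Assumes each link delivers pairs at the 99% fidelity quoted above;
# real links also lose photons outright and add timing noise.
def end_to_end_fidelity(per_link: float, hops: int) -> float:
    return per_link ** hops

for hops in (1, 5, 20):
    print(hops, round(end_to_end_fidelity(0.99, hops), 3))
```

Twenty hops at 99 percent per link leaves roughly 82 percent end-to-end fidelity, which is why continental-scale networks need entanglement purification or quantum repeaters rather than simple relaying.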
Error correction overhead compounds the problem. A single logical qubit requires 1,000 to 10,000 physical qubits depending on error rates. Real-time decoders need sub-microsecond response times, requiring specialized FPGAs or ASICs. Networking distributes error correction across systems, multiplying complexity. And Willow’s already-impressive 0.14 percent error per cycle is still orders of magnitude above the 0.0001 percent rate needed for large-scale algorithms.
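Plugging the overhead range above into Starling’s 200-logical-qubit target gives a sense of scale. Back-of-envelope only, using the 1,000-to-10,000 physical-per-logical range from the text:

```python
# Back-of-envelope: physical qubits implied by 200 logical qubits
# at the error-correction overheads quoted above.
LOGICAL_QUBITS = 200

for overhead in (1_000, 10_000):
    total = LOGICAL_QUBITS * overhead
    print(f"{overhead:>6} physical per logical -> {total:,} physical qubits")
```

Even the optimistic end of that range implies hundreds of thousands of physical qubits, which is why qLDPC’s efficiency claims matter so much to the roadmap.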
Networking quantum computers when single systems barely work is like planning a highway system before inventing the wheel. Ambitious? Absolutely. Plausible by 2030? The physics says maybe, but the engineering says it’s a stretch.
Practical Applications: When Quantum Actually Matters
The good news is quantum computing already shows advantages in narrow domains. Google and Boehringer Ingelheim simulated Cytochrome P450 enzyme metabolism more efficiently than classical methods. JPMorgan Chase and IBM are running quantum algorithms for option pricing and risk analysis, with early studies showing advantages over Monte Carlo simulations. IonQ and Ansys achieved 12 percent speedup over classical high-performance computing for medical device simulation in March 2025—the first documented quantum advantage in a real-world application.
Near-term expectations are modest: 2 to 10 times improvements by 2028-2030 in high-value financial and pharmaceutical problems. That’s enough to justify the investment for specific use cases, but not revolutionary. The quantum-as-a-service moment—when developers can spin up quantum instances like AWS EC2—is still years out. When it arrives, it’ll look like cloud computing’s 2006 moment: transformative, but only in hindsight.
What Developers Should Watch
Track IBM’s milestones. If Nighthawk achieves quantum advantage in 2026 with 16 times the circuit depth of current systems, the roadmap holds. If Cockatoo successfully links quantum chips together in 2027, multi-module entanglement is real. If Starling delivers 100 million gates on 200 logical qubits by 2029, IBM proves qLDPC works at scale. And if the IBM-Cisco proof-of-concept networks multiple quantum computers in 2030, the scale-out bet pays off.
Red flags to watch: timeline slippage on Loon, Kookaburra, or Cockatoo processors. Error rates plateauing above the 0.0001 percent threshold. Networking demonstrations staying confined to metro scale—50 to 100 kilometers—without scaling to continental distances. Commercial applications failing to materialize by 2026-2028.
Market indicators suggest momentum: quantum computing raised 3.77 billion dollars in the first nine months of 2025, nearly triple the 1.3 billion raised in all of 2024. The industry published 120 quantum error correction papers from January to October 2025, up from 36 in 2024. Market size projections climb from 1.8 billion in 2025 to 5.3 billion by 2029.
The quantum race is accelerating. IBM-Cisco’s networking bet is bold, technically risky, and strategically fascinating. It’s also the kind of moonshot that either redefines an industry or becomes a cautionary tale. We’ll know which by 2030.