PyPy, the alternative Python implementation that delivers up to 5x performance improvements over CPython through JIT compilation, is collapsing. In March 2026, the uv package manager merged a PR warning users that “PyPy is not actively developed anymore,” prompting PyPy core developer CF Bolz-Tereick to acknowledge that “the remaining core devs don’t have the capacity to keep up with CPython.” NumPy officially dropped PyPy support on March 8, 2026. With only 2-4 commits per month and no release since July 2025, PyPy is stuck on Python 3.11 while CPython has advanced to 3.13.
This isn’t just about PyPy; it’s a symptom of systemic failure in open-source sustainability. PyPy delivers the 5x speedup that Microsoft’s multi-million-dollar Faster CPython project couldn’t achieve, yet it can’t find funding. While venture-backed AI startups burn billions for marginal productivity gains, a proven performance tool for a language used by 85% of developers dies from neglect.
PyPy’s Performance Advantage: 5x Speedup Microsoft Couldn’t Match
PyPy achieves a 3-5x average speedup over CPython through JIT compilation, with some compute-intensive workloads showing up to 18x improvement. This isn’t theoretical: production deployments reported on Hacker News handle roughly 500 requests per second per CPU core, and Quora publicly runs PyPy in production.
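The kind of workload where PyPy’s tracing JIT shines is a pure-Python hot loop that CPython interprets bytecode-by-bytecode. A minimal, illustrative benchmark sketch (the workload, sizes, and function name here are my own, not drawn from any cited deployment; actual speedups vary by workload):

```python
import time


def count_inversions(nums):
    """Naive O(n^2) pure-Python hot loop -- the kind of tight
    numeric code PyPy's tracing JIT typically accelerates most."""
    inversions = 0
    n = len(nums)
    for i in range(n):
        for j in range(i + 1, n):
            if nums[i] > nums[j]:
                inversions += 1
    return inversions


if __name__ == "__main__":
    import platform
    import random

    random.seed(42)
    data = [random.randint(0, 10_000) for _ in range(3_000)]
    start = time.perf_counter()
    result = count_inversions(data)
    elapsed = time.perf_counter() - start
    print(f"{platform.python_implementation()}: "
          f"{result} inversions in {elapsed:.3f}s")
```

Running the same script under `python` and `pypy3` and comparing the elapsed times is the simplest way to see the gap on your own hardware.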
Compare that to Microsoft’s Faster CPython initiative. From 2020 to 2025, a team of 6 engineers including Guido van Rossum aimed for 5x improvement following the Shannon Plan. They achieved 20-40% total speedup (Python 3.10 → 3.14) before Microsoft cancelled the project and laid off the team in May 2025. The industry spent millions trying to achieve what PyPy already delivers for free, then abandoned both efforts.
The message is clear: PyPy proves 5x Python performance is possible through JIT compilation. However, without commercial backing, even proven solutions die. Microsoft’s failure makes PyPy’s success more impressive—and its decline more tragic.
NumPy Drops PyPy Support: Ecosystem Signals Withdrawal
NumPy officially dropped PyPy support on March 8, 2026, removing extensive compatibility code accumulated over years. The technical reasons are damning: PyPy lacks Python 3.12 support (stuck on 3.11), refcounting inconsistencies, missing C-API features like PyErr_FormatV, and maintenance burden without testing.
NumPy maintainer Nathan Goldbaum captured the problem: “My worry about keeping all the pypy-specific code in-tree is that people will cargo cult it and it will likely be broken…without any testing.” When your workarounds include handling different ctypes.__mro__ behavior, broken np.resize() and memory mapping, and the inability to modify tp_doc after PyType_Ready, the maintenance cost exceeds the value.
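Libraries that carry this kind of workaround typically gate it behind an interpreter check at import time. A hedged sketch of the pattern (the `resize_buffer` helper and its branches are hypothetical, for illustration only; this is not NumPy’s actual code):

```python
import platform
import sys

# The common way libraries gate implementation-specific workarounds:
# detect the interpreter once, at import time.
IS_PYPY = platform.python_implementation() == "PyPy"
# An equivalent older idiom seen in many codebases:
# IS_PYPY = "__pypy__" in sys.builtin_module_names


def resize_buffer(buf, new_size):
    """Hypothetical helper that picks a code path per interpreter.

    Both branches produce the same result here; the point is the
    shape of the branch that maintainers must keep tested forever.
    """
    if IS_PYPY:
        # Workaround path: sidestep an operation assumed to behave
        # differently on this implementation.
        return buf[:new_size] + bytes(max(0, new_size - len(buf)))
    return (buf + bytes(new_size))[:new_size]
```

Goldbaum’s “cargo cult” worry is exactly about branches like the `IS_PYPY` one: once no CI job runs them, nobody knows whether they still work.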
For data science workloads—Python’s largest use case—PyPy is now unusable. This creates a vicious cycle: fewer users lead to less testing, which causes more bugs, which drives away more users. NumPy’s exit signals broader ecosystem withdrawal, not just a single library dropping support.
The Funding Paradox: Billions for AI, Zero for Proven Performance
The Python ecosystem has 85% developer adoption and serves a $823.92 billion global software market (2025, projected $2.248 trillion by 2034), yet PyPy cannot secure funding despite delivering 5x performance improvements. In contrast, venture-backed AI startups raised billions in 2025-2026 for unproven productivity tools.
This reveals warped industry priorities. We’ll pay billions for marginal AI gains but nothing for tools that reduce compute needs by 5x. It’s economically irrational and environmentally wasteful. Moreover, the market has failed to value critical infrastructure that delivers measurable, proven results.
The open-source sustainability crisis extends far beyond PyPy. In September 2025, package registries issued an open letter revealing that 82% of Maven Central consumption comes from less than 1% of worldwide IPs—hyperscalers using infrastructure for free while maintainers burn out. Furthermore, 60% of open-source maintainers remain unpaid despite serving billions of downloads monthly. The Open Source Endowment launched in 2025 to address this funding gap, but systemic change requires industry-wide commitment.
“Unmaintained” vs “Reduced Capacity”: What It Means for Production Users
The uv package manager’s merged PR labeling PyPy “unmaintained” sparked heated debate on Hacker News (194 points, 79 comments). PyPy core developer CF Bolz-Tereick pushed back: “PyPy isn’t unmaintained. We are certainly fixing bugs and are occasionally improving the jit. However, the remaining core devs…don’t have the capacity to keep up with cpython.”
The distinction matters for production users. “Unmaintained” implies abandonment: security patches won’t arrive, so migrate immediately. “Reduced capacity” suggests patches will still arrive, just slowly. Unclear communication erodes trust and accelerates abandonment even when a project isn’t dead yet.
The evidence supports “reduced capacity” more than “unmaintained”: 2-4 commits per month, a last release in July 2025 (8 months ago), and Python 3.12 support in progress from new contributors. Nevertheless, PyPy’s two-version lag behind CPython (3.11 vs 3.13) creates compatibility pressure that forces production users to choose: stick with PyPy and risk obsolescence, or migrate to CPython and accept a roughly 5x performance regression.
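In practice, that compatibility pressure surfaces as a version-floor check that PyPy can no longer pass. A minimal sketch (the 3.12 floor mirrors the dependency requirement described above; the function name is my own):

```python
import platform
import sys


def interpreter_supported(required=(3, 12)):
    """Return True if the running interpreter meets a dependency's
    minimum Python version. PyPy currently implements 3.11, so any
    3.12+ floor -- like the one NumPy now enforces -- fails here."""
    return sys.version_info[:2] >= required


if __name__ == "__main__":
    impl = platform.python_implementation()
    verdict = "supported" if interpreter_supported() else "unsupported"
    print(f"{impl} {sys.version_info.major}.{sys.version_info.minor}: {verdict}")
```

Every release that raises such a floor quietly removes PyPy from another dependency tree, which is how a two-version lag compounds into ecosystem exit.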
No Clear Alternative: PyPy’s Decline Leaves Performance Gap
PyPy has no viable successor for JIT-compiled Python performance. Alternatives exist but fall short. Pyston, a CPython fork with JIT optimizations, runs faster than CPython but slower than PyPy, with limited adoption after Dropbox abandoned it. GraalPy, a Java-based implementation, suffers from 100x slower startup time and +1GB RAM overhead—impractical for most use cases. Jython and IronPython target Java/.NET interop, not performance. Cython offers compile-time optimization, not runtime JIT.
CPython continues incremental gains (~5% per release), but without PyPy competition, Python performance may stagnate. Consequently, if PyPy dies, Python loses its only path to >2x performance improvements. Developers must choose: accept slower code, rewrite in Rust/Go, or scale horizontally (expensive, wasteful). The ecosystem loses a pressure valve for performance-critical workloads.
Key Takeaways
- PyPy achieves 3-5x average speedup (up to 18x on compute workloads) that Microsoft’s multi-million-dollar Faster CPython project couldn’t match; that team delivered a 20-40% total speedup before cancellation in May 2025
- NumPy dropped PyPy support on March 8, 2026, citing Python version lag (3.11 vs 3.13), refcounting issues, and maintenance burden—signaling broader ecosystem withdrawal
- The funding paradox: Python serves 85% of developers and an $823.92 billion market, yet PyPy can’t secure funding while AI startups raise billions for marginal gains
- “Unmaintained” vs “reduced capacity” terminology matters for production users—PyPy still ships bug fixes and JIT improvements but can’t keep pace with CPython’s release cycle
- No viable PyPy alternative exists—Pyston is slower, GraalPy has 100x startup overhead, Cython isn’t runtime JIT—leaving a performance gap if PyPy dies
- Open-source sustainability crisis affects all critical infrastructure: 82% of package registry consumption from <1% of IPs (hyperscalers), 60% of maintainers unpaid
PyPy’s decline represents a market failure where proven, critical infrastructure dies while hype-driven projects thrive. The industry must fund tools that deliver measurable value or accept degraded performance and higher infrastructure costs. Ultimately, we’re choosing to pay more for compute because we won’t pay anything for software that reduces compute needs by 5x.