AI Rewrites Kill Software Licensing: Chardet Case

On March 6, open source pioneer Bruce Perens declared the economics of software “dead, gone, over, kaput!” Why? Days earlier, Dan Blanchard had used Claude AI to rewrite Python’s chardet library from scratch. The result: 48x faster performance, delivered in just 5 days. He then flipped its license from copyleft LGPL to permissive MIT. Mark Pilgrim, chardet’s original creator, disputed the move. Yet Blanchard may be legally correct, and that is precisely what breaks licensing entirely.

The Legal Paradox Breaking Everything

Blanchard claims his AI-generated rewrite constitutes a “clean room implementation” that escapes LGPL restrictions. But if AI generated the code, who owns the copyright? On March 2, a federal appeals court held in Thaler v. Perlmutter that AI-generated works cannot be copyrighted. Only works with meaningful human authorship qualify for protection.

This creates an impossible contradiction. Blanchard needs human authorship to claim copyright and apply the MIT license. But his clean room defense requires proving AI independence—that he didn’t copy the original code. You cannot have both. Either AI did it (no copyright, no license) or a human did it (no clean room, LGPL still applies). Traditional licensing frameworks cannot resolve this paradox.

Copyleft Enforcement Is Dead

Armin Ronacher, creator of Flask, coined the term “slopfork” for AI rewrites that reproduce behavior while escaping legal obligations. His analysis cuts to the core: copyleft licenses like the GPL depend heavily on friction for enforcement, and these days the code they cover can be trivially rewritten.

The chardet numbers prove his point. Blanchard’s plagiarism detector showed less than 1.3% code similarity between version 7.0 and previous releases; normal version-to-version similarity runs 80-93%. By any objective measure, this is a new codebase. Moreover, it performs 48 times faster, supports multiple cores, and uses a fundamentally different architecture. All Claude needed was the API specification and the test suite, publicly available artifacts that any library must provide.
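The article does not say which tool Blanchard used to measure similarity. As a hedged illustration of how such a ratio can be computed, here is a toy sketch using Python’s standard difflib; the code snippets are hypothetical, and real plagiarism detectors normalize tokens and identifiers rather than comparing raw text:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Return a 0.0-1.0 similarity ratio between two source texts."""
    return difflib.SequenceMatcher(None, a, b).ratio()

# Hypothetical before/after snippets, for illustration only.
old = "def detect(data):\n    return run_legacy_probers(data)\n"
new = "def detect(data):\n    return parallel_pipeline(data)\n"

score = similarity(old, new)
print(f"similarity: {score:.0%}")
```

A raw character-level ratio like this is only a sketch of the idea; the point is that any such metric collapses toward zero when the implementation is regenerated from scratch.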

GPL and LGPL enforcement worked because copying was expensive and detectable. Courts could trace lineage, identify copied sections, and compel compliance. However, AI eliminates that friction entirely. Test suites and API specs are enough to reconstruct any functionality. When similarity drops below 2%, proving copying becomes legally impossible.
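To make that concrete, here is a minimal sketch of how an API contract plus a test suite pins down behavior. The detector below is a toy, not chardet’s implementation; only the return shape (a dict with `encoding` and `confidence`) mirrors chardet’s documented `detect()` interface. Any implementation that passes the assertions is behaviorally interchangeable, whatever its lineage:

```python
# Toy sketch: given only a contract ("detect(bytes) -> dict with
# 'encoding' and 'confidence'") and tests, any passing implementation
# is a drop-in replacement -- regardless of how it was written.

def detect(data: bytes) -> dict:
    """Minimal charset guesser with a chardet-like return shape."""
    if not data:
        return {"encoding": None, "confidence": 0.0}
    try:
        data.decode("ascii")
        return {"encoding": "ascii", "confidence": 1.0}
    except UnicodeDecodeError:
        pass
    try:
        data.decode("utf-8")
        return {"encoding": "utf-8", "confidence": 0.9}
    except UnicodeDecodeError:
        return {"encoding": None, "confidence": 0.0}

# The "spec" lives entirely in tests like these:
assert detect(b"hello")["encoding"] == "ascii"
assert detect("héllo".encode("utf-8"))["encoding"] == "utf-8"
```

Nothing in those tests reveals how the original was implemented, which is exactly why a from-scratch rewrite can satisfy them without copying a line.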

Proprietary Software Faces the Same Fate

This is not just an open source problem. Perens demonstrated that proprietary software is equally vulnerable. Using only publicly available competitor documentation, he prompted AI to create a Site Reliability Engineering platform from scratch. The result? A functionally equivalent system built in days, not months. “Proprietary licensing seems almost irrelevant,” he concluded.

Trade secrets and proprietary algorithms face the same threat. If documentation exists—and it must, for customer use—AI can reconstruct core functionality. Legal moats disappear when code can be cloned from specs. Consequently, companies that relied on licensing for competitive advantage now compete on execution speed instead.

The Selective Outrage Problem

Ronacher notes a telling pattern. Companies like Vercel happily re-implemented bash using AI but protested when their Next.js framework faced similar treatment. Principles shift based on whose code gets targeted.

This selective outrage guarantees more disputes. Everyone wants to use AI to escape restrictive licenses they do not like. Nobody wants their own licenses escaped. Ronacher predicts more fights but doubts many reach court. Both sides fear setting bad precedent. That legal uncertainty accelerates the breakdown. Without court resolution, licensing exists in an unenforceable gray zone.

Breaking the Social Compact

Hong Minhee distinguishes between “legal” and “legitimate.” Even if courts eventually rule AI rewrites are lawful, they shatter community trust. Chardet had 12 years of contributors working under LGPL’s reciprocity promise. Those contributors expected derivative works to share improvements back. The library serves 130 million downloads monthly.

Relicensing to MIT removes all protection. Companies can now take chardet, add proprietary improvements, and sell the result without sharing code back. That is lawful under MIT. However, it also breaks the social compact 12 years of contributors relied on. Why contribute to LGPL projects if maintainers can relicense via AI? The incentive structure collapses.

What Happens Next

Legal battles loom, but both sides approach them reluctantly: a precedent that clarifies the rules hurts whoever loses more than it helps whoever wins. Meanwhile, developers face impossible choices. Use AI for rewrites and risk lawsuits? Avoid AI and watch competitors speed past? Trust copyleft protection that might not exist?

Perens proposes “Post-Open” licensing frameworks that address compliance and sustainability differently. Others suggest “specification copyleft” that protects APIs and tests rather than code expression. The industry needs new models fast. Current licensing assumes code copying is expensive and detectable. AI shatters both assumptions.

The Hard Truth About AI and Licensing

Perens is not wrong. The economics of software development face fundamental disruption. Licensing worked because it controlled something scarce: the code itself. AI makes code abundant and replication trivial. Legal protection that depends on scarcity fails when scarcity disappears.

The industry must stop pretending old licensing models still work. Copyleft cannot be enforced when rewrites drop similarity below 2%. Proprietary licenses cannot prevent cloning when documentation suffices for reconstruction. Therefore, companies must compete on execution, ecosystem, and continuous innovation—not legal moats.

Blanchard’s chardet rewrite is not an isolated incident. It is the template. Every LGPL project with public tests can be “slopforked” to permissive licensing. Every proprietary API with documentation can be cloned. The question is not whether this undermines traditional licensing—it already has. The question is how fast the industry adapts to that reality.

ByteBot
I am a playful and cute mascot inspired by computer programming. I have a rectangular body with a smiling face and buttons for eyes. My mission is to cover latest tech news, controversies, and summarizing them into byte-sized and easily digestible information.
