Anthropic, OpenAI, and Block just did something unusual in the AI industry: they donated competing infrastructure projects to a neutral home under the Linux Foundation. The Model Context Protocol (MCP), with 97 million monthly downloads, is moving to community governance alongside OpenAI’s AGENTS.md and Block’s goose framework. The question isn’t whether this matters; it’s whether Big Tech will actually honor the commitment.
What MCP Is and Why Developers Should Care
MCP standardizes how AI applications connect to external tools and data sources. Think of it like USB-C for AI—one protocol that works across Claude, ChatGPT, Cursor, VS Code, Copilot, and Gemini.
Before MCP, every AI platform had proprietary integration methods. Want to connect your chatbot to GitHub? Build a custom integration for ChatGPT. Then rebuild it for Claude. Then again for Cursor. That’s the M×N problem: M applications times N tools means M×N separate integrations, and a lot of wasted developer time.
MCP solves this by providing a universal standard. Build one MCP server, and it works everywhere. Over 10,000 active MCP servers already exist, covering everything from GitHub integration to database access, with deployments reaching Fortune 500 enterprises.
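To make that concrete, here is a minimal sketch of an MCP server built with the official Python SDK’s FastMCP helper. The server name and the single tool below are illustrative placeholders, not a real GitHub integration; the point is that any MCP-capable host can launch the same script over stdio.

```python
# pip install mcp  (the official Model Context Protocol Python SDK)
from mcp.server.fastmcp import FastMCP

# One server, usable from any MCP-capable host (Claude Desktop, Cursor, VS Code, ...).
mcp = FastMCP("demo-issues")

@mcp.tool()
def count_open_issues(repo: str) -> int:
    """Return the number of open issues for a repository (stubbed for illustration)."""
    # A real server would call the GitHub API here; the stub keeps the sketch self-contained.
    return 0

if __name__ == "__main__":
    # Defaults to the stdio transport, which is what desktop hosts launch.
    mcp.run()
```

Register that script once in a host’s MCP configuration and every tool it exposes becomes available to that host, with no per-platform glue code.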
The practical impact is immediate. Developers can now build integrations once and deploy them across every major AI platform. Vendor lock-in gets harder when switching costs drop to near zero.
Competitors Actually Collaborating—For Now
The new Agentic AI Foundation (AAIF) brings together three projects that should theoretically compete. Anthropic donated MCP. OpenAI contributed AGENTS.md, a Markdown convention for giving coding agents project-specific instructions, already used by 60,000+ repositories. Block added goose, a local-first AI agent framework.
Eight companies signed on as platinum members: AWS, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI. That’s the entire AI industry sitting at one table, agreeing to neutral governance.
This follows the Kubernetes playbook. Google open-sourced Kubernetes and handed it to the Linux Foundation’s Cloud Native Computing Foundation. Container orchestration became standardized, and the ecosystem flourished. The AAIF is betting the same pattern works for AI infrastructure.
Jim Zemlin, Linux Foundation executive director, framed it as essential for trust. “Bringing these projects together under the AAIF ensures they can grow with the transparency and stability that only open governance provides.”
The subtext is clear: enterprises won’t bet on vendor-controlled standards. They need neutral stewardship.
The Skeptical Take
History offers reasons for caution. Microsoft pioneered “embrace, extend, extinguish”—adopt an open standard, add proprietary extensions, then use market dominance to marginalize competitors who can’t support the additions. The US Department of Justice documented this strategy during antitrust proceedings.
Internet Explorer embraced Netscape’s web standards, extended them with Windows-only ActiveX controls that undercut cross-platform Java, then used Windows bundling to extinguish Netscape. A 1998 Bill Gates memo argued against letting Office documents render well in competing browsers. Microsoft is now an AAIF platinum member. The irony is hard to ignore.
Even within the AAIF, contradictions exist. OpenAI donated AGENTS.md to the foundation but simultaneously ships AgentKit with proprietary APIs designed to create vendor lock-in. Anthropic open-sourced MCP while building Claude-specific features elsewhere. Every vendor claims neutrality while hedging their bets.
The real test isn’t the press release—it’s whether vendors prioritize MCP compatibility when it conflicts with proprietary features. Will Google extend MCP with Gemini-only capabilities? Will Microsoft add Azure-specific requirements? Open governance only works if members actually follow the rules.
What Developers Should Do
Despite the risks, MCP adoption makes strategic sense. Vendor lock-in is real—66 percent of teams upgrade within the same vendor rather than switching—but 37 percent of enterprises now deploy five or more AI models specifically to reduce dependency on any single provider.
MCP lowers switching costs. If you build on the standard, migrating from Claude to ChatGPT or Gemini becomes a configuration change instead of a rewrite. That’s valuable even if vendor extensions eventually fragment the ecosystem.
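The same decoupling shows up in application code. The sketch below uses the MCP Python SDK’s client API to launch a server over stdio and list its tools; the server command is a placeholder for the server sketched earlier or any published MCP server, and the model that ultimately consumes those tools (Claude, GPT, Gemini) is chosen elsewhere in your stack, not here.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Which MCP server to launch is configuration, not application logic.
# "issue_server.py" is a placeholder name for illustration.
SERVER = StdioServerParameters(command="python", args=["issue_server.py"])

async def main() -> None:
    async with stdio_client(SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.list_tools()
            print([tool.name for tool in result.tools])

asyncio.run(main())
```

Swap the host or the model on top and this layer stays the same; that is the switching-cost reduction in practice.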
The realistic scenario isn’t pure standardization or complete fragmentation. It’s probably something in between—MCP establishes baseline compatibility while vendors differentiate with extensions, similar to how cloud providers handle Kubernetes. Compatible but not identical.
For developers, that means: use MCP for new integrations, but stay platform-agnostic in your architecture. Explore the 10,000+ existing MCP servers. Watch for vendor behavior that breaks compatibility. And remember that open governance only works if someone enforces it.
The AAIF announcement is significant. Whether it leads to real interoperability or just delays inevitable vendor lock-in depends on what happens next.