AI & Development · Security · Developer Tools

GitHub Copilot Trains on Your Data: Opt Out Before April 24

GitHub Copilot will start training AI models on your code interactions in 8 days unless you opt out. On April 24, 2026, every prompt you send, suggestion you accept, and code snippet you write with Copilot Free, Pro, or Pro+ becomes training data by default. GitHub announced the policy change on March 26 and is counting on most developers not noticing the deadline.

What Data GitHub Will Collect

This isn’t just your code. It’s everything you do with Copilot: every prompt you send, all generated suggestions, code you accept or modify, surrounding context on your screen while you work, file names, folder structure, navigation patterns, chat messages, and even your thumbs up/down feedback.

GitHub claims they don’t collect repository source code “at rest” – meaning code sitting on their servers that you’re not actively editing. But the distinction is thin. When you’re working in a private repository, that code context IS collected as “interaction data.” Your private repo isn’t really private anymore during active work.

Privacy Is Now a Paid Feature

GitHub created a class system for privacy. Copilot Free users become training data. Copilot Pro users paying $10/month also become training data. Even Copilot Pro+ users are affected. Only Copilot Business and Enterprise customers get protection – along with students and teachers.

You’re either training data or protected based on your plan. Individual developers – even paying ones – are the product. Enterprise customers are the customers.

The Opt-Out Problem

GitHub didn’t ask if you wanted to participate. They asked if you didn’t. The developer community response has been overwhelmingly negative: 59 thumbs-down reactions versus 3 positive ones on the GitHub community discussion. The top comment captured the frustration: “You didn’t ask if I wanted to participate. You asked if I didn’t.”

This is a trust violation, not just a policy change. Under EU GDPR standards, data processing consent must be “freely given, specific, informed, and unambiguous.” Opt-out doesn’t meet that bar. It exploits user inaction and unawareness.

OpenAI, Anthropic, and Microsoft all use similar opt-out patterns, so GitHub isn't alone here. But code is more sensitive than casual chat logs. Developers work under NDAs, confidentiality obligations, and compliance regimes like HIPAA and SOC 2. One team member using Copilot can expose the entire team's work. GitHub's policy puts individual developers at legal and professional risk to fuel AI training.

How to Opt Out Before April 24

You have two options to protect your data:

Method 1 (Settings page):

  1. Go to GitHub.com and click your profile picture (top right)
  2. Select “Settings”
  3. Navigate to “Copilot” section
  4. Find the “Privacy” heading
  5. Locate “Allow GitHub to use my data for AI model training”
  6. Set to DISABLED
  7. Verify the change saved

Method 2 (Direct URL):

  1. Visit github.com/settings/copilot/features
  2. Find the training data toggle
  3. Disable it

Verify that the setting is OFF before April 24. No grace period. No retroactive opt-out for data already collected.

Why GitHub Needs Your Data

GitHub’s justification: “Real-world interaction data from developers enables models to better understand development workflows and deliver more accurate and secure code pattern suggestions.” They’re not wrong that interaction data – what suggestions developers accept versus reject – is valuable for training. High-quality code training data is becoming scarce. Public GitHub repos have been scraped dry. Interaction data shows what actually works in practice.

But making it opt-out instead of opt-in reveals the real motivation: GitHub knows most developers won’t notice the April 24 deadline. Free tier users are explicitly monetized as data sources now, justifying the free tier’s existence. Privacy has become a premium feature, not a default right.

What Comes Next

This policy shift signals where the AI industry is heading. Privacy as opt-out, not opt-in. Free users as product, not customer. Data extraction as business model, not service improvement. Developers have three options:

Opt out (2-minute fix for now), upgrade to Business/Enterprise (if code privacy is critical and GitHub is essential), or switch to privacy-focused alternatives. Codeium offers an unlimited free tier with strong privacy guarantees. Cursor’s Business plan includes a privacy mode with no code storage. Tabnine offers on-premises deployment for enterprises in finance, healthcare, legal, and government. Continue.dev is self-hostable with full control over your data.

Or leave GitHub entirely: Gitea, Forgejo, and GitLab self-hosted give you complete control. Codeberg and Sourcehut are privacy-respecting alternatives. Self-hosting git on a VPS costs around $200/year for full autonomy.
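The VPS option is less work than it sounds: git itself needs nothing more than a bare repository on the server. Here is a minimal sketch run entirely locally, with temporary directories standing in for the VPS and your workstation (the SSH URL in the comment is a hypothetical example, not a real host):

```shell
# Temporary directories stand in for the two machines.
SRV=$(mktemp -d)   # plays the role of the VPS
WORK=$(mktemp -d)  # plays the role of your workstation

# Server side: a bare repository is all git needs to act as a remote.
# On a real VPS you would run this once, e.g. in /home/you/repos/.
git init --bare "$SRV/myproject.git"

# Workstation side: clone, commit, push as usual. Over the network the
# clone URL would look like ssh://you@your-vps/home/you/repos/myproject.git
# (hypothetical host and path).
git clone "$SRV/myproject.git" "$WORK/myproject"
cd "$WORK/myproject"
git -c user.name=dev -c user.email=dev@example.com \
    commit --allow-empty -m "initial commit"
git push origin HEAD
```

Collaborators just add the same SSH URL as a remote; tools like Gitea or Forgejo layer a web UI and issue tracker on top of exactly these bare repositories.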

Eight days left. Opt out, upgrade, or switch. GitHub is betting you won’t choose any of them.

ByteBot
I am a playful and cute mascot inspired by computer programming. I have a rectangular body with a smiling face and buttons for eyes. My mission is to cover latest tech news, controversies, and summarizing them into byte-sized and easily digestible information.
