
Meta Smart Glasses Lawsuit: Workers Viewed Intimate Footage

Meta marketed its Ray-Ban smart glasses as “designed for privacy, controlled by you.” The reality? Workers at a Kenya-based subcontractor reviewed footage of users having sex, using toilets, and in other intimate moments inside their homes. A class action lawsuit filed March 5 alleges Meta violated privacy laws and engaged in false advertising. Over 7 million people bought these glasses in 2025, and all their footage flows through this mandatory review pipeline.

The lawsuit follows a Swedish newspaper's investigation that exposed the practice. Workers weren't just seeing random street footage; they reviewed deeply personal moments users assumed were private. Nudity. Sex. Bathroom use. Inside bedrooms and living rooms. The kind of footage that, if you're wearing glasses marketed as privacy-first, you'd never expect another human to see.

Plaintiffs Gina Bartone of New Jersey and Mateo Canu of California are represented by Clarkson Law Firm, which specializes in public interest cases. The lawsuit names both Meta Platforms and Luxottica of America as defendants. The UK’s Information Commissioner’s Office has also launched a formal investigation into Meta’s data handling practices.

The Privacy Promise That Wasn’t

Meta’s marketing was explicit. The glasses were “designed for privacy” and “built for your privacy.” Meta spokesperson Christopher Sgro even stated: “That media stays on the user’s device.”

Except it doesn’t. When users capture photos or videos with the Ray-Ban Meta glasses, the content uploads to Meta’s servers, then gets transmitted to a third-party subcontractor in Kenya where human workers manually review and label the footage. This labeled data trains Meta’s AI models—the computer vision that powers the “Hey Meta” assistant.
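Meta hasn't published its pipeline internals, so what follows is a minimal, hypothetical sketch of the capture-to-labeling flow the complaint describes. Every function, class, and value here is invented for illustration; it shows the general pattern, not Meta's actual code.

```python
# Hypothetical sketch of the capture-to-labeling flow alleged in the
# complaint. All names are invented; this is not Meta's architecture.
from dataclasses import dataclass, field


@dataclass
class Capture:
    """A photo or video taken on the glasses."""
    user_id: str
    media: bytes
    labels: list[str] = field(default_factory=list)


def upload_to_cloud(capture: Capture, cloud_store: list[Capture]) -> None:
    # Step 1: despite "stays on the user's device" marketing, captured
    # media reaches company servers once it is synced or shared.
    cloud_store.append(capture)


def route_to_labeling(cloud_store: list[Capture]) -> list[Capture]:
    # Step 2: uploaded media is forwarded to a third-party labeling
    # workforce. Note that nothing here filters out intimate footage.
    return list(cloud_store)


def human_label(capture: Capture) -> Capture:
    # Step 3: a human reviewer watches the footage and attaches the
    # labels that later train computer-vision models.
    capture.labels.append("label-from-human-reviewer")
    return capture


# End-to-end: device -> cloud -> contractor -> training set.
store: list[Capture] = []
upload_to_cloud(Capture(user_id="u123", media=b"..."), store)
training_set = [human_label(c) for c in route_to_labeling(store)]
print(f"{len(training_set)} item(s) labeled for model training")
```

The detail that matters is step 2: a pipeline like this has no notion of what the footage contains, so a bedroom clip gets routed to reviewers exactly like a street scene.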

The lawsuit argues: “No reasonable consumer would understand ‘designed for privacy, controlled by you’ to mean that deeply personal footage from inside their homes would be viewed and catalogued by human workers overseas.”

Meta’s privacy policy does mention that review “may be automated or manual (human),” but that disclosure is buried in supplemental terms most users never read. The gap between “designed for privacy” billboard marketing and fine-print disclosures is what the lawsuit targets.

Why Human Reviewers Exist

This isn’t unique to Meta. AI training requires labeled data, and labeling requires humans. OpenAI, Google, Microsoft, and Amazon all outsource data labeling to countries like Kenya, Uganda, and India where labor costs are low.

A TIME investigation found that OpenAI paid its contractor $12.50 per hour per worker for ChatGPT data labeling, yet workers actually received just $2 per hour, meaning the outsourcing firm kept more than 80 percent of the rate. They reviewed traumatic content without mental health support. The work is grueling, the pay exploitative, and the conditions toxic. Kenya's Data Labelers Association was formed specifically to fight these conditions.

The problem isn’t that AI training needs human reviewers. The problem is companies marketing privacy while hiding the fact that strangers are watching your most intimate moments for $2 an hour.

The Stakes Are Massive

Meta sold over 7 million smart glasses in 2025, triple the previous year's sales, and the company was reportedly discussing scaling production to 20 million units by the end of 2026. This class action could cover every U.S. user who bought the glasses based on Meta's privacy marketing.

The UK investigation adds regulatory risk. Under the UK GDPR, fines can reach 4% of global annual revenue; against Meta's roughly $165 billion in 2024 revenue, that's a potential penalty of more than $6.5 billion.

More importantly, this lawsuit could set precedent for the entire wearable AI industry. If Meta loses, every company making AI glasses, AR headsets, or smart wearables will face pressure to disclose human review practices upfront—not in buried terms of service paragraphs.

What Happens Next

The class action certification process is underway. Discovery will force Meta to produce internal documents about what executives knew, when they knew it, and what the marketing and legal teams discussed. If the case goes to trial, it could take two to three years. Investigations like the ICO's typically take 12 to 18 months.

If you own Ray-Ban Meta glasses, you should review your privacy settings immediately. You can disable AI features through the companion app, though Meta removed the option to prevent voice recording storage in April 2025.

The larger question is whether wearable AI can ever be truly private. Cloud-based AI offers powerful capabilities but requires sending your data to servers. On-device AI preserves privacy but sacrifices features. The industry has chosen power over privacy. This lawsuit asks: did they at least owe users honesty?

