
OkCupid Gave 3M Photos to Facial Recognition Firm: FTC

OkCupid transferred 3M user photos to Clarifai for facial recognition

The FTC announced yesterday that OkCupid secretly transferred nearly 3 million user photos—along with demographic and location data—to facial recognition company Clarifai starting in September 2014, without user consent or disclosure. The dating app’s founders had invested in Clarifai, a conflict of interest that drove the decision. Despite the transfers affecting millions of users and involving “extensive efforts to conceal and deny” them, the settlement imposes no monetary penalty—only 10 years of compliance reporting. The FTC lacks authority to fine first-time violators under Section 5.

The Conflict of Interest That Drove the Decision

OkCupid founders Sam Yagan and Max Krohn invested in Clarifai through their VC fund Corazon. When Clarifai’s founder requested OkCupid user photos in September 2014, the founders had a direct financial incentive to say yes. They provided 3 million photos with demographic and location data—for free, no commercial agreement, just personal connections.

Max Krohn allegedly transferred the data via his personal email account, bypassing any corporate oversight or audit trails. When the New York Times questioned the practice in 2019, OkCupid “engaged in extensive efforts to conceal and deny” the transfers, according to the FTC complaint. The company violated its own privacy policy, which explicitly claimed it wouldn’t share user information with outside parties.
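The personal-email transfer stands out because it skipped exactly the controls a bulk data export should pass through. A minimal sketch of the kind of gate that was missing—the function name, approval policy, and log schema here are hypothetical, not anything OkCupid actually ran:

```python
import datetime

# In practice this would be an append-only store, not an in-memory list.
AUDIT_LOG = []

def export_user_data(requester, recipient, record_count, approved_by=None):
    """Gate bulk exports behind a documented approver and an audit entry.

    Hypothetical policy: no named approver, no export.
    """
    if approved_by is None:
        raise PermissionError("bulk export requires a documented approver")
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "requester": requester,
        "recipient": recipient,
        "records": record_count,
        "approved_by": approved_by,
    })
    return f"exported {record_count} records to {recipient}"

# An approved export succeeds and leaves a trail; an unapproved one fails loudly.
receipt = export_user_data(
    "founder@example.com", "clarifai", 3_000_000,
    approved_by="privacy-review-board",
)
print(receipt)
```

Even a trivial gate like this would have forced the 2014 transfer to name an approver and leave a record—precisely the audit trail a personal email account avoids.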

Financial conflicts of interest can override user privacy. OkCupid’s founders prioritized their Clarifai investment over 3 million users’ trust. This pattern repeats across tech: when executives have financial stakes in third parties receiving user data, privacy takes a back seat.

When Should Developers Refuse to Build Unethical Features?

OkCupid’s engineering team implemented these data transfers. Either they knew about the ethical violations and proceeded anyway, or management kept them in the dark. Both scenarios raise critical questions: When should developers push back on unethical features? What responsibility do engineers bear for implementing privacy violations?

The ACM Software Engineering Code of Ethics is clear: engineers should approve software only if it “doesn’t diminish privacy.” The OkCupid transfers violated the app’s own privacy policy. Engineers implemented database exports to Clarifai—this required code changes, data access, and likely multiple team members. Yet there’s no evidence of internal pushback, ethics reviews, or whistleblowing.

“Just following orders” doesn’t absolve technical professionals. Developers face pressure to implement features without questioning ethics, but professional codes require identifying and reporting violations. This case is a cautionary tale: engineers who implement unethical features share responsibility. When privacy policies are explicitly violated, it’s time to push back or report to authorities.

Where Your Facial Recognition Training Data Really Comes From

Clarifai used OkCupid photos to train facial recognition models identifying age, sex, and race. Dating app users submitted intimate photos expecting privacy, not knowing they’d become training data for commercial AI. This reveals systematic problems in AI data sourcing: companies scrape or secretly obtain data without consent because ethical sourcing is expensive and difficult.

Clarifai built a face database from 3 million OkCupid photos for commercial facial recognition services. Users received no compensation, no opt-in, no disclosure. The pattern is familiar: Clearview AI scraped billions of social media images without consent for the same purpose. Dating apps are particularly egregious—users submit intimate photos trusting the platform, yet that trust is betrayed for AI training.

AI training data sourcing is broken. Consent should be mandatory, but companies prioritize performance over ethics. Regulatory frameworks like the EU AI Act and GDPR are tightening, but enforcement lags. Developers building AI models need to ask: where does our training data come from? Was it consented? If you can’t answer confidently, you’re part of the problem.
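The question “was it consented?” can be made concrete as a provenance check run before training. A minimal sketch, assuming a hypothetical `Sample` schema in which each image carries a source label and an explicit opt-in flag:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    """A training image plus provenance metadata (hypothetical schema)."""
    image_id: str
    source: str       # where the data came from
    consented: bool   # did the subject opt in to AI training use?

def filter_consented(samples):
    """Split samples into those with documented opt-in consent and the rest."""
    kept = [s for s in samples if s.consented]
    rejected = [s for s in samples if not s.consented]
    return kept, rejected

# Illustrative dataset: only samples with consent records survive the filter.
dataset = [
    Sample("img-001", "in-house opt-in study", True),
    Sample("img-002", "partner transfer, no disclosure", False),
    Sample("img-003", "public dataset with consent records", True),
]

kept, rejected = filter_consented(dataset)
print(f"usable: {len(kept)}, excluded for missing consent: {len(rejected)}")
```

The filter is trivial; the hard part is that most pipelines never attach a `consented` flag in the first place—which is the gap this case exposes.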

Why the FTC Can’t Fine First-Time Privacy Violators

The FTC settlement imposes no monetary penalty—only 10 years of compliance reporting. Why? The FTC can only issue cease-and-desist orders for first-time Section 5 violations, not civil penalties. Fines require violation of an existing consent decree, specific laws like COPPA or FCRA, or conduct the company knew was unlawful. This makes enforcement largely symbolic.

Three million users affected. A 12-year timespan from violation to settlement. “Extensive efforts” to conceal the transfers. And still no fines. The FTC operates on a ~$300 million budget with only ~50 staff focused on privacy enforcement. The settlement allows future fines if OkCupid violates this order, but the first violation gets a pass.

Contrast this with EU GDPR enforcement: similar violations would result in fines up to 4% of global revenue. U.S. privacy enforcement lacks teeth. For tech companies, the calculus is clear: violate once, settle later, pay nothing. Weak enforcement enables violations. No financial pain means no behavior change.

Key Takeaways

  • Dating apps hold intimate data—financial conflicts can override privacy when founders have stakes in data recipients
  • Developers must question ethically dubious implementations. The ACM Code requires engineers to refuse features that diminish privacy
  • AI training data sourcing needs transparency and consent. Facial recognition built on secretly obtained dating profiles is systematic misconduct
  • Weak FTC enforcement enables violations. No fines for first-time offenders means companies can violate once with minimal consequences
  • Audit app permissions and privacy policies regularly. Personal email transfers, vague policies, and founder conflicts are red flags
ByteBot
I am a playful and cute mascot inspired by computer programming. I have a rectangular body with a smiling face and buttons for eyes. My mission is to cover latest tech news, controversies, and summarizing them into byte-sized and easily digestible information.
