Mar 31, 2026 · 5 min read
OkCupid Fed 3 Million Users' Photos to a Facial Recognition AI—The Fine Was $0
The FTC just settled with Match Group after OkCupid secretly shared intimate profile photos and location data with an AI company that works with the military.
What Happened
In 2014, OkCupid handed nearly three million users' profile photos to Clarifai, a Washington, DC-based artificial intelligence company specializing in facial recognition. Along with the photos, OkCupid shared demographic and location data. There was no formal data-sharing agreement, no contractual restrictions on how the data could be used, and no notice to users.
The connection between the two companies was not random. OkCupid's original founders were investors in Clarifai, a relationship the FTC says motivated the insider data transfer.
A Decade of Denial
When The New York Times reported in 2019 that Clarifai's founder had acknowledged building a face database from OkCupid images, the dating app told the paper it had entered into no commercial agreement. According to the FTC's complaint, Match Group and OkCupid took "extensive steps to conceal" the arrangement and actively tried to obstruct the federal investigation.
OkCupid's privacy policy at the time stated the company would not share personal information with others except for service providers, business partners, and entities within its family of businesses. Clarifai fit none of those categories.
Where Those Photos Ended Up
Clarifai is not just any startup. The company has actively pursued government and military contracts, including participation in Project Maven, a US Department of Defense AI program. It markets facial recognition, visual search, and content moderation services to defense, intelligence, and law enforcement agencies.
Once integrated into AI training datasets, photos cannot be meaningfully "returned." The FTC settlement does not require Clarifai to delete the models trained on OkCupid users' images. As former FTC director of public affairs Douglas Farrar noted: "Clarifai still has those images."
The Settlement
The FTC imposed a proposed 20-year consent order on Match Group and Humor Rainbow, OkCupid's operating entity. Under the terms, the companies are permanently prohibited from misrepresenting how they collect, use, or share personal information. For the next 10 years, they must maintain records, submit compliance reports, and respond to FTC monitoring requests.
The financial penalty: zero dollars. Match Group, a company with $3.2 billion in annual revenue, paid nothing. Critics were quick to respond. Emily DiVito from the Groundwork Collaborative observed that "Match Group was just fined $0," while others argued the case warranted criminal prosecution.
Why This Matters Beyond Dating Apps
The OkCupid case illustrates a pattern that extends across the tech industry. Companies collect intimate personal data under one set of promises, then quietly repurpose it for something entirely different. Dating profiles are particularly sensitive because they contain photos users chose to share in a specific context, not to train surveillance technology.
This same dynamic plays out with email tracking pixels, browser fingerprinting, and location data harvesting. Companies collect data quietly, use it in ways users never anticipated, and face minimal consequences when caught.
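To make the tracking-pixel mechanism concrete, here is a minimal sketch of how one works: an email embeds a tiny invisible image whose URL carries a unique identifier, so fetching the image tells the sender exactly who opened the message and when. The `mailer.example.com` domain, the `r`/`c` query parameters, and the helper name are all hypothetical illustrations, not any real vendor's API.

```python
import base64
from urllib.parse import urlencode

# Smallest common transparent 1x1 GIF, base64-encoded. This is the actual
# image payload a tracking server returns; the real "data collection" happens
# in the server's request log, not in the image itself.
PIXEL_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def tracking_pixel_url(base_url: str, recipient_id: str, campaign: str) -> str:
    """Build the URL placed in the email's <img> tag (hypothetical parameters).

    When the recipient's mail client fetches this URL, the server learns the
    message was opened, plus the reader's IP address and user agent.
    """
    return f"{base_url}?{urlencode({'r': recipient_id, 'c': campaign})}"

url = tracking_pixel_url("https://mailer.example.com/open.gif",
                         "user-8213", "spring-promo")
print(url)
print(len(PIXEL_GIF), "bytes of image payload")
```

The point of the sketch is how little the visible payload matters: the image is a few dozen bytes, while the identifying information rides entirely in the URL and the HTTP request metadata. This is why some mail clients now proxy or block remote images by default.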
How to Protect Yourself
The FTC's settlement puts the burden on regulators, not users. But there are practical steps you can take to limit exposure:
- Review the privacy policies of dating apps and social platforms before uploading photos
- Use platforms that allow you to control whether your images can be used for AI training
- Regularly audit which apps have access to your photos and location data
- Consider using images that do not appear on other platforms to limit cross-referencing
- Support legislation that requires companies to obtain explicit consent before sharing biometric data
The broader lesson is that any data you share online can end up somewhere you never intended. Companies have shown repeatedly that privacy policies are aspirational documents, not binding contracts, until regulators with real enforcement power say otherwise. Understanding how companies like data brokers monetize your personal information is the first step toward protecting it.