OkCupid Fed Nearly 3 Million Faces to an AI Firm, and the Price Was: Nothing
United States – April 11, 2026 – OkCupid allegedly handed nearly 3 million user photos to Clarifai, and the FTC settlement reads like a permanent “don’t lie again” sign with no fine attached.
The newsroom coffee tastes like burnt compliance. Outside, sirens bounce off glass towers where the privacy policy is always a bedtime story and never a contract. On my screen: PDFs, press releases, and the same old loop. Take the data. Deny the data. Settle the case. Keep the leverage.
This one is not subtle. The Federal Trade Commission says OkCupid shared users’ personal information with an unrelated third party in September 2014, despite privacy promises that said otherwise. The alleged package was ugly: nearly three million user photos, plus location and other information. The FTC identifies the third party as Clarifai, known for AI tooling including facial recognition. The agency says users were not told, and were not given a chance to opt out.
Then came the long tail. The FTC alleges Match and OkCupid took extensive steps for years to conceal and deny the data sharing, including trying to obstruct the investigation. If that reads like a filing cabinet falling down a stairwell, good. It should.
What the FTC filed, and where it landed
On March 30, 2026, the FTC announced action against OkCupid and affiliate Match Group Americas, tied to OkCupid’s operator Humor Rainbow, Inc. The FTC filed a federal complaint and a proposed stipulated order in the U.S. District Court for the Northern District of Texas, Dallas Division.
The FTC alleges OkCupid gave an unauthorized third party access to millions of users’ personal data, including nearly three million photos plus location and other information, without formal or contractual restrictions on how the data could be used.
And the incentive story is right there in the complaint’s framing. The FTC says the third party sought the large datasets because OkCupid’s founders were financial investors in that third party.
Translation: “shared” means your face became inventory
When a company says it “shares” data, it turns your life into a transferable asset. A face becomes a row in a table. A location becomes a coordinate to be joined with other coordinates. Then somebody calls it “product improvement” so it sounds like nicer fonts instead of more surveillance capacity.
Dating data is intimate infrastructure. It’s not just “personal.” It’s where you are, how you describe yourself, and what you disclose when you think you are talking to a potential partner, not an AI training pipeline.
Here is the mechanism: promises as marketing, enforcement as paperwork
Platforms make privacy promises broad enough to soothe users and flexible enough to feed partners. The upside is immediate. The downside is theoretical. When the downside becomes real, time becomes the defense.
The proposed settlement’s core consequence, as described publicly, is a permanent prohibition on misrepresenting the extent to which the companies collect, use, disclose, delete, protect, or maintain personal information, including the purposes for doing so and the choices available to users under state privacy laws. It’s not nothing. But it is not a time machine. You cannot unring a bell from 2014.
Follow the money: the fine is missing, the extraction is not
The FTC press release does not announce a monetary penalty. Media coverage points to a structural reality of U.S. privacy enforcement: the agency often cannot seek civil penalties for a first-time violation unless a specific statute grants it penalty authority.
The quiet part: enforcement without meaningful financial consequence becomes a rehearsal. Companies learn the choreography: deny, delay, settle, promise not to do it again, and keep the institutional knowledge of how to do it faster next time.