Omidyar Network: Dating Apps & Responsible Technology
Omidyar Network is a social change venture that has committed more than $1.8 billion to innovative for-profit companies and nonprofit organisations since 2004.
Omidyar Network works to reimagine critical systems and the ideas that govern them, and to build more inclusive and equitable societies.
While no one is entitled to a date, we are all entitled to visibility into how our data is used and how opaque algorithms impact us, and, above all, to more accountability and choice from the companies that mediate our personal lives.
Omidyar Network approached AWO to conduct a landscaping study on the harms, risks and responsible technology opportunities related to dating apps. At a closed-door briefing, we presented our findings to Omidyar Network’s grantees and partners.
Through desk research and stakeholder interviews, the report offered analysis of:
- Data collection: In October, the LGBTQ+ dating app Grindr lost its appeal against a €6.5 million fine imposed by Norway's Data Protection Authority. The appeals board agreed that Grindr had violated data protection law, including by illegally sharing data with third parties for behavioural advertising. Grindr is not the only app with troubling data practices: in 2014, OkCupid allegedly shared a database of users' profile pictures with Clarifai, a facial recognition company, for use as training data. (This has been subject to an FTC investigation.) More recently, Tinder reportedly used age-based pricing to discriminate against older users, charging them more for its premium service.
- Trust and safety: Dating apps are designed for adults to match virtually and meet in real life, which changes the parameters of offline harm and risk. These offline harms have prompted a U.S. congressional investigation into underage users on dating apps, a ProPublica investigation into sexual assaults during dates facilitated by apps, and litigation over apps' failure to respond to harassment reports.
- Recommender systems and the 'match' algorithm: Dating apps are not neutral. Apps collect behavioural data based on whom users like and are liked by, and, like all recommender systems, dating app algorithms can amplify bias and create silos. Bias can be learned through users' interaction with the algorithm, reinforcing existing preferences.
- Relevance of the Digital Services Act (DSA): As of now, no dating app meets the threshold for the DSA's most rigorous obligations, which applies to platforms with 45 million monthly active users in the EU. While the DSA will still give dating app users some protections, including more transparency around recommender systems and advertising, users won't benefit from its strongest provisions.