Exclusive: Grindr partners with Spectrum Labs on AI moderation
Grindr is partnering with Spectrum Labs, tapping the startup's AI-based system to help filter postings on the LGBTQ dating service.
Between the lines: For years, Grindr has chosen not to implement an AI system for content moderation, not because it didn't want to augment its keyword-based filtering system, but because it was concerned that the models weren't sensitive enough to keep users safe without introducing other types of bias.
Content moderation via machine learning is "tricky, controversial and not always good," Grindr spokesman Patrick Lenihan told Axios.
- With Spectrum, which provides content moderation to other dating services as well as game makers and other internet companies, Lenihan said Grindr finally found an option it was comfortable with. "They had the thing we really needed."
How it works: Rather than simply policing content for certain words or phrases, Spectrum's contextual AI service targets specific problems, such as identifying drug sales and sex work and detecting underage users.
- Spectrum has a set of algorithms it has tuned over the years, but also works with each customer to make the system work for their environment. As a result, it can take weeks or months to get its tools up and running, but Spectrum CEO Justin Davis says that's an investment that pays dividends over time.
Why it matters: While Grindr had understandable reasons for waiting to find a suitable AI system, not using one meant the company was heavily reliant on user reports. In addition to being reactive rather than proactive, the approach is also vulnerable to abuse.
- Spectrum's Davis says that only 18% of users across services report problematic encounters, and a large share of those reports are false, such as complaints from people who simply didn't like their date.
- And the other non-AI method that Grindr and others use — monitoring for keywords — has grown less effective over time as people have become more sophisticated at evading such filters.
The big picture: Dating apps have become a primary way people meet, but their rising popularity has also made them a hotbed for harassment, illegal activity and scams.
- Fraudsters bilked online daters out of an estimated $500 million last year, according to one study based on data reported to the FTC.
- "That's a lot of money," Davis said.