Snapchat makes its content guidelines public
Snapchat on Wednesday said it will begin publishing guidelines that detail what types of content get algorithmically distributed to users in its app.
Why it matters: To date, only vetted publishing partners and professional creators on the platform have had access to the guidelines. Now, Snapchat is making them public to give parents more assurance about what their teens see on the app.
- The guidelines will be available in Snapchat's Family Center, a portal within the app that lets parents with children under 18 on the platform see their child’s friend list and with whom they’re communicating.
How it works: Snapchat uses algorithms to distribute personalized content to users both via its Stories tab, which features content from professional media publishers and popular creators, and via its Spotlight tab, which shows the best content created by everyday Snapchat users.
- In both instances, Snapchat uses a set of content moderation guidelines to determine which types of content can be recommended to users.
- Now, parents will be able to restrict their teens' ability to view recommended content in Stories or Spotlight via controls in Snapchat's Family Center.
- Parents will be able to filter out Stories content from publishers or creators that Snapchat may have identified as either "sensitive" or "suggestive."
Details: The content guidelines explain how Snapchat categorizes content as either "sensitive" or "suggestive" based on a set of criteria that parents will be able to view.
- For example, moderately suggestive content — like a woman in a bikini — may be deemed sensitive, but it doesn't necessarily violate Snapchat's content guidelines. Parents can now decide whether to stop that content from being recommended to their child.
- Some content doesn't violate Snapchat's rules, but Snapchat still won't distribute it algorithmically because it is ultra-sensitive. For example, explicit language that describes sex acts is not eligible for algorithmic distribution.
- Broadly speaking, any publicly available content that is considered by Snapchat's moderation team to be "harmful, shocking, exaggerated, deceptive, intended to disgust, or in poor taste" is not eligible for recommendation to anyone.
Be smart: Snapchat users could technically still view that content by proactively choosing to follow its creators, but Family Center also gives parents the ability to control which creators a user under 18 can follow.
- Accounts that repeatedly post that type of content could be temporarily or permanently disqualified from having their content recommended.
The big picture: While most tech companies make their community standards rules public, most don't provide detailed breakouts of which types of content get algorithmically distributed and which don't.
- Some tech platforms choose to keep those guidelines obscure, out of fear that creators will use that information to game the system.