Aug 10, 2021 - Technology

Google and YouTube roll out new protections for teens

Photo: Smith Collection/Gado/Getty Images

Google and YouTube are introducing new safety protections for users under 18, according to company blog posts Tuesday.

Why it matters: Google joins Facebook-owned Instagram in changing privacy and advertising policies for younger users as regulators across the globe scrutinize how Big Tech affects children.

Google's changes, rolling out in the coming months, include:

  • A new policy allowing anyone under 18, or their parent or guardian, to request removal of their images from Google Image search results.
  • Blocking ad targeting based on the age, gender or interests of people under 18.
  • Removing the option for users under 18 to turn on location history (which is off by default).
  • Turning on SafeSearch, which filters out explicit results, for existing users under 18 and making it the default setting for teens creating new accounts.
  • Requiring apps to disclose how they use data as part of a new safety section on Google Play, which will also highlight which apps follow Google's family policies.

Meanwhile, YouTube's changes include:

  • Removing "overly commercial" videos from YouTube Kids, which YouTube says could be content that focuses solely on product packaging or "directly encourages" kids to spend money.
  • Adjusting the default upload setting to the most private option for users between 13 and 17, meaning uploads can be seen only by the user and whoever they choose.
  • Turning on "take a break" and bedtime reminders by default for users 13-17, and turning autoplay off by default for the group (although they can turn it back on).
  • Adding an autoplay option for YouTube Kids, but turning it off as the default setting in the app.

Flashback: Google agreed to pay a $170 million fine in 2019 to settle allegations from the Federal Trade Commission that YouTube violated children's privacy provisions.
