
Users will now be able to submit requests to remove images of minors from Google search results, the tech giant announced Wednesday.
Why it matters: The move comes amid outrage over Facebook's negative effect on teens' mental health and lack of protections for children. In the wake of criticism from senators, tech companies are scrambling to set themselves apart.
How it works: Anyone under the age of 18, or their parent or guardian, can fill out a form to request removal of an image appearing in search results.
- Users will be able to include image URLs, the search query terms that lead to the image, and links to the search results pages that contain it.
- Google teams will review the request and reach out if they need additional information. If the request meets the company's requirements, the image will be taken down and the user will be notified, according to Google's statement.
Yes, but: Removing an image from Google results doesn't make it disappear from the internet entirely, the company cautions.
What they're saying: "We believe this change will help give young people more control over their digital footprint and where their images can be found on Search," Danny Sullivan, Google's public liaison for Search, said in the announcement.
- The company said earlier this year that it would take steps to bolster minors' privacy and protect their mental well-being.
The big picture: Senators from both parties are urging tech executives to go on the record supporting various legislative changes to laws covering online content liability, privacy and children's safety.
- "Being different from Facebook is not a defense. That bar is in the gutter. What we want is not a race to the bottom, but a race to the top," Sen. Richard Blumenthal (D-Conn.) said at a Senate Commerce consumer protection subcommittee hearing on Tuesday.
- Some lawmakers have weighed the possibility of creating a new federal oversight body to regulate tech firms.