Social media companies engaged in "vast surveillance" of users, FTC says

A Federal Trade Commission staff report released Thursday found that nine leaders in the social media and video streaming industry, including Meta and X, conducted "vast surveillance" of consumers to monetize their personal information.
Why it matters: Not only did companies capitalize on users' data, the report alleged, but they also failed to protect consumers, including minors, and relied on data collection, minimization and retention practices that were "woefully inadequate."
What they're saying: "While lucrative for the companies, these surveillance practices can endanger people's privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking," FTC Chair Lina Khan said in a statement.
- She continued: "Several firms' failure to adequately protect kids and teens online is especially troubling."
Of note: The study, which began nearly four years ago, is based on responses from orders issued in December 2020 to nine companies operating 13 different social media and video streaming services.
- The orders requested data on each company's operations between 2019 and 2020.
The big picture: None of the companies reported practices that treat teen users' data differently from adults' data, the study found.
- "This shows that any privacy protections that were present for children ... disappeared the moment that child turned thirteen," the report said.
The report also alleged some companies engaged in "willful blindness" around users under the age of 13.
- While most companies said their platforms weren't intended for users under 13 and that they did not knowingly have any data collected from children, the report contended there is evidence children are using the platforms.
Flashback: In June, U.S. Surgeon General Vivek Murthy called on Congress to require social media platforms to use warning labels to alert users of the risks social media can pose to teens' mental health.
- His call and the FTC's report come amid increased focus on the impact social media can have on teen mental health, with several studies linking social media use to increased depression among teenagers.
What they're saying: Jacqueline Ford, an attorney with the FTC's Division of Privacy and Identity Protection, said that while companies said they protected children by complying with the Children's Online Privacy Protection Act Rule, COPPA "should be the floor, not the ceiling" for protecting children online.
- "Teens aren't adults, and so they should have heightened protections," Ford said, such as "designing age-appropriate experiences," affording "the most privacy protective settings by default" and "limiting the collection, use and sharing of their data."
Zoom out: The business models of many companies, the FTC reported, incentivized "mass data collection" of users and, sometimes, non-users, to monetize, often through targeted advertising.
- Some brands purchased information from data brokers, companies that collect personal information to resell or share, and third parties, sometimes about consumers' offline lives.
- Ford said data can sometimes still be collected from non-users, even if they do not have an account, if they receive a link to a social media page or access a platform in another way.
- User-inputted data, user actions across social media platforms and, in some cases, offline data were used to create user profiles that were then made available to advertisers to target specific segments of consumers.
With one exception, the study found, the companies did not appear to give people a choice to opt in or out of having their data used in algorithms, data analytics or AI.
Since the study began, some organizations have updated policies amid scrutiny from state and federal policymakers.
- On Tuesday, Instagram (owned by Meta) announced it would place new, protective settings on teen accounts, which will now automatically be private.
The bottom line: "Self-regulation has been a failure," the report contended.
- While the FTC report made recommendations for companies and Congress, including passing federal legislation to limit surveillance and expand protections for teens over 13, it did not take specific enforcement actions of its own.
Go deeper: Americans don't trust social media companies with AI
