March 28, 2023
I know the days are getting longer, but this time of year, don't the months feel like they're lengthening too?
🧩 Situational awareness: Chinese tech giant Alibaba announced today it will split into six core businesses that will be able to pursue independent fundraising and IPOs.
Today's Login is 1,232 words, a 5-minute read.
1 big thing: Congress zeroes in on content rules
In this Congress, online content and moderation will be the main tech policy obsession as legislators focus on transparency, kids' safety and bias, Axios' Ashley Gold reports.
Why it matters: Companies are back on defense explaining their content rules and protocols to lawmakers who are concerned about what they see or don't see on social media platforms.
- The content push comes as antitrust action from the Hill looks considerably less likely and as various worries about tech beyond companies' size and dominance boil over — including the impact of the rise of generative AI applications.
Driving the news: So far in 2023, Congress has held one hearing on antitrust, a handful touching on alleged Big Tech bias (with another coming up this week), one on Section 230 and content moderation, and a blockbuster grilling of TikTok CEO Shou Zi Chew.
- Sources tell Axios they expect bills such as the Kids Online Safety Act (KOSA), the American Data Privacy and Protection Act, and the EARN IT Act to be reintroduced in the coming weeks.
What they're saying: "The large companies are happy to talk about anything right now other than competition," Nu Wexler, a partner at Seven Letter and veteran of several Big Tech companies, told Axios. "Privacy and competition are important, but they're hard to explain. Everyone has opinions about content."
- "Congress has the muscle memory to be like 'I see something bad online. I will have a hearing about it, and drag some executives and some experts who say tech is bad in front of us to talk about it,'" one tech industry insider told Axios, adding that generative AI is a likely area of exploration.
The big picture: People are mad about tech's impact on mental health, especially for kids and teens, and its role in spreading misinformation.
- Feeling regulatory pressure from Europe, federal agencies, the states and Congress, platforms are rolling out new guidelines, touting work on children's privacy and mental health, and being more transparent than ever about their algorithms.
Flashback: It's a notable change from when social media platforms first started getting heat on Capitol Hill: At that time, information about how companies made content decisions was hard to come by, and contrition by tech CEOs was rare.
- "In 2023, tech platforms are way more willing to say 'We make mistakes sometimes,' and they lead with that," Emma Llansó, director of the Free Expression Project at the Center for Democracy and Technology, told Axios.
What to watch: "In general, we'll also continue to see the fundamental disagreement between Democrats and Republicans about whether the problem is that online services don't do enough or do too much to address problematic content on their services," Llansó said.
- "That's part of why my bet is on kids-related issues to see any chance of movement, since that has historically been an area where they're able to come to bipartisan agreements."
To read more stories like this, subscribe to our new Axios Pro Tech Policy newsletter.
2. Wellness lab offers parents TikTok counsel
A new clearinghouse has emerged for mediating between tech companies and those concerned about their products' impact on kids: the Digital Wellness Lab at Boston Children's Hospital and Harvard Medical School, reports Jennifer A. Kingson of Axios' What's Next.
Why it matters: Young people live their lives on social media, and it's not going away — so parents and pediatricians need to learn to recognize when it becomes a problem, says pediatrician Michael Rich, the lab's founder.
- At the same time, tech companies need to set appropriate guardrails, Rich tells Axios.
- Rich argues that unhealthy internet use is not an addiction, but rather a disorder he's dubbed Problematic Interactive Media Use — or PIMU — that indicates other underlying problems, including mood disorders and ADHD.
- PIMU is "a collection of symptoms of kids seeking to soothe themselves, to comfort themselves, to distract themselves," he tells Axios.
Where it stands: Rich founded the Digital Wellness Lab in 2021 to look at the unknown health consequences of the surge in kids spending six-plus hours a day online.
- With sponsorship from major tech platforms — such as Twitch, Roblox, Snap, Discord and TikTok — the Lab is trying to address the concerns of parents, doctors and lawmakers without villainizing the companies involved.
What they're saying: "After close to 30 years of doing this research, I grew tired and frustrated with the fact that it was in a polarized, adversarial environment," says Rich, a professor at Harvard Medical School and a doctor at Boston Children's Hospital.
- "The pediatricians were saying, 'The kids are in trouble,' and the policymakers are freaking out and saying, 'We've got to make laws about this.' And the tech and entertainment companies are in siege mentality and defensive mode."
Rich — a former filmmaker who calls himself the "Mediatrician" — is pulling the constituencies together to hammer out ground rules based on science and common sense.
Driving the news: Bowing to pressure, TikTok recently set a 60-minute screen time limit for children under 18 (albeit one that kids or their parents can bypass by punching in a code) — after seeking advice from the Digital Wellness Lab.
State of play: Heavy social media use has been linked to mental health issues in children — most notably, depression in teen girls — and there's a cottage industry of lawyers and treatment programs aiming to help desperate parents.
- By contrast, the Clinic for Interactive Media and Internet Disorders (CIMAID) at Boston Children's Hospital — which Rich co-runs — is a leading medical program for kids with health problems related to internet and social media use.
- Founded in 2017, CIMAID has seen "close to 1,000 kids," but the population "should be larger," Rich says. "It's only limited by the amount of staff I have to see them."
- Rich's advice? Instead of yelling at your kid to stop playing video games, "sit down next to them and play 'Grand Theft Auto' with them."
3. Take note
- Tomorrow, tune in to Axios' second annual What's Next Summit, starting at 8:10am ET. Check out the agenda and register to livestream the event here.
- Snap hired Microsoft's Rob Wilk as its president of the Americas, a key ad sales position, Ad Age reports. Meanwhile, Snap head of growth Jacob Andreou will leave to join Greylock as a general partner, per TechCrunch.
- Elon Musk announced that beginning April 15, Twitter's default "For you" feed would only show messages from verified users who'd paid for a subscription. (Axios)
- President Biden signed an executive order Monday banning U.S. government agencies from using some commercial spyware. (Axios)
- As part of broad layoffs, Disney has reportedly shut down a division dedicated to developing metaverse strategies. (Wall Street Journal)
4. After you Login
- It used to be that you could spot AI output by checking for hands with more or fewer than five fingers. (The pontiff's right hand in that image does look a bit off!) But it's only going to keep getting harder to separate simulacrum from reality.
Thanks to Peter Allen Clark for editing and Bryan McBournie for copy editing this newsletter.