Illustration: Sarah Grillo/Axios
Tech platforms have built the heart of their businesses around secretive computer algorithms, and lawmakers and regulators now want to know just what's inside those black boxes.
Why it matters: Algorithms, formulas for computer-based decision making, are responsible for what we get shown on Facebook, Twitter and YouTube — and, increasingly, for choices companies make about who gets a loan or parole or a spot at a college.
How it works: When posts "go viral," algorithms are why. Often, they work by detecting small blips in user interest and amplifying them.
- Algorithms' complexity and obscurity have helped tech firms make the case that they are neutral platforms. They also allow companies to stand at one remove from responsibility for decisions about promoting and demoting content.
- But users and critics, increasingly aware of the power of these systems, now want to hold companies more responsible for the outcomes their code produces.
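The "detect a small blip in interest, then amplify it" dynamic can be sketched with a toy ranking function. This is a hypothetical illustration, not any platform's actual algorithm: posts are scored by their engagement rate, so a small post with a sudden spike of interactions can outrank steadily popular content, and the resulting higher rank feeds back into more views and engagement.

```python
# Toy sketch of engagement-rate ranking (hypothetical; not any platform's real code).
# A small early "blip" of interest compounds: higher score -> higher rank ->
# more views -> more engagement.

from dataclasses import dataclass


@dataclass
class Post:
    title: str
    views: int
    engagements: int  # likes, shares, replies in a recent window


def score(post: Post) -> float:
    # Engagement *rate*, lightly weighted by raw volume: a post with a
    # sudden spike in interactions per view outranks steady content.
    rate = post.engagements / max(post.views, 1)
    return rate * (1 + post.engagements) ** 0.5


def rank(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=score, reverse=True)


feed = rank([
    Post("steady news item", views=10_000, engagements=200),
    Post("small but spiking post", views=500, engagements=120),
])
print([p.title for p in feed])
# -> ['small but spiking post', 'steady news item']
```

The spiking post wins despite far fewer total interactions because the ranking rewards velocity over volume, which is the core of why platforms can amplify content before anyone has reviewed it.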
Driving the news: At a hearing on "Algorithms and Amplification," executives from YouTube, Twitter and Facebook, along with Harvard researcher Joan Donovan and ethicist Tristan Harris, will testify Tuesday before the Senate Judiciary Committee's privacy, technology and law subcommittee.
- The subcommittee is led by Sen. Chris Coons (D-Del.) and ranking member Sen. Ben Sasse (R-Neb.).
The big picture: Government agencies around the world are starting to take up issues related to algorithms and machine learning.
- The EU last week unveiled a series of proposed new regulations, while the FTC published a blog post reminding companies that they will be held accountable for violating antidiscrimination laws if their algorithms produce biased results.
Our thought bubble: The conversation in policy circles has long concentrated on the outer limits of content decisions — decisions about what gets removed and who gets banned. Those are what software people call "edge cases." What gets recommended, and why, is the center of the issue.
Between the lines: Platforms have long used their algorithms to boost business metrics, such as the amount of time spent on their sites. Increasingly, though, they are also acknowledging and tapping the power of algorithms to limit the spread of misinformation or hate speech that doesn't merit an outright ban.
- Twitter has used its algorithmic powers in a variety of ways, including putting potentially offensive content behind a warning flag and limiting retweets of information that it isn't ready to ban entirely.
- YouTube has changed its recommendation engines to steer users away from conspiracy theories and other "borderline content" criticized for leading people down rabbit holes that may increase time spent on the site but are bad for society.
- Facebook has changed the way it recommends groups to users in an effort to reduce radicalization and tweaked its algorithm to show more news from credible sources, among other changes.
- Platforms long tried to avoid such measures for fear of being seen as putting their thumbs on the scale. But their inaction allowed problems to grow unchecked — including election interference, the proliferation of conspiracy theories, vaccine hesitancy and COVID-19 misinformation.
Yes, but: Some politicians see these efforts as another sign of platforms' unfairness.
- On the right, complaints about "shadow banning" stem from situations where platforms have limited the spread of users' content rather than banning it outright.
What to watch: Democratic aides told Axios they see today's hearing as a chance to reset the conversation about algorithms and their role in public discourse, a topic that has often been politicized and devolved into partisan squabbling.
- Aides are looking forward to homing in on YouTube's recommendation algorithm, which serves up suggested videos for users based on their history. One question that may come up is how often YouTube users are recommended content that is later found to be in violation of YouTube's policy, an aide said.
- Polarization around the 2020 presidential election and social media's role in the Jan. 6 Capitol riot make the hearing especially important to have now, the aides said.
Tech firms see their algorithms as a kind of trade secret and are reluctant to expose their inner workings, both to keep them from competitors and to make it harder for users to game their systems.
- Recent efforts at transparency from companies are encouraging, committee aides said. But so much information about what content stays up and how algorithms weigh different factors in ranking posts is still kept secret, an aide said.
The tech companies are expected to focus on steps they are already taking:
- offering chronological feeds as an alternative to algorithmic ones;
- limiting the spread of "borderline content";
- and providing ways for individuals to more clearly signal the type of content they want to see.
Be smart: When Congress gets a bunch of tech executives to testify, the conversation can quickly devolve into individual lawmakers' beefs with companies.
- But a spate of recent hearings has shown more bipartisan interest in actually figuring out how to regulate tech than in the past, and algorithms are of interest to regulators around the world.