Illustration: Sarah Grillo/Axios

In the U.S. and Europe, Big Tech is under fire — hit with big fines and the threat of stiff regulation — for failing to thwart the profound consequences of its inventions, including distorted elections, divided societies, invaded privacy, and sometimes deadly violence.

Driving the news: Now, artificial intelligence researchers, facing potentially adverse consequences from their own technology, are seeking to avoid being ensnared by the same "techlash."

  • AI researchers are working to limit dangerous byproducts of their work, like race- or gender-biased systems and supercharged fake news.
  • But the effort has partly backfired into a controversy of its own.

What's going on: As we reported, OpenAI, a prominent research organization, unveiled a computer program last week that can generate prose that sounds human-written.

  • It described the feat and allowed reporters to test it out (as we did), but OpenAI said it would withhold the computer code.
  • It said it was attempting to establish a new norm around potentially dangerous inventions in which, for the sake of preventing their possible misuse, researchers would continue their work but keep some advances under wraps in the laboratory.
  • In the case of its own new invention, OpenAI said it feared that somebody could use it to effectively develop a weapon for mass-producing fake news.
  • This was the first time a major research outfit is known to have used the rationale of safety to keep AI work secret.

But the move met massive blowback: AI researchers accused the group of pulling off a media stunt, stirring up fear and hype, and unnecessarily holding back an important research advance.

Why it matters: Against the backdrop of the techlash, we're seeing a messy debate play out around an urgent question: what to do with increasingly powerful "dual-use" technologies — AI that can be used for good or for ill.

  • The outcome will determine how technology that could cause widespread harm will — or won't — be released into the world.
  • "None of us have any consensus on what we're doing when it comes to responsible disclosure, dual use, or how to interact with the media," Stephen Merity, a prominent AI researcher, tweeted. "This should be concerning for us all, in and out of the field."

Details: OpenAI says its partial disclosure was an experiment. In a conversation with two top AI researchers from Facebook, OpenAI's Dario Amodei held up social media companies as a cautionary tale:

"The people designing Twitter, Facebook, and other seemingly innocuous platforms didn't consider that they might be changing the nature of discourse and information in a democracy … and now we're paying the price for that with changes to the world order."
  • Several researchers praised OpenAI's decision to withhold code as a vital step toward rethinking norms. "I think it's amazingly responsible," said Kristian Hammond, a Northwestern professor and CEO of AI company Narrative Science.

But other academic researchers came down hard.

  • While the new program's output is often impressive, its researchers admit that they simply used a scaled-up version of previous work. It's therefore very likely that someone could replicate the feat at relatively minimal cost. OpenAI says that's why it sounded the alarm.
  • But Sam Bowman, a professor at New York University, said the move "feels like a worst-of-both-worlds compromise that slows down the research community without actually having a real long-term safety impact."
  • Several experts said OpenAI's warnings of potential societal impacts are exaggerated. "We're still very far away from the risks," says Anima Anandkumar, a Caltech professor and Nvidia's machine learning research director. She says it's too early to be withholding any research at all.

What's next: Computer scientists are lurching toward the same robust discussion that biologists and nuclear scientists had before them — when to circumscribe openness in the name of safety and ethics.

  • Notably, Google recently said it will consider potential harms of its AI research before deciding whether to publish it.
  • "I'm not sure what alternative there was," says Jeremy Howard, a founder of AI company Fast.ai. "I think OpenAI did the right thing here, even if they communicated it sub-optimally."
