March 30, 2023

They say you can't step in the same river twice. But, friends, here in the Bay Area, it has felt like we've been stuck in the same "atmospheric river" all season.

Today's Login is 1,175 words, a 4-minute read.

1 big thing: AI's great "pause" debate

Illustration: Shoshana Gordon/Axios

An open letter calling for a six-month "pause" in work on advanced artificial intelligence is dividing the tech industry — not just between AI boosters and skeptics, but also between different factions of AI's critics.

Driving the news: The letter — initially signed by Elon Musk, Apple co-founder Steve Wozniak and other industry luminaries — urged "a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities."

  • Specifically, it said that AI labs should "immediately pause for at least 6 months the training of AI systems more powerful than GPT-4," the latest version of OpenAI's large language model, which it released two weeks ago.

What they're saying: The tech world was abuzz over the letter Wednesday, but not many expected to see either a voluntary industry slowdown or a government-mandated "pause" any time soon.

  • "There are no literal proposals in the actual moratorium," Box CEO Aaron Levie told Axios' Ina Fried on stage at the Axios What's Next Summit Wednesday. "It was just, 'Let's now spend the time to get together and work on this issue.' But it was signed by people that have been working on this issue for the past decade."
  • "There's a lot of conversation about, 'Let's pull the plug,' but I'm not sure there is a single plug," Arati Prabhakar, director of the White House Office of Science and Technology Policy, said in another interview by Ina at What's Next.

Between the lines: The letter was organized by the Future of Life Institute, a nonprofit dedicated to "steering transformative technology ... away from extreme large-scale risks."

  • The letter's argument roughly represents the position of "longtermist" AI critics, like Musk, who have been sounding alarms for more than a decade.
  • This view, initially laid out by philosopher Nick Bostrom and later embraced by many tech insiders, warns that an AI might end up — either through human malice or engineering error — with both the goal of destroying humanity and the power to achieve its aim.

Another camp of AI critics maintains that the letter and its advocates are inflating the power of large language models like OpenAI's.

  • ChatGPT is nowhere close to being the kind of "artificial general intelligence" (AGI) that might threaten humanity, they say — it's just auto-complete on steroids.
  • Worrying about some distant apocalypse draws our attention away from more immediate harms — including biased recommendations and misinformation — already being caused by AI systems now in use by governments and private companies.
  • Emily Bender, a linguistics professor at the University of Washington who has been at the forefront of AI criticism in recent years, tweeted that the open letter was "dripping with #AIHype."

Of note: OpenAI itself has warned of the need to hedge against AI's dangers.

  • In a February blog post, OpenAI CEO Sam Altman wrote, "Some people in the AI field think the risks of AGI (and successor systems) are fictitious; we would be delighted if they turn out to be right, but we are going to operate as if these risks are existential."

Yes, but: Another vocal contingent in tech views the whole case for an AI pause as fundamentally at odds with the tech industry's entrepreneurial spirit and drive to innovate.

  • "The sky is not falling, and Skynet is not on the horizon," Daniel Castro, director of the Center for Data Innovation at the industry-funded Information Technology and Innovation Foundation, said in a statement.
  • "However, AI advances have the potential to create enormous social and economic benefits across the economy and society," Castro argues.
  • Computer scientist Andrew Ng, an AI veteran, tweeted, "There is no realistic way to implement a moratorium and stop all teams from scaling up LLMs, unless governments step in. Having governments pause emerging technologies they don't understand is anti-competitive, sets a terrible precedent, and is awful innovation policy."

Our thought bubble: The dynamics of startup capitalism and tech investment make the kind of coordination and restraint an AI "pause" would require extremely unlikely. Washington's political gridlock and slow learning curve make government action equally unlikely, and global geopolitics are pushing the U.S. toward accelerating AI instead.

2. Fantasy author takes on Audible's "poor" terms

Photo illustration: Shoshana Gordon/Axios. Photo: Courtesy of Brandon Sanderson

A bestselling fantasy author has moved some of his recent projects off Audible, Amazon's industry-dominating audiobook platform, because, he charges, "they treat authors very poorly," Axios' Will Chase reports.

What's happening: Brandon Sanderson — an industry heavyweight with 15 New York Times bestsellers, including multiple No. 1 spots — is the first author of his stature to publicly challenge Audible. His book "Oathbringer" was the most pre-ordered book of all time on that service.

Driving the news: Sanderson recently ran the most successful Kickstarter campaign ever, raising $42 million to self-publish four novels that he wrote in secret during the pandemic. Then he announced that he would distribute the audiobooks through Spotify and Speechify rather than Audible.

Why it matters: Audiobooks are the fastest-growing format in publishing, projected to be worth $35 billion by 2030.

  • Audible controls 63% of the U.S. audiobook market, according to one estimate, but Sanderson claims its influence is far greater.

Details: Audible pays authors 25% royalties on audiobooks, or 40% for those who agree to an exclusivity contract. That's well below the industry standard of 70% for other digital products like games or apps.

  • Like audiobooks, those products typically involve a production crew or organization beyond a single author, so the revenue splits get complicated fast.
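To make those percentages concrete, here's a quick back-of-the-envelope comparison of an author's take-home per sale at each royalty rate. The $20 list price is a hypothetical figure chosen for illustration, not a number from Audible.

```python
# Author take-home per sale at each royalty rate described above.
# The $20.00 list price is hypothetical, for illustration only.
list_price = 20.00

rates = {
    "Audible, non-exclusive (25%)": 0.25,
    "Audible, exclusive (40%)": 0.40,
    "70% standard for games/apps": 0.70,
}

for label, rate in rates.items():
    # Royalty = list price x rate, before any publisher split.
    print(f"{label}: ${list_price * rate:.2f} per sale")
```

At that hypothetical price, the gap is $5.00 versus $14.00 per sale — and for a self-published author there is no publisher to share either figure with.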

Between the lines: Sanderson is particularly critical of the way Audible's terms affect self-published and indie authors, who aren't splitting their cut with a publishing house.

  • Industry insiders are well aware of the disconnect: "Everyone knows the margins on audiobooks are just ludicrously high," he told Axios.

The other side: A spokesperson told Axios Audible remains committed to investing in authors: "We maintain open communication with our authors and creators to innovate and evolve, and we continue to explore new projects with Brandon. We look forward to ongoing dialogue with him on how we can further improve the experience for him and other creators, while bringing his work and others' to our listeners."

Reality check: Sanderson knows this will be an uphill battle. "How does anyone take on Amazon? Right? And as I've said before, I don't want to go to war with Amazon, they would win," he told Axios.

3. Take note

On Tap

  • A special event focusing on "Advancing Technology for Democracy," part of the State Department's Summit for Democracy 2023, features speakers including Secretary of State Antony Blinken and European Commission Executive Vice President Margrethe Vestager.

4. After you Login

  • Earlier this week we told you that AI-generated images are getting better at endowing people with the right number of fingers. One thing that still puzzles the AI eye, apparently, is unicycles.

Thanks to Peter Allen Clark for editing and Bryan McBournie for copy editing this newsletter.