Axios Pro Exclusive Content

The next AI battle: Copyright

Ashley Gold
May 18, 2023
Illustration: Sarah Grillo/Axios

How generative AI systems should treat copyrighted work is the next fierce policy debate in Washington around the game-changing technology.

Driving the news: As the many policy questions around generative AI come into sharper focus, copyright and licensing is one area where different coalitions are staking out positions and lobbying lawmakers on their priorities.

Why it matters: Songwriters, artists and other creators of original work have already been struggling in the internet age, amid the rise of streaming music and TV.

  • The rise of generative AI, which pulls from massive data sets scraped from the web with little transparency about where the information comes from, adds to worries about permission, compensation and credit.

The big picture: The current debate mirrors past fights over music licensing, compensation, streaming, copyright, trademarks and patents.

  • "This has the potential to be as big, or have even bigger impact, than Napster had," Michael Huppe, CEO of SoundExchange, told Axios.
  • "What the industry learned back then and what they realize now is you can't stop technology. So the question is, how do you lean into it in a way that protects the creative arts and allows music to flourish?"

What's happening: A House Judiciary panel hearing Wednesday explored several issues:

  • Whether AI-generated work should be eligible for copyright protection.
  • The potential job displacement of artists.
  • Permissions for use of works in AI data training sets.
  • Whether anything AI-generated should be considered "art."

What they're saying: "I am hard-pressed to understand how a system that rests almost entirely on the works of others — and can be commercialized or used to develop commercial products — owes nothing, not even notice, to the owners of the works it uses to power its system," said Rep. Hank Johnson at Wednesday's hearing.

  • "Training AI to mimic professional performers or 'generate' new works based on millions of copies of published songs and recordings presents a host of legal implications. ... And it’s a long-term threat to music itself," Dan Navarro, a musician and songwriter, said at the hearing.
  • "The moral invasion of AI engines that steal the core of a professional performer's identity — the product of a lifetime's hard work and dedication — without permission or pay cannot be tolerated," wrote Recording Industry Association of America CEO Mitch Glazier and National Music Publishers' Association CEO David Israelite in a Billboard op-ed ahead of the hearing.

Background: The U.S. Copyright Office provided guidance in March stating that in the case of works containing AI-generated materials, the office will consider whether the contributions reflected an author's "own original mental conception."

  • Copyright applicants must also disclose when their work includes AI-created material, the office said.

The other side: Joshua Lamel, executive director of the Re:Create Coalition, whose members include libraries and tech and civil society groups, said following Wednesday's hearing that "policymakers must remember that generative AI is grounded in Fair Use and other elements that are not subject to copyright protection."

  • Public Knowledge said it's against any restrictive new requirements for generative AI systems in how they trawl the web for data sets, or other major expansions of copyright law in the name of protecting artists, per a blog post earlier this month.

The intrigue: Some tech players want to take advantage of generative AI while protecting copyrighted work and creators. Adobe, for example, has an image generator called Firefly that is trained only on Adobe stock images and openly licensed content, and its output is automatically tagged with content credentials.

  • Adobe leads the Content Authenticity Initiative, whose members include media outlets, camera manufacturers, researchers and other groups that promote the adoption of standards for content attribution as a means of preventing misinformation and deepfake images and videos.
  • "What we want to do is think about a way to allow people to prove what's true" in an age of misinformation, Dana Rao, Adobe's general counsel, told Axios during a week in Washington talking about AI with lawmakers.
  • "We want innovation to go forward, and providing AI to the billions of non-creators who want to create is a net good. But we also need to think through the compensation issues."

The bottom line: With Sen. Marsha Blackburn's focus on this issue for Nashville artists and another copyright hearing coming up in the Senate in July, expect a lot more debate.