Exclusive: New platform aims to stop AI from using creators' work without permission

Photo illustration: Sarah Grillo/Axios. Photos: Michael Ochs Archives and Paras Griffin via Getty Images
A new rights-and-governance platform is launching after a pilot program with Malcolm X's estate and Katt Williams' studio, betting it can help creators keep their work from being quietly swept into AI training systems, and from losing out on pay when it is.
Why it matters: Generative AI systems are being trained on enormous scraped datasets of books, videos, music and cultural archives — often without permission and with no settled legal standard for whether that's allowed.
- More than 60 lawsuits are now winding through the courts, and the U.S. Copyright Office has warned the current system is "not sustainable."
- Without clear protections, creators risk having their work absorbed into AI models with no meaningful way to retrieve it — a concern echoed by legal experts like James Grimmelmann.
Context: AXM is a Black-owned platform that helps creators and estates register and control how their work can be used in AI training, and automate licensing and payouts across partners and platforms.
- At its core, AXM tries to give creators something they don't have right now: the ability to say yes or no before their work becomes training data.
What they're saying: "We're going to have legal uncertainty for a while, and meanwhile the technology keeps changing," said Grimmelmann, a Cornell Tech professor of digital and information law.
Zoom in: AXM's founders — a music executive, a cultural-rights strategist, and a systems engineer — say that uncertainty is precisely why they built the company:
- "In America, we look at culture more as content, but we've always considered it an asset — something that deserves a higher level of dignification and rights," said AXM CEO Archie Davis, a former EVP and chief creative officer at Def Jam and co-founder of Ryan Coogler's Proximity Media.
- COO Andrew Farrior, who comes from media and digital tech, said, "Every year estates were exposed to predatory acquisition."
- CTO Jameson Parker, who previously built systems at Google, Adobe and public-sector institutions, added, "The solve wasn't another app."
- "It was the infrastructural layer that's missing globally."
Between the lines: AXM and similar platforms are positioning themselves as a structured path for AI companies willing to pay for licensed training data once the lawsuits settle.
- "These companies are trying to provide a way forward for AI companies that are willing to pay royalties ... offering a convenient one-stop deal for a lot of useful content," Grimmelmann explained.
Zoom out: AI's legal landscape looks a lot like the early-2000s battles over Napster and the long fight over Google Books scanning millions of copyrighted works.
- Courts and regulators are still deciding whether to treat AI training as a new form of fair use — or as mass infringement requiring permission and payment.
- "Sometimes society decides the new tech is worth allowing and puts costs on copyright owners anyway, and sometimes we decide the tech should stop," Grimmelmann said.
- "There are over 60 AI lawsuits currently pending, many of them not going to produce decisions until next year or later. We're going to have legal uncertainty for a while, and meanwhile, the technology keeps changing."
Flashback: For marginalized creators and cultural estates, the risks are already visible — and accelerating.
- AI-generated videos featuring Martin Luther King Jr.'s likeness sparked backlash earlier this year, raising alarms within estates that their most valuable material could be remixed or replicated without guardrails.
- "You can imagine harms from marginalized groups being excluded from training data — but also harms from having their work included without compensation, where their culture is profited off by large tech companies," Grimmelmann said.
The bottom line: AXM's founders say that's the gap they're trying to close: giving creators and estates a way to define what is and isn't allowed before the next wave of AI systems arrives.
- "Estates didn't have a framework for AI," Farrior said. "There was no playbook — no way to say what was allowed, what wasn't, and who gets paid."
Editor's note: This story has been updated with Williams' company's involvement.
