Labeling won't solve AI's problems
As AI continues to embed itself in our digital tools and our lives, it's getting harder to draw clear lines between what's AI-generated and what's not.
What's happening: Legislators, regulators and ethicists are going all in on requiring labeling for AI-created work — but as AI use becomes more of a human-machine collaboration, labeling will lose its coherence and meaning.
Driving the news: The Biden administration's long-awaited AI executive order directs the Commerce Department to come up with a scheme for "watermarking" works produced by AI.
- Meanwhile, members of Congress have introduced an AI Labeling Act that would require "clear and conspicuous" disclosure of AI-generated content across all media types.
Why it matters: Labeling advocates argue that clearly distinguishing between the work of humans and that of AI will help the public cope with an expected onslaught of synthetic video, audio, images and texts.
Be smart: It's easy enough to demand that you stamp "Created by ChatGPT" on an entire essay the chatbot spits out in response to your prompt.
- But the moment you start using an AI tool as a collaborator — to brainstorm ideas, sketch out alternatives, fill in an outline's blanks, or touch up a final draft — you face tougher questions about provenance and authorship, and binary "human or AI" labeling becomes inadequate.
Software developers are on the front lines of this latest wave of change — as they usually are.
- Generative AI tools like GitHub's Copilot, which serve as a programmer's collaborator by writing and updating chunks of code, have soared in popularity since their introduction a couple of years ago.
- Earlier this year an executive at Microsoft (which owns GitHub) reported that 40% of the code developers were checking in to GitHub repositories was "AI-generated and unmodified."
- In many cases, that code won't fall neatly into "AI-created" or "human-written" categories. Each program is woven with threads of human and AI origin.
For decades, the field of image creation has used algorithmic tools for myriad tasks like color correction, outline detection, and sharpening or blurring.
- You can ask Midjourney or DALL-E to draw George Washington on the bridge of the Starship Enterprise, or you can paint the same image with oils on canvas yourself.
- Virtually anything in between — any digital image made with modern graphic tools, like Adobe's Photoshop, Illustrator and Creative Suite — will arguably depend in some way on the contributions of AI.
- That means even if you could devise a reliable and tamper-proof watermarking technique to ID AI-generated content, you're going to face endless uncertainty classifying most digital images according to any consistent AI-vs.-human scheme.
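To see why "tamper-proof" is such a high bar, here is a minimal sketch (a made-up toy, not any real watermarking standard) of a least-significant-bit image watermark over grayscale pixel values 0-255. A single lossy re-encode, simulated here by coarse quantization, wipes the mark out:

```python
def embed_mark(pixels, bits):
    """Hide watermark bits in the least significant bit of each pixel."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def read_mark(pixels, n):
    """Recover the first n watermark bits from pixel LSBs."""
    return [p & 1 for p in pixels[:n]]

def lossy_reencode(pixels, step=8):
    """Simulate lossy compression by snapping pixels to coarse levels."""
    return [min(255, step * round(p / step)) for p in pixels]

pixels = [52, 118, 200, 33, 90, 161, 7, 244]
mark = [1, 0, 1, 1, 0, 1, 0, 1]

stamped = embed_mark(pixels, mark)
print(read_mark(stamped, 8))                  # [1, 0, 1, 1, 0, 1, 0, 1] -- survives a clean copy
print(read_mark(lossy_reencode(stamped), 8))  # [0, 0, 0, 0, 0, 0, 0, 0] -- destroyed by re-encoding
```

Real schemes are far more robust than this toy, but the underlying tension is the same: any mark that survives ordinary editing is hard to build, and any mark that doesn't is easy to lose by accident.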
Labeling text is no less of a quagmire.
- Software makers are building AI helpers into all of the most popular writing environments, from Google Docs to Microsoft Word.
- Every time you use any kind of auto-complete, modern spell check or grammar check, you're using AI.
- If you want to be certain that your writing is "all-human," you're going to have to use a typewriter or a pen.
- Still, watermarking doesn't work very well beyond images, and bad actors will find ways to use tools that don't leave such traces.
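A toy illustration of why text "traces" are so easy to erase: this zero-width-character marker is an invented scheme, not any real product's watermark, but invisible-text approaches share its core weakness, since one normalization pass removes the label without touching the words.

```python
ZWSP = "\u200b"  # zero-width space: invisible when the text is rendered

def add_marker(text: str) -> str:
    """'Label' AI text by slipping a zero-width space after each space."""
    return text.replace(" ", " " + ZWSP)

def looks_ai_generated(text: str) -> bool:
    """A detector that only checks for the invisible marker."""
    return ZWSP in text

def launder(text: str) -> str:
    """One trivial normalization pass strips the label entirely."""
    return text.replace(ZWSP, "")

draft = "Labels lose meaning when humans and machines share a draft."
marked = add_marker(draft)

print(looks_ai_generated(marked))           # True
print(looks_ai_generated(launder(marked)))  # False
print(launder(marked) == draft)             # True: the visible text is unchanged
```

Statistical watermarks on word choice are harder to strip than this, but paraphrasing, translating, or lightly editing the text degrades them the same way.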
Between the lines: The problem with labeling is already emerging in controversies over voice-acting in video game development.
- In a popular new shooter, The Finals, Axios Gaming's Stephen Totilo reports, actors' voices are encoded and then text-to-speech AI generates impromptu interjections for their characters.
- In other words, like so many creations we will increasingly encounter, this material is a hybrid of human and AI that's impossible to disentangle.
What's next: Much of Silicon Valley now believes that generative AI's facility with natural human language means it's inevitable that AI will become the new interface for basically all computing tasks.
- That means AI will play a role in every creative tool used to make content, and the "AI-generated" label will cease to have any meaning.
Go deeper: When humans and machines share a canvas