Spot a Sora fake, while you still can

Screenshots: Courtesy of Jeremy Carrasco
"Today is the last day I believe anything I see on the internet," X, TikTok and YouTube commenters joked on September 30, the day OpenAI released its Sora video app.
Why it matters: The latest version of Sora has unleashed a torrent of fake videos into the feeds of every place people scroll. The sophisticated, high-quality videos can make it hard to distinguish AI from real footage.
Zoom in: OpenAI promised guardrails, including adding the Sora logo to every video. Creations contain both a visible watermark and invisible content credentials.
- But within a week of release, Sora watermark-removal tutorials hit YouTube, Reddit and TikTok. They're not hard to find. The phrase "I find out a way to remove the Sora watermark" is itself a meme.
Between the lines: "Some removal tools are nearly perfect or imperceptible, especially if the video is very simple," Jeremy Carrasco, founder of Showtools.ai on YouTube and TikTok, told Axios. Carrasco has been making it his mission to debunk fake videos since the first bunny jumped on a backyard trampoline.
- Look for the "spongy block" where the watermark was removed, Carrasco says.
- But OpenAI released the Sora 2 API only days ago, and Sora 2 Pro videos are much higher quality and don't carry a watermark.
- Carrasco hasn't yet seen an undetectable AI video go viral, but based on the Sora 2 Pro videos he has seen, he estimates that 90% of people couldn't tell on first watch that a really good one isn't real.
What they're saying: "If you're tricked by an AI possum eating Halloween candy, that doesn't mean you're stupid," Carrasco says.
- "That's a learning opportunity for when the politician is being deepfaked by AI."
Tech giants have had labeling guidelines in place for their own platforms for years, but efforts to set industry standards seem to have slowed.
- YouTube, TikTok and Meta say they require creators to disclose AI when it looks realistic and will remove content that's against their safety standards, whether it's made with AI or not.
- But moderators are in a perpetual game of whack-a-mole.
If you see something slightly off — in any video you see online — "your gut should tell you to look closer," Carrasco says on his channel.
- If the video looks like security camera or body cam footage, that's a sign. AI video generators are particularly adept at creating realistic-looking cam footage, presumably because the models were trained on a lot of it and because we're used to those videos looking a little grainy.
- Carrasco advises checking the video's source and looking for inconsistencies, like speakers whose lips don't move and objects that disappear without explanation.
Yes, but: The more time you spend poring over videos in search of extra fingers and broken laws of physics, the more you're telling the algorithm you want to see more videos like them.
- You can block the accounts that post AI videos to signal to the platform that you don't want to see AI slop.
The other side: AI videos that look real are also causing people to doubt authentic footage, writing off the genuine humor, talent and physical ability of creators and their editors.
- Carrasco breaks this down using videos of athlete Travis Nguyen doing unbelievable-looking parkour on city streets.
- You can scroll back through an account's posts to see how long a creator has been at it, Carrasco explains. If the work predates 2024, you can assume it's not AI-generated. (Though special effects certainly existed before then.)
- It helps to remember that most AI video models don't generate clips longer than 10 seconds, so if the video is longer and doesn't have cuts or edits, it's probably not AI.
What we're watching: AI video will only get longer, better and harder to spot. Even Carrasco is worried and stresses that all of these techniques could soon fail.
- "If you think of AI video spotting as another media literacy skill, this works," he told Axios.
- "But, if you think that you're going to be able to spot your way out of it, it's not going to happen."
