Dark web AI defeats human verification on crypto exchanges

Illustration: Sarah Grillo/Axios
AI has unlocked a powerful tool being sold to money launderers to create phony accounts on cryptocurrency exchanges, according to new research from Cato Networks, a computer security firm.
Why it matters: Fraudsters need lots of accounts to cash out ill-gotten gains, as they play whack-a-mole with the trust and safety teams at the digital asset platforms.
Between the lines: Etay Maor of Cato CTRL, Cato Networks' threat intelligence lab, has released research that details how the attack works.
- First, AI swiftly generates fake documents, such as a passport. In the examples the researchers have seen, the documents were generated for a person who doesn't actually exist.
- These accounts often require some sort of live proof of humanity, such as selfies or a video. And that's where the deepfake comes in:
- AI can generate photos or a video that match up with the document and fool an automated verification agent.
"These accounts are important because they are a vital point in the attack life cycle," Maor tells Axios.
Threat level: Capabilities like this let fraudsters scale the operational side of their money laundering.
- "While in the past I've seen this done in a very professional manner with document forgers, now it's done in just a much more accessible manner," Maor said.
Zoom in: A ransomware, pig-butchering or identity fraudster needs to give their victim someplace to send the money so they can cash out. Obviously, they don't want to put their actual name on the account receiving the ill-gotten gains.
- With services like these, criminals can change identity with every single payment. This decreases the friction at a key bottleneck for fraud.
In the weeds: In a video accompanying the blog post, the researchers take an AI-generated photo and use it to create an identity at dozens of companies.
- The tool then produces a fake video from that photo matching the specifications of a specific cryptocurrency exchange.
- It also injects the feed so the video appears to be coming from the device's camera.
What we're watching: Social engineering.
- Fraudsters proficient at manipulating support employees at companies in real time are likely to find these kinds of tools extremely useful for extending the reach of their cons.
Maor recommends that companies inspect submitted artifacts for glitches and introduce some randomness into their verification process from account to account.
- For example, if they verify with video, they can vary the specific instructions given from video to video.
- Humans can be brought in to double-check, though this also increases onboarding friction for legitimate users.
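The randomized-instruction idea above can be sketched in a few lines. This is an illustration, not Cato's actual implementation; the challenge pool and function names here are hypothetical. The point is that a pre-rendered deepfake video is less likely to satisfy an instruction the fraudster couldn't predict:

```python
import random
import secrets

# Hypothetical pool of liveness-check prompts. Varying the prompt (and
# embedding a one-time code) per session means a canned deepfake video
# recorded in advance is unlikely to match what's asked.
CHALLENGES = [
    "Turn your head slowly to the left",
    "Hold up three fingers",
    "Read this code aloud: {code}",
    "Blink twice, then smile",
    "Tilt the camera so it looks up at your face",
]


def pick_liveness_challenge(pool=CHALLENGES):
    """Pick a random instruction plus an unpredictable one-time code."""
    code = secrets.token_hex(3)  # e.g. 'a1f09b' -- unguessable nonce
    instruction = random.choice(pool).format(code=code)
    return instruction, code
```

A verification flow would issue `pick_liveness_challenge()` fresh for each account, then have automated checks (or a human reviewer) confirm the submitted video actually performs that specific instruction.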
