AI voice scams are on the rise – here’s how to stay safe, according to security experts

How to avoid falling foul of AI voice scams

News by Christian Rowlands, published 24 November 2024

  • AI voice-clone scams are on the rise, according to security experts
  • Voice-enabled AI models can be used to imitate loved ones
  • Experts recommend agreeing a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren't new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities or celebrities, but friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk when it comes to phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human and the AI model. With a small amount of code, these models can be programmed to execute phone scams automatically, encouraging victims to disclose sensitive information.

So how can you stay safe? What makes the threat so problematic is not just how easily and cheaply it can be deployed, but how convincing AI voices have become.

OpenAI faced backlash earlier this year for its Sky voice option, which sounded spookily like Scarlett Johansson, while Sir David Attenborough has described himself as “profoundly disturbed” by an AI voice clone that was indistinguishable from his real speech.

Even tools designed to beat scammers demonstrate how blurred the lines have become. UK network O2 recently launched Daisy, an AI grandma designed to trap phone scammers in a time-wasting conversation they believe they're having with a real senior citizen. It’s a clever use of the technology, but also one that shows just how well AI can simulate human interactions.

Disturbingly, fraudsters can train AI voices based on very small audio samples. According to F-Secure, a cybersecurity firm, just a few seconds of audio is enough to simulate the voice of a loved one. This could easily be sourced from a video shared on social media.

How AI voice-cloning scams work

The basic concept of a voice-clone scam is similar to standard phone scams: cybercriminals impersonate someone to gain the victim’s trust, then create a sense of urgency which encourages them to disclose sensitive information or transfer money to the fraudster.
