The UK’s Starling Bank has warned that “millions” of people could fall prey to scams that use artificial intelligence to mimic their voices.
Starling Bank, an online-only lender, said fraudsters can use AI to clone a person’s voice from just three seconds of audio, such as a clip from a video the person has shared on social media. Scammers can then identify the victim’s friends and family and use the cloned voice to call them and solicit money.
These scams have the potential to ensnare “millions,” according to Starling Bank’s press release on Wednesday, and have already caught out hundreds. In a recent survey of more than 3,000 adults that the bank conducted with Mortar Research, more than a quarter of respondents said they had been targeted by an AI voice-cloning scam in the past year. The survey also found that 46% of respondents were unaware such scams exist, and that 8% would send money requested by a friend or family member even if the call seemed suspicious.
“People regularly post content online that has recordings of their voice without ever imagining it’s making them more vulnerable to fraudsters,” Lisa Grahame, chief information security officer at Starling Bank, said in the press release. The bank advises individuals to establish a “safe phrase” with their loved ones—a simple, memorable phrase that differs from their usual passwords—to verify identity during phone calls.
Starling Bank recommends against sharing the safe phrase over text, since that could make it easier for scammers to find it out. If it is shared by text, the message should be deleted once the recipient has seen it.

As AI becomes increasingly adept at mimicking human voices, concerns are mounting about its potential to harm people, for example by helping criminals access their bank accounts, and to spread misinformation.
Earlier this year, OpenAI unveiled Voice Engine, its voice replication tool, but chose not to make it publicly available at that stage, citing the potential for misuse of synthetic voices.