A British bank is warning the public about AI voice clone scams. In a press release, the bank said cases are on the rise and that the scam could affect anyone with a social media account.
According to the latest data from Starling Bank, 28% of UK adults say they have fallen victim to an AI voice clone scam at least once in the past year. The same data found that nearly half of UK adults (46%) have never heard of AI voice clone scams and are unaware of the danger.
Related: How to outsmart AI-powered phishing scams
“People regularly post content online that includes recordings of their voice without ever realizing that it makes them more vulnerable to fraudsters,” Lisa Grahame, chief information security officer at Starling Bank, said in the press release.
The scam, which is powered by artificial intelligence, needs only a short audio snippet (as little as three seconds) to convincingly duplicate a person’s speech patterns. Considering how much audio many of us post every day, the scam could affect the population en masse, according to CNN.
Once a voice has been cloned, the criminals call the victim’s relatives and use the clone to extort money from them.
Related: Andy Cohen lost “a lot of money” in a sophisticated scam – how to avoid becoming a victim yourself
In response to the growing threat, Starling Bank recommends that relatives and friends agree on a verification system: a unique safe phrase that you share with your loved ones only out loud – never via text message or email.
“We hope that through campaigns like this, we can equip the public with the information they need to protect themselves,” Grahame added. “Simply having a safe phrase with trusted friends and family – that you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the line.”