Bank warns AI cloning scams are out of control


A UK bank is warning people to beware of AI voice cloning scams. In a press release, the bank said the scam has already involved hundreds of cases and can affect anyone with a social media account.

According to new data from Starling Bank, 28% of UK adults say they have already been the target of an AI voice cloning scam at least once in the past year. The same data revealed that nearly half of UK adults (46%) have never heard of an AI voice cloning scam and are unaware of the risk.

Related: How to beat AI-powered phishing scams

“People regularly post content online that has recordings of their voice, never realizing it's making them more vulnerable to fraudsters,” said Lisa Grahame, chief information security officer at Starling Bank, in the press release.

An AI-powered spoof needs only a snippet of audio (just three or so seconds) to convincingly duplicate a person's speech patterns. Considering that many of us post far more than that on a daily basis, a large share of the population is potentially exposed to this kind of fraud.

Once a voice is cloned, criminals call the victim's loved ones and fraudulently ask for money.

Related: Andy Cohen lost 'a lot of money' to a very sophisticated scam – Here's how to avoid becoming a victim yourself

In response to the growing threat, Starling Bank recommends that relatives and friends agree on a unique "safe phrase" for verification, shared only out loud with loved ones, never by text or email.

“We hope that through campaigns like this, we can arm the public with the information they need to keep themselves safe,” Grahame added. “Simply having a secure phrase with trusted friends and family – which you never share digitally – is a quick and easy way to make sure you can verify who's on the other end of the phone.”


