Your voice may be compromised. It's not a risk that financial advisors had to take seriously a decade ago. Today, it is a very real threat.
For one successful advisor who spent decades building a well-curated list of high-net-worth clients, the threat hit home. After spending $8,000 on a video project for his website, Ken Brown postponed production. A member of his study group had been duped out of $650,000 by a sophisticated, convincing deepfake, and Brown began to see himself as a target.
He worried that a recording of his voice in a video on his website would make him more accessible to scammers. And he wasn't alone. His study group counterparts started talking about removing the videos from their own websites as a tactic to stay out of scammers' crosshairs. Their reasoning? If they were targeted, Brown said, “It would be a huge blow to the bottom line.”
Hackers can steal your voice
Voice deepfakes occur when a hacker takes a recording of your voice, clones it, and then manipulates it. Suddenly you sound like you're saying things you've never said. The explosion of AI gave fraudsters the technology needed to easily clone voices.
From fake product endorsements to misinformation, a voice can now be weaponized. In her latest legal fight, actress Scarlett Johansson claims it just happened to her: she accuses OpenAI of copying her voice for ChatGPT's new personal assistant. Taylor Swift was targeted in 2023 with an AI-generated hoax video endorsing Le Creuset cookware, and in 2024, explicit deepfake images made to look like Swift flooded the internet. Celebrities are in the headlines, but financial advisors are also at risk.
Advisors are vulnerable to voice deepfakes
Anyone who manages money can be a target. Imagine if someone took a small sample of your real voice, created a clone of it, and then used it to direct a large bank transfer or break into a bank account. It happened at Bank of America. It could happen anywhere.
Think of all the audio content at a hacker's virtual fingertips. Recorded speeches, videos on social media, even a phone call or a Zoom meeting can be recorded and then edited. 60 Minutes showed how quickly and easily someone can scam you with the latest voice-cloning tools. This is why advisors need to be aware and adapt.
Advice for financial gatekeepers
Deepfake technology leaves advisors in a delicate situation.
Like Brown, you may be wondering, “Should I stop marketing with videos or podcasts?”
Cybersecurity defense expert Brian Edelman says emphatically “no.” You cannot sacrifice the ability to grow your business. Despite all the scams he has seen as CEO of cybersecurity company FCI, Edelman is still confident that giving up marketing is not the answer.
“I don't think fear is the way we deal with this,” Edelman says. “I think knowledge is the way we deal with this.” Instead of hiding, he says advisors should come up with a plan. Edelman recommends these three steps:
1. Take responsibility
Owning the risk and every misstep you make is where to start. “When the financial advisor makes mistakes, then they come under the microscope of 'Did you have the knowledge to protect your client?'” Edelman says. He emphasizes that it is your fiduciary responsibility as a financial advisor to protect your clients.
2. Train your team and your clients
Make it clear to everyone what kind of information you will never ask for over the phone or in an unencrypted email. Your protocol for using code words and old-school multifactor authentication should be an ongoing part of your internal training and client education.
Let clients know: This is how we do things. We validate and verify.
Do you have elderly clients who forget their code words? Add a step to your process: at each meeting, review their security code word and remind them of your protocols.
Discuss your plan with your clients constantly. Recap it in meetings and incorporate the message into your marketing (blogs, newsletters, videos, podcasts and website landing pages). Let clients know you take the threat seriously and have processes and protocols in place.
3. Practice your answer
To protect against voice spoofing and other cybersecurity threats, Edelman suggests testing your team with what's known as “incident response,” a practice common in both cybersecurity and law enforcement. Have your team practice how you would respond to different threats.
“What happens if I put this video out there and some scam artist or bad actor uses my voice to do something bad?” Edelman asks. “Better to do it in an incident drill than in reality. So just pretend it happened.”
By pretending, you will gain valuable insight into how to protect yourself in any threat scenario. Then use what you've learned to create your own incident response plan. It turns something you fear into an opportunity to take client advocacy to the next level.
According to F-Secure, a cybersecurity technology company, only 45% of companies have an incident response plan.
The first line of defense
Will there be less to worry about next year?
Don't count on it.
“It's going to be harder and harder to know if we're talking to the people we think we're talking to or a deepfake,” Edelman says. “The more you become aware of the things you fear, the more empowered you are to stop fearing and turn that fear into a strength.”
For advisors, being the first line of defense can be intimidating. It can also inspire change. Brown's team used the scare as a wake-up call to build even more security controls into their process, including:
- Visual identification: His team uses FaceTime calls so they know they're really talking to a client.
- Callbacks: Because hackers can spoof caller IDs, when a client calls with a request to move money, Brown's team tells the client they'll hang up and call them back.
- Home office help: After going to his broker/dealer's cybersecurity team and asking for additional help, Brown had a special tag added to clients' accounts. If one of these clients calls the home office and requests a transaction or access to funds, the home office routes the call to Brown's team.
“It's amazing how good these people are,” Brown says. Staying ultra-sensitive to the threat and to hackers' ability to do damage may actually be his best asset. The first question every advisor should ask is, “How can I fight it?”
Laura Garfield is the co-founder of Idea Decanter, a video marketing company that creates personalized remote videos for financial advisors.