“That won’t happen to me” is something many business owners say when the topic of cyber scams and adequate protections comes up, but these days it’s a reckless statement that you definitely don’t want your clients, employees or banker to hear.

Generative artificial intelligence (AI) tools are allowing scammers to produce deepfakes to defraud their targets. Earlier this year, Clive Kabatznik, an investor in Florida, called his local Bank of America representative to discuss a big money transfer he was planning to make.

Immediately after this legitimate call, a scammer called the bank back using an AI-generated deepfake voice of “Clive” to convince the banker to transfer the money to another account. Fortunately, the banker was suspicious enough that no money was transferred, but not everyone is as lucky.

According to a report titled Artificial Imposters from the cyber security firm McAfee, 77% of AI voice scams succeeded in getting money from their targets. Even scarier, AI tools can clone a voice from just three seconds of audio.

A UK-based energy firm’s CEO was the victim of a voice scam when he thought he was talking to his boss, the CEO of the parent company based in Germany. The voice on the other end of the line instructed him to send the equivalent of $233,000 to a Hungarian supplier. The voice was so convincing, down to the slight German accent, that the CEO complied without hesitation. By the time they realized what had happened, the money had already been transferred to Mexico and distributed to other locations that weren’t traceable.

But big businesses aren’t the only ones targeted.

Jennifer DeStefano, the mother of a 15-year-old girl, recounted during a US Senate hearing her terrifying encounter with an AI scammer who used her daughter’s cloned voice to try to convince her that the girl had been kidnapped. Fortunately, her daughter was in her bed sleeping at the time, and Jennifer was able to realize it was a scam. Many others aren’t as lucky as Jennifer and are getting scammed by AI voices of grandchildren, children and other loved ones who “urgently need money.”

This approach is still so new that there’s no comprehensive accounting of how often it’s happening, but the CEO of Pindrop, a security company that monitors audio traffic for many of the largest US banks, said he had seen a jump in its prevalence this year – and in the sophistication of scammers’ voice-fraud attempts. Another large voice-authentication vendor, Nuance, saw its first successful deepfake attack on a financial services client late last year.

With AI technology advancing rapidly and becoming cheaper and more widely available, and with recordings of people’s voices all over TikTok, Facebook, Instagram and YouTube, the conditions are perfect for voice-related AI scams.

What do you need to do to protect yourself?

For starters, share this article to make sure your staff is aware of these types of scams. Next, instruct them to ALWAYS check with you via a text message or other means BEFORE transferring money. If you’re not a business owner, you can do the same with your family, using a code word or other means of verifying the caller’s legitimacy.

Also, check the caller ID. If it’s a number you don’t recognize, or a blocked number, that’s a BIG red flag that it’s a scam. Even if the person on the other end sounds exactly like someone you know, hang up and call them back directly, or call the place they’re supposed to be (school, office, etc.).

If the person is calling with on-fire urgency and demanding a wire transfer or a Bitcoin payment, that’s another huge red flag. Real emergencies don’t come with suspicious, untraceable payment demands.

In business, you’ve clawed and climbed your way to the top, dodging all sorts of pitfalls and predators that have tried to make you their meal. Such threats are everywhere, and the higher you climb, the more you’ll find hiding behind every tree, every rock and every step. No matter how small or insignificant you might think your business is, you ARE a target for someone, and being casual about cyber security and the threats these cybercriminals pose is a surefire way to get robbed.

If you don’t want this to happen to you, click here to request a free Cyber Security Risk Assessment to see just how protected your organization is against known predators. If you haven’t had an independent third party conduct this audit in the last six months, you’re due.

It’s completely free and confidential, without obligation. Voice scams are just the latest in a tsunami of threats aimed at small business owners, and the most susceptible are the ones who never “check the locks” to ensure their current IT company is doing what it should. Claim your complimentary Risk Assessment today.