[Image: A woman’s hand holding a cellphone with an incoming call from an unknown number.]
July 22, 2024 Business Banking

Defend Yourself Against AI Voice Scams

Phishing scams are nothing new, but technology is constantly evolving, and so are the tactics scammers use.

 

In this article we’ll cover:
  • What is AI Voice Cloning? Learn how scammers use AI to create realistic voice imitations.
  • How Scammers Obtain Voice Recordings: Discover the tactics used to gather voice samples.
  • Types of AI Voice Scams: Understand the different ways scammers might target you, from emergency family scams to politically motivated deceptions.
  • Red Flags: Learn to spot the signs of an AI voice scam, such as urgency and requests for untraceable payment methods.
  • Protection Tips: Equip yourself with practical tips to avoid falling victim, including the power of a family codeword.

Staying abreast of trending scams can help you protect yourself from fraud attempts.

One of the newest and most alarming threats is AI voice scams, which leverage sophisticated artificial intelligence to impersonate the voices of loved ones or trusted figures.

Imagine receiving a frantic call from your child or grandchild, claiming they’ve been arrested or injured while traveling in a foreign country and desperately need money for bail or medical care.

Understandably panicked at such terrifying news, you rush to send the requested funds through a wire transfer or gift card, only to discover later that your loved one is safe and sound, and your money is gone.

This scenario, unfortunately, is becoming increasingly common. Scammers can target anyone, but family scams are particularly effective because they prey on our emotions and concern for loved ones.

Let’s explore AI voice cloning technology, how it’s used in scams, and most importantly, how you can protect yourself and your loved ones from falling victim.

What is AI Voice Cloning?

Artificial Intelligence (AI) has revolutionized many aspects of our lives, but it also comes with its own set of risks. AI voice cloning technology is a prime example. This technology can mimic a person’s voice with remarkable accuracy, using just a few seconds of recorded audio. Scammers obtain these recordings from social media videos, voicemail messages, or any other publicly available sources.

How Does AI Voice Cloning Work?

Imagine a world where a machine can learn to mimic your voice perfectly, down to the subtle inflections and nuances that make it uniquely yours. That’s the power behind AI voice cloning.

Scammers can take short audio clips – culled from social media posts, voicemails, or even robocalls – and feed them into an AI program. The program then analyzes the audio, learning the speaker’s voice patterns and mannerisms. This allows the AI to synthesize speech that sounds eerily similar to the original speaker.

How Scammers Obtain Voice Recordings

Scammers are resourceful when it comes to gathering voice samples. They might use social engineering tactics to trick individuals into recording their voice or simply scrape voice samples from online platforms. They may also use recordings from robocalls you’ve answered, so it’s wise not to engage with spammy phone calls.

Once they have enough data, they feed it into an AI model that can replicate the voice and generate new audio clips that say anything the scammer wants.

How AI Voice Scams Work

Scammers use AI voice cloning to create convincing audio messages. They can impersonate friends, family members, or even official representatives from institutions like banks. The cloned voice can then be used in various scenarios to deceive victims into taking actions that benefit the scammer.

Example of an AI Voice Scam

Imagine receiving a call that sounds like your sibling’s voice, claiming they are in urgent need of money due to an emergency. The voice sounds familiar, and under pressure, you comply with their request. Later, you realize it was a scam, and the funds you transferred are gone. This is a typical example of how AI voice scams can unfold.

Types of AI Voice Scams Affecting Consumers

AI voice scams are a rapidly emerging threat that targets a diverse range of victims. These scams leverage sophisticated AI technology to clone voices and create compelling, fraudulent communications that are difficult to distinguish from genuine ones. Understanding the various forms these scams can take and the techniques they employ is crucial for safeguarding oneself against potential losses.

Emergency Family Scams

Emergency family scams are designed to hit targets where they are most vulnerable: their emotions. Few things are more personal than family, so people tend to respond to a family emergency quickly and with fewer questions, and scammers count on exactly that reaction.

  • Exploiting Emotional Vulnerabilities: Scammers will target emotions like love, concern, and panic to cloud your judgment. They might claim a loved one is in trouble, injured, or arrested, and needs immediate financial assistance to avoid a dire situation.
  • Highly Personalized Touch: Scammers can use snippets of information gleaned from social media or other online sources to personalize the scam. They might use real names, locations, or even reference recent events to make the scenario seem more believable.
  • Pressure to Act Quickly: Scammers will often create a sense of urgency, urging you to send money immediately without verifying the situation. They might claim there’s no time to contact other family members or explain details over email.

Politically Motivated Scams

This particular type of scam is not necessarily an attempt to trick you out of money; instead, it is meant to promote a political cause or discredit someone else’s. Scammers take recordings of well-known political figures and use AI to make those voices say whatever advances their agenda. Here are a few examples:

  • Campaign support: Scammers may call during an election season to deliver messages, solicit support, or provide information. For example, the supposed caller may appear to endorse a particular candidate or cause.
  • Attack ads: An AI clone of a recognizable voice, like the president or another well-known political figure, may target opponents with negative or misleading information disguised as coming from a trusted source.
  • Manipulating public opinion: Scammers may try to spread disinformation or propaganda using a familiar voice.

Red Flags: How to Spot an AI Voice Scam

The keys to staying safe from AI voice scams are awareness and just a dash of skepticism. Here are some red flags that should make you pause and say, “Wait a minute”:

Urgency 

Scammers will often try to create a sense of urgency to cloud your judgment. Be cautious if you’re asked to act immediately without time to think. Don’t be pressured into making a quick decision, especially involving money.

Untraceable Payment Methods 

Requests for wire transfers, gift cards, or cryptocurrency should raise alarm bells. Legitimate businesses will not request payment through these channels. Scammers prefer them because they are virtually untraceable: once the money is sent, it is nearly impossible to recover.

Unknown Numbers

Be wary of calls from unknown numbers. Scammers can use technology to “spoof” phone numbers, masking the true source of a call and giving themselves anonymity.

Staying Calm is Key 

In the heat of the moment, it’s easy to panic. However, staying calm and composed is crucial. Pause to assess the situation logically before taking any action.

If you receive a suspicious call, take a deep breath and try to stay calm. Don’t give out any personal information, and politely tell the caller you’ll get back to them.

Tips for Protecting Yourself from AI Voice Scams

While AI voice scams can sound sophisticated and intimidating, the good news is there are concrete steps you can take to protect yourself. Here are some essential tips to keep you safe:

  • Don’t Answer Calls from Unknown Numbers: One of the simplest ways to protect yourself is to avoid answering calls from unknown numbers. Let these calls go to voicemail, then decide whether they require a response.
  • Verify Information: If you receive a suspicious call, verify any information the caller gives you. Call the person back at a known number or contact a mutual acquaintance to confirm the story. If it’s a political call, do your homework to be certain the information is true and accurate. If you’re asked to donate to a cause, make sure the organization is reputable and has a verifiable presence, and make any donations online through a trusted website rather than over the phone.
  • Don’t Overshare on Social Media: One of the ways scammers collect voice samples for AI cloning is through social media. Even a short video of you or a family member could be enough to create a convincing copy of a voice. Limit what you share online, and make sure your posts are visible only to friends and family.
  • Keep Personal Information Private: Avoid sharing personal information unless you are absolutely certain of the caller’s identity.

Simple but Effective: The Power of a Family Codeword

Here’s a low-tech yet surprisingly effective way to protect yourself and your family: create a secret codeword. This could be a random phrase or inside joke that only your family would know. If someone claiming to be a loved one calls and asks for money, simply ask for the codeword. AI can mimic a voice, but it can’t guess a secret codeword.

Reporting Scams and Learning More

If you encounter a scam, report it immediately. For anything relating to your accounts with Saint Louis Bank, you can reach us during business hours at (314) 851-6200.

In the United States, national resources such as the Federal Trade Commission (FTC) offer information on consumer protection and how to avoid falling victim to scams.

Stay Vigilant, Stay Safe

AI voice scams represent a new frontier in the world of fraud. By staying vigilant and informed, you can protect yourself from these schemes. Understanding the tactics scammers use and taking the precautions outlined above can significantly reduce your risk of falling victim to a scam.

Remember the key points—avoid unknown numbers, verify information, and use a family codeword. Share this information with friends and family to help them stay safe. Awareness and preparedness are your best defenses against AI voice scams. 

Learn more about keeping your online activity safe by visiting our Online Security site, where you can find more resources and information on this and related topics!