
AI Scams and How to Protect Your Loved Ones

Shania     27 October 2023     Security

We have learned previously that with the rise of AI usage worldwide, many people see AI as a tool that helps them increase productivity. However, we also learned that AI brings concerns to many. One of the main concerns surrounding AI usage is security, as cybercriminals have started to adopt AI to scam people. In this blog, we are going to discuss the types of AI scams, how to detect them, and how to protect your loved ones from them.

Types of AI Scams

AI Audio Scam Calls

One of the most popular scam tactics cybercriminals use is voice mimicking. AI-powered deepfake software enables people to mimic the voices of others, and scammers use these tools to trick people into giving them money.

Cybercriminals start by collecting personal information from social media or the dark web, including family names, phone numbers, and voice recordings. They then use AI-powered software to clone the voice they collected. Finally, the scammers call the voice owner's family members or friends and demand money in exchange for their loved one's safety. Out of panic, victims often end up transferring large sums of money to these scammers.

McAfee's survey found that 25% of adults had either experienced an AI voice scam themselves or knew someone who had: 10% were targeted personally, and 15% said it happened to someone they know. The survey also found that 77% of the victims lost money as a result of these scams.

Jennifer DeStefano, a US citizen, is an example of someone who was targeted by cybercriminals using voice-mimicking techniques. The scammers mimicked her daughter's voice and claimed she had been kidnapped, in an attempt to trick DeStefano into transferring $1 million. Even though she did not fall into the trap, her case shows that anyone can become a victim of these scams. Thus, we must never let our guard down.

AI Video Scams

Deepfake is an AI-powered technology that allows its users to create new videos and audio from existing images, voice recordings, and videos of a person, without requiring the actual person's involvement. Aside from generating fake audio for scam calls, cybercriminals have also been using this technology to create misleading videos to advertise their businesses. Scammers are also starting to use it to make fake video calls that convince people to transfer money.

All it takes is audio, photos, and videos of the person being impersonated. As mentioned previously, scammers can find these easily on social media profiles and the dark web. This explains why there are so many fake videos of public figures online: their footage is already all over the internet.

Actor Tom Hanks is an example of a public figure whose likeness has been used by cybercriminals to generate fake video content. Scammers used his face and voice to create a video of him promoting a dental insurance plan, even though he clarified that he had nothing to do with the business.

In 2022, scammers used the same tactic to create a fake video of Elon Musk promoting a new cryptocurrency. In another case, a man in India lost approximately $480 to a scammer who used AI to impersonate his former colleague in a WhatsApp video call.

Characteristics of AI Scams

Most AI scams, whether through video or audio calls, usually contain at least one of these characteristics:

Pressure for immediate action

Scammers have one thing in common: their end goal is to extort money from you. They will pressure you into immediately transferring money into their accounts.

Use scare tactics

People usually cannot think clearly when they are panicking, so scammers often use scare tactics, such as claiming that your loved one has been kidnapped, is in danger, or is sick, to make you panic and eventually transfer your money.

Naming people or companies that you are likely to be familiar with (family members, coworkers, large corporations)

People are reluctant to give their money to someone they don't know. Thus, scammers will always try to name someone you will immediately recognize while trying to extort money from you. This can be a famous public figure, your co-workers, or your family members.

Hard-to-resist offers (large sums of money, easy jobs, etc.)

Everyone loves money, and scammers take advantage of this to make their scams tempting, tricking people into believing that joining their business or clicking on their websites will earn them lots of money.

Requesting personal or sensitive information

Credible organizations like banks won't ask for your personal information through calls or messages, so if an unknown number contacts you asking for details such as account numbers or passwords, it is most likely a scam.

How to Protect Your Loved Ones

Educate yourself and the people around you

Cybercriminals are constantly evolving along with the growth of technology, so it is important to keep yourself updated on the most recent scam tactics. You can do this by regularly reading or watching tech news. Once you have educated yourself, don't forget to educate the people around you as well. Share those insights in your group chats, and give reminders when you see them in person. This especially applies to the people around you who are older and/or less proficient with technology, as they are the most vulnerable to these advanced AI scams.

Be extra careful when someone is asking for money or personal information

As mentioned previously, the end goal of every scam is to extort money from you. Thus, when someone suddenly asks for money or personal information, regardless of the amount or who they claim to be, you need to be extra careful.

Check and recheck, and don't let the panic get to you

When someone asks for your money or personal information, always verify their identity before complying. Call your bank's official customer service number to confirm whether it is really them, or call and/or text other family members or friends to make sure the person who contacted you is really who they claim to be, and that they are actually in a situation where they need that money or information.

Don't reveal too much information online

We have learned that scammers can impersonate you easily by collecting videos that feature your face and audio that features your voice. Scammers can also take advantage of the personal information you share on social media, such as when you will be out of town with your friends or when your parents will be away from home, to make their scams more convincing. Thus, we must always be careful with what we share online and make sure we don't reveal too much, as anyone can be targeted by these scammers.

Try to have a "code word" for your family

When it comes to your loved ones, try to agree on an exclusive code word: a unique word you can use to verify each other's identity. You can ask for this code word when someone claiming to be a family member calls you from an unknown number.

Conclusion

We can conclude that cybercriminals have become more and more advanced, as they are now using AI to make their scams even more convincing. Their scam calls now mimic the voices of people we are familiar with, and the same goes for their scam video calls. Anyone can be a victim of these scams, but it is still possible to protect ourselves and our loved ones. We can do this by continually educating ourselves and our loved ones, being cautious of people who ask for money, re-checking a caller's identity and credibility, not revealing too much information online, and having an exclusive code word for our family.
