
Beware: How scammers are using AI to sound like your loved ones – 3 tips to stay safe

Received a call from an unknown number claiming that a friend or family member has been arrested or involved in an accident? Chances are it's a scammer using an AI voice cloning tool to dupe you of your money.

With AI voice cloning tools on the rise, scammers are jumping on the opportunity to dupe unsuspecting victims of their money. (AI Generated)

Sunil Mittal, the chairman of Bharti Airtel, recently revealed that scammers cloned his voice using AI and called a company executive, asking them to transfer a large sum of money. Speaking at the NDTV World Summit, the businessman said the AI-cloned voice was so convincing that he was stunned when he heard the recording.

While the executive did not fall for the voice clone, scammers have successfully used voice cloning to dupe other people of their money. With such scams on the rise, there have been several instances in the last few months of people losing anywhere from a few thousand rupees to more than a lakh.

What is AI voice cloning?

AI voice cloning uses artificial intelligence to create a digital replica of a person's voice. The process involves recording a sample of someone's voice and training an AI model on it until it can convincingly imitate the speaker.


AI voice cloning has a wide range of legitimate uses, such as creating realistic voices for text-to-speech tools, personalising customer interactions, and producing educational material. However, the technology can also be used by fraudsters to impersonate someone in your family; in some cases, they clone children's voices and dupe parents of their money by claiming the child is implicated in a criminal case.

How easy is it to clone someone’s voice?

Using AI-powered tools to clone someone's voice is fairly easy. Just run a quick search on Google and you will find several free and paid services, some charging as little as $5 (approx. Rs 420) to clone a person's voice.

All you need to clone someone's voice is a 30-second clip of the voice you want to replicate. Upload that audio clip to one of the many voice cloning services available, follow a few steps, and you will be able to make it say anything in that person's voice.

How do AI voice clone scams work?

Even if you are aware of most modern scams, such as YouTube like-to-earn scams and WhatsApp job offers, AI voice cloning scams can be convincing because they appear to involve your friends, family or someone else you know.


Scammers often impersonate law enforcement officials such as police officers, or officials from known government agencies like the Narcotics Control Bureau, claiming that your friend, relative or child has been taken into custody in a criminal case. In some instances, they may instead claim that someone you know has been involved in an accident and ask you to transfer money to an unknown number so they can take the person to hospital.

These scammers pressure victims by creating a false sense of urgency and demanding that you immediately transfer money to secure the person's release. In cases where parents were duped by AI-cloned voices of their children, they say they heard their child crying over the phone and that the voice sounded exactly like them, which can be convincing even for someone wary of such scams.

How to identify AI voice cloning scams?

1. Check the country code

While some fraudsters may reside in India, people who lost money to the scam say that the phone number often starts with +92, which is the country code for Pakistan. Scammers operating out of Pakistan have been duping people in India for quite some time now, with some reports dating back to 2010.


Any Indian government official, be it the police or another law enforcement agency, will call from a number that starts with +91, India's country code. Also, if anyone tells you that you have been placed under 'digital arrest' and that you have to stay in front of a screen for hours, it is a scam.
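
For readers who screen calls programmatically (for example, in a personal call-filtering script), the country-code tip reduces to a simple prefix check. The sketch below is a minimal illustration in Python; the trusted-prefix list and the sample numbers are hypothetical examples, not a complete safeguard.

```python
# Minimal illustration of the country-code tip: flag international numbers
# that do not carry India's +91 prefix. The trusted-prefix list and the
# sample numbers below are hypothetical examples.
TRUSTED_PREFIXES = ("+91",)

def looks_suspicious(number: str) -> bool:
    """Return True for international numbers outside the trusted prefixes."""
    cleaned = number.replace(" ", "").replace("-", "")
    return cleaned.startswith("+") and not cleaned.startswith(TRUSTED_PREFIXES)

print(looks_suspicious("+92 300 1234567"))   # True  (+92 is Pakistan's code)
print(looks_suspicious("+91 98765 43210"))   # False (Indian number)
```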

2. Call or talk to the person if possible

If you get a call from an unknown number claiming to be with someone you know, avoid sending any money until you have spoken to that person for a few minutes. If possible, call the person the scammer is referring to directly and verify the details to see whether there is any truth to the claim or whether someone is impersonating them.

3. Change in speaking tone and inconsistent speech patterns

If you notice stutters, delayed replies, a robotic tone or sudden changes in speaking style, chances are the person on the other end is not real. While AI voice cloning tools are fairly good at maintaining the natural flow of a conversation, they often fumble when replying to unscripted or spontaneous questions.

To check whether there is a real person on the other side, ask about things only they would know, such as what time they usually come home or the names of the people they live with, and use the answers to determine whether you are speaking to a real person or an AI.


One easy giveaway that a caller is using AI voice cloning to impersonate someone is that you won't hear any breathing during the conversation. AI-generated speech also tends to lack natural acceleration and deceleration, so if someone on the call is talking at a constant speed without taking a breath or pausing, it is likely the work of artificial intelligence.
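
As a rough illustration of the "constant pacing" cue, the sketch below (assuming Python with the librosa and numpy libraries, and a recording of the suspicious call saved locally as a hypothetical "call_sample.wav") measures how much the pauses in a clip vary. Near-zero variation is one weak hint of synthetic speech; treat it as a heuristic, not a reliable detector.

```python
# Heuristic sketch: AI-generated speech often has unnaturally uniform pacing.
# This measures how much the pauses between spoken segments vary in a clip.
# "call_sample.wav" is a hypothetical recording of the suspicious call.
import librosa
import numpy as np

def pause_variation(path: str, top_db: int = 30):
    y, sr = librosa.load(path, sr=None)
    voiced = librosa.effects.split(y, top_db=top_db)  # non-silent intervals
    # Gaps between voiced segments are the pauses, converted to seconds.
    pauses = np.array([(voiced[i + 1][0] - voiced[i][1]) / sr
                       for i in range(len(voiced) - 1)])
    if len(pauses) < 3:
        return None  # too little speech to judge
    # Coefficient of variation: low values mean very uniform pacing.
    return float(pauses.std() / pauses.mean())

cv = pause_variation("call_sample.wav")
if cv is not None and cv < 0.2:  # threshold picked purely for illustration
    print("Pacing is unusually uniform; treat the caller with extra suspicion.")
```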

In the coming years, generative AI capabilities are only going to get better, and future versions of these voice cloning tools will be even more accurate at replicating human emotion and speech. So, even if you are wary of such scams, we advise you to avoid picking up calls from unknown numbers, especially ones that do not start with the +91 prefix. Also, before making any financial transaction, make sure to verify the caller's identity.

Anurag Chawake is a Senior Sub-Editor at indianexpress.com. His fascination with technology and computers goes back to the days of Windows 98. Since then, he has been tinkering with various operating systems, mobile phones, and other things. Anurag usually writes on a wide range of topics including Android, gaming, and PC hardware, among other things related to consumer tech. His Twitter, Instagram, Facebook and LinkedIn username is antechx.
