
As elections approach, how to deal with audio deepfakes?

What are audio deepfakes? Why are they so hard to recognise? And how to spot one?

Audio deepfakes have become increasingly hard to spot. (Pixabay)

Generative AI has become remarkably good over time, good enough to make you question the audiovisual content you consume daily. It is now extremely easy to clone someone’s voice to create fake audio or video clips; in other words, to create deepfake audios.

In the arena of electoral politics, such cloning can be put to dangerous use, spreading misinformation with a new level of effectiveness. Just clone the voice of any political leader, superimpose the audio onto an existing video clip, and share.

Take for instance the above Instagram handle, which creates AI voice clones of Prime Minister Narendra Modi and Kerala Chief Minister Pinarayi Vijayan. While in this case, the creator does mention that these videos are fake and for entertainment purposes only, if shared without a watermark they are likely to fool many a lay person.

So, how does one recognise such doctored content?

First, what are AI voice clones or deepfake audios?

AI voice clones or deepfake audios refer to the use of artificial intelligence (AI) technology, particularly deep learning algorithms, to generate synthetic or manipulated voice recordings that mimic the voice of a specific individual. The technology has advanced considerably in recent years, and it is now possible to create highly realistic and convincing audio forgeries.


How exactly are AI voice clones made?

Creating a clone of anyone’s voice is very easy. All you need is a laptop with a good internet connection and an audio clip of the person whose voice you wish to clone.

We spoke to Siva, the creator behind the above-mentioned Instagram handle. “Using the website covers.ai, one can simply upload an audio clip and then select the voice they want; the audio will be ready within 5 minutes,” he said. “Also, anyone can create their own voice clone on that website by paying only Rs 399. They need to upload a good-quality audio clip of the voice that is at least 3 minutes long, and then wait for a week. The website will create their AI voice clone, and they can use it to create any song or audio for a lifetime,” he explained.

There are other online tools like play.ht and Eleven Labs that can be used to create AI voice clones easily. There are also several tutorials available on YouTube on making AI voice clones.

https://www.youtube.com/watch?v=sA8qWKOhS9M
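
To get a sense of how little effort this takes, below is a minimal Python sketch of the kind of request such a service accepts. It is modelled on Eleven Labs’ publicly documented text-to-speech API; the endpoint path, header name and JSON fields are assumptions to be checked against current documentation, and the API key and voice ID are hypothetical placeholders.

    # A minimal sketch of a voice-clone text-to-speech request. The endpoint,
    # header and JSON fields follow Eleven Labs' public API as documented at
    # the time of writing and may have changed; API_KEY and VOICE_ID are
    # hypothetical placeholders (a service assigns a voice ID after a voice
    # sample is uploaded).
    import requests

    API_KEY = "your-api-key"        # hypothetical placeholder
    VOICE_ID = "cloned-voice-id"    # hypothetical placeholder

    response = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": API_KEY},
        json={
            "text": "A sentence the speaker never actually said.",
            "model_id": "eleven_multilingual_v2",
        },
    )
    response.raise_for_status()

    # The service returns synthesised speech, typically as MP3 bytes.
    with open("cloned_audio.mp3", "wb") as f:
        f.write(response.content)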

Why has spotting audio deepfakes become hard?

Earlier, audio deepfakes were fairly robotic and unrealistic, making them easy to detect. However, technology has progressed significantly since then.


“With the help of advanced AI, deepfake videos and images are increasingly being created by taking advantage of content posted on public social media profiles,” Aaron Bugal, field CTO, Asia Pacific and Japan, at Sophos, told The Indian Express. “While setting social profiles to private and limiting them to only known friends or contacts can help limit overt exposure, it isn’t a guarantee that someone among them won’t repost it or use it for nefarious purposes,” he said.

Sophos is a global cybersecurity company.

So, what can be done?

One way to deal with deepfakes is for the authorities to crack down on the social media platforms that host such content.

“It is reassuring to see the Indian Ministry of Electronics and Information Technology (MeitY) send an advisory to social media companies urging them to tackle deepfake content. In the advisory, the government also warned social media intermediaries that failing to remove deepfake information from their platforms might result in penalties such as losing safe harbour rights, among other things. Such stringent advice from the government can help to flatten the curve of data being exploited to create deepfake content,” Bugal said.


“As a protective measure, digitally signed videos can be a way to verify that content can be trusted. Much like how certificates are used to validate website security and email communications, the same could be used for validating digital media. As technology evolves and deepfake production times shrink while quality vastly improves, a point may come where it’s impossible to distinguish a deepfake from real recorded content; therefore, validating content as true using a signing or verification process is needed,” he added.
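
To make the idea concrete, here is a minimal sketch of signing and verifying a media file with an Ed25519 keypair using Python’s cryptography library. Real provenance standards, such as C2PA, embed signed metadata inside the media file rather than using a detached signature, and the filename below is a hypothetical example.

    # A minimal sketch of detached signing and verification of a media file
    # with an Ed25519 keypair (Python "cryptography" library). Provenance
    # standards such as C2PA embed signed metadata in the file itself;
    # "speech_clip.mp4" is a hypothetical example file.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The publisher signs the video bytes with their private key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    with open("speech_clip.mp4", "rb") as f:
        video_bytes = f.read()
    signature = private_key.sign(video_bytes)

    # Anyone holding the publisher's public key can check that the clip has
    # not been altered since it was signed; even a one-byte change fails.
    try:
        public_key.verify(signature, video_bytes)
        print("Signature valid: content is unmodified.")
    except InvalidSignature:
        print("Signature invalid: content may have been doctored.")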

How can you identify deepfakes?

Dealing with AI voice clones and potential audio deepfakes requires vigilance and proactive measures. Here are three steps you should follow whenever you see a video or audio clip on social media.

  1. Stay informed: Keep yourself updated on the latest political developments and the statements made by key political leaders, so that you are not taken in by a widely shared audio or video clip of a politician supposedly making a controversial statement.
  2. Verify before you share: Verify the source of the audio or video clip. If the source is not reliable, avoid sharing.
  3. Use AI detection tools if possible: Certain AI voice detectors are available online, though unlike many AI voice cloning tools, these are not free. Examples include aivoicedetector.com; play.ht can also be used to detect AI-generated voices. (A sketch of how such a detector might be queried follows this list.)
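
As a purely illustrative sketch of querying such a detector programmatically: the service name deepcheck.example, its endpoint and its response fields are all hypothetical stand-ins, and the real tools named above expose their own (often paid) interfaces whose documentation should be consulted.

    # Purely illustrative: "api.deepcheck.example", the /v1/detect endpoint
    # and the "ai_probability" field are hypothetical stand-ins; consult the
    # documentation of a real detector before relying on any of this.
    import requests

    with open("suspicious_clip.mp3", "rb") as f:
        response = requests.post(
            "https://api.deepcheck.example/v1/detect",  # hypothetical endpoint
            files={"audio": f},
        )
    response.raise_for_status()

    result = response.json()
    # Detectors typically return a score or probability that a voice is synthetic.
    print(f"Probability the clip is AI-generated: {result['ai_probability']:.0%}")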

Why should we be extra vigilant ahead of elections?

There are several audio clips of political leaders available online. And all that is required to create a flawless audio clone is a clip of barely one or two minutes. During elections, this misinformation can influence public opinion, damage the reputation of candidates, and manipulate the democratic process.

Also, sections of the Indian Penal Code and the IT Act can be invoked by the police against those spreading misinformation and fake news.
