Prime Minister Narendra Modi on Friday (November 17) said it is important to understand how artificial intelligence (AI) works, as it can be used to create ‘deepfakes’ that deliberately spread false information or serve malicious ends. He also urged the media to spread awareness about the issue.
“I recently saw a video in which I was seen singing a Garba song. There are many other such videos online,” said PM Modi, adding that the looming threat of deepfakes has become a serious concern and could create problems for everyone.
Recently, a controversy erupted after a video of actress Rashmika Mandanna entering an elevator went viral on social media. What initially appeared genuine was, in fact, a deepfake. The original video featured a British Indian woman, Zara Patel, whose face had been digitally replaced with Mandanna’s. Beyond this incident, there has been a surge of deepfake audio and video clips of political leaders on platforms like Instagram.
So how can you identify deepfake videos and audio?
Dealing with AI voice clones and deepfake videos requires vigilance and proactive measures. Here are some steps to follow whenever you come across a video or audio clip on social media:
1. Unnatural eye movements: Deepfake videos often exhibit unnatural eye movements or gaze patterns. In genuine videos, eye movements are typically smooth and coordinated with the person’s speech and actions.
2. Mismatches in colour and lighting: Deepfake creators may have difficulty replicating accurate colour tones and lighting conditions. Pay attention to any inconsistencies in the lighting on the subject’s face and surroundings.
3. Compare and contrast audio quality: Deepfake videos often use AI-generated audio that may have subtle imperfections. Compare the audio quality with the visual content.
4. Strange body shape or movement: Deepfakes can sometimes result in unnatural body shapes or movements. For example, limbs may appear too long or short, or the body may move in an unusual or distorted manner. Pay attention to these inconsistencies, especially during physical activities.
5. Artificial facial movements: Deepfake software may not always accurately replicate genuine facial expressions. Look for facial movements that seem exaggerated, out of sync with speech, or unrelated to the context of the video.
6. Unnatural positioning of facial features: Deepfakes may occasionally exhibit distortions or misalignments in facial features such as the eyes, nose, and mouth, which can be a sign of manipulation.
7. Awkward posture or physique: Deepfakes may struggle to maintain a natural posture or physique. Pay attention to any awkward body positions, proportions, or movements that appear unusual or physically implausible.
8. Verify before you share: Verify the source of the audio or video clip. If the source is not reliable, avoid sharing.
9. Stay informed: Keep yourself updated on the latest political developments and the statements made by key political leaders, so that you don’t fall for a widely shared audio or video clip of a politician supposedly making a controversial statement.
10. Use AI detection tools if possible: Several AI voice detectors are available online, though unlike many AI voice cloning tools, they are generally not free. Services such as aivoicedetector.com and play.ht can be used to detect AI-generated voices.