This is an archive article published on November 6, 2023

‘Deepfake’ video showing Rashmika Mandanna: How to identify fake videos

With improvements in technology related to artificial intelligence (AI), deepfakes are becoming common on the internet. These include pictures, audio or videos. Here's how such deepfakes can be spotted.

Mandanna responded to the video on the platform X, saying, "Something like this is honestly, extremely scary not only for me, but also for each one of us who today is vulnerable to so much harm because of how technology is being misused." (Photo via Instagram)

A video that supposedly shows actress Rashmika Mandanna entering an elevator has ignited a firestorm of controversy on the internet. What initially appears genuine is, in fact, a ‘deepfake’ of the actress. The original video features a British Indian woman, Zara Patel, whose face was digitally replaced with Mandanna’s.

Responding to the video, Rajeev Chandrasekhar, the Union Minister of State for Electronics & Information Technology, said on the social media platform X that deepfakes are the latest and a “more dangerous and damaging form of misinformation” that needs to be dealt with by social media platforms. He also cited the legal obligations of social media platforms and the IT rules pertaining to digital deception.

With improvements in technology related to artificial intelligence (AI), deepfakes are becoming common on the internet. These include pictures, audio or videos that are constructed using deep learning technology, a branch of machine learning where massive amounts of data are fed into a system to create fake content that looks real. Here’s how such deepfakes can be spotted:

Unnatural Eye Movements

Deepfake videos often exhibit unnatural eye movements or gaze patterns. In genuine videos, eye movements are typically smooth and coordinated with the person’s speech and actions.

Mismatches in Colour and Lighting

Deepfake creators may have difficulty replicating accurate colour tones and lighting conditions. Pay attention to any inconsistencies in the lighting on the subject’s face and surroundings.

Compare and Contrast Audio Quality

Deepfake videos often use AI-generated audio that may have subtle imperfections. Compare the audio quality with the visual content.

Strange Body Shape or Movement

Deepfakes can sometimes result in unnatural body shapes or movements. For example, limbs may appear too long or short, or the body may move in an unusual or distorted manner. Pay attention to these inconsistencies, especially during physical activities.

Artificial Facial Movements


Deepfake software may not always accurately replicate genuine facial expressions. Look for facial movements that seem exaggerated, out of sync with speech, or unrelated to the context of the video.

Unnatural Positioning of Facial Features

Deepfakes may occasionally exhibit distortions or misalignments in facial features such as the eyes, nose and mouth, which can be a sign of manipulation.

Awkward Posture or Physique

Deepfakes may struggle to maintain a natural posture or physique. Pay attention to any awkward body positions, proportions, or movements that appear unusual or physically implausible.

Apart from the above observations, you can also take a screenshot of the video and run a reverse image search to check its source and find the original video. To do this, go to https://images.google.com/ and click on the camera icon that says ‘Search by image’. Upload the screenshot, and Google will show you whether the visuals appear in earlier videos or images.
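Reverse image search engines typically match pictures not pixel by pixel but by comparing compact perceptual "fingerprints" that survive small edits such as recompression or brightness changes. As a rough illustration of the idea (this is a simplified sketch, not Google's actual method), the snippet below computes an "average hash" of two toy pixel grids in pure Python: a slightly brightened copy of a frame still produces an identical fingerprint, while an unrelated frame does not.

```python
# Minimal sketch of perceptual ("average") hashing, the kind of compact
# fingerprint reverse image search engines use to find near-duplicate images.
# Pure Python on toy grayscale grids; a real pipeline would decode actual
# image files first (e.g. with a library such as Pillow).

def average_hash(pixels):
    """Return a bit list: 1 where a pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 "screenshots": frame_b is frame_a uniformly brightened,
# frame_c is an unrelated frame.
frame_a = [10, 200, 30, 180, 15, 190, 25, 170,
           12, 210, 35, 160, 18, 205, 28, 175]
frame_b = [p + 5 for p in frame_a]   # same scene, slightly brighter
frame_c = list(reversed(frame_a))    # different scene

print(hamming_distance(average_hash(frame_a), average_hash(frame_b)))  # 0: a match
print(hamming_distance(average_hash(frame_a), average_hash(frame_c)))  # 16: no match
```

Because the hash only records which pixels are brighter than average, uniform brightness shifts leave the fingerprint unchanged, which is exactly the robustness a matching system needs.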

Ankita Deshkar is a Deputy Copy Editor and fact-checker at The Indian Express. She is a certified trainer for the Google News Initiative (GNI) India Training Network, specialising in online verification and the fight against misinformation.

 
