
Meta launches platform to tackle ‘revenge porn’ more proactively in India

The platform will let women flag intimate images and videos that could be uploaded to Facebook or Instagram without their consent.

Written by Shruti Dhapola | New Delhi |
December 11, 2021 9:43:33 am

‘Revenge porn’, or the sharing of non-consensual intimate images of women, is a problem that continues to grow. Meta is now hoping to tackle it more proactively with a new platform in India called StopNCII.org. The platform, run in partnership with the UK-based ‘Revenge Porn Helpline’, will let women flag intimate images and videos that could be uploaded to Facebook or Instagram without their consent.

Highlighting how there have been unfortunate instances of people “encountering this form of abuse” taking “very drastic steps,” Karuna Nain, Director of Global Safety Policy at Meta, told indianexpress.com: “… it just takes on a whole new life because you are worried about interacting in social situations, because you are worried whether the other person has seen my image out there.”

StopNCII.org acts as a bank of sorts where victims can share ‘hashes’ of photos and videos that are at risk of being shared, or have already been shared, without their consent. A hash is a unique digital fingerprint generated from each photo or video.

The hash is then shared with Facebook and Instagram, and if someone tries to upload a video or image whose hash matches one in the bank, the upload gets flagged as possibly violating the company’s content policy.

Meta says the images or videos never leave the device; only the hash is uploaded. The way Meta sees it, the new platform can act as a heads-up, helping it deal with intimate image abuse better.
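
To make the flow concrete, here is a minimal sketch in Python of how such a hash bank could behave. It is illustrative only: the names fingerprint and HashBank are invented, and a SHA-256 digest is used for simplicity even though, as discussed further below, the real system matches on perceptual hashes that tolerate small changes.

    import hashlib

    def fingerprint(path: str) -> str:
        # Compute a digest of the media file locally. The raw image
        # never leaves the device; only this short hex string is
        # submitted to the hash bank. (SHA-256 is illustrative; the
        # production system uses perceptual hashing, see below.)
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    class HashBank:
        # Hypothetical stand-in for the shared hash bank.
        def __init__(self) -> None:
            self._flagged: set[str] = set()

        def submit(self, digest: str) -> None:
            # The victim submits only the digest, never the media itself.
            self._flagged.add(digest)

        def matches(self, path: str) -> bool:
            # Platforms hash each incoming upload and compare. A match
            # flags the upload for review; it is not removed automatically.
            return fingerprint(path) in self._flagged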

It should be kept in mind that StopNCII.org’s website clearly states that images in question need to be those in an intimate setting. This could be images and videos where the victim is naked, showing their genitals, engaging in sexual activity or poses, or wearing underwear in compromising positions.

It is also limited to women over the age of 18, meaning that victims of child sexual abuse imagery cannot approach or rely on this platform. According to Nain, for child sexual abuse images, Meta can only work with select NGOs that are authorised and have legal cover to do so. This is why StopNCII is limited to women over 18.

But would the hash still match if someone altered the intimate image before uploading it? Unfortunately, that is where the challenge lies. The technology Meta is using, one that is widely used across the industry, works on exact or near matches.

“So if there’s somebody who does some severe alteration of that photo or video, then it would not be an exact match for the hash that we have received. And so the person would need to keep a lookout and would probably want to use the system again to upload that hash of that altered piece of content,” Nain admitted.
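
The near-match behaviour Nain describes is typically achieved with perceptual hashing, where visually similar images produce similar hashes and a match is declared within a small Hamming distance. A rough sketch using the open-source imagehash library follows; this is a stand-in, since Meta does not disclose its exact matcher, and the threshold below is an assumption.

    # pip install imagehash pillow
    import imagehash
    from PIL import Image

    MAX_DISTANCE = 8  # assumed threshold; real systems tune this value

    def near_match(upload_path: str, known: imagehash.ImageHash) -> bool:
        # Perceptual hashes shift only a few bits under resizing or
        # re-compression, so a small Hamming distance still counts as
        # a match. Heavy edits push the hash past the threshold, which
        # is the limitation Nain describes.
        upload_hash = imagehash.phash(Image.open(upload_path))
        return (upload_hash - known) <= MAX_DISTANCE

A resized or re-compressed copy would usually land within the threshold, while a cropped or heavily edited version would drift outside it, which is why the victim may need to hash the altered file separately.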

It should also be noted that uploading the hash does not by itself guarantee that the content will never end up on Facebook or Instagram, or that it will be stopped automatically.

According to Nain, when uploaded content matches a hash, Facebook and Instagram’s review teams will still go through it to see if it violates their policies. Nor is Meta promising a fixed time frame for these cases to be resolved.

While the review teams do prioritise this content given the high severity of the harm involved, they cannot guarantee that the issue will be resolved within a set time. Meta sees three possible scenarios for what happens when dealing with such content.

In the first, the content was already shared and reported on the platform; once the hash is received, the process is automated going forward, so if someone tries to upload the content again, it gets flagged and blocked faster.

In the second and slightly more problematic instance, the content was uploaded to Facebook or Instagram and did not get flagged by the automated detection systems.

“Because this matching content has either never been reported or proactively detected by us, we will need to send it to our review teams to check what’s going on,” Nain explained, adding that only once the review team determines that it is violative will it get removed. So a hash getting generated by itself is no guarantee of removal.

However, once a piece of content is flagged as violative, the process is automated going forward.

And then there is a third scenario, where the hashed content has not been shared on the platform at all, in what Facebook describes as a wait-and-watch situation. “Only when someone tries to upload that content would it be detected and will we be able to take that matching content and send it to our review teams to check what’s going on,” she stressed.
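
Taken together, the three scenarios amount to a simple routing decision at upload time. A minimal sketch, with invented names (route_upload, hash_bank, confirmed_violating) purely for illustration:

    def route_upload(upload_hash: str, hash_bank: set, confirmed_violating: set) -> str:
        # Hypothetical routing logic for an incoming upload's hash.
        if upload_hash not in hash_bank:
            return "proceed"        # no match: normal upload
        if upload_hash in confirmed_violating:
            return "auto_block"     # scenario 1: already reviewed as violating
        # Scenario 2: first sighting of flagged content; human reviewers
        # decide, and only a violating verdict automates future blocks.
        return "send_to_review"

The third scenario is simply this function never firing: the hash waits in the bank until someone attempts an upload.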

Right now, StopNCII.org is limited to Facebook and Instagram. Meta is hoping that other tech players will also come on board and join the platform, in order to make things easier for victims, because right now the onus is on them to make sure the image does not end up on multiple platforms.
