Reddit has updated its policy around involuntary pornography and sexual content involving minors. The policies, which were previously combined in a single rule, have now been broken down into two separate rules. The updated policies also crack down on pornographic images or videos in which another person’s face has been swapped in using artificial intelligence (AI) without their consent. Meanwhile, Reddit has asked users to report involuntary pornography by submitting a link to where the content is available on the platform and a brief description of the issue.
“Reddit prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked,” reads a Reddit policy page. The policy also calls out images or video of “intimate parts of a person’s body” created or posted without their permission. “Additionally, do not post images or video of another person for the specific purpose of faking explicit content or soliciting ‘lookalike’ pornography,” the page reads.
Reddit’s policy against sexual or suggestive content involving minors covers “child sexual abuse imagery, child pornography, and any other content, including fantasy content (e.g. stories, anime), that encourages or promotes pedophilia, child exploitation, or otherwise sexualizes minors.”
The move comes shortly after Twitter revealed guidelines banning posts related to deepfakes – fake porn videos, often of celebrities, created with machine learning algorithms – on its platform. A Twitter spokesperson said the company will suspend accounts it identifies as the original poster of intimate media that has been produced or distributed without the subject’s consent. Meanwhile, social app Discord, short-video hosting company Gfycat and Pornhub have cracked down on non-consensual porn, especially deepfakes, on their platforms.