Already facing flak for its censorship policies, Facebook now wants its users to define what is “objectionable”, eventually empowering them to decide how much nudity and violence they are comfortable seeing, CEO Mark Zuckerberg said in a post. These policies define what kinds of sharing are allowed on Facebook and what kinds of content may be reported to the social media giant and removed.
“The idea is to give everyone in the community options for how they would like to set the content policy for themselves,” Zuckerberg wrote on Thursday in a post about Facebook’s Community Standards policy. “Where is your line on nudity? On violence? On graphic content? On profanity? What you decide will be your personal settings. We will periodically ask you these questions to increase participation and so you don’t need to dig around to find them,” he added.
Zuckerberg also noted that for those who do not make a decision, the policies chosen by the majority of people in their region would be enforced. Even in that case, the individual would have the option of updating their personal settings at any time. “With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allowed and Facebook will also block content based on standards and local laws,” Zuckerberg noted.
To classify objectionable content, Facebook will use artificial intelligence, though the technology is not yet fully up to the task. “It’s worth noting that major advances in AI are required to understand text, photos and videos to judge whether they contain hate speech, graphic violence, sexually explicit content and more,” he said.
“At our current pace of research, we hope to begin handling some of these cases in 2017, but others will not be possible for many years,” the Facebook CEO noted.