
To solve its problems, Facebook needs to accept its role in creating them

The problem with Facebook’s recent call for collective social responsibility.

Facebook’s reliance on its communities, who are also its products, to do the work that it is primarily and economically responsible for, is shameful. (Photo: Getty/Thinkstock Images)

While we were wheeling and dealing in the abyss of misinformation that is our social media existence, Facebook, one of the biggest platforms for circulating dubious information, made a strange announcement. The corporation that has, over the years, rewarded trolling, deployed mood-manipulation algorithms for increased engagement, encouraged offensive content for more eyeballs, and sold our data to those who would like to influence our political choices, came out into the open and cried about how lonely it feels.

Facebook CEO Mark Zuckerberg wrote a heartfelt, if not emoji-laden, opinion editorial on Saturday, sounding like the poor boy who tries and tries but cannot solve the question set for homework. In it, Zuckerberg talked about the dangers threatening the very fabric of our individual and collective lives: harmful content, the manipulation of democratic elections, privacy, and data portability. For Zuckerberg, whose company is valued on the stock market at more than the GDP of most countries, the task is too much for one tiny little corporation, dedicated to sharing pictures of dancing kittens, to bear on its tiny little shoulders.

With a virtuous bravado that reeks of self-victimhood, Zuckerberg argues that Facebook, like the little engine that could, is doing its best to control the unfettered excesses of these digital phenomena, but that it needs a little help. It would be nice, Zuckerberg suggests, if governments and other consortia took responsibility for these problems, so that Facebook could go back to doing what it specialises in: making people feel bad about themselves. It is laughable that a company like Facebook, which has held governments hostage through its economic might and bullied societies into opting out of internet principles like net neutrality, suddenly decides that when it comes to solving the real problems, it would much rather people did it themselves.

The irony is that companies like Facebook are, in fact, responsible for the widespread problems they now bravely offer to solve, having allowed fake-news websites and violent groups to make the platform their communication channel with ease.


After the last US elections, and more recently with Brexit, we saw how Facebook, despite all the warning signs, refused to keep a check on automated trolls creating persuasive campaigns of hatred. Facebook’s subsidiary, WhatsApp, has been linked to lynch mobs and vigilante groups in India, and apart from some cosmetic patchwork to the UI and restrictions on message circulation, no plausible solution has been offered.

It is characteristic of Facebook’s approach to governance and regulation that it generally catches up with its own problems long after they have become almost insurmountable. It has never shown the moral courage or ethical strength to thoroughly rework its products and services even when they have been directly linked to violence. It was only in the face of public outcry that, last year, Facebook targeted terror and violence groups and shut down their pages; before that, it was flexing its muscles only on trans performers who were not using their “real” names. It was no coincidence that the Christchurch terrorist used Facebook to livestream his heinous attack.

The platform has historically depended on its users to do most of its dirty work. As far back as 2008, when some of the first privacy concerns emerged on Facebook, it asked its users to verify and vouch for each other, and created cute warning adverts. Facebook used the privacy scare to tell us that we are surrounded by predatory data bots that masquerade as friends while selling out our data to the first bidder.

Similarly, when harmful content started seeping into Facebook from the dark web, the company did not invest in human solutions to these questions. Instead, it set up some generic algorithms and put the responsibility of flagging harmful content onto its users. If you see something, say something; but the chances are it won’t change much within the system.

Facebook’s reliance on its communities, who are also its products, to do the work that it is primarily and economically responsible for, is shameful. Many other community-driven services outside the Facebook Family have done much more to think with their communities about moderating behaviour and content. While it is completely understandable that Facebook wants to build a larger ecosystem within which these problems can be addressed and resolved, it first needs to acknowledge the role it plays in creating them, and to show how it is internally correcting its mechanisms to build more sustainable solutions.

Nishant Shah is a professor of new media and co-founder, The Centre for Internet & Society, Bengaluru.

This article appeared in print with the headline ‘Digital Native: Alone and Lonesome’