Updated: October 27, 2021 7:41:14 am
THE MINISTRY of Electronics and Information Technology is preparing a report on the key findings related to India in the internal documents collected by Facebook whistleblower Frances Haugen, including alleged discrepancies in algorithmic recommendations that lead new users in the country to “misinformation and hate speech”, The Indian Express has learnt.
“If needed, we will call their executives to explain how their algorithms work and the action they have taken so far to counter misinformation and hate speech. For now, we will have to study (the revelations made by Haugen),” sources said.
The report is likely to be prepared and finalised over this week, and to contain details such as how Facebook failed to check the spread of misinformation and hate speech on its platform in India primarily because it did not have the right tools to flag or monitor content in Hindi and Bengali.
The findings of a Facebook researcher who set up a test user account in Kerala, which encountered several instances of hate speech and misinformation through the platform's algorithmic recommendations, are also likely to be included in the report, the sources said.
In her complaint to the US Securities and Exchange Commission (SEC), Haugen had said that despite being aware that “RSS users, groups, and pages promote fear-mongering, anti-Muslim narratives”, Facebook could not take action or flag this content, given its “lack of Hindi and Bengali classifiers”.
All focus on US
India has over 34 crore Facebook users. But the company’s internal documents show it spends as much as 87 per cent of its global budget, earmarked for tackling misinformation, in North America where only 10 per cent of its total user base resides.
Citing an undated internal Facebook document titled “Adversarial Harmful Networks-India Case study”, the complaint sent to US SEC by non-profit legal organisation Whistleblower Aid on behalf of Haugen noted: “There were a number of dehumanizing posts (on) Muslims… Our lack of Hindi and Bengali classifiers means much of this content is never flagged or actioned, and we have yet to put forth a nomination for designation of this group (RSS) given political sensitivities.”
Apart from Haugen’s revelations about the alleged inaction by Facebook on hate speech and misinformation being spread in India, The New York Times reported that the company’s own employees were grappling with the effects the platform had on users in India, especially in the run-up to the 2019 general elections.
Responding to queries sent then by The Indian Express, Facebook had said that based on the algorithmic recommendations made to the test user account it had created, the company had undertaken “deeper, more rigorous analysis” of its recommendation systems in India.
“This exploratory effort of one hypothetical test account inspired deeper, more rigorous analysis of our recommendation systems, and contributed to product changes to improve them. Product changes from subsequent, more rigorous research included things like the removal of borderline content and civic and political Groups from our recommendation systems,” a Facebook spokesperson had said.
The New York Times reported that the Facebook researcher’s report “was one of dozens of studies and memos written by Facebook employees grappling with the effects of the platform on India”.
“They provide stark evidence of one of the most serious criticisms levied by human rights activists and politicians against the world-spanning company: It moves into a country without fully understanding its potential effects on local culture and politics, and fails to deploy the resources to act on issues once they occur,” the report said.