‘Coordinated inauthentic behaviour’: Why Facebook removed pages in India

The company described the category of a third India takedown as “civic spam” and not “CIB”. With no common linkages between the individual pages, these pages were not “coordinated”.

Written by Karishma Mehrotra | New Delhi | Updated: April 4, 2019 9:41:22 am
(Illustration: C R Sasikumar)

Earlier this week, Facebook removed four networks of groups, pages and accounts from India and Pakistan. Three of these were taken down for what it called “coordinated inauthentic behaviour” (The Indian Express, April 1). What kind of behaviour led to this crackdown?

Two kinds of behaviour

Two of the India networks, one each linked to the Congress and BJP, had been on Facebook’s radar for the past two months because of “coordinated inauthentic behaviour” — or “CIB” as the company has described it since early 2018. CIB refers to an orchestrated set of platform violations operated by a single common entity or source.

The company described the category of a third India takedown as “civic spam” and not “CIB”. With no common linkages between the individual pages, these pages were not “coordinated”.

Content posted by pages linked to the Congress IT cell, according to Facebook.

Regardless of the existence of a common source, the signals and violations in both categories are similar: single user with multiple accounts (SUMA), spamming behaviour, clickbait behaviour, location obfuscation, and content or ad farms. Content or ad farms are websites and pages carrying large amounts of low-quality content, typically created to make money and to rank high on search engines.

READ | Citing behaviour, Facebook removes 700 accounts, pages linked to BJP and Congress

The company linked one CIB India network to the Congress’s Gujarat IT cell. Initially the platform’s algorithms repeatedly flagged and took down multiple accounts. The company traced these accounts to an IP hub in the party’s Gujarat IT Cell. Most of the accounts exhibited bot-like behaviour, rather than human efforts.

The other Indian CIB network was linked to the company Silver Touch Technologies, with special focus on a BJP-leaning page called India Eye. Facebook says the page, with 2.6 million followers and $70,000 in ad spending, was hiding its location and using a fake name. The company matched the admins to Silver Touch, and took down the page. While Facebook saw no formal connections between Silver Touch and the BJP in the back-end network, Silver Touch has worked for both the ruling party and the government on IT solutions.

The company reached out to the Election Commission and both political parties on the morning it took down these pages.

For the network taken down for civic spam, Facebook’s algorithms did most of the detection of violations and displayed the number of violations per page. With little human investigation, the company’s technologists looked at those numbers and decided which pages to take down.

Because the violators in this category are often small players with a small number of followers, the company has a policy not to disclose their identity. Company sources said a majority of the pages in this category were political, and BJP-supporting page creators told The Indian Express that they were severely hit by the sweep.

Some of the posts by the Pakistan-based pages, which praised the Pakistan Army and the current Prime Minister.

“In one short night, Facebook has removed over 40 of our BJP base pages, which had Rs 10 crore investment. Almost 90 per cent of our strength on Facebook. They just want to say that they have acted against spam for their investors,” a BJP social media volunteer said.

The CIB takedown in Pakistan was of a network that linked back to that country’s military media wing. Amongst other misleading tactics, an online group of Pakistanis disguised themselves as Kashmiris aggrieved by the Indian Army, thereby violating location policies, according to Facebook.

CIB and fake news

Like many of Facebook’s policy shifts over the past two years, the push to tackle coordinated inauthentic behaviour began with the backlash over the platform’s impact on the 2016 US presidential election. That election catapulted the phrase “fake news” to worldwide attention, though the waves had already begun with the Brexit campaign and Russia’s 2014 annexation of Crimea.

“Inauthentic behaviour” and “fake news” are not exactly the same thing. Facebook’s CIB refers to the tactics that actors use to spread content, such as building networks of Facebook pages and accounts using false identities. To understand how “fake news” might relate to this, it’s important to recognise the distinction between “misinformation” and “disinformation.” The former refers to false information regardless of whether or not the intent of those spreading it was to mislead the audience; the latter means that there was a deliberate effort to spread a manipulative narrative.

Because Facebook’s CIB policy focuses on the intent to deceive the audience, it may involve “disinformation”. The analysis of the content shared by the taken-down India and Pakistan pages will be included in a report by American think tank The Atlantic Council.

US election and after

After the 2016 presidential election, Facebook CEO Mark Zuckerberg said it was a “pretty crazy idea” to think that fake news on the platform influenced the results. Soon after, The Washington Post reported that a “sophisticated Russian propaganda campaign” used a network of websites and social media accounts to mislead the public against Hillary Clinton and in favour of Donald Trump.

Facebook makes much of its revenue through advertising; advertisers can pay to target their ads at specific demographic groups, and the Russian group is alleged to have used such targeted paid advertising as well.

Months later, Facebook conceded that 150 million Americans saw Russian Internet Research Agency disinformation on the platform. The US Congress began summoning company executives and commissioning studies, The New York Times detailed the internal processes that slowly made the information public, and the US Justice Department’s Special Counsel Robert Mueller began investigating Russian links to Trump.

First takedowns for CIB

A Facebook team led by head of cybersecurity policy Nathaniel Gleicher (which was also behind the takedown of the India and Pakistan networks) began its research in the run-up to the 2018 US midterm polls.

In January 2018, months before the midterms, Facebook deleted pages and fake accounts it said were potentially aiming to disrupt the elections. By August, it had begun to use the CIB lexicon as it initiated another large sweep of 652 pages, this time focused on Eastern Europe and Central Asia.

By the US midterms in November, Facebook had projected itself as going hard against disinformation campaigns. The company then carried out spurts of similar takedowns in Bangladesh, the Philippines, and elsewhere.

The US and India contexts differ in at least one significant way. Facebook coordinated heavily with the US government and law enforcement agencies because the attention remained on Russian, and hence foreign, influence. In India, however, the campaigns are not only Pakistan-linked but also domestic, tied to both the ruling party and the Opposition.
