Written by Michael Schwirtz and Sheera Frenkel
Campaigning for Ukraine’s presidential election had just begun to heat up when the authorities announced they had thwarted a Russian plot to use Facebook to undermine the vote.
Unlike the 2016 interference in the United States, which centered on fake Facebook pages created by Russians in faraway St. Petersburg, the operation in Ukraine this year had a clever twist. It tried to circumvent Facebook’s new safeguards by paying Ukrainian citizens to give a Russian agent access to their personal pages.
In a video confession published by the SBU, Ukraine’s domestic intelligence service, a man it identified as the Russian agent said that he resided in Kiev, Ukraine’s capital, and that his Russian handlers had ordered him “to find people in Ukraine on Facebook who wanted to sell their accounts or temporarily rent them out.”
“As I learned,” said the man, who was not identified by name, “their goal was to use those accounts to publish political ads or to plant fake articles.”
The operation suggested that Russia’s tactics had evolved somewhat over the last three years. The change, it seems, is partly a response to security measures adopted by Facebook after the 2016 campaign in the United States when, officials say, the social media company was used to widely disseminate disinformation and inflame partisan tensions.
The question ahead of the election in Ukraine, scheduled for Sunday, is whether Facebook has evolved as well.
The vote presents Facebook with an opportunity to take what it has learned and confront Russia over what the Kremlin considers its home turf. Ukraine has long been a testing ground for all manner of so-called Russian active measures, and was among the first hit with the kind of electoral manipulation later deployed against the United States, France and other countries.
Facebook officials insist the company is ready. It fired an opening salvo in January when it announced the takedown of a coordinated effort involving nearly 150 fake accounts, which appeared to mimic a disinformation campaign by Russia’s Internet Research Agency during the 2018 midterm election campaign in the United States. On Tuesday, the company announced another takedown involving nearly 2,000 Russia-linked pages, groups and accounts, some involved in posting disinformation about Ukraine.
But officials and candidates in Ukraine complain that despite some improvements, Facebook’s response to what should have been an obvious threat has been sluggish.
A new tool meant to increase transparency around political advertising on Facebook came online only on March 18, less than two weeks before the election. Some candidates have said they found it difficult to protect their accounts with the company’s most basic security feature: the little blue check mark Facebook uses to help users distinguish genuine pages from impostors.
“What they have done so far amounts to a dereliction of duty,” said Nina Jankowicz, a global fellow at the Washington-based Wilson Center’s Kennan Institute, who focuses on Russian disinformation.
“They’ve known about the election in Ukraine this spring for years,” she said of Facebook officials. “There is no reason so many things should be happening last minute.”
Although the personal account of Volodymyr Zelensky, the presidential front-runner, was verified quickly, the Facebook and Instagram pages for his campaign were not, Zelensky’s team said, despite its requests.
Mikhail Fedorov, who handles digital strategy for Zelensky’s campaign, said its Facebook pages had been drowned out by fakes almost indistinguishable from the real site.
Facebook representatives, he said, had been slow to respond when the campaign tried to report a problem. Facebook said it had removed pages impersonating Zelensky’s that were reported to the company.
Out of frustration, Fedorov said, the campaign created its own anti-sabotage tool, which he called “a mobile online group,” that alerts Zelensky’s followers to the presence of a fake account and urges them to overwhelm it with supportive comments.
Few doubt that Ukraine is in Russia’s cross hairs. Ahead of the election, activity has surged from Russian-linked bots and a proliferation of fake accounts impersonating the candidates, “aimed at the provoking of hostilities between Ukrainians in social networks,” Facebook in particular, said Serhii Demediuk, the chief of Ukraine’s cyberpolice. He said his office also had observed an uptick in requests on dark web forums for unauthorized remote access to Ukraine’s voter registry.
“The analysis of these incidents indicates that a significant number of those publications are originating from the territory of the Russian Federation,” he said.
No security measure is 100 percent impenetrable, particularly against Russia, which has devoted enormous resources to its global disinformation efforts, said Nathaniel Gleicher, the head of cybersecurity policy at Facebook.
“The threat actors are going to continue to innovate and try to find new ways to route around security that we put in place,” he said in an interview. “But each security protocol we put in place slows down the actors, forces them to work harder and gives your team more chances to catch them.”
Far from reacting slowly to the threats facing Ukraine, he said, Facebook has been active in confronting disinformation campaigns emanating from Russia since at least April last year. That is when the company announced it had closed 135 Facebook and Instagram accounts controlled by Russia’s Internet Research Agency, which has been linked to the 2016 election tampering in the United States.
The effort to pay Ukrainian citizens to host political advertisements and fake articles on their personal pages appeared tailor-made to circumvent new security measures designed to prevent foreigners from purchasing political advertisements in Ukraine and elsewhere.
In his video confession, the man identified as the Russian agent, whose face was blurred to conceal his identity, said that the goal was to disseminate political advertisements in support of pro-Russian candidates as well as “publish fake news or kompromat” about candidates Moscow opposed.
“The plan was for the culprit to manipulate the consciousness of the Ukrainian voters in the interest of the Kremlin,” the SBU said in a statement about the case.
An SBU spokeswoman declined to provide further details. While Ukrainian officials are often quick to demonize Russia, the case aligns with what Facebook officials say will be the future of disinformation.
Increasingly, Facebook officials say, disinformation will be spread not by foreign actors, but by citizens looking to sway the opinions of fellow citizens. Political parties will hire professional companies to spread false news about their opponents, using fake accounts and bots. And foreigners who want to meddle with a country’s elections will look to hire people locally, through bribery or trickery, to do their bidding.
Facebook had already faced some of those problems during the midterm elections in the United States, when Americans spread disinformation to fellow Americans and a group of Democratic technical experts decided to experiment with swaying the Alabama Senate race.
Some Ukrainian officials warn that the risk of disinformation campaigns from within the country is as high as, or higher than, the risk from those directed by Russia.
“I am more concerned with the negative campaigning from another candidate than from Russia,” said Aleksey Ryabchin, a member of Parliament from the party of Yulia Tymoshenko, another presidential candidate.
By itself, Facebook can do little to completely insulate Ukraine from election interference. The country is under siege by Russia both on the internet and in the physical world. Russia essentially controls two separatist regions in eastern Ukraine, where a war that began in 2014 has claimed about 13,000 lives. In recent years, cyberattacks believed to have originated in Russia have shut down power plants and transportation infrastructure.
During the last presidential election, in 2014, hackers breached the servers of Ukraine’s election commission and programmed its website to publish a fake result when the polls closed. Ukrainian officials thwarted the scheme at the last minute, although Russian government television reported the fake result anyway.
The Kremlin has denied involvement in efforts to manipulate elections in Ukraine or anywhere else, though officials have said they have no ability to prevent patriotic-minded Russians from attempting to do so.
“These are the rules of the game. If it’s an open platform then of course there will be people who will try to use this platform for their purposes,” said Dmitry Polyanskiy, Russia’s first deputy permanent representative to the United Nations.
“We can’t be responsible for any publication on Facebook or on Twitter,” he said, “and we do not believe in these conspiracy theories that somewhere people are sitting and clicking mice putting these publications on the web.”