On the face of it, there is little that connects Joseph Marie Jacquard, an 18th-century French weaver and merchant, a documentary about Anthony Bourdain, the late chef, writer and TV personality, and whistleblower Frances Haugen’s revelations about the degree to which Facebook is aware of, and causes, deep social and political harm. But it was Jacquard’s success that led to Bourdain being resurrected. Coupled with what we now know about Facebook and government encroachment on individual rights, this act of transcendent necromancy should scare us all.
Haugen’s revelations underline three basic points. First, those running social media are not ill-intentioned per se. But their moral ambivalence towards the consequences of their products, and the agnosticism built into the design of their algorithms, mean they may as well be. Take, for instance, the effect of Instagram on the mental health of adolescent girls, or the role WhatsApp and Facebook have played in promoting ethnic violence in places as diverse as Myanmar, parts of Africa and India. Haugen has provided documents showing that the corporation that runs all three apps was well aware of these consequences — and yet did little to stop them.
Second, there is no “good” way, no market-based solution that offers a plausible way out. The apps are so deeply intertwined with how we live and work (just look at the crippling effect a seven-hour shutdown of all of Facebook’s apps had on October 5) that a competitor is likely to fill in the space vacated by any one company.
Finally, it is naive to believe in any substantial form of self-regulation. Simply put, social media’s entire architecture is based on maximising screen time and the data so collected. What the algorithm does is find what will keep people hooked the most, and for the longest — the actual content doesn’t matter. Expecting social media giants to regulate the very thing that their profits are based on is like asking drug dealers to prioritise rehab clinics.
If self-regulation is out, is government regulation the answer? Unfortunately, the actions of even democratically elected governments often inspire little confidence. Take just two recent examples — the Pegasus snooping scandal and the Arsenal Consulting findings. From both, it seems clear that for many governments, including ours, the use of technology to breach individual rights is not incidental to a larger goal — as it is for social media companies — but an intrinsic part of how they function.
Forget the fact that the Government of India appears to be the only national government that has not been shaken by the Pegasus scandal. What is more significant is that governments can now deploy “zero-click” spyware that can easily bypass security mechanisms. And that such capabilities have been deployed against journalists, political friends and opponents, defence personnel, businessmen — citizens with an inalienable right to privacy and dignity. As Subhasis Banerjee wrote (‘Guardrails of privacy’, IE, July 27): “… what if a malware injection and surveillance attempt as sophisticated as Pegasus altogether bypasses the data protection architecture and regulatory oversight?… Pegasus was apparently also designed to self-destruct on detection attempts, though, according to the Amnesty report, it did not entirely succeed and left traces. While one always theoretically understood the possibilities, that such James Bond-like tools actually exist and are used by governments is certainly an eye-opener.”
Unfortunately, the Pegasus scandal is only the tip of the iceberg.
In July 2021, a New York Times documentary, Roadrunner: A Film About Anthony Bourdain, opened to near-universal critical acclaim. But director Morgan Neville’s revelation that the film contains snippets of dialogue in Bourdain’s voice, created using AI after his death, has made many uncomfortable. In essence, it showed us how close we are to raising the dead: Between voice-cloning technology, advanced robotics on the verge of creating human-like androids, and the sheer amount of our personalities that we have poured into sites and apps, it will soon be possible to create a simulacrum of a deceased loved one. But fears about how leaps in technology can fundamentally change the human condition are not merely about esoteric life-and-death questions.
Imagine spyware as sophisticated as Pegasus that can escape detection, combined with the ability to fabricate convincing voices and faces. The dangers flagged by the Arsenal Consulting revelations — that evidence was likely planted on the computers of academics, lawyers and activists in the Bhima Koregaon case — become all the more frightening. What if a doctored video is used to jail activists? Or to establish the chanting of “anti-national”, “seditious” slogans?
Perhaps there really are no solutions to the challenges of the internet age. The apple has been eaten and the sin is so systemic that there’s nothing to be done. Or, maybe, the response to the power of software lies in the first battles against it.
In 1804, Joseph Marie Jacquard invented the Jacquard Loom, which simplified and, to a degree, automated the weaving of complex fabrics. It was arguably the first software ever created, the progenitor of Facebook, Pegasus and voice-recreation AI. While those in power at the time — profit-makers and politicians — welcomed the Loom, others saw it as evil, as taking away jobs and agency. These objectors threw their wooden shoes (“sabots”, in French) at the infernal machine. From the sabots came the first saboteurs and acts of sabotage. But these were also acts of assertion against a world changing — at least for them — for the worse.
Today, it is impossible to throw a shoe at the ephemeral global network. But perhaps, because of its ability to undermine democracy, agency, mental health, privacy and individuality, we need saboteurs more than ever.
This column first appeared in the print edition on October 13, 2021 under the title ‘Why we need saboteurs’. email@example.com