Updated: October 17, 2021 7:36:04 am
The Dussehra break gave me a chance to head to the hills. Seven hours away from Delhi, in Uttarakhand, you are surrounded by lush foliage, hills, hiking trails and even rumours of a lurking leopard. It was in this setting that I started Nobel Prize-winning author Kazuo Ishiguro’s new book, Klara and the Sun, about a world in which children have AFs, or artificial friends: AI-powered machines programmed to respond to human beings. The more an AF observed, the more information it stored about its owner and the better it knew how to respond.
Klara, the AF, could soon mimic the behaviour, speech, walk and all manner of responses of the child she was bought for. The trouble began when the parent started finding Klara more perfect than her child. Ishiguro asks us a deep question: Is love substitutable? For our purposes, the book describes the process of “deep learning”, in which artificial intelligence programmes absorb information and begin to demonstrate the kind of reasoning that distinguishes us as humans.
The scenario Ishiguro describes is not science fiction or a far-fetched rendition of the future. As India gingerly moves towards completing the second year of pandemic living, one trend is certain — we will rely more and more on artificial intelligence to get through our day. In 1956, John McCarthy wrote, “Artificial intelligence is allowing a machine to behave in such a way that it would be called intelligent if a human being behaved in such a way.” Siri, the voice assistant Apple users depend on, is an example of artificial intelligence. Yet for Siri to be intelligent enough even to recognise a cat, its creators said it had to be fed almost 1,00,000 pictures of cats before it could spot one when you asked.
We are now in an era defined by a Covid-inspired retreat into the digital or virtual world. In the world’s largest democracy, of nearly 1.4 billion people, more and more citizens have vital information about themselves stored on state or private data platforms. The government collects information for Aadhaar, for vaccinations on CoWin, from our tax returns, from our driving licences and through a host of other instruments that enable us to navigate the public world of citizenship.
Private platforms like Facebook, Twitter and a host of others collect what makes us unique — our opinions, our likes and dislikes, our ideologies. Based on this data, algorithms push news and information that align with our beliefs into our Twitter or Facebook feeds. Meanwhile, we now consume popular culture in the privacy of our laptops through OTT platforms like Amazon Prime, Netflix or Hotstar, instead of going to the theatre. Algorithms on each of these platforms recommend movies or serials based on our prior viewing.
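The recommendation logic described above can be sketched in miniature. This is an illustrative toy, not any platform’s actual system; the titles and viewing histories are invented. It scores unseen titles by how often they co-occur with the viewer’s own history in other users’ histories — the simplest form of collaborative filtering:

```python
# Toy collaborative-filtering sketch: recommend unseen titles that
# co-occur most often with the viewer's history in other histories.
# (Illustrative only; data and titles are hypothetical.)
from collections import Counter

def recommend(history, all_histories, top_n=2):
    """Rank titles the viewer has not seen by co-occurrence strength."""
    watched = set(history)
    scores = Counter()
    for other in all_histories:
        other_set = set(other)
        overlap = len(watched & other_set)
        if overlap:  # this user shares tastes with our viewer
            for title in other_set - watched:
                scores[title] += overlap
    return [title for title, _ in scores.most_common(top_n)]

histories = [
    ["Sacred Games", "Delhi Crime", "Paatal Lok"],
    ["Sacred Games", "Paatal Lok", "Mirzapur"],
    ["Delhi Crime", "Made in Heaven"],
]
# "Paatal Lok" co-occurs with "Sacred Games" twice, so it ranks first.
print(recommend(["Sacred Games"], histories))
```

Real systems are far more elaborate, but the principle is the same: what you have already watched determines what you are shown next.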
But the use of algorithms is not restricted to movie recommendations. In many parts of the United States, they are used to predict the likelihood of recidivism among prisoners and, therefore, to determine sentences based on machine predictions. With this come biases — after all, the data used to build these algorithms reflects the assumptions and choices of the people who compile and feed it. For instance, is the zip code a person lives in a reliable indicator of the chances of committing fresh crimes? If you live in a white neighbourhood, are you less likely to commit a new crime? How does one make the algorithm fair and transparent?
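The zip-code problem can be made concrete with a toy model. This sketch uses entirely hypothetical data and resembles no real risk-assessment tool: if one neighbourhood was historically policed more heavily, its recorded re-arrest rate is higher, and a model trained on those records simply hands that history back as a “prediction”:

```python
# Toy illustration of how a risk score trained on historical records
# encodes neighbourhood bias. All data here is invented.
from collections import defaultdict

def train_risk_by_zip(records):
    """Estimate re-arrest probability per zip code from past records."""
    counts = defaultdict(lambda: [0, 0])  # zip -> [re-arrests, total]
    for zip_code, rearrested in records:
        counts[zip_code][0] += int(rearrested)
        counts[zip_code][1] += 1
    return {z: rearrests / total for z, (rearrests, total) in counts.items()}

# Hypothetical history: zip "110001" was policed far more intensively,
# so more of its residents were re-arrested and recorded as reoffenders.
history = ([("110001", True)] * 6 + [("110001", False)] * 4
           + [("110092", True)] * 2 + [("110092", False)] * 8)

model = train_risk_by_zip(history)
print(model)  # {'110001': 0.6, '110092': 0.2}
```

A prisoner from zip 110001 is scored as three times riskier purely because of where the data says they live — the machine has learned the biases of past policing, not anything about the individual.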
In India, for all this data collection that is constantly underway, in a seemingly irreversible manner, citizens have no rights over their data and no protection against its extraction or, more generally, its misuse. There is no data protection law in place, even though a Bill is being discussed by the parliamentary committee on information technology. Our only protection at present is the Supreme Court’s judgment in Puttaswamy, where it ruled that citizens have a right to informational privacy. Yet, in the absence of legislation, this proves difficult to enforce.
While the state has unilateral rights to collect and use our data, it has also given itself the ability to regulate private parties. For instance, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 are used to mandate that WhatsApp, which uses end-to-end encryption, must enable the identification of the first originator of the information. At present, it is as if the state can have deep learning on its citizens. In turn, we the citizens have no remedies for the abuse of that learning. You can switch off Siri, but not the state.
Our government is thinking about the potential and inevitability of the greater use of AI. The 2018 Niti Aayog National Strategy for Artificial Intelligence points to the need for greater use of AI in sectors like education, healthcare and agriculture. Niti Aayog notes that impediments to the greater use of AI include the lack of access to data and concerns over privacy and security. While noting these concerns, the paper also makes clear that the aims of state policy include creating a data marketplace — a “deployable model” in which it seeks to bring “buyers and sellers of data together”.
Amid these plans for greater deployment of AI and harvesting of our data, the lack of any rights paradigm provided by law is deeply unsettling. It violates a constitutional premise that citizens must have their speech, expression, intellectual property and liberty rights protected. In the absence of these protections, the state will continue its deep learning about us and harvest what makes us human and defines our personalities — our personal data.
This column first appeared in the print edition on October 16, 2021 under the title ‘State, Siri and You’. The writer is a Senior Advocate at the Supreme Court of India. ‘Opening Argument’ is a fortnightly column.