
Use of facial recognition technology by police is dangerous

Anushka Jain, Likhita, Matt Mahmoudi write: The risk it poses to human rights, including the right to privacy, far outweighs any of its purported benefits in law and order

Written by Anushka Jain, Likhita, Matt Mahmoudi
Updated: November 25, 2021 7:36:50 am

It’s 2020, the Covid-19 pandemic is peaking, and you desperately need to get to the pharmacy to stock up on essentials. As you walk there, almost every street you pass has cameras installed, watching closely as they attempt to identify your face and track your movements. You cross the street, only to be intercepted by police officers who demand that you remove your face mask. You ask why, but no one responds. Then, without explanation, you’re lined up and an officer captures your face on a tablet.

This might sound like a scene from a film set in a dystopian world. In fact, this is an emerging reality for the people of Hyderabad, which stands on the brink of becoming a total surveillance city. According to police officials, more than six lakh CCTV cameras have already been deployed in the city, with the very real possibility that this number will continue to increase. These all-pervasive cameras will soon be connected in a real-time network managed by Hyderabad’s Command and Control Centre. They can be used in combination with the police’s existing facial recognition cameras — meaning that in Hyderabad today, it is virtually impossible to walk down the street without exposure to this invasive technology.

Numerous disturbing news reports have already emerged from Hyderabad about illegal cordon-and-search operations and random frisking of civilians, as well as about police stopping and photographing people on the road without any reason.

The construction of the Integrated Police Command Control Centre, in Banjara Hills in Hyderabad, at a cost of Rs 800 crore, is another worrying development. This centre will allow the police to access real-time surveillance footage from the network of cameras that monitor the city. Surveillance practices that further entrench and automate discriminatory and problematic policing practices — such as data analytics, social media analysis capabilities and facial recognition — will also be carried out at this centre. It’s a chilling attempt to control citizens’ lives through technology.

Facial recognition technology identifies the distinct features of a person’s face to create a biometric map, which an algorithm then matches to possible individuals. The system searches across databases of millions of images, scraped without knowledge or consent, and often fails.
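At its core, the matching step described above can be sketched as a nearest-neighbour search over face "embeddings" (numerical vectors derived from a biometric map). The following is a minimal, hypothetical illustration, not any police system's actual implementation; the gallery entries, vector dimensions, and similarity threshold are all invented for demonstration. Real systems use embeddings of hundreds of dimensions derived from neural networks, and, as the article notes, the search can and does return wrong or empty matches.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return (best_match_name, score), or (None, score) if no candidate
    clears the threshold -- i.e. the system 'fails' to find a match."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Toy gallery of 4-dimensional "embeddings" (purely illustrative values).
gallery = {
    "person_a": np.array([0.9, 0.1, 0.0, 0.4]),
    "person_b": np.array([0.1, 0.8, 0.5, 0.0]),
}

probe = np.array([0.88, 0.12, 0.02, 0.41])  # vector close to person_a
print(match_face(probe, gallery))
```

Note that the outcome depends entirely on the chosen threshold: set it too low and the system produces false matches against innocent people; set it too high and it fails silently. Neither failure mode is visible to the person being scanned on the street.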

The use of facial recognition technology is already under severe scrutiny around the world, with some jurisdictions, including Belgium and Luxembourg, having banned its use. The European Union is in the process of finalising one of the most comprehensive bans on facial recognition technology yet, while in the United States, multiple city- and state-level bans and moratoria have been imposed. More than 200 organisations have called for a global ban on the use of biometric surveillance technologies that enable mass and discriminatory surveillance, and even Facebook has announced that it will shut down its facial recognition programme.

Yet, many police units in India — including Hyderabad Police — today continue to acquire and deploy this dangerous and invasive technology.

In India, these technological infringements on our human rights are particularly dire. The right to privacy was recognised as a fundamental right, included under the right to life and liberty by the Supreme Court of India in 2017. However, without a law in place to regulate data collection and to act as an oversight mechanism, valid concerns about privacy and other rights violations continue to arise.

The absence of any legal framework to govern data protection, especially in the context of personal biometric data, means that we are blindly turning our public spaces into sites of technological experimentation, where human rights are sidelined for profit and control.

The proposed Personal Data Protection Bill 2019 has been stuck for years in Parliament. Despite this, police forces and intelligence agencies have accelerated their unchecked personal data collection activities.

Under the guise of protecting women and children, huge amounts of public money are being spent on these technologies with no evidence of their effectiveness, squandering precious public funds.

Government programmes such as Safe City, Smart City and the Nirbhaya Fund have been utilised to bankroll these projects — yet the human rights violations that occur as a result of their use far outweigh any purported benefit that these technologies claim to provide.

The model of surveillance policing in Hyderabad not only raises significant human rights concerns, but it could also further motivate other state police departments and intelligence agencies to adopt similar measures throughout the country. The Telangana state authorities have a duty to uphold human rights by banning the use of dangerous facial recognition technologies.

This column first appeared in the print edition on November 24, 2021 under the title ‘Panopticon city’. Jain is associate counsel at Internet Freedom Foundation; Likhita is a researcher and adviser at Amnesty International; Mahmoudi is an AI and big data researcher at Amnesty International.
