This is an archive article published on June 19, 2023

Opinion CoWIN leaks: Where’s government’s due diligence?

Declaring that personal data is secure isn't enough. Privacy guarantees need more

Trusting the integrity of software or hardware is usually avoided because such correctness is often difficult to establish. (Representational)
June 19, 2023 09:05 AM IST First published on: Jun 19, 2023 at 07:07 AM IST

The recent media reports about the CoWIN data leak are no doubt disconcerting, but what is even more so is the government’s response to them. It is hardly reassuring to be informed through a ministerial declaration that though some data may have leaked due to earlier breaches or poorly modelled use cases, there really is nothing to worry about because the back-end database is probably still secure. The question is — secure from what?

Data-related privacy and security concerns are usually countered with two kinds of reactions. The first is fatalistic: it dismisses the worries, saying that our phone or Aadhaar numbers may already be out there with hundreds of entities anyway. Such reactions are frivolous; the various data protection discourses and the Supreme Court judgment on privacy debunk them adequately. The second comes from the keepers of these systems, who often claim security by forceful proclamation. They argue that the security and privacy safeguards deployed are foolproof because they use “state-of-the-art best practices”. These claims rarely articulate the exact security and privacy problems that these best practices address, and end up affirming — ad nauseam — that “the backend databases are safe and we have taken care of privacy”. That is neither here nor there.


Security and privacy discourse requires a standard grammar to ensure that stakeholders do not talk past each other. It is customary in contemporary computer science to start security specifications with a well-articulated threat model, which captures security risks and the capabilities of a hypothetical adversary. For large public service applications involving critical personal data, it is standard to assume that the adversary can corrupt all insiders including system administrators, all custody chains, and all hardware and software. Then, the system designers are required to either argue for security — in some well-established and standard (usually cryptographic) framework — against such a threat model, or specify the unavoidable trust assumptions on various authorities.
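To make the idea concrete, a threat model is just an explicit, checkable statement of who the adversary is and what they can do. The sketch below is purely illustrative (all names and categories are invented for this example, not drawn from any actual CoWIN documentation); it encodes the strong model the paragraph describes, in which every insider, custody chain, and component may be corrupted and only explicitly named assumptions remain trusted.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ThreatModel:
    """A minimal, explicit statement of adversary capabilities and residual trust."""
    name: str
    adversary_can_corrupt: frozenset  # everything assumed compromisable
    trust_assumptions: frozenset      # what must still be trusted, stated up front

# The strong model described above for critical public-data systems
# (hypothetical encoding, for illustration only).
STRONG_INSIDER_MODEL = ThreatModel(
    name="malicious-insider",
    adversary_can_corrupt=frozenset({
        "system administrators", "all other insiders",
        "custody chains", "hardware", "software",
    }),
    trust_assumptions=frozenset({"standard cryptographic hardness assumptions"}),
)

def claim_is_meaningful(model: ThreatModel) -> bool:
    # A security claim is vacuous unless it names what the adversary can do.
    return len(model.adversary_can_corrupt) > 0
```

The point of writing the model down is precisely that a claim like “the backend is secure” can then be evaluated against it, rather than taken on faith.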

Trusting the integrity of software or hardware is usually avoided because such correctness is often difficult to establish. Solutions are typically acceptable only if the trust can be distributed among multiple authorities, with a demonstration that the system remains safe unless a threshold number of authorities collude. These standards are undoubtedly hard to achieve, which is one of the main reasons why digitalisation attempts are conservative in most countries. Unfortunately, probably none of the large public service digitalisation undertakings in India measures up to these standards. They do not even have publicly articulated threat models, without which security claims are at best imprecise and at worst vacuous.
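The notion of distributing trust so that no threshold of colluding authorities can compromise a secret is a standard cryptographic construction, classically realised by Shamir’s secret sharing. The sketch below is a minimal textbook illustration, not tied to any actual system design: a secret is split into n shares such that any k of them reconstruct it, while fewer than k reveal nothing about it.

```python
import secrets

# A Mersenne prime; all polynomial arithmetic is done in this prime field.
PRIME = 2**127 - 1

def split_secret(secret: int, n: int, k: int):
    """Split `secret` into n shares; any k reconstruct it, k-1 reveal nothing."""
    # Random degree-(k-1) polynomial with the secret as its constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat's little theorem).
        total = (total + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```

With shares held by, say, five independent authorities and a threshold of three, the database key stays safe even if any two authorities, including their administrators, are fully corrupted — exactly the kind of demonstrable guarantee the paragraph asks for.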

Privacy requires even more due diligence. Leakage of sensitive personal information like date of birth, phone, Aadhaar or passport numbers makes one vulnerable not only to direct harms like fraud, identity theft or illegal surveillance, but also to hard-to-detect indirect harms resulting from unknown entities using personal data in unknown ways. For example, such data may be used illegally to profile voters and influence them, or to profile people for predatory advertising. This is particularly problematic because individuals are often less careful about these indirect harms, though the collective harm to society is considerable. Indeed, the SC’s privacy judgment identified loss of informational self-determination as a perilous privacy harm, one the government is duty-bound to protect us from.
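The mechanism behind such indirect harms is linkage: two leaks that each look harmless can be joined on a shared identifier to build a profile neither reveals alone. The sketch below is a hypothetical illustration with entirely invented records and field names; it is not based on any real leaked data.

```python
# Invented records standing in for two unrelated leaks (illustrative only).
leaked_health = [
    {"phone": "9000000001", "dob": "1980-01-15", "vaccine_centre": "Delhi-07"},
    {"phone": "9000000002", "dob": "1975-06-30", "vaccine_centre": "Pune-03"},
]
leaked_marketing = [
    {"phone": "9000000001", "locality": "Hauz Khas", "income_band": "high"},
]

def link(records_a, records_b, key):
    """Join two record sets on a shared quasi-identifier such as a phone number."""
    index = {r[key]: r for r in records_b}
    return [{**a, **index[a[key]]} for a in records_a if a[key] in index]

# Each dataset alone reveals little; linked, they yield a targetable profile.
profiles = link(leaked_health, leaked_marketing, "phone")
```

This is why “the backend is still secure” is beside the point: once identifiers are out, anyone holding a second dataset can perform this join, and the individual never learns it happened.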


Preventing such function creep requires exacting standards of purpose limitation, for which security, particularly against insider attacks, is a necessary condition. It is, however, by no means sufficient. Also required are legal standards of purpose limitation and access-control regulation to prevent the building of parallel copies of sensitive databases. Any digitalisation entails some privacy risk at the interface of the digital and the human, and this risk needs to be precisely modelled. The interface is a crucial component of the digitalisation use cases, which define how various users, including administrators and operators, interact with the digital system. Modelling it requires capturing not only the communication protocols and the edge devices for data recording and dissemination, like point-of-service systems or phones, but also the privacy impact of the information that is revealed. It is incumbent upon the system designers to precisely model this minimum unavoidable risk as an ideal functionality, and to demonstrate that the implementation introduces no additional risks.

Failure to do this due diligence of privacy risk assessment of use cases invariably results in function creep and violations of purpose limitation, as is evident from the imprecise definition of the “Aadhaar card” in the Aadhaar Act and its indiscriminate use in all sorts of services, some backed by law and some not. The resulting privacy breaches are hardly surprising.

Other harms often arise from inadequate modelling in the digitalisation of welfare delivery, such as the sale of PDS rations or MNREGA payments. Failure to precisely understand the constraints on equipment and personnel, the digital literacy and empowerment of users, and the relationship of digitalisation to the primary function of the welfare service may result in exclusion and denial of services, hardship, and increased transaction costs for beneficiaries, hurting the very objectives of those services.

The digitalisation journey in India has been breathtaking in its scale and scope, and, given our challenges, we perhaps need the digitalisation of public services more than most others. However, digitalisation can certainly do with some more computer science rigour.

The writer is professor, Computer Science and Engineering, IIT Delhi. Views are personal