The Union Cabinet recently approved the draft Data Protection Bill, which is envisaged as the bedrock for the digitalisation and data ambitions of both the state and the private sector. The Bill will now go to Parliament for debate and approval. The Bill is crucial because, irrespective of our levels of digital literacy or comfort with digital technologies, digitalisation and data will inevitably and increasingly impact vital aspects of our public and private lives. But does the draft Bill adequately address the persisting public concerns that led to the unanimous privacy judgment by a nine-judge bench of the Supreme Court almost six years ago? I think not.
The central design objective of the Bill appears to be to facilitate data collection and processing by the government and private entities, rather than to address the data protection concerns that led the SC to recognise privacy as a fundamental right of citizens. The SC identified informational self-determination and control as the crucial aspects of protecting the privacy and liberty of individuals, and laid down the three-fold tests of legality, legitimacy and proportionality as the standards of determination. The requirement of legality suggests that enabling laws are needed as pre-conditions, at least for the government's large public service digital applications, including those involving digital surveillance. But, surprisingly, the sense of the Bill so far seems to be the opposite. Section 5 of the last available draft suggests that the proposed Act will allow any purpose that is not expressly forbidden by law.
Legitimacy demands that the state should be obligated to establish that there is a legitimate interest behind a proposed digitalisation, and proportionality demands that the digital application should be the least intrusive for the purpose and that there should be a balancing of the extent to which Fundamental Rights are likely to be impinged. Surprisingly, there are as yet no standards for either of these tests.
Legitimacy, which should require a rigorous and not a merely speculative theory of public good, is not addressed at all, and the required standards for proportionality are also left vague and unclear. The directives of “reasonable efforts” and “appropriate technical and organisational measures” are inadequate for determining whether an application is the least intrusive for the purpose, or whether it balances the risks correctly. In particular, balancing requires specifying clear standards for both risk assessment and legitimacy. These standards are unlikely to be worked out without well-thought-out guidelines and grammar, and they cannot simply be left to subordinate regulations.
An effective data protection bill also needs to understand the various nuances of privacy risks from digital applications. As legal scholars like Daniel Solove and others have pointed out — also extensively cited and elucidated in the SC judgment — apart from the risks of direct harms arising out of illegal surveillance, profiling and possible uncovering of one’s private world to the public, the other crucial aspects of privacy are the indirect harms that arise out of invasions that link siloed data items to create digital hallucinations of personae and use them inappropriately. The indirect harms are hard to detect, often because their effects are more subtle and long-term.
Hence, the measures of post-violation complaints and penalties — of the type envisaged in the last draft of the Bill — are not adequate for protection and mitigation. Protection from indirect harms needs to be ex-ante rather than ex-post, and data fiduciaries and data controllers need to have exacting standards for ex-ante privacy protection and purpose limitation.
The other problematic aspect of the draft Bill is its over-dependence on consent. Apart from unreasonably putting the onus on unsuspecting individuals to correctly recognise all the privacy risks entailed in complicated digital applications, consent often presents a false choice. Denying consent in pervasive applications may unreasonably limit options, cause hardship or put barriers to freedom of expression. Hence, effective data protection requires an accountability-based rather than a consent-based framework, one which puts the onus on data controllers and fiduciaries rather than on individuals, irrespective of the level of consent. This is not to say that consent is not required, but that one cannot hide behind consent for privacy protection. Also, the current section on “deemed consent” seems to grant dangerous powers to the state and even to employers. The clauses of deemed consent “in public interest” or “for provision of any service or benefit to the Data Principal… by the State or any instrumentality of the State” appear to be unacceptably empowering.
The draft Bill is also completely silent about the standards of anonymisation, encryption and access control. These are not merely technical and operational issues, but crucial considerations for digitalisation and data, without which any data protection discourse is woefully incomplete. Even if the details are relegated to subordinate regulations, the objectives and standards need to be specified in an effective and modern data protection bill.
Moreover, a data protection bill that fails to address the concerns of fairness, bias and misinformation that arise out of the automated processing of data, especially by AI applications, is probably outdated even before it is passed. The concerns are many. An effective data protection bill must take these into account.
In summary, the current draft Bill falls short of expectations in many respects. Most significantly, it bears testimony to a mindset among technocrats and the executive of seeking to bypass the objections and concerns — including those articulated in the SC judgment — in their zeal to enable digitalisation, rather than trying to understand and address them in earnest. This should hopefully change.
The writer is Professor, Computer Science and Engineering, IIT Delhi