Privacy concerns with emerging technology and their redressal through PDP Bill

Permitting sweeping exemptions for government agencies would render ineffective the forward-looking provisions being brought in through the PDP Bill 2019.

By Dhruv Somayajula and Ameen Jauhar

The 21st century marks the rise of several technological innovations whose full impact is yet to be realised. These emerging technologies include artificial intelligence (AI) applications across various sectors, the Internet of Things (IoT), blockchain technology and many more. Examples include wearables, smart home devices, predictive tools built through machine learning, and facial recognition technology, to name a few.

The promise of these technologies lies in greater efficiency, enhanced computational ability and the significant augmentation of human functioning. For example, intelligent algorithms in healthcare settings can aid in detecting cancer, while facial recognition is being deployed for law enforcement purposes in India. These technologies are also seen as economically valuable, with rising demand for AI and IoT systems. Private entities can thus profit by building AI- or IoT-based devices that add value for consumers or other businesses.

However, the use of these emerging technologies is intertwined with the processing of vast amounts of data. This is most noticeable in devices with built-in sensors that collect and analyse data round the clock, and in machine and deep learning algorithms that require vast data corpuses for their development and deployment. This ubiquitous integration of data processing into AI and IoT devices presents real concerns, with serious, large-scale implications from an informational privacy standpoint.

For instance, most such data processing happens in a person-agnostic manner, i.e., it does not differentiate between persons who have consented to that processing and those who have not. Further, aggregated databases comprising data from a user's friends, family and others can feed profiling algorithms. The fact that such profiling tools are governed by a one-time notice and consent obtained from the user at the outset raises concerns about how effective and meaningful such consent is in preserving the privacy and autonomy of one's information.

There are further concerns about how profiling itself can be biased and exclusionary, which adds to these direct impediments to informational privacy. Examples of bias and discrimination have been noted in AI applications used in the credit and lending industry, and in recidivism-prediction tools that rely on racial markers and inform tougher criminal sentencing.

Puttaswamy and the shifting contours of notice and consent

These newer concerns emerge in the backdrop of the right to informational privacy. The right to privacy was recognised as part of Article 21 of the Indian Constitution in KS Puttaswamy v. Union of India in 2017. That judgment discussed the idea of autonomy within the right to informational privacy. Such decisional autonomy refers to every person's right to control who can process or use her personal data.

The traditional means of exercising this control has been a notice-and-consent mechanism, under which individuals are informed of the terms on which their data will be processed and given the choice to accept or reject such processing. This process places the individual's choice at the forefront, and views the act of giving consent as an economic choice made by a rational consumer.

Seen through this lens, the behaviour of most consumers, who are theoretically keen on preserving privacy but in practice routinely consent to their data being processed, makes little sense. This disparity between thought and action has been dubbed the ‘privacy paradox’.

In recent years, the ‘rational choice’ view has been challenged in light of how consent actually plays out in real life. For instance, digital operators now reportedly use ‘dark patterns’ to prevent or discourage users from learning the full extent and terms on which their data is processed. These involve cognitive manipulations that push users towards a preferred, pre-selected choice, or complex website design that makes it difficult to access crucial terms governing the use of their data.

On the other hand, being asked to give consent at almost every step in the digital age has led to ‘consent fatigue’: a phenomenon where users develop a tendency to simply accept privacy notices without reading them, owing to factors such as lengthy, jargon-heavy notices and the constant information overload accompanying every choice a user makes online. These factors explain the privacy paradox to an extent, and further highlight the inadequacy of notice-and-consent as a mechanism to protect informational privacy.

The above-mentioned issues with the personal data processing practices of AI and IoT applications point to the inadequacy of consent mechanisms in ensuring informational self-determination and privacy. A better way to safeguard informational privacy and preserve control over data processing is to provide access to and control over one’s data throughout the period of its processing, and not just at the point of its collection.

Empowering users at an individual level, along with broader community-level norms, allows for a continuing exercise of the right to informational privacy. Such broad-based engagement with personal data at the individual and societal levels, throughout the use of that data, addresses some of the concerns raised by AI and IoT applications.

To enable this continuing exercise of privacy and data protection norms at both the individual and societal levels, a regulatory framework is needed: one setting out individual rights and obligations, measures and norms based on the functionality and features of data processing, and an enforcement mechanism providing remedies and penalties. Such a framework would offer a more cogent approach to the data processing concerns of the 21st century.

PDP Bill, 2019, and privacy concerns in emerging technologies

In this context, the Personal Data Protection Bill, 2019 (PDP Bill 2019) tabled in Parliament is a welcome and much-needed legislative step. In addition to notice-and-consent mechanisms, the PDP Bill 2019 provides for individual rights such as the rights to correction, erasure, portability and withdrawal of consent, and sets out obligations that focus on data minimisation through limited permissions for collection, purpose-based processing, and storage.

Further, the PDP Bill 2019 aims to set societal benchmarks on transparency, security standards, data breach reporting and privacy-by-design policies, and requires entities processing data at significant scale to undertake data protection impact assessments and appoint a data protection officer.

The PDP Bill 2019 also aims to accommodate innovation in emerging technologies through a regulatory sandbox. The effectiveness of this provision will be contingent on how much regulatory relaxation and supervision the sandbox involves. Lastly, the PDP Bill 2019 provides for a data protection authority, and penalties for violating its provisions.

The PDP Bill 2019 has been subject to much criticism regarding the sweeping exemptions from the entire bill that the central government may grant to government agencies. This criticism may be partially addressed (in the letter of the law) through the recommendations of the Joint Parliamentary Committee (JPC), which propose a requirement of ‘just, fair, reasonable and proportionate procedures’ for such exempted agencies.

Nevertheless, the rapid deployment of AI- and IoT-based solutions by government agencies requires a deliberate reconsideration of these exemptions. The use of facial recognition technology in law enforcement, authentication for public services, and the proposed use of IoT systems in smart cities all envision the processing of vast amounts of personal data.

Permitting sweeping exemptions for government agencies would render ineffective the forward-looking provisions being brought in through the PDP Bill 2019. The continuing deployment of technological solutions by the government also highlights the urgent need for a data protection law in India.

Dhruv Somayajula is a research fellow at the Vidhi Centre for Legal Policy, and Ameen Jauhar leads the Applied Law & Technology Research team at Vidhi. The complete working paper on the above subject can be found here.
