India’s Digital Personal Data Protection Act: A data maximisation regime?

A law aimed at limiting the collection and retention of personal data is now becoming a backdoor for surveillance.
Data Privacy and The Internet

Worldwide, data protection laws are based on a straightforward principle: collect and process as little data as possible to safeguard the privacy interests of residents. Many responsible technology companies also base their privacy practices on the concept of data minimisation.

Even in the initial drafts of India’s data protection bills, including the explanatory note to the 2022 Bill, data minimisation was specifically identified as a key principle. The final version of the Digital Personal Data Protection (DPDP) Act omitted this term, but the idea remained clear: personal data should be collected and processed only for specified purposes. These developments gave the impression that India was aligning with one of the core principles of modern privacy law. 

However, the Rules issued under the Act paint a different picture. Instead of promoting minimisation, they risk establishing a data maximisation regime. The key issue is a rule that requires data fiduciaries to retain any personal data, related traffic data and logs for one year from the date of processing. The intention is to enable the government to access this personal data for law enforcement and other lawful purposes, including for deciding whether to notify a data fiduciary as a significant data fiduciary.

To understand the significance of this rule, it helps to examine modern data privacy governance in the technology sector. Many current digital systems are designed with privacy as a priority, deliberately avoiding the storage of personal data. These designs rely on techniques such as anonymisation at the source or point of collection, ephemeral data processing and automated deletion after brief retention periods.

Apple, for example, uses local differential privacy, adding noise to personal data before it leaves the device so that individuals cannot be identified from what reaches Apple’s servers. Features such as Face ID and Touch ID operate entirely on the device’s Secure Enclave, meaning the biometric data used to authenticate a user is stored locally and is not transmitted to the cloud.
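To make the idea concrete, here is a minimal sketch of local differential privacy using the classic randomised-response technique (a simplified stand-in for Apple's actual, more elaborate mechanisms). Each device perturbs its own bit before transmission, so no individual report is trustworthy, yet the aggregator can still debias the collected responses to estimate the population proportion. All names and parameters here are illustrative.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    # With probability p = e^eps / (e^eps + 1) report the true bit,
    # otherwise flip it. Each individual answer is plausibly deniable,
    # which is what gives the guarantee its "local" character.
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p else 1 - true_bit

def estimate_proportion(noisy_bits, epsilon: float) -> float:
    # Debias the aggregate: if f is the observed fraction of 1s,
    # the unbiased estimate of the true proportion is (f - (1-p)) / (2p - 1).
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    f = sum(noisy_bits) / len(noisy_bits)
    return (f - (1 - p)) / (2 * p - 1)

random.seed(42)
true_bits = [1] * 300 + [0] * 700   # true proportion of "yes" answers: 0.30
noisy = [randomized_response(b, epsilon=1.0) for b in true_bits]
estimate = estimate_proportion(noisy, epsilon=1.0)
```

The server never sees any user's true bit, yet `estimate` lands close to the real 30% proportion, illustrating why such systems have no identifiable per-user records to retain in the first place.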

Google employs federated learning, where machine learning models are trained directly on user devices while the data remains there. Urban mobility analytics platforms like Uber Movement publish traffic insights using only aggregated and anonymised trip data, rather than identifiable records of individual journeys. These approaches reflect a broader shift in privacy engineering towards source and edge anonymisation, where personal data is either transformed before leaving the device or processed locally without being stored centrally.
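The federated pattern can be sketched in a few lines. This is a toy federated-averaging loop, not Google's production system: each hypothetical device takes a gradient step on a trivial one-parameter model using its own private data, and a server averages only the resulting weights. The data lists, learning rate and model are all illustrative assumptions.

```python
import statistics

def local_update(w: float, local_data, lr: float = 0.1) -> float:
    # Each device fits the toy model "minimise (w - x)^2" on its own
    # data with one gradient step. Raw data never leaves the device;
    # only the updated weight is shared with the server.
    grad = statistics.mean(2 * (w - x) for x in local_data)
    return w - lr * grad

def federated_average(global_w: float, device_datasets, rounds: int = 50) -> float:
    for _ in range(rounds):
        local_ws = [local_update(global_w, d) for d in device_datasets]
        global_w = sum(local_ws) / len(local_ws)  # server sees weights only
    return global_w

# Three devices, each holding private readings the server never receives.
devices = [[1.0, 2.0], [3.0], [2.0, 4.0]]
w = federated_average(0.0, devices)
```

The converged weight approaches the average of the per-device means (2.5 here) even though no raw reading was ever transmitted, which is the essence of the technique: the model travels, the data stays put.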

Many systems are configured to automatically delete logs after a set period, sometimes instantly, unless an event necessitates retention. Motion-sensing cameras, for example, capture data temporarily and discard it unless an alert is triggered. Voice assistants retain only relevant commands and discard ambient noise to ensure that what is not intended as a command is not processed. These architectures reflect a simple idea: the best way to protect personal data is often not to keep it at all.
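The retention pattern described above, hold briefly, promote only on an alert, discard everything else, can be sketched as a small time-to-live buffer. This is a hypothetical illustration of the motion-camera behaviour, not any vendor's actual implementation; the class name, TTL value and payloads are assumptions.

```python
import time
from collections import deque

class EphemeralBuffer:
    """Hold events for at most `ttl` seconds; expired entries are purged
    on every access, so nothing persists unless an alert promotes it."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._events = deque()   # (monotonic timestamp, payload) pairs
        self.retained = []       # only alert-triggered events survive

    def record(self, payload) -> None:
        self._purge()
        self._events.append((time.monotonic(), payload))

    def trigger_alert(self) -> None:
        # An event of interest: promote the current window to storage.
        self._purge()
        self.retained.extend(p for _, p in self._events)

    def _purge(self) -> None:
        cutoff = time.monotonic() - self.ttl
        while self._events and self._events[0][0] < cutoff:
            self._events.popleft()

buf = EphemeralBuffer(ttl=0.05)
buf.record("frame-1")
time.sleep(0.06)        # frame-1 ages past the TTL with no alert raised
buf.record("frame-2")
buf.trigger_alert()     # only frame-2 is still in the window to retain
```

A blanket one-year retention mandate would require such a system to keep every frame, inverting its design: deletion would become the exception rather than the default.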

These are integral design principles that have been in use for decades to reduce privacy harms and build consumer trust. Personal data that is periodically purged or anonymised cannot be misused, breached or improperly accessed. This also helps companies manage their privacy risk and optimise storage costs.

A mandatory retention requirement that compels organisations to store identifiable logs risks undermining these privacy-preserving architectures, which depend on deletion or anonymisation. Even when companies have intentionally designed their systems to avoid storing identifiable personal data, they may now be forced to retain records they would otherwise have purged. This creates a lose-lose situation: the Indian public's privacy is put at greater risk, and tech companies must overhaul or discard their privacy-preserving architectures to comply with Indian legal requirements.

Notably, the DPDP Act only required the government to specify a timeframe for the compulsory deletion of personal data. What was intended to clarify deletion timelines has, in effect, become a broad retention obligation, establishing a maximisation regime.

This also raises a question about the regulatory design of the DPDP Act. The rules established under a privacy statute should ideally be guided by privacy considerations. When they start to resemble law enforcement access frameworks, it indicates that conflicting policy priorities are influencing the regulatory outcome. Additionally, this highlights the absence of checks and balances in the DPDPA to prevent rulemaking beyond the executive's scope.

The retention requirement is not a privacy safeguard but a means of enabling government access to personal data. The irony is that a law aimed at limiting the collection and retention of personal data is becoming a backdoor for surveillance. It must also be remembered that the government has unchecked powers under the Act to call for information from any data fiduciary in relation to its functions under the Act. This provision is currently being challenged before the Supreme Court as unconstitutional.

These provisions gradually normalise the idea that personal data must remain persistently available for potential State access. Over time, this shifts the design philosophy of digital systems away from privacy-by-design principles toward surveillance-by-design principles. This is precisely the opposite of what the Supreme Court intended in the Puttaswamy case or what the Justice Srikrishna Committee proposed in its report on data protection law. If India’s data protection regime is to remain credible, it must stay rooted in its original premise of data minimisation rather than expanding the digital State's data footprint.

Nikhil Narendran is a Partner at Trilegal.

Bar and Bench - Indian Legal news
www.barandbench.com