In today’s digital economy, data is no longer just a by-product of online activity. It has become one of the most important economic assets. Every online interaction, from searches and purchases to location data and browsing behaviour, generates information that businesses can analyse to improve services, understand preferences and build new products.
Entire industries have been reshaped by the adoption of data-driven models. Data powers targeted advertising, enables innovation in artificial intelligence and helps companies understand consumer behaviour with unprecedented precision. In many ways, it has also allowed digital services to remain accessible or free for users.
But this opportunity comes with an equally important question: where does monetisation end and privacy begin?
The challenge for regulators and businesses today is not whether data should be used to create value. It is how to ensure that such value creation does not come at the cost of individual rights.
Across the world, legal systems are attempting to answer this question. Frameworks such as the European Union’s General Data Protection Regulation (GDPR), California’s Consumer Privacy Act (CCPA) and India’s Digital Personal Data Protection Act, 2023 (DPDP Act) recognise a key principle: personal data does not belong to companies. Organisations merely process it under clearly defined legal conditions.
India’s DPDP Act represents an important step in this direction. It establishes a framework for businesses processing personal data while also recognising the economic importance of data flows in a rapidly digitising economy. At the same time, the structure has attracted debate for its market-oriented design, with some critics suggesting that stronger enforcement and oversight will be necessary to ensure that individuals remain adequately protected.
Ultimately, the effectiveness of any data protection law lies not just in its text, but in its implementation.
One of the most critical safeguards in this context is the principle of purpose limitation. Data must be collected for specific and legitimate purposes. If organisations are allowed to reuse personal data indefinitely for unrelated activities, it creates the risk of function creep, where data gathered for one purpose is quietly repurposed for another without the user’s knowledge.
Closely linked to this is data minimisation. Ideally, companies should collect only the data that is genuinely required to deliver a service or fulfil a set task. The more information an organisation gathers, the larger its exposure to misuse, breaches or unintended consequences.
An additional trend emerging in regulation is that of privacy by design. Privacy should not be a box checked off once the product has been developed. On the contrary, companies should embed data-protection safeguards into the system’s architecture from the very beginning. Encryption and similar technologies can help companies minimise the risks involved in handling personal information.
Government and legal frameworks alone cannot solve the challenge. Transparency and user trust must also form the foundation of any responsible data ecosystem.
Consent, for instance, is often presented as the non-negotiable foundation of digital privacy. In practice, however, meaningful consent is usually lost in lengthy privacy policies that no one reads. For consent to be genuine, it must be based on full knowledge and comprehension of what is being agreed to.
At the same time, new models are emerging that attempt to rebalance the economics of data. Some proposals suggest that individuals could actively license their data in exchange for compensation or benefits, creating a more reciprocal value exchange in the digital economy. Even if these solutions remain far from reality today, they signal a growing recognition of the individual’s role in deciding how personal data is used.
Ultimately, the balance between data monetisation and data security does not need to be a zero-sum game. Companies require data to innovate and compete, but they should not sacrifice customers’ right to privacy in the process. Informed consent therefore remains the essential first step.
The way forward lies in developing robust legal mechanisms, responsible business models and effective enforcement measures so that data-driven innovation proceeds in a way that does not harm individuals.
In a digital-first economy, the real goal is not to restrict data; it is to build systems where value creation and trust can grow together.
Meeru Gupta is General Counsel (VP) at Bata India.