As data becomes the primary currency, the real question is no longer whether users gave consent, but whether they were ever given a meaningful opportunity to say no.
In the contemporary digital environment, consent is no longer a guarantee of autonomy; it has become a performance shaped by persuasive design and interface psychology. As businesses increasingly rely on data extraction to sustain their models, a troubling design trend has emerged: dark patterns. These deceptive user interface techniques subtly manipulate users into taking actions they may not fully intend to take - be it Amazon’s complex account deletion path, LinkedIn’s aggressive contact scraping, or urgency traps on travel platforms.
Legal frameworks such as India’s Digital Personal Data Protection Act, 2023 (DPDPA) and the Indian Contract Act, 1872 require that consent be “free”, “specific” and “informed”. These definitions, however, remain aspirational: thousands of consumers continue to encounter “forced action” and other dark patterns through platform interfaces. Across jurisdictions, the gap between the law’s conceptualisation of consent and the user’s experience of consent is widening. This raises a critical question: if design scripts the choice, does the user still act freely?
The Indian Contract Act, 1872 treats consent as free when it is not the result of coercion, undue influence, fraud, misrepresentation or mistake. The underlying premise is that individuals are autonomous agents capable of making rational decisions unless subjected to overt pressure or deception. Behavioural design techniques, however, do not fit neatly into these categories: consent is shaped, nudged and directed by the environment in which it is given. The contract law framework assumes that free will is compromised only by intentional, overt, interpersonal wrongdoing; it does not recognise environmental influences engineered through design. The law’s notion of free will thus remains rooted in a classical conception of the autonomous, rational individual - one that behavioural psychology increasingly complicates.
The DPDPA views consent through the lens of data protection, providing that consent to the processing of personal data must be free, specific, informed and unconditional. In essence, personal data may be used only for the specified purpose - and only such data as is “necessary” for that purpose - and the individual must have agreed to that use with full authority and unconditional consent. Here we run into the same problem: behavioural design techniques are ignored. This is especially troubling because dark patterns such as “Privacy Zuckering” - which the Central Consumer Protection Authority’s (CCPA) guidelines do not address - run rampant, with e-commerce platforms sending unsolicited messages without user consent to nudge them into completing their transactions.
In an attempt to address such concerns about consumer autonomy, the CCPA issued a notification in 2023 that legally defined “dark patterns” as “deceptive design patterns meant to subvert or impair consumer autonomy”. The notification specified 13 types of dark patterns and prohibited all platforms from engaging in such manipulative practices. Given their continued prevalence, however, the CCPA issued an advisory to all e-commerce platforms on June 5, 2025, instructing them to conduct “self-audits” and ensure their platforms are free of dark patterns.
In November, the Ministry of Consumer Affairs, Food & Public Distribution released a press release stating that 26 leading e-commerce platforms had “voluntarily submitted self-declaration letters” confirming compliance with the Dark Patterns Guidelines issued in 2023. While this may be a step in the right direction, the process has not been disclosed to the public in any transparent way. As the most affected stakeholders, the public deserves to know what each self-audit actually entailed: did the platform have a dark pattern at all, and if so, how was it addressed? Such disclosure would not only create greater accountability among platforms but also retain the trust of consumers.
On the issue of dark patterns, Zepto CEO Aadit Palicha acknowledged that these design choices were an unsuccessful experiment but emphasised that their removal had “nothing to do with government intervention.” This admission raises a more fundamental question: if platforms are retracting dark pattern-like features for business reasons rather than regulatory compliance, to what extent are existing frameworks actually constraining such practices?
User interfaces function as behavioural architecture. They do not merely present choices; they structure decisions. The prominence of certain buttons directs attention; the placement of alternatives signals hierarchy; the sequencing of pages funnels choices; the use of timers creates artificial urgency; and pre-checked boxes normalise consent. The environment is, therefore, not a neutral medium; it actively structures decisions. Behavioural psychology demonstrates that human decision-making is predictably biased. Contrary to the rational-actor model, individuals rely on heuristics - mental shortcuts - that make them susceptible to systematic errors in judgment.
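To make this concrete, the sketch below shows how a consent banner might encode these techniques in interface code. It is a minimal, hypothetical illustration - every identifier, style and message in it is invented for this example, not drawn from any real platform’s implementation.

```typescript
// Hypothetical sketch of a consent banner that "structures" the decision.
// All names and values are invented for illustration.

function renderConsentBanner(root: HTMLElement): void {
  // Default effect: the data-sharing box arrives pre-checked,
  // so inaction becomes consent.
  const optIn = document.createElement("input");
  optIn.type = "checkbox";
  optIn.checked = true; // the pre-selected "recommended" choice

  // Prominence: "Accept all" is a large, high-contrast button...
  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all";
  acceptAll.style.cssText =
    "font-size:1.2em;padding:12px 32px;background:#0a7;color:#fff;";

  // ...while refusal is a small grey link, signalling hierarchy.
  const manage = document.createElement("a");
  manage.textContent = "Manage options";
  manage.href = "#settings"; // leads into a multi-page funnel, not a single "No"
  manage.style.cssText = "font-size:0.75em;color:#999;";

  // Artificial urgency: a countdown that quietly restarts when it expires.
  const timer = document.createElement("span");
  let seconds = 60;
  setInterval(() => {
    seconds = seconds > 0 ? seconds - 1 : 60; // the deadline is fake
    timer.textContent = `Offer expires in ${seconds}s`;
  }, 1000);

  root.append(timer, optIn, acceptAll, manage);
}
```

Nothing in this snippet lies to the user outright; each element simply tilts the decision environment, which is precisely why such design eludes doctrines built around overt deception.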
First, the framing effect shows that the way options are presented alters the decisions people make, even when the underlying information is identical. Second, loss aversion causes individuals to prefer avoiding losses over acquiring gains; urgency cues exploit this tendency. Third, anchoring shows how initial reference points, like the “promo price”, skew subsequent judgments. Fourth, the default effect indicates that people disproportionately choose pre-selected options because defaults signal recommended behaviour and reduce decision fatigue.
These cognitive biases do not eliminate autonomy, but they complicate it. Individuals may feel they are acting freely while unconsciously responding to subtle cues embedded in their environment. Behavioural psychology challenges the conventional legal assumption that external influence must be overt or coercive to meaningfully affect free will.
Dark patterns implicate constitutional rights under Article 21, which, as interpreted by the Supreme Court, places individual autonomy, informational self-determination and decisional privacy at the core of the right to life and personal liberty. When digital platforms nudge or mislead users into “agreeing” through deceptive design, they interfere with this constitutional promise by subtly distorting free choice. Decisional privacy, in this sense, protects a person’s freedom to refuse, withdraw or withhold consent without being steered, pressured or manipulated. This vision sits uneasily with today’s digital marketplace, where interfaces are deliberately built to wear down user resistance. Dark patterns may not coerce users in a traditional sense, but they engineer the environment to influence outcomes. They exploit cognitive shortcuts and predictable biases, creating a form of subtle, structural coercion. If we take the constitutional right to autonomy seriously, valid consent cannot be reduced to a mere absence of force. It must reflect meaningful choice - one that is informed, voluntary and supported by the design of the interface itself. A checkbox ticked after navigating a maze of nudges may meet statutory formalities, but it falls short of the Constitution’s demand for true voluntariness.
Law continues to treat consent as a checkbox event - something to be obtained, rather than a context-driven process shaped by cognitive biases, behavioural economics and interface design. As a result, what appears to be legally valid consent is frequently the product of interface coercion, not autonomous choice. Digital interface design operationalises behavioural psychology. Platforms strategically arrange visual cues, colours, layouts and defaults to elicit desired behaviours. These patterns may not rise to the level of manipulative “dark patterns”, yet they operate through the same psychological mechanisms.
Additionally, enforcement of such regulations has always been a problem. Earlier this year, BookMyShow was issued a notice by the CCPA. The platform was employing the dark pattern known as “basket sneaking” - automatically adding ₹1 per ticket booked on the application, which went toward the app’s “BookASmile” charity initiative. Although this practice is explicitly prohibited under the 2023 guidelines, the CCPA’s response went no further than a notice ordering the platform to remove the dark pattern. This lack of enforcement creates a system in which such deceptive practices become normalised across platforms.
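The mechanics of this pattern are simple enough to sketch in a few lines. The code below is a hypothetical reconstruction of how a pre-selected charge inflates a cart total; the item names and the ₹1 figure mirror the example described above, but the code itself is invented for illustration.

```typescript
// Hypothetical sketch of "basket sneaking": a charge is added to the
// payable total by default, and removing it is the user's burden.

interface CartItem {
  label: string;
  pricePaise: number; // amounts in paise to avoid floating-point error
  preSelected: boolean;
}

function buildCart(tickets: number, ticketPricePaise: number): CartItem[] {
  return [
    {
      label: `${tickets} ticket(s)`,
      pricePaise: tickets * ticketPricePaise,
      preSelected: true,
    },
    // The sneaked item: ₹1 (100 paise) per ticket, opted in by default.
    { label: "BookASmile donation", pricePaise: tickets * 100, preSelected: true },
  ];
}

const items = buildCart(3, 25000); // three ₹250 tickets
const payable = items
  .filter((item) => item.preSelected) // the donation counts unless actively removed
  .reduce((sum, item) => sum + item.pricePaise, 0);

console.log(`Payable: ₹${(payable / 100).toFixed(2)}`); // Payable: ₹753.00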
The digital world today resembles the maze at Villa Pisani. The user enters believing that they will choose their own path, only to realise that the design itself determines where they end up. Dark patterns operate in exactly this way - they offer the illusion of choice while quietly engineering the journey. As long as platforms can script user choices through behavioural manipulation, consent will remain a legal fiction. Just as Villa Pisani’s spiralling corridors appear open and navigable but subtly guide visitors toward a predetermined centre, today’s interfaces create pathways that feel like free choice while gently funnelling users toward outcomes platforms prefer. The modern user is not coerced, but corridored, moving through a labyrinth crafted to make one route seem natural, inevitable or effortless.
Sandhyashree Karanth and Abhijeet Menon are third-year law students at the School of Law, RV University, Bangalore.