RTI not meant to micro-manage government; no need for standalone AI law for now: Economic Survey of India

The Economic Survey warns that transparency, if pursued without balance, may undermine effective governance.

The Economic Survey of India has cautioned that the Right to Information (RTI) Act, 2005 was not designed to satisfy idle curiosity or enable citizens to micro-manage government functioning, even as it reaffirmed the law’s importance as a cornerstone of transparency and accountability.

“The RTI Act was never intended as a tool for idle curiosity, nor as a mechanism to micromanage government from the outside.”

The survey describes the RTI Act as one of India’s most significant democratic reforms and acknowledges its transformative impact on governance. However, it warns that transparency, if pursued without balance, may undermine effective governance.

The survey cautions that such an unbalanced approach may defeat the law’s original purpose. At the same time, it stresses that any reconsideration of the RTI framework must not dilute its core function.

The survey observes that India’s RTI framework is broader than most global transparency laws. It contrasts India’s approach with other jurisdictions where internal personnel rules, inter-agency memoranda, financial regulation reports and policy formulation materials are expressly exempt from disclosure.

According to the survey, India provides limited space for such exclusions.

“India, in contrast, leaves far less space for such carve-outs.”

It notes that draft notes, internal correspondence and even personal records of officials often become public, sometimes without a strong link to public interest. As a result, file notings and internal opinions fall within the definition of information under the Act, with only Cabinet papers enjoying temporary protection.

The survey warns that routine disclosure of internal drafts and remarks may weaken governance.

“If every draft or remark might be disclosed, officials may hold back, resorting instead to cautious language and fewer bold ideas...The candour needed for effective governance is blunted.”

Instead, accountability should attach to final decisions, it suggests.

“Democracy functions best when officials can deliberate freely and are then held accountable for the decisions they finally endorse, not for every half-formed thought expressed along the way.”

The survey notes that Indian courts have recognised such limits and reiterates the law’s core objective.

“The RTI Act is best understood not as an end in itself, but as a means to strengthen democracy.”

Standalone law for AI not needed for now

On the regulation of artificial intelligence (AI), the Economic Survey of India 2025–26 has suggested that India does not require a standalone law. It states that AI governance can instead be anchored within the Digital Personal Data Protection (DPDP) Act, 2023.

The survey cautions against rushing into an omnibus AI statute and recommends a phased, risk-based regulatory approach that evolves alongside technology and market adoption.

The survey proposes a sequencing model for AI governance: enable experimentation, allow scaling and introduce binding legal obligations only where risks, market power or information asymmetries are most pronounced. It warns that uniform, upfront regulation could impose disproportionate compliance costs on start-ups and early-stage innovators in a labour-abundant and resource-constrained economy like India.

Rather than replicating models such as the EU’s Artificial Intelligence Act, the survey argues that India should allow sectoral regulators to oversee AI deployment within their respective domains - including finance, healthcare, education and public administration.

"Data governance must also evolve through subordinate legislation under the DPDP framework to introduce functional data categorisation and auditability requirements, specifically for large-scale AI training. This must be complemented by incentive-based mechanisms for domestic value retention, such as the menu-based contribution pathways illustrated earlier. Human capital pipelines, particularly the ‘earn-and-learn’ pathways and curricular flexibility, should be scaled using existing legislative and budgetary lever."

The survey recommends that AI compliance obligations be calibrated based on scale, risk and economic impact, with lighter regimes for research institutions and start-ups and higher transparency and reporting requirements for large firms operating in sensitive sectors.

Rejecting blanket bans, it calls for stronger institutional capacity - including an AI Safety Institute - to conduct scenario testing, red-teaming and public disclosure of safety assessments. Certain uses, such as intrusive surveillance, predictive policing and opaque behavioural profiling, may warrant strict limits, it added.

Overall, the survey says that India’s AI opportunity lies in application-led, sector-specific systems built on open, interoperable platforms, supported by incremental regulation under existing law.
