AI lawyer

Centre amends IT Rules to regulate AI-generated content; takedown timelines reduced

The amendments were notified by the MeitY on February 10 and will come into force on February 20, 2026.
The Central government has amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 to formally bring AI-generated content within India’s intermediary regulation framework.

The amendments were notified by the Union Ministry of Electronics and Information Technology (MeitY) on February 10, 2026 and will come into force on February 20.

The amendments have been issued under the Centre's rule-making powers under the Information Technology Act, 2000. They clarify that "information" used for unlawful acts under the IT Rules includes synthetically generated information, extending intermediary due-diligence, takedown and enforcement obligations to AI-generated content.

The changes also strengthen the user-notification obligations of intermediaries. Platforms are now required to periodically inform users, at least once every three months, that non-compliance with platform rules, privacy policies or user agreements may result in immediate suspension or termination of access, removal of content or both.

The Rules further require intermediaries to warn users that unlawful activity may attract penalties or punishment under applicable laws and that offences requiring mandatory reporting, including those under the Bharatiya Nagarik Suraksha Sanhita, 2023 and the Protection of Children from Sexual Offences (POCSO) Act, will be reported to the appropriate authorities.

AI-generated content defined

Under the amended rules, AI-generated content is regulated through the newly inserted definition of “synthetically generated information.” The term covers audio-visual content that is artificially or algorithmically created, generated, modified or altered using computer resources in a manner that appears real or authentic and is likely to be perceived as indistinguishable from a natural person or real-world event.

The notification clarifies that routine or good-faith activities such as editing, formatting, transcription, translation, accessibility improvements, educational and training materials, and research outputs will not fall within the scope of synthetically generated information, provided they do not result in false or misleading electronic records.

Labelling of AI-generated content

Intermediaries that enable the creation or dissemination of AI-generated content are now required to ensure that such content is clearly and prominently labelled as synthetically generated.

Where technically feasible, such content must also be embedded with permanent metadata or provenance mechanisms, including a unique identifier, to enable identification of the computer resource used to generate or modify it. Intermediaries are prohibited from enabling the removal, suppression or alteration of such labels or metadata.

New duties for social media platforms

Significant social media intermediaries must require users to declare whether content is AI-generated before it is displayed, uploaded or published on their platforms.

Platforms must also deploy appropriate technical measures, including automated tools, to verify the accuracy of such declarations. Where content is confirmed to be AI-generated, it must be displayed with a clear and prominent notice indicating its synthetic nature.

Faster takedown and compliance timelines

The amendments significantly tighten multiple enforcement and compliance timelines under the IT Rules, accelerating the pace at which intermediaries are required to act on unlawful content and user grievances.

The amendment expressly substitutes several timelines contained in Rule 3 of the IT Rules.

  • The time for compliance with lawful takedown directions has been reduced from 36 hours to 3 hours;

  • Under the grievance redressal mechanism, the period for disposal of grievances has been reduced from 15 days to 7 days;

  • For complaints that require urgent action, the amendment reduces the time available to intermediaries to act from 72 hours to 36 hours;

  • Intermediaries are now required to act on specified content removal complaints within 2 hours, instead of the earlier 24-hour period.

The amendments also clarify that intermediaries must act expeditiously when they become aware of violations involving synthetically generated information, whether on their own or upon receipt of a complaint. Such action may include disabling access to the content, suspending user accounts and reporting the matter to the appropriate authorities where required by law.

The amendment further clarifies that the removal or disabling of access to synthetically generated information by intermediaries in compliance with the IT Rules will not amount to a violation of safe harbour conditions under Section 79(2) of the IT Act.

[Read Rules]
Bar and Bench - Indian Legal news
www.barandbench.com