Introduction
The use of artificial intelligence in creating and modifying content is now a routine part of business operations across sectors. Organisations increasingly use AI tools to prepare text, images, videos, presentations, educational materials, and marketing content. As this practice expands, regulatory attention has also increased, particularly around transparency and disclosure where content is artificially or synthetically generated. The recent amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 have brought renewed focus to these issues in the Indian context.
What Counts as Synthetically Generated Information
At a broad level, the amended framework addresses content that is created or significantly altered using artificial intelligence tools, machine learning systems, deepfake technology, or similar technologies in a manner that may make it difficult for an ordinary viewer to distinguish such content from naturally created material. The Rules refer to such content as "Synthetically Generated Information" and contemplate transparency measures around its dissemination on digital platforms.
The Labelling Requirement
One of the central themes emerging from the amended framework is labelling. In general terms, where AI-generated or AI-modified content is published on a digital platform, it is expected to be clearly and conspicuously identified in a manner that enables users to recognise it as such. The emphasis is on visibility, clarity, and platform-appropriate disclosure rather than hidden or purely metadata-based identification.
Implications for Businesses & Territorial Considerations
For businesses, this means that AI governance can no longer be treated purely as an internal technology issue. It increasingly intersects with legal, compliance, brand, and reputational considerations. Even where a particular organisation does not neatly fall within the core categories directly regulated under the Rules, the overall direction of the framework suggests that transparent disclosure of AI-generated content is becoming an important compliance and risk-management practice.

A practical issue that often arises is whether obligations depend only on where content is viewed, or whether they may also be relevant where content is created or modified in India and later used elsewhere. While territorial questions can be complex and fact-dependent, the broader compliance trend points toward adopting consistent internal standards for identification of AI-generated content, particularly for organisations that create content for use across multiple jurisdictions and platforms.
The Role of Digital Platforms, Organisational Governance and SOPs
Another important consideration is the role of digital platforms. In many cases, platforms that host or distribute content have their own disclosure tools, content labelling features, and publication requirements. Where content is uploaded to such platforms, compliance may involve using the tagging or disclosure mechanisms built into the relevant service. As platform rules continue to evolve, organisations should ensure that internal teams are aware of platform-specific requirements at the time of publication.

From a governance perspective, organisations should consider adopting internal policies or standard operating procedures dealing with the use of AI in content creation. Such frameworks can help address issues such as approved AI tools, review and approval processes, disclosure standards, accountability, training, and escalation protocols. A structured internal approach is often the most effective way to manage legal, operational, and reputational risks in this area.
Concluding Thoughts
The broader takeaway is that AI-content regulation is still developing, and enforcement expectations may continue to evolve. In this environment, transparency, internal controls, and platform-aware publication practices are likely to remain important. Organisations using AI for content creation should therefore keep their review processes under periodic assessment and remain attentive to further legal and regulatory developments.