<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Intermediary Guidelines Archives - Bhatt &amp; Joshi Associates</title>
	<atom:link href="https://bhattandjoshiassociates.com/tag/intermediary-guidelines/feed/" rel="self" type="application/rss+xml" />
	<link>https://bhattandjoshiassociates.com/tag/intermediary-guidelines/</link>
	<description>Best High Court Advocates &#38; Lawyers</description>
	<lastBuildDate>Fri, 20 Feb 2026 12:55:48 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://bhattandjoshiassociates.com/wp-content/uploads/2025/08/cropped-bhatt-and-joshi-associates-logo-32x32.png</url>
	<title>Intermediary Guidelines Archives - Bhatt &amp; Joshi Associates</title>
	<link>https://bhattandjoshiassociates.com/tag/intermediary-guidelines/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Section 79 Safe Harbour and AI Platforms: Can an Algorithm Be an Intermediary Under Indian Law?</title>
		<link>https://bhattandjoshiassociates.com/section-79-safe-harbour-and-ai-platforms-can-an-algorithm-be-an-intermediary-under-indian-law/</link>
		
		<dc:creator><![CDATA[Aaditya Bhatt]]></dc:creator>
		<pubDate>Fri, 20 Feb 2026 12:34:08 +0000</pubDate>
				<category><![CDATA[Information Technology]]></category>
		<category><![CDATA[AI and Law]]></category>
		<category><![CDATA[AI Regulation India]]></category>
		<category><![CDATA[Algorithmic Liability]]></category>
		<category><![CDATA[Generative AI]]></category>
		<category><![CDATA[Intermediary Guidelines]]></category>
		<category><![CDATA[Intermediary Liability]]></category>
		<category><![CDATA[IT Act 2000]]></category>
		<category><![CDATA[IT Rules 2026]]></category>
		<category><![CDATA[Safe Harbour]]></category>
		<category><![CDATA[Section 79]]></category>
		<guid isPermaLink="false">https://bhattandjoshiassociates.com/?p=31818</guid>

					<description><![CDATA[<p>Introduction The question of whether an artificial intelligence platform can qualify as an &#8220;intermediary&#8221; under Indian law — and thereby claim the protection of safe harbour under Section 79 of the Information Technology Act, 2000 — is one of the most pressing and underexamined questions in Indian technology law today. For more than two decades, [&#8230;]</p>
<p>The post <a href="https://bhattandjoshiassociates.com/section-79-safe-harbour-and-ai-platforms-can-an-algorithm-be-an-intermediary-under-indian-law/">Section 79 Safe Harbour and AI Platforms: Can an Algorithm Be an Intermediary Under Indian Law?</a> appeared first on <a href="https://bhattandjoshiassociates.com">Bhatt &amp; Joshi Associates</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2><b>Introduction</b></h2>
<p><span style="font-weight: 400;">The question of whether an artificial intelligence platform can qualify as an &#8220;intermediary&#8221; under Indian law — and thereby claim the protection of safe harbour under Section 79 of the Information Technology Act, 2000 — is one of the most pressing and underexamined questions in Indian technology law today. For more than two decades, Section 79 has functioned as the backbone of India&#8217;s internet economy, shielding platforms from secondary liability for third-party content. The provision was drafted at a time when the internet was imagined as a passive pipe: a conduit through which users sent and received information. The generative and recommender algorithms that now define the digital experience were simply not contemplated [1].</span></p>
<p><span style="font-weight: 400;">Today, platforms such as YouTube, Instagram, and AI-native services like Grok do not simply host content. Their algorithms curate, amplify, personalise, and in the case of generative AI, actively produce it. This makes the question far from academic: if an algorithm is found to be an active participant in content creation or curation, the platform deploying it may lose its statutory shield entirely. The Ministry of Electronics and Information Technology (MeitY) has, through a series of advisories in 2023 and 2024, begun to signal precisely this shift — that AI is not simply content hosted on a platform, but content shaped and generated by it [2].</span></p>
<h2><b>The Architecture of Section 79 of the IT Act: What the Provision Actually Says</b></h2>
<p><span style="font-weight: 400;">Section 79 of the Information Technology Act, 2000, provides in its operative part: </span><i><span style="font-weight: 400;">&#8220;Notwithstanding anything contained in any law for the time being in force but subject to the provisions of sub-sections (2) and (3), an intermediary shall not be liable for any third party information, data, or communication link made available or hosted by him.&#8221;</span></i><span style="font-weight: 400;"> This immunity is not unconditional. Sub-section (2) requires that the intermediary must not have initiated the transmission, must not have selected the receiver, and must not have selected or modified the information contained in the transmission. It must also observe due diligence and comply with the guidelines prescribed by the Central Government.</span></p>
<p><span style="font-weight: 400;">Sub-section (3) withdraws the protection in two scenarios: first, where the intermediary has conspired with, abetted, aided, or induced the commission of an unlawful act; and second, where the intermediary, upon receiving &#8220;actual knowledge&#8221; that unlawful content is being hosted on its platform, fails to expeditiously remove or disable access to that material. The term &#8220;intermediary&#8221; is defined under Section 2(1)(w) of the IT Act as </span><i><span style="font-weight: 400;">&#8220;any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record,&#8221;</span></i><span style="font-weight: 400;"> and expressly includes telecom service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online marketplaces, and cyber cafes [1].</span></p>
<p><span style="font-weight: 400;">The structure of this provision assumes a fundamental premise: that the intermediary is a passive actor. Its immunity is premised on its not having shaped the content in question. The moment it crosses into active participation — selecting, modifying, inducing — the statutory protection falls away. The rise of AI platforms tests every element of this assumption.</span></p>
<h2><b>Shreya Singhal v. Union of India (2015): The Constitutional Baseline</b></h2>
<p><span style="font-weight: 400;">No discussion of Section 79 of the IT Act is complete without a reckoning with the Supreme Court&#8217;s landmark judgment in </span><i><span style="font-weight: 400;">Shreya Singhal v. Union of India</span></i><span style="font-weight: 400;">, (2015) 5 SCC 1, delivered on 24 March 2015 by a bench of Justices J. Chelameswar and R.F. Nariman. The case arose from a batch of writ petitions under Article 32 of the Constitution of India, principally challenging the constitutionality of Sections 66A, 69A, and 79 of the IT Act. The Supreme Court&#8217;s treatment of Section 79 fundamentally reshaped the intermediary liability regime in India [3].</span></p>
<p><span style="font-weight: 400;">The Court read down Section 79(3)(b) to narrow its scope significantly. The holding was unambiguous:</span></p>
<blockquote><p><i><span style="font-weight: 400;">&#8220;Section 79 is valid subject to Section 79(3)(b) being read down to mean that an intermediary upon receiving actual knowledge from a court order or on being notified by the appropriate Government or its agency that unlawful acts relatable to Article 19(2) are going to be committed then fails to expeditiously remove or disable access to such material.&#8221;</span></i></p></blockquote>
<p><span style="font-weight: 400;">In practical terms, the Court held that intermediaries are not required to act upon private takedown requests. &#8220;Actual knowledge,&#8221; as used in Section 79(3)(b), was interpreted to mean knowledge received through the medium of a court order — not a complaint from a private party. This interpretation rested on a practical foundation: holding intermediaries like Google and Facebook to a standard of responding to every private complaint would make it impossible for them to function, since millions of requests are received and an intermediary cannot be expected to adjudicate the legality of each piece of content on its own. The Court further affirmed that there is no positive obligation on intermediaries to monitor content on their platforms [3]. This no-monitoring principle remains foundational to India&#8217;s safe harbour regime under Section 79 of the IT Act, even as AI regulation begins to chip away at it.</span></p>
<h2><b>Active vs. Passive Intermediaries: The Christian Louboutin Standard</b></h2>
<p><span style="font-weight: 400;">The passive/active distinction now central to the AI liability debate was crystallised in Indian jurisprudence by the Delhi High Court in </span><i><span style="font-weight: 400;">Christian Louboutin SAS v. Nakul Bajaj &amp; Ors.</span></i><span style="font-weight: 400;">, 2018 SCC OnLine Del 12215, decided on 2 November 2018 by Justice Prathiba M. Singh. The case involved the luxury shoe brand&#8217;s claim against darveys.com, an e-commerce platform that used the plaintiff&#8217;s trademarks as meta-tags and claimed to sell authentic goods sourced from authorised stores [4].</span></p>
<p><span style="font-weight: 400;">The defendant&#8217;s principal defence was that it was a mere intermediary under Section 79 of the IT Act. Justice Singh rejected this defence and, in doing so, laid down a twenty-six point framework to determine whether an online platform is a passive conduit or an active participant. The court reasoned that so long as a platform acts as &#8220;mere conduit or passive transmitters of the records or of the information, they continue to be intermediaries, but merely calling themselves as intermediaries does not qualify all e-commerce platforms or online market places as one.&#8221; The court then held:</span></p>
<blockquote><p><i><span style="font-weight: 400;">&#8220;When an e-commerce website is involved in or conducts its business in such a manner, which would see the presence of a large number of elements enumerated above, it could be said to cross the line from being an intermediary to an active participant.&#8221;</span></i></p></blockquote>
<p><span style="font-weight: 400;">By curating product listings, arranging logistics, using meta-tags, and guaranteeing authenticity, darveys.com had exceeded the role of a neutral conduit. The court also held that failure to observe due diligence with respect to intellectual property rights could amount to &#8220;conspiring, aiding, abetting, or inducing&#8221; unlawful conduct under Section 79(3)(a), independently disentitling the platform from safe harbour [4].</span></p>
<p><span style="font-weight: 400;">This framework applies with full force to AI platforms. When a recommendation algorithm selects which content a user sees, or when a generative AI model produces text or video in response to a user prompt, the question of whether these functions constitute &#8220;selection&#8221; or &#8220;modification&#8221; of information within the language of Section 79(2)(b) becomes the defining legal inquiry. The </span><i><span style="font-weight: 400;">Christian Louboutin</span></i><span style="font-weight: 400;"> standard supplies the doctrinal tool; generative AI supplies the stress test.</span></p>
<h2><b>IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021: Expanding the Compliance Perimeter</b></h2>
<p><span style="font-weight: 400;">The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, notified on 25 February 2021 under Section 87 read with Section 79 of the IT Act, represent the most significant regulatory expansion of intermediary obligations since the original 2011 Guidelines. Rule 7 makes explicit that an intermediary which fails to comply with prescribed due diligence requirements shall no longer be entitled to safe harbour under Section 79(1) of the IT Act and shall be liable under applicable laws [1].</span></p>
<p><span style="font-weight: 400;">The 2021 Rules introduced the classification of &#8220;significant social media intermediaries&#8221; (SSMIs) — social media intermediaries with more than fifty lakh (five million) registered users in India. SSMIs bear substantially heavier obligations: they must appoint a Chief Compliance Officer, a Grievance Redressal Officer, and a Nodal Contact Person, all resident in India. Rule 4(2) requires SSMIs that primarily provide messaging services to enable identification of the &#8220;first originator&#8221; of information where directed by a court or competent authority under Section 69 of the IT Act.</span></p>
<p><span style="font-weight: 400;">For AI platforms, the most consequential provision is Rule 3(1)(b), which requires intermediaries to &#8220;make reasonable efforts by itself, and to cause the users of its computer resource&#8221; not to publish certain categories of prohibited content. This language has been interpreted as potentially imposing a preventive obligation — not merely reactive removal — that moves the compliance standard toward something approaching a monitoring duty. If AI systems deployed on a platform generate or amplify prohibited content, the question of whether the platform made &#8220;reasonable efforts&#8221; to prevent this, independently of any user action, becomes immediately live [2].</span></p>
<h2><b>MeitY&#8217;s AI Advisories: The Regulatory Turn</b></h2>
<p><span style="font-weight: 400;">India&#8217;s formal attempt to address AI within the intermediary liability framework began in November 2023 and crystallised through MeitY advisories issued in early 2024. The March 15, 2024 Advisory — which replaced the March 1, 2024 Advisory — directed intermediaries to ensure that the use of &#8220;AI models, large language models, generative AI technology, software or algorithms&#8221; on or through their platforms does not allow users to host, display, upload, modify, publish, transmit, store, update, or share any content in violation of the Intermediary Guidelines or any other law in force [2].</span></p>
<p><span style="font-weight: 400;">The advisory&#8217;s significance lies in its implicit treatment of AI not as content but as a potentially liable actor within the intermediary ecosystem. By requiring platforms to ensure that AI models deployed on them do not enable unlawful conduct, MeitY effectively placed the responsibility for AI-generated harm squarely on the platform. A platform that deploys a generative AI model which produces deepfake content, defamatory material, or content that undermines democratic processes cannot credibly claim it was merely hosting third-party information — because the AI is not a third party in any conventional sense. It is the platform&#8217;s own deployed technology [2].</span></p>
<p><span style="font-weight: 400;">The advisories also addressed deepfakes specifically, reflecting the 2023 Rashmika Mandanna incident, where AI-generated synthetic video caused significant public and political concern. That episode illustrated how AI-generated content can cause reputational harm at a scale and speed that outpaces any traditional notice-and-takedown mechanism, and demonstrated to MeitY that the existing framework needed explicit AI-specific obligations [5].</span></p>
<h2><b>IT (Intermediary Guidelines) Amendment Rules, 2026: Formalising AI Liability</b></h2>
<p><span style="font-weight: 400;">The most direct regulatory intervention to date is the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, notified by MeitY on 20 February 2026. These rules, for the first time, introduce a statutory definition of &#8220;synthetically generated information&#8221; (SGI), described as any content that is artificially or algorithmically created, generated, modified, or altered using a computer resource in a manner that appears authentic. This definition is intentionally broad, capturing the full range of AI-generated content including deepfakes, synthetic audio-visual material, and algorithmically altered images [5].</span></p>
<p><span style="font-weight: 400;">The 2026 Rules impose mandatory labelling obligations on intermediaries that facilitate the creation of SGI. Visual content must carry a clear and permanent label or identifier covering at least ten percent of the display area; audio content must contain an audible disclosure during at least ten percent of its duration. These labels cannot be removed, modified, or suppressed by users. The rules also dramatically reduce takedown timelines: unlawful or prohibited AI-generated content must be removed or disabled within three hours of receiving a lawful notice [5].</span></p>
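<p><span style="font-weight: 400;">To make these thresholds concrete, the quantitative obligations described above can be expressed as simple compliance checks. The sketch below is purely illustrative: the function names, inputs, and structure are the author&#8217;s assumptions for exposition, not language drawn from the text of the Rules.</span></p>

```python
from datetime import datetime, timedelta, timezone

# Illustrative encoding of the 2026 Amendment Rules' thresholds as
# described in the accompanying analysis. Hypothetical names and
# structure; not an official or authoritative implementation.

LABEL_AREA_RATIO = 0.10      # visible label: at least 10% of display area
DISCLOSURE_RATIO = 0.10      # audible disclosure: at least 10% of duration
TAKEDOWN_WINDOW = timedelta(hours=3)  # removal window after a lawful notice

def visual_label_compliant(label_area_px: float, frame_area_px: float) -> bool:
    """True if a visual SGI label covers at least 10% of the display area."""
    return label_area_px >= LABEL_AREA_RATIO * frame_area_px

def audio_disclosure_compliant(disclosure_secs: float, total_secs: float) -> bool:
    """True if an audible disclosure runs for at least 10% of the audio."""
    return disclosure_secs >= DISCLOSURE_RATIO * total_secs

def takedown_deadline(notice_received: datetime) -> datetime:
    """Latest time by which flagged SGI must be removed or disabled."""
    return notice_received + TAKEDOWN_WINDOW
```

<p><span style="font-weight: 400;">Under this reading, a label occupying 99 of 1,000 pixels would fall short of the visual threshold, and a notice received at noon would require removal by 3:00 p.m. the same day.</span></p>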
<p><span style="font-weight: 400;">The 2026 Rules expressly clarify that intermediaries acting in good faith and in compliance with these obligations will continue to enjoy safe harbour protection under Section 79 of the IT Act. Conversely, failure to comply — failure to label, delay in takedown, or inadequate grievance handling — may result in the loss of that protection. Safe harbour is thereby transformed from a passive shield into a compliance-contingent privilege. The standard is no longer merely reactive: an intermediary must demonstrate system-level preparedness to deal with AI-generated risks proactively, not merely respond to them after harm has occurred [5].</span></p>
<h2><b>The Grok Question: When AI Is the Platform</b></h2>
<p><span style="font-weight: 400;">The most pointed articulation of the AI-as-creator problem in Indian regulatory discourse concerns the deployment of Grok, an AI model integrated into X (formerly Twitter). The Indian government has argued — publicly, if not yet conclusively in litigation — that X&#8217;s deployment of Grok effectively makes it a creator of content, not merely a host. If Grok generates content in response to user prompts, X cannot claim to be a neutral intermediary whose only role is the passive transmission of third-party information. On this view, Section 79&#8217;s safe harbour would not apply, because the platform itself is the origin point of at least some of the content on it [6].</span></p>
<p><span style="font-weight: 400;">This is the active/passive distinction from </span><i><span style="font-weight: 400;">Christian Louboutin</span></i><span style="font-weight: 400;"> transposed directly onto generative AI. The legal framework as it currently stands does not offer a clean answer. The definition of intermediary in Section 2(1)(w) refers to a person who &#8220;receives, stores or transmits&#8221; electronic records or &#8220;provides any service with respect to that record.&#8221; A generative AI model arguably does none of these things in the traditional sense — it creates records rather than receiving or transmitting them [1][6].</span></p>
<p><span style="font-weight: 400;">Researchers at the Carnegie Endowment have observed that existing definitions under the IT Act, when applied to AI systems, are &#8220;being stretched too thin&#8221; and that &#8220;generative AI systems may not fall neatly within the purview of either publisher or intermediary&#8221; under the current statutory framework [7]. This definitional gap is precisely why the 2026 Amendment Rules and the anticipated Digital India Act are significant: they represent attempts to fill a statutory vacuum that the original IT Act, drafted in 2000, could not have anticipated.</span></p>
<h2><b>MySpace Inc. v. Super Cassettes Industries Ltd.: The No-Monitoring Principle and Its Limits</b></h2>
<p><span style="font-weight: 400;">The no-monitoring principle affirmed in </span><i><span style="font-weight: 400;">Shreya Singhal</span></i><span style="font-weight: 400;"> was reaffirmed by a Division Bench of the Delhi High Court in </span><i><span style="font-weight: 400;">MySpace Inc. v. Super Cassettes Industries Ltd.</span></i><span style="font-weight: 400;">, (2017) 236 DLT 478. The court held that intermediaries are not under any positive obligation to proactively monitor content on their platforms for copyright infringement, and that &#8220;actual knowledge&#8221; must be in the form of a court order — not constructive or inferred knowledge. The court expressly rejected the argument that a platform&#8217;s technical ability to detect infringing content was equivalent to legal knowledge sufficient to impose liability [8].</span></p>
<p><span style="font-weight: 400;">This principle sits uneasily alongside the 2026 Rules&#8217; mandatory labelling and three-hour takedown obligations for AI-generated content. If a platform deploys an AI model that generates content, and that content turns out to be unlawful, the platform&#8217;s argument that it had no &#8220;actual knowledge&#8221; of the specific unlawfulness is considerably weakened — because the AI is the platform&#8217;s own system. The content did not arrive from an unknown third-party originator; it was produced by the platform&#8217;s own technology. The no-monitoring principle was premised on the practical impossibility of reviewing every piece of user-generated content. That impossibility argument does not translate cleanly to AI-generated content, which the platform&#8217;s own systems produced and could, in principle, have been designed to screen from the outset [8].</span></p>
<h2><b>X Corp. v. Union of India: Section 79(3)(b) and the Live Battleground of Safe Harbour</b></h2>
<p><span style="font-weight: 400;">The question of how Section 79(3)(b) interacts with AI-generated content is being contested in live litigation before the Karnataka High Court in </span><i><span style="font-weight: 400;">X Corp. v. Union of India</span></i><span style="font-weight: 400;">, a writ petition filed on 5 March 2025 before Justice M. Nagaprasanna. X Corp. challenges the legality of information-blocking orders issued by various government ministries under Section 79(3)(b), following a MeitY Office Memorandum of 31 October 2023 that authorised all central ministries, state governments, and local police officers to issue content blocking orders through the Sahyog portal [9].</span></p>
<p><span style="font-weight: 400;">X&#8217;s core argument, drawing expressly on </span><i><span style="font-weight: 400;">Shreya Singhal</span></i><span style="font-weight: 400;">, is that Section 79(3)(b) cannot function as an independent mechanism for content blocking. Content blocking, X submits, can only occur through the constitutionally safeguarded process under Section 69A of the IT Act, which requires reasoned orders and procedural safeguards. By contrast, Section 79(3)(b) merely describes the circumstances in which safe harbour is lost — it does not independently confer blocking power on the executive [9]. For AI platforms, the implications are significant: if informal government notices under Section 79(3)(b) are sufficient to trigger takedown obligations for AI-generated content, platforms will face executive pressure to remove such content without judicial oversight, fundamentally altering the architecture of safe harbour from an immunity into a tool of executive content governance.</span></p>
<h2><b>Conclusion</b></h2>
<p><span style="font-weight: 400;">Section 79 of the IT Act was not written for the age of algorithms. Its passive-intermediary model, refined through case law from </span><i><span style="font-weight: 400;">Shreya Singhal</span></i><span style="font-weight: 400;"> to </span><i><span style="font-weight: 400;">Christian Louboutin</span></i><span style="font-weight: 400;"> to </span><i><span style="font-weight: 400;">MySpace</span></i><span style="font-weight: 400;">, assumes a clean separation between the platform and the content it hosts. Generative AI destroys that separation. When an algorithm recommends, curates, or creates content, the platform is no longer merely a conduit — it is a participant. Whether courts will treat that participation as sufficient to strip safe harbour protection depends on how the active/passive distinction is applied to algorithmic conduct. MeitY&#8217;s 2026 Amendment Rules have begun to answer this question legislatively, by conditioning safe harbour on demonstrated compliance with AI-specific obligations, mandatory labelling, and accelerated takedown timelines. The answer, in short, is that an algorithm can be treated as part of the intermediary for regulatory purposes — but the intermediary that deploys it cannot hide behind Section 79 when the algorithm itself is the source of the harm.</span></p>
<h2><b>References</b></h2>
<p><span style="font-weight: 400;">[1] Information Technology Act, 2000, Sections 2(1)(w) and 79, Ministry of Electronics and Information Technology, Government of India. Available at:</span><a href="https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&amp;orderno=105"> <span style="font-weight: 400;">https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&amp;orderno=105</span></a></p>
<p><span style="font-weight: 400;">[2] S&amp;R Associates, &#8220;Investing in AI in India (Part 3): AI-related Advisories Under the Intermediary Guidelines,&#8221; October 2024. Available at:</span><a href="https://www.snrlaw.in/investing-in-ai-in-india-part-3-ai-related-advisories-under-the-intermediary-guidelines/"> <span style="font-weight: 400;">https://www.snrlaw.in/investing-in-ai-in-india-part-3-ai-related-advisories-under-the-intermediary-guidelines/</span></a></p>
<p><span style="font-weight: 400;">[3] </span><i><span style="font-weight: 400;">Shreya Singhal v. Union of India</span></i><span style="font-weight: 400;">, (2015) 5 SCC 1, Supreme Court of India, 24 March 2015. Full judgment available at:</span><a href="https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2015/06/Shreya_Singhal_vs_U.O.I_on_24_March_2015.pdf"> <span style="font-weight: 400;">https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2015/06/Shreya_Singhal_vs_U.O.I_on_24_March_2015.pdf</span></a></p>
<p><span style="font-weight: 400;">[4] </span><i><span style="font-weight: 400;">Christian Louboutin SAS v. Nakul Bajaj &amp; Ors.</span></i><span style="font-weight: 400;">, 2018 SCC OnLine Del 12215, Delhi High Court, 2 November 2018. Available at:</span><a href="https://indiankanoon.org/doc/99622088/"> <span style="font-weight: 400;">https://indiankanoon.org/doc/99622088/</span></a></p>
<p><span style="font-weight: 400;">[5] TBA Law, &#8220;India&#8217;s IT Intermediary Rules 2026 Amendment on AI-Generated Content: A Legal Analysis,&#8221; 2026. Available at:</span><a href="https://www.tbalaw.in/post/india-s-it-intermediary-rules-2026-amendment-on-ai-generated-content-a-legal-analysis"> <span style="font-weight: 400;">https://www.tbalaw.in/post/india-s-it-intermediary-rules-2026-amendment-on-ai-generated-content-a-legal-analysis</span></a></p>
<p><span style="font-weight: 400;">[6] IAS Gyan, &#8220;Grok Case Raises Questions of AI Governance,&#8221; 2024. Available at:</span><a href="https://www.iasgyan.in/daily-editorials/grok-case-raises-questions-of-ai-governance"> <span style="font-weight: 400;">https://www.iasgyan.in/daily-editorials/grok-case-raises-questions-of-ai-governance</span></a></p>
<p><span style="font-weight: 400;">[7] Carnegie Endowment for International Peace, &#8220;India&#8217;s Advance on AI Regulation,&#8221; November 2024. Available at:</span><a href="https://carnegieendowment.org/research/2024/11/indias-advance-on-ai-regulation?lang=en"> <span style="font-weight: 400;">https://carnegieendowment.org/research/2024/11/indias-advance-on-ai-regulation?lang=en</span></a></p>
<p><span style="font-weight: 400;">[8] Bar and Bench, &#8220;Generative AI and Intermediary Liability Under the Information Technology Act&#8221; (discussing </span><i><span style="font-weight: 400;">MySpace Inc. v. Super Cassettes Industries Ltd.</span></i><span style="font-weight: 400;">, (2017) 236 DLT 478). Available at:</span><a href="https://www.barandbench.com/view-point/generative-ai-and-intermediary-liability-under-the-information-technology-act"> <span style="font-weight: 400;">https://www.barandbench.com/view-point/generative-ai-and-intermediary-liability-under-the-information-technology-act</span></a></p>
<p><span style="font-weight: 400;">[9] SC Observer, &#8220;X Relies on &#8216;Shreya Singhal&#8217; in Arbitrary Content-Blocking Case in Karnataka HC,&#8221; July 2025. Available at:</span><a href="https://www.scobserver.in/journal/x-relies-on-shreya-singhal-in-arbitrary-content-blocking-case-in-karnataka-hc/"> <span style="font-weight: 400;">https://www.scobserver.in/journal/x-relies-on-shreya-singhal-in-arbitrary-content-blocking-case-in-karnataka-hc/</span></a></p>
<p>The post <a href="https://bhattandjoshiassociates.com/section-79-safe-harbour-and-ai-platforms-can-an-algorithm-be-an-intermediary-under-indian-law/">Section 79 Safe Harbour and AI Platforms: Can an Algorithm Be an Intermediary Under Indian Law?</a> appeared first on <a href="https://bhattandjoshiassociates.com">Bhatt &amp; Joshi Associates</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>WhatsApp Challenges IT Rules 2021 on Traceability Clause: A Constitutional and Privacy Dispute in India&#8217;s Digital Regulation</title>
		<link>https://bhattandjoshiassociates.com/whatsapp-challenges-it-rules-2021-traceability-clause-a-constitutional-and-privacy-dispute-in-indias-digital-regulation/</link>
		
		<dc:creator><![CDATA[Team]]></dc:creator>
		<pubDate>Mon, 31 May 2021 11:46:43 +0000</pubDate>
				<category><![CDATA[Current Events]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[Cyber Law India]]></category>
		<category><![CDATA[Digital Privacy]]></category>
		<category><![CDATA[Digital Rights]]></category>
		<category><![CDATA[End-to-End Encryption]]></category>
		<category><![CDATA[Intermediary Guidelines]]></category>
		<category><![CDATA[IT Rules 2021]]></category>
		<category><![CDATA[Puttaswamy Judgment]]></category>
		<category><![CDATA[Right to Privacy]]></category>
		<category><![CDATA[traceability]]></category>
		<category><![CDATA[WhatsApp India]]></category>
		<guid isPermaLink="false">https://bhattandjoshiassociates.com/?p=11168</guid>

					<description><![CDATA[<p>Introduction The intersection of digital privacy and national security has emerged as one of the defining legal battlegrounds in contemporary India. In May 2021, WhatsApp LLC filed a petition before the Delhi High Court challenging Rule 4(2) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 [1]. This WhatsApp challenge to [&#8230;]</p>
<p>The post <a href="https://bhattandjoshiassociates.com/whatsapp-challenges-it-rules-2021-traceability-clause-a-constitutional-and-privacy-dispute-in-indias-digital-regulation/">WhatsApp Challenges IT Rules 2021 on Traceability Clause: A Constitutional and Privacy Dispute in India&#8217;s Digital Regulation</a> appeared first on <a href="https://bhattandjoshiassociates.com">Bhatt &amp; Joshi Associates</a>.</p>
]]></description>
										<content:encoded><![CDATA[<h2><b>Introduction</b></h2>
<p><span style="font-weight: 400;">The intersection of digital privacy and national security has emerged as one of the defining legal battlegrounds in contemporary India. In May 2021, WhatsApp LLC filed a petition before the Delhi High Court challenging Rule 4(2) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) IT Rules, 2021[1]. This WhatsApp challenge to IT Rules 2021 on traceability represents a fundamental dispute between the government&#8217;s regulatory ambitions and the right to privacy of millions of Indian users who rely on encrypted messaging services. The case raises critical questions about the extent to which the state can demand technological capabilities that may undermine the very foundations of secure digital communications. WhatsApp&#8217;s petition argues that the traceability requirement violates constitutional protections enshrined under Articles 14, 19(1)(a), 19(1)(g), and 21 of the Indian Constitution, while also exceeding the statutory authority granted under the Information Technology Act, 2000[2].</span></p>
<h2><b>The Legal Framework: Information Technology Act and Intermediary Rules</b></h2>
<p><span style="font-weight: 400;">The Information Technology Act, 2000 serves as the primary legislative framework governing digital intermediaries in India. The Act, through its various provisions, aims to balance the interests of innovation and user protection with legitimate state concerns regarding security and public order. Within this framework, Section 79 of the IT Act holds particular significance as it provides what is commonly known as safe harbour protection to intermediaries. Under Section 79(1), an intermediary shall not be liable for any third party information, data, or communication link made available or hosted by it, subject to certain conditions[3].</span></p>
<p><span style="font-weight: 400;">The safe harbour protection under Section 79(2) applies only when the intermediary&#8217;s function is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored. The intermediary must not initiate the transmission, select the receiver, or modify the information contained in the transmission. Furthermore, the intermediary must observe due diligence while discharging its duties and comply with guidelines prescribed by the Central Government[4].</span></p>
<p><span style="font-weight: 400;">However, this protection is not absolute. Section 79(3) specifies that the exemption shall not apply if the intermediary has conspired, abetted, aided or induced the commission of an unlawful act, or upon receiving actual knowledge or notification from the appropriate government or its agency regarding unlawful content, fails to expeditiously remove or disable access to that material. The Central Government exercises its rule-making authority under Section 87(2) of the IT Act, which empowers it to make rules for carrying out the provisions of the Act.</span></p>
<p><span style="font-weight: 400;">On February 25, 2021, the Ministry of Electronics and Information Technology notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, superseding the earlier 2011 rules. These rules significantly expanded the regulatory framework governing digital intermediaries, introducing new classifications and obligations. The rules distinguish between ordinary intermediaries and Significant Social Media Intermediaries, defined as platforms with registered users in India above the notified threshold of five million. For SSMIs providing services primarily in the nature of messaging, Rule 4(2) imposes an additional due diligence requirement: enabling the identification of the first originator of information on the platform&#8217;s computer resource when required by a judicial order or an order passed under Section 69 of the IT Act[5].</span></p>
<h2><b>Understanding Rule 4(2): The Traceability Mandate</b></h2>
<p><span style="font-weight: 400;">Rule 4(2) of the IT Rules 2021 represents the centerpiece of this legal controversy. The provision specifically requires significant social media intermediaries that provide services primarily in the nature of messaging to enable the identification of the first originator of information on its platform. This obligation arises when either a court of competent jurisdiction or an authority empowered under Section 69 of the IT Act issues an order requiring such identification. The provision includes a crucial territorial limitation, stating that where the first originator of any information is located outside India, the first originator of that information within India shall be deemed to be the first originator.</span></p>
<p><span style="font-weight: 400;">The term &#8220;originator&#8221; is defined in the IT Act as a person who sends, generates, stores or transmits any electronic message. However, this definition creates ambiguity because an originator may not necessarily be the author or creator of the content. Someone who forwards a message, shares a screenshot, or copy-pastes content from another platform could potentially be identified as the originator, even though they did not create the underlying content. This technical limitation raises significant questions about the effectiveness and fairness of the traceability mechanism.</span></p>
<p><span style="font-weight: 400;">The requirement applies only to SSMIs providing messaging services, which would include platforms like WhatsApp, Signal, and Telegram that have more than five million users in India. WhatsApp, with over 530 million users in India, clearly falls within this category and is therefore subject to the traceability mandate. The rules do not specify the exact technological mechanism by which traceability should be implemented, leaving it to the platforms to determine how to comply with the requirement without breaking end-to-end encryption.</span></p>
<h2><b>WhatsApp&#8217;s Constitutional Challenge to IT Rules 2021 on Traceability</b></h2>
<p>WhatsApp&#8217;s petition before the Delhi High Court presents a multifaceted constitutional challenge to Rule 4(2). The petition squarely raises the interplay of traceability, encryption, privacy, and national security, with broader implications for digital rights in India. The company filed its writ petition on May 26, 2021, one day after the deadline for compliance with the new rules. Senior Advocate Mukul Rohatgi represented WhatsApp before a division bench comprising Chief Justice DN Patel and Justice Jyoti Singh. The Delhi High Court issued notice to the Centre on August 27, 2021, directing the government to file a response to WhatsApp&#8217;s contentions[6].</p>
<p><span style="font-weight: 400;">The petition argues that Rule 4(2) violates the fundamental right to privacy as recognized in the landmark Supreme Court judgment of Justice K.S. Puttaswamy (Retd.) v. Union of India. In this unanimous nine-judge bench decision delivered on August 24, 2017, the Supreme Court held that the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution[7]. The judgment established a three-part test for any invasion of privacy: legality, necessity, and proportionality. WhatsApp contends that Rule 4(2) fails all three prongs of this test.</span></p>
<p><span style="font-weight: 400;">On the legality prong, WhatsApp argues that there is no law enacted by Parliament that expressly requires an intermediary to enable identification of the first originator of information on end-to-end encrypted platforms. The company submits that Rule 4(2) is ultra vires Section 79 of the IT Act because the parent statute does not authorize the imposition of such a requirement through subordinate legislation. The petition emphasizes that while Section 79 grants rule-making power regarding due diligence requirements for intermediaries, it does not extend to mandating fundamental changes to the technological architecture of encrypted messaging services.</span></p>
<p><span style="font-weight: 400;">Regarding necessity, WhatsApp argues that Rule 4(2) allows for the issuance of orders to identify the first originator without judicial oversight or prior judicial scrutiny, which means there is no guarantee against arbitrary state action. The petition points out that orders can be issued not only by courts but also by executive authorities under Section 69 of the IT Act, without requiring the government to demonstrate that less intrusive means are unavailable or ineffective. This absence of procedural safeguards violates the necessity requirement established in the Puttaswamy judgment.</span></p>
<p><span style="font-weight: 400;">On proportionality, WhatsApp submits that the traceability requirement would force the platform to break end-to-end encryption for all its users, not just for specific individuals suspected of wrongdoing. The petition explains that to trace even one message, the service would have to trace every message, as there is no way to predict in advance which user will be the subject of an order seeking first originator information. This wholesale surveillance architecture is grossly disproportionate to any legitimate state interest and creates privacy risks for hundreds of millions of innocent users.</span></p>
<p><span style="font-weight: 400;">The petition also challenges Rule 4(2) under Article 14 of the Constitution, which guarantees equality before the law. Relying on the Supreme Court&#8217;s decision in Shayara Bano v. Union of India, WhatsApp argues that laws are manifestly arbitrary in violation of Article 14 when they are obviously unreasonable, capricious, irrational, without adequate determining principle, or excessive and disproportionate. The company contends that Rule 4(2) is manifestly arbitrary because it imposes burdens far exceeding any purported benefits and because Parliament did not intend to grant authority to make such legislation through subordinate rule-making.</span></p>
<p><span style="font-weight: 400;">Furthermore, WhatsApp asserts that Rule 4(2) violates the fundamental right to freedom of speech and expression guaranteed under Article 19(1)(a) of the Constitution. The petition explains that once citizens become aware that messaging platforms have built the ability to identify first originators, they will not feel safe to speak freely for fear that their lawful private communications will be traced and used against them. This chilling effect on free speech is antithetical to the very purpose of end-to-end encryption, which is designed to protect the confidentiality and security of private communications.</span></p>
<h2><b>The Government&#8217;s Defense of Traceability</b></h2>
<p><span style="font-weight: 400;">The Union of India, through the Ministry of Electronics and Information Technology, has filed detailed responses defending the constitutionality and necessity of Rule 4(2). The government&#8217;s position rests on several key arguments that attempt to balance individual privacy rights with collective security interests.</span></p>
<p><span style="font-weight: 400;">The Centre argues that Section 87 of the Information Technology Act granted it the power to formulate Rule 4(2), which mandates significant social media intermediaries to enable identification of the first originator in legitimate state interest. The government emphasizes that this requirement is essential for curbing the menace of fake news and offences concerning national security, public order, and crimes against women and children. The Ministry has stated that the right to privacy is not absolute and must be balanced against the Article 21 rights of vulnerable citizens within cyberspace who are or could be victims of cyber-crime.</span></p>
<p><span style="font-weight: 400;">In its affidavit before the Delhi High Court, the government has clarified that it respects the right to privacy and has no intention to violate it when WhatsApp is required to disclose the origin of a particular message. The Centre maintains that such requirements arise only in cases involving very serious offences related to sovereignty and integrity of India, security of the state, friendly relations with foreign states, public order, or incitement to cognizable offences. The government contends that the traceability provision is reasonable and expects platforms to use mechanisms that guard encryption while protecting user privacy.</span></p>
<p><span style="font-weight: 400;">The government has also placed the burden on intermediaries to develop technological solutions that comply with Indian law. The Centre&#8217;s submission states that even if existing technology does not allow identification of the first originator without breaking encryption, it is the legal obligation of platforms like WhatsApp to find solutions that can enable such identification. The Ministry argues that platforms cannot claim immunity from legal obligations simply because compliance may require modifications to their current technological architecture.</span></p>
<p><span style="font-weight: 400;">Additionally, the government has pointed to WhatsApp&#8217;s own data collection practices, arguing that the platform already collects users&#8217; personal information and shares it with Facebook and other third parties for commercial purposes. This, according to the Centre, undermines WhatsApp&#8217;s claims about protecting user privacy. The government maintains that if WhatsApp can collect and process user data for business purposes, it should be able to develop mechanisms for identifying first originators when required by law enforcement for investigating serious crimes.</span></p>
<h2><b>The Privacy Jurisprudence: Puttaswamy and Its Application</b></h2>
<p><span style="font-weight: 400;">The Puttaswamy judgment forms the doctrinal foundation for privacy protection in India and serves as the primary precedent in WhatsApp&#8217;s challenge to Rule 4(2) of the IT Rules, 2021, the traceability clause. In Justice K.S. Puttaswamy (Retd.) v. Union of India, decided on August 24, 2017, a nine-judge constitution bench of the Supreme Court unanimously held that the right to privacy is a fundamental right intrinsic to life and personal liberty under Article 21 and is a part of the freedoms guaranteed by Part III of the Constitution. The bench comprised Chief Justice J.S. Khehar and Justices J. Chelameswar, S.A. Bobde, R.K. Agrawal, R.F. Nariman, A.M. Sapre, D.Y. Chandrachud, S.K. Kaul, and S. Abdul Nazeer.</span></p>
<p><span style="font-weight: 400;">Justice Chandrachud, writing for himself and three other judges, articulated that privacy is a concomitant of an individual&#8217;s right to exercise control over their own personality. The judgment recognized that privacy safeguards individual autonomy and recognizes the ability of individuals to control vital aspects of their lives. Privacy protects personal intimacies including marriage, procreation, family, and sexual orientation, which are at the core of privacy and dignity. The Court emphasized that privacy attaches to the person and is not lost merely because an individual is in a public place.</span></p>
<p><span style="font-weight: 400;">The Puttaswamy judgment established that privacy is not an absolute right and can be restricted by the state, but any such restriction must satisfy a three-part test. First, there must be legality, which requires that any invasion of privacy must be through a validly enacted law. Second, there must be necessity, meaning that the restriction must serve a legitimate state aim and there must be guarantees against arbitrary state action. Third, the restriction must be proportionate, requiring that the state achieve its legitimate aims through the least restrictive alternative available.</span></p>
<p><span style="font-weight: 400;">The judgment also recognized informational privacy as a distinct facet of the right to privacy. Justice Chandrachud observed that dangers to privacy in the age of information can originate not only from the state but also from non-state actors. The Court commended to the Union Government the need to examine and put in place a robust regime for data protection, cautioning that such a regime requires careful and sensitive balance between individual interests and legitimate concerns of the state.</span></p>
<p><span style="font-weight: 400;">The Puttaswamy decision explicitly overruled earlier Supreme Court judgments in M.P. Sharma v. Satish Chandra and the majority opinion in Kharak Singh v. State of Uttar Pradesh to the extent that they held privacy was not a fundamental right under the Constitution. The Court held that life and personal liberty are inalienable to human existence and constitute rights under natural law. No civilized state can contemplate an encroachment upon life and personal liberty except through the authority of law that meets constitutional requirements.</span></p>
<p><span style="font-weight: 400;">In applying this jurisprudence to Rule 4(2), WhatsApp argues that the traceability provision fails the Puttaswamy test on all three grounds. The petition contends that there is no valid parliamentary law authorizing such invasive surveillance, that procedural safeguards against arbitrary state action are absent, and that the requirement to break encryption for all users is grossly disproportionate to any legitimate governmental objective.</span></p>
<h2><b>Technical Implications: End-to-End Encryption and Traceability</b></h2>
<p><span style="font-weight: 400;">The technical dimensions of this legal dispute are crucial to understanding why WhatsApp and other encrypted messaging platforms oppose the traceability requirement so vehemently. End-to-end encryption is a security measure that prevents third parties, including the messaging platform itself, from accessing the content of communications between users. When a message is sent using end-to-end encryption, it is encrypted on the sender&#8217;s device, transmitted in encrypted form, and only decrypted on the recipient&#8217;s device. The encryption keys are stored only on user devices, not on the platform&#8217;s servers.</span></p>
<p><span style="font-weight: 400;">WhatsApp implemented end-to-end encryption using the Signal Protocol in 2016, meaning that the company itself cannot read the messages exchanged between users. This technical architecture is fundamental to the platform&#8217;s privacy promise to its users. The company has consistently maintained that requiring traceability would necessitate fundamental changes to this architecture that would undermine the security and privacy protections offered by end-to-end encryption.</span></p>
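<p><span style="font-weight: 400;">The key-agreement idea described above can be sketched in a few lines. This is a toy illustration only, not the Signal Protocol WhatsApp actually uses: it substitutes classic Diffie-Hellman over a small demonstration prime and a hash-based XOR stream for a real AEAD cipher. The point it demonstrates is structural, namely that the private keys and the derived shared key exist only on the two endpoint devices, so a server relaying the public keys and ciphertext holds nothing it can decrypt.</span></p>

```python
# Toy end-to-end encryption sketch (NOT the Signal Protocol): a
# Diffie-Hellman key agreement in which the shared key is derived
# independently on each device and never transits the server.
import hashlib
import secrets

P = 2**127 - 1  # small Mersenne prime for demonstration, not a production group
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # private key never leaves the device
    return priv, pow(G, priv, P)          # public key may be relayed by the server

def shared_key(my_priv, their_pub):
    secret = pow(their_pub, my_priv, P)   # identical value on both endpoints
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    # Toy stand-in for a real authenticated cipher: XOR with a hash-derived stream.
    stream = hashlib.sha256(key + b"stream").digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The server sees only alice_pub, bob_pub, and this ciphertext.
ciphertext = xor_cipher(shared_key(alice_priv, bob_pub), b"meet at noon")
# Only Bob, holding bob_priv, can recover the plaintext.
plaintext = xor_cipher(shared_key(bob_priv, alice_pub), ciphertext)
```

<p><span style="font-weight: 400;">Because decryption requires a private key that exists only on a user&#8217;s device, any traceability mandate directed at the platform operates on what the server can see, which is the motivation for the metadata-based approaches discussed below.</span></p>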
<p><span style="font-weight: 400;">Technology experts and civil society organizations have supported WhatsApp&#8217;s technical claims. A parliamentary standing committee report concluded that technology experts were unanimous in their opinion that it is technically impossible to introduce traceability on encrypted platforms without breaking the encryption technology itself. The report noted that implementing originator traceability may weaken end-to-end encryption and create vulnerabilities that could be exploited by malicious actors.</span></p>
<p><span style="font-weight: 400;">To comply with Rule 4(2) while maintaining end-to-end encryption, messaging platforms would need to implement what is known as message tracing or message tracking. This would require storing metadata about who sent which message to whom and when, creating a database that maps the flow of messages across the platform. However, this approach has several significant problems.</span></p>
<p><span style="font-weight: 400;">First, storing such metadata at scale would be technically challenging and expensive, particularly for a platform like WhatsApp that processes billions of messages daily. Second, this metadata database would itself become a massive privacy risk, as it would reveal communication patterns, social networks, and associations among users. Third, the metadata could be used to infer the content of communications even without breaking encryption, as patterns of communication can be highly revealing.</span></p>
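<p><span style="font-weight: 400;">The second risk noted above, that metadata alone is revealing, can be made concrete with a short sketch. The log entries and user names here are entirely hypothetical; the sketch simply shows that a traceability database of (sender, recipient, timestamp) tuples exposes associations and communication patterns even though no message body is ever stored or decrypted.</span></p>

```python
# Toy sketch: content-free metadata still exposes who talks to whom
# and how often -- the privacy risk of a traceability database.
from collections import Counter

# Hypothetical relay log; no message content appears anywhere.
log = [
    ("user_a", "journalist_x", "2021-05-26T09:00"),
    ("user_b", "journalist_x", "2021-05-26T09:05"),
    ("user_a", "journalist_x", "2021-05-26T21:40"),
    ("user_c", "user_b",       "2021-05-27T08:12"),
]

# Count each (sender, recipient) edge in the communication graph.
contacts = Counter((sender, recipient) for sender, recipient, _ in log)

# The most frequent edge reveals an association without reading a single message.
top_pair, count = contacts.most_common(1)[0]
```

<p><span style="font-weight: 400;">Even this four-row example identifies a repeated contact between two specific users; at the scale of billions of messages a day, such a database would map social networks across the entire user base.</span></p>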
<p><span style="font-weight: 400;">Moreover, traceability based on the first forwarder rather than the original creator of content has limited effectiveness. Users commonly copy content from websites or other platforms and paste it into chats, take screenshots of messages, or retype content they have seen elsewhere. In such cases, the person identified as the first originator on WhatsApp would not actually be the creator or author of the content, rendering the traceability mechanism ineffective for its stated purpose of identifying the source of misinformation or harmful content.</span></p>
<h2><b>Comparative Perspectives: Global Approaches to Encrypted Communications</b></h2>
<p><span style="font-weight: 400;">India is not alone in grappling with the tension between encrypted communications and law enforcement access. Governments worldwide have sought various approaches to address this challenge, often referred to as the encryption debate or the going dark problem.</span></p>
<p><span style="font-weight: 400;">In the United States, law enforcement agencies have long advocated for backdoors or exceptional access mechanisms that would allow them to decrypt communications when authorized by court order. However, technology companies and privacy advocates have consistently argued that such mechanisms would weaken security for all users and could be exploited by adversaries. The debate has resulted in a stalemate, with no federal legislation requiring backdoors in encrypted systems.</span></p>
<p><span style="font-weight: 400;">The European Union has taken a different approach through its General Data Protection Regulation and the ePrivacy Directive, which provide strong protections for communications privacy. However, some EU member states have proposed or enacted national legislation requiring platforms to retain certain metadata or provide access to encrypted communications under specific circumstances. These national measures have faced legal challenges under EU law for potentially conflicting with fundamental rights protections.</span></p>
<p><span style="font-weight: 400;">Australia passed the Telecommunications and Other Legislation Amendment (Assistance and Access) Act in 2018, which requires technology companies to provide technical assistance to law enforcement agencies, including potentially weakening encryption. This legislation sparked significant controversy and concern from technology companies and civil society organizations about its impact on security and privacy.</span></p>
<p><span style="font-weight: 400;">The United Kingdom has considered similar measures through the Investigatory Powers Act 2016, which grants broad surveillance powers to government agencies. However, courts have struck down portions of this legislation for violating privacy rights under the European Convention on Human Rights. In December 2020, the Court of Justice of the European Union ruled that UK surveillance practices violated EU law, specifically regarding bulk data retention requirements.</span></p>
<p><span style="font-weight: 400;">Brazil&#8217;s Marco Civil da Internet provides strong protections for internet users&#8217; privacy and freedom of expression, while also establishing procedures for law enforcement access to user data with judicial authorization. The Brazilian approach attempts to balance privacy and security through clear procedural safeguards and judicial oversight, which contrasts with India&#8217;s Rule 4(2) that allows executive authorities to issue traceability orders without prior judicial review.</span></p>
<p><span style="font-weight: 400;">These international examples demonstrate that while many countries struggle with similar tensions between privacy and security, most democratic nations that have attempted to mandate weakening of encryption or require traceability have faced significant legal, technical, and political challenges. The lack of a clear international consensus on this issue underscores the complexity of the problem that India is attempting to solve through Rule 4(2).</span></p>
<h2><b>The Status of Proceedings and Future Implications</b></h2>
<p><span style="font-weight: 400;">The legal challenge to Rule 4(2) remains pending before the Delhi High Court. After issuing notice to the Centre in August 2021, the court has heard arguments from both sides but has not yet rendered a final judgment on the merits of WhatsApp&#8217;s petition. In April 2024, during one of the hearings, WhatsApp&#8217;s counsel made the striking statement that the platform would exit India if forced to break encryption, underscoring the fundamental nature of the dispute.</span></p>
<p><span style="font-weight: 400;">The Supreme Court of India, in March 2024, transferred various petitions challenging different aspects of the IT Rules 2021 from multiple High Courts to the Delhi High Court for consolidated hearing. This transfer indicates the national importance of the issues at stake and suggests that a definitive resolution may eventually require Supreme Court intervention.</span></p>
<p><span style="font-weight: 400;">Meanwhile, the government has shown no indication of withdrawing or modifying Rule 4(2). The Ministry of Electronics and Information Technology has consistently defended the provision as necessary for public safety and national security. In subsequent amendments to the IT Rules in 2022, the government actually expanded intermediary obligations in other areas, suggesting a continued commitment to stringent regulation of digital platforms.</span></p>
<p><span style="font-weight: 400;">The outcome of this case will have profound implications for digital rights in India and could set precedents affecting hundreds of millions of users of encrypted messaging services. If the court upholds Rule 4(2), WhatsApp and other encrypted messaging platforms will face a difficult choice: either comply with the traceability requirement by fundamentally redesigning their encryption systems, which would undermine their global security architecture, or refuse to comply and potentially face loss of safe harbour protection or even be forced to exit the Indian market.</span></p>
<p><span style="font-weight: 400;">Conversely, if the court strikes down Rule 4(2) as unconstitutional, it would establish important limits on the government&#8217;s ability to mandate surveillance capabilities through subordinate legislation. Such a ruling would affirm the primacy of the Puttaswamy privacy framework and clarify that fundamental alterations to encrypted communications systems cannot be imposed without clear parliamentary authorization and robust procedural safeguards.</span></p>
<p><span style="font-weight: 400;">The case also raises broader questions about the regulation of digital platforms in India and the appropriate balance between innovation, privacy, and security. As India develops its digital economy and seeks to establish itself as a technology hub, the legal framework governing digital platforms will significantly influence whether India is perceived as a rights-respecting jurisdiction that protects user privacy or as one where surveillance concerns may deter users and businesses.</span></p>
<h2><b>Conclusion</b></h2>
<p><span style="font-weight: 400;">WhatsApp&#8217;s constitutional challenge to the traceability provision in the IT Rules 2021 represents a watershed moment in Indian digital rights jurisprudence. At its core, this case requires courts to determine whether the government can mandate that private companies build surveillance capabilities into encrypted communications systems, and if so, under what conditions and with what safeguards. The resolution of this case will shape the future of privacy, free speech, and secure communications for hundreds of millions of Indians who rely on messaging platforms for personal, professional, and political expression.</span></p>
<p><span style="font-weight: 400;">The legal and technical complexities involved demonstrate that there are no simple answers to the challenges posed by encrypted communications in the digital age. Both the government&#8217;s security concerns and users&#8217; privacy interests are legitimate and important. However, the Puttaswamy framework provides clear guidance that any invasion of privacy must be necessary, proportionate, and backed by adequate procedural safeguards. As the Delhi High Court weighs these competing interests, its eventual decision will determine whether India&#8217;s approach to digital regulation respects the constitutional commitment to privacy while addressing legitimate security needs.</span></p>
<h2><b>References</b></h2>
<p><span style="font-weight: 400;">[1] The Print. (2021, May 26). WhatsApp challenges new IT rules in Delhi HC, terms it &#8216;unconstitutional&#8217;. </span><a href="https://theprint.in/india/whatsapp-challenges-new-it-rules-in-delhi-hc-terms-it-unconstitutional/666023/"><span style="font-weight: 400;">https://theprint.in/india/whatsapp-challenges-new-it-rules-in-delhi-hc-terms-it-unconstitutional/666023/</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[2] LiveLaw. (2021, June 10). Traceability Rule Will Break End-To-End Encryption; Can Put Privacy Of Journalists, Activists, Politicians At Risk: WhatsApp Tells Delhi High Court. </span><a href="https://www.livelaw.in/news-updates/whatsapp-delhi-high-court-traceability-end-to-end-encryption-privacy-risk-174743"><span style="font-weight: 400;">https://www.livelaw.in/news-updates/whatsapp-delhi-high-court-traceability-end-to-end-encryption-privacy-risk-174743</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[3] Indian Kanoon. Section 79 in The Information Technology Act, 2000. </span><a href="https://indiankanoon.org/doc/844026/"><span style="font-weight: 400;">https://indiankanoon.org/doc/844026/</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[4] The LawGist. (2024, March 8). Exemption from Liability of Intermediary (Section 79 of Information Technology Act 2000). </span><a href="https://thelawgist.org/exemption-from-liability-of-intermediarysection-79-of-information-technology-act-2000/"><span style="font-weight: 400;">https://thelawgist.org/exemption-from-liability-of-intermediarysection-79-of-information-technology-act-2000/</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[5] PRS India. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. </span><a href="https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021"><span style="font-weight: 400;">https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[6] LiveLaw. (2021, August 27). Delhi High Court Issues Notice To Centre On WhatsApp&#8217;s Plea Challenging Traceability Clause Under IT Rules 2021. </span><a href="https://www.livelaw.in/top-stories/delhi-high-court-notice-centre-whatsapps-plea-challenging-traceability-clause-under-it-rules-2021-180387"><span style="font-weight: 400;">https://www.livelaw.in/top-stories/delhi-high-court-notice-centre-whatsapps-plea-challenging-traceability-clause-under-it-rules-2021-180387</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[7] Supreme Court Observer. Fundamental Right to Privacy &#8211; Justice K.S. Puttaswamy v Union of India. </span><a href="https://www.scobserver.in/cases/puttaswamy-v-union-of-india-fundamental-right-to-privacy-case-background/"><span style="font-weight: 400;">https://www.scobserver.in/cases/puttaswamy-v-union-of-india-fundamental-right-to-privacy-case-background/</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[8] MediaNama. (2021, May 27). Summary: WhatsApp alleges IT Rules are unconstitutional in lawsuit. </span><a href="https://www.medianama.com/2021/05/223-whatsapp-lawsuit-it-rules-indian-government/"><span style="font-weight: 400;">https://www.medianama.com/2021/05/223-whatsapp-lawsuit-it-rules-indian-government/</span></a><span style="font-weight: 400;"> </span></p>
<p><span style="font-weight: 400;">[9] Software Freedom Law Center. (2023, May 17). Legal challenges to the traceability provision – What is happening in India? </span><a href="https://sflc.in/legal-challenges-traceability-provision-what-happening-india/"><span style="font-weight: 400;">https://sflc.in/legal-challenges-traceability-provision-what-happening-india/</span></a><span style="font-weight: 400;"> </span></p>
<p style="text-align: center;"><em>Published and Authorized by <strong>Vishal Davda</strong></em></p>
<p>The post <a href="https://bhattandjoshiassociates.com/whatsapp-challenges-it-rules-2021-traceability-clause-a-constitutional-and-privacy-dispute-in-indias-digital-regulation/">WhatsApp Challenges IT Rules 2021 on Traceability Clause: A Constitutional and Privacy Dispute in India&#8217;s Digital Regulation</a> appeared first on <a href="https://bhattandjoshiassociates.com">Bhatt &amp; Joshi Associates</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
