Government’s Use of Section 79 of the IT Act and X’s Legal Challenge
The recent legal challenge by Elon Musk-owned X (formerly Twitter) against the Indian government’s interpretation of Section 79(3)(b) of the Information Technology (IT) Act, 2000, has raised critical concerns regarding free speech, content moderation, and intermediary liability. X has filed a petition before the Karnataka High Court contesting the government’s expanded use of Section 79 to issue content-blocking orders, arguing that it bypasses safeguards provided under Section 69A of the IT Act.
X’s Legal Challenge
- X has challenged the government’s interpretation and application of Section 79 before the Karnataka High Court, contending that:
  - Section 79 is only a safe harbour provision and does not grant blocking powers to the government.
  - The Sahyog portal creates an arbitrary censorship regime.
  - Government orders lack sufficient justification and are not in line with constitutional safeguards under Article 19(2).
  - These measures harm X’s business model, which relies on the free exchange of lawful information.
- The Karnataka High Court, while refusing to pass an interim order, has allowed X to approach the court again if coercive action is taken.
Grok AI Controversy and Legal Debate
- X’s AI chatbot, Grok 3, has drawn government scrutiny for its use of Hindi slang and for allegedly generating responses critical of the government.
- Key legal question: Does AI-generated content qualify as third-party content under Section 79’s safe harbour provision?
- Courts may need to determine whether X can be held liable for AI-generated responses.
Section 79 of the IT Act: Safe Harbour Provision
- Section 79 provides a “safe harbour” for intermediaries (such as X), exempting them from liability for content posted by third-party users.
- Section 79(3)(b) states that intermediaries can be held liable if they do not remove unlawful content upon receiving actual knowledge or notification from the appropriate government agency.
- The Supreme Court’s ruling in Shreya Singhal v. Union of India (2015) read down this provision, holding that intermediaries are required to remove content only on receiving a court order or a government notification, and only where the restriction falls within the grounds permitted under Article 19(2).
Section 69A: Government’s Blocking Powers
- Section 69A empowers the Central Government to block public access to any online content in the interest of the sovereignty and integrity of India, defence of India, security of the State, friendly relations with foreign States, or public order, or to prevent incitement to the commission of a cognizable offence.
- It is governed by the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009, which mandate a structured review process before blocking orders are issued.
- The Shreya Singhal ruling upheld Section 69A, noting that it has safeguards against misuse.
Government’s Expanded Use of Section 79
- In October 2023, the Ministry of Electronics and Information Technology (MeitY) issued directions authorising all ministries, state governments, and law enforcement agencies to issue blocking orders under Section 79(3)(b).
- In October 2024, the “Sahyog” portal, developed by the Indian Cyber Crime Coordination Centre (I4C) under the Ministry of Home Affairs, was launched to enable authorised agencies to issue and upload content removal orders.
- X has argued that these measures establish an “unlawful parallel content-blocking mechanism”, bypassing Section 69A safeguards and the Supreme Court’s ruling in Shreya Singhal.
Constitutional and Legal Implications
- The case raises critical issues related to constitutional rights, legal safeguards, and digital governance.
- Concerns over censorship: X’s challenge underscores fears of arbitrary content takedowns without proper legal oversight.
- Need for due process: The Supreme Court’s ruling in Shreya Singhal emphasised the importance of judicial review in content removal decisions.
- Impact on free speech and digital rights: The outcome of this case could set a precedent for content regulation and intermediary liability in India.
Way Forward
- Judicial Clarity: The Supreme Court or High Courts should provide a clear interpretation of Section 79’s scope and ensure it is not misused for arbitrary censorship.
- Legislative Reforms: The IT Act should be amended to define due process mechanisms for content takedown requests.
- Transparency Mechanisms: MeitY and digital platforms should collaborate on transparent and accountable content moderation frameworks.
- Balancing Free Speech and National Security: Any content-blocking framework must balance national security concerns with fundamental rights under Article 19(1)(a).