X Challenges Government’s Use of IT Act for Content Moderation, Citing Free Speech Concerns

(Source – Indian Express, Section – Explained, Page – 15)

Topic: GS2 – Polity

Context

  • Elon Musk-owned X (formerly Twitter) has contested the Indian government’s application of Section 79(3)(b) of the Information Technology (IT) Act, 2000, arguing that it bypasses the procedural safeguards outlined in Section 69A, which specifically governs content blocking.

  • The legal battle, currently before the Karnataka High Court, raises significant concerns about free speech, intermediary liability, and content moderation in India.

Analysis of the news:

Shreya Singhal & Section 69A: The Precedent

  • In the landmark Shreya Singhal v. Union of India (2015) case, the Supreme Court struck down Section 66A of the IT Act for being “unconstitutionally vague” and for granting excessive power to restrict free speech.

  • Following this ruling, Section 69A became the primary legal framework for content blocking, allowing the government to block online information on grounds drawn from Article 19(2) of the Constitution, such as threats to sovereignty, public order, or morality.

  • Crucially, blocking orders under Section 69A must include reasons, enabling judicial review.

Government’s Use of Section 79(3)(b)

  • The Supreme Court, in Shreya Singhal, also clarified Section 79, which provides a “safe harbour” for intermediaries like X, shielding them from liability for user-generated content.

  • However, Section 79(3)(b) holds intermediaries liable if they fail to remove unlawful content after receiving actual knowledge of it through a court order or a government notification issued on grounds covered by Article 19(2).

  • Despite this, in October 2023, the Ministry of Electronics and Information Technology (MeitY) issued a directive allowing blocking orders under Section 79(3)(b).

  • In October 2024, MeitY launched the “Sahyog” portal, enabling government authorities to issue and upload such orders.

X’s Legal Challenge

  • X argues that MeitY’s move creates an unlawful blocking regime without the procedural protections of Section 69A.

  • The company claims Section 79 was never intended to be a blocking mechanism, but merely a liability shield for intermediaries.

  • X asserts that any content takedown must follow either the Section 69A process or a court order.

  • On March 17, X sought an interim order from the Karnataka High Court protecting it from coercive action; Justice M. Nagaprasanna declined to grant the order but left the door open for future legal intervention.

The Grok Controversy & AI Liability

  • X’s legal battle coincides with controversy surrounding its AI chatbot, Grok 3, which has drawn government attention for responses featuring Hindi slang and for its critical replies to user prompts.

  • While no official notice has been issued, this raises a new legal question: Is X liable for AI-generated content under safe harbour laws?

  • Courts may now have to determine whether AI-generated responses fall under third-party content, a crucial issue in the evolving landscape of digital regulation.

Conclusion

  • X’s challenge raises critical questions about free speech, government overreach, and intermediary liability in India’s digital space.

  • The case’s outcome will shape the future of content moderation, AI regulation, and platform accountability, setting a precedent for balancing state control and digital freedoms.

Practice Question: The regulation of online content must balance national security and free speech. In light of X’s legal challenge against the Indian government’s use of Section 79(3)(b) of the IT Act, discuss the implications of intermediary liability and the evolving role of AI in content moderation. (250 Words /15 marks)
