Prajwala v. Union of India & Others
Bench: Division Bench — Supreme Court
Parties
Petitioner: Prajwala, an NGO working against human trafficking and child sexual abuse. Respondents: Union of India and others, including internet service providers and social media platforms.
Facts of the Case
Prajwala, an NGO working against human trafficking and child sexual abuse, filed a public interest writ before the Supreme Court seeking directions to the Union of India, internet service providers, and social media platforms to proactively identify, remove, and report child sexual abuse material (CSAM) — also called child pornography — circulating online. The petition brought to the Court's attention that videos and images of child sexual abuse were being widely circulated via messaging platforms and online services, and that platforms were not taking proactive steps to detect and remove such content.
Legal Issues Before the Court
1. Are internet intermediaries and platforms obligated to proactively detect and remove child sexual abuse material (CSAM) circulating on their platforms?
2. Can the Supreme Court issue directions requiring platforms to use technology (such as PhotoDNA hashing) to identify and remove CSAM?
3. What obligations do platforms and the government have to protect children from sexual exploitation online?
The Judgment
The Supreme Court issued a series of landmark directions requiring the Union of India and internet platforms to: (1) implement PhotoDNA hashing technology (or an equivalent) to proactively identify CSAM on their platforms; (2) remove all identified CSAM immediately; (3) report CSAM to law enforcement agencies; (4) cooperate with the CyberTipline run by the National Center for Missing & Exploited Children (NCMEC); and (5) establish a 24-hour hotline and reporting mechanism for CSAM. The Court also directed the Government of India to develop a national framework for CSAM detection and removal. The initial directions were issued in 2015, with further orders following in 2017 and later years.
Key Principles Laid Down
PROACTIVE REMOVAL — PLATFORMS' OBLIGATION: Prajwala established that internet intermediaries are not merely passive conduits — they have an affirmative obligation to proactively detect and remove CSAM circulating on their platforms. Passive take-down-on-notice is insufficient for CSAM.
PHOTODNA HASHING — COURT-DIRECTED TECHNOLOGY: The Supreme Court specifically directed platforms to implement hash-matching technology (PhotoDNA) to identify known CSAM content. This is one of the few cases where the Indian Supreme Court has mandated the use of specific technology as a constitutional safeguard for child protection.
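The hash-matching workflow the Court directed can be sketched in outline. PhotoDNA itself is a proprietary perceptual-hashing technology licensed by Microsoft and is not publicly available; the sketch below substitutes an ordinary cryptographic hash (SHA-256) purely to illustrate the match-against-known-hashes step, and the function names and sample data are hypothetical.

```python
import hashlib

def compute_signature(content: bytes) -> str:
    """Return a hex digest standing in for a PhotoDNA-style signature.

    NOTE: SHA-256 is an illustrative stand-in. A real deployment would
    use a perceptual hash (like PhotoDNA), which matches visually similar
    images even after resizing or re-encoding; a cryptographic hash
    matches only byte-identical files.
    """
    return hashlib.sha256(content).hexdigest()

def scan_upload(content: bytes, known_hashes: set) -> bool:
    """Return True if the upload matches a known-bad signature.

    A matching upload would then be blocked, removed, and reported to
    law enforcement, per the Court's directions.
    """
    return compute_signature(content) in known_hashes

# A platform maintains a database of signatures of previously identified
# material and checks every new upload against it.
known_hashes = {compute_signature(b"previously-identified-material")}
assert scan_upload(b"previously-identified-material", known_hashes)
assert not scan_upload(b"an unrelated, benign upload", known_hashes)
```

The key design point the sketch captures is that detection is proactive: every upload is checked against the signature database at ingest, rather than the platform waiting for a user complaint or takedown notice.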
ARTICLE 21 AND CHILDREN — STATE'S OBLIGATION: The right to life of children under Article 21 read with Article 39(f) (the state's duty to protect children) creates a positive obligation on the state to ensure children are protected from online sexual exploitation.
COORDINATION WITH INTERNATIONAL BODIES: The Court directed cooperation with international reporting mechanisms like NCMEC's CyberTipline, integrating India into the global CSAM detection network.
Impact on Indian Law
Prajwala (2015) is the foundational judicial intervention on CSAM in India. It has shaped India's policy on online CSAM, influenced the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and continues to drive compliance obligations for major platforms. The case remains active, with subsequent orders issued as technology and platform policies have evolved. It is the touchstone for any legal discussion of online child safety and platform liability in India.
Frequently Asked Questions
What did the Supreme Court direct in the Prajwala case regarding online child sexual abuse material?
In Prajwala v. Union of India (2015 onwards), the Supreme Court directed internet platforms and the Union of India to: implement PhotoDNA hash-matching technology to proactively identify and remove child sexual abuse material (CSAM); report detected CSAM to law enforcement; coordinate with NCMEC's CyberTipline; and establish 24-hour reporting mechanisms. The Court held that platforms have an affirmative obligation to proactively detect and remove CSAM — not merely act on notice.