AI-Generated Child Sexual Abuse Material (CSAM)

Context:

The Department for Science, Innovation and Technology (UK) and the AI Safety Institute (now AI Security Institute) released the International AI Safety Report 2025, warning of the imminent risk of AI tools being used to create, possess, and disseminate Child Sexual Abuse Material (CSAM).

More on the News

  • The UK government is making its first legislative attempt to address these threats, targeting AI tools capable of generating CSAM.
  • The World Economic Forum (2023) and the Internet Watch Foundation (October 2024) have flagged the increasing prevalence of AI-generated CSAM.

Overview

  • CSAM refers to any material (images, videos, or audio) containing sexually explicit portrayals of children.
  • The digital world has created a global market for CSAM, allowing material to be shared across borders and providing anonymity to perpetrators.
  • End-to-end encryption on social media platforms has also enabled abusers to trade CSAM within closed, often paid, groups with little risk of detection.
  • Each view of CSAM fuels demand for more content, perpetuating the cycle of exploitation and abuse. Research shows that nearly half of those who view CSAM attempt to make contact with children in order to abuse them further.

Monetisation and Digital Crimes

  • Technological developments have facilitated the monetisation of child sexual abuse, such as the ability to pay to watch live-streamed abuse on the dark web.
  • The rise of sextortion, where criminals threaten to expose explicit images or videos of children unless they are paid, has also been fuelled by technology.

Long-Term Impact on Victims

  • Child sexual abuse has lifelong psychological and emotional consequences.
  • Each time an image or video is shared, the victim’s privacy is repeatedly violated.
  • Victims fear that friends, family, or the public may see their abuse material.

Key Legislative Developments in the UK

  • The UK’s upcoming legislation will criminalise: (i) the possession, creation, or distribution of AI tools designed to generate CSAM; and (ii) the possession of ‘paedophile manuals’ that guide individuals in using AI to generate CSAM.
  • This marks a shift from an ‘accused-centric’ and ‘act-centric’ approach to ‘tool-centric’ legislation, focusing on the tools and mediums used to commit crimes rather than just the individuals performing the act.
  • The Protection of Children Act 1978 (UK) and the Coroners and Justice Act 2009 (UK) focus on criminalising abuse images of actual children.
  • The new legislation, however, will also prohibit AI-generated CSAM, closing an existing loophole.

Key Benefits of the New Approach

  • It enables offenders to be apprehended at the preparation stage.
  • It aims to limit the spread of CSAM and thereby reduce the mental-health toll it takes, particularly on child victims.
  • It tackles AI-generated CSAM, which was not adequately addressed by existing laws focused on images of actual children.

The Situation in India

  • According to National Crime Records Bureau (NCRB) data for 2022, cybercrimes against children in India have risen sharply.
  • The National Cyber Crime Reporting Portal (NCRP), under the Cyber Crime Prevention against Women and Children (CCPWC) scheme, had recorded 1.94 lakh (194,000) incidents of child pornography as of April 2024.
  • The NCRB’s memorandum of understanding with the National Center for Missing & Exploited Children (NCMEC), USA, has facilitated the sharing of 69.05 lakh (about 6.9 million) CyberTipline reports related to CSAM.
  • These reports highlight the grave nature of the CSAM threat to children’s rights in India.

Current Legal Framework in India

  • Section 67B of the IT Act 2000: Punishes the publication, transmission, or browsing of child pornography in electronic form.
  • Sections 13, 14, and 15 of the POCSO Act 2012: Prohibit the use of children for pornographic purposes, including for sexual gratification, and penalise the storage of pornographic material involving children.
  • Section 294 of the Bharatiya Nyaya Sanhita 2023: Penalises the sale, distribution, or public exhibition of obscene material.
  • Section 295 of the Bharatiya Nyaya Sanhita 2023: Criminalises selling or distributing obscene material to children.
  • However, India’s current legislative framework lacks specific provisions to address AI-generated CSAM.

Need for Legislative Adaptation in India

  • Update Definitions: Replace the term ‘child pornography’ in the POCSO Act with ‘CSAM’ to make the definition more expansive and relevant to emerging technologies.
  • Define ‘Sexually Explicit’: Clarify the term ‘sexually explicit’ under Section 67B of the IT Act to facilitate the real-time identification and blocking of CSAM.
  • Update Definition of ‘Intermediaries’: Include Virtual Private Networks (VPNs), Virtual Private Servers (VPS), and Cloud Services under the ‘intermediary’ definition in the IT Act to ensure they comply with CSAM-related laws.
  • Statutory Amendments for Emerging Tech: Amend laws to address the risks posed by AI technologies in generating CSAM.
  • International Cooperation: Pursue the adoption of the UN Draft Convention on ‘Countering the Use of Information and Communications Technologies (ICT) for Criminal Purposes’ by the UN General Assembly to strengthen global cooperation in tackling AI-enabled crimes.
  • Update the Digital India Act: The proposed Digital India Act 2023, intended to replace the outdated IT Act 2000, should include provisions inspired by the UK’s upcoming legislation that specifically target AI-generated CSAM.