
The Rise of Online Child Sexual Abuse Material (CSAM): What the Latest Global Data Mean for Parents

Keeping kids safer online

The numbers you need to know

Online child sexual abuse material (CSAM) has surged at an alarming rate worldwide. While many parents picture CSAM as something hidden in dark corners of the internet, the reality is far more mainstream: it is frequently linked to the social apps, messaging platforms and, increasingly, the artificial-intelligence tools that children use every day.

The most recent data from respected global monitoring organisations — including the Internet Watch Foundation (IWF), INHOPE, and the National Center for Missing & Exploited Children (NCMEC) — reveal a troubling escalation in both volume and severity. Although South Africa does not publish consolidated national CSAM statistics, global trends are highly relevant, especially in a country with widespread mobile access, rising screen time among children, and limited parental visibility over online behaviour.

This article summarises the latest findings and translates them into clear warnings and practical guidance for parents.


1. What the 2024 global numbers tell us

1.1 INHOPE: An explosion in CSAM reporting

INHOPE, the global network of internet hotlines, recorded a dramatic surge between 2023 and 2024:

  • 2,497,438 suspected CSAM URLs were submitted in 2024
    +218% increase from 2023
  • 1,634,636 URLs confirmed illegal
    +202% increase from the previous year
  • 37% of all content was “new”, meaning never seen before
  • Victim profile:
    • 93% were pre-adolescent children between 3 and 13
    • Nearly 99% of victims were girls

These numbers highlight not only scale, but a shift towards younger victims and repeat exploitation through new content circulating globally.


1.2 Internet Watch Foundation (IWF): Self-generated & AI-generated CSAM rising sharply

The IWF’s 2024 assessment reveals that the nature of abuse is changing as fast as the technology children use.

  • 424,047 reports assessed in 2024
    +8% from 2023
  • 291,273 confirmed reports of CSAM
    +6% increase
  • 91% of confirmed content was self-generated — often created when minors are coerced, manipulated, or deceived into producing images/videos themselves
  • 97% of victims in recorded cases were girls
  • AI-generated abuse is escalating rapidly:
    • 245 cases in 2024
      → Up from 51 in 2023, an increase of roughly 380%
  • Hosting trends intensified:
    • 62% of all actioned webpages were hosted in EU countries
      → Up from 51% the previous year

This shift toward self-generated and AI-enhanced abuse is particularly concerning because it affects the very environments where children feel safest — messaging apps, video chats, and peer networks.


1.3 NCMEC (USA): AI and trafficking reports skyrocketing

Although the raw number of CyberTipline reports decreased because some platforms now “bundle” related reports into consolidated submissions, the severity of cases rose:

  • 2023: 36.2 million reports
  • 2024: 20.5 million reports (lower due to bundling, not less abuse)
  • Reports involving generative AI increased by 1,325%
  • Reports tied to child sex trafficking increased by 55%

This demonstrates that even though report volumes may appear lower, the complexity and sophistication of abuse are increasing dramatically.


2. What this means for South African parents

South Africa does not publish comprehensive CSAM-specific national statistics for 2023–24. However:

  • The national sexual-offence figures (16+ population) dropped slightly from ~49,000 to ~47,000 incidents — but this does not isolate online-specific or child-specific abuse.
  • Historical international reporting suggests significant under-reporting locally and growing risk via global platforms used by South African children.
  • South African children, like children everywhere, are vulnerable to grooming, coercion, peer-pressure image sharing, sextortion, and AI-generated manipulation.

Parents must therefore assume that global CSAM trends directly affect local children, especially given South Africa’s high mobile internet penetration and the heavy reliance on social media among younger age groups.


3. Why parents must act now

Self-generated content is the new frontline

Children — especially girls aged 8 to 14 — are increasingly manipulated into sending images or videos themselves. This can happen through:

  • Online friendships
  • Fake peer accounts
  • Romantic grooming
  • Coercion (“If you don’t send one, I’ll tell everyone you’re boring”)
  • Extortion (“You must pay or send more, or I’ll share this with your parents”)

AI makes exploitation easier and more persuasive

AI can now:

  • Create realistic fake nudes of children
  • Alter or combine existing photos
  • Generate convincing fake identities to groom children
  • Scale abuse rapidly

Children no longer need to take a compromising photo for criminals to exploit them — a single innocent image can be morphed into explicit content using AI.

Content lasts forever

Once online, CSAM is extremely difficult to remove. Victims often live with recurring trauma, knowing material may resurface at any time.


4. What parents can do today

Practical steps:

  • Start open, non-judgmental conversations early — even from age 7 or 8
  • Explain online privacy, consent, and why “private photos” are never truly private
  • Keep devices in shared spaces
  • Teach children how to respond to grooming attempts
  • Monitor app permissions (camera, messaging, auto-save)
  • Educate them about AI — including deepfakes and image manipulation
  • Use safety tools like parental controls, safe search filters, and reporting systems
  • Encourage children to tell you immediately if anything feels uncomfortable or threatening

Reassure your child:

No matter what has happened, you will not punish them for coming to you, and you will handle it together.


Short Summary of Key 2023–2024 Findings

  • INHOPE:
    • +218% rise in suspected CSAM URLs (≈2.5 million in 2024)
    • +202% rise in confirmed illegal material (≈1.63 million)
    • 93% of victims aged 3–13; nearly 99% girls
  • IWF:
    • +8% increase in reports assessed
    • +6% increase in confirmed CSAM
    • 91% self-generated content
    • 380% increase in AI-generated CSAM
    • EU hosting increased from 51% to 62%
  • NCMEC:
    • Raw report volume decreased due to platform consolidation
    • 1,325% increase in AI-related exploitation
    • 55% increase in sex-trafficking reports
  • South Africa:
    • No national CSAM-specific dataset
    • Sexual offences (not CSAM-specific) fell ~4%
    • Historical data show significant increases in online child exploitation