Each year, the child safety ecosystem looks to the National Center for Missing & Exploited Children’s CyberTipline Report for a snapshot of the online exploitation landscape. The data (how many reports were made by platforms and the public, how many files of suspected child sexual abuse were shared, and how many depicted toddlers versus teens) provides one of the few indicators we have of the scale and nature of technology-facilitated abuse.
And every year, we ask: What do the numbers mean? Are we making progress, or falling behind?
This year, the answer is yes… to both. We’re making progress. We’re losing ground. And this remains only the tip of the iceberg.
One thing we know for certain is that the scale of abuse remains staggering. In 2024, the CyberTipline received 20.5 million reports, including nearly 63 million files: images, videos, and other materials related to child sexual exploitation.
Each report, each file, each incident reflects a child who has been harmed. So while the numbers may be lower than what we saw in last year’s data, they remain unacceptably high, and they must be addressed through continued vigilance, innovation, and cross-sector collaboration.
The impact of technology and awareness
We’re seeing growing evidence that both technological innovation and public awareness are influencing the pipeline of reporting in ways that improve detection and prevention, while new technologies also introduce new challenges for child safety:
- Bundling: One of the notable declines in this year’s reporting may be explained by NCMEC’s introduction of report “bundling,” which consolidates duplicate tips tied to a single viral incident (a simple illustration follows this list).
- Platform changes: Updates like default end-to-end encryption (E2EE) and revised content policies are likely altering what content is detected and how it’s reported. These changes matter: they reflect evolving approaches to privacy, safety, and trust & safety design.
- Policy momentum: The REPORT Act, enacted in 2024, now mandates that platforms report instances of online enticement and child sex trafficking. That policy shift likely contributed to the spike in online enticement reports, showing that child safety legislation paired with platform compliance can improve visibility into specific kinds of harm.
- Public detection and response: As we’ve seen with sextortion, public recognition of emerging threats can play a pivotal role in surfacing harm that platforms miss. This year’s surge in public reports tied to violent online groups highlights both a growing willingness to report and ongoing gaps in detection and disruption by platforms.
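To make the bundling mechanic concrete, here is a minimal Python sketch under one assumption of ours (not NCMEC’s published design): that duplicate tips about the same viral content can be grouped by a shared content identifier. The Tip type and its field names are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Tip:
    tip_id: str
    content_hash: str  # hypothetical field identifying the shared file

def bundle_tips(tips: list[Tip]) -> dict[str, list[Tip]]:
    """Consolidate duplicate tips about the same content into incidents."""
    incidents: dict[str, list[Tip]] = defaultdict(list)
    for tip in tips:
        incidents[tip.content_hash].append(tip)
    return incidents

# Four tips about two distinct files collapse into two incident buckets.
tips = [Tip("t1", "hash-a"), Tip("t2", "hash-a"),
        Tip("t3", "hash-a"), Tip("t4", "hash-b")]
print(len(tips), "tips ->", len(bundle_tips(tips)), "incidents")  # 4 tips -> 2 incidents
```

Grouping this way shows why headline report counts can fall sharply after bundling even when the underlying count of distinct incidents declines far less, which is the adjustment reflected in the incident figures below.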
Key findings from the 2024 NCMEC CyberTipline Report
A closer look at this year’s data reveals several important developments and notable shifts in the child safety landscape:
- 20.5 million reports of suspected child sexual exploitation were submitted to NCMEC in 2024, a 43% decrease from the 36.2 million reports in 2023. However, when adjusted for incidents (to account for bundled reports), the number is 29.2 million distinct incidents, still reflecting a staggering scale of harm. (The percentage arithmetic is sketched after this list.)
- 62.9 million files were included in 2024 reports: 33.1 million videos, 28 million images, and nearly 2 million other file types. These files are evidence of abuse, and every one is tied to suspected abuse or exploitation of a child.
- Online enticement (crimes involving an adult communicating with a child for sexual purposes) reports rose 192%, reaching more than 546,000 tips. This dramatic increase is likely due in part to the new REPORT Act, which requires companies to report online enticement and child sex trafficking for the first time.
- Reports involving generative AI surged by 1,325%, climbing from 4,700 in 2023 to 67,000 in 2024. While this remains a small share of total reports, it’s a clear signal that AI-generated child sexual abuse material (AIG-CSAM) is growing, and it demands proactive safety interventions like Safety by Design, ethical AI development, and robust transparency reporting.
- NCMEC also saw more than 1,300 reports tied to violent online groups, representing a 200% increase from 2023. These groups promote sadistic forms of abuse, including self-harm, sibling exploitation, and animal cruelty. Strikingly, 69% of these reports came from members of the public, such as parents or caregivers, underscoring a high-stakes gap in detection by platforms.
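For readers who want to check the year-over-year figures, the percentages follow directly from the raw counts reported above. A quick Python check (rounding explains the one-point difference from the report’s 1,325% figure):

```python
def pct_change(old: float, new: float) -> float:
    """Percent change from old to new; negative means a decline."""
    return (new - old) / old * 100

# Total reports: 36.2M (2023) -> 20.5M (2024)
print(round(pct_change(36.2e6, 20.5e6)))  # -43

# Generative AI reports: 4,700 (2023) -> 67,000 (2024)
print(round(pct_change(4_700, 67_000)))   # 1326 (the report rounds to 1,325%)
```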
Protecting the children behind the numbers
No number of suspected exploitation reports is acceptable in the world we want for our children. 63 million suspected abuse files are far too many files.
Behind each file and report is a child: someone experiencing abuse, coercion, or exploitation. That’s the reality we cannot lose sight of.
And while changes in reporting methods, technologies, and policies can all shift the numbers year over year, what remains constant is the urgent need for a smarter, more unified response. Lower numbers don’t necessarily mean less abuse. In some cases, they mean less visibility into it.
That’s why Thorn continues to champion a broader, more resilient approach to child safety that includes things like:
- Adapting technologies and platform design to mitigate risks from increased use of E2EE and updated content policies, which may affect what’s detectable, alongside a new generation of technology companies stepping up to proactively address these risks through responsible reporting and intervention.
- Transparency reporting from online platforms, helping the entire child protection ecosystem and the general public understand what platforms are detecting, how they approach child safety, and what they might be missing.
- Safety by Design principles that tech companies can follow and adopt early in technology development, so platforms are built with child safety in mind from the outset.
- Robust detection tools, including AI-powered classifiers that help identify, review, and report abusive content before it spreads (a minimal illustration follows this list).
- Support services for victims and survivors, who often experience revictimization every time their abuse material resurfaces online.
- Well-resourced law enforcement, equipped with the tools and staffing needed to identify more child victims faster.
- Original, youth-centered research to surface emerging threats and ensure we understand how abuse evolves in digital spaces.
- Cross-sector collaboration, because no single actor (platform, policymaker, or nonprofit) can solve this issue alone.
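To illustrate the shape of the detection tooling named above, here is a minimal, hypothetical sketch (not Thorn’s or any platform’s actual system) of a common two-stage pattern: matching against a list of known abuse material, then scoring unknown content with a classifier and routing borderline cases to human review. Every name and threshold here is an assumption for illustration only.

```python
from dataclasses import dataclass

# Hypothetical stand-ins: real systems use perceptual hashes of known
# material and trained ML classifiers, neither of which is shown here.
KNOWN_ABUSE_HASHES = {"hash-of-known-file"}

@dataclass
class Upload:
    upload_id: str
    content_hash: str

def classifier_score(upload: Upload) -> float:
    """Placeholder for an ML model estimating P(abusive content)."""
    return 0.0  # a real classifier would analyze the file itself

def triage(upload: Upload, review_threshold: float = 0.9) -> str:
    # Stage 1: match against known abuse material via hash lists.
    if upload.content_hash in KNOWN_ABUSE_HASHES:
        return "report"        # known material: escalate for reporting
    # Stage 2: flag previously unseen but suspicious content for humans.
    if classifier_score(upload) >= review_threshold:
        return "human_review"
    return "allow"

print(triage(Upload("u1", "hash-of-known-file")))  # report
```

The design point is the division of labor: hash matching catches the re-circulation of known files cheaply, while classifiers extend coverage to new material but hand off uncertain cases to trained reviewers rather than acting alone.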
Closing thoughts
No matter how the numbers change year over year, Thorn’s mission remains steadfast: to transform the way children are protected from sexual abuse and exploitation in the digital age.
Real progress requires that we complement what we learn from these numbers by listening to what children are experiencing today, investing in the systems that will protect them tomorrow, and addressing the threats hiding beneath the surface of the data.
Read the full 2024 CyberTipline Report here, and visit thorn.org to learn more about how we’re building a safer world for children.