For young people, the internet is a place for self-discovery, socializing, and building meaningful connections online. But these same spaces can also be used by perpetrators who target children for grooming and sextortion, and who use technology to share child sexual abuse material (CSAM).
Because of this, technology companies play a key role in protecting children from abuse and exploitation in the digital age.
At Thorn, we empower tech companies in that pursuit. Our purpose-built solutions equip tech platforms to combat the spread of child sexual abuse material and help reduce the cycle of trauma that its circulation causes. As experts in child safety technology, we also help companies understand their specific role and capabilities in participating in the child safety ecosystem.
Fighting CSAM is a critical step toward creating safer online environments and supporting survivors of abuse. Our multifaceted approach empowers Thorn and our platform partners to combat sexual abuse and exploitation on the open web, and to protect children on a global scale.
Stopping the spread of child sexual abuse material
It can be troubling to learn that the very platforms we use to connect with our friends and family are also used by perpetrators to create and share child sexual abuse material. Online, they are able to form tight-knit communities where they facilitate the creation and trade of child sexual abuse material.
What’s CSAM?
But what exactly is child sexual abuse material, or CSAM? Child sexual abuse material is legally referred to as child pornography in the U.S. and refers to any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, live streaming, and digital or computer-generated images, including AI-generated content, that are indistinguishable from an actual minor. The emergence of generative AI broadens the scope to include AI adaptations of original content, the sexualization of benign images of children, and fully AI-generated CSAM.
How big a crisis is child sexual abuse material online? In 2004, 450,000 files of suspected CSAM were reported in the U.S. By 2024, that number had skyrocketed to more than 61 million files. That's more than 100 files reported every minute. The internet simply makes it too easy to produce and disseminate this horrific content.
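As a quick back-of-the-envelope check, the per-minute figure follows directly from the yearly total quoted above (61 million files spread across the minutes in a year):

```python
# Rough sanity check: 61 million reported files spread across one year.
files_per_year = 61_000_000
minutes_per_year = 365 * 24 * 60                 # 525,600 minutes in a year
print(round(files_per_year / minutes_per_year))  # ~116 files every minute
```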
How does revictimization happen?
Even after a child victim has been removed from active, hands-on abuse, photos and videos of their abuse can circulate online and continue the cycle of trauma.
Survivors of CSAM may have their abuse shared tens of thousands of times a year. Each time the content is shared, the victim is abused again.
How does Thorn's technology stop the cycle of abuse?
Though millions of CSAM files spread every day, they are mixed in with even greater amounts of harmless images and videos. This influx of content makes identifying CSAM files extremely challenging, resource-intensive, and nearly impossible for human review alone. Not to mention the debilitating emotional toll that reviewing this material takes on the people working to keep online communities safe.
At Thorn, we developed Safer, our purpose-built solution, to empower tech platforms to detect, review, and report CSAM at scale.
Safer identifies known and previously reported CSAM through its hashing and matching capabilities. It also detects unknown, suspected CSAM through its predictive AI image and video classifiers. Finding this unreported abuse material is critical, as it helps alert investigators to active abuse situations so victims can be removed from harm. Thorn has also launched technology that identifies potentially harmful conversations related to child sexual abuse to stop harm before it starts. Safer arms Trust and Safety teams with a proactive solution for finding CSAM and reporting it to authorities.
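To make "hashing and matching" concrete: at its simplest, a platform computes a fingerprint (hash) of each uploaded file and checks it against a list of fingerprints of already-verified material. The sketch below is a generic illustration of that idea using ordinary cryptographic hashes; it is not Thorn's Safer API, the function names and the known-hash set are hypothetical, and production systems typically also rely on perceptual hashes that survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of already-verified files (placeholder values only);
# real deployments load these from a vetted hash list, not hard-coded constants.
KNOWN_HASHES = {
    "0d9f2a...",  # truncated placeholder digest
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_hash(path: Path, known_hashes: set[str] = KNOWN_HASHES) -> bool:
    """Return True if this exact file has been seen and verified before,
    so it can be queued for human review and reporting."""
    return sha256_of_file(path) in known_hashes
```

A cryptographic hash only catches exact copies of a file, which is why the paragraph above pairs hash matching with predictive classifiers for material that has never been reported before.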
By fighting CSAM on their platforms, technology companies can protect children and also break that cycle of revictimization.
The effort is working
To date, Thorn has helped the tech industry detect and flag for removal almost 6.5 million child sexual abuse files from the internet. In 2024 alone, Safer detected more than 4 million files of suspected CSAM across tech platforms, making a tangible impact on the lives of children and survivors.
The companies we partner with range from small platforms to some of the world's largest household digital names.
In 2019, global image and video hosting site Flickr became a Safer customer and relies on our comprehensive detection solutions to find CSAM on its platform. In 2021, Flickr deployed Safer's CSAM Image Classifier. Using the classifier, their Trust and Safety team could detect previously unknown CSAM images they likely wouldn't have discovered otherwise.
One classifier hit led to the discovery of 2,000 previously unverified images of CSAM and an investigation by law enforcement, in which a child was rescued from harm.
In 2022, Flickr reported 34,176 files of suspected CSAM to the National Center for Missing & Exploited Children. This is data that can be acted on to identify and remove child victims from harm.
VSCO, an app for photo and video creation communities, deployed Safer in 2020. In the face of increasing CSAM online, VSCO's core commitment to safety drove them to prioritize detection on their platform.
VSCO uses Safer to proactively target CSAM at scale. The tool speeds their efforts and increases the amount of content they can review, allowing them to cast a wider net. In three years, they have reported 35,000 files of suspected CSAM to authorities.
A multifaceted approach to online child safety
Tackling child sexual abuse online requires a comprehensive approach involving technology, industry education, policy, and community engagement. Thorn works at each of these levels to create systemic change and strengthen the child safety ecosystem.
Safety by Design
In the tech industry, everyone from AI developers to data hosting platforms, and from social media apps to search engines, intersects with child safety in some way. Thorn helps them understand and identify the threats that occur on their platforms and how to mitigate them.
The emergence of generative AI has only accelerated the spread of technology-facilitated child sexual abuse. Thorn urges companies to take a Safety-by-Design approach, which requires safety measures to be built into the core design of technologies.
At Thorn, we build technology to defend children from sexual abuse. But we are just one piece of the puzzle, alongside the tech industry, policymakers, and the public.
When we work together, we can combat the spread of CSAM. In doing so, we will stop revictimization and start to build a world where every child is free to simply be a kid.
Join us
Become a force for good. Learn more about Thorn's solutions and how you can contribute to transforming the way we protect children from sexual abuse and exploitation in the digital age.
See Our Solutions