At Thorn, two of our core mission pillars are building innovative technology and platform safety. Key to those outcomes is our child sexual abuse material (CSAM) and child sexual exploitation (CSE) detection solution, Safer. This versatile product enables tech platforms to find and report CSAM and text-based harms. In 2025, more companies than ever deployed Safer on their platforms. This widespread commitment to child safety is critical to building a safer internet and using technology as a force for good.
Safer’s 2025 Impact
Although Safer’s customer community spans a range of industries, all of them host user-generated content or generative AI features.

Safer empowers their teams to detect, review, and report CSAM and text-based CSE at scale. The scope of this detection matters: content moderators and trust and safety teams can detect CSAM among the millions of files uploaded and flag potential exploitation across millions of messages shared. This efficiency saves time and accelerates their efforts. Just as importantly, Safer enables teams to report CSAM or instances of online enticement to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is critical for child victim identification.

Safer’s customers rely on our predictive artificial intelligence and a comprehensive hash database to help them find CSAM and potential exploitation. With their help, we’re making strides toward reducing online sexual harms against children and creating a safer internet.
Total files processed
In 2025, Safer processed 415.4 billion files submitted by our customers. Today, the Safer community comprises more than 80 platforms, with millions of users sharing an incredible volume of content daily. This provides a strong foundation for the important work of stopping the repeated and viral sharing of CSAM online.

To further extend the impact of our detection solutions, we partnered with Hive to offer Safer to their customers via an integration. Safer’s 2025 impact through the Hive integration includes nearly 4.3 billion files processed for CSAM detection and 11,674 lines of text processed for the detection of potential exploitation.
Total potential CSAM files detected
Safer detected nearly 1.5 million images and videos containing known CSAM in 2025. This means Safer matched the files to hash values of known CSAM verified by trusted partners. A hash is like a digital fingerprint, and using it allows Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
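The hash-matching idea above can be sketched in a few lines. This is a minimal, hypothetical illustration using an exact cryptographic hash (SHA-256); it is not Safer’s implementation, which also relies on perceptual hashing techniques that match visually similar media rather than only byte-identical files.

```python
import hashlib

# Hypothetical set of hashes previously verified by trusted partners.
# In practice this would be a large, curated hash database.
KNOWN_HASHES = {
    # SHA-256 digest of the byte string b"test", used as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string (its 'fingerprint')."""
    return hashlib.sha256(data).hexdigest()


def is_known_match(data: bytes) -> bool:
    """True if the content's hash appears in the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES
```

Because only digests are compared, a platform can check uploads against the database without the database ever containing the harmful files themselves.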
Last year, we significantly strengthened Safer’s detection capabilities with a major expansion of hash data. The import included:

- 1.6 million new SaferHash (image) hashes, bringing our total to 6.3 million
- 50 million new SSVH (video) hashes, bringing our total to 64 million
Maintaining the most current and comprehensive data is critical and ensures our customers have access to robust image and video detection.

In addition to detecting known CSAM, our predictive AI detected more than 3,840,000 files of potential novel CSAM. Safer’s image and video classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review. Identifying and verifying novel CSAM allows it to be added to the hash library, enabling future detection.
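The classifier-to-hash-library loop described above can be sketched as follows. Everything here is hypothetical (the function names, the constant classifier score, and the 0.8 review threshold are illustrative, not Safer’s actual API): a model scores new content, high-scoring items go to human review, and verified items have their hashes added so future copies match instantly.

```python
import hashlib

REVIEW_THRESHOLD = 0.8        # illustrative confidence cutoff for human review
known_hashes: set[str] = set()  # hashes of content verified by reviewers


def classify(data: bytes) -> float:
    """Stand-in for a real ML classifier returning a confidence score."""
    return 0.9  # fixed score for illustration only


def process(data: bytes) -> str:
    """Route an upload: hash match first, then classifier triage."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in known_hashes:
        return "match_known"        # detected via the hash library
    if classify(data) >= REVIEW_THRESHOLD:
        return "flag_for_review"    # potential novel content for moderators
    return "no_action"


def confirm_after_review(data: bytes) -> None:
    """Once reviewers verify content, its hash joins the library."""
    known_hashes.add(hashlib.sha256(data).hexdigest())
```

The key design point is the feedback loop: each verified novel file becomes a known hash, so the expensive classifier-plus-review path only has to catch any given file once.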
Total lines of text processed
Safer launched a text classifier feature in 2024 and, in 2025 alone, processed more than 318,590,000 lines of text. This capability offers an entirely new dimension of detection, helping platforms identify sextortion and other abuse behaviors happening via text or messaging features. In all, more than 1.3 million lines of potential child exploitation were identified, helping content moderators respond to potentially threatening behavior.
Safer’s all-time impact
Safer’s impact continues to grow exponentially, with the community nearly tripling the all-time total of files processed in a single year. Since 2019, Safer has processed 658.6 billion files and 334 million lines of text, resulting in the detection of more than 12.4 million potential CSAM files and nearly 1.4 million instances of potential child exploitation. Every file processed and every potential match made helps create a safer internet for children and content platform users.
Build a Safer internet
Curbing platform misuse and addressing online sexual harms against children requires an “all-hands” approach. Many trust and safety teams are being asked to do more with fewer and fewer resources. Thorn is here to support content moderators with a powerful, multi-layered tool that provides robust detection solutions for technology teams. Together, we can transform how kids are protected from sexual abuse and exploitation in the digital age.



















