At Thorn, we’re dedicated to building cutting-edge technology to defend children from sexual abuse. Key to this mission is our child sexual abuse material (CSAM) and CSE detection solution, Safer, which enables tech platforms to find and report CSAM and text-based harms on their platforms. In 2024, more companies than ever deployed Safer on their platforms. This widespread commitment to child safety is essential to building a safer internet and using technology as a force for good.
Safer’s 2024 Impact
Although Safer’s community of customers spans a wide range of industries, they all host content uploaded by their users or text inputs in generative engines and messaging features.
Safer empowers their teams to detect, review, and report CSAM and text-based child sexual exploitation at scale. The scope of this detection is critical. It means their content moderators and trust and safety teams can find CSAM amid the millions of content files uploaded and flag potential exploitation amid millions of messages shared. This efficiency saves time and speeds up their efforts. Just as importantly, Safer enables teams to report CSAM or instances of online enticement to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is critical for child victim identification.
Safer’s customers rely on our predictive artificial intelligence and a comprehensive hash database to help them find CSAM and potential exploitation. With their help, we’re making strides toward reducing online sexual harms against children and creating a safer internet.
Total files processed
In 2024, Safer processed 112.3 billion files input by our customers. Today, the Safer community comprises more than 60 platforms, with millions of users sharing an incredible volume of content every day. This represents a substantial foundation for the important work of stopping the repeated and viral sharing of CSAM online.
Total potential CSAM files detected
Safer detected just under 2,000,000 images and videos of known CSAM in 2024. This means Safer matched the files’ hashes to verified hash values from trusted sources, identifying them as CSAM. A hash is like a digital fingerprint, and using them allows Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
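As a rough illustration of how hash-based matching works in general (this is a minimal sketch, not Safer’s actual implementation), the snippet below compares a file’s fingerprint against a set of previously verified hash values. The hash set, file paths, and function names are all hypothetical, and real systems typically pair perceptual hashes (which survive resizing or re-encoding) with cryptographic ones.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hash values already verified as CSAM by NCMEC or other NGOs.
# In practice this would be loaded from a trusted hash database; SHA-256 is used
# here only to keep the sketch simple.
VERIFIED_HASHES: set[str] = set()


def file_hash(path: Path) -> str:
    """Compute a cryptographic fingerprint of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_known_match(path: Path) -> bool:
    """Return True if the file's hash matches a previously verified entry."""
    return file_hash(path) in VERIFIED_HASHES
```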
In addition to detecting known CSAM, our predictive AI detected more than 2,200,000 files of potential novel CSAM. Safer’s image and video classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review. Identifying and verifying novel CSAM allows it to be added to the hash library, accelerating future detection.
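Conceptually, a classifier assigns each new file a score reflecting how likely it is to be CSAM and routes anything above a review threshold to human moderators. The sketch below shows only that flagging step; the threshold value, data structure, and function names are illustrative assumptions, not Safer’s pipeline.

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.8  # illustrative cutoff; real systems tune this per platform


@dataclass
class ClassifierResult:
    file_id: str
    score: float          # model's predicted probability that the file is CSAM
    needs_review: bool    # True if the file should be escalated to moderators


def triage(file_id: str, score: float) -> ClassifierResult:
    """Flag a file for human review when the classifier score meets the threshold."""
    return ClassifierResult(file_id, score, needs_review=score >= REVIEW_THRESHOLD)
```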
Altogether, Safer detected more than 4,100,000 files of known or potential CSAM.
Total lines of text processed
Safer launched a text classifier feature in 2024 and processed more than 3,000,000 lines of text in just the first year. This capability offers a whole new dimension of detection, helping platforms identify sextortion and other abusive behaviors happening in text or messaging features. In all, nearly 3,200 lines of potential child exploitation were identified, helping content moderators respond to potentially threatening behavior.
Safer’s all-time impact
Last year was a watershed moment for Safer, with the community nearly doubling the all-time total of files processed. Since 2019, Safer has processed 228.8 billion files and three million lines of text, resulting in the detection of almost 6.5 million potential CSAM files and nearly 3,200 instances of potential child exploitation. Every file processed, and every potential match made, helps create a safer internet for children and content platform users.
Build a Safer internet
Curbing platform misuse and addressing online sexual harms against children requires an “all-hands” approach. Too many platforms still suffer from siloed teams, inconsistent practices, and policy gaps that undermine effective content moderation. Thorn is here to change that, and Safer is the answer.