As the end of the year approaches, we're reminded of the power of our donors that makes progress possible. This year, our community proved that progress is unstoppable when we act together. Our work in 2025 addressed some of the toughest challenges facing children in the digital age:

- The misuse of AI to generate sexual abuse material
- The threats of sextortion and grooming
- The urgent need to build safety into technology from the start

We confronted these challenges, determined to spread awareness and create impact. As we look back, we prepare for what's ahead. Every action we take continues to shape a safer digital world for children.
Protecting children from emerging AI threats

Artificial intelligence is reshaping our world, and new threats to children are emerging. This year, Thorn strengthened a global effort to ensure the generative AI industry develops with child safety at its core.

Our research revealed a growing harm: AI-generated child sexual abuse material (AIG-CSAM) and deepfakes that cause real trauma for real children. We helped center the experiences of young people and victims in the conversations our society is having around these new technologies. These generative AI platforms aren't harmless experiments. They're tools that can re-victimize survivors and endanger children at scale.

But we didn't stop at raising awareness. Thorn convened leading AI companies, including Amazon, Anthropic, Google, Meta, Microsoft, and OpenAI, to adopt Safety by Design for Generative AI principles. This development framework embeds child protection into the foundation of AI models. What began as an urgent conversation became coordinated action. We're united with industry leaders to prevent AI from harming children.

Our leadership also extended to policy. Thorn was one of the civil society organizations contributing to the EU AI Act Code of Practice, the EU's first major regulatory guidance for AI. Our recommendations helped secure key wins. The law requires companies to document their process for removing CSAM from training data and to treat it as a systemic risk to mitigate, not ignore.

These efforts resonated around the world. This year, Thorn experts briefed policymakers in the U.S., U.K., and EU. The White House recognized our work as part of national efforts to combat image-based sexual abuse. When we act together, we can shape even the most complex technologies to protect children.
Groundbreaking research that shaped the conversation

The State of Sextortion Report exposed the devastating impact this crime is having on victims. The data was staggering, indicating an urgent need to address this growing form of exploitation. Our findings showed these aren't isolated incidents but a systemic threat demanding immediate action.

Youth Perspectives 2024 elevated the lived experiences of young people as they navigate their online world. Their stories revealed how perpetrators target younger boys online. It also showed how certain risky exchanges are being normalized among peers. These insights are already guiding new prevention messaging and tools that speak to the realities today's youth face.

The Commodified Sexual Interactions Involving Minors report examined how technology and culture are reshaping risk for youth online. From the commodification of sexual interactions to the expanding gray areas of consent online, our research gave policymakers and tech leaders the insights they need to intervene early, design safer platforms, and build protections that meet children where they are.

Together, these reports advance our understanding of online experiences. They've sparked conversations among families, business leaders, and lawmakers. Every report proves that research can change the trajectory of child safety.
Scaling protection through innovation

Technology accelerated many of these threats, but it is also key to solving them. This year, Thorn continued to build tools that turn innovation into protection. We're helping investigators and platforms move faster to protect children everywhere.

Thorn Detect is a significant milestone in that pursuit. This digital forensics solution helps investigators identify child victims faster. The tool reduces review time, exposure to harmful content, and burnout. What once took weeks can now happen in hours. We're giving investigators precious time to focus on finding and protecting children in harm's way.

For former dark web investigator Pete Manning, that impact is personal. After years of working child sexual abuse cases, Pete saw firsthand how overwhelming caseloads can take a toll on even the most dedicated professionals. By dramatically reducing the manual burden of sorting through massive content libraries, Thorn Detect gives investigators the time and resilience to find more child victims faster.

Our innovation also expanded detection capabilities through a new Spanish-language text classifier. This feature empowers platforms to detect more instances of potential grooming and sextortion. This multilingual expansion means protection for more children in more communities.

Our cause community powers every advancement we build. Every innovation proves that hope can scale as fast as harm, and that technology can be a force for good.
Partnering for prevention and empowerment

This year, Thorn met children where they are and gave them the tools to navigate digital spaces safely.

Our work with Snapchat brought a new safety resource directly to teens. We consulted on the creation of The Keys: A Guide to Digital Safety. This interactive guide helps young people recognize and respond to sextortion, image sharing, and cyberbullying. The guide includes short videos, reflection exercises, and conversation tools for families. The goal is to open honest, judgment-free discussions about online experiences before harm occurs.

The NoFiltr Youth Council took this peer-to-peer approach even further. Designed by youth, for youth, the council created safety resources for Roblox players. Their creativity and authenticity make prevention relatable and honest. This effort shows that young people aren't just participants in this movement; they're leading it.

When we partner with platforms and youth, we give children more than information. We give them agency. Every empowered choice brings us closer to a world where children can safely explore, learn, and grow online without the threat of exploitation.
A new era for Thorn's mission

This year marked a bold new chapter in Thorn's story, one that reflects how far we've come and how far we're ready to go.

We launched Thorn's updated mission statement and four pillars of child safety:

Together, they form the foundation of our vision for a digital safety net that protects every child. We work toward interconnected layers of protection for children across technology, policy, and our communities.

This evolution is a reflection of what we've built together. Donor trust and partner collaboration have made Thorn's progress possible. Every breakthrough in technology, every research insight, every new safeguard began with people who chose to believe that protecting children and a joyful childhood is worth building for.

With your support, we're not just responding to emerging threats; we're shaping the systems that have the power to prevent them. With every innovation and partnership, we're transforming how children are protected in the digital age.

But there's still work to do, and now we know what's possible when we stand united.

During this giving season, join us in building the digital safety net every child deserves. Your support fuels the technology, research, and partnerships that make protection possible, and ensures that every child can grow up safe, curious, and free.