As we begin 2026, I'm filled with both deep gratitude and renewed determination. Every year, the digital world seems to evolve faster than the one before. This brings new challenges, new possibilities, and a growing urgency to our mission. In this moment of rapid change, our resolve is clear: we will transform how children are protected from sexual abuse and exploitation in the digital age.
At Thorn, we've never been more prepared to meet this moment. In 2025, we strengthened the foundation of a true digital safety net for children. In the year ahead, we'll continue to build it. We will deepen what works and expand on what's needed. Together, we're shaping a future where safety is built in, not bolted on.
Accelerating technical innovation
In 2026, Thorn will continue to lead the way in using technology to protect children at scale. Our technical teams are building on years of machine learning innovation to help platforms and investigators detect and respond to child sexual abuse material faster and with greater precision.
This next generation of technology goes beyond detecting potential abuse. By adding deeper context about what's being seen, such as victim age, severity, or the type of harm depicted, we're helping investigators focus on the most urgent cases and protect those victims more quickly.
Our tools are evolving as perpetrators turn to new tactics. We're advancing detection of suspected AI-generated CSAM and stylized content intended to evade existing detection. These advances are more than technical progress; they reflect a clear commitment to meet emerging threats head-on and to keep closing critical gaps that leave children vulnerable.
Strengthening victim identification
Behind every child sexual abuse material (CSAM) file detected is a child who deserves to be found and brought to safety. In 2026, we'll continue to strengthen the tools that help investigators identify victims faster. We will empower those on the frontlines with technology that makes an overwhelming job more manageable.
Our work this year focuses on refining and optimizing the systems investigators rely on every day. We'll ensure they have access to the best tools and intelligence available. By improving speed, detection capabilities, and usability, we're helping frontline defenders spend less time sorting through overwhelming volumes of material. That gives them more time to bring children out of harm's way.
Equipped with our latest technical innovations, investigators will have access to more granular information about abuse material content. This empowers them to triage large queues of harmful content more effectively. It's a critical need as the volume of suspected CSAM reported continues to rise each year.
Providing global guidance through our research and insights
Understanding how harm evolves online is essential to stopping it. We'll continue to conduct research that maps emerging risks for youth online and share the insights needed to strengthen child safety.
Our team strives to bring clarity to a complex issue. It's how we shape the global conversation around the role of technology in protecting children. From understanding new patterns of grooming and sextortion to exploring how generative AI is changing the threat landscape, Thorn's research gives partners across tech, policy, and law enforcement the knowledge they need to act.
This work ensures that our solutions, and the child safety ecosystem we help strengthen, stay grounded in data. Most importantly, it elevates the lived realities of young people growing up online. It's a critical step in moving from reacting to harm to preventing it altogether.
Looking ahead
As we enter this next year, Thorn's mission remains clear: to transform how children are protected in the digital age. The challenges before us are complex. The opportunity for progress has never been greater.
None of this work would be possible without our community of supporters who stand with us every step of the way. Together, we're proving that technology can be a force for good, and that a safer digital world for every child is within reach.