Since its founding, VSCO has believed in proactively defending the wellbeing of its global community of 300+ million registered users, who upload photos and videos to its platform daily. To protect its platform and photographers, VSCO used Safer – Thorn's all-in-one solution to detect, review, and report child sexual abuse material (CSAM) at scale – from 2020 to 2025.
A Safety by Design Approach
VSCO’s strong focus on photographers’ work and experience on the platform is an extension of its safety by design ethos. As the company has developed the platform and grown, it has invested in infrastructure that safeguards against harmful content so its photographer community never has to see it.
A desire for comprehensive protection against CSAM led VSCO’s trust and safety team straight to Thorn. The VSCO team knew about our mission to build technology to defend children from sexual abuse and was eager to use Safer.
Collaboration is Key
With the rise of user-generated content, the spread of CSAM has accelerated. Often, the public is shocked to find CSAM and child exploitation spreading on platforms they use every day.
Thorn is dedicated to providing tools and resources to content-hosting platforms, as they are key partners in combating the viral spread of CSAM. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its photographer community.
Between 2020 and 2025, VSCO leveraged Safer to flag 21,329 images and videos as suspected CSAM and detect 1,191 instances of known CSAM. By proactively fighting the spread of CSAM, VSCO ensured photographers weren’t exposed to harmful content.
Together, VSCO and Safer made an impact.