Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of 300+ million registered users, who upload photos and videos to its platform every day. The company’s mission is to close the gap between the creative life photographers want and the confidence, tools, and community they need to make it real.

A Safety by Design Approach

VSCO’s strong focus on photographers’ work and experience on the platform is an extension of its safety by design ethos. As it has developed and scaled the platform, the company has invested in infrastructure that safeguards against harmful content so its photographer community never has to see it.

A desire for comprehensive protection against child sexual abuse material (CSAM) led VSCO’s trust and safety team straight to Thorn, whose mission is to protect children from sexual abuse and exploitation in the digital age. Thorn brought Safer to market in 2019 to fill the need for a solution that could adequately tackle the viral spread of CSAM.

CSAM Detection Built In

Safer Enterprise is a self-hosted solution that gives content-hosting platforms control, security, and scalability. Platforms decide how the tool integrates into their infrastructure, giving them the flexibility to choose how it feeds into their workflows while providing privacy-forward CSAM protection.

In 2020, VSCO incorporated Safer into its infrastructure, which unlocked automated solutions and moderation efficiencies for VSCO’s trust and safety and content moderation teams. Through 2025, VSCO leveraged Safer to proactively scan uploaded content to detect known and potentially novel CSAM and flag it for human review.

Collaboration is Key

With the rise of user-generated content, the spread of CSAM has accelerated. Often, the public is surprised to find CSAM and child exploitation spreading on platforms they use every day. The reality is that hosting CSAM is an inherent risk if a platform has an upload button. In fact, reports from content-hosting platforms constitute the majority of reports received by the National Center for Missing & Exploited Children’s CyberTipline.

Thorn is dedicated to providing tools and resources to content-hosting platforms as they are key partners in combating the spread of CSAM. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its photographer community. Between 2020 and 2025, Safer flagged 21,329 images and videos as potential CSAM and detected 1,191 instances of known CSAM on VSCO. By proactively detecting CSAM, VSCO ensured photographers weren’t exposed to harmful content.
