VSCO Uses Safer to Protect Its Platform and Community of Creators from CSAM at Scale
Since its founding, VSCO has believed in proactively protecting the wellbeing of its global community of 250+ million registered users, whom it calls creators and who upload photos and videos to its platform every day. The company’s mission is to nurture creativity so creators can make it.
A Safety by Design Approach
VSCO’s strong focus on creators’ work and experience on the platform is an extension of its safety by design ethos. As it has developed and scaled the platform, the company has invested in infrastructure that safeguards against harmful content so its creator community never has to see it.
A desire for comprehensive protection against child sexual abuse material (CSAM) led VSCO’s trust and safety team straight to Thorn, whose mission is to build technology to defend children from sexual abuse. Thorn brought Safer to market in 2019 to fill the need for a solution that could adequately tackle the viral spread of CSAM.
CSAM Detection Built In
Safer Enterprise is an on-prem solution. Its self-hosted deployment gives content-hosting platforms control, security, and scalability: each platform decides how the tool integrates into its infrastructure and feeds into its workflows, while Safer provides privacy-forward CSAM protection.
For VSCO, building Safer into its infrastructure unlocked automation and moderation efficiencies for its trust and safety and content moderation teams. Every upload is scanned proactively and programmatically: hash matching detects known CSAM, and Safer’s CSAM Classifier identifies previously unknown CSAM and flags it for human review.
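To illustrate the shape of that workflow (this is a minimal sketch, not Safer’s actual API; the `SaferClient` class, its `scan` method, the threshold value, and the queue and publish hooks are all hypothetical placeholders), an upload-time pipeline might look roughly like this:

```python
# Hypothetical sketch of an upload-time CSAM detection pipeline.
# The SaferClient class and its scan() method are illustrative placeholders,
# not Safer's real interface.

from dataclasses import dataclass


@dataclass
class ScanResult:
    is_known_csam: bool      # matched a hash of previously verified CSAM
    classifier_score: float  # likelihood that the content is unknown CSAM


class SaferClient:
    """Placeholder client for a self-hosted detection service."""

    def scan(self, media_bytes: bytes) -> ScanResult:
        # In a real deployment this would run hash matching plus the
        # classifier inside the platform's own infrastructure.
        raise NotImplementedError


CLASSIFIER_THRESHOLD = 0.9  # illustrative value; a real threshold is tuned per platform


def handle_upload(media_bytes: bytes, safer: SaferClient, review_queue, publish) -> None:
    """Scan every upload before it reaches the creator community."""
    result = safer.scan(media_bytes)

    if result.is_known_csam:
        # Known CSAM: block publication and route to the reporting workflow.
        review_queue.put(("known_csam", media_bytes))
        return

    if result.classifier_score >= CLASSIFIER_THRESHOLD:
        # Potential unknown CSAM: hold for human review instead of publishing.
        review_queue.put(("potential_csam", media_bytes))
        return

    publish(media_bytes)  # content cleared for the platform
```

The key design point the sketch reflects is that detection sits in the upload path itself, so harmful content is intercepted before moderators or creators ever see it in the product.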
Collaboration is Key
With the rise of user-generated content, the spread of CSAM has accelerated. The public is often surprised to find CSAM and child exploitation spreading on platforms they use every day. The reality is that hosting CSAM is an inherent risk for any platform with an upload button. In fact, the majority of reports received by the National Center for Missing & Exploited Children’s CyberTipline come from content-hosting platforms.
Thorn is dedicated to providing tools and resources to content-hosting platforms, which are key partners in combating the spread of CSAM. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its creator community. In 2022, Safer flagged 35,378 images and videos as potential CSAM and detected 408 instances of known CSAM for VSCO. By proactively fighting the spread of CSAM, VSCO ensures its creators aren’t exposed to this harmful content.
Together, VSCO and Safer are making an impact.