If any of these signals applies to your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM, and it may be time to consider proactive CSAM detection.
Don’t know where to start? Read our primer, which outlines the impact of this issue on content-hosting platforms and offers company policy recommendations for building a safer community.
Safer Community Resources
In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 child sexual abuse material (CSAM) files. By 2019 that figure had grown to nearly 70 million.
In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash-matching technology helps platforms identify known CSAM and stop its spread.
Protect your platform from child sexual abuse material (CSAM) using a scalable, real-time solution that quickly identifies and queues content for safe and efficient removal and reporting.
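To make the underlying technique concrete: hash matching computes a digital fingerprint of each uploaded file and compares it against fingerprints of previously verified CSAM, so known material can be flagged without anyone having to re-view it. The Python sketch below is a generic illustration of that idea, not Safer’s actual implementation; the hash set and function names are hypothetical placeholders.

```python
import hashlib

# Hypothetical hash set: real lists of known-CSAM hashes are distributed
# by organizations such as NCMEC under strict agreements and are not
# reproduced here.
KNOWN_CSAM_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def sha256_of_file(path: str, chunk_size: int = 65536) -> str:
    """Stream the file through SHA-256 so large uploads don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: str) -> bool:
    """Check an uploaded file's fingerprint against the known-CSAM hash set."""
    return sha256_of_file(path) in KNOWN_CSAM_HASHES
```

Exact cryptographic hashes only catch byte-identical copies, so production systems typically pair them with perceptual hashing, which matches visually similar files even after resizing or re-encoding.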
A list of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
In partnership with the Technology Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process.
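As a rough sketch of how such a term list might plug into a moderation pipeline (a generic illustration, not the API’s actual interface; the term set and helper names below are hypothetical), a platform could normalize incoming text and check it against the flagged vocabulary:

```python
import re
import unicodedata

# Hypothetical term set: the actual terms and phrases are curated by
# Thorn and the Technology Coalition and are not reproduced here.
FLAGGED_TERMS = {"exampleterm", "beispielbegriff"}

def normalize(text: str) -> str:
    """Lowercase and strip diacritics so simple variants still match."""
    decomposed = unicodedata.normalize("NFKD", text.casefold())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def flagged_terms_in(text: str) -> set[str]:
    """Return any flagged terms found in a message, to route it for review."""
    tokens = set(re.findall(r"\w+", normalize(text)))
    return tokens & FLAGGED_TERMS
```

A match alone doesn’t establish abuse; flagged messages would still be routed to trained moderators, since many terms are highly context-dependent.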
Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real time. Comprehensive coverage begins with proactive detection. Read more about the features we’ve released.
We began this journey almost a decade ago, and the scale of the problem continues to grow. We’re activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.
The New York Times published a four-part series tackling the intricacies of child sexual abuse material in 2019. We're sharing it as a resource alongside our insights on who this impacts most and how.
Our starter guide for companies looking to institute policies and processes to actively combat the spread of CSAM on their platform.
Let’s talk