Don’t know where to start? Read our primer, which outlines the impact of this issue on content-hosting platforms and offers company policy recommendations for building a safer community.
Safer Community Resources
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM on their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Detect CSAM and send reports to the Royal Canadian Mounted Police directly from Safer, an all-in-one solution for CSAM moderation.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Safer’s self-managed hash lists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is reported and when it can be matched against.
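To illustrate the general technique behind hash-list matching, here is a minimal Python sketch assuming a simple exact-match (cryptographic hash) approach. The list names, placeholder digests, and `check_upload` helper are illustrative assumptions, not Safer's actual interface; production systems typically also use perceptual hashing to catch re-encoded, visually similar copies.

```python
import hashlib

# Hypothetical hash lists. In practice these would be populated from a
# hash-sharing service (an industry list plus a self-managed,
# SaferList-style list); the digests below are placeholders.
INDUSTRY_HASHES = {"0" * 32}
SELF_MANAGED_HASHES = {"1" * 32}

def check_upload(file_bytes: bytes) -> str:
    """Hash the raw file bytes and check both lists (exact-match only)."""
    digest = hashlib.md5(file_bytes).hexdigest()
    if digest in INDUSTRY_HASHES:
        return "match: industry hash list"
    if digest in SELF_MANAGED_HASHES:
        # A self-managed list can flag content a team has already
        # verified, before it appears in shared industry lists.
        return "match: self-managed list"
    return "no match"
```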
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.
If any of these signals apply to your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM, and it might be time to consider proactive CSAM detection:
In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 files of child sexual abuse material (CSAM). By 2019, that figure had surged to nearly 70 million.
In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash sharing would be a critical tool in helping us achieve that goal.
A list of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
In partnership with the Tech Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process.
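As a rough sketch of how a multilingual term list might be applied during text moderation: the data structure and `flag_text` helper below are hypothetical (the real API's interface may differ), and the entries are innocuous placeholders.

```python
import re

# Hypothetical multilingual term list; the real API curates terms and
# phrases per language. Entries here are innocuous placeholders.
TERMS_BY_LANGUAGE = {
    "en": ["example phrase", "another example"],
    "es": ["frase de ejemplo"],
}

def flag_text(text: str, language: str) -> list[str]:
    """Return any listed terms that appear in a piece of user text."""
    hits = []
    for term in TERMS_BY_LANGUAGE.get(language, []):
        # Case-insensitive whole-phrase match with word boundaries.
        if re.search(rf"\b{re.escape(term)}\b", text, re.IGNORECASE):
            hits.append(term)
    return hits

# Example: flag_text("An example phrase appears here", "en")
# -> ["example phrase"]
```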
Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.
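Here is a minimal sketch of how detection, human review, and reporting stages might fit together in such a pipeline. Every name, stub, and threshold below is an illustrative assumption, not Safer's actual design.

```python
import hashlib
from dataclasses import dataclass

KNOWN_HASHES: set[str] = set()  # would be populated from a hash-list service

@dataclass
class Upload:
    file_id: str
    data: bytes

def matches_known_hashes(data: bytes) -> bool:
    """Stage 1a: exact-match detection against known-CSAM hashes."""
    return hashlib.md5(data).hexdigest() in KNOWN_HASHES

def classifier_score(data: bytes) -> float:
    """Stage 1b: stub for an ML classifier that scores unknown content."""
    return 0.0  # a real classifier would return a model score

def moderate(upload: Upload) -> str:
    # Proactive detection: known-hash matching plus classification.
    if matches_known_hashes(upload.data) or classifier_score(upload.data) > 0.9:
        # Stage 2: route to a human review queue. Stage 3: confirmed
        # matches are removed and reported (e.g., to NCMEC or the RCMP).
        return "queued for review"
    return "allowed"
```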
We began this journey almost a decade ago and the scale of the problem continues to grow. We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.
In 2019, The New York Times published a four-part series tackling the intricacies of child sexual abuse material. We're sharing it as a resource alongside our insights on who this issue impacts most and how.
Our starter guide for companies looking to institute policies and processes to actively combat the spread of CSAM on their platform.
Let’s talk