Learn Hashing and Matching is Core to Proactive CSAM Detection Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
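At its core, hash-and-match detection fingerprints each uploaded file and compares that fingerprint against curated lists of hashes of verified CSAM. The sketch below is purely illustrative, not Safer's implementation: the `known_csam_hashes` set and file paths are hypothetical stand-ins for a real, curated hashlist, and it uses an ordinary cryptographic hash, whereas production systems typically also use perceptual hashes so visually similar copies still match.

```python
import hashlib

# Hypothetical placeholder for a curated hashlist of verified CSAM digests.
known_csam_hashes = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder value only
}

def file_digest(path: str) -> str:
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_csam(path: str) -> bool:
    """Return True if the file's digest appears in the known hashlist."""
    return file_digest(path) in known_csam_hashes
```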
Learn Comprehensive CSAM Detection Combines Hashing and Matching with Classifiers Addressing CSAM requires scalable tools to detect both known and unknown content.
Product Updates Safer’s 2022 Impact Report In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Case Study Flickr Uses CSAM Image Classifier to Find Harmful Content Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Product Updates Announcing RCMP Reporting via Safer Detect CSAM and send reports to the Royal Canadian Mounted Police (RCMP) from Safer, an all-in-one solution for CSAM moderation.
Learn Safer’s Self-Hosted Deployment Provides Control, Security and Scalability Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Learn Optimize CSAM Detection with SaferList Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is reported and when it can be matched against.
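As a rough illustration of the idea behind a self-managed hashlist (the names below are hypothetical, not Safer's API): hashes of content a platform has itself verified can be added to its own list so they match immediately, even before they appear in shared industry hashlists.

```python
# Hypothetical sketch: a platform-managed hashlist kept alongside a shared one.
shared_hashes: set[str] = set()        # e.g., hashes sourced from industry hashlists
self_managed_hashes: set[str] = set()  # hashes the platform has verified itself

def add_to_self_managed(file_hash: str) -> None:
    """Record a newly verified hash so future uploads match right away."""
    self_managed_hashes.add(file_hash)

def matches_any_list(file_hash: str) -> bool:
    """Check an upload's hash against both the shared and self-managed lists."""
    return file_hash in shared_hashes or file_hash in self_managed_hashes
```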
Product Updates Safer’s 2021 Impact Report In 2021, Safer empowered content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms.
Learn Four signs your platform should be proactively detecting child sexual abuse material If any of these signals applies to your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM, and it may be time to consider proactive CSAM detection.
Blog Let’s Build a Better Internet for Every Child: Safer’s best-in-class technology is now available for anyone with an AWS Marketplace account In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 child sexual abuse material (CSAM) files. By 2019, that figure had grown to nearly 70 million.
Blog The challenge of detecting CSAM videos and what we can do about it today In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash sharing would be a critical tool in helping us achieve that goal.
Learn Common Terms and Definitions List of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
Product Updates Announcing Safer for Detection and Post-Detection of CSAM Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real-time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.
Learn Safer: Building the internet we deserve We began this journey almost a decade ago and the scale of the problem continues to grow. We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.
Learn A problem of epidemic proportions In 2019, The New York Times published a four-part series tackling the intricacies of child sexual abuse material. We're sharing it as a resource alongside our insights on who this impacts most and how.