Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Essential.
Clear trends emerged from Thorn’s latest research: self-generated CSAM (SG-CSAM) is on the rise, child predators are growing more brazen, and youth find current safety tools insufficient. Understand the risks to your platform.
For VSCO, building Safer into its infrastructure unlocked automated solutions and moderation efficiencies for its trust and safety and content moderation teams.
Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
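To make the mechanics concrete, here is a minimal sketch of hash-based matching using only Python’s standard library. Real deployments typically pair cryptographic hashes (exact duplicates) with perceptual hashes (visually similar copies); the hashlist contents and function names below are illustrative assumptions, not Safer’s API.

```python
import hashlib

# Hypothetical hashlist of verified CSAM digests. In production this would be
# a vetted list (e.g., aggregated from reporting bodies or a self-managed list),
# not a hard-coded set; the digest below is a placeholder for illustration.
KNOWN_HASHES = {"d41d8cd98f00b204e9800998ecf8427e"}

def md5_of_file(path):
    """Stream a file from disk and return its MD5 hex digest."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path):
    """True when an upload's cryptographic hash matches the hashlist."""
    return md5_of_file(path) in KNOWN_HASHES
```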
New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.
Addressing CSAM requires scalable tools to detect both known and unknown content.
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
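As a rough sketch of how classifier output might feed a moderation workflow, the snippet below routes images whose score crosses a review threshold to a human queue. The classify() stub, the 0.9 threshold, and the queue shape are assumptions for illustration, not Safer’s actual classifier API.

```python
# Hypothetical glue code between an image classifier and a review queue.
def classify(image_bytes):
    """Stand-in for a CSAM image classifier that returns a score in [0, 1]."""
    return 0.0  # placeholder; a real integration would call the model or API here

def route_upload(image_bytes, review_queue, threshold=0.9):
    """Send likely-CSAM images to trained human reviewers instead of publishing."""
    score = classify(image_bytes)
    if score >= threshold:
        review_queue.append({"score": score, "image": image_bytes})
    return score

# Usage: every upload is scored on ingestion; only flagged items reach reviewers.
queue = []
route_upload(b"example-image-bytes", queue)
```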
Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is first reported and when it can be matched against.
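The snippet below sketches the idea behind a self-managed list, assuming a simple in-memory structure: once a moderation team verifies new content, its hash is added so re-uploads are caught immediately, before shared industry databases catch up. The class and method names are illustrative, not SaferList’s interface.

```python
class SelfManagedHashlist:
    """Toy model of a customer-maintained hashlist for newly verified CSAM."""

    def __init__(self):
        self._hashes = set()

    def add(self, content_hash):
        """Record the hash of content the team has verified and removed."""
        self._hashes.add(content_hash)

    def matches(self, content_hash):
        """Check a new upload's hash against the team's own list."""
        return content_hash in self._hashes

# Usage: a re-upload of just-verified content is matched on ingestion,
# without waiting for external hash databases to include it.
hashlist = SelfManagedHashlist()
hashlist.add("9e107d9d372bb6826bd81d3542a419d6")  # placeholder digest
assert hashlist.matches("9e107d9d372bb6826bd81d3542a419d6")
```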
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms.
If any of these signals apply to your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM, and it might be time to consider proactive CSAM detection:
In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 files of child sexual abuse material (CSAM). By 2019, that figure had exploded to nearly 70 million.
In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash sharing will be a critical tool in helping us achieve that goal.
A list of common terms and definitions from the child safety issue and technical space, alongside a global list of child protection organizations.
Let’s talk