There is a child behind every file, hash, and piece of content. To date, Safer has identified 5+ million potential CSAM files on customer platforms.
Together, we’re building a safer internet.
With a relentless focus on CSAM elimination strategies, Safer helps protect content-hosting platforms and their users from the risks of hosting child sexual abuse images and videos.
Top content-hosting platforms rely on our deep expertise in this issue, proprietary research, and data science capabilities.
Advanced hashing techniques, matched against a database that aggregates 57.3 million known CSAM hash values, deliver highly accurate detection results.
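To illustrate the core idea of hash matching, here is a minimal sketch in Python. It uses a plain SHA-256 cryptographic hash and an in-memory set of known digests; `KNOWN_HASHES`, `sha256_of_file`, and `is_known_match` are illustrative names, not part of Safer's API, and a production system would query a maintained hash database rather than a local set.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for an aggregated database of known hash values.
# The sample entry is the SHA-256 digest of an empty file.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def sha256_of_file(path: Path) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Exact-match detection: flag a file whose digest is in the known set."""
    return sha256_of_file(path) in KNOWN_HASHES
```

Exact cryptographic matching like this only catches byte-identical copies; services such as Safer also employ perceptual hashing so that re-encoded or resized versions of a known file still match.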
Thorn’s child abuse model is trained in part using trusted data from the CyberTipline of the National Center for Missing & Exploited Children (NCMEC).
In 2023, NCMEC’s CyberTipline received over 104 million files of suspected child sexual abuse material (CSAM) from electronic service providers (ESPs) alone.
ESPs submit the majority of the reports NCMEC receives, showing that content-hosting platforms are critical partners in addressing this issue.
Platform protection remains inconsistent across the industry. Companies that attempt to build their own solutions quickly learn the endeavor is costly, and incomplete or siloed data sets leave them with tools that only partially address the issue.
Thorn built Safer to tackle the growing CSAM issue.
With Safer, Thorn is equipping content-hosting platforms with industry-leading CSAM detection solutions to protect their platforms and users.
Hear more about Thorn’s vision in CEO Julie Cordua's TED talk.
Learn More About Thorn
Let's chat about putting Safer to work for your platform.
Get in Touch