There is a child behind every file, hash, and piece of content. To date, Safer has identified 2.8+ million potential CSAM files on customer platforms.
Together, we’re building a safer internet.
With a relentless focus on CSAM elimination strategies, Safer helps protect content-hosting platforms and their users from the risks of hosting child sexual abuse images and videos.
We are transforming the internet by enabling platforms worldwide to detect CSAM and collaborate to prevent its viral spread.
Our focus on mental wellbeing—within our own team and in Safer’s features—helps content moderators avoid burnout and build resilience.
Top content-hosting platforms rely on our issue expertise, proprietary research, and state-of-the-art classifiers to deliver a robust CSAM detection solution.
In 2022, the National Center for Missing and Exploited Children’s CyberTipline received over 88.3 million files of child sexual abuse material (CSAM) from electronic service providers (ESPs) alone.
Reports from ESPs constitute the majority of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing this issue.
Platform protection remains inconsistent across the industry. Companies that attempt to build their own solutions quickly learn that the endeavor is costly, and that incomplete or siloed data sets leave them with tools that only partially address the issue.
Safer was created and brought to market by Thorn to fill the need for a solution that could adequately tackle the growing CSAM issue.
With Safer, Thorn is equipping content-hosting platforms with industry-leading CSAM detection tools to protect their platforms and users.
Hear more about Thorn’s vision in CEO Julie Cordua’s TED talk.
Let’s chat about putting Safer to work for your platform.