Proactive CSAM Detection Solutions Built by Experts in Child Safety Technology.

Safer by the Numbers

  • 130B+ Files processed
  • 57M+ Hashes of known CSAM in database
  • 5M+ Potential CSAM files identified on customer platforms since 2019
  • 1.9M+ Files classified as potential CSAM

Detect

Using two equally important technologies – hash matching and artificial intelligence (AI) – Safer detects both known and unknown CSAM and recognizes text-based online conversations that indicate or could lead to child exploitation.

  • Safer Match protects your platform from known CSAM (images and videos)
  • Safer Predict protects your platform from unknown CSAM (images and videos) and text-based child sexual exploitation
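The two-pronged flow above can be sketched in a few lines. This is a hypothetical illustration, not Safer's actual API: the `classify` stand-in and the `detect` routing logic are assumptions made for the example. Known files are caught by an exact hash lookup; everything else is scored by a classifier and flagged above a risk threshold.

```python
# Illustrative sketch of two-pronged detection (not Safer's actual API):
# exact hash lookup for known content, classifier score for unknown content.
import hashlib

# In practice this would be a large database of known-CSAM hashes.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-file").hexdigest()}

def classify(data: bytes) -> float:
    """Stand-in for an ML classifier returning a risk score in [0, 1]."""
    return 0.97 if b"suspicious" in data else 0.01

def detect(data: bytes, threshold: float = 0.9) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        return "known-match"      # known content: exact hash hit
    if classify(data) >= threshold:
        return "potential-new"    # unknown but high-risk: route to human review
    return "clear"

print(detect(b"known-bad-file"))   # known-match
print(detect(b"suspicious clip"))  # potential-new
print(detect(b"cat photo"))        # clear
```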

Features

  • Hash Matching

    Cryptographic and proprietary perceptual hashing algorithms identify known CSAM in both image and video content. With access to a vast database aggregating 57.3 million hashes of known CSAM, Safer Match casts a wide net to detect and flag harmful content effectively.

  • Video Hash Matching

    Thorn's proprietary perceptual scene-sensitive video hashing (SSVH) technique splits videos into scenes and frames to identify CSAM with precision.

  • CSAM and Text Classifiers

    Safer Predict’s advanced machine learning (ML) classification models detect new or previously unreported CSAM and child sexual exploitation (CSE) behavior, generating risk scores that help human reviewers make decisions faster.
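The hash-matching features above rely on perceptual hashes, which, unlike cryptographic hashes, survive small edits such as recompression. A toy average-hash compared by Hamming distance illustrates the idea; this sketch is not Thorn's proprietary algorithm, and the tiny pixel grids and threshold are invented for the example.

```python
# Illustrative sketch of perceptual hash matching (not Thorn's algorithm):
# a toy average-hash over a grayscale pixel grid, compared by Hamming
# distance against a list of hashes of known content.

def average_hash(pixels):
    """1 bit per pixel, set when the pixel is brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def matches_known_list(pixels, known_hashes, max_distance=2):
    """Flag content within max_distance bits of any known hash."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)

# A slightly re-encoded copy still matches, unlike with a cryptographic hash.
original = [10, 200, 30, 220, 15, 210, 25, 205, 12]
recompressed = [12, 198, 31, 221, 14, 209, 26, 204, 13]
known = [average_hash(original)]
print(matches_known_list(recompressed, known))  # True
```

The tolerance (`max_distance`) trades recall against false positives, which is one reason a human review step follows detection.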

Review

The Safer review tool is a content moderation UI with built-in wellness features that help reduce your team's unneeded exposure to CSAM while enabling them to review flagged content effectively.

Benefits

  • Queue content for your team’s review
  • Protect your employees’ mental health

Report

Safer's reporting service provides a form UI to collect necessary data and connects to central reporting bodies in the US and Canada. In addition to packaging documentation, Safer’s reporting tool includes secure storage to preserve reported content.

Benefits

  • Simplify collection of necessary data
  • Securely store reported content
  • Connect to central reporting bodies

Contribute

Safer offers tools for cross-platform sharing of CSAM hash values. You can share the self-managed hash list of CSAM detected on your platform, either under your organization's name or anonymously, with other Safer customers to help curb the viral spread of harmful content.

Benefits

  • Breaks down data silos
  • Eliminates delay between reporting and future detection of new CSAM

Features

  • Perceptual Hash Optimization

    A self-sustaining feedback loop that improves match rates and accuracy over time.

  • SaferList

    A set of self-managed hash lists your content moderation team can use to reduce re-review of CSAM and to support policy enforcement for sexually exploitative content.

“Thorn makes it simple for businesses to set up and operate a robust child safety program. Their Safer tools are designed with flexibility in mind, and Thorn has provided excellent support to our product and engineering teams to ensure our implementation of these tools fits the unique context of our platform. Slack has long relied on Thorn to help keep our services safe in a responsible and privacy-protective way.”
Risa Stein, Director of Product Management, Integrity at Slack

Give your platform a competitive advantage.

Let’s discuss how Safer runs on your infrastructure.

Request Demo