Power trust and safety with purpose-built solutions

Mitigate risk with proactive detection solutions for child sexual abuse material (CSAM) and child sexual exploitation (CSE), built by experts in child safety technology.

Trusted by

Vimeo
Patreon
Stability.ai
Slack
Bluesky

Product trust and safety is complex.

Privacy considerations

You need privacy-forward solutions that also protect your platform from misuse and safeguard your brand reputation.

Evolving abuse tactics

Bad actors’ means and methods are constantly evolving. To keep pace, you need ongoing insight into emerging threats and youth behavior.

The need for specialized solutions

You need child safety expertise and access to sensitive data to build effective CSAM detection tools.

Trust and safety solutions built by child safety technology experts

  • Known CSAM
  • Novel CSAM
  • Child Sexual Exploitation
  • Sextortion

Detect harmful content and maintain a privacy-forward stance

With our self-hosted deployment option, you share limited data while gaining access to expert-backed CSAM detection solutions.

A secure API-based deployment option is also available.

Keep pace with evolving abuse tactics through expert-backed solutions

Thorn’s original research and our team’s issue expertise inform the design of our products and services, whether that’s proprietary perceptual hashing, predictive AI, or child safety advisory consulting.
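Thorn’s hashing technology is proprietary, but the general idea behind perceptual-hash matching is simple to illustrate. The Python sketch below (all names, values, and the distance threshold are hypothetical, not Thorn’s implementation) flags an image hash as a match when it falls within a small Hamming distance of any hash on a known-content list, which is what lets perceptual hashing catch re-encoded or lightly edited copies that exact hashing would miss:

    # Illustrative sketch of generic perceptual-hash matching.
    # Thorn's hashing is proprietary; these names and thresholds are hypothetical.

    def hamming_distance(a: int, b: int) -> int:
        """Count the differing bits between two 64-bit perceptual hashes."""
        return (a ^ b).bit_count()

    def matches_known_hash(candidate: int, known_hashes: set[int],
                           max_distance: int = 4) -> bool:
        """Flag a match if the candidate is within max_distance bits of any known hash.

        Perceptual hashes change only slightly under resizing or re-encoding,
        so matching uses a distance threshold rather than exact equality.
        """
        return any(hamming_distance(candidate, h) <= max_distance for h in known_hashes)

    # A re-encoded copy differs by a couple of bits but still matches.
    known = {0xF0E1D2C3B4A59687}
    print(matches_known_hash(0xF0E1D2C3B4A59686, known))  # True (1 bit differs)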

Unleash predictive AI trained on trusted data

Safer by Thorn’s machine learning classification models were trained on data from trusted sources, including data from the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline.

See Our Solutions
“Safer Predict’s text classifier significantly improves our ability to prioritize and escalate high-risk content and accounts. The multiple labels and risk scores help our team focus on problem accounts, some of which we had been suspicious about but lacked actionable evidence before we deployed the classifier. Thorn’s expertise is evident in Safer’s ability to detect conversations that could lead to sexual harms against children.”
Niles Livingston, Child Safety Manager, MediaLab
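To make the workflow Livingston describes concrete, here is a minimal triage sketch. The field names, schema, and threshold are assumptions for illustration, not Safer Predict’s actual response format: it ranks classifier output by its highest per-label risk score and escalates anything above a review threshold.

    # Hypothetical triage sketch: rank multi-label classifier output by risk.
    # Field names and thresholds are illustrative, not Safer Predict's schema.
    from dataclasses import dataclass, field

    @dataclass
    class ScoredMessage:
        account_id: str
        text: str
        # Per-label risk scores in [0, 1], e.g. {"sextortion": 0.91, "cse": 0.34}
        label_scores: dict[str, float] = field(default_factory=dict)

        @property
        def max_risk(self) -> float:
            return max(self.label_scores.values(), default=0.0)

    def triage(messages: list[ScoredMessage],
               escalate_at: float = 0.8) -> list[ScoredMessage]:
        """Return messages above the escalation threshold, highest risk first."""
        flagged = [m for m in messages if m.max_risk >= escalate_at]
        return sorted(flagged, key=lambda m: m.max_risk, reverse=True)

    # Only the high-risk account surfaces for reviewer escalation.
    queue = triage([
        ScoredMessage("acct-42", "…", {"sextortion": 0.91}),
        ScoredMessage("acct-7", "…", {"cse": 0.35}),
    ])
    print([m.account_id for m in queue])  # ['acct-42']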

Trust and Safety Resources

  • Categories: Learn

    Deepfake nudes are a harmful reality for youth: New research from Thorn

    Deepfake technology is evolving at an alarming rate, lowering the barrier for bad actors to create hyper-realistic explicit images in seconds—with no technical expertise required.

    Read more
  • Categories: Learn

    The Take It Down Act: What Trust and Safety Teams Need to Know

    AI-generated deepfake nudes are accelerating the spread of nonconsensual image abuse, making it easier for bad actors to manipulate…

    Read more
  • Categories: Case Study

    How GIPHY uses Safer by Thorn to proactively detect CSAM

    GIPHY proactively detects CSAM with Safer to deliver on its promise of being a source for content that makes conversations more positive.

    Read more

On-demand demo

Learn how our trust and safety solutions can be tailored to your challenges

Our child sexual abuse and exploitation solutions are powered by original research, trusted data, and proprietary technology. Let’s build a safer internet together. Your next step starts here.