Power trust and safety with purpose-built solutions

Mitigate risk with proactive child sexual abuse material (CSAM) and exploitation (CSE) detection solutions built by experts in child safety technology.

Trusted by

Vimeo
Patreon
Stability.ai
Slack
Bluesky

Product trust and safety is complex.

Privacy considerations

You must consider privacy-forward solutions that also protect your platform from misuse and safeguard your brand reputation.

Evolving abuse tactics

Bad actors’ means and methods evolve. To keep pace, you need to monitor threats and youth behavior.

The need for specialized solutions

You need child safety expertise and access to sensitive data in order to build effective CSAM detection tools.

Trust and safety solutions built by child safety technology experts

  • Known CSAM
  • Novel CSAM
  • Grooming
  • Child Sexual Exploitation
  • Sextortion

Detect harmful content and maintain a privacy-forward stance

With our self-hosted deployment option, you share limited data while gaining access to expert-backed CSAM and exploitation detection solutions.

A secure API-based deployment option is also available.

Keep pace with evolving abuse tactics with expert-backed solutions

Thorn’s original research and our team’s issue expertise inform the design of our products and services, whether it’s proprietary perceptual hashing, predictive AI, or child safety advisory consulting.

Unleash predictive AI trained on trusted data

Safer by Thorn’s machine learning classification models were trained on data from trusted sources, including data from the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline.

See Our Solutions
“Thorn is unique in its depth of expertise in both child safety and AI technology. The combination makes them an exceptionally powerful partner in our work to assess and ensure the safety of our models.”
Chelsea C., Child Safety Technical Program Manager, OpenAI

Trust and Safety Resources

  • Categories: Product Updates

    Announcing Safer Predict: AI-Driven CSAM & CSE Detection

    Our latest solution, Safer Predict, uses cutting-edge AI to detect new and unreported CSAM as well as harmful text conversations.

    Read more
  • Categories: Learn

    Five takeaways on building ethical AI content moderation

    Explore five key insights from Thorn’s “Humans in the Loop” webinar on building ethical, human-centered systems for AI content moderation.

    Read more
  • Categories: Learn

    1 in 3 young boys are targeted online: What the new 2024 Youth Perspectives data tells us

    There are many online platforms where boys go to build worlds, compete with friends, and simply be kids. But our 2024 research reveals a troubling reality:

    Read more

On-demand demo

Learn how our trust and safety solutions can be tailored to your challenges

Our child sexual abuse and exploitation solutions are powered by original research, trusted data, and proprietary technology. Let’s build a safer internet together. Your next step starts here.