Emerging Trends Report

Is Your Platform at Risk?

Clear trends emerged from Thorn’s latest research. SG-CSAM is on the rise. Child predators are more brazen. Youth find safety tools insufficient. Understand the risks to your platform.

Get the Report

Safer Community Resources

The tools, best practices, and leading voices you need to keep your platform Safer.

Learn

On January 31, 2024, the CEOs of Meta, TikTok, Snap, and Discord testified at the U.S. Senate Judiciary Committee hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."

Read More
Product Updates

Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.

Read More
Product Updates

Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Essential.

Read More
Case Study

For VSCO, building Safer into its infrastructure unlocked automated workflows and moderation efficiencies for its trust and safety and content moderation teams.

Read More
Learn

Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.

Read More
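The hashing-and-matching approach above can be illustrated with a minimal sketch. For simplicity it uses a cryptographic SHA-256 digest and a hypothetical in-memory hash set; production systems like Safer match against vetted hash lists from trusted sources and also use perceptual hashes (e.g., PhotoDNA-style) that tolerate minor image alterations, which a cryptographic hash cannot.

```python
import hashlib

# Hypothetical known-hash set for illustration only. Real deployments
# query vetted hash lists maintained by trusted organizations, not a
# hard-coded set, and use perceptual hashing alongside cryptographic.
KNOWN_HASHES = {
    # SHA-256 digest of the sample bytes b"example-known-file"
    hashlib.sha256(b"example-known-file").hexdigest(),
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Flag content whose hash matches an entry in the known-hash set."""
    return file_hash(data) in KNOWN_HASHES

# A byte-identical copy of a known file matches; new content does not.
print(is_known_match(b"example-known-file"))  # True
print(is_known_match(b"never-seen-before"))   # False
```

Because an exact hash only catches byte-identical copies, this technique detects known, previously verified CSAM; detecting novel content requires classifiers, which is why the two approaches are typically deployed together.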
Emerging Trends Report 2023

New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.

Get the report
Learn

Addressing CSAM requires scalable tools to detect both known and unknown content.

Read More
Product Updates

In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.

Read More
Case Study

Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.

Read More
Product Updates

Detect CSAM and send reports to the Royal Canadian Mounted Police directly from Safer, an all-in-one solution for CSAM moderation.

Read More
Learn

Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.

Read More
Learn

Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps close the gap between when new CSAM is first reported and when it can be matched against.

Read More
Product Updates

In 2021, Safer empowered content moderators and trust & safety professionals to detect, report and remove CSAM from their content-hosting platforms.

Read More
Learn

If any of these signals applies to your platform, it’s worth reviewing your company’s policies and procedures for handling CSAM, and it may be time to consider proactive CSAM detection:

Read More
Blog

In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 files of child sexual abuse material (CSAM). By 2019, that figure had exploded to nearly 70 million.

Read More

Let’s talk

Build a better internet with us.
