Safer Community Resources

Trust and safety insights

Safeguard Youth and Protect Your Platform

We asked 8,000 young people about their experiences online: grooming, sharing nudes, and insufficient safety tools came up again and again. Bad actors exploit online communities to target kids. Understand the risks to your users and your platform.

Get the Brief Now

Tools

In partnership with the Tech Coalition, Thorn has developed an API containing child sexual abuse material (CSAM) terms and phrases in multiple languages to improve your content moderation process.

Apply Now
Product Updates

In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.

Read More
Webinar

July 10 at 12 p.m. PT | Hosted by Thorn, in collaboration with the National Center for Missing and Exploited Children

Save Your Spot
Learn

In 2023, NCMEC’s CyberTipline received a staggering 36.2 million reports of suspected child sexual exploitation.

Read More
Learn

Platform safety tools — like blocking and reporting — are often a child’s first choice for responding to a harmful sexual interaction online, rather than seeking support.

Read More
Learn

Understand how to prevent risky situations involving youth and bad actors with insights from Thorn’s latest brief for digital platforms.

Get the brief now
Learn

The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.

Read More
Learn

Learn how easy online access to children has given rise to new types of perpetrators of child sexual abuse.

Read More
Learn

Four considerations for trust and safety teams at digital platforms as they review their child safety policies.

Read More
Learn

In the last two years, generative AI has seen unprecedented advances. The technology ushered in the ability to create content and spread ideas faster than ever.

Read More
Learn

Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.

Read More
Learn

On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."

Read More
Product Updates

Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.

Read More
Product Updates

Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Essential.

Read More
Case Study

For VSCO, building Safer into its infrastructure unlocked automated solutions and moderation efficiencies for its trust and safety and content moderation teams.

Read More
Learn

Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.

Read More
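At its core, hash matching fingerprints each uploaded file and compares that fingerprint against lists of hashes of previously verified CSAM. A minimal sketch in Python, using a cryptographic hash and a hypothetical known-hash set for illustration (production systems such as Safer also use perceptual hashes so that resized or re-encoded copies still match):

```python
import hashlib

# Hypothetical sample data: in practice, known-hash lists come from
# trusted sources such as NCMEC or Safer's shared hash database.
KNOWN_HASHES = {
    # sha256("test")
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Compute the SHA-256 digest of raw file bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known_hashes: set[str] = KNOWN_HASHES) -> bool:
    """Return True when the file's fingerprint appears in the known list."""
    return sha256_hex(data) in known_hashes
```

A cryptographic hash only catches byte-identical copies; that limitation is why perceptual hashing is the other half of real-world CSAM scanning.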
Emerging Trends Report 2023

New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.

Get the report
Learn

Addressing CSAM requires scalable tools to detect both known and unknown content.

Read More
Product Updates

In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.

Read More
Case Study

Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.

Read More
Product Updates

Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.

Read More
Learn

Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.

Read More
Learn

Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is reported and when it can be matched against.

Read More
Product Updates

In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.

Read More
Learn

If any of these signals applies to your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM — and it might be time to consider proactive CSAM detection:

Read More
Blog

In 2004, the National Center for Missing & Exploited Children (NCMEC) reviewed roughly 450,000 files of child sexual abuse material, or CSAM. By 2019, that figure had grown to nearly 70 million.

Read More

In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash sharing would be a critical tool in helping us achieve that goal.

Read More
Learn

List of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.

Read More
Product Updates

Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.

Read More
Learn

We began this journey almost a decade ago and the scale of the problem continues to grow. We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.

Read More
Learn

The New York Times published a four-part series tackling the intricacies of child sexual abuse material in 2019. We're sharing it as a resource alongside our insights on who this impacts most and how.

Read More
Quick Guides

Our starter guide for companies looking to institute policies and processes to actively combat the spread of CSAM on their platform.

Read More

Let’s talk

Build a better internet with us.