Explore Thorn's predictions for online child safety in 2025, from regulatory changes to AI challenges, and how these trends will shape the future of trust and safety.
We asked 8,000 youth what they experience online: Grooming, sharing nudes, insufficient safety tools. Bad actors exploit online communities to target kids. Understand the risks to your users and your platform.
By John Starr, VP of Strategic Impact at Thorn, and Aaron Rodericks, Head of Trust & Safety at Bluesky. In the past few weeks, Bluesky saw…
Content moderation is crucial for business growth. Trust & Safety initiatives protect users, boost engagement, and improve company value.
Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Match.
Generative AI is introducing new vectors of risk for kids online, and it’s also unlocking critical tools to combat online child sexual abuse at scale.
Join Thorn's Sr. Research Manager, Amanda Goharian, and VP of Customers & Strategic Partnerships, Amanda Volz, for a comprehensive overview of generative AI risks and strategies for mitigating these dangers on your platform.
John Starr puts several trust and safety leaders in the “hot spot” for a lightning round of questions to get their POV on what it’s like to work in trust and safety.
Discover the human side of trust and safety in this candid conversation between Patricia Cartes, Head of Trust & Safety at Cantina AI, and host John Starr, Thorn’s VP of Strategic Impact.
The CSAM Keyword Hub contains words and phrases related to CSAM and child sexual exploitation, boosting content moderation efforts.
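To illustrate how a keyword resource like this might be wired into a moderation pipeline, here is a minimal Python sketch that flags text containing listed terms and routes hits to human review. The placeholder terms, the pattern, and the routing step are illustrative assumptions, not contents or APIs of the actual CSAM Keyword Hub.

```python
import re

# Placeholder terms only; the real CSAM Keyword Hub is distributed by Thorn
# and is intentionally not reproduced here.
KEYWORD_LIST = ["example-term-1", "example term two"]

# One case-insensitive pattern with word boundaries, so single terms and
# multi-word phrases are matched the same way.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(term) for term in KEYWORD_LIST) + r")\b",
    re.IGNORECASE,
)

def flag_text(text: str) -> list[str]:
    """Return every listed term found in a piece of user-generated text."""
    return [match.group(0) for match in PATTERN.finditer(text)]

# Example: anything with a hit goes to a trust & safety review queue.
if flag_text("user post goes here"):
    print("Escalate to human review")
```

In practice, keyword hits are usually treated as a signal for prioritizing human review rather than grounds for automated removal, since many terms are ambiguous out of context.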
Discover the human side of trust and safety in this candid conversation between Jerrel Peterson, Director of Content Policy at Spotify, and host John Starr, Thorn’s VP of Strategic Impact.
Discover the human side of trust and safety in this candid conversation between Yoel Roth, VP at Match Group, and host John Starr, Thorn’s VP of Strategic Impact.
At Thorn, we want to make online spaces safer. As part of the trust and safety ecosystem, we understand that the challenge of protecting users from…
Our latest solution, Safer Predict, uses cutting-edge AI to detect new and unreported CSAM as well as harmful text conversations.
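As a rough illustration of how classifier scores like these could be consumed, here is a short Python sketch that maps image and text risk scores to moderation actions. The `Prediction` fields, thresholds, and action names are hypothetical and do not reflect Safer Predict's actual API or scoring.

```python
from dataclasses import dataclass

@dataclass
class Prediction:
    content_id: str
    csam_score: float       # hypothetical probability that an image/video is CSAM
    text_risk_score: float  # hypothetical probability that a conversation is exploitative

# Hypothetical thresholds; in practice these are tuned per platform to balance
# reviewer workload against recall.
AUTO_ESCALATE = 0.90
HUMAN_REVIEW = 0.60

def route(prediction: Prediction) -> str:
    """Map classifier scores to a moderation action."""
    top = max(prediction.csam_score, prediction.text_risk_score)
    if top >= AUTO_ESCALATE:
        return "escalate"      # prioritize for immediate review and reporting
    if top >= HUMAN_REVIEW:
        return "review_queue"  # send to human moderators
    return "no_action"

print(route(Prediction("content_123", csam_score=0.95, text_risk_score=0.10)))
```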
ON-DEMAND | Featuring Rob Wang, Senior Manager of Data Science and Amanda Volz, VP of Customers and Strategic Partnerships
Thorn has developed a resource containing child sexual abuse material (CSAM) terms and phrases in multiple languages to use in your child safety mitigations.
In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
ON-DEMAND | Hosted by Thorn, in collaboration with the National Center for Missing and Exploited Children
In 2023, NCMEC’s CyberTipline received a staggering 36.2 million reports of suspected child sexual exploitation.
Platform safety tools — like blocking and reporting — are often a child’s first choice for responding to a harmful sexual interaction online. Instead of seeking support…
Understand how to prevent risky situations involving youth and bad actors with insights from Thorn’s latest brief for digital platforms.
The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.
Learn how easy access to children online has given rise to new types of perpetrators of child sexual abuse.
Four considerations for Trust and Safety teams at digital platforms as they review their child safety policies.
Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
For VSCO, building Safer into its infrastructure unlocked automated solutions and moderation efficiencies for its trust and safety and content moderation teams.
Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
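A minimal sketch of the hash-and-match idea, assuming a set of known hash values is already available: compute a digest of each upload and check it against that set. Production systems typically also match perceptual hashes, which tolerate re-encoding and minor edits, in addition to the cryptographic hash shown here.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known hash values, e.g. synced from a hash-sharing
# program; real deployments match against many millions of hashes.
KNOWN_HASHES: set[str] = set()

def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of an uploaded file, reading in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: Path) -> bool:
    """Exact-match the upload's digest against the known-hash set."""
    return file_hash(path) in KNOWN_HASHES
```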
New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.
Addressing CSAM requires scalable tools to detect both known and unknown content.
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is first reported and when it becomes available for matching.
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.
If any of these signals apply to your platform, it’s worth reviewing your company’s policies and procedures for handling CSAM, and it may be time to consider proactive CSAM detection.
Safer by Thorn is available in the AWS Marketplace. Safer integrates with your AWS Cloud infrastructure for better control, scalability, and security.
Detecting CSAM within video content presents unique challenges. To solve this, Thorn's engineers developed a proprietary scene-sensitive video hashing technology called SSVH.
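SSVH itself is proprietary, so the sketch below only illustrates the general idea behind frame- or scene-level video hashing: sample frames, hash each one, and compare the resulting hash sequence against a known list so a match can still be found if a video is trimmed or re-cut. The sampling interval, hash choice, and `overlap` helper are illustrative assumptions, not Thorn's implementation.

```python
import hashlib
import cv2  # OpenCV, used here only to decode video frames

def frame_hashes(video_path: str, every_n_frames: int = 30) -> list[str]:
    """Hash one frame per sampling interval so partial matches survive trimming."""
    capture = cv2.VideoCapture(video_path)
    hashes, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            # A production system would use a perceptual hash here so that
            # re-encoding or resizing does not change the value.
            hashes.append(hashlib.sha256(frame.tobytes()).hexdigest())
        index += 1
    capture.release()
    return hashes

def overlap(candidate: list[str], known: set[str]) -> float:
    """Fraction of a video's sampled frame hashes that appear in a known list."""
    return sum(h in known for h in candidate) / max(len(candidate), 1)
```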
List of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.
We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.
In 2019, The New York Times published a four-part series tackling the intricacies of child sexual abuse material. We're sharing it as a resource alongside our insights on who this impacts most and how.