Learn 2025 Online Child Safety Predictions for Trust and Safety Explore Thorn's predictions for online child safety in 2025, from regulatory changes to AI challenges, and how these trends will shape the future of trust and safety.
Learn How Teams are Making the Business Case for Investing in Trust & Safety Content moderation is crucial for business growth. Trust & Safety initiatives protect users, boost engagement, and improve company value.
Learn Understanding Gen AI’s Risks to Child Safety on Digital Platforms Generative AI is introducing new vectors of risk for kids online, and it’s also unlocking critical tools to combat online child sexual abuse at scale.
Learn Safe Space Lightning Rounds with Trust & Safety Leaders John Starr puts several trust and safety leaders in the “hot seat” for a lightning round of questions to get their POV on what it’s like to work in trust and safety.
Learn Safe Space: Talking Trust & Safety with Patricia Cartes Discover the human side of trust and safety in this candid conversation between Patricia Cartes, Head of Trust & Safety at Cantina AI, and host John Starr, Thorn’s VP of Strategic Impact.
Learn Introducing the CSAM Keyword Hub: A Free Collection of Words Related to Child Safety Risks The CSAM Keyword Hub contains words and phrases related to CSAM and child sexual exploitation, boosting content moderation efforts.
Learn Safe Space: Talking Trust & Safety with Jerrel Peterson Discover the human side of trust and safety in this candid conversation between Jerrel Peterson, Director of Content Policy at Spotify, and host John Starr, Thorn’s VP of Strategic Impact.
Learn Safe Space: Talking Trust & Safety with Yoel Roth Discover the human side of trust and safety in this candid conversation between Yoel Roth, VP at Match Group, and host John Starr, Thorn’s VP of Strategic Impact.
Learn The Dual Role of Technology: Thorn’s Insights From NCMEC’s 2023 CyberTipline Report In 2023, NCMEC’s CyberTipline received a staggering 36.2 million reports of suspected child sexual exploitation.
Learn Youth Tell the Truth About Safety Tools: Advice on How to Improve These Tools From Actual Teens Platform safety tools, like blocking and reporting, are often a child’s first choice for responding to a harmful sexual interaction online. Instead of seeking support from a parent or caregiver, many young people turn to these in-platform tools first.
Learn Safeguard Youth and Protect Your Platform Understand how to prevent risky situations involving youth and bad actors with insights from Thorn’s latest brief for digital platforms.
Learn The REPORT Act Is Now Federal Law – Here’s What It Means for Online Platforms The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.
Learn Unmasking the Perpetrators Online: Profiles of Bad Actors for Use by Trust and Safety Learn how easy access to children online has given rise to new types of child sexual perpetrators.
Learn 4 Considerations for Improving Your Child Safety Policies Four considerations for Trust and Safety teams at digital platforms as they review and update their child safety policies.
Learn The Kids Online Safety Act (KOSA) Explained: What the Drafted Bill Could Mean for Your Online Platform Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
Learn Key Takeaways from the Online Child Sexual Exploitation Hearing with Social Media CEOs On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Learn Thorn’s Head of Data Science discusses how machine learning can support child safety on content-hosting platforms Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
Learn Hashing and Matching is Core to Proactive CSAM Detection Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
Learn Comprehensive CSAM Detection Combines Hashing and Matching with Classifiers Addressing CSAM requires scalable tools to detect both known and unknown content.
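The two entries above describe the core detection pattern: hash incoming files, match them against lists of verified CSAM hashes, and route novel content through a classifier. The sketch below illustrates that flow under loud assumptions: SHA-256 exact matching stands in for the perceptual hashing (e.g., PhotoDNA- or PDQ-style) that production systems actually use, and score_image is a hypothetical classifier stub, not Thorn's Safer API.

```python
# Minimal sketch of hashing-and-matching with a classifier fallback.
# Assumptions: SHA-256 exact matching is a stand-in for perceptual hashing,
# and score_image is a hypothetical classifier stub (not Safer's API).
import hashlib
from pathlib import Path

# Placeholder entry; real lists come from vetted hash-sharing programs,
# never hard-coded values.
KNOWN_HASHES = {"0" * 64}


def sha256_file(path: Path) -> str:
    """Hash a file's bytes; exact matching only catches unaltered copies."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def score_image(path: Path) -> float:
    """Placeholder for a classifier that scores novel (unknown) content."""
    return 0.0  # a real model would return a probability-like score


def review_upload(path: Path, threshold: float = 0.9) -> str:
    """Route an upload: known-hash match, classifier flag, or no action."""
    if sha256_file(path) in KNOWN_HASHES:
        return "match-known"       # matched previously verified content
    if score_image(path) >= threshold:
        return "flag-for-review"   # classifier suspects novel abuse content
    return "no-action"
```

Exact cryptographic hashes only catch byte-identical copies; perceptual hashes tolerate resizing and re-encoding, which is why real deployments pair them with classifiers for content no hash list has seen yet.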
Learn Safer’s Self-Hosted Deployment Provides Control, Security and Scalability Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Learn Optimize CSAM Detection with SaferList Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps close the gap between when new CSAM is reported and when it can be matched against.
Learn Four signs your platform should be proactively detecting child sexual abuse material If any of these signs apply to your platform, it’s worth reviewing your company’s policies and procedures for handling CSAM and considering proactive CSAM detection.
Learn The challenge of detecting CSAM videos and what we can do about it today Detecting CSAM within video content presents unique challenges. To solve this, Thorn's engineers developed a proprietary hashing technology called Scene-Sensitive Video Hashing (SSVH).
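SSVH itself is proprietary, so the sketch below is only a generic illustration of the underlying problem: a single file hash is useless for video because any re-encode changes every byte, so detection has to work at the frame or scene level. It assumes OpenCV (opencv-python) is installed, uses exact SHA-256 frame hashes purely for demonstration, and the function names frame_hashes and any_known_scene are hypothetical, not part of Safer.

```python
# Generic illustration of frame-level hashing for video, assuming OpenCV.
# This is NOT Thorn's SSVH; it only shows why video needs frame/scene-aware
# hashing rather than one hash of the whole file.
import hashlib

import cv2  # pip install opencv-python


def frame_hashes(video_path: str, every_n_seconds: float = 1.0) -> list[str]:
    """Sample one frame per interval and hash its raw pixels.

    Exact hashes of decoded frames are fragile (any re-encode breaks them);
    production systems hash perceptually and per scene instead.
    """
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    step = max(int(fps * every_n_seconds), 1)
    hashes, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            hashes.append(hashlib.sha256(frame.tobytes()).hexdigest())
        index += 1
    cap.release()
    return hashes


def any_known_scene(video_path: str, known_hashes: set[str]) -> bool:
    """Flag the video if any sampled frame hash matches a known hash."""
    return any(h in known_hashes for h in frame_hashes(video_path))
```

Sampling frames keeps the work proportional to video length, but matching only succeeds if the sampled frames survive re-encoding unchanged, which is exactly the limitation scene-aware perceptual hashing is designed to overcome.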
Learn What is CSAM? Child Safety Terms & Definitions List of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
Learn Safer: Building the internet we deserve We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.