Learn Understanding Gen AI’s Risks to Child Safety on Digital Platforms In the last two years, generative AI has seen unprecedented advances. The technology has ushered in the ability to create content and spread ideas faster than ever before. Yet these same capabilities carry critical implications for child safety. In short, generative AI is introducing new
Learn The Kids Online Safety Act (KOSA) Explained: What the Drafted Bill Could Mean for Your Online Platform Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
Learn Key Takeaways from the Online Child Sexual Exploitation Hearing with Social Media CEOs On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Learn Hashing and Matching is Core to Proactive CSAM Detection Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
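The hashing-and-matching approach named above can be illustrated with a minimal sketch: hash incoming content and check it against a list of known hashes. This is a simplified illustration using cryptographic SHA-256 hashes and a placeholder hashlist, not Safer's actual implementation (production systems also use perceptual hashing to match visually similar files):

```python
import hashlib

# Hypothetical hashlist of known-content digests. In practice these are
# sourced from vetted organizations; the value below is just the SHA-256
# digest of the bytes b"foo", used as a stand-in.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes, hashlist: set[str] = KNOWN_HASHES) -> bool:
    """Exact-match detection: flag content whose hash appears on the list."""
    return sha256_of(data) in hashlist

print(is_known(b"foo"))            # on the placeholder list -> True
print(is_known(b"benign upload"))  # not on the list -> False
```

Note that an exact cryptographic hash only matches byte-identical files; that limitation is why comprehensive detection pairs hash matching with other techniques, as the next resource discusses.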
Learn Comprehensive CSAM Detection Combines Hashing and Matching with Classifiers Addressing CSAM requires scalable tools to detect both known and unknown content.
Learn Safer’s Self-Hosted Deployment Provides Control, Security and Scalability Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Learn Optimize CSAM Detection with SaferList Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps fill the gap between when new CSAM is reported and when it can be matched against.
Learn Four signs your platform should be proactively detecting child sexual abuse material If any of these signals resonates for your platform, it’s worth taking a look at your company’s policies and procedures for handling CSAM, and it might be time to consider proactive CSAM detection.
Blog The challenge of detecting CSAM videos and what we can do about it today In 2019, Thorn CEO Julie Cordua delivered a TED talk about eliminating child sexual abuse material from the internet. In that talk, she explained how hash sharing would be a critical tool in helping us achieve that goal.
Learn Common Terms and Definitions List of common terms and definitions across the issue and technical space, alongside a global list of child protection organizations.
Learn Safer: Building the internet we deserve We began this journey almost a decade ago, and the scale of the problem continues to grow. We're activating the larger technology ecosystem with tools to fight the spread of CSAM on platforms and eliminate it from the internet for good.
Learn A problem of epidemic proportions In 2019, The New York Times published a four-part series tackling the intricacies of child sexual abuse material. We're sharing it as a resource alongside our insights on who this impacts most and how.