Product Updates Enhancing Platform Safety: Insights from Safer Predict's Text Detection Beta Period At Thorn, we want to make online spaces safer. As part of the trust and safety ecosystem, we understand the challenge of protecting users from harmful content, especially child sexual abuse material (CSAM).
Product Updates Announcing Safer Predict: AI-Driven CSAM & CSE Detection Our latest solution, Safer Predict, uses cutting-edge AI to detect new and unreported CSAM as well as harmful text conversations.
Product Updates Safer’s 2023 Impact Report In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Product Updates Thorn’s Head of Data Science discusses how machine learning can support child safety on content-hosting platforms Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
Product Updates Announcing Thorn-Hosted Safer Match, API-Based CSAM Detection Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Match.
Product Updates Safer’s 2022 Impact Report In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Product Updates Announcing RCMP Reporting via Safer Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Product Updates Safer’s 2021 Impact Report In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.
Product Updates Announcing Safer for Detection and Post-Detection of CSAM Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.