In the last two years, generative AI has seen unprecedented advances. The technology has ushered in the ability to create content and spread ideas faster than ever before.
Clear trends emerged from Thorn’s latest research. Self-generated CSAM (SG-CSAM) is on the rise. Child predators are more brazen. Youth find safety tools insufficient. Understand the risks to your platform.
Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
On January 31, 2024, the CEOs of Meta, TikTok, Snap, and Discord testified at the U.S. Senate Judiciary Committee hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Essential.
For VSCO, building Safer into its infrastructure unlocked automation and moderation efficiencies for its trust & safety and content moderation teams.
Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
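In broad strokes, hashing and matching works by computing a fingerprint (hash) of each uploaded file and checking it against lists of hashes from previously verified CSAM. The sketch below is a minimal illustration using an exact-match cryptographic hash; KNOWN_HASHES and the function names are hypothetical, and production systems such as Safer also use perceptual hashing to catch visually similar copies of known content.

```python
import hashlib
from pathlib import Path

# Hypothetical hashlist of known CSAM fingerprints (the value below is just
# the MD5 of "hello"). Real deployments match against hash databases
# aggregated from trusted sources such as NCMEC and Thorn.
KNOWN_HASHES: set[str] = {"5d41402abc4b2a76b9719d911017c592"}

def md5_of_file(path: Path) -> str:
    """Compute a file's MD5 digest, streaming so memory use stays flat."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """True if the file's hash matches an entry in the hashlist."""
    return md5_of_file(path) in KNOWN_HASHES
```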
New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.
Addressing CSAM requires scalable tools to detect both known and unknown content.
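To make that distinction concrete, here is a hypothetical routing sketch: known content is caught by exact hash matching, while unknown (previously unseen) content requires a machine-learning classifier. Every name here is illustrative; classify_risk is a stub standing in for a trained model such as a CSAM image classifier.

```python
KNOWN_HASHES: set[str] = set()  # hashes of verified, previously seen CSAM

def classify_risk(image_bytes: bytes) -> float:
    """Stub: a real system would run a trained image classifier here."""
    return 0.0

def route_upload(file_hash: str, image_bytes: bytes) -> str:
    """Route an upload through both detection paths."""
    if file_hash in KNOWN_HASHES:
        return "known match: report and remove"   # hash matching
    if classify_risk(image_bytes) >= 0.9:
        return "possible new CSAM: human review"  # classifier for unseen content
    return "no action"
```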
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM on their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
Safer’s self-managed hashlists help customers optimize their CSAM detection. SaferList helps close the gap between when new CSAM is first reported and when it can be matched against.
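As an illustration of the idea (not Safer's actual API), a self-managed hashlist lets a platform begin matching a newly verified hash immediately, before it propagates to shared industry lists:

```python
# Hypothetical self-managed hashlist: fingerprints this platform's own
# moderators have verified, checked alongside shared industry lists.
self_managed: set[str] = set()

def add_verified_hash(file_hash: str) -> None:
    """Record a newly verified hash so future uploads match right away."""
    self_managed.add(file_hash)

def matches(file_hash: str, industry_lists: set[str]) -> bool:
    # The self-managed list covers the window between a new report and the
    # hash appearing in shared industry databases.
    return file_hash in self_managed or file_hash in industry_lists
```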
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.