How GIPHY uses Safer by Thorn to proactively detect CSAM
GIPHY proactively detects CSAM with Safer to deliver on its promise of being a source for content that makes conversations more positive.
Access 37,000+ CSAM keywords in multiple languages
Does your platform have messaging, search, or generative prompt functionality? Get access to our free CSAM keyword hub, containing terms and phrases in multiple languages, to use in your child safety mitigations.
Find research, guides, product updates, and more to support you in making a safer internet.
Everest Group has recognized Safer by Thorn as one of its content moderation technology trailblazers. Read why our purpose-built solutions made the short list.
CSAM has distinct characteristics that call for purpose-built solutions, such as CSAM classifiers. Thorn's data science team dives into the details.
There are several child safety organizations that make their CSAM hash lists available to select partners via sharing programs. Learn more.
Know the signs that could indicate your platform is at risk of hosting CSAM images and videos. Understand what types of sites are most at risk.
Online child sexual abuse and exploitation has continued to rise for more than a decade. Understand the scale of the issue and the emerging trends to watch for.
A glossary of special terms and acronyms used within the child safety ecosystem. What's CSAM? CSEA? CSAI? Do they refer to different harms?
This 7-part series is an essential guide to understanding CSAM and the available detection strategies leveraged by trust and safety professionals.
Explore Thorn's predictions for online child safety in 2025, from regulatory changes to AI challenges, and how these trends will shape the future of trust and safety.
By John Starr, VP of Strategic Impact at Thorn, and Aaron Rodericks, Head of Trust & Safety at Bluesky. In the past few weeks, Bluesky saw
Content moderation is crucial for business growth. Trust & Safety initiatives protect users, boost engagement, and improve company value.
Announcing the launch of our API-based solution for proactive detection of child sexual abuse material (CSAM): Safer Match.
Generative AI is introducing new vectors of risk for kids online, and it’s also unlocking critical tools to combat online child sexual abuse at scale.
Join Thorn's Sr. Research Manager, Amanda Goharian, and VP of Customers & Strategic Partnerships, Amanda Volz, for a comprehensive overview of generative AI risks and strategies for mitigating these dangers on your platform.
John Starr puts several trust and safety leaders in the “hot spot” for a lightning round of questions to get their POV on what it’s like to work in trust and safety.
Discover the human side of trust and safety in this candid conversation between Patricia Cartes, Head of Trust & Safety at Cantina AI, and host John Starr, Thorn’s VP of Strategic Impact.
The CSAM Keyword Hub contains words and phrases related to CSAM and child sexual exploitation, boosting content moderation efforts.
Discover the human side of trust and safety in this candid conversation between Jerrel Peterson, Director of Content Policy at Spotify, and host John Starr, Thorn’s VP of Strategic Impact.
Discover the human side of trust and safety in this candid conversation between Yoel Roth, VP at Match Group, and host John Starr, Thorn’s VP of Strategic Impact.
At Thorn, we want to make online spaces safer. As part of the trust and safety ecosystem, we understand that the challenge of protecting users from
Our latest solution, Safer Predict, uses cutting-edge AI to detect new and unreported CSAM as well as harmful text conversations.
ON-DEMAND | Featuring Rob Wang, Senior Manager of Data Science, and Amanda Volz, VP of Customers and Strategic Partnerships
Thorn has developed a resource containing child sexual abuse material (CSAM) terms and phrases in multiple languages to use in your child safety mitigations.
In 2023, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
ON-DEMAND | Hosted by Thorn, in collaboration with the National Center for Missing and Exploited Children
In 2023, NCMEC’s CyberTipline received a staggering 36.2 million reports of suspected child sexual exploitation.
Platform safety tools — like blocking and reporting — are often a child’s first choice for responding to a harmful sexual interaction online. Instead of seeking support
Understand how to prevent risky situations involving youth and bad actors with insights from Thorn’s latest brief for digital platforms.
The REPORT Act is now federal law. We provide details about its components and explain how it will impact online platforms.
Learn how easy access to children online has given rise to new types of child sexual perpetrators.
Four considerations for trust and safety teams at digital platforms as they review their child safety policies.
Thorn's policy team explains the Kids Online Safety Act (KOSA) and how the provisions in this bill may impact digital platforms.
On January 31, the CEOs of Meta, TikTok, Snap, and Discord testified during the hearing, "Big Tech and the Online Child Sexual Exploitation Crisis."
Watch Dr. Rebecca Portnoff’s keynote at AWS re:Invent 2023 to learn how Thorn is using machine learning to detect CSAM.
For VSCO, building Safer into its infrastructure unlocked automated solutions and moderation efficiencies for its trust and safety and content moderation teams.
Detect known CSAM using hashing and matching, sometimes referred to as CSAM scanning. Learn how it works.
New report highlights findings from Thorn’s latest research and offers recommendations for addressing online sexual threats to children.
In 2022, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM from their platforms.
Flickr’s Trust & Safety team uses Safer’s CSAM Image Classifier to detect and remove previously unknown child abuse content from their platform.
Detect CSAM and send reports to the Royal Canadian Mounted Police from Safer, an all-in-one solution for CSAM moderation.
Safer is a flexible suite of tools designed to support your company’s processes and scale your child sexual abuse material (CSAM) elimination efforts.
In 2021, Safer empowered content moderators and trust & safety professionals to detect, report, and remove CSAM from their content-hosting platforms.
Safer by Thorn is available in the AWS Marketplace. Safer integrates with your AWS Cloud infrastructure for better control, scalability, and security.
Detecting CSAM within video content presents unique challenges. To solve this, Thorn's engineers developed a proprietary hashing technology called SSVH.
Our first version of Safer includes end-to-end functionality to support the identification, removal, and reporting of CSAM at scale and in real-time. Comprehensive coverage begins with proactive detection. Read more about the features we've released.
Learn how our trust and safety solutions can be tailored to your challenges.
Our child sexual abuse and exploitation solutions are powered by original research, trusted data, and proprietary technology. Let’s build a safer internet together. Your next step starts here.