Safer’s 2024 Impact Report
In 2024, Safer empowered content moderators and trust & safety professionals to detect, review, and report CSAM and CSE on their platforms.
Child sexual abuse and exploitation is on the rise. Thorn equips platforms with tools and expert guidance to confront these online harms.
Reports of online enticement to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline increased more than 300% between 2021 and 2023. In 2023, NCMEC received more than 104 million images and videos of suspected child sexual abuse from electronic service providers (ESPs) alone.
Reports from ESPs constitute the majority of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing online sexual harms against children.
Bad actors continually evolve the tactics they use to sexually abuse and exploit children and the ways in which they misuse platforms, while platform protection remains inconsistent across the industry. Companies that attempt to build their own solutions quickly learn that the endeavor is costly, and that a lack of issue expertise and incomplete or siloed data sets leaves them with tools that only partially address the issue.
Thorn is an innovative technology nonprofit transforming how children are protected from sexual abuse and exploitation in the digital age. Thorn built Safer to equip digital platforms with purpose-built solutions to detect child sexual abuse and exploitation. To date, Safer has identified 6.4 million potential CSAM files and 3,184 potential instances of text-based child exploitation on customer platforms.
Together, we’re building a safer internet.
With a relentless focus on child sexual abuse and exploitation, Safer by Thorn provides technology companies with solutions to help mitigate the risks of hosting CSAM or being misused to sexually exploit children.
“GoDaddy is proud to be a part of Thorn’s Safer community. Using their services, we can detect and remove CSEA content faster and safely share knowledge in the community in order to keep the Internet a safe and enjoyable place, especially for children.”
Explore new data on online child sexual exploitation and the evolving risks youth face.
Find out why content moderation matters for every platform, and get a step-by-step approach to finding your content moderation solution.
Learn how our trust and safety solutions can be tailored to your challenges.
Our child sexual abuse and exploitation solutions are powered by original research, trusted data, and proprietary technology. Let’s build a safer internet together. Your next step starts here.