Child sexual abuse and exploitation is on the rise. Thorn equips platforms with tools and expert guidance to confront these online harms.
Reports of online enticement to the National Center for Missing and Exploited Children’s (NCMEC) CyberTipline increased more than 300% between 2021 and 2023. In 2023, NCMEC received more than 104 million images and videos of suspected child sexual abuse from electronic service providers (ESPs) alone.
ESP reports constitute the majority of what NCMEC receives, showing that content-hosting platforms are critical partners in addressing online sexual harms against children.
Bad actors continually evolve both the tactics they use to sexually abuse and exploit children and the ways they exploit platforms, while platform protection remains inconsistent across the industry. Companies that attempt to build their own solutions quickly learn that the endeavor is costly: without deep issue expertise, and working from incomplete or siloed data sets, they end up with tools that only partially address the problem.
Thorn is an innovative technology nonprofit transforming how children are protected from sexual abuse and exploitation in the digital age. Thorn built Safer to equip digital platforms with purpose-built solutions to detect child sexual abuse material (CSAM) and exploitation. To date, Safer has identified more than 6 million potential CSAM files on customer platforms.
Together, we’re building a safer internet.
With a relentless focus on child sexual abuse and exploitation, Safer by Thorn provides technology companies with solutions to help mitigate the risks of hosting CSAM or being misused to sexually exploit children.
“GoDaddy is proud to be a part of Thorn’s Safer community. Using their services, we can detect and remove CSEA content faster and safely share knowledge in the community in order to keep the Internet a safe and enjoyable place, especially for children.”
Learn how our trust and safety solutions can be tailored to your challenges
Our child sexual abuse and exploitation solutions are powered by original research, trusted data, and proprietary technology. Let’s build a safer internet together. Your next step starts here.