Mar 13 2023

Safer’s 2022 Impact Report

Post By: Safer / 3 min read


Thorn created Safer with the intention of transforming the internet by finding and removing child sexual abuse material (CSAM), defending against revictimization, and diminishing the viral spread of new material. In 2022, with the help of our customers, we made progress toward this goal. Our customers range in industry and product focus, but they all have one thing in common: an upload button.

Content moderators and trust & safety professionals use Safer to detect, review, and report CSAM at scale. With millions of files being uploaded every day to their platforms, Safer’s customers rely on our comprehensive hash database and advanced AI/ML models to help them programmatically find known and previously unknown CSAM. In 2022, our customers made incredible strides. Together, we’re building a better internet.

New Features Launched in 2022

Safer had two major milestones in 2022 with the release of our CSAM Video Classifier, a machine learning classification model, and the expansion of our reporting capabilities to Royal Canadian Mounted Police (RCMP).

To understand the content contained within a video, we use a perceptual Scene-Sensitive Video Hashing (SSVH) technique to hash each of the scenes and frames within a video. Our CSAM Video Classifier then analyzes those hashes and returns a score that indicates the likelihood of that scene containing CSAM. The content is then reviewed by a moderator, who verifies whether the flagged content is CSAM. Machine learning classifiers like this are a powerful tool for detecting previously unknown content. In 2022, our customers detected 15,238 videos classified as potential CSAM.
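The hash-score-review flow described above can be sketched in a few lines. This is an illustrative pipeline only: the function names, the stand-in hashing and scoring logic, and the review threshold are assumptions for demonstration, not Safer's actual API or models.

```python
import hashlib

# Scenes scoring at or above this threshold are queued for a human moderator.
# The value is illustrative, not Safer's actual operating point.
REVIEW_THRESHOLD = 0.8

def hash_scenes(scene_frames):
    """Stand-in for perceptual scene hashing: one hash per scene.
    A real SSVH produces perceptual hashes robust to small visual changes;
    a cryptographic hash is used here only to keep the sketch runnable."""
    return [hashlib.sha256(frame).hexdigest() for frame in scene_frames]

def classify_scene(scene_hash):
    """Stand-in for a classifier that returns a CSAM-likelihood score in [0, 1].
    Here we derive a fake score from the hash purely for demonstration."""
    return int(scene_hash[:2], 16) / 255.0

def flag_for_review(scene_frames):
    """Return the indices of scenes a moderator should review."""
    flagged = []
    for i, scene_hash in enumerate(hash_scenes(scene_frames)):
        if classify_scene(scene_hash) >= REVIEW_THRESHOLD:
            flagged.append(i)
    return flagged
```

The key design point the sketch captures is that the classifier never makes the final call: it only prioritizes content, and a human moderator verifies every flagged scene.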

The other major milestone in 2022 was Safer’s new reporting capabilities that enable Canadian tech companies to send reports to RCMP. This was the first expansion of Safer reporting beyond the National Center for Missing and Exploited Children (NCMEC), and is a critical step toward ensuring all platforms have the capability to detect and respond to CSAM – no matter where they’re located. The addition of RCMP reporting provides intelligence that can have a life-saving impact and lead to the rescue of child victims.


Hash Matching

Hashing and matching are Safer’s core services. With the largest database of verified hashes (32+ million hashes) to match against, Safer can cast the widest net to detect known CSAM. In 2022, we hashed more than 42.1 billion images and videos for our customers. That empowered our customers to find 520,000 files of known CSAM on their platforms.
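Conceptually, matching works by comparing a hash of each uploaded file against a list of verified hashes. The sketch below is a minimal illustration under assumed names, not Safer's implementation: it shows an exact-hash lookup alongside a near-duplicate check using Hamming distance on toy 64-bit perceptual hashes, which is the general idea behind matching visually similar content.

```python
import hashlib

def exact_match(file_bytes, known_digests):
    """Exact matching: hash the upload and look it up in the verified set."""
    digest = hashlib.md5(file_bytes).hexdigest()
    return digest in known_digests

def hamming(a, b):
    """Number of differing bits between two integer perceptual hashes."""
    return bin(a ^ b).count("1")

def perceptual_match(phash, known_phashes, max_distance=4):
    """Near-duplicate matching: a small bit distance to any known
    perceptual hash counts as a hit, so lightly altered copies
    (resized, re-encoded) can still be detected."""
    return any(hamming(phash, known) <= max_distance for known in known_phashes)
```

Exact hashes only catch byte-identical files; perceptual hashes are what let a system keep matching content that has been re-encoded or slightly modified, which matters for diminishing viral spread.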

520,000 total CSAM images and videos matched in 2022

We’re also working to break down data silos with SaferList, a self-managed set of hashlists to which our customers can contribute verified hashes. Our customers have the option to share this data among Safer’s community to increase cross-platform intelligence and diminish the viral spread of CSAM.

Image and Video Classifiers

In addition to detecting known CSAM, our classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review by content moderators.

In 2022, 304,466 images and 15,238 videos were classified as potential CSAM.

The use of classifiers enables our customers to find previously unknown CSAM. In 2022, our classifier detected 304,466 images of potential CSAM—a 205% increase compared to 2021—that were previously unknown. This increase in images classified and the addition of the CSAM Video Classifier this year represent major steps toward our goal of eliminating CSAM from the open web.

Reporting

As an all-in-one solution to detect, review, and report CSAM, Safer enables customers to validate, compile, and send reports to the National Center for Missing and Exploited Children (NCMEC)—and now RCMP—directly from our tool. In 2022, Safer customers sent 60,829 reports containing 74,265 files to NCMEC.

In 2022, 60,829 reports were facilitated by Safer to NCMEC.

In 2022, NCMEC’s CyberTipline received more than 31.8 million reports from electronic service providers (ESPs) alone. These reports constitute 99% of reports received by NCMEC and show that content-hosting platforms are critical partners in addressing this issue. With Safer, Thorn is equipping the tech industry with an all-in-one solution to address CSAM on their platforms at scale.
