Post By: Safer / 3 min read
The scale of CSAM
In 2019, the New York Times published a four-part series tackling the intricacies of child sexual abuse material, or CSAM, on the internet from the perspectives of victims, parents, technology companies, government, and NGOs.
This was one of the most in-depth exposés on the issue published to date by a major outlet, catching the attention of everyone from casual newsreaders to executives in Silicon Valley. As a non-profit organization committed to eliminating child sexual abuse material from the internet, Thorn saw in this heightened awareness, and the collective call for action directed at both businesses and government, an opportunity to engage more publicly than ever before.
This epidemic shows no signs of slowing down. Since the New York Times series was published, the National Center for Missing and Exploited Children (NCMEC) reported 69.1 million files of child sexual abuse imagery in 2019, a 53% increase over the 2018 figures cited by the Times. Furthermore, as bandwidth continues to improve, the issue continues to evolve: the rise in CSAM videos is drastically outpacing that of images. 2019 became the first year in which the number of videos, a record 42.3 million, exceeded the number of images, and not just by a little, but by 47.5%.
“You’re just trying to feel O.K. and not let something like this define your whole life. But the thing with the pictures is — that’s the thing that keeps this alive.”—Survivor, The New York Times
Thorn has been working at the intersection of non-profits, law enforcement, and the technology industry for nearly a decade with a mission to build technology that defends children from sexual abuse. The ways we have seen this abuse manifest, first in the trafficking space and then online as currency to be consumed and traded globally, indicate a dire need for solutions, in both technology and policy, that can address the scale of the problem.
The viral spread of child sexual abuse material (CSAM) on the web presents both a human and a business challenge. The human impact is devastating, and for content-hosting platforms, keeping up with the proliferation of abuse material as technology evolves can seem nearly impossible. Each time abuse content is shared, child victims are subjected to retraumatization, powerless to stop its circulation. In a 2017 survey conducted by the Canadian Centre for Child Protection, survivors of child sexual abuse described their fear of being recognized through their abuse content for as long as it remains in circulation; nearly 70% of the survivors surveyed worried about being identified this way. Moreover, exposure to abuse content can create a network effect of secondary trauma for those who unintentionally encounter it.
So where do we go from here?
The solution, as many have learned, isn't as clear as those new to this issue may think. There is no easy way to talk about child sexual abuse, not in a way that softens the message to be any less upsetting or uncomfortable. Yet the things that cause us the greatest discomfort are often a sign that they deserve to be confronted now more than ever. The spread of child sexual abuse material can only be stopped by mobilizing a community working together to protect children. Only then do we create a world where every child can simply be a kid.
Thorn is dedicated to equipping industry with the tools to combat the spread of CSAM on their platforms, and to that end has announced the public launch of Safer, a comprehensive tool to identify, remove, and report CSAM.