Tools

CSAM Keyword Hub

Thorn has developed a resource containing child sexual abuse material (CSAM) terms and phrases in multiple languages to use in your child safety mitigations.

Get Access
FAQs

1 What can I do with the CSAM Keyword Hub?

Digital platforms with a chat function that are seeking to protect children from online predation can apply for access to the hub. Law enforcement agencies and non-governmental organizations (NGOs) are also eligible to apply.

2 How should the CSAM Keyword Hub be used?

Rather than using the CSAM Keyword Hub strictly to block content that matches listed keywords, our strong preference is that the list be used to kickstart the training of machine learning models. We outline other potential use cases in this article. In this way, we hope companies can proactively detect and remove CSAM.
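As a minimal sketch of this idea (the keyword list and messages here are hypothetical placeholders, not hub content), list matches can be used as weak labels to assemble training data for a classifier, rather than as triggers for automatic blocking:

```python
# Minimal sketch, assuming a hypothetical keyword list and message stream.
# Instead of hard-blocking exact matches, treat matches as weak labels
# that seed a training set for a downstream machine learning model.
import re

def weak_label(messages, keywords):
    """Return (text, label) pairs: 1 if any listed keyword appears, else 0.

    Matching is whole-word and case-insensitive; the output labels feed
    model training rather than an automatic block.
    """
    patterns = [re.compile(r"\b" + re.escape(k) + r"\b", re.IGNORECASE)
                for k in keywords]
    return [(m, int(any(p.search(m) for p in patterns))) for m in messages]

# Placeholder terms stand in for entries from the hub.
labeled = weak_label(
    ["a benign chat message", "a message containing example-term"],
    ["example-term"],
)
```

A model trained on such weakly labeled data can then generalize beyond exact keyword matches, which is the proactive-detection behavior described above.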

3 What kind of terms can be added?

Terms must be relevant to the detection and elimination of online material involving sex trafficking, exploitation, or the endangerment of children, and should help identify, prevent, or remove that material. Terms may include the identifier of a known child abuse image or video (for example, a filename, MD5 hash, or street name). PII included in CSAI terms (such as a series name) is allowed, but PII that includes user account information is not. Once added, terms are intended to remain in the hub even if a contributing company later stops participating.
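Hash identifiers like the MD5 values mentioned above are typically used by comparing a file's digest against a set of known hashes. A minimal sketch (the hash set here is a hypothetical placeholder, not real hub data):

```python
# Minimal sketch, assuming a hypothetical set of known-content hashes.
import hashlib

def md5_hex(data: bytes) -> str:
    # MD5 serves here purely as a content identifier, matching the hash
    # format referenced above, not as a cryptographic security primitive.
    return hashlib.md5(data).hexdigest()

def is_known(data: bytes, known_hashes: set) -> bool:
    """True if the file's MD5 digest appears in the known-hash set."""
    return md5_hex(data) in known_hashes

# Placeholder hash set stands in for identifiers from the hub.
known = {md5_hex(b"example file contents")}
```

Exact-hash matching only catches byte-identical files; that limitation is one reason the hub's terms are meant to complement, not replace, other detection methods.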

4 Who can participate?

This resource is available to approved Internet service companies and NGOs that have a core mission of combatting CSAM.

Apply Now

Anyone interested in accessing the CSAM Keyword Hub must complete the application on this page, share their intent of use, and agree to the terms and conditions.