Jan 2 2025

What is CSAM? Child Safety Terms & Definitions

Post By: Safer / 7 min read

As you explore ways to reduce harm on your digital platform, you will undoubtedly encounter new terminology and tools to familiarize yourself with. This page is designed as a reference guide for common terms, CSAM detection technologies, and organizations working to protect children from sexual abuse and exploitation in the digital age.

What is CSAM?

Child Sexual Abuse Material (CSAM):

Child sexual abuse material refers to sexually explicit content involving a child. While the term “child pornography” is currently used in U.S. federal statutes, efforts are underway in many jurisdictions to update the terminology in legal guidelines.

Alternatively, CSAM may also be referred to as:

  • CSAI: child sexual abuse imagery
  • CSEI: child sexual exploitation imagery
  • IIOC: indecent images of children
  • CSEA: child sexual exploitation and abuse (umbrella term that CSAM falls under)

Related:

  • Self-generated CSAM (SG-CSAM)
  • AI-generated CSAM (AIG-CSAM)
  • AI-manipulated CSAM (AIM-CSAM)
  • Deepfakes

The legal definition of CSAM
In the U.S.: Section 2256 of Title 18, United States Code, defines “child pornography” as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Visual depictions include photographs, videos, digital or computer-generated images indistinguishable from an actual minor, and images created, adapted, or modified that appear to depict an identifiable, actual minor. Undeveloped film, undeveloped videotape, and electronically stored data that can be converted into a visual image of child pornography are also deemed illegal visual depictions under federal law.

U.S.-based electronic service providers (ESPs) are legally required to report instances of child sexual abuse material (CSAM) to the NCMEC CyberTipline when they become aware of them.

Why CSAM is the preferred term
Simply put, CSAM is the preferred term because it better reflects the abuse depicted in the images and videos and the resulting trauma to the child. In 2016, an international working group comprising countries and international organizations working to combat child exploitation formally recognized CSAM as the preferred term. The resulting terminology guidelines are referred to as the Luxembourg Guidelines.

The term “child pornography” should be avoided for the reasons that follow:

  • Pornography implies consent, which a child can never give. Pornographic content is made for adults and depicts adults engaged in consensual acts.
  • Using the term “pornography” in reference to content containing children risks normalizing, trivializing, and even legitimizing child sexual abuse and exploitation.
  • CSAM better describes the reality of this crime: it is the documentation of the violent and horrific rape of children, often when they’re prepubescent and even non-verbal.

Other key terms

AI-generated CSAM (AIG-CSAM):

Media that was wholly generated by an AI model and can range across the spectrum from photorealistic to cartoon. The content could depict an identifiable individual (a deepfake) or a fully synthetic depiction (non-deepfake). If an identifiable individual is the subject, that individual could be an identified or unidentified victim of hands-on abuse or could be a child who has never experienced hands-on abuse. Related: Deepfakes

AI-manipulated CSAM (AIM-CSAM):

Media in which some element has been altered using AI-generative technology, with results that can range across the spectrum from photorealistic to cartoon. The content could depict an identifiable individual (a deepfake) or a fully synthetic depiction (non-deepfake). If an identifiable individual is the subject, that individual could be an identified or unidentified victim of hands-on abuse or could be a child who has never experienced hands-on abuse.

Child Sexual Exploitation (CSE):

Child sexual exploitation refers to situations where a perpetrator or third party coerces a child into sexual activities in exchange for something of value (like money, goods, shelter, or promises). This can happen through:

  • Direct force or threats
  • Manipulation or grooming
  • Power imbalance between the victim and perpetrator

The key distinguishing factor of exploitation versus sexual abuse is the element of exchange. While any child can become a victim, factors like poverty, foster care, homelessness, or prior abuse may increase vulnerability. Additionally, age is never a mitigating factor—any child under the age of consent requires protection. Related: Child sexual exploitation and abuse (CSEA)

Deepfakes:

Wholly created with an AI model, deepfakes are photorealistic images or videos depicting an identifiable individual in a sexual abuse scenario. The individual could be an identified or unidentified victim of hands-on abuse or could be a child who has never experienced hands-on abuse.

Deepfake nudes:

Created with AI, deepfake nudes are photorealistic images or videos in which the subject’s clothing is replaced by artificial nudity, making it appear as if they’re naked.

Grooming:

The act by which an adult deliberately establishes trust and forms an emotional connection with a minor with the intention of manipulating the child and carrying out sexual abuse or exploitation.

Non-consensual intimate imagery (NCII):

Intimate content that is being shared, produced, published, or reproduced without the consent of the individual depicted. Related: Self-generated CSAM

Revictimization:

Otherwise known as retraumatization, revictimization is the recurrence of sexual victimization after the original abuse has occurred. For survivors of child sexual abuse, the continued distribution of their abuse content exacerbates trauma, fear, and vulnerability.

Self-generated CSAM (SG-CSAM):

Explicit imagery of a child that appears to have been taken by the child in the image. This imagery can result from consensual or coercive experiences. Kids often refer to consensual experiences as “sexting” or “sharing nudes.” Related: Non-consensual intimate imagery (NCII)

Sexually Explicit (SE):

Sexually explicit content of children is content that may not meet the legal definition of CSAM but violates a digital platform’s community guidelines. Note: some platforms may allow sexually explicit content of adults.

Take It Down:

A service operated by NCMEC that helps remove nude, partially nude, or sexually explicit photos and videos of underage people by assigning a unique digital fingerprint, called a hash value, to the images or videos. Online platforms can use those hash values to detect these images or videos on their public or unencrypted services and take action to remove this content.

Technology used in CSAM detection

Hash value:

The output, or digital fingerprint, of a hashing algorithm that represents an image, video, or another form of data. Hash values can be used to determine whether two files are identical (cryptographic hashes) or similar (perceptual hashes).

Cryptographic hash:

An algorithm that computes hash values for files such that two files with the same hash value are virtually guaranteed to be exactly the same file. A small change to a file results in a significant change to the hash value. Cryptographic hashes are often used to verify file integrity. Examples of cryptographic hash types are MD5 and SHA-1.
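
As a rough illustration, the sketch below uses Python's standard hashlib module to compute MD5 and SHA-1 digests. The byte strings are illustrative stand-ins for real file contents, chosen only to show how a one-character change produces a completely different hash value.

```python
# A minimal sketch of cryptographic hashing using Python's standard library.
# The byte strings below are illustrative stand-ins for real file contents.
import hashlib

original = b"example file contents"
altered = b"example file contents."  # a one-byte change

# Identical inputs always produce identical digests...
print(hashlib.md5(original).hexdigest())
print(hashlib.sha1(original).hexdigest())

# ...while even a tiny change yields a completely different digest.
print(hashlib.md5(altered).hexdigest())
```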

Perceptual hash:

Commonly referred to as fuzzy or visually similar image matching, perceptual hashes generate hash values that can be compared for similarity. A small change to a file results in a small change to the hash value. Examples of perceptual hash types are PhotoDNA and pHash.
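
As a hedged sketch, the snippet below compares two images using the open-source imagehash library's pHash implementation (PhotoDNA itself is licensed separately and is not shown). The file names and the distance threshold are illustrative assumptions, not recommendations.

```python
# A minimal sketch of perceptual hash comparison using the open-source
# "imagehash" library (pHash). File names and threshold are illustrative.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("photo.jpg"))
hash_b = imagehash.phash(Image.open("photo_resized.jpg"))

# Subtracting two perceptual hashes gives the Hamming distance between them;
# a small distance means the images are visually similar.
distance = hash_a - hash_b
if distance <= 8:  # example threshold; real systems tune this value
    print("visually similar:", distance)
else:
    print("different images:", distance)
```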

PhotoDNA:

Developed by Dr. Hany Farid in partnership with Dartmouth College and Microsoft, PhotoDNA is a perceptual hash that remains the industry standard for known CSAM detection. To learn more about Microsoft PhotoDNA or apply for an on-premise license, please visit https://www.microsoft.com/en-us/photodna.

Classifier (machine learning classification model):

A classifier in machine learning is an algorithm that automatically orders or categorizes data into one or more of a set of “classes.” Safer's classifiers scan a file and assign it a score that indicates the likelihood that the file contains a child sexual abuse image or video.
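
To make the scoring idea concrete, here is a minimal sketch of how a platform might route files based on a classifier's score. The predict_csam_score function and the threshold are hypothetical placeholders, not Safer's actual API.

```python
# A hedged sketch of routing content by classifier score. The function below
# is a hypothetical stand-in for a real classification model, and the
# threshold is illustrative, not a recommendation.

REVIEW_THRESHOLD = 0.8  # illustrative value

def predict_csam_score(file_bytes: bytes) -> float:
    """Hypothetical stand-in for a trained classifier; returns a 0.0-1.0 score."""
    return 0.0  # placeholder so the sketch runs

def route_file(file_bytes: bytes) -> str:
    score = predict_csam_score(file_bytes)
    if score >= REVIEW_THRESHOLD:
        return "escalate_to_human_review"  # likely CSAM; send to trained moderators
    return "no_action"

print(route_file(b"example upload"))
```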

Hash lists used in known CSAM detection

Hash lists are collections of verified hash values that can be matched against to identify known CSAM.

ESP lists:

Hashes representing previously detected CSAM contributed by electronic service providers as part of voluntary hash-sharing initiatives across the industry. As of December 31, 2023, 46 ESPs and 12 other organizations had voluntarily chosen to participate in NCMEC’s hash-sharing initiative.

NGO lists:

Hashes representing previously reported CSAM contributed by child-serving non-profit organizations as part of voluntary hash sharing initiatives. The content represented by hashes on these lists has usually been verified by a trained analyst to ensure the integrity of the hash list.

SaferList:

A set of self-managed hash lists that each Safer customer can use to build internal hash lists (and even opt to share with the Safer community). SaferList includes a CSAM list for novel content detected by the customer and an SE list for developing their own list of policy-violating hashes. Learn more about Safer's hash sharing options.

SE list:

Hashes representing content that may not meet the legal definition of CSAM but is nonetheless sexually exploitative of children. The content represented by these hashes, although not CSAM, may be associated with a known series of CSAM or otherwise used in a nefarious manner. Matches against this list should not be reported but should be removed from platforms to protect the children depicted in the images and videos.
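
As a rough illustration of how matching against these lists can be handled, the sketch below checks an upload's cryptographic hash against a CSAM list and an SE list and routes the result accordingly. The hash values, list names, and routing labels are illustrative assumptions, not an actual list format or workflow.

```python
# A minimal sketch of exact-match hash-list lookup, assuming MD5-based lists.
# The hash values below are placeholders, not real list entries.
import hashlib

CSAM_HASH_LIST = {"9e107d9d372bb6826bd81d3542a419d6"}  # placeholder entries
SE_HASH_LIST = {"e4d909c290d0fb1ca068ffaddf22cbd0"}

def check_upload(file_bytes: bytes) -> str:
    digest = hashlib.md5(file_bytes).hexdigest()
    if digest in CSAM_HASH_LIST:
        return "remove_and_report"  # known CSAM: remove and report (e.g., to NCMEC)
    if digest in SE_HASH_LIST:
        return "remove_only"        # SE match: remove, but do not report
    return "no_match"

print(check_upload(b"example upload"))
```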

Organizations working globally to eliminate CSAM from the internet

Canadian Centre for Child Protection (C3P):

Operates Canada’s national tipline for reporting child sexual abuse and exploitation on the internet.

INHOPE:

A global network of 50 member hotlines working to fight CSAM online. INHOPE also promotes legislative and policy development.

International Centre for Missing and Exploited Children (ICMEC):

Operating in more than 120 countries, ICMEC empowers the global community with the tools, training and technology to create a safer world for children.

Internet Watch Foundation (IWF):

A charitable foundation that works to stop the revictimization of people abused in childhood and to make the internet a safer place by identifying and removing online child sexual abuse imagery globally.

National Center for Missing and Exploited Children (NCMEC):

The national clearinghouse in the United States for all issues related to the prevention of and recovery from child victimization. It operates the CyberTipline, the U.S. hotline for reporting the online exploitation of children. Information shared in a report is made available to law enforcement for possible investigation. If you are an ESP looking to register to make reports, contact NCMEC.

Technology Coalition:

A global member organization for technology companies of varying sizes and sectors that work together to drive critical advances in technology and adoption of best practices for keeping children safe online.

Thorn:

An innovative technology nonprofit transforming how children are protected from sexual abuse and exploitation with cutting-edge technology, innovative research, and collaborative partnerships.

WeProtect Global Alliance:

A global alliance of 300 members from governments, the private sector, civil society, and intergovernmental organizations that develops policies and solutions to protect children from sexual exploitation and abuse online.

