June 24, 2022
Niantic’s efforts to combat child sexual abuse and exploitation online

We want our community, and especially our younger players and their caretakers, to feel safe both offline and online as they interact, adventure, and explore the world. As part of this commitment, we are joining the fight to prevent, detect, remove, and report Child Sexual Abuse Material (CSAM) across our games and products. This short post presents our efforts to date, and you can expect more from us on this front.

Niantic has invested in cutting-edge technology to identify re-uploads of previously identified CSAM, using hash-matching technology such as PhotoDNA, as well as to identify uploads of never-before-seen CSAM. When potential CSAM is surfaced by these technologies, our teams of specialists review the image or video. If CSAM is confirmed, the content is removed and reported to the National Center for Missing and Exploited Children (NCMEC), regardless of the player’s motivation for sharing it. NCMEC then works with national and international law enforcement agencies.
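To illustrate the general hash-matching pattern described above: a service computes a fingerprint of each uploaded image and checks it against a database of fingerprints of previously identified material. PhotoDNA itself is proprietary and uses perceptual hashing; the minimal sketch below substitutes an exact cryptographic hash purely to show the lookup flow. All names and hash values here are hypothetical placeholders, not part of any real system.

```python
import hashlib

# Hypothetical set of hex digests of previously identified images.
# In real deployments, such lists are supplied by organizations like
# NCMEC or the IWF; these values are placeholders for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the upload's digest appears in the known-hash set,
    flagging it for human specialist review."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Note that an exact cryptographic hash only catches byte-identical re-uploads; perceptual hashes such as PhotoDNA are designed to survive resizing and re-encoding, so production systems compare perceptual fingerprints within a similarity threshold rather than testing strict equality.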

In our commitment to fighting online child sexual exploitation and abuse, we are partnering with several national and international organizations, such as:

  • Thorn, whose mission is to eliminate CSAM from the internet. We are using their tool, Safer, to identify, remove, and report CSAM.

  • The Internet Watch Foundation (IWF), which works to stop the repeated victimization of people abused in childhood and to make the internet a safer place by identifying and removing global online child sexual abuse imagery.

  • NCMEC, the USA’s nonprofit clearinghouse and comprehensive reporting center for all issues related to the prevention of and recovery from child victimization.

Our Player Guidelines forbid all forms of child sexual exploitation, and users who violate this policy are permanently removed from our games and products.


–The Niantic team
