September 27, 2023
Our Approach To Safety

*Last updated September 27, 2023: We launched our Safety Center, a hub where you can find information and resources on building a safe and enjoyable Niantic experience.*

By Camille François

Let’s start at the beginning: what is “trust and safety”? There is no single answer to that question — but for us, trust and safety is about creating the policies, tools, technologies, partnerships and design decisions that keep our products safe, and our experiences fun and fair for our players. As Campfire, our experience designed to encourage group play out in the real world, rolls out globally, we felt now would be a good time to talk about how we’ve approached this critical work.

How it started

To ensure our games remain a safe and fun experience, we’ve been working on trust and safety at Niantic for a long time, under different names — some of this work has been carried out by our colleagues on the Player Support, Legal and Product teams, for instance. In Fall 2021, we consolidated this work into the first dedicated Trust and Safety function at Niantic.

Niantic is building a brand new kind of gaming experience through augmented reality, a technology that is still evolving. To meet this challenge, over the past two years we have overhauled, audited and extended our Player Guidelines and current set of policies, built a tooling infrastructure to support this work, joined industry-leading groups, and hired a team of professionals to build a state-of-the-art Trust and Safety team at Niantic.

How it’s going

So how do we make this work? It’s easy to think that trust and safety is just about writing rules and banning people who break them. And of course, there’s some truth to that: we do develop policies to support safe experiences for our players. But this is just one step in a broader endeavor that begins with safety by design and ends with deploying technologies, partnerships, tools, and processes to keep our players safe and our experiences fun at scale.

Safety by Design

We designed our Trust & Safety function with safety-by-design principles at its core. At Niantic, this means game producers and product designers partner with our Trust and Safety team early in the development process to understand and mitigate the types of risks that players may encounter while playing our games and using our products, before those products even get built. Campfire, our first community product, was built with this approach — and we’re continually refining how we do safety by design across the company. We believe safety and fairness in our products is everyone’s responsibility — not just the trust and safety team’s — and safety by design is a key way to put that into practice.

As part of this work, we borrowed the idea of “red teams” from the field of information security and put Campfire (along with other experiences across our games!) through Trust & Safety red team exercises to identify gaps or potential problem areas within the product before launch. This is not a foolproof method, and we know there will be hard trade-offs to make. It is also not a matter of one-and-done: trust and safety requires continuous review and improvement. As we continue to evolve, we’re committed to sharpening our safety-by-design approach and to continuing to iterate on this model with our Trust & Safety red team, our Niantic colleagues across functions, and our external partners.

Policy Development

Like others in the industry, we take a comprehensive and transparent approach to platform policy, thinking about the types of content, behaviors and actors we believe are and aren’t appropriate across our games and products. These rules cover social products such as Campfire, but also extend to our live events. Our policies are living documents, and we are committed to revising and improving them. As the online safety landscape evolves and our product teams iterate, we may update policies or launch new ones to make sure our rules remain relevant and applicable. As of today, these are the main policies that guide our safety work:

  1. Player Guidelines form the foundation of our policy values at Niantic. They cover the type of content we consider abusive on our products (ex: violent threats), the types of behaviors we seek to discourage (ex: harassment at a live event, cheating in our games), and the types of actors unwelcome across our games and services (ex: terrorist organizations).

  2. Live Event Code of Conduct sets the standard for the type of behavior that will be considered appropriate and acceptable at a live event, Campfire Meetup, or any other product-centered event.

  3. Terms of Service are a legal agreement between Niantic and anyone who uses our products and services. Some of our products also have guidelines tailored to the types of content and experiences they facilitate; see, for instance, our 8th Wall Content guidelines.

  4. Privacy Policy provides a comprehensive overview of the steps we take to protect the personal information people share with us.

When establishing our policies, our focus is on reducing harm. Specifically, we look at what types of conduct may be more likely to lead to different types of harms (such as physical, mental, financial, or relational harm), and from that we determine what types of consequences should be applied to promote the safety of our users. We aim to ensure a proportionate approach to the penalties for breaking our rules:

  • The most egregious harms lead to the most severe punishments: for instance, removal from Campfire and other Niantic products and services and, in very severe cases, referral to local authorities.

  • For most other harms identified, we use progressive enforcement ladders, such as our “three strikes” policy for gameplay fairness. These ladders are designed to discourage unwanted behaviors while educating players on our policies, with suspensions and account removals used only as a last resort (a minimal sketch of such a ladder follows this list).
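
To make the idea concrete, here is a minimal Python sketch of how a progressive, “three strikes”-style enforcement ladder could be modeled. The tiers, strike counts and action names are illustrative assumptions, not Niantic’s actual implementation.

```python
# Illustrative sketch of a progressive enforcement ladder.
# Tiers and action names are hypothetical, not Niantic's real system.
from dataclasses import dataclass

LADDER = ["warning", "temporary_suspension", "account_removal"]  # escalating actions

@dataclass
class PlayerRecord:
    player_id: str
    strikes: int = 0

def enforce(record: PlayerRecord, egregious: bool = False) -> str:
    """Return the action for a confirmed violation.

    Egregious harms skip the ladder entirely; everything else
    escalates one step per strike, ending in account removal.
    """
    if egregious:
        return "account_removal"
    action = LADDER[min(record.strikes, len(LADDER) - 1)]
    record.strikes += 1
    return action

player = PlayerRecord("player-123")
print(enforce(player))  # 1st strike -> "warning"
print(enforce(player))  # 2nd strike -> "temporary_suspension"
print(enforce(player))  # 3rd strike -> "account_removal"
```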

Moderation

Once our policies are developed, we work to safely, fairly and effectively enforce them at scale. Moderation is a broad term, but at Niantic, we see it as the complete system of technologies, tools, partnerships and processes that together enable us to apply our policies at scale. To do so, we rely on a combination of automated detection technologies, user reporting and expert review to identify and act on content, actors and behaviors that violate our standards.
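
As a rough illustration of how those signals could fit together, here is a minimal Python sketch of triaging items from automated detection and user reports into automatic action, human review, or no action. The threshold values and names are hypothetical assumptions, not Niantic’s actual pipeline.

```python
# Hypothetical moderation triage combining classifier scores and user reports.
# Thresholds and names are illustrative, not Niantic's actual system.
from enum import Enum

class Decision(Enum):
    AUTO_ACTION = "auto_action"    # high-confidence classifier hit
    HUMAN_REVIEW = "human_review"  # uncertain, or flagged by a player
    NO_ACTION = "no_action"

AUTO_THRESHOLD = 0.95    # assumed cutoff for acting without human review
REVIEW_THRESHOLD = 0.50  # assumed cutoff for routing to a moderator

def triage(classifier_score: float, user_reported: bool) -> Decision:
    """Route one piece of content based on the available signals."""
    if classifier_score >= AUTO_THRESHOLD:
        return Decision.AUTO_ACTION
    if user_reported or classifier_score >= REVIEW_THRESHOLD:
        return Decision.HUMAN_REVIEW
    return Decision.NO_ACTION

print(triage(0.98, user_reported=False))  # -> Decision.AUTO_ACTION
print(triage(0.60, user_reported=True))   # -> Decision.HUMAN_REVIEW
```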

These goals are common to many others in the industry, but we thought it important to be transparent about how we work towards them, by sharing information about the specific infrastructure, tools, and setup we use to those ends. Specifically:

  1. Automated Detection entails a combination of technologies and tools, including:

    • For detection of policy-violating text, codenames and images, we rely on a combination of moderation classifiers built by Microsoft, Google and Cleanspeak.

    • For detection of unknown child sexual abuse material (CSAM) and known CSAM (through perceptual hashing), we rely on Thorn’s SAFER product (a simplified sketch of hash-based matching follows this list).

    • Information on known abusive websites and materials is also collected from partners such as the Global Internet Forum to Counter Terrorism and the Internet Watch Foundation (for known websites participating in the distribution of child sexual abuse materials).

  2. Reporting and Reviewing. Automated detection helps minimize our players’ and our moderators’ exposure to abusive content, but can’t catch everything. We also built reporting tools for people using our games and services, enabling them to flag issues to Niantic so that our teams can investigate and address concerns. Reporting is a key piece of our safety efforts, and we continue to iterate to design better, more effective, and simpler reporting options for our players. Finally, Keywords Studios is our primary moderation partner at Niantic; their work is integral to our safety efforts.
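
To show in general terms what matching against a database of known material involves, here is a simplified Python sketch of perceptual-hash lookup. The hash values, threshold and function names are stand-ins; production tools such as Thorn’s SAFER use dedicated perceptual-hash algorithms and vetted hash lists rather than this toy comparison.

```python
# Simplified illustration of matching content against known abusive hashes.
# Hash values and threshold are made up; real systems (e.g. Thorn's SAFER)
# use dedicated perceptual-hash algorithms and curated hash databases.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

KNOWN_HASHES = {0x9F3A5C7E12D48B60, 0x0B1C2D3E4F506172}  # illustrative values
MATCH_THRESHOLD = 8  # max differing bits still counted as a match (assumed)

def is_known_material(image_hash: int) -> bool:
    """Flag an image whose perceptual hash is near any known hash.

    Perceptual hashes change only slightly under resizing or re-encoding,
    so a small Hamming distance suggests the same underlying image.
    """
    return any(hamming_distance(image_hash, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

# A lightly re-encoded copy differs in a few bits but still matches:
print(is_known_material(0x9F3A5C7E12D48B61))  # -> True
```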

Child safety

Certain Niantic products allow children to play with parental consent, and we recognize that children have unique needs with regard to safety, security and privacy. Players under the digital age of consent are not permitted to use Campfire, so to support teens (above the age of consent) on Campfire, we’ve begun building partnerships with leading child safety organizations to help keep players safe. We partnered with Thorn, notably leveraging their technology, SAFER, to prevent, detect, and remove CSAM. Their technologies also help us identify harmful conversations that can be especially difficult to detect.

The National Center for Missing and Exploited Children (NCMEC) is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization. If we become aware of accounts engaging inappropriately with a minor, sharing CSAM, or committing other child safety violations, our policy is to terminate the account and report this to NCMEC and, in some cases, to law enforcement as well. We are also proud members of the Tech Coalition, an alliance of global tech companies working together to combat child sexual exploitation and abuse online.

We know that navigating the world of online safety as a parent or caregiver can be challenging, so we created the Niantic Parent Guide to help parents and caregivers make informed decisions around our games and services, along with other digital parenting resources for children and teens. We also offer Niantic Kids, a login method for games that allow children under the digital age of consent. Niantic Kids is designed to obtain parental consent and keep children’s personal information secure when they play our games. The portal provides tools to review and approve permissions for children’s Niantic accounts.

In general, we strongly believe that all of our players should have the tools to control their experiences on Campfire and across our games, for instance by blocking people they do not wish to interact with, and by being able to control their privacy settings.

Our partnerships

Humility is the key to doing trust and safety work successfully. We have to know what we don’t know, and be humble enough to reach out for guidance. We are immensely grateful for our players and ambassadors who have shared valuable feedback on how to best tackle these issues together. We also maintain active participation in trust and safety industry groups, where we enjoy learning from and engaging with our peers. Our key partners for safety initiatives include:

Global Internet Forum to Counter Terrorism (GIFCT)

Countering violent extremism is an industry-wide and complex challenge. So, we reached out to the Global Internet Forum to Counter Terrorism (GIFCT) and their sister organization, Tech Against Terrorism, to apply for membership. We officially became members in 2022, making Niantic the very first gaming company to join GIFCT. This ensures that we can coordinate with industry and civil society partners to prevent the spread of violent extremist materials online through their hash-sharing database.

Trust and Safety Professional Association (TSPA) and the Fair Play Alliance (FPA)

We’ve found it critically important to partner and learn from our Trust and Safety peers in the industry. The Trust and Safety Professional Association (TSPA), which we joined shortly after establishing our Trust & Safety function, has been especially helpful, giving us a space to give and receive support from others in the industry and learn from those who’ve come before us. Our partners at the Fair Play Alliance convene gaming professionals and companies focused on fostering fun and positive player interactions.

Where we’re heading

Our Trust and Safety team has to innovate rapidly on issues across different industries, technologies, and end-user environments, crossing digital and physical spaces. As such, we need to remain nimble, curious, and resourceful. This work is challenging, but more than anything, this work is a privilege.

We get to partner internally with industry-leading teams creating products that inspire movement, exploration, and face-to-face social interaction in a way that fosters fun and safe communities and experiences.

Best of all, we get to serve the most passionate global user base. From Ingress to Pokémon GO, NBA All-World, Pikmin Bloom and Peridot, to Campfire, our players believe in the possibilities that come from engaging with each other IRL, sharing in immersive experiences.

So, where are we going? We will continue to advocate for our players, wrestle with challenging issues, and invest in the tools, partnerships, policies and technologies that give users more control over their experience and increase transparency around our moderation systems as we scale our enforcement globally.

With regards to emerging technology and innovation, we’re considering safety from the beginning: we’re actively red-teaming several generative AI-driven experiences and features for integration into our products. By evaluating their performance from a safety perspective as part of launch readiness, we hope to ensure we’re thinking and acting responsibly and incorporating diverse perspectives into feature development in these areas. We’re also supporting upcoming features that will create new ways for players to join and celebrate their local community.

There will be times when we get things wrong and have to iterate quickly, especially as augmented reality gaming evolves, but we’ve found that the reward really is worth the struggle. We’re committed to fostering safe and enriching experiences in all of our games and products, and we look forward to seeing our players thrive in these environments.

