September 28, 2022
Lightship VPS: Engineering the World’s Most Dynamic 3D AR Map

With the launch of Lightship VPS earlier this year, Niantic is fulfilling a long-held dream for AR creators — to have a precise, dynamic 3D map of the world in which to add, anchor and augment layers of virtual content. In this three-part series, Chief Scientist Victor Priscariu and Director of Engineering Pierre Fite-Georgel dive into the technology used to build and maintain the map so that AR developers around the world can make magic precisely where they want.

At Niantic, our goal is to give people tools to become 21st century explorers. For centuries, ship captains navigated the oceans with a compass, a map, and instruments like the astrolabe and quadrant, which measure the altitude of celestial bodies to determine latitude. Mainstream adoption of GPS over the past two decades has made satellite positioning fundamental for establishing location, and with the advent of the internet, GPS-enabled location data was combined with online maps and mobile phones. These days, moving around in the world, traveling, and discovering new places is incredibly easy.

Today we’re in the early stages of another technological revolution, one that enables people to experience and discover places in the physical world through virtual space. Niantic’s 2016 launch of Pokémon GO represented the first large-scale virtual environment and brought augmented reality to the masses. Using that foundation, we’re building a version of the real-world metaverse where people can explore the outside world and virtual space simultaneously, in a truly immersive experience. To further blend the worlds, we recently launched Niantic’s Lightship VPS (Visual Positioning System). VPS is like GPS for the virtual version of the outside world, only far more precise, down to the centimeter. With Lightship VPS, developers can build persistent AR content that is anchored at real-world locations. This means people can have more meaningful and compelling immersive AR experiences in the real world because the visuals are blended seamlessly.

We joined Niantic to help build this dynamic 3D map of the real world, something that many people at Niantic had been working on for years. Niantic’s founders cut their teeth in this space by creating Google Maps, laying the foundation for online mapping with Street View. But for AR, we needed to start from scratch; no one had created a scalable solution for building a 3D map like this before. There were some big technological challenges to overcome, and doing it right takes time. We’re not done yet; in fact, we may never be “done” because our AR map will be constantly updated to reflect changes in the physical world. But the essential building blocks are in place and the AR world is taking shape — there are already more than 100,000 AR-mapped Wayspots in cities around the globe, and by the time you read this, that number will have grown yet again.

Image from presentation by the Lightship VPS team at AWE USA 2022

Creating a 3D world map is exponentially harder than creating today’s standard digital map. There is far more physical space to cover: people explore on foot, in parks, town squares, and other public spaces, not just along the views from the street. Reconstructing locations in three dimensions also requires many more images of each location, capturing how it looks at different times of day and in different weather and seasons, rather than just displaying polygons to express the shape of streets and buildings. Each point in the space needs to be observed from different perspectives, angles, and lighting conditions so the model can be robust to changes and varied user behavior. These constraints make it difficult for any one company to collect enough data. We get asked a lot about the technology behind all of this, so we thought we’d pull back the curtain a bit in a series of blog posts covering data collection and processing, how the map itself is created from all the data, and the API service that users interact with.

The quest for exploration of the world around us has led people to come up with ways to map their environments, be it land, ocean, or sky. To do that, we need visual confirmation of objects and a way to figure out their relation to each other, and us. Telescopes were instrumental in allowing us to zoom in on distant sights on Earth, but people didn’t stop there. Galileo was one of the early astronomers who turned the telescope to the skies. By estimating the trajectory of celestial bodies and measuring the distance between them, his work completely altered human understanding of the cosmos and our place in it.

Demonstration of a scan at the Ball Tower

In that same spirit, map builders in the digital age have their own tools and methods for mapping the 3D version of the world. Niantic is enlisting the help of the millions of people who are playing Pokémon GO and Ingress to gather data that forms the foundation of our map. Once players in Pokémon GO reach a certain level in the game, they are given the opportunity to take on AR Mapping Field Research tasks. Players can receive in-game rewards for submitting AR scans of Wayspots, which are the focus of our data collection.

Illustration of scanning distance variation patterns

Wayspots are publicly accessible objects or locations that our players consider special in their area, e.g. statues, local art installations, parks, or fountains. We have 17 million Wayspots in our database; we select those in areas where our local player community is active and prioritize Wayspots where we need more scans. However, users can scan at any Wayspot, whether it is prioritized or not. When a player arrives at their Mapping Field Research destination, they are prompted to scan. Before players start their first scan, they are reminded not to scan inside private residences or from locations that are not accessible to the public. The scanning process involves taking short video clips of 20 to 30 seconds each, yielding about 300 images per location. The player points the camera at the Wayspot and walks around it to capture various views. We have numerous people scan each Wayspot so that we build a large, diverse pool of data points — from different types of phones to environmental conditions like sunny or overcast skies — giving us the most complete information and the highest possible visual quality.
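As a back-of-the-envelope check on the numbers above: a 20 to 30 second clip at a capture rate of roughly 12 frames per second lands near the "about 300 images per location" figure. The clip lengths come from the post; the frame rate is our assumption for illustration.

```python
# Rough estimate of frames captured per scan.
# Clip lengths (seconds) are from the post; the fps value is an assumption.
ASSUMED_FPS = 12
clip_seconds = (20, 30)

frame_range = tuple(s * ASSUMED_FPS for s in clip_seconds)
print(frame_range)  # (240, 360) -- consistent with ~300 images per location
```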

We collect a variety of data types, starting with visual imagery plus latitude, longitude, and altitude from GPS. We also gather information from other phone sensors, such as the accelerometer and gyroscope. These sensors clarify the perspective of the person at the time of their scan, specifically the phone’s position and orientation. If the phone has a LiDAR sensor, we capture a high-resolution 3D view of the player’s surroundings, which helps determine the distance to sculptures, buildings, trees, and other objects.
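To make the collected data concrete, here is a minimal sketch of what a single scan frame and its sensor context might look like. This is a hypothetical schema: the field names and types are our illustration, not Niantic's actual upload format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ScanFrame:
    """One frame of a Wayspot scan plus sensor context.
    Hypothetical schema for illustration, not Niantic's actual format."""
    image_path: str                            # RGB frame from the video clip
    timestamp_ms: int                          # capture time
    latitude: float                            # GPS fix
    longitude: float
    altitude_m: float
    accel: Tuple[float, float, float]          # accelerometer reading (m/s^2)
    gyro: Tuple[float, float, float]           # gyroscope reading (rad/s)
    lidar_depth_path: Optional[str] = None     # depth map, if the phone has LiDAR

# Example frame from a scan of a statue; values are made up.
frame = ScanFrame(
    image_path="frame_0001.jpg",
    timestamp_ms=0,
    latitude=37.79, longitude=-122.39, altitude_m=12.0,
    accel=(0.0, 0.0, 9.81),       # phone roughly level, gravity on z-axis
    gyro=(0.0, 0.0, 0.0),         # phone not rotating at this instant
)
print(frame.latitude, frame.lidar_depth_path)
```

The accelerometer and gyroscope fields are what lets a reconstruction pipeline recover the phone's orientation over the clip, and the optional depth map carries the LiDAR measurements when available.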

Example of a scan passed through the API, which filtered out personally identifiable features

After the data has been collected, we process it on the back end to prepare it for use in building a map. We take a number of steps to protect the privacy of individuals when we collect scans, including automatically anonymizing them once they are uploaded to our secure servers. This involves blurring potentially recognizable details like faces and license plates. The scans are then passed through an AI model for quality control. The system checks that the lighting conditions were good enough to see the scene clearly, that the camera was not covered or out of focus, and that it was not pointed at the ground; it also filters out images that appear to have been taken indoors.
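The quality-control step described above can be sketched as a chain of per-frame checks, where a frame is kept only if every check passes. This is a minimal sketch of the idea; the check names, thresholds, and the per-frame statistics dictionary are all assumptions for illustration, not Niantic's actual pipeline.

```python
# Hypothetical per-frame statistics (e.g. from lightweight models) drive the checks.
# All threshold values below are illustrative assumptions.

def bright_enough(stats):  return stats["mean_brightness"] > 40    # lighting check
def not_obstructed(stats): return stats["obstruction_score"] < 0.5 # lens not covered
def in_focus(stats):       return stats["sharpness"] > 0.3         # blur check
def not_at_ground(stats):  return abs(stats["pitch_deg"]) < 60     # not aimed at ground
def outdoors(stats):       return stats["indoor_prob"] < 0.5       # indoor classifier

CHECKS = [bright_enough, not_obstructed, in_focus, not_at_ground, outdoors]

def passes_quality_control(stats: dict) -> bool:
    """A frame is kept only if every check passes."""
    return all(check(stats) for check in CHECKS)

# A well-lit, outdoor, in-focus frame passes...
good = {"mean_brightness": 120, "obstruction_score": 0.1,
        "sharpness": 0.8, "pitch_deg": 10, "indoor_prob": 0.05}
print(passes_quality_control(good))   # True

# ...while a frame the classifier flags as indoor is filtered out.
indoor = dict(good, indoor_prob=0.9)
print(passes_quality_control(indoor)) # False
```

Structuring the filters as a flat list makes it easy to add new checks later without touching the ones already in place.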

So how do we actually build the map? Watch this space: In our next blog post we’ll talk about all the steps it takes to bring the map to life.

— Victor Priscariu, Chief Scientist and Pierre Fite-Georgel, Director of Engineering
