November 10, 2020
Mapping Reality: Building the Future of AR

At the recent WSJ Tech Live conference, Niantic CEO and founder John Hanke unveiled advancements in building a 3D map of the world, including machine learning technology that contextualizes objects and materials (think water, sky, buildings, ground) in real time.

At Niantic, we’re building the Niantic Real World Platform to power AR experiences that function on all types of mobile hardware, whether it’s last year’s phone, today’s iPhone 12 Pro with LiDAR, or the wearable devices of the future. Our platform leverages the processing power and cameras of smart devices and uses proprietary, advanced machine learning and computer vision to understand the physical world in real time. In short, the better the sensors, the more AR software can deliver, whether that’s working in the dark or better dynamic 3D recognition of moving objects. In the meantime, our team is also working to make AR appear more realistic across all devices and sensors.

And, as John showcased, we’re combining depth perception with an understanding of what the objects in a physical space actually are. With a semantic understanding of 3D environments, we can inform our characters about the world we live in. This allows them to understand the contours of their environments, and it allows us to form relatable bonds with them as we take one step closer to truly augmenting reality.
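To make the idea of combining depth with semantics concrete, here is a minimal, hypothetical sketch (not Niantic’s actual implementation): per-pixel depth and per-pixel semantic labels are fused into a labeled 3D point cloud that characters could later query. The camera intrinsics, label names, and function names are illustrative assumptions.

```python
# Hypothetical sketch: fuse a depth map with a semantic label map into labeled 3D points.
from dataclasses import dataclass

@dataclass
class LabeledPoint:
    x: float
    y: float
    z: float
    label: str  # e.g. "ground", "water", "building", "sky" (illustrative classes)

def back_project(depth, labels, fx, fy, cx, cy):
    """Turn a depth map plus a semantic label map into labeled 3D points.

    depth  -- 2D list of metric depths (meters) per pixel
    labels -- 2D list of semantic class names per pixel
    fx, fy, cx, cy -- pinhole camera intrinsics (assumed known)
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0 or labels[v][u] == "sky":  # sky has no meaningful depth
                continue
            # Standard pinhole back-projection from pixel (u, v) and depth z.
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append(LabeledPoint(x, y, z, labels[v][u]))
    return points
```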

Here’s a look at the recent integration of this technology into the Niantic Real World Platform. The video shows how we have incorporated 6D.ai technology to build a dynamic 3D map of the world that is shared across devices and can be built concurrently by multiple players.
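The sketch below illustrates one way a shared, concurrently built map could work: each device quantizes its locally observed labeled points (the LabeledPoint objects from the sketch above) into voxels and folds them into one common map. The voxel size, the per-voxel label vote, and all names here are illustrative assumptions, not the 6D.ai or Niantic implementation.

```python
# Hypothetical sketch: a shared map that any number of devices can contribute to.
from collections import Counter, defaultdict

VOXEL_SIZE = 0.25  # meters; illustrative resolution

def to_voxel(x, y, z):
    """Quantize a metric 3D point into integer voxel coordinates."""
    return (int(x // VOXEL_SIZE), int(y // VOXEL_SIZE), int(z // VOXEL_SIZE))

class SharedMap:
    """Accumulates semantic label votes per voxel across devices."""
    def __init__(self):
        self.votes = defaultdict(Counter)

    def integrate(self, labeled_points):
        """Fold one device's local observations into the shared map."""
        for p in labeled_points:
            self.votes[to_voxel(p.x, p.y, p.z)][p.label] += 1

    def label_of(self, voxel):
        """Most-voted semantic label for a voxel, or None if never observed."""
        counts = self.votes.get(voxel)
        return counts.most_common(1)[0][0] if counts else None
```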

The Niantic Real World Platform provides real-time 3D mapping capabilities and object recognition to build experiences where digital characters can intelligently navigate the world around them.
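As a rough illustration of how a digital character might use such a semantic map to navigate, the sketch below (reusing SharedMap and to_voxel from the previous sketch) checks the label of the voxel the character would step onto and only walks on “ground”. The step size and walkable-label set are assumptions for illustration, not the platform’s actual navigation logic.

```python
# Hypothetical sketch: semantics-aware stepping for a virtual character.
WALKABLE = {"ground"}

def next_position(shared_map, position, direction, step=0.5):
    """Advance the character by `step` meters if the target voxel is walkable."""
    x, y, z = position
    dx, dy, dz = direction
    candidate = (x + dx * step, y + dy * step, z + dz * step)
    if shared_map.label_of(to_voxel(*candidate)) in WALKABLE:
        return candidate
    return position  # blocked by water, a building, or unmapped space
```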

Over the last few weeks, we've seen announcements that will bring the benefits of 3D reconstruction and mapping to consumers. The adoption of LiDAR will enhance AR experiences on high-end iPhones, and thanks to 3D reconstruction software, we're starting to see the results of devices being able to spatially understand the areas around us. Everyone in the AR market benefits from this rising tide of technology, and the introduction of better 3D visualization to consumers and businesses is another step closer to wider AR adoption.

What does this all mean?

For an end user, the result is an evolution of interactive experiences that feature a seamless convergence between the atoms of the real world and the bits of the digital world.

For developers building in the AR space, our focus is on offering AR technology across platforms, making it easier for AR creators to get their experiences to their customers without having to worry about which device or platform those customers are on. Our ultimate goal is to allow developers to build diverse types of AR experiences across all current and future devices.

We see this global map as the backbone of the new and innovative experiences that will ultimately enhance how we interact with, discover, and enjoy the real world with others.

–The Niantic AR team

