* VPS, World Pose and AR features combine for foundational location-based AR experiences
* ARDK 3.9 also adds Ad Hoc Mapping as an experimental feature
With the latest release of our augmented reality software development kit (ARDK) for Unity, some important mapping features move into production while another joins the list of experimental features. Through this process of research and product development, we are constantly exploring new ways to help developers build the future of spatial computing.
From Research to Product Feature to Use Cases
For 3.9, we moved World Pose (WPS) from an experimental feature to full production. World Pose is a technology we developed to enhance geographic positioning and orientation for AR applications.
Unlike the Visual Positioning System (VPS), which relies on visual localization for high accuracy, World Pose uses a combination of Global Navigation Satellite System (GNSS) data, magnetometer measurements, and AR tracking data to provide a broader geographic positioning solution. This allows for the placement of AR content in areas with low VPS coverage, ensuring that users can still experience accurate and engaging AR interactions even in less mapped regions.
The World Pose algorithm works by jointly optimizing all GNSS and magnetometer measurements along the user’s trajectory, together with AR tracking data, to determine the most likely geographic location and orientation of a device. The algorithm significantly reduces angular error, which is crucial for accurate orientation, especially when users are far from the anchor location. This makes WPS particularly useful for applications that require large-scale content placement without relying solely on live AR data.
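To give a feel for the kind of joint optimization involved, here is a minimal sketch (not Niantic’s implementation, and with illustrative names only) that aligns an AR-tracked walking trajectory to a series of GNSS fixes, already converted to local east/north metres, by solving for the single heading and offset that minimize squared error over the whole walk. Magnetometer terms are omitted for brevity.

```csharp
using System;

// Illustrative only: estimate the heading and offset that best align an AR trajectory
// (local metres) to GNSS fixes (east/north metres from a reference point), in the
// least-squares sense over the whole trajectory.
static class TrajectoryAlignment
{
    public static (double headingRad, double tx, double ty) Align(
        (double x, double y)[] arXY, (double e, double n)[] gnssEN)
    {
        int count = arXY.Length;
        double axm = 0, aym = 0, gem = 0, gnm = 0;
        for (int i = 0; i < count; i++)
        {
            axm += arXY[i].x; aym += arXY[i].y;
            gem += gnssEN[i].e; gnm += gnssEN[i].n;
        }
        axm /= count; aym /= count; gem /= count; gnm /= count;

        // Accumulate cross terms of the centred point sets.
        double sinSum = 0, cosSum = 0;
        for (int i = 0; i < count; i++)
        {
            double ax = arXY[i].x - axm, ay = arXY[i].y - aym;
            double ge = gnssEN[i].e - gem, gn = gnssEN[i].n - gnm;
            cosSum += ax * ge + ay * gn;
            sinSum += ax * gn - ay * ge;
        }

        double heading = Math.Atan2(sinSum, cosSum);      // rotation from AR frame to east/north
        double c = Math.Cos(heading), s = Math.Sin(heading);
        double tx = gem - (c * axm - s * aym);            // translation in east/north metres
        double ty = gnm - (s * axm + c * aym);
        return (heading, tx, ty);
    }
}
```

Because the whole trajectory constrains the single heading estimate, the longer the user walks, the better the angular estimate becomes, which is why the approach helps most when content is far from the user.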
World Pose has several potential applications, including georeferencing, where it aligns map data to geographic coordinates, and mirror worlds, which involve converting map data into live sessions to generate geographically meaningful content. It can also be used for navigation, guiding users towards specific locations, and geo-anchoring, allowing users to place content anywhere using only geographic positioning. By integrating World Pose with other Niantic technologies, such as VPS and ARDK, developers can create more immersive and accurate AR experiences.
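To make the geo-anchoring idea concrete, the sketch below (assuming a device pose whose latitude, longitude, and heading have already been estimated, for example by World Pose) converts a target latitude/longitude into a position in the Unity session frame using a flat-earth approximation. The helper and its heading convention are illustrative assumptions, not part of the ARDK API.

```csharp
using System;
using UnityEngine;

// Illustrative sketch: place an anchor at a target lat/lon given the device's estimated
// geographic pose. Assumes deviceHeadingDeg is the compass heading of the session's +Z
// axis (an assumption about conventions), and uses a flat-earth approximation that is
// fine over a few hundred metres.
public static class GeoAnchoring
{
    const double EarthRadius = 6378137.0; // metres (WGS84 equatorial radius)

    public static Vector3 GeoToSessionPosition(
        double deviceLat, double deviceLon, float deviceHeadingDeg, Vector3 devicePosition,
        double targetLat, double targetLon)
    {
        // Geographic delta -> local east/north metres.
        double dLat = (targetLat - deviceLat) * Mathf.Deg2Rad;
        double dLon = (targetLon - deviceLon) * Mathf.Deg2Rad;
        float north = (float)(dLat * EarthRadius);
        float east  = (float)(dLon * EarthRadius * Math.Cos(deviceLat * Mathf.Deg2Rad));

        // Rotate the east/north offset into the session frame using the device heading,
        // then offset from the device's session-space position.
        Quaternion headingToSession = Quaternion.Euler(0f, -deviceHeadingDeg, 0f);
        Vector3 offset = headingToSession * new Vector3(east, 0f, north);
        return devicePosition + offset;
    }
}
```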
In this example from Wonderment by Design in the Netherlands, testing demonstrated excellent accuracy by combining World Pose with real-time MarineTraffic API data, such as unique MMSI numbers, ship size, load, distance, last port, and destination, to build a robust set of location-based information. World Pose opens up many new ways of visualizing real-time data and spatial locations based on digital plotting.
Take A Walk With Niantic’s Mapping and AR Technologies
Because real-world AR experiences encounter different environments and conditions as users move about, we know it’s important to leverage different features and technologies in a seamless fashion. Our Research team has built an internal demo app that brings World Pose, VPS, and ARDK together. The video shows the different features at key points throughout the journey. It is not meant to represent any specific experience, but it is easy to see how different features and technologies combine to turn any walking experience into a highly immersive spatial experience without ever losing track of where users are in the real world. This is the magic the technology unlocks: a spatial experience that seamlessly blends the real and digital worlds.
Niantic has both 2D and 3D map tools and views available for developers. For mobile games, the 2D locations can easily be used in navigation scenarios while the 3D locations support more precise and richer immersive experiences.
In the demo, World Pose is used differently depending on the phone’s orientation: when the phone is held horizontally it drives a Wayspot compass (leveraging information from our Coverage API), and when the phone is vertical it plots Wayspot locations. In addition, the Niantic Maps SDK is used to display a 3D map of the area surrounding the user along with the nearby VPS-localizable Wayspots.
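A hedged sketch of how such a mode switch and compass could be wired up in Unity follows. The fields, pitch threshold, and heading source are illustrative assumptions, not ARDK or Maps SDK calls; in a real app the Wayspot coordinates would come from the Coverage API and the map from the Niantic Maps SDK.

```csharp
using UnityEngine;

// Hypothetical sketch of the two modes described above: phone held flat -> compass
// needle pointing toward a Wayspot; phone upright -> plot the Wayspot on a map instead.
public class WayspotCompassMode : MonoBehaviour
{
    public Transform cameraTransform;      // the AR camera
    public Transform needle;               // compass needle, rotated about Y
    public float deviceHeadingDeg;         // heading of the session frame (e.g. from World Pose)
    public double userLat, userLon;        // user position (World Pose / location services)
    public double wayspotLat, wayspotLon;  // a nearby Wayspot (Coverage API)

    void Update()
    {
        // Phone held flat (camera looking down) -> compass mode; upright -> map mode.
        bool horizontal = Vector3.Dot(cameraTransform.forward, Vector3.down) > 0.7f;

        if (horizontal)
        {
            float bearing = BearingDegrees(userLat, userLon, wayspotLat, wayspotLon);
            // Point the needle at the Wayspot, compensating for which way the user faces.
            needle.localRotation = Quaternion.Euler(0f, bearing - deviceHeadingDeg, 0f);
        }
        else
        {
            // Map mode: plot wayspotLat/wayspotLon on the on-screen 3D map instead.
        }
    }

    // Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), degrees clockwise from north.
    static float BearingDegrees(double lat1, double lon1, double lat2, double lon2)
    {
        double p1 = lat1 * Mathf.Deg2Rad, p2 = lat2 * Mathf.Deg2Rad;
        double dLon = (lon2 - lon1) * Mathf.Deg2Rad;
        double y = System.Math.Sin(dLon) * System.Math.Cos(p2);
        double x = System.Math.Cos(p1) * System.Math.Sin(p2)
                 - System.Math.Sin(p1) * System.Math.Cos(p2) * System.Math.Cos(dLon);
        return (float)(System.Math.Atan2(y, x) * Mathf.Rad2Deg);
    }
}
```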
Live meshing is used to continuously map the 3D space in front of the user as they walk. At the same time, VPS is queried continuously, and Wayspot meshes are downloaded from the Niantic cloud and shown after each successful localization. An overhead map displays both the live and cloud meshes. As additional VPS locations are localized, they become visible in the overhead map, making it clear how much ground the user has covered and how a simple local walk can become a richer experience.
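The per-localization bookkeeping in that loop can be sketched as below. The method names (OnWayspotLocalized, DownloadWayspotMesh, AddToOverheadMap) are placeholders rather than ARDK API; ARDK 3.x surfaces localization results through its location-tracking events, but the exact wiring is app-specific. The sketch just shows fetching each Wayspot’s cloud mesh once and mirroring it onto the overhead map alongside the live mesh.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using UnityEngine;

// Structural sketch only: live meshing runs on its own, while each successful VPS
// localization triggers a one-time download of that Wayspot's cloud mesh, which is
// placed at the localized pose and registered with the overhead map.
public class WayspotMeshCollector : MonoBehaviour
{
    readonly HashSet<string> _loaded = new HashSet<string>();

    // Call this from whatever raises VPS localization results in the app.
    public async void OnWayspotLocalized(string wayspotId, Pose wayspotPose)
    {
        if (!_loaded.Add(wayspotId))
            return; // this Wayspot's cloud mesh was already fetched

        Mesh cloudMesh = await DownloadWayspotMesh(wayspotId);

        var go = new GameObject($"WayspotMesh_{wayspotId}");
        go.transform.SetPositionAndRotation(wayspotPose.position, wayspotPose.rotation);
        go.AddComponent<MeshFilter>().sharedMesh = cloudMesh;
        go.AddComponent<MeshRenderer>();
        AddToOverheadMap(go);
    }

    // Placeholder: a real app would fetch the Wayspot's pre-built mesh from the Niantic cloud.
    Task<Mesh> DownloadWayspotMesh(string wayspotId) => Task.FromResult(new Mesh());

    // Placeholder: register the mesh with the overhead map view, next to the live mesh.
    void AddToOverheadMap(GameObject meshObject) { }
}
```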
Developing More Mapping Options
Our Research team has developed another experimental feature in 3.9 called Ad Hoc Mapping (Device Mapping). We know there are scenarios where the priority is to have multiple users quickly join a shared session, and this feature adds that capability. In this example, the user’s camera quickly scans an area to create a local map, and additional users can easily localize against it to join a shared session. Once the mesh is created, any content can be anchored at this location and shared by multiple people. This is an ideal solution when long-term durability of the map is less important, but it is still valuable to have all of the AR features available.
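A conceptual sketch of what the shared coordinate frame enables is below. The device-map capture and localization steps themselves are left out (the experimental Device Mapping API is not shown), and mapOriginInSession stands for the pose each device obtains after localizing against the shared ad hoc map; the transforms show why that shared map matters: a pose expressed relative to the map origin means the same physical spot for every participant.

```csharp
using UnityEngine;

// Illustrative only: once every device has localized against the same ad hoc map,
// content poses can be exchanged in "map space" and re-expressed in each device's
// own session frame.
public class SharedContentPlacement : MonoBehaviour
{
    // Pose of the shared map's origin in this device's session frame,
    // obtained after localizing against the ad hoc map (placeholder).
    public Pose mapOriginInSession;

    // HOST: express a placed object's pose relative to the map origin before sending it.
    public Pose ToMapSpace(Transform content)
    {
        Quaternion inv = Quaternion.Inverse(mapOriginInSession.rotation);
        Vector3 pos = inv * (content.position - mapOriginInSession.position);
        Quaternion rot = inv * content.rotation;
        return new Pose(pos, rot);
    }

    // GUEST: convert a received map-space pose back into this device's session frame.
    public Pose ToSessionSpace(Pose mapSpacePose)
    {
        Vector3 pos = mapOriginInSession.position
                      + mapOriginInSession.rotation * mapSpacePose.position;
        Quaternion rot = mapOriginInSession.rotation * mapSpacePose.rotation;
        return new Pose(pos, rot);
    }
}
```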
Explore What’s Possible with Niantic’s Spatial Tools
To learn more about our Mapping and AR tools, please visit lightship.dev.