
Designing a planet-scale real-world AR platform

At Niantic, we’re in the process of developing an incredible planet-scale Augmented Reality (AR) platform. In this post, we explore where we believe AR experiences are heading and the tech stack we’ve built in anticipation of that trajectory.

Edward Wu, Blogger

February 27, 2019

7 Min Read

Niantic Real-World Platform

At Niantic, we’re in the process of developing an incredible planet-scale Augmented Reality (AR) platform, designed to help would-be AR game developers quickly and easily create their games without having to solve a host of daunting technical challenges themselves. We created the Niantic Real World Platform as the technological embodiment of our core ideals: exercise, exploration, and social interaction. These principles manifest as a set of Java server libraries and Unity C# client APIs for mobile devices, and this technology underpins the core server and client engines of Ingress, Pokémon GO, and soon Harry Potter: Wizards Unite, as well as our future products.

In December, we announced the Niantic Beyond Reality Developer Contest and invited developers to join us in building incredible AR experiences on the Niantic Real World Platform before we release it to everyone. We had previously previewed the AR mapping and computer vision portions of our platform in June. Since then, many teams have asked what our platform can help them achieve, and for a broader vision of what a successful application of our technology makes possible. In this post, we explore where we believe AR experiences are heading and the tech stack we’ve built in anticipation of that trajectory.

For the sake of explanation, let’s briefly leap ahead to a world of ubiquitous wearable computing. In this world, future AR devices seamlessly blend all of our senses between the real and virtual worlds. Our everyday experiences at play, at work, and in our social lives are enhanced by hardware that is unobtrusive, can go anywhere, and is connected in real time over low-latency 5G connections. What kind of experiences would you build for this future?

Codename: Neon Demo

 

First of all, we’d expect this future to be inherently shared and social. At Niantic, we’ve seen how playing together has had an enormous impact on engagement in our games. Our players tell us that besides having fun, they have found benefits in making friends and building communities. In this future, sharing AR experiences would enhance these benefits. For that to happen, AR interactions have to feel natural to our senses. The digital must obey rules similar to the physical in order to create the suspension of disbelief in our brains. When this balance is achieved, players are immersed in a magical realism where they can have frictionless fun (check out Codename: Neon, one of our prototypes created to demo this). The technology just works as expected, obeying the laws of physics. For example, players in Codename: Neon can harvest energy from the white pellets on the ground, and those are a shared resource: if one player gets them, the other players can't!
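For a shared-resource rule like that to hold, the outcome has to be decided in one authoritative place. Here is a minimal sketch of how a server might arbitrate pellet harvesting; the names (PelletStore, TryHarvest) are hypothetical stand-ins rather than the platform's actual API, and Niantic's real server libraries are Java rather than C#.

```csharp
using System.Collections.Concurrent;

// Hypothetical sketch of a server-authoritative "shared pellet" claim.
// Not the platform's actual API; names are illustrative only.
public class PelletStore
{
    // pelletId -> playerId of the harvester; absent if unclaimed.
    private readonly ConcurrentDictionary<string, string> _claims = new();

    // Atomically claim a pellet. TryAdd succeeds for exactly one caller,
    // so every client observes the same winner, however close the race.
    public bool TryHarvest(string pelletId, string playerId)
    {
        return _claims.TryAdd(pelletId, playerId);
    }
}
```

Because the claim is a single atomic operation, no two players can both walk away believing they harvested the same pellet.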

Second, we’d expect the world of AR to be heterogeneous, reflecting the variety, beauty, and diversity of the planet we live on. Naturally, we’d be incentivized to explore the real and virtual worlds together, and the data used to create virtual experiences should be tightly coupled to the real world. For example, in Pokémon GO, when it rains at a player’s location in the real world, it rains in the game as well. Sensory experiences should align, because in doing so, we are driven to discover this brand new world of AR in a way that is connected and meaningful.
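As an illustration of this kind of coupling, consider biasing what appears in the game based on live local conditions. This is a hedged sketch of the general idea, not Pokémon GO's actual spawn logic; the types and weights are invented for the example.

```csharp
// Illustrative only: weather-conditioned spawn weighting.
public enum Weather { Clear, Rain, Snow, Windy }

public static class SpawnRules
{
    // Return a multiplier on how likely a creature type is to appear,
    // given the live weather at the player's real-world location.
    public static double SpawnWeight(string creatureType, Weather weather)
    {
        // Invented weights: rain favors water creatures, snow favors ice.
        if (weather == Weather.Rain && creatureType == "water") return 2.0;
        if (weather == Weather.Snow && creatureType == "ice") return 2.0;
        return 1.0;
    }
}
```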

Third, we’d expect those virtual worlds to push us to exercise and move. Capitalizing on our own natural rhythms of motion and rest, we can create compelling experiences that invite us to move and ensure that the AR and geospatial experiences unfold at an intentional pace. For example, Codename: Neon has a first-of-its-kind AR game mechanic that encourages players to move at the scale of large open spaces like parks, rather than at tabletop scale. In Ingress and Pokémon GO, too, kilometers walked are an MMO resource, which opens up many possibilities for incentivizing players to explore their neighborhoods or cities.
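Treating distance as a resource means turning raw GPS fixes into credited kilometers. Here is a minimal sketch of that bookkeeping, assuming a haversine great-circle distance and a walking-speed cap; the thresholds are assumptions for illustration, not Niantic's actual rules.

```csharp
using System;

// Illustrative sketch of accumulating "kilometers walked" from GPS fixes.
// The speed cap is an assumed value, not Niantic's actual policy.
public static class DistanceTracker
{
    const double EarthRadiusKm = 6371.0;

    // Great-circle (haversine) distance between two lat/lng fixes.
    public static double HaversineKm(double lat1, double lng1,
                                     double lat2, double lng2)
    {
        double dLat = Deg2Rad(lat2 - lat1);
        double dLng = Deg2Rad(lng2 - lng1);
        double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2) +
                   Math.Cos(Deg2Rad(lat1)) * Math.Cos(Deg2Rad(lat2)) *
                   Math.Sin(dLng / 2) * Math.Sin(dLng / 2);
        return 2 * EarthRadiusKm * Math.Asin(Math.Sqrt(a));
    }

    static double Deg2Rad(double d) => d * Math.PI / 180.0;

    // Credit distance only at plausible walking speeds, so driving
    // across town doesn't count as exploration on foot.
    public static double CreditedKm(double km, double seconds,
                                    double maxKmh = 10.0)
    {
        double kmh = km / (seconds / 3600.0);
        return kmh <= maxKmh ? km : 0.0;
    }
}
```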

What technology did we build to satisfy these ideals? For our experiences to exist in a single virtual world, we need a single-instance, real-time, geospatially queryable environment that operates at massive scale and allows for the creation and mutation of shared geospatial objects. A central part of our platform is therefore a real-time geospatial storage, indexing, and serving engine, which manages worldwide geospatial objects that developers can control. Because we envisioned single-world AR games that integrate and tie everyone’s reality together at massive scale, with monthly usage measured in the billions of users, a major part of our technological investment went into horizontal scalability while retaining a single world instance. We achieved this primarily by rethinking how server-authoritative games could be run on horizontally scalable Kubernetes container technology together with denormalized NoSQL datastores, rather than on the single-instance relational SQL databases that MMOs were typically built on in the past. Pokémon GO is built entirely on this platform, and has demonstrated concurrent real-time usage of several million players in a single, consistent game environment, with monthly usage in the hundreds of millions.
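One common way to make a geospatial store shard horizontally is to key every object by the map cell that contains it, so that "what's near this player?" becomes a handful of key lookups rather than a global scan. The sketch below uses a flat fixed-resolution grid for brevity; production systems typically use hierarchical cells (such as S2), and none of these names come from the platform's actual API.

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of cell-keyed geospatial indexing. A fixed-resolution
// grid is an assumption for illustration; real systems generally use
// hierarchical cells so query radius can vary.
public class GeoIndex<T>
{
    const double CellSizeDeg = 0.01; // ~1 km at the equator (assumed)

    // Denormalized store: one bucket of objects per cell key. Each
    // bucket shards naturally across a horizontally scaled datastore.
    private readonly Dictionary<(long, long), List<T>> _cells = new();

    static (long, long) CellOf(double lat, double lng) =>
        ((long)Math.Floor(lat / CellSizeDeg),
         (long)Math.Floor(lng / CellSizeDeg));

    public void Add(double lat, double lng, T obj)
    {
        var key = CellOf(lat, lng);
        if (!_cells.TryGetValue(key, out var bucket))
            _cells[key] = bucket = new List<T>();
        bucket.Add(obj);
    }

    // Fetch objects in the player's cell and its 8 neighbors.
    public IEnumerable<T> Nearby(double lat, double lng)
    {
        var (cy, cx) = CellOf(lat, lng);
        for (long y = cy - 1; y <= cy + 1; y++)
            for (long x = cx - 1; x <= cx + 1; x++)
                if (_cells.TryGetValue((y, x), out var bucket))
                    foreach (var obj in bucket)
                        yield return obj;
    }
}
```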

Additionally, during our initial development of Ingress, we found it infeasible to create a game world that seemed real and concrete using procedurally generated or scraped content: doing so would create a sense of cognitive dissonance, whereby the game would highlight locations that were algorithmically sensible but in reality insipid. As a result, our games rely on our comprehensive dataset of millions of the world’s most interesting and accessible places to play, nearly all of it user-generated. Along with the toolchain to load and select these places into game titles based on metadata criteria, this dataset has been continuously submitted, curated, and updated by our players over the past six years. As such, this dataset is critical to enabling experiences built on our platform to inspire users to explore their real world and walk from place to place.
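Selecting places "based on metadata criteria" might look something like the following; the PointOfInterest shape and tag names are invented for illustration and are not the platform's actual schema.

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of selecting places from a curated POI dataset
// by metadata criteria; all fields here are illustrative only.
public record PointOfInterest(string Name, double Lat, double Lng,
                              HashSet<string> Tags, bool Accessible);

public static class PoiSelector
{
    // Pick candidate game locations: accessible places matching any
    // of the requested tags.
    // Usage: PoiSelector.Select(dataset, "park", "mural", "fountain")
    public static IEnumerable<PointOfInterest> Select(
        IEnumerable<PointOfInterest> dataset, params string[] tags)
    {
        return dataset.Where(p => p.Accessible &&
                                  p.Tags.Overlaps(tags));
    }
}
```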

Pokémon GO and Ingress screenshots

 

These features are exposed through a set of mobile client-side Unity APIs that map to a high-performance native plugin, enabling performant rendering and stylization of both real-world map information and geospatial game objects. We’ve found that one of the least intuitive parts of a game to code, and one that can have the greatest impact on serving scalability and client performance, is maintaining the real-time linkage between a player’s location and the state of the game world around them. As a result, we focused our efforts on creating an intuitive API that handles the intricacies of querying and caching both map and geospatial objects as the player moves about the world. This lets developers build planet-scale, single-instance, real-time multiplayer gameplay with ease, freeing them to focus on finding the fun in their game designs.
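The client-side pattern being described is roughly: re-query the server only once the player has moved far enough to invalidate the cached surroundings. Below is a hedged Unity-flavored sketch of that pattern; GeoClient, GeoObject, and the 50-meter threshold are hypothetical stand-ins, not the platform's actual API.

```csharp
using UnityEngine;

// Illustrative stand-ins for the platform's query API (assumed shape).
public struct GeoObject { public string Id; public Vector2 LatLng; }

public static class GeoClient
{
    public static void QueryNearby(Vector2 latLng,
                                   System.Action<GeoObject[]> callback)
    {
        callback(new GeoObject[0]); // stub for the sketch
    }
}

// Re-query nearby geospatial objects only after the player has moved
// beyond a threshold, caching results in between.
public class NearbyObjectLoader : MonoBehaviour
{
    const float RequeryDistanceMeters = 50f; // assumed threshold

    Vector2 _lastQueryLatLng;
    bool _hasQueried;

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running)
            return;

        var here = new Vector2(Input.location.lastData.latitude,
                               Input.location.lastData.longitude);

        // Rough meters-per-degree conversion; fine for a threshold test.
        float movedMeters = (here - _lastQueryLatLng).magnitude * 111_000f;

        if (!_hasQueried || movedMeters > RequeryDistanceMeters)
        {
            _lastQueryLatLng = here;
            _hasQueried = true;
            GeoClient.QueryNearby(here, OnObjectsReceived);
        }
    }

    void OnObjectsReceived(GeoObject[] objects)
    {
        // Cache results and spawn/update corresponding game objects here.
    }
}
```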

Finally, we created a comprehensive set of APIs for real-time multiplayer AR experiences that use the phone as a control device and a viewing portal into the virtual world. We’ve focused on solving the hard problems in computer vision, networking, and game prototyping tools so that developers can focus on building the experience. Our technology optimizes for real-time AR, achieving peer-to-peer multiplayer latencies in the tens of milliseconds. To put this in perspective, at 60fps a new frame is displayed roughly every ~16ms, so with latencies in that range we can render players at their actual positions. This means that in a multiplayer AR game built on our stack, you see where your friends are rather than where they were. We also built advanced computer vision algorithms into our client libraries, delivering faster synchronization and tracking for multiplayer AR and a better end-user experience. All of this works cross-platform, with prototyping tools that enable developers to iterate in seconds rather than minutes.
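To see why latency on the order of a frame matters, consider rendering a peer from their last network update. A minimal sketch, assuming each update carries a position, velocity, and send timestamp (illustrative fields, not the platform's actual wire format): extrapolating by the update's age keeps the rendered pose current when that age is only a frame or two.

```csharp
using UnityEngine;

// Illustrative peer-state extrapolation. When one-way latency is
// comparable to a frame (~16 ms at 60 fps), a simple linear prediction
// renders friends where they are, not where they were.
public struct PeerState
{
    public Vector3 Position;
    public Vector3 Velocity;
    public float SentTime; // sender timestamp, in seconds
}

public static class PeerPrediction
{
    // Extrapolate the peer's position to "now" using the update's age.
    public static Vector3 PredictedPosition(PeerState s, float now)
    {
        float age = now - s.SentTime; // ~0.016 s on a fast link
        return s.Position + s.Velocity * age;
    }
}
```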

Combined, our technology platform enables rapid development of incredibly diverse and differentiated planet-scale AR experiences. We’ve used it internally on everything from hackathons to early-stage game prototypes, all the way up to our upcoming title, Harry Potter: Wizards Unite. But we know there’s a whole world of great ideas and developers that this technology can empower. If you would like to help us field test our platform, head to the website and sign up. We'd love to help you create something amazing.

-  Diana Hu (Head of AR Platform) & Ed Wu (VP, Platform), Niantic