
Apple's ARKit game development: a whole new world

At Skullfish Studios, we took on the difficult task of making an AR game with Apple's new ARKit within one month. Here we explain the obstacles we ran into during development and how we solved them.

Jose Joao Oliveira Junior, Blogger

September 27, 2017

7 Min Read

Game creation is never a straight line. Game after game, we need to reinvent ourselves to achieve the best result possible. And when new technologies appear, we have to be even more creative.

We heard about Apple’s new ARKit a month before its launch date. What could we do with a technology we had never worked with before? Not much, right? So why not go beyond that?

First of all, for those who have never worked with it: Augmented Reality (AR) is basically the integration of digital information with the user's environment in real time. So what innovation does Apple's ARKit bring over conventional AR? Conventional AR needs a pattern mapped onto the environment to use as an anchor point, and everything is created from that point. ARKit, on the other hand, doesn't need any predefined pattern: it scans your surroundings and creates its own anchors, called planes. Planes are the core of successful ARKit development; the framework knows that you have a table and the floor under it. With all of this data accessible to the developer, imagine the possibilities.

[Traditional Augmented Reality]

[ARKit]

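To give an idea of what that plane data looks like in practice, here is a minimal sketch using Unity's ARKit plugin (the event and field names below are as we recall them from the Unity-ARKit-Plugin; the logging itself is just illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit plugin namespace

// Logs every plane ARKit discovers: this is the "table and floor"
// data the framework hands to the developer.
public class PlaneLogger : MonoBehaviour
{
    void OnEnable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnPlane;
        UnityARSessionNativeInterface.ARAnchorUpdatedEvent += OnPlane;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnPlane;
        UnityARSessionNativeInterface.ARAnchorUpdatedEvent -= OnPlane;
    }

    void OnPlane(ARPlaneAnchor plane)
    {
        // Each plane has a center and an extent in metres, so the app
        // knows roughly how big the detected surface is.
        Debug.Log("Plane " + plane.identifier +
                  " center: " + plane.center + " extent: " + plane.extent);
    }
}
```
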
We decided to use our main franchise, Lila’s Tale, a multiple award winner in Virtual Reality (VR), as the base for our creation. This would let us expand our IP and focus on the new interactions, since we already had a solid structure for the world the game takes place in.

The main train of thought was: we needed something that could use ARKit's full potential. We could make players move around, look up, look down, crouch. How could we use all of this in a game? Hide&Seek was our first answer.

The first prototypes were our first contact with the new technology, and of course it didn’t work as we expected. All the objects we created in the user's environment were flickering, drifting, floating away; they never stood still where they should. This is when we discovered how ARKit's plane detection works: before ARKit detects any planes, all object positions are rather imprecise. So the first thing to do is wait for plane detection. After it is done, ARKit works wonderfully.
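
Waiting for that first plane can be done by gating the level spawn on the plugin's anchor event. A minimal sketch, again assuming the Unity-ARKit-Plugin's event and helper names (the content spawn itself is a hypothetical placeholder):

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit plugin namespace

// Holds off on spawning content until ARKit reports its first plane,
// since object positions are unreliable before that.
public class WaitForPlane : MonoBehaviour
{
    public GameObject contentPrefab; // hypothetical: the level root to place
    private bool planeFound;

    void OnEnable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnPlaneAdded;
    }

    void OnDisable()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnPlaneAdded;
    }

    void OnPlaneAdded(ARPlaneAnchor anchor)
    {
        if (planeFound) return;
        planeFound = true;

        // Place the level on the first detected plane.
        Vector3 pos = UnityARMatrixOps.GetPosition(anchor.transform);
        Instantiate(contentPrefab, pos, Quaternion.identity);
    }
}
```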

After two weeks we had an alpha version of the Hide&Seek game. The game consists of Lila, our main character, hiding herself in an environment we create on top of the user's surroundings. There is a dragon flying around, also searching for her, so the player has to find Lila before the dragon does. Sometimes Lila hides inside spots like a building or a cave, and players have to move their devices inside them to find her. Many hiding spots were only accessible if the player interacted with levers or valves. The fun part is that players had to replicate the lever and valve motions with their device: for example, approaching the lever and pulling the device back to activate it, or spinning the device in circles to turn the valve.
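
The gesture detection itself doesn't need anything ARKit-specific: since the tracked camera follows the physical device, you can just watch how it moves. A hypothetical sketch of the lever's pull-back interaction (all names and distance thresholds here are illustrative, not our shipped values):

```csharp
using UnityEngine;

// Hypothetical lever that triggers when the player physically moves the
// device close to it and then pulls the device back.
public class PullLever : MonoBehaviour
{
    public Transform arCamera;         // the ARKit-driven camera
    public float armDistance = 0.3f;   // metres: how close you must get to arm it
    public float pullDistance = 0.15f; // metres: how far back you must pull

    private bool armed;
    private Vector3 armPosition;

    void Update()
    {
        float dist = Vector3.Distance(arCamera.position, transform.position);

        if (!armed && dist < armDistance)
        {
            // Close enough: remember where the pull started.
            armed = true;
            armPosition = arCamera.position;
        }
        else if (armed)
        {
            // How much farther from the lever is the device now?
            float pulledBack = dist - Vector3.Distance(armPosition, transform.position);
            if (pulledBack > pullDistance)
            {
                armed = false;
                Debug.Log("Lever pulled!"); // reveal the hidden spot here
            }
        }
    }
}
```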

At this point we tested our new game and, to our surprise, something was missing: it was not as fun as we had planned. After thinking for a while, we realized the game was not rewarding the player enough. The player finds Lila before the dragon, the game resets, Lila hides in another place, the dragon becomes a little more difficult... and then what? We had already spent 15 days on it, with only half the time left. OK, time to let it go, back to the drawing board, let's do another one.

We already had a VR game to build on, Lila’s Stealth, whose best feature on Gear VR is "no hand interaction": all movement is done by pointing where you want Lila to go, and she goes. Why not make something similar? We could move the player around, lighting the way so Lila gets from point A to point B, while using ARKit's full potential. OK, seems nice. What would be the challenge? She could dodge some enemies on her way. A stealth game! Like Metal Gear or Hitman. This could work! So far so good. Let's make a two-day prototype and... it worked! The fun was back in the game!
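
The point-and-go control translates almost directly to AR: raycast from the screen touch through the AR camera and hand the hit point to Lila's NavMeshAgent. A minimal sketch, assuming the level geometry has colliders and Lila carries a NavMeshAgent (names are illustrative):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Tap-to-move: raycast from the touch through the AR camera and send
// Lila's NavMeshAgent toward the point that was hit.
public class TapToMove : MonoBehaviour
{
    public Camera arCamera;    // the ARKit-driven camera
    public NavMeshAgent lila;  // hypothetical: the agent on Lila

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Ray ray = arCamera.ScreenPointToRay(Input.GetTouch(0).position);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit))
        {
            // The agent finds a path across the runtime-baked NavMesh.
            lila.SetDestination(hit.point);
        }
    }
}
```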

With the game core ready, we needed to fix some problems. Users will be users, always. We knew ARKit only works after plane detection, but how do you ask users to map their environment? Asking directly wouldn't work; they don't know what that means. We needed something that forces users to move their device around while, in the background, we map everything until we find at least one plane. The solution was a bunch of butterflies flying around the device (camera-oriented, so they wouldn't flicker or drift unexpectedly) and a simple text like "Follow the butterflies". This worked like a charm! We could nudge users into doing what we wanted without telling them. They were happy following the butterflies; we were happy detecting planes. We later upgraded this idea by putting Lila among the butterflies and having users help her search for her world. Perfect!
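
A rough sketch of the trick: the butterflies are positioned relative to the camera, so they stay stable even with no planes detected yet, and the script hands over to the real content once the first plane arrives (the orbit math and names are illustrative):

```csharp
using UnityEngine;

// Onboarding trick: butterflies orbit a point in front of the device,
// positioned relative to the camera so they stay stable even before
// any plane has been detected.
public class ButterflyGuide : MonoBehaviour
{
    public Transform arCamera;         // the ARKit-driven camera
    public Transform[] butterflies;
    public float orbitDistance = 0.5f; // metres in front of the device
    public float orbitRadius = 0.15f;  // metres around the anchor point

    void Update()
    {
        Vector3 anchor = arCamera.position + arCamera.forward * orbitDistance;

        for (int i = 0; i < butterflies.Length; i++)
        {
            // Spread the butterflies evenly and slowly circle them.
            float angle = Time.time * 2f + i * Mathf.PI * 2f / butterflies.Length;
            Vector3 offset = arCamera.right * Mathf.Cos(angle) * orbitRadius
                           + arCamera.up * Mathf.Sin(angle) * orbitRadius;
            butterflies[i].position = anchor + offset;
        }
    }

    // Called from the plane-detection callback once mapping succeeds.
    public void OnFirstPlaneFound()
    {
        enabled = false; // hand over to the real game content
    }
}
```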

The first big dev problem was a crash: every time we loaded a new scene, the app crashed. What were we doing wrong? We had all the objects and scripts necessary for ARKit to work properly in each scene, and that was exactly the problem: the crash happened when Unity destroyed those objects on scene load in order to recreate them in the next scene. Simple solution: Don't Destroy On Load. Make them persistent across every scene and you should be fine.
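
A minimal sketch of the fix, attached to the root object holding the ARKit session scripts (the duplicate guard is our own addition so that reloading a scene containing the object doesn't create a second copy):

```csharp
using UnityEngine;

// Keeps the ARKit session objects alive across scene loads instead of
// letting Unity destroy and recreate them (which crashed the app).
public class ARKitPersistence : MonoBehaviour
{
    private static ARKitPersistence instance;

    void Awake()
    {
        if (instance != null)
        {
            // A copy already survived from a previous scene: discard this one.
            Destroy(gameObject);
            return;
        }
        instance = this;
        DontDestroyOnLoad(gameObject);
    }
}
```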

The second big problem was scale. We had to create every level within a maximum size of 1.5 meters, which we thought was a good size since users don't always have much space to move around. That is an incredibly small scale to work with in Unity. We were driving Lila's movement through Unity's NavMesh, and every path it calculated was either too small or didn't exist. The solution: since we were generating the NavMesh at runtime, we manually changed some properties of the NavMesh generation, like the agent radius, but the most important one was the manual voxel size. This property controls the accuracy of Unity's calculation: the more voxels per agent radius, the more accurately the NavMesh is calculated, but the longer it takes to bake. Unity suggests 3.00 voxels per agent radius as a "good trade-off between accuracy and bake speed". Since we were baking in real time, we needed better accuracy without too much extra baking cost. We changed it to 5.00 and it worked fine for us. Lila could move properly now.
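
A sketch of that kind of runtime bake with Unity's NavMeshBuilder API (the agent radius and bounds are illustrative; the key part is overriding the voxel size to five voxels per agent radius instead of the suggested three):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.AI;

// Bakes a NavMesh at runtime for a tiny (~1.5 m) level, with a smaller
// voxel size than the default so paths survive the miniature scale.
public static class RuntimeNavMeshBaker
{
    public static NavMeshData Bake(Transform levelRoot, Bounds bounds)
    {
        // Collect all render geometry under the level root as NavMesh sources.
        var sources = new List<NavMeshBuildSource>();
        NavMeshBuilder.CollectSources(levelRoot, ~0, NavMeshCollectGeometry.RenderMeshes,
                                      0, new List<NavMeshBuildMarkup>(), sources);

        NavMeshBuildSettings settings = NavMesh.GetSettingsByID(0);
        settings.agentRadius = 0.02f;     // tiny agent for a 1.5 m level (illustrative)
        settings.overrideVoxelSize = true;
        // 5 voxels per agent radius instead of Unity's suggested 3:
        settings.voxelSize = settings.agentRadius / 5f;

        NavMeshData data = NavMeshBuilder.BuildNavMeshData(
            settings, sources, bounds, Vector3.zero, Quaternion.identity);
        NavMesh.AddNavMeshData(data);
        return data;
    }
}
```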

The development part is done! Everything is working as designed so far. Everything else is pretty much the same as developing for mobile: taking care of optimizations in both code and assets. ARKit only runs on Apple's 64-bit A9 chips, which are pretty good, so we could push the quality limits a little, since we weren't developing for low-end devices.

Augmented Reality hadn't impressed me before, because it needed a predefined pattern in the environment to work. It was fine for B2B applications, but with the new Apple ARKit the AR universe has expanded A LOT. Knowing what players have in their surroundings can take the game experience to another level. The difficulty for us, game developers and designers, is using this data to our benefit. Creating something reality-believable for the user will be our goal from now on. My projection for the future: when VR merges with this new AR technology, we will have the possibility of crafting the supreme game experience.
