
7 tips to help build a better experience in VR

How the user experience of Finding Monsters Adventure was redesigned for VR after its mobile version. Here are 7 tips for developers to create great experiences in virtual reality. From testing to motion sickness, learn what we've learned on this project.

Gabriela Thobias, Blogger

December 11, 2015

16 Min Read

Hello! My name is Gabriela Valentin Thobias and I’m a UI/UX Designer at Black River Studios, in Manaus, Brazil. What brought me to the middle of the jungle was the opportunity to create unique, high-quality, creative games. But the cherry on top was working with Gear VR. Developing for VR is different from anything I’ve worked on before. As a designer, of course I worry about legibility, usability, organization, and so on, but, as there are no rules or manual of good practices, it was a huge challenge.

 

 

Our game, Jake & Tess Finding Monsters Adventure, started as a mobile game. The main goal is to take pictures of cute and cool monsters and score high. It turned out great and was recently released on Google Play. When the VR project started alongside the mobile one, we thought about making a straight port. But we found out that porting it would be frustrating: we would not be using all of VR's capabilities, and the experience had too many issues, like too few interactions, too little feedback, and people getting lost. After dropping the porting idea, we worked on an exclusive version for VR. We were able to create a small demo, good enough to show at events, and we received a lot of great feedback. We saw there that people needed something more: they wanted to interact more closely with the monsters, feel the environment, and have a fluid and comfortable experience, so we pushed forward to create an innovative VR experience. To learn more about how we changed the game design features to better serve VR, check Bringing Finding Monster to VR, by Thiago "Beto" Alves.

 

 

We’ve learned a lot about usability and user experience in VR since the project began, so we hope these seven tips may help other developers. Here are some things that didn’t work at first and how we fixed them:

 

1- Don’t assume anything without proving it right

At the beginning of the project, we had a screen that was supposed to give players the information needed to play a level. As we were placing it in front of the camera, we were testing it in Unity itself, and it seemed good and legible. But the minute we made a build and tested it, the position was all wrong. The text was too bright, the information was closer than expected, and, being so close, it even felt a little claustrophobic.

 

The red square shows what the user could see without moving. At first, we thought it'd be nice to look up or down to interact with the interface, but it proved to be annoying for most of our play testers.

From then on, for every single little change we made, we built and tested it on the Gear VR. The big lesson here was to test everything, every time, and whenever I had made enough changes, I tested with other people. People will see things differently than you do; no one is the same. VR is a much closer, more emotive and personal medium than mobile is, and sometimes the interface doesn't communicate what you want because you didn't test it enough. Listen to the feedback.

 

2- Research a lot

At the beginning of the project, we dedicated some time to research. We searched for everything we could find about VR. From John Carmack’s talks to VR gameplay videos, we tried to absorb everything out there to create the best experience we could.

For everyone who asks me which are the best talks to watch about designing for VR, I always recommend these two:

These two talks (among others) helped us a lot to understand more about ergonomics and field of view. Although there is no manual of good practices, we can learn a lot from the research being shared out there.

 

3- Legibility according to distance

It’s well known that, in most cases, information should always be legible. In our case, there were some texts, like the tutorial, that players needed to be able to read well. In VR, text distance may vary according to the size of your environment and assets. If you are in a small room, your assets should be no farther away than the wall of the room. If the assets are much farther than the wall, this may happen:

 

The text will show heavy aliasing because, even though the asset is big, it looks shrunken and the text gets overly contrasting. And people will find it uncomfortable simply because it’s not natural. In the real world, you don’t see a huge book far away, big enough to be read, crossing through your wall, right?
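A handy way to reason about this is to keep the text's angular size constant: text pushed twice as far away has to be twice as tall to stay equally legible. The sketch below is standard geometry, not a tool from the project; the function name is my own.

```python
import math

def text_height_for_angle(distance_m: float, angular_size_deg: float) -> float:
    """World-space text height needed to subtend a given visual angle.

    Keeping angular size constant means text placed twice as far away
    must be scaled to twice the height to remain equally legible.
    """
    return 2.0 * distance_m * math.tan(math.radians(angular_size_deg) / 2.0)

# A line of text meant to subtend roughly 1.5 degrees:
near = text_height_for_angle(2.0, 1.5)  # text placed 2 m away
far = text_height_for_angle(8.0, 1.5)   # same text pushed to 8 m (4x taller)
```

This also explains the "huge book crossing your wall" effect: past the wall, keeping the text readable forces it to sizes that feel unnatural for the room.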

 

 

What really helped us keep track of the distances was a tool created by the engineers. At the beginning of the project, we planned on having information assets around the player, just like a gallery in an exhibition. We wanted to give that feeling to the players. So the engineers created a Gallery Editor, a tool that places the assets we want at the right distance, facing the camera, and we could even set the angle between each asset.
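The core of such a tool boils down to placing items on a circle around the player and turning each one back toward the center. This is a minimal sketch of that idea, not the studio's actual Gallery Editor; all names and parameters are illustrative.

```python
import math

def gallery_positions(count, radius, start_deg=0.0, step_deg=30.0):
    """Place `count` gallery assets on a circle around the player (at the
    origin), `step_deg` apart, each with the yaw it needs to face the player.

    Returns a list of ((x, z), facing_yaw_deg) tuples.
    """
    items = []
    for i in range(count):
        yaw = math.radians(start_deg + i * step_deg)
        x, z = radius * math.sin(yaw), radius * math.cos(yaw)
        # Turn the asset 180 degrees back toward the origin (the camera).
        facing = (math.degrees(math.atan2(x, z)) + 180.0) % 360.0
        items.append(((x, z), facing))
    return items
```

In an engine like Unity the same placement would be done with transforms, but the trigonometry is the same.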

 

 

4- Motion Sickness, hell no!

Do you know that feeling when you are reading in the back seat of a car and start feeling nauseated? That’s motion sickness. It happens whenever the environment around you moves while you stay still. Some people are more sensitive than others. According to Wikipedia:

 

“In virtual reality, however, the effect is made more acute as all external reference points are blocked from vision, the simulated images are three-dimensional and in some cases stereo sound that may also give a sense of motion.”

 

In VR, it’s necessary to be cautious with camera movement. In our game, we avoid motion sickness by keeping the player still all the time while taking pictures. There is only one level in which the camera moves. In that level, we start with a very light movement and, as time goes by, it gets faster. We don’t force the camera to turn one way or another; the player chooses where to look. Also, as we found out in our research, camera movement causes less discomfort when the player is in a cockpit, so we placed the camera on top of a monster. Although we don't use it in our game, a bluetooth controller can help reduce motion sickness, since players can control everything, including the camera.
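The "start light, then speed up" ramp can be sketched as an eased speed curve. This is only an illustration of the idea, assuming a smoothstep ease-in; the function and its parameters are not from the game's code.

```python
def camera_speed(t: float, ramp_seconds: float, max_speed: float) -> float:
    """Ease camera speed from 0 to max_speed over ramp_seconds using
    smoothstep, so the ride begins gently and accelerates gradually."""
    u = min(max(t / ramp_seconds, 0.0), 1.0)  # clamp progress to [0, 1]
    return max_speed * (u * u * (3.0 - 2.0 * u))

# Speed at the start, midpoint, and end of a 10-second ramp to 2 m/s:
# camera_speed(0.0, 10.0, 2.0), camera_speed(5.0, 10.0, 2.0), camera_speed(10.0, 10.0, 2.0)
```

A smoothstep ramp avoids the sudden acceleration jolts that are a common sickness trigger; past the ramp the speed simply stays clamped at the maximum.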

 

On the third Moon level, you can ride a monster. We tried to avoid motion sickness by keeping the movement light and not taking control of the camera, even when the monster changes direction.

 

5- Better an excess of feedback than too little

People barely know VR yet. So, to keep players from getting lost, we created a lot of feedback for each player action.

Crosshair: Our crosshair has two states: idle and Clickable UI. The first is a little dot that gives the player a notion of where the center is, so they can drag it over clickable elements. The second, the “Clickable UI”, appears every time the player’s aim is over a button.
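The two-state logic is simple enough to sketch: the reticle state is derived entirely from what the gaze ray currently hits. This is an illustrative model, not the game's code; `UiElement` and the state names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UiElement:
    name: str
    clickable: bool = False

def crosshair_state(gaze_hit: Optional[UiElement]) -> str:
    """Idle dot by default; switch to the 'Clickable UI' reticle whenever
    the gaze ray is over a clickable element."""
    if gaze_hit is not None and gaze_hit.clickable:
        return "clickable_ui"
    return "idle"
```

In engine terms, `gaze_hit` would come from a raycast from the camera each frame, and the state change would drive the reticle's animation.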

 

 

Buttons: The buttons have the usual animated states (idle, hover, and pressed), but to make things easier for the player, we made bigger colliders (so players can click them faster), sound feedback (on hover and press), and VFX feedback.
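"Bigger colliders" just means the clickable region is inflated beyond the visible button. A minimal 2D hit-test sketch of that idea (the names and padding value are illustrative, not from the project):

```python
def hit_test(point, rect, padding=0.0):
    """Gaze hit test against a button rect (x, y, width, height), with
    optional padding that inflates the clickable area beyond the
    visible button's edges."""
    px, py = point
    x, y, w, h = rect
    return (x - padding <= px <= x + w + padding and
            y - padding <= py <= y + h + padding)

# A point just outside a unit button misses, but hits once padded:
# hit_test((1.1, 0.5), (0, 0, 1, 1)) vs hit_test((1.1, 0.5), (0, 0, 1, 1), padding=0.2)
```

With a gaze cursor, a little forgiveness like this makes a big difference, since head aiming is far less precise than a mouse.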

 

 

6- Information in the front

In our tests, having too much information attached to the player’s HUD was uncomfortable and unnecessary. It didn’t feel natural. Players enter a whole new world in VR; they want to be able to look around and see the details.

So, as information was far more comfortable placed in world space, we put most of our assets right in front of the player. Every time a loading finishes, the information “is born” at the x-angle (horizontal direction) the VR user is looking at.
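Spawning at the player's horizontal look direction means using only the yaw of the camera's forward vector and discarding pitch, so the UI appears level in front of the player even if they were looking up or down. A hedged sketch of that placement (function name is my own):

```python
import math

def spawn_in_front(forward, distance):
    """Spawn point at the player's current yaw, ignoring pitch.

    `forward` is the camera's forward vector (x, y, z); the result sits
    `distance` away at eye-level height 0, in the horizontal direction
    the player is facing.
    """
    yaw = math.atan2(forward[0], forward[2])  # horizontal angle only
    return (distance * math.sin(yaw), 0.0, distance * math.cos(yaw))
```

Because pitch is discarded, looking straight ahead and looking up-and-ahead both spawn the UI in the same level spot in front of the player.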

 

 

The only information attached to the player’s camera is the camera HUD and the countdown to begin the game. To avoid annoyance, we made the HUD very light and unobtrusive. Also, the HUD is not right in the middle of the camera; it sits slightly lower on the y-axis so the film and time information stays legible and we avoid chromatic aberration (when assets are too close to the edge of the screen, we see green and red fringing due to the curvature of the VR lenses). The counter is attached to the camera but tracks its position with a lerp, so it’s not fixed on the screen, which gives a natural feeling.
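The "lerp" trailing behavior is typically an exponential smooth toward the anchor point in front of the camera. Here is a minimal, frame-rate-independent sketch of that technique, not the game's implementation:

```python
import math

def follow_lerp(current, target, smoothing, dt):
    """Frame-rate-independent exponential smoothing toward the target.

    Each frame the position covers a fixed fraction of the remaining
    distance, so the counter drifts after the camera instead of being
    rigidly glued to it.
    """
    t = 1.0 - math.exp(-smoothing * dt)  # fraction to cover this frame
    return tuple(c + (g - c) * t for c, g in zip(current, target))
```

Using `1 - exp(-smoothing * dt)` instead of a fixed per-frame fraction keeps the trailing speed consistent whether the headset renders at 60 or 72 fps.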

 

 

7- Natural Flow

Just like on mobile or any other platform, people must be sure of where they are, where they are going, and what they are doing. To achieve a natural flow in VR, our suggestions are:

 

A - Have a visible button for going back to the previous page

 

In Level Selection, we have a button back to World Selection, so the player can go back.

 

B - Have a title on each new screen (this also helps people give more precise feedback)

 

Each level has its own name.

 

C - On Gear VR, use the physical back button to instantly go to the previous screen or to Oculus’ page

As always, the invisible flow is the most natural one, because people won’t even notice it. A natural flow is key to a good experience in VR, so pay attention to Oculus’ guidelines on Reserved User Interactions from the very beginning of your VR project.

In conclusion, VR is the next big thing, and more than ever, everyone wants to see better and more innovative games for it. It’s challenging to work with, and we really love it. This is a super immersive new technology. People will learn over time how to develop better and better for it, and also how to better interact with it, and it’s good that we share our visions so we can reach a consensus on how to create the best experience. VR is all about experience.

 

 
