Making Great VR: Six Lessons Learned From I Expect You To Die

A collection of VR tips from Jesse Schell, a 20-year VR veteran, featuring examples and illustrations from "I Expect You To Die", an Oculus Rift project from Schell Games.

Jesse Schell, Blogger

June 26, 2015


 

Why VR?


At this point in time (June 2015), there is a lot of skepticism about VR in the game industry. Typical things game developers say about it include:

 

  • The Virtual Boy already showed us that VR doesn’t work.

  • They tried VR in the nineties -- it didn’t work then, why would it work now?

  • It’ll just be a fad -- like the Kinect, or 3DTV.

  • Nobody is going to want to put a device on their face.

  • The motion sickness problem is unsolvable.

  • What’s the point? There’s nothing you can do in VR you can’t already do on a game console.

My point of view is a bit different. I believe that after a massive flurry of VR hype in 2016, VR devices will be on the market indefinitely. They will not completely replace our existing game platforms, but instead they will join our gaming ecosystem, living and thriving alongside PC, console, and mobile gaming. The difference will be that VR gaming will be the most intense gaming experience, the mode of gaming that hardcore players will value most in the long run. I have been developing VR games for over twenty years, and in this article I’ll give tips for creating the best VR experiences possible, many of which were learned during the making of the initial prototype of one of Schell Games’ new VR titles: I Expect You To Die.

 

I Expect You To Die
 

I Expect You To Die is an escape-the-room VR puzzle game with a super-spy theme. So far, we have created a rough prototype puzzle (The Library) and a detailed vertical slice puzzle (The Car in the Plane). Our current version is designed for the Oculus Rift, and assumes the player is seated at a desk, using the mouse to interact with the world. We have put both of these levels up on Oculus Share, with the plan of making a complete game for sale once the Oculus Rift is launched. We may also want to release it on other platforms, but we aren’t sure yet, for reasons I’ll discuss later on.

 

Aladdin’s Magic Carpet VR Adventure, installed at DisneyQuest in 1998, is still operating today.

 

My VR History


Am I biased in favor of VR? Undoubtedly. I started working on it in earnest in 1992, and have been helping to develop VR games somewhat continuously since that time. It began with my graduate work at Carnegie Mellon University, which led to me joining the Disney VR Studio in 1995, where I helped develop Aladdin’s Magic Carpet VR Adventure and other VR experiences for DisneyQuest. In 2002, I returned to Carnegie Mellon, where I have been teaching the Building Virtual Worlds class at the Entertainment Technology Center for the last 13 years. During that time I have helped students create over 500 VR worlds. In parallel to my teaching, I run Schell Games, a large (100-developer) game studio in Pittsburgh. I think you can imagine that, now that affordable high-end VR devices will at last be available in the home, it is hard for me to resist making games for them!

 

Daniël Ernst’s “Blocked In” gave Schell Games a lot of inspiration.  

 

Overcoming Skepticism


When Oculus made its initial announcements about launching a consumer VR platform, I was quite eager to get a Schell Games team started working on it! Unfortunately, it wasn’t as easy to get started as I had hoped. Far from jumping on the VR bandwagon, many of my team members raised an eyebrow at my VR enthusiasm, stating all the objections I listed above, and more. I tried to urge them on, insisting that the new OLED displays and optical tracking systems allow for far more powerful experiences than in the past, but still, most were not convinced. Showing is always better than telling, so what ultimately started to bring people around was an opportunity to experience some of the best demos that were out there. None of them were great, but they were good enough to convince some of our team that maybe something was there, and there were several who, trying the demos circa July 2014, felt “we can certainly do better than this.” That year, for Jam Week (a yearly Schell Games ritual where all normal work stops so everyone can spend a week working on a passion project), Senior Engineer Jason Pratt pulled together the best of the best VR demos that he could find, and started on some experiments of his own. He developed a method of mouse interaction that we all felt was interesting and unique, and might work well for the Oculus Rift. At the same time, going through the set of demos Jason had downloaded, I couldn’t help but be impressed by Blocked In, an Oculus Rift demo that gave me a stronger sense of presence than I had ever felt in a VR world, even though, as a player, I could take no action but to look around. All of this generated a lot of discussion among our team.

 

Hexius: An early Schell Games prototype that had some problems.

 

Soon, another SG engineer (Matt DeLucas) started riffing on Jason’s work, creating Hexius, a world that used Jason’s unique VR mouse interface to choose where to teleport next. Not satisfied with mere teleportation, Matt turned it into flying, which promptly made us all motion sick. This led some to conclude that the medium was worthless. After all, what good is a videogame where you can’t move without throwing up? I countered that not all videogames require movement -- many successful games are about sitting still and defending territory, or solving puzzles. The response to that was that when players put on a VR headset, they want to feel powerful, like a superhero. How could it ever be interesting to be a superhero who was tied up and couldn’t move? But… wait! That happens all the time! Superheroes, from Batman to Wonder Woman to James Bond, are always getting tied up by villains, and having to escape through clever puzzle solving! We all became fascinated by this idea, and moments later, inspired by the famous phrase from Goldfinger, came up with the working title I Expect You To Die. We then pulled a team together, led by Mike Traficante, former director of Enemy Mind.

 

As we developed I Expect You To Die to its current state, we learned a number of lessons that I am pleased to share here.

 

Like VR, Psilocybe semilanceata mushrooms can provide both visions and nausea.

 

Lesson One: Motion Sickness Can Be Eliminated


When you think about it, the phenomenon of motion sickness is incredibly strange. A person is confronted with unusual motion (say, that of a boat, car, or roller coaster), or the appearance of motion (say, an IMAX movie, or VR), and their body responds by becoming gradually more nauseated and, possibly, ultimately vomiting. Why vomiting? Why not sneezing, or getting chills, or feeling tingly, or any other number of possible physiological responses? Why any response at all? The answer is known as the “toxicology hypothesis”. Certain poisons (from some mushrooms, for example) can disrupt the neurology of the brain such that the input from the little hairs in our inner ear (which detect acceleration and rotation) does not align with the input from our visual system. These poisons must have been a significant problem sometime in our evolutionary past, because wise old nature has programmed our brains to vomit when this happens, thus saving our lives. The problem is that poisons are not the only way to cause this disconnect -- reading in the car, riding the Tilt-a-Whirl, and engaging in certain VR experiences can do the same thing.

 

Can we disrupt this mechanism somehow? Certain drugs (Dramamine, for example) do just that, but as Spalding Gray once noted about using drugs to disable certain parts of the brain, “there is no such thing as precision bombing,” and these drugs tend to make players feel drowsy and disconnected. One day (around 2060 or so, by my guess), we’ll probably have some kind of nanotechnology that can safely calm our motion sickness circuit without side effects, but until then, we have to live with it. Exactly what triggers motion sickness differs vastly from person to person, but our team has been struck by the fact that despite testing with many different people, we are seeing virtually no motion sickness on I Expect You To Die. Here are our tips for creating VR with no “motion discomfort”:

 

  1. Keep the framerate up. Consider 60 fps your new absolute acceptable minimum. 90 fps or more should be your goal. Yes, I know this is hard. Yes, I know that PC platforms are variable. I don’t care. Your head and eyes can move quite fast, and when you are much below these high frame rates, your brain starts to sense something is wrong. Some people disagree, insisting that the brain can’t even detect the differences between such high frame rates -- that film has established that 24 fps is plenty. If you feel this way, try this experiment. Go out under a fluorescent streetlight one night, and toss a ball in the air. The streetlight is pulsing at 50-60 times a second. If you just look at the light, it seems to be continuously on. But if you look at the ball, you can clearly see the individual light pulses. But you don’t even have to go outside. Pick up your PC mouse and shake it back and forth, and watch your cursor. Your screen is likely updating at 60 fps -- and you can clearly see the discrete positions. The way the brain perceives motion is not simple or easy. In the new world of VR, frame rates below 60 fps are no longer acceptable.

  2. Avoid virtual camera movement. I know. You want to make a first person shooter, you want to make a racing game, you want to have a dogfight in space. All of these seem to require a virtual camera that whizzes all over the place while the real camera (the player’s eye) stays still. Well, guess what? Any time you create a disconnect between the eye and those little hairs in the inner ear, your player will become nauseated. So, yes, that means a lot of kinds of gameplay are off the table. Keep in mind, though, that for everything that VR takes away, it gives something new that couldn’t be done before. Being able to move your head and body through an environment, even for a short distance, is an incredible experience, as is manipulating virtual objects with your real hands. It requires some creativity to design within the bounds of the VR box, but if you are willing to do it, you can create powerful experiences that have just about zero motion sickness.

  3. If you must move the camera, don’t accelerate. Funny thing about those little hairs in your ears -- they can only detect acceleration, not velocity. They can’t tell the difference between zipping down the highway at 80 miles per hour, and sitting perfectly still. What they notice is speeding up and slowing down. When I coded up the locomotion system for Aladdin’s Magic Carpet VR Adventure at DisneyQuest, I took advantage of this fact by making the motion of the carpet be as linear as possible. Some amount of acceleration was necessary, though, and as a result, some motion sickness was inevitable. It was very limited, however, because the experience is on a five-minute timer, and most people can endure five minutes of mild virtual motion without much discomfort. For home play, however, five minutes is generally not an acceptable duration. Our initial I Expect You To Die prototype, for example, generally engages first-time players for twenty to thirty minutes or longer. Accordingly, the only virtual motion we have involves driving your car out of the plane, which only happens at the culmination of the experience, and is linear motion lasting only a few seconds.

  4. And whatever you do, keep the horizon level. Certain kinds of motion, virtual or real, are shortcuts to trigger your motion sickness alarm circuit. Rolling the camera in a “barrel roll” style, so that the horizon does cartwheels in front of the player’s eyes, is the quickest shortcut to puketown. So -- don’t do that. The canals in your inner ear that control all this are circular, and very good at detecting rotation, so generally, you should avoid virtual rotation of any kind. Part of what makes VR unique is that it lets players really turn around -- for real! Use real rotation to let players look around an environment, and avoid virtual rotation whenever possible. (A minimal code sketch of these comfort rules follows this list.)
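
To make tips 1, 3, and 4 concrete, here is a minimal, engine-agnostic sketch in plain Python (the function and constant names are mine, not from our codebase): it computes the per-frame time budget at 90 fps, moves the virtual camera at constant velocity only, and never applies roll, so the horizon stays level.

```python
import math

TARGET_FPS = 90
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~11.1 ms to simulate, render, and present each frame

def update_comfort_camera(position, yaw_deg, forward_speed, dt):
    """Advance a seated-VR camera one frame while avoiding motion sickness cues.

    - Constant velocity only: no easing or acceleration curves on virtual motion.
    - Yaw comes from the player's real head tracking, never from the game.
    - Roll is always zero, so the horizon stays level.
    """
    heading = math.radians(yaw_deg)
    new_position = (
        position[0] + math.sin(heading) * forward_speed * dt,
        position[1],                                          # no vertical bob
        position[2] + math.cos(heading) * forward_speed * dt,
    )
    roll_deg = 0.0  # never barrel-roll the virtual camera
    return new_position, roll_deg

if __name__ == "__main__":
    # Example: a three-second linear "drive out of the plane" at 90 fps.
    pos, dt = (0.0, 1.2, 0.0), 1.0 / TARGET_FPS
    for _ in range(3 * TARGET_FPS):
        pos, _roll = update_comfort_camera(pos, yaw_deg=0.0, forward_speed=8.0, dt=dt)
    print(f"frame budget: {FRAME_BUDGET_MS:.1f} ms, final position: {pos}")
```

The specific numbers are placeholders; the point is the shape of the motion -- no acceleration for the inner ear to notice, and no roll for the eyes to object to.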


People will give you all kinds of tips about “quick cures” for the motion sickness problem, such as adding a “virtual nose”, placing a prominent virtual vehicle around the player, or even asking your player to wear a pressure-point bracelet. All of these can diminish motion sickness, but none of them fully remove it, and how can your players enjoy your game if it literally makes them sick to their stomach? The reason VR motion can make players sick while the same motion on TV doesn’t is that VR can be a compelling enough illusion that your brain actually believes it is in another place. Don’t fight that; embrace it by creating experiences that build upon that powerful sense of presence.

 

Our first instinct is always to copy the media that came before.

 

Lesson Two: Design for the Medium


Throughout the history of entertainment, the first impulse of those who create in a new medium is to imitate what came before. Early movies were just filmed stage plays, with no cuts, close-ups, or camera motion. Early internet videos were attempts to imitate the format of television (30- and 60-minute shows). In all cases the pattern is the same -- the new medium is derided for not being as good as the old one, but then, gradually, after many experiments by pioneers, the unique strengths of each medium come to light, showing it to be powerful in a way that was not possible before. VR will naturally go through this same cycle -- so why not skip the tedious imitation, and jump right to the pioneering?

 

Things VR is Bad At (avoid these):
 

  • First person virtual motion

  • Screen-relative HUD interfaces

  • Variable frame-rate

 

Things VR is Great At (do these):
 

  • Making you feel like you are really in a place

  • Letting you touch and manipulate objects

  • Face-to-face confrontations

 

On I Expect You To Die, we quickly learned that making you feel like you were really in a place was very powerful, and as a result, we realized that the inconsistencies in our artwork were distracting. We felt that our new artwork would need to be detailed and consistent enough to create a strong sense of place.

 

Lesson Three: Immersion Is More Important Than Gameplay


The most powerful aspect of the VR experience is the phenomenon of immersion, sometimes called “presence.” It is the feeling of really being in a place that you intellectually know isn’t really there. When a simulation provides enough cues, something in your unconscious mind buys into the illusion, and starts treating it as if it is real. I’ve seen colleagues absent-mindedly try to lean on virtual tables, only to be startled to remember that they aren’t real. When you get really immersed in a VR world, there can be a shocking feeling when the headset is removed, and your mind and body try to shift from one reality to another. The sense of “spatial immersion” is something different from what we’ve seen in video games up until now. It is different from, but related to, the sense of “flow” that we get from being deeply engaged in a challenging task. Philosophers and psychologists have been debating the true nature of “presence” for centuries (the purpose of Zen meditation, for example, is to become truly “present”), so we shouldn’t be surprised if we don’t understand it entirely. And while we may not understand it, we must respect it, because it is the foundation of any successful VR experience.

 

But is it really more important than gameplay? It absolutely is. Players are glad to fiddle about and toy with the elements of a world in which they feel immersion, even if no gameplay is there. But if the immersion is broken or interrupted, the player becomes very aware that they are in a headset, and may even become annoyed by the whole experience, no matter how good the gameplay actually is.

 

But what builds immersion, exactly? It is difficult to pin it down. It seems to occur when the technology is able to simulate reality in a way that our subconscious mind finds satisfying and realistic. Looking around a world by moving your eyes, head, neck, and body is one example of this. Another very important aspect is reaching into the world and manipulating things with your hands. Touch screens have become the dominant interface on mobile platforms partly because touch is so primal. Two-year-olds who would have no hope of manipulating a D-pad have no trouble using an iPad. VR interfaces can take this one step further by letting you not just tap and swipe at virtual objects, but pick up virtual objects and hold them, pass them from hand to hand, or even throw them. The VR platforms that make the most use of hands will be the ones that are most successful, because there is something about fiddling with things with your own two hands directly, and not through a clunky gamepad, that is immensely immersive. This is why we focused on the mouse interface for I Expect You To Die: it provides a primitive way to “reach” into the world and manipulate objects in a much more immersive way than doing the same manipulations with a gamepad. We did experiment with using a gamepad for these interactions, but our brains could tell we were using an indirect tool, and the sense of immersion was interrupted.

 

Your main job, then, as a VR designer, is to avoid things that break immersion. This is not easy, because immersion is a fragile illusion, a magician’s trick in which one awkward interaction can spoil the whole effect. Some of the most common immersion breakers include:

 

In VR, a knife quickly becomes a screwdriver.

 

  1. Shallow object interactions. In a traditional adventure game, objects are often uni-taskers: screwdrivers are for unscrewing and nothing else. Knives are for cutting and nothing else. It is a sort of “key and lock” mentality. But when the phenomenon of immersion takes over, and your body thinks the virtual world is real, a great deal more detail is expected. One puzzle in I Expect You To Die involves unscrewing a panel from the car’s console. We placed screwdrivers within sight, which we fully expected players to use for this purpose. However, we found many players tried to unscrew the panel using a pocketknife they found in a glovebox. Since they would use a knife this way in the real world, it seemed perfectly natural to try using it as a screwdriver. Initially, this simply failed, and it was a real immersion breaker. Allowing players to unscrew the panel with a knife would require us to radically change our puzzle structure, so ultimately we added a dialog line from our narrator: “I’ve seen you do many creative things with a knife, but I don’t think turning screws will be among them.” This is kind of a cop out, but at least it acknowledges the player’s attempt, and asks them to move on to something else. Other interactions we handled better: shooting the champagne bottle with the gun shatters it, and the broken glass can be used to cut things. Lighting the money on fire with the lighter causes it to burn. Players love using objects on other objects -- if you can handle these interactions realistically, you will delight them. If you don’t, you’ll remind them this is “just a game”, and break their immersion. You are wiser to create a small game with rich object interactions than a big game with weak ones.
     

  2. Unrealistic audio. If I pick up a virtual coin and hold it in my hand, turning it over to look at front and back, I might be really immersed in what I’m doing if it looks realistic. But if I then drop it, and it makes no sound, I’ll be reminded that the world is fake, and my immersion is destroyed. If instead, it makes realistic “ping” and “ching” noises as it bounces around on the cobblestone street, my immersion will be maintained. Whatever amount of sound design and integration you normally do on a game, expect to double it for a VR experience, because so much more detailed sound design is required to make interactions with objects seem realistic.
     

  3. Proprioceptive disconnect. “Proprioception” is your sense of how your body is positioned; your awareness that you are sitting or standing, for example, or that one foot is crossed over the other. In a normal videogame, our proprioceptive sense is irrelevant to the game. In VR, it is a key part of our immersion. If you are seated while playing a VR game that involves your character walking about a room, your body perceives it as fake, and immersion is broken. In I Expect You To Die, we designed the game for seated players, and so we developed scenarios (sitting at a desk, sitting in a car) that involve being seated, which maintains immersion. Other types of proprioceptive disconnects involve objects penetrating the player’s body -- walking through a table, for example. Players do not like having their bodies penetrated by virtual objects. It feels disconcerting at first, as your mind and body struggle in subconscious fear, but soon it simply breaks immersion. The fastest way to bring about a proprioceptive disconnect is to give players virtual bodies that they can see. If the visual sense of your body differs from your proprioceptive sense (those fake hands or feet aren’t positioned where your real ones are), your mind quickly rejects the reality as fake. Much better to show no body (your brain doesn’t mind this much, for some reason) than a body slightly out of place. Consider this the “uncanny valley” of VR avatars.
     

  4. Highlighting interface limitations. Our interfaces for interacting with the world are necessarily limited. Some actions (picking up and dropping an item) can feel reasonably realistic with even a limited interface. Others (pulling a trigger on a virtual gun by clicking the mouse) are not exactly realistic, but easily accepted by the brain. But still others can be frustrating or impossible. We tried to make interfaces for using a screwdriver in our game that involved moving the mouse in a circular motion, hoping that this would “map” to the somewhat unique way we twist our hands when using a screwdriver. This was a brain-mapping fail. Players found it frustrating and unrealistic, and their immersion was shattered. In later versions, we changed the game so that simply holding the screwdriver up to the screw caused it to automatically unscrew and come out. This was just as unrealistic, but not frustrating at all, because the “unscrewing” motion is something most of us have trained our hands to do somewhat automatically -- that is, we don’t have to think in much detail about how we move our hands when we do it, and as a result, immersion is maintained. (A small code sketch of this proximity-triggered approach, and of tool-on-object interactions in general, follows this list.)
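
A minimal sketch of both of these patterns, in plain Python (hypothetical names and values, not our actual code): a (tool, target) interaction table that at least acknowledges combinations we can't support, and a proximity check that starts the unscrewing automatically when the screwdriver is held near the screw.

```python
from dataclasses import dataclass

@dataclass
class Item:
    name: str

def unscrew(target, tool):
    print(f"The {tool.name} removes a screw from the {target.name}.")

def narrator_deflect(target, tool):
    # Acknowledge the attempt instead of silently failing.
    print(f"Narrator: I've seen you do many creative things with a {tool.name}, "
          "but I don't think turning screws will be among them.")

# (Tool, target) pairs mapped to handlers, so objects can act on other objects.
INTERACTIONS = {
    ("screwdriver", "console panel"): unscrew,
    ("pocketknife", "console panel"): narrator_deflect,
}

def use_on(tool, target):
    handler = INTERACTIONS.get((tool.name, target.name))
    if handler:
        handler(target, tool)
    else:
        print(f"Nothing happens... ({tool.name} on {target.name} is unhandled)")

# Proximity trigger: holding the screwdriver close enough simply starts the
# unscrewing, rather than demanding a gestural circular mouse motion.
UNSCREW_RADIUS = 0.05  # metres -- an assumed "close enough" distance

def near(a_pos, b_pos, radius=UNSCREW_RADIUS):
    return sum((a - b) ** 2 for a, b in zip(a_pos, b_pos)) <= radius ** 2

if __name__ == "__main__":
    panel = Item("console panel")
    use_on(Item("pocketknife"), panel)          # acknowledged, immersion preserved
    if near((0.01, 0.02, 0.0), (0.0, 0.0, 0.0)):
        use_on(Item("screwdriver"), panel)      # auto-unscrew on proximity
```

The same table approach extends naturally to impact sounds and other object-on-object details: every unhandled pair is a potential immersion break, so the table tells you exactly where the gaps are.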


It bears repeating: As a VR designer, maintaining immersion is your top priority. Anything you can do to keep it is worth doing. One tricky method we used in I Expect You To Die was comedy. By creating a world that is a caricature of a serious spy thriller, and having a narrator who makes slightly sarcastic remarks, we send the message that this world doesn’t take itself too seriously. As a result, we found that when players run up against flaws and imperfections in our game, they are more willing to accept them and roll with them in our comic world than they would in a world that took itself very seriously.

 

People new to VR tend not to look around.

 

Lesson Four: Looking Around Takes Getting Used To


As noted as far back as 1996 in Randy Pausch’s study of Disney’s Aladdin Magic Carpet VR Adventure, players new to VR are hesitant to turn their heads and look around. This seems to be because most of us have a lifetime of training that screen-based media is best enjoyed by sitting still and facing forward. On top of that, creating media that requires players to look around their environment is a bit of a subtle art form that many VR designers do not get right. Further, many early VR experiences with low (<60 fps) frame rates trained players to keep their heads still if they wanted to avoid motion sickness.

 

Lock and Look. Modern VR systems are something else altogether. Their combination of high frame rates (>60 fps), excellent 6DOF tracking systems, and OLED displays with a wide field of view makes for a “looking around” experience that was not possible in the past. Something to note about your vision system: it loves to “lock onto” things. In the real world, we visually explore this way -- our eyes jump from object to object. When we “lock on” to an object at the edge of our vision, our neck then gradually turns to let us look at it more easily. I call this mode of visual exploration “lock and look” because of the way you first lock onto an object, and then turn your head to look at it. With older VR systems, “lock and look” tended not to happen due to narrow field of view, slow tracking, heavy headsets, and edge distortion on lenses. With a clunky system like that, your eyes do not scan the environment as they naturally would; instead, players scan the environment by moving their neck and head first, and locking onto objects afterward. This is a kind of looking around, but not one that feels natural enough to give a strong feeling of immersion, any more than looking around in a PC game by moving your mouse does. I believe one of the biggest changes in modern VR systems is support for the human visual system’s “lock and look” ability, which does a great deal to make the experience feel natural and give a strong sense of immersion.

 

Putting useful items behind the player gives them reasons to look around.

 

Give reasons to look around. Giving the players reasons to look around is critical. New players will not be inclined to do so at first. Gradually, though, you can lead them to it. I Expect You To Die does this by putting players into a place where they don’t expect to move around freely. If your players can navigate forward, they want to look forward. If you hold them in place, they eventually start to look around. One level of I Expect You To Die takes place in a parked car. The player is sitting in the driver’s seat. Initially, they examine what is in front of them -- the steering wheel, the gas and brake pedals. When these don’t do anything, they gradually start to explore the car: the gearshift, the glove box, the passenger seat, and eventually they wonder what is in the back seat. Most players, once they start to explore the car, are startled at how real the world seems. This appears to be the power of VR immersion kicking in. If, instead, this were a game about driving a car, I am doubtful that the immersion would arrive so quickly (though motion sickness likely would).

 

Oooo… what’s in here?

 

Looking into things is cool. Not only do you need to give players reasons to look around (and interesting things to look at and interact with), but you would be wise to give players an opportunity to look into things. Designers, wanting to be helpful, have a tendency to make everything easily visible from the player’s starting point. But keep in mind the power of positional tracking. In our game, we gave players lots of reasons not just to turn, but to move their heads. Multiple glove boxes, items in the back seat, fine print, physics that allows objects to fall onto the floor of the car, and even a retinal scanning system that demands you lean forward (and then tries to kill you with a laser if you don’t move your head out of the way) were all different ways we encouraged players to move their heads around in the environment.

 

Locking objects in space near your head is surprisingly engaging.

 

Objects near your face are very immersive. We figured this one out really late. None of us were very used to the high-quality positional VR tracking that is now possible, and we were quite surprised to see how compelling it is to examine an object in VR that is very close to your face. Somewhat late in development, we added a new feature that lets players freeze an object in space, for easy manipulation and viewing. This is a nice convenience for players (who often have multiple objects to manipulate and manage), but also gives a strong sense of immersion (even though it is somewhat unrealistic). Curiously, we found that if the object was completely frozen in place, it seemed unnatural -- people would ask “Did I break the game?” when an object froze in place. To prevent this, we added a very slight bobbing motion to the frozen objects. Now they somehow felt more natural, and players liked this chance to examine them up close. In future versions of the game we plan to make this feature more central to puzzle solving, and not just the convenience / window dressing that it is now.
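
A minimal sketch of that "freeze with a slight bob" behavior, in plain Python (the amplitude and frequency below are assumptions; ours were tuned by feel):

```python
import math

BOB_AMPLITUDE = 0.004   # metres -- just enough that the object reads as "alive"
BOB_FREQUENCY = 0.5     # hertz -- a slow, gentle cycle

def frozen_object_position(anchor, time_s):
    """Position of an object the player has frozen in mid-air near their face.

    The object stays where it was released, plus a barely perceptible vertical
    bob so players don't wonder whether the game has hung.
    """
    bob = BOB_AMPLITUDE * math.sin(2.0 * math.pi * BOB_FREQUENCY * time_s)
    return (anchor[0], anchor[1] + bob, anchor[2])

if __name__ == "__main__":
    anchor = (0.0, 1.4, 0.4)   # roughly 40 cm in front of a seated player's face
    for frame in range(91):    # sample one second at 90 fps
        pos = frozen_object_position(anchor, frame / 90.0)
    print(pos)
```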

 


This device would offer a very different experience than a game controller.

 

Lesson Five: Different VR Platforms Enable Very Different Experiences


As we enter the new world of VR, questions arise about exactly how it will fit into our lives. Where in our homes will we use it? Is it best in the living room, like game consoles, or in a more private, work-oriented space, as PCs are? Or will it be more mobile, used while sitting on a bed or in a cozy corner, as we do with the Nintendo DS or the iPad? The answer seems to be that different VR platforms will be used in different ways. More mobile, casual systems like the Gear VR will likely be used in “reading nooks” where the DS and the iPad have been used. Sony’s Morpheus system will naturally be used where the PS4 is (often the living room), and the Oculus Rift will likely be used in the same place that PC gaming has taken place. HTC’s Vive system, which practically demands you get up out of your chair and move about, is more of a puzzle. Will it be most at home in the living room, because there is space there? Will players set it up in an empty garage? Will we see a return to Murphy beds that fold into the wall so players can easily convert their bedrooms into virtual playgrounds?

 

The answers aren’t clear yet. But what is clear is this: The VR inputs that you use do a great deal to define what experiences will work best on the platform. Systems that simply use a gamepad are very different from ones that use a mouse, which are very different from ones that use motion-tracked hand controllers. Systems that require sitting will have very different experiences than ones that give you the freedom to walk around a virtual space. While I Expect You To Die was designed as a seated experience since that is what Oculus seems to favor, our team is creating a completely different standing experience for HTC’s Vive system: a puzzle game that lets you walk around a virtual space, building structures with your hands.

 

It will be the rare VR experience that can easily be ported to the variety of different VR input systems. As a result, if you want to create a great VR experience, you should first choose the input system you plan to use, and design your game around that. Yes, this will mean it may not be portable to the other platforms, but it will be GREAT on the platform you have chosen. At this point in time, there are five types of inputs in common use:

 

 

  1. Non-tracked handheld game controller. Just a plain old gamepad held in your hands while you look around a VR environment. This does little to help with immersion, as it doesn’t do much to bring your hands into the environment. If you must use this as an input, be creative with it -- don’t just use it for running and shooting (that will lead to motion sickness anyway) -- instead, do something creative that no one has seen before.
     


 

  2. Headset-mounted controller. Google Cardboard has a single washer to slide, and the Gear VR has a tappable touch pad on the side of the headset. These controls have the merit of being fairly simple, but again, they do little to really build immersion. If you use them, focus on their simplicity -- don’t try to make them do more than they easily can.
     


 

 

  3. Mouse. I don’t say “mouse and keyboard” because using the keyboard while wearing a headset is mostly infeasible. The mouse, though, is not a bad VR control system. It lets you, in a primitive way, “reach into” the VR world and pick things up by clicking on them. This is the central interface premise that I Expect You To Die was designed around, with the added features of being able to move objects nearer and farther away with the mouse wheel, and being able to activate objects (shoot guns, light matches, etc.) by right-clicking. We initially chose this since anyone with PC-based VR was pretty much guaranteed to have a mouse available. Curiously, we found that controlling the mouse cursor by a mix of turning your head and moving the mouse is surprisingly intuitive. (A sketch of this blended head-plus-mouse cursor appears after this list.)
     


 

  4. Hand tracker. Many people are experimenting with Leap Motion as a system to track empty hands in real time. Seeing a representation of your own hands in a VR world is powerfully immersive. It sounds like a VR dream come true, but it has its downsides. Tracking of hands via camera is quite imperfect at this point in time, causing popping, jumping, and flipping of virtual hands and fingers -- unquestionably an immersion breaker. There is also no clear “fire button” -- no clear way to give a solid binary input, which makes for a design challenge similar to the one the Kinect has. Another immersion breaker is seeing a set of hands that is close to, but definitely unlike, your own -- for example, a female player seeing a set of hairy man hands. Perhaps in a couple of years we will have solid camera-based hand tracking, but at the moment, hand tracking is too experimental to sustain excellent immersion.

     

 

  5. Tracked hand controller. Sony, HTC, and Oculus have all reached the same conclusion: By holding a wireless controller in your hand (something like a Wiimote), it is possible to have near-perfect 6DOF tracking of each hand, as well as clear inputs for button pushing. The tactile feedback that a real button provides when you click it down is not to be underestimated. At the moment, it seems that this mode of input is likely to be the most flexible, popular, and immersion-sustaining type of input.
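
The core of the mouse approach described in item 3 above can be sketched as blending real head tracking with a mouse-driven offset into a single pointing ray, which the game then uses for picking. A rough illustration in plain Python (the names, sensitivity, and clamp values are assumptions, not our actual implementation):

```python
import math

MOUSE_SENSITIVITY = 0.05   # degrees of cursor offset per mouse count
MAX_OFFSET_DEG = 40.0      # keep the cursor within a comfortable field of view

def cursor_direction(head_yaw_deg, head_pitch_deg, mouse_dx, mouse_dy, offset):
    """Blend head tracking and mouse input into one world-space pointing direction.

    The cursor's base direction is wherever the player is looking; the mouse
    nudges it within the field of view, so turning your head and moving the
    mouse both steer the same cursor.
    """
    ox = max(-MAX_OFFSET_DEG, min(MAX_OFFSET_DEG, offset[0] + mouse_dx * MOUSE_SENSITIVITY))
    oy = max(-MAX_OFFSET_DEG, min(MAX_OFFSET_DEG, offset[1] + mouse_dy * MOUSE_SENSITIVITY))
    yaw = math.radians(head_yaw_deg + ox)
    pitch = math.radians(head_pitch_deg + oy)
    direction = (
        math.cos(pitch) * math.sin(yaw),   # x
        math.sin(pitch),                   # y (up)
        math.cos(pitch) * math.cos(yaw),   # z (forward)
    )
    return direction, (ox, oy)             # ray for picking, plus the new offset

if __name__ == "__main__":
    offset = (0.0, 0.0)
    # The player turns their head 30 degrees right and nudges the mouse slightly.
    ray, offset = cursor_direction(30.0, 0.0, mouse_dx=40, mouse_dy=-10, offset=offset)
    print(ray, offset)
```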
     

As you can see, each type of input is quite different from the others. They each have strengths and weaknesses. As the ecosystem of VR platforms quickly diversifies, there will be even more choices. Presently, VR is a very isolating medium. Systems designed to support multiple players in the same room will make for a very different experience than solo VR. Different still will be systems designed to support multiple players networked from different locations. In the long run, VR may be the most social medium of all, as it will let remote players make eye contact with each other, and see real-time hand gestures while they engage in real-time voice chat. With all the uncertainty about which VR platforms are likely to succeed, it is a tough choice to settle on a single interface -- but the most immersive worlds will be the ones with interfaces custom-designed for a single platform.

 

Lesson Six: Iterative Process Is Crucial


Most developers understand that when you undertake a game project with a lot of unknowns, you have to leave a lot of time in the schedule to deal with discoveries and problems. VR projects have a lot of unknowns, and necessarily require time for experimentation and adaptation. VR is a kind of brain hacking, and in truth, we don’t really understand the brain that well. There are some cases where the brain fills in gaps in a wonderful way, creating and sustaining powerful immersion. The “blind spots” in our vision that the brain fills in with color are a great example of this. So is the way OLED displays are blacked out the majority of the time -- the brain fills in the blackness with persistent imagery. Other times, though, the brain fails to fill in gaps you would think it could take care of. Virtual avatars are a great example of this. Novice VR developers often assume that a crude VR avatar will be okay -- the brain will tolerate the imperfect alignment with the body -- but it doesn’t, creating a powerful immersion breaker. The problem is that it is very difficult, ahead of time, to correctly predict what the brain can make up for, and what it can’t. The only way to know what it will and will not tolerate is to build it and try it, and that takes time and experimentation. Our experience has been that developing a VR scene seems to take about twice as long as building a traditional game scene.

 

Wise words from a wise man.

 

Jason Vandenberghe once recommended the “4F” method of game design: Fail Fast, and Follow the Fun. This is more important than ever in VR, because you will have a lot of failures, and the fun will come in unexpected ways. In our early experiments with I Expect You To Die, we were surprised at how much fun people were having simply making stacks out of books. Stacking books has nothing to do with our game -- we just had a lot of books in the scene as props, but somehow our interface made it fun, so we made sure to keep it in the game. Since immersion is more important than gameplay, looking for the toy-like interactions that keep people immersed is absolutely crucial.

 

 

Pioneers Only


VR is not yet a mass-market platform, but it soon will be. It behooves all of us to remember the lesson of the hype curve: when new technologies are announced, but not released, the world assumes they will be life-changing. Once they are released, the world immediately hates them and derides them for their flaws. After a little while, though, the world calms down, and acknowledges that the technology has its place -- it is good at some things, and bad at others. Right now, VR is on the cusp of launching, and naturally some people are shouting that VR will take over the entire game industry, while others are saying that it will fall on its face. To me, it seems clear that starting in 2016, VR will take its own unique place in the ecosystem, and be very good at some things -- I’m just not completely sure exactly what those are yet. And that means that VR development won’t be easy. But if you want easy, don’t do VR. Just keep making minor improvements to the same games you’ve been playing for the last twenty years. But if you want to decide the course of what may be the most powerful medium of the 21st century, VR is definitely the place to be. I’ll see you there!
