Valve shares advice on designing great VR game interactions

At GDC Europe today, Valve's Yasser Malaika shared some key lessons about making great VR game interactions, learned both by Valve's own devs and by external dev partners like Owlchemy Labs.

Alex Wawro, Contributor

August 4, 2015


Figuring out how to design the ways players interact with your VR game is tricky business, especially now, when many high-profile VR headsets haven’t even hit the market. But Valve’s Yasser Malaika urged developers at GDC Europe today to see VR interaction design not just as a challenge, but also as an opportunity.

“Interaction is turning out to be an essential part of VR, as a medium,” says Malaika. “It is profoundly satisfying for users to interact with [VR] content,” and developers have a rare opportunity to freely experiment with how players reach out and touch their games.

I say touch because Malaika spoke with the assumption that most VR developers will be making experiences for players who have access to gesture controls, like those afforded by the HTC Vive’s handheld controllers.

“We favor hand controls because we feel like that’s a piece of the puzzle that wasn’t easily accessible,” says Malaika. “We feel it’s important for every customer to have that out of the box, so developers feel confident that they can explore VR-specific, novel experiences.”

But of course, there’s as yet no standardized VR input scheme: right now, most players interact with VR games via head movements, a gamepad, or handheld motion controllers. Malaika suggests that standardizing VR input technology, even at this early stage, would be a good thing, and highlights several upsides: it’s easy for players to move from VR game to VR game with a minimal learning curve, and it’s easier for devs to target multiple platforms efficiently.

“We feel it’s very important, in these early days, for developers to develop for multiple platforms,” says Malaika. “We think that mitigates some of the risks, and we hope it will help the market for VR develop in a healthy way.”

Malaika went on to share some of the lessons Valve has learned about designing VR interactions, both from its own internal experiments and its partnerships with other VR developers. 

Don't cross the streams

His first piece of advice for VR developers is to break down what they know about game design and “break out” traditional input conventions to figure out where they would most intuitively work in VR. The right stick on a gamepad typically controls camera movement, for example, effectively standing in for the player’s head movement. When you’re making a VR game that supports gamepads, it makes more sense to let the player control the camera with head movement and bind the right stick to some other function.

“Don’t cross the streams,” says Malaika. In other words, don’t confuse your player by using unintuitive input methods. For example, “Mapping pointing or interaction with your head can be less than satisfactory…and doing so detracts from its natural task of governing what the user is looking at.”
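As a rough illustration of that remapping, here is a minimal sketch of a per-frame update in which head tracking alone drives the view and the right stick is rebound to locomotion. The types, helpers, and tuning values are hypothetical and not Valve's code.

```cpp
// Minimal sketch: hypothetical per-frame input routing in a VR title that
// also supports a gamepad. The HMD's tracked pose always owns the camera;
// the right stick, freed from camera duty, is rebound to smooth locomotion.
#include <cmath>

struct Vec3 { float x, y, z; };

struct HmdPose {                 // filled in each frame by the tracking system
    float yawRadians;            // head yaw, reused here as the movement frame
};

struct GamepadState {            // right-stick axes, each in the range -1..1
    float rightStickX;
    float rightStickY;
};

struct PlayerRig { Vec3 origin; };        // tracking-space origin we move around
struct Camera    { float yawRadians; Vec3 position; };

void UpdatePlayer(Camera& cam, PlayerRig& rig, const HmdPose& hmd,
                  const GamepadState& pad, float dt)
{
    // 1) View rotation is taken verbatim from head tracking; the right stick
    //    never touches it ("don't cross the streams").
    cam.yawRadians = hmd.yawRadians;

    // 2) The right stick instead drives movement relative to where the
    //    player is looking.
    const float speed = 2.0f;    // metres per second, an arbitrary tuning value
    const Vec3 fwd   { std::sin(hmd.yawRadians), 0.0f,  std::cos(hmd.yawRadians) };
    const Vec3 right { std::cos(hmd.yawRadians), 0.0f, -std::sin(hmd.yawRadians) };

    rig.origin.x += (fwd.x * pad.rightStickY + right.x * pad.rightStickX) * speed * dt;
    rig.origin.z += (fwd.z * pad.rightStickY + right.z * pad.rightStickX) * speed * dt;

    cam.position = rig.origin;   // the head renders from wherever the rig is
}
```

The point is simply that nothing bound to the stick ever fights the head-tracked view.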

Except sometimes, crossing the streams can actually work out okay. Malaika shared the example of a demo which the Valve engineers initially built to allow players to move through a virtual world by moving their head; when the engineers got access to motion controllers, they modified the demo to let the player move by waving their hand in a direction, and it turned out to be surprisingly enjoyable.

“It was sort of strange the first time, but it was one of those things you kind of quickly get used to, and then immediately miss when you don’t have,” says Malaika. 

Go for abstract, not hyper-realistic

Another lesson learned was that when it comes to modeling player avatars in VR, abstract trumps the real. Malaika says Valve has found that players tend to feel less immersed in games that try to model hands realistically, and more immersed in games with cartoony hands — like in Owlchemy Labs’ Job Simulator 2050, for example, which Valve has been using to demo its Vive headset.

“There’s a really kind of interesting matrix of representation of visuals and of interaction,” says Malaika. “It really highlights how, in VR, there’s no substitute for observing and play testing.”

When Valve was creating its own Aperture Science VR demo, Malaika says the company ran into difficulties after pouring a bunch of energy into rendering the environment as realistically as possible.

“We put a lot of time into trying to enhance the physical realism…we had high-res textures, normal maps, parallax-sensitive environment maps…all of those things really helped to make the surfaces feel tangible,” says Malaika. Problem was, all that extra detail outshone the tiny clues the developers had seeded in the demo to show which objects were interactable and which were static — small green and red LED lights on lab drawers, for example — and wound up disappointing players by making them think they could interact with things they couldn’t. 

Make sure everything a player can do is meaningful

Malaika also recommends that VR developers avoid the temptation to model meaningless gestures in great detail. For example, you could require players to reach out, grasp a door handle, and turn it a full 90 degrees to open a door. “Something like that, although novel the first time, can quickly become fatiguing,” says Malaika. His experience suggests it’s probably best to save that level of intricate detail for critical tasks. “I like to ask myself the question, ‘Is this meaningful?’”
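To make that “Is this meaningful?” check concrete, here is a hypothetical sketch; none of the names or thresholds come from the talk. It reserves the full 90-degree handle turn for doors that matter and lets ordinary doors open from a light grab-and-pull.

```cpp
// Hypothetical sketch: only doors flagged as meaningful demand the detailed
// handle gesture; everything else opens from a simple grab-and-pull.
struct HandState {
    bool  isGripping;            // controller grip button held on the handle
    float handleTwistDegrees;    // how far the hand has rotated the handle
    float pullDistanceMetres;    // how far the hand has pulled since gripping
};

struct Door {
    bool requiresMeaningfulGesture;  // e.g. a puzzle door, not a hallway door
    bool isOpen;
};

void UpdateDoor(Door& door, const HandState& hand)
{
    if (door.isOpen || !hand.isGripping)
        return;

    if (door.requiresMeaningfulGesture) {
        // Reserve the intricate, realistic gesture for moments that matter.
        door.isOpen = (hand.handleTwistDegrees >= 90.0f);
    } else {
        // Everywhere else a light pull is enough: less fatigue, same intent.
        door.isOpen = (hand.pullDistanceMetres >= 0.05f);
    }
}
```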

The same goes for the in-game abilities you afford your player: input in VR is much more limited than in traditional console or PC games, so you have to think very carefully about whether each input is meaningful.

“In VR, you don’t have a keyboard full of hotkeys,” says Malaika. “The buttons on a controller are much more limited, so you have to think about how to provide the same number of choices…and manage the number of choices a user has.”

Quick note: If you happen to be developing a game for the Vive or another VR system with haptic feedback, Malaika recommends experimenting with gentle haptics. Pulses can trigger a player’s sense of touch, he says, and more deeply immerse them in your game by linking a virtual stimulus with their real body.
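For what a gentle pulse might look like in code, here is a minimal sketch using the OpenVR API that ships with SteamVR. The IVRSystem::TriggerHapticPulse call is the classic one from that era (newer SteamVR SDKs route haptics through the input system instead), but the timing values below are illustrative guesses rather than Valve’s recommendations.

```cpp
// Minimal sketch: a soft haptic "tick" on the right-hand controller via the
// classic OpenVR call. Individual pulses are very short; repeat them over a
// few frames to build a gentle buzz rather than a jolt.
#include <openvr.h>

void PulseRightHand(vr::IVRSystem* system, unsigned short durationMicroSec)
{
    if (!system)
        return;

    vr::TrackedDeviceIndex_t device =
        system->GetTrackedDeviceIndexForControllerRole(
            vr::TrackedControllerRole_RightHand);

    if (device != vr::k_unTrackedDeviceIndexInvalid)
        system->TriggerHapticPulse(device, 0 /* axis */, durationMicroSec);
}

// Usage (after vr::VR_Init has returned a valid IVRSystem*), e.g. when the
// player's hand first brushes an interactable object:
//   PulseRightHand(system, 500);   // roughly half a millisecond, barely felt
```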

Finally, Malaika reminds fellow VR game developers to try to keep a player’s physical realities in mind when crafting virtual experiences. “Bring interaction to the user, not vice versa,” says Malaika; VR is heavily rooted in physical space, so you have to consider things like a player’s height, their mobility (or lack thereof), and the room they’ll be playing in when designing your next VR game.
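One hypothetical way to “bring interaction to the user” is to derive comfortable placement from the player’s measured head height rather than hard-coding world heights. The helper and ratios below are illustrative guesses, not figures from the talk.

```cpp
// Hypothetical sketch: size the interactive layout to the person in the headset.
struct Placement {
    float surfaceHeightMetres;   // where tables, shelves, and panels should sit
    float maxReachMetres;        // keep required interactions inside this radius
};

Placement PlacementForPlayer(float measuredHeadHeightMetres)
{
    Placement p;
    p.surfaceHeightMetres = measuredHeadHeightMetres * 0.55f;  // roughly waist to elbow
    p.maxReachMetres      = measuredHeadHeightMetres * 0.45f;  // comfortable arm reach
    return p;
}
```

A quick calibration step at the start of a session could supply the measurement; everything else can then be placed relative to it.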
