
Lessons on VR motion and interactivity from leading VR devs

A panel of developers behind notable VR games, including Job Simulator and Star Trek: Bridge Crew, shared lessons about interactivity and motion in VR at the Virtual Reality Developers Conference.

Kris Graft, Contributor

November 2, 2016


A panel of developers behind notable VR games, including Job Simulator and Star Trek: Bridge Crew, shared lessons about interactivity and motion in VR at the Virtual Reality Developers Conference this morning in San Francisco.

On interacting with objects

“The worst thing you can do is reach out to something and it does nothing,” said Owlchemy’s Cy Wise.

Owlchemy’s VR game Job Simulator is widely recognized in the development community as one of the forerunners of interactivity in a virtual space. Job Simulator is a playground of objects that players can grab, throw and prod, with design informed by countless playtesting sessions.

With Job Simulator, interaction with objects is mainly focused on the act of grabbing. This mechanic is taught early in the game through the game’s design, without using an explicit tutorial.

Wise said, “What’s really important really early on is to give a really simple interaction to give people an idea of how they’re going to navigate this world.”
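Job Simulator’s own code isn’t public, so the sketch below is only a hypothetical illustration of the kind of per-hand grab logic the panel describes: while the grip is held, attach the nearest reachable object to the hand, and on release hand back the controller’s velocity so the object can be thrown. All type and function names here are illustrative, not Owlchemy’s.

```cpp
// Hypothetical sketch of a grab-and-throw interaction, not Owlchemy's code.
// A real implementation would read hand poses and grip state from the VR
// runtime (e.g. OpenVR) each frame and feed them into something like this.
#include <cmath>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

struct Grabbable {
    Vec3 position;
    Vec3 velocity;
    bool held = false;
};

struct Hand {
    Vec3 position;
    Vec3 velocity;               // tracked controller velocity, used for throwing
    bool gripPressed = false;
    Grabbable* holding = nullptr;
};

static float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Called once per frame per hand. grabRadius is the "close enough" distance.
void UpdateGrab(Hand& hand, std::vector<Grabbable>& objects, float grabRadius) {
    if (hand.gripPressed && hand.holding == nullptr) {
        // Grab the nearest free object within reach.
        Grabbable* nearest = nullptr;
        float best = grabRadius;
        for (auto& obj : objects) {
            float d = Distance(hand.position, obj.position);
            if (!obj.held && d < best) { best = d; nearest = &obj; }
        }
        if (nearest) { nearest->held = true; hand.holding = nearest; }
    } else if (!hand.gripPressed && hand.holding != nullptr) {
        // Release: inherit the hand's velocity so the object can be thrown.
        hand.holding->velocity = hand.velocity;
        hand.holding->held = false;
        hand.holding = nullptr;
    }
    if (hand.holding) {
        hand.holding->position = hand.position;  // follow the hand while held
    }
}
```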

Marc Tattersall, project director on Schell Games’ I Expect You to Die, added, “When it comes to interacting with objects…we tried to keep it to interactions people were familiar with.” His approach is to stay away from complex, unique controls.

Wise echoed the idea that the focus of interactivity ought to be on intuitiveness.

“I would agree the simpler actions, the better,” said David Votypka with Ubisoft-owned Red Storm, which is working on Star Trek: Bridge Crew.

But he explained that “simple” has to be the right kind of “simple.” For example, to raise shields in Bridge Crew, players originally just pressed a button on the virtual console. For whatever reason, many playtesters would double-press that button, raising the shields and then inadvertently lowering them again. So the team changed the control to a lever players grab and slide.

John Gibson with Tripwire Interactive, whose VR game Killing Floor: Incursion focuses on a lot of gunplay and shooting interaction, said players’ interactions with their own inventory proved to be a challenge. For example, placing a gun and a flashlight too close to one another on their on-body inventory led to a lot of players drawing a flashlight on a monster instead of a gun.

On haptics

Haptics are important to the future of VR, but current hardware is limited in its haptic capabilities. Developers do still find important uses for controller vibration. “We use haptics, unsurprisingly, for jokes,” said Wise. For example, in Job Simulator, if you stick your hand in a blender, you get a buzzy vibration from the controller.
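The panel didn’t get into implementation details, but for context: on SteamVR hardware, a one-off buzz like the blender gag is typically driven through OpenVR’s haptic pulse call. The sketch below is an assumption-laden illustration, not Owlchemy’s code; it presumes an already-initialized vr::IVRSystem and leaves the “is the hand in the blender?” overlap test to the caller.

```cpp
// Minimal sketch of a comedic haptic buzz, assuming OpenVR (SteamVR) and an
// already-initialized runtime. Whether the hand is inside the blender is
// determined by whatever overlap test the game uses and passed in as a bool.
#include <openvr.h>

// Call once per frame for the hand's controller.
void BuzzIfBlended(vr::IVRSystem* vrSystem,
                   vr::TrackedDeviceIndex_t controller,
                   bool handInBlender) {
    if (!vrSystem || !handInBlender)
        return;
    // Axis 0 is the haptic axis on the Vive wand; a single pulse is capped at a
    // few milliseconds, so calling this every frame produces a sustained buzz.
    vrSystem->TriggerHapticPulse(controller, 0, 3000 /* microseconds */);
}
```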

Tattersall said he uses haptics in a low-key way that serves the gameplay. “Outside of alerts, I don’t think [players] should notice the haptics,” he said.

“Be frugal with [haptics],” added Wise.

On maintaining and breaking presence in VR

Wise stressed that hand representation is crucial to maintaining presence in VR. “Hands are your identity…it’s very important to get those right,” she said.

Owlchemy experimented with realistic-looking hands in VR, but Wise said they were “gross and veiny,” and more importantly, immersion-breaking. That’s one reason why Job Simulator features big white gloves. “When we tried not to be anyone’s hands, we could be everyone’s hands,” she said.

The challenge of affordance (the possibilities for action an object appears to offer) is also an ongoing issue in VR interactivity and presence. When players try to manipulate something that looks interactive but isn’t designed to be, it’s one of the biggest immersion-breakers in VR.

On session time in VR

Session time in VR games is a significant issue in the field: What’s the most comfortable amount of time for players to stay in VR?

The answer is: it depends on how you design your game.

“I think one of the things you have the potential of [doing is] you want people to physically do everything,” said Tripwire’s Gibson. “And you have to balance that out [with more relaxed physicality].” Having players constantly bend down, for example, can be tiring, he said.

In Bridge Crew and I Expect You to Die, there are points where you can take breaks between action, explained Votypka and Tattersall. Votypka said players can comfortably stay in VR for long periods of time, as long as the game is designed to keep them physically comfortable.

For Job Simulator, there’s no time limit for anything. Players can “self-modulate” their session times, Wise said.

Tattersall explained that players often lose track of time in VR, which is worth keeping in mind. In his experience, when he asks players after a session how long they thought they were in VR, “it’s usually twice as long as they think they have.”

Wise said she’s seen the phenomenon too at Owlchemy. “We call it time dilation,” she said, attributing it to players hitting the cognitive flow state in VR.
