What are the biggest problems facing virtual reality developers? Oculus co-founders Palmer Luckey and Nate Mitchell took the stage at GDC Europe to outline some of the biggest stumbling blocks.
Luckey set the stage by talking up what VR could mean to games. "What's great about virtual reality is that it enhances a lot of things we love about games. It enhances immersion and removes abstraction... what gets between us and the world," he said.

Though so far a lot of what's been announced for Oculus Rift has been ports of existing games, particularly FPSes, Luckey (pictured) thinks that that is "not because FPS are the only thing that can work in VR, or the best thing" -- it's that "their interaction paradigms already share a lot in common." In fact, Luckey thinks that -- as with touchscreen and motion controllers -- the best games will be designed from the ground up for VR. "The best ones are going to be the ones that are designed with the strengths and limitations of VR in mind," he said. "To make really good VR games, there is going to have to be some level of making no compromises -- if you want to have the best experiences."

One thing he advised people to stay away from was making "cinematic games" as popularized on today's consoles. "It isn't something that you necessarily want to target with virtual reality." Why? "You want to maintain the illusion that someone is in the world at all times," Luckey said. "Pre-cut sequences [don't work], because that's going to remind the user that they're not really inside a virtual world."

His last remarks were that "this is day zero" of virtual reality game development, and there's a long way to go. "Developers are... just starting to touch on the things you can do. All of the things you can do in VR... to be honest, we don't know all of them... It's going to take a lot of people a long time to figure out all the things you can do in VR."
Nate Mitchell, Oculus' VP of product, took the stage to introduce the developers to three of the hardest challenges of developing for a VR system -- any VR system. While there are many more than three, these are the ones he felt were most pertinent to highlight:
User Interface
Simulator Sickness
Latency
Though UI for console and PC games is well established, "this is probably one of the most difficult things to wrap your head around," Mitchell said. "You bring your UI into the virtual space and it's this flat 2D object in every corner and region of the screen, and there are a bunch of problems with this." Per Mitchell, there are three main challenges to bringing a UI into VR successfully:
Stereoscopy "With stereoscopy all sorts of problems start to arise that you wouldn't expect," said Mitchell. "The two primary problems are depth and convergence... Depth and convergence are at the heart of everything that's wrong with stereoscopic UI." For example, shooter reticles as implemented in current games don't work -- users focus on the target and the reticle splits into two, just as in real life when you're focusing far away and there's an object close to you. But bringing "the reticle closer up -- that doesn't really solve the problem," said Mitchell. "What we've found to be one of the most effective solutions is painting it on that object, just doing a ray trace to see where it should be in the world -- but that's not a good solution for every UI," he said.
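A minimal C++ sketch of the reticle approach Mitchell describes: cast a ray along the aim direction and draw the reticle at the hit depth, so both eyes converge on it at the same distance as the surface it is "painted" on. The vector types, the raycastScene query and the distances are hypothetical stand-ins, not anything from the Oculus SDK.

```cpp
#include <algorithm>
#include <cstdio>
#include <optional>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

struct Ray { Vec3 origin; Vec3 dir; };  // dir is assumed to be normalized

// Stand-in for whatever scene query the engine provides; returns the distance
// along the ray to the first surface hit, if any.
std::optional<float> raycastScene(const Ray& ray) {
    if (ray.dir.z > 0.99f) return 12.0f;  // pretend a wall sits 12 m straight ahead
    return std::nullopt;
}

// World-space position where the reticle quad should be drawn so that both
// eyes converge on it at the depth of whatever the player is aiming at.
Vec3 placeReticle(const Ray& aim, float maxDistance = 100.0f) {
    float distance = raycastScene(aim).value_or(maxDistance);
    // Pull the reticle slightly toward the eye to avoid z-fighting with the surface.
    distance = std::max(0.5f, distance - 0.05f);
    return aim.origin + aim.dir * distance;
}

int main() {
    Ray aim{{0.0f, 1.7f, 0.0f}, {0.0f, 0.0f, 1.0f}};  // eye at head height, looking down +Z
    Vec3 p = placeReticle(aim);
    std::printf("reticle at (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
}
```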
Position Relative to Field of View So you've decided to try to take your current game UI unchanged into Oculus Rift. Well, the VR field of view, at 80 degrees, is wider than that of a PC or console game (50 to 60 degrees), so UI that used to sit at the edges of the screen is suddenly no longer at the edges of the player's view. "But if you move it to the outskirts of that, it's too big for them to glance at... and if they turn their head to look at it, it will just keep running away from them," Mitchell said. "One thing, just to hit that point home, is that you always want to try and retain head-look," said Mitchell. Many games throw UI into the center of the screen -- and lock it there. "If you put something in the world and staple it to the player it's just not an enjoyable experience, but when you add head look back it creates a much more natural feel," he said.
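One common way to retain head-look is to anchor HUD elements to a lazily smoothed yaw that trails the head, so the player can glance at a panel by turning their head instead of having it run away from them. The sketch below is one possible interpretation of that idea; the smoothing function, its follow rate and the frame loop are illustrative assumptions rather than anything prescribed in the talk.

```cpp
#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265f;

// Exponentially smoothed "anchor" yaw that trails the head: HUD elements
// attached to this anchor hold still for quick glances (head-look retained)
// instead of being stapled to the player's view.
float smoothAnchorYaw(float anchorYaw, float headYaw, float dt,
                      float followRate = 1.5f /* 1/s, assumed */) {
    float delta = headYaw - anchorYaw;
    while (delta >  kPi) delta -= 2.0f * kPi;   // take the short way around
    while (delta < -kPi) delta += 2.0f * kPi;
    return anchorYaw + delta * (1.0f - std::exp(-followRate * dt));
}

int main() {
    float anchorYaw = 0.0f;
    // The player snaps their head 90 degrees to the right; the anchor catches
    // up gradually, so the panel lingers where it can be glanced at.
    for (int frame = 0; frame < 5; ++frame) {
        anchorYaw = smoothAnchorYaw(anchorYaw, kPi / 2.0f, 1.0f / 60.0f);
        std::printf("frame %d: anchor yaw = %.3f rad\n", frame, anchorYaw);
    }
}
```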
Integration with the Virtual World In his opinion, the real solution is making your UI part of the game world. "Presenting the user with information in a way that simulates reality is going to be the best approach for VR," Mitchell said. For example, displaying something on a computer screen within the game world is "a great way to present information to a player." "With Hawken, being able to look down inside your cockpit, being able to glance at your health and your ammo... is something that really conveys the power of virtual reality," he said. And you can get inspiration from work being done in other fields, Mitchell noted: "I often recommend people go look at the Google and Microsoft augmented reality future videos... those concepts are actually pretty doable in virtual reality."
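As a rough illustration of the cockpit-style, in-world UI Mitchell praises in Hawken, the sketch below draws the HUD into an offscreen texture that is mapped onto a quad fixed inside the cockpit, so the player reads it by looking down like a real instrument panel. The types, the render-target id and the draw stub are all hypothetical placeholders, not engine or Oculus API.

```cpp
#include <cstdio>

struct Transform {
    float position[3];      // offset inside the cockpit mesh, metres
    float yawPitchRoll[3];  // fixed tilt so the panel faces the pilot
};

struct CockpitPanel {
    Transform localToCockpit;  // the quad never moves relative to the cockpit
    int renderTargetId;        // offscreen texture the HUD is drawn into
};

// Stand-in for "draw the gauges into the panel's texture" each frame.
void drawHudToTexture(int renderTargetId, int health, int ammo) {
    std::printf("render target %d <- health %d, ammo %d\n", renderTargetId, health, ammo);
}

int main() {
    CockpitPanel panel{{{0.0f, -0.35f, 0.55f}, {0.0f, -0.6f, 0.0f}}, /*renderTargetId=*/7};
    // Per frame: refresh the texture, then let the engine render the panel as
    // ordinary world geometry; there is no screen-space overlay involved.
    drawHudToTexture(panel.renderTargetId, /*health=*/82, /*ammo=*/140);
    std::printf("panel sits %.2f m below and %.2f m ahead of the pilot's eyes\n",
                -panel.localToCockpit.position[1], panel.localToCockpit.position[2]);
}
```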
Describing simulator sickness as the "inverse" of motion sickness, Mitchell said it's created when the world moves but, in real life, the body doesn't. It's both common and unpredictable, striking even veteran players who don't feel motion sickness when playing PC or console games. One of the main ways to avert it "is just tracking precision -- the ability to track the player in space," he said, matching "their own input and what they expect from how the natural world would react."

While moving forward works well in general, lateral movement is a big challenge. "Moving through a virtual space can be uncomfortable and disorienting to users... the challenge is not forward movement that drives people crazy, it's moving backwards or laterally very quickly." And while it's a key part of most FPS games, "being able to strafe on a dime doesn't always cause the most comfortable experiences," Mitchell noted.

Elevation -- walking up and down stairs -- can also present a challenge. When the "periphery is moving very quickly, especially up and down quickly, it's disorienting," said Mitchell. "An easy workaround is to use an elevator," he said, as "the change in altitude isn't perceived. But you can't say, 'Every VR game, remove stairs from it forever.'"

"Stationary experiences do seem to work well for everyone," said Mitchell, even those who experience simulator sickness very easily. CCP's EVE-VR puts the player in a cockpit, for example. "Most people feel great, but it is not the answer forever," he acknowledged. The good news is that "users do seem to acclimate over time" to the VR experience.

Subtle movements can also disorient players. "If you suddenly tilt the horizon line even very minimally -- even a couple of degrees -- the player will start to become dizzy and not know why," Mitchell said. "As soon as you start doing little creepy things with how the world should be perceived, the player notices." Team Fortress 2's kill-cam was so disorienting that it caused players to tear off their headsets almost immediately, he said. And Unreal Tournament 3's knockback and screen shake on hit, designed to increase immersion in the TV/monitor environment, disoriented players in a similarly unpleasant way.
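Two of the comfort rules above -- keep the horizon level and soften fast lateral or backward movement -- can be expressed as small camera and movement filters. The sketch below is one possible interpretation; the speed limits and the decision to zero out gameplay-driven roll entirely are assumptions, not figures from the talk.

```cpp
#include <algorithm>
#include <cstdio>

struct Velocity { float forward, lateral, vertical; };  // m/s in the player's frame

// Keep the horizon level: gameplay effects (screen shake, hit knockback) are
// not allowed to inject camera roll; head tracking alone decides it.
float comfortCameraRoll(float /*gameplayRollRadians*/) { return 0.0f; }

// Allow full forward speed, but soften strafing and backpedaling, which the
// talk singles out as far more disorienting than forward motion.
Velocity comfortClampVelocity(Velocity v,
                              float maxLateral = 2.0f,    // m/s, assumed
                              float maxBackward = 2.5f) { // m/s, assumed
    v.lateral = std::clamp(v.lateral, -maxLateral, maxLateral);
    v.forward = std::max(v.forward, -maxBackward);
    return v;
}

int main() {
    Velocity wanted{6.0f, -5.0f, 0.0f};  // sprinting forward while strafing hard left
    Velocity safe = comfortClampVelocity(wanted);
    std::printf("forward %.1f m/s, lateral %.1f m/s, roll forced to %.1f rad\n",
                safe.forward, safe.lateral, comfortCameraRoll(0.2f));
}
```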
"Latency is the enemy of VR," said Mitchell. "Moderate changes in latency, even 16ms -- the time it takes to render a frame at 60fps -- can be the difference between immersion and disorientation," he said. According to Mitchell John Carmack, who recently joined Oculus as CTO, thinks developers should aim for 90 to 120 frames per second, with 60 as a bare minimum. Mitchell also said that vsync is a must, as screen tearing is "just miserable" in VR. "The minimum spec for VR gaming is primarily governed by content," said Mitchell. "What we want developers to do is get in the mindset of sacrificing visual fidelity for increased frame rate, because the difference is just that powerful." His advice? "You just need to focus on that game engine." While the pipeline for displaying images on the Oculus Rift still introduces latency, the company is doing its best to iron that out. "I'm pretty confident that teams like ours will pull all of the rest of the latency out of the pipeline, and all of that will be on your shoulders," Mitchell said.