On how VR game designers will lose control over the player's avatar as new features such as positional tracking become commonplace.
One of the first lessons we learned after the Oculus Rift DK1 came out, and we got to play the first games with VR support, was to always respect the camera orientation reported by the device. Even small differences between the player's head motion and the rendered image could easily lead to discomfort, so it was important to always respond consistently to head-tracking, even if the game was paused or loading a level.
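To make that rule concrete, here is a minimal sketch; the hmd, camera, and game objects are hypothetical stand-ins for whatever your SDK and engine actually provide, and the only point being illustrated is that the tracked orientation is applied every frame, before any game-state checks.

```python
# A minimal sketch of the "always respect head-tracking" rule, with
# hypothetical hmd, camera and game objects. The key detail is that the
# camera picks up the tracked orientation every frame, no matter what.

def update_frame(game, hmd, camera, dt):
    # Applied unconditionally: even a paused or loading game keeps tracking.
    camera.orientation = hmd.get_orientation()

    if game.is_paused or game.is_loading:
        return  # The world simulation can stop, but the view never does.

    game.simulate(dt)
```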
Losing control over the direction the player looks in forces us to assume he can look in any direction, and to account for this (say, by modelling the back part of the cockpit in a racing game). It also means that forcing a rotation on the camera to ensure that the player will see a scripted event is no longer a tool at our disposal if we want to create consistent, high-quality VR experiences.
Enter positional tracking
Both the Oculus Rift DK2 and Sony's Project Morpheus feature positional tracking, which allows not only the player's rotation but also his own translation movements to be mapped onto the virtual camera. If previously we had to account for the player being able to look in any direction, now the player can also move in any direction!
What should happen if the player sticks his head through a wall or a door? Your game might not have a crouch button, but that doesn't stop the player from crouching for real in the actual world, and you will somehow have to deal with the movement the hardware is reporting!
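As a rough illustration of what the hardware hands us, here is a sketch of mapping the tracked head offset onto the camera position; the helper and the numbers are made up for the example, but note that the vertical component passes straight through, so a real-world crouch lowers the camera whether or not the game has a crouch mechanic.

```python
# A sketch of mapping the tracked head offset onto the in-game camera.
# All positions are plain (x, y, z) tuples in metres; the helper and the
# numbers below are illustrative only.

def tracked_camera_position(avatar_eye_pos, head_offset):
    """Add the head offset reported by the HMD to the avatar's eye anchor."""
    return tuple(a + o for a, o in zip(avatar_eye_pos, head_offset))

# Example: the avatar's eyes sit at (10.0, 1.7, 5.0). The player leans
# 20 cm to the right and crouches 40 cm, so the camera ends up at
# (10.2, 1.3, 5.0) even though the game never asked for a crouch.
camera_pos = tracked_camera_position((10.0, 1.7, 5.0), (0.2, -0.4, 0.0))
```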
This is not an isolated case. As VR technology advances, we'll be able to map more and more of the player's own body onto his avatar, making the problem worse. Devices such as Razer's Hydra already take the player's hands out of the designer's control, and others like the Virtuix Omni treadmill promise to do the same with the player's walking direction and speed.
Simulation coherence vs body coherence
We want our simulations to be coherent. People can't put their heads through walls in the physical world, so allowing it to happen inside the game can affect immersion and believability. In order to preserve that coherence with the real world, we might choose to limit the player's movements (for instance, by having the camera collide with walls rather than going through them, or otherwise clamping the movements within limits given by the game's logic).
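For completeness, here is what that limiting approach might look like in its simplest form; the fixed bounding box is an assumption made for the example, and a real implementation would more likely sweep a small sphere against the level geometry.

```python
# A sketch of the "limit the player's movements" approach: clamp the tracked
# head offset to a box that the game's logic considers valid (for example,
# the interior of a cockpit). The bounds below are illustrative only.

def clamp_head_offset(head_offset, min_bounds, max_bounds):
    """All arguments are (x, y, z) tuples; returns the clamped offset."""
    return tuple(
        min(max(value, lo), hi)
        for value, lo, hi in zip(head_offset, min_bounds, max_bounds)
    )

# Example: the player leaned 60 cm to the right, but the cockpit only
# allows 30 cm, so the offset is clamped to (0.3, -0.1, 0.2).
clamped = clamp_head_offset((0.6, -0.1, 0.2), (-0.3, -0.5, -0.3), (0.3, 0.3, 0.3))
```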
The problem with this approach is that, in pursuit of a coherent simulation, it sacrifices the coherence between the player's own body and his avatar inside the game. When the player moves, he expects immediate and precise feedback to that movement, a subconscious association created out of many years of continuous experience in the real world. If our game fails to deliver on that, it will break the illusion of presence, the feeling that the body and avatar are one and the same, and might even cause physical discomfort in some of our players.
I'd argue that, as designers, we need to learn to let go of the player's avatar. In VR, avatars act more as visitors in our virtual world than as part of it, and we should consider them as interfaces with the player's body. As technology improves, these interfaces will need to become thinner and thinner.
A practical case from Dreadhalls
Thanks to Oculus, I was lucky enough to have access to a DK2 and be able to integrate these new features into the game. Dealing with the player pushing his head through walls or other parts of the environment was one of the first issues, and we tried a number of solutions:
Doing nothing: This might be a valid course of action for certain games, but it was not ideal for Dreadhalls. Being able to look "outside" of the map, or pushing your head through a door, completely breaks immersion and affects a number of systems that rely on the camera being inside the environment.
Limiting the movement: Having the camera collide with the walls solved the previous problem, but as I mentioned before, this caused a mismatch between the player's motions and the visual feedback. That mismatch both hurt the feeling of presence and made some players (me included) feel sick.
Just go with it: Rather than fighting the player or allowing the game to break, the idea here was to play along and assume you could actually push your head through walls. What would happen? In Dreadhalls, when you do, the screen goes to black and all sounds are muffled. This turned out to be a much better solution to the problem. All of the player's movements are respected, and the simulation responds to them as coherently as possible, even if that's not what would happen in the real world (see the sketch after this list).
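A minimal sketch of that response, assuming a hypothetical is_point_inside_geometry query and made-up fade and low-pass values; the post describes only the effect, not Dreadhalls' actual implementation.

```python
# A sketch of the "just go with it" response: when the tracked head ends up
# inside level geometry, fade the view to black and muffle the audio instead
# of pushing the camera back. The query function and the constants are
# hypothetical, chosen only to illustrate the idea.

def head_in_geometry_response(head_position, is_point_inside_geometry):
    """Return (screen_fade, audio_lowpass_cutoff_hz) for this frame."""
    if is_point_inside_geometry(head_position):
        return 1.0, 500.0     # fully black screen, heavily muffled audio
    return 0.0, 20000.0       # normal rendering, full-range audio

# Example with a trivial "wall" occupying everything beyond x = 5:
fade, cutoff = head_in_geometry_response((5.3, 1.6, 0.0), lambda p: p[0] > 5.0)
```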
That difference between the real world and a virtual one is important. It doesn't seem to be a problem if the VR world reacts differently than the real one, as long as both internal coherence and coherence with the player's own body are maintained. Virtual worlds have much more plasticity in that regard, and act and feel a bit like dream worlds. Interestingly, having UI elements intermixed with the simulated world doesn't seem to create problems either.
That plasticity also opens new doors for game design. Perhaps pushing your head through the environment could actually be a gameplay feature in a game where you play some kind of ghost?
It's still too early to know for sure whether what we found applies across a large swath of players of different types, but it seems promising. In any case, what is certain is that the new VR technologies are changing the relationship between the player and these virtual worlds, which means designers will also need to change how they approach the concept of the player's avatar.