A five-year retrospective on lessons learned from inside the history of VR locomotion by industry pioneers at Cloudhead Games.
By Paul White and Antony Stevens.
My journey into VR locomotion began with the sunsetting Razer Hydra in late 2013. An early motion controller system tracked by a low-power magnetic field, the Hydra was originally designed as a peripheral for flat PC gaming. But for some of us, it was also an unlikely hero—the Hydra was the first big key to unlocking presence in virtual reality.
It was the era of the DK1, the first of the Oculus Rift prototypes available to Kickstarter backers, which offered only rotational head tracking during its initial foray into the rebirth of VR. Without positional tracking of the head or hands, player movement in VR projects was either bound to the analogue sticks or omitted entirely. These were the standards and limitations of the time; VR as we know it today had yet to exist.
I was working on Exploration School, an early tech demo for our built-for-VR adventure game "The Gallery." My challenge was to use the Hydra to mimic the motions of climbing a wall without using control sticks—just reach out and grab it. It sounds straightforward now, but during those early days of VR we thought it could never be done with the tech.
Holding the wired Hydra, you would reach out with your hand and press a button to capture the position of that arm on a surface. Any motion you made next would be countered and represented in-game through our body persistence. If you pulled your arm down, your position would counter that movement, and your camera and in-game body would move upward. If you raised your arm back up, you would climb down. It felt intuitive, all tech considered.
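As a rough illustration (not our actual code), here is a minimal Python sketch of that counter-movement: capture the hand position at grab time, then move the player rig opposite to the hand's displacement. The names here, like `Climber` and `grab_anchor`, are hypothetical.

```python
# A minimal counter-movement climbing sketch. While gripping, the rig
# moves opposite to the hand's displacement from the grab anchor.
# Hand positions should be in rig-local (tracking) space so the rig's
# own motion doesn't feed back into the delta.

class Climber:
    def __init__(self):
        self.grab_anchor = None  # hand position captured at grab time

    def on_grab(self, hand_pos):
        self.grab_anchor = hand_pos

    def on_release(self):
        self.grab_anchor = None

    def update(self, hand_pos, rig_pos):
        """Return the rig position for this frame."""
        if self.grab_anchor is None:
            return rig_pos
        # Pulling the hand down raises the body; raising it lowers you.
        delta = [h - a for h, a in zip(hand_pos, self.grab_anchor)]
        return tuple(r - d for r, d in zip(rig_pos, delta))
```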
VR devs all around were experimenting with anything and everything, from climbing to flying to roller coasters, but there was no substantial test audience. Motion sickness was a concern internally, but there weren’t enough headsets in the wild to know how widespread its effect was. We knew what artificial movement felt like to us and other developers, but there was no way to know what was working and what wasn’t for various sensitivities.
When we brought Exploration School to public events, we gave players the best advice we had: “Don’t look down.”
Those first two years saw many VR developers building single-room projects—playboxes with no need for travel or locomotion. The Oculus Rift, for all intents and purposes, was a seated experience.
Our project, The Gallery, was a larger world that needed exploration, with terrain that was organic and rugged. We wanted realism where you could walk around, look at things, and feel alive in a world. VR was predominantly blocky at the time (both graphically and otherwise), and walking with the analogue stick felt like your body was a cart behind you, changing direction to chase after you each time you turned your head. It all felt unnatural.
Tank move was one alternative. This method allowed your head to deviate from the direction you were moving, so you could pan your view around an environment completely decoupled from your body direction. Think of your head as a swiveling neck cannon, while your body is driven on tracks and controlled by a joystick. It was a fitting abstraction.
Tank move was better because it meant you could look around while you locomoted. It was also worse because of vestibular disconnect—motion sickness caused by your brain perceiving directional movement through your eyes (the headset), without physical motion detected by your inner ear (the real one). Decoupling head movement from the body ultimately decoupled stomach contents from the body as well.
More important than the freedom to look around was the freedom to move around, and we knew that the positional tracking features of the upcoming DK2 (and experimental hardware from Valve) would help dictate movement. In the meantime, we wanted to get ahead of the curve and start building for the future that VR was heading toward. Using heuristic spine modeling and a simulated height, I was able to turn the single, rotational tracking point of the DK1 into two positional tracking points: head and root.
With that inferred root, we then had the approximate location of the player’s torso in relation to their head, and could then adjust their body avatar with movements accordingly. We could tell the difference between natural displacements, from the player crouching into a tent, to peering over a balcony at the distant world around them.
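Purely as an illustration (the offsets, angle conventions, and constants here are invented placeholders, not the heuristics we shipped), inferring a root from a rotation-only head pose plus a configured height might look like this:

```python
import math

# Infer a torso "root" from a single rotation-only head pose and a
# simulated player height. The DK1 gives no position, so the head
# pivot is fixed at (0, height, 0). Convention: yaw=0 faces +z,
# positive pitch tilts the head forward. Constants are illustrative.

NECK_LENGTH = 0.12     # metres from head pivot to neck base
SPINE_FRACTION = 0.45  # fraction of height from neck base down to root

def infer_root(yaw, pitch, player_height):
    # Head-local "down" after pitching forward, then yawed into world.
    down = (0.0, -math.cos(pitch), -math.sin(pitch))
    down = (math.sin(yaw) * down[2],
            down[1],
            math.cos(yaw) * down[2])
    neck = (down[0] * NECK_LENGTH,
            player_height + down[1] * NECK_LENGTH,
            down[2] * NECK_LENGTH)
    # The root hangs straight down the simulated spine from the neck.
    return (neck[0], neck[1] - SPINE_FRACTION * player_height, neck[2])
```

Crouching or leaning moves the inferred root very differently than simply looking around does, which is what let us tell those displacements apart.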
In the end, the feature never made it in. Everything was about to change anyway.
VR devkits were being released to the public in droves, now with positional tracking, and people were getting motion sick. Just by putting on a headset, it became an immediate uphill battle for comfort. Using your hands and body, standing up and crouching down—it had all added so much to presence. But it had come at a cost. Any time the camera displaced without the player, it was barf city. And in an exploration game like The Gallery, you couldn't just explore the contents of your chair.
Most locomotion in VR was now split between body cart, tank move, and stick move with yaw rotation. The latter was the worst of the bunch, not only producing artificial forward-backward movement (vection), but also allowing the player to control the camera independent of their head position. Instead of a body cart, your face was the one along for the ride. If motion sickness was going to be the widespread problem it was trending to be, we would need to find a better way.
At any given moment, the human eye is performing what are called 'saccadic movements.' Your eyes are constantly dancing, looking around for other things, even though to you the view seems smooth or even still. Each saccade is an imperceptible movement, a jump. The turn of a ballerina. This was the basis for VR Comfort Mode.
Rather than continuously rotating the camera over a duration, as yaw rotations do, Comfort Turns are instantaneous. You press a button, or flick the control stick, and the player camera changes its facing direction. And because it’s instantaneous, there's no visual motion for your brain to perceive, and no physical movement for your inner ear to detect—no vestibular disconnect. It goes a long way to mitigating motion sickness, and the option has remained a standard for comfort even in today’s evolved experiences.
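In code, a Comfort Turn can be as simple as a latched, fixed-increment yaw change. A minimal sketch, with the snap angle and deadzone as assumed values rather than shipped ones:

```python
import math

class ComfortTurner:
    """Snap-turn the camera rig in a single frame: no smooth rotation,
    so no vection and no vestibular disconnect."""

    def __init__(self, snap_angle_deg=30, deadzone=0.7):
        self.snap = math.radians(snap_angle_deg)
        self.deadzone = deadzone
        self.latched = False  # prevents repeated turns while held

    def update(self, rig_yaw, stick_x):
        if abs(stick_x) < self.deadzone:
            self.latched = False       # stick re-centered; re-arm
            return rig_yaw
        if self.latched:
            return rig_yaw             # wait for the stick to return
        self.latched = True
        # Instantaneous rotation: the whole turn happens in one frame.
        return rig_yaw + math.copysign(self.snap, stick_x)
```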
At the same time, we were still trying to make sure moving felt like moving. We began to work on a VR obstacle course specifically to experiment with different locomotion styles. I went back to climbing, inspired by geodesic domes, and developed a spherical ladder that you could go all the way around. You would grab for a bar and latch on, climbing the ladder until you were looking down at the ground in front of you, and eventually hanging upside down on the far side of the dome.
That one never made it in either, but if you’re reading this, NASA, you know who to call.
Near the end of 2014, Valve invited us and a few other select developers to a secret summit. It was there that they revealed SteamVR (and what would eventually become the HTC Vive) for the first time. Rather than the heuristic, inside-out tracking points Valve had shown at Dev Days earlier in the year, SteamVR was using real, local points on the HMD tracked within a volumetric space via “Lighthouses.”
SteamVR did something else no other HMD had before: it added hands. Instead of being tethered to a small magnetic box in front of you, controllers could now be tracked spatially by the same Lighthouse hardware your head was. It offered the genuine ability to walk around and touch things in VR. It was simply amazing.
It also threw everything we knew, and every concept we had for our game, for a loop and out the window.
When the player was fixed in place in meatspace (their physical space), their headset position was a fixed local offset. We had designed climbing so that when a player grabbed onto a bar, we calculated that offset. With roomscale, those constraints were gone: you could move anywhere. Now, if the player moved positionally mid-grab, their arm would outstretch and essentially break free from the ladder and the offset. We were calculating a redundant position that was now tracked by hardware right out of the box.
Likewise, tank move became irrelevant. You could walk in any direction in meatspace while simultaneously looking around normally. Our whole book on locomotion and body persistence broke overnight. The entire game halted. From that point, we had three months to not only redesign everything we had, but miraculously hit a new benchmark of 90fps for the first public demo of SteamVR at GDC 2015.
The first thing I remember us brainstorming about for the GDC Demo was movement. We had always wanted the player to move, and even though SteamVR had introduced this larger, volumetric space, it was now just a bigger box. Our plan was to utilize the volume itself as a virtual elevator, so that the whole room could move upward—and you with it.
As the GDC Demo concluded to the sweeping score of Jeremy Soule, the elevator rose up, and the walls opened on all sides to reveal a skyline with infinite possibilities and directions to explore.
After GDC, people loved roomscale; it was a new feeling for everyone. But we still had a game to build. Whatever we made for roomscale had to play into the full-scale levels we had been designing for The Gallery up to that point.
The new freedoms of positional tracking, artificial locomotion, and variable heights all at once were almost too much. They contradicted the core feeling and pacing of the experience we'd designed. We needed constraints. And a new type of comfort.
We rapidly prototyped locomotion methods to see what landed. The goal was to align our locomotion with how we wanted the player to perceive the game. More advanced techniques felt superfluous and worked against the tone and feel of our slow-burn exploration. We were also a small team of seven, and we had to consider player fatigue, something we had no metrics on.
How long could players explore inside VR before they felt eye strain or exhaustion? How much mental load, between advanced controls and puzzles, could a player handle on top of that? And that was before even considering nausea.
SteamVR freed players from sitting or standing in one position like with the Hydra or DK2, but it had its own issues: players were literally running into walls. We needed to find a way to redirect players to the center of their physical space, so they had maximum area to work, play, and explore. As with the GDC Elevator Demo, we didn't want the player to feel like they were constrained to just their room.
Artificial locomotion (traditional gamepad movement) was the jumping-off point. We experimented with play bounds (the ostensible walls of the roomscale volume) overlaying as a grid when the player moved forward with the analogue stick. Then we simulated head bobbing like in a traditional FPS.
Body Joystick had you at the center of your room, with any offset direction and distance from that position representing the directional vector and velocity of your movement—you used your body as the controller. Arm Joystick used the motion controller itself like a flight stick (a similar method is known today as ‘Onward-style movement’).
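A minimal sketch of the Body Joystick idea, with the deadzone and speed scaling as assumed values: the player's horizontal offset from the playspace center becomes the movement vector.

```python
# "Body Joystick" sketch: direction of movement comes from the offset
# direction, speed from the offset distance. You use your body as the
# controller. Constants are illustrative, not shipped values.

SPEED_PER_METRE = 2.0  # how strongly stepping off-center drives you
DEADZONE = 0.15        # metres; small shuffles shouldn't move you

def body_joystick_velocity(head_pos, playspace_center):
    """Return an (x, z) world velocity from the horizontal offset."""
    ox = head_pos[0] - playspace_center[0]
    oz = head_pos[2] - playspace_center[2]
    dist = (ox * ox + oz * oz) ** 0.5
    if dist < DEADZONE:
        return (0.0, 0.0)
    return (ox * SPEED_PER_METRE, oz * SPEED_PER_METRE)
```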
I was stuck on the idea of grabbing space and came up with a concept for VR handlebars. You could reach out and grab, and, as soon as you latched, a virtual handle would appear in that space. It looked like the holodeck; right when you grabbed forward, the entire world would move toward you. In The Gallery it went over like a fart in a spacesuit, but it ultimately found a fitting home in zero-G experiences like Lone Echo.

All of these methods worked against the feeling of our experience. And all of them made you feel like you were going to fall over. If you accelerated instantly, you'd get a lurching in your stomach. If you accelerated gradually, you'd get a different lurching in your stomach.
At the time, the few metrics we did have indicated that artificial locomotion wasn’t working for players. Now, with roomscale, it wasn’t working for our game either.
One of the first forms of teleportation I worked on was Astral Navigation. You would look up in-game and see a star path between the clouds that represented the layout of the level you were in. Aim to where you wanted to go and, when you looked back down, you’d have teleported to that point in the scene. Impractical, but we were trying to explore the upper limits of what we could do with movement.
At this point, Valve put out an early photogrammetry scan of their office (a rough point-cloud version similar to the current SteamVR Environment). In it were various "information" nodes that you could teleport between to navigate the room. Rather than use artificial locomotion to slide around, you could zip from curated point to curated point. We instantly liked it; it felt cool, and it didn't make you sick.
SteamVR hardware had elongated the tether to the computer, but we still had a cord to fight with. In a game with exploration like The Gallery, players were getting wrapped up and tangled when they tried to spin around. Immersion broke as they fixed themselves, their tangled cable, and their orientation in-game. We decided to take Valve’s teleportation nodes and augment them to support rotation.
The first question was how best to align players in an open level to have access to the interactions that we predicted they would want.
I prototyped a way for players to raycast their playspace by holding the touchpad on the controller. They could then use their thumb to aim and rotate where they wanted to go, see a projection of their bounds, and know what they would be able to reach within those bounds at their destination. We could then snap their playspace to an ideal hotspot when they teleported.
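Conceptually, the resolution step might look like the sketch below; the hotspot list and snap radius are illustrative, not our shipped values. Snapping the bounds to a curated spot keeps the predicted interactions reachable regardless of the player's real room size.

```python
# Teleport resolution sketch: cast to a target, preview the play
# bounds there, and snap to a curated hotspot if one is close.

SNAP_RADIUS = 0.75  # metres; illustrative

def resolve_teleport(target, thumb_angle, hotspots):
    """Return (position, yaw) for the previewed playspace.

    target      -- raycast hit point on the ground
    thumb_angle -- desired facing, aimed with the touchpad thumb
    hotspots    -- curated (position, yaw) pairs for ideal placement
    """
    for pos, yaw in hotspots:
        dx = pos[0] - target[0]
        dz = pos[2] - target[2]
        if (dx * dx + dz * dz) ** 0.5 < SNAP_RADIUS:
            return pos, yaw          # snap the bounds to the hotspot
    return target, thumb_angle       # otherwise place bounds freely
```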
By mid-2015, we were getting closer. We removed the hotspots and just let players teleport wherever they wanted to, within our set parameters of height and distance. This nausea-free locomotion meant that we could help direct the experience while still giving players their freedom to move. It also offered a bridge for players new to VR to visualize and strategize their position regardless of their local space limitations, essentially giving roomscale gameplay to any room size.
Persistent bounds were important for that. When used in conjunction with artificial locomotion, visible boundaries were helpful as a stable point in the periphery to mitigate vection issues and nausea. With teleportation, they served to make the player feel safe within their space; play bounds could dynamically pop up when you came too close to a surface. When the player knew where they could safely move to, they were less likely to just stand in one spot.
Instead of an instant zort to a new location, which hurt the pace of how we wanted the game to feel, we did a fade-to-black "Blink", another eye technique. The duration of the Blink was directly proportional to the actual distance you wanted to travel. We wanted that passage of time to feel realistic, so you couldn't just Blink to the other end of the world in an instant. I worked with our sound designer, Joel, to nail down the timing, and he added footstep Foley so you could actually hear yourself walking as you Blinked.
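The distance-proportional timing is easy to sketch; the constants here are placeholders, not the values Joel and I landed on.

```python
# Blink sketch: fade to black, move, fade back in, with the blackout
# duration proportional to the distance travelled so the passage of
# time still feels honest. Footstep audio can be timed to the same
# duration. Constants are illustrative.

SECONDS_PER_METRE = 0.04
MIN_BLINK = 0.15  # even short hops get a perceptible blink
MAX_BLINK = 1.2   # cap so long hops don't drag

def blink_duration(origin, destination):
    dist = sum((a - b) ** 2 for a, b in zip(origin, destination)) ** 0.5
    return min(MAX_BLINK, max(MIN_BLINK, dist * SECONDS_PER_METRE))
```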
It was all about the feel.
Denny was pushing for more experimentation with our artificial locomotion options. We played with different settings, from uni-directional (forward only), to bi-directional (forward-backward), to strafing enabled (side-to-side). These methods were still causing sickness, but Denny was working toward vection mitigation when he got to the idea of a porthole.
During artificial movement, we would bring a mask around the periphery of the eyes—a portal of sorts that we called a ‘Vection Portal’. With it, you could feel motion as you moved through the level, but without the fast-moving periphery that made it uncomfortable.
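A sketch of the idea, with the speed and radius constants as placeholders: drive a peripheral vignette from locomotion speed, so the mask closes as you move faster and opens again at rest.

```python
# "Vection Portal" sketch: tighten a peripheral mask as artificial
# movement speeds up, hiding the fast-moving periphery while the
# center of vision still reads as motion. Constants are illustrative.

MAX_SPEED = 3.0   # m/s at which the vignette is fully closed
OPEN_FOV = 1.0    # vignette radius as a fraction of the view
CLOSED_FOV = 0.45 # smallest comfortable porthole

def vignette_radius(speed):
    t = min(1.0, abs(speed) / MAX_SPEED)
    # Lerp from fully open at rest to a tight porthole at speed.
    return OPEN_FOV + (CLOSED_FOV - OPEN_FOV) * t
```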
Despite these new comfort options for artificial locomotion, nausea wasn't totally solved, and Blink was still the right feel for the game. Blink matched The Gallery's cinematic style, was comfortable enough for mass adoption, and inspired players to move around within their space.
In the end, we had to choose one. We were a small team, and testing large VR levels with multiple locomotion methods wasn’t in the cards. We had to fully integrate and support one method, and we decided to stick with Blink.
From there I was able to optimize the game entirely around Blink. I could instantly bring in specific items based on the player Blinking into certain zones, and then cull the others. We were able to cut down processing and increase the fidelity of the game because fewer areas needed to be persistent at all times.
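In spirit, the zone logic was as simple as the sketch below (the data layout and names are hypothetical): because Blink arrivals are known points, each teleport can stream in the destination and cull everything else.

```python
# Blink-keyed zone streaming sketch. `zones` maps a zone name to the
# set of neighbouring zone names; `loaded` is the set of currently
# resident zones, mutated in place during the blackout.

def on_blink(destination, zones, loaded):
    keep = {destination} | zones[destination]
    for name in set(loaded) - keep:
        loaded.discard(name)  # cull zones the player can't reach
    loaded |= keep            # stream in the destination and neighbours

# Example: blinking from the atrium to the cliffs.
loaded = {"atrium"}
on_blink("cliffs", {"cliffs": {"atrium"}, "atrium": {"cliffs"}}, loaded)
```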
When the first episode of The Gallery finished and public adoption of VR finally arrived in April 2016, reactions to Blink were increasingly polarized. Teleportation had become the standard, but the rotational and persistent-bounds features we added with Blink felt too insular. Valve had made simultaneous developments to their teleportation, and player preference leaned toward those minimalist, built-in systems. We had inadvertently invented many of the same wheels, and our systems ultimately overlapped.
When the Oculus Touch released in December 2016, we updated Blink to be more in line with the standards of the time. Rather than aiming with your head (literally looking to where you wanted to go) as we originally had it, we set the default to cast from the hand, so you could point to your destination instead. We also added a ballistic trajectory arc that felt more intuitive to our players, albeit less connected to our world.
The feeling mattered, but so did the options.
Artificial locomotion was taking on a new name: Free Locomotion. Onward, a tactical FPS fully integrated with smooth artificial locomotion, had become a cult hit with VR gamers. Players felt that free locomotion was now the only way they could feel truly immersed. There was an outcry from the community any time a game lacked the option, and we would ultimately add support in the second episode of The Gallery, despite it being built around our default option, Blink.
Also gaining traction at the time was ArmSwinger. What I immediately liked about the ArmSwinger demo I saw was that the controllers inferred the player's body direction from the motions they performed. You could turn the body with the direction of the arms and still let the head pan, decoupled from that direction. The downside to ArmSwinger, however, was that it still used up a button and didn't ultimately address vection issues.
Valve had let us in on another secret. Because of the depth of our hand functionality, The Gallery was used to reveal the first prototype of the SteamVR "Knuckles" controllers at Steam Dev Days 2016. These new controllers were trending away from buttons and the abstraction they produced; hand interactions could be performed by tracking individual fingers, rather than with binary button inputs. With this in mind, I decided to augment the ArmSwinger mechanic to be buttonless.
I began researching FFTs, or "fast Fourier transforms." I had studied them before for audio, but I was beginning to see their relevance in kinematics. An FFT decomposes a time-domain signal, however many waveforms are combined within it, into an ordered table of discrete frequencies and amplitudes. More colloquially, FFTs let me read the body noise in the controllers' motions and turn it into practical data.
I could then set the exact threshold at which movement should start; two cached or buffered swing periods are enough to tell when a player is running in place, based on their arm frequencies. With this method, you could swing one arm and pick an object up with the other while still continuing to move.
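As an illustration of that pipeline, here is a minimal Python sketch using NumPy; the window size, swing band, and amplitude threshold are assumptions, not our shipped constants. Because each controller is analyzed independently, one arm can keep driving movement while the other reaches for an object.

```python
import numpy as np

# Detect running-in-place from one controller's vertical motion:
# buffer the heights, transform them, and accept a dominant frequency
# in a plausible arm-swing band at sufficient amplitude.

SAMPLE_RATE = 90         # samples/s (one per frame at 90 fps)
WINDOW = 128             # ~1.4 s of motion, roughly two swing periods
SWING_BAND = (0.8, 3.5)  # Hz; walking through sprinting arm cadence
MIN_AMPLITUDE = 0.02     # metres; rejects hand jitter and body noise

def swing_frequency(hand_heights):
    """Return the dominant swing frequency in Hz, or None if the
    player isn't running in place."""
    if len(hand_heights) < WINDOW:
        return None
    y = np.asarray(hand_heights[-WINDOW:], dtype=float)
    y -= y.mean()                        # remove the standing height
    spectrum = np.abs(np.fft.rfft(y)) / (WINDOW / 2)
    freqs = np.fft.rfftfreq(WINDOW, d=1.0 / SAMPLE_RATE)
    k = int(np.argmax(spectrum[1:])) + 1  # skip the DC bin
    f, a = freqs[k], spectrum[k]
    if SWING_BAND[0] <= f <= SWING_BAND[1] and a >= MIN_AMPLITUDE:
        return f                          # cadence can drive speed
    return None
```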
With body pucks (such as the Vive Tracker) not yet available to the public, full-body persistence would require us to infer the root position of the player. Once there, however, you could use height and leg data to start tailoring a leg-strength coefficient and calculating the specific physics and kinematics of any single person. Players could run at the same frequency beside each other, yet still have different travel distances.
In practice, games like Sprint Vector with its peripheral VFX have shown that arm swinger-types can help mitigate vection issues. The physical bobbing while you run is enough to ease into motion but still keep you grounded—there’s no lurching, and there’s limited vestibular disconnect. It’s a great option for free locomotion without sacrificing comfort. Plus, the cardio, you know?
Experimentation has always been the way forward, especially in VR. Aldin Dynamics introduced Telepath locomotion late last year, a hybrid of both teleportation and free locomotion. More games are experimenting with climbing and arm swinging, and we're seeing new locomotion and comfort options releasing every month.
But that great divide between comfort and realism is still going strong, and it becomes increasingly challenging as we fight to increase VR adoption. Current VR owners are desperate for realism and freedom, while the non-owners will never be sold unless they’re comfortable.
Teleportation can allow developers to know and design where the player is able to access and see at all times, but it can also feel limiting to the player—like digital training wheels.
Free locomotion can allow players a boundless experience, but it typically requires an acclimation period and some people simply can’t handle it.
Implementing one method into a game that was built for the other can break the balance, or sometimes the entire experience.
Simply put, there are sensibilities in designing locomotion for VR. We, as developers, want to push the boundaries of immersion, but we should also strive to maintain a comfortable and considered experience for all players.
In the future, we want to release a Mutant Locomotion scheme, a culmination of all our various methods with no compromises. On the right hand you can Blink, with Comfort Turn inputs on either hemisphere of the analogue stick or touch pad. On the left hand is free locomotion. Buttonless Arm Swinger heuristics are ongoing and always sensing for oscillations to occur. We want to put it all on the table and really know how players want to explore. One distilled option, one integrated experience.
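As a sketch of how that routing might read (all priorities, thresholds, and action names here are guesses at such a design, not a shipped implementation):

```python
# Hybrid "Mutant" routing sketch: Blink and Comfort Turns on the
# right hand, free locomotion on the left, with the buttonless
# arm-swing detector always listening in the background.

def route_locomotion(right_stick_x, right_trigger, left_stick, swing_freq):
    """Decide this frame's locomotion action. Returns (action, arg)."""
    if abs(right_stick_x) > 0.7:
        return ("comfort_turn", right_stick_x)  # instant snap turn
    if right_trigger:
        return ("blink", None)                  # fade-move-fade teleport
    if max(abs(left_stick[0]), abs(left_stick[1])) > 0.1:
        return ("free_move", left_stick)        # smooth stick locomotion
    if swing_freq is not None:
        return ("arm_swing", swing_freq)        # buttonless running in place
    return ("idle", None)
```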
That's the important bit: Give players those options, but remember to tailor the locomotion to the experience you want to create. There are too many independent developers out there attempting the shotgun approach with 50 different locomotion schemes. The options are good, but at that point they're not designed for the game. At the end of the day, ask yourself what you want your player to experience. Take the lessons learned and the methods you've enjoyed, and listen to the feedback from your users. Ultimately, those are the people who play your experience at great length, and they can tell you what they enjoy about it and what they don't.
The next step is when we have additional tracking points in the mainstream or perhaps galvanic stimulation of the inner ear. Then we can get into the snowboarding and the skateboarding and the hoverboarding. It opens up a huge new window for experimentation in locomotion.
And honestly, I like that stuff—it’s what made those early days so exciting. We have a game to make and things to do, but in the end it’s still about pushing boundaries together. There’s a huge risk in pouring time and money into experimentation in VR, and a lone developer can’t climb to the finish and fully realize it themselves. It takes a village to move a medium.
One day, through a collective effort, we might find a better way, something far beyond the standard motion and teleportation. The walls will open on all sides to reveal a skyline with infinite possibilities and directions to explore.
This article was first published on Road to VR.