Swery dreams of D4: Making a next-gen Kinect game

In looking at how Access Games would develop the episodic D4: Dark Dreams Don't Die for the Xbox One's Kinect 2.0, Hidetaka "Swery" Suehiro found himself going back to the basics of gestural design.

Kris Ligman, Blogger

November 6, 2013

In looking at how Access Games would develop the episodic D4: Dark Dreams Don't Die for the Xbox One, Hidetaka "Swery" Suehiro found himself going back to basics. Speaking at GDC Next this morning, the Deadly Premonition developer described the peaks and troughs of developing a game with the Kinect 2.0's motion sensors in mind. The team, which had set out with a mind to verisimilitude -- full-body performance from the player to explore and fight through the game's high-energy action scenes -- found after rounds of testing that more abstracted, symbolic gestures were actually far more evocative.

Lowering the barrier to access

In the build shown to GDC Next attendees, D4 is designed to make use of the Kinect 2.0 from either a standing or sitting position. With few exceptions -- which showed up mainly in the action sequences -- all commands were accessible with one hand. Suehiro says this control setup is designed to limit the physical demands on the player, after play-testing revealed that testers were growing exhausted even during the more sedately paced investigation sections of the game.

Scaling back to one-handed controls presented its own hurdles: some gestures started to overlap, for instance, confusing the sensor as to whether the player was swiping her hand to change the camera angle or to open a cabinet. To compensate, the team introduced a three-tiered gestural system that switches between command sets based on whether the player's hand is closed, partly closed, or spread flat.

The implications of this one-handed gestural system go beyond ease of use for able-bodied players -- it also allows more intuitive handling for players with disabilities. Even conversation selections, which can also be activated by voice command, are accessible through one-handed motion control.

"The basic concept is for the player to have empathy for the characters," says Suehiro, who compared the intended ease of D4's gestural use to telepathy. "That's how easy we want it to be. We don't want to tire the player out."
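To make the disambiguation concrete, here is a minimal, hypothetical sketch of a pose-gated command table: the same one-handed swipe resolves to a different command depending on whether the hand is closed, partly closed, or flat. The enum names, command strings, and ResolveCommand function are illustrative assumptions, not taken from D4 or the Kinect SDK.

```cpp
// Hypothetical sketch of a pose-gated command table, in the spirit of the
// three-tiered system described above. All names are illustrative.
#include <cstdio>

// Three coarse hand poses, as reported by the sensor layer (assumed).
enum class HandPose { Closed, PartlyClosed, Flat };

// One-handed swipe directions recognized by the gesture layer (assumed).
enum class Swipe { Left, Right, Up, Down };

// The same swipe maps to a different command depending on hand pose,
// which is one way overlapping gestures can be kept apart.
const char* ResolveCommand(HandPose pose, Swipe swipe) {
    switch (pose) {
        case HandPose::Closed:        // e.g. grab-and-drag: world interaction
            return swipe == Swipe::Right ? "open_cabinet" : "pull_object";
        case HandPose::PartlyClosed:  // e.g. pinch: UI and inventory
            return swipe == Swipe::Up ? "open_inventory" : "select_item";
        case HandPose::Flat:          // e.g. open palm: camera control
            return swipe == Swipe::Left ? "rotate_camera_left"
                                        : "rotate_camera_right";
    }
    return "none";
}

int main() {
    std::printf("%s\n", ResolveCommand(HandPose::Flat, Swipe::Left));     // rotate_camera_left
    std::printf("%s\n", ResolveCommand(HandPose::Closed, Swipe::Right));  // open_cabinet
    return 0;
}
```

Gating the command set on hand pose keeps each individual gesture coarse enough for the sensor to read reliably, while still multiplying the number of commands reachable with one hand.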

Toward simplicity

Suehiro, a black belt in kempo, originally set out to match on-screen commands to player physicality as closely as possible, having the player duck and weave through fighting sequences as though she were participating in a tightly choreographed Jackie Chan flick. However, the team discovered that rather than immersing the player further in the game's movements, it actually created a disconnect. "Say you want to dodge an attack. There are a lot of ways to dodge. You can move to the side, or crouch. If the player's impulse is to dodge one way, but the game asks him to do it another way, it's frustrating," Suehiro explained.

Complicating things didn't make it any easier on the player, so Access opted to throw out its work up to that point and start fresh. "We decided to go back to old-school video games and use symbolic gestures to represent what we wanted to do," said Suehiro, indicating that the team was inspired, for instance, by the navigational grammar of classic arcade and 2D Mario titles. This freed D4 to use a much smaller list of easily processed gestural commands on the fly, speeding up response from the player. It's a simplified solution, but not a simplistic one -- an important distinction.

Suehiro said he wasn't concerned that more abstracted gestures would seem outmoded to modern players; the point was to come up with a system so immediate it could be seen and automatically responded to, even by others in the room. "We've designed it such that the player can do what he wants to do, and the person next to him will get fidgety and want to do what he does," said Suehiro. "And not just the actions, but the gestures themselves -- gestures that everyone in the room would want to do. If there is a punch, everyone will want to punch." Suehiro described the interaction between game and performer as a sort of "community event," with viewers feeding off the energy of the action on screen as well as the player.
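As a rough illustration of what a small symbolic vocabulary buys, here is a hypothetical C++ sketch in which an on-screen prompt expects exactly one gesture from a short list within a timing window. The Gesture enum, Prompt struct, and ResolvePrompt helper are invented for this example and are not from D4 or the Kinect SDK.

```cpp
// Hypothetical sketch of matching a small, fixed vocabulary of symbolic
// gestures against a timed on-screen prompt; names are illustrative only.
#include <cstdio>

enum class Gesture { SwipeLeft, SwipeRight, Push, Raise };

struct Prompt {
    Gesture expected;   // the single symbolic gesture the scene asks for
    double  windowSec;  // how long the player has to perform it
};

// With only a handful of coarse gestures, recognition reduces to an
// equality check, so the game can respond the moment a gesture is detected.
bool ResolvePrompt(const Prompt& prompt, Gesture performed, double elapsedSec) {
    return performed == prompt.expected && elapsedSec <= prompt.windowSec;
}

int main() {
    // The scene always asks for a left swipe to dodge, regardless of how
    // the player might choose to dodge in real life.
    Prompt dodge{Gesture::SwipeLeft, 1.5};
    std::printf("%s\n", ResolvePrompt(dodge, Gesture::SwipeLeft, 0.8) ? "hit" : "miss");
    std::printf("%s\n", ResolvePrompt(dodge, Gesture::Push, 0.8) ? "hit" : "miss");
    return 0;
}
```

Because the vocabulary is tiny and the mapping is fixed, recognition is effectively a lookup, which is what lets the game respond quickly and lets onlookers read and mimic the gesture at a glance.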

Selecting tools for the project, rather than the project for the tools

"When a new device comes to the market, the main challenge is figuring out how to make best use of the hardware, but I actually think a good practice is to go back to the essentials of old games," said Suehiro. "We need to avoid falling into the trap of designing a game based on the hardware. We need to get back to what we want the game to do, then look at how we can use the device to best achieve that." Suehiro said that while Access Games had started out excited to make use of the Kinect 2.0's more sophisticated sensors, the team had discovered it was pulling them away from the essence of what they were trying to create. "Even when working with a new device like Kinect, we should look back on game basics and think of things in [simpler] terms," said Suehiro. "I think that will allow us to take a purer look at what we really want to express through game design. In other words, rather than matching our game design to new input devices, we need to take control of such devices to match our game design."
