
Anticlimactic: The Cost of Realism in Cinematic Games

AAA titles increasingly strive to create a cinematic experience for the player. Commonly this is done by enhancing a game's realism. Unfortunately, concessions to realism can confound the player at what should be the game's most poignant moments.

Game Developer, Staff

February 17, 2011


Red Dead Redemption

Red Dead Redemption is by no means a frustrating game. The game allows you to chart a clear path from A to B and then ride along said path, depicted as a bright yellow line, or fast travel to your destination. Death rarely means restarting more than a couple of minutes from where you died. Where the game falters, at times, is in its cinematic impact.

In one sense, the situation described above shouldn't feel so different from any other cause of player death or failure. As I'll explain, though, it's part of a larger symptom, and tends to occur after (or before) you've completed the game's challenging segment -- in other words, it feels disruptive of what should be a cinematic (and glorious) moment. Put another way, it sucks to watch your hero fail because he ran into a supply closet instead of out the exit; this is different from watching him fail because he was beaten up or missed a leap from a cliff.

Aside from the occasional glitch (though these, too, are actually relevant, as I’ll explain later), most of the confusion in Red Dead stems from its realism. This sounds counter-intuitive, of course. So consider the scene I described above, which is actually quite typical. Why did I lose sight of my target?

For one, Red Dead’s locations are well-populated with realistically rendered people – meaning people who more or less look the same. The locations are realistically lit and textured, which can mean they are dark and muddy, and that key interaction points, such as doors, don’t stand out as much as they should. Finally, and most importantly, there are few invisible walls, allowing the player to fall wildly off the rails.

Red Dead’s primary solution to these issues is its radar screen, which allows you to clearly track enemies as red dots. However, the small and highly abstract map doesn’t represent height, walls, or doors in any meaningful way. Also, when struggling to climb out a window, descend a flight of stairs, find a horse, and ultimately hog-tie a villain, many players will lack the presence of mind to check the radar screen, and in any case, the necessity of fixating on the radar screen sometimes undermines the game’s visuals.
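One common mitigation would be to annotate each radar blip with an above/below cue derived from the height difference between the target and the player, as many later shooters do. A minimal sketch of the idea follows, with hypothetical names and tuning values -- this is not Rockstar's implementation:

```cpp
#include <cmath>

// A sketch of a height cue for a flat radar: the 2D map can at least
// convey which floor a target is on. The names and the 3-metre floor
// height are illustrative assumptions.
enum class HeightCue { Level, Above, Below };

HeightCue radarHeightCue(float targetY, float playerY,
                         float floorHeight = 3.0f) {
    const float dy = targetY - playerY;
    // Within half a floor of the player: treat as the same level.
    if (std::fabs(dy) < floorHeight * 0.5f) return HeightCue::Level;
    return dy > 0.0f ? HeightCue::Above : HeightCue::Below;
}
```

Rendering the cue as a small chevron on the blip costs almost nothing in realism, yet removes most of the "is he upstairs or downstairs?" confusion.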

On the other hand, the minimalist HUD and reliance on in-game graphics cause other problems, such as charging at an armed foe without a proper weapon equipped (a problem exacerbated by the game’s odd tendency to randomly unequip weapons).

But so what, you say – failure simply means replaying a few minutes of a mission. In that sense, yes, Red Dead isn’t frustrating. But consider what restarting a mission does to the cinematic impact of that moment. The player’s built-up adrenaline and anticipation evaporates and the game, which was for a moment transcendent, feels entirely like a game again. It’s obvious that Rockstar wants to give players the sensation of living a movie, which is why the realism in Red Dead is so meticulous, like an elaborate movie set.

Another way in which Red Dead’s realism undermines the game’s impact comes when the player’s ability to freely interact with this realistic world diverges from what the game considers to be "good" player behavior. Dialogue often occurs while riding alongside an NPC on horseback.

Bumping into this NPC with your horse (or the NPC bumping into you) triggers the NPC complaining about your riding ability, which interrupts the dialogue. Depending on when this interruption occurs, the NPC may never finish whatever he or she had originally been telling you.

This is especially annoying because, A, you don’t want to have to pay so much attention to your riding while listening to a story, and B, it can be nearly impossible to avoid accidental contact. Yes, a real person would be annoyed if you ran your horse into his, but is this detail worth including in the game -- in particular during important story segments?
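The implied fix is cheap: incidental barks simply shouldn't be allowed to interrupt scripted story dialogue. A minimal sketch of such a priority rule (all names are hypothetical, not Rockstar's actual system):

```cpp
#include <optional>
#include <string>

// A two-tier dialogue priority model: incidental barks ("watch your
// horse!") may never interrupt scripted story lines.
enum class LinePriority { Bark, Story };

struct DialogueLine {
    std::string  text;
    LinePriority priority;
};

class DialogueChannel {
public:
    // Returns true if the new line was accepted for playback.
    bool tryPlay(const DialogueLine& line) {
        // A bark arriving while story dialogue plays is dropped.
        if (current_ && current_->priority == LinePriority::Story &&
            line.priority == LinePriority::Bark) {
            return false;
        }
        current_ = line;  // Story lines preempt barks, or start fresh.
        return true;
    }

    void onLineFinished() { current_.reset(); }

private:
    std::optional<DialogueLine> current_;
};
```

Dropping the bark entirely is usually fine; queuing it to play after the story line finishes is the more polished option.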

The horse situation worsens when a vital NPC dismounts before you do. More than once, while dismounting from my own horse, I gently bumped or ran over the NPC I was charged with protecting, which can kill the NPC and terminate the mission. While in most cases it’s fun that you can use your horse as a weapon of mass destruction, it doesn’t make sense to allow the behavior in what should be a cinematic moment, when NO player would want to run over the NPC but when it’s fairly easy, due to the game’s overwhelming detail and the occasional glitch, to accidentally do so.

Finally, Red Dead highlights perhaps the biggest problem with in-game storytelling. As mentioned before, story elements often unfold while riding on horseback alongside an NPC. During these stretches, the game’s regular sound engine is still active, meaning that straying too far from the NPC can make his voice difficult to hear or inaudible, depending on how your speakers are configured.

Normally this isn’t a huge issue, but there are occasions when you’re riding with a mass of NPCs, and it can be difficult to discern which one is talking. Most of the game’s other systems are running as well, meaning an unrelated NPC or wild animal can occasionally prevent a scripted mission sequence from triggering (this happened to me on two occasions). While, again, I appreciate the realism, I would prioritize being able to hear the story and have missions unfold smoothly.

Mass Effect 2

Mass Effect 2 adds another wrinkle: team members. I rarely died or failed missions in Mass Effect. To the contrary – I often completed missions without realizing I had done so. This occurs for two reasons. First, as in Red Dead, NPCs are rendered realistically.

Missions frequently center on disposing of a particular arch-villain, but because this villain isn’t exaggerated in a cartoonish fashion, as he would be in an old arcade game, he can be difficult to identify. The problem is exacerbated by the fact that many of these villains are aliens wearing alien armor or clothing. It’s like trying to quickly tell apart two exotic birds by sight – as humans, we’re mostly equipped to notice subtle differences in human faces, and only then if they’re rendered at high enough detail.

The worst of the confusion arises from the game’s squad-based gameplay. In most squad-based games it’s unusual for NPC squad-members to perform mission-critical actions, which is to say they’ll occasionally take out the peripheral bad guy or perform some scripted task, but will leave the glory to the player.

Not so in Mass Effect, in large part because you can give direct orders to your squad members. Because many of these orders have area effects, and because targeting can shift or be imprecise, it’s not uncommon to accidentally kill the arch-villain. Again, the bad guys are “realistic,” which often means their health bars aren’t significantly larger than those of regular enemies.

Accidentally finishing a mission is actually worse than accidentally failing one, because once a mission is finished you can’t go back; the satisfaction of shooting the bad guy in the head is lost forever. There were a few instances in Mass Effect 2 where it was so bad that I never even laid eyes on the villain. Fortunately, the game takes a more on-rails approach than Red Dead, so the overall confusion level is kept to a minimum.
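One piece of artifice that would prevent this: clamp squad-inflicted damage so that a mission-critical villain can only be finished off by the player. A minimal sketch, with hypothetical names rather than BioWare's actual combat code:

```cpp
#include <algorithm>

// A "player lands the killing blow" rule: squad damage is clamped so
// a mission-critical target survives at 1 HP until the player
// finishes the job. The DamageSource model is an assumption.
enum class DamageSource { Player, Squad };

struct Villain {
    float health;
    bool  missionCritical;
};

void applyDamage(Villain& v, float amount, DamageSource source) {
    float newHealth = v.health - amount;
    // Only the player may deal the finishing blow to a critical target.
    if (v.missionCritical && source == DamageSource::Squad) {
        newHealth = std::max(newHealth, 1.0f);
    }
    v.health = std::max(newHealth, 0.0f);
}
```

It's pure artifice, invisible in practice, and it guarantees the player is present and attentive for the mission's payoff moment.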

Workarounds and Bayonetta

Certainly Mass Effect and Red Dead are not alone in the problems I’ve described, but they provide sufficient examples of how otherwise exceptional games can be marred by the very realism and expansiveness for which they are so lauded.

Another brilliant title, Uncharted 2, suffers from this problem at many points despite large portions of its environments being vividly detailed and pre-rendered, showing just how difficult directing the player can be. Uncharted’s undoing tends to be the mixture of free-movement and detailed backdrops that only allow very limited player interaction, leaving the player scratching his head as to which wall is scalable and which crevice is actually a bottomless pit.  

To a certain extent, preserving the in-movie experience in games will always be problematic. The game has to provide a certain amount of challenge and freedom of choice, after all, and this inevitably leads to players making wrong choices that result in delays or dead-ends – something we never see in movies.

And, because the in-game camera is meant to assist the player, not the viewer (even though these two people are usually one and the same), it’s possible for the camera to miss key events, such as the death of a villain.

There are a few ways to better ensure games retain their cinematic quality. First, consider that even on consoles gamers utilize wildly varying technologies. A clear system of signs and symbols helps gamers with smaller televisions better discern what’s what, and doesn’t really detract from a game’s realism that much (and can, of course, be switched off). 3D sound positioning should never put key dialog out of range (and gamers should have the option of placing all dialog front and center, when possible).
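To make that last point concrete, here is a minimal sketch of dialogue attenuation with an audibility floor for story-critical lines. The linear falloff model and all names are illustrative assumptions, not any shipping engine's API:

```cpp
#include <algorithm>

// Distance attenuation for dialogue, with two safeguards: an
// accessibility option that plays all dialogue at full volume, and a
// gain floor so story-critical lines never fade out entirely.
float dialogueGain(float distance, float maxDistance,
                   bool storyCritical, bool forceFrontAndCenter) {
    // Optional setting: all dialogue at full volume, effectively
    // "front and center" regardless of speaker position.
    if (forceFrontAndCenter) return 1.0f;

    // Simple linear falloff: 1.0 at the source, 0.0 at maxDistance.
    float gain = 1.0f - std::clamp(distance / maxDistance, 0.0f, 1.0f);

    // Story-critical lines never drop below an audible floor, no
    // matter how far the player strays from the speaker.
    const float kStoryFloor = 0.4f;  // assumed tuning value
    return storyCritical ? std::max(gain, kStoryFloor) : gain;
}
```

The realism cost is tiny – a distant voice sounds a little too loud – and the payoff is that the player never loses the thread of the story.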

In general, using the in-game engine to render cinematics presents numerous challenges (especially on the PC, where a gamer’s hardware can greatly affect his ability to view and understand a cinematic). Often it’s the little things – weird hair clipping or a stray equipped weapon obscuring the player’s view, say.

In games like Red Dead, where the cinematic is also a gameplay section, a whole slew of problems can crop up. In most cases, it’s probably best to sacrifice a certain amount of realism and player input for the sake of ensuring the player can enjoy and understand the story segment.

Also, consider that realism, especially when combined with a small screen and imperfect graphics engine, can make it difficult to tell one thing from another. Exaggerated and cartoonish graphics have a huge advantage in this regard.

In the end, though, creating a cinematic experience is always going to be a balancing act. Realism and freedom of movement increase immersion, until the artifice of gameplay and the shortcomings of game hardware cause the game to fall apart.

Conversely, discernible rails/invisible walls and symbolic overlays decrease immersion, but can ensure that the end gameplay experience runs smoothly. I tend to feel that AAA Western developers have moved too far away from the artifice that makes games more playable.

I’ll close by looking at Bayonetta, a game that is a garble of confusing story and graphical fireworks, but that never confuses the player in any of the ways mentioned in this article. Largely, this is because of the subtle ways the game limits the player in order to let them feel like they’re performing dexterous feats worthy of a superhero. More than anything, Bayonetta is a game meant to recreate the superhero/kung-fu movie experience. It can’t be compared directly with a game like Red Dead because it’s not an open or seamless world.

However, it’s precisely because Bayonetta makes no attempt to recreate reality that the player is actually afforded more freedom of action and clarity of purpose. The developers always manage to make it clear which of a boss’s many flailing limbs you’re supposed to hit. It’s impossible to accidentally kill an ally, because allies are never in the flow of the gameplay.

While I’d give game of the year to the horseback riding in Red Dead, the movement in Bayonetta is overall far more fluid and responsive – artificially so, but this just compensates for the fact that game controllers are something of a bottleneck for the human nervous system.

It’s also near-impossible to kill yourself by leaping from a cliff, thanks to the stage and camera design, a generous double-jump, and merciful QTE platforming. The cinematics all play out separately from the gameplay. There are only a few moments (one, unforgivably, just after defeating the game’s final boss) where you are expected to maneuver through a downright confusing situation that can lead to death.

This isn’t to say that Bayonetta’s solutions would work for any game. Rather, it’s to point out how a game that’s less realistic than its contemporaries in almost every conceivable way (and badly written to boot!) often succeeds at feeling more like a movie, or at least a less punctuated one.

Ultimately, realism should not be an end unto itself, as it not only fails to guarantee immersion but can actively detract from it. Sometimes you end up with a miracle of technology, like being able to gallop on horseback from the Pacific Northwest down to Mexico; other times you’re struggling to walk beside a person while engaging in mundane conversation.

There’s no easy solution to any of these problems, and in some cases game developers may be forced into a game of give and take. But in the end, it seems to be a good rule of thumb to introduce artifice where the “realistic” option is likely to cause confusion and reduce a player’s ability to act intentionally – especially when the game’s most pregnant cinematic moments are unfolding.   
