Taking to the Soapbox, Michael Eilers explains why games need to be shorter and easier in order to widen their audience and to embrace the interstitial gamer, who has considerably less time to play than the hardcore gamer.
Flexing tired hands as you grip the controller, you desperately maneuver your virtual self for cover as sniper fire seeks your vital organs. The last save point a distant memory, you sneak past the armed guard post for the umpteenth time. The stakes are high - if you fail to reach that save point, several hours of gameplay will be lost, and your next attempt will be yet another stolid trudge over the same familiar ground. Meanwhile, your spouse systematically turns out the lights and goes to bed. You twitch, knowing that every moment you stay up past your "bedtime" you will be rewarded with withering silence over next morning's cold breakfast cereal. Cursing the cruelty of some unnamed level designer, you trudge on, feeling the crosshairs of the sniper creeping across your shoulder blades.
Illustration by Greg Brauch.
This is the world of the interstitial gamer - one who grew up on games but then got a life, or one who gave them up to take on career and child rearing and is now trying to return to the fold. Not quite casual but not quite hardcore, this type of gamer is dealt a cruel hand by today's game market, and many market trends seem set to exclude this type of gamer even as the ranks of this demographic swell.
A large number of "AAA" titles from 2004/2005 were extremely difficult and aimed squarely at the "hardcore" gamer audience, from God of War and Prince of Persia: Warrior Within to Doom 3 and Splinter Cell: Pandora Tomorrow; it was no coincidence that these titles were the "cover queens" of 2004, gracing every gaming magazine cover imaginable (sometimes more than once). These games were brutal and unforgiving, aimed at veteran gamers with pre-trained reflexes and preconceived notions about what constitutes good gaming and game design.
Why would this be a problem? Game designers, creators and testers are hardcore gamers; game magazine writers and editors are hardcore gamers. However, the majority of the gaming audience is not, and neither are those who are just now discovering gaming - and these gamers are the ones who actually pay for their games and keep the industry in the black.
The ESA published a survey in 2004 which pegged the age of the average gamer at twenty-nine, and the audience at only 59% male.[1] Those of us who have crossed the threshold of thirty know that past that age, people have a far greater tendency to be married, raise children and have a demanding career. In other words, they are interstitial gamers - those who squeeze gaming in between their career, marriage, housework and weekends spent shuttling urchins between soccer practice and violin lessons. They are exactly the type of gamer who is going to have $600 for a new video card or $400 for a new console (plus the HDTV to run it), yet they are the ones least served by a marketplace which seems to be veering once again towards the hardcore gamer.
The interstitial gamer isn't limited to post-Gen-Xers, either; today's youth are the very definition of interstitial, their gaming time divided between their Pocket PC, GBA, cell phone and the console Dad just installed in the SUV. Their schedules booked like a Hollywood producer's, they might game with one hand while Vonaging a friend in Ireland, programming the TiVo, and ripping DVDs on a laptop. The idea of committing forty or more hours to a single game and doing it in solid one- or two-hour blocks is laughable in this context.
The next-gen consoles themselves might force even hardcore gamers to go interstitial. As the console blends with PC, game play time will be competing with voice chatting time, videoconferencing time, e-mail checking and web surfing. And while most homes have more than one TV, it is safe to say that very few have more than one HDTV - bringing a household's gamers into direct conflict with DVD and HD-cable watchers. In the past, when Mom wanted to watch Dr. Phil you could just scoop up the 'Cube and head for the bedroom, but those hooked on HD gaming are going to find themselves fighting over a single household source for 1080 lines.
Industry leaders and pundits are also concerned about the interstitial gamer. Both J Allard and Laura Fryer[2] of Microsoft have expressed concern over the complexity and difficulty of current and future games; Nintendo president Satoru Iwata has voiced his dismay several times at the hardcore bent of the industry and how that might be alienating both potential new gamers and older gamers without the time (or perhaps the reflexes) to master a new skillset with each AAA release.[3]
Do they have reason to be concerned? At a GDC panel discussion in 2003, former Williams programmer Eugene Jarvis (creator of Robotron and Defender) revealed a shocking fact: many of the games we remember best from the Classic era were crafted according to the so-called "Ninety-second rule." According to this rubric, for an arcade game to be profitable during those competitive times a quarter had to drop into the machine every ninety seconds, on average; in other words, to do its "job" properly, the game was purposefully designed to kill off the player in under two minutes - with the standard three lives per credit, a mere thirty seconds per "life."
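A quick back-of-the-envelope calculation shows just how aggressive that target was. The sketch below assumes the standard three lives per credit and an hour of continuous play; those assumptions and the derived revenue figure are illustrative, not numbers from Jarvis's talk.

```python
# Back-of-the-envelope math for the "ninety-second rule".
# Assumptions (illustrative, not from Jarvis's talk): one quarter buys
# three lives, and the cabinet is occupied for a full hour.

SECONDS_PER_CREDIT = 90      # target: one quarter dropped every ninety seconds
LIVES_PER_CREDIT = 3         # the typical arcade default
PRICE_PER_CREDIT = 0.25      # one quarter, in dollars

seconds_per_life = SECONDS_PER_CREDIT / LIVES_PER_CREDIT
credits_per_hour = 3600 / SECONDS_PER_CREDIT
revenue_per_hour = credits_per_hour * PRICE_PER_CREDIT

print(f"Average life length: {seconds_per_life:.0f} seconds")  # 30 seconds
print(f"Credits per hour:    {credits_per_hour:.0f}")          # 40
print(f"Revenue per hour:    ${revenue_per_hour:.2f}")         # $10.00
```

Forty game-overs an hour, every hour, was roughly the pace a cabinet had to sustain to earn its floor space.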
It was interesting to hear that this was an unspoken rule of design during what we have termed the "Golden Age," but Jarvis wasn't finished yet - he and the other members of the panel were convinced that the extreme difficulty of these arcade games and their focus on visual flash over depth were exactly what snuffed out the game market itself, which suffered a financial and cultural collapse a mere two years after the arcade's glory days. The arcade has never fully recovered.
Modern gamers (and game designers) pat themselves on the back for revitalizing the game industry and broadening its reach, but many of the design lessons (and mistakes) from that era still haunt current games. The exponential rise in difficulty that was the hallmark of those games is still a dominant design element. Repetition of motif (an endless horde of identical baddies) is still common, as are "grind" mechanics that are more endurance test than gameplay element, and unreasonable limits on loading and saving the game or picking up where you left off.
Perhaps the most egregious current design sin is the use of the "checkpoint save," an anachronism that was relevant in the days when storage space was an expensive commodity; now that we have consoles with hard drives and/or 64MB memory cards (up to 1GB on portable devices), as well as PCs with near-terabyte storage, that argument is moot.
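Rough arithmetic shows how moot. The figures below - a generous 256 KB per save, a 64MB memory card and a 20GB console hard drive - are assumptions chosen for illustration rather than the specs of any particular platform.

```python
# Rough storage math for "save anywhere" versus checkpoint saves.
# The 256 KB save size, 64 MB card and 20 GB drive are illustrative
# assumptions, not the specs of any particular platform.

SAVE_SIZE_KB = 256
MEMORY_CARD_MB = 64
HARD_DRIVE_GB = 20

saves_per_card = (MEMORY_CARD_MB * 1024) // SAVE_SIZE_KB
saves_per_drive = (HARD_DRIVE_GB * 1024 * 1024) // SAVE_SIZE_KB

print(f"Full saves on a {MEMORY_CARD_MB}MB memory card: {saves_per_card}")   # 256
print(f"Full saves on a {HARD_DRIVE_GB}GB hard drive:   {saves_per_drive}")  # 81920
```

Even if a real save state were ten times larger, storage stopped being the limiting factor years ago; whatever justifies the checkpoint today is a design choice, not a hardware constraint.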
Game designers (and hardcore gamers) might argue that the checkpoint save adds drama and jeopardy to your trip through a level, but this is only true the first or second time; by your tenth trip through a level you are playing with extreme clairvoyance, able to anticipate every enemy attack. When you already have the level memorized, this isn't gameplay, it is just work.
Game creators might go to their deathbeds denying it, but pointless limitations on saving your game (widely spaced checkpoints, limited save slots, special "no save" areas) are cheap ways to stretch gaming time over that arbitrary "30+ hour" threshold so praised by the hardcore crowd. The gaming media reinforce this problem, complaining bitterly that they (hardcore gamers, of course) were able to beat a game in "under 5 hours," implying that it was a waste of money and time for any worthy gamer. When the interstitial gamer might have trouble freeing up 60 hours for gaming in an entire year, a 5-hour game could seem like a gift from the gaming gods.
Other methods of adding complexity and difficulty to current games are just as egregious; the much-praised Half-Life 2 starts off with a promising storyline and lots of casual-gamer-friendly character interaction, and then abandons the player in a series of mazy canals with little or no direction, dozens of scripted events stacked against the player and many parts that can only be "beaten" by first dying and then memorizing where not to step the next time through - oh yes, and checkpoint saves that are arbitrarily far apart. Even worse are games that ship with a large percentage of the game content "locked" and force the player to master a series of repetitive challenges in order to "unlock" the cool stuff that was promised on the back of the box.
Hardcore gamers might sneer, but many of those who are new to gaming enter the market through web games, cell phone games and the Nintendo portable systems, as well as casual titles such as The Sims 2. Confronted with a title such as Medal of Honor 2, with a complex control scheme and gameplay that rewards only those who obey a rigid series of tightly-scripted sequences, a Bejeweled 2 player will retreat to the browser. Games designed with a "weed out the weak" mindset still dominate press coverage and previews, and yet the weak are exactly who the game industry should be reaching out to embrace. Those of us in the industry are entirely complicit in this focus on hardcore gamers. When was the last time a Tycoon game made a magazine cover? What other casual title besides The Sims 2 was featured recently? Did Puzzle Pirates or Diner Dash make any end-of-year "best of" or "must have" lists? Why is Halo 2 celebrated for selling 2 million copies, when many casual games have broken the 10-million-download mark?
If any concessions are made towards the casual-gaming crowd, the hardcore audience cries foul, calling this "dumbing down" and "selling out." Is it possible to make games that satisfy both the hardcore and interstitial audiences? Many long-standing gamer favorites point towards ways to bridge that gap. The Sims removed scoring, levels, save checkpoints and most of these "hardcore" game mechanics and is still beloved by many for its sense of freedom and intimacy. Titles such as the Grand Theft Auto 3 series retain many hardcore gameplay elements (four stars, anyone?) but balance this with a grand sense of freedom and the ability to ignore the plot completely and even focus on nonviolent play such as ambulance driving or pizza delivery. Unique titles such as Puzzle Pirates point towards entirely new mechanics and genres that have yet to be realized.
Perhaps we should focus on user-created difficulty, in which the difficulty of the game is directly determined by the choices the player makes. If an MMO player wants to just beat up weaker animals all day long and never confront the dragon in the woods, that should be a valid and supported decision. Imagine a shooter in which the player could choose to stick to a silenced pistol and sneak through a level undisturbed, or use the rocket launcher and bring every bad guy in the area down on their head - the toughness of the level reflecting their own style of play. Setting up a game scenario such as this might require more effort than just scripting an event in which flaming barrels drop down the stairs, but if we set our sights on creating drama - not just difficulty - it might be possible to embrace this new audience rather than drive them away. Those of us without 30 hours to spare for a single game want desperately to remain gamers, and a new generation is just as eager - but serving up yet another Splinter Cell or an MMO that requires a daily commitment doesn't serve the interstitial gamer, and we need them. They are the ones who will be paying the game industry's bills for the next decade.
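As a closing illustration of the user-created difficulty idea above, here is a minimal sketch of a level whose response scales with the noise of the player's chosen approach rather than with a preset difficulty slider. The weapon names, noise values and guard counts are all hypothetical.

```python
# Minimal sketch of "user-created difficulty": the level's response is
# driven by how loudly the player chooses to play, not by a difficulty
# setting. All names and numbers here are hypothetical.

from dataclasses import dataclass

@dataclass
class Weapon:
    name: str
    noise: float  # 0.0 = silent, 1.0 = wakes the whole map

WEAPONS = {
    "silenced_pistol": Weapon("silenced pistol", 0.1),
    "rocket_launcher": Weapon("rocket launcher", 1.0),
}

def guards_alerted(weapon: Weapon, guards_in_area: int) -> int:
    """How many nearby guards respond when this weapon is fired."""
    return round(weapon.noise * guards_in_area)

# The quiet player slips past eleven of the area's twelve guards...
print(guards_alerted(WEAPONS["silenced_pistol"], 12))  # 1
# ...while the loud player brings every one of them down on their head.
print(guards_alerted(WEAPONS["rocket_launcher"], 12))  # 12
```

Both players move through the same level; each has set their own difficulty simply by choosing how to play.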
Endnotes
1"Survey: Video gamers getting older, heading online" USA Today 5/12/2004
2"Games suffer from 'geek stereotype'" BBC News