
The Psychology Of Games: The Glitcher's Dilemma

Psychologist and gamer Jamie Madigan writes for Gamasutra about how social dilemmas work in the world of gaming, and how designers can work to defuse them before everybody gets glitch happy.

Jamie Madigan, Blogger

March 5, 2010

6 Min Read

Soon after its release, some players of the online first-person shooter Modern Warfare 2 discovered what became known as "the javelin glitch." Someone, somewhere, somehow figured out that through a bizarre sequence of button presses you could glitch the game so that when you died in multiplayer you would explode violently and murder everyone within 30 feet of you, often resulting in a net gain in points. It wasn't long, though, before the method for creating this glitch spread through the Internet and servers were filled with exploding nincompoops. Just do a YouTube search for "Modern Warfare javelin glitch" and you'll get hours' worth of video explaining how to do it; it wasn't a very well-kept secret. In fact, it quickly got bad enough that developer Infinity Ward had to rush out a patch to fix it, presumably screaming "Ack! No! You guys, stop it!" the whole time.

But in the meantime, the javelin glitch presented players with an interesting dilemma, assuming they weren't outright bent on griefing: they could either abuse the glitch to boost their own rankings and unlock new perks, or they could abstain and preserve the game's fair play. Of course, the problem is that if they abstain, someone else may abuse the glitch and dominate the match. The middle ground is when everyone glitches, but the resulting pandemonium isn't as much fun as fair play.

Let's simplify the discussion by assuming a two-player deathmatch game between two non-griefers in Modern Warfare 2. Look, I've created a table to summarize the dilemma for you! It's suitable for framing.

|                   | Player B abstains                        | Player B glitches                        |
| ----------------- | ---------------------------------------- | ---------------------------------------- |
| Player A abstains | Fair, fun match for both                 | B dominates; A gets blown up repeatedly  |
| Player A glitches | A dominates; B gets blown up repeatedly  | Everybody explodes; pandemonium for both |

So what do you do? Psychologists and economists who study this kind of decision-making call it a "social dilemma." In these situations each person has what's called a "dominating" alternative where they're most likely to win (in this example, abusing the glitch), but most people REALLY want the nondominating outcome produced when everyone chooses to cooperate. Especially once the novelty factor wears off.

Back in the 1960s, research on these kinds of dilemmas exploded, and out of it came what's known as "the prisoner's dilemma," based on an anecdote about getting confessions from two prisoners held under suspicion for a bank robbery. In his book Rational Choice in an Uncertain World, Robyn Dawes summarizes the classic scenario thusly:

"Two men rob a bank. They are apprehended, but in order to obtain a conviction the district attorney needs confessions. He succeeds by proposing to each robber separately that if he confesses and his accomplice does not, he will go free and his accomplice will be sent to jail for ten years; if both confess, both will be sent to jail for five years, and if neither confesses, both will be sent to jail for one year on charges of carrying a concealed weapon. Further, the district attorney informs each man that he is proposing the same deal to his accomplice."

Another table!

|                        | Prisoner B stays quiet          | Prisoner B confesses            |
| ---------------------- | ------------------------------- | ------------------------------- |
| Prisoner A stays quiet | Both serve 1 year               | A serves 10 years; B goes free  |
| Prisoner A confesses   | A goes free; B serves 10 years  | Both serve 5 years              |

What would you do? In this case, both prisoners will probably confess if they're rational about it. Why? Because each prisoner gets a better (or no worse) payoff by confessing no matter what the other guy does. Prisoner A thinks, "I don't know what B is going to do, so confessing is the best way to keep myself from getting screwed. If he keeps quiet, I go free. If he also confesses, I get 5 years instead of 10." In other words, confessing is the only way to keep the other guy from being able to screw you over. Notice how this mirrors the javelin glitch dilemma.
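If you like, you can watch the dominance argument play out in a few lines of code. This is a minimal sketch, with payoff values and names of my own choosing (jail years written as negative numbers so that bigger is better):

```python
# A minimal sketch of the dominance argument, using illustrative payoffs.
# The variable names and numbers are mine, not from the article.

QUIET, CONFESS = "stay quiet", "confess"

# payoff[(my_choice, their_choice)] = my payoff, in (negative) years of jail
payoff = {
    (QUIET,   QUIET):   -1,   # both get 1 year on the weapons charge
    (QUIET,   CONFESS): -10,  # I stay quiet, accomplice confesses: 10 years
    (CONFESS, QUIET):    0,   # I confess, accomplice stays quiet: I go free
    (CONFESS, CONFESS): -5,   # both confess: 5 years each
}

# Confessing "dominates": whatever the other prisoner does, it pays
# at least as well as staying quiet.
for their_choice in (QUIET, CONFESS):
    assert payoff[(CONFESS, their_choice)] >= payoff[(QUIET, their_choice)]
    print(f"If they {their_choice}: confess gives {payoff[(CONFESS, their_choice)]}, "
          f"staying quiet gives {payoff[(QUIET, their_choice)]}")
```

Swap in kill counts for jail years and the same loop tells you that glitching dominates abstaining in the javelin scenario.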
Now let's take another example from the golden years of PC gaming. In the early days of StarCraft, a strategy called "Zerg rushing" emerged: at the beginning of a match, players would quickly build lots of cheap Zerg units to overwhelm opponents before defenses could be constructed. Counter-strategies developed for players who could manage them, but for a good chunk of the player base StarCraft became a game of seeing who could Zerg rush faster, which wasn't nearly as much fun as choosing from any number of other play styles or even races. So the dilemma was:

|                      | Player B builds up normally | Player B rushes             |
| -------------------- | --------------------------- | --------------------------- |
| Player A builds up normally | Varied, fun match for both  | A gets overrun and loses    |
| Player A rushes      | B gets overrun and loses    | A dull race to rush first   |

Again, the dominating strategy was to Zerg rush, because if you didn't and the other guy did, you lost, which was worse than any of the alternatives. This despite the fact that what you both really want is a varied, fun game. It's a design issue that still plagues strategy game developers today.

Prisoner's dilemmas, and social dilemmas in general, can similarly be used to illustrate the reasons for ninja looting in World of Warcraft:

[Table: the ninja looting dilemma]

Or you could apply it to "tick throwing" and "fireball trapping" techniques in fighting games. I could go on, but I think you get the idea.

What's really more interesting and useful, though, is to look at what psychology has to show us about when people DON'T choose the purely rational option of abusing a glitch or a winning-but-boring strategy. Generally, people are more likely to cooperate when:

- They know they will be playing against their opponents in the future and could face retribution
- They expect to interact with their opponents outside the game
- They don't expect to remain anonymous
- They don't know how many games will be played with the same person

Under these conditions, many players will adopt a strategy where they cooperate at first (for example, they don't glitch or rush), then retaliate in kind if the other player abuses that trust. This is known as the "tit for tat" strategy. Some researchers with way too much time on their hands even organized tournaments where people were invited to write computer programs to play iterated prisoner's dilemma games, and the programs that adhered to the "tit for tat" strategy tended to do the best.
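Here's a toy version of such a tournament: a minimal sketch with an illustrative field of strategies and the standard 3/0/5/1 payoff values (none of this is taken from the original research). Which strategy wins depends heavily on who else enters, but with a few conditional cooperators in the field, "tit for tat" lands at or near the top while "always defect" falls behind.

```python
# A toy iterated prisoner's dilemma tournament, in the spirit of the ones
# described above. The strategy field and payoffs are illustrative.

C, D = "cooperate", "defect"

# Standard PD payoffs: PAYOFFS[(my_move, their_move)] = (my_score, their_score)
PAYOFFS = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Cooperate first, then copy the opponent's previous move.
    return their_history[-1] if their_history else C

def always_defect(my_history, their_history):
    return D

def always_cooperate(my_history, their_history):
    return C

def grudger(my_history, their_history):
    # Cooperate until the opponent defects once, then defect forever.
    return D if D in their_history else C

def play_match(strat_a, strat_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        pa, pb = PAYOFFS[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

strategies = {
    "tit for tat": tit_for_tat,
    "always defect": always_defect,
    "always cooperate": always_cooperate,
    "grudger": grudger,
}

# Round-robin: every strategy plays every strategy (including itself).
totals = {name: 0 for name in strategies}
for name_a, strat_a in strategies.items():
    for name_b, strat_b in strategies.items():
        score_a, _ = play_match(strat_a, strat_b)
        totals[name_a] += score_a

for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total}")
```

The interesting part is that "always defect" wins every individual match it plays, yet accumulates fewer points overall, because the conditional cooperators rack up long runs of mutual cooperation with each other.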
This is why playing with people on your friends list, in your Steam community group, in your guild or clan, or on a favorite dedicated server is a good thing. And it's one reason why random matches between strangers or pickup groups can be infuriating. Making it easy to submit ratings to the profiles of people you just played also helps resolve these dilemmas to everyone's benefit. It's also the reason I love the way Halo 3 lets you remain in a lobby with the people you just played and go straight into another round with them.

People being the complicated beings they are, it's not a perfect system, though. Some people are just griefers out to disrupt the game no matter what. Some people won't abuse a glitch out of a sense of honor. Some will value their ranking on a leaderboard more than a sense of fair play in any individual match. But even if none of the conditions bulleted above is a silver bullet, they help across large numbers of games.

References:

Dawes, R. (1988). Rational Choice in an Uncertain World. Fort Worth: Harcourt Brace Publishers.

[Jamie Madigan, Ph.D. is a psychologist and gamer who explores why players and developers do what they do by studying the overlap between psychology and video games at The Psychology of Games website. He can be reached at [email protected].]

