Ethics matter in game design, but it's not about F2P.
The past few months have seen a torrent of articles about Free-to-Play business models, often discussed alongside issues of ethics in games (for example: 1, 2, 3, 4, 5). These pieces have addressed the effect on addiction-prone players or children, the possibility of corrupting artistic intent, the sheer amount of money being spent, and other advantages and dangers of F2P. While all of this is relevant, I fear that we’re focusing too much on F2P and glossing over the wider ethical issues in game design.
Can’t paid games be unethical too? Children are vulnerable, but shouldn’t we also fight against the exploitation of adults? Can’t all business models corrupt artistic intent, as Dan Cook keeps pointing out in exasperation? F2P is the center of the current conversation, but I don’t think the ethical issue is really about the business model. “Free” is hardly a bad quality in itself, and high rates of spending seem more worthy of our aspiration than our disgust (after all, hardcore gamers and other hobbyists routinely spend hundreds or thousands of dollars a year, as Cook argues). Besides, if a game can rob you of your time or reduce your quality of life in other ways, it can be unethical without taking any money at all.
I’m optimistic about the F2P model (both of my commercial games are F2P), but I also believe that player exploitation is very real, very profitable, and genuinely immoral. Perhaps F2P makes it easier or more lucrative to exploit players, but any game can include exploitative techniques. Instead of making unfairly broad attacks on the F2P model, we should condemn exploitation wherever it occurs. There seem to be sizable groups of game developers who either A) don’t know about exploitative game design, or B) don’t see an ethical problem with it. Those are the designers that I’m hoping to persuade in this article.
Below, I’ll discuss game design techniques like randomized rewards, premium currencies, manipulative feedback, and pay-to-win schemes. But, first, I’ll try to establish a simple basis for my ethical claims.
When any product is sold, the seller is essentially saying “I’d rather have your money than this product”, and the buyer is saying the reverse. They’d each prefer to own what the other person is offering, and so the trade makes both of them “wealthier”. In this way, a healthy exchange benefits both sides.
This process breaks down whenever one party takes advantage of the other. For example, consider a used car dealer who knowingly sells someone a lemon. In this case, the buyer has a weaker position (since they lack information about the true state of the car), and the dealer can exploit that weakness to sell the car for more money than it’s worth to the buyer. The seller becomes wealthier, but the buyer does not. This is an unhealthy exchange, and since the dealer acted knowingly, it was an unethical act.
In other cases, the buyer’s weakness is not a lack of information, but a lack of willpower or judgement. Exploiting this kind of weakness is equally unethical. We might imagine someone trying to sell drugs to an addict who wishes to quit, for example. If this is done knowingly, then the seller is sabotaging the buyer, acting against the buyer’s best interest for their own profit. Even if the buyer enthusiastically agrees to the exchange at the time, the seller is still intentionally exploiting the buyer’s weakness.
This highlights an often-confusing point: it’s not enough merely to get the buyer’s consent. Instead, the proper standard is that of mutual benefit. The most instructive example (which I’m borrowing from Jonathan Blow) is that of a con artist who scams an unwitting victim, perhaps by selling them a counterfeit item of little real value. The victim might fully consent to the exchange, and might walk away happily, never realizing that they’ve been cheated, but that doesn’t mean that the exchange was ethical. What matters is that the seller made money by exploiting the buyer, instead of working toward a mutually beneficial exchange.
All I’m trying to establish is that exploiting someone else’s weakness for your own profit is wrong, and it doesn’t matter what form that weakness takes. I hope that, without going into any detail about a particular theory of morality, we can at least agree on that as a foundation for further ethical claims.
Essentially, you should be trying to sell to your customer’s “best self”, a hypothetical version of them with perfect knowledge, willpower, rationality, and judgement. Without any weaknesses affecting their decision, would your customer still consider your offer to be a beneficial exchange? If you don’t think so, then you shouldn’t make that sale.
Of course, in most businesses, your product is offered to a large audience, and there’s no way to evaluate each customer one-by-one. That doesn’t mean that you’re free of ethical responsibilities, though. You should consider the effect that the sale would have on the audience as a whole, and you should do what you can to reduce the possibility of exploitation. This is why food companies list Nutrition Facts and suggest reasonable serving sizes, why bars refuse to serve patrons who are excessively drunk, and why casinos allow customers to ban themselves. These measures are meant to reduce the potential harm of exploitative exchanges, and an ethical business would enact them voluntarily.
Exploitation is unethical. But what does exploitation look like in the context of games?
Some games are regularly accused of employing “psychological tricks” to exploit their players. Without specifics, this can end up sounding like a paranoid conspiracy theory. It’s not. The “tricks” are simply the exploitation of common weaknesses such as cognitive biases. Many cognitive biases are widespread, well-understood, and experimentally verified. Still, I’ll make the disclaimer here that I’m writing as a practicing designer, not as an expert in psychology.
Below I’ve listed a few ways that cognitive biases or other weaknesses can be exploited in games. These aren’t examples of wrongdoing per se; they are techniques that can be used unethically, but they can also support healthy mechanics or be included accidentally. In most cases, however, their primary effect is to exploit some psychological weakness for profit, and they should be avoided unless there is a good reason to use them.
1) Exploit loss aversion
Loss aversion refers to the common tendency to care much more about losses than gains. To an irrational extent, players will seek to avoid getting a penalty or missing out on an expected reward. This bias can also manifest as the sunk cost fallacy, in which people act irrationally to avoid feeling like they’re wasting resources. The simplest exploitation of loss aversion might be “crop withering” mechanics, in which the game threatens to take away a resource or to erase an expected gain unless the player takes some action (a small code sketch of this kind of timer follows the list below). There are plenty of other, more subtle uses of loss aversion. For example:
Make the player work for the opportunity to buy something (“You’ve unlocked a new purchasable item”). The player will not want to waste the effort that they already made to reach this opportunity.
Pair plentiful in-game currencies with scarce premium currencies (“You have lots of gold, but not enough gems”). The player will not want to let the plentiful currency go to waste.
Increase the amount of time it takes to perform some common action (“The build time doubles every level”). The player will not want to lose the rate of progress to which they have become accustomed.
Cause automatic growth to halt until the player intervenes (“Your collector is full / Your crops are ready to be harvested”). The player will not want to waste time by leaving the game idle.
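To make that pressure concrete, here is a minimal sketch of a “crop withering” timer, assuming a hypothetical farming game; the class name, grow time, and wither window are invented for the example, not taken from any real title.

```python
import time

GROW_TIME = 4 * 60 * 60       # hypothetical: crop is ready 4 hours after planting
WITHER_WINDOW = 2 * 60 * 60   # ...and is destroyed 2 hours after that

class Crop:
    def __init__(self, planted_at: float):
        self.planted_at = planted_at

    def state(self, now: float) -> str:
        """Report whether the crop is still growing, ready, or already lost."""
        elapsed = now - self.planted_at
        if elapsed < GROW_TIME:
            return "growing"
        if elapsed < GROW_TIME + WITHER_WINDOW:
            return "ready"      # harvest now, or...
        return "withered"       # ...the expected gain is erased

# A player who checks in 7 hours after planting finds the reward gone.
crop = Crop(planted_at=time.time() - 7 * 60 * 60)
print(crop.state(time.time()))  # "withered"
```

The design choice doing the work is the withered state: without it, an absent player merely delays a gain; with it, absence is converted into a loss, which loss-averse players will go out of their way to avoid.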
2) Use variable ratio reward mechanics
Essentially, variable ratio reward schedules are slot machine mechanics. The rewards are based on random chance but remain linked to player inputs: the perfect combination to elicit compulsive behavior or addiction.
Randomness is not a bad thing in itself, and it’s often used in the service of healthy dynamics. Procedural generation uses randomness to provide variety, for example, and poker uses random chance to create dynamics of probability management and bluffing.
However, randomness can also be used to create gameplay that is both compelling and empty, a simple recipe for regrettable wastes of time and money. Variable ratio rewards can be combined with the illusory rewards of the near-miss effect (the positive feeling of “almost winning”), and they can be further bolstered by the gambler’s fallacy (“I’m due for a win!”) or the hot hand fallacy (“I’m on a lucky streak!”).
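As a concrete illustration, here is a minimal sketch of a variable ratio reward schedule attached to a player action; the 15% payout chance and the reward table are invented for the example.

```python
import random

PAYOUT_CHANCE = 0.15  # hypothetical odds that any single action pays out
REWARD_TABLE = [("common trinket", 0.80), ("rare item", 0.19), ("jackpot", 0.01)]

def spin() -> "str | None":
    """One player action: the reward is tied to the input but decided by chance."""
    if random.random() >= PAYOUT_CHANCE:
        return None  # no reward this time; "maybe the next one..."
    roll, cumulative = random.random(), 0.0
    for reward, weight in REWARD_TABLE:
        cumulative += weight
        if roll < cumulative:
            return reward
    return REWARD_TABLE[-1][0]

# Because the number of actions between rewards is unpredictable, the behavior
# is highly resistant to extinction -- the player keeps pulling the lever.
results = [spin() for _ in range(20)]
print([r for r in results if r is not None])
```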
3) Use excessive extrinsic feedback
When a game gives positive feedback (any attempt at positive reinforcement), the player will enjoy it to the extent that the feedback feels true and meaningful. However, most players don’t stop to think about the validity of a game’s feedback; instead, they accept the feedback by default, subconsciously giving the game the benefit of the doubt. This may be especially true for less-experienced players. Excessive positive feedback can thus be used to string players along, giving them the illusion that they are accomplishing something meaningful.
This process could be as simple as doling out rewards or other positive feedback whenever the player is getting bored, whenever it could make the player re-engage with the game (e.g. upon logging in), or upon the completion of a trivial goal. The effect can be enhanced by building a structure or pattern out of such goals, e.g. by presenting them as a short checklist or as a set that must be completed. Wrapping goals together into a larger structure encourages players to see them as more meaningful, regardless of whether or not that’s true, and effectively creates a new mental reward that acts as yet another bit of feedback (“You completed a set!”).
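Here is a minimal sketch of that kind of set structure, assuming a hypothetical daily checklist; the goal names and the badge are invented for the example.

```python
# Three trivial goals wrapped into a "set" that grants its own reward.
DAILY_GOALS = {"log_in": False, "collect_rent": False, "visit_a_friend": False}
SET_BONUS = "Daily Star badge"

def complete(goal: str) -> list:
    """Mark one trivial goal as done and return the feedback the player sees."""
    DAILY_GOALS[goal] = True
    feedback = [f"Goal complete: {goal}!"]          # praise for a trivial act
    if all(DAILY_GOALS.values()):
        feedback.append(f"Set complete! You earned the {SET_BONUS}!")
    return feedback

for goal in list(DAILY_GOALS):
    print(complete(goal))
# None of these goals demanded skill or produced interesting play, but the set
# structure makes the checkmarks feel like meaningful progress regardless.
```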
In the worst case, the abuse of extrinsic feedback can undermine the player’s intrinsic enjoyment (the overjustification effect).
4) Offer purchases that short-circuit game dynamics
When a player wants to reach some goal (like earning an item, or defeating their opponents in a competition), the game can offer to sell them an advantage, or even to sell them the goal directly. This is often derided as “pay-to-win”.
In the most innocent case, selling an advantage is just an indirect way of selling a difficulty adjustment. In a presentation about the game Shellrazer, for example, the developers explained that they balanced the game for the players with lots of time or skill, while selling advantages to the players who had neither. We might see this as selling access to the easy difficulty mode. This strikes me as bizarre (why is a game primarily making money from the players who are presumably least-engaged?) but not necessarily unethical.
More often, though, selling advantages is a means of extracting money from a treadmill dynamic. The player decides upon a goal, starts working toward it, then decides to pay to get it right away (or more easily) instead. The player is essentially paying to play less of the game, short-circuiting the existing game dynamics in favor of more immediate gratification. It’s a poor trade of long-term gain (ongoing gameplay) for a short-term reward, except that the short-term reward is meaningless unless the player continues playing, e.g. by choosing a new goal and repeating the process.
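As a rough sketch of that exchange, consider a hypothetical build timer with a skip offer; the timer length and gem price are invented for the example.

```python
BUILD_TIME_HOURS = 8      # hypothetical: the goal the player chose takes 8 hours
SKIP_PRICE_GEMS = 50      # hypothetical: full price to skip the whole wait

def offer_skip(remaining_hours: float) -> str:
    """Interrupt the player's chosen goal with a shortcut straight to it."""
    price = max(1, round(SKIP_PRICE_GEMS * remaining_hours / BUILD_TIME_HOURS))
    return f"Finish now for {price} gems, or wait {remaining_hours:.1f} hours."

print(offer_skip(remaining_hours=6.5))
# The purchase doesn't add gameplay; it removes it. Its value evaporates unless
# the player immediately picks a new goal and repeats the loop.
```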
When the game is intrinsically rewarding, a pay-to-win system is more damaging, since short-circuiting the dynamics will skip over the intrinsic rewards entirely in favor of the extrinsic goal. For a multiplayer competitive game, this process can potentially ruin the game for all participants, not just the player who paid.
5) Make purchases harder to evaluate
A game can get around a player’s better judgement by obscuring or inflating the perceived value of whatever is being sold. For paid games, this might mean any pre-purchase misrepresentation (e.g. “bullshot”).
In F2P games, this technique usually involves a premium currency, which keeps sales one step further removed from actual money in the player’s mind. This is doubly effective if the cost of purchases in the game is constantly increasing; this sort of inflation can cause a purchase of premium currency to seem like a great deal initially, only for it to rapidly decrease in practical value as the game proceeds. The game might also misrepresent its dynamics; for example, if it’s implied that a purchase will make the game more rich and dynamically interesting, but instead it just scales up all the numbers in a way that produces equivalent gameplay, then the player receives only a momentary extrinsic reward instead of ongoing intrinsic rewards. Any other sort of bait-and-switch would serve just as well.
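To see how the premium-currency layer and in-game inflation interact, here is a minimal worked example; the pack size, dollar price, and cost curve are invented assumptions, not data from any real game.

```python
PACK_GEMS = 500          # hypothetical premium-currency pack
PACK_PRICE_USD = 4.99

def item_cost_in_gems(player_level: int) -> int:
    """Hypothetical cost curve: prices ramp up as the game proceeds."""
    return 25 * (2 ** (player_level // 10))

for level in (1, 10, 20, 30):
    items = PACK_GEMS // item_cost_in_gems(level)
    print(f"Level {level:2}: a ${PACK_PRICE_USD} pack buys {items} item(s)")

# The pack that bought 20 items at level 1 buys only 2 at level 30, but the
# gem layer keeps the dollar cost of each individual purchase out of view.
```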
6) Rely on post-purchase rationalization and restraint bias
In addition to the active techniques listed above, there are two cognitive biases that serve to passively amplify the impact of exploitative game design. Restraint bias refers to the tendency to overestimate one’s own self-control, which may lead people more easily into exploitative situations (“those tricks wouldn’t work on me”). Post-purchase rationalization is the tendency to justify voluntary purchases even if later information reveals that the purchase was a poor decision. This can combine with any exploitative design technique; if you can get the player to make a bad purchase, they might still convince themselves that it was a good idea (“If I already spent that much time and money, I guess the game must be fun after all”). It may be impossible to avoid these passive biases, but it’s useful to note how they can enhance or disguise the effect of exploitative design.
The techniques that I’ve listed above can be hard to resist, in part because they’re so easy to rationalize (it’s possible to use them ethically, and their downsides are not obvious). More importantly, exploitative design works. Even games that were designed to parody these techniques (such as AVGM, Progress Quest, or Cow Clicker) have surprised their creators with their popularity. If you’re using a simplistic metrics-driven design process, it can easily lead you toward exploitative game design, and it takes vigilance to avoid that pull.
If you want to act ethically, you need to honestly evaluate the value of your game. Keep in mind that games can have value beyond mere pleasure, and costs beyond time and money. People are affected in subtle ways by the media that they consume, and you’re responsible for those effects as well.
Here are a few tests that might help you get a better perspective on your game’s design. These aren’t hard-and-fast rules, but I’m hoping that they can provoke some thought:
Looking back with perfect hindsight, would a player feel that your game was well worth the cost, or would they regret the time and money that they spent on it?
Are your customers paying to play more of your game, or to play less?
If you removed as much extrinsic feedback as possible, would your game be worth playing?
Are you trying to get purchases during a “moment of weakness”, or would your players confidently make each purchase even if they had plenty of time to consider it? In other words, are you trying to circumvent the player’s better judgement?
Does your game design push the player primarily toward paying more money, or toward getting better at the game?
Are you faithfully representing the values and costs of your purchases, or do you make them harder to think about?
Do you expect that your typical target customer will, in total, be better off after buying what you’re selling?
Are you trying to increase profit primarily by increasing the value of the player’s experience, or by tweaking your monetization scheme?
Is your design primarily about presenting interesting gameplay, or about making the player take certain actions (driving metrics)?
Ultimately, these are components of a more important, broader question: Are you doing what you can to ensure that the player’s encounter with your game is mutually beneficial?
In the sections above, I’ve argued that exploitative game design exists and that it ought to be seen as an ethical issue. There are a few natural questions and counter-arguments that seem to come up again and again; I’ve tried to offer initial responses to them here.
- You call it “exploitation”. My players call it “fun”. If people enjoy it, who are you to say that they’re wrong?
Recall the earlier example of the con artist selling a counterfeit product. The buyer might feel totally satisfied with their purchase, but that’s not enough to conclude that it was an ethical exchange. We have to look more closely.
Sometimes the player can benefit even when exploitative design is used. In small doses, and for a cheap-enough price, most games can be genuinely valuable to their players, even if only as an idle distraction. But the real issue with exploitative design techniques is that they are used to divorce perceived value from actual value (as the customer’s “best self” would judge it). Irrational loss aversion, compulsive behavior, and getting tricked into purchases are not good, valuable, healthy experiences, even though they are freely chosen.
In most cases, exploitative techniques surround a simple or vapid gameplay loop, which would normally lead to boredom. Exploitative design can get around boredom by using techniques like variable ratio rewards and extrinsic feedback, but boredom is sometimes a healthy reaction to wasting time. If your players knew this, fully understood your game’s dynamics, and saw through your game’s monetization design, would they still be happy to spend their time and money this way? If the answer is “no”, you’re exploiting their lack of understanding.
- My game is not unethical because it can be played entirely for free / because it has a cap on spending.
While these are good methods of reducing the maximum harm that a game can cause, neither is a get-out-of-jail-free card. A totally free game can still be addictive and waste a player’s time, and an F2P game that fails to extract money from some players (despite its best efforts) is hardly worthy of praise. Similarly, a spending cap is not a complete defense; a game that exploits someone out of a few dollars is still unethical, in the same way that a thief who only steals a few dollars is still committing a crime.
- A lot of people use those techniques. Do you really believe that all those developers are evil?
No, of course not. In most cases, I think that these techniques are employed due to a designer’s lack of understanding, a naive reliance on metrics, or an innocent but misguided emphasis on monetization. In other cases, developers might knowingly work on exploitative games for practical personal reasons (after all, taking the high road can be expensive). I certainly don’t think that all exploitative game designers “are evil”. But, then, I don’t think that you have to be evil to do unethical things. My hope is just that good designers will reflect on their designs and consider what effect they’re having.
- Are you saying that the players of these games are stupid?
No. Maybe some players are, but everyone has biases, and everyone can be manipulated. More importantly, the players don’t bear ethical responsibility. If someone is scammed by a con artist, I don’t blame them or ask about how gullible they were; instead, I blame the con artist for exploiting them.
- What about [another common exploitative sales technique]? Are you saying that’s unethical too?
Yes, probably. Exploitation is far too common. Others have written about the subject with regard to state lotteries, web design, lending money, and social networks. Any time that someone seeks to gain at the expense of another, I’m concerned.
- You just think you know what’s best for everyone. Let them make their own choices! People should be free to waste their money if they want to.
I agree that people should be free to spend their money as they please. However, that doesn’t mean that you should be encouraging them to spend their money poorly. If someone is considering buying your product, and you believe that the transaction will make their life worse, you should not sell it to them. If you do, you’re knowingly profiting from their harm, and I call that unethical.
- “A fool and his money are soon parted.” These games may be exploitative, but the audience for them is huge. Even if I refuse to take their money, someone else will. If they want to buy my product, I’m going to sell it to them, period.
Yes, there are people out there who are ripe for exploitation. But just because they are especially vulnerable doesn’t mean it’s okay to prey on them, even if they are literally asking for it. I reject the idea of a dog-eat-dog world in which exploitation is the only way to get ahead. You don’t have to contribute to a collective wrong, and instead you can try to produce an alternative. The creators of Plants vs. Zombies proved that it’s possible to make a non-exploitative game that’s still a hit. Why not make that your ambition?
- Players will eventually realize that they’re being exploited and stop playing those games. The market will sort itself out.
I hope that this will prove true in time. But it’s not a sure thing (see the ongoing success of slot machines), and I suspect that a particularly vulnerable subset of players will continue playing exploitative games for a long time. In any case, it’s never too early for developers to start considering the ethics of their game designs.
- You can’t compare games to addictive drugs or to scams. Drugs are physically addictive, and scams involve lying to the victim. That’s not true of games.
Those differences aren’t relevant to my point. (In fact, I don’t think that drugs or lying are always bad.) I bring up these subjects as examples in which the buyer is making a voluntary purchase and yet the exchange is still exploitative. These examples show that it’s possible, and this dynamic is relevant when discussing exploitative games.
- Lighten up! They’re just games. Nobody’s dying here.
I admit, this is hardly the world’s biggest moral issue. But even a small wrong is still a wrong, and when it’s potentially repeated millions of times, it’s surely worth our attention.
- You say that sometimes those techniques are acceptable, and sometimes they’re unethical. Where do you draw the line?
They are unethical whenever they result in harm to others, usually by convincing customers to spend time or money without delivering sufficient value in return. Determining whether that’s happening is not always simple, but it deserves our earnest reflection.
- Why don’t you call out any specific games as exploitative?
I avoided giving examples of exploitative games here, since I’m mostly trying to offer a foundation for future discussion. I hope that others will write more specific analyses of existing games.
Most of my arguments above have been made before by others; this article is just my attempt to combine them and restate them in a way that makes the most sense to me. If you’re interested in reading more, here is some particularly salient material from those other authors:
- Two talks by Jonathan Blow: Design Reboot (most relevant 16:15-31:30) and Video Games and the Human Condition, addressing reward schedules and exploitative design.
- Achievements Considered Harmful? by Chris Hecker, discussing the effect of extrinsic rewards.
- who killed video games? (a ghost story) by Tim Rogers, a look at the mechanics of monetization in social games.
- Chasing the Whale: Examining the ethics of free-to-play games by Mike Rose, focusing on addiction.
- Social Games vs. Gambling, by Raph Koster, discussing the similarities and differences of those two industry segments.
- Contrivance and Extortion by Adam Saltsman, and Part 2, discussing the intersection of the “Checklist Effect” and microtransactions.
- Let’s Admit it: Addiction is not an Asset by Drew Dixon, pointing out that “addictive” should not be a compliment.
- I’d Like Fewer Addictive Games, Thanks by Patricia Hernandez, making a similar point.
- Apple is Gambling by Colin Northway, attacking games that “basically earn their money from failures in the human mind”.
- Shit Crayons by Ian Bogost, a response to the assertion that some exploitative games are actually about creativity.
@E_McNeill