As the recent Battlefront 2 pricing outrage shows, combining monetization models risks violating social contracts around how people buy games. There's a value judgment made by the consumer, and it doesn't always match what the developers have in mind.
The discussion around loot boxes and unlockables has erupted into the latest wave of consumer outrage. Most recently, Battlefront 2 had to change the way it priced its in-game purchases in the face of consumer backlash. The "controversy" is likely blown out of proportion by vocal consumers, but it reveals underlying problems that have been festering for years. And they're not really problems with monetization, but rather with how the industry cultivates expectations around value. In this piece, I'll talk a little bit about the relationship between monetization models and consumer expectations.
How do consumers pay for games anyway? There are many models, ranging from pay-upfront (the consumer pays a lump sum for a "complete" product) to subscription (a recurring fee for a service) to free-to-play (monetized via in-game purchases). Around each of these models, consumers have settled into a rough social contract about what they get for what they pay. Mostly. It certainly isn't clear that anybody really understands what games are supposed to cost.
The question is, what happens when the consumer pays upfront but is asked to pay again post-purchase? Paying for cosmetic items that don't affect the gameplay, such as character skins or fancy digital hats, is well accepted. When developers have experimented with selling gameplay-altering items, they've mostly met with negative results.
In the specific case of the recent Battlefront 2 controversy, the issue is that "hero" characters can have a potentially large gameplay impact in competitive multiplayer, so hiding them behind an unlock feels like a barrier to enjoyment. Sure, players will eventually accumulate enough in-game points to access features they already paid real money for. And there's the problem.
Let's back up for a minute and look at where loot boxes and unlockables fit within the gameplay experience. It seems clear that they're a form of extrinsic reward, and that's legitimate. People like getting new, shiny toys. Except that there's no one-size-fits-all formula for balancing this. Make things too tedious to unlock and players are stuck grinding for progress. Make it too easy and the challenge associated with them disappears. But as with all extrinsic rewards, perhaps it's better to replace them with intrinsic ones. After all, the real prize is playing with those unlocks, not the process of acquiring them. People seem much more likely to share and remember their play experiences than the grind it took to unlock things.
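To see how sensitive that balance is, here's a minimal back-of-the-envelope sketch in Python. All of the numbers (unlock cost, credits per match, match length) are hypothetical rather than figures from any particular game; the point is just that the same unlock price can mean wildly different amounts of real time depending on the earn rate.

```python
# Minimal sketch of unlock pacing, using purely hypothetical numbers.
# unlock_cost, credits_per_match, and minutes_per_match are illustrative
# assumptions, not values from any shipped game.

def hours_to_unlock(unlock_cost: int, credits_per_match: int, minutes_per_match: float) -> float:
    """Estimate the real-time grind required to earn a single unlock."""
    matches_needed = unlock_cost / credits_per_match
    return matches_needed * minutes_per_match / 60

# Sweep a few earn rates to see how sharply the pacing shifts.
for rate in (200, 300, 500):
    hours = hours_to_unlock(unlock_cost=60_000, credits_per_match=rate, minutes_per_match=12)
    print(f"{rate} credits/match -> {hours:.0f} hours per unlock")
```

Halving the earn rate doubles the grind, which is exactly the kind of lever the conflict of interest described next is about.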
In the past, developers have experimented with letting consumers bypass the unlock system by paying to unlock items immediately. This has its own set of problems and raises the specter of pay-to-win monetization. It also creates a potential conflict of interest, where developers can undermine their own progression systems in pursuit of revenue. After all, if developers purposely make aspects of the game more frustrating, can they convince players to pay to bypass them?
As it turns out, not all consumers have the same tolerance for what they will and won't pay for post-purchase. Games have been around long enough that there's plenty of data to inform these pricing decisions, and from the business end it works out if developers are optimizing for revenue. This seems reasonable, except that a player who pays upfront has different expectations from a free-to-play player.
The value proposition is different when the game is provided for free or at a large discount, and consumers are often willing to pay along the way for an experience they're enjoying. Free-to-play games also benefit from a minority of players who are willing to spend large amounts of money on in-game purchases. Yes, in-game purchases are absolutely good for customer-lifetime-value numbers, but it isn't clear that the individual consumer makes an upfront purchase decision with that number in mind.
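To make that concrete, here's a toy calculation in Python. The segment sizes and spend levels are invented for illustration; the point is how a tiny fraction of big spenders can account for most of the post-purchase revenue behind those customer-lifetime-value numbers.

```python
# Toy illustration, with made-up numbers, of how a small minority of big
# spenders can dominate post-purchase revenue. Segment shares and spend
# levels are assumptions for illustration, not real data.

segments = {
    # segment name: (share of player base, average post-purchase spend in dollars)
    "non-spenders": (0.90, 0.0),
    "light spenders": (0.09, 10.0),
    "big spenders": (0.01, 300.0),
}

average_spend = sum(share * spend for share, spend in segments.values())
big_spender_share = (segments["big spenders"][0] * segments["big spenders"][1]) / average_spend

print(f"Average post-purchase spend per player: ${average_spend:.2f}")
print(f"Portion of that revenue from the 1% of big spenders: {big_spender_share:.0%}")
```

In this made-up example, 1 percent of players generate roughly three quarters of the in-game revenue, which is why the spreadsheet favors building systems around them even though the other 99 percent bought the same box.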
Combining the two monetization models seems like a recipe for conflicting expectations around what consumers will pay for. And while it might make mathematical sense to build in-game systems around the minority of big spenders, it runs the risk of subverting the social contract with the remaining majority of consumers, who have already paid upfront for the product with some notion of expected entertainment value in mind. As a consumer, it sucks to feel like you've paid for a full product, only to find you have limited access to the features you paid for.
It's hard to know exactly how much each of these factors is contributing to the uproar. There is probably some mob mentality going on, coupled with a lingering disillusionment with grinding for unlocks. Measured in hours of entertainment per dollar, games are still underpriced for the value they provide. At the same time, consumers hate feeling like they're being nickel-and-dimed, especially by a company that isn't exactly overflowing with consumer goodwill.
Ultimately it's unclear how much this will affect sales, now or in the long run. After all, most consumers are content to make a value judgment in the moment and make purchases accordingly. But unless companies can figure out how to balance revenue expectations with consumer notions of value, this mismatch is going to continue to cause turmoil in the marketplace.
When he isn't trying to clear his extensive backlog of purchased-but-unplayed games, Alan builds machine learning models to explore whatever data he can get his hands on, including indie game pricing data on Steam.