Some say that free-to-play is the bane of the video game industry. But rather than wishing free-to-play would go away, maybe we'll just have to get really good at using it.
In the April issue, Game Developer magazine editor Patrick Miller sees promise in free-to-play -- if it's done the right way.

"Free-to-play is killing video games!" If you've heard it once, you've heard it a thousand times. From a business perspective, free-to-play is a useful tool because it can offer smaller studios a shot at an extraordinarily wide audience and higher overall revenues than the pay-once model - which, in turn, means more stability and job security. But nobody likes playing - or making - a game that feels like it's powered by your wallet. Rather than wish f2p would go away, we'll just have to get really good at using it.
Judging from the comments in this year's Salary Survey, monetization is on everyone's mind these days. But while devs and consumers alike aren't shy about heaping disdain upon free-to-play games, it's worth pointing out that over the last three years of salary surveys, we've seen a gradual decline in layoff rates and an increase in average salaries. Not all of those gains are necessarily due to the rise of the f2p model, but if you think about some of the hidden costs of the traditional pay-once development cycle, you might be a little less skeptical about f2p.

Traditional game development, as we think of it, sits somewhere between the entertainment industry/Hollywood model, where you assemble a one-time team of people to produce one project, and the software development model, where a team of developers builds and improves a product for as long as that product is sold. If you're a developer making, say, Microsoft Word, you can be pretty sure that once you've shipped a version of Word, you'll still have plenty of work left to do: fixing remaining bugs, releasing new patches, and working on the next version of Word. If you're a game developer, though, at some point your game will be "done," and your studio might not have another project for you to work on. Essentially, game devs end up with all the liability of a film worker, but without any of the unions or support structures that make that model sustainable.

Many f2p games, on the other hand, launch as soon as they can put together a minimum viable product in order to start getting revenue coming in, and then gradually add new features and content after launch. As long as there is something to add to the game, there's a reason for the dev studio to keep people employed and working on the game - which simply isn't true for pay-once games.
(This is also true for subscription-based games, but their success tends to depend on their ability to monopolize a player's attention for a long period of time, which is tricky.) Logically, that means we should see fewer layoffs in f2p game dev (when the games are performing well, anyway). As my film editor buddy Brian put it, "Pay-once dev is like working on a blockbuster film, free-to-play is like working on a TV show."
Free-to-play proponents like to mention that arcade games were the first example of monetization design. What many people seem to miss is that some of those games actually hit the Holy Grail of monetization design: they made paying fun. Play Final Fight in free-play mode and it gets dull fast because there's no cost to failing. Play it with a fixed number of lives and continues and things get more interesting, but you end up playing through the same segments over and over. Play at 25 cents per continue, and you'll find yourself marshaling every last pixel in that health meter, asking yourself whether it's worth another 25 cents to see the next level, and so on. The experience is actually enhanced by the presence of actual, real-world stakes (the quarters in your pocket).

Another unorthodox example of effective monetization design is the time-honored "money match," where two players bet on the outcome of a game. The fighting game community has taken these to rather ridiculous extremes (see the Marvel vs. Capcom 2 $50,000 money match between Toan and Fanatiq), but as an enthusiast myself, I love upping the stakes by putting a dollar or two on the line just to give each in-game moment a little bit more real-world weight. I lose more than I win, but the extra thrills make it worth it. And as f2p models continue to develop, I suspect we'll see more going on in f2p than just sticking a price tag on in-game content.
Nobody wants to play a game that makes you feel like a cash cow. But pay-once games are harder to sell than they were 10 years ago, and those business models also gave us wonderful workplace practices like "crunch time" and "laying everyone off after ship" - both of which make it harder to attract, cultivate, and retain talented developers. If we want to see the game industry become a place where developers can reasonably see themselves supporting their families, buying homes, and sticking around until retirement, we're going to have to solve The Money Issues in a way that makes everyone - devs, suits, and consumers - happy.