In this reprint from Game-Wisdom, I examine how the barriers of playing a game have evolved with the rise of the Free to Play market.
Free-to-play titles have become a popular subject of debate among gamers, largely over whether their monetization elements are justified. Some titles give the player value with new content, while others make the experience punishing unless the player spends a few bucks here and there.
I've posted in the past on my views of games built around consumable monetization and attempts to trick consumers into paying. As I thought about it more, it occurred to me where my line is regarding free-to-play design.
Candy Crush Saga, originally posted on Cnet.com
To begin, we need to define some concepts for this post. When it comes to playing a video game, we can simplify the player's involvement or skill level into three categories:
Novice: Someone who just started playing.
Expert: Someone who has learned all the mechanics and has gotten their fill of the game.
Master: Someone who is playing the game at its highest, most optimal level.
These three states depend on the genre being discussed. For instance, in a fighting game, a novice would be someone learning a character, an expert would be someone who has completely figured out a character or two, and a master would be someone playing at a competitive level.
Regardless of the game, a player will progress through each skill level in order: novice, expert, and finally master, usually at different rates based on their ability. Traditionally, the first main barrier in the player's way was the initial price barrier: it doesn't matter how great an FPS gamer I am, I won't be playing Far Cry 3 without spending money (not counting piracy, of course).
While playing a game, players can also hit a play barrier, where their skills peak and they are unable to continue. My own example would be not being good enough at micromanagement to play StarCraft 2 competitively. Because this kind of barrier depends entirely on the player's skill level, we won't delve into it; it's too varied to nail down specific concepts.
When free-to-play became popular and removed the initial price barrier, it replaced it with a consistent price barrier: the player will reach a point where the only way to keep their progression steady is to spend money.
The two barriers are different in design and function. The price barrier of buying a game is like a locked door: once you buy the game, it is yours. A monetized game, by contrast, is like trying to fill an ever-expanding hole: the longer you play, the more money you'll have to spend.
Now, I know that I just lumped titles like League of Legends in with titles like Farmville and I will be distinguishing between them in a minute.
These two different barriers represent the biggest change for older gamers looking at the market today. Many gamers say that if a free-to-play game offered a "buy everything" price instead of microtransactions, they would play it. However, there are three reasons why that won't happen.
DOTA 2, originally posted on PCgamer.com
Games like DOTA 2 and League of Legends keep their monetization from impacting the gameplay.
First, the designers don't want you to reach a point where you won't need to buy anything. The income from free-to-play comes largely from people making microtransactions. If no one buys anything, then that title is not going to last long.
Second, the most successful free-to-play games are never considered "finished" in the sense that no new content will be added. Designers know that to keep people coming back, there has to be a constant stream of new things to do. The players who are the most invested are the ones most likely to continue purchasing content.
Having a "buy everything" option would simply not work, as there would always be something new being added, and the designers would be crazy to give all new content away for free.
The last problem is that for heavily monetized games, the gameplay itself is built entirely around microtransaction elements like boosters and consumable items. Actually buying every consumable item could, depending on the game, cost hundreds of dollars, and winning would effectively mean burning through all of them.
Moving on: with the changes in the price barrier came a change in what "free" means.
The major distinction, in my opinion, that separates a great free-to-play game from a poor one comes down to the gameplay, and there are several aspects we have to examine.
The first is simply: how much of the content is tied to the monetization model? The kicker for me when looking at monetized mobile games and social games like Farmville is how much of the monetization is tied to what little gameplay there is.
The advantage of having simplified content is that it is easy to create a constantly escalating state of challenges. If the first set of hurdles requires 10 points of energy or $1.00, the designer could then make the next set at 15 or $1.50 and continue from there.
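The escalation described above can be sketched as a simple cost curve. This is a hypothetical illustration only; the 1.5x growth multiplier and starting values are assumptions extrapolated from the 10-energy/$1.00 to 15-energy/$1.50 example, not any real game's numbers.

```python
# Hypothetical escalating-hurdle cost curve: each new set of hurdles costs
# 1.5x the previous one, in both energy points and real-money dollars.
# The base values and growth rate are illustrative assumptions.
def hurdle_cost(level, base_energy=10, base_price=1.00, growth=1.5):
    """Return (energy, dollars) needed for the hurdle set at `level` (0-indexed)."""
    factor = growth ** level
    return round(base_energy * factor), round(base_price * factor, 2)

for lvl in range(4):
    energy, price = hurdle_cost(lvl)
    print(f"Hurdle set {lvl + 1}: {energy} energy or ${price:.2f}")
```

Because the curve is exponential rather than linear, the designer gets "new" challenges essentially for free while the player's cost to keep pace compounds.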
Lord of the Rings Online, originally posted on joystiq.com
MMOs with microtransaction-based stores offer all aspects of the game for purchase.
The player is not discovering new gameplay mechanics but is repeating the same content with more hurdles. This is great for the designers, as it allows them to create new content without spending a lot of time and resources, but the quality of that content will not improve.
Now, with titles like DOTA 2 or League of Legends, the monetization is largely based on cosmetic items, keeping the actual playing of the game separate from it.
Speaking of playing the game, the other element of gameplay to examine is: How much can someone play for free?
There are two parts to this question, the first having to do with content. MMOs that went free-to-play, such as Lord of the Rings Online and DC Universe Online, continued adding content and expansions. Installing the game and setting up an account obviously doesn't entitle someone to that additional content.
But if the base game is in and of itself a full experience, then that isn't a problem; a full experience meaning the player has enough content to reach whatever the level cap is for the game. On the other hand, if the player has to continually buy quest packs on their way to the level cap, that would be a different story.
The other side of gameplay revolves around any hurdles or inconveniences built into the free model. In MMOs it is common to lock quality-of-life features like storage space, currency, or even what types of items can be equipped behind a paywall.
With social or mobile games, the common inconvenience revolves around "pay or wait" mechanics. What designers implement is a system that limits how much time the player can actually spend playing without either waiting for more energy or spending money. "Pay or wait" mechanics are an automatic avoid for me whenever I see them in any game, along with a game being made deliberately frustrating in an attempt to force me to spend money.
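The "pay or wait" loop described above can be sketched in a few lines. This is a minimal hypothetical model, not any particular game's implementation; the energy cap, regeneration timer, and `EnergyPool` name are all assumptions for illustration.

```python
import time

# Hypothetical "pay or wait" energy system: the player spends energy to act,
# regenerates one point on a fixed timer, and can pay to refill instantly.
# Cap and timer values are illustrative assumptions.
class EnergyPool:
    def __init__(self, cap=5, regen_seconds=600):
        self.cap = cap
        self.energy = cap
        self.regen_seconds = regen_seconds
        self.last_tick = time.time()

    def _regenerate(self):
        """Credit one energy point per elapsed regen interval, up to the cap."""
        now = time.time()
        regained = int((now - self.last_tick) // self.regen_seconds)
        if regained:
            self.energy = min(self.cap, self.energy + regained)
            self.last_tick += regained * self.regen_seconds

    def play(self, cost=1):
        """Attempt an action; returns True only if the player had energy."""
        self._regenerate()
        if self.energy >= cost:
            self.energy -= cost
            return True
        return False  # out of energy: wait for the timer, or pay

    def pay_to_refill(self):
        """The monetization hook: real money converts directly into playtime."""
        self.energy = self.cap
```

The design lever is the ratio of cap to regeneration time: a small cap with a long timer guarantees the player hits the wall quickly, at which point `pay_to_refill` is the only way to keep playing.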
When discussing the positives and negatives of free-to-play versus retail games, the problem comes down to whom the design favors. With a retail game, the consumer gets everything there is for the game (outside of DLC) in one lump purchase, while a free-to-play game has no hard limit on the number of purchases available.
Outernauts
Heavily monetized games advertise consumable bonuses to get people to pay, and to keep spending money for more advantages.
This is why there are so many success stories of free-to-play games earning millions of dollars, and why developers are lured into creating more of them. Without a limit on spending in a monetized game, it's easy for developers to earn a lot of money.
When a game is less about playing it and more about how much money is spent, I have a problem with that, and it's where titles like League of Legends differ from other free-to-play games.
The analogy I like to use for how a free-to-play game should work is taking up a sport like tennis or golf. These sports have a low cost barrier, and anyone can swing a racket or golf club. There are places where you can practice hitting balls cheaply, and you normally won't need to pay for any gear.
People who want to take things further will spend money on their own equipment or on access to a court or golf course. At each step of learning the sport, the amount of time and money needed grows, but only the person decides how far to go and how much money they are willing to spend.
There is no outside force either pushing the person further or stopping them from continuing. If all I want out of golf is to go to a driving range once a month and hit a few balls, that's perfectly fine. And at the end of the day, the person's skill is the deciding factor in how well they play, not how much money they spent.
To play League of Legends at the higher skill levels, for example, you need a stable of champions you are good at in order to compete. That means either spending real money on them or playing a lot of games to unlock them. But having a lot of champions won't make you a better player unless you become skilled with them, and no amount of money will help you there.
Contrast that with the most popular monetized games, where you could easily spend several hundred dollars to get a huge head start that only lasts until the money runs out. In that way, the comparison of monetization to casino play is apt: you're only a high roller until the money runs out. No money means no more (or very little) progression.
This won't be the last time we talk about free-to-play design, as the market is currently the most lucrative in the game industry. As more consumers are swept up by it and discussions of its ethics continue, whether we'll see a free-to-play boom or bust remains to be seen.