
Money Emergence in Video Game Worlds

It wasn’t so long ago that the idea of an “in-game economy” was little more than a hopeful metaphor, but today, many games feature intricate and evolving marketplaces.

Matthew McCaffrey, Blogger

September 23, 2016


Some of the most fascinating innovation in digital markets happens in video game worlds.

One great thing about this trend is that the more complex and realistic games get, the more they begin to mirror real-world economic principles, both in their artistry and through gamers’ actual social relations. You can even argue that game logic parallels economic logic, and reflects some of the same basic concepts, like choice, tradeoffs, and opportunity cost. But games can also illustrate more complex economic processes like specialization, the division of labor, and even the emergence of money.

This last process is the topic of a new paper by Alex Salter and Solomon Stein, who explain how a medium of exchange emerged in Diablo II through a decentralized process of exchange.

The story is pretty interesting: in Diablo II’s multiplayer mode, players’ inventory space for in-game items was quite limited. To improve their characters, players therefore needed to trade away less valuable items in order to acquire more valuable ones. But direct exchange between players was complicated and uncertain, and on top of that, players faced the traditional problem of barter: the high cost of finding a suitable trading partner.

In an attempt to solve the problem of the double coincidence of wants, players took to message boards to list their current and desired inventory. As it happened, certain in-game items like runes and gems were more saleable than others. Just as economist Carl Menger's theory predicts, players began to acquire these goods for use as indirect exchange media. Gradually they became generally accepted and were eventually used as currency. Players even developed detailed trading guides with price lists, and exchange rates were routinely published on the message boards.
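Menger's mechanism can be sketched in a toy simulation: agents sometimes accept a good they don't want to consume, simply because it is easy to re-trade, and the most saleable good ends up doing most of the work as an exchange medium. Everything below (the item names, the saleability values, the trading rule) is a hypothetical illustration of the general mechanism, not data or a model from Salter and Stein's paper:

```python
import random

def simulate(rounds=5000, seed=0):
    """Toy Menger-style exchange: count how often each good is accepted
    *indirectly*, i.e. by an agent who doesn't want to consume it."""
    rng = random.Random(seed)
    goods = ["rune", "gem", "armor", "potion"]
    # Hypothetical saleability: the chance a random partner accepts the
    # good in trade even when it isn't the good they actually want.
    saleability = {"rune": 0.9, "gem": 0.6, "armor": 0.3, "potion": 0.1}
    n = 100
    holdings = [rng.choice(goods) for _ in range(n)]
    wants = [rng.choice(goods) for _ in range(n)]
    medium_uses = {g: 0 for g in goods}
    for _ in range(rounds):
        a, b = rng.sample(range(n), 2)
        offered = holdings[a]
        if offered == wants[b]:
            # Direct exchange: b gets the good it wants and forms a new want.
            holdings[a], holdings[b] = holdings[b], holdings[a]
            wants[b] = rng.choice(goods)
        elif rng.random() < saleability[offered]:
            # Indirect exchange: b accepts a good it doesn't want, betting
            # it can be re-traded later -- Menger's mechanism in miniature.
            medium_uses[offered] += 1
            holdings[a], holdings[b] = holdings[b], holdings[a]
    return medium_uses

uses = simulate()
# Goods ranked by how often they served as an indirect exchange medium;
# the most saleable good dominates.
print(sorted(uses, key=uses.get, reverse=True))
```

The point of the sketch is only that no agent sets out to "create money": each accepts the saleable good for self-interested reasons, and general acceptance emerges from the aggregate of those choices.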

This is a neat illustration of how a good becomes money through the voluntary exchanges of many individuals trying to overcome the inefficiencies of barter. In a way, it’s almost like R.A. Radford’s famous case study of money in a WWII POW camp updated for the 21st century. It’s also a promising entry into the literature on the political economy of game worlds, which unfortunately is still fairly sparse.

I do have one criticism of the paper though. That is, the authors see their story not only as a case of money emerging out of barter, but specifically as an example of money emerging in the absence of a state. By arguing along these lines, they hope to undermine the chartalist or state theory of money, which argues that it’s actually government policy that paves the way for a medium of exchange. In the authors’ view though, “there is no mechanism or institution in the game environment by which obligations to pay in material resources can be forced on players. As such, there is no state in the sense used by the chartalists.” (Alternatively, “no central authority existed or could exist to provide an alternative currency or cause some other object to become the focal monetary unit.”)

This argument works best against chartalist theory specifically, which usually invokes a state system of debt clearing or tax collecting to explain how money comes to exist. Since neither of these institutions appears in Diablo II, it’s reasonable to conclude they are not necessary for an in-game medium of exchange to appear.

Nevertheless, I think there may be a problem with the broader claim there was no state-like institution at all in the game world. For instance, it could be argued that Blizzard, as the developer, played the role of a state (although not in the way described by the chartalists). After all, the developers designed the game environment, which they could adjust at their discretion. And while they did not directly force players to exchange, they did completely change the pattern of exchange by adjusting the core rules of the game. They can therefore be said to have controlled the institutional setting in which player transactions took place.

Salter and Stein anticipate this objection, arguing that, “taking this broad a definition of a state means that every social system must have a state, since every social system has some framework of rules by which it is governed.” But this focuses on the wrong issue; what matters in defining a state is not the existence of rules as such, but the process by which rules appear and evolve. For instance, do rules emerge as the result of a decentralized (market) process, or by the unilateral decision of a planning agency? In the case of Diablo II, the data point to the latter.

To illustrate this point, note that the developers often patched the game to fix imperfections, making changes to the in-game economy that rebalanced player abilities and altered the relative use-value of items. Clearly then it was both possible and even expected that the developers would intervene to resolve any perceived issues in the structure of the game. So even if they didn’t explicitly force trades between players, Blizzard could change the rules with impunity and thereby redistribute various forms of in-game wealth.

On a related note, Salter and Stein suggest not only that a state didn’t exist in the game, but that it couldn’t have existed. I don’t think this claim is adequately explained, and in fact, when discussing the game’s history, the authors seem to rely on the fact that Blizzard didn’t force exchanges, not that it couldn’t. So I don’t think the argument shows so much the absence of a state as the absence of specific types of direct intervention, à la chartalism.

According to the authors’ definition, Diablo II is stateless because it does not contain “a legitimized monopoly on the initiation of coercion.” But whether the developers fall within this definition depends on how we define “coercion.” And there’s the rub: I don’t think we can always apply ideas about real-world institutions within narrow—maybe even contrived—examples like games. Because games are planned, controlled environments, we end up using economic metaphors instead.

Putting it a different way, we might ask: what would a state look like in a gaming context, if not a developer? I agree that we should be careful not to define the state too broadly, but at the same time, if we define it too narrowly, we risk excluding it by definition. And if we rule out states a priori, empirical case studies tell us relatively little.

In any case, although I’ve taken up a lot of space puzzling over it, the question of a state is minor compared to the paper’s broader contribution: a study of Diablo II showing money’s emergence through voluntary exchange, in accord with Menger’s theory. In my opinion, this is just one example of the many ways we can incorporate games into our teaching and learning, and I hope to see many more in the future.

An earlier version of this article appeared on Mises.org.
