Take a dataset of over 250 games, their development costs, their year of release, what they sold for, and how big they were. What can we learn about the trends, and what do they mean for the industry?
Yesterday I was in Anaheim giving a talk called “Industry Lifecycles.” It was intended to be a brief summary of the blog post of the same title, with a dash of material from my recent post on game economics.
Now, that latter post resonated quite a lot. There was lengthy discussion on more Internet forums than I can count, but it came accompanied by skepticism regarding the data and conclusions. If you recall, the post was originally replies to various comment threads on different sites, glued together into a sort of Q&A format. It wasn’t based on solid research.
As many pointed out, getting hard data on game costs is difficult. When I did my talk “Moore’s Wall” in 2005, I did some basic research using mostly publicly available data on costs, extrapolated an exponential curve for game costs, and warned that the trendlines looked somewhat inescapable to me. But much has changed since then, not least the advent of at least two whole new business models in the intervening time.
So the Casual Connect talk ended up being an updated Moore’s Wall. Using industry contacts and a bunch of web research, I assembled a data set of over 250 games covering the last several decades. This post is going to show you what I found, in rather more detail than the talk could offer, since it was only 25 minutes long. (You can follow this link to see the full slides, but this post is really a deeper dive on the same data.)
Each game has a reported development cost which, importantly, excludes marketing spend. So this is mostly the cost of salaries and various forms of overhead such as tools. When costs were reported in currencies other than the dollar (Euro, yen, even zlotys) I went back to the year of release, and converted the cost to a dollar value using the exchange rate prevailing in December of that year. I then took all dollar values and adjusted them for inflation so that we are comparing actual cost in today’s money.
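For anyone who wants to replicate that normalization, it boils down to two lookups and a multiply. Here is a minimal Python sketch; the exchange rates, CPI values, and function name are illustrative stand-ins of mine, not the actual figures or tooling behind these charts.

```python
# Hypothetical sketch of the cost normalization described above.
# All rates and index values below are made-up placeholders.

DEC_FX_TO_USD = {
    ("EUR", 2012): 1.31,    # dollars per euro, December of release year
    ("JPY", 2012): 0.0119,  # dollars per yen
    ("PLN", 2015): 0.25,    # dollars per zloty
}

CPI = {2012: 229.6, 2015: 237.0, 2017: 245.1}  # price index by year

def normalize_cost(amount, currency, release_year, target_year=2017):
    """Convert a reported dev cost to inflation-adjusted dollars."""
    usd = amount if currency == "USD" \
        else amount * DEC_FX_TO_USD[(currency, release_year)]
    # Scale from release-year dollars to target-year dollars.
    return usd * CPI[target_year] / CPI[release_year]

print(normalize_cost(40_000_000, "EUR", 2012))  # roughly $55.9M in 2017 dollars
```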
The result:
As should be immediately apparent, it’s pretty hard to read the costs, because the vast majority of games cost under $50 million to make. The outliers are AAA console and PC titles with enormous budgets, and you have probably heard about them because costs like that tend to make the news.
The chart gets a lot easier to read if you plot it on a log scale; in this chart, each vertical box implies costs going up by a factor of ten.
The trajectory line for AAA games is very clear. You can just eyeball that the slope of the line for console and PC releases goes up 10x every ten years and has since at least 1995 or so, and possibly earlier (data points start getting sparse back there). Remember, this is already adjusted for inflation.
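If you would rather not trust an eyeball, the slope is easy to check with a least-squares fit in log space, since a straight line there is an exponential in linear space. A sketch, using made-up points in place of the real data set:

```python
import numpy as np

# Illustrative stand-ins for the AAA (year, inflation-adjusted cost) points.
years = np.array([1995, 2000, 2005, 2010, 2015])
costs = np.array([5e6, 1.6e7, 5e7, 1.6e8, 5e8])

# Fit a straight line to log10(cost); the slope converts directly
# into a per-decade cost multiplier.
slope, intercept = np.polyfit(years, np.log10(costs), 1)
print(f"cost multiplier per decade: {10 ** (slope * 10):.1f}x")
# With these stand-in numbers, the fit comes out to about 10x per decade.
```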
We can also clearly see the appearance of indie games and mobile games on the chart. I have far fewer data points for these, as you can see, and a truly staggering number of them are released with basically no budget whatsoever. But the vast majority of those are also made at a loss; most of the mobile figures come from games that were at least nominally successful.
I took an average of the data per year, but it only tells us so much given that the data is weighted towards AAA games, and they pull up the average so dramatically. So I wouldn’t read too much into this graph except to say that even with the sparseness of the most recent and the oldest data points, the line is shockingly straight. I will say that a couple of recent top-of-the-line mobile games have budgets ranging from $5m to $20m; the bottom end of “AAA mobile” is not as low as people think. Even PC indie games with high polish hit multiple millions.
All in all, given reporting bias (crazy expenses are more fun to talk about), and given that exponential cost differences mean the median or “typical” game is certainly not climbing at the same rate, and given the lack of enough mobile and indie titles in the data set, this average line is certainly over-reporting for games as a whole. You may find that somewhat reassuring, especially if you’re working on a $50m AAA game right now.
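To make the mean-versus-median point concrete, here is a toy version of that averaging step (pandas assumed, numbers invented); notice how the AAA-sized outliers drag the yearly mean far above the median:

```python
import pandas as pd

# Invented rows mixing indie, mid-tier, and AAA budgets.
df = pd.DataFrame({
    "year": [2010, 2010, 2015, 2015, 2015],
    "cost": [5e6, 2e8, 1e5, 1e7, 3e8],
})

yearly = df.groupby("year")["cost"].agg(["mean", "median"])
print(yearly)  # the mean tracks the outliers; the median tracks the typical game
```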
On the other hand, this picture is actually far too rosy in another way: it doesn’t include any marketing costs. As a rule of thumb, you can say that an AAA game’s marketing budget is approximately equal to 75-100% of its development cost. So costs of getting an AAA game to a consumer’s hands are actually more like double. In mobile, it’s not uncommon to hear savvy shops set aside three to ten times the development budget for marketing, because the market is that crowded.
Looking closely at the data points, there is rather an upward trend to the mobile titles and the indie titles as well. This isn’t surprising, given that production costs tend to go up as markets mature. But it raises the question of whether there is some way we can compare apples to apples and see if there are global trends. After all, costs rising is fine if revenue and audience rise to match, right? It all comes out in the wash.
So I went looking for something that would correlate. I expected something like hardware power and capability to introduce “steps” in the graph, for example, and I wasn’t seeing that. Finally, I settled on one simple thing as a proxy: bytes. I went back and, for each game, located the actual install size: the space taken up on disk (or on device) after a full install, with all sideloaded, streamed, or day-one patches applied.
Needless to say, this also had to be plotted on a log scale, because the earliest games on the chart were only a few K in size, and the latest were many gigabytes. The result was this.
So, needless to say, bytes go up. Surprisingly, they don’t tend to go up in stepwise fashion as platforms are released, even back in the midst of the console wars. Early on, carts with extra memory were slipped into production midway through the lifecycles of consoles, and later on, new run-time decompression techniques enabled disks to literally just have more bytes on them. For example, the NXE update to the 360 reduced install sizes using compression techniques by up to 30%. Add in the various forms of streaming that aren’t cached, for lots of types of games that require a connection, and it’s likely that the byte count here is, unlike costs, rather under-reported.
Either way, we now have a simple way to baseline. How many dollars does it cost a developer to create a byte? We know what we want to see: costs falling. In my earlier Moore’s Wall talk, I had looked at costs and costs per byte for the window of 1985 to 2005, and had arrived at a simple conclusion (one which I repeated in several later talks such as “Age of the Dinosaurs”): game size went up by 122 times, costs rose by 22x, and therefore we got roughly six times more efficient at creating content (122 ÷ 22 ≈ 5.5).
So here is the simple division of dollars and bytes, on a log scale.
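The computation behind this chart is nothing fancier than that division, game by game, plotted on a log axis. A sketch with invented numbers:

```python
import matplotlib.pyplot as plt

# Invented rows: (year, inflation-adjusted cost in dollars, install size in bytes).
games = [
    (1985, 1e5, 128 * 1024),
    (1995, 2e6, 650 * 1024**2),
    (2005, 2e7, 7 * 1024**3),
    (2015, 1e8, 50 * 1024**3),
]

years = [year for year, _, _ in games]
cost_per_byte = [cost / size for _, cost, size in games]

plt.semilogy(years, cost_per_byte, "o")  # log scale, as in the charts here
plt.xlabel("release year")
plt.ylabel("dollars per byte")
plt.show()
```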
Suddenly what becomes apparent is that there’s about 10x variability in costs within a given year, most of the time. Looking at the specific data points, I can tell you that most of this can be chalked up to whether the game is content-driven or system-driven. A story-based game, an RPG, something with tons of assets, will just naturally have a higher cost. There are also some famously troubled productions in the data set; no surprise that they tend to sit towards the upper end of the range for their respective years.
The real eye-opener is that the $5m indie mobile title and the giant $100 million AAA cross-platform extravaganza cost the same to make in terms of megabytes. (They were actually off by only 3/1000ths of a penny.) That’s likely because salaries are salaries, and don’t move that much when you change segments within the industry.
More troubling to me was that eyeballing the average cost per byte, it looks like we have plateaued.
Unreal Engine 3 and Unity both launched in the 2004-5 window. I would have expected these two amazing toolchains to have hugely driven down the cost per byte. Instead, it kind of looks like it went flat.
It raises the disturbing possibility that maybe standardizing on these two engines has actually blocked faster innovation on techniques that reduce cost. I don’t know what else might be contributing to the flattening of the curve. Maybe the fact that Unity and Unreal are designed around static content pipelines, rather than doing much with procedural content, affects this? Maybe this is actually the good outcome, and costs would otherwise have boomeranged back up? There’s no way to know. I even unrolled the yearly average and simply sorted the games by release year to see if I was seeing things, and if anything the line looks flatter that way, because it reduces the impact of those outliers.
Data complexity in games is a real thing, and it is something that players, I think, routinely and hugely underestimate. This post by Steve Theodore on Quora is illustrative. In it he shows a 1997 character that took ten working days, then one from ten years later that took 35 days. His estimate for a character today is a hundred days. What used to be one 256×256 texture is now authored as many 4096×4096 textures: specular maps, bump maps, displacement maps, and so on.
If we take a step back, though, the real issue here is whether we can, as developers, cover that cost. So I went back through the data set and, where I could, plugged in the retail MSRP in inflation-adjusted dollars. For mobile games that were pay-once titles, I used the price; for older MMOs, I ballparked it at box cost plus six months of subscription on average; and where I had actual LTV for users, I plugged that in. The result told me how much players have paid for a megabyte of game over the years. Spoiler: they’re getting a deal.
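Mechanically, that pricing heuristic looks something like the sketch below. The field names and the example game are mine, and every number is invented; it just encodes the rules I described.

```python
def lifetime_revenue(game):
    """Approximate what one player paid, per the rules described above."""
    if game.get("ltv") is not None:      # actual LTV data, where available
        return game["ltv"]
    if game["kind"] == "mmo":            # box price plus six months of sub
        return game["box_price"] + 6 * game["monthly_sub"]
    return game["msrp"]                  # pay-once title: just the price

def revenue_per_megabyte(game):
    return lifetime_revenue(game) / (game["install_bytes"] / 1024**2)

example = {"kind": "mmo", "box_price": 50.0, "monthly_sub": 15.0,
           "ltv": None, "install_bytes": 30 * 1024**3}
print(f"${revenue_per_megabyte(example):.4f} per MB")  # about half a cent
```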
“Wait!” you might say. “We don’t pay for megabytes! We pay for fun! We pay for gameplay! Not raw install! We pay for value!!!” Yeah, yeah. But in practice, development costs are correlated with bytes, not Metacritic scores, I think (no graph for that, but it was an easy eyeball test, plus it makes obvious sense: a big game still costs a lot even when it’s bad).
Lots of people have made the observation that in terms of raw purchasing power, players pay around half of what they used to in the 80s. You can thank our old friend inflation; I particularly like the chart here showing the effect. Well, in terms of bytes, players pay far less than half.
What are the games that poke out as having high revenues per byte? They are “evergreen” games that rely strongly on:

- community
- user-created content
- player skill (sports-like)
Unsurprisingly, most of the high data points are MMOs and service-based online games. They’re probably not as high as they look, since these are also the games most likely to rely on streaming content — but for MMOs I did try to compensate by using the total space on disk after play, so any streaming caches are included.
The kicker is that this hugely under-reports the fall in game prices, because whole segments of the industry give away their games now. Those free to play games are still delivering that many bytes to users, who just don’t pay. And yes, some whales then pay enough to cover the free players. But for the resultant data point to be equivalent to the cost per byte of an AAA game of the same size, you would need every player to have a $60 lifetime spend in the game. On average. Needless to say, free to play games do not tend to hit a $60 average for every player who enters the game (though some do, in Asia especially, believe it or not).
That’s not even mentioning other aspects of downward price pressure, such as discounts over time, bundles, or Steam sales.
Now, I’ll be totally upfront here — I don’t have nearly enough data points on costs, install sizes, and typical revenues for mobile games. So this is all sort of speculative at this point. But I don’t like the shape of this curve, especially when I compare it to the other curve, on developer costs.
These two lines are separating, as you can see. Worse, this is a log scale, so they are separating faster every year. This is a classic “make it up in volume” scenario: we can afford, as an industry, for players to pay less and less as long as we can sell to more and more players.
But… at least in developed countries, we are actually close to market saturation. There is a term, “total addressable market,” which means “everyone you can actually sell to.” We crossed the “50% of people are gamers” line almost eight years ago. It’s also a well-known basic rule of marketing that users who are farther away from your core audience cost more to acquire — in other words, the farther into the world’s population we go, the more marketing money we have to spend. And remember, marketing money isn’t in these charts.
On current trendlines, here are some naive forecasts generated by the simple expedient of overlaying a ruler on my monitor:
The first forecast is that at this rate, the average game will be free in about ten more years. And given that the dataset tilts towards AAA, yeah, I mean the average AAA game. Some games will be paying you to play them. Lest this seem crazy, that’s actually already the case for any free to play game we currently consider a flop that doesn’t make back its money; we paid dev and marketing cost, you played, and we didn’t cover the costs.
The second forecast is that the way we’re going, top end AAAA productions will drag the average cost of AAA into the stratosphere. We’re talking one terabyte games that cost $250m to develop, by the early 2020s.
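For anyone who wants to re-run the ruler trick numerically, it is the same log-space fit as before, extended forward. Again, every data point here is an invented stand-in, so treat the printed figures as an illustration of the method rather than the actual forecast:

```python
import numpy as np

# Invented stand-ins for (year, inflation-adjusted AAA cost) points.
years = np.array([1995, 2000, 2005, 2010, 2015])
costs = np.array([6.3e5, 2e6, 6.3e6, 2e7, 6.3e7])

slope, intercept = np.polyfit(years, np.log10(costs), 1)
for year in (2018, 2021, 2024):
    forecast = 10 ** (slope * year + intercept)
    print(year, f"${forecast / 1e6:.0f}M")  # ~$250M lands around 2021 here
```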
We need to remember that a lot of this is simply the price of advancing technology. As long as technology advances exponentially, so will costs, especially if we keep using it naively.
And by naively, I mean, focusing on pixels.
* * *
Because there are some things that may ameliorate this curve. None of them are easy. In fact, most of them have not been executed consistently and effectively over the history of gaming, and frankly we’re not that good at them. But specific outlier games have proven that these things can work and break these curves. The thing they all have in common is that they de-emphasize bytes in favor of other types of content.
Strong community drives retention, and retention drives revenue. Community is probably the most accessible of these levers for developers to aggressively pursue, and even so it’s not cheap and it’s not at all easy. I estimate the typical studio learning curve on doing this to be around 3-5 years of culture change.
Designing for systemic content rather than static content. This is bad news for a lot of games that I love. My absolute favorite game of last year was What Remains of Edith Finch. I thought it was a 10/10 masterpiece marrying the narrative and systemic arts. And with my business hat on, I wonder whether in ten years static content games like it will still be viable.
Focus more on multiplayer, since players are effectively content for one another. See the “community” bucket for the difficulties here.
Shift our F2P emphasis, which currently depends on trickling content and upselling it. That content load is exactly what may kill us.
We could also embrace users generating those bytes in various fashions. UGC, using player models, customization, whatever.
Algorithmic and procedural approaches need to become dramatically more widespread. Fortunately, the academic community is way ahead of you on this one, and there are already academic papers out there on generating entire games with code. Yeah, over the long haul, that may render you the developer obsolete, but at least publishers will live on and raise a glass to your memory as they feed your brains into the training data set for their neural net designer AIs.
Speaking of which, your servers are horrendously inefficient. As an old MMO guy, I can tell you that you are probably vastly underutilizing CPU simply because of libraries, containers, VMs, virtualization, and hugely inefficient web stack stuff. Try pretending that you need to host 5,000 instances of your online match-3 RPG on a Pentium box from 1999. It can be done. It might bend the curve.
Raising prices is the most obvious. Nobody wants to do this. It will probably happen anyway.
The other, similar, thing to do is to make fewer games.
* * *
To the players out there: I know none of the above is stuff you necessarily want to hear. Trust me, a lot of it is not stuff developers want to hear either. If you want to preserve the games you love, you can help by not pirating, by supporting developers, by not tearing them down on social media and calling them inept greedy bastards, and most of all by just understanding the landscape.
And if you are a developer, the best advice I can give you is this… this world isn’t fully here now, but the trends are pretty dramatic in my opinion. So you should do some skill-building while you can.
Think of the whole industry as a mature market. We’re running out of platform shifts that reset costs.
Get good at systemic design, design for retention, design for community. Basically, think like an MMO developer. Yeah, that means designing everything as games as a service.
Embrace procedurality.
But also embrace brand-building and marketing, because you ain’t gonna survive without it. This market is going to keep getting more crowded.
And frankly, I think individual contributors need to start finding ways to get ongoing revenue from older games. Because that world is also one where individual contributors become more and more interchangeable.
Now — it may well be that this data set is utterly inadequate. I invite more data points (submitted anonymously), especially from indie, free to play, and mobile. I’d need game name or unique identifier (so I can de-dupe), total development cost excluding marketing, year of release, and platform. I’d like total size of install or data generated and delivered to player as well.
Of course, this would be better if some web wizard built a website that supported anonymous submission of these data points as an industry service, and generated these graphs on the fly. Because this is not an issue that should pit developer against publisher, publisher against player. This is about the sustainability of the hobby we all love and that pays bills, keeps us sane, and sometimes drives us a little crazy.
To be clear: I would love it if these graphs were wrong.