When was the last time you saw a movie, music or book review scored as a percentage? When did 70% become bad? Did you ever play a game that the critics loved, yet you hated? How about the other way around? Is it time to review the way we review games?
No matter how much we might disagree with them, reviews are important. They tell us about games, they offer opinions on how good one game is compared to others, and they help us work out what to spend our money on.
Back in the old days, magazine reviews were pretty much all we had to go on; we hung on every word of the four or five articles written about a game.
But now, thanks to the internet, there are so many reviews, reviewers and review sites that “review aggregation” sites such as Metacritic or GameRankings are needed to get an overall opinion.
The problem is that they do the complete opposite: they don’t give an overall opinion. Instead, they give extra weight to the haters and bury individual reviewers’ opinions under a mask of homogeneity.
The solution – a new kind of aggregation formula, similar to the one Rotten Tomatoes uses for movies.
When I started playing video games (about 28 years ago), game reviews were something that everyone read – they were how you found out about games, and how you judged which games you’d buy.
When I started in game development (about 17 years ago), reviews were still vitally important – they boosted your ego and your CV, they still swayed your purchases, and they sometimes even affected end-of-project bonuses.
Reviews mattered, and were taken seriously, in part because there weren’t that many of them. In both of the cases above, the number of reviews you’d get was limited – depending on the platform and the territory, maybe four or five magazines would cover your game – and the only ones you’d really be interested in were the ones whose quotes the publisher would love to plaster on the box (“this game is awesome (5 stars)” – Official [InsertConsoleName] Magazine, etc.)
Now, however, things have changed – at least in some regards.
Reviews are still the subjective opinions of people we (generally) don’t know. Review scores are still used by many of us as an essential guide to the quality of a game.
But the rise of the internet and the demise of print have seen the number of review sites increase by orders of magnitude – so much so that it’s no longer enough to have a few great reviews for your game; you now need enough of them that the AVERAGE review is great … enter the era of GameRankings, Metacritic et al.
These aggregation sites are practically essential for navigating the vast quantity of reviews for titles – so much so that game development contracts now specify a GameRankings (or Metacritic) rating as a contractual bonus criterion.
Unfortunately, these aggregation sites have a huge flaw: Metacritic and GameRankings are unfairly swung by bad reviews. If your game is averaging 80%, it takes two “excellent” 90% reviews to make up for one “not my sort of game” 60% review. It takes four 90% reviews to make up for one “hater” 40% review – that’s tough, particularly as bad reviews can easily be given by people who simply don’t like that sort of game.
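To make that arithmetic concrete, here is a minimal sketch of plain mean-based aggregation – the scores are hypothetical, and this only approximates how a Metacritic-style average behaves:

```python
# Plain averaging: every point above or below the mean counts equally,
# so one very low score takes several high scores to cancel out.

def mean_score(reviews):
    return sum(reviews) / len(reviews)

base = [80, 80, 80, 80]  # a game steadily averaging 80%

# Two "excellent" 90s are needed to cancel one "not my sort of game" 60:
print(mean_score(base + [90, 90, 60]))          # 80.0 - back to even

# Four 90s are needed to cancel a single "hater" 40:
print(mean_score(base + [90, 90, 90, 90, 40]))  # 80.0
```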
These effects can, if you’re unlucky, be magnified further, as many sites end up simply duplicating the content of reviews from the main sites, making the aggregate even more arbitrary. And while the system of review “weighting” used by some aggregators (based on the status of the reviewing site) aims to solve some of these problems, it only exacerbates them, should just one of those high-status sites happen to be your token “hater”.
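To see why, consider a hypothetical sketch: five ordinary sites love the game, while one high-status site – carrying, say, three times the weight – is the token hater. The weights and scores are invented for illustration; real aggregators don’t publish theirs.

```python
# Weighted aggregation: a review's influence scales with the status
# (weight) of the site that wrote it.

def weighted_mean(reviews):
    total = sum(score * weight for score, weight in reviews)
    return total / sum(weight for _, weight in reviews)

# Five small sites score the game 90; one high-status "hater" gives 40.
reviews = [(90, 1), (90, 1), (90, 1), (90, 1), (90, 1), (40, 3)]

print(sum(score for score, _ in reviews) / len(reviews))  # unweighted: ~81.7
print(weighted_mean(reviews))                              # weighted:   71.25
```

One weighty dissenter drags the score from a respectable low-80s down past the 70s – straight into “mediocre” territory.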
Now, I don’t want to give the impression that “hater” reviews are bad – I believe there should not be any homogeneity in reviewing. Rather, reviews should be biased towards the opinions of the reviewer; that’s why we read them.
BUT – those opinions only count when you actually READ the review, not when you just look at the score, which is all you get from the aggregator.
A score alone does not take into account the preferences of the reviewer – can you tell whether the 40% that a game received in that one review (the one that dragged down the overall average) was because the reviewer was a hardcore shooter fan who really just didn’t want to review that racing-sim? Or because that horror-death game was reviewed by an extreme moralist?
In general, reviews provide a percentage that is supposed to let the public judge which is better, Game A or Game B – but can you really compare a racer to a shooter to a puzzle game to a pony-sim?
Games can be dragged down by single elements that many would say “don’t really matter”. A game released on PS3 with graphics that look like a PS2 game will get marked down for that, even if it’s crazy fun – maybe only by 10% or so, but enough to push it out of the tiny “top” percentage. This forces a blockbuster mentality, whereby the only way to get good reviews is to spend more than the last game did, when what we should be asking is “are we having fun?”
Review percentages are also based on a NOW comparison: games are judged against the quality of other current releases. It might be fair to compare Blur with Split/Second, but how do you compare either of them to Ridge Racer, or to older titles? If a so-called classic game were reviewed now, it would be marked down accordingly – just look at some of the straight ports of arcade classics on Xbox Live Arcade. Those games rocked in their time; now they languish with 50-70% review scores.
So, in summary, there are too many reviews to read them all, so we have to aggregate.
But aggregation just gives us an average score. Not an aggregate opinion.
Given that, in our industry, a 70% score is regarded as mediocre at best, an aggregate scoring system is unfairly biased towards “hater” reviewers. What we need is a completely different approach to reviews – one that allows for crazy bias, one that allows for opinion.
Thankfully, the movie industry has already worked out such a review process: Rotten Tomatoes. Rather than averaging percentages, it simply classifies each review as “Fresh” (favourable) or “Rotten” (negative), and scores a film by the share of reviewers who liked it. So you get an opinion-based aggregation that focuses on entertainment value rather than arbitrary quality thresholds. And that’s all win.
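As a final sketch, here is how the two approaches diverge on the same set of (hypothetical) reviews – the 60% “favourable” cutoff is an assumption for illustration, not Rotten Tomatoes’ actual rule:

```python
# Rotten Tomatoes-style aggregation: count each review as simply
# favourable or not, and report the share of favourable reviews.

FRESH_THRESHOLD = 60  # assumed cutoff for a "favourable" review

def tomatometer(reviews):
    fresh = sum(1 for score in reviews if score >= FRESH_THRESHOLD)
    return 100 * fresh / len(reviews)

reviews = [80, 80, 80, 80, 90, 90, 40]  # one "hater" among seven

print(sum(reviews) / len(reviews))  # mean score:  ~77 - "mediocre"
print(tomatometer(reviews))         # tomatometer: ~86 - "most liked it"
```

The single 40% can drag the average into “mediocre” territory, but under a favourable/negative count it is just one dissenting voice out of seven.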