What’s The Score?

Think back to last month. THQ was getting ready to release one of its biggest titles of the year, Homefront. It cost the publisher a considerable amount of money, had John Milius (of Apocalypse Now and Red Dawn fame) as its writer, and boasted an arguably interesting gameplay concept for its multiplayer mode. All was well for the industry giant until the reviews came pouring in. Scores weren't quite the mid- to high 80s that THQ were expecting, and by the end of 15th March 2011 the game had an aggregate score of 72 on Metacritic.

What happened next?  Wired UK reported that “that same afternoon, THQ’s stock fell from around $6 (£3.70) per share to around $4.75 (£2.95) per share”.

It's incredible to think that reviews were able to have such an impact on the company, especially considering that THQ had a review embargo in place right up until the game's US release date. But such is the power of video game reviews in today's world, it would seem.

What's more disturbing, however, is what came from everyone's favourite industry analyst, Michael Pachter (you know, that guy who makes all the ridiculous predictions about Nintendo), in response to the whole "fiasco". In the following quote, pay particular attention to what he considers "average":

“[THQ] had very high hopes for the game, promoted it a lot over the last few months, and most expected a score in the mid- to high 80s…Low 70s is only average, and typically means sales of 2 million or fewer for a new intellectual property. It could sell more with marketing support, but I think 2 million is a fair guess.”

That's right: the 70s, not the 50s, are considered average. What madness is this? Review scores have become increasingly inflated over the years. The number of games that achieve an 8, 9 or 10 from the biggest and most influential video game news websites is pretty phenomenal. In the past three months, IGN has awarded a score of 9.0 or higher no fewer than 15 times. Each of these games is supposedly "amazing". While I don't doubt that titles such as Crysis 2 and Final Fantasy IV: The Complete Collection are good games, are they so spectacular that they deserve a 9? I haven't heard a single person say anything special about Crysis 2 since its release, unless it was about the game's engine.

Reviews serve a key purpose: to inform you, the reader, whether or not a certain product is worth your money. In these cash-strapped times, a bad recommendation can be costly (or, at best, leave you losing out a bit on a trade-in). Are video games really getting that much better, or are standards slipping?

A Predicament A Bit Too Close To Home

Review scores have always been something of an issue here at Bits 'n' Bytes Gaming headquarters. Deciding on a scoring system that makes sense not only to the people reading our reviews but also to those writing them is tricky business. As you may well already know, we eventually opted for a simple tiered system, which uses our "Recommended" and "Mark of Excellence" labels. Games that don't receive a label aren't necessarily bad; they could well be pretty good depending on what you're looking for from a game. But we felt that our reviews would carry much more weight if people were able to make up their minds based on what we actually thought, rather than on what number we gave a game. After all, how do you gauge the difference between a 7, a 7.5 and an 8 anyway?

And herein lies the problem: there are multiple video game review websites out there, each with its own scoring system. How a website defines what constitutes a 10, or how bad a game must be to score a woeful 1, is entirely up to it. At the end of the day, a website is well within its rights to decide how it rates products, but this causes a considerable number of problems in the process.

An overall system that all websites subscribed to would still have its faults, but ultimately it would present less of a problem. Review aggregate sites such as Metacritic take the scores from the most prominent critic reviews and compile them into one big average score for a piece of software. Metacritic itself states that, if a scoring system differs from its own or no score is presented, it decides upon a score based on the "general feeling" evoked by the review. So as well as having a variety of systems across numerous websites, we also have an aggregate site that can potentially skew the score or the opinions presented in a review.
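To make that conversion problem concrete, here's a rough, purely hypothetical sketch (in Python) of the sort of translation an aggregator has to perform just to get every outlet onto the same 0-100 scale. The scales, mappings and numbers below are my own assumptions for illustration, not Metacritic's actual (and undisclosed) formula:

```python
# Illustrative only: an assumed mapping from a few common review scales to a
# 0-100 "metascore". Metacritic's real conversions and per-outlet weights are
# not public, so treat these numbers as made up for the example.

def to_metascore(score, scale):
    """Convert a review score on a known scale to a 0-100 value."""
    if scale == "out_of_10":       # e.g. 8.5/10 -> 85
        return score * 10
    if scale == "out_of_5_stars":  # e.g. 4/5 stars -> 80
        return score * 20
    if scale == "letter":          # a rough letter-grade mapping (assumed)
        grades = {"A": 95, "B": 85, "C": 75, "D": 65, "F": 50}
        return grades[score]
    raise ValueError(f"Unknown scale: {scale}")

# Four hypothetical reviews of the same game, each on a different scale.
reviews = [
    (9,   "out_of_10"),
    (4,   "out_of_5_stars"),
    ("B", "letter"),
    (7,   "out_of_10"),
]

converted = [to_metascore(score, scale) for score, scale in reviews]
print(converted)                        # [90, 80, 85, 70]
print(sum(converted) / len(converted))  # 81.25 -- one number hiding four rather different verdicts
```

Even in this toy version, a four-star review and a "B" only become comparable because somebody decided, fairly arbitrarily, what those marks are "worth" out of 100 – which is exactly where the skewing creeps in.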

Video game reviewers are a bit like a panel of judges – they may sit at the same table (the video game press industry) but do they all use the same marking criteria?

Now, this isn't Metacritic's fault – they're just trying to compile data about products, because that's what they do. But I imagine their job would be a whole lot easier if there were a standardised review system for all sites to go by. Furthermore, it would present the reader with a more accurate view of how good or bad a game actually is. What's the difference between a 10 from GameSpot and a 10 from IGN? Sure, I might be able to find out by referring to the review legend that each of those sites has, but can we honestly expect all readers to do that? Of course, a standardised system would still be entirely dependent on how reviewers used it. If they didn't assess games according to the agreed criteria and continued to pump out ridiculously disproportionate scores, you'd essentially end up with inflated scores across the board, and it would be down to the readership to "name and shame" poor reviews. That's not a reliable failsafe, though; the video game news readership is notoriously bad when it comes to things like user reviews, and the inner fanboy or fangirl in us can make us see past the faults of a game when we shouldn't.

As a result, it's still difficult for scores to mean what they're supposed to mean. A 7 is not average; it should mean something closer to "pretty good". With review scores being the complicated mess that they are, it's not hard to see why the team here at Bits 'n' Bytes Gaming eventually opted for a tier-based system. Even then, we realise it's not perfect, but we hope that by placing the emphasis back on the written content of a review, we will end up providing a more trustworthy and honest opinion.

Those Publishers Are Tricky Ones

If there's one thing that video game reviews truly have the power to do, it's to make publishers act stupidly. In a desperate bid to ensure their games sell by the million, publishers have been known to do some pretty silly things. In some instances, when things haven't quite gone their way, they've reportedly had writers fired for giving bad reviews. In others, their own employees have taken rash action to secure good sales without thinking about the consequences. Plenty of publishers have attempted to secure positive reviews for their products through deals with video game press outlets. One of the most notorious cases in recent times was the fiasco between Ubisoft and German video games magazine Computer Bild Spiele over Assassin's Creed II.

If your memory is a bit hazy: Ubisoft reportedly told the team at Computer Bild Spiele that they could only have an early review copy of the game if they guaranteed it a "sehr gut" ("very good") mark. Declining the copy would have meant the difference between the review appearing in the magazine's next issue or the one after – a potentially costly delay. Destructoid's Jim Sterling summed up most people's thoughts on the issue back when it happened in August 2009:

“Of course, asking an outlet to guarantee something like that before they’ve played the game is asking them to trade in every ounce of integrity they’ve worked to build, and if these allegations are true, it’s a shame that Ubisoft felt the need to barter against someone’s self respect like that.”

Thankfully, Computer Bild Spiele didn’t give in.  But it does make you wonder how many press outlets out there will actually trade some integrity in order to get that quick scoop.

The Final Verdict

It's quite astonishing to think how much of an impact review scores truly have. What seems like a simple number listed next to a game to indicate how good or bad it is turns out to be something far more complex and worrying. People visit video game news websites and read magazines expecting trustworthy and honest opinions, but that's not necessarily what they're getting. We have non-standardised scoring systems that are all over the place, and some devious publishers seeking to undermine the review process in order to make a profit. And when reviews impact their sales so considerably, you can almost understand why they do it. The example about THQ that I gave at the start of this article wasn't a joke. In the current economic climate, a game that is a colossal failure can result in the closure of studios and the loss of jobs.

However, no matter how dire the worst-case scenario may be, it doesn't excuse these actions. If publishers want their games to sell well, the solution is pretty obvious: make better games. There's a reason you didn't see Valve jumping through hoops to make sure that Portal 2 was a critical success; they simply made a brilliant game. If more developers went about their business in a similar fashion, not only would the consumer benefit a lot more, but the review system would probably also fall back into proper alignment, as the games at the very top would contrast more clearly with those towards the middle and bottom.

My sentiments are perhaps a little too idealistic in parts, but if a universal review system were ever seriously proposed, I know I'd be one of the very first people to support it. In the meantime, I urge all my colleagues and peers in the video game news industry, whether they're IGN or a small blog, to look at their own review systems and work out what they actually mean. If you can't clearly distinguish your criteria from the outset, or if your "average" category starts at 7, then a lot of reworking is in order.


