Next Generation Graphics vs. The Console Business

Has the constant striving for increased graphical fidelity made the console business into its own worst enemy, and can it survive another generational leap at a time when studios are struggling to meet the budgetary requirements of the current generation?

Tom Battey, Blogger

June 11, 2012

I remember when I decided that videogames didn't need to look any better. It was 2001, I was twelve years old, and I was hunched over a beige PC in the school library watching clips of Final Fantasy X on some precursor to YouTube. I watched Yuna summon Valefor and I thought to myself: "Wow. This is it. This is as good as videogames will ever need to look."

Eleven years later, I still largely stick by that statement. The PS2 era will always represent, for me, a pinnacle moment where graphical fidelity and the imagination of game creators perfectly overlapped.

This was the first time that designers could create in-game models that actually looked human. Their faces were no longer textured on; they had real (digital) lips that could be synced to dialogue tracks, and eyes that could move and express emotion. There was no longer any need to cut away to pre-rendered cutscenes for narrative impact - these videogame people actually looked like people.

Games like Final Fantasy X/XII and Metal Gear Solid 2/3 are standout examples that spring to mind. Their predecessors featured boxy approximations that required some imagination to appear human, but the PS2's extra graphical processing power could render emotive characters, characters that could actually tell a story.

At the same time, PS2 tech wasn't so prohibitively expensive to develop for that publishers couldn't afford to take risks. For a time, we had studios developing games with a relatable visual fidelity and the creativity to experiment, and we got games like Psychonauts, Okami, Dark Cloud and Killer 7, games that would struggle to be green-lit in our modern climate.

The tech-jump to the current console generation has caused game budgets to balloon to a scale that has proven prohibitive to creativity. Much of this has to do with the cost of generating the complex, high-resolution assets that have become the required standard.

This increase in budget has caused publishers to become increasingly risk-averse. With game budgets sometimes stretching into the hundreds of millions, anything but a chart-topping success can lead, and increasingly has led, to studio closures. As a result, games have become 'safer', more derivative and less experimental. I can't provide the figures to prove it, but I know that we've seen far fewer Okamis and Killer 7s on HD consoles.

And now Epic and Square Enix have revealed their visions of what next-generation videogames are going to look like. And while a small part of me still gets a little giddy at the thought of all those shiny new games, a greater part of me worries about the consequences of another tech-leap forward at a time when companies are struggling to meet the budgetary requirements of the current generation.

I'm not one of those people who believes that the console business will be erased entirely by the rise of mobile and tablet gaming. However, I consider it entirely possible that the console business could end up stagnating itself into obsolescence by demanding the sort of budget that no one bar four or five huge AAA publishers can afford.

Kotaku's Stephen Totilo has written a good piece on the advancements put forward by Epic's Unreal Engine 4, an engine that is likely to be fairly ubiquitous on next-generation consoles, if the success of Unreal Engine 3 is anything to go by.

While some of the points raised are expected graphical fluff - particle effects are of negligible actual importance to game development - it does seem that Epic have identified the issue, and are attempting to combat the inevitable next-gen budget-hike with their new technology.

Having real-time global illumination as standard, for example, will mean less time spent on creating elaborate lighting systems. That saves money, and is only one of many alleged 'game-changing' features. Perhaps the most important addition is the ability to edit game code in 'real time', without having to recompile in order to prototype. A seemingly simple feature like this could save hundreds of hours per game project, and reduce the required budget accordingly.

This is the kind of thinking developers need to take forward into the next console generation. Next-gen console business needs to be agile and accessible. We cannot afford a year-long post-launch dearth of creativity while everyone tries to get to grips with the technology. Every hour wasted on crafting a particularly luscious dirt texture is an hour in which developers will be losing custom to the ever-stronger mobile and social markets.

In short, we can't afford to attempt the traditional generational leap this time around. An increase in graphical fidelity alone won't sell a new generation of consoles. In fact, the budgetary inflation it would cause could endanger the whole AAA game business. We need a business model in which smaller studios can compete with the massive publishers; otherwise complacency and stagnation will drive customers away to the point where no console games are financially viable.

If Epic's technology, and similar efforts from other studios, can reduce the cost of making games while increasing their scope and vision at the same time, then that's great. That's the sort of progress we need. But it's possible that the constant striving for ever-fancier graphics has turned the console business into its own worst enemy. We'll begin to see whether or not this is true in the next two or three years. While I'm waiting, I might play Final Fantasy X again. Rather that than XIII-2, at any rate.
