Darwinia+ - Deep And Dirty: Part 1

The one about mothers-in-law, Donald Knuth, the sewers of London and optimizing Darwinia+.

Byron Atkinson-Jones, Blogger

April 17, 2009


My mother-in-law used to do something that really annoyed me. Okay, admittedly, being a mother-in-law, there were lots of things she did that annoyed me, but this one didn't just annoy me, it really frustrated me at the same time.

Whenever you went into the kitchen, the most commonly used items would always be at the back of the cupboard. It didn't matter if you rearranged them to be at the front; they would always find their way to the back. So if you went to make a cup of instant coffee, the granules would always be at the back, which meant unpacking the cupboard to get at said granules, and once finished, repacking it again.

It turned what should have been a simple task into a logistical exercise. As somebody used to optimising processes and code, I found this deeply frustrating, and no matter what argument I put forward, it would always be the same when we came back another day. Some things just don't want to be optimised.

Introversion's Xbox Live Arcade game Darwinia+ is coming to the end of its development, and as usual Knuth's law has come into effect and we have to improve the performance as much as possible. But like my mother-in-law's cupboard, there are parts of this game that just flatly refuse to be optimised. I always liken this part of the job to working in the sewers of London – you are going to get deep and dirty dealing with other people's mess.

The first task was to pick a selection of games in the Multiwinia part of Darwinia+ and produce baseline results for comparison. These baseline files are generated by the code and consist of the current frames-per-second figure, written out to a file once a second. In theory all we need to do is repeat the same test under exactly the same conditions, and we can graph the changes the optimisations make.

It's not a fine-grained picture of the game, but it gives enough of an overview to see if anything is having an effect, and most importantly it is something that can be automated to a certain extent. This is precisely why Multiwinia was chosen: it is easier to rig its predictive physics system to play exactly the same game over and over again given the same player input, and to emulate that we just leave the controller alone and let the game play itself.
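
To make that concrete, here is a minimal sketch of the kind of once-a-second FPS logger described above. The file name and the dummy RunOneFrame() workload are stand-ins for illustration, not Darwinia+'s actual code:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    using Clock = std::chrono::steady_clock;

    // Dummy frame standing in for the real update + render, roughly 16 ms of "work".
    static void RunOneFrame() {
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    int main() {
        std::FILE* log = std::fopen("baseline_fps.txt", "w");
        int frames = 0;
        auto last = Clock::now();

        for (int i = 0; i < 300; ++i) {   // a few seconds' worth of frames
            RunOneFrame();
            ++frames;

            if (Clock::now() - last >= std::chrono::seconds(1)) {
                std::fprintf(log, "%d\n", frames);  // one FPS sample per line
                std::fflush(log);
                frames = 0;
                last = Clock::now();
            }
        }
        std::fclose(log);
        return 0;
    }

Each run of the same deterministic game then produces a column of numbers that can be graphed directly against the baseline.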

The second task was to pick which of the baselines was the worst performer, and in Darwinia+ that turned out to be the Multiwinia game Rocket Race on a 4-player map. This particular map is huge, and the game can run for an undetermined period of time depending on how well the player or AI is doing – all the rest of the games use a time limit. For some reason this particular map was dipping to below 8 frames per second, which is well below the target 30 frames per second.

In reality a more meaningful measure is the number of milliseconds a frame takes (at the target 30 frames per second each frame has a budget of roughly 33.3 ms, while a dip to 8 frames per second means each frame is taking around 125 ms), but that is something I will be measuring when I have to go deeper and start to look at isolated sections of the game. This is when I put on my overalls, hard hat and spotlight and prepare to go deeper and even dirtier.
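
Measuring frame time in milliseconds is only a few lines with std::chrono. This is a generic sketch, with a dummy sleep standing in for the frame's real work, rather than code from the game:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    using Clock = std::chrono::steady_clock;

    int main() {
        auto prev = Clock::now();
        for (int frame = 0; frame < 5; ++frame) {
            std::this_thread::sleep_for(std::chrono::milliseconds(33)); // dummy frame work
            auto now = Clock::now();
            double ms = std::chrono::duration<double, std::milli>(now - prev).count();
            prev = now;
            // The whole budget at 30 fps is ~33.3 ms; 125 ms per frame is 8 fps.
            std::printf("frame %d: %.1f ms\n", frame, ms);
        }
        return 0;
    }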

If I'd had my way early on, I would have changed the entire approach to optimisation. For my entire career I have had the following line from Donald Knuth quoted to me:

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil."

Or, more accurately and ironically, a small part of that line:

“Premature optimization is the root of all evil."

Nearly every lead developer I have worked for has used that quote as an excuse to leave the optimization of the game to the last few months, when everything is complete. In part I agree with the quote, but there's also a small part of my brain nagging away, screaming:

'But he wrote that in 1974, when CPUs and computers in general were far less complex than they are today – not to mention the amount of code they would have had to deal with!'

And you know what, I am probably going to get hanged, drawn and quartered for saying this, but that nagging scream is right: the world of coding has changed a lot since 1974. We now have processors with caches and branch prediction that can execute more than one instruction at a time, running at blistering speeds that would have made Knuth's head spin back in '74.

Our code bases are now measured in millions of lines; back in 1974 there simply would not have been enough storage space to keep them, let alone RAM in which to fit it all for compilation. So why do we still cling to this archaic notion? Because, having said all that, there is still some truth to it.

It's here where I differ from most people's interpretation of Knuth's famous line. Rather than leave optimization to the last few months of the project, do it early, on a module-by-module basis.

One of the great things about C++ is that it forces us to work in a very modular way (at least it should!), so what we end up with is a game made up of lots of loosely coupled modules. It should therefore be possible to treat modules as separate entities to be optimized on a case-by-case basis, with a final interoperability optimization later in the project lifecycle.
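
As an illustration of what per-module optimization can look like in practice, here is a minimal scoped-timer sketch; the class name, the UpdatePhysics() example and its workload are hypothetical, not Introversion's actual profiling code:

    #include <chrono>
    #include <cstdio>

    // Drop one of these at the top of a module's update function and it
    // reports how long that scope took when it is destroyed.
    struct ScopedTimer {
        const char* label;
        std::chrono::steady_clock::time_point start = std::chrono::steady_clock::now();

        explicit ScopedTimer(const char* l) : label(l) {}
        ~ScopedTimer() {
            double ms = std::chrono::duration<double, std::milli>(
                std::chrono::steady_clock::now() - start).count();
            std::printf("%s: %.3f ms\n", label, ms);
        }
    };

    // Example: timing one module's per-frame work in isolation.
    static void UpdatePhysics() {
        ScopedTimer t("UpdatePhysics");
        volatile double acc = 0.0;
        for (int i = 0; i < 1000000; ++i) acc += i * 0.5;  // stand-in workload
    }

    int main() {
        UpdatePhysics();
        return 0;
    }

With something this cheap in place per module, each module's cost can be watched as it is written rather than discovered at the end.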

I can't do this for Darwinia+ because the game has been out for quite some time in the form of Darwinia and Multiwinia on the PC. The game is modular, but given the quantity of code, going through each module one by one would be very costly in both time and money.

So all that is left is to run the game, identify the hot spots and do a deep analysis to work out exactly what is killing the game – is it CPU or GPU bound? So back to the sewer worker analogy and getting down deep and dirty…
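
One quick way to tell the two apart is to time the CPU-side work and the time spent blocked waiting for the GPU each frame, and see which dominates. The sketch below is a generic illustration under stated assumptions: the sleeps stand in for real work, and the second lambda stands in for a Present()/swap call that blocks until the GPU catches up.

    #include <chrono>
    #include <cstdio>
    #include <thread>

    using Clock = std::chrono::steady_clock;

    static double MsSince(Clock::time_point start) {
        return std::chrono::duration<double, std::milli>(Clock::now() - start).count();
    }

    int main() {
        // Dummy workloads standing in for the real engine calls.
        auto simulate_and_submit = [] { std::this_thread::sleep_for(std::chrono::milliseconds(5)); };
        auto wait_for_gpu        = [] { std::this_thread::sleep_for(std::chrono::milliseconds(28)); };

        for (int frame = 0; frame < 3; ++frame) {
            auto frame_start = Clock::now();
            simulate_and_submit();              // CPU-side work: game logic plus draw submission
            double cpu_ms = MsSince(frame_start);

            auto wait_start = Clock::now();
            wait_for_gpu();                     // stands in for the Present()/swap call blocking
            double wait_ms = MsSince(wait_start);

            // If most of the frame is spent blocked on the swap, the GPU is the bottleneck.
            std::printf("frame %d: cpu %.1f ms, gpu wait %.1f ms -> %s-bound\n",
                        frame, cpu_ms, wait_ms, wait_ms > cpu_ms ? "GPU" : "CPU");
        }
        return 0;
    }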

Unlike my mother-in-law's cupboard, this one WILL be optimized!
