
Game Usability Testing for Indies: It’s Easier than you Might Think! (Part 1)

Tips on running usability sessions for your mobile game, from our indie studio. This is Part 1 of "Game Usability for Indies" and deals with hardware and software equipment. We give examples from A Clockwork Brain, our most recent game for iOS.

Maria Sifnioti, Blogger

February 8, 2012


I'm Maria, the Associate Producer for Total Eclipse, a game development studio based in Thessaloniki, Greece. During production of A Clockwork Brain (just released on the App Store), we designed and organised two usability sessions, despite having no formal experience.

I think I'm becoming addicted to writing article series! This is the first part of "Game Usability for Indies" and deals with the hardware and software equipment that we used.

[This article was originally published in the Total Eclipse blog]

Total Eclipse is a small studio, with a core team of five. Even though we’re small, we consider usability testing very important.

In the past, for three of our largest productions, we had a publishing agreement. The publisher was in charge of usability and beta testing for our games, with camera recordings, questionnaires, targeted player groups, the whole lot. We would get the videos and watch them as a team afterwards. I've got to tell you, especially during usability, those videos were heart-breaking more often than not, and not in a good way. That taught us how important usability is, and how crucial it is to test things outside our core team.

A Clockwork Brain Logo

In our studio, we also tested our games with friends and family, but in a much more informal setting: they played while we watched over their shoulders and kept notes. However, in the last two years we've turned to self-publishing, so we no longer have access to a publisher's usability perks. As a result, for our latest iOS game, A Clockwork Brain, we decided to design the usability session from scratch.

Prior to this, none of us had any working experience with formal usability testing. I myself had some experience in questionnaire design and experiment facilitation from previous university research work.

The research, design, and deployment of our usability session took one month from start to finish. We wanted to share what we learnt, and we hope you find it helpful. As there is a lot to talk about, we've split the material into several articles.

This first article discusses the hardware and software setup that we used.

Things we'll cover in this first part:

  • Equipment used (i.e. camera, device case).

  • Software for capturing video and sound.

Setting up cameras and making the usability sled

Usability Room Set Up

We wanted to monitor the players while they played: both their facial expressions and their actions in the game. This called for a two-camera setup. The first camera would face the player; the tricky part was finding a suitable second camera to face the device.

We discovered a number of very useful approaches. The first one is by Harry Brignull (of "90 Percent of Everything", a treasure trove of material on usability!), which shows how to make an iPhone usability sled for £5 using acrylic. The second, by Nick Bowmast, uses a similar approach. The last, by Belen Barros Pena, can fit a number of different mobile devices and is constructed from Meccano parts.

We decided to do something similar to Brignull and Bowmast. Equipment required:

  1. Hard, crystal-clear protective case for the device (depending on what you want to test; we had one for the iPod and one for the iPad). Costs around €5 on Amazon UK. While the other methods don't use cases, we thought a case would feel more natural to the player's touch.

  2. An acrylic (we call it plexiglass) strip (26cm x 3cm x 3mm, at around €2).

  3. A way to heat the plexiglass. The safest way is with a strip heater; we, however, used a blowtorch. Keep in mind that you need uniform heat, not direct flame, on the plexiglass, so use blowtorches with great care. Nick Bowmast ingeniously uses a toaster instead!

  4. A good-quality, lightweight webcam with a microphone, capable of being mounted on the plexiglass strip. We used a Microsoft LifeCam HD and were extremely happy with it (costs around €42). Its base can easily be removed and the camera attached to the strip.

  5. A second webcam with a microphone that would point towards the player. We already had a Logitech Orbit AF webcam (€66) in the office, so we used that one.

Tools of the Trade

After heating and bending the strip to the desired angles, we drilled the mounting socket for the camera. Once the LifeCam was securely mounted, the kit was extensively tested. When we were absolutely certain that the angle was correct and the LifeCam did not obstruct the player's view of the screen, we heated and attached the plexiglass strip to the clear case.

There are a number of benefits to such a sled setup. For starters, the distance between the camera and your device's screen is always fixed: once you set up the camera's focus, you get a steady video stream no matter how the user holds the device. Secondly, the camera's placement does not obstruct the player during the game, nor does it force them to remain in a fixed position; players are free to use the device as they normally would, whether left- or right-handed. The added hard plastic case makes the device feel even more natural, as gadgets like these are expected to have plastic cases. Finally, adding or removing the device from the sled is very easy, since it only requires detaching it from the plastic case.

An obvious downside to this setup is the extra weight (about 80-90 gr for the iPod/iPhone and 100-110 gr for the iPad). Throughout the sessions, however, none of the players raised an issue or showed any sign of tiredness. Here is how the weight is distributed:

  • Case: ~18 gr (iPod touch), ~30 gr (iPad)

  • Camera: ~50 gr

  • Plexiglass strip: 10 gr

Another downside is that you need a separate sled for each device you test (the plexiglass strip is glued to the device's case). Such a setup also requires you to know in advance which device you will use during usability; testing a game on a user's own phone when you don't have a case for that phone would be a problem.

We will now look at how we set up recording from the cameras.

The odyssey of camera recording

Device Sled Close Up - Speaker &



We wanted a formal usability setup, where the player would be alone with the game in a room, not hindered by our presence. Fortunately, we have a conference room in our studio which was perfect for this. We put a high-spec desktop PC there and connected it to the two cameras and speakers.

The next step was to write down requirements for our recordings. Making such a list is extremely useful: it makes you think more clearly about the whole procedure and forces you to set fixed rules about what you do and don't want from your usability session. This is what we considered mandatory for our session:

  • Good-quality, continuous video streaming from the usability room to where we watched.

  • Monitor both cameras at once (picture in picture).

  • Record voice as well as video.

  • Communicate in real time with the player through microphones. We didn’t want to barge in!

  • Reliable software – usability flow should not be disturbed due to technical reasons.

We tested a number of free and cheap webcam and screen-recording applications under Windows, such as Camtasia, AMCap, and VLC. We should also mention Silverback (for Mac), which we did not test but which has received a lot of good reviews.


None of that software fitted the requirements we had set, so we went on to trial Morae by TechSmith. We were quite hesitant, as it is a very expensive piece of software and completely outside our budget (€1,427). It is quite powerful, with a wide range of research features, but frankly we did not need most of them.

Morae is made up of three distinct components: Recorder, Observer, and Manager. The Recorder records the feed from the two cameras, or from one camera and a computer desktop. The recording can optionally be enhanced by tracking mouse clicks, handing out surveys, breaking the study into tasks, and keeping a log. The Observer lets you watch the Recorder feed remotely and start or stop recordings. Recorder files are saved in Morae's proprietary video format, which can then be opened and edited with the Manager.

Snapshot from Morae's live feed (slightly edited to look better!)

We found out that the Observer can save the live feed from the cameras locally in .wmv format. We also discovered that when you turn the sound off in the live feed, the streaming is of much higher quality.

So this was our resulting setup:

  • Install Morae Recorder on the PC in the usability room and set it up properly.

  • Use the default template configuration for “Hardware or Mobile Software device study”.

  • Remove everything related to task logging/mouse effects/studies/markers and surveys. We did not need any of that.

  • Have a Skype voice call running between the computer in the usability room and one of ours. This allowed us to hear any player comments and to talk to them if needed.

  • Install Morae Observer on another PC. Have audio-less streaming, and save the video in .wmv format (the saved video does include sound).

By doing this, we eliminated any direct need for the Morae Manager. The Recorder always saves in the Morae format, which we could not view, but the quality of the Observer's .wmv was good enough for our needs.

TechSmith sells the Observer and Recorder in a bundle separate from the Manager, at €333. You can see how this price is much more affordable for us!

Once the setup was confirmed, we ran a series of test sessions to ascertain the optimal light conditions and camera settings. These will of course differ per room and camera, but two things really helped: artificial light and a piece of black velvet cloth. During usability, we lit the room with artificial light set to a specific level, and the room's blinds were always shut, no matter the time of day. This way, we were always in control of the light level.

We also covered part of the recording area with black velvet. This piece of cloth will set you back about €15, but it is worth it: the dark background absorbs the light and makes your device recordings much better. Our velvet covered a 50cm x 50cm area, attached with Velcro strips to the wooden table. Participants were instructed to try to keep their movements within that area.

Conclusion

We hope you've found the first part of our usability design useful. In this article we explained how we decided which hardware and software to use for our usability sessions. If your application is not on a portable device like ours was, and you want to record from a desktop and a user-facing camera, things will probably be a bit easier. If streaming observation is not a big issue for you either, one of the free/low-cost software options will probably be just what you need.

Until this point, the cost for our usability setup is:

  • Hard, clear plastic case for mobile device: €5

  • Plexiglass strip (26cm x 3cm x 3mm): €2

  • High-quality, versatile webcam (Logitech): €66

  • Good-quality, small, mountable camera (Microsoft): €42

  • Black velvet cloth: €15

  • Software for camera recording & observation (Morae Observer + Recorder): €333

Total: €463
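As a quick sanity check of the total, the line items above can be tallied in a few lines of Python (the dictionary keys here are just our own labels for this sketch, not anything from the products themselves):

```python
# Tally of the usability-setup costs listed above (all prices in EUR).
costs = {
    "clear plastic device case": 5,
    "plexiglass strip": 2,
    "player-facing webcam (Logitech)": 66,
    "device-facing webcam (Microsoft)": 42,
    "black velvet cloth": 15,
    "Morae Observer + Recorder bundle": 333,
}

total = sum(costs.values())
print(f"Total: €{total}")  # → Total: €463
```

Keeping the budget in a small script like this made it easy for us to see how swapping one item (say, a cheaper webcam) changes the bottom line.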

Stay tuned for Part 2, where we will focus on the particulars of usability sessions, such as game specifics, participant recruitment, questionnaire design, session facilitation, and usability results.
