Usability and the Gaming Industry

19th February 2010

Chris Rourke, Managing Director of User Vision, discussed usability factors in the gaming industry with Dave Cook, games journalist for games™ magazine. The article appears in the latest issue of the magazine, out Friday 19th February 2010.

The Interactive Lounge at User Vision, used for testing a number of devices including games consoles.

With Microsoft gearing up to stake a claim in the great motion-control arms race, many questions still hang over Natal and the future of usability. Will it be the first step towards realising the pipe dream of symbiotic gameplay, or will it fall foul of shovelware and broken promises? With this in mind, games™ speaks with some of the brightest minds already pushing user interaction forwards, across the next decade and beyond.

We may still be devoted to mouse, keyboard and pad, but the way we interact with videogames has radically changed. Look back at Apple II text adventures like Zork and Hitchhiker’s Guide To The Galaxy – pages upon pages of blocky text offered no aesthetic wonder, but the quality of the writing drew you into an exciting world as vivid as your imagination. With no visual signposts to subtly guide your actions, however, navigating these games required perseverance, nimble fingers and an extremely high tolerance for the phrase ‘Do Not Understand.’

Progression in a text adventure required the player to probe the limits of the game’s vocabulary; learning its language was all part of the experience. This is an early example of the mantra of usability at work. The idea is that in order to make interaction compelling, the control or navigation system must match the experience. Rock Band quickly becomes boring when played on a control pad, but feels vital with a plastic guitar in your hands. Complicated menu screens would frustrate in the high-octane world of Unreal Tournament, yet become a harmonious necessity at the stately pace of BioWare’s recent Dragon Age: Origins.

“Usability by the strictest definition is: is it usable?” explains Chris Rourke, managing director and founder of User Vision, one of the leading usability testing facilities in Europe. “You really have to make a distinction between usability and user experience. For example, can a person use something effectively to meet a goal? The result is always interesting, but this is a blunt instrument… you can have something that, yes, it can be used to meet a goal, but the user didn’t really feel in control or enjoy the process, and this reflects part of the user experience.”

While control methods must be tailored to each new game to make it usable, heads-up displays and the information relayed back to the player must also work in tandem with control mapping. Early HUD integration served as a go-between, conveying vital information to the user such as life count, inventory, health bars and signposting devices. Today, these displays offer a much wider range of information, and just as poor control can hinder a title, so too can an overabundance of data or sloppy delivery.

“This really takes into account capability through limitation,” Rourke admits, “especially in a very dynamic, fast-paced game where we may have to assimilate five or six pieces of information flashing up onscreen at once. This is something we have to carefully account for while testing usability, and the key is that you want the game to be challenging but never overwhelming. Gamers have probably all come across this frustrating experience and it is obviously quite inefficient, to the point that players start to ask: ‘Do I really feel like I’d want to play that again? Did I feel in control of the game?’”

Balancing the user experience of digitally controlled titles is an intriguing process (See ‘Marks Out Of Ten’), but the advent of motion control has made the job significantly more unpredictable, particularly in the case of Wii, which has widened the gaming demographic significantly in recent years. “The random elements of touch control, such as gauging how a person will move with a Wiimote, can make the testing process harder,” explains Clare Barnett, user experience consultant at User Vision. “This also depends on the actual target users. If you’re looking at Wii then you can see that it’s trying to embrace younger and older generations. But if you look at Xbox 360 and PlayStation 3, the games are more about the graphics, among other things. Every time we test a game, we have to bear in mind what the target audience is. There would be a low-level, generic way of going through the tests, but you would eventually have to adapt the test to the specific platform.”

Marks Out Of Ten

USER VISION’S EDINBURGH-BASED usability testing lab looks more like a plush household lounge than a bustling room full of beakers and flashing electronics – in fact, it’s referred to as an ‘Interactive Lounge’. The idea is to keep test users comfortable by replicating the kind of environment they would usually inhabit when playing a game. When testing the balance between immersion and ease of play, setting the correct tone is the only sure way of collecting accurate data.

Using a collection of cameras seamlessly rigged up inside the lounge, the team can assess ergonomic data based on how the player is sitting or holding the controller, as well as any visible frustration or lack of interest they display. Eye tracking can also be applied to gauge the effectiveness of menu systems and other onscreen information. For example, did the player notice key elements of the HUD or vital signposting devices in a level, or did they become hopelessly lost and perhaps even frustrated?
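To make the eye-tracking idea concrete, here is a minimal sketch of one common analysis: measuring how much gaze time lands inside an ‘area of interest’ (AOI) such as a HUD element. The article does not describe User Vision’s actual tooling, so the function, coordinates and sample data below are illustrative assumptions only.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float           # gaze position in screen pixels
    y: float
    duration_ms: float

def aoi_hit_share(fixations, aoi):
    """Fraction of total fixation time spent inside a rectangular AOI.

    aoi is (left, top, right, bottom) in screen pixels.
    """
    left, top, right, bottom = aoi
    total = sum(f.duration_ms for f in fixations)
    inside = sum(f.duration_ms for f in fixations
                 if left <= f.x <= right and top <= f.y <= bottom)
    return inside / total if total else 0.0

# Hypothetical session on a 1920x1080 screen: did the player ever look at
# the ammo counter in the bottom-right corner?
ammo_counter = (1700, 980, 1900, 1060)
session = [Fixation(960, 540, 400),    # centre of the action
           Fixation(1750, 1000, 120),  # a glance at the ammo counter
           Fixation(900, 500, 300)]
print(f"{aoi_hit_share(session, ammo_counter):.0%} of gaze time on the ammo counter")
```

A low hit share on a vital HUD element would flag exactly the kind of missed signposting the paragraph above describes.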

After each session, which can last anywhere from 30 minutes to an hour and a half, users are asked to discuss their experience with the team and offer any suggestions that could improve ease of control. For longer studies, users keep a diary of their time spent with the game, allowing the team to paint a larger picture of the user experience across a designated time frame.


The work carried out at User Vision and similar companies is necessary to gauge the effectiveness of a particular device or piece of software. This process is pivotal to the evolution of new technologies, as it provides the feedback developers need to learn from past mistakes and consider the next logical step. As consoles embrace the notion of augmented reality gaming through devices like PlayStation Eye and Natal, ensuring ease of play becomes increasingly demanding.

Developed by Sony’s London studio, EyePet is a key example of a highly sophisticated augmented reality user interface at work, but one that requires a great deal of stringent testing before it can be shipped. When you consider that the game could potentially be played in any number of households with different lighting and room layouts, both the game and the PlayStation Eye peripheral must be guaranteed to function effectively, irrespective of where they are being used.

Add to this the unpredictability of each individual player’s motions or gestures and it becomes clear just how fine the line between success and failure can be. “One of the golden rules that we’ve always had within the group is that you only design an interface if it makes doing an operation better,” says Mark Lintott, tech director of the EyeToy Group. “Usability testing is a very intense part of our development. We spend a lot of time on this, both in people’s homes and in our studio. We targeted everyone, from our own kids early on, to going out to homes and letting families experience EyePet so that we could observe how they interacted with the game. We wanted to document many things, such as how people used the magic card or how people navigate in a 2D or 3D space, and this actually changed a lot of the game’s design. There were many things that we thought would simply work, when actually it was the other way round.”

The World Is Our Playground

IF PERFECTED, THE convergence of real and virtual worlds could result in an unmatched state of immersion. While alternate reality games and flash mobs are fitting examples of people participating in an orchestrated game in which they are the player, they do not fall into the same category as console or PC gaming.

Launched in 2000 by Professor Bruce H. Thomas at the University of South Australia’s Wearable Computer Lab, ARQuake was the first example of a fully functional augmented reality game that takes an existing IP into a real-world setting. The game combines id Software’s Quake with GPS and orientation technology, as well as a head-mounted display, light-gun controller and a wearable laptop. The technology enables players to move around outdoors and fend off alien attackers, taking stock of inventory, health and ammo count using a HUD made instantly visible by the headset device.

Naturally, the science behind the technology is incredibly advanced. Starting out with their university campus, the team created a wire-frame version of the various structures and areas on site, then gradually dropped in enemies and synced this virtual level with the real-world setting to ensure the two were perfectly aligned. The results are fascinating, and research into perfecting the technology continues to this day.
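As a rough illustration of the registration step described above, the sketch below converts a GPS fix into a local east/north frame anchored at a surveyed origin, then rotates virtual object positions by the headset’s compass heading so they can be drawn relative to the player. This is not ARQuake’s actual code; the approximation, constants and names are assumptions made for the example.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Approximate east/north offset (metres) from a surveyed origin.

    An equirectangular approximation is adequate over a campus-sized area.
    """
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north

def world_to_player(obj_east, obj_north, player_east, player_north, heading_deg):
    """Express a virtual object's position relative to the player's view.

    heading_deg is the compass heading from the orientation sensor
    (0 = facing north, 90 = facing east). Returns (right, forward) in
    metres, ready to hand to a renderer.
    """
    dx = obj_east - player_east
    dy = obj_north - player_north
    h = math.radians(heading_deg)
    right = dx * math.cos(h) - dy * math.sin(h)
    forward = dx * math.sin(h) + dy * math.cos(h)
    return right, forward

# Hypothetical check: an enemy 10 m north of the player appears dead ahead
# when the player faces north, and off to the left when facing east.
player = gps_to_local(-34.9100, 138.6010, -34.9105, 138.6010)
enemy = (player[0], player[1] + 10.0)
print(world_to_player(*enemy, *player, heading_deg=0.0))   # ~(0.0, 10.0)
print(world_to_player(*enemy, *player, heading_deg=90.0))  # ~(-10.0, 0.0)
```

Keeping the virtual wire-frame aligned with the real campus amounts to running this kind of transform continuously as the GPS and orientation readings update.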

To increase user immersion in EyePet, the team also strove to make the pet as believable as possible, giving it clear facial expressions and ensuring that it responds accurately to the player’s touch. The technology behind the game is leaps and bounds ahead of the original EyeToy series, but ensuring that each new instalment provides both ease of play and an immersive experience is the key to progressing the medium further.

Only when a developer has a firm handle on the technology, and the art of creating games to fit that technology has been nailed down, can further advancements be made. “I think the ambition going forward is to cut back on HUDs and I think we did that in the way EyePet communicates,” adds Russell Harding, game director of the EyeToy Group. “This comes from the pet having quite a humanistic face so that we could deliver a lot of visual expression, but we would definitely like to move away from HUDs completely.

“This would make games increasingly immersive and feel much more real. You can already see these advances in games like Killzone 2 and Call Of Duty, which are already fairly HUD-less, and this is all to do with feedback. For example, in Call Of Duty you don’t have a health bar as such, but when you get shot your screen turns red and you start to breathe differently, giving you visual feedback that isn’t necessarily all text and graphics, but is something that is almost emotional.”
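The health-feedback idea Harding describes can be sketched in a few lines: damage drives a red screen tint and heavier breathing rather than a numeric bar. This is a minimal, hypothetical illustration of the technique, not Call Of Duty’s implementation; every value and name here is an assumption.

```python
def damage_feedback(health, max_health):
    """Map remaining health to a red-tint strength and a breathing rate."""
    hurt = 1.0 - max(0.0, min(1.0, health / max_health))
    red_vignette_alpha = hurt ** 2        # stays subtle until badly hurt
    breaths_per_minute = 15 + 25 * hurt   # from calm to laboured
    return red_vignette_alpha, breaths_per_minute

# At 30% health the tint is pronounced and breathing audibly quickens.
alpha, bpm = damage_feedback(30, 100)
print(f"vignette alpha {alpha:.2f}, breathing {bpm:.0f} breaths/min")
```

The squared curve is one plausible design choice: it keeps the effect nearly invisible at high health so the feedback only intrudes when the player genuinely needs to react.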

Relaying essential data such as state of health, ammo count, mood and inventory without falling back on textual conveyance is another key step towards delivering true symbiosis in gaming. Stargazers may pine for a future that allows players to interact with virtual worlds converged with, or indistinguishable from, our own (See ‘The World Is Our Playground’), but there are still many obstacles tied to usability that the videogame industry must first overcome.

Enter Paolo Barone, academic evangelist at Microsoft and a chap with a knack for predicting future technological advancements. “I would agree that usability is becoming more symbiotic with gamers,” he replies. “That’s where the real innovation is in the gaming industry. We will still see games being created within the existing paradigm of using a controller, but moving forward I think developers will be in pursuit of a better experience for gamers.

The industry is always learning and I can see the role of the interaction designer being increasingly brought into the development process. Compare this to ten years ago when this job didn’t even exist, and I definitely see more attention given to making games easy to use and making them more immersive, rather than the pure quality of graphics, because at the end of the day, the most successful games ever are those that are easy to play.”

Predicting the state of user interaction as far ahead as 2019, Barone insists that there are many key advances to consider beyond the way we interact with technology and the way that technology responds to our commands. He also places a strong emphasis on the future of digital distribution and browser-based streaming services as contributors to total immersion in gaming. “I see the potential for streaming not just games, but complete applications, and that every screen in the household will potentially become a place in which we can do anything,” he states. “The idea is that I will be able to sign in to a screen via biometrics or fingerprint scan and decide if I want that particular screen to be my television, personal computer, audio streaming device or, of course, games console. This is a long-term vision, but I think that this notion of mobile computing is where we are heading. You can even see now that we are using applications on the cloud, but I still see the need for having local hardware for a while, as we are still not even close to being able to fully stream entire Xbox 360 games.”

Barone also has faith in Project Natal as the next milestone for immersive gaming, but he also champions Nintendo for its work on the Wii, particularly its internally developed titles. Built from the ground up with the possibilities and limitations of the Wiimote firmly in mind, games like Wii Sports Resort and Super Mario Galaxy boast fluid and incredibly responsive controls, while third-party titles such as Tiger Woods 10 are ported, then retro-fitted to feature motion controls.

My Virtual Friend

DR. LESLIE BALL’S research into emotionally intelligent AI at Abertay University could have far-reaching implications for the games industry, not only in the way we interact with games, but also in the way they react back. In the same way that Milo has been pitched as an AI character that reacts to your tone of voice and expression, and retains and uses information absorbed from the player, Abertay is looking to go one step further and essentially replicate a living entity within an interactive virtual world.

“Emotion is fundamental to most, if not all, human experiences,” explains Dr. Ball. “It affects decision making, perception and learning, and essentially we are looking at improving the emotional intelligence of machines – the ability of computers to reason, relate and ‘be clever’. While the primary role for this research lies in the field of entertainment, the possibilities for its use in the longer term are immense. In essence, if we imagine this technique being used successfully in massively multiplayer internet games such as World Of Warcraft, players could be communicating with each other, as avatars, from their living rooms on different sides of the world, using only facial expressions.”

The application of such sophisticated avatars within MMO titles would be a massive leap towards realising the possibilities of truly immersive virtual worlds. Time will tell if Dr. Ball’s vision can become a reality, but the launch of Milo could perhaps serve as a taste of what will follow.

Indeed, there is widespread concern that Natal may follow suit and become smothered by shovelware and tacked-on features.

“We will probably see some game studios building Natal titles from the ground up with the tech in mind, and I really look forward to this as it will really take advantage of all the possibilities of the technology. You can probably implement Natal into existing games like Project Gotham Racing and Forza, but that could end up being another gimmick. I’ve encountered so much of this on the Wii and it’s very disappointing. You can imagine that Natal has so far been under wraps, but from my experiences of live demonstrations it is very impressive. You may have also seen the unveiling at E3 last year and wondered if the presentations were real. I can safely confirm that, yeah, they were.”

From the launch of Natal and beyond, it will be interesting to see just how well gamers respond to the notion of controller-free gaming. Will it provide a state of heightened immersion, or will it fall short of expectations? If Peter Molyneux is to be believed, then Milo & Kate is Natal’s real ace in the hole, delivering user interaction of a kind we have never seen, with the potential to spur more ambitious and advanced projects.

Whether or not this proves to be the pinnacle of decades of accumulated usability testing and technological advancement remains to be seen, but it has already garnered a great deal of scepticism. Barone muses: “Will Milo be a successful concept if it evolves enough? Personally, I’m not 100 per cent convinced. It is a really interesting piece of work and I am looking forward to seeing what will come out of it, but going forward…”


As academic research continues into the field of augmented reality, the future of usability and immersion evolves apace. The notion of being able to play within wholly believable worlds and interact with an indigenous population of intelligent, culturally unique NPCs that recognise your facial expressions, gestures and tone of voice may appear like science fiction run amok to some. However, this year will see the launch of new research conducted by Dr. Leslie Ball, an expert in artificial intelligence at Dundee’s Abertay University (See ‘My Virtual Friend’).

Culminating in the development of an avatar that interacts with users on a very human level, the research, it is hoped, will advance the believability and intelligence of virtual beings to an unprecedented degree. With the fundamentals of usability firmly at its core, the project could have major implications for the future of immersion and usability in games.

Will gamers readily embrace these advancements in years to come, or will the notion of converging virtual and real world settings be viewed as intrusive? Till receipts from Natal sales over the 2010 holiday period will prove telling, but Milo & Kate still feels as likely to be the next step in user interaction and immersion as it is to fall by the wayside like Sony’s Sixaxis pad. “I always believe that the vast majority of people will still want to interact with actual people rather than a simulation,” Barone concludes. “Then again, it could be a matter of us getting used to new things. Everybody’s scared of change.”


The estimated return on investment (ROI) for usability engineering is somewhere in the 10:1 to 100:1 range.

Source: Compuware, ‘Usability Is Good Business’.