Game Design 2: Expert Evaluation of User Interfaces (Presentation Transcript)
2011 Game Design 2, Lecture 6: Expert Evaluation
http://www.comu346.com | firstname.lastname@example.org
Expert Evaluations & Design / Usability Heuristics
Will look at:
• The need for alternatives to end-user evaluation
• Methods of evaluating without end users (using expert evaluators)
• Some heuristics / guidelines offered by experts
End-User Evaluations
• End-user evaluations can be expensive
  – The methods are very time consuming
  – Users may not be willing
  – Getting truly ‘fresh’ eyes (so-called “kleenex” testing) requires different players each time
• Concerns about leaks
  – Few external play testers at early stages
  – Friends & family play testers may be too kind
Expert Evaluations
• As an alternative to some user testing, expert evaluators / testers can be used
• Faulkner details 10 inspection methods; we will look at two:
  – Cognitive Walkthrough
  – Heuristic Evaluation
Cognitive Walkthrough
• In this approach, experts imitate users
  – Relatively quick and cheap
  – The expert needs to be skilled, and requires:
    • A description of users (e.g. level of experience)
    • A description of the system (or an operational system)
    • A description of the task to be carried out
    • A list of the actions required to complete the task
Cognitive Walkthrough
• The expert addresses questions such as:
  – Is the goal clear at this stage?
  – Is the appropriate action obvious?
  – Is it clear that the appropriate action leads to the goal?
  – What problems (or potential problems) are there in performing the action?
• It is essential that the expert tries to think like the end user, not like themselves.
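The inputs the expert needs (user, system, and task descriptions plus an action list) and the four per-action questions can be captured as a simple record. A minimal sketch; the class and field names are illustrative, not from the lecture:

```python
from dataclasses import dataclass, field

@dataclass
class WalkthroughStep:
    """One action in the task, with the expert's answers to the four questions."""
    action: str
    goal_clear: bool          # is the goal clear at this stage?
    action_obvious: bool      # is the appropriate action obvious?
    progress_evident: bool    # is it clear the action leads to the goal?
    problems: list = field(default_factory=list)

@dataclass
class CognitiveWalkthrough:
    """The inputs the expert requires, plus the recorded action steps."""
    user_description: str
    system_description: str
    task: str
    steps: list = field(default_factory=list)

    def problem_report(self):
        """Collect every problem noted across the walkthrough, with its action."""
        return [(s.action, p) for s in self.steps for p in s.problems]
```

Recording answers per action (rather than overall impressions) is what keeps the walkthrough honest: every "no" or noted problem is tied to a specific step the user must perform.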
Consider Solium Infernum
• A scenario: the goal is to go to war and capture a building
  – Is the appropriate action obvious?
    • What if no vendetta is in place?
  – Is it clear that the appropriate action leads to the goal?
    • Does the game help you?
  – What problems (or potential problems) are there in performing the action?
    • If you haven’t got vendetta status, how do you know?
• I know how to do this, but thinking as a ‘user’ it’s not so easy.
Heuristic Evaluation
• Involves assessing how closely an interface or system conforms to a predefined set of guidelines or heuristics.
• Examples:
  – Nielsen’s usability heuristics
  – Shneiderman’s eight golden rules
  – Norman’s seven principles
Nielsen’s Usability Heuristics
• Give feedback – keep users informed about what is happening
• Speak the user’s language – dialogues should be expressed clearly, using terms familiar to the user
• User control and freedom – clearly marked exits and undo/redo
• Consistency and standards
• Prevent errors – even better than having good error messages
Nielsen’s Usability Heuristics (continued)
• Minimise memory load – recognition rather than recall
• Shortcuts – accelerators (unseen by novices) speed up interactions for experts
• Aesthetic and minimalist design – don’t include irrelevant or rarely needed information
• Good error messages – should indicate the problem and explain how to recover
• Help and documentation – should be concise and easy to search
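In practice, a heuristic evaluation yields a list of findings, each tied to one heuristic and given a severity rating (Nielsen uses a 0–4 scale, from "not a problem" to "usability catastrophe"). A minimal sketch of merging findings from several evaluators, with illustrative names; the merge-by-mean approach is one common convention, not prescribed by the lecture:

```python
from collections import defaultdict

# Nielsen's 0-4 severity scale
SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophe"}

def merge_findings(per_evaluator):
    """Combine findings from several independent evaluators.

    per_evaluator: list of dicts mapping (heuristic, problem) -> severity (0-4).
    Returns each unique problem with its mean severity, worst problems first.
    """
    ratings = defaultdict(list)
    for findings in per_evaluator:
        for key, severity in findings.items():
            ratings[key].append(severity)
    merged = {key: sum(vals) / len(vals) for key, vals in ratings.items()}
    return sorted(merged.items(), key=lambda kv: -kv[1])
```

Evaluators inspect independently and only then aggregate; pooling ratings afterwards avoids one evaluator anchoring the others.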
Norman’s 7 Principles
1. Use both knowledge in the world and knowledge in the head.
2. Simplify the structure of tasks.
3. Make things visible.
4. Get the mappings right.
5. Exploit the power of constraints.
6. Design for error.
7. When all else fails, standardise.
Shneiderman’s Heuristics (8 Golden Rules)
1. Strive for consistency
2. Enable frequent users to use shortcuts
3. Offer informative feedback
4. Design dialogues to yield closure
5. Offer error prevention & simple error handling
6. Permit easy reversal of actions
7. Support internal locus of control
8. Reduce short-term memory load
(Faulkner, Chapter 7)
How Many Evaluators?
• Different people find different problems.
http://bit.ly/heuristichowto
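Because different evaluators find different problems, adding evaluators helps, but with diminishing returns. Nielsen models the proportion of problems found by n evaluators as 1 − (1 − λ)^n, where λ is the share of problems a single evaluator finds (around 31% on average in his studies). A quick sketch; the 0.31 default is Nielsen's published average, not a figure from this lecture:

```python
def problems_found(n, discovery_rate=0.31):
    """Expected proportion of usability problems found by n evaluators,
    using Nielsen's model: 1 - (1 - lambda)^n."""
    return 1 - (1 - discovery_rate) ** n

for n in (1, 3, 5, 10):
    print(f"{n:2d} evaluators -> {problems_found(n):.0%} of problems")
```

With this λ, five evaluators already uncover roughly 84% of problems, which is why small expert panels (3–5 evaluators) are the usual recommendation.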