Comu346 lecture 6 - evaluation
  • Sorry I couldn't be there. I'll be back on Wednesday and at the classes on Thursday, but Romana kindly agreed to help out - so thanks, Rom! Today we are going to have two short lectures, but before we do, a word on your coursework. This week we are again going to spend the tutorials looking at Solium Infernum. I might move on from this as of next week, so please try to get to the point where you REALLY know this game. You can play a single-player game in half an hour or an hour, and you need only play multiplayer in order to appreciate how it works - mostly the single-player experience is comparable.
  • Here at GCU, we always harp on about the value of user testing. We talk about how you are not designing for yourself - you're designing for a particular type of person. And of course, that is true - but there are some areas where it makes sense to involve experts instead. So we're going to talk about why that might be. We're then going to look at a couple of techniques that you can use to perform an expert evaluation, and consider some heuristics.
  • As I said, we are believers in user-centred design here. After all, you look at the game very differently from the user - you can't help it. If you ever sit a newbie in front of one of your IP projects or a game jam game, you'll wince when you see how they struggle to use it - so we recognise the importance and value of involving users. However, there are problems: it's expensive; some users are funny about using something that is not finished; it's hard to get non-hardcore players interested; it takes tonnes of time to recruit and then host; 'Kleenex' testing is a lot of work and is expensive (plus it carries risks); and the industry is paranoid about leaks. Expert evaluation also saves you waiting until THE VERY END to see if the design is any good.
  • You have to decide about your user's level of experience - use the example of the crates with e-Bug: kids vs scientists. The description of the user is similar to, but different from, personas - you need to know what they know. The description of the system can be a menu plan, a paper prototype, etc., or a working system! The description of the task comes with a list of the actions required.
  • Do they know the word 'heuristic'? It's a rule of thumb. Note that the method doesn't define how you GET the list - you are comparing the interface to the heuristics. There are no real solid game UI heuristics yet, so we have to adapt from general software design.
  • Feedback: the Tomb Raider puzzles. User language: no jargon. User control & freedom: see that this isn't necessarily game-specific - in SI, you can cancel orders before submitting. Standards: talk about the first year who mentioned HUDs not changing in 15 years - it's easier to go with standards than to retrain users.
  • Memory load: stem and leaf; 7 items ± 2. Shortcuts: SI has one for opening the main menu; in RTS games, WoW, etc. on Windows you barely touch the mouse - on Mac, not so much. Minimalist design: colour; 1 + 1 = 3; white space can be distracting. Error messages: ActionScript! SI gives an error but doesn't say HOW to get past it. Docs: there are no manuals any more.
  • Norman talks about affordances: you should be able to LOOK at a thing and KNOW what it does. 1: The QuickTime logo is bad - you don't have that knowledge; perhaps an email logo with a letter is better. You get to DECIDE what the user knows, but you had better be right! 2: Multiplayer mode for SI means emailing files around - kind of necessary, but why not double-click to copy the file to the appropriate dir? Why not email from inside the app? 3: Think about The Force Unleashed - you're not aware of the possible moves. 4: Like consistency and standards - you always expect things in the same place; is the hot tap on the left all the time? 5: Understand that you will be more effective if you have constraints - APB is a good example! Browser plug-ins that show your bookmarks in a 3D world are another good example. 6: Be defensive - assume error and help. 7: Learn from other products: the L4D / Fallout example - the jump button in L4D vs Fallout.
  • There isn't ONE WAY of evaluating, and no guaranteed set of heuristics - you need to find a structure / set of rules you believe in, but there is no 'official' set. 4: Interesting - I like this in Mass Effect; dialogues should give authoritative verbs ('save file' / 'don't save file'), not yes / close (there's a small sketch of this after Shneiderman's rules below). Also allow larger hit areas where possible - the e-Bug / Nintendo style of dialogue. 7: Locus of control refers to whether YOU think that YOU CAUSE actions or whether they happen TO you. You want the user to feel in control - support their agency. Fate vs power! You'll see that there is much similarity between these heuristics - there is a lot of wisdom in these simple rules.
  • Illustration showing which evaluators found which usability problems in a heuristic evaluation of a banking system. Each row represents one of the 19 evaluators and each column represents one of the 16 usability problems. Each square shows whether the evaluator represented by the row found the usability problem represented by the column: The square is black if this is the case and white if the evaluator did not find the problem. The rows have been sorted in such a way that the most successful evaluators are at the bottom and the least successful are at the top. The columns have been sorted in such a way that the usability problems that are the easiest to find are to the right and the usability problems that are the most difficult to find are to the left.
  • In principle, individual evaluators can perform a heuristic evaluation of a user interface on their own, but the experience from several projects indicates that fairly poor results are achieved when relying on single evaluators. Averaged over six of my projects, single evaluators found only 35 percent of the usability problems in the interfaces. However, since different evaluators tend to find different problems, it is possible to achieve substantially better performance by aggregating the evaluations from several evaluators. Figure 2 shows the proportion of usability problems found as more and more evaluators are added. The figure clearly shows that there is a nice payoff from using more than one evaluator. It would seem reasonable to recommend the use of about five evaluators, but certainly at least three. So - again - you should read the recommended reading from the blog.
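As a back-of-the-envelope illustration of that payoff: taking the 35 percent single-evaluator figure above, and making the simplifying assumption that evaluators find problems independently (in reality they overlap, as the figure shows), the expected proportion found by k evaluators is 1 - (1 - 0.35)^k. A minimal Python sketch:

```python
# A rough model of the payoff from adding evaluators: if one
# evaluator finds about 35% of the problems (the average quoted
# above), and we assume evaluators find problems independently
# (a simplification - real evaluators overlap), then k evaluators
# are expected to find 1 - (1 - 0.35)^k of them.
def proportion_found(k: int, p_single: float = 0.35) -> float:
    return 1.0 - (1.0 - p_single) ** k

for k in (1, 2, 3, 5, 10):
    print(f"{k:2d} evaluator(s): {proportion_found(k):.0%}")
# Prints roughly: 35%, 58%, 73%, 88%, 99% - diminishing returns
# after about five evaluators, matching the recommendation above.
```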

Comu346 lecture 6 - evaluation: Presentation Transcript

  • http://www.comu346.com [email_address] Game Design 2 Lecture 6: Expert Evaluation 2011
  • Expert Evaluations & Design / Usability Heuristics
    • Will look at:
    • Need for alternatives to user evaluation
    • Methods of evaluating without end users (using expert evaluators)
    • Some heuristics / guidelines offered by experts
  • End User Evaluations
    • End-user evaluations can be expensive
      • The methods are very time consuming
      • Users may not be willing
      • To get truly 'fresh' eyes, so-called "Kleenex" testing requires different players each time
    • Concerns about leaks
      • Few external play testers at early stages
      • Friends & family play testers may be too kind
  • Expert Evaluations
    • As an alternative to some user testing, expert evaluators / testers can be used
    • Faulkner details 10 inspection methods; we will look at two:
      • Cognitive Walkthrough
      • Heuristic Evaluation
  • Cognitive Walkthrough
    • In this approach experts imitate users
      • Relatively quick and cheap
      • Expert needs to be skilled and requires:
        • A description of users (e.g. level of experience)
        • A description of system (or an operational system)
        • A description of the task to be carried out
        • A list of the actions required to complete the task
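To make the four walkthrough inputs above concrete, here is a minimal Python sketch of how an expert might record them before starting; the class and field names are illustrative, not an official format, and the example values echo the Solium Infernum scenario later in the lecture:

```python
from dataclasses import dataclass, field

# Hypothetical record of the four inputs a cognitive walkthrough
# needs; the names here are illustrative, not an official format.
@dataclass
class WalkthroughSetup:
    users: str    # who the users are, e.g. their level of experience
    system: str   # menu plan, paper prototype, or a working system
    task: str     # the task to be carried out
    actions: list[str] = field(default_factory=list)  # steps to complete it

setup = WalkthroughSetup(
    users="New player, familiar with turn-based strategy but not SI",
    system="Working build of Solium Infernum",
    task="Go to war and capture a building",
    actions=["Declare a vendetta", "Issue the attack order", "End the turn"],
)
```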
  • Cognitive Walkthrough
    • Expert addresses questions such as:
      • Is the goal clear at this stage?
      • Is the appropriate action obvious?
      • Is it clear that the appropriate action leads to the goal?
      • What problems (or potential problems) are there in performing the action?
    • Essential that the expert tries to think like the end user and not like themselves.
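Continuing the sketch above: the walkthrough steps through each action in turn and records the expert's answers to the four questions just listed. Again, the structure is illustrative:

```python
# Continuing the WalkthroughSetup sketch above: step through each
# action and record the expert's answer to the four questions.
QUESTIONS = (
    "Is the goal clear at this stage?",
    "Is the appropriate action obvious?",
    "Is it clear that the appropriate action leads to the goal?",
    "What problems (or potential problems) are there in performing the action?",
)

def walkthrough(setup: WalkthroughSetup) -> list[dict]:
    notes = []
    for step, action in enumerate(setup.actions, start=1):
        for question in QUESTIONS:
            answer = input(f"Step {step} ({action}) - {question} ")
            notes.append({"step": step, "action": action,
                          "question": question, "answer": answer})
    return notes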
  • Consider Solium Infernum
    • A scenario:
      • Goal is to go to war and capture building
      • Is the appropriate action obvious?
        • What if no vendetta in place?
      • Is it clear that the appropriate action leads to the goal?
        • Does the game help you?
      • What problems (or potential problems) are there in performing the action?
        • If you haven’t got vendetta status, how do you know?
    • I know how to do this, but thinking as a 'user' it's not so easy.
  • Heuristic Evaluation
    • Involves assessing how closely an interface or system conforms to a predefined set of guidelines or heuristics.
    • Examples:
      • Nielsen’s usability heuristics
      • Shneiderman's eight golden rules
      • Norman’s seven principles
  • Nielsen’s Usability Heuristics
    • Give feedback
      • keep users informed about what is happening
    • Speak the user’s language
      • dialogs should be expressed clearly using terms familiar to the user
    • User control and freedom
      • clearly marked exits and undo/redo
    • Consistency and standards
    • Prevent errors
      • even better than having good error messages
  • Nielsen’s Usability Heuristics
    • Minimise memory load
      • recognition rather than recall
    • Shortcuts
      • accelerators (unseen by novices) speed up interactions for experts
    • Aesthetic and minimalist design
      • don’t have irrelevant or rarely needed information
    • Good error messages
      • should indicate the problem and explain how to recover
    • Help and documentation
      • should be concise and easy to search
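One way to see how a heuristic evaluation could be recorded against this list: the sketch below logs each problem found against the heuristic it violates, with a 0-4 severity rating (the rating scale and field names are assumptions, not part of the lecture):

```python
# Nielsen's heuristics as just listed; findings are logged against
# them with a 0-4 severity rating (a common convention, assumed here).
NIELSEN_HEURISTICS = [
    "Give feedback",
    "Speak the user's language",
    "User control and freedom",
    "Consistency and standards",
    "Prevent errors",
    "Minimise memory load",
    "Shortcuts",
    "Aesthetic and minimalist design",
    "Good error messages",
    "Help and documentation",
]

findings: list[dict] = []

def log_finding(problem: str, heuristic: str, severity: int) -> None:
    assert heuristic in NIELSEN_HEURISTICS and 0 <= severity <= 4
    findings.append({"problem": problem, "heuristic": heuristic,
                     "severity": severity})

# Example, echoing the speaker notes on Solium Infernum:
log_finding("Error dialog doesn't say HOW to get past the error",
            "Good error messages", severity=3)
```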
  • Norman’s 7 Principles
    • 1: Use both knowledge in the world and knowledge in the head.
    • 2: Simplify the structure of tasks.
    • 3: Make things visible.
    • 4: Get the mappings right.
    • 5: Exploit the power of constraints.
    • 6: Design for error.
    • 7: When all else fails, standardise.
  • Shneiderman's heuristics (8 Golden Rules)
    • Strive for consistency
    • Enable frequent users to use shortcuts
    • Offer informative feedback
    • Design dialogues to yield closure
    • Offer error prevention & simple error handling
    • Permit easy reversal of actions
    • Support internal locus of control
    • Reduce short-term memory load
    • (Faulkner Chapter 7)
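The speaker notes earlier suggested that dialogues should use authoritative verbs ('save file' / 'don't save file') rather than a vague yes / close, with larger hit areas where possible - which also serves Shneiderman's "design dialogues to yield closure". A minimal tkinter sketch of that idea; the wording and layout are illustrative:

```python
import tkinter as tk

# A minimal sketch of a 'closure' dialog using authoritative verb
# labels instead of a vague Yes / Close pair. The wording and layout
# are illustrative, not taken from any particular game.
root = tk.Tk()
root.title("Unsaved changes")
tk.Label(root, text="You have unsaved changes.").pack(padx=24, pady=12)

def choose(option: str) -> None:
    print("User chose:", option)  # a real app would act on this choice
    root.destroy()

# Verb labels state the action; width and padding give a larger hit area.
tk.Button(root, text="Save file", width=16,
          command=lambda: choose("save")).pack(side="left", padx=12, pady=12)
tk.Button(root, text="Don't save file", width=16,
          command=lambda: choose("discard")).pack(side="right", padx=12, pady=12)
root.mainloop()
```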
  • How Many Evaluators?
    • Different people find different problems.
    http://bit.ly/heuristichowto
  • How Many Evaluators?