Storytelling the Results of Heuristic Evaluation

This interactive talk focuses on the UX tool of heuristic evaluation (or expert review) and best practices for designing and reporting the results of this review. Audience members will be prompted to share their experiences in conducting reviews and reporting them. A straw poll will indicate how many follow a standard set of heuristics and how many do something else. Discussion of the whys and why nots will set the stage for focusing on how to report the results. A brief walk through the evolution of reporting, from checklist to narrative, will be illustrated with examples from reports to prompt audience stories about their process and its effectiveness. New UX practitioners and students, as well as seasoned veterans, will have the chance to defend their approach or perhaps be persuaded to change.

Published in: Design, Technology, Business
  • Speaker note: Using the cards post-task or post-test. The participant walks the table and chooses, then returns to discuss the meaning. Log comments for later analysis.
Transcript

    • 1. Storytelling the Results of Heuristic Evaluation. Carol Barnum, Director of Graduate Studies in Information Design and The Usability Center @ Southern Polytechnic
    • 2. Heuristic evaluation is a popular pick. UPA survey results for HE/expert review (% of respondents by survey year): 77% in 2007, 74% in 2009, 75% in 2011.
    • 3. Why so popular? Fact or myth? Fast. Cheap. Easy. Effective. Convenient.
    • 4. Care to comment?
    • 5. HE output: a list of usability problems; tied to a heuristic or rule of practice; a ranking of findings by severity; recommendations for fixing problems; oh, and the positive findings, too. (A sketch of this record structure appears after the transcript.)
    • 6. Nielsen’s 10 heuristics: (1) visibility of system status; (2) match between system and real world; (3) user control and freedom; (4) consistency and standards; (5) error prevention; (6) recognition rather than recall; (7) flexibility and efficiency of use; (8) aesthetic and minimalist design; (9) help users recognize, diagnose, and recover from errors; (10) help and documentation. J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994.
    • 7. What do you do? Do you do it (or teach it)? How do you do it? Why do you do it? Do you do it alone or with others? How do you present findings? Is it cheaper than usability testing?
    • 8. What do I do? A brief history. Phase 1: Nielsen is my bible.
    • 9. CUE-4, Hotel Pennsylvania (CHI 2003): comparative evaluation of the reservation process. 17 teams; 8 did expert review/HE; only 1 team used heuristic evaluation. Rolf Molich’s conclusions: findings were “overly sensitive” (too many to manage); classification schemes need improvement; recommendations need to be more precise and usable. Results available at Rolf Molich’s DialogDesign website, http://www.dialogdesign.dk/CUE-4.htm
    • 10. HE sample findings page (usability.spsu.edu).
    • 11. What do I do? A brief history. Phase 1: Nielsen is my bible. Phase 2: findings loosely based on Nielsen, with tables, severity ratings, and screen captures for observations and recommendations.
    • 12. Example findings table: Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives (H = Hyperspace; C = Cardiac Arrest; S = Shock).
      Finding: objectives/goals for the modules. Applies to: H, C, S. Severity rating: 3.
      Description: reason content is being presented; conciseness of presentation; definitions required to work with the module/content; evaluation criteria and methods; direct tie between content and assessment measure; sequence of presentation follows logically from introduction; quizzes challenge users.
      Recommendation: develop a consistent structure that defines what’s noted in the bulleted points above; avoid generic statements that don’t focus users on what they will be accomplishing; advise that there is an assessment used for evaluation and indicate if it’s at the end or interspersed in the module; connect ideas in the goals and objectives with outcomes in the assessment; follow the order of presentation defined at the beginning; develop interesting and challenging questions; re-frame goals/objectives at the end of the module.
    • 13. (screen capture)
    • 14. 35-page report.
    • 15. What do I do? A brief history. Phase 1: Nielsen is my bible. Phase 2: findings loosely based on Nielsen; tables, screen captures, recommendations. Phase 3: screen captures, UX terminology.
    • 16. 65-page report.
    • 17. A unique password between 6 and 16 characters was required. “Unique” is not defined. This is a problem with terminology. Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password—Heuristics—was accepted. A dictionary term is not a secure password and contradicts accepted conventions. The ability to input a dictionary word may be a component of trust for users. The username and security question answer were rejected on submit. This result is confusing, as the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen. Input formats need to be defined on the relevant page. Differences in spelling “username” vs. “user name” are subtle but are consistency issues. The red banner is confusing, as the user chose the gold (Free Edition). This is a consistency issue.
    • 18. What do I do? A brief history. Phase 1: Nielsen is my bible. Phase 2: findings loosely based on Nielsen; tables, screen captures, recommendations. Phase 3: screen captures, UX terminology. Phase 3.1: user experience emerges.
    • 19. State Tax. Reviewer comment: “I wanna click on the map, not the pulldown. WAH! Also, I’ve got no idea what the text on this page means.”
    • 20. What do I do? A brief history. Phase 1: Nielsen is my bible. Phase 2: findings loosely based on Nielsen; tables, screen captures, recommendations. Phase 3: screen captures, UX terminology. Phase 3.1: user experience emerges. Phase 4: tell the story of the user experience.
    • 21. Persona-based scenario review: Ginny Redish and Dana Chisnell. AARP report—58 pages, 50 websites. Two personas—Edith and Matthew. Evaluators “channel” the user via persona and tasks/goals; their story emerges. Available from Redish & Associates: http://www.redish.net/images/stories/PDF/AARP-50Sites.pdf
    • 22. While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared. When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click. Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites (for AARP).
    • 23. Steve Krug’s approach: All sites have usability problems. All organizations have limited resources. You’ll always find more problems than you have resources to fix. It’s easy to get distracted by less serious problems that are easier to solve . . . which means that the worst ones often persist. Therefore, you have to be intensely focused on fixing the most serious problems first. Rocket Surgery Made Easy, New Riders, 2010.
    • 24. Krug’s maxims: Focus ruthlessly on a small number of the most important problems. When fixing problems, always do the least you can do.
    • 25. Conversation, storytelling. Ginny Redish, Letting Go of the Words, Morgan Kaufmann, 2007: engage in conversation with your reader. Whitney Quesenbery and Kevin Brooks, Storytelling for User Experience Design, Rosenfeld, 2010: stories can be a part of all stages of work, from user research to evaluation.
    • 26. Report deliverable: what do you do? No deliverable, quick findings, presentation, or detailed report. Jim Ross, “Communicating User Research Findings,” UX Matters, Feb. 6, 2012.
    • 27. Big honkin’ report.
    • 28. (screen capture)
    • 29. The End. More in the book . . . or email me: cbarnum@spsu.edu
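
Slide 5 lists the standard shape of HE output: a usability problem, the heuristic it ties to, a severity ranking, a recommendation, and the positive findings. A minimal sketch of one way to record findings in that shape, in Python; the class names, field names, and 0–4 severity scale are illustrative assumptions, not something specified in the talk:

```python
# Hypothetical sketch of a heuristic-evaluation findings log in the
# shape slide 5 describes. All names and the severity scale are
# illustrative assumptions, not from the talk.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Finding:
    problem: str             # the usability problem observed
    heuristic: str           # heuristic or rule of practice it violates
    severity: int            # assumed scale: 0 (not a problem) to 4 (catastrophe)
    recommendation: str      # suggested fix
    positive: bool = False   # True for "oh, and the positive findings, too"


@dataclass
class Report:
    findings: List[Finding] = field(default_factory=list)

    def worst_first(self) -> List[Finding]:
        # Rank problems by severity, most serious first; skip positive findings.
        return sorted(
            (f for f in self.findings if not f.positive),
            key=lambda f: f.severity,
            reverse=True,
        )


report = Report()
report.findings.append(Finding(
    problem='"Unique" password requirement is never defined',
    heuristic="Match between system and real world",
    severity=3,
    recommendation="Define the required password format on the input page",
))
for f in report.worst_first():
    print(f"[severity {f.severity}] {f.problem} -> {f.recommendation}")
```

Ordering problems worst-first echoes Krug’s maxim on slide 24: focus ruthlessly on the most serious problems before the easy ones.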
