Essential User Experience Skills


These slides are from the STC Atlanta workshop on usability. We spent the day teaching technical communicators about usability and walked them through a mock usability test. The workshop was hosted by User Insight.


    1. Essential UX Skills for Technical Communicators
       November 14, 2009
       Mark Richman, Information Architect, WebSoSmart
       Yina Li, Technical Writer, Horizon Software
       Rachel Peters, Technical Writer, Aon eSolutions
       Will Sansbury, UX Architect, Silverpop Systems
    2. Thanks to User Insight for hosting us today!
       Tweet your appreciation to @eholtzclaw!
    3. Agenda
       All times -ish. And we'll throw in a potty break or two if you're well behaved.
    4. Heuristic Evaluation
       That's a $2 phrase for "expert review."
       Mark Richman
    5. What's a heuristic evaluation?
       A quick-and-dirty usability technique: an expert review of a website or application using a set of guidelines, or "heuristics."
       Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the "heuristics").
       Jakob Nielsen and Rolf Molich created this technique in 1990…
       Using a fixed list of heuristics keeps the evaluator on track. Some evaluators have their own sets of heuristics.
    6. Nielsen and Molich's Heuristics (1990)
       Nielsen now offers an updated set of heuristics.
       Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
       Match between system and the real world: The system should speak the users' language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
       User control and freedom: Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
       Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
       Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
       Recognition rather than recall: Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
       Flexibility and efficiency of use: Accelerators, unseen by the novice user, may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
       Aesthetic and minimalist design: Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
       Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
       Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
       Some others:
       Don't force the user to make precise actions.
       Direct attention properly.
       Consistent use of color or saturation.
       For this list and another well-regarded list of heuristics, visit:
       Or search on "heuristics".
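The checklist above works best when findings are recorded consistently. One possible structure, sketched here with Nielsen's 0-4 severity ratings; the screens, findings, and notes are hypothetical examples, not results from the workshop:

```python
# Minimal heuristic-evaluation log: each finding ties a screen to a
# violated heuristic and a Nielsen-style severity rating
# (0 = not a problem, 4 = usability catastrophe).
# The findings below are invented for illustration.
from collections import defaultdict

findings = [
    {"screen": "Home", "heuristic": "Aesthetic and minimalist design", "severity": 3,
     "note": "Rotating banner competes with primary navigation"},
    {"screen": "Home", "heuristic": "Consistency and standards", "severity": 2,
     "note": "Search box not in the conventional location"},
    {"screen": "Join", "heuristic": "Error prevention", "severity": 4,
     "note": "Form clears all fields on a single validation error"},
]

# Summarize the worst severity per heuristic so the report can lead
# with the biggest problems.
worst = defaultdict(int)
for f in findings:
    worst[f["heuristic"]] = max(worst[f["heuristic"]], f["severity"])

for heuristic, sev in sorted(worst.items(), key=lambda kv: -kv[1]):
    print(f"[{sev}] {heuristic}")
```

Keeping severity with each finding makes it easy to lead a report with the most serious issues, as slide 95 later recommends.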
    7. Our Task
       Several group members evaluated four sites with content similar to STC's.
       We each took different approaches to our evaluations.
       Here: summarize the different approaches and the results.
    8. Heuristic Evaluations: Value and Caveats
       Heuristics are mental shortcuts or assumptions that help us quickly make sense of the world.
       How does it work? The expert uses your software product and looks for violations of the guidelines. For instance, hundreds of ad-packed pages would fail the heuristic "Aesthetic and Minimalist Design."
       Does it work? Yes and no.
       Appraisers will differ in the usability problems they find.
       Evaluators may have trouble uncovering domain-specific issues.
       Tests have shown that up to 50% of problems identified don't actually affect the product's usability.
       Why use it?
       Great way to quickly and cheaply point out serious usability issues.
       Use it early in the design process to uncover some blatant problems.
       Know that usability testing may uncover additional issues.
    9. Perform a Quick Evaluation of
       Some heuristics that might be useful:
       Aesthetic and minimalist design: Every extra unit of information in a dialogue competes with others and diminishes their relative visibility.
       Direct attention properly
       Consistent use of color or saturation
       Consistency and standards
       Display data in a clear and obvious manner
       Error prevention
    10. Sites We Evaluated
        Sample Heuristic Evaluations
    11.
    12.
    13.
    14.
    15. Strategies
        Two evaluators browsed page by page through the sites, looking for usability problems and violations of the heuristics.
        One evaluator performed a representative task on two similar websites and used that task to focus her evaluation.
        Takeaway: There is no right or wrong strategy, but performing a task can make your evaluation deeper and more meaningful.
        Don't focus on the task exclusively, but use it to add richness to your evaluation of the complete site.
    16. Technology Association of Georgia
    17. Technology Association of Georgia: Text Evaluation
        The layout of the home page is very busy. Many colors are used on this page. Along with the fast-changing slides, there is no clear focus.
        The top navigation is clear. However, the quick links under the TAG TV are hard to notice.
        The member login is placed at an easy-to-find, traditional location.
        The search box under the member login is not in its usual place and could be missed by novice users.
        The slideshow changes too fast, but it does offer the audience information about events at a glance.
        The home page contains so much information that the user can't get a quick overview of the site.
    18. Technology Association of Georgia: Adding a Picture Clarifies the Text
        Layout is very busy and contains many colors. There is no clear focus.
        The top navigation is clear, but the quick links under the TAG TV are hard to notice.
        The member login is placed at an easy-to-find, traditional location.
        The search box under the member login is not in a usual place and could be missed.
        The slideshow changes too fast, but it does offer the audience event information.
    19. Technology Association of Georgia: Navigation
    20. Technology Association of Georgia: Evaluation Using Callouts
    21. Technology Association of Georgia Callout Format: Page Content and Layout
    22. Technology Association of Georgia: Summary
        Site has great content and oozes professionalism.
        However, a lot of strongly emphasized content competes for the user's attention. This is seen in the red, underlined links, large colored areas, and vibrant logos.
        Some web conventions are not followed, adding to the difficulty of finding items on a crowded page.
    23. Information Architecture Institute
    24. Information Architecture Institute: Categories and Navigation
    25. Information Architecture Institute: Directing Attention
    26. Spotlight: Comparing Header Types
        Headers at IAI
        Typical headers
    27. Information Architecture Institute: Page Content and Accessibility
    28. Information Architecture Institute: Summary
        Clear hierarchy; directs attention effectively.
        Navigation and headers are clear without taking emphasis away from the content.
        A lot of content without being distracting.
    29. STC Intermountain Chapter
    30. STC Intermountain Chapter: Text with Pictures and Callouts
    31. STC Intermountain Chapter: Finding the Next Meeting
    32. STC Intermountain Chapter: Summary of Findings
        Consider: Top findings might be the first item in each section.
    33. STC Washington DC Chapter
    34. STC Washington DC Chapter: Additional Recommendations
    35. Heuristic Evaluation Tips
        Pictures are invaluable for adding context to the evaluation.
        You may do a narrative or a page-by-page evaluation.
        Narratives express findings in a conversational manner, but are not easy to scan. To counteract this, use bullet points.
        Callouts are great, but care should be taken to keep them neat:
        Align them if possible.
        Keep them roughly the same size.
        Don't be Negative Norman: call attention to good design and practice in the existing system. The customer will appreciate that you respond to her good ideas.
    36. Card Sorting
        Something for the office supply fetishists.
        Rachel Peters
    37. What is card sorting?
        Image by cannedtuna
    38. What aisle is hot dog chili on?
    39. With the hot dog buns?
    40. Chili's kind of like a soup…
    41. Chili has beans…
    42. Nah, that's too easy!
    43. Is hot dog chili a condiment?
    44. Card Sort Activity
        Finding a place for everything
    45. Card Sort Instructions
        How would you organize the STC Atlanta site?
        Group the cards into categories.
        Is something missing? Use a blank card to add it.
        Something doesn't belong? Put the card aside.
        Card belongs in more than one group? Be creative.
        Label the categories:
        Use a blank card to name each category.
        Category names are up to you.
    46. Now What?
        Look for trends in the results.
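One concrete way to look for trends in card sort results is to count how often participants placed each pair of cards in the same group. A small sketch; the card names and groupings below are invented for illustration:

```python
# Card-sort co-occurrence: for each pair of cards, count how many
# participants grouped them together. High counts suggest categories
# users agree on. The data below is hypothetical.
from itertools import combinations
from collections import Counter

# Each participant's sort: a list of groups, each group a set of cards.
sorts = [
    [{"Meetings", "Events"}, {"Join", "Renew"}],
    [{"Meetings", "Events", "Join"}, {"Renew"}],
    [{"Meetings", "Events"}, {"Join", "Renew"}],
]

together = Counter()
for sort in sorts:
    for group in sort:
        for a, b in combinations(sorted(group), 2):
            together[(a, b)] += 1

# Pairs grouped together by the most participants surface first.
for pair, count in together.most_common():
    print(f"{pair[0]} + {pair[1]}: {count} of {len(sorts)} participants")
```

The spreadsheet option mentioned on slide 48 amounts to the same tally done by hand; dedicated tools automate it for larger studies.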
    47. Open vs. Closed Sorts
        Open:
        No set category labels.
        Good for exploratory research.
        Helps you understand how users arrange the information.
        Closed:
        Set category labels provided.
        Good for testing existing structures (navigation, table of contents, etc.).
    48. A Few Notes
        Not Tarot cards:
        Use card sorts to help with decision making.
        Don't let the cards decide for you.
        Remote testing options:
        Spreadsheets
        OptimalSort
        WebSort
        More tools listed at
    49. For More Information
        Card Sorting: Designing Usable Categories
        Donna Spencer
        Available from Rosenfeld Media:
    50. Just for fun: How a grocery store is like a web site
        A visit to Publix
    51. Home Page
    52. Feature Product or Article
    53. Ads
    54. Pop Up Ads!
    55. Checkout
    56. Usability Testing
        No creepy two-way mirrors required.
        Yina Li and Will Sansbury
    57. What is usability testing?
        Image by eekim
    58. What is usability testing?
        Qualitative:
        Subjective
        Small scale; usually stop seeing significant new findings after 5 to 7 tests
        Loose, forgiving method
        Analysis based on observations
        Relatively cheap and easy to execute
        Quantitative:
        Objective
        Large scale; requires a large enough sample of users to statistically validate findings
        Stresses rigorous scientific method
        Analysis based on crunching numbers
        Expensive in time and money
    59. Planning and Preparing a Usability Test
        Yina Li
    60. Planning a usability test
        Image by Experimental:DB
    61. Goals
    62. Focus
    63. Focus
    64. User Profiles
    65. Deliverables
        Screeners
        Consent form
        Pre-test questionnaire
        Scenarios/tasks
        Post-task questionnaire
        Post-test questionnaire
        Facilitator script
        Test plan
    66. Types of Scenarios
        First impression
        Open-ended tasks (e.g., join STC on this site)
        Answer-oriented (e.g., find information about the next chapter meeting)
        (Nielsen Norman Group Usability In Practice: 3-Day Camp 2008)
    67. How to create unbiased scenarios/tasks
        Avoid lingo used in the product being tested:
        Currents
        Summit
        Do NOT provide instructions or steps.
    68. Anything else?
        How many tasks should I prepare?
        Enough for 35-40 minutes.
        What sequence of tasks should I use?
        Go easy on the first task.
        Prioritize the tasks.
        Prepare extra tasks.
        RUN a pilot test.
    69. Fun time
        Write two tasks for the STC website usability test.
    70. How to recruit test participants?
    71. User Profiles / Personas
    72. How to recruit test participants?
        How many participants?
        5
        20
        Should I recruit the participants?
        Where to start?
        Client relations
        Account executives
        Marketing
        Sales
        Customer support
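The 5-versus-20 debate traces back to Nielsen and Landauer's problem-discovery model, found(n) = N(1 - (1 - L)^n), where L is the share of problems a single participant reveals (about 31% in Nielsen's data). A quick sketch of the curve:

```python
# Nielsen/Landauer problem-discovery curve: expected fraction of
# usability problems found with n participants, assuming each
# participant reveals a fraction L of them (L ~ 0.31 in Nielsen's data).
def share_found(n, L=0.31):
    return 1 - (1 - L) ** n

for n in (1, 5, 10, 15):
    print(f"{n:2d} participants -> {share_found(n):.0%} of problems")
```

With the default L, five participants surface roughly 85% of problems, which is where the "rule of five" comes from; additional participants yield rapidly diminishing returns, which is why larger counts are usually reserved for quantitative studies.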
    73. Creating a Screener
        What is a screener?
        Short
        Sequence of questions
        Sample questions:
        When was the last time you booked a hotel room online?
        How many hours do you spend on the internet per week?
        What is your household income? Give a range.
        What is your profession?
        What company do you work for?
    74. Incentives
        Types of incentives:
        Cash
        Gift cards
        Software or a product the company makes
        How much?
        It depends…
    75. It's Time to Call
        Not a sales call:
        Your opinion will help improve the product.
        Your time will be paid (and how much).
        How long the test will be, where, and when.
        We will video and/or audio tape the session.
        Still interested? Now ask the questions in the screener.
        (Nielsen Norman Group Usability In Practice: 3-Day Camp 2008)
    76. It's Almost Testing Day
        Call to confirm.
        Send the following information:
        Testing time
        Location
        Parking info
        Driving directions
        Contact information
    77. Facilitating Usability Tests & Analyzing Usability Findings
        Will Sansbury
    78. Brief your observation team.
        Image by llawliet
    79. Observation Team Ground Rules
        Focus on observation.
        Limit side conversations.
        Take good notes.
        Don't jump to solutions.
        Keep your frustration in check.
        Trust the facilitator's judgment.
    80. Observation Team Ground Rules
        NEVER tear down the user!
        As facilitator, defend the user's dignity above all else.
        (Seriously. I've kicked people out of the observation room before.)
    81. Embrace multiple personalities.
        Flight attendant:
        Keep participants happy.
        Protect the participant's safety and dignity.
        Sportscaster:
        Keep the observation team engaged with play-by-play.
        Conduct sideline interviews between sessions.
        Scientist:
        Plan and execute the test.
        Analyze the test results.
        From Carolyn Snyder's Paper Prototyping
    82. Make the participant feel comfortable.
        Image by Tom Purves
    83. Start when you confirm the test date.
        Avoid email.
        Give them a choice of times.
        Charm with chit chat:
        "Do you go by Thomas or Tom?"
        "Your office is in the Highlands? My favorite restaurant is down there."
        Absorb ALL the pain.
        Image from stock.xchng
    84. Make sure they can find you.
        Image from stock.xchng
    85. Be the host(ess) with the most(est).
        Image by Rachel from Cupcakes Take the Cake
    86. (Just don't be freaky.)
        Image by Rachel from Cupcakes Take the Cake
    87. Help them know what to expect.
        Explain the test procedure.
        Stress the importance of thinking out loud.
        Obtain a signed informed consent form.
    88. Make sure they understand that…
        You're testing the product, not them.
        When they're struggling, you're learning.
        If they're frustrated or have questions, they can ask you for help. Set up a faux helpdesk phone number that rings the observation room.
        Thinking out loud is critical. Affirm what they're doing, but repeat the importance before each scenario.
    89. Run the test.
        Image from stock.xchng
    90. Run the test.
        Provide the participant with written scenarios. You can give scenario instructions verbally, but written instructions can tell a more compelling story.
        Ask them to read the scenario aloud. This primes the pump for thinking out loud.
        After they finish the scenario, administer a survey. Some standard surveys exist, and tools like Morae include them.
        Rinse and repeat for each scenario.
    91. Meanwhile, in the observation room...
        Image by Ken Lund
    92. Meanwhile, in the observation room...
        Log interesting observations. Track the time of each so that you can correlate notes with the video.
        As you see usability issues, point them out to the observation team. You'll have common ground to start analysis discussions.
        Pay attention to nonverbal cues, too. Look for odd mousing behaviors, facial expressions, and sounds of frustration.
    93. Add it all up.
        Image by stuartpilbrow
    94. Analyze the test findings.
        Run analysis in two stages:
        Immediately after a test session, have each observer write down the issues they observed.
        After all sessions, review all observations:
        Transfer each observation from each session onto an index card or sticky note.
        Once all issues are recorded, post them on a wall.
        Read through each, and group similar items.
        Look for high-density areas, which indicate issues observed often across multiple test sessions.
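If observations are captured digitally rather than on index cards, the grouping step can be tallied in a few lines. The theme labels and observations below are invented for illustration; in practice the team assigns themes during the review session:

```python
# Affinity-style tally: group observations by theme and count how many
# distinct sessions each theme appeared in. Themes seen across many
# sessions are the "high-density areas" worth reporting first.
from collections import defaultdict

observations = [
    ("session 1", "navigation", "Missed the top-level menu entirely"),
    ("session 1", "terminology", "Did not understand a product term"),
    ("session 2", "navigation", "Used search instead of the menu"),
    ("session 3", "navigation", "Clicked the logo expecting a site map"),
    ("session 3", "terminology", "Confused two feature names"),
]

sessions_by_theme = defaultdict(set)
for session, theme, note in observations:
    sessions_by_theme[theme].add(session)

# Themes observed in the most sessions come first.
for theme, sessions in sorted(sessions_by_theme.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme}: seen in {len(sessions)} of 3 sessions")
```

Counting distinct sessions rather than raw observations keeps one talkative participant from inflating a theme's apparent severity.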
    95. Communicate findings to decision makers.
        Formal report:
        Assign priority to findings and present the highest first. Be careful not to dilute the report with too many findings.
        Include stills from videos to illustrate findings.
        Brief profiles of test participants and actual quotes from tests foster empathy for the user.
        Highlight reels of videos go a long way with executives.
        Informal reports:
        If you're agile, generate user stories directly from the final analysis session.
        Capture findings on a wiki, intranet, or other shared resource.
        Just write it down somewhere! Don't let findings be forgotten.
    96. Let's eat! We'll answer questions, too…
        …if you don't mind us talking with our mouths full.
    97. Mock Usability Test
        Some participants may be professional actors.
        YOU!