UX Research
  • Livescribe
  • Business question: How are users actually viewing your content (in what order, for how long, and in what specific patterns or pathways)? Audience question: Have you wondered whether critical links, buttons, or content messaging is being seen on a critical page? Description: this methodology is very useful when trying to determine why certain homepage metrics from analytics programs are of concern (e.g. users not clicking on the value proposition element). The respondent sits at a specialized computer screen and undergoes a simple calibration sequence, is given a stimulus question or task (active or passive, e.g. viewing the homepage for a set period of 15 seconds), and the system tracks eye pathways and fixations and produces a data file from that task. Important things to know about eye-tracking: the Tobii system was not designed for web sites or changing visual stimuli, which makes testing of web navigation (moving from page to page) very complex to analyze and not very accurate. It is very effective for single-stimulus presentations of fixed duration, excellent for detailed analysis of home pages, critical landing pages, and forms, and very insightful for assessing the impact of advertising on homepage visual scanning.
  • Business question: Is anyone working on a major (large-scale) site launch or redesign that your company depends on for survival?
  • Audience question: "How many of you have a project at the point where it is ready for a major commitment (round A, new release, major new upgrade)?" The scenario: I have a web site, software, or product and I am about to commit major funding or resources to the next phase of development. Do I have usability problems with the user experience that are basically show-stoppers? For example: users cannot download the application, cannot log in, cannot set up a profile page, or cannot navigate to critical content. Typically tests 1-3 critical tasks in 60 minutes.
  • Business question: Do any of you have a new development team with minimal UX/usability experience? Is your team employing best practices, and are they aware of the key UX and usability performance issues that an effective solution must meet? Description: a highly experienced usability/UI design expert conducts a structured audit of your system or product and rates it on best practices and estimated performance. Interview and select an expert who has direct experience in your product category and sector. The expert gathers information from your development team, conducts a structured audit based on predetermined best practices, and presents findings to your team (sometimes not a happy experience for UX design teams without knowledge of formal UCD methods). Very effective early in development, and can be repeated with updates at less cost.
  • Business problem: How do I organize information (content, navigation, overall IA) so that users understand it? Description: this is an automated version of the classic card-sorting study, where you give users a pile of index cards with your content descriptions on them and ask them to sort the cards into groups according to how they relate to the content. Example: if I have a bunch of content categories, how do I determine the groupings and the high-level navigation labels? Say you have a site selling women's underwear and you want to create a navigation structure that matches users' mental models. Do you organize the top level by type of underwear, then by style, color, and price, or do you organize the navigation by lifestyle (athletic, everyday, intimate) and then by type of article, color, and price? Respondents are invited to an online study via email. When they agree, they see a screen with a list of labels or terms in one column and are asked to sort the terms into groups they find organizationally relevant. When they are finished, you can give them another card sort or just finish the study. When the required number of respondents have finished, you can view the data. Card-sorting data is analyzed through cluster analysis (not that easy to understand, but very useful); see the sketch below.
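  • For illustration, that cluster-analysis step is typically a hierarchical (agglomerative) clustering of a card-by-card co-occurrence matrix. A minimal sketch of the idea in Python, assuming SciPy is available; the card labels and example sorts below are invented, not data from any real study:

```python
# Sketch: hierarchical clustering of card-sort co-occurrence data (illustrative data only).
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

cards = ["Sports bras", "Briefs", "Camisoles", "Running socks", "Lace sets"]

# Each participant's sort is a list of groups; each group is a set of card indices.
sorts = [
    [{0, 3}, {1, 2, 4}],    # participant 1: athletic vs. everyday/intimate
    [{0, 1, 3}, {2, 4}],    # participant 2
    [{0, 3}, {1, 2}, {4}],  # participant 3
]

n = len(cards)
co = np.zeros((n, n))  # co-occurrence counts: how often two cards were placed together
for groups in sorts:
    for g in groups:
        for i in g:
            for j in g:
                co[i, j] += 1

dist = 1.0 - co / len(sorts)  # turn the co-occurrence rate into a distance
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
print(Z)  # in practice you would plot scipy.cluster.hierarchy.dendrogram(Z, labels=cards)
```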
  • OPEN SORT: good for getting ideas on groups of content. CLOSED SORT: useful to see where people would put the content.
  • Card sorting as a method in HCI largely took off during the internet boom of the late 1990s with the proliferation of website navigation.
  • Today it’s one of the most popular methods UX professionals use. In fact, practitioners report using Card Sorting as frequently as task oriented lab-based usability testing.
  • Using the cards post-task or post-test: the participant walks along the table and chooses cards, then returns to discuss what they mean. Log comments for later analysis.
  • Usability, according to web design expert Jakob Nielsen, usually includes five aspects. Learnability: when the user is confronted by a new site, how easily can they figure out what the purpose of the site is and how to use it on their own. Efficiency: how quickly can the user perform tasks and navigate the site. Memorability: if the user returns, what parts and functions of the site are recognizable and don't need to be relearned (you don't want your site to be hard even for repeat visitors). Errors: to what extent do features and functions not work as predicted by the user or as designed by the developer. Satisfaction: how happy is the user when using your site.
  • I should note here that one of my greatest sources of inspiration about what usability is all about comes from this book by Steve Krug. Although it's aimed at people doing web development in a corporate setting, the techniques are almost all spot on for the library world as well.
  • Usability tests are really not such a big deal. Here's a quick overview of the steps: come up with a set of 3-5 tasks that you'll ask users to perform; round up 5-10 volunteers who will act as test participants, and bring them one at a time into a testing area where you'll observe them as they perform the predetermined tasks. After you've observed all the test participants, you'll have a pretty good idea of some things that need to be fixed and what things seem to be working OK. After you make the easiest 2-3 fixes, go back and do another round of testing and tweaking, and so on.
  • But why go through all this trouble? Can't I just see what's wrong? Can't I just ask the staff at my library to identify what needs to be fixed and take care of the problems that way? Well, no, not if you really want to make your user the center of the website's design. Lots of librarians will want to tell you that the library website is for them, too, and that they are experts on what users want and need. That is true up to a point, but as anyone who has sat on a library redesign team knows, everyone has lots of opinions and insights, many of which are completely at odds with each other. The real user of your library's website is your patron, your customer, your student, your member, or whatever you want to call those folks who don't work in the library. You want to please those folks more than anyone else. By making an honest effort to let your users determine the look and feel of your website (within reason), you're likely to have a design that will actually "work" for them. By going directly to your users and recording how they actually use the site, you'll get information that will ground the endless debates over design matters. Usability testing offers a method based on social science research methods. And, as I see again and again when I run usability tests, the results are always surprising. Your test participants are always going to find things that never occurred to your designers. Most usability experts will echo this experience, I believe.
  • OK, so you now have an idea about what service or resource you're going to test; next you'll want to think about what actual tasks you want your test participants to do. You'll want to pick tasks that are going to reveal some useful information to you. One obvious place to look for tasks is those pages or services that you and your colleagues already know need work, such as your interlibrary loan form or the way that library hours are displayed. Another strategy is to think about the most common activities among patrons in your library. Take a look at your site statistics to see which pages are most popular; maybe that's where you want to do your testing. Or maybe you're about to launch a new page or service. Those are great opportunities for testing.
  • OK. So the gear you need is not too complicated. You'll need a computer; a desktop or a laptop will do. Last year, I had test participants use my smartphone when I was testing a mobile web site. If you really want to get serious about user-centered design, you may want to do usability testing on paper sketches that precede any actual website coding. This is perfectly acceptable and commonly done, and a great way to run tests that will help you catch basic page layout and site architecture problems. You'll also want to install some screen recording software on the computer that your test participants use. That way, you can capture as a movie all the mouse movements, page clicks, and characters typed; this is really rich data to return to when the tests are done and you are trying to write up your report. I'll talk in a minute about software options. Another option that has worked for me is to simply have a second person on hand helping you with the test. That person's sole responsibility is to closely observe the test participant and take detailed notes. Finally, if you have screen recording software, you might as well get a USB microphone that can capture the conversation between the test participant and the test facilitator. You'll want to encourage the participant to think aloud as much as possible as they perform tasks.
  • Here are some sample tasks that I’ve used for various reasons. You can see they are not huge, multistep projects but somewhat straightforward tasks.
  • Here are five options for screen recording software. I've used CamStudio a lot, mostly because it is free and can be installed on any machine. With the others, you'll get a much richer feature set but will be limited in the number of machines you can install it on.
  • OK, so if you are doing the tests all by your lonesome (not the best situation but certainly still doable), you'll be in charge of recruiting test participants, running the test, recording the test (you'll definitely need screen recording software and a mic), and prepping the test environment.
  • I am so thrilled that just hours ago a librarian in Canada, Amanda Etches, who is known for her work in usability, posted this wonderful picture captioned "Guerrilla testing." It appears to show a staff of one running usability tests from a library cart in a public area of the library.
  • If you can get another person to help you out with the testing, you can break up the tasks in rational ways.
  • Before you run off and do the tests, I recommend drafting a one-page document that details the protocol for your test. The most important part of the protocol is a script that you’ll type out and read from during the tests. It is essential that when you are delivering the task details to your test participants that you say it in the exact same way to each person and that you never give away details about how to do the task. The whole point of the test is to see how much the participant can do on their own without any expert help; this mimics the real world use of your library site. There’s rarely a librarian looking over your patron’s shoulder as he or she navigates your site.
  • Getting properly set up is key. Make sure your mic works, the screen recording software works, and that you’ve cleared the cache in your browser so that no words entered previously in search boxes will appear as your test participants use those search boxes.
  • Your script should include an explanation of why you are reading from a script; you don't want to freak out the participant. You can break the ice before the task performance portion by asking them to talk about websites that they like to use. Make sure you don't give them a chance to explore your site before doing the tasks. You want to drop them, as much as possible, into real-world scenarios where they have come to the site to do a specific task they had in mind.
  • It’s essential that you ask the participant to speak aloud so you can hear them express any frustrations or surprises they’ve had.
  • If you can engineer a reward or payment to your testers, that’s great but not essential.
  • When the test is done, quickly get yourself to a computer so you can get down any insights you gained. After 2-3 participants, it’s easy to lose track of who did what and what you learned.
  • Thousands of users can be tested in multiple languages, locations, and time zones in a short period of time. Two approaches: task-based (with a browser plug-in) and "true intent"-based. Quantitative and qualitative data can be collected, and studies can be extended easily to competitive or longitudinal testing. Examples: LEOtrace, Vividence, UserZoom. Augments traditional lab testing.
  • Where it fits in: highly adaptive based on the needs of the study. This gives it power; it creates the missing link.
  • Saves time: very fast, with thousands of panelists available. Saves money: the essence of quick and dirty. There are techniques for dealing with noise, and it is unrealistic to keep participants in the lab that long. Combines qualitative and quantitative data, and attitudes with behavior.
  • All the flexibility you need to set up a study and analyze the data, with significant support in study design and analysis. Pricing is all project-based and typically very expensive; a good choice for a large benchmark study.
  • These tools are self-service, with some amount of support. They can do largely the same as RV and Keynote, and are becoming more powerful by the day, with nice visualizations now. Loop11 is very basic, and UserZoom is much more flexible. A fraction of the cost: several hundred to a few thousand dollars.
  • - Sort into groups
  • Using a monitor like that, the participant can move her/his head freely in a large area; data accuracy is not lowered by natural head movement.
  • This effect can be used to direct attention, for example on an ad. Here two different versions of an ad were eye-tracked. In this case, the model is looking directly at the viewer.
  • And in this version, the model looks at the product, forming a straight line between her eye and the product name on the package.
  • Note: give an example here
  • FIX THE PHOTO
  • Weird things may come out of this, and they may only make sense in the future. Don't just dismiss data; keep it, as it may be useful later.
  • Iterate also on your research method

UX Research: Presentation Transcript

  • UX Research (Understanding and Methods of User Research for UX/UI Development), 2012.11.16, InnoUX CEO 최병호, InnoUX@InnoUX.com, @ILOVEHCI
  • Table of Contents: Definition; Case Study; Methodology (Overview, Field Study, Card Sorting, Heuristic Evaluation, Usability Test, Remote Test, Focus Group, Benchmarking, Contextual Inquiry, Eye Tracking, Diary Study, Cognitive Walkthrough); References. © 2012 InnoUX & Innodesign. All rights reserved.
  • Definition
  • 4 Common Biases in Customer Research: Confirmation Bias, Framing Effect, Observer-expectancy Effect, Recency Bias
  • Confirmation Bias: your tendency to search for or interpret information in a way that confirms your preconceptions or hypotheses.
  • Framing Effect: when you and your team draw different conclusions from the same data based on your own preconceptions.
  • Observer-expectancy Effect: when you expect a given result from your research, which makes you unconsciously manipulate your experiments to give you that result.
  • Recency Bias: results from disproportionate salience attributed to recent observations (your very last interview), or the tendency to weigh more recent information over earlier observations.
  • Methodology: Overview
  • Types of AgileUX Customer Research: Contextual Inquiry, Task Analysis, Card Sorting, Focus Groups, Surveys, Usability Testing
  • My bullshit customer insights graph: the most striking truth of this curve is that zero users give zero insights. (Axes: number of people interviewed vs. insights gained.)
  • You need to gather: factual information, behavior, pain points, goals. You can document this on the persona validation board, as well as photos, video, audio, journals… document everything.
  • Types of Research Methods, arranged on two axes (Generative to Evaluative, Qualitative to Quantitative): 1 Contextual Observation, 2 Remote Ethnography, 4 Focus Groups, 5 Ergonomic Observation, 6 Lab-Based Testing, 7 Large Sample On-Line Behavior Testing, 8 Professional Heuristics, 9 Online Card Sorting, 10 Eye Tracking, 11 Online UX Concept Surveys, 12 fMRI Brain Imaging
  • 1 Methodology: Contextual Observation/Ethnography ► Business problem ► How are people actually using products versus how they were designed? ► Description ► In-depth, in-person observation of tasks & activities at work or home. Observations are recorded. ► Benefits ► Access to the full dimensions of the user experience (e.g. information flow, physical environment, social interactions, interruptions, etc.) ► Limitations ► Time-consuming research; travel involved; smaller sample size does not provide statistical significance; data analysis can be time-consuming ► Data ► Patterns of observed behavior and verbatims based on participant response, transcripts and video recordings ► Tools ► LiveScribe (for combining audio recording with note-taking) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 1 Methodology: Contextual Observation/Ethnography
  • 2 Methodology: Remote Ethnography ► Business problem ► How are people actually using products in their environment in real time? ► Description ► Participants self-record activities over days or weeks with pocket video cameras or mobile devices, based on tasks provided by the researcher. ► Benefits ► Allows participants to capture activities as they happen and where they happen (away from the computer), without the presence of observers. Useful for longitudinal research & geographically spread participants. ► Limitations ► Dependence on participant ability to articulate and record activities; relatively high ratio of data analysis to small sample size ► Data ► Patterns based on participant response, transcripts and video recordings ► Tools ► Qualvu.com Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 4 Methodology: Focus Groups ► Business problem ► What are perceptions and ideas around products/concepts? ► Description ► Moderated discussion group to gain concept/product feedback and inputs; can include screens, physical models and other artifacts ► Benefits ► Efficient method for understanding end-user preferences and for getting early feedback on concepts, particularly for physical or complex products that benefit from hands-on exposure and explanation ► Limitations ► Lacks realistic context of use; influence of participants on each other ► Data ► Combination of qualitative observations (like ethnographic research) with quantitative data (e.g. ratings, surveys) ► Tools ► See qualitative data analysis Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • How To Think About UX Performance (Engagement Pyramid). No Engagement: user cannot find critical features and cannot determine how to use them if they find them. Low Engagement: user can find critical features and understands how to use them but has no interest in doing so. Moderate Engagement: user can find critical features, understands how to use them and has some interest in doing so. Deep Engagement: user can find critical features, understands how to use them and cannot stop doing so.
  • 10 Methodology: Eye-Tracking ► Business problem ► Do users see critical content, and in what order? ► Description ► Respondents view content on a specialized workstation or glasses. ► Benefits ► Very accurate tracking of eye fixations and pathways. ► Limitations ► Relatively high cost; analysis is complex; data can be deceiving. ► Data ► Live eye fixations, heat maps, etc. ► Tools of Choice ► Tobii, SMI Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 6 Methodology: Large-Sample Online Behavior Tracking ► Business problem ► Major redesign of a large, complex site that is business-critical ► Description ► 200-10,000+ respondents do tasks using online tracking / survey tools ► Benefits ► Large sample size, low cost per respondent, extensive data possible ► Limitations ► No direct observation of users; survey design is complex; other issues ► Data ► You name it (data exports to professional analysis tools). ► Tools of Choice ► Keynote WebEffective, UserZoom Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • Recommended Reading: Albert and Tullis
  • 7 Methodology: Lab-based UX Testing ► Business problem ► Are there show-stopper (CI) usability problems with your user experience? ► Description ► 12-24 respondents undertake structured tasks in a controlled setting (lab) ► Benefits ► Relatively fast, moderate cost, very graphic display of major issues ► Limitations ► Small sample, study design, recruiting good respondents ► Data ► Summary data in tabular and chart format PLUS video out-takes ► Tools ► Leased testing room, recruiting service and Morae (industry standard) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • Recommended Reading: Tullis and Albert
  • 8 Methodology: Professional Heuristics ► Business problem ► Rapid feedback on UX design based on best practices or opinions ► Definition ► "A heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions (same root as: eureka)" ► Benefits ► Fast, low cost, can be very effective in some applications ► Limitations ► No actual user data; analysis only as good as the expert doing the audit ► Data ► Ranging from verbal direction to highly detailed recommendations ► Tools of Choice ► Written or verbal descriptions and custom tools by each expert. Cost / respondent: NA Statistical validity: None – Some – Extensive
  • 9 Methodology: Automated Online Card Sorting ► Business problem ► Users cannot understand where the content they want is located ► Description ► Online card sorting based on terms you provide (or users create) ► Benefits ► Large sample size, low cost, easy to field ► Limitations ► Sorting tools can confuse users; data is hard to understand ► Data ► Standard cluster analysis charts and more ► Tools of Choice ► WebSort and others Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • 10 Methodology: fMRI (Brain Imaging) ► Business problem ► What areas of the brain are being activated by UX design? ► Description ► Respondents are given visual stimuli while in an fMRI scanner ► Benefits ► Maps design variables to core functions of the human brain ► Limitations ► Expensive, and data can be highly misleading ► Data ► Brain scans ► Tools ► Major medical centers and research services (some consultants) Cost / respondent: Low – Moderate – High Statistical validity: None – Some – Extensive
  • Methodology: Field Study
  • Methodology: Card Sorting
  • "It is important to use Card Sorting for the right reasons and at the right time in the project, and to analyze the results in combination with other inputs." - DONNA SPENCER 2009
  • STEPS IN A CARD SORT: 1. Decide what you want to learn 2. Select the type of card sort (open vs closed) 3. Choose suitable content 4. Choose and invite participants 5. Conduct the sort (online or in-person) 6. Analyze results 7. Integrate results
  • BEFORE YOU GET STARTED! Inventory + Audit the Content
  • WHAT DO YOU WANT TO LEARN? New intranet vs existing? A section of the intranet? Whole organization vs single department? For a project? For a team?
  • OPEN VS CLOSED SORT (example diagram): in the open sort, participants group cards such as CRM, Year in Review, Product Targets, Organization Chart, Company News, Projects, Vacation Policy, Vacation request form, Pay Days, Christmas Party, Walkathon Results and Project Review Meeting into groups of their own making; in the closed sort, the same cards are placed into predefined categories such as Human Resources, Events and Departments.
  • SELECTING CONTENT
  • SELECTING CONTENT. Do's: 30-100 cards; select content that can be grouped; select terms and concepts that mean something to users. Don'ts: more than 100 cards; mixing functionality and content; including both detailed and broad content.
  • ANALYSIS
  • LOOK AT: what groups were created; where the cards were placed; what terms were used for labels; the organization scheme used; whether people created accurate or inaccurate groups
  • INTEGRATE RESULTS: CREATE YOUR IA (example sitemap with top-level sections Our Company, Projects, Departments, Centers of Excellence, Employee Resources, and Community & Groups, each broken down into sub-pages such as Mission and Values, Executive Blog, Project Names 1-4, Engineering, Operations, Project Management, Vacation & Holidays, Expenses, Benefits, Payroll, Events, Charitable Campaigns, Carpool, and Communication Tools)
  • Traditional Card Sort
  • Online Card Sorting
  • Card Sorting is as common as lab-based Usability Testing. Source: 2011 UxPA Salary Survey
  • Terms & Concepts. Open Sort: users sort items into groups and give the groups a name. Closed Sort: users sort items into previously defined category names. Reverse Card Sort (Tree Test): users are asked to locate items in a hierarchy (no design). Most users start browsing vs searching: across 9 websites and 25 tasks we found on average 86% start browsing. http://www.measuringusability.com/blog/card-sorting.php http://www.measuringusability.com/blog/search-browse.php
  • Methodology: Heuristic Evaluation
  • Heuristic evaluation is a popular pick. UPA survey results for HE/expert review (% of respondents by survey year): 77% in 2007, 74% in 2009, 75% in 2011
  • Why so popular? Fact or myth? Fast, cheap, easy, effective, convenient
  • Care to comment?
  • HE output: a list of usability problems; tied to a heuristic or rule of practice; a ranking of findings by severity; recommendations for fixing problems; oh, and the positive findings, too
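  • As an illustration of what that output can look like (this sketch is not from the deck; the fields and example findings are invented), a findings log can be a simple list of records reported worst-first:

```python
# Sketch: recording heuristic-evaluation findings and reporting the worst first (invented data).
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str        # which heuristic or rule of practice the problem ties to
    description: str      # the observed usability problem (or positive finding)
    recommendation: str   # suggested fix
    severity: int         # e.g. 0 (not a problem) .. 4 (usability catastrophe)

findings = [
    Finding("Visibility of system status", "No progress shown during upload",
            "Add a progress indicator", severity=3),
    Finding("Help and documentation", "Positive: inline help on the login form",
            "Keep as is", severity=0),
]

# Report the most serious problems first.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description} -> {f.recommendation}")
```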
  • Nielsen's 10 heuristics: 1. Visibility of system status 2. Match between system and real world 3. User control and freedom 4. Consistency and standards 5. Error prevention 6. Recognition rather than recall 7. Flexibility and efficiency of use 8. Aesthetic and minimalist design 9. Help users recognize, diagnose, and recover from errors 10. Help and documentation. (J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994)
  • What do you do? Do you do it (or teach it)? How do you do it? Why do you do it? Do you do it alone or with others? How do you report it? How do you charge for it?
  • HE sample findings page: usability.spsu.edu (UPA 2011)
  • Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives. (H = Hyperspace; C = Cardiac Arrest; S = Shock.) Finding 1: Objectives/goals for the modules. Description: reason the content is being presented; conciseness of presentation; definitions required to work with the module/content. Recommendation: develop a consistent structure that defines what's noted in the bulleted points above; avoid generic statements that don't focus users on what they will be accomplishing. Applies to H, C, S; severity rating 3. Finding 2: Evaluation criteria and methods. Description: advise that there is an assessment used for evaluation and indicate if it's at the end or interspersed in the module; direct tie between content and assessment measure; sequence of presentation. Recommendation: connect ideas in the introduction with outcomes in the assessment; quizzes challenge users; follow the order of presentation defined at the beginning; develop interesting and challenging questions; re-frame goals/objectives at the end of the module. Applies to H, C, S; severity rating 3.
  • usability.spsu.edu (UPA 2011)
  • Steve Krug's approach: all sites have usability problems; all organizations have limited resources; you'll always find more problems than you have resources to fix; it's easy to get distracted by less serious problems that are easier to solve, which means that the worst ones often persist; therefore, you have to be intensely focused on fixing the most serious problems first. (Rocket Surgery Made Easy, New Riders, 2010)
  • Krug's maxims: focus ruthlessly on a small number of the most important problems; when fixing problems, always do the least you can do.
  • Methodology: Usability Test
  • Usability components: Learnability, Efficiency, Memorability, Errors, Satisfaction. (Nielsen, Jakob. "Usability 101: Introduction to Usability." useit.com. Web. 26 Mar. 2012.)
  • Highly Recommended
  • Usability test cycle: identify 3-5 tasks to test → observe test participants performing tasks → identify the 2-3 easiest things to fix → make changes to site → start new test
  • Why Test at All? User-centered design; grounds design debates; methodical; surprising results
  • Identify Tasks for the Test: known problem areas; most common activities; popular pages; new pages or services
  • Recommended Gear: computer (or paper prototype); screen recording software (or observer); microphone (or observer)
  • Tasks: You want to renew the loan period on a book you've checked out. You need to find an article on the role of genes in schizophrenia. You want to see what hours the library is open next Monday.
  • Screen Recording Software (compared by price and features)
  • Staff of One: recruits test participants; runs the test; records the test (screen recording software & mic); preps test environment before & after test
  • "Guerrilla testing!" Photo and caption credit: Amanda Etches http://instagr.am/p/IuHhRri-wS/
  • Staff of Two: #1 recruits test participants, preps test environment before & after each test, and runs the test; #2 observes the test
  • Draft a Test Protocol: goals; recruitment; tasks; script; staff roles; equipment
  • Before Each Test Participant Arrives: clear cache in browser; check screen recording software; check mic
  • Getting Each Test Participant Started: chat about websites they like; explain why you'll be reading from a script; don't show the website until the first task is introduced
  • Working Through Tasks: stick to the script; encourage the participant to speak aloud; don't help the participant
  • As Each Participant Finishes: end and save the screen recording; give the participant any payment or reward; thank the participant profusely
  • After Test Is Done. Make notes: ASAP (same day); identify common problems. Write report: keep it simple; focus on fixing just 2-3 things. Test again: might not be the same tasks; lather, rinse, repeat.
  • Focus on Fixing Easy Things: small changes; iterative development and testing; a website that keeps up with users
  • For More Info:
Francoeur, Stephen. "Usability Testing Our New Website." Beating the Bounds. 16 Jan. 2012. Web. 27 Mar. 2012. link
Hassenzahl, Marc and Noam Tractinsky. "User Experience - a Research Agenda." Behaviour & Information Technology 25.2 (2006): 91–97. link
Krug, Steve. Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. Berkeley, Calif: New Riders, 2010. Print.
Nielsen, Jakob. "Usability 101: Introduction to Usability." useit.com. Web. 26 Mar. 2012. link
Reidsma, Matthew. "How We Do Usability Testing." Matthew Reidsma. 15 Nov. 2011. Web. 27 Mar. 2012. link
Reidsma, Matthew. "Why We Do Usability Testing." Matthew Reidsma. 25 Oct. 2011. Web. 27 Mar. 2012. link
University of Texas at Austin. "Usability Testing." Web Publishing. University of Texas at Austin. 27 Mar. 2012. link
  • Methodology: Remote Test
  • New technologies and techniques allow for remote moderated testing, remote unmoderated testing, and remote observation. Irrelevance of place.
  • Remote Moderated Testing: products like GoToMeeting allow connections from the test (or observation) computer to the Internet, and VoIP can carry voice cheaply. Participant, moderator, observers and translator connect via screen-sharing tools (LiveMeeting, WebEx, GoToMeeting) and VoIP audio (Skype, GoogleTalk).
  • Remote Unmoderated Testing ("Find the umbrella."): hundreds of users agree to participate in a study, in their natural context, from geographically spread locations; users try to complete tasks and answer questions; no human moderation is needed; a browser bar connects users with secure servers
  • Remote Unmoderated Testing, a robust set of services: "task-based" surveys (online/remote unmoderated usability studies; benchmarking (competitive/comparison); UX dashboards to measure ROI); online card sorting (open or closed; stand-alone or integrated with task-based studies & surveys); online surveys (ad hoc research; Voice of Customer studies; integrated with Web Analytics data); user recruiting tools (intercept real visitors (tab or layer); create your own private panel; use a panel provider*)
  • When to do remote testing: web/software UI; more quantitative; large, distributed sample or low incidence; low(er) budget; high penetration of Internet access. When to test in person: physical artifact; need rich qualitative feedback; need the human connection; ensure a high level of consistency; uncertain of quality or environment.
  • Introduction to remote usability testing
  • Defining Remote Testing Tools. What do we mean by "remote testing tools"? A remote testing tool is any technology that allows a researcher to collect data from users about their experience in using an interface, without direct, face-to-face contact. There are two ways of organizing remote testing tools: moderated vs. unmoderated, and qualitative-based vs. quantitative-based.
  • Unmoderated Testing Tools, ranging from full-service to self-service: quantitative-based (card sorting/IA, surveys, click/mouse tools) and qualitative-based (video tools, reporting, expert reviews)
  • Our UX Toolkit
  • Why Should You Care? Saves time (a lab study takes 2-4 weeks from start to finish; unmoderated typically takes hours to a few days*). Saves money (participant compensation is typically a lot less, $10 vs. $100, and tools are becoming very inexpensive). Reliable metrics (the only reasonable way to collect UX data from large sample sizes). Geography is not a limitation (collect feedback from customers all over the world). Greater customer insight (the richest dataset about the customer experience).
  • Why is this important NOW? Questions from senior management are becoming more complex and time sensitive; traditional usability is no longer enough; push to measure the UX; convergence with market research and web analytics to paint a more complete picture of the UX; budget/timeline constraints
  • Which one goes first? Lab first, then unmoderated: identify and fix the "low hanging fruit", then focus on remaining tasks with a large sample size; generate new concepts, ideas and questions through lab testing, then test/validate online; validate attitudes/preferences observed in lab testing. Unmoderated first, then lab: identify the most significant issues online through metrics, then use a lab study to gather deeper qualitative understanding of those issues; collect video clips or quotes of users to help bring metrics to life; gather all the metrics to validate the design (if it tests well, then there is no need to bring users into the lab).
  • Quantitative-based tools
  • Unmoderated Testing Tools - Quantitative (full-service to self-service: card sorting/IA, surveys, click/mouse tools)
  • Overview. Common research questions: What are the usability issues, and how big are they? Which design is better, and by how much? How do customer segments differ? What are user design preferences? Is the new design better than the old design? Where are users most likely to abandon a transaction? Types of studies: comprehensive evaluation; UX benchmark; competitive evaluation; live site vs. prototype comparison; feature/function test; discovery. Typical metrics: task success; task time; self-report ratings such as ease of use, confidence, satisfaction; SUS; click paths; abandonment.
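  • One of the metrics above, SUS (System Usability Scale), has a fixed scoring formula: odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A small sketch; the two respondents below are made up:

```python
# SUS scoring: standard formula; example responses are invented.
def sus_score(responses):
    """responses: ten 1-5 ratings in questionnaire order (items 1..10)."""
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items positive, even items negative
    return total * 2.5

respondents = [
    [5, 1, 4, 2, 5, 1, 4, 2, 5, 1],  # a very positive participant -> 90.0
    [3, 3, 4, 2, 3, 2, 4, 3, 3, 3],  # a middling participant -> 60.0
]
scores = [sus_score(r) for r in respondents]
print(scores, "mean:", sum(scores) / len(scores))
```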
  • Strengths / Limitations / Myths. Strengths: comparing products; measuring user experience; finding the right participants; focusing on design improvements; insight into the user's real experience. Limitations: not well suited to rapid, iterative design; need a deep understanding of issues; studies that require long sessions; lose control over prototypes; Internet access. Myths: only for testing websites; very expensive; only gathers quantitative data; a lot of noise in the data; the same as any market research study.
  • Full Service Tools – Online Usability Testing
  • Self Service Tools – Online Usability Testing
  • Card Sorting / IA Tools
  • Online Surveys
  • Click and Mouse Tools
  • Qualitative-based tools
  • Unmoderated Testing Tools - Qualitative (full-service to self-service: video tools, reporting, expert reviews)
  • Overview. Common research questions: What are the big usability pain points, and wins? How do users think and feel about the design? Are we on the right (design) track? What is the overall experience like? Why might users abandon a transaction? Types of studies: low fidelity prototype; high fidelity prototype; competitive evaluation; comprehensive evaluation; feature/function test; discovery. Common metrics: frequency of issues; verbatim comments; task success (via video); task time (via video).
  • Video Tools
  • Report Tools
  • Expert Review
  • Additional Resources. Online: www.measuringUX.com, www.measuringusability.com, http://usability.bentley.edu, http://www.usefulusability.com/24-usability-testing-tools/, http://remoteresear.ch/tools/
  • Methodology: Focus Group
  • During the interview. DO: take notes; smile; ask open-ended questions; get their story; shut up and listen. DON'T: talk about your product; ask about future behavior; sell; ask leading questions; talk much.
  • "It's really hard to design products by focus groups. A lot of times, people don't know what they want until you show it to them." - Steve Jobs. What are Focus Groups?
  • Let's dispense with this little turdblossom right up front: Henry Ford never said, "If I'd asked customers what they wanted, they would have said 'a faster horse.'" It's simply a myth.
  • What are focus groups? The concept of focus groups was developed in the 1930s by psychoanalyst Ernest Dichter as a social research method. Focus groups are structured interviews that quickly and inexpensively reveal a target audience's desires, experiences, attitudes, and priorities. Focus groups can be a useful technique when a company needs a lot of insight from potential or existing customers in a short amount of time.
  • When to do Focus Groups? In product design, focus groups are used early in the design cycle, when the team is generating ideas and seeking to understand the needs of the target audience. Early in the design cycle, focus groups can help the company understand: users' fundamental issues and perceptions of the product; what users believe are the important features of the product; what types of problems users experience with the product; where users feel the product fails to meet their expectations.
  • Focus groups cannot be used to unequivocally prove or disprove a hypothesis about the user experience of a product.
  • When NOT to do Focus Groups? When the objective is to acquire usability information: a group of people can't provide specific information regarding product features without structured usability testing sessions. When seeking to understand the perspectives of the bigger population: quantitative data that is generalizable to the bigger population requires surveys or other methodological approaches with a large sample of participants, and there is no guarantee that the proportion of responses in the group matches that of the larger population of users. Although focus groups are an excellent way to gather motivations and insights from users, they cannot be used to unequivocally prove or disprove a hypothesis about the user experience.
  • Types of Focus Groups: Exploratory; Feature Prioritization; Competitive Analysis; Trend Explanations
  • Exploratory Focus Groups: typically conducted at the beginning of a design cycle. They uncover users' general attitudes on a given topic, allowing product designers to see how their users will understand the product, what words users will use to speak about it, and what criteria they will use to judge it.
  • Feature Prioritization Focus Groups: generally held at the beginning of the product design cycle, when the outlines of the product are clear. These groups focus on the features of the product that are most attractive to users, with an emphasis on why they are appealing. The underlying assumption of this type of focus group is that the participants are interested in the product, with discussion focusing on what kinds of things they would like the product to do for them.
  • Competitive Analysis Focus Groups: aim to uncover what attracts and repels users with respect to competitors' sites. What associations do users have with the competitor? What aspects of the user experience do they find valuable? Where does the product satisfy users' needs and where does it not suffice? What emotions does the product evoke? How do users identify with the product? This type of focus group is often conducted anonymously.
  • Trend Explanations Focus Groups: generally held either during a re-design part of the development process, or in response to specific emotional or functional issues in product development. They explore the trends of users' behaviors, needs, and expectations within and across products.
  • How to conduct Focus Groups
  • How to conduct Focus Groups. Assemble your team: make sure you have a good cross-section of product, UX, marketing and development. Create a schedule: a good schedule provides sufficient time for recruiting, testing, analyzing and integrating results. Define your users: recruit participants who are your users and thus likely to provide the best feedback, usually 6-8. Define the scope of your research: What is the complexity of your questions? What is the depth at which you wish to explore the answers? This will determine the number of people and the number of groups that need to be conducted.
  • How to run your Focus Groups. Choose topics for discussion: on average, 3-5 topics per 90-minute focus group. Create a discussion guide: consider the "core" questions you and your product team are trying to answer and prioritize them. Establish roles: Who will moderate? Who will take notes? Who will lead the discussion afterwards?
  • Asking Good Questions. Questions should be: carefully ordered, thus positioning participants within a certain frame of mind and containing an intuitive flow; non-directed, i.e. they should not imply an answer (example: "How difficult do you find this feature?"); open-ended, i.e. general enough not to constrain answers to specific responses (limit yes-no questions); focused on the specific topics you are investigating; personal, since people love to generalize their experience to the bigger public, so create questions that concentrate on a person's current behavior and opinions without many opportunities to project their experiences onto the general public; unambiguous, i.e. clear and concise, with few shades of meaning.
  • Example Discussion Guide. Warm up and introduction (approx. 15 min): introduction of the moderator; ice breakers for participants; outline of the process. Main topic discussion: moderated group discussion that focuses on specific questions you and your company have regarding the product. Wrap-up: final thoughts and reflections.
  • Warm Up & Introduction Tips Telling participants that they were chosen to be part of the group allows them to feel more comfortable with one another Informing the group of the purpose of the session focuses their attention to the desired end goal Clearly set out expectations and “ground rules” for discussion (no blocking, interruption, flow of discussion) Acknowledge any potentially anxiety provoking features of the environment (camera, mirrored wall) to help people feel more comfortable• Inform participants of their rights to participation Freedom to leave at any point, confidentiality of their thoughts
  • Main Discussion Tips. Probes and follow-up questions are extremely useful: they dig deeper into any given topic, and they clarify what people mean when they state their opinions. My definition of "useful", "clear", and/or "good" may not mirror other people's definitions of these terms. Probes help create a common definition of terms and alleviate potential misunderstandings between the researcher and participant.
  • Context Is King. A comfortable environment is key to a lively discussion. Limited interruptions: after the session begins, no new person should join the session so that the dynamic isn't altered by another person's presence. Food is encouraged: eating is an informal activity that often breaks tension in any group; no noisy snacks that disturb the conversation. Seating order: have a 10-15 minute social time before the focus group starts so the moderator can identify introverts, extroverts, and alpha-jerks. Videotaping advised: human interaction is incredibly complex, and since the moderator is part of the group dynamic, it is helpful to videotape the sessions in order to capture gestures and other subtle interactions.
  • John McLaughlin is a perfect example of a bad moderator and a douchebag: highly opinionated, clearly biased, with a tendency to dominate the discussion. What about the moderator?
  • What about the moderator? Group moderation is a skill. Basic skills any moderator should embody are: respect for the participants; the ability to listen closely to others' perspectives; the ability to think fast on multiple levels simultaneously. The moderator must be able to predict the direction of the conversation and drive it toward a desired direction, without the participants realizing that they are being moderated. This can be accomplished via the moderator's subtle cues, tone, and/or body language.
  • Effective Moderation. Control: the moderator should always be in control of leading the discussion towards answers to questions, and deterring tangents. Good time management: the flow should be monitored so that topics are introduced at appropriate times and transitions are intuitive and natural. Participant-focused: a moderator should mediate the discussion, rather than expressing opinions. Respect: all participants should feel comfortable and have a voice, with alpha-jerks managed. Preparation: the moderator should have sufficient knowledge of the product space.
  • Effective Moderation Tips  Spend time with participants beforehand to get a sense of who is quiet and might need more attention  Stick to the guide but be flexible enough to stray away from the script when necessary  Engage ALL participants in the discussion  Avoid introducing new terminology and concepts  Be mindful of body language  Clarify any comments & restate ideas and opinions to ensure everyone is on the same page  Probe for alternative opinions on any given topic  Don’t dominate the discussion, allow the group to lead  Provide the group with time to think & give a break when necessary  Use humor when appropriate and keep the energy level high
  • Common obstacles. "An electronics company was testing a new boombox they hoped to start selling. Their research included focus groups where they showed the two colour options: yellow and black. The participants were in agreement that yellow was the best colour because it is vibrant and energetic. At the end of the focus group they were each allowed to take a boombox home and could choose yellow or black. They all chose black." - Steve Mulder, "The User is Always Right"
  • Common Obstacles. The moderator is not an objective observer: the moderator affects the group dynamics and discussion. Focus groups reveal the way people think, not the way they actually behave. Opinions from focus groups may be limited to the participants in the sessions: the sample may be biased for more reasons than just small size, and therefore cannot be adequately extrapolated to represent the bigger population. Reticent individuals are often silenced by outspoken ones: data may be biased toward those who speak up. "Vividness effect": people often provide examples of situations that are most emotionally vivid to them.
  • Common Obstacles. Overly talkative participants: when people are clearly talking without a purpose, ask them kindly to wrap up and move on; moderate the extroverts, probe the introverts. Group dominance (the Alpha-Jerk Effect): a single dominant/bullying participant can ruin the focus group. Unqualified participants: at times people misunderstand what the participation criteria are, or misrepresent their experience. Tangents: they can be useful for discussion of values and ideas, but should be wrapped up quickly and redirected to the main discussion point. Hostility & offensive ideas: vehement disagreement or offensive ideas can lead people in the group to feel uncomfortable; the moderator should redirect the conversation to focus on the ideas behind any given perspective: go meta!
  • Analyzing the data. "Researchers must continually be careful to avoid the trap of selective perception." - Richard A. Krueger
  • What data to collect? Focus groups produce a ton of potentially useful information, which can be extracted by means of: transcripts, quotations, observer opinions, models, videotapes. What information should be prioritized depends upon which form of data answers your question, and how quickly you need to synthesize the results.
  • Analysis Steps. Capture the initial hypothesis: during the debriefing (held with little time lapse from the focus group, to retain memory), discuss with other observers your thoughts regarding the group's opinions and feedback. Transcribe and code: video interaction should be transcribed, and themes/trends of opinions should be extracted via coding. Coding is a method of extrapolating ideas from the transcripts and categorizing the responses (thus generating quantitative data). Your codes (general categories) should be short, concise, descriptive in nature, and accurately depict a user's single idea.
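  • The "quantitative data" that coding generates is usually just frequency counts of codes across segments and participants. A minimal sketch of that tally; the codes and segments below are invented for illustration:

```python
# Sketch: tallying focus-group codes into frequencies (all example data invented).
from collections import Counter

# (participant, code) pairs produced while coding the transcript.
coded_segments = [
    ("P1", "navigation-confusion"),
    ("P2", "navigation-confusion"),
    ("P2", "likes-search"),
    ("P3", "price-concern"),
    ("P4", "navigation-confusion"),
]

code_counts = Counter(code for _, code in coded_segments)
participants_per_code = {
    code: len({p for p, c in coded_segments if c == code}) for code in code_counts
}

for code, n in code_counts.most_common():
    print(f"{code}: {n} mentions from {participants_per_code[code]} participant(s)")
```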
  • Coding Framework. Top-down: a hypothesis of what types of themes the participants' transcripts will generate already exists and is confirmed using the data; a pre-existing model is applied to the data. Bottom-up: the data is explored without a pre-existing framework in mind, generating themes based on the responses in the transcript; the model is generated from the data. The bottom-up approach to extracting themes from focus group interviews is recommended since it is less biased and more true to the data.
  • Extracting Trends in Data Mental models Mental representations of how your users understand the way the world or a product works Values What do people like and dislike and what criteria do they use to establish their opinions Stories Stories are a powerful way that people capture their unique, subjective experiences, and provide details about their assumptions, order of doing things and ways of solving problems Product pitfalls Brainstorms during focus groups can produce a list of problems that users experience using the product
  • Getting the most out of the data Questions to read for  What are reasons behind people’s opinions?  What terminology do people use and do products speak their users’ language?  Where do people contradict themselves?  When do people change their minds and how does that reveal their actual values and perspectives on a given product?  What do people consider to be important and is the product that is popular actually important to them?
  • Methodology: Benchmarking
  • Methodology: Contextual Inquiry
  • What is Contextual Inquiry?
  • What is contextual inquiry? Contextual inquiry is a field data-gathering technique that studies a few carefully selected individuals in depth to arrive at a fuller understanding of the work practice across all customers. Through inquiry and interpretation, it reveals commonalities across a product's customer base. ~ Beyer & Holtzblatt
  • Contextual Inquiry: when to do it. Every ideation and design cycle should start with a contextual inquiry into the full experience of a customer. Contextual inquiry clarifies and focuses the problems a customer is experiencing by discovering the precise situations in which the problems occur, what the problems entail, and how customers go about solving them.
  • Contextual Inquiry: how to do it. What is your focus? Who is your audience? Recruit & schedule participants; learn what your users do; develop scenarios; conduct the inquiry; interpret the results; evangelize the findings; rinse, repeat (at least monthly).
  • Research Process
  • Inquiry Process: Main Observation. Observe the participants: What are they doing? What tools are they using? How are they using the tools? Have they developed any work-arounds? Occasionally ask the participant to describe what they are doing, providing any necessary explanations, clarifications, or walk-throughs of actions. Take copious notes. Video record, if possible.
  • Inquiry Process: Follow-Up & Wrap-Up. Follow-up: after the main observations, ask the participant any follow-up questions you may have regarding your observation while his/her memory is still fresh. Wrap-up: ask the participant about his/her experience and perspective on the inquiry process. Was anything anxiety provoking? Was there anything you would like to have been done differently?
  • Inquiry Results: What to collect What tools do participants use? Formal tools? Informal tools? What brands? Behavior sequences Order of actions is important to understanding how the participants think about the task Methods of organization How do the participants organize the information they use? (out of necessity, convenience, importance?) What kinds of interactions do participants have? What are the important parties in the transfer of knowledge? Are they people? Are they processes? Is the information shared? What is the nature of the interaction? Collect Artifacts Artifacts are the non-digital tools people use to help them accomplish the tasks they’re trying to do. (ex: if you are interested in learning how people schedule their events, photograph their daily calendar)
  • Analyzing the results. "The output from customer research is not a neat hierarchy; rather, it is narratives of successes and breakdowns, examples of use that entail context, and messy use artifacts." ~ Dave Hendry
  • Research Analysis. What are people's values? People are driven by their social and cultural contexts as much as their rational decision-making processes. What are the mental models people build? When the operation of a process isn't apparent, people create their own models of it. What are the tools people use? It is important to know what tools people use, since you are building new tools to replace the current ones. What terminology do people use to describe what they do? Words reveal aspects of people's mental models and thought processes. What methods do people use? Flow of work is crucial to understanding what people's needs are and where existing tools are failing them. What are people's goals? Understanding why people perform certain actions reveals an underlying structure of their work that they may not be aware of themselves.
  • Affinity Diagrams. "People from different teams engaged in affinity diagramming is as valuable as tequila shots and karaoke. Everyone develops a shared understanding of customer needs, without the hangover or walk of shame."
  • Research Analysis: Affinity Diagrams. Create a hierarchy of all observations, clustering them into themes. From the video observations, 50-100 singular observations are written on post-its (observations ranging from tools, sequences, interactions, work-arounds, mental models, etc.). With the entire team, notes are categorized by relatedness into themes and trends.
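  As a minimal sketch of what the output of this exercise looks like, the snippet below groups observation notes under the theme each one was assigned during the session and prints the resulting hierarchy. The notes and theme names are invented for illustration; the clustering itself still happens on the wall with the team, not in code.

```python
# Minimal sketch: a digital stand-in for the post-it wall, grouping observations
# under manually assigned themes so the hierarchy can be shared and revisited.
# Theme names and notes are illustrative, not from a real study.
from collections import defaultdict

observations = [
    ("keeps a paper checklist next to the keyboard", "work-arounds"),
    ("re-types the order number into a spreadsheet", "work-arounds"),
    ("calls a colleague before approving anything", "interactions"),
    ("uses the search box instead of the navigation", "mental models"),
]

themes = defaultdict(list)
for note, theme in observations:
    themes[theme].append(note)

for theme, notes in themes.items():
    print(theme)
    for note in notes:
        print("  -", note)
```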
  • Methodology: Eye Tracking
  • What is eye tracking?
  • Eye tracking records what people look at
  • See the user's gaze - Live. You can follow what the user pays attention to in real time. The participant's gaze is marked by red dots and red lines. A camera displays the participant, so you can see their facial expressions and body language.
  • Eye tracking results: Heatmaps. Heatmaps show what participants focus on. In this example, 'hot spots' are the picture of the shoes, the central entry field and the two right-hand tiles underneath. The data of all participants is averaged in this map.
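  For readers curious how such a map is produced, a heatmap of this kind can be approximated by accumulating fixation points from all participants, weighted by fixation duration, and smoothing the result. The sketch below assumes fixations are available as (x, y, duration) tuples; the screen size, smoothing width and sample data are illustrative assumptions, not values from any particular eye tracking tool.

```python
# Minimal sketch: build a fixation heatmap from pooled fixation data.
# Assumes a list of (x, y, duration_ms) fixations in screen coordinates;
# screen size, sigma and the sample data are illustrative.
import numpy as np
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

def heatmap(fixations, width=1280, height=1024, sigma=40):
    grid = np.zeros((height, width))
    for x, y, duration in fixations:
        if 0 <= int(y) < height and 0 <= int(x) < width:
            grid[int(y), int(x)] += duration      # weight by fixation length
    return gaussian_filter(grid, sigma=sigma)      # smooth into 'hot spots'

fixations = [(320, 240, 350), (330, 250, 420), (900, 700, 180)]  # example data
plt.imshow(heatmap(fixations), cmap="hot")
plt.axis("off")
plt.show()
```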
  • Eye tracking results: Gazeplot. Gaze plots show the 'visual path' of individual participants. Each bubble represents a fixation. The bubble size denotes the length or intensity of the fixation. Additional results are available in table format for more detailed analysis. More examples with interpretations are coming up. But before…
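  A gaze plot can likewise be drawn from a single participant's fixation list: connect the fixations in viewing order and scale each bubble by its duration. The sketch below uses invented sample data and matplotlib purely for illustration.

```python
# Minimal sketch: draw a gaze plot for one participant.
# Each bubble is a fixation, sized by duration, connected in viewing order.
# The fixation list is illustrative data, not taken from the slides.
import matplotlib.pyplot as plt

fixations = [(200, 150, 300), (640, 300, 550), (700, 720, 200)]  # (x, y, duration_ms)
xs, ys, durations = zip(*fixations)

plt.plot(xs, ys, color="red", linewidth=1)          # visual path
plt.scatter(xs, ys, s=durations, alpha=0.5)         # bubble size ~ fixation duration
for i, (x, y) in enumerate(zip(xs, ys), start=1):
    plt.annotate(str(i), (x, y))                    # fixation order
plt.gca().invert_yaxis()                            # screen coordinates: y grows downward
plt.show()
```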
  • How does it work?
  • Pupils move a lot. Our pupils are constantly in motion. When the pupil is moving, it's called a 'saccade'. During a saccade, visual perception is unlikely or even impossible.
  • How we 'look'. The pupil must focus on a point in order to perceive colour, faces, writing, etc. That is called a 'fixation'. Eye tracking measures the speed of the pupil and can thus detect when a fixation is happening!
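  One common way this detection works in practice is a velocity threshold: gaze samples that move slower than a cut-off speed are grouped into fixations, and faster movement is treated as a saccade. The sketch below is a simplified version of that idea; the sample rate, the pixel-based threshold and the data format are assumptions for illustration, not any vendor's actual algorithm.

```python
# Minimal sketch of velocity-threshold fixation detection: slow gaze movement
# is classified as part of a fixation, fast movement as a saccade.
import math

def detect_fixations(samples, sample_rate_hz=60, velocity_threshold=1000.0):
    """samples: list of (x, y) gaze points in pixels, one per frame.
    velocity_threshold is in pixels per second (tune for screen size and distance).
    Returns a list of (start_index, end_index) spans classified as fixations."""
    dt = 1.0 / sample_rate_hz
    fixations, start = [], None
    for i in range(1, len(samples)):
        (x0, y0), (x1, y1) = samples[i - 1], samples[i]
        velocity = math.hypot(x1 - x0, y1 - y0) / dt  # pixels per second
        if velocity < velocity_threshold:             # slow movement: part of a fixation
            if start is None:
                start = i - 1
        else:                                          # fast movement: a saccade
            if start is not None:
                fixations.append((start, i - 1))
                start = None
    if start is not None:
        fixations.append((start, len(samples) - 1))
    return fixations
```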
  • What do these fixations tell us? Fixations are linked to attention. Moving your eyes means moving attention. A fixation does not mean that the participant definitely perceived an element. But it is fair to say that elements that draw visual attention have a higher chance of being perceived (consciously or subconsciously).
  • How can a monitor tell what I look at?
  • The red-eye effect. There's a layer in our eyes that reflects light, including infrared. This is where the red-eye effect in photos comes from: the camera flash reflects off this layer. The eye tracking monitor makes use of this effect!
  • What the eye tracker sees. The eye tracking monitor uses infrared light to make the pupils of the person sitting in front of it light up and so become detectable. This is what it looks like for the monitor.
  • Monitors - No strings attached. It used to be like this; now it's all free and comfy. The monitor can capture the gaze in a wide area, so the participant can relax and move naturally.
  • Eye Tracking & User Research
  • You run user research to understand: your users' actual behaviour; your users' attitudes, feelings, and motivations; your users' experience with your company/organisation (stories); what is and isn't working in your design/product.
  • You add eye tracking to get: a deeper understanding of users' actual behaviour; insight into users' subconscious or instinctive behaviour; a better understanding of why your design does or doesn't work; evidential (quantitative) data.
  • There are two scenarios for eye tracking. The Check-Up: How is my design, website or product performing? How do users perceive my design/website/product? Do users notice what I want them to notice? How is my design performing in the context of typical usage, given the website's/product's design purpose? Inform your design: Use eye tracking data to support your design process. Conceptual design: what basic structure works best? Wireframe stage: where shall I place my content or images? Detailed design: how does my visual design serve the tasks?
  • The Check-Up: How is my design, website or product performing?
  • Set-up of an eye tracking test. User tests are often run in 45 to 60 minute sessions with 6 to 15 participants: 1. Participants are given a number of typical tasks to complete, using the website, design or product you want to test. 2. The user's intuitive interaction is observed; comments and reactions are recorded. 3. The participant's impressions are captured in an interview at the end of the test.
  • What happens then? The next step is to analyse the eye tracking data and the user's feedback. We focus on: what users saw, what users overlooked, and what they thought and felt about the website, design or product. The next slides are a couple of examples.
  • Examples: Testing website designs. What do you think draws the user's attention on this site? The listed offers in the centre, or the special offer banners on the right?
  • The site suits browsers and focussed users. One participant thoroughly reads the listed offers: whenever a destination sparks her interest, she looks at the offer details, e.g. the price. Another participant focusses on the right-hand banners: he briefly gazes at the listed offers, but shows no reading behaviour there.
  • What drew most attention on this design?
  • The key visual and a box at the bottom. The key visual got heaps of attention. The main navigation and its options got almost no attention. Surprising: a box at the bottom got lots of attention. It reads: "If you are having trouble getting through to us on the phone, please click here to email us, we'll get back to you within 2 business days". Participants got the impression that Telstra Clear has trouble with their customer service. Note: Telstra Clear have since re-designed their homepage.
  • Inform your design: Use eye tracking data to support your design process.
  • Design process. There are lots of decisions to make in all stages of the development process:
  • Decisions like these… Where should the 'Pay now' button be? Will users notice this if I put it here?
  • … or these: How does my design perform compared to others? Does my design draw enough attention?
  • … and these: Does Design A work better? … or Design B?
  • Design principles revealed by eye tracking
  • Face Effect. Humans are programmed to recognise faces. Everywhere. This effect can be seen in eye tracking. Faces always draw attention!
  • The Face effect: an example. Yep, there's attention on certain… areas; the face, however, is the strongest point of focus!
  • Using the Face effect. The 'Face effect' can be used to drive perception. Here's a great example from humanfactors.com: two versions of an ad design (Version A and Version B) were tested using eye tracking. The goal of the ad is of course to draw attention to the product name.
  • Using the Face effect. Eye tracking results for ad Version A: We see a face effect: the model's face draws a lot of attention. The slogan is the other hot spot of the design; participants will likely have read it. The product and its name get some, but not a lot of, attention.
  • Using the Face effect. Eye tracking results for ad Version B: Again, we see a strong face effect. BUT: in this version, the model's gaze is in line with the product and its name. The product image and name get considerably more attention! Additionally, even the product name at the bottom is noticed by a number of participants.
  • Ways to focus attention. Same effect: if the baby faces you, you'll look at the baby. But if the baby faces the ad message, you pay attention to the message. You basically follow the baby's gaze. (usableworld.com.au)
  • Banner Blindness: Did we learn to ignore them?
  • Central banners. Central banners are used on a lot of homepages. They use prime real estate on the homepage. That means they must be in the centre of attention, right?
  • Banner blindness… or are they? In this test, participants were given a task: find the nearest ATM. Participants focused on the main navigation and the footer navigation; this is where they found the 'ATM locator'. So, when visiting a site with a task in mind, as you normally do, the central banner can be ignored!
  • Compare the visual paths: Task versus browse. When browsing, the central banner gets lots of attention. But how often do you visit a bank website just to browse? One participant was asked just to look at the homepage; the other was given a task ('Find the nearest ATM').
  • Main focus: Navigation options. Eye tracking results show: when looking for something on a website, the main focus of attention is the navigation options (example tasks: 'What concerts are happening in Auckland this month?' and 'You want to send an email to customer service'). Maybe users have learned that they're unlikely to find what they're looking for in a central banner image.
  • When do users look at banners? In this example, participants looked at the banner even though they were looking for something specific (task: 'You want to get in touch with customer service', compared with participants asked just to look at the homepage). What's different?
  • Banner Blindness: The trick is… … don't make your banners look like banners!
  • The bottom line: User research + Eye tracking = a more complete understanding of your user's experience.
  • Methodology: Diary Study
  • Customer Validation: Diary Studies help here
  • Diary Studies from 30,000 feet. What? How? Why? Analyzing and gaining insights. Practical tips. Ideas on how to apply.
  • A diary study involves participants reporting their activities over a specific period of time, usually apart from the researcher and in their normal daily lives.
  • Benefits• Why users behave a certain way• Right context & environment• Remote• Sample over a duration• Bridge qualitative & quantitative• Integrate with metrics & tools• Flexibility
  • Users record thoughts, comments, etc. over time; interview users; gather feedback and data; organise and analyse (affinity maps, analytics).
  • Participants keep a record of 'When' data (date & time, duration, activity/task) and 'What' data (activity/task, feelings/mood, environment/setting).
  • No one right way to collect data. Structured: yes/no, select a category, date & time, multiple choice. Unstructured: open-ended, opinions/thoughts/feelings, notes/comments. Combine / mix & match.
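  As a sketch of what a single diary record could look like when structured and unstructured fields are combined, here is one possible entry format; the field names, categories and sample values are illustrative assumptions, not a prescribed template.

```python
# Minimal sketch of a diary entry mixing structured and unstructured fields.
# All field names and choices below are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DiaryEntry:
    participant_id: str
    timestamp: datetime                      # "when" data
    duration_minutes: Optional[int] = None
    activity: str = ""                       # "what" data
    mood: Optional[str] = None               # structured: pick from a category
    environment: Optional[str] = None        # e.g. "home", "commute", "office"
    completed_task: Optional[bool] = None    # structured: yes/no
    notes: str = ""                          # unstructured: open-ended comments

entry = DiaryEntry(
    participant_id="P07",
    timestamp=datetime(2012, 1, 18, 8, 45),
    duration_minutes=10,
    activity="Checked account balance on phone",
    mood="frustrated",
    environment="commute",
    completed_task=False,
    notes="Login kept timing out on the train; gave up after two tries.",
)
```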
  • Organize before analyze. Affinity mapping (integrate with "To the whiteboard"* exercises): extract notes from interview data; place similar notes together, repeat; "tag" clusters according to topic, title, concept, etc.; extract learning and insights by illustrating the whole concept; see "Rich Pictures" from Soft Systems Methodology. *See "Enterprise guide to customer development" by Brant Cooper & Patrick Vlaskovits.
  • "Hygiene" aspects. At the beginning: introduction / get-to-know-you; demographics & psychographics, profiling; instructions / setting expectations. At the end: follow-up; thanks / token gift; reflection.
  • Pitfalls• Belief bias• Behavior adjustment• Ramp-up time• Failure to recall
  • Start small, be nimble• Low-tech (email, SMS, twitter, paper/pen, phone interviews)• Simple, not complex, diary forms• Let users just ‘dump’ facts, don’t force them to think so much• Catch up regularly & share often• Tweak on the fly
  • Bring an observer or two. Efficiency is shared experience. Bring one or two key stakeholders along to the interview sessions (e.g. lead dev, mktg, sales, CEO, etc.). Share and contrast: don't be precious about your own viewpoints.
  • Map diary to session tracking Validate diary data against your metrics. Customers sometimes behave differently than what they say. Make the data richer.
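  One way to do this cross-check is sketched below: match each diary entry to logged analytics sessions from the same participant within a time window, so that entries with no matching session (or sessions with no diary entry) stand out for follow-up. The session log format and the one-hour window are assumptions for illustration.

```python
# Minimal sketch: cross-check diary entries against logged analytics sessions
# for the same participant, to spot mismatches between what people say and do.
from datetime import timedelta

def match_sessions(diary_entries, sessions, window=timedelta(hours=1)):
    """diary_entries / sessions: lists of dicts with 'participant_id' and
    'timestamp' (datetime). Returns (entry, matching_sessions) pairs;
    an empty match list is worth a follow-up question."""
    results = []
    for entry in diary_entries:
        matches = [
            s for s in sessions
            if s["participant_id"] == entry["participant_id"]
            and abs(s["timestamp"] - entry["timestamp"]) <= window
        ]
        results.append((entry, matches))
    return results
```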
  • Integrate with business metrics: use diary studies to test business assumptions; test against segments; useful for generative or evaluative research; scale up or down; test with prototypes; etc. (Taken from "Enterprise guide to customer development" by Brant Cooper & Patrick Vlaskovits.)
  • 'Hidden' insights from qualitative data: useful for new ideas; generates more questions; hidden facts about customers; stuff you never knew to look for; store them for later.
  • Take-aways• Diary studies are flexible, tweak to your requirements• Use it to address gaps in your research data• Integrate it with your existing measurement machinery• Refine and iterate your approach
  • More on diary studies: http://www.webcredible.co.uk/user-friendly-resources/web-usability/diary-study-guide.shtml and http://www.system-concepts.com/articles/usability-articles/2011/a-quick-start-guide-to-online-diary-studies.html. Analyzing using affinity maps: http://www.usabilitybok.org/methods/affinity-diagram?section=how-to and http://www.mindtools.com/pages/article/newTMC_86.htm. Also the Gamestorming book by Dave Gray, Sunni Brown, James Macanufo.
  • Methodology: Cognitive Walkthrough
  • References
  • Cited / Referenced Works
  • Know Thy User: The Role of Research in Great Interactive Design (frog, Sep 2012)
  • The Mobile Frontier (Rachel Hinman, Feb 2012)
  • Introduction to AgileUX: Fundamentals of Customer Research (Will Evans, Jan 2012)
  • Customer Research & Persona Development (Will Evans, Oct 2012)
  • Introduction to UX Research: Conducting Focus Groups (Will Evans, Jan 2012)
  • Midwest UX 12: Mapping the Experience (Chris Risdon, Jun 2012)
  • Eye Tracking & User Research (Optimal Usability, Apr 2012)
  • Taking it to the streets: Investigating mobile and web UX through field studies (Emma Rose, Jun 2012)
  • NYTECH "Measuring Your User Experience Design" (New York Technology Council, Mar 2012)
  • How to Conduct UX Benchmarking (UserZoom, May 2012)
  • Customer validation with Diary Studies (Boon Chew, Jan 2012)
  • The Science of Great Site Navigation: Online Card Sorting + Tree Testing (UserZoom, Jul 2012)
  • Introduction to Card Sorting (ThoughtFarmer, Sep 2012)
  • Usability Testing Basics (Stephen Francoeur, Mar 2012)
  • Storytelling: Rhetoric of heuristic evaluation (Southern Polytechnic State University, Mar 2012)
  • Cognitive and pluralistic (Aarushi Mishra, Oct 2012)
  • How to Quantitatively Measure Your User Experience (Richard Dalton, May 2012)
  • Remote Testing Methods & Tools Webinar (UserZoom, Dec 2011)
  • Beyond User Research (Louis Rosenfeld, Mar 2011)
  • User Interview Techniques (Liz Danzico, May 2010)
  • Design Research For Everyday Projects - UX London (Leisa Reichelt, Jun 2009)
  © 2012 InnoUX & Innodesign. All rights reserved.
  • Thank you for listening!