Better UX Surveys at UCD2012 by @cjforms

Better UX Surveys: a workshop led by Caroline Jarrett at the UCD 2012 conference in London, UK on 10th November 2012. Slides include feedback from some exercises.

Presentation Transcript

  • Better UX Surveys: UCD2012 workshop led by Caroline Jarrett
  • Many thanks to the organisers, volunteers, and sponsors of UCD2012, London.
  • Surveys: your views on these statements. A. "It's when someone says, 'Can't I just send out a survey and collect the data?' that I start to shake." – Indi Young http://rosenfeldmedia.com/books/mental-models/blog/oxymoron_scientific_survey/ B. "Online surveys are a great option for business owners who would like to conduct their own research." – Smart Survey http://www.smart-survey.co.uk/articles/10-advantages-of-online-surveys/
  • Better UX surveys could be… 1. Improved questionnaires 2. Surveys that ask about user experience in a better way 3. Surveys that deliver more helpful insights for UX design. © Caroline Jarrett and Effortmark Ltd
  • Three ways UX people encounter surveys: 1. Post-test / post-task surveys, e.g. SUS 2. Someone is going to do a survey anyway 3. Triangulating between survey data and data from elsewhere. Image credit: infodesign.com.au
  • Agenda: 1. Post-test / post-task surveys 2. Someone is going to do a survey anyway 3. Triangulating between survey data and data from elsewhere
  • Presser et al 2004: pretesting focuses on a "broader concern for improving data quality so that measurements meet a survey's objective". Field testing focuses on the mechanics and procedures; cognitive interviewing focuses on the questions; usability testing focuses on the interaction. http://www.slideshare.net/cjforms/introduction-to-usability-testing-for-survey-research
  • Try some cognitive interviewing. Pair up; one person gets to be the interviewer. Non-interviewer: wait for your instruction. Interviewer: ask your pair to think aloud while answering this question, and take notes. "How many windows are there in your house?" (Dillman et al, 2009)
  • OK, now swap and try this question. Please think about a computer system or web site that you used recently. Now think aloud as you answer this question. (From the SUS, the System Usability Scale, Brooke 1986)
  • Tip: ask questions that people can answer.
  • We've got a lot of different goals to consider: what the organisation wants to achieve, our aims in doing a survey, and what the user wants to do.
  • Let's start here: our aims in doing a survey.
  • We use post-test questionnaires for comparisons: one iteration with another; products with each other; this product with an ideal. (Our aims in doing a survey.)
  • Tullis and Stetson found that SUS was the best questionnaire for comparisons. Tullis, T. S. and J. N. Stetson (2004). A Comparison of Questionnaires for Assessing Website Usability. UPA 2004 Conference. http://www.upassoc.org/usability_resources/conference/2004/UPA-2004-TullisStetson.pdf
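For readers who want to make these comparisons themselves, SUS responses are conventionally turned into a 0–100 score using Brooke's scoring rule (odd-numbered items are positively worded, even-numbered items negatively worded). A minimal sketch; the function name and the example responses are illustrative, not from the workshop:

    def sus_score(responses):
        """Convert ten SUS item responses (each 1-5, strongly disagree to
        strongly agree) into a single 0-100 score using Brooke's scoring rule."""
        if len(responses) != 10:
            raise ValueError("SUS needs exactly 10 item responses")
        total = 0
        for item, r in enumerate(responses, start=1):
            if item % 2 == 1:      # odd items are positively worded: score = response - 1
                total += r - 1
            else:                  # even items are negatively worded: score = 5 - response
                total += 5 - r
        return total * 2.5

    # Illustrative use: one participant's answers to items 1-10
    print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0

Scores from several participants can then be averaged per product or per iteration, which is the kind of comparison Tullis and Stetson examined.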
  • The ideal: everything in balance between good questions and comparisons.
  • Agenda: 1. Post-test / post-task surveys 2. Someone is going to do a survey anyway 3. Triangulating between survey data and data from elsewhere
  • Survey = questionnaire + process. © Caroline Jarrett and Effortmark Ltd
  • A sadly uninformative "survey" process: "Voice of the customer" → some questions → send after each transaction → reward or punish staff → insight. Notice the big gap.
  • Tip: ask a sample, not everyone. ("Make me feel special.")
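A minimal sketch of what sampling, rather than surveying everyone, might look like in practice; the customer list, sample size, and seed below are illustrative assumptions, not from the workshop:

    import random

    def draw_sample(customers, n, seed=2012):
        """Pick a simple random sample of n customers to invite to the survey,
        without replacement; the seed just makes the draw repeatable."""
        rng = random.Random(seed)
        return rng.sample(customers, n)

    # Illustrative use: invite 200 of this month's 5,000 customers, not all of them
    invitees = draw_sample([f"customer-{i}" for i in range(5000)], 200)

Sampling without replacement also means no one receives two invitations to the same survey.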
  • A typical survey process, somewhat better: "Let's do a survey" → some questions → send and hope → add luck → insight.
  • Probably best to be realistic and bring in the boss here: what the organisation wants to achieve, our aims in doing a survey, and what the user wants to do.
  • A better survey process: "Let's do a survey" (some questions) → Goals: establish your goals for the survey (questions you need answers to) → Users: interview users about the topics in your survey (questions users can answer) → Build: final version of questions, build the questionnaire (questionnaire) → Deploy: run the survey, from approach to follow-up (data) → Analyse: extract useful ideas and share with others (insight).
  • We've seen this bit a few moments ago (the same survey process diagram as above).
  • The questions you need depend on your organisational and UX goals (the same survey process diagram as above).
  • Goals come into the definition of usability: "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998). This assumes that we agree on the goals.
  • We have lots of views and ways of defining user experience.
  • But let's carry on with the standards theme. "2.15 user experience: person's perceptions and responses resulting from the use and/or anticipated use of a product, system or service. NOTE 1: User experience includes all the user's emotions, beliefs, preferences, perceptions, physical and psychological responses, behaviours and accomplishments that occur before, during and after use. NOTE 2: User experience is a consequence of brand image, presentation, functionality, system performance, interactive behaviour and assistive capabilities of the interactive system, the user's internal and physical state resulting from prior experiences, attitudes, skills and personality, and the context of use. NOTE 3: Usability, when interpreted from the perspective of the user's personal goals, can include the kind of perceptual and emotional aspects typically associated with user experience. Usability criteria can be used to assess aspects of user experience." (ISO 9241-210)
  • Before ISO 9241-210 came along… user experience as the satisfaction bit of usability: "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998). Can I choose the goals?
  • Tip: find out about users' goals.
  • Agenda: 1. Post-test / post-task surveys 2. Someone is going to do a survey anyway 3. Triangulating between survey data and data from elsewhere
  • I think this question is trying to ask about satisfaction.
  • If you're going to ask about satisfaction, what endpoints would you use on a scale? (A … Z)
  • Workshop participants came up with these, mostly about recalled emotion: Satisfied – Dissatisfied; Extremely satisfied – Extremely dissatisfied; Love – Hate; Interesting – Boring; Fun – Dull; Enjoyable – Unpleasant; Made me feel good – Made me feel bad; Easy – Confusing; Enjoy – Not enjoy; Delighted – Disappointed; Friendly – Scary. (Workshop results)
  • And these, mostly about whether the experience was successful or not: Fast – Slow; Effortless – Painful; I could do what I came to do – I couldn't do what I came to do; Success – Failure; Rewarding – Frustrating. (Workshop results)
  • And these, about predictions of future behaviour: Would come back – Wouldn't come back; Would post kudos – Would post complaints. (Workshop results)
  • Here are some scales I thought of ahead of the workshop: Disappointed – Thrilled; Something missing – Something extra; Miserable – Happy; Below par – Above par; Unfair – Privilege.
  • But maybe the same level of satisfaction generates different points on each of those scales.
  • Satisfaction reflects different emotions depending on level of engagement: for an engaged user, satisfaction means "delight"; for an indifferent user, satisfaction means merely "pleasant". (Chart axes: emotion from negative to positive, engagement from indifferent to engaged.) Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer".
  • Tip: interview first.
  • Satisfaction requires comparison of an experience to something else. Compared experience to what? (Nothing); expectations; needs; excellence (the ideal product); fairness; events that might have been. Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer".
  • And the resulting thoughts differ accordingly. Compared experience to what → resulting thoughts: (nothing) → indifference; expectations → better / worse / different; needs → met / not met / mixture; excellence (the ideal product) → good / poor quality (or "good enough"); fairness → treated equitably / inequitably; events that might have been → vindication / regret. Adapted from Oliver, R. L. (1996), "Satisfaction: A Behavioral Perspective on the Consumer".
  • Example: bronze medal winners tend to be happier than silver medal winners. Nathan Twaddle, Olympic bronze medal winner in Beijing. Matsumoto, D. & Willingham, B. (2006). The thrill of victory and the agony of defeat: spontaneous expressions of medal winners of the 2004 Athens Olympic Games. Photo credit: peter.cipollone, Flickr
  • Not all experiences are equal: winning an Olympic medal is a major life event; watching an event from the 2012 Olympics on TV is occasional, salient; watching the TV news on a slow day is unremarkable, repetitive. News images from cnn.com
  • The approximate curve of forgetting: quality of data (high to low) plotted against time since the event (recent to long ago), with major life events remembered best, then occasional, salient events, then unremarkable, repetitive ones.
  • Tip: ask about recent, vivid experience. Image credit: Fraser Smith
  • Agenda: 1. Post-test / post-task surveys 2. Someone is going to do a survey anyway 3. Triangulating between survey data and data from elsewhere
  • Memorable experiences are also complex. Think about the experience of attending this conference: – What did you expect to happen? – What did you need to happen? – What would the ideal experience have been? – How did you expect to be treated compared to other people at the event? – If you hadn't come here, what else might have happened?
  • The exercise revealed quite a few different perspectives on the conference. These questions were quite easy, because participants had already thought about these topics: – What did you expect to happen? – What did you need to happen? – What would the ideal experience have been? These questions were harder, but gave fresh perspectives: – How did you expect to be treated compared to other people at the event? – If you hadn't come here, what else might have happened? (Workshop results)
  • The challenge of UX and surveys: which bit to measure? "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" (ISO 9241-11:1998).
  • The challenge of satisfaction surveys: which bit to measure? (The same comparison table as above: (nothing) → indifference; expectations → better / worse / different; needs → met / not met / mixture; excellence → good / poor quality; fairness → treated equitably / inequitably; events that might have been → vindication / regret.)
  • The challenge of experience surveys: which bit to measure? Think about an experience… – What did you expect to happen? – What did you need to happen? – What would the ideal experience have been? – How did you expect to be treated compared to other people at the event? – If you hadn't come here, what else might have happened?
  • Tip: don't try to ask everything. Image: http://www.census.gov/history/www/genealogy/decennial_census_records/
  • A quick, interesting question is fine: "Why did you come to this web site today?" Suggestion from Suzanne Boyd, Anthro-Tech.
  • But whatever you do, have a BIG BOX for the context.
  • Bonus tip: successful survey = questionnaire + process, and that involves lots of testing.
  • Tips: 1. Ask questions that people can answer 2. Ask a sample, not everyone 3. Find out about users' goals 4. Interview first 5. Ask about recent, vivid experience 6. Don't try to ask everything 7. Build lots of testing into your survey process
  • Caroline Jarrett – Twitter @cjforms – http://www.slideshare.net/cjforms – carolinej@effortmark.co.uk
  • More resources on http://www.slideshare.net/cjforms