
How to design effective online surveys


Watch the entire webinar: http://info.userzoom.com/online-surveys-design-webinar.html

UserZoom teamed up with Elizabeth Ferrall-Nunge, User Experience Research Lead at Twitter, to discuss how to create effective surveys and how to avoid common survey pitfalls.


Transcript

  • 1. Webinar: How to Design Effective Online Surveys (October 30th, 2012)
  • 2. Speakers: Elizabeth Ferrall-Nunge, User Research Lead at Twitter (@enunge); Alfonso de la Nuez, Co-Founder & Co-CEO at UserZoom (@delanuez23). Twitter hashtag: #uzwebinar
  • 3. Webinar Agenda: • When a survey is appropriate • Attributes of a good survey • Question types & when to use each • Questionnaire biases • Implementation considerations • UX and usability-focused survey types
  • 4. About UserZoom: An all-in-one enterprise software solution that helps UX pros cost-effectively test, measure and improve customer experience on websites and mobile apps. We specialize in online (or remote) research & usability testing, saving time, money and effort while still obtaining very rich insights. In business since 2002 (2009 as SaaS), with offices in Sunnyvale (CA), Manchester (UK), Munich (DE) and Barcelona (Spain). Product Suite: Unmoderated Remote Usability Testing, Online Surveys (web & mobile), Online Card Sorting, Tree Testing, Screenshot Click Testing, Screenshot Timeout Testing (5-sec test), Web VOC, Mobile VOC, User Recruitment Tool. Follow us on Twitter @userzoom
  • 5. Webinar History: Developed in collaboration with Aaron Sedley and Hendrik Mueller while at Google. Materials presented at HCIC, UX Australia, etc.
  • 6. WHEN A SURVEY IS APPROPRIATE
  • 7. Survey Strengths. Good for: • Attitudes • Perceptions • Likes & dislikes • Goals/intent • Task success (note: more on task-based surveys ahead) • User characteristics • Tracking over time • Comparisons
  • 8. Survey Weaknesses. Not appropriate for: • Usability & comprehension (usability testing) • Cause and effect (experiments) • User motivations (interviews, observation) • Precise user behavior, flows, context (logs) • Bugs (feedback forms) • Behaviors people are unwilling to report • Prioritizing features (multiple methods)
  • 9. Complement Other Methods: survey research (quant) and small-sample research (qual) feed each other. Qual findings raise the question "Is my data anecdotal or representative?" (answered with a survey); survey trends raise "Why are we seeing this trend?" (answered with qual research).
  • 10. ATTRIBUTES FOR A GOOD SURVEY
  • 11. Elements of Quality Surveys: • Representativeness: data accurately represent the target population • Validity: responses measure the dimensions of interest • Reliability: responses are consistent over time & samples • The questionnaire minimizes biases • Desired level of precision for key measurements (statistically valid comparisons)
  • 12. OVERVIEW OF SURVEY LIFE CYCLE
  • 13. Stages of Survey Research: 1. Identifying research goals and constructs 2. Determining how to sample your population 3. Question types and when to use each of them 4. Questionnaire biases to avoid 5. Other survey design considerations 6. Testing and optimizing the survey 7. Implementation considerations & fielding 8. Survey analysis fundamentals
  • 14. POPULATION & SAMPLING
  • 15. Population, Sample, Respondents: Population → Sampling frame → Sample → Respondents
  • 16. How to Sample: • What population do you want to measure? • How many users do you have? • What level of precision do you want? • Do you want to segment by groups? (What's the smallest group to compare with?) • How will you invite people & field the survey? Recommendations: • Random sampling is always better! • Do not survey users more than 2-4x a year • Target ~400 responses per segment (384 gives a +/-5% margin of error) • Start with a small %, track the response rate, adapt
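The "384 responses gives a +/-5% margin of error" recommendation follows from the standard margin-of-error formula for a sample proportion at 95% confidence. A minimal sketch (function name is illustrative):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a sample proportion.

    n: number of responses; p: expected proportion (0.5 is the
    worst case, giving the widest interval); z: z-score
    (1.96 for 95% confidence).
    """
    return z * math.sqrt(p * (1 - p) / n)

# 384 responses gives roughly a +/-5% margin of error
print(round(margin_of_error(384), 3))  # ~0.05
```

Quadrupling the sample only halves the margin of error, which is why ~400 per segment is a common target.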
  • 17. QUESTION TYPES & WHEN TO USE EACH
  • 18. Types of Survey Questions. Open-ended questions: • the universe of answers is unknown • selecting one object from a really large set. Closed-ended questions: • rating a single object • used after the universe of answers is known.
  • 19. Open-Ended: Options Unknown. "What, if anything, do you find frustrating or unappealing about your smart phone?" "What services or applications would you like to integrate with Pinterest?" ✓
  • 20. Open-Ended: Too Many Options. "What was the make and model of your first car?" "What is your favorite meal?" "What is your favorite thing about working at Google?" ✓
  • 21. Open-Ended: Natural Metrics How many hours did you work last week? How many times a day do you use your phone to get directions? ✓
  • 22. Types of Closed-Ended Questions
  • 23. Closed-Ended: Answer Options ClearHow often do you withdraw cash from an ATM?__ Less than once a month__ About once a month__ About 2-3 times a month__ About once a week ✓__ A few times a week__ About once a day__ Multiple times a day
  • 24. Closed-Ended: Ranking Questions. "Rank the following main dinner courses in order of preference. Rank answers from highest (1) to lowest (6)." ___ Fried chicken ___ Beef stew ___ Kangaroo steak ___ Seared tuna ___ Spaghetti ___ Seasonal vegetables ✓
  • 25. Closed-Ended: Without Natural Metrics. Overall, how satisfied are you with Google Drive? "I'm 6 satisfied." "I'm moderately satisfied." ... on a 7-point scale from extremely dissatisfied to extremely satisfied ✓
  • 26. Closed-Ended: Rating Questions: • As extreme as possible • Equally spaced units • Fully labeled scales ✓
  • 27. Closed-Ended: Unipolar vs. Bipolar. Unipolar measures: start from zero, have no natural midpoint, and go to an extreme; 5 scale points (Not at all ..., Slightly ..., Moderately ..., Very ..., Extremely ...). Bipolar measures: start at an extreme negative, have a natural midpoint, and go to the opposite extreme positive; 7 scale points (Extremely ..., Moderately ..., Slightly ..., Neither ... nor ..., Slightly ..., Moderately ..., Extremely ...).
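The two scale shapes can be written out as fully labeled answer lists; here is a sketch using satisfaction wording (the labels are illustrative, filled in from the "Not at all / Slightly / Moderately / ..." pattern above):

```python
# Unipolar: starts from zero, no natural midpoint, 5 fully labeled points.
UNIPOLAR_5 = [
    "Not at all satisfied",
    "Slightly satisfied",
    "Moderately satisfied",
    "Very satisfied",
    "Extremely satisfied",
]

# Bipolar: extreme negative to extreme positive around a natural
# midpoint, 7 fully labeled points.
BIPOLAR_7 = [
    "Extremely dissatisfied",
    "Moderately dissatisfied",
    "Slightly dissatisfied",
    "Neither satisfied nor dissatisfied",
    "Slightly satisfied",
    "Moderately satisfied",
    "Extremely satisfied",
]
```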
  • 28. Closed-Ended: Prioritizing. If you really need prioritization help: "Select up to 3 features that are most important to you." ✓ "How important is each feature to you?"
  • 29. QUESTIONNAIRE BIASES
  • 30. Overview of Questionnaire Biases: 1. Satisficing: short-cutting question answers. 2. Acquiescence: tendency to agree with any statement. 3. Social desirability: sticking to norms & expectations. 4. Order bias: tendency to answer in a certain way depending on the question or response order.
  • 31. 1. Satisficing= Respondents shortcut answering questions= People attempt to make guesses!Reasons:• Question difficulty is high• Cognitive ability to understand & answer is low• Motivation to answer accurately is low• Fatigue occurs due to a long questionnaire
  • 32. 1. Satisficing: Difficult Questions. "How many searches did you conduct last year?" ✘ Avoid difficult or complex questions. Capture actual behavior. Ask about today's goals.
  • 33. 1. Satisficing: Complex Questions ✘ Shorten questions and answers. Keep wording as simple as possible.
  • 34. 1. Satisficing: No opinion, n/a, ...Satisfaction with your smartphone:__ Very dissatisfied__ Slightly dissatisfied__ Neither satisfied nor dissatisfied__ Slightly satisfied__ Very satisfied__ No opinion ✘ Avoid "no opinion" (or similar) answers Break into two questions If you need it, make it visually distinct
  • 35. 1. Satisficing: Large Grid Questions Avoid large grid questions ✘ Consider separate questions for each Add alternate row shading
  • 36. 2. Acquiescence Bias= Respondents tend to agree to any statementReasons:• Cognitive ability or motivation to answer is low• Question difficulty or complexity is high• Personality tendencies skew towards agreeableness• Social norms suggest a "yes" response
  • 37. 2. Acquiescence: Binary QuestionsHas using Picasa increased the number of photos you share with your friends or colleagues? ✘ Yes No Avoid binary question types (Y/N, T/F) Ask construct-specific questions Measure attitudes on unbiased scale
  • 38. 2. Acquiescence: Agreement Scales Avoid agreement scales ✘ Ask construct-specific questions Measure attitudes on unbiased scale
  • 39. 2. Acquiescence: Agreement Scales. Avoid agreement scales ✘ Ask construct-specific questions. Measure attitudes on an unbiased scale. "Indicate your level of trust with Shopbop." [Extremely distrust, ..., Extremely trust] ✓
  • 40. 3. Social Desirability= Respondents stick to norms & expectationsReasons:• Opinion does not conform with social norms• Feeling uncomfortable about answering• Asked to provide opinion on sensitive topics• Asked to provide identity
  • 41. 3. Social Desirability: Social NormsHow many servings of fruits and vegetables do you consume daily?How frequently do advertisements influence your purchases? ✘ Avoid such questions
  • 42. 3. Social Desirability: Sensitive Topics ✘Indicate your level of racism:__ Not at all racist__ Slightly racist__ Moderately racist__ Very racist__ Extremely racist Avoid sensitive questions Allow respondents to answer anonymously Use self-administered surveys
  • 43. 3. Social Desirability: Identity What is your full name? What is your home address? ✘ For sensitive topics, allow respondents to answer anonymously Use self-administered surveys
  • 44. 4. Response Order Bias= Tendency to select answers at the beginning(primacy) or end (recency) of an answer list/scaleReasons:• Unconsciously apply meaning based on order• Answer list is too long• Answer list cannot be viewed as a whole• Appropriate answer cannot be easily identified
  • 45. 4. Response Order Bias: Answer List (chart labels: best, typical, worst) ✘ Randomize the answer list order
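Most survey tools can randomize answer order per respondent; the idea can be sketched in a few lines (the option list reuses the ranking example from earlier in the deck, and the function name is illustrative):

```python
import random

# Answer list borrowed from the ranking-question example.
OPTIONS = [
    "Fried chicken", "Beef stew", "Kangaroo steak",
    "Seared tuna", "Spaghetti", "Seasonal vegetables",
]

def randomized_options(options, respondent_seed=None):
    """Return a per-respondent shuffled copy, leaving the original intact.

    Seeding per respondent keeps the order stable if the same person
    reloads the survey page.
    """
    rng = random.Random(respondent_seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled
```

Each respondent sees a different order, so primacy and recency effects average out across the sample.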
  • 46. 4. Question Order Bias= Respondent answers are influenced by theorder in which questions appear in the surveyReasons:• Attention is drawn to dimensions which may not have otherwise been considered
  • 47. 4. Question Order Bias: Example1. Which of the following featureswould you like to see improved? ✘[Battery life, weight, screen size, ...]2. What do you find mostfrustrating about yoursmartphone?
  • 48. OTHER QUESTIONS TO AVOID
  • 49. Leading Questions. "Do you agree or disagree with this statement: I liked the surveys workshop a great deal." ✘ Avoid leading questions. Ask questions, not statements. Measure attitude on a neutral scale.
  • 50. Recall and Prediction. "Do you prefer the previous or the current version of Facebook?" "Would you like Walmart more if its aisles were less cluttered?" ✘ Avoid such questions entirely. Ask before and after, then compare. Ask for each version, then compare.
  • 51. Reference Periods. "How many times did you work from home in Q1?" ✘ Define reference periods. Avoid terms that may be misinterpreted. State references at the beginning. ✓ "Between January 1 and March 31, 2012, how many times did you work from home?"
  • 52. Cute Language. "Overall, what do you think of our new mobile app?" __ It's great! __ Only OK. A little confusing. __ This UI sucks. Are you guys a bunch of baboons? ✘ Don't get cute. Use simple & straightforward language.
  • 53. Broad Questions. "How well do you know your coworkers?" ✘ Avoid broad questions. Figure out what you want to measure. "In the past month, how many times did you see your Tech Lead outside of work?" ✓
  • 54. Double-Barreled QuestionsHow satisfied are you with the billing and payment options? ✘ Avoid asking about multiple things Use separate questionsHow satisfied are you with the billing options?How satisfied are you with the payment options? ✓
  • 55. Launch Readiness. "Is the redesign ready to launch?" "Which of the following features should Instagram work on next?" ✘ Avoid hypotheticals. Ask about current experiences. "What, if anything, do you find frustrating or unappealing about Instagram?" ✓
  • 56. OTHER SURVEY DESIGN CONSIDERATIONS
  • 57. Survey Visual Design ✘
  • 58. Images ✘ ✓ ✓
  • 59. Question Order Funnel Broad & easy Specific & sensitive
  • 60. Group Related Questions vs.
  • 61. Survey Length (diagram: one long survey ✘ vs. splitting into survey 1 and survey 2 ✓)
  • 62. SURVEY EXAMPLES:WHERE ARE THE BIASES?
  • 63. IMPLEMENTATION CONSIDERATIONS & FIELDING
  • 64. Survey Creation Tools. Factors to consider: functionality, cost, ease of use, data storage, response amount, reporting. Free*: SurveyMonkey, Zoomerang, Kwiksurveys, Google Forms. Paid: Confirmit, UserZoom, Get Satisfaction, Uservoice, Keynote, Bizrate, Medallia. *Note: Not all free tools support complex functions (e.g., conditionals, added variables, fully labeled scale points)
  • 65. In-Product Link
  • 66. In-Product Pop-Ups
  • 67. Crowdsourcing
  • 68. Using a Panel Provider
  • 69. Email Invitations
  • 70. Maximizing Response Rates. Dillman's Total Design Method (1978): o Put questions directly related to the topic up front o Make the questionnaire appear small/short o Personalize the invitation for each respondent o Explain the overall usefulness & importance of the respondent's participation o Explain the confidentiality of the collected data o Pre-announce the survey a week in advance o Send reminders after 1 and 3 weeks
  • 71. Maximizing Response RatesOther things to keep in mind: o Explain your relationship to the respondent o Potentially offer small gifts as incentives o For email surveys, Mondays appear to be the best day
  • 72. Maximizing Response Rates (chart: impact of incentives and survey length on response rates)
  • 73. UX AND USABILITY-FOCUSED SURVEY TYPES
  • 74. Surveys within the UCD Process: UserZoom's Integrated Online Research Platform. For UX Research: test a live website or mobile app, analyze competitors, understand your visitors. For UX Design: information architecture, validate design, iterative testing (AGILE). For CX Measurement: Web VOC, Mobile VOC. Picture source: SAP
  • 75. Task-based Survey. In an online, task-based survey, a large sample of participants (typically 100 to 200) is asked to complete navigational tasks on a website or prototype, such as looking for information, registering, making a purchase or reservation, etc. It is focused on performance and satisfaction. Task: Please locate the most popular full TV episode of all time on Hulu. Please make note of the episode as you will be asked for it later. Click on the success button once you are done.
  • 76. Task-based Survey Validation question, example from a pilot study. Hulu is not UserZoom’s customer.
  • 77. Card Sorting Survey Card sorts help improve the way information is structured on the site so that it matches users’ mental models. Open card sort
  • 78. Card Sorting Survey (closed card sort). Instructions: 1. Please start by reading each of the items on the left. 2. Sort the items into meaningful groups by dragging from the left and dropping on the right.
  • 79. Tree Testing Survey. Tree testing complements card sorting by testing the site structure created from card sorting. Task: You are on an office supplies website. Where would you go to find a mouse pad? Please click through the menu until you locate where you would expect to find it. Instructions: 1. You'll be asked to find an item using a menu structure. 2. Keep clicking through until you have located the item. 3. You can always go back to search in other areas.
  • 80. Tree Testing Survey
  • 81. Click Testing. Where would you click to find more information about the reliability of suppliers? Please click once. After you have clicked once, please hit Next.
  • 82. Voice of Customer (VOC) Survey Find out things like: Who are the users that visit your website? Why do they visit? Are they able to navigate successfully? Would they recommend it to others?
  • 83. FOR MORE INFORMATION
  • 84. Recommended Reading • Groves, Fowler, Couper, et al. (2009), Survey Methodology • J. Wright (2010), Handbook of Survey Research • Floyd J. Fowler Jr. (1995), Improving Survey Questions: Design and Evaluation • Albert W., Tullis T., Tedesco D. (2010), Beyond the Usability Lab: Conducting Large-scale Online User Experience Studies
  • 85. Q&AElizabeth Ferrall-Nunge Alfonso de la Nuez UserZoom@enunge @delanuez23 @userzoom