
How to design effective online surveys



UserZoom teamed up with Elizabeth Ferrall-Nunge, User Experience Research Lead at Twitter, to discuss how to create effective surveys and how to avoid common survey pitfalls.



  1. Webinar: How to Design Effective Online Surveys (October 30th, 2012)
  2. Speakers: Elizabeth Ferrall-Nunge, User Research Lead at Twitter (@enunge); Alfonso de la Nuez, Co-Founder & Co-CEO at UserZoom (@delanuez23). Twitter hashtag: #uzwebinar
  3. Webinar Agenda • When a survey is appropriate • Attributes of a good survey • Question types & when to use each • Questionnaire biases • Implementation considerations • UX- and usability-focused survey types
  4. About UserZoom • All-in-one enterprise software solution that helps UX pros cost-effectively test, measure and improve customer experience on websites and mobile apps • We specialize in online (or remote) research & usability testing, saving time, money and effort while still obtaining very rich insights • In business since 2002 (SaaS since 2009), with offices in Sunnyvale (CA), Manchester (UK), Munich (DE) and Barcelona (Spain) • Product suite: unmoderated remote usability testing, online surveys (web & mobile), online card sorting, tree testing, screenshot click testing, screenshot timeout testing (5-second test), Web VOC, Mobile VOC, user recruitment tool • Follow us on Twitter @userzoom
  5. Webinar History: developed in collaboration with Aaron Sedley and Hendrik Mueller while at Google; materials presented at HCIC, UX Australia, etc.
  7. Survey Strengths. Good for: • Attitudes • Perceptions • Likes & dislikes • Goals/intent • Task success (note: more on task-based surveys ahead) • User characteristics • Tracking over time • Comparisons
  8. Survey Weaknesses. Not appropriate for: • Usability & comprehension (usability testing) • Cause and effect (experiments) • User motivations (interviews, observation) • Precise user behavior, flows, context (logs) • Bugs (feedback forms) • Behaviors people are unwilling to report • Prioritizing features (multiple methods)
  9. Complement Other Methods. Survey research (quant) answers "Is my data anecdotal or representative?"; small-sample research (qual) answers "Why are we seeing this trend?"
  11. Elements of Quality Surveys • Representativeness: data accurately represent the target population • Validity: responses measure the dimensions of interest • Reliability: responses are consistent over time & samples • Questionnaire minimizes biases • Desired level of precision for key measurements (statistically valid comparisons)
  13. Stages of Survey Research: 1. Identifying research goals and constructs 2. Determining how to sample your population 3. Question types and when to use each of them 4. Questionnaire biases to avoid 5. Other survey design considerations 6. Testing and optimizing the survey 7. Implementation considerations & fielding 8. Survey analysis fundamentals
  15. Population, Sample, Respondents: population → sampling frame → sample → respondents
  16. How to Sample • What population do you want to measure? • How many users do you have? • What level of precision do you want? • Do you want to segment by groups? (What's the smallest group to compare?) • How will you invite people & field the survey? Recommendations: • Random sampling is always better • Do not survey users more than 2-4x a year • Target ~400 responses per segment (384 gives a +/-5% margin of error) • Start with a small %, track response rate, adapt
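The ~400-per-segment target above follows from the standard sample-size formula for estimating a proportion, n = z² · p(1−p) / e². A minimal sketch in Python (the function name and defaults are illustrative, not from the deck):

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Minimum responses needed to estimate a proportion within the given
    margin of error, assuming a large population.

    Standard formula: n = z^2 * p * (1 - p) / e^2, with p = 0.5 as the
    most conservative (largest-n) assumption.
    """
    n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

# About 385 responses give a +/-5% margin of error at 95% confidence
# (1.96^2 * 0.25 / 0.05^2 = 384.16; the slide rounds to 384).
print(sample_size(0.05))
```

Tightening the margin of error is expensive: halving it roughly quadruples the required sample, which is why per-segment targets matter when you plan to compare groups.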
  18. Types of Survey Questions. Open-ended questions: use when the universe of answers is unknown, or when respondents would have to select one object from a very large set. Closed-ended questions: use to rate a single object, or after the universe of answers is known.
  19. Open-Ended: Options Unknown. "What, if anything, do you find frustrating or unappealing about your smartphone?" "What services or applications would you like to integrate with Pinterest?" ✓
  20. Open-Ended: Too Many Options. "What was the make and model of your first car?" "What is your favorite meal?" "What is your favorite thing about working at Google?" ✓
  21. Open-Ended: Natural Metrics. "How many hours did you work last week?" "How many times a day do you use your phone to get directions?" ✓
  22. Types of Closed-Ended Questions
  23. Closed-Ended: Answer Options Clear. "How often do you withdraw cash from an ATM?" __ Less than once a month __ About once a month __ About 2-3 times a month __ About once a week __ A few times a week __ About once a day __ Multiple times a day ✓
  24. Closed-Ended: Ranking Questions. "Rank the following main dinner courses in order of preference, from highest (1) to lowest (6)." ___ Fried chicken ___ Beef stew ___ Kangaroo steak ___ Seared tuna ___ Spaghetti ___ Seasonal vegetables ✓
  25. Closed-Ended: Without Natural Metrics. "Overall, how satisfied are you with Google Drive?" Not "I'm 6 satisfied" but "I'm moderately satisfied," on a 7-point scale from extremely dissatisfied to extremely satisfied ✓
  26. Closed-Ended: Rating Questions. Use endpoints as extreme as possible, equally spaced units, and fully labeled scales ✓
  27. Closed-Ended: Unipolar vs. Bipolar. Unipolar measures start from zero, have no natural midpoint, and run to an extreme positive; use 5 scale points (Not at all ..., Slightly ..., Moderately ..., Very ..., Extremely ...). Bipolar measures start at an extreme negative, have a natural midpoint, and run to the opposite extreme positive; use 7 scale points (Extremely ..., Moderately ..., Slightly ..., Neither ... nor ..., Slightly ..., Moderately ..., Extremely ...).
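The fully labeled 5-point unipolar and 7-point bipolar scales described above are mechanical to construct; a small sketch (the helper names and label wording are illustrative assumptions, not UserZoom's API):

```python
# Intensity degrees for a unipolar scale, from zero to the extreme.
DEGREES = ("Not at all", "Slightly", "Moderately", "Very", "Extremely")

def unipolar_scale(attribute):
    """5-point fully labeled unipolar scale: zero up to an extreme positive."""
    return [f"{d} {attribute}" for d in DEGREES]

def bipolar_scale(negative, positive):
    """7-point fully labeled bipolar scale with a natural midpoint."""
    return [f"Extremely {negative}", f"Moderately {negative}", f"Slightly {negative}",
            f"Neither {negative} nor {positive}",
            f"Slightly {positive}", f"Moderately {positive}", f"Extremely {positive}"]

print(unipolar_scale("satisfied"))
print(bipolar_scale("dissatisfied", "satisfied"))
```

Generating every label (rather than labeling only the endpoints) is what makes the scale "fully labeled," which the deck recommends for rating questions.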
  28. Closed-Ended: Prioritizing. If you really need prioritization help, ask "Select up to 3 features that are most important to you" ✓ rather than "How important is each feature to you?"
  30. Overview of Questionnaire Biases. 1. Satisficing: shortcutting question answers. 2. Acquiescence: tendency to agree with any statement. 3. Social desirability: sticking to norms & expectations. 4. Order bias: tendency to answer in a certain way depending on the question or response order.
  31. 1. Satisficing = respondents shortcut answering questions and resort to guessing. Reasons: • Question difficulty is high • Cognitive ability to understand & answer is low • Motivation to answer accurately is low • Fatigue occurs due to a long questionnaire
  32. 1. Satisficing: Difficult Questions. "How many searches did you conduct last year?" ✘ Avoid difficult or complex questions. Capture actual behavior. Ask about today's goals.
  33. 1. Satisficing: Complex Questions ✘ Shorten questions and answers. Keep wording as simple as possible.
  34. 1. Satisficing: "No Opinion" / N/A Answers. "Satisfaction with your smartphone: __ Very dissatisfied __ Slightly dissatisfied __ Neither satisfied nor dissatisfied __ Slightly satisfied __ Very satisfied __ No opinion" ✘ Avoid "no opinion" (or similar) answers. Break into two questions. If you need it, make it visually distinct.
  35. 1. Satisficing: Large Grid Questions ✘ Avoid large grid questions. Consider separate questions for each row. Add alternate row shading.
  36. 2. Acquiescence Bias = respondents tend to agree with any statement. Reasons: • Cognitive ability or motivation to answer is low • Question difficulty or complexity is high • Personality tendencies skew towards agreeableness • Social norms suggest a "yes" response
  37. 2. Acquiescence: Binary Questions. "Has using Picasa increased the number of photos you share with your friends or colleagues? Yes / No" ✘ Avoid binary question types (Y/N, T/F). Ask construct-specific questions. Measure attitudes on an unbiased scale.
  38. 2. Acquiescence: Agreement Scales ✘ Avoid agreement scales. Ask construct-specific questions. Measure attitudes on an unbiased scale.
  39. 2. Acquiescence: Agreement Scales (continued). ✘ Avoid agreement scales; ask construct-specific questions and measure attitudes on an unbiased scale, e.g. "Indicate your level of trust in Shopbop" [Extremely distrust, ..., Extremely trust] ✓
  40. 3. Social Desirability = respondents stick to norms & expectations. Reasons: • Their opinion does not conform to social norms • They feel uncomfortable about answering • They are asked for opinions on sensitive topics • They are asked to provide their identity
  41. 3. Social Desirability: Social Norms. "How many servings of fruits and vegetables do you consume daily?" "How frequently do advertisements influence your purchases?" ✘ Avoid such questions.
  42. 3. Social Desirability: Sensitive Topics. ✘ "Indicate your level of racism: __ Not at all racist __ Slightly racist __ Moderately racist __ Very racist __ Extremely racist" Avoid sensitive questions. Allow respondents to answer anonymously. Use self-administered surveys.
  43. 3. Social Desirability: Identity. "What is your full name?" "What is your home address?" ✘ For sensitive topics, allow respondents to answer anonymously. Use self-administered surveys.
  44. 4. Response Order Bias = tendency to select answers at the beginning (primacy) or end (recency) of an answer list/scale. Reasons: • Respondents unconsciously apply meaning based on order • The answer list is too long • The answer list cannot be viewed as a whole • The appropriate answer cannot be easily identified
  45. 4. Response Order Bias: Answer List (best / typical / worst layouts) ✘ Randomize the answer list order.
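Randomizing the answer list per respondent is the standard countermeasure to primacy/recency effects, while options like "No opinion" should stay in a fixed, visually distinct position. A minimal illustrative sketch (the helper name and parameters are my own, not from the deck or any survey tool's API):

```python
import random

def randomized_options(options, pinned_last=(), seed=None):
    """Return a per-respondent shuffled copy of an answer list.

    Options listed in `pinned_last` (e.g. "No opinion", "Other") keep their
    position at the end, so the shuffle only touches substantive answers.
    Pass a per-respondent `seed` if you need a reproducible order.
    """
    rng = random.Random(seed)
    body = [o for o in options if o not in pinned_last]
    rng.shuffle(body)
    return body + [o for o in options if o in pinned_last]

opts = ["Battery life", "Weight", "Screen size", "Camera", "No opinion"]
print(randomized_options(opts, pinned_last=("No opinion",)))
```

Note that randomization removes order bias only in aggregate: each respondent still sees an ordered list, but no single option systematically benefits from the primacy or recency position.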
  46. 4. Question Order Bias = respondents' answers are influenced by the order in which questions appear in the survey. Reason: attention is drawn to dimensions that may not otherwise have been considered.
  47. 4. Question Order Bias: Example. ✘ 1. "Which of the following features would you like to see improved? [Battery life, weight, screen size, ...]" 2. "What do you find most frustrating about your smartphone?"
  49. Leading Questions. ✘ "Do you agree or disagree with this statement: I liked the surveys workshop a great deal." Avoid leading questions. Ask questions, not statements. Measure attitude on a neutral scale.
  50. Recall and Prediction. "Do you prefer the previous or the current version of Facebook?" "Would you like Walmart more if its aisles were less cluttered?" ✘ Avoid such questions entirely. Ask before and after, then compare; or ask about each version, then compare.
  51. Reference Periods. "How many times did you work from home in Q1?" ✘ Define reference periods. Avoid terms that may be misinterpreted. State references at the beginning. ✓ "Between January 1 and March 31, 2012, how many times did you work from home?"
  52. Cute Language. "Overall, what do you think of our new mobile app? __ It's great! __ Only OK. A little confusing. __ This UI sucks. Are you guys a bunch of baboons?" ✘ Don't get cute. Use simple & straightforward language.
  53. Broad Questions. "How well do you know your coworkers?" ✘ Avoid broad questions. Figure out what you want to measure. ✓ "In the past month, how many times did you see your Tech Lead outside of work?"
  54. Double-Barreled Questions. "How satisfied are you with the billing and payment options?" ✘ Avoid asking about multiple things at once. Use separate questions. ✓ "How satisfied are you with the billing options?" "How satisfied are you with the payment options?"
  55. Launch Readiness. "Is the redesign ready to launch?" "Which of the following features should Instagram work on next?" ✘ Avoid hypotheticals. Ask about current experiences. ✓ "What, if anything, do you find frustrating or unappealing about Instagram?"
  57. Survey Visual Design ✘
  58. Images ✘ ✓ ✓
  59. Question Order Funnel: from broad & easy to specific & sensitive
  60. Group Related Questions
  61. Survey Length: keep the questionnaire short (survey 1 ✓ vs. survey 2 ✘)
  64. Survey Creation Tools. Factors to consider: functionality, cost, ease of use, data storage, response volume, reporting. Free*: SurveyMonkey, Zoomerang, Kwiksurveys, Google Forms. Paid: Confirmit, UserZoom, Get Satisfaction, UserVoice, Keynote, Bizrate, Medallia. *Note: not all free tools support complex functions (e.g., conditionals, added variables, fully labeled scale points).
  65. In-Product Link
  66. In-Product Pop-Ups
  67. Crowdsourcing
  68. Using a Panel Provider
  69. Email Invitations
  70. Maximizing Response Rates. Dillman's Total Design Method (1978): o Put questions directly related to the topic up front o Make the questionnaire appear small/short o Personalize the invitation for each respondent o Explain the overall usefulness & importance of the respondent o Explain the confidentiality of the collected data o Pre-announce the survey a week in advance o Send reminders after 1 and 3 weeks
  71. Maximizing Response Rates (continued). Other things to keep in mind: o Explain your relationship to the respondent o Potentially offer small gifts as incentives o For email surveys, Mondays appear to be the best day
  72. Maximizing Response Rates: both incentives and survey length impact response rates (chart).
  74. Surveys within the UCD Process. UserZoom's integrated online research platform. For UX research: test live websites and mobile apps, analyze competitors, understand your visitors. For UX design: information architecture, design validation, iterative testing (Agile). For CX measurement: Web VOC, Mobile VOC. (Picture source: SAP)
  75. Task-Based Survey. In an online, task-based survey, a large sample of participants (typically 100 to 200) is asked to complete navigational tasks on a website or prototype, such as looking for information, registering, or making a purchase or reservation. The focus is on performance and satisfaction. Task: "Please locate the most popular full TV episode of all time on Hulu. Please make note of the episode, as you will be asked for it later. Click on the success button once you are done."
  76. Task-Based Survey: validation question (example from a pilot study; Hulu is not a UserZoom customer).
  77. Card Sorting Survey. Card sorts help improve the way information is structured on a site so that it matches users' mental models. (Open card sort)
  78. Card Sorting Survey. Instructions: 1. Please start by reading each of the items on the left. 2. Sort the items into meaningful groups by dragging from the left and dropping on the right. (Closed card sort)
  79. Tree Testing Survey. Tree testing complements card sorting by testing the site structure created from card sorting. Example task: "You are on an office supplies website. Where would you go to find a mouse pad? Please click through the menu until you locate where you would expect to find it." Instructions: 1. You'll be asked to find an item using a menu structure. 2. Keep clicking through until you have located the item. 3. You can always go back to search in other areas.
  80. Tree Testing Survey
  81. Click Testing. "Where would you click to find more information about the reliability of suppliers? Please click once. After you have clicked once, please hit Next."
  82. Voice of Customer (VOC) Survey. Find out things like: Who are the users that visit your website? Why do they visit? Are they able to navigate successfully? Would they recommend it to others?
  84. Recommended Reading • Groves, Fowler, Couper, et al. (2009), Survey Methodology • Wright, J. (2010), Handbook of Survey Research • Fowler, F. J., Jr. (1995), Improving Survey Questions: Design and Evaluation • Albert, W., Tullis, T. & Tedesco, D. (2010), Beyond the Usability Lab: Conducting Large-scale Online User Experience Studies
  85. Q&A. Elizabeth Ferrall-Nunge (@enunge), Alfonso de la Nuez (@delanuez23), UserZoom (@userzoom)