Great Survey Design

Speaker notes
  • Story about why we created content.
  • Designed for surveys
    Can be applied to forms/quizzes
  • Key here is to use a sample, not perform a census
    Forms collect responses from everyone
  • Why a circle?
    Learn from past projects
    Develop and create new projects better
  • Empowers you when creating a survey/project on a team, or on behalf of someone else
  • The cycle is split into parts that occur in your office, and parts that occur in the software.
  • Reason to collect data is to act!
    Goals and objectives outline how you will act.
  • Good goals have actions associated with them
    Bad goals leave you asking: how? when? where? what?
  • 3-5 is the number of things you can easily remember
    Aleta – What should we do if there are more than 5 learning objectives?
  • Set rules
    Time limit
    # of ideas
    No judgment
    Track who had what idea/Q
  • Label objective (ABC, 123)
    Match Qs to objective
    Aleta – What do we do with the brainstorm questions that do not fit with one of the learning objectives?
  • Aleta – What if the brainstorm questions do not help collect data for the learning objectives?
  • Most important question first/last
    Demographics first and last
    Aleta – What are the different ways that you can organize the questions in the survey?
    By objective
    As a conversation
  • Easier to do on paper or in a word doc than in the app
  • Wording/language is important
    Has anyone in your family suffered from an Acute myocardial infarction? vs heart attack
  • Open ended questions can open a can of worms
  • Respondent is offended
    Respondent leaves survey
    Follow up questions are affected by the emotion
  • Respondent provides a more positive than honest answer
  • Respondent can’t truthfully answer the question
    Leaves with a partial response – bad
    Picks an answer anyway, giving bad data – worse
  • Not honest answer
    Respondent wants to be liked
  • Respondent leaves
    Bad Data
    Aleta – Do you have any suggestions about what to do with the extra survey questions that do not meet the survey goal?
  • Respondent selects “I don’t know” more often
  • Respondents stop engaging
    Answer I don’t know more
    Other bad behaviors
  • Review survey questions/options
    Assure that they address the learning objective
    Can the proposed actions be performed with these survey questions?
  • How is this going to be laid out?
    What are the psychological impacts of laying out the survey in a specific way?
    When you’re branding a survey, how can that bias your results?
  • All questions break down into these basic question types
  • Low fatigue, find answer and select
  • mid point vs. no midpoint
    Aleta – Is it ok to use different scales in my survey? Like a 1-5 scale and a 1-10 scale?
  • Medium fatigue – need to read all answers in the list
  • Need to type in answer to each textbox
    You as survey admin will need to spend more time reviewing responses/reporting
  • How will you handle data?
    Open text analysis
    Read and reply
    Make a list for a report
    Quote respondent directly
  • Tables go wrong when
    Too many rows
    Overwhelming to respondent
    Not a single topic
    Respondent starts to compare rows
  • Best to have as only question on a page
  • Important to do this early
    Question changes, layout changes
  • Aleta – What if the response data isn’t in the format that I was expecting?
  • Aleta – I add logic to my surveys when I am building them. Why do you suggest to wait and do this later in the Build process?
  • Each type of logic serves a purpose
    There is not usually a need for every type in a single survey
  • Already selected mode in Need
    Recognize that mode can create sample bias
    Aleta – Can you give us an example of how the survey mode can introduce bias?
    Example: Percentage of households with internet capability in the US versus households with no internet
    Choosing to email (versus telephone) this survey will create a highly biased sample
  • Who are you going to survey?
  • Very important for surveys/feedback systems
  • You don’t need to, and should not, survey everyone
  • Example: When comparing men and women in the US, the ratio within your survey should be the same as the ratio within the larger US population.
  • Do not email entire customer base
  • Need to collect more responses for a survey since you may remove responses collected.
    Aleta – What about partial responses. Should partial responses be included in the reports?
  • Identify responses that are not engaged and delete or exclude from the data set
  • Responses that did not follow the trend
  • Run one report for each learning objective
  • When reviewing the report, data should make sense and provide a logical path to actions.
  • Analyzing text responses takes time and effort
    Don’t ask questions that you will not read or act on
  • Collect enough responses to segment data and be statistically accurate
  • This step is not a surprise
    You know from Need how data/reports will be shared and discussed
  • It’s ok to be a pest
    Aleta – What if I am building a survey for someone else? How can I empower action taking?
  • Follow up with stake holders
    Ask questions about actions
    Was there a financial benefit to company
    Great way to get involved and encourage next project
  • Can easily be an email asking for feedback on specific parts of study/report/actions
  • Communicating changes to customers
    Communicating changes with business execs
    Communicating changes with employees
  • Great Survey Design

    1. 1. © 2014 Widgix LLC dba SurveyGizmo. ALL RIGHTS RESERVED. Materials are owned by Widgix LLC. Unless otherwise specified, all materials appearing herein, including the text, logos, graphics, icons, images, as well as the selection, assembly and arrangement thereof, are the sole property of Widgix LLC. Materials must not be copied, reproduced, modified, republished, uploaded, posted, transmitted, or distributed in any form.
    2. 2. Surveys: What Are They?
    3. 3. • A survey is a collection of questions asked repetitively to a sample of a population to mathematically derive characteristics of the total population. Surveys: What Are They Good For?
    4. 4. Great Survey Design Cycle
    5. 5. Why is This Cycle Important? • It’s a framework that provides guidelines when you work with clients and stake-holders • You’re likely doing parts of it already • Those are likely the parts of your process that work!
    6. 6. The Trifecta: Need, Design & Act
    7. 7. Unit 1: Need
    8. 8. Goals and Objectives
    9. 9. Needs: We All Have Them • Questions to ask: – What are we trying to figure out? – What kinds of reports or data do we want or expect? – What will we do with this data when we’re done? – Who is our intended audience or population? – How are we going to access the target audience?
    10. 10. Examples of Need – How well known is my brand? – Will customers buy this product? – If we offer X benefit, will our employee happiness go up? – Why are my customers not converting? – Will my product do well in a new market?
    11. 11. Set a Survey Goal • A goal is not a single learning point – a goal is what you plan to do with this data, and why. – Good goal: grow your company into new markets. “A survey will determine which markets are good for our existing products, so that we may expand our customer base.” – Bad goal: make more money for your business. “A survey will help us make more money.”
    12. 12. Learning Objectives • Determine your learning objectives – These should all support your overall need and goal – A good number of learning objectives: three – You should have no more than five!
    13. 13. Brainstorm
    14. 14. Selection and Refinement
    15. 15. Eye on the Prize: ROI • If the cost of the survey is greater than the possible ROI, it’s a waste of time and money. • Without an ROI measurement, there is no encouragement to take action. • Without clearly defined actions, survey results may not have an ROI.
    16. 16. Unit 2: Design
    17. 17. Organize Brainstorm
    18. 18. Refine Brainstorm Ideas into Questions
    19. 19. Guide: Writing Questions • Multiple choice versus open-text questions – Quantitative versus qualitative • Phrasing and language use – unclear language – grammar – ambiguity – too technical • Language can differ between demographic groups • Keep your questions: – Brief – Simple – Relevant – Specific and direct
    20. 20. Qualitative Versus Quantitative • Quantitative – Numeric in nature, extrapolated to whole population • Qualitative – Touchy-feely, give context to quantitative questions
    21. 21. The Four Horsemen of the Surveypocalypse Eliminate Bias
    22. 22. Emotional Bias • Asking loaded questions • Asking neutral-seeming questions on a loaded topic
    23. 23. Identity Bias • Asking “Do you like SurveyGizmo?” with a SurveyGizmo logo in the corner of the survey Isn’t Mel a great trainer?
    24. 24. Option Bias • Required, non-applicable questions • Leading or restrictive options • Different types of scales • Option lists of death ? ? ?
    25. 25. Conversational Bias • Surveys as a conversation • Respondents giving the answer they think you want to hear Mr. Black Job Interviews Today Were you fired from your last job?
    26. 26. Lack of Focus • Covering too many diverse topics • Additional questions that do not meet the survey goal • Questions that are not inline with the learning objectives • Questions that do not derive actionable results
    27. 27. Miscommunication • Know your audience and the language that they use and understand – Avoid technical terms unless it is appropriate – Define terms if necessary • Remember to speak in your company’s voice • Have a peer review for clarity
    28. 28. Survey fatigue as a cultural trend • Cultural survey fatigue – The average respondent is fatigued already, just by nature of: • Receiving emails from organizations • Suggestions on receipts and from cashiers
    29. 29. • Try to avoid… – Leading questions – Loaded or suggestive questions – Fatiguing question types – large tables, lots of open-text or essay questions – Sensitive questions – Highly technical language The Wrap-Up: Question Mistakes to Avoid
    30. 30. Re-establish Focus
    31. 31. Unit 3: Build
    32. 32. • Design: Involves thinking about psychology, emotions and words. It is the more abstract phase. • Build: Involves taking into account security walls, logic, combatting fatigue, bias, and poor data collection; It is the more active phase. How are Design and Build different?
    33. 33. Stages of Build
    34. 34. Construct
    35. 35. The Radio Button • Quantitative – Scale (should be horizontal) – Categorical (should be vertical) • “All of the above” is a no-no!
    36. 36. Neutral or not? Scale questions: The controversy
    37. 37. The Checkbox: Choose All That Apply
    38. 38. The Checkbox: Beware! Choosing more than one option changes statistical reporting a lot!
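The warning above can be made concrete with a short sketch (data invented): each checkbox option is reported against the respondent count, not the total number of selections, so the percentages can sum well past 100%.

```python
# Invented "choose all that apply" answers from three respondents.
responses = [
    {"email", "phone"},
    {"email"},
    {"email", "chat", "phone"},
]

def option_percentages(responses, options):
    n = len(responses)
    # Each option is divided by the number of respondents, so a
    # respondent who ticks several boxes is counted in each of them.
    return {opt: round(100 * sum(opt in r for r in responses) / n)
            for opt in options}
```

Here `option_percentages(responses, ["email", "phone", "chat"])` yields 100%, 67%, and 33%, summing to 200%, which is why checkbox results cannot be read like single-select percentages.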
    39. 39. Multi-Text Questions • Qualitative • Explorative or un-aided response; used for lists Please list the names of phone providers that you have seen or heard advertised.
    40. 40. Essay Questions • Qualitative and explorative • This is a way to gather unaided responses for your survey 3. What is your favorite thing about SurveyGizmo?
    41. 41. Table Questions Do NOT use as a space-saver – these are fatiguing!
    42. 42. Table Questions: What’s Totally Okay
    43. 43. Build your survey. Test It. Get buy-in from your stakeholders.
    44. 44. Validate • Number, Email, Percent, Date • RegEx – Validate patterns like phone numbers, zip code, etc. • Capitalize each word • Autosuggest answers
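The RegEx validation mentioned above can be sketched in plain Python; the patterns below are illustrative US-format examples, not SurveyGizmo's built-in rules.

```python
import re

# Illustrative validation patterns of the kind a survey tool might apply.
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")                       # US ZIP or ZIP+4
PHONE_RE = re.compile(r"^\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}$")  # US phone number

def is_valid_zip(answer: str) -> bool:
    """Return True when the answer looks like a US ZIP code."""
    return bool(ZIP_RE.match(answer.strip()))

def is_valid_phone(answer: str) -> bool:
    """Return True when the answer looks like a 10-digit US phone number."""
    return bool(PHONE_RE.match(answer.strip()))
```

Rejecting malformed answers at entry time (e.g. `is_valid_zip("8030")` is False) spares you from cleaning them out of the data set later.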
    45. 45. Test Reports • Are your questions reporting the way you expect? • Are you able to create the reports you need using the data you are collecting? • Is the data in the format you need?
    46. 46. Apply and Test Logic
    47. 47. Different Types of Logic • Fatigue-fighting: • Page jumping • Show-when logic • Percent branching • Piping (repeating) • Bias-fighting: • Randomization • Disqualifiers • Survey timing/combatting straight-lining • Vote protection
    48. 48. Perform a Pilot Study
    49. 49. Unit 4: Collect
    50. 50. Survey Mode Mode introduces different forms of bias – so how the data is collected is important! Choose mode of survey
    51. 51. Choose Sample
    52. 52. • Your options are: survey everyone, or survey a percentage • Why? – Cost – Survey fatigue – You will miss certain sections of the population – Using a statistically valid sample is as effective as (or more effective than) trying to survey your entire population What is Sample? Why is it Important?
    53. 53. More on Sample A sample is statistically valid when every single person in that population has an equal chance or probability of being in the sample that you select.
    54. 54. What is the Ideal Sample Size? How many responses do you need for your survey to be statistically accurate? – It depends. • How accurate do you want the data to be? (margin of error or confidence interval) • How repeatable do you want the results to be? • How large is your total population?
    55. 55. How to Determine Sample Size • Estimate 400 responses • Use a sample calculator!
    56. 56. Sample Calculators: Magic? http://www.surveysystem.com/sscalc.htm
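The arithmetic behind such calculators can be approximated with Cochran's formula plus a finite-population correction; the defaults below (95% confidence, 50% assumed proportion, 5% margin of error) are standard conventions, not values taken from the slides.

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Required responses for a given population and margin of error."""
    # Cochran's formula for an effectively infinite population.
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite-population correction shrinks the requirement for small populations.
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)
```

With these defaults the infinite-population answer works out to 385 responses, which is where the "estimate 400 responses" rule of thumb on the previous slide comes from; a known population of 10,000 needs only about 370.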
    57. 57. Caveats #1: If you are segmenting data for comparison, the segments should be the same as the segments in the represented population. #2: If you are cross-tabbing data, ensure that the data collected (per question that you are cross-tabbing) is statistically valid when representing the larger population.
    58. 58. Where Do You Get Sample? • Pull a population from your customer list. – Warning: Do NOT use your entire customer base. – If everyone has the same chance of being randomly selected, you are not biasing your results in any way.
    59. 59. • Panel Companies: A panel company is an organization that exists to sell anonymous survey responses to marketers and market researchers. Engage Panel (opt.)
    60. 60. Panel Companies: The Issues Drawbacks: – Using incentives – Cannot access market researchers – Some panel companies will buy from each other when they cannot provide the sample needed – Hard to determine level of bias in sample • If panel companies award “points” redeemable at websites like Amazon, it helps reduce sample bias based on the incentive
    61. 61. Incentives • Biases your sample (ex. Toys R Us gift card as incentive) • Incentives can jeopardize your data (because respondents just want to get to the end) • Safeguards: – Survey page timer with disqualification – Shorter surveys – Red herring questions – Clean data (eliminating straight liners, Christmas trees, etc)
    62. 62. Unit 5: Report
    63. 63. Clean Data
    64. 64. How to Clean Data – Step 1 • Look out for: – Unusually quick responses – Unusually long responses
    65. 65. How to clean data - Step 1 1. Patterns/Straightlining 2. Red herring/logically inconsistent 3. Gibberish, one word, fake text 4. Checking all or checking one
    66. 66. How to clean data - Step 2 • Prepare your data for analysis – Beware of: • Inconsistent numeric values (How old are you? Etc.) • Breaks in validation • Do not introduce new bias! – Changing question text
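Step 1's checks (straightlining, unusually quick responses) can be partly automated. A minimal sketch, assuming each response is a dict with invented field names:

```python
def flag_suspect(response: dict, min_seconds: int = 60) -> list:
    """Return a list of quality flags for one survey response.

    Field names ("grid_answers", "duration_seconds") and the 60-second
    threshold are hypothetical; adapt them to your export format.
    """
    flags = []
    grid = response.get("grid_answers", [])
    # Straightlining: the same answer chosen on every row of a grid.
    if len(grid) > 2 and len(set(grid)) == 1:
        flags.append("straightlining")
    # Unusually quick completion suggests the respondent did not read.
    if response.get("duration_seconds", min_seconds) < min_seconds:
        flags.append("too_fast")
    return flags
```

Flagged responses can then be reviewed and excluded from the data set rather than deleted outright.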
    67. 67. • Run individual reports for each learning objective. – Use this process to determine the “highlights” of data collected as they relate to ultimate actions so that you can truly understand the most significant findings of your research. Run Initial Reports
    68. 68. Run Preliminary Reports • Your preliminary reports should be focused on your original learning objectives. – Did you get your questions answered? – Is the data in the format you expected? – Are you seeing the trends that you anticipated?
    69. 69. Running reports: Key factors • Make sure your data makes sense • For any overt trends you are finding in the data, make note of them and ensure that they are important towards the objectives that you had set for your survey
    70. 70. Analyze Data
    71. 71. Analyzing text responses • How will you deal with your qualitative data? – Keyword frequency – Word clouds – Positive/negative
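Keyword frequency, the first technique listed, can be sketched with the standard library alone; the stop-word list here is illustrative, not exhaustive.

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; real analyses use a fuller one.
STOP_WORDS = {"the", "a", "an", "is", "it", "to", "and", "of", "i"}

def keyword_frequency(answers, top_n=5):
    """Count the most common non-stop-words across open-text answers."""
    words = []
    for answer in answers:
        words += [w for w in re.findall(r"[a-z']+", answer.lower())
                  if w not in STOP_WORDS]
    return Counter(words).most_common(top_n)
```

For example, `keyword_frequency(["The support is great", "Great support team"])` counts "support" and "great" twice each, surfacing the themes worth bucketing.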
    72. 72. Analyzing text responses Bucketing – You can use the SurveyGizmo Open Text Analysis tool.
    73. 73. Segmenting data for analysis • Often, your survey will contain demographic and firmographic questions to create segments in your survey. • These segments should remain the same from start to finish of the survey process.
    74. 74. Good indicators of a trend: • When you have data that isn’t statistically sound but is still interesting, you can call it “directional data”. – This data gives you an idea of what your population is saying, thinking or feeling, but you cannot use statistics to back it up.
    75. 75. Create Final Report
    76. 76. Suggestions for effective reports Stage 1: Write a summary – What was the ultimate goal of this survey? – Who was surveyed? – Who was the population? – Who responded? – Include basic highlights of the survey audience and your data to introduce the findings
    77. 77. Stage 2: Write a mini-report for each individual learning objective (ex: 401K changes). •The last section for every learning objective report will include the recommended actions to take based on the results of the survey (these should not be a surprise!) Suggestions for effective reports
    78. 78. Stage 3 (optional): Interesting and unexpected trends found – Good to know, not need-to-know – Ex. Perhaps you found a new, unintended segment of your population that could help you to make good business decisions moving forward •This is going the extra mile for your clients! Suggestions for effective reports
    79. 79. Stage 4: Conclusion – Recap what actions are going to be taken (if any) based on your findings. – Get all stakeholders to agree to those actions. – Create a survey to be sent to stakeholders in order to gain feedback for the project and put actions in motion. – Important for the next stage, Act: ask stakeholders to provide metrics that can be used to measure the success of the actions that will be taken. Suggestions for effective reports
    80. 80. Tips for communicating data • Try to anticipate questions about the report • Know the details • Be honest
    81. 81. Unit 6: Act
    82. 82. Actions: The key to success! • Reiterate project goals and objectives • Motivate the stakeholders to take action based on the data collected • Establish a reasonable timeframe in which actionable results (positive or negative) can be expected. Empower Action Taking
    83. 83. Monitor Actions
    84. 84. How to get feedback • Send a short survey to all stakeholders • Ask for any suggestions • Allows you to work better together in the next study and improve the process. Get Feedback on Survey
    85. 85. Publish and Share Study Results
    86. 86. Great Survey Design Cycle
