Webinar:
How to Design Effective Online Surveys
October 30th, 2012
Speakers




   Elizabeth Ferrall-Nunge, User Research Lead at Twitter (@enunge)
   Alfonso de la Nuez, Co-Founder & Co-CEO at UserZoom (@delanuez23)




               Twitter Hashtag #uzwebinar
Webinar Agenda
• When a survey is appropriate
• Attributes for a good survey
• Question types & when to use each
• Questionnaire biases
• Implementation considerations
• UX and usability-focused survey types
About UserZoom

• All-in-one enterprise software solution that
  helps UX pros cost-effectively test, measure
  and improve customer experience on websites
  and mobile apps.

• We specialize in online (or remote) research
  & usability testing, saving time, money and
  effort, while still obtaining very rich insights.

• In business since 2002 (2009 as SaaS), with
  offices in Sunnyvale (CA), Manchester (UK),
  Munich (DE) and Barcelona (Spain).

Product Suite:
  Unmoderated Remote Usability Testing
  Online Surveys (web & mobile)
  Online Card Sorting
  Tree Testing
  Screenshot Click Testing
  Screenshot Timeout Testing (5-sec test)
  Web VOC
  Mobile VOC
  User Recruitment Tool

Follow us on Twitter @userzoom
Webinar History




Developed in collaboration with Aaron Sedley and
Hendrik Mueller while at Google.

Materials presented at HCIC, UX Australia, etc.
WHEN A SURVEY IS APPROPRIATE
Survey Strengths

Good for:
• Attitudes
• Perceptions
• Likes & Dislikes
• Goals/Intent
• Task success (note: more on task-based surveys ahead)
• User characteristics
• Tracking over time
• Comparisons
Survey Weaknesses

Not appropriate for:
• Usability & comprehension (Usability testing)
• Cause and effect (Experiments)
• User motivations (Interviews, Observation)
• Precise user behavior, flows, context (Logs)
• Bugs (Feedback forms)
• Behaviors people are unwilling to report
• Prioritizing features (Multiple methods)
Complement Other Methods


Survey research (quant) and small-sample research (qual) complement each other:

"Is my data anecdotal or representative?"  -> answer with survey research (quant)
"Why are we seeing this trend?"            -> answer with small-sample research (qual)
ATTRIBUTES FOR A GOOD SURVEY
Elements of Quality Surveys
Representativeness:
Data accurately represent the target population

Validity:
Responses measure the dimensions of interest

Reliability:
Responses are consistent over time & samples

Questionnaire minimizes biases

Desired level of precision for key measurements
• Statistically valid comparisons
OVERVIEW OF SURVEY LIFE CYCLE
Stages of Survey Research


1.   Identifying research goals and constructs
2.   Determining how to sample your population
3.   Question types and when to use each of them
4.   Questionnaire biases to avoid
5.   Other survey design considerations
6.   Testing and optimizing the survey
7.   Implementation considerations & fielding
8.   Survey analysis fundamentals
POPULATION & SAMPLING
Population, Sample, Respondents




Population → Sampling frame → Sample → Respondents
How to Sample
•   What population do you want to measure?
•   How many users do you have?
•   What level of precision do you want?
•   Do you want to segment by 'groups'?
    o What's the smallest group to compare with?
•   How will you invite people & field the survey?

Recommendations
• Random sampling is always better!!!
• Do not survey users more than 2-4x a year
• Target ~400 responses per segment
    o 384 gives a +/-5% margin of error (see the sketch after this list)
•   Start with small %, track response rate, adapt
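
The ~384 figure above follows from the standard sample-size formula for a proportion at a 95% confidence level with worst-case variance. A minimal Python sketch (not part of the original deck; the function name and example margins are illustrative):

```python
import math

def required_sample_size(margin_of_error, z=1.96, p=0.5):
    """Responses needed for a given margin of error, assuming simple random
    sampling from a large population, 95% confidence (z = 1.96) and
    worst-case variance (p = 0.5)."""
    return (z ** 2) * p * (1 - p) / margin_of_error ** 2

print(math.ceil(required_sample_size(0.05)))  # 385 (384.16 unrounded -> the slide's ~384)
print(math.ceil(required_sample_size(0.03)))  # 1068 responses for a +/-3% margin
```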
QUESTION TYPES & WHEN TO USE EACH
Types of Survey Questions

Open-ended questions:
• Universe of answers is unknown
• Select one object from a really large set

Closed-ended questions:
• Rate a single object
• After the universe of answers is known
Open-Ended: Options Unknown

What, if anything, do you find
frustrating or unappealing about
your smart phone?

What services or applications
would you like to integrate
with Pinterest?             ✓
Open-Ended: Too Many Options

What was the make and model
of your first car?

What is your favorite meal?

What is your favorite thing
about working at Google?      ✓
Open-Ended: Natural Metrics


 How many hours did you work
 last week?

 How many times a day do you
 use your phone to get directions?

                              ✓
Types of Closed-Ended Questions
Closed-Ended: Answer Options Clear

How often do you withdraw cash
from an ATM?

__ Less than once a month
__ About once a month
__ About 2-3 times a month
__ About once a week
__ A few times a week
__ About once a day
__ Multiple times a day
                             ✓
Closed-Ended: Ranking Questions

Rank the following main dinner
courses in order of preference.
Rank answers from highest (1) to lowest (6).


___ Fried chicken
___ Beef stew
___ Kangaroo steak
___ Seared tuna
___ Spaghetti
___ Seasonal vegetables
                                               ✓
Closed-Ended: w/o Natural Metrics

Overall, how satisfied are you
with Google Drive?
         "I'm 6 satisfied."

 "I'm moderately satisfied."
  ...on a 7-point scale from extremely dissatisfied
                to extremely satisfied
                                                      ✓
Closed-Ended: Rating Questions

Good rating scales have:
• Equally spaced units
• Labels that are as extreme as possible
• Fully labeled scale points
Closed-Ended: Unipolar vs. Bipolar

Unipolar measures:
• Start from zero
• No natural midpoint
• Go to an extreme
• 5 scale points:
  Not at all ..., Slightly ..., Moderately ..., Very ..., Extremely ...

Bipolar measures:
• Start at the extreme negative
• Have a natural midpoint
• Go to the opposite extreme positive
• 7 scale points:
  Extremely ..., Moderately ..., Slightly ..., Neither ... nor ...,
  Slightly ..., Moderately ..., Extremely ...
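
As an illustration only (not in the original deck), a small Python sketch that generates fully labeled scale points following the wording above; the function names and example constructs are assumptions:

```python
def unipolar_scale(construct):
    """5-point unipolar scale: starts at zero, no natural midpoint, one extreme."""
    return [f"{prefix} {construct}" for prefix in
            ("Not at all", "Slightly", "Moderately", "Very", "Extremely")]

def bipolar_scale(negative, positive):
    """7-point bipolar scale: extreme negative to extreme positive, neutral midpoint."""
    return [f"Extremely {negative}", f"Moderately {negative}", f"Slightly {negative}",
            f"Neither {negative} nor {positive}",
            f"Slightly {positive}", f"Moderately {positive}", f"Extremely {positive}"]

print(unipolar_scale("useful"))
print(bipolar_scale("dissatisfied", "satisfied"))
```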
Closed-Ended: Prioritizing

If you really need prioritization help:


Select up to 3 features that are
most important to you.                    ✓

How important is
each feature to you?
QUESTIONNAIRE BIASES
Overview of Questionnaire Biases
1. Satisficing:
Short-cutting question answers

2. Acquiescence:
Tendency to agree to any statement

3. Social desirability:
Sticking to norms & expectations

4. Order bias:
Tendency to answer in a certain way depending on the
question or response order
1. Satisficing

= Respondents shortcut answering questions
= People attempt to make guesses!


Reasons:
• Question difficulty is high
• Cognitive ability to understand & answer is low
• Motivation to answer accurately is low
• Fatigue occurs due to a long questionnaire
1. Satisficing: Difficult Questions


How many searches did you
 conduct last year?                       ✘
         Avoid difficult or complex questions
         Capture actual behavior
         Ask about today's goals
1. Satisficing: Complex Questions




                                ✘
        Shorten questions and answers.
        Keep wording as simple as possible.
1. Satisficing: No opinion, n/a, ...

Satisfaction with your smartphone:
__ Very dissatisfied
__ Slightly dissatisfied
__ Neither satisfied nor dissatisfied
__ Slightly satisfied
__ Very satisfied
__ No opinion
                                        ✘
           Avoid "no opinion" (or similar) answers
           Break into two questions
           If you need it, make it visually distinct
1. Satisficing: Large Grid Questions




         Avoid large grid questions
                                        ✘
         Consider separate questions for each
         Add alternate row shading
2. Acquiescence Bias

= Respondents tend to agree to any statement


Reasons:
• Cognitive ability or motivation to answer is low
• Question difficulty or complexity is high
• Personality tendencies skew towards
  agreeableness
• Social norms suggest a "yes" response
2. Acquiescence: Binary Questions

Has using Picasa increased the number of
 photos you share with your friends
 or colleagues?                        ✘
   Yes
   No

         Avoid binary question types (Y/N, T/F)
         Ask construct-specific questions
         Measure attitudes on unbiased scale
2. Acquiescence: Agreement Scales




       Avoid agreement scales
                                      ✘
       Ask construct-specific questions
       Measure attitudes on unbiased scale
2. Acquiescence: Agreement Scales




               Avoid agreement scales
                                             ✘
               Ask construct-specific questions
               Measure attitudes on unbiased scale

Indicate your level of trust with Shopbop?
[Extremely distrust,...,Extremely trust]     ✓
3. Social Desirability

= Respondents stick to norms & expectations


Reasons:
• Opinion does not conform with social norms
• Feeling uncomfortable about answering
• Asked to provide opinion on sensitive topics
• Asked to provide identity
3. Social Desirability: Social Norms

How many servings of fruits and vegetables
 do you consume daily?

How frequently do advertisements influence
 your purchases?
                                     ✘
       Avoid such questions
3. Social Desirability: Sensitive Topics



                                          ✘
Indicate your level of racism:
__ Not at all racist
__ Slightly racist
__ Moderately racist
__ Very racist
__ Extremely racist

          Avoid sensitive questions
          Allow respondents to answer anonymously
          Use self-administered surveys
3. Social Desirability: Identity


 What is your full name?

 What is your home address?
                                        ✘
         For sensitive topics, allow respondents
         to answer anonymously
         Use self-administered surveys
4. Response Order Bias

= Tendency to select answers at the beginning
(primacy) or end (recency) of an answer list/scale


Reasons:
• Unconsciously apply meaning based on order
• Answer list is too long
• Answer list cannot be viewed as a whole
• Appropriate answer cannot be easily identified
4. Response Order Bias: Answer List

[Figure: example answer list, response positions marked Best, Typical, Worst]   ✘
        Randomize the answer list order
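
A minimal sketch (not from the original deck) of per-respondent answer randomization in Python; the option list, the helper name randomized_options, and the anchored "Other" entry are illustrative assumptions:

```python
import random

def randomized_options(options, anchored=("Other", "None of the above")):
    """Shuffle answer options for each respondent to spread primacy/recency
    effects, while keeping catch-all options anchored at the end."""
    shuffled = [o for o in options if o not in anchored]
    random.shuffle(shuffled)
    return shuffled + [o for o in options if o in anchored]

options = ["Battery life", "Screen size", "Weight", "Camera quality", "Other"]
print(randomized_options(options))  # e.g. ['Weight', 'Camera quality', ..., 'Other']
```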
4. Question Order Bias

= Respondent answers are influenced by the
order in which questions appear in the survey


Reasons:
• Attention is drawn to dimensions which may
  not have otherwise been considered
4. Question Order Bias: Example


1. Which of the following features
would you like to see improved?


                                           ✘
[Battery life, weight, screen size, ...]

2. What do you find most
frustrating about your
smartphone?
OTHER QUESTIONS TO AVOID
Leading Questions


[Screenshot example of a leading question]   ✘

Do you agree or disagree with
this statement: I liked the
surveys workshop a great deal.
                                 ✘

         Avoid leading questions
         Ask questions, not statements
         Measure attitude on a neutral scale
Recall and Prediction

Do you prefer the previous or the
 current version of Facebook?
Would you like Walmart more if
its aisles were less cluttered?             ✘
       Avoid such questions entirely
       Ask before and after, then compare
       Ask for each version, then compare
Reference Periods

How many times did you work
from home in Q1?                     ✘

        Define reference periods
        Avoid terms that may be misinterpreted
        State references at the beginning



                                       ✓
Between January 1 and March 31, 2012, how
  many times did you work from home?
Cute Language

Overall, what do you think of our
 new mobile app?
__ It's great!
__ Only OK. A little confusing.
__ This UI sucks. Are you guys a bunch
   of baboons?
                                         ✘

         Don't get cute.
         Use simple & straightforward language.
Broad Questions

How well do you know
your coworkers?                  ✘
         Avoid broad questions
         Figure out what you want to measure


In the past month, how many times did
  you see your Tech Lead outside of
  work?                                 ✓
Double-Barreled Questions

How satisfied are you with the
 billing and payment options?
                                           ✘
          Avoid asking about multiple things
          Use separate questions


How satisfied are you with the billing options?
How satisfied are you with the payment options?
                                                  ✓
Launch Readiness

Is the redesign ready to launch?
Which of the following features
  should Instagram work on next?
                                             ✘
         Avoid hypotheticals
         Ask about current experiences

What, if anything, do you find frustrating
or unappealing about Instagram?              ✓
OTHER SURVEY DESIGN CONSIDERATIONS
Survey Visual Design




[Screenshot example of survey visual design]   ✘

Images

[Screenshot examples of image use in surveys: ✘ ✓ ✓]
Question Order Funnel

             Broad & easy




           Specific & sensitive
Group Related Questions




[Diagram: related questions grouped together vs. scattered through the survey]

Survey Length

[Diagram: survey length comparison, survey 1 vs. survey 1 + survey 2]
SURVEY EXAMPLES: WHERE ARE THE BIASES?
IMPLEMENTATION CONSIDERATIONS & FIELDING
Survey Creation Tools
Factors to consider: functionality, cost, ease of use,
  data storage, response amount, reporting

Free*                                       Paid
SurveyMonkey                                Confirmit
Zoomerang                                   UserZoom
KwikSurveys                                 Get Satisfaction
Google Forms                                UserVoice
                                            Keynote
                                            Bizrate
                                            Medallia

*Note: Not all free tools support complex functions
(e.g., conditionals, added variables, fully labeled scale points)
In-Product Link
In-Product Pop-Ups
Crowdsourcing
Using a Panel Provider
Email Invitations
Maximizing Response Rates
Dillman's Total Design Method (1978):
   o   Put questions directly related to the topic upfront
   o   Make the questionnaire appear small/short
   o   Personalize the invitation for each respondent
   o   Explain overall usefulness & importance of
       respondent
   o   Explain the confidentiality of the collected data
   o   Pre-announce the survey a week in advance
   o   Send reminders after 1 and 3 weeks
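
The pre-announcement and reminder timing above can be laid out as a simple fielding calendar. An illustrative Python sketch (not part of the original deck; the launch date shown is the webinar date and the function name is an assumption):

```python
from datetime import date, timedelta

def fielding_schedule(launch):
    """Dillman-style timing: pre-announce one week before launch,
    then remind respondents one and three weeks after launch."""
    return {
        "pre-announcement": launch - timedelta(weeks=1),
        "survey launch": launch,
        "reminder 1": launch + timedelta(weeks=1),
        "reminder 2": launch + timedelta(weeks=3),
    }

for step, when in fielding_schedule(date(2012, 10, 30)).items():
    print(step, when)
```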
Maximizing Response Rates
Other things to keep in mind:
   o Explain your relationship to the respondent
   o Potentially offer small gifts as incentives
   o For email surveys, Mondays appear to be the best
     day
Maximizing Response Rates




[Chart: impact of incentives and survey length on response rates]
UX AND USABILITY-FOCUSED SURVEY TYPES
Surveys within the UCD Process
UserZoom’s Integrated Online Research Platform

For UX Research:
• Test live website, mobile app
• Analyze competitors
• Understand your visitors

For UX Design:
• Information architecture
• Validate design
• Iterative testing (AGILE)

For CX Measurement:
• Web VOC
• Mobile VOC

Picture source: SAP
Task-based Survey
      In an online, task-based survey, a large sample of participants (typically 100 to 200) are asked to complete
      navigational tasks on a website or prototype such as looking for information, registering, making a purchase
      or reservation, etc. It is focused on performance and satisfaction.




Task: Please locate the most popular full TV episode of all time on Hulu. Please make note of the
episode as you will be asked for it later. Click on the success button once you are done.
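
Since a task-based survey reports performance for 100 to 200 participants, the headline metric is usually a task success rate with its margin of error. A minimal Python sketch (not part of the original deck; the counts and function name are illustrative, and in practice success comes from the tool's validation question or success button):

```python
import math

def success_rate_with_ci(successes, n, z=1.96):
    """Task success rate with a normal-approximation 95% confidence interval."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p, margin

rate, margin = success_rate_with_ci(successes=112, n=150)
print(f"Task success: {rate:.0%} +/- {margin:.0%}")
```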
Task-based Survey




  Validation question, example from a pilot study. Hulu is not UserZoom’s customer.
Card Sorting Survey
  Card sorts help improve the way information is structured on the site so that it
  matches users’ mental models.




                                     Open card sort
Card Sorting Survey

Closed card sort

Instructions:
1. Please start by reading each of the items on the left
2. Sort the items into meaningful groups by dragging from
   the left and dropping on the right
Tree Testing Survey
  Tree testing complements card sorting by testing the site structure created from
  card sorting.

  1. You are on an office supplies website. Where would you go to find a mouse
  pad? Please click through the menu until you locate where you would expect to
  find it.


                                                            Instructions:
1. You'll be asked to find an
                                                            item using a menu structure.
                                                            2. Keep clicking through until you have
                                                            located the item.
                                                            3. You can always go back to search in
                                                            other areas.
Tree Testing Survey
Click Testing
Where would you click to find more information about the reliability of suppliers? Please click once.
After you have clicked once, please hit Next.
Voice of Customer (VOC) Survey




  Find out things like:

  Who are the users that visit your website?
  Why do they visit?
  Are they able to navigate successfully?
  Would they recommend it to others?
FOR MORE INFORMATION
Recommended Reading


•   Groves, Fowler, Couper, et al. (2009), Survey
    Methodology
•   J. Wright (2010), Handbook of Survey Research
•   Fowler F. J., Jr. (1995), Improving Survey
    Questions: Design and Evaluation
•   Albert W., Tullis T., Tedesco D. (2010), Beyond the
    Usability Lab: Conducting Large-scale Online
    User Experience Studies
Q&A

Elizabeth Ferrall-Nunge   Alfonso de la Nuez   UserZoom
@enunge                   @delanuez23          @userzoom
