Workshop: 'Self-administered Mobile Survey Workshop' - Dr Michael Bosnjak, Free University of Bozen-Bolzano (Mobile Research Conference 2011)

Workshop - Part Theory: Prof. Dr. Michael Bosnjak gives an introduction to the methodological foundations of self-administered mobile surveys. Topics include issues of coverage, sampling, nonresponse, and measurement applied to mobile survey contexts. Special attention is devoted to methods and procedures for increasing response rates, and to visual design effects.

1. Self-Administered Mobile Surveys. MRC 2011 Workshop (Part 1), London (UK), April 18th, 2011. Michael Bosnjak, PhD, Assoc. Prof., Free University of Bozen-Bolzano, School of Economics and Management.
2. Self-Administered Mobile Surveys?
• Definitions of mobile surveys
  – Interviewer-administered surveys
    • Interviews among mobile phone users
    • Interactive voice response surveys among mobile phone users
  – Self-administered surveys
    • SMS (text messaging) surveys
    • Browser-based surveys on mobile devices (e.g., mobile phones with mobile Internet access, smartphones, etc.)
• Our focus: surveys that are self-administered AND taken on a mobile phone AND browser-based.
3. Selected Applications
• B2C: directly at the point of sale (in shopping malls and at points of service); at public venues, such as concerts; at schoolyards, in universities and recreational facilities; en route with bus or train and at the airport
• B2B: at trade fairs; in training seminars; in workplaces without internet access; en route with bus or train and at the airport
• Common thread: insights from difficult-to-reach target groups, event/incident-based surveys, immediacy
4. Overall Goal
• Providing a very brief introduction to the methodological foundations of self-administered mobile surveys (esp. sources of bias known from survey methodology)
• Summarizing key findings of our own methodological study series conducted between 2008 and 2011
• Discussing practical, evidence-based recommendations (esp. on measurement and nonresponse issues)
5. Agenda
• Background
  – Survey research: overall aims and scope
  – The 'Total Survey Error' concept
  – Factsheet: Mobile Survey Study series (2008-2011)
• Measurement issues
  – What can be presented/assessed?
  – How usable are mobile question formats?
  – Voice capturing/recognition: why and how?
  – Acceptance of GPS positioning?
• Nonresponse issues
  – Industry perceptions of mobile survey (non)participation?
  – Reasons for (non)participation: what do mobile survey participants tell us?
  – 'True' reasons for (non)participation?
  – Speed of participation?
  – Optimal length of mobile surveys?
• Take-home messages and discussion
6. Background: Overall Aim of Surveys
• Measuring 'true scores', i.e. yielding unbiased estimates for facts and/or latent variables
  – Examples of factual questions to measure facts:
    • Household-level income/expense estimates > disposable income
    • Behavioral frequency estimates > behavior
  – Examples of indicators supposed to measure latent variables:
    • Evaluative judgments > attitudes
    • Behavioral likelihood scales > intentions
    • Brand/product-related attributes > image
• Sources of error in surveys:
  – Representation-related biases: coverage, sampling, nonresponse
  – Measurement-related biases/errors
7. Background: Total Survey Error (diagram)
• Measurement side: construct → measurement → response → survey estimate. Error sources: inappropriate operationalization (range restriction, reliability, validity); inappropriate implementation into a specific mode (undesired design-related effects). Guiding question: is the estimate representative (valid) for the construct in question?
• Representation side: population → sampling frame → sample → respondents → survey estimate. Error sources: coverage, sampling, nonresponse. Guiding question: is the estimate representative for the population in question?
8. Background: Survey Errors/Biases
• Coverage error: members of the target population have no chance of being selected into the sample (e.g., no access to the Internet, incomplete lists, etc.); error due to the fact that not every unit in the population is represented on the frame
• Sampling error: arises from the fact that not all members of the frame population are measured
• Nonresponse error: the responses of people who have not been surveyed differ from those of people who actually participated in the survey
• Measurement error: deviation of respondents' answers from their true values on the measure, e.g. due to inappropriate operationalizations of (latent) constructs, design features, and context effects
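A compact way these error sources are often tied together (a standard textbook-style decomposition, not spelled out on the slide) is the mean squared error of a survey estimate:

    \operatorname{MSE}(\hat{\theta}) \;=\; \operatorname{Bias}(\hat{\theta})^{2} \;+\; \operatorname{Var}(\hat{\theta})

Here the bias term collects the systematic coverage, nonresponse, and measurement errors, while the variance term is driven mainly by sampling (and variable measurement) error.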
9. Mobile Survey Methodology: Study Series
• Mobile Study I (July 1 - September 2, 2008):
  1. Web: item development: determinants of the willingness to participate in mobile surveys (Sozioland Web-Panel)
  2. Pre-testing: expert usability assessment at YOC
  3. Web: determinants of the willingness to participate in S4 (YOC Mobile-Panel; 979 panelists, 272 participants)
  4. Olympic Games 2008 mobile survey (YOC Mobile-Panel; 979 panelists, 413 participants)
  5. Web: usability of S4 from the participants' perspective (YOC Mobile-Panel; 413 panelists from S4, 187 completes)
• Mobile Study II (September 29 - October 18, 2009):
  6. Mobile survey: evaluation of the last vacation (Respondi Web-Panel; 3270 panelists, 540 completes)
  7. Web: usability of S6 from the participants' perspective (Respondi Web-Panel; 540 panelists from S6, 318 completes)
• Mobile Study III (March/April 2011):
  8. Usability of voice capturing/recognition technology (presentation of results tomorrow at MRC 2011, April 19, 2011)
10. www.mobileresearchconference.com
11. Agenda (identical to slide 5).
12. What can be presented/assessed? (I): single choice, multiple choice, drop-down menu.
13. What can be presented/assessed? (II): matrix/polarity profile, text field, voice/image/video capturing.
14. How 'usable' are standard formats?
• Subjective usability assessment: post-hoc Web survey one week after mobile survey completion; indicators for the usability score: fluency, simplicity, ease of use (score range: 0-100 points)
• Usability scores by question type (sources: MS I and MS II combined):
  – Single choice (options listed vertically): 89.2
  – Multiple choice (options listed vertically): 87.3
  – Drop-down menu (closed choice list): 82.7
  – Single-line text field: 74.7
  – Image-map question: 87.9
  – Voice recognition/capturing: ?
• The slide additionally charts observed item nonresponse/drop-out per question type (values of 9%, 9%, 23%, and 45% appear in the chart)
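The slide only names the three indicators and the 0-100 score range; the exact scoring rule is not given. Purely as an illustration, here is a minimal sketch of one plausible scoring rule (the 1-5 rating scale and the rescaling are my assumptions, not taken from the study):

    # Hypothetical usability scoring sketch (not the study's actual formula):
    # average three 1-5 ratings per respondent and rescale them to 0-100.
    def usability_score(fluency: int, simplicity: int, ease_of_use: int) -> float:
        """Map three 1-5 ratings onto a single 0-100 usability score."""
        ratings = [fluency, simplicity, ease_of_use]
        if any(not 1 <= r <= 5 for r in ratings):
            raise ValueError("ratings are assumed to be on a 1-5 scale")
        mean_rating = sum(ratings) / len(ratings)   # 1.0 .. 5.0
        return (mean_rating - 1) / 4 * 100          # rescale to 0 .. 100

    # Example: a respondent rating a single-choice question 5 / 5 / 4
    print(round(usability_score(5, 5, 4), 1))  # -> 91.7

Format-level values such as the 89.2 for single choice would then be means of such per-respondent scores.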
15. Technical Implementation: iPhone App.
16. Technical Implementation: Android.
17. GPS positioning: privacy concerns? Acceptance of GPS location capture among participants with an iPhone (MS II; n=45): 91% yes (willing to disclose), 9% no (not willing to disclose).
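With only n=45 iPhone participants, the 91% figure carries noticeable sampling uncertainty. As a quick illustration (not part of the slides; the 41-of-45 split is inferred from the reported 91%/9%), a Wilson score interval for that proportion:

    # Quick illustration: 95% Wilson score interval for 41 of 45 (about 91%).
    import math

    def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple:
        """Return the (lower, upper) bounds of the Wilson score interval."""
        p = successes / n
        denom = 1 + z ** 2 / n
        center = (p + z ** 2 / (2 * n)) / denom
        halfwidth = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
        return center - halfwidth, center + halfwidth

    low, high = wilson_interval(41, 45)
    print(f"{low:.2f} - {high:.2f}")  # roughly 0.79 - 0.96

So the honest reading is 'somewhere around 80-96% acceptance' rather than a precise 91%.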
18. Agenda (identical to slide 5).
19. Nonparticipation: Industry Perceptions? (Mobile Research Barometer 2/2011, February 2011)
• Survey among 327 market researchers about the acceptance and use of mobile surveys in D/A/CH
• Top 3 advantages of mobile surveys:
  – 51%: independence of time/location
  – 49%: context-sensitive, fast surveys
  – 43%: reachability of hard-to-reach, mobile target groups
• Top 3 barriers to mobile surveys:
  – 35%: costs incurred by survey participants (data traffic)
  – 35%: difficulties entering information (esp. open-ended questions)
  – 33%: software/platform heterogeneity
20. Nonparticipation: Self-Reports? (Post-survey in Wave 3, open responses, N=63.)
21. 'True' Reasons for (Non)Participation I
• What is the influence of the following potential determinants of the willingness to participate?
  1. Attitude towards participating
  2. Hedonic aspects (perceived enjoyment)
  3. Social aspects (subjective norm)
  4. Image and perceived self-congruity
  5. Perceived benefits and costs
• Hypothesized model: extended technology acceptance model (Venkatesh et al., 2003)
• Prospective study design (MS I):
  – S1: developing and optimizing measurement models
  – S3: assessing all determinants mentioned above
  – S4: Olympic Games mobile survey (non)participation
22. 'True' Reasons for (Non)Participation II (structural model results; standardized β, significant at α=.05, N=272; cf. Bosnjak et al.)
• Highest influences: hedonic aspects, self-congruity
• Not relevant: expected costs (!), opinions of others
• Fit indices (robust): SB-χ² = 407, df = 296, p < .05, χ²/df = 1.37, NNFI = .98, RMSEA = .04 (.03-.05)
23. 'True' Reasons for (Non)Participation III
• If hedonic factors outperform cost/benefit-related ones, then
  – 'exciting' incentives (a lottery/prize draw) should increase participation rates
  – compensation for incurred costs should undermine hedonic motivation (the salience of costs is increased)
• MS I experiment, manipulating basic compensation (1 EUR, yes/no) and an announced prize draw (100 EUR voucher, yes/no)
• Results confirmed our expectations (see Appendix): highest access and participation rates for the 'lottery & no incentive' condition
24. Speed of Participation? (MS I)
(Chart: speed of access to waves 1 and 2; y-axis: cumulative percentage; x-axis: hours since invitation)
• For roughly the first 4.5 hours, mobile response rates are higher than Web response rates
• Faster responses for mobile compared to Web: approx. 35% mobile versus approx. 10% Web
25. Speed of Participation? (MS II)
(Chart: mean response speed in hours for different contact/invitation time points between 8:00 and 20:00; invitations sent out via SMS.)
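The metric behind this chart (mean hours from SMS invitation to survey access, broken down by invitation time) is straightforward to compute from timestamps. A minimal sketch under assumed data structures (field names and example values are illustrative; the deck only reports the aggregated chart):

    # Illustrative computation of mean response speed per SMS invitation hour.
    from collections import defaultdict
    from datetime import datetime

    invitations = {  # respondent id -> SMS invitation timestamp (hypothetical)
        "r1": datetime(2009, 10, 5, 8, 0),
        "r2": datetime(2009, 10, 5, 8, 0),
        "r3": datetime(2009, 10, 5, 13, 0),
    }
    responses = {    # respondent id -> first survey access timestamp (hypothetical)
        "r1": datetime(2009, 10, 5, 9, 30),
        "r2": datetime(2009, 10, 6, 8, 0),
        "r3": datetime(2009, 10, 5, 13, 45),
    }

    speeds_by_hour = defaultdict(list)
    for rid, invited_at in invitations.items():
        if rid in responses:  # nonrespondents contribute no latency
            hours = (responses[rid] - invited_at).total_seconds() / 3600
            speeds_by_hour[invited_at.hour].append(hours)

    for hour, speeds in sorted(speeds_by_hour.items()):
        print(f"{hour:02d}:00 invitations -> mean {sum(speeds) / len(speeds):.1f} h")
    # 08:00 invitations -> mean 12.8 h
    # 13:00 invitations -> mean 0.8 h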
26. Current Context of Participation? (MS I)
• 'Where have you been taking part in the survey?' (Wave 3; N=116):
  – At home: 63.79%
  – In the office / at work: 17.24%
  – In a car: 6.90%
  – At the bus or train station: 4.31%
  – Using public transport: 2.59%
  – On the move (other reasons): 5.17%
• 'Which activity did you have to disrupt to take part in the mobile survey? / What were you doing in that very situation?' (Wave 3; N=98; open responses):
  – At home, busy with my PC: 17.35%
  – Watched TV: 14.29%
  – Worked at home: 10.20%
  – Read: 8.16%
  – In the office / at work: 11.22%
  – Preparing / eating a meal: 7.14%
  – On the move: 8.16%
  – Nothing disrupted: 23.47%
27. Optimal Length of Mobile Surveys? (MS II)
• 'Do you want to continue answering the survey on your mobile or online (in this case you will get a link via email)?' Participants: n=540
(Charts: share continuing on mobile vs. switching to online, split by iPhone vs. non-iPhone users — splits of 10.3%/89.7% and 31.1%/68.9% are reported — and mean completion times in minutes for the initial mobile part (around 5 minutes) vs. parts 1 & 2 combined (around 20 minutes).)
28. Agenda (identical to slide 5).
29. Take-Home Messages & Discussion
• ... on mobile survey measurement:
  – Most 'classical' closed-ended item formats can be included and are in many cases sufficiently usable
  – Various measurement options 'beyond' the usual self-administered formats do exist (e.g. GPS positioning, multimedia upload)
  – Open-ended text may need to be replaced by voice capturing/recognition (to be discussed tomorrow)
• ... on mobile survey (non)response:
  – Industry perceptions and the self-perceptions of potential mobile survey participants about the reasons for nonresponse may be misleading
  – Most probable motivators: anticipated enjoyment, image
  – Boomerang effects for (over-)compensation
  – Fast responses, given in various contexts
  – An 'optimal length' may not exist; various factors appear to influence the willingness to spend time on (mobile) surveys
30. Thank you! michael.bosnjak@unibz.it, http://www.bosnjak.eu. Michael Bosnjak, PhD, Assoc. Prof., Free University of Bozen-Bolzano, School of Economics and Management.
31. Appendix
32. Acceptance and Users' Behavior: Influencing Participants' Behavior — Experimental Design
• 2 × 2 × 2 design: basic compensation for participation in the mobile survey (1 €: yes/no) × prize draw (100 € Amazon voucher: yes/no) × timing of the incentive information (in the SMS invitation vs. on the survey landing page)
  – Information in the SMS: Group 1 (1 € + prize draw), Group 2 (1 € only), Group 3 (prize draw only), Group 4 (neither)
  – Information on the landing page: Group 5 (1 € + prize draw), Group 6 (1 € only), Group 7 (prize draw only), Group 8 (neither)
33. Acceptance and Users' Behavior: Influencing Participants' Behavior — Landing Page Access
• Landing page access rates when the incentive information was given in the SMS:
  – Group 1 (1 € + prize draw): 8.9%
  – Group 2 (1 € only): 17.3%
  – Group 3 (prize draw only): 21.2%
  – Group 4 (no incentive information): 12.3%
• Groups 5-8 not relevant here: their first contact with the incentive information occurs on the landing page itself
34. Acceptance and Users' Behavior: Influencing Participants' Behavior — Response Rates in Wave 2 over Time
(Chart: cumulative response rate by hours since the SMS invitation for Group 1 (1 EUR + prize draw), Group 2 (1 EUR), Group 3 (prize draw), and Group 4 (no incentive information), with the reminder marked on the time axis)
• Response rates are maximized with the prize draw alone (Group 3); additional compensation undermines motivation (see Group 1: 1 € plus prize draw)
35. Acceptance and Users' Behavior: Influencing Participants' Behavior — Completion (All Questions Answered)
• Incentive information in the SMS: Group 1: 5.9%, Group 2: 12.8%, Group 3: 14.4%, Group 4: 9.2%
• Incentive information on the survey landing page: Group 5: 9.8%, Group 6: 9.6%, Group 7: 9.1%, Group 8: 10.5%
36. Nonresponse Issues: Background — Why Increase Response Rates to Surveys?
• Nonresponse error = nonresponse rate × 'true' difference between respondents and nonrespondents (the 'black box')
• Notation: ȳ_r = statistic of interest for respondents, ȳ_t = statistic for the total sample, ȳ_nr = statistic of interest for nonrespondents
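A minimal reconstruction of the relationship the slide sketches, in standard nonresponse-bias notation (writing n_r and n_nr for the numbers of respondents and nonrespondents, with n = n_r + n_nr):

    \bar{y}_{t} \;=\; \frac{n_{r}}{n}\,\bar{y}_{r} \;+\; \frac{n_{nr}}{n}\,\bar{y}_{nr}
    \quad\Longrightarrow\quad
    \bar{y}_{r} - \bar{y}_{t} \;=\; \frac{n_{nr}}{n}\,\bigl(\bar{y}_{r} - \bar{y}_{nr}\bigr)

That is, the nonresponse error of a respondent-based estimate equals the nonresponse rate times the (usually unobservable, 'black box') difference between respondents and nonrespondents: the error shrinks as the nonresponse rate falls, but can remain substantial whenever respondents and nonrespondents differ strongly.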
37. Nonresponse: Background — Types of Nonresponse (diagram; source: Bosnjak, 2001)
38. Nonresponse: Background — Generic Reasons for Nonresponse
• Failure to deliver the survey request
  – Spam guards
  – Unused or infrequently checked e-mail addresses
  – Non-availability during the fielding period
• Inability to provide the requested data
  – Lack of knowledge
  – Insufficient information readily available
• Noncompliance: refusals of survey requests
39. Nonresponse: Theory — Why Do People (Not) Respond to Surveys?
• Economic exchange view
• Human needs and values
• Compliance heuristics
• Transactional view
• Planned behavior approach
• Leverage-salience theory
• Social exchange theory
40. Nonresponse: Theory — Economic Exchange View
• Rationale: respondents are motivated by the monetary benefits promised/expected
• Actionable recommendation: 'pay respondents' according to the time/effort invested
• Caveats:
  – People's price points vary greatly and are unknown a priori
  – May largely increase nonresponse bias
  – Undermines intrinsic motivation and may increase measurement error (low survey involvement)
  – Promised monetary incentives are NOT consistently effective (!)
41. Nonresponse: Theory — Human Needs and Values
• Rationale: some values are systematically related to the propensity to respond (higher-order needs, civic duty orientation, etc.)
• Caveats:
  – Effects are small (if any)
  – Actionable recommendations?
42. Nonresponse: Theory — Compliance Heuristics
• Rationale: certain aspects of the survey announcement and survey implementation induce compliant behavior:
  1. Reciprocity
  2. Scarcity
  3. Authority
  4. Consistency
  5. Consensus
  6. Liking
• Actionable recommendations: can be derived from the persuasion literature, but specific prescriptive models on how to tailor them to survey situations are rare
• Groves, Cialdini & Couper (1992); Cialdini (2008); http://www.influenceatwork.com/
43. Nonresponse: Theory — Transactional View
• Rationale: larger response propensity if the communication style reflects positive regard and avoids adult-to-child communication styles
• Caveats:
  – Limited scope
  – Empirical evidence scarce
  – Covered by other theories (compliance heuristics, social exchange)
• Comley (2006)
44. Nonresponse: Theory — Planned Behavior Approach
• Rationale: the propensity to respond to surveys is primarily a function of attitude towards participating, subjective norms, and perceived behavioral control (plus, in extensions, moral obligation)
• Actionable recommendations: if the weights are known for a specific population/sample, this enables the researcher to design survey participation requests accordingly
• Caveat: restricted to optimizing survey announcements
• Bosnjak (2002); Bosnjak, Tuten & Wittmann (2005)
45. Nonresponse: Theory — Leverage-Salience Theory
• Rationale: respondents are differentially motivated by
  – different aspects of the survey (leverage, e.g. the type of incentives), and by
  – how much emphasis the surveyor puts on each aspect (salience, e.g. the emphasis placed on certain incentives)
• Actionable recommendations: because of the leverage × salience interaction, improving response rates is not always desirable; nonresponse bias may itself be influenced by the leverage × salience interaction
• Groves, Singer & Corning (2000)
46. Nonresponse: Theory — Social Exchange Theory
• Rationale: survey participation as social exchange: the likelihood of responding is greater when the respondent trusts that the expected rewards will outweigh the anticipated costs of responding
• Actionable recommendations: the Tailored Design Method, a well-developed set of practical recommendations for all aspects of survey design/implementation, aimed at
  – establishing trust
  – increasing the benefits of participation
  – decreasing the costs of participation
• Dillman, Smyth & Christian (2009)
47. Nonresponse: Theory — TDM-Based Recommendations (Selection)
• To establish trust: obtain sponsorship by a legitimate authority; provide a token of appreciation in advance; make the task appear important; ensure confidentiality and security of information
• To increase the benefits of participation: provide information about the survey; ask for help or advice; show positive regard; say thank you; support group values; give tangible rewards; make the questionnaire interesting; provide social validation; inform people that opportunities to respond are limited
• To decrease the costs of participation: make it convenient to respond; avoid subordinate language; make the questionnaire short and easy to complete; minimize requests to obtain personal or sensitive information; emphasize similarity to other requests or tasks to which a person has responded
• Dillman, Smyth & Christian (2009, p. 38)
48. Nonresponse: Evidence — Mail Surveys: Effective Methods & Procedures I
• Most effective factors in mail surveys (only factors under the researcher's full control listed):
  – Personalization of requests to participate (Dillman, 1978, 2000; Edwards et al., 2007; Fox, Crask, & Kim, 1988; Heberlein & Baumgartner, 1978; Yammarino et al., 1991; Yu & Cooper, 1983)
  – Prepaid monetary incentives (Church, 1993)
  – Number of contacts made, esp. if a prenotifier is included (Armstrong & Lusk, 1987; Edwards et al., 2007; Fox et al., 1988; Heberlein & Baumgartner, 1978; Yammarino et al., 1991; Yu & Cooper, 1983)
➡ Integrated and refined within the Tailored Design Method (Dillman, Smyth & Christian, 2009)
49. Nonresponse: Evidence — Mail Surveys: Effective Methods & Procedures II
• Effective, but not covered here because of limited researcher control:
  – Survey topic / topic involvement
  – Length
  – Sponsorship (university / commercial)
• Factors reducing response rates (1, 2: Edwards et al., 2007; 3: Singer, Hippler & Schwarz, 1992):
  1. Starting with the most general question (e.g. demographics)
  2. Offering the opportunity to opt out of the study
  3. Over-emphasizing data protection/confidentiality
• Partly covered later for Web surveys: questionnaire design effects on nonresponse
50. Nonresponse: Evidence — Web Surveys: Effective Methods & Procedures III
• Personalization:
  – Personal salutation (by name) is effective, esp. for a powerful sender (e.g., Heerwegh et al., 2005; Joinson & Reips, 2007)
• (Monetary) incentives:
  – In general effective, but with a small overall effect (Göritz, 2006)
  – Prepaid monetary incentives need to be tangible to be effective (Birnholtz et al., 2004; Bosnjak & Tuten, 2003)
  – Lotteries are especially effective; timing is important (immediate notification) (Bosnjak & Tuten, 2003; Tuten, Galesic & Bosnjak, 2004)
• Contact features:
  – Number of contacts is very effective (Cook, Heath & Thompson, 2000)
  – An SMS prenotifier is very effective (Bosnjak et al., 2008)
