May 18-20, 2006: Internet Data Collection Methods, Day 2

Standard Preparation for MS Submission
- Report the analysis of a cross-validation sample
- Assess non-response bias; analyze and report it
- Run duplicate detection; report the results
- Run malicious-data detection; report the results
- Read and cite reviews of Internet research methods
Common Objections that Reviewers or Editors May Mount in R&R, and Your Rebuttals
- Objection: lack of access control leads to junk data
  - You used a password-protected consent form
  - You filtered responses using timestamps and IP addresses
  - You screened for similar (or identical) response patterns: flip the data matrix so that respondents become the variables, run correlations between them, and look for high values
  - StudyResponse studies ranged from 0% to 6.9% duplicates under this screening method
  - Simulations showed that more than 20% of cases would have to be repeats to substantially disturb means or correlations
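The "flip the data and correlate" screen above can be sketched in a few lines. This is a minimal illustration, not the StudyResponse implementation: the data, the function names, and the 0.95 cutoff are all assumptions chosen for the demo.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length response vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def flag_duplicates(responses, threshold=0.95):
    """Treat each respondent's item vector as a variable (the "flipped"
    data matrix) and flag pairs whose correlation is suspiciously high."""
    flagged = []
    for i in range(len(responses)):
        for j in range(i + 1, len(responses)):
            if pearson(responses[i], responses[j]) >= threshold:
                flagged.append((i, j))
    return flagged

# Hypothetical Likert-type responses; respondent 2 is a near-copy of
# respondent 0, as a repeat submission might be.
data = [
    [5, 4, 1, 2, 5, 3, 4, 1],
    [1, 2, 5, 4, 2, 5, 1, 3],
    [5, 4, 1, 2, 5, 3, 4, 2],
    [3, 3, 2, 4, 1, 2, 5, 4],
]
print(flag_duplicates(data))  # [(0, 2)]
```

Flagged pairs would then be inspected alongside timestamps and IP addresses before any case is dropped, since high similarity alone does not prove duplication.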
Common Objections that Reviewers or Editors May Mount in R&R, and Your Rebuttals II
- Objection: your sample is bogus because of coverage errors
  - Argue that sample representativeness is a challenge in all research, and that purposive sampling is a better goal anyway
  - Make the sample fit the question: an Internet survey of migrant workers? Of coal miners?
  - Show the consistency of results between the web sample and a cross-validation sample
  - Argue that a typical group of Internet respondents is an improvement over pure undergraduate samples
  - Cite demographic studies of the Internet: increasing normalization toward the general population over time
Common Objections that Reviewers or Editors May Mount in R&R, and Your Rebuttals III
- Objection: no one has ever demonstrated the equivalence of the measures you used when administered over the web
  - Cite research showing that factor structures replicate, substantive conclusions replicate, and correlations generally replicate within the limits of sampling error (but be wary of mean comparisons)
  - Argue that this is a higher standard than many other published studies meet, in which researchers routinely:
    - make up their own items
    - modify items or response options of existing scales
    - trim scale lengths and field abridged versions
Common Objections that Reviewers or Editors May Mount in R&R, and Your Rebuttals IV
- Objection: all Internet research participants are volunteers by definition, so volunteer bias makes your sample unusable
  - The Belmont Report and federal research regulations require all research to be conducted on volunteers, so volunteer bias is endemic to the whole social research enterprise
  - Volunteer bias can substantially limit the projectability of means, but my study does not depend on means
  - Studies and simulations of volunteer bias generally show that it reduces the magnitude of correlations through the restriction-of-range effects it causes
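The restriction-of-range argument in the last bullet is easy to demonstrate with a toy simulation. The sketch below is an assumption-laden illustration (a population correlation of 0.5 and self-selection modeled as "only above-average x volunteers"), not a reproduction of any published simulation:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(2006)
rho = 0.5  # assumed population correlation for the demo
pairs = []
for _ in range(10_000):
    x = random.gauss(0, 1)
    y = rho * x + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
    pairs.append((x, y))

# Crude model of volunteer bias: only cases with above-average x
# "volunteer", restricting the range of x in the observed sample.
volunteers = [(x, y) for x, y in pairs if x > 0]

r_full = pearson(*zip(*pairs))            # near 0.50
r_restricted = pearson(*zip(*volunteers)) # attenuated, near 0.33
print(round(r_full, 2), round(r_restricted, 2))
```

The observed correlation among "volunteers" is noticeably smaller than the population value even though the underlying relationship is unchanged, which is the restriction-of-range attenuation the rebuttal refers to.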
Strengthening Research Plans for Web Studies
- A cross-validation sample collected with traditional research methods is never a bad thing
- Use your web sample only for tests of correlational structure and for self-referential comparisons of means (e.g., within subjects)
- Don't compare means from a web study to means from a prior paper-and-pencil study without formal equating
- Speeded and objective tests need careful testing and cross-validation
- Assess correlations between substantive variables and demographics: if they don't correlate, non-response bias may carry less weight
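The last check above (correlating substantive variables with demographics) can be run as a simple screening pass. A minimal sketch, where the column names, the toy data, and the 0.30 flagging cutoff are all hypothetical:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical survey columns (names and values are illustrative only).
demographics = {
    "age":    [23, 35, 41, 29, 52, 33, 47, 26],
    "income": [30, 55, 62, 40, 75, 50, 68, 35],
}
substantive = {
    "job_satisfaction": [4, 2, 5, 3, 4, 1, 5, 2],
    "commitment":       [3, 2, 4, 4, 5, 2, 4, 3],
}

for d_name, d_vals in demographics.items():
    for s_name, s_vals in substantive.items():
        r = pearson(d_vals, s_vals)
        note = "non-response bias may matter" if abs(r) >= 0.30 else "low concern"
        print(f"{d_name} x {s_name}: r = {r:+.2f} ({note})")
```

The logic mirrors the bullet: where a substantive variable is essentially uncorrelated with the demographics on which respondents and non-respondents differ, demographic non-response is less likely to distort conclusions about that variable.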
Useful References I
- Birnbaum, M. H. (1999). Testing critical properties of decision making on the Internet. Psychological Science, 10, 399-407.
- Buchanan, T., & Smith, J. L. (1999). Using the Internet for psychological research: Personality testing on the World Wide Web. British Journal of Psychology, 90, 125-144.
- Krantz, J. H., Ballard, J., & Scher, J. (1997). Comparing the results of laboratory and World-Wide Web samples on determinants of female attractiveness. Behavior Research Methods, Instruments, & Computers, 29, 264-269.
- Pasveer, K. A., & Ellard, J. H. (1998). The making of a personality inventory: Help from the WWW. Behavior Research Methods, Instruments, & Computers, 30, 309-313.
Useful References II
- Smith, M. A., & Leigh, B. (1997). Virtual subjects: Using the Internet as an alternative source of subjects and research environment. Behavior Research Methods, Instruments, & Computers, 29, 496-505.
- Stanton, J. M. (1998). An empirical assessment of data collection using the Internet. Personnel Psychology, 51, 709-725.
- Yost, P. R., & Homer, L. E. (1998, April). Electronic versus paper surveys: Does the medium affect the response? Paper presented at the annual meeting of the Society for Industrial and Organizational Psychology, Dallas, TX.