Reducing Response Burden

Brief deck describing approaches and constraints for reducing response burden.

Transcript

  • 1. Reducing Response Burden
    Fred Oswald
    Rice University – Department of Psychology
    Jeff Stanton
    Syracuse University – School of Information Studies
    Kauffman Roundtable on Establishment Surveys
    August 9, 2011
  • 2. Reducing Response Burden
    Goals
    Reduce time of administration and associated costs
    Increase response rates and compliance; reduce missing data
    Decrease survey fatigue and increase willingness to re-engage over time
    Effects
    Increase information yield per unit of response time/effort
    Balance multiple stakeholders' survey goals when relying on "lean" and less reliable information sources
  • 3. Approaches and Constraints
    Goal: No practical loss in reliability or validity at the desired aggregate level (unit, sub-unit/cluster)
    Approaches
    Reduce instructions/explanatory material
    Reduce item redundancy
    Distribute subsets of items strategically across units, using available data or imputation to complete analyses (a planned-missingness sketch follows the transcript)
    Automate field completion with NLP and scrapers
    Constraints
    Each approach has intrinsic weaknesses
    Stakeholders wedded to particular items
    Longitudinal comparability limits changes
  • 4. Research on Response Burden
    Example: How to determine efficient item assignments to sub-samples, given available item information, precision goals, and stakeholder concerns?
    Given different approaches (e.g., multiple imputation), what is the maximum reduction in item content before the loss of precision becomes intolerable? (A shortening simulation is sketched after the transcript.)
    How redundant are the components of a profile, i.e., do certain items "drive" the profile? (See the redundancy sketch after the transcript.)
    What are stakeholder reactions to various imputation/analysis strategies?
    Our Expertise
    Methods generalists who examine technology and statistical tools to inform practical goals of survey development and test characteristics
    Statistical tools: Item response theory, multi-level and cross-classified models, meta-analysis, machine learning
    Technology tools: Web-based surveys, data scrapers, alternative data collection methods
  • 5. Selected Papers
    Non-response issues
    Converse, P. D., Wolfe, E. W., Huang, X., & Oswald, F. L. (2008). Response rates for mixed-mode surveys using mail and email/web. American Journal of Evaluation, 29, 99-107.
    Rogelberg, S. G., & Stanton, J. M. (2007). Understanding and dealing with organizational survey nonresponse. Organizational Research Methods, 10, 195-209.
    Wolfe, E. W., Converse, P. D., & Oswald, F. L. (2008). Item-level non-response rates in an attitudinal survey of teachers delivered via mail and web. Journal of Computer-Mediated Communication, 14, 35-66.
    Shortening measures
    Donnellan, M. B., Oswald, F. L., Baird, B. M., & Lucas, R. E. (2006). The Mini-IPIP scales: Tiny-yet-effective measures of the Big Five factors of personality. Psychological Assessment, 18, 192-203.
    Russell, S. R., Spitzmüller, C., Lin, L. F., Stanton, J. M., Smith, P. C., & Ironson, G. H. (2004). Shorter can also be better: The abridged Job in General Measure. Educational and Psychological Measurement, 64, 878-893.
    Stanton, J. M., Sinar, E. F., Balzer, W. K., & Smith, P. C. (2002). Issues and strategies for reducing the length of self-report scales. Personnel Psychology, 55, 167-193.
    Stanton, J. M. (2000). Empirical distributions of correlations as a tool for scale reduction. Behavior Research Methods, Instruments, and Computers, 32, 403-406.
    Technology issues
    Rogelberg, S. G., Church, A. H., Waclawski, J., & Stanton, J. M. (2002). Organizational Survey Research: Overview, the Internet/intranet and present practices of concern. In S. G. Rogelberg (Ed.), Handbook of Research Methods in Industrial and Organizational Psychology. Oxford: Blackwell.
    Stanton, J. M., & Rogelberg, S. G. (2001). Using Internet/intranet web pages to collect organizational research data. Organizational Research Methods, 4, 199-216.
    Stanton, J. M. (1998). An empirical assessment of data collection using the Internet. Personnel Psychology, 51, 709-725.
    Parallel forms (item banking, test security)
    Oswald, F. L., Friede, A. J., Schmitt, N., Kim, B. K., & Ramsay, L. J. (2005). Extending a practical method for developing alternate test forms using independent sets of items. Organizational Research Methods, 8, 149-164.
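
The split-and-impute approach from slide 3 can be made concrete with a small simulation. The following Python sketch is illustrative only: the nine-item pool, the sample size, the three-form block layout, and the use of scikit-learn's IterativeImputer are assumptions for demonstration, not methods stated in the deck.

```python
# Sketch of a planned-missingness ("three-form") survey design with
# model-based imputation. Illustrative only: the item pool, sample
# size, and form layout below are assumptions, not from the deck.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n_respondents, n_items = 300, 9
items = [f"item_{i}" for i in range(n_items)]

# Simulate complete responses driven by one latent trait, so items are
# intercorrelated and the imputation model has signal to draw on.
trait = rng.normal(size=(n_respondents, 1))
full = pd.DataFrame(
    trait + rng.normal(scale=0.8, size=(n_respondents, n_items)),
    columns=items,
)

# Three-form design: each respondent answers two of the three item
# blocks, cutting per-person burden by a third; the skipped block is
# missing completely at random by construction.
blocks = [items[0:3], items[3:6], items[6:9]]
forms = [blocks[0] + blocks[1], blocks[1] + blocks[2], blocks[0] + blocks[2]]
observed = full.copy()
assigned = rng.integers(0, 3, size=n_respondents)
for row in range(n_respondents):
    skipped = [c for c in items if c not in forms[assigned[row]]]
    observed.loc[row, skipped] = np.nan

# Complete the matrix and check recovery of aggregate-level estimates.
completed = pd.DataFrame(
    IterativeImputer(random_state=0).fit_transform(observed),
    columns=items,
)
print("largest absolute error in item means:",
      round(float((completed.mean() - full.mean()).abs().max()), 3))
```

A production design would balance form assignment across units and pool estimates over several completed datasets (proper multiple imputation) rather than relying on the single completion shown here.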
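Slide 4 asks how far item content can be cut before the loss of precision becomes intolerable; a simulation of this kind gives a first answer. The sketch below is again illustrative: the 12-item tau-equivalent scale is an assumption, and items are dropped in arbitrary order purely to trace the precision curve.

```python
# Sketch: how far can a scale be shortened before reliability and
# part-whole agreement degrade? The 12-item, tau-equivalent scale
# simulated here is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 12
trait = rng.normal(size=(n, 1))
data = trait + rng.normal(size=(n, k))  # each item = trait + noise

def cronbach_alpha(x):
    """Classical alpha: m/(m-1) * (1 - sum of item variances / variance of total)."""
    m = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return m / (m - 1) * (1 - item_var / total_var)

full_score = data.sum(axis=1)
for keep in range(k, 2, -1):
    short = data[:, :keep]  # retain the first `keep` items (arbitrary order)
    alpha = cronbach_alpha(short)
    r = np.corrcoef(short.sum(axis=1), full_score)[0, 1]
    print(f"{keep:2d} items: alpha = {alpha:.3f}, r with full form = {r:.3f}")
```

With real data one would drop the least informative items first, e.g., using part-whole correlations or IRT information functions (cf. Stanton, Sinar, Balzer, & Smith, 2002, in the list above), rather than truncating arbitrarily.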
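Slide 4 also asks how redundant the components of a profile are. A principal-components view of the item correlation matrix is one quick check; the eight-item profile simulated below, with six items sharing a common driver, is an assumption for illustration.

```python
# Sketch: how redundant are the components of a profile? A principal-
# components decomposition of the item correlation matrix shows how
# much of the profile one dimension carries and which items load on
# it. Assumed setup: six of eight items share a common driver, two
# carry mostly unique content.
import numpy as np

rng = np.random.default_rng(2)
n = 400
driver = rng.normal(size=(n, 1))
unique = rng.normal(size=(n, 8))
data = np.hstack([driver + 0.5 * unique[:, :6], unique[:, 6:]])

corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)  # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("share of variance on first component:",
      round(float(eigvals[0] / eigvals.sum()), 3))
print("first-component loadings by item:", np.round(eigvecs[:, 0], 2))
```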
