Linked Administrative Data and Adaptive Design
AAPOR | MAY 2019
LINKED ADMINISTRATIVE
DATA AND ADAPTIVE DESIGN
A SCHOOL-BASED SIMULATION
Mickey Jackson | Melissa Diliberti | Jana Kemp | Zoe Padgett
AMERICAN INSTITUTES FOR RESEARCH | AIR.ORG
Disclaimer
The results shown in this presentation are based on simulated response behavior to the School Survey on Crime and Safety (SSOCS) under an assumed missing-not-at-random (MNAR) response mechanism. All estimates of nonresponse bias reported here are based on this simulation and should not be cited as indicators of actual nonresponse bias in real-world SSOCS estimates.
Adaptive Designs
• General approach: target interventions (e.g. incentives, mode switches) to “high priority” cases only
• Instead of full sample or random subsample
• Goal (usually): reduce nonresponse bias
• “High priority” usually means low predicted probability of response
• Requires auxiliary variables available before contact is made with sample members
• Best auxiliary variables are those associated with characteristics measured by the survey
• Extant administrative datasets are a potential source of such auxiliary variables
School-Based Surveys
• There has been limited research to date on the use of adaptive designs for school-based surveys
• U.S. Department of Education maintains public, linkable, high-quality administrative datasets about U.S. schools
• Common Core of Data (CCD)
• Civil Rights Data Collection (CRDC)
• Thus, school-based surveys may offer fertile ground for application of adaptive design techniques
Introduction to SSOCS and CRDC
• School Survey on Crime and Safety (SSOCS): recurring cross-sectional sample survey of U.S. public schools
• Sponsor: National Center for Education Statistics (NCES)
• Purpose: produce national estimates of prevalence of violent incidents, other disciplinary problems, and various policies/practices
• Sample: random sample of ≈4,800 schools drawn from CCD
• Weighted response rate: ≈60%
• Recurrence: every 2 years (school years 2015–16, 2017–18, 2019–20, etc.)
• CRDC: mandatory administrative data collection from universe of U.S. public schools
• Sponsor: Office for Civil Rights (OCR)
• Purpose: facilitate enforcement of civil rights statutes
• Response rate: >99%
• Recurrence: every 2 years on same cycle as SSOCS
• Includes data elements that overlap directly and indirectly with SSOCS items
Research Question
• Would an adaptive design that uses linked CRDC administrative data to target interventions to some subset of the sample improve data collection outcomes for SSOCS?
• Evaluate impact on:
• Nonresponse bias (before and after weighting adjustment)
• Unequal weighting effect (i.e. variance inflation)
• Compare to:
• Adaptive design based on CCD frame data only
• Equivalent non-adaptive design (random targeting to equivalent percentage of sample)
• Expectation: CRDC-based adaptive design will perform best due to similarities between CRDC and SSOCS items
Terminology
• Frame variables: basic school characteristics (e.g., level, enrollment, urbanicity). Source: CCD (the SSOCS sampling frame). Observed for the full SSOCS sample.
• Administrative variables: CRDC variables with directly or indirectly related SSOCS items (e.g., violent incidents, instances of out-of-school suspension). Source: CRDC. Observed for the full SSOCS sample.
• Survey variables: variables collected from SSOCS survey items (e.g., violent incidents, number of disciplinary transfers). Source: SSOCS. Observed for SSOCS respondents only.
Targeting approach
• Assume adaptive design would target based on predicted response propensity (RP) score
• Steps:
• Estimate logistic regression predicting response in previously completed SSOCS cycle
• Apply regression coefficients to assign RP score to schools sampled for upcoming SSOCS cycle
• Assign intervention to 25% of upcoming sample with lowest RP scores
• Two alternative models:
• RP – CCD: CCD frame variables are only regression predictors
• RP – CCD + CRDC: CCD frame variables and CRDC administrative variables are regression predictors
• Baseline for comparison: assign intervention to random 25% of upcoming sample
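The targeting step above can be sketched in a few lines of numpy. This is a minimal illustration, not the actual SSOCS implementation; the function names and coefficient values are hypothetical:

```python
import numpy as np

def rp_scores(X, coefs, intercept):
    """Apply stored logistic-regression coefficients to a design matrix
    to obtain predicted response-propensity (RP) scores."""
    return 1.0 / (1.0 + np.exp(-(intercept + X @ coefs)))

def lowest_rp_mask(scores, frac=0.25):
    """Flag the fraction of the sample with the lowest predicted RP scores
    as the cohort to receive the intervention."""
    k = int(np.floor(frac * scores.size))
    mask = np.zeros(scores.size, dtype=bool)
    mask[np.argsort(scores)[:k]] = True
    return mask
```

The random-targeting baseline replaces `lowest_rp_mask` with a uniform draw of the same fraction of cases.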
Simulation Setup
• Key analytic challenge: survey variables are unavailable for nonrespondents, so true bias is unknown
• Solution: conduct a simulation. If we assume there is nonresponse bias, would an adaptive design help to mitigate it?
• Create pseudo-population from prior SSOCS respondents (so survey variables are observed for entire population)
• Simulate response behavior under each targeting model
• Assume true probability of response is a function of survey variables (missing not at random)
• Targeting models use predicted RP (function of CCD and/or CRDC variables) to assign intervention
• Increase true probability of response for targeted cohort
• Initially assume 5 percentage-point treatment effect (similar to incentive tested in SSOCS:2018)
• Robustness check with 15 percentage-point treatment effect
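The response-simulation step can be written as a single Bernoulli draw per school, with the treatment effect added to the true propensities of the targeted cohort. A minimal sketch (function name and defaults are illustrative, not from the study's code):

```python
import numpy as np

def simulate_response(true_rp, targeted, effect=0.05, rng=None):
    """Draw Bernoulli response indicators under the assumed true propensities,
    adding the treatment effect (in probability units) for targeted cases."""
    if rng is None:
        rng = np.random.default_rng()
    p = np.clip(true_rp + effect * targeted, 0.0, 1.0)
    return rng.random(true_rp.size) < p
```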
Results (figures):
• Simulated nonresponse bias, by targeting approach: mean number of violent incidents (5 pp. treatment effect)
• Unequal weighting effect, by targeting approach (5 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of violent incidents (15 pp. treatment effect)
• Unequal weighting effect, by targeting approach (15 pp. treatment effect)
Conclusions for Practice
• A highly effective intervention is required for an adaptive design to improve over an equivalent randomized design, consistent with the review by Tourangeau et al. (2017)
• Apparently rich administrative data will not necessarily add value
• Important to assess correlations with survey variables
• Other research (Padgett et al. 2019) finds lower-than-expected correlations between apparently similar SSOCS and CRDC variables
• If frame/administrative variables are also available for post hoc weighting, a priori targeting may have little impact on nonresponse bias, even with a highly effective intervention
Detailed Simulation Setup
• Basic approach to adaptive design:
• Estimate targeting model on data from prior SSOCS sample (“training” sample)
• Apply predictions to new SSOCS sample (“test” sample)
• Target intervention in new SSOCS sample based on predictions
• Therefore, need to simulate sampling and response behavior in both a training and a test sample
• Step 1: create population datasets for simulation
• Use SSOCS:2016 as training sample
• Use SSOCS:2018 as test sample
• Link both to the 2015–16 CRDC to obtain administrative variables
• Expand respondents by final weight to create simulated population for which survey variables are observed
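Step 1's weight expansion can be sketched as follows: each respondent record is replicated by its rounded final weight, so survey variables are observed for every unit of the resulting pseudo-population. The function name and rounding rule here are illustrative assumptions:

```python
import numpy as np

def expand_by_weight(records, final_weights):
    """Replicate each respondent record by its (rounded) final weight,
    yielding a pseudo-population with fully observed survey variables."""
    reps = np.maximum(np.rint(final_weights).astype(int), 1)
    return np.repeat(records, reps, axis=0)
```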
Detailed Simulation Setup (cont.)
• Step 2: assign “true” RP scores to simulated SSOCS:2016 and SSOCS:2018 populations
• Assumed true RP model is a logit function of:
– Frame variables
– Key survey variables (violent incidents, security guards, disciplinary removals, disciplinary transfers, and presence of threat assessment team)
• Corresponds to a missing not at random (MNAR) response mechanism
• By construction, simulated estimates will be biased
• Step 3: draw simulated training sample from SSOCS:2016 population (n = 4,500, simple random sample)
• Step 4: simulate response behavior in training sample using true RP scores
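Step 2's assumed true RP model is MNAR precisely because the linear predictor includes survey variables. A minimal sketch (coefficient names and values are hypothetical, not the study's actual parameters):

```python
import numpy as np

def assumed_true_rp(frame_X, survey_X, beta_frame, beta_survey, intercept):
    """True response propensity as a logit function of frame AND survey
    variables; the dependence on survey variables makes the mechanism MNAR."""
    lin = intercept + frame_X @ beta_frame + survey_X @ beta_survey
    return 1.0 / (1.0 + np.exp(-lin))
```

For example, a negative coefficient on violent incidents means that schools with more incidents are less likely to respond, so unadjusted respondent means understate the population mean by construction.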
Detailed Simulation Setup (cont.)
• Step 5: train targeting models on simulated SSOCS:2016 training sample
• RP – CCD: logistic regression RP model using CCD frame variables only
• RP – CCD + CRDC: logistic regression RP model using CCD frame variables and CRDC administrative variables
• Note: predicted RP model differs from true RP model, which we assume is unobservable to the user
• Step 6: draw simulated test sample from SSOCS:2018 population (n = 4,500, simple random sample)
• Step 7: assign predicted RP scores to SSOCS:2018 test sample
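Step 5's model training can be sketched with a plain gradient-ascent logistic fit (a stand-in for whatever survey package would actually be used). The RP – CCD model passes only frame columns as `X`; the RP – CCD + CRDC model horizontally stacks frame and administrative columns:

```python
import numpy as np

def fit_logit(X, y, lr=0.5, steps=5000):
    """Maximum-likelihood logistic regression via gradient ascent
    (illustrative stand-in for a production logistic fit)."""
    Xb = np.column_stack([np.ones(len(y)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))
        w += lr * Xb.T @ (y - p) / len(y)  # average log-likelihood gradient
    return w

def predict_rp(X, w):
    """Apply fitted coefficients to a new (test-sample) design matrix."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-(Xb @ w)))
```

Because the predictors differ from those in the assumed true RP model, the predicted scores are an imperfect proxy for the true (unobservable) propensities, just as in practice.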
Detailed Simulation Setup (cont.)
• Step 8: identify cohort receiving intervention under each targeting model
• RP – CCD: 25% with lowest predicted RP scores from CCD-only model
• RP – CCD+CRDC: 25% with lowest predicted RP scores from CCD+CRDC model
• Random: randomly selected 25% (provides baseline to assess value-added of model-based targeting)
• Step 9: increase true RP scores by 5 percentage points for priority cohort under each targeting model
• Based on results of SSOCS:2018 incentive experiment
• Assume uniform treatment effect
• Step 10: simulate response behavior in SSOCS:2018 test sample under each targeting model, using adjusted true RP scores
Detailed Simulation Setup (cont.)
• Step 11: calculate sample means, respondent means, and nonresponse bias using SSOCS survey variables among simulated respondents under each targeting model
• Evaluated base-weighted and nonresponse-adjusted estimates
• Nonresponse adjustments used propensity stratification weighting based on CCD and CRDC variables
• Also evaluated the unequal weighting effect (UWE) from nonresponse adjustments
• Steps 3–11 repeated for 500 replications
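The two evaluation metrics in Step 11 can be sketched as below. Under simple random sampling the base-weighted full-sample mean is just the unweighted mean, so nonresponse bias is the (weighted) respondent mean minus the full-sample mean, and the UWE is Kish's 1 + CV² of the weights. Function names are illustrative:

```python
import numpy as np

def nonresponse_bias(y, responded, weights=None):
    """Weighted respondent mean minus the full-sample mean (SRS base weights)."""
    w = np.ones(y.size) if weights is None else np.asarray(weights, float)
    resp_mean = np.average(y[responded], weights=w[responded])
    return resp_mean - y.mean()

def unequal_weighting_effect(weights):
    """Kish's UWE = 1 + CV^2 of the weights = n * sum(w^2) / (sum w)^2."""
    w = np.asarray(weights, float)
    return w.size * np.sum(w ** 2) / np.sum(w) ** 2
```

Equal weights give a UWE of exactly 1; any variation introduced by the nonresponse adjustment pushes it above 1, quantifying the variance inflation traded off against bias reduction.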
Additional Figures:
• Simulated nonresponse bias, by targeting approach: mean number of disciplinary transfers (5 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of disciplinary removals (5 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of security staff (5 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: proportion with threat assessment team (5 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of disciplinary transfers (15 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of disciplinary removals (15 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: mean number of security staff (15 pp. treatment effect)
• Simulated nonresponse bias, by targeting approach: proportion with threat assessment team (15 pp. treatment effect)