Documents oerc_160913_va_symp_thorn
Documents oerc_160913_va_symp_thorn Presentation Transcript

  • 1. Synthesizing Extant Knowledge for Practitioners in a Carnegie Knowledge Network. Chris Thorn, Managing Director, Analytics and Program Technology. September 16, 2013 · Columbus, OH
  • 2. Triple Aims of Educational Improvement. Context: we live in extraordinary times.
    – Engagement: more relevance
    – Effectiveness: ambitious learning for all students
    – Efficiency: more efficient systems
  • 3. Why focus on value added? Value-added methods are relatively new and their use is increasingly widespread, but many technical questions remain unresolved. The problem we're trying to address:
    – The state of knowledge in the field is changing rapidly
    – The vast amount of information can be overwhelming
    – Most findings are written in highly technical language
    – Many experts are tied to commercial interests or policy stances
  • 4. What a teacher interested in learning more about value-added might find through an online search: McCaffrey, D. F., Lockwood, J. R., Koretz, D., Louis, T. A., & Hamilton, L. (2004). Models for value-added modeling of teacher effects. Journal of Educational and Behavioral Statistics, 29(1), 67–101.
  • 5. Carnegie’s Distinctive Role: Integrative Agent. [Diagram connecting the actors around value-added measures: policy advocates, legislators, and union leaders shaping rules & regulations; economists, designers, statisticians, and applied researchers shaping instrument design; state education officials, local teacher union officials, external service providers, district leaders, principals, and teachers shaping actual practices of use.]
  • 6. The Carnegie Knowledge Network (www.carnegieknowledgenetwork.org):
    – Identifies high-priority areas characterized by significant knowledge gaps between research and practice
    – Builds on an R&D agenda focused on practitioner needs
    – Engages the community of practitioners
    – Assembles balanced technical expertise
    – Acts as an integrative agent
    – Builds scholarly consensus
    – Informs policy
  • 7. CKN Online
  • 8. Most common value-added models in use:
    – American Institutes for Research (AIR): varied models; usually control for student background
    – Mathematica: varied models; usually control for student background
    – National Center for the Improvement of Educational Assessment (NCIEA): Student Growth Percentile (SGP) models; a descriptive measure of student growth within a teacher’s classroom
    – SAS: EVAAS; models control for prior test scores but not other student background variables
    – Value Added Research Center (VARC): varied models; usually control for student background
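Most of the models above share a common core: adjust current test scores for prior achievement (and sometimes other background variables), then attribute the remaining differences to teachers. A minimal sketch of that adjust-then-attribute structure on simulated data; every number here is hypothetical, and this is not any vendor's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated roster: 3 teachers, 30 students each (hypothetical numbers).
n_per_teacher = 30
teacher = np.repeat([0, 1, 2], n_per_teacher)
true_effect = np.array([-4.0, 0.0, 4.0])       # hypothetical teacher effects

prior = rng.normal(50, 10, size=teacher.size)  # prior-year test score
current = (5 + 0.9 * prior                     # growth predicted by prior score
           + true_effect[teacher]              # teacher contribution
           + rng.normal(0, 3, size=teacher.size))  # student-level noise

# Step 1: regress current scores on prior scores (covariate adjustment).
X = np.column_stack([np.ones_like(prior), prior])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
residual = current - X @ beta

# Step 2: a teacher's value-added estimate is the mean residual of that
# teacher's students, i.e. growth beyond what prior scores predict.
value_added = np.array([residual[teacher == t].mean() for t in range(3)])
```

Real systems add shrinkage, multiple prior years, and the background covariates the table distinguishes; the sketch shows only the shared skeleton.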
  • 9. Highlights of the recommendations:
    – Teachers of advantaged students benefit from models that do not control for student background factors, while teachers of disadvantaged students benefit from models that do
    – Even when correlations between models are high, different models will categorize many teachers differently
    – Rules for combining measures should reflect the qualities of those measures
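The last point can be made concrete with reliability weighting, one common (but not the only) rule for combining measures: weight each component of a composite in proportion to its reliability, so the noisier measure counts for less. All numbers below are hypothetical:

```python
# Combine two standardized teacher measures into one composite, weighting
# each in proportion to its (assumed, hypothetical) reliability.
va_score, obs_score = 0.40, 0.70   # value-added estimate, observation rating
va_rel, obs_rel = 0.35, 0.65       # assumed reliabilities of each measure

w_va = va_rel / (va_rel + obs_rel)  # 0.35: the noisier measure gets less weight
composite = w_va * va_score + (1 - w_va) * obs_score
print(round(composite, 3))          # 0.595
```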
  • 10. Highlights of the recommendations (continued):
    – High-quality linkage of students to teachers is critical (dosage, team teaching, mobility)
    – Consider the level of precision and balance the risks
    – Bias may arise when comparing the value-added scores of teachers who work in different schools
    – The properties of value-added measures differ across grades and subjects
    – There is only a moderate, and often weak, correlation between value-added calculations for the same teacher based on different tests
  • 11. What’s on the Horizon for Carnegie:
    – We have little research to draw upon for designing systems or for predicting the effects of emerging evaluation systems
    – The Foundation is leveraging the pressure of accountability as the gateway drug to improvement
    – Variation in effectiveness is the problem to solve
  • 12. An Interesting Case Example:
    – First-year results from a large randomized field trial of Reading Recovery (i3 initiative)
    – Key: a multi-site trial
  • 13. [Histogram: RCT (average) treatment effect for Reading Recovery, N = 141 schools; school-level effect sizes range from about −0.5 to 1.9. Judged by the average effect alone, “it is a success.”]
  • 14. [The same histogram of RCT treatment effects (N = 141 schools), now annotated: schools in the upper tail are positive deviants; schools in the lower tail show undesirable/weak outcomes.]
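The contrast between slides 13 and 14, a single average versus the spread around it, can be sketched with simulated data; the effect sizes below are hypothetical, drawn only to roughly match the range shown on the slide:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical school-level effect sizes, roughly spanning the slide's
# range of about -0.5 to 1.9 across N = 141 schools.
effects = rng.normal(loc=0.6, scale=0.4, size=141)

average = effects.mean()  # the single "treatment effect" of slide 13

# Slide 14's annotations: flag the tails of the distribution.
hi, lo = np.quantile(effects, [0.9, 0.1])
positive_deviants = effects[effects >= hi]   # sites worth learning from
weak_outcomes = effects[effects <= lo]       # sites needing support
print(f"average={average:.2f}, "
      f"positive deviants={positive_deviants.size}, "
      f"weak outcomes={weak_outcomes.size}")
```

The one-number summary hides exactly the variation that the positive-deviant and weak-outcome tails expose.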
  • 15. [Bar chart: distribution of the overall value-added letter grade (F, D, C, B, A) for Ohio schools.]
  • 16. See the System to Improve It. We cannot improve outcomes without understanding the processes that generate them and the interconnections between those processes.