
oerc_160913_va_symp_thorn


  1. Synthesizing Extant Knowledge for Practitioners in a Carnegie Knowledge Network. Chris Thorn, Managing Director, Analytics and Program Technology. September 16, 2013, Columbus, OH
  2. Triple Aims of Educational Improvement. Context: we live in extraordinary times.
     • ENGAGEMENT: more relevance
     • EFFECTIVENESS: ambitious learning for all students
     • EFFICIENCY: more efficient systems
  3. Why focus on value-added? Value-added methods are relatively new and their use is increasingly widespread, but many technical questions remain unresolved. The problem we're trying to address:
     • The state of knowledge in the field is changing rapidly
     • The vast amount of information can be overwhelming
     • Most findings are written in highly technical language
     • Many experts are tied to commercial interests or policy stances
  4. What a teacher interested in learning more about value-added might find through an online search: McCaffrey, D. F., Lockwood, J. R., Koretz, D., Louis, T. A., & Hamilton, L. (2004). Models for value-added modeling of teacher effects. Journal of Educational and Behavioral Statistics, 29(1), 67-101.
  5. Carnegie's Distinctive Role: Integrative Agent. [Diagram connecting three worlds: rules & regulations (policy advocates, legislators, union leaders, local teacher union officials), instrument design (economists, designers, statisticians, applied researchers), and actual practices of use (state education officials, external service providers, principals, teachers, district leaders).]
  6. The Carnegie Knowledge Network:
     • Identifies high-priority areas characterized by significant knowledge gaps between research and practice
     • Builds on an R&D agenda focused on practitioner needs
     • Engages the community of practitioners
     • Assembles balanced technical expertise
     • Acts as an integrative agent
     • Builds scholarly consensus
     • Informs policy
  7. CKN Online
  8. Most common value-added models in use (vendor: model; brief description):
     • American Institutes for Research (AIR): varied; usually control for student background
     • Mathematica: varied; usually control for student background
     • National Center for the Improvement of Educational Assessment (NCIEA): Student Growth Percentile (SGP) Models; models a descriptive measure of student growth within a teacher's classroom
     • SAS: EVAAS; models control for prior test scores but not other student background variables
     • Value-Added Research Center (VARC): varied; usually control for student background
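To make the "control for prior test scores" idea concrete, here is a minimal sketch, not any vendor's actual model: it estimates a teacher's value-added as the mean residual of their students' current scores after a district-wide regression on prior scores. All numbers, teacher labels, and effect sizes are invented for illustration.

```python
import random
from statistics import mean

random.seed(0)

def ols_slope_intercept(xs, ys):
    """Closed-form simple OLS fit of ys on xs."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Simulated rosters: hypothetical teachers with known true effects.
true_effect = {"A": 5.0, "B": 0.0, "C": -5.0}
students = []  # (teacher, prior_score, current_score)
for teacher, effect in true_effect.items():
    for _ in range(200):
        prior = random.gauss(50, 10)
        current = 0.8 * prior + 10 + effect + random.gauss(0, 5)
        students.append((teacher, prior, current))

# District-wide regression of current scores on prior scores.
slope, intercept = ols_slope_intercept(
    [s[1] for s in students], [s[2] for s in students])

# Value-added estimate: mean residual per teacher.
va = {}
for teacher in true_effect:
    residuals = [cur - (slope * prior + intercept)
                 for t, prior, cur in students if t == teacher]
    va[teacher] = mean(residuals)

print({t: round(v, 1) for t, v in sorted(va.items())})
```

With identical prior-score distributions across classrooms, the recovered estimates track the true effects; the models in the table differ chiefly in which additional student background variables (if any) join the regression.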
  9. Highlights of the recommendations:
     • Teachers of advantaged students benefit from models that do not control for student background factors, while teachers of disadvantaged students benefit from models that do control for student background factors
     • Even when correlations between models are high, different models will categorize many teachers differently
     • Rules for combining measures should reflect the qualities of those measures
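The second point above is easy to verify numerically. The following sketch, using invented numbers, simulates two model scores that correlate at roughly r = 0.9 and then sorts teachers into quintiles under each: a substantial share still land in different categories.

```python
import random
from statistics import mean, stdev

random.seed(1)
N = 1000

# Two model scores sharing a "true" component, with model-specific
# noise tuned so they correlate at roughly r = 0.9.
true = [random.gauss(0, 1) for _ in range(N)]
model_a = [t + random.gauss(0, 0.33) for t in true]
model_b = [t + random.gauss(0, 0.33) for t in true]

def correlation(xs, ys):
    mx, my, sx, sy = mean(xs), mean(ys), stdev(xs), stdev(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / ((len(xs) - 1) * sx * sy))

def quintile_ranks(scores):
    """Assign each teacher a quintile (0-4) by rank order."""
    order = sorted(range(len(scores)), key=scores.__getitem__)
    ranks = [0] * len(scores)
    for rank, idx in enumerate(order):
        ranks[idx] = rank * 5 // len(scores)
    return ranks

qa, qb = quintile_ranks(model_a), quintile_ranks(model_b)
disagree = sum(a != b for a, b in zip(qa, qb)) / N
print(f"r = {correlation(model_a, model_b):.2f}, "
      f"different quintile for {disagree:.0%} of teachers")
```

Even at r = 0.9, roughly a third of the simulated teachers change quintile between models, which is why the choice of model matters despite high correlations.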
  10. Highlights of the recommendations:
     • High-quality linkage is critical (dosage/teams/mobility)
     • Consider the level of precision and balance the risks
     • Bias may arise when comparing the value-added scores of teachers who work in different schools
     • The properties of value-added measures differ across grades and subjects
     • There is only a moderate, and often weak, correlation between value-added calculations for the same teacher based on different tests
  11. What's on the Horizon for Carnegie:
     • We have little research to draw upon for designing systems or for predicting the effects of emerging evaluation systems
     • The Foundation is leveraging the pressure of accountability as the gateway drug to improvement
     • Variation in effectiveness is the problem to solve
  12. An Interesting Case Example:
     • First-year results from a large randomized field trial of Reading Recovery (i3 initiative)
     • Key: a multi-site trial
  13. RCT (Average) Treatment Effect: Reading Recovery, N = 141 schools. [Histogram of school counts by effect size, from -0.5 to 1.9; the positive average effect is annotated "It is a success."]
  14. Distribution of RCT Treatment Effects: Reading Recovery, N = 141 schools. [Histogram of school counts by effect size, from -0.5 to 1.9, annotating "positive deviants" at the high end and "undesirable/weak outcomes" at the low end.]
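The shift from slide 13 to slide 14 is the analytic move worth making concrete: instead of reporting only the average treatment effect, compute a per-school effect size and examine the spread. The sketch below uses entirely invented data and thresholds; it computes a standardized mean difference (Cohen's d) for each of 141 simulated schools and flags the tails.

```python
import random
from statistics import mean, stdev

random.seed(2)

def cohens_d(treated, control):
    """Standardized mean difference with a pooled SD."""
    n1, n2 = len(treated), len(control)
    pooled = (((n1 - 1) * stdev(treated) ** 2 +
               (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(treated) - mean(control)) / pooled

# Simulate 141 schools whose true effects vary around a positive mean
# (hypothetical spread), each with small treated and control groups.
effects = []
for _ in range(141):
    site_effect = random.gauss(0.6, 0.4)
    control = [random.gauss(0, 1) for _ in range(30)]
    treated = [random.gauss(site_effect, 1) for _ in range(30)]
    effects.append(cohens_d(treated, control))

print(f"mean effect: {mean(effects):.2f}")
print(f"positive deviants (d > 1.0): {sum(e > 1.0 for e in effects)}")
print(f"weak/undesirable (d < 0.0): {sum(e < 0.0 for e in effects)}")
```

A positive average coexists with sites at both tails; the tails, not the mean, identify the positive deviants to learn from and the weak-outcome sites that need attention.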
  15. Distribution of Letter Grade of Overall Value-Added for Ohio Schools. [Bar chart of the number of schools (0 to 1200) at each letter grade, F through A.]
  16. See the System to Improve It. We cannot improve outcomes without understanding the processes that generate them and the interconnections between the processes.