Examining Ohio’s Approach to Measuring Student Success
Series Session #1: Value-Added Information’s Role in Classroom and School Improvement
• First in a series of symposia, hosted by the OERC in partnership with the Ohio Department of Education and Battelle for Kids, designed to bring together researchers, policy influencers, and educators to build a common understanding of how to help educators use analytics to drive classroom and school improvement.
• This session lays the groundwork by opening the conversation around the national value-added analysis landscape, Ohio’s established history with its value-added model, and where the state is heading with the use of these measures to inform curriculum, instruction, accountability, and evaluation.
Synthesizing Knowledge on Value-Added Models for Teachers

1. Synthesizing Extant Knowledge for Practitioners in a Carnegie Knowledge Network
Chris Thorn, Managing Director, Analytics and Program Technology
September 16, 2013 · Columbus, OH
2. Triple Aims of Educational Improvement
Context: We live in extraordinary times.
[Diagram: the triple aims — Engagement (more relevance), Effectiveness (more ambitious learning for all students), and Efficiency (more efficient systems)]
3. Why focus on value-added?
Value-added methods are relatively new and their use is increasingly widespread, but many technical questions remain unresolved.
The problem we’re trying to address:
• The state of knowledge in the field is changing rapidly
• The vast amount of information can be overwhelming
• Most findings are written in highly technical language
• Many experts are tied to commercial interests or policy stances
4. What a teacher interested in learning more about value-added might find through an online search:
McCaffrey, D. F., Lockwood, J. R., Koretz, D., Louis, T. A., & Hamilton, L. (2004). Models for value-added modeling of teacher effects. Journal of Educational and Behavioral Statistics, 29(1), 67–101.
5. Carnegie’s Distinctive Role: Integrative Agent
[Diagram: Carnegie as an integrative agent connecting three domains — Rules & Regulations (policy advocates, legislators, union leaders, state education officials, economists); Instrument Design (designers, statisticians, applied researchers); and Use (principals, local teacher union officials, district leaders, external service providers) — all grounded in the actual practices of teachers]
6. The Carnegie Knowledge Network
www.carnegieknowledgenetwork.org
• Identifies high-priority areas characterized by significant knowledge gaps between research and practice
• Builds on an R&D agenda focused on practitioner needs
• Engages the community of practitioners
• Assembles balanced technical expertise
• Acts as an integrative agent
• Builds scholarly consensus
• Informs policy
8. Most common value-added models in use
Vendor | Name of Model | Brief Description
American Institutes for Research (AIR) | Varied | Usually control for student background
Mathematica | Varied | Usually control for student background
National Center for the Improvement of Educational Assessment (NCIEA) | Student Growth Percentile (SGP) models | Model a descriptive measure of student growth within a teacher’s classroom
SAS | EVAAS | Models control for prior test scores but not other student background variables
Value-Added Research Center (VARC) | Varied | Usually control for student background
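The contrast running through the table — models that control only for prior test scores versus models that also control for student background — can be sketched with a toy regression on simulated data. This is a hypothetical illustration only, not any vendor’s actual implementation; the coefficients, sample sizes, and the background indicator are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_teachers = 4000, 20
teacher = rng.integers(0, n_teachers, n)
prior = rng.normal(50, 10, n)                       # prior-year test score
# Hypothetical background indicator, correlated with teacher assignment
disadv = rng.binomial(1, 0.1 + 0.6 * teacher / (n_teachers - 1))
true_effect = rng.normal(0, 2, n_teachers)          # simulated teacher effects
score = 0.8 * prior - 6.0 * disadv + true_effect[teacher] + rng.normal(0, 5, n)

def estimate_effects(*covariates):
    """OLS with one indicator per teacher; returns centered teacher-effect estimates."""
    X = np.column_stack(covariates + (np.eye(n_teachers)[teacher],))
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)
    effects = beta[len(covariates):]
    return effects - effects.mean()

prior_only = estimate_effects(prior)                 # controls prior scores only
with_background = estimate_effects(prior, disadv)    # also controls background
```

Because disadvantage here is (by construction) correlated with teacher assignment, omitting it biases the prior-scores-only estimates for teachers serving many disadvantaged students; with randomly assigned students the two models would largely agree.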
9. Highlights of the recommendations
• Teachers of advantaged students benefit from models that do not control for student background factors, while teachers of disadvantaged students benefit from models that do
• Even when correlations between models are high, different models will categorize many teachers differently
• Rules for combining measures should reflect the qualities of those measures
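The second bullet can be checked with a quick simulation: two hypothetical measures that share the same underlying signal can correlate around 0.85 and still place a large share of teachers in different quintiles. The noise levels and teacher count below are illustrative assumptions, not real data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_teachers = 1000
latent = rng.normal(0, 1, n_teachers)               # shared underlying signal
model_a = latent + rng.normal(0, 0.4, n_teachers)   # hypothetical model A score
model_b = latent + rng.normal(0, 0.4, n_teachers)   # hypothetical model B score

r = np.corrcoef(model_a, model_b)[0, 1]

def quintile(x):
    """Quintile label 0-4 for each value."""
    return np.searchsorted(np.quantile(x, [0.2, 0.4, 0.6, 0.8]), x)

disagree = float(np.mean(quintile(model_a) != quintile(model_b)))
print(f"correlation {r:.2f}; quintile disagreement {disagree:.0%}")
```

Even with a high correlation, a substantial fraction of teachers land in different quintiles under the two measures — which is the practical concern when categories drive consequences.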
10. Highlights of the recommendations (continued)
• High-quality student–teacher linkage is critical (dosage/teams/mobility)
• Consider the level of precision and balance the risks
• Bias may arise when comparing the value-added scores of teachers who work in different schools
• The properties of value-added measures differ across grades and subjects
• There is only a moderate, and often weak, correlation between value-added measures and other measures of teacher effectiveness
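On the precision bullet: under a simple model where a teacher’s estimate is a class mean, the standard error shrinks only with the square root of class size, so small classes carry wide uncertainty bands. The within-class score SD of 5 below is an arbitrary assumption for illustration:

```python
import math

sigma = 5.0  # assumed within-class SD of the outcome (hypothetical)
for n in (10, 25, 100):
    half_width = 1.96 * sigma / math.sqrt(n)  # 95% CI half-width for a class mean
    print(f"class size {n:3d}: estimate ± {half_width:.2f}")
```

Quadrupling the class size only halves the interval, which is why single-year estimates for small classes are often too imprecise to support high-stakes distinctions on their own.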
11. What’s on the Horizon for Carnegie
• We have little research to draw upon for designing systems or for predicting the effects of emerging evaluation systems
• The Foundation is leveraging the pressure of accountability as the gateway drug to improvement
• Variation in effectiveness is the problem to solve
• Variation in effectiveness is the problem to solve
12. An Interesting Case Example
• First year results from a
large randomized field
trial of
Reading Recovery
(I3 initiative)
• Key: a multi-site trial
12
15. Distribution of Letter Grade of Overall Value-Added for Ohio Schools
[Bar chart: number of Ohio schools (y-axis, 0–1,200) receiving each overall value-added letter grade, F through A]
16. See the System to Improve It
We cannot improve outcomes without understanding the processes that generate them and the interconnections between those processes.