This document discusses issues with evaluating and managing principals using value-added metrics in the way that baseball players and managers are evaluated. It notes that principals do not directly deliver instruction to students and that, unlike teachers, their impact cannot easily be measured within a single school year. Using a single year of student test score data to evaluate principals is therefore problematic. The document also discusses how metrics can drive unintended behaviors and suggests the focus of policy should be retention of effective educators rather than dismissal.
Discussion about trends in assessment and accountability for the National Superintendent's Dialogue
Presentations from the morning session of the 22 January 2018 HEFCE open event “Using data to increase learning gains and teaching excellence” (Bart Rienties)
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education: findings from 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies for estimating learning gains? A large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
Retaining High Performers: Insights from DC Public Schools’ Teacher Exit Survey (Jeremy Knight)
As school districts across the country report various kinds of teacher shortages, how to retain teachers has emerged as a key area of interest for district leaders and policymakers. There are a variety of incentives and strategies to keep teachers in the profession, but which ones are most effective? Asking teachers themselves yields answers, some of which cut against the grain of conventional wisdom in the education community.
In order to better understand why teachers leave the profession, we analyzed teacher exit survey data from the District of Columbia Public Schools to determine what could have retained them or what would have had no effect. Because we believe that retention efforts should be focused on effective teachers, we broke down teachers’ responses by their latest teacher evaluation performance rating and focused our analysis on high-performing teachers.
Although DCPS is unique in some ways, lessons about what could have retained high-performing teachers may be transferable to other urban districts. The slide deck below presents our findings and offers considerations for other urban school districts.
E-Assessment Conference Scotland 2014 presentation
As technology evolves and becomes more integrated into education, the data trail created by learners is enormous. The analysis of this data, referred to as “learning analytics”, drives learning in a cyclical pattern: data is collected, analysed, and interventions are made based on the data. After these interventions, more data is collected and analysed, and additional (perhaps different) interventions are made.
This presentation outlines how data related to assessments is collected from three different projects within DCU and then analysed with the aim of improving the student learning experience. Each project has two common threads: making life easier for the lecturer and improving the experience of the student.
An INSET presentation to Heads of Department on How to Conduct Teacher Appraisal by Mark Steed, Principal of Berkhamsted School.
The INSET relates to the Berkhamsted Schools Group model for Teacher appraisal.
Presentation at the HEA-funded workshop 'Using technology-based media to engage and support students in the disciplines of Finance, Accounting and Economics'
The workshop presented a variety of innovative approaches that use technology to engage and support learning in business disciplines that students find particularly challenging. Delegates had the opportunity to share and evaluate good practice in implementing and developing online teaching resources, and to reflect on how to develop their own teaching practice using technologies available in most institutions.
This presentation is part of a related blog post that provides an overview of the event: http://bit.ly/1o1WfHU
For further details of the HEA's work on active and experiential learning in the Social Sciences, please see: http://bit.ly/17NwgKX
Overview of assessments, growth, and value added in a teacher evaluation context
Using Assessment Data for Educator and Student Growth (NWEA)
This presentation reviews major topics to be considered when using assessment data in implementing a school's program of educator and student growth and evaluation. By attending this workshop, participants will improve their assessment literacy, learn how to improve student achievement and instructional effectiveness through thoughtful data use, and discuss common issues shared by educators when using data for evaluative purposes.
Student Learning Objectives, Mississippi Department of Education, Research in Action, Educator Effectiveness, Assessment Literacy, Assessment, Teacher Effectiveness, Policy
Karthik Muralidharan on research on achieving universal quality primary education (Twaweza)
A presentation by Prof. Karthik Muralidharan on research on achieving universal quality primary education in India. This was presented at the Commission for Science and Technology (COSTECH) in Dar es Salaam, Tanzania, on June 19, 2014, to an audience of researchers.
NAESP Keynote 3
1. Why it’s time we stopped managing schools like baseball teams
John Cronin, Ph.D.
Senior Director, the Kingsbury Center at Northwest Evaluation Association
You can view this presentation at SlideShare: http://www.slideshare.net/NWEA/schools-cant
2. How does it work in baseball?
In baseball, the contribution of players to the success of the team can be measured (value-added). In baseball, general managers have complete control over the acquisition and deployment of players.
3. How does it work in baseball?
Sabermetricians estimate the number of wins a player contributes to his team. It’s calculated by estimating the number of runs contributed by a player and adding the number of runs denied by that player’s defensive contributions.
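The estimate described on this slide reduces to simple arithmetic. Below is a toy Python sketch; the function name and the 10-runs-per-win rule of thumb are ours for illustration (real sabermetric WAR formulas are far more involved):

```python
# Toy sketch of the slide's idea: a player's win contribution is
# estimated from runs created on offense plus runs saved on defense.
# The 10-runs-per-win conversion is a common sabermetric heuristic,
# used here purely for illustration.

RUNS_PER_WIN = 10.0

def estimated_wins(runs_contributed: float, runs_denied: float) -> float:
    """Wins a player adds: offensive runs plus defensive runs saved."""
    return (runs_contributed + runs_denied) / RUNS_PER_WIN

# A player who creates 45 runs at bat and saves 15 in the field
# contributes roughly 6 wins.
print(estimated_wins(45, 15))  # -> 6.0
```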
4. So what are the issues?
• We’ve confused players with managers.
• The metrics are problematic.
• We’ve chosen the wrong focus for policy.
5. Baseball hasn’t found a methodology to effectively apply sabermetrics to managers.
We assume the statistics applied to players (teachers) can be applied to their managers (principals).
6. How does it work in classrooms?
Brian’s students took the state exam last spring. A gain is estimated for his students. This projection may take into account his students’ past performance, their poverty rate, and a variety of other factors. Brian’s students’ gains on this spring’s tests are compared to this projection. If the gains exceed the projection, we say Brian produced “value-added”.
Value-added methodologies attempt to isolate a teacher’s contribution to learning by measuring student growth while controlling for or eliminating factors that influence growth but are outside the teacher’s control, such as student poverty.
(Timeline diagram: last spring to this spring.)
7. How does it work in classrooms?
Brian’s gain is compared to that of other teachers, and he is typically assigned a “z score” (for example, +.25), a metric that shows where he stands relative to other teachers in the state.
(Timeline diagram: last spring to this spring.)
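The projection-and-z-score logic of slides 6 and 7 can be sketched in a few lines. This is a minimal illustration with made-up numbers and a flat projection for every student; real value-added models estimate each student's projection by regression on prior scores, poverty, and other covariates:

```python
# Minimal sketch of the value-added logic described on slides 6-7.
# All numbers are hypothetical.
from statistics import mean, stdev

def value_added(projected_gains, actual_gains):
    """Average amount by which a teacher's students beat their projections."""
    return mean(a - p for p, a in zip(projected_gains, actual_gains))

def z_score(teacher_va, all_teacher_vas):
    """Where a teacher stands relative to other teachers in the state."""
    return (teacher_va - mean(all_teacher_vas)) / stdev(all_teacher_vas)

# Brian's students were each projected to gain 8 points; most gained more,
# so Brian shows positive value-added.
projected = [8, 8, 8, 8]
actual = [9, 10, 8, 9]
brian_va = value_added(projected, actual)  # 1.0

state_vas = [brian_va, -0.5, 0.2, 0.0, 0.8, -1.0]
print(round(z_score(brian_va, state_vas), 2))  # -> 1.21
```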
8. How are principals different?
• They don’t directly deliver instruction to students.
• Their impact cannot easily be measured within a school year.
Source: Lipscomb, S., Teh, B., Gill, B., Chiang, H., & Owens, A. (2010, Sept.). Teacher and Principal Value-Added: Research Findings and Implementation Practices. Cambridge, MA: Mathematica Policy Research.
9. Three schools’ value-added math and reading results - who is the better principal?
Many state assessment systems use a single year of data for principal evaluation.
(Bar chart: math and reading value-added scores, ranging from about -1.5 to 2, for Langston Hughes Elem, Scott Joplin Elem, and Lewis Latimer Elem.)
13. So what are the issues?
• We’ve confused players with managers.
• The metrics are problematic.
• We’ve chosen the wrong focus for policy.
14. How does it work in baseball?
In baseball, each player creates his own metrics by getting on base, stealing bases, or making catches. The metrics directly reflect his performance.
15. Issues in the use of growth and value-added measures
• Differences among value-added models
• Los Angeles Times Study
• Los Angeles Times Study #2
16. Issues in the use of value-added measures
Control for statistical error: all models attempt to address this issue. Nevertheless, many teachers’ value-added scores will fall within the range of statistical error.
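The statistical-error point can be made concrete: a value-added estimate carries a standard error, and a score is only distinguishable from average when its confidence interval excludes zero. A sketch with hypothetical numbers:

```python
# Sketch of the statistical-error issue: a value-added estimate comes
# with a standard error, and many teachers' scores are too close to
# zero to be distinguished from average. Numbers are illustrative.

def distinguishable_from_average(va_score: float, std_error: float,
                                 z_crit: float = 1.96) -> bool:
    """True if the 95% confidence interval around the score excludes 0."""
    lower = va_score - z_crit * std_error
    upper = va_score + z_crit * std_error
    return lower > 0 or upper < 0

# A score of +0.3 with a standard error of 0.25 has a confidence
# interval of roughly (-0.19, +0.79): it spans zero, so this teacher
# cannot be called above (or below) average with confidence.
print(distinguishable_from_average(0.3, 0.25))  # -> False
print(distinguishable_from_average(0.9, 0.25))  # -> True
```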
17. Issues in the use of growth and value-added measures
Control for statistical error:
• New York City
• New York City #2
18. What Makes Schools Work Study - Mathematics
(Scatter plot: each teacher’s value-added index in Year 1 against Year 2, with values ranging from about -10.0 to 12.0 on each axis.)
Data used represents a portion of the teachers who participated in Vanderbilt University’s What Makes Schools Work project, funded by the federal Institute of Education Sciences.
23. Number of students who achieved the normal mathematics growth in that district
(Chart: number of students who met versus failed the growth target, plotted by each student’s score in fall.)
25. Test duration and math growth between two terms in one school’s fifth grade
(Chart: for each student, the duration in minutes of the first and second test, and the scale score growth attained. The white line represents the average duration of the second test; the yellow line represents the average growth for fifth graders in this district.)
26. Test duration and math growth between two terms in all fifth grades in a district
(Chart: test 1 duration, test 2 duration, and scale score growth.)
27. Test duration and math growth between two terms in all fifth grades in a district
(Chart: test 1 duration, test 2 duration, and scale score growth.)
28. The problem with spring-spring testing
(Timeline: a student’s spring-to-spring growth trajectory runs from Teacher 1 (March through June), through the summer, to Teacher 2 (September through the following March).)
29. Metrics do not provide a complete picture of the classroom
They don’t capture important non-cognitive factors that impact learning.
30. The intangibles
In baseball, the employment of sabermetrics has reduced the impact that a player’s intangibles have on personnel decisions. These intangibles may include leadership qualities, locker room presence, and other personality traits that may contribute to team success.
31. Non-cognitive factors
In baseball, the employment of sabermetrics has focused general managers on the player’s contribution to the measures that ultimately matter: runs and wins. These are not the only measures that matter in the sport, however.
In education, value-added measurement has focused policy-makers on the teacher’s contribution to academic success, as reflected in test scores. Jackson (2012) argues that teachers may have more impact on non-cognitive factors that are essential to student success, like attendance, grades, and suspensions.
32. Non-cognitive factors
Employing value-added methodologies, Jackson found that teachers had a substantive effect on non-cognitive outcomes that was independent of their effect on test scores. Teachers:
• Lowered the average student absenteeism by 7.4 days.
• Improved the probability that students would enroll in the next grade by 5 percentage points.
• Reduced the likelihood of suspension by 2.8%.
• Improved the average GPA by .09 (Algebra) or .05 (English).
Source: Jackson, K. (2013). Non-Cognitive Ability, Test Scores and Teacher Quality: Evidence from 9th Grade Teachers in North Carolina. Northwestern University and NBER.
33. So what are the issues?
• We’ve confused players with managers.
• The metrics are problematic.
• We’ve chosen the wrong focus for policy.
34. Policy has focused on dismissal rather than retention.
In baseball, exceptional players are much rarer than average ones. Thus it is vital for a team to keep its best players.
35. Employment of Elementary Teachers 2007-2012
The elementary school teacher workforce shrank by 178,000 teachers (11%) between May 2007 and May 2012.
(Chart: number of teachers by year - 2007: 1,538,000; 2008: 1,544,300; 2009: 1,544,270; 2010: 1,485,600; 2011: 1,415,000; 2012: 1,360,380.)
Source: (2012, May) Bureau of Labor Statistics – Occupational Employment Statistics. Numbers exclude special education and kindergarten teachers.
36. The impact of seniority-based layoffs on school quality
In a simulation study of a layoff of 5% of teachers using New York City data, reliance on seniority-based layoffs would:
• Result in 25% more teachers laid off.
• Lay off teachers who were .31 standard deviations more effective (using a value-added criterion) than those lost using an effectiveness criterion.
• Retain 84% of teachers with unsatisfactory ratings.
Source: Boyd, L., Lankford, H., Loeb, S., & Wycoff, J. (2011). Center for Education Policy, Stanford University.
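The seniority-versus-effectiveness comparison on this slide can be illustrated with a toy simulation. The random teacher data below are ours, not the Boyd et al. data, and for simplicity we cut an equal number of positions under each rule rather than modeling the salary-budget effects behind the study's "25% more teachers" finding:

```python
# Toy simulation: laying off by seniority (newest hires first) versus
# by effectiveness (lowest value-added first). Teacher data are random
# and purely illustrative.
import random

random.seed(1)
teachers = [{"years": random.randint(1, 30),
             "va": random.gauss(0, 1)} for _ in range(1000)]

def laid_off(teachers, key, n):
    """The n teachers ranked lowest by the given criterion."""
    return sorted(teachers, key=key)[:n]

n_cuts = 50  # cut the same number of positions under each rule
by_seniority = laid_off(teachers, key=lambda t: t["years"], n=n_cuts)
by_effectiveness = laid_off(teachers, key=lambda t: t["va"], n=n_cuts)

avg = lambda group: sum(t["va"] for t in group) / len(group)

# Seniority-based cuts remove teachers whose average value-added is far
# higher than the cuts chosen by effectiveness, i.e. better teachers
# are lost.
print(avg(by_seniority), avg(by_effectiveness))
```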
37. We must identify and protect the most effective teachers to improve the profession. We must also identify the least effective teachers. If evaluators do not differentiate their ratings, then all differentiation comes from the test, and the evaluations will not gain credibility with the public.
38. Results of Tennessee Teacher Evaluation Pilot
(Bar chart: percentage of teachers receiving each rating, 1 through 5, under the value-added result versus the observation result.)
39. Results of Georgia Teacher Evaluation Pilot
(Pie chart of evaluator ratings: Ineffective, Minimally Effective, Effective, and Highly Effective, with segment sizes of 1%, 2%, 23%, and 75%.)
42. What’s the analogy to schools?
Policy makers believe value-added metrics provide a statistical means to measure the effectiveness of teachers and principals.
43. What’s the assumed parallel to schools?
Policy-makers assume that reading and mathematics constitute adequate measures of effectiveness. Policy-makers assume that the principal controls the acquisition and deployment of talent.
44. (Image-only slide.)
45. The Cincinnati Approach - Method
• Evaluators were trained and calibrated to the Danielson model.
• Both peer and administrator evaluators were used.
• Each teacher was observed three times by a peer and once by an administrator.
• Stakes were higher for beginning teachers than veterans.
Source: Taylor, E. and Tyler, J. (2012, Fall). Can Teacher Evaluation Improve Teaching?
46. The Cincinnati Approach - Findings
• In the first year, the average teacher improved student math scores by .05 SD; in subsequent years this improved to .11 SD.
• Improvement was sufficient to move a 25th percentile teacher to near average.
• Reading scores did not improve.
• The evaluations retained a “leniency” bias typical of other evaluation programs.
• The pilot cost was high: $7,500 per teacher.
47. The Cincinnati Approach - Context
• In the first year, the average teacher improved student math scores by .05 SD; in subsequent years this improved to .11 SD.
• Gains in the first two years of teaching are typically .10 SD in mathematics (Rockoff, 2004).
• Gains from being placed with highly effective peers are .04 SD in mathematics (Jackson and Bruegmann, 2009).
• The pilot cost was high: $7,500 per teacher.
Rockoff, J. E. (2004). “The Impact of Individual Teachers on Student Achievement: Evidence from Panel Data.” American Economic Review, 94(2): 247-252.
Jackson, C. K. and Bruegmann, E. (2009, July). Teaching Students and Teaching Each Other: The Importance of Peer Learning for Teachers. NBER Working Paper No. 15202, JEL No. I2, J24.
48. Reliability of a variety of teacher observation implementations

Observation by | Reliability coefficient (relative to state test value-added gain) | Proportion of test variance explained
Principal - 1 | .51 | 26.0%
Principal - 2 | .58 | 33.6%
Principal and other administrator | .67 | 44.9%
Principal and three short observations by peer observers | .67 | 44.9%
Two principal observations and two peer observations | .66 | 43.6%
Two principal observations and two different peer observers | .69 | 47.6%
Two principal observations, one peer observation, and three short observations by peers | .72 | 51.8%

Source: Bill and Melinda Gates Foundation (2013, January). Ensuring Fair and Reliable Measures of Effective Teaching: Culminating Findings from the MET Project’s Three-Year Study.
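One relationship in this table is worth making explicit: the "proportion of test variance explained" column is the square of the reliability coefficient (e.g., .51 squared is 26.0%). A quick check against a few of the reported figures:

```python
# Verifying that "proportion of variance explained" equals the square
# of the reliability coefficient for the MET figures reported above.

met_results = {
    "Principal - 1": 0.51,
    "Principal - 2": 0.58,
    "Principal and other administrator": 0.67,
    "Two principal obs. + peer observations": 0.72,
}

for design, r in met_results.items():
    # r = reliability coefficient; r*r = share of variance explained
    print(f"{design}: r = {r:.2f}, variance explained = {r * r:.1%}")
```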
49. Assessment Literacy in a Teacher Evaluation Framework
Presenter: John Cronin, Ph.D.
Contacting us: Rebecca Moore, 503-548-5129; E-mail: rebecca.moore@nwea.org
This PowerPoint presentation and recommended resources are available at our SlideShare website:
50. Why it’s time we stopped pretending schools should be managed like baseball teams
51. Suggested reading
Baker, B., Oluwole, J., & Green, P. (2013). The legal consequences of mandating high stakes decisions based on low quality information: Teacher evaluation in the Race to the Top era. Education Policy Analysis Archives, 21(5).
52. Thank you for attending
Presenter: John Cronin, Ph.D.
Contacting us: NWEA Main Number: 503-624-1951; E-mail: rebecca.moore@nwea.org
The presentation and recommended resources are available at our SlideShare site: http://www.slideshare.net/NWEA/tag/kingsbury-center
53. What about principals?
The issue is the same with principals: it is difficult to separate the contribution of the principal to learning from the contribution of teachers.
Source: Lipscomb, S., Teh, B., Gill, B., Chiang, H., & Owens, A. (2010, Sept.). Teacher and Principal Value-Added: Research Findings and Implementation Practices. Cambridge, MA: Mathematica Policy Research.
54. How does it work in classrooms?
Two very important assumptions:
• The teacher directly delivers instruction that causes learning!
• The teacher’s impact can be measured within a school year!
(Timeline diagram: last spring to this spring, with an example z score of +.25.)
55. Four issues
• How do you measure a principal?
• How accurate and reliable are these measures?
• What anticipated and unanticipated impacts do your measures have on behavior?
• Where should our energy really be focused?
57. So what are the issues?
• We’ve confused players with managers.
• The metrics are problematic.
• We’ve chosen the wrong focus for policy.
58. How does it work in education?
Teacher or School Value-Added: How much academic growth does a teacher or school produce relative to the median teacher or school?