Ohio Principal Evaluation System

  1. Ohio Principal Evaluation System: A Snapshot of Implementation in 2011-2012
     Jill Lindsey, PhD; Suzanne Franco, EdD; Ted Zigler, EdD
     Wright State University’s OERC Team
     Funded by the Ohio Education Research Center
  2. Purpose
     To gather feedback from principals, evaluators, and superintendents regarding first-use experiences.
  3. Design
     • Criterion sampling
     • Phone and in-person interviews with superintendents
     • Focus group interview with evaluators
     • Focus group and individual interviews with principals evaluated using OPES
  4. Guiding Questions related to
     • Implementation
     • Training
     • 50% student performance measures
     • Challenges
     • Perceptions
     • Comparison with past practice
     • Advice
  5. Sample
     • Five superintendents of the six districts in the region with three or more trained in OPES agreed to be interviewed
       • All RttT districts
       • All Typology 6 or 7 (urban/suburban with high median income)
       • Varied enrollment (2,000-5,000) with 5-13 principals
       • Varied % free/reduced lunch population (4-40%)
     • One district partially implementing as a pilot
       • Second largest
       • Highest % free/reduced lunch
       • 8 principals evaluated with OPES
  6. Superintendent Findings
     • All completed four sessions of training and found the training to be very helpful
     • Not implementing or not fully implementing
     • Greater focus on teacher evaluation changes
     • All currently using the Danielson model for teacher evaluation and satisfied, but adapting
     • OPES an improvement over past practice
  7. Evaluator Findings
     • All completed four sessions of training and found the scenarios especially helpful, but noted a lack of clarity around evaluator latitude
     • Modified rubric and forms
     • Prompted great conversations
     • Too much variation between evaluators
     • Time concerns
     • Equity concerns about use of student growth measures
  8. Principal Findings
     • Did not receive training; need training
     • Pilot process: no student performance/growth measures used, and not part of official record
     • Varied experiences in number of meetings and artifact expectations
     • Best evaluation experience: affirming, very collaborative, with lots of conversation and input
     • Fit well with other district and building processes/goals
     • Concerns about use of student performance measures
     • Created empathy in teachers
  9. Common Findings Across Groups
     • Training helpful and needed; need clarity
     • A very positive, collaborative process
     • Need more consistency in process
     • Piloting time is essential for best use and buy-in
     • Student performance/growth component must be established
     • OTES and OPES intertwined
  10. Questions
      If you wish to contact us for more information:
      Jill.Lindsey@wright.edu
      Suzanne.Franco@wright.edu
      Ted.Zigler@wright.edu
