An Empirical Investigation of Ohio’s Educator Evaluation System (MWERA)
 

Presented at the Mid-Western Educational Research Association Annual Meeting, November 6-9, 2013, by Anirudh V.S. Ruhil, Marsha S. Lewis, Lauren Porter, and Emily A. Price.

Note -- NY findings: State Education Commissioner John King presented preliminary numbers to the state Board of Regents Tuesday morning, announcing that nearly 50 percent of teachers received a "highly effective" rating, the top score. Another 42 percent were deemed "effective," with only 4 percent rated "developing" and 1 percent "ineffective."

Presentation Transcript

  • An Empirical Investigation of Ohio's Educator Evaluation System
    Authors: Anirudh V.S. Ruhil (Ohio University) <ruhil@ohio.edu>; Marsha S. Lewis (Ohio University) <lewism5@ohio.edu>; Lauren Porter (The Ohio State University) <porter.700@osu.edu>; Emily A. Price (Ohio University) <ep311508@ohio.edu>
  • Overview
    Two questions motivate our study:
    -- What does Ohio's educator evaluation system look like?
    -- Any systematic patterns (conditional/unconditional)?
    Agenda: OTES/OPES; eTPES; Data, Analysis, and Findings; Conclusions
  • OTES (2014-2015)
    (1) Performance Standards (50%) -- developed by the OH Educator Standards Board; seven components:
        1: Students; 2: Content; 3: Assessment; 4: Instruction; 5: Learning Environment; 6: Collaboration and Communication; 7: Professional Responsibility and Growth
    (2) Student Growth Measures (50%) -- Value-Added; Approved Vendor Assessments; LEA Measures
  • Student Growth Measures in OTES (a weighting sketch follows the transcript)
    (A) Value-Added -- grades 4-8, ELA & Math. Until June 30, 2014, the majority (>25%) of SGM shall be based on Value-Added; on or after July 1, 2014, all (50%) of SGM shall be based on Value-Added.
    (B) Approved Vendor Assessments -- Terra Nova, ACT End-of-Course, NWEA MAP, STAR, ...
    (C) LEA Measures -- Student Learning Objectives; Shared Attribution; LEA/School-level Value-Added; LEA/School-level SLO
  • OPES (2014-2015)
    (1) Performance Standards -- shared vision, goal-setting, and continuous improvement; high-quality instruction -> increased student achievement; managing resources/operations; establishing collaborative learning and shared leadership; engaging parents and the community
    (2) Student Growth Measures -- Value-Added; Approved Vendor Assessments; LEA Measures
  • Student Growth Measures in OPES
    (A) Value-Added -- School-level Value-Added
    (B) Approved Vendor Assessments -- School-level Composite Measure; School-level Aggregate of AVA Scores
    (C) LEA Measures -- District SLOs; District Value-Added; Aggregate of Teachers' Value-Added Scores; Student Achievement Trends; Progress on Improvement Plans; Student Course-Taking Trends (e.g., AP)
  • OTES/OPES Final Rating Determination (transcribed as a lookup table after the transcript)
    Columns are Performance Standards ratings (4 = highest, 1 = lowest); rows are SGM ratings.

    SGM Rating |            4 |            3 |           2 |           1
    Above      | Accomplished | Accomplished | Skilled     | Developing
    Expected   | Skilled      | Skilled      | Developing  | Developing
    Below      | Developing   | Developing   | Ineffective | Ineffective
  • OTES/OPES Data for 2012-2013 (the reported percentages are reproduced in a short tabulation after the transcript)
    OTES -- 26 LEAs = 23 PSDs + 1 JVSD + 2 CSs
      Ineffective: 13 (0.6%) | Developing: 298 (12.8%) | Skilled: 1,580 (67.8%) | Accomplished: 441 (18.9%) | Total: 2,332 (100%)
    OPES -- 27 LEAs = 24 PSDs + 1 JVSD + 2 CSs
      Ineffective: 0 (0%) | Developing: 22 (12.9%) | Skilled: 98 (57.3%) | Accomplished: 51 (29.8%) | Total: 171 (100%)
    After exclusions for LEAs piloting OTES/OPES we have 24 LEAs (~2,001 teacher records) and 15 LEAs (~62 principal records); these sub-sample sizes drop further once "Exempt" records are excluded.
  • OTES/OPES Final Summative Ratings (chart)
  • OTES: Performance Rating by SGM Rating (chart)
  • OTES: Distribution of SGM Ratings Across Performance Standard Ratings (chart)
  • OTES: SGM Category by Rating (chart)
  • OTES: Value-Added Weight by Rating (chart)
  • OTES: SLO Weight by Rating (chart)
  • Initial Conclusions
    -- Value-Added is fairly congruent with the other evaluation measures.
    -- The weight placed on Value-Added seems to be of no consequence for the final summative rating.
    -- Our early results are in line with NYC (http://bit.ly/16qYVYg), and possibly with other states as well.
    -- Limitations: limited data at hand; a potentially biased set of LEAs studied; Value-Added has been well studied, but more research is needed on Vendor Assessments & SLOs; questions of OTES/OPES reliability are only answerable with multiple waves of data.
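
The SGM weighting described on the "Student Growth Measures in OTES" slide (Value-Added must carry the majority of SGM through June 30, 2014, and all of it thereafter) can be illustrated with a few lines of code. The following is a minimal Python sketch: the 0-100 score scale, the Above/Expected/Below cutoffs, and the function names are our illustrative assumptions, not part of the OTES specification.

    # Minimal sketch of the SGM weighting described on the OTES slides.
    # Score scale (0-100), rating cutoffs, and function names are
    # hypothetical illustrations, not the official OTES specification.

    def sgm_score(value_added: float, other_measures: float,
                  va_weight: float = 0.6) -> float:
        """Blend Value-Added with vendor-assessment/LEA measures.

        Until June 30, 2014 Value-Added must be a majority of SGM
        (va_weight > 0.5, i.e. >25% of the total evaluation); on or
        after July 1, 2014 it is all of SGM (va_weight = 1.0).
        """
        return va_weight * value_added + (1 - va_weight) * other_measures

    def sgm_rating(score: float) -> str:
        """Bin an SGM score into the three growth categories (cutoffs assumed)."""
        if score >= 67:
            return "Above"
        if score >= 34:
            return "Expected"
        return "Below"

    print(sgm_rating(sgm_score(value_added=70, other_measures=50)))
    # 0.6*70 + 0.4*50 = 62 -> "Expected"

Note that OTES does not average the two 50% halves numerically; the SGM rating produced here is combined with the performance-standards rating through the categorical matrix on the final-rating slide, sketched next.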
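The final-rating determination is a straight lookup on the pair (SGM rating, performance-standards rating). The dictionary below transcribes the slide's matrix verbatim; only the dictionary and function names are ours.

    # Final summative rating lookup, transcribing the OTES/OPES matrix
    # above. Outer keys are SGM ratings; inner keys are performance-
    # standards ratings (4 = Accomplished down to 1 = Ineffective).

    FINAL_RATING = {
        "Above":    {4: "Accomplished", 3: "Accomplished", 2: "Skilled",     1: "Developing"},
        "Expected": {4: "Skilled",      3: "Skilled",      2: "Developing",  1: "Developing"},
        "Below":    {4: "Developing",   3: "Developing",   2: "Ineffective", 1: "Ineffective"},
    }

    def final_rating(sgm: str, performance: int) -> str:
        return FINAL_RATING[sgm][performance]

    # A teacher rated 3 on the standards with above-expected growth lands
    # at Accomplished; expected growth leaves the same teacher at Skilled.
    print(final_rating("Above", 3))     # Accomplished
    print(final_rating("Expected", 3))  # Skilled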
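As a quick arithmetic check on the 2012-2013 teacher data, the reported percentages follow directly from the raw counts. A short sketch, with the counts taken from the data slide:

    # Reproducing the percentage distribution on the "OTES/OPES Data for
    # 2012-2013" slide from its raw counts (counts from the slide itself).

    otes_counts = {"Ineffective": 13, "Developing": 298,
                   "Skilled": 1580, "Accomplished": 441}
    total = sum(otes_counts.values())  # 2,332 teacher records
    for rating, n in otes_counts.items():
        print(f"{rating:>12}: {n:>5} ({n / total:.1%})")
    # Skilled and Accomplished together cover 86.7% of all teacher ratings.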