HB 153 Evaluation Presentation, January 2012

A comprehensive overview of the issues raised in HB 153 for teachers and administrators

  1. Michele Winship, Ph.D., winshipm@ohea.org
  2. Presentation slides; Ohio Teacher Evaluation System Framework; current draft Ohio Teacher Evaluation System Model (OTES); Battelle for Kids Value-Added Talking Points; Battelle for Kids Value-Added Resources; Race to the Top Work Flow Chart; District Assessment Mapping; District Assessment Mapping Sample Template
  3. A national push for teacher evaluation reform from policy makers; recognition through research that current teacher evaluation practices are not effective in helping teachers improve performance and identifying underperforming teachers; a desire to identify levels of teacher performance to reward high performers and remove low performers; the RttT mandate to change evaluation practices; state-level policies that change evaluation requirements; and student performance as a significant factor in teacher (and principal) evaluation (currently adopted in 13 states)
  4. (no slide text)
  5. Sec. 3319.112 (A) Not later than December 31, 2011, the state board of education shall develop a standards-based state framework for the evaluation of teachers. The framework shall establish an evaluation system that does the following: (1) Provides for multiple evaluation factors, including student academic growth, which shall account for fifty per cent of each evaluation; (2) Is aligned with the standards for teachers adopted under section 3319.61 of the Revised Code; (3) Requires observation of the teacher being evaluated, including at least two formal observations by the evaluator of at least thirty minutes each and classroom walkthroughs; (4) Assigns a rating on each evaluation in accordance with division (B) of this section; (5) Requires each teacher to be provided with a written report of the results of the teacher's evaluation; (6) Identifies measures of student academic growth for grade levels and subjects for which the value-added progress dimension prescribed by section 3302.021 of the Revised Code does not apply; (7) Implements a classroom-level, value-added program developed by a nonprofit organization described in division (B) of section 3302.021 of the Revised Code; (8) Provides for professional development to accelerate and continue teacher growth and provide support to poorly performing teachers; (9) Provides for the allocation of financial resources to support professional development. (HB 153 as signed by the Governor)
  6. (no slide text)
  7. Sec. 3319.111 [Effective 9/29/2011] Teacher evaluation. (A) Not later than July 1, 2013, the board of education of each school district, in consultation with teachers employed by the board, shall adopt a standards-based teacher evaluation policy that conforms with the framework for evaluation of teachers developed under section 3319.112 of the Revised Code. The policy shall become operative at the expiration of any collective bargaining agreement covering teachers employed by the board that is in effect on the effective date of this section and shall be included in any renewal or extension of such an agreement. (B) When using measures of student academic growth as a component of a teacher's evaluation, those measures shall include the value-added progress dimension prescribed by section 3302.021 of the Revised Code. For teachers of grade levels and subjects for which the value-added progress dimension is not applicable, the board shall administer assessments on the list developed under division (B)(2) of section 3319.112 of the Revised Code. (C)(1) The board shall conduct an evaluation of each teacher employed by the board at least once each school year, except as provided in divisions (C)(2) and (3) of this section. The evaluation shall be completed by the first day of April and the teacher shall receive a written report of the results of the evaluation by the tenth day of April. (2) If the board has entered into a limited contract or extended limited contract with the teacher pursuant to section 3319.11 of the Revised Code, the board shall evaluate the teacher at least twice in any school year in which the board may wish to declare its intention not to re-employ the teacher pursuant to division (B), (C)(3), (D), or (E) of that section. One evaluation shall be conducted and completed not later than the fifteenth day of January and the teacher being evaluated shall receive a written report of the results of this evaluation not later than the twenty-fifth day of January. One evaluation shall be conducted and completed between the tenth day of February and the first day of April and the teacher being evaluated shall receive a written report of the results of this evaluation not later than the tenth day of April. (3) The board may elect, by adoption of a resolution, to evaluate each teacher who received a rating of accomplished on the teacher's most recent evaluation conducted under this section once every two school years. In that case, the biennial evaluation shall be completed by the first day of April of the applicable school year, and the teacher shall receive a written report of the results of the evaluation by the tenth day of April of that school year.
  8. Opportunities: Create evaluation systems that improve instructional practice through formative feedback and educator reflection; design a complete "system" of evaluation with formative feedback and support, not just a typical observation checklist; work together to identify best practices and scale them up through our locals; bargain the process for changing the evaluation system as well as the procedures, practices and tools; work collaboratively with administrators, who are subject to the same requirements
  9. Challenges: Short timeline to complete the work and operationalize the system (July 1, 2013); unfunded mandate for non-RttT locals; changing perceptions (ours and theirs) about the purpose of evaluation; incorporating student growth in a way that benefits teachers and doesn't rank and sort them; limited state support at the present time; non-explicit requirement to create assessment systems to provide the required student growth metric; annual evaluations for all but accomplished teachers
  10. HB 153 leveled the evaluation playing field. RttT districts and non-RttT districts are all required to reconstruct their evaluation systems to align with the adopted state framework based on the Ohio Standards for the Teaching Profession. All districts are now on virtually the same timeline: RttT districts were required to implement their new evaluation systems by the 2013-2014 school year or sooner, depending on their Scope of Work timeline and changes that are bargained collaboratively (MOU); non-RttT districts are required to adopt their evaluation systems no later than July 1, 2013 and implement them at the expiration of the current CBA (discrepancy in timeline: a system can't be implemented if it hasn't been created)
  11. HB 153 places an additional burden on ALL districts to address the requirement of the 50% student growth measure. The only measure currently available is value-added data for teachers in grades 3-8 in reading and math (some districts have extended data through Battelle for Kids initiatives). ODE is creating a "list of student assessments that measure mastery of course content" which districts can use (and may need to purchase). However, there will be many grade levels and courses with no existing assessments; districts will have to create their own.
  12. HB 153 creates an advantage for RttT districts. RttT districts can use their funds to buy the time and support to re-create their evaluation systems, including the development of an assessment system. RttT districts can use their funds to purchase support for assessment systems (data management, specific testing protocols, testing materials and grading support). However, these funds will go away. How will the systems be supported financially in the future?
  13. We must begin with the belief that the main purpose of teacher evaluation is improved teaching practice and student learning. Teacher performance is to be measured through multiple sources of evidence, with observation as one source. Student performance is required to be 50% of the evaluation, BUT student performance is to be measured through multiple sources of data, not just a single standardized test score. The State Board of Education has adopted a framework; districts must still develop their evaluation system, including processes, procedures and forms.
  14. (no slide text)
  15. Student academic growth will be measured through multiple measures, which must include value-added scores on evaluations for teachers where value-added scores are available. Value-added scores are ONLY available for tested grades and subjects: math and reading in grades 3-8. Some extended reports are available in locals that participate in Battelle for Kids projects. Even if there are value-added scores, there must be additional student growth measures for all teachers.
  16. Districts will create a local student growth measure worth 50% of the evaluation from a combination of the following: value-added data; ODE-approved student assessments; a menu of options determined by the district
  17. Local boards of education may administer assessments chosen from the Ohio Department of Education's assessment list ($$$) for teachers of subjects where value-added scores are not available, and/or local measures of student growth using state-designed criteria and guidance. This will require districts to create local measures of student growth (assessments) in areas where there are no standardized assessments.
  18. Student achievement, including improvement of achievement, in tested grades and subjects: State Achievement Growth Measure 20%; District-level Growth Metric 30%; School-based Growth Metric 30%; Other Locally Determined Measures of Achievement 20%
  19. Student achievement, including improvement of achievement, in non-tested grades and subjects: District-level Growth Metric 40%; School-based Growth Metric 40%; Other Locally Determined Measures of Achievement 20% (a worked sketch of combining such weights follows below)
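To make the weighting on slides 18-19 concrete, here is a minimal sketch (in Python, not from the presentation) of how a district might combine component scores into a single student growth score; the component names and the 0-100 score scale are assumptions for illustration only.

```python
# Hypothetical illustration of the weighted growth measures described above.
# Component names, weights, and the 0-100 score scale are assumptions.

TESTED_WEIGHTS = {
    "state_achievement_growth": 0.20,
    "district_growth_metric": 0.30,
    "school_growth_metric": 0.30,
    "other_local_measures": 0.20,
}

NON_TESTED_WEIGHTS = {
    "district_growth_metric": 0.40,
    "school_growth_metric": 0.40,
    "other_local_measures": 0.20,
}

def composite_growth_score(scores, weights):
    """Combine component scores (0-100) into one weighted growth score."""
    missing = set(weights) - set(scores)
    if missing:
        raise ValueError(f"missing component scores: {missing}")
    return sum(scores[name] * weight for name, weight in weights.items())

# Example: a teacher in a non-tested grade/subject.
example = {"district_growth_metric": 70, "school_growth_metric": 80, "other_local_measures": 90}
print(composite_growth_score(example, NON_TESTED_WEIGHTS))  # 0.4*70 + 0.4*80 + 0.2*90 = 78.0
```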
  20. (Diagram: value-added data roles and supports.) Maintained by ODE: LRC; value-added data processing through SAS; AYP growth measures; ODE school and district (LEA) reports. Maintained by SAS (EVAAS®): single limited-access, password-protected district/LEA, school, and student data; analytic tools; teacher-level reports; limited-use public access; enhanced reporting features. Developed and supported by BFK: regional system partnership; technical assistance, including trained VALs who support districts/LEAs through DVALT training; support to teacher teams; focus on school improvement; toolkits; online courses.
  21. Value-Added Modeling (VAM) has become the "gold" standard for measuring educator effectiveness. One year's growth in one year's time is the benchmark = effective. Teachers who exceed this growth rate have a positive value-added rating (+) = highly effective. Teachers who fail to meet this growth rate have a negative value-added rating (-).
  22. BUT... VAM modeling is flawed. The tests used to generate the scores were never designed to measure teacher effectiveness. "Student test scores alone are not sufficiently reliable and valid indicators of teacher effectiveness to be used in high-stakes personnel decisions, even when the most sophisticated statistical applications such as value-added modeling are employed." (EPI Briefing Paper, Problems with the Use of Student Test Scores to Evaluate Teachers)
  23. Given that students are not randomly assigned to classes, VAM can't distinguish between teacher effects and effects based on students' needs. The lack of properly scaled year-to-year tests makes it difficult to evaluate gains along the continuum. Mobility of students (especially in high-needs schools) impacts the data. VAM does not provide information to help "struggling" teachers. VAM cannot distinguish among teachers in the middle range of performance.
  24. About 69% of teachers can't be accurately assessed with VAMs*: teachers in subject areas that are not tested; teachers in grade levels (lower elementary) where no prior test scores are available; special education and ELL teachers. VAM estimates also vary with the tests used: if a teacher is in the bottom quintile based on one test, there is a 43% chance she will be in the bottom quintile on a different test, but a 16% chance she will be in the top two quintiles; if a teacher is in the top quintile based on one test, there is a 43% chance she will be in the top quintile on a different test, but a 13% chance she will be in the bottom two quintiles.
  25. Rollout schedule: 30% of RttT LEAs link in Year 1 (reports received fall 2011); 60% of all RttT LEAs in Year 2; 100% of all LEAs in Ohio in Years 3 & 4. Requirements for accuracy of reporting: must conduct linkage; minimum number of students and time enrolled. Access to reporting: online via EVAAS® accounts; password protected. Grades/subjects available: ODE: grades 4-8, math & reading. BFK: grade 3, math & reading; grades 3-8, science & social studies; high school (algebra I & II, geometry, pre-calculus, biology, chemistry, English 9, 10 & 11). Issue: public records requests.
  26. (no slide text)
  27. (no slide text)
  28. (no slide text)
  29. (no slide text)
  30. Used properly, student performance data DOES have a role in school and district improvement efforts, and it CAN positively impact student performance. Nationally, we have come to believe that the data itself (the "score") is the end game instead of the starting point. And an overreliance on, and faith in, value-added metrics as accurate measures of TEACHER performance has entirely skewed the way we use student performance data.
  31. To be meaningful, student performance data should be used by educators to identify achievement gaps, inform instructional practice, and direct professional development. To effectively use the data, teams of educators should be trained in the analysis and interpretation of student performance data, have real-time access to the data, and meet regularly in teams to analyze the data and plan intervention, instruction and professional development.
  32. How do we create the conditions for educators to use student performance data effectively?
  33. Use the assessments you have first. Determine what assessments you need to create a rigorous, comparable and inclusive assessment system that is designed to provide student performance data to be used for educator professional growth and also for inclusion in an evaluation system. Chart a course of action with a timeline, persons responsible and deliverables.
  34. Requiring student performance in teacher evaluations means districts will need to: 1. Map current school-based and district-wide assessments in all grades and subjects; 2. Determine where assessment "gaps" exist; 3. Create groups of educators to select/develop appropriate assessments for "gaps"; 4. Create an assessment timeline for all grades and subjects; 5. Collect, analyze and store student performance data; 6. Provide time and training for educators to work together with student data to improve their own instruction. (A small mapping sketch follows below.)
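As an illustration of steps 1-2 above, here is a minimal sketch, assuming a simple in-memory record of existing assessments, of how a district team might flag grade/subject combinations with no assessment. The grade/subject pairs and assessment names are hypothetical, not from the presentation.

```python
# Hypothetical district assessment map: (grade, subject) -> list of assessments.
# All names below are illustrative only.
assessment_map = {
    ("4", "Math"): ["Ohio Achievement Assessment"],
    ("4", "Reading"): ["Ohio Achievement Assessment"],
    ("4", "Science"): [],          # no assessment on file
    ("9", "English"): ["District common writing assessment"],
    ("9", "Physical Education"): [],
}

def find_assessment_gaps(mapping):
    """Return the (grade, subject) pairs that have no assessment mapped to them."""
    return sorted(pair for pair, assessments in mapping.items() if not assessments)

for grade, subject in find_assessment_gaps(assessment_map):
    print(f"Gap: grade {grade}, {subject} - no assessment mapped")
```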
  35. (no slide text)
  36. Each evaluation will consist of two formal observations of the teacher, at least thirty minutes each in duration, as well as periodic classroom walkthroughs. Teacher performance metrics must also use multiple and variable sources of data, such as lesson plans, samples of student work, classroom assessment results, and portfolios, in addition to data from direct observation in classrooms.
  37. (no slide text)
  38. Goal setting (self-assessment against the Ohio Standards, analysis of student data, identifying 2 professional growth goals); formative assessment of teacher performance via formal observation (pre-observation conference, observation, post-observation conference and reflection); evidence; collaboration and professionalism (determined locally); student growth
  39. The overall teacher performance rating (50%) will be combined with the results of student growth measures (50%) to produce a summative evaluation rating, as depicted in the following matrix. Teachers will be rated in one of four categories: Accomplished, Proficient, Developing, Ineffective.
  40. Evaluation matrix (student growth measures by teacher performance rating): Above expected growth: performance 4 = Accomplished, 3 = Accomplished, 2 = Proficient, 1 = Developing. Expected growth: 4 = Proficient, 3 = Proficient, 2 = Developing, 1 = Developing. Below expected growth: 4 = Developing, 3 = Developing, 2 = Ineffective, 1 = Ineffective. (A lookup sketch of this matrix follows below.)
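The matrix is simply a lookup from a (student growth level, teacher performance rating) pair to a summative rating. Here is a minimal sketch of that lookup in Python; the function and variable names are assumptions for illustration, but the mappings follow the matrix above.

```python
# Summative rating lookup based on the evaluation matrix above.
# Growth levels: "above", "expected", "below"; performance ratings: 4 (high) to 1 (low).
MATRIX = {
    "above":    {4: "Accomplished", 3: "Accomplished", 2: "Proficient",  1: "Developing"},
    "expected": {4: "Proficient",   3: "Proficient",   2: "Developing",  1: "Developing"},
    "below":    {4: "Developing",   3: "Developing",   2: "Ineffective", 1: "Ineffective"},
}

def summative_rating(growth_level: str, performance_rating: int) -> str:
    """Return the summative rating for a growth level and a 1-4 performance rating."""
    try:
        return MATRIX[growth_level][performance_rating]
    except KeyError:
        raise ValueError(f"unknown combination: {growth_level!r}, {performance_rating!r}")

print(summative_rating("expected", 3))  # Proficient
```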
  41. Teachers with above expected levels of student growth will develop a professional growth plan and may choose their credentialed evaluator for the evaluation cycle. Teachers with expected levels of student growth will develop a professional growth plan collaboratively with the credentialed evaluator and will have input on their credentialed evaluator for the evaluation cycle. Teachers with below expected levels of student growth will develop an improvement plan with their credentialed evaluator; the administration will assign the credentialed evaluator for the evaluation cycle and approve the improvement plan. This is entirely unrealistic and does not reflect what actually happens in schools.
  42. At the local level, the board of education will include in its evaluation policy procedures for using the evaluation results for retention and promotion decisions and for removal of poorly performing teachers. Seniority will not be the basis for teacher retention decisions, except when deciding between teachers who have comparable evaluations. The local board of education will also provide for the allocation of financial resources to support professional development.
  43. With a July 1, 2013 deadline for system completion, evaluation work will need to begin ASAP and may not fit into the current bargaining cycle. Effective evaluation reform will require collaboration with administration at a very different level in many locals. Future evaluation language in CBAs will need to include all processes, procedures and tools. Stakes are high; we can't afford to adopt systems that aren't designed to support teachers.
  44. Composition and selection of evaluation team members; timeline for evaluation work; compensation for work outside of the school day; mandatory training for evaluators on observation protocols and ratings; training for staff about evaluation processes, procedures and tools; a no-fault piloting provision to work out problems
  45. 1. Identify and engage a district evaluation team, including teachers from various levels/areas; 2. Review and analyze current teacher evaluation policies and rules; 3. Conduct the ODE evaluation gap analysis; 4. Review effective evaluation models, including the OTES; 5. Select/develop a district evaluation system and tools; 6. Map and develop student assessments that will provide student performance data; 7. Create training for evaluators and teachers; 8. Construct a pilot timeline; 9. Have volunteer teachers and evaluators pilot the system; 10. Review and revise the system based on pilot data; 11. Train all evaluators and teachers; 12. Implement the new evaluation system
  46. Please send any questions to: eiiweb@ohea.org
  47. Teacher Evaluation Systems materials and resources (login required): http://www.ohea.org/teacher-evaluation-systems. www.lauragoe.com includes various state and local systems and examples of multiple measures for teacher performance and student growth. Teacher Assessment and Evaluation: The NEA's Framework: http://www.nea.org/home/41858.htm. Getting Teacher Assessment Right: What Policymakers Can Learn from Research (the source for Dr. Hinchey's presentation): http://nepc.colorado.edu/publication/getting-teacher-assessment-right
  48. Goe, L., Bell, C., & Little, O. (2008). Approaches to evaluating teacher effectiveness: A research synthesis. Washington, DC: National Comprehensive Center for Teacher Quality. Goe, L., Holdheide, L., & Miller, T. (2011). A practical guide to designing comprehensive teacher evaluation systems. Washington, DC: National Comprehensive Center for Teacher Quality. Hinchey, P. (2010). Getting teacher assessment right: What policymakers can learn from research. Boulder, CO: National Education Policy Center. Mathers, C., Oliva, M., with Laine, S. W. M. (2008). Improving instruction through effective teacher evaluation: Options for states and districts (Research and Policy Brief). Washington, DC: National Comprehensive Center for Teacher Quality. National Education Association. (2009). Teacher evaluation systems: The window for opportunity and reform. Washington, DC. Stronge, J. H., & Tucker, P. D. (2003). Handbook on teacher evaluation: Assessing and improving performance. Larchmont, NY: Eye on Education.
  49. Laura Goe, webinar for Oregon School Coaches, April 20, 2011: http://www.lauragoe.com/LauraGoe/Oregon-April%202011.pptx. EPI Briefing Paper, Problems with the Use of Student Test Scores to Evaluate Teachers: http://www.epi.org/publications/entry/bp278. RAND Education, Evaluating Value-Added Models for Teacher Accountability: http://www.rand.org/pubs/monographs/2004/RAND_MG158.pdf
  50. Michele Winship, 614-227-3001, winshipm@ohea.org. Questions? eiiweb@ohea.org
