oerc_160913_va_symp_lindsey_lewis-final

  1. OERC RESEARCH RELATED TO STUDENT GROWTH MEASURES
     Jill L. Lindsey, PhD, Wright State University
     Marsha Lewis, MA, Ohio University
     OERC Institutional Partners
     Making Research Work for Education
  2. THE OHIO EDUCATION RESEARCH CENTER
     • A collaborative of Ohio-based research universities & institutions
     • Focused on a statewide research agenda
     • Addressing critical issues of education practice and policy
     • Custodians of the Ohio Longitudinal Data Archive
  3. DUAL SGM RESEARCH AGENDA
     • Measuring student growth
     • Using student growth measures in relation to other variables to guide policy & practice
  4. FUNDED PROJECTS RELATED TO SGM
     • Educator Evaluation Studies
     • Extended Testing for Value-Added Reporting
     • Student Growth Measures (SGM) Policy & Practice
  5. OTES/OPES FINDINGS RELATED TO SGM
     • LEAs piloting or implementing in 2012-13 did not use SGM (n=37).
     • Educators were generally positive about the new evaluation systems.
     • Educators expressed a lack of trust in, and misunderstandings about, value-added data and SGM.
     • Educators expressed concerns about the unfairness of using different student growth measures for evaluation.
  6. EXTENDED TESTING FOR VALUE-ADDED MEASURE (VAM)
     • Educators are eager for reliable student growth measures for "untested" grades and subjects
     • They question the validity of vendor tests and lack assessment literacy
     • They worry about too much testing and stress on primary students (K-2)
     • They question how the number of students and percentage of instructional time impact VAM
     • After HB required LEAs to use extended-testing VAM in evaluation, a few LEAs dropped out
  7. SGM POLICY & PRACTICE STUDY
  8. EXAMPLE OF LINKAGE SCREEN: TEACHER VIEW
  9. LINK/ROSTER VERIFICATION SURVEYS
     Surveyed a sample of Ohio teachers who linked for the first time in 2011 and all Ohio teachers who linked in 2013.
     Asked about experiences with linkage training, the linkage process, perceptions of accuracy, suggestions for improvement…
  10. FINDINGS
      Teachers: Did you have any students this year for whom you shared the proportion of instructional time with another teacher?
                  2011   2013
      Yes          80%    81%
      No           20%    19%
  11. FINDINGS
      Teachers: Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?
                  2011   2013
      Yes          46%    58%
      No           23%    26%
      Don't know   31%    16%
  12. FINDINGS
      Teachers: Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?
                             2011   2013
      Not at all confident    39%    31%
      Somewhat confident      55%    61%
      Very confident           6%     9%
  13. TEACHER-STUDENT DATA LINK SENSITIVITY ANALYSIS
      Another question in the SGM Research Project: How consequential to teacher-level value-added measures is the precision of teachers' reported percentages of shared instructional responsibility?
  14. LINKAGE SENSITIVITY ANALYSIS
      • Provided SAS a list of districts that responded to the 2011 linkage survey
      • Developed nine scenarios of instructional responsibility (varied number of students and percentage of students shared)
      • SAS identified 882 teachers in the 62 districts and recalculated their value-added scores for the 9 instructional-time scenarios (10% responsibility through 90% responsibility)
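The mechanics of the scenario analysis above can be sketched in a few lines of code. This is NOT the proprietary SAS EVAAS value-added model; the responsibility-weighted mean, the sample gain values, and the category cut points below are all illustrative assumptions used only to show how varying the shared-responsibility percentage can (or cannot) move a teacher across effectiveness levels.

```python
def teacher_growth(gains, shares):
    """Responsibility-weighted mean gain: sum(share * gain) / sum(share)."""
    return sum(s * g for s, g in zip(gains, shares)) / sum(shares)

def classify(score):
    """Map a growth index to one of five effectiveness levels (assumed cuts)."""
    labels = ["Least effective", "Approaching average", "Average",
              "Above average", "Most effective"]
    cuts = [-2.0, -1.0, 1.0, 2.0]  # hypothetical cut points
    for cut, label in zip(cuts, labels):
        if score < cut:
            return label
    return labels[-1]

# Nine scenarios: shared students' responsibility varied 10% through 90%.
gains = [1.5, -0.5, 0.8, 2.0, -1.2]          # hypothetical student gain indices
shared = [True, True, False, False, False]   # which students are shared
for pct in range(10, 100, 10):
    shares = [pct / 100 if is_shared else 1.0 for is_shared in shared]
    score = teacher_growth(gains, shares)
    print(f"{pct}% responsibility -> {score:+.2f} ({classify(score)})")
```

Running the loop for all nine scenarios shows the weighted score shifting gradually, which is why most simulated teachers stay within the same or an adjacent level.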
  15. LINKAGE SENSITIVITY ANALYSIS
  16. LINKAGE SENSITIVITY ANALYSIS
      • Teacher value-added effectiveness levels (5 categories) remained largely stable with changes in % instructional responsibility.
      • The 5 categories from Teacher Value-Added Reports:
        – Least effective (OTES SGM rating "below")
        – Approaching average effectiveness (OTES rating "expected")
        – Average effectiveness (OTES rating "expected")
        – Above average effectiveness (OTES rating "expected")
        – Most effective (OTES SGM rating "above")
  17. LINKAGE SENSITIVITY ANALYSIS: PRELIMINARY FINDINGS
      For OAA Math 2012 overall:
      • Moving teachers from 50% to 60% responsibility resulted in <4% of teachers being classified into an adjacent effectiveness level.
      • Moving teachers from 50% to 90% responsibility resulted in 14% of teachers being classified into another effectiveness level. All but 3 teachers were classified into an adjacent effectiveness level.
      • Reclassification is likely further reduced by the OTES SGM collapse into 3 levels.
      • Teachers who share >75% of their students are most affected by changes in instructional responsibility.
  18. STUDENT GROWTH MEASURES IN OHIO'S NEW TEACHER EVALUATION SYSTEM
      • The Ohio Teacher Evaluation System (OTES) was implemented for the first time in 2012-13 by 26 Ohio LEAs.
      • The OTES final summative rating of teacher performance comprises 50% teacher performance on standards and 50% student academic growth measures.
      • "Category A" teachers in OTES are teachers with teacher-level value-added data available for some or all of the classes they teach.
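The 50/50 combination described above can be illustrated with a minimal numeric sketch. Ohio's actual system combines the two halves via a state lookup matrix rather than arithmetic; the point values, the rescaling of the three SGM categories, and the rounding rule here are illustrative assumptions only.

```python
# Assumed point values: performance ratings on a 1-4 scale, and the three
# SGM categories rescaled onto the same 1-4 range.
PERFORMANCE = {"Ineffective": 1, "Developing": 2, "Skilled": 3, "Accomplished": 4}
GROWTH = {"Below": 1.0, "Expected": 2.5, "Above": 4.0}
LABELS = ["Ineffective", "Developing", "Skilled", "Accomplished"]

def summative(performance_rating, sgm_category):
    """Weight each half 50%, then snap to the nearest rating label."""
    score = 0.5 * PERFORMANCE[performance_rating] + 0.5 * GROWTH[sgm_category]
    return LABELS[min(3, max(0, round(score) - 1))]

print(summative("Skilled", "Expected"))  # -> Skilled
```

Under this sketch, a teacher rated Skilled on standards with Expected growth lands on Skilled overall, consistent with the concentration seen in the tables that follow.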
  19. CATEGORY A TEACHERS IN EARLY-IMPLEMENTING LEAS
      • 24 unique "implementing" LEAs with available data.
      • Total number of teachers with available OTES rating data in implementing districts: 2,001
      • Total number of category A teachers: 398* (20% of all teachers with OTES rating data had value-added data as all or part of their student academic growth measure portion)
      *46 category A teachers were exempt from SGM, so they are not included in the following figures.
  20. Distribution of SGM for Category A Teachers in OTES-Implementing LEAs, 2012-13
      Below: 16%   Expected: 69%   Above: 15%
  21. CONGRUENCE OF THE TWO SIDES OF OTES FOR TEACHERS WITH VALUE-ADDED DATA (CATEGORY A TEACHERS)
      Teacher Performance        SGM Category
      Rating:                  Below  Expected  Above
      Accomplished                 4        46     18
      Skilled                     46       194     34
      Developing                   5         5      0
      Ineffective                  0         0      0
  22. DISTRIBUTION OF CATEGORY A TEACHER FINAL SUMMATIVE RATING BY STUDENT GROWTH MEASURE CATEGORY
      Teacher Final              SGM Category
      Summative Rating:        Below  Expected  Above
      Accomplished                 0         0     52
      Skilled                      0       240      0
      Developing                  50         5      0
      Ineffective                  5         0      0
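A quick congruence computation on the performance-vs-SGM table (slide 21) makes the "two sides" comparison concrete. The congruent pairings used here (Accomplished with Above, Skilled with Expected, Developing and Ineffective with Below) are an assumed mapping for illustration, not an official OTES definition.

```python
# Performance rating -> counts in (Below, Expected, Above) SGM categories,
# copied from the slide 21 table.
table = {
    "Accomplished": (4, 46, 18),
    "Skilled":      (46, 194, 34),
    "Developing":   (5, 5, 0),
    "Ineffective":  (0, 0, 0),
}

# Assumed congruent SGM column index (0=Below, 1=Expected, 2=Above) per rating.
congruent_cells = {"Accomplished": 2, "Skilled": 1,
                   "Developing": 0, "Ineffective": 0}

total = sum(sum(row) for row in table.values())
congruent = sum(table[r][c] for r, c in congruent_cells.items())
print(f"{congruent} of {total} teachers congruent ({100 * congruent / total:.0f}%)")
# -> 217 of 352 teachers congruent (62%)
```

Note the total of 352 matches slide 19 (398 category A teachers minus the 46 exempt from SGM).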
  23. NEXT STEPS
      • Research OTES/OPES implementation in 2013-14
      • Examine distribution of OPES/OTES ratings by district typology and other factors once 2013-14 data are available
      • Examine the impact of different weightings of value-added data in SGM, and the congruence/differential effects of VAM and other student growth measures (e.g., Student Learning Objectives)
  24. QUESTIONS?
      connect@oerc.osu.edu | oerc.osu.edu
