Results-Driven Accountability PPT 8-27-12 Final

Transcript

  • 1. Using Assessment Data as Part of a Results-Driven Accountability System
    Input from the NCEO Core Team and Sample Approaches
    National Center on Educational Outcomes (NCEO), August 27, 2012
  • 2. Overview
    1. Summary of NCEO Core Team Report
    2. Clarifying Questions
    3. Overview of Sample Approaches
    4. Discussion
  • 3. Core Team
    • Peggy Carr, National Center for Education Statistics
    • Alan Coulter, Data Accountability Center
    • Candace Cortiella, The Advocacy Institute
    • David Egnor, Office of Special Education Programs
    • Jack Fletcher, University of Houston
    • Lynn Fuchs, Vanderbilt University
    • Brian Gong, Center for Assessment
    • Colleen Riley, Kansas Department of Education
  • 4. Resource Group
    • Rolf Blank, Council of Chief State School Officers
    • Anne Chartrand, Southeast RRC
    • Karen Denbroeder, Florida Department of Education
    • Judy Elliott, Consultant
    • David Francis, University of Houston
    • Michael Kolen, University of Iowa
    • Elizabeth Kozleski, University of Kansas
    • Rachel Quenemoen, National Center and State Collaborative
  • 5. Input of the NCEO Core Team
    Three major sections:
    1. Framing Considerations
    2. Core Team Suggestions
    3. Example Reporting Format
  • 6. Framing Considerations
    1. Public transparency and understandability are critical features of a results-driven accountability system and must be reflected in the measures used to review states on student performance.
    2. Multiple measures must be included. No single measure should be used in making decisions about student performance results.
    3. The use of measures of student performance should provide appropriate incentives to states, particularly in relation to identified values (e.g., inclusion in the general assessment).
  • 7. Framing Considerations
    4. The measures should provide a flag to look deeper into areas that need improvement.
    5. A plan should be developed and steps taken to monitor, validate, and improve the use of measures by OSEP and others; additional variables may be appropriate to enhance the measures in the future.
  • 8. Framing Considerations
    6. Variables that may be related to student performance but that have inconsistent interpretations and reliability should not be included in measures that are used for reviewing states on the performance of their students with disabilities.
  • 9. Framing Considerations
    7. No increased burden on states to collect additional data should result from the shift to reviewing student performance results. The developed measures need to fit within what states are doing as they review districts, and should be compatible with and reflective of the state's overall accountability system used for school improvement.
  • 10. Core Team Suggestions
    1. Use a reporting format that ensures that multiple measures are considered for students with disabilities receiving special education services.
    2. Provide data for reading and mathematics separately.
    3. Include participation of students with disabilities in state assessments.
  • 11. Core Team Suggestions
    4. Include participation of students with disabilities in the general state assessment.
    5. Include performance of students with disabilities on the general state assessment.
    6. Include the relative difficulty of state assessments.
    7. Include the gap in general assessment performance between students with disabilities and students without disabilities.
  • 12. Core Team Suggestions
    8. Include improvement in performance over time.
    These 8 suggestions guided the development of a set of 6 tables to display the data.
  • 13. Example Reporting Format. Table 1: Reading General State Assessment
  • 14. Example Reporting Format. Table 2: Mathematics General State Assessment
  • 15. Example Reporting Format. Table 3: Reading and Math Overall Performance and Targets
  • 16. Example Reporting Format. Table 4: Reading Alternate Assessments
  • 17. Example Reporting Format. Table 5: Mathematics Alternate Assessments
  • 18. Example Reporting Format. Table 6: Participation Rates for Students with Disabilities in Reading and Mathematics Assessments
  • 19. Sample Approaches for Using Assessment Data
    Two Sample Approaches:
    1. Decision Matrix (with 3 options)
    2. Decision-Making Steps
  • 20. Important Note
    The sample approaches include possible thresholds for deciding whether a state exceeds, meets, or does not meet expectations. OSEP and stakeholders should discuss and consider whether adjustments to these example thresholds are needed. Stakeholders, experts, and OSEP will need to be involved in determining appropriate thresholds for any elements that are used in reviewing assessment results.
  • 21. Sample Approach 1a: Decision Matrix (Includes State Proficiency Target)
    • Reading and math combined
    • Data from Core Team Tables 1-3
      • Element 1: Participation in the general assessment
      • Element 2: Improvement in percent proficient
      • Element 3: Gap in proficiency between students with disabilities and students without disabilities
      • Element 4: Percent proficient or above
      • Element 5: Gap between the state proficiency target and actual performance
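    The transcript does not include the decision matrix itself or its thresholds (slide 20 stresses that any thresholds are examples to be worked out with OSEP and stakeholders). The following is therefore only a minimal sketch, in Python, of how a matrix over the five elements of Approach 1a might be scored into an exceeds / meets / does not meet determination. The element names mirror the slide; the cut points, the averaging rule, and the function names (rate_element, classify_state) are illustrative assumptions, not part of the NCEO materials.

    ```python
    # Hypothetical sketch of a decision-matrix scoring rule for Sample Approach 1a.
    # Cut points and the combination rule are made up for illustration only.

    RATINGS = {"exceeds": 2, "meets": 1, "does_not_meet": 0}

    # element: (exceeds cut, meets cut, higher_is_better) -- values in percentage points
    EXAMPLE_THRESHOLDS = {
        "participation_rate":        (97.0, 95.0, True),
        "improvement_in_proficient": (3.0,  1.0,  True),
        "proficiency_gap":           (20.0, 35.0, False),  # gap in points; smaller is better
        "percent_proficient":        (50.0, 35.0, True),
        "target_shortfall":          (0.0,  5.0,  False),  # points below state target; smaller is better
    }

    def rate_element(name: str, value: float) -> str:
        """Rate one element against its example cut points."""
        exceeds, meets, higher_is_better = EXAMPLE_THRESHOLDS[name]
        if not higher_is_better:
            # Flip signs so ">= cut" always means "at least as good as the cut".
            value, exceeds, meets = -value, -exceeds, -meets
        if value >= exceeds:
            return "exceeds"
        if value >= meets:
            return "meets"
        return "does_not_meet"

    def classify_state(element_values: dict) -> str:
        """Average the element ratings into an overall determination."""
        scores = [RATINGS[rate_element(name, value)] for name, value in element_values.items()]
        mean = sum(scores) / len(scores)
        if mean >= 1.5:
            return "exceeds expectations"
        if mean >= 1.0:
            return "meets expectations"
        return "does not meet expectations"

    if __name__ == "__main__":
        # Made-up reading-and-math-combined values for one state, for illustration only.
        state = {
            "participation_rate": 96.2,        # percent of students with disabilities assessed
            "improvement_in_proficient": 2.1,  # percentage-point change from prior year
            "proficiency_gap": 30.0,           # points behind students without disabilities
            "percent_proficient": 38.5,        # percent proficient or above
            "target_shortfall": 3.0,           # points below the state proficiency target
        }
        print(classify_state(state))  # prints: meets expectations
    ```

    Averaging the element ratings is just one way to combine them; a real matrix could instead require a minimum rating on every element, which is why the slides treat the thresholds and the combination rule as open questions for stakeholders.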
  • 22. Sample Approach 1a
  • 23. Sample Approach 1a
  • 24. Sample Approach 1a: Benefits (Pros) and Challenges (Cons)
    Benefits (Pros):
    • Combines variables that are difficult to look at separately
    • Easy to see where the state falls across two content areas
    Challenges (Cons):
    • Complex and may lack transparency
    • Each element has issues that need to be considered
  • 25. Sample Approach 1b: Decision Matrix (Without State Proficiency Target)
    • Reading and math combined
    • Data from Core Team Tables 1-2
      • Element 1: Participation in the general assessment
      • Element 2: Improvement in percent proficient
      • Element 3: Gap in proficiency between students with disabilities and students without disabilities
      • Element 4: Percent proficient or above
  • 26. Sample Approach 1b
  • 27. Sample Approach 1b (continued)
  • 28. Sample Approach 1b: Benefits (Pros) and Challenges (Cons)
    Benefits (Pros):
    • Combines variables that are difficult to look at separately
    • Easy to see where the state falls across two content areas
    Challenges (Cons):
    • Complex and may lack transparency
    • Each element has issues that need to be considered
  • 29. Sample Approach 1c: Decision Matrix (Without State Proficiency Target)
    • Reading and math combined
    • Data from Core Team Tables 1-2, with additional alternate assessment data
    • Elements 1-4 same as for Approaches 1a and 1b
    • Element 5: Gap between percent proficient on the general state assessment and percent proficient on the AA-AAS
  • 30. Sample Approach 1c
    Detailed information on this approach is not included because additional data are needed to make the calculations. A decision matrix approach similar to that in Sample Approaches 1a and 1b would be used.
  • 31. Sample Approach 1c: Benefits (Pros) and Challenges (Cons)
    Benefits (Pros):
    • Combines variables that are difficult to look at separately
    • Easy to see where the state falls across two content areas
    • Explicitly includes students in the AA-AAS
    Challenges (Cons):
    • Complex and may lack transparency
    • Each element has issues that need to be considered
  • 32. Sample Approach 2: Decision-Making Steps
    • Reading and math separate
    • Four steps:
      • Participation in the general assessment
      • Gap in performance between students with disabilities and students without disabilities
      • Proficiency rates on the general assessment in relation to the difficulty of the state's assessment
      • Alternate assessment participation and performance
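    As with Approach 1a, the slides do not spell out the criteria behind each step, so the sketch below only illustrates, under assumed yes/no criteria, how a recording sheet for the four steps of Approach 2 might be worked through one content area at a time. The step labels paraphrase the slide; the numeric checks (such as the 95% participation level and the 1% alternate assessment cap) and the record_sheet helper are hypothetical.

    ```python
    # Hypothetical recording sheet for Sample Approach 2's four decision-making
    # steps, applied separately to reading and mathematics. All criteria are
    # assumptions for illustration, not values from the presentation.
    from typing import Callable

    # Each step pairs a label with an illustrative check on the state's data.
    EXAMPLE_STEPS: list[tuple[str, Callable[[dict], bool]]] = [
        ("Participation in general assessment at or above 95%",
         lambda d: d["general_participation"] >= 95.0),
        ("Proficiency gap with students without disabilities no wider than 30 points",
         lambda d: d["proficiency_gap"] <= 30.0),
        ("Proficiency rate reasonable given assessment difficulty",
         lambda d: d["percent_proficient"] >= d["difficulty_adjusted_benchmark"]),
        ("Alternate assessment participation within the expected range (<= 1%)",
         lambda d: d["aa_participation"] <= 1.0),
    ]

    def record_sheet(content_area: str, data: dict) -> None:
        """Print a step-by-step record for one content area."""
        print(f"--- {content_area} ---")
        for label, check in EXAMPLE_STEPS:
            print(f"{label}: {'yes' if check(data) else 'no (flag for further review)'}")

    if __name__ == "__main__":
        # Made-up values for one state's reading data, for illustration only.
        reading = {
            "general_participation": 96.0,
            "proficiency_gap": 34.0,
            "percent_proficient": 40.0,
            "difficulty_adjusted_benchmark": 38.0,
            "aa_participation": 0.9,
        }
        record_sheet("Reading", reading)
    ```

    Running the steps in sequence for each content area mirrors the one-state-at-a-time character of this approach that slide 36 lists as a challenge.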
  • 33. Sample Approach 2: Decision-Making Recording Sheet
  • 34. Sample Approach 2
  • 35. Sample Approach 2
  • 36. Sample Approach 2: Benefits (Pros) and Challenges (Cons)
    Benefits (Pros):
    • Considers all of the Core Team's example reporting tables
    • Provides greater transparency than some approaches
    • Allows for adjustment of steps and variables for policy shifts
    Challenges (Cons):
    • Must be completed one state at a time
    • More subjective than some approaches
    • Comparability is a challenge