Office of Accountability 6/12/2014
Overview of Measures used in CPS Accountability
June 2014 Webinar for Principals and Teachers
▪ Understand the basics of how assessments are scored and scoring
terminology
▪ Understand the various accountability systems and the measures used in each
▪ Understand differences in metrics and their use in accountability measures
▪ Understand exclusions and differences in data reporting
▪ Next steps
Purpose
▪ Interpreting individual performance scores
• Raw Score: Total number of points a student earned
• Scale Score: A scale score allows performance to be compared from one form of a test to
another and from student to student. It is generated by assigning a scale value to each
test item, based on the difficulty of the item.
• Standard Error of Measurement (SEM): This reflects the degree of accuracy in any given
score. The size of the SEM indicates how close the observed score is likely to be to the
“true” score. All assessments are subject to a degree of error, because the concepts being
measured are only sampled.
▪ Interpreting performance compared to groups of students
• Percentile: A percentile indicates the percent of students in a norm sample who scored
below an individual student. This provides a sense of how a student scores compared to
others in a similar group (grade, course).
• Growth norm: This indicates the average growth observed for students at the same
starting point. NOTE: The growth norm may not be sufficient growth to get a student to
grade level. A student can make expected growth, but still be below grade level peers.
Assessment scoring basics
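The scoring concepts above can be sketched in code. This is an illustrative sketch with made-up numbers, not CPS data; the function names and the norm sample are hypothetical.

```python
# Illustrative sketch of score interpretation (hypothetical values, not CPS data).
def score_interval(scale_score, sem, z=1.96):
    """Approximate 95% confidence interval around an observed scale score.

    The SEM quantifies measurement error: the "true" score likely falls
    within roughly +/- 2 SEM of the observed score.
    """
    return (scale_score - z * sem, scale_score + z * sem)

def percentile_rank(score, norm_sample):
    """Percent of scores in the norm sample that fall below the given score."""
    below = sum(1 for s in norm_sample if s < score)
    return 100.0 * below / len(norm_sample)

# Example: an observed scale score of 214 with an SEM of 3 points.
low, high = score_interval(214, 3)

# Example: where a score of 214 falls in a small hypothetical norm sample.
norm = [200, 205, 210, 212, 215, 220, 225, 230]
rank = percentile_rank(214, norm)   # higher than 4 of the 8 scores
```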
Accountability Systems
Teachers
REACH Students
Teacher Evaluation
Principals
REACH Students
Principal Evaluation
Schools
School Quality Rating
Policy
School Progress
Reports
Elementary School Measures
Teacher Evaluation Principal Evaluation SQRP School Progress Reports
Observations Yes Yes
Performance Tasks Yes
Student Attainment Yes Yes
Student Growth Yes Yes Yes Yes
Priority Group Growth Yes Yes Yes (web only)
Attendance Yes Yes Yes
Grades 3-8 On-Track Yes Yes
English Language Development (ELs only) Yes Yes
5 Essentials Yes Yes
Parent Survey Yes
Data Quality Yes
Behavior and Discipline Yes
Healthy Schools Certification Yes
Creative Schools Certification Yes
Teacher Attendance Yes
High School Measures
Teacher Evaluation Principal Evaluation SQRP School Progress Reports
Observations Yes Yes
Performance Tasks Yes
Student Attainment Yes Yes
Student Growth Yes Yes Yes Yes
Priority Group Growth Yes Yes Yes (web only)
Attendance Yes Yes Yes
Freshman On-Track Yes Yes Yes
Graduation Rate Yes Yes Yes
Dropout Rate Yes Yes Yes
Early College Credit & Career Certification Yes Yes
College Enrollment & Persistence Yes Yes
5 Essentials Yes Yes
Parent Survey Yes
Data Quality Yes
Behavior and Discipline Yes
Healthy Schools Certification Yes
Creative Schools Certification Yes
Teacher Attendance Yes
Types of Measures
Attainment vs. Growth
Student vs. Group
Percent vs. Mean
Value Added
Growth is a measure of change between two
points in time. We can measure raw growth, or
compare growth to a growth norm. Often,
growth is compared to students who have the
same pretest score for an apples-to-apples
comparison.
Attainment vs. Growth
Attainment is a measure of a single point in
time. We can measure how a student is
performing against a standard or norm sample.
➢ SQRP uses both attainment and growth. It is important to understand not only how much students
are learning, but also whether they are keeping pace with grade-level learning expectations.
➢ REACH teacher and principal evaluations use only growth. Student attainment is largely influenced
by how students performed in the past, while growth is more influenced by what happens in the
classroom.
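The attainment-versus-growth distinction can be made concrete with a small sketch. All numbers here are hypothetical (the standard and the growth norm are made up for illustration, not CPS values):

```python
# Minimal sketch contrasting attainment and growth (hypothetical values).
def attainment_met(score, grade_level_standard):
    """Attainment: a single point in time, compared to a standard."""
    return score >= grade_level_standard

def met_growth_norm(pretest, posttest, growth_norm):
    """Growth: change between two points in time, compared to the average
    growth observed for students with the same pretest score."""
    return (posttest - pretest) >= growth_norm

# A student can meet the growth norm yet remain below grade level:
pretest, posttest = 190, 198       # 8 points of observed growth
grade_level_standard = 210         # hypothetical attainment bar
growth_norm = 6                    # hypothetical norm for pretest = 190

made_expected_growth = met_growth_norm(pretest, posttest, growth_norm)  # True
at_grade_level = attainment_met(posttest, grade_level_standard)         # False
```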
A student-level measure compares the
attainment or growth of a single student to a
standard or norm.
Student vs. Group Measures
A group-level measure uses the aggregate
performance of a classroom, grade level,
priority group, or whole school.
Used for making determinations
about individual students.
Used for making determinations about
schools, teachers or principals.
Note:
▪ Student and school norms may differ because the probability that a student receives a certain score may differ
from the probability that an entire group averages that same score.
▪ For example, on the NWEA Reading test in Grade 8:
• A student scoring 230 is at the 70th percentile among students.
• A school with an average score of 230 is at the 91st percentile among schools.
• Simply put, an 8th grader scoring 230 is rare, but an entire school of 8th graders averaging 230 is very rare.
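The student-versus-school norm gap can be demonstrated with a short simulation. This is an illustrative model, not CPS or NWEA data: the means, spreads, and school sizes below are assumptions chosen only to show that averages are less spread out than individual scores, so the same cut score ranks higher among school averages.

```python
import random

random.seed(0)

# Illustrative simulation (assumed parameters, not real norm data).
STUDENTS_PER_SCHOOL, N_SCHOOLS = 50, 400

students, schools = [], []
for _ in range(N_SCHOOLS):
    school_effect = random.gauss(0, 7)   # assumed between-school variation
    scores = [random.gauss(220 + school_effect, 13)
              for _ in range(STUDENTS_PER_SCHOOL)]
    students.extend(scores)
    schools.append(sum(scores) / len(scores))

def pct_below(cut, values):
    """Percent of values falling below the cut score."""
    return 100.0 * sum(v < cut for v in values) / len(values)

student_pct = pct_below(230, students)   # a 230 is high among students...
school_pct = pct_below(230, schools)     # ...but far higher among school means
```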
Percent of students meeting an objective
standard (percent-based)
Examples:
➢ Percent at/above national average
performance for their grade level (reported
but not used in accountability)
➢ Percent meeting national average growth
target (used in principal evaluation and
SQRP)
Types of Aggregate Measures
Biggest limitation: Does not look at magnitude of
difference from the standard. Students are either
a “Yes” or a “No”.
Average attainment or growth (mean-based)
Examples:
➢ National attainment percentile (used in
SQRP)
➢ National growth percentiles (used in
principal evaluation and SQRP)
➢ Value-added (used in teacher evaluation)
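The two aggregation styles can be compared side by side on the same data. The growth values and targets below are hypothetical, chosen only to show how the percent-based view discards magnitude while the mean-based view keeps it:

```python
# Sketch: the same growth data summarized two ways (hypothetical numbers).
growth = [-2, 0, 1, 5, 6, 8, 9, 12]   # observed growth for 8 students
targets = [4] * len(growth)           # each student's growth target

# Percent-based: each student is simply a "Yes" or a "No"; how far above
# or below the target a student lands is ignored.
pct_meeting = 100.0 * sum(g >= t for g, t in zip(growth, targets)) / len(growth)

# Mean-based: magnitude matters, so a student far above or far below the
# target moves the aggregate.
mean_growth = sum(growth) / len(growth)
```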
How can you really compare the growth of my students to other students in the district? My class
is unique.
The value-added model recognizes that students’ academic growth varies by grade, prior performance, and
demographics. The goal of value-added is to measure a school’s or teacher’s impact on student learning
independent of these factors. To do this, value-added controls for:
Prior Reading Score Low-Income Status
Prior Math Score English Learner Status
Grade Level IEP Status
Gender STLS Participation
Race/Ethnicity Mobility
Value-Added
In addition to weighting each student by enrollment length and instructional responsibility, the value-added model also controls for how many times the student moves schools during the year.
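The core idea behind value-added can be sketched with a heavily simplified model. This is not the CPS model: it controls for only one factor (prior score) instead of the full list above, and the function names and student records are hypothetical. It illustrates the mechanic of comparing each student to an expectation and then averaging a teacher's weighted residuals:

```python
# Heavily simplified value-added sketch (illustrative only, not the CPS model).
def fit_simple_ols(x, y):
    """Ordinary least squares with a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return my - slope * mx, slope   # intercept, slope

def value_added(records):
    """records: (teacher, pretest, posttest, weight) tuples.

    Regress posttest on pretest across all students, then average each
    teacher's residuals, weighted by instructional responsibility.
    """
    intercept, slope = fit_simple_ols([r[1] for r in records],
                                      [r[2] for r in records])
    by_teacher = {}
    for teacher, pre, post, w in records:
        resid = post - (intercept + slope * pre)   # growth beyond expectation
        num, den = by_teacher.get(teacher, (0.0, 0.0))
        by_teacher[teacher] = (num + w * resid, den + w)
    return {t: num / den for t, (num, den) in by_teacher.items()}

# Hypothetical roster; the 0.5 weight stands in for shared responsibility.
va = value_added([
    ("A", 200, 212, 1.0), ("A", 210, 220, 1.0),
    ("B", 200, 206, 1.0), ("B", 210, 216, 0.5),
])
# Teacher A's students grew more than expected given their pretests; B's less.
```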
Exclusions and Reporting
▪ The SQRP and principal evaluation use “annualized enrollment” for most assessment measures.
This attributes each student to the school where he or she was enrolled for the longest time during the
school year.
▪ The REACH Students teacher evaluation uses Roster Verification and enrollment data to determine
how much a student is weighted in a teacher’s value-added score. Mobile students and students
for whom teachers shared instructional responsibility are weighted less than other students.
▪ In all CPS accountability systems, students may be excluded for the following reasons:
• English Learner with an ACCESS Literacy score less than 3.5
• Alternative Assessment (IAA) indicator in SSM
• Retained
• No valid pretest or posttest
• For measures using annualization, student not in his/her annualized school for at least 45 days
Which students are included?
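The annualization and exclusion rules above translate naturally into a filter. The field names in this sketch are assumptions for illustration, not an actual CPS data schema:

```python
# Sketch of the inclusion rules above (field names are hypothetical).
def is_included(student):
    """Apply the exclusion reasons listed above; True means the student counts."""
    if student.get("el") and student.get("access_literacy", 99) < 3.5:
        return False   # English Learner below the ACCESS Literacy cut
    if student.get("alt_assessment"):
        return False   # Alternative Assessment (IAA) indicator
    if student.get("retained"):
        return False
    if student.get("pretest") is None or student.get("posttest") is None:
        return False   # no valid pretest or posttest
    if student.get("days_at_annualized_school", 0) < 45:
        return False   # below the 45-day annualization minimum
    return True

def annualized_school(enrollments):
    """Attribute the student to the school with the longest enrollment.

    enrollments: {school_id: days_enrolled}
    """
    return max(enrollments, key=enrollments.get)
```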
CPS vs. ISBE EL criteria
CPS criteria for assessment differ from ISBE criteria for exiting
EL services because the two serve very different purposes.
Relevant cut point
• CPS: 3.0 and above on ACCESS Literacy 2013 required to be assessed; students below 3.5 on
ACCESS Literacy 2014 will be excluded from calculations.
• ISBE: 5.0 Overall, 4.2 Reading, and 4.2 Writing on ACCESS 2014 as criteria to exit services.
Definition of cut point
• CPS: Minimum proficiency level necessary to be able to complete the assessment.
• ISBE: Level at which students are deemed proficient in English and will no longer receive
full EL services.
Rationale for focus
• CPS: Focus on the literacy composite as the relevant indicator of the ability to read
independently for the purpose of assessment.
• ISBE: Focus on all subject areas as indicators of comprehensive English proficiency.
Various Points for Data Access
The list of students included in various reports may differ slightly
based on timing and exclusions.
▪ Assessment vendor (MARC, EPAS reports, mClass Home)
• Data from vendors reflects the students tested at the time and does not include any
CPS exclusions
▪ Dashboard reports
• Reflects real-time data and enrollment (NWEA data has a two-morning delay)
• NWEA files include whether students are required or not based on exclusions
• Ability to look at tested vs. current, but not yet annualized
• Does NOT exclude for invalid tests
▪ Final processed data and reports
• DOES exclude invalid tests (testing off grade level, irregularities)
• Assignment based on annualization or attribution
Next Steps
Principals:
▪ Look for a series of communications from John Barker over the next few weeks covering:
▪ Data availability timeline
▪ Review of SQRP data
▪ Updated SQRP materials on Knowledge Center
▪ Begin summer planning for professional development and resource needs using student data
Teachers:
▪ Look for REACH Summative rating reports in early fall
▪ Begin reflection on student growth using multiple data points