Student Learning/Support Objectives (SLO/SSO)
Reviewing Module

Module 3: Reviewing the SLO/SSO
Review
Goal
• Review technically rigorous student learning/support objectives (SLO/SSO) used to guide instruction (and/or support services) while determining student mastery or growth as part of an educator effectiveness system.
Objectives
• Participants will:
1. Conduct a multi-faceted, quality assurance review of the
student learning/support objectives for:
A. Completeness
B. Comprehensiveness (see Quick Start Training
Module*)
C. Coherency
Final Outcome
A refined SLO/SSO Form with:
• Rigorous, high-quality performance
measures designed to measure the
targeted content/professional
standards.
• Areas identified for further improvement/consideration (as documented within the Coherency Rubric).
Helpful Resources
Participants should consult the following:
Training
• Handout #3: QA Checklist
  - Step 5: Quality Assurance of the SLO/SSO Form
Templates
• Template #3: Coherency Rubric
• Template #3a: Performance Measure Rubric
Resources
• Help Desk
• Models: Scored Examples
Helpful Resources (cont.)
Participants should consult the following:
• Completion of the SLO Form: Handout #3, Quality Assurance Checklist
• Comprehensiveness of the Performance Measures: Template #3a, Performance Measure Rubric
• Coherency of the SLO Design: Template #3, Coherency Rubric
Process Components
Step #1: Goal
Step #2: Standards
Step #3: Blueprint
Step #4: Form
Step #5: QA
REVIEW Phase
• Check the drafted SLO/SSO (including the performance measures) for quality.
• Refine the measures and targets.
• Edit the text and prepare discussion points and highlights for the principal/supervisor.
• Update the completed SLO/SSO form with performance data.
Review Phase Components
Preview the Checklist & Rubrics:
• Completion: QA Checklist
• Comprehensiveness: Performance Measure Rubric
• Coherency: Coherency Rubric
Task Structure
QA
1. Completeness: Is the SLO/SSO Form completed correctly?
2. Comprehensiveness: Are the assessments of high technical quality?
3. Coherency: Are the SLO/SSO components aligned to each other?
STEP 5
Quality Assurance
Checklist & Rubric Preview
1. Apply the QA Checklist (Handout #3) to a completed
SLO or SSO Form.
A. What information is needed?
B. Who is the SLO/SSO focused on?
2. Preview the three (3) strands within the Performance
Measure Rubric (Template #3a) for each assessment
identified within the SLO/SSO.
A. What is the purpose of the assessment/performance
measure?
B. What standards does it purport to measure?
C. What technical data is provided/known?
Checklist & Rubric Preview
(cont.)
3. Preview the three (3) phases within the Coherency
Rubric for the SLO or SSO Form.
A. How well are the SLO/SSO components aligned to
each other?
B. How well do the identified assessments/performance
measures align to the SLO’s/SSO’s stated goal?
Quality Assurance Checklist
Quality Assurance Checklist
• The checklist is designed to verify each element within the four (4) sections of the SLO/SSO Form.
• The checklist applies the business rules to each element; the Help Desk document provides examples for each element.
SLO/SSO Completeness
Quality Assurance Checklist
SLO Section I: Context
1.1 Content Area: Name of the content area upon which the SLO is based.
1.2 Course: Name of the specific course/subject upon which the SLO is based.
1.3 Grade Level: Grade level(s) of students included in the course/subject in Element 1.2.
1.4 Total Students: Aggregate number of students (estimated, across multiple sections) for whom data will be collected.
1.5 Average Class Size: The average number of students in a single session of the course/subject identified in Element 1.2.
1.6 Class Frequency: The frequency (within the given time frame) of the course/subject identified in Element 1.2.
1.7 Instructional Setting: The location or setting where the course/subject instruction is provided.
1.8 Instructional Interval: The time frame of the course/subject identified in Element 1.2.
Quality Assurance Checklist
(cont.)
SSO Section I: Setting
1.1 Service Area: Name of the primary service area (e.g., speech) upon which the SSO is based.
1.2 Service Location: Name of the location(s) where services are provided.
1.3 Grade Level: Grade level(s) of students and/or educator types to whom services are provided.
1.4 Total Recipients: Aggregate number of students and/or educators for whom data will be collected.
1.5 Average Case Size: The average number of recipients of the services identified in Element 1.4.
1.6 Service Frequency: The typical frequency (within the identified service interval, Element 1.8) with which services are provided to the recipients identified in Element 1.4.
1.7 Service Setting: The contextual setting (e.g., school library, student’s home) in which services are provided.
1.8 Service Interval: The typical time frame of the service model.
SLO Section II: Goal
2.1 Goal Statement: A narrative that articulates a key concept upon which the SLO is based. The statement addresses What, Why, and How.
2.2 Content Standards: The Targeted Content Standards, which are the foundation of performance measures, used to develop the SLO.
2.3 Instructional Strategy: The approach used to facilitate learning the key concept articulated in the Goal Statement and delineated among the Targeted Content Standards.

SLO Section III: Objective
3.1 Starting Point (Baseline): The baseline data used for comparing student results at the end of the instructional interval.
3.2 Objectives (Whole Class): The expected level of achievement for the entire student learning objective (SLO) population (as defined in Element 1.4).
3.3 Objectives (Focused Students): The expected level of achievement for a subset of the SLO population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the instructional interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).
Quality Assurance Checklist
(cont.)
SSO Section II: Goal
2.1 Goal Statement: A narrative that articulates a key concept upon which the SSO is based. The statement addresses What, Why, and How.
2.2 Targeted Professional Standards: The requirements an organization must fulfill to ensure that products and services consistently meet customers' requirements. Content standards may also be identified for those individuals providing instructional services.
2.3 Implementation Strategy: The approach used to attain the primary service goal articulated in the Goal Statement and delineated among the Targeted Professional Standards.

SSO Section III: Objective
3.1 Starting Point (Baseline): The baseline data used for comparing client results at the end of the service interval.
3.2 Objectives (All Clients): The expected level of performance for the entire client population (as defined in Element 1.4).
3.3 Objectives (Focused Clients): The expected level of performance for a subset of the client population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the service interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).
Quality Assurance Checklist
(cont.)
SLO Section IV: Performance Measure
4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure, outlining (a) what the assessment measures, (b) how to use the scores, and (c) why the assessment was developed.
4.3 Content Standards: The Targeted Content Standards (the foundation of performance measures) used to develop the SLO; the content standards aligned with each performance measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of achievement for each student in the SLO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the assessment window, as well as the step-by-step procedures during each phase of administration, including (a) the requirements for completing the performance measure, including accommodations, equipment, and materials; (b) standard time allotments to complete the overall performance measure; and (c) standard scripts that educators read to give directions for completing the performance measure.
4.7 Scoring Tools: Scoring keys for objective measures; scoring rubrics for subjective measures.
4.8 Results: The number of students participating in the performance measure; the number of students who met the target as stated in Element 4.4; and the percentage of students who met the target as stated in Element 4.4.
SLO Rating: One of four performance levels that the principal (or the evaluator) identifies after noting the actual performance with respect to each objective stated in the SLO.
Notes and Explanation: Space for the educator to note influences, factors, and other conditions associated with the SLO Rating, as well as to reflect on a purposeful review of the data.
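Element 4.8 and the SLO Rating together describe a simple computation: the percentage of students meeting the Element 4.4 target, mapped onto one of the four performance levels. A minimal sketch of that arithmetic follows; the function names and cut ranges are illustrative assumptions, not values prescribed by the process (actual ranges are set by the evaluation system).

```python
# Hypothetical sketch of the Element 4.8 arithmetic and the SLO Rating lookup.
# The cut ranges below are assumptions for illustration only.

def percent_meeting_target(met_target: list) -> float:
    """Percentage of participating students who met the Element 4.4 target."""
    return 100.0 * sum(met_target) / len(met_target)

def slo_rating(percent_met: float) -> str:
    """Map a percentage onto the four performance levels (illustrative cuts)."""
    if percent_met >= 90:
        return "Distinguished"
    elif percent_met >= 70:
        return "Effective"
    elif percent_met >= 50:
        return "Emerging"
    return "Unsatisfactory"

# Eight participating students; True means the student met the target.
results = [True, True, False, True, True, False, True, True]
pct = percent_meeting_target(results)
print(pct, slo_rating(pct))  # prints: 75.0 Effective
```

The actual performance continuum belongs in Element 3.4; the point here is only that the rating follows mechanically once the Element 4.8 counts are recorded.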
Quality Assurance Checklist
(cont.)
SSO Section IV: Performance Measure
4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure, outlining (a) what the measure evaluates, (b) how to use the scores, and (c) why the measure was developed.
4.3 Professional Standards: The Professional Standards (the foundation of measures) used to develop the SSO; the professional standards aligned with each identified measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of attainment for each client in the SSO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the evaluation window, as well as the step-by-step procedures during each phase of administration, including the requirements for completing the performance measure (accommodations, equipment, and materials) and the standard time to complete the overall evaluation.
4.7 Scoring Tools: Scoring keys for objective measures; scoring rubrics and data collection mechanisms for subjective measures.
4.8 Results: The number of clients participating in the performance measure; the number of clients who met the target as stated in Element 4.4; and the percentage of clients who met the target as stated in Element 4.4.
SSO Rating: One of four performance levels that the supervisor identifies after noting the actual performance with respect to each objective stated in the SSO.
Notes and Explanations: Space for the professional to note influences, factors, and other conditions associated with the SSO Rating, as well as to reflect on a purposeful review of the data.
Quality Assurance Checklist
(cont.)
Procedural Steps
Step 1. Select the drafted SLO/SSO, including applicable
performance measures.
Step 2. Beginning with Section I, use Handout #3 (Quality
Assurance Checklist) to evaluate each element.
Step 3. Identify any element with missing or incorrect
information (e.g., a statement or data placed in the
wrong element of the SLO/SSO Form).
Step 4. Flag any element needing refinement or further
discussion with other educators/professionals and/or
the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the other sections on the
SLO/SSO Form.
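Steps 2 through 4 amount to walking each section of the form and flagging any element that is missing or blank. A minimal sketch of such a completeness pass follows; the dict-based form layout and the subset of element names are assumptions for illustration, not the actual template structure.

```python
# Hypothetical sketch of Steps 2-4: walk each section of a drafted SLO form
# and flag elements that are missing or blank. The element names mirror the
# checklist; the dict layout is an assumption for illustration.

REQUIRED = {
    "Section I: Context": ["1.1 Content Area", "1.2 Course",
                           "1.3 Grade Level", "1.4 Total Students"],
    "Section II: Goal": ["2.1 Goal Statement", "2.2 Content Standards",
                         "2.3 Instructional Strategy"],
}

def flag_incomplete(form: dict) -> list:
    """Return 'Section / Element' labels for any missing or blank element."""
    flags = []
    for section, elements in REQUIRED.items():
        for element in elements:
            value = form.get(section, {}).get(element, "")
            if not str(value).strip():
                flags.append(f"{section} / {element}")
    return flags

draft = {
    "Section I: Context": {"1.1 Content Area": "Mathematics",
                           "1.2 Course": "Algebra I",
                           "1.3 Grade Level": "9",
                           "1.4 Total Students": 54},
    "Section II: Goal": {"2.1 Goal Statement": "Students will ...",
                         "2.2 Content Standards": "",
                         "2.3 Instructional Strategy": "Direct instruction"},
}
print(flag_incomplete(draft))  # flags the blank 2.2 Content Standards
```

In practice the review is done by hand against Handout #3; the sketch only illustrates that "completeness" is a mechanical check, while comprehensiveness and coherency (Steps that follow) require judgment.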
Performance Measure Rubric
Performance Measure Rubric
• The rubric is designed to examine the quality characteristics of teacher-made performance measures. It comprises 18 technical descriptors organized into three strands.
• The rubric’s purpose is to provide teachers with a self-assessment tool that assists in building high-quality measures of student achievement.
SLO/SSO Comprehensiveness
Performance Measure Rubric
• For vendor-developed assessments, examine the technical evidence to determine whether the tool is designed to measure the Targeted Content/Professional Standards.
• For locally developed assessments, follow the guidelines in the Quick Start training to create high-quality performance measures, and then apply Template #3a.
Performance Measure Rubric (cont.)
Strand I: Design
1.1 The purpose of the performance measure is explicitly stated (who, what, why).
1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
1.3 The performance measure’s design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure, OR blueprints are used to align items/tasks to targeted content standards.
1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantity to measure the depth and breadth of the targeted content standards.
Performance Measure Rubric (cont.)
Strand II: Build
2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed response, writing prompts, performance tasks, etc.).
2.2 Items/tasks are created and reviewed in terms of (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
2.3 Administrative guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
2.5 Summary scores are reported using both raw score points and performance level. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote each level.
2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older (high school) students.
Performance Measure Rubric (cont.)
Strand III: Review
3.1 The performance measures are reviewed in terms of design fidelity: items/tasks are distributed based upon the design properties found within the specification or blueprint documents; item/task and form statistics are used to examine levels of difficulty, complexity, distractor quality, and other properties; and items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2 The performance measure is reviewed in terms of editorial soundness, ensuring consistency and accuracy of all documents (e.g., administration guide): the review identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements; applies Universal Design principles; and ensures linguistic demands and readability are developmentally appropriate.
3.3 The performance measure is reviewed in terms of alignment characteristics: pattern consistency (within specifications and/or blueprints); targeted content standards match; cognitive demand; and developmental appropriateness.
3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.
Performance Measure Rubric (cont.)
Strand III: Review
Note: The indicators below are evaluated after students/clients have taken the assessment (i.e., post-administration).
3.5 As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, content alignment, etc.
3.6 The performance measure has score validity evidence demonstrating that item responses were consistent with content specifications. Data suggest that the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected.
3.7 Reliability coefficients, including estimates of internal consistency, are reported for the performance measure. Standard errors are reported for summary scores. When applicable, other reliability statistics such as classification accuracy, rater reliability, etc. are calculated and reviewed.
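Descriptor 3.7 asks for internal-consistency estimates. One widely used statistic is Cronbach's alpha; the sketch below computes it from a small student-by-item score matrix. The scores are invented for illustration, and the deck does not prescribe any particular coefficient.

```python
# Cronbach's alpha, a common internal-consistency estimate (descriptor 3.7):
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores),
# where k is the number of items. Rows are students, columns are items.
from statistics import variance

def cronbach_alpha(scores):
    k = len(scores[0])  # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented data: two items that always agree, so consistency is perfect.
scores = [[1, 1], [2, 2], [3, 3]]
print(cronbach_alpha(scores))  # prints 1.0
```

For a vendor-developed measure these statistics would normally come from the technical manual; a sketch like this applies only when reviewing locally developed measures with raw item-level data in hand.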
Procedural Steps
Step 1. Identify the performance measures being used within
the SLO/SSO.
Step 2. Examine the alignment characteristics of the
performance measure to those standards identified in
Section II.
Step 3. Determine the most applicable metric (e.g., growth)
based on the stated objectives in Section III.
Step 4. Evaluate the technical evidence provided by either the
vendor or the assessment’s developer.
Step 5. Repeat Steps 2 through 4 with the other performance
measures identified in Section IV.
Coherency Rubric
SLO/SSO Coherency
Diagram: the Goal Statement drives the Objectives (All Students/Clients; Focused Students/Clients), which are evaluated via Performance Measures & Targets, culminating in the SLO/SSO Rating.

Coherency Rubric
• The Coherency Rubric, which examines the
alignment characteristics of each student
learning/support objective (SLO/SSO), serves as the
measurement tool to ensure that each SLO/SSO
meets the coherency criteria.
• The rubric evaluates each of the three (3) phases used to
organize this training module. Each aspect of the
SLO/SSO should meet a specific descriptor in the
Coherency Rubric.
*Note* Template #3: Coherency Rubric is found in the Review module.
Coherency Rubric
Phase I: Design (each descriptor is rated Meets Criteria or Needs Refinement)
1.1 The goal statement articulates the “big idea” under which targeted content/professional standards are directly aligned. The statement is concise and free of technical jargon.
1.2 Targeted content/professional standards have a direct influence on student performance outcomes and are viewed as “central” to the subject/service area.
1.3 The course/subject (service) area associated with the SLO/SSO is logically linked to the central theme and targeted content/professional standards.
Coherency Rubric (cont.)
Phase I: Design (cont.)
1.4 A blueprint or other design document illustrates relationships among key components (i.e., Goal Statement, Targeted Content/Professional Standards, Objectives, Performance Measures, and Overall Rating).
1.5 Performance measures are designed to evaluate the targeted content/professional standards (as demonstrated by the performance measure’s alignment characteristics).
Coherency Rubric (cont.)
Phase II: Build
2.1 The goal statement represents a central concept that is enduring, has leverage, and is foundational to further, more complex learning outcomes.
2.2 The SLO/SSO is supported by a representative sample of the educator’s/professional’s students, with a sample size that is sufficient to make valid inferences about student achievement and/or outcomes.
2.3 Targeted content/professional standards are selected using a valid and reliable approach that is fair and unbiased.
Coherency Rubric (cont.)
Phase II: Build (cont.)
2.4 Objectives are specific, criteria-focused, attainable (yet challenging), and directly linked to the performance measures.
2.5 Performance measures have benchmarks for two or more points in time within a given school year [Growth]. In addition or alternatively, performance measures have a clear, date-specific target for an on-demand demonstration of skill and knowledge attainment [Mastery].
2.6 The overall rating is directly linked to a performance continuum based on the percentage of students meeting expectations across all objectives.
Coherency Rubric (cont.)
Phase III: Review
3.1 The SLO/SSO is based on performance measures that are technically sound (i.e., reliable, valid, and fair) and fully align to the targeted content standards.
3.2 The SLO/SSO form reviews mitigate unintentional consequences and/or potential threats to inferences made about meeting performance expectations.
3.3 The SLO/SSO has data and/or evidence to support the assignment of an overall teacher rating (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).
Coherency Rubric (cont.)
Phase III: Review (cont.)
3.4 The SLO/SSO form has been examined to ensure that it is complete, meaning all applicable elements within the SLO Form (Template #2a or Template #2c) have been addressed according to the prescribed business rules.
3.5 The SLO/SSO form has been reviewed to ensure it includes “comprehensive” performance measures, meaning all performance measures have been examined to determine that they are appropriate for use in the process AND are of high technical quality.
Procedural Steps
Step 1. Select the drafted SLO/SSO, including applicable
performance measures.
Step 2. Beginning with Phase I, use the Coherency Rubric to
evaluate the alignment of the components to the overall
SLO/SSO design.
Step 3. Identify any component that has weak alignment and/or
does not meet the rubric’s criteria.
Step 4. Flag any element needing refinement or further
discussion with other educators/professionals and/or
the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the remaining two phases
of the Coherency Rubric.
Reflection
Step 5: Quality Assurance
• The SLO/SSO Form is complete.
• The SLO’s/SSO’s assessments are comprehensive.
• The SLO’s/SSO’s design is coherent.
Summary
This SLO/SSO Review Phase:
• Applied a set of quality assurance criteria to
ensure that the student learning/support
objective, along with its applicable
performance measures, was complete,
comprehensive, and coherent (i.e., “high
quality”).
• Identified areas of improvement for
subsequent SLO/SSO refinement.
 
How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17
 
FICTIONAL SALESMAN/SALESMAN SNSW 2024.pdf
FICTIONAL SALESMAN/SALESMAN SNSW 2024.pdfFICTIONAL SALESMAN/SALESMAN SNSW 2024.pdf
FICTIONAL SALESMAN/SALESMAN SNSW 2024.pdf
 
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptxExploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
Exploring_the_Narrative_Style_of_Amitav_Ghoshs_Gun_Island.pptx
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Spellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPSSpellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPS
 
How to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POSHow to Manage Global Discount in Odoo 17 POS
How to Manage Global Discount in Odoo 17 POS
 
Tatlong Kwento ni Lola basyang-1.pdf arts
Tatlong Kwento ni Lola basyang-1.pdf artsTatlong Kwento ni Lola basyang-1.pdf arts
Tatlong Kwento ni Lola basyang-1.pdf arts
 
VAMOS CUIDAR DO NOSSO PLANETA! .
VAMOS CUIDAR DO NOSSO PLANETA!                    .VAMOS CUIDAR DO NOSSO PLANETA!                    .
VAMOS CUIDAR DO NOSSO PLANETA! .
 
Introduction to TechSoup’s Digital Marketing Services and Use Cases
Introduction to TechSoup’s Digital Marketing  Services and Use CasesIntroduction to TechSoup’s Digital Marketing  Services and Use Cases
Introduction to TechSoup’s Digital Marketing Services and Use Cases
 
Economic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food AdditivesEconomic Importance Of Fungi In Food Additives
Economic Importance Of Fungi In Food Additives
 
How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17
 
How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17How to Manage Call for Tendor in Odoo 17
How to Manage Call for Tendor in Odoo 17
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptx
 
How to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptxHow to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptx
 
On National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan FellowsOn National Teacher Day, meet the 2024-25 Kenan Fellows
On National Teacher Day, meet the 2024-25 Kenan Fellows
 
21st_Century_Skills_Framework_Final_Presentation_2.pptx
21st_Century_Skills_Framework_Final_Presentation_2.pptx21st_Century_Skills_Framework_Final_Presentation_2.pptx
21st_Century_Skills_Framework_Final_Presentation_2.pptx
 

M3 reviewing the slo-sso-final

10
Task Structure
QA
1. Completeness: Is the SLO/SSO Form completed correctly?
2. Comprehensiveness: Are the assessments of high technical quality?
3. Coherency: Are the SLO/SSO components aligned to each other?
12
Checklist & Rubric Preview
1. Apply the QA Checklist (Handout #3) to a completed SLO or SSO Form.
   A. What information is needed?
   B. Who is the SLO/SSO focused on?
2. Preview the three (3) strands within the Performance Measure Rubric (Template #3a) for each assessment identified within the SLO/SSO.
   A. What is the purpose of the assessment/performance measure?
   B. What standards does it purport to measure?
   C. What technical data is provided/known?
13
Checklist & Rubric Preview (cont.)
3. Preview the three (3) phases within the Coherency Rubric for the SLO or SSO Form.
   A. How well are the SLO/SSO components aligned to each other?
   B. How well do the identified assessments/performance measures align to the SLO's/SSO's stated goal?
15
Quality Assurance Checklist
SLO/SSO Completeness
• The checklist is designed to verify each element within the four (4) sections of the SLO/SSO Form.
• The checklist applies the business rules to each element. The Help Desk document provides examples for each element.
16
Quality Assurance Checklist
SLO Section I: Context
1.1 Content Area: Name of the content area upon which the SLO is based.
1.2 Course: Name of the specific course/subject upon which the SLO is based.
1.3 Grade Level: Grade levels for students included in the course/subject in Element 1.2.
1.4 Total Students: Aggregate number of students (estimated, across multiple sections) for whom data will be collected.
1.5 Average Class Size: The average number of students in a single session of the course/subject identified in Element 1.2.
1.6 Class Frequency: The frequency (within the given timeframe) of the course/subject identified in Element 1.2.
1.7 Instructional Setting: The location or setting where the course/subject instruction is provided.
1.8 Instructional Interval: The time frame of the course/subject identified in Element 1.2.
17
Quality Assurance Checklist (cont.)
SSO Section I: Setting
1.1 Service Area: Name of the primary service area (e.g., speech) upon which the SSO is based.
1.2 Service Location: Name of the location(s) where services are provided.
1.3 Grade Level: Grade level(s) of the students and/or educator types to whom services are provided.
1.4 Total Recipients: Aggregate number of students and/or educators for whom data will be collected.
1.5 Average Case Size: The average number of recipients of the services identified in Element 1.4.
1.6 Service Frequency: The typical frequency (within the identified service interval, Element 1.8) with which services are provided to the recipients identified in Element 1.4.
1.7 Service Setting: The contextual setting (e.g., school library, student's home) in which services are provided.
1.8 Service Interval: The typical time frame of the service model.
18
Quality Assurance Checklist (cont.)
SLO Section II: Goal
2.1 Goal Statement: A narrative that articulates a key concept upon which the SLO is based. The statement addresses What, Why, and How.
2.2 Content Standards: Targeted Content Standards, which are the foundation of the performance measures, used to develop the SLO.
2.3 Instructional Strategy: The approach used to facilitate learning the key concept articulated in the Goal Statement and delineated among the Targeted Content Standards.
SLO Section III: Objective
3.1 Starting Point (Baseline): The baseline data used for comparing student results at the end of the instructional interval.
3.2 Objectives (Whole Class): The expected level of achievement for the entire student learning objective (SLO) population (as defined in Element 1.4).
3.3 Objectives (Focused Students): The expected level of achievement for a subset of the SLO population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the instructional interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).
19
Quality Assurance Checklist (cont.)
SSO Section II: Goal
2.1 Goal Statement: A narrative that articulates a key concept upon which the SSO is based. The statement addresses What, Why, and How.
2.2 Targeted Professional Standards: Targeted Professional Standards outline the requirements an organization must fulfill to ensure that products and services consistently meet customers' requirements. Content standards may also be identified for those individuals providing instructional services.
2.3 Implementation Strategy: The approach used to attain the primary service goal articulated in the Goal Statement and delineated among the Targeted Professional Standards.
SSO Section III: Objective
3.1 Starting Point (Baseline): The baseline data used for comparing client results at the end of the service interval.
3.2 Objectives (All Clients): The expected level of performance for the entire client population (as defined in Element 1.4).
3.3 Objectives (Focused Clients): The expected level of performance for a subset of the client population (as defined in Element 1.4).
3.4 End Point (Combined): At the end of the service interval, the aggregate performance classification as delineated by four empirical ranges (i.e., Unsatisfactory, Emerging, Effective, and Distinguished).
20
Quality Assurance Checklist (cont.)
SLO Section IV: Performance Measure
4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure that outlines: (a) what the assessment measures, (b) how to use the scores, and (c) why the assessment was developed.
4.3 Content Standards: The Targeted Content Standards (the foundation of the performance measures) used to develop SLOs. The content standards are those aligned with each performance measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of achievement for each student in the SLO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the assessment window, as well as the step-by-step procedures during each phase of administration, including: (a) the requirements for completing the performance measure, including accommodations, equipment, and materials; (b) standard time allotments to complete the overall performance measure; and (c) standard scripts that educators read to give directions for completing the performance measure.
4.7 Scoring Tools: Scoring keys for objective measures; scoring rubrics for subjective measures.
4.8 Results: The number of students participating in the performance measure; the number of students who met the target as stated in Element 4.4; and the percentage of students who met the target as stated in Element 4.4.
SLO Rating: One of four performance levels that the principal (or the evaluator) identifies after noting the actual performance with respect to each objective stated in the SLO.
Notes and Explanation: Space for the educator to note influences, factors, and other conditions associated with the SLO Rating, as well as to reflect on a purposeful review of the data.
21
Quality Assurance Checklist (cont.)
SSO Section IV: Performance Measure
4.1 Name: The name of each performance measure for which an objective is established in Element 3.2.
4.2 Purpose: The purpose statement for each performance measure that outlines: (a) what the measure is evaluating, (b) how to use the scores, and (c) why the measure was developed.
4.3 Professional Standards: The Professional Standards (the foundation of the measures) used to develop SSOs. The professional standards are those aligned with each identified measure.
4.4 Performance Targets: Using the scoring tools for each performance measure (as listed in Element 4.1), the expected level of attainment for each client in the SSO population (as defined in Element 1.4).
4.5 Metric: The metric by which the performance measure evaluates the performance target.
4.6 Administration: The administrative steps before, during, and after the evaluation window, as well as the step-by-step procedures during each phase of administration; the requirements for completing the performance measure, including accommodations, equipment, and materials; and the standard time to complete the overall evaluation.
4.7 Scoring Tools: Scoring keys for objective measures; scoring rubrics for subjective measures; data collection mechanisms.
4.8 Results: The number of clients participating in the performance measure; the number of clients who met the target as stated in Element 4.4; and the percentage of clients who met the target as stated in Element 4.4.
SSO Rating: One of four performance levels that the supervisor identifies after noting the actual performance with respect to each objective stated in the SSO.
Notes and Explanations: Space for the professional to note influences, factors, and other conditions associated with the SSO Rating, as well as to reflect on a purposeful review of the data.
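The Results and Rating elements above reduce to simple arithmetic: count the students or clients who met their targets, convert the count to a percentage, and place that percentage on the four-level continuum. A minimal Python sketch follows; the cut points and the `summarize_results` helper are hypothetical placeholders, since the actual rating ranges are set by the local educator effectiveness system.

```python
# Hypothetical rating bands (percentage floors), sorted high to low.
# Real cut points come from the local evaluation system, not this sketch.
RATING_BANDS = [
    ("Distinguished", 90.0),
    ("Effective", 75.0),
    ("Emerging", 60.0),
    ("Unsatisfactory", 0.0),
]

def summarize_results(met_target, participating):
    """Element 4.8: counts, percentage met, and a provisional rating."""
    pct_met = 100.0 * met_target / participating
    # First band whose floor the percentage reaches.
    rating = next(label for label, floor in RATING_BANDS if pct_met >= floor)
    return {"participating": participating,
            "met_target": met_target,
            "pct_met": pct_met,
            "rating": rating}

summary = summarize_results(met_target=21, participating=25)
# 21 of 25 students is 84.0%, which falls in the hypothetical Effective band.
```

The same arithmetic applies to SSO clients; only the population labels change.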
22
Procedural Steps
Step 1. Select the drafted SLO/SSO, including applicable performance measures.
Step 2. Beginning with Section I, use Handout #3 (Quality Assurance Checklist) to evaluate each element.
Step 3. Identify any element with missing or incorrect information (i.e., a statement or data placed in the wrong element of the SLO/SSO Form).
Step 4. Flag any element needing refinement or further discussion with other educators/professionals and/or the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the other sections of the SLO/SSO Form.
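Steps 2 and 3 above amount to a field-by-field completeness pass, which can be sketched mechanically. The element names mirror Section I of the SLO Form; the `check_section` helper and the draft data are hypothetical illustrations, not part of the official templates.

```python
# Section I (Context) elements from the QA Checklist (Handout #3).
SECTION_I_ELEMENTS = [
    "1.1 Content Area", "1.2 Course", "1.3 Grade Level",
    "1.4 Total Students", "1.5 Average Class Size", "1.6 Class Frequency",
    "1.7 Instructional Setting", "1.8 Instructional Interval",
]

def check_section(form, required):
    """Return the required elements that are missing or left blank (Step 3)."""
    return [name for name in required if not str(form.get(name, "")).strip()]

# A partially completed draft form (illustrative data only).
draft_form = {
    "1.1 Content Area": "Mathematics",
    "1.2 Course": "Grade 8 Pre-Algebra",
    "1.3 Grade Level": "8",
    "1.4 Total Students": "52",
}

flagged = check_section(draft_form, SECTION_I_ELEMENTS)
# flagged lists Elements 1.5 through 1.8 for refinement (Step 4).
```

Repeating the same call with the element lists for Sections II through IV covers Step 5.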
24
Performance Measure Rubric
SLO/SSO Comprehensiveness
• The rubric is designed to examine the quality characteristics of teacher-made performance measures. It comprises 18 technical descriptors organized into three strands.
• The rubric's purpose is to provide teachers with a self-assessment tool that assists in building high-quality measures of student achievement.
25
Performance Measure Rubric
• For vendor-developed assessments, examine the technical evidence to determine whether the tool is designed to measure the Targeted Content/Professional Standards.
• For locally developed assessments, follow the guidelines in the Quick Start training to create high-quality performance measures, and then apply Template #3a.
26
Performance Measure Rubric (cont.)
Strand I: Design
1.1 The purpose of the performance measure is explicitly stated (who, what, why).
1.2 The performance measure has targeted content standards representing a range of knowledge and skills students are expected to know and demonstrate.
1.3 The performance measure's design is appropriate for the intended audience and reflects challenging material needed to develop higher-order thinking skills.
1.4 Specification tables articulate the number of items/tasks, item/task types, passage readability, and other information about the performance measure, OR blueprints are used to align items/tasks to targeted content standards.
1.5 Items/tasks are rigorous (designed to measure a range of cognitive demands/higher-order thinking skills at developmentally appropriate levels) and of sufficient quantity to measure the depth and breadth of the targeted content standards.
27
Performance Measure Rubric (cont.)
Strand II: Build
2.1 Items/tasks and score keys are developed using standardized procedures, including scoring rubrics for human-scored, open-ended questions (e.g., short constructed responses, writing prompts, performance tasks).
2.2 Items/tasks are created and reviewed in terms of: (a) alignment to the targeted content standards, (b) content accuracy, (c) developmental appropriateness, (d) cognitive demand, and (e) bias, sensitivity, and fairness.
2.3 Administrative guidelines are developed that contain the step-by-step procedures used to administer the performance measure in a consistent manner, including scripts to orally communicate directions to students, day and time constraints, and allowable accommodations/adaptations.
2.4 Scoring guidelines are developed for human-scored items/tasks to promote score consistency across items/tasks and among different scorers. These guidelines articulate point values for each item/task used to combine results into an overall score.
2.5 Summary scores are reported using both raw score points and performance levels. Performance levels reflect the range of scores possible on the assessment and use terms or symbols to denote each level.
2.6 The total time to administer the performance measure is developmentally appropriate for the test-taker. Generally, this is 30 minutes or less for young students and up to 60 minutes per session for older students (high school).
28
Performance Measure Rubric (cont.)
Strand III: Review
3.1 The performance measures are reviewed in terms of design fidelity:
   • Items/tasks are distributed based upon the design properties found within the specification or blueprint documents;
   • Item/task and form statistics are used to examine levels of difficulty, complexity, distractor quality, and other properties; and
   • Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2 The performance measure was reviewed in terms of editorial soundness, ensuring consistency and accuracy across all documents (e.g., the administration guide):
   • Identifies words, text, reading passages, and/or graphics that require copyright permission or acknowledgements;
   • Applies Universal Design principles; and
   • Ensures linguistic demands and readability are developmentally appropriate.
3.3 The performance measure was reviewed in terms of alignment characteristics:
   • Pattern consistency (within specifications and/or blueprints);
   • Targeted content standards match;
   • Cognitive demand; and
   • Developmental appropriateness.
3.4 Cut scores are established for each performance level. Performance level descriptors describe the achievement continuum using content-based competencies for each assessed content area.
29
Performance Measure Rubric (cont.)
Strand III: Review
Note: The indicators below are evaluated after students/clients have taken the assessment (i.e., post-administration).
3.5 As part of the assessment cycle, post-administration analyses are conducted to examine aspects such as item/task performance, scale functioning, overall score distribution, rater drift, and content alignment.
3.6 The performance measure has score validity evidence demonstrating that item responses were consistent with content specifications. Data suggest that the scores represent the intended construct by using an adequate sample of items/tasks within the targeted content standards. Other sources of validity evidence, such as the interrelationship of items/tasks and the alignment characteristics of the performance measure, are collected.
3.7 Reliability coefficients are reported for the performance measure, which includes estimating internal consistency. Standard errors are reported for summary scores. When applicable, other reliability statistics, such as classification accuracy and rater reliability, are calculated and reviewed.
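The internal-consistency estimate named in Descriptor 3.7 is commonly reported as Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), where k is the number of items. A small self-contained sketch; the item-score matrix is made-up illustration data, not from any model assessment.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a matrix with rows = students, columns = items."""
    k = len(item_scores[0])  # number of items

    def sample_variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [sample_variance([row[i] for row in item_scores])
                 for i in range(k)]
    total_var = sample_variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Illustrative 4-student, 2-item score matrix in which both items agree
# perfectly, so internal consistency is at its maximum (alpha = 1.0).
alpha = cronbach_alpha([[1, 1], [0, 0], [1, 1], [0, 0]])
```

In practice the coefficient would be computed over the full post-administration response matrix and reported alongside the standard errors mentioned in 3.7.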
30
Procedural Steps
Step 1. Identify the performance measures being used within the SLO/SSO.
Step 2. Examine the alignment characteristics of the performance measure to those standards identified in Section II.
Step 3. Determine the most applicable metric (e.g., growth) based on the stated objectives in Section III.
Step 4. Evaluate the technical evidence provided by either the vendor or the assessment's developer.
Step 5. Repeat Steps 2 through 4 with the other performance measures identified in Section IV.
32
SLO/SSO Coherency
[Diagram: the Goal Statement connects to the Objectives (All Students/Clients and Focused Students/Clients), the Performance Measures & Targets, and the SLO/SSO Rating.]
33
Coherency Rubric
• The Coherency Rubric, which examines the alignment characteristics of each student learning/support objective (SLO/SSO), serves as the measurement tool to ensure that each SLO/SSO meets the coherency criteria.
• The rubric evaluates each of the three (3) phases used to organize this training module. Each aspect of the SLO/SSO should meet a specific descriptor in the Coherency Rubric.
*Note* Template #3: Coherency Rubric is found in the Review module.
34
Coherency Rubric
Phase I: Design (Rating: Meets Criteria / Needs Refinement)
1.1 The goal statement articulates the "big idea" under which targeted content/professional standards are directly aligned. The statement is concise and free of technical jargon.
1.2 Targeted content/professional standards have a direct influence on student performance outcomes and are viewed as "central" to the subject/service area.
1.3 The course/subject (service) area associated with the SLO/SSO is logically linked to the central theme and targeted content/professional standards.
35
Coherency Rubric (cont.)
Phase I: Design (Rating: Meets Criteria / Needs Refinement)
1.4 A blueprint or other design document illustrates relationships among key components (i.e., Goal Statement, Targeted Content/Professional Standards, Objectives, Performance Measures, and Overall Rating).
1.5 Performance measures are designed to evaluate the targeted content/professional standards (as demonstrated by the performance measure's alignment characteristics).
36
Coherency Rubric (cont.)
Phase II: Build (Rating: Meets Criteria / Needs Refinement)
2.1 The goal statement represents a central concept that is enduring, has leverage, and is foundational to further, more complex learning outcomes.
2.2 The SLO/SSO is supported by a representative sample of the educator's/professional's students, with a sample size that is sufficient to make valid inferences about student achievement and/or outcomes.
2.3 Targeted content/professional standards are selected using a valid and reliable approach that is fair and unbiased.
37
Coherency Rubric (cont.)
Phase II: Build (Rating: Meets Criteria / Needs Refinement)
2.4 Objectives are specific, criteria-focused, attainable (yet challenging), and directly linked to the performance measures.
2.5 Performance measures have benchmarks for two or more points in time within a given school year [Growth]. In addition or alternatively, performance measures have a clear, date-specific target for an on-demand demonstration of skill and knowledge attainment [Mastery].
2.6 The overall rating is directly linked to a performance continuum based on the percentage of students meeting expectations across all objectives.
38
Coherency Rubric (cont.)
Phase III: Review (Rating: Meets Criteria / Needs Refinement)
3.1 The SLO/SSO is based on performance measures that are technically sound (i.e., reliable, valid, and fair) and fully align to the targeted content standards.
3.2 The SLO/SSO form reviews mitigate unintended consequences and/or potential threats to inferences made about meeting performance expectations.
3.3 The SLO/SSO has data and/or evidence to support the assignment of an overall teacher rating (i.e., Unsatisfactory, Emerging, Effective, or Distinguished).
39
Coherency Rubric (cont.)
Phase III: Review (Rating: Meets Criteria / Needs Refinement)
3.4 The SLO/SSO form has been examined to ensure that it is complete; that is, all applicable elements within the SLO Form (Template #2a or Template #2c) have been addressed according to the prescribed business rules.
3.5 The SLO/SSO form has been reviewed to ensure it includes "comprehensive" performance measures; that is, all performance measures have been examined to determine that they are appropriate for use in the process AND are of high technical quality.
40
Procedural Steps
Step 1. Select the drafted SLO/SSO, including applicable performance measures.
Step 2. Beginning with Phase I, use the Coherency Rubric to evaluate the alignment of the components to the overall SLO/SSO design.
Step 3. Identify any component that shows weak alignment and/or does not meet the rubric's criteria.
Step 4. Flag any element needing refinement or further discussion with other educators/professionals and/or the principal/supervisor.
Step 5. Repeat Steps 2 through 4 with the remaining two phases of the Coherency Rubric.
41
Reflection (Step 5)
Quality Assurance
• The SLO/SSO Form is complete.
• The SLO's/SSO's assessments are comprehensive.
• The SLO's/SSO's design is coherent.
42
Summary
This SLO/SSO Review Phase:
• Applied a set of quality assurance criteria to ensure that the student learning/support objective, along with its applicable performance measures, was complete, comprehensive, and coherent (i.e., "high quality").
• Identified areas of improvement for subsequent SLO/SSO refinement.

Editor's Notes

  1. The Student Learning Objective (SLO) and Student Support Objective (SSO) process comprises three (3) phases: Designing, Building, and Reviewing. Student learning/support objectives provide a valid assessment of educator and professional staff (specialist) effectiveness through performance outcomes based on standards.
     This training series comprises four (4) training modules:
     • M0: SLO/SSO Orientation
     • M1: Designing SLOs/SSOs
     • M2: Building SLOs/SSOs
     • M3: Reviewing SLOs/SSOs
     Welcome to the Reviewing Module.
     ________________________________________________________________________
     Technical Notes
     • "Structure" Concept: "What is this slide telling the audience?"
     • Key Points for Trainers: "What/Where are the details needed for teaching?"
     • Learning Activity: "How can the participant's learning be enhanced?" (This item will not be populated for every slide.)
  2. REVIEW Phase
     Concept
     The SLO/SSO Review phase provides an opportunity for educators/professionals to complete a 3-tier quality assurance review to ensure the SLO/SSO, along with its applicable performance measures, is complete, comprehensive, and coherent. Teachers will use checklists and rubrics to determine the (a) completeness of the SLO/SSO Template, (b) comprehensiveness of the performance measures, and (c) coherency (alignment) of the SLO/SSO.
     Key Points for Trainers
     Ensure the participants understand that Reviewing requires an extensive evaluation of the SLO's/SSO's quality in terms of the 3 C's (Completeness, Comprehensiveness, and Coherency), using the quality assurance checklist and rubrics.
     Process activities during this phase occur before and after the presentation to the principal, and include:
     • Finalizing and submitting the proposed SLO/SSO;
     • Refining the SLO/SSO based upon feedback from the principal/supervisor;
     • Collecting performance data on student achievement/performance;
     • Monitoring the SLO/SSO during the school year;
     • Updating the SLO/SSO with data;
     • Evaluating each performance indicator; and
     • Determining the rating.
     Learning Activity
     Allow time for participants to access and review the following documents from the Homeroom learning portal.