Assessment Literacy Series
Quick Start ©
1
Module 3
Participants will:
1. Review developed items/tasks.
2. Examine test alignment.
3. Conduct data reviews.
4. Establish refinement procedures.
2
“Review” Objectives
Training Material
 Modularized into three phases containing:
• Training: PowerPoint Presentations, Videos, etc.
• Templates: Forms, Spreadsheets, Business Rules, etc.
• Resources: Guides, Handouts, Models, etc.
 Delivered via online learning platform: Homeroom
www.hr.riagroup2013.com
3
Helpful Resources
Participants may wish to reference:
Templates
 Template #3 – Performance Measure Rubric
Resources
 Handout #3 – Reviewing the Assessment – Scored Example
4
Helpful Resources (cont.)
Review Components
REVIEW
• Items/Tasks
• Alignment Characteristics
• Data Review
• Refinement
5
6
STEP 7
Item/Task Reviews
Item/Task Reviews
 Reviews are organized into two complementary aspects:
1. Design Fidelity
-Content
-Bias, Fairness, and Sensitivity
-Accessibility/Universal Design Features
2. Editorial Soundness
-Readability
-Sentence Structure & Grammar
-Word Choice
-Copyrights/Use Agreements
7
Quality Reviews
Ensuring the assessment:
• reflects the developed test blueprint or specification table.
• matches targeted content standards.
• includes multiple ways for test-takers to demonstrate knowledge and
skills.
Eliminating potential validity threats by reviewing
for:
• Bias
• Fairness
• Sensitive Topics
• Accessibility/Universal Design Features
8
Content
 Determine if each item/task clearly aligns to the targeted
content standard.
 Evaluate all items for content “accuracy”.
 Judge if each item/task is developmentally (grade)
appropriate in terms of:
• Reading level
• Vocabulary
• Required reasoning skills
 Review each item/task response in terms of the targeted
standards.
9
Bias
 Bias is the presence of some characteristic of an
item/task that results in the differential performance
of two individuals with the same ability but from
different subgroups.
 Bias-free items/tasks provide an equal opportunity
for all students to demonstrate their knowledge and
skills.
 Bias is not the same as stereotyping.
10
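To make the definition concrete, here is a minimal Python sketch of a classroom-scale bias screen: it compares per-item difficulty (proportion correct) across two subgroups and flags large gaps for human review. The data layout, subgroup labels, and the 0.15 cutoff are illustrative assumptions, not part of this training; formal methods (e.g., Mantel-Haenszel DIF) also match test-takers on ability before comparing.

```python
def item_p_values(responses):
    """responses: one list of 0/1 scores per student; returns the proportion
    correct for each item (classical difficulty, or p-value)."""
    n_items, n_students = len(responses[0]), len(responses)
    return [sum(row[i] for row in responses) / n_students for i in range(n_items)]

def flag_possible_bias(group_a, group_b, threshold=0.15):
    """Flag items whose difficulty differs across subgroups by more than
    `threshold` -- an arbitrary screening cutoff, not a statistical test.
    A fuller screen would first stratify students by total score so that
    groups of comparable ability are being compared."""
    p_a, p_b = item_p_values(group_a), item_p_values(group_b)
    return [i for i, (a, b) in enumerate(zip(p_a, p_b)) if abs(a - b) > threshold]

# Hypothetical data: rows are students, columns are items (1 = correct).
group_a = [[1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 1]]
group_b = [[1, 0, 0, 1], [1, 1, 0, 1], [0, 0, 0, 1]]
print(flag_possible_bias(group_a, group_b))  # item indices worth a closer look
```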
Fairness
 Fairness generally refers to the opportunity
for test-takers to learn the content being
measured.
 Item/task concepts and skills should have
been taught to the test-taker prior to
evaluating content mastery.
• More complex for large-scale assessments
• Assumes earlier grades taught the foundational
content. [May be a faulty assumption.]
11
Sensitivity
Review to ensure items/tasks are:
 sensitive to different cultures, religions, ethnic
and socio-economic groups, and disabilities.
 balanced by gender roles.
 positive in their language, situations, and
imagery.
 void of text that may elicit strong emotional
responses by specific groups of students.
12
Sensitivity (cont.)
-Topics to Avoid-
• Abortion
• Birth control
• Child abuse/neglect
• Creationism
• Divorce
• Incest
• Illegal activities
• Occult/witchcraft
• Rape
• Religious doctrine
• Sex/sexuality
• Sexual orientation
• Weight
• Suicide
• STDs
*Note: This listing provides examples of topics to avoid, but it does not
contain every sensitive topic.
13
Accessibility
 The extent to which a test and/or testing condition eliminates
barriers and permits the test-taker to fully demonstrate his/her
knowledge and skills.
 All items should be reviewed to ensure they are accessible to
the entire population of students. [Universal design features
are helpful by eliminating access barriers.]
 Item reviews must consider:
• Readability
• Syntax complexity
• Item presentation
• Font size
• Clarity of images, graphs, and tables
• Item/task spacing
14
Editorial Soundness
Ensure the assessments have developmentally appropriate:
• Readability levels
• Sentence structures
• Word choice
Eliminate validity threats created by:
• Confusing or ambiguous directions or prompts
• Imprecise verb use to communicate expectations
• Vague response criteria or expectations
15
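As one way to operationalize the readability check above, the sketch below estimates a Flesch-Kincaid grade level for a prompt so it can be compared against the targeted grade. The syllable counter is a crude heuristic and the sample prompt is hypothetical; the training material itself does not prescribe a specific readability formula.

```python
import re

def count_syllables(word):
    """Crude heuristic: count runs of consecutive vowels (minimum 1)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level of a passage, prompt, or set of directions."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words)) - 15.59)

# Hypothetical prompt -- compare the estimate against the intended grade.
prompt = "Explain how the water cycle moves water between the ocean and the sky."
print(round(flesch_kincaid_grade(prompt), 1))
```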
Procedural Steps: Item/Task
Reviews
Step 1. Identify at least one other teacher to assist in the review (best
accomplished by department or grade-level committees).
Step 2. Organize the test form, answer key, and/or scoring rubrics,
and Handout #3-Reviewing the Assessment-Scored Example.
Step 3. Read each item/task and highlight any “potential” issues in
terms of content accuracy, potential bias, sensitive materials,
fairness, and developmental appropriateness.
Step 4. After reviewing the entire test form, including scoring rubrics,
revisit the highlighted items/tasks. Determine if the items/tasks
can be rewritten or must be replaced.
Step 5. Print revised assessment documents and conduct an editorial
review, ensuring readability, sentence/passage complexity, and
word selection are grammatically sound. Take corrective
actions prior to finalizing the documents.
16
QA Checklist: Items/Tasks
 All assessment forms have been reviewed for
content accuracy, bias, fairness, sensitivity, and
accessibility.
 All scoring rubrics have been reviewed for content
accuracy and developmental appropriateness.
 All edits have been applied and final documents are
correct.
17
18
STEP 8
Alignment and Performance
Level Reviews
Alignment Characteristics
Item/Task
 The degree to which the items/tasks are focused on the
targeted content standards in terms of:
• Content match
• Cognitive demand
Overall Assessment
 The degree to which the completed assessment reflects (as
described in the blueprint) the:
• Content patterns of emphasis
• Content range and appropriateness
19
Alignment Model: Webb
 Categorical Concurrence
• The same categories of the content standards are included in the assessment.
• Items might be aligned to more than one standard.
 Balance of Representation
• Ensures there is an even distribution of the standards across the test.
 Range of Knowledge
• The extent of knowledge required to answer correctly parallels the knowledge the
standard requires.
 Depth of Knowledge
• The cognitive demands in the standard must align to the cognitive demands in the
test item.
 Source of Challenge
• Students give a correct or incorrect response for the wrong reason (bias).
Source: Webb, N. L. (1997). Research Monograph No. 6: Criteria for alignment of expectations and assessments in
mathematics and science education. Washington, DC: Council of Chief State School Officers.
20
Alignment Model: Quick Start
Item/Task Level
Part I: Content & Cognitive Match
1. Item/task is linked to a specific content standard based upon the narrative
description of the standard and a professional understanding of the
knowledge, skill, and/or concept being described.
2. Item/task reflects the cognitive demand/higher-order thinking skill(s)
articulated in the standards. [Note: Extended performance (EP) tasks are
typically focused on several integrated content standards.]
Test Form Level
Part II: Emphasis & Sufficiency Match
1. Distribution of items/tasks reflects the emphasis placed on the targeted
content standards in terms of “density” and “instructional focus”, while
encompassing the range of standards articulated on the test blueprint.
2. Distribution of opportunities for the test-taker to demonstrate skills,
knowledge, and concept mastery at the appropriate developmental range is
sufficient.
21
Procedural Steps: Alignment Review
Step 1. Identify at least one other teacher to assist in the alignment review (best
accomplished by department or grade-level committees).
Step 2. Organize items/tasks, test blueprint, and targeted content standards.
Step 3. Read each item/task in terms of matching the standards both in terms of content
reflection and cognitive demand. For SA, EA, and EP tasks, ensure that scoring
rubrics are focused on specific content-based expectations. Refine any
identified issues.
Step 4. After reviewing all items/tasks, including scoring rubrics, count the number of
item/task points assigned to each targeted content standard. Determine the
percentage of item/task points per targeted content standard based upon the
total available. Identify any shortfalls in which too few points are assigned to a
standard listed in the test blueprint. Refine if patterns do not reflect those in the
standards.
Step 5. Using the item/task distributions, determine whether the assessment has at least
five (5) points for each targeted content standard and if points are attributed to
only developmentally appropriate items/tasks. Refine if point sufficiency does
not reflect the content standards.
22
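A minimal sketch of the point-counting in Steps 4 and 5: tally item/task points per targeted content standard, report each standard's share of the total, and flag any standard with fewer than five (5) points. The standard codes and point values below are hypothetical.

```python
from collections import defaultdict

# (targeted content standard, point value) for each item/task on the form.
items = [("STD.A", 1), ("STD.A", 1), ("STD.B", 2), ("STD.B", 4),
         ("STD.C", 1), ("STD.A", 3)]

points = defaultdict(int)
for standard, pts in items:
    points[standard] += pts

total = sum(points.values())
for standard, pts in sorted(points.items()):
    flag = "  <-- fewer than 5 points (Step 5 shortfall)" if pts < 5 else ""
    print(f"{standard}: {pts} pts ({pts / total:.0%} of form){flag}")
```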
QA Checklist: Alignment
 All items/tasks match the skills, knowledge, and
concepts articulated in the targeted content
standards.
 All scoring rubrics have been reviewed for content
match.
 All items/tasks reflect the higher-order thinking
skills expressed in the targeted content standards.
23
QA Checklist (cont.)
 The assessment reflects the range of targeted
content standards listed on the test blueprint.
 The assessment’s distribution of points reflects a
pattern of emphasis similar to that among the
targeted content standards.
 The assessment has a sufficient number of
developmentally appropriate item/task points to
measure the targeted content standards.
24
Performance Levels
Expectations
• Content-specific narratives that articulate a performance
continuum and describe how each level is different from
the others.
Categories
• A classification of performance given the range of possible
performance.
Scores
• The total number of points assigned to a particular category
of performance.
25
Performance Levels (cont.)
 Reflect the targeted content standards in combination with
learning expectations and other assessment data.
• Item-centered: focused on items/tasks
• Person-centered: focused on test-takers
 Apply to either “mastery” or “growth” metrics.
 Established prior to administration but often need refinement or
validation using post-administration scores.
 Contain rigorous but attainable expectations.
26
Procedural Steps: Performance Level
Review
27
Step 1. Review each item/task in terms of how many points the “typically
satisfactory” test-taker would earn. Repeat this for the entire
assessment.
Step 2. Identify a preliminary test score (total raw points earned) that would
reflect the minimum level of mastery of the targeted standards based
upon the educator’s course expectation (e.g., must earn 80% to pass
the final project and demonstration).
Step 3. From Step #1, total the number of raw points and calculate the percent
correct for the “typically satisfactory” test-taker.
Step 4. Compare the educator’s course expectation percent to the “typically
satisfactory” test-taker. Modify the assessment’s cut score to “fit” the
course expectation and the anticipated assessment performance.
Step 5. Validate the cut score and performance expectation after the
assessment is given (Step 9: Data Reviews).
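The arithmetic in Steps 1 through 4 can be sketched as follows. Every number here is hypothetical, and the point judgments themselves come from the reviewing educators, not from code.

```python
# Points the reviewers judge a "typically satisfactory" test-taker would earn
# on each item/task (Step 1) -- all values hypothetical.
satisfactory_points = [1, 1, 0, 2, 3, 1, 2]
points_possible     = [1, 1, 1, 2, 4, 2, 3]

satisfactory_pct   = sum(satisfactory_points) / sum(points_possible)  # Step 3
course_expectation = 0.80  # Step 2: e.g., "must earn 80% to pass"

print(f"Typically satisfactory test-taker: {satisfactory_pct:.0%}")
print(f"Course expectation:                {course_expectation:.0%}")

# Step 4: if the two disagree, adjust the cut score so it fits both the course
# expectation and the anticipated performance (the 5-point gap used here is an
# arbitrary trigger); Step 5 then validates the result against real scores.
if abs(satisfactory_pct - course_expectation) > 0.05:
    print("Gap found -- revisit the cut score before finalizing.")
```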
QA Checklist: Performance Levels
28
 Assessment has at least two performance levels.
 Performance levels contain content-based descriptors,
similar in nature to those used in EA and EP scoring rubrics.
 Assessment has at least one cut score delineating either
meeting or not-meeting expectation.
 Assessment has at least two performance categories,
described by the performance level statements.
 Performance standards were established by educators
knowledgeable of the targeted content standards, identified
students, and performance results [driven by data and
experience].
29
STEP 9
Data Reviews
Data Reviews
 Conduct after test-takers have engaged in the
assessment procedures.
 Focus on data about the items/tasks, performance
levels, score distribution, administration guidelines,
etc.
 Evaluate technical quality by examining aspects such
as: rater reliability, internal consistency, intra-domain
correlations, decision-consistency, measurement error,
etc.
30
Data Reviews (cont.)
Areas of Focus:
• Scoring consistency of tasks
• Item/Task difficulty
• Performance levels
• Overall distribution
• Correlations between items/tasks and the total
score
• Administration timeline and guidance clarity
31
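For two of the technical-quality checks named above (internal consistency and correlations between items/tasks and the total score), here is a minimal Python sketch on hypothetical scores; it requires Python 3.10+ for statistics.correlation. Large-scale programs would use dedicated psychometric software for these analyses.

```python
from statistics import pvariance, correlation  # correlation needs Python 3.10+

# Rows = test-takers, columns = item/task scores (hypothetical).
scores = [[1, 2, 1, 3], [0, 1, 1, 2], [1, 2, 0, 3], [1, 1, 1, 1], [0, 0, 1, 2]]
k = len(scores[0])
totals = [sum(row) for row in scores]

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
alpha = (k / (k - 1)) * (1 - sum(item_vars) / pvariance(totals))
print(f"Cronbach's alpha: {alpha:.2f}")

# Corrected item-total correlations (item vs. total with the item removed).
for i in range(k):
    item = [row[i] for row in scores]
    rest = [t - x for t, x in zip(totals, item)]
    print(f"Item {i + 1} item-total r: {correlation(item, rest):.2f}")
```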
32
STEP 10
Refinements
Refinements
 Complete prior to the beginning of the next assessment
cycle.
 Analyze results from the prior assessment to identify areas
for improvement.
 Consider item/task replacement or augmentation to
address areas of concern.
 Strive to include at least 20% new items/tasks, or
implement an item/task tryout approach.
 Create two parallel forms (e.g., Forms A and B) for test
security purposes.
33
Refinements (cont.)
Areas of Focus:
• Conduct rigor and developmental reviews for items
with over 90% accuracy.
• Clarify any identified guidelines needing further
improvements.
• Create item/task banks for future development
needs.
• Validate that the performance levels reflect rigorous but
attainable standards.
• Develop exemplars based upon student responses.
34
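A short sketch of the first area of focus: compute each item's percent correct and flag anything above 90% for the rigor and developmental review. Item IDs and responses are hypothetical.

```python
# item id -> 0/1 scores across test-takers (hypothetical data).
responses = {
    "item_01": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
    "item_02": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
}

for item, scores in responses.items():
    pct_correct = sum(scores) / len(scores)
    if pct_correct > 0.90:
        print(f"{item}: {pct_correct:.0%} correct -- review rigor and"
              " developmental appropriateness")
```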
Reflection
• Step 7 – Item/Task Reviews
• Step 8 – Alignment Reviews
• Step 9 – Data Reviews
• Step 10 – Refinements
35
 Strand 3 of the Performance Measure Rubric evaluates seven (7)
components, which are rated using the following scale:
• 1 – Fully addressed
• .5 – Partially addressed
• 0 – Not addressed
• N/A – Not applicable
Note: The Performance Measure Rubric is found within the “Template” folder
of this module.
36
Final Check
Performance Measure Rubric
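Tallying the seven component ratings might look like the sketch below. The ratings shown are hypothetical, and dropping N/A components from the denominator is an assumption here, not a rule stated in the rubric.

```python
# Hypothetical Strand 3 ratings on the 1 / .5 / 0 / N/A scale.
ratings = {"3.1": 1, "3.2": 0.5, "3.3": 1, "3.4": 1,
           "3.5": "N/A", "3.6": 0.5, "3.7": 0}

# Exclude N/A components, then total the points addressed.
scored = {tid: r for tid, r in ratings.items() if r != "N/A"}
earned = sum(scored.values())
print(f"Strand 3: {earned} of {len(scored)} rated components addressed")
```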
37
Final Check
Performance Measure Rubric Strand 3
Task ID | Descriptor | Rating
3.1
The performance measures are reviewed in terms of design fidelity:
 Items/tasks are distributed based upon the design properties found within the specification or
blueprint documents;
 Item/task and form statistics are used to examine levels of difficulty, complexity, distractor
quality, and other properties; and,
 Items/tasks and forms are rigorous and free of biased, sensitive, or unfair characteristics.
3.2
The performance measures are reviewed in terms of editorial soundness, while ensuring
consistency and accuracy of all documents (e.g., administration guide):
 Identifies words, text, reading passages, and/or graphics that require copyright permission or
acknowledgements;
 Applies Universal Design principles; and,
 Ensures linguistic demands and readability are developmentally appropriate.
3.3
The performance measure was reviewed in terms of alignment characteristics:
 Pattern consistency (within specifications and/or blueprints);
 Targeted content standards match;
 Cognitive demand; and,
 Developmental appropriateness.
3.4
Cut scores are established for each performance level. Performance level descriptors describe
the achievement continuum using content-based competencies for each assessed content area.
Task ID | Descriptor | Rating
3.5
As part of the assessment cycle, post administration analyses are conducted to examine such
aspects as items/tasks performance, scale functioning, overall score distribution, rater drift,
content alignment, etc.
3.6
The performance measure has score validity evidence demonstrating that item responses were
consistent with content specifications. Data suggest that the scores represent the intended
construct by using an adequate sample of items/tasks within the targeted content standards.
Other sources of validity evidence such as the interrelationship of items/tasks and alignment
characteristics of the performance measure are collected.
3.7
Reliability coefficients are reported for the performance measure, including estimates of
internal consistency. Standard errors are reported for summary scores. When applicable,
other reliability statistics such as classification accuracy, rater reliabilities, etc. are calculated
and reviewed.
38
Final Check
Performance Measure Rubric Strand 3
[Note: Indicators 3.5–3.7 are evaluated after students have taken the assessment
(i.e., post administration).]
39
Summary
Follow the training guidelines and procedures to:
• Review items/tasks, scoring rubrics, and
assessment forms to create high-quality
performance measures.
• Apply the criteria specified within Template #3 –
Performance Measure Rubric to further evaluate
assessment quality.
40
Points of Contact
• Research & Development
jbeaudoin@riagroup2013.com
• Technical Support Center
Email: helpdesk@riagroup2013.com
Hotline: 1.855.787.9446
• Business Services
gwilson@riagroup2013.com
www.ria2001.org