"Interrater Reliability"
Made Easy!
An Introduction and Practice with Using Rubrics for Assessment
Raymond Barclay, PhD
Blake Stack
Bonner Foundation
Summer Leadership Institute at Lindsey Wilson College
May 25, 2017
Agenda
• Rubric Overview – Why and Types
• Implementation
o Measurement Dimensions
o Scaling
o Scoring
o Student Engagement
o Calibration
o Reliability
• Review of University of Richmond’s civic
engagement rubrics focused on Presentations
of Learning
• Calibration & Reliability Session(s)
• Debrief
What is a rubric?
“a set of criteria specifying the characteristics of an
outcome and the levels of achievement in each
characteristic.”
• SOURCE: Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in
student affairs: A direct assessment of learning.
“A rubric is a scoring guide composed of criteria used
to evaluate performance, a product, or a project. For
instructors and students alike, a rubric defines what
will be assessed. They enable students to identify
what the instructor expects from their assignment
submission. It allows evaluation according to specified
criteria, making grading and ranking simpler, fairer
and more transparent.”
• SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Blake Stack - Bio
• University of Richmond
• Bonner Center for Civic Engagement
• Coordinator, Bonner Scholars Program
• B.S. Business Administration / B.S. Religious
Studies (Cairn ’05)
Raymond Barclay - Bio
Education (First Generation College Student):
• PhD, Psychology (Measurement/Statistics and
Cognition), Temple University, School of Education
• Specialization in Design Thinking – University of
Virginia, Graduate School of Business
• Design Thinking & Charrettes – Harvard Univ.-Graduate
School of Design
• MS, Sustainable Design – Thomas Jefferson Univ./Philadelphia Univ., College of Architecture and the Built Environment (in progress)
• MDIV, Princeton Theological Seminary
• Philosophy & Religious Studies - Indiana University of
Pennsylvania
Published in the following areas:
• Online learning, assessment and learning, strategic planning,
resiliency, survey development, clinical psychology,
statistical methods (hierarchical linear analysis,
multivariate analysis, cluster and factor analysis)
Current Role: President – Enrollment x Design, LLC (NJ)
Prior Roles
• Associate Vice President/Associate Provost/Associate Vice
Chancellor Roles
o The New School (NY)
o Stetson University (FL)
o University of North Carolina (NC)
o College of Charleston (SC)
• Director Roles
o The College of New Jersey (NJ)
o Burlington County College (NJ)
o The Bonner Foundation (NJ)
• Senior Analyst Roles
o Arroyo Research Services (NC) - K-16 consulting/evaluation firm
o Rowan Univ./Burlington County Community College
(NJ)
Strengths & Limits of a Rubric?
Strengths
• Creates objectivity and consistency across all
students
• Clarifies grading criteria in specific terms for
performance or product
• Shows expectations and how work will be evaluated
• Promotes students' awareness and provides
benchmarks to improve their performance or
product
Limitations
• Creating effective rubrics is time consuming
• Cannot measure all aspects of student learning
• May require additional feedback after students
receive their score
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Why use a rubric?
• Provides both qualitative descriptions of student learning and
quantitative results
• Clearly communicates expectations to students
• Provides consistency in evaluation
• Simultaneously provides student feedback and programmatic
feedback
• Allows for timely and detailed feedback
• Promotes colleague collaboration
• Helps us refine practice
Types of rubrics? Analytic*
Analytic rubrics articulate levels of performance for each criterion used to assess student learning.
• Advantages
• Provides vehicle for more detailed feedback on areas of strength and weakness.
• Scoring is more consistent across students and graders when compared to other
approaches.
• Criteria can be weighted to reflect the relative importance of each dimension (see the sketch below).
• Disadvantages
• Takes more time to create and use than a holistic rubric.
• Unless each point for each criterion is well-defined, raters may not arrive at the same score.
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
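The weighting point above can be made concrete with a small sketch. The Python snippet below shows one way a weighted analytic score might be computed; the criterion names, weights, and ratings are illustrative assumptions, not drawn from any rubric in this deck.

```python
# A minimal sketch (not from the deck) of scoring one student against a
# weighted analytic rubric. Criterion names, weights, and ratings are
# illustrative assumptions.

ANALYTIC_RUBRIC = {
    # criterion: weight (weights sum to 1.0)
    "Organization": 0.25,
    "Evidence & Analysis": 0.50,
    "Delivery": 0.25,
}

def weighted_score(ratings: dict) -> float:
    """Combine per-criterion ratings (e.g., on a 1-4 scale) into one weighted score."""
    return sum(ANALYTIC_RUBRIC[criterion] * rating
               for criterion, rating in ratings.items())

# One rater's ratings for one student, on a 4 (highest) to 1 (lowest) scale.
ratings = {"Organization": 3, "Evidence & Analysis": 4, "Delivery": 2}
print(weighted_score(ratings))  # 0.25*3 + 0.50*4 + 0.25*2 = 3.25
```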
Analytic Example 1: Undergraduate Research Project with Weightings
Source: http://ias.virginia.edu/assessment/outcomes/tools/rubrics
Analytic Example 2: Undergraduate Student Employee SLO
Source: http://studentaffairs.stonybrook.edu/assessment/selo/index.html
Holistic Example 1: Essay Writing
Source: http://fdc.umbc.edu/files/2013/01/SAMPLE-HOLISTIC-RUBRIC-FOR-ESSAYS.pdf
Holistic Example 2: Critical Thinking
Source: http://teaching.temple.edu/sites/tlc/files/resource/pdf/Holistic%20Critical%20Thinking%20Scoring%20Rubric.v2%20[Accessible].pdf
Types of rubrics? Holistic*
A holistic rubric consists of a single scale with all criteria to be included
in the evaluation being considered together.
Advantages
• Emphasis on what the learner is able to demonstrate, rather than what s/he
cannot do.
• Saves time by minimizing the number of decisions raters make.
• Can be applied consistently by trained raters, increasing reliability.
Disadvantages
• Does not provide specific feedback for improvement.
• When student work is at varying levels spanning the criteria points, it can be difficult to select the single best description.
• Criteria cannot be weighted.
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
5/25/2017 13
Enrollment x Design, LLC
Things to Consider in Developing a Rubric
(see resources at end of ppt for more information)
• Have you consulted the plethora of professional literature and online resources?
o There are rubrics for a variety of subject areas that have been refined using professional standards and empirical research
o There are many classroom-tested rubrics assessed by instructors and their students
• Can you adapt the criteria, rating scale, and indicators to your needs?
o Whether adapting or designing a rubric from scratch, the developmental process is the same, and you must identify the basic components of your rubric:
➢ (a) the performance criteria, (b) the rating scale, and (c) the indicators of performance.
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Criteria: Evidence of Student Learning
• Whether the product is related to an essay, a research or applied project, or a presentation, the evidence of learning or thinking must be specified
• The evidence will drive the selection of the components that are most important to evaluate relative to a
given task within a specified instructional context.
Components = Criteria
• Key questions to help prioritize the criteria:
➢Which of the proposed criteria are non-negotiable?
➢What are the learning outcomes, broadly or relative to a specific program?
➢Which learning outcomes will be specified within the rubric?
➢Are there skills that are essential to declare that the student is competent or has a certain proficiency level for the task or assignment to be complete?
➢How important is the way the student completes the task or project (interest, logic, organization, creativity) in demonstrating this proficiency level?
➢Are there process and product expectations?
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Implementation Steps
1. Identify the outcome
2. Determine how you will collect the evidence
3. Develop the rubric based on observation criteria (Anchors)
4. Train the evaluators on how to use the rubric
5. Test the rubric against examples
6. Revise as needed
7. Collect the results of scoring and report out
*http://manoa.hawaii.edu/assessment/workshops/pdf/Rubrics_in_program_assessment_ppt_2013-10.pdf
Choosing Measurement Dimensions*
Measurement Goals: List the measurement dimensions you want a
student to be able to demonstrate that are relevant to curriculum/
activities
Face Validity: Discuss the proposed measurement dimensions with others to subjectively assess whether the rubric measures what it is purported to measure.
Parsimony: Edit content to make sure each measurement dimension is concise and clear while ensuring the required breadth of coverage (>=3 & <=8 dimensions)
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
Writing Descriptors*
1. Describe each level of mastery for each characteristic
2. Describe the best work you could expect
3. Describe an unacceptable product
4. Develop descriptions of intermediate level products for
intermediate categories
5. Each description and each category should be mutually exclusive
6. Be specific and clear; reduce subjectivity
*University of Florida Institutional Assessment: Writing Effective Rubrics
Rubric Development & Use is a Practiced Art Form!*
• Conduct Training
• Rater Practice!
• Rater Discussions & Negotiation
• Rubric Iteration
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
Pick your Scaling Approach (“indicators”) - I*
Competency:
a. Beginner, Developing, Accomplished
b. Marginal, Proficient, Exemplary
c. Novice, Intermediate, Proficient, Distinguished
d. Not Yet Competent, Partly competent, Competent, Sophisticated
Frequency of Behavior:
a. Never, Rarely, Occasionally, Often, Always
b. Never, Once, Twice, Three times, Four times
c. Never, 1-3x, 4-6x, 5-7x…
Extent to Which Performed:
a. Not at all, Slightly, Moderately, Considerably, A great deal
b. Yes/No
c. Met, Partially Met, Not Met
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
Pick your Scaling Approach (“indicators”) - II*
*SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Scale of 4 (highest) to 1 (lowest), by criterion:
• Task Requirements: 4 = All; 3 = Most; 2 = Some; 1 = Very few or none
• Frequency: 4 = Always; 3 = Usually; 2 = Some of the time; 1 = Rarely or not at all
• Accuracy: 4 = No errors; 3 = Few errors; 2 = Some errors; 1 = Frequent errors
• Comprehensibility: 4 = Always comprehensible; 3 = Almost always comprehensible; 2 = Gist and main ideas are comprehensible; 1 = Isolated bits are comprehensible
• Content Coverage: 4 = Fully developed, fully supported; 3 = Adequately developed, adequately supported; 2 = Partially developed, partially supported; 1 = Minimally developed, minimally supported
• Vocabulary Range: 4 = Broad; 3 = Adequate; 2 = Limited; 1 = Very limited
• Variety: 4 = Highly varied, non-repetitive; 3 = Varied, occasionally repetitive; 2 = Lacks variety, repetitive; 1 = Basic, memorized, highly repetitive
Things to Remember about Scaling (“Indicators”)
• What is the ideal assessment for each criterion?
o Begin with the highest level of the scale to define top quality performance.
o Work backward to lower performance levels.
• Ensure continuity in the differences between adjacent performance levels (e.g., exceeds vs. meets, and meets vs. does not meet expectations).
o Difference between a 2 and a 3 performance should not be more than the difference between a
3 and a 4 performance.
• Edit the indicators to ensure that the levels reflect variance in quality and not a shift in
importance of the criteria.
• Make certain that the indicators reflect equal steps along the scale.
o The difference between 4 and 3 should be equivalent to the difference between 3 and 2 and between 2 and 1.
o “Yes, and more,” “Yes,” “Yes, but,” and “No” are ways for the rubric developer to think about
how to describe performance at each scale point.
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
MetaRubrics: Campus Labs Example
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
Scoring Guidelines
1. The grader(s) should be trained in the proper use of the rubric
2. Use multiple graders, if possible, to score student work in order to gain
greater reliability.
3. If different graders are used, make every effort to ensure that they are as
consistent as possible in their scoring by providing adequate training and
examples.
4. If working alone, or without examples, you can achieve a greater level of
internal consistency by giving preliminary ratings to students’ work
• Through this approach, clusters of similar quality will soon develop.
• After establishing a firm scoring scheme, re-grade all students’ work to
assure greater internal consistency and fairness.
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Students & Your Rubric
Development
• Include students in the revision and/or development process.
oWhen students are involved, the assignment itself becomes more
meaningful.
Use
• Share the rubric with students before they complete the
assignment.
o Establishes the level of performance expected, which increases the likelihood that students meet those standards.
SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics)
Calibration of Rubrics – 12 Steps*
1. Make copies of the rubric for each
rater
2. Identify representative student
works for each level of performance:
• Case a: 1 – not met; 2 – met; 1 – exceeded
• Case b: 2 – not met; 2 – met; 2 – exceeded
3. Provide copies of student work with
identifiers removed
4. Provide scoring sheet
5. Facilitator explains the SLO and the
rubric
6. Each rater independently scores
student work
7. Group discussion of each student
work.
8. Reach consensus on a score for each
work.
9. Recalibrate after 3 hours or at the
beginning of each rating session.
10. Check inter-rater consistency (see the sketch below)
11. Present results in a meaningful and clear manner
12. Use results
*http://manoa.hawaii.edu/assessment/workshops/pdf/Rubrics_in_program_assessment_ppt_2013-10.pdf
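As a hedged illustration of steps 6-8 and 10, the Python sketch below tallies independent rater scores per piece of work, flags works that lack unanimous agreement for group discussion, and reports the pairwise exact-agreement rate. The data layout, sample scores, and the unanimity rule are assumptions made for the example, not part of the workshop materials.

```python
# Minimal sketch of tallying a calibration session (steps 6-8 and 10 above).
# The sample scores and the "flag anything without unanimous agreement" rule
# are illustrative assumptions.
from itertools import combinations

# Independent scores per piece of student work: {work_id: {rater: score}}
scores = {
    "work_1": {"rater_A": 2, "rater_B": 2, "rater_C": 3},
    "work_2": {"rater_A": 1, "rater_B": 1, "rater_C": 1},
    "work_3": {"rater_A": 3, "rater_B": 2, "rater_C": 2},
}

def needs_discussion(work_scores: dict) -> bool:
    """A work needs group discussion if the raters did not all give the same score."""
    return len(set(work_scores.values())) > 1

def exact_agreement_rate(all_scores: dict) -> float:
    """Share of rater pairs, across all works, that gave exactly the same score."""
    agree = total = 0
    for work_scores in all_scores.values():
        for r1, r2 in combinations(sorted(work_scores), 2):
            total += 1
            agree += work_scores[r1] == work_scores[r2]
    return agree / total

for work_id, work_scores in scores.items():
    if needs_discussion(work_scores):
        print(f"{work_id}: discuss and reach consensus")
print(f"Pairwise exact agreement: {exact_agreement_rate(scores):.0%}")
```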
Reliability!*
A. Inter-rater reliability: Between-rater consistency (“Inter-rater agreement: how many pairs of raters gave exactly the same score?” “Inter-rater reliability: what is the correlation between rater 1 and rater 2?”)
Affected by:
• Initial starting point or approach to scale (assessment tool)
• Interpretation of descriptions
• Domain / content knowledge
B. Intra-rater consistency: Within-rater consistency (is the same rater consistent with his or her own earlier scoring?)
Affected by:
• Internal factors: mood, fatigue, attention
• External factors: order of evidence, time of day, other situations
• Applies to both multiple-rater and single rater situations
• EXCEL FORMULA: =CORREL(array1,array2) (see the Python sketch below)
*Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning.
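A minimal Python sketch of both checks for two raters who scored the same six works; the score lists are made up, and the correlation call is simply the Python counterpart of the Excel =CORREL formula noted above (requires Python 3.10+).

```python
# Minimal sketch of the two reliability checks named above, for two raters who
# scored the same set of works. The score lists are made-up examples.
from statistics import correlation  # Pearson's r; available in Python 3.10+

rater_1 = [3, 2, 4, 1, 3, 2]
rater_2 = [3, 3, 4, 1, 2, 2]

# Inter-rater agreement: how often did the two raters give exactly the same score?
agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)

# Inter-rater reliability: correlation between rater 1 and rater 2
# (the Python counterpart of the Excel formula =CORREL(array1, array2)).
reliability = correlation(rater_1, rater_2)

print(f"Exact agreement: {agreement:.0%}")   # 4 of 6 scores match -> 67%
print(f"Correlation (r): {reliability:.2f}")
```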
Assessment Scenario #1: Written Reflections
Assessment Scenario #1
Written Reflection Instructions:
1. After forming groups, review the handout entitled “Goals, Prompts, and Rubrics: Written Reflection” (5 min)
2. Take time to score the first reflection
individually. (Write this down.) Then
discuss with your group and agree upon a
group score. (5 min)
3. Repeat step 2 with two subsequent
written reflections. (10 min)
Assessment Scenario #2: Senior Presentation
Assessment Scenario #2
Senior Presentation Instructions:
1. Break up into pairs.
2. Review the handout entitled “Goals, Prompts, and Rubrics: Senior Presentation” (5 min)
3. While viewing the presentation:
• One person writes down quotes
• One person looks at the rubric and
attempts to determine a rating
4. Afterwards, pairs get together to compare and discuss, and to make a final rubric rating designation
Resources: Rubrics
AAC&U VALUE RUBRICS (16)
• Intellectual and Practical Skills
o Inquiry and analysis
o Critical thinking
o Creative thinking
o Written communication
o Oral communication
o Reading
o Quantitative literacy
o Information literacy
o Teamwork
o Problem solving
• Personal and Social Responsibility
o Civic engagement—local and global
o Intercultural knowledge and competence
o Ethical reasoning
o Foundations and skills for lifelong learning
o Global learning
• Integrative and Applied Learning
o Integrative learning
SOURCE: https://www.aacu.org/value-rubrics
RCampus
Rubrics for evaluating dissertations:
• Focus groups with 272 faculty, 74 departments, 10 disciplines, 9 research universities
• Experience = 3,470 dissertations, 9,890 committees
Lovitts, B. E. (2007). Making the Implicit Explicit: Creating Performance
Expectations for the Dissertation. Stylus Publishing, LLC.
University of Hawaii – Manoa: Rubric Bank
• Includes all VALUE rubrics, plus rubrics for:
o Collaboration, teamwork, participation
o Critical thinking, creative thinking
o Ethical deliberation
o Information literacy
o Reflection/Metacognition
o Oral communication
o Writing
o Project design
o Assessing assessment
• https://manoa.hawaii.edu/assessment/resources/rubricbank.htm
Contact Us
Raymond Barclay
ray@enrollmentxdesign.com
Blake Stack
bstack@richmond.edu

What's hot

Valiadity and reliability- Language testing
Valiadity and reliability- Language testingValiadity and reliability- Language testing
Valiadity and reliability- Language testing
Phuong Tran
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment.
Tarek Tawfik Amin
 
Language testing and evaluation validity and reliability.
Language testing and evaluation validity and reliability.Language testing and evaluation validity and reliability.
Language testing and evaluation validity and reliability.
Vadher Ankita
 
discrete-point and integrative testing
discrete-point and integrative testingdiscrete-point and integrative testing
discrete-point and integrative testing
indriyatul munawaroh
 
Reliability (assessment of student learning I)
Reliability (assessment of student learning I)Reliability (assessment of student learning I)
Reliability (assessment of student learning I)
Rey-ra Mora
 
Testing writing
Testing writingTesting writing
Testing writing
Marta Ribas
 
Types of Tests,
Types of Tests, Types of Tests,
Types of Tests,
Wardah Azhar
 
Test Assembling (writing and constructing)
Test Assembling (writing and constructing)Test Assembling (writing and constructing)
Test Assembling (writing and constructing)
Tasneem Ahmad
 
Discrete point test 1
Discrete point test 1Discrete point test 1
Discrete point test 1
Novi Kirena
 
Testing writing
Testing writingTesting writing
Testing writing
Jenny Aque
 
Ethics my own effort
Ethics my own effortEthics my own effort
Ethics my own effort
Hina Honey
 
Reliability and validity
Reliability and validityReliability and validity
Reliability and validity
Carlos Tian Chow Correos
 
Writing test
Writing testWriting test
Writing test
Thao Le
 
Test and types of tests
Test and types of testsTest and types of tests
Test and types of tests
Fousiya O P
 
Test Techniques
Test TechniquesTest Techniques
Test Techniques
Ariane Mitschek
 
Validity in Assessment
Validity in AssessmentValidity in Assessment
Validity in Assessment
sheldine abuhan
 
Designing language test
Designing language testDesigning language test
Designing language test
Jesullyna Manuel
 
test Administration
test Administrationtest Administration
test Administration
AbuulHassan2
 
Experimental research
Experimental researchExperimental research
Experimental research
izzajalil
 
Types of test
Types of testTypes of test
Types of test
Shams ud din Pandrani
 

What's hot (20)

Valiadity and reliability- Language testing
Valiadity and reliability- Language testingValiadity and reliability- Language testing
Valiadity and reliability- Language testing
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment.
 
Language testing and evaluation validity and reliability.
Language testing and evaluation validity and reliability.Language testing and evaluation validity and reliability.
Language testing and evaluation validity and reliability.
 
discrete-point and integrative testing
discrete-point and integrative testingdiscrete-point and integrative testing
discrete-point and integrative testing
 
Reliability (assessment of student learning I)
Reliability (assessment of student learning I)Reliability (assessment of student learning I)
Reliability (assessment of student learning I)
 
Testing writing
Testing writingTesting writing
Testing writing
 
Types of Tests,
Types of Tests, Types of Tests,
Types of Tests,
 
Test Assembling (writing and constructing)
Test Assembling (writing and constructing)Test Assembling (writing and constructing)
Test Assembling (writing and constructing)
 
Discrete point test 1
Discrete point test 1Discrete point test 1
Discrete point test 1
 
Testing writing
Testing writingTesting writing
Testing writing
 
Ethics my own effort
Ethics my own effortEthics my own effort
Ethics my own effort
 
Reliability and validity
Reliability and validityReliability and validity
Reliability and validity
 
Writing test
Writing testWriting test
Writing test
 
Test and types of tests
Test and types of testsTest and types of tests
Test and types of tests
 
Test Techniques
Test TechniquesTest Techniques
Test Techniques
 
Validity in Assessment
Validity in AssessmentValidity in Assessment
Validity in Assessment
 
Designing language test
Designing language testDesigning language test
Designing language test
 
test Administration
test Administrationtest Administration
test Administration
 
Experimental research
Experimental researchExperimental research
Experimental research
 
Types of test
Types of testTypes of test
Types of test
 

Similar to Interrater Reliability Made Easy

Rubric design workshop
Rubric design workshopRubric design workshop
Rubric design workshop
Lisa M. Snyder
 
Proving to improve - UA Summit of Deans Councils
Proving to improve - UA Summit of Deans CouncilsProving to improve - UA Summit of Deans Councils
Proving to improve - UA Summit of Deans Councils
Mark Freeman
 
EDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINALEDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINAL
Ellen Wagner
 
An ontologyforopenrubricexchangeontheweb
An ontologyforopenrubricexchangeonthewebAn ontologyforopenrubricexchangeontheweb
An ontologyforopenrubricexchangeontheweb
bpanulla
 
Check, Check, Check in the Simulation Lab
Check, Check, Check in the Simulation LabCheck, Check, Check in the Simulation Lab
Check, Check, Check in the Simulation Lab
ExamSoft
 
ADOVH Validity and Reliability of Online Assessments.pdf
ADOVH Validity and Reliability of Online Assessments.pdfADOVH Validity and Reliability of Online Assessments.pdf
ADOVH Validity and Reliability of Online Assessments.pdf
ADOVH-University of South Africa
 
Rubric For Improving The Quality Of Online Courses T Lt April27 09
Rubric For Improving The Quality Of Online Courses T Lt April27 09Rubric For Improving The Quality Of Online Courses T Lt April27 09
Rubric For Improving The Quality Of Online Courses T Lt April27 09
nelsond
 
Introduction to Designing Assessment Plans Workshop 1
Introduction to Designing Assessment Plans Workshop 1Introduction to Designing Assessment Plans Workshop 1
Introduction to Designing Assessment Plans Workshop 1
Lisa M. Snyder
 
FACT2 Learning Analytics Task Group (LATG) SCOA briefing
FACT2 Learning Analytics Task Group (LATG) SCOA briefingFACT2 Learning Analytics Task Group (LATG) SCOA briefing
FACT2 Learning Analytics Task Group (LATG) SCOA briefing
gketcham
 
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
Gary Matkin
 
Assessment
AssessmentAssessment
Assessment
NewportCELT
 
Graded Assessment – Myth Or Fact Ppt Jan 2k10
Graded Assessment – Myth Or Fact Ppt Jan 2k10Graded Assessment – Myth Or Fact Ppt Jan 2k10
Graded Assessment – Myth Or Fact Ppt Jan 2k10
KeithH66
 
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
guest3f9d24
 
Designing useful evaluations - An online workshop for the Jisc AF programme_I...
Designing useful evaluations - An online workshop for the Jisc AF programme_I...Designing useful evaluations - An online workshop for the Jisc AF programme_I...
Designing useful evaluations - An online workshop for the Jisc AF programme_I...
Rachel Harris
 
2010 ohio tif meeting creating a comprehensive teacher effectiveness system
2010 ohio tif meeting  creating a comprehensive teacher effectiveness system2010 ohio tif meeting  creating a comprehensive teacher effectiveness system
2010 ohio tif meeting creating a comprehensive teacher effectiveness system
Christopher Thorn
 
Curriculum Mapping
Curriculum MappingCurriculum Mapping
Curriculum Mapping
Bonner Foundation
 
Interview presentation.pptx
Interview presentation.pptxInterview presentation.pptx
Interview presentation.pptx
Dr. S. Dinesh Kumar
 
Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...
Beck Pitt
 
Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...
OER Hub
 
Moving Beyond Student Ratings to Evaluate Teaching
Moving Beyond Student Ratings to Evaluate TeachingMoving Beyond Student Ratings to Evaluate Teaching
Moving Beyond Student Ratings to Evaluate Teaching
Vicki L. Wise
 

Similar to Interrater Reliability Made Easy (20)

Rubric design workshop
Rubric design workshopRubric design workshop
Rubric design workshop
 
Proving to improve - UA Summit of Deans Councils
Proving to improve - UA Summit of Deans CouncilsProving to improve - UA Summit of Deans Councils
Proving to improve - UA Summit of Deans Councils
 
EDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINALEDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINAL
 
An ontologyforopenrubricexchangeontheweb
An ontologyforopenrubricexchangeonthewebAn ontologyforopenrubricexchangeontheweb
An ontologyforopenrubricexchangeontheweb
 
Check, Check, Check in the Simulation Lab
Check, Check, Check in the Simulation LabCheck, Check, Check in the Simulation Lab
Check, Check, Check in the Simulation Lab
 
ADOVH Validity and Reliability of Online Assessments.pdf
ADOVH Validity and Reliability of Online Assessments.pdfADOVH Validity and Reliability of Online Assessments.pdf
ADOVH Validity and Reliability of Online Assessments.pdf
 
Rubric For Improving The Quality Of Online Courses T Lt April27 09
Rubric For Improving The Quality Of Online Courses T Lt April27 09Rubric For Improving The Quality Of Online Courses T Lt April27 09
Rubric For Improving The Quality Of Online Courses T Lt April27 09
 
Introduction to Designing Assessment Plans Workshop 1
Introduction to Designing Assessment Plans Workshop 1Introduction to Designing Assessment Plans Workshop 1
Introduction to Designing Assessment Plans Workshop 1
 
FACT2 Learning Analytics Task Group (LATG) SCOA briefing
FACT2 Learning Analytics Task Group (LATG) SCOA briefingFACT2 Learning Analytics Task Group (LATG) SCOA briefing
FACT2 Learning Analytics Task Group (LATG) SCOA briefing
 
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
Beyond Accreditation and Standards: The Distance Educator’s Opportunity for L...
 
Assessment
AssessmentAssessment
Assessment
 
Graded Assessment – Myth Or Fact Ppt Jan 2k10
Graded Assessment – Myth Or Fact Ppt Jan 2k10Graded Assessment – Myth Or Fact Ppt Jan 2k10
Graded Assessment – Myth Or Fact Ppt Jan 2k10
 
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
E:\T&amp;La Special Projects\Computer Engineering &amp; Applied Science\Grade...
 
Designing useful evaluations - An online workshop for the Jisc AF programme_I...
Designing useful evaluations - An online workshop for the Jisc AF programme_I...Designing useful evaluations - An online workshop for the Jisc AF programme_I...
Designing useful evaluations - An online workshop for the Jisc AF programme_I...
 
2010 ohio tif meeting creating a comprehensive teacher effectiveness system
2010 ohio tif meeting  creating a comprehensive teacher effectiveness system2010 ohio tif meeting  creating a comprehensive teacher effectiveness system
2010 ohio tif meeting creating a comprehensive teacher effectiveness system
 
Curriculum Mapping
Curriculum MappingCurriculum Mapping
Curriculum Mapping
 
Interview presentation.pptx
Interview presentation.pptxInterview presentation.pptx
Interview presentation.pptx
 
Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...
 
Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...Assessing OER impact across varied organisations and learners: experiences fr...
Assessing OER impact across varied organisations and learners: experiences fr...
 
Moving Beyond Student Ratings to Evaluate Teaching
Moving Beyond Student Ratings to Evaluate TeachingMoving Beyond Student Ratings to Evaluate Teaching
Moving Beyond Student Ratings to Evaluate Teaching
 

More from Bonner Foundation

Fall Network Meeting Streamlining Operations .pdf
Fall Network Meeting Streamlining Operations .pdfFall Network Meeting Streamlining Operations .pdf
Fall Network Meeting Streamlining Operations .pdf
Bonner Foundation
 
FNM23 CEL and Change .pdf
FNM23 CEL and Change .pdfFNM23 CEL and Change .pdf
FNM23 CEL and Change .pdf
Bonner Foundation
 
Fall Network Meeting Career Connections.pdf
Fall Network Meeting Career Connections.pdfFall Network Meeting Career Connections.pdf
Fall Network Meeting Career Connections.pdf
Bonner Foundation
 
Best Practices - Building a Coalition of Student-Led Service Projects.pdf
Best Practices - Building a Coalition of Student-Led Service Projects.pdfBest Practices - Building a Coalition of Student-Led Service Projects.pdf
Best Practices - Building a Coalition of Student-Led Service Projects.pdf
Bonner Foundation
 
Fall Network Meeting Community Partnerships & Projects Session.pdf
Fall Network Meeting Community Partnerships & Projects Session.pdfFall Network Meeting Community Partnerships & Projects Session.pdf
Fall Network Meeting Community Partnerships & Projects Session.pdf
Bonner Foundation
 
Best Practices - Bonner Meetings.pptx
Best Practices - Bonner Meetings.pptxBest Practices - Bonner Meetings.pptx
Best Practices - Bonner Meetings.pptx
Bonner Foundation
 
Leveraging Data to Make the Case for Bonner Like Programs.pdf
Leveraging Data to Make the Case for Bonner Like Programs.pdfLeveraging Data to Make the Case for Bonner Like Programs.pdf
Leveraging Data to Make the Case for Bonner Like Programs.pdf
Bonner Foundation
 
BC'23 - Career Connections
BC'23 - Career ConnectionsBC'23 - Career Connections
BC'23 - Career Connections
Bonner Foundation
 
2023 Bonner Congress Opening Slides
2023 Bonner Congress Opening Slides2023 Bonner Congress Opening Slides
2023 Bonner Congress Opening Slides
Bonner Foundation
 
This is What Democracy Looks Like Powerbuilding -- Cali VanCleve
This is What Democracy Looks Like Powerbuilding -- Cali VanCleveThis is What Democracy Looks Like Powerbuilding -- Cali VanCleve
This is What Democracy Looks Like Powerbuilding -- Cali VanCleve
Bonner Foundation
 
SIT's Navigating Global Horizons.pdf
SIT's Navigating Global Horizons.pdfSIT's Navigating Global Horizons.pdf
SIT's Navigating Global Horizons.pdf
Bonner Foundation
 
Prioritizing Bonner How to Support the Student Journey (1).pptx
Prioritizing Bonner How to Support the Student Journey (1).pptxPrioritizing Bonner How to Support the Student Journey (1).pptx
Prioritizing Bonner How to Support the Student Journey (1).pptx
Bonner Foundation
 
Preparing a strong personal statement_fall_2023_grad_general.pptx
Preparing a strong personal statement_fall_2023_grad_general.pptxPreparing a strong personal statement_fall_2023_grad_general.pptx
Preparing a strong personal statement_fall_2023_grad_general.pptx
Bonner Foundation
 
Current Communication Apps and Their Uses in Bonner.pdf
Current Communication Apps and Their Uses in Bonner.pdfCurrent Communication Apps and Their Uses in Bonner.pdf
Current Communication Apps and Their Uses in Bonner.pdf
Bonner Foundation
 
'23 NSO - Cornerstones Activities
'23 NSO - Cornerstones Activities'23 NSO - Cornerstones Activities
'23 NSO - Cornerstones Activities
Bonner Foundation
 
'23 NSO - Accountability, Tracking, Visibility
'23 NSO - Accountability, Tracking, Visibility'23 NSO - Accountability, Tracking, Visibility
'23 NSO - Accountability, Tracking, Visibility
Bonner Foundation
 
'23 NSO - BSP Recruitment & Funding.pdf
'23 NSO - BSP Recruitment & Funding.pdf'23 NSO - BSP Recruitment & Funding.pdf
'23 NSO - BSP Recruitment & Funding.pdf
Bonner Foundation
 
'23 NSO - Community Partnerships
'23 NSO - Community Partnerships'23 NSO - Community Partnerships
'23 NSO - Community Partnerships
Bonner Foundation
 
'23 NSO - Campus-Wide Engagement
'23 NSO - Campus-Wide Engagement'23 NSO - Campus-Wide Engagement
'23 NSO - Campus-Wide Engagement
Bonner Foundation
 
'23 NSO - Other Foundation Initiatives & Support.pdf
'23 NSO - Other Foundation Initiatives & Support.pdf'23 NSO - Other Foundation Initiatives & Support.pdf
'23 NSO - Other Foundation Initiatives & Support.pdf
Bonner Foundation
 

More from Bonner Foundation (20)

Fall Network Meeting Streamlining Operations .pdf
Fall Network Meeting Streamlining Operations .pdfFall Network Meeting Streamlining Operations .pdf
Fall Network Meeting Streamlining Operations .pdf
 
FNM23 CEL and Change .pdf
FNM23 CEL and Change .pdfFNM23 CEL and Change .pdf
FNM23 CEL and Change .pdf
 
Fall Network Meeting Career Connections.pdf
Fall Network Meeting Career Connections.pdfFall Network Meeting Career Connections.pdf
Fall Network Meeting Career Connections.pdf
 
Best Practices - Building a Coalition of Student-Led Service Projects.pdf
Best Practices - Building a Coalition of Student-Led Service Projects.pdfBest Practices - Building a Coalition of Student-Led Service Projects.pdf
Best Practices - Building a Coalition of Student-Led Service Projects.pdf
 
Fall Network Meeting Community Partnerships & Projects Session.pdf
Fall Network Meeting Community Partnerships & Projects Session.pdfFall Network Meeting Community Partnerships & Projects Session.pdf
Fall Network Meeting Community Partnerships & Projects Session.pdf
 
Best Practices - Bonner Meetings.pptx
Best Practices - Bonner Meetings.pptxBest Practices - Bonner Meetings.pptx
Best Practices - Bonner Meetings.pptx
 
Leveraging Data to Make the Case for Bonner Like Programs.pdf
Leveraging Data to Make the Case for Bonner Like Programs.pdfLeveraging Data to Make the Case for Bonner Like Programs.pdf
Leveraging Data to Make the Case for Bonner Like Programs.pdf
 
BC'23 - Career Connections
BC'23 - Career ConnectionsBC'23 - Career Connections
BC'23 - Career Connections
 
2023 Bonner Congress Opening Slides
2023 Bonner Congress Opening Slides2023 Bonner Congress Opening Slides
2023 Bonner Congress Opening Slides
 
This is What Democracy Looks Like Powerbuilding -- Cali VanCleve
This is What Democracy Looks Like Powerbuilding -- Cali VanCleveThis is What Democracy Looks Like Powerbuilding -- Cali VanCleve
This is What Democracy Looks Like Powerbuilding -- Cali VanCleve
 
SIT's Navigating Global Horizons.pdf
SIT's Navigating Global Horizons.pdfSIT's Navigating Global Horizons.pdf
SIT's Navigating Global Horizons.pdf
 
Prioritizing Bonner How to Support the Student Journey (1).pptx
Prioritizing Bonner How to Support the Student Journey (1).pptxPrioritizing Bonner How to Support the Student Journey (1).pptx
Prioritizing Bonner How to Support the Student Journey (1).pptx
 
Preparing a strong personal statement_fall_2023_grad_general.pptx
Preparing a strong personal statement_fall_2023_grad_general.pptxPreparing a strong personal statement_fall_2023_grad_general.pptx
Preparing a strong personal statement_fall_2023_grad_general.pptx
 
Current Communication Apps and Their Uses in Bonner.pdf
Current Communication Apps and Their Uses in Bonner.pdfCurrent Communication Apps and Their Uses in Bonner.pdf
Current Communication Apps and Their Uses in Bonner.pdf
 
'23 NSO - Cornerstones Activities
'23 NSO - Cornerstones Activities'23 NSO - Cornerstones Activities
'23 NSO - Cornerstones Activities
 
'23 NSO - Accountability, Tracking, Visibility
'23 NSO - Accountability, Tracking, Visibility'23 NSO - Accountability, Tracking, Visibility
'23 NSO - Accountability, Tracking, Visibility
 
'23 NSO - BSP Recruitment & Funding.pdf
'23 NSO - BSP Recruitment & Funding.pdf'23 NSO - BSP Recruitment & Funding.pdf
'23 NSO - BSP Recruitment & Funding.pdf
 
'23 NSO - Community Partnerships
'23 NSO - Community Partnerships'23 NSO - Community Partnerships
'23 NSO - Community Partnerships
 
'23 NSO - Campus-Wide Engagement
'23 NSO - Campus-Wide Engagement'23 NSO - Campus-Wide Engagement
'23 NSO - Campus-Wide Engagement
 
'23 NSO - Other Foundation Initiatives & Support.pdf
'23 NSO - Other Foundation Initiatives & Support.pdf'23 NSO - Other Foundation Initiatives & Support.pdf
'23 NSO - Other Foundation Initiatives & Support.pdf
 

Recently uploaded

加急办理华威大学毕业证硕士文凭证书原版一模一样
加急办理华威大学毕业证硕士文凭证书原版一模一样加急办理华威大学毕业证硕士文凭证书原版一模一样
加急办理华威大学毕业证硕士文凭证书原版一模一样
uu1psyf6
 
Item # 10 -- Historical Presv. Districts
Item # 10 -- Historical Presv. DistrictsItem # 10 -- Historical Presv. Districts
Item # 10 -- Historical Presv. Districts
ahcitycouncil
 
Border towns and spaces of (in)visibility.pdf
Border towns and spaces of (in)visibility.pdfBorder towns and spaces of (in)visibility.pdf
Border towns and spaces of (in)visibility.pdf
Scalabrini Institute for Human Mobility in Africa
 
PAS PSDF Mop Up Workshop Presentation 2024 .pptx
PAS PSDF Mop Up Workshop Presentation 2024 .pptxPAS PSDF Mop Up Workshop Presentation 2024 .pptx
PAS PSDF Mop Up Workshop Presentation 2024 .pptx
PAS_Team
 
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC CharlotteA Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
Cori Faklaris
 
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHO
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHOMonitoring Health for the SDGs - Global Health Statistics 2024 - WHO
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHO
Christina Parmionova
 
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
OECDregions
 
Abiy Berehe - Texas Commission on Environmental Quality Updates
Abiy Berehe - Texas Commission on Environmental Quality UpdatesAbiy Berehe - Texas Commission on Environmental Quality Updates
Abiy Berehe - Texas Commission on Environmental Quality Updates
Texas Alliance of Groundwater Districts
 
World Food Safety Day 2024- Communication-toolkit.
World Food Safety Day 2024- Communication-toolkit.World Food Safety Day 2024- Communication-toolkit.
World Food Safety Day 2024- Communication-toolkit.
Christina Parmionova
 
PPT Item # 8&9 - Demolition Code Amendments
PPT Item # 8&9 - Demolition Code AmendmentsPPT Item # 8&9 - Demolition Code Amendments
PPT Item # 8&9 - Demolition Code Amendments
ahcitycouncil
 
CFYT Rolling Ads Dawson City Yukon Canada
CFYT Rolling Ads Dawson City Yukon CanadaCFYT Rolling Ads Dawson City Yukon Canada
CFYT Rolling Ads Dawson City Yukon Canada
pmenzies
 
Awaken new depths - World Ocean Day 2024, June 8th.
Awaken new depths - World Ocean Day 2024, June 8th.Awaken new depths - World Ocean Day 2024, June 8th.
Awaken new depths - World Ocean Day 2024, June 8th.
Christina Parmionova
 
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
Congressional Budget Office
 
Practical guide for the celebration of World Environment Day on june 5th.
Practical guide for the  celebration of World Environment Day on  june 5th.Practical guide for the  celebration of World Environment Day on  june 5th.
Practical guide for the celebration of World Environment Day on june 5th.
Christina Parmionova
 
About Potato, The scientific name of the plant is Solanum tuberosum (L).
About Potato, The scientific name of the plant is Solanum tuberosum (L).About Potato, The scientific name of the plant is Solanum tuberosum (L).
About Potato, The scientific name of the plant is Solanum tuberosum (L).
Christina Parmionova
 
PPT Item # 4 - 434 College Blvd. (sign. review)
PPT Item # 4 - 434 College Blvd. (sign. review)PPT Item # 4 - 434 College Blvd. (sign. review)
PPT Item # 4 - 434 College Blvd. (sign. review)
ahcitycouncil
 
Donate to charity during this holiday season
Donate to charity during this holiday seasonDonate to charity during this holiday season
Donate to charity during this holiday season
SERUDS INDIA
 
PPT Item # 7 - 231 Encino Avenue (sign. review)
PPT Item # 7 - 231 Encino Avenue (sign. review)PPT Item # 7 - 231 Encino Avenue (sign. review)
PPT Item # 7 - 231 Encino Avenue (sign. review)
ahcitycouncil
 
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
yemqpj
 
AHMR volume 10 number 1 January-April 2024
AHMR volume 10 number 1 January-April 2024AHMR volume 10 number 1 January-April 2024
AHMR volume 10 number 1 January-April 2024
Scalabrini Institute for Human Mobility in Africa
 

Recently uploaded (20)

加急办理华威大学毕业证硕士文凭证书原版一模一样
加急办理华威大学毕业证硕士文凭证书原版一模一样加急办理华威大学毕业证硕士文凭证书原版一模一样
加急办理华威大学毕业证硕士文凭证书原版一模一样
 
Item # 10 -- Historical Presv. Districts
Item # 10 -- Historical Presv. DistrictsItem # 10 -- Historical Presv. Districts
Item # 10 -- Historical Presv. Districts
 
Border towns and spaces of (in)visibility.pdf
Border towns and spaces of (in)visibility.pdfBorder towns and spaces of (in)visibility.pdf
Border towns and spaces of (in)visibility.pdf
 
PAS PSDF Mop Up Workshop Presentation 2024 .pptx
PAS PSDF Mop Up Workshop Presentation 2024 .pptxPAS PSDF Mop Up Workshop Presentation 2024 .pptx
PAS PSDF Mop Up Workshop Presentation 2024 .pptx
 
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC CharlotteA Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
A Guide to AI for Smarter Nonprofits - Dr. Cori Faklaris, UNC Charlotte
 
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHO
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHOMonitoring Health for the SDGs - Global Health Statistics 2024 - WHO
Monitoring Health for the SDGs - Global Health Statistics 2024 - WHO
 
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
Preliminary findings _OECD field visits to ten regions in the TSI EU mining r...
 
Abiy Berehe - Texas Commission on Environmental Quality Updates
Abiy Berehe - Texas Commission on Environmental Quality UpdatesAbiy Berehe - Texas Commission on Environmental Quality Updates
Abiy Berehe - Texas Commission on Environmental Quality Updates
 
World Food Safety Day 2024- Communication-toolkit.
World Food Safety Day 2024- Communication-toolkit.World Food Safety Day 2024- Communication-toolkit.
World Food Safety Day 2024- Communication-toolkit.
 
PPT Item # 8&9 - Demolition Code Amendments
PPT Item # 8&9 - Demolition Code AmendmentsPPT Item # 8&9 - Demolition Code Amendments
PPT Item # 8&9 - Demolition Code Amendments
 
CFYT Rolling Ads Dawson City Yukon Canada
CFYT Rolling Ads Dawson City Yukon CanadaCFYT Rolling Ads Dawson City Yukon Canada
CFYT Rolling Ads Dawson City Yukon Canada
 
Awaken new depths - World Ocean Day 2024, June 8th.
Awaken new depths - World Ocean Day 2024, June 8th.Awaken new depths - World Ocean Day 2024, June 8th.
Awaken new depths - World Ocean Day 2024, June 8th.
 
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
CBO’s Outlook for U.S. Fertility Rates: 2024 to 2054
 
Practical guide for the celebration of World Environment Day on june 5th.
Practical guide for the  celebration of World Environment Day on  june 5th.Practical guide for the  celebration of World Environment Day on  june 5th.
Practical guide for the celebration of World Environment Day on june 5th.
 
About Potato, The scientific name of the plant is Solanum tuberosum (L).
About Potato, The scientific name of the plant is Solanum tuberosum (L).About Potato, The scientific name of the plant is Solanum tuberosum (L).
About Potato, The scientific name of the plant is Solanum tuberosum (L).
 
PPT Item # 4 - 434 College Blvd. (sign. review)
PPT Item # 4 - 434 College Blvd. (sign. review)PPT Item # 4 - 434 College Blvd. (sign. review)
PPT Item # 4 - 434 College Blvd. (sign. review)
 
Donate to charity during this holiday season
Donate to charity during this holiday seasonDonate to charity during this holiday season
Donate to charity during this holiday season
 
PPT Item # 7 - 231 Encino Avenue (sign. review)
PPT Item # 7 - 231 Encino Avenue (sign. review)PPT Item # 7 - 231 Encino Avenue (sign. review)
PPT Item # 7 - 231 Encino Avenue (sign. review)
 
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
在线办理(ISU毕业证书)爱荷华州立大学毕业证学历证书一模一样
 
AHMR volume 10 number 1 January-April 2024
AHMR volume 10 number 1 January-April 2024AHMR volume 10 number 1 January-April 2024
AHMR volume 10 number 1 January-April 2024
 

Interrater Reliability Made Easy

  • 1. "Interrater Reliability" Made Easy! An Introduction and Practice with Using Rubrics for Assessment Raymond Barclay, PhD Blake Stack Bonner Foundation Summer Leadership Institute at Lindsey Wilson College May 25, 2017
  • 2. Enrollment x Design, LLC Agenda • Rubric Overview – Why and Types • Implementation o Measurement Dimensions o Scaling o Scoring o Student Engagement o Calibration o Reliability • Review of University of Richmond’s civic engagement rubrics focused on Presentations of Learning • Calibration & Reliability Session(s) • Debrief 5/25/2017 2
  • 3. Enrollment x Design, LLC What is a rubric? “a set of criteria specifying the characteristics of an outcome and the levels of achievement in each characteristic.” • SOURCE: Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. “A rubric is a scoring guide composed of criteria used to evaluate performance, a product, or a project. For instructors and students alike, a rubric defines what will be assessed. They enable students to identify what the instructor expects from their assignment submission. It allows evaluation according to specified criteria, making grading and ranking simpler, fairer and more transparent.” • SOURCE: University of Texas-Austin Faculty Innovation Center (https:// facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 3
  • 4. Enrollment x Design, LLC Blake Stack Bio • University of Richmond • Bonner Center for Civic Engagement • Coordinator, Bonner Scholars Program • B.S. Business Administration / B.S. Religious Studies (Cairn ’05) 5/25/2017 4
  • 5. Enrollment x Design, LLC Raymond Barclay - Bio Education (First Generation College Student): • PhD, Psychology (Measurement/Statistics and Cognition), Temple University, School of Education • Specialization in Design Thinking – University of Virginia, Graduate School of Business • Design Thinking & Charrettes – Harvard Univ.-Graduate School of Design • MS , Sustainable Design – Thomas Jefferson Univ./Univ. of Philadelphia, College of Architecture and Built Design (in progress) • MDIV, Princeton Theological Seminary • Philosophy & Religious Studies - Indiana University of Pennsylvania Published in the following areas: • Online learning, assessment and learning, strategic planning, resiliency, survey development, clinical psychology, statistical methods, (hierarchical linear analysis, multivariate analysis, cluster and factor analysis) Current Role - President – Enrollment x Design, LLC (NJ) – present Prior Roles • Associate Vice President/Associate Provost/Associate Vice Chancellor Roles o The New School (NY) o Stetson University (FL) o University of North Carolina (NC) o College of Charleston (SC) • Director Roles o The College of New Jersey (NJ) o Burlington County College (NJ) o The Bonner Foundation (NJ) • Senior Analyst Roles o Arroyo Research Services (NC) - K-16 consulting/ evaluation firm) o Rowan Univ./Burlington County Community College (NJ) 5/25/2017 5
  • 6. Enrollment x Design, LLC Strengths & Limits of Rubric? Strengths • Creates objectivity and consistency across all students • Clarifies grading criteria in specific terms for performance or product • Shows expectations and how work will be evaluated • Promotes students' awareness and provides benchmarks to improve their performance or product Limitations • Creating effective rubrics is time consuming • Cannot measure all aspects of student learning • May require additional feedback after students receive their score SOURCE: University of Texas-Austin Faculty Innovation Center (https:// facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 6
  • 7. Enrollment x Design, LLC Why use a rubric? • Provides both qualitative descriptions of student learning and quantitative results • Clearly communicates expectations to students • Provides consistency in evaluation • Simultaneously provides student feedback and programmatic feedback • Allows for timely and detailed feedback • Promotes colleague collaboration • Helps us refine practice 5/25/2017 7
  • 8. Enrollment x Design, LLC Types of rubrics? Analytic* Analytic rubrics articulate levels of performance for each criteria used to assess student learning. • Advantages • Provides vehicle for more detailed feedback on areas of strength and weakness. • Scoring is more consistent across students and graders when compared to other approaches. • Criterion can be weighted to reflect the relative importance of each dimension. • Disadvantages • Takes more time to create and use than a holistic rubric. • Unless each point for each criterion is well-defined raters may not arrive at the same score *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 8
  • 9. Enrollment x Design, LLC Analytic Example 1: Undergraduat e Research Project with Weightings Source: http://ias.virginia.edu/assessment/outcomes/tools/rubrics 5/25/2017 9
  • 10. Enrollment x Design, LLC Analytic Example 2: Undergraduat e Student Employee SLO Source: http://studentaffairs.stonybrook.edu/assessment/selo/index.html 5/25/2017 10
  • 11. Enrollment x Design, LLC Holistic Example 1: Essay Writing Source: http://fdc.umbc.edu/files/2013/01/SAMPLE-HOLISTIC-RUBRIC-FOR-ESSAYS.pdf 5/25/2017 11
  • 12. Enrollment x Design, LLC Holistic Example 2: Critical Thinking Source: http://teaching.temple.edu/sites/tlc/files/resource/pdf/ Holistic%20Critical%20Thinking%20Scoring%20Rubric.v2%20[Accessible].pdf 5/25/2017 12
  • 13. Enrollment x Design, LLC Types of rubrics? Holistic* A holistic rubric consists of a single scale with all criteria to be included in the evaluation being considered together. Advantages • Emphasis on what the learner is able to demonstrate, rather than what s/he cannot do. • Saves time by minimizing the number of decisions raters make. • Can be applied consistently by trained raters increasing reliability. Disadvantages • Does not provide specific feedback for improvement. • When student work is at varying levels spanning the criteria points it can be difficult to select the single best description. • Criteria cannot be weighted. *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 13
  • 14. Enrollment x Design, LLC Things to Consider in Developing a Rubric (see resources at end of ppt for more information) • Have you consulted the plethora of professional literature and online resources? oThere are a variety of subject areas that have been refined using professional standards and empirical research oThere are many classroom-tested rubrics assessed by instructors and their students • Can adapt the criteria, rating scale, and indicators to your needs? oWhether adapting or designing a rubric from scratch, the developmental process is the same and you must identify the basic components of your rubric: ➢(a) the performance criteria, (b) the rating scale, and ( c) the indicators of performance. SOURCE: University of Texas-Austin Faculty Innovation Center (https:// facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 14
  • 15. Enrollment x Design, LLC Criteria: Evidence of Student Learning • Whether the product is related to an essay, a research or applied project, or a presentation the evidence of learning or thinking must be specified • The evidence will drive the selection of the components that are most important to evaluate relative to a given task within a specified instructional context. Components = Criteria • Key questions to help prioritize the criteria: ➢Which of the proposed criteria are non-negotiable? ➢What are the learning outcomes broadly or relative to specific program? ➢Which learning outcomes will be specified within the rubric? ➢Are there skills that are essential to declare the student is competent or has a certain proficiency levels for the task or assignment to be complete? ➢How important is it for the student to complete the task or project (interest, logic, organization, creativity) to demonstrate this proficiency level? ➢Are there process and product expectations? SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/ check-learning/rubrics) 5/25/2017 15
  • 16. Enrollment x Design, LLC Implementation Steps
  1. Identify the outcome
  2. Determine how you will collect the evidence
  3. Develop the rubric based on observation criteria (anchors)
  4. Train the evaluators on how to use the rubric
  5. Test the rubric against examples
  6. Revise as needed
  7. Collect the results of scoring and report out
  *http://manoa.hawaii.edu/assessment/workshops/pdf/Rubrics_in_program_assessment_ppt_2013-10.pdf 5/25/2017 16
  • 17. Enrollment x Design, LLC Choosing Measurement Dimensions*
  • Measurement Goals: List the measurement dimensions you want a student to be able to demonstrate that are relevant to the curriculum/activities.
  • Face Validity: Discuss the proposed measurement dimensions with others to subjectively assess whether the rubric measures what it purports to measure.
  • Parsimony: Edit content to make sure each measurement dimension is concise and clear while ensuring the required breadth of coverage (>=3 & <=8).
  *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 17
  • 18. Enrollment x Design, LLC Writing Descriptors*
  1. Describe each level of mastery for each characteristic
  2. Describe the best work you could expect
  3. Describe an unacceptable product
  4. Develop descriptions of intermediate-level products for intermediate categories
  5. Each description and each category should be mutually exclusive
  6. Be specific and clear; reduce subjectivity
  *University of Florida Institutional Assessment: Writing Effective Rubrics 5/25/2017 18
  • 19. Enrollment x Design, LLC Rubric Development & Use is a Practiced Art Form!*
  Conduct Training → Rater Practice! → Rater Discussions & Negotiation → Rubric Iteration
  *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 19
  • 20. Enrollment x Design, LLC Pick your Scaling Approach (“indicators”) - I*
  Competency:
  a. Beginner, Developing, Accomplished
  b. Marginal, Proficient, Exemplary
  c. Novice, Intermediate, Proficient, Distinguished
  d. Not Yet Competent, Partly Competent, Competent, Sophisticated
  Frequency of Behavior:
  a. Never, Rarely, Occasionally, Often, Always
  b. Never, Once, Twice, Three times, Four times
  c. Never, 1-3x, 4-6x, 5-7x…
  Extent to Which Performed:
  a. Not at all, Slightly, Moderately, Considerably, A great deal
  b. Yes/No
  c. Met, Partially Met, Not Met
  *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 20
  • 21. Enrollment x Design, LLC Pick your Scaling Approach (“indicators”) - II*
  Task Requirements: 4 = All | 3 = Most | 2 = Some | 1 = Very few or none
  Frequency: 4 = Always | 3 = Usually | 2 = Some of the time | 1 = Rarely or not at all
  Accuracy: 4 = No errors | 3 = Few errors | 2 = Some errors | 1 = Frequent errors
  Comprehensibility: 4 = Always comprehensible | 3 = Almost always comprehensible | 2 = Gist and main ideas are comprehensible | 1 = Isolated bits are comprehensible
  Content Coverage: 4 = Fully developed, fully supported | 3 = Adequately developed, adequately supported | 2 = Partially developed, partially supported | 1 = Minimally developed, minimally supported
  Vocabulary Range: 4 = Broad | 3 = Adequate | 2 = Limited | 1 = Very limited
  Variety: 4 = Highly varied, non-repetitive | 3 = Varied, occasionally repetitive | 2 = Lacks variety, repetitive | 1 = Basic, memorized, highly repetitive
  *SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 21
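  One way to keep a scale like this consistent across raters and terms is to store it in a structured form that scoring sheets and reports can reuse. A minimal Python sketch, abbreviated to three of the criteria above (the code structure is an illustration, not part of the UT-Austin source):

```python
# Sketch of encoding a four-level analytic scale as a lookup table.
# Only three criteria are shown for brevity; descriptors come from the table above.
scale = {
    "Task Requirements": {4: "All", 3: "Most", 2: "Some", 1: "Very few or none"},
    "Accuracy": {4: "No errors", 3: "Few errors", 2: "Some errors", 1: "Frequent errors"},
    "Vocabulary Range": {4: "Broad", 3: "Adequate", 2: "Limited", 1: "Very limited"},
}

def descriptor(criterion, level):
    """Return the performance descriptor for a criterion at a given scale level."""
    return scale[criterion][level]

print(descriptor("Accuracy", 3))  # "Few errors"
```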
  • 22. Enrollment x Design, LLC Things to Remember about Scaling (“Indicators”)
  • What is the ideal assessment for each criterion?
  o Begin with the highest level of the scale to define top-quality performance.
  o Work backward to lower performance levels.
  • Ensure continuity in the differences between the criteria (e.g., exceeds vs. meets, and meets vs. does not meet expectations).
  o The difference between a 2 and a 3 performance should not be more than the difference between a 3 and a 4 performance.
  • Edit the indicators to ensure that the levels reflect variance in quality and not a shift in importance of the criteria.
  • Make certain that the indicators reflect equal steps along the scale.
  o The difference between 4 and 3 should be equivalent to the difference between 3 and 2, and between 2 and 1.
  o “Yes, and more,” “Yes,” “Yes, but,” and “No” are ways for the rubric developer to think about how to describe performance at each scale point.
  SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 22
  • 23. Enrollment x Design, LLC MetaRubrics: Campus Labs Example *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 23
  • 24. Enrollment x Design, LLC Scoring Guidelines
  1. The grader(s) should be trained in the proper use of the rubric.
  2. Use multiple graders, if possible, to score student work in order to gain greater reliability.
  3. If different graders are used, make every effort to ensure that they are as consistent as possible in their scoring by providing adequate training and examples.
  4. If working alone, or without examples, you can achieve a greater level of internal consistency by giving preliminary ratings to students’ work.
  • Through this approach, clusters of similar quality will soon develop.
  • After establishing a firm scoring scheme, re-grade all students’ work to assure greater internal consistency and fairness.
  SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 24
  • 25. Enrollment x Design, LLC Students & Your Rubric
  Development
  • Include students in the revision and/or development process.
  o When students are involved, the assignment itself becomes more meaningful.
  Use
  • Share the rubric with students before they complete the assignment.
  o This establishes the level of performance expected, which increases the likelihood that they will meet those standards.
  SOURCE: University of Texas-Austin Faculty Innovation Center (https://facultyinnovate.utexas.edu/teaching/check-learning/rubrics) 5/25/2017 25
  • 26. Enrollment x Design, LLC Calibration of Rubrics – 12 Steps*
  1. Make copies of the rubric for each rater
  2. Identify representative student works for each level of performance:
  • Case a: 1 – not met; 2 – met; 1 – exceeded
  • Case b: 2 – not met; 2 – met; 2 – exceeded
  3. Provide copies of student work with identifiers removed
  4. Provide a scoring sheet
  5. Facilitator explains the SLO and the rubric
  6. Each rater independently scores student work
  7. Group discussion of each student work
  8. Reach consensus on a score for each work
  9. Recalibrate after 3 hours or at the beginning of each rating session
  10. Check inter-rater consistency (a simple agreement check is sketched after this slide)
  11. Present results in a meaningful and clean manner
  12. Use results
  *http://manoa.hawaii.edu/assessment/workshops/pdf/Rubrics_in_program_assessment_ppt_2013-10.pdf 5/25/2017 26
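  For step 10, one simple consistency check during a calibration session is the share of student works on which every rater in the panel gave exactly the same score. A minimal sketch, assuming hypothetical scores from three raters on six work samples (the data are illustrative, not from the Manoa source):

```python
# Sketch of a percent-exact-agreement check for a calibration session.
# Rows = raters, columns = student work samples; all scores are hypothetical.
scores = [
    [2, 3, 1, 3, 2, 3],  # rater A
    [2, 3, 2, 3, 2, 3],  # rater B
    [2, 3, 1, 3, 1, 3],  # rater C
]

def exact_agreement_rate(scores):
    """Share of samples on which all raters assigned the identical score."""
    n_samples = len(scores[0])
    agreements = sum(
        1 for i in range(n_samples)
        if len({rater[i] for rater in scores}) == 1  # one unique score = full agreement
    )
    return agreements / n_samples

print(f"{exact_agreement_rate(scores):.0%}")  # 67% for the data above
```

  A low rate signals that the descriptors need discussion and possibly another rubric iteration before scoring continues.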
  • 27. Enrollment x Design, LLC Reliability!*
  A. Inter-rater reliability: between-rater consistency
  o Inter-rater agreement: how many pairs of raters gave exactly the same score?
  o Inter-rater reliability: what is the correlation between rater 1 and rater 2? (EXCEL FORMULA: =CORREL(array1,array2))
  o Affected by: initial starting point or approach to the scale (assessment tool); interpretation of descriptions; domain/content knowledge
  B. Intra-rater reliability: within-rater consistency
  o Affected by internal factors (mood, fatigue, attention) and external factors (order of evidence, time of day, other situations)
  o Applies to both multiple-rater and single-rater situations
  *Levy, J.D. Campus Labs: Data Driven Innovation. Using Rubrics in student affairs: A direct assessment of learning. 5/25/2017 27
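  The Excel CORREL formula above computes the Pearson correlation between two raters' score columns. The same check can be run outside Excel; a minimal Python sketch with hypothetical score lists:

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical rubric scores from two raters on the same ten artifacts.
rater_1 = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
rater_2 = [3, 4, 2, 2, 1, 4, 3, 3, 4, 3]

# Inter-rater reliability: equivalent to Excel's =CORREL(array1, array2).
print(round(correlation(rater_1, rater_2), 2))

# Inter-rater agreement: how many pairs of scores match outright?
matches = sum(a == b for a, b in zip(rater_1, rater_2))
print(f"{matches}/{len(rater_1)} exact matches")
```

  Reporting both numbers is useful because raters can be highly correlated (they rank work the same way) while still disagreeing on the exact score they assign.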
  • 28. Enrollment x Design, LLC Assessment Scenario #1 Written Reflections 5/25/2017 28
  • 29. Enrollment x Design, LLC Assessment Scenario #1 Written Reflection Instructions:
  1. After forming groups, review the handout entitled “Goals, Prompts, and Rubrics: Written Reflection” (5 min)
  2. Take time to score the first reflection individually. (Write this down.) Then discuss with your group and agree upon a group score. (5 min)
  3. Repeat step 2 with two subsequent written reflections. (10 min)
  5/25/2017 29
  • 30. Enrollment x Design, LLC Assessment Scenario #2 Senior Presentation 5/25/2017 30
  • 31. Enrollment x Design, LLC Assessment Scenario #2 Senior Presentation Instructions:
  1. Break into pairs.
  2. Review the handout entitled “Goals, Prompts, and Rubrics: Senior Presentation” (5 min)
  3. While viewing the presentation:
  • One person writes down quotes
  • One person looks at the rubric and attempts to determine a rating
  4. Afterwards, pairs compare and discuss, and make a final designation for the rubric rating
  5/25/2017 31
  • 32. Enrollment x Design, LLC Resources: Rubrics 5/25/2017 32
  • 33. Enrollment x Design, LLC AAC&U VALUE RUBRICS (16)
  • Intellectual and Practical Skills
  o Inquiry and analysis
  o Critical thinking
  o Creative thinking
  o Written communication
  o Oral communication
  o Reading
  o Quantitative literacy
  o Information literacy
  o Teamwork
  o Problem solving
  • Personal and Social Responsibility
  o Civic engagement—local and global
  o Intercultural knowledge and competence
  o Ethical reasoning
  o Foundations and skills for lifelong learning
  o Global learning
  • Integrative and Applied Learning
  o Integrative learning
  SOURCE: https://www.aacu.org/value-rubrics 5/25/2017 33
  • 34. Enrollment x Design, LLC RCampus 5/25/2017 34
  • 35. Enrollment x Design, LLC Rubrics for evaluating dissertations:
  • Focus groups with 272 faculty, 74 departments, 10 disciplines, 9 research universities
  • Experience = 3,470 dissertations, 9,890 committees
  Lovitts, B. E. (2007). Making the Implicit Explicit: Creating Performance Expectations for the Dissertation. Stylus Publishing, LLC. 5/25/2017 35
  • 36. Enrollment x Design, LLC University of Hawaii – Manoa: Rubric Bank
  • Includes all VALUE rubrics, plus rubrics for:
  o Collaboration, teamwork, participation
  o Critical thinking, creative thinking
  o Ethical deliberation
  o Information literacy
  o Reflection/Metacognition
  o Oral communication
  o Writing
  o Project design
  o Assessing assessment
  • https://manoa.hawaii.edu/assessment/resources/rubricbank.htm 5/25/2017 36