UNDERSTANDING BY DESIGN
CRITERIA AND VALIDITY
By NKWELLE AND AZIFUET
FOCUS
Students can escape bad teaching; they can't escape assessment.
— Boud (1995), in Race and Brown (2001), p. 33
Criteria set the standards for whatever you are evaluating. Imagine we are evaluating a restaurant or hotel: look at the reviews on popular websites. They identify criteria such as 'cleanliness' and 'friendliness', and they often use rating scales from 1-5 or 1-10. You can also add descriptors to each criterion, e.g. 'All tables and chairs are cleaned after use' and 'Staff are welcoming', then add further descriptors to identify the levels down the scale. The clearer your criteria are, the easier it is for evaluators to make 'objective' judgements about how good something or someone is.
THE NEED FOR
CRITERIA
#Unlockyourpotential
What is the meaning of criteria?
A criterion is a standard used for judging something or for making a decision about it; 'criteria' is the plural.
The purpose of evaluation criteria is twofold: i) everyone doing the evaluating knows what standards and definitions they are using for the task, so that evaluations produced by different people can be compared directly; ii) the person being evaluated knows what they need to do in order to be successful when they or their work are evaluated.
WHAT ARE
CRITERIA?
#Unlockyourpotential
 A rubric is a criterion-based scoring guide consisting of a
fixed measurement scale and description of the
characteristics for each score point.
 Rubrics describe degrees of quality, proficiency or
understanding along a continuum.
 If the assessment response needs only a YES/NO type of
answer, a CHECKLIST is used instead of a rubric.
FROM
CRITERIA TO
RUBRIC
#Unlockyourpotential
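The structural difference between a rubric and a checklist can be sketched in a few lines of code. This is a minimal illustration with invented criteria and items, not a real scoring system: a rubric maps each score point on a fixed scale to a descriptor, while a checklist only records YES/NO answers.

```python
# Hypothetical sketch: rubric vs. checklist (all criteria and items invented).
# A rubric: criterion -> {score point: descriptor for that level}.
rubric = {
    "cleanliness": {
        3: "All tables and chairs are cleaned after every use",
        2: "Tables are cleaned, but not consistently after each use",
        1: "Tables are visibly dirty",
    },
}

# A checklist: item -> YES/NO, used when no degrees of quality are needed.
checklist = {
    "title page included": True,
    "references listed": False,
}

def checklist_score(items):
    """A checklist simply counts the YES answers."""
    return sum(1 for done in items.values() if done)

print(checklist_score(checklist))  # 1
```

The nested-dictionary shape is the key point: a rubric carries a descriptor per score point, so it can express *degrees* of quality; a checklist cannot.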
 Rubrics help students, parents and teachers identify what quality work is. Students can judge their own work and accept more responsibility for the final product.
 Rubrics make it easy for the teacher to explain to students why they received the grade they did.
 Parents who work with their children also gain a clear understanding of what is expected for a given project. Ask for a rubric to be sent home with project directions or other assignments such as writing.
RUBRICS TO
ASSESS
UNDERSTANDING
#Unlockyourpotential
 Step 1: Identify your grading criteria.
What are the intended outcomes for the WORK? What do
you want students to do or demonstrate? What are the
primary dimensions (note: these are often referred to as
“traits” or as “criteria”) that count in the evaluation? Try
writing each one as a noun or noun phrase—for example,
“Insights and ideas that are central to the assignment”;
“Address of audience”; “Logic of organization”; “Integration of
source materials.”
DESIGNING AND
REFINING
RUBRICS
#Unlockyourpotential
 Step 2: Describe the levels of success for each criterion
Consider what attributes you will be able to identify (both
those you want to see and those you do not) in your
students’ product, process or performance. Specify the
characteristics, skills or behaviors that you will be looking for,
as well as common mistakes you do not want to see.
DESIGNING AND
REFINING
RUBRICS
#Unlockyourpotential
 Step 3: Brainstorm excellent, passable and not acceptable
characteristics
When criteria have been identified and performance levels described, decisions should be made about their varying importance relative to each other.
Consider the attributes you have identified in step two and categorize
them into excellent, passable and not acceptable.
What standard would you expect for a top mark?
What standard do you expect to pass?
What characteristics are not acceptable?
If desired, fill in the gap between passable and excellent.
DESIGNING AND
REFINING
RUBRICS
#Unlockyourpotential
 Step 4: Test and moderate your rubric
Try your rubric with samples of student work to check that
the outcome from the rubric reflects the quality of the work.
Collaborate with your assessors and peers to test and review
your rubric. Are there points of confusion or disagreement in
using your rubric for making judgements?
 Step 5: Revise the rubric, as necessary
Be prepared to reflect on the effectiveness of the rubric and
revise it before its next implementation.
DESIGNING AND
REFINING
RUBRICS
#Unlockyourpotential
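Steps 1-4 above can be sketched as a small data structure plus a completeness check. This is an illustrative sketch with invented criteria and level names, not a prescribed format: Step 1 names the criteria as noun phrases, Steps 2-3 attach a descriptor to each performance level, and the helper supports Step 4 by flagging criteria whose levels are not fully described.

```python
# Illustrative sketch of the rubric-design steps (criteria and wording invented).
LEVELS = ["not acceptable", "passable", "excellent"]  # Step 3 categories

rubric = {
    # Step 1: criteria written as noun phrases.
    "Logic of organization": {
        # Steps 2-3: a descriptor for each level of success.
        "excellent": "Ideas progress in a clear, deliberate order",
        "passable": "Order is mostly clear, with occasional jumps",
        "not acceptable": "No discernible organization",
    },
    "Integration of source materials": {
        "excellent": "Sources are woven into the argument and cited",
        "passable": "Sources are present but loosely connected",
        "not acceptable": "No sources are used",
    },
}

def incomplete_criteria(rubric, levels=LEVELS):
    """Step 4 aid: list criteria missing a descriptor for any level."""
    return [c for c, descriptors in rubric.items()
            if set(descriptors) != set(levels)]

print(incomplete_criteria(rubric))  # [] -> every criterion covers every level
```

A check like this is only a mechanical aid; Step 4's real test — trying the rubric against samples of student work with colleagues — still has to be done by people.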
TWO TYPES OF
RUBRICS
There are generally two types of
rubrics used to judge student
products and performances:
Holistic and Analytic rubrics
 Single-criterion rubrics (one-dimensional) used to
assess participants' overall achievement on an
activity or item based on predefined achievement
levels;
 performance descriptions are written in
paragraphs and usually in full sentences.
HOLISTIC
RUBRIC
#Unlockyourpotential
HOLISTIC RUBRIC EXAMPLE
#Unlockyourpotential
 Two-dimensional rubrics with levels of achievement as
columns and assessment criteria as rows. Allows you to
assess participants' achievements based on multiple
criteria using a single rubric. You can assign different
weights (value) to different criteria and include an overall
achievement by totaling the criteria;
 written in a table form.
ANALYTIC
RUBRIC
#Unlockyourpotential
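The weighted-total idea for an analytic rubric can be shown in a few lines. The criteria, weights, and scores below are invented for illustration: each criterion (a row of the rubric) receives a score on the achievement scale (the columns), and the weights express each criterion's relative importance in the overall result.

```python
# Minimal sketch of an analytic rubric's weighted total (numbers invented).
weights = {"content": 0.5, "organization": 0.3, "mechanics": 0.2}
scores  = {"content": 4, "organization": 3, "mechanics": 2}  # on a 1-4 scale

# Overall achievement = sum of (weight x score) across criteria.
overall = sum(weights[c] * scores[c] for c in weights)
print(round(overall, 2))  # 3.3
```

With equal weights this collapses to a simple average; unequal weights let the rubric say, for example, that content matters more than mechanics without changing the scale itself.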
ANALYTIC RUBRIC EXAMPLE
#Unlockyourpotential
ENGLISH HOMEWORK
Pamela gave her students homework: write an essay
on how their favorite food is prepared.
Michael, a student in her class, instead wrote an
essay on his favorite TV show.
Using Miss Pamela's rubric, do you think it would be
right to give Michael any grade? If yes, on what
components; if no, why not?
EXAMPLES OF RUBRIC
THINK, PAIR, SHARE
#Unlockyourpotential
MATHEMATICS TEST QUESTION
Solve the equation 2x² − 5 = 45 [3]
STUDENT SOLUTION
2x − 5 = 45
2x = 50
x = 25
EXAMPLES OF RUBRIC
THINK, PAIR, SHARE
#Unlockyourpotential
Mark scheme/Rubric
2x² = 50 [1]
x² = 25 [1]
x = ±5 [1]
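Substituting candidate answers back into the equation shows at a glance why the mark scheme's answer is right and the student's is not, which is one quick way to test a mark scheme against student work:

```python
# Check answers to 2x^2 - 5 = 45 by direct substitution.
def satisfies(x):
    return 2 * x**2 - 5 == 45

print(satisfies(5))   # True  -> mark-scheme answer
print(satisfies(-5))  # True  -> the other root of the quadratic
print(satisfies(25))  # False -> the student's answer (from solving 2x - 5 = 45)
```

The check also surfaces something the step-by-step scheme can miss: a quadratic has two roots, so x = −5 deserves the same credit as x = 5.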
SHOULD A RUBRIC BE REDESIGNED AND
REFINED IN LIGHT OF STUDENT WORK?
QUESTION OF THE
DAY
#Unlockyourpotential
WHAT IS VALIDITY: Assessment validity refers to the extent to which a test measures what it is supposed
to measure (i.e., the accuracy of a test relative to your objective). When testing for validity, do not rely
on a single assessment exercise; use a variety of approaches.
 Types of validity
• Content validity: evaluates how well a test covers all relevant parts of the topic it aims to measure.
The content should match what you are trying to measure: the way you train learners should be the
way you test them. If you train them to define, do not expect them to demonstrate analytical skills.
Content validity answers questions like: Is the test fully representative of what it aims to measure?
Does it include all the right items?
VALIDITY IN ASSESSMENT
• Criterion validity: evaluates how accurately a test measures the
outcome it was designed to measure. It is divided into predictive and
concurrent validity. For instance, a test shows predictive validity when
scores on a test of a given characteristic correlate with a student's
future performance.
• Construct validity: evaluates whether a measurement tool really
represents the thing we are interested in measuring. Does the test
measure the concept it is intended to measure?
 There are two main types of construct validity.
• Convergent validity: the extent to which your measure corresponds
to measures of related constructs, e.g. logic and maths ability.
• Divergent validity: the extent to which your measure is unrelated to
measures of distinct constructs, e.g. logical ability versus an
unrelated trait.
Types of validity
• Reliability is the overall consistency of a measure. It concerns the
extent to which an experiment, test, or any measuring
procedure yields the same results on repeated trials. A measure
is said to have high reliability if it produces similar results
under consistent conditions.
• In constructing a good test you have to follow certain criteria,
standards or rules.
• The most important criteria in measuring the characteristics of
a good test are validity and reliability.
Question : when do we say a test is valid and reliable?
Is a reliable test a valid test?
• An easy way to think about this concept is with a bullseye
metaphor: The very center of the bullseye is exactly what we
assess.
Reliability
Question: is it possible to have something with high validity
(accuracy) and low reliability (consistency)?
Is a valid test a reliable test?
Validity and Reliability
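The bullseye metaphor can be made concrete with a small numeric sketch. The scores below are invented: reliability shows up as a low spread across repeated measurements, and validity as the average landing close to the true value being assessed.

```python
# Hedged numeric sketch of the bullseye metaphor (all scores invented).
from statistics import mean, pstdev

true_score = 70  # the center of the bullseye: what we actually want to assess

reliable_not_valid = [55, 56, 55, 54]  # tight cluster, far from the target
valid_not_reliable = [60, 80, 65, 75]  # centered on the target, widely spread

for label, shots in [("reliable, not valid", reliable_not_valid),
                     ("valid on average, not reliable", valid_not_reliable)]:
    spread = pstdev(shots)                  # low spread = high reliability
    error = abs(mean(shots) - true_score)   # low error  = high validity
    print(f"{label}: spread={spread:.1f}, error={error:.1f}")
```

The first set answers the earlier question (a reliable test need not be valid); the second shows why "accurate on average but inconsistent" is a shaky basis for judging any individual attempt, which is why a valid test is usually expected to be reliable as well.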
 Consider a challenge common to any conventional classroom. At Enko Middle School, the teacher makes up a 20-
problem test on fractions. Joseph gets 11 right. The teacher infers that Joseph's control of the entire realm of
fractions is very shaky. Valid conclusion? Not necessarily.
 First, we need to look at the test items and determine if they are representative of all types of problems with
fractions. Given that Joseph is a recent immigrant, maybe his English is weak but his math strong; does the test
factor out the English to let us see only his math ability?
 Suppose Jenny got 19 of the 20 problems right, but the one she got wrong asked for an explanation of why
common denominators are needed. Suppose Sara gets all the history facts right on the multiple-choice part of
her history exam, but completely fails the document-based question that calls for analysis of key events during the
same time frame.
 These are the challenges that face us all. We have to be sure that the performances we demand are appropriate to
the particular understandings sought. Could a student perform well on the test without understanding? Could a
student with understanding nonetheless forget or jumble together key facts? Yes and yes.
The challenge of validity
 We want to avoid doubtful inferences when assessing any student work, but
especially so when assessing for understanding. The aim is not engaging work; the
aim is good evidence for judging achievement against stated goals so that you
assess what really matters, not merely what is easy to see and score.
 A focus on understanding makes the issue of validity challenging in any
assessment. Much of the time we focus solely on the correctness of the answers,
ignoring the process each student used to set up and solve each problem. Is
correctness indicative of understanding? Not necessarily. The best test papers may
simply reflect recall of the formulas involved, without any understanding of why
they work. Most people don't self-assess their proposed assessments against any
design standards, and they often end up with invalid inferences.
 A common problem in assessment is that many scorers presume greater
understanding in the student who knows all the facts or communicates with
elegance than in the student who makes mistakes or communicates poorly. But
what if the findings of the paper with mistakes are truly insightful, and the
well-written, fact-based paper is superficial? Getting clear on what we can
and cannot conclude from the evidence is always the issue in validity, and it
applies to how we score, not just what we score.
The challenge of validity
Self-Test of Assessment Ideas
1. We need to look at more than just the percentage of correct answers. Why?
Sometimes getting the right answer is a result of rote recall, good test-taking
skills, or lucky guessing. In assessing for understanding, we need to ferret
out the reasons behind the answers and what meaning the learner makes of the
results.
2. Use constructed-response questions on the same content to make sure that
correct answers cannot hide a lack of understanding. Whenever possible,
parallel assessments in diverse formats improve the quality of the evidence of
desired results.
3. Given that a single application or product may or may not link to larger goals,
regularly ask students to “show their work,” give reasons for answers, and show
connections to larger principles or ideas in the answers.
4. Assessment of understanding requires evidence of “application” in performance or
products, but that complicates judging results. What do we do when parts of a
complex performance are shaky, but we discern clear insight in the content? Or
the result is fine, yet we sense that little insight was required to complete the
project? How do we design performances that enable us to make precise
judgments about the different parts of performance?
General guidelines
Thanks for your kind attention
More Related Content

Similar to CRITERIA AND VALIDITY PRESENTATIONS.pptx

Coherence Scoring process
Coherence Scoring processCoherence Scoring process
Coherence Scoring processEdAdvance
 
Concept of rubrics
Concept of rubricsConcept of rubrics
Concept of rubricsdeepuplr
 
authentic vs. traditional assessment
authentic vs. traditional assessmentauthentic vs. traditional assessment
authentic vs. traditional assessmentfreshious
 
Developing assessment instruments
Developing assessment instrumentsDeveloping assessment instruments
Developing assessment instrumentsJCrawford62
 
Authentic Assessment
Authentic AssessmentAuthentic Assessment
Authentic Assessmentxanderjoy
 
eu-magazine.pdf
eu-magazine.pdfeu-magazine.pdf
eu-magazine.pdfGSAAA1
 
Apt 501 chapter_7
Apt 501 chapter_7Apt 501 chapter_7
Apt 501 chapter_7cdjhaigler
 
Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7cdjhaigler
 
Developing Assessment Instrument
Developing Assessment InstrumentDeveloping Assessment Instrument
Developing Assessment Instrumentcdjhaigler
 
Rubric Presentation2
Rubric Presentation2 Rubric Presentation2
Rubric Presentation2 Denise Wright
 
Assessment of learning 1
Assessment of learning 1Assessment of learning 1
Assessment of learning 1Janiceona
 
Know how of question bank development
Know how of question bank developmentKnow how of question bank development
Know how of question bank developmentManoj Bhatt
 
12 Principles of High Quality Assessments (Version 2)
12 Principles of High Quality Assessments (Version 2)12 Principles of High Quality Assessments (Version 2)
12 Principles of High Quality Assessments (Version 2)Mr. Ronald Quileste, PhD
 
Assessment of learning
Assessment of learningAssessment of learning
Assessment of learningAhlamModiarat
 
Assessment: Achieving improved efficiency, effectiveness, educational integri...
Assessment:Achieving improved efficiency, effectiveness, educational integri...Assessment:Achieving improved efficiency, effectiveness, educational integri...
Assessment: Achieving improved efficiency, effectiveness, educational integri...Diana Quinn
 

Similar to CRITERIA AND VALIDITY PRESENTATIONS.pptx (20)

Coherence Scoring process
Coherence Scoring processCoherence Scoring process
Coherence Scoring process
 
Test appraisal
Test appraisalTest appraisal
Test appraisal
 
Concept of rubrics
Concept of rubricsConcept of rubrics
Concept of rubrics
 
authentic vs. traditional assessment
authentic vs. traditional assessmentauthentic vs. traditional assessment
authentic vs. traditional assessment
 
Developing assessment instruments
Developing assessment instrumentsDeveloping assessment instruments
Developing assessment instruments
 
Authentic Assessment
Authentic AssessmentAuthentic Assessment
Authentic Assessment
 
eu-magazine.pdf
eu-magazine.pdfeu-magazine.pdf
eu-magazine.pdf
 
Validity
ValidityValidity
Validity
 
Rubrics
RubricsRubrics
Rubrics
 
Apt 501 chapter_7
Apt 501 chapter_7Apt 501 chapter_7
Apt 501 chapter_7
 
Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7
 
Developing Assessment Instrument
Developing Assessment InstrumentDeveloping Assessment Instrument
Developing Assessment Instrument
 
Pencil andpapertest
Pencil andpapertestPencil andpapertest
Pencil andpapertest
 
Rubric Presentation2
Rubric Presentation2 Rubric Presentation2
Rubric Presentation2
 
Assessment of learning 1
Assessment of learning 1Assessment of learning 1
Assessment of learning 1
 
Know how of question bank development
Know how of question bank developmentKnow how of question bank development
Know how of question bank development
 
12 Principles of High Quality Assessments (Version 2)
12 Principles of High Quality Assessments (Version 2)12 Principles of High Quality Assessments (Version 2)
12 Principles of High Quality Assessments (Version 2)
 
Assessment of learning
Assessment of learningAssessment of learning
Assessment of learning
 
Language assessment
Language assessmentLanguage assessment
Language assessment
 
Assessment: Achieving improved efficiency, effectiveness, educational integri...
Assessment:Achieving improved efficiency, effectiveness, educational integri...Assessment:Achieving improved efficiency, effectiveness, educational integri...
Assessment: Achieving improved efficiency, effectiveness, educational integri...
 

Recently uploaded

Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxDr.Ibrahim Hassaan
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
Planning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxPlanning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxLigayaBacuel1
 
Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........LeaCamillePacle
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayMakMakNepo
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 

Recently uploaded (20)

Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
Planning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxPlanning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptx
 
Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up Friday
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
 

CRITERIA AND VALIDITY PRESENTATIONS.pptx

  • 1. UNDERSTANDING BY DESIGN CRITERIA AND VALIDITY By NKWELLE AND AZIFUET
  • 2. FOCUS Students can escape bad teaching, they can ’t escape assessment. Bound (1995) in Race and Brown (2001) p33
  • 3. Criteria set the standards of what you expect of what you are evaluating. Let’s imagine we are evaluating a restaurant or hotel - look at some of the reviews on useful websites. They identify ‘criteria’ such as ‘cleanliness’ and ‘friendliness’. Often these websites use rating scales from 1 - 5 or 1 - 10. You could also add descriptors to your criteria e.g. All tables and chairs are cleaned after use, and staff are welcoming etc etc, then add other descriptors to identify the different levels down the scale. The clearer your criteria are the better it will be for those evaluating to make more ‘objective’ judgements about how good something or someone is. THE NEED FOR CRITERIA #Unlockyourpotential
  • 4. What is the meaning of criteria? A standard that is used for judging something or for making a decision about something is called criteria. The purpose of evaluation criteria is so that: i) everyone who is doing the evaluating knows what standard/definitions they are using for the task and that the evaluations produced by different people can be compared directly; ii) the person being evaluated knows what they need to do in order to be successful when they/their work is evaluated.. WHAT IS CRITERIA #Unlockyourpotential
  • 5.  A rubric is a criterion-based scoring guide consisting of a fixed measurement scale and description of the characteristics for each score point.  Rubrics describe degrees of quality, proficiency or understanding along a continuum.  If the assessment response needs only a YES/NO type of answer, a CHECKLIST is used instead of a rubric. FROM CRITERIA TO RUBRIC #Unlockyourpotential
  • 6.  Rubrics help students, parents and teacher identify what quality work is. Students can judge their own work and accept more responsibility of the final product.  Rubrics help the teacher to easily explain to the student why they got the grade that they received.  Parents who work with their children also have a clear understanding of what is expected for a special project. Ask for a rubric to be sent home with project directions or other assignments such as writing. RUBRICS TO ASSESS UNDERSTANDING #Unlockyourpotential
  • 7.  Step 1: Identify your grading criteria. What are the intended outcomes for the WORK? What do you want students to do or demonstrate? What are the primary dimensions (note: these are often referred to as “traits” or as “criteria”) that count in the evaluation? Try writing each one as a noun or noun phrase—for example, “Insights and ideas that are central to the assignment”; “Address of audience”; “Logic of organization”; “Integration of source materials.” DESIGING AND REFINING RUBRICS #Unlockyourpotential
  • 8.  Step 2: Describe the levels of success for each criterion Consider what attributes you will be able to identify (both those you want to see and those you do not) in your students’ product, process or performance. Specify the characteristics, skills or behaviors that you will be looking for, as well as common mistakes you do not want to see.. DESIGING AND REFINING RUBRICS #Unlockyourpotential
  • 9.  Step 3: Brainstorm excellent, passable and not acceptable characteristics When criteria have been identified and performance-levels described, decisions should be made about their varying importance in relation to each other. Consider the attributes you have identified in step two and categorize them into excellent, passable and not acceptable. What standard would you expect for a top mark? What standard do you expect to pass? What characteristics are not acceptable? If desired, fill in the gap between passable and excellent. DESIGING AND REFINING RUBRICS #Unlockyourpotential
  • 10.  Step 4: Test and moderate your rubric Try your rubric with samples of student work to check that the outcome from the rubric reflects the quality of the work. Collaborate with your assessors and peers to test and review your rubric. Are there points of confusion or disagreement in using your rubric for making judgements?  Step 5: Revise the rubric, as necessary Be prepared to reflect on the effectiveness of the rubric and revise it before its next implementation. DESIGING AND REFINING RUBRICS #Unlockyourpotential
  • 11. TWO TYPES OF RUBRICS There are generally two types of rubrics that are used to judge student products and performances : Holistic and Analytic rubrics
  • 12.  single criteria rubrics (one-dimensional) used to assess participants' overall achievement on an activity or item based on predefined achievement levels;  performance descriptions are written in paragraphs and usually in full sentences. HOLISTIC RUBRIC #Unlockyourpotential
  • 14.  Two-dimensional rubrics with levels of achievement as columns and assessment criteria as rows. Allows you to assess participants' achievements based on multiple criteria using a single rubric. You can assign different weights (value) to different criteria and include an overall achievement by totaling the criteria;  written in a table form. ANALYTIC RUBRIC #Unlockyourpotential
  • 16. ENGLISH HOMEWORK Pamela gave her students a homework to write an essay on how the favorite food is prepared. Michael a student of her class instead wrote an essay on his favorite TV show. Using Miss Pamela’s rubric do you think it will be right to give Michael any grade, if yes on what components and if no, why so? EXAMPLES OF RUBRIC THINK, PAIR, SHARE #Unlockyourpotential
  • 17. MATHEMATICS TEST QUESTION. Solve the equation [3] 2𝑥2 − 5 = 45 STUDENT SOLUTION 2x - 5 = 45 2x = 50 X = 25. EXAMPLES OF RUBRIC THINK, PAIR, SHARE #Unlockyourpotential Mark scheme/Rubric 2𝒙𝟐 = 50 [1] 𝒙𝟐 = 25 [1] X = 5 [1]
  • 18. SHOULD A RUBRIC BE REDESIGNED AND REFINED DUE TO STUDENT WORK ? QUESTION OF THE DAY #Unlockyourpotential
  • 19. WHAT IS VALIDITY: Assessment validity refers to the extent that a test measures what it is supposed to measure. It refers to whether a test measures what it aims to measure. ( measures the accuracy of a test based on your objective). When testing for validity you don’t rely on a single assessment exercise and you use a variety approach.  Types of validity • Content validity: it evaluates how well a test covers all relevant parts of the topic it aims to measure. The content should match what you are trying to measure .The way you train learners should be the way you test them. If you train them to define do not expect them to have analytical skills. Content validity answers questions like; Is the test fully a representative of what it aims to measure? Does it include all the right items? VALIDITY IN ASSESSMENT
  • 20. Types of validity (continued). • Criterion validity: evaluates how accurately a test measures the outcome it was designed to measure. It is divided into predictive validity (scores predict a future outcome, e.g. a test given now correlates with a student's future performance) and concurrent validity (scores correlate with an outcome measured at the same time). • Construct validity: evaluates whether a measurement tool really represents the concept we are interested in measuring. Does the test measure the concept it is intended to measure? There are two main types of construct validity: • Convergent validity: the extent to which your measure corresponds to measures of related constructs, e.g. logical and mathematical ability. • Divergent (discriminant) validity: the extent to which your measure is unrelated to measures of distinct constructs, e.g. logical ability versus artistic ability.
  • 21. Reliability. • Reliability is the overall consistency of a measure: the extent to which an experiment, test, or any measuring procedure yields the same results on repeated trials. A measure has high reliability if it produces similar results under consistent conditions. • In constructing a good test you have to follow certain criteria, standards, or rules; the most important criteria for a good test are validity and reliability. Question: when do we say a test is valid and reliable? Is a reliable test a valid test? • An easy way to think about this is the bullseye metaphor: the very center of the bullseye is exactly what we want to assess. Validity is hitting the center; reliability is hitting the same spot every time, whether or not that spot is the center.
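Reliability as "consistency on repeated trials" can be illustrated by simulating re-testing the same student with different amounts of measurement noise. This sketch is illustrative only; the true score, noise levels, and trial count are made-up numbers, not data from the slides.

```python
# Simulate repeated measurements of one student; less noise = more reliable.
import random
import statistics

random.seed(0)  # fixed seed so the simulation is repeatable

def repeated_scores(true_score, noise, trials=20):
    """Simulate re-testing the same student; `noise` models measurement error."""
    return [true_score + random.gauss(0, noise) for _ in range(trials)]

reliable   = repeated_scores(70, noise=1)   # low spread  -> high reliability
unreliable = repeated_scores(70, noise=10)  # high spread -> low reliability

# The reliable measure clusters tightly around the true score.
print(statistics.stdev(reliable) < statistics.stdev(unreliable))  # True
```

Note that both simulated tests are centered on the true score (valid on average); only their consistency differs, which is the distinction the slide draws between validity and reliability.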
  • 22. Validity and Reliability. Question: is it possible for something to have high validity (accuracy) but low reliability (consistency)? Is a valid test a reliable test?
  • 23. THE CHALLENGE OF VALIDITY.  Consider a challenge common to any conventional classroom. At Enko Middle School, the teacher makes up a 20-problem test on fractions. Joseph gets 11 right, and the teacher infers that Joseph's command of the entire realm of fractions is very shaky. Valid conclusion? Not necessarily.  First, we need to look at the test items and determine whether they are representative of all types of fraction problems. Given that Joseph is a recent immigrant, perhaps his English is weak but his math is strong; does the test factor out the English so we see only his math ability?  Suppose Jenny got 19 of the 20 problems right, but the one she got wrong asked for an explanation of why common denominators are needed. Suppose Sara gets all the history facts right on the multiple-choice part of her history exam, but completely fails the document-based question that calls for analysis of key events in the same time frame?  These are the challenges that face us all. We have to be sure that the performances we demand are appropriate to the particular understandings sought. Could a student perform well on the test without understanding? Could a student with understanding nonetheless forget or jumble key facts? Yes and yes.
  • 24. THE CHALLENGE OF VALIDITY (continued).  We want to avoid doubtful inferences when assessing any student work, but especially when assessing for understanding. The aim is not engaging work; the aim is good evidence for judging achievement against stated goals, so that you assess what really matters, not merely what is easy to see and score.  A focus on understanding makes the issue of validity challenging in any assessment. Too often we focus solely on the correctness of the answers, ignoring the process each student used to set up and solve each problem. Is correctness indicative of understanding? Not necessarily. The best test papers may simply reflect recall of the formulas involved, without any understanding of why they work. Most people do not check their proposed assessments against any design standards, and they often end up with invalid inferences.  A common problem in assessment is that many scorers presume greater understanding in the student who knows all the facts or communicates with elegance than in the student who makes mistakes or communicates poorly. But what if the findings of the papers with mistakes are truly insightful, and the well-written, fact-based paper is superficial? Getting clear on what we can and cannot conclude from the evidence is always the issue in validity, and it applies to how we score, not just what we score.
  • 27. GENERAL GUIDELINES. 1. We need to look at more than just the percentage of correct answers. Why? Sometimes getting the right answer is the result of rote recall, good test-taking skills, or lucky guessing. In assessing for understanding, we need to ferret out the reasons behind the answers and what meaning the learner makes of the results. 2. Use constructed-response questions on the same content to make sure that correct answers cannot hide a lack of understanding. Whenever possible, parallel assessments in diverse formats improve the quality of the evidence of desired results. 3. Given that a single application or product may or may not link to larger goals, regularly ask students to “show their work,” give reasons for answers, and show connections to larger principles or ideas in their answers. 4. Assessment of understanding requires evidence of “application” in performance or products, but that complicates judging results. What do we do when parts of a complex performance are shaky, but we discern clear insight in the content? Or when the result is fine, yet we sense that little insight was required to complete the project? How do we design performances that enable us to make precise judgments about the different parts of performance?
  • 27. Thanks for your kind attention