Validity is a term derived from the Latin word validus, meaning strong. In the context of assessment, a test is deemed valid if it measures what it is supposed to measure.
Content-related evidence for validity pertains to the
extent to which the test covers the entire domain of
content.
• A test that appears to adequately measure the learning outcomes and content is said to possess face validity.
• Instructional validity is the extent to which an assessment is systematically sensitive to the nature of the instruction offered.
• Table of Specifications (ToS) – a test blueprint that identifies the content areas and describes the learning outcomes at each level of the domain.
Course title: Math
Grade level: V
Periods test is being used: 2
Date of test: August 8, 2014
Subject matter digest: Number and Number Sense
Type of test: Power, Speed, Partially speeded (Circle one)
Test time: 45 minutes
Test value: 100 points
Base number of test questions: 75
Constraints: Test time

| No. | Learning Objective Level | Instructional Time (min) | % | Target Q/P | Item Type | Items: questions(points each) | Total Q/P |
|----|----|----|----|----|----|----|----|
| 1 | Apply | 95 | 16% | 11/16 | Matching | 6(1), 5(2) | 11/16 |
| 2 | Understand | 55 | 9% | 7/10 | MC | 5(2) | 5/10 |
| ⁞ | ⁞ | ⁞ | ⁞ | ⁞ | ⁞ | ⁞ | ⁞ |
| 10 | Evaluate | 40 | 7% | 5/7 | Essay | 1(7) | 1/7 |
| Total | | 600 | 100% | 75/100 | | | 58/100 |

Totals by Revised Bloom's Taxonomy level (Q/P): Remember 11/12, Understand 23/31, Apply 16/34, Analyze 4/10, Evaluate 3/6, Create 1/7. (Q/P = number of questions / number of points.)
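The allocation rule the sample ToS follows can be checked with a short sketch: each objective earns test items in proportion to its share of instructional time (e.g., 95 of 600 minutes is roughly 16% of the 75-item budget). The minutes, total time, and item budget below come from the table above; the rounding scheme is an assumption for illustration.

```python
# Sketch: allocate test items in proportion to instructional time, as implied by
# the sample ToS above. Values come from the table; rounding is an assumption.

instructional_minutes = {1: 95, 2: 55, 10: 40}   # the three objectives shown above
TOTAL_MINUTES = 600                              # total instructional time (Total row)
ITEM_BUDGET = 75                                 # base number of test questions

def items_for(minutes: int) -> int:
    """Items an objective earns, proportional to its share of instructional time."""
    return round(minutes / TOTAL_MINUTES * ITEM_BUDGET)

for obj, minutes in instructional_minutes.items():
    share = minutes / TOTAL_MINUTES
    print(f"Objective {obj}: {share:.0%} of time -> about {items_for(minutes)} items")
```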
| Type of Test Question | Time Required to Answer |
|----|----|
| Alternate response (true-false) | 20 - 30 seconds |
| Modified true or false | 30 - 45 seconds (Notar et al., 2004) |
| Sentence completion (one-word fill-in) | 40 - 60 seconds |
| Multiple choice with four responses (lower level) | 40 - 60 seconds |
| Multiple choice (higher level) | 70 - 90 seconds |
| Matching type (5 stems, 6 choices) | 2 - 4 minutes |
| Short answer | 2 - 4 minutes |
| Multiple choice (with calculations) | 2 - 5 minutes |
| Word problems (simple arithmetic) | 5 - 10 minutes |
| Short essays | 15 - 20 minutes |
| Data analysis/graphing | 15 - 25 minutes |
| Drawing models/labelling | 20 - 30 minutes |
| Extended essays | 35 - 50 minutes |
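A rough planning sketch, using the midpoints of the time ranges above, can check whether a planned item mix fits within the 45-minute test time from the sample ToS. The item mix below is hypothetical.

```python
# Sketch: estimate total answering time for a planned item mix, using midpoints of
# the ranges in the table above. The item mix is hypothetical; the 45-minute limit
# comes from the sample ToS.

SECONDS_PER_ITEM = {                 # midpoints of the table's time ranges, in seconds
    "true_false": 25,
    "mc_lower_level": 50,
    "mc_higher_level": 80,
    "matching_5x6": 180,
    "short_answer": 180,
    "short_essay": 1050,
}

def estimated_minutes(counts: dict) -> float:
    """Total answering time in minutes for the given counts of each item type."""
    return sum(SECONDS_PER_ITEM[item] * n for item, n in counts.items()) / 60

mix = {"true_false": 10, "mc_lower_level": 15, "mc_higher_level": 5, "matching_5x6": 2}
print(f"Estimated answering time: {estimated_minutes(mix):.0f} min (test time: 45 min)")
```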
Criterion-related evidence of validity refers to the degree to which test scores agree with an external criterion.
It examines the relationship between an assessment and another measure of the same trait.
There are three types of criteria:
• Achievement test scores
• Ratings, grades, and other numerical judgments made by the teacher
• Career data
• Concurrent validity provides an estimate of a student’s current performance in relation to a previously validated or established measure.
• Predictive validity pertains to the power or usefulness of test scores to predict future performance.
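In practice, criterion-related evidence is usually summarized as a correlation between the test scores and the criterion measure; for concurrent validity the criterion is collected at about the same time, for predictive validity it is collected later. A minimal sketch with hypothetical scores follows.

```python
# Sketch: a criterion-related validity coefficient reported as the Pearson
# correlation between the test being validated and an external criterion.
# All scores below are hypothetical.
from statistics import correlation   # available in Python 3.10+

new_test  = [78, 85, 62, 90, 71, 88, 66, 95]   # scores on the assessment being validated
criterion = [75, 82, 60, 93, 70, 85, 68, 97]   # e.g., grades or an established test

r = correlation(new_test, criterion)
print(f"Validity coefficient r = {r:.2f}")      # values near 1.0 indicate strong agreement
```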
Construct-related evidence of validity is an assessment of the quality of the instrument used.
It measures the extent to which the assessment is a meaningful measure of an unobservable trait or characteristic.
-Theoretical, Logical & Statistical
Convergent & Divergent
Convergent validity occurs when measures of constructs that should be related are in fact observed to be related.
Divergent (discriminant) validity occurs when constructs that are unrelated are in reality observed not to be related.
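Numerically, this means two measures of the same construct should correlate highly (convergent evidence), while a measure of an unrelated construct should correlate weakly (divergent evidence). The sketch below uses hypothetical scores.

```python
# Sketch: convergent evidence = high correlation between two measures of the same
# construct; divergent (discriminant) evidence = near-zero correlation with a
# measure of an unrelated construct. All scores are hypothetical.
from statistics import correlation   # available in Python 3.10+

math_test_a = [55, 72, 64, 81, 47, 90, 68, 75]   # two measures of the same construct
math_test_b = [58, 70, 66, 85, 50, 88, 65, 77]
handwriting = [71, 75, 57, 61, 68, 67, 53, 68]   # a measure of an unrelated construct

print(f"Convergent r = {correlation(math_test_a, math_test_b):.2f}  (expected: high)")
print(f"Divergent r  = {correlation(math_test_a, handwriting):.2f}  (expected: near zero)")
```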
The unified view of validity integrates considerations of content, criteria, and consequences into a construct framework for the empirical testing of rational hypotheses about score meaning and theoretically relevant relationships (Messick, 1989).
• The content aspect parallels content-related evidence, which calls for content relevance and representativeness.
• The substantive aspect pertains to theoretical constructs and empirical evidence.
• The structural aspect assesses how well the scoring structure matches the construct domain.
• The generalizability aspect examines how score properties and interpretations generalize to and across population groups, contexts, and tasks.
• The external aspect includes convergent and discriminant evidence taken from multitrait-multimethod studies.
• The consequential aspect pertains to the intended and unintended effects of assessment on teaching and learning.
Developing performance assessments
involves three steps: define the purpose,
choose the activity and develop criteria
for scoring.
1. The selected performance should reflect a valued activity.
2. The completion of performance assessments should
provide a valuable learning experience.
3. The statement of goals and objectives should be clearly
aligned with the measurable outcomes of the
performance activity.
4. The task should not examine extraneous or unintended
variables.
5. Performance assessments should be fair and free from
bias.
1. Unclear test directions
2. Complicated vocabulary and sentence structure
3. Ambiguous statements
4. Inadequate time limits
5. Inappropriate level of difficulty of test items
6. Poorly constructed test items
7. Inappropriate test items for outcomes being measured
8. Short test
9. Improper arrangement of items
10. Identifiable pattern of answers
• Ask others to judge the clarity of what you are assessing.
• Check to see if different ways of assessing the same thing give the same result.
• Sample a sufficient number of examples of what is being assessed.
• Prepare a detailed table of specifications.
• Ask others to judge the match between the assessment items and the objectives of the assessment.
• Compare groups known to differ on what is being assessed.
• Compare scores taken before instruction to those taken after instruction.
• Compare predicted consequences to actual consequences.
• Compare scores on similar, but different, traits.
• Provide adequate time to complete the assessment.
• Ensure appropriate vocabulary, sentence structure, and item difficulty.
• Ask easy questions first.
• Use different methods to assess the same thing.
• Use results only for intended purposes.
Reliability refers to reproducibility and consistency in methods and criteria.
Internal & External reliability
A. Stability
B. Equivalence
C. Internal Consistency
D. Scorer or Rater Consistency
E. Decision Consistency
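As one concrete example of internal consistency (C above), Cronbach's alpha can be estimated from an item-by-student score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The matrix in the sketch below is hypothetical.

```python
# Sketch: Cronbach's alpha as an internal-consistency estimate.
# The item-by-student score matrix is hypothetical.
from statistics import pvariance

def cronbach_alpha(scores: list) -> float:
    """scores[i][j] = score of student i on item j."""
    k = len(scores[0])                                   # number of items
    item_vars = [pvariance([row[j] for row in scores]) for j in range(k)]
    total_var = pvariance([sum(row) for row in scores])  # variance of students' total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

data = [            # 5 students x 4 items (1 = correct, 0 = wrong), hypothetical
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 0, 0, 0],
]
print(f"Cronbach's alpha = {cronbach_alpha(data):.2f}")
```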
• Lengthen the assessment procedure by providing more time, more questions, and more observations whenever practical (see the Spearman-Brown sketch after this list).
• Broaden the scope of the procedure by assessing all the significant aspects of the target learning performance.
• Improve objectivity by using a systematic and more formal procedure for scoring student performance. A scoring scheme or rubric would prove useful.
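A minimal sketch of the Spearman-Brown prophecy formula, which quantifies how much lengthening an assessment is expected to raise its reliability: r_new = n * r_old / (1 + (n - 1) * r_old), where n is the factor by which the test is lengthened. The starting reliability of 0.60 is hypothetical.

```python
# Sketch: Spearman-Brown prophecy formula, relating test length to reliability.
# The starting reliability (0.60) is a hypothetical value for illustration.

def spearman_brown(r_old: float, n: float) -> float:
    """Predicted reliability after making the test n times as long."""
    return n * r_old / (1 + (n - 1) * r_old)

r = 0.60
for n in (1, 1.5, 2, 3):
    print(f"{n}x as long -> predicted reliability {spearman_brown(r, n):.2f}")
# Doubling a test with reliability 0.60 predicts about 0.75.
```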
• Use multiple markers and check inter-rater reliability.
• Combine results from several assessments, especially when making crucial educational decisions.
• Provide students sufficient time to complete the assessment procedure.
• Teach students how to perform their best by providing practice and training and by motivating them.
• Match the assessment difficulty to the students’ ability levels by providing tasks that are neither too easy nor too difficult, and by tailoring the assessment to each student’s ability level when possible.
• Differentiate among students by selecting assessment tasks that distinguish or discriminate the best from the least able students (see the difficulty and discrimination sketch below).
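Two simple indices commonly used to act on these last two suggestions are the difficulty index p (proportion of students answering an item correctly, with values near 0.5 being neither too easy nor too hard) and the discrimination index D (the difference in p between the upper- and lower-scoring groups). The responses in the sketch below are hypothetical.

```python
# Sketch: item difficulty index p and discrimination index D = p_upper - p_lower.
# The response patterns below are hypothetical.

def difficulty(responses: list) -> float:
    """Proportion of students answering the item correctly (1 = correct, 0 = wrong)."""
    return sum(responses) / len(responses)

def discrimination(upper: list, lower: list) -> float:
    """Difference in difficulty between the upper- and lower-scoring groups."""
    return difficulty(upper) - difficulty(lower)

upper_group = [1, 1, 1, 0, 1, 1]   # responses of the top-scoring students on one item
lower_group = [0, 1, 0, 0, 1, 0]   # responses of the bottom-scoring students

print(f"p = {difficulty(upper_group + lower_group):.2f}, "
      f"D = {discrimination(upper_group, lower_group):.2f}")
```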