VALIDITY, RELIABILITY AND ALIGNMENT
TO DETERMINE THE EFFECTIVENESS OF ASSESSMENT
Overview: Assessment, Validity, Reliability, Alignment, Strategies, Standards
ASSESSMENT
Assessment is the process of gathering
and discussing information from multiple
and diverse sources in order to develop a
deep understanding of what students
know, understand, and can do with their
knowledge as a result of their educational
experiences (Huba and Freed, 2000).
VALIDITY
- refers to what the assessment is actually
testing.
RELIABILITY
- refers to the consistency of the
assessment.
ALIGNMENT
- refers to the connection between learning
objectives, learning activities and
assessment.
A. VALIDITY
Definitions:
Validity - refers to what the assessment is
actually testing
- refers to how accurately a conclusion,
measurement or concept corresponds to what is
being tested
- defined as the extent to which an
assessment accurately measures what it is
intended to measure.
Factors Affecting Validity:
1. Student’s reading ability - Educators should
ensure that an assessment is at the correct
reading level of the students.
2. Student self-efficacy - If students have low
self-efficacy or beliefs about their abilities in
the particular area they are being tested in,
they will typically perform lower.
3. Student test anxiety level - Students with high
test anxiety will underperform due to
emotional and physiological factors
Types of Validity:
1. Face Validity - ascertains that the measure appears to be assessing the intended construct under study.
2. Construct Validity - is used to ensure that the measure actually measures what it is intended to measure and not other variables.
3. Criterion-Related Validity - is used to predict future or current performance; it correlates test results with another criterion of interest (see the correlation sketch after this list).
4. Formative Validity - when applied to outcomes assessment, it is used to assess how well a measure is able to provide information to help improve the program under study.
5. Sampling Validity - ensures that the measure covers the broad range of areas within the concept under study.
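As a concrete illustration of criterion-related validity, the sketch below correlates assessment scores with an external criterion. This is a minimal, hypothetical example rather than anything from the source: the scores, the chosen criterion (a later course grade), and the use of Python/NumPy are all assumptions made for demonstration.

```python
# Hypothetical sketch of criterion-related validity: correlate scores on an
# assessment with an external criterion of interest (here, an invented later
# course grade). All numbers are made up for illustration.
import numpy as np

assessment_scores = np.array([55, 68, 72, 80, 85, 90, 93, 97], dtype=float)
later_course_grades = np.array([60, 65, 70, 78, 82, 88, 90, 95], dtype=float)

# A strong positive correlation would be evidence that the assessment
# predicts (or relates to) the criterion of interest.
validity_coefficient = np.corrcoef(assessment_scores, later_course_grades)[0, 1]
print(f"Criterion-related validity coefficient: {validity_coefficient:.2f}")
```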
Kinds of Validity Claim:
1. Predictive evidence - a form of construct validation
that examines whether performance on an
assessment is strongly related to real-world success
in the domain that the assessment is meant to
reflect.
2. Consequential evidence - refers to the kinds of
consequences an assessment and its uses have for
learners and for instruction.
Why is validity necessary?
While reliability is necessary, it alone is not sufficient; for an assessment to be useful, it also needs to be valid.
Ways on how to improve validity:
1. Make sure your goals and objectives are clearly defined
and operationalized.
2. Match your assessment measure to your goals and objectives.
3. Get students involved; have the students look over the assessment for troublesome wording or other difficulties.
4. If possible, compare your measure with other
measures, or data that may be available.
B. RELIABILITY
Definitions:
Reliability - refers to the consistency of the
assessment
- the degree to which an assessment
tool produces stable and consistent results
Factors Affecting Reliability:
1. The length of the assessment
2. The suitability of the questions or tasks for
the students being assessed.
3. The phrasing and terminology of the
questions.
4. The consistency in test administration
5. The design of the marking schedule and
moderation of marking procedures.
6. The readiness of students for the assessment
Types of Reliability:
1. Test-retest reliability - is a measure of reliability
obtained by administering the same test twice over
a period of time to a group of individuals.
2. Parallel forms reliability - is a measure of reliability obtained by administering different versions of an assessment tool to the same group of individuals.
3. Inter-rater reliability - is a measure of reliability used to assess the degree to which different judges or raters agree in their assessment decisions.
4. Internal consistency reliability - is a measure of
reliability used to evaluate the degree to which different
test items that probe the same construct produce similar
results.
a.) Average inter-item correlation - It is obtained by taking all of the items on a test that probe the same construct, determining the correlation coefficient for each pair of items, and finally taking the average of all of these correlation coefficients.
b.) Split-half reliability - The process begins by “splitting in half” all items of a test that are intended to probe the same area of knowledge in order to form two “sets” of items; the scores on the two halves are then correlated (see the computational sketch after this list).
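The reliability estimates described above come down to correlations and agreement rates. The sketch below shows, using invented data, how test-retest reliability, inter-rater agreement, average inter-item correlation, and split-half reliability might be computed; the variable names, the data, and the NumPy-based approach are illustrative assumptions, not part of the source material.

```python
# Minimal sketch of the reliability estimates above, using invented data.
import numpy as np

# Test-retest reliability: correlate two administrations of the same test.
first_admin  = np.array([78, 85, 62, 90, 71, 88, 67, 80], dtype=float)
second_admin = np.array([75, 88, 65, 87, 70, 90, 64, 82], dtype=float)
test_retest_r = np.corrcoef(first_admin, second_admin)[0, 1]

# Inter-rater reliability (simplest form): proportion of cases on which
# two raters assign the same score.
rater_a = np.array([3, 4, 2, 5, 4, 3, 2, 4])
rater_b = np.array([3, 4, 3, 5, 4, 3, 2, 5])
percent_agreement = np.mean(rater_a == rater_b)

# Internal consistency: rows = students, columns = items assumed to probe
# the same construct (1 = correct, 0 = incorrect).
items = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 0],
], dtype=float)

# a) Average inter-item correlation: mean of the correlations between
#    every pair of items.
corr = np.corrcoef(items, rowvar=False)           # item-by-item correlation matrix
pairwise = corr[np.triu_indices_from(corr, k=1)]  # each item pair counted once
average_inter_item_r = pairwise.mean()

# b) Split-half reliability: split the items into two halves (here, odd- vs.
#    even-numbered items), score each half, and correlate the half scores.
odd_half_scores  = items[:, 0::2].sum(axis=1)
even_half_scores = items[:, 1::2].sum(axis=1)
split_half_r = np.corrcoef(odd_half_scores, even_half_scores)[0, 1]

print(f"Test-retest r:          {test_retest_r:.2f}")
print(f"Inter-rater agreement:  {percent_agreement:.2f}")
print(f"Average inter-item r:   {average_inter_item_r:.2f}")
print(f"Split-half r:           {split_half_r:.2f}")
```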
C. ALIGNMENT
Definition: It is the connection between learning objectives, learning activities, and assessment.
Objectives → Activities → Assessment
3 Components of Alignment:
1. Objectives - What do I want students to know
how to do when they leave this course?
2. Activities - What kinds of activities in and out
of class will reinforce my learning objectives
and prepare students for assessments?
3. Assessment - What kinds of tasks will reveal
whether students have achieved the learning
objectives I have identified?
Why is alignment important?
Proper alignment keeps you going in the
right direction. If assessments are misaligned
with objectives or strategies, it can undermine
both student motivation and learning.
HOW CAN WE DEVELOP AN
EFFECTIVE ASSESSMENT?
Strategies
Standards
Alignment
STRATEGIES FOR DEVELOPING
AN EFFECTIVE ASSESSMENT
1. Remember the effect of assessment on
student learning behaviors and outcomes.
2. Align the course assessment with the
learning outcomes and curriculum.
3. Prepare students for assessment by
providing formative tasks and explaining
the structure of the assessment for their
course.
STRATEGIES…
4. Design quality assessment tasks and
items.
5. Review assessment data.
6. Understand how to set standards and
grade cut-offs.
7. Give feedback to students.
STANDARDS
1. Assessment of Higher-Order Cognitive
Skills
2. High-Fidelity Assessment of Critical
Abilities
3. Assessments that Are Internationally
Benchmarked
4. Use of Items that Are Instructionally
Sensitive and Educationally Valuable
1. Assessment of Higher-Order
Cognitive Skills
One widely used approach to conceptualizing the knowledge and skills represented in curriculum, teaching, and assessment is Webb’s Depth of Knowledge (DOK) taxonomy.
Using the DOK framework as a guide, if
assessments are to reflect and encourage transferable
abilities, a substantial majority of the items and tasks
(at least two-thirds) should tap conceptual knowledge
and abilities.
Webb’s Depth of Knowledge (DOK)
taxonomy
2. High-Fidelity Assessment of
Critical Abilities
This standard identifies a number of areas of knowledge
and skills that are clearly so critical for college and career
readiness that they should be targeted for inclusion in new
assessment systems. As described in the standard, these include:
 Research
 Analysis and Synthesis of Information
 Experimentation and Evaluation
 Communication in Oral, Written, Graphic, and Multi-Media
Forms
 Collaboration and Interpersonal Interaction
 Modeling, Design, and Complex Problem Solving
3. Assessments that Are
Internationally Benchmarked
The assessments should be as rigorous as those of
the leading education countries, in terms of the kind of
content and tasks they present, as well as the level of
performance they expect.
The skills sought through such assessments were, in order of importance:
1) teamwork
2) problem solving
3) interpersonal skills
4) oral communications
5) listening skills
4. Use of Items that Are
Instructionally Sensitive and
Educationally Valuable
Assessment tasks should also be
instructionally sensitive and educationally
useful. That is, they should 1) represent the
curriculum content in ways that respond to
instruction, and 2) have value for guiding and
informing teaching.