LESSON FIVE
TESTS
A test refers to a tool, technique, or method intended to measure students’ knowledge or their ability to complete a particular task. In this sense, testing can be considered a form of assessment. Tests should meet basic requirements such as validity and reliability. A test can also be described as a method of collecting data for evaluation.
TYPES OF TESTS
 Criterion-referenced test
 Norm-referenced test
 Aptitude test
 Intelligence test
 Achievement test
Criterion-referenced test
 This type of test compares a student’s academic achievement to a set of criteria or standards. It measures learners’ performance against a fixed set of predetermined criteria or learning standards and checks what learners are expected to know and be able to do at a specific time. In this test, the pupil’s ability is measured against a criterion, that is, a specific body of knowledge or skill. The test usually measures what students know or what they can do in a specific domain of learning.
Norm-referenced test
 A norm-referenced test compares a pupil’s performance with the performance of other pupils. The norm could be a national average score in a particular subject. Standardized tests are norm-referenced when grades are assigned to a pupil based upon comparison with other pupils.
 State examinations such as the BECE and WASSCE are norm-referenced tests.
 Norm referencing can also be used in a classroom when a teacher compares a pupil’s performance with that of others in the class. Norm-referenced tests used by states and nations are more reliable and tend to be valid since they are based on large populations. The contrast with criterion-referenced interpretation is illustrated in the sketch below.
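The difference between the two interpretations can be shown with a minimal Python sketch (the names, scores, and the 70-mark cut-off below are invented for illustration): a criterion-referenced judgement compares each pupil’s score with a fixed standard, while a norm-referenced judgement compares it with the scores of the other pupils.

# Hypothetical class scores out of 100; the 70-mark cut-off is an assumed standard.
scores = {"Ama": 82, "Kofi": 65, "Esi": 74, "Yaw": 58, "Adjoa": 90}

# Criterion-referenced interpretation: judge each score against the fixed criterion.
CUT_OFF = 70
criterion_view = {name: ("meets standard" if s >= CUT_OFF else "below standard")
                  for name, s in scores.items()}

# Norm-referenced interpretation: judge each score against the rest of the group
# (here, the percentage of classmates the pupil outscored).
all_scores = list(scores.values())
norm_view = {name: 100 * sum(s > other for other in all_scores) / (len(all_scores) - 1)
             for name, s in scores.items()}

print(criterion_view)  # e.g. Kofi: below standard
print(norm_view)       # e.g. Kofi outscored 25.0 percent of classmates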
Aptitude tests
 Aptitude tests predict achievement.
 These tests may stress what is not taught in schools. For example, many applicants to foreign schools are asked to take scholastic aptitude tests such as the SAT and the Test of English as a Foreign Language (TOEFL).
Intelligence tests
 Intelligence tests are used only for special testing or placement of students; they are no longer used routinely by most schools.
Achievement tests
Achievement tests have replaced intelligence tests.
They provide information about present achievement or past learning on a cumulative basis.
They deal with content that schools teach or should be teaching.
Functions of tests in Home Economics education
 To the Learner:
 Tests are used to assign students grades or rank them.
 They provide feedback that lets students know their strengths and weaknesses.
 They reduce fear and anxiety.
 Results from tests serve as a source of motivation.
 They enable learners to acquire good learning habits.
To the Home Economics Teacher:
 Tests help the teacher identify whether the instructional objectives have been achieved.
 To evaluate himself or herself and improve upon the methods and techniques of teaching.
 To enable the teacher to give feedback to the parents of learners.
 To grade the students.
 To find out the level of the pupils.
 To help assign pupils to specific learning groups.
 To help the teacher identify students’ interests.
Self-assessment questions
 Differentiate between Norm-Referenced and Criterion-Referenced tests.
 Identify two (2) functions of tests to the learner and the teacher respectively in Home Economics education.
LESSON 6
TEST VALIDITY AND RELIABILITY
 Validity
 Validity refers to the degree to which evidence and theory support the interpretation of test scores entailed by the proposed uses of tests. In other words, validity refers to the soundness or appropriateness of your interpretations and uses of students’ assessment results. Validity therefore emphasizes the results of your assessment, which you interpret, and not the instrument or procedure itself.
 The process of validation, therefore, involves accumulating evidence to provide a sound scientific basis for the proposed score interpretation. Validity involves a judgment that one makes concerning the interpretation and uses of assessment results after considering evidence from all relevant sources.
Types of Validity
 Content-related validity
 Criterion-related validity
 Construct-related validity
 Content validity evidence:
 This relates to how adequately the content of a test, and the responses to the test, sample the domain about which inferences are to be made. In other words, content validity refers to the extent to which students’ responses to the items of a test may be considered a representative sample of their responses to a real or hypothetical universe. In classroom assessment, the curriculum and instruction determine the domain of achievement tasks.
Criterion-related validity:
 This is a type of evidence that pertains to the empirical technique of studying the relationship between test scores, or some other measure, and independent external measures such as intelligence scores or university grade point average. It answers the question of how well the results of an assessment can be used to infer or predict an individual’s standing on one or more outcomes other than the assessment procedure itself. The outcome is called the criterion.
 There are two types of criterion-related evidence.
TYPES OF CRITERION-RELATED VALIDITY
 Concurrent validity: refers to the extent to which individuals’ current status on a criterion can be estimated from their current performance on an assessment instrument. With concurrent validity evidence, both the test scores and the criterion scores are collected at the same time.
 Predictive validity: refers to the extent to which individuals’ performance on a criterion can be predicted from their prior performance. With predictive validity evidence, the criterion data are gathered at a later date. For example, performance in BDT Home Economics may be used to predict selection into the senior high school Home Economics programme; a sketch of how such evidence is usually quantified follows.
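As a rough illustration (with invented scores; this is not part of the original notes), criterion-related validity evidence is usually expressed as a correlation coefficient between the assessment scores and the criterion scores, for instance BDT Home Economics scores and later senior-high performance.

import math

# Hypothetical scores for six pupils: the assessment (BDT Home Economics)
# and the later criterion (senior high school Home Economics performance).
test_scores      = [55, 62, 70, 78, 84, 90]
criterion_scores = [50, 61, 66, 74, 80, 88]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# A coefficient close to +1 would be taken as strong predictive-validity evidence.
print(round(pearson(test_scores, criterion_scores), 2))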
Construct-related validity:
 This type of evidence refers to how well an assessment result can be interpreted as reflecting an individual’s status regarding an educational trait, attribute, or mental process, for example reading comprehension, honesty, creativity, or health.
Factors that affect validity
 1. Factors in the assessment instrument itself: A test or assessment task may appear to measure the intended subject matter content and expected outcomes. However, certain factors in the items can prevent them from functioning as intended by the assessor. These factors tend to lower the validity of the uses and interpretations of the results. They include the following:
 Unclear directions
 Reading vocabulary and sentence structure that are too difficult
 Ambiguity of items
 Inadequate time limit
 Overly difficult test items
 Poor construction of test items
 Test items inappropriate for the learning outcomes being measured
 A test that is too short
 Improper arrangement of test items
 Identifiable pattern of items
 2. How the items function in relation to what has been taught
 Teachers, for example, establish learning outcomes to be attained by the end of their lesson. The tasks in the test should measure those content areas and their related learning outcomes.
 3. Factors in the administration of the assessment instrument
 The administration of an assessment or test may introduce factors that tend to lower the validity of the interpretations of the results. With regard to teacher-made tests, such factors as insufficient time, unfair assistance to individual students who ask for help, cheating, poor lighting and ventilation of the testing room, and disruptive noise during the testing tend to lower the validity of the results.
 4. Factors in students’ responses
 These are factors inherent in students that tend to affect their performance during a test. They include emotional disturbance, excessive anxiety, and level of motivation.
 5. Factors in the scoring of an assessment
 Scoring may introduce factors that have a detrimental effect on the validity of the results, particularly the scoring of constructed responses (essay and performance assessments).
 6. The nature of the group
 Validity is always specific to a particular group and a particular purpose. The characteristics of a group, such as age, gender, ability level, educational background, and cultural background, are therefore important in establishing the validity of assessment results. If the assessment results are interpreted and used without due consideration of those group characteristics, the validity may be lowered.
Reliability:
 Reliability refers to the consistency of assessment scores over time for a population of individuals or groups. In general, reliability refers to the degree to which assessment results are the same when:
 individuals complete the same task(s) on two different occasions;
 they complete different but equivalent (alternative) tasks on the same or different occasions;
 two or more assessors score (mark) their performance on the same task(s).
 In relation to a test, reliability refers to the consistency of the scores obtained by individuals when examined with the same test on different occasions or with alternate forms. Reliability therefore implies the exactness with which a trait is measured. In this case, there should be reason to believe that the scores are stable and trustworthy over time for a population of individuals or groups.
Types of reliability
 There are four main types of reliability. Each can be estimated by comparing different sets of results produced by the same method.
 Test-retest reliability
 Inter-rater reliability
 Parallel forms reliability
 Internal consistency
 Test-retest reliability:
 This measures the consistency of results when you repeat the same test on the same sample at a different point in time. It is used when you are measuring something you expect to stay constant in your sample. Example: a test given to trainee-teacher applicants should have high test-retest reliability because the ability being measured does not change quickly over time. A small illustrative sketch follows.
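A minimal sketch of the usual estimate, assuming made-up scores: the same pupils take the same test on two occasions and the two score lists are correlated (statistics.correlation requires Python 3.10 or later).

from statistics import correlation

# Hypothetical scores of the same five pupils on two testing occasions.
occasion_1 = [60, 72, 55, 81, 68]
occasion_2 = [63, 70, 58, 80, 71]

# A coefficient close to 1 indicates that the scores are stable over time.
print(round(correlation(occasion_1, occasion_2), 2))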
 Inter-rater reliability:
 Also known as inter-observer reliability, this measures the degree of agreement between different people observing the same thing. It is used when data are collected by researchers assigning ratings, scores, or categories to one or more variables. For example, a team of tutors observes garments made by student teachers to record the fit of each garment. Rating scales with a set of criteria can be used to assess various processes and fashion features of the garments. The results of different tutors assessing the same student are compared; if there is a strong correlation between all sets of results, the assessment has high inter-rater reliability (see the sketch below).
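One simple way to express this agreement numerically is sketched below with invented ratings: the proportion of garments on which two tutors give exactly the same rating (chance-corrected indices such as Cohen's kappa, or a correlation coefficient as in the passage above, are also commonly used).

# Hypothetical fit ratings (1-5 scale) given by two tutors to the same eight garments.
tutor_a = [4, 3, 5, 2, 4, 4, 3, 5]
tutor_b = [4, 3, 4, 2, 4, 5, 3, 5]

agreements = sum(a == b for a, b in zip(tutor_a, tutor_b))
percent_agreement = 100 * agreements / len(tutor_a)

# High agreement between raters -> high inter-rater reliability.
print(f"Exact agreement: {percent_agreement:.0f}%")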
 Parallel forms reliability:
 This measures the correlation between two equivalent versions of a test. It is used when you have two assessment tools or sets of questions designed to measure the same thing. For example, a set of questions is formulated to measure financial risk aversion in a group of respondents. The questions are randomly divided into two sets, and the respondents are randomly divided into two groups. Both groups take both tests: group A takes test A first and group B takes test B first. The results of the two tests are compared and found to be almost identical, indicating high parallel forms reliability.
 Internal consistency:
 This assesses the correlation between multiple items in a test that are intended to measure the same construct. It can be calculated without repeating the test or involving other researchers; a common index, Cronbach's alpha, is sketched below.
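The most common internal-consistency index is Cronbach's alpha, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores), where k is the number of items. A minimal sketch with invented item scores:

from statistics import pvariance

# Hypothetical scores of six pupils (rows) on four items (columns)
# that are all intended to measure the same construct.
item_scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [3, 3, 3, 4],
    [5, 4, 5, 5],
    [2, 3, 2, 2],
]

k = len(item_scores[0])                                       # number of items
item_variances = [pvariance(col) for col in zip(*item_scores)]
total_variance = pvariance([sum(row) for row in item_scores])

alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(round(alpha, 2))   # values near 1 indicate high internal consistency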
Factors that affect the reliability of a test
 1. Characteristics of the test:
 A test is usually a composite of single items, and it takes on the characteristics of the individual items that make it up. It follows that any weakness in the individual items from which the total score is derived will be reflected in the total scores as errors. The errors introduced into the total scores in turn reduce the reliability of the test.
 2. Test difficulty:
 The difficulty of a selection-type test item is expressed in terms of the proportion of examinees who answer the particular item correctly (see the sketch below). The difficulty of a test depends on the difficulty of its component items. When a test is too difficult, students may be induced to guess the answers to the items, introducing errors into the scores.
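As a rough sketch with invented response data, the difficulty index p of an item is simply the proportion of examinees who answered it correctly; a small p indicates a difficult item.

# Hypothetical results for one multiple-choice item: 1 = correct, 0 = wrong.
responses = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

difficulty_index = sum(responses) / len(responses)   # proportion answering correctly
print(difficulty_index)   # 0.6 -> 60% of examinees answered the item correctly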
 3. Test length:
 This refers to the number of items in a test. Generally, other things being equal, the longer the test, the higher the reliability, because a test with a limited number of items is not likely to measure the abilities or behaviours under consideration accurately. The formula sketched below quantifies this relationship.
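The effect of test length on reliability is commonly quantified with the Spearman-Brown prophecy formula (a standard result, added here as an illustration rather than taken from the original notes): if a test with reliability r is lengthened n times with comparable items, the predicted reliability is nr / (1 + (n - 1)r).

def spearman_brown(r: float, n: float) -> float:
    """Predicted reliability when a test of reliability r is lengthened n times."""
    return n * r / (1 + (n - 1) * r)

# A test with reliability 0.60 that is doubled in length with similar items:
print(round(spearman_brown(0.60, 2), 2))   # 0.75 -> the longer test is more reliable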
 4. Time allocated to the test:
 Testing time affects students’ performance. If the time allocated for a test is too short, students will not have enough time to read and think about the problems before answering them, which could lead to guessing. On the other hand, if the time is too long, fast students will finish and may be tempted to help their colleagues, leading to irregularities. Adequate time should be given to students to take any test so that they can demonstrate their understanding and comprehension.
 5. Testing conditions:
 When uniformity of the testing conditions is not ensured, inconsistencies are likely to be introduced into the students’ performance, which will affect their scores. Maintaining uniform testing conditions is essential to reducing errors and making the results reliable.
 6. Group variability:
 Group variability influences reliability because the reliability coefficient is directly influenced by the spread of scores in the group assessed. Other things being equal, the larger the spread of scores, the higher the estimate of reliability will be. In general, if the group tested is heterogeneous, the reliability of the scores tends to be high, as the simulation sketched below illustrates.
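A minimal simulation (using invented parameters) of why this happens: with the same amount of measurement error, two measurements of the same pupils correlate more strongly when the spread of true ability is large than when the group is homogeneous.

import random
from statistics import correlation

random.seed(1)

def reliability_estimate(spread, n_pupils=500, error_sd=5.0):
    """Correlation between two noisy measurements of the same true scores."""
    true = [random.gauss(50, spread) for _ in range(n_pupils)]
    measure_1 = [t + random.gauss(0, error_sd) for t in true]
    measure_2 = [t + random.gauss(0, error_sd) for t in true]
    return correlation(measure_1, measure_2)

print(round(reliability_estimate(spread=15), 2))   # heterogeneous group -> higher coefficient
print(round(reliability_estimate(spread=3), 2))    # homogeneous group  -> lower coefficient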
 7. Subjectivity in scoring:
 If a test is subjectively scored, inconsistencies create random errors in the scores, which in turn lower the reliability of the test.
Self-assessment questions
 Distinguish between validity and reliability in a test or examination.
 State and explain two (2) types of each of the following:
 Validity
 Reliability
 Explain two factors each that affect the validity and reliability of tests.
