CHAPTER 6: VALIDITY

BELLWORK

True or False:

     “This is a valid test.”
STUDENT LEARNING OBJECTIVES
Students will learn how "validity" is used in reference to assessments

Students will learn about three types of validity evidence
NORMS
Please ask questions
WARM UP ACTIVITY




Let's Take a Test about Me

My favorite color is red.     T   F

I don't know how to swim.     T   F

I have a dog named Fido.      T   F

My watch is real gold.        T   F
CONTENT-RELATED EVIDENCE OF VALIDITY
Refers to the adequacy with which the content of a test represents the content of the curricular aim about which inferences are to be made.

Two Approaches:
1. Developmental Care
2. External Reviews
DEVELOPMENTAL CARE
Employ a set of test-development procedures focused on ensuring that the curricular aim's content is properly reflected in the assessment procedure itself.
EXTERNAL REVIEWS
Assembling judges who rate the content appropriateness of a given test in relation to the curricular aim the test allegedly represents.
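A minimal sketch, in Python, of how such reviewers' judgments might be tabulated. The judges, the four items, the 1-4 rating scale, and the cut-off below are all hypothetical and only for illustration; real review forms and decision rules vary.

```python
# Hypothetical illustration (not from the chapter): tabulating external reviewers'
# ratings of how well each item matches the curricular aim (1 = poor match, 4 = strong).
from statistics import mean

ratings = {
    "Judge A": [4, 3, 2, 4],
    "Judge B": [4, 4, 1, 3],
    "Judge C": [3, 4, 2, 4],
}

CUT_OFF = 3.0  # hypothetical threshold for flagging an item for revision

num_items = len(next(iter(ratings.values())))
for item in range(num_items):
    item_mean = mean(judge_ratings[item] for judge_ratings in ratings.values())
    verdict = "keep" if item_mean >= CUT_OFF else "revise or replace"
    print(f"Item {item + 1}: mean appropriateness {item_mean:.2f} ({verdict})")
```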
THE ISSUE OF ALIGNMENT
Norman Webb's (University of Wisconsin) method of determining alignment:

Categorical concurrence: Are the same or consistent categories used in both curricular expectations and assessments?

Depth-of-knowledge consistency: To what extent are the cognitive demands of curricular aims and assessments the same?

Range-of-knowledge correspondence: Is the span of knowledge reflected in curricular aims and assessments the same?

Balance of representation: To what degree are different curricular aims given equal emphasis on the assessments?
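As a rough sketch of how such checks might be tallied: real Webb alignment studies rely on trained reviewer panels and formal criteria, not a script, and the aims, items, and DOK levels below are invented.

```python
# Hypothetical tally in the spirit of Webb-style alignment checks: count items per
# curricular aim (categorical concurrence / balance of representation) and compare
# each item's depth of knowledge (DOK) with the DOK intended for the aim it targets.
from collections import Counter

aim_dok = {"Aim 1": 2, "Aim 2": 3}   # intended DOK level per curricular aim (invented)
items = [                            # (aim targeted, item DOK) for each test item (invented)
    ("Aim 1", 2), ("Aim 1", 1), ("Aim 1", 2),
    ("Aim 2", 3), ("Aim 2", 2),
]

item_counts = Counter(aim for aim, _ in items)
for aim, intended_dok in aim_dok.items():
    at_or_above = sum(1 for a, dok in items if a == aim and dok >= intended_dok)
    print(f"{aim}: {item_counts[aim]} items, "
          f"{at_or_above} at or above the intended DOK level of {intended_dok}")
```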
CRITERION-RELATED EVIDENCE OF VALIDITY
Collected only in situations where educators are using an assessment procedure to predict how well students will perform on some subsequent criterion variable.
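A minimal sketch of the usual summary statistic, a validity coefficient: the correlation between scores on the predictor test and students' later standing on the criterion variable. The scores below are invented for illustration.

```python
# Hypothetical illustration: a validity coefficient is simply the correlation between
# predictor-test scores and students' subsequent performance on the criterion variable.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

placement_test = [52, 61, 48, 70, 66, 55, 73, 58]               # predictor scores (invented)
later_course_grades = [2.1, 2.9, 1.8, 3.6, 3.1, 2.4, 3.8, 2.6]  # criterion (invented)

print(f"Validity coefficient: r = {pearson_r(placement_test, later_course_grades):.2f}")
```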
CONSTRUCT-RELATED EVIDENCE

Measuring what's hidden
Gathered through a series of studies

Three Approaches to Collecting Construct-Related Evidence of Validity:
1. Intervention Studies
2. Differential-Population Studies
3. Related-Measures Studies
INTERVENTION STUDIES
We hypothesize that students will respond differently to the assessment instrument after having received some type of treatment (or intervention).
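A bare-bones sketch of the logic, with invented scores: if the instrument taps the construct the instruction targeted, the group's posttest performance should shift.

```python
# Hypothetical illustration: in an intervention study we expect scores on the
# instrument to move after the treatment; the simplest summary is the mean gain.
pretest  = [10, 12, 9, 14, 11, 13]   # scores before instruction (invented)
posttest = [16, 15, 14, 18, 15, 17]  # scores after instruction (invented)

gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean pretest-to-posttest gain: {sum(gains) / len(gains):.1f} points")
```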
DIFFERENTIAL-POPULATION STUDIES
We hypothesize that individuals representing distinctly different populations will score differently on the assessment procedure under consideration.
RELATED-MEASURES STUDIES
We hypothesize that a given kind of relationship will be present between students' scores on the assessment device we're scrutinizing and their scores on a related or unrelated assessment device.

Convergent Evidence (+ +)
Discriminant Evidence (+ -)
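A minimal sketch with invented data (Python 3.10+ for statistics.correlation): convergent evidence is a strong correlation with a measure of the same construct, discriminant evidence a weak correlation with a measure of a different construct.

```python
# Hypothetical illustration of convergent (+ +) and discriminant (+ -) evidence.
from statistics import correlation  # Pearson's r; available in Python 3.10+

new_reading_test    = [55, 62, 47, 71, 66, 58, 74, 60]  # instrument under scrutiny (invented)
established_reading = [50, 60, 45, 75, 68, 55, 78, 59]  # related measure, same construct
typing_speed_wpm    = [44, 31, 39, 52, 28, 47, 35, 50]  # unrelated measure

print(f"Convergent:   r = {correlation(new_reading_test, established_reading):.2f}")
print(f"Discriminant: r = {correlation(new_reading_test, typing_speed_wpm):.2f}")
```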
SANCTIONED AND UNSANCTIONED FORMS OF VALIDITY EVIDENCE
Face Validity
• the appearance of a test seems to coincide with the use to which the test is being put


Consequential Validity
• refers to whether the uses of test results are valid



Refer to the Standards for Educational and Psychological Testing.
RELIABILITY/VALIDITY
Valid score-based inferences almost certainly guarantee that consistent test results are present.

Vs.

Consistent test results almost certainly guarantee that valid score-based inferences are present.

Evidence of valid score-based inferences almost certainly requires that consistency of measurement is present.
WHY DID I JUST SIT HERE AND LEARN ALL THIS?
Give serious thought to the content of an assessment domain being represented by a test.

There is value in having a colleague review your test's content.

At least you know about the other forms of validity evidence.

Validity does NOT reside in the test itself.
