Arte387 Ch8

1. Assessment In Art Education
Chapter 8: Validity and Reliability
Facilitated by Marissa Barclay

2. Validity and Reliability? What do these have to do with teaching art?!
The art educator involved in assessment initiatives beyond the classroom needs a working knowledge of validity and reliability issues, as well as item analysis, all of which are critical to assessment practice.
In layman's terms: YOU NEED TO KNOW IT!
**All information in this PowerPoint is courtesy of the author, Donna Kay Beattie.**

3. VALIDITY
Validity: in short, inferences drawn from a test or assessment score need to be validated.
There are twelve validation criteria useful for judging the validity of performance-based art assignments.
These criteria should be addressed during the given performance task.

4. #1: Relevance
Relevance refers to the quality of fit between the purpose of the assessment and the selected performance format and tasks.
The art educator should look for the best possible match between purpose and assessment performance formats.

5. #2: Content Fidelity and Integrity
This refers to the authenticity of assessment task content.
The assessment task must faithfully reflect the integrity of the discipline and clearly show the field's most time-tested and valuable content and processes.

6. #3: Exhaustiveness
This term refers to the scope and comprehensiveness of performance task content and its underlying constructs; constructs are the internal qualities and behaviors that undergird a performance.

7. #4: Cognitive Complexity
Refers to the levels of intellectual complexity the performance task requires of students.
An example might be a holistic performance-based task that covers each of the four visual arts disciplines, their interconnections, and their connections to the other academic disciplines.

8. #5: Equity
Refers to the extent to which a task allows equal opportunities for all students to succeed.
The art educator can create a checklist of possible sources of bias to be used in reviewing the task as it is developed.

9. #6: Meaningfulness
Refers to how motivating, challenging, and satisfying a task is to both students and others who might have an interest in the task, such as parents, other teachers, administrators, and experts from the art disciplines.

10. #7: Straightforwardness
Refers to students' need to see and understand what is expected of them. In assessment terminology, this is known as transparency.

11. #8: Cohesiveness
Refers to the homogeneity of exercises in a performance.
If the holistic task score is to be interpreted as a valid measure of knowledge of the visual arts, then each exercise within the task should correlate well with the others.

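One quick way to check this is to compute correlations between students' scores on each pair of exercises. The sketch below is a minimal illustration, not part of Beattie's chapter; the exercise names and the five students' scores are hypothetical, and a real check would use the class's actual score table.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: each list holds one exercise's scores for the same five students.
exercise_scores = {
    "art history":    [4, 3, 5, 2, 4],
    "art criticism":  [5, 3, 4, 2, 4],
    "studio product": [4, 2, 5, 3, 5],
}

names = list(exercise_scores)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        r = pearson(exercise_scores[a], exercise_scores[b])
        print(f"{a} vs. {b}: r = {r:.2f}")
```

An exercise that correlates weakly with the rest may be measuring something other than the knowledge the holistic score is meant to represent.
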
12. #9: Consequences
Refers to both intended and unintended consequences of a task or of the interpretation of task scores.
The art educator can address the consequential criterion by anticipating possible assessment side effects (negative and positive) and hypothesizing potential testing outcomes.

13. #10: Directness
Refers to the extent to which the task reflects the actual behavior or characteristic being examined.
A direct assessment of students' abilities to criticize a work of art would be to have them write a critical review.
Knowledge and skills cannot be assessed directly, but are inferred from performances and products.

14. #11: Cost and Efficiency
The value of the performance task must outweigh the cost, and performance assessments generally cost more than multiple-choice or other pencil-and-paper test formats.
All aspects of a performance assessment should be carefully studied in an effort to devise ways to reduce costs and increase efficiency.

15. #12: Generalizability
Refers to the degree to which the results of a performance assessment can be generalized across different domains.
Contextual factors can be considered here because they are concerned with issues of transfer and generalizability.
Classroom settings, classroom management effects, and even rater variability are contextual issues that need to be analyzed before making judgments of transfer.

16. RELIABILITY
- This is a concept known in classical tests and measurement theory.
- It can also be defined as the consistency of scores.
- For the art educator, it is worthwhile to know what factors might cause an unreliable assessment score.
- Note: an assessment can be reliable without being valid, but it cannot be valid without being reliable.
- Meaning: reliability affects the quality of the decisions made on the basis of the derived score.

17. Reliability…continued
There are twelve procedures to help improve the reliability and generalizability of art assessments:
**Also listed on the handout**
- Assess the same material
- Broaden the scope of assessments
- Develop clear and concrete scoring criteria
- Make annotated examples showing each score
- Make scoring objective; use a scoring rubric
- When possible, use more than one scorer

18. Reliability…still continuing…
- If more than one judgment is needed, train scorers
- Check consistency; go back and check scores
- Score one question on all tests, then the next, and so on
- Provide practice and training assessments
- Craft tests to fit each student's needs
- Design tasks that help differentiate the most able from the least able students

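As one way to act on "use more than one scorer" and "check consistency," the sketch below computes simple agreement rates between two scorers. It is not from Beattie's chapter; the 1-4 rubric scale, the two scorers, and the ten scores are hypothetical.

```python
# A minimal sketch: two hypothetical scorers rate the same ten student works
# on a 1-4 rubric, and we check how often their scores agree.
scorer_a = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
scorer_b = [3, 4, 3, 3, 2, 4, 3, 2, 3, 3]

pairs = list(zip(scorer_a, scorer_b))
exact = sum(a == b for a, b in pairs) / len(pairs)              # identical scores
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)  # within one rubric point

print(f"Exact agreement:    {exact:.0%}")
print(f"Adjacent agreement: {adjacent:.0%}")
```

Low agreement is a signal to revisit the scoring criteria or retrain scorers before trusting decisions based on the derived scores.
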
19. Item Analysis
This is a psychometric issue related to validity and reliability.
The term is defined as "the process of collecting, summarizing, and using information from students' responses to make decisions about each assessment task."
Three steps of a simple item analysis on an important classroom test (written exam):
1) The teacher groups the assessments according to high and low scores.
2) Tallies are made of each group's responses to each test item.
3) Percentages for item responses are figured.

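A minimal sketch of those three steps, assuming a small, hypothetical class and four right/wrong test items; the high/low split here is simply the top and bottom halves of the class by total score. None of the names or responses come from the chapter.

```python
# Hypothetical right/wrong responses (1 = correct, 0 = incorrect)
# from six students on a four-item written exam.
students = {
    "Ana": [1, 1, 0, 1],
    "Ben": [1, 0, 0, 1],
    "Cam": [1, 1, 1, 1],
    "Dee": [0, 0, 0, 1],
    "Eli": [1, 1, 0, 0],
    "Fay": [0, 0, 1, 0],
}

# Step 1: group the assessments by total score (here, top half vs. bottom half).
ranked = sorted(students, key=lambda name: sum(students[name]), reverse=True)
half = len(ranked) // 2
high_group, low_group = ranked[:half], ranked[half:]

# Steps 2 and 3: tally each group's correct responses to each item,
# then figure the percentage of each group answering the item correctly.
num_items = len(next(iter(students.values())))
for item in range(num_items):
    high_pct = sum(students[name][item] for name in high_group) / len(high_group)
    low_pct = sum(students[name][item] for name in low_group) / len(low_group)
    print(f"Item {item + 1}: high group {high_pct:.0%} correct, low group {low_pct:.0%} correct")
```

Items that the high group answers correctly far more often than the low group are doing the most to differentiate the most able from the least able students; items that both groups miss, or both get right, may need revising.
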
20. Graphical descriptions of validity and reliability

21. ACTIVITY TIME!

22. What does "validity" mean?
Validity refers to the need for inferences drawn from a test or assessment score to be validated.

23. Name two or three criteria useful for judging the validity of performance-based art assignments
Relevance
Content fidelity and integrity
Exhaustiveness
Cognitive complexity
Equity
Meaningfulness
Straightforwardness
Cohesiveness
Consequences
Directness
Cost and efficiency
Generalizability

24. Define "reliability"
Reliability is a concept known in classical tests and measurement theory.
It can also be defined as the consistency of scores.

25. True or False: an assessment can be valid without being reliable
FALSE!
An assessment can be reliable without being valid, but it cannot be valid without being reliable.
Meaning: reliability affects the quality of the decisions made on the basis of the derived score.
