The Evaluation Checklist

Advantages and disadvantages of evaluation checklists and how to use them to improve evaluation practice. Presented at USF Center for Research, Evaluation, Assessment, and Measurement.

Slide notes
  • Mnemonic device; inherently systematic; provides guidance for collection of evidence; highly relevant for evaluative purposes; NOT to be thought of as a measurement device
  • No statistically significant difference found among groups; inter-item correlation ranged from .79 to .96 (see the sketch below)
  • The Evaluation Checklist
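The note above reports a range of inter-item correlations for the critical feedback survey. A minimal sketch of how such a range might be computed, assuming hypothetical ratings; the item names and values below are illustrative placeholders, not the study's data:

```python
import numpy as np
import pandas as pd

# Hypothetical critical-feedback survey ratings (1-5 scale) from eight
# respondents; items and values are placeholders, not the study's data.
ratings = pd.DataFrame({
    "clarity":           [4, 5, 4, 4, 5, 3, 4, 5],
    "comprehensiveness": [4, 5, 4, 3, 5, 3, 4, 4],
    "ease_of_use":       [3, 4, 4, 3, 4, 3, 3, 4],
})

# Pairwise Pearson correlations between survey items.
corr = ratings.corr()
print(corr.round(2))

# A reported range such as .79 to .96 would be the minimum and maximum
# of the off-diagonal correlations.
off_diag = corr.values[~np.eye(len(corr), dtype=bool)]
print(f"inter-item correlation range: {off_diag.min():.2f} to {off_diag.max():.2f}")
```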

    1. The Evaluation Checklist
       Wes Martz, Ph.D.
    2. A mnemonic device that consists of a list of activities, items, and criteria used to perform a certain task.
    3. Guidance for the collection of relevant evidence used to determine the merit, worth, or significance of an evaluand.
    4. Existing Evaluation Checklists
       • Evaluation management
       • Evaluation models
       • Evaluation values and criteria
       • Metaevaluation
       • Evaluation capacity building
       • Checklist creation
       http://www.wmich.edu/evalctr/checklists
    5. Checklist Organizer
    6. Good Checklists
    7. Criteria to Evaluate Checklists
       • Applicability to full range of intended uses
       • Clarity
       • Comprehensiveness
       • Concreteness
       • Ease of use
       • Fairness
       • Parsimony
       • Pertinence to the content area
       Stufflebeam, D. L. (2000). Guidelines for developing evaluation checklists: The checklist development checklist. Available at http://www.wmich.edu/evalctr/checklists/guidelines_cdc.pdf
    8. Checklist Advantages
       • Consolidate vast knowledge in a parsimonious manner
       • Improve task performance
       • Reduce influence of halo and Rorschach effects
       • Reduce resource use
       • Improve memory recall
       • Set out minimum necessary steps in a process
       Scriven, M. (2005). Checklists. In S. Mathison (Ed.), Encyclopedia of evaluation (pp. 53-59). Thousand Oaks, CA: Sage.
    9. Checklist Disadvantages
       • Evaluation myopia
       • Inappropriate use
       • Fatigue resulting from overuse
       • Unnecessary complexity decreases reliability
       • Burdensome process delays completing the evaluation
       Martz, W. (in press). Validating an evaluation checklist using a mixed method design. Evaluation and Program Planning. doi:10.1016/j.evalprogplan.2009.10.005
    10. Organizational Effectiveness Evaluation Checklist (OEC)
        Developing and Validating an Evaluation Checklist: A Case Example
    11. OEC Overview
        • Organizational evaluation process framework
          - Iterative, explicit, weakly sequential
          - Six steps, 29 checkpoints
        • Criteria of merit checklist
          - 12 universal criteria of merit
          - 84 suggested measures
        (a data-structure sketch of this layout follows the transcript)
    12. OEC Overview
    13. Criteria to Evaluate the OEC
        • Applicability to full range of intended uses
        • Clarity
        • Comprehensiveness
        • Concreteness
        • Ease of use
        • Fairness
        • Parsimony
        • Pertinence to the content area
    14. OEC Validation Process
        • Phase 1: Expert panel review
          - Critical feedback survey
          - Written comments made on checklist
        • Phase 2: Field test
          - Single-case study
          - Semi-structured interview
    15. Expert Panel Overview
        • Study participants
          - Subject matter experts (organizational and evaluation theorists)
          - Targeted users (professional evaluators, organizational consultants, managers)
        • Review the OEC to provide critical feedback
          - Identify strengths and weaknesses
          - Complete the critical feedback survey
          - Write comments directly on the checklist
    16. Expert Panel Data Analyses
        • Critical feedback survey
          - Descriptive statistics
          - Parametric and nonparametric analysis of variance
        • Written comments on checklist
          - Hermeneutic interpretation
          - Thematic analysis to cluster and impose meaning
          - Triangulation across participants to corroborate or falsify the imposed categories
        (an analysis sketch follows the transcript)
    17. Key Findings
        • Pertinent to content area
        • Clear
        • Fair
        • Sound theory
        • Parsimony and ease of use were identified as areas to address
        • Content relevance
        • Representativeness
        • Substantive validity
    18. Field Test Overview
        • Conducted evaluation using the revised OEC
        • Strictly followed the OEC to ensure fidelity
        • Post-evaluation semi-structured client interview
        • A formative metaevaluation to detect and correct deficiencies in the process
    19. Observations from Field Test
        • Structured format minimized “scope-creep”
        • Identified several areas to clarify in the OEC
        • Reinforced need for multiple measures and transparency in standards
        • Minimal disruption to the organization
    20. Validity Study Assessment
        • Strengths
          - Relatively quick validation process
          - Based on relevant evaluative criteria
          - Features a real-world application
        • Weaknesses
          - Single-case field study
          - Selection of the case study
          - Selection of the expert panel members
        Martz, W. (in press). Validating an evaluation checklist using a mixed method design. Evaluation and Program Planning. doi:10.1016/j.evalprogplan.2009.10.005
    21. Lessons Learned
        • Checklist development should address unique attributes of the evaluand
        • Sampling frame is critical
        • Checklist validation should be grounded in theory, practice, and use
        • Mixed method approach provides increased confidence in validation conclusions
        • All checklists are a “work-in-process”
    22. Additional Checklist Resources
        • Martz, W. (in press). Validating an evaluation checklist using a mixed method design. Evaluation and Program Planning. doi:10.1016/j.evalprogplan.2009.10.005
        • Scriven, M. (2007). The logic and methodology of checklists. Available at http://www.wmich.edu/evalctr/checklists/papers/logic&methodology_dec07.pdf
        • Stufflebeam, D. L. (2000). Guidelines for developing evaluation checklists: The checklist development checklist. Available at http://www.wmich.edu/evalctr/checklists/guidelines_cdc.pdf
        • Stufflebeam, D. L. (2001). Evaluation checklists: Practical tools for guiding and judging evaluations. American Journal of Evaluation, 22, 71–79.
        http://www.wmich.edu/evalctr/checklists
    23. The Evaluation Checklist
        martz@EvaluativeOrganization.com
        @wesmartz
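Slide 11 describes the OEC as a process framework (six steps containing 29 checkpoints) paired with a criteria-of-merit checklist (12 universal criteria with 84 suggested measures). A minimal sketch of one way that structure could be represented in code; the step, checkpoint, and criterion names below are placeholders, since the slides do not list them:

```python
from dataclasses import dataclass, field

@dataclass
class Checkpoint:
    text: str
    completed: bool = False

@dataclass
class Step:
    name: str
    checkpoints: list[Checkpoint] = field(default_factory=list)

@dataclass
class Criterion:
    name: str                                           # one of the 12 universal criteria of merit
    measures: list[str] = field(default_factory=list)   # its suggested measures

@dataclass
class EvaluationChecklist:
    steps: list[Step]          # the OEC has six steps containing 29 checkpoints
    criteria: list[Criterion]  # 12 criteria with 84 suggested measures in total

    def progress(self) -> float:
        """Fraction of checkpoints marked complete across all steps."""
        checkpoints = [cp for step in self.steps for cp in step.checkpoints]
        done = sum(cp.completed for cp in checkpoints)
        return done / len(checkpoints) if checkpoints else 0.0

# Placeholder content; the real step, checkpoint, and criterion wording comes from the OEC itself.
oec = EvaluationChecklist(
    steps=[Step("Focus the evaluation", [Checkpoint("Identify the evaluand")])],
    criteria=[Criterion("Customer satisfaction", ["repeat purchase rate"])],
)
print(f"{oec.progress():.0%} of checkpoints complete")
```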

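Slide 16 lists descriptive statistics plus parametric and nonparametric analysis of variance for the critical feedback survey, and the notes report that no statistically significant group differences were found. A hedged sketch of that kind of comparison, assuming hypothetical ratings grouped by respondent type; the group labels, item, and values are illustrative, not the study's data:

```python
import pandas as pd
from scipy import stats

# Hypothetical clarity ratings (1-5 scale) by respondent group; illustrative only.
survey = pd.DataFrame({
    "group":   ["theorist"] * 4 + ["evaluator"] * 4 + ["manager"] * 4,
    "clarity": [4, 5, 4, 4, 4, 3, 4, 5, 5, 4, 4, 3],
})

# Descriptive statistics per respondent group.
print(survey.groupby("group")["clarity"].describe())

# Parametric one-way ANOVA and its nonparametric counterpart (Kruskal-Wallis),
# testing whether ratings differ among the three groups.
samples = [g["clarity"].to_numpy() for _, g in survey.groupby("group")]
f_stat, p_anova = stats.f_oneway(*samples)
h_stat, p_kw = stats.kruskal(*samples)
print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f}")
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_kw:.3f}")
```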