Summative Evaluation
  • The evaluation conducted after implementation
Involves _______ data
  • Collecting
  • Analyzing
  • Summarizing
For the purpose of
  • Giving decision makers information on the effectiveness and efficiency of instruction
Effectiveness of Content
  • Does the instruction solve the problem?
  • Was the criterion established prior to the evaluation?
  • Was the criterion established in conjunction with the needs assessment?
Specifically
  • Did learners achieve the objectives?
  • How did learners feel about the instruction?
  • What were the costs?
  • How much time did it take?
  • Was the instruction implemented as designed?
  • What unexpected outcomes occurred?
Alternative Approaches to Summative Evaluation
  • Objectivism
  • Subjectivism
Objectivism
  • Based on empiricism
  • Answers questions on the basis of observed data
  • Goal-based and replicable; uses the scientific method
Subjectivism
  • Employs expert judgment to evaluate content
  • Includes qualitative methods such as observation and interviews
  • In “goal-free” evaluation, the evaluators are deliberately kept unaware of the program’s goals
Objectivism (limitations)
  • Examines only a limited number of factors
  • May miss critical effects
Subjectivism (limitations)
  • Results are not replicable
  • Biased by the idiosyncratic experiences and perspectives of the people who do the evaluation
  • May miss critical effects
Designer Role in Summative Evaluation?
  • Somewhat controversial, since designers may find it difficult to judge their own instruction impartially
Timing of Summative Evaluation?
  • Not in the first cycle
Summary Diagram
  • Formative: Design Reviews → Expert Reviews → One-to-one Eval. → Small Group Eval. → Field Trials → Ongoing Eval.
  • Summative: Determine Goals of the Evaluation → Select Orientation → Select Design → Design or Select Evaluation Measures → Collect Data → Analyze Data → Report Results
Goals of the Evaluation
  • What decisions must be made as a result of the evaluation?
  • What questions will best aid these decisions?
  • How practical is it to gather data to answer a question?
  • Who wants the answer to a question?
  • How much uncertainty is associated with the answer?
Orientation of the Evaluation
  • Goal-based or goal-free, or a middle ground?
  • Are quantitative or qualitative methods appropriate?
  • An experimental or naturalistic approach?
Select the Design of the Evaluation
  • The design describes what data to collect, when the data will be collected, and under what conditions
  • Issues to consider:
    • How much confidence must we have that the instruction caused the learning? (internal validity)
    • How important is generalizability? (external validity)
    • How much control do we have over the instructional situation?
Design or Select Evaluation Measures
  • Payoff outcomes: Is the problem solved?
    • Costs avoided
    • Increased outputs
    • Improved quality
    • Improved efficiency
Design or Select Evaluation Measures (2)
  • Learning outcomes
    • Use the instruments you have already developed for the summative evaluation
    • But measure the entire program (see the mastery-report sketch below)
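To make program-wide reporting of learning outcomes concrete, here is a minimal Python sketch that rolls assessment scores up to the share of learners reaching mastery on each objective. The objective names, scores, and 80-point cut-off are illustrative assumptions, not data from the slides.

    # Hypothetical scores (0-100) per objective, drawn from the assessment
    # instruments already developed for the program.
    scores_by_objective = {
        "Objective 1": [88, 92, 74, 95, 81],
        "Objective 2": [65, 70, 58, 90, 72],
        "Objective 3": [91, 84, 79, 88, 93],
    }

    MASTERY_CUTOFF = 80  # assumed criterion score

    # Report, program-wide, the share of learners reaching the criterion
    # on each objective.
    for objective, scores in scores_by_objective.items():
        mastered = sum(score >= MASTERY_CUTOFF for score in scores)
        pct = 100 * mastered / len(scores)
        print(f"{objective}: {mastered}/{len(scores)} learners at mastery ({pct:.0f}%)")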
Design or Select Evaluation Measures (3)
  • Attitudes
    • Rarely the primary payoff goal
    • Ask about learner attitudes toward the learning, the instructional materials, and the subject matter
    • Indices of appeal: attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement (see the survey sketch below)
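One common way to operationalize these attitude measures is a short Likert-type survey whose items map onto the indices of appeal. The sketch below is hypothetical: the three indices shown, the item wording, and the 1–5 scale are assumptions made for illustration only.

    from statistics import mean

    # Hypothetical 1-5 ratings, keyed by the index of appeal each item targets.
    responses = {
        "interest":    [4, 5, 3, 4, 4],  # e.g., "The lesson held my interest."
        "relevance":   [3, 4, 4, 5, 3],  # e.g., "The examples applied to my work."
        "credibility": [5, 4, 4, 4, 5],  # e.g., "The content seemed trustworthy."
    }

    # Summarize each index as a mean rating for the evaluation report.
    for index_name, ratings in responses.items():
        print(f"{index_name:<12} mean = {mean(ratings):.2f} (n = {len(ratings)})")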
Design or Select Evaluation Measures (4)
  • Level of implementation: the degree to which the instruction was implemented as designed
  • Costs
    • Cost-feasibility
    • Cost-effectiveness (see the sketch below)
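The difference between the two cost questions is easiest to see with numbers: cost-feasibility asks only whether the total cost fits the available budget, while cost-effectiveness relates cost to an outcome, such as dollars per point of average score gain. The program names, costs, gains, and budget below are invented for illustration.

    # Hypothetical cost comparison of two candidate programs.
    # Lower cost per unit of outcome is better, all else being equal.
    programs = {
        "Program A": {"total_cost": 42_000.0, "mean_gain": 12.5},
        "Program B": {"total_cost": 30_000.0, "mean_gain": 8.0},
    }

    BUDGET = 35_000.0  # assumed available budget for the cost-feasibility check

    for name, p in programs.items():
        feasible = p["total_cost"] <= BUDGET          # cost-feasibility
        ratio = p["total_cost"] / p["mean_gain"]      # cost-effectiveness: $ per point of gain
        print(f"{name}: feasible={feasible}, ${ratio:,.2f} per point of average gain")

Note that in this made-up example the cheaper program is the only feasible one, yet the more expensive program is more cost-effective; the two questions can point in different directions.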
Alternative Designs
  • Instruction, then posttest
  • Pretest, then instruction, then posttest (see the gain-score sketch below)
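The practical difference between these designs shows up in what can be computed afterwards. In the sketch below, with made-up scores, the posttest-only design yields just an end-state average, while the pretest-posttest design supports an average-gain estimate; without a comparison group, attributing that gain to the instruction still rests on the internal-validity concerns noted earlier.

    from statistics import mean

    # Design 1: instruction, then posttest only.
    posttest_only = [72, 85, 78, 90, 66]

    # Design 2: pretest, instruction, posttest (paired scores per learner).
    paired_scores = [(45, 72), (60, 85), (52, 78), (70, 90), (40, 66)]

    print(f"Posttest-only mean: {mean(posttest_only):.1f}")

    gains = [post - pre for pre, post in paired_scores]
    print(f"Average pretest-posttest gain: {mean(gains):.1f} points")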
The Report
  • Summary
  • Background: needs assessment, audience, context, program description
  • Description of the evaluation study: purpose of the evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness information, analysis of unintentional outcomes
The Report (continued)
  • Results: outcomes, implementation, cost-effectiveness information, unintentional outcomes
  • Discussion: causal relationship between the program and the results; limitations of the study
  • Conclusions & Recommendations
Summary
  • Summative evaluation takes place after implementation
  • Limitations of objective and subjective approaches to evaluation
  • What to include in the report

Editor's Notes

  • #3 Involves doing what with data? Collecting.
  • #5 Does the instruction solve the problem? Was the criterion established prior to evaluation? Was the criterion established in conjunction with the needs assessment?
  • #15 What decisions must be made as a result of the evaluation? What questions will best aid these decisions? How practical is it to gather data to answer a question? Who wants the answer to a question? How much uncertainty is associated with the answer?