Summative Evaluation
Created by John Hollenbeck, Ph.D.

Presentation Transcript

  • Summative Evaluation: the evaluation conducted after implementation
  • Involves collecting, analyzing, and summarizing data
  • For the purpose of giving decision makers information on the effectiveness and efficiency of instruction
  • Effectiveness of Content
    • Did the instruction solve the problem?
    • Was the criterion established prior to evaluation?
    • Was the criterion established in conjunction with the needs assessment?
  • Specifically
    • Did learners achieve the objectives?
    • How did learners feel about the instruction?
    • What were the costs?
    • How much time did it take?
    • Was instruction implemented as designed?
    • What unexpected outcomes occurred?
  • Alternative Approaches to Summative Evaluation: Objectivism and Subjectivism
  • Objectivism
    • Based on empiricism
      • Answering questions on the basis of observed data
      • Goal-based and replicable; uses the scientific method
  • Subjectivism
    • Employs expert judgment
    • Includes qualitative methods
      • observation and interviews
      • evaluate content
    • “Goal Free”
      • evaluators deliberately remain unaware of the program's goals
  • Objectivism (limitations)
    • Examines only a limited number of factors
    • May miss critical effects
  • Subjectivism (limitations)
    • Findings are not replicable
    • Biased by idiosyncratic experiences, perspectives, or the people conducting the evaluation
    • May miss critical effects
  • Designer's role in summative evaluation? Somewhat controversial
  • Timing of Summative Evaluation? Not in the first cycle
  • Summary Diagram
    • Formative
      • Design Reviews
      • Expert Reviews
      • One-to-one Eval.
      • Small Group Eval.
      • Field Trials
      • Ongoing Eval.
    • Summative
      • Determine Goals of the Evaluation
      • Select Orientation
      • Select Design
      • Design or Select Evaluation Measures
      • Collect Data
      • Analyze Data
      • Report Results
  • Goals of the Evaluation
    • What decisions must be made?
    • What are the best questions?
    • How practical is it to gather data?
    • Who wants the answer to a question?
    • How much uncertainty?
  • Orientation of Evaluation
    • Goal-based or goal-free
    • A middle ground?
    • Are quantitative or qualitative methods appropriate?
    • Experimental or naturalistic approach?
  • Select Design of Evaluation
    • Describes what data to collect
    • When the data will be collected
    • And under what conditions
    • Issues to consider:
      • How much confidence must we have that the instruction caused the learning? (internal validity)
      • How important is the generalizability? (external validity)
      • How much control do we have over the instructional situation?
  • Design or Select Evaluation Measures
    • Payoff outcomes
      • Is the problem solved?
        • Costs avoided
        • Increased outputs
        • Improved quality
        • Improved efficiency
  • Design or Select Evaluation Measures (2)
    • Learning Outcomes
      • Use the instruments you have already developed for the summative evaluation
      • But measure the entire program
  • Design or Select Evaluation Measures (3)
    • Attitudes
      • Rarely the primary payoff goals
      • Ask about learner attitudes toward
        • learning
        • instructional materials
        • subject matter
      • Indices of appeal
        • attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement
  • Design or Select Evaluation Measures (4)
    • Level of Implementation
      • degree to which the instruction was implemented
    • Costs
      • Cost-feasibility
      • Cost-effectiveness
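The cost measures above can be made concrete with a simple calculation. The sketch below uses entirely illustrative figures (none come from the presentation) to compare two hypothetical programs on cost per learner per unit of learning gain:

```python
# Hypothetical cost-effectiveness comparison of two instructional programs.
# All figures are illustrative, not taken from the presentation.

def cost_effectiveness(total_cost, learners, mean_gain):
    """Cost per learner per unit of learning gain (lower is better)."""
    return total_cost / (learners * mean_gain)

# Program A: $12,000 for 40 learners, mean score gain of 15 points
# Program B: $8,000 for 25 learners, mean score gain of 10 points
a = cost_effectiveness(12_000, 40, 15)
b = cost_effectiveness(8_000, 25, 10)

print(f"Program A: ${a:.2f} per learner per point gained")  # $20.00
print(f"Program B: ${b:.2f} per learner per point gained")  # $32.00
```

On these made-up numbers, Program A delivers learning more cheaply even though its total cost is higher; that is the kind of comparison cost-effectiveness data supports.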
  • Alternative Designs
    • Instruction then posttest
    • Pretest then instruction then posttest
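The two designs differ in what they let you compute: a posttest-only design yields only final scores, while a pretest-then-posttest design lets you measure each learner's gain. A minimal sketch with made-up scores:

```python
# Comparing the two alternative designs with illustrative scores.
from statistics import mean

# Design 1: instruction -> posttest (final scores only)
posttest_only = [78, 85, 90, 72, 88]
print(f"Posttest-only mean: {mean(posttest_only):.1f}")

# Design 2: pretest -> instruction -> posttest (per-learner gains)
pretest = [55, 60, 70, 50, 65]
posttest = [78, 85, 90, 72, 88]
gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean gain: {mean(gains):.1f}")
```

The trade-off is internal validity versus practicality: the pretest adds testing time and may itself influence learners, but without it a high posttest mean cannot show how much the instruction actually changed.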
  • The Report
    • Summary
    • Background
      • Needs assessment, audience, context, program description
    • Description of evaluation study
      • Purpose of evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness info., analysis of unintentional outcomes
  • The Report (continued)
    • Results
      • Outcomes, implementation, cost-effectiveness info., unintentional outcomes
    • Discussion
      • Causal relationship between program & results
      • Limitations of the study
    • Conclusion & Recommendations
  • Summary
    • Summative evaluation occurs after implementation
    • Limitations of subjective and objective evaluation
    • What to include in the report