Module 7 g&m


  1. TEAM 4 — MODULE 7: CHAPTER 17 G&M
  2. EVALUATION PRE-DISCUSSION
     What is evaluation to you? Our text lists five program issues for evaluation:
     ● Quality
     ● Suitability
     ● Effectiveness
     ● Efficiency
     ● Importance
     How do these components relate to your place of work? Can you provide examples?
  3. EVALUATION
     Evaluation: estimating value. Two steps:
     1. Compare results and objectives
     2. Appraise or judge the value of the differences assessed
     What is the difference between “measurement” and “evaluation”?
  4. EVALUATION
     A systematic process, reliant upon multiple skills:
     ● Collecting information
     ● Interpreting data
     ● Drawing conclusions
     ● Communicating outcomes
     The authors list two major purposes of evaluations. What are they?
  5. ADDIE MODEL (Analysis, Design, Development, Implementation, Evaluation)
  6. THE PROCESS
     1. Identify All Clients and Stakeholders and Clarify Their Needs
     2. Identify the Performance Improvement Initiative to Be Evaluated
     3. Identify and Clarify the Purposes for the Evaluation
     4. Determine the Critical Research Questions That the Evaluation Must Address
     5. Develop an Evaluation Design
     6. Analyze Resources and Constraints
     7. Determine the Best Data Collection Methods
     8. Plan Reporting and Communications Actions
     Can one person explain a step of the process from their own experience at work?
  7. KIRKPATRICK’S FOUR LEVELS OF EVALUATION MODEL
  8. KIRKPATRICK
     Most of us said that our organizations utilized the Kirkpatrick model of evaluation, but in a limited capacity (reaction and learning).
     What do you think would have been an effective method of incorporating Levels 3-4 (behavior and impact)?
  9. KIRKPATRICK ADVANTAGES
     ● Helps training professionals understand evaluation in a systematic way
     ● A straightforward system for discussing training outcomes
     ● Recognizes that single outcome measures cannot adequately reflect the complexity of organizational training programs (Bates, 2004)
     Are there any other advantages of Kirkpatrick’s model that you would like to add?
  10. KIRKPATRICK LIMITATIONS
      ● The model is incomplete: an oversimplified view of training effectiveness. Other factors influence training outcomes:
        o Learning culture of the organization
        o Organizational or work unit goals and values
        o Interpersonal support
        o Climate for learning transfer
        o Adequacy of material resources (Bates, 2004)
  11. KIRKPATRICK LIMITATIONS
      ● Assumption of causal linkages: the model assumes a causal relationship between the levels of evaluation
        o Research has failed to confirm these causal linkages
        o “If training is going to be effective, it is important that trainees react favorably” and “without learning, no change in behavior will occur” (Kirkpatrick, 1994; Bates, 2004)
  12. KIRKPATRICK LIMITATIONS
      ● Incremental importance of information: the model assumes that each level of evaluation provides data that are more informative than the last
        o Perception that establishing level 4 results provides the most useful information
        o “In practice, however, the weak conceptual linkages inherent in the model and resulting data it generates do not provide an adequate basis for this assumption” (Bates, 2004)
  13. EVALUATION
      “Evaluation is the systematic process of delineating, obtaining, reporting, and applying descriptive and judgmental information about some object’s merit, worth, probity [moral correctness], feasibility, safety, significance, or equity” (Stufflebeam & Shinkfield, 2007)
  14. STUFFLEBEAM’S CIPP MODEL (1983)
      Context - Input - Process - Product
  15. STUFFLEBEAM’S CIPP MODEL
      ● Context: What needs to be done?
      ● Input: How should it be done?
      ● Process: Is it being done?
      ● Product: Did it succeed?
  16. STUFFLEBEAM’S CIPP MODEL
      Uses for the CIPP model:
      ● Conduct a needs analysis
      ● Evaluate alternatives for addressing needs
      ● Monitor the design/implementation of interventions
      ● Examine the outcomes of an intervention with regard to its impact on the organization
  17. COST BENEFIT MODEL (KEARSLEY, 1986)
      ● Addresses decision-makers’ concerns about justifying the investment in interventions/initiatives
      ● Provides a framework for comparing alternatives for future investments
  18. RETURN ON INVESTMENT
      ● Provides the expected benefit or return on investment
      ● Expressed as a percentage or in actual dollars
      ● Identify the benefits of the intervention in dollars, then either divide by the costs (yielding a percentage) or subtract the costs (yielding net dollars)
  19. RETURN ON INVESTMENT
      ● Helps decide how best to allocate resources
      ● Disadvantage: most interventions or initiatives provide benefits that are hard to quantify
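The arithmetic behind the two ROI slides above can be sketched as follows. The dollar figures are hypothetical, and the percentage form follows the common convention of net benefit divided by cost; the text itself does not fix a single formula.

```python
def roi_percent(benefits: float, costs: float) -> float:
    """ROI as a percentage: net benefit divided by cost."""
    return (benefits - costs) / costs * 100

def net_benefit(benefits: float, costs: float) -> float:
    """ROI in actual dollars: benefits minus costs."""
    return benefits - costs

# Hypothetical training program: $40,000 in measured benefits, $25,000 in costs.
print(roi_percent(40_000, 25_000))  # 60.0 (percent)
print(net_benefit(40_000, 25_000))  # 15000 (dollars)
```

As the disadvantage slide notes, the hard part in practice is not this division but putting a defensible dollar value on the benefits in the first place.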
  20. QUESTIONS
      1. What do you think is the difference between the Cost Benefit Model and the ROI model?
      2. Provide examples of when these models should be used.
  21. FORMATIVE EVALUATIONS
      A 7-step model:
      1. Determine purpose, objectives, participants (who wants this information)
      2. Assess information needs
      3. Consider proper protocol
      4. Describe the population to be studied; select subjects
      5. Identify other variables
      6. Formulate a study design
      7. Formulate a management plan
  22. FORMATIVE EVALUATIONS
      ● Most useful for evaluating instruction
      ● May also be used for performance improvement and change interventions
  23. EVALUATIVE INQUIRY
      “...evaluative inquiry can not only be a means of accumulating information for decision making and action...but that it also be equally concerned with questioning and debating the value of what we do in organizations” (Preskill & Torres, 1999)
  24. EVALUATIVE INQUIRY
      Evaluative inquiry is a way of fostering individual and team learning within an organization about issues that are critical to its purpose and what it values (Parsons, 2009)
  25. EVALUATIVE INQUIRY
      ● Collaboration
      ● Organizational learning and change
      ● Links learning and performance
      ● Diverse perspectives
  26. CASE STUDY FOR EVALUATION
      ● A study is needed to evaluate and redesign an online master’s degree program consisting of 12 courses in informatics
      ● Educators are concerned about the quality of online education courses
      ● Meaningful assessment is essential for improving the quality of such programs
  27. CASE STUDY FOR EVALUATION
      Considering the evaluation models that we have discussed:
      1. Which model(s) would you consider appropriate for this case? Why?
      2. Design an evaluation program that includes Steps 1-5 as described by Rothwell and Kazanas (Gilley & Maycunich, pp. 430-432)
  28. REFERENCES
      Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27(3), 341-34.
      Parsons, B. (2009). Evaluative inquiry for complex times. OD Practitioner, 41(1).
      Preskill, H., & Torres, R. T. (1999). Building capacity for organizational learning through evaluative inquiry. Evaluation, 5(1), 42-60.
