Chapter 7:
Measuring and Monitoring Program Outcomes:
The “bottom line”
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A Systematic Approach (7th ed.). Thousand Oaks, CA: Sage Publications.
Program Outcomes:
● The state of the target population or the social condition
that a program is expected to change
○ Observed characteristics of target population or
social condition - NOT of the program
○ Definition of outcome makes no direct reference to
program actions
○ Does not necessarily mean that program targets
have changed or program has caused change
Rossi, Lipsey, & Freeman, 2004, pp. 204-205
Program Outcomes:
Outcome Level, Outcome Change, and Net Effect
Outcome Level: the status of an
outcome at some point in time
Outcome levels alone cannot be
interpreted with any confidence as
indicators of a program’s success or
failure
Outcome Change: the difference
between outcome levels at
different points in time
Program Effect: portion of an
outcome change that can be
uniquely attributed to a program as
opposed to the influence of
another factor
The program effect is the difference between
the outcome level attained with participation
in the program and the level the same
individuals would have attained had they
not participated, which must be inferred
Rossi, et al., 2004, pp. 206-208
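The arithmetic behind these three concepts can be sketched with hypothetical numbers (a minimal Python illustration with invented scores, not data from the text):

```python
# Hypothetical test scores for a program's participants and a similar
# comparison group, measured before and after the program (invented numbers).
participant_pre, participant_post = 60.0, 75.0
comparison_pre, comparison_post = 60.0, 68.0

# Outcome level: status at one point in time (e.g., participant_post alone).
# Outcome change: difference between outcome levels at two points in time.
outcome_change = participant_post - participant_pre        # 15.0

# The comparison group approximates what would have happened without the
# program (the counterfactual, which can only be inferred, never observed).
counterfactual_change = comparison_post - comparison_pre   # 8.0

# Program effect: the portion of the outcome change attributable to the program.
program_effect = outcome_change - counterfactual_change    # 7.0

print(outcome_change, counterfactual_change, program_effect)
```

The point of the sketch: the outcome level (75.0) and even the outcome change (15.0) both overstate the program effect (7.0), which is why outcome levels alone cannot be read as success or failure.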
FIRST STEP in developing
measures for program outcomes is
identifying relevant outcomes.
An evaluator must consider:
● the perspectives of the stakeholders
● program impact theory
● prior research
● unintended outcomes
Consider Stakeholder Perspectives -
stakeholders' own understanding of what the program should accomplish and what outcomes they expect it to affect
● Direct Sources: stated objectives, goals, and mission of the
program; funding proposals, grants, and contracts for services
from outside sponsors
● Outcome description must indicate pertinent characteristic,
behavior, or condition the program is expected to change
Rossi, et al., 2004, p. 209
Consider Program Impact
Theory
(Chapter 5)
A program impact theory is helpful for
identifying and organizing program
outcomes because it connects
a program’s activities to proximal
outcomes that are expected to lead to
distal outcomes. Essentially, it is a
series of linked relationships between
program services and the ultimate
social benefits the program is intended
to produce.
Proximal Outcomes:
● Not the most important outcomes from a social or policy
perspective
● Most immediate outcomes
Distal Outcomes:
● Difficult to measure
● Greatest practical and political importance
● Influenced by many other factors
Rossi, et al., 2004, pp. 209-212
Consider Prior Research
● Evaluation research on similar programs
● Be aware of standard definitions and measures that have
established policy significance
● There may be known problems with certain definitions or
measures
Rossi, et al., 2004, p. 212
Consider Unintended
Outcomes
Much emphasis is placed on
identifying and defining the outcomes
that are expected; however, there may
be unintended positive or negative
outcomes and the evaluator must
make an effort to identify any potential
unintended outcomes that could be
significant for assessing the program’s
effects on the social condition(s).
● All types of prior research can be useful when considering
unintended outcomes
● Relationships with program personnel and participants can be
helpful when considering unintended outcomes
Rossi, et al., 2004, p. 213
SECOND STEP is to decide how
the selected outcomes will be
measured.
An evaluator must consider:
● One-dimensional vs. Multidimensional
● Measurement Procedures and Properties
● Reliability
● Validity
● Sensitivity
● Choice of Outcome Measures
Consider dimensions of outcomes
One-dimensional Outcomes:
● One intended outcome
Multidimensional Outcomes:
● Various components that the
evaluator needs to take into account
● Provide broader coverage of the
concept and allow the strengths of one
component to compensate for the
weaknesses of another
An evaluator should consider all dimensions before determining final measures.
Rossi, et al., 2004, pp. 214-215
Consider Measurement
Procedures and
Properties
Data come from a few basic sources:
observations, records, responses to
interviews and questionnaires, and
standardized tests.
The information from these sources
becomes a measurement when
operationalized - that is, generated
through a set of specific, systematic
operations or procedures.
Established Procedures and Instruments:
● There are established procedures and instruments in place
for many program areas
Ready-Made Measurements:
● Not necessarily suitable for certain outcomes
Evaluator Developed Measurements:
● Take time that the evaluator may not have
● There are well-established measurement development
procedures for consistency
All forms of measurement and procedures (established,
ready-made, and evaluator developed) must be checked.
Rossi, et al., 2004, pp. 217-218
Consider Three Measurement Properties:
Reliability, Validity, and Sensitivity
Reliability: the extent to which a
measure produces the same results
when used repeatedly to
measure the same thing.
Physical Characteristics - more reliable
Psychological Characteristics - less
reliable
Evaluators should measure twice or
include similar questions to check for
consistency
Validity: the extent to which a
measure measures what it is intended
to measure.
Validity is difficult to test; it
depends on whether the measure is
accepted as valid by stakeholders, or
on some comparison showing that the
measure yields the expected results
Sensitivity: extent to which the
values on the measure change
when there is a change or
difference in the thing being
measured.
Two ways outcome measures can be
insensitive:
1) May include elements that relate
to something other than what the
program could reasonably be
expected to change
2) When largely developed for
diagnostic purposes
Choice of Outcome Measure
An evaluator must take time in selecting
measures. A poorly chosen or poorly conceived measure
can undermine the worth of an impact assessment by
producing misleading estimates.
Rossi, et al., 2004, p. 222
THIRD STEP is to monitor
program outcomes.
Similar to program monitoring (Chapter 6), but outcome
monitoring continually collects and reviews information
relating to program outcomes.
Indicators for Outcome Monitoring
● Indicators should be as responsive as possible to program effects
● The most interpretable outcome indicators involve variables that only the program
can plausibly affect
● The outcome indicator easiest to link to the program is client (customer) satisfaction
○ Problems with the indicator of customer satisfaction
■ Some customers are not able to recognize program benefits
■ Some customers are reluctant to appear critical and overrate outcomes of program
Rossi, et al., 2004, pp. 225 - 226
Benefits of Outcome Monitoring
● Provides useful and relatively inexpensive information about program effects
● Provides timely information - could be available within months
● Generates feedback for program administration - NOT for assessing the program’s effect on
social conditions
Rossi, et al., 2004, p. 225
Limitations of Outcome Monitoring
● Outcome indicators can receive undue emphasis from program staff - like “teaching to the test”
● Natural tendency to fudge and pad indicators to make performance look better -
“corruptibility of indicators”
Rossi, et al., 2004, p. 227
FOURTH STEP is to interpret
outcome data.
When interpreting outcome data, an evaluator needs to
provide a suitable context and bring in information that
provides a relevant basis for comparison or explanation.
Evaluator considerations for interpreting outcome data:
● Information about changes in client mix, relevant demographic and economic trends, and
information about program process and service utilization
● A framework that provides some standard for judging what constitutes better or worse
outcomes within the limitations of data
○ Pre and post comparisons provide useful feedback to administration but not credible
findings about program’s impact
○ “Benchmarking” (comparing outcome values with those from similar programs) is only
meaningful for evaluation purposes when all other things are equal between the programs
being compared - a difficult standard to meet
Rossi, et al., 2004, pp. 228-231
Questions for Consideration:
1. What prior research is being utilized in your evaluation of a program? Are there any
established procedures and instruments that you can use in your evaluation? Are there
any ready-made measurements that you can utilize in your evaluation?
2. Have you ever used a ready-made evaluation in any type of evaluation? Assess the
suitability of this ready-made evaluation.
3. Assess the sensitivity of an evaluation you have been a part of in your past. Have you
ever used a measurement for group or program evaluation that was largely developed
for diagnostic purposes? What were the implications of using a diagnostic tool for group
or program evaluation?
4. In your experience with evaluations, have you ever experienced the pitfalls associated
with outcome monitoring? Explain why the pitfalls occurred or did not occur.

More Related Content

What's hot

Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluationSudipta Barman
 
project monitoring and evaluation
 project monitoring and evaluation project monitoring and evaluation
project monitoring and evaluationMir Mosharraf Hosain
 
M& e slide share
M& e   slide shareM& e   slide share
M& e slide shareSelf
 
6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid ProjectsTony
 
Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Muthuraj K
 
Impact evaluation
Impact evaluationImpact evaluation
Impact evaluationCarlo Magno
 
Project monitoring and evaluation by Samuel Obino Mokaya
Project monitoring and evaluation by Samuel Obino MokayaProject monitoring and evaluation by Samuel Obino Mokaya
Project monitoring and evaluation by Samuel Obino MokayaDiscover JKUAT
 
TTIPEC: Monitoring and Evaluation (Session 2)
TTIPEC: Monitoring and Evaluation (Session 2)TTIPEC: Monitoring and Evaluation (Session 2)
TTIPEC: Monitoring and Evaluation (Session 2)Research to Action
 
HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS
 HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS
HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTSAbraham Ncunge
 
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsDeveloping-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsNIDHI SEN
 
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION Mwiza Helen
 

What's hot (19)

Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluation
 
Project Monitoring and Evaluation
Project Monitoring and Evaluation Project Monitoring and Evaluation
Project Monitoring and Evaluation
 
Project Monitoring & Evaluation
Project Monitoring & EvaluationProject Monitoring & Evaluation
Project Monitoring & Evaluation
 
project monitoring and evaluation
 project monitoring and evaluation project monitoring and evaluation
project monitoring and evaluation
 
Project management
Project managementProject management
Project management
 
Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluation
 
M& e slide share
M& e   slide shareM& e   slide share
M& e slide share
 
6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects6 M&E - Monitoring and Evaluation of Aid Projects
6 M&E - Monitoring and Evaluation of Aid Projects
 
Monitoring and Evaluation
Monitoring and Evaluation Monitoring and Evaluation
Monitoring and Evaluation
 
Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.Monitoring and Evaluation for Project management.
Monitoring and Evaluation for Project management.
 
Plan Evaluation & Implementation
Plan Evaluation & ImplementationPlan Evaluation & Implementation
Plan Evaluation & Implementation
 
Impact evaluation
Impact evaluationImpact evaluation
Impact evaluation
 
Monitoring indicators
Monitoring indicatorsMonitoring indicators
Monitoring indicators
 
Project monitoring and evaluation by Samuel Obino Mokaya
Project monitoring and evaluation by Samuel Obino MokayaProject monitoring and evaluation by Samuel Obino Mokaya
Project monitoring and evaluation by Samuel Obino Mokaya
 
TTIPEC: Monitoring and Evaluation (Session 2)
TTIPEC: Monitoring and Evaluation (Session 2)TTIPEC: Monitoring and Evaluation (Session 2)
TTIPEC: Monitoring and Evaluation (Session 2)
 
Project monitoring
Project monitoringProject monitoring
Project monitoring
 
HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS
 HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS
HOW TO CARRY OUT MONITORING AND EVALUATION OF PROJECTS
 
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-ProjectsDeveloping-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
Developing-Monitoring-And-Evaluation-Framework-for-Budget-Work-Projects
 
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
THE ROLE OF THE BAORD IN MONITORING AND EVALUATION
 

Similar to Chapter7

Evaluation of health programs
Evaluation of health programsEvaluation of health programs
Evaluation of health programsnium
 
Program evaluation 20121016
Program evaluation 20121016Program evaluation 20121016
Program evaluation 20121016nida19
 
Program Evaluations to avoid aids/HIV for children
Program Evaluations to avoid aids/HIV for childrenProgram Evaluations to avoid aids/HIV for children
Program Evaluations to avoid aids/HIV for childrenBenjamingabanelabong
 
COMMUNITY EVALUATION 2023.pptx
COMMUNITY  EVALUATION 2023.pptxCOMMUNITY  EVALUATION 2023.pptx
COMMUNITY EVALUATION 2023.pptxgggadiel
 
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptx
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptxPROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptx
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptxleamangaring12
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christieharrindl
 
PCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on EvaluationPCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on Evaluationrexcris
 
Workbook for Designing a Process Evaluation
 Workbook for Designing a Process Evaluation  Workbook for Designing a Process Evaluation
Workbook for Designing a Process Evaluation MoseStaton39
 
Workbook for Designing a Process Evaluation .docx
Workbook for Designing a Process Evaluation .docxWorkbook for Designing a Process Evaluation .docx
Workbook for Designing a Process Evaluation .docxAASTHA76
 
Workbook for Designing a Process Evaluation
 Workbook for Designing a Process Evaluation  Workbook for Designing a Process Evaluation
Workbook for Designing a Process Evaluation MikeEly930
 
A Good Program Can Improve Educational Outcomes.pdf
A Good Program Can Improve Educational Outcomes.pdfA Good Program Can Improve Educational Outcomes.pdf
A Good Program Can Improve Educational Outcomes.pdfnoblex1
 
Chapter Two PME.pptx
Chapter Two PME.pptxChapter Two PME.pptx
Chapter Two PME.pptxHinSeene
 
Monitoring and Evaluation of Health Services
Monitoring and Evaluation of Health ServicesMonitoring and Evaluation of Health Services
Monitoring and Evaluation of Health ServicesNayyar Kazmi
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptmodestuseveline
 

Similar to Chapter7 (20)

M&E.ppt
M&E.pptM&E.ppt
M&E.ppt
 
Evaluation of health programs
Evaluation of health programsEvaluation of health programs
Evaluation of health programs
 
Labor Markets Core Course 2013: Monitoring and evaluation
Labor Markets Core Course 2013: Monitoring and evaluation Labor Markets Core Course 2013: Monitoring and evaluation
Labor Markets Core Course 2013: Monitoring and evaluation
 
Program evaluation 20121016
Program evaluation 20121016Program evaluation 20121016
Program evaluation 20121016
 
Program Evaluations to avoid aids/HIV for children
Program Evaluations to avoid aids/HIV for childrenProgram Evaluations to avoid aids/HIV for children
Program Evaluations to avoid aids/HIV for children
 
COMMUNITY EVALUATION 2023.pptx
COMMUNITY  EVALUATION 2023.pptxCOMMUNITY  EVALUATION 2023.pptx
COMMUNITY EVALUATION 2023.pptx
 
Logical framework
Logical  frameworkLogical  framework
Logical framework
 
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptx
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptxPROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptx
PROJECTPROGRAM IMPLEMENTATION, MONITORING AND EVALUATION.pptx
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christie
 
PCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on EvaluationPCM - Project Cycle Management, Training on Evaluation
PCM - Project Cycle Management, Training on Evaluation
 
Program Evaluations
Program EvaluationsProgram Evaluations
Program Evaluations
 
Workbook for Designing a Process Evaluation
 Workbook for Designing a Process Evaluation  Workbook for Designing a Process Evaluation
Workbook for Designing a Process Evaluation
 
Workbook for Designing a Process Evaluation .docx
Workbook for Designing a Process Evaluation .docxWorkbook for Designing a Process Evaluation .docx
Workbook for Designing a Process Evaluation .docx
 
Workbook for Designing a Process Evaluation
 Workbook for Designing a Process Evaluation  Workbook for Designing a Process Evaluation
Workbook for Designing a Process Evaluation
 
Monitoring and evaluation for hiv
Monitoring and evaluation for hivMonitoring and evaluation for hiv
Monitoring and evaluation for hiv
 
Program Evaluation 1
Program Evaluation 1Program Evaluation 1
Program Evaluation 1
 
A Good Program Can Improve Educational Outcomes.pdf
A Good Program Can Improve Educational Outcomes.pdfA Good Program Can Improve Educational Outcomes.pdf
A Good Program Can Improve Educational Outcomes.pdf
 
Chapter Two PME.pptx
Chapter Two PME.pptxChapter Two PME.pptx
Chapter Two PME.pptx
 
Monitoring and Evaluation of Health Services
Monitoring and Evaluation of Health ServicesMonitoring and Evaluation of Health Services
Monitoring and Evaluation of Health Services
 
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..pptINTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
INTRODUCTION TO PROGRAMME DEVELOPMENT..ppt
 

Recently uploaded

18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfakmcokerachita
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsanshu789521
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxAvyJaneVismanos
 
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxAnaBeatriceAblay2
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerunnathinaik
 

Recently uploaded (20)

18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdf
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Presiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha electionsPresiding Officer Training module 2024 lok sabha elections
Presiding Officer Training module 2024 lok sabha elections
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptx
 
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
internship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developerinternship ppt on smartinternz platform as salesforce developer
internship ppt on smartinternz platform as salesforce developer
 

Chapter7

  • 1. Chapter 7: Measuring and Monitoring Program Outcomes: The “bottom line” Rossi, P.H., Lipsey, M.W., & Freeman, H.E. (2004). Evaluation: A systematic Approach (7th edition). Thousand Oaks, CA: Sage Publications.
  • 2. Program Outcomes: ● The state of the target population or the social condition that a program is expected to have change ○ Observed characteristics of target population or social condition - NOT of the program ○ Definition of outcome makes no direct reference to program actions ○ Does not necessarily mean that program targets have changed or program has caused change Rossi, Lipsey, & Freeman, 2004, pp. 204-205
  • 3. Program Outcomes: Outcome Level, Outcome Change, and Net Effect Outcome Level: the status of an outcome at some point in time Outcome levels alone cannot be interpreted with any confidence as indicators of a program’s success or failure Outcome Change: difference between outcome and levels at different point in time Program Effect: portion of an outcome change that can be uniquely attributed to a program as opposed to the influence of another factor Program effect is the difference between outcome level attained with participation in program and that which the same individuals would have attained had they not participated, which must be inferred Rossi, et al., 2004, pp. 206-208
  • 4. FIRST STEP in developing measures for program outcome is identifying relevant outcomes. An evaluator must consider: ● the perspectives of the stakeholders ● program impact theory ● prior research ● unintended outcomes
  • 5. Consider Stakeholder Perspectives - own understanding of what program should accomplish and what outcomes they expect it to affect ● Direct Sources: Stated objective, goals, missions of program, funding proposals, grants and contracts for services from outside sponsors ● Outcome description must indicate pertinent characteristic, behavior, or condition the program is expected to change Rossi, et al., 2004, p. 209
  • 6. Consider Program Impact Theory (Chapter 5) A program impact theory is helpful for identifying and organizing program outcomes because it connects program’s activities to proximal outcomes that are expected to lead to distal outcomes. Essentially, it is a series of lined relationships between program services and the ultimate social benefits the program is intended to produce. Proximal Outcomes: ● Not the most important outcomes from a social or policy perspective ● Most immediate outcomes Distal Outcomes: ● Difficult to measure ● Greatest practical and political importance ● Influence by many other factors Rossi, et al., 2004, pp. 209-212
  • 7. Consider Prior Research ● Evaluation research on similar programs ● Be aware of standard definitions and measures that established policy significance ● Could be known problems with certain definitions or measures Rossi, et al., 2004, p. 212
  • 8. Consider Unintended Outcomes Much emphasis is placed on identifying and defining the outcomes that are expected; however, there may be unintended positive or negative outcomes and the evaluator must make an effort to identify any potential unintended outcomes that could be significant for assessing the program’s effects on the social condition(s). ● All types of prior research can be useful when considering unintended outcomes ● Relationships with program personnel and participants can be helpful when considering unintended outcomes Rossi, et al., 2004, p. 213
  • 9. SECOND STEP is to decide how the selected outcomes will be measured. An evaluator must consider: ● One-dimensional vs. Multidimensional ● Measurement Procedures and Properties ● Reliability ● Validity ● Sensitivity ● Choice of Outcome Measures
  • 10. Consider dimensions of outcomes One-dimensional Outcomes: ● One intended outcome Multidimensional Outcomes: ● Various components that the evaluator needs to take into account ● Provide for broader coverage of the concept and allow the strengths to compensate for the weaknesses of another An evaluator should consider all dimensions before determining final measures. Rossi, et al., 2004, pp. 214-215
• 11. Consider Measurement Procedures and Properties
Data come from a few basic sources: observations, records, responses to interviews and questionnaires, and standardized tests. The information from these sources becomes a measurement when it is operationalized - that is, generated through a set of specific, systematic operations or procedures.
Established Procedures and Instruments:
● Established procedures and instruments are in place for many program areas
Ready-Made Measurements:
● Not necessarily suitable for certain outcomes
Evaluator-Developed Measurements:
● Take time that the evaluator may not have
● Well-established measurement development procedures exist to ensure consistency
All forms of measurement and procedures (established, ready-made, and evaluator-developed) must be checked.
Rossi, et al., 2004, pp. 217-218
• 12. Consider Three Measurement Properties: Reliability, Validity, and Sensitivity
Reliability: the extent to which a measure produces the same results when used repeatedly to measure the same thing.
● Physical characteristics - more reliable
● Psychological characteristics - less reliable
● Evaluators should measure twice or include similar questions to check for consistency
Validity: the extent to which a measure measures what it is intended to measure. Validity is difficult to test:
● It depends on whether the measure is accepted as valid by stakeholders
● It depends on some comparison that shows the measure yields results consistent with what it is supposed to measure
Sensitivity: the extent to which the values on the measure change when there is a change or difference in the thing being measured. Two ways outcome measures can be insensitive:
1) They may include elements that relate to something other than what the program could reasonably be expected to change
2) They may have been developed largely for diagnostic purposes
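The reliability check described above ("measure twice and check for consistency") can be made concrete. A minimal sketch, with hypothetical scores (the data, function name, and threshold are illustrative assumptions, not from Rossi et al.): test-retest reliability is often summarized as the correlation between two administrations of the same measure to the same respondents.

```python
# Illustrative sketch: test-retest reliability as the Pearson correlation
# between two administrations of a measure. All scores are hypothetical.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The same five clients measured at two points in time (hypothetical scores).
time1 = [12, 15, 9, 20, 14]
time2 = [13, 14, 10, 19, 15]

r = pearson_r(time1, time2)
print(round(r, 2))  # → 0.98; values near 1.0 suggest a reliable measure
```

As the slide notes, scores on psychological characteristics would typically show a lower correlation between administrations than scores on physical characteristics.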
• 13. Choice of Outcome Measure
An evaluator must take time in selecting measures. A poorly chosen or poorly conceived measure can undermine the worth of an impact assessment by producing misleading estimates.
Rossi, et al., 2004, p. 222
• 14. THIRD STEP is to monitor program outcomes.
Outcome monitoring is similar to program monitoring (Chapter 6), but it continually collects and reviews information relating to program outcomes.
• 15. Indicators for Outcome Monitoring
● Indicators should be as responsive as possible to program effects
● The most interpretable outcome indicators involve variables that only the program can affect
● The outcome indicator easiest to link to a program is client (customer) satisfaction
○ Problems with customer satisfaction as an indicator:
■ Some customers are not able to recognize program benefits
■ Some customers are reluctant to appear critical and overrate the outcomes of the program
Rossi, et al., 2004, pp. 225-226
• 16. Benefits of Outcome Monitoring
● Provides useful and relatively inexpensive information about program effects
● Provides timely information - results can be available within months
● Generates feedback for program administration - NOT for assessing a program's effect on social conditions
Rossi, et al., 2004, p. 225
• 17. Limitations of Outcome Monitoring
● Outcome indicators can receive undue emphasis, distorting program activities - like "teaching to the test"
● There is a natural tendency to fudge and pad indicators to make performance look better - the "corruptibility of indicators"
Rossi, et al., 2004, p. 227
• 18. FOURTH STEP is to interpret outcome data.
When interpreting outcome data, an evaluator needs to provide a suitable context and bring in information that provides a relevant basis for comparison or explanation.
• 19. Evaluator considerations for interpreting outcome data:
● Information about changes in client mix, relevant demographic and economic trends, and information about program process and service utilization
● A framework that provides some standard for judging what constitutes better or worse outcomes within the limitations of the data
○ Pre-post comparisons provide useful feedback to administrators but not credible findings about a program's impact
○ "Benchmarking" (comparing outcome values with those from similar programs) is only meaningful for evaluation purposes when all other things are equal between the programs being compared - a difficult standard to meet
Rossi, et al., 2004, pp. 228-231
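The caution about pre-post comparisons follows directly from the earlier distinction between outcome level, outcome change, and program effect. A minimal arithmetic sketch, using hypothetical numbers (the figures and variable names are illustrative assumptions, not from Rossi et al.): a pre-post comparison yields outcome *change*, which overstates the program *effect* whenever other factors would have moved the outcome anyway.

```python
# Illustrative sketch: outcome change vs. program effect (hypothetical numbers).

pre_level = 40.0        # outcome level before the program (e.g., % employed)
post_level = 55.0       # outcome level after the program
counterfactual = 48.0   # estimated level had the same people not participated;
                        # this must be inferred (e.g., from a comparison group)

outcome_change = post_level - pre_level        # what a pre-post comparison shows
program_effect = post_level - counterfactual   # what can be attributed to the program

print(outcome_change, program_effect)  # → 15.0 7.0
# The 15-point change mixes the 7-point program effect with 8 points
# attributable to other influences (e.g., an improving economy).
```

This is why the slide treats pre-post comparisons as administrative feedback rather than credible evidence of impact: the counterfactual term is missing from the pre-post calculation.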
• 20. Questions for Consideration:
1. What prior research is being utilized in your evaluation of a program? Are there any established procedures and instruments that you can use in your evaluation? Are there any ready-made measurements that you can utilize?
2. Have you ever used a ready-made measurement in any type of evaluation? Assess the suitability of that ready-made measurement.
3. Assess the sensitivity of an evaluation you have been a part of in the past. Have you ever used a measurement for group or program evaluation that was largely developed for diagnostic purposes? What were the implications of using a diagnostic tool for group or program evaluation?
4. In your experience with evaluations, have you ever encountered the pitfalls associated with outcome monitoring? Explain why the pitfalls occurred or did not occur.