The document provides an overview of Lauren Decker's presentation on going beyond evaluation buzzwords. It discusses the importance of logic models and theories of change in evaluation. A logic model graphically depicts a program's key components, including resources, activities, participants, and short-, medium-, and long-term outcomes, and shows how these components are hypothesized to produce intended changes. Measuring implementation fidelity, the extent to which a program's components are delivered as intended, is important for evaluation. High-quality evaluations use sound practices, focus on use and usefulness, align with the program's logic model, and account for implementation fidelity. Evaluators should ask the right questions and have clearly defined outcomes.
Innovation Network's own workbook (revised in 2010), offering an introduction to the processes and concepts of the logic model. This workbook can be used alone or in conjunction with the Logic Model Builder at the Point K Learning Center.
The Logical Framework - a short course for NGOs (Tony)
A series of modules on project cycle, planning and the logical framework, aimed at team leaders of international NGOs in developing countries.
There is a handout to go with this module, a Logframe with blanks. http://www.slideshare.net/Makewa/exercise-watsan-logframe-with-blanks
When designing a project, you have two entry points: the traditional, simple Logic Model or the innovative, critical Theory of Change. This short presentation explains the differences between the two.
Via Evaluation's Jessica Weitzel and Caroline Taggart give you the tools and techniques for maximizing the usefulness of data that most organizations already collect, or could easily begin to collect.
More information: viaevaluation.com
Workbook for Designing a Process Evaluation (uploaded by MoseStaton39)
Workbook for Designing a Process Evaluation
Produced for the Georgia Department of Human Resources, Division of Public Health
By Melanie J. Bliss, M.A., and James G. Emshoff, Ph.D.
Department of Psychology, Georgia State University
July 2002
Evaluation Expert Session
July 16, 2002 Page 1
What is process evaluation?
Process evaluation uses empirical data to assess the delivery of programs. In contrast to outcome evaluation, which assesses the impact of the program, process evaluation verifies what the program is and whether it is being implemented as designed. Thus, process evaluation asks "what," and outcome evaluation asks "so what?"
When conducting a process evaluation, keep in mind these three questions:
1. What is the program intended to be?
2. What is delivered, in reality?
3. Where are the gaps between program design and delivery?
This workbook will serve as a guide for designing your own process evaluation for a program of your choosing. There are many steps involved in the implementation of a process evaluation, and this workbook will attempt to direct you through some of the main stages. It will be helpful to think of a service delivery program that you can use as your example as you complete these activities.
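The gap-finding step in the three questions above can be pictured as a comparison of the program as designed against the program as delivered. The component names below are hypothetical, invented purely for illustration.

```python
# A minimal sketch of the design-vs-delivery comparison described above.
# The program components listed here are hypothetical examples.

# 1. What is the program intended to be?
intended = {"intake interview", "weekly group session", "home visit", "exit survey"}

# 2. What is delivered, in reality?
delivered = {"intake interview", "weekly group session", "exit survey"}

# 3. Where are the gaps between program design and delivery?
gaps = intended - delivered          # components designed but not delivered
additions = delivered - intended     # unplanned components that crept in
```

In practice each "component" would be assessed with real observation or records data, but the same design-versus-delivery comparison is the heart of the exercise.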
Why is process evaluation important?
1. To determine the extent to which the program is being implemented according to plan
2. To assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned
3. To compare multiple sites with respect to fidelity
4. To provide validity for the relationship between the intervention and the outcomes
5. To provide information on what components of the intervention are responsible for outcomes
6. To understand the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
7. To provide managers feedback on the quality of implementation
8. To refine delivery components
9. To provide program accountability to sponsors, the public, clients, and funders
10. To improve the quality of the program, as the act of evaluating is itself an intervention
Stages of Process Evaluation
1. Form Collaborative Relationships
2. Determine Program Components
3. Develop Logic Model*
4. Determine Evaluation Questions
5. Determine Methodology
6. Consider a Management Information System
7. Implement Data Collection and Analysis
8. Write Report**
Also included in this workbook:
a. Logic Model Template
b. Pitfalls to avoid ...
PJM6125 Project Evaluation: Selecting Evaluation Tools
Overview and Rationale
For this assignment, you will select evaluation tools and add them to the Evaluation Goal Matrix that you developed as part of the previous assignment.
Program and Course Outcomes
This assignment is directly linked to the following key learning outcomes from the course syllabus:
LO3: Analyze and apply appropriate evaluation tools
LO7: Plan and conduct a tactical evaluation using both qualitative and quantitative measures
In addition to these key learning outcomes, you will also have the opportunity to evidence the following skills through completing this assignment:
Critical thinking
Problem solving
Essential Components & Instructions
Using the project you identified in Lesson 1 and the stakeholder analysis and performance metrics you identified as part of Lesson 2, you will select evaluation tools and add them to your Evaluation Goal Matrix from Lesson 2.
Begin by updating your Evaluation Goal Matrix with any feedback provided, then add a column titled 'Evaluation Tool' and select an evaluation tool for each of the metrics you identified during Lesson 2. You will therefore identify a minimum of one evaluation tool for each entry in your Evaluation Goal Matrix.
Once you identify the tools that you will use, write a few paragraphs on how each tool will be used, why the tool was selected, who will be responsible for performing the evaluation with the tool, and how the data will be used to help support the success of the project. These entries can be made below your updated Evaluation Goal Matrix. You should provide a thorough evaluation and explanation of each tool you list. You may wish to use materials from the lesson, readings, and external sources in writing the explanations; be sure to cite any sources that you use.
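The matrix update described above can be sketched in a few lines. The goals, metrics, and tool names below are invented for illustration and are not part of the assignment.

```python
# Hypothetical sketch: adding an 'Evaluation Tool' column to an
# Evaluation Goal Matrix, one tool per metric. All names are illustrative.
matrix = [
    {"Goal": "On-time delivery", "Metric": "Schedule variance"},
    {"Goal": "Stakeholder satisfaction", "Metric": "Survey score"},
]

tools = {
    "Schedule variance": "Earned value analysis",
    "Survey score": "Stakeholder questionnaire",
}

# At least one evaluation tool per entry in the matrix.
for row in matrix:
    row["Evaluation Tool"] = tools[row["Metric"]]
```

The same tool may map to several metrics; in that case the written explanation should cover every goal the tool addresses.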
Format
Below are some key guidelines you will want to follow. Think of this short list as a quality-control checklist, along with the attached grading rubric.
- Be sure you have identified at least one tool per goal in your matrix
- You may use a tool to assess multiple goals if it is appropriate; if you do, make sure your explanation provides sufficient detail to address all the goals the tool addresses
- Submit the updated Evaluation Goal Matrix and the narrative description and explanation of each tool in a single file (MS Word or .pdf)
- Include a cover page
- Provide a brief abstract about the process you went through to develop your two tables
- Provide a summary of your project
- Format the documents professionally
- The tables should be readable without having to zoom in on small text
Rubric(s)
This presentation gives a vivid description of the basics of doing a program evaluation, with a detailed explanation of the Logical Framework Approach (LFA) and a practical example from the CLICS project. It also includes the CDC framework for program evaluation.
N.B.: Kindly open the ppt in SlideShare mode to make full use of the animations.
Successful organizations constantly monitor, evaluate, and improve based on their successes and failures. Learn how to design your own monitoring and evaluation program with this deck from WAN, and learn more in our free Strategic Advocacy Course, available at: http://worldanimal.net/our-programs/strategic-advocacy-course-new/about
Project Management Methodologies
PPMP20009
Week 10 Lecture
Dr Bernard Wong
[email protected]
Assignment 4: Continuous Improvement Plan, Week 12 Friday. Open the course profile to review the criteria.
Reminder: PPMP20009 presentations in weeks 11 or 12.
Create your own Deming PDCA cycle relating to the last assignment that you handed in.
Change Management
Formulate change
Plan change
Implement change
Manage transition
Sustain change
Take the 'Act' segment of the PDCA cycle you created earlier and define the five CM stages.
Continuous Improvement?
Why are we wanting to improve?
Where are we now?
What are we working with?
"If you don't know where you are going, any road will get you there."
Cheshire Cat (Alice in Wonderland)
There are a number of things to consider when deciding what level of maturity to aim for.
Why do you want to increase your level of maturity in this space?
- Some might want to do it simply as a continuous improvement strategy.
- Some may be having issues with the performance of their program and project delivery, or with portfolio investment returns.
- Others may need it to be competitive in a market that looks at the P3M3 levels of organisations in the tendering process.
- Others may be required to undergo a mandatory audit, as the Qld Govt was in 2012.
One organisation that I have spoken with has noted that its environment has become increasingly fiscally constrained, and as such funding is much more competitive. It wants to increase certain sections of its maturity, specifically relating to benefits management, business case and blueprint development, so that it can be more competitive in seeking funding for initiatives. In this case it is not necessarily trying to improve its maturity as a whole, but an aspect of it. In doing so, however, it is likely that maturity in other areas will increase as well.
We need to know where you are now to assist in deciding where you want to go. This is where going through an assessment is essential, and I believe it should be independent. You can self-assess, but this will always be affected by bias. You need to baseline.
What are you working with? What is your organisational context? What resources do you have, both budget and people? Do you have authentic sponsorship, or are your leaders just ticking off a mandate? What is your organisational culture like: are people open to P3 management, or are they likely to see efforts to increase maturity as unnecessary overhead?
So when we went through this process we were fortunate to have an authentic sponsor, and we had a culture of project and program delivery, so the staff understood the value of the practice (and I do say practice rather than methodology; if you have experienced practitioners, they will argue methodology with you, and this is a good thing). We.
2. One word or phrase expressing how you feel about evaluation
3. "If you don't know where you are going, how are you gonna' know when you get there?" Yogi Berra
4. Where are we going and how will we get there?
What's a logic model and how does it help?
Defining seemingly interchangeable terms
Time and a tool to begin a logic model (or refine one)
What is actually happening and how does it help to know?
Program and evaluation importance of understanding implementation fidelity
Bringing thoughtful program lessons into evaluation plan and use
What makes a great evaluation?
Planning and asking the right question(s)
6. Logic Model vs. Theory of Change
Logic Model = graphically depicts what your program is intended to do
Key program components
Theory of Change = why your program operates as intended and how components and activities are hypothesized to move outcomes
Assumptions underlying expected change
7. Simple Logic Model
Inputs: Resources (what we invest)
Outputs: Activities (what we do) and Participation (who we reach)
Outcomes: Short, Medium, and Long term (what results we expect)
9. How does having a logic model help my program?
Program phase and corresponding logic model use:
New program: creation & planning
Existing program: documentation & communication
Redesigning an existing program: reinvention, improvement, & expansion
10. What does our program logic model look like? (10 minutes)
Use the paper provided to begin a logic model for your program.
Remember to include resources, intended activities, participants, and short-, mid-, and long-term outcomes.
If you can, also represent the theory of change (arrows).
11. What should we include in our program Logic Model?
Identify:
All key components of the program
Resources, intended activities, participation, short-, mid-, and long-term outcomes
Components and pathways (mediators) through which the program is expected to produce intended outcomes (the theory)
Student outcome(s) the program is designed to improve
Short-, mid-, and long-term outcomes
Other outcomes as well, if relevant
12. I have a Logic Model, now what?
What is fidelity of implementation?
13. Fidelity of Implementation
The extent to which key components of the program are delivered as originally intended by the developer
Key components = strategies, practices, activities, and behaviors that are critical to defining your program
15. Fidelity of Implementation
The extent to which key components of the program are delivered as originally intended by the developer
Key components = strategies, practices, activities, and behaviors that are critical to defining your program
When is your program not your program?
16. How do I measure fidelity of my program?
Fidelity has two dimensions:
Structure: structural key components
Process: interactional key components
17. How does measuring fidelity help my program?
Understanding fidelity helps you know about:
Participation
Activity delivery
Content
Example questions you can answer:
Were activities implemented by program staff according to design?
Did staff receive program content as planned?
What adaptations were made to the program?
What factors may have affected program fidelity?
18. “I think you should be more explicit here in Step Two.”
Evaluations without measuring fidelity…
19. How to measure fidelity in your program?
Visit someone else’s logic model in process.
For the activities listed, suggest:
What type of fidelity could be measured
Ways of measuring that type of fidelity for the activity
10 minutes
20. What do we need to do to understand fidelity of our program?
Periodically measure implementation fidelity
Measure separately for each key component
Specify thresholds for determining whether key components of the program were implemented with fidelity
Assess and report whether each key component was implemented with fidelity
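The steps above (measure each key component separately, compare against a pre-specified threshold, report per component) can be sketched as a short routine. The components, observed scores, and thresholds below are invented for illustration.

```python
# Per-component fidelity check: each key component is measured
# separately against its own pre-specified threshold.
# All numbers here are hypothetical.
components = {
    # component: (observed delivery rate, threshold for "with fidelity")
    "weekly tutoring sessions held": (0.92, 0.80),
    "staff trained on curriculum":   (0.70, 0.90),
    "student attendance":            (0.85, 0.75),
}

report = {
    name: {"observed": observed, "threshold": threshold,
           "with_fidelity": observed >= threshold}
    for name, (observed, threshold) in components.items()
}

for name, result in report.items():
    status = "met" if result["with_fidelity"] else "NOT met"
    print(f"{name}: {result['observed']:.0%} "
          f"(threshold {result['threshold']:.0%}) -> {status}")
```

Keeping the assessment per component, rather than averaging into a single fidelity score, matches the slide's point: one poorly delivered component can be hidden by a strong overall average.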
22. High-quality evaluations include:
Evaluators who clearly use sound evaluation principles and practices (transparency, consideration of context, etc.)
Complex programs require thoughtful, directed evaluations
Focus from the start on use of evaluation processes and products to provide information to, and about, the program being evaluated
Alignment to the program’s logic model and theory of change
Provision for some level of implementation fidelity measurement
23. High-quality evaluations include:
Preparation and planning for variation within multi-site evaluations
Consideration of clear deadlines for completion and deliverables set by the program
Upfront acknowledgement of the limitations of the proposed evaluation
Minimizing jargon to increase clarity when communicating results
Striking a balance between consultancy/recommendation support and maintaining unbiased distance as an independent evaluator
24. Before meeting with an evaluator:
Complete your program logic model and theory of change
Or bring discussions and ideas about key components so the evaluator can assist with development
Consider what you really want to learn
What outcomes do you want an evaluation to focus on for your program?
25. What do you really want to know?
Make sure you ask the right question!
26. Before meeting with an evaluator:
Complete your program logic model and theory of change
Consider what you really want to learn
What data are already available, or will need to be collected, to answer your right question(s)?
If you believe data will need to be collected, do you want to collect them or have the evaluator conduct the data collection?
Think about whether you have identified SMART outcomes!
27. SMART Outcomes are:
Specific: reflect simple information that is communicable and easily understood
Measurable: can changes be measured in reliable and valid ways?
Achievable: able to be collected, and sensitive to change, during the allotted time
Relevant: reflect information that is important and likely to be used
Time bound: progress can be tracked at the desired frequency within the allotted time
Source: Adapted from the World Bank Group.
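One way to make the checklist operational is a yes/no screen for each proposed outcome. The questions below paraphrase the slide; the pass rule (all five criteria must hold) and every name in the code are assumptions for illustration.

```python
# SMART screening questions, paraphrased from the slide's definitions
SMART_CRITERIA = {
    "specific":   "Is the outcome simple, communicable, and easily understood?",
    "measurable": "Can changes be measured in reliable and valid ways?",
    "achievable": "Can data be collected, and is the outcome sensitive to change, in the allotted time?",
    "relevant":   "Is the information important and likely to be used?",
    "time_bound": "Can progress be tracked at the desired frequency within the allotted time?",
}

def is_smart(answers: dict[str, bool]) -> bool:
    """Assumed pass rule: an outcome qualifies only if every criterion holds."""
    return all(answers.get(criterion, False) for criterion in SMART_CRITERIA)

# Hypothetical screening of one proposed outcome
answers = {"specific": True, "measurable": True, "achievable": False,
           "relevant": True, "time_bound": True}
print(is_smart(answers))  # failing one criterion means the outcome is not SMART
```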
28. Questions for an evaluator
How does your proposed evaluation align with our program logic model and theory of change?
What data do you need from us to conduct the evaluation? What data do you propose to collect, when, and in what format?
How intrusive will the data collection be for our program?
Will your evaluation be able to tell us if the program was in place as intended?
What we really want to know from an evaluation is _________. How will your proposed evaluation answer this question?
29. Questions to ask yourself about an evaluation proposed to you
Does the evaluation design reflect my logic model and theory of change?
Do the proposed outcomes follow my logic model and theory of change?
Is it feasible to collect data on the selected outcomes at the times I expect to see changes (short-, medium-, and long-term)?
Are the selected outcomes and measures relevant for my program and my stakeholders?
Does the evaluation include SMART outcomes that will be evaluated?