Program Evaluation: Can You Prove You Are Making a Difference? - Isaac Castillo
Workshop for NetworkPeninsula - Nonprofit EXcellence in Training (NEXT) Workshop series
Most nonprofits assume they are doing good work and changing the lives of their service recipients, but how do you really know? In this workshop, participants will learn some important concepts, approaches, and techniques to utilize when attempting to measure nonprofit effectiveness. Common mistakes and challenges will also be discussed.
Can you prove you are making a difference? February 2015. Isaac Castillo
This document discusses the importance of measuring outcomes for nonprofit organizations. It defines key terms like outputs, outcomes, short-term outcomes, and logic models. The document emphasizes that senior leadership must support outcome measurement and an organizational culture that values data. Staff should be given responsibility for leading the culture change to focus on outcomes. Messaging should frame outcome measurement positively to help frontline staff provide better services.
New Thinking On How To Measure Nonprofit Effectiveness - Isaac Castillo
The document outlines a framework of seven pillars for social sector excellence, with a focus on becoming a high-performance organization through courageous leadership, well-designed programs, learning culture, data-driven improvement, and impact evaluation. It discusses each pillar in detail and provides examples of how nonprofits can implement practices like defining populations served, designing evidence-based programs, establishing metrics, and using data for continuous learning and decision-making to better achieve their missions. The presentation aims to help nonprofits pursue organizational excellence and deliver meaningful, measurable results for those they serve.
This document describes an exercise to illustrate the importance of collecting outcomes data when making funding decisions between two after-school tutoring programs. Participants are asked which of the two programs they would fund after being given incremental data about each: the number of students served, hours of tutoring provided, and percentage of students who improved grades. Initially, Program 2 seems more effective, improving grades for 90% of students compared to 5% for Program 1. However, the final slide reveals that Program 1 stabilized student performance while the comparison group declined, highlighting the value of comparison data and not overlooking programs that maintain the status quo.
Workshop for NetworkPeninsula - Nonprofit EXcellence in Training (NEXT) Workshop series
How do you know which outcomes to measure as part of your collective impact effort, and how do you come to a consensus on those outcomes? This workshop will offer practical and theoretical advice on how a collective impact effort can identify, choose, and refine outcome areas as part of their work. Practical challenges (where will the data come from, who is going to analyze it, what will be done with the data) will also be discussed, along with some recommendations.
Can your nonprofit prove you are making a difference? - Isaac Castillo
Slides from a class taught at Tidewater Community College's Academy for Nonprofit Excellence. Focuses on the basics of data collection, outcome measurement, logic models, and performance management.
High Quality Program Evaluation in Nonprofits - Isaac Castillo
Presentation for Tidewater Community College Workforce Solutions school - the Academy for Nonprofit Excellence.
http://www.tccworkforce.org/non-profit-management
Program Evaluation Basics - Center for Nonprofit Success slides - Isaac Castillo
This document summarizes a presentation about program evaluation. The presentation discusses why program evaluation is important, defines what program evaluation is, and provides examples of how to conduct an effective program evaluation. Specifically, it covers evaluating outcomes, using both quantitative and qualitative data, comparing results to baselines or control groups, communicating findings, and determining the appropriate level of rigor and whether to conduct the evaluation internally or externally. The overall goal is to use data to improve programs and demonstrate their impact.
Data helps school based health centers to target services and tell their stories, but analysis can be intimidating and frustrating. Dabbling in the Data provides 15 interactive activities that organizations can use internally or with partners to make meaning of their data… without falling asleep! In this skill-building session, participants will learn how to learn from their data, yielding actionable insights and powerful stories. After practicing activities, attendees will reflect on how the activities might be used in their own organizations.
AGR Diversity Forum - Intro Slides 04.03.2015 - EmmaAGR
The document summarizes a diversity forum discussing how small, incremental improvements can have significant impacts. It notes that daily decisions and optimizing small things can add up over time. The forum focuses on monitoring socio-economic diversity among members, with around 25% currently doing so. There is an increasing focus on indicators like free school meals and whether applicants are first generation graduates. The agenda includes presentations from Enterprise, EDF, Oxford University, and Bank of England on their approaches to diversity and widening access.
In July 2013, Innovation Network's Kat Athanasiades and Veena Pankaj presented "Evaluation: Finding a Common Ground" at the Forum of Regional Associations of Grantmakers in Milwaukee, Wisconsin.
The purpose of this session was to highlight the common threads that distinguish regional associations from other organizational genres in the social sector. Regional associations promote effectiveness in philanthropy by providing grantmakers with opportunities to engage with others, share ideas, and generate best practices that support both the individual and collective impact of philanthropy. While each regional association employs its own unique mix of strategies to achieve impact, most are working towards a shared goal of promoting effectiveness within philanthropy. In this session, Innovation Network engaged participants in a group dialogue to draw out high level strategies and desired outcomes that are common across most regional associations. The resulting logic model is a tool that regional associations can use as a starting point when planning for evaluation.
Innovation Network is a nonprofit evaluation, research, and consulting firm. We provide knowledge and expertise to help nonprofits and funders learn from their work to improve their results. For more information, please visit www.innonet.org.
This is the slide deck for the workshop that Helen Bevan ran on "Transformational leadership in healthcare" at the International Conference on Residency Education, Vancouver, Canada, on October 24th 2015
This document proposes targeting working mothers with a new energy drink called Zenergy. It notes that while the energy drink market has grown exponentially, most people are not interested in the category. It sees an opportunity to appeal to a new, older audience by focusing on mental energy and wellbeing rather than physical stimulation. The proposed product is a caffeine-based drink with added vitamins, positioned as an ongoing source of mental energy. The target is overworked mothers aged 25-40 who suffer from "depleted mother syndrome" due to balancing work and family responsibilities without rest. A digital marketing strategy is outlined to engage this audience of "digital moms" through online influencers and content.
Evergreen and Growth: Sustainable Content Strategy for Social Media Managers - Ma'ayan Plaut
The document discusses sustainable strategies for social media content management. It uses farming and gardening metaphors to describe techniques for planning, creating, organizing, sharing, and analyzing social media content over time. This includes conducting a content audit, creating a master calendar and spreadsheets to organize content by theme and timeline, tagging and linking content, and tracking engagement over time to inform future strategies. The goal is to develop a long-term content strategy that is purposeful, organized, and adapts based on measurement of past performance.
New Frameworks for Measuring Capacity and Assessing Performance - TCC Group
If we start with the assumption that — in order to improve our social sector as a whole — those who do the work to strengthen our communities (the nonprofits) are equally as critical as those responsible for providing the resources for the work to get done (the foundations), then why wouldn’t we expect all social sector actors to build their capacity? How do we know when our grantees and our foundations are becoming more effective and impactful as a result of our capacity investments, organizational development efforts and technical assistance? What does a high performing organization or foundation look like? And can we measure that?
This presentation, provided during the Grantmakers for Effective Organizations 2016 National Conference in Minneapolis, reviews and demonstrates existing resources for assessing nonprofit and foundation capacity and effectiveness. Speakers introduced the pros and cons of a variety of rubrics in use in the field and offered guidance on how funders decide on the right fit for the desired purpose. Grantmaker peers also shared how they used different frameworks and tools to assess individual nonprofits and grantee cohorts. Session participants left with increased awareness of the importance of the facilitator’s role in interpreting data gleaned from assessments and of the data collection methods most appropriate for their organization.
Organizational Capacity-Building Series - Session 6: Program Evaluation - INGENAES
This document summarizes a presentation on program evaluation for NGO partner organizations. It defines program evaluation as the systematic process of collecting and analyzing information about a program to make necessary decisions. There are two main types of evaluations: process evaluations, which verify proper implementation, and outcome evaluations, which assess a program's effectiveness and impact. The presentation outlines key steps for developing an evaluation plan, including determining the purpose and audience, identifying evaluation questions, choosing a methodology, collecting and analyzing data, and reporting findings. It also discusses important considerations like the appropriate evaluator and presents an activity for participants to develop strategies, outcomes, and discuss an evaluation plan for one of their program objectives.
Moving from Evaluation to Learning (Peter York) - Michele Garvey
This document discusses the importance of evaluative learning over traditional program evaluation for non-profits. It argues that evaluation focuses on determining a program's overall impact, while evaluative learning focuses on understanding how program components work in order to improve practices. The document provides six key practices of evaluative learning, including measuring achievable outcomes and gathering data from recipients. It outlines guiding steps for conducting evaluative learning, such as identifying achievable outcomes and analyzing patterns of cause-and-effect. Non-profits that engage in evaluative learning behaviors are over twice as likely to experience sustainable growth compared to those who do not.
Generating Stakeholder Buy-In to Establish a Culture of Assessment - ExamSoft
Presented by Karen Bobak and Lisa Bloom
Vertical: Chiropractic
Using assessment data to make decisions regarding curriculum and instruction cannot be isolated to an administrative office. For data to have a meaningful and lasting impact on the academic program, it is important to solicit diverse stakeholder perspectives and provide opportunities for faculty engagement at every stage of developing the assessment process. Building collaborations to achieve incremental goals helps establish a pattern of success and creates a vision of future accomplishments and an institutional culture of assessment. Each institution has unique needs, so this workshop is designed to provide small-group interaction to help attendees target optimal strategies for their institutions.
At the end of the session, participants will be able to:
1) Identify why stakeholder buy-in is important
2) Formulate effective strategies for engagement of faculty and administration
3) Identify principles of Improvement Science to support a process of curriculum review
4) Review longitudinal assessment data to identify curricular gaps
5) Utilize assessment data to develop improvement plans
6) Develop specific strategies tailored to the needs of the participants’ institutions
The document provides an overview of program design, monitoring and evaluation. It discusses conducting needs assessments to understand community needs and priorities. It also covers developing a causal pathway framework to link program activities, outputs, and outcomes. Monitoring and evaluation are presented as important parts of the process to determine what is working and how programs can be improved.
Based on the scenario provided, Agency ABC's response for section A.1.3 would be a "2 - Somewhat Effective." While staff feel supported and are invited to paid trainings and workshops, the agency does not monitor annual PD requirements or have clear expectations. Monthly meetings focus more on planning and policy rather than skill-building. Performance reviews and supervision are also limited. Overall, the level of professional development and on-site support for continuing skills growth is somewhat effective but could be strengthened.
The document discusses the importance of strategic learning for nonprofit effectiveness and leadership. It argues that evaluation should focus on learning what works, for whom, and why rather than just accountability. The key aspects of strategic learning are: creating data gathering processes to leverage evaluation findings; infusing learning into planning; and taking immediate action based on evaluations. However, only about 25% of nonprofits are effective learners. The document outlines a seven step strategic learning process involving gathering data, analyzing results, making meaning from findings, and using decisions to improve programs.
Evaluation Plan presentation for GPA - Via Evaluation
The document discusses evaluation plans and their importance. It provides an overview of what an evaluation plan is and when one is necessary, such as when required by a funder. It outlines the key components of an effective evaluation plan, including identifying the evaluator, developing objectives and indicators, and creating a logic model and timeline. The presentation aims to help attendees understand how to create strong evaluation plans that can improve their projects and demonstrate accountability.
This document provides an overview of monitoring and evaluation concepts, processes, methods, and reporting. It defines key terms like monitoring, evaluation, logical framework, and indicators. It describes monitoring and evaluation cycles and steps in designing an M&E system including developing an M&E matrix. It discusses data collection methods, types of reports, and outlines for technical, popular, monitoring and evaluation reports. The goal is to develop a common understanding of monitoring and evaluation.
The Change Acceleration Process (CAP) provides a framework to accelerate and optimize change initiatives. It includes seven core elements: leading change, creating a shared need, shaping a vision, mobilizing commitment, changing systems and structures, making change last, and monitoring progress. Tools are provided for each element to help change teams identify ways to achieve behavioral change and build a culture that drives change. Successful change requires addressing both the technical solution and gaining acceptance and commitment from stakeholders.
The document discusses effective program evaluation and provides a case study example. It begins by outlining why evaluation is needed and common problems with evaluation. Effective evaluation includes planning, frameworks, data collection and analysis plans. A case study on a learning intervention at PolyWrighton to improve work engagement is then described in detail. The intervention was evaluated using a multi-level framework and showed positive results, including a 399% ROI. Additional resources on evaluation and the programs discussed are provided.
As a solution, the BER model uses a two-dimensional matrix to aid the evaluation of complex multi-unit programs, with quadrants to identify over and underperforming units. The BER model was inspired by portfolio management approaches from the Boston Consulting Group and the General Electric Grid, as well as quadrant analysis by Andreasen (1995). However, its core principles are based on the concept of social return on investment, where output is always compared to input. It provides a relative perspective on performance that allows evaluators to account for impact based on the resources invested in an initiative.
This document outlines the presentation on evaluating a national health programme. It discusses key topics like monitoring versus evaluation, the history and purpose of evaluation, different types of evaluation including formative, summative and participatory evaluation. The document details the evaluation process including planning evaluations, gathering baseline data, implementing evaluations and using evaluation results. It also covers standards for effective evaluation including ensuring the utility, feasibility, propriety and accuracy of evaluations. The overall summary is that the document provides an overview of best practices for conducting program evaluations of national health initiatives.
Data helps school based health centers to target services and tell their stories, but analysis can be intimidating and frustrating. Dabbling in the Data provides 15 interactive activities that organizations can use internally or with partners to make meaning of their data… without falling asleep! In this skill-building session, participants will learn how to learn from their data, yielding actionable insights and powerful stories. After practicing activities, attendees will reflect on how the activities might be used in their own organizations.
AGR Diversity Forum - Intro Slides 04.03.2015EmmaAGR
The document summarizes a diversity forum discussing how small, incremental improvements can have significant impacts. It notes that daily decisions and optimizing small things can add up over time. The forum focuses on monitoring socio-economic diversity among members, with around 25% currently doing so. There is an increasing focus on indicators like free school meals and whether applicants are first generation graduates. The agenda includes presentations from Enterprise, EDF, Oxford University, and Bank of England on their approaches to diversity and widening access.
In July 2013, Innovation Network's Kat Athanasiades and Veena Pankaj presented "Evaluation: Finding a Common Ground" at the Forum of Regional Associations of Grantmakers in Milwaukee, Wisconsin.
The purpose of this session was to highlight the common threads that distinguish regional associations from other organizational genres in the social sector. Regional associations promote effectiveness in philanthropy by providing grantmakers with opportunities to engage with others, share ideas, and generate best practices that support both the individual and collective impact of philanthropy. While each regional association employs its own unique mix of strategies to achieve impact, most are working towards a shared goal of promoting effectiveness within philanthropy. In this session, Innovation Network engaged participants in a group dialogue to draw out high level strategies and desired outcomes that are common across most regional associations. The resulting logic model is a tool that regional associations can use as a starting point when planning for evaluation.
Innovation Network is a nonprofit evaluation, research, and consulting firm. We provide knowledge and expertise to help nonprofits and funders learn from their work to improve their results. For more information, please visit www.innonet.org.
This is the slide deck for the workshop that Helen Bevan ran on "Transformational leadership in healthcare" at the International Conference on Residency Education, Vancouver, Canada, on October 24th 2015
This document proposes targeting working mothers with a new energy drink called Zenergy. It notes that while the energy drink market has grown exponentially, most people are not interested in the category. It sees an opportunity to appeal to a new, older audience by focusing on mental energy and wellbeing rather than physical stimulation. The proposed product is a caffeine-based drink with added vitamins, positioned as an ongoing source of mental energy. The target is overworked mothers aged 25-40 who suffer from "depleted mother syndrome" due to balancing work and family responsibilities without rest. A digital marketing strategy is outlined to engage this audience of "digital moms" through online influencers and content.
Evergreen and Growth: Sustainable Content Strategy for Social Media ManagersMa'ayan Plaut
The document discusses sustainable strategies for social media content management. It uses farming and gardening metaphors to describe techniques for planning, creating, organizing, sharing, and analyzing social media content over time. This includes conducting a content audit, creating a master calendar and spreadsheets to organize content by theme and timeline, tagging and linking content, and tracking engagement over time to inform future strategies. The goal is to develop a long-term content strategy that is purposeful, organized, and adapts based on measurement of past performance.
New Frameworks for Measuring Capacity and Assessing PerformanceTCC Group
If we start with the assumption that — in order to improve our social sector as a whole — those who do the work to strengthen our communities (the nonprofits) are equally as critical as those responsible for providing the resources for the work to get done (the foundations), then why wouldn’t we expect all social sector actors to build their capacity? How do we know when our grantees and our foundations are becoming more effective and impactful as a result of our capacity investments, organizational development efforts and technical assistance? What does a high performing organization or foundation look like? And can we measure that?
This presentation, provided during the Grantmakers for Effective Organizations 2016 National Conference in Minneapolis, reviews and demonstrates existing resources for assessing nonprofit and foundation capacity and effectiveness. Speakers introduced the pros and cons of a variety of rubrics in use in the field and offered guidance on how funders decide on the right fit for the desired purpose. Grantmaker peers also shared how they used different frameworks and tools to assess individual nonprofits and grantee cohorts. Session participants left with increased awareness of the importance of the facilitator’s role in interpreting data gleaned from assessments and of the data collection methods most appropriate for their organization.
Organizational Capacity-Building Series - Session 6: Program EvaluationINGENAES
This document summarizes a presentation on program evaluation for NGO partner organizations. It defines program evaluation as the systematic process of collecting and analyzing information about a program to make necessary decisions. There are two main types of evaluations: process evaluations, which verify proper implementation, and outcome evaluations, which assess a program's effectiveness and impact. The presentation outlines key steps for developing an evaluation plan, including determining the purpose and audience, identifying evaluation questions, choosing a methodology, collecting and analyzing data, and reporting findings. It also discusses important considerations like the appropriate evaluator and presents an activity for participants to develop strategies, outcomes, and discuss an evaluation plan for one of their program objectives.
Moving from evaluation to learning peter yorkMichele Garvey
This document discusses the importance of evaluative learning over traditional program evaluation for non-profits. It argues that evaluation focuses on determining a program's overall impact, while evaluative learning focuses on understanding how program components work in order to improve practices. The document provides six key practices of evaluative learning, including measuring achievable outcomes and gathering data from recipients. It outlines guiding steps for conducting evaluative learning, such as identifying achievable outcomes and analyzing patterns of cause-and-effect. Non-profits that engage in evaluative learning behaviors are over twice as likely to experience sustainable growth compared to those who do not.
Generating Stakeholder Buy-In to Establish a Culture of AssessmentExamSoft
Presented by Karen Bobak and Lisa Bloom
Vertical- Chiropractic
Using assessment data to make decisions regarding curriculum and instruction cannot be isolated to an administrative office. In order for data to have a meaningful and lasting impact on the academic program, it is important to solicit diverse stakeholder perspectives and provide opportunities for faculty engagement at every stage of the assessment process development. Developing collaborations in order to achieve incremental goals helps to establish a pattern of success and creates a vision of future accomplishments and an institutional culture of assessment. Each institution has unique needs. This workshop is designed to provide small group interaction to help attendees target optimal strategies doe their institutions.
At the end of the session, participants will be able to:
1) Identify why stakeholder buy-in is important
2) Formulate effective strategies for engagement of faculty and administration
3) Identify principles of Improvement Science to support a process of curriculum review
4) Review longitudinal assessment data to identify curricular gaps
5) Utilize assessment data to develop improvement plans
6) Develop specific strategies tailored to the needs of the participants’ institutions
The document provides an overview of program design, monitoring and evaluation. It discusses conducting needs assessments to understand community needs and priorities. It also covers developing a causal pathway framework to link program activities, outputs, and outcomes. Monitoring and evaluation are presented as important parts of the process to determine what is working and how programs can be improved.
Based on the scenario provided, Agency ABC's response for section A.1.3 would be a "2 - Somewhat Effective." While staff feel supported and are invited to paid trainings and workshops, the agency does not monitor annual PD requirements or have clear expectations. Monthly meetings focus more on planning and policy rather than skill-building. Performance reviews and supervision are also limited. Overall, the level of professional development and on-site support for continuing skills growth is somewhat effective but could be strengthened.
The document discusses the importance of strategic learning for nonprofit effectiveness and leadership. It argues that evaluation should focus on learning what works, for whom, and why rather than just accountability. The key aspects of strategic learning are: creating data gathering processes to leverage evaluation findings; infusing learning into planning; and taking immediate action based on evaluations. However, only about 25% of nonprofits are effective learners. The document outlines a seven step strategic learning process involving gathering data, analyzing results, making meaning from findings, and using decisions to improve programs.
Evaluation Plan presentation for GPA - Via Evaluation
The document discusses evaluation plans and their importance. It provides an overview of what an evaluation plan is and when one is necessary, such as when required by a funder. It outlines the key components of an effective evaluation plan, including identifying the evaluator, developing objectives and indicators, creating a logic model and timeline. The presentation aims to help attendees understand how to create strong evaluation plans that can improve their projects and demonstrate accountability.
This document provides an overview of monitoring and evaluation concepts, processes, methods, and reporting. It defines key terms like monitoring, evaluation, logical framework, and indicators. It describes monitoring and evaluation cycles and steps in designing an M&E system including developing an M&E matrix. It discusses data collection methods, types of reports, and outlines for technical, popular, monitoring and evaluation reports. The goal is to develop a common understanding of monitoring and evaluation.
The Change Acceleration Process (CAP) provides a framework to accelerate and optimize change initiatives. It includes seven core elements: leading change, creating a shared need, shaping a vision, mobilizing commitment, changing systems and structures, making change last, and monitoring progress. Tools are provided for each element to help change teams identify ways to achieve behavioral change and build a culture that drives change. Successful change requires addressing both the technical solution and gaining acceptance and commitment from stakeholders.
The document discusses effective program evaluation and provides a case study example. It begins by outlining why evaluation is needed and common problems with evaluation. Effective evaluation includes planning, frameworks, data collection and analysis plans. A case study on a learning intervention at PolyWrighton to improve work engagement is then described in detail. The intervention was evaluated using a multi-level framework and showed positive results, including a 399% ROI. Additional resources on evaluation and the programs discussed are provided.
As a solution, the BER model uses a two-dimensional matrix to aid the evaluation of complex multi-unit programs, with quadrants to identify over and underperforming units. The BER model was inspired by portfolio management approaches from the Boston Consulting Group and the General Electric Grid, as well as quadrant analysis by Andreasen (1995). However, its core principles are based on the concept of social return on investment, where output is always compared to input. It provides a relative perspective on performance that allows evaluators to account for impact based on the resources invested in an initiative.
This document outlines the presentation on evaluating a national health programme. It discusses key topics like monitoring versus evaluation, the history and purpose of evaluation, different types of evaluation including formative, summative and participatory evaluation. The document details the evaluation process including planning evaluations, gathering baseline data, implementing evaluations and using evaluation results. It also covers standards for effective evaluation including ensuring the utility, feasibility, propriety and accuracy of evaluations. The overall summary is that the document provides an overview of best practices for conducting program evaluations of national health initiatives.
Training needs analysis, skills auditing, training evaluation, calculating training ROI and strategic learning and development best practice principles and processes
How to Evaluate CSR Projects and Programmes - SOEDS, 11th April 2022 - RAKESHNANDAN7
This document discusses evaluating CSR projects and programs. It begins by differentiating evaluation from appraisal, monitoring, and impact assessment. It then covers various types of evaluation including formative and summative, as well as tools like logical framework analysis and Bennett's hierarchy. Examples are provided of evaluating integrated contract broiler farming and e-learning materials. Challenges of evaluation like establishing controls and measuring long-term impacts are also discussed. The document emphasizes that evaluation is important for accountability, learning, and improving future CSR efforts.
Black, Adam Dr - Efficacy and how to improve learner outcomes - eaquals
The document discusses improving learner outcomes through efficacy. It defines efficacy as making a measurable impact on learner outcomes. Pearson's approach involves efficacy reviews, studies, and analytics. Efficacy reviews use a framework to assess the likelihood of impacting outcomes. Studies involve long-term, holistic evaluations. Analytics provide insights from learner data to improve instruction and identify at-risk students. The key is defining clear outcomes and measuring progress towards goals to continuously enhance efficacy.
This document discusses the importance of monitoring and evaluation (M&E) for programs and projects. It defines monitoring as an ongoing process of collecting and analyzing data to track progress and make adjustments, while evaluation assesses relevance, effectiveness, impact and sustainability. The key aspects of building an M&E system are agreeing on outcomes to measure, selecting indicators, gathering baseline data, setting targets, monitoring implementation and results, reporting findings, and sustaining the system long-term. A strong M&E system provides evidence of achievements and challenges, enables learning and improvement, and helps ensure resources are allocated to effective programs.
Evaluation approaches - Hari Bhushal
The document discusses evaluation approaches and methods. It defines evaluation as appraising the relevance, efficiency, effectiveness, impacts, and sustainability of plans, policies, programs and projects. Evaluations are used to draw lessons to improve future implementation and hold agencies accountable. The document then discusses different types of evaluations including formative, process, outcome and economic evaluations. It also outlines various evaluation approaches like appreciative inquiry, beneficiary assessment, case studies, contribution analysis, developmental evaluation, and participatory evaluation.
Similar to The Fluid Process of Nonprofit Evaluative Capacity Building
The Future of Logic Models: Logic Models in 3D - Isaac Castillo
The time has come for us to break out of our two dimensional representations of programmatic interventions. We need to re-think representations of program interventions and explore how logic models can be crafted in three dimensions (3D) to better respond to the evolving and dynamic nature of our programs and society.
Session at the American Evaluation Association Virtual Conference - October 28, 2020
The Logic Model Repair Shop: An Introduction to 3D Logic Models - Isaac Castillo
Presentation at EERS 2018 on #3DLogicModels by Elizabeth Grim, Isaac Castillo, and Elena Pinzon O'Quinn
http://eers.org/session/the-logic-model-repair-shop-why-most-logic-models-are-broken-and-how-we-can-fix-them/
Collective impact in 3D - Collective Impact Convening Short Talk - Isaac Castillo
We can think of collective impact initiatives in new ways if we break out of a traditional approach to logic models and consider building logic models in three dimensions. #3DLogicModels
Evaluation Bloopers and How to Make the Most of Your Mistakes - Isaac Castillo
Presentation focusing on common evaluation mistakes and how audience members can avoid similar situations. Slides contain a list of audience-generated takeaways and lessons learned.
A survey found that 49% of residents in the Kenilworth-Parkside neighborhood of Washington D.C. experienced food insecurity. This high level of food insecurity was due to a lack of income, grocery stores, transportation, and expensive groceries. In response, the DC Promise Neighborhood Initiative worked to address food insecurity by organizing a monthly food market to provide families with free food, maintaining a bus route to the nearest grocery store, and implementing a fruit and vegetable prescription program to improve residents' access to healthy foods.
Measure4Change - Presenting Data to Community Residents - Isaac Castillo
This document discusses strategies for sharing data with community residents in a way that is easy to understand and overcomes initial skepticism. It recommends using multiple approaches, including printed materials, presentations at existing meetings, and hosting separate data-focused meetings. Examples are provided of using simple graphics and real people to illustrate key data points in an engaging way, such as chronic absenteeism or crime rates. The goal is to present a few carefully selected data metrics that challenge assumptions and help residents see how information can be relevant to their community's needs and outcomes.
This document discusses strategies for sharing data with community residents in an accessible way to build trust and address skepticism. It recommends using a variety of approaches, including printed materials with easy-to-understand visualizations highlighting a few key data points. Examples shown include educational attainment rates, food insecurity, travel times for groceries, and chronic absenteeism in local schools. The document also suggests using residents themselves and real crime data to question assumptions and perceptions about different neighborhoods.
When technology hits the sidewalk: empowering community residents through 21s... - Isaac Castillo
The conversation around 21st century data collection methods continues to evolve, but little work has been done to employ these methods to empower community stakeholders as part of the data collection process. This session will detail how DCPNI conducted a representative community survey using innovative data collection methods within an inclusive evaluation framework by employing community members as agents of community data collection. We will address the challenges faced in implementing a survey of significant scope, including time constraints, unknown literacy levels of survey participants, and inconsistent access to the internet, and detail how these challenges were overcome. The session will include a step-by-step demonstration of DCPNI's use of Key Survey Software to administer surveys on tablet devices with offline capability. We will discuss the strengths and weaknesses of our approach compared to alternatives and outline how the collected data will be utilized to benefit the community going forward.
Helping Families and Community Residents Use Data - Isaac Castillo
This document discusses strategies for sharing data with community residents to build trust and address skepticism. It outlines presenting data in easy-to-understand publications and visualizations focusing on a few key metrics. Examples shown include educational attainment rates, food insecurity, travel times for groceries, and chronic absenteeism in local schools. The document advocates using local residents to present data and questioning assumptions, such as showing that crime rates are actually lower in the low-income neighborhood compared to a nearby high-income area. The goal is to have open conversations about data to address preconceptions and involve community members.
Engaging Families & Community Residents in Data-Driven Work - Isaac Castillo
This document discusses how the DC Promise Neighborhood Initiative (DCPNI) engaged families and community residents in their data-driven work. It describes how DCPNI administered a neighborhood survey, with input from residents on important topics. Residents then helped with data collection by participating on survey teams. DCPNI shared findings with the community through publications and meetings. Residents helped present and explain the data. DCPNI used findings on issues like food insecurity and absenteeism to create programs like food markets and an attendance initiative. The goal was to make data accessible and use community feedback to address priority concerns.
A Neighborhood Survey in the Nation’s Capital: Balancing Rigor, Resources, a... - Isaac Castillo
Joint presentation by the DC Promise Neighborhood Initiative (DCPNI) and Urban Institute staff at the Eastern Evaluation Research Society's Annual Conference in 2014. Presentation focuses on DCPNI's neighborhood survey - a community wide data collection project. The slides offer tips and suggestions on how to make the process as smooth as possible without compromising data collection rigor.
The Fluid Process of Nonprofit Evaluative Capacity Building
1. The Fluid Process of Nonprofit Evaluative Capacity Building
Isaac D. Castillo, Venture Philanthropy Partners
@Isaac_outcomes | @vppartners
Dan Tsin, Urban Alliance
@UrbanAlliance
GEO Learning Conference
May 31, 2017
VPPARTNERS.ORG | 05/31/17 | INVESTING IN SOCIAL CHANGE
2. Venture Philanthropy Partners (VPP) and youthCONNECT
3. Venture Philanthropy Partners (VPP) and youthCONNECT
4. What are the Important Nonprofit Evaluation Capacity Building Questions?
• What is the organization’s current level of evaluation capacity?
• Where do you want the organization’s level of evaluation capacity to be (in general, or at the end of your investment)?
• What will it take to move the organization to that higher level of evaluation capacity?
5. Determining an Organization’s Evaluation Capacity as an Outsider
There are 3 big-picture questions you should ask to help determine an organization’s sophistication level regarding evaluation capacity:
Who is the Organization Serving?
How Well is the Organization Serving its Clients?
Is the Organization Ready to Successfully Innovate / Evolve / Scale?
6. Demographic Information
Who is the Organization Serving?
Level 1: Little to No Data Collected
Level 2: Basic Demographic Information
7. Serving Clients
How Well Is the Organization Serving its Clients?
Level 3: Dosage / Output Data Collected
Level 4: Outcome Data Collection and Use
8. Innovation, Evolution, or Scaling
Is the Organization Ready to Successfully Innovate / Evolve / Scale?
Level 5: Formal evaluation projects (low rigor – frequently internal)
Level 6: High Rigor Formal Evaluation (external)
Level 7: Second (or 3rd) High Rigor Formal Evaluation
9. Evaluation Capacity “Ramp”
Level 1: Little to No Data
Level 2: Basic Demographics
Level 3: Dosage / Outputs
Level 4: Outcome Data and Use
Level 5: Formal evaluation projects
Level 6: High Rigor Evaluation
Level 7: Multiple High Rigor Evaluations
10. Evaluation Staff as a Proxy for Capacity
No staff person formally assigned data and evaluation tasks
Evaluation tasks part of someone’s duties (junior or mid-level person)
Senior leader/staff responsible for evaluation tasks
Junior / mid-level person has 100% of their time committed to evaluation tasks
Senior level person has 100% of their time committed to evaluation tasks
2 full time staff devoting 100% of time to evaluation
3 or more full time staff devoting 100% of time to evaluation
11. Assessing Evaluation Capacity
Level of Evaluation Capacity | Data and Evaluation Infrastructure | Staffing
Advanced | Multiple high rigor evaluations | 3+ full time staff spending 100% of time on evaluation
Advanced | High rigor evaluation | 2 full time staff spending 100% of time on evaluation
Advanced | Formal evaluation project | Senior person spends 100% of time on evaluation
Moderate | Outcomes | Junior / mid person spends 100% of time on evaluation
Moderate | Dosage / outputs | Part of senior person’s job
Low | Basic demographics | Part of junior/mid person’s job
Low | Little to no data | No assigned staff
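The seven-level rubric and its Low / Moderate / Advanced tiers can be encoded as a simple lookup. This is a minimal sketch to show how a funder might record the mapping programmatically; the function and variable names are illustrative, not part of the presentation.

```python
# A minimal sketch of the seven-level evaluation capacity rubric above.
# The level names and tier groupings mirror the slides; everything else
# (names, structure) is an illustrative assumption.

CAPACITY_LEVELS = {
    1: ("Little to no data", "Low"),
    2: ("Basic demographics", "Low"),
    3: ("Dosage / outputs", "Moderate"),
    4: ("Outcome data and use", "Moderate"),
    5: ("Formal evaluation projects", "Advanced"),
    6: ("High rigor evaluation", "Advanced"),
    7: ("Multiple high rigor evaluations", "Advanced"),
}

def capacity_tier(level: int) -> str:
    """Return the broad capacity tier (Low / Moderate / Advanced) for a level."""
    if level not in CAPACITY_LEVELS:
        raise ValueError(f"Level must be 1-7, got {level}")
    return CAPACITY_LEVELS[level][1]

print(capacity_tier(4))  # prints: Moderate
```

A grantmaker could extend this with the staffing proxy from slide 10 (e.g. mapping "no assigned staff" to level 1) to triangulate a grantee's starting point before planning capacity-building support.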
12. Final Things to Consider
• Do you (or your organization) have the expertise to provide evaluation capacity building to nonprofits at every level?
• At what point would you need to bring in someone else to help with advanced evaluation capacity building?
• How long is your typical investment / grant cycle, and what can realistically be done during that time period?
• Are you willing to commit resources to move organizations to the highest levels?
13. Contact Information
Isaac D. Castillo
Venture Philanthropy Partners
@Isaac_outcomes | @vppartners
icastillo@vppartners.org
www.vppartners.org
Dan Tsin
Urban Alliance
@UrbanAlliance
dtsin@theurbanalliance.org
www.theurbanalliance.org