This document discusses case management tools used in OVC programs and the Child Status Index tool in particular. It provides an overview of what case management tools are, their purpose, and examples of tools used. It then summarizes findings from a study that assessed how programs implement and use the Child Status Index tool. The study found that while the tool is useful, it is implemented inconsistently and data is rarely used at the local level. The document discusses appropriate and inappropriate uses of the tool and outlines key questions around whether and how case management tools improve care decisions.
3. What is a case mgmt tool?
Paper-based instrument
Completed by direct service providers (generally
low-literate volunteers)
Assesses child’s well-being along priority
dimensions (e.g. health, social relations, etc.)
Care plan documentation
4. Purpose of a case mgmt tool
Purpose: To improve quality of care
Highest priority = Case workers’ needs
o Untested hypothesis: case workers will make
better decisions if they use a tool
May also support M&E / reporting needs
5. What is included?
Information collected
Client contact information / demographics
Wellbeing information that is changeable over time
Care plan: services & referrals provided
Information flow
Most important use is at local/SDP level
Some information may flow up regional level
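The grouping on this slide can be sketched as a small data structure. This is a hypothetical illustration, not part of any published tool; all field names are assumptions chosen to mirror the three groups above (stable demographics, time-varying wellbeing, and the care plan).

```python
# Hypothetical sketch of the record a paper-based case management tool
# captures, grouped as the slide describes. Field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class CaseRecord:
    child_id: str                    # client contact information / demographics
    guardian_contact: str
    wellbeing: dict = field(default_factory=dict)          # changeable over time
    services_provided: list = field(default_factory=list)  # care plan
    referrals: list = field(default_factory=list)          # care plan

rec = CaseRecord("C-001", "Guardian, Village A")
rec.wellbeing["health"] = 3          # updated at each home visit
rec.referrals.append("clinic follow-up")
```

Separating the stable identifiers from the time-varying wellbeing scores is what lets the record support the local use emphasized above: the care team rereads the wellbeing and care-plan fields at each visit, while only summaries flow upward.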
6. A plethora of tools...
Child Status Index (MEASURE Evaluation)
Child Support Index (Pact)
OVC Wellbeing Tool (CRS)
Child Status Matrix (FHI)
Parenting Map (TSA)
Etc.
7. We need to be cautious
Some CM tools are being applied for purposes
beyond case management
Targeting (identifying beneficiaries)
Program monitoring (recording services provided)
Evaluation (aggregating wellbeing scores)
Exercise caution in using a CM tool for other
purposes
9. About the CSI
5 years ago, CSI was designed for low-literate
home visitors to capture children’s status across the
6 domains of PEPFAR OVC programming
Early hopes that CSI could meet a range of
information needs
CSI has been implemented for purposes ranging from
case management to program evaluation
CSI is used in at least 16 countries
O'Donnell K, Nyangara F, Murphy R, Nyberg B. Child Status Index: A Tool for Assessing the
Well-Being of Orphans and Vulnerable Children (Manual). Chapel Hill, NC: MEASURE Evaluation; 2009.
10. CSI Assessment: Phase I
Rationale: to systematically assess how programs
are implementing & using the CSI and understand
OVC program field needs for additional tools to
meet care, support, and M&E demands
Study questions
For what purposes are OVC programs using CSI?
What are the advantages and limitations of CSI?
What are the unmet M&E needs of OVC programs?
Cannon & Snyder. The CSI Usage Assessment. Chapel Hill, NC: MEASURE Evaluation; 2012.
11. Summary of Findings: Phase 1
25 interviews with senior program staff in 13
countries
Program staff find the CSI useful
CSI implemented mainly by volunteers
Information collected via CSI is rarely used by
volunteers except for targeting (not
recommended)
Care plans and referral protocols are inadequate
12. Summary of Findings: Phase 1
Variation in CSI implementation and data use due to:
Unclear purpose and guidance on CSI use, combined
with a desire to assess impact
Variability in training approaches
Insufficient support/funding for technical
assistance, follow-up, and training (data
management, analysis)
CSI is important, but one tool in the toolkit
13. So now what?
Study Phase II
Problem: Lack of information on the CSI's utility as a job aid at the
community level, and lack of input from CCWs
Purpose: To understand how CCWs and care teams make
decisions about children (including role of job aid / data)
Methods: Interviews with CCWs and team leads in five
countries, among organizations using/not using the CSI
Revision of CSI Guidance
14. 6 Core CSI functions (we think)
1. Builds rapport between service provider and
beneficiaries
2. Orients service provider to the holistic needs of
the child and encourages referrals
3. Strengthens informed care decisions by
systematically considering and documenting
child’s needs
15. 6 Core CSI functions (we think)
4. If applied regularly with the same child, may show
a child’s progress over time in particular
domains
5. May be helpful in community-level planning and
resource allocation decision making
6. May reveal emergency situations (a score of 1
in any outcome area)
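Core function 6 above is simple enough to sketch in a few lines. This is an illustrative sketch, not code from the CSI manual; the domain names follow the six PEPFAR OVC programming areas named earlier in the deck, and the record structure is an assumption.

```python
# Minimal sketch of core function 6: flag an emergency whenever any
# CSI domain receives the lowest score (1). Domain names and the
# scores dict are illustrative assumptions.

CSI_DOMAINS = [
    "food_nutrition", "shelter_care", "protection",
    "health", "psychosocial", "education",
]

def emergency_domains(scores: dict) -> list:
    """Return the domains scored 1, which signal an emergency
    needing immediate follow-up by the care team."""
    return [d for d in CSI_DOMAINS if scores.get(d) == 1]

child = {"food_nutrition": 3, "shelter_care": 1, "protection": 4,
         "health": 2, "psychosocial": 3, "education": 4}
print(emergency_domains(child))  # -> ['shelter_care']
```

The point of the per-domain check is that an emergency is defined by the worst single domain, not by the overall picture; a child can look adequate on average while one domain demands immediate action.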
16. Probably inappropriate CSI uses
Targeting
Unnecessarily complex
May lead to expectations of action/enrolment
1st contact with child may not be reliable
Evaluating regional or national program impact
Children’s needs/status are assessed relative to their
local community, and not to national standards
17. Probably inappropriate CSI uses
Producing a single combined score for the child
CSI assessment should be presented as 12
independent measures
Risk varies across domains
CSI scale values are not equal-interval, but ordinal
Evaluating implementing organizations
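The problem with a single combined score can be shown with a toy example. The numbers below are invented for illustration, not CSI data: two children sum to the same total, yet one has an emergency the total completely hides, which is why the ordinal domain scores should stay independent.

```python
# Illustrative example (not from the CSI manual): identical totals,
# very different risk profiles. Summing ordinal scores treats the
# gap between 1 and 2 as equal to the gap between 3 and 4, which
# the CSI scale does not support.

child_a = {"health": 4, "education": 4, "protection": 1}  # one emergency
child_b = {"health": 3, "education": 3, "protection": 3}  # uniformly "fair"

total_a = sum(child_a.values())
total_b = sum(child_b.values())
print(total_a == total_b)     # True: the totals cannot distinguish them
print(min(child_a.values()))  # 1: child A's emergency, invisible in the sum
```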
19. Some key questions
Are case management tools effective at improving
care decision making?
For all types of case workers? Formal? Informal?
How does training, in both case management and
tool use, factor in?
What specifically about a case management tool
improves care decision making? For whom?
Does the benefit outweigh the burden?
20. Some key questions II
Are CM tools useful in managing case workers?
Are some CM tools also useful for targeting
beneficiaries, monitoring outputs (services
delivered), and evaluating impact?
What are the risks?
21. The research presented here has been supported by the
President’s Emergency Plan for AIDS Relief (PEPFAR)
through the United States Agency for International
Development (USAID) under the terms of MEASURE
Evaluation cooperative agreement GHA-A-00-08-00003-
00. Views expressed are not necessarily those of
PEPFAR, USAID or the United States government.
MEASURE Evaluation is implemented by the Carolina
Population Center at the University of North Carolina at
Chapel Hill in partnership with Futures Group, ICF
International, John Snow, Inc., Management Sciences for
Health, and Tulane University.