Full Program Design
Designing and evaluating a program
Presentation Transcript

  • Program Design, Monitoring and Evaluation USAID/Vietnam Informal Training January, 2004
  • Overview of Training
    • Elements of a Program
    • Introduction to Needs Assessment
    • Program Design (The Causal Pathway)
    • Monitoring Program Success
    • Evaluation
    • Proposal Design
      • Linking a Program Framework to Proposal Design
  • Elements of a Program (cycle): Assessment → Program Design → Implementation and Monitoring → Evaluation
  • Needs Assessment Basics
    • Why conduct a needs assessment?
      • To learn what a group or community sees as the most important needs
      • To understand the environment in which you will be working
      • To prioritize what you may be able to do with your group or community
  • Needs Assessment Basics
    • What is a needs assessment survey?
      • Some general characteristics:
        • Have pre-set list of questions to be answered
        • Have pre-determined sample size for the number and types of people to be surveyed
        • Should be PARTICIPATORY
  • Phases of Needs Assessment
    • Phase 1: Brainstorm
      • Why am I doing this?
      • What are my goals in conducting the survey?
      • Am I ready to do this?
  • Phases of Needs Assessment
    • Phase 2: Assess Available Information
      • What do I already know about the needs of this target group/community?
      • What are the existing resources to assist in informing our assessment?
      • Am I ready to do this?
  • Phases of Needs Assessment
    • Phase 3: Develop Questions
      • What do you want to learn from the target group/community?
  • Phases of Needs Assessment
    • Phase 4: Identify Target Population
      • Who in the community has the information to answer your questions?
      • Consider revising your questions based on the groups/individuals you have identified
  • Phases of Needs Assessment
    • Phase 5: Choose Method
      • Consider:
        • Time
        • Human Resources
        • Financial Resources
        • Size/characteristics of target population
        • Regional/geographic issues
  • Phases of Needs Assessment
    • Phase 6: Draft the Survey
      • Include instructions based on target groups
      • Test the survey on a test group comprised of the kinds of people you will survey
      • Revise based on the test survey
  • Phases of Needs Assessment
    • Phase 7: Implement
      • Tabulate results
      • Interpret results
      • Plan future actions!
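A minimal sketch of what tabulating results for a single closed-ended survey question might look like; the question wording and responses below are hypothetical examples.

```python
# Minimal tabulation sketch for one closed-ended survey question.
# The question text and responses are hypothetical placeholders.
from collections import Counter

responses = [
    "clean water", "health services", "clean water",
    "vocational training", "health services", "clean water",
]

counts = Counter(responses)
total = len(responses)

print("Q: What is the most important need in your community?")
for need, n in counts.most_common():
    print(f"  {need}: {n} ({n / total:.0%})")
```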
  • Needs Assessment Tools
    • There are many ways to engage a community in assessing its needs…
      • Focus Groups
      • Community Forums
      • Individual Interviews
      • Surveys
      • Community Resource Inventories
  • Needs Assessment Tools
    • Focus Groups
      • Guided group discussion of 6 to 12 individuals from similar backgrounds with a skilled moderator and, if possible, a recorder. Moderator guides the group into increasing levels of focus on key issues.
      • Time: 1.5 – 2 hours each
      • Expertise: Moderate
      • Cost: Low
  • Needs Assessment Tools
    • Community Forums
      • A series of public meetings to involve the community in defining and discussing needs. They are less formal than focus groups and open to the public.
      • Time: 2 – 4 hours each
      • Expertise: Low
      • Cost: Low
  • Needs Assessment Tools
    • Individual Interviews
      • One-on-one interviews with individuals who represent the target population or have extensive knowledge or experience. A skilled interviewer asks specific and open-ended questions to obtain information about needs. Respondents can express their understanding openly and freely.
      • Time: Not more than 1 hour per meeting
      • Expertise: Moderate
      • Cost: Low
  • Needs Assessment Tools
    • Surveys
      • Generally three kinds: face-to-face, telephone and mailed (emailed).
      • Time: Not more than 45 minutes to 1 hour to complete
      • Expertise: High
      • Cost: High
  • Needs Assessment Tools
    • Community Resource Inventories
      • A means of data collection that usually results from a survey of service providers, which yields a listing or summary of information about activities and services provided by organizations and agencies in a defined geographic area.
      • Time: Not more than 1 hour to complete
      • Expertise: Moderate
      • Cost: Moderate
  • Additional Resources for Assessment Tools
    • http://ctb.ku.edu/tools/en/section_1042.htm
    • http://www.health.state.mn.us/communityeng/needs/needs.html
    • http://www.careinternational.org.uk/resource_centre/civilsociety/inventory_of_resources/section_2/section_a/a10_somalia_capacity_assessment_tool.pdf
    • http://www.familiesandwork.org/forums/download/f01/needs.PDF
  • Program Design
    • Some quick definitions:
      • Cause:
      • Pathway:
      • The Causal Pathway: a framework for designing a program with clearly defined inputs, activities and outcomes.
  • Causal Pathway Framework. Impact: change in the health, social, economic status of the population of interest
  • Causal Pathway Framework. Effect: change in the knowledge, attitudes, skills, behavior of the population of interest
  • Causal Pathway Framework. Outputs: products and services that must be in place for the effects and impact to occur
  • Causal Pathway Framework. Activities: either technical activity or support required to produce the outputs
  • Causal Pathway Framework. Inputs: resources needed to support the activities
  • Causal Pathway Framework. Design direction (causal hypothesis): this set of inputs and activities will result in these products and services (outputs), which will lead to these changes in the population, which will contribute to the desired impact.
  • Sample Causal Pathway for Service Delivery Program
    • Inputs: staff, sites, funding ($$$), technical expertise
    • Activities and Outputs:
      • Deliver services → good quality services available
      • Provide education, counseling → good education, counseling available
      • Supplies & logistics → adequate supplies
      • Training → skilled workers
      • Supervision → better, motivated workers
    • Effects: knowledge, attitudes, behavior
    • Impact: health, social, economic status
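One way to keep a causal pathway handy during design is to record it as a simple data structure; below is a minimal sketch in Python, using only names from the sample pathway above.

```python
# Illustrative sketch: the sample service-delivery pathway as a plain
# Python data structure, so each output stays linked to its activity.
sample_pathway = {
    "impact": "Improved health, social and economic status",
    "effects": ["Knowledge", "Attitudes", "Behavior"],
    "outputs_by_activity": {
        "Deliver services": "Good quality services available",
        "Provide education, counseling": "Good education, counseling available",
        "Supplies & logistics": "Adequate supplies",
        "Training": "Skilled workers",
        "Supervision": "Better, motivated workers",
    },
    "inputs": ["Staff", "Sites", "Funding", "Technical expertise"],
}

for activity, output in sample_pathway["outputs_by_activity"].items():
    print(f"Activity: {activity}  ->  Output: {output}")
```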
  • Where does Evaluation fit? Evaluation spans the whole causal pathway, from Inputs and Activities through Outputs and Effect to Impact.
  • A good planning tool can help us ...
    • Identify where a problem may exist
    • Link actions and results
    • Decide what resources are needed
    • Make evaluation part of the initial plan
    • Learn what works and what doesn’t
  • Program Design: Key Issues USAID/Vietnam Informal Training January, 2004
  • Before you consider Monitoring and Evaluating…
    • Some basic pre-implementation steps:
      • Q: Are the outputs, effects and impact easily measurable?
      • Q: Are the key beneficiaries and cooperating partners clearly identified?
      • Q: Are there sufficient funds and political support and/or commitment from relevant parties for this program?
  • Before you consider Implementing…
    • Crucial Questions:
      • Q: How will the program build capacity among its beneficiaries?
      • Q: What are the program’s sustainability and exit strategies?
      • Q: What are the risks and what steps will be taken to monitor and minimize risks?
  • Before you consider Implementing…
    • Additional Factors:
      • Policy Support Measures
      • Participation of Local Stakeholders
      • Gender Issues
      • Management Capacity
      • Environmental Issues
      • Economic and Financial Issues
  • Program Design: Exercise 1: Designing your own Causal Pathway
  • Sample Causal Pathway for Service Delivery Program
    • Inputs: $60,000; 3 international staff members (full-time); training resources; overhead for 1 in-country staff clinic
    • Activities and Outputs:
      • Design focused training materials for midwives → training manuals for midwives
      • Prepare training agenda and schedule for 10 midwives → training agenda
      • Train 10 midwives on life saving skills → 10 trained midwives in LSS
      • Prepare placement and work schedule for midwives in 5 local district clinics → placement for 10 trained midwives with work schedule
      • Liaise with 5 local district clinics → liaison with 5 district clinics
    • Effects: increased access to midwifery services for families living in marginalized communities; increased use of clinical services; increased number of women with attended births
    • Impact: reduced maternal mortality; reduced infant mortality; higher quality of life
  • The Causal Pathway: Monitoring and Evaluation USAID/Vietnam Informal Training January, 2004
  • Monitoring and Evaluation
    • What is it?
    • Monitoring and evaluation is the process through which we gain information about the activities and achievements of programs, in order to make decisions to improve them.
  • Monitoring and Evaluation: Why Do It?
    • Did we do what we said we were going to do?
    • Did we achieve what we said we would achieve?
    • Also ...
    • Was the project design sound? How can it be improved?
    • Did our project cause the observed change?
  • Causal Pathway Framework: design runs from Impact down to Inputs; implementation, monitoring and evaluation run from Inputs up to Impact.
  • Causal Pathway Framework: questions at each step
    • Inputs: were inputs available, adequate, timely?
    • Activities: were activities performed on schedule?
    • Outputs: were outputs produced? Were they of acceptable quality?
    • Effects: were effects observed?
    • Impact: was impact achieved?
  • Measurement
    • How do we know if these steps occurred?
    • We measure them, using …
    • Indicators
  • Where measurement falls short….
    • Programs often measure processes, rather than impact, effects, or even outputs!
    • Example: # of trainings conducted
    • Why is this insufficient in telling us how well a program has succeeded?
  • Measuring Outputs
    • Output Indicators
      • measure products and services provided by the program, and the quality of these products and services
  • Formulation of an Indicator
    • Output indicator:
      • [# of] or [% of planned] [specific activities / products / services] that [have been carried out / achieved] [to acceptable / expected standard of quality]
  • Output Indicators: Examples:
    • Activity → Output: Train outreach staff → Outreach staff trained
    • Output indicator:
      • # of outreach staff trained
      • Method: project records
  • Output Indicators: Examples:
    • Activity → Output: Train outreach staff → Well-trained outreach staff
    • Output indicator:
      • % of trained outreach staff who received a rating of “good” or “excellent” on the final training exercise
      • Method: project / training records
  • Output Indicators: Examples:
    • Activity → Output: Train outreach staff → Skilled outreach staff in field sites
    • Output indicator:
      • % of trained outreach staff who perform education and service responsibilities “well” or “very well”
      • Method: supervisors’ checklist
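A minimal sketch of how an indicator like the second example above could be computed; the record format below is a hypothetical stand-in for project/training records.

```python
# Sketch: compute "% of trained outreach staff who received a rating of
# 'good' or 'excellent' on the final training exercise".
# The records are hypothetical placeholders for project/training records.
training_records = [
    {"staff": "A", "final_rating": "excellent"},
    {"staff": "B", "final_rating": "good"},
    {"staff": "C", "final_rating": "fair"},
    {"staff": "D", "final_rating": "good"},
]

well_trained = sum(r["final_rating"] in ("good", "excellent") for r in training_records)
indicator = well_trained / len(training_records)
print(f"Output indicator: {indicator:.0%} rated 'good' or 'excellent'")  # -> 75%
```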
  • Output Indicators ...
    • Advantages
    • Directly related to activities
    • Easy, quick and frequent measurements
    • Include measures of quality of care
    • Disadvantages
    • Do not tell you if people change, only what project does
    • Can lose sight of desired impact
  • Measuring Effects
    • Effect Indicators
      • measure the level of knowledge, attitudes, skills, intentions and behaviors of the population of interest
  • Formulation of an Indicator
    • Effect indicator:
      • [% or #] of [group members] who [know / believe / can / do] [specific knowledge / attitude / skill / behavior]
  • Formulation of an Indicator
    • Effect indicator: Example
      • % of adolescents with disabilities aged 12–18 with adequate skills for employment in a chosen vocation
      • Method: Observation of skills, with scoring guide
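A similar sketch for scoring this effect indicator from observation data; the pass threshold and the scores are illustrative assumptions, since the scoring guide itself is not defined here.

```python
# Sketch: "% of adolescents with disabilities aged 12-18 with adequate
# skills for employment in a chosen vocation", scored by observation.
# The pass threshold (>= 7 of 10 skill items) is an illustrative assumption.
observation_scores = {"P01": 8, "P02": 5, "P03": 9, "P04": 7, "P05": 4}
PASS_THRESHOLD = 7

adequate = sum(score >= PASS_THRESHOLD for score in observation_scores.values())
indicator = adequate / len(observation_scores)
print(f"Effect indicator: {indicator:.0%} with adequate vocational skills")  # -> 60%
```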
  • Effect Indicators …
    • Advantages
    • Show changes in people
    • Often an adequate endpoint if causal pathway is strong
    • Disadvantages
    • Do not tell you if status has changed
    • Do not tell you what caused the change (problem of attribution)
    • Data can be difficult and expensive to obtain (if population-based)
  • Measuring Impact
    • Impact Indicators
      • measure the status of the population of interest
  • Formulation of an Indicator
    • Impact indicator: usually a rate or ratio
      • disability employment rate
      • fertility rate
      • rate of children born with disabilities
      • HIV Prevalence
      • Empowerment rate ???
    • Method: population-based
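A worked illustration of an impact indicator expressed as a rate; both figures are hypothetical stand-ins for population-based data.

```python
# Worked sketch of an impact indicator expressed as a rate.
# Both figures are hypothetical placeholders for population-based data.
working_age_people_with_disabilities = 12_000
employed_people_with_disabilities = 4_800

employment_rate = employed_people_with_disabilities / working_age_people_with_disabilities
print(f"Disability employment rate: {employment_rate:.0%}")  # -> 40%
```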
  • Impact Indicators …
    • Advantages
    • This is the point!!
    • Disadvantages
    • Problem of attribution
    • Often difficult / impossible methodologically
    • Changes occur slowly
    • Not needed if strong causal pathway exists
    • More likely to be done at national /macro level
  • Should We Use Standard Indicators?
    • Yes, when available
      • Good source: Measure Evaluation Project
      • Other resources on the web
    • Yes, when relevant
      • More relevant for impact and effect
      • Less relevant for outputs
    • Otherwise, be creative and specific
      • Especially for outputs
  • Good indicators are …
    • Useful
      • linked directly to the causal pathway, so they act as markers for progress
      • contribute information for decision making
  • Good indicators are …
    • Ethical
      • rights of individuals should be respected and protected in the collection and use of data
      • use informed consent
  • Good indicators are …
    • Scientifically robust
      • valid
      • reliable
      • sensitive
      • specific
  • Good indicators are …
    • Accessible
      • must be able to measure easily
      • choose appropriate data collection method
  • Data Collection Methods
    • Usually quantitative:
      • Project records, service statistics
      • Observation (with standardized checklist)
      • Mini-surveys / exit interviews
      • Community-based KAP surveys
      • Census
      • Special studies: biological data, chart or record review
      • Others
  • Choose Indicators for your Project Causal Pathway that ...
    • are well-formulated
    • include measures of output quality
    • link directly to the pathway
    • measure important causal links throughout the pathway
    • rely on a variety of data collection methods
    • KISS (Keep It Short and Simple)
  • Monitoring and Evaluation: Examples of Indicators
  • How are these indicators different?
    • # of trained and skilled disabled workers, or # of disabled workers gainfully employed
    • # of rapes, or # of rapes reported
    • % of hearing impaired children in inclusive education schools, or % of hearing impaired children who successfully complete primary school
    • Mean # of months of rehabilitation supplies, or rehabilitation supplies adequate for 3 months (Y / N)
  • Sample indicators for …
    • Effect: Implementation of national barrier-free access codes in provinces
    • Output: Teachers successfully trained and able to demonstrate proper techniques in inclusive education
    • Output: District People’s Committee regularly supervising joint early identification, rehabilitation and inclusive education program
  • Sample Monitoring and Evaluation Chart
    • Basically a clear summary of all objectives and their associated activities and outputs/results (the chart below is NOT complete!)
    • Chart columns: description, Indicator, Source of Verification, Risks
    • Impact: the socioeconomic status of people with disabilities is increased
      • Indicator: poverty rate among people with disabilities; Source of verification: reports from Government, NGOs and the World Bank
    • Project Objective: increased employment among people with disabilities
      • Indicator: employment rate among people with disabilities; Source of verification: reports from Government and NGOs
    • Expected Results/Outputs: implementation of national level disability employment law
      • Indicator: % of registered businesses successfully adhering to the disability employment law; Source of verification: reports from Provincial Government and NGOs
    • Activities: assist government with the development of the disability employment law
      • Indicator: law is approved; Source of verification: program monitoring reports
  • Other Resources
    • Understanding Impact Evaluation http://www.worldbank.org/poverty/impact/index.htm
    • Online Evaluation Resource Library http://oerl.sri.com/
  • Program Design: Exercise 2: Designing Indicators for your Causal Pathway
  • Program Design: Evaluation USAID/Vietnam Informal Training January, 2004
  • Program Evaluation This introduction to Evaluation is designed for staff who assist in program design. Following are some general principles and issues that are important to consider in the program design phase.
  • Program Evaluation
    • What is an Evaluation?
      • Evaluation should be an integral part of the development of a program. It should:
        • Identify, during the life of a program, strengths, weaknesses and relevance to the overall objectives
        • Assess the program impact on the lives of local community members
        • Apply the lessons learned to additional program planning.
    • Evaluation planning should begin during the design phase of a program
    • Programs that have vague or unclear goals tend to produce unclear and unfocused evaluations
  • Program Evaluation
    • A “Good Evaluation” will:
      • Raise new questions and fresh ideas
      • Suggest different ways of looking at outcomes
  • Some Key Focal Areas for a Program Evaluation include….
    • Capacity Building
    • Sustainability
    • Cost-effectiveness
    • Relevance to needs of target population
    • Coverage
    • Replicability
    • Gender
  • Remember…
    • Evaluation should always be undertaken in the spirit of “valuing” a program.
    • Evaluation aims to improve, not to criticize.
  • Steps in planning and management of an Evaluation
    1. Clarify/agree on the need for the evaluation
      • Identify evaluation purpose and focus: why is it needed? For whom? What specifically needs to be learned?
    2. Plan the evaluation
      • Prepare a terms of reference (SOW)
      • Select the evaluation team
      • Identify methodologies
      • Identify the time frame
      • Consult with relevant stakeholders
      • Prepare logistically
    3. Conduct the evaluation
      • Collect data
      • Analyze/interpret the data
      • Prepare the draft report
      • Hold a debriefing workshop
  • Steps in planning and management of an Evaluation
    4. Draft the evaluation report
      • Should be clear on:
        • Key findings and their implications
        • Key recommendations
    5. Disseminate the evaluation report
      • Include all relevant stakeholders:
        • Donors
        • Beneficiaries
        • Partners
        • Interested organizations
  • Who should be the Evaluator?
    • Choosing a competent, efficient and experienced evaluator is not always easy
    • Evaluators should have a genuine interest in the program, and if possible, should have related program experience. Teamwork skills are often key.
  • Internal vs. External Evaluations
    • Internal and external evaluations are not always mutually exclusive
    • Your organization may wish to conduct both over the course of your program implementation
  • Internal and External Evaluation
    • Internal Evaluators….
    • Advantages
    • know the project, staff, community
    • Have easy access to materials, data
    • Less expensive
    • Sympathetic to aims of project
    • More likely to consult colleagues, communities, and to share information
    • External Evaluators….
    • Advantages
    • May be free from internal bias
    • Provide a fresh perspective
    • Can often spend more focused time
    • May bring new insights, ideas or skills
    • May resolve internal conflicts by acting as a neutral party
  • Internal and External Evaluation
    • Internal Evaluators…
    • Drawbacks
    • Perhaps less objective and more likely to be influenced by colleagues
    • Have less time
    • May not have necessary skills
    • May be biased toward certain aspects of the work
    • May be reluctant to criticize
    • May create internal conflict
    • External Evaluators…
    • Drawbacks
    • May not understand aim of the project
    • May impose inappropriate criteria
    • May address external agenda rather than actual needs or timetable of project
    • Likely to be more expensive
    • Could be uncommitted to project’s future
    • May not consult adequately
  • Program Design: Transforming a Program Framework into a Proposal USAID/Vietnam Informal Training January, 2004
  • From Causal Pathway to Proposal…
    • The Causal Pathway is the skeletal framework for a proposal, with the addition of a few extra ingredients….
  • Elements of a Professional Proposal
    • Cover Letter
    • Executive Summary (If Proposal is longer than 10 Pages)
    • Introduction/Background information
    • Program Description
    • Program Timeline
    • Monitoring and Evaluation Chart
    • Organizational Chart
    • Detailed Budget
    • Detailed Budget Notes
    • Annexes or Appendices (If necessary)
  • Elements of a Professional Proposal
    • Cover Letter
      • Addresses the recipient of the proposal (the potential donor)
      • Clearly describes in a paragraph the major objective(s) of the proposal
      • Adds any additional, important information which may not be clear from the proposal
      • Should be positive and hopeful!
  • Elements of a Professional Proposal
    • Executive Summary
      • Only necessary if the proposal is longer than 10 pages
      • Clearly summarizes the program description, including major objectives, outputs, and activities, in addition to major beneficiaries, cooperating partners, and timeframe
  • Elements of a Professional Proposal
    • Introduction/Background Information
      • Should set the stage for the program
        • Who is your organization and what are your particular skills/existing programs?
        • What is the nature of the area in which you will be working? Politics? History? People? (As necessary)
        • Describe briefly the nature and results of the needs assessment that was conducted leading to the design of your program.
        • Introduce Beneficiaries and Cooperating Partners
  • Elements of a Professional Proposal
    • Program Description
      • Should be clear, concise and outlined based on program objectives
    • Proposal element = Causal Pathway level:
      • Objectives = Impact / Effects
      • Results/Outputs = Outputs
      • Activities = Activities
      • Who / what / where? = Inputs
  • Elements of a Professional Proposal
    • Program Description – continued…
      • The description should also include:
        • Beneficiaries
        • Main Cooperating Partners and their roles
        • The general timeframe for implementation of major activities and expected results/outputs and objectives
        • Section on Sustainability and Exit Strategy (MUST HAVE!)
          • How will complete control and management of the program be transferred to the appropriate stakeholders where appropriate?
          • Include strategies on Capacity Building, Financial Sustainability, Policy….etc, where appropriate.
  • Program Timeline (sample Gantt chart for Year 1, Months 1–6, with an X marking the months in which each item takes place)
    • Expected Result 1: Activity 1.1, Activity 1.2
    • Expected Result 2: Activity 2.1, Activity 2.2
    • Monitoring / Reporting
    • Evaluation
  • Monitoring and Evaluation Chart
    • Basically a clear summary of all objectives and their associated MAJOR activities and outputs/results (simplified below)
    • Chart columns: description, Indicator, Source of Verification, Risks
    • Impact: the socioeconomic status of people with disabilities is increased
      • Indicator: poverty rate among people with disabilities; Source of verification: reports from Government, NGOs and the World Bank
    • Project Objective: increased employment among people with disabilities
      • Indicator: employment rate among people with disabilities; Source of verification: reports from Government and NGOs
    • Expected Results/Outputs: implementation of national level disability employment law
      • Indicator: % of registered businesses successfully adhering to the disability employment law; Source of verification: reports from Provincial Government and NGOs
    • Activities: assist government with the development of the disability employment law
      • Indicator: law is approved; Source of verification: program monitoring reports
  • Elements of a Professional Proposal
    • Organizational Chart
      • Helps to understand the management structure for a given program and may help to identify where management issues may arise
      • Also useful for staff in an organization
    • Sample chart positions: Director; Rehab Coordinator; Outreach Coordinator; Administrative Assistant; two Technicians; Secretary; Driver
  • Elements of a Professional Proposal
    • Detailed Budget (sample)
      • Outlines, by year, specific costs according to general categories
      • Should distinguish costs to the potential donor versus costs covered by core funds (or other donors)
    • Sample columns: category, cost per unit, units, Year 1 costs (USAID / other), Year 2 costs (USAID / other), total
    • Sample personnel rows:
      • 1. Personnel - International
        • Country Director: 5,000 / month, 12 months (30,000 USAID, 30,000 other funds) = 60,000
        • Program Manager: 2,000 / month, 24 months (24,000 per year) = 48,000
      • 2. Personnel - Local
        • Program Assistant: 1,000 / month, 24 months (12,000 per year) = 24,000
        • Secretary: 500 / month, 12 months = 6,000
      • Subtotal - Personnel: 139,000
      • Total: 142,000
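A quick sketch of the arithmetic behind each personnel line (units multiplied by cost per unit), using the sample figures above; only line totals are computed, since the split between USAID and other funds varies by item.

```python
# Sketch: derive each sample budget line total as units x cost per unit.
# Figures follow the sample personnel rows above.
line_items = {
    "Country Director":  (5_000, 12),  # (cost per month, number of months)
    "Program Manager":   (2_000, 24),
    "Program Assistant": (1_000, 24),
    "Secretary":           (500, 12),
}

for name, (cost_per_unit, units) in line_items.items():
    print(f"{name}: {units} x {cost_per_unit:,} = {cost_per_unit * units:,}")
```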
  • Elements of a Professional Proposal
    • Detailed Budget Notes
      • Should explain in detail EACH line item in the detailed budget
      • Example:
      • Line item 4.3 Regional Dissemination Workshop ($150)
      • This workshop is designed to disseminate findings from the Inclusive Education pilot in Dong Dan province. Estimated # of participants = 50. Costs will cover local venue rental ($100), and lunch and refreshments for 50 people at $1/person ($50).
  • The End