Best Practices in Nonprofit Impact Measurement, CNM

Presentation Transcript

  • Best Practices in Nonprofit Impact Measurement. Presented by Charlotte Keany, Director of Consulting, and Mary Jones, Consultant
  • Workshop Objectives
    • Introduction to an outcomes based program evaluation process
    • Learn how to develop a logic model
    • Learn how logic models can be used in program planning and evaluation
  • Outcomes Based Evaluation What is it? A systematic process to obtain information on an organization’s activities, its impacts, and the effectiveness of its work, so that it can improve its activities and describe its accomplishments. (Paul W. Mattessich, The Manager’s Guide to Program Evaluation)
  • Outcomes Based Evaluation Why is it important?
    • Determine whether program goals have been met
    • Refine programs to improve overall effectiveness
    • Enhance ability to communicate results
    • Enhance promotion and marketing of a program to the public
    • Allocate dollars more efficiently
  • “If you don’t know where you are going, how are you gonna know when you get there?” – Yogi Berra
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • What is a Logic Model?
    • A logic model is a framework that helps you design results-based programs.
    • It is a visual representation of how your program works.
    • It includes what you put into your program, what you do and what you plan to achieve.
  • Why use a Logic Model?
    • Tells your story and the difference you are making in the community
    • Strengthens your case for program investment
    • Demonstrates accountability to stakeholders
    • Builds understanding and promotes consensus about what the program is and how it works
    • Helps with planning, evaluation, implementation and communications
  • Logic Model Components: Your Planned Work (Inputs, Activities) and Your Intended Results (Outputs, Outcomes)
  • Elements of a Logic Model INPUTS are materials and resources a program uses in its activities or processes to serve clients.
  • Logic Model Components
    • Inputs
    • Staff
    • Volunteers
    • Facilities
    • Equipment
    • Curricula
    • Money
  • Elements of a Logic Model Inputs are resources a program uses to achieve program objectives. ACTIVITIES are what you do with your resources: the programs and services designed to meet your clients’ needs and fulfill your mission and vision.
  • Logic Model Components
    • Inputs
    • Staff
    • Volunteers
    • Facilities
    • Equipment
    • Curricula
    • Money
    • Activities
    • Feed & shelter homeless families
    • Provide job training
    • Mentor youth
  • Elements of a Logic Model Inputs are resources a program uses to achieve program objectives. Activities are what you do with your resources: the programs and services designed to meet your clients’ needs and fulfill your mission and vision. OUTPUTS are the units of service a program delivers: the tangible, direct products of its activities, usually measured as the volume of work accomplished (e.g., number of classes taught, meals provided, counseling sessions conducted).
  • Logic Model Components
    • Inputs
    • Staff
    • Volunteers
    • Facilities
    • Equipment
    • Curricula
    • Money
    • Activities
    • Feed & shelter homeless families
    • Provide job training
    • Mentor youth
    • Outputs
    • Number of meals served
    • Number of counseling sessions conducted
    • Number of youth mentored
  • Elements of a Logic Model Inputs are resources a program uses to achieve program objectives. Activities are what a program does with its inputs: the services it provides to fulfill its mission. Outputs are the products of a program’s activities. OUTCOMES are the actual benefits or changes for participants during or after their involvement with a program (see the sketch after the components list below).
  • Logic Model Components
    • Inputs
    • Staff
    • Volunteers
    • Facilities
    • Equipment
    • Curricula
    • Money
    • Activities
    • Feed & shelter homeless families
    • Provide job training
    • Mentor youth
    • Outputs
    • Number of meals served
    • Number of counseling sessions conducted
    • Number of youth mentored
    • Outcomes
    • New Knowledge
    • Increased skills
    • Changed attitudes or values
    • Altered status
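The logic model components spelled out above translate naturally into a small data structure. Below is a minimal sketch (assuming Python; the class name, field names, and example values are illustrative, taken from the component lists above rather than from any workshop code):

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A program's logic model: planned work (inputs, activities)
    and intended results (outputs, outcomes)."""
    inputs: list[str] = field(default_factory=list)      # resources the program uses
    activities: list[str] = field(default_factory=list)  # what the program does with those resources
    outputs: list[str] = field(default_factory=list)     # units of service / volume of work delivered
    outcomes: list[str] = field(default_factory=list)    # benefits or changes for participants

# Example populated from the components listed above
shelter_program = LogicModel(
    inputs=["Staff", "Volunteers", "Facilities", "Equipment", "Curricula", "Money"],
    activities=["Feed & shelter homeless families", "Provide job training", "Mentor youth"],
    outputs=["Number of meals served", "Number of counseling sessions conducted",
             "Number of youth mentored"],
    outcomes=["New knowledge", "Increased skills", "Changed attitudes or values", "Altered status"],
)
```

Keeping the model in a structured form like this makes it easy to check that every activity has at least one corresponding output and that every output links to an intended outcome.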
  • Levels of Outcomes
    • Short-term outcomes – The most immediate benefits or changes participants experience
    • Intermediate outcomes – Link initial outcomes to the longer-term desired outcomes
    • Long-term impact – The ultimate impact a participant will enjoy as a result of participating in a program or an organization’s services.
  • Teen Mother Parenting Education Program
  • Teen Mother Parenting Education
    • Inputs
    • Pregnant teens
    • MSW Program Manager
    • RN Instructor
    • Education Manuals
    • Videos
    • Money
    • Activities
    • Parenting classes; two times a week for one hour
    • Content: infant nutrition, development, safety, caretaking
    • Outputs
    • # pregnant teens served
    • # high schools participating
    • # manuals distributed
    • Initial Outcomes
    • Teen mothers are knowledgeable about prenatal nutrition and health guidelines
    • Intermediate Outcomes
    • Teen mothers follow proper nutrition and health guidelines
    • Teen mothers deliver healthy babies
    • Long-Term Outcomes
    • Babies achieve 12 month milestones for physical, motor, verbal, and social development
  • Follow the Chain of Reasoning
    • To have a successful, sustainable program . . .
    • Certain resources/inputs are needed to operate your program.
    • If you have access to them, then you can accomplish your planned activities.
    • If you accomplish your planned activities, then you will deliver the products and/or services you intended (outputs).
    • If you accomplish your planned activities to the extent intended, then your participants will benefit in specific, measurable ways (outcomes).
    • If these benefits to participants are achieved, then changes in organizations, communities, or systems will occur, resulting in positive impact.
  • The Conceptual Chain
    • Inputs
    • Staff
    • Volunteers
    • Facilities
    • Equipment
    • Curricula
    • Money
    • Activities
    • Feed & shelter homeless families
    • Provide job training
    • Mentor youth
    • Outputs
    • Number of meals served
    • Number of counseling sessions conducted
    • Number of youth mentored
    • Initial Outcomes
    • Changes in:
    • New Knowledge
    • Increased skills
    • Changed attitudes or values
    • Intermediate Outcomes
    • Changes in behavior resulting from:
    • New Knowledge
    • Increased skills
    • Changed attitudes or values
    • Long-Term Outcomes
    • Changes resulting in:
    • Altered status
  • Teen Smoking Reduction Program
    • Inputs
    • Staff
    • Instruction Modules
    • Volunteers
    • Training of Volunteers
    • Activities
    • One Session Lecture
    • Match with volunteer mentor for counseling
    • Outputs
    • Number of youth in lecture
    • Number of youth matched with mentors
    • Initial Outcomes
    • Increased knowledge of the risks of smoking
    • Increased support from mentor to quit smoking
    • Intermediate Outcomes
    • Youth quit smoking
    • Long-Term Outcomes
    • Youth quit smoking for more than one year
    • Less illness
    • Longer life expectancy
    • Lower health costs
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Step 2: Develop Outcome Indicators
      • Indicators are specific, observable, measurable characteristics or changes that will represent achievement of the outcome
      • Indicators are specific statistics (e.g., number/percent) the program will calculate to summarize its level of achievement
  • Identify Outcome Indicators
      • Indicators should be:
      • Direct
      • Meaningful
      • Useful
      • Practical to collect
      • Quantitative
      • Multiple – use more than one indicator per outcome where possible
  • Teen Mother Parenting Education Outcome Indicators
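To make Step 2 concrete, the sketch below computes an indicator as both a number and a percent, using the healthy-birth indicator from the teen mother example (newborn above 5.5 pounds with an Apgar score of 7 or above). It assumes Python; the helper function and the sample records are hypothetical illustrations, not program data.

```python
def indicator(records, achieved):
    """Return (number of participants achieving the outcome, percent of all participants)."""
    total = len(records)
    count = sum(1 for record in records if achieved(record))
    percent = round(100 * count / total) if total else 0
    return count, percent

# Hypothetical birth records for illustration only
births = [
    {"weight_lbs": 6.1, "apgar": 8},
    {"weight_lbs": 5.2, "apgar": 7},
    {"weight_lbs": 7.0, "apgar": 9},
]
count, percent = indicator(births, lambda b: b["weight_lbs"] > 5.5 and b["apgar"] >= 7)
print(f"{count} of {len(births)} newborns ({percent}%) met the healthy-birth indicator")
```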
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Step 3: Collect Data
      • Identify data sources for your indicators
      • Design data collection methods
      • Pretest your data collection instruments and procedures
      • Document your procedure
  • Data Collection Procedures
      • What data will be collected?
      • How will the data be collected?
      • Who will collect the data?
      • When will the data be collected?
      • How will the data be used?
  • Data Collection Methods:
      • Interviews
      • Focus Groups
      • Surveys
      • Observation
      • Document review
      • Others (?)
  • Keep in Mind 20% of the effort generates 80% of the results
  • Outcome Measurement Framework
  • Teen Mother Parenting Education Measurement Framework
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Step 4: Analyze Results
      • Document and summarize data
      • Tabulate results
      • Analyze data
  • Outcome of Delivering Healthy Babies by Age of Mother
    Indicator: newborn weighed above 5.5 pounds and scored 7 or above on the Apgar scale

    Delivered a healthy baby     Age <17    Age 17+    Total    Percent of all
    Yes – number                     8          5         13         72%
    Yes – % of age group            67%        83%
    No – number                      4          1          5         28%
    No – % of age group             33%        17%
    Totals                          12          6         18        100%
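A minimal sketch of how a cross-tabulation like the one above could be produced (assuming Python; the observation list simply reproduces the counts from the table, and the names are illustrative):

```python
from collections import Counter

# (age group, healthy birth) pairs reproducing the counts in the table above
observations = ([("<17", True)] * 8 + [("17+", True)] * 5 +
                [("<17", False)] * 4 + [("17+", False)] * 1)

counts = Counter(observations)
total = len(observations)  # 18 participants
for healthy in (True, False):
    row = {group: counts[(group, healthy)] for group in ("<17", "17+")}
    row_total = sum(row.values())
    label = "Yes" if healthy else "No"
    print(f"{label:>3}  <17: {row['<17']}  17+: {row['17+']}  "
          f"total: {row_total} ({100 * row_total / total:.0f}% of all)")
```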
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Step 5: Communicate Findings
      • Present data in clear and understandable format
      • Provide explanatory information related to your findings
  • Outcome Data Table: Teen Mother Parenting Education Program
  • Data Presentation
      • Example pie chart and bar chart of the outcome data (charts not reproduced in this transcript)
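As a rough stand-in for the charts on this slide, the sketch below draws a pie chart and a bar chart of the healthy-birth outcome using the totals from the table above (13 yes, 5 no). It assumes matplotlib is installed; the original slide’s charts are not reproduced in this transcript.

```python
import matplotlib.pyplot as plt

labels = ["Healthy birth", "Not healthy birth"]
counts = [13, 5]  # totals from the outcome table above

fig, (ax_pie, ax_bar) = plt.subplots(1, 2, figsize=(8, 3))
ax_pie.pie(counts, labels=labels, autopct="%1.0f%%")  # share of participants per result
ax_bar.bar(labels, counts)                            # raw counts per result
ax_bar.set_ylabel("Number of participants")
fig.suptitle("Teen Mother Parenting Education: healthy-birth outcome")
plt.tight_layout()
plt.show()
```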
  • Trial-run Outcome Findings
      • Do the findings seem reasonable?
      • Is the information presented clearly?
      • Are explanations of problem areas and proposed remedies satisfactory?
      • Does anything seem to be missing?
      • What other charts or tables would be helpful?
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Measure & Monitor Outcomes for Continuous Improvement
      • Review your process and data; make any necessary adjustments to your program
      • Monitor and review your programs periodically
  • Outcomes Based Evaluation Steps: A Continuous Process for Program Improvement
  • Myth Busters
      • It’s an event to get over with and then move on!
      • Evaluation is a whole set of new activities – “We don’t have the resources!”
      • There’s a “right” way to do outcomes evaluation. “What if I don’t get it right?”
      • Funders will accept or reject my outcomes plan.
      • I always know what my clients need – “I don’t need outcomes evaluation to tell me if I’m really meeting the needs of my clients or not.”
  • Use your Findings
    • Provide direction for staff
    • Identify staff and volunteer training needs
    • Identify technical assistance needs
    • Identify program improvement needs
    • Support annual and long range planning
    • Guide budget and justify resource allocation
    • Focus Board members on programmatic issues
    Resulting in… meeting your mission
  • Resources
    • Measuring Program Outcomes: A Practical Approach. United Way of America, 1996. To order, call 800-772-0008
    • Logic Model Development Guide. W. K. Kellogg Foundation. A clear and concise discussion of the use of logic models. (www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf)
    • Information Gold Mine: Innovative Uses of Evaluation. Paul W. Mattessich and Shelly Hendricks. Amherst H. Wilder Foundation, 2007
    • The Manager’s Guide to Program Evaluation. Paul W. Mattessich. Fieldstone Alliance, 2003
    • www.managementhelp.org/evaluatn/lgc_mdl.htm
  • Questions