What Brought You Here?
Why You Should Measure Outcomes
What You Should Measure
How To Make It Happen
2
Why are you interested in this workshop?
What do you hope to take away today?
3
Program evaluation is carefully collecting and
analyzing information about a program or some
aspect of a program in order to make necessary
decisions.
The type of evaluation you undertake to
improve your program depends on what you
want to learn.
Continual improvement is an unending journey.
4
Process
Descriptions of what you do, how much you produce,
and how you do it
Outcomes
Changes that occur within the client, member or
system as a result of your activities
5
6
External Pressures
Increased Public Scrutiny
Funder Accountability
Economic Decline
Internal Motivations
Continuous Improvement
Greater Mission Focus
7
Direct service providers need to demonstrate
results for the client or target population
Associations must justify fees and retain
members
Systems change nonprofits need to show
changes in system capacity, investment or
readiness
Foundations must demonstrate impact across
grants awarded or at the community level
8
9
Data can provide powerful evidence
Data can be objective and reduce bias
Balanced information provides the opportunity
for independent interpretation
Data in existing tracking systems may be useful
10
Who are the possible audiences for your data?
What do they already know?
What do they want to know?
How will they use the data?
11
Possible Audience | What Do They Already Know? | What Do They Need to Know? | How Will They Use the Data?
12
Exercise
Designing a good tool
requires more time
and attention than you
may think.
Questions can tap into:
• Characteristics
• Opinions
• Experiences
• Knowledge
• Attitudes
• Behaviors
13
Process – activity/output/efficiency
Number and type of volunteer hours provided,
volunteers trained, workshops held, applications
processed, etc.
Example: Average contact time with participants
Outcomes – effectiveness/results/impact
Can include change in clients’ level of awareness,
knowledge gained, behaviors that have changed, etc.
Example: Percentage of participants who
demonstrate increased knowledge of effective
parenting skills
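The outcome example above can be computed from simple pre/post test data. The sketch below is purely illustrative: the participant IDs and scores are hypothetical, and it assumes "increased knowledge" is defined as a post-test score higher than the pre-test score.

```python
# Hypothetical pre/post knowledge scores for workshop participants.
# IDs and values are illustrative, not from any real program.
pre_scores = {"p1": 4, "p2": 7, "p3": 5, "p4": 6}
post_scores = {"p1": 6, "p2": 7, "p3": 8, "p4": 9}

def pct_improved(pre, post):
    """Percentage of participants whose post-test score exceeds their pre-test score."""
    improved = sum(1 for p in pre if post[p] > pre[p])
    return 100.0 * improved / len(pre)

# With the sample data above, 3 of 4 participants improved.
print(f"{pct_improved(pre_scores, post_scores):.0f}% of participants "
      "demonstrated increased knowledge")
```

The same pattern works for any pre/post outcome measure; only the definition of "improved" changes.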
14
What is a logic model? A logic model is a
simple description of how your program,
service, or initiative works that shows the
linkages between:
Problem you are attempting to address
Program components
Program activities
Outcomes (both short- and long-term)
15
Useful for designing/developing your program
or initiative
Provides a framework for evaluation
Provides a focal point for stakeholders,
requiring them to work together to identify the
components of the model and to think about
expected outcomes
16
17
Find a champion
Usually someone in leadership
Integrate communication about the effort into
operations
Select someone who can implement and propel
the effort forward
Choose a person who has a compatible mindset
Choose a clear communicator
Provide professional development if needed
Give authority that allows dedicated time
18
What outcomes will you measure?
How will you define each outcome?
What tools will you use to do so?
Who is responsible for data
collection?
What are the timelines for doing so?
What questions do you want to be
able to answer at the end of the year?
19
Surveys
Personal interviews
Focus groups
Other data collection
tools
Existing databases
20
A quantitative research project in which a
relatively large number of people are queried,
each being asked a standard set of questions,
posed in the same way each time.
21
To increase understanding
To collect specific, standardized data across
respondents
To collect data across many people as efficiently
as possible
To collect data quickly, even across multiple
groups
Open-ended quotes may add impact and
credibility
22
An interview is the collection of data by asking
people questions and following up or probing
their answers.
23
To increase understanding
As an exploratory first step to creating
quantitative tools
Enhance understanding of interesting findings
which emerged from other processes
Real-world quotes may add impact and
credibility
To collect specific, standardized data across
respondents
24
A focus group is a group discussion. Participants
are brought together in a neutral location for the
specific purpose of discussing an issue or
responding to ideas or materials of interest.
25
To increase understanding
To brainstorm or explore an idea you may wish
to implement
Enhance understanding of interesting findings
which emerged from other processes
To generate ideas for possible solutions
26
Observations
Checklists (process documentation)
Attendance logs
Case records
Existing databases
27
Depends on the purpose – who can answer your
questions?
If not apparent, ask for guidance from key
informants.
If potential population is large, generate a list of
candidates and key characteristics then randomly
select.
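The "generate a list and randomly select" step can be sketched in a few lines. This is a minimal illustration assuming the candidate list is a set of records with one key characteristic (here, a hypothetical "region" field) and that you want an equal number drawn at random from each group.

```python
import random

# Illustrative candidate list with a key characteristic; names are hypothetical.
candidates = [
    {"name": "A", "region": "north"}, {"name": "B", "region": "north"},
    {"name": "C", "region": "south"}, {"name": "D", "region": "south"},
    {"name": "E", "region": "south"}, {"name": "F", "region": "north"},
]

def stratified_sample(people, key, per_group, seed=0):
    """Randomly select `per_group` people from each value of `key`."""
    rng = random.Random(seed)  # fixed seed so the selection is reproducible
    groups = {}
    for p in people:
        groups.setdefault(p[key], []).append(p)
    return [p for members in groups.values() for p in rng.sample(members, per_group)]

selected = stratified_sample(candidates, "region", per_group=1)
```

Sampling within each group rather than from the whole list keeps key characteristics (age, race, gender, region, etc.) represented in the selection.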
28
Document your processes first
Choose the simplest solution that will get the
job done
Include leadership, data entry personnel and data
analysts in these conversations
Focus on how data relates to reporting
Software is not the silver bullet
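A "simple solution" can be as small as a flat file plus a short summary script. The sketch below assumes a minimal attendance log kept as CSV; the column names are invented for illustration, and the point is the pattern (document the process, keep reporting in mind), not any particular tool.

```python
import csv
import io
from collections import Counter

# Stand-in for a real attendance log file; columns are assumptions.
log = io.StringIO(
    "participant,workshop\n"
    "p1,budgeting\n"
    "p2,budgeting\n"
    "p1,parenting\n"
    "p3,parenting\n"
)

rows = list(csv.DictReader(log))
by_workshop = Counter(r["workshop"] for r in rows)

# Summaries like these feed directly into process reporting.
print(f"{len(rows)} attendances across {len(by_workshop)} workshops")
for workshop, count in sorted(by_workshop.items()):
    print(f"  {workshop}: {count}")
```

If a spreadsheet or database already holds this data, the same counts can usually be produced there; the script only shows how little machinery basic process reporting requires.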
29
Keep it simple
Take baby steps
Balance simplicity with
precision
Focus on interpretation
Blend methods if
appropriate
30
Measuring Nonprofit Outcomes


Editor's Notes

  • #2 Good morning. I am very pleased to have the opportunity to talk to you today about simple steps that can help you move evaluation planning for your organization forward. Over the past 20 years, I have been involved in conducting program evaluations, many for nonprofits or nonprofit programs. I hope that I will be able to share some information today that will help you take positive steps in focusing, simplifying or kick-starting evaluation efforts in a way that is meaningful for your organization.
  • #4 So tell me – why were you interested in this session?
  • #5 Evaluation can be summed up this way: (Read slide). Consider this quote: “Continual improvement is an unending journey.” When you think about your organization, do you think this holds true? Every organization has room for improvement. Evaluation can help you identify critical areas for enhancement. We are going to focus quite a bit on measurement today because good measurement is a keystone to evaluation. Measurement helps us identify opportunities for us to make more intentional and mindful choices.
  • #6 The type of evaluation you undertake to improve your program depends on what you need to learn. There are many different types of evaluation. We will focus primarily on activities related to process and outcomes evaluation today, as these are good places to start. Process evaluations, quite simply, examine what you do and how you do it, while outcome evaluations examine the results you achieve from those activities.
  • #8 External: Scandals in the sector – what do you remember? – United Way of America was defrauded of over $1M by its former president. Funders are demanding more proof of effectiveness. The economic decline has made donors and association members more discriminating regarding the organizations that they support or join. Internal: Continuous Improvement. Every organization has room for improvement. Greater Mission Focus. Are we clearly directing our efforts towards things that work to achieve our mission?
  • #9 Direct service providers do this to retain or win funding. When the economy declines, associations become very vulnerable to attrition. Memberships are sometimes seen as “expendable” expenses unless the value is clear. Systems change organizations also need to secure funding. Foundations – helps them continue determination of community-based funding priorities
  • #10 So what are the advantages of using data? Read slide. Outcomes data specifically provides evidence of results that can lead to: Improved quality and quantity of services Greater efficiency and effectiveness Volunteer/staff satisfaction Program recognition Increased funding Greater impact on the community
  • #12 When you contemplate what to measure, it's important to consider the audiences you will be sharing data with. This helps you gain clarity on the questions you wish to answer. Some important issues to consider with regard to audience are: READ SLIDE Let's start with the first one. What are some possible audiences for your data? DOCUMENT ON FLIP CHART
  • #13 This is a helpful structure to document the audiences that are most relevant to you. Then answer each question for each audience. This will give you some guidance on the things you may wish to measure. Focus on results, not activities What you find interesting may not be interesting to your audience
  • #14 Good tools take time to develop. For research-based data, your questions should be clear and ideally elicit standardized interpretations. There are six primary types of data we collect for evaluation: Process - Characteristics: Demographics, etc. Opinions: Reports on positions Experiences: Reports things that have happened Outcomes - Knowledge: Questions to assess awareness of facts Attitude: A settled way of thinking about something Behaviors: Reports on actions
  • #15 Just like everything else in life, your measurement strategy should be balanced. One particularly important dimension to balance is the mix of measures that examine process vs. outcomes. Process measures are typically used to examine activity and output, and sometimes efficiency. Examples include the number and type of volunteer hours provided, volunteers trained, and workshops held, etc. Outcome measures are the most desirable, and generally a little more difficult to collect. They can include change in clients' level of awareness, knowledge, attitudes or behaviors. Example: Of participants trained, how much has their knowledge increased? Outcomes can also be short-term or long-term. Short-term – during or immediately after. Long-term – 1 year as a rule of thumb.
  • #16 EXERCISE: LOGIC MODEL SHEETS: Ask them to pull out the sample logic model from their handouts. The logic model, sometimes referred to as a “theory of change” is a simple tool that describes how a program works and helps determine key measurement and evaluation information. The key elements of a logic model are 1) client (youth and family) and system conditions 2) major program components, 3) program activities, and 4) program outcomes (both short and long term). All of these components are linked logically to create your program evaluation framework.
  • #17 Useful for designing your program – the process of developing a logic model will clarify your thinking about your program, how it is intended to work, and what adaptations may need to be made once the program is operational. Provides a great framework from which to conduct on-going evaluation of the program. It allows identification of logical outcomes that are expected given the types of program components and activities that are implemented. Value in the process of developing the logic model – provides a focal point for stakeholders, requiring them to work together to identify the various components of the model and to think about expected outcomes. Useful for explaining the program to others and creates a sense of ownership among the stakeholders.
  • #19 I recommend 3 primary steps to making evaluation happen, in order. The first one is PEOPLE. Find a champion: Having outcomes as a priority among organizational and department leadership is important to ensuring staff buy-in and to driving implementation. Select someone who can propel efforts forward: Don't pick someone who is highly focused on clinical practice or who is purely a visionary. You need someone who is interested in and driven by the details.
  • #20 Building your team doesn't mean you have to have a fully-fledged department – it simply means having enough capacity to match the scale of your desired effort. Once you have your team in place, then turn to your process. By process, I mean the nuts and bolts that make up your outcomes measurement strategy. This is a good set of starter questions to help you pin down your process.
  • #21 Identifying the tools you will use to collect data is a key part of developing your outcomes measurement process. There are many different ways that we can collect data, so it's important to focus on the simplest solution. The tools should also be an appropriate length that will address your primary concepts, but will also keep the respondent engaged, if relevant. I'm going to quickly review a few types of tools and when they are best used.
  • #22 Surveys Usually quick and easy, particularly when large numbers of respondents are important. They may also be more convenient when you do not have close interaction with the desired respondents.
  • #23 Method: Phone, Snail Mail, Web-Based, Blended Approach. Question Types: Open-ended, Closed-ended, Combination.
  • #24 Personal Interviews: Work well to collect detailed information, give the added benefit of non-verbal cues so you can probe when issues seem unclear, and are a great option when you want to increase rapport with your clients/participants. But they are time-consuming.
  • #25 Interviews for research or evaluation purposes promote understanding and change. Qualitative interviews can be a first step to creating quantitative tools, as well as great supplementary tools to enhance understanding of other findings that need more exploration. e.g. Conducted exploratory interviews of Virginia’s drug court programs. Programs have been developed in 28 courts, and the localities have local discretion to guide program development and implementation. Our process helps us understand the variations across the programs so we can create quantitative tools to gather standardized data in a new database for gathering client-specific data. Real world testimonials and quotes can be extremely valuable.
  • #26 Focus Groups Focus groups assemble a group of relevant people (customers, clients, potential investors) for the purpose of gathering information. There is a reason why they are called focus groups: you should have questions outlined that you want to answer and that you will be taking action on.
  • #27 EXPLORATORY: Want to get a general reaction to a thought or idea, or do brainstorming ISSUE-FOCUSED: Trying to address a particular issue, problem or create a solution. You have specific things you want to accomplish when you leave the room. (e.g., Meals on Wheels)
  • #28 Other Data Collection Tools: Systematic collection of information could take many different forms, including checklists to document the steps in a procedure, forms to document processes, retrieval of specific key data from expanded client files, etc. This type focuses on internal collection of information, rather than feedback from external parties. These are often quite helpful in situations where staff have autonomy in maintaining client data, but no standardized forms exist, to summarize information across the organization. CASA EXAMPLE: No standardized contact log – collected info across sites – difficult – suggested use of consistent documentation for case management purposes, in case of turnover! Tools help in other ways – CONTINUITY
  • #29 But who are the best people to answer your primary questions? This depends on your purpose. As an example, in interviewing the drug courts, we have found that tasks are often divided between multiple agencies. In some instances, we need 3 or 4 different people in the room to provide a balanced perspective on the program. In other instances, one person runs the entire program and can tell us everything we need to know. We discovered this by asking our key contacts in each program. When the potential population is large, such as participants in a program, I generally use a selected method based on accepted research practice, such as random selection based on key characteristics (age, race, gender, geographical region, etc.). Another possibility is to randomly select program delivery sessions to attend and solicit participants.
  • #30 The final step is the system for collecting data. This can be on paper, a simple Excel spreadsheet, or a software system. Regardless of the system you choose, you should be clear on your process first or you will be wasting time – no question. These conversations should include leadership, data entry personnel, and analysts. In addition, the analyses and reports that you incorporate in the system should be driven by your key questions. Again, pre-planning and clarity are key. Software is not a silver bullet!
  • #31 Evaluation and measurement work best when you plan ahead. In general, measures that are implemented or derived in an "I need it now" situation are not that reliable. Remember – focus on continuous improvement – all organizations can be improved. How are you determining the steps to improve your organization? Are you using gut instinct or real information? Keep it simple – think about the audience for your information. Don't use jargon or technical terms if your audience can't respond to them. If your organization is small, it may need only a few measures, and your efforts to demonstrate impact will set you apart. Remember, take baby steps if you need to. As long as you are moving forward, you are moving positively! Ask questions in such a way that you can interpret the results, and prioritize those that drive decision-making. Outcome measures have the most dramatic effect on potential customers, funders or investors. Move toward results measurement as much as you can. Blend methods if appropriate.
  • #32 In the spirit of evaluation, I have included a feedback form in your handouts today and would appreciate any thoughts you have on the session. If you can take a moment to complete the form before you leave, please drop it off on the way out. Also, if you have questions about measurement or would like to discuss your company’s needs in more depth, please feel free to contact me.