Data, Impact, & Resources

Melissa Horr shares research on collective impact, data, and advocacy on December 8, 2011 in Quincy, MA for nonprofit and government public service agencies.

Published in: Education, Technology, Business
  • Before presentation, download YouTube video and Evaluation Dashboard website; Welcome and thank attendees; Thank both Sarah Link and Kory Eng as Special Guests and announce Kory as a guest speaker; Introduce and highlight importance of the topic – surviving tight economic times with slashed budgets while still serving the even greater needs of the community and meeting the grant requirements
  • My goal in this workshop is to change this perception if it still exists in the room. The focus is the difference between managing to outcomes as a waste of time versus an effective use of time: proving OR improving?
  • In addition, advocating for policy changes…
  • This model was created by combining several different models (see references) and shaping it to our CDBG clients. Arrows represent the decision making and would ideally represent reflection, learning, and decision making; Inputs: Staff (Donna and Antoinette, NHS), Volunteers, Time, Money, Materials, Partners, etc; Outputs: Activities and Participation; Leads to CHANGE – social and funding
  • Situation: problem or issue that the program is to address sits within a setting or situation from which priorities are set; Assumptions: beliefs we have about the program, the people, the environment and the way we think the program will look; External Factors: technological, political, social, economic changes; Theory of Change: defines all the building blocks required to bring about a long-term goal; is a statement or series of statements about what the organization is trying to achieve and will hold itself accountable for within some manageable period
  • Credit to the Bill & Melinda Gates Foundation for the model… see references; AQUA: measure changes in populations and systems; Green: Measure progress toward targets, test assumptions, identify what works, how, and why; ORANGE: track implementation and progress toward targets.
  • Credit to the Bill & Melinda Gates Foundation for the model… see references.
  • “Actionable” Measurement: term borrowed from the Bill and Melinda Gates Foundation; it means purposeful measurement designed and intended to be acted upon. For example: measuring how many people are using the program so that we may act on recruitment, OR measuring how many people are improving their reading skills in the program so that we might act to focus more on that skill.
  • To direct you to this free resource to measure your organizational readiness for an “actionable” measurement system.
  • The “Housing First” model for ending homelessness demonstrated an economic difference: putting families in homes allowed them to regain financial stability and health. The same could be done for food pantries; beneficiaries of the food pantry were able to redirect their spending to rent so they didn’t lose their homes. The Robin Hood Foundation has taken it upon itself to find “place holder” outcomes for some of the programs that are difficult to measure and is committed to piloting those outcomes and rewriting new ones with evidence of quality.
  • Early Childhood: Programs like Parent Child Home Program through Commission on the Family; Education: Quincy After School; Jobs & Economic Security: Quincy 2000; Survival: ESL classes by John Chen / AASA and QCAP Southwest Community Center EFC; The point being that different programs require different sets of outcomes and by separating programs into categories, funders are better able to conceptualize what they are looking for that program to do and what kind of social return on investment they can expect.
  • “Imagine the client…” Carry this mission out this year “and beyond” ; What behaviors do you prefer and/or actually see?; What values, attitudes, status would you prefer to be the fullest extent of benefit for the client and/or actually see?
  • Illustrate & Quantify Progress
  • Don’t collect data that has no use…
  • Online Surveys can be a huge help in understanding what your constituents think and how successful your programs are, without breaking your budget – and using good tools can be a big part of that.
  • Customer service excellent
  • Founded in 2006
  • Prioritize Intended Audiences; Think about your audience and what you want to achieve with presenting the results of your program
  • Think about the dashboard in your car… the dashboard helps you make informed decisions. Do I need to slow down? Am I going the speed limit? Do I need to get gas? How close is my tank to empty? Is my car getting too hot? Do I need to pull over before my car overheats?
  • What do these data points prove? How can we develop a new dashboard model that effectively communicates the community impact of the CDBG public service dollars?
  • Agencies that use outcomes-based measurement are more effective with their resources; therefore, it is a valuable tool for surviving tight fiscal times.
  • Data, Impact, & Resources

    1. TECHNICAL ASSISTANCE WORKSHOP: DATA, IMPACT, & RESOURCES. Realizing Success Without Increased Funding. United Way of Mass Bay and Merrimack Valley; City of Quincy Dept of Planning & Community Development; December 8th, 2011
    2. MYTH: Outcomes-based evaluation is a waste of valuable staff time!
    3. AGENDA: ►Understand & utilize DATA MANAGEMENT; ►Develop quality OUTCOMES & INDICATORS; and ►Advocate effectively for FUNDING from various sources for programs with social impact.
    4. LOGIC & DECISION MAKING: FUNDING SOURCES / POLICY; INPUTS; OUTCOME TARGETS / OUTCOME INDICATORS; OUTCOMES / IMPACT; ACTIVITIES / OUTPUTS
    5. LOGIC & EXTERNAL FORCES: Intended Impact; SITUATION / PRIORITIES; DECISION MAKING CYCLE; EXTERNAL FACTORS; ASSUMPTIONS; Theory of Change
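The logic-model chain on the slides above (inputs through impact) can be sketched as a plain data structure. This is an illustrative sketch only; the field names and example values are my own assumptions, not an actual CDBG program definition:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic chain: inputs -> activities -> outputs ->
    outcomes -> impact. Field names are illustrative only."""
    inputs: list = field(default_factory=list)      # staff, volunteers, money, partners
    activities: list = field(default_factory=list)  # what the program does
    outputs: list = field(default_factory=list)     # direct products: sessions held, people served
    outcomes: list = field(default_factory=list)    # changes in participants
    impact: str = ""                                # long-term community change

# Hypothetical example program (invented values for illustration)
program = LogicModel(
    inputs=["grant funding", "staff", "volunteers"],
    activities=["ESL classes", "case management"],
    outputs=["120 class sessions", "300 clients served"],
    outcomes=["participants improve English proficiency"],
    impact="a more economically secure neighborhood",
)
print(program.impact)
```

Writing the model down this way makes the decision-making arrows concrete: each field is something the organization can inspect and revise as data comes in.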
    6. CDBG PROGRAM EXAMPLE: Asian Liaison Program. (Logic-model grid: Inputs, Strategy, Initiative, Grant; Activities, Outputs, Outcomes, Impacts.) Outreach to increase the diversity of beneficiaries for City of Quincy CDBG programs. Offer both direct and indirect service at sites where CDBG programs are implemented. Assist and track clients through data collection, spoken and written materials translation, and ESL classes.
    7. CDBG PROGRAM EXAMPLE: SSYMCA - Germantown Neighborhood Center. (Logic-model grid: Inputs, Strategy, Initiative, Grant; Activities, Outputs, Outcomes, Impacts.) Transform neighborhood into a healthy & sustainable community of choice & opportunity. Offer programs to strengthen the community for women, youth, and family and offer a food pantry to supply basic needs. Provide match funding for services and activities aligned with the initiative while tracking program participants and success stories.
    8. STEPS TO SUCCESSFUL “ACTIONABLE” MEASUREMENT: 1. Prepare Organization; 2. Define Outcomes; 3. Define Indicators; 4. Collect Data; 5. Pilot System; 6. Analyze / Report Results
    9. ORGANIZATIONAL READINESS ►Mission: For what purpose does the organization exist? ►Vision: What would the community look like if the organization was no longer needed?
    10. ORGANIZATIONAL READINESS ►Decision-Making: Does the organization’s leadership use DATA to make decisions? ►Improvements: Are there opportunities for staff to use DATA to drive improvements?
    11. ORGANIZATIONAL READINESS Marguerite Casey Foundation Capacity Assessment Tool, 2001, http://www.caseygrants.org/pages/resources/resources_downloadassessment.asp  Well written, well formatted  FREE Excel download  Capacity Measured: Leadership, Adaptive, Management, Operational  Offers Scores & Analysis
    12. OUTCOMES ►Difficult:  Preventative (Health)  Developmental (Education)  “One-time” / Anonymous (Food Shelves) ►Easier:  Counteractive programs  Designed to address observable problems  Example: Chronic Homelessness
    13. OUTCOMES ►Early Childhood ►Education ►Jobs & Economic Security ►Survival. Citation: The Robin Hood Foundation, http://www.robinhood.org
    14. OUTCOMES ►Immediate: Currently in OR just after leaving the program ►Intermediate: 3-6 months after program ►Ultimate: 6-12 months & beyond program
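The timing bands on slide 14 amount to a simple classification rule. A sketch, assuming “months” counts from program exit; the handling of exactly 6 months is my assumption, since the slide's bands share that endpoint:

```python
def outcome_stage(months_after_program: float) -> str:
    """Map time since leaving the program to the slide's outcome categories:
    currently in / just after -> immediate; 3-6 months -> intermediate;
    6-12 months & beyond -> ultimate. Values between bands (e.g. 2 months)
    fall back to the earlier category, an assumption not stated on the slide."""
    if months_after_program < 3:
        return "immediate"
    if months_after_program <= 6:
        return "intermediate"
    return "ultimate"

print(outcome_stage(0), outcome_stage(4), outcome_stage(9))
```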
    15. INDICATORS ►What would I see, hear, or read about clients that would mean progress toward the outcome? ►What numbers and/or percentage would indicate significance? ►Example: “2,000 or 50% of the participants will quit smoking by the end of the program.”
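The example indicator above combines an absolute count with a rate, joined by “or.” A minimal sketch of checking such a target; the function name and the thresholds-as-parameters are mine, not from the workshop:

```python
def indicator_met(successes: int, participants: int,
                  target_count: int = 2000, target_rate: float = 0.50) -> bool:
    """True if the indicator target is reached by EITHER the absolute count
    OR the percentage threshold, matching the slide's 'or' wording."""
    if participants == 0:
        return False
    return successes >= target_count or successes / participants >= target_rate

# 1,200 quitters out of 2,000 participants: 60% clears the 50% threshold
print(indicator_met(1200, 2000))  # True
```

Writing the indicator as code forces the ambiguity (count OR rate? measured when?) to be resolved up front, which is exactly what makes an indicator “actionable.”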
    16. INDICATORS Making Connections: The National Survey Indicators Database, Annie E. Casey Foundation, http://tarc.aecf.org/initiatives/mc/mcid/index.php  FREE download, easy to adapt  Offers 150 indicators in 8 areas of impact, including: ►Alliances, Advocacy, & Collective Action ►Economic Opportunity ►Child and Family well-being
    17. INDICATORS National Neighborhood Indicators Partnership: Urban Institute, 2008-2012, http://neighborhoodindicators.org  Community databases of indicators, handbooks, curricula & technical assistance  Local Partnerships (The Boston Foundation)  Data compiled from gov’t, census, research, & survey sources
    18. COLLECTING DATA ► Current program records ► Cost/Time restraints ► Follow-up data collection ► Utilize Existing Databases:  EXAMPLE: Harvard Family Research Project OST Evaluation Database: http://gseweb.harvard.edu/hfrp/projects/after
    19. DATA COLLECTION TIPS ►Don’t ask for unnecessary data ►Consider the time your survey will take ►Format survey to meet your needs ►Consider the language you are using ►Be sensitive to culture, assumptions, bias, and loaded questions. Citation: Innovation Network, http://www.innonet.org/client_docs/File/Survey_Dev_Tips.pdf
    20. ONLINE SURVEY TOOLS Survey Monkey, www.surveymonkey.com  Free version may be useful for small, informal surveys – 10 questions, 100 responses;  Select version $16.99/mo, or $199/yr – unlimited questions & responses, customization, ability to export Excel & PDF files. Other Options: Zoomerang.com; SurveyGizmo.com; PollDaddy.com; ConstantContact.com; FormSite.com; Moodle.com
    21. SOFTWARE OPTIONS ► Community Tech Knowledge: http://www.communitytech.net ► Vista Share: http://www.vistashare.com ► Design Data: http://www.ddco.com; http://www.outcomeresults.com ► City Span (Provider): http://www.cityspan.com ► Social Solutions: http://www.socialsolutions.com ► Athena Case Management (Penelope): http://www.athenasoftware.net
    22. PILOTING ►Impact is a long term goal; therefore, measure outcomes more frequently ►How do joint efforts lead to solutions? ►Measure for contribution (part), not attribution (credit) ►Harmonize and Collaborate
    23. OUTCOMES & COLLABORATION Strive Partnership, Cincinnati – Coalition for Cradle to Career Success ► Connect & Coordinate the various partners – non-profits, schools, & other support organizations ► Identify key indicators across the continuum; continuously make progress toward improvement ► Transform education by using data to determine which strategies yield the best results and driving resources toward those strategies
    24. OUTCOMES & COLLABORATION YouTube Video: http://www.youtube.com/watch?v=FLqc_9VxfCE Jeff Edmondson, Executive Director of Strive Partnership. Citation: Strive Together Partnership, http://www.strivetogether.org
    25. ANALYZE & REPORT ►State & Federal Policy Makers ►Foundation Leadership ►Grantees & Ultimate Beneficiaries ►Private Donors ►Practitioners
    26. EVALUATION DASHBOARDS Example: iDashboards: http://examples2.idashboards.com/idashboards/?guestuser=wpgv1
    27. CDBG PUBLIC SERVICES EVALUATION DASHBOARD, FY 06-07 - FY 10-11. (Paired charts: Beneficiaries, axis 0 to 14,000, and Leveraging / Expenditures, axis $0 to $1,800,000, across FY 2006-07, FY 2008-09, and FY 2010-11.)
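The dashboard slide pairs beneficiary counts with expenditure totals per fiscal year. A minimal sketch of the roll-up behind such a chart; the record layout and all figures here are made up for illustration, not the actual Quincy numbers:

```python
from collections import defaultdict

# Hypothetical per-program records: (fiscal_year, beneficiaries, expenditures)
records = [
    ("FY 2006-07", 4000, 400_000),
    ("FY 2006-07", 3000, 250_000),
    ("FY 2010-11", 6500, 900_000),
]

def dashboard(rows):
    """Roll program-level records up to the per-year totals a dashboard plots."""
    totals = defaultdict(lambda: {"beneficiaries": 0, "expenditures": 0})
    for year, people, dollars in rows:
        totals[year]["beneficiaries"] += people
        totals[year]["expenditures"] += dollars
    return dict(totals)

summary = dashboard(records)
print(summary["FY 2006-07"])  # {'beneficiaries': 7000, 'expenditures': 650000}
```

Like a car's dashboard, the point is a small number of aggregated gauges, not the raw program records themselves.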
    28. RECOMMENDED FOLLOW-UP Evaluation Plan Workbook, Innovation Network, http://www.innonet.org/client_docs/File/evaluat
    29. FACT “(With an outcomes-based measurement system in place) we’ve been able to learn very quickly what is working and where our participants need more help.” Marcele Carneiro Gama Viana, 10,000 Women Project Manager, Brazilian University FDC. Measurement as Learning, Bridgespan, 2011
    30. QUESTIONS & COMMENTS
    31. REFERENCES ► Actionable Measurement Guidelines, Bill & Melinda Gates Foundation 2010, http://www.gatesfoundation.org/learning/Documents/ ► Basic Guide to Outcomes-Based Evaluation for Nonprofit Organizations with Very Limited Resources, Authenticity Consulting, LLC., http://managementhelp.org/evaluation/outcomes-evalu ► Community Development Evaluation Storymap and Legend, NeighborWorks America 2006, http://www.nw.org/network/training/documents/Com
    32. REFERENCES ► Evaluation Dashboards: Practical Solutions for Reporting Results, Innovation Network 2008, http://www.innonet.org/resources/files/AEA_Dashboa ► Key Steps in Outcome Management, Series on Outcome Management for Nonprofit Organizations, The Urban Institute 2003, http://www.urban.org/publications/310776.html ► Low-Cost Online Survey Tools Keep Nonprofits on Budget, MassNonProfit 2011, http://www.massnonprofit.org/expert.php?artid=2593&catid=67
    33. REFERENCES ► Measurement as Learning, Bridgespan 2011, http://www.bridgespan.org/measurement-as-lea ► Program Planning & Development – Program Logic Model, University of Missouri Extension, http://extension.missouri.edu/staff/programdev/plm ► Software for Nonprofit Evaluation and Case Management, Innovation Network 2010, http://www.innonet.org/resources/files/2010-01
