INET Results-Based Accountability Workshop: May 2, 2014

Speaker notes
  • Navicate is one of 16 nonprofits in VT to go through the initial RBA training institute. In 2014, state government is looking to adopt a bill to increase accountability and improve decision making.
  • What you choose to collect as data is how you decide to tell your story.
  • Population: well-being for whole populations in a geographic area. These measures are taken directly from the Vermont Population Outcomes (RBA bill, S.293). Performance: well-being for customer populations. This is the piece of the population impact that YOUR ORGANIZATION can be held responsible for.
  • So: THIS IS WHERE WE STARTED. BLUE SKY: What's good for Vermont? WHAT IS THE OVERALL GOAL (without thinking about what your organization controls)? This was a great introduction to RBA for our staff and board.
  • We then shifted to what NAVICATE'S IMPACT was on these OVERALL GOALS. We asked each staff member to answer these for their programs; this is the work we are finishing now. The next step would be to ask these questions by stakeholder group: educators, employers, parents, partners... You do this for EACH customer group (students, businesses, schools, funders, other agencies...).
  • RBA gives you a common language to use when you're talking data. Result or Outcome: a condition of well-being for children, adults, families, or communities. Indicator or Benchmark: a measure which helps quantify the achievement of a result. Performance Measure: a measure of how well a program, agency, or service system is working. Ultimately you're answering 3 questions for each program: How much did we do? How well did we do it? And is anyone better off? THIS IS THE ACTIVITY WE ARE GOING TO DO LATER.
  • We tried doing this for the whole organization, but didn't learn anything; we needed the depth. We compared across programs, starting with College Connections because they asked us to start with a single program. As time and interest grew, we moved on to the other programs. We were encouraged to use the data we were already collecting through the 27 evaluation questions we ask students to report through a pre/post survey. We lined up the questions to the programs. To go further, we worked out the actual percentage of gain (a sketch of that calculation follows these notes).
  • GAINS CHART
  • 1. Our Data & Eval Coordinator interviewed staff and board with the 7 population questions, part of educating staff and board on WHY they want to collect this data: what part can we be held accountable for? 2. We then were individually interviewed on the 7 performance questions (focused on each program). 3. We began working on our RBA performance grids and data collection tables by program. 4. We began sorting through what data we collect versus what data we SHOULD be collecting. What is productive vs. impactful? Is this the most important problem we can be solving?
  • PRINT AS HANDOUT –
  • Don’t get hung up over which category things go in – just get it on the paper and then discuss how it should be reported or which of the questions it answers.
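The "percentage of gain" mentioned above is simple arithmetic on the pre/post survey: the change in the share of students agreeing with each question. As a rough sketch only (the question text, data layout, and function names below are illustrative assumptions, not Navicate's actual 27-question instrument or code), it could be computed like this:

    # Hypothetical sketch: percentage-point gain on pre/post survey questions,
    # assuming each response is coded 1 (agree) or 0 (disagree).

    def percent_agree(responses):
        """Share of respondents agreeing, as a percentage."""
        return 100.0 * sum(responses) / len(responses)

    def gain_by_question(pre, post):
        """Percentage-point gain from pre- to post-survey for each question."""
        return {q: percent_agree(post[q]) - percent_agree(pre[q]) for q in pre}

    if __name__ == "__main__":
        # Made-up example data, not real survey results.
        pre = {"I know which college resources can help me": [1, 0, 0, 1, 0]}
        post = {"I know which college resources can help me": [1, 1, 1, 1, 0]}
        for question, gain in gain_by_question(pre, post).items():
            print(f"{question}: {gain:+.0f} percentage points")

Run against a full pre/post export, the same loop would produce one gain figure per question, which is the kind of number a gains chart summarizes.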
  • Transcript

    • 1. Results-Based Accountability: Using data to change policy and program practice. Joseph McLaughlin, Research and Evaluation Director; Lindsey Lathrop, Assistant Director
    • 2. Success Boston Initiative: A case study on measuring systemic and organizational impact
    • 3. What is Success Boston? • Success Boston is a cross-sector collaboration between The Boston Foundation, City of Boston, Boston Public Schools (BPS), nonprofit organizations, and higher education institutions (38 schools) • The initiative's work is organized along a "Getting Ready, Getting In, and Getting Through" framework • The common goal is to increase the number of BPS graduates who earn 2- or 4-year degrees, particularly among low-income students of color
    • 4. Graduation Rate Outcomes and Goals • A 2008 study, "Getting to the Finish Line," revealed that only 35% of BPS college enrollees from the Class of 2000 had graduated by 2007, seven years after high school. • The Success Boston initiative launched with two specific goals: – Increase the graduation rate for the Class of 2009 to 52% (150% of the Class of 2000 rate) – Double the graduation rate for the Class of 2011 (200% of the Class of 2000 rate) • A transition coaching program was piloted with the Class of 2009
    • 5. Performance Measures (National Student Clearinghouse ‘NSC’ is primary data source) • Enrollment rates • Remedial course-taking rates • Persistence rates • Graduation rates • College coaching impact measures (quasi-experimental studies) *All measures are disaggregated for demographic and schooling subgroups
    • 6. Cumulative College Enrollment Rates, Seven Years after Graduation (NSC data): Class of 2000: 67.2%; Class of 2003: 73.8%; Class of 2005: 77.0%. [chart]
    • 7. Six-Year Graduation Rates of College Enrollees Who Enrolled in College in the First Year after Graduation: Class of 2000: 40.6%; Class of 2003: 46.5%; Class of 2005: 47.4%; Class of 2006: 49.2%. [chart]
    • 8. One-Year College Persistence Rates, All and by Type of College Attended (in percent)
      Graduating Class           (A) All Colleges   (B) Two-Year College   (C) Four-Year College
      2001                       78.1               55.4                   86.5
      2003                       81.6               61.0                   87.5
      2007                       81.3               61.3                   87.3
      2008                       82.5               66.5                   87.5
      Percentage Point Change    +4.4               +11.1                  +1.0
    • 9. One- and Two-Year Persistence Rates of Class of 2009 BPS Graduates at Seven Selected Colleges, by Success Boston Coaching Participation (SB = enrolled in Success Boston coaching)
      Group      SB 1-Yr   SB 2-Yr   Non-SB 1-Yr   Non-SB 2-Yr   Diff 1-Yr   Diff 2-Yr   N
      All        86.4      73.4      66.0          49.9          +20.4       +23.5       711
      Male       83.3      66.7      62.5          46.6          +20.9       +20.1       343
      Female     88.1      77.1      70.0          53.6          +18.1       +23.5       368
      Black      90.3      77.4      64.8          49.6          +25.6       +27.8       306
      Hispanic   82.6      66.3      58.6          42.7          +24.0       +23.6       243
    • 10. Estimated Percentage Point Impacts of Participation in Success Boston Programs on the One-Year and Two-Year Persistence Rates of BPS Graduates from the Class of 2009, All and by Gender
      Group      1-Yr Rate   Sig. Level   2-Yr Rate   Sig. Level   3-Yr Rate   Sig. Level
      All        16.7        .01          15.6        .01          16.3        .01
      Men        18.6        .01          16.3        .05          11.1        .10
      Women      14.8        .01          15.9        .01          18.8        .01
      Black      21.9        .01          18.9        .01          22.0        .05
      Hispanic   17.0        .01          13.7        .05          15.4        .05
    • 11. Work with what you have.
    • 12. Since 2011, 700+ people have been trained in RBA in Vermont.
    • 13. What is Results-Based Accountability? A framework for: • evaluation, • planning, and • collaboration ...that is simple, common sense and useful. RBA uses data to: • inform decisions, • challenge assumptions, and • explain actions to stakeholders.
    • 14. From Ends to Means; From Talk to Action. Result or Outcome: What do we want? Indicator or Benchmark: How will we recognize it? What will it take to get there? Performance Measure: How will we know we made an impact? (Population sits on the ends side; performance/program sits on the means side.)
    • 15. RBA's 2 Levels of Accountability. Population (all of Vermont): • Vermont has a prosperous economy. • Vermonters are healthy. • Vermont's children and young people achieve their potential. (E) Youth successfully transition to adulthood. Performance (what we can be held to): • % of youth with positive skills acquired and changed behavior • % of students who know about and can access community resources related to careers and college • % of students who are prepared for employment or post-secondary education after completing a Navicate program
    • 16. 7 Qs for Population Accountability 1. What are the quality of life conditions we want for the children, adults, and families who live in our community? 2. What would these conditions look like if we could see them? 3. How can we measure these conditions? 4. How are we doing on the most important of these measures? 5. Who are the partners that have a role to play in doing better? 6. What works to do better, including no-cost and low-cost ideas? 7. What do we propose to do?
    • 17. 7 Qs for Performance Accountability 1. Who are our customers? 2. How can we measure if our customers are better off? 3. How can we measure if we are delivering services well? 4. How are we doing on the most important of these measures? 5. Who are the partners that have a role to play in doing better? 6. What works to do better, including no-cost and low-cost ideas? 7. What do we propose to do?
    • 18. Navicate Performance Examples. Result or Outcome: • All Vermont youth are engaged, inspired, and equipped to achieve their career and educational aspirations. • All members of the Vermont community value and nurture youth in the process, thus assuring the educational, economic, and social vitality of the whole community. Indicator or Benchmark, for our students: rate of high school graduation, percent pursuing post-secondary education, teen unemployment rate, college persistence rate, percent ready for careers. By program, Performance Measures answer three questions: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off?
    • 19. Program Performance Measures: % of youth with positive skills acquired and changed behavior; % of students who know about and can access community resources related to careers and college; % of students who are prepared for employment or post-secondary education after completing a Navicate program
    • 20. Is anyone better off? = Why we do our work. BEFORE College Connections: most responses fall toward "Disagree." [survey chart]
    • 21. AFTER: Many more agree! [survey chart]
    • 22. THE LINKAGE between POPULATION and PERFORMANCE: a contribution relationship, appropriate responsibility, and alignment of measures.
      POPULATION ACCOUNTABILITY (population results): Youth are aware of and can find jobs (rate of teen employment, rate of internships). Businesses are engaged in work-based learning (gain in # of employers offering WBL opportunities to youth). Youth are pursuing post-secondary education (percent of students pursuing post-secondary education, rate of second-year success).
      PERFORMANCE ACCOUNTABILITY (customer results, High School Internship Program): # of students engaged; % of students successfully completing internships; #/% of students who know which school & community resources can help them.
    • 23. Comparing your performance 1. To ourselves -- can we do better than our own history? 2. To others -- when it is a fair apples to apples comparison 3. To standards -- when we know what good performance is
    • 24. Overview: Population and Performance Accountability. Ends: All Youth in Vermont. WHOLE POPULATION OUTCOME: Youth successfully transition to adulthood. INDICATORS: Students have exposure and access to real-life and life-long learning. Means: People involved in Navicate. PERFORMANCE MEASURES: How much did we do? How well did we do it? Is anyone better off? (i.e., dual enrollment, internships, employers involved, changed student skills/behaviors, VT youth engaged in school and learning about themselves through careers and college). Navicate's contribution links the means to the ends.
    • 25. RBA in a nutshell: 2 levels of accountability, 3 performance measures, 7 accountability questions, from ends to means
    • 26. Navicate's main lesson: Work with the data you ALREADY have. Start NOW!
    • 27. The RBA grid crosses QUANTITY vs. QUALITY with EFFORT vs. EFFECT (a rough sketch of this grid as a simple data structure follows the transcript).
      How much did we do? (quantity of effort): # of customers served (by customer characteristic); # of activities (by type of activity)
      How well did we do it? (quality of effort): common measures (%): workload ratio, staff turnover rate, staff morale, percent of staff fully trained, worker safety, unit cost, customer satisfaction; activity-specific measures (%): percent of actions timely and correct, percent of customers completing the activity, percent of actions meeting standards
      Is anyone better off? (effect, # and %): skills and knowledge, attitude/opinion, behavior, circumstance
    • 28. RBA Grid TIPS: Internship Program, 1 yr. of data
      Effort, Quantity (How much service did we deliver?): 183 high school students served; 204 employers engaged in internships, tours, job-shadows, career presentations, etc.; 4 high schools; 1 alternative program provider; 143 internships completed
      Effort, Quality (How well did we do it?): 87% successfully complete internships (71 of 82 in 2013); 5% increase in out-of-school youth served; 8% increase in internships completed (132 in 2012, 143 in 2013); 19% increase in employers (100 in 2012, 119 in 2013); 6% did not complete an internship (9 of 143 in 2013)
    • 29. Activity Time! In your team, 1. Decide on one program to focus on 2. Answer the three RBA questions: - How much do we do? - How well do we do it? - Is anyone better off? 3. Fill in 3 parts of the blank RBA grid for that program. Don’t get hung up over which category things go in – just get it on the paper and then discuss how it should be reported or which of the questions it answers.
    • 30. Database, grant language, marketing, performance review of organizational health, strategic planning, board development, staff professional development: a common language on how to talk about our impact!
    • 31. We even made an infographic.
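The blank RBA grid from the activity on slide 29 is, in effect, a small data structure: one program with its measures sorted under the three questions (quantity of effort, quality of effort, and effect). As a rough sketch under that reading only (the class name, field names, and sample entries below are hypothetical; the example measures are copied from the internship-program slide, not generated), it could be captured like this:

    # Hypothetical sketch of an RBA performance grid, not an official RBA tool.
    from dataclasses import dataclass, field

    @dataclass
    class RBAGrid:
        program: str
        how_much: list = field(default_factory=list)    # quantity of effort
        how_well: list = field(default_factory=list)    # quality of effort
        better_off: list = field(default_factory=list)  # effect (# and %)

        def report(self):
            """Print the grid grouped under the three RBA questions."""
            print(f"RBA grid: {self.program}")
            for question, measures in [
                ("How much did we do?", self.how_much),
                ("How well did we do it?", self.how_well),
                ("Is anyone better off?", self.better_off),
            ]:
                print(f"  {question}")
                for measure in measures:
                    print(f"    - {measure}")

    # Example entries lifted from the internship-program slide above.
    grid = RBAGrid(
        program="High School Internship Program",
        how_much=["183 high school students served", "143 internships completed"],
        how_well=["87% successfully completed internships (71 of 82 in 2013)"],
        better_off=["% of youth with positive skills acquired and changed behavior"],
    )
    grid.report()

Keeping the measures in one structure like this, rather than scattered across documents, supports the workshop's closing point: a common language for talking about impact across programs.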