2.12 Creating a Yardstick: Developing a Performance Measurement System

Speaker notes
  • Comes from surveys we have done in communities that are preparing for HEARTH. This may be different in your community, but we have certainly seen consistency. When we review these with senior leaders in communities – even Executive Directors, HMIS administrators and the like – some people nod their heads in agreement, while others are a bit shocked at times.
  • The key is to do this for the SYSTEM AS A WHOLE, as well as for the individual organizations within the system. There is no point in everyone going rogue and doing their own thing. Kim is going to talk more about thinking like a service system in the context of HEARTH and moving beyond just HUD-funded programs.
  • Lingle wrote Bullseye!
  • Too often organizations or communities have been collecting oodles of data without knowing why they are doing it or what it is going to tell them. I say articulate in one sentence what your organization intends to do and the change it seeks to create. If you have good objectives and good measures for those objectives, then you are actually testing whether the change that you want to see (say, ending chronic homelessness) is actually happening. Believe it or not, with enough cheerleading, data collection, analysis and use to improve performance can be motivating and – dare I say it – fun! I look for data to be used and performance measurement to occur top to bottom and bottom to top within an organization. I love seeing Boards use data to make decisions and staff wrestle with what the information is saying so that they drive improvement. If frontline staff don't see themselves reflected in the system, they simply won't use it, or they will lie, or a combination of both. Recent surveys we have done show that at least 1 in 5 organizations have had staff that have felt this way. Oh, and do clients see value in having their information collected and analyzed? A solid incremental approach is better than starting with a monstrous performance measurement design that fails under its own weight. Building over time and getting better over time is entirely acceptable – and encouraged.
  • " If you can't describe what you are doing as a process, you don't know what you are doing." 
- W. Edwards Deming 
"In God we trust, all others bring data."
- W. Edwards Deming
  • " An acre of performance is worth a whole world of promise."
- William Dean Howells
Transcript: 2.12 Creating a Yardstick: Developing a Performance Measurement System

    1. Performance Measurement: Some Handy Thoughts. Iain De Jong, OrgCode Consulting, Inc.
    2. Agenda
      • Intro/Definition of Performance Measurement
      • Ingredients for Successful Performance Measurement
      • When we know we need to focus on performance measurement
      • Approaches to performance measurement; moving from program to system level
      • Priority setting, coordinating performance measurement across the system; metrics to look at
      • Performance measurement and HEARTH, PM Tools
      • Q & A/Session Close
    3. To Get Started
      • If you want copies of the presentations, send an email (contact details for each presenter are at the end of this presentation).
      • At some point presentations will be on the NAEH website - www.naeh.org - as well as oodles of other cool resources.
      • The OrgCode website - www.orgcode.com - will also contain some of the presentations and other materials you may find useful.
    4. Interactive Options: @orgcode, [email_address]
    5. Intro/Definitions of Performance Measurement
    6. Definitions of Performance Measurement
      • Performance measurement is the process by which criteria are established to determine the quality of activities, based on organizational goals and using quantitative evidence.
      • A simple, effective system for determining whether an organization is meeting its objectives.
    7. What Performance Measurement is Not
      • It is NOT a waste of time.
      • It is NOT your HMIS. That is a data retention, management and report generating system – not a performance measurement system.
      • It is NOT something you consider once per year when doing a funding application.
      • It is NOT a collection of anecdotes.
      • It is NOT something you look at with passing interest and do nothing about.
    8. 6 Things That Performance Measurement Does (When Done Right)
      • Ensures that service and system requirements are met
      • Sets transparent, sensible objectives
      • Provides internal and external standards for comparison
      • Provides a "scoreboard" for people to monitor their performance level
      • Highlights problems that deserve priority attention
      • Provides feedback to drive improvement efforts
    9. 4 Purposes to Performance Measurement at the System Level
      • Identify chronic underperformers
      • Identify those organizations where, with coaching and extra attention, performance standards can be met
      • Ensure prudent financial stewardship
      • Acknowledge (reward?) strong performers
    10. From the Trenches
      • Between 1 in 5 and 1 in 3 organizations report that staff have felt pressure to lie about their data at least once in the past year
      • 3 out of 5 organizations that use an HMIS also use one or more other approaches to collecting information on service
      • Almost half of all staff report they need help analyzing data
    11. From the Trenches
      • Between 40-50% of frontline staff are uncomfortable with data and performance measurement
      • 3 of the top 5 ways in which organizations use data pertain to funding (applications, soliciting donations/fundraising, and meetings with funders), while use for performance measurement tends to track at less than 50%
    12. From the Trenches
      • 4 out of 5 organizations report that data is consistent with their organization's Mission, but less than 2 out of 5 report that the data adequately demonstrates the efforts that go into the work
      • 7 out of 10 service providers do not have a process for assessing acuity of clients to point them in the direction of the resources best able to end their homelessness
    13. From the Trenches
      • Less than half of service providers have logic models for each homeless or housing program
      • Of those that have a logic model, only two-thirds regularly update and measure against the logic model
    14. Ingredients for Successful Performance Measurement
    15. [Slide diagram: the "funnel" of homeless services – a non-homogeneous homeless population moves through intake & assessment, where acuity is determined, into intensive supports, mid-level support, or little if any support.]
    16. Driving Change Through Performance Measurement [slide graphic: 2009 to 2019]
    17. For Successful Operational Performance
      • The intake into the system should create service pathways for clients based upon their needs.
      • Prioritize those most in need of service first.
      • Support plans should be informed by information gleaned at intake – seamlessly.
      • Track how the acuity of clients changes over time.
      • Look beyond outputs to outcomes.
      • Have a logic model, follow it and update against it.
    18. Getting Started
      • Define your strategic objectives.
      • Define your strategic metrics/key performance indicators.
      • Set performance targets.
      • Establish a baseline.
      • Determine when you need to look at which metrics to determine performance against targets.
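As a minimal sketch of what these steps might look like once written down in a machine-checkable form, the snippet below records one strategic objective with its indicator, baseline, target and review frequency. All field names and figures here are hypothetical and not taken from the presentation.

```python
from dataclasses import dataclass

@dataclass
class StrategicObjective:
    """One strategic objective and the metric used to test it."""
    statement: str          # plain-language change the community seeks
    indicator: str          # key performance indicator tied to the objective
    baseline: float         # value when measurement begins
    target: float           # value the community commits to reaching
    review_frequency: str   # e.g. "quarterly"

    def on_track(self, current_value: float) -> bool:
        # Assumes "lower is better" (e.g. people counted as street homeless);
        # flip the comparison for indicators where higher is better.
        return current_value <= self.target

# Hypothetical example, drawn loosely from the outreach objective later in the deck.
objective = StrategicObjective(
    statement="Decrease street homelessness by 50% by the next Point in Time count",
    indicator="people counted as unsheltered at the PIT count",
    baseline=200,
    target=100,
    review_frequency="quarterly",
)

print(objective.statement, "- on track:", objective.on_track(130))
```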
    19. To Be Successful In Getting Started
      • See if people share the same perceptions and values across services.
      • Look at the system as a whole and the place of every organization within it.
      • Review the quality of your existing HMIS data and other data that may sit with other sources.
      • If you need to refresh/update your plan to end homelessness, this is a good time to do so.
    20. Want the Simplest Approach?
      • Make sure your data and your approach are meaningful, unambiguous and widely understood
      • Spend time working with all data collecting and measuring organizations so that they "own" and manage the process
      • Do the up-front work of ensuring there is a high level of data integrity
      • Embed data collection as part of "normal" duties instead of something people do in addition to "real" work
      • Take action based upon what is learned in measurement
      • Link it all to your strategic objectives
    21. A Good Performance Measurement Approach
      • Looks at the system as a whole as well as individual agents/performers
      • Focuses attention on what matters most, not everything that is possible to measure
      • Moves from data collection to the creation of a narrative based upon what is collected, when, by whom, for what purpose and with what degree of accuracy
      • Ensures measurement of the right things
      • Embraces positive deviants
      • Creates a common language for an organization
      • Is independently verifiable
    22. Performance Measurement Is Only As Good As the Data
      • Consistent definitions
      • Timely entry by staff
      • Non-duplication of entries (including aliases)
      • Agreement on data updates by program
      • Exits are duly noted
      • Information is complete
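As one illustration of checking the last three points (non-duplication, recorded exits, complete records), here is a hedged sketch run against a made-up export. The column names are invented and will differ from a real HMIS export.

```python
import csv
from collections import Counter
from io import StringIO

# Stand-in for an HMIS export; a real export has many more fields.
sample = StringIO("""client_id,name,entry_date,exit_date,destination
101,Jane D,2023-01-04,2023-03-01,Rental by client
101,Jane D,2023-01-04,2023-03-01,Rental by client
102,John S,2023-02-10,,
103,,2023-02-15,2023-04-20,Staying with family
""")

rows = list(csv.DictReader(sample))

# Non-duplication of entries: the same client_id appearing more than once
duplicates = [cid for cid, n in Counter(r["client_id"] for r in rows).items() if n > 1]

# Information is complete: no blanks in the fields every record should have
required = ("client_id", "name", "entry_date")
incomplete = [r["client_id"] for r in rows if any(not r[field] for field in required)]

# Exits are duly noted: enrollments still open with no exit date recorded
missing_exits = [r["client_id"] for r in rows if not r["exit_date"]]

print("Possible duplicate entries:", duplicates)
print("Incomplete records:", incomplete)
print("Enrollments with no exit recorded:", missing_exits)
```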
    23. "You get what you measure. Measure the wrong thing and you get the wrong behaviors."
      - John H. Lingle
    24. High Performing Organizations
      • Know exactly which individuals/families they have the expertise to serve and which programs to provide. They are NOT all things to all people.
      • Prioritize which individuals they serve. They are NOT first come, first served.
      • Internally monitor their performance and are dedicated to being reflective practitioners.
      • Possess a simple, transparent and operational logic model for each program that they run.
      • Have internal performance standards and operational standards that they measure against.
    25. Signs of System Level Performance Measurement Failure
      • Funders receive information and do not de-fund chronic underperformers
      • Failure to learn and improve (an excuse culture instead of a solution-focused culture)
      • Delay in reporting out results
      • Continuing to function as a collection of projects rather than as a system as a whole
      • Anecdotal tampering with findings
    26. Process
      • Define
      • Measure
      • Learn
      • Improve
    27. When We Know We Need to Focus on Performance Measurement
    28. Approaches to Performance Measurement
      • Moving from the Program to System Level
    29. "A zebra does not change its spots."
      - Al Gore
    30. Advice for Success
      • Begin the design of your performance measurement system with the end in mind.
      • Have solid strategic objectives and test whether the change that you think needs to happen is actually happening.
      • Create a data-loving culture.
      • Ensure the data collection and performance measurement have value.
      • Focus on getting better and better and better.
    31. Advice for Good Strategic Objectives
      1. Is it measurable or verifiable?
      2. Is it achievable or feasible?
      3. Is it flexible or adaptable?
      4. Is it consistent with the rest of your strategic plan?
      5. Does it stretch your people without breaking them?
      6. Is it clear, easy to understand, and inviting to achieve?
    32. Performance Measurement & Your Funding Programs
      • While meeting reporting expectations for funding, the key to success in thinking like a system is to find commonality across requirements and to structure community activities into sectors of service rather than funding programs.
    33. Sectors of Service
      • Approach 1: Prevention; Outreach; Emergency Shelter; Transitional Housing; Rapid Re-housing; Permanent Supportive Housing
      • Approach 2: Prevention; Outreach; Shelter; Drop-ins; Housing Supports
    34. Example of Performance Indicators by Program Type
      [Slide table: program types across the top (Prevention, Outreach, Emergency Shelter, Transitional Housing, Rapid Re-Housing, Permanent Supportive Housing) and indicators down the side (Number of Unique Individuals Served, Successful Housing Outcomes, Average Length of Stay, Recidivism, Successful Income Outcomes, Direct Client Assistance Utilization, Occupancy), with checkmarks showing which indicators apply to which program type.]
    35. Example of Sector of Service: Outreach
      • Strategic Objective: Decrease street homelessness by 50% or more by the next Point in Time Count by housing people, helping them access shelter and/or reuniting them with family/friends, focusing on those with highest acuity first.
      • Performance Indicators: # of unique individuals served; successful housing outputs; successful shelter outputs; successful family/friend reunifications; recidivism
      • Performance Targets: each outreach team to work with no more than 25 unique individuals per month; 4 successful housing outputs per outreach team per month (at least 2 higher acuity per month); 8 successful shelter outputs per outreach team per month (at least 4 higher acuity per month); 1 successful family/friend reunification per outreach team per month
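To show how targets like these could be checked each month, here is a small sketch that compares invented monthly results for two hypothetical outreach teams against the targets on the slide; only the target numbers come from the slide.

```python
# Monthly targets per outreach team, taken from the slide above.
TARGETS = {
    "housing_outputs": 4,
    "shelter_outputs": 8,
    "family_reunifications": 1,
}
MAX_CASELOAD = 25  # unique individuals per team per month

# Hypothetical monthly results for two teams.
teams = {
    "Team A": {"caseload": 22, "housing_outputs": 5, "shelter_outputs": 8, "family_reunifications": 1},
    "Team B": {"caseload": 27, "housing_outputs": 2, "shelter_outputs": 6, "family_reunifications": 0},
}

for team, results in teams.items():
    flags = []
    if results["caseload"] > MAX_CASELOAD:
        flags.append(f"caseload {results['caseload']} exceeds the maximum of {MAX_CASELOAD}")
    for metric, target in TARGETS.items():
        if results[metric] < target:
            flags.append(f"{metric} {results[metric]} below target {target}")
    status = "meets targets" if not flags else "; ".join(flags)
    print(f"{team}: {status}")
```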
    36. To Achieve Excellence in Sectors of Service
      • Create and publish standards or core service expectations for each sector of service.
      • Ensure that all Strategic Objectives are SMART!
      • Set targets that are achievable, not just an aspiration.
      • Define the service pathways and linkages between the sectors of service to allow for unambiguous service navigation.
    37. The Data Analysis Plan
      • Programs
      • Populations
      • Data Elements
      • Data Calculations
      • Frequency
      • Reporting Format
      • Report Audience
    38. Data Analysis Plan: Programs
      • By individual program
      • Across all programs (duplicate users across programs; individual program users; average number of users per program)
    39. Data Analysis Plan: Populations
      • Specific groups to potentially investigate (examples):
        - By gender
        - By age group(s)
        - By military service
        - By presence of a specific diagnosis or life issue
        - By level of acuity
        - By length of homelessness
        - By location within city
        - By indoor or outdoor
        - Etc.
    40. Data Analysis Plan: Data Elements
      • You can use HMIS measures
    41. Data Analysis Plan: Data Calculations
      • Write out exactly how conclusions are reached, especially when moving beyond descriptive statistics
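For example, writing out a calculation can mean pairing the definition with the code that produces the number. The sketch below uses invented stay records, and defines average length of stay as the mean number of nights between program entry and exit for completed stays.

```python
from datetime import date

# Invented stay records: (entry_date, exit_date); None means still enrolled.
stays = [
    (date(2023, 1, 4), date(2023, 3, 1)),
    (date(2023, 2, 10), date(2023, 2, 24)),
    (date(2023, 2, 15), None),  # open enrollment, excluded below
]

# Calculation, written out: only completed stays count; length of stay is the
# number of nights between entry and exit; the measure is the arithmetic mean.
completed = [(entry, exit_date) for entry, exit_date in stays if exit_date is not None]
nights = [(exit_date - entry).days for entry, exit_date in completed]
average_length_of_stay = sum(nights) / len(nights)

print(f"Completed stays: {len(completed)}")
print(f"Average length of stay: {average_length_of_stay:.1f} nights")
```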
    42. Data Analysis Plan: Frequency
      • Pre-determine which data will be analyzed to measure performance
      • Determine what the date ranges are for the analysis:
        - Intake date
        - Program enrollment date
        - Assessment dates
        - Exit date
    43. Data Analysis Plan: Reporting Format
      • Product Considerations: web update; paper report; Annual Report; report to the Board; report to funder
      • Style Considerations: tables; graphs; context; analysis; outcome impacts
    44. Data Analysis Plan: Report Audience
      • Board
      • Funder
      • General public
      • Staff
      • Supervisory Team
      • Senior Management Team
      • Executive Director
      • Partner agencies
      • 10 Year Plan Committee
      • Elected Officials
    45. Quality Assurance
      • "When dealing with numerical data, approximately right is better than precisely wrong."
        - Carl G. Thor
    46. Quarterly Report Considerations
      • Target(s) for the quarter
      • Outputs for the quarter
      • Target(s) year to date
      • Outputs year to date
      • Quarter over quarter
      • Year over year
      • Outcomes to date
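A minimal sketch of how these comparisons could be pulled together for one indicator; the quarterly figures, targets and prior-year numbers are invented for illustration.

```python
# Invented quarterly figures for one indicator (e.g. successful housing outcomes).
quarterly_targets = {"Q1": 30, "Q2": 30, "Q3": 35, "Q4": 35}
quarterly_outputs = {"Q1": 28, "Q2": 33, "Q3": 31}    # Q4 not yet reported
last_year_outputs = {"Q1": 25, "Q2": 27, "Q3": 26}    # same quarters last year

current = "Q3"
quarters_so_far = list(quarterly_outputs)             # ["Q1", "Q2", "Q3"]

# Year-to-date comparison of targets and outputs
ytd_target = sum(quarterly_targets[q] for q in quarters_so_far)
ytd_output = sum(quarterly_outputs[q] for q in quarters_so_far)

# Quarter-over-quarter and year-over-year change for the current quarter
previous = quarters_so_far[quarters_so_far.index(current) - 1]
qoq_change = quarterly_outputs[current] - quarterly_outputs[previous]
yoy_change = quarterly_outputs[current] - last_year_outputs[current]

print(f"{current}: target {quarterly_targets[current]}, output {quarterly_outputs[current]}")
print(f"Year to date: target {ytd_target}, output {ytd_output}")
print(f"Quarter over quarter change: {qoq_change:+d}")
print(f"Year over year change ({current}): {yoy_change:+d}")
```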
    47. Canaries in Coal Mines
      • http://popwatch.ew.com/2008/12/22/site-of-the-14//
    48. Considerations for Process Review at the System Level
      • Evidence of client prioritization
      • Evidence of seamless linking of clients to programs that meet their needs
      • Funding expensed by quarter
      • Percentage of sector target achieved by organization
      • Site visits
    49. Priority setting, coordinating performance measurement across the system; metrics to look at
    50. Performance Measurement & HEARTH, PM Tools
    51. Q & A
    52. Presenter contacts
      • Iain De Jong, OrgCode Consulting Inc., 416-432-0410, [email_address], @orgcode, www.orgcode.com
      • Susan McGee, Homeward Trust, 780-496-6035, [email_address], @susanmcgee, www.homewardtrust.ca
      • Kim Walker, Center for Capacity Building, National Alliance to End Homelessness, 202-942-8292, [email_address], www.naeh.org
      • Amanda Sternberg, Homeless Action Network Detroit, 313-964-3666 x4201, [email_address], www.handetroit.org
