Michigan Statewide HMIS: Performance Improvement Using Outcome & Indicator Measures
Published in: Business, Health & Medicine
Transcript of "Michigan Statewide HMIS: Performance Improvement Using Outcome & Indicator Measures"

  1. Michigan Statewide HMIS Performance Improvement Using Outcome & Indicator Measures
  2. About Michigan
       - Michigan Statewide HMIS
       - 2.5 years old
       - 57 CoCs, including Detroit
       - 370+ agencies with 1,322 programs
       - 3 state departments (DHS, DCH, DOE)
       - Over 120,000 unique clients
  3. Michigan Vertical Configuration
  4. Measurement: An Ongoing Collaboration
       - Funding organizations
       - Program leadership
       - Staff who collect information and enter data
  5. Selecting What to Measure
       - Mix of measures
         - Process indicators
           - Time to referral or placement in housing
           - Percentage returning for ongoing services
         - Short-term or intermediate objectives - measure critical processes
           - Positive housing placement at discharge
           - Improved income at discharge
           - Linked with needed supportive services
           - Percent re-admissions
         - Outcomes - measure sustained change
           - Stable housing
           - Stable employment
  6. Coordinating the Process of Measurement
       - How do we know what success looks like?
         - Client characteristics / who is served
         - When measurement occurs
         - Establishing baselines
         - Benchmarking / comparing performance
         - Improving performance
         - The environment / contextual variables
         - Evaluating outcomes
  7. Defining the Population Who Is Served
       - Developing shared screening questions. Data Standards plus:
         - Have they had a lease in their own name?
         - Were they homeless as children?
         - What is their housing history?
         - What is their education history?
         - What is their employment history?
         - What is their health history?
         - Are there disabilities of long duration?
         - Using a self-sufficiency score
  8. When Measurement Occurs
       - Intermediate outcomes / process indicators
         - Measured during care or at discharge.
         - Reflect stable and optimized program processes.
       - Examples
         - Length of time to critical referrals
         - % of staff turnover
         - # of re-admissions
         - Session compliance
         - Medication errors
         - Critical incidents
         - Customer satisfaction
       - Outcomes
         - Measured for some period of time after discharge.
         - Were changes achieved in care sustained?
         - Did we impact other long-term primary outcomes?
       - Examples
         - Positive housing placement
         - Improved income
         - Employment/school
  9. Baselines & Targets
       - Critical for evaluating change and optimizing performance
         - Realistic, honest targets
         - Stabilized processes
           - Percentiles are consistent across time
         - Systematic and routine measurement
           - Before,
           - During, and
           - After program change.
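The baseline-and-target idea on the slide above can be sketched in a few lines. The monthly rates here are hypothetical illustration data, not Michigan HMIS figures, and the 5-point stability band is an assumed threshold:

```python
# Sketch: deriving a baseline and a realistic target from routine monthly
# measurement. The monthly rates below are hypothetical illustration data.
monthly_rates = [0.68, 0.71, 0.70, 0.73, 0.69, 0.72]  # e.g., positive placement rate

# Baseline: the average performance over a stable measurement period.
baseline = sum(monthly_rates) / len(monthly_rates)

# Stability check: the process is treated as stable here if every month stays
# within 5 percentage points of the baseline.
stable = all(abs(r - baseline) <= 0.05 for r in monthly_rates)

# A realistic target nudges a stable baseline upward rather than being set arbitrarily.
target = baseline + 0.05
```

Measuring the same way before, during, and after a program change then lets the baseline show whether the change actually moved the rate.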
  10. Benchmarking
       - Working with “like” programs to:
         - Identify realistic targets
         - Identify performance that is off norm
         - Share ideas to improve
         - Address measurement problems
         - Build transparency
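One simple way to flag "off norm" performance within a peer group of like programs can be sketched as follows; the program names, rates, and the two-standard-deviation rule are all assumptions for illustration:

```python
import statistics

# Sketch: flagging off-norm performance among "like" programs.
# Program names and housing-placement rates are hypothetical.
peer_rates = {
    "Program A": 0.70, "Program B": 0.72, "Program C": 0.68, "Program D": 0.71,
    "Program E": 0.69, "Program F": 0.73, "Program G": 0.70, "Program H": 0.45,
}

mean = statistics.mean(peer_rates.values())
sd = statistics.stdev(peer_rates.values())

# A program more than two standard deviations from the peer mean is worth a
# benchmarking discussion: is it a measurement problem or a real outlier?
off_norm = [name for name, r in peer_rates.items() if abs(r - mean) > 2 * sd]
```

A flagged program is a conversation starter for the benchmarking group, not a verdict, since the gap may reflect a data-collection difference rather than performance.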
  11. Example Control Chart: School Attendance
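The chart image itself is not reproduced in this transcript, but the limits behind a control chart for a rate like school attendance can be sketched as a standard p-chart. The baseline rate and sample size below are assumptions, not values from the slide:

```python
import math

# Sketch of p-chart control limits for an attendance-style rate.
# p_bar and n are hypothetical, not taken from the slide's chart.
p_bar = 0.80   # center line: baseline share of education days met
n = 25         # available education days in each monthly sample

# Standard p-chart 3-sigma limits, clipped to the valid [0, 1] range.
sigma = math.sqrt(p_bar * (1 - p_bar) / n)
ucl = min(1.0, p_bar + 3 * sigma)
lcl = max(0.0, p_bar - 3 * sigma)

def out_of_control(rate):
    """A monthly rate outside the limits signals special-cause variation."""
    return rate < lcl or rate > ucl
```

Points inside the limits are ordinary month-to-month noise; a point outside them is the signal that something in the program or its context changed and deserves review.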
  12. Defining the Measure
       - Purpose: To measure the program's ability to support the educational plan of children in care.
       - A client is considered to have met their individual education plan for the day if they participated for 75% or more of the time they were scheduled to be involved in education activities.
       - Available education days reflect only days when it was possible for the client to participate in their individual education plan. For example, if the resident's educational plan is to attend school, only those days that the school was open during the reporting period are counted. Excused absences do not count as possible days: if a client was excused due to illness, a pass, or an appointment, those days are not considered available education days.
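Under the definition above, the daily determination and the resulting rate can be sketched like this. The day records are hypothetical, and tracking participation in hours is an assumed simplification:

```python
# Sketch of the attendance measure defined above. Day records are hypothetical;
# excused days (illness, pass, appointment) are excluded from available days.
days = [
    {"scheduled_hours": 6.0, "attended_hours": 6.0, "excused": False},
    {"scheduled_hours": 6.0, "attended_hours": 5.0, "excused": False},  # ~83%: met
    {"scheduled_hours": 6.0, "attended_hours": 3.0, "excused": False},  # 50%: not met
    {"scheduled_hours": 6.0, "attended_hours": 0.0, "excused": True},   # illness: excluded
]

# Available education days: only days the client could have participated.
available = [d for d in days if not d["excused"]]

# The plan is met for a day at 75% or more of scheduled participation.
met = [d for d in available if d["attended_hours"] / d["scheduled_hours"] >= 0.75]

rate = len(met) / len(available)  # days met / available education days
```

Excluding excused days from the denominator matters: counting the illness day here would understate the program's support of the education plan.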
  13. Creating Management Information
       - Cutting the performance rate to guide data-driven management decisions.
         - Example 1: Employment. The overall average is 50%, but for women the average is 65% and for men it is 30%. Do you revise services to men, or serve only women?
         - Example 2: Retention in housing. The overall average is 72%, but for clients with substance abuse problems it is only 15%. You conduct a study to determine the barriers for those clients.
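Cutting an overall rate by subgroup, as in Example 1, can be sketched with a handful of hypothetical client records (the six records below are illustration only and do not reproduce the slide's exact percentages):

```python
# Sketch: cutting an overall employment rate by subgroup to guide decisions.
# Client records are hypothetical; employed = 1 means employed at discharge.
clients = [
    {"gender": "F", "employed": 1}, {"gender": "F", "employed": 1},
    {"gender": "F", "employed": 0}, {"gender": "M", "employed": 1},
    {"gender": "M", "employed": 0}, {"gender": "M", "employed": 0},
]

def rate(rows):
    """Share of the given records employed at discharge."""
    return sum(r["employed"] for r in rows) / len(rows)

overall = rate(clients)
by_gender = {g: rate([c for c in clients if c["gender"] == g])
             for g in ("F", "M")}
```

The overall rate hides the gap; only the subgroup cut shows which clients the program is failing to serve.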
  14. The Context of Services
       - Lack of affordable housing
       - Vacancy rates
       - Unemployment rates
       - Lack of access to supporting services
         - Long waiting lists
         - No insurance or pay source
       - Barriers to success or community assets
  15. Supporting Performance Improvement
       - Support the development of initial measures (indicators, objectives, and outcomes) and targets through meetings with funders, program leadership, and staff.
       - Provide a menu of measures for program leadership to select from.
  16. Project Activities Continued
       - Provide reports on all selected measures that may be run monthly with the click of a button. The reports aggregate data that is viewable through the HMIS at the program, agency, CoC, and statewide levels.
       - Convene routine benchmarking meetings to discuss measurement issues as well as performance ideas.
  17. Project Activities Continued
       - Maintain, with local support, a database of contextual variables.
       - Share and problem-solve measurement issues. Support the evolution of measures.
       - Provide data for statewide analysis as defined in contracts.
  18. Contact Information
       - Barbara Ritter
       - 517-230-7146
       - [email_address]
       - MCAH web site: www.mihomeless.org