weems-Texas Medicaid Health Home Pilot Project Evaluation Methods
 


Leslie A. Weems, LMSW, Senior Policy Analyst and Project Manager, Medicaid/CHIP Division, Texas Health and Human Services Commission, discusses Texas Medicaid Health Home Pilot Project Evaluation Methods at the New Tactics for Building Medical Homes in State Medicaid and CHIP Programs webinar.

    Presentation Transcript

    • Project Goal
      Assess the impact of varied models of health homes in increasing access to, and the quality of, both primary and specialty care in a cost-effective manner that is also patient/family-centered, population-based, coordinated, clinically managed, team-based, and comprehensive.
    • Pilot Overview
      • up to 8 pilot projects, each of which may have multiple practice sites
      • assess the performance of each pilot over a 24-month period using a common evaluation methodology
      • includes primary care case management (PCCM), capitated managed care programs, and fee for service
      • >5,000 Medicaid beneficiary children per pilot project
    • Evaluation
      Methodology designed by Health and Human Services Commission (HHSC) consultant Bailit Health Purchasing, LLC
      • Domain I: adoption of health home practices
      • Domains II & III: patient access and patient experience
      • Domain IV: practice experience and satisfaction
      • Domain V: service utilization
      • Domain VI: clinical care quality
      • Domain VII: annual and trended PMPM cost
    • Domain I: Adoption of Health Home Practices
      • Medical Home Implementation Quotient (MHIQ) by TransforMed (excluding the Health Information Technology segment and the Practice Management segment)
      • self-report survey results
      • results reported at the beginning of the project and at 6, 12, 18, and 24 months
      • no comparison groups
    • Domains II & III: Patient Access and Experience
      • pre/post evaluation methodology
      • Consumer Assessment of Healthcare Providers and Systems (CAHPS) Clinician & Group Survey V 1.0, using a 4-point scale with some additional questions
      • children's parents or guardians will receive a survey at the beginning of implementation and again at month 24
      • pre- and post-survey respondents will not be matched
    • Domain IV: Practice Experience and Satisfaction
      • pre/post survey methodology
      • Harvard School of Public Health Survey of Practice Environment and Satisfaction in PCMH Pilots
      • administered by telephone at the beginning of the project and again at month 24
    • Domains V, VI, and VII: Service Utilization, Clinical Care Quality, and PMPM Cost
      • pre/post with comparison-group evaluation methodology
      • claims data for children identified as served by pilot providers, and claims data for a control group
      • compare changes in performance over time
      • pilot practices will also be tracked quarterly using a subset of the formal evaluation measures

      The primary analyses will be conducted within each pilot. Comparisons across pilots will be qualitative in nature, because the pilots are expected to vary in fundamental ways (e.g., geography, practice type) that make identifying the causes of differential impact difficult at best.
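
    The pre/post-with-comparison-group design for Domains V, VI, and VII amounts to a difference-in-differences comparison: the change observed in the pilot group minus the change observed in the control group over the same period. The sketch below illustrates the arithmetic only; the function name and every figure in it are hypothetical and do not come from the pilot evaluation.

    ```python
    def diff_in_diff(pilot_pre, pilot_post, comparison_pre, comparison_post):
        """Change in the pilot group minus change in the comparison group.

        Each argument is a mean outcome for that group and period, e.g.
        PMPM cost in dollars or ED visits per 1,000 member months.
        """
        pilot_change = pilot_post - pilot_pre
        comparison_change = comparison_post - comparison_pre
        return pilot_change - comparison_change

    # Hypothetical figures: PMPM cost falls $12 in the pilot practices but
    # only $4 in the comparison group over the same 24 months, so the
    # estimated effect attributable to the pilot is an $8 PMPM reduction.
    effect = diff_in_diff(pilot_pre=180.0, pilot_post=168.0,
                          comparison_pre=182.0, comparison_post=178.0)
    print(effect)  # -8.0
    ```

    Netting out the comparison group's change is what distinguishes this design from the simple pre/post surveys used in Domains II–IV: secular trends that affect pilot and control children alike (e.g., statewide rate changes) cancel out of the estimate.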