PRODUCTIVITY METRICS USED IN HOSPITAL FUNDING AGREEMENTS


  1. PRODUCTIVITY METRICS USED IN HOSPITAL FUNDING AGREEMENTS
     2006 ACE Annual Meeting, Las Vegas, Nevada
     Sunday, February 19, 2006
     Speaker: Anthony J. Trimarchi
  2. What is a Hospital Funding Agreement?
     - A business decision that requires hospital funding of a physician practice (or department) to advance organizational goals.
     - Examples include:
       - recruitment start-up funding
       - joint ventures
       - deficit funding of mission-critical programs
  3. Examples Used:
     - MEDICAL ONCOLOGY - 2002
       - projected clinical growth
       - perceived physician access issues
       - clinical integration
       - existing deficit
     - RADIATION ONCOLOGY - 2005
       - strategic planning
       - physician retention
       - equipment/capital needs assessment
       - program location
  4. Why Review Physician Productivity?
     - Most valuable resource
     - Accountability to the business enterprise
     - Recruitment decisions
     - Identification of "other" factors:
       - inefficiencies
       - staffing
  5. Physician "Buy-In"
     CLINICAL PRODUCTIVITY REVIEW COMMITTEE - composed of physician leadership, an administrator, and project staff:
     - review current productivity levels: individual and aggregate WRVUs (from the billing system)
     - determine national standards for similar practices as a benchmark
     - develop WRVU-based goals/targets that support financial and service needs
  6. Clinical Full-Time Equivalents
     HOW DO PHYSICIANS SPEND THEIR TIME?
     - Clinical
     - Teaching
     - Research
     - Administrative
     - Other
     IDENTIFY UNFUNDED/UNDERFUNDED WORK
     - Negotiate funding or change behavior
     CLINICAL FTEs
     - Interview physicians
     - "Drill down" clinical schedules
     - Based on time available for clinic, not funding sources
  7. CFTE Example:
                      Site 1   Site 2   Site 3    R&T    Admin   Total
     Physician A       0.57     0.00     0.00     0.33    0.00    0.90
     Physician B       0.51     0.00     0.30     0.19    0.00    1.00
     Physician C       0.10     0.00     0.00     0.90    0.00    1.00
     Physician D       0.32     0.45     0.00     0.13    0.10    1.00
     Physician E       0.38     0.00     0.02     0.20    0.40    1.00
     Physician F       0.40     0.00     0.20     0.40    0.00    1.00
     Physician G       0.00     0.82     0.00     0.18    0.00    1.00
     Physician H       0.40     0.10     0.00     0.00    0.00    0.50
     Physician I       0.86     0.10     0.00     0.04    0.00    1.00
     Totals            3.54     1.47     0.52     2.37    0.50    8.40
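
A minimal Python sketch of the CFTE "drill down" described on slides 6 and 7. The allocation figures are copied from the example table above; the function and variable names are illustrative, not part of the original deck. Clinical FTE is taken as the sum of the clinical-site allocations only, consistent with "time available for clinic, not funding sources."

    # CFTE "drill down": each physician's effort is split across clinical sites,
    # research/teaching (R&T), and administration; clinical FTE (CFTE) counts
    # only the clinical-site time. Figures are taken from the table above.
    allocations = {
        # physician: (site1, site2, site3, r_and_t, admin)
        "Physician A": (0.57, 0.00, 0.00, 0.33, 0.00),
        "Physician B": (0.51, 0.00, 0.30, 0.19, 0.00),
        "Physician C": (0.10, 0.00, 0.00, 0.90, 0.00),
        "Physician D": (0.32, 0.45, 0.00, 0.13, 0.10),
        "Physician E": (0.38, 0.00, 0.02, 0.20, 0.40),
        "Physician F": (0.40, 0.00, 0.20, 0.40, 0.00),
        "Physician G": (0.00, 0.82, 0.00, 0.18, 0.00),
        "Physician H": (0.40, 0.10, 0.00, 0.00, 0.00),
        "Physician I": (0.86, 0.10, 0.00, 0.04, 0.00),
    }

    def clinical_fte(site1, site2, site3, r_and_t, admin):
        """Time available for clinic, regardless of funding source."""
        return site1 + site2 + site3

    group_cfte = sum(clinical_fte(*a) for a in allocations.values())
    print(f"Group clinical FTE: {group_cfte:.2f}")   # 3.54 + 1.47 + 0.52 = 5.53
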
  8. Benchmark Selection
     In both cases, industry benchmarks (MGMA, UHC, SROA) were believed to fall short of the organization's needs:
     - MEDICAL ONCOLOGY: the review committee recognized the lack of comparable data as an early obstacle.
     - RADIATION ONCOLOGY: low survey participation meant the industry surveys did not provide a good enough "fit" with similar institutions.
  9. Solution
     - MEDICAL ONCOLOGY: hired a consultant to survey the National Comprehensive Cancer Network (NCCN) centers; conducted site visits to 2 comparable cancer centers to observe operational and staffing efficiencies.
     - RADIATION ONCOLOGY: designed a survey (modeled after the 2003 SROA tool) with participation from 4 similar organizations. Survey questions covered:
       - specialized services
       - staffing and operational structure
       - equipment
       - patient/treatment volumes
  10. Radiation Oncology Physician Productivity Benchmarks
      Median academic practice work relative value units (WRVUs) per clinical FTE physician:
        MGMA             8,239
        UHC              9,011
        UW Survey        8,089
        UW Performance   9,436
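
The metric compared above is work RVUs per clinical FTE physician. As a quick consistency check, the "UW Performance" figure of 9,436 matches the baseline-year numbers in the recruitment agreement a few slides later (33,404 WRVUs over 3.54 CFTE). The short sketch below simply restates that division; the link between the two slides is an inference, not something the deck states explicitly.

    # WRVUs per clinical FTE, using the baseline-year figures from the
    # recruitment agreement slide; reproduces the "UW Performance" value.
    total_wrvus = 33_404     # baseline-year work RVUs
    clinical_fte = 3.54      # baseline-year clinical FTE
    print(round(total_wrvus / clinical_fte))   # -> 9436
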
  11. Other Radiation Oncology Benchmark Examples
      - Medical physicists per accelerator
      - Number of patients per nursing FTE
      - Tx plans per dosimetrist
      - Beam treatments per therapist
      - Support staff per CFTE physician
      - Annual Tx requiring anesthesia
      - Annual seed implants
      - HDR Tx by sub-specialty
      - Levels of hospital support
  12. Radiation Oncology Results
      PHYSICIANS
      - Set the physician productivity benchmark at 8,089 WRVUs per CFTE.
      - Validated the need to recruit additional physicians.
      - Built benchmarks into the funding agreement.
      - Used benchmarks to determine future recruitment needs.
      OTHER
      - Validated that physicist, dosimetrist, nursing, and support staffing levels were acceptable at current productivity levels relative to comparable institutions.
      - Cited several facility/equipment issues that would need to be addressed.
  13. Example: Radiation Oncology Recruitment Agreement
                                          Base year       YR#1        YR#2        YR#3        YR#4
      Baseline information
        WRVUs                                33,404      38,415      40,335      42,352      44,470
        CFTE                                   3.54        4.54        4.54        4.54        4.54
        WRVUs/CFTE                            9,436       8,461       8,884       9,329       9,795
        Benchmark WRVUs/CFTE                  8,089       8,089       8,089       8,089       8,089
        Cash Collections                  2,172,888   2,498,821   2,623,762   2,754,950   2,892,698
        Cash/WRVU                             65.05       65.05       65.05       65.05       65.05
      Funds Flow
        Incremental cash collections              -     325,933     450,874     582,062     719,810
        Practice overhead                         -      65,187      90,175     116,412     143,962
        Clinical medicine fund                    -      28,682      39,677      51,221      63,343
        Department expenses                       -      45,631      63,122      81,489     100,773
        Faculty Compensation and Benefits         -     400,000     412,000     424,360     437,091
        Total Expenses                            -     539,499     604,974     673,483     745,169
        Clinical Salary Shortfall                 -  $(213,566)  $(154,100)   $(91,420)   $(25,360)
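
A sketch of the year-1 funds-flow arithmetic behind the table above. The overall structure (incremental collections less expenses equals the clinical salary shortfall) follows from the table itself, but the specific rates used below (roughly 20% practice overhead, 8.8% clinical medicine fund, and 14% department expenses, all applied to incremental cash, plus faculty compensation starting at $400,000) are inferred from the published figures rather than stated in the deck.

    # Year-1 funds flow for the radiation oncology recruitment agreement.
    # Percentage rates are inferred from the table, not stated in the deck.
    base_collections = 2_172_888      # baseline cash collections
    yr1_collections  = 2_498_821      # projected year-1 cash collections

    incremental_cash = yr1_collections - base_collections        # 325,933

    practice_overhead = 0.200 * incremental_cash                  # about 65,187
    clin_med_fund     = 0.088 * incremental_cash                  # about 28,682
    dept_expenses     = 0.140 * incremental_cash                  # about 45,631
    faculty_comp      = 400_000                                   # year-1 compensation and benefits

    total_expenses = practice_overhead + clin_med_fund + dept_expenses + faculty_comp
    shortfall = incremental_cash - total_expenses   # about -213,566, shown as $(213,566) above

    print(f"Incremental cash collections: {incremental_cash:,.0f}")
    print(f"Total expenses:               {total_expenses:,.0f}")
    print(f"Clinical salary shortfall:    {shortfall:,.0f}")
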
  14. Medical Oncology
      - 2002 MGMA Median: 2,854
      - 2002 NCCN Median: 3,251
      - UW Performance Level: 2,825
  15. Benchmark Selection Criteria
      - Recognition that a legitimate funding gap existed.
      - Benchmark performance would be expected in order to justify future support.
      - Would take into consideration the significant research and teaching effort of faculty.
      - Group vs. individual performance measure.
      - Benchmark would become a component of the incentive program.
  16. Individual Benchmarks Based on Percentage Research and Teaching (R&T)
      Adopted the NCCN median as the benchmark, and established goals that reflect research and teaching effort as follows:
      - LOW       35% R&T   goal: 2,167 WRVUs
      - MODERATE  45% R&T   goal: 1,799 WRVUs
      - HIGH      55% R&T   goal: 1,439 WRVUs
  17. Funding Agreement Principles
      - WRVU goal met = 100% shortfall funding:
        - salary metrics (AAMC)
        - department overhead rate cap
      - WRVU goal not met:
        - funding up to the level had the goal been met
        - imputed revenue based on historic collections per WRVU
        - remainder of shortfall shared: 2/3 hospital, 1/3 department
  18. Example: Medical Oncology Funding Agreement
      Medical Oncology Example                       GOAL MET              GOAL NOT MET
                                                  WRVU     Revenue       WRVU     Revenue
      Physician A                                 2,167   $112,684       2,167   $112,684
      Physician B                                 1,799   $ 91,749       1,350   $ 68,850
      Physician C                                 1,439   $ 83,462       1,200   $ 69,600
      Group Performance                           5,405   $287,895       4,717   $251,134
      Revenue per WRVU                                      $53.00                 $53.00
      Financial Shortfall (per income statement)         (300,000)              (336,761)
      Amount of Support had Goal Been Met                  300,000                300,000
      Remaining Shortfall                                        -               (36,761)
      Additional Hospital Support                                -                 24,507
      Department Share of Shortfall                              -                 12,254
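
The "goal not met" column follows the sharing rule from the previous slide: hospital support is capped at the level it would have been had the goal been met, revenue is imputed at historic collections per WRVU, and the remaining shortfall is split 2/3 hospital, 1/3 department. A minimal sketch using the group-level figures above; the variable names are illustrative only.

    # Shortfall sharing when the WRVU goal is not met (group-level figures).
    support_if_goal_met = 300_000    # hospital support had the goal been met
    revenue_if_goal_met = 287_895    # imputed revenue at goal-level WRVUs
    actual_revenue      = 251_134    # revenue at actual WRVUs

    # The income-statement shortfall grows by the revenue the group failed to generate.
    shortfall = support_if_goal_met + (revenue_if_goal_met - actual_revenue)   # 336,761
    remaining = shortfall - support_if_goal_met                                # 36,761

    hospital_share   = remaining * 2 / 3    # additional hospital support, about 24,507
    department_share = remaining * 1 / 3    # department share, about 12,254
    print(round(hospital_share), round(department_share))
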
  19. Medical Oncology Results
      - Validated assumptions regarding physician shortages and initiated recruitment activity.
      - Set baseline WRVU targets that took into consideration research productivity.
      - Utilized "aggregate" measures to establish financial support agreements.
      - Established systems to enhance internal data collection needed to track productivity accurately.
      - Regular reporting to faculty on performance.
      - Adopted several clinic operational improvement and staffing strategies.
  20. Lessons Learned
      - Physician "buy-in" essential
      - Industry benchmarks not perfect
      - Get creative with benchmarks:
        - incentive plans
        - staffing levels
        - funding agreements
        - other
      EASIER TO WORK WITH OUR HOSPITAL PARTNERS!
