
Transitioning from Traditional DW to Apache® Spark™ in Operating Room Predictive Modeling

The prevailing issue when working with Operating Room (OR) scheduling within a hospital setting is that it is difficult to schedule and predict available OR block times. This leads to empty, unused operating rooms and longer waiting times for patients awaiting their procedures. In this three-part session, Ayad Shammout and Denny Lee will show:

1) How we tried to solve this problem using traditional DW techniques
2) How we took advantage of the DW capabilities in Apache Spark and easily transitioned to Spark MLlib so we could more easily predict available OR block times, resulting in better OR utilization and shorter wait times for patients.
3) Some of the key learnings we had when migrating from DW to Spark.

Published in: Engineering


  1. Transitioning from Traditional DW to Spark in OR Predictive Modeling. Ayad Shammout and Denny Lee, October 21st, 2015
  2. About Ayad Shammout • Director of Business Intelligence, Beth Israel Deaconess Medical Center • Helped build Business Intelligence, highly available / disaster recovery infrastructure for BIDMC
  3. About Denny Lee • Technology Evangelist, Databricks • Former Sr. Director of Data Sciences Engineering, Concur • Helped bring Hadoop onto Windows and Azure
  4. We are Databricks, the company behind Spark • Founded by the creators of Apache Spark in 2013 • 75% share of Spark code contributed by Databricks in 2014 • Databricks built on top of Spark to make big data simple
  5. Why is Operating Room Scheduling Predictive Modeling Important?
  6. $15-$20 / minute for a basic surgical procedure • Time is an OR's most valuable resource • Lack of OR availability means loss of patients • OR efficiency differs depending on the OR staffing and allocation (8, 10, 13, or 16 h), not the workload (i.e. cases)
  7. “You are not going to get the elephant to shrink or change its size. You need to face the fact that the elephant is 8 OR tall and 11 hr wide” (Steven Shafer, MD)
  8. Operating Room: better utilization = better profit margins; reduced support and maintenance costs. Medical Staff: better utilization = better profit margins; better medical staff efficiencies = better outcomes. Patients: shorter wait times and fewer cancellations; better medical staff efficiencies = better outcomes
  9. Develop Predictive Model • Develop a predictive model that would identify available OR time 15 business days in advance • Allows us to confirm wait list cases two weeks in advance, instead of when the blocks normally release four days out
  10. Forecast OR Schedule • Case load 15 business days in advance • Book more cases weeks in advance to prevent under-utilization • Reduce staff overtime and idle time
  11. Background • Three surgical groups: (1) GYN, urology, general surgery, colorectal, surgical oncology; (2) eyes, plastics, ENT; (3) orthopedics, podiatry • Currently built using SQL Server Data Mining
  12. Using Traditional Data Warehousing Techniques
  13. Traditional Data Warehousing & Data Mining OR Predictive Model: Data Sources → OR DW (data inserts every 3 hours) → SSAS Data Mining (process mining model every 3 hours) → OR Prediction DB (prediction results) → OR Reports
  14. Original Design • Multiple data sources pushing data into SQL Server and SQL Server Analysis Services Data Mining • Hand-built 225 different DM modules (5 days, 15 business days ahead, 3 different groups) • Pipeline process had to run 225 times / day (3 pools x 75 modules)
  15. Regression Calculations: SSAS Data Mining vs. T-SQL code for the Intercept, R², Mean, Adjusted R², Coefficients, Standard Deviation, Variance, and Standard Error (the standard formulas are sketched after the transcript)
  16. Taking advantage of Spark’s DW Capabilities and MLlib
  17. OR Predictive Model in Spark: Data Sources → OR DW (data inserts every 3 hours) → OR Reports
  18. Demo: OR Block Scheduling. Extract history data and run linear regression with SGD with multiple variables (a code sketch follows the transcript)
  19.–25. (Demo slides; no transcript text)
  26. OR Schedule Report (example)
  27. Why the model is working • Can coordinate waitlist scheduling logistics with physicians and patients within two weeks of the surgery • Plan staff scheduling and resources so there are fewer last-minute staffing issues for nursing and anesthesia • Utilization metrics are showing us where we can maximize our elective surgical schedule and level demand
  28. Key Learnings when Migrating from Traditional DW to Spark
  29. Transitioning to the Cloud: Beth Israel Deaconess Medical Center is increasingly moving to cloud infrastructure services with the hopes of closing its data center when the hospital's lease is up in the next five years. CIO John Halamka says he's decommissioning HP and Dell servers as he moves more of his compute workloads to Amazon Web Services, where he's currently using 30 virtual machines to test and develop new applications. "It is no longer cost effective to deal with server hosting ourselves because our challenge isn't real estate, it's power and cooling," he says.
  30. Transitioning to the Cloud • Need time for engineers, analysts, and data scientists to learn how to build for the cloud • Build for security right from the start – process heavy, a lot of documentation, audits / reviews • Differentiating data engineers and engineers (REST APIs, services, elasticity, etc.)
  31. Transitioning to Spark • No more stored procedures or indexes • Good for Spark SQL, services design • Prototype, prototype, prototype • Leverage existing languages and skill sets • Leverage the MOOCs and other Spark training • Break down the silos of data engineers, engineers, data scientists, and analysts
  32. Transitioning DW to Spark • Understand partitioning, broadcast joins, and Parquet (see the sketch after the transcript) • Not all Hive functions are available in Spark (99% of the time that is okay) due to the Hive context • Don’t limit yourself to building star schemas / snowflake schemas • Expand outside of traditional DW: machine learning, streaming
  33. Thank you. For more information, please contact ayad.shammout@hotmail.com or denny@databricks.com
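For slide 15, here is a sketch of the standard ordinary-least-squares definitions of the statistics that both the SSAS mining model and the hand-written T-SQL had to produce. These are the textbook formulas, not the exact T-SQL used in the project; n is the number of rows and p the number of predictors.

```latex
\begin{align*}
\hat{\beta} &= (X^{\top}X)^{-1}X^{\top}y
  && \text{coefficients (intercept from the constant column of } X) \\
R^2 &= 1 - \frac{\sum_{i}(y_i - \hat{y}_i)^2}{\sum_{i}(y_i - \bar{y})^2}
  && \text{coefficient of determination} \\
\bar{R}^2 &= 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1}
  && \text{adjusted } R^2 \\
s^2 &= \frac{1}{n - 1}\sum_{i}(y_i - \bar{y})^2, \qquad s = \sqrt{s^2}
  && \text{variance and standard deviation of } y \\
\mathrm{SE} &= \sqrt{\frac{\sum_{i}(y_i - \hat{y}_i)^2}{n - p - 1}}
  && \text{standard error of the estimate}
\end{align*}
```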
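The slide-18 demo itself appears only as screenshots, so here is a minimal Scala sketch of the technique it names: loading historical OR block data and fitting a multivariate linear regression with SGD using Spark 1.x MLlib. The file path, column layout, and parameter values are hypothetical and not taken from the talk.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.{LabeledPoint, LinearRegressionWithSGD}

object ORBlockRegressionDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("or-block-regression"))

    // Hypothetical CSV: label (unused block minutes), then numeric features
    // such as day of week, surgical group code, and days ahead of surgery.
    val history = sc.textFile("/data/or_block_history.csv").map { line =>
      val cols = line.split(',').map(_.toDouble)
      LabeledPoint(cols.head, Vectors.dense(cols.tail))
    }.cache()

    // Multivariate linear regression trained with stochastic gradient descent.
    // 100 iterations and a 0.0001 step size are illustrative values only.
    val model = LinearRegressionWithSGD.train(history, 100, 0.0001)

    // Score the training data and report mean squared error as a sanity check.
    val labelsAndPreds = history.map(p => (p.label, model.predict(p.features)))
    val mse = labelsAndPreds.map { case (label, pred) => math.pow(label - pred, 2) }.mean()
    println(s"weights: ${model.weights}, training MSE: $mse")

    sc.stop()
  }
}
```

In a Databricks notebook the SparkContext is already provided, so only the parse / train / evaluate lines would be needed; SGD is also sensitive to feature scaling, so normalizing the features first usually helps convergence.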
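For the slide-32 points on partitioning, broadcast joins, and Parquet, here is a minimal Spark 1.5-era Scala sketch of what replaces DW indexes and indexed lookup joins. The paths, tables, and columns (or_cases, or_blocks, block_id, surgical_group) are hypothetical.

```scala
// Assumes a spark-shell or Databricks notebook where sc and sqlContext
// (Spark 1.5 era) are already defined.
import org.apache.spark.sql.functions.broadcast

// Instead of clustered indexes, write the large fact table as Parquet
// partitioned by a commonly filtered column so reads prune whole directories.
val cases = sqlContext.read.parquet("/dw/or_cases")
cases.write.partitionBy("surgical_group").parquet("/dw/or_cases_by_group")

// Instead of an indexed lookup join, broadcast the small dimension table so
// the join happens map-side without shuffling the fact table.
val blocks = sqlContext.read.parquet("/dw/or_blocks")
val utilization = sqlContext.read.parquet("/dw/or_cases_by_group")
  .join(broadcast(blocks), "block_id")

// The result is still queryable with plain SQL through the SQL / Hive context.
utilization.registerTempTable("or_utilization")
sqlContext.sql(
  "SELECT surgical_group, COUNT(*) AS case_count FROM or_utilization GROUP BY surgical_group"
).show()
```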
