
ACDC: Alpha-Carving Decision Chain for Risk Stratification

In many healthcare settings, intuitive decision rules for risk stratification can support effective hospital resource allocation. This paper introduces a novel variant of decision tree algorithms that produces a chain of decisions rather than a general tree. Our algorithm, α-Carving Decision Chain (ACDC), sequentially carves out "pure" subsets of the majority-class examples. The resulting chain of decision rules yields a pure subset of the minority-class examples. Our approach is particularly effective for exploring large, class-imbalanced health datasets. Moreover, ACDC provides an interactive interpretation in conjunction with visual performance metrics such as the Receiver Operating Characteristic (ROC) curve and the Lift chart.


  1. ACDC: Alpha-Carving Decision Chain for Risk Stratification. Yubin Park, Accordion Health, Inc.; Joyce Ho, Emory University; Joydeep Ghosh, The University of Texas at Austin. (ICML WHI 2016)
  2. What is a Decision Chain (DC)?
     • Also known as Rule Lists (Wang & Rudin, 2015)
     • A sequence of rules, applied one after another, where the ratio of the positive class increases over the sequence of rules
     • Toy example: a decision chain for predicting the likelihood of being a Longhorn fan (see the sketch below):
       • If Tom lives in Austin, TX → 25% chance of being a Longhorn fan
       • And Tom likes to watch football games → 50% chance
       • And Tom goes out for a tailgate every Saturday → 75% chance
       • And Tom wears burnt orange t-shirts all the time → 95% chance
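To make the toy example concrete, here is a minimal Python sketch of how a decision chain is applied at prediction time. The rules, attribute names, and the 5% base rate are all hypothetical, chosen only to mirror the Longhorn-fan example above; this is an illustration, not the paper's implementation.

```python
# Hypothetical decision chain for the Longhorn-fan toy example.
# Each entry is (rule, estimated positive rate if every rule so far fires).
chain = [
    (lambda p: p["city"] == "Austin, TX",   0.25),
    (lambda p: p["watches_football"],       0.50),
    (lambda p: p["tailgates_on_saturdays"], 0.75),
    (lambda p: p["wears_burnt_orange"],     0.95),
]

def chain_risk(person, chain, base_rate=0.05):
    """Apply the rules in order; stop at the first rule that does not fire."""
    risk = base_rate  # assumed population base rate (hypothetical)
    for rule, rate in chain:
        if not rule(person):
            break
        risk = rate
    return risk

tom = {"city": "Austin, TX", "watches_football": True,
       "tailgates_on_saturdays": False, "wears_burnt_orange": False}
print(chain_risk(tom, chain))  # -> 0.5 (first two rules fire, third does not)
```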
  3. Conceptually, something like this:
     [Figure: side-by-side diagrams contrasting a Decision Tree with a Decision Chain. Both apply checks A and B to split the data into subsets S0, S1, S2, but the chain only ever refines a single branch.]
  4. Is DC More Interpretable than DT?
     • In a Decision Chain (DC):
       • Risk is proportional to the number of rules applied
       • Less to memorize for filtering out the low-risk population (or samples)
       • More to memorize for capturing the high-risk population
       • Using a DC, one can implement an economically efficient business process based on job maturity level
     • In a Decision Tree (DT):
       • The number of rules is agnostic to risk
       • Low risk can be captured with one rule as well as with hundreds of rules
     • Thus, a DC may be helpful for some applications
  5. In Healthcare Applications
     • Class imbalance problems are prevalent
     • Majority-class examples can often be carved out (or filtered out) with a simple conjunction of if-else statements
     • An implementation strategy: filter out the majority class with rules → obtain a less imbalanced dataset → apply a fancy machine learning algorithm (see the sketch below)
     • One question: how many majority examples should be filtered out?
     • A possible solution: if we build a decision chain, we can streamline the grid search much more easily
     • A DC can be more interpretable as well as more efficient (sometimes)
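A hedged sketch of the filter-then-model strategy above, assuming simple threshold rules of the form `feature <= threshold`. The rules and the synthetic data are illustrative, not from the paper:

```python
import numpy as np

def filter_with_rules(X, y, rules):
    """Keep only the examples that satisfy every rule (col, thresh),
    i.e. carve out the majority-class bulk that fails some rule."""
    mask = np.ones(len(y), dtype=bool)
    for col, thresh in rules:
        mask &= X[:, col] <= thresh
    return X[mask], y[mask]

rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 4))
# Synthetic imbalanced labels: positives concentrate where feature 0 is low.
y = ((X[:, 0] < -1.5) & (rng.random(10000) < 0.6)).astype(int)

X_f, y_f = filter_with_rules(X, y, rules=[(0, -1.0)])
print(len(y), round(y.mean(), 3))      # full data: large, heavily imbalanced
print(len(y_f), round(y_f.mean(), 3))  # filtered: smaller, less imbalanced
# A "fancy" classifier can now be trained on (X_f, y_f).
```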
  6. The Question Is How
     • We use a greedy approach
       • Note that decision tree induction is also greedy:
       • Pick the splitting feature that maximizes {information gain, purity score, etc.}
       • Split the dataset into parts based on the value of the splitting feature
       • Repeat from the beginning for each partition
     • We grow a decision chain as follows (see the sketch below):
       • Pick the splitting feature that carves out the largest number of majority-class samples
       • Split the dataset into parts based on the value of the splitting feature
       • Repeat from the beginning on only the partition with more positive-class examples
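The following Python sketch implements the greedy loop above under simplifying assumptions: binary numeric splits of the form `x <= t`, the majority class coded as 0, and exhaustive threshold search. It is a minimal illustration, not the paper's exact ACDC procedure (which scores splits with the alpha-divergence, as the next slide describes).

```python
import numpy as np

def grow_chain(X, y, max_rules=5):
    """Greedily grow a decision chain (simplified sketch).

    At each step, try every (feature, threshold) split, keep the side with
    the higher positive rate, and pick the split that carves out (drops)
    the most majority-class (y == 0) examples. Then recurse on the kept side.
    """
    chain, idx = [], np.arange(len(y))
    for _ in range(max_rules):
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[idx, j])[:-1]:   # drop max value: no empty side
                le = idx[X[idx, j] <= t]
                gt = idx[X[idx, j] > t]
                keep_le = y[le].mean() >= y[gt].mean()
                keep, drop = (le, gt) if keep_le else (gt, le)
                carved = int((y[drop] == 0).sum())  # majority examples removed
                if best is None or carved > best[0]:
                    best = (carved, j, t, keep_le, keep)
        if best is None or best[0] == 0:
            break
        carved, j, t, keep_le, keep = best
        op = "<=" if keep_le else ">"
        chain.append((f"x[{j}] {op} {t}", float(y[keep].mean())))
        idx = keep
    return chain  # list of (rule, positive rate after applying the rule)
```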
  7. More Details on How
     • Selecting the best splitting feature: we use the Alpha-Divergence (a reference form is given below)
       • Alpha-Divergence equals the KL-Divergence when Alpha = 1
       • Alpha-Divergence recovers the Hellinger distance (up to scaling) when Alpha = 0.5
       • Alpha-Divergence can be many different things depending on the value of Alpha
     • We change the value of Alpha adaptively (with a simple strategy) to achieve our goal
     • More details are in the paper
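For reference, a standard parameterization of the alpha-divergence between discrete distributions p and q (Amari's form; the paper's normalization may differ slightly) is:

```latex
D_{\alpha}(p \,\|\, q) = \frac{1}{\alpha(1-\alpha)} \left( 1 - \sum_{i} p_i^{\alpha} \, q_i^{1-\alpha} \right)
```

In the limit α → 1 this recovers the KL divergence, and at α = 1/2 it is proportional to the squared Hellinger distance:

```latex
\lim_{\alpha \to 1} D_{\alpha}(p \,\|\, q) = \mathrm{KL}(p \,\|\, q),
\qquad
D_{1/2}(p \,\|\, q) = 4 \left( 1 - \sum_i \sqrt{p_i q_i} \right)
```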
  8. Effect of Different Alphas
     • High Alpha → pure partitions
     • Low Alpha → balanced partitions
     [Figure: histograms of nbp.systolic (count on the y-axis), colored by Shock (F/T), showing the carved partitions for α = 1, 16, 48, and 64]
  9. Experiments: Septic Shock
     • Alpha-Carving Decision Chain (ACDC) shows performance comparable to other decision tree algorithms
       • ATree(a=1): C4.5
       • ATree(a=2): CART
       • ATree(a=x): other alpha-trees
     [Figure: AUC (0.5 to 0.8) for ATree(a=1), ATree(a=2), ATree(a=4), ATree(a=16), ATree(a=64), ATree(a=128), and ACDC]
  10. Experiments: Septic Shock
      • Since ACDC is a decision chain, we can make this cool visualization
      • Put the decision rules and the performance metrics in a single chart (a sketch of the computation follows below)
      • Risk is proportional to the number of rules applied
      [Figure: Lift vs. Coverage chart annotated with the chain's rules: L1: nbp.systolic<=132, L2: nbp.systolic<=98.4, L3: min.nbp.systolic<=90.4, L4: nbp.diastolic<=46]
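The coverage and lift values in such a chart can be computed directly from the chain. A minimal sketch, assuming `masks[k]` marks the examples still captured after the first k+1 rules (a hypothetical helper, not the paper's code):

```python
import numpy as np

def coverage_and_lift(y, masks):
    """Per-prefix chart points for a decision chain.

    Coverage: fraction of all examples captured by the first k+1 rules.
    Lift: positive rate inside the capture divided by the overall rate.
    """
    base = y.mean()
    return [(float(m.mean()), float(y[m].mean() / base)) for m in masks]

# Tiny illustrative example: 8 patients, 2 positives, a 2-rule chain.
y = np.array([0, 0, 0, 0, 0, 0, 1, 1])
masks = [np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=bool),  # after rule 1
         np.array([0, 0, 0, 0, 0, 0, 1, 1], dtype=bool)]  # after rules 1-2
print(coverage_and_lift(y, masks))  # [(0.5, 2.0), (0.25, 4.0)]
```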
  11. Experiments: Cardiac Arrest
      • Another example: Asystole
      • Again, comparable to other decision trees
      [Figure: AUC (0.5 to 0.8) for ATree(a=1), ATree(a=2), ATree(a=4), ATree(a=16), ATree(a=64), ATree(a=128), and ACDC]
  12. Experiments: Cardiac Arrest
      [Figure, left panel: Lift vs. Coverage; right panel: ROC curve (TPR vs. FPR). Both panels are annotated with the chain's rules: L1: min.nbp.diastolic<=48.767, L2: min.nbp.systolic<=104, L3: min.spo2<=90, L4: spo2<=93.6, L5: min.spo2<=78, L6: avg.pp>57.495, L7: hr<=90.9]
  13. Experiments: Cardiac Arrest
      • You can also make a risk pyramid
      [Figure: risk pyramid for Asystole built from the decision chain, rising from baseline risk at the bottom to higher risk at the top: min.diastolic < 48 mmHg (1.3 times higher), min.systolic < 104 mmHg (1.5 times higher), min.spo2 < 90% (3.7x), spo2 < 93% (5.8x)]
  14. Contacts
      • Yubin Park: yubin [at] accordionhealth [dot] com
      • Joyce Ho: joyce [dot] c [dot] ho [at] emory [dot] edu
      • Joydeep Ghosh: jghosh [at] utexas [dot] edu
