Directive Explanations for Monitoring the Risk of Diabetes Onset:
Introducing Directive Data-Centric Explanations and Combinations to Support What-If Explorations
Aditya Bhattacharya
aditya.bhattacharya@kuleuven.be
@adib0073
Jeroen Ooge
jeroen.ooge@kuleuven.be
@JeroenOoge
Gregor Stiglic
gregor.stiglic@um.si
@GStiglic
Katrien Verbert
katrien.verbert@kuleuven.be
@katrien_v
Explainable Decision Support Systems in Healthcare
ML-based Decision Support Systems + XAI Methods = Explainable Decision Support Systems
Healthcare experts need an explainable interface for monitoring the risk of diabetes onset for patients, helping them understand the rationale behind the predicted risk of diabetes onset.
Use case: Monitoring the Risk of Type 2 Diabetes Onset
Visually Directive Explanation Dashboard
• Feature Importance Explanation
• Data-Centric Explanation
• Example-based Explanation
Explainable AI Methods
Feature Importance Explanations (Model-Centric Explanations)
• Feature importance explainability is a model-centric explanation method: it estimates the
importance of the features that have the most influence on the model's output or prediction.
• Examples of feature importance methods are permutation importance, partial dependence
plots, LIME-based feature importance, and Shapley value (SHAP)-based feature importance.
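One of these model-centric methods can be sketched in a few lines. Below is a minimal illustration of permutation importance with scikit-learn; the data and feature names are synthetic placeholders, not the study's dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Three standardized synthetic features; only the first two drive the label.
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Permutation importance: shuffle one feature at a time on held-out data and
# measure the drop in score; a larger drop means a more influential feature.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, imp in zip(["feature_1", "feature_2", "feature_3"], result.importances_mean):
    print(f"{name}: {imp:.3f}")
```

Because the third feature is pure noise here, its importance stays near zero while the first dominates.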
Data-Centric Explanations
• Data-centric explainability focuses on examining the data used to train the model rather than the model's internal
workings. The idea is that by analyzing the training data, we can gain insights into how the model makes its
predictions and identify potential biases or errors.
• Examples of data-centric explanation approaches include summarizing datasets with common statistical measures
such as the mean, mode, and variance; visualizing data distributions to compare a patient's feature values to those
across the remaining dataset; and observing changes in model predictions through what-if analysis to probe the
sensitivity of the features.
• Additionally, data-centric explanations create awareness about data quality by sharing insights into data issues,
such as data drift, skewed data, outliers, and correlated features, that can impact the overall performance of
ML models.
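As an illustration of the dataset-summary and what-if ingredients above, the sketch below summarizes a synthetic training set and probes a model's sensitivity for one hypothetical patient; feature names, the labelling rule, and all values are made up for the example:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "glucose": rng.normal(100, 15, 300),   # synthetic cohort, illustrative units
    "bmi": rng.normal(27, 4, 300),
})
# Made-up labelling rule purely for the example.
df["diabetes"] = (df["glucose"] + 2 * df["bmi"] > 155).astype(int)

# 1) Summarize the training data so an individual's values can be compared
#    against the rest of the cohort.
print(df[["glucose", "bmi"]].agg(["mean", "std", "min", "max"]))

model = LogisticRegression(max_iter=1000).fit(df[["glucose", "bmi"]], df["diabetes"])

# 2) What-if analysis: vary one feature of a hypothetical patient and observe
#    how the predicted risk responds, probing the model's sensitivity to it.
patient = pd.DataFrame({"glucose": [100.0], "bmi": [30.0]})
for new_bmi in [30.0, 27.0, 24.0]:
    risk = model.predict_proba(patient.assign(bmi=new_bmi))[0, 1]
    print(f"bmi={new_bmi}: predicted risk {risk:.2f}")
```

In an interactive dashboard, the loop would be replaced by a slider: the user drags a feature value and the predicted risk updates.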
Counterfactual Explanations (Example-based Explanations)
• Counterfactual explanations are example-based methods that provide the minimum
conditions required to obtain an alternate decision.
• Rather than explaining the inner workings of the model, counterfactuals can guide users to
obtain their desired predictions.
* Applied Machine Learning Explainability Techniques, A. Bhattacharya
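A counterfactual search can be illustrated with a deliberately naive brute-force sketch: perturb the actionable features by increasing amounts and keep the smallest change that flips the prediction. The data and model here are synthetic placeholders, and dedicated libraries (e.g. DiCE) implement this far more rigorously:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# Two standardized, actionable synthetic features; label = 1 ("onset risk")
# when their sum is positive.
X = rng.normal(size=(400, 2))
y = (X.sum(axis=1) > 0).astype(int)
model = LogisticRegression().fit(X, y)

x = np.array([[0.8, 0.4]])
assert model.predict(x)[0] == 1  # currently predicted as high risk

# Try increasingly large decreases of one or both features and keep the first
# (i.e. smallest) perturbation that flips the prediction to low risk.
best = None
for step in np.linspace(0.1, 3.0, 30):
    for delta in [(-step, 0.0), (0.0, -step), (-step, -step)]:
        candidate = x + np.array(delta)
        if model.predict(candidate)[0] == 0:
            best = candidate
            break
    if best is not None:
        break

print("counterfactual example:", best)
```

The resulting `best` is the kind of "if these values were lowered by this much, the predicted risk would flip" statement that can be phrased as a recommendation.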
Research Questions and User Study
Research Questions
RQ1. In what ways do patients and HCPs find our visually directive explanation
dashboard useful for monitoring and evaluating the risk of diabetes onset?
RQ2. In what ways do HCPs and patients perceive data-centric, model-centric, and
example-based visually directive explanations in terms of usefulness, understandability,
and trustworthiness in the context of healthcare?
RQ3. In what ways do visually directive explanations facilitate patients and HCPs to take
action for improving patient conditions?
Iterative User-Centric Design and Evaluation Process
• Low-fidelity prototype: Figma click-through prototype
o Qualitative study through 1:1 interviews with 11 healthcare experts
o Thematic analysis for evaluation
• High-fidelity prototype: interactive web application prototype
o Mixed-methods study through online questionnaires with 45 healthcare experts and 51 diabetes patients
o Evaluation through descriptive statistics, tests of proportion, and analysis of participant-reported Likert-scale questions
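The test of proportion mentioned above compares, for example, how often two participant groups rated an explanation positively. A generic two-sample z-test of proportions, with made-up counts rather than the study's data, can be written with the standard library alone:

```python
from math import sqrt
from statistics import NormalDist

def proportions_ztest(count1, n1, count2, n2):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = count1 / n1, count2 / n2
    pooled = (count1 + count2) / (n1 + n2)                # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error under H0
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 38/45 experts vs 30/51 patients rating an explanation as useful.
z, p = proportions_ztest(38, 45, 30, 51)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The same computation is available as `proportions_ztest` in statsmodels.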
Key Takeaways
Combining XAI methods to address different dimensions of explainability
* Applied Machine Learning Explainability Techniques, A. Bhattacharya
Tailoring Directive Explanations for Healthcare Experts
o Increasing actionability through interactive what-if analysis
o Explanations through actionable features instead of non-actionable features
o Color-coded visual indicators
o Data-centric directive explanations
* These design implications are aligned with the recommendations from Wang et al. [2019] -
Designing Theory-Driven User-Centric Explainable AI
Summarizing the contribution of this research
1. Combining XAI methods to address different dimensions of explainability
2. Visually directive data-centric explanations that provide local explanations with a global overview
3. The design of a directive explanation dashboard that combines different explanation methods, which we
further compared in terms of understandability, usefulness, actionability, and trustworthiness with
healthcare experts and patients
4. Design implications for tailoring visually directive explanations for healthcare experts
Thank you for your attention!
Directive Explanations for Monitoring the Risk of Diabetes Onset - ACM IUI 2023
Editor's Notes
1. Explainable artificial intelligence (XAI) is increasingly used in machine learning (ML)-based decision-making systems in healthcare. Existing XAI methods such as LIME, SHAP, and saliency maps are predominantly designed for ML experts, and little research has compared the utility of these different explanation methods in guiding healthcare experts, who may not have technical ML knowledge, in patient care. Additionally, current XAI methods provide explanations through complex, static visualizations that are difficult for healthcare experts to understand. These gaps highlight the necessity of analyzing and comparing explanation methods with healthcare professionals (HCPs) such as nurses and physicians. (1 min)
  3. Our research particularly focuses on providing an explainable interface for an ML-based system used for monitoring the risk of diabetes onset which could be used by healthcare experts such as nurses and physicians. To understand the real needs of our users in detail, we first conducted an exploratory focus group discussion with 4 nurses. Method – We first showed them SHAP based explanations for explaining the model predicted risk of diabetes onset We then conducted a codesign session with our participants to understand the key components of the explainable interface. Results: As a result of this study, we formulated the responses of our participants into the following User Requirements: Additionally, our user conveyed that visualizations for SHAP based explanations are complex and they need simpler visualizations to communicate with patients * Is it important to highlight about the tasks? (2 slides, 1.5 mins)
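The feature-importance explanations shown to the nurses can be sketched with scikit-learn's permutation importance, one of the feature-importance methods named earlier. The risk model, dataset, and feature names below are illustrative assumptions, not the study's actual model or data.

```python
# Sketch: feature-importance explanation for a hypothetical diabetes-risk model.
# Features and labels are synthetic; only the technique mirrors the dashboard.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(100, 15, n),   # fasting blood sugar (signal)
    rng.normal(27, 4, n),     # BMI (signal)
    rng.integers(20, 80, n),  # age (noise in this toy setup)
])
# Synthetic label: risk driven mainly by blood sugar and BMI
y = ((X[:, 0] - 100) / 15 + (X[:, 1] - 27) / 4 + rng.normal(0, 1, n) > 0).astype(int)

model = RandomForestClassifier(random_state=0).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

for name, imp in zip(["blood_sugar", "bmi", "age"], result.importances_mean):
    print(f"{name}: {imp:.3f}")
```

Ranking features this way yields the "important risk factors" view without inspecting the model's internals, which is why a model-agnostic method suits an explainable interface.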
  5. This research presents our Visually Directive Explanation Dashboard, developed through an iterative user-centric design process to satisfy our user requirements. We included model-agnostic local explanation methods to meet our explanation goals: feature-importance explanations (important risk factors), data-centric explanations (VC1, VC2, and VC5), and counterfactual explanations (recommendations to reduce risk). We further tailored the representation of these explanation methods. We mainly included interactive explanations that support what-if explorations instead of static representations: users can alter a selected feature value and observe the change in the predicted risk. We also emphasized actionable health variables over non-actionable ones, since users can alter actionable variables to reach a favorable outcome. We categorized the actionable features as patient measures, which capture patient vitals like blood sugar and BMI, and patient behaviors, which capture behavioral information from FINDRISC questionnaires. Our customizations also include information about the feasibility and impact of the counterfactuals presented as recommendations.
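The what-if interaction described above can be sketched as follows: alter one actionable feature of a patient record and compare the predicted risk before and after. The logistic-regression model, the `what_if` helper, and the two-feature layout are illustrative assumptions for the sketch, not the study's implementation.

```python
# Sketch of the what-if exploration: change one actionable feature value
# and observe the resulting change in predicted risk. Model is illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))  # columns: bmi (actionable), age (non-actionable)
y = (X[:, 0] + rng.normal(0, 0.5, 300) > 0).astype(int)  # risk driven by bmi
model = LogisticRegression().fit(X, y)

def what_if(patient, feature_idx, new_value):
    """Return predicted risk before and after changing one feature value."""
    before = model.predict_proba(patient.reshape(1, -1))[0, 1]
    altered = patient.copy()
    altered[feature_idx] = new_value
    after = model.predict_proba(altered.reshape(1, -1))[0, 1]
    return before, after

patient = np.array([1.5, 0.3])             # high (standardized) BMI
before, after = what_if(patient, 0, -1.0)  # what if BMI were lowered?
print(f"risk before: {before:.2f}, after: {after:.2f}")
```

Restricting `feature_idx` to actionable variables, as the dashboard does, keeps the interaction meaningful: only features a patient can change are worth exploring.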
  9. We wanted to address the following research questions with our Visually Directive Explanation Dashboard. In general, we wanted to analyze and compare the understandability, usefulness, actionability, and trustworthiness of the different explanation methods in our dashboard with HCPs, who are our primary users, and patients, who could be our potential users.
  10. We followed an iterative user-centric design process for the design and evaluation of our dashboard. We first designed a low-fidelity click-through prototype in Figma over multiple iterations; here you can see its final version. We conducted a qualitative user study through one-on-one interviews with 11 healthcare experts, analyzed the interview data using thematic analysis, and used the feedback to revise the design for our high-fidelity prototype. In particular, we improved the discoverability of our interactive visual explanation methods through tooltips and explicit visual indicators. Overall, the healthcare experts were positive about the dashboard's utility and suggested that patients could use it directly as a self-monitoring tool, so we included patients as participants in the next study. We then designed and developed our high-fidelity web-application prototype and conducted a mixed-methods study with 45 healthcare experts and 51 patients through online questionnaires. We evaluated the gathered data with descriptive statistics, tests of proportion, and analysis of participant-reported Likert-scale responses and their justifications. Finally, we addressed our research questions and summarized our findings from the collective feedback of the two user studies.
  12. We share design implications for tailoring the visual representation of directive explanations for healthcare experts, drawn from our observations and results. The modified design of visual component VC3 in our high-fidelity prototype enabled interactive what-if analysis: HCPs could change feature values and observe the change in the overall prediction. Hence, we recommend interactive design elements that allow what-if analysis when representing directive explanations for HCPs; this also supports hypothesis generation. In our approach, we included only actionable variables in the visual components that support what-if interactions, which aids the identification of coherent factors [57]. We anticipated that allowing HCPs to alter values of non-actionable variables could create confusion, especially for counterfactual explanations. HCPs indicated that the color-coded representations of risk factors were very useful for getting quick insights. Hence, we recommend color-coded representations and visual indicators to highlight factors that increase or decrease the predictor variable, which further facilitates the identification of coherent factors. HCPs also found our representation of data-centric explainability through the patient summary very informative: they could easily judge how good or bad the risk factors are for a specific patient, and the data-distribution charts gave them an overview of how other patients compare to that patient. Thus, our representation of data-centric explainability provides a local explanation with a global perspective. Furthermore, data-centric directive explanations support forward reasoning by providing access to source and situational data, and they can easily be combined with multiple explanation methods.
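The "local explanation with a global perspective" idea behind the data-distribution charts can be sketched by placing one patient's risk-factor value within the population distribution. The population sample, patient value, and percentile summary below are synthetic assumptions for illustration only.

```python
# Sketch of a data-centric explanation: situate one patient's risk factor
# within the whole patient population, as the data-distribution charts do.
import numpy as np

rng = np.random.default_rng(2)
population_bmi = rng.normal(27, 4, 1000)  # synthetic BMI values for all patients
patient_bmi = 33.0                        # the individual being explained

# Local value, global context: what share of the population lies below it?
percentile = (population_bmi < patient_bmi).mean() * 100
print(f"Patient BMI of {patient_bmi} exceeds {percentile:.0f}% of patients")
```

A single percentile per risk factor is enough to tell an HCP at a glance whether a patient's value is typical or extreme relative to the cohort, without exposing the model's internals.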
  16. This paper presents three primary research contributions: (1) visually directive data-centric explanations that provide local explanations of the predicted risk for individual patients together with a global overview of risk factors across the entire patient population; (2) the design of a directive explanation dashboard that combines visually represented data-centric, feature-importance, and counterfactual explanations, and a comparison of these visual explanations in terms of understandability, usefulness, actionability, and trustworthiness with healthcare experts and patients; and (3) design implications for tailoring explanations for healthcare experts, based on observations from our user-centered design process and an extensive user study.