Paper presentation @ IntRS 2021: Joint Workshop on Interfaces and Human Decision Making for Recommender Systems | Co-located with RecSys 2021 (September 25 - October 1, 2021 | Amsterdam, the Netherlands)
Evaluating Explainable Interfaces for a Knowledge Graph-Based Recommender System
Erasmo Purificato1,2, Baalakrishnan Aiyer Manikandan1, Prasanth Vaidya Karanam1,
Mahantesh Vishvanath Pattadkal1, Prof. Dr.-Ing. Ernesto William De Luca1,2
1Otto von Guericke University Magdeburg, Germany – Faculty of Computer Science
2Georg Eckert Institute – Leibniz Institute for International Textbook Research, Brunswick, Germany
September 29, 2021 | IntRS Workshop at RecSys’21 – Amsterdam, the Netherlands & Online
Motivation
Recommender systems have contributed significantly to the growth of interest in retrieving
personalised information in several contexts.
In the academic domain:
• Expert recommendation;
(Pavan et al., 2015; Mangravite et al., 2016)
• Scientific paper recommendation.
(Hassan et al., 2017; Bai et al., 2019)
Erasmo Purificato et al. Evaluating Explainable Interfaces for a Knowledge Graph-Based Recommender System 2
Motivation
Increasing need for interpretability of the system outcomes from the end-user perspective.
Explainable AI (XAI) aims at
“exposing complex artificial intelligence
models to humans in a systematic and
interpretable manner”.
(Samek et al., 2019)
Motivation
The design of user interfaces plays a fundamental role in providing proper explanations
to end-users, in many cases even more than the implementation of the system itself.
Problems:
• Individual characteristics change the user-perceived transparency (Gedikli et al., 2014)
• Goals and cognitive capabilities affect the perception of explanation (Miller, 2019)
• Different users require different explanation details (Millecamp et al., 2019)
⇒ from static explanations towards adaptive and personalised explanations.
Contribution
Our contribution:
1. A knowledge graph-based recommender system for scientific publications;
2. Two explainable UIs providing different types of explanations to the users.
Our goal:
“Assess which of the different ways of displaying results and related explanations is the
most comprehensible and likely to increase confidence and trustworthiness in using the
system by researchers covering different fields of study (e.g. computer science, social
sciences, linguistics, educational media and humanities).”
GEI Knowledge Graph
Contains GEI members’ job information, published works, research interests, projects in
which they are involved and existing relationships with external persons (e.g. co-authorship).
Data preparation
Focus on:
• Authorship relationship between
research output and person entities
(including external persons) as the
pivotal connection of the system;
• Membership relationship between
persons and organisational units
(i.e. departments);
• Interest relationship between persons
and research interests (merged with
activities the users were involved in).
Data preparation
Topic creation for research outputs:
• Merged title and abstract;
• Documents translated into English, if needed;
• Use of BERTopic (Grootendorst, 2021) for topic modelling.
Topic label                         Representative terms
Information retrieval               information, user, semantic
International and German history    german, polish, czech, history
GEI                                 eckert, institution, education
Educational and textbook            research, studies, education
Social, environment and politics    moral, prejudice, conflict, threat
System implementation
Inspired by entity2rec (Palumbo et al., 2017):
• Definition of property-specific subgraphs;
• Vector representations of entities learned with the node2vec algorithm (Grover et al., 2016);
• Cosine similarity used to compute similarity scores between person and research
output embeddings at the subgraph level;
• Top-N recommendations ranked by averaging the per-subgraph scores of each
person-research output pair to obtain the final overall similarity:
sim(u, w) = (1 / |N|) · Σ_i sim_i(u, w)
where N is the set of property-specific subgraphs.
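The averaging step above can be sketched in plain Python. This is a minimal illustration only: the subgraph names and toy embedding vectors are invented, and the real system uses node2vec embeddings learned per subgraph.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def overall_similarity(user_embs, paper_embs):
    """Average the per-subgraph similarities sim_i(u, w) to get sim(u, w).

    user_embs / paper_embs: dict mapping subgraph name -> embedding,
    with one entry per property-specific subgraph.
    """
    sims = [cosine(user_embs[s], paper_embs[s]) for s in user_embs]
    return sum(sims) / len(sims)

# Toy embeddings for one person u and one research output w
# (subgraph names here are hypothetical placeholders)
u = {"topic": [1.0, 0.0], "interest": [0.6, 0.8], "department": [0.0, 1.0]}
w = {"topic": [1.0, 0.0], "interest": [0.6, 0.8], "department": [1.0, 0.0]}
print(round(overall_similarity(u, w), 3))  # → 0.667
```

Top-N recommendation then amounts to computing this score for every candidate research output and keeping the N highest.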
System implementation
Given the authorship relationship as the pivot, we generated three subgraphs:
• Topic subgraph containing persons,
research outputs and generated topics;
• Research interest and Activity
subgraph including research outputs,
persons, their research interests and
the activities they were involved in;
• Department subgraph containing
research outputs, persons and the
organisational units they belong to.
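A rough sketch of this subgraph construction: filter the knowledge graph's triples by relation type, keeping the pivotal authorship relation in every subgraph. The relation names and toy triples below are invented for illustration; the actual GEI graph schema differs.

```python
# Toy knowledge graph as (head, relation, tail) triples;
# entity and relation names are hypothetical placeholders.
triples = [
    ("alice", "authored", "paper1"),
    ("bob", "authored", "paper1"),
    ("paper1", "has_topic", "information_retrieval"),
    ("alice", "member_of", "dept_cs"),
    ("alice", "interested_in", "semantic_web"),
    ("bob", "involved_in", "project_x"),
]

# The pivotal 'authored' relation is shared by all three
# property-specific subgraphs.
SUBGRAPH_RELATIONS = {
    "topic": {"authored", "has_topic"},
    "interest_activity": {"authored", "interested_in", "involved_in"},
    "department": {"authored", "member_of"},
}

def build_subgraphs(triples):
    """Split the full graph into property-specific edge lists."""
    return {
        name: [t for t in triples if t[1] in rels]
        for name, rels in SUBGRAPH_RELATIONS.items()
    }

subgraphs = build_subgraphs(triples)
for name, edges in subgraphs.items():
    print(name, len(edges))
```

Each resulting edge list would then be fed to node2vec separately, yielding one embedding per entity per subgraph.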
Explainable user interfaces
System A [screenshot]
System B [screenshot]
Evaluation
User study:
• Among the researchers at GEI, covering
several research areas;
• Evaluation of the effectiveness of both
explainable interfaces;
• Pool of 23 participating researchers;
• Within-subjects evaluation with random
appearance of the user interfaces;
• 5-point Likert questions.
Results
Label System Question
Q1 A The explanation sentences were relevant in interpreting the decision-making of the system.
Q1 B The similarity score was relevant to me in interpreting the decision-making process of the system.
Q2 A The user-to-paper graph was relevant in understanding the recommendation process.
Q2 B I found the similarity bar chart relevant in understanding the recommendations.
Q3 A & B The explanations given by the system help me understand why the items were recommended to me.
Q4 A & B I understood why the items were recommended to me.
Q5 A & B I feel confident in using this system as I have attained better understanding of the decision-making process.
Q6 A & B The explanations have increased my trust in the system.
Results
We conducted targeted interviews to enrich the qualitative evaluation.
Conclusion
In this paper:
• Presented the implementation of a KG-based recommender for research papers
developed in the academic context of the Georg Eckert Institute;
• Proposed two different explainable interfaces for results visualisation, as a first
attempt to assess what type of explanation is perceived as most comprehensible and
likely to increase trust in the system;
⇒ Findings:
• Providing explanations in natural language and showing graph connections is
perceived as much more effective than similarity scores and bar charts;
• User study and targeted interviews confirmed that different backgrounds might change
the perception of explanations.
Questions are welcome!
Erasmo Purificato, M.Sc.
erasmo.purificato@ovgu.de
erasmo.purificato@gei.de
*Credits: icons made by Pixelmeetup from Flaticon