Explaining job recommendations:
a human-centred perspective
FEAST @ ECML-PKDD - 23 Sept 2022
Katrien Verbert
Augment/HCI - KU Leuven
@katrien_v
Human-Computer Interaction group
Explainable AI - recommender systems – visualization – intelligent user interfaces
Learning analytics &
human resources
Media
consumption
Precision agriculture
Healthcare
Augment Katrien Verbert
ARIA Adalberto Simeone
Computer
Graphics
Phil Dutré
LIIR Sien Moens
E-media
Vero Vanden Abeele
Luc Geurts
Kathrin Gerling
Augment/HCI team
Robin De Croon
Postdoc researcher
Katrien Verbert
Professor
Francisco Gutiérrez
Postdoc researcher
Tom Broos
PhD researcher
Nyi Nyi Htun
Postdoc researcher
Houda Lamqaddam
Postdoc researcher
Oscar Alvarado
Postdoc researcher
https://augment.cs.kuleuven.be/
Diego Rojo García
PhD researcher
Maxwell Szymanski
PhD researcher
Jeroen Ooge
PhD researcher
Aditya Bhattacharya
PhD researcher
Ivania Donoso Guzmán
PhD researcher
3
¤ Explaining model outcomes to increase user trust and acceptance
¤ Enable users to interact with the explanation process to improve the model
Research objectives
Models
5
Collaborative filtering – Content-based filtering
Knowledge-based filtering - Hybrid
Recommendation techniques
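As a minimal illustration of the content-based family above (the other families work differently: collaborative filtering uses other seekers' interactions, knowledge-based filtering explicit domain rules), the sketch below ranks vacancies by cosine similarity between skill vectors. All profiles and weights are invented for illustration.

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two sparse skill-weight vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical profiles: skills weighted by (invented) proficiency levels.
seeker = Counter({"python": 3, "sql": 2, "communication": 1})
vacancies = {
    "data analyst": Counter({"python": 2, "sql": 3, "statistics": 1}),
    "web developer": Counter({"javascript": 3, "html": 2, "python": 1}),
}

# Content-based filtering: rank vacancies by similarity to the seeker.
ranked = sorted(vacancies.items(),
                key=lambda kv: cosine(seeker, kv[1]), reverse=True)
```

Here "data analyst" ranks first (similarity ≈ 0.86 vs ≈ 0.21 for "web developer"), because its required skills overlap the seeker's profile.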
7
Bostandjiev, S., O'Donovan, J., & Höllerer, T. (2013, March). LinkedVis: exploring social and
semantic career recommendations. In Proceedings of the 2013 international conference on
Intelligent user interfaces (pp. 107-116).
Explaining prediction models
8
Gutiérrez, F., Ochoa, X., Seipp, K., Broos, T., & Verbert, K. (2019, September). Benefits and trade-offs of
different model representations in decision support systems for non-expert users. In IFIP Conference on
Human-Computer Interaction (pp. 576-597).
Explanation methods
9
Human resources
Job mediators
¤ Supporting dialogue
¤ Explaining predictions
Job seekers
¤ Supporting job seekers
¤ Explaining recommendations
10
Predictions for finding jobs
11
Predicting duration to find a job
12
Context: Three years of data, 700 000 job seekers.
Key Issues: Missing data, prediction trust issues, job
seeker motivation, lack of control.
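The slide does not specify the actual prediction model, but the task can be sketched with a minimal baseline: predict the median time-to-placement of past seekers with similar attributes, falling back to the overall median when the attribute is missing (a nod to the missing-data issue above). All records below are invented.

```python
from statistics import median

# Invented historical records: (education_level, days_until_placement).
history = [
    ("higher", 60), ("higher", 90), ("higher", 75),
    ("secondary", 120), ("secondary", 150),
    ("none", 200), ("none", 240), ("none", 180),
]

def predict_duration(education, records):
    """Baseline: median placement time of past seekers with the same
    education level; fall back to the overall median when the
    attribute is missing or unseen."""
    similar = [days for level, days in records if level == education]
    return median(similar) if similar else median(days for _, days in records)
```

For example, `predict_duration("higher", history)` returns 75 days, the median of the three "higher" records.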
Goal
13
Design explanations to increase:
Support the dialogue between domain expert and laymen
14
Human-in-the-loop
Sven Charleer, Andrew Vande Moere, Joris Klerkx, Katrien Verbert, and Tinne De Laet. 2017. Learning
Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Transactions on Learning Technologies
(2017), 1–12.
“… the expert can become the
intermediary between the [system] and
the [end-user] in order to avoid
misinterpretation and incorrect
decisions on behalf of the data…”
Customer journey approach
Observation of hands-on time
Observations of individual mediation sessions
Questionnaire
Preliminary study
Design goals
16
[DG1] Control the message
[DG2] Clarify the recommendations
[DG3] Support the mediator
17
Design and development
18
Forest plot – Circles chart – Bar chart
Evaluation
19
Years of experience: (M = 9, SD = 4.3)
Six mediators dealt only with higher education job seekers.
Four with secondary to higher education.
Two dealt with job seekers without
technical/professional education.
Semi-structured interviews
1) Feedback on parameter visuals.
2) Interaction feedback with the working prototype dashboard.
Qualitative evaluation with expert users:
(N = 12, 10 female, age: M = 40.7, SD = 9.4)
[DG1] control the message
20
Two themes
(1) Customization
(2) Importance of the human factor
[DG2] Clarify recommendations
21
Two themes
(1) Understanding the visualisation
(2) Convincing power
[DG3] Support the mediator
Useful cases
¤ Orientation
¤ Job mobility
22
Take away messages
¤ Key difference between actionable and non-actionable
parameters.
¤ Need for customization and contextualization.
¤ The human expert plays a crucial role in interpreting
and relaying the predicted or recommended output.
23
Charleer, S., Gutiérrez Hernández, F., & Verbert, K. (2019). Supporting job mediator and job
seeker through an actionable dashboard. In Proceedings of the 24th International Conference on
Intelligent User Interfaces (ACM IUI 2019), Los Angeles, USA. (Core: A)
Human resources
Job mediators
¤ Supporting dialogue
¤ Explaining predictions
Job seekers
¤ Supporting job seekers
¤ Explaining recommendations
24
Explaining job recommendations
25
• Overwhelming number of job vacancies
• Dynamic labor market: need to support job mobility
• Providing effective recommendations is particularly
challenging.
• Need for:
increased diversity
explanations
user control
exploration
Approach
Explaining job recommendations to show competence match
Support exploration and user control over broad and diverse recommendations
Explaining job recommendations
27
Gutiérrez, F., Charleer, S., De Croon, R., Htun, N. N., Goetschalckx, G., & Verbert, K. (2019).
Explaining and exploring job recommendations: a user-driven approach for interacting with
knowledge-based job recommender systems. In Proceedings of the 13th ACM Conference on
Recommender Systems. ACM.
Methods
28
29
Ranking of parameters as voted by participants
30
Labor Market Explorer Design Goals
31
[DG1] Exploration/Control
Job seekers should be able to control
recommendations and filter the information flow
from the recommender engine by prioritizing
specific items of interest.
[DG2] Explanations
Recommendations and matching scores should be
explained, and details should be provided on-demand.
[DG3] Actionable Insights
The interface should provide actionable insights to
help job seekers find new or more job
recommendations from different perspectives.
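A competence-match explanation of the kind DG2 calls for can be sketched as a decomposition of the matching score into matched and missing competences. This is a hypothetical scoring scheme, not the system's actual one, and the skill sets are invented; listing the missing competences is what makes the explanation actionable (DG3).

```python
def explain_match(seeker_skills, required_skills):
    """Decompose a match score into matched and missing competences.

    Hypothetical scheme: score = fraction of required competences
    the seeker already has.
    """
    matched = sorted(set(seeker_skills) & set(required_skills))
    missing = sorted(set(required_skills) - set(seeker_skills))
    score = len(matched) / len(required_skills) if required_skills else 0.0
    return {"score": score, "matched": matched, "missing": missing}

# Invented seeker and vacancy competence sets.
report = explain_match({"python", "sql", "communication"},
                       {"python", "sql", "statistics", "french"})
```

For this example the score is 0.5, with "python" and "sql" matched and "french" and "statistics" reported as missing, i.e. as competences the seeker could acquire next.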
32
Final Evaluation
33
66 job seekers (age 33.9 ± 9.5, 18 female)
8 Training Programs, 4 Groups, 1 Hour.
ResQue Questionnaire + two open questions.
Users explored the tool freely.
All interactions were logged.
34
Results
35
Results
Results
¤ Explanations contribute to user empowerment.
¤ Participants also mentioned a diverse set of actionable
insights.
¤ Participants in the technical group engaged more with all
the different features of the dashboard.
¤ Non-native speakers and the sales and construction groups
engaged more with the map.
¤ The table overview was perceived as very useful by all user
groups, but the interaction may need further simplification
for some users.
36
Research challenges
¤ Insight vs. information overload
¤ Visual representations often difficult for non-expert users
¤ Limitations of user studies
37
Next steps
¤ Personalisation
¤ Conversational explanation methods
¤ Interactive explanation methods
38
Peter Brusilovsky Nava Tintarev Cristina Conati
Denis Parra
Collaborations
Bart Knijnenburg Jürgen Ziegler
Questions?
katrien.verbert@cs.kuleuven.be
@katrien_v
Thank you!
http://augment.cs.kuleuven.be/
