SetFusion Visual Hybrid Recommender -  IUI 2014

Slides of my presentation at IUI 2014 on SetFusion, a visual hybrid recommender: "See What you Want to See: Visual User-Driven Approach for Recommendation"

http://dl.acm.org/citation.cfm?id=2557542

DEMO available:
http://www.youtube.com/watch?v=9LwSx1V6Yxk

Transcript

  • 1. See What you Want to See: Visual User-Driven Approach for Recommendation Denis Parra, PUC Chile Peter Brusilovsky, University of Pittsburgh Christoph Trattner, Graz University of Technology IUI 2014, Haifa, Israel
  • 2. Outline •  Short intro to some Challenges in Recommender Systems •  Our Approach to User Controllability (demo) •  User Study & Results •  Summary & Future Work 02/27/2014 D.Parra et al.~ IUI 2014 2
  • 3. INTRODUCTION. Recommender Systems: Introduction & Challenges addressed in this research. * Danboard (Danbo), Amazon's cardboard robot, represents a recommender system in these slides *
  • 4. Recommender Systems (RecSys): systems that help people find relevant items in a crowded item or information space (McNee et al. 2006)
  • 5. Challenges of RecSys Addressed Here. Traditionally, RecSys research has focused on producing accurate recommendation algorithms. In this research, these challenges are addressed: 1. Human factors in RecSys: study controllability by introducing a novel visualization that presents a fusion of different recommenders. 2. Evaluation: use of objective, subjective & behavioral metrics.
  • 6. Research Goals & User Studies. Research goal: to understand the effect of controllability on user engagement and on the overall user experience of a RecSys (in this paper). Through two studies conducted using Conference Navigator (Program, Proceedings, Author List, Recommendations): http://halley.exp.sis.pitt.edu/cn3/
  • 7. WHY IUI SHOULD CARE: HCI + RECSYS COMMUNITY. Previous research related to this work / motivating results from the TalkExplorer study
  • 8. TasteWeights (Bostandjiev et al. 2012)
  • 9. Preliminary Work: TalkExplorer. • Adaptation of the Aduna visualization in CN. • Main research question: does fusion (intersection) of contexts of relevance improve the user experience? (Figure labels: center user, CN user, recommender; cluster with intersection of entities; cluster of talks associated with only one entity.)
  • 10. SETFUSION: USER-CONTROLLABLE HYBRID INTERFACE
  • 11. Our Proposed Interface: SetFusion
  • 12. Our Proposed Interface - II. Traditional ranked list: papers sorted by relevance. It combines 3 recommendation approaches.
  • 13. Our Proposed Interface - III. Sliders: allow the user to control the importance of each data source or recommendation method. Interactive Venn diagram: allows the user to inspect and filter the recommended papers. Actions available: filter the item list by clicking on an area; highlight a paper by mousing over a circle; scroll to a paper by clicking on a circle; indicate bookmarked papers.
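The Venn-diagram filtering described on this slide reduces to set algebra over the three recommenders' result sets: clicking an area selects the items recommended by exactly that combination of methods. A minimal sketch in Python (method names and item ids are hypothetical, not from the paper):

```python
# Venn-region filtering: clicking an area of the 3-set Venn diagram
# selects items recommended by exactly that combination of methods.

def venn_region(selected, all_methods, results):
    """Items recommended by every method in `selected` and by no other method.

    results: dict mapping method name -> set of recommended item ids.
    """
    region = set.intersection(*(results[m] for m in selected))
    for m in all_methods - set(selected):
        region -= results[m]  # exclude items also found by unselected methods
    return region

results = {
    "content": {"p1", "p2", "p3"},
    "tags":    {"p2", "p3", "p4"},
    "social":  {"p3", "p5"},
}
methods = set(results)

# Center of the diagram: items all three methods agree on.
print(venn_region({"content", "tags", "social"}, methods, results))  # {'p3'}
# Content-only region:
print(venn_region({"content"}, methods, results))  # {'p1'}
```

The same regions drive the highlight and scroll-to actions: each circle in the diagram maps back to the item ids in its region.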
  • 14. Mixed Hybridization: Item Score. Each slider sets a weight. Notation: M: the set of all methods available to fuse; rec_i: recommended item i; m_j: recommendation method j; rank_{rec_i,m_j}: rank position of item rec_i in the list produced by method m_j; W_{m_j}: weight given by the user to method m_j through the controllable interface; |M_{rec_i}|: the number of methods by which item rec_i was recommended.
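The score formula itself did not survive extraction; only the symbol definitions remain. A rank-discounted weighted sum is one combination consistent with those definitions, sketched below in Python (the exact combination rule is an assumption, not necessarily the paper's formula):

```python
def fused_score(item, rankings, weights):
    """Assumed mixed-hybridization score for one item.

    rankings: dict method -> {item: 1-based rank in that method's list}
    weights:  dict method -> slider weight W_mj set by the user
    Only the methods that actually recommended the item (M_rec_i) contribute;
    each contributes its slider weight discounted by the item's rank position.
    """
    methods_for_item = [m for m in rankings if item in rankings[m]]  # M_rec_i
    return sum(weights[m] / rankings[m][item] for m in methods_for_item)

rankings = {
    "content": {"p1": 1, "p2": 2},
    "tags":    {"p2": 1, "p3": 2},
    "social":  {"p2": 3},
}
weights = {"content": 0.5, "tags": 1.0, "social": 0.25}

# p2 is recommended by all three methods, so it accumulates all three
# weighted contributions; p1 gets only the content-based one.
print(fused_score("p2", rankings, weights))
print(fused_score("p1", rankings, weights))
```

Moving a slider changes one entry of `weights` and re-sorting by `fused_score` re-ranks the list, which matches the interaction the interface slides describe.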
  • 15. RESEARCH: DETAILS & RESULTS. Description and analysis of the results of the 3 user studies
  • 16. Studies: CSCW 2013 & UMAP 2013

|               | CSCW 2013                            | UMAP 2013            |
|---------------|--------------------------------------|----------------------|
| Conditions    | Static List / Interactive SetFusion  | Interactive SetFusion |
| # Attendants  | ~400                                 | ~100                 |
| # RecSys users | 15 / 22                             | 50                   |
| Study type    | Between subjects                     | 1 group              |

Preliminary user study (CSCW): here we learned that the interactive interface had a positive effect on user behavior and perception of the RecSys. Second study (UMAP): only the interactive interface. Changes: 1. Preference elicitation: in CSCW we avoided cold start; in UMAP we had no constraints. 2. Use of the ratings to update the recommended items. 3. Tuning of the content-based recommender.
  • 17. Comparing CSCW and UMAP (only interactive interfaces)

|                                    | CSCW 2013           | UMAP 2013        |
|------------------------------------|---------------------|------------------|
| # Users exposed to recommendations | 84                  | 95               |
| # Users who used the recommender   | 22 (~26%)           | 50 (~52.6%)      |
| # Users who bookmarked papers      | 6 (~27.2%)          | 14 (~28%)        |
| # Talks bookmarked / user avg.     | 28 / 4.67           | 103 / 7.36       |
| Average user rating                | 3.73 / 10 (~45.4%)  | 3.62 / 8 (~16%)  |
| *Usage at recommender page*        |                     |                  |
| # Talks explored (user avg.)       | 16.84               | 14.9             |
| # People returning                 | 7 (~31.8%)          | 14 (~28%)        |
| Average time spent on page (s)     | 261.72              | 353.8            |
  • 20. From the Final Survey

| Statement | CSCW 2013 (11 users) | UMAP 2013 (8 users) |
|-----------|----------------------|---------------------|
| "I don't think that Conference Navigator needs a Recommender System" | M = 2.36, S.E. = 0.2 | M = 1.5, S.E. = 0.21 (p < 0.05) |
| "I would recommend this system to my colleagues" | M = 3.36, S.E. = 0.28 | M = 4.25, S.E. = 0.33 (p < 0.05) |

Users perceived SetFusion as a significantly more useful tool in UMAP than in CSCW.
  • 21. CONCLUSIONS & FUTURE WORK
  • 22. Summary of Results. • From Study 1 we showed that user controllability had an effect on the user experience with the RecSys. • Comparing SetFusion in Study 1 and Study 2: a natural elicitation setting (UMAP) allowed users to be more engaged in using the system for the task of the interface: bookmarking recommended papers. Users also perceived the system as more useful in UMAP 2013. Ratings are a form of giving the user control; a big lesson from Study 1: if you ask users for feedback, use it!
  • 23. Limitations & Future Work. • Apply our approach to other domains (fusion of data sources or recommendation algorithms). • Find alternatives to scale the approach to more than 3 sets; potential alternatives: clustering and radial sets. • Consider other factors that might interact with the user experience: controllability by itself vs. a minimum level of accuracy.
  • 24. THANKS! QUESTIONS? DPARRA@ING.PUC.CL
