
CaRR Workshop Keynote Slides

Recommender systems and IR are technically very similar problems, but they are typically treated separately and often investigated by different groups of researchers. Looking at how people behave with such systems can be one way of unifying the problems, as well as the researchers, and can also be a useful, complementary evaluation method. When examining user behaviour, context is crucial. By focusing on user behaviour and the encapsulating context, we can ask questions about tools that combine search and recommender systems, such as: when do people prefer to search and when do they prefer recommendations? What does this mean for what they are trying to achieve? In this talk I will try to answer such questions with examples from the leisure and health domains. Finally, looking towards the future, I will argue that the relationship between search and recommender systems and behaviour can go full circle, i.e. both have the potential to influence user behaviour in positive ways, and I will present some of the work that my collaborators and I are doing to explore this.



  1. 1. David Elsweiler | david.elsweiler@ur.de | Lehrstuhl für Informationswissenschaft | www.iw.ur.de | Behaviour with Search and Recommender Systems: what can it tell us?
  2. 2. Coming up... • Discuss some of the work I have been doing in Rec-Sys and Search – Leisure and Food / Health domains • Behavioural focus • Outline the benefits I believe such a focus has for both the rec-sys and the IR community
  3. 3. Caveat: Not just my work!
  4. 4. Computer Science Background
  5. 5. Photo by Martin LaBar (going on hiatus) - Creative Commons Attribution-NonCommercial License http://www.flickr.com/photos/32454422@N00 Created with Haiku Deck
  6. 6. Photo by Martin LaBar (going on hiatus) - Creative Commons Attribution-NonCommercial License http://www.flickr.com/photos/32454422@N00 Created with Haiku Deck
  7. 7. Photo by ell brown - Creative Commons Attribution License http://www.flickr.com/photos/39415781@N06 Created with Haiku Deck
  8. 8. Photo by will_cyclist - Creative Commons Attribution-NonCommercial License http://www.flickr.com/photos/88379351@N00 Created with Haiku Deck
  9. 9. Photo by Pete Prodoehl - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/35237092540@N01 Created with Haiku Deck
  10. 10. Photo by Pete Prodoehl - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/35237092540@N01 Created with Haiku Deck
  11. 11. Photo by davidjwbailey - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/27711971@N06 Created with Haiku Deck
  12. 12. Photo by davidjwbailey - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/27711971@N06 Created with Haiku Deck
  13. 13. Photo by bigwhitehobbit - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/28618339@N03 Created with Haiku Deck
  14. 14. Photo by bigwhitehobbit - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/28618339@N03 Created with Haiku Deck
  15. 15. Photo by My name's axel - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/37611179@N00 Created with Haiku Deck
  16. 16. Photo by John Howard - Getty Royalty-Free License http://www.gettyimages.com/Corporate/LicenseAgreements.aspx Created with Haiku Deck
  17. 17. Photo by John Howard - Getty Royalty-Free License http://www.gettyimages.com/Corporate/LicenseAgreements.aspx Created with Haiku Deck "This is a rec-sys problem. Think about Netflix, Spotify, Amazon etc." "...but the process of searching can also be part of the fun." We have been investigating these questions in different contexts: • Wikipedia, social media, distributed leisure events
  18. 18. App • Helps visitors find events • Generates Plans • Guides the visitor • 1000-2000 users • Interaction log-data
  19. 19. App • Helps visitors find events • Generates Plans • Guides the visitor • 1000-2000 users • Interaction log-data
  20. 20. App • Helps visitors find events • Generates Plans • Guides the visitor • 1000-2000 users • Interaction log-data
  21. 21. App • Helps visitors find events • Generates Plans • Guides the visitor • 1000-2000 users • Interaction log-data • Every 6 months • 1-2,000 users • Interaction data logged
  22. 22. App • Helps visitors find events • Generates Plans • Guides the visitor • 1000-2000 users • Interaction log-data • Combine with other data sources, e.g. a survey from >50 users • Rich understanding of how system features were used • How system usage influences the experience on the evening
  23. 23. Photo by Kaysse - Creative Commons Attribution License http://www.flickr.com/photos/29862505@N08 Created with Haiku Deck
  24. 24. Photo by Thomas Rousing - Creative Commons Attribution License http://www.flickr.com/photos/43812360@N05 Created with Haiku Deck • Offline evaluation of various Rec-Sys algs • LNMusic: 860 users; 4,973 ratings • LNMuseums: 1,047 users; 10,992 ratings • Of the single recommenders the popularity baseline performs best • Combining Content-based and Collaborative Filtering improves performance (dynamic weighting even more) • Additionally considering temporal contiguity does not affect the performance
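The combination result above (content-based plus collaborative filtering with dynamic weighting beating either alone, alongside a strong popularity baseline) can be illustrated with a small sketch. The weighting rule, function names and switch_point parameter below are assumptions for illustration only, not the algorithms actually evaluated in the study.

```python
from typing import Dict, List

def hybrid_score(cb_scores: Dict[str, float],
                 cf_scores: Dict[str, float],
                 n_user_ratings: int,
                 switch_point: int = 20) -> Dict[str, float]:
    """Blend content-based (CB) and collaborative filtering (CF) scores.
    A simple, assumed form of 'dynamic weighting': the CF weight grows
    with the amount of rating history the user has."""
    w_cf = min(1.0, n_user_ratings / switch_point)
    w_cb = 1.0 - w_cf
    items = set(cb_scores) | set(cf_scores)
    return {i: w_cb * cb_scores.get(i, 0.0) + w_cf * cf_scores.get(i, 0.0)
            for i in items}

def popularity_baseline(ratings_per_event: Dict[str, int]) -> List[str]:
    """Rank events by how many ratings they received (the kind of
    popularity baseline referred to on the slide)."""
    return sorted(ratings_per_event, key=ratings_per_event.get, reverse=True)
```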
  25. 25. Photo by Thomas Rousing - Creative Commons Attribution License http://www.flickr.com/photos/43812360@N05 Created with Haiku Deck • Online evaluation (live A/B testing) • Different weights with our best system and TempCont • Slight cost to user acceptance (E_Rec ∈ E_Sel) • Routes were tighter and more compact, which would allow users to spend less time travelling and more time visiting events • First hint that changing the system has an influence on the behaviour (and perhaps on the experience)
  26. 26. Investigating behavioural patterns • Long Night of Music (1,159 users, 111 with GPS data) • Dominant tab for users: Rec Sys 37.2%, By Tour 15.6%, Genre 17.4%, Search 24.5%, Map 5.3% • Most users (81.2%) stick to one or two tabs for selecting events of interest • Most events (82.8%) came from the dominant tab
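A minimal sketch of how the dominant-tab share could be derived from interaction logs; the log format shown here is a hypothetical one invented for the example, not the app's actual schema.

```python
from collections import Counter

# Hypothetical log rows: (user_id, tab, event_id), one per event selection.
selections = [
    ("u1", "Rec Sys", "e10"), ("u1", "Rec Sys", "e12"), ("u1", "Map", "e3"),
    ("u2", "Search", "e7"), ("u2", "Search", "e9"),
]

def dominant_tab_share(rows):
    """Per user: the tab most of their selections came from, and the
    share of their selections attributable to that tab."""
    per_user = {}
    for user, tab, _event in rows:
        per_user.setdefault(user, Counter())[tab] += 1
    result = {}
    for user, counts in per_user.items():
        tab, n = counts.most_common(1)[0]
        result[user] = (tab, n / sum(counts.values()))
    return result

print(dominant_tab_share(selections))
# {'u1': ('Rec Sys', 0.666...), 'u2': ('Search', 1.0)}
```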
  27. 27. Tab-usage during the night
  28. 28. Tab-usage during the night • Planning phase • Event discovery with planning in mind, e.g. searching, browsing and in particular the RecSys tab
  29. 29. Tab-usage during the night • After 8pm behaviour changed • Less interaction with search, genre & RecSys • More geographical interaction, in particular with the Map tab
  30. 30. Photo by Leo Reynolds - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/49968232@N00 Created with Haiku Deck • Metrics to model user experience on evening • # event visits • Evening duration • Ratio of visiting time • Avg. event visiting time • Recall and Precision of visited events • Diversity of events • Temporal contiguity of events • Ratio of top N events
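As a concrete illustration of two of these metrics, the sketch below computes precision/recall of visited events against the events a user marked in their plan, and the top-N ratio; the data shapes are assumptions for illustration, not the study's actual analysis code.

```python
def precision_recall(visited: set, planned: set):
    """Precision: share of visited events that were in the user's plan.
    Recall: share of planned events that were actually visited."""
    if not visited or not planned:
        return 0.0, 0.0
    hits = len(visited & planned)
    return hits / len(visited), hits / len(planned)

def top_n_ratio(visited: set, events_by_popularity: list, n: int = 5):
    """Share of visited events that are among the n most popular events."""
    if not visited:
        return 0.0
    return len(visited & set(events_by_popularity[:n])) / len(visited)
```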
  31. 31. • Visited significantly more events than the others – on average nearly 1.5 events more • Spent significantly more time visiting events • Likely because of the Temporal Contiguity component in the RecSys • More efficient use of time on the evening • Significantly shorter interaction times • More popular events
  32. 32. • Also visited more events • Spent less time visiting events • Longer evenings • Tended to visit only events near stops on one or two lines • "Value for money" users
  33. 33. • Visited less diverse and less popular events • Favour more esoteric choices that fit more closely with their specific genres of interest. • Specificity comes at a cost of a smaller number of visited events and also a lower ratio of visiting time • Greater precision, meaning they tend to adhere more rigidly to their original plans during the night.
  34. 34. • Spent less time during the evening overall (~30 mins) and 5 mins less at each event • Surprisingly no influence on popularity • Users seem to cherry-pick events they already know about, e.g. recommendations from friends • Spent a lot of time planning these events (increased interaction time before the event)
  35. 35. Map Tab • Interacted less before the evening (5.6min vs. 15.7min) • Temporal contiguity for visited events is lower • Visited events less likely to have been previously marked – likely explained by such users marking fewer events as interesting (4.71 events vs. 9.79; p=0.01). • Visited events were less popular (10.1% vs. 15.7% of visited events were among the top 5)
  36. 36. Visited events precision over time: • Map users stuck with their smaller plans until around 9.30pm • Other users until around 12.30 am • Both groups were more likely to deviate as time went on
  37. 37. • 55 users provided feedback about the app and their priorities for the evening • Rec-sys and Tour tab users appreciate routes with: • an efficient use of time, shorter paths, and many events. • Tour tab users value interestingness of events less than other users
  38. 38. • Genre tab users: • were less interested in using time efficiently, • didn't care much about having short travel times, • and were not bothered about visiting many events. • Instead, they put value on visiting interesting but not diverse events
  39. 39. • Map tab users: – 88.9% claimed they used the app as an electronic program guide (vs 62.2%) • Reflects map tab users having no ambition to make plans, instead deciding spontaneously where to go next.
  40. 40. • Search tab users: – Outliers – don't really state any clear preferences relative to the other groups – There was one finding of note that linked to their outcomes: – Strong disagreement with the statement that the app helped to reduce travelling time, while other groups strongly agreed – Cherry-picking events is not a good strategy if you want an efficient route
  41. 41. Photo by fractalznet - Creative Commons Attribution-NonCommercial License http://www.flickr.com/photos/95575701@N00 Created with Haiku Deck • What users want differs and changes over time • Distinct patterns of usage: • Correlation between using specific features and outcomes of the evening • Correlation between reported user priorities and usage of specific features • Different support works best in different situations • Users adapt their behaviour
  42. 42. Photo by marco bono - Creative Commons Attribution-NonCommercial-ShareAlike License http://www.flickr.com/photos/47001509@N00 Created with Haiku Deck
  43. 43. Müller, M.; Harvey, M.; Elsweiler, D. & Mika, S. (2012), Ingredient Matching to Determine the Nutritional Properties of Internet-Sourced Recipes, in 'Proc. 6th International Conference on Pervasive Computing Technologies for Healthcare'
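The ingredient-matching work maps free-text recipe ingredients to entries in a nutrient database so that a recipe's nutritional properties can be estimated. The toy sketch below uses plain token overlap and made-up nutrient values; it is an assumed simplification for illustration, not the matching method from the paper.

```python
import re
from typing import Optional

# Toy nutrient table (kcal per 100 g); values are illustrative only.
NUTRIENT_DB = {
    "butter": 717,
    "plain flour": 364,
    "granulated sugar": 387,
}

def match_ingredient(text: str) -> Optional[str]:
    """Match a free-text ingredient line to the DB entry sharing most tokens."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    best, best_overlap = None, 0
    for name in NUTRIENT_DB:
        overlap = len(tokens & set(name.split()))
        if overlap > best_overlap:
            best, best_overlap = name, overlap
    return best

print(match_ingredient("200g unsalted butter, softened"))  # -> 'butter'
```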
  44. 44. Harvey, M.; Elsweiler, D. & Ludwig, B. (2013), You are what you eat: learning user tastes for rating prediction, in 'Proc. 20th String Processing and Information Retrieval Symposium (SPIRE)', Jerusalem, Israel.
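The SPIRE paper predicts recipe ratings from learned user tastes. As a rough, assumed illustration of the general idea (not the model from the paper), one can build a per-user taste profile from the ingredients of previously rated recipes and score a new recipe by how well its ingredients match that profile:

```python
from collections import defaultdict

def taste_profile(rated_recipes):
    """rated_recipes: list of (ingredients: set[str], rating: float).
    Returns the user's average rating per ingredient."""
    totals, counts = defaultdict(float), defaultdict(int)
    for ingredients, rating in rated_recipes:
        for ing in ingredients:
            totals[ing] += rating
            counts[ing] += 1
    return {ing: totals[ing] / counts[ing] for ing in totals}

def predict_rating(profile, ingredients, fallback=3.0):
    """Average the user's per-ingredient preferences over the new recipe."""
    known = [profile[i] for i in ingredients if i in profile]
    return sum(known) / len(known) if known else fallback

history = [({"beef", "onion"}, 5.0), ({"tofu", "onion"}, 2.0)]
profile = taste_profile(history)
print(predict_rating(profile, {"beef", "garlic"}))  # -> 5.0
```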
  45. 45. Plans • User created • Automatically generated based on user tastes and WHO guidelines
  46. 46. Plans • User created • Automatically generated based on user tastes and WHO guidelines
  47. 47. Plans • User created • Automatically generated based on user tastes and WHO guidelines
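One way an automatic planner of this kind could work is sketched below: greedily pick the recipes the user is predicted to like while keeping the day within a nutritional limit. The greedy strategy and the single fat constraint are simplifying assumptions; WHO guidance covers many more nutrients, and the system's actual planner is not reproduced here.

```python
def generate_day_plan(recipes, taste_score, max_fat_g=70.0, meals=3):
    """Greedy planner: choose the highest-scoring recipes whose combined
    fat stays under an assumed daily limit.

    recipes: list of dicts such as {"name": str, "fat_g": float}
    taste_score: function mapping a recipe to a predicted preference score
    """
    plan, fat_total = [], 0.0
    for recipe in sorted(recipes, key=taste_score, reverse=True):
        if len(plan) == meals:
            break
        if fat_total + recipe["fat_g"] <= max_fat_g:
            plan.append(recipe)
            fat_total += recipe["fat_g"]
    return plan
```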
  48. 48. Behaviour with the system • How is this system used? • What factors affect this? • Behavioural Change – User has a goal (e.g. eat less fatty foods, lose weight, eat more protein) – Can the system help change behaviour to move the user towards his or her goal? • Does system usage influence behavioural change?
  49. 49. Photo by Fake Plastic Alice - Creative Commons Attribution License http://www.flickr.com/photos/57764541@N00 Created with Haiku Deck • A behavioural approach is system-agnostic • Behaviour is highly context-dependent • As are user goals • Behaviour > interaction: • non-system behaviours, e.g. Long Night (LN) outcomes • Complementary evaluation approach
  50. 50. Photo by Derek Bridges - Creative Commons Attribution License http://www.flickr.com/photos/84949728@N00 Created with Haiku Deck
