Evaluation of Cross-Domain News Article Recommendations
Presentation given at the UMAP 2013 Doctoral Consortium


  1. Competence Center Information Retrieval & Machine Learning
     UMAP‘13 Doctoral Consortium
     Evaluation of Cross-Domain News Article Recommendations
     Benjamin Kille, 13 June 2013
  2. Agenda
     ► Problem description
     ► Challenges in News Article Recommendation
        – Sparsity
        – Dynamic item collection
        – Evaluation
     ► Research Questions
     ► Data outline
     ► Preliminary results
     ► Conclusions
     ► Next steps
  3. Problem description
     ► Information overload
        – the amount of on-line accessible news articles increases
        – limited user perception
        – limited time capacity
     ► Solution: recommender system → filtering news articles with respect to relevance/utility
     ► Special challenges for news recommender systems
        – Sparsity
        – Dynamics
     ► General challenges for recommender systems
        – Evaluation strategy
  4. Problem formalization
  5. Sparsity
  6. Dynamics
     ► News → dynamic content
       Billsus, D. & Pazzani, M.J., 2007. Adaptive News Access. In P. Brusilovsky, A. Kobsa, & W. Nejdl, eds. The Adaptive Web. Springer, pp. 550–570.
     ► Unlike music or movies, news articles are rarely re-consumed
     ► For instance, Deutsche Presse-Agentur (DPA): 750 messages, 220k words, 1.5k images
       http://www.dpa.de/Zahlen-Fakten.152.0.html
  7. Evaluation
     ► Strategy
        – on-line: A/B testing (user-centric)
        – off-line: data set (data-centric)
     ► Numerous facets: utility, relevance, novelty, serendipity, …
     ► Depending on the model formulation
        – preference prediction (requires numerical preference data)
        – item ranking
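The two model formulations named above imply different off-line metrics. A minimal sketch (function names are illustrative, not from the talk): RMSE for preference prediction, precision@k for item ranking.

```python
import math

def rmse(predicted, actual):
    """Error metric for numerical preference prediction."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

def precision_at_k(ranked_items, clicked_items, k):
    """Ranking metric: share of the top-k recommendations the user clicked."""
    return sum(1 for item in ranked_items[:k] if item in clicked_items) / k
```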
  8. Evaluation (cont‘d)
     [Figure: a dispatcher routes recommendation requests to users and records clicks]
  9. Evaluation (cont‘d)
     [Figure: off-line replay of logged recommendation requests and clicks; recommendations never shown to the user remain unlabeled ("?")]
     Li, L. et al., 2011. Unbiased offline evaluation of contextual-bandit-based news article recommendation algorithms. In Proceedings of the fourth ACM international conference on Web search and data mining – WSDM ’11, p. 297.
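The replay idea behind Li et al.'s unbiased off-line evaluation can be sketched as follows; the function and log layout are my assumptions, and the estimate is only unbiased when the logged recommendations were chosen uniformly at random.

```python
def replay_ctr(log, policy):
    """Off-line CTR estimate via replay (in the spirit of Li et al., WSDM '11):
    only logged events where the evaluated policy would have shown the same
    article are counted; requires a uniformly random logging policy."""
    matches, clicks = 0, 0
    for context, shown_article, clicked in log:
        if policy(context) == shown_article:
            matches += 1
            clicks += clicked
    return clicks / matches if matches else float("nan")

# Toy usage: a policy that always recommends article "a"
log = [("ctx1", "a", 1), ("ctx2", "b", 0), ("ctx3", "a", 0)]
estimate = replay_ctr(log, lambda ctx: "a")  # 2 matching events, 1 click
```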
  10. Cross-domain setting
      [Figure: two domains D1 and D2, each with a user set U and item set I, in four scenarios: no overlap, user overlap, item overlap, full overlap]
      Cremonesi, P., Tripodi, A. & Turrin, R., 2011. Cross-Domain Recommender Systems. In 2011 IEEE 11th International Conference on Data Mining Workshops. IEEE, pp. 496–503.
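The four scenarios of Cremonesi et al. can be distinguished from the user and item sets of the two domains alone; a small sketch (function name is mine):

```python
def overlap_scenario(users_d1, items_d1, users_d2, items_d2):
    """Classify two domains into one of the four cross-domain settings."""
    key = (bool(users_d1 & users_d2), bool(items_d1 & items_d2))
    return {(False, False): "no overlap",
            (True, False): "user overlap",
            (False, True): "item overlap",
            (True, True): "full overlap"}[key]
```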
  11. Research Questions
      ► How can other publishers' user interactions contribute to decreasing sparsity for the target publisher?
      ► What characteristics must recommender algorithms exhibit to successfully cope with dynamically changing item collections?
      ► How can cross-domain recommender systems with dynamically changing item collections be evaluated? How do standard evaluation metrics compare to the observed clicks?
  12. Data outline
      ► 1–2M impressions by 12 publishers (general news, local news, finance, information technology, sports, etc.) on a daily basis
      ► user features such as: browser, ISP, OS, device
      ► news article features such as: title, text, URL, image
      ► http://www.dai-labor.de/en/irml/epen/
      ► Real interactions with actual users!
  13. Preliminary results
      ► Sparsity
      ► Histogram of the relative frequency of user interactions
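Both quantities on this slide can be computed directly from an interaction log; a sketch assuming interactions are (user, item) pairs:

```python
from collections import Counter

def sparsity(interactions, n_users, n_items):
    """Share of user-item cells without any observed interaction."""
    observed = len(set(interactions))
    return 1 - observed / (n_users * n_items)

def relative_interaction_frequency(interactions):
    """Relative frequency of users by their number of interactions,
    i.e. the histogram shown on the slide."""
    per_user = Counter(user for user, _ in interactions)
    histogram = Counter(per_user.values())
    return {n: count / len(per_user) for n, count in histogram.items()}
```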
  14. Preliminary results (cont‘d)
      ► Dynamics
  15. Preliminary results (cont‘d)
      ► Popularity
  16. Conclusions
  17. Next steps
      ► Implementation of existing cross-domain recommender algorithms
      ► Evaluating recommender algorithms with respect to: CTR, novelty, diversity
      ► Investigate UI effects
      ► Analyze applicability of context-sensitive recommendations
      ► User/item clustering to reduce computation time
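Of the evaluation targets listed above, CTR and intra-list diversity have simple definitions; a sketch in which the dissimilarity function is a placeholder for whatever article distance is eventually chosen:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks per recommendation impression."""
    return clicks / impressions

def intra_list_diversity(items, dissimilarity):
    """Average pairwise dissimilarity within one recommendation list."""
    pairs = [(a, b) for i, a in enumerate(items) for b in items[i + 1:]]
    return sum(dissimilarity(a, b) for a, b in pairs) / len(pairs)
```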
  18. Thank you for your attention!
      Questions?
  19. Announcement: NRS 2013
      ► International News Recommender Systems Workshop and Challenge
      ► In conjunction with ACM RecSys 2013
      IMPORTANT DATES
         – July 21, 2013: paper submission deadline
         – July 1, 2013: data set release
         – August 15, 2013: on-line challenge kick-off
      HIGHLIGHTS
         – Access to a real recommender system
         – Real-time requirements
         – Big Data
         – Cross-domain
         – Implicit feedback
      Website: https://sites.google.com/site/newsrec2013/home
      Twitter: @NRSws2013
  20. Competence Center Information Retrieval & Machine Learning
      www.dai-labor.de
      Benjamin Kille, Researcher / PhD student
      benjamin.kille@dai-labor.de
      Fon: +49 (0) 30 / 314 – 74 128
      Fax: +49 (0) 30 / 314 – 74 003
      DAI-Labor, Technische Universität Berlin
      Fakultät IV – Elektrotechnik & Informatik
      Sekretariat TEL 14, Ernst-Reuter-Platz 7, 10587 Berlin, Germany