Increasing Diversity Through Furthest Neighbor-Based Recommendation
Presentation Transcript

  • 1. Increasing Diversity Through Furthest Neighbor-Based Recommendation. Alan Said, Benjamin Kille, Brijnesh J. Jain, Sahin Albayrak
  • 2. Agenda: Problem; Approach: k Furthest Neighbor; Experimental settings; Results; Conclusions; Discussion
  • 3. Problem: Missing diversity. Recommendations are accurate; however, all recommended items appear similar.
  • 4. Problem: Missing diversity (source: imdb.com)

    Movie                                          Intersection: Actors   Intersection: Plot
    Harry Potter and the Chamber of Secrets        49.0%                  32.3%
    Harry Potter and the Prisoner of Azkaban       52.9%                  26.7%
    Harry Potter and the Goblet of Fire            39.2%                  29.2%
    Harry Potter and the Order of the Phoenix      49.0%                  16.8%
    Harry Potter and the Half-Blood Prince         41.2%                  15.5%
    Harry Potter and the Deathly Hallows: Part 1   43.1%                  16.8%
    Harry Potter and the Deathly Hallows: Part 2   49.0%                  22.4%
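As an illustration of how such attribute overlap can be quantified, here is a minimal Python sketch. The cast sets and the intersection measure (share of one movie's attributes also present in a reference movie) are assumptions for illustration only, not the IMDb data behind the table above.

```python
# Minimal sketch: quantifying how similar two items look based on shared
# attributes (e.g. cast lists). The attribute sets below are hypothetical
# placeholders, and the intersection measure is an assumption.

def attribute_overlap(reference: set, other: set) -> float:
    """Share of `other`'s attributes that also appear in `reference`."""
    if not other:
        return 0.0
    return len(reference & other) / len(other)

# Hypothetical cast fragments for illustration only.
cast_reference = {"Daniel Radcliffe", "Emma Watson", "Rupert Grint", "Alan Rickman"}
cast_other = {"Daniel Radcliffe", "Emma Watson", "Rupert Grint", "Michael Gambon"}

print(f"Actor intersection: {attribute_overlap(cast_reference, cast_other):.1%}")
```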
  • 5. Desired features of recommendations: reflect a user's preferences, correct ranking, novelty, serendipity. Idea: combining orthogonal recommendations to increase diversity.
  • 6. k Furthest Neighbor (diagram contrasting liked and disliked items)
  • 7. k Furthest Neighbor (diagram)
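The transcript only names the approach, so the following is a minimal Python sketch of one plausible reading of k Furthest Neighbor: select the k users least similar to the target user and score items by how strongly those users dislike them. The Pearson variant and the inverted-rating weighting are assumptions, not the authors' exact formulation.

```python
# Minimal kFN sketch, assuming the "enemy of my enemy is my friend" reading
# of the slides: find the k users *least* similar to the target user and
# recommend items those users dislike. The concrete weighting below is an
# assumption, not the authors' exact formula.

import numpy as np

def pearson(u: np.ndarray, v: np.ndarray) -> float:
    """Pearson correlation over co-rated items (0 means 'unrated')."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    cu, cv = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.linalg.norm(cu) * np.linalg.norm(cv)
    return float(cu @ cv / denom) if denom else 0.0

def k_furthest_neighbors(ratings: np.ndarray, user: int, k: int) -> np.ndarray:
    """Indices of the k users with the lowest similarity to `user`."""
    sims = np.array([pearson(ratings[user], ratings[v]) if v != user else np.inf
                     for v in range(ratings.shape[0])])
    return np.argsort(sims)[:k]

def recommend_kfn(ratings: np.ndarray, user: int, k: int = 3, n: int = 5):
    """Score unseen items by how strongly the furthest neighbors dislike them."""
    neighbors = k_furthest_neighbors(ratings, user, k)
    max_rating = ratings.max()
    scores = np.zeros(ratings.shape[1])
    for v in neighbors:
        rated = ratings[v] > 0
        # Invert the neighbor's ratings: their lowest-rated items score highest.
        scores[rated] += (max_rating + 1) - ratings[v][rated]
    scores[ratings[user] > 0] = -np.inf  # never re-recommend already-rated items
    return np.argsort(-scores)[:n]

# Toy user-item matrix (rows: users, columns: items, 0 = unrated).
R = np.array([[5, 4, 0, 1, 0],
              [1, 2, 5, 4, 0],
              [0, 1, 4, 5, 3],
              [5, 5, 1, 0, 2]], dtype=float)
print(recommend_kfn(R, user=0, k=2, n=2))
```

Swapping `pearson` for a cosine similarity over the rating vectors gives the kFN-cosine variant listed on the next slide; taking the k most similar users instead recovers ordinary kNN.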
  • 8. Experimental settings. Data set: 1 million ratings randomly sampled from MovieLens (10M100K); excluded the 100 most popular movies (by rating frequency) and users with fewer than 40 ratings, leaving 44,214 users and 9,432 movies. Approaches: kNN Pearson, kNN cosine, kFN Pearson, kFN cosine. Evaluation: precision@N, recall@N, overlap, with N ∈ {5, 10, 25, 50, 100, 200}.
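A minimal sketch of the cutoff metrics named above, assuming a standard hold-out protocol in which each user's withheld test items form the relevant set; the transcript itself only lists the metrics and the cutoffs N.

```python
# Minimal sketch of precision@N and recall@N for one user.
# The hold-out protocol and the toy data are assumptions.

def precision_at_n(recommended: list, relevant: set, n: int) -> float:
    """Fraction of the top-N recommendations that are relevant."""
    return sum(item in relevant for item in recommended[:n]) / n

def recall_at_n(recommended: list, relevant: set, n: int) -> float:
    """Fraction of the relevant items that appear in the top-N."""
    if not relevant:
        return 0.0
    return sum(item in relevant for item in recommended[:n]) / len(relevant)

# Toy example: a ranked recommendation list and a user's held-out test items.
ranked = [12, 7, 33, 5, 19, 42, 8, 2, 90, 11]
held_out = {7, 42, 64}

for n in (5, 10, 25, 50, 100, 200):
    print(n, precision_at_n(ranked, held_out, n), recall_at_n(ranked, held_out, n))
```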
  • 9. Results I: Precision@N

    N                    5        10       25       50       100      200
    Pearson Similarity   0.0007   0.0110   0.0170   0.0280   0.0410   0.0900
    Cosine Similarity    0.0050   0.0070   0.0160   0.0270   0.0570   0.0000
  • 10. Results II: Recall@N

    N                    5        10       25       50       100      200
    Pearson Similarity   0.0080   0.0130   0.0210   0.2300   0.0140   0.0100
    Cosine Similarity    0.0020   0.0060   0.0070   0.0060   0.0050   0.0040
  • 11. Results III: Overlap (figure)
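The overlap figure is not reproduced in the transcript, but a plausible reading of the measure is the share of items two recommenders place in the same top-N list; the sketch below assumes that definition (normalising the intersection by N).

```python
# Minimal sketch of an overlap@N measure between two recommenders
# (e.g. kNN and kFN). The normalisation by N is an assumption; the
# transcript does not give the exact definition.

def overlap_at_n(list_a: list, list_b: list, n: int) -> float:
    """Share of the top-N positions covered by both lists."""
    return len(set(list_a[:n]) & set(list_b[:n])) / n

# Toy top-10 lists from two hypothetical recommenders.
knn_top = [3, 8, 15, 1, 9, 22, 4, 30, 7, 12]
kfn_top = [40, 8, 51, 2, 9, 60, 17, 3, 77, 28]

print(overlap_at_n(knn_top, kfn_top, 10))  # 0.3: three shared items
```

A low overlap between the kNN and kFN lists would support the "orthogonal recommendations" point in the conclusion that follows.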
  • 12. Conclusion. "The enemy of my enemy is my friend" seems to hold in the context of recommender systems. kFN achieved worse precision; kFN provided higher recall with N > 50; kFN did provide orthogonal recommendations.
  • 13. Thanks for your attention! http://recsyswiki.com
  • 14. Contact: Benjamin Kille, Researcher, Competence Center Information Retrieval & Machine Learning. +49 (0) 30 / 314 – 74 128, +49 (0) 30 / 314 – 74 003, benjamin.kille@dai-labor.de
  • 15. Discussion. How to optimize the approach? Are there other ways to introduce more diverse recommendations? How to evaluate diversity in the context of recommender systems?