RecSysChallenge Opening

The opening slides for the Recommender Systems Challenge at RecSys 2012.



1. Recommender Systems Challenge
   ACM RecSys 2012, Dublin, September 13, 2012
2. Organizers
   ● Nikos Manouselis - Agro-Know & ARIADNE Foundation
   ● Jannis Hermanns - Moviepilot - @jannis
   ● Alan Said - PhD student @ TU Berlin - @alansaid
   ● Katrien Verbert - KU Leuven
   ● Domonkos Tikk - CEO @ Gravity R&D - @domonkostikk
   ● Hendrik Drachsler - Open University, The Netherlands
   ● Benjamin Kille - PhD student @ TU Berlin - @bennykille
   ● Kris Jack - Mendeley
3. The Challenge - 2 tracks
   CAMRa
   ● Previously: 2010 & 2011
   ● Finding users to recommend a movie to
   ● moviepilot.com data
   ● live evaluation
   ● camrachallenge.com
   ● ~60 participants
   ● 1 submitted paper
   ScienceRec
   ● First time
   ● Novel algorithms, visualisations, services for paper recommendation
   ● Mendeley data (3 datasets)
   ● Several requested data
   ● 4 submitted papers
4. What went wrong?
   ● Initial results indicate that the RecSys Challenge was not successful
     ○ measurable result: 5 submissions, 2 accepted papers + 1 accepted presentation/talk
   ● Several issues encountered
     ○ “we downloaded the dataset but could not run extensive simulations because it was difficult to process”
     ○ “we wanted to combine the dataset with live data from the platform but we didn’t have enough user info”
     ○ “we used different datasets than the ones suggested because they were easier to access/use”
     ○ too diverse tracks
     ○ unawareness / difficulty in spreading information about the challenge
5. What went right? Why are we all here?
   ● finding datasets to experiment with (especially from live, industrial systems) instead of working with the old "favorites"
   ● learning how existing algorithms can be reused (extended, adapted, evolved) instead of coding from scratch
   ● finding how our algorithm (unique, novel, amazing, the best) can be contributed to the community
   ● conceiving/designing/deploying a great recommendation service
   ● making a business case out of our algorithm/service
   ● (becoming rich/famous/...)
6. The real challenge
   How to make such contests work, being also useful for...
   ● ...the data publisher [insight into what can/cannot be done with their data]
   ● ...the research community [insight into new algorithms, approaches, services + contributions to existing frameworks/libraries]
   ● ...the deployed platform [insight into new services that could work better / be more useful]
   ● ...everyone [create publicity/awareness]
7. Our Workshop
   ● Follows a simple structure similar to how you would participate in a challenge
     ○ Available Data Sets
     ○ Existing Algorithms/Frameworks
     ○ New Investigated Methods
     ○ Prototyped and/or Deployed Services
8. Program
   09:00–09:15 Welcome & intro
   09:15–10:00 Working with Data
   ● The MovieLens dataset – Michael Ekstrand
   ● Mendeley’s data and perspective on data challenges – Kris Jack
   ● Processing Rating Datasets for Recommender Systems’ Research: Preliminary Experience from two Case Studies – Giannis Stoitsis, George Kyrgiazos, Georgios Chinis, Elina Megalou
   10:00–10:30 Algorithms & Experiments
   ● Usage-based vs. Citation-based Methods for Recommending Scholarly Research Articles – André Vellino
   ● Cross-Database Recommendation Using a Topical Space – Atsuhiro Takasu, Takeshi Sagara, Akiko Aizawa
   11:00–12:30 Real Use
   ● From a toolkit of recommendation algorithms into a real business: the Gravity R&D experience – Domonkos Tikk
   ● Selecting algorithms from the plista contest to deliver plista’s ads and editorial content on premium publisher’s websites – Torben Brodt
   ● Mendeley Suggest: engineering a personalised article recommender system – Kris Jack
   12:30–14:30 Lunch break
   14:30–15:30 Frameworks, Libs & APIs
   ● Hands-on Recommender System Experiments with MyMediaLite – Zeno Gantner
   ● Using Apache’s Mahout and Contributing to it – Sebastian Schelter
   ● Flexible Recommender Experiments with Lenskit – Michael Ekstrand
   15:30–17:30 Hands-on work
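
For readers new to the hands-on afternoon, the sketch below shows the kind of minimal experiment the "Frameworks, Libs & APIs" talks build on: a user-based collaborative filtering recommender using Apache Mahout's Taste API (Mahout 0.x, as of 2012). The ratings file name, neighbourhood size, and user ID are illustrative assumptions, not part of the workshop material.

    import java.io.File;
    import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
    import org.apache.mahout.cf.taste.impl.neighborhood.NearestNUserNeighborhood;
    import org.apache.mahout.cf.taste.impl.recommender.GenericUserBasedRecommender;
    import org.apache.mahout.cf.taste.impl.similarity.PearsonCorrelationSimilarity;
    import org.apache.mahout.cf.taste.model.DataModel;
    import org.apache.mahout.cf.taste.neighborhood.UserNeighborhood;
    import org.apache.mahout.cf.taste.recommender.RecommendedItem;
    import org.apache.mahout.cf.taste.recommender.Recommender;
    import org.apache.mahout.cf.taste.similarity.UserSimilarity;

    public class UserBasedExample {
        public static void main(String[] args) throws Exception {
            // "ratings.csv" is an assumed MovieLens-style file: userID,itemID,rating per line
            DataModel model = new FileDataModel(new File("ratings.csv"));

            // Classic user-based CF: Pearson similarity over a 10-nearest-neighbour
            // neighbourhood (the neighbourhood size is chosen only for illustration)
            UserSimilarity similarity = new PearsonCorrelationSimilarity(model);
            UserNeighborhood neighborhood = new NearestNUserNeighborhood(10, similarity, model);
            Recommender recommender = new GenericUserBasedRecommender(model, neighborhood, similarity);

            // Top-5 recommendations for user 1 (assumed to exist in the data)
            for (RecommendedItem item : recommender.recommend(1L, 5)) {
                System.out.println(item.getItemID() + " : " + item.getValue());
            }
        }
    }

Swapping the similarity measure, the neighbourhood size, or the data model is the kind of variation the hands-on session leaves room for; MyMediaLite and LensKit offer comparable entry points in their own APIs.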
