10. What is the puzzle of recommendations?
Puzzle: Understand the listener's preferences and help her find and discover music she likes.
Foundational problem: How do we measure success? Without an objective metric, how do we conduct scientific research?
11. This is not a new observation
ISMIR 2001 Resolution: "There is a current need for metrics to evaluate the performance and accuracy of the various approaches and algorithms being developed within the Music Information Retrieval research community."
Herlocker et al., "Evaluating Collaborative Filtering Recommender Systems" (2004): "Each algorithmic approach has adherents who claim it to be superior for some purpose. Clearly identifying the best algorithm for a given purpose has proven challenging, in part because researchers disagree on which attributes should be measured, and on which metrics should be used for each attribute. Researchers who survey the literature will find over a dozen quantitative metrics and additional qualitative evaluation techniques."
15. The (abbreviated) History of MIR
- The dawn of time (1960): "Hmm, it's hard to file LPs in the card catalog"
- The start of infinite music (2001): Learned how to stop making bad recs
- The golden age of Mark Zuckerberg house parties (2005)
- Dotcoms are cool again (2006): The Age of Good Recs
- Crisis? The Dark Arts Age
- The Future (according to Gartner/Jupiter, always happening in present + 3 years): The Wall of Really Good Recs (no David Gilmour)
22. The biggest problem in music science today: Can we get this guy to stop ruining every show we go to?
23. Or, Option #2: Seeing as this presentation is the very first session at the very first workshop on the very first day of RecSys 2010, we could…
24. … adjourn to here for the rest of the week.