Music discovery on the net

  1. Music discovery on the net. Barcamp3, Berlin. Petar Djekic, October 18th, 2008
  2. From phonograph to widgets. [Timeline chart, 1890 to the 2000s: music playback moves from the phonograph and radio through the HiFi system, car audio, TV, and portable players to the PC, the web, mobile, and widgets.] Source: own, Wikipedia ’08
  3. Yet still.. „iPod classic can hold up to 30,000 songs“ „There are an average of 700 songs stored on a U.S. music downloader’s player.“ „Average MP3 player only 57% full“ Source: Apple 2008, Forrester Research 2008, IPSOS 2006
  4. Music Discovery. “The only bad thing about MySpace is that there are 100,000 bands and no filtering. I try to find the bands I might like but often I just get tired of looking.” 15-year-old student, IFPI focus group research, July 2007. “A wealth of information creates a poverty of attention.” Herbert A. Simon, Nobel prize winning economist
  5. Music Discovery: Many places, similar technologies
  6. Recommendation technologies: Overview
     • Human behaviour: recommendations are based on behaviour, e.g., collaborative filtering using listening or purchase habits (a minimal sketch of this idea follows after the deck)
     • Human annotation: recommendations are based on annotations and expertise, e.g., ratings, tags, classification into genres, editorial content
     • Content analysis: recommendations are based on characteristics of the content itself, e.g., sound density, vocals, tempo, sound color, instruments, volume, dynamics
  7. „Freakomendations“: Variety. Source: audiobaba
  8. „Freakomendations“: Manipulation. Source: Paul Lamere
  9. „Freakomendations“: Cold-start. Source: iTunes Genius
  10. „Freakomendations“: Relevance
  11. Recommendation technologies: Issues
     • Relevance: How well does the content suit my taste? How about mood and expectations?
     • Variety: Variety of recommendations (the Beatles problem); connection between variety and available content (a re-ranking sketch follows after the deck)
     • Scalability: Indexing of existing content libraries and new releases (cold starts)
     • Privacy: Who owns YOUR data?
     • Objectivity: Manipulation of rankings, consistency of recommendations
     • Explanation: Why was something recommended?
     • Portability: How about mobile devices and MP3 players?
  12. Mash it up now! <resources>
     Human annotation/behaviour
     • MusicBrainz: similar artists, tags, meta data, CC/PD license
     • Yahoo! Music: similarities, charts, ratings, meta data, REST webservice, max. 5000 queries/day
     • similarities, tags, ratings, meta data, REST webservice, free for non-commercial use
     Content analysis
     • Echo.nest: sound analysis, recommendations, custom HTTP webservice
     • audiobaba: similarities, custom HTTP webservice, max. 1 query/sec
     (a sketch of such a web-service call follows after the deck)
  13. Mash it up now! <resources>
     Matching
     • Identifier: MusicBrainz, ISRC, All Music Guide
     • Meta data: G’n’R, GunsNRoses, Guns N’ Roses… (a name-matching sketch follows after the deck)
     • Acoustic fingerprints: standards?
     Full-track
     • Youtube
     • Imeem Media Platform, yahoo
     • Seeqpod, skreemr
     • Radio stream
  14. Recommendations, again
     Books
     • David Jennings (2006), Net, Blogs, and Rock’n’Roll
     • David Huron (2008), Sweet Anticipation: Music and the Psychology of Expectation
     Papers
     • Kim, J., and Belkin, N. J. (2002), Categories of music description and search terms and phrases used by non-music experts
     • Tintarev, N. (2007), A Survey of Explanations in Recommender Systems, TintarevMasthoffICDE07.pdf
     • Mobasher, B. et al. (2007), Trustworthy Recommender Systems: An Analysis of Attack Models and Algorithm Robustness
     Conferences
     • The International Conferences on Music Information Retrieval and Related Activities (ISMIR)
     • ACM Recommender Systems (RecSys)
     Blogs
     • Duke Listens!
  15. Thank you! @polyano
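
Slide 6 describes behaviour-based recommendation (collaborative filtering) only at a high level. The sketch below is a minimal, illustrative Python version of the idea: listeners with similar play-count profiles "vote" for artists the target user has not heard yet. The listener names and play counts are invented for the example.

```python
# Behaviour-based recommendation (collaborative filtering), reduced to a toy:
# listeners with similar play-count profiles vote for artists the target user
# has not heard yet. Listener names and play counts are invented.
from math import sqrt

# listener -> {artist: play count}
plays = {
    "alice": {"Radiohead": 42, "Portishead": 17, "Beatles": 5},
    "bob":   {"Radiohead": 30, "Beatles": 25, "Oasis": 12},
    "carol": {"Portishead": 20, "Massive Attack": 18, "Radiohead": 9},
}

def cosine(a, b):
    """Cosine similarity between two sparse play-count vectors."""
    dot = sum(a[k] * b[k] for k in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def recommend(user, k=3):
    """Rank artists the user has not played yet, weighted by listener similarity."""
    scores = {}
    for other, profile in plays.items():
        if other == user:
            continue
        sim = cosine(plays[user], profile)
        for artist, count in profile.items():
            if artist not in plays[user]:
                scores[artist] = scores.get(artist, 0.0) + sim * count
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice"))  # artists Alice has not played, most strongly "voted for" first
```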
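
Slide 11 names the variety issue (the Beatles problem): the most relevant items are often near-duplicates, so a plain top-N list keeps recommending the same thing. One common remedy, sketched here with invented scores and genre tags, is to greedily re-rank candidates so that each pick trades relevance against similarity to what has already been picked.

```python
# The "Beatles problem": the most relevant items are often near-duplicates, so a
# plain top-N list lacks variety. This greedy re-ranking trades relevance against
# similarity to items already picked. Scores and genre tags are invented.
candidates = {  # item -> (relevance score, tag set)
    "Beatles - Help!":      (0.95, {"60s", "rock", "beatles"}),
    "Beatles - Abbey Road": (0.94, {"60s", "rock", "beatles"}),
    "Beatles - Revolver":   (0.93, {"60s", "rock", "beatles"}),
    "The Kinks":            (0.80, {"60s", "rock"}),
    "Portishead":           (0.60, {"90s", "triphop"}),
}

def jaccard(a, b):
    """Tag-set overlap, used here as a crude similarity measure."""
    return len(a & b) / len(a | b) if a | b else 0.0

def rerank(items, n=3, trade_off=0.7):
    """Greedily pick n items; each pick balances relevance against similarity to previous picks."""
    remaining = dict(items)
    picked = []
    while remaining and len(picked) < n:
        def value(name):
            relevance, tags = remaining[name]
            penalty = max((jaccard(tags, items[p][1]) for p in picked), default=0.0)
            return trade_off * relevance - (1 - trade_off) * penalty
        best = max(remaining, key=value)
        picked.append(best)
        del remaining[best]
    return picked

print(rerank(candidates))  # mixes in Portishead and The Kinks instead of three Beatles albums
```

With trade_off = 0.7 the re-ranked list mixes other artists into the result instead of returning three Beatles albums; raising trade_off moves it back toward pure relevance.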
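
Slide 12 lists web services (MusicBrainz, Yahoo! Music, Echo.nest, audiobaba) that expose similarity data over HTTP. The sketch below shows the general shape of such a mash-up call; the endpoint URL, parameter names, and JSON layout are placeholders, not the real API of any of those services, and the throttling only mirrors the kind of rate limit the slide mentions.

```python
# Mash-up sketch: ask a similarity web service for artists related to a seed
# artist. The endpoint URL, query parameters, and JSON layout below are
# placeholders, not the real API of MusicBrainz, Yahoo! Music, Echo.nest, or
# audiobaba; check each service's documentation, query limits, and licence.
import json
import time
import urllib.parse
import urllib.request

BASE_URL = "https://api.example.com/similar"  # placeholder endpoint
MIN_INTERVAL = 1.0  # seconds between calls, in the spirit of "max. 1 query/sec"

_last_call = 0.0

def similar_artists(artist):
    """Fetch artists similar to `artist`, throttled to at most one call per MIN_INTERVAL."""
    global _last_call
    wait = MIN_INTERVAL - (time.time() - _last_call)
    if wait > 0:
        time.sleep(wait)
    url = BASE_URL + "?" + urllib.parse.urlencode({"artist": artist, "format": "json"})
    with urllib.request.urlopen(url, timeout=10) as response:
        payload = json.load(response)
    _last_call = time.time()
    # assumed response shape: {"similar": [{"name": "...", "score": 0.87}, ...]}
    return [(entry["name"], entry["score"]) for entry in payload.get("similar", [])]

if __name__ == "__main__":
    for name, score in similar_artists("Portishead"):
        print(f"{score:.2f}  {name}")
```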
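
Slide 13 points at the metadata-matching problem: the same artist shows up as "G'n'R", "GunsNRoses", "Guns N' Roses", and so on. Below is a hedged sketch of one simple approach using only the standard library: normalize the string, look it up in a hand-maintained alias table, and fall back to fuzzy matching. The alias table and cutoff are invented for illustration.

```python
# Metadata matching: collapse spelling variants of the same artist name onto a
# canonical form via normalization, a hand-maintained alias table (invented here),
# and a fuzzy fallback from the standard library.
import difflib
import re

CANONICAL = {
    "guns n roses": "Guns N' Roses",
    "gnr": "Guns N' Roses",          # hand-maintained alias (assumed)
    "beatles": "The Beatles",
    "the beatles": "The Beatles",
}

def normalize(name):
    """Lowercase, drop punctuation, collapse whitespace."""
    name = re.sub(r"[^a-z0-9 ]+", " ", name.lower())
    return re.sub(r"\s+", " ", name).strip()

def match(name, cutoff=0.6):
    """Map a raw metadata string to a canonical artist name, or None if nothing is close enough."""
    key = normalize(name)
    if key in CANONICAL:
        return CANONICAL[key]
    close = difflib.get_close_matches(key, CANONICAL.keys(), n=1, cutoff=cutoff)
    return CANONICAL[close[0]] if close else None

for raw in ["G'n'R", "GunsNRoses", "Guns N' Roses", "Beatles"]:
    print(raw, "->", match(raw))
```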