USING MASHUP TECHNOLOGY TO IMPROVE FINDABILITY

Sten Govaerts
Promotor: Erik Duval
Co-promotor: Katrien Verbert
OVERVIEW

• Research outline

• Music

• Technology Enhanced Learning

• Publication list

• Further planning
FINDABILITY

Findability is the ability of users to identify an appropriate website and navigate the pages of the site to discover and retrieve relevant information resources.

Peter Morville (2005)
MASHUPS

• mashups in music: re-mixing multiple existing songs to create a new one.

• a mashup is an application that combines data from multiple online sources to create a new result that was not the original intent of the data.

• data is key!

  • tweaking and enriching data is important

  • interesting data makes an interesting mashup
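As a toy illustration of the definition above (a sketch only; the endpoints and field names are invented, not part of the rockanango software), a mashup in code amounts to fetching data from several online sources and combining it into a result neither source offers on its own:

    # Minimal mashup sketch: combine two (hypothetical) online JSON sources.
    import json
    from urllib.request import urlopen

    def fetch_json(url):
        """Download and parse a JSON document."""
        with urlopen(url) as response:
            return json.load(response)

    def artist_mashup(artist):
        """Merge biography and concert data about one artist into a single
        record that neither source provides on its own."""
        bio = fetch_json(f"https://api.example-bio.org/artist/{artist}")        # hypothetical endpoint
        gigs = fetch_json(f"https://api.example-gigs.org/upcoming?q={artist}")  # hypothetical endpoint
        return {
            "artist": artist,
            "origin": bio.get("country"),
            "genres": bio.get("genres", []),
            "next_concerts": [event["venue"] for event in gigs.get("events", [])],
        }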
THE ROCKANANGO PROJECT
SCOPE.

• roots in HORECA.

  • how does a bartender select his music?

  • how does an expert select his music?

• making the expert data accessible in a usable way for a bartender

A musical context is a musical description for situations based on atmospheres and musical properties.
METADATA SCHEMAS
Corthaut, Nik; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Connecting the dots: music metadata
generation, schemas and applications, Bello, Juan Pablo; Chew, Elaine; Turnbull, Douglas (eds.), ISMIR,
Philadelphia, USA, 14-18 September 2008, Proceedings of the 9th International Conference on Music
Information Retrieval, pages 249-254
[Diagram] A context composed of two subcontexts:

• subcontext A (75): songs with subgenre(easy listening OR pop café) + mood(relax OR tasteful OR stylish) + popularity(5 UNTIL 7)

• subcontext B (25): songs with genre(pop) + mood(intimate OR romantic OR sensual) + popularity(6 UNTIL 7)
Govaerts, Sten; Corthaut, Nik; Duval, Erik. Moody tunes: the rockanango project, Lemström, Kjell;
Tindale, Adam; Dannenberg, Roger (eds.), International Conference on Music Information Retrieval,
ISMIR, Victoria, BC, 8-12 October 2006, pages 308-313, University of Victoria
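The diagram above can be read as a simple data model: a context is a weighted mix of subcontexts, each selecting songs via metadata filters. A minimal sketch of that reading (the classes and field names are assumptions for illustration, not the Aristo schema):

    from dataclasses import dataclass, field

    @dataclass
    class SubContext:
        share: int                                   # portion of the playlist drawn from this subcontext (e.g. 75 vs. 25)
        filters: dict = field(default_factory=dict)  # property -> allowed values or predicate

        def matches(self, song):
            """A song matches when every filter is satisfied."""
            for prop, allowed in self.filters.items():
                value = song.get(prop)
                if callable(allowed):
                    if not allowed(value):
                        return False
                elif value not in allowed:
                    return False
            return True

    # The example context from the diagram: a 75/25 mix of two subcontexts.
    context = [
        SubContext(75, {"subgenre": {"easy listening", "pop café"},
                        "mood": {"relax", "tasteful", "stylish"},
                        "popularity": lambda p: p is not None and 5 <= p <= 7}),
        SubContext(25, {"genre": {"pop"},
                        "mood": {"intimate", "romantic", "sensual"},
                        "popularity": lambda p: p is not None and 6 <= p <= 7}),
    ]

    song = {"genre": "pop", "mood": "romantic", "popularity": 6}
    print([sc.share for sc in context if sc.matches(song)])   # -> [25]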
APPLICATIONS.

• HORECA (+1000 pubs & restaurants)

• Retail (Fun, Prémaman, Carrefour, ...)

• Banks (Dexia & Fortis)

• Webradio (HUMO, TMF, Mars, Overtoom, ...)
GENERATE THE METADATA

• from different sources:

  • the audio signal
  • web sources
  • the Aristo database
  • attention metadata

• using our metadata generation framework: SamgI
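Purely to illustrate the idea of merging several metadata sources per track, a hedged sketch (the generator functions and their outputs are placeholders, not the SamgI API):

    def from_audio_signal(track):
        return {"tempo": 120}        # e.g. features extracted from the signal

    def from_web_sources(track):
        return {"genre": "pop"}      # e.g. information harvested from the web

    def from_aristo_db(track):
        return {"mood": "relax"}     # expert annotations from the Aristo database

    def from_attention_metadata(track):
        return {"play_count": 42}    # usage statistics

    GENERATORS = [from_audio_signal, from_web_sources, from_aristo_db, from_attention_metadata]

    def generate_metadata(track):
        """Run every generator and merge the results; in this simplistic sketch
        a later source overrides an earlier one on conflicting keys."""
        metadata = {}
        for generator in GENERATORS:
            metadata.update(generator(track))
        return metadata

    print(generate_metadata({"title": "example"}))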
ORIGIN OF AN ARTIST
METADATA GENERATION: COUNTRY & CONTINENT

• why is it useful?

  • subgenres
  • popularity
  • recommendations

• expensive to annotate

• very little existing research

• very hard with signal processing
IN THE BACKGROUND...

[Diagram] The mashup glues together: google maps, freebase, last.fm, google app engine, twitter, yahoo! pipes, last on am/fm, a website, dapper and youtube.
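These services are combined to guess where an artist comes from. The actual algorithm is in the ISMIR 2009 paper cited later; as a rough illustration of the aggregation idea only (the per-source lookups below are stubs, not the real services wired together above):

    from collections import Counter

    # Stubbed per-source lookups; each returns a candidate country or None.
    def guess_from_encyclopedia(artist):
        return "Belgium" if artist == "dEUS" else None

    def guess_from_social_tags(artist):
        return "Belgium" if artist == "dEUS" else None

    def guess_from_band_website(artist):
        return None   # this source does not know the artist

    SOURCES = [guess_from_encyclopedia, guess_from_social_tags, guess_from_band_website]

    def origin_of_artist(artist):
        """Majority vote over all sources that returned an answer."""
        votes = Counter(country for country in (source(artist) for source in SOURCES) if country)
        return votes.most_common(1)[0][0] if votes else None

    print(origin_of_artist("dEUS"))   # -> Belgium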
Coverage
http://www.cs.kuleuven.be/~sten/lastonamfm
LAST.FM MASHUP




http://www.cs.kuleuven.be/~sten/lastonamfm
FUTURE

• follow-up research by Markus Schedl
• want to test our algorithm on his data set

Govaerts, Sten; Duval, Erik. A Web-based approach to determine the origin of an artist, ISMIR, Kobe, Japan, 26-30 October 2009, Proceedings of ISMIR2009: 10th International Society for Music Information Retrieval Conference, pages 261-266, ISMIR-The International Society for Music Information Retrieval
CLASSIFY BY SEARCH ENGINE
ONE APPROACH...

• can classify Genre and more!

• M. Schedl, T. Pohle, P. Knees, G. Widmer, “Assigning and Visualizing Music Genres by Web-based Co-occurrence Analysis”, Proceedings of the 7th International Conference on Music Information Retrieval, 2006, pp. 260-265.

• G. Geleijnse, J. Korst, "Web-based Artist Categorization", Proceedings of the 7th International Conference on Music Information Retrieval, 2006, pp. 266-271.
CLASSIFICATION WITH SEARCH ENGINES

using co-occurrence

Artist + Genre + Schema
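In the spirit of the co-occurrence approach of Schedl et al. cited above, a genre score can be estimated from how often the artist and the genre term co-occur in search results, normalised by the artist's own page count. A hedged sketch (page_count is a placeholder for a real search engine API, and the exact normalisation in the published method may differ):

    GENRES = ["rock", "jazz", "blues", "pop", "country", "metal"]

    def genre_scores(artist, page_count, schema='"music" "genre"'):
        """Relative co-occurrence of the artist with every genre term.
        `page_count(query)` must return the engine's result count for `query`."""
        artist_hits = page_count(f'"{artist}" {schema}')
        scores = {}
        for genre in GENRES:
            hits = page_count(f'"{artist}" "{genre}" {schema}')
            scores[genre] = hits / artist_hits if artist_hits else 0.0
        return scores

    def classify(artist, page_count):
        """Pick the genre with the highest co-occurrence score."""
        scores = genre_scores(artist, page_count)
        return max(scores, key=scores.get)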
[Example co-occurrence scores]

Rock: 0,013     Jazz: 0,013
Blues: 0,009    Pop: 0,015
Country: 0,009  Metal: 0,005
RESULTS

• 1st results were much worse

• what happened?

• re-run the original experiment

  • evaluate on the same data set: 1995 artists and 9 genres.

• different search engines: Google, Yahoo! and Live! Search.

• over time: 8 times over a period of 36 days.
WHAT TO USE?

• use Google when it’s stable, else rely on Yahoo!

• when is it stable? test with a small set

  • some artists get classified incorrectly on bad days

  • compare the accuracy achieved with the test set to the average.
  Govaerts, Sten; Corthaut, Nik; Duval, Erik. Using search engine for classification: does it still
  work?, AdMIRe: International Workshop on Advances in Music Information Research 2009,
  San Diego, USA, 14-16 December 2009, Proceedings of AdMIRe: International Workshop on
  Advances in Music Information Research 2009, pages 483-488, IEEE
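One possible reading of the "test with a small set" heuristic above, as a sketch (the tolerance value and the classifier interfaces are assumptions):

    def accuracy(classify, test_set):
        """Fraction of (artist, genre) pairs the classifier gets right."""
        correct = sum(1 for artist, genre in test_set if classify(artist) == genre)
        return correct / len(test_set)

    def pick_engine(classify_google, classify_yahoo, test_set, average_accuracy, tolerance=0.05):
        """Use Google while its accuracy on a small reference set stays close to its
        historical average; on a 'bad day' fall back to Yahoo!."""
        if accuracy(classify_google, test_set) >= average_accuracy - tolerance:
            return classify_google
        return classify_yahoo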
STUDENT ACTIVITY METER
PROBLEM
OBJECTIVES

• self-monitoring for learners

• awareness for teachers

• time tracking

• learning resource recommendation
STUDENT ACTIVITY MONITOR
3 EVALUATIONS

• with CS students

• with CGIAR courses and teachers

• with Learning and Knowledge Analytics course participants
CS STUDENTS CASE STUDY

• usability and user satisfaction evaluation
• 12 CS students
• 2 evaluation sessions:
  • task-based interview with think aloud (after 1 week of tracking)
  • user satisfaction (SUS & MSDT) (after 1 month)

Govaerts, Sten; Verbert, Katrien; Klerkx, Joris; Duval, Erik. Visualizing activities for self-reflection and awareness, ICWL10: International Conference on Web based Learning, Shanghai, China, 7-11 December 2010, Lecture Notes in Computer Science, volume 6483, pages 91-100, Springer
USABILITY & USER SATISFACTION

• in general, people understand the visualizations well!

• some small issues were uncovered...

• average SUS score: 73% (stdv: 9,35)
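For reference, the SUS percentage reported above follows the standard scoring of the ten-item questionnaire (a generic sketch, not the thesis tooling): odd items contribute (answer - 1), even items (5 - answer), and the sum is scaled by 2.5 to a 0-100 range.

    def sus_score(answers):
        """Standard System Usability Scale score from ten answers on a 1-5 scale."""
        assert len(answers) == 10
        total = sum((a - 1) if i % 2 == 0 else (5 - a) for i, a in enumerate(answers))
        return total * 2.5

    print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))   # -> 75.0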
CGIAR CASE STUDY

[Feedback]
• wants to search for students
• detect outliers
• more details on student
• good indicator for effort
• understand the workload
• more metrics
• use for course design optimization
• obtain course overview
• compare students
• increase awareness
• want better 1 to 1 comparison tool
• progress evolution
LAK CASE STUDY

• open course on learning and knowledge analytics

• visual analytics enthusiasts + experts (who can also teach)
LAK CASE STUDY

[Feedback]
• verify the status of the classroom activity
• more metrics
• chronological course dwell time
• self-reflection to measure progress and increase motivation
• find students experiencing problems and low engagement
• more data
FEDERATED SEARCH AND SOCIAL RECOMMENDATION WIDGET
WHAT’S A WIDGET ?!?
CONTEXT

• Personal Learning Environment:
  • customizable
  • re-use, creation & mashup of tools, resources
• enable users to access content
  • in different contexts
ARCHITECTURE
WIDGET
EXTENDED PAGERANK

[Diagram] A graph of users (Sten, Sandy, Erik) and resources (R1-R5): users are linked to each other by connection/friend edges, and to resources by saved/shared, liked and disliked edges.
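One way to read "extended PageRank" over the graph sketched above is a PageRank variant in which edge weights depend on the relation type. The weights, damping factor and edge list below are illustrative assumptions, not the ranking actually implemented in the widget:

    RELATION_WEIGHTS = {"connection": 1.0, "saved/shared": 1.0, "liked": 0.8, "disliked": 0.2}

    # (source, relation, target) edges, loosely following the diagram above.
    EDGES = [
        ("Sten", "connection", "Sandy"), ("Sandy", "connection", "Erik"),
        ("Sten", "saved/shared", "R1"), ("Sten", "disliked", "R2"),
        ("Sten", "liked", "R3"), ("Sandy", "liked", "R4"), ("Erik", "liked", "R5"),
    ]

    def weighted_pagerank(edges, damping=0.85, iterations=50):
        """PageRank where each out-edge is weighted by its relation type.
        (Dangling nodes simply leak rank in this simplified sketch.)"""
        nodes = {n for s, _, t in edges for n in (s, t)}
        out_weight = {n: 0.0 for n in nodes}
        for s, rel, _ in edges:
            out_weight[s] += RELATION_WEIGHTS[rel]
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
            for s, rel, t in edges:
                if out_weight[s]:
                    new_rank[t] += damping * rank[s] * RELATION_WEIGHTS[rel] / out_weight[s]
            rank = new_rank
        return rank

    print(sorted(weighted_pagerank(EDGES).items(), key=lambda kv: -kv[1])[:3])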
EVALUATION

• 15 PhD students at K.U. Leuven and EPFL.

• What?

  • usability

  • user satisfaction

  • usefulness
FIRST PHASE

• current media search tool: Google & YouTube

• understanding recommendations: 6/15 understood they come from like/dislike
SECOND PHASE

• only 14 participants (one less)
• open questions
• usefulness of recommendations: 11/14 pro.
• user satisfaction: System Usability Scale (SUS) & MS Desirability Toolkit
• SUS score: 66,25%
  • 2 groups: K.U. Leuven: high (75%); EPFL + one K.U. Leuven: low (50%)
WHY THE DIFFERENT SUS?

• 1st phase by 2 interviewers
• issues:

  • distraction from unrelated widgets’ UI updates

  • layout too dense

  • height of widgets too small

• a K.U. Leuven student had prior experience with iGoogle.

• not evaluating the widget but the whole experience...
DESIRABILITY...
FURTHER EVALUATION

• evaluation at a company and university
PUBLICATIONS: MUSIC
•   Govaerts, Sten; Corthaut, Nik; Duval, Erik. Moody tunes: the rockanango project, Lemström, Kjell; Tindale, Adam;
    Dannenberg, Roger (eds.), International Conference on Music Information Retrieval, ISMIR, Victoria, BC, 8-12 October
    2006, International Conference on Music Information Retrieval, ISMIR, pages 308-313, University of Victoria
•   Govaerts, Sten; Corthaut, Nik; Duval, Erik. Mood-ex-machina: towards automation of moody tunes, Dixon, Simon;
    Bainbridge, David; Typke, Rainer (eds.), International Conference on Music Information Retrieval, ISMIR, Vienna, Austria,
    23-27 September 2007, Proceedings of the 8th International Conference on Music Information Retrieval, ISMIR 2007,
    pages 347-350, Österreichische Computer Gesellschaft
•   Corthaut, Nik; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Connecting the dots: music metadata generation, schemas
    and applications, Bello, Juan Pablo; Chew, Elaine; Turnbull, Douglas (eds.), ISMIR, Philadelphia, USA, 14-18 September
    2008, Proceedings of the 9th International Conference on Music Information Retrieval, pages 249-254
•   Corthaut, Nik; Lippens, Stefaan; Govaerts, Sten; Duval, Erik; Martens, Jean-Pierre. The integration of a metadata
    generation framework in a music annotation workflow, ISMIR, Kobe, Japan, 26-30 October 2009, Proceedings of
    ISMIR2009: 10th International Society for Music Information Retrieval Conference, ISMIR-The International Society for
    Music Information Retrieval
•   Govaerts, Sten; Duval, Erik. A Web-based approach to determine the origin of an artist, ISMIR, Kobe, Japan, 26-30
    October 2009, Proceedings of ISMIR2009: 10th International Society for Music Information Retrieval Conference, pages
    261-266, ISMIR-The International Society for Music Information Retrieval
•   Govaerts, Sten; Corthaut, Nik; Duval, Erik. Using search engine for classification: does it still work?, AdMIRe:
    International Workshop on Advances in Music Information Research 2009, San Diego, USA, 14-16 December 2009,
    Proceedings of AdMIRe: International Workshop on Advances in Music Information Research 2009, pages 483-488, IEEE
ISMIR

I think ISMIR was motivated from two directions: the desire for a focus on music indexing, search, and retrieval, which cuts across many disciplines, and a desire for a focused technical and scientific forum for music research.
- Roger Dannenberg

It is not just statistics and computer science (as Wikipedia explains for "bioinformatics") but also many other aspects, including social, musicological, perceptual etc. ones.
- Michael Fingerhut
PUBLICATIONS: TEL

•   Parra Chico, Gonzalo; Govaerts, Sten; Duval, Erik. More! a social discovery tool for researchers, DIR 2010: Dutch-Belgian Information Retrieval Workshop, Nijmegen, the Netherlands, 25 January 2010, DIR 2010: 10th Dutch-Belgian Information Retrieval Workshop
•   Renzel, D.; Hobelt, C.; Dahrendorf, D.; Friedrich, M.; Modritscher, F.; Verbert, Katrien; Govaerts, Sten; Palmer, M.; Bogdanov, E. Collaborative development of a PLE for language learning, International Journal of Emerging Technologies in Learning, volume 5, 2010
•   Govaerts, Sten; Verbert, Katrien; Klerkx, Joris; Duval, Erik. Visualizing activities for self-reflection and awareness,
    ICWL10: International Conference on Web based Learning, Shanghai, China, 7-11 December 2010, Lecture Notes in
    Computer Science, volume 6483, pages 91-100, Springer
•   Govaerts, Sten; El Helou, Sandy; Duval, Erik; Gillet, Denis. A federated search and social recommendation widget,
    Proceedings of the 2nd International Workshop on Social Recommender Systems (SRS 2011) in conjunction with the
    2011 ACM Conference on Computer Supported Cooperative Work (CSCW 2011), Hangzhou, China, 19-23 March
    2011, pages 1-8.
PUBLICATIONS IN THE PIPELINE

•   Felix Mödritscher, Barbara Krumay, Sten Govaerts, Erik Duval, Sandy El Helou, Denis Gillet, Alexander Nussbaumer, Dietrich Albert, Carsten Ullrich. May I suggest? Three PLE recommender strategies in comparison, PLE Conference 2011, Southampton, UK.
    => ACCEPTED, not a core part of my PhD.
•   Govaerts, Sten; Verbert, Katrien; Duval, Erik. Visualizing student activities for teachers, IEEE Conference on Visual Analytics Science and Technology (IEEE VAST), Providence, USA
    => UNDER REVIEW, notification 8 June.
•   Sten Govaerts, Katrien Verbert, Daniel Dahrendorf, Carsten Ullrich, Manuel Schmidt, Michael Werkle, Arunangsu Chatterjee, Alexander Nussbaumer, Dominik Renzel, Maren Scheffel, Martin Friedrich, Jose Luis Santos, Effie L-C Law, Erik Duval. Towards responsive open learning environments: the ROLE interoperability framework. The 6th European Conference on Technology Enhanced Learning Towards Ubiquitous Learning, Lecture Notes in Computer Science.
    => UNDER REVIEW, notification 31 May.
FUTURE PLANNING

• Ph.D. on papers

• if 2 papers under review are accepted => FINISH.

• potential for writing (a) journal article(s):

  • 0 articles: submit end September

  • 1 article: submit end November

  • 2 articles: submit Xmas.
THANK YOU!
                QUESTIONS?
slides will appear on http://www.slideshare.net/stengovaerts
