Research and Deployment of Analytics in Learning Settings
Slide notes:
  • Microsoft Desirability Toolkit
  • ‘Knowing about collaboration and communication’ (the 3rd row with *) is not addressed by SAM, but was added to check for a possible bias. The highest-rated issues were ‘knowing how much time students spent’ and ‘awareness of what students are doing’. Finding students in trouble and finding the best students were rated rather low. Awareness of resource use has mostly been met, but can be improved by differentiating external resources (the external resource use issue is indecisive).
  • Actual use was high
  • For this evaluation we wanted to get expert feedback and see how SAM would operate in a large course. SAM was deployed in an open online course on Learning and Knowledge Analytics (LAK), an emerging research domain in TEL that focuses on better measurement, analysis, visualization and reporting of data about learners [2]. More details on iterations 2 and 3 are available in [10]. The axes can be re-ordered through drag-and-drop for better comparison of metrics, and configurable histograms were added to the axes to cope better with line density. 270 participants.
  • Providing feedback was most important. Both LAK and CGIAR teachers want to understand document use. The main differences between LAK and CGIAR teachers: LAK rates finding students at risk higher and finding good students lower; online tool use is not so interesting for LAK teachers; and collaboration is more important. Awareness is also rated high. Comparing with the objectives, awareness and resource use are again the most important.
  • How can data sets be shared in accordance with privacy and legal protection rights? How can a policy to use and share data sets be developed? How should data sets be pre-processed to make them suitable for other researchers? How can common evaluation criteria for TEL recommender systems be defined? How can overview methods be developed to monitor the performance of TEL recommender systems on data sets?
    1. Research and Deployment of Analytics in Learning Settings. PAWS Meeting, 9 April 2012, School of Information Sciences, University of Pittsburgh. Katrien Verbert
    2. Human-Computer Interaction: Awareness & Sense-making (prof. Erik Duval), Computer Graphics (prof. Phil Dutré), Language Intelligence & Information Retrieval (prof. Sien Moens). Flexible interaction between people and information. http://hci.cs.kuleuven.be/
    3. more focus on interaction...
    4. tracking traces: Blogs, RescueTime, Rabbit Eclipse plugin, Twitter
    5. tracking traces: Blogs, RescueTime, Rabbit Eclipse plugin, Twitter
    6. tracking traces: www.role-project.eu
    7. Duval, Erik. Attention please! Learning analytics for visualization and recommendation, Proceedings of LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 9-17, ACM (2011)
    8. objectives
       • self-monitoring for learners
       • awareness for teachers
       • learning resource use and recommendations
       • part of Learning Analytics research [ACM LAK conf., Siemens 2011, Duval 2011]
    9. overview
       • Student Activity Meter
       • Step Up!
       • Recommender systems for learning
       • Future research plans
    10. Student Activity Meter (SAM): demo. http://ariadne.cs.kuleuven.be/monitorwidget-rwtheval/ or http://bit.ly/I8AYV1
    11. Design Based Research Methodology
       • rapid prototyping
       • evaluate ideas in short iteration cycles of Design, Implementation & Evaluation
       • focus on Usefulness & Usability
       • think-aloud evaluations, SUS (System Usability Scale) surveys, usability lab, ...
    12. Iteration one
       • usability and user satisfaction evaluation
       • 12 CS students, using a …-based time tracker
       • 2 evaluation sessions:
         • task-based interview with think-aloud (after 1 week of tracking)
         • user satisfaction (SUS & MSDT, the Microsoft Desirability Toolkit) (after 1 month)
    13. User satisfaction
       • average SUS score: 73%
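The SUS percentages on this and later slides come from the standard System Usability Scale questionnaire named above. As a reminder of how such a score is computed, a minimal Python sketch of the standard SUS scoring rule; the response vector in the example is made up, not data from the study:

    # Standard SUS scoring: 10 items answered on a 1-5 scale.
    # Odd-numbered items are positively worded, even-numbered negatively worded.
    def sus_score(responses):
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses, start=1):
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5  # rescale the 0-40 sum to 0-100

    # hypothetical example: sus_score([4, 2, 4, 2, 5, 1, 4, 2, 4, 2]) == 80.0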
    14. iteration two
       • 20 persons: 3 CGIAR, 2 Law, 8 CS teachers & 7 CS TAs
       • an online survey about usefulness, teacher issues and how the tool can resolve these
       • on average, 40 minutes were spent using SAM
    15. CGIAR CASE STUDY (issue for teacher / addressed)
       Provide feedback to the students: ✔ / ?!
       Being aware of what students are doing: ✔ / ✔
       Knowing about collaboration and communication: ✔ / ✗
       Knowing which documents are used and how much: ✔ / ✔
       Knowing how and when online tools have been used: ✔ / ?!
       Finding the students who are not doing well: ✔ / ?!
       Finding the best students: ?! / ?!
       Knowing how much time students spent: ?! / ✔
       Knowing if external learning resources are used: ✔ / ?!
    16. iteration summary (demographics / evaluation goal / design changes / negative / positive)
       I. 12 CS students / usability, satisfaction, preliminary usefulness / 1st iteration / small usability issues / ↑ learnability, ↓ errors, good satisfaction, usefulness positive
       II. 19 teachers & TAs / assessing teacher needs, use / help function / resource recomm. not useful / provides awareness, all vis. useful, many uses, 90% want it
    17. iteration three
       • open course on learning and knowledge analytics, http://bit.ly/dWYVbX
       • 12 visual analytics enthusiasts + experts (who also teach)
       • an almost identical survey to the CGIAR case
    18. LAK CASE STUDY (issue for teacher / addressed)
       Provide feedback to the students: ✔ / ✔
       Being aware of what students are doing: ✔ / ?!
       Knowing about collaboration and communication: ✔ / ✗
       Knowing which documents are used and how much: ✔ / ?!
       Knowing how and when online tools have been used: ✗ / ?!
       Finding the students who are not doing well: ✔ / ?!
       Finding the best students: ?! / ✗
       Knowing how much time students spent: ?! / ✔
       Knowing if external learning resources are used: ?! / ?!
    19. ideas from experts
       2  the used resource types
       5  detailed information per student
       4  detailed information of 2 students
       3  detailed usage stats of resources
       1  stats or vis. on content creation
    20. iteration summary (demographics / evaluation goal / design changes / negative / positive)
       I. 12 CS students / usability, satisfaction, preliminary usefulness / 1st iteration / small usability issues / ↑ learnability, ↓ errors, good satisfaction, usefulness positive
       II. 19 teachers & TAs / assessing teacher needs, use / help function / resource recomm. not useful / provides awareness, all vis. useful, many uses, 90% want it
       III. 12 participants / assessing teacher needs, expert feedback, use, usefulness / re-orderable parallel coordinates with histograms / most addressed needs are indecisive / provides awareness and feedback, many uses, 66% want it, recomm. can be useful
    21. Iteration four
       • a CS course on C++ programming
       • 11 people: 7 teachers, 2 TAs & 1 course planner
       • richer data set: tracking from the programming environment
       • qualitative study using a structured face-to-face interview
    22. USER SATISFACTION
       • average SUS score: 69.69%
       • all want to continue using it
       • 9/11 would give it to students
    23. iteration summary (demographics / evaluation goal / design changes / negative / positive)
       I. 12 CS students / usability, satisfaction, preliminary usefulness / 1st iteration / small usability issues / ↑ learnability, ↓ errors, good satisfaction, usefulness positive
       II. 19 teachers & TAs / assessing teacher needs, use / help function / resource recomm. not useful / provides awareness, all vis. useful, many uses, 90% want it
       III. 12 participants / assessing teacher needs, expert feedback, use, usefulness / re-orderable PC (parallel coordinates) with histograms / most addressed needs are indecisive / provides awareness and feedback, many uses, 66% want it, recomm. can be useful
       IV. 11 teachers & TAs / use, usefulness, satisfaction, insights / filter search, icons, zooming in line chart, editing PC axes / conflicting visions of students doing well or at risk / provides time overview, provides course overview, PC assists with detecting problems, many uses, 100% want it
    24. conclusion
       • SAM enables finding a wide variety of new insights:
         • a better course overview
         • understanding how students spend their time
       • almost all participants want to continue using SAM
    25. Santos Odriozola, Jose Luis; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Goal-oriented visualizations of activity tracking: a case study with engineering students, Proceedings of LAK12: 2nd International Conference on Learning Analytics and Knowledge, 10 pages, ACM (to appear)
    26. Human-Computer Interaction Course
    27. http://bit.ly/I7hfbe
    28. usage
    29. User satisfaction
       • average SUS score: 77%
    30. Nikos Manouselis, Hendrik Drachsler, Katrien Verbert and Erik Duval. Recommender Systems for Learning. SpringerBriefs in Computer Science, 90 pages, Springer US (to appear).
    31. http://bit.ly/A4CwZU
    32. challenges
       • Evaluation
       • Data sets
       • Context
       • User interfaces
    33. EVALUATION & DATA SETS
    34. Verbert, Katrien; Drachsler, Hendrik; Manouselis, Nikos; Wolpers, Martin; Vuorikari, Riina; Duval, Erik. Dataset-driven research for improving TEL recommender systems, LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 44-53 (2011)
    35. http://bit.ly/acBKsp
    36. how to achieve objectives
       • setting up a website / maintaining the TELeurope group community
       • setting up an open data repository for sharing educational datasets and related research outcomes
       • organizing an annual workshop and special issue
       • organizing a data competition like TREC
    37. dataTEL challenge & dataTEL cafe event
       • a call for TEL datasets
       • eight data sets submitted
       http://bit.ly/ieqmWW
    38. http://dev.mendeley.com/
    39. dataset comparison:
                          Mendeley    APOSDLE   ReMashed  Organic.Edunet  Mace     Melt
       Collection period  1 year      3 months  2 years   9 months        3 years  6 months
       Users              200.000     6         140       1.000           1.148    98
       Items              1.857.912   163       96.000    11.000          12.000   1.923
       Activities         4.848.725   1.500     23.264    920             461.982  16.353
       reads              +           +         -         -               +        -
       tags               -           (+)       +         +               +        +
       ratings            (+)         -         +         +               +        +
       downloads          +           +         -         -               +        +
       search             -           +         -         -               +        -
       collaborations     -           +         -         -               -        -
       tasks/goals        -           +         +         -               -        -
       sequence           -           +         -         -               -        -
       competence         -           +         -         -               +        -
       time               -           -         -         -               +        +
    40. User-based CF [diagram: users Sam, Ian and Neil rating items A, B and C; a high correlation between two users is used to recommend items]
    41. Item-based CF [diagram: the same users and items; a high correlation between two items is used instead]
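The two diagrams contrast the neighbourhood styles: user-based CF predicts a user's rating for an item from the ratings of similar users, while item-based CF predicts it from the user's own ratings of similar items. A minimal, self-contained Python sketch of the user-based variant; the toy matrix is hypothetical, with names chosen to echo the diagrams:

    # Toy ratings matrix (hypothetical values); None = not yet rated.
    ratings = {
        "Sam":  {"A": 5, "B": 3, "C": None},
        "Ian":  {"A": 4, "B": 3, "C": 4},
        "Neil": {"A": 1, "B": 5, "C": 2},
    }

    def cosine_sim(u, v):
        # similarity between two users over their co-rated items
        common = [i for i in ratings[u]
                  if ratings[u][i] is not None and ratings[v][i] is not None]
        if not common:
            return 0.0
        dot = sum(ratings[u][i] * ratings[v][i] for i in common)
        nu = sum(ratings[u][i] ** 2 for i in common) ** 0.5
        nv = sum(ratings[v][i] ** 2 for i in common) ** 0.5
        return dot / (nu * nv)

    def predict(user, item):
        # user-based CF: similarity-weighted average of the other users' ratings
        num = den = 0.0
        for other in ratings:
            if other == user or ratings[other].get(item) is None:
                continue
            s = cosine_sim(user, other)
            num += s * ratings[other][item]
            den += abs(s)
        return num / den if den else None

    print(predict("Sam", "C"))  # Sam's predicted rating for item C (~3.2)

The item-based variant transposes this idea: compute similarities between item columns and average the target user's own ratings, weighted by item-item similarity.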
    42. similarity measures
       • Cosine similarity
       • Pearson correlation
       • Tanimoto or extended Jaccard coefficient
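For completeness, the three measures named above as minimal Python functions over two equal-length rating vectors; a sketch of the textbook definitions, not the implementation behind the experiments reported next:

    import math

    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.sqrt(sum(a * a for a in u)) *
                      math.sqrt(sum(b * b for b in v)))

    def pearson(u, v):
        # Pearson correlation = cosine similarity of the mean-centred vectors
        mu, mv = sum(u) / len(u), sum(v) / len(v)
        return cosine([a - mu for a in u], [b - mv for b in v])

    def tanimoto(u, v):
        # extended Jaccard coefficient: dot / (|u|^2 + |v|^2 - dot)
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (sum(a * a for a in u) + sum(b * b for b in v) - dot)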
    43. similarity measures: MAE of item-based collaborative filtering based on different similarity metrics [chart]
    44. algorithms: MAE of user-based, item-based and slope-one collaborative filtering [chart]
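Both charts report MAE (mean absolute error): the average absolute difference between predicted and held-out actual ratings, where lower is better. For reference, a one-line Python definition:

    def mae(predicted, actual):
        # mean absolute error between predicted and held-out actual ratings
        return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)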
    45. CONTEXT
    46. Verbert, Katrien; Manouselis, Nikos; Ochoa, Xavier; Wolpers, Martin; Drachsler, Hendrik; Bosnic, Ivana; Duval, Erik. Context-aware recommender systems for learning: a survey and future challenges, IEEE Transactions on Learning Technologies, 20 pages (accepted)
    47. data dimensions
    48. challenges
       • context acquisition
       • standardized representation of contextual data
       • evaluation
       • user interfaces
    49. VISUALIZING THE RATIONALE OF RECOMMENDATIONS
    50. Visualizing recommendations (adapted from Keim et al. 2008)
    51. objectives
       • address cold-start issues
       • justification and trust
       • richer interaction capabilities
    52. examples: Klerkx and Duval 2009; O'Donovan et al. 2010
    53. Suggestions welcome!
    54. Questions? katrien.verbert@cs.kuleuven.be, twitter: @katrien_v
    55. References
       • Duval, E. (2011). Attention please!: learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9-17). ACM.
       • Keim, D., Andrienko, G., Fekete, J.-D., Görg, C., Kohlhammer, J., & Melançon, G. (2008). Visual Analytics: Definition, Process, and Challenges. In A. Kerren, J. Stasko, J.-D. Fekete, & C. North (Eds.), Information Visualization, volume 4950 of Lecture Notes in Computer Science (pp. 154-175). Springer Berlin / Heidelberg.
       • Klerkx, J., & Duval, E. (2009). Visualising social bookmarks. Journal of Digital Information, 10(2), 1-40.
       • O'Donovan, J., Gretarsson, B., Bostandjiev, S., Hall, C., & Höllerer, T. (2010). SmallWorlds: Visualizing Social Recommendations. In G. Melançon, T. Munzner, & D. Weiskopf (Eds.), Eurographics/IEEE-VGTC Symposium on Visualization 2010, Volume 29, Number 3, 10 pages.
       • Siemens, G., & Gasevic, D. (Eds.) (2011). Proceedings of the 1st Conference on Learning Analytics and Knowledge 2011. ACM.