
Research and Deployment of Analytics in Learning Settings


  1. Research and Deployment of Analytics in Learning Settings. PAWS Meeting, 9 April 2012, School of Information Sciences, University of Pittsburgh. Katrien Verbert.
  2. Research context: Human-Computer Interaction (awareness, sense-making; prof. Erik Duval), Computer Graphics (prof. Phil Dutré), Language Intelligence & Information Retrieval (prof. Sien Moens). Flexible interaction between people and information. http://hci.cs.kuleuven.be/
  3. More focus on interaction...
  4. Tracking traces: blogs, RescueTime, the Rabbit Eclipse plugin, Twitter.
  5. Tracking traces: blogs, RescueTime, the Rabbit Eclipse plugin, Twitter (continued).
  6. Tracking traces: www.role-project.eu
  7. Duval, Erik. Attention please! Learning analytics for visualization and recommendation. Proceedings of LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 9-17, ACM (2011).
  8. Objectives • self-monitoring for learners • awareness for teachers • learning resource use and recommendations • part of Learning Analytics research [ACM LAK conf., Siemens 2011, Duval 2011]
  9. Overview • Student Activity Meter • Step Up! • recommender systems for learning • future research plans
  10. Student Activity Meter (SAM): demo at http://ariadne.cs.kuleuven.be/monitorwidget-rwtheval/ or http://bit.ly/I8AYV1
  11. Design-based research methodology • rapid prototyping • evaluate ideas in short iteration cycles of design, implementation and evaluation • focus on usefulness and usability • think-aloud evaluations, SUS (System Usability Scale) surveys, usability lab, ...
  12. Iteration one • usability and user satisfaction evaluation • 12 CS students, using a -based time tracker • 2 evaluation sessions: a task-based interview with think-aloud (after 1 week of tracking); user satisfaction (SUS and MSDT) (after 1 month)
  13. User satisfaction • average SUS score: 73 (SUS ranges 0-100; scoring sketch below)
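
The deck reports average SUS scores throughout but never shows how a score is derived. As background, a minimal sketch of the standard SUS scoring rule (this is the published SUS formula, not code from the talk; the example responses are made up):

```python
def sus_score(responses):
    """Standard SUS scoring for one participant.

    responses: list of 10 Likert answers (1-5), item 1 first.
    Odd-numbered items contribute (answer - 1), even-numbered items
    contribute (5 - answer); the total is scaled by 2.5 to 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical participant:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # 77.5
```

A course average like the 73 reported here is then just the mean of the per-participant scores.
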
  14. Iteration two • 20 persons: 3 CGIAR, 2 Law, 8 CS teachers, 7 CS TAs • an online survey about usefulness, teacher issues, and how the tool can resolve these • on average, 40 minutes spent using SAM
  15. CGIAR case study: teacher issues (is it an issue for the teacher? / does SAM address it?)
      • Provide feedback to the students: ✔ / ?!
      • Being aware of what students are doing: ✔ / ✔
      • Knowing about collaboration and communication: ✔ / ✗
      • Knowing which documents are used and how much: ✔ / ✔
      • Knowing how and when online tools have been used: ✔ / ?!
      • Finding the students who are not doing well: ✔ / ?!
      • Finding the best students: ?! / ?!
      • Knowing how much time students spent: ?! / ✔
      • Knowing if external learning resources are used: ✔ / ?!
  16. Summary of iterations I-II (demographics; evaluation goal; design changes; negative; positive):
      I. 1st iteration: 12 CS students; usability, satisfaction, preliminary usefulness; small design changes; negative: usability issues; positive: ↑ learnability, ↓ errors, good satisfaction, usefulness positive.
      II. 19 teachers & TAs; assessing teacher needs, use, usefulness; resource help function; negative: recommendations not useful; positive: provides awareness, all visualizations useful, many uses, 90% want it.
  17. Iteration three • open course on learning and knowledge analytics, http://bit.ly/dWYVbX • 12 visual analytics enthusiasts + experts (who also teach) • survey almost identical to the CGIAR case
  18. LAK case study: teacher issues (is it an issue for the teacher? / does SAM address it?)
      • Provide feedback to the students: ✔ / ✔
      • Being aware of what students are doing: ✔ / ?!
      • Knowing about collaboration and communication: ✔ / ✗
      • Knowing which documents are used and how much: ✔ / ?!
      • Knowing how and when online tools have been used: ✗ / ?!
      • Finding the students who are not doing well: ✔ / ?!
      • Finding the best students: ?! / ✗
      • Knowing how much time students spent: ?! / ✔
      • Knowing if external learning resources are used: ?! / ?!
  19. Ideas from experts (number of mentions) • detailed information per student (5) • detailed information of 2 students (4) • detailed usage stats of resources (3) • the used resource types (2) • stats or visualizations on content creation (1)
  20. Summary of iterations I-III (rows I-II as on slide 16), plus:
      III. 12 participants; assessing teacher needs, expert feedback, use, usefulness; re-orderable parallel coordinates with histograms; negative: most addressed needs are indecisive; positive: provides awareness and feedback, many uses, 66% want it, recommendations can be useful.
  21. Iteration four • a CS course on C++ programming • 11 people: 7 teachers, 2 TAs, 1 course planner • richer data set: tracking from the programming environment • qualitative study using a structured face-to-face interview
  22. User satisfaction • average SUS score: 69.69 • all want to continue using it • 9/11 would give it to students
  23. Summary of iterations I-IV (rows I-III as on slides 16 and 20), plus:
      IV. 11 teachers & TAs; use, usefulness, satisfaction, insights; filter search, icons, zooming in line chart, editing parallel-coordinates axes; negative: conflicting visions of which students are doing well or at risk; positive: provides time overview, provides course overview, parallel coordinates assist with detecting problems, many uses, 100% want it.
  24. Conclusion • SAM enables finding a wide variety of new insights • a better course overview • understanding how students spend time • almost all participants want to continue using SAM
  25. Santos Odriozola, Jose Luis; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Goal-oriented visualizations of activity tracking: a case study with engineering students. Proceedings of LAK12: 2nd International Conference on Learning Analytics and Knowledge, 10 pages, ACM (to appear).
  26. Human-Computer Interaction course
  27. http://bit.ly/I7hfbe
  28. Usage
  29. User satisfaction • average SUS score: 77
  30. Nikos Manouselis, Hendrik Drachsler, Katrien Verbert and Erik Duval. Recommender Systems for Learning. SpringerBriefs in Computer Science, 90 pages, Springer US (to appear).
  31. http://bit.ly/A4CwZU
  32. Challenges • evaluation • data sets • context • user interfaces
  33. EVALUATION & DATA SETS
  34. Verbert, Katrien; Drachsler, Hendrik; Manouselis, Nikos; Wolpers, Martin; Vuorikari, Riina; Duval, Erik. Dataset-driven research for improving TEL recommender systems. LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 44-53 (2011).
  35. http://bit.ly/acBKsp
  36. How to achieve the objectives • set up a website / maintain the TELeurope group community • set up an open data repository for sharing educational datasets and related research outcomes • organize an annual workshop and special issue • organize a data competition like TREC
  37. dataTEL challenge & dataTEL café event • a call for TEL datasets • eight data sets submitted • http://bit.ly/ieqmWW
  38. http://dev.mendeley.com/
  39. Dataset comparison (Mendeley / APOSDLE / ReMashed / Organic.Edunet / MACE / MELT):
      • Collection period: 1 year / 3 months / 2 years / 9 months / 3 years / 6 months
      • Users: 200,000 / 6 / 140 / 1,000 / 1,148 / 98
      • Items: 1,857,912 / 163 / 96,000 / 11,000 / 12,000 / 1,923
      • Activities: 4,848,725 / 1,500 / 23,264 / 920 / 461,982 / 16,353
      • Reads: + / + / - / - / + / -
      • Tags: - / (+) / + / + / + / +
      • Ratings: (+) / - / + / + / + / +
      • Downloads: + / + / - / - / + / +
      • Search: - / + / - / - / + / -
      • Collaborations: - / + / - / - / - / -
      • Tasks/goals: - / + / + / - / - / -
      • Sequence: - / + / - / - / - / -
      • Competence: - / + / - / - / + / -
      • Time: - / - / - / - / + / +
  40. User-based CF (diagram: users Sam, Ian, Neil and items A, B, C; "high correlation" between users drives the recommendation; see the sketch below)
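
To make the diagram concrete: in user-based collaborative filtering, a prediction for a user is a similarity-weighted average over other users' ratings of the target item. A minimal sketch reusing the slide's names; the rating values are invented for illustration:

```python
import math

# Toy ratings reusing the slide's users and items; values are invented.
ratings = {
    "Sam":  {"A": 5, "B": 4},
    "Ian":  {"A": 5, "B": 4, "C": 2},
    "Neil": {"A": 1, "C": 5},
}

def pearson(u, v):
    """Pearson correlation over the items two users co-rated."""
    common = set(u) & set(v)
    if len(common) < 2:
        return 0.0
    mu = sum(u[i] for i in common) / len(common)
    mv = sum(v[i] for i in common) / len(common)
    num = sum((u[i] - mu) * (v[i] - mv) for i in common)
    den = (math.sqrt(sum((u[i] - mu) ** 2 for i in common)) *
           math.sqrt(sum((v[i] - mv) ** 2 for i in common)))
    return num / den if den else 0.0

def predict_user_based(user, item):
    """Similarity-weighted average of other users' ratings for item."""
    pairs = [(pearson(ratings[user], r), r[item])
             for name, r in ratings.items()
             if name != user and item in r]
    norm = sum(abs(s) for s, _ in pairs)
    return sum(s * r for s, r in pairs) / norm if norm else None

# Sam correlates perfectly with Ian on A and B, so Ian's rating dominates:
print(predict_user_based("Sam", "C"))  # 2.0
```
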
  41. Item-based CF (diagram: same users and items; "high correlation" between items; see the sketch below)
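
Item-based CF flips the comparison: similarity is computed between item rating vectors, and the prediction weights the active user's own ratings of similar items. Again a sketch with invented values:

```python
# Same toy data as above; values are invented for illustration.
ratings = {
    "Sam":  {"A": 5, "B": 4},
    "Ian":  {"A": 5, "B": 4, "C": 2},
    "Neil": {"A": 1, "C": 5},
}

def item_vector(item):
    """One item's ratings, keyed by user."""
    return {u: r[item] for u, r in ratings.items() if item in r}

def cosine(a, b):
    """Cosine similarity between two sparse rating vectors."""
    num = sum(a[u] * b[u] for u in set(a) & set(b))
    den = (sum(x * x for x in a.values()) ** 0.5 *
           sum(x * x for x in b.values()) ** 0.5)
    return num / den if den else 0.0

def predict_item_based(user, item):
    """Weight the user's own ratings by item-item similarity."""
    pairs = [(cosine(item_vector(item), item_vector(j)), r)
             for j, r in ratings[user].items() if j != item]
    norm = sum(s for s, _ in pairs)
    return sum(s * r for s, r in pairs) / norm if norm else None

print(predict_item_based("Sam", "C"))  # ~4.6: Sam rated similar items highly
```
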
  42. Similarity measures • cosine similarity • Pearson correlation • Tanimoto or extended Jaccard coefficient (definitions sketched below)
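
For completeness, minimal definitions of the three measures named on the slide, over dense rating vectors (a sketch, not the implementation evaluated in the figures that follow):

```python
import math

def cosine(a, b):
    """Cosine of the angle between two rating vectors."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def pearson(a, b):
    """Pearson correlation: cosine similarity after mean-centering."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return cosine([x - ma for x in a], [y - mb for y in b])

def tanimoto(a, b):
    """Extended Jaccard: dot(a,b) / (|a|^2 + |b|^2 - dot(a,b))."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sum(x * x for x in a) + sum(y * y for y in b) - dot)

u, v = [5, 4, 0, 2], [4, 5, 1, 2]
print(cosine(u, v), pearson(u, v), tanimoto(u, v))
```
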
  43. Similarity measures (figure): MAE of item-based collaborative filtering based on different similarity metrics.
  44. Algorithms (figure): MAE of user-based, item-based and Slope One collaborative filtering; a sketch of the MAE computation follows.
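
Both figures compare algorithms by MAE (mean absolute error) on held-out ratings; lower is better. A sketch of the evaluation harness, with a hypothetical test set and a trivial global-mean baseline standing in for the CF predictors above:

```python
def mae(test_set, predict):
    """Mean absolute error of a predictor on held-out (user, item, rating)."""
    errors = [abs(predict(user, item) - true) for user, item, true in test_set]
    return sum(errors) / len(errors)

# Hypothetical held-out triples; any predictor with the same signature
# (user-based, item-based, Slope One, ...) can be plugged in:
test = [("Sam", "C", 2), ("Neil", "B", 3)]
global_mean = 3.5
print(mae(test, lambda u, i: global_mean))  # 1.0
```
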
  45. CONTEXT
  46. Verbert, Katrien; Manouselis, Nikos; Ochoa, Xavier; Wolpers, Martin; Drachsler, Hendrik; Bosnic, Ivana; Duval, Erik. Context-aware recommender systems for learning: a survey and future challenges. IEEE Transactions on Learning Technologies, 20 pages (accepted).
  47. Data dimensions
  48. Challenges • context acquisition • standardized representation of contextual data • evaluation • user interfaces
  49. VISUALIZING THE RATIONALE OF RECOMMENDATIONS
  50. Visualizing recommendations (adapted from Keim et al. 2008)
  51. Objectives • address cold-start issues • justification and trust • richer interaction capabilities
  52. Examples: Klerkx and Duval 2009; O'Donovan et al. 2010
  53. Suggestions welcome!
  54. Questions? katrien.verbert@cs.kuleuven.be, Twitter: @katrien_v
  55. References
      • Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 9-17). ACM.
      • Keim, D., Andrienko, G., Fekete, J.-D., Görg, C., Kohlhammer, J., & Melançon, G. (2008). Visual Analytics: Definition, Process, and Challenges. In A. Kerren, J. Stasko, J.-D. Fekete, & C. North (Eds.), Information Visualization, Lecture Notes in Computer Science, vol. 4950 (pp. 154-175). Springer Berlin/Heidelberg.
      • Klerkx, J., & Duval, E. (2009). Visualising social bookmarks. Journal of Digital Information, 10(2), 1-40.
      • O'Donovan, J., Gretarsson, B., Bostandjiev, S., Hall, C., & Höllerer, T. (2010). SmallWorlds: Visualizing social recommendations. In G. Melançon, T. Munzner, & D. Weiskopf (Eds.), Eurographics/IEEE-VGTC Symposium on Visualization 2010, 29(3), 10 pages.
      • Siemens, G., & Gašević, D. (Eds.) (2011). Proceedings of the 1st Conference on Learning Analytics and Knowledge 2011. ACM.
