Research and Deployment of Analytics in Learning Settings

  • Microsoft Desirability Toolkit
  • ‘Knowing about collaboration and communication’ (the third row) is not addressed by SAM, but was added to check for a possible bias. The highest-rated issues were ‘knowing how much time students spent’ and ‘awareness of what students are doing’. Finding students in trouble and finding the best students were rated rather low. Awareness of resource use has mostly been met, but can be improved by differentiating external resources (the external resource use issue is indecisive).
  • Actual use was high
  • For this evaluation we wanted to get expert feedback and see how SAM would operate in a large course. SAM was deployed in an open online course on Learning and Knowledge Analytics (LAK), an emerging research domain in TEL that focuses on better measurement, analysis, visualization and reporting of data about learners [2]. More details on iterations 2 and 3 are available in [10]. The parallel coordinates now allow re-ordering of the axes through drag-and-drop for easier metric comparison, and configurable histograms were added to the axes to cope better with line density. The course had 270 participants.
  • Providing feedback is most important. Both LAK and CGIAR teachers want to understand document use. The main differences between LAK and CGIAR teachers: LAK teachers rate finding students at risk higher and finding good students lower, online tool use is less interesting to them, and collaboration is more important. Awareness is also rated high. Compared with the objectives, awareness and resource use are again the most important.
  • How can data sets be shared in accordance with privacy and legal protection rights? How can a policy for using and sharing data sets be developed? How should data sets be pre-processed to make them suitable for other researchers? How can common evaluation criteria for TEL recommender systems be defined? How can overview methods be developed to monitor the performance of TEL recommender systems on data sets?

Research and Deployment of Analytics in Learning Settings: Presentation Transcript

  • Research and Deployment of Analytics in Learning Settings PAWS Meeting 9 April 2012 School of Information Sciences, University of Pittsburgh Katrien Verbert
  • Human-Computer Interaction Awareness & Sense-making prof. Erik Duval Computer Graphics prof. Phil Dutré Language Intelligence & Information Retrieval prof. Sien Moens Flexible Interaction between people and information http://hci.cs.kuleuven.be/
  • more focus on interaction...
  • tracking traces: blogs, RescueTime, Rabbit (Eclipse plugin), Twitter
  • tracking traces www.role-project.eu
  • Duval, Erik. Attention please! Learning analytics for visualization and recommendation. Proceedings of LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 9-17, ACM (2011)
  • objectives • self-monitoring for learners • awareness for teachers • learning resource use and recommendations • part of Learning Analytics research [ACM LAK conf., Siemens 2011, Duval 2011]
  • overview • Student Activity Meter • Step Up! • Recommender systems for learning • Future research plans
  • Student activity meter (SAM): demo. http://ariadne.cs.kuleuven.be/monitorwidget-rwtheval/ or http://bit.ly/I8AYV1
  • Design Based Research Methodology •  Rapid prototyping •  Evaluate Ideas in short iteration cycles of Design, Implementation & Evaluation •  Focus on Usefulness & Usability •  Think-aloud evaluations, SUS (System Usability Scale) surveys, usability lab, ...
  • Iteration one • usability and user satisfaction evaluation • 12 CS students, using a …-based time tracker • 2 evaluation sessions: • task-based interview with think-aloud (after 1 week of tracking) • user satisfaction (SUS & MSDT) (after 1 month)
  • User satisfaction • average SUS score: 73%
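  • The SUS percentages reported throughout these slides follow the standard System Usability Scale scoring (ten 1-5 Likert items; odd items positively worded, even items negatively worded). A minimal sketch with hypothetical responses; averaging such per-participant scores gives figures like the 73% above:

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert responses -> a 0-100 score.

    Odd-numbered items (index 0, 2, ...) are positively worded and
    contribute (response - 1); even-numbered items are negatively worded
    and contribute (5 - response). The 0-40 sum is scaled by 2.5.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical answers from one participant.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```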
  • iteration two • 20 persons: 3 CGIAR, 2 Law, 8 CS teachers & 7 CS TAs. • An online survey about usefulness, teacher issues and how the tool can resolve these. • on average, 40 minutes were spent using SAM.
  • CGIAR CASE STUDY: is it an issue for the teacher / is it addressed by SAM (✔ = yes, ✗ = no, ?! = indecisive):
    Provide feedback to the students: ✔ / ?!
    Being aware of what students are doing: ✔ / ✔
    Knowing about collaboration and communication: ✔ / ✗
    Knowing which documents are used and how much: ✔ / ✔
    Knowing how and when online tools have been used: ✔ / ?!
    Finding the students who are not doing well: ✔ / ?!
    Finding the best students: ?! / ?!
    Knowing how much time students spent: ?! / ✔
    Knowing if external learning resources are used: ✔ / ?!
  • Evaluation overview (iterations I and II):
    I. Demographics: 12 CS students. Evaluation goal: usability, satisfaction, preliminary usefulness. Design changes: 1st iteration. Negative: small usability issues. Positive: ↑ learnability, ↓ errors, good satisfaction, usefulness positive.
    II. Demographics: 19 teachers & TAs. Evaluation goal: assessing teacher needs, use & usefulness. Design changes: resource recommendations. Negative: help function not useful. Positive: provides awareness, all visualizations useful, many uses, 90% want it.
  • iteration three • open course on learning and knowledge analytics, http://bit.ly/dWYVbX • 12 visual analytics enthusiasts + experts (who also teach) • survey almost identical to the CGIAR case.
  • LAK CASE STUDY: is it an issue for the teacher / is it addressed by SAM (✔ = yes, ✗ = no, ?! = indecisive):
    Provide feedback to the students: ✔ / ✔
    Being aware of what students are doing: ✔ / ?!
    Knowing about collaboration and communication: ✔ / ✗
    Knowing which documents are used and how much: ✔ / ?!
    Knowing how and when online tools have been used: ✗ / ?!
    Finding the students who are not doing well: ✔ / ?!
    Finding the best students: ?! / ✗
    Knowing how much time students spent: ?! / ✔
    Knowing if external learning resources are used: ?! / ?!
  • ideas from experts: the used resource types (2), detailed information per student (5), detailed information on 2 students (4), detailed usage stats of resources (3), stats or vis. on content creation (1)
  • Evaluation overview (iterations I to III):
    I. Demographics: 12 CS students. Evaluation goal: usability, satisfaction, preliminary usefulness. Design changes: 1st iteration. Negative: small usability issues. Positive: ↑ learnability, ↓ errors, good satisfaction, usefulness positive.
    II. Demographics: 19 teachers & TAs. Evaluation goal: assessing teacher needs, use & usefulness. Design changes: resource recommendations. Negative: help function not useful. Positive: provides awareness, all visualizations useful, many uses, 90% want it.
    III. Demographics: 12 participants. Evaluation goal: assessing teacher needs, expert feedback, use & usefulness. Design changes: re-orderable parallel coordinates with histograms. Negative: most addressed needs are indecisive. Positive: provides awareness and feedback, many uses, 66% want it, recommendations can be useful.
  • Iteration four • a CS course on C++ programming • 11 people: 7 teachers, 2 TAs & 1 course planner • richer data set: tracking from the programming environment • qualitative study using a structured face-to-face interview
  • USER SATISFACTION • average SUS score: 69.69% • all want to continue using it • 9/11 would give it to students
  • Evaluation overview (iterations I to IV):
    I. Demographics: 12 CS students. Evaluation goal: usability, satisfaction, preliminary usefulness. Design changes: 1st iteration. Negative: small usability issues. Positive: ↑ learnability, ↓ errors, good satisfaction, usefulness positive.
    II. Demographics: 19 teachers & TAs. Evaluation goal: assessing teacher needs, use & usefulness. Design changes: resource recommendations. Negative: help function not useful. Positive: provides awareness, all visualizations useful, many uses, 90% want it.
    III. Demographics: 12 participants. Evaluation goal: assessing teacher needs, expert feedback, use & usefulness. Design changes: re-orderable parallel coordinates with histograms. Negative: most addressed needs are indecisive. Positive: provides awareness and feedback, many uses, 66% want it, recommendations can be useful.
    IV. Demographics: 11 teachers & TAs. Evaluation goal: use, usefulness & satisfaction. Design changes: time filter & search, icons, zooming in line chart, editing parallel coordinates axes. Negative: conflicting visions of students doing well or at risk. Positive: provides time overview, provides course overview, parallel coordinates assist with detecting problems & insights, many uses, 100% want it.
  • conclusion • SAM enables finding a wide variety of new insights • a better course overview • understanding how students spend time • almost all participants want to continue using SAM
  • Santos Odriozola, Jose Luis; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Goal-oriented visualizations of activity tracking: a case study with engineering students, Proceedings of LAK12: 2nd International Conference on Learning Analytics and Knowledge, 10 pages, ACM (to appear)
  • Human-Computer Interaction Course
  • http://bit.ly/I7hfbe
  • usage
  • User satisfaction • average SUS score: 77%
  • Nikos Manouselis, Hendrik Drachsler, Katrien Verbert and Erik Duval. Recommender Systems for Learning. SpringerBriefs in Computer Science, 90 pages, Springer US (to appear).
  • http://bit.ly/A4CwZU
  • challenges • Evaluation • Data sets • Context • User interfaces
  • EVALUATION & DATA SETS
  • Verbert, Katrien; Drachsler, Hendrik; Manouselis, Nikos; Wolpers, Martin; Vuorikari, Riina; Duval, Erik. Dataset-driven research for improving TEL recommender systems, LAK11: 1st International Conference on Learning Analytics and Knowledge, pages 44-53 (2011)
  • http://bit.ly/acBKsp
  • how to achieve objectives • Setting up a website / maintaining the TELeurope group community • Setting up an open data repository for sharing educational data sets and related research outcomes • Organizing an annual workshop and special issue • Organizing a data competition like in TREC
  • dataTEL challenge & dataTEL cafe event •  a call for TEL datasets •  eight data sets submitted http://bit.ly/ieqmWW
  • http://dev.mendeley.com/
  • Data set comparison:
                       Mendeley    APOSDLE   ReMashed   Organic.Edunet   MACE      MELT
    Collection period  1 year      3 months  2 years    9 months         3 years   6 months
    Users              200,000     6         140        1,000            1,148     98
    Items              1,857,912   163       96,000     11,000           12,000    1,923
    Activities         4,848,725   1,500     23,264     920              461,982   16,353
    reads              +           +         -          -                +         -
    tags               -           (+)       +          +                +         +
    ratings            (+)         -         +          +                +         +
    downloads          +           +         -          -                +         +
    search             -           +         -          -                +         -
    collaborations     -           +         -          -                -         -
    tasks/goals        -           +         +          -                -         -
    sequence           -           +         -          -                -         -
    competence         -           +         -          -                +         -
    time               -           -         -          -                +         +
  • User-based CF: diagram of users Sam, Ian and Neil rating items A, B and C; a high correlation between two users' rating rows drives the prediction.
  • Item-based CF: the same diagram read column-wise; here the high correlation is between two items' rating columns.
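  • To make the user-based scheme from these two slides concrete, here is a minimal sketch (an illustration, not the exact algorithm behind the slides): a user's unknown rating of an item is predicted as a correlation-weighted average of the ratings that similar users gave that item. The toy data reuses the names from the diagrams; item-based CF applies the same idea to item columns instead of user rows.

```python
import math

def pearson_users(u, v):
    """Pearson correlation between two users over their co-rated items."""
    common = set(u) & set(v)
    if len(common) < 2:
        return 0.0
    mu = sum(u[i] for i in common) / len(common)
    mv = sum(v[i] for i in common) / len(common)
    num = sum((u[i] - mu) * (v[i] - mv) for i in common)
    den = math.sqrt(sum((u[i] - mu) ** 2 for i in common)) * \
          math.sqrt(sum((v[i] - mv) ** 2 for i in common))
    return num / den if den else 0.0

def predict_user_based(ratings, user, item):
    """Weighted average of `item` ratings from positively correlated neighbours."""
    num = den = 0.0
    for other, their in ratings.items():
        if other == user or item not in their:
            continue
        s = pearson_users(ratings[user], their)
        if s <= 0:  # keep only positively correlated neighbours
            continue
        num += s * their[item]
        den += s
    return num / den if den else None

# Toy data echoing the diagrams: Sam correlates highly with Ian but not Neil,
# so Ian's rating of C drives Sam's predicted rating for C.
ratings = {
    "Sam":  {"A": 5, "B": 4},
    "Ian":  {"A": 5, "B": 4, "C": 2},
    "Neil": {"A": 1, "B": 2, "C": 5},
}
print(predict_user_based(ratings, "Sam", "C"))  # 2.0
```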
  • similarity measures • Cosine similarity • Pearson correlation • Tanimoto or extended Jaccard coefficient
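  • As an illustration of how the three measures listed above differ, a self-contained sketch on two hypothetical rating vectors of equal length:

```python
import math

def cosine(u, v):
    """Cosine similarity: dot product over the product of vector norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def pearson(u, v):
    """Pearson correlation: cosine similarity of the mean-centred vectors."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return cosine([a - mu for a in u], [b - mv for b in v])

def tanimoto(u, v):
    """Tanimoto (extended Jaccard) coefficient: dot / (|u|^2 + |v|^2 - dot)."""
    dot = sum(a * b for a, b in zip(u, v))
    den = sum(a * a for a in u) + sum(b * b for b in v) - dot
    return dot / den if den else 0.0

# Two users' ratings of the same four items (made-up data). Cosine stays high
# for any positive vectors; Pearson removes per-user rating bias first.
u, v = [5, 3, 4, 4], [3, 1, 2, 3]
print(round(cosine(u, v), 3), round(pearson(u, v), 3), round(tanimoto(u, v), 3))
```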
  • similarity measures: MAE of item-based collaborative filtering with different similarity metrics (chart)
  • algorithms: MAE of user-based, item-based and slope-one collaborative filtering (chart)
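  • The MAE figures behind these charts compare a recommender's predicted ratings against held-back test ratings; lower is better. A minimal sketch with made-up numbers:

```python
def mae(predicted, actual):
    """Mean Absolute Error: average |prediction - true rating| over the test set."""
    pairs = [(p, a) for p, a in zip(predicted, actual) if p is not None]
    return sum(abs(p - a) for p, a in pairs) / len(pairs)

# Hypothetical held-out ratings and one recommender's predictions for them
# (None marks items the recommender could not score, e.g. cold-start cases).
predicted = [3.4, 4.1, 2.0, None, 4.8]
actual    = [4,   4,   1,   3,    5]
print(mae(predicted, actual))  # 0.475
```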
  • CONTEXT
  • Verbert, Katrien; Manouselis, Nikos; Ochoa, Xavier; Wolpers, Martin; Drachsler, Hendrik; Bosnic, Ivana; Duval, Erik. Context-aware recommender systems for learning: a survey and future challenges, IEEE Transactions on Learning Technologies, 20 pages (accepted)
  • data dimensions
  • challenges • context acquisition • standardized representation of contextual data • evaluation • user interfaces
  • VISUALIZING THE RATIONALE OF RECOMMENDATIONS
  • Visualizing recommendations adapted from Keim et al. 2008
  • objectives • Address cold start issues • Justification and trust • Richer interaction capabilities
  • examples: Klerkx and Duval 2009; O'Donovan et al. 2010
  • Suggestions welcome!
  • Questions? katrien.verbert@cs.kuleuven.be twitter: @katrien_v
  • References • Duval, E. (2011). Attention please!: learning analytics for visualization and recommendation. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, (pp. 9-17), ACM. • D. Keim, G. Andrienko, J.-D. Fekete, C. Görg, J. Kohlhammer, and G. Melançon. Visual Analytics: Definition, Process, and Challenges. In A. Kerren, J. Stasko, J.-D. Fekete, and C. North, editors, Information Visualization, volume 4950 of Lecture Notes in Computer Science, pages 154-175. Springer Berlin / Heidelberg, 2008. • J. Klerkx and E. Duval. Visualising social bookmarks. Journal of Digital Information, 10(2):1-40, 2009. • J. O'Donovan, B. Gretarsson, S. Bostandjiev, C. Hall, and T. Höllerer. SmallWorlds: Visualizing Social Recommendations. In G. Melançon, T. Munzner, and D. Weiskopf (eds), Eurographics / IEEE-VGTC Symposium on Visualization 2010, Volume 29 (2010), Number 3, 10 pages. • Siemens, G. & Gasevic, D. (eds) (2011). Proceedings of the 1st Conference on Learning Analytics and Knowledge 2011. ACM.