Learning Analytics and Future R&D at CELSTEC


This is an updated version of my presentation at LAK12. It includes related research on TEL RecSys and the new LinkedUp project, which will address current challenges such as the lack of datasets, a common evaluation framework, and real-world examples for LA and data-supported education.



1. Learning Analytics and Future R&D Opportunities. Lecture by Hendrik Drachsler at RWTH Aachen, Germany, June 20, 2012.
2. Goals of the lecture: LA Framework, Survey Findings, Conclusions.
3. A view on Learning Analytics: the Learning Analytics Framework.
4. Greller, W., & Drachsler, H. (to appear). Turning Learning into Numbers: Toward a Generic Framework for Learning Analytics. Journal of Educational Technology & Society. (Slides 4-10 build up the framework figure and repeat this citation.)
11. Stakeholders vs. LA Framework: opinions from the stakeholders toward the dimensions of the LA framework.
12. Learning Analytics Questionnaire. Extrapolate opinions from different target groups: (a) what are the current understandings of and expectations for LA; (b) is there a common understanding of LA?
13. Learning Analytics Questionnaire: available for 4 weeks; 156 respondents after clean-up; 121 full records. Data and survey available at: http://bit.ly/la_survey
14. Participants by sector (pie chart): Higher Education 74.8%; K-12, Vocational, and Others account for the remaining 11.0%, 8.4%, and 5.8%.
15. Participants by role (bar chart): Teachers 44%, Researchers 36%, Learning Designers 26%, Managers 16%.
16. Participants - Reach: responses from 31 countries [UK (38), US (30), NL (22)].
17. Stakeholders: data subjects and data clients.
18. Stakeholders: (a) who was expected to benefit the most from learning analytics? (Teachers, parents, institutions, learners.)
19. Outcomes: 1. teachers, 2. learners, 3. institutions, 4. parents.
20. Stakeholders: (b) how much will learning analytics influence bilateral relationships? (Teachers, parents, institutions, learners.)
21. Outcomes: 1. teacher-student 84%, 2. student-teacher 63%, 3. student-student 46%, 4. teacher-teacher 41%.
22. Objectives: Reflection (Glahn, 2009).
23. Objectives: Reflection and Prediction (Glahn, 2009).
24. Objectives. The importance of 3 generic objectives: (a) reflection, (b) prediction, (c) unveiling hidden information.
25. Objectives. In which way will learning analytics change educational practice in particular areas? (n = 119): 11% no changes at all, 43% small changes, 45% extensive changes.
26. Objectives (continued). Areas of change singled out: Item 2: timely information about learning; Item 8: better insights by institutions into their courses; Item 5: easier grading; Item 6: objective assessment.
27. Educational Data. Drachsler, H., et al. (2010). Issues and Considerations regarding Sharable Data Sets for Recommender Systems in Technology Enhanced Learning. 1st Workshop on Recommender Systems in Technology Enhanced Learning (RecSysTEL@EC-TEL 2010), September 28, 2010, Barcelona, Spain.
28. Educational Data. Drachsler, H., et al. (2010), as on the previous slide, together with: Verbert, K., Manouselis, N., Drachsler, H., & Duval, E. (accepted). Dataset-driven Research to Support Learning and Knowledge Analytics. Journal of Educational Technology & Society.
29. Educational Data. Researchers: 1. added context information (n=43, mean 3.42). Teachers: 1. added context information (n=52, mean 3.42). Managers: 1. sharing within the institution (n=16, mean 3.63); 2. anonymisation (n=19, mean 3.53).
30. Technologies. Manouselis, N., Drachsler, H., Verbert, K., & Duval, E. (to appear). Recommender Systems for Learning. Berlin: Springer.
31. Technologies: Prediction. (Same citation.)
32. Technologies: Reflection. Kraker, P., Wagner, C., Jeanquartier, F., & Lindstaedt, S. N. (2011). On the Way to a Science Intelligence: Visualizing TEL Tweets for Trend Detection. Sixth European Conference on Technology Enhanced Learning (EC-TEL 2011). (Same Manouselis et al. citation.)
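Slides 30-32 tie recommender technology to the prediction and reflection objectives. To make the prediction side concrete, here is a minimal user-based collaborative-filtering sketch in Python; the toy rating matrix and the `predict` helper are invented for illustration, not taken from the book:

```python
import numpy as np

# Toy learner x resource rating matrix; 0 marks "not yet rated".
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)

def predict(R, user, item):
    """Predict a missing rating from similar learners who rated the item."""
    neighbours = []
    for other in np.where(R[:, item] > 0)[0]:      # learners who rated item
        shared = (R[user] > 0) & (R[other] > 0)    # co-rated resources
        if other == user or shared.sum() < 2:
            continue
        sim = np.corrcoef(R[user, shared], R[other, shared])[0, 1]
        if np.isfinite(sim) and sim > 0:           # keep positive neighbours
            neighbours.append((sim, R[other, item]))
    if not neighbours:
        return R[R > 0].mean()                     # fall back to global mean
    sims = np.array([s for s, _ in neighbours])
    ratings = np.array([r for _, r in neighbours])
    return float(ratings @ sims / sims.sum())      # similarity-weighted mean

# Predict how learner 1 would rate resource 1 (unrated above).
print(predict(R, user=1, item=1))
```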
33. Technologies. Trust in accurate and appropriate LA results for: 1. view on learning progress; 2. predict learning resource; 3. assessment; 4. view on engagement; 5. compare learners; 6. prediction of peers; 7. prediction of learner performance.
34. Constraints: 1. legal protection; 2. privacy; 3. ethics; 4. ownership.
35. Constraints. Effect size of LA (chart, scale: not at all / a little / very much) on: privacy, ethics, data ownership, data openness, transparency.
36. Competences.
37. Competences: 1. e-literacy; 2. interpretation skills; 3. self-directedness; 4. ethical understanding.
38. Competences (chart).
39. Competences: Item 3: critical reflection; Item 7: self-directedness; Item 1: numerical skills; Item 5: ethical thinking.
40. Competences (continued): 70.2% (n=85) believed learners were NOT competent enough to independently learn from learning analytics.
41. Limitations. Picture by Stephen Downes: http://www.flickr.com/photos/stephen_downes/3808714505
42. Limitations: 1. dominance of responses from the HE sector; 2. absence of students; 3. low awareness of LA, only surveyed innovators and early adopters; 4. mainly opinions from Western cultures.
43. Survey summary: main findings of the survey according to the LA framework.
44. + Stakeholders: 1. main beneficiaries of LA are learners and teachers, followed by organisations; 2. biggest benefits would be gained in the teacher-to-student relationship; 3. learners require teacher support to learn from LA.
45. + Stakeholders (continued): LA as innovation?
46. + Objectives: 1. reflection support is the main objective from the stakeholders' view... 2. ...by revealing hidden information about learners.
47. + Objectives (continued): reflection support only for the teacher-student relationship?
48. + Educational Data: 1. context information from learners and the learning process; 2. anonymisation is the second most important data attribute; 3. willingness to share if data is anonymised.
49. + Educational Data (continued): can we achieve a collection of reference datasets?
50. + Technologies: 1. trust in LA algorithms is not well developed; 2. high confidence in gaining a comprehensive view of the learning progress.
51. + Technologies (continued): how accurately can we measure learning progress?
52. + Constraints: 1. data ownership is the most important topic; 2. LA leads to breaches of privacy, but privacy and ethical aspects are of lesser importance; 3. many organisations have ethical boards and guidelines in place.
53. + Constraints (continued): do we need new policies for data ownership and privacy?
54. + Competences: 1. skepticism that LA will lead to more independence of learners in controlling their learning process; 2. training is needed to guide students to more self-directedness and critical reflection.
55. + Competences (continued): do we need mandatory courses on statistics for the educational sector?
56. Future R&D. Picture by Tom Raftery: http://www.flickr.com/photos/traftery/4773457853
57. 10 years of TEL RecSys research in one book. Chapter 1: Background; Chapter 2: TEL context; Chapter 3: Extended survey of 42 RecSys; Chapter 4: Challenges and Outlook. Manouselis, N., Drachsler, H., Verbert, K., & Duval, E. (2012). Recommender Systems for Learning. Berlin: Springer. (Repeated on slide 58.)
59. Available TEL datasets.
60. Recommender Technologies.
61. Analysis according to the framework: supported tasks.
62. Analysis according to the framework: supported tasks, domain model.
63. Analysis according to the framework: user model.
64. Analysis according to the framework: personalization approach.
65. Analysis according to the framework: supported tasks, domain model, user model, personalization approach.
66. TEL RecSys::Ideal research design: 1. a selection of datasets for your RecSys task; 2. an offline study of different algorithms on the datasets; 3. a comprehensive controlled user study to test psychological, pedagogical and technical aspects; 4. rollout of the RecSys in real-life scenarios.
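Step 2 of this design, the offline study, amounts to replaying a historical dataset against candidate algorithms and comparing their error on held-out interactions. A minimal sketch in Python; the file name, column names, and the two baseline predictors are hypothetical, not from the slides:

```python
import numpy as np
import pandas as pd

# Hypothetical interaction log: one row per (learner, resource, rating).
ratings = pd.read_csv("tel_dataset.csv")  # columns: learner, resource, rating

# Hold out ~20% of the interactions as a test set.
rng = np.random.default_rng(seed=42)
is_train = rng.random(len(ratings)) < 0.8
train, test = ratings[is_train], ratings[~is_train]

global_mean = train["rating"].mean()
item_means = train.groupby("resource")["rating"].mean()

def predict_global(row):
    # Baseline 1: the overall mean rating, regardless of learner or resource.
    return global_mean

def predict_item(row):
    # Baseline 2: the resource's mean rating, falling back to the global
    # mean for resources that never appear in the training split.
    return item_means.get(row["resource"], global_mean)

def rmse(predict):
    errors = [predict(row) - row["rating"] for _, row in test.iterrows()]
    return float(np.sqrt(np.mean(np.square(errors))))

# Compare all candidate algorithms on the same held-out interactions.
for name, algo in [("global mean", predict_global), ("item mean", predict_item)]:
    print(f"{name}: RMSE = {rmse(algo):.3f}")
```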
67. TEL RecSys::Open issues: 1. evaluation; 2. datasets; 3. context; 4. visualization; 5. virtualization; 6. privacy.
68. Addressing the issues::LinkedUp. (Diagram: the stages of the LinkedUp competition and the LinkedUp Challenge environment.) The competition narrows over successive stages: initialisation (lowest requirements level for participation; initial prototypes and mockups; use of the data testbed required; 10 to 20 projects expected), working prototypes (medium requirements level; minimum amount of data sources; clear target user group; 5 to 10 projects expected), and deployment in real-world use cases (sustainable technologies reaching a critical amount of users; 3 to 5 projects expected; cash-prize awards and consulting). Challenge environment: LinkedUp Evaluation Framework (participation criteria, methods and test cases), LinkedUp Data Testbed, competitor ranking list. Support actions: dissemination (events, training), data-sharing initiatives, community building and clustering, technology transfer, and a network of supporting organisations.
69. Addressing the issues::LinkedUp (same diagram, with an overlaid invitation to become part of the LinkedUp network and win prizes at international conferences and workshops).
70. Addressing the issues::LinkedUp.
71. Development of the Evaluation Framework (diagram). P1: Initialisation and Evaluation, M0-M6: preparation; P2: Establishment, M7-M18: competition cycle; P3: Exit and Sustainability, M18-M24: finalising. Cycle: draft EF proposal, expert validation of EF, EF release, 3x competition, review of EF, refinement, new version of EF, final EF. Inputs: literature review, Group Concept Mapping, Cognitive Mapping, documentation, dissemination, practical experiences and refinement. (Stefan Dietze, 25/05/12)
72. Group Concept Mapping. GCM resembles the Post-it-notes problem-solving technique and the Delphi method. It involves participants in a few simple activities (generating, sorting, and rating ideas) that most people are used to. GCM differs in two substantial ways: 1. robust analysis (MDS and HCA): GCM takes the original participants' contributions and quantitatively aggregates them to show their collective view (as thematic clusters); 2. visualisation: GCM presents the results of the analysis as conceptual maps and other graphical representations (pattern matching and go-zones).
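To make the two analysis steps concrete: multidimensional scaling (MDS) projects the sorted statements onto a 2-D point map, and hierarchical cluster analysis (HCA) groups them into thematic clusters. A minimal sketch with scikit-learn and SciPy; the tiny co-sorting matrix is invented (real GCM studies have on the order of 100 statements and dozens of sorters):

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical sorting data: sortings[p][i] is the pile that participant p
# placed statement i into. Statements sorted together count as "similar".
sortings = np.array([[0, 0, 1, 1, 2],
                     [0, 1, 1, 1, 2],
                     [0, 0, 0, 1, 1]])
n_participants, n_statements = sortings.shape

# Dissimilarity: fraction of participants who did NOT co-sort a pair.
co_sorted = np.zeros((n_statements, n_statements))
for piles in sortings:
    co_sorted += (piles[:, None] == piles[None, :])
dissimilarity = 1.0 - co_sorted / n_participants

# MDS projects the statements onto a 2-D "point map".
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
points = mds.fit_transform(dissimilarity)

# HCA groups the points into thematic clusters for the "cluster map".
tree = linkage(points, method="ward")
clusters = fcluster(tree, t=2, criterion="maxclust")
print(points.round(2))
print(clusters)
```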
73. Group Concept Mapping. Example: EU FP7 Handover: 105 criteria about accurate handover training interventions; sorting on similarity in meaning; rating on importance and feasibility.
74. Group Concept Mapping: a point map.
75. Group Concept Mapping: a cluster map.
76. Group Concept Mapping: clusters' labels.
77. Group Concept Mapping: rating map (importance).
78. Group Concept Mapping: rating map (feasibility).
79. Group Concept Mapping (figure).
80. Practical experiences and refinement (diagram detail): draft EF, 3x competitions, review of EF, refinement, new version of EF.
81. Many Thanks::Questions? This presentation is available at: slideshare.com/Drachsler. Email: hendrik.drachsler@ou.nl, wolfgang.greller@ou.nl. Supporting projects: (logos).
