2013-07-05 LASI eMadrid (UC3M), Pedro J. Muñoz-Merino: Evaluation in e-Learning Platforms Using Learning Analytics Techniques

  1. Evaluation in e-Learning Platforms Using Learning Analytics Techniques. Pedro J. Muñoz-Merino, Universidad Carlos III de Madrid. Contact: pedmume@it.uc3m.es
  2. Introduction: Evaluation
     ● Evaluation: understand as many aspects of the learning process as possible in order to improve learning
     ● Traditional evaluation methodologies: surveys, personal interviews, pre-test/post-test
  3. Introduction: Learning analytics for evaluation
     ● Learning analytics is especially useful with a large number of students, since much data is available
     ● The data is transformed into useful information, which can be combined to draw conclusions for the evaluation
  4. What can be evaluated?
     ● It is important to select the proper metrics for the evaluation process. Possible targets:
        - Materials: which parts to improve
        - Topics
        - Students: learning, behaviour, learning profiles
        - The learning process
        - Tools
  5. Does the platform influence the evaluation?
     ● Depending on the platform, some metrics cannot be retrieved, so part of the information is simply unavailable for the evaluation
     ● Some metrics are similar across platforms (e.g. use of videos, the same exercise framework)
     ● Other measures are quite different; the semantics and features of each platform should be taken into account (e.g. gamification features, a different exercise framework)
  6. Videos
  7. Example 1: Google Course Builder
  8. Example 2: Khan Academy
  9. Example 3: ISCARE competition tool
  10. How can the evaluation be done?
      ● The data on the platforms can be transformed in very different ways
      ● Select the best transformation of the data depending on the purpose of the evaluation, e.g. the evaluation of exercises (see the sketch below)
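
      As an illustration of such a transformation, the following minimal Python sketch aggregates a hypothetical log of exercise attempts into per-exercise metrics (success rate and attempts per student). The event schema is an assumption for illustration, not the export format of any particular platform.

          from collections import defaultdict

          # Hypothetical attempt log: (student, exercise, answered_correctly).
          attempts = [
              ("s1", "fractions", False), ("s1", "fractions", True),
              ("s2", "fractions", True),
              ("s1", "decimals", False), ("s2", "decimals", False),
          ]

          stats = defaultdict(lambda: {"attempts": 0, "correct": 0, "students": set()})
          for student, exercise, correct in attempts:
              s = stats[exercise]
              s["attempts"] += 1
              s["correct"] += int(correct)
              s["students"].add(student)

          for exercise, s in stats.items():
              print(f"{exercise}: success rate {s['correct'] / s['attempts']:.2f}, "
                    f"{s['attempts'] / len(s['students']):.1f} attempts/student")
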
  11. Case Study: Khan Academy
  12. KA: Evaluation of exercises
  13. KA: Evaluation of topics
  14. KA: Individual reports: self-reflection
  15. KA: Evaluation of the whole class
  16. KA: Evaluation of correct progress
      ● Minimum conditions for correct progress: 16 videos completed and proficiency in 21 exercises (see the sketch of this check below)
      ● Results: 12 students progressed correctly on the platform and were ready to move on to the face-to-face sessions
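
      A minimal sketch of how this progress check could be implemented; the thresholds mirror the slide, while the per-student summary structure is assumed to have been extracted from the platform logs beforehand.

          MIN_VIDEOS_COMPLETED = 16
          MIN_PROFICIENT_EXERCISES = 21

          # Hypothetical per-student summaries extracted from the platform.
          students = {
              "s1": {"videos_completed": 18, "proficient_exercises": 23},
              "s2": {"videos_completed": 10, "proficient_exercises": 25},
          }

          def has_correct_progress(summary):
              """Both minimum conditions from the slide must hold."""
              return (summary["videos_completed"] >= MIN_VIDEOS_COMPLETED
                      and summary["proficient_exercises"] >= MIN_PROFICIENT_EXERCISES)

          ready = [sid for sid, s in students.items() if has_correct_progress(s)]
          print(f"{len(ready)} student(s) ready for the face-to-face sessions: {ready}")
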
  17. KA: Evaluation of the efficiency
  18. KA: Evaluation of the total use
      ● 8 of the 44 students who did not progress correctly nevertheless made a considerable effort: they interacted for more than 225 minutes, started more than 15 videos, or made more than 20 attempts at different types of exercises
      ● There is a statistically significant correlation at the 99% level between the total time (TT) and: videos completed (r=0.80), videos started (r=0.81), exercises attempted (r=0.71), and exercises with proficiency (r=0.73); a sketch of the computation follows below
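
      The reported r values are Pearson correlation coefficients. A sketch of how they could be computed with SciPy, using made-up per-student vectors in place of the real measurements:

          from scipy.stats import pearsonr

          # Made-up per-student values; the real ones come from the platform logs.
          total_time = [310, 120, 450, 90, 260, 380, 150, 410]  # minutes
          videos_completed = [14, 5, 18, 3, 11, 16, 6, 17]

          r, p = pearsonr(total_time, videos_completed)
          # Significant at the 99% level if p < 0.01.
          print(f"r = {r:.2f}, p = {p:.3f}, significant at 99%: {p < 0.01}")
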
  19. KA: Evaluation of the optional items
      ● Optional items: set goals, update profile
      ● 17 students used some optional functionality. Correlation of optional-item use with:
         - the total time (r=0.16, p=0.19)
         - the percentage of proficiencies obtained (r=0.3, p=0.014)
         - the recommender/explorer parameter (r=0.1, p=0.42)
  20. KA: Evaluation of exercise solving habits
      ● Decision flowchart, starting from access to an exercise (implemented in the sketch below):
         - Answers correctly? YES → did the user answer reflectively? If YES, correct behavior; if NO, increase "unreflective user"
         - Answers correctly? NO → has the user seen the related video? If NO, increase "video avoidance"; if YES → did the user ask for hints? If NO, increase "hint avoidance"; if YES, correct behavior
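
      A sketch of the decision flow as code, assuming each attempt record carries the boolean flags used in the flowchart (all field names are invented for illustration):

          from collections import Counter

          def classify_attempt(attempt, counters):
              """Apply the slide's decision flow to one exercise attempt."""
              if attempt["answered_correctly"]:
                  if not attempt["answered_reflectively"]:
                      counters["unreflective_user"] += 1
                  # else: correct behavior, nothing to count
              elif not attempt["saw_related_video"]:
                  counters["video_avoidance"] += 1
              elif not attempt["asked_for_hints"]:
                  counters["hint_avoidance"] += 1
              # else: incorrect answer, but the user made use of the available help

          counters = Counter()
          classify_attempt({"answered_correctly": False, "saw_related_video": False,
                            "asked_for_hints": False, "answered_reflectively": False},
                           counters)
          print(dict(counters))  # -> {'video_avoidance': 1}
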
  21. KA: Evaluation of exercise solving habits
      ● Some statistics: 30.3% hint avoiders, 25.8% video avoiders, 40.9% unreflective users, 12.1% hint abusers
      ● Correlations between the habit indicators (a pandas sketch follows below):

                            Hint avoid.   Video avoid.   Unrefl. user   Hint abuser
           Hint avoidance      1             0.382          0.607         -0.186
           Video avoid.        0.382         1              0.289          0.096
           Unrefl. user        0.607         0.289          1              0.317
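
      A correlation matrix like the one above can be produced directly with pandas, assuming one row per student with the four habit indicators (column names and values below are illustrative):

          import pandas as pd

          # One row per student; values are made up for illustration.
          df = pd.DataFrame({
              "hint_avoidance":  [3, 0, 5, 1, 4],
              "video_avoidance": [2, 1, 3, 0, 2],
              "unreflective":    [4, 1, 6, 0, 5],
              "hint_abuse":      [0, 2, 0, 1, 0],
          })

          print(df.corr())  # Pearson correlation matrix, as on the slide
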
  22. Conclusions: learning analytics for evaluation
      ● Study the features of the platform and determine what is possible on it
      ● Select the proper metrics that are relevant for the evaluation
      ● Analyze the best way to calculate the metrics
      ● Determine the commonalities and differences between platforms
      ● Put all the metrics together to achieve a complete evaluation
  23. (Closing slide) Evaluation in e-Learning Platforms Using Learning Analytics Techniques. Pedro J. Muñoz-Merino, Universidad Carlos III de Madrid. Contact: pedmume@it.uc3m.es
