2013 07 05 (uc3m) lasi emadrid pmmerino uc3m evaluacion plataformas e learning analitica aprendizaje: Presentation Transcript

    • Evaluation in e-Learning Platforms Using Learning Analytics Techniques Pedro J. Muñoz-Merino Contact: pedmume@it.uc3m.es Universidad Carlos III de Madrid
    • 2 Introduction: Evaluation ● Evaluation - Understanding as many aspects of the learning process as possible in order to improve learning ● Traditional evaluation methodologies - Surveys - Personal interviews - Pre-tests and post-tests
    • 3 Introduction: Learning analytics for evaluation ● Learning analytics - Especially useful with large numbers of students - Large amounts of data available - Transformation of the data into useful information - This information can be combined to draw useful conclusions for the evaluation
    • 4 What can be evaluated? ● Importance of selecting the proper metrics for the evaluation process - Materials: parts to improve - Topics - Students o Learning o Behaviour o Learning profiles - Learning process - Tools
    • 5 Does the platform influence the evaluation? ● Depending on the platform, some metrics cannot be retrieved, so part of the information needed for the evaluation is simply unavailable ● Some metrics are similar across platforms - E.g. use of videos, the same exercise framework ● Other measures are quite different; the semantics and features of each platform should be taken into account - E.g. gamification features, a different exercise framework
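    To make the commonalities and differences concrete, here is a minimal Python sketch of one way to isolate platform differences behind a common metric interface. The class names and the event-log format are hypothetical, not any platform's real API:

```python
# Sketch: a common metric interface with per-platform extractors,
# so shared metrics (e.g. videos completed) are computed uniformly
# while platform-specific log formats stay encapsulated.
# All names and the event format below are assumptions.
from abc import ABC, abstractmethod


class MetricExtractor(ABC):
    """Common metrics that several platforms can provide."""

    @abstractmethod
    def videos_completed(self, user_id: str) -> int: ...


class KhanAcademyExtractor(MetricExtractor):
    def __init__(self, events: list[dict]):
        self.events = events  # assumed: dicts with "user" and "type" keys

    def videos_completed(self, user_id: str) -> int:
        # Count completion events for this user in the raw log.
        return sum(1 for e in self.events
                   if e["user"] == user_id and e["type"] == "video_completed")
```

    A second extractor (e.g. for Google Course Builder) would implement the same interface over its own log format, while platform-specific measures such as gamification would live only in that platform's extractor.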
    • 6 Videos
    • 7 Example 1: Google Course Builder
    • 8 Example 2: Khan Academy
    • 9 Example 3: ISCARE competition tool
    • 10 How can the evaluation be done? ● The data on the platforms can be transformed in very different ways - Select the best way to transform the data depending on the purpose of the evaluation - E.g. evaluation of exercises (see the sketch below)
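    As a sketch of such a transformation, the snippet below aggregates raw interaction events into per-exercise metrics (attempts and correct ratio). The event format is an assumption:

```python
# Sketch: turn raw interaction events into per-exercise metrics.
from collections import defaultdict


def exercise_metrics(events):
    """events: iterable of dicts like {"exercise": "quadratics", "correct": True}."""
    stats = defaultdict(lambda: {"attempts": 0, "correct": 0})
    for e in events:
        s = stats[e["exercise"]]
        s["attempts"] += 1
        s["correct"] += int(e["correct"])
    # Derive the metric of interest: fraction of correct attempts per exercise.
    return {ex: {"attempts": s["attempts"],
                 "correct_ratio": s["correct"] / s["attempts"]}
            for ex, s in stats.items()}
```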
    • 11 Case Study: Khan Academy
    • 12 KA: Evaluation of exercises
    • 13 KA: Evaluation of topics
    • 14 KA: Individual reports: self-reflection
    • 15 KA: Evaluation of the whole class
    • 16 KA: Evaluation of correct progress ● Minimum conditions for correct progress ― 16 videos completed ― 21 exercises with proficiency ● Results ― 12 students made correct progress on the platform and are ready to move on to the face-to-face sessions
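    A minimal sketch of this minimum-conditions rule, assuming hypothetical field names in the per-student records:

```python
# Sketch of the slide's minimum-progress rule: 16 videos completed
# and 21 proficient exercises. Record fields are assumptions.
MIN_VIDEOS_COMPLETED = 16
MIN_PROFICIENT_EXERCISES = 21


def has_correct_progress(student: dict) -> bool:
    return (student["videos_completed"] >= MIN_VIDEOS_COMPLETED
            and student["proficient_exercises"] >= MIN_PROFICIENT_EXERCISES)


# Example with made-up records; the slide reports 12 students qualifying.
students = [{"id": "s1", "videos_completed": 16, "proficient_exercises": 23},
            {"id": "s2", "videos_completed": 10, "proficient_exercises": 25}]
ready = [s["id"] for s in students if has_correct_progress(s)]  # ["s1"]
```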
    • 17 KA: Evaluation of the efficiency
    • 18 KA: Evaluation of the total use ● 8 of the 44 students who did not make correct progress still invested considerable effort: they interacted for more than 225 minutes, started more than 15 videos, or made more than 20 attempts at different types of exercises ● There is a statistically significant correlation at the 99% level between total time (TT) and - videos completed (r=0.80) - videos started (r=0.81) - exercises attempted (r=0.71) - exercises with proficiency (r=0.73)
    • 19 KA: Evaluation of the optional items ● Optional items - Set goals - Update profile ● 17 students used some optional functionality. Correlation with - total time (r=0.16, p=0.19) - the percentage of proficiencies obtained (r=0.3, p=0.014) - the recommender/explorer parameter (r=0.1, p=0.42)
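    The kind of correlation analysis reported on these two slides could be reproduced roughly as follows with scipy.stats.pearsonr. The CSV file and column names are assumptions standing in for whatever per-student export the platform provides:

```python
# Sketch: Pearson correlations (with p-values) between total time
# and the usage metrics from the slides.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("ka_usage.csv")  # hypothetical per-student data
for col in ["videos_completed", "videos_started",
            "exercises_attempted", "exercises_proficient"]:
    r, p = pearsonr(df["total_time"], df[col])
    print(f"total_time vs {col}: r={r:.2f}, p={p:.3f}")
```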
    • 20 KA: Evaluation of exercise solving habits [Flowchart] Access to an exercise → Answers correctly? ― YES → Did the user answer reflexively? If YES: correct behaviour; if NO: increase the unreflective-user counter ― NO → Has the user seen the related video? If NO: increase video avoidance. Did the user ask for hints? If NO: increase hint avoidance
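    One possible reading of this decision flow as code; the attempt fields and the 5-second "reflexive" threshold are assumptions, not values from the slides:

```python
# Sketch: classify one exercise attempt and update habit counters,
# following the decision flow above. Field names and the time
# threshold are assumptions.
def classify_attempt(attempt: dict, counters: dict) -> None:
    if attempt["correct"]:
        if attempt["seconds_spent"] < 5:   # answered too fast to have reflected?
            counters["unreflective"] += 1  # correct, but likely a guess
        # otherwise: correct behaviour, nothing to count
    else:
        if not attempt["saw_related_video"]:
            counters["video_avoidance"] += 1  # skipped the supporting video
        if not attempt["asked_for_hints"]:
            counters["hint_avoidance"] += 1   # never requested a hint
```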
    • 21 KA: Evaluation of exercise solving habits ● Some statistics - 30.3 % hint avoiders - 25.8 % video avoiders - 40.9 % unreflective users - 12.1 % hint abusers ● Correlations between the habit indicators:

                        Hint avoid.   Video avoid.   Unrefl. user   Hint abuser
      Hint avoidance        1            0.382          0.607         -0.186
      Video avoidance       0.382        1              0.289          0.096
      Unrefl. user          0.607        0.289          1              0.317
      Hint abuser          -0.186        0.096          0.317          1
    • 22 Conclusions ● Learning analytics for evaluation - Study the features of the platform - Determine what is possible on the platform - Select the metrics that are relevant for the evaluation - Analyze the best way to calculate the metrics - Determine the commonalities and differences across platforms - Combine all the metrics to obtain an overall evaluation
    • Evaluation in e-Learning Platforms Using Learning Analytics Techniques Pedro J. Muñoz-Merino Contact: pedmume@it.uc3m.es Universidad Carlos III de Madrid