Deconstructing Disengagement: Analyzing Learner Subpopulations in MOOCs

Presentation from Learning Analytics and Knowledge 2013.

The relatively low completion rates of learners have been a central critique as MOOCs grow in popularity. This focus on completion rates, however, implies a monolithic view of disengagement that fails to acknowledge alternative forms of participation in MOOCs. We develop a classifier which identifies four prototypical trajectories that learners take through MOOCs: Completing learners, Disengaging learners, Auditing learners, and Sampling learners. These subpopulations are defined by learners’ longitudinal patterns of engagement with assessments and video lectures; the subpopulations can be used as a lens to learn more about other aspects of the learners or the courses. Link to full paper in the final slide.


  1. Deconstructing Disengagement: Analyzing Learner Subpopulations in Massive Open Online Courses. René Kizilcec, Chris Piech, Emily Schneider
  2. MOOCs (in this paper) are instructionist + individualised: • 6-10 weeks long • 2-3 hours of video lectures/week • autograded assessments with regular deadlines • discussion forum
  3. Massive Open Online Courses. Heterogeneous population: learners join from anywhere in the world, at any age, for any reason
  4. Defining Success for Open-Access Learners. Assessment scores are problematic: • not comparable across courses • not available for all learners because test-taking is not aligned with learner goals
  5. Defining Success for Open-Access Learners. Completion rates are highly problematic: • numerator = certificate earners, i.e. learners who take assessments • denominator = total enrolled? (overestimate; an indicator of interest, not participation) or total active? (how defined?) • ignore the plurality of learner intentions • offer no nuance about subpopulations to help us design interventions or customized course features
  6. Defining Success for Open-Access Learners. Process measures hold promise: • conceptualize learning as an ongoing set of interactions with learning objects and other humans • allow early detection and prediction • indicate points for intervention
  7. Defining Success for Open-Access Learners: assessment scores, completion rates, process measures
  8. How to classify learners into meaningful subpopulations?
  9. Classification Criteria. Classification methods for MOOC subpopulations should be: Universal – valid across multiple courses; Theory-driven – reflect the processes of learning; Parsimonious – based on a small, meaningful feature set; Predictive – suggest likely outcomes; Dynamic – account for new information over time
  10. Lens for Analysis: • compare subpopulations • compare courses
  11. The Data
  12. Analyzed Three Courses
  13. Who took these MOOCs?
  14. A lot of data!
  15. Gender skew
  16. Interesting age group
  17. HDI skew
  18. Clustering
  19. Sub-populations basis? Engaged vs. Not Engaged
  20. Engagement [plot of an idealized engagement curve over time]
  21. Coarse Engagement Labels. (T) On Track: did the weekly assignment on time. (B) Behind: did the weekly assignment, but finished after the due date. (A) Auditing: watched videos but did not do the assignment. (O) Out: did not interact with the course, either through videos or assignments. We were able to predict who would take the final (AUC = 0.96).
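The weekly labels on slide 21 can be computed directly from per-week activity. Below is a minimal sketch in Python; the per-week field names (did_assignment, on_time, watched_video) are hypothetical placeholders, since the slides do not show how the features are extracted from the gradebook and clickstream data.

```python
# Sketch of the coarse weekly labeling described on slide 21.
# Field names below are hypothetical placeholders, not the paper's schema.

def weekly_label(did_assignment: bool, on_time: bool, watched_video: bool) -> str:
    """Assign one coarse engagement label for a single learner-week."""
    if did_assignment and on_time:
        return "T"  # On Track: did the weekly assignment on time
    if did_assignment:
        return "B"  # Behind: did the assignment, but after the due date
    if watched_video:
        return "A"  # Auditing: watched videos but did not do the assignment
    return "O"      # Out: no interaction with videos or assignments

def trajectory(weeks: list) -> list:
    """Turn a learner's per-week activity records into a label trajectory."""
    return [weekly_label(w["did_assignment"], w["on_time"], w["watched_video"])
            for w in weeks]

# Example: a learner who starts on track and gradually disengages.
example = [
    {"did_assignment": True,  "on_time": True,  "watched_video": True},
    {"did_assignment": True,  "on_time": False, "watched_video": True},
    {"did_assignment": False, "on_time": False, "watched_video": True},
    {"did_assignment": False, "on_time": False, "watched_video": False},
]
print(trajectory(example))  # ['T', 'B', 'A', 'O']
```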
  22-24. The Aggregate Class [chart of the aggregate class by weekly engagement label: A = Auditing, O = Out, T = On Track, B = Behind; in this picture, "Out" is not to scale]
  25. Example Student 1
  26. Example Student 2
  27. Example Student 3
  28. Clustering Methodology. There were 21,108 paths in the GS class
  29. Four Prototypical Trajectories. Cluster! (k-means with the L1 norm)
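Slide 29's "k-means with the L1 norm" can be sketched as a Lloyd-style iteration that assigns each trajectory to its nearest center under the L1 distance and updates centers as coordinate-wise medians. The ordinal encoding of the labels (T=3, B=2, A=1, O=0) and the helper names below are assumptions for illustration, not the paper's exact implementation.

```python
# Sketch of trajectory clustering with L1-distance k-means (a.k.a. k-medians).
# The label encoding is an assumed stand-in for the paper's actual scheme.
import numpy as np

ENCODING = {"T": 3, "B": 2, "A": 1, "O": 0}

def encode(trajectories):
    """Map lists of weekly labels (e.g. ['T', 'B', 'A', 'O']) to a learners x weeks matrix."""
    return np.array([[ENCODING[lab] for lab in t] for t in trajectories], dtype=float)

def l1_kmeans(X, k=4, n_iter=100, seed=0):
    """Cluster rows of X: assign by L1 distance, update centers with coordinate-wise medians."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign each trajectory to its nearest center under the L1 norm.
        dists = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the coordinate-wise median of its members.
        new_centers = np.array([np.median(X[labels == j], axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Toy example with a handful of four-week trajectories and k=4.
X = encode([list("TTTT"), list("TTTB"), list("AAAA"), list("AAAO"),
            list("TBOO"), list("TAOO"), list("AOOO"), list("OOOO")])
labels, centers = l1_kmeans(X, k=4)
print(labels, centers, sep="\n")
```

Scikit-learn's KMeans is tied to squared Euclidean distance, which is why the median-update variant is written out by hand in this sketch.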
  30. Four Prototypical Trajectories. And?
  31. The Four Prototypical Trajectories
  32. Prototypical Trajectory 1: Completing
  33. Prototypical Trajectory 2: Auditing
  34. Prototypical Trajectory 3: Disengaging
  35. Prototypical Trajectory 4: Sampling
  36. Four Prototypical Trajectories. Consistent across three courses: Auditing learners watch lectures throughout the course, but attempt very few assessments. Completing learners attempt the majority of assessments offered in the course. Disengaging learners attempt assessments at the beginning of the course, but then sparsely watch lectures or disappear entirely. Sampling learners briefly explore the course by watching a few videos.
  37. Four Prototypical Trajectories. The other courses?
  38. Four Prototypical Trajectories
  39. Four Prototypical Trajectories. <suspense>
  40. Four Prototypical Trajectories
  41. Four Prototypical Trajectories. Same pattern in all classes
  42. HS Composition [46k; chart of cluster shares: Completing, Auditing, Disengaging, Sampling]
  43. UG Composition [27k; chart of cluster shares: Completing, Auditing, Disengaging, Sampling]
  44. GS Composition [21k; chart of cluster shares: Completing, Auditing, Disengaging, Sampling]
  45. Validation
  46. Cluster Validation: • different values of k (split by time) • including “assignment pass” (95% overlap) • excluding “behind” (94% overlap) • silhouette of 0.8 (that’s pretty good) • passes the common sense test
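The silhouette value quoted on slide 46 can be checked with scikit-learn's silhouette_score under the Manhattan (L1) metric. The sketch below reuses the assumed l1_kmeans helper from the clustering sketch above and runs on synthetic stand-in trajectories; the paper computes the score on the real encoded learner data, which is what yields the reported 0.8.

```python
# Silhouette check for the L1 clustering; reuses l1_kmeans() from the sketch above.
import numpy as np
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Synthetic stand-in: 500 learners x 8 weeks, values in {0, 1, 2, 3} (O/A/B/T).
X = rng.integers(0, 4, size=(500, 8)).astype(float)

labels, _ = l1_kmeans(X, k=4)
score = silhouette_score(X, labels, metric="manhattan")  # silhouette under the L1 distance
print(f"silhouette coefficient: {score:.2f}")  # values near 1 indicate well-separated clusters
```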
  47. High Level: Clustering Engagement in MOOCs → Four Prototypical Patterns
  48. Results & Recommendations: Comparing Trajectories between Courses
  49. Overall Experience [charts of overall-experience ratings (scale 3.0-5.0) by trajectory (Auditing, Completing, Disengaging, Sampling) for the HS, UG, and GS courses]. Completing (and Auditing) learners have the best experience. Identify subpopulations early to customize course features.
  50. Discussion Forum [charts of average forum activity (0.1-10.0) by trajectory for the HS, UG, and GS courses]. Completing learners are most active on the forum. Causal relationship? Reputation systems & social features.
  51. Geographical Distribution. Trend confirmed by the top four participating countries: United States, India, Russia, United Kingdom
  52. Gender [charts of the male/female odds ratio (2-16) by trajectory for the HS, UG, and GS courses]. Female Completing learners are underrepresented in advanced courses. Stereotype threat? (Spencer et al., 1999). Frame assessments to minimize stereotype threat.
  53. Future Directions
  54. Future Directions. Experiments: collaboration and peer effects; interface customization; targeted interventions. Nuanced analytics: Auditing (MOOC-as-a-resource vs. MOOC-as-a-class); Disengaging (early prediction for intervention); reasons to enroll and trajectories. Engagement trajectories for real-time analytics in MOOCs: dashboard visualizations.
  55. Thank you! Stanford Lytics Lab (lytics.stanford.edu). Office of the Vice Provost for Online Learning. Roy Pea, Clifford Nass, Daphne Koller. Our LAK reviewers. Reference: S. Spencer, C. Steele, and D. Quinn. Stereotype threat and women’s math performance. Journal of Experimental Social Psychology, 35(1):4–28, 1999.
  56. More info? René Kizilcec (kizilcec@stanford.edu), Chris Piech (piech@cs.stanford.edu), Emily Schneider (elfs@cs.stanford.edu). Stanford’s Learning Analytics Group: Lytics Lab, lytics.stanford.edu. Paper: http://goo.gl/OSX72
