The document summarizes 10 lessons learned from large-scale online learning initiatives. The lessons are: 1) Improve how learners find information within courses. 2) Encourage more frequent student engagement. 3) Use adaptive online math learning systems. 4) Get students to participate in learning communities. 5) Use student profiles to strategize interventions. 6) Anticipate and support students at risk of disengagement. 7) Offer catch-up options for students who fall behind. 8) Optimize instructional video design. 9) Allow students to struggle with concepts rather than always providing solutions. 10) Ensure courses support curiosity, challenge, and creativity.
Using Research to Improve Online Math Learning
1. Learning at Scale
Using Research To Improve Learning Practices and Technology for Teaching Math
Maria H. Andersen, Ph.D.
Chief Troublemaker, Edge of Learning
Math Faculty, Westminster College
18. Learners are in the maze, and they can’t see the big picture.
19. Many of you have been in a maze like what our students experience.
20. Wouldn’t it be nice to have a transporter to get right to where you want to go?
22. Why don’t students seem to be able to find information that is clearly printed in the syllabus?
27. The impact of findability on student motivation, self-efficacy, and perceptions of online course quality
Authors: B. Simunich, D. Robins & V. Kelly
Published: 2015
Study size: n=81; students completed 7 tasks, navigating either a well-constructed course (according to QM) or a “broken” course
28. What else can we do to help learners navigate this maze?
29. Help students to build a physical map to anchor memories as they learn.
31. 7.00x Introduction to Biology: The Secret of Life, MITx on edX Course Report, 2013 Spring
Authors: D. Seaton, J. Reich, S. Nesterko, et al.
Published: 2014
Study size: 38k signups, 3k certified
(Seaton et al., 2014)
32. 6.00x Introduction to Computer Science and Programming, MITx on edX, 2012 Fall
Authors: S. Features, P. Grimson, J. Guttag
Published: 2014
Study size: 84.5k signups, 5.7k certified
33. 14.73x The Challenges of Global Poverty, MITx on edX, 2013 Spring
Authors: P. Black, P. White
Published: 2014
Study size: 40k signups, 4.6k certified
37. Examining Engagement: Analysing Learner Subpopulations in MOOCs
Authors: R. Ferguson, D. Clow
Published: 2015
Study size: 34k signups, 7k full participants, 4 MOOCs
42. Are you ever surprised by things you can find on the Internet?
Stand up if you know who the Borg are.
Stay standing if you’ve ever done an internet search for a cat in a Borg costume.
44. A hierarchy of online math learning based on principles from learning research:
Traditional course < Online homework (all students do the same problems) < Online homework (quantity and difficulty adapt to the student’s mastery level) < Online homework (adapts to the student’s mastery level and learner profile or preferences)
49. Learning is not a spectator sport: doing is better than watching for learning from a MOOC
Authors: K. Koedinger, J. Kim, J. Jia, et al.
Published: 2015
Study size: 27k registrants, 1k completers; “Introduction to Psychology as a Science” delivered through the Coursera and OLI platforms
52. Our digital lives are with us everywhere; our learning should be too.
53. Stand up if you teach online or rely heavily on digital learning materials.
Stay standing if you think it’s easy to get students to participate in these.
Stay standing if you use discussion boards with your students.
56. Comments in MOOCs: who is doing the talking and does it help?
Authors: B. Swinnerton, S. Hotchkiss & N.P. Morris
Published: 2017
Study size: 25k active learners, 8k commenters; 5k students completed a pre-course survey (roughly half commenters and half non-commenters)
61. Predicting Student Retention in Massive Open Online Courses using Hidden Markov Models
Authors: G. Balakrishnan
Published: 2013
Study size: 30k enrolled students in a “Software as a Service” MOOC
67. Using Data Mining to Differentiate Instruction in College Algebra
Authors: R. Manspeaker
Published: 2011
Study size: 524 students in College Algebra using digital homework, over 2 semesters
69. Overachievers (OA) (Manspeaker, 2011)
Characteristics:
• see mathematics as useful
• do not see how the class specifically applies to their lives
• high grades in math
What do they struggle with?
• non-standard problems
• problems not found on homework or worked in class
70. Underachievers (UA) (Manspeaker, 2011)
Characteristics:
• intelligent and well-prepared
• bored and frustrated with the course
• do reasonably well on the first exam
What do they struggle with?
• procedural and non-standard problems
• low scores in most aspects of the course
• high risk of dropping out
71. “Employees” (E) (Manspeaker, 2011)
Characteristics:
• feel the class is a menial job for which they get “paid” with a passing grade
• dislike math
• learn through memorizing
• excel at procedural problems
What do they struggle with?
• problems involving creative thinking or independent thought
• attendance (starts strong, then drops)
72. Rote Memorizers (RM) (Manspeaker, 2011)
Characteristics:
• rely on memorizing to get through math
• negative views of math, which get worse over time
• poorest scores
What do they struggle with?
• emporium-style courses
• everything
73. Sisyphean Strivers (SS) (Manspeaker, 2011)
Characteristics:
• excellent attendance
• do well on written homework
• show high understanding of math when interviewed
• do very poorly on exams
What do they struggle with?
• non-standard questions
• application problems
• exams
• online homework
75. Stand up if students drop your classes.
Stay standing if the digital systems you use make it easy to see which students you should intervene with.
Stay standing if you have strategies in place to try to intervene before disengagement.
76. 6. Anticipate and catch students as they fall off the path to success.
77. Community Insights: Emerging Benchmarks & Student Success Trends from across the Civitas
Authors: Civitas Learning
Published: 2016
Study size: 2 million learners, 55 colleges
84. Using Bayesian Learning to Classify College Algebra Students by Understanding in Real-Time
Authors: A. Cousino
Published: 2013
Study size: 505 students over 2 semesters
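Cousino's actual classifier is not reproduced here, but the core mechanic of real-time Bayesian classification is a single Bayes-rule update per observed response. This is a minimal sketch: the prior, the likelihoods, and the "struggling" category are all invented for illustration, not taken from the dissertation.

```python
# Hypothetical sketch: update P(student is struggling) after each homework
# response via Bayes' rule. All probabilities below are invented.

def bayes_update(prior, p_obs_given_struggling, p_obs_given_ok):
    """Return the posterior P(struggling | observation)."""
    numerator = p_obs_given_struggling * prior
    evidence = numerator + p_obs_given_ok * (1 - prior)
    return numerator / evidence

# Assumed likelihoods: struggling students miss a problem 70% of the time,
# non-struggling students miss it 20% of the time.
P_MISS_IF_STRUGGLING = 0.7
P_MISS_IF_OK = 0.2

p = 0.3  # assumed prior: 30% of students struggle
for missed in [True, True, False]:  # a student's responses, in order
    if missed:
        p = bayes_update(p, P_MISS_IF_STRUGGLING, P_MISS_IF_OK)
    else:
        p = bayes_update(p, 1 - P_MISS_IF_STRUGGLING, 1 - P_MISS_IF_OK)

print(round(p, 3))  # posterior after two misses and one correct answer
```

The appeal for instruction is that the posterior updates after every item, so a digital homework system can reclassify a student mid-assignment rather than waiting for an exam.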
87. Examining Engagement: Analysing Learner Subpopulations in MOOCs
Authors: R. Ferguson, D. Clow
Published: 2015
Study size: 34k signups, 7k full participants, 4 MOOCs
89. Learner subpopulations (Ferguson & Clow, 2015):
• Samplers: visit, but only briefly (commented in some courses)
• Strong Starters: complete the first week, but then drop out
• Returners: make it through 2 weeks, but then drop out
• Midway Dropouts: drop out around assessment 3 or 4
• Nearly There: make it through ¾ of the course, then drop
• Late Completers: complete the course, but submit work late
• Keen Completers: complete the course, mostly on time (>80%)
91. Deconstructing Disengagement: Analyzing Learner Subpopulations in Massive Open Online Courses
Authors: R. Kizilcec, C. Piech, E. Schneider
Published: 2013
Study size: 94k active learners in 3 computer science MOOCs
99. How Video Production Affects Student Engagement: An Empirical Study of MOOC Videos
Authors: P. Guo, J. Kim, R. Rubin
Published: 2014
Study size: 6.9 million video-watching sessions across four courses on the edX MOOC platform
105. 9. Don’t aim for perfect. Let students get stuck a little more.
106. Your click decides your fate: Inferring Information Processing and Attrition Behavior from MOOC Video Clickstream Interactions
Authors: T. Sinha, P. Jermann, N. Li, et al.
Published: 2014
Study size: 66k enrolled, 36.5k interacted with videos; 48 video lectures (producing 10 GB of JSON data) from “Functional Programming in Scala” run on Coursera
107. 10 GB of JSON video click data (Sinha et al., 2014)
Event codes: Play (Pl), Pause (Pa), SeekFw (Sf), SeekBw (Sb), ScrollFw (SSf), ScrollBw (SSb), RatechangeFast (Rf), RatechangeSlow (Rs)
Example n-grams: PlPaSbPl (rewatch), RfPaPlPa (fast watching), SSbSbPaPl (clear concept)
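The n-gram step in this pipeline is mechanical and easy to reproduce; a minimal sketch of it follows. The event abbreviations match the slide, but the mapping from n-grams to behavioral categories (Sinha et al.'s actual contribution) is not shown.

```python
# Sketch: turn a student's video clickstream into consecutive n-grams of
# event codes, the first step of the Sinha et al. (2014) analysis.

def ngrams(events, n):
    """Return all consecutive n-grams from a sequence of event codes."""
    return [tuple(events[i:i + n]) for i in range(len(events) - n + 1)]

session = ["Pl", "Pa", "Sb", "Pl"]  # play, pause, seek-back, play: a "rewatch"
print(ngrams(session, 2))
# [('Pl', 'Pa'), ('Pa', 'Sb'), ('Sb', 'Pl')]
```

Counting these n-grams per video segment is what lets researchers (or a platform) label a segment as rewatched, skimmed, or skipped without ever surveying the student.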
108. Information Processing Index (IPI) (Sinha et al., 2014)
109. If students’ rewatching behavior changes from low to high IPI, they are 33% less likely to drop out. (Sinha et al., 2014)
111. Community Insights: Emerging Benchmarks & Student Success Trends from across the Civitas
Authors: Civitas Learning
Published: 2016
Study size: 4 million learners (this part looked at 2 years of data), 55 colleges
I taught math in higher education for about 14 years. And the grass was nice, and I liked working with students, and all in all, I was happy.
But I began looking at the grass outside the fence of academe and I became tired of the slow pace of change, the paper grading, the whining, and the educational technology that didn’t seem to improve.
About 5 years ago, I was enticed to go through the gate to the other side. I spent the last 5 years working either for or closely partnered with educational technology companies to build products for learning. I built MOOCs. I built adaptive learning platforms. I designed massive learning experiences at a school of 70,000 students (all online). Our biochem course had 8000 students and our college algebra course had 3000 students in it. It was fun, but working for one company felt too restrictive. I decided I wanted to be a wild horse.
I took a leap of faith. (and I looked for a skydiving pony but the Internet did not deliver that one). I left a job about 5 months ago to have a “good think” on everything I’ve learned in my 5 years of extremely accelerated learning in the ed tech industry. In fact, this presentation is an outgrowth of that. I wanted to take a step back and see what we actually have gained from the last 5 years of crazy digital learning experiments.
Personally, and looking through the lens of the higher education industry, I’ve watched us transition from institutions that were wholly face-to-face learning…
To teaching online courses. Now, where is the instructor … Oh look! There’s the instructor. He’s the picture-in-picture. In all seriousness though, isn’t that what it feels like? We’ve gone from being in the same space with students to peeking in from the outside?
And then we graduated from “merely” online courses to massive courses and learning experiences. For those that teach MOOCs, or manage large learning platforms, we are looking at learning from a scale of 10,000, 100,000 or millions of students engaged with the product. We manage a population of learners the size of a city.
After watching all of this progression from multiple perspectives, I began to wonder … what have we actually learned? We’ve looked at learning through a statistical sample size larger than we have ever had before. What do we actually know now? Should we be adjusting anything down the line in the smaller courses? By the way, there will be a quiz.
The purpose of showing you this data and relating these experiences (often from at-scale courses and MOOCs) is not because I want to encourage you all to go out and do these things. It is because for the first time in history, we have extremely large data sets that tell us about natural eLearning engagement and persistence patterns.
This leads me to a question and a finding. First the question … What are the 20something students and students with Bachelors degrees going to MOOCs to learn that we are not teaching them in college? The finding – there is a hunger for courses outside the traditional structure of higher education, but mostly by those that already have successfully learned how to learn.
The human development index of the country of residence of the student correlates solidly (r=0.83) with completion rates. Higher HDI scores are associated with higher life expectancy at birth, mean years of schooling and expected years of schooling for the population, and a decent standard of living. If we calculated an HDI for the populations we serve, would we see similar trends?
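A correlation like the r=0.83 reported here is a plain Pearson coefficient, so checking the analogous trend for your own population is a one-line computation. The numbers below are invented placeholder data, not the study's.

```python
import numpy as np

# Hypothetical data: HDI of a student's country of residence vs. that
# cohort's MOOC completion rate. Values are made up for illustration.
hdi        = np.array([0.55, 0.70, 0.80, 0.90, 0.94])
completion = np.array([0.03, 0.05, 0.08, 0.10, 0.12])

# Pearson correlation coefficient between the two series.
r = np.corrcoef(hdi, completion)[0, 1]
print(round(r, 2))
```

With real institutional data you would substitute, say, a local socioeconomic index for HDI and course completion for MOOC certification, then see whether the same strong positive relationship appears.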
You probably get to “launch courses” 2 to 3 times a year. Every time, you carefully rework your syllabi to ensure the best possible outcome. What happens a week or two later? In the software world we use the term “agile,” which means quick iteration cycles, typically 2-4 weeks. In MOOC-land, I was able to launch a course, watch several hundred or thousand students wander through the “startup materials,” and then iterate for courses launching the next week. Every week we discovered lessons about what worked and what didn’t.
As instructors, we can see the whole plan from above.
Why aren’t students seeing the Announcements?
Why would I move away from the grid format?
But don’t just take my word for it. There have actually been studies. Here’s a small-n study on eLearning in particular – navigation is a huge frustration point for students. But also, there are thousands and thousands of studies and hundreds of books on the topic of usability in software experiences (which is essentially what we deliver though eLearning platforms).
Exams are weeks 5, 9, and 13
Red lines are a weekly email. Activity drops to HALF between emails/deadlines. The email/deadline sucks the learners back in for another cycle through another week. Don’t feel bad about frequent deadlines and announcements.
Maybe math really is just harder to learn than everything else? Or … what if prerequisites are really THAT important?
After looking at somewhere around 50 case studies, white papers, dissertations, and journal articles, I’m willing to go out on a limb here and say that using an online math learning system can only help your students. I can no longer, in good conscience, recommend a course that uses only traditional pencil-and-paper homework.
But why? Why has the scale tipped in favor of online math? The first step up gives us spaced repetition, feedback, and greater accountability. The second step gives us more efficient learning and mastery of content. The third step would (theoretically) eliminate some of the boredom and deliver appropriate challenge.
1. Online homework can adjust to the pace of each student. Students can go slower or faster and dive in to resources as needed.
2. Repeated engagement throughout the week (not just once or twice a week).
4. Immediate dopamine. Think of it as customer support. We want immediate results. We want immediate help. Think about this the next time you find yourself complaining about the poor service from a company. Students feel that way too.
Based on the work of Rachel Manspeaker. Very interesting dissertation.
The most influential impact comes from doing activities, with a normalized coefficient of 0.44 (a 1 standard deviation increase in doing activities produces a 0.44 sd increase in quiz score). The strength of this relationship is more than six times the impact of watching video or reading pages (both
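A "normalized coefficient" is simply a regression slope computed after both variables have been z-scored, which is what makes 0.44 comparable across predictors measured in different units. A toy sketch with invented data:

```python
import numpy as np

def standardized_slope(x, y):
    """Regression slope of y on x after z-scoring both variables.

    Because both series are in standard-deviation units, slopes for
    different predictors can be compared directly."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return np.polyfit(zx, zy, 1)[0]

# Invented data: activities completed per week vs. quiz score.
activities = np.array([2.0, 4.0, 6.0, 8.0])
quiz       = np.array([50.0, 60.0, 70.0, 80.0])

print(round(standardized_slope(activities, quiz), 2))  # perfectly linear data -> 1.0
```

In Koedinger et al.'s setting, the same computation run on "activities done", "videos watched", and "pages read" is what supports the "doing is six times better than watching" comparison.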
3. We have, as a species, gone digital. 90% of our students have Smartphones. We are competing for the students’ digital attention. Without digital platforms that deliver learning, we might not get any learning. While our devices DO distract us, they also provide a way to carry around everything you are learning. Much the same way I can carry 500+ research papers in my laptop (or access them from the cloud), students can learn when they have time if they have their resources.
Stand up if you believe that participating in discussion or groupwork in an online class is good for students. Stay standing if you think it’s easy to get students to participate.
Internet communities (but also social communities)
“Superposters” (creators) make more than 2 comments.
Boxplots show the majority of commenters fall in the range from 2-4 comments for the whole course. This gives us some sense of natural non-incentivized participation rate. If you want participation/interaction, you will have to incentivize it.
Here’s a great irony: our most social-media-savvy students (the 18-to-20-somethings) posted less (by median) than the older students.
This does tell us something interesting about part-time (PT) workers, though. And the number for full-time (FT) educators has got to be some kind of proof that teachers really are busier than everyone else.
Participating in online discussion
Even lurking is an activity that correlates with persistence.
Too bad it’s near impossible to get analytics on who has viewed threads, huh?
Drive the lurkers back in to see what another student said that was so great.
What is a “persona”? For our purpose, a persona represents a cluster of math learners who exhibit similar behavioral and decision making patterns
OA is an overachiever. Between weeks 6 and 8, the OA students start to learn topics that are new to them. Their habits weren’t formed to deal with new stuff.
There’s a clear path laid for students to follow at the beginning of the course, but students have to actually start down that path to be successful.
LMS activity is more than just logins to a particular course. It is all LMS logins at the institution over this 2-week period.
For students who log in more than 5 times, predicted persistence is 92%.
4 or fewer logins in the first 14 days
Only 0 or 1 login in the first 14 days; furthermore, about half of these students will leave the institution.
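Even without a vendor dashboard, an early-alert rule like this is easy to script against an LMS login export. The bucket boundaries below follow the bands above (more than 5 logins, 4 or fewer, 0 or 1), but the exact cutoffs, labels, and student IDs are my own invention.

```python
# Sketch: flag students by LMS logins in their first 14 days, mirroring the
# persistence bands above. Thresholds and data are illustrative guesses.

def risk_bucket(logins_first_14_days):
    if logins_first_14_days > 5:
        return "likely-persister"  # ~92% predicted persistence
    if logins_first_14_days >= 2:
        return "watch"             # the "4 or fewer logins" middle band
    return "high-risk"             # 0-1 logins; ~half leave the institution

# Invented login counts keyed by (hypothetical) student ID.
students = {"s01": 9, "s02": 3, "s03": 0}
flags = {sid: risk_bucket(n) for sid, n in students.items()}
print(flags)
```

The point is that the intervention list exists by day 14, weeks before the first exam would surface the same students.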
It turns out there’s another crucial point in a course (particularly in a math course) where we are at risk for losing students.
This is one that you might not have thought of.
OA is an overachiever. Between weeks 6 and 8, the OA students start to learn topics that are new to them. Their habits weren’t formed to deal with new stuff.
Stand up if you have had any of the following circumstances in the last year: were sick for days in a row, had a sick child or relative for several days in a row, had your car break down for a week, got in a car accident, went to a several-day conference for work, got married, got divorced, got dumped, or had any other major life crisis.
Stay standing if you found it easy to keep up with your job while these things were going on.
The percentages don’t add to 100; it’s complicated. Two of the MOOCs had quite a different structure: some categories did not appear for these, and other categories were much higher. Note that 31% of those who started dropped and never came back.
From week to week in the courses, researchers tracked the student movement between categories.
Pay particular attention to the students who “fall out” each week.
Pay really close attention. Notice anything? They never get back on track. In fact, the best you can hope for after you fall out is to bounce between audit (some engagement with the course) and out (no engagement with the course).
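Week-to-week movement between categories is naturally modeled as a Markov chain (the Balakrishnan study earlier does a richer, hidden-state version of this). The transition probabilities below are invented, chosen only to encode the observed pattern: once students leave "on-track," they bounce between auditing and out rather than returning.

```python
import numpy as np

# States: 0 = on-track, 1 = auditing, 2 = out. Transition probabilities are
# invented for illustration; note the zero probability of re-entering
# on-track from either other state ("they never get back on track").
T = np.array([
    [0.85, 0.10, 0.05],  # on-track mostly stays on-track
    [0.00, 0.60, 0.40],  # auditing never returns to on-track
    [0.00, 0.30, 0.70],  # out can bounce back to auditing only
])

state = np.array([1.0, 0.0, 0.0])  # week 0: everyone is on-track
for _ in range(8):                 # evolve the cohort through 8 weeks
    state = state @ T

print(state.round(3))  # shares on-track / auditing / out after week 8
```

Fitting a matrix like this from real weekly category counts would let you simulate where a cohort ends up, and quantify how much an intervention that restores a re-entry path (a nonzero first column) would change completion.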
Once you get left behind in a digital course, you can’t catch back up.
Many of us recorded our instructional videos years ago. We know a lot more now about what is effective.
They had assessment questions after the videos, so they measured engagement by looking at play data and engagement with the questions after the videos.
Including some kind of talking head increased engagement to 46% (vs. 33% for non-talking-head videos). Intersperse “talking head” segments between slides or tablet work, or use picture-in-picture options.
Keep your videos shorter than 6 minutes; students often do not watch to the end. Many “autoplay” videos were immediately paused (these were removed from the study data).
There was also slightly higher engagement in a less formal setting (sitting at a desk vs a studio recording). Record where you feel comfortable, where you’re more likely to make eye contact and be more engaging and enthusiastic.
Also better engagement on PLANNED vs unplanned lectures. In other words, while it is easy to cut up lectures recorded in class, it is not always engaging.
There was more engagement on tablet-style videos than slides.
These code combinations are called n-grams and were grouped into 7 behavioral action categories and an Information Processing Index.
These behavioral action categories were then used to infer the students’ perception of each video lecture segment. Compare the two regions where the Information Processing Index is high.
But I suspect it is due to an education design that is increasingly transactional instead of transformational.
Academic Performance is not the Primary Risk to Departure
Note: 2 years of data, 4 million learners
Transactional vs transformational
1. We need to improve findability.
2. Students need to be coerced into more frequent engagement.
3. Once you get left behind in a digital course, you can’t catch back up.