Although MOOCs, or Massive Open Online Courses, have been the subject of considerable discussion for the past couple of years, drawing both praise and ridicule for their alleged potential to transform higher education, we are only now beginning to understand even the simplest things about them. What does it take to produce a MOOC? Who offers to teach them and why? Who takes MOOCs and why? What factors affect successful completion of a MOOC, and what constitutes successful completion in the first place? Are MOOCs an effective way of teaching and learning? Answering these basic questions can be difficult even for experienced researchers, because MOOCs differ from other educational modalities in key ways.
Conducting reliable and systematic research on MOOCs is challenging for several reasons, including the following: 1) the commitment level of many students who enroll in MOOCs is low, which means that the amount of effort they expend to succeed in the courses varies markedly between students, and response rates for tests, assignments, and surveys are low; 2) the student populations from which MOOC students are drawn are fundamentally different from the student populations at traditional colleges and universities, which means that for MOOC students, a different set of variables may predict and explain academic outcomes like persistence, completion, engagement, and learning; 3) baseline measures against which to measure progress from beginning to end of a class or which to employ as statistical controls are not readily available; and 4) both the nature and degree of student attrition in MOOCs make it difficult to draw generalizable conclusions.
The research and evaluation team in the Office of Information Technology at the University of Minnesota has conducted a year-long investigation into a dozen Minnesota MOOCs, and in the course of this investigation we have developed approaches to MOOC research that attempt to address the obstacles described above. This presentation will outline crucial aspects of MOOC research methodology, and the audience will gain an understanding of how creative data collection and analysis methods can mitigate or bypass the barriers to systematic MOOC research. In particular, to answer our research questions we deployed multiple data collection methods: self-reported time-and-effort diaries; pre-course, post-course, and follow-up surveys; subject-matter knowledge exams; semi-structured faculty interviews; and MOOC vendor data. To analyze the data, we employed a mixed-methods approach that included a variety of qualitative and quantitative techniques, focusing on paired testing and paying careful attention to issues of differential attrition. We will also highlight selected findings from our study and show where we believe we have succeeded and where we still have work to do.
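The paired-testing approach mentioned above can be sketched as follows. This is a minimal illustration with hypothetical scores, assuming each record pairs one respondent's pre- and post-course exam results; scipy's `ttest_rel` stands in for whatever paired test the actual analysis used.

```python
# Minimal sketch of paired pre/post analysis on hypothetical exam scores.
# Only respondents with BOTH a pre- and a post-score enter the paired test,
# which sidesteps part of the attrition problem by comparing each student
# with him- or herself.
from scipy import stats

# Hypothetical records: (student_id, pre_score, post_score); None = missing.
records = [
    ("s01", 40, 75), ("s02", 55, 80), ("s03", 62, None),
    ("s04", 35, 60), ("s05", 50, 85), ("s06", 45, None),
    ("s07", 58, 88), ("s08", 42, 70),
]

paired = [(pre, post) for _, pre, post in records
          if pre is not None and post is not None]
pre_scores = [p for p, _ in paired]
post_scores = [q for _, q in paired]

# Paired t-test: did the same students score higher after the course?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"n paired = {len(paired)}, mean gain = "
      f"{sum(q - p for p, q in paired) / len(paired):.1f}, p = {p_value:.4f}")
```

Listwise deletion of incomplete pairs is the simplest option; it trades sample size for a within-student comparison that needs no baseline controls.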
How to Research MOOCs: A Primer (with Results)
D. Christopher Brooks, Ph.D.
J.D. Walker, Ph.D.
20th Annual Online Learning Consortium International Conference
29 October 2014
Your Background with MOOCs
How many of you…
• have participated in the design, teaching, or support of a MOOC?
• work for a university that has offered courses as MOOCs?
• work for a company that offers a MOOC platform (edX, Coursera, Udacity, etc.)?
MOOCs at Minnesota: Background
• Offered about 12 MOOCs since summer 2013:
– Statistical Molecular Thermodynamics
– Social Epidemiology
– Health Informatics
– Global Food Systems/Sustainability
– Canine Theriogenology
– Resilience in Children
– Fundamentals of Fluid Power
– Creative Problem-Solving
• Total nominal enrollment: 333,495
Why Researching MOOCs Is Hard
• Students not very motivated
• Very diverse and different student population
• Lack of baseline information
• Non-random attrition
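A minimal sketch of how non-random attrition can be checked, using hypothetical data: compare a baseline characteristic of completers against dropouts, where a significant difference means end-of-course results should not be generalized to all enrollees. The Welch t-test here is an illustrative choice, not necessarily the one used in the study.

```python
# Sketch of a differential-attrition check on hypothetical data: if students
# who drop out differ systematically at baseline from those who finish,
# end-of-course results cannot be generalized to all enrollees.
from scipy import stats

# Hypothetical baseline measure (e.g., hours/week the student intends to
# spend on the course), split by eventual completion status.
completers = [6, 8, 7, 9, 6, 7, 8, 10]
dropouts = [3, 5, 4, 6, 2, 4, 5, 3, 4, 5]

# Welch's t-test (no equal-variance assumption between the two groups).
t_stat, p_value = stats.ttest_ind(completers, dropouts, equal_var=False)
if p_value < 0.05:
    print("Attrition looks non-random: completers differ at baseline.")
```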
Advantages of MOOC Research
• Strength in numbers
• Large amount of non-self-report data
Kyle Bowen, http://classhack.com/post/76426264075/reallybigdata
MOOC Research Progression: Early Stages = Descriptive/Exploratory
S. Gross. “Blind Men and the Elephant Systems Thinking”
Early Stages: Descriptive/Exploratory
• What are MOOCs?
• Who takes MOOCs?
• Why do people take MOOCs?
• Who teaches MOOCs?
• What happens in MOOCs?
Kyle Bowen, http://classhack.com/post/41632460268/moocmart
Minnesota Faculty and Student Analysis
• Research questions:
– Time & effort
– Student characteristics
• Data sources:
– Time & effort diaries
– Faculty interviews
– Pre- and post-course surveys
– Coursera data
D. Christopher Brooks, “OZ: Road Work”
University of Minnesota
Instructor & TA Self-Reported Effort (hours)

Course                                      Planning   Executing      Total
Statistical Molecular Thermodynamics          207.50      205.50     413.00
Sustainability & Food Systems                 272.50      157.50     430.00
Interprofessional Health Care Informatics     263.00       56.00     319.00
Social Epidemiology                           114.50       74.50     189.00
Canine Theriogenology for Dog Enthusiasts      19.92       52.35      72.27
Unidentified & TAs                            234.95      242.00     476.95
TOTAL                                       1,112.37      787.85   1,900.22
AVERAGE                                       222.47      157.57     380.04
University of Minnesota:
Faculty Experience: Overview
• Personal satisfaction (despite considerable effort): student & professional connections
• Broad reach (despite low completion rates): professional satisfaction
• Keeping things fresh (despite the public debate): thinking about teaching
University of Minnesota:
Faculty Experience: Beyond the MOOC
• Big ideas
• Student-centered discussion & activities
• Brief lecture
University of Minnesota:
Post-MOOC Follow-Up: Why Offer MOOCs?
• 90% materials reuse!
• “I could go on and on about the benefits [teaching the MOOC has] brought to my students, and to my teaching and research.”
Kyle Bowen, http://classhack.com/post/76426180711/five-updated
University of Minnesota
Reasons for Enrolling: Factor Analysis
Four factors (eigenvalues in parentheses):
• University-related reasons (3.929)
• Access-related reasons (1.475)
• Professional reasons (1.312)
• Enjoyment-related reasons (1.029)
Survey items:
1. This subject is relevant to my academic field of study
2. This class teaches skills that will help my job/career
3. Because this course is offered by a prestigious university
4. I think taking this course will be fun and enjoyable
5. I am not geographically close to educational institutions
6. Traditional courses are too expensive
7. I was interested in taking a course with this professor
8. This course is offered by the University of Minnesota
9. General interest in the topic
10. To help me decide whether to take further college/university classes
11. To make professional connections
12. To obtain a badge or certification that will be useful to me professionally
• Total variance explained: 64.55%
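The eigenvalues above all exceed 1, which suggests the common eigenvalue-greater-than-one (Kaiser) retention rule. Here is a sketch of that step; the 12 survey items are real, but the responses below are simulated, so the resulting eigenvalues will not match the slide's.

```python
# Sketch of eigenvalue-based factor retention (Kaiser criterion: keep
# factors whose eigenvalue of the item correlation matrix exceeds 1).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic responses: 500 students x 12 Likert-style items, built from 4
# latent factors so roughly four eigenvalues exceed 1.
latent = rng.normal(size=(500, 4))
loadings = np.zeros((4, 12))
for f in range(4):
    loadings[f, 3 * f : 3 * f + 3] = 0.9   # 3 items load on each factor
items = latent @ loadings + 0.5 * rng.normal(size=(500, 12))

# Eigenvalues of the correlation matrix, largest first.
eigenvalues = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]
retained = (eigenvalues > 1).sum()
print(f"Eigenvalues: {np.round(eigenvalues, 3)}; retain {retained} factors")
```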
Intermediate Stages: Correlational Analysis
Success & Completion
• Defining success and completion matters
• How one defines success and completion matters
• Who defines success matters
• The denominator matters
Audrey Watters, “Say ‘MOOC’ One More Time,” https://www.flickr.com/photos/surreal_badger/8573233746/
University of Minnesota:
Predicting Completion: Student/Faculty Defined

Predictor                 Intent   Self-Reported   50% Completed   Total Points
Reasons for Enrolling
  University                 +           +               +              0
  Professional               +           0               0              0
  Access                     +           0               0              0
  Enjoyment                  +           0               0              0
Demographics
  English Proficiency        +           -               0              +
  Location: USA              0           +               -              0
  Age                        -           0               -              0
  Sex                        0           0               0              -
Obstacles to Completion
  Tech Unfamiliar                        0               0              -
  Connection Problems                    +               0              -
  Computer Problems                      0               0              0
  Time Zone Issues                       0               0              0
  Family Issues                          0               -              -
  Work Issues                            0               -              -
Intent                                   0               +              +
Self-Reported                                            +
Clickstream Data and Learner Intentions
Chen, B., et al., “How do MOOC learners’ intentions relate to their behaviors and overall outcome?”
Clickstream Data and Demographics
Guo & Reinecke (2014), “Demographic Differences in How Students Navigate through MOOCs”
MOOC Research Progression: Mature Stages = Controlled Comparative Designs
Buttered cat figures extracted from Greg Williams' WikiWorld
Computer Science: MOOC vs Hybrid Class
• Recommender systems MOOC, fall 2013, Professor Joseph Konstan
http://militantrecommender.blogspot.com/
Computer Science: MOOC vs Hybrid Class
Pre- and post-course surveys and knowledge test:
Q. What is the core idea behind dimensionality reduction recommenders?
a. To reduce the computation from polynomial to linear.
b. To strip off any product attributes so products appear simpler.
c. To reduce the computation time from O(n^3) to O(n^2).
d. To transform a ratings matrix into a pair of smaller taste-space matrices.
e. I have no idea.
Learning in a Physics MOOC: Pre- & Posttest
Colvin et al. (2014), “Learning in an Introductory Physics MOOC: All Cohorts Learn Equally, Including an On-Campus Class.”
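Pre/posttest comparisons in physics education research, including work in the tradition Colvin et al. build on, commonly use Hake's normalized gain, g = (post − pre) / (max − pre). A sketch with hypothetical scores:

```python
# Sketch of the normalized-gain measure used with physics pre/post tests
# (Hake's g): the fraction of the possible improvement a student actually
# achieved. Scores here are hypothetical, on a 0-100 scale.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """g = (post - pre) / (max_score - pre); undefined at a perfect pretest."""
    if pre >= max_score:
        raise ValueError("pretest already at ceiling")
    return (post - pre) / (max_score - pre)

# Hypothetical (pre, post) pairs for one cohort.
cohort = [(40, 70), (60, 84), (20, 52), (55, 91)]
gains = [normalized_gain(pre, post) for pre, post in cohort]
print(f"mean normalized gain = {sum(gains) / len(gains):.2f}")
```

Because g rescales each gain by the room left to improve, cohorts that start at different pretest levels can be compared on one scale, which is what a claim like "all cohorts learn equally" rests on.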
A/B Test of Instructor Involvement
Tomkin et al. (2014), “Do professors matter? Using an A/B test to evaluate the impact of instructor involvement on MOOC student outcomes.”
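The core comparison in an A/B test like this one is between outcome rates in the two randomized arms; a two-proportion z-test is one standard way to make it. A sketch with hypothetical counts, not figures from the study:

```python
# Sketch of the comparison at the heart of an instructor-involvement A/B
# test: a two-proportion z-test on completion counts in the involved arm
# vs. the control arm. All counts here are hypothetical.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for H0: the two arms have equal completion rates."""
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (success_a / n_a - success_b / n_b) / se

# Hypothetical: 520 of 8000 completed with active instructor involvement,
# 480 of 8000 completed without.
z = two_proportion_z(520, 8000, 480, 8000)
# Two-sided p-value from the standard normal CDF (via the error function).
p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, p = {p_two_sided:.3f}")
```

Note that even with thousands of students per arm, a half-point difference in completion rates can fail to reach significance, which is one reason MOOC-scale samples still need careful inference.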
Thanks!
D. Christopher Brooks, Ph.D. (cbrooks@educause.edu)
ECAR: http://www.educause.edu/ecar
Twitter: @dcbphd
J.D. Walker, Ph.D. (jdwalker@umn.edu)
University of Minnesota: http://z.umn.edu/research