OAAI: Deploying an Open
Ecosystem for Learning Analytics

Use Hash Tags > #EDU13 and #OAAI
Open Academic Analytics Initiative
Deploying an Open Ecosystem

for Learning Analytics
Josh Baron
Senior Academic Technology Officer
Principal Investigator, OAAI
Marist College
Josh.Baron@Marist.edu

36%
(Reference: Integrated Postsecondary Education Data System, IPEDS)

13%
Four-Year Graduation Rate for Savannah State University (2010)
Open Academic Analytics Initiative
 EDUCAUSE Next Generation
Learning Challenges (NGLC)

 Funded by the Bill & Melinda
Gates Foundation
 $250,000 over a 15-month period
 Goal: Leverage Learning Analytics to create an open-source
academic early alert system and research “scaling factors”
OAAI Early Alert System Overview

“Creating an Open Academic Early Alert System”

Step #1: Develop the predictive model using historical data. Inputs:
 LMS data and SIS data
 Student aptitude data (SATs, current GPA, etc.)
 Student demographic data (age, gender, etc.)
 Sakai event log data and Sakai gradebook data

Step #2: Score current students with the predictive model to generate an
Academic Alert Report (AAR) identifying students “at risk” of not
completing the course.

Step #3: Deploy an intervention: an “Awareness” message or the Online
Academic Support Environment (OASE).
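
To make the pipeline above concrete, here is a minimal sketch of the train-then-score pattern, assuming historical LMS/SIS features have already been joined into one table per enrollment. All file and column names are hypothetical; OAAI's actual model was built with the Pentaho suite and released as PMML, and scikit-learn is used here only for illustration.

```python
# Minimal sketch of the early-alert train/score pattern (hypothetical names).
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Step 1: train on historical data (one row per student-course enrollment).
history = pd.read_csv("historical_enrollments.csv")   # hypothetical file
features = ["current_gpa", "sat_score", "logins_vs_course_avg",
            "gradebook_score_vs_course_avg"]
model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["completed"])    # 1 = completed course

# Step 2: score the current semester and flag "at risk" students for the AAR.
current = pd.read_csv("current_enrollments.csv")      # hypothetical file
current["p_complete"] = model.predict_proba(current[features])[:, 1]
aar = current[current["p_complete"] < 0.5]            # illustrative threshold
print(aar[["student_id", "course_id", "p_complete"]])
```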
OAAI Goals and Milestones
 Build learning analytics-based early alert system
 Sakai Collaboration and Learning Environment
 Secure data capture process for extracting LMS data

 Pentaho Business Intelligence Suite
 Open-source data mining, integration, analysis & reporting

 OAAI Predictive Model released under an open license
 Predictive Modeling Markup Language (PMML; see the export sketch after this list)

 Researching learning analytics scaling factors
 How “portable” are predictive models?
 What intervention strategies are most effective?
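
PMML is an open XML standard for exchanging trained models between tools, which is what makes the released model usable outside any one vendor's stack. As a hedged illustration only (the third-party sklearn2pmml package is an assumption, not the project's Pentaho-based tooling), a fitted model can be written out as PMML like this:

```python
# Illustrative PMML export. Assumes the third-party sklearn2pmml package
# (which needs a Java runtime); file and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn2pmml import sklearn2pmml
from sklearn2pmml.pipeline import PMMLPipeline

history = pd.read_csv("historical_enrollments.csv")   # hypothetical file
features = ["current_gpa", "sat_score", "logins_vs_course_avg"]
pipeline = PMMLPipeline([("classifier", LogisticRegression(max_iter=1000))])
pipeline.fit(history[features], history["completed"])

# Any PMML-aware scoring engine can now consume this file.
sklearn2pmml(pipeline, "early_alert_model.pmml")
```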
Research Design
 Deployed the OAAI system to 2,200 students across
four institutions
 Two Community Colleges
 Two Historically Black
Colleges and Universities

 Each participating instructor taught three sections
 One section served as the control; the other two were
treatment groups
Research Design
Each instructor received an Academic Alert Report (AAR)
three times during the semester: at the 25%, 50%, and
75% points. A sketch of that schedule follows.
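
A trivial sketch of computing those three delivery dates; the semester dates below are hypothetical:

```python
# Compute AAR delivery dates at 25%, 50%, and 75% of the semester.
from datetime import date, timedelta

semester_start = date(2013, 1, 22)   # hypothetical semester dates
semester_end = date(2013, 5, 10)
length_days = (semester_end - semester_start).days

for fraction in (0.25, 0.50, 0.75):
    aar_date = semester_start + timedelta(days=round(length_days * fraction))
    print(f"AAR at {fraction:.0%}: {aar_date}")
```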
OAAI Predictive Model
Development and Portability Research Findings
Predictive Model Training
Predictors of
Student Risk
LMS predictors were measured relative to course averages.

Some predictors were discarded if not enough data was available.
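
The relative-to-course-average measurement can be sketched as a pandas groupby-transform; the columns and counts below are hypothetical:

```python
# Normalize an LMS activity predictor against its course average.
import pandas as pd

events = pd.DataFrame({
    "course_id":  ["BIO101", "BIO101", "BIO101", "ENG202", "ENG202"],
    "student_id": ["s1", "s2", "s3", "s4", "s5"],
    "logins":     [40, 10, 25, 5, 15],          # hypothetical counts
})
course_avg = events.groupby("course_id")["logins"].transform("mean")
events["logins_vs_course_avg"] = events["logins"] / course_avg
print(events)
```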
Predictive Power
Initial Portability
Research Findings
Compared predictive elements
and correlations in the Marist
data to what was found by Dr.
John Campbell in his
dissertation research at
Purdue.
Institutional Profiles
Spring ’12 Portability Findings
Fall ’12 Portability Findings
Intervention Strategies
Design, Deployment and Research Findings
OAAI Early Alert System Overview (recap)

As diagrammed earlier: historical LMS, SIS, aptitude, and demographic data
(including Sakai event log and gradebook data) train the predictive model;
scoring current students produces the Academic Alert Report (AAR) of
students “at risk” of not completing the course; an intervention is then
deployed, either an “Awareness” message or the Online Academic Support
Environment (OASE).
Awareness Messaging
“Based on your performance on recent graded
assignments and exams, as well as other factors that
tend to predict academic success, I am becoming worried
about your ability to successfully complete this class.

I am reaching out to offer some assistance and to
encourage you to consider taking steps to improve your
performance. Doing so early in the semester will
increase the likelihood of you successfully completing the
class and avoid negatively impacting your academic
standing.”
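
Operationally, a message like this has to be generated for every student the AAR flags. A hypothetical mail-merge sketch (the field names and template are assumptions, not OAAI's code):

```python
# Hypothetical mail-merge: turn the AAR's flagged-student list into
# per-student "awareness" messages like the one quoted above.
from email.message import EmailMessage

TEMPLATE = (
    "Dear {name},\n\n"
    "Based on your performance on recent graded assignments and exams, "
    "as well as other factors that tend to predict academic success, "
    "I am becoming worried about your ability to successfully complete "
    "this class. ..."                    # remainder of the quoted message
)

def awareness_email(student: dict) -> EmailMessage:
    msg = EmailMessage()
    msg["To"] = student["email"]
    msg["Subject"] = f"Checking in about {student['course_id']}"
    msg.set_content(TEMPLATE.format(name=student["name"]))
    return msg

print(awareness_email({"name": "Pat", "email": "pat@example.edu",
                       "course_id": "BIO101"}))
```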
Online Academic Support
Environment (OASE)
OASE Design Framework
 Follows online course design concepts
 Learner – Content Interactions
 Learner – Facilitator Interactions
 Learner – Mentor Interactions

 Leverages Open Educational Resources
Design Frame #1
Learner-Content Interactions
 Self-Assessment Instruments
 Assist students in self-identifying
areas of weakness
 OER Content for Remediation
 Focus on core subjects (math, writing)
 OER Content for Learning Skills
 Time management, test-taking strategies, etc.
Design Frame #2
Learner - Facilitator Interactions
 Academic Learning Specialist role
 Connecting learners to people & services
 Promoting services and special events
 Moderates discussions on pertinent topics
 Example: “Your first semester at college”
Design Frame #3
Learner - Mentor Interactions
 Online interactions facilitated by student
“mentor”
 Facilitates weekly “student perspective” discussions

 Online “student lounge” for informal interactions
 Let others know about study groups, etc.
Intervention Research Findings

Final Course Grades

[Chart: mean final grade (%) for “at risk” students in the Awareness,
OASE, and Control groups; y-axis from 50 to 100.]

• Analysis showed a statistically significant positive impact on final
course grades
– No difference between the treatment groups
• Saw a larger impact in spring than in fall
• Similar trend among low-income students
Intervention Research Findings
Content Mastery

[Chart: frequency of “at risk” students mastering the content (Yes/No) in
the Control vs. Intervention groups; y-axis from 0 to 1,000.]

• Students in the intervention groups were statistically more likely to
“master the content” than those in the controls
– Content Mastery = grade of C or better
• Similar for low-income students
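
A between-group difference like this is the kind of result a 2x2 contingency test supports. A sketch with loudly hypothetical counts (placeholders, not the study's data):

```python
# Chi-square test: content mastery (grade of C or better) vs. group.
# The counts below are illustrative placeholders, NOT the OAAI results.
from scipy.stats import chi2_contingency

#         mastered  not mastered
table = [[650, 350],     # intervention (hypothetical)
         [550, 450]]     # control      (hypothetical)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4g}")
```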
Intervention Research Findings
Withdrawals

[Chart: frequency of withdrawals (Yes/No) for “at risk” students in the
Control vs. Intervention groups; y-axis from 0 to 1,000.]

• Students in the intervention groups withdrew more frequently than
controls
• Possibly due to students avoiding withdrawal penalties
• Consistent with findings from Purdue University
Instructor Feedback
"Not only did this project directly assist my students by
guiding students to resources to help them succeed, but
as an instructor, it changed my pedagogy; I became more
vigilant about reaching out to individual students and
providing them with outlets to master necessary skills.
P.S. I have to say that this semester, I received the highest
volume of unsolicited positive feedback from
students, who reported that they felt I provided them
exceptional individual attention!"
Future Research Interests
 Factors that impact intervention effectiveness
 Intervention Immunity – students who do not respond
to the first intervention never do
 Student Engagement – How can we increase the
level of engagement with OASE?
 Can predictive models be customized for specific
delivery methods and programs/subjects?
 Can Learning Analytics identify “at risk” students who
would otherwise not be identified?
Social Networks Adapting
Pedagogical Practice (SNAPP)
 Learning Analytics and Knowledge Conference (LAK)
 The International Journal of the Society for Learning Analytics Research
 SoLAR Flares – regional conferences
 SoLAR Storms – distributed research lab
 MOOCs on Learning Analytics

http://www.solaresearch.org/
Questions?

Josh Baron
Josh.Baron@Marist.edu
@JoshBaron
Big Questions about Big Data
 Where is Learning Analytics and Big Data
in the “hype cycle”? Where will it end up?
 What do you see as the biggest ethical
issues surrounding Learning Analytics?
 How important is “openness” in Learning
Analytics? Open standards? Open
licensing?

 Could Learning Analytics end up driving
the wrong learning outcomes?
Learning Analytics Ethics
• “The obligation of knowing” – John Campbell
– If we have the data and tools to improve student
success, are we obligated to use them?
• Consider This > If a student has a 13% chance of passing
a course, should they be dropped? 3%?

• Who owns the data: the student? The institution?
– Should students be allowed to “opt out”?
• Consider This > Is it fair to the other students if by opting
out the predictive model’s power drops?

• What do we reveal to students? Instructors?
• Consider This > If we tell a student in week three they
have a 9% chance of passing, what will they do?
Editor's Notes

  • #5 Shockingly, it was as low as 3% in 2002!
  • #7 OK, so what is the OAAI and how are we working to address this problem…with the goal of leveraging Big Data to create an open-source academic early alert system that allows us to predict which students are at risk to not complete the course (and do so early on in the semester) and then deploy an intervention to help that student succeed.
  • #8 I’ll talk about our intervention strategies in a little more detail a bit later on in the presentation…
  • #9 OAAI is building on this prior success through two primary “thrusts”…the first is the creation of an “open ecosystem” for academic analytics…this ecosystem contains the Sakai…We will also be open sourcing our predictive model and releasing it using a standard mark-up language which will allow…The other major “thrust” is researching critical scaling factors…so we are looking at this issue of “portability”…how can you effectively take a model developed for one academic context (e.g. liberal arts school) and “port” it to another context (e.g. community college). We are also looking at what are the most effective intervention strategies, particularly those which use…
  • #21 I’ll talk about our intervention strategies in a little more detail a bit later on in the presentation…
  • #24 The approach we have taken is to create a “design framework” for the OASE
  • #25 Include some screenshots of examples from Khan, etc.
  • #26 Note that the goal is not to replace tutoring services or answer a lot of subject matter questions online.
  • #27 Note that the goal is not to replace tutoring services or answer a lot of subject matter questions online.
  • #39 If we determine a student is “at risk”, are we obligated to intervene?