Leveraging Analytics
to Improve Student Success
Karen Vignare,
University of Maryland University College
@kvignare
Ellen Wagner, PAR Framework
@edwsonoma
Session Description
• This session shows how analytics can be used to identify
opportunities for improving student success.
• By the end of the session, participants will make connections
between predictions about risk, and the interventions most
likely to work best under varying conditions and with different
populations.
Setting the Context:
Data Are Changing Everything
“But education researchers have
always worked with data.”
• We do qualitative research with data
• We do quantitative research with data
• We do evaluations with data
• We develop surveys and instruments and experiments to
collect more data
• We pull data from LMSs, SISs, ERPs, CRMs …
• We write reports and summaries, make presentations, and develop articles, books, and webcasts…
From Hindsight to Foresight
Analytics in Higher Education
• Learning Analytics: best way to teach and learn
• Learner Analytics: best way to support students
• Organizational Analytics: best ways to operate a college
Together these make up Academic Analytics.
Create new insights and opportunities for
data in our practices
• Enrollment management
• Student services
• Program and learning experience design
• Content creation
• Retention, completion
• Gainful employment
• Institutional Culture
How Are We Doing So Far?
• Data remain the number 1 challenge in the adoption and use of analytics. Organizations continue to struggle with data accuracy, consistency, and access.
• Analytics efforts still focus primarily on reducing costs, improving the bottom line, and managing risk.
• Intuition, based on experience, is still the driving factor in data-driven decision-making. Analytics are used as one part of the process.
• Many organizations lack the proper analytical talent. Organizations that struggle to make good use of analytics often don't know how to apply the results.
• Culture plays a critical role in the effective use of data analytics.
GROUP DISCUSSION
• Is your institution using (or planning to use) academic analytics
specifically to improve student success?
• What kinds of questions are you trying to answer?
• What kinds of data are you planning to use?
• What kinds of barriers are you encountering?
Getting to the right answer takes work
• Analysis and model building is an iterative process.
• Around 70-80% of the effort is spent on data exploration and understanding (see the profiling sketch below).
[Diagram: SAS Analysis/Modeling Process]
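Since most of the effort lands in exploration, a first concrete step often looks like the minimal profiling pass below. This is a sketch only: the file name and columns are hypothetical stand-ins for an SIS/LMS extract.

```python
import pandas as pd

# Hypothetical extract from a student information system or LMS;
# substitute your own source file and schema.
df = pd.read_csv("student_records.csv")

# Profile every column: type, missing rate, and cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "pct_missing": df.isna().mean().round(3),
    "n_unique": df.nunique(),
})
# Columns with heavy missingness or surprising cardinality are where
# most of the 70-80% exploration effort tends to go.
print(profile.sort_values("pct_missing", ascending=False))
```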
Link Predictions to Action
• Predictive analytics refers to a wide variety of methodologies. There is no single "best" way of doing predictive analytics. You need to know what you are looking for.
• Knowing who is at risk is simply not enough. Predictions have value when they are tied to what you can do about it (see the routing sketch below).
• Linking behavioral predictions of risk with interventions at the best points of fit offers a powerful strategy for increasing rates of student retention, academic progress, and completion.
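As a minimal sketch of tying a prediction to an action, the snippet below routes each student's risk score to an intervention. The thresholds and intervention names are illustrative assumptions, not PAR's actual decision rules.

```python
from dataclasses import dataclass

@dataclass
class Student:
    student_id: str
    risk_score: float  # predicted probability of non-success, 0..1

def choose_intervention(student: Student) -> str:
    # Illustrative risk bands; a real deployment would tune these
    # against measured intervention effectiveness.
    if student.risk_score >= 0.7:
        return "advisor outreach call"
    if student.risk_score >= 0.4:
        return "automated nudge + tutoring referral"
    return "monitor only"

for s in (Student("A1", 0.82), Student("B2", 0.45), Student("C3", 0.10)):
    print(s.student_id, "->", choose_intervention(s))
```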
The PAR Framework: collaborative, national, multi-institutional, non-profit.
Institutional Effectiveness + Student Success
What PAR does
PAR uses descriptive, inferential, and predictive analyses to create benchmarks and institutional predictive models, and to inventory, map, and measure student success interventions that have a direct positive impact on behaviors correlated with success.
Linking Predictions to Action
• Identify obstacles and remove barriers from student success
pathways.
• Provide actionable information so students and advisors can
build informed opportunity pathways.
• Know where to invest in student success, leveraging collaborative insight that determines return on investment in interventions and support.
PAR analytic toolset
• Benchmarks & Insight
• Predictive Analytics
• Intervention Inventory and ROI Tools
• Diagnostics
PAR web tools
• Benchmarks & Insight
• Predictive Analytics
• Intervention Inventory and ROI Tools
• Student Success Matrix (SSMx)
PAR by the Numbers
• 2.2 million students and 24.5 million courses in the PAR data warehouse, in
a single federated data set, using common data definitions.
• 48 institutions, 351 unique campuses.
• 77 discrete variables are available for each student record in the data set. An additional two dozen constructed variables are used to explore specific dimensions and promising patterns of risk and retention.
• 343 discrete interventions filtered on predictor behaviors, point in student
life cycle, student attributes, institutional priorities and ROI factors in the
growing SSMx dataset.
Structured, Readily Available Data
• Common data definitions = reusable predictive models and meaningful comparisons.
• Openly published under a CC license at https://public.datacookbook.com/public/institutions/par
Speak the same language
PAR Puts it All Together
• Determine students' probability of failure (predictions)
• Determine which students respond to interventions (uplift modeling; see the sketch below)
• Determine which interventions are most effective (explanatory modeling)
• Allocate resources accordingly (cost-benefit analysis)
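The uplift-modeling step can be sketched with a simple two-model ("T-learner") approach: fit separate retention models for treated and untreated students, then score each student's uplift as the difference in predicted retention. Everything below is synthetic; it illustrates the idea, not PAR's actual models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic placeholder data: features, a treatment flag, and outcomes.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # student features
treated = rng.integers(0, 2, size=1000)   # 1 = received the intervention
retained = (rng.random(1000) < 0.55 + 0.15 * treated).astype(int)

# One retention model per group.
model_t = LogisticRegression().fit(X[treated == 1], retained[treated == 1])
model_c = LogisticRegression().fit(X[treated == 0], retained[treated == 0])

# uplift = P(retained | intervention) - P(retained | no intervention)
uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
print("five students most likely to respond:", np.argsort(uplift)[-5:])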
Findings from aggregated dataset
Positive predictors:
• High school GPA (when available)
• Dual enrollment (HS/college)
• Any prior credit
• CC GPA
• Credit ratio (construction sketched below)
• Successful course completion
• Positive completion of DevEd courses
Negative predictors:
• Withdrawals
• Low number of credits attempted
Varies, but can be significant:
• Pell Grant recipient
• Taken DevEd
• Age
• Fully online student
• Race
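As a sketch of how two of these predictors might be constructed from course-level records, assuming "credit ratio" means credits earned over credits attempted; the column names are hypothetical, not PAR's published data definitions.

```python
import pandas as pd

# Hypothetical course-level records for two students.
records = pd.DataFrame({
    "student_id":        ["A", "A", "B", "B", "B"],
    "credits_attempted": [3, 3, 3, 4, 3],
    "credits_earned":    [3, 0, 3, 4, 3],
    "grade":             ["B", "W", "A", "C", "B"],
})

# Roll up to one row per student: totals plus a withdrawal count.
features = records.groupby("student_id").agg(
    earned=("credits_earned", "sum"),
    attempted=("credits_attempted", "sum"),
    withdrawals=("grade", lambda g: (g == "W").sum()),
)
features["credit_ratio"] = features["earned"] / features["attempted"]
print(features)  # A: ratio 0.5, 1 withdrawal; B: ratio 1.0, none
```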
Common Challenges for Intervention Effectiveness
• Measurement resources are usually located separately from intervention planning & implementation resources
• Predictors are not linked to interventions, nor interventions to outcomes
©PAR Framework 2015
PAR Student Success Matrix (SSMx)
• An organizational structure that helps institutions
inventory, organize and conceptualize interventions
aimed at improving student outcomes.
• A common framework for classifying interventions
• Provides a basis for intervention measurement
©PAR Framework 2015
The SSMx grid crosses predictor categories with phases of the student life cycle:
• Rows (predictors): learner characteristics; learner behaviors; fit/feelings of belonging; other learner support; course/program characteristics; instructor behaviors
• Columns (time): connection, entry, progress, completion
©PAR Framework 2015
SMALL GROUP DISCUSSION
How Are You Measuring
Interventions at YOUR Institution?
Specific Examples of Data-Driven Improvements
• UMUC / U of Hawaii – replication of community college success prediction studies
• U of Hawaii – "Obstacle courses"
• University of North Dakota – predictive models tied to student watchlist data
• Intervention measurement at Sinclair CC and Lone Star CC
• National online learning impact study on student retention (in press; based on results from >500,000 students taking onground, blended, and online courses)
Intervention Measurement –
Student Success Courses Results
• 12-month credit ratio: only 1 of the 8 Student Success Courses analyzed showed a statistically significant positive effect for students taking the course vs. those who did not.
• Retention: 7 of the 8 courses showed a statistically significant positive effect, with retention higher by 14% to 4x.
Intervention Measurement –
Student Success Courses
[Chart: Course Component Summary]
About UMUC
 Public university offering online degree programs to a diverse population of working adults
 Largest open-access public online university in the U.S.
 Premier provider of higher education to the U.S. military since 1949
 Part of the University System of Maryland
Evolution of Data for Retention
20th Century:
 Historical
 Longitudinal
 Warehouse
 Siloed
 External Reporting
21st Century:
 Predictive
 Real-Time
 Dashboards
 Integrated Institutional Insights
 Continuous Improvements
Retention Resources at UMUC
 Institutional Research
 Institutional Effectiveness
 Business Intelligence
 Civitas Learning, Inc.
 PAR Framework, Inc.
Factors Included in Predictive Model for Retention at UMUC
 Pre-enrollment
 Demographics
 Enrollment
 LMS Engagement
 Student Performance
 Transfer
 Military
Key Factors for Retention at UMUC
 Campus
 Class Load
 Military Status
 Academic Performance
 Payment Method
Metrics at UMUC
 One-year retention (year-over-year, measured with a cohort)
 Re-enrollment (term-to-term metric that includes all students)
 Successful course completion (percentage of students receiving a successful grade)
 Graduation (1-, 2-, 3-, 4-, 5-, and 10-year rates track the graduation status of the starting cohort over time)
Two of these metrics are sketched in code below.
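A minimal sketch of how two of these metrics could be computed from an enrollment table. The column names and terms are hypothetical, not UMUC's schema, and "successful grade" is assumed here to mean C or better.

```python
import pandas as pd

# Hypothetical enrollment records spanning two fall terms.
enroll = pd.DataFrame({
    "student_id": ["A", "A", "B", "C", "C"],
    "term":       ["F2013", "F2014", "F2013", "F2013", "F2014"],
    "grade":      ["B", "C", "F", "A", "B"],
})

# One-year retention: share of the Fall 2013 cohort enrolled in Fall 2014.
cohort = set(enroll.loc[enroll["term"] == "F2013", "student_id"])
returned = set(enroll.loc[enroll["term"] == "F2014", "student_id"])
retention = len(cohort & returned) / len(cohort)

# Successful course completion: share of enrollments with a passing grade.
completion = enroll["grade"].isin(["A", "B", "C"]).mean()
print(f"one-year retention: {retention:.0%}, successful completion: {completion:.0%}")
```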
Retention Initiatives
 Curriculum Redesign (2010)
 8-week Standard Sessions (2010)
 Community College Transfer (2010)
 Registration Policy (2013)
 Onboarding (2014)
 Just-in-Time Messages (2014)
Retention Rates and Headcounts

               Fall 2011   Fall 2012   Fall 2013   Fall 2014
Stateside (%)  71.2        72.0        71.6        73.2
Overseas (%)   60.5        59.5        61.5        66.0
Headcount      47,416      46,213      41,197      41,356
Student Retention Enterprise Framework
Retention Root Cause Identification & Analysis:
• Retention Opportunity: diagnosed from internal data and external research
• Problem Analyzed: root cause analysis performed, plus a search of the existing body of knowledge for solutions
• Hypothesis Generated: work within the governance structure
• Test & Learn Cycle: levers pulled here; measure success & ROI; quarterly
• Operationalize or Re-create: operationalize successful tests; "lessons learned" fed back to the body of knowledge
Discussion
 How will you begin, or improve, your
analytics journey at YOUR institution?
Elements of a Data Model
 Use modeling to
 Test likely impact on retention when new initiatives or planned interventions are undertaken (see the what-if sketch below)
 Create models that build out retention impact by segments, e.g., demographics, academic programs, persistence
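One way to "test likely impact" is a what-if re-score of a fitted model: fit on historical data, then re-score with a planned intervention switched on and compare by segment. The data, column names, and coaching effect below are synthetic assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Synthetic historical data with a hypothetical "coaching" intervention.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "credit_ratio": rng.uniform(0, 1, 2000),
    "got_coaching": rng.integers(0, 2, 2000),
    "segment": rng.choice(["transfer", "first-time"], 2000),
})
p_true = 0.35 + 0.4 * df["credit_ratio"] + 0.1 * df["got_coaching"]
df["retained"] = (rng.random(2000) < p_true).astype(int)

X = df[["credit_ratio", "got_coaching"]]
model = LogisticRegression().fit(X, df["retained"])

# What if everyone got coaching? Compare predicted retention by segment.
baseline = model.predict_proba(X)[:, 1]
everyone = model.predict_proba(X.assign(got_coaching=1))[:, 1]
print(df.assign(lift=everyone - baseline).groupby("segment")["lift"].mean())
```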
Continual Improvement
Design Intervention → Collect Data → Analyze Data → Refine or Sunset → (repeat)
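The refine-or-sunset decision hinges on a significance check like the one behind the student success course results earlier. A hand-rolled two-proportion z-test sketch follows; the counts are made up for illustration.

```python
from math import erf, sqrt

def two_prop_ztest(x1: int, n1: int, x2: int, n2: int):
    # Pooled two-proportion z-test (two-sided).
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Retention of students who got an intervention vs. those who did not.
z, p = two_prop_ztest(x1=430, n1=500, x2=380, n2=500)
print(f"z = {z:.2f}, p = {p:.4f}")
# Significant positive effect -> refine and scale; no effect -> sunset.
```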
DISCUSSION
THANK YOU FOR YOUR INTEREST
Editor's Notes

  • #8 Analytics is not one-size-fits-all. Three major areas of analytics in HE, according to Russ: Learning (the act & process of learning; curricular; best way to teach and learn), Learner (demographics, behaviors; best way to support students, and individually is the goal), Organizational (capacity, budget, scheduling; best way to operate a college)
  • #17 Placeholder for demo sections
  • #18 Placeholder for demo sections
  • #35 We can give examples from each category
  • #36 These are commonly used to report retention (as opposed to measuring success)
  • #39 Presentation notes: This is the strategic framework that we need to implement. In "Operationalize or Re-create", we didn't abandon anything, because it always rolls back into the body of knowledge. Governance is part of this!
  • #43 Based on PAR