This presentation describes a project that used virtual learning environment (VLE) data to send first-year students weekly automated feedback on their engagement. The project aimed to improve student engagement and progression: students received emails summarising their VLE activity, and survey results showed that many students changed their VLE usage and would take part again. The deck recommends using VLE data to provide feedback, provided appropriate ethical approval is obtained first.
Using VLE data to provide weekly automated feedback on student engagement
1. PredictEd: Using VLE data to provide weekly automated feedback on student engagement
Owen Corrigan
Mark Glynn
Alan F. Smeaton
Sinéad Smyth
@glynnmark
2. @glynnmark #heie
Outline
• The Challenge
• What did we do?
• How did we give feedback?
• What did the students say?
• What are the benefits & drawbacks?
• The Recommendations
3. Challenge
1) Provide regular, timely feedback to first-year students on their engagement with the VLE
2) Improve first-year students' engagement with course materials and, by association, improve overall progression rates through improved grades
3) Demonstrate how VLE data can be harnessed to advance course design.
6. Benefits & Drawbacks
Benefits:
• +2.67% in participants' final exam scores
• Staff became more aware of what content on the course pages students engaged with
• Overall increase in awareness amongst staff of the benefits of the VLE
Drawbacks:
• Requires extensive ethical consideration
• Possible Hawthorne effect
• Gains not as significant as desired
7. What did the students say?
Students who took part were asked to complete a short survey at the start of Semester 2; N = 133 (11% response rate).

Question | Group 1 (more detailed email) | Group 2
% of respondents who opted out of PredictED during the course of the semester | 4.5% | 4.5%
% who changed their Loop usage as a result of the weekly emails | 43.3% | 28.9%
% who would take part again / are offered and are taking part again | 72.2% (45.6% / 26.6%) | 76.6% (46% / 30.6%)
8. Conclusions & Recommendations
• VLE data can be used to provide regular automated feedback and feedforward to students
• Communicate thoroughly with the students and staff involved in the project
• If you decide to use VLE data, ensure that you get appropriate ethical clearance first
11. Study by numbers
• 17 modules across the University (first year, high failure rate, use Loop, periodicity, stability of content, lecturer on board)
• Offered to students on an opt-in/opt-out basis; over-18s only
• 76% of students opted in; 377 opted out; no difference among cohorts
• 10,245 emails sent to the 1,184 students who opted in, over 13 weekly email alerts
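The weekly alert mechanism described above can be sketched in a few lines. This is a minimal illustration only, not the project's actual code: the data structure and function name are hypothetical, and PredictED's real emails also carried a performance prediction.

```python
from statistics import mean

def weekly_summary(student_id, logins_by_student, week):
    """Build a weekly feedback message comparing one student's VLE
    (Loop) activity for a given week against the class average.
    Hypothetical sketch of the kind of email described in the deck."""
    own = logins_by_student[student_id][week]
    avg = mean(weeks[week] for weeks in logins_by_student.values())
    trend = "above" if own > avg else "at or below"
    return (f"Week {week + 1}: you accessed Loop {own} times; "
            f"the class average was {avg:.1f}. You are {trend} the average.")

# Toy data: login counts per student per week
logins = {"s1": [3, 5], "s2": [1, 2], "s3": [2, 2]}
print(weekly_summary("s1", logins, 1))
```

Running the full pipeline weekly would simply loop this over every opted-in student and hand the message bodies to a mailer.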
12. 33% said they changed how they used Loop. We asked them how.
• Studied more
– “More study”
– “Read some other articles online”
– “Wrote more notes”
– “I tried to apply myself much more, however yielded no results”
– “It proved useful for getting tutorial work done”
• Used Loop more
– “I tried harder to engage with my modules on loop”
– “I think as it is recorded I did not hesitate to go on loop. And loop as become my first
support of study.”
– “I logged on more”
– “I read most of the extra files under each topic, I usually would just look at the lecture
notes.”
– “I looked at more of the links on the course nes pages, which helped me to further my
understanding of the topics”
– “I learnt how often I need to log on to stay caught up.”
13. Did you change Loop usage for other modules?
• Most who commented used Loop more often for other modules
– “More often”
– “More efficient”
– “Used loop more for other modules when i was logging onto
loop for the module linked to PredictED”
– “Felt more motivated to increase my Loop usage in general for
all subjects”
One realised that lecturers could see their Loop activity:
“I realised that since teachers knew how much i was using loop, i had to try to mantain pages long on so it looked as if i used it a lot”
14. So much student data we could use
Demographics
• Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities
Academic Performance
• CAO and Leaving cert, University exams, course preferences, performance relative
to peers in school
Physical Behaviour
• Library access, sports centre, clubs and societies, eduroam access yielding co-location with others and peer groupings, lecture/lab attendance
Online Behaviour
• Mood and emotional analysis of Facebook, Twitter, Instagram activities, friends and
their actual social network, access to VLE (Moodle)
15. Modules which work well …
• Have periodicity (repeatability) in Moodle access
• Confidence of predictor increases over time
• Don't have high pass rates (< 0.95)
• Have large number of students, early-stage
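The selection criteria above amount to a simple filter over candidate modules. A minimal sketch follows; the field names, the cohort-size threshold of 100, and the module code "XX401" are illustrative assumptions, not the project's exact cut-offs.

```python
def eligible(module):
    """Filter modules by the criteria listed above: early-stage (first
    year), pass rate below 0.95, and a large cohort. The cohort-size
    threshold of 100 is an assumed value for illustration."""
    return (module["year"] == 1
            and module["pass_rate"] < 0.95
            and module["n_students"] >= 100)

modules = [
    {"code": "CA103", "year": 1, "pass_rate": 0.80, "n_students": 220},
    {"code": "XX401", "year": 4, "pass_rate": 0.97, "n_students": 35},
]
chosen = [m["code"] for m in modules if eligible(m)]
```

The periodicity criterion would need the access-log time series rather than summary fields, so it is omitted here.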
16. PredictEd Participant Profile
• No significant difference in the entry profiles of participants vs. non-participants overall
19. Module Average Performance: Participants vs. Non-Participants

Subject | Description | Non-Participant | Participant
BE101 | Introduction to Cell Biology and Biochemistry | 58.89 | 62.05
CA103 | Computer Systems | 70.28 | 71.34
CA168 | Digital World | 63.81 | 65.26
ES125 | Social & Personal Dev with Communication Skills | 67.00 | 66.46
HR101 | Psychology in Organisations | 59.43 | 63.32
LG101 | Introduction to Law | 53.33 | 54.85
LG116 | Introduction to Politics | 45.68 | 44.85
LG127 | Business Law | 60.57 | 61.82
MS136 | Mathematics for Economics and Business | 60.78 | 69.35
SS103 | Physiology for Health Sciences | 55.27 | 57.03
Overall (all modules) | | 58.36 | 61.22

Average scores for participants are higher in 8 of the 10 modules analysed, and significantly higher in BE101, CA103 and MS136.
22. Importance of Ethics
• Ethics are important to ensure the safety of participants and researchers
• Educational data analytics is a new area of research
– Not much previous research to highlight possible ethical issues
– Requires extensive ethical consideration
• We have spent a lot of time this summer preparing a DCU REC submission
– We've submitted and had approval for a test case
– We've met with the REC chair to brief him
• We are following the 8 principles set out by the Open University, who are at exactly the same stage as us
23. Notes on model confidence
• Y axis is model performance measured as AUC (area under the ROC curve), not a probability
• X axis is time in weeks
• 0.5 or below is a poor result (no better than chance)
• Most modules start near 0.5 when we don't have much information
• 0.6 is acceptable, 0.7 is really good (for this task)
• The model should increase in confidence over time
• Even if confidence increases overall, randomness means it may go up and down week to week
• It should trend upwards to be a valid model and a viable module choice
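For readers unfamiliar with the metric: AUC is the probability that a randomly chosen positive example is ranked above a randomly chosen negative one, which is why 0.5 corresponds to chance. A minimal sketch of the computation (not the project's code):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank (Mann-Whitney) identity:
    the fraction of positive/negative pairs in which the positive
    example receives the higher score, counting ties as half a win."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A predictor that ranks every positive above every negative scores 1.0; a constant predictor scores 0.5, matching the "chance" baseline on the plots.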
24. Timescale for Rollout
• Still some issues on Moodle access log data transfer
to be resolved
• Still have to resolve the mapping between student name / email address / Moodle ID / student number
• Still to resolve timing of when we can get new
registration data, updates to registrations (late
registrations, change of module, change of course,
etc.) …
• Should we get new, “clean” data each week ?
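Reconciling the identifiers listed above is essentially a record-linkage join. A minimal sketch, assuming both systems can export (id, email) pairs; the function name, export shape, and example addresses are all assumptions for illustration:

```python
def link_ids(moodle_rows, registry_rows):
    """Link Moodle IDs to student numbers via a shared email address,
    keeping a list of Moodle IDs that could not be matched (e.g. late
    registrations not yet in the registry export)."""
    by_email = {email: student_no for student_no, email in registry_rows}
    linked, unmatched = {}, []
    for moodle_id, email in moodle_rows:
        if email in by_email:
            linked[moodle_id] = by_email[email]
        else:
            unmatched.append(moodle_id)
    return linked, unmatched

# Toy exports: (moodle_id, email) and (student_number, email)
moodle = [("m1", "a@example.edu"), ("m2", "b@example.edu")]
registry = [("19001", "a@example.edu")]
linked, unmatched = link_ids(moodle, registry)
```

The unmatched list is exactly the set that late-registration updates would need to resolve each week.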
25. Why did you take part?
• The majority of students
wanted to learn/monitor
their performance
• Many others were curious
• Some were interested in
the Research aspect
• Some were just following
advice
• Others were indifferent
26. How easy was it to understand the information in the emails? (1 = not at all easy, 5 = extremely easy)
• Average 3.97 (SD= 1.07)
• Very few had comments to make
(19/133)
– Most who commented wanted more
detail.
Editor's Notes
On Leaving Cert data: P(T<=t) two-tail = 0.110956
On Maths LC points: P(T<=t) two-tail = 0.199347
Key point here: students with higher prior educational attainment tend to score more highly in modules. This suggests that any difference in results between participants and non-participants is not due to differences in their prior educational attainment profiles; the two groups are the same.
Shane Dawson, Australia
Significance based on t-test results and z-scores; no statistically significant difference in the other modules.
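The two-tailed p-values quoted above come from t-tests comparing the two groups' entry profiles. The statistic itself can be sketched in a few lines; converting it to a p-value needs the t distribution (e.g. via scipy), which is omitted here to keep the sketch self-contained.

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with possibly
    unequal variances: the difference of means divided by its standard
    error. Illustrative sketch, not the project's analysis code."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se
```

Identical samples give t = 0, and a group with the higher mean gives a positive statistic; whether the observed t is significant then depends on the degrees of freedom.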
The majority of students wanted to learn/monitor their performance
“thought it might help my grade”
“For feedback”
“To see if I was participating well and engaging with my modules”
“To monitor my progress in the module”
“To get updates on how I was expected to perform in the exams based on my loop access”
“To motivate me and tell me when I'm not working enough on the module, as a reminder”
Many others were curious
“I was curious about the result and it was no effort to take part so, why not?”
“Out of interest to see how i get on”
“Because I wanted to know where I stood amongst other classmates”
“Thought it would be interesting to see if there was a correlation”
“curiosity”
“try it out”
“Just to see what would happen”
Some were interested in the Research aspect
“I do Science Ed so interested in outcomes of such research”
“Because I felt my contribution would be beneficial to the researchers”
“I wanted to aid in the reaearch that was being done”
“To be helpful and give my opinion”
Some were just following advice
“I was advised to”
“Because it was recommended”
“Peer pressure”
“Because the lecturers talked many times about it”
Others were indifferent
“I was sent a link”
“I didn't see any reason not to”
“no reason”
“For the craic”
“why not”