ACODE70
Despite calls for actionable information, few learning analytics approaches nationally allow staff to easily ‘do’ anything with data. Coupled with the typically long development cycles of software tools, this has the potential to stall uptake of learning analytics by interested staff. This presentation outlines two approaches, at the University of Sydney and Macquarie University, where staff were closely involved in the coevolution and development of two bespoke learning analytics tools to personalise student–staff interactions at scale. This allowed the tools to meet pressing needs, and has led to substantial organic adoption and positive student outcomes. These experiences highlight the importance of grassroots development for building wider learning analytics capabilities.
A tale of two universities - organic growth of learning analytics through bespoke coevolution
1. Page 1
A tale of two universities:
Organic growth of learning analytics through bespoke
coevolution
@dannydotliu
danny.liu@mq.edu.au
danny.liu@sydney.edu.au
2. Page 2
The contexts of learning analytics
Common barriers to adoption
– Policy and ethical challenges
– Culture of resistance to change
– Vendor solutions
– Data accuracy
– One-size-fits-all
Pressing institutional needs
– $Millions lost to attrition
– Larger class sizes
– More disconnected students
– Feedback very generalised
– Data are scattered
3. Page 3
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
4. Learning analytics in Moodle
MOTIVATIONS AND BENEFITS
• Benefits
– Staff familiarity
– Single point of access
– Learning experience data already there
• Problems
– No available learning analytics tool with actionable data
(Moodle’s existing reports: Log viewer, Statistics report, MOCLog)
5. ‘Participatory design’
ACADEMICS AND STUDENT SUPPORT STAFF
Staff expectations → Prototyping and development → User testing → Piloting → Feedback and further development
7. Stakeholder impacts
PERSONALISED, DATA-DRIVEN INTERVENTIONS
• 3400+ personalised emails sent, average ~46% opened
• From unit convenors and student support staff
• For census, updates, reminders
• With predominantly at-risk students
• Using logins, assessment submissions, grades, attendance
• Next: wider trials in semester 1, 2016
“I was surprised someone cared/was actually monitoring, kind of a weird, I don't know, totalitarian/'people are watching you' feeling? But in this situation I was happy.”

“Very useful. I wouldn't have been able to do such a large scale analysis and identify so many students without MEAP. I wouldn't have been able to send them such tailored, structured and consistent messages.”
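The workflow on this slide — flag students from engagement data, then send each a tailored message — can be sketched in a few lines. This is a minimal illustration of the pattern, not MEAP’s actual code: the field names, thresholds, and email template are all hypothetical.

```python
# Illustrative only: flag students from engagement data, then mail-merge a
# personalised message. Field names and thresholds are invented for this sketch.

students = [
    {"name": "Alex", "email": "alex@example.edu", "logins": 2, "submitted": False, "grade": 45},
    {"name": "Sam", "email": "sam@example.edu", "logins": 18, "submitted": True, "grade": 72},
]

TEMPLATE = (
    "Hi {name},\n"
    "We noticed you have logged in {logins} time(s) recently and your current "
    "grade is {grade}%. Please get in touch if we can help before census date."
)

def at_risk(student):
    """Illustrative rule: low logins, a missing submission, or a failing grade."""
    return student["logins"] < 5 or not student["submitted"] or student["grade"] < 50

# One (recipient, body) pair per flagged student; extra dict keys are ignored
# by str.format, so the whole record can be passed straight to the template.
messages = [(s["email"], TEMPLATE.format(**s)) for s in students if at_risk(s)]
```

The value for staff lies in expressing the `at_risk` rule themselves (census, reminders, updates) rather than accepting a fixed vendor-defined risk model.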
8. Page 8
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
9. Page 9
The Student Relationship Engagement System
– Attendance
– Interim grades
– LMS metrics
– Third-party tools & other data
10. Page 10
Personalising connections with students
– Empowering staff
– Flexible & intuitive
– Targeted and personalised
– Multi-channel
Benefits
– Highly customisable
– Efficient: key data in one place, operating at scale
– Connect staff and all students (not just at-risk)
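The “key data in one place” idea above can be sketched as a simple join of the data sources from the previous slide, keyed by student ID. This is a hypothetical illustration of the pattern, not the SRES implementation; the source names and values are invented.

```python
# Illustrative only: collate attendance, interim grades, and LMS metrics into
# one per-student view, so staff can filter on any combination of columns.

attendance = {"s001": 0.60, "s002": 0.95}      # fraction of classes attended
interim_grades = {"s001": 48, "s002": 81}      # mid-semester mark (%)
lms_logins = {"s001": 3, "s002": 21}           # LMS logins in the last fortnight

def combined_view(student_ids):
    """One record per student, with all key data in one place."""
    return {
        sid: {
            "attendance": attendance.get(sid),
            "interim_grade": interim_grades.get(sid),
            "lms_logins": lms_logins.get(sid),
        }
        for sid in student_ids
    }

view = combined_view(["s001", "s002"])

# Staff-defined filters work over any column, not just "at-risk" cut-offs —
# e.g. identifying highly engaged students to connect with as well:
engaged = [sid for sid, row in view.items() if row["attendance"] > 0.9]
```

Using `dict.get` keeps the view robust to students missing from a source, which matters when data are scattered across systems of varying completeness.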
11. Page 11
Co-evolution of the SRES
– Organic adoption by academics
– Students felt cared-for
– Reduced attrition, improved grade distribution
[Chart: SRES adoption growth, 2012–2015 — number of units, number of schools, and number of students]
Milestones: Pilot → EWS, Track & Connect → EWS integrated → New analyses → More data types → Data import
12. Page 12
Learning analytics by stealth?
– As staff data literacies grew, so did system capabilities
– Covert introduction to data-driven pedagogy
13. Page 13
1. Macquarie: Empowering staff with actionable LMS data
2. Sydney: Learning analytics by stealth
14. Page 14
Lessons learned and issues raised
– Give them what they want vs. build it and they will come
– Find champions with tolerance for error
– Customisability is key
– Usefulness (eventually) trumps aesthetics (to an extent)
– But people still like shiny things
– Data are not enough – connect with pedagogical and pastoral support
– Surprisingly little pushback about privacy & ethics
– Tension between research ethics & general ethics
– It’s tricky to measure impact
– Iterate – capabilities, implementation
– Focus on the human
19. Page 19
Adoption pipeline
Colvin et al. (2016) Student retention and learning analytics: A snapshot of Australian
practices and a framework for advancement. Office of Learning and Teaching, Sydney.
“First, implementers require an analytic tool or combination of tools that manage data inputs and generate outputs in the form of actionable feedback… As these increasingly meet the real needs of learners and educators the organisational uptake is accelerated.”
20. Page 20
1. MEAP Empowering staff with actionable LMS data
Chris Froissard, Deborah Richards, Amara Atif et al.
2. SRES Learning analytics by stealth
Charlotte Taylor, Adam Bridgeman, Kathryn Bartimote-Aufflick,
Abelardo Pardo et al.
@dannydotliu
danny.liu@mq.edu.au
danny.liu@sydney.edu.au