Presentation delivered at the UCISA event "A-Z of learning analytics", 28/06/2017. Ed Foster & Jane McNeil. A longer case study can be found at https://www.ucisa.ac.uk/-/media/Files/publications/truthaboutda/TheTruthAboutDA
1. Progress with learning analytics: teaching quality & analytics
ABLE Project 2015-1-BE-EPPKA3-PI-FORWARD
STELA Project: 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD
2. NTU progress with learning analytics
• Already a data-rich institution
• Data sub-committee
• Pilot in 2013-14 using Solutionpath’s Stream tool
• Now
• Wholesale adoption across the institution
• 91% of students logged in (2015-16)
• 2,056 staff logged in (2015-16)
• Disruptive technology
• Lever for institutional change
• Data one of 4 main themes for TEF submission
6. Dashboard: Student and Tutor views
[Diagram: Embedded Learning Analytics, the "two agents of change" model]
• Student view: student engagement with the course → metrics (raw data & engagement rating) → presented to students → students act
• Tutor view: engagement with students → metrics & alerts → presented to tutors → more-informed interactions
NTU is strongly interested in the potential of parallel work on course quality metrics & embedding data into management decision making.
7. The importance of engagement for success
• Av. engagement for the year = very strong predictor of progression
[Chart: average engagement, whole year (2015-16) vs progression %]
• Low: 9.5%
• Partial: 81.3%
• Good: 92.0%
• High: 94.9%
8. Risks associated with sustained low engagement
• As the year progresses, sustained low engagement = increasing risk
[Chart: impact of low average engagement over time (2015-16), progression %]
• Low av. engagement in Welcome Week: 64%
• Low av. engagement over 1st term: 27%
• Low av. engagement over whole year: 9.5%
10. How do students use the Dashboard?
[Chart: student actions after logging in to the Dashboard (n=753, Feb/Mar 2017); % of students who responded 'yes: very often, often or sometimes']
• Spoke to someone providing specialist help (for example student support services/…
• Spoke to your tutor
• Increased the amount of time you spend studying
• Changed your behaviour to raise or maintain your engagement score (for example made…
• Checked your attendance
11. Relationship between students logging in to the Dashboard and progression
[Chart: progression % by total log ins for the year (2015-16)]
• 1 log in: 72.4%
• 4-6 log ins: 81.4%
• 20+ log ins: 89.8%
13. Staff embedding the Dashboard into their working practices
• Staff involved in one Dashboard pilot told us that they used the Dashboard in the following ways:
• Overview monitoring their tutees
• “I use it to check the ‘academic health’ of students”
• Preparing for tutorials
• “If I am concerned about a student’s engagement, I will look at the Dashboard prior to meeting the student to enable me to triangulate evidence gleaned from other areas, e.g. a Module Leader’s feedback …”
• “I access information pre-tutorial and have it available during the tutorial”
14. Using the Dashboard in tutorials
• Framing the discussion
• “The information within the Dashboard is a start point for discussion”
• Check student self-perception
• “show a visual representation of their engagement – this works well with art students it seems”
• Coaching
• “Looking at engagement with modules, referring to attendance, but in a positive and encouraging way. If there is a lot of good things to talk about, then I do so. If the picture does not look so good then I encourage more engagement, if it looks terrible, I try to understand what the problem is.”
• “[I] use this as a springboard to talk about engaging with course texts …”
• Action planning/ referrals
• “I use the Dashboard to update notes after the tutorial as I want to use the tutorial time listening to them and exploring issues/ priorities…”
• “We agree an action plan which is noted in the Dashboard”
16. Discussion points
• Disruptive technology
• Learning analytics without institutional change is likely to lead to disappointing results
• Resources, training, double-loop change, learning analytics (LA), learning metrics (LM)
• Institutional data & data infrastructure are not necessarily set up to enable this use of data
• Two change agents
• Not the tool, but institutional change
• Engagement is a profoundly powerful predictor of success
Editor's Notes
Important to say that this work sits within a wide programme of academic development and student support.
Note TEF GOLD and the role played by Learning Analytics (LA) in that. One of our major themes was our systematic use of data to make evidence-based developments
Analytics, metrics and qualitative data
Parallel project to NTU Dashboard: Course quality metrics.
Descriptive data. Good data in this context: meaningful proxies for quality, robust data, accessible to course leaders, recognised as valid by them. Like the NTU Dashboard project, this is about generating data, presenting it in an accessible way, and developing the use of it.
Emerging set of course quality metrics: related to Gibbs’ work on factors that predict quality. Let’s call these Learning Metrics (LM).
TEF is a great opportunity to move this up the agenda. (TEF is selected NSS + non-continuation+ )
64% find Dashboard useful when they log in themselves
80% reported that they found it useful when their tutors used it in tutorials
SO, HOW ARE LA and LM connected?
CLICK LA initially seen by staff as primarily about informing interventions with individual students. = 1-1 tutorial, support strategies. THEN as a way to INDIRECTLY TARGET students for wider, thematic interventions or support schemes (e.g. for equality groups in Success for All)
CLICK LM are used to make judgements about programme quality. For “feedback and course correction”. Like the TEF. (Traditional Q metric vs Gibbs-based metrics is a story of another day!)
HOWEVER, it is at the point of interaction between LA and LM that things get really interesting. WHY?
Because engagement predicts everything else. Engagement as student behaviour (can be deficit) vs engagement as a programme strategy (by design).
Story:
new Data Sub-committee – importance of the committee in bringing distributed expertise together to look at patterns across multiple data sources.
Example: referrals – patterns. Can see across modules. Implications for course design. BUT Engagement predicts these earlier than assessment failure.
Therefore, add LA on engagement to LM on course quality for use in monitoring and in periodic course review. = Deeper understanding of contextualised strategies for course design and LTA strategies to improve student success. E.g. our Festival: how to use course design to engender communities