This document analyzes student survey data to understand what drives student ratings of digital teaching and learning quality and to identify student personas. Key driver analysis found that opportunities to update digital skills, well-designed learning spaces, up-to-date software, and engaging lectures influence ratings. Persona analysis identified three groups: mainstream pragmatists, specialist enthusiasts, and negative thinkers. Qualitative data showed that most students want improvements to existing resources, while some ask for new services or are broadly critical of the digital experience.
2. Background
• Data from Jisc’s student insights surveys 2018-19
• Focus on students’ digital experience
• Can identify areas for improvement and good practice
• Two key performance rating scales:
  - Quality of digital teaching and learning
  - Quality of digital infrastructure
• Student, teaching staff and professional services (PS) staff surveys allow triangulation of perspectives
3. Mixed methods analysis:
• Pushing the insights data further
• Using multivariate statistics (quantitative) with free text data (qualitative) to investigate student data in more detail
• Focus on the national picture for university (HE) undergraduate student data
4. Key driver analysis: What influences students’ ratings of their digital teaching and learning?
Personas: Can we group individual learners in terms of their opinions and experience of the digital?
6. What is key driver analysis (KDA)?
• Imagine you have a limited budget and/or limited resources
• You’ve asked your students to rate the quality of something
• You want to lock in what’s doing well, and improve what’s not
• Where do you start?
The overall rating questions:
• DT&L rating: “How would you rate the quality of digital teaching and learning on your course?” (seven-point scale)
• NSS overall rating: “Overall, I am satisfied with the quality of the course” (five-point scale)
7. KDA: an example
How satisfied are you with the service at this hotel?
• Overall satisfaction score
• Check-in
• Restaurant experience
• Bar service
• Room cleanliness
[Quadrant chart: each area (check-in, bar, restaurant, room) is plotted by its performance against its impact on the overall satisfaction score, placing it as a minor/major strength or minor/major weakness]
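The deck does not show the underlying calculation, but a minimal sketch of the idea, in Python with made-up hotel data, might look like the following: take each driver's correlation with the overall score as its impact, take its mean rating as its performance, and split both axes (here at the median, purely for illustration) to form the four quadrants. The column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical survey responses: each driver and the overall score on a 1-5 scale
df = pd.DataFrame({
    "overall":    [4, 5, 3, 4, 2, 5, 4, 3, 5, 4],
    "check_in":   [4, 5, 3, 4, 3, 5, 4, 3, 5, 4],
    "restaurant": [3, 4, 2, 4, 2, 5, 3, 2, 4, 3],
    "bar":        [4, 4, 3, 3, 2, 4, 4, 3, 4, 4],
    "room":       [5, 5, 4, 4, 3, 5, 5, 4, 5, 5],
})

drivers = [c for c in df.columns if c != "overall"]

# "Impact" proxy: correlation of each driver with the overall satisfaction score
impact = df[drivers].corrwith(df["overall"])

# "Performance": mean rating for each driver
performance = df[drivers].mean()

# Split each axis at its median (a simple convention) to form the four quadrants
kda = pd.DataFrame({"impact": impact, "performance": performance})
kda["quadrant"] = [
    ("Major" if i >= impact.median() else "Minor")
    + " "
    + ("strength" if p >= performance.median() else "weakness")
    for i, p in zip(kda["impact"], kda["performance"])
]
print(kda)
```

In practice KDA is often run with regression-based importance measures rather than simple correlations; the quadrant logic is the same either way.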
8. 12 potentially relevant issues in the survey:
1. Teaching spaces are well designed for the technologies we use
2. How often do you work online with others?
3. Online assessments are delivered and managed well
4. I can easily find things on the VLE
5. I have regular opportunities to review and update my digital skills
6. The software used on my course is industry standard and up to date
7. How often do you use an educational game/simulation for learning?
8. Do you have access to online course materials and recorded lectures?
9. How often do you use a polling device/online quiz to give answers in class?
10. My course prepares me for the digital workplace
11. How often do you create a digital record of your learning?
12. How often do you produce digital work in formats other than Word/PowerPoint?
9. Q19: How would you rate the quality of digital teaching and learning on your course? (seven-point scale)
The insights surveys give you an overall rating (the DT&L rating) …
… but which of the 12 issues influence this the most?
10. Q19: How would you rate the quality of digital teaching and learning on your course? (seven-point scale)
12 issues measured:
• What is their order of influence on the overall DT&L rating?
• Which issues are performing well nationally (lock in success)?
• Which are performing poorly nationally (a target for improvement)?
[Quadrant chart: each issue is plotted by its national performance against its impact on the overall rating, as a minor/major strength or minor/major weakness]
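The deck does not spell out how the "order of influence" is estimated; one common approach, sketched below in Python on synthetic data, is to regress the overall rating on standardised issue scores and compare the absolute coefficients. The issue scores are invented, and the rating is deliberately constructed to depend most on issues 5 and 1, so those two should come out on top.

```python
import numpy as np
import pandas as pd

# Hypothetical data: agreement scores for the 12 issues (1-7) plus an overall
# DT&L rating synthesised so that issues 5 and 1 matter most
rng = np.random.default_rng(42)
n = 300
issues = pd.DataFrame(
    rng.integers(1, 8, size=(n, 12)), columns=[f"issue_{i}" for i in range(1, 13)]
)
dtl_rating = 0.6 * issues["issue_5"] + 0.3 * issues["issue_1"] + rng.normal(0, 1, n)

# Standardise predictors and outcome so coefficients are on a comparable scale
X = (issues - issues.mean()) / issues.std()
y = (dtl_rating - dtl_rating.mean()) / dtl_rating.std()

# Ordinary least squares; absolute standardised coefficients give one simple
# measure of each issue's relative influence on the overall rating
beta, *_ = np.linalg.lstsq(X.to_numpy(), y.to_numpy(), rcond=None)
influence = pd.Series(np.abs(beta), index=X.columns).sort_values(ascending=False)
print(influence.round(2))
```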
12. Poor performance nationally:
• I have regular opportunities to update digital skills
• Teaching spaces are well designed for the tech we use
• Software is industry standard/up to date
• Online assessments are delivered/managed well
• I can easily find things on the VLE
13. What do the free text data tell us?
Students were asked what their institution could do to improve digital teaching and learning:
• 16,823 HE students responded in 2018
• 11,269 HE students responded in 2019
What issues do students associate with ‘better’ digital learning and teaching?
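At this scale, free-text answers are usually coded into themes before being counted. As a rough illustration only (not Jisc's actual coding scheme), a simple keyword-matching pass over hypothetical responses could look like this in Python; the themes, keywords and responses are all invented.

```python
from collections import Counter

# Hypothetical free-text responses and a hypothetical keyword-to-theme map
responses = [
    "Please record all lectures and upload the slides",
    "More e-books and journal access for revision",
    "The wifi in the library keeps dropping",
    "Don't assume we all know how to learn online",
]

themes = {
    "lecture experience": ["lecture", "recording", "record", "slides"],
    "online resources":   ["e-book", "e-books", "journal", "revision"],
    "infrastructure":     ["wifi", "network", "library"],
    "digital skills":     ["skills", "learn online", "training"],
}

# Count how many responses mention each theme (a response can hit several)
counts = Counter()
for text in responses:
    lower = text.lower()
    for theme, keywords in themes.items():
        if any(k in lower for k in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```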
14. What issues do university students associate with ‘better’ digital learning and teaching?
• Quality of the lecture experience: recordings, slides/notes, polling activities
• Quality of online academic resources: e-books and e-journals; materials for revision
• Perceived value for money, especially high-value software
Most responses concern the digital environment for learning: platforms, networks, classrooms, learning spaces.
Fewer concerned support for digital skills; most often: ‘don’t assume students know how to learn digitally’.
15. Activity: Overall, how would you rate the quality of digital teaching and learning at your organisation?
We can only analyse what has been measured: what else might be important?
Which other issues do you think might influence the perceived quality of digital teaching and learning at your organisation?
16. In 2019-2020 there are now five key metric questions that can be used to track progress through time:
1. Overall, how would you rate the quality of your organisation's digital provision (software, hardware, learning environment)?
2. Overall, how would you rate the quality of digital teaching and learning at your organisation?
3. Overall, how confident are you at trying out new technologies?
4. Overall, how motivated are you to use technology to support your learning/teaching/work?
5. Overall, how would you rate the quality of support you get from your organisation to develop your digital skills?
17. Personas
Can we group individual university students in terms of their opinions and experience of the digital?
18. We know from previous work (Jisc ‘Digital Student’ programme) that students differ in ways that impact on their digital experiences
Work with secondary school and FE students (Davies et al. 2010, Sharpe et al. 2015) suggested three broad digital persona types:
• Unconnected and vulnerable
• Mainstream pragmatists
• Intensive and specialist enthusiasts
Would these (or something similar) be reflected in our data?
19. Identifying personas
• Multiple correspondence analysis (MCA) is a specific statistical technique
• The approach is taken from UX persona research
• It uses cluster analysis to segment user groups (see https://measuringu.com and academic works such as Laporte et al. 2012, Schroeder et al. 2018)
• Take the same questions used in the KDA and put them into this analysis: how do opinions cluster together?
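As a rough illustration of the segmentation step (not the exact analysis used here), the sketch below one-hot encodes hypothetical categorical survey answers, reduces dimensions with PCA as a simplified stand-in for MCA (libraries such as prince provide MCA proper), and then applies k-means to group respondents into candidate personas. Column names, answer categories and the number of clusters are all assumptions.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical categorical survey answers (same kinds of items as the KDA)
df = pd.DataFrame({
    "work_online_with_others": ["weekly", "never", "daily", "never", "weekly", "daily"],
    "vle_easy_to_find":        ["agree", "disagree", "agree", "disagree", "agree", "neutral"],
    "dtl_rating":              ["good", "average", "excellent", "poor", "good", "excellent"],
})

# One-hot encode the categorical answers (the indicator matrix MCA works from)
indicators = pd.get_dummies(df)

# Dimension reduction (PCA here as a simplified stand-in for MCA),
# then k-means to segment respondents into candidate persona groups
coords = PCA(n_components=2, random_state=0).fit_transform(indicators)
df["persona_cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)

# Inspect the most common answer in each cluster to characterise the personas
print(df.groupby("persona_cluster").agg(lambda s: s.mode().iloc[0]))
```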
21. Three clusters of students emerge:
• Majority group: use digital regularly; rate DT&L as good
• Minority group: use digital a lot; rate DT&L as excellent or higher; think the VLE is badly organised
• Smallest group: never use digital; rate DT&L as average or lower; never work online with others
22. Naming the clusters:
• Mainstream pragmatists (the majority group): use digital regularly; rate DT&L as good
• Specialist enthusiasts (the minority group): use digital a lot; rate DT&L as excellent or higher; think the VLE is badly organised
• Unconnected and vulnerable? (the smallest group): never use digital; rate DT&L as average or lower; never work online with others
23. What do university students tell us?
Qualitative data:
• Analysis of responses to four free text questions, especially ‘What should we do to improve your experience of digital learning and teaching?’
24. What do university students tell us?
• Most undergraduate students want ‘more’ and ‘better’ versions of what they have already, e.g. better quality lecture capture, access to more journals
• A minority discuss specifics with digital expertise. They may be discerningly negative about aspects of digital provision. They are more likely to ask for new/different services, e.g. loan schemes, student apps, desktop features
• A different minority of comments are broadly and generally negative about the digital L&T experience (reasons differ)
25. What do students tell us? Mapping these comments onto the personas:
• Students wanting ‘more’ and ‘better’ versions of what they already have → mainstream pragmatists
• The digitally expert minority asking for new/different services → specialist enthusiasts
• The minority of broadly and generally negative comments → negative thinkers?
26. We can’t assume our negative thinkers are ‘unconnected and vulnerable’ …
… but from the qualitative data we can identify the following sub-groups of negative thinkers:
• Digitally left behind: lacking required access or skills
• Digitally disappointed: digital L&T is poor quality
• Digitally disaffected: critical of digital L&T overall
Using qualitative and quantitative methods broadly supports the three persona types identified in previous research.
27. Conclusion: UK undergraduates want…
Drivers
• Regular opportunities to review and update their digital skills
• Well-designed learning environments that deliver well-managed assessments
• Industry-standard, up-to-date software; digitally-enabled teaching spaces
• Engaging lectures with appropriate digital resources/activities designed in
Personas
When it comes to digital attitudes, students broadly fall into three groups:
- Mainstream pragmatists
- Specialist enthusiasts
- Negative thinkers
28. This is the national picture …
… what are your insights data telling you?
29. In praise of mixed methods analysis
• Think about the types of data you capture … and the possible importance of what isn’t currently captured
• Talk and listen to people: don’t just respond to numbers that their data produce
• Responding to data doesn’t always mean having to change things; lock in good practice too
• Use data to empower your organisation and individuals within it