1. “Make it Personal!” – Gathering input from stakeholders for a Learning Analytics-supported Learning Design tool
Marcel Schmitz1, Maren Scheffel2, Evelien van Limbeek1, Roger Bemelmans1, and Hendrik Drachsler2,3,4
1 Zuyd University of Applied Sciences, NL
2 Open University, NL
3 Goethe University Frankfurt, DE
4 German Institute for International Educational Research (DIPF), DE
4. Metacognitive Competences
[Figure: LA4LD concept, from idea to desired situation. At design time, a learning design consists of a sequence of learning activities (lectures, workshops, online discussions). At run time, behavior during these activities is analyzed; learning analytics dashboards present the analysis, and actions after analysis feed back into the learning design.]
5. Design-based Research
Main research question: Can learning analytics, by use of learning dashboards, improve the learning design of a course during run time and in that way make the learning process more efficient or more effective, or increase the satisfaction of teachers and students?
Research type: design-based research
Substudy 1: State of the art; identification of context and users
Substudy 2: Concept/tool design – “Run Time”
Substudy 3: Tool (re)design – “Metacognitive competences”
Substudy 4: Interventions/actions?
6. State of the art
Schmitz, M., van Limbeek, E., Greller, W., Sloep, P., Drachsler, H.: Opportunities and challenges in using learning analytics in learning design. In: Data Driven Approaches in Digital Education: Proceedings of the 12th European Conference on Technology Enhanced Learning, EC-TEL, pp. 209–223 (2017).
7. Identification of context and users
• Experiment with a learning dashboard (SURF-LAD)
• Comparison of online activity, students’ learning strategies and motivation, and results (REFLECTOR)
Schmitz, M., Scheffel, M., van Limbeek, E., van Halem, N., Cornelisz, I., van Klaveren, C., Bemelmans, R., Drachsler, H.: Investigating the relationships between online activity, learning strategies and grades to create learning analytics-supported learning designs.
• Survey on the data and dashboard perspective
• Focus groups/creative sessions on the data and dashboard perspective with students, teachers and IT support staff
Schmitz, M., Scheffel, M., van Limbeek, E., Bemelmans, R., Drachsler, H.: “Make it personal!” – Gathering input from stakeholders for a Learning Analytics-supported Learning Design tool.
8. Identification of context and users
Research questions:
1. What do teachers want to see displayed in their LA4LD tool?
2. What do students want to see displayed in their LA4LD tool?
3. What is the current state of self-reflection among the students?
9. Methods (Focus groups – data)
Three focus groups on data.
Guiding questions for students (A1) and teachers (A2):
• Is learning analytics beneficial?
• Only hard data (login, download and online presence data) or also soft data (e.g. emotions and learning styles)?
• Should the data be anonymous?
• Can you imagine learning analytics in any of your courses now?
• What insight do you currently have into your progress?
10. Methods (Focus groups – data)
Three focus groups on data.
Guiding questions for TEL experts (A3):
• Which learning tools are being used?
• Can those tools/data be connected?
• If not, why not? Because of policy, rules or legislation?
• Are there analytics systems?
12. Methods (Survey – 575 students asked)
Motivated Strategies for Learning Questionnaire (MSLQ)
MSLQ scales and number of items per scale:

Learning Strategies (50 items):
Rehearsal (4), Elaboration (6), Organization (4), Critical Thinking (5), Metacognitive Self-Regulation (12), Time and Study Environment (8), Effort Regulation (4), Peer Learning (3), Help Seeking (4)

Motivation (31 items):
Intrinsic Goal Orientation (4), Extrinsic Goal Orientation (4), Task Value (6), Control of Learning Beliefs (4), Self-Efficacy for Learning and Performance (8), Test Anxiety (5)
Reviews of self-regulated learning and its instruments:
Panadero, E.: A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology 8, 422 (2017)
Roth, A., Ogrin, S., Schmitz, B.: Assessing self-regulated learning in higher education: a systematic literature review of self-report instruments. Educational Assessment, Evaluation and Accountability 28(3), 225–250 (2016)
MSLQ handbook:
Pintrich, P., Smith, D., Garcia, T., McKeachie, W.: A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ) (1991)
Metacognitive Self-Regulation items (a scoring sketch follows the list):
Planning
1. Scan content and structure beforehand
2. Able to adjust behaviour
3. Decide what needs to be known beforehand
4. Set targets to direct activities
Monitoring
1. Miss information because of being distracted
2. Try other material if it is too difficult
3. Have the feeling of not remembering the material
4. Try to find out which subjects are not understood
5. Spend time after class if the material is not understood during class
Self-regulation
1. Ask oneself questions to filter the most important information
2. Read the material again if it is not understood
3. Ask oneself questions to check understanding
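For readers implementing the survey analysis, a minimal scoring sketch: per the MSLQ manual, a subscale score is the mean of its item responses on a 7-point Likert scale, with reverse-coded items scored as 8 minus the response. The item identifiers and the reverse-coded set below are illustrative placeholders, not the official MSLQ key.

```python
# Minimal sketch of MSLQ subscale scoring, following the MSLQ manual's rule:
# a subscale score is the mean of its item responses (7-point Likert scale),
# with reverse-coded items flipped as 8 - response.
# Item ids and the REVERSED set are illustrative placeholders, not the
# official MSLQ scoring key.

SUBSCALES = {
    "metacognitive_self_regulation": ["msr01", "msr02", "msr03"],
    "rehearsal": ["reh01", "reh02"],
}
REVERSED = {"msr02"}  # hypothetical reverse-coded item

def subscale_scores(responses):
    """Mean per subscale; reverse-coded items are scored as 8 - response."""
    scores = {}
    for name, items in SUBSCALES.items():
        values = [8 - responses[i] if i in REVERSED else responses[i]
                  for i in items if i in responses]
        if values:  # skip subscales with no answered items
            scores[name] = sum(values) / len(values)
    return scores

print(subscale_scores({"msr01": 6, "msr02": 2, "msr03": 5, "reh01": 4, "reh02": 7}))
# {'metacognitive_self_regulation': 5.67, 'rehearsal': 5.5} (approximately)
```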
13. Methods (Survey data – 575 students asked)
Extra questions on data:
• Insight into study progress now
• Using measurement instruments in the classroom
• Willingness to make online data available
• Willingness to give extra data to improve educational units
• Willingness to share data non-anonymously
• I would like to receive personal feedback on:
Possible answers (more than one accepted):
• the way I learn
• usage of learning material
• test results
• lectures
• collaboration
14. Results (Survey MSLQ – Planning)
n = 143
P1: Scan material beforehand: 67 do, 50 do not
P2: Change behaviour: 63 claim to be able, 39 claim not to be able, 41 don’t know
P3: Decide what they need to know beforehand: 67 do, 48 do not
P4: Set targets: 48 do, 66 do not
15. Results (Survey MSLQ – Monitoring)
n = 143
M1: 92 are not easily distracted, 16 are
M2: 67 search for other resources, 48 do not
M3: 74 are sure that they cannot remember the material, 41 think they can
M4: 82 try to see what they don’t understand during study, 27 do not
M5: 35 spend extra time after class, 84 will not
16. Results (Survey MSLQ – Self-reflection)
n = 143
SR1: Ask themselves questions to filter the most important information: 45 do, 68 do not
SR2: Read the material again if they don’t understand: 107 do, 17 do not
SR3: Ask themselves questions to check whether they understand: 52 do, 65 do not
17. Results (Survey data)
n = 143
58.7% are neutral or negative about measuring in the classroom
66.4% are positive about sharing online activity data
55.2% would share data non-anonymously
Would like to receive information on (percentages are shares of n; see the sketch after this list):
- test results (69.2%)
- the way they learn (55.2%)
- lectures (55.2%)
- collaboration in workgroups (49.0%)
- usage of learning material (46.2%)
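The percentages on this slide follow directly from raw counts over the n = 143 respondents, rounded to one decimal. A minimal sketch of that computation; the example count of 99 is back-computed for illustration, not taken from the published raw data.

```python
# Minimal sketch: converting raw survey counts into the percentages reported
# above, i.e. shares of the n = 143 respondents rounded to one decimal.
# The example count (99) is an illustrative back-computation, not published data.

N = 143

def pct(count, n=N):
    """Percentage of respondents, rounded to one decimal place."""
    return round(100 * count / n, 1)

print(pct(99))  # 69.2 -- e.g. ~99 of 143 selecting "test results"
```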
20. Results (Focus groups)
Data perspective:
A1: Students (n=5)
A2: Teachers (n=5)
A3: TEL support (n=6)
Dashboard perspective:
B1: Students, year 1 (n=6)
B2: Students, year >1 (n=6)
B3: Teachers (n=6)
Results A3: insight into the local situation
21. Discussion (Focus groups)
• Students and teachers do not make a specific distinction between learning systems, learning design and learning analytics.
• Because students see learning analytics as an integrated part of their online learning environment, other functionalities (e.g. a learning activity scheduler) can stimulate them to use learning analytics.
• Learning activities are a central element.
23. Discussion (Survey)
• There is an opportunity for improvement in students’ planning.
• Students claim to do more in less time.
• Students are prepared to share data.
• Self-reflection is low.
24. Conclusion
• Make it personal (for both students and teachers).
• As students and teachers see education holistically, a data ecosystem is the desired solution.
• First-year students have different wants and needs.
25. Future work
From “make it personal” to “make it sensible”:
e.g. students and teachers want to compare, while researchers warn against comparison.
Future work: an experiment on the timing of feedback on learning design, planned for fall 2018 (design time versus run time after each activity versus run time weekly).
Jivet, I., Scheffel, M., Specht, M., Drachsler, H.: License to evaluate: Preparing learning analytics dashboards for educational practice. In: Proceedings of the 8th International Conference on Learning Analytics and Knowledge, pp. 31–40. ACM, New York, NY, USA (2018)