Nowadays, we constantly interact with computers, mobile phones and other wearable devices. These interactions leave behind the digital footprint of the user. In the so-called Big Data field, this data serves different goals, such as predicting customer behaviour in marketing or supporting health research. Learning Analytics tackles this challenge in the Technology Enhanced Learning field.
George Siemens defines Learning Analytics as the measurement, collection, analysis and reporting of data to understand and optimise learning. In this context, we find a variety of studies that process the data differently. Some studies implement complex algorithms and simply display the outcome to the user. Others rely on simpler approaches to process the data, but enable the user to explore it with understandable, comprehensive and usable visualisations. Users can then draw conclusions on their own and, with this information, steer their own learning process. This thesis is situated in the latter approach and intends to help students become autonomous and lead their own educational process.
This dissertation presents work in the scope of four research questions: RQ1) What characteristics of learning activities can be visualised usefully for learners?; RQ2) What characteristics of learning activities can be visualised usefully for teachers?; RQ3) What are the affordances of and user problems with tracking data automatically and manually?; and RQ4) What are the key components of a simple and flexible architecture to collect, store and manage learning activity?
The exploration of these research questions includes: 1) the deployment of three different learning dashboard designs in real courses, with 128 students participating in the evaluations; 2) the analysis of two Massive Open Online Courses (MOOCs) with 56,876 enrolled students; and 3) the deployment of an architecture in two real case studies, including a European project with more than 15 scheduled pilots.
Manual and automatic trackers have benefits and drawbacks. For example, manual trackers respect user privacy in blended learning courses, but the data that students report is not trusted by their fellow students. Automatic trackers are more accurate, but they do not capture activity away from the computer and therefore do not provide the complete picture that students demand.
This research also identifies three components to deploy a simple and flexible architecture to collect data in open learning environments: 1) a set of simple services to push and pull the learning traces; 2) a simple data schema to ensure completeness and findability of the data; and 3) independent components to collect the learning activity.
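To make the second component concrete, a learning trace in such a common data schema could look like the sketch below. The field names (`user`, `verb`, `artefact`, `source`, `timestamp`) are illustrative assumptions, not the schema actually used in the thesis:

```python
import json
import time

# Hypothetical minimal schema for one learning trace: who did what,
# on which artefact, reported by which tracker, and when.
# Field names are illustrative, not the thesis schema.
def make_trace(user, verb, artefact, source):
    return {
        "user": user,          # anonymised user identifier
        "verb": verb,          # e.g. "posted", "edited", "spent-time"
        "artefact": artefact,  # e.g. a blog post URL or a file name
        "source": source,      # the tracker that produced the trace
        "timestamp": int(time.time()),
    }

trace = make_trace("student42", "posted", "https://example.blog/entry-1", "wordpress")
print(json.dumps(trace, indent=2))
```

Keeping every trace in one flat record like this is what makes completeness and findability easy to check: any tracker can emit it, and any dashboard can query it.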
Public PhD defense
1. Exploring Learning Analytics and Learning Dashboards from an HCI Perspective
Jose Luis Santos
http://bit.do/santos_scholar http://www.slideshare.net/jlsantoso
2. “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” - George Siemens [1]
[1] G. Siemens. “Learning analytics: envisioning a research discipline and a domain of practice”. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 4–8.
DEFINITION
page 1 of the thesis text
21. SCOPE
Classrooms and learning communities
Learning Dashboards as Personal Informatics Tools
Open Learning Environments
see background section (pg. 3) of the thesis text
23. SCOPE
Classrooms and learning communities
Learning Dashboards as Personal Informatics Tools
Open Learning Environments
Motivation
see background section (pg. 3) of the thesis text
54. RESEARCH QUESTIONS
RQ1: What characteristics of learning activities can be visualised usefully for learners?
RQ2: What characteristics of learning activities can be visualised usefully for teachers?
RQ3: What are the affordances of and user problems with tracking data automatically and manually?
RQ4: What are the key components of a simple and flexible architecture to collect, store and manage learning activity?
57. Publications
Santos et al. 2012. “Goal-oriented visualizations of activity tracking: a case study with engineering students”. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK ’12). ACM, New York, NY, USA, 143-152.
Santos et al. 2013a. “Addressing learner issues with StepUp!: an evaluation”. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK ’13). ACM, New York, NY, USA, 14-22.
Santos et al. 2013b. “Evaluating the use of open badges in an open learning environment”. In Proceedings of the Eighth European Conference on Technology Enhanced Learning, Scaling up Learning for Sustained Impact. Springer, Berlin, Germany, 314-327.
Santos et al. 2014. “Success, activity and drop-outs in MOOCs: an exploratory study on the UNED COMA courses”. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (LAK ’14). ACM, New York, NY, USA, 98-102.
Santos et al. 2015. “Tracking Data in Open Learning Environments”. Journal of Universal Computer Science 21(7), 976-996.
58. Problem
“As described in section 1.1, learning dashboards visualise learning traces, actions that students perform while they learn. In this context, RQ1 explores the usefulness of such traces in five different open learning courses.”
rq1 problem statement - see page 14 of the thesis text
59. Approach
Characteristics visualised [3,4]: time spent, artefacts produced, social interaction, resource use, exercise/test results (covered in Ch. 2, Ch. 3 and Ch. 4)
[3] K. Verbert, E. Duval, J. Klerkx, S. Govaerts, and J. L. Santos. “Learning Analytics Dashboard Applications”. In: American Behavioral Scientist 57.10 (2013), pp. 1500–1509.
[4] K. Verbert, S. Govaerts, E. Duval, J. L. Santos, F. Van Assche, G. Parra, and J. Klerkx. “Learning dashboards: an overview and future research opportunities”. In: Personal and Ubiquitous Computing 18.6 (2014), pp. 1499–1514.
page 15 of the thesis text
63. RESEARCH QUESTIONS
65. Publications (listed on slide 57)
66. Problem
“Results of our analyses [94, 95] report that dashboards for teachers are designed to raise awareness of the activities taking place in the course, analyse activity and plan interventions, among others. Related to activity analysis, we explored what teachers can actually learn from visualisations.”
rq2 problem statement - see page 17 of the thesis text
67. Approach
Characteristics analysed: time spent, artefacts produced, social interaction, resource use, exercise/test results, drop-outs [5,6,7], language use [8] and social interaction [9]
[5] C. Alario-Hoyos et al. “Analysing the Impact of Built-In and External Social Tools in a MOOC on Educational Technologies”. In: ECTEL’13. Vol. 8095. LNCS. Springer, 2013, pp. 5–18.
[6] D. Clow. “MOOCs and the funnel of participation”. In: Proceedings of the Third International Conference on Learning Analytics and Knowledge. LAK ’13. ACM, 2013, pp. 185–189.
[7] H. Spoelstra et al. “Team formation instruments to enhance learner interactions in open learning environments”. In: Computers in Human Behavior 45 (2015), pp. 11–20.
[8] P. Levy. “Technology-Supported Design for Inquiry-Based Learning”. In: Exploring Learning & Teaching in Higher Education. Springer, 2015, pp. 289–304.
[9] N. Michinov et al. “Procrastination, participation, and performance in online learning environments”. In: Computers & Education 56.1 (Jan. 2011), pp. 243–252.
table with data from page 17 and chapter 5 of the thesis text
71. RESEARCH QUESTIONS
72. Problem
“Therefore, we consider it relevant to evaluate how students perceived automatic and manual trackers.”
rq3 problem statement - see page 19 of the thesis text
Pros: manual trackers preserve privacy; automatic trackers avoid tracking fatigue [10]
Cons: manual trackers cause tracking fatigue [10]; automatic trackers raise privacy concerns
[10] E. K. Choe, N. B. Lee, B. Lee, W. Pratt, and J. A. Kientz. “Understanding quantified-selfers’ practices in collecting and exploring personal data”. In: Proceedings of the 32nd annual ACM conference on Human factors in computing systems. ACM, 2014, pp. 1143–1152.
73. Approach
table with data from page 20 of the thesis text
Lab sessions: no learning activity outside of the classroom; tracked with automatic trackers (Rabbit Eclipse plug-in *)
Blended learning courses: big part of the learning activity outside of the classroom; tracked with manual trackers
* https://marketplace.eclipse.org/content/rabbit
74. Outcome
Lab sessions: automatic trackers; no privacy concerns, but lack of tracking
Blended learning courses: manual trackers; no tracking fatigue, but over-reporting
75. RESEARCH QUESTIONS
76. Publications (listed on slide 57)
77. Approach
Experience report on two environments
image at page 21 of the thesis text
78. Outcome
Architecture diagram (three elements described in page 22 of the thesis text):
1. Trackers: Rabbit Eclipse plugin, RescueTime, Wordpress API, Blogspot API, Medium / RSS, Twitter, Toggl
2. REST services built on a common data schema, hosted in the cloud (Google App Engine)
3. Client applications over the Internet: dashboard, badge system
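The push and pull services at the centre of this architecture can be sketched as two operations over a trace store. This is a toy in-memory stand-in, not the actual Google App Engine implementation; the class and method names are illustrative:

```python
# Toy sketch of the push/pull services: an in-memory store exposing the
# two operations the architecture provides. In the real deployment these
# were REST endpoints hosted on Google App Engine; names are illustrative.
class TraceStore:
    def __init__(self):
        self._traces = []

    def push(self, trace):
        """Store one learning trace (a dict following the common schema)."""
        self._traces.append(trace)

    def pull(self, user=None):
        """Return all traces, optionally filtered by user, e.g. for a dashboard."""
        if user is None:
            return list(self._traces)
        return [t for t in self._traces if t.get("user") == user]

store = TraceStore()
store.push({"user": "student42", "verb": "spent-time", "minutes": 30})
store.push({"user": "student7", "verb": "posted", "artefact": "blog-1"})
print(len(store.pull("student42")))  # prints 1
```

Because trackers only need `push` and clients only need `pull`, each component stays independent: a new tracker or a new dashboard can be added without touching the others.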
80. image at page 137 of the thesis text
128 students actually used the learning dashboards
56,876 students enrolled in the MOOC courses
the architecture was deployed in more than 10 case studies
3 learning analytics dashboards
81. Publications and RQs
C - Conference, J - Journal
RQ1: Chapter 2: Santos et al. 2012 (C); Chapter 3: Santos et al. 2013a (C); Chapter 4: Santos et al. 2013b (C)
RQ2: Chapter 5: Santos et al. 2014 (C)
RQ3: Chapter 2: Santos et al. 2012 (C); Chapter 3: Santos et al. 2013a (C)
RQ4: Chapter 6: Santos et al. 2015 (J)