This was part of the Doctoral Consortium presentation at the ICMI 2019 conference in Suzhou, China on 14 October 2019. Collaboration is an important skill of the 21st century. It can take place in an online (or remote) setting or in a co-located
(or face-to-face) setting. With the large-scale adoption
of sensors, studies on co-located collaboration (CC) have
gained momentum. CC takes place in physical spaces where
the group members share each other’s social and epistemic
space. It involves subtle multimodal interactions such
as gaze, gestures, speech and discourse, which are complex in
nature. The aim of this PhD is to detect these interactions
and then use the resulting insights to build an automated real-time
feedback system that facilitates co-located collaboration.
Co-located Collaboration Analytics
with: Maren Scheffel, Hendrik Drachsler and
ICMI 2019 Doctoral Consortium
14 Oct 2019
What is collaboration?
• Occurs when two or more persons work towards a common goal (Martinez-
• Can be in different settings:
– Online (or remote)
– Co-located (or face-to-face)
• Can be for different purposes:
– Programming (Grover et al., 2016)
– Meetings (Terken & Sturm, 2010)
– Problem solving (Cukurova et al., 2018)
– Brainstorming (Tausch et al., 2014), etc.
Multimodal Indicators of CC
• Non-verbal signals using NISPI framework (Cukurova et al., 2018)
• Joint Visual Attention (JVA) (Richardson & Dale, 2005; Jermann et al.,
2011; Schneider et al., 2015)
• Hand movements, head movements and physical engagement (Spikol et al., 2017)
• Audio cues (Bassiou et al., 2016)
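As an illustration of how such an indicator might be operationalised (this is a minimal sketch, not the method used in any of the cited studies), a JVA measure could compare two synchronised gaze streams and count the time steps at which both gazes land close together, allowing a small temporal lag in line with the gaze-coupling idea of Richardson & Dale (2005). The data format, distance threshold and lag window here are assumptions:

```python
import numpy as np

def joint_visual_attention(gaze_a, gaze_b, dist_thresh=100.0, max_lag=2):
    """Fraction of time steps where two gaze streams land close together.

    gaze_a, gaze_b: (T, 2) arrays of per-time-step gaze coordinates on a
    shared plane (e.g. pixels on a tabletop camera frame; an assumed
    format). A time step counts as joint attention if the two gaze
    points fall within `dist_thresh` of each other at some temporal
    offset of up to `max_lag` steps, since gaze coupling is rarely
    perfectly simultaneous.
    """
    T = min(len(gaze_a), len(gaze_b))
    joint = 0
    for t in range(T):
        for lag in range(-max_lag, max_lag + 1):
            u = t + lag
            if 0 <= u < T and np.linalg.norm(gaze_a[t] - gaze_b[u]) <= dist_thresh:
                joint += 1
                break
    return joint / T
```

A per-window value of this measure could then serve as one input feature for a collaboration-quality model.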
Dimensions or Indexes of CC
• Mutual regulation (Blaye, 1988), Conflict resolution (Doise et al., 1976)
• IVA, IA, Synchrony, Equality (Spikol et al., 2017; Cukurova et al., 2017, 2018)
• 9 dimensions (Meier et al., 2007) and other dimensions (Johnson & Johnson, 2009)
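To make one of these dimensions concrete, the sketch below quantifies equality of participation from per-member speaking times. The Gini-based definition is only one of several possible operationalisations and is an assumption of this example, not the measure used by Spikol et al. or Cukurova et al.:

```python
def participation_equality(speaking_seconds):
    """Equality of participation in [0, 1] from per-member speaking time.

    1.0 means everyone spoke equally long; values near 0 mean one
    member dominated. Computed as 1 minus the Gini coefficient of the
    speaking-time distribution (an illustrative choice of metric).
    """
    n = len(speaking_seconds)
    total = sum(speaking_seconds)
    if n < 2 or total == 0:
        return 1.0
    # Gini coefficient via the mean absolute pairwise difference.
    diff_sum = sum(abs(a - b) for a in speaking_seconds for b in speaking_seconds)
    gini = diff_sum / (2 * n * n * (total / n))
    return 1.0 - gini
```

For example, a triad with speaking times of [10, 10, 10] seconds scores 1.0, while [30, 0, 0] (one dominant speaker) scores about 0.33.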
Real-time or post-hoc feedback
• Reflect table (Bachour et al., 2010)
• Conversation Clocks (Bergstrom & Karahalios, 2007a, 2007b)
• Groupgarden (Tausch et al., 2014)
• JVA (Schneider et al., 2015)
• Huge gap between the theory and the practical operationalisation of the dimensions of CC
• Most feedback systems provide either post-hoc feedback or real-time reflective feedback
• Dearth of studies on automated multimodal analysis in non-computer
supported environments (Worsley & Blikstein, 2015)
Research Questions
• RQ1 – What multimodal indicators give evidence of the quality of CC?
• RQ2 – To what extent can we design an automated real-time feedback
system, using the insights from the multimodal indicators, to facilitate CC?
Challenges
• Designing the collaboration task and its outcomes or deciding the collaboration
• Architectural design to integrate multiple sensors and enable real-time feedback
• Annotating the huge dataset
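One minimal way to approach the sensor-integration challenge is to bucket every incoming sample onto a shared clock, so that heterogeneous streams (audio, gaze, motion) can be fused per time window before indicator extraction. This is an illustrative sketch; the tuple format and window size are assumptions, not the architecture of the actual system:

```python
from collections import defaultdict

def fuse_by_window(samples, window_ms=1000):
    """Group raw sensor samples into fixed time windows for fusion.

    samples: iterable of (timestamp_ms, sensor_name, value) tuples
    arriving from independent sensors. Returns a dict mapping each
    window start time to {sensor_name: [values]}, i.e. one fused
    feature bucket per window that downstream indicator extractors
    (speaking time, joint attention, ...) can consume.
    """
    windows = defaultdict(lambda: defaultdict(list))
    for ts, sensor, value in samples:
        start = (ts // window_ms) * window_ms
        windows[start][sensor].append(value)
    return {w: dict(s) for w, s in windows.items()}
```

In a real-time setting the same windowing logic would run incrementally over a stream rather than over a finished list, emitting each window as soon as its end time has passed.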
• T1 – Literature Review: MMLA in CC
• T2 – Formative Study: Prototype design, pilot study, collection of data
• T3 – Summative Study: Evaluation and modification of the feedback system
References
Bachour, K., Kaplan, F., & Dillenbourg, P. (2010). An interactive table for supporting participation
balance in face-to-face collaborative learning. IEEE Transactions on Learning Technologies, 3(3), 203–213.
Baker, M. J. (1999). Argumentation and constructive interaction. Foundations of argumentative text
processing, 5, 179–202.
Bassiou, N., Tsiartas, A., Smith, J., Bratt, H., Richey, C., Shriberg, E., ... & Alozie, N. (2016, September).
Privacy-Preserving Speech Analytics for Automatic Assessment of Student Collaboration. In INTERSPEECH
Bergstrom, T., & Karahalios, K. (2007a). Conversation clock: Visualizing audio patterns in co-located
groups. In Proceedings of the 40th Annual Hawaii International Conference on System Sciences (HICSS 2007) (pp.
Bergstrom, T., & Karahalios, K. (2007b). Seeing more: visualizing audio cues. Human-Computer
Interaction–INTERACT 2007, 29–42.
Blaye, A. (1988). Confrontation socio-cognitive et résolution de problème (à propos du produit de deux
ensembles) (Unpublished doctoral dissertation).
Blikstein, P. (2013). Multimodal learning analytics. Proceedings of the Third International Conference on
Learning Analytics and Knowledge - LAK ’13, 102. http://doi.org/10.1145/2460296.2460316
Chikersal, P. (2017). Deep structures of collaboration (Unpublished doctoral dissertation). Carnegie
Mellon University, Pittsburgh, PA.
Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of
Educational Research, 64(1), 1–35.
Craig, S. D., D’Mello, S., Witherspoon, A., & Graesser, A. (2008). Emote aloud during learning with
AutoTutor: Applying the facial action coding system to cognitive–affective states during learning.
Cognition and Emotion, 22(5), 777–788.
Cukurova, M., Luckin, R., Mavrikis, M., & Millán, E. (2017). Machine and human observable differences
in groups’ collaborative problem-solving behaviours. In European Conference on Technology Enhanced Learning.
Cukurova, M., Luckin, R., Millán, E., & Mavrikis, M. (2018). The NISPI framework: Analysing
collaborative problem-solving from students’ physical interactions. Computers & Education, 116, 93–109.
Davidsen, J., & Ryberg, T. (2017). “This is the size of one meter”: Children’s bodily-material collaboration.
International Journal of Computer-Supported Collaborative Learning, 12(1), 65-90.
Dede, C. (2010). Comparing frameworks for 21st century skills. In 21st century skills: Rethinking how students
learn (Vol. 20, pp. 51–76).
Dillenbourg, P. (1999). What do you mean by collaborative learning?
D’Mello, S. K., Craig, S. D., Witherspoon, A., McDaniel, B., & Graesser, A. (2008). Automatic detection of
learner’s affect from conversational cues. User Modeling and User-Adapted Interaction, 18(1), 45–80.
Doise, W., Mugny, G., & Perret-Clermont, A.-N. (1976). Social interaction and cognitive development:
Further evidence. European Journal of Social Psychology, 6(2), 245–247.
Grover, S., Bienkowski, M., Tamrakar, A., Siddiquie, B., Salter, D., & Divakaran, A. (2016). Multimodal
analytics to study collaborative problem solving in pair programming. In Proceedings of the Sixth
International Conference on Learning Analytics & Knowledge (pp. 516–517).
Jermann, P., Mullins, D., Nüssli, M.-A., & Dillenbourg, P. (2011). Collaborative gaze footprints:
correlates of interaction quality. In Connecting Computer-Supported Collaborative Learning to Policy
and Practice: CSCL 2011 Conference Proceedings (Vol. 1, pp. 184–191).
Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social
interdependence theory and cooperative learning. Educational researcher, 38(5), 365-379.
Kulyk, O., Wang, J., & Terken, J. (2005, July). Real-time feedback on nonverbal behaviour to enhance
social dynamics in small group meetings. In International Workshop on Machine Learning for Multimodal
Interaction (pp. 150-161). Springer, Berlin, Heidelberg.
Lubold, N., & Pon-Barry, H. (2014, November). Acoustic-prosodic entrainment and rapport in
collaborative learning dialogues. In Proceedings of the 2014 ACM workshop on Multimodal Learning
Analytics Workshop and Grand Challenge (pp. 5-12). ACM.
Meier, A., Spada, H., & Rummel, N. (2007). A rating scheme for assessing the quality of
computer-supported collaboration processes. International Journal of Computer-Supported Collaborative
Learning, 2(1), 63-86.
Praharaj S., Scheffel M., Drachsler H., Specht M. (2018) Multimodal Analytics for Real-Time Feedback in Co-located Collaboration.
In: Pammer-Schindler V., Pérez-Sanagustín M., Drachsler H., Elferink R., Scheffel M. (eds) Lifelong Technology-Enhanced Learning.
EC-TEL 2018. Lecture Notes in Computer Science, vol 11082. Springer, Cham.
Richardson, D. C., & Dale, R. (2005). Looking to understand: The coupling between speakers’ and
listeners’ eye movements and its relationship to discourse comprehension. Cognitive Science, 29(6),
Schneider, B., & Blikstein, P. (2015). Unraveling students’ interaction around a tangible interface
using multimodal learning analytics. Journal of Educational Data Mining, 7(3), 89–116.
Schneider, B., & Pea, R. (2013). Real-time mutual gaze perception enhances collaborative learning and
collaboration quality. International Journal of Computer-Supported Collaborative Learning, 8(4), 375–397.
Schneider, B., & Pea, R. (2014). Toward collaboration sensing. International Journal of
Computer-Supported Collaborative Learning, 9(4), 371–395.
Schneider, B., Abu-El-Haija, S., Reesman, J., & Pea, R. (2013). Toward collaboration sensing: applying
network analysis techniques to collaborative eye-tracking data. In Proceedings of the Third International
Conference on Learning Analytics and Knowledge (pp. 107–111).
Soller, A., Martínez, A., Jermann, P., & Muehlenbrock, M. (2005). From mirroring to guiding: A review of state of the art technology
supporting collaborative learning. International Journal of Artificial Intelligence in Education, 15(4), 261-290.
Schneider, B., Sharma, K., Cuendet, S., Zufferey, G., Dillenbourg, P., & Pea, R. D. (2015). 3D
tangibles facilitate joint visual attention in dyads. In Proceedings of the 11th International Conference
of Computer Supported Collaborative Learning (Vol. 1, pp. 156–165).
Spikol, D., Ruffaldi, E., & Cukurova, M. (2017). Using multimodal learning analytics to identify aspects
of collaboration in project-based learning. Philadelphia, PA: International Society of the Learning Sciences.
Tausch, S., Hausen, D., Kosan, I., Raltchev, A., & Hussmann, H. (2014, October). Groupgarden:
supporting brainstorming through a metaphorical group mirror on table or wall. In Proceedings of the 8th
Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (pp. 541-550). ACM.
Worsley, M., & Blikstein, P. (2015). Using learning analytics to study cognitive disequilibrium in
a complex learning environment. In Proceedings of the Fifth International Conference on Learning Analytics
and Knowledge (pp. 426–427).
Worsley, M., & Blikstein, P. (2015, March). Leveraging multimodal learning analytics to differentiate
student learning strategies. In Proceedings of the Fifth International Conference on Learning Analytics
And Knowledge (pp. 360-367). ACM.
Praharaj, S., Scheffel, M., Drachsler, H., & Specht, M. (2019, September). Group Coach for Co-located Collaboration. In European
Conference on Technology Enhanced Learning (pp. 732-736). Springer, Cham.