
Co-located Collaboration Analytics


This was part of the Doctoral Consortium presentation at the ICMI 2019 conference in Suzhou, China, on 14 October 2019. Collaboration is an important skill of the 21st century. It can take place in an online (or remote) setting or in a co-located (or face-to-face) setting. With the large-scale adoption of sensors, studies on co-located collaboration (CC) have gained momentum. CC takes place in physical spaces where the group members share each other's social and epistemic space. This involves subtle multimodal interactions such as gaze, gestures, speech, and discourse, which are complex in nature. The aim of this PhD is to detect these interactions and then use these insights to build an automated real-time feedback system to facilitate co-located collaboration.

Published in: Education


  1. Co-located Collaboration Analytics. Sambit Praharaj, with Maren Scheffel, Hendrik Drachsler and Marcus Specht. ICMI 2019 Doctoral Consortium, Suzhou, China, 14 Oct 2019
  2. What is collaboration? • Occurs when two or more persons work towards a common goal (Martinez-Moyano, 2006) • Can be in different settings: – Online (or remote) – Co-located (or face-to-face) • Can be for different purposes: – Programming (Grover et al., 2016) – Meetings (Terken & Sturm, 2010) – Problem solving (Cukurova et al., 2018) – Brainstorming (Tausch et al., 2014), etc.
  3. MMLA in Co-located Collaboration (CC)
  4. Multimodal Indicators of CC • Non-verbal signals using the NISPI framework (Cukurova et al., 2018) • Joint Visual Attention (JVA) (Richardson & Dale, 2005; Jermann et al., 2011; Schneider et al., 2015) • Hand movements, head movements and physical engagement (Spikol et al., 2017) • Audio cues (Bassiou et al., 2016)
  5. Dimensions or Indexes of CC • Mutual regulation (Blaye, 1988), conflict resolution (Doise et al., 1976) • IVA, IA, Synchrony, Equality (Spikol et al., 2017; Cukurova et al., 2017, 2018) • 9 dimensions (Meier et al., 2007) and other dimensions (Johnson & Johnson, 2009)
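In several of the works cited above, the "Equality" dimension is operationalised as how evenly participation is distributed across group members. As an illustrative sketch only (not taken from the slides; the Gini-based scoring below is an assumption, and speaking times are a stand-in input), such an index could be computed like this:

```python
# Illustrative sketch: an "Equality" index for group participation,
# derived from per-member speaking durations (in seconds).
# The Gini-based formulation is an assumption, not the thesis method.

def gini(values):
    """Gini coefficient: 0 = perfectly even participation, ->1 = one member dominates."""
    vals = sorted(values)
    n = len(vals)
    total = sum(vals)
    if n == 0 or total == 0:
        return 0.0
    # Standard closed form over the sorted values
    cum = sum((i + 1) * v for i, v in enumerate(vals))
    return (2 * cum) / (n * total) - (n + 1) / n

def equality_index(speaking_times):
    """Map inequality to an equality score in [0, 1] (higher = more balanced)."""
    return 1.0 - gini(speaking_times)

balanced = equality_index([60, 55, 58])    # all members speak roughly equally
dominated = equality_index([120, 5, 5])    # one member dominates the floor
```

A balanced group scores close to 1, a dominated group noticeably lower, giving a single number that a real-time display could track over the session.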
  6. Indicators → Dimensions or Indexes
  7. Real-time or post-hoc feedback • Reflect table (Bachour et al., 2010) • Conversation Clocks (Bergstrom & Karahalios, 2007a, 2007b) • Groupgarden (Tausch et al., 2014) • JVA (Schneider et al., 2015)
  8. Real-time feedback
  9. Current problems • Huge gap between the theory and the practical detection of the dimensions of collaboration • Most feedback systems provide post-hoc feedback or real-time reflective feedback • Dearth of studies on automated multimodal analysis in non-computer-supported environments (Worsley & Blikstein, 2015)
  10. Research Questions • RQ1 – What multimodal indicators give evidence of the quality of CC? • RQ2 – To what extent can we design an automated real-time feedback system using the insights from the multimodal indicators to facilitate CC?
  11. Research Challenges • Designing the collaboration task and its outcomes, or deciding on the collaboration scenario • Architectural design to integrate multiple sensors and enable real-time prediction • Annotating the huge dataset collected
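One of the challenges listed above is integrating multiple sensors for real-time prediction. A common first step is aligning timestamped events from each sensor onto a shared clock by grouping them into fixed-size time windows before feature extraction. The sketch below illustrates that idea under assumptions of my own (the event format and function are hypothetical, not from the thesis):

```python
# Hypothetical sketch: fusing timestamped events from multiple sensor
# streams (e.g. audio and gaze) into fixed-size time windows, a typical
# precursor to real-time feature extraction and prediction.
from collections import defaultdict

def window_fuse(events, window_s=5.0):
    """Group (timestamp, modality, value) events into window -> modality -> values."""
    windows = defaultdict(lambda: defaultdict(list))
    for t, modality, value in events:
        windows[int(t // window_s)][modality].append(value)
    return dict(windows)

events = [
    (0.4, "audio", 0.8),       # speech energy sample
    (1.2, "gaze", (10, 20)),   # gaze point on a shared surface
    (6.1, "audio", 0.1),
]
fused = window_fuse(events)
# fused[0] now holds both modalities for the first 5-second window,
# fused[1] only the audio sample that arrived later
```

Windowing like this decouples sensors with different sampling rates: each window can then be summarised into per-modality features regardless of how many raw events it contains.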
  12. Research Tasks • T1 – Literature Review: MMLA in CC • T2 – Formative Study: Prototype design, pilot study, collection of data • T3 – Summative Study: Evaluation and modification of the feedback strategy
  13. Rough conceptual framework
  14. Expected Outcomes • Theoretical indexes of CC • Practically detected indexes of CC • Post-hoc and real-time feedback (Soller et al., 2005)
  15. 1st Use-Case Study (Praharaj et al., 2018)
  16. 2nd Study (Praharaj et al., 2019)
  17. 2nd Study
  18. Thank you! “The greatest danger for most of us is not that our aim is too high and we miss it, but that it is too low and we reach it.” – Michelangelo
  19. Our team
  20. References
      Bachour, K., Kaplan, F., & Dillenbourg, P. (2010). An interactive table for supporting participation balance in face-to-face collaborative learning. IEEE Transactions on Learning Technologies, 3(3), 203–213.
      Baker, M. J. (1999). Argumentation and constructive interaction. Foundations of Argumentative Text Processing, 5, 179–202.
      Bassiou, N., Tsiartas, A., Smith, J., Bratt, H., Richey, C., Shriberg, E., ... & Alozie, N. (2016, September). Privacy-preserving speech analytics for automatic assessment of student collaboration. In INTERSPEECH (pp. 888–892).
      Bergstrom, T., & Karahalios, K. (2007a). Conversation Clock: Visualizing audio patterns in co-located groups. In 40th Annual Hawaii International Conference on System Sciences (HICSS 2007) (pp. 78–78).
      Bergstrom, T., & Karahalios, K. (2007b). Seeing more: Visualizing audio cues. Human-Computer Interaction – INTERACT 2007, 29–42.
      Blaye, A. (1988). Confrontation socio-cognitive et résolution de problème (à propos du produit de deux ensembles) (Unpublished doctoral dissertation).
      Blikstein, P. (2013). Multimodal learning analytics. Proceedings of the Third International Conference on Learning Analytics and Knowledge – LAK ’13, 102.
      Chikersal, P. (2017). Deep structures of collaboration (Unpublished doctoral dissertation). Carnegie Mellon University, Pittsburgh, PA.
  21. References
      Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of Educational Research, 64(1), 1–35.
      Craig, S. D., D’Mello, S., Witherspoon, A., & Graesser, A. (2008). Emote aloud during learning with AutoTutor: Applying the facial action coding system to cognitive–affective states during learning. Cognition and Emotion, 22(5), 777–788.
      Cukurova, M., Luckin, R., Mavrikis, M., & Millán, E. (2017). Machine and human observable differences in groups’ collaborative problem-solving behaviours. In European Conference on Technology Enhanced Learning (pp. 17–29).
      Cukurova, M., Luckin, R., Millán, E., & Mavrikis, M. (2018). The NISPI framework: Analysing collaborative problem-solving from students’ physical interactions. Computers & Education, 116, 93–109.
      Davidsen, J., & Ryberg, T. (2017). “This is the size of one meter”: Children’s bodily-material collaboration. International Journal of Computer-Supported Collaborative Learning, 12(1), 65–90.
      Dede, C. (2010). Comparing frameworks for 21st century skills. 21st Century Skills: Rethinking How Students Learn, 20, 51–76.
  22. References
      Dillenbourg, P. (1999). What do you mean by collaborative learning?
      D’Mello, S. K., Craig, S. D., Witherspoon, A., McDaniel, B., & Graesser, A. (2008). Automatic detection of learner’s affect from conversational cues. User Modeling and User-Adapted Interaction, 18(1), 45–80.
      Doise, W., Mugny, G., & Perret-Clermont, A.-N. (1976). Social interaction and cognitive development: Further evidence. European Journal of Social Psychology, 6(2), 245–247.
      Grover, S., Bienkowski, M., Tamrakar, A., Siddiquie, B., Salter, D., & Divakaran, A. (2016). Multimodal analytics to study collaborative problem solving in pair programming. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 516–517).
      Jermann, P., Mullins, D., Nüssli, M.-A., & Dillenbourg, P. (2011). Collaborative gaze footprints: Correlates of interaction quality. In Connecting Computer-Supported Collaborative Learning to Policy and Practice: CSCL 2011 Conference Proceedings (Vol. 1, pp. 184–191).
      Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social interdependence theory and cooperative learning. Educational Researcher, 38(5), 365–379.
      Kulyk, O., Wang, J., & Terken, J. (2005, July). Real-time feedback on nonverbal behaviour to enhance social dynamics in small group meetings. In International Workshop on Machine Learning for Multimodal Interaction (pp. 150–161). Springer, Berlin, Heidelberg.
      Lubold, N., & Pon-Barry, H. (2014, November). Acoustic-prosodic entrainment and rapport in collaborative learning dialogues. In Proceedings of the 2014 ACM Workshop on Multimodal Learning Analytics Workshop and Grand Challenge (pp. 5–12). ACM.
  23. References
      Meier, A., Spada, H., & Rummel, N. (2007). A rating scheme for assessing the quality of computer-supported collaboration processes. International Journal of Computer-Supported Collaborative Learning, 2(1), 63–86.
      Praharaj, S., Scheffel, M., Drachsler, H., & Specht, M. (2018). Multimodal analytics for real-time feedback in co-located collaboration. In Pammer-Schindler, V., Pérez-Sanagustín, M., Drachsler, H., Elferink, R., & Scheffel, M. (Eds.), Lifelong Technology-Enhanced Learning. EC-TEL 2018. Lecture Notes in Computer Science, vol. 11082. Springer, Cham.
      Richardson, D. C., & Dale, R. (2005). Looking to understand: The coupling between speakers’ and listeners’ eye movements and its relationship to discourse comprehension. Cognitive Science, 29(6), 1045–1060.
      Schneider, B., & Blikstein, P. (2015). Unraveling students’ interaction around a tangible interface using multimodal learning analytics. Journal of Educational Data Mining, 7(3), 89–116.
      Schneider, B., & Pea, R. (2013). Real-time mutual gaze perception enhances collaborative learning and collaboration quality. International Journal of Computer-Supported Collaborative Learning, 8(4), 375–397.
      Schneider, B., & Pea, R. (2014). Toward collaboration sensing. International Journal of Computer-Supported Collaborative Learning, 9(4), 371–395.
      Schneider, B., Abu-El-Haija, S., Reesman, J., & Pea, R. (2013). Toward collaboration sensing: Applying network analysis techniques to collaborative eye-tracking data. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 107–111).
      Soller, A., Martínez, A., Jermann, P., & Muehlenbrock, M. (2005). From mirroring to guiding: A review of state of the art technology supporting collaborative learning. International Journal of Artificial Intelligence in Education, 15(4), 261–290.
  24. References
      Schneider, B., Sharma, K., Cuendet, S., Zufferey, G., Dillenbourg, P., & Pea, R. D. (2015). 3D tangibles facilitate joint visual attention in dyads. In Proceedings of the 11th International Conference of Computer Supported Collaborative Learning (Vol. 1, pp. 156–165).
      Spikol, D., Ruffaldi, E., & Cukurova, M. (2017). Using multimodal learning analytics to identify aspects of collaboration in project-based learning. Philadelphia, PA: International Society of the Learning Sciences.
      Tausch, S., Hausen, D., Kosan, I., Raltchev, A., & Hussmann, H. (2014, October). Groupgarden: Supporting brainstorming through a metaphorical group mirror on table or wall. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (pp. 541–550). ACM.
      Worsley, M., & Blikstein, P. (2015). Using learning analytics to study cognitive disequilibrium in a complex learning environment. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 426–427).
      Worsley, M., & Blikstein, P. (2015, March). Leveraging multimodal learning analytics to differentiate student learning strategies. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 360–367). ACM.
      Praharaj, S., Scheffel, M., Drachsler, H., & Specht, M. (2019, September). Group Coach for co-located collaboration. In European Conference on Technology Enhanced Learning (pp. 732–736). Springer, Cham.