
Multimodal Learning Analytics for Collaborative Learning Understanding and Support, Doctoral Consortium EC-TEL 2018

This project has several focus points: using Multimodal Learning Analytics to understand how co-located collaboration takes place and what the indicators of collaboration are (such as pointing at a peer, looking at a peer, or making constructive interruptions); forming a Collaboration Framework (CF) that defines the aspects of successful collaboration and turns them into a model; and using these insights to build a support framework that enables efficient real-time feedback during a group activity to facilitate collaboration.



  1. MULTIFOCUS: MULTImodal learning analytics FOr Collaborative learning Understanding and Support. Sambit Praharaj, with Maren Scheffel, Hendrik Drachsler and Marcus Specht (PhD Overview)
  2. What is collaboration?
     • Occurs when two or more persons work towards a common goal (Martinez-Moyano, 2006)
     • Can be in different settings:
       – Online (or remote)
       – Co-located (or face-to-face)
     • Can be for different purposes:
       – Programming (Grover et al., 2016)
       – Meetings (Terken & Sturm, 2010)
       – Problem solving (Cukurova et al., 2018)
       – Brainstorming (Tausch et al., 2014), etc.
  3. MMLA in CC (Co-located Collaboration)
  4. Related work (multimodal indicators of CC)
     • Non-verbal signals using the NISPI framework (Cukurova et al., 2018)
     • Joint Visual Attention (JVA) (Richardson & Dale, 2005; Jermann et al., 2011; Schneider et al., 2015)
     • Hand movements, head movements and physical engagement (Spikol et al., 2017)
     • Audio cues (Bassiou et al., 2016)
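As an aside on how a gaze-based indicator like JVA is typically operationalized: it can be estimated as the proportion of time two learners' gaze points land close to the same spot. The following is a minimal sketch under stated assumptions, not the cited authors' exact method; the gaze samples and the 100-pixel threshold are illustrative.

```python
# Hypothetical sketch: estimating joint visual attention (JVA) from two
# gaze streams already synchronized to common timestamps.
# The distance threshold and toy data below are assumptions.

def jva_proportion(gaze_a, gaze_b, threshold=100.0):
    """Fraction of synchronized samples where the two learners' gaze
    points fall within `threshold` pixels of each other."""
    if not gaze_a:
        return 0.0
    joint = 0
    for (xa, ya), (xb, yb) in zip(gaze_a, gaze_b):
        # Euclidean distance between the two gaze points at this instant
        if ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5 <= threshold:
            joint += 1
    return joint / len(gaze_a)

# Two toy gaze streams sampled at the same timestamps (pixel coordinates).
a = [(100, 100), (105, 110), (400, 300), (410, 305)]
b = [(120, 90), (500, 500), (395, 310), (415, 300)]
print(jva_proportion(a, b))  # → 0.75 (3 of 4 samples are joint)
```

Real JVA analyses (e.g. cross-recurrence approaches) also allow a small temporal lag between the two streams; this sketch only shows the instantaneous case.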
  5. Related work (Dimensions of CC)
     • Mutual regulation (Blaye, 1988); conflict resolution (Doise et al., 1976)
     • IVA, IA, synchrony, equality (Spikol et al., 2017; Cukurova et al., 2017, 2018)
     • 9 dimensions (Meier et al., 2007) and other dimensions (Johnson & Johnson, 2009)
  6. Related work (Feedback)
     • Reflect table (Bachour et al., 2010)
     • Conversation Clocks (Bergstrom & Karahalios, 2007a, 2007b)
     • Groupgarden (Tausch et al., 2014)
     • JVA (Schneider et al., 2015)
  7. Related work (Feedback), continued
  8. Current problems
     • Huge gap between the theory and the practical use of the dimensions of collaboration
     • Most feedback systems provide either post-hoc feedback or real-time reflective feedback
     • Dearth of studies on automated multimodal analysis in non-computer-supported environments (Worsley & Blikstein, 2015)
  9. Research Questions
     • RQ1 – What multimodal indicators give evidence of collaboration quality in a co-located setting?
     • RQ2 – How can we measure multimodal indicators of collaboration quality with sensor technology and build efficient multimodal data models to enable real-time data aggregation and analysis?
     • RQ3 – How can we enable efficient real-time (or near-real-time) direct or indirect interventions supported by MMLA to facilitate (or improve) collaboration?
  10. Research Challenges
     • Designing the collaboration task and its outcomes, or deciding on the collaboration situation
     • Designing an architecture that integrates multiple sensors and a DNN to enable real-time prediction
     • Annotating the huge dataset collected
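The real-time prediction challenge hinges on aggregating timestamped samples from heterogeneous sensors into fixed windows that a downstream classifier can consume. A minimal sketch of such windowed aggregation, with hypothetical sensor names and an assumed 2-second window (the actual architecture is not specified in the deck):

```python
# Illustrative sketch of a real-time aggregation step: timestamped
# (time, sensor, value) samples are bucketed into fixed-length windows,
# yielding one mean value per sensor per window. Sensor names and the
# 2-second window length are assumptions for illustration.

from collections import defaultdict

def window_features(samples, window_s=2.0):
    """samples: iterable of (timestamp_s, sensor_name, value).
    Returns {window_index: {sensor_name: mean value}}."""
    buckets = defaultdict(lambda: defaultdict(list))
    for t, sensor, value in samples:
        buckets[int(t // window_s)][sensor].append(value)
    return {
        w: {s: sum(v) / len(v) for s, v in sensors.items()}
        for w, sensors in buckets.items()
    }

stream = [
    (0.1, "audio_energy", 0.4), (0.9, "audio_energy", 0.6),
    (1.2, "gaze_on_peer", 1.0), (2.5, "audio_energy", 0.2),
]
print(window_features(stream))
# window 0 averages the first three samples; window 1 holds the last one
```

In a live system the same bucketing would run incrementally as samples arrive, with each completed window handed to the classifier; batching per window is what keeps the prediction step near real time.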
  11. Research Tasks
     • T1 – Literature survey: MMLA in CC
     • T2 – Formative study: prototype design, pilot study, collection of data
     • T3 – Accuracy study: classifier accuracy calculation and adjustments to the prototype
     • T4 – Summative study: evaluation and modification of the feedback strategy
  12. Rough conceptual framework
  13. Expected Outcomes: moving from theoretical indexes of CC to practically detected indexes of CC, and from post-hoc to real-time analysis (Soller et al., 2005)
  14. 1st use-case study (Praharaj et al., 2018)
  15. Thank you! "The greatest danger for most of us is not that our aim is too high and we miss it, but that it is too low and we reach it." – Michelangelo
  16. Questions / Comments
  17. References
     Bachour, K., Kaplan, F., & Dillenbourg, P. (2010). An interactive table for supporting participation balance in face-to-face collaborative learning. IEEE Transactions on Learning Technologies, 3(3), 203–213.
     Baker, M. J. (1999). Argumentation and constructive interaction. Foundations of Argumentative Text Processing, 5, 179–202.
     Bassiou, N., Tsiartas, A., Smith, J., Bratt, H., Richey, C., Shriberg, E., ... & Alozie, N. (2016). Privacy-preserving speech analytics for automatic assessment of student collaboration. In INTERSPEECH (pp. 888–892).
     Bergstrom, T., & Karahalios, K. (2007a). Conversation Clock: Visualizing audio patterns in co-located groups. In 40th Annual Hawaii International Conference on System Sciences (HICSS 2007) (pp. 78–78).
     Bergstrom, T., & Karahalios, K. (2007b). Seeing more: Visualizing audio cues. In Human-Computer Interaction – INTERACT 2007 (pp. 29–42).
     Blaye, A. (1988). Confrontation socio-cognitive et résolution de problème (à propos du produit de deux ensembles) (Unpublished doctoral dissertation).
     Blikstein, P. (2013). Multimodal learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (LAK '13), 102. https://doi.org/10.1145/2460296.2460316
     Chikersal, P. (2017). Deep structures of collaboration (Unpublished doctoral dissertation). Carnegie Mellon University, Pittsburgh, PA.
  18. References (continued)
     Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of Educational Research, 64(1), 1–35.
     Craig, S. D., D'Mello, S., Witherspoon, A., & Graesser, A. (2008). Emote aloud during learning with AutoTutor: Applying the facial action coding system to cognitive–affective states during learning. Cognition and Emotion, 22(5), 777–788.
     Cukurova, M., Luckin, R., Mavrikis, M., & Millán, E. (2017). Machine and human observable differences in groups' collaborative problem-solving behaviours. In European Conference on Technology Enhanced Learning (pp. 17–29).
     Cukurova, M., Luckin, R., Millán, E., & Mavrikis, M. (2018). The NISPI framework: Analysing collaborative problem-solving from students' physical interactions. Computers & Education, 116, 93–109.
     Davidsen, J., & Ryberg, T. (2017). "This is the size of one meter": Children's bodily-material collaboration. International Journal of Computer-Supported Collaborative Learning, 12(1), 65–90.
     Dede, C. (2010). Comparing frameworks for 21st century skills. 21st Century Skills: Rethinking How Students Learn, 20, 51–76.
     Di Mitri, D., Klemke, R., Drachsler, H., & Specht, M. Towards a real-time feedback system based on analysis of multimodal data.
  19. References (continued)
     Dillenbourg, P. (1999). What do you mean by collaborative learning?
     D'Mello, S. K., Craig, S. D., Witherspoon, A., McDaniel, B., & Graesser, A. (2008). Automatic detection of learner's affect from conversational cues. User Modeling and User-Adapted Interaction, 18(1), 45–80.
     Doise, W., Mugny, G., & Perret-Clermont, A.-N. (1976). Social interaction and cognitive development: Further evidence. European Journal of Social Psychology, 6(2), 245–247.
     Grover, S., Bienkowski, M., Tamrakar, A., Siddiquie, B., Salter, D., & Divakaran, A. (2016). Multimodal analytics to study collaborative problem solving in pair programming. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 516–517).
     Jermann, P., Mullins, D., Nüssli, M.-A., & Dillenbourg, P. (2011). Collaborative gaze footprints: Correlates of interaction quality. In Connecting Computer-Supported Collaborative Learning to Policy and Practice: CSCL 2011 Conference Proceedings (Vol. 1, pp. 184–191).
     Johnson, D. W., & Johnson, R. T. (2009). An educational psychology success story: Social interdependence theory and cooperative learning. Educational Researcher, 38(5), 365–379.
     Kulyk, O., Wang, J., & Terken, J. (2005). Real-time feedback on nonverbal behaviour to enhance social dynamics in small group meetings. In International Workshop on Machine Learning for Multimodal Interaction (pp. 150–161). Springer, Berlin, Heidelberg.
     Lubold, N., & Pon-Barry, H. (2014). Acoustic-prosodic entrainment and rapport in collaborative learning dialogues. In Proceedings of the 2014 ACM Workshop on Multimodal Learning Analytics Workshop and Grand Challenge (pp. 5–12). ACM.
  20. References (continued)
     Meier, A., Spada, H., & Rummel, N. (2007). A rating scheme for assessing the quality of computer-supported collaboration processes. International Journal of Computer-Supported Collaborative Learning, 2(1), 63–86.
     Praharaj, S., Scheffel, M., Drachsler, H., & Specht, M. (2018). Multimodal analytics for real-time feedback in co-located collaboration. In Pammer-Schindler, V., Pérez-Sanagustín, M., Drachsler, H., Elferink, R., & Scheffel, M. (Eds.), Lifelong Technology-Enhanced Learning. EC-TEL 2018. Lecture Notes in Computer Science, vol. 11082. Springer, Cham.
     Richardson, D. C., & Dale, R. (2005). Looking to understand: The coupling between speakers' and listeners' eye movements and its relationship to discourse comprehension. Cognitive Science, 29(6), 1045–1060.
     Schneider, B., & Blikstein, P. (2015). Unraveling students' interaction around a tangible interface using multimodal learning analytics. Journal of Educational Data Mining, 7(3), 89–116.
     Schneider, B., & Pea, R. (2013). Real-time mutual gaze perception enhances collaborative learning and collaboration quality. International Journal of Computer-Supported Collaborative Learning, 8(4), 375–397.
     Schneider, B., & Pea, R. (2014). Toward collaboration sensing. International Journal of Computer-Supported Collaborative Learning, 9(4), 371–395.
     Schneider, B., Abu-El-Haija, S., Reesman, J., & Pea, R. (2013). Toward collaboration sensing: Applying network analysis techniques to collaborative eye-tracking data. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 107–111).
     Soller, A., Martínez, A., Jermann, P., & Muehlenbrock, M. (2005). From mirroring to guiding: A review of state of the art technology for supporting collaborative learning. International Journal of Artificial Intelligence in Education, 15(4), 261–290.
  21. References (continued)
     Schneider, B., Sharma, K., Cuendet, S., Zufferey, G., Dillenbourg, P., & Pea, R. D. (2015). 3D tangibles facilitate joint visual attention in dyads. In Proceedings of the 11th International Conference on Computer Supported Collaborative Learning (Vol. 1, pp. 156–165).
     Spikol, D., Ruffaldi, E., & Cukurova, M. (2017). Using multimodal learning analytics to identify aspects of collaboration in project-based learning. Philadelphia, PA: International Society of the Learning Sciences.
     Tausch, S., Hausen, D., Kosan, I., Raltchev, A., & Hussmann, H. (2014). Groupgarden: Supporting brainstorming through a metaphorical group mirror on table or wall. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction (pp. 541–550). ACM.
     Worsley, M., & Blikstein, P. (2015). Using learning analytics to study cognitive disequilibrium in a complex learning environment. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 426–427).
     Worsley, M., & Blikstein, P. (2015). Leveraging multimodal learning analytics to differentiate student learning strategies. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 360–367). ACM.
