
Student-facing Learning Dashboards

Slides of my talk on student-facing learning dashboards at University Jean Moulin Lyon 3, 25 May 2018


  1. Katrien Verbert, Augment / HCI – KU Leuven, @katrien_v. Student-facing Learning Dashboards. Augment Lab.
  2. Human-Computer Interaction research group: “Flexible interaction between people and information”. Augment (prof. Katrien Verbert), ARIA (prof. Adalberto Simeone), Computer Graphics (prof. Phil Dutré), Language Intelligence & Information Retrieval (prof. Sien Moens).
  3. Augment team (http://augment.cs.kuleuven.be/): Robin De Croon (postdoc researcher), Katrien Verbert (Augment/HCI, Computer Science department), Tinne De Laet (Leuven Engineering and Science Education Center, Head of Tutorial Services of Engineering Science, coordinator of STELA, KU Leuven coordinator of ABLE), Francisco Gutiérrez (PhD researcher), Tom Broos (PhD researcher), Martijn Millecamp (PhD researcher), Sven Charleer (postdoc researcher), Nyi Nyi Htun (postdoc researcher), Gayane Sedrakyan (postdoc researcher), Houda Lamqaddam (PhD researcher), Yucheng Jin (PhD researcher), Oscar Alvarado (PhD researcher).
  4. intro: LEARNING ANALYTICS. “Learning analytics is about collecting traces that learners leave behind and using those traces to improve learning.” - Erik Duval. Duval, E., & Verbert, K. (2012). Learning analytics. E-Learning and Education, 1(8).
  5. LEARNING ANALYTICS (micro level): “The measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs”. J. L. Santos. Learning Analytics and Learning Dashboards: a Human-Computer Interaction Perspective. PhD dissertation, KU Leuven, 2015. G. Siemens. “Learning analytics: envisioning a research discipline and a domain of practice”. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. ACM, 2012, pp. 4–8.
  6. Learning analytics. Source: Steve Schoettler.
  7. Tracking traces. www.role-project.eu
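To make “tracking traces” concrete, here is a minimal sketch of what a logged learner trace could look like, loosely modelled on actor/verb/object activity statements. The field names and the `log_event` helper are illustrative assumptions, not the ROLE project's actual tracking format.

```python
# Hypothetical sketch of a tracked learning "trace": a timestamped activity
# event per learner. Field names are illustrative only.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class TraceEvent:
    learner_id: str   # pseudonymous identifier
    verb: str         # e.g. "viewed", "posted", "submitted"
    object_id: str    # resource, forum thread, exercise, ...
    timestamp: str    # ISO 8601, UTC

def log_event(learner_id: str, verb: str, object_id: str) -> TraceEvent:
    """Create a trace event with the current UTC timestamp."""
    return TraceEvent(
        learner_id=learner_id,
        verb=verb,
        object_id=object_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

if __name__ == "__main__":
    event = log_event("student-42", "viewed", "widget:forum-thread-7")
    print(json.dumps(asdict(event), indent=2))  # what gets stored for later analysis
```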
  8. Recommender systems: users who bought the same product also bought products B and C.
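A minimal sketch of the “users who bought X also bought Y” idea behind such recommenders, using simple item co-occurrence counts. The purchase data and the `recommend` helper are invented for illustration; real recommenders add normalisation, ratings and much more.

```python
# Item-to-item co-occurrence counting: recommend products that are most often
# bought together with a given product. Sample data is invented.
from collections import Counter
from itertools import combinations

purchases = {               # user -> set of products bought
    "u1": {"A", "B", "C"},
    "u2": {"A", "B"},
    "u3": {"A", "C"},
    "u4": {"B", "C"},
}

# Count how often each pair of products is bought by the same user.
co_counts = {}
for items in purchases.values():
    for x, y in combinations(sorted(items), 2):
        co_counts.setdefault(x, Counter())[y] += 1
        co_counts.setdefault(y, Counter())[x] += 1

def recommend(product, k=2):
    """Products most often bought together with `product`."""
    return [item for item, _ in co_counts.get(product, Counter()).most_common(k)]

print(recommend("A"))  # ['B', 'C'] -- users who bought A also bought B and C
```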
  9. Learning analytics. Source: Steve Schoettler.
  10. LEARNING DASHBOARDS: “A Learning Dashboard is a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualisations.” Design-based research; design guidelines. Process model: awareness (data) → (self-)reflection (questions) → sense making (answers) → impact (behavior change or new meaning). B. A. Schwendimann, M. J. Rodríguez-Triana, A. Vozniuk, L. P. Prieto, M. S. Boroujeni, A. Holzer, D. Gillet, and P. Dillenbourg. Understanding learning at a glance: An overview of learning dashboard studies. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, pages 532–533. ACM, 2016. K. Verbert, E. Duval, J. Klerkx, S. Govaerts, and J. L. Santos. Learning Analytics Dashboard Applications. American Behavioral Scientist, 57(10):1500–1509, 2013.
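A small sketch of the “aggregates different indicators” part of that definition: turning raw trace events into per-learner indicators that a dashboard view could then visualise. The event fields and indicator names are assumptions for illustration, not a specific dashboard's data model.

```python
# Aggregate raw trace events into simple per-learner indicators.
from collections import defaultdict

events = [  # (learner_id, verb, minutes_spent) -- invented sample data
    ("student-1", "viewed", 10), ("student-1", "posted", 5),
    ("student-1", "viewed", 20), ("student-2", "viewed", 3),
]

def aggregate_indicators(events):
    """Aggregate raw events into per-learner indicators a dashboard could show."""
    indicators = defaultdict(lambda: {"time_spent_min": 0, "n_posts": 0, "n_views": 0})
    for learner, verb, minutes in events:
        ind = indicators[learner]
        ind["time_spent_min"] += minutes
        if verb == "posted":
            ind["n_posts"] += 1
        elif verb == "viewed":
            ind["n_views"] += 1
    return dict(indicators)

print(aggregate_indicators(events))
# {'student-1': {'time_spent_min': 35, 'n_posts': 1, 'n_views': 2},
#  'student-2': {'time_spent_min': 3, 'n_posts': 0, 'n_views': 1}}
```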
  11. STELA (Successful Transition from secondary to higher Education using Learning Analytics): enhance a successful transition from secondary to higher education by means of learning analytics; ✓ design and build analytics dashboards, ✓ dashboards that go beyond identifying at-risk students, allowing actionable feedback for all students on a large scale. ABLE (Achieving Benefits from Learning Analytics): research strategies and practices for using learning analytics to support students during their first year at university; ✓ developing the technological aspects of learning analytics, ✓ focuses on how learning analytics can be used to support students.
  12. [!] Feedback must be “actionable”. Warning! Male students have a 10% lower probability of success. You are male. (action? ?) Warning! Your online activity is lagging behind. (action? ✓)
  13. [!] Feedback must be “actionable”. Process model: awareness (data) → (self-)reflection (questions) → sense making (answers) → impact (behavior change or new meaning). Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning analytics dashboard applications. American Behavioral Scientist, 10 pages. Published online February 2013.
  14. Verbert, K., Govaerts, S., Duval, E., Santos Odriozola, J., Van Assche, F., Parra Chico, G., & Klerkx, J. (2014). Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous Computing, 18(6), 1499-1514.
  15. BLENDED LEARNING / F2F GROUP WORK / STUDENT-ADVISER. https://www.flickr.com/photos/lockechrisj/
  16. Blended learning: CREATING EFFECTIVE LEARNING DASHBOARDS. Abundance of data - effort - outcome. Sten Govaerts, Katrien Verbert, Abelardo Pardo, Erik Duval. The student activity meter for awareness and self-reflection. CHI'12 Extended Abstracts on Human Factors in Computing Systems. ACM, 2012.
  17. Blended learning: CREATING EFFECTIVE LEARNING DASHBOARDS. Verbert, K., Govaerts, S., Duval, E., Santos, J. L., Van Assche, F., Parra, G., & Klerkx, J. (2013). Learning dashboards: an overview and future research opportunities. Personal and Ubiquitous Computing, 1-16.
  18. Blended learning: CREATING EFFECTIVE LEARNING DASHBOARDS. Abundance of data - effort - outcome. RQ1: How should we visualise learner data to support students to explore the path from effort to outcomes? RQ2: How can we promote students, inside and outside the classroom, to actively explore this effort to outcomes path?
  19. Twitter / Blogs
  20. Inquiry-Based Learning
  22. Charleer, S., Klerkx, J., Santos, J. L., & Duval, E. Improving awareness and reflection through collaborative, interactive visualizations of badges. In Proceedings of the 3rd Workshop on Awareness and Reflection in Technology-Enhanced Learning, pages 69-81. CEUR Workshop Proceedings, 2013. ARTEL 2013 / ARTEL 2014, Graz, Austria.
  23. EVALUATIONS
  24. RESULTS. RQ1: What are relevant learning traces, and how should we visualise these data to support students to explore the path from effort to outcomes? Abstract the LA data; provide access to the artefacts; augment the abstracted data; provide access to teacher and peer feedback.
  25. RESULTS. RQ2: How can we promote students, inside and outside the classroom, to actively explore this effort to outcomes path? Visualise the learner path; integrate LA into the workflow; facilitate collaborative exploration of the LA data.
  26-29. RQ1: visualise to facilitate exploring. Visualise the learner path.
  30. F2F group work: BALANCED DISCUSSION IN THE CLASSROOM. Over- and under-participation. RQ3: What are the design challenges for ambient Learning Dashboards to promote balanced group participation in classrooms, and how can they be met? RQ4: Are ambient Learning Dashboards effective means for creating balanced group participation in classroom settings?
  31. Over-participation: “free-riders” can affect the motivated learner to reduce contributions. K. Bachour, F. Kaplan, and P. Dillenbourg. An interactive table for supporting participation balance in face-to-face collaborative learning. IEEE Trans. Learn. Technol., 3(3):203–213, July 2010. G. Salomon and T. Globerson. When teams do not function the way they ought to. International Journal of Educational Research, 13(1):89–99, 1989.
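One possible way to quantify such (im)balance from logged contributions, as a hedged sketch: per-member participation shares plus a normalised-entropy balance score (1 = perfectly balanced, 0 = one person does everything). The metric and the sample numbers are illustrative assumptions, not the measure used in the cited studies.

```python
# Compute each member's share of contributions and an overall balance score.
import math

def participation_balance(contributions):
    """Return each member's share and a balance score in [0, 1]."""
    total = sum(contributions.values())
    shares = {name: count / total for name, count in contributions.items()}
    n = len(contributions)
    entropy = -sum(p * math.log(p) for p in shares.values() if p > 0)
    balance = entropy / math.log(n) if n > 1 else 1.0
    return shares, balance

# Example: one over-participating member, one near free-rider.
shares, balance = participation_balance({"Ann": 40, "Bob": 8, "Cem": 2})
print(shares)            # {'Ann': 0.8, 'Bob': 0.16, 'Cem': 0.04}
print(round(balance, 2)) # ~0.55 -> far from balanced
```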
  32. EVALUATION SETUP
  33. EVALUATION SETUP. Case study 1: 12 students; deployment: one 3h session with dashboard, one 3h session without dashboard; evaluation: class discussion, questionnaires (perceived distraction/awareness/usefulness), activity/quality logging. Case study 2: 19 students; deployment: half of a 3h session without dashboard, half with dashboard; evaluation: questionnaires (perceived importance of feedback/motivation), activity/quality logging.
  34. EVALUATION SETUP
  35. RESULTS. RQ3: What are the design challenges for ambient LDs to promote balanced group participation in classrooms, and how can they be met? Visualise balance in an abstract and neutral way; add the qualitative dimension to the visualisation; create a realistic picture of the classroom situation.
  36. RESULTS. RQ4: Are ambient LDs effective means for creating balanced group participation in classroom settings? Ambient dashboards as support for teacher/presenter; ambient dashboards raise awareness of the invisible; ambient feedback information can activate students. Charleer, S., Klerkx, J., Duval, E., De Laet, T. and Verbert, K. (2017) ‘Towards balanced discussions in the classroom using ambient information visualisations’, Int. J. Technology Enhanced Learning, Vol. 9, Nos. 2/3, pp. 227–253.
  37. SUPPORTING ADVISER-STUDENT DIALOGUE. Lack of data-based feedback. RQ5: What are the design challenges for creating a Learning Dashboard to support study advice sessions, and how can they be met? RQ6: How does such a Learning Dashboard contribute to the role of the adviser, student, and dialogue?
  40. EVALUATION SETUP. Design phase: participants: 17 study advisers (preliminary feedback), 5 study advisers (iterative feedback); approach: brainstorms/observations, iterative design. Evaluation phase: participants: 5 study advisers; deployment: Engineering Science and Engineering Science: Architecture, 97 sessions (15-30 min per session); evaluation: 15 sessions observed, questionnaires on perceived usefulness.
  41. Factual insights (-), interpretative insights (+), reflective insights (!).
  42. S. Charleer, A. Vande Moere, J. Klerkx, K. Verbert, and T. De Laet. Learning analytics dashboards to support adviser-student dialogue. IEEE Transactions on Learning Technologies, 14 pages.
  43. RESULTS. S. Claes, N. Wouters, K. Slegers, and A. V. Moere. Controlling In-the-Wild Evaluation Studies of Public Displays. pages 81–84, 2015.
  44. LISSA supports a personal dialogue: ✓ the level of usage depends on the experience and style of the study advisors; ✓ fact-based evidence at the side; ✓ narrative thread; ✓ key moments and the student path help to reconstruct the personal track. “When students see the numbers, they are surprised, but now they believe me. Before, I used my gut feeling, now I feel more certain of what I say as well.” “It’s like a main thread guiding the conversation.” “I can talk about what to do with the results, instead of each time looking for the data and puzzling it together.” “Students don’t know where to look during the conversation, and avoid eye contact. The dashboard provides them a point of focus.” “A student changed her study method in June and could now see it paid off.” “I can focus on the student’s personal path, rather than on the facts.” “Now, I can blame the dashboard and focus on collaboratively looking for the next step to take.”
  45. LISSA dashboard: https://able.cs.kuleuven.be/demo-september/2016/1
  46. Doubting to continue (Group 1); doubting which courses to take (Group 2); doubting which courses to deliberate (Group 3). Martijn Millecamp, Francisco Gutiérrez, Sven Charleer, Katrien Verbert, Tinne De Laet. A qualitative evaluation of a learning dashboard to support advisor-student dialogues. FP@LAK18.
  47-49. Results per group (Group 1: doubting to continue; Group 2: which courses to take; Group 3: which courses to deliberate). Time dashboard used: 0.58 / 0.43 / 0.43. Avg. number of insights: 13.8 / 10.1 / 3.67. Avg. number of factual insights: 4.7 / 3.9 / 0.33. Avg. number of interpretative insights: 3.33 / 3 / 3.2. Avg. number of reflective insights: 5.8 / 3.2 / 2.
  50. RESULTS. RQ5: What are the design challenges for creating a Learning Dashboard to support study advice sessions, and how can they be met? Authorship; visual encoding; ethics. RQ6: How does such a Learning Dashboard contribute to the role of the adviser, student, and dialogue? Data; confidence; collaboration; adviser’s role.
  51. [!] Wording matters. Not “73% chance of success”, but “73% of students of earlier cohorts with the same study efficiency obtained the bachelor degree”. http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
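A sketch of how the reworded statistic could be computed directly from historical data: the share of earlier-cohort students in the same study-efficiency bin who obtained the degree. The cohort records, the bin width and the `observed_success_rate` helper are invented for illustration.

```python
# Report the observed graduation share among comparable past students,
# rather than a "chance of success" for the individual.
def observed_success_rate(cohort, efficiency, bin_width=10):
    """Share of past students in the same study-efficiency bin who graduated.

    `cohort` is a list of (study_efficiency_percent, obtained_degree) tuples.
    """
    lo = (efficiency // bin_width) * bin_width
    peers = [grad for eff, grad in cohort if lo <= eff < lo + bin_width]
    if not peers:
        return None  # no comparable students: say so, rather than guess
    return sum(peers) / len(peers)

# Invented historical data: (study efficiency in %, obtained bachelor degree?)
cohort = [(75, True), (72, True), (78, False), (71, True),
          (45, False), (48, False), (90, True), (74, True)]

rate = observed_success_rate(cohort, efficiency=73)
print(f"{rate:.0%} of students of earlier cohorts with a similar "
      f"study efficiency obtained the bachelor degree")
```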
  52-53. [!] Do not oversimplify. Show uncertainty: reality is complex; measurement is limited; individual circumstances; need for nuance; trigger reflection.
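Building on the previous sketch, one simple way to show uncertainty rather than a single number is to report an interval around the observed share, for example a 95% Wilson score interval. The function below is a generic statistics sketch, not part of the LISSA dashboard.

```python
# 95% Wilson score interval for an observed proportion, so a dashboard can
# show a plausible range instead of a single point estimate.
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score confidence interval for a proportion."""
    if n == 0:
        return (0.0, 1.0)  # no data: maximal uncertainty
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return (max(0.0, centre - half), min(1.0, centre + half))

# Using the invented numbers from the previous sketch (4 of 5 peers graduated).
low, high = wilson_interval(successes=4, n=5)
print(f"observed 80%, plausible range {low:.0%}-{high:.0%} (small sample!)")
```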
  54. LISSA: status. 26 programs, >4500 students, 114 student advisors; training of study advisors; dashboards for three examination periods.
  55. Next steps: available data; national and institutional regulations and culture; educational vision; educational system, size of population; …
  56. Thank you! Questions? Katrien Verbert – KU Leuven, katrien.verbert@cs.kuleuven.be, @katrien_v. Augment Lab. Slide design: Sven Charleer.
