
SHEILA Project – LAK18 Workshop Slides


Slides from the LAK18 workshop "Developing an evidence-based institutional LA policy".


  1. Developing an evidence-based institutional LA policy – SHEILA Project. Yi-Shan Tsai, Dragan Gašević, Maren Scheffel, Pedro Manuel Moreno Marcos, Alexander Whitelock-Wainwright. LAK18 Workshop, 6 March 2018
  2. Programme (9.00–12.30): SHEILA overview; Student & staff surveys; Student & staff focus groups; SHEILA policy framework; Developing an institutional policy. Break: 10.30–11.00
  3. SHEILA overview
  4. Developing an evidence-based institutional learning analytics policy. Dragan Gašević @dgasevic. 6 March 2018, LAK’18 SHEILA workshop, Sydney, NSW, Australia
  5. Our institution is in the early days of adoption
  6. Adoption challenges
  7. Adoption challenge: Leadership for strategic implementation & monitoring. Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233–242).
  8. Adoption challenge: Equal engagement with different stakeholders (Tsai & Gašević, 2017)
  9. Adoption challenge: Training to cultivate data literacy among primary stakeholders (Tsai & Gašević, 2017)
  10. Adoption challenge: Policies for learning analytics practice (Tsai & Gašević, 2017)
  11. What’s necessary to move forward?
  12. SHEILA
  13. Inclusive adoption process
  14. Inclusive adoption process
  15. Inclusive adoption process. Macfadyen, L., Dawson, S., Pardo, A., & Gašević, D. (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17–28.
  16. Methodology: Literature (policy; adoption); Academic staff (survey; focus groups); Students (survey; focus groups); Senior managers (survey; interviews); Experts (group concept mapping) → Policy framework → Institutional policy / other stakeholders (workshops; committees)
  17. Student & staff surveys
  18. Student Survey Update (Alex Wainwright)
  19. Development and Validation • Validated in multiple contexts. • Mainly focused on descriptive statistics. • What next?
  20. Latent Class Analysis • Segmentation procedure. • Identifies latent sub-groups. • Includes covariates to predict class membership.
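The latent class machinery itself is not shown on the slides. As a minimal, self-contained sketch of the idea (synthetic binary expectation items, two latent classes, no covariates, and a plain EM fit; all data here is simulated, not SHEILA data), a binary latent class model can be estimated as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 1000 respondents, 6 dichotomised expectation items,
# generated from two latent classes with distinct endorsement profiles.
n_obs, n_classes, n_items = 1000, 2, 6
true_pi = np.array([0.6, 0.4])                      # true class proportions
true_p = np.array([[0.9] * n_items, [0.2] * n_items])  # P(item = 1 | class)
z = rng.choice(n_classes, size=n_obs, p=true_pi)
x = (rng.random((n_obs, n_items)) < true_p[z]).astype(float)

# EM for a binary latent class model
pi = np.full(n_classes, 1.0 / n_classes)
p = rng.uniform(0.3, 0.7, size=(n_classes, n_items))
for _ in range(200):
    # E-step: posterior class responsibilities for each respondent
    log_resp = x @ np.log(p).T + (1 - x) @ np.log(1 - p).T + np.log(pi)
    log_resp -= log_resp.max(axis=1, keepdims=True)
    resp = np.exp(log_resp)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: update class proportions and item-endorsement probabilities
    pi = resp.mean(axis=0)
    p = np.clip((resp.T @ x) / resp.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)

print(np.sort(pi))  # recovered class proportions, close to [0.4, 0.6]
```

Adding covariates (as in the slides' formula) turns the class-proportion step into a multinomial logistic regression on the responsibilities; dedicated software (e.g. an LCA package) handles that jointly.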
  21. Model Tested • Dutch students (n = 1240). • Formula: (Expectation_Items) ~ Age + Gender + student_Type + level_Of_Study + Faculty
  22. Results • Four-class solution. • Age and student_Type are important covariates.
  23.–24. (figure-only slides)
  25. Latent Class Regression • Class 3 membership associated with age and student type. • Age (β = .02, standard error = .009). • Student type (European student: β = .97, standard error = .36)
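The reported coefficients can be sanity-checked by hand: dividing each β by its standard error gives a Wald z statistic, and exponentiating β gives an odds ratio (assuming the class-membership model is a multinomial logit, as is standard in latent class regression):

```python
import math

def wald_z(beta, se):
    """Wald z statistic for a single regression coefficient."""
    return beta / se

# Values taken from the slide (Class 3 membership)
age_z = wald_z(0.02, 0.009)            # ≈ 2.22, significant at the 5% level
student_type_z = wald_z(0.97, 0.36)    # ≈ 2.69
student_type_or = math.exp(0.97)       # ≈ 2.64: European students have roughly
                                       # 2.6 times the odds of Class 3 membership
print(round(age_z, 2), round(student_type_z, 2), round(student_type_or, 2))
```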
  26.–27. (figure-only slides)
  28. Why is this important? • Gauging stakeholder expectations early on can offset feelings of dissatisfaction. • Expectations are not homogeneous. • The institution is better able to manage expectations. • It helps prevent the creation of unrealistic expectations.
  29. Cross-Cultural Differences • Data from 6 universities. • Comparing items on descriptive statistics alone risks making large inferences. • Multiple ANOVAs? • Multiple-Indicators Multiple-Causes (MIMIC) modelling
  30. MIMIC Modelling • Used in confirmatory factor analysis. • A direct effect of a covariate on the latent variable indicates population heterogeneity. • A direct effect on the indicators indicates differential item functioning.
  31. Two Approaches • Formulate hypotheses – but there is no prior research to ground them. • Exploratory approach – no hypotheses.
  32. Exploratory Approach • Dummy codes (k-1). • The Dutch university is the baseline (largest sample). • All direct effects set to zero – inspect modification indices (MI). • High MI value – freely estimate that direct effect.
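The k-1 dummy coding with the Dutch university as baseline can be sketched with pandas; the institution labels below are placeholders, not the project's actual variable names:

```python
import pandas as pd

# Placeholder institution labels standing in for the participating universities.
uni = pd.Series(["Dutch", "Spanish", "UK", "Dutch", "Dutch", "Spanish"])

# Put the baseline (the Dutch university, largest sample) first in the
# category order, then create k-1 dummy columns by dropping that first level.
uni = pd.Categorical(uni, categories=["Dutch", "Spanish", "UK"])
dummies = pd.get_dummies(uni, drop_first=True)
print(list(dummies.columns))  # ['Spanish', 'UK']: Dutch rows are all zeros
```

Each retained column then enters the MIMIC model as a covariate whose coefficient is a contrast against the Dutch baseline.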
  33. Predicted Expectations MIMIC Model. Direct effects with 95% CIs [.025, .975]: Ethics_Factor ~ Spanish_University [-.54, -.36]; Q1_2 ~ Spanish_University [-.33, -.08]; Q5_2 ~ uk_University [-.52, -.14]
  34. MIMIC Model Interpretation • Spanish students score .46 units lower (95% CI [-.54, -.36]) than Dutch students on the Ethics and Privacy factor. • UK students score significantly lower on item 5 (-.33; 95% CI [-.52, -.14]) than Dutch students (i.e., UK students have lower expectations that the university will seek consent to collect data).
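For a symmetric interval, the point estimate and standard error can be recovered from the reported 95% CI bounds (estimate = midpoint, SE = width / (2 × 1.96)); a small illustrative helper, using the UK direct effect from the results slide:

```python
def from_ci(lower, upper, z=1.96):
    """Recover point estimate and SE from a symmetric 95% confidence interval."""
    estimate = (lower + upper) / 2
    se = (upper - lower) / (2 * z)
    return estimate, se

# UK direct effect on Q5_2, CI bounds taken from the MIMIC results slide
est, se = from_ci(-0.52, -0.14)
print(round(est, 2), round(se, 3))  # -0.33 0.097
```

Note the slides' factor-level estimate (-.46) differs slightly from the CI midpoint (-.45), which suggests a bootstrap or otherwise asymmetric interval; the helper above only applies to symmetric Wald-type CIs.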
  35. Next • These findings need to be grounded. • Are there cultural reasons for these differences?
  36. Student & staff focus groups
  37. University of Edinburgh: 6 student focus groups (26 students); 5 staff focus groups (18 teaching staff). Yi-Shan Tsai
  38. Interests – students: Personalised support • Inform teaching support and curriculum design. • Support a widening access policy. • Support students at all achievement levels to improve learning. • Assist with transitions from pre-tertiary education to higher education.
  39. Concerns – students • Surveillance • Stereotypes and biases
  40. Concerns – students: Legitimate or illegitimate? • Purpose • Anonymity • Access. Privacy paradox; transparency; effective communication
  41. Interests – teaching staff • Pedagogical interests: know how students engage with learning content; improve the design and provision of learning materials, curriculum, and support to students.
  42. Concerns – teaching staff: Student-centred concerns • Profiling students & unequal support • Privacy & autonomy • Demotivation & anxiety • Behaviour alteration
  43. Concerns – teaching staff: Teacher-centred concerns • Time pressure • Performance judgement • Disrespect for teaching professionalism • Managing expectations
  44. Concerns – teaching staff: LA-centred concerns • Differences among individual students/teachers/courses/subjects/disciplines/faculties • Interpretations of learning (data collection, analysis & analytics interpretation) • Damaging teacher–student relationships • LA capabilities
  45. Tallinn University: 5 student focus groups (18 students); 5 staff focus groups (20 teaching staff). Kairit Tammets
  46. Student focus groups • Awareness: • Students do not know what LA is or the possibilities of using their data for different purposes • Students are not aware of the educational data the university may hold about them, but they generally assume that the university has access to data and already uses it in some processes • Students have not checked their contracts for whether and how the use of their educational data in different processes is regulated
  47. Student focus groups • Expectations: • Students consider improving the quality of teaching the main priority for developing LA services, e.g. what learning methods are used in the classrooms? • An overview of the tasks to be submitted in the current term was considered very necessary and appreciated: due date, course, and environment • Recommendation of learning resources from different databases, libraries, etc., based on group and individual needs was considered relevant • Consent, privacy: • Students are sensitive about access to their educational data. Support staff (such as a study counsellor) may be the ones who see the overall picture of a learner, but not personal data (health issues, educational history, workplace) • Any teacher or programme head should see only the generalised results of the group, not individual data about a learner. Reason: lack of trust in staff; prejudice may develop • Students’ anonymised educational data should be shared with third parties only for non-commercial purposes.
  48. Staff focus groups Awareness: • Staff from the ICT and educational fields were more aware of the concept and possibilities of LA; Expectations: • Staff mainly see LA as a tool for supporting students: enhancing their SRL competences by making decisions about their own learning • LA was also perceived as a tool for “managing my course better” • Programme heads perceive LA differently from regular staff: they would like to know more about learners than just one course (background, learning path, results in other courses) • Programme heads are more willing to act when a learner is identified as under-performing, dropping out, etc.; regular staff felt it is the student’s responsibility to act on identified problems
  49. Staff focus groups • Challenges: • Regular staff consider LA time-consuming: • monitoring individual students feels like teaching in secondary school, whereas university is a voluntary place and students should take responsibility • Staff are not sure they would improve their practice based on LA data, because this should already be part of professional practice (“I am doing it anyway”) • Interest in seeing learners’ personal data (background data, health data): • Where are students coming from (what have they studied before; what is their profile)? • Where do they work; who is my audience? • Should teaching staff be aware of (mental) health issues? • Worry: if I had personal data about a student, how would it influence my opinion of that student?
  50. Universidad Carlos III de Madrid: 5 student focus groups (23 students); 4 staff focus groups (16 teaching staff). Aarón Rubio Fernández, Pedro Manuel Moreno Marcos
  51. Students’ interests about LA • Students demand improvements in their learning experiences: § Better student–teacher feedback. § More information about the gathering and analysis of their data. § Better academic resources (PDFs, videos, etc.) and better academic tools and services (such as e-learning platforms or book recommendation services). § A personalised learning environment for each student.
  52. Students’ interests about LA • Most students do not know exactly what LA is, but they know that the use of their data could improve their learning experience. • Therefore, their interest in the analysis of their data will grow if LA can solve students’ problems.
  53. Students’ concerns about LA • Students have several concerns related to their data: § What kind of data are gathered and analysed? § Who can access the students’ data? § What are the purposes of the analysis? § Are the data secure? § Is it possible to remove students’ data from the collection without consequences?
  54. Implications for an LA policy • Based on students’ interests and concerns, an LA policy should: § Provide students with useful information that improves their knowledge about LA. § Identify students’ problems, define how they should be solved, and select the best techniques and tools to do so. § Take students’ concerns about their data into account (for example, anonymising data whenever possible).
  55. Open Universiteit Nederland: 2 student focus groups (7 students); 2 staff focus groups (5 teaching staff). Maren Scheffel
  56. Student Focus Groups OUNL • Students know that data is being collected but don’t really know what is being done with it; more transparency is needed • Data should mainly be used for student support • Always ask when results are at the individual level; data as part of an average is ok • Opting out of data collection should be possible for some aspects; opting out of feedback provision is especially important
  57. Student Focus Groups OUNL • Feedback should come via a dashboard • Feedback should come from the system, NOT the teacher • Feedback should be personalised and customisable • Feedback should be a pull scenario, not a push scenario → University students are responsible for their own learning → Regular feedback backed by a teacher works well at school but is less needed at university
  58. SHEILA policy framework
  59. SHEILA policy framework
  60. Developing an institutional policy
  61. Group discussion walls • Dimension 1: • Dimension 2: • Dimension 3: • Dimension 4: • Dimension 5: • Dimension 6: