
SHEILA workshop at EC-TEL 2018

Slides used in the SHEILA workshop at EC-TEL 2018 in Leeds, UK, on September 3, 2018.



  1. Developing evidence-based institutional learning analytics policy. Workshop at the 13th European Conference on Technology Enhanced Learning, 3 September 2018. http://sheilaproject.eu/
  2. Schedule: 9.00–10.45 SHEILA framework + policy development; 10.45–11.15 break; 11.15–12.45 policy development.
  3. SHEILA project overview & senior managers’ views. Yi-Shan Tsai, University of Edinburgh. yi-shan.tsai@ed.ac.uk, @yi_shan_tsai
  4. Supporting Higher Education to Integrate Learning Analytics (SHEILA).
  5. Objectives: the state of the art; direct engagement with key stakeholders; a comprehensive policy framework.
  6. Inclusive adoption process. Macfadyen, L., Dawson, S., Pardo, A., & Gašević, D. (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17–28.
  7. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  9. Adoption challenge: leadership for strategic implementation & monitoring. Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233–242).
  10. Adoption challenge: equal engagement with different stakeholders (Tsai & Gašević, 2017).
  11. Adoption challenge: training to cultivate data literacy among primary stakeholders (Tsai & Gašević, 2017).
  12. Adoption challenge: policies for learning analytics practice (Tsai & Gašević, 2017).
  13. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  14. What is the state of the art? What are the drivers? What are the challenges?
  15. Survey: 22 countries, 46 institutions; November 2016. [Bar chart: adoption of LA (no plans / in preparation / implemented), split by institution-wide vs. small-scale vs. N/A; counts 2, 13, 15, 16.]
  16. Interviews: 16 countries, 51 HEIs, 64 interviews, 78 participants; August 2016 – January 2017. [Bar chart: adoption of learning analytics (no plans / in preparation / implemented), split by institution-wide vs. partial/pilots vs. data exploration/cleaning; counts 9, 7, 5, 12, 18.]
  17. Motivations to adopt learning analytics (46 institutions): to improve student learning performance – 40 (87%); to improve student satisfaction – 33 (72%); to improve teaching excellence – 33 (72%); to improve student retention – 26 (57%); to explore what learning analytics can do for our institution/staff/students – 25 (54%).
  20. Why learning analytics? [Diagram: learner driver → self-regulation; teaching driver → learning support; institutional driver → performance.]
  21. “People are thinking about learning analytics as a way to try and personalise education and enhance education. And actually make our education more inclusive both by understanding how different students engage with different bits of educational processes, but also about through developing curricula to make them more flexible and inclusive as a standard.”
  22. “I think what we would be looking at is how do we evolve the way we teach to provide better learning outcomes for the students, greater mastery of the subject.”
  23. “We’re trying to understand better the curriculum that needs to be offered for the students in our region. And…I think importantly how our pedagogical model fits that and deliver the best experience for our students.”
  24. Barriers to the success of learning analytics: analytics expertise – 34 (76%); a data-driven culture at the institution – 30 (67%); teaching staff/tutor buy-in – 29 (64%); the affordances of current learning analytics technology – 29 (64%).
  25. Ethical and privacy concerns: access, transparency, anonymity.
  26. Implications: Interest was high, but experience was still at an early stage. There was strong motivation to increase institutional performance by improving teaching quality. Key barriers concerned skills, institutional culture, technology, ethics and privacy.
  27. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  28. Staff survey
  29. Goal of the survey: with regard to learning analytics, what do academic staff ideally expect to happen, and what do they predict will actually happen?
  30. Four academic institutions, spring to fall 2017: University of Edinburgh (n = 81), Universidad Carlos III de Madrid (n = 26), Open Universiteit (n = 54), Tallinn University (n = 49).
  31. 16 items; some examples: “The university will provide me with guidance on how to access LA about my students.” “The LA service will show how a student’s learning progress compares to their learning goals/the course objectives.” “The teaching staff will have an obligation to act if the analytics show that a student is at risk of failing, underperforming, or that they could improve their learning.”
  32. Highest expectation values. University of Edinburgh – ideal: LA will collect and present data that is accurate (M = 5.91, Q9); predicted: the university will provide guidance on how to access LA about students (M = 5.05, Q1). Carlos III de Madrid – ideal: LA presented in a format that is understandable and easy to read (M = 6.31, Q11); predicted: LA will present students with a complete profile of their learning across every course (M = 5.27, Q12).
  33. Highest expectation values. Open Universiteit Nederland – ideal: LA will collect and present data that is accurate (M = 6.60, Q9); predicted: able to access data about students’ progress in a course that I am teaching (M = 5.17, Q4). University of Tallinn – ideal: able to access data about students’ progress in a course that I am teaching (M = 6.04, Q4); predicted: the same item (M = 5.49, Q4).
  34. Lowest expectation values. University of Edinburgh – ideal: teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 3.65, Q14); predicted: teaching staff will be competent in incorporating analytics into the feedback and support they provide to students (M = 3.49, Q13). Carlos III de Madrid – ideal: the obligation-to-act item (M = 4.42, Q14); predicted: the same item (M = 3.77, Q14).
  35. Lowest expectation values. Open Universiteit Nederland – ideal: the obligation-to-act item (M = 4.44, Q14); predicted: feedback from analytics will be used to promote students’ academic and professional skill development for future employability (M = 3.24, Q15). University of Tallinn – ideal: the obligation-to-act item (M = 4.80, Q14); predicted: the same item (M = 3.82, Q14).
  36. Staff focus groups
  37. Goal: to better understand the viewpoints of academic staff on learning analytics opportunities in HEIs from the perspective of students, teachers and programmes; concerns related to adopting learning analytics; and the steps needed to adopt learning analytics at HEIs.
  38. Study participants: University of Edinburgh – 5 focus groups, 18 teaching staff; Universidad Carlos III de Madrid – 4 focus groups, 16 teaching staff; Open Universiteit Nederland – 2 focus groups, 5 teaching staff; Tallinn University – 5 focus groups, 20 teaching staff.
  39. Results: expectations & LA opportunities. Student level: take responsibility for their learning and enhance their SRL skills; assess their degree of success so that students are neither unduly worried nor overly optimistic about their performance. Teacher level: a method to identify students’ weaknesses and know where students are in their progress; understand how students engage with learning content; improve the design and provision of learning materials, courses, curriculum and student support. Programme level: understand how the programme is working (strengths and bottlenecks); improve educational quality (e.g. content level).
  40. Results: meaningful data
  41. Results: concerns – student level. [Meme image; source: https://www.pinterest.com/pin/432486370448743887/]
  42. Results: concerns – teacher level. [Meme images; sources: http://create-learning.com, https://www.pinterest.com/pin/432486370448743887/, http://memegenerator.net]
  43. Results: concerns – programme level. Interpretation of learning: Was the right data collected? Were accurate algorithms developed? Was an appropriate message given to the students? Connecting LA to real learning – does this give a meaningful picture of the learning that happens in online environments?
  44. What should we consider? LA should be just one component of many for collecting feedback and enhancing decision-making. Involve stakeholders: academic staff in developing and setting up LA, and pedagogy experts to ensure the data makes sense for improving learning. Provide training and communication!
  45. What should we consider? Design tools that are easy to use, provide visualizations of data, do not require mathematical/statistical skills, do not take a lot of time, and take ethical and privacy aspects into account.
  46. Student views. Pedro Manuel Moreno Marcos, Department of Telematics Engineering, Universidad Carlos III de Madrid. pemoreno@it.uc3m.es
  47. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  48. Student survey results
  49. Background: 12-item survey with two subscales (ethical and privacy expectations; service expectations). Six distributions: Edinburgh (N = 884), Liverpool (N = 191), Tallinn (N = 161), Madrid (N = 543), Netherlands (N = 1247), Blanchardstown (N = 237).
  50. Ethical and privacy expectations. [Chart: item averages (scale 1–7) on the ideal and predicted expectation scales for the items Alternative Purpose, Consent to Collect, Identifiable Data, Keep Data Secure and Third Party, by institution: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn. A sketch of how such item averages can be computed follows the slide list.]
  51. Keep Data Secure – predicted expectation scale. [Chart: distribution of responses from “strongly disagree” to “strongly agree” (percentages), by institution.]
  52. Consent to Collect – predicted expectation scale. [Chart: distribution of responses by institution.]
  53. Service expectations. [Chart: item averages (scale 1–7) on the ideal and predicted expectation scales for the items Obligation to Act, Integrate into Feedback, Skill Development, Regularly Update, Complete Profile, Student Decision Making and Course Goals, by institution.]
  54. Course Goals – predicted expectation scale. [Chart: distribution of responses by institution.]
  55. Obligation to Act – predicted expectation scale. [Chart: distribution of responses by institution.]
  56. Summary: beliefs about learning analytics are not consistent; the emphasis is on data security and improving learning.
  57. Student focus groups
  58. Background: 18 focus groups at the 4 partner institutions; 74 students; sessions lasted around one hour.
  59. Interests and expectations: improve the quality of teaching; better student–teacher feedback; better academic resources and tools to improve learning; personalized support; recommendation of learning resources; feedback from a system, via a dashboard; an overview of the tasks to be done in a semester → improved curriculum design.
  60. Awareness: students do not know what LA is, but they recognise its importance if it can solve their problems; students are generally not aware of the data collected → transparency; students have not read the data conditions they accepted.
  61. Concerns: surveillance; anonymization; purpose of data; kind of data; consent and access; security; provision of opt-outs; stereotypes and biases.
  62. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  63. Group concept mapping. Dr. Maren Scheffel, Open Universiteit Nederland
  64. Group concept mapping proceeds in three steps: 1. brainstorm, 2. sort, 3. rate. [Screenshot: example brainstormed statements and rated items from a sample (health-care network) concept-mapping exercise.]
  65. Group concept mapping: focus prompt “An essential feature of a higher education institution’s learning analytics policy should be …”
  66. Group concept mapping: online sorting. [Screenshot]
  67. Group concept mapping: online rating. [Screenshot]
  68. Participants
  69. Participants
  70. Point map of the 99 brainstormed statements. (An analysis sketch follows the slide list.)
  71.–73. Cluster replay map (build across three slides).
  74. Cluster map: 1. privacy & transparency; 2. roles & responsibilities (of all stakeholders); 3. objectives of LA (learner and teacher support); 4. risks & challenges; 5. data management; 6. research & data analysis.
  75. Rating map – importance: the six clusters shaded by mean importance rating, in five layers from 5.08 to 6.03.
  76. Rating map – ease: the six clusters shaded by mean ease rating, in five layers from 3.79 to 5.44.
  77. Rating ladder graph comparing cluster-level importance and ease ratings (scale range 3.79–6.03); r = 0.66.
  78. Go zone for the cluster “roles & responsibilities”: statement mean importance (3.83–6.59, mean 5.48) plotted against mean ease (3.12–6.08, mean 4.72); r = 0.26. Highlighted statements: 55. being clear about the purpose of learning analytics; 61. a clear articulation of responsibilities when it comes to the use of institutional data. (A plotting sketch follows the slide list.)
  79. Tsai, Y.-S., Moreno-Marcos, P. M., Jivet, I., Scheffel, M., Tammets, K., Kollom, K., & Gašević, D. (to appear). The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. Journal of Learning Analytics.
  80. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
  81. SHEILA framework
  82. SHEILA policy framework
  83. Methodology: Literature (policy, adoption); academic staff (survey, focus groups); students (survey, focus groups); senior managers (survey, interviews); experts (group concept mapping); other stakeholders (workshops, committees) → policy framework → institutional policy/strategy.
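The expectation averages reported above (slides 32–35) and charted on slides 50 and 53 are per-item means of 7-point Likert responses, split by institution and by the ideal vs. predicted scale. A minimal pandas sketch of that aggregation, assuming a hypothetical long-format table responses.csv with columns institution, item, scale ('ideal'/'predicted') and response (1–7); the file and column names are illustrative, not the project's actual data format:

```python
import pandas as pd

# Hypothetical long-format export: one row per respondent x item x scale,
# with 7-point Likert responses (1 = strongly disagree ... 7 = strongly agree).
responses = pd.read_csv("responses.csv")  # columns: institution, item, scale, response

# Mean response per institution, item and scale.
means = (
    responses.groupby(["institution", "item", "scale"])["response"]
    .mean()
    .unstack("scale")  # one column per scale: 'ideal', 'predicted'
)

# Gap between what respondents ideally expect and what they predict will happen.
means["gap"] = means["ideal"] - means["predicted"]

# Highest predicted expectation per institution (cf. slides 32-33).
print(means.sort_values("predicted", ascending=False).groupby("institution").head(1))
```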

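The point map and cluster maps (slides 70–74) follow the usual group concept mapping analysis: each participant's card sort is aggregated into a statement co-occurrence matrix, similarities are converted to distances and projected to 2-D with multidimensional scaling, and the points are grouped with hierarchical (Ward) clustering. A minimal sketch with scikit-learn and SciPy on toy data; the sorts below are illustrative, not the SHEILA statements:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.manifold import MDS

# Toy data: 6 statements, 3 participants. Each sort is a list of piles;
# each pile is a list of statement indices.
n = 6
sorts = [
    [[0, 1], [2, 3], [4, 5]],
    [[0, 1, 2], [3, 4, 5]],
    [[0], [1, 2, 3], [4, 5]],
]

# Co-occurrence matrix: how often two statements were put in the same pile.
cooc = np.zeros((n, n))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                cooc[i, j] += 1

# Turn similarities into distances and project to 2-D (the point map).
dist = len(sorts) - cooc
np.fill_diagonal(dist, 0)
xy = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dist)

# Ward clustering of the 2-D coordinates, cut into k clusters (the cluster map).
labels = fcluster(linkage(xy, method="ward"), t=3, criterion="maxclust")
print(labels)
```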
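The go-zone graph (slide 78) plots each statement's mean importance against its mean ease and splits the plane at the per-dimension means (5.48 and 4.72 on the slide); the top-right quadrant, the go zone, holds statements rated both important and easy to realise. A minimal matplotlib sketch on random toy ratings; the values are illustrative, not the SHEILA ratings:

```python
import matplotlib.pyplot as plt
import numpy as np

# Toy per-statement mean ratings on the 1-7 scales.
rng = np.random.default_rng(0)
importance = rng.uniform(3.8, 6.6, size=21)
ease = rng.uniform(3.1, 6.1, size=21)
r = np.corrcoef(importance, ease)[0, 1]  # cf. r = 0.26 on slide 78

fig, ax = plt.subplots()
ax.scatter(importance, ease)
# Quadrant cuts at the mean of each dimension; top-right is the go zone.
ax.axvline(importance.mean(), color="grey")
ax.axhline(ease.mean(), color="grey")
ax.set_xlabel("importance")
ax.set_ylabel("ease")
ax.set_title(f"Go zone (r = {r:.2f})")
plt.show()
```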