
SHEILA Results – Conference 5 June 2018


  1. 1. SHEILA Project Results Dragan Gašević @dgasevic SHEILA Conference 5th June 2018 Brussels, Belgium http://sheilaproject.eu/
  2. 2. Inclusive adoption process
  3. 3. Inclusive adoption process http://sheilaproject.eu/
  4. 4. Inclusive adoption process Macfadyen, L., Dawson, S., Pardo, A., Gašević, D., (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17-28.
  5. 5. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  6. 6. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  7. 7. Adoption challenge Leadership for strategic implementation & monitoring Tsai, Y. S., & Gasevic, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
  8. 8. Adoption challenge Equal engagement with different stakeholders Tsai, Y. S., & Gasevic, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
  9. 9. Adoption challenge Training to cultivate data literacy among primary stakeholders Tsai, Y. S., & Gasevic, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
  10. 10. Adoption challenge Policies for learning analytics practice Tsai, Y. S., & Gasevic, D. (2017). Learning analytics in higher education – challenges and policies: a review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
  11. 11. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  12. 12. Institutional survey & interviews Yi-Shan Tsai University of Edinburgh yi-shan.tsai@ed.ac.uk @yi_shan_tsai
  13. 13. What is the state of the art? What are the drivers? What are the challenges?
  14. 14. Survey • 22 countries, 46 institutions • November 2016 [Chart: the adoption of LA (no plans / in preparation / implemented), broken down by institution-wide, small scale, N/A; counts: 2, 13, 15, 16]
  15. 15. Interviews • 16 countries, 51 HEIs, 64 interviews, 78 participants • August 2016 - January 2017 [Chart: the adoption of learning analytics (no plans / in preparation / implemented), broken down by institution-wide, partial/pilots, data exploration/cleaning; counts: 9, 7, 5, 12, 18]
  16. 16. Motivations to adopt learning analytics • To improve student learning performance – 40 (87%) • To improve student satisfaction – 33 (72%) • To improve teaching excellence – 33 (72%) • To improve student retention – 26 (57%) • To explore what learning analytics can do for our institution/staff/students – 25 (54%) 46 institutions
  17. 17. Motivations to adopt learning analytics • To improve student learning performance – 40 (87%) • To improve student satisfaction – 33 (72%) • To improve teaching excellence – 33 (72%) • To improve student retention – 26 (57%) • To explore what learning analytics can do for our institution/staff/students – 25 (54%) 46 institutions
  18. 18. Motivations to adopt learning analytics • To improve student learning performance – 40 (87%) • To improve student satisfaction – 33 (72%) • To improve teaching excellence – 33 (72%) • To improve student retention – 26 (57%) • To explore what learning analytics can do for our institution/staff/students – 25 (54%) 46 institutions
  19. 19. Why learning analytics? LA Learner driver Teaching driver Institutional driver Self-regulation Learning support Performance
  20. 20. Why learning analytics? LA Learner driver Teaching driver Institutional driver Self-regulation Learning support Performance
  21. 21. “People are thinking about learning analytics as a way to try and personalise education and enhance education. And actually make our education more inclusive both by understanding how different students engage with different bits of educational processes, but also about through developing curricula to make them more flexible and inclusive as a standard.”
  22. 22. “I think what we would be looking at is how do we evolve the way we teach to provide better learning outcomes for the students, greater mastery of the subject.”
  23. 23. “We’re trying to understand better the curriculum that needs to be offered for the students in our region. And…I think importantly how our pedagogical model fits that and deliver the best experience for our students.”
  24. 24. Barriers to the success of learning analytics • Analytics expertise – 34 (76%) • A data-driven culture at the institution – 30 (67%) • Teaching staff/tutor buy-in – 29 (64%) • The affordances of current learning analytics technology – 29 (64%)
  25. 25. Ethical and privacy concerns: access, transparency, anonymity
  26. 26. Analytical capability challenge • More than half of the institutions with 0-3 years' experience indicated noticeable gaps among different stakeholders regarding perceptions and understanding of learning analytics.
  27. 27. Success • We have achieved the goals that we set for learning analytics.
  28. 28. Success • Most institutions had not yet confirmed success. • Half of the institutions with 0-3 years' experience claimed "gaining experience" as part of their success.
  29. 29. Implications • Interest was high but experience was still at an early stage. • There was strong motivation to increase institutional performance by improving teaching quality. • Key barriers were around skills, institutional culture, technology, ethics and privacy.
  30. 30. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  31. 31. Student Survey Results Alex Wainwright University of Liverpool a.wainwright@Liverpool.ac.uk http://sheilaproject.eu/
  32. 32. Background • 12-item survey • Two subscales: • Ethical and Privacy Expectations • Service Expectations • 6 distributions: • Edinburgh (N = 884) • Liverpool (N = 191) • Tallinn (N = 161) • Madrid (N = 543) • Netherlands (N = 1247) • Blanchardstown (N = 237) http://sheilaproject.eu/
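
A minimal sketch of how the per-location subscale averages in the next figures could be computed, assuming a hypothetical flat data layout (the file name and column names are illustrative, not the project's actual pipeline); the 5/7 item split mirrors the two subscales above:

    # Hypothetical sketch: per-location subscale averages on the two
    # expectation scales (1 = Strongly Disagree ... 7 = Strongly Agree).
    import pandas as pd

    # Assumed layout: one row per student, a 'location' column, and item
    # columns 'ideal_q1'..'ideal_q12' and 'predicted_q1'..'predicted_q12'.
    responses = pd.read_csv("student_survey.csv")  # hypothetical file

    ETHICS_ITEMS = range(1, 6)    # ethical & privacy expectations (5 items)
    SERVICE_ITEMS = range(6, 13)  # service expectations (7 items)

    def subscale_mean(df, scale, items):
        """Mean response per location for one subscale on one scale."""
        cols = [f"{scale}_q{i}" for i in items]
        return df.groupby("location")[cols].mean().mean(axis=1)

    for scale in ("ideal", "predicted"):
        print(f"Ethics/privacy ({scale}):")
        print(subscale_mean(responses, scale, ETHICS_ITEMS).round(2))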
  33. 33. Ethical and Privacy Expectations [Figure: average item response (1-7) on the Ideal and Predicted Expectation scales, by location; items: Alternative Purpose, Consent to Collect, Identifiable Data, Keep Data Secure, Third Party] http://sheilaproject.eu/
  34. 34. Keep Data Secure – Predicted Expectation Scale [Figure: percentage of responses from Strongly Disagree to Strongly Agree, by location] http://sheilaproject.eu/
  35. 35. Consent to Collect – Predicted Expectation Scale [Figure: percentage of responses from Strongly Disagree to Strongly Agree, by location] http://sheilaproject.eu/
  36. 36. Service Expectations [Figure: average item response (1-7) on the Ideal and Predicted Expectation scales, by location; items: Obligation to Act, Integrate into Feedback, Skill Development, Regularly Update, Complete Profile, Student Decision Making, Course Goals] http://sheilaproject.eu/
  37. 37. Course Goals – Predicted Expectation Scale [Figure: percentage of responses from Strongly Disagree to Strongly Agree, by location] http://sheilaproject.eu/
  38. 38. Obligation to Act – Predicted Expectation Scale [Figure: percentage of responses from Strongly Disagree to Strongly Agree, by location] http://sheilaproject.eu/
  39. 39. Summary • Beliefs towards learning analytics are not consistent. • Emphasis on data security and improving learning. http://sheilaproject.eu/
  40. 40. Student focus groups Pedro Manuel Moreno Marcos Department of Telematic Engineering Universidad Carlos III de Madrid http://sheilaproject.eu/
  41. 41. Goals • Interest and expectations • Awareness • Concerns
  42. 42. Background • 18 focus groups • 4 partner institutions • 74 students • Sessions: around 1 hour http://sheilaproject.eu/
  43. 43. Interests and expectations • Improve the quality of teaching • Better student-teacher feedback • Better academic resources and academic tools to improve learning • Personalized support • Recommendation of learning resources • Feedback from a system, via a dashboard • Provide an overview of the tasks to be done in a semester → improve curriculum design http://sheilaproject.eu/
  44. 44. Awareness • Students do not know what LA is, but they recognise its importance if it can solve their problems • Students are not generally aware of the data collected → Transparency • Students have not read the data conditions they have accepted http://sheilaproject.eu/
  45. 45. Concerns: surveillance, anonymization, purpose of data, kind of data, consent and access, security, provision of opt-outs, stereotypes and biases http://sheilaproject.eu/
  46. 46. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  47. 47. Staff Survey Maren Scheffel Open Universiteit Nederland Welten Institute @m_a_s_c
  48. 48. Goal of the survey With regard to learning analytics … … what do academic staff ideally expect to happen? … what do academic staff predict will happen in reality?
  49. 49. 4 academic institutions University of Edinburgh Carlos III Madrid n = 81 n = 26 Open Universiteit University of Tallinn n = 54 n = 49 from spring to fall 2017
  50. 50. 16 items, some examples The university will provide me with guidance on how to access LA about my students The LA service will show how a student’s learning progress compares to their learning goals/the course objectives The teaching staff will have an obligation to act if the analytics show that a student is at risk of failing, underperforming, or that they could improve their learning
  51. 51. University of Edinburgh: • Ideal: LA will collect and present data that is accurate (M = 5.91) Q9 • Predicted: Providing guidance to access LA about students (M = 5.05) Q1 Carlos III de Madrid: • Ideal: LA presented in a format that is understandable and easy to read (M = 6.31) Q11 • Predicted: LA will present students with a complete profile of their learning across every course (M = 5.27) Q12 Highest expectation values
  52. 52. Highest expectation values Open Universiteit Nederland: • Ideal: LA will collect and present data that is accurate (M = 6.60) Q9 • Predicted: Able to access data about students’ progress in a course that I am teaching (M = 5.17) Q4 University of Tallinn: • Ideal: Able to access data about students’ progress in a course that I am teaching (M = 6.04) Q4 • Predicted: Able to access data about students’ progress in a course that I am teaching (M = 5.49) Q4
  53. 53. Lowest expectation values University of Edinburgh: • Ideal: Teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 3.65) Q14 • Predicted: Teaching staff will be competent in incorporating analytics into the feedback and support they provide to students (M = 3.49) Q13 Carlos III de Madrid: • Ideal: Teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 4.42) Q14 • Predicted: Teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 3.77) Q14
  54. 54. Lowest expectation values Open Universiteit Nederland: • Ideal: Teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 4.44) Q14 • Predicted: Feedback from analytics will be used to promote students’ academic and professional skill development for future employability (M = 3.24) Q15 University of Tallinn: • Ideal: Teaching staff will have an obligation to act if students are found to be at risk of failing or underperforming (M = 4.80) Q14 • Predicted: Q14 (M = 3.82)
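
The highest and lowest values on the last four slides are simply the per-institution maxima and minima over the 16 item means; a toy illustration (the numbers below are made up, not survey data):

    # Toy sketch: find each institution's highest- and lowest-rated item
    # from a table of per-item means (Q1..Q16 in the actual survey).
    import pandas as pd

    item_means = pd.DataFrame(
        {"Q4": [5.5, 5.2], "Q9": [6.0, 6.6], "Q14": [3.8, 3.2]},  # made up
        index=["Tallinn", "OUNL"],
    )
    print(item_means.idxmax(axis=1))  # highest expectation item per row
    print(item_means.idxmin(axis=1))  # lowest expectation item per row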
  55. 55. Staff focus groups Kairit Tammets Centre for educational technology School of Digital Technologies Tallinn University
  56. 56. Goal To better understand the viewpoints of academic staff on: • Learning analytics opportunities at HEIs from the perspective of students, teachers and programs; • Concerns related to adopting learning analytics; • Steps needed to adopt learning analytics at HEIs
  57. 57. Study participants • University of Edinburgh: 5 focus groups, 18 teaching staff • Universidad Carlos III de Madrid: 4 focus groups, 16 teaching staff • Open Universiteit Nederland: 2 focus groups, 5 teaching staff • Tallinn University: 5 focus groups, 20 teaching staff
  58. 58. Results: Expectations & LA opportunities STUDENT LEVEL TEACHER LEVEL PROGRAM LEVEL Take responsibility for their learning and enhance their SRL skills Assess their degree of success to prevent students from being unduly worried or optimistic about their performance A method to identify students’ weaknesses and know where students are with their progress Understand how students engage with learning content Improve the design and provision of learning materials, courses, curriculum and support to students Understand how a program is working (strengths and bottlenecks) Improve educational quality (e.g. content level)
  59. 59. Results: Meaningful data
  60. 60. Results: Meaningful data
  61. 61. Results: Meaningful data
  62. 62. Results: Meaningful data
  63. 63. Results: Meaningful data
  64. 64. Results: concerns – student level https://www.pinterest.com/pin/432486370448743887/
  65. 65. Results: concerns – student level https://www.pinterest.com/pin/432486370448743887/
  66. 66. Results: concerns – student level https://www.pinterest.com/pin/432486370448743887/
  67. 67. Results: concerns – teacher level https://www.pinterest.com/pin/432486370448743887/
  68. 68. Results: concerns – teacher level http://create-learning.com https://www.pinterest.com/pin/432486370448743887/
  69. 69. Results: concerns – teacher level http://create-learning.com https://www.pinterest.com/pin/432486370448743887/ http://memegenerator.net
  70. 70. Results: concerns – program level • Interpretation of learning: • Was the right data collected? • Were accurate algorithms developed? • Was an appropriate message given to the students? • Connecting LA to real learning – does this give a meaningful picture of the learning happening in online environments?
  71. 71. What should we consider? • LA should be just one component of many for collecting feedback and enhancing decision-making • Involve stakeholders: • Academic staff in developing and setting up LA • Pedagogy experts to ensure the data makes sense for improving learning • Provide training and communication!
  72. 72. What should we consider? • Design tools that: • Are easy to use • Provide visualizations of data • Do not require mathematical/statistical skills • Do not take a lot of time • Consider ethical and privacy aspects
  73. 73. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  74. 74. Group Concept Mapping Prof. Dr. Hendrik Drachsler Open University Netherlands DIPF / University of Frankfurt, Germany
  75. 75. Group Concept Mapping (GCM) Study – three steps: 1. Brainstorm, 2. Sort, 3. Rate [Screenshots: brainstormed statements from an unrelated example study, a sorting step (“organize the issues”), and statements with importance ratings, e.g. “Work quickly and effectively under pressure”]
  76. 76. Group Concept Mapping An essential feature of a higher education institution’s learning analytics policy should be …
  77. 77. Group Concept Mapping – Online sorting
  78. 78. Group Concept Mapping – Online rating
  79. 79. Participants
  80. 80. Participants
  81. 81. Point Map [Figure: 2-D point map of the 99 brainstormed statements]
  82. 82. Cluster Replay Map
  83. 83. Cluster Replay Map
  84. 84. Cluster Replay Map
  85. 85. Cluster Map 1. privacy & transparency 2. roles & responsibilities (of all stakeholders) 3. objectives of LA (learner and teacher support) 4. risks & challenges 5. data management 6. research & data analysis
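
The point map and cluster map follow the standard Group Concept Mapping pipeline: count how often participants sorted two statements into the same pile, place the statements with 2-D multidimensional scaling, then cluster the coordinates hierarchically. The sketch below illustrates that pipeline on synthetic sorting data (the project presumably used dedicated GCM software, so the pile count and number of sorters here are assumptions; the 99 statements match the point map above):

    # Illustrative GCM pipeline on synthetic data: sort -> point map -> clusters.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.manifold import MDS

    n_statements, n_sorters = 99, 30  # 30 sorters is an assumption
    rng = np.random.default_rng(0)
    # Toy sorting step: each sorter assigns every statement to one of ~8 piles.
    sorts = rng.integers(0, 8, size=(n_sorters, n_statements))

    # Co-occurrence: how often each pair of statements shares a pile.
    cooc = sum((s[:, None] == s[None, :]).astype(int) for s in sorts)
    dissimilarity = n_sorters - cooc  # frequent co-sorting -> small distance

    # Point map: 2-D multidimensional scaling of the dissimilarity matrix.
    points = MDS(n_components=2, dissimilarity="precomputed",
                 random_state=0).fit_transform(dissimilarity)

    # Cluster map: Ward clustering of the 2-D points, cut at six clusters
    # to mirror the six clusters listed on the slide above.
    labels = fcluster(linkage(points, method="ward"), t=6, criterion="maxclust")
    print(np.bincount(labels)[1:])  # statements per cluster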
  86. 86. Rating Map – Importance 1. privacy & transparency 2. roles & responsibilities (of all stakeholders) 3. objectives of LA (learner and teacher support) 4. risks & challenges 5. data management 6. research & data analysis [Cluster legend – layer values: 1: 5.08 to 5.27, 2: 5.27 to 5.46, 3: 5.46 to 5.65, 4: 5.65 to 5.84, 5: 5.84 to 6.03]
  87. 87. Rating Map – Ease 1. privacy & transparency 2. roles & responsibilities (of all stakeholders) 3. objectives of LA (learner and teacher support) 4. risks & challenges 5. data management 6. research & data analysis [Cluster legend – layer values: 1: 3.79 to 4.12, 2: 4.12 to 4.45, 3: 4.45 to 4.78, 4: 4.78 to 5.11, 5: 5.11 to 5.44]
  88. 88. Rating Ladder Graph [Figure: ladder graph pairing each cluster’s average importance with its average ease on a 3.79-6.03 scale; clusters: privacy & transparency, risks & challenges, roles & responsibilities (of all stakeholders), objectives of LA (learner and teacher support), data management, research & data analysis; r = 0.66]
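
The r value on the ladder graph is a Pearson correlation between the clusters' average importance and average ease ratings; a quick check with invented cluster means chosen only to sit inside the 3.79-6.03 range shown (the real means yield r = 0.66):

    # Pearson correlation between per-cluster importance and ease means.
    import numpy as np

    importance = np.array([6.03, 5.60, 5.50, 5.40, 5.20, 5.08])  # invented
    ease = np.array([5.44, 4.40, 4.80, 4.50, 4.20, 3.79])        # invented
    print(round(np.corrcoef(importance, ease)[0, 1], 2))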
  89. 89. Yi-Shan Tsai, Pedro Manuel Moreno-Marcos, Kairit Tammets, Kaire Kollom, and Dragan Gašević. 2018. SHEILA policy framework: informing institutional strategies and policy processes of learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18). ACM, New York, NY, USA, 320-329. DOI: https://doi.org/10.1145/3170358.3170367
  90. 90. Go Zone – Roles & Responsibilities [Figure: scatter of the cluster’s statements by importance (3.83 to 6.59) vs ease (3.12 to 6.08), split into quadrants; r = 0.26] 55. being clear about the purpose of learning analytics 61. a clear articulation of responsibilities when it comes to the use of institutional data
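
A go-zone plot splits one cluster's statements into quadrants at the mean importance and the mean ease rating; the upper-right quadrant, above both means, is the "go zone" of statements worth acting on first. A sketch with random ratings drawn from the axis ranges on the slide:

    # Go-zone classification: items above both means land in the go zone.
    import numpy as np

    rng = np.random.default_rng(1)
    importance = rng.uniform(3.83, 6.59, size=20)  # axis range from the slide
    ease = rng.uniform(3.12, 6.08, size=20)
    go_zone = (importance > importance.mean()) & (ease > ease.mean())
    print("go-zone items:", np.flatnonzero(go_zone) + 1)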
  91. 91. Yi-Shan Tsai, Pedro Manuel Moreno-Marcos, Kairit Tammets, Kaire Kollom, and Dragan Gašević. 2018. SHEILA policy framework: informing institutional strategies and policy processes of learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (LAK '18). ACM, New York, NY, USA, 320-329. DOI: https://doi.org/10.1145/3170358.3170367
  92. 92. Many thanks for your attention! Questions now or later: @hdrachsler drachsler@dipf.de Slides: http://bit.ly/TrustedLA
  93. 93. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  94. 94. SHEILA framework
  95. 95. SHEILA policy framework
  96. 96. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees
  97. 97. Methodology Literature - Policy - Adoption Academic staff - Survey - Focus groups Students - Survey - Focus groups Senior managers - Survey - Interviews Experts - Group concept mapping Policy framework Institutional policy/strategy Other stakeh. - Workshops - Committees

Editor's Notes

  • With senior managers, we were
  • EADTU (European Association of Distance Teaching Universities)
    EUA (European University Association)
    HeLF (Heads of e-Learning Forum)
    EUNIS (European University Information Systems)
    SNOLA (Spanish Network of Learning Analytics)
    eMadrid
  • 22 countries: Austria, Bulgaria, Cyprus, Czech Republic, Denmark, Estonia, Finland, Germany, Hungary, Ireland, Italy, Lithuania, Netherlands, Norway, Portugal, Romania, Serbia, Slovakia, Spain, Switzerland, Turkey, UK

    Interview + survey: 26 countries
  • A survey question (multiple choices) provided 11 options for motivations specific to learning and teaching.
  • All related to institutional performance: league ranking, satisfaction survey, teaching excellence framework
    But also dependent on teaching quality
  • Early stage - exploration
  • Most institutions seem to have incorporated all levels of goals into their planning or implementation of LA
    Enhance self-regulation skills: provide data-based information to guide students
    Improve learning support: curriculum, feedback, personalized support, pastoral care, timely support
    Increase institutional performance: retention rate, student satisfaction, league ranking
  • Particularly teaching-level (1/1 – S; 18/19 – M; 6/7 – L) – UoE data
  • 13 options
    Moderately-sized, large, critical
  • Three most mentioned issues regarding ethics and privacy
  • 15 institutions that have implemented LA.
  • UoE data
  • Interest is strong
    Institutions were exploring what LA can do
    Using LA to enhance teaching so as to increase institutional performance is the biggest motivation among managers
    Barriers – skills, culture, technology, ethics and privacy
  • Hi – I’m Alex Wainwright… and I’m going to give an overview of the student survey results… this is going to cover response rates and some general insights obtained…
  • The student survey is composed of 12 items… and responses are made on two scales that correspond to a desired service… and what students expect in reality… so they reflect two levels of expectation…

    Through the development and validation process we have identified two subscales… these refer to ethical and privacy expectations… such as whether students expect to provide consent for the collection of their educational data…

    And the other subscale refers to service expectations… so this covers things such as whether students expect to receive updates on how their learning progress compares to a set goal….

    As you can see… we have distributed the instrument at six different higher education institutions… with the highest response rate being at the open university of the netherlands…

    All distributions have shown the scales to be valid and to also show excellent measurement quality….
  • Firstly… I am going to go over the ethical and privacy expectation items…

    On this figure you can see the average responses to these items by expectation scale and location….

    The x axis provides an indication of what the items refer to…

    So we have beliefs about providing consent when data is used for an alternative purpose… or whether consent should be sought before distributing data to third party companies

    What can be seen is that students ideal expectations are generally higher than predicted expectations – this is anticipated as it is a desired level of service…

    Across both scales… however… we can see that the expectation that all collected data remain secure receives the highest average response… whereas… the expectation to provide consent before educational data is collected and analysed receives the lowest average response across these five items… and whilst students agree with this belief… it verges on indifference on the predicted expectation scale for the Spanish student sample….

    It may be that students are open to universities collecting and analysing educational data… particularly as it is used for attendance purposes, for example…

    Whereas… they have stronger beliefs toward universities abiding by data handling policies that will ensure that all data remains secure…
  • We can also look at these two particular ethical and privacy expectation items in more detail…

    This figure shows the percentage of students responding in a certain way to the data security expectation… with darker colours reflecting a higher percentage of students responding that way…

    And what is shown is that… between 60 and 80% of students across all universities either agreed or strongly agreed with the expectation that universities will ensure data is kept secure…
  • For the consent to collect expectation… this figure shows that there is more variation in the responses…

    For those students from Edinburgh, Liverpool, the Netherlands, and Blanchardstown… the largest response of around 30% is for strongly agree to this belief…

    Whereas… the largest percentage of responses for Madrid and Tallinn… which was around 25%... was for somewhat agree…



  • Looking at the service expectation items… we can see that the average responses tend to be similar across locations…

    Of particular note… the obligation to act is the item with the lowest response on average… with students in Madrid, the Netherlands, and Tallinn generally showing indifference to this belief on the predicted expectation scale...

    The higher average responses… on the other hand… seem to be around aspects of self-regulated learning such as students expecting to receive a complete profile of their learning…. Making their own decisions on the analytics that they receive… and knowing how their progress compares to a set learning goal….
  • Looking into what are the highest and lowest average response items… we can also understand differences within each sample…

    For knowing how progress compares to a set learning goal… between 20 to 35% of students across each sample agreed with this expectation… with around 4% disagreeing….
  • As for the obligation to act… the highest response rates are variable…

    Around 20% of students in the Tallinn and Madrid samples somewhat disagreed with this expectation… For the Dutch students 24% expressed indifference to this belief… whereas in Liverpool and Blanchardstown around 28% showed agreement…
  • The output from the student survey shows that the expectations of students towards learning analytics are not consistent across each sample… with students generally showing variations in what they want from such services…


    On the other hand… we can generally see that students expect a learning analytics service that emphasises data security… and provides tools that support learning as opposed to those that emphasise early interventions
  • Engagement: what do they view, what do they do with the content, and the relationship between participation and grades
    Academic data: Pre-requisite subjects to understand the background
    Previous studies to know if additional materials are needed
    Demographic: Previous studies
    Employment
    Health: Visually impaired – to adjust content
    Educational needs – to adjust content
    mental issues – to be aware

  • Academic data: Pre-requisite subjects to understand the background
    Previous studies to know if additional materials are needed
    Demographic: Previous studies
    Employment
    Health: Visually impaired – to adjust content
    Educational needs – to adjust content
    mental issues – to be aware

  • Demographic: Previous studies
    Employment
    Health: Visually impaired – to adjust content
    Educational needs – to adjust content
    mental issues – to be aware

  • Health: Visually impaired – to adjust content
    Educational needs – to adjust content
    mental issues – to be aware

  • Qualitative data to get, in addition to LA data, students’ perceptions and understandings of teaching and learning processes
  • Staff are worried that we will put a lot of effort and resources into developing LA services – but maybe students do not want such a service and will never benefit from it as they should
  • Privacy and autonomy – staff emphasized that services should respect ethical regulations and behaviour at all the different levels
  • Staff were worried that profiling students as, e.g., low-performing might end up in loss of motivation and anxiety.
  • The main concern at the teacher level was related to time constraints: staff were not convinced that LA would help them; rather, they felt it would take more time from them, especially if they have to monitor each individual student.
  • Performance judgement – will my professionalism be evaluated based on LA data?
    Fulfilling expectations
  • Staff were wondering: shall I be objective?
    How can I be objective?
