
LERU Presentation - March 2017


Presentation on learning analytics at Edinburgh, given to a LERU meeting in Edinburgh, March 2017


  1. LERU: Learning Analytics. Anne-Marie Scott, Learning, Teaching and Web Services. 17 March 2017
  2. Learning Analytics Map of Activities, Research and Roll-out (LAMARR) http://www.ed.ac.uk/information-services/learning-technology/learning-analytics
  3. Learning Analytics Report Card (LARC) http://larc-project.com
     • How can University teaching teams develop critical and participatory approaches to educational data analysis?
     • How can we develop ways of involving students as research partners and active participants in their own data collection and analysis, as well as foster critical understanding of the use of computational analysis in education?
     Knox, J. (2017). Data Power in Education: exploring critical awareness with the ‘Learning Analytics Report Card’ (LARC). Special Issue: Data Power in Material Contexts, Television & New Media. http://journals.sagepub.com/doi/full/10.1177/1527476417690029
     Isard, A. and Knox, J. (2016). Automatic Generation of Student Report Cards. 9th International Natural Language Generation Conference, Edinburgh, Sept 5-8. http://www.macs.hw.ac.uk/InteractionLab/INLG2016/proceedings/pdf/INLG33.pdf
  4. Supporting Higher Education to Integrate Learning Analytics http://sheilaproject.eu/
     • Aim: developing a learning analytics policy development framework
     • Activities: interviews and a survey with decision makers, group concept mapping with experts, surveys and focus groups with students and teaching staff, and 4 institutional policies
     • Project partners: The University of Edinburgh, Open University of the Netherlands, Universidad Carlos III de Madrid, Tallinn University, Brussels Educational Services, European Association for Quality Assurance in Higher Education, Erasmus Student Network
  5. The adoption of learning analytics: 64 institutional interviews with 51 HEIs. [Chart: institutions grouped by adoption stage – no plans; under preparation; implemented (institution-wide, partial/pilots, data exploration/cleaning); N/A – with counts of 9, 7, 5, 12 and 18.]
  6. Group Concept Mapping (importance vs. ease). [Chart: cluster ratings for privacy & transparency; risks & challenges; roles & responsibilities (of all stakeholders); objectives of LA (learner and teacher support); data management; research & data analysis. Scale values shown: 3.79 and 6.03; r = 0.66.]
  7. Policy Challenges. Tsai, Y-S. and Gašević, D. (2016). Adoption of Learning Analytics in Higher Education: State, Challenges, and Policies (Executive summary). SHEILA Project Report. http://sheilaproject.eu/wp-content/uploads/2016/06/Adoption-of-Learning-Analytics-in-Higher-Education_Executive-Summary.pdf
  8. Edinburgh: Principles
     1. As an institution we understand that data never provides the whole picture about students’ capacities or likelihood of success, and it will therefore never be used to inform action at an individual level without human intervention;
     2. Where we use learning analytics to target support at students, we will do so in order to help all students to achieve their learning goals rather than taking a deficit model targeted at supporting students at risk of failure;
     3. We will be transparent about how we collect and use data, where consent applies, and where responsibilities for the ethical use of data lie;
     4. We recognise that data and algorithms can contain and perpetuate bias, and will actively work to build awareness of this and address it where it occurs;
     5. Good governance will be core to our approach, to ensure learning analytics projects and implementations are ethically conducted and align with organisational strategy, policy and values; and
     6. Data generated from learning analytics will not be used to monitor staff performance, as learning analytics is a field that seeks to understand and optimise learning and does not aim to measure staff performance.
     http://www.ed.ac.uk/academic-services/projects/learning-analytics-policy
  9. Edinburgh: Purposes
     • Quality – Learning analytics can be used as a form of feedback on the efficacy of pedagogical design. Academic teams can use analytics about student activity (individual or cohort) as part of course review and re-design processes, as well as potentially using analytics as a form of in-course monitoring and feedback.
     • Equity – Learning analytics approaches can allow us to see more nuanced views of our highly diverse student population, challenge assumptions that we may be making, and allow supportive resource to be directed where it is most needed.
     • Personalised feedback – Learning analytics can be used to tailor the messages and support that we offer to our students, providing more personalised feedback to support student reflection and academic planning.
     • Coping with scale – With the challenge of growing cohorts of students, learning analytics can help to strengthen the academic relationship by doing some of the heavy lifting of identifying individuals or groups of individuals that might benefit from particular interventions or information from staff.
     • Student experience – In addition to supporting a more personalised experience, learning analytics can improve progression and retention, ensure that our academic offerings align with the needs and goals of students, and support satisfaction and wellbeing. Analytics can also be used to promote critical reflection skills and enable our students to take responsibility for their own learning.
     • Skills – Interactions with analytics as part of the University learning experience can help our students build 'digital savviness' and prompt more critical reflection on how data about them is being used more generally, what consent might actually mean, and how algorithms work across datasets to define and profile individuals. Learning analytics approaches can also be used to promote the development of key employability skills. Supporting staff to develop skills in working with learning analytics applications is also an investment in institutional capacity and leadership.
     • Efficiency – Learning analytics can be used to evaluate and demonstrate institutional efficiency by a) measuring the impact of initiatives and validating that benefits are being realised, and b) demonstrating that publicly-funded resource is being deployed in support of the best outcomes for all students.
     http://www.ed.ac.uk/academic-services/projects/learning-analytics-policy
  10. Co-responsibility in an asymmetrical power and contractual relationship
     “…obligation to act is a co-responsibility of students and institution, tempered by the asymmetrical power and contractual relationship in which the institution has very specific moral and legal duties to respond.”
     Prinsloo, P. and Slade, S. (2017). An elephant in the learning analytics room – the obligation to act. LAK17 presentation. https://www.slideshare.net/prinsp/an-elephant-in-the-learning-analytics-room-the-obligation-to-act
     Image credit: https://pixabay.com/en/michelangelo-abstract-boy-child-71282/
  11. Questions for discussion
     • What is the current state of analytics adoption in your institution? Barriers / activities?
     • Do the issues and challenges uncovered by the SHEILA project sound familiar to you?
     • What do your students think?
