What have we learned from 6 years of implementing learning analytics amongst 100.000+ students

By Professor Bart Rienties, Head of Academic Professional Development, Institute of Educational Technology, The Open University, UK

Abstract

The Open University UK (OU) has been implementing learning analytics since 2014, starting with one or two modules and growing to its current practice of large-scale implementation across all of its 400+ modules, 170,000+ students, and 4,000+ teaching staff. While a range of reviews (e.g., Adeniji, 2019) and scholarly repositories (e.g., Web of Science) indicate that the OU is the largest contributor to academic output in learning analytics in the world, behind the flashy publications and practitioner outputs lies a range of complex issues in terms of ethics and privacy, data infrastructures, buy-in from staff, student engagement, and how to make sense of big data in a complex organisation like the OU.

Based upon large-scale big data research, we found some interesting tensions in both design and educational theory, such as:
– 69% of students' week-by-week engagement is determined by how teachers design courses (i.e., learning design and instructional design directly influence behaviour and cognition), yet many teachers seem reluctant to change their learning design based upon data about what works and what does not (e.g., making sense of data, agency);
– Teachers' engagement with predictive learning analytics (PLA) significantly improves student outcomes, but only a minority of teachers actually use PLA;
– Some disadvantaged groups engage more actively in OU courses, but nonetheless perform worse than non-disadvantaged students.

During this CELDA keynote I would like to share some of my own reflections on how the OU has implemented learning analytics, and how these insights are helping to build a stronger evidence base for data-informed change. Furthermore, by sharing some of the lessons learned from implementing learning analytics at large scale, I hope to provide some dos and don'ts for how you might use data in your own practice and context.

By Professor Bart Rienties, Head of Academic Professional Development, Institute of Educational Technology, The Open University, UK

Abstract

The Open University UK (OU) has been implementing learning analytics since 2014, starting with one or two modules to its current practice of large-scale implementation across all its 400+ modules and 170.000+ students and 4000+ teaching staff. While a range of reviews (e.g., Adenij, 2019) and scholarly repositories (e.g., Web of Science) indicate that the OU is the largest contributor to academic output in learning analytics in the world, behind the flashy publications and practitioner outputs there are a range of complex issues in terms of ethics and privacy, data infrastructures, buy-in from staff, student engagement, and how to make sense of big data in a complex organisation like the OU.

Based upon large-scale big data research we found some interesting tensions in both design and educational theory, such as:
– 69% of engagement by students on a week by week basis is determined by how teachers are designing courses (i.e., learning design and instructional design indeed directly influence behaviour and cognition), but many teachers seem reluctant to change their learning design based upon data of what works and what does not work (e.g., making sense of data, agency);
– How teachers engage with predictive learning analytics (PLA) significantly improves student outcomes, but only a minority of teachers actually use PLA;
– Some disadvantaged groups engage more actively in OU courses, but nonetheless perform lower than non-disadvantaged students.

During this CELDA keynote I would like to share some of my own reflections of how the OU has implemented learning analytics, and how these insights are helping towards a stronger evidence-base for data-informed change. Furthermore, by sharing some of the lessons learned from implementing learning analytics on a large scale I hope to provide some dos and don’ts in terms of how you might consider to use data in your own practice and context.

Advertisement
Advertisement

More Related Content

Slideshows for you (19)

Similar to What have we learned from 6 years of implementing learning analytics amongst 100.000+ students (20)

Advertisement

More from Bart Rienties (18)

Advertisement

What have we learned from 6 years of implementing learning analytics amongst 100.000+ students

  1. @DrBartRienties Professor of Learning Analytics WHAT HAVE WE LEARNED FROM 6 YEARS OF IMPLEMENTING LEARNING ANALYTICS AMONGST 100.000+ STUDENTS? 18 November 2020
  2. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
  3. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
  4. Adeniji, B. (2019). A Bibliometric Study on Learning Analytics. Long Island University. Retrieved from https://digitalcommons.liu.edu/post_fultext_dis/16/
  5. “In the UK the Open University (OU) is a world leader in the collection, intelligent analysis and use of large scale student analytics. It provides academic staff with systematic and high quality actionable analytics for student, academic and institutional benefit (Rienties, Nguyen, Holmes, Reedy, 2017). Rienties and Toetenel’s, 2016 study (Rienties & Toetenel, 2016) identifies the importance of the linkage between LA outcomes, student satisfaction, retention and module learning design. These analytics are often provided through dashboards tailored for each of academics and students (Schwendimann et al., 2017). The OU’s world-class Analytics4Action initiative (Rienties, Boroowa, Cross, Farrington-Flint et al., 2016) supports the university-wide approach to LA. In particular, the initiative provided valuable insights into the identification of students and modules where interventions would be beneficial, analysing over 90 large-scale modules over a two-year period… The deployment of LA establishes the need and opportunity for student and module interventions (Clow, 2012). The study concludes that the faster the feedback loop to students, the more effective the outcomes. This is often an iterative process allowing institutions to understand and address systematic issues. Legal, ethical and moral considerations in the deployment of LA and interventions are key challenges to institutions. They include informed consent, transparency to students, the right to challenge the accuracy of data and resulting analyses and prior consent to intervention processes and their execution (Slade & Tait, 2019)” Wakelam, E., Jefferies, A., Davey, N., & Sun, Y. (2020). The potential for student performance prediction in small cohorts with minimal available attributes. British Journal of Educational Technology, 51(2), 347-370. doi: 10.1111/bjet.12836
  6. The Open University: leading global distance learning, delivering high-quality education to anyone, anywhere, anytime. [Infographic:] Largest university in Europe; no formal entry requirements; 33% enter with one A-level or less; 38% of part-time undergraduates in the UK taught by the OU; 173,927 formal students; 55% of students are 'disadvantaged'; 60% of the FTSE 100 have sponsored staff on OU courses in 2017/8; 66% of new undergraduates are 25+; 1 in 8 Open University students has a disability (23,630); 3 in 4 students are already in work; 1,300 employers use OU learning solutions to develop their workforce.
  7. A special thanks to Vaclav Bayer, Avinash Boroowa, Shi-Min Chua, Simon Cross, Doug Clow, Chris Edwards, Rebecca Ferguson, Mark Gaved, Christothea Herodotou, Martin Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Mark Nichols, Quan Nguyen, Tom Olney, Lynda Prescott, John Richardson, Saman Rizvi, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Belinda Tynan, Lisette Toetenel, Thomas Ullmann, Denise Whitelock, Zdenek Zdrahal, and others…
  8. What we have learned in six years at the OU. Change is slow, but can be enhanced with: 1. Clear senior management support 2. Bottom-up support from teachers and researchers who are willing to take a risk 3. Evidence-based research can gradually change perspectives and narratives 4. You quickly forget about the small/medium/large successes and fail to realise that you are making a real impact 5. Large-scale innovation takes substantial time and effort 6. It is all about people…
  9. Predictive analytics and professional development. Kuzilek, J., Hlosta, M., Herrmannova, D., Zdrahal, Z., & Wolff, A. (2015). OU Analyse: analysing at-risk students at The Open University. LACE Learning Analytics Review (Vol. LAK15-1). Milton Keynes: Open University. Kuzilek, J., Hlosta, M., & Zdrahal, Z. (2017). Open University Learning Analytics dataset. Scientific Data, 4, 170171. doi: 10.1038/sdata.2017.171. Wolff, A., Zdrahal, Z., Herrmannova, D., Kuzilek, J., & Hlosta, M. (2014). Developing predictive models for early detection of at-risk students on distance learning modules. Workshop: Machine Learning and Learning Analytics. Paper presented at Learning Analytics and Knowledge (2014), Indianapolis.
  10. [Slide graphic: a student's activity space over time, from module start and VLE opening to TMA-1, with Pass / Fail / No submit outcomes; placeholder slide text not recoverable]
  11. [Slide graphic: VLE trail of a successful student over the same timeline]
  12. [Slide graphic: VLE trail of a student who did not submit]
  13. [Slide graphic: probabilistic model for all students, timeline from VLE start to TMA1]
  14. OU Analyse demo http://analyse.kmi.open.ac.uk
  15. Herodotou, C., Rienties, B., Hlosta, M., Boroowa, A., Mangafa, C., & Zdrahal, Z. (2020). Scalable implementation of predictive learning analytics at a distance learning university: Insights from a longitudinal case study. Internet and Higher Education, 45, 100725.
  16. Herodotou, C., Rienties, B., Hlosta, M., Boroowa, A., Mangafa, C., & Zdrahal, Z. (2020). Scalable implementation of predictive learning analytics at a distance learning university: Insights from a longitudinal case study. Internet and Higher Education, 45, 100725. Amongst the factors shown to be critical to the scalable PLA implementation were: faculty's engagement with OUA, teachers as “champions”, evidence generation and dissemination, digital literacy, and conceptions about teaching online.
  17. What is missing? (My own reflections) 1. Student-facing learning analytics 2. A deep understanding of students' learning dispositions 3. An inclusive culture where staff are pro-actively supported and rewarded to use learning analytics
  18. 1. Student-Facing Analytics
  19. 2. A deep understanding of students’ learning dispositions. Tempelaar, D. T., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408-420.
  20. Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context. Computers in Human Behavior, 47, 157-167.
  21. Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context. Computers in Human Behavior, 47, 157-167.
  22. Tempelaar, D. T., Rienties, B., & Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 10(1), 6-16.
  23. Tempelaar, D. T., Rienties, B., & Nguyen, Q. (2017). Towards actionable learning analytics using dispositions. IEEE Transactions on Learning Technologies, 10(1), 6-16.
  24. 3. An inclusive culture where staff are pro-actively supported and rewarded to use learning analytics. 1. Staff are essential for support, regulation, teacher presence, learning design, monitoring, etc. 2. Staff are even more important than before… 3. Learning analytics uptake is only possible with staff buy-in.
  25. What have I learned in six years at the OU. Change is slow, but can be enhanced with: 1. Clear senior management support 2. Bottom-up support from teachers and researchers who are willing to take a risk 3. Evidence-based research can gradually change perspectives and narratives 4. You quickly forget about the small/medium/large successes and fail to realise that you are making a real impact 5. Large-scale innovation takes substantial time and effort 6. It is all about people…
  26. Further reflections: 1. Who owns the data? 2. What about the ethics? 3. What about professional development? 4. Are we optimising the record player?
  27. http://oro.open.ac.uk/
  28. T: drBartRienties E: bart.rienties@open.ac.uk W: www.bartrienties.nl W: https://www.organdonation.nhs.uk/ W: https://www.sportentransplantatie.nl/

Editor's Notes

  • Case-based reasoning (reasoning from precedents, k-Nearest Neighbours):
    – based on demographic data
    – based on VLE activities
  • Classification and Regression Trees (CART)
  • Bayes networks (naïve and full)
  • Final verdict decided by voting
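The ensemble described in these notes — several heterogeneous classifiers whose final verdict is decided by voting — can be sketched as follows. This is an illustrative reconstruction using scikit-learn and synthetic data, not the actual OU Analyse code; the specific estimators, hyperparameters, and features are assumptions standing in for the demographic and VLE-activity predictors mentioned above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for student features (demographics + VLE activity)
# and a binary at-risk / not-at-risk label.
X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("knn", KNeighborsClassifier(n_neighbors=5)),   # case-based reasoning analogue
        ("cart", DecisionTreeClassifier(max_depth=4)),  # Classification and Regression Tree
        ("nb", GaussianNB()),                           # naive Bayes network
    ],
    voting="hard",  # final verdict decided by majority vote
)
ensemble.fit(X_tr, y_tr)
print(f"held-out accuracy: {ensemble.score(X_te, y_te):.2f}")
```

With `voting="hard"`, each classifier casts one vote per student and the majority label wins; `voting="soft"` would average predicted probabilities instead, which is closer in spirit to combining probabilistic models.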
