
Here be dragons: mapping the (un)chartered in learning analytics


Virtual Presentation, School of Educational Studies and Leadership (EDSL), (Te) Kura Mātauranga me te Rangatiratanga,
Tuesday, 7 August, 2018



  1. 1. Here be dragons: mapping the (un)chartered in learning analytics Virtual Presentation, School of Educational Studies and Leadership (EDSL), (Te) Kura Mātauranga me te Rangatiratanga, Tuesday, 7 August, 2018 Paul Prinsloo, University of South Africa (Unisa) @14prinsp Image credit: https://commons.wikimedia.org/wiki/File:Lenox_Globe_(2)_Britannica.png
  2. 2. Kia ora!
  3. 3. Acknowledgement I do not own the copyright of any of the images in this presentation. I therefore acknowledge the original copyright and licensing regime of every image used. This presentation (excluding the images) is licensed under a Creative Commons Attribution 4.0 International License.
  4. 4. Source credit: https://www.theatlantic.com/technology/archive/2013/12/no-old-maps-actually-say-here-be-dragons/282267/ Not a single old paper map presents those exact words—“Here be dragons”—in the margins or otherwise. Nor does any paper map include “Hic sunt dracones,” the words’ Latin equivalent. But a globe does. That’s right: One globe—just one—contains the words Hic sunt dracones. Called the Hunt-Lenox Globe, it was built in 1510, making it one of the first European globes ever made.
  5. 5. Image credit: https://commons.wikimedia.org/wiki/File:Lenox_Globe_(2)_Britannica.png “Hic sunt dracones”
  6. 6. Overview of presentation • In the beginning… The emergence of the tribe • The state of the tribe/field in 2014 • Fast forward to 2018 • And ethics? • Mapping (some of) the unchartered 1. How is the field/context changing 2. The role of automated-decision making systems in learning analytics 3. The role of regulation/law/policy 4. Is learning analytics research or Research and what about the ethics? 5. Understanding the complexity of student success and how learning analytics fit into describing, understanding, predicting and prescribing 6. Student-centered learning analytics • (In)conclusions
  7. 7. Image credit: https://www.flickr.com/photos/setatum/2791924619 The emergence of learning analytics as discipline, field of practice, and research focus
  8. 8. Source credit: https://tekri.athabascau.ca/analytics/ “Learning institutions and corporations make little use of the data learners ‘throw off’ in the process of accessing learning materials, interacting with educators and peers, and creating new content. In an age where educational institutions are under growing pressure to reduce costs and increase efficiency, analytics promises to be an important lens through which to view and plan for change at course and institution levels” (emphasis added).
  9. 9. Source credit: https://tekri.athabascau.ca/analytics/ “Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.”
  10. 10. Image credit: https://pixabay.com/en/cave-paintig-prehistoric-rupestral-490205/ What happened since 2011? The emergence of the ‘tribe’…
  11. 11. Source credit: https://www.amazon.com/Academic-Tribes-Territories-Intellectual-Discipline/dp/0335206271/ref=sr_1_1?ie=UTF8&qid=1520507872&sr=8-1&keywords=ACADEMIC+TRIBES+AND+TERRITORIES • Each discipline has its own set of values, attitudes, ways of seeing the world, ways of seeing knowledge and ways of behaving • The above is often formalised and perpetuated in departments, schools, rituals, conferences and journals • Each tribe protects its territory and the language it uses • Inter-tribal rivalry, protectionism and ‘raids’ (and the effects thereof)
  12. 12. Let us consider, for a moment… • Who are the ‘gate-keepers’, the voices in learning analytics? How does it matter? • What are the ‘rules’, the ‘tribal culture’? • How does the annual conference, its acceptance rate and its proceedings shape the field? • How does the Journal of Learning Analytics shape the (future of the) field? • Where are the centers of learning analytics located (geopolitically and institutionally) and how does this matter? • What/who are the competing fields/tribes and how does learning analytics relate to them?
  13. 13. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. 2014
  14. 14. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. Learning analytics as “field draws on assorted theory and methodologies from disciplines as diverse as education, psychology, philosophy, sociology, linguistics, learning sciences, statistics, machine learning/artificial intelligence and computer science” (p. 232) Describing the tribe
  15. 15. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. Network of all authors in the LAK conferences (2011-2013), coded by disciplinary background, with nodes sized by degree centrality. Red: Computer Science; Blue: Education; Green: Other (Industry, Engineering, Linguistics, or Business). • Computer Science – 51% • Education – 40% • “Missing”: machine learning, artificial intelligence, statistics, and data mining • Relatively few inter-disciplinary nodes
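For readers unfamiliar with the method behind these network figures, here is a minimal, hypothetical sketch of how a co-authorship network can be built and its nodes ranked by degree centrality. It is written in the spirit of, but not taken from, Dawson et al. (2014); the papers, author names and disciplinary codes are invented placeholders.

```python
# Illustrative sketch only: a co-authorship network with degree centrality.
# The paper list and disciplinary coding below are made-up placeholders.
import networkx as nx
from itertools import combinations

papers = [                          # each paper is a list of its authors
    ["Author A", "Author B"],
    ["Author B", "Author C", "Author D"],
    ["Author A", "Author D"],
]
discipline = {"Author A": "Computer Science", "Author B": "Education",
              "Author C": "Computer Science", "Author D": "Other"}

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):   # every co-author pair shares an edge
        G.add_edge(a, b)

centrality = nx.degree_centrality(G)        # normalised number of distinct co-authors
for author, c in sorted(centrality.items(), key=lambda item: -item[1]):
    print(f"{author:10s} {discipline[author]:18s} degree centrality = {c:.2f}")
```

Degree centrality here is simply the normalised count of an author’s distinct co-authors, which is why highly collaborative (often cross-disciplinary) authors appear as the larger nodes in such diagrams.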
  16. 16. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. “While LMS and SIS data can provide insight into how to improve teaching and learning, this level of focus is not suitably aligned with the substantial challenges that face all levels of education – many of which require a systemic and integrated response. For example, while it is helpful to note that students who regularly log into a LMS may perform better than their less active peers, this information is not suitable for developing a focused response to poor performing students. It is neither helpful nor productive to simply tell under-performing students to log in more frequently” (p. 231; emphasis added) What the tribe does: Early learning analytics
  17. 17. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. “Learning analytics to date has served to identify a condition, but has not advanced to deal with the learning challenges in a more nuanced and integrated manner” (p. 232) What the tribe should do…
  18. 18. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. Cross-tabulation of the authors’ home disciplines (i.e., numbers in the circles represent the number of authors) with both research methods reported in the journal papers and types of journal papers (2011-2013)
  19. 19. Dawson, S., Gašević, D., Siemens, G., & Joksimovic, S. (2014, March). Current state and future trends: A citation network analysis of the learning analytics field. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 231-240). ACM. Cross-tabulation of the authors’ home disciplines (i.e., numbers in the circles represent the number of authors) with both research methods reported in the conference papers and types of conference papers (2011-2013)
  20. 20. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. 2018
  21. 21. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. Educational data mining versus learning analytics: • Educational data mining: automated discovery; learning analytics: focus on leveraging human judgement • Educational data mining: automated adaptation; learning analytics: developed to inform instructors and learners • Educational data mining: reductionist frameworks – “they reduce phenomena to components and focus on the analysis of individual components and relationships between them” (p. 1); learning analytics: “stronger focus on understanding complex systems as wholes”
  22. 22. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. Four propositions of learning analytics (Ferguson & Clow, 2017): To what extent does learning analytics 1) improve learning outcomes and 2) support learning and teaching? To what extent is learning analytics 3) deployed widely and 4) used ethically?
  23. 23. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. 57% “There are no dominating theories; rather there is a plethora of theories used to explain different aspects of LA. […] While the theoretical development of the field is still in its infancy, a few field-specific theories have been developed and applied” (p. 7)
  24. 24. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. “Our results also demonstrate that the overall potential of LA is so far higher than the actual evidence” (p. 10)
  25. 25. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. Learning analytics evidence across the years 2012–2018 (%).
  26. 26. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. • “only 19% of the reviewed studies used both qualitative and quantitative methods of data analysis, with an increase in 2017” • “Predictive methods have been one of the dominating methods for several years. However, since 2016 the use of these methods has considerably decreased. The decrease, together with an increase of relationship mining methods and the rather stable use of methods for the distillation of data for human judgement, suggests that LA research in HE is shifting from prediction of, e.g., retention and grades, towards a deeper understanding of students’ learning experiences.”
  27. 27. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. • “so far there is little evidence (9%) that the research findings demonstrate improvements in learning outcomes, including knowledge acquisition, skill development and cognitive gains, as well as learning support and teaching”
  28. 28. Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The Current Landscape of Learning Analytics in Higher Education. Computers in Human Behavior. “It is worrying that more than 80% of the papers do not mention ethics at all. Moreover, there are only few studies that approach ethical issues (e.g., data privacy and security, informed consent) in a systematic way. However, we should not jump to the conclusion that most studies are done in an unethical way, but we call for more explicit reflection on ethics to rise in the coming years. The increase of the studies that reflect on the ethical issues for the year 2017 (36%) might indicate that there is already a positive move in this direction” (emphasis added).
  29. 29. Prinsloo, P., & Slade, S. (2017). Ethics and Learning Analytics: Charting the (Un)Charted. In: Lang, Charles; Siemens, George; Wise, Alyssa and Gašević, Dragan eds. Handbook of Learning Analytics. SOLAR, pp. 49–57. 2017
  30. 30. Prinsloo, P., & Slade, S. (2017). Ethics and Learning Analytics: Charting the (Un)Charted. In: Lang, Charles; Siemens, George; Wise, Alyssa and Gašević, Dragan eds. Handbook of Learning Analytics. SOLAR, pp. 49–57. “In a context where much is to be said for the potential economic benefits (for both students and the institution) of more successful learning experiences resulting from increased data harvesting, we should not ignore the possibilities of ‘data-proxy-induced hardship... when the detail obtained from the data-proxy comes to disadvantage its embodied referent in some way’(Smith, 2016, p. 16; also see Ruggiero, 2016; Strauss, 2016b; and Watters, 2016)” (Prinsloo & Slade, 2017, p. 50; emphasis added)
  31. 31. Prinsloo, P., & Slade, S. (2017). Ethics and Learning Analytics: Charting the (Un)Charted. In: Lang, Charles; Siemens, George; Wise, Alyssa and Gašević, Dragan eds. Handbook of Learning Analytics. SOLAR, pp. 49–57. In a recent overview of learning analytics practices in the Australian context, Dawson, Gašević, and Rogers (2016) report that the “relative silence afforded to ethics across the studies is significant” (p. 3) and that this “does not reflect the seriousness with which the sector should consider these issues” (p. 33). The report suggests that “It is likely that the higher education sector has not been ready for such a conversation previously, although it is argued that as institutions are maturing, ethical considerations take on a heightened salience” (p. 33).
  32. 32. Prinsloo, P., & Slade, S. (2017). Ethics and Learning Analytics: Charting the (Un)Charted. In: Lang, Charles; Siemens, George; Wise, Alyssa and Gašević, Dragan eds. Handbook of Learning Analytics. SOLAR, pp. 49–57. Interesting/important issues • Educational triage – allocating (scarce) resources to where they will make the most difference and ensure a return on investment • The responsibility of knowing and the moral duty to act once we know • Differentiating between essential information and ‘handy’ information – the dangers of collecting/scraping data that, out of their original context, lose validity • The mandate of higher education institutions and the moral obligation to use student data • The nuances of opting out • Ethical issues that sit “outside of the law”
  33. 33. Image credit: https://pixabay.com/en/crocodile-scale-texture-close-up-2901544/ Mapping [some of] the unchartered
  34. 34. 1. How is the field/context changing 2. The role of automated-decision making systems in learning analytics 3. The role of regulation/law/policy 4. Is learning analytics research or Research and why/how does it matter? 5. Understanding the complexity of student success and how learning analytics fit into describing, understanding, predicting and prescribing 6. Student-centered learning analytics
  35. 35. Higher education has always collected, analysed and used student data – so what has changed? Image credit: https://en.wikipedia.org/wiki/Scholasticism
  36. 36. IN THE PAST vs AT PRESENT [1]
• Data sources. In the past: demographic and learning data at specific points in the learning journey (application, registration, class registers, assignments, summative assessment, personal communication). At present: continuous, directed, gifted and automated collection of data from a range of sources, including student administration, the learning management system (LMS) and sources outside of the LMS.
• Data use. In the past: reporting purposes and operational planning at cohort/group level by management and institutional researchers. At present: descriptive, diagnostic, predictive and prescriptive use at group/cohort level, plus individualised, often real-time use of data to inform pedagogy, curriculum, assessment and student support by faculty, students and support staff.
• Who used the data (officially)? In the past: management, institutional researchers, planners, quality assurance and HR departments. At present: the above, plus researchers, faculty, students and support staff.
  37. 37. IN THE PAST vs AT PRESENT (continued)
• Who did the collection and analysis, and who used the data? In the past: humans. At present: increasingly humans in combination with algorithmic decision-making processes.
• Temporal aim. In the past: retrospective/historical data used to make predictions with regard to budgets, future enrollments and resource allocation at institutional level. At present: plus real-time data for real-time interventions.
• Default. In the past: forgetting. At present: remembering.
• Personal identifiers. In the past: anonymised, aggregated data. At present: plus re-identifiable, personal/ised data.
• Oversight/data governance. In the past: broad institutional oversight, with Ethical Review Board (ERB) approval for research purposes. At present: approval, oversight and governance are highly complex and contested.
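To make the shift from retrospective reporting to predictive, individualised use of data concrete, here is a deliberately simplified, hypothetical sketch of the kind of ‘predictive learning analytics’ the comparison above refers to: a model fitted on LMS activity to estimate a student’s probability of passing. All of the data, feature names and numbers below are synthetic and illustrative; they do not reproduce any institution’s or study’s actual pipeline.

```python
# Minimal, hedged sketch of predictive learning analytics on synthetic LMS data.
# No real student data or institutional model is implied.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 200
logins_per_week = rng.poisson(lam=5, size=n)          # synthetic LMS login counts
assignments_submitted = rng.integers(0, 6, size=n)    # synthetic assignment counts
# Synthetic outcome loosely tied to activity, purely for illustration.
passed = (0.3 * logins_per_week + 0.8 * assignments_submitted
          + rng.normal(0, 1.5, size=n)) > 4

X = np.column_stack([logins_per_week, assignments_submitted])
model = LogisticRegression().fit(X, passed)

# "Predict" the risk for a new, low-activity student.
p_pass = model.predict_proba([[1, 0]])[0, 1]
print(f"Estimated probability of passing: {p_pass:.2f}")
```

As slide 16 already cautioned, such a model identifies a condition (low activity, low predicted probability of passing); it says nothing about why a student is struggling or what support would actually help.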
  38. 38. Human-algorithm interaction in the collection, analysis and use of student data: what are the (ethical) issues? [2] For each task (Seeing, Processing, Acting, Learning), ask which of the following applies: (1) humans perform the task; (2) the task is shared with algorithms; (3) algorithms perform the task under human supervision; (4) algorithms perform the task with no human input. Danaher, J. (2015). How might algorithms rule our lives? Mapping the logical space of algocracy. [Web log post]. Retrieved from http://philosophicaldisquisitions.blogspot.com/2015/06/how-might-algorithms-rule-our-lives.html
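One way to put Danaher’s mapping to work when auditing a learning analytics pipeline is to treat it as a simple data structure: code each stage of the pipeline with its level of algorithmic involvement and flag the stages where no human is in the loop. The sketch below is my own illustrative framing; the stage names and the example coding are hypothetical, not taken from Danaher or from any institution.

```python
# Illustrative sketch: Danaher's four levels of algorithmic involvement as an enum,
# applied to a hypothetical learning analytics pipeline.
from enum import Enum

class Involvement(Enum):
    HUMAN_ONLY = 1        # (1) humans perform the task
    SHARED = 2            # (2) the task is shared with algorithms
    ALGO_SUPERVISED = 3   # (3) algorithms perform the task under human supervision
    ALGO_ONLY = 4         # (4) algorithms perform the task with no human input

# Hypothetical coding of one institution's pipeline.
pipeline = {
    "seeing (data collection)": Involvement.ALGO_ONLY,
    "processing (analysis)": Involvement.ALGO_SUPERVISED,
    "acting (interventions)": Involvement.SHARED,
    "learning (model updating)": Involvement.ALGO_ONLY,
}

for stage, level in pipeline.items():
    flag = "  <- no human input: who is accountable?" if level is Involvement.ALGO_ONLY else ""
    print(f"{stage:28s} {level.name}{flag}")
```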
  39. 39. Source credit: https://www.technologyreview.com/s/526401/laws-and-ethics-cant-keep-pace-with-technology/ 3
  40. 40. Griffiths, D. (2017, September). An Ethical Waiver for Learning Analytics?. In European Conference on Technology Enhanced Learning (pp. 557-560). Springer, Cham. 4
  41. 41. Willis, J. E., Slade, S., & Prinsloo, P. (2016). Ethical oversight of student data in learning analytics: A typology derived from a cross-continental, cross-institutional perspective. Educational Technology Research and Development, 64, 881-901. DOI: 10.1007/s11423-016-9463-4 http://link.springer.com/article/10.1007/s11423-016-9463-4 Who will provide oversight over the ethical issues in learning analytics? An interpretative multiple-case study: Indiana University, the Open University (UK) and the University of South Africa (Unisa)
  42. 42. “Ethics are the mirror in which we evaluate ourselves and hold ourselves accountable” (emphasis added). Holding actors and humans accountable still works “better than every single other system ever tried” (Brin, 2016) Ethics and accountability
  43. 43. Ethics in learning analytics: Selected examples 2013-2017
  44. 44. Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510-1529. 2013
  45. 45. Source credit: https://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ecms/web-content/ethical-use-of-student-data-policy.pdf 2014
  46. 46. Source credit: https://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ecms/web-content/ethical-use-of-student-data-policy.pdf Principle 1: Learning analytics is an ethical practice that should align with core organisational principles, such as open entry to undergraduate level study. Principle 2: The OU has a responsibility to all stakeholders to use and extract meaning from student data for the benefit of students where feasible. Principle 3: Students should not be wholly defined by their visible data or our interpretation of that data. Principle 4: The purpose and the boundaries regarding the use of learning analytics should be well defined and visible.
  47. 47. Source credit: https://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ecms/web-content/ethical-use-of-student-data-policy.pdf Principle 5: The University is transparent regarding data collection, and will provide students with the opportunity to update their own data and consent agreements at regular intervals. Principle 6: Students should be engaged as active agents in the implementation of learning analytics (e.g. informed consent, personalised learning paths, interventions). Principle 7: Modelling and interventions based on analysis of data should be sound and free from bias. Principle 8: Adoption of learning analytics within the OU requires broad acceptance of the values and benefits (organisational culture) and the development of appropriate skills across the organisation.
  48. 48. A long time ago in a far-off galaxy made of academic conferences… Source credit: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics Guiding principles 1. Responsibility 2. Transparency and consent 3. Privacy 4. Validity 5. Access 6. Enabling positive interventions 7. Minimising adverse impacts 8. Stewardship of data
  49. 49. Image credit: https://pixabay.com/en/human-adult-waters-ship-dragons-3055939/ Understanding student retention and success as a complex and fluid ecology, consisting of many intersecting and often mutually constitutive factors in the nexus between students, the institution and broader society. [5]
  50. 50. [Diagram: a socio-critical model of student success. The student and the institution are both agents, each with their own identity, attributes and habitus, and each subject to shaping conditions (predictable as well as uncertain). Along ‘the student walk’ (choice and admission, learning activities, course success, graduation, employment/citizenship) there are multiple, mutually constitutive interactions between student, institution and networks, with varying degrees of ‘fit’ at each step. Inter- and intra-personal domains (academic, operational, social) and modalities such as attribution, locus of control and self-efficacy shape these processes. Retention, progression and a positive experience result in transformed student and institutional identities and attributes.] Subotzky, G., & Prinsloo, P. (2011). Turning the tide: a socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education, 32(2), 177-193.
  51. 51. Source credit: http://timoelliott.com/blog/2013/02/gartnerbi-emea-2013-part-1-analytics-moves-to-the-core.html Learning analytics in action
  52. 52. Image credit: https://pixabay.com/en/puzzle-cooperation-together-1020002/ “This interconnected set of problems has an interconnected set of solutions” (Martin, 2007, pp. 5-6)
  53. 53. Image credit: https://pixabay.com/en/binary-code-man-display-dummy-face-1327512/ Student data sovereignty Student data are not something separate from students’ identities, their histories, their beings. Data are an integral, albeit informational, part of students’ being. In the light of the view that data are not something students own but rather who they are, what are we assuming when we say we ‘collect’ their data? E.g. Floridi, L. (2005). The ontological interpretation of informational privacy. Ethics and Information Technology, 7(4), 185-200. [6]
  54. 54. Image credit: https://pixabay.com/en/fantasy-dragons-lizard-feed-child-2231796/ Students’ journeys are intimately woven into our (institutional) stories. In the light of the asymmetrical power relationship, we have a greater responsibility.
  55. 55. Image credit: https://pixabay.com/en/dragon-fantasy-creature-mythology-637003/ Some (in)concluding pointers
  56. 56. Institutions have not only a fiduciary duty to collect, analyse and use student data, but also a moral duty. We never ‘own’ the data – students don’t ‘throw off’ data as they engage with their materials, their peers, their instructors. Their data are so much more than ‘digital footprints’. We need to recognise their data sovereignty. An ethical approach to collecting, analysing and using student data should be a powerful counter-narrative to some of the current dominant discourses driven by economic interests.
  57. 57. THANK YOU Paul Prinsloo (Prof) Research Professor in Open Distance Learning (ODL), College of Economic and Management Sciences, Samuel Pauw Building, Office 5-21, P.O. Box 392, Unisa, 0003, Republic of South Africa T: +27 (0) 12 433 4719 (office) prinsp@unisa.ac.za Skype: paul.prinsloo59 Personal blog: http://opendistanceteachingandlearning.wordpress.com Twitter profile: @14prinsp Image credit: https://pixabay.com/en/figure-dragon-wing-face-ceramic-3124002/
