
Implications and questions for institutional learning analytics implementation arising from teacher DIY learning analytics


Learning analytics promises to provide insights that can help improve the quality of learning experiences. Since the late 2000s it has inspired significant investments in time and resources by researchers and institutions to identify and implement successful applications of learning analytics. However, there is limited evidence of successful at scale implementation, somewhat limited empirical research investigating the deployment of learning analytics, and subsequently concerns about the insight that guides the institutional implementation of learning analytics. This paper describes and examines the rationale, implementation and use of a single example of teacher do-it-yourself (DIY) learning analytics to add a different perspective. It identifies three implications and three questions about the institutional implementation of learning analytics that appear to generate interesting research questions for further investigation.

Published in: Education


  1. 1. Implications and questions for institutional learning analytics implementation from teacher DIY learning analytics http://tiny.cc/ktsdiy David Jones Hazel Jones Colin Beer Celeste Lawson
  2. 2. as if Institutional implementation of learning analytics http://tiny.cc/ktsdiy
  3. 3. as if Institutional implementation of learning analytics Teacher DIY LA http://tiny.cc/ktsdiy
  4. 4. as if Institutional implementation of learning analytics 3 Implications 3 Questions Teacher DIY LA http://tiny.cc/ktsdiy
  5. 5. as if Institutional implementation of learning analytics 3 Implications 3 Questions New insights to guide research & practice?? Teacher DIY LA http://tiny.cc/ktsdiy
  6. 6. for a large number of institutions, organisational adoption of learning analytics either remains a conceptual, unrealised aspiration or,… is often narrow and limited in scope and impact (Colvin et al, 2017, p. 281) Under construction
  7. 7. Limitations of available conceptual models …do not fully capture the breadth of factors that shape LA implementations …curtailing their ability to present managers with the nuanced, situated, fine-grained insight they require (Colvin et al, 2017, p. 284)
  8. 8. Limitations of empirical work …interviews with senior leaders charged with responsibility for implementing learning analytics at 32 universities (Colvin et al, 2017, p. 285) …recommended that further empirical analysis of learning analytics implementation are conducted over time that allow more nuanced insight (Colvin et al, 2017, p. 287)
  9. 9. What about the teacher?
  10. 10. What about the teacher? Lack of human-centeredness in LA…pervades implementation (Liu et al, 2017, p. 150) …tend to privilege the administrator rather than the student - or even the instructor. (Kruse & Pongsajapan, 2012)
  11. 11. What is the experience of teachers using institutional learning analytics? How might an understanding of their experience inform the institutional implementation of learning analytics?
  12. 12. One teacher’s DIY learning analytics
  13. 13. Different perspectives
  14. 14. One teacher’s DIY learning analytics Single course Two offerings per year S1: 300+ students, multiple modes, ~200 online S2: ~100 students, online only Initial Teacher Education – all sectors Less than 15% of students meet lead teacher in person 75% of the students are online only
  15. 15. A little confused about the criteria column
  16. 16. Who? • Where are they? • Sector? • Prior studies? • Contact details? What? • Completed learning task? • Blog posts?
  17. 17. (Baker, 2016, p. 607) the goal of getting key information to a human who can use it Use of [big] data to provide actionable intelligence for learners & teachers (Ferguson, 2014)
  18. 18. Gathering one student’s details meant navigating 6 “reports” across 2 systems, taking 10+ minutes: UConnect (Student Enquiry, Student Centre, Unofficial transcript), UWork (Faculty Centre, Class Roster), UTeach (Study Desk, Profile, Reports, Activity Completion, BIM View Student), to find specialisation, mode, courses, activity completion, blog posts, GPA, course history, address, phone #
  19. 19. …no surprise that those things that the affordances make easy are apt to get done, those things that the affordances make difficult are not apt to get done (Norman, 1993, p. 106)
  20. 20. (Baker, 2016, p. 607) Implication #1 the goal of getting key information to a human who can use it Institutional LA implementation is falling short ..common request…across the focus groups was the ability to correlate data across systems (Corrin et al, 2013, p. 204) ..teachers are the best people to know what data they want but rarely have the data that they actually want in a place and form where it can actually be used (Liu, 2017)
  21. 21. Implication #1 Is it? How? How widespread? Institutional LA implementation is falling short Why? What can be done? How long at your institution? Research Practice
  22. 22. One teacher’s solution - DIY learning analytics
  23. 23. A little confused about the criteria column
  24. 24. A little confused about the criteria column Know thy student
  25. 25. • Where are they? • Sector? • Prior studies? • Contact details? • Completed learning task? • Blog posts?
  26. 26. • Where are they? • Sector? • Prior studies? • Contact details? • Completed learning task? • Blog posts?
  27. 27. “criteria” activity Note the completion time difference • Where are they? • Sector? • Prior studies? • Contact details? • Completed learning task? • Blog posts?
  28. 28. Sentiment analysis • Completed learning task? • Blog posts? Learning design specific
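The sentiment analysis mentioned on this slide can be sketched with a minimal lexicon-based scorer. This is an illustrative sketch only, not the method used in the actual tool; the word lists and function name are hypothetical.

```python
# Hypothetical lexicon-based sentiment scorer for student blog posts.
# The word lists below are illustrative, not from the actual tool.
POSITIVE = {"good", "great", "enjoyed", "helpful", "clear"}
NEGATIVE = {"confused", "difficult", "unclear", "frustrated", "lost"}

def sentiment(text: str) -> float:
    """Return a score in [-1, 1]: (positive - negative word count) / total words."""
    words = text.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos - neg) / len(words)
```

A learning-design-specific version would extend the lexicon with course vocabulary, which is the kind of contextual tailoring the slide points at.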
  29. 29. (Ko et al, 2011, p. 21) Implication #3 [m]ost programs today … written not by professional software developers, but by people with expertise in other domains working towards goals for which they need computational support Teacher DIY learning analytics is possible
  30. 30. (Behrens, 2009, p. 124) often said to have negative consequences for their hosts: undermining official systems, sapping valuable resources and corrupting organizational data and processes
  31. 31. Implication #3 How widespread is it? Teacher DIY learning analytics is possible What are the issues? Security Privacy Efficiency Are you aware of it? Why & what shape does it take? Research Practice Is it a little unsettling? What will/should be done?
  32. 32. (Behrens, 2009, p. 124) Some shadow systems offer an effective and efficient way for users to cope with the deficiencies of formal systems
  33. 33. …no surprise that those things that the affordances make easy are apt to get done, those things that the affordances make difficult are not apt to get done (Norman, 1993, p. 106)
  34. 34. Did it get used? 4 offerings of 1 course (2015 and 2016) 850 students
  35. 35. Know thy student clicks per student: 761 students (89.5%)
  36. 36. Daily usage. Max single day: 140. Used: 666 days over 2 years (91.2%). S1: 300+ students; S2: ~100 students
  37. 37. Embedded and ubiquitous
  38. 38. Implication #2 Embedded, ubiquitous, contextual learning analytics encourage usage and emergent practice Human action will be most effective if the data and models provided to those humans is of the highest possible quality for the context (Baker, 2016, p. 610) ..learning analytics cannot be decoupled from actual, situated learning and teaching practice (Gasevic et al, 2015, p. 83)
  39. 39. there is no single technological solution that applies for every teacher, every course, or every view of teaching (Mishra & Koehler, 2006, p. 1029)
  40. 40. teachers' work is largely bricolage … the concept of bricolage plays a unifying and heuristic role in understanding and explaining teachers' work (Hatton, 1989, p. 78) Bricoleurs…professional do-it-yourself person (Hatton, 1989, p. 75) TEL innovation is a process of bricolage. (Scanlon et al, 2013, p. 7)
  41. 41. (Johnson et al., 2014) #1
  42. 42. …no surprise that those things that the affordances make easy are apt to get done, those things that the affordances make difficult are not apt to get done (Norman, 1993, p. 106) [A bricoleur’s] universe of instruments is closed and the rules of his game are always to make do with 'whatever is at hand' (Levi-Strauss, 1966, p. 17)
  43. 43. Implication #2 Embedded, ubiquitous, contextual learning analytics encourage usage and emergent practice What happens in other contexts? Research Practice How embedded, ubiquitous & contextual is LA?
  44. 44. Questions arising from teacher DIY learning analytics?
  45. 45. How can we use “Know thy student”... ...in our course? …at our institution?
  46. 46. Question #1: Does institutional LA have an incomplete focus? …to move from small-scale research towards broader institutional implementation (Ferguson et al, 2014, p. 120) …make the leap from the focused & particular to the broad and general (Lonn et al, 2013, p. 235)
  47. 47. The less context, the better for reuse Reusability paradox (Wiley, n.d) …boutique models that vary by course…may be unwieldy to implement despite being more accurate generalized models may be inaccurate…but represent a cost effective &…efficient approach (Gasevic et al, 2015, p. 83) ..comparison…between course-specific and general models points to a significant tension in the field that must be resolved
  48. 48. The less context, the better for reuse Reusability paradox (Wiley, n.d) (Wiliam, 2006, p. 17) …‘what works’ is not the right question in education. Everything works somewhere, and nothing works everywhere It is impossible to design systems which are appropriate for all users and all situations (MacLean et al, 1990, p. 175)
  49. 49. Question #1: Does institutional LA have an incomplete focus? How do we resolve the LA reusability paradox? What will happen if we do? Can we? Research Practice What is actually meant by ”scalability” or “at scale”?
  50. 50. Question #2: Does institutional LA have a starvation problem? was delayed due to existing projects … that were a higher priority for the institution (Lonn et al, 2013, p. 238)
  51. 51. Question #2: Does institutional LA have a starvation problem? One course with a particular learning design and particular LA requirements
  52. 52. Question #2: Does institutional LA have a starvation problem? How does this impact LA and L&T? What can be done about it? Research Practice How long for single course LA to be implemented at your institution? Does it?
  53. 53. Question #3: If and how do we enable teacher DIY LA? ...Such engagement may also be a necessary condition to properly understand the ecology of practices that will be the context for any particular TEL innovation (Scanlon et al, 2013, p. 6) …consequence of the complexity of TEL… is that there is scope for ‘user-driven’ contributions from both teachers & students
  54. 54. Question #3: If and how do we enable teacher DIY LA? Teachers…needed a platform to collect, analyze, and perform action on these data at scale (Liu et al, 2017, p. 148) …participatory design approach (Liu et al, 2017, p. 149-150) Teacher-centered Human-centered Customizable, flexible and scalable…
  55. 55. …affordances of pervasive digital technology also produce innovations characterised by generativity (Yoo et al, 2012, p. a technology’s overall capacity to produce unprompted change driven by large, varied and uncoordinated audiences (Zittrain, 2006, p. 1980)
  56. 56. Users must be able to tailor a system to their wants (Kay, 1984, p. 57) The protean nature of the computer…is such that it can act like a machine or like a language to be shaped & exploited (Kay, 1984, p. 59)
  57. 57. (Jones & Clark, 2014) The incommensurable nature…imply that any attempts to fruitfully merge the two will need to deal with existing, and sometimes strongly held assumptions and mindsets.
  58. 58. Creating computationally rich environments (Grover & Pea, 2013) Low floor, high ceiling Enable transfer Use-modify-create (Repenning et al, 2010) Systemic and sustainable Ten principles for effective tinkering (Dron, 2014) A theory of tailorable technology design (Germonprez et al, 2007)
  59. 59. Reproducible research Literate computing – Jupyter notebooks Version control Virtualisation (Sinha and Sudish, 2016) Web augmentation (Diaz & Arellano, 2015)
  60. 60. Question #3: If and how do we enable teacher DIY LA? Who is already working on this? Research Practice Would you want to? How to do it? Challenges? Outcomes? Could you? Want to talk more about exploring how?
  61. 61. as if Institutional implementation of learning analytics 3 Implications 3 Questions New insights to guide research & practice?? Teacher DIY LA Questions? http://tiny.cc/ktsdiy
  62. 62. Baker, R. (2016). Stupid Tutoring Systems, Intelligent Humans. International Journal of Artificial Intelligence in Education, 26(2), 600–614. https://doi.org/10.1007/s40593-016-0105-0 Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124–129. Colvin, C., Dawson, S., Wade, A., & Gašević, D. (2017). Addressing the Challenges of Institutional Adoption. In C. Lang, G. Siemens, A. F. Wise, & D. Gaševic (Eds.), The Handbook of Learning Analytics (1st ed., pp. 281–289). Alberta, Canada: Society for Learning Analytics Research (SoLAR). Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics by understanding the needs of teachers. In Electric Dreams. Proceedings ascilite 2013 (pp. 201–205). Díaz, O., & Arellano, C. (2015). The Augmented Web: Rationales, Opportunities, and Challenges on Browser-Side Transcoding. ACM Trans. Web, 9(2), 8:1–8:30. https://doi.org/10.1145/2735633 Ferguson, R. (2014). Learning analytics FAQs. Education. Retrieved from https://www.slideshare.net/R3beccaF/learning-analytics-fa-qs
  63. 63. Jones, D., & Clark, D. (2014). Breaking BAD to bridge the reality/rhetoric chasm. In B. Hegarty, J. McDonald, & S. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014 (pp. 262–272). Kay, A. (1984). Computer Software. Scientific American, 251(3), 53–59. Ko, A. J., Abraham, R., Beckwith, L., Blackwell, A., Burnett, M., Erwig, M., … Wiedenbeck, S. (2011). The State of the Art in End-user Software Engineering. ACM Comput. Surv., 43(3), 21:1–21:44. https://doi.org/10.1145/1922649.1922658 Levi-Strauss, C. (1966). The Savage Mind. Weidenfeld and Nicolson. Liu, D. Y.-T. (2017). What do Academics really want out of Learning Analytics? – ASCILITE TELall Blog. Retrieved August 27, 2017, from http://blog.ascilite.org/what-academics-really-want-out-of-learning-analytics/ Liu, D. Y.-T., Bartimote-Aufflick, K., Pardo, A., & Bridgeman, A. J. (2017). Data-Driven Personalization of Student Learning Support in Higher Education. In A. Peña-Ayala (Ed.), Learning Analytics: Fundaments, Applications, and Trends (pp. 143–169). Springer International Publishing. https://doi.org/10.1007/978-3-319-52977-6_5
  64. 64. Lonn, S., Aguilar, S., & Teasley, S. D. (2013). Issues, Challenges, and Lessons Learned when Scaling Up a Learning Analytics Intervention. In Proceedings of the Third International Conference on Learning Analytics and Knowledge (pp. 235–239). New York, NY, USA: ACM. https://doi.org/10.1145/2460296.2460343 MacLean, A., Carter, K., Lövstrand, L., & Moran, T. (1990). User-tailorable Systems: Pressing the Issues with Buttons. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 175–182). New York, NY, USA: ACM. https://doi.org/10.1145/97243.97271 Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. Norman, D. A. (1993). Things that make us smart: defending human attributes in the age of the machine. Cambridge, Mass: Perseus.
  65. 65. Repenning, A., Webb, D., & Ioannidou, A. (2010). Scalable Game Design and the Development of a Checklist for Getting Computational Thinking into Public Schools. In Proceedings of the 41st ACM Technical Symposium on Computer Science Education (pp. 265–269). New York, NY, USA: ACM. https://doi.org/10.1145/1734263.1734357
  66. 66. Scanlon, E., Sharples, M., Fenton-O’Creevy, M., Fleck, J., Cooban, C., Ferguson, R., … Waterhouse, P. (2013). Beyond prototypes: Enabling innovation in technology‐enhanced learning. London. Retrieved from http://tel.ioe.ac.uk/wp-content/uploads/2013/11/BeyondPrototypes.pdf Sinha, R., & Sudhish, P. S. (2016). A principled approach to reproducible research: a comparative review towards scientific integrity in computational research. In 2016 IEEE International Symposium on Ethics in Engineering, Science and Technology (ETHICS) (pp. 1–9). https://doi.org/10.1109/ETHICS.2016.7560050 Wiley, D. (n.d.). The Reusability Paradox. Retrieved from http://cnx.org/content/m11898/latest/ Wiliam, D. (2006). Assessment: Learning communities can use it to engineer a bridge connecting teaching and learning. JSD, 27(1). Yoo, Y., Boland, R. J., Lyytinen, K., & Majchrzak, A. (2012). Organizing for Innovation in the Digitized World. Organization Science, 23(5), 1398–1408. Zittrain, J. L. (2006). The Generative Internet. Harvard Law Review, 119(7), 1974–2040.
  68. 68. Slide 20, 35, 44: "Don Norman" by happy.apple available at http://flickr.com/photos/happy.apple/6117336152 under Attribution License https://creativecommons.org/licenses/by/2.0/ Slide 58: "IMG_6967.JPG" by Tantek Çelik available at http://flickr.com/photos/tantek/4875352288 under Attribution-NonCommercial License https://creativecommons.org/licenses/by-nc/2.0/ Slide 59: "Alan Kay and the prototype of Dynabook, pt. 5" by Marcin Wichary available at http://flickr.com/photos/MarcinWichary/3010032738 under Attribution License https://creativecommons.org/licenses/by/2.0/ Slide 60: "Book: All Marketers Are Liars" by John Drake available at http://flickr.com/photos/JohnDrakeFlickr/7227826626 under Attribution-NoDerivs License https://creativecommons.org/licenses/by-nd/2.0/ Slide 61, 62: "Possibilities" by Susan available at http://flickr.com/photos/SusanAstray/2176917760 under Attribution-NonCommercial-ShareAlike License https://creativecommons.org/licenses/by-nc-sa/2.0/ Slide 65, 66, 67, 68, 69, 70, 71: "The Leeds Library" by Michael D Beckwith available at http://flickr.com/photos/michael_d_beckwith/16438065636 under Public Domain Dedication (CC0) https://creativecommons.org/publicdomain/zero/1.0/
  69. 69. How it works
  70. 70. Teacher Institution
  71. 71. Teacher Institution (Behrens, 2009, p. 124) …covertly replicate the data and functionality of formally sanctioned systems
  72. 72. Screen-scraping and CSV exports Teacher Institution
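Screen-scraping plus CSV exports implies merging records from several institutional sources into one per-student view. A minimal sketch, assuming two hypothetical exports (a class roster and an activity-completion report) that share an `id` column; the column names and function are illustrative, not the actual tool's code.

```python
import csv
import io

def merge_exports(roster_csv: str, activity_csv: str) -> dict:
    """Merge two CSV exports into one dict of per-student records, keyed on 'id'.

    Roster fields are loaded first; activity fields are layered on top,
    so a student present in both exports gets a combined record.
    """
    students = {}
    for row in csv.DictReader(io.StringIO(roster_csv)):
        students[row["id"]] = dict(row)
    for row in csv.DictReader(io.StringIO(activity_csv)):
        students.setdefault(row["id"], {}).update(row)
    return students
```

The design choice mirrors the shadow-system pattern described on the surrounding slides: the teacher works around the formal systems by joining their exports locally, rather than waiting for an institutional integration.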
  73. 73. Teacher Institution Know thy student
  74. 74. Teacher Institution Visit Course Site Know thy student
  75. 75. Teacher Institution Know thy student Update course page
  76. 76. Teacher Institution Know thy student Respond to queries
