
Transitioning Education’s Knowledge Infrastructure ICLS 2018



Keynote Address, International Conference of the Learning Sciences, London Festival of Learning

Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?


Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).

The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, and deliver valued services to frontline educational users: but how are universities accelerating the transition from analytics innovation to infrastructure?

Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.

Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.



  1. 1. Transitioning Education’s Knowledge Infrastructure Shaping Design or Shouting from the Touchline? Simon Buckingham Shum @sbuckshum • http://utscic.edu.au Keynote Address, International Conference of the Learning Sciences London Festival of Learning, June 2018
  2. 2. 2 Simon Shum UCL Ergonomics MSc student, 1987 Jon Shum UCL Space Research Team technician, 1961
  3. 3. Deep acknowledgements to the team who’ve shaped the ideas in this talk… https://utscic.edu.au/about/people Lecturer
  4. 4. 4 “infrastructure”
  5. 5. 5 Amid justified public concern about data/algorithm ethics, and academic concerns about computational methods, how do we design infrastructure WE TRUST?
  6. 6. 6 infrastructure wants to be INVISIBLE
  7. 7. 7 my new amazing piece of augmented reality kit
  13. 13. 13 multifocal multivocal
  14. 14. 14 long-distance big picture lenses close-up lenses Try on my multifocals…
  15. 15. 15 Learning Analytics: many tributaries converging https://www.flickr.com/photos/baggis/4173304621
  16. 16. 16 Learning Analytics is therefore building bridges with many disciplines. LAK conference keynote speakers include: Learning Sciences: Paul Kirschner, David Williamson Shaffer, Art Graesser; Psychometrics: Robert Mislevy; Information Visualisation: Katy Börner, Cristina Conati; IT Ethics: Mireille Hildebrandt; Critical studies of technology: Neil Selwyn; Writing analytics: Danielle McNamara
  17. 17. 17 Learning Analytics is therefore building bridges with many disciplines (LAK keynote speakers as above) and with LMS and analytics vendors and publishers. The LAK conference and LA Summer Institutes offer: industry panels to explore the opportunities and tensions with academia; industry researchers on the Doctoral Consortium; commercial sponsors and exhibitors; collaborative projects with industry
  18. 18. 18 Learning: learning sciences, educational research, teaching practice, curriculum design, learning design, pedagogy, assessment… Analytics: data, statistics, classification, machine learning, text processing, visualisation, predictive models… …not a straightforward dialogue
  19. 19. 19 What’s missing? Human Factors: stakeholder involvement, participatory design cycles, user interface design, privacy and ethics, end-user acceptance, organisational strategy, staff training…
  20. 20. 20 Learning Analytics: A Human-Centred Design Discipline (the intersection of Learning Analytics and Human Factors)
  21. 21. Who are Learning Analytics for? 21
  22. 22. Who are Learning Analytics for? 22 Learning Sciences researchers! 1. Theory 2. Experiment 3. Simulation 4. Data-Intensive Science http://FourthParadigm.org
  23. 23. 23 Educational eResearch infrastructure: Education meets Cyberinfrastructure, eScience, eSocialScience, Grid computing, etc… Educational research methods could change radically when: • “sample size” is less relevant: N=All • statistically significant patterns are easy to find in such big data • qualitative textual coding may be automated (see the sketch below) • social ties can be tracked at scale • etc… Bill Cope & Mary Kalantzis (2016). Big Data Comes to School: Implications for Learning, Assessment, and Research. AERA Open, 2(2). Open Access: https://doi.org/10.1177/2332858416641907 Lina Markauskaite (2010). Digital Media, Technologies and Scholarship: Some Shapes of eResearch in Educational Inquiry. The Australian Educational Researcher, 37(4), pp.79-101. Open Access: https://link.springer.com/content/pdf/10.1007%2FBF03216938.pdf
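To make the “qualitative textual coding may be automated” bullet concrete, here is a minimal sketch (illustrative only, not from the talk: the codebook, codes and posts are invented, and real systems would use trained classifiers rather than keyword matching):

```python
# Minimal sketch (not from the talk): automating a simple qualitative
# coding pass over student forum posts with a keyword codebook.
# Real systems use trained classifiers; this only illustrates the idea.

CODEBOOK = {  # hypothetical codes -> indicator terms
    "help_seeking": ["stuck", "confused", "how do i", "help"],
    "metacognition": ["i realised", "looking back", "strategy", "i noticed"],
    "collaboration": ["we agreed", "our group", "shared", "together"],
}

def code_post(text):
    """Return every qualitative code whose indicator terms appear in the post."""
    lowered = text.lower()
    return [code for code, terms in CODEBOOK.items()
            if any(term in lowered for term in terms)]

posts = [
    "I'm stuck on the second proof, how do I start?",
    "Looking back, our group shared a better strategy this week.",
]
for post in posts:
    print(code_post(post), "<-", post)
# ['help_seeking'] <- I'm stuck on the second proof, how do I start?
# ['metacognition', 'collaboration'] <- Looking back, our group shared ...
```

The point is scale, not sophistication: once a codebook is executable, it can be applied to every post in an N=All corpus, with human coders auditing samples.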
  24. 24. 24 Who are Learning Analytics for? Learners! Educators! “The new possibility is that educators and learners — the stakeholders who constitute the learning system studied for so long by researchers — are for the first time able to see their own processes and progress rendered in ways that until now were the preserve of researchers outside the system.” Knight, S. and Buckingham Shum, S. (2017). Theory & Learning Analytics. Handbook of Learning Analytics, Society for Learning Analytics Research, Chap. 1, pp.17-22. http://doi.org/10.18608/hla17
  25. 25. 25 Who are Learning Analytics for? Learners! Educators! “Data gathering, analysis, interpretation and even intervention (in the case of adaptive software) is no longer the preserve of the researcher, but shifts to embedded sociotechnical educational infrastructure.” Knight, S. and Buckingham Shum, S. (2017), op. cit.
  26. 26. 26 But… for many, we’re suspect, wielding data, analytics and AI in ways that may oppress rather than empower tectonic shifts under way in the educational landscape…
  27. 27. Justified concerns around privacy, surveillance and data ethics are redefining the context for our work 27
  28. 28. Growing public literacy around the ethics of data / algorithms / AI is to be welcomed 28 For more see http://datasociety.net
  29. 29. Critical academic commentary on the datafication of education (Ben Williamson) 29 “Datafication” at all levels of the educational system, from government statistics to biometrics Concern about the ownership of data and analytics platforms by commercial entities Worried that students will have opportunities closed down rather than opened up by algorithms And much more… Justified?
  30. 30. 30 Critical academic commentary on Learning Analytics (Neil Selwyn, LAK18 keynote): The Promises and Problems of ‘Learning Analytics’ https://www.youtube.com/watch?v=rsUx19_Vf0Q https://latte-analytics.sydney.edu.au/keynotes
  31. 31. Classification schemes and metrics are suspect, with good reason… 31 ideas now being popularised… incisive STS scholarship into classification schemes…
  32. 32. Du Gay, P. and Pryke, M. (2002) Cultural Economy: Cultural Analysis and Commercial Life. Sage, London. pp. 12-13 “accounting tools...do not simply aid the measurement of economic activity, they shape the reality they measure” 32
  33. 33. 33 A rapidly changing educational data/analytics ecosystem surrounds Learning Analytics + Human Factors: Venture Capitalists; Philanthropic Foundations; Publishers as analytics providers (Pearson, McGraw Hill, Wiley etc.); Learning Platform Services (Blackboard, Canvas, D2L, Facebook etc.); Adaptive/Learning Analytics Services (SmartSparrow, Knewton, Unizen, CogTools etc.); Data Protection Laws (GDPR, national privacy laws etc.); Government & international datasets (UK HESA Data Futures, OECD PISA, UNESCO Institute for Statistics, US Institute for HE Practice etc.)
  35. 35. 35 2 historical lenses: Knowledge Infrastructures (Paul Edwards) Pasteur’s Quadrant (Donald Stokes)
  36. 36. “Knowledge Infrastructures” (Paul Edwards) 36 http://knowledgeinfrastructures.org https://mitpress.mit.edu/books/vast-machine Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming
  37. 37. “Knowledge Infrastructures” 37 “robust networks of people, artifacts, and institutions that generate, share, and maintain specific knowledge about the human and natural worlds.” Routine, well-functioning knowledge systems include the world weather forecast infrastructure, the Centers for Disease Control, or the Intergovernmental Panel on Climate Change — individuals, organizations, routines, shared norms, and practices. Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012
  38. 38. “Knowledge Infrastructures” 38 “Infrastructures are not systems, in the sense of fully coherent, deliberately engineered, end-to-end processes. …ecologies or complex adaptive systems[…] made to interoperate by means of standards, socket layers, social practices, norms, and individual behaviors.” Edwards et al. (2013), op. cit.
  39. 39. “Knowledge Infrastructures” 39 (Same quote as slide 38.) I think we can see the educational ecosystem here. Edwards et al. (2013), op. cit.
  40. 40. Knowledge Infrastructure concepts 40 Models, models, models… “Everything we know about the world’s climate — past, present, and future — we know through models.” (p.xiv) “I’m not talking about the difference between “raw” and “cooked” data. I mean this literally. Today, no collection of signals or observations […] becomes global in time and space without first passing through a series of data models.” (p.xiii) Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
  41. 41. Knowledge Infrastructure concepts 41 Models, models, models… (quotes as on slide 40) Machines ‘see’ learners only through models. “Raw data is an oxymoron” (Geof Bowker). Edwards (2010), op. cit.
  42. 42. Knowledge Infrastructure concepts 42 Infrastructural inversion: “The climate knowledge infrastructure never disappears from view, because it functions by infrastructural inversion: continual self-interrogation, examining and reexamining its own past. The black box of climate history is never closed.” Edwards (2010), op. cit.
  43. 43. Knowledge Infrastructure concepts 43 Infrastructural inversion (quote as on slide 42): We must keep lifting the lid on learning analytics infrastructures. We must equip learners and educators to engage critically with such tools. Edwards (2010), op. cit.
  44. 44. Knowledge Infrastructure concepts 44 Metadata friction: “People long ago observed climate and weather for their own reasons, within the knowledge frameworks of their times. You would like to use what they observed — not as they used it, but in new ways, with more precise, more powerful tools. […] So you dig into the history of data. You fight metadata friction, the difficulty of recovering contextual knowledge about old records.” (p.xvii) Edwards (2010), op. cit.
  45. 45. Knowledge Infrastructure concepts 45 Metadata friction (quote as on slide 44): cf. reanalysis of educational data (your own and others’) using computational methods. Edwards (2010), op. cit.
  46. 46. Epistemic Infrastructure taxonomy for professional knowledge (Markauskaite & Goodyear). Particular contributions at the “Micro-KI” level: how professionals construct their EI. 46 Markauskaite, L. & Goodyear, P. (2017). Epistemic Fluency and Professional Education: Innovation, Knowledgeable Action and Actionable Knowledge. Springer, p.376
  47. 47. 47 Arguably, education is in transition to a new KI. The seams of the old KI are under stress. Rapid technological change stresses the systems with more inertia. How can we manage this?
  48. 48. What can we expect to see when a KI is in transition? 48 “social norms, relationships, and ways of thinking, acting, and working” are impacted “…when they change, authority, influence, and power are redistributed.” “new kinds of knowledge work and workers displace old ones” “increased access for some may mean reduced access for others”
  49. 49. KI helps explain contemporary concerns… 49 outsourcing of student feedback to machines alarm over the political agendas driving the collection of educational data concern over the commercial drivers/owners behind data worries that use of analytics/AI = dated pedagogy
  50. 50. KI helps explain our current immaturity… 50 poor interoperability between platforms early products with simplistic dashboards that don’t count what really counts in learning growing recognition of the data illiteracy in educational institutions universities discovering they have limited control over how they can access their students’ data from cloud platforms
  51. 51. KI helps explain our current immaturity… 51 use of machine learning with no awareness of data and algorithmic bias poor grounding of learning analytics in theories of learning an early fascination (borrowed from business) with predicting student failure (assumes the past can and should predict the future) excessive weight placed on computational performance with less attention to educational outcomes
  52. 52. How to respond as researchers? 52 “social scientists cannot remain simple bystanders or critics of the current transformations, which will not be reversed; instead, we need research practices that can help innovate, rethink, and rebuild.”
  53. 53. How to respond as researchers? 53 “treat it as a design opportunity, create a cadre at the interface between scientists and software, and use participatory design techniques.” So what does that look like?
  54. 54. 54 2 historical lenses: Knowledge Infrastructures (Paul Edwards) Pasteur’s Quadrant (Donald Stokes)
  55. 55. 55 Donald Stokes: Pasteur’s Quadrant
  56. 56. Donald Stokes: Pasteur’s Quadrant • 56 • Edison: invent commercially viable electric lighting • Early, pre-theoretical taxonomising of phenomena • Bohr: atomic structure; quantum theory • Pasteur: understand and control bacteria in order to prevent disease • Manhattan Project: atomic bomb • John Maynard Keynes: economics
  57. 57. 57 Locating Learning Sciences & Learning Analytics. BOHR’s Quadrant: Learning Sciences research into the foundational constructs and processes underpinning learning; Data Science/AI research into new approaches to machine learning, algorithm optimisation, mitigating bias, etc. EDISON’s Quadrant: Learning Analytics to improve outcomes in specific contexts (may or may not draw on current theory, but doesn’t feed back to foundational research concepts). PASTEUR’s Quadrant: Learning Sciences research-based educational intervention in specific contexts, reflecting on implications for theory and establishing sustainable practices (e.g. DBR; DBIR; RPPs; Improvement Science; Collaborative Data-intensive Improvement)
  59. 59. 59 Locating Learning Sciences & Learning Analytics (as slide 57), adding to PASTEUR’s Quadrant the role of Learning Analytics: design and deploy analytics that demonstrate how theory can inspire models, algorithms, code, user experiences, teaching practices, and ultimately, learning. The ability to formally model theoretical concepts, and shape learning outcomes, advances theories (as in other fields)
  60. 60. The analytics innovator’s dilemma… 60 PASTEUR’s Quadrant: USE-inspired foundational research. The innovation gulf facing learning analytics (and much ed-tech) research: 1. Researchers develop novel forms of student-facing analytics, but… 2. USE-inspired research requires USERS, and there’ll be few users without robust infrastructure. 3. So, how to create a KI that accelerates the transition from analytics innovations to embedded infrastructure?
  61. 61. Hybrid analytics innovation + service centres 61 University of Technology Sydney Connected Intelligence Centre University of Michigan Digital Innovation Greenhouse EDUCAUSE Review, Mar/Apr 2018 https://er.educause.edu/articles/2018/3/architecting-for-learning-analytics-innovating-for-sustainable-impact
  62. 62. CIC’s organisational position (org chart: VC; DVC Research: Faculty, School, Centre, Academics; DVC Education: CIC; DVC Operations: IT, BI Analytics, LMS Analytics). HYBRID INNOVATION/SERVICES CENTRE: analytics innovation in a non-faculty centre; reporting to DVC (Education & Students); staffed by academics + admin team; designed and launched the Master of Data Science & Innovation and a PhD program
  63. 63. CIC skillset: interpersonal skills for the Board Room (VC/DVCs/Deans/Directors), the Common Room (academic staff) and the Server Room (IT Division), plus Education, Learning Design, Interface Design, Programming, Web Development, Text Analytics, Machine Learning, Statistics, Visualisation, Decision-Support, Sensemaking, Creativity & Risk, Participatory Design
  64. 64. Advantages that this org structure brings Operating within the DVC’s Office enables close coupling with student services and teaching innovation Baseline funding provides invaluable stability for planning projects and staff Reporting directly to a DVC, and talking directly to other operational directors, gets stuff done Operating outside a faculty provides agility for decision-making, and helpful neutrality
  65. 65. 65 2 design lenses: Analytic Accountability Cycle (zoom in on the analytics design cycle); Design Practices (zoom in on the material practices of analytics design)
  66. 66. Expertises/stakeholders and key transitions in designing a Learning Analytics system (diagram): Educational/Learning Sciences Researcher ↔ Learning Theory; Learning Analytics Researcher ↔ Data and Algorithm (IF… THEN…); Programmer ↔ Software, Hardware and User Interface; Educator and Learner ↔ Educational Insights and Learning Outcomes
  67. 67. Algorithmic accountability [deep dive] 67 http://simon.buckinghamshum.net/2016/03/algorithmic-accountability-for-learning-analytics
  68. 68. (Same diagram as slide 66) Accountability in terms of: Computer Science
  69. 69. (Same diagram, adding Training Data) Accountability in terms of: Data Science
  70. 70. (Same diagram, adding the Design Process) Accountability in terms of: User-Centred Design
  71. 71. (Same diagram) Accountability in terms of: Learning Sciences
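One concrete way to read this accountability chain: each IF…THEN rule in an analytic can carry explicit provenance, so the links back to theory, evidence and design decisions stay inspectable. A minimal sketch follows (illustrative only: the class, fields, rule and citations are all invented, not any actual system’s API):

```python
# Sketch (illustration only): making the IF...THEN layer of a learning
# analytic inspectable, so each rule can be accounted for in terms of
# the theory, data and design decisions behind it.
from dataclasses import dataclass

@dataclass
class AccountableRule:
    condition: str     # human-readable IF...
    action: str        # ...THEN
    theory: str        # learning-science grounding
    evidence: str      # training data / validation study
    design_note: str   # who signed off, and why

    def explain(self):
        return (f"IF {self.condition} THEN {self.action}\n"
                f"  grounded in: {self.theory}\n"
                f"  evidence:    {self.evidence}\n"
                f"  design:      {self.design_note}")

rule = AccountableRule(
    condition="no forum posts for 14 days AND quiz average < 50%",
    action="prompt a tutor check-in (never auto-message the student)",
    theory="engagement/disengagement literature (hypothetical citation)",
    evidence="validated on an earlier cohort's logs (hypothetical)",
    design_note="co-designed with tutors; reviewed by ethics board",
)
print(rule.explain())
```

The design choice being illustrated: accountability is cheapest when provenance is a first-class field of the rule itself, rather than something reconstructed after a complaint.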
  72. 72. 72 So how can we frame the theory-analytics relationship?
  73. 73. Why are the Learning Sciences missing so often from Learning Analytics? 73 April 28 2016, LAK16 Keynote, Paul Kirschner, “Learning Analytics: Utopia or Dystopia”: https://youtu.be/8OjmnOiMIKI “Put the learning back into learning analytics” The Learning Sciences have much to offer (helpful examples). Ignore us at your dystopic peril…
  74. 74. One of the most mature fusions of qualitative and quantitative methods 74 http://www.quantitativeethnography.org https://www.youtube.com/watch?v=LjcfGSdIBAk LAK18 Keynote Address
  75. 75. The quant/qual distinction has dissolved. Each has methods to enrich the other 75 “In the age of Big Data, we have an opportunity to expand the tools of ethnography — and history, and literary analysis, and philosophy, and any discipline that analyzes meaning — by using statistical techniques not to supplant grounded understanding, but to expand it. To use additional warrants to support the stories that we tell about the things people do, and reasons they do them.” David Williamson Shaffer, Quantitative Ethnography, p.398
  76. 76. 76 Machine learning ≠ atheoretical empiricism. “The strong emphasis on empiricism grounded in big data advocated by data mining researchers can sometimes be misunderstood as an advocacy of atheoretical approaches.” “I caution against a bottom-up, atheoretical empiricism. In contrast, I would stress the role of rich theoretical frameworks for motivating operationalizations of variables” “[strive for] intensive exchange between the Learning Sciences and neighboring fields of data mining, computational linguistics, and other areas of computational social sciences.” Carolyn Rosé, 2018. Rosé, C. P. (2018). Learning Analytics in the Learning Sciences. Invited chapter in F. Fischer, C. Hmelo-Silver, S. Goldman, & P. Reimann (Eds.), International Handbook of the Learning Sciences, Taylor & Francis.
  77. 77. 77 Data science opens new ways to observe learning. “Computational tools, which include machine learning approaches, can serve as lenses through which researchers may make observations that contribute to theory, as machinery used to encode operationalizations of theoretical constructs, and as languages to build assessments that measure the world in terms of these operationalizations.” […] “They are more limited in the sense that in applying them, a reduction of the richness of signal in the real world occurs as a necessary discretization takes place. However, they are also more powerful in the sense of the speed and ubiquity of the observation that is possible.” Carolyn Rosé, 2018, op. cit.
  78. 78. Learning Sciences meets Learning Analytics: Helpful accounts of justified concerns, and demonstrable synergies, e.g… 78 Wise & Cui (ICLS 2018): 7 principles for Learning Sciences-aware analytics Jivet, et al. (LAK 2018): 3 ways to ground learning dashboards in the Learning Sciences Marzouk, et al. (AJET 2016): grounding automated student feedback in Self-Determination Theory Rummel, et al. (IJAIED 2016): AI+CSCL could be great, or end very badly Wise and Schwarz (IJCSCL 2018): 8 provocations for CSCL, inc. 2 debating the role of computational methods
  79. 79. 79 How do we bridge “from clicks to constructs”?
  80. 80. Proxies for “Conscientiousness”? Shute, V. J. and M. Ventura (2013). Stealth Assessment: Measuring and Supporting Learning in Video Games. Cambridge, MA, MIT Press. Figure 5 from report to The John D. and Catherine T. MacArthur Foundation Reports on Digital Media and Learning http://myweb.fsu.edu/vshute/pdf/Stealth_Assessment.pdf
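To illustrate what a behavioural proxy might look like in code, here is a minimal sketch (the event log and the “persistence after failure” indicator are invented for illustration; this is not Shute & Ventura’s actual evidence model):

```python
# Minimal sketch (hypothetical event names, not Shute & Ventura's code):
# one behavioural proxy for "conscientiousness": persistence, estimated
# as the share of failed game levels the player re-attempts.

events = [  # (player, level, outcome) drawn from an imagined game log
    ("p1", "L1", "fail"), ("p1", "L1", "fail"), ("p1", "L1", "pass"),
    ("p1", "L2", "fail"),                      # abandoned after one failure
    ("p1", "L3", "fail"), ("p1", "L3", "pass"),
]

def persistence(log, player):
    """Fraction of this player's failed levels that were ever retried."""
    failed, retried, seen_fail = set(), set(), set()
    for p, level, outcome in log:
        if p != player:
            continue
        if level in seen_fail:      # any later attempt at a failed level = retry
            retried.add(level)
        if outcome == "fail":
            failed.add(level)
            seen_fail.add(level)
    return len(retried & failed) / len(failed) if failed else None

print(persistence(events, "p1"))  # 2 of 3 failed levels retried -> 0.666...
```

The stealth-assessment point is that indicators like this are then combined in a statistical evidence model, never read off as a score on their own.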
  84. 84. 84 Milligan, S. and Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based interpretation. Journal of Learning Analytics, 3(2), 88– 115. http://dx.doi.org/10.18608/jla.2016.32.5 From clicks to constructs in MOOCs Defining a C21 capability of “Crowd-Sourced Learning”
  85. 85. 85 © All rights reserved, Sandra Milligan
  88. 88. 88 © All rights reserved, Sandra Milligan
  89. 89. 89 From clicks to constructs in MOOCs Defining a C21 capability of Crowd-Sourced Learning (Part of a larger map) Milligan, S. and Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based interpretation. Journal of Learning Analytics, 3(2), 88– 115. http://dx.doi.org/10.18608/jla.2016.32.5
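In the spirit of Milligan & Griffin’s measurement-based approach, a minimal sketch of the first step from clicks to constructs (the event-to-indicator mapping and names are invented; the real work then calibrates such indicators with a measurement model rather than raw counts):

```python
# Sketch in the spirit of "clicks to constructs" (indicator names and
# mapping invented for illustration): aggregate raw MOOC clickstream
# events into indicators of a latent "crowd-sourced learning" capability.

INDICATORS = {  # hypothetical mapping: event type -> construct indicator
    "forum_reply":      "engages_crowd",
    "peer_review_done": "evaluates_peers",
    "resource_shared":  "contributes_resources",
}

def indicator_profile(clickstream):
    """Count evidence for each indicator from a list of event types."""
    profile = {ind: 0 for ind in INDICATORS.values()}
    for event in clickstream:
        if event in INDICATORS:
            profile[INDICATORS[event]] += 1
    return profile

clicks = ["video_play", "forum_reply", "peer_review_done",
          "forum_reply", "resource_shared", "video_pause"]
print(indicator_profile(clicks))
# {'engages_crowd': 2, 'evaluates_peers': 1, 'contributes_resources': 1}
```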
  93. 93. (Same diagram as slide 66) Recap: a productive, multivocal dialogue is unfolding between Learning Sciences & Learning Analytics
  94. 94. (Same diagram) This advances the algorithmic accountability challenge in education — clarifies the key relationships, and how they have been designed
  95. 95. 95 Can we learn from the history of HCI theory? From narrow experimental science to rich, timely design input. The definition of “theory” has evolved: it has matured from a focus on cognitive psychology’s concepts and experimental methods to broader, richer theories that • move from modelling individual mental states to cognition as embodied, social, distributed • handle the complexity of real use contexts • inform design on realistic timescales
  96. 96. The diverse contributions of “theory” in HCI research and design In what senses do Learning Analytics use theory — and what forms do the Learning Sciences offer to designers?
  97. 97. 97 2 design lenses: Analytic Accountability Cycle (zoom in on the analytics design cycle); Design Practices (zoom in on the material practices of analytics design)
  98. 98. 98 Design process close-up: Co-designing a team feedback timeline (Nursing Simulation)
  99. 99. Co-designing an automatic feedback tool for nurses, with students and academics: card-sorting exploration; sketching learner/data journeys to show where analytics might help; giving voice to students, teachers and designers by adapting techniques from co-design; helping stakeholders understand data privacy, collection and use in learning analytics tools. Prieto-Alvarez, C. G., Martinez-Maldonado, R., & Anderson, T. (2018). Co-designing Learning Analytics Tools with Learners. In J. M. Lodge, J. Cooney Horvath, & L. Corrin (Eds.), Learning Analytics in the Classroom: Translating Learning Analytics Research for Teachers (Vol. 1). London: Routledge.
  100. 100. Multimodal student data from simulations
  101. 101. Automated visualisation of nursing team activity (timeline showing: patient’s state changes; 3 nursing roles; use of a device; administering medication)
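A minimal sketch of the kind of timeline visualisation described here (the data, roles and actions are invented, and matplotlib is an assumed plotting tool, not necessarily what the project used):

```python
# Illustrative sketch only (invented data): a timeline of coded
# nursing-simulation actions per role, one horizontal bar per action.
import matplotlib.pyplot as plt

actions = [  # (role, start_sec, duration_sec, action)
    ("Nurse A", 10, 15, "assess patient"),
    ("Nurse B", 20, 10, "use device"),
    ("Nurse C", 35, 12, "administer medication"),
    ("Nurse A", 50, 8,  "call for help"),
]
roles = sorted({a[0] for a in actions})

fig, ax = plt.subplots(figsize=(8, 2.5))
for role, start, dur, label in actions:
    y = roles.index(role)
    ax.barh(y, dur, left=start, height=0.6)       # one bar per coded action
    ax.text(start, y, label, va="center", fontsize=8)
ax.set_yticks(range(len(roles)))
ax.set_yticklabels(roles)
ax.set_xlabel("Simulation time (s)")
ax.set_title("Team activity timeline (illustrative)")
plt.tight_layout()
plt.show()
```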
  102. 102. 102 Design process close-up: Bringing theory, learning design and co-design together: automated feedback on academic writing
  103. 103. Automated formative feedback on writing (Civil Law). Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á. and Wang, X. (2018). Designing Academic Writing Analytics for Civil Law Student Self-Assessment. International Journal of Artificial Intelligence in Education, 28(1), 1-28. DOI: https://doi.org/10.1007/s40593-016-0121-0
  104. 104. Law academic annotates automated feedback in Word. Knight et al. (2018), op. cit.
  106. 106. Align the assessment rubric with the textual features (i.e. rhetorical moves) that the tool can identify. Knight et al. (2018), op. cit.
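To illustrate the idea of matching rubric criteria to detectable rhetorical moves, a minimal sketch (the move names and regex patterns are invented; the actual tool uses Ágnes Sándor’s concept-matching NLP, not keyword regexes):

```python
# Sketch of the idea only (patterns invented; the real system uses
# concept-matching NLP, not these regexes): flag sentences that make
# rhetorical moves an assessment rubric asks for.
import re

MOVES = {  # hypothetical rubric-aligned rhetorical moves
    "position":  r"\b(i|we) (argue|contend|submit)\b",
    "contrast":  r"\b(however|on the other hand|conversely)\b",
    "authority": r"\b(the court held|pursuant to)\b",
}

def tag_sentences(text):
    """Yield (move, sentence) for each sentence matching a move pattern."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for move, pattern in MOVES.items():
            if re.search(pattern, sentence, re.IGNORECASE):
                yield move, sentence

essay = ("The court held that the clause was void. "
         "However, later authority points the other way. "
         "We argue the better view favours the tenant.")
for move, sent in tag_sentences(essay):
    print(f"[{move}] {sent}")
# [authority] The court held that the clause was void.
# [contrast] However, later authority points the other way.
# [position] We argue the better view favours the tenant.
```

Once moves are detectable, each rubric criterion can point at the moves that evidence it, which is the alignment this slide describes.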
  107. 107. Evaluate with students: what worked well? …but it was far from perfect: see the paper for detailed evaluation results. Knight et al. (2018), op. cit.
  108. 108. Automated feedback on reflective writing. Reflection is critical to the integration of academic + experiential knowledge. This is where you disclose what you’re uncertain about, and how you’ve changed, in the first person. Scholarship clarifies the hallmarks of deeper reflective writing. Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C. and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. Proceedings of the 7th International Conference on Learning Analytics & Knowledge, March 13-17, 2017, Vancouver, BC, Canada. ACM Press. DOI: http://dx.doi.org/10.1145/3027385.3027436
  109. 109. Learning reflective writing: distillation of theory and pedagogy into a framework. Gibson et al. (2017), op. cit.
  111. 111. Learning reflective writing: simplification of the framework → a visual language. Gibson et al. (2017), op. cit.
  112. 112. Information design + Interface design The key to automated annotations of the reflection
  113. 113. Information design + Interface design
  114. 114. Example design problem: the initial detection of affect was poorly calibrated, red-lining words that clearly don’t reflect author affect/emotion in their writing. http://heta.io/how-can-writing-analytics-researchers-rapidly-codesign-feedback-with-educators
  115. 115. Participatory prototyping builds trust in the NLP http://heta.io/how-can-writing-analytics-researchers-rapidly-codesign-feedback-with-educators Learning Analytics researchers work with academics (3 hour workshop) Goal: calibrate the parser detecting affect in reflective writing, working through sample texts Rapid prototyping with a Python notebook, then integrated into end-user tool for further testing
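The slide above describes rapid prototyping in a Python notebook. A minimal sketch of what such a tunable, workshop-friendly affect matcher could look like (the lexicon, exclusion phrases and sample sentences are all invented; this is not the project’s actual parser):

```python
# Minimal sketch of the kind of notebook cell such a workshop might use
# (lexicon, exclusions and texts invented; not the project's real parser):
# academics tune which words count as author affect, re-run, and inspect.

AFFECT_TERMS = {"anxious", "frustrated", "excited", "overwhelmed"}
EXCLUDE_PHRASES = {"patients were anxious"}  # affect, but not the author's

def flag_affect(sentence):
    """Return affect words to highlight, unless an exclusion applies."""
    lowered = sentence.lower()
    if any(phrase in lowered for phrase in EXCLUDE_PHRASES):
        return []
    return [w for w in AFFECT_TERMS if w in lowered]

samples = [
    "I felt overwhelmed during the first placement.",
    "Several patients were anxious before surgery.",   # should NOT be flagged
]
for s in samples:
    print(flag_affect(s), "<-", s)
# ['overwhelmed'] <- I felt overwhelmed during the first placement.
# [] <- Several patients were anxious before surgery.
```

The value in the workshop format is the loop, not the code: academics propose a change to the lexicon or exclusions, the cell is re-run on sample texts, and trust in the NLP grows with each visible correction.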
  116. 116. Transparency in the analytics infrastructure: Academic Writing Analytics platform now open source https://utscic.edu.au/open-source-writing-analytics Higher Ed. Text Analytics Project: http://heta.io Demo: http://acawriter-demo.utscic.edu.au
  117. 117. Bundling analytics with educator resources: Educator/Student Resources; Learning Design; Evaluation; Analytics Capability. Integrated Writing Activities with Writing Analytics: http://heta.io/resources Acknowledgement: https://www.freevector.com/jigsaw-puzzle#
  118. 118. 118 how do we handle inherent imperfection in analytics for complex competencies?
  119. 119. “Embracing Imperfection in Learning Analytics” 119 Cognitive dissonance when feedback violates the student’s expectations: “…as D’Mello and Graesser [15] demonstrate, it is when the student experiences dissonance because the analytics fail to match their expectations that they are likely to reflect on why they think the machine is wrong. We believe that this form of critical questioning is more likely to happen if the student has been given an underlying reason to be a little distrustful of the classifier.” Kirsty Kitto, Simon Buckingham Shum and Andrew Gibson (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. ACM, New York, NY, USA. https://doi.org/10.1145/3170358.3170413
  120. 120. “Embracing Imperfection in Learning Analytics” 120 Mindful engagement with technology: “Salomon et al. [42] are concerned that students move beyond mindless use of potentially powerful cognitive tools, and instead employ “nonautomatic, effortful, and thus metacognitively guided processes” [p4]. This is precisely the role that we have been arguing that “imperfect analytics” can help to facilitate.” Kitto, Buckingham Shum and Gibson (2018), op. cit.
  121. 121. “Embracing Imperfection in Learning Analytics” 121 1. Robust learning design ensures that the activity involving automated feedback is meaningful whether or not the technology always works (Knight et al., ICLS 2018 crossover paper). 2. Explicit encouragement — in student briefings, and in the user interface — to push back if they disagree with the feedback. Kitto, Buckingham Shum and Gibson (2018), op. cit.
  122. 122. Macro-level Critical infrastructure studies reveal how KI is inherently political, social and technical – an evolved system of systems …but there is a risk that unless you have skin in the game, critics of Learning Analytics/AI will be perceived as just ‘shouting from the touchline’ (from Bohr’s Quadrant)
  123. 123. Micro-level We need ‘insider’ accounts of how design practices can bring the different disciplines and stakeholders together, with integrity
  124. 124. 124 “What would data science look like if its key critics were engaged to help improve it? …and how might critiques of data science improve with an approach that considers the day-to-day practices of data science?” Gina Neff, Anissa Tanweer, Brittany Fiore-Gartland and Laura Osburn (2017). Critique and Contribute: A Practice-Based Framework for Improving Critical Data Studies and Data Science. Big Data, 5(2), 2017. https://doi.org/10.1089/big.2016.0050
  125. 125. Fairness Accountability & Transparency 125 https://fatconference.org
  126. 126. Critical perspectives on ML practices 126 https://mitpress.mit.edu/books/machine-learners
  127. 127. How do we ensure that LS and LA are on the field, on the same team, and shaping the game? Are we equipped to shape the new knowledge infrastructure? formalisable theory? sufficiently agile methods? suitably skilled professionals?
  128. 128. Take home message: Learning Analytics – with the help of the Learning Sciences – must develop design practices that bring the different disciplines and stakeholders together, with integrity This is an exhilarating time to be shaping educational research and practice!
