Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the transition from analytics innovation to infrastructure?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
UCL joint Institute of Education (London Knowledge Lab) & UCL Interaction Centre seminar, 20th April 2016. Replay: https://youtu.be/0t0IWvcO-Uo
Algorithmic Accountability & Learning Analytics
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
ABSTRACT. As algorithms pervade societal life, they are moving from the preserve of computer science to becoming the object of far wider academic and media attention. Many are now asking how the behaviour of algorithms can be made “accountable”. But why are they “opaque” and to whom? As this vital discussion unfolds in relation to Big Data in general, the Learning Analytics community must articulate what would count as meaningful questions and satisfactory answers in educational contexts. In this talk, I propose different lenses that we can bring to bear on a given learning analytics tool, to ask what it would mean for it to be accountable, and to whom. From a Human-Centred Informatics perspective, it turns out that algorithmic accountability may be the wrong focus.
BIO. Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, which he joined in August 2014 to direct the new Connected Intelligence Centre. Prior to that he was at The Open University’s Knowledge Media Institute 1995-2014. He brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI (PhD, York) where he worked with Rank Xerox Cambridge EuroPARC on Design Rationale. He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin wrote Constructing Knowledge Art (2015). He is active in the emerging field of Learning Analytics and is a co-founder of the Society for Learning Analytics Research, Compendium Institute and Learning Emergence network.
Presented by the Assessment Research Centre
and the Melbourne Centre for the Study of Higher Education
Teaching, Assessment and Learning Analytics: Time to Question Assumptions
Simon Buckingham Shum
Professor of Learning Informatics, and Director of the Connected Intelligence Centre (CIC)
University of Technology Sydney
When: 11.30–12.30 pm, Wed. 13 Sep 2017
Where: Frank Tate Room, Level 9, 100 Leicester St, Carlton
This will be a non-technical talk accessible to a broad range of educational practitioners and researchers, designed to provoke a conversation that provides time to question assumptions. The field of Learning Analytics sits at the convergence of two fields: Learning (including learning technology, educational research and learning/assessment sciences) and Analytics (statistics; visualisation; computer science; data science; AI). Many would add Human-Computer Interaction (e.g. participatory design; user experience; usability evaluation) as a differentiator from related fields such as Educational Data Mining, since the Learning Analytics community attracts many with a concern for the sociotechnical implications of designing and embedding analytics in educational organisations.
Learning Analytics is viewed by many educators with the same suspicion they reserve for AI or “learning management systems”. While in some cases this is justified, I will question other assumptions with some learning analytics examples which can serve as objects for us to think with. I am curious to know what connections/questions arise when these are shared.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he was appointed in August 2014 to direct the new Connected Intelligence Centre. Previously he was Professor of Learning Informatics and an Associate Director at The UK Open University’s Knowledge Media Institute. He is active in the field of Learning Analytics as a co-founder and former Vice President of the Society for Learning Analytics Research, and Program Co-Chair of LAK18, the International Learning Analytics and Knowledge Conference. Previously he co-founded the Compendium Institute and Learning Emergence networks. Simon brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI Design Argumentation (PhD, York). He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin, wrote Constructing Knowledge Art (2015). He was recently appointed as a Fellow of The RSA. http://Simon.BuckinghamShum.net
Towards Contested Collective Intelligence
Simon Buckingham Shum, Director Connected Intelligence Centre, University of Technology Sydney
This talk is to open up a dialogue with the important work of the SWARM project. I’ll introduce the key ideas that have shaped my work on interactive software tools to make thinking visible, shareable and contestable, some of the design prototypes, and some of the lessons we’ve learnt en route.
Jurgen Schulte is an award-winning academic at UTS who has been using an adaptive learning platform (WileyPLUS Orion) in combination with post-processing of the data. In this talk he shares some of his experiences.
Valedictory Lecture
Making Thinking Visible in Complex Times
Prof Simon Buckingham Shum
This event took place on 15th July 2014 at 4:00pm (15:00 GMT)
Berrill Lecture Theatre, The Open University, Walton Hall Campus, Milton Keynes, United Kingdom
In 1968 Doug Engelbart gave “The Mother of All Demos”: a disruptive technology lab had quietly invented the mouse, collaborative on-screen editing, hyperlinks, video conferencing, and much more. This was the start of the paradigm shift, still unfolding: computers were no longer to be low level number crunchers, but might mediate and mould the highest forms of human thinking, both individual and collective. In this talk I review nearly 19 years in KMi chasing this vision with many colleagues, inventing tools for making dialogue, argument and learning processes visible in different ways. How do we harness such tools to tackle, not aggravate, the fundamental challenge facing the educational system, and its graduates: to think broadly and deeply, and to thrive amidst profound uncertainty and complexity? These are the hallmarks of the OU — and indeed, all true education from primary school onwards.
Ready for Web 3.0: How will Linked Data benefit Higher Education?
Su White
The potential impact of widespread use of linked data in Higher Education is immense. Everyday understanding of the power of placing raw data in the public domain is growing. Linked data promises to transform education: interconnecting administrative data, enriching teaching resources, and providing tools and resources for learners and researchers alike.
Dr Su White is based in the Learning Societies Lab, in Electronics and Computer Science at the University of Southampton. A member of the JISC Infrastructure and Resources Committee and co-author of Semantic Technologies for Learning and Teaching, Su’s research interests include the impact of emerging technologies on Higher Education.
USING MRQAP TO ANALYSE THE DEVELOPMENT OF MATHEMATICS PRE-SERVICE TRAINEES’ C...
Christian Bokhove
This paper looks at a data analysis method for analyzing longitudinal network data called MRQAP. We describe a dataset from a study on the development of peer networks of one cohort of pre-service mathematics trainees in the south of England and apply the MRQAP method to its four timepoints. We include attributes for gender, study programme, trust and self-efficacy. The analysis shows that MRQAP is a viable data analysis method for looking at the longitudinal development of networks. We conclude with a short discussion of further methodological challenges and limitations.
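For readers unfamiliar with the QAP family of methods, the core permutation idea can be sketched in a few lines. This is a toy illustration with made-up 4-node networks and a plain correlation, not the full MRQAP regression with double semi-partialling used in the paper:

```python
import numpy as np

def qap_correlation(x, y, n_perm=1000, seed=0):
    """Simple QAP test: correlate the off-diagonal entries of two
    adjacency matrices, then build a null distribution by jointly
    permuting the rows and columns of one matrix."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    mask = ~np.eye(n, dtype=bool)                # ignore self-ties
    obs = np.corrcoef(x[mask], y[mask])[0, 1]
    null = np.empty(n_perm)
    for i in range(n_perm):
        p = rng.permutation(n)
        null[i] = np.corrcoef(x[p][:, p][mask], y[mask])[0, 1]
    # two-sided p-value: how often a permuted correlation is as extreme
    pval = (np.abs(null) >= abs(obs)).mean()
    return obs, pval

# Hypothetical 4-node peer networks at two consecutive timepoints
t1 = np.array([[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]])
t2 = np.array([[0,1,1,0],[1,0,0,0],[1,0,0,1],[0,0,1,0]])
r, p = qap_correlation(t1, t2)  # p estimated from 1,000 permutations
```

The joint row-and-column permutation preserves each network’s structure while breaking its alignment with the other, which is why QAP-style tests are used instead of naive dyad-level significance tests on network data.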
Developing a multiple-document-processing performance assessment for epistem...
Simon Knight
http://oro.open.ac.uk/41711/
The LAK15 theme “shifts the focus from data to impact”, noting the potential for Learning Analytics based on existing technologies to have scalable impact on learning for people of all ages. If that demand and potential for scalability are to be met, the challenge of addressing higher-order thinking skills must be tackled. This paper discusses one such approach – the creation of an analytic and task model to probe epistemic cognition in complex literacy tasks. The research uses existing technologies in novel ways to build a conceptually grounded model of trace-indicators for epistemic commitments in information-seeking behaviors. We argue that such an evidence-centered approach is fundamental to realizing the potential of analytics, which should maintain a strong association with learning theory.
Ethical and Legal Issues in Computational Social Science - Lecture 7 in Intro...
Lauri Eloranta
Seventh lecture of the course CSS01: Introduction to Computational Social Science at the University of Helsinki, Spring 2015 (http://blogs.helsinki.fi/computationalsocialscience/).
Lecturer: Lauri Eloranta
Questions & Comments: https://twitter.com/laurieloranta
Complex Social Systems - Lecture 5 in Introduction to Computational Social Sc...
Lauri Eloranta
Fifth lecture of the course CSS01: Introduction to Computational Social Science at the University of Helsinki, Spring 2015 (http://blogs.helsinki.fi/computationalsocialscience/).
Lecturer: Lauri Eloranta
Questions & Comments: https://twitter.com/laurieloranta
Computational Social Science – what is it and what can(‘t) it do?
Christian Bokhove
Title: Computational Social Science – what is it and what can(‘t) it do?
What is your talk about?
In Computational Social Science (CSS) we use computer science algorithms to analyse qualitative data at scale. In this talk I define CSS, describe what the opportunities and barriers are in using such methods, and give examples from published research, for example on analysing thousands of Ofsted documents.
What are the key messages of your talk?
The use of CSS methods makes it possible to analyse some data sources at scale that would previously be unrealistic to analyse ‘by hand’.
What are the implications for practice or research from your talk?
CSS allows both more qualitative and more quantitative researchers to analyse unstructured data sources at scale.
Short Biography
Dr Christian Bokhove is an Associate Professor in Mathematics. In his research, he combines conventional qualitative and quantitative methods with novel computational methods.
Towards a Social Learning Space for Open Educational Resources
SocialLearn, Open U
OpenEd 2010, Barcelona
Simon Buckingham Shum & Rebecca Ferguson
Knowledge Media Institute & Institute of Educational Technology, The Open University, Milton Keynes, UK
Learning.Analytics for Learning.Futures?
Simon Buckingham Shum and Ruth Deakin Crick
Connected Intelligence Centre (CIC), UTS
The social, technical and political challenges we face as a society demand new ways of thinking and working which are collaborative, holistic and resilient. As we unpack what these words mean, the implications for a university – indeed any learning organisation – run deep. At the core of the paradigm shift we see the need for learners (at all levels) to take increasing responsibility for their learning in authentic contexts: to become resilient agents of their own learning trajectories; to think holistically and to make sense of complex data. Far from being solely ‘graduate attributes’, the same qualities are needed by us: if we can’t model these qualities ourselves, we can’t teach them; if we can’t assess them authentically, we have no evidence base and we can’t provide formative feedback. This line of argument shapes how CIC is conceiving learning analytics (computer-supported tools to help learners and educators gather, analyse, visualise and act on learners’ data) and collective intelligence (networking tools to build a learning community’s evidence-base). In this talk we will give glimpses of these approaches in action, we’ll hear from learners and educators on what this paradigm shift feels like, and through several activities, we invite you to imagine how we can collaborate to test these concepts across UTS, as we move into Learning.Futures.
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Talk by Richard Andrews @ ticEDUCA2010 - I International Conference on ICT and Education, Institute of Education of the University of Lisbon
Richard Andrews
Professor in English
Department of Learning, Curriculum and Communication, Institute of Education, University of London
Kirsty Kitto, Simon Buckingham Shum, and Andrew Gibson. (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. (ACM, New York, NY, USA). https://doi.org/10.1145/3170358.3170413
Open Access: http://simon.buckinghamshum.net/2018/01/embracing-imperfection-in-learning-analytics
Abstract: Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
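The paper’s core claim, that higher classification accuracy is not always the right goal, can be illustrated with a toy calculation. The labels below are hypothetical, imagining a learner-facing detector of reflective writing:

```python
def scores(y_true, y_pred):
    """Accuracy and recall for a binary learner-facing detector."""
    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    rec = tp / (tp + fn) if tp + fn else 0.0
    return acc, rec

# 10 sentences, of which 2 are genuinely reflective (label 1)
y_true = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
silent = [0] * 10                        # never flags anything
eager  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]  # over-flags, but misses nothing

print(scores(y_true, silent))  # (0.8, 0.0): "more accurate", useless feedback
print(scores(y_true, eager))   # (0.7, 1.0): less accurate, surfaces every case
```

The “silent” detector wins on accuracy yet never shows the learner anything, which is one concrete sense in which computational “imperfection” can serve learning better than accuracy-chasing.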
Methodological innovation for mathematics education researchChristian Bokhove
In this talk I will highlight how innovative research methods can help us in answering research questions for mathematics education. Some examples will be:
The use of social network analysis for communication networks of trainee mathematics teachers, as well as interactions in the mathematics classroom.
The use of sequence analysis for analysing data from an online mathematics tool.
The usefulness of open approaches to improve research transparency.
I will draw these projects together to sketch some interesting directions for mathematics education research.
The Generative AI System Shock, and some thoughts on Collective Intelligence ...Simon Buckingham Shum
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Deliberative Democracy as a strategy for co-designing university ethics aro...Simon Buckingham Shum
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP) which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders, has been shown to cultivate high quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into yo...Simon Buckingham Shum
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for ThinkingSimon Buckingham Shum
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your TeachingSimon Buckingham Shum
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions).
This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Towards Collaboration Translucence: Giving Meaning to Multimodal Group DataSimon Buckingham Shum
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buck- ingham Shum.. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of ACM CHI conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will hear and debate not only broad perspectives on the terrain, but will have been exposed to some real life specimens, and caught glimpses of the future ecosystem.
Opening to the inaugural workshop on Learning Analytics in Schools held at LAK18: International Conference on Learning Analytics & Knowledge, Sydney. http://lak18.solaresearch.org
Prof. Simon Buckingham Shum
Prof. Ruth Deakin Crick
Summer@UTS Workshop, 8th Feb. 2018
Connected Intelligence Centre
https://utscic.edu.au/event/resilience-complexity
A WORKSHOP FOR UTS STUDENTS
Simon Buckingham Shum (CIC Director) & Kailash Awati (Senior Lecturer in Data Science)
This half-day workshop will provide you with hands-on experience mapping issues, ideas and arguments using the research-validated Compendium visual hypertext tool for mapping wicked problems. No technical expertise required.
Compendium QuickStart Guide: http://simon.buckinghamshum.net/wp-content/uploads/2017/12/Compendium_QuickStart.pdf
Synthetic Fiber Construction in lab .pptxPavel ( NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers play a crucial role in modern society, impacting various aspects of daily life, industry, and the environment. ynthetic fibers are integral to modern life, offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
Biological screening of herbal drugs: Introduction and Need for
Phyto-Pharmacological Screening, New Strategies for evaluating
Natural Products, In vitro evaluation techniques for Antioxidants, Antimicrobial and Anticancer drugs. In vivo evaluation techniques
for Anti-inflammatory, Antiulcer, Anticancer, Wound healing, Antidiabetic, Hepatoprotective, Cardio protective, Diuretics and
Antifertility, Toxicity studies as per OECD guidelines
Model Attribute Check Company Auto PropertyCeline George
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
Honest Reviews of Tim Han LMA Course Program.pptxtimhan337
Personal development courses are widely available today, with each one promising life-changing outcomes. Tim Han’s Life Mastery Achievers (LMA) Course has drawn a lot of interest. In addition to offering my frank assessment of Success Insider’s LMA Course, this piece examines the course’s effects via a variety of Tim Han LMA course reviews and Success Insider comments.
Acetabularia Information For Class 9 .docxvaibhavrinwa19
Acetabularia acetabulum is a single-celled green alga that in its vegetative state is morphologically differentiated into a basal rhizoid and an axially elongated stalk, which bears whorls of branching hairs. The single diploid nucleus resides in the rhizoid.
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdfTechSoup
In this webinar you will learn how your organization can access TechSoup's wide variety of product discount and donation programs. From hardware to software, we'll give you a tour of the tools available to help your nonprofit with productivity, collaboration, financial management, donor tracking, security, and more.
Macroeconomics- Movie Location
This will be used as part of your Personal Professional Portfolio once graded.
Objective:
Prepare a presentation or a paper using research, basic comparative analysis, data organization and application of economic information. You will make an informed assessment of an economic climate outside of the United States to accomplish an entertainment industry objective.
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...Levi Shapiro
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic
harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or
unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied
students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
Palestine last event orientationfvgnh .pptxRaedMohamed3
An EFL lesson about the current events in Palestine. It is intended to be for intermediate students who wish to increase their listening skills through a short lesson in power point.
1. Transitioning Education’s
Knowledge Infrastructure
Shaping Design or Shouting from the Touchline?
Simon Buckingham Shum
@sbuckshum • http://utscic.edu.au
Keynote Address, International Conference of the Learning Sciences
London Festival of Learning, June 2018
5.
amid justified public concern about data/algorithm ethics,
and academic concerns about computational methods,
how do we design infrastructure WE TRUST?
16.
Learning Analytics is therefore building bridges with…
Many disciplines…
LAK conference keynote speakers include:
Learning Sciences: Paul Kirschner, David Williamson Shaffer, Art Graesser
Psychometrics: Robert Mislevy
Information Visualisation: Katy Börner, Cristina Conati
IT Ethics: Mireille Hildebrandt
Critical studies of technology: Neil Selwyn
Writing analytics: Danielle McNamara
17.
Learning Analytics is therefore building bridges with…
LMS, Analytics Vendors & Publishers…
LAK conference and LA Summer Institutes offer:
Industry panels to explore the opportunities and tensions with academia
Industry researchers on the Doctoral Consortium
Commercial sponsors and exhibitors
Collaborative projects with industry
18.
Learning: learning sciences, educational research, teaching practice, curriculum design, learning design, pedagogy, assessment…
Analytics: data, statistics, classification, machine learning, text processing, visualisation, predictive models…
…not a straightforward dialogue
22. Who are Learning Analytics for?
Learning Sciences researchers!
1. Theory
2. Experiment
3. Simulation
4. Data-Intensive Science
http://FourthParadigm.org
23. Educational eResearch infrastructure
Education meets Cyberinfrastructure, eScience, eSocialScience, Grid computing, etc…
Bill Cope & Mary Kalantzis (2016). Big Data Comes to School: Implications for Learning, Assessment, and Research. AERA Open, 2 (2), April 1, 2016. Open Access: https://doi.org/10.1177/2332858416641907
Lina Markauskaite (2010). Digital Media, Technologies and Scholarship: Some Shapes of eResearch in Educational Inquiry. The Australian Educational Researcher, 37 (4), pp.79-101. Open Access: https://link.springer.com/content/pdf/10.1007%2FBF03216938.pdf
Educational research methods could change radically when:
• “sample size” is less relevant: N=All
• statistically sig. patterns are easy to find in such big data
• qualitative textual coding may be automated
• social ties can be tracked at scale
• etc…
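The second bullet can be made concrete with a quick simulation. This is a minimal sketch, not from the talk: the cohorts, effect size and test are invented for illustration. At N around a million, an educationally negligible difference of 0.01 standard deviations sails past conventional significance thresholds.

```python
# Minimal sketch (illustrative, not from the talk): at "N = All" scale,
# an educationally negligible effect easily reaches statistical significance.
import math
import random

random.seed(0)

def two_sample_z(a, b):
    """Large-sample z-test for a difference in means, two-tailed p via erfc."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    z = (ma - mb) / math.sqrt(va / len(a) + vb / len(b))
    p = math.erfc(abs(z) / math.sqrt(2))
    return z, p

n = 1_000_000
a = [random.gauss(0.00, 1) for _ in range(n)]  # cohort with true mean 0
b = [random.gauss(0.01, 1) for _ in range(n)]  # true difference: 0.01 SD
z, p = two_sample_z(a, b)
print(f"z = {z:.2f}, p = {p:.3g}")  # tiny effect, yet p is far below 0.05
```

The design point: with data at this scale, "statistically significant" stops doing the filtering work it does in small-sample studies, and effect size plus educational meaning must carry the argument.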
24. Who are Learning Analytics for?
“The new possibility is that educators and learners — the
stakeholders who constitute the learning system studied for so long
by researchers — are for the first time able to see their
own processes and progress rendered in ways that until
now were the preserve of researchers outside the system.”
Knight, S. and Buckingham Shum, S. (2017). Theory & Learning Analytics. Handbook of Learning Analytics, Society for Learning Analytics Research, Chap. 1, pp.17-22. http://doi.org/10.18608/hla17
Learners! Educators!
25. Who are Learning Analytics for?
“Data gathering, analysis, interpretation and even intervention
(in the case of adaptive software) is no longer the preserve of the
researcher, but shifts to embedded sociotechnical
educational infrastructure.”
Learners! Educators!
Knight, S. and Buckingham Shum, S. (2017). Theory & Learning Analytics. Handbook of Learning Analytics, Society for Learning Analytics Research, Chap. 1, pp.17-22. http://doi.org/10.18608/hla17
26.
But… for many, we’re suspect, wielding
data, analytics and AI in ways that may
oppress rather than empower
tectonic shifts under way
in the educational landscape…
27. Justified concerns around privacy, surveillance and
data ethics are redefining the context for our work
28. Growing public literacy around the ethics
of data / algorithms / AI is to be welcomed
For more see
http://datasociety.net
29. Critical academic commentary on the
datafication of education (Ben Williamson)
“Datafication” at all levels of the educational
system, from government statistics to biometrics
Concern about the ownership of data and
analytics platforms by commercial entities
Worried that students will have opportunities
closed down rather than opened up by
algorithms
And much more… Justified?
30. Critical academic commentary on
Learning Analytics (Neil Selwyn, LAK18 keynote)
The Promises and Problems
of ‘Learning Analytics’
https://www.youtube.com/watch?v=rsUx19_Vf0Q
https://latte-analytics.sydney.edu.au/keynotes
31. Classification schemes and metrics are
suspect, with good reason…
incisive STS scholarship into classification schemes…
ideas now being popularised…
32. Du Gay, P. and Pryke, M. (2002) Cultural Economy: Cultural Analysis and Commercial Life. Sage, London. pp. 12-13
“accounting tools...do not simply aid
the measurement of economic activity,
they shape the reality they
measure”
33.
A rapidly changing educational data/analytics ecosystem…
Publishers as analytics providers: Pearson, McGraw Hill, Wiley, etc.
Learning Platform Services: Blackboard, Canvas, D2L, Facebook, etc.
Adaptive/Learning Analytics Services: SmartSparrow, Knewton, Unizen, CogTools, etc.
Data Protection Laws: GDPR, national privacy laws, etc.
Government & international datasets: UK HESA Data Futures, OECD PISA, UNESCO Inst. for Statistics, US Institute for HE Practice, etc.
Venture Capitalists; Philanthropic Foundations
At the centre: Learning Analytics & Human Factors
36. “Knowledge Infrastructures” (Paul Edwards)
http://knowledgeinfrastructures.org
https://mitpress.mit.edu/books/vast-machine
Knowledge Infrastructures:
Intellectual Frameworks
and Research Challenges
A Vast Machine: Computer Models,
Climate Data, and the Politics of
Global Warming
37. “Knowledge Infrastructures”
“robust networks of people, artifacts, and
institutions that generate, share, and
maintain specific knowledge about the
human and natural worlds.”
Routine, well-functioning knowledge systems include the world weather forecast
infrastructure, the Centers for Disease Control, or the Intergovernmental Panel on Climate
Change — individuals, organizations, routines, shared norms, and practices.
Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012
38. “Knowledge Infrastructures”
“Infrastructures are not systems, in the sense of fully
coherent, deliberately engineered, end-to-end processes.
…ecologies or complex adaptive systems[…]
made to interoperate by means of standards, socket
layers, social practices, norms, and individual
behaviors.”
Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012
39. “Knowledge Infrastructures”
I think we can see the educational ecosystem here
Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012
40. Knowledge Infrastructure concepts
Models, models, models…
“Everything we know about the world’s climate — past,
present, and future — we know through models.” (p.xiv)
“I’m not talking about the difference between “raw” and
“cooked” data. I mean this literally. Today, no collection of
signals or observations […] becomes global in time and space
without first passing through a series of data models.” (p.xiii)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
41. Knowledge Infrastructure concepts
Machines ‘see’
learners only through
models
“Raw data is an
oxymoron”
(Geof Bowker)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
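The annotation that machines ‘see’ learners only through models can be made concrete. This is a hypothetical sketch: the event schema and the features are invented for illustration, not any real platform’s model.

```python
# Hypothetical sketch: the schema and features below are invented for
# illustration; no real platform is described. The point: downstream
# analytics sees only what the data model retains about a learner.
from dataclasses import dataclass

@dataclass
class ClickEvent:
    learner_id: str
    resource: str
    seconds: int

def learner_model(events):
    """Reduce raw events to per-learner features: total time, distinct resources.
    Whatever this model omits (context, intent, affect) is invisible downstream."""
    profile = {}
    for e in events:
        p = profile.setdefault(e.learner_id, {"time": 0, "resources": set()})
        p["time"] += e.seconds
        p["resources"].add(e.resource)
    return profile

events = [ClickEvent("s1", "video-3", 120),
          ClickEvent("s1", "quiz-1", 300),
          ClickEvent("s2", "video-3", 15)]
model = learner_model(events)
print(model["s1"])  # the only 's1' the machine ever sees
```

Everything the model drops at this step is unrecoverable by any later analysis, which is the sense in which "raw data is an oxymoron": the choice of schema is already an interpretive act.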
42. Knowledge Infrastructure concepts
infrastructural inversion
“The climate knowledge infrastructure never disappears
from view, because it functions by infrastructural inversion:
continual self-interrogation, examining and reexamining its
own past. The black box of climate history is never closed.”
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
43. Knowledge Infrastructure concepts
We must keep lifting the lid
on learning analytics
infrastructures
We must equip learners and
educators to engage
critically with such tools
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
44. Knowledge Infrastructure concepts
metadata friction
“People long ago observed climate and weather for their own
reasons, within the knowledge frameworks of their times.
You would like to use what they observed — not as they used it,
but in new ways, with more precise, more powerful tools.
[…]
So you dig into the history of data. You fight metadata friction, the
difficulty of recovering contextual knowledge about old records.”
(p.xvii)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
45. Knowledge Infrastructure concepts
cf. Reanalysis of educational data
(your own and others’) using
computational methods
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
46. Epistemic Infrastructure taxonomy for professional knowledge
(Markauskaite & Goodyear)
Particular contributions at the “Micro-KI” level: how professionals construct their EI
Markauskaite, L. & Goodyear, P.
(2017). Epistemic Fluency and
Professional Education:
Innovation, Knowledgeable
Action and Actionable Knowledge
(Springer, 2017), p.376
47.
Arguably, education is in transition
to a new KI
The seams of the old KI
are under stress
Rapid tech. change stresses the systems
with more inertia. How can we manage this?
48. What can we expect to see when a KI is in transition?
“social norms, relationships, and ways of thinking, acting,
and working” are impacted
“…when they change, authority, influence, and power are
redistributed.”
“new kinds of knowledge work and workers displace old
ones”
“increased access for some may mean reduced access for
others”
49. KI helps explain contemporary concerns…
outsourcing of student feedback to machines
alarm over the political agendas driving the collection of
educational data
concern over the commercial drivers/owners behind data
worries that use of analytics/AI = dated pedagogy
50. KI helps explain our current immaturity…
poor interoperability between platforms
early products with simplistic dashboards that don’t count what
really counts in learning
growing recognition of the data illiteracy in educational
institutions
universities discovering they have limited control over how they
can access their students’ data from cloud platforms
51. KI helps explain our current immaturity…
use of machine learning with no awareness of data and
algorithmic bias
poor grounding of learning analytics in theories of learning
an early fascination (borrowed from business) with predicting
student failure (assumes the past can and should predict the
future)
excessive weight placed on computational performance with less
attention to educational outcomes
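The last two bullets can be illustrated with a toy fairness check (all numbers here are invented): a single aggregate accuracy figure can mask a large gap between subgroups, which is why per-group reporting matters before a classifier is deployed on students.

```python
# Toy fairness check (all numbers invented): aggregate accuracy hides a
# large subgroup gap, so per-group performance should be reported too.
def accuracy(pairs):
    return sum(1 for y, yhat in pairs if y == yhat) / len(pairs)

# (true_label, predicted_label, group) tuples
preds = ([(1, 1, "A")] * 45 + [(0, 0, "A")] * 45 + [(1, 0, "A")] * 10
         + [(1, 1, "B")] * 10 + [(0, 0, "B")] * 10 + [(1, 0, "B")] * 20)

overall = accuracy([(y, yh) for y, yh, _ in preds])
per_group = {g: accuracy([(y, yh) for y, yh, gg in preds if gg == g])
             for g in ("A", "B")}
print(round(overall, 2), per_group)  # 0.79 overall, but A = 0.9 vs B = 0.5
```

A model tuned only on the aggregate number would look acceptable while systematically failing the minority group — exactly the kind of computational-performance fixation the slide warns against.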
52. How to respond as researchers?
“social scientists cannot remain simple bystanders
or critics of the current transformations, which will
not be reversed;
“instead, we need research practices that can help
innovate, rethink, and rebuild.”
53. How to respond as researchers?
“treat it as a design opportunity, create a cadre at the
interface between scientists and software, and use
participatory design techniques”
So what does that look like?
56. Donald Stokes: Pasteur’s Quadrant
• Edison: invent commercially
viable electric lighting
• Early, pre-theoretical
taxonomising of phenomena
• Bohr: atomic structure;
quantum theory
• Pasteur: understand and control
bacteria in order to prevent disease
• Manhattan Project: atomic bomb
• John Maynard Keynes: economics
57. Locating Learning Sciences & Learning Analytics
57
Learning Sciences
research into the
foundational constructs
and processes
underpinning learning
Data Science/AI research
into new approaches to
machine learning,
algorithm optimisation,
mitigating bias, etc.
BOHR’s Quadrant / PASTEUR’s Quadrant
EDISON’s Quadrant: Learning Analytics to improve outcomes in
specific contexts (may or may not draw on current theory,
but doesn’t feed back to foundational research concepts)
Learning Sciences:
research-based
educational intervention in
specific contexts, reflect on
implications for theory,
establish sustainable
practices
(e.g. DBR; DBIR; RPPs;
Improvement Science;
Collaborative Data-intensive
Improvement)
Learning Analytics: design and deploy
analytics that demonstrate how
theory can inspire models,
algorithms, code, user experiences,
teaching practices, and ultimately,
learning.
The ability to formally model
theoretical concepts, and shape
learning outcomes, advances theories
(as in other fields)
60. The analytics innovator’s dilemma…
PASTEUR’s Quadrant — USE inspired foundational research
The innovation gulf facing learning analytics (and much ed-tech) research:
1. Researchers develop novel forms of student-facing
analytics, but…
2. USE-inspired research requires USERS.
There’ll be few users without robust infrastructure
3. So, how to create a KI that accelerates the transition
from analytics innovations to embedded infrastructure?
61. Hybrid analytics innovation + service centres
University of Technology Sydney
Connected Intelligence Centre
University of Michigan
Digital Innovation Greenhouse
EDUCAUSE Review, Mar/Apr 2018
https://er.educause.edu/articles/2018/3/architecting-for-learning-analytics-innovating-for-sustainable-impact
63. CIC skillset
Board Room
VC/DVCs/Deans/Directors
Common Room
Academic staff
Server Room
IT Division
Interpersonal skills
+
Education, Learning Design, Interface
Design, Programming, Web Development,
Text Analytics, Machine Learning,
Statistics, Visualisation, Decision-Support,
Sensemaking, Creativity & Risk,
Participatory Design
64. Advantages that this org structure brings
Operating within the DVC’s Office
enables close coupling with student
services and teaching innovation
Baseline funding provides invaluable
stability for planning projects and staff
Reporting directly to a DVC, and talking
directly to other operational directors,
gets stuff done
Operating outside a faculty provides
agility for decision-making, and
helpful neutrality
65.
2 design lenses:
Analytic Accountability Cycle
zoom in on the analytics design cycle
Design Practices
zoom in on the material practices of analytics design
66. Expertises/stakeholders and key transitions
in designing a Learning Analytics system
Educational/Learning
Sciences Researcher
Learning Theory
Educator
Learner
Learning Outcomes
Educational Insights
Programmer
Software, Hardware
User Interface
Data
Algorithm
Learning Analytics
Researcher
IF…
THEN…
72.
So how can we frame the
theory-analytics relationship?
73. Why are the Learning Sciences missing so
often from Learning Analytics?
April 28 2016: LAK16 Keynote
Paul Kirschner - Learning Analytics: Utopia or Dystopia
https://youtu.be/8OjmnOiMIKI
“Put the learning back
into learning analytics”
The Learning Sciences have
much to offer (helpful
examples)
Ignore us at your
dystopic peril…
74. One of the most mature fusions of
qualitative and quantitative methods
http://www.quantitativeethnography.org https://www.youtube.com/watch?v=LjcfGSdIBAk
LAK18 Keynote Address
75. The quant/qual distinction has dissolved.
Each has methods to enrich the other
“In the age of Big Data, we have an opportunity to expand the tools of
ethnography — and history, and literary analysis, and philosophy, and any
discipline that analyzes meaning — by using statistical
techniques not to supplant grounded understanding,
but to expand it. To use additional warrants to support
the stories that we tell about the things people do,
and reasons they do them.”
David Williamson Shaffer, Quantitative Ethnography, p.398
76.
Machine learning ≠ atheoretical empiricism
Rosé, C. P. (2018). Learning Analytics in the Learning Sciences. In F. Fischer,
C. Hmelo-Silver, S. Goldman and P. Reimann (Eds.), International Handbook
of the Learning Sciences. Taylor & Francis.
“The strong emphasis on empiricism grounded in big data
advocated by data mining researchers can sometimes be
misunderstood as an advocacy of atheoretical
approaches.”
“I caution against a bottom-up, atheoretical
empiricism. In contrast, I would stress the role of rich
theoretical frameworks for motivating
operationalizations of variables”
“[strive for] intensive exchange between the
Learning Sciences and neighboring fields of data
mining, computational linguistics, and other areas of
computational social sciences.”
Carolyn Rosé, 2018
77.
Data science opens new ways to observe learning
“Computational tools, which include machine learning approaches,
can serve as lenses through which researchers may
make observations that contribute to theory,
as machinery used to encode operationalizations of
theoretical constructs, and as languages to build
assessments that measure the world in terms of these
operationalizations.”
[…]
“They are more limited in the sense that in applying them, a
reduction of the richness of signal in the real world occurs as a
necessary discretization takes place. However, they are also
more powerful in the sense of the speed and ubiquity of the
observation that is possible.”
Carolyn Rosé, 2018
78. Learning Sciences meets Learning Analytics:
Helpful accounts of justified concerns, and demonstrable synergies, e.g…
Wise & Cui (ICLS 2018): 7 principles for Learning Sciences-aware analytics
Jivet, et al. (LAK 2018): 3 ways to ground learning dashboards in the
Learning Sciences
Marzouk, et al. (AJET 2016): grounding automated student feedback in
Self-Determination Theory
Rummel, et al. (IJAIED 2016): AI+CSCL could be great, or end very badly
Wise and Schwarz (IJCSCL 2018): 8 provocations for CSCL, inc. 2 debating
the role of computational methods
79.
How do we bridge
“from clicks to constructs”?
80. Proxies for
“Conscientiousness”?
Shute, V. J. and M. Ventura (2013). Stealth Assessment: Measuring
and supporting learning in video games. Cambridge, MA, MIT Press.
Figure 5 from report to The John D. and Catherine T. MacArthur
Foundation Reports on Digital Media and Learning
http://myweb.fsu.edu/vshute/pdf/Stealth_Assessment.pdf
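The idea of proxies can be illustrated with a minimal sketch: deriving a behavioural indicator (here, persistence) from raw game event logs. The event schema and proxy definition below are invented for illustration; they are not Shute & Ventura's actual indicators.

```python
from dataclasses import dataclass

@dataclass
class Event:
    student: str
    problem: str
    action: str    # e.g. "attempt", "quit", "solve"
    seconds: float

def persistence_proxy(events):
    """Hypothetical proxy for conscientiousness: mean time a student
    spends per problem before solving or quitting (longer = more persistent)."""
    totals = {}
    for e in events:
        totals[e.problem] = totals.get(e.problem, 0.0) + e.seconds
    return sum(totals.values()) / len(totals) if totals else 0.0

log = [
    Event("s1", "p1", "attempt", 40.0),
    Event("s1", "p1", "solve", 20.0),
    Event("s1", "p2", "attempt", 30.0),
    Event("s1", "p2", "quit", 10.0),
]
print(persistence_proxy(log))  # 50.0
```

The hard part, as the slide suggests, is validating that such a click-level proxy actually measures the construct, rather than an artefact of the game design.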
84.
Milligan, S. and Griffin, P. (2016). Understanding learning and learning design in MOOCs: A measurement-based
interpretation. Journal of Learning Analytics, 3(2), 88–115. http://dx.doi.org/10.18608/jla.2016.32.5
From clicks to constructs in MOOCs
Defining a C21 capability of “Crowd-Sourced Learning”
89.
From clicks to constructs in
MOOCs
Defining a C21 capability of
Crowd-Sourced
Learning
(Part of a
larger map)
95.
Can we learn from the history of HCI theory?
From narrow experimental science to rich, timely design input
The definition of “theory” has evolved:
it has matured from a focus on cognitive
psychology’s concepts and experimental
methods to broader, richer theories
• from modelling individual mental states, to
cognition as embodied, social, distributed
• handle the complexity of real use contexts
• inform design on realistic timescales
96. The diverse contributions of “theory” in HCI research and design
In what senses do Learning Analytics use
theory — and what forms do the Learning
Sciences offer to designers?
97.
2 design lenses:
Analytic Accountability Cycle
zoom in on the analytics design cycle
Design Practices
zoom in on the material practices of analytics design
99. Co-designing an automatic feedback tool for nurses,
with students and academics
Prieto-Alvarez, C. G., Martinez-Maldonado, R. and Anderson, T. (2018). Co-designing learning analytics tools with learners. In J. M. Lodge, J. C. Horvath and
L. Corrin (Eds.), Learning Analytics in the Classroom: Translating Learning Analytics Research for Teachers (Vol. 1). London: Routledge.
Card-sorting exploration
Sketching Learner/Data Journeys to show where analytics might help
Giving voice to students, teachers and
designers by adapting techniques
from Co-Design
Helping stakeholders understand
data privacy, collection and use in
learning analytics tools
103. Automated formative feedback on writing (Civil Law)
Knight, S., Buckingham Shum, S., Ryan, P., Sándor, Á. and Wang, X. (2018). Designing Academic Writing Analytics for Civil Law Student Self-Assessment.
International Journal of Artificial Intelligence in Education, 28(1), 1-28. DOI: https://doi.org/10.1007/s40593-016-0121-0
104. Law academic annotates automated feedback in Word
106. Align the assessment rubric with the textual features (i.e. rhetorical
moves) that the tool can identify
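To make the rubric-to-feature alignment concrete, here is a toy sketch of matching sentences to rhetorical moves. The move labels and regex patterns are invented for illustration; the actual tool uses concept-matching NLP, not bare regexes.

```python
import re

# Hypothetical rhetorical-move patterns (illustrative only)
MOVE_PATTERNS = {
    "position": re.compile(r"\b(I|we) (argue|contend|submit) that\b", re.I),
    "contrast": re.compile(r"\b(however|on the other hand|by contrast)\b", re.I),
}

def detect_moves(sentence):
    """Return the rhetorical moves whose pattern fires in this sentence."""
    return [move for move, pat in MOVE_PATTERNS.items() if pat.search(sentence)]

print(detect_moves("However, we argue that the statute is ambiguous."))
# ['position', 'contrast']
```

A rubric criterion such as "states a clear position" can then be mapped to the presence of the corresponding move in the student's draft.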
107. Evaluate with students: what worked well?
…but it was far from perfect: see the paper for detailed evaluation results
108. Automated feedback on reflective writing
Reflection is critical to the integration of academic +
experiential knowledge
This is where you disclose what you’re uncertain
about, and how you’ve changed, in the first person
Scholarship clarifies the hallmarks of deeper
reflective writing
Gibson, A., Aitken, A., Sándor, Á., Buckingham Shum, S., Tsingos-Lucas, C. and Knight, S. (2017). Reflective Writing Analytics for Actionable Feedback. Proceedings of 7th International
Conference on Learning Analytics & Knowledge, March 13-17, 2017, Vancouver, BC, Canada. ACM Press. DOI: http://dx.doi.org/10.1145/3027385.3027436
109. Learning reflective writing:
Distillation of theory and pedagogy into a framework
111. Learning reflective writing:
Simplification of framework → a visual language
112. Information design + Interface design
The key to automated annotations of the reflection
Example design problem: initial detection of affect was poorly calibrated,
red-lining words that clearly don’t reflect the author’s affect/emotion in their writing
http://heta.io/how-can-writing-analytics-researchers-rapidly-codesign-feedback-with-educators
115. Participatory prototyping builds trust in the NLP
http://heta.io/how-can-writing-analytics-researchers-rapidly-codesign-feedback-with-educators
Learning Analytics researchers work with
academics (3 hour workshop)
Goal: calibrate the parser detecting affect in
reflective writing, working through sample texts
Rapid prototyping with a Python notebook, then
integrated into end-user tool for further testing
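The kind of calibration loop run in such a workshop might look like the following sketch. The lexicon-based detector and the specific words are hypothetical, chosen to mirror the false-positive problem above (clinical vocabulary like "patient" firing as affect).

```python
# Hypothetical affect lexicon; "patient" is deliberately included to
# show the kind of false positive educators would flag in the workshop.
AFFECT_LEXICON = {"worried", "frustrated", "proud", "anxious", "patient"}

def tag_affect(text, lexicon):
    """Tag words in the text that appear in the affect lexicon."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sorted(w for w in words if w in lexicon)

sample = "The patient was anxious, and I felt frustrated."
print(tag_affect(sample, AFFECT_LEXICON))
# ['anxious', 'frustrated', 'patient'] -- 'patient' is a false positive

# Calibration step: prune terms that fire on clinical vocabulary
calibrated = AFFECT_LEXICON - {"patient"}
print(tag_affect(sample, calibrated))
# ['anxious', 'frustrated']
```

Running this kind of loop live, over sample texts the academics supply, is what builds trust in the NLP before it reaches students.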
116. Transparency in the analytics infrastructure:
Academic Writing Analytics platform now open source
https://utscic.edu.au/open-source-writing-analytics
Higher Ed. Text Analytics Project: http://heta.io
Demo: http://acawriter-demo.utscic.edu.au
118.
how do we handle inherent
imperfection in analytics for
complex competencies?
119. “Embracing Imperfection in Learning Analytics”
Kitto, K., Buckingham Shum, S. and Gibson, A. (2018). Embracing Imperfection in Learning Analytics. In Proceedings LAK18: International Conference on
Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp. 451-460. ACM, New York, NY, USA. https://doi.org/10.1145/3170358.3170413
Cognitive dissonance when feedback violates the student’s expectations:
“…as D’Mello and Graesser [15] demonstrate, it is when the
student experiences dissonance because the analytics
fail to match their expectations that they are likely to
reflect on why they think the machine is wrong. We
believe that this form of critical questioning is more likely to happen if
the student has been given an underlying reason to be a little distrustful
of the classifier.”
120. “Embracing Imperfection in Learning Analytics”
Mindful engagement with technology:
“Salomon et al. [42] are concerned that students move beyond
mindless use of potentially powerful cognitive tools, and instead
employ “nonautomatic, effortful, and thus
metacognitively guided processes” [p4].
This is precisely the role that we have been arguing
that “imperfect analytics” can help to facilitate.”
121. “Embracing Imperfection in Learning Analytics”
1. Robust learning design
ensures that the activity
involving automated feedback
is meaningful whether or not
the technology always works
(Knight et al ICLS 2018
crossover paper)
2. Explicit encouragement
— in student briefings, and in
the user interface — to push
back if they disagree with the
feedback
122. Macro-level
Critical infrastructure studies reveal how
KI is inherently political, social and
technical – an evolved system of systems
…but there is a risk that, unless they have skin in the game,
critics of Learning Analytics/AI will be perceived as just
‘shouting from the touchline’ (from Bohr’s Quadrant)
123. Micro-level
We need ‘insider’ accounts of how design
practices can bring the different disciplines
and stakeholders together, with integrity
124.
“What would data science look like if its key
critics were engaged to help improve it?
…and how might critiques of data science improve
with an approach that considers the day-to-day
practices of data science?”
Neff, G., Tanweer, A., Fiore-Gartland, B. and Osburn, L. (2017). Critique and Contribute: A Practice-Based Framework
for Improving Critical Data Studies and Data Science. Big Data, 5(2). https://doi.org/10.1089/big.2016.0050
127. How do we ensure that LS and LA
are on the field, on the same team,
and shaping the game?
Are we equipped to shape the new
knowledge infrastructure?
formalisable theory?
sufficiently agile methods?
suitably skilled professionals?
128. Take home message:
Learning Analytics – with the help of the
Learning Sciences – must develop design
practices that bring the different disciplines
and stakeholders together, with integrity
This is an exhilarating time to be shaping
educational research and practice!