Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
On Social Learning, Sensemaking Capacity, and Collective Intelligence
Simon Buckingham Shum
We are transitioning to an era in which the authority of previously dependable sources of understanding is increasingly called into question, in tandem with societal and global challenges that require new ways of thinking. Correspondingly, hard questions are now being asked about our education system’s adequacy. Our challenge is to create the infrastructures in which “K–Life” learners develop the capacities to thrive personally, and as citizens, under unprecedented conditions of uncertainty. The capacity to make sense of complex personal, intellectual, and social dilemmas is what we need to foster in our children, graduates, researchers, and employees: these skills can be summarized as “social learning.” This session will describe a range of R&D initiatives to illustrate socio-technical responses to these challenges, including intensively collaborative projects like the SocialLearn Project, the OLnet Project, the Compendium Institute, and the Learning Warehouse.
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
UCL joint Institute of Education (London Knowledge Lab) & UCL Interaction Centre seminar, 20th April 2016. Replay: https://youtu.be/0t0IWvcO-Uo
Algorithmic Accountability & Learning Analytics
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
ABSTRACT. As algorithms pervade societal life, they are moving from the preserve of computer science to becoming the object of far wider academic and media attention. Many are now asking how the behaviour of algorithms can be made “accountable”. But why are they “opaque” and to whom? As this vital discussion unfolds in relation to Big Data in general, the Learning Analytics community must articulate what would count as meaningful questions and satisfactory answers in educational contexts. In this talk, I propose different lenses that we can bring to bear on a given learning analytics tool, to ask what it would mean for it to be accountable, and to whom. From a Human-Centred Informatics perspective, it turns out that algorithmic accountability may be the wrong focus.
BIO. Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, which he joined in August 2014 to direct the new Connected Intelligence Centre. Prior to that he was at The Open University’s Knowledge Media Institute 1995-2014. He brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI (PhD, York) where he worked with Rank Xerox Cambridge EuroPARC on Design Rationale. He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin wrote Constructing Knowledge Art (2015). He is active in the emerging field of Learning Analytics and is a co-founder of the Society for Learning Analytics Research, Compendium Institute and Learning Emergence network.
Valedictory Lecture
Making Thinking Visible in Complex Times
Prof Simon Buckingham Shum
This event took place on 15th July 2014 at 4:00pm (15:00 GMT)
Berrill Lecture Theatre, The Open University, Walton Hall Campus, Milton Keynes, United Kingdom
In 1968 Doug Engelbart gave “The Mother of All Demos”: a disruptive technology lab had quietly invented the mouse, collaborative on-screen editing, hyperlinks, video conferencing, and much more. This was the start of the paradigm shift, still unfolding: computers were no longer to be low-level number crunchers, but might mediate and mould the highest forms of human thinking, both individual and collective. In this talk I review nearly 19 years in KMi chasing this vision with many colleagues, inventing tools for making dialogue, argument and learning processes visible in different ways. How do we harness such tools to tackle, not aggravate, the fundamental challenge facing the educational system, and its graduates: to think broadly and deeply, and to thrive amidst profound uncertainty and complexity? These are the hallmarks of the OU — and indeed, all true education from primary school onwards.
In this webinar, Prof Hendrik Drachsler will reflect on the process of applying learning analytics solutions within higher education settings, its implications, and the critical lessons learned in the Trusted Learning Analytics Research Program. The talk will focus on the experience of the edutec.science research collective, consisting of researchers from the Netherlands and Germany who contribute to the Trusted Learning Analytics (TLA) research program. The TLA program aims to provide actionable and supportive feedback to students and stands in the tradition of human-centered learning analytics concepts. Thus, the TLA program aims to contribute to unfolding the full potential of each learner. It therefore applies sensor technology to support psychomotor skills, as well as web technology to support meta-cognitive and collaborative learning skills, with highly informative feedback methods. Prof. Drachsler applies validated measurement instruments from the field of psychometrics and investigates to what extent learning analytics interventions can reproduce the findings of these instruments. During this webinar, Prof Drachsler will discuss the lessons learned from implementing TLA systems. He will touch on TLA prerequisites such as ethics, privacy, and data protection, as well as highly informative feedback for psychomotor, collaborative, and meta-cognitive competencies, and the ongoing research towards a repository, methods, tools and skills that facilitate the uptake of TLA in Germany and the Netherlands.
How smart are smart classrooms? Evaluating International Evidence
@cristobalcobo
There has been considerable progress in integrating technological innovations to facilitate the learning process. This has potentially important implications for students’ learning processes as well as for the role of teachers. SMART Classroom is a machine-assisted educational platform developed in Korea that allows learners to study at their own pace while teachers act as advisers, coaches and facilitators. Artificial intelligence identifies optimal lessons based on learning algorithms and patterns of individual learning. The session will showcase an example of a framework of Korean education policies and a smart classroom initiative, and how it has contributed to improving learning quality and reducing the education gap in Korea.
@cristobalcobo
https://cristobalcobo.net
Learning in a Networked World: The Role of Social Media and Augmented Learning
Keynote presentation to the New Educator Program Hedley Beare Centre for Teaching and Learning 23-25 August 2011
I delivered this talk via video conference to a three-university meeting attempting to define a common standard for quality in online teaching. I looked at quality from the perspective of Three Generations of Online Pedagogy. I may have just shared my mixed feelings about quality control systems in these slides.
Educator-NICs: Envisaging the Future of ICT–enabled Networked Improvement Communities
Learning Emergence Workshop • University of Bristol • 20th May 2014
Establishing personal learning environments on tablet computers: enhancing the student experience through HE/FE and beyond and exploring the implications
Brian Whalley
Workshop Paper given at 2012 Northwest Academic Libraries Conference
'Beyond the library: student transition and success'
This is the large version. A very cut-down version was presented at my Inaugural Lecture on 5 March 2014 in Bristol, UK, which is now on YouTube: make some coffee and take a peek? https://www.youtube.com/watch?v=HWnyfqOxR6E
Building large-scale evidence for education (the case of Plan Ceibal, Uruguay)
@cristobalcobo
Keynote “Innovations and initiatives”. Education World Forum 2018. The Department for Education (DfE) and the British Council, London
At the Education World Forum #London #EWF18 #EFF19
@cristobalcobo
@fundacionceibal
Educating Children of the 21st Century provides an open forum where educators and stakeholders from ASOMEX schools can learn, share experiences and propose knowledge-based solutions, by presenting and discussing research findings, developments and trends in applying ICT to improve teaching, learning, and school leadership in the 21st century.
The conference gives teachers an opportunity to meet colleagues and share ideas that may advance the effective use of technology in their schools. Furthermore, the conference serves as a venue for participants to share information and explore new paths for innovation, to exchange views and know-how, to advance 21st Century skills using technology.
Keynote talk at the Web Science Summer School, Singapore, 8 December 2014. Today we see the rise of Social Machines, like Twitter, Wikipedia and Galaxy Zoo—where communities identify and solve their own problems, harnessing commitment, local knowledge and embedded skills, without having to rely on experts or governments.
The Social Machines paradigm provides a lens onto the interacting sociotechnical systems of our hybrid digital-physical world, citizen-centric and at scale—emphasising empowerment and sociality in a world of pervasive technology adoption and automation.
This talk will present the Social Machines paradigm as an approach to social media analytics and a rethinking of our scholarly practices and knowledge infrastructure.
Smart Data - How you and I will exploit Big Data for personalized digital health
Amit Sheth
Amit Sheth's keynote at IEEE BigData 2014, Oct 29, 2014.
Abstract from:
http://cci.drexel.edu/bigdata/bigdata2014/keynotespeech.htm
Big Data has captured a lot of interest in industry, with the emphasis on the challenges of the four Vs of Big Data: Volume, Variety, Velocity, and Veracity, and their applications to drive value for businesses. Recently, there has been rapid growth in situations where a big data challenge relates to making individually relevant decisions. A key example is personalized digital health, which relates to making better decisions about our health, fitness, and well-being. Consider, for instance, understanding the reasons for and avoiding an asthma attack based on Big Data in the form of personal health signals (e.g., physiological data measured by devices/sensors or the Internet of Things around, on, and inside the human body), public health signals (e.g., information coming from the healthcare system, such as hospital admissions), and population health signals (such as tweets by people related to asthma occurrences and allergens, or Web services providing pollen and smog information). However, no individual has the ability to process all these data without the help of appropriate technology, and each human has a different set of relevant data!
In this talk, I will describe Smart Data that is realized by extracting value from Big Data, to benefit not just large companies but each individual. If my child is an asthma patient, then for all the data relevant to my child, with the four V-challenges, what I care about is simply, “How is her current health, and what is the risk of her having an asthma attack in her current situation (now and today), especially if that risk has changed?” As I will show, Smart Data that gives such personalized and actionable information will need to utilize metadata, use domain-specific knowledge, employ semantics and intelligent processing, and go beyond traditional reliance on ML and NLP. I will motivate the need for a synergistic combination of techniques similar to the close interworking of the top brain and the bottom brain in cognitive models.
For harnessing Volume, I will discuss the concept of Semantic Perception, that is, how to convert massive amounts of data into information, meaning, and insight useful for human decision-making. For dealing with Variety, I will discuss experience in using agreements represented in the form of ontologies, domain models, or vocabularies to support semantic interoperability and integration. For Velocity, I will discuss somewhat more recent work on Continuous Semantics, which seeks to use dynamically created models of new objects, concepts, and relationships, using them to better understand new cues in the data that capture rapidly evolving events and situations.
Smart Data applications in development at Kno.e.sis come from the domains of personalized health, energy, disaster response, and smart city.
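To make the “semantic perception” idea above concrete, here is a minimal sketch of the general pattern: lifting raw signals into a qualitative, actionable state using domain knowledge. This is not the Kno.e.sis implementation; the thresholds, labels, and scoring rule are invented for illustration and are not clinical guidance.

```python
# Minimal sketch of "semantic perception": lifting raw sensor readings into
# domain-level abstractions a person can act on. Bands, labels and scoring
# are illustrative only.

# Domain knowledge: map an observable signal to qualitative states.
PM25_BANDS = [(12.0, "good"), (35.4, "moderate"),
              (55.4, "unhealthy_sensitive"), (float("inf"), "unhealthy")]

def abstract_pm25(ug_m3: float) -> str:
    """Convert a numeric PM2.5 reading into a qualitative air-quality state."""
    for upper, label in PM25_BANDS:
        if ug_m3 <= upper:
            return label

def asthma_risk(air: str, pollen: str, wheeze_events_24h: int) -> str:
    """Fuse personal, public and population signals into one actionable state."""
    score = {"good": 0, "moderate": 1, "unhealthy_sensitive": 2, "unhealthy": 3}[air]
    score += {"low": 0, "medium": 1, "high": 2}[pollen]
    score += min(wheeze_events_24h, 3)
    return "low" if score <= 1 else "elevated" if score <= 3 else "high"

print(asthma_risk(abstract_pm25(40.0), "high", 1))  # prints "high"
```

The point is the shape of the computation, not the numbers: each signal passes through a domain model before fusion, so the output is an abstraction the individual can act on rather than a raw data stream.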
The title of this talk borrows from the title of a chapter in a recently published book by Richard Smiraglia, Cultural Synergy in Information Institutions (7.9: What if There Were a Map?). The use of visualizations in the exploration of bodies of knowledge and for the organization of knowledge has a long history. Think in terms of the tree(s) of knowledge and large-scale maps of science (see the Atlas of Science by Katy Börner). This talk introduces the work of a European network of research collaboration (a so-called COST Action), KnoweScape. KnoweScape explores how knowledge maps (from simple to sophisticated) can be made and applied to better understand, navigate, and curate collections held by libraries and archives. In terms of general research methodology, this talk is also a plea for creating an overview prior to in-depth analysis, and for seeking relatively stable reference frameworks against which rapid changes in our knowledge can be interrogated. Looking at results produced by this community of scholars so far, it will become clear why the making of knowledge maps requires the collaboration of physicists, computer scientists, sociologists of knowledge, digital humanities scholars, and information scientists and professionals.
AI WORLD: I-World: EIS Global Innovation Platform: BIG Knowledge World vs. BI… – Azamat Abdoullaev
Future World Projects
Global Intelligence Platform
Smart World
Smart Nation
Smart Cities Global Initiative
Smart Superpower Projects
Big Data and Big Knowledge, etc.
Introductory lecture, Visit of students at DANS-KNAW, as part of the programme “Dutch Designs: Innovation in Library, Museum and Information Services in the Netherlands.” University of Washington, Seattle (Directors Trent Hill, Rose Paquet), July 18, 2019
ABSTRACT: Computational social science (CSS) is an academic discipline that combines the traditional social sciences with computer science. While social scientists provide research questions, data sources, and acquisition methods, computer scientists contribute mathematical models and computational tools. CSS uses computational methods and statistical tools to analyze and model social phenomena, social structures, and human social behavior. The purpose of this paper is to provide a brief introduction to computational social science.
Key Words: computational social science, social-computational systems, social simulation models, agent-based models
e-Research and the Demise of the Scholarly Article – David De Roure
Innovations 2013 - e-Science, we-Science and the latest evolutions in e-publishing. STM International Association of Scientific, Technical & Medical Publishers. 4th December 2013, Congress Centre, Great Russell Street, London, UK.
Web Observatories, e-Research and the Importance of Collaboration. WST 2014 Webinar series, 20th March 2014
See Web Science Trust http://webscience.org/
The Generative AI System Shock, and some thoughts on Collective Intelligence… – Simon Buckingham Shum
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education – Simon Buckingham Shum
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP) which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders, has been shown to cultivate high quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching – Simon Buckingham Shum
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
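As a toy illustration of genre-sensitive writing feedback, a rule-based sketch might flag sentences that signal common reflective-writing “moves”. To be clear, this is not AcaWriter’s actual engine, whose rhetorical-move analysis is far more sophisticated; the move names and patterns below are invented for illustration.

```python
import re

# Toy sketch of genre-aware feedback: flag sentences that appear to make
# common reflective-writing "moves". Patterns are illustrative only.
MOVES = {
    "initial_reaction": re.compile(r"\b(i felt|i was surprised|struck me)\b", re.I),
    "challenge":        re.compile(r"\b(difficult|struggled|challenge)\b", re.I),
    "change":           re.compile(r"\b(i learned|i now realise|in future i will)\b", re.I),
}

def feedback(text: str) -> dict:
    """Return which reflective moves each sentence appears to make."""
    sentences = [s.strip() for s in re.split(r"[.!?]", text) if s.strip()]
    report = {move: [] for move in MOVES}
    for s in sentences:
        for move, pattern in MOVES.items():
            if pattern.search(s):
                report[move].append(s)
    return report

r = feedback("I was surprised by the patient's reaction. "
             "I struggled to respond. I now realise I need a plan.")
```

A real tool would then turn the missing or present moves into formative prompts (e.g. “you describe the event, but what did you learn from it?”), which is where the feedback loop to the student closes.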
ICQE20: Quantitative Ethnography Visualizations as Tools for Thinking – Simon Buckingham Shum
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching – Simon Buckingham Shum
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data – Simon Buckingham Shum
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of the ACM CHI Conference (CHI ’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
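The fusion step the abstract describes, enriching quantitative streams so they carry qualitative meaning, can be sketched as a rule that maps low-level multimodal events onto a domain-meaningful indicator. The zones, actors, and the “huddle” rule below are invented for illustration; they are not the paper’s actual feature set.

```python
from dataclasses import dataclass

# Sketch of fusing two low-level streams (position + speech) into one
# higher-order, domain-meaningful indicator, in the spirit of
# "collaboration translucence". All names and rules are illustrative.

@dataclass
class Event:
    t: int        # seconds into the simulation
    actor: str
    kind: str     # "zone" or "speech"
    value: str    # zone name, or utterance text

def huddle_moments(events, zone="bedside", team_size=3, window=5):
    """Times when the whole team is at the bedside while someone is speaking."""
    at_zone = {}   # actor -> last zone seen
    moments = []
    speech_times = [(ev.t, ev.actor) for ev in events if ev.kind == "speech"]
    for ev in sorted(events, key=lambda ev: ev.t):
        if ev.kind == "zone":
            at_zone[ev.actor] = ev.value
        present = sum(1 for z in at_zone.values() if z == zone)
        if present >= team_size and any(abs(t - ev.t) <= window
                                        for t, _ in speech_times):
            moments.append(ev.t)
    return moments
```

The design point is that the rule is stated in the domain’s own vocabulary (bedside, team, speaking), so the resulting visual proxy is interpretable by educators rather than being a bare sensor trace.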
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will not only hear and debate broad perspectives on the terrain, but will also be exposed to some real-life specimens, and catch glimpses of the future ecosystem.
Kirsty Kitto, Simon Buckingham Shum, and Andrew Gibson. (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. (ACM, New York, NY, USA). https://doi.org/10.1145/3170358.3170413
Open Access: http://simon.buckinghamshum.net/2018/01/embracing-imperfection-in-learning-analytics
Abstract: Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
Opening to the inaugural workshop on Learning Analytics in Schools held at LAK18: International Conference on Learning Analytics & Knowledge, Sydney. http://lak18.solaresearch.org
Prof. Simon Buckingham Shum
Prof. Ruth Deakin Crick
Summer@UTS Workshop, 8th Feb. 2018
Connected Intelligence Centre
https://utscic.edu.au/event/resilience-complexity
Embracing GenAI - A Strategic Imperative – Peter Windle
Artificial Intelligence (AI) technologies such as Generative AI, image generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to academic integrity, with Higher Education Institutions (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, and policies were put in place. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessment, leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Learning Analytics as Educational Knowledge Infrastructure
1. Learning Analytics as Educational Knowledge Infrastructure
Simon Buckingham Shum
Professor of Learning Informatics, Director, Connected Intelligence Centre, University of Technology Sydney
@sbuckshum • http://cic.uts.edu.au • http://Simon.BuckinghamShum.net
SoLAR Webinar, 6th August, 2019
2. Deep acknowledgements to the team whose joint work has shaped my thinking… https://cic.uts.edu.au/about/people
6. People are literally on the streets protesting against AI in education. We need trust-building conversations for an informed dialogue. A luddite rebellion won’t help anyone…
https://twitter.com/AGavrielatos/status/1121704316069236739
7. Proposition for today: we’re now in a transitional phase — we’re laying foundations for the next educational knowledge infrastructure.
12. A rapidly changing educational data/analytics ecosystem…
- Venture Capitalists…
- Philanthropic Foundations…
- Publishers as analytics providers: Pearson, McGraw Hill, Squirrel AI, etc.
- Learning Platform Services: Blackboard, Canvas, D2L, Facebook, etc.
- Adaptive/Learning Analytics Services: SmartSparrow, Knewton, Unizen, Squirrel AI, etc.
- Data Protection Laws: GDPR, national privacy laws, etc.
- Government & international datasets: UK HESA Data Futures, OECD PISA, UNESCO Institute for Statistics, US Institute for HE Practice, etc.
- Learning Analytics Human Factors
14. Expand from… “The Fourth Paradigm”, a Computer Science vision of how research is building on the Empirical, Theoretical and Computational paradigms, moving into a Data-Intensive paradigm. https://www.microsoft.com/en-us/research/publication/fourth-paradigm-data-intensive-scientific-discovery
…to see the wider systems: “Knowledge Infrastructures”, a critical lens on how human+technical systems in science interoperate to construct, share, contest and sanction knowledge. http://hdl.handle.net/2027.42/97552
15. e.g. Paul Edwards on climate science (“Computer Models, Climate Data, and the Politics of Global Warming”): how do global data, models, visualisations, science and politics combine to produce knowledge about the past, present and future, and how do they handle uncertainty? https://mitpress.mit.edu/books/vast-machine
That’s what a knowledge infrastructure looks like after nearly 200 years’ evolution.
“Computer Models, Learning Data, and the Politics of Education” …??
16. “Knowledge Infrastructures”: “robust networks of people, artifacts, and institutions that generate, share, and maintain specific knowledge about the human and natural worlds.” Routine, well-functioning knowledge systems include the world weather forecast infrastructure, the Centers for Disease Control, or the Intergovernmental Panel on Climate Change — individuals, organizations, routines, shared norms, and practices.
Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012. http://hdl.handle.net/2027.42/97552
17. “Knowledge Infrastructures”: “Infrastructures are not systems, in the sense of fully coherent, deliberately engineered, end-to-end processes. …ecologies or complex adaptive systems […] made to interoperate by means of standards, socket layers, social practices, norms, and individual behaviors.”
Paul N. Edwards, Steven J. Jackson, Melissa K. Chalmers, Geoffrey C. Bowker, Christine L. Borgman, David Ribes, Matt Burton, Scout Calvert (2013). Knowledge Infrastructures: Intellectual Frameworks and Research Challenges. Report from NSF/Sloan Fndn. Workshop, Michigan, May 2012. http://hdl.handle.net/2027.42/97552
18. I think we can see the educational ecosystem here.
19. “Knowledge Infrastructures”: “I intend the notion of knowledge infrastructure to signal parallels with other infrastructures […] Yet this is no mere analogy […] Get rid of infrastructure and you are left with claims you can’t back up, facts you can’t verify, comprehension you can’t share, and data you can’t trust.” (p.19)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
20. “Knowledge Infrastructures” perform 3 key functions: (1) Monitoring, (2) Modelling, (3) Memory.
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
21. Knowledge Infrastructure concepts: metadata friction. “People long ago observed climate and weather for their own reasons, within the knowledge frameworks of their times. You would like to use what they observed — not as they used it, but in new ways, with more precise, more powerful tools. […] So you dig into the history of data. You fight metadata friction, the difficulty of recovering contextual knowledge about old records.” (p.xvii)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
22. cf. Reanalysis of educational data (your own and others’) using computational methods.
23. Knowledge Infrastructure concepts: Models, models, models… “Everything we know about the world’s climate — past, present, and future — we know through models.” (p.xiv) “I’m not talking about the difference between “raw” and “cooked” data. I mean this literally. Today, no collection of signals or observations […] becomes global in time and space without first passing through a series of data models.” (p.xiii)
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
24. Machines ‘see’ learners only through models. “Raw data is an oxymoron” (Geof Bowker).
25. Knowledge Infrastructure concepts: infrastructural inversion. “The climate knowledge infrastructure never disappears from view, because it functions by infrastructural inversion: continual self-interrogation, examining and reexamining its own past. The black box of climate history is never closed.”
Paul Edwards (2010). A Vast Machine: Computer Models, Climate Data, and the Politics of Global Warming. MIT Press
26. We must keep lifting the lid on learning analytics infrastructures. We must equip learners and educators to engage critically with such tools.
27. Epistemic Infrastructure: a taxonomy for professional knowledge. Particular contributions at the “Micro-KI” level: how professionals construct their EI.
Markauskaite, L. & Goodyear, P. (2017). Epistemic Fluency and Professional Education: Innovation, Knowledgeable Action and Actionable Knowledge (Springer, 2017), p.376
29. In what senses might Learning Analytics constitute, or at least contribute to, an emerging KI? LA is only 10 years old, and there’s much to do. But knowing what functioning KIs look like could help us prioritise.
(1) KI concepts seem to apply to critical perspectives on LA.
(2) LA is starting to display KI properties at different levels of the system.
30. Macro/Meso/Micro Learning Analytics
- Micro: student activity traces during learning (analytics from individual student activity)
- Meso: institution-wide demographics and formal outcomes (school/uni information systems)
- Macro: state/national/international comparisons/league tables (PISA school rankings, uni rankings)
Aggregation of user traces enriches meso + macro analytics with finer-grained process data; breadth + depth from the macro + meso levels add power to micro analytics.
Buckingham Shum, S. (2012). Learning Analytics. UNESCO IITE Policy Brief. http://bit.ly/LearningAnalytics
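The micro-to-meso aggregation described above can be sketched in a few lines: event-level traces are counted per student, then rolled up against institutional records, so cohort analytics gain process detail. The field names and event types below are invented for illustration.

```python
# Sketch of the micro -> meso move: event-level traces from individual
# students are aggregated and joined with institution-level records.
from collections import defaultdict

micro_events = [  # (student_id, event_type)
    ("s1", "video_play"), ("s1", "quiz_attempt"), ("s2", "quiz_attempt"),
    ("s2", "forum_post"), ("s3", "video_play"),
]
meso_records = {"s1": {"programme": "Nursing"}, "s2": {"programme": "Nursing"},
                "s3": {"programme": "Engineering"}}

# Aggregate micro traces per student...
per_student = defaultdict(lambda: defaultdict(int))
for sid, etype in micro_events:
    per_student[sid][etype] += 1

# ...then roll up to the meso (programme) level.
per_programme = defaultdict(lambda: defaultdict(int))
for sid, counts in per_student.items():
    prog = meso_records[sid]["programme"]
    for etype, n in counts.items():
        per_programme[prog][etype] += n

print(dict(per_programme["Nursing"]))
# prints {'video_play': 1, 'quiz_attempt': 2, 'forum_post': 1}
```

The same join, run the other way, is the “breadth + depth” direction: meso records (programme, demographics) give micro traces the context they need to be interpretable.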
31. Macro/Meso/Micro Educational KI
- Micro: assignment/course-specific networks […] optimising learning in a course
- Meso: institution-wide networks […] optimising learning in the institution
- Macro: state/national/international networks sharing data, models, scholarship → debate but emerging consensus on optimising learning
32. If Learning Analytics were Climate Science…
- Trusted data sources *
- Validated models *
- Interoperable data flows and models *
- Established research methodologies *
- Government policy held accountable to international scientific consensus *
* all under rigorous scholarly review and debate
33. If the challenge is to build education’s KI, what are the practical implications for LA?
- Micro: assignment/course-specific networks […] optimising learning in a course
- Meso: institution-wide networks […] optimising learning in the institution
- Macro: state/national/international networks sharing data, models, scholarship → debate but emerging consensus on optimising learning
34. Macro: state/national/international networks sharing data, models, scholarship → debate but emerging consensus on optimising learning
- Accountability: ground models in educational research + learning sciences
- Impact policy + practice: make the evidence base accessible
- Share models (and data?): climate data ≠ learner data
35. Ground models in learning sciences + educational research.
[Diagram: educationally meaningful constructs and sub-constructs (not directly observable) map onto human-observable behaviours, which map onto computationally detectable, digitally captured events, from which derived features and metrics are computed.]
Adapted from: Wise, A., Knight, S., Buckingham Shum, S. (In Press). Collaborative Learning Analytics. In: Cress, U., Rosé, C., Wise, A., & Oshima, J. (Eds.), International Handbook of Computer-Supported Collaborative Learning. Springer.
See also: Buckingham Shum, S. (2016). Envisioning C21 Learning Analytics. Keynote Address, LASI-Asia, Seoul. https://cic.uts.edu.au/lasi-asia-keynote2016
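One way to operationalise this layered mapping is as an explicit data structure that an analytics pipeline can walk, from an educationally meaningful construct down to the raw events it must capture. The construct, behaviours, and event names below are invented for illustration.

```python
# Sketch of the layered model as a data structure: construct ->
# sub-constructs -> behaviours -> digitally captured events -> derived
# features. All example content is illustrative.
CONSTRUCT_MAP = {
    "construct": "productive collaboration",
    "sub_constructs": [
        {"name": "mutual engagement",
         "behaviours": [
             {"name": "turn-taking in discussion",
              "events": ["forum_reply", "chat_message"],
              "derived_features": ["reply_latency", "turn_balance"]},
         ]},
        {"name": "shared task focus",
         "behaviours": [
             {"name": "co-editing the artefact",
              "events": ["doc_edit"],
              "derived_features": ["edit_overlap_ratio"]},
         ]},
    ],
}

def events_needed(cmap):
    """Walk the map top-down to list the raw events the analytics must capture."""
    return sorted({e for sc in cmap["sub_constructs"]
                     for b in sc["behaviours"] for e in b["events"]})

print(events_needed(CONSTRUCT_MAP))
# prints ['chat_message', 'doc_edit', 'forum_reply']
```

Making the mapping explicit like this is also what enables the critical reading the talk calls for: each link from event to construct is inspectable and contestable, rather than buried in a model.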
36. [Same diagram] infrastructural inversion?
37. Impact policy + practice: make the evidence base accessible
http://evidence.laceproject.eu
38. In principle, as variation reduces (e.g. timescale, geography, methodology),
so do the KI challenges. So moving from MACRO to MESO should simplify the KI.
But institutions still have long histories
Institutional data and knowledge are still notoriously slippery to curate
And institutionalised teaching practices are slow to change
Meso
institution-wide networks […] optimising learning in the institution
“data management”
“knowledge management”
“progressive pedagogy”
“authentic assessment”
39. Nonetheless, it’s at the MESO + MICRO layers
where LA can really add to KI
Enable data flows
Tune analytics for the institution’s specific needs
Co-design with stakeholders
Micro
assignment/course specific networks […] optimising learning in a course
Meso
institution-wide networks […] optimising learning in the institution
40.
Envisioning the learning ecosystem
beyond the LMS,
in the wild
Kitto, K., O’Hara, J., Philips, M., Gardiner, G., Ghodrati, M. & Buckingham Shum, S. (2019) The Connected University: Connectedness Learning Across a Lifetime. In Ruth
Bridgstock and Neil Tippett (Eds.), Higher Education and the Future of Graduate Employability: A Connectedness Learning Approach. https://doi.org/10.4337/9781788972611
“How are we going to deliver LA over that type of complexity?”
Kirsty Kitto: Designing Learning Analytics Ecosystems (LASI 2019)
https://www.beyondlms.org/blog/LASIworkshop
41.
Towards LA data flows over an emergent ecosystem:
LA-API infrastructure designed for huge diversity in data + analytics
Kirsty Kitto, Zak Waters, Simon Buckingham Shum, Mandy Lupton, Shane Dawson, George Siemens (2018): Learning Analytics Beyond the LMS: Enabling Connected
Learning via Open Source Analytics in “the wild”. Final Report, Office for Learning and Teaching, Australian Government: Canberra. http://www.beyondlms.org
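One common approach to enabling data flows across such a diverse ecosystem is an "actor–verb–object" statement format such as xAPI, with statements posted to a Learning Record Store. A minimal sketch follows; the verb URI, activity ID and email address are illustrative assumptions, not the project's actual API:

```python
import json

def make_xapi_statement(actor_email: str, verb: str,
                        object_id: str, object_name: str) -> dict:
    """Build a minimal xAPI-style actor-verb-object statement."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": f"http://adlnet.gov/expapi/verbs/{verb}",
                 "display": {"en-US": verb}},
        "object": {"objectType": "Activity", "id": object_id,
                   "definition": {"name": {"en-US": object_name}}},
    }

# Hypothetical event from a tool outside the LMS (e.g. a code review site)
stmt = make_xapi_statement("student@example.edu", "commented",
                           "https://git.example.edu/unit101/pull/7",
                           "Pull request discussion")
payload = json.dumps(stmt)  # body for a POST to the LRS /statements endpoint
```

Because every tool emits the same statement shape, the analytics layer can be written once against the LRS rather than once per platform, which is what makes "huge diversity in data + analytics" tractable.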
42. generalisable models
without sacrificing context-sensitivity
Micro
assignment/course specific networks […] optimising learning in a course
Meso
institution-wide networks […] optimising learning in the institution
43. Framework @UTS for educators to co-design
Analytics/AI → augment teaching practice
[Diagram: Contextualizable Learning Analytics Design, in which Educators and Analytics/AI designers together negotiate the Student, Task Design, Assessment, Feedback & User Interface, and Features in the Data]
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Eprint: https://tinyurl.com/lak19clad
45. • A pedagogically robust writing exercise was rated significantly more useful with the addition of AcaWriter
• Students who used AcaWriter made significantly more academic rhetorical moves in their revised essays
• A significantly higher proportion of AcaWriter users improved their drafts (many students degraded them across drafts)
• Students who used AcaWriter produced higher graded submissions if they engaged deeply with AcaWriter’s feedback
Building UTS trust with an “AcaWriter micro-KI”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
Shibani, A. (2019, In Prep). Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics. Doctoral Dissertation, Connected Intelligence Centre, University of Technology Sydney
46. Building the AcaWriter micro-KI → educator trust
“Overall, since we’ve been working with CIC around written communication over the course of the last four or five semesters, we have seen marked improvement in students’ written communication. Overall their individual assignment pass-rate is going up... We are seeing improvements in the number of students who are either meeting or exceeding the expectations around written communication”
Shibani, A. (2019, In Prep). Augmenting Pedagogic Writing Practice with Contextualizable Learning Analytics. Doctoral Dissertation, Connected Intelligence Centre, University of Technology Sydney
47. Building the AcaWriter micro-KI → student trust
“It's like having a tutor or another person check and give constructive feedback on your work.”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
48. Building the AcaWriter micro-KI → student trust
“When you’re editing your own writing, you automatically think that your work sounds good and that all your ideas and views have been clearly conveyed. This exercise was useful in the sense that it indicated areas where I needed to be more explicit, which on my own I would not have noticed.”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
49. Building the AcaWriter micro-KI → student trust
“I think what is being taught is something I was already aware of. However, by being forced to actually identify ways of arguing, along with the types of words used to do so, it has broadened my perspective. I think I will be more aware of the way I am writing now.”
Shibani, A., Knight, S. and Buckingham Shum, S. (2019). Contextualizable Learning Analytics Design: A Generic Model, and Writing Analytics Evaluations. Proc. 9th International Conference on Learning Analytics & Knowledge (LAK19). ACM Press, NY, pp. 210-219. DOI: https://doi.org/10.1145/3303772.3303785. Open Access Eprint: https://tinyurl.com/lak19clad
51. Learning Analytics Deck for co-design
http://ladeck.utscic.edu.au
Carlos Prieto’s PhD: ‘Playing cards’ to help stakeholder communication
as they design a new kind of analytics tool
52. Co-design with educators to tune writing analytics
http://heta.io/how-can-writing-analytics-researchers-rapidly-codesign-feedback-with-educators
Goal: calibrate the parser detecting affect in reflective writing, working through sample texts
Rapid prototyping with a Jupyter notebook to agree on thresholds
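A sketch of the kind of notebook prototyping described here: a toy lexicon-based affect scorer with a threshold that educators and researchers can adjust together over sample texts. The lexicon, scoring rule and threshold values are invented for illustration; they are not the actual parser:

```python
# Toy affect scorer for reflective writing. The lexicon and weights are
# hypothetical placeholders for whatever the real parser detects.
AFFECT_LEXICON = {"frustrated": 0.9, "anxious": 0.8, "surprised": 0.6,
                  "confident": 0.5, "pleased": 0.4}

def affect_score(sentence: str) -> float:
    """Score a sentence by its strongest affect term (0 if none found)."""
    words = sentence.lower().split()
    return max((AFFECT_LEXICON.get(w, 0.0) for w in words), default=0.0)

def flag_sentences(text: str, threshold: float = 0.5) -> list[str]:
    """Return sentences whose affect score meets the agreed threshold.

    In a notebook session, educators re-run this over sample texts with
    different thresholds until the flagged sentences match their judgment.
    """
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    return [s for s in sentences if affect_score(s) >= threshold]
```

Lowering the threshold flags more (weaker) affect expressions; the co-design session is essentially a negotiation over where that cut-off should sit for this cohort's writing.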
53.
More on LA + KI…
(in particular on LA’s relationship to the learning sciences)
http://simon.buckinghamshum.net/2018/06/icls2018-keynote
54.
More on Human-Centred AIED & Learning Analytics…
http://simon.buckinghamshum.net/2019/05/human-centred-analyticsai-in-education
Collections of insider accounts from teams who are building these infrastructures:
how do they engage with issues of epistemology, pedagogy, politics, ethics…?
Human-Centred Learning Analytics. Journal of Learning Analytics, 6(2), pp. 1–94. (Eds.) Simon Buckingham Shum, Rebecca Ferguson & Roberto Martinez-Maldonado
Learning Analytics and AI: Politics, Pedagogy and Practices. British Journal of Educational Technology (50th Anniversary Special Issue). (Eds.) Simon Buckingham Shum & Rose Luckin (late 2019)
What’s the Problem with Learning Analytics? Journal of Learning Analytics. Invited Commentaries on Neil Selwyn’s LAK18 Keynote Talk, from Carolyn Rosé, Rebecca Ferguson, Paul Prinsloo & Alfred Essa (late 2019)
55.
Reflections on the future educational KI
Are we aspiring for an “Intergovernmental Panel on Learning”?
Is part of this already in place?...
UNESCO Global Education Monitoring Report https://en.unesco.org/gem-report
A conventional form of educational KI
56.
Reflections on the future educational KI
Commercial platforms and their R&D programs
are ‘vertical Knowledge Infrastructures’ at
national and increasingly international scales
Knowledge about learners from proprietary platforms, primarily ITS (but no doubt expanding beyond that)
All the usual questions and concerns around
multinational platforms, data ownership,
commercial products in education…
57. Conclusion
We know how a mature, functioning Knowledge
Infrastructure operates, and the influence it can have on
science, policy and practice (not that this is straightforward)
Insights into KI structure and dynamics should help the LA
community focus its efforts to invent an educational KI that
can be sustained, and trusted
Your feedback welcomed!
@sbuckshum • Simon.BuckinghamShum@uts.edu.au • Simon.BuckinghamShum.net