This document summarizes a presentation on learning analytics given by Simon Buckingham Shum. Some key points:
- Learning analytics aims to unlock student data to improve 21st-century learning by analyzing patterns in data to better understand learning processes and identify students who may need help.
- Examples discussed include Purdue University's predictive model, which identified 66-80% of struggling students, and a system that provides real-time feedback to students.
- Analytics can look beyond grades and course performance to capture data on learning dispositions, engagement, curriculum mastery, and student discourse, providing a more holistic view of the learning process.
- Challenges include ensuring analytics are used ethically and to improve learning rather than merely to monitor and rank students.
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
Data-Driven Education 2020: Using Big Educational Data to Improve Teaching an... - Peter Brusilovsky
Modern educational settings, from regular classrooms to MOOCs, produce a rapidly increasing volume of data that captures the individual learning progress of millions of students at different levels of granularity. The presence of this data opens a unique opportunity to re-engineer traditional education and develop a range of efficient data-driven approaches to support teaching and learning. In my talk, I will present several ways to use big educational data explored in our lab. The focus will be on open social learning modeling and identifying individual differences through sequential pattern mining, but several other approaches will be mentioned. Open social learning modeling and sequential pattern mining provide two considerably different examples of using educational data. One offers an immediate use of class interaction history to make content access more engaging, while the other shows how big data can be used to uncover important individual differences that could be used to optimize the process for individual learners.
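The sequential pattern mining mentioned above can be sketched in a few lines. This is a hypothetical illustration (invented action logs and function names, not the lab's actual pipeline): count how often short, contiguous action sequences occur across students' interaction logs, then inspect the most frequent patterns.

```python
from collections import Counter

def ngram_counts(logs, n=2):
    """Count contiguous action n-grams across all student logs."""
    counts = Counter()
    for actions in logs:
        for i in range(len(actions) - n + 1):
            counts[tuple(actions[i:i + n])] += 1
    return counts

# Hypothetical interaction logs: one action sequence per student.
logs = [
    ["read", "example", "quiz", "quiz"],
    ["read", "quiz", "example", "quiz"],
    ["example", "quiz", "quiz", "read"],
]

top = ngram_counts(logs, n=2).most_common(3)
```

In practice, such counts would be computed per student group and compared across groups to surface the individual differences the abstract refers to.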
Human-Centered AI in AI-ED - Keynote at AAAI 2022 AI for Education workshop - Peter Brusilovsky
Abstract: In recent years, the use of Artificial Intelligence (AI) technologies has expanded to many areas that directly affect the lives of millions. AI-based approaches advise human decision-makers on who should be released on bail, whether it is a good time to discharge a patient from a hospital, and whether a specific student is at risk of failing a course. Such extensive use of AI in decision making came with a range of potential problems that have been extensively studied over the last few years. Recognition of these problems motivated a rapid rise of research on “human-centered AI”, which attempts to address and minimize the negative effects of using AI technologies. The majority of work on human-centered AI focuses on various types of human-AI collaboration through technologies such as transparency, explainability, and user control. In my talk, I will review how the ideas of human-AI collaboration, transparency, explainability, and user control have been used in educational applications of AI in the past, and will discuss how new ideas developed in this research area outside of AI-Ed could be creatively applied in the educational context.
IUI 2015: Personalized Search: Reconsidering the Value of Open User Models - Peter Brusilovsky
IUI 2015 talk slides: Ahn, J., Brusilovsky, P., and Han, S. (2015) Personalized Search: Reconsidering the Value of Open User Models. In: Proceedings of the 20th International Conference on Intelligent User Interfaces, Atlanta, Georgia, USA, ACM, pp. 202-212.
Stereotype Modeling for Problem-Solving Performance Predictions in MOOCs and ... - Peter Brusilovsky
Proceedings of the 25th Conference on User Modeling, Adaptation and Personalization, UMAP2017, pp 76-84
Stereotypes are frequently used in real life to classify students according to their performance in class. In the literature, we can find many references to weaker students, fast learners, struggling students, etc. Given the lack of detailed data about students, these or other kinds of stereotypes could potentially be used for user modeling and personalization in the educational context. Recent research in the MOOC context demonstrated that data-driven learner stereotypes can work well for detecting and preventing student dropouts. In this paper, we explore the application of stereotype-based modeling to a more challenging task -- predicting student problem-solving and learning in two programming courses and two MOOCs. We explore traditional stereotypes based on readily available factors like gender or education level, as well as some advanced data-driven approaches to grouping students based on their problem-solving behavior. Each approach to forming student stereotype cohorts is validated by comparing models of student learning: do students in different groups learn differently? In the search for stereotypes that could be used for adaptation, the paper examines ten approaches. We compare the performance of these approaches and draw conclusions for future research.
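The paper's validation question -- do students in different cohorts learn differently? -- can be illustrated with a minimal sketch. The records, cohort key, and the deliberately crude learning measure (last score minus first) below are all hypothetical; the paper's actual student models are more sophisticated.

```python
from statistics import mean

def cohort_learning_rates(students, key):
    """Group students by a stereotype attribute and compare a crude
    learning measure (last score minus first score) per cohort."""
    cohorts = {}
    for s in students:
        cohorts.setdefault(s[key], []).append(s["scores"][-1] - s["scores"][0])
    return {label: mean(gains) for label, gains in cohorts.items()}

# Hypothetical records: a stereotype attribute plus a score trajectory.
students = [
    {"level": "novice", "scores": [20, 40, 70]},
    {"level": "novice", "scores": [10, 30, 60]},
    {"level": "expert", "scores": [60, 70, 80]},
    {"level": "expert", "scores": [70, 75, 85]},
]

rates = cohort_learning_rates(students, key="level")
```

A cohort scheme is interesting for adaptation when the per-cohort measures diverge, as the novice and expert gains do here.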
Technologies to support self-directed learning through social interaction - Dragan Gasevic
This talk will describe the underlying principles, design, and experience gained with ProSolo, a platform that supports personalized, competency-based learning through social interaction. Traditional educational models are primarily focused on classroom education and training, typically associated with the notion of credit hours as the (only) route towards formal credentials. This limits opportunities for creating personalized learning pathways in the changing educational context. ProSolo provides users with the ability to unbundle education programs, courses, and units into discrete yet inter-related competencies, allowing learners to construct their education pathway in a manner that better reflects their interests and future career motivations and requirements. ProSolo is developed with the intention of providing learners with opportunities to customize, modify, and personalize their self-directed learning journey. It supports the development of skills for self-directed learning by allowing learners to control the planning, learning, and presentation of outcomes associated with their learning. To support learners with different levels of prior knowledge, study skills, and cultural backgrounds, ProSolo offers three types of scaffolds for self-directed learning: instructional, social, and technological. Learning in ProSolo occurs within a socially rich environment that aggregates learners’ information created and shared in their existing online spaces. ProSolo makes use of learning analytics to empower learners and instructors in this new model of education. ProSolo was used in the Data, Learning, and Analytics MOOC and is currently being piloted at several university sites.
User Control in AIED (Artificial Intelligence in Education) - Peter Brusilovsky
Slides of my intro to the "Meet the Expert" session at AIED 2021. This is a subset of the slides of a longer presentation on user control in AI, extended with many specific examples from the AIED area.
Two Brains are Better than One: User Control in Adaptive Information Access - Peter Brusilovsky
In recent years, the use of Artificial Intelligence (AI) technologies has expanded to many areas where they directly affect the lives of many people. AI-based approaches advise human decision-makers on who should be released on bail, whether it is a good time to discharge a patient from a hospital, and whether a specific student is at risk of failing a course. Such extensive use of AI in decision making came with a range of potential problems that have been extensively studied over the last few years. Recognition of these problems motivated a rapid rise of research on “human-centered AI”, which attempts to address and minimize the negative effects of using AI technologies. Among the ideas of human-centered AI is user control: engaging users in affecting AI decision making to prevent possible errors and biases. In my talk, I will focus on the application of user control in one popular area of AI application, adaptive information access. Adaptive information access systems, such as personalized search and recommender systems, attempt to model their users to help them find the most relevant information. Yet user modeling and personalization mechanisms might not always work as expected, resulting in errors, biases, and suboptimal behavior. Combining the decision power of AI with the ability of the user to guide and control it brings together the strong sides of artificial and human intelligence and could lead to better results. In my talk, I review several projects focused on user control in adaptive information access systems and discuss the benefits and challenges of this approach.
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice, May 2013.
From Expert-Driven to Data-Driven Adaptive LearningPeter Brusilovsky
Keynote slides for the Workshop on Advancing Education with Data at the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Halifax, NS, Canada, Aug 14, 2017
Examining the Value of Learning Analytics for Supporting Work-integrated Lear... - Vitomir Kovanovic
Slides from our presentation at the Seventh National Conference on Work-Integrated Learning (ACEN’18).
The full paper is available at https://www.researchgate.net/publication/328578409_Examining_the_value_of_learning_analytics_for_supporting_work-integrated_learning
The Virtuous Loop of Learning Analytics & Academic Technology Innovation John Whitmer, Ed.D.
Faculty and academic departments creating innovative educational practices are often starved for useful data and analysis to determine whether their innovations made a difference. Research has found that data on students' use of these technologies is a statistically significant predictor of success, far more powerful than traditional demographic or academic preparedness variables. This leads to a “virtuous loop” in which digital technology adoption enables assessment, which in turn improves educational practices using those technologies.
This presentation was delivered at the Online Learning Consortium Collaborate Event, November 19, 2015.
Presents an overview of the learning analytics field, touching on the status of the technology, the challenges it faces, the arrival of predictive analytics in education, and the best approach towards a successful implementation.
Ascilite webinar series: http://www.ascilite.org.au/index.php?p=news_detail&item=240
A slightly different version of the Macquarie University keynote at http://www.slideshare.net/sbs/our-learning-analytics-are-our-pedagogy
I swapped out more general critiques of big data, for more detail on Dispositional and Discourse Learning Analytics
Invited talk, INSIGHT Centre for Data Analytics, Univ. Galway, 2 Oct 2013, http://www.insight-centre.org
Abstract:
Data and analytics are transforming how organisations work in all sectors. While there are clearly ethical issues around big data and privacy, there may also be an argument that educational institutions have a moral obligation to use all the information they have to maximize the learner's progress. So, assuming education can't (arguably shouldn't) resist this revolution, the question is how to harness this new capability intelligently. Learning Analytics is an exploding research field and startup market: do leaders know what to ask when the vendors roll up with dazzling dashboards? In this talk I'll provide an overview of developments, and consider some of the key questions we should be asking. Like any modelling technology and accounting system, analytics are not neutral, and do not passively describe sociotechnical reality: they begin to shape it. Moreover, they start with the things that are easiest to count, which doesn't necessarily equate to the things we value in learning. Given the crisis in education at many levels, what realities do we want analytics to perpetuate, or bring into being?
Bio:
Simon Buckingham Shum is Professor of Learning Informatics at the UK Open University's Knowledge Media Institute. He researches, teaches and consults on Learning Analytics, Collective Intelligence and Argument Visualization. His background is B.Sc. Psychology, M.Sc. Ergonomics and Ph.D. Human-Computer Interaction. He co-edited Visualizing Argumentation (Springer 2003), the standard reference in the field, followed by Knowledge Cartography (2008). In the field of Learning Analytics, he served as Program Co-Chair of the 2nd International Learning Analytics LAK12 conference, chaired the LAK13 Discourse-Centric Learning Analytics workshop, and the LASI13 Dispositional Learning Analytics workshop. He is a co-founder of the Society for Learning Analytics Research, Compendium Institute, LearningEmergence.net, and was Co-Founder and General Editor of the Journal of Interactive Media in Education. He serves on the Advisory Groups for a variety of learning analytics initiatives in education and enterprise, and is a Visiting Fellow at University of Bristol Graduate School of Education. Contact him via http://simon.buckinghamshum.net
Confronting Reality with Big Data & Learning Analytics
We are experiencing an explosion in the quantity of data available online from archives and live streams. Learning Analytics is concerned with how educational research, and learning platform design, can make more effective use of such data (Long & Siemens, 2011). Improving outcomes through the analysis of data is of interest to researchers, administrators, systems architects, social media developers, educators and learners. Analytics are being held up by some as a way to confront, and tackle, the tough new realities of less money, less attention, and higher accountability for quality of learning.
Researchers and vendors are building reporting capabilities into tools that provide unprecedented levels of data on learners. This symposium will show what is possible, and what's coming soon. What objections could possibly be raised to such progress?
However, information infrastructure embodies and shapes worldviews: classification schemes are not only systematic ways to capture and preserve, but also to forget, by virtue of what remains invisible (Bowker & Star, 1999). Learning analytics and recommendation engines are designed with a particular conception of ‘success’, driving the patterns deemed to be evidence of progress, the interventions that are deemed appropriate, the data captured and the rules that fire in software.
This symposium will air some of the critical arguments around the limits of decontextualised data and automated analytics, which often appear reductionist in nature, failing to illuminate higher order learning. There are complex ethical issues around data fusion, and it is not clear to what extent learners are empowered, in contrast to being merely the objects of tracking technology. Educators may also find themselves at the receiving end of a new battery of institutional ‘performance indicators’ that do not reflect what they consider to be authentic learning and teaching.
This Symposium will provide the opportunity to hear a series of brief presentations introducing contrasting perspectives, before the debate is opened to all. Speakers from a cross-section of The Open University will describe how we are connecting datasets, analysing student data and prototyping next generation analytics. Complementing this, JISC will present a national capability perspective, with an update on the JISC CETIS ‘landscape analysis’ of the field, which will clarify potential benefits, issues to consider, and help institutions to assess their current capability and possible next steps.
Participants will catch up with developments in this fast moving field, through exposure to the possibilities of analytics, as well as issues to be alert to.
ICO Fall School 2012, Santuari de Santa Maria del Collell, Girona. https://sites.google.com/site/icofallschool2012
A week long PhD training school for educational and ed-tech researchers
Keynote lecture at 2016 NTU Learning and Teaching Seminar - Students as Partn... - Simon Bates
Keynote lecture at 2016 NTU Learning and Teaching Seminar - Students as Partners in Learning and Teaching. In this keynote, I will consider the role of students as partners in learning with reference to what current research can tell us about how people learn, what students have to say about what supports their learning, and where technology can help.
Presentation by John Whitmer, Michael Haskell (Cal Poly SLO), and Hillary Kaplowitz (CSU Northridge) at the US West Coast Moodle Moot 2012.
“Learner Analytics” has captured the attention of the media and is the topic of much debate in professional and academic circles. What lies behind the hype? In this presentation, we will discuss the state of, and limits to, current research in LMS Learner Analytics. We will then look at examples of Learner Analytics in Moodle, including tools for faculty and reports that span the entire instance.
[DSC Europe 22] Machine learning algorithms as tools for student success pred... - DataScienceConferenc1
The goal of higher education institutions is to provide quality education to students. Predicting academic success and intervening early to help at-risk students is an important task for this purpose. This talk explores the possibilities of applying machine learning to develop predictive models of academic performance. What factors lead to success at university? Are there differences between students of different generations? Answers are given by applying machine learning algorithms to a data set of 400 students across three generations of IT studies. The results show differences between students with regard to student responsibility and regularity of class attendance, and the great potential of applying machine learning in developing predictive models.
4 March 2010 (Thursday) | 15:30 - 17:40 | http://citers2010.cite.hku.hk/abstract/20 | Dr. Barbara MEANS | Center for Technology in Learning, SRI International
Aligning Learning Analytics with Classroom Practices & NeedsSimon Knight
The Learning Analytics Research Network (LEARN) invites you to join us for a talk about the exciting ways in which the University of Technology Sydney is using participatory design to augment existing classroom practices with learning analytics. Simon Knight, a LEARN Visiting Scholar from the University of Technology Sydney, will introduce a variety of projects, including their work developing analytics to support student writing.
Come meet others at NYU interested in learning analytics while learning from the examples of leading work in Australia. A light lunch will be served and the talk will be followed by a short Q&A. RSVP is required.
About Simon Knight
Simon Knight is a lecturer at the University of Technology Sydney in the Faculty of Transdisciplinary Innovation. His research investigates how people find and evaluate evidence, particularly in the context of learning and educator practices. Dr Knight received his Bachelor’s degree in Philosophy and Psychology from the University of Leeds before completing a teacher education program and a Philosophy of Education MA at the UCL Institute of Education. After teaching high school social sciences, Dr Knight completed an MPhil in Educational Research Methods at Cambridge and a PhD in Learning Analytics at the UK Open University.
About Simon’s Talk
How do we make use of data about our students to support their learning, and where does learning analytics fit into that? Educators are increasingly asked to work with data and technologies such as learning analytics to support and provide evidence of student learning. However, what learning analytics developers should design for, and how educators will implement analytics, is unclear. Learning analytics risks the same levels of low uptake and implementation as many other educational technologies if they do not align with educator practice and needs. How then do we tackle this gap, to support and develop technologies that are implemented in practice, for impact on learning?
At the University of Technology Sydney, we have taken a participatory design based approach to designing and implementing learning analytics in practice, and understanding their impact. In our work we have identified existing practices with which learning analytics may be aligned to augment them. This talk introduces some of these projects, particularly drawing on our work in developing analytics to support student writing (writing analytics), giving examples of how analytics were aligned with existing pedagogic practices to support learning. Through this augmentation, supported by design-based approaches, we argue we can develop research and practice in tandem.
Learning Analytics (or: The Data Tsunami Hits Higher Education)Simon Buckingham Shum
Keynote Address to The Impact of Higher Education: Addressing the Challenges of the 21st Century. European Association for Institutional Research (EAIR) 35th Annual Forum 2013, Erasmus University, Rotterdam, the Netherlands, 28-31 August 2013. http://www.eair.nl/forum/rotterdam
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
The Generative AI System Shock, and some thoughts on Collective Intelligence ...Simon Buckingham Shum
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Deliberative Democracy as a strategy for co-designing university ethics aro...Simon Buckingham Shum
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP) which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders, has been shown to cultivate high quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into yo...Simon Buckingham Shum
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for ThinkingSimon Buckingham Shum
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your TeachingSimon Buckingham Shum
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions).
This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Towards Collaboration Translucence: Giving Meaning to Multimodal Group DataSimon Buckingham Shum
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of the ACM CHI Conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will hear and debate not only broad perspectives on the terrain, but will have been exposed to some real life specimens, and caught glimpses of the future ecosystem.
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Learning Analytics BETT2013
1. BETT 2013, London — LearnLive HigherEd
Learning Analytics: Unlocking student data for 21st century learning?
Simon Buckingham Shum, Knowledge Media Institute, The Open University UK
simon.buckinghamshum.net | @sbskmi #LearningAnalytics
2. 70-strong lab prototyping next generation learning / sensemaking / social web media / linked data / semantic web services
3. learning objective: walk out with better questions than you can ask right now
9. A recent analytics product review…
“Some have tried to argue that this technology doesn't work out cost effectively when compared to conventional tests... but this misses a huge point. More often than not, we test after the event and discover the problem — but this is too late.”
12. How is your aquatic ecosystem?
“This means that the keeper can be notified before water conditions directly harm the fish—an assured outcome of predictive software that lets you know if it looks like the pH is due to drop, or the temperature is on its way up. This way, it’s a real fish saver, as opposed to a forensic examiner, post-wipeout.”
(From a review of Seneye, in a hobbyist magazine)
13. How is your learning ecosystem?
This means that the teacher can be notified before learning conditions directly harm the students — an assured outcome of predictive software that lets you know if it looks like engagement is due to drop, or distraction is on its way up. This way, it’s a real student saver, as opposed to a forensic examiner, post-wipeout.
14. but you still need to know what good looks like… and what to do when it drops…
18. Purdue University Signals: real-time traffic lights for students based on a predictive model
MODEL:
• ACT or SAT score
• Overall grade-point average
• CMS usage composite
• CMS assessment composite
• CMS assignment composite
• CMS calendar composite
Predicted 66%-80% of struggling students who needed help.
Campbell et al (2007). Academic Analytics: A New Tool for a New Era. EDUCAUSE Review, vol. 42, no. 4 (July/August 2007): 40–57. http://bit.ly/lmxG2x
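A Signals-style system combines predictors like those listed on the slide into a single risk indicator. A minimal, hypothetical sketch of that idea follows; the weights, thresholds and the 0-1 normalisation are invented for illustration and are not Purdue's actual model:

```python
# Hypothetical sketch of a Signals-style "traffic light" risk indicator.
# The weights and thresholds are invented for illustration; they are not
# Purdue's actual model. All inputs are assumed pre-normalised to 0-1,
# where higher means stronger performance or engagement.

def risk_signal(test_score, gpa, cms_usage, cms_assessment,
                cms_assignment, cms_calendar):
    """Combine the predictors listed on the slide into a 0-1 risk score,
    then map it onto a traffic-light colour."""
    performance = (0.25 * test_score       # ACT or SAT score
                   + 0.25 * gpa            # overall grade-point average
                   + 0.15 * cms_usage      # CMS usage composite
                   + 0.15 * cms_assessment
                   + 0.15 * cms_assignment
                   + 0.05 * cms_calendar)  # weights sum to 1.0
    risk = 1.0 - performance
    if risk >= 0.6:
        return "red"    # high risk: intervene now
    if risk >= 0.3:
        return "amber"  # moderate risk: monitor and nudge
    return "green"      # on track

print(risk_signal(0.8, 0.9, 0.7, 0.8, 0.9, 0.5))  # engaged, strong student
print(risk_signal(0.3, 0.2, 0.1, 0.2, 0.1, 0.0))  # disengaged, struggling
```

The point of the sketch is the design shape, not the numbers: a weighted blend of prior attainment and live CMS engagement, discretised into a signal that staff and students can act on before grades are finalised.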
19. Purdue University Signals: real-time traffic lights for students based on a predictive model
“Results thus far show that students who have engaged with Course Signals have higher average grades and seek out help resources at a higher rate than other students.”
Pistilli, M. D., Arnold, K. and Bethune, M., Signals: Using Academic Analytics to Promote Student Success. EDUCAUSE Review Online, July/Aug., (2012). http://www.educause.edu/ero/article/signals-using-academic-analytics-promote-student-success
20. Enabling staff to monitor courses and student academic success. View profiles showing predictions of academic success in relation to success factors and cohort.
Chris Ballard, Tribal Labs / @chrisaballard / www.triballabs.net
21. Predictive model relates predictions to student success factors to help staff identify interventions. Understand patterns of student activity and engagement with university services.
Chris Ballard, Tribal Labs / @chrisaballard / www.triballabs.net
22. predictive models are exciting, but there are many other kinds of analytics
23. Analytics in your VLE:
Blackboard: feedback to students
http://www.blackboard.com/Platforms/Analytics/Products/Blackboard-Analytics-for-Learn.aspx
26. Emerging interest in learning analytics
Professor Mark Stubbs | m.stubbs@mmu.ac.uk
• Why? Make better decisions. MMU example: choosing a new VLE, exploring VLE usage patterns, learner demographics, entry qualifications and exam results (institution-wide planning support, 2010–2013)
• Seek to correlate variables with final success/failure
• Triangulate with extensive survey and focus groups
• Result: Critical Success Factors inform requirements for new VLE
28. Why do dispositions matter?
“Knowledge of methods alone will not suffice: there must be the desire, the will, to employ them. This desire is an affair of personal disposition.”
John Dewey
Dewey, J. How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process. Heath and Co, Boston, 1933
29. Validated as loading onto 7 dimensions of “Learning Power”
Being Stuck & Static ↔ Changing & Learning
Data Accumulation ↔ Meaning Making
Passivity ↔ Critical Curiosity
Being Rule Bound ↔ Creativity
Isolation & Dependence ↔ Learning Relationships
Being Robotic ↔ Strategic Awareness
Fragility & Dependence ↔ Resilience
Univ. Bristol and Vital Partnerships provide practitioner resources and tools to support their application in schools, HEIs and the workplace
30. ELLI: Effective Lifelong Learning Inventory
Web questionnaire, 72 items (children and adult versions; used in schools, universities and the workplace)
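The pipeline from questionnaire to spider diagram can be sketched as averaging item responses into the seven dimension scores. This is an illustrative toy only: the item-to-dimension groupings and the responses below are invented, not the actual 72-item ELLI instrument.

```python
from statistics import mean

# Illustrative sketch only: ELLI-style Likert items (1-5) averaged into
# the seven "Learning Power" dimension scores that feed the spider
# diagram. The item-to-dimension groupings and responses are invented;
# the real instrument has 72 items.

DIMENSION_ITEMS = {
    "Changing & Learning":    [1, 8, 15],
    "Meaning Making":         [2, 9, 16],
    "Critical Curiosity":     [3, 10, 17],
    "Creativity":             [4, 11, 18],
    "Learning Relationships": [5, 12, 19],
    "Strategic Awareness":    [6, 13, 20],
    "Resilience":             [7, 14, 21],
}

responses = {i: (i % 5) + 1 for i in range(1, 22)}  # fake Likert answers

scores = {dim: round(mean(responses[i] for i in items), 2)
          for dim, items in DIMENSION_ITEMS.items()}

for dim, score in scores.items():
    print(f"{dim}: {score}")
```

Each dimension score is then plotted as one spoke of the spider diagram, giving learners a visual profile of their dispositions rather than a single grade.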
31. Analytics for lifelong/lifewide learning dispositions: ELLI
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge (29 Apr-2 May, Vancouver). Eprint: http://oro.open.ac.uk/32823
33. EnquiryBlogger: Tuning Wordpress as an ELLI-based learning journal
Piloting from Yr 5, to secondary, to Masters level. Standard Wordpress editor.
http://learningemergence.net/tools/enquiryblogger
34. EnquiryBlogger: Tuning Wordpress as an ELLI-based learning journal
Piloting from Yr 5, to secondary, to Masters level. Categories from ELLI.
http://learningemergence.net/tools/enquiryblogger
35. EnquiryBlogger: Tuning Wordpress as an ELLI-based learning journal
Piloting from Yr 5, to secondary, to Masters level. Plugin visualizes blog categories, mirroring the ELLI spider, with direct navigation to blog posts from here.
38. unpacking deeper learning
example: online student discourse analytics that go beyond “number of forum posts” + “trending topics”
39. Social Network Analysis (SNAPP)
What’s going on in these discussion forums?
Bakharia, A. and Dawson, S., SNAPP: a bird's-eye view of temporal participant interaction. In: Proceedings of the 1st International Conference on Learning Analytics and Knowledge (Banff, Alberta, Canada, 2011). ACM. pp.168-173
40. Social Network Analysis (SNAPP)
http://www.slideshare.net/aneeshabakharia/snapp-20minute-presentation
41. Social Network Analysis (SNAPP)
Two learners connect otherwise separate clusters. The tutor is only engaging with active students, ignoring disengaged ones on the edge.
http://www.slideshare.net/aneeshabakharia/snapp-20minute-presentation
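The kind of reading SNAPP supports can be sketched computationally: build a who-replies-to-whom network from forum data and flag learners with no ties, who appear on the periphery of the diagram. This is a toy with invented post data and names, not SNAPP's implementation:

```python
from collections import defaultdict

# Toy sketch of a SNAPP-style reading of a forum: build a
# who-replies-to-whom network and flag learners with no reply ties.
# The posts and names are invented; this is not SNAPP's implementation.

posts = [  # (replier, author being replied to)
    ("tutor", "alice"), ("alice", "tutor"), ("tutor", "bob"),
    ("bob", "alice"), ("carol", "dan"), ("dan", "carol"),
]
students = {"alice", "bob", "carol", "dan", "eve"}  # eve never interacted

degree = defaultdict(int)
for replier, author in posts:
    degree[replier] += 1
    degree[author] += 1

# Learners with no reply ties sit on the edge of the network diagram:
# exactly the disengaged students the slide says the tutor is missing.
isolated = sorted(s for s in students if degree[s] == 0)
print("isolated:", isolated)
```

Richer SNA measures (e.g. betweenness centrality, which would surface the two learners brokering between clusters) follow the same pattern of deriving structure from reply relations rather than post counts.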
42. Social Learning Analytics about to appear in products…
http://www.desire2learn.com/products/analytics (this is from a beta demo)
43. Discourse analytics: what intellectual contribution does this learner make?
Rebecca is playing the role of broker, connecting peers’ contributions in meaningful ways.
De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. Discourse-centric learning analytics. 1st International Conference on Learning Analytics & Knowledge (Banff, 27 Mar-1 Apr, 2011), ACM: New York. pp.22-33 http://oro.open.ac.uk/25829
44. Semantic Social Network Analytics: shows if users agree or disagree
De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. Discourse-centric learning analytics. 1st International Conference on Learning Analytics & Knowledge (Banff, 27 Mar-1 Apr, 2011), ACM: New York. pp.22-33 http://oro.open.ac.uk/25829
45. Discourse analytics on webinar textchat
Can we spot the quality learning conversations in a 2.5 hr webinar?
Ferguson, R. and Buckingham Shum, S., Learning analytics to identify exploratory dialogue within synchronous text chat. In: 1st International Conference on Learning Analytics and Knowledge (Banff, Canada, 2011). ACM
46. Discourse analytics on webinar textchat
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations?
Not at the start and end of a webinar…
[Chart: average exploratory-talk score per message across the 9:28–12:05 textchat. The opening and closing messages are social chitchat (“Greetings from Hong Kong”, “Sheffield, UK not as sunny as yesterday - still warm”, “Morning from Wiltshire, sunny here!”, “See you!”, “bye for now!”, “bye, and thank you”, “Bye all for now”) and score low.]
47. Discourse analytics on webinar textchat
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations?
Not at the start and end of a webinar, but if we zoom in on a peak…
[Chart: the same exploratory-talk timeline, with one peak selected for closer inspection.]
48. Discourse analytics on webinar textchat
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations?
Not at the start and end of a webinar, but if we zoom in on a peak…
[Chart: zoomed view of the peak, with each message classified as “exploratory talk” (more substantive for learning) or “non-exploratory”.]
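The classification behind these charts rests on indicator phrases for exploratory dialogue. As a toy illustration of the approach, with marker lists invented here and far cruder than the cue phrases Ferguson and Buckingham Shum actually used, each chat message can be scored by counting exploratory markers against social-chitchat markers:

```python
# Illustrative marker lists only; not the published cue-phrase sets
EXPLORATORY = ["because", "i think", "what if", "maybe", "agree", "for example"]
NON_EXPLORATORY = ["hello", "bye", "thanks", "see you", "greetings"]

def score(message: str) -> int:
    """+1 per exploratory marker, -1 per social-chitchat marker."""
    text = message.lower()
    return (sum(text.count(m) for m in EXPLORATORY)
            - sum(text.count(m) for m in NON_EXPLORATORY))

chat = [
    ("9:28", "Greetings from Hong Kong"),
    ("10:31", "I think this works because the model sees prior attempts"),
    ("10:32", "maybe, but what if the cohort changes?"),
    ("12:05", "bye for now!"),
]

# Label each timestamped message, as the timeline chart does per window
for t, msg in chat:
    label = "exploratory" if score(msg) > 0 else "non-exploratory"
    print(t, label)
```

Averaging these per-message scores over time windows yields exactly the kind of curve the slides show: troughs of greetings and goodbyes at the start and end, peaks where substantive discussion happens.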
49. “Rhetorical parsing” to identify constructions signifying scholarly writing
OPEN QUESTION:
“… little is known …”
“… role … has been elusive”
“Current data is insufficient …”
SURPRISE:
“We have recently observed ... surprisingly”
“We have identified ... unusual”
“The recent discovery ... suggests intriguing roles”
CONTRASTING IDEAS:
“… unorthodox view resolves …”
“In contrast with previous hypotheses ...”
“... inconsistent with past findings ...”
http://technologies.kmi.open.ac.uk/cohere/2012/01/09/cohere-plus-automated-rhetorical-annotation
De Liddo, A., Sándor, Á. and Buckingham Shum, S., Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work, 21, 4-5, (2012), 417-448. http://oro.open.ac.uk/31052
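The published system relies on deep linguistic parsing, not simple string matching, but the core idea of mapping surface constructions to rhetorical moves can be caricatured in a few lines. The regex patterns below are invented for illustration, loosely echoing the example phrases on the slide:

```python
import re

# Toy marker patterns per rhetorical move; the real annotation pipeline
# uses full linguistic analysis rather than regexes like these
PATTERNS = {
    "OPEN QUESTION": [r"little is known", r"elusive", r"data (is|are) insufficient"],
    "CONTRASTING IDEAS": [r"in contrast with", r"inconsistent with", r"unorthodox view"],
    "SURPRISE": [r"surprising", r"unusual", r"intriguing"],
}

def annotate(sentence: str) -> list[str]:
    """Return the rhetorical moves whose markers appear in the sentence."""
    low = sentence.lower()
    return [move for move, pats in PATTERNS.items()
            if any(re.search(p, low) for p in pats)]

print(annotate("Little is known about the role of X."))   # ['OPEN QUESTION']
print(annotate("This is inconsistent with past findings."))  # ['CONTRASTING IDEAS']
```

Running such an annotator over a paper's sentences highlights exactly the candidate "key contribution" passages that the next slide compares against a human analyst's selections.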
50. “What are the key contributions of this text?”
[Side-by-side comparison: human analyst vs. computational analyst annotations]
http://technologies.kmi.open.ac.uk/cohere/2012/01/09/cohere-plus-automated-rhetorical-annotation
De Liddo, A., Sándor, Á. and Buckingham Shum, S., Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work, 21, 4-5, (2012), 417-448. http://oro.open.ac.uk/31052
51. learning objective – how are we doing?
Walk out with better questions than you could ask 30 minutes ago
52. How will my org. evolve from a digital exoskeleton to a nervous system?
Ed Dumbill: http://strata.oreilly.com/2012/08/digital-nervous-system-big-data.html
53. The Wal-Martification of education?
“What counts as data, how do you get it, and what does it actually mean?”
“The basic question is not what can we measure? The basic question is what does a good education look like?”
“data narrowness”
“instrumental learning”
“students with no curiosity”
Big questions.
http://chronicle.com/blogs/techtherapy/2012/05/02/episode-95-learning-analytics-could-lead-to-wal-martification-of-college
http://lak12.wikispaces.com/Recordings
54. Analytics provide maps = systematic ways of distorting reality in order to reduce complexity
“A marker of the health of the learning analytics field will be the quality of debate around what the technology renders visible and leaves invisible.”
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge (29 Apr-2 May, 2012, Vancouver, BC). ACM: New York. Eprint: http://oro.open.ac.uk/32823
55. Will your staff know how to read and write analytics?
This will become a key literacy.
56. What if you engaged your learners in the co-design of the analytics which will track them?
Think about the conversations you’d need to have…
57. Are you ready for your performance indicators to be computed from analytics?
58. Our analytics are our pedagogy
They promote assessment regimes — which drive (and strangle) educational innovation
61. BETT 2013, London — LearnLive HigherEd
Learning Analytics: unlocking student data for 21st century learning?
Simon Buckingham Shum
Knowledge Media Institute
The Open University UK
simon.buckinghamshum.net
@sbskmi #LearningAnalytics