Invited talk, INSIGHT Centre for Data Analytics, Univ. Galway, 2 Oct 2013, http://www.insight-centre.org
Abstract:
Data and analytics are transforming how organisations work in all sectors. While there are clearly ethical issues around big data and privacy, there may also be an argument that educational institutions have a moral obligation to use all the information they have to maximize the learner's progress. So, assuming education can't (arguably shouldn't) resist this revolution, the question is how to harness this new capability intelligently. Learning Analytics is an exploding research field and startup market: do leaders know what to ask when the vendors roll up with dazzling dashboards? In this talk I'll provide an overview of developments, and consider some of the key questions we should be asking. Like any modelling technology and accounting system, analytics are not neutral, and do not passively describe sociotechnical reality: they begin to shape it. Moreover, they start with the things that are easiest to count, which doesn't necessarily equate to the things we value in learning. Given the crisis in education at many levels, what realities do we want analytics to perpetuate, or bring into being?
Bio:
Simon Buckingham Shum is Professor of Learning Informatics at the UK Open University's Knowledge Media Institute. He researches, teaches and consults on Learning Analytics, Collective Intelligence and Argument Visualization. His background is B.Sc. Psychology, M.Sc. Ergonomics and Ph.D. Human-Computer Interaction. He co-edited Visualizing Argumentation (Springer 2003), the standard reference in the field, followed by Knowledge Cartography (2008). In the field of Learning Analytics, he served as Program Co-Chair of the 2nd International Learning Analytics LAK12 conference, chaired the LAK13 Discourse-Centric Learning Analytics workshop, and the LASI13 Dispositional Learning Analytics workshop. He is a co-founder of the Society for Learning Analytics Research, Compendium Institute, LearningEmergence.net, and was Co-Founder and General Editor of the Journal of Interactive Media in Education. He serves on the Advisory Groups for a variety of learning analytics initiatives in education and enterprise, and is a Visiting Fellow at University of Bristol Graduate School of Education. Contact him via http://simon.buckinghamshum.net
Learning Analytics (or: The Data Tsunami Hits Higher Education)
Simon Buckingham Shum
Keynote Address to The Impact of Higher Education: Addressing the Challenges of the 21st Century
European Association for Institutional Research (EAIR) 35th Annual Forum 2013, Erasmus University, Rotterdam, the Netherlands, 28-31 August 2013. http://www.eair.nl/forum/rotterdam
This presentation proposes that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The paper introduces the broad rationale for SLA by reviewing some of the key drivers that make social learning so important today. Five forms of SLA are identified, including those which are inherently social, and others which have social dimensions. The paper goes on to describe early work towards implementing these analytics on SocialLearn, an online learning space in use at the UK's Open University, and the challenges that this is raising. This work takes an iterative approach to analytics, encouraging learners to respond to and help to shape not only the analytics but also their associated recommendations.
The paper was presented at ICALT 2013: http://infoscience.epfl.ch/record/185963?ln=en
This paper proposes a novel approach to building and deploying learning analytics dashboards in multiple learning environments. Existing learning dashboards are barely portable: once a dashboard is deployed on one learning platform, considerable effort is required to deploy it elsewhere. We suggest constructing dashboards from lightweight web applications, namely widgets. Our approach allows dashboards to be ported at no additional cost between learning environments that implement open specifications (OpenSocial and ActivityStreams) for data access and use widget APIs. We propose to facilitate reuse by sharing the dashboards and widgets via a centralized analytics repository.
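As an illustration of the data layer such portable widgets rely on, here is a minimal Python sketch of ActivityStreams-style activity records and the kind of platform-independent summary a dashboard widget might compute. The learner IDs, verbs, and aggregation function are invented for illustration; the paper specifies only that the environments expose OpenSocial and ActivityStreams.

```python
# Hypothetical sketch: the ActivityStreams actor-verb-object structure that a
# portable dashboard widget could consume. Field names follow the
# ActivityStreams 1.0 JSON serialisation; learner and object IDs are invented.
from collections import Counter

activities = [
    {"actor": {"objectType": "person", "id": "learner-42"},
     "verb": "post",
     "object": {"objectType": "comment", "id": "c-1"},
     "published": "2013-07-01T10:00:00Z"},
    {"actor": {"objectType": "person", "id": "learner-42"},
     "verb": "view",
     "object": {"objectType": "page", "id": "p-7"},
     "published": "2013-07-01T10:05:00Z"},
]

def verb_counts(stream):
    """Aggregate a learner's activity stream by verb: the kind of
    platform-independent summary a dashboard widget might plot."""
    return Counter(a["verb"] for a in stream)

print(verb_counts(activities))  # Counter({'post': 1, 'view': 1})
```

Because the widget touches only the open serialisation, not a platform API, the same aggregation works in any environment that emits this format, which is the portability claim of the paper.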
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
Using learning analytics to improve student transition into and support throughout the 1st year
Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year" delivered at the EFYE 2016 conference in Gent, Belgium
Presents an overview of the learning analytics field, touching on the status of the technology, the challenges it faces, the arrival of predictive analytics in education, and the best approach to a successful implementation.
The popular media tell us that we live in an age of disengagement. 21st-century professors are told they need to design curricula that support student success and create an engaging classroom, whether face-to-face, online, or in a blended learning environment. Creating engaging learning environments with technology will be essential to embracing 21st-century learners and their ever-evolving learning styles. Information Technology is dedicated to this philosophy and embraces varying technologies and learning concepts, with other institutions and with our own faculty, to generate innovation with technology and learning engagement in tandem. Information Technology invites the Stevens community to explore how educators can use tools such as apps, clickers, open education resources, mobile learning, and collaborative learning platforms from Google Hangouts to Massive Open Online Courses, and embrace the engagement strategies of social media.
Using Learning Analytics to Assess Innovation & Improve Student Achievement
John Whitmer, Ed.D.
Presentation about Learning Analytics for a JISC network event; discussion of research findings and implications for individuals and institutions considering a Learning Analytics project. Also discusses implications for my work with Blackboard on "Platform Analytics."
Digital Learning, Emerging Technologies, Abundant Data, and Pedagogies of Care
George Veletsianos
Keynote delivered at the Emerging Technologies in Authentic Learning Contexts Conference (Cape Town, South Africa), drawing links between my research on digital learning, emerging technologies, learner experiences, and the changing higher education landscape.
Learning Dashboards for Feedback at Scale
Tinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards, deployed at scale, create new learning traces? This talk shares experiences from a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Improving Student Achievement with New Approaches to Data
John Whitmer, Ed.D.
Presentation delivered at WASC ARC conference on April 11, 2013 on the CSU Data Dashboard and Chico State Learning Analytics case study.
Chico State Case Study: Academic technologies collect highly detailed student usage data. How can this data be used to understand and predict student performance, especially of at-risk students? This presentation will discuss research on a high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.
CSU Data Dashboard: By monitoring on-track indicators, institutional leaders can better understand not only which milestones students are failing to reach, but why they are not reaching them. The dashboard can also help campuses to design interventions or policy changes to increase student success and to gauge the impact of those interventions.
Designing Systemic Learning Analytics at the Open University
Belinda Tynan, Pro-Vice-Chancellor Learning & Teaching, The Open University, UK
Simon Buckingham Shum, Knowledge Media Institute, The Open University, UK
Replay from today's webinar in the SoLAR online open course Strategy & Policy for Systemic Learning Analytics. Thanks to the Australian Office for Learning and Teaching for sponsoring this, and to George Siemens for convening (replay):
Abstract: The OU has been analysing student data and feeding it back to faculties since its doors opened 40 years ago. However, the emergence of learning analytics technologies opens new possibilities for more effective sensemaking of richer learner data, and more timely interventions. We will introduce the framework we are developing to orchestrate the rollout of a systemic organisational analytics infrastructure (both human and technical), and discuss some of the issues that arise. We will also describe how strategic research efforts will key into this design, should they prove effective.
Perspectives on Sustainability in Higher Education: Inviting and Leveraging C...
BCcampus
Vivian Neal, Educational Consultant, Teaching and Learning Centre, Simon Fraser University
Janet Pivnick, Educational Consultant, Teaching and Learning Centre, Simon Fraser University
Festival of Learning in Burnaby, B.C. - June 6-9, 2016
Webinar for LearningAnalytics.net Open Course, Feb. 2011, (Athabasca U)
Simon Buckingham Shum
Knowledge Media Institute
Open University UK
http://simon.buckinghamshum.net
http://open.edu
Learning Analytics: Towards a New Discipline
Dragan Gasevic
The talk, motivated by the present state of learning and education, identifies a need for a systematic change to present practice. Learning analytics is identified as a possible way to address this open challenge. Some connections with evidence-based medicine are drawn. Finally, learning analytics is defined and some open research challenges are outlined.
Will Learning Analytics Transform Higher Education?
Abelardo Pardo
Discussion on the elements, actors, cultural change and scenarios that are related to Learning Analytics in Higher Education Institutions. Presentation given at the Digital Education Show Asia, Kuala Lumpur, June 2015
Keynote delivered by George Siemens (@gsiemens), Dragan Gasevic (@dgasevic), and Ryan Baker (@BakerEDMLab) at the 8th International Educational Data Mining Conference (EDM 2015) in Madrid, Spain on June 27, 2015
Educational data mining and learning analytics have to date largely focused on specific research questions that provide insight into granular interactions. These insights have been abstracted to include the development of predictive models, intelligent tutors, and adaptive learning. While there are several domains where holistic or systems models have provided additional explanatory power, work around learning has not created holistic models with the level of concreteness or richness required. The need for both granular and integrated high-level views of learning is further underscored by the distributed, lifelong, multi-spaced learning that today defines education. Drawing on social and knowledge graph theory, we propose the development of a Personal Learning Graph (PLeG): an open and learner-owned profile that addresses cognitive, affective, and related elements reflecting what a learner knows, is able to do, and the processes through which she learns best. This talk will introduce PLeG, detail the required technical infrastructure, and articulate how it would interact with established learning software.
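To make the PLeG idea concrete, here is a hypothetical sketch of a learner-owned profile as a small node/edge structure. Every node type, field name, and value below is illustrative; the keynote proposes the concept, not a schema.

```python
# Illustrative sketch of a Personal Learning Graph (PLeG) profile as a simple
# node/edge structure. All node types, edge labels, and values are invented
# for illustration only.

pleg = {
    "learner": "alice",
    "nodes": {
        "skill:statistics": {"type": "competency", "evidence": 3},
        "course:edm-101":   {"type": "activity",   "completed": True},
        "pref:visual":      {"type": "disposition"},
    },
    "edges": [
        # (source, relation, target)
        ("course:edm-101", "develops", "skill:statistics"),
        ("pref:visual", "shapes", "course:edm-101"),
    ],
}

def competencies(graph):
    """List the competency nodes: what the learner 'knows and is able to do'."""
    return [k for k, v in graph["nodes"].items() if v["type"] == "competency"]

print(competencies(pleg))  # ['skill:statistics']
```

Because the profile is a plain graph owned by the learner rather than a platform, any learning tool could, in principle, read from and write evidence to it, which is the interoperability the talk argues for.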
Learning analytics: study goal and data explorer
Jisc
Presenters:
Rob Wyn Jones, senior data and analytics integrator, Jisc
Paul Bailey, senior co-design manager, Jisc
An introduction to two tools from the Jisc learning analytics service.
Study Goal is a motivational student learning app for viewing and recording analytics data, available on Apple and Android.
Data Explorer provides a simple set of admin tools to assist HEIs and FEIs in onboarding into the learning records warehouse, and paves the way for more sophisticated dashboards and analysis, from a range of vendors and open source projects, to be adopted through our new framework.
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice, May 2013
These slides were presented at the 2015 International Conference on Education Research.
I aggregated several of my other partial slides and reports to describe an adaptive learning model pertaining to the concept of learning analytics, as well as LOD for curriculum standards and digital resources. There is a short introduction to the project ISO/IEC 20748 Learning analytics interoperability - Part 1: Reference model.
Ascilite webinar series: http://www.ascilite.org.au/index.php?p=news_detail&item=240
A slightly different version of the Macquarie University keynote at http://www.slideshare.net/sbs/our-learning-analytics-are-our-pedagogy
I swapped out the more general critiques of big data for more detail on Dispositional and Discourse Learning Analytics.
OAAI: Deploying an Open Ecosystem for Learner Analytics
Joshua
The Open Academic Analytics Initiative (OAAI), an NGLC grant recipient, has developed a predictive model for learner analytics using open-source tools, which we are releasing under an open-source license. We will share project outcomes along with research into effective OER-based intervention strategies and other critical learner analytics scaling factors.
[DSC Europe 22] Machine learning algorithms as tools for student success pred...
DataScienceConferenc1
The goal of higher education institutions is to provide quality education to students. Predicting academic success and intervening early to help at-risk students is an important task for this purpose. This talk explores the possibilities of applying machine learning to develop predictive models of academic performance. What factors lead to success at university? Are there differences between students of different generations? Answers are given by applying machine learning algorithms to a data set of 400 students from three generations of IT studies. The results show differences between students with regard to student responsibility and regularity of class attendance, and show the great potential of applying machine learning to developing predictive models.
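A toy sketch of the prediction task described here, using attendance regularity (one of the factors the results highlight) as a single predictor. The data, threshold, and evaluation below are invented for illustration; the actual study applied machine learning algorithms to records of 400 students.

```python
# Minimal sketch (invented data): a baseline predictor of the kind the talk
# evaluates, using class-attendance regularity as a single feature.
# Real studies would use richer features and proper ML algorithms.

students = [  # (attendance_rate, passed)
    (0.95, True), (0.80, True), (0.40, False), (0.30, False),
    (0.70, True), (0.55, False), (0.90, True), (0.20, False),
]

def predict(attendance, threshold=0.6):
    """Predict success when attendance meets a fixed threshold."""
    return attendance >= threshold

# Fraction of students where the baseline prediction matches the outcome.
accuracy = sum(predict(a) == p for a, p in students) / len(students)
print(f"baseline accuracy: {accuracy:.2f}")
```

A baseline like this is the natural yardstick: a learned model earns its place only if it beats the single-feature rule on held-out students.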
Educational Technology is becoming increasingly important in the higher education sector as innovative educators are using technology to improve pedagogy and student learning. This is not limited to academic institutions as corporate trainers also seek to leverage their people development resources to improve the operating performance of their organizations.
As a result the field of EdTech has been growing rapidly over the past decade as entrepreneurs see the opportunities to use technology to improve the speed and depth of learning. The drive ultimately stems from the transition to a knowledge economy where information is the vital fuel and improved learning can provide breakthrough insights that have substantial public or private value.
This presentation will look at the trends impacting and being impacted by EdTech, student and faculty perceptions, economics, adoption success factors, investment patterns, and the major technologies that are being used in higher education institutions.
Global Brain Power: edX and the Transformation of Learning through Big Data
Capgemini
An interview with Anant Agarwal, President of edX
"Analyzing the Big Data from the students’ clickstreams allows us to gain insights into how students learn and collaborate."
ICO Fall School 2012, Santuari de Santa Maria del Collell, Girona
https://sites.google.com/site/icofallschool2012
A week long PhD training school for educational and ed-tech researchers
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
Institutional Success Via a Data-Centric Technology Ecosystem
IT Consultant
The content and data landscape is vast and decentralized, and aggregating assets via a unified technology strategy poses significant challenges. This presentation describes how to design a federated technology ecosystem to inform recruitment, marketing, and outreach efforts, and increase student retention through the innovative use of data modeling and predictive analytics. Collecting data for actionable insight, leveraging CRM, web, and mobile channels, and tracking retention and graduation rates will be highlighted.
The Generative AI System Shock, and some thoughts on Collective Intelligence ...
Simon Buckingham Shum
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education
Simon Buckingham Shum
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP), which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). The DMP's combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders has been shown to cultivate high-quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
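The stratified random sampling step that DMPs use to ensure authentic representation can be sketched as follows. The strata, population sizes, and panel size are invented for illustration; real DMP recruitment stratifies on several demographics at once.

```python
# Illustrative sketch: draw a deliberative panel whose composition mirrors the
# population, by sampling from each stratum in proportion to its share.
# Strata names and sizes are invented.
import random

random.seed(1)  # reproducible draw for the example

population = {
    "undergrad": list(range(600)),
    "postgrad": list(range(300)),
    "staff": list(range(100)),
}

def stratified_sample(strata, total):
    """Sample `total` members, allocating seats proportionally per stratum."""
    n = sum(len(members) for members in strata.values())
    return {name: random.sample(members, round(total * len(members) / n))
            for name, members in strata.items()}

panel = stratified_sample(population, total=20)
print({name: len(chosen) for name, chosen in panel.items()})
# {'undergrad': 12, 'postgrad': 6, 'staff': 2}
```

Proportional allocation is the simplest variant; practical recruitment also handles rounding remainders and non-response, which this sketch omits.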
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching
Simon Buckingham Shum
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter and show how it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for Thinking
Simon Buckingham Shum
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching
Simon Buckingham Shum
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions).
This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection around as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics sheds helpful light on these questions.
Towards Collaboration Translucence: Giving Meaning to Multimodal Group DataSimon Buckingham Shum
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buck- ingham Shum.. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of ACM CHI conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will hear and debate not only broad perspectives on the terrain, but will have been exposed to some real life specimens, and caught glimpses of the future ecosystem.
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics sheds helpful light on these questions.
How to Make a Field invisible in Odoo 17Celine George
It is possible to hide or invisible some fields in odoo. Commonly using “invisible” attribute in the field definition to invisible the fields. This slide will show how to make a field invisible in odoo 17.
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdfTechSoup
In this webinar you will learn how your organization can access TechSoup's wide variety of product discount and donation programs. From hardware to software, we'll give you a tour of the tools available to help your nonprofit with productivity, collaboration, financial management, donor tracking, security, and more.
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
Safalta Digital marketing institute in Noida, provide complete applications that encompass a huge range of virtual advertising and marketing additives, which includes search engine optimization, virtual communication advertising, pay-per-click on marketing, content material advertising, internet analytics, and greater. These university courses are designed for students who possess a comprehensive understanding of virtual marketing strategies and attributes.Safalta Digital Marketing Institute in Noida is a first choice for young individuals or students who are looking to start their careers in the field of digital advertising. The institute gives specialized courses designed and certification.
for beginners, providing thorough training in areas such as SEO, digital communication marketing, and PPC training in Noida. After finishing the program, students receive the certifications recognised by top different universitie, setting a strong foundation for a successful career in digital marketing.
Biological screening of herbal drugs: Introduction and Need for
Phyto-Pharmacological Screening, New Strategies for evaluating
Natural Products, In vitro evaluation techniques for Antioxidants, Antimicrobial and Anticancer drugs. In vivo evaluation techniques
for Anti-inflammatory, Antiulcer, Anticancer, Wound healing, Antidiabetic, Hepatoprotective, Cardio protective, Diuretics and
Antifertility, Toxicity studies as per OECD guidelines
A Strategic Approach: GenAI in EducationPeter Windle
Artificial Intelligence (AI) technologies such as Generative AI, Image Generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to Academic Integrity with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, policies put in place too. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessments leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Francesca Gottschalk - How can education support child empowerment.pptxEduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
Unit 8 - Information and Communication Technology (Paper I).pdfThiyagu K
This slides describes the basic concepts of ICT, basics of Email, Emerging Technology and Digital Initiatives in Education. This presentations aligns with the UGC Paper I syllabus.
Normal Labour/ Stages of Labour/ Mechanism of LabourWasim Ak
Normal labor is also termed spontaneous labor, defined as the natural physiological process through which the fetus, placenta, and membranes are expelled from the uterus through the birth canal at term (37 to 42 weeks
Normal Labour/ Stages of Labour/ Mechanism of Labour
1. Learning Analytics
The New Burden of Knowledge
Simon Buckingham Shum
Knowledge Media Institute
The Open University UK
http://simon.buckinghamshum.net
http://linkedin.com/in/simon
INSIGHT Centre for Data Analytics, Univ. Galway, 2 Oct 2013
http://www.insight-centre.org
@sbskmi #LearningAnalytics
2. mission
walk out with better questions than you can ask right now about analytics
new tech and collaboration opportunities to advance education
10. Data and analytics are transforming
business, government and public services
Why would Higher Education be immune?
Why wouldn’t a sector focused on evidence-based
thinking and action welcome it?
A critical discussion is emerging
More later…
11. L. Johnson, R. Smith, H. Willis, A. Levine, and K. Haywood, The 2011 Horizon Report (Austin, TX: The New Media Consortium, 2011), http://www.nmc.org/pdf/2011-Horizon-Report.pdf
NMC Horizon 2011 Report:
Learning Analytics
(4-5yrs adoption)
Analytics is being heralded…
(2013 report)
19. From an analytics product review…
“Some have tried to argue that this technology doesn't work out cost effectively when compared to conventional tests... but this misses a huge point. More often than not, we test after the event and discover the problem — but this is too late.”
22. How is your aquatic ecosystem?
“This means that the keeper can be notified before water conditions directly harm the fish—an assured outcome of predictive software that lets you know if it looks like the pH is due to drop, or the temperature is on its way up. This way, it’s a real fish saver, as opposed to a forensic examiner, post-wipeout.”
(From a review of Seneye, in a hobbyist magazine)
23. How is your learning ecosystem?
This means that the teacher can be notified before learning conditions directly harm the students — an assured outcome of predictive software that lets you know if it looks like engagement is due to drop, or distraction is on its way up. This way, it’s a real student saver, as opposed to a forensic examiner, post-wipeout.
27. Purdue University Signals: real time traffic-lights for students based on predictive model
Predicted 66%-80% of struggling students who needed help
MODEL:
• ACT or SAT score
• Overall grade-point average
• CMS usage composite
• CMS assessment composite
• CMS assignment composite
• CMS calendar composite
Campbell et al (2007). Academic Analytics: A New Tool for a New Era, EDUCAUSE Review, vol. 42, no. 4 (July/August 2007): 40–57. http://bit.ly/lmxG2x
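The published Signals papers list the model's inputs but not its weights or cut-offs. As a minimal sketch of the traffic-light idea, here is a weighted composite over the kinds of features listed above; the feature names, weights and thresholds are invented for illustration, not Purdue's actual model:

```python
# Sketch of a Signals-style "traffic light" risk indicator.
# Weights and thresholds below are invented; the real Purdue model
# combined SAT/ACT, GPA and several CMS composites.

WEIGHTS = {
    "standardised_test": 0.2,   # ACT/SAT, rescaled to 0-1
    "gpa": 0.3,                 # overall grade-point average, rescaled to 0-1
    "cms_usage": 0.2,           # logins/clicks relative to course average
    "cms_assessment": 0.2,      # quiz/assignment performance so far
    "cms_calendar": 0.1,        # engagement with deadlines
}

def risk_score(features: dict) -> float:
    """Weighted composite in [0, 1]; lower means higher risk."""
    return sum(WEIGHTS[k] * features[k] for k in WEIGHTS)

def traffic_light(features: dict) -> str:
    score = risk_score(features)
    if score < 0.4:
        return "red"     # likely to struggle: intervene now
    elif score < 0.7:
        return "amber"   # at some risk: monitor / nudge
    return "green"       # on track

print(traffic_light({"standardised_test": 0.9, "gpa": 0.8,
                     "cms_usage": 0.7, "cms_assessment": 0.75,
                     "cms_calendar": 0.6}))  # -> green
```

The point of the sketch is that the light a student sees is only as good as the composites and cut-offs behind it, which is exactly why the model's contents matter.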
28. Purdue University Signals: real time traffic-lights for students based on predictive model
“Results thus far show that
students who have engaged with
Course Signals have higher
average grades and seek out help
resources at a higher rate than
other students.”
Pistilli, M. D., Arnold, K. and Bethune, M., Signals: Using Academic
Analytics to Promote Student Success. EDUCAUSE Review
Online, July/Aug., (2012).
http://www.educause.edu/ero/article/signals-using-academic-analytics-promote-student-success
29. Predictive analytics @open.edu
How early can we predict likelihood of dropout, formal withdrawal, failure?
Now exploring conventional statistics, machine learning and growing datasets
Data sources: Registration pattern • CRM contact • VLE interaction • Grades • Demographics • Library interaction • OpenLearn interaction • FutureLearn interaction • Social App X interaction • OU history
30. Predictive analytics @open.edu
Test a range of predictive models: final result (pass/fail); final numerical score; drop in the next TMA; score of the next TMA
Adding in user interaction data from the VLE: demographics, previous results, VLE activity
A.L. Wolff and Z. Zdrahal (2012). Improving Retention by Identifying and Supporting “At-risk” Students. EDUCAUSE Review Online, July-August 2012. http://www.educause.edu/ero/article/improving-retention-identifying-and-supporting-risk-students
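The OU work tests a range of statistical and machine-learned models. As a deliberately simple illustration of the underlying signal, the sketch below flags a student whose latest week of VLE activity drops sharply against their own recent baseline; the window and threshold are invented, not taken from the Wolff and Zdrahal models:

```python
# Illustrative at-risk flag from weekly VLE click counts.
# The "drop versus own recent average" rule and its parameters are
# invented stand-ins for the OU's statistical/ML models.

def flag_at_risk(weekly_clicks: list, window: int = 3,
                 drop_ratio: float = 0.5) -> bool:
    """Flag if the latest week falls below drop_ratio times the
    student's own average over the preceding `window` weeks."""
    if len(weekly_clicks) < window + 1:
        return False  # not enough history yet
    recent = weekly_clicks[-(window + 1):-1]
    baseline = sum(recent) / window
    return weekly_clicks[-1] < drop_ratio * baseline

print(flag_at_risk([40, 35, 38, 12]))  # sharp drop -> True
print(flag_at_risk([40, 35, 38, 30]))  # normal variation -> False
```

Comparing a learner against their own history, rather than a cohort norm, is one design choice among several that the real models have to weigh.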
31. the opportunity for the learning sciences to combine with your university’s collective intelligence
38. …as they get joined up, each level enriches the others
Micro: individual user actions (and hence cohort)
Meso: institution-wide
Macro: region/state/national/international
Hard distinctions between Learning + Academic analytics may dissolve
Aggregation of user traces enriches meso + macro analytics with finer-grained process data
Breadth + depth from macro + meso levels add power to micro analytics
43. Analytics coming to a VLE near you: e.g. Blackboard
http://www.blackboard.com/platforms/analytics/overview.aspx
http://www.blackboard.com/Platforms/Analytics/Products/Blackboard-Analytics-for-Learn.aspx
44. Student Activity Dashboard (Erik Duval)
Duval E. (2011) Attention please!: learning analytics for visualization and recommendation. Proceedings of the 1st International
Conference on Learning Analytics and Knowledge. Banff, Alberta, Canada: ACM, 9-17.
47. Intelligent tutoring for skills mastery (CMU)
http://oli.cmu.edu
Lovett M, Meyer O and Thille C. (2008) The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student
learning. Journal of Interactive Media in Education 14. http://jime.open.ac.uk/article/2008-14/352
“In this study, results showed
that OLI-Statistics students
[blended learning] learned a full
semester’s worth of material in
half as much time and
performed as well or better than
students learning from
traditional instruction over a full
semester.”
48. Are students using the right tools at the right time in the right way? (Abelardo Pardo, LAK13 Keynote)
http://www.slideshare.net/abelardo_pardo/bridging-the-middle-space-with-learning-analytics
49. Social Learning Analytics
• Explosive growth in social media
• The open/free content paradigm
• Evidence of a global shift in societal attitudes which increasingly values participation
• Innovation depends on reciprocal social relationships, tacit knowing
Buckingham Shum, S. and Ferguson, R. (2012). Social Learning Analytics. Journal of Educational Technology and Society, 15(3) pp. 3–26. http://oro.open.ac.uk/34092
50. Social Network Analysis (SNAPP)
What’s going on in these discussion forums?
Bakharia, A. and Dawson, S., SNAPP: a bird's-eye view of temporal participant interaction. In: Proceedings of the 1st International Conference on Learning Analytics and Knowledge (Banff, Alberta, Canada, 2011). ACM. pp.168-173
51. Social Network Analysis (SNAPP)
http://www.slideshare.net/aneeshabakharia/snapp-20minute-presentation
52. Social Network Analysis (SNAPP)
2 learners connect otherwise separate clusters
tutor only engaging with active students, ignoring disengaged ones on the edge
http://www.slideshare.net/aneeshabakharia/snapp-20minute-presentation
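SNAPP builds its sociograms from forum reply structure. The toy below (not SNAPP's implementation; the post data and names are invented) shows how the slide's second pattern, a tutor who never interacts with some students, could be detected from a who-replies-to-whom graph:

```python
# Minimal SNAPP-style sketch: derive a reply graph from forum posts
# and spot students the tutor never interacts with.
# Post format and names are invented for illustration.

from collections import defaultdict

posts = [  # (author, replied_to) -- None means a new thread
    ("tutor", None), ("ana", "tutor"), ("tutor", "ana"),
    ("ben", "ana"), ("tutor", "ben"), ("caz", None),  # caz gets no replies
]

edges = defaultdict(int)  # (replier, original_author) -> reply count
for author, parent in posts:
    if parent is not None:
        edges[(author, parent)] += 1

tutor_contacts = {target for (replier, target) in edges if replier == "tutor"}
participants = {author for author, _ in posts} - {"tutor"}
ignored = participants - tutor_contacts
print(sorted(ignored))  # students with no tutor reply -> ['caz']
```

The same edge list would feed a network layout like SNAPP's, where disengaged students appear on the edge of the cluster.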
53. Social Learning Analytics about to appear in products…
http://www.desire2learn.com/products/analytics (this is from a beta demo)
54. Semantic Social Network Analytics
De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. Discourse-centric learning analytics. 1st
International Conference on Learning Analytics & Knowledge (Banff, 27 Mar-1 Apr, 2011) http://oro.open.ac.uk/25829
55. Visualizing and filtering social ties in
SocialLearn by topic and type
Schreurs B, Teplovs C, Ferguson R, De Laat M and Buckingham Shum S. (2013) Visualizing Social Learning Ties by Type and Topic:
Rationale and Concept Demonstrator. Proc. 3rd International Conference on Learning Analytics & Knowledge. Leuven, BE: ACM, 33-37.
Open Access Eprint: http://oro.open.ac.uk/36891
61. The promise of language technologies
Beyond number / size / frequency of posts or ‘trending topic’
http://www.glennsasscer.com/wordpress/wp-content/uploads/2011/10/iceberg.jpg
62. Discourse analytics on webinar
textchat
Ferguson, R. and Buckingham Shum, S., Learning analytics to identify exploratory dialogue within synchronous text chat. In: 1st International
Conference on Learning Analytics and Knowledge (Banff, Canada, 2011). ACM
Can we spot the
quality learning
conversations in
a 2.5 hr webinar?
65. Discourse analytics on webinar textchat
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations?
[Chart: posts per time interval from 9:28 to 12:03, classified as “exploratory talk” (more substantive for learning) vs “non-exploratory”]
Not at the start and end of a webinar but if we zoom in on a peak…
Ferguson, R., Wei, Z., He, Y. and Buckingham Shum, S., An Evaluation of Learning Analytics to Identify Exploratory Dialogue in Online Discussions. In: Proc. 3rd International Conference on Learning Analytics & Knowledge (Leuven, BE, 8-12 April, 2013). ACM. http://oro.open.ac.uk/36664
66. Discourse analytics on webinar textchat
Visualizing by individual user. The gradient of the threshold line is adjusted to every 5 posts in 6 classified as “Exploratory Talk”
Ferguson, R., Wei, Z., He, Y. and Buckingham Shum, S., An Evaluation of Learning Analytics to Identify Exploratory Dialogue in Online Discussions. In: Proc. 3rd International Conference on Learning Analytics & Knowledge (Leuven, BE, 8-12 April, 2013). ACM. http://oro.open.ac.uk/36664
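Ferguson et al. train classifiers for this task; the sketch below is only a toy cue-phrase matcher with a tiny invented cue list, to make concrete how posts could be flagged and a per-session exploratory share computed:

```python
# Toy cue-phrase classifier for "exploratory talk" in chat posts.
# The cue list is an invented sample, not the published indicator set,
# and a real system would use a trained classifier, not substring matching.

EXPLORATORY_CUES = ("because", "what if", "i think", "why", "agree",
                    "but ", "have you thought")

def is_exploratory(post: str) -> bool:
    text = post.lower()
    return any(cue in text for cue in EXPLORATORY_CUES)

chat = [  # (timestamp, message) -- invented webinar textchat
    ("09:30", "hello everyone!"),
    ("10:05", "I think that works because the model is trained online"),
    ("10:06", "but what if the cohort is small?"),
    ("11:58", "bye, thanks all"),
]

flagged = [(t, is_exploratory(msg)) for t, msg in chat]
share = sum(f for _, f in flagged) / len(flagged)
print(f"{share:.0%} of posts flagged exploratory")  # prints "50% ..."
```

Binning the flags by time interval would reproduce the shape of the chart on slide 65: social chatter at the start and end, substantive peaks in between.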
67. “Rhetorical parsing” to identify constructions signifying scholarly writing
OPEN QUESTION: “… little is known …” / “… role … has been elusive” / “Current data is insufficient …”
CONTRASTING IDEAS: “… unorthodox view resolves …” / “In contrast with previous hypotheses ...” / “... inconsistent with past findings ...”
SURPRISE: “We have recently observed ... surprisingly” / “We have identified ... unusual” / “The recent discovery ... suggests intriguing roles”
http://technologies.kmi.open.ac.uk/cohere/2012/01/09/cohere-plus-automated-rhetorical-annotation
De Liddo, A., Sándor, Á. and Buckingham Shum, S., Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work, 21, 4-5, (2012), 417-448. http://oro.open.ac.uk/31052
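XIP is a full incremental parser; as a loud simplification, a few regular expressions over cue constructions like those above can mimic the tagging idea. The patterns below are invented from the slide's examples, not XIP's grammar:

```python
# Sketch of cue-based rhetorical move tagging. A handful of regexes
# stand in for XIP's linguistic analysis, purely for illustration.

import re

MOVES = {
    "OPEN QUESTION":     [r"little is known", r"has been elusive",
                          r"data (is|are) insufficient"],
    "CONTRASTING IDEAS": [r"in contrast with", r"inconsistent with"],
    "SURPRISE":          [r"surprisingly", r"recent discovery"],
}

def tag_moves(sentence: str) -> list:
    """Return the rhetorical moves whose cue patterns match."""
    s = sentence.lower()
    return [move for move, patterns in MOVES.items()
            if any(re.search(p, s) for p in patterns)]

print(tag_moves("Little is known about the role of dispositions."))
print(tag_moves("Surprisingly, this is inconsistent with past findings."))
```

Shallow pattern matching like this is exactly what the evaluation on slide 70 probes: how often do the machine's matches coincide with a human analyst's judgement?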
68. Xerox Incremental Parser (XIP)
Sándor, Á. and Vorndran, A. (2010). The detection of salient messages from social science research papers and its application in document search.
Workshop on Natural Language Processing Tools Applied to Discourse Analysis in Psychology, Buenos Aires, Argentina, May 10-14. 2010.
70. Initial evaluation of XIP is promising, but methodologically complex
Extract from annotation comparison:
Document 1: human analyst annotated 19 sentences; XIP annotated 22, of which 11 were the same as the human annotation
Document 2: human analyst annotated 71 sentences; XIP annotated 59, of which 42 were the same as the human annotation
A striking example – but not all were like this (De Liddo et al, 2012)
71. Xerox Incremental Parser (XIP)
XIP’s raw output is fine for NLP
machines/researchers, but
not learner/educator
friendly
72. Xerox Incremental Parser (XIP)
5000 (or even 30) plain text files…
we need overviews
of XIP analyses from
a corpus
73. Making XIP analytics visible:
Annotations on the full text using the OU’s Cohere
social sensemaking app (Firefox add-on)
74. XIP Dashboard
All papers by year and concept, with colour = concept density (v2 mockup)
Simsek D, Buckingham Shum S, Sándor Á, De Liddo A and Ferguson R. (2013) XIP Dashboard: Visual Analytics from Automated Rhetorical Parsing of Scientific Metadiscourse. 1st International Workshop on Discourse-Centric Learning Analytics, at 3rd International Conference on Learning Analytics & Knowledge. Leuven, BE (Apr. 8-12, 2013). Open Access Eprint: http://oro.open.ac.uk/37391
76. Why do dispositions matter?
“Knowledge of methods alone will not suffice: there must be the desire, the will, to employ them. This desire is an affair of personal disposition.” John Dewey
Dewey, J. How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process. Heath and Co, Boston, 1933
77. Why do dispositions matter?
“In the growth mindset, people believe that their talents and abilities can be developed through passion, education, and persistence … It’s about a commitment to … taking informed risks … surrounding yourself with people who will challenge you to grow” Carol Dweck
Interview with Carol Dweck: http://interviewscoertvisser.blogspot.co.uk/2007/11/interview-with-carol-dweck_4897.html
78. Why do dispositions matter?
“We’re looking at the profiles of what it means to be effective in the 21st century. […] Resilience will be the defining concept. When challenged and bent, you learn and bounce back stronger.”
“Dispositions are now at least as important as Knowledge and Skills. …They cannot be taught. They can only be cultivated.” John Seely Brown
http://reimaginingeducation.org conference (May 28, 2013)
Dispositions clip: http://www.c-spanvideo.org/clip/4457327
Whole talk: http://www.c-spanvideo.org/program/SecD
79. How can we model and quantify learning dispositions in order to develop analytics?
80. Validated as loading onto 7 dimensions of “Learning Power”, each with an opposite pole:
Changing & Learning vs Being Stuck & Static
Meaning Making vs Data Accumulation
Critical Curiosity vs Passivity
Creativity vs Being Rule Bound
Learning Relationships vs Isolation & Dependence
Strategic Awareness vs Being Robotic
Resilience vs Fragility & Dependence
Ruth Deakin Crick, Grad. School of Education
81. Learning to Learn: 7 Dimensions of Learning Power
Factor analysis of the literature plus expert interviews: identified seven
dimensions of effective “learning power”, since validated empirically with
learners at many levels. (Deakin Crick, Broadfoot and Claxton, 2004)
84. Analytics for lifelong/lifewide
learning dispositions: ELLI
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and
Learning Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge. (29 Apr-2 May, Vancouver). Eprint: http://oro.open.ac.uk/32823
85. ELLI generates cohort data for each
dimension
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and
Learning Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge. (29 Apr-2 May, Vancouver). Eprint: http://oro.open.ac.uk/32823
86. Primary School EnquiryBloggers
Bushfield School, Wolverton, UK
EnquiryBlogger: blogging for Learning Power & Authentic Enquiry
http://learningemergence.net/2012/06/20/enquiryblogger-for-learning-power-authentic-enquiry
87. Masters level EnquiryBloggers
Graduate School of Education, University of Bristol
EnquiryBlogger: blogging for Learning Power & Authentic Enquiry
http://learningemergence.net/2012/06/20/enquiryblogger-for-learning-power-authentic-enquiry
89. Could a platform generate an ELLI profile from user traces?
Different social network patterns in different contexts may load onto Learning Relationships
Questioning and challenging may load onto Critical Curiosity
Sharing relevant resources from other contexts may load onto Meaning Making
Repeated attempts to pass an online test may load onto Resilience
Shaofu Huang: Prototyping Learning Power Modelling in SocialLearn
http://www.open.ac.uk/blogs/SocialLearnResearch/2012/06/20/social-learning-analytics-symposium
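The hypotheses above could be prototyped as a simple event-to-dimension mapping. Everything below (the event names and the one-to-one mapping) is an invented illustration of the slide's conjectures, not the validated ELLI instrument:

```python
# Sketch: accumulate evidence for Learning Power dimensions from a
# stream of platform trace events. Event names and mappings are
# invented stand-ins for the SocialLearn prototyping work.

from collections import Counter

EVENT_TO_DIMENSION = {
    "asked_question":   "Critical Curiosity",
    "challenged_claim": "Critical Curiosity",
    "shared_resource":  "Meaning Making",
    "retried_quiz":     "Resilience",
    "new_connection":   "Learning Relationships",
}

def elli_profile(events: list) -> Counter:
    """Count evidence per dimension, ignoring unmapped events."""
    return Counter(EVENT_TO_DIMENSION[e] for e in events
                   if e in EVENT_TO_DIMENSION)

profile = elli_profile(["asked_question", "retried_quiz",
                        "retried_quiz", "shared_resource"])
print(profile.most_common())
```

Even this toy makes the slide's caveat visible: whether "repeated quiz attempts" really loads onto Resilience, rather than confusion, is an empirical validation question, not a coding decision.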
90. Envisioning a social learning analytics dashboard
Your most recent mood comment: “Great, at last I have found all the resources that I have been looking for, thanks to Steve and Ellen.”
In your last discussion with your mentor, you decided to work on your resilience by taking on more learning challenges.
Your ELLI Spider shows that you have made a start on working on your resilience, and that you are also beginning to work on your creativity, which you identified as another area to work on.
Ferguson R and Buckingham Shum S. (2012) Social Learning Analytics: Five Approaches. Proc. 2nd International Conference on Learning Analytics & Knowledge. Vancouver, 29 Apr-2 May: ACM: New York, 23-33. DOI: http://dx.doi.org/10.1145/2330601.2330616 Eprint: http://oro.open.ac.uk/32910
92. Accounting tools are not neutral
“accounting tools...do not simply
aid the measurement of
economic activity, they shape the
reality they measure”
Du Gay, P. and Pryke, M. (2002) Cultural Economy: Cultural Analysis and Commercial Life. Sage, London. pp. 12-13
93. cf. Bowker and Star’s “Sorting Things Out” on classification schemes
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning
Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge. (29 Apr-2 May, 2012, Vancouver, BC). ACM. Eprint: http://oro.open.ac.uk/32823
“A marker of the health of the learning analytics field will be the quality of debate around what the technology renders visible and leaves invisible.”
94. The Wal-Martification of education?
http://chronicle.com/blogs/techtherapy/2012/05/02/episode-95-learning-analytics-could-lead-to-wal-martification-of-college
http://lak12.wikispaces.com/Recordings
“The basic question is not what can we measure? The basic question is what does a good education look like?”
Big questions: “data narrowness”, “instrumental learning”, “students with no curiosity”.
95. “Our analytics are our pedagogy”
(and epistemology)
They promote assessment regimes, which drive (and strangle) educational innovation.
Knight S., Buckingham Shum S. and Littleton K. (2013) Epistemology, Pedagogy, Assessment and Learning Analytics. Proc. 3rd International Conference on Learning Analytics & Knowledge. Leuven, BE: ACM, 75-84. Open Access Eprint: http://oro.open.ac.uk/36635
99. All analytics are infused with human values
Elaborated version of figure from Doug Clow: http://www.slideshare.net/dougclow/the-learning-analytics-cycle-closing-the-loop-effectively (slide 5)
What kinds of learners? What kinds of learning?
What data could be generated digitally from the use context? (You can invent future technologies if needed.)
Does your theory predict patterns signifying learning?
What analytical tools could be used to find such patterns?
What human and/or software interventions or recommendations?
How to render the analytics, for whom, and will they understand them?
Ethics.
105. Open course on systemic deployment of analytics
https://www.canvas.net/courses/policy-and-strategy-for-systemic-deployment-of-learning-analytics
Universities and companies exploring institutional strategy, policy and infrastructure.
106. JISC Briefings on Learning Analytics
http://publications.cetis.ac.uk/c/analytics
107. EDUCAUSE Briefings on Learning Analytics
http://www.educause.edu/library/learning-analytics
110. We all love a good Big Brother (don’t we?)
“A responsible school/university in 2016 will use every form of data shared by students in order to maximise their success.”
Discuss
Thank you!