Can medical education take advantage of Learning Analytics techniques? How? Where? In this presentation a study is analyzed pinpointing three areas in which Medical Education needs to invest and all three are related to Learning Analytics.
Developing a multiple-document-processing performance assessment for epistem... – Simon Knight
http://oro.open.ac.uk/41711/
The LAK15 theme “shifts the focus from data to impact”, noting the potential for Learning Analytics based on existing technologies to have scalable impact on learning for people of all ages. To meet this demand and realize this potential at scale, the challenge of addressing higher-order thinking skills must be tackled. This paper discusses one such approach – the creation of an analytic and task model to probe epistemic cognition in complex literacy tasks. The research uses existing technologies in novel ways to build a conceptually grounded model of trace indicators for epistemic commitments in information-seeking behaviors. We argue that such an evidence-centered approach is fundamental to realizing the potential of analytics, which should maintain a strong association with learning theory.
Keynote Address, Expanding Horizons 2012, Macquarie University
http://staff.mq.edu.au/teaching/workshops_programs/expanding_horizons
"Learning Analytics": unprecedented data sets and live data streams about learners, with computational power to help make sense of it all, and new breeds of staff who can talk predictive models, pedagogy and ethics. This means rather different things to different people: unprecedented opportunity to study, benchmark and improve educational practice, at scales from countries and institutions, to departments, individual teachers and learners. "Benchmarking" may trigger dystopic visions of dumbed down proxies for 'real teaching and learning', but an emu response is no good. For educational institutions, our calling is to raise the quality of debate, shape external and internal policy, and engage with the companies and open communities developing the future infrastructure. How we deploy these new tools rests critically on assessment regimes, what can be logged and measured with integrity, and what we think it means to deliver education that equips citizens for a complex, uncertain world.
Nurturing the Connections: The Role of Quantitative Ethnography in Learning A... – Dragan Gasevic
This talk will explore connections between two emerging fields focused on harnessing the potential of data – learning analytics and quantitative ethnography. Learning analytics is focused on the analysis of data collected from user interactions with technology with the goal of advancing our understanding of and enhancing human learning. Despite some early success stories and widespread interest, producing meaningful and actionable results is still a top open research challenge for learning analytics. The talk will first explore how quantitative ethnography can offer promising approaches that can address this open challenge in learning analytics. The talk will next discuss how progress in learning analytics can be used to accelerate the development of the field of quantitative ethnography. The talk will finally outline promising directions for future research at the intersection of learning analytics and quantitative ethnography.
Learning Analytics: Seeking new insights from educational data – Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword encompassing the analysis and visualisation of big data. Current interest stems from growing access to data in Higher Education, through platforms such as Learning Management Systems, and from the many software tools now available to analyse that data. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in the context of Higher Education will be introduced.
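To make the abstract's "simple ways to perform such analytics" concrete, here is a minimal sketch (mine, not the seminar's) of one common starting point: counting LMS-style activity events per student and flagging low engagement. The event tuples, student IDs, and threshold are all invented for illustration.

```python
# Minimal sketch of simple LMS engagement analytics (illustrative data).
from collections import Counter

# Hypothetical LMS event log: (student_id, week, event_type)
events = [
    ("s1", 1, "view"), ("s1", 1, "post"), ("s1", 2, "view"),
    ("s2", 1, "view"),
    ("s3", 1, "view"), ("s3", 2, "view"), ("s3", 2, "quiz"),
]

# Total events per student across all weeks
activity = Counter(student for student, _, _ in events)

# Flag students whose total activity falls below a chosen threshold
THRESHOLD = 2
low_engagement = sorted(s for s, n in activity.items() if n < THRESHOLD)

print(activity)        # per-student event counts
print(low_engagement)  # students who may need additional support
```

In practice the events would come from an LMS export rather than a hard-coded list, and the threshold would be set against the course's own activity distribution rather than a fixed constant.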
Presents an overview of the learning analytics field touching on the status of the technology, the challenges it faces, the arrival of predictive analytics to education and the best approach towards a successful implementation.
Learning analytics are more than measurement – Dragan Gasevic
Slides used for the keynote
Learning analytics are more than measurement
at
Policies for Educational Data Mining and Learning Analytics Briefing
organized by http://www.laceproject.eu/
Examining the Value of Learning Analytics for Supporting Work-integrated Lear... – Vitomir Kovanovic
Slides from our presentation at the Seventh National Conference on Work-Integrated Learning (ACEN’18).
The full paper is available at https://www.researchgate.net/publication/328578409_Examining_the_value_of_learning_analytics_for_supporting_work-integrated_learning
Learning analytics and Moodle: So much we could measure, but what do we want to measure? A presentation to the USQ Math and Sciences Community of Practice May 2013
Learning with me Mate: Analytics of Social Networks in Higher Education – Dragan Gasevic
Research on higher education reports that social interactions lead to positive outcomes such as higher levels of internalization, sense of community, academic achievement, metacognition, and student retention. The role of social networks has been especially emphasized in research due to the availability of theoretical foundations and analytic methods for investigating their effects in higher education. The increased use of technologies in education allows for the collection of large and rich datasets about social networks, which calls for the use of novel analytics methods. This talk will first give a brief overview of the existing work on, and lessons learned from, some well-known studies of social networks in higher education, in diverse settings ranging from face-to-face courses to massive open online courses. The talk will then identify critical challenges that require immediate attention in order for the study of social networks to make a sustainable impact on learning and teaching. The most important takeaways from the talk will be that
- computational aspects of the study of social networks need to be integrated deeply with theory, research and practice,
- novel methods for the study of critical dimensions (discourse, structure and dynamics) that shape network formation and network effects are necessary, and
- innovative instructional approaches are essential to address the changing conditions created by contemporary educational and technological contexts.
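As a concrete illustration of the kind of social network analytics the talk describes (this sketch is mine, not the speaker's), one can build an interaction network from discussion-forum replies and compute degree centrality to see which students connect the most peers. The names and reply edges are invented.

```python
# Sketch: degree centrality over a forum-reply network (illustrative data).
from collections import defaultdict

# Hypothetical reply edges: (replier, original_poster)
replies = [("ana", "ben"), ("cai", "ben"), ("ben", "ana"),
           ("dia", "ana"), ("cai", "ana")]

# Undirected adjacency: who has interacted with whom
neighbours = defaultdict(set)
for a, b in replies:
    neighbours[a].add(b)
    neighbours[b].add(a)

# Degree centrality: fraction of the other students each student touches
n = len(neighbours)
centrality = {s: len(nb) / (n - 1) for s, nb in neighbours.items()}

print(max(centrality, key=centrality.get))  # most central student
```

Richer analyses of the "discourse, structure and dynamics" dimensions mentioned above would also weight edges by message content and track how the network changes over time, which this static degree count deliberately omits.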
Open Learning Analytics panel at Open Education Conference 2014 – Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large-scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means to further the goal of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Introduction to Learning Analytics for High School Teachers and Managers – Vitomir Kovanovic
Presentation at the first Learning Analytics Learning Network (LALN) Event in Adelaide, Australia on Oct 22, 2019.
Abstract:
With the increased adoption of technology, institutions have unprecedented opportunities to continuously improve the quality of their services through data collection and analysis. Schools and universities now have data about learners and their contexts that can provide valuable insight into how they learn. Early attempts were directed towards mining educational data to identify students at risk and develop interventions. More recently, researchers and practitioners have deployed more sophisticated approaches, including analysis of the learner behaviours that lead to various learning outcomes, social networks and teams, employability, creativity, and critical thinking. Analysing the digital traces generated through learning processes requires a broad suite of methods from data science, statistics, psychometrics, and the social and learning sciences.
This workshop aims to introduce teachers and educators to the fast-growing and promising field of learning analytics, exploring how digital data can be used for the analysis and improvement of student learning. First, we will provide an overview of learning analytics, its key methods and approaches, as well as the problems for which it can be used. Second, attendees will engage in group learning activities to explore ways in which learning analytics could be used within their institutions. The focus will be on identifying learning-related challenges that are relevant to their particular context and exploring how learning analytics can be used to address them practically and effectively.
Everything I have learnt about eLearning – Poh-Sun Goh
A summary of key ideas and useful tips for applying eLearning in medical education.
See also update on 7 April 2020 at
https://www.slideshare.net/dnrgohps/everything-i-have-learnt-about-elearning-updated-7-april-2020
and
https://www.slideshare.net/dnrgohps/implementation-of-technology-enhanced-learning-including-vr-ar-and-ai-in-medical-education-some-questions-to-ask
Learning analytics as an academic research space has been growing in influence for nearly a decade. Campuses globally are deploying learning analytics to address a range of challenges including student dropout, poor engagement and targeted marketing, as well as to predict teaching and resource needs. As a field, learning analytics has advanced rapidly both as a research domain and as a practical on-campus activity to increase organizational use of data. In this presentation, Dr. George Siemens will explore both the research and the practice of analytics in education, focusing on the development of the Society for Learning Analytics Research, models for research and organizational data use, and the growing sophistication of data collection through psychophysiological approaches.
The Evidence Hub: Harnessing the Collective Intelligence of Communities to Bu... – Anna De Liddo
Presentation to the Large-Scale Idea Management and Deliberation Systems Workshop @
6th International Conference on Communities and Technologies C&T2013
June 29, 2013
Munich, Germany
Writing Analytics for Epistemic Features of Student Writing #icls2016 talk – Simon Knight
Talk presented at #ICLS2016 in Singapore. I discuss levels of description as sites of epistemic cognition, focusing on writing and the use of textual features to associate rubric scores with epistemic cognition.
My thanks to my collaborators (listed on the paper) particularly Laura Allen, who also generously let me adapt the later slides on NLP studies of writing.
Abstract: Literacy, encompassing the ability to produce written outputs from the reading of multiple sources, is a key learning goal. Selecting information, and evaluating and integrating claims from potentially competing documents, is a complex literacy task. Prior research exploring differing behaviours and their association with constructs such as epistemic cognition has used ‘multiple document processing’ (MDP) tasks. Using this model, 270 paired participants wrote a review of a document. Reports were assessed using a rubric associated with features of complex literacy behaviours. This paper focuses on the conceptual and empirical associations between those rubric marks and textual features of the reports on a set of natural language processing (NLP) indicators. Findings indicate the potential of NLP indicators for providing feedback regarding the writing of such outputs, demonstrating clear relationships both across rubric facets and between rubric facets and specific NLP indicators.
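To give a flavour of what "NLP indicators" over student reports can look like, here is a minimal sketch. It is not the paper's actual indicator set: the feature names, connective list, and sample sentence are all invented for illustration; real studies use far richer feature batteries.

```python
# Sketch: simple surface-level NLP indicators over a student report.
import re

def text_features(report: str) -> dict:
    # Tokenize into lowercase words (apostrophes kept, punctuation dropped)
    words = re.findall(r"[a-z']+", report.lower())
    # A tiny, illustrative set of discourse connectives
    connectives = {"however", "therefore", "because", "although"}
    n = len(words)
    return {
        "n_words": n,                                            # report length
        "type_token_ratio": len(set(words)) / n if n else 0.0,   # lexical diversity
        "connective_rate": sum(w in connectives for w in words) / n if n else 0.0,
    }

sample = ("The claim is plausible; however, the second source "
          "disagrees because its data differ.")
print(text_features(sample))
```

In a study like the one described, features of this kind would be computed per report and then correlated with the rubric marks to test which textual signals track which facets of complex literacy behaviour.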
State and Directions of Learning Analytics Adoption (Second edition) – Dragan Gasevic
The analysis of data collected from user interactions with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new field of learning analytics and mobilized the education sector to embrace the use of data for decision-making. This talk will first introduce the field of learning analytics and touch on lessons learned from some well-known case studies. The talk will then identify critical challenges that require immediate attention in order for learning analytics to make a sustainable impact on learning, teaching, and decision making. The talk will conclude by discussing a set of milestones selected as critical for the maturation of the field of learning analytics. The most important takeaways from the talk will be that
- systemic approaches to the development and adoption of learning analytics are critical,
- multidisciplinary teams are necessary to unlock the full potential of learning analytics, and
- capacity development at institutional levels through the inclusion of diverse stakeholders is essential for full learning analytics adoption.
This is the second edition of a talk previously given under the same title on several occasions. The second edition reflects many developments that have happened in the field of learning analytics, especially those in the following two projects: http://he-analytics.com and http://sheilaproject.eu.
An introduction to Competency-based education and the new student demographic. Discover today's modern student and the education system designed to fit them. http://bit.ly/1hU8ntv
Teaching, Assessment and Learning Analytics: Time to Question Assumptions – Simon Buckingham Shum
Presented by the Assessment Research Centre
and the Melbourne Centre for the Study of Higher Education
Teaching, Assessment and Learning Analytics: Time to Question Assumptions
Simon Buckingham Shum
Professor of Learning Informatics, and Director of the Connected Intelligence Centre (CIC)
University of Technology Sydney
When: 11.30 am – 12.30 pm, Wed. 13 Sep 2017
Where: Frank Tate Room, Level 9, 100 Leicester St, Carlton
This will be a non-technical talk accessible to a broad range of educational practitioners and researchers, designed to provoke a conversation that provides time to question assumptions. The field of Learning Analytics sits at the convergence of two fields: Learning (including learning technology, educational research and learning/assessment sciences) and Analytics (statistics; visualisation; computer science; data science; AI). Many would add Human-Computer Interaction (e.g. participatory design; user experience; usability evaluation) as a differentiator from related fields such as Educational Data Mining, since the Learning Analytics community attracts many with a concern for the sociotechnical implications of designing and embedding analytics in educational organisations.
Learning Analytics is viewed by many educators with the same suspicion they reserve for AI or “learning management systems”. While in some cases this suspicion is justified, I will question other assumptions with some learning analytics examples which can serve as objects for us to think with. I am curious to know what connections and questions arise when these are shared.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he was appointed in August 2014 to direct the new Connected Intelligence Centre. Previously he was Professor of Learning Informatics and an Associate Director at The UK Open University’s Knowledge Media Institute. He is active in the field of Learning Analytics as a co-founder and former Vice President of the Society for Learning Analytics Research, and Program Co-Chair of LAK18, the International Learning Analytics and Knowledge Conference. Previously he co-founded the Compendium Institute and Learning Emergence networks. Simon brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI Design Argumentation (PhD, York). He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin, wrote Constructing Knowledge Art (2015). He was recently appointed as a Fellow of The RSA. http://Simon.BuckinghamShum.net
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Kirsty Kitto, Simon Buckingham Shum, and Andrew Gibson. (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. (ACM, New York, NY, USA). https://doi.org/10.1145/3170358.3170413
Open Access: http://simon.buckinghamshum.net/2018/01/embracing-imperfection-in-learning-analytics
Abstract: Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
Human-Centered Learning Analytics and Artificial Intelligence in Education: H... (Yannis)
Although Artificial Intelligence (AI) and Learning Analytics (LA) have shown their potential in Education, stakeholders’ agency seems to be threatened. At the same time, multiple issues regarding FATE (Fairness, Accountability, Transparency and Ethics) have been raised when AI or LA-based solutions are designed and implemented. These issues have become especially acute since the emergence of Large Language Models and Generative AI.
This talk discusses the quest for an optimal balance between human and computational agents, when LA tools and services are employed in a Technology Enhanced Learning (TEL) ecosystem. Through the discussion of relevant conceptual models and examples, it argues for Human-Centered Learning Analytics (HCLA) and Human-Centered Artificial Intelligence (HCAI) approaches, where agency and FATE principles are essential design parameters.
The talk focuses especially on LA/AI solutions that may position teachers as designers of effective interventions and orchestration actions. Selected Human-Centered Design (HCD) principles are discussed and illustrated, and directions for future research and development are formulated to overcome the main obstacles for adoption of human-centered approaches for LA and AI in education.
Educator-NICs: Envisaging the Future of ICT–enabled Networked Improvement Communities
Learning Emergence Workshop • University of Bristol • 20th May 2014
Computers in Human Behavior xxx (2012) xxx–xxx
Contents lists available at SciVerse ScienceDirect
journal homepage: www.elsevier.com/locate/comphumbeh
Critical thinking in E-learning environments
Raafat George Saadé a, Danielle Morin a, Jennifer D.E. Thomas b
a Concordia University, John Molson School of Business, Montreal, Quebec, Canada
b Pace University, Ivan Seidenberg School of CSIS, New York, NY, USA
Article history: Available online xxxx
Keywords:
E-learning
Critical thinking
Assessment
Information technology
http://dx.doi.org/10.1016/j.chb.2012.03.025
Abstract
One of the primary aims of higher education in today’s information technology enabled classroom is to make students more active in the learning process. The intended outcome of this increased IT-facilitated student engagement is to foster important skills such as critical thinking, used in both academia and workplace environments. Critical thinking (CT) entails the mental processes of discernment, analysis and evaluation needed to achieve a logical understanding. Critical thinking in the classroom as well as in the workplace is a central theme; however, with the dramatic increase in IT usage, the mechanisms by which critical thinking is fostered and used have changed. This article presents work and results on critical thinking in a virtual learning environment. We present a web-based course and assess in which parts of the course, and to what extent, critical thinking was perceived to occur. The course contained two categories of learning modules, namely resources and interactive components. Critical thinking was measured subjectively using the ART scale. Results indicate the significance of “interactivity” in what students perceived to be critical-thinking-oriented, versus online material as a resource. Results and the opportunities that virtual environments present to foster critical thinking are discussed.
© 2012 Elsevier Ltd. All rights reserved.
1. Introduction
One of the primary aims of higher education in today’s information technology (IT) enabled classroom is to make students more active in the learning process (Ibrahim & Samsa, 2009). The intended outcome of this increased IT-facilitated student engagement is to foster important skills such as critical thinking. Given the importance of information technology for critical thinking in learning, it is vital that we better understand the key factors relating to students’ backgrounds, beliefs, perceptions and attitudes, and their associated antecedents.
Learning Analytics (or: The Data Tsunami Hits Higher Education) (Simon Buckingham Shum)
Keynote Address to The Impact of Higher Education: Addressing the Challenges of the 21st Century, European Association for Institutional Research (EAIR) 35th Annual Forum 2013, Erasmus University, Rotterdam, the Netherlands, 28-31 August 2013. http://www.eair.nl/forum/rotterdam
The Generative AI System Shock, and some thoughts on Collective Intelligence ... (Simon Buckingham Shum)
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP) which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders, has been shown to cultivate high quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
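The abstract above rests on stratified random sampling to give a Deliberative Mini-Public "authentic representation". A minimal sketch of the idea, with entirely invented strata and numbers (the rosters, panel size and function names here are illustrative, not drawn from the actual UTS consultation):

```python
import random

# Hypothetical rosters: stratum -> list of member IDs.
rosters = {
    "undergraduate": [f"ug{i}" for i in range(300)],
    "postgraduate": [f"pg{i}" for i in range(120)],
    "academic staff": [f"ac{i}" for i in range(40)],
    "professional staff": [f"ps{i}" for i in range(40)],
}

def convene_mini_public(rosters, panel_size, seed=0):
    """Draw a panel whose strata mirror their shares of the population."""
    rng = random.Random(seed)
    total = sum(len(r) for r in rosters.values())
    panel = []
    for stratum, roster in rosters.items():
        quota = round(panel_size * len(roster) / total)  # proportional quota
        panel.extend(rng.sample(roster, quota))          # random within stratum
    return panel

panel = convene_mini_public(rosters, panel_size=50)
print(len(panel))  # 50 members: 30 undergrad, 12 postgrad, 4 + 4 staff
```

Randomness within each stratum avoids self-selection bias, while the quotas keep minority strata (here, staff) from being swamped.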
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into yo... (Simon Buckingham Shum)
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for Thinking (Simon Buckingham Shum)
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching (Simon Buckingham Shum)
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data (Simon Buckingham Shum)
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of ACM CHI Conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will not only hear and debate broad perspectives on the terrain, but also meet some real-life specimens and catch glimpses of the future ecosystem.
Opening to the inaugural workshop on Learning Analytics in Schools held at LAK18: International Conference on Learning Analytics & Knowledge, Sydney. http://lak18.solaresearch.org
Prof. Simon Buckingham Shum
Prof. Ruth Deakin Crick
Summer@UTS Workshop, 8th Feb. 2018
Connected Intelligence Centre
https://utscic.edu.au/event/resilience-complexity
1. Learning Analytics:
Welcome to the future of assessment?
Simon Buckingham Shum
Knowledge Media Institute, The Open University
Visiting Fellow, University of Bristol
(From August, University of Technology Sydney)
simon.buckinghamshum.net
twitter @sbskmi #LearningAnalytics #edmedia See the question at #edmediakeynote
Keynote address, EdMedia 2014, 25th June, Tampere, Finland
2. learning objective: leave with
an expanded vision of analytics
better questions to ask in your next analytics conversation
3. Big Data status report:
“Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it...”
https://www.facebook.com/dan.ariely/posts/904383595868
4. When the Chancellor announces the adoption of a new economic modelling technique…
…we query the limitations of the model
9. Similarly, when we are confronted with new learning analytics…
John Behrens (Pearson), LAK13 Panel: Educational Data Scientists: A Scarce Breed
http://people.kmi.open.ac.uk/sbs/2013/03/lak13-edu-data-scientists-scarce-breed
10. …we should query the limitations of the model
13. It’s out of the labs and into products: every learning tool now has an “analytics dashboard” (a Google image search)
14. Intelligent tutoring for skills mastery (CMU)
Lovett, M., Meyer, O. and Thille, C. (2008). The Open Learning Initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning. Journal of Interactive Media in Education, 14. http://jime.open.ac.uk/article/2008-14/352
“In this study, results showed that OLI-Statistics students [blended learning] learned a full semester’s worth of material in half as much time and performed as well or better than students learning from traditional instruction over a full semester.”
15. Purdue University Signals: real-time traffic lights for students based on a predictive model
Campbell et al. (2007). Academic Analytics: A New Tool for a New Era. EDUCAUSE Review, vol. 42, no. 4 (July/August 2007): 40–57. http://bit.ly/lmxG2x
Validate a statistical model from:
• ACT or SAT score
• Overall grade-point average
• CMS usage composite
• CMS assessment composite
• CMS assignment composite
• CMS calendar composite
Predicted 66%-80% of struggling students who needed help
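The slide’s recipe (admissions score, GPA, and course-system composites feeding a model that flags at-risk students) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not Purdue’s actual model, which is not reproduced here: a logistic regression trained by gradient descent, where lower standardised scores and lower LMS activity raise predicted risk.

```python
import numpy as np

# Synthetic, standardised features: [test score, GPA, LMS-usage composite].
rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 3))
true_w = np.array([-1.0, -1.5, -2.0])   # lower values -> higher risk of struggling
logits = X @ true_w
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)  # 1 = "struggling"

def train_logreg(X, y, lr=0.1, steps=500):
    """Batch gradient descent on the logistic loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted risk
        w -= lr * (X.T @ (p - y)) / len(y)   # gradient step on weights
        b -= lr * (p - y).mean()             # gradient step on intercept
    return w, b

w, b = train_logreg(X, y)
p = 1 / (1 + np.exp(-(X @ w + b)))
flagged = p > 0.5                            # "red/amber" in traffic-light terms
recall = (flagged & (y == 1)).sum() / (y == 1).sum()
print(f"recall on struggling students: {recall:.2f}")
```

The reported 66%-80% figure corresponds to exactly this kind of recall metric: the fraction of genuinely struggling students that the model manages to flag.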
16. Purdue University Signals: real-time traffic lights for students based on a predictive model
Pistilli, M. D., Arnold, K. and Bethune, M. (2012). Signals: Using Academic Analytics to Promote Student Success. EDUCAUSE Review Online, July/Aug. 2012. http://www.educause.edu/ero/article/signals-using-academic-analytics-promote-student-success
“Results thus far show that students who have engaged with Course Signals have higher average grades and seek out help resources at a higher rate than other students.”
19. …and many more examples including
discourse analytics: language technologies to assess the quality of online postings and debate
social network analytics: graph analytics to assess the strength and topics of interpersonal ties
epistemic game analytics: assessing the degree of professional engagement in authentic project scenarios
visualizations to reveal important patterns of tool use over time
(see other presentations and tutorials)
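The social network analytics bullet above can be illustrated with a minimal sketch, using invented forum data rather than any tool from the talk: tie strength between learners estimated from how often they co-participate in discussion threads.

```python
from collections import Counter
from itertools import combinations

# Hypothetical forum threads, each listed as its set of participants.
threads = [
    ["ana", "ben", "chen"],
    ["ana", "ben"],
    ["ana", "chen", "dana"],
    ["ben", "chen"],
]

# Each shared thread strengthens the tie between a pair of participants.
ties = Counter()
for participants in threads:
    for pair in combinations(sorted(set(participants)), 2):
        ties[pair] += 1

strongest = ties.most_common(1)[0]
print(strongest)
```

Real discourse and SNA tools go much further (reply direction, timing, topic), but even this co-occurrence count yields a weighted graph on which strength-of-ties questions can be asked.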
20. but before we get carried away, let’s just pause…
21. Selwyn, N. (2014). Data entry: towards the critical study of digital data and education. Learning, Media and Technology. http://dx.doi.org/10.1080/17439884.2014.921628
“observing, measuring, describing, categorising, classifying, sorting, ordering and ranking). […] these processes of meaning-making are never wholly neutral, objective and ‘automated’ but are fraught with problems and compromises, biases and omissions.”
22. For Morozov, analytics is where technological solutionism hits education:
“This flight from thinking and the urge to replace human judgments with timeless truths produced by algorithms is the underlying driving force of solutionism.”
23. Could analytics help us shift from the calculating mind to the contemplative mind?
See also: Complexity, Computing, Contemplation, Learning? http://learningemergence.net/2011/05/04/cccl
http://www.contemplativecomputing.org/2011/03/first-draft-of-a-contemplative-computing-article.html
Alex Pang: “A contemplative stance can help people be more creative; deal with complex problems that require months or years to solve […] Contemplation promotes both self-sufficiency and close, questioning observation of the world, and both are particularly valuable in this moment in the history of technology.”
Calculating Mind, Contemplative Mind
http://people.kmi.open.ac.uk/sbs/2008/09/calculating-contemplative-mind
25. can we tell from your digital profile if you’re learning?
29. can we tell from your digital profile if you’re learning?
Who?
How? With what confidence?
After what kinds of training?
Sourcing which data, with what integrity?
What kind of learning? What kind of learner?
30. Accounting tools are not neutral
Du Gay, P. and Pryke, M. (2002) Cultural Economy: Cultural Analysis and Commercial Life. Sage, London. pp. 12-13
“accounting tools...do not simply aid the measurement of economic activity, they shape the reality they measure”
31. In what senses do analytics “shape the reality they measure”?
32. How do analytics shape education? Politically
Analytics reports at the organisational and national levels come with consequences at different scales — sometimes punitive, often impacting millions of people.
33. How do analytics shape education? Ontologically
What data, concepts and relationships do the analytics designers seek to model?
34. Bowker, G. C. and Star, L. S. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press, Cambridge, MA, pp. 277, 278, 281
“Classification systems provide both a warrant and a tool for forgetting [...] what to forget and how to forget it [...] The argument comes down to asking not only what gets coded in but what gets coded out of a given scheme.”
36. Which analytics could reflect the progress that ‘Joe’ has made on so many fronts other than his SATs?
37. Key modelling issue: unit of analysis
• Discourse analysis: how do machines and humans differ in the way they segment a transcript to make sense of it?
Rosé, C. P., & Tovares A. (in press). What Sociolinguistics and Machine Learning Have to Say to One Another about Interaction Analysis. In L. Resnick, Asterhan C., & Clarke S. (Eds.), Socializing Intelligence Through Academic Talk and Dialogue. Washington, D.C.: American Educational Research Association
• Collective intelligence: if we are shifting from a sole focus on individual accomplishment to group knowledge construction and performance, how do analytics assess changes in a group’s knowledge and processes?
Chen, B., & Resendes, M. (2014). Uncovering what matters: Analyzing transitional relations among contribution types in knowledge-building discourse. In Proceedings of the Fourth International Conference on Learning Analytics And Knowledge - LAK ’14 (pp. 226–230). New York, New York, USA: ACM Press. doi:10.1145/2567574.2567606
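Chen & Resendes analyse transitional relations between contribution types, i.e. how often one kind of contribution follows another across a thread. A minimal sketch of that counting step, with invented codes (their actual coding scheme is richer):

```python
from collections import Counter

# Hypothetical contribution-type codes for successive posts in one thread;
# the codes are invented here purely for illustration.
coded_posts = ["question", "theory", "evidence", "theory", "question", "theory"]

# Count how often each contribution type follows another:
# pairing each post with its successor gives the transitions.
transitions = Counter(zip(coded_posts, coded_posts[1:]))

print(transitions[("question", "theory")])  # prints 2
```

Note the unit of analysis has shifted: the transition matrix describes the group's discourse dynamics, not any individual's posts.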
38. How do analytics shape education? Algorithmically
What thresholds, samples, relationships, patterns, etc. do the algorithms encode and seek? On what basis is a recommendation engine proposing interventions?
40. governingalgorithms.org
A technology or an epistemology?
Barocas, S., Hood, S. and Ziewitz, M. (2013). Governing Algorithms: A Provocation Piece. Social Science Research Network Paper 2245322. DOI: http://dx.doi.org/10.2139/ssrn.2245322
• Secrecy, obscurity, inscrutability
• Agency, automation, accountabilities
• A typology of algorithms by genre?
• The inscrutability of algorithms
• Normativity, bias, values
41. Open Learning Analytics: open source
• algorithmic transparency (at least for those who are literate)
• no analytics ‘lock-in’ for educators
http://www.solaresearch.org/mission/ola
42. How do analytics shape education? Semiotically
What meaning-making does the representation and interaction design encourage?
43. How do analytics shape education? By changing the system dynamics
[Diagram: the intent → outcome loop linking researchers / educators / instructional designers with administrators / leaders / policymakers]
44. How do analytics shape education? By changing the system dynamics
Faster feedback loops could enable more rapid adaptation: of agents’ behaviour, and of learning resources and designs.
[Diagram as previous slide: the intent → outcome loop linking researchers / educators / instructional designers with administrators / leaders / policymakers]
45. How do analytics shape education?
Delegation of authority to define goals, analytics, and meaning.
Distribution of power between educators, learners, leaders, community…?
46. How do analytics shape education?
epistemology · assessment · pedagogy: the ‘middle space’ of learning analytics
What epistemological assumptions are shaping the assessment regime, and hence the pedagogy? What questions are analytics used to help answer?
Knight, S., Buckingham Shum, S. and Littleton, K. (In Press, 2014). Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space. Journal of Learning Analytics. Open Access Eprint: http://oro.open.ac.uk/39226
47. Example: epistemological assumptions
Knight, S., Buckingham Shum, S. and Littleton, K. (In Press, 2014). Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space. Journal of Learning Analytics. Open Access Eprint: http://oro.open.ac.uk/39226
Allows testing of problem-solving and analysis - sifting information
"if you allow communication, discussions, searches and so on, you eliminate cheating because it's not cheating any more. That is the way we should think."
48. How do analytics shape education? All of the above are encapsulated in any learning analytics deployment.
Figure from Doug Clow: http://www.slideshare.net/dougclow/the-learning-analytics-cycle-closing-the-loop-effectively (slide 5)
49. How do analytics shape education?
• What kinds of learners? What kinds of learning?
• What data could be generated digitally from the use context? How is it ‘cleaned’?
• Does your theory predict patterns signifying learning?
• What analytical tools could be used to find such patterns?
• How to render the analytics, for whom, and will they understand them?
• What human and/or software interventions / recommendations?
51. what kinds of learning are we optimising the system for?
52. Learning analytics for this?
“The test of successful education is not the amount of knowledge that pupils take away from school, but their appetite to know and their capacity to learn.”
Sir Richard Livingstone, 1941
53. “We’re looking at the profiles of what it means to be effective in the 21st century. […] Resilience will be the defining concept. When challenged and bent, you learn and bounce back stronger.”
“Dispositions are now at least as important as Knowledge and Skills. … They cannot be taught. They can only be cultivated.”
John Seely Brown
US Dept. of Educ. http://reimaginingeducation.org conference (May 28, 2013)
Dispositions clip: http://www.c-spanvideo.org/clip/4457327
Whole talk: http://www.c-spanvideo.org/program/SecD
Learning analytics for this?
54. “It’s more than knowledge and skills. For the innovation economy, dispositions come into play: readiness to collaborate; attention to multiple perspectives; initiative; persistence; curiosity.”
Larry Rosenstock
LearningREimagined project: http://learning-reimagined.com
Larry Rosenstock: http://audioboo.fm/boos/1669375-50-seconds-of-larry-rosenstock-ceo-of-hightechhigh-on-how-he-would-re-imagine-learning
Learning analytics for this?
55. “In the growth mindset, people believe that their talents and abilities can be developed through passion, education, and persistence … It’s about a commitment to … taking informed risks … surrounding yourself with people who will challenge you to grow”
Carol Dweck
Interview with Carol Dweck: http://interviewscoertvisser.blogspot.co.uk/2007/11/interview-with-carol-dweck_4897.html
Another interview: http://www.youtube.com/watch?v=ICILzbB1Obg
Learning analytics for this?
56. Important work by Tony Bryk et al.:
Drivers of “Productive Persistence”
http://www.carnegiealphalabs.org/persistence/
57. Note: a research-based rationale for architecting a suite of analytics techniques
58. Bryk: “sense of belonging” a key predictor of remedial maths completion
http://learningemergence.net/2014/05/27/tony-bryk-lecture
59. Envisioning a wholistic university education (and analytics to match)
http://reinventors.net/series/reinvent-university
61. 1st International Workshop on Discourse-Centric Learning Analytics
Analytics that look beneath the surface, and quantify linguistic proxies for ‘deeper learning’. Beyond number / size / frequency of posts; ‘hottest thread’.
[Iceberg image: http://www.glennsasscer.com/wordpress/wp-content/uploads/2011/10/iceberg.jpg]
solaresearch.org/events/lak/lak13/dcla13
62. Discourse analytics on webinar textchat
Can we spot the quality learning conversations in a 2.5 hr webinar?
Ferguson, R. and Buckingham Shum, S., Learning analytics to identify exploratory dialogue within synchronous text chat. In: 1st International Conference on Learning Analytics and Knowledge (Banff, Canada, 2011). ACM
65. Discourse analytics on webinar textchat
[Chart: chat turns classified as “exploratory talk” (more substantive for learning) vs “non-exploratory”, plotted over the session from 9:28 to 12:03]
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations? Not at the start and end of a webinar; but if we zoom in on a peak…
Ferguson, R., Wei, Z., He, Y. and Buckingham Shum, S., An Evaluation of Learning Analytics to Identify Exploratory Dialogue in Online Discussions. In: Proc. 3rd International Conference on Learning Analytics & Knowledge (Leuven, BE, 8-12 April, 2013). ACM. http://oro.open.ac.uk/36664
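The published classifiers are considerably more sophisticated than this, but the core idea of spotting exploratory dialogue via linguistic cues can be sketched as a cue-phrase matcher. The cue list below is invented for illustration; it is not the validated set from the Ferguson & Buckingham Shum study.

```python
import re

# Toy cue-phrase spotter for "exploratory talk" in chat turns.
# The cue list is illustrative only; the published work used a richer,
# validated cue set (and, in later work, machine learning).
EXPLORATORY_CUES = ("because", "i think", "what if", "why",
                    "i agree", "for example", "maybe")

def is_exploratory(turn, min_cues=1):
    """Flag a chat turn as exploratory if it contains enough cue phrases.
    Word-boundary matching avoids false hits like 'why' inside 'whyte'."""
    text = turn.lower()
    hits = sum(bool(re.search(r"\b" + re.escape(cue) + r"\b", text))
               for cue in EXPLORATORY_CUES)
    return hits >= min_cues

chat = ["lol", "I think that works, because the loop closes faster", "hi all"]
flags = [is_exploratory(t) for t in chat]  # [False, True, False]
```

Plotting the proportion of flagged turns per time window would give exactly the kind of session timeline shown on the slide.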
66. Rhetorical discourse analytics
OPEN QUESTION: “… little is known …”; “… role … has been elusive”; “Current data is insufficient …”
CONTRASTING IDEAS: “… unorthodox view resolves …”; “In contrast with previous hypotheses ...”; “... inconsistent with past findings ...”
SURPRISE: “We have recently observed ... surprisingly”; “We have identified ... unusual”; “The recent discovery ... suggests intriguing roles”
http://technologies.kmi.open.ac.uk/cohere/2012/01/09/cohere-plus-automated-rhetorical-annotation
De Liddo, A., Sándor, Á. and Buckingham Shum, S., Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work, 21, 4-5, (2012), 417-448. http://oro.open.ac.uk/31052
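A minimal sketch of rhetorical-move spotting, adapted from the cue phrases above. The real annotation system uses far richer linguistic analysis than regular expressions; these patterns are illustrative only.

```python
import re

# Cue patterns loosely adapted from the phrases on the slide;
# purely illustrative, not the production annotation rules.
MOVES = {
    "OPEN QUESTION": [r"little is known", r"has been elusive",
                      r"data (is|are) insufficient"],
    "CONTRASTING IDEAS": [r"in contrast with", r"inconsistent with",
                          r"unorthodox view"],
    "SURPRISE": [r"surprising", r"unusual", r"intriguing"],
}

def tag_moves(sentence):
    """Return the rhetorical moves whose cue patterns match the sentence."""
    s = sentence.lower()
    return [move for move, patterns in MOVES.items()
            if any(re.search(p, s) for p in patterns)]

tag_moves("Surprisingly, this is inconsistent with past findings.")
# → ['CONTRASTING IDEAS', 'SURPRISE']
```

Even this toy shows why such signals can serve as proxies for higher-order thinking: the moves mark where a writer questions, contrasts or is surprised, not merely what topic they mention.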
68. Rhetorical discourse analytics
[Side-by-side comparison: human analyst vs computational analyst]
http://technologies.kmi.open.ac.uk/cohere/2012/01/09/cohere-plus-automated-rhetorical-annotation
De Liddo, A., Sándor, Á. and Buckingham Shum, S., Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work, 21, 4-5, (2012), 417-448. http://oro.open.ac.uk/31052
69. Rhetorical discourse analytics
Duygu Simsek’s PhD: http://people.kmi.open.ac.uk/simsek/research/
Glimpses of analytics capable of detecting higher-order thinking. But humans will always read differently to machines.
Can we correlate this with “academic writing”, and can such analytics be used as formative feedback on drafts?
70. Rhetorical discourse analytics
Simsek D, Buckingham Shum S, Sándor Á, De Liddo A and Ferguson R. (2013) XIP Dashboard: http://oro.open.ac.uk/37391
[Dashboard screenshot highlighting CONTRAST and SUMMARY & CONTRIBUTION moves]
77. Quantifying learning dispositions
agency; identity; motivation; responsibility
Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics.
Proc. 2nd Int. Conf. Learning Analytics & Knowledge. (29 Apr-2 May, Vancouver). Eprint: http://oro.open.ac.uk/32823
http://learningemergence.net/2012/04/30/learning-powered-learning-analytics
A wholistic visual, intended to build intrinsic motivation, inviting stretch, providing a new language, provoking conversation that ties to the learner’s identity
78. Self-report through reflective blogging
9-10 yr old EnquiryBloggers • Bushfield School, Wolverton, UK
EnquiryBlogger Wordpress Multisite plugins
http://learningemergence.net/tools/enquiryblogger
79. Masters level EnquiryBloggers
Graduate School of Education, University of Bristol
EnquiryBlogger: blogging for Learning Power & Authentic Enquiry
http://learningemergence.net/2012/06/20/enquiryblogger-for-learning-power-authentic-enquiry
81. http://learningemergence.net/2014/03/01/assessing-learning-dispositions-academic-mindsets
2020? personal data cloud generates my dispositional profile for reflection from behavioural data? >>> help me take responsibility for my own learning
Shaofu Huang: Prototyping Learning Power Modelling in SocialLearn
http://www.open.ac.uk/blogs/SocialLearnResearch/2012/06/20/social-learning-analytics-symposium
Simon Knight: http://people.kmi.open.ac.uk/knight/2014/02/knowledge-in-search
• Social network patterns, teamwork effectiveness and initiation of relationships
• Questioning, arguing and search behaviours reveal intrinsic curiosity and epistemic commitments
• Tagging / sharing / blogging / social patterns reveal how you see connections between ideas
• Behavioural and somatic traces associated with perseverance, grit, tenacity; overcoming panic/stress when stretched
82. Envisioning a social learning analytics dashboard
Your most recent mood comment: “Great, at last I have found all the resources that I have been looking for, thanks to Steve and Ellen.”
In your last discussion with your mentor, you decided to work on your resilience by taking on more learning challenges.
Your ELLI Spider shows that you have made a start on working on your resilience, and that you are also beginning to work on your creativity, which you identified as another area to work on.
Ferguson R and Buckingham Shum S. (2012) Social Learning Analytics: Five Approaches. Proc. 2nd International Conference on Learning Analytics & Knowledge. Vancouver, 29 Apr-2 May: ACM: New York, 23-33. DOI: http://dx.doi.org/10.1145/2330601.2330616 Eprint: http://oro.open.ac.uk/32910
85. The big shifts that analytics could bring…
• Organisational Culture: evidence-based decisions and organisational learning
• Academic Culture: data-intensive learning sciences / educational research
• Practitioner Culture: evidence the impact of learning designs; timely interventions
• C21 Qualities: place these on a firm empirical evidence base
86. Critical zones for research + practice…
• data-culture / organisational learning: how do HEIs manage the embedding of real-time analytics services?
• sensemaking meets computation: creative intelligence + computational thinking
• educator data literacy: how do staff learn to read and write analytics?
• pedagogical innovation: how do learning analytics change the student experience?
92. conclusion
analytics will shape education — on multiple dimensions
an analytics approach perpetuates an educational worldview — so let’s ensure this is intentional, not accidental...