UCL joint Institute of Education (London Knowledge Lab) & UCL Interaction Centre seminar, 20th April 2016. Replay: https://youtu.be/0t0IWvcO-Uo
Algorithmic Accountability & Learning Analytics
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
ABSTRACT. As algorithms pervade societal life, they are moving from the preserve of computer science to becoming the object of far wider academic and media attention. Many are now asking how the behaviour of algorithms can be made “accountable”. But why are they “opaque” and to whom? As this vital discussion unfolds in relation to Big Data in general, the Learning Analytics community must articulate what would count as meaningful questions and satisfactory answers in educational contexts. In this talk, I propose different lenses that we can bring to bear on a given learning analytics tool, to ask what it would mean for it to be accountable, and to whom. From a Human-Centred Informatics perspective, it turns out that algorithmic accountability may be the wrong focus.
BIO. Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, which he joined in August 2014 to direct the new Connected Intelligence Centre. Prior to that he was at The Open University’s Knowledge Media Institute 1995-2014. He brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI (PhD, York) where he worked with Rank Xerox Cambridge EuroPARC on Design Rationale. He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin wrote Constructing Knowledge Art (2015). He is active in the emerging field of Learning Analytics and is a co-founder of the Society for Learning Analytics Research, Compendium Institute and Learning Emergence network.
Learning.Analytics for Learning.Futures?
Simon Buckingham Shum and Ruth Deakin Crick
Connected Intelligence Centre, UTS
The social, technical and political challenges we face as a society demand new ways of thinking and working which are collaborative, holistic and resilient. As we unpack what these words mean, the implications for a university – indeed any learning organisation – run deep. At the core of the paradigm shift we see the need for learners (at all levels) to take increasing responsibility for their learning in authentic contexts: to become resilient agents of their own learning trajectories; to think holistically and to make sense of complex data. Far from being solely ‘graduate attributes’, the same qualities are needed by us: if we can’t model these qualities ourselves, we can’t teach them; if we can’t assess them authentically, we have no evidence base and we can’t provide formative feedback. This line of argument shapes how CIC is conceiving learning analytics (computer-supported tools to help learners and educators gather, analyse, visualise and act on learners’ data) and collective intelligence (networking tools to build a learning community’s evidence-base). In this talk we will give glimpses of these approaches in action, we’ll hear from learners and educators on what this paradigm shift feels like, and through several activities, we invite you to imagine how we can collaborate to test these concepts across UTS, as we move into Learning.Futures.
Valedictory Lecture
Making Thinking Visible in Complex Times
Prof Simon Buckingham Shum
This event took place on 15th July 2014 at 4:00pm (15:00 GMT)
Berrill Lecture Theatre, The Open University, Walton Hall Campus, Milton Keynes, United Kingdom
In 1968 Doug Engelbart gave “The Mother of All Demos”: a disruptive technology lab had quietly invented the mouse, collaborative on-screen editing, hyperlinks, video conferencing, and much more. This was the start of the paradigm shift, still unfolding: computers were no longer to be low level number crunchers, but might mediate and mould the highest forms of human thinking, both individual and collective. In this talk I review nearly 19 years in KMi chasing this vision with many colleagues, inventing tools for making dialogue, argument and learning processes visible in different ways. How do we harness such tools to tackle, not aggravate, the fundamental challenge facing the educational system, and its graduates: to think broadly and deeply, and to thrive amidst profound uncertainty and complexity? These are the hallmarks of the OU — and indeed, all true education from primary school onwards.
Towards Contested Collective Intelligence
Simon Buckingham Shum, Director Connected Intelligence Centre, University of Technology Sydney
This talk is to open up a dialogue with the important work of the SWARM project. I’ll introduce the key ideas that have shaped my work on interactive software tools to make thinking visible, shareable and contestable, some of the design prototypes, and some of the lessons we’ve learnt en route.
Teaching, Assessment and Learning Analytics: Time to Question Assumptions
Simon Buckingham Shum
Presented by the Assessment Research Centre
and the Melbourne Centre for the Study of Higher Education
Professor of Learning Informatics, and Director of the Connected Intelligence Centre (CIC)
University of Technology Sydney
When: 11.30-12.30 pm, Wed. 13 Sep 2017
Where: Frank Tate Room, Level 9, 100 Leicester St, Carlton
This will be a non-technical talk accessible to a broad range of educational practitioners and researchers, designed to provoke a conversation that provides time to question assumptions. The field of Learning Analytics sits at the convergence of two fields: Learning (including learning technology, educational research and learning/assessment sciences) and Analytics (statistics; visualisation; computer science; data science; AI). Many would add Human-Computer Interaction (e.g. participatory design; user experience; usability evaluation) as a differentiator from related fields such as Educational Data Mining, since the Learning Analytics community attracts many with a concern for the sociotechnical implications of designing and embedding analytics in educational organisations.
Learning Analytics is viewed by many educators with the same suspicion they reserve for AI or “learning management systems”. While in some cases this is justified, I will question other assumptions with some learning analytics examples which can serve as objects for us to think with. I am curious to know what connections and questions arise when these are shared.
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he was appointed in August 2014 to direct the new Connected Intelligence Centre. Previously he was Professor of Learning Informatics and an Associate Director at The UK Open University’s Knowledge Media Institute. He is active in the field of Learning Analytics as a co-founder and former Vice President of the Society for Learning Analytics Research, and Program Co-Chair of LAK18, the International Learning Analytics and Knowledge Conference. Previously he co-founded the Compendium Institute and Learning Emergence networks. Simon brings a Human-Centred Informatics (HCI) approach to his work, with a background in Psychology (BSc, York), Ergonomics (MSc, London) and HCI Design Argumentation (PhD, York). He co-edited Visualizing Argumentation (2003) followed by Knowledge Cartography (2008, 2nd Edn. 2014), and with Al Selvin, wrote Constructing Knowledge Art (2015). He was recently appointed as a Fellow of The RSA. http://Simon.BuckinghamShum.net
Tony Vlachakis, an educational technologist who works at the Georgia Department of Education, gave this presentation update on the K-12 Computer Science Framework Review.
What is computational thinking? Who needs it? Why? How can it be learnt? Can it be taught? How?
Aaron Sloman
Slides for invited presentation at Conference of ALT (Association for Learning Technology) 11th Sept 2012, University of Manchester.
PDF available (easier for printing, selecting text, etc.):
http://www.cs.bham.ac.uk/research/projects/cogaff/talks/#talk105
A video of the actual presentation (using no slides because of a projector problem) is now available here:
http://www.youtube.com/watch?v=QXAFz3L2Qpo
It also has been made available as "slide 47" after the PDF presentation on this page.
I attempt to generalise Jeannette Wing's notion of "Computational thinking" (ACM 2006) to include attempting to understand much biological information processing, and try to show the necessity for educators to do deep computational thinking if they wish to facilitate processes of learning.
Tutorial, Learning Analytics Summer Institute, Ann Arbor, June 2017
As algorithms pervade societal life, they’re moving from an arcane topic reserved for computer scientists and mathematicians, to the object of far wider academic and mainstream media attention (try a web news search on algorithms, and then add ethics). As agencies delegate machines with increasing powers to make judgements about complex human qualities such as ‘employability’, ‘credit worthiness’, or ‘likelihood of committing a crime’, we are confronted by the challenge of “governing algorithms”, lest they turn into Weapons of Math Destruction. But in what senses are they opaque, and to whom? And what is meant by “accountable”?
The education sector is clearly not immune from these questions, and it falls to the Learning Analytics community to convene a vigorous debate, and devise good responses. In this tutorial, I’ll set the scene, and then propose a set of lenses that we can bring to bear on a learning analytics infrastructure, to identify some of the meanings that “accountability” might have. It turns out that algorithmic transparency and accountability may be the wrong focus — or rather, just one piece of the jigsaw. Intriguingly, even if you can look inside the algorithmic ‘black box’, which is imagined to lie in the system’s code, there may be little of use there. I propose that a human-centred informatics approach offers a more holistic framing, where the aggregate quality we are after might be termed Analytic System Integrity. I’ll work through a couple of examples as a form of ‘audit’, to show where one can identify weaknesses and opportunities, and consider the implications for how we conceive and design learning analytics that are responsive to the questions that society will rightly be asking.
Computational Thinking: Why It is Important for All Students
NAFCareerAcads
Given the importance of computing and computer science in most career paths, computational thinking must be a part of every curriculum. This session explores how computational thinking is related to computer science and information technology and how it might affect K-12 education. Participants will look at curricula examples and learn about new resources produced by a joint ISTE/CSTA NSF group.
Presenter: Joe Kmoch, Milwaukee Public Schools
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics sheds helpful light on these questions.
The last decade has seen the emergence of new research areas called "affective computing", "computational creativity", and "computational humor". The automated recognition of sentiments and opinions has become a stable track of the top Artificial Intelligence conferences. A growing number of international projects are focusing on the implementation of forms of creativity. More recently, the idea of building computer programs capable of recognizing and generating humor is not considered unachievable as in the past.
In this talk, I gave a summary of my experience in these new areas of computer science. More specifically, I described some of the ideas, resources, and methods developed during my research activity.
Teaching the Technologies learning area using a thinking skills approach
Jason Zagami
Presentation to the Digital Technologies 2015 EdTechSA on 16 July 2015
The Technologies learning area provides an opportunity to develop in students five distinct but complementary ways of thinking about and understanding the world: Systems Thinking, Design Thinking, Computational Thinking, Futures Thinking, and Strategic Thinking. This session will explore approaches to teaching the Technologies learning area through problem-solving activities that develop these thinking approaches.
Human Learning Online and Teaching Online
Shalin Hai-Jew
Learners will…
consider how humans learn
review how humans learn online
study various types of online learning designs
review instructional design methods and standards
consider technical considerations in building online learning
explore various types of online learning designs
consider how online learning data may inform evolving learning designs
Learning Analytics Toolkit & TinCan/xAPI@Work Proof Of Concept Progress
LearningCafe
Following on from our last webinar on Sharing our TinCan/xAPI@Work Journey, we give an update on creating a working proof of concept for TinCan/xAPI. Dr Kirsty Kitto will present the work being done on a toolkit that uses xAPI to store data about student participation in learning activities designed using standard social media tools such as Facebook.
We discuss:
What is the status of adoption of TinCan/xAPI in the industry? How fast or slowly is it moving now?
What can you realistically achieve now with xAPI?
What is the road map you need to take?
Are there opportunities for the corporate and education sectors to collaborate to increase adoption?
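To make the toolkit idea above concrete: xAPI records learning activity as "statements" with an actor, a verb and an object, serialised as JSON and sent to a Learning Record Store. The following is a minimal sketch of building such a statement in Python; the learner, activity names and IDs are purely illustrative (the verb URI follows the ADL registry convention), and this is not the actual toolkit's code.

```python
import json

def make_statement(actor_name, actor_mbox, verb_id, verb_name,
                   activity_id, activity_name):
    """Build a minimal xAPI (TinCan) statement: actor, verb, object."""
    return {
        "actor": {
            "objectType": "Agent",
            "name": actor_name,
            "mbox": actor_mbox,  # learner identity as a mailto: IRI
        },
        "verb": {
            "id": verb_id,  # verbs are identified by URI
            "display": {"en-US": verb_name},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,  # the learning activity being described
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Hypothetical example: a student comments in a discussion forum.
stmt = make_statement(
    "Alex Learner", "mailto:alex@example.edu",
    "http://adlnet.gov/expapi/verbs/commented", "commented",
    "http://example.edu/activities/week3-discussion",
    "Week 3 discussion thread",
)
print(json.dumps(stmt, indent=2))
```

In a real deployment this JSON would be POSTed to a Learning Record Store's statements endpoint; the point here is simply that the actor/verb/object triple is uniform across sources, which is what lets one toolkit aggregate participation data from heterogeneous social media tools.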
Case Study: Lessons from Newell Rubbermaid's SAP HANA Proof of Concept
SAPinsider Events
View this session from Reporting & Analytics 2014. Coming to Las Vegas in November! www.reporting2015.com
In this session, Newell Rubbermaid guides you through the key elements that comprised its SAP HANA business case and proof of concept, including an emphasis on process improvement. Learn firsthand how Newell Rubbermaid:
· Identified which business processes were most likely to realize significant improvement as a result of utilizing SAP HANA
· Established a “current state” baseline and demonstrated a “projected state” that could be realized through the use of SAP HANA
· Determined which SAP BI tools to use based on specific reporting scenarios and end user requirements
What is computational thinking? Who needs it? Why? How can it be learnt? ...Aaron Sloman
What is computational thinking?
Who needs it? Why? How can it be learnt?
Can it be taught? How?
Slides for invited presentation at Conference of ALT (Association for Learning Technology) 11th Sept 2012, University of Manchester.
PDF available (easier for printing, selecting text, etc.):
http://www.cs.bham.ac.uk/research/projects/cogaff/talks/#talk105
A video of the actual presentation (using no slides because of a projector problem) is now available here
http://www.youtube.com/watch?v=QXAFz3L2Qpo
It also has been made available as "slide 47" after the PDF presentation on this page.
I attempt to generalise Jeannette Wing's notion of "Computational thinking" (ACM 2006) to include attempting to understand much biological information processing, and try to show the necessity for educators to do deep computational thinking if they wish to facilitate processes of learning.
Tutorial, Learning Analytics Summer Institute, Ann Arbor, June 2017
As algorithms pervade societal life, they’re moving from an arcane topic reserved for computer scientists and mathematicians, to the object of far wider academic and mainstream media attention (try a web news search on algorithms, and then add ethics). As agencies delegate machines with increasing powers to make judgements about complex human qualities such as ’employability’, ‘credit worthiness’, or ‘likelihood of committing a crime’, we are confronted by the challenge of “governing algorithms”, lest they turn into Weapons of Math Destruction. But in what senses are they opaque, and to whom? And what is meant by “accountable”?
The education sector is clearly not immune from these questions, and it falls to the Learning Analytics community to convene a vigorous debate, and devise good responses. In this tutorial, I’ll set the scene, and then propose a set of lenses that we can bring to bear on a learning analytics infrastructure, to identify some of the meanings that “accountability” might have. It turns out that algorithmic transparency and accountability may be the wrong focus — or rather, just one piece of the jigsaw. Intriguingly, even if you can look inside the algorithmic ‘black box’, which is imagined to lie in the system’s code, there may be little of use there. I propose that a human-centred informatics approach offers a more holistic framing, where the aggregate quality we are after might be termed Analytic System Integrity. I’ll work through a couple of examples as a form of ‘audit’, to show where one can identify weaknesses and opportunities, and consider the implications for how we conceive and design learning analytics that are responsive to the questions that society will rightly be asking.
Computational Thinking: Why It is Important for All StudentsNAFCareerAcads
Given the importance of computing and computer science in most career paths, computational thinking must be a part of every curriculum. This session explores
how computational thinking is related to computer science and information technology and how it might affect K-12 education. Participants will look at curricula examples and learn about new resources produced by a joint ISTE/
CSTA NSF group.
Presenter: Joe Kmoch, Milwaukee Public Schools
Keynote Address, International Conference of the Learning Sciences, London Festival of Learning
Transitioning Education’s Knowledge Infrastructure:
Shaping Design or Shouting from the Touchline?
Abstract: Bit by bit, a data-intensive substrate for education is being designed, plumbed in and switched on, powered by digital data from an expanding sensor array, data science and artificial intelligence. The configurations of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards).
The idea that we may be transitioning into significantly new ways of knowing – about learning and learners – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. For instance, assuming that we want to shape this infrastructure, how do we engage with the teams designing the platforms our schools and universities may be using next year? Who owns the data and algorithms, and in what senses can an analytics/AI-powered learning system be ‘accountable’? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? If we want to work in “Pasteur’s Quadrant” (Donald Stokes), we must go beyond learning analytics that answer research questions, to deliver valued services to frontline educational users: but how are universities accelerating the analytics innovation to infrastructure transition?
Wrestling with these questions, the learning analytics community has evolved since its first international conference in 2011, at the intersection of learning and data science, and an explicit concern with those human factors, at many scales, that make or break the design and adoption of new educational tools. We are forging open source platforms, links with commercial providers, and collaborations with the diverse disciplines that feed into educational data science. In the context of ICLS, our dialogue with the learning sciences must continue to deepen to ensure that together we influence this knowledge infrastructure to advance the interests of all stakeholders, including learners, educators, researchers and leaders.
Speaking from the perspective of leading an institutional analytics innovation centre, I hope that our experiences designing code, competencies and culture for learning analytics sheds helpful light on these questions.
The last decade has seen the emergence of new research areas called "affective computing", "computational creativity", and "computational humor". The automated recognition of sentiments and opinions has become a stable track of the top Artificial Intelligence conferences. A growing number of international projects are focusing on the implementation of forms of creativity. More recently, the idea of building computer programs capable of recognizing and generating humor is not considered unachievable as in the past.
In this talk, I gave a summary of my experience in these new areas of computer science. More specifically, I described some of the ideas, resources, and methods developed during my research activity.
Teaching the Technologies learning area using a thinking skills approachJason Zagami
Presentation to the Digital Technologies 2015 EdTechSA on 16 July 2015
The Technologies learning area provides an opportunity to develop in students five distinct but complementary ways of thinking about and understanding the world: Systems Thinking, Design Thinking, Computational Thinking, Futures Thinking, and Strategic Thinking. This session will explore approaches to teaching the Technologies learning area through problem-solving activities that develop these thinking approaches.
Human Learning Online and Teaching Online (Shalin Hai-Jew)
Learners will…
consider how humans learn
review how humans learn online
study various types of online learning designs
review instructional design methods and standards
consider technical considerations in building online learning
explore various types of online learning designs
consider how online learning data may inform evolving learning designs
Learning Analytics Toolkit & TinCan/xAPI@Work Proof of Concept Progress (LearningCafe)
Following on from our last webinar on Sharing our TinCan/xAPI@Work Journey, we give an update on creating a working proof of concept for TinCan/xAPI. Dr Kirsty Kitto will present the work being done in developing a toolkit, which uses xAPI to store data about student participation in learning activities designed using standard social media tools such as Facebook.
We discuss:
What is the status of adoption of TinCan/xAPI in the industry? How fast or slowly is it moving now?
What can you realistically achieve now with xAPI? What is the roadmap you need to take?
Are there opportunities for the corporate and education sectors to collaborate to increase adoption?
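The toolkit described above stores student activity as xAPI statements, which are essentially actor-verb-object triples. As a minimal sketch of what such a statement might look like for a social-media learning activity, the following builds one in plain Python; the email address, activity URL and activity name are invented for illustration (the "commented" verb URI is from the ADL verb registry, but how this particular toolkit names its activities is an assumption).

```python
import json

def make_statement(actor_email, verb_id, verb_name, activity_id, activity_name):
    """Build a minimal xAPI statement as an actor-verb-object triple."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "id": activity_id,
            "objectType": "Activity",
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Hypothetical example: a student comments on a group discussion post.
stmt = make_statement(
    "student@example.edu",
    "http://adlnet.gov/expapi/verbs/commented",
    "commented",
    "https://facebook.com/groups/unit-discussion/post/123",
    "Unit discussion post",
)
print(json.dumps(stmt, indent=2))
```

A Learning Record Store then receives statements of this shape over HTTP, which is what makes activity on otherwise closed platforms queryable for analytics.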
Case Study: Lessons from Newell Rubbermaid's SAP HANA Proof of Concept (SAPinsider Events)
View this session from Reporting & Analytics 2014. Coming to Las Vegas in November! www.reporting2015.com
In this session, Newell Rubbermaid guides you through the key elements that comprised its SAP HANA business case and proof of concept, including an emphasis on process improvement. Learn firsthand how Newell Rubbermaid:
· Identified which business processes were most likely to realize significant improvement as a result of utilizing SAP HANA
· Established a “current state” baseline and demonstrated a “projected state” that could be realized through the use of SAP HANA
· Determined which SAP BI tools to use based on specific reporting scenarios and end user requirements
Build Your Own SaaS using Docker. A proof of concept with a simple Memcached SaaS.
See the Memcached as a service application in action at http://www.memcachedasaservice.com
Find the source code on GitHub: https://github.com/jbarbier/SaaS_Memcached
An example of a successful proof of concept (ETLSolutions)
In this presentation we explain how to create a successful proof of concept for software, using a real example from our work in the Oil & Gas industry.
Kirsty Kitto, Simon Buckingham Shum, and Andrew Gibson. (2018). Embracing Imperfection in Learning Analytics. In Proceedings of LAK18: International Conference on Learning Analytics and Knowledge, March 5–9, 2018, Sydney, NSW, Australia, pp.451-460. (ACM, New York, NY, USA). https://doi.org/10.1145/3170358.3170413
Open Access: http://simon.buckinghamshum.net/2018/01/embracing-imperfection-in-learning-analytics
Abstract: Learning Analytics (LA) sits at the confluence of many contributing disciplines, which brings the risk of hidden assumptions inherited from those fields. Here, we consider a hidden assumption derived from computer science, namely, that improving computational accuracy in classification is always a worthy goal. We demonstrate that this assumption is unlikely to hold in some important educational contexts, and argue that embracing computational “imperfection” can improve outcomes for those scenarios. Specifically, we show that learner-facing approaches aimed at “learning how to learn” require more holistic validation strategies. We consider what information must be provided in order to reasonably evaluate algorithmic tools in LA, to facilitate transparency and realistic performance comparisons.
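The paper's central point, that improving classification accuracy is not always a worthy goal, can be illustrated with a toy, invented example: when the behaviour of interest is rare, a classifier that never flags it scores high accuracy while being useless for learner-facing feedback. The data below is fabricated purely to make the arithmetic visible.

```python
# 100 forum posts, of which only 5 show the (rare) behaviour of interest,
# e.g. reflective thinking. Labels: 1 = reflective, 0 = not.
labels = [0] * 95 + [1] * 5

# A "majority class" classifier that never flags anything.
majority_preds = [0] * 100

# Accuracy looks excellent...
accuracy = sum(p == y for p, y in zip(majority_preds, labels)) / len(labels)

# ...but recall on the behaviour we actually want to surface is zero.
true_pos = sum(p == 1 and y == 1 for p, y in zip(majority_preds, labels))
recall = true_pos / sum(labels)

print(f"accuracy = {accuracy:.2f}, recall = {recall:.2f}")  # 0.95 vs 0.00
```

This is one reason the paper argues for more holistic validation strategies for learner-facing tools: a single accuracy figure can hide exactly the failure that matters.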
#cwcon #f4: "Compose, Design, Educate: Designing a Digital Rhetorics Themed O..." (Allegra Smith)
This presentation traces the design and implementation of an online first-year composition course at a Research I institution during the 2017-18 academic year. The speaker will share lessons learned from designing and teaching the course, as well as training and mentoring graduate instructors to teach online for the first time (Bourelle, 2016). Topics covered will include positioning a digital rhetorics themed distance learning course within a STEM-based university, teaching multimodal assignments in an online course, and integrating information design concepts such as user-centeredness (Blythe, 2001) and wicked problems (Rittel & Webber, 1973) into online first-year writing curricula.
Towards a Social Learning Space for Open Educational Resources (SocialLearn, The Open University)
OpenEd 2010, Barcelona
Simon Buckingham Shum & Rebecca Ferguson
Knowledge Media Institute & Institute of Educational Technology, The Open University, Milton Keynes, UK
LAK2011: 1st International Conference on Learning Analytics and Knowledge February 27-March 1, 2011
Banff, Alberta
Anna De Liddo, Simon Buckingham Shum,
Ivana Quinto, Michelle Bachler, Lorella Cannavacciuolo
Tiered Eportfolio Apprenticeship Model (T.E.A.M.)
Presentation at the 2014 Nebraska Distance Learning Annual Conference by Roz Hussin and Allison Hunt (University of Nebraska-Lincoln, USA), and facilitated by Stefan Schmid (BBW, Germany)
Nowadays, we are constantly interacting with computers, mobiles and other wearable devices. These interactions leave behind the digital footprint of the user. This data is used with different goals in the so-called Big Data field to predict customer behaviour in marketing and health research. Learning Analytics tackles this challenge in the Technology Enhanced Learning field.
George Siemens defines Learning Analytics as the measurement, collection, analysis and reporting of data to understand and optimise learning. In this context, we find a variety of studies that process the data differently. Some studies implement complex algorithms and display the outcome to the user. Others rely on simpler approaches to process the data, but enable the user to explore it through understandable, comprehensive and usable visualisations. Users can draw conclusions on their own and, with this information, steer their own learning process. This thesis is contextualised in the latter and intends to help students to become autonomous and lead their own educational process.
This dissertation presents the work in the scope of four research questions: 1) RQ1 - What characteristics of learning activities can be visualised usefully for learners?; 2) RQ2 - What characteristics of learning activities can be visualised usefully for teachers?; 3) RQ3 - What are the affordances of and user problems with tracking data automatically and manually?; and 4) RQ4 - What are the key components of a simple and flexible architecture to collect, store and manage learning activity?
The exploration of these research questions included: 1) three different learning dashboard designs deployed in real courses, with 128 students participating in the evaluations; 2) the analysis of two Massive Open Online Courses (MOOCs) with 56,876 enrolled students; and 3) the deployment of an architecture in two real case studies, including a European project with more than 15 scheduled pilots.
Manual and automatic trackers have benefits and drawbacks. For example, manual trackers respect user privacy in blended learning courses, but the data provided by students is not trusted by their fellow students. Automatic trackers are more accurate, but they do not track activity away from the computer, and therefore do not provide the complete picture that students demand.
This research also identifies three components to deploy a simple and flexible architecture to collect data in open learning environments: 1) a set of simple services to push and pull the learning traces; 2) a simple data schema to ensure completeness and findability of the data; and 3) independent components to collect the learning activity.
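The three architectural components identified above (simple push/pull services, a simple schema, independent collectors) might be sketched as follows. This is purely illustrative: the class and field names are invented, and a real deployment would use HTTP services and a persistent store rather than an in-memory list.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LearningTrace:
    """A minimal schema: who did what, where, when, and which collector saw it.
    Keeping every field mandatory is one way to ensure completeness, and the
    'source' field supports findability across independent collectors."""
    actor: str
    verb: str
    obj: str
    source: str      # name of the independent collector that produced the trace
    timestamp: str

class TraceStore:
    """Toy stand-in for the push/pull services in front of the trace store."""
    def __init__(self):
        self._traces = []

    def push(self, trace: LearningTrace):
        self._traces.append(trace)

    def pull(self, actor=None):
        """Pull all traces, optionally filtered by actor."""
        return [t for t in self._traces if actor is None or t.actor == actor]

# Hypothetical usage: an LMS collector pushes a forum-post trace.
store = TraceStore()
store.push(LearningTrace("alice", "posted", "forum/thread-7",
                         "lms-collector",
                         datetime.now(timezone.utc).isoformat()))
print(len(store.pull(actor="alice")))  # 1
```

The design point is the separation: collectors only need to know the schema and the push service, so new learning environments can be instrumented without touching the store.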
Conference Designs revisited – Concepts, Practices, and Perspectives (c60357)
Contribution to the Round Table on "Conferences In The Context Of Academic Performance, Informal Learning And Alternative Designs - Between Theory And Practice" in
Network 6, 06 SES 04, at the ECER 2016 Conference in Dublin (2016-08-24).
The Generative AI System Shock, and some thoughts on Collective Intelligence ... (Simon Buckingham Shum)
Keynote Address: Team-based Learning Collaborative Asia Pacific Community (TBLC-APC) Symposium (“Impact of emerging technologies on learning strategies”) 8-9 February 2024, Sydney https://tbl.sydney.edu.au
Slides from my contribution to the panel convened by Jeremy Roschelle at the International Society for the Learning Sciences: Engaging Learning Scientists in Policy Challenges: AI and the Future of Learning
Deliberative Democracy as a strategy for co-designing university ethics aro... (Simon Buckingham Shum)
Buckingham Shum, S. (2021). Deliberative Democracy as a strategy for co-designing university ethics around analytics and AI in education. AARE2021: Australian Association for Research in Education, 28 Nov. – 2 Dec. 2021
Deliberative Democracy as a Strategy for Co-designing University Ethics Around Analytics and AI in Education
Simon Buckingham Shum
Connected Intelligence Centre, University of Technology Sydney
Universities can see an increasing range of student and staff activity as it becomes digitally visible in their platform ecosystems. The fields of Learning Analytics and AI in Education have demonstrated the significant benefits that ethically responsible, pedagogically informed analysis of student activity data can bring, but such services are only possible because they are undeniably a form of “surveillance”, raising legitimate questions about how the use of such tools should be governed.
Our prior work has drawn on the rich concepts and methods developed in human-centred system design, and participatory/co-design, to design, deploy and validate practical tools that give a voice to non-technical stakeholders (e.g. educators; students) in shaping such systems. We are now expanding the depth and breadth of engagement that we seek, looking to the Deliberative Democracy movement for inspiration. This is a response to the crisis in confidence in how typical democratic systems engage citizens in decision making. A hallmark is the convening of a Deliberative Mini-Public (DMP) which may work at different scales (organisation; community; region; nation) and can take diverse forms (e.g. Citizens’ Juries; Citizens’ Assemblies; Consensus Conferences; Planning Cells; Deliberative Polls). DMP’s combination of stratified random sampling to ensure authentic representation, neutrally facilitated workshops, balanced expert briefings, and real support from organisational leaders, has been shown to cultivate high quality dialogue in sometimes highly conflicted settings, leading to a strong sense of ownership of the DMP's final outputs (e.g. policy recommendations).
This symposium contribution will describe how the DMP model is informing university-wide consultation on the ethical principles that should govern the use of analytics and AI around teaching and learning data.
March 2021 • 24/7 Instant Feedback on Writing: Integrating AcaWriter into yo... (Simon Buckingham Shum)
Slides accompanying the monthly UTS educator briefing https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-18-march/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions). This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
ICQE20: Quantitative Ethnography Visualizations as Tools for Thinking (Simon Buckingham Shum)
Slides for this keynote talk to the 2nd International Conference on Quantitative Ethnography
http://simon.buckinghamshum.net/2021/02/icqe2020-keynote-qe-viz-as-tools-for-thinking/
24/7 Instant Feedback on Writing: Integrating AcaWriter into your Teaching (Simon Buckingham Shum)
https://cic.uts.edu.au/events/24-7-instant-feedback-on-writing-integrating-acawriter-into-your-teaching-2-dec/
What difference could instant feedback on draft writing make to your students? Over the last 5 years the Connected Intelligence Centre has been developing and piloting an automated feedback tool for academic writing (AcaWriter), working closely with academics across several faculties. The research portal documents how educators and students engage with this kind of AI, and what we’ve learnt about integrating it into teaching and assessment.
In May, AcaWriter was launched to all students along with an information portal. Now we want to start upskilling academics, tutors and learning technologists, in a monthly session to give you the chance to learn about AcaWriter, and specifically, good practices for integrating it into your subject. CIC can support you, and we hope you may be interested in co-designing publishable research.
AcaWriter handles several different ‘genres’ of writing, including reflective writing (e.g. a Reflective Essay; Reflective Blogs/Journals on internships/work-placements) and analytical writing (e.g. Argumentative Essays; Research Abstracts & Introductions).
This briefing will demo AcaWriter, and show it can be embedded in student activities. We hope this sparks ideas for your own teaching, which we can discuss in more detail.
An introduction to argumentation for UTS:CIC PhD students (with some Learning Analytics examples, but potentially of wider interest to students/researchers)
Webinar: Learning Informatics Lab, University of Minnesota
Replay the talk: https://youtu.be/dcJZeDIMr2I
Learning Informatics
AI • Analytics • Accountability • Agency
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
University of Technology Sydney
Abstract:
“Health Informatics”. “Urban Informatics”. “Social Informatics”. Informatics offers systemic ways of analyzing and designing the interaction of natural and artificial information processing systems. In the context of education, I will describe some Learning Informatics lenses and practices which we have developed for co-designing analytics and AI with educators and students. We have a particular focus on closing the feedback loop to equip learners with competencies to navigate a complex, uncertain future, such as critical thinking, professional reflection and teamwork. En route, we will touch on how we build educators’ trust in novel tools, our design philosophy of “embracing imperfection” in machine intelligence, and the ways that these infrastructures embody values. Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences spark productive reflection as the UMN Learning Informatics Lab builds its program.
Biography:
Simon Buckingham Shum is Professor of Learning Informatics at the University of Technology Sydney, where he serves as inaugural director of the Connected Intelligence Centre. CIC is a transdisciplinary innovation centre, using analytics to provide new insights for university teams, with particular expertise in educational data science. Simon’s career-long fascination with software’s ability to make thinking visible has seen him active in communities including Computer-Supported Cooperative Work, Hypertext, Design Rationale, Scholarly Publishing, Semantic Web, Computational Argumentation, Educational Technology and Learning Analytics. The challenge of visualizing contested knowledge has produced several books: Visualizing Argumentation, Knowledge Cartography, and Constructing Knowledge Art. He has been active over the last decade in shaping the field of Learning Analytics, co-founding the Society for Learning Analytics Research, and catalyzing several strands: Social Learning Analytics, Discourse Analytics, Dispositional Analytics and Writing Analytics. http://Simon.BuckinghamShum.net
Despite AI’s potential for beneficial use, it creates important risks for Australians. AI, big data, and AI-informed decision making can cause exclusion, discrimination, skill loss, and economic impact; and can affect privacy, security of critical infrastructure and social well-being. What types of technology raise particular human rights concerns? Which human rights are particularly implicated?
Abstract: The emerging configuration of educational institutions, technologies, scientific practices, ethics policies and companies can be usefully framed as the emergence of a new “knowledge infrastructure” (Paul Edwards). The idea that we may be transitioning into significantly new ways of knowing – about learning and learners, teaching and teachers – is both exciting and daunting, because new knowledge infrastructures redefine roles and redistribute power, raising many important questions. What should we see when we open the black box powering analytics? How do we empower all stakeholders to engage in the design process? Since digital infrastructure fades quickly into the background, how can researchers, educators and learners engage with it mindfully? This isn’t just interesting to ponder academically: your school or university will be buying products that are being designed now. Or perhaps educational institutions should take control, building and sharing their own open source tools? How are universities accelerating the transition from analytics innovation to infrastructure? Speaking from the perspective of leading an institutional innovation centre in learning analytics, I hope that our experiences designing code, competencies and culture for learning analytics shed helpful light on these questions.
Towards Collaboration Translucence: Giving Meaning to Multimodal Group DataSimon Buckingham Shum
Vanessa Echeverria, Roberto Martinez-Maldonado, and Simon Buckingham Shum. 2019. Towards Collaboration Translucence: Giving Meaning to Multimodal Group Data. In Proceedings of ACM CHI Conference (CHI’19). ACM, New York, NY, USA, Paper 39, 16 pages. https://doi.org/10.1145/3290605.3300269
Collocated, face-to-face teamwork remains a pervasive mode of working, which is hard to replicate online. Team members’ embodied, multimodal interaction with each other and artefacts has been studied by researchers, but due to its complexity, has remained opaque to automated analysis. However, the ready availability of sensors makes it increasingly affordable to instrument work spaces to study teamwork and groupwork. The possibility of visualising key aspects of a collaboration has huge potential for both academic and professional learning, but a frontline challenge is the enrichment of quantitative data streams with the qualitative insights needed to make sense of them. In response, we introduce the concept of collaboration translucence, an approach to make visible selected features of group activity. This is grounded both theoretically (in the physical, epistemic, social and affective dimensions of group activity), and contextually (using domain-specific concepts). We illustrate the approach from the automated analysis of healthcare simulations to train nurses, generating four visual proxies that fuse multimodal data into higher order patterns.
Panel held at LAK13: 3rd International Conference on Learning Analytics & Knowledge
http://simon.buckinghamshum.net/2013/03/lak13-edu-data-scientists-scarce-breed
Educational Data Scientists: A Scarce Breed
The Educational Data Scientist is currently a poorly understood, rarely sighted breed. Reports vary: some are known to be largely nocturnal, solitary creatures, while others have been reported to display highly social behaviour in broad daylight. What are their primary habits? How do they see the world? What ecological niches do they occupy now, and will predicted seismic shifts transform the landscape in their favour? What survival skills do they need when running into other breeds? Will their numbers grow, and how might they evolve? In this panel, the conference will hear and debate not only broad perspectives on the terrain, but will have been exposed to some real life specimens, and caught glimpses of the future ecosystem.
Opening to the inaugural workshop on Learning Analytics in Schools held at LAK18: International Conference on Learning Analytics & Knowledge, Sydney. http://lak18.solaresearch.org
Macroeconomics: Movie Location
This will be used as part of your Personal Professional Portfolio once graded.
Objective:
Prepare a presentation or a paper using research, basic comparative analysis, data organization and application of economic information. You will make an informed assessment of an economic climate outside of the United States to accomplish an entertainment industry objective.
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor... (Levi Shapiro)
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
The Roman Empire A Historical Colossus.pdf (kaushalkr1407)
The Roman Empire, a vast and enduring power, stands as one of history's most remarkable civilizations, leaving an indelible imprint on the world. It emerged from the Roman Republic, transitioning into an imperial powerhouse under the leadership of Augustus Caesar in 27 BCE. This transformation marked the beginning of an era defined by unprecedented territorial expansion, architectural marvels, and profound cultural influence.
The empire's roots lie in the city of Rome, founded, according to legend, by Romulus in 753 BCE. Over centuries, Rome evolved from a small settlement to a formidable republic, characterized by a complex political system with elected officials and checks on power. However, internal strife, class conflicts, and military ambitions paved the way for the end of the Republic. Julius Caesar’s dictatorship and subsequent assassination in 44 BCE created a power vacuum, leading to a civil war. Octavian, later Augustus, emerged victorious, heralding the Roman Empire’s birth.
Under Augustus, the empire experienced the Pax Romana, a 200-year period of relative peace and stability. Augustus reformed the military, established efficient administrative systems, and initiated grand construction projects. The empire's borders expanded, encompassing territories from Britain to Egypt and from Spain to the Euphrates. Roman legions, renowned for their discipline and engineering prowess, secured and maintained these vast territories, building roads, fortifications, and cities that facilitated control and integration.
The Roman Empire’s society was hierarchical, with a rigid class system. At the top were the patricians, wealthy elites who held significant political power. Below them were the plebeians, free citizens with limited political influence, and the vast numbers of slaves who formed the backbone of the economy. The family unit was central, governed by the paterfamilias, the male head who held absolute authority.
Culturally, the Romans were eclectic, absorbing and adapting elements from the civilizations they encountered, particularly the Greeks. Roman art, literature, and philosophy reflected this synthesis, creating a rich cultural tapestry. Latin, the Roman language, became the lingua franca of the Western world, influencing numerous modern languages.
Roman architecture and engineering achievements were monumental. They perfected the arch, vault, and dome, constructing enduring structures like the Colosseum, Pantheon, and aqueducts. These engineering marvels not only showcased Roman ingenuity but also served practical purposes, from public entertainment to water supply.
Welcome to TechSoup New Member Orientation and Q&A (May 2024).pdf (TechSoup)
In this webinar you will learn how your organization can access TechSoup's wide variety of product discount and donation programs. From hardware to software, we'll give you a tour of the tools available to help your nonprofit with productivity, collaboration, financial management, donor tracking, security, and more.
Synthetic Fiber Construction in Lab.pptx (Pavel, NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers play a crucial role in modern society, impacting various aspects of daily life and industry. They are integral to modern life, offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
Francesca Gottschalk - How can education support child empowerment.pptx (EduSkills OECD)
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
Acetabularia Information For Class 9.docx (vaibhavrinwa19)
Acetabularia acetabulum is a single-celled green alga that in its vegetative state is morphologically differentiated into a basal rhizoid and an axially elongated stalk, which bears whorls of branching hairs. The single diploid nucleus resides in the rhizoid.
Model Attribute Check Company Auto Property (Celine George)
In Odoo, the multi-company feature allows you to manage multiple companies within a single Odoo database instance. Each company can have its own configurations while still sharing common resources such as products, customers, and suppliers.
Operation “Blue Star” is the only event in the history of independent India where the state went to war with its own people. Even after about 40 years, it is not clear whether it was the culmination of the state’s anger towards the people of the region, a political game of power, or the start of a dictatorial chapter in the democratic setup.
The people of Punjab felt alienated from the mainstream due to the denial of their just demands during a long democratic struggle since independence. As has happened all over the world, this led to a militant struggle with great loss of lives among military, police and civilian personnel. The killing of Indira Gandhi and the massacre of innocent Sikhs in Delhi and other Indian cities were also associated with this movement.
Introduction to AI for Nonprofits with Tapp NetworkTechSoup
Dive into the world of AI! Experts Jon Hill and Tareq Monaur will guide you through AI's role in enhancing nonprofit websites and basic marketing strategies, making it easy to understand and apply.
TESDA TM1 REVIEWER FOR NATIONAL ASSESSMENT WRITTEN AND ORAL QUESTIONS WITH A...
Provocations for CLA Dashboard
1. Making social learning visible:
some provocations for Connected Learning Analytics
dashboard concepts
Simon Buckingham Shum
Professor of Learning Informatics
Director, Connected Intelligence Centre
@sbuckshum
cic.uts.edu.au
2. To nurture skillful social learners (prev talk)…
…invisible + ephemeral social processes need to be made visible + persistent…
…for which we need analytics to aggregate + visualise data meaningfully, as proxies for social + personal learning
3. Bridging the data—meaning gulf
logs of user actions (xAPI triples)
meaningful, timely representations
how students and educators think about learning (not necessarily the same thing)
4. Bridging the data—meaning gulf
logs of user actions (xAPI triples)
meaningful, timely representations
how students and educators think about learning (not necessarily the same thing)
sensemaking
action/intervention
5. Bridging the data—meaning gulf
logs of user actions (xAPI triples)
meaningful, timely representations
how students and educators think about learning (not necessarily the same thing)
Bloom’s taxonomy, etc.
Operation … Wayfinding … Sensemaking … Innovation
6. All of the following are possible in closed learning platforms.
‘Technically straightforward’ to apply to an xAPI LRS?
But which are of most interest?
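Slide 6’s claim that such analytics are ‘technically straightforward’ over an xAPI LRS amounts to simple aggregation once statements are fetched. A minimal sketch of the counting step, assuming statements have already been retrieved as dicts (the sample batch is invented):

```python
from collections import Counter

def count_by_verb(statements):
    """Tally how often each verb appears across a batch of xAPI statements."""
    return Counter(s["verb"]["id"].rsplit("/", 1)[-1] for s in statements)

# Invented sample batch standing in for an LRS query result.
batch = [
    {"verb": {"id": "http://adlnet.gov/expapi/verbs/commented"}},
    {"verb": {"id": "http://adlnet.gov/expapi/verbs/commented"}},
    {"verb": {"id": "http://adlnet.gov/expapi/verbs/shared"}},
]
print(count_by_verb(batch))  # "commented" appears twice, "shared" once
```

The harder question the slide poses is not this mechanics but which of the countable things are worth counting.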
11. Learning Technology research strands at KMi, OU
[Concept map: AI & Argumentation; Learning Dispositions; Human-Centred Informatics; Learning Analytics; Semantic Scientific Publishing; Dialogue / Issue / Argument Mapping]
Social learning analytics — quantifying “professional identity”
12. Adding topic and type of social tie to filter SN
Schreurs, B., Teplovs, C., Ferguson, R., De Laat, M. and Buckingham Shum, S. (2013). Visualizing Social Learning Ties by Type and Topic: Rationale and Concept Demonstrator. Proc. 3rd International Conference on Learning Analytics & Knowledge. Leuven, BE: ACM, 33-37. Open Access Eprint: http://oro.open.ac.uk/36891
13. Adding topic and type of social tie to filter SN
15. Beyond number / size / frequency of posts and ‘hottest thread’: analytics that look beneath the surface, and quantify linguistic proxies for ‘deeper learning’
http://www.glennsasscer.com/wordpress/wp-content/uploads/2011/10/iceberg.jpg
16. Discourse analytics on webinar textchat
Can we spot the quality learning conversations in a 2.5 hr webinar?
Ferguson, R. and Buckingham Shum, S. (2011). Learning Analytics to Identify Exploratory Dialogue within Synchronous Text Chat. Proc. 1st International Conference on Learning Analytics and Knowledge (Banff, Canada). ACM.
19. Discourse analytics on webinar textchat
[Chart: textchat messages over the webinar (9:28–12:03) classified as “exploratory talk” (more substantive for learning) vs “non-exploratory”, with a running average]
Given a 2.5 hour webinar, where in the live textchat were the most effective learning conversations? Not at the start and end of a webinar, but if we zoom in on a peak…
Ferguson, R., Wei, Z., He, Y. and Buckingham Shum, S. (2013). An Evaluation of Learning Analytics to Identify Exploratory Dialogue in Online Discussions. Proc. 3rd International Conference on Learning Analytics & Knowledge (Leuven, BE, 8-12 April 2013). ACM. http://oro.open.ac.uk/36664
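The published classifier behind these slides is considerably richer, but the underlying idea of spotting exploratory dialogue from surface cues can be caricatured in a few lines. The cue-phrase list below is invented for illustration; it is not the feature set from Ferguson & Buckingham Shum’s papers:

```python
# Hypothetical cue phrases; the published classifier's features differ.
EXPLORATORY_CUES = ("because", "i think", "what if", "but if",
                    "do you agree", "for example")

def is_exploratory(utterance):
    """Flag a chat message as 'exploratory talk' if it contains a cue phrase."""
    text = utterance.lower()
    return any(cue in text for cue in EXPLORATORY_CUES)

# Invented webinar chat fragment.
chat = [
    "lol",
    "I think that works because the cohort is small",
    "link please?",
]
flags = [is_exploratory(m) for m in chat]
print(flags)  # [False, True, False]
```

Even this toy version shows why counts of posts alone miss the point: only one of the three messages carries any marker of substantive reasoning.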
21. Visualizations of writing cohesion
Whitelock, D., Field, D., Richardson, J. T. E., Van Labeke, N. and Pulman, S. (2014). Designing and Testing Visual Representations of Draft Essays for Higher Education Students. 2nd International Workshop on Discourse-Centric Learning Analytics, 4th International Conference on Learning Analytics and Knowledge, Indianapolis, Indiana, USA. https://dcla14.files.wordpress.com/2014/03/dcla14_whitelock_etal.pdf
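Whitelock et al.’s cohesion visualizations rest on far more sophisticated measures, but a crude intuition for “cohesion between adjacent sentences” is word-set overlap. A toy sketch, with an invented three-sentence draft:

```python
import re

def words(sentence):
    """Lowercased word set of a sentence, punctuation stripped."""
    return set(re.findall(r"[a-z]+", sentence.lower()))

def sentence_overlap(s1, s2):
    """Jaccard overlap of two sentences' word sets: a toy cohesion proxy."""
    w1, w2 = words(s1), words(s2)
    return len(w1 & w2) / len(w1 | w2) if w1 | w2 else 0.0

draft = [
    "Learning analytics aggregate activity data.",
    "Activity data alone rarely explains learning.",
    "Pedagogy supplies the missing interpretation.",
]
scores = [sentence_overlap(a, b) for a, b in zip(draft, draft[1:])]
print(scores)  # [0.375, 0.0] -- a cohesive pair, then an abrupt topic shift
```

Plotting such scores along a draft is one way to render cohesion visible to a student, which is the kind of representation the cited work designs and tests properly.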
22. Rhetorical functions of metadiscourse identified by the Xerox Incremental Parser (XIP)
BACKGROUND KNOWLEDGE: “Recent studies indicate …”; “… the previously proposed …”; “… is universally accepted …”
NOVELTY: “… new insights provide direct evidence …”; “… we suggest a new … approach …”; “… results define a novel role …”
OPEN QUESTION: “… little is known …”; “… role … has been elusive”; “Current data is insufficient …”
GENERALIZING: “… emerging as a promising approach”; “Our understanding … has grown exponentially …”; “… growing recognition of the importance …”
CONTRASTING IDEAS: “… unorthodox view resolves … paradoxes …”; “In contrast with previous hypotheses …”; “… inconsistent with past findings …”
SIGNIFICANCE: “studies … have provided important advances”; “Knowledge … is crucial for … understanding”; “valuable information … from studies”
SURPRISE: “We have recently observed … surprisingly”; “We have identified … unusual”; “The recent discovery … suggests intriguing roles”
SUMMARIZING: “The goal of this study …”; “Here, we show …”; “Altogether, our results … indicate”
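XIP is a full incremental parser, so the sketch below is nothing like its actual rule set. Still, the flavour of tagging sentences with rhetorical moves can be conveyed with a few illustrative regular expressions (patterns and move subset invented for this example):

```python
import re

# A few illustrative patterns per rhetorical move; XIP's real rules are
# linguistic parse-based, not simple regexes like these.
MOVES = {
    "NOVELTY": r"\bnew insights?\b|\bnovel\b",
    "OPEN QUESTION": r"\blittle is known\b|\binsufficient\b",
    "SUMMARIZING": r"\bhere,? we show\b|\bgoal of this study\b",
}

def tag_moves(sentence):
    """Return the rhetorical moves whose patterns match a sentence."""
    return [move for move, pat in MOVES.items()
            if re.search(pat, sentence, re.IGNORECASE)]

print(tag_moves("Here, we show that little is known about dashboard design."))
# ['OPEN QUESTION', 'SUMMARIZING']
```

One sentence can legitimately carry several moves at once, which is why the output is a list rather than a single label.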
27. Knowledge, Skills & Dispositions
“It’s more than knowledge and skills. For the innovation economy, dispositions come into play: readiness to collaborate; attention to multiple perspectives; initiative; persistence; curiosity.”
Larry Rosenstock, High Tech High, San Diego. hightechhigh.org
LearningREimagined project: http://learning-reimagined.com
Larry Rosenstock: http://audioboo.fm/boos/1669375-50-seconds-of-larry-rosenstock-ceo-of-hightechhigh-on-how-he-would-re-imagine-learning
28. Measuring learning to learn
“It’s vital to know that … focusing on learning is not smoke and mirrors. It’s not just some clever idea among the intelligentsia. It’s really important. And it’s really, really important that we can measure it, demonstrate it, and develop a language for it.”
Mark Moorhouse, Matthew Moss High School, Rochdale, UK
http://youtu.be/kayzkma1TIM — Learning Futures channel: http://bit.ly/lfmovies
MMHS website: http://www.mmhs.co.uk/learning
29. Evidencing learning dispositions: CLARA survey (Ruth Deakin Crick, UTS)
Deakin Crick, R., Huang, S., Ahmed Shafi, A. and Goldspink, C. (2015). Developing Resilient Agency in Learning: The Internal Structure of Learning Power. British Journal of Educational Studies, published online 24 Mar 2015. http://dx.doi.org/10.1080/00071005.2015.1006574
30. Structural Equation Model underpinning CLARA
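CLARA’s actual instrument, items and model are those published by Deakin Crick et al.; none of that is reproduced here. Purely to make the survey-to-dimensions step concrete, the sketch below averages Likert responses into per-dimension scores using a made-up item-to-dimension mapping:

```python
# Made-up item-to-dimension mapping; CLARA's real items and scoring differ.
ITEM_DIMENSION = {
    "q1": "Curiosity", "q2": "Curiosity",
    "q3": "Collaboration", "q4": "Hope and optimism",
}

def dimension_scores(responses):
    """Average 1-5 Likert item responses into one score per dimension."""
    totals, counts = {}, {}
    for item, value in responses.items():
        dim = ITEM_DIMENSION[item]
        totals[dim] = totals.get(dim, 0) + value
        counts[dim] = counts.get(dim, 0) + 1
    return {dim: totals[dim] / counts[dim] for dim in totals}

print(dimension_scores({"q1": 4, "q2": 5, "q3": 3, "q4": 2}))
```

A structural equation model goes well beyond such averaging, estimating latent constructs and their interrelations; this sketch only shows the raw aggregation a dashboard might start from.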
32. Behavioural analytics for learning dispositions?
[CLARA dimensions: Mindful Agency; Sense making; Creativity; Curiosity; Belonging; Collaboration; Hope and optimism]
Social network patterns, teamwork effectiveness and initiation of relationships?
Questioning, arguing and search behaviours reveal intrinsic curiosity and epistemic commitments?
Tagging/sharing/blogging/social patterns reveal how you see connections between ideas?
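To give one of these provocations a concrete shape: a crude behavioural proxy for curiosity might be the fraction of a learner’s forum messages that ask a question. This is an invented illustration of the kind of signal meant, not a validated measure of any CLARA dimension:

```python
def question_rate(messages):
    """Fraction of a learner's messages that ask a question:
    one crude, illustrative behavioural proxy for curiosity."""
    if not messages:
        return 0.0
    return sum("?" in m for m in messages) / len(messages)

# Invented forum log for one learner.
log = ["What does xAPI stand for?", "Thanks!", "Why triples and not graphs?"]
print(question_rate(log))  # 2 of 3 messages ask questions
```

Whether such a proxy tracks the disposition it names, rather than, say, confusion or platform habits, is exactly the kind of question a dashboard design must confront before displaying it.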