Presentation at the HUMANE event on digital transformation in higher education (http://www.humane.eu/events/seminars-and-conferences/2018/aveiro-042018/).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Awareness is not enough. Pitfalls of learning analytics dashboards in the edu... by Ioana Jivet
It has been long argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state-of-the-art in the field of learning dashboards, none have addressed the theoretical foundation that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the context in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
The Learning Tracker - A Learner Dashboard that Encourages Self-regulation in... by Ioana Jivet
Although Massive Open Online Courses (MOOCs) have the potential to make quality education affordable and available to the masses, completion rates are extremely low due to the high level of autonomy and self-regulation skills that MOOCs require.
The aim of the present work is to investigate how self-regulated learning skills can be enhanced by encouraging metacognition and reflection in MOOC learners by means of social comparison. To this end, following an iterative process, we have developed the Learning Tracker, an interactive widget which allows learners to visualise their learning behaviour and compare it to that of previous graduates of the same MOOC. Each iteration was extensively evaluated in live TU Delft MOOCs running on the edX platform, engaging over 20,000 MOOC learners.
Our results show that learners who have access to the Learning Tracker are more likely to graduate from the MOOC. Moreover, we have observed that the widget has a positive impact on learners’ engagement and reduces procrastination. Based on our results, we argue that the mere fact of receiving feedback on a limited number of learning habits could trigger self-reflection in learners and lead to improved learner performance.
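The social-comparison idea behind a Learning Tracker-style widget can be sketched in a few lines: a learner's behaviour metrics are shown relative to the average of previous graduates of the same MOOC. The metric names and data below are hypothetical illustrations, not the project's actual feature set.

```python
# Minimal sketch (hypothetical metric names) of comparing a learner's
# behaviour metrics to the average of previous MOOC graduates.

def graduate_average(graduates):
    """Average each behaviour metric over the previous graduates."""
    keys = graduates[0].keys()
    n = len(graduates)
    return {k: sum(g[k] for g in graduates) / n for k in keys}

def compare_to_graduates(learner, graduates):
    """Return, per metric, the learner's value as a fraction of the graduate average."""
    avg = graduate_average(graduates)
    return {k: (learner[k] / avg[k] if avg[k] else 0.0) for k in learner}

graduates = [
    {"videos_watched": 30, "quiz_attempts": 12, "forum_posts": 5},
    {"videos_watched": 26, "quiz_attempts": 10, "forum_posts": 3},
]
learner = {"videos_watched": 14, "quiz_attempts": 11, "forum_posts": 0}
ratios = compare_to_graduates(learner, graduates)
print(ratios["videos_watched"])  # 0.5 (half the graduate average)
```

A dashboard would then visualise these ratios side by side, leaving the interpretation, and any resulting self-reflection, to the learner.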
This tutorial is designed for everyone with an interest in increasing the impact of their learning analytics research. It was given by Rebecca Ferguson on 22 June 2021 at the Learning Analytics Summer Institute 2021, hosted by the University of British Columbia and held virtually.
Supporting Higher Education to Integrate Learning Analytics (EUNIS, 7 November 2017) by Yi-Shan Tsai
This talk summarised the SHEILA project and its preliminary findings. It was presented at the EUNIS (European University Information Systems) workshop on 7 November 2017.
Toward Large-Scale Learning Design: Categorizing Course Designs in Service of Supporting Learning Outcomes, by Daniel Davis.
Presented in June 2018 at Learning @ Scale in London, England.
Learning analytics: Threats and opportunities by Martin Hawksey
Slides used at ALT's White Rose Learning Technologist's SIG to introduce threats and opportunities for using Learning Analytics. Links related to this presentation are at http://bit.ly/LAWhiteRose
Assessing Students and Tutors with Learning Analytics Dashboards by EADTU
Vassilios Verykios from the Hellenic Open University gave a presentation about Assessing Students and Tutors with Learning Analytics Dashboards as part of the online events organised by the Assessment expert pool within EMPOWER.
Bring your own idea - Visual learning analytics by Joris Klerkx
Workshop on visual learning analytics that was part of LASI 2014 - http://www.solaresearch.org/events/lasi-2/lasi2014/
Examples of learning dashboards were presented during the workshop by Sven Charleer:
http://www.slideshare.net/svencharleer/learning-dashboard-visual-learning-analytics-workshop-lasi2014-h-harvard
Presentation given by Rebecca Ferguson at Charles Sturt University, Wagga Wagga campus, on 16 March 2018. http://uimagine.edu.au/portfolio/guest-lecture-dr-rebecca-ferguson/
Talk by Rebecca Ferguson (Open University, UK, and LACE project).
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to move from small-scale research within our disciplines towards large-scale implementation across our institutions. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In her keynote, Rebecca introduced tools, resources, organisations and case studies that can be used to support the deployment of learning analytics at scale.
Using learning analytics to support formative assessment (oln 20171111) by Yi-Shan Tsai
This talk covers ideas about using learning analytics to enhance formative assessment, with an introduction of two learning analytics tools developed in Australia - Loop and OnTask.
Presentation given at SCONUL 2014, the summer conference of The Society of College, National and University Libraries, Glasgow, June 2014. The presentation focuses on frequently asked questions (FAQs) about learning analytics, with the emphasis on the role and perspective of libraries in this area.
The future of Learning Analytics: what is feasible and what is desirable? by SURF Events
Wednesday 11 November
Session round 4
Title: The future of Learning Analytics: what is feasible and what is desirable?
Speaker(s): Doug Clow (Open University UK), Hendrik Drachsler (Open Universiteit)
Room: Leeuwen I
Workshop run at the European Conference for e-Learning 2015 (ECEL 2015) at the University of Hertfordshire, UK. The workshop included an introduction of both learning analytics and learning design, as well as an exploration of how these could be employed in MOOCs. Some of the group work was focused on the Agincourt MOOC run by the University of Southampton on the FutureLearn platform.
Five short presentations from a panel session at the Learning Analytics and Knowledge Conference 2015, on the topic of "Learning Analytics - European Perspectives", held at Marist College, Poughkeepsie on March 18th 2015. The speakers are: Rebecca Ferguson, Alejandra Martínez Monés, Kairit Tammets, Alan Berg, Anne Boyer, and Adam Cooper.
Ten years ago there were no educational products available for K-12 Math that were truly adaptive. Now just about everyone claims to be adaptive in some way. But what does it mean to be “adaptive”? How do these products work? And how do you evaluate which best fits your needs?
In this presentation, Nigel Green, Vice President of User Experience at DreamBox Learning, discusses the evolving definition of adaptive learning and its application in varying technologies and approaches, including: how different student actions and behaviors can inform an adaptive engine, how adaptive learning programs can be integrated into your blended learning models, and some of the possible futures of adaptive learning.
Presentation by Rebecca Ferguson at Learning Analytics and Knowledge 2015 (LAK15), Poughkeepsie, NY, USA.
Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
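The engagement patterns described above come from clustering learners by their activity over the weeks of a course. As a toy illustration only (the data and the choice of plain k-means are illustrative assumptions, not the paper's method), each learner can be represented as a per-week activity vector and grouped with similar trajectories:

```python
# Toy sketch: cluster per-week activity vectors to surface engagement patterns.
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means on equal-length tuples."""
    rng = random.Random(seed)
    centres = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centre (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[nearest].append(p)
        # Move each centre to the mean of its cluster (keep it if the cluster is empty).
        centres = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centres[i]
                   for i, cl in enumerate(clusters)]
    return centres, clusters

# Hypothetical weekly activity counts for six learners over four course weeks:
learners = [
    (9, 8, 9, 8), (8, 9, 8, 9),   # steady high activity throughout
    (7, 3, 1, 0), (8, 2, 1, 0),   # strong start, then dropout
    (1, 0, 0, 0), (2, 0, 0, 0),   # sampling in week 1 only
]
centres, clusters = kmeans(learners, k=3)
print(len(centres), sum(len(c) for c in clusters))  # prints: 3 6
```

In practice the feature vectors, the number of clusters, and the clustering algorithm are all design decisions, which is exactly why the paper finds that pedagogy (video-and-assessment versus discussion-led courses) changes which patterns emerge.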
AIE 2015 China Conference: Using the NMC K-12 Horizon Report by David W. Deeds
David W. Deeds' presentation for the Alliance of International Educators' (AIE) 2015 China Chapter Conference: Using the New Media Consortium's (NMC) K-12 Horizon Report to Chart Your School's Future. Given Oct. 24-25 in Shanghai, China. David is the Technology Integrator/Teacher for the Yew Wah International Education School in Yantai, China.
Learning dashboards for actionable feedback: the (non)sense of chances of suc... by Tinne De Laet
Presentation at Leuven Learning Lab’s first annual Educational Technology conference day on Learning Analytics
(https://www.kuleuven.be/english/education/learning-lab).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Scalable, Actionable, and Ethical Learning Dashboards: a reality check by Tinne De Laet
Keynote presentation at Edmedia 2018 conference: https://www.aace.org/conf/edmedia/speakers/.
Results of Erasmus+ projects ABLE (www.ableproject.eu) and STELA (www.stela-project.eu) on learning dashboards for supporting first-year students.
Learning Analytics for online and on-campus education: experience and research by Tinne De Laet
This presentation was used by Tinne De Laet, KU Leuven, for a keynote presentation during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides 9 recommendations regarding learning analytics.
WCOL2019: Learning analytics for learning design or learning design for learn... by Marko Teräs
Presentation at the 28th ICDE World Conference on Online Learning on the relationship between learning design and learning analytics. Part of a national-level learning analytics research and development project funded by the Finnish Ministry of Education and Culture.
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
The power of learning analytics to unpack learning and teaching: a critical p... by Bart Rienties
Across the globe many educational institutions are collecting vast amounts of small and big data about students and their learning behaviour, such as their class attendance, online activities, or assessment scores. As a result, the emerging field of Learning Analytics (LA) is exploring how data can be used to empower teachers and institutions to effectively support learners. In the recent Innovative Pedagogy Report, Ferguson et al. (2017) encourage researchers and practitioners to move towards a new form of learning analytics called student-led learning analytics, which enables learners to specify their own goals and ambitions and supports them in reaching those goals. This is particularly helpful for individuals who have little time to spare for study. In this ESRC session, based upon 6 years of experience with LA data and large-scale implementations amongst 450,000+ students in a range of contexts, I will use an interactive format to discuss and debate three major questions: 1) To what extent is learning analytics the new holy grail of learning and teaching? 2) How can instructional design be optimised using the principles of learning analytics? 3) With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students?
This is the presentation that was delivered to the Viewpoints team at the first 'data day' - its aims were to show the immediate team the current stage of development and to discuss the data implications of the user interface and user choices.
Learning Dashboards for Feedback at Scale by Tinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards deployed at scale create new learning traces? This talk shares experiences from a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Using learning analytics to improve student transition into and support throu... by Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year", delivered at the EFYE 2016 conference in Ghent, Belgium.
Data-based feedback through learning dashboards: does it support the first-ye... by Tinne De Laet
Presentation supporting the EFYE 2018 pre-conference workshop "Data-based feedback through learning dashboards: does it support the first-year experience" - https://efye2018.nl/programme/parallel-sessions/
Improving education by learning analytics, by Tinne De Laet
These are the slides of the invited talk "Improving education by learning analytics" for the LAW studiedag 2017: https://www.maastrichtuniversity.nl/nl/events/studiedag-2017.
Learning analytics between dream and deed: honest experiences from concrete imp... by Tinne De Laet
Presentation given during https://www.surf.nl/agenda/2018/05/seminar-aan-de-slag-met-learning-analytics/index.html.
Experiences with learning dashboards gained at KU Leuven in the context of two Erasmus+ projects, ABLE and STELA.
Learning dashboards for feedback on learning and study skills and academic... by Tinne De Laet
Keynote presentation at the LESEC annual event 2018 (https://set.kuleuven.be/LESEC/news-events/annual-event-2018) on learning dashboards. Specifically, the presentation contains findings and recommendations from large-scale pilots at KU Leuven in the context of two European Erasmus+ projects: ABLE and STELA.
Confidence in and beliefs about first year engineering student success by Tinne De Laet
This paper explores the confidence freshman engineering students have in being successful in the first study year and which study-related behaviours they believe to be important to this end. Additionally, the paper studies which feedback these students would like to receive and compares it with the experiences of second-year students regarding feedback. To this end, two questionnaires were administered: one with freshman engineering students to measure their expectations regarding study success and expected feedback, and one with second-year engineering students to evaluate their first-year feedback experience.
The results show that starting first-year engineering students are confident regarding their study success. This confidence is, however, higher than the observed first-year student success. Not surprisingly, first-year students have good intentions and believe that most academic activities are important for student success. When second-year students look back on their first year, their belief in the importance of these activities has strongly decreased, especially regarding the importance of preparing for classes and following communication through email and the virtual learning environment. First-year students expect feedback regarding their academic performance and engagement. They expect this feedback to focus primarily on the impact on their future study pathway rather than on comparison to peer students. Second-year students indicate that the amount of feedback they receive could be improved, but agree with the first-year students that comparative feedback is less important.
Conference Key Areas: Engineering Education Research, Attractiveness of Engineering Education, Gender and Diversity
Keywords: academic self-confidence, feedback, reasons for students success, student beliefs
Learning and study strategies: a learning analytics approach for feedback by Tinne De Laet
Presentation of a learning dashboard developed by KU Leuven within the STELA project (http://stela-project.eu//).
The learning dashboard, supported by learning analytics, showcases the use of technology for learning in higher education, in particular for the transition from secondary to higher education. The dashboard provides feedback on learning and study strategies, as measured by the LASSI questionnaire.
Presentation of the learning dashboard developed by KU Leuven within the ABLE project (http://www.ableproject.eu/).
The learning dashboard, supported by learning analytics, showcases the use of technology for learning in higher education, in particular for the transition from secondary to higher education. The dashboard is developed to support the interaction between study advisor and student. More information in our journal paper: http://ieeexplore.ieee.org/document/7959628/
Learning dashboards for actionable feedback: the (non)sense of chances of success and predictive models
1. Learning dashboards for
actionable feedback
the (non)sense of chances of success and predictive models
Tinne De Laet
Tinne.DeLaet@kuleuven.be
@TinneDeLaet
2. “Learning analytics is about
collecting traces that learners
leave behind and using those
traces to improve learning.”
- Erik Duval
Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012, https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/
2
Learning Analytics?
3. Learning Dashboards?
3
Dashboard Confusion, Stephen Few, Intelligent Enterprise, March 20, 2004
“A dashboard is a visual display of the
most important information needed to
achieve one or more objectives;
consolidated and arranged on a single
screen so the information can be monitored
at a glance.”
- Stephen Few
4. Successful Transition from secondary to higher
Education using Learning Analytics
enhance a successful transition from
secondary to higher education by means of
learning analytics
design and build analytics dashboards,
dashboards that go beyond identifying at-risk
students, allowing actionable feedback for all
students on a large scale.
Achieving Benefits from Learning Analytics
research strategies and practices for using
learning analytics to support students during
their first year at university
developing the technological aspects of
learning analytics,
focuses on how learning analytics can be used
to support students.
4
www.stela-project.eu
@STELA_project
2015-1-UK01-KA203-013767
www.ableproject.eu
@ABLE_project_eu
562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD
5. STELA ♥ ABLE
5
actionable feedback
student-centered
program level
inclusive
first-year experience
institution-wide
Learning Analytics
actual implementation
6. [!] Feedback must be “actionable”.
6
“Warning! Male students have a 10% lower probability of being successful. You are male.” → what action?
“Warning! Your online activity is lagging behind.” → what action?
9. [!] Start with the available data.
Lots of data may eventually become
available in the future …
…. already start with what is available
9
(*)
(*) Zarraonandia, T., Aedo, I., Díaz, P., & Montero, A. (2013). An augmented lecture feedback system to support learner and teacher communication.
British Journal of Educational Technology, 44(4), 616-628.
11. Study advisor – student conversations
11
Should I consider
another program?
Can I still finish the
bachelor in 3 years?
How should I compose
my program for next
year?
What is the personal
situation?
How can I help?
What is the best
next step?
12. [!] Use all available expertise.
12
visualization experts
practitioners / end-users
researchers LA
researchers first-year
study success
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue.
In IEEE Transactions on Learning Technology (http://ieeexplore.ieee.org/document/7959628/).
14. [!] Wording matters.
14
73% chance of success
73% of students of earlier
cohorts with the same
study efficiency obtained
the bachelor degree
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
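The reframing above can be made concrete in a few lines. A minimal sketch, assuming hypothetical earlier-cohort records and an illustrative similarity tolerance on study efficiency, that reports an empirical statement about comparable students rather than a model's "chance of success":

```python
# Sketch (hypothetical data): report the empirical outcome of earlier cohorts
# with the same study efficiency, instead of a predicted "chance of success".

def cohort_feedback(earlier_cohorts, study_efficiency, tolerance=5):
    """earlier_cohorts: list of (study_efficiency, obtained_degree) tuples."""
    similar = [graduated for eff, graduated in earlier_cohorts
               if abs(eff - study_efficiency) <= tolerance]
    if not similar:
        return "No comparable earlier students."
    pct = round(100 * sum(similar) / len(similar))
    return (f"{pct}% of students of earlier cohorts with the same "
            f"study efficiency obtained the bachelor degree.")

cohorts = [(70, True), (72, True), (68, False), (71, True)]  # hypothetical
print(cohort_feedback(cohorts, 70))  # prints "75% of students ..."
```

The wording shift is the point: the same number becomes a description of what happened to comparable students, which invites reflection instead of presenting a verdict.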
15. LISSA dashboard
15
Three examination periods
observations, interviews,
questionnaires
pilot with two engineering programs
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue. In IEEE Transactions on Learning Technology
16. LISSA: evaluation – observations
16
15 observations
insights
(-) factual
(+) interpretative
(!) reflective
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue. In IEEE Transactions on Learning Technology
17. Evaluation – interviews
“When students see the numbers, they are
surprised, but now they believe me.
Before, I used my gut feeling, now I feel
more certain of what I say as well”.
“It’s like a main thread
guiding the
conversation.”
“I can talk about what to do with the results,
instead of each time looking for the data and
puzzling it together.”
“Students don’t know where to look during the
conversation, and avoid eye contact.
The dashboard provides them a point of focus”.
“A student changed her
study method in June and
could now see it paid off.”
LISSA supports a personal dialogue.
the level of usage depends on the experience
and style of the study advisors
fact-based evidence at the side
narrative thread
key moments and student path help to
reconstruct personal track
“I can focus on the
student’s personal
path, rather than on
the facts.”
“Now, I can blame
the dashboard and
focus on
collaboratively looking
for the next step to
take.”
17
18. LISSA: status
18
26 programs >4500 students
114 student advisors
training of study advisors
http://blog.associatie.kuleuven.be/tinnedelaet/lissa-learning-dashboard-supporting-student-advisers-in-traditional-higher-education/
Millecamp M., Gutiérrez F., Charleer S., Verbert K., De Laet T.# (2018). A qualitative evaluation of a learning dashboard to support advisor-student
dialogues. Proceedings of the 8th International Learning Analytics & Knowledge Conference. LAK. Sydney, 5-9 March 2018 (pp. 1-5) ACM.
dashboards for three examination
periods
19. LISSA: evaluation – student
questionnaires
19
26 programs @KU Leuven
291 student questionnaires
first examination period
“Confronting, but
useful”
“I want to use this
dashboard at home.”
“Also show the sub-grades
for labs, … ”
“How can I know the data is
trustworthy?”
“Can’t these visualizations be
sent to students?” “Crisp and clear.”
20. Student questionnaire January 2018 (N=291)
20
Response counts per statement (SD = Strongly Disagree, D = Disagree, N = Neither Agree or Disagree, A = Agree, SA = Strongly Agree):

                                                                      SD    D    N    A   SA
1. The dashboard is clarifying and surveyable.                         0    4   29  176   80
2. The shown information regarding my study situation is correct.      0    2   21  112  155
3. The shown position with respect to my fellow students
   (histograms per exam and global…                                    1    1   36  156   93
4. A conversation with my student advisors helped me to gain
   insight in my study trajectory.                                     1    4   37  132  116
5. The visualisation is of added value to the conversation
   with the student advisor.                                           1    4   49  141   92
6. The shown information provides me insight in my current
   situation.                                                          1    3   42  169   72
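Assuming the response counts above group by answer category (six statements each, from Strongly Disagree through Strongly Agree), the chart can be summarised in a few lines:

```python
# Summarize the Likert counts from the January 2018 questionnaire (N=291).
# Per statement: [Strongly Disagree, Disagree, Neither, Agree, Strongly Agree]
counts = {
    1: [0, 4, 29, 176, 80],
    2: [0, 2, 21, 112, 155],
    3: [1, 1, 36, 156, 93],
    4: [1, 4, 37, 132, 116],
    5: [1, 4, 49, 141, 92],
    6: [1, 3, 42, 169, 72],
}

for stmt, c in counts.items():
    total = sum(c)  # row totals fall slightly below 291 (non-response)
    agree = 100 * (c[3] + c[4]) / total
    print(f"Statement {stmt}: {agree:.0f}% agree or strongly agree (n={total})")
```

Every statement draws over 80% (strong) agreement, which is the headline result of the student evaluation.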
21. [!] Do not oversimplify. Show
uncertainty.
21
• reality is complex
• measurement is limited
• individual circumstances
• need for nuance
• trigger reflection
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
22. [!] Be careful with predictive
algorithms.
22
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
• reality is complex
• measurement is limited
• individual circumstances
• need for nuance
• trigger reflection
24. [!] Start with the available data.
24
data already available?
administrative (examples)
student records course grades
systems (examples)
LMS access logs advisor meetings
Broos T., Verbert K., Van Soom C., Langie G., De Laet T.# (2018). Small data as a conversation starter for learning analytics: exam results dashboard for first-year students in higher education. Journal of Research in Innovative Teaching & Learning, 1-14.
25. [!] Think beyond the obvious data.
25
• Don’t think too traditional.
• Many institutions are collecting survey
data for educational research.
26. [!] Not all data is usable.
26
example data from a traditional course with “VLE as a file system”
test scores
activity/week (#days)
weeks of the year
27. [!] Not all data is usable.
27
example data from a course with flipped classroom & blended learning
exam scores
activity (# of modules used)
Not a single student
using less than 10
modules passed the
course.
Most of the successful
students used 15
modules or more.
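The observation on this slide amounts to a simple threshold check on module usage versus course outcome. A minimal sketch with hypothetical student records:

```python
# Sketch (hypothetical data): verify the slide's observation that no student
# using fewer than 10 modules passed the flipped-classroom course, while
# heavy module users mostly passed.
students = [  # (modules_used, passed) -- hypothetical records
    (5, False), (8, False), (12, True), (15, True), (18, True), (9, False),
]

low_usage_passed = [s for m, s in students if m < 10 and s]
heavy_users = [s for m, s in students if m >= 15]

print("Low-usage students who passed:", len(low_usage_passed))
print("Pass rate among students using >= 15 modules:",
      sum(heavy_users) / len(heavy_users))
```

Even a crude check like this can separate courses where VLE activity carries signal from courses where the VLE is only a file system.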
28. [!] Keep Learning Analytics in
mind when designing learning
activities.
28
Learning
Analytics
Learning Design
INFORM
ENABLE
If LA indeed contributes to improved
learning design…
… don’t make it an afterthought
30. data already available?
administrative (examples)
student records course grades
[!] Think beyond the obvious data.
30
systems (examples)
LMS access logs advisor meetings
surveys (examples)
quality assurance LASSI
31. ~ 30 LASSI questions
(shortened version)
“Learning Skills”
Example: When preparing for an
exam, I create questions that I
think might be included.
Example: I find it difficult to
maintain my concentration
while doing my coursework.
Example: I find it hard to stick
to a study schedule.
raw scores
(selected 5 out of 10)
CONCENTRATION
MOTIVATION
FAILURE ANXIETY
TEST STRATEGY
TIME MANAGEMENT
norm scores
(in Flemish HE context)
Example: STRONG
Example: AVERAGE
Example: LOW
Example: VERY STRONG
Example: VERY WEAK
31
Metacognitive abilities
Pinxten, M., Van Soom, C., Peeters, C., De Laet, T., Langie, G., At-risk at the gate: prediction of study success of first-year science and engineering students in an
open-admission university in Flanders—any incremental validity of study strategies? Eur J Psychol Educ (2017).
readySTEMgo Erasmus+ project https://iiw.kuleuven.be/english/readystemgo
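The raw-to-norm mapping can be sketched as a percentile banding. The five category labels come from the slide; the cut-offs below are illustrative assumptions, not the actual Flemish HE norms:

```python
# Sketch: map a percentile on a LASSI scale to a norm category.
# Labels are from the slide; the cut-offs are illustrative assumptions.
BANDS = [(10, "VERY WEAK"), (30, "LOW"), (70, "AVERAGE"),
         (90, "STRONG"), (100, "VERY STRONG")]

def norm_category(percentile):
    for upper, label in BANDS:
        if percentile <= upper:
            return label
    raise ValueError("percentile must be in [0, 100]")

print(norm_category(55))  # prints "AVERAGE"
```

Reporting the norm category instead of the raw score is what makes the feedback interpretable: a raw concentration score means little without the Flemish HE reference population.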
32. Dashboard learning skills
32
students complete LASSI
questionnaire
students received personalized email
with invitation for dashboard
4367 students in 26 programs
in 9 faculties @KU Leuven
demo:
https://learninganalytics.set.kuleuven.be/lassi-1718/ (KU Leuven login)
2 programs @TU Delft
33. Feedback model
1. What is this about?
2. How am I doing?
3. How does this relate to
others?
4. Why is this relevant?
5. What can I do about it?
33
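The five questions of the feedback model map naturally onto a message template; a minimal sketch with illustrative field names (playing the same role as the @studyProgram@ and @yourScore@ placeholders shown on the next slide):

```python
# Sketch: the five-question feedback model as a personalised message
# template. Field names and sample values are illustrative.
from string import Template

FEEDBACK = Template(
    "What is this about? $concept\n"
    "How am I doing? Your score: $your_score\n"
    "How does this relate to others? Program median: $program_median\n"
    "Why is this relevant? $relevance\n"
    "What can I do about it? $advice"
)

print(FEEDBACK.substitute(
    concept="Time management (LASSI)",
    your_score=22, program_median=25,
    relevance="Time management relates to first-year study success.",
    advice="Plan fixed weekly study blocks.",
))
```

Keeping the five questions explicit in the template makes it easy to check that every dashboard element ends with an actionable step, not just a number.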
34. 34
3. How does this relate to
others?
2. How am I doing?
1. What is this about?
@studyProgram@
@yourScore@
35. 4. Why is this relevant?
5. What can I do about it?
35
39. Students that click through
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness.
In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
39
better learning skills
40. More intense users
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness.
In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
40
worse learning skills
41. [!] Give students “the key”.
41
• Student has the key to own
data.
• Student takes initiative to
share/discuss own data.
• GDPR as opportunity!
43. [!] Acceptance precedes impact.
43
• Involve stakeholders from the start and
value their input!
COmmunication
COoperation
• Demonstrate usefulness.
• Take care of ethics and privacy.
• Best scenario:
students & study advisors as ambassadors
COCO
44. Impact?
survey before intervention
2nd year students 2016-2017
experiences first-year feedback
41 questions, 5-point Likert scale
pen & paper
dashboards
LISSA
LASSI (learning skills)
3 x REX (grades)
Survey after intervention
2nd year students 2017-2018
45. Impact?
During the first year I received sufficient information regarding my academic achievements.
45
Engineering Science (p<0.001)
46. Impact?
The information I received helped to position myself with respect to my peers.
46
Engineering Science (p<0.001)
48. [!] Context matters!
• available data
• national and institutional regulations
and culture
• educational vision
• educational system, size of population ..
• …
Don’t just copy existing LA solutions!
48
49. Summary
case studies 11 findings/recommendations
[!] Use all available expertise.
[!] Start with the available data.
[!] Look beyond the obvious data.
[!] Not all data is usable.
[!] Wording matters.
[!] Don’t oversimplify. Show uncertainty.
[!] Beware of predictive algorithms.
[!] Keep Learning Analytics in mind when designing
learning activities.
[!] Give students “the key” to their data.
[!] Acceptance precedes impact.
[!] Context matters!
humble approach
small data
involvement of stakeholders, especially practitioners
actionable feedback
scalability
traditional university settings
Is this Learning Analytics?
51. Project team @
51
Sven Charleer
AugmentHCI, Computer Science department
PhD researcher ABLE
Katrien Verbert
AugmentHCI, Computer Science department
Copromotor of STELA & ABLE
Carolien Van Soom
Leuven Engineering and Science Education Center
Head of Tutorial Services of Science
Copromotor of STELA & ABLE
Greet Langie
Leuven Engineering and Science Education Center
Vicedean (education) faculty of Engineering Technology
Copromotor of STELA & ABLE
Tinne De Laet
Leuven Engineering and Science Education Center
Head of Tutorial Services of Engineering Science
Coordinator of STELA
KU Leuven coordinator of ABLE
Francisco Gutiérrez
AugmentHCI, Computer Science department
PhD researcher ABLE
Tom Broos
Leuven Engineering and Science Education Center
AugmentHCI, Computer Science department
PhD researcher STELA
Martijn Millecamp
AugmentHCI, Computer Science department
PhD researcher ABLE
Special thanks to study advisors for their cooperation, advice, feedback, and support!
Jasper, Bart, Riet, Hilde, An, Katrien, …
♥