Although Massive Open Online Courses (MOOCs) have the potential to make quality education affordable and available to the masses, completion rates are extremely low due to the high level of autonomy and self-regulation skills that MOOCs require.
The aim of the present work is to investigate how self-regulated learning skills can be enhanced by encouraging metacognition and reflection in MOOC learners by means of social comparison. To this end, following an iterative process, we developed the Learning Tracker, an interactive widget which allows learners to visualise their learning behaviour and compare it to that of previous graduates of the same MOOC. Each iteration was extensively evaluated in live TU Delft MOOCs running on the edX platform, engaging over 20,000 MOOC learners.
Our results show that learners who have access to the Learning Tracker are more likely to graduate from the MOOC. Moreover, we observed that the widget has a positive impact on learners' engagement and reduces procrastination. Based on our results, we argue that the mere fact of receiving feedback on a limited number of learning habits could trigger self-reflection in learners and lead to improved learner performance.
4. What is a MOOC?
Massive Open Online Course
Best Courses. Top Institutions. Learn anytime, anywhere.
• 35 million learners
• 500 universities
• 4,200 MOOCs
5. Dropout as a main challenge
• Low completion rates, < 15% (Jordan, 2016)
• Underdeveloped learning skills and study habits
– High autonomy
– Role of the teacher
– Low metacognitive awareness
6. Self Regulated Learning
• Definition: capability of the learner "to adjust her actions and goals to achieve desired results in light of changing environmental conditions" (Zimmerman, 1990)
• Major success factor in online learning environments, including MOOCs
• Lack of learner support in current MOOC platforms
8. Aim
Investigate how self-regulated learning skills can be enhanced in MOOC learners by:
• encouraging metacognition and self-reflection on learning behaviour
• providing feedback through social comparison with successful learners on a learner dashboard
10. Development
Design-based research methodology
• Incremental
• Evaluation on edX MOOCs offered by TU Delft
Two components
• Data
• Visualisation
Timeline: first iteration and evaluation, January – March 2016; second iteration and evaluation, April – June 2016.
13. Preliminary evaluation of the first iteration
• Metric configuration
• Additional information set
– Average graduate in the following week
– Reflection and planning support
Adjustments in the second iteration
15. Preliminary evaluation of the first iteration
• Metric configuration
• Additional information set
– Average graduate at the end of current week
– Reflection and planning support
• Interactive elements
Adjustments in the second iteration
20. Experimental setup
Three TU Delft MOOCs
– Weekly publication of learning material
– Video lectures, weekly assignments, practice quizzes
– Graduation: > 60% final score
Replicated longitudinal study
21. Experimental setup
Method: randomized controlled trial
– Demographic analysis to ensure populations are sufficiently randomized

                 WaterX    SewageX    InnovationX
Test group        5,460      4,038          1,184
Control group     5,483      4,099          1,168
Total enrolled   10,943      8,137          2,352
27. Learners' behaviour
RQ2.1: Do learners become more engaged with the MOOC when they can compare their behaviour with that of successful learners?
28. Learners' engagement – course material
Learners are more engaged with the graded course material.
Mann-Whitney test results (p-values) between the test group and the control group; significance level α = .050, significant differences marked with *.

                               WaterX    SewageX    InnovationX
Graded quizzes                  .036*      .114          .044*
Practice non-graded quizzes     .512       .071             -
29. Learners' engagement – course material
More learners are engaged with graded course content.
31. Learners' self-regulation
RQ2.2: Do learners show improvement of their time-management skills when they compare their behaviour to that of successful learners?
32. Learners' self-regulation – procrastination
Learners procrastinate less.
Mann-Whitney test results (p-values) between the test group and the control group; significance level α = .050, significant differences marked with *.

                           WaterX    SewageX    InnovationX
Timeliness (recommended)    .055       .113          .039*
Timeliness (actual)         .040*      .145          .035*
34. Learners' on-trackness
RQ2.3: Do learners change their behaviour so it becomes similar to that of successful learners when they compare themselves to it?
35. Learners' on-trackness
Similarity between a learner's behaviour and that of the average graduate:
1. Compute the on-trackness score weekly
2. Cluster learners based on the evolution of the on-trackness score
36. Learners' on-trackness – clusters
No conclusive evidence that the Learning Tracker influences the distribution of learners into clusters.
Cluster labels (from the figure): on-track; behind, but keep up; behind, initial activity; behind, no activity.
MOOC stands for Massive Open Online Course. The term massive refers to the large number of learners that can participate. MOOCs are open, meaning that learners have free access to the content once they enroll. The term online refers to the fact that accessing a MOOC can only be done by somebody that has Internet access.
MOOCs caught the public eye in 2012, when top universities like Stanford, MIT or Harvard launched today's largest MOOC platforms. One such example is edX, a nonprofit MOOC provider founded by Harvard University and MIT. Their mission is to increase access to high-quality education for anyone, anywhere.
MOOCs are expected to revolutionize education by making high quality education accessible to the masses and thus reducing the gap between the most privileged and the most disadvantaged learners.
To date, more than 35 million learners have enrolled in at least one MOOC, and more than 500 universities are offering over 4,000 MOOCs.
Despite their potential, MOOCs face some challenges. One major challenge is a low completion rate, most of the time below 15%. Although the literature identifies several reasons for learners dropping out early, the one we focus on is underdeveloped learning skills and study habits. Very often, learners drop out because they are not equipped with the proper skills to learn with a MOOC.
MOOCs are a new learning environment that requires high autonomy in terms of motivation, defining learning paths and engaging with other MOOC participants. Learners have a lot of freedom in choosing when, where, what and how to learn.
Learners face fewer constraints than in traditional face-to-face education: there are no consequences for failing or dropping out, and the role of the instructor/teacher changes.
Many times learners are not aware that their learning skills are not adequate. Low metacognitive awareness means that learners are not inclined to think about and evaluate their own thinking process or the effectiveness of their strategies.
Yet, there are solutions. Learning psychologists identified that what makes learners successful is a skill called self-regulated learning. They explain that this is a major factor influencing the large-scale success or failure of online learning environments, including MOOCs.
However, current MOOC platforms fail to support learners in developing these skills which are indispensable for online learning.
Looking at edX, for example, learners have the Progress page on which they can view the grades they obtained for each assignment, but they do not receive any information about how they could improve their learning.
Thus, the aim of this work is to address the lack of support in the learning process offered to learners and to investigate how SRL skills can be enhanced in MOOC learners.
Relying on research done in the field of learning sciences, we hypothesize that SRL skills can be developed if learners reflect on and evaluate their learning behaviour.
In order to test this hypothesis, we developed a learner dashboard embedded on edX MOOC pages on which learners receive feedback on their learning behaviour. At the same time, learners can compare their behaviour to that of previously successful learners.
What we considered "successful learners" are learners who graduated from previous editions of the same MOOC.
Incremental development: First iteration in January-March 2016 and a second iteration in April-June 2016
How it works: based on the edX trace logs that record every action learners perform on the platform, we extract and compute a series of metrics that are displayed on a widget embedded in the course pages.
Reasons for choosing a spider chart: a concise visualisation of numerous metrics in a small space and an easy medium for comparison.
Six metrics are displayed around the spider chart. Learners can visualise their own performance and that of the average graduate. The values of the metrics are computed once, before the beginning of the course, for the average graduate. The values for the current learners are updated once a week, when new edX data becomes available.
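To make this concrete, here is a minimal Python sketch (not the actual pipeline, which, as noted below, was written in Java 8) of how the two series shown on the spider chart could be assembled; the metric names, scaling bounds and graduate values are illustrative assumptions.

```python
# Illustrative sketch only: scale each behaviour metric to [0, 1] so a
# learner's weekly values can be overlaid on the precomputed average-graduate
# values. Metric names and numbers below are assumptions, not study data.

AVERAGE_GRADUATE = {        # computed once, before the course starts
    "time_on_platform": 0.62,
    "video_time": 0.55,
    "quiz_attempts": 0.70,
    "timeliness": 0.48,
    "forum_activity": 0.30,
    "weekly_visits": 0.66,
}

def scale(value, lower, upper):
    """Clamp and rescale a raw metric value into [0, 1]."""
    if upper == lower:
        return 0.0
    return max(0.0, min(1.0, (value - lower) / (upper - lower)))

def chart_series(raw_metrics, bounds):
    """Return the two series drawn on the spider chart for one learner."""
    learner = {m: scale(v, *bounds[m]) for m, v in raw_metrics.items()}
    return {"learner": learner, "average_graduate": AVERAGE_GRADUATE}
```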
The first iteration was then evaluated on a live TU Delft MOOC on edX. Based on preliminary results that showed that the LT had a positive effect on the graduation rates, several adjustments were made to the second iteration.
First off, the metrics were changed to focus more on self-regulating behaviour. For example, we included metrics that referred to the number of visits per week, the average time between two consecutive sessions or the average length of a session.
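As an illustration of how such session-based metrics could be derived from timestamped edX events, consider the following Python sketch; the 30-minute inactivity gap used as a session boundary is an assumption, not a value taken from the study.

```python
from datetime import timedelta

SESSION_GAP = timedelta(minutes=30)  # assumed inactivity threshold

def sessions(timestamps):
    """Group a learner's sorted event timestamps into (start, end) sessions."""
    if not timestamps:
        return []
    result = []
    start = prev = timestamps[0]
    for t in timestamps[1:]:
        if t - prev > SESSION_GAP:
            result.append((start, prev))
            start = t
        prev = t
    result.append((start, prev))
    return result

def session_metrics(timestamps):
    """Number of sessions, average session length and average gap (seconds)."""
    s = sessions(timestamps)
    lengths = [(end - start).total_seconds() for start, end in s]
    gaps = [(s[i + 1][0] - s[i][1]).total_seconds() for i in range(len(s) - 1)]
    return {
        "num_sessions": len(s),
        "avg_session_length_s": sum(lengths) / len(lengths) if lengths else 0.0,
        "avg_gap_between_sessions_s": sum(gaps) / len(gaps) if gaps else 0.0,
    }
```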
Regarding the technologies used to implement it: for computing the behaviour metrics, we used code written in Java 8.
The widget itself is a JavaScript script, and for drawing it we used Highcharts, a JS charting library.
What gives validity to our study is that we evaluated two iterations of the LT over the full duration of each MOOC. Such studies are rare in the literature.
Method: Learners are randomly assigned to a test or a control group. Test group has access to the LT, while the control group does not. This method allowed us to identify the effects of the Learning Tracker by comparing the behaviour of the test group with that of the control group.
The analysis was performed taking into account only data from active learners (those who spent more than 5 minutes on the platform).
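A minimal sketch of these two steps, random assignment at enrolment and the active-learner filter, might look as follows; the field and function names are hypothetical.

```python
import random

def assign_group(learner_id, seed=42):
    """Deterministically assign a learner to the test or control group."""
    rng = random.Random(f"{seed}-{learner_id}")
    return "test" if rng.random() < 0.5 else "control"

def active_learners(learners, min_seconds=5 * 60):
    """Keep only learners who spent more than 5 minutes on the platform."""
    return [l for l in learners if l["time_on_platform_s"] > min_seconds]
```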
Here we plotted the percentage of learners that graduated in each group in all three MOOCs. As we can see, the graduation percentage is higher for the test group in all three cases.
No significant differences between the final grades.
We also looked at the final grades the learners obtained. Statistical tests show no significant differences between the test group and the control group.
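The slides report Mann-Whitney U tests at a significance level of .050; a comparison of final grades between the two groups along those lines could be sketched in Python as follows (variable names are illustrative).

```python
from scipy.stats import mannwhitneyu

def compare_groups(test_grades, control_grades, alpha=0.05):
    """Two-sided Mann-Whitney U test; returns the p-value and significance."""
    _, p_value = mannwhitneyu(test_grades, control_grades,
                              alternative="two-sided")
    return p_value, p_value < alpha
```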
This plot shows the distribution of learners in the test group and the control group according to the final grade. Two things to notice:
Firstly, there is a high density around 0 – common to every MOOC
Secondly, the curve representing the test group is above the one representing the control group – meaning that more learners passed the graduation threshold, although they did not pursue higher grades.
The same results hold for the other two courses.
The next steps of the analysis looked into possible reasons for this change in graduation rate. Did the learners' behaviour change? And if so, in what ways?
Solving practice quizzes first to grasp concepts is not a strategy employed by the test learners.
This might be an explanation for the higher percentage of learners in the 60–80% range for the final grade.
This graph shows the progression of both groups through the first MOOC with respect to the number of learners that attempted at least one graded quiz question since the beginning of the course.
The widget was made available in week 2.
Number of learners that attempted at least one graded quiz question since the beginning of the course.
The difference becomes visible between week 2 and 3, a week after the widget was made available.
We investigated how learners use their time on the platform and their time-management skills.
We observed significant differences in one metric: the timeliness of submission, i.e. the average time between when learners submit assignments and the deadline.
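A small sketch of this timeliness metric, assuming each submission is stored together with its deadline; positive values mean submitting before the deadline.

```python
def timeliness_hours(submissions):
    """submissions: list of (submitted_at, deadline) datetime pairs."""
    if not submissions:
        return 0.0
    deltas = [(deadline - submitted_at).total_seconds() / 3600
              for submitted_at, deadline in submissions]
    return sum(deltas) / len(deltas)
```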
To evaluate learners' on-trackness, we measured the similarity between a learner's behaviour and that of successful learners by calculating an on-trackness score.
Using the k-means clustering algorithm, we grouped learners into clusters with similar behaviours over time.
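A sketch of this clustering step, assuming each learner is represented by the sequence of their weekly on-trackness scores; the choice of four clusters mirrors the four patterns described below.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_trajectories(weekly_scores, n_clusters=4, seed=0):
    """weekly_scores: (n_learners, n_weeks) array of on-trackness scores."""
    X = np.asarray(weekly_scores)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels = km.fit_predict(X)           # cluster assignment per learner
    return labels, km.cluster_centers_   # centres = typical trajectories
```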
The learners exhibit similar behaviour patterns in all three courses.
- Describe the clusters
Out of the four patterns, two (Cluster 1 and Cluster 2) show a steady progress, while the other two exhibit a decrease in on-trackness over time (Cluster 3 and Cluster 4). The decrease in on-trackness score reflects drop-outs or very low activity.
However, when we looked at the distribution of the test and control learners into the four clusters we did not find any evidence that shows that the LT motivates learners to be on-track.
Learning behaviour can be broken down into several study habits that influence each other
Learning behaviour is a “web” of study habits
Positive effect: (a) Increases the likelihood of graduation because it increases the engagement with graded course material. (b) Reduces procrastination.
on-trackness score = the similarity between one's behaviour and that of successful learners
To evaluate learners' on-trackness, we measured the similarity between a learner's behaviour and that of successful learners by calculating an on-trackness score. The score is computed through an arithmetic weighted sum of the metric deviations. The metric deviations are the differences between a learner's scaled value of a metric and the average graduate's scaled value. The weights are inversely proportional to the amplitude of the metric deviation: the higher the difference, the lower the weight.
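Written out, with l_i and g_i the learner's and the average graduate's scaled values of metric i, and d_i the metric deviation, one reading of this description (the normalisation of the weights and the small constant ε are assumptions, since they are not specified above) is:

\[
d_i = l_i - g_i, \qquad
w_i = \frac{1/(\lvert d_i\rvert + \varepsilon)}{\sum_j 1/(\lvert d_j\rvert + \varepsilon)}, \qquad
\text{on-trackness} = \sum_i w_i\, d_i .
\]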
The inspiration for this work was research done in the field of search engines. Bateman investigated the effects of reflection and social comparison in search behaviour with positive results.
As a reference model, they quantified the behaviour of “expert searchers”. In our case, what we consider a model worth comparing against are previous graduates of the same MOOC.