A reflection on where we are with learning analytics as a new multi-disciplinary research area, drawing on the Learning Analytics Conference 2013 with respect to assessment.
The Power of Learning Analytics: Is There Still a Need for Educational Research? – Bart Rienties
Across the globe many institutions and organisations have high hopes that learning analytics can play a major role in helping their organisations remain fit-for-purpose, flexible, and innovative. A broad goal of learning analytics is to apply the outcomes of analysing data gathered by monitoring and measuring the learning process. Learning analytics applications in education are expected to provide institutions with opportunities to support learner progression, but more importantly provide personalised, rich learning on a large scale. Substantial progress in learning analytics research has been made in the last few years.
Researchers in learning analytics use a range of advanced computational techniques (e.g., Bayesian modelling, cluster analysis, natural language processing, machine learning) for predicting which learners are likely to fail or succeed, and how to provide appropriate support in a flexible and adaptive manner.
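As a purely illustrative aside, a minimal sketch of the kind of pass/fail prediction described above could use logistic regression over engagement features. The feature names and data below are invented for illustration; real pipelines would draw on VLE clickstream and assessment records.

```python
# Hypothetical sketch: predicting pass/fail from engagement features with
# a small logistic-regression model trained by gradient descent.
# All feature names and data here are simulated, not real student data.
import numpy as np

rng = np.random.default_rng(0)
n = 400
# Invented columns: weekly VLE logins, forum posts, assignments submitted
X = rng.poisson(lam=[5.0, 2.0, 3.0], size=(n, 3)).astype(float)
true_w = np.array([0.4, 0.3, 0.5])
logits = X @ true_w - 4.0            # more engagement -> more likely to pass
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

w = np.zeros(3)
b = 0.0
for _ in range(2000):                # plain gradient descent on the log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y) / n)
    b -= 0.1 * (p - y).mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {accuracy:.2f}")
# Students with low predicted p could then be flagged for adaptive support.
```

In practice such models feed early-warning systems: the predicted probability, not the binary label, is what drives the choice and timing of support interventions.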
In this keynote, I will argue that unless educational researchers at EARLI embrace some of the key principles, methods, and approaches of learning analytics, they may be left behind. In particular, a main merit of learning analytics is linking large datasets of actual learning processes and outcomes with learning dispositions and learner characteristics. Using evidence-based approaches, rapid insights are developed into how learning designs and learning processes can be optimised to maximise the potential of each learner. For example, our recent research with 151 modules and 133K students at the Open University UK indicates that learning design has a strong impact on student behaviour, satisfaction, and performance. Learning analytics can also drive learning in more “traditional”, face-to-face contexts. For example, by measuring emotions, epistemological expressions, and cross-cultural dialogue, social interactions can be effectively supported by innovative dashboards and adaptive approaches. I aim to unpack the advantages and limitations of learning analytics and how EARLI researchers can embrace such data-driven research approaches.
More info at www.bartrienties.nl
ASCILITE Webinar: A review of five years of implementation and research in al... – Bart Rienties
Date and time: Wednesday 20 September 2017 at 5pm AEST
Abstract: The Open University UK (OU) has been one of few institutions that have explicitly and systematically captured the designs for learning at a large scale. By applying advanced analytical techniques on large and fine-grained datasets, we have been unpacking the complexity of instructional practices, as well as providing empirical evidence of how learning designs influence student behaviour, satisfaction, and performance. This seminar will discuss the implementation of learning design at the OU in the last 5 years, and reviews empirical evidence from several studies that have linked learning design with learning analytics. Recommendations are put forward to support future adoptions of the learning design approach, and potential research trajectories.
https://ascilite.org/get-involved/sigs/learning-analytics-sig/
www.bartrienties.nl
Understanding the relationship between pedagogical beliefs and technology use... – Vrije Universiteit Brussel
Current evidence indicates that the use of technology during teaching and learning activities is steadily increasing (Berrett, Murphy, & Sullivan, 2012; Inan & Lowther, 2010; National Education Association, 2008), yet achieving ‘technology integration’ is a complex process of educational change. This is apparent as the use of technology in schools is still extremely varied and, in many instances, limited (e.g., Spector, 2010; Tondeur, Cooper, & Newhouse, 2010). In this respect, achieving the goal of meaningful technology integration (i.e., using technology to support 21st century teaching and learning) does not depend solely on technology-related factors (see also Arntzen & Krug, 2011; Sang, Valcke, van Braak, Tondeur, & Chang, 2010). Rather, the personal willingness of teachers plays a key role in teachers’ decisions whether and how to integrate technology within their classroom practices (Hermans, Tondeur, van Braak & Valcke, 2008; Ottenbreit-Leftwich, Newby, Glazewski, & Ertmer, 2010).
According to previous studies, teachers select applications of technology that align with their selection of other curricular variables and processes (e.g., teaching strategies) and that fit into their existing beliefs about ‘good’ education (Hermans et al., 2008; Niederhauser & Stoddart, 2001). Technological devices such as computers, tablets, or interactive whiteboards do not embody one single pedagogical orientation (Lawless & Pellegrino, 2007); rather, they enable the implementation of a spectrum of approaches to teaching and learning (Tondeur, Hermans, van Braak, & Valcke, 2008). In other words, the role technology plays in teachers’ classrooms depends on their conceptions of the nature of teaching and learning. In this respect, research on educational innovations suggests that technology integration can only be fully understood when teachers’ pedagogical beliefs are taken into account (Ertmer, 2005; Hermans, 2009).
With the impetus and call for increased technology integration (e.g., U.S. DOE, 2010; UNESCO, 2011), it is critically important to examine the link between teachers’ beliefs and teachers’ practices. In the last decade, the relationship between the pedagogical beliefs of teachers and their uses of technology has been examined extensively (cf. Hermans et al., 2008; Ottenbreit-Leftwich et al., 2010; Prestridge, 2009, 2010), but still this relationship remains unclear (Mueller et al., 2008). Given the centrality and importance of teachers’ pedagogical beliefs and the lack of a clear understanding about the relationship between beliefs and classroom technology use, the purpose of this review study is to examine and clarify this relationship. A meta-aggregative approach was used to locate, critically appraise, and synthesize the qualitative evidence base (see Hannes & Lockwood, 2011).
Research program educationaldataanalytics4personalisedt&l-2017 – Demetrios G. Sampson
Educational Data Analytics for Personalised Teaching and Learning
Keynote Speaker
2017 Symposium on Taiwan-Estonia Research Cooperation, Taipei, Taiwan
6-9 March 2017
Learning Dashboards for Feedback at Scale – Tinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards, deployed at scale, create new learning traces? This talk shares experiences of a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Using learning analytics to improve student transition into and support throu... – Tinne De Laet
This document provides an overview of a workshop on using learning analytics to improve student transition and support in the first year. The workshop was delivered by the ABLE and STELA projects in partnership.
It begins with introductions of the presenters and a discussion of the workshop structure. Next, the document explores definitions and concepts of learning analytics through short discussions and examples. It then highlights examples of learning analytics projects and implementations at partner institutions like Nottingham Trent University, Leiden University, and Delft University of Technology.
The workshop also included an exploration activity where participants discussed goals and interventions for a hypothetical learning analytics project. Finally, the document outlines three case studies that workshop groups worked on, with an emphasis on presenting results.
AI in Education Amsterdam Data Science (ADS) What have we learned after a dec... – Bart Rienties
The Open University UK (OU) has been implementing learning analytics and learning design on a large scale since 2012. With its 170,000+ students and 4000+ teaching staff, the OU has been at the forefront of testing, implementing, and evaluating the impact of learning analytics and learning design on student outcomes and retention. A range of reviews and scholarly repositories (e.g., Web of Science) indicate that the OU is the largest contributor to academic output in learning analytics and learning design in the world. However, despite the large uptake of learning analytics at the OU, there are a range of complex issues in terms of buy-in from staff, data infrastructures, ethics and privacy, student engagement, and, perhaps most importantly, how to make sense of big and small data in a complex organisation like the OU. During his talk Bart will present the implementation and lessons learned.
Xiao Hu "Overview of the Space of Learning Analytics and Educational Data Min... – CITE
This document provides an overview of the fields of learning analytics and educational data mining. It discusses the types of methods used in each field, including prediction, relationship mining, and discovery with models. Recent trends are noted, such as increased emphasis on constructs like motivation and engagement, and broader data sources. Challenges and opportunities are also presented, such as improving connections between fields and addressing issues around research ethics, privacy, and data management. Upcoming conferences and a MOOC on the topic are also announced. Questions from attendees are answered, focusing on friction between fields, collaboration opportunities, and handling research ethics and privacy concerns.
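To make the "discovery with models" style of method mentioned above concrete, here is an illustrative sketch that groups learners by behaviour with a tiny k-means implementation. The two behaviour features and all data are invented for the example.

```python
# Illustrative only: clustering learners by behaviour with a minimal
# k-means (Lloyd's algorithm). Feature names and data are invented.
import numpy as np

rng = np.random.default_rng(2)
# Two invented behaviour features: hours online per week, quiz attempts.
# Simulate two distinct engagement profiles.
low = rng.normal([2.0, 1.0], 0.5, size=(50, 2))
high = rng.normal([8.0, 6.0], 0.5, size=(50, 2))
X = np.vstack([low, high])

k = 2
centres = X[rng.choice(len(X), k, replace=False)]
for _ in range(20):  # alternate assignment and centre-update steps
    labels = np.argmin(((X[:, None] - centres) ** 2).sum(-1), axis=1)
    centres = np.array([X[labels == j].mean(0) for j in range(k)])

print("cluster centres (hours online, quiz attempts):")
print(centres.round(1))
```

Educational data mining work of this kind typically uses the resulting clusters as a starting point for interpretation (e.g., labelling engagement profiles), not as an end result.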
The document discusses a study by Ajjan and Hartshorne from 2008 that investigated faculty decisions to adopt Web 2.0 technologies. The study examined whether faculty were aware of benefits of using these technologies and what factors predicted their adoption. It focused on perceived usefulness, ease of use, and compatibility with teaching style. The findings were that efforts should improve perceived usefulness, ease of use and compatibility of Web 2.0 tools, and best practices models are needed to facilitate adoption in higher education. The document then provides descriptions of various media tools that could be used in courses.
Learning design meets learning analytics: Dr Bart Rienties, Open University – Bart Rienties
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time and how students engage?
Using learning analytics to support learners and teachers at the Open University – Bart Rienties
In this seminar Prof Bart Rienties will reflect on how the Open University UK has become a leading institution in implementing learning analytics at scale amongst its 170K students and 5K staff. Furthermore, he will discuss how learning analytics is being adopted at other UK institutions, and what the implications for higher education might be in these Covid-19 times.
https://www.kent.ac.uk/cshe/news-events.html
Are Wikis and Weblogs an appropriate approach to foster collaboration, refle... – Christian Schmidt
The document discusses research into using wikis and weblogs to foster student collaboration, reflection, and motivation in mathematics education. A research study was conducted with 127 German students to examine the effects of using individual and class weblogs on students' self-determination and reflection. The results found no significant differences between the groups. Qualitative data from student interviews and blog posts will be further analyzed to understand how digital tools can support learning. Previous research on using wikis and weblogs in education is also summarized.
HESA JISC DATA The Power of Learning Analytics with(out) learning design – Bart Rienties
1. Learning analytics provides insights into student engagement, satisfaction, and performance when combined with data on learning design and teacher interventions.
2. An analysis of over 150 modules found that the type of learning design impacted online behavior, end-of-module surveys, and exam results.
3. Providing teachers with predictive learning analytics and visualizations on at-risk students led to increased usage of the tools and had a positive impact on student performance and retention rates according to regression analysis.
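Point 3 describes a regression analysis linking teachers' use of predictive analytics to student outcomes. A minimal sketch of such an analysis, with entirely invented data and variable names, might look like this; real studies of this kind control for many more module and student characteristics.

```python
# Illustrative sketch only: ordinary least squares relating teachers'
# dashboard usage to module pass rates. All data here are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_modules = 150
dashboard_visits = rng.uniform(0, 50, n_modules)   # teacher visits per module
module_size = rng.uniform(50, 500, n_modules)      # a simple covariate
noise = rng.normal(0, 5, n_modules)
# Simulated ground truth: each extra visit adds 0.3 pct points to pass rate
pass_rate = 60 + 0.3 * dashboard_visits + 0.01 * module_size + noise

# Design matrix with intercept; solve the least-squares problem
X = np.column_stack([np.ones(n_modules), dashboard_visits, module_size])
coef, *_ = np.linalg.lstsq(X, pass_rate, rcond=None)
print(f"estimated effect per extra dashboard visit: {coef[1]:.2f} pct points")
```

Note that even a well-fitted regression of this shape only shows association; claims that dashboard use improves retention need a design that rules out, for example, more engaged teachers simply teaching easier modules.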
Tiffany Barnes "Making a meaningful difference: Leveraging data to improve le... – CITE
The document discusses the future of learning and how data can be leveraged to improve learning for most people. It outlines using data to recognize excellence in teaching and learning, provide real-time support, and identify effective collaborations. A case study is described that used an intelligent tutoring system to construct student models and provide feedback based on past student data. Guiding principles of respect, beneficence, and justice are presented for developing learning systems.
Inaugural lecture: The power of learning analytics to give students (and teac... – Bart Rienties
Join us at the Berrill Theatre and online on Tuesday 30 January 2018, 6-7pm for the Inaugural Lecture of Professor Bart Rienties, in which he will talk about the power of learning analytics in teaching and learning. Bart Rienties is Professor of Learning Analytics at the Institute of Educational Technology (IET) at The Open University. He is programme director Learning Analytics within IET and head of Data Wranglers, whereby he leads of group of learning analytics academics who conduct evidence-based research and sense making of Big Data at the OU.
As an educational psychologist, he conducts multi-disciplinary research on work-based and collaborative learning environments and focuses on the role of social interaction in learning, which is published in leading academic journals and books. His primary research interests are Learning Analytics, Computer-Supported Collaborative Learning, and the role of motivation in learning. Furthermore, Bart is interested in broader internationalisation aspects of higher education. He has successfully led a range of institutional, national, and European projects and received a range of awards for his educational innovation projects.
Bart is World Champion Transplant cycling Team Time Trial 2017, the first academic with a transplant to be promoted to full professor, and a keen explorer of life.
In The power of learning analytics to give students (and teachers) what they want!, Bart will describe how his research into learning analytics is enabling him to predict which learning strategy might work best for each student, and provide different, unique experiences for each depending on what they want. In particular, he will explore how student dispositions like motivation, emotion, or anxiety encourage or hinder effective online learning, and how we may need to adjust our approaches depending on individual differences.
Event programme:
18:00 - 18:45 – The power of learning analytics to give students (and teachers) what they want!
18:45 - 19:00 – Q&A
19:00 - 19:45 – Drinks Reception
There will be time for questions and comments. We very much hope you will be able to attend what promises to be an inspiring event and have your say.
The power of learning analytics to measure learning gains: an OU, Surrey and ... – Bart Rienties
Learning gains have become increasingly prominent within the HE literature, gained traction in government policies in the UK, and are at the heart of the Teaching Excellence Framework (TEF). As such, this raises the question of to what extent the teaching and learning environment can actually predict students’ learning gains using principles of learning analytics. In this presentation, which is joint work with the University of Surrey and Oxford Brookes, I will focus on some preliminary findings based upon developing and testing an Affective-Behaviour-Cognition learning gains model using a longitudinal approach. The main aim of the research is to examine whether learning gains occur on all three levels of the Affective-Behaviour-Cognition model, and whether any particular student or course characteristics can predict learning gains, or lack of learning and dropout. For more info, see https://abclearninggains.com/
This document provides an overview of learning analytics, including:
- A brief history of learning analytics and related fields from the 1920s to present day.
- Drivers for the increased focus on learning analytics like big data, personalized learning, and demands for educational institutions to demonstrate performance.
- A definition of learning analytics as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."
- Key dimensions that learning analytics research and practice focuses on, such as data sources and methods, connection to learning and teaching, purpose, stakeholder management, and ethics.
Investigating the relationship of Science and Mathematics Achievement with At... – Fatma Arkan
This study investigated the relationships between science and mathematics achievement, attitudes towards STEM, and internet addiction. The researchers administered surveys on attitudes towards STEM and internet addiction to 435 high school students. They found that positive attitudes towards STEM and more time spent studying on computers each week significantly predicted higher mathematics and science achievement. Additionally, school type significantly predicted science achievement. However, internet addiction was not a significant predictor of achievement. The study provides insights into factors influencing STEM learning and performance.
Gobert, Dede, Martin, Rose "Panel: Learning Analytics and Learning Sciences" – CITE
This panel discussed learning analytics and learning sciences. Janice Gobert discussed problems with standardized tests and how interactive labs have assessment potential but challenges. Chris Dede discussed his research on immersive learning using virtual reality and challenges assessing open-ended environments. Taylor Martin discussed how microgenetic research and learning analytics can improve data collection and analysis. Carolyn Rose discussed using conversational data and a new theoretical framework analyzing social processes and distances. The panel addressed if these methods lead to improved standardized test scores, with Janice and Chris noting validity issues with standardized tests and that these methods improve deeper learning over rote memorization.
The document discusses strategies for increasing faculty engagement with technology enhanced learning. It suggests getting buy-in from senior management and ensuring proper support for changes. Other tips include understanding the context, finding out what approaches faculty currently use, emphasizing two-way communication, celebrating innovation, and keeping the focus on enhancing student learning. The goal is to move beyond superficial engagement and foster deep, meaningful involvement with technology.
Designing and testing visual representations of draft essays for Higher Educa... – Denise Whitelock
This presentation reports the findings of an empirical investigation which set out to test a set of rainbow exercises. The rainbow diagrams are pictorial representations of formal graphs that are derived automatically from student essays. They were designed to allow students to discover how key concepts in a well written essay are connected together. The students would then be able to compare a rainbow diagram of their own essay with a good essay and make changes to it before submission to their tutor. A trial was undertaken with academics, teaching and learning staff, doctoral students at The Open University of Catalonia and the Open University UK, before implementation into the web application known as OpenEssayist.
Learning Dashboards for Feedback at ScaleTinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboard be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA ) that focuses on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards deployed at scale, create new learning traces? This talk shares experiences of a large scale deployment of learning dashboards with more than 12.000 students. Presented at laffas.eu.
Using learning analytics to improve student transition into and support throu...Tinne De Laet
This document provides an overview of a workshop on using learning analytics to improve student transition and support in the first year. The workshop was delivered by the ABLE and STELA projects in partnership.
It begins with introductions of the presenters and a discussion of the workshop structure. Next, the document explores definitions and concepts of learning analytics through short discussions and examples. It then highlights examples of learning analytics projects and implementations at partner institutions like Nottingham Trent University, Leiden University, and Delft University of Technology.
The workshop also included an exploration activity where participants discussed goals and interventions for a hypothetical learning analytics project. Finally, the document outlines three case studies that workshop groups worked on, with an emphasis on presenting results
AI in Education Amsterdam Data Science (ADS) What have we learned after a dec...Bart Rienties
The Open University UK (OU) has been implementing learning analytics and learning design on a large scale since 2012. With its 170+ students and 4000+ teaching staff, the OU has been at the forefront of testing, implementing, and evaluating the impact of learning analytics and learning design on students outcome and retention. A range of reviews and scholarly repositories (e.g., Web of Science) indicate that the OU is the largest contributor to academic output in learning analytics and learning design in the world. However, despite the large uptake of learning analytics at the OU there are a range of complex issues in terms of buy-in from staff, data infrastructures, ethics and privacy, student engagement, and perhaps most importantly how to make sense of big and small data in a complex organisation like the OU. During his talk Bart will be presenting on the implementation and learnings.
Xiao Hu "Overview of the Space of Learning Analytics and Educational Data Min...CITE
This document provides an overview of the fields of learning analytics and educational data mining. It discusses the types of methods used in each field, including prediction, relationship mining, and discovery with models. Recent trends are noted, such as increased emphasis on constructs like motivation and engagement, and broader data sources. Challenges and opportunities are also presented, such as improving connections between fields and addressing issues around research ethics, privacy, and data management. Upcoming conferences and a MOOC on the topic are also announced. Questions from attendees are answered, focusing on friction between fields, collaboration opportunities, and handling research ethics and privacy concerns.
The document discusses a study by Ajjan and Hartshorne from 2008 that investigated faculty decisions to adopt Web 2.0 technologies. The study examined whether faculty were aware of benefits of using these technologies and what factors predicted their adoption. It focused on perceived usefulness, ease of use, and compatibility with teaching style. The findings were that efforts should improve perceived usefulness, ease of use and compatibility of Web 2.0 tools, and best practices models are needed to facilitate adoption in higher education. The document then provides descriptions of various media tools that could be used in courses.
Learning design meets learning analytics: Dr Bart Rienties, Open UniversityBart Rienties
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time and how students engage?
Using Learning analytics to support learners and teachers at the Open UniversityBart Rienties
In this seminar Prof Bart Rienties will reflect on how the Open University UK has become a leading institution in implementing learning analytics at scale amongst its 170K students and 5K staff. Furthermore, he will discuss how learning analytics is being adopted at other UK institutions, and what the implications for higher education might be in these Covid19 times.
https://www.kent.ac.uk/cshe/news-events.html
Are Wikis and Weblogs an appropriate approach to foster collaboration, refle... – Christian Schmidt
The document discusses research into using wikis and weblogs to foster student collaboration, reflection, and motivation in mathematics education. A research study was conducted with 127 German students to examine the effects of using individual and class weblogs on students' self-determination and reflection. The results found no significant differences between the groups. Qualitative data from student interviews and blog posts will be further analyzed to understand how digital tools can support learning. Previous research on using wikis and weblogs in education is also summarized.
HESA JISC DATA: The Power of Learning Analytics with(out) learning design – Bart Rienties
1. Learning analytics provides insights into student engagement, satisfaction, and performance when combined with data on learning design and teacher interventions.
2. An analysis of over 150 modules found that the type of learning design impacted online behavior, end-of-module surveys, and exam results.
3. Providing teachers with predictive learning analytics and visualizations on at-risk students led to increased usage of the tools and had a positive impact on student performance and retention rates according to regression analysis.
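The kind of regression-based early-warning analysis summarised above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the OU's actual model, and the features and coefficients are invented): it fits a logistic regression on synthetic engagement data to flag at-risk students.

```python
# Illustrative sketch only: predicting "at-risk" students from simple
# engagement features. All data here is synthetic; the OU's real
# predictive models are far richer than this.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
# Hypothetical weekly features: VLE clicks, assignments submitted, forum posts.
clicks = rng.poisson(30, n)
submitted = rng.integers(0, 5, n)
posts = rng.poisson(2, n)
# Synthetic ground truth: lower engagement raises the probability of failing.
risk_score = 3.0 - 0.05 * clicks - 0.8 * submitted - 0.3 * posts
at_risk = (rng.random(n) < 1 / (1 + np.exp(-risk_score))).astype(int)

X = np.column_stack([clicks, submitted, posts])
X_tr, X_te, y_tr, y_te = train_test_split(X, at_risk, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```

In practice the interesting output is not the accuracy but the per-student risk probabilities (`model.predict_proba`), which a dashboard can surface to teachers so they can intervene early.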
Tiffany Barnes "Making a meaningful difference: Leveraging data to improve le..." – CITE
The document discusses the future of learning and how data can be leveraged to improve learning for most people. It outlines using data to recognize excellence in teaching and learning, provide real-time support, and identify effective collaborations. A case study is described that used an intelligent tutoring system to construct student models and provide feedback based on past student data. Guiding principles of respect, beneficence, and justice are presented for developing learning systems.
Inaugural lecture: The power of learning analytics to give students (and teac... – Bart Rienties
Join us at the Berrill Theatre and online on Tuesday 30 January 2018, 6-7pm for the Inaugural Lecture of Professor Bart Rienties, in which he will talk about the power of learning analytics in teaching and learning. Bart Rienties is Professor of Learning Analytics at the Institute of Educational Technology (IET) at The Open University. He is programme director of Learning Analytics within IET and head of the Data Wranglers, whereby he leads a group of learning analytics academics who conduct evidence-based research and sense-making of Big Data at the OU.
As an educational psychologist, he conducts multi-disciplinary research on work-based and collaborative learning environments and focuses on the role of social interaction in learning, which is published in leading academic journals and books. His primary research interests are focussed on Learning Analytics, Computer-Supported Collaborative Learning, and the role of motivation in learning. Furthermore, Bart is interested in broader internationalisation aspects of higher education. He has successfully led a range of institutional, national, and European projects and received a range of awards for his educational innovation projects.
Bart is World Champion Transplant cycling Team Time Trial 2017, the first academic with a transplant to be promoted to full professor, and a keen explorer of life.
In The power of learning analytics to give students (and teachers) what they want!, Bart will describe how his research into learning analytics is enabling him to predict which learning strategy might work best for each student, and to provide different, unique experiences for each depending on what they want. In particular, he will explore how student dispositions like motivation, emotion, or anxiety encourage or hinder effective online learning, and how we may need to adjust our approaches depending on individual differences.
Event programme:
18:00 - 18:45 – The power of learning analytics to give students (and teachers) what they want!
18:45 - 19:00 – Q&A
19:00 - 19:45 – Drinks Reception
There will be time for questions and comments. We very much hope you will be able to attend what promises to be an inspiring event and have your say.
The power of learning analytics to measure learning gains: an OU, Surrey and ... – Bart Rienties
Learning gains have become increasingly prominent within the HE literature, have gained traction in government policies in the UK, and are at the heart of the Teaching Excellence Framework (TEF). This raises the question of the extent to which the teaching and learning environment can actually predict students’ learning gains using principles of learning analytics. In this presentation, which is joint work with the University of Surrey and Oxford Brookes, I will focus on some preliminary findings from developing and testing an Affective-Behaviour-Cognition learning gains model using a longitudinal approach. The main aim of the research is to examine whether learning gains occur on all three levels of the Affective-Behaviour-Cognition model and whether any particular student or course characteristics can predict learning gains, or a lack of learning and dropout. For more info, see https://abclearninggains.com/
This document provides an overview of learning analytics, including:
- A brief history of learning analytics and related fields from the 1920s to present day.
- Drivers for the increased focus on learning analytics like big data, personalized learning, and demands for educational institutions to demonstrate performance.
- A definition of learning analytics as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs."
- Key dimensions that learning analytics research and practice focuses on, such as data sources and methods, connection to learning and teaching, purpose, stakeholder management, and ethics.
Investigating the relationship of Science and Mathematics Achievement with At... – FatmaArkan
This study investigated the relationships between science and mathematics achievement, attitudes towards STEM, and internet addiction. The researchers administered surveys on attitudes towards STEM and internet addiction to 435 high school students. They found that positive attitudes towards STEM and more time spent studying on computers each week significantly predicted higher mathematics and science achievement. Additionally, school type significantly predicted science achievement. However, internet addiction was not a significant predictor of achievement. The study provides insights into factors influencing STEM learning and performance.
Gobert, Dede, Martin, Rose "Panel: Learning Analytics and Learning Sciences" – CITE
This panel discussed learning analytics and learning sciences. Janice Gobert discussed problems with standardized tests and how interactive labs have assessment potential but challenges. Chris Dede discussed his research on immersive learning using virtual reality and challenges assessing open-ended environments. Taylor Martin discussed how microgenetic research and learning analytics can improve data collection and analysis. Carolyn Rose discussed using conversational data and a new theoretical framework analyzing social processes and distances. The panel addressed if these methods lead to improved standardized test scores, with Janice and Chris noting validity issues with standardized tests and that these methods improve deeper learning over rote memorization.
The document discusses strategies for increasing faculty engagement with technology enhanced learning. It suggests getting buy-in from senior management and ensuring proper support for changes. Other tips include understanding the context, finding out what approaches faculty currently use, emphasizing two-way communication, celebrating innovation, and keeping the focus on enhancing student learning. The goal is to move beyond superficial engagement and foster deep, meaningful involvement with technology.
Designing and testing visual representations of draft essays for Higher Educa... – Denise Whitelock
This presentation reports the findings of an empirical investigation which set out to test a set of rainbow exercises. The rainbow diagrams are pictorial representations of formal graphs that are derived automatically from student essays. They were designed to allow students to discover how key concepts in a well written essay are connected together. The students would then be able to compare a rainbow diagram of their own essay with a good essay and make changes to it before submission to their tutor. A trial was undertaken with academics, teaching and learning staff, doctoral students at The Open University of Catalonia and the Open University UK, before implementation into the web application known as OpenEssayist.
OpenEssayist: Feedback and moving forward with draft essays – Denise Whitelock
The presentation assists users when submitting draft essays for analysis by OpenEssayist to make full use of the feedback in order to draft another version of their essay.
Technology-Enhanced Assessment and Feedback: How is evidence-based literature... – Denise Whitelock
This desktop research commissioned by the Higher Education Academy set out to consult with the academic community about which references on assessment and feedback with technology enhancement were most useful to practitioners. While all the recommended publications may be characterised as reputable and the majority were peer-reviewed (67.7%), only a minority provided quantitative data (28.2%), of which relatively few provided appropriate experimental designs or statistical analysis (18.5%). The majority of publications were practitioner-led case studies. The references that were recommended to us are clearly having an impact on current practice and are found valuable by practitioners. The key messages from these sources are consistent and often give detailed and practical guidance for other academics. We found that most of the recommended literature focused on the goals that technology enhancement can enable assessment and feedback to meet and how assessment and feedback can be designed to make best use of the technology.
Online Assessment through Moodle Platform in Higher Education – Niroj Dahal
This presentation was done at ICT in Education Conference organized by TU, KUSOED and OSLOMET as a part of NORHED project on 19-21 September 2019 at Hotel Yellow Pagoda, Kathmandu.
This document summarizes a workshop on linking learning analytics, learning design, and MOOCs. It discusses how learning analytics can provide actionable intelligence for learners and educators. Group activities involved analyzing MOOCs to identify learning outcomes, assessments, and how analytics could support learning. The document suggests learning design tools like templates, planners, and maps can help identify useful analytics and frame analytics questions. The goal is to use analytics to facilitate learning, identify struggles, engagement, and address problems by starting with pedagogy.
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytic... – EUmoocs
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, that took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
This document provides an introduction and overview of the 2013 CHECET course on Emerging Technologies to improve Teaching and Learning in Higher Education. It defines emerging technologies as those that are evolving, not fully understood, and potentially disruptive. The course will involve both online and face-to-face sessions over 6 weeks, exploring educational challenges and how emerging technologies can address them. Participants will design a case study and assessment will include reflections, participation, and a final case study presentation.
This document discusses research approaches for studying emerging e-learning practices and technologies. It focuses on learning analytics and social network analysis. Learning analytics can help understand learning behavior, provide evidence for improving learning environments, and support assessment/feedback, enquiry/sensemaking, and discourse. Examples discussed include open feedback tools, social networks to support knowledge construction, and discourse analysis. Combining different data sources through powerful analytics tools can provide insights. Resources on learning analytics and social network analysis of OER communities are also listed.
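As a minimal illustration of the social network analysis idea summarised above (the data is invented, not drawn from any of the studies described), one can count who sends and receives replies in a discussion forum to spot central participants:

```python
# Toy sketch of forum social network analysis: in-degree (replies received)
# as a simple proxy for a participant's centrality in the discussion.
from collections import Counter

# Hypothetical reply pairs: (replier, original poster).
replies = [
    ("amy", "ben"), ("cid", "ben"), ("dee", "ben"),
    ("ben", "amy"), ("amy", "cid"), ("eve", "amy"),
]
in_degree = Counter(target for _, target in replies)   # replies received
out_degree = Counter(source for source, _ in replies)  # replies sent
print(in_degree.most_common(1))  # [('ben', 3)]
```

Real analyses typically go further (betweenness, clustering, visualisation with tools such as SNAPP), but the underlying data structure, a directed graph of who replied to whom, is exactly this.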
Openness Initiatives in Distance Education – Gülay Ekren
The document discusses openness initiatives in distance education. It provides an introduction to key concepts of openness like open educational resources (OERs), MOOCs, and open source software. It then outlines the aims and methods of the study, which involved a content analysis of 46 articles from the International Review of Research in Open and Distance Learning (IRRODL) journal. The results of the study found that research areas focused on issues like instructional design, management and organization, and educational technology. Studies also centered on themes such as OERs, MOOCs, connectivism, and open education. Most studies used qualitative or mixed methods approaches.
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
This document summarizes a presentation on learning analytics given at the ALT-C 2014 conference in Warwick, UK. It discusses the Learning Analytics Community Exchange (LACE) project, which is a 24 month EU support project with 9 partners focusing on implementing learning analytics in schools, higher education, and industry. The presentation defines learning analytics as the measurement, collection, analysis and reporting of learner data to understand and optimize learning. It also discusses cultural and technical challenges around topics such as data privacy, change management, and making insights actionable. Examples of learning analytics tools described include SNAPP for social networks and LOCO-Analyst.
Open instructional design and assessments - OE4BW 2020 – nwahls
Organization of a learning experience
Learning outcomes, learner levels, and accessibility
Active learning in self-directed and distance contexts
Turning resources and assessments into a course or textbook
Learning Analytics for online and on-campus education: experience and research – Tinne De Laet
This presentation was used by Tinne De Laet, KU Leuven, for a keynote presentation during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides 9 recommendations regarding learning analytics.
The document discusses the SpeakApps project which aims to develop tools and tasks for oral production and interaction using a learning analytics approach. It provides an overview of learning analytics and references a learning analytics reference model. The model describes analyzing data from the SpeakApps platform to evaluate claims about task design, specifically regarding time limitations for recordings. Data sources would include behavioral logs from the platform and user generated content to assess the engagement and experiences of students, teachers, and instructional designers.
The Model of Analytical Geometry Interactive Module using Systematic, Active,... – AJHSSR Journal
ABSTRACT: This study aimed at producing an interactive Analytical Geometry Module based on the Systematic, Active, Effective (SAE) e-module model that can build learning independence and competency of students in Mathematics Education study programs. This research employed the developmental research method with the stages of development: preliminary studies, design, development, and testing. The study documented that most of the students were interested in lectures on analytical geometry utilizing module teaching materials in the form of interactive electronics (interactive e-modules) that could build student learning independence and be easily understood. For this reason, it is necessary to develop teaching materials that are easy to understand, namely interactive e-modules in the field of analytical geometry. At the design stage, this study captured the results of the initial draft of the interactive analytical geometry e-module, organised systematically into: an introduction (preface, instructions for using the e-module, material description, prerequisites, learning objectives) and learning activities (a description of the material and worked examples, practice exercises, summaries, competency tests, answer instructions for the exercises, feedback, and reference lists). At the development stage, a draft of the interactive analytical geometry e-module product was produced. The draft is designed and developed systematically and effectively, and enables students to build learning independence by adhering to the principles of developing teaching materials. In the testing phase, the trial of the draft e-module was limited to students, and the overall average score was 3.27, which means that the students considered the analytical geometry interactive e-module product to be good.
KEYWORDS: development, interactive e-module, SAE (Systematic, Active, Effective), analytical geometry
Keynote address Analytics4Action Evaluation Framework: a review of evidence-... – Bart Rienties
Bart Rienties is a Reader in Learning Analytics at the Institute of Educational Technology at the Open University UK. He is programme director of Learning Analytics within IET and Chair of the Analytics4Action project, which focuses on evidence-based research on interventions on OU modules to enhance the student experience. As an educational psychologist, he conducts multi-disciplinary research on work-based and collaborative learning environments and focuses on the role of social interaction in learning, which is published in leading academic journals and books. His primary research interests are focussed on Learning Analytics, Computer-Supported Collaborative Learning, and the role of motivation in learning. Furthermore, Bart is interested in broader internationalisation aspects of higher education. He has successfully led a range of institutional, national, and European projects and received several awards for his educational innovation projects.
This document discusses multimodal learning analytics and the challenges associated with it. It presents a case study called the PELARS project, which collects and analyzes data from various modalities like computer vision, mobile devices, and workstations to understand hands-on STEM learning. The challenges discussed include getting meaningful data from real classrooms, analyzing messy and incomplete multimodal data, and developing visualizations that provide useful insights. Real-world applicability and scalability of multimodal learning analytics approaches are difficult but important open questions.
This document discusses using analytics in virtual worlds to measure learner engagement and predict behavior changes. It proposes immersing pre-service teachers in a 3D virtual environment to engage them in learning. Analytics would trace learner interactions and connections, while surveys and assessments evaluate changes in attitudes, reflections, and theoretical understanding. The goal is to build a predictive model to determine how virtual world interactions influence real-world behavior and how the 3D space could adapt based on learner-generated data.
Just a buzz: Exploring collaborative learning in an open course for professio... – Chrissi Nerantzi
This document summarizes research on an open online course called FDOL132 for the professional development of teachers in higher education. A survey of participants found that the majority valued group work, feedback, and recognition for their studies. Interviews revealed that participants found groups challenging due to language barriers and commitment levels but appreciated learning from colleagues internationally. Time constraints were a significant challenge. Overall, participants reported a valuable learning experience from the course and examples of applying what they learned in practice, though facilitators' active engagement and support was important for participation.
This document discusses MOOCs and learning analytics. It provides an overview of MOOCs, including their characteristics and growth. It also discusses learning analytics, defining it as measuring data about learners and contexts to optimize learning. Various learning analytics methods and applications are outlined, including at the EDSA and OU. The OU's Analyse tool for early identification of at-risk students using machine learning is also summarized.
Using Analytics to Transform the Library Agenda - Linda Corrin | Talis Insigh... – Talis
1. The document discusses the use of learning analytics to understand student learning and optimize teaching practices. It describes how analytics can provide insights into student performance, engagement, and retention at various levels from the individual to the institution.
2. Interviews with teachers found they are interested in analytics about student engagement and performance but have concerns about interpreting data. Teachers want analytics to help understand ideal students and provide feedback to improve teaching.
3. A conceptual framework is presented that links learning analytics to learning design to provide context for analyzing educational activities and interactions with resources. Planning questions are also outlined to help educators implement learning analytics.
Similar to Learning Analytics and student feedback (20)
Should feedback be at the centre of Personalised Learning? – Denise Whitelock
Should feedback be at the centre of Personalised Learning?
The advent of e-Learning has prompted the development of web-based learning systems, recognising there is no fixed learning pathway that will be appropriate for all learners. However, most learning platforms with personalised learning sequencing rely on a learner’s preferences.
If, however, we want students to learn to make reliable judgements about their learning and to identify any further support they require to meet their learning goals, then personalised automatic feedback should play an important role. This presentation explores the role that technology-enhanced feedback can play in the pursuit of a personalised learning agenda.
References
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015). Feedback on academic essay writing through pre-emptive hints: Moving towards ‘advice for action’. Winner of Best Research Paper Award. Special Issue of European Journal of Open, Distance and E-Learning, Best of EDEN RW8, 8th EDEN Research Workshop (eds. U. Bernath and A. Szucs). Published by European Distance and E-Learning Network, 1-15. ISSN 1027 5207
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015). OpenEssayist: A supply and demand learning analytics tool for drafting academic essays. The 5th International Learning Analytics and Knowledge (LAK) Conference, Poughkeepsie, New York, USA. 16-20 March 2015. ISBN 978-1-4503-3417-4
Technology Enhanced Assessment: Do we have a wolf in sheep's clothing? – Denise Whitelock
Technology Enhanced Assessment: Do we have a wolf in sheep’s clothing?
A sea change in assessment, precipitated by researchers and practitioners alike, was crystallised by a statement issued by the Assessment Reform Group, which rejected the notion of assessment that foregrounds cognitive ability tests valued for their predictive validity (Broadfoot, Daugherty, Gardner, Harlen, James & Stobart, 2002). The ARG set out to promote better alignment between teaching, learning and assessment and endorsed the term ‘Assessment for Learning’. This presentation explores the role that technology-enhanced assessment can play in encouraging the assessment for learning agenda. It presents a number of cases of peer, self and computer assessments that display a range of characteristics for the next generation of assessment tasks.
The discussion of the cases reveals a missing characteristic, which is a form of feedback to the students that will take their learning forward which I refer to as “Advice for Action” (Whitelock, 2011). Recent developments in automatic feedback systems for essay writing (Whitelock, Twiner, Richardson, Field & Pulman, 2015a and 2015b) will be presented and the role of visualisations and socio-emotive feedback in conveying meaningful feedback will also be discussed.
Since any feedback that is not understood or cannot be acted upon is likely to be ignored it will not facilitate learner improvement or confidence. This will always be a challenge – but how can technology enhanced assessment pursue this agenda?
References
Broadfoot, P., Daugherty, R., Gardner, J., Harlen, W., James, M. & Stobart, G. (2002). Assessment for learning: 10 principles, Research-based principles to guide classroom practice. London: Assessment Reform Group. Retrieved 4 April 2017, from http://sunnyspelles.co.uk/Pedagogy%20Resources/A4L/10principles.pdf
Whitelock, D. (2011). Activating Assessment for Learning: are we on the way with Web 2.0? In M.J.W. Lee & C. McLoughlin (Eds.) Web 2.0-Based-E-Learning: Applying Social Informatics for Tertiary Teaching. IGI Global. 319-342.
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015a). Feedback on academic essay writing through pre-emptive hints: Moving towards ‘advice for action’. Winner of Best Research Paper Award. Special Issue of European Journal of Open, Distance and E-Learning, Best of EDEN RW8, 8th EDEN Research Workshop (eds. U. Bernath and A. Szucs). Published by European Distance and E-Learning Network, 1-15. ISSN 1027 5207
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015b). OpenEssayist: A supply and demand learning analytics tool for drafting academic essays. The 5th International Learning Analytics and Knowledge (LAK) Conference, Poughkeepsie, New York, USA. 16-20 March 2015. ISBN 978-1-4503-3417-4
Who has the crystal ball for moving forward with Digital Assessment? – Denise Whitelock
The document discusses various perspectives on digital assessment including students, awarding bodies, teachers, researchers, software developers, and disrupters. It addresses key issues for each group such as students' feelings about assessment, the challenge of e-assessment for awarding bodies, feedback systems designed by teachers, research on praise and mindsets, the SAFeSEA automated feedback tool, and drivers for disruption from commercial companies.
Good pedagogical practice driving learning analytics: OpenMentor, Open Commen... – Denise Whitelock
OpenMentor and Open Comment are two systems that provide immediate feedback to tutors and students about their learning and the support of learning. Both systems were built on the formalisation of sound pedagogical practice. SAFeSEA is a new EPSRC-funded project with Oxford University investigating summarisation techniques to give feedback on draft essays.
Understanding current practice around the Assessment of Multimedia Artefacts – Denise Whitelock
This document discusses current practices around assessing multimedia artifacts. It provides perspectives from both tutors and students. Tutors find marking criteria open to interpretation and ranking artifacts easier than marking. Students have trouble understanding criteria, even with face-to-face explanation. Peer assessment has shown improvements in student performance and understanding of what constitutes good work. The document advocates for students understanding assessment standards and comparing their own work to develop self-monitoring skills.
Academics' Understanding of Authentic Assessment – Denise Whitelock
This paper reports on a project undertaken at The Open University which set out to explore academics’ notion and practice of authentic assessment through the exploration of the following research objectives:
1. To understand what is meant by authentic assessment in the literature by examining a set of examples of authentic assessments.
2. To construct a questionnaire which could be used by Open University academics to explore their understanding of authentic assessment.
3. To investigate through means of a questionnaire the types of assessment academics were currently undertaking and whether they fitted into a broad definition of authentic assessment.
The findings from the electronic survey suggest that Open University academics are on the way to designing meaningful assessments for their students. Although many of the courses were employing assessment tasks that could be considered ‘authentic’, only 25% of the academics had heard of the terms ‘authentic learning’ and ‘authentic assessment’, a low level of awareness compared with ‘learning design’. However, there has been a well-publicised Learning Design initiative taking place across the University.
Technology Enhanced Activities for Learning Science for Children in Hospital ... – Denise Whitelock
This research set out to construct a Roadmap for the role that technology can play in the teaching of science to chronically ill children. The modus operandi employed for this study was adapted from Gordon & Glenn's (2003) range of roadmap methodologies.
The challenges of Assessment and Feedback: findings from an HEA project, Denise Whitelock
The document summarizes the findings of an HEA project on the challenges of assessment and feedback. It discusses various methods of technology-enhanced assessment including e-portfolios, peer assessment, MCQs, and self-assessment. It provides advice on how to design effective feedback and the importance of supporting students to act on feedback. Key messages emphasize that pedagogy is more important than technology, automated marking can be reliable, and staff development is essential.
Supporting Science Studies for children with long term health problems using ..., Denise Whitelock
Children with long-term health problems cannot always maintain their schooling and keep up with the curriculum (Prevatt et al., 2000; McDougall et al., 2004). They also spend prolonged periods in hospital, where access to science teaching is very limited (Sobrino, Lizasoain & Ochoa, 2001). To address this problem, the Nefreduca project was designed to develop a short science curriculum, delivered as a series of open-source, inquiry-based web learning materials, for children in Spain with chronic kidney disease. In this paper the learning design strategies employed to build the Nefreduca platform are described, together with how the students' conceptions of the kidney's role in the nutrition process were extended whilst trialling the Nefreduca materials. The students' notions of the kidney changed after using the programme: their answers illustrated a deeper understanding of the urine production process, its constituents, and its connection to the blood filtration that occurs in the kidney. These findings suggest that the Nefreduca activities could serve as appropriate teaching material for scaffolding the students' mental models of the kidney.
Synthesis Report on Assessment and Feedback with Technology Enhancement (SRAFTE), Denise Whitelock
Presentation of key findings from Synthesis Report on Assessment and Feedback with Technology Enhancement (SRAFTE) project conducted for the HEA by University of Southampton and The Open University.
1. The document describes a study examining how teachers provide feedback to children with chronic illnesses who are learning science remotely using an online platform called Nefreduca.
2. The researchers analyzed recordings of teacher-student interactions using Bales' feedback categories and identified four common types of feedback incidents.
3. The researchers then proposed an operational model for how an automated feedback system in Nefreduca could analyze students' responses and provide tailored content and socio-emotional support.
[Keyword cloud: Feedback, Marking, Standards, Pedagogy, Assessment literacy, Change management, Technical skills, Evaluation, Dissemination, Sustainability, Community of practice, Train the trainer, Cascading training, Continuous professional development, Peer support, Resources and materials, Blended approaches, Just in time training, Recognition]
Investigating the Pedagogical Push and Technological Pull of Computer Assiste..., Denise Whitelock
The document discusses computer-assisted formative assessment and the development of automated feedback tools. It describes stages of analysis a computer could use to analyze students' free-text responses and provide feedback, including detecting errors, revealing omissions, requesting clarification or further analysis, and checking causality. The feedback is aimed at helping students develop their understanding rather than just finding right or wrong answers. The tools are meant to support open-ended, divergent assessment in arts subjects.
Framing Feedback for Formative Assessment, Denise Whitelock
Presentation given to formative e-assessment project group at the Institute of Education, 3rd July 2008. Synopsis of projects at The Open University, UK.
1. Learning Analytics and Student Feedback
Professor Denise Whitelock
The Open University, Walton Hall,
Milton Keynes MK7 6AA, UK
denise.whitelock@open.ac.uk
2. Learning Analytics and Student Feedback
• What is Learning
Analytics?
• Origins
• Early work
• Learning Analytics
and Assessment
DMW UOC May 2013
3. Definition
“Learning Analytics are concerned with the measurement,
collection, analysis and reporting of data about learners
and their contexts, for purposes of understanding and
optimising learning and the environments in which it
occurs”.
Reference
SoLAR, Open Learning Analytics: An Integrated &
Modularised Platform, White Paper, Society for Learning
Analytics Research, 2011.
5. Middle Space
• 3rd LAK Conference
• Learning explicit
• New analytic
methods
• Computational
• Representational
• Statistical
• Visualisation
6. Can LAK hold together for long?
• Challenges
• Different
methodologies
• Different theories
• Different prejudices
• Agreement on topics at 3rd LAK 2013
• Visualisation, social network
analysis, communication
and collaboration, discourse
analytics, predictive
analytics, sequence
analytics, assessment
After Suthers & Verbert (2013)
7. Political and economic drivers
• Educause Review (2007)
• Academic Analytics,
Campbell & Oblinger (2007)
• Large data and stats =
predictive modelling
• Improve number of
graduates in US
• US finding now with school
exam data
• Hand-code a sample
• Machine learning
• Apply to whole set
• Make predictions
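The hand-code / machine-learn / apply-to-whole-set pipeline on this slide can be sketched with a deliberately simple stand-in model. This is an illustrative assumption, not the actual US system: the feature names, numbers, and the nearest-centroid classifier are all invented for the sketch.

```python
# Illustrative sketch of the pipeline above: hand-code a small sample,
# fit a simple model, then apply it to the whole set.
# Features and figures are invented; a real system would use many signals.

def fit_centroids(X, y):
    """Average the hand-coded examples per class (nearest-centroid model)."""
    centroids = {}
    for label in set(y):
        rows = [x for x, l in zip(X, y) if l == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(centroids, x):
    """Assign a student to the closest class centroid."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], x))

# Hand-coded sample: [logins_per_week, assignments_submitted]; 1 = at risk
X_sample = [[1, 0], [2, 1], [0, 0], [8, 4], [10, 5], [7, 3]]
y_sample = [1, 1, 1, 0, 0, 0]

model = fit_centroids(X_sample, y_sample)

# Apply the model to the whole (unlabelled) cohort and make predictions.
cohort = [[9, 5], [1, 1], [6, 2]]
print([predict(model, student) for student in cohort])  # → [0, 1, 0]
```

In practice the "M.L." step would be a proper statistical model trained on historical exam data; the point of the sketch is only the shape of the workflow.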
8. Formative Research towards separate field
• Open Learner models (Bull & Kay, 2007)
• Social Network analysis (De Laat et al, 2007)
• Networks Adapting Pedagogical Practice (SNAPP),
(Dawson et al, 2010)
• Visualisation of large data sets, Honeycomb (van Ham
et al, 2009)
• Gephi: open source tool (Bastian et al, 2009)
• Signals (Arnold, 2010)
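At their core, tools such as SNAPP and Gephi build a network from who-replies-to-whom data; even a plain degree count over reply pairs surfaces isolated students. A minimal sketch of that step (with invented forum data, not the SNAPP code itself):

```python
# Illustrative sketch: build a reply network from forum (poster, replier)
# pairs and count each student's connections (degree).
from collections import Counter

replies = [("ana", "ben"), ("ben", "ana"), ("ana", "carol"),
           ("carol", "ana"), ("ben", "carol")]

degree = Counter()
for poster, replier in replies:
    degree[poster] += 1   # received a reply
    degree[replier] += 1  # gave a reply

# Students with a degree of zero would be flagged for tutor attention.
print(degree.most_common())  # → [('ana', 4), ('ben', 3), ('carol', 3)]
```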
9. Signals: Flagship Project
• Moves data from VLE
• Combines with prediction
models
• Real time red/amber/green
traffic lights
• Pilot study (Arnold, 2010)
showed
• Students sought help
earlier
• 12% more B/C grades
• 14% less D/F grades
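The traffic-light step of a Signals-style system can be sketched as a simple mapping from a model's predicted risk score to a flag. The 0.3/0.6 cut-offs below are assumptions for illustration, not Purdue's actual thresholds:

```python
# Illustrative sketch: map predicted risk (0..1) onto red/amber/green flags.
# Cut-offs are invented for the example.
def signal(risk: float) -> str:
    if risk >= 0.6:
        return "red"
    if risk >= 0.3:
        return "amber"
    return "green"

cohort = {"student_a": 0.82, "student_b": 0.41, "student_c": 0.10}
flags = {sid: signal(r) for sid, r in cohort.items()}
print(flags)  # → {'student_a': 'red', 'student_b': 'amber', 'student_c': 'green'}
```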
10. Formative Assessment and Learning
Analytics (1)
Tempelaar et al, 2013
• 1st-year Math & Stats
undergraduates,
Maastricht
• Reason text book
• Online questions
• Practice and
performance tests
• 92% pass rate with higher practice
• 51% pass rate with lower practice
11. Formative Assessment and Learning
Analytics (2)
Important for SAFeSEA?
• Learning styles (Vermunt, 1996)
• Self regulation for deep learning
• Practice for stepwise learning
• Motivation and engagement wheel (Martin,
2007)
• Learning emotions
• Pekrun’s control-value theory of learning
emotions
12. Performance of video lectures
• Findings from Mirriahi &
Dawson (2013)
• Correlations between
quizzes and lectures
• Shows misalignment
between Assessment
and Teaching
materials?
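The kind of correlation analysis Mirriahi & Dawson describe can be illustrated with a minimal Pearson sketch; the view counts and scores below are invented for the example, and a weak correlation in real data would hint at the misalignment the slide asks about:

```python
# Illustrative sketch: correlate lecture-video views with quiz scores
# (made-up numbers) using Pearson's r.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

views = [12, 3, 8, 15, 5]
scores = [78, 40, 65, 90, 52]
r = pearson(views, scores)
print(round(r, 2))
```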
13. HOU2LEARN
• PLE from Hellenic Open
University
• Social network analysis
and final grades
• Online collaboration is
not a predictor for final
grade
• Koulocheri & Xenos,
2013
15. Key words and phrases visualized in the essay context. Sentences in
light-grey (green) background are key sentences as extracted by the
EssayAnalyser (the number at the start of the sentence indicates its
key-ness ranking); bigrams are indicated in bold (red) and boxed.
16. The structural elements of the essay can be used jointly with
the key word extraction to highlight relevant information within
specific parts of the essay, here the introduction (and the
assignment question)
17. Key words and phrases as separate lists
18. Dispersion of key words across the essay
http://www.open.ac.uk/iet/main/research-scholarship/research-projec
19. Can we find ways of using graph visualization
techniques on the key words and key sentences, to
make them helpful and meaningful to students?
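As a toy illustration of the key-sentence idea on slides 15-18, a frequency-based "key-ness" score can be sketched in a few lines. EssayAnalyser itself uses more sophisticated keyword and graph-based extraction, so this is an assumption-laden simplification:

```python
# Toy sketch: rank sentences by how often their words recur in the essay.
import re
from collections import Counter

essay = ("Feedback helps students improve. Good feedback is timely. "
         "Students act on timely feedback. The weather was pleasant.")

sentences = [s.strip() for s in essay.split(".") if s.strip()]
freq = Counter(re.findall(r"[a-z]+", essay.lower()))

def keyness(sentence):
    tokens = re.findall(r"[a-z]+", sentence.lower())
    return sum(freq[t] for t in tokens) / len(tokens)

ranked = sorted(sentences, key=keyness, reverse=True)
print(ranked[0])  # → Students act on timely feedback
```

A graph visualisation could then weight each sentence node by this score, which is one way to approach the question above.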
20. Final Thoughts
• Instant machine
feedback not prevalent
• Artificial Intelligence
analysis given to tutors,
leading to 'Wizard of Oz'
responses (Shaffer, 2013)
• Just in time feedback is
the ultimate goal
21. References (1)
Arnold, K.E. (2010). Signals: applying academic analytics, Educause
Quarterly, 33(1), p10.
http://www.educause.edu/ero/article/signals-applying-academic-analytics
(Accessed 30 April 2013)
Bastian, M., Heymann, S. & Jacomy, M. (2009). Gephi: an open source
software for exploring and manipulating networks. Paper presented at
the International AAAI Conference on Weblogs and Social Media.
Bull, S & Kay, J. (2007). Student models that invite the learner in: the
SMILI:-) open learner modelling framework, International Journal of
Artificial Intelligence in Education, 17(2).
Campbell, J.P. & Oblinger, D.G. (2007). Academic Analytics,
Educause.
Dawson, S., Bakharia, A. & Heathcote, E. (2010). Snapp: Realising the
affordances of real-time SNA within networked learning environments.
Paper presented at the 7th International Conference on Networked
Learning, Aalborg, Denmark (3-4 May).