A copy of the slides produced to highlight the PredictED project, which mines data from our VLE and uses it to predict academic success for students.
Data and assessment PowerPoint presentation 2015 – Erica Zigelman
Presented for Datag in Albany, NY. This presentation is all about multiple types of data you may obtain within your classroom and how to assess your students.
E-assessment conference Scotland 2014 presentation
As technology evolves and becomes more integrated into education, the data trail created by learners is enormous. The analysis of this data, referred to as “learning analytics”, drives learning in a cyclical pattern: data is collected and analysed, and interventions are made based on the data. After these interventions, more data is collected and analysed, and further (perhaps different) interventions are made.
This presentation outlines how assessment-related data is collected from three different projects within DCU and then analysed with the aim of improving the student learning experience. Each project has two common threads: making life easier for the lecturer and improving the experience of the student.
Presentations morning session 22 January 2018 HEFCE open event “Using data to... – Bart Rienties
With the Teaching Excellence Framework being implemented across England, many higher education institutions have started to ask what it means to be “excellent” in teaching. In particular, given the rich and complex data that all educational institutions gather, data that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education across 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies for estimating learning gains? A large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
Realizeit is an adaptive learning platform that aims to provide individualized learning experiences for students. It uses analytics to guide students and instructors. Analytics help improve courses over time, provide insights into student and faculty engagement, and measure the effectiveness of different instructional approaches. Case studies at Colorado Technical University and University of Central Florida found increased pass rates, retention, and student satisfaction when using Realizeit. Institutions can also collaborate with Realizeit on research to continuously improve courses.
This presentation to the MoodleMoot UK/I 2017 provides an overview of Learning Analytics for VLE/LMS data and lessons learned in practice from using this data to model student risk and other characteristics. The findings come from fundamental research and application of Blackboard's X-Ray Learning Analytics application.
Phil Winne, "Learning Analytics for Learning Science When N = me" – CITE
Phil Winne argues that traditional learning science offers limited support for individual learners due to its reliance on randomized controlled trials. However, learning analytics that leverage large datasets can better support learners by clustering data about similar individuals and providing personalized feedback and recommendations. Winne presents nStudy, an online tool that traces self-regulated learning behaviors to gather data and provide analytics to guide learners' monitoring, assembling, rehearsing, and generating of information.
NYSCOSS Conference Superintendents Training on Assessment 9 14 – NWEA
This document discusses using data wisely from a superintendent's perspective. It covers three main topics: assessment basics, improving assessment programs, and developing a data culture. The document emphasizes that what is measured gets attended to, so assessments must be properly aligned and designed. It also stresses using multiple years of data to provide context and control for outside factors to fairly evaluate teachers. Developing the right assessment systems and using data thoughtfully can significantly improve student achievement.
8 steps of action research of team teaching – calidiane1
1) The document outlines the 8 steps of an action research project to evaluate the effectiveness of team teaching on student achievement.
2) Data such as TAKS scores and district assessments will be analyzed to compare student performance in team-taught classes versus self-contained classes.
3) Interviews with teachers and administrators will provide perspectives on the advantages and disadvantages of team teaching currently in place.
Using Assessment Data for Educator and Student Growth – NWEA
This presentation reviews major topics to be considered when using assessment data in implementing a school's program of educator and student growth and evaluation. By attending this workshop, participants will improve their assessment literacy, learn how to improve student achievement and instructional effectiveness through thoughtful data use, and discuss common issues shared by educators when using data for evaluative purposes.
Taking control of the South Carolina Teacher Evaluation framework – NWEA
This document discusses recommendations for improving teacher evaluation frameworks. It advocates that evaluations should focus on helping teachers improve, be controlled by principals, and use multiple measures rather than solely relying on test scores. An effective framework uses evidence of teaching practices, student learning, and professional responsibilities. While testing and observations are part of evaluations, their results must be interpreted carefully. Overall evaluations should provide meaningful performance differentiations to help retain top educators and dismiss ineffective ones.
John Cronin presented on issues administrators need to know about using tests for high-stakes teacher evaluation. He discussed that tests should be one part of a comprehensive evaluation using multiple data sources like observations and participation. He outlined issues like not all subjects have appropriate assessments and tests may not accurately measure all students. Cronin recommended embracing growth measurement formatively in addition to outcomes and using multiple years of student achievement data in evaluation.
The document discusses various issues around student assessment and accountability. It provides data on teacher and administrator perspectives on standardized testing and uses of assessment data. A majority of teachers believe students are over-tested and too much time is spent on test preparation. The document also examines different approaches to teacher evaluation, including value-added models and student growth percentiles, noting issues with reliability and fairness. It emphasizes the importance of principals in evaluation and using multiple measures, not just test scores, to differentiate teacher performance.
Presentation delivered at the UCISA event A-Z of learning analytics 28/06/2017. Ed Foster & Jane McNeil. A longer case study can be found at https://www.google.com/url?q=https://www.ucisa.ac.uk/-/media/Files/publications/truthaboutda/TheTruthAboutDA&sa=U&ved=0ahUKEwi8r-7W5_7eAhVKRBUIHf66CGEQFggMMAM&client=internal-uds-cse&cx=008281077274678676179:yulrfklwima&usg=AOvVaw17iuGZYPJPqFRCMGyBKLd0
This document discusses strategies for maximizing student assessment systems. It advocates defining your own assessment goals rather than focusing solely on compliance. It provides seven principles for effective assessment programs: 1) Define assessment purposes and ensure validity, 2) Educate teachers on assessments, 3) Align results to audience needs, 4) Eliminate redundant assessments, 5) Deliver timely results, 6) Use metrics that focus on all students, and 7) Contribute to transparency and long-term focus. The document argues that assessment goals, metrics, and incentives should support all students rather than just those near performance cutoffs.
The document provides information on different methods for evaluating academic programs, including audits, assessment experience questionnaires (AEQ), and focus groups. It discusses analyzing data from these various sources, including looking for consistencies and gaps. Guidelines are offered for conducting mock audits, administering the AEQ survey, facilitating focus groups, and coding and analyzing qualitative data. The goal is to triangulate information from multiple assessment methods to develop a comprehensive understanding of student learning experiences.
The document describes the implementation of an institution-wide learning analytics dashboard at Nottingham Trent University (NTU) from 2013 to the present. The dashboard aims to promote student success by providing insights into progression, grades, and degree attainment. It also aims to improve staff-student relationships by surfacing engagement information to inform personalized tutorial discussions. A multi-year, multi-phase implementation process included stakeholder consultation and working groups. Student and staff feedback was positive, finding the dashboard useful for monitoring academic progress, preparing for tutorials, and developing action plans.
This document discusses research on effective teacher goal setting and its impact on student achievement. It finds that setting specific, moderately challenging goals can improve student performance by 8-16% depending on task complexity. Goals work best when they direct effort, build persistence and strategies, and foster commitment. Proper goal setting considers context, sets interim benchmarks, and provides leadership support through communication, modeling, and praise. The research suggests that teacher goal setting, if implemented well, can powerfully increase student learning.
Overview of assessments, growth, and value added in a teacher evaluation context
Through analyzing achievement data, providing targeted instruction and intervention, and offering professional development, Wildwood Elementary worked to increase the academic performance of its low-income students in mathematics. Key aspects of the intervention included coordinating support services, adhering to mathematical targets, and examining data frequently. As a result, Wildwood closed the achievement gap between low-income and non-low-income students in both reading and mathematics, earned recognition as a School of Distinction, and was removed from the state's list of failing schools.
This document discusses conducting action research in school settings. It provides an example of an action research study conducted in a school district to address increasing student engagement and achievement. The study examined incorporating student voice into a process called Coaching for Design that involved teachers designing lessons. Data collection involved interviews and focus groups. Results showed student voice activities increased participant engagement and perspectives. While longer-term impacts on achievement were unclear, the district has continued building on the work. The document prompts discussion of developing one's own action research plan.
This document defines and explains the process of action research. It states that action research is a process where teachers study their own instructional practices and student learning in order to improve. The process involves identifying a classroom problem, developing a plan to address it, collecting and analyzing data, and making instructional decisions and sharing results. It then outlines the phases of action research in more detail, including identifying problems, developing a research plan, collecting and analyzing data from multiple sources, and drawing conclusions to continue improving teaching practices.
This document discusses action research and its various forms in education. Action research involves teachers identifying issues in their practice, gathering and analyzing data, and making changes to improve outcomes. It can be conducted individually or collaboratively at different levels from a single classroom to district-wide. Benefits include improved instruction, assessment, and policies informed by evidence. Support may be needed for coaching, technology assistance, substitutes or release time depending on the scope. Potential impacts range from changes in a teacher's curriculum to reforming organizational structures across a district.
The author modified their original action research plan after reviewing steps from a book on conducting action research. The revised plan covers all eight steps from the book but in a different order tailored to their unique campus needs. The plan aims to improve instruction by having faculty disaggregate student data and use it to drive classroom lessons. It involves conducting a needs assessment, understanding current practices, researching effective methods, analyzing data, developing an implementation plan, taking action, monitoring progress, and evaluating results to improve student achievement.
Application of assessment and evaluation data to improve a dynamic graduate m... – Pat Barlow
1. The document describes the process of creating and refining assessment tools and curriculum for a graduate medical education workshop on research design and statistics.
2. They developed an initial assessment, pre-course survey, new classroom activities and homework, and post-course evaluation to gather data and feedback from students.
3. After implementing changes based on the assessment data, the workshop was much more successful and rigorous, demonstrating the importance of continuously collecting and using student feedback to improve a dynamic curriculum over time.
ABLE - Inside Government, E Foster, 26th November 2015 – Ed Foster
1) NTU developed a student dashboard using learning analytics to improve student retention, engagement, and attainment.
2) The dashboard provides data on student engagement like library usage, VLE access, and attendance to students and staff.
3) Analysis found students with high engagement were more likely to progress to the next year and receive higher degrees.
4) Both students and staff reported changing their behaviors due to the insights from the dashboard. Students increased engagement activities while staff targeted interactions.
There are many assessments in use but a lack of clarity about their purposes. An organized assessment plan is needed to coordinate the different initiatives and ensure all elements work together toward the overall goal of student progress. The first step is to examine the assessments currently in use and discard any that do not serve a clear need, in order to simplify the system. A valid, reliable and coordinated assessment approach can provide the right data to inform instructional decisions at each level, from problem identification to evaluation of plans.
Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning. The document discusses motivation and goals of learning analytics, challenges, and examples of data that could be analyzed from students, including demographics, academic performance, physical behavior, and online behavior. It also discusses ensuring principles of transparency, alignment with pedagogy, and responsibility in student data use. Examples are provided of diagnostic testing, analyzing VLE access data, and measuring the effects of video use and flipped classrooms.
This document discusses learning analytics and provides examples. It begins with a definition of learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts for purposes of understanding and optimizing learning. Challenges and technical aspects are mentioned. Examples of data that could be analyzed from various institutions are outlined, including demographics, academic performance, physical behavior and online behavior. Motivations for learning analytics are discussed. The document considers what data is already collected at institutions and potential challenges. Core principles for learning analytics from the Open University UK are outlined. Examples of learning analytics projects are briefly described.
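To make the "analysing VLE access data" example above concrete, here is a minimal sketch (not taken from any of the presentations) of how VLE access logs might be summarised into weekly per-student engagement figures; the file name and column names are assumptions for illustration.

```python
# Minimal sketch, assuming hypothetical column names and file path, of summarising
# VLE access logs into weekly per-student engagement figures. Illustration only,
# not code from any of the presentations above.
import pandas as pd

logs = pd.read_csv("vle_access_log.csv", parse_dates=["timestamp"])  # one row per VLE event

# Aggregate events and distinct active days per student, module and ISO week.
logs["week"] = logs["timestamp"].dt.isocalendar().week
weekly = (
    logs.groupby(["student_id", "module_code", "week"])
        .agg(events=("timestamp", "size"),
             active_days=("timestamp", lambda t: t.dt.date.nunique()))
        .reset_index()
)

# Flag students whose activity in the most recent week is well below the module median.
latest = weekly[weekly["week"] == weekly["week"].max()].copy()
latest["module_median"] = latest.groupby("module_code")["events"].transform("median")
latest["low_engagement"] = latest["events"] < 0.5 * latest["module_median"]
print(latest.loc[latest["low_engagement"], ["student_id", "module_code", "events"]])
```

Weekly summaries of this kind are typically the raw material for the dashboards and feedback emails described in the projects below.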
The document discusses a project that used virtual learning environment (VLE) data to provide weekly automated feedback to first-year students on their engagement. The project aimed to improve student engagement and progression. Students received emails summarizing their VLE activity and survey results found most students changed their VLE usage and would participate again. The document recommends using VLE data to provide feedback but ensuring appropriate ethical approval is received.
This document provides an overview of PredictED, a pilot learning analytics project at Dublin City University (DCU). The project used data from the university's virtual learning environment (VLE) to provide weekly feedback emails to 1,184 opted-in students across 17 first-year modules. The emails aimed to encourage students to engage more with course content online. Most students (76%) opted in, as did the participating lecturers. Academic performance was higher on average for participants than for non-participants in 8 of the 10 modules analysed, and students reported studying and engaging more with the VLE because of the feedback. The document also briefly discusses the types of student data that could potentially be used for predictive analytics and the importance of students' academic networks.
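As an illustration of the kind of pipeline PredictED describes, weekly VLE features feeding a simple model that drives a feedback email, here is a hedged sketch; the features, model choice, and email wording are assumptions for illustration, not the project's actual implementation.

```python
# Illustrative sketch only: one plausible way to generate a weekly, VLE-based
# feedback email of the kind PredictED describes. Features, model and wording
# are assumptions, not the project's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical weekly features per student: [logins, resources viewed, quiz attempts].
# Labels are pass/fail outcomes from a past cohort (simulated here).
X_train = rng.poisson(lam=[5, 12, 2], size=(500, 3)).astype(float)
y_train = (X_train.sum(axis=1) + rng.normal(0, 4, 500) > 18).astype(int)

model = LogisticRegression().fit(X_train, y_train)

def weekly_feedback(name: str, this_week: np.ndarray, cohort_median: np.ndarray) -> str:
    """Compose a short engagement summary comparing a student to the cohort."""
    p_pass = model.predict_proba(this_week.reshape(1, -1))[0, 1]
    trend = "above" if this_week.sum() >= cohort_median.sum() else "below"
    return (f"Hi {name}, your VLE activity this week was {trend} the class median. "
            f"Based on activity so far, students with a similar pattern passed "
            f"{p_pass:.0%} of the time. Keep engaging with your module on Loop.")

print(weekly_feedback("Student A", np.array([3.0, 8.0, 1.0]), np.array([5.0, 12.0, 2.0])))
```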
Studying learning journeys with lecture capture through Staff-Student partnerships – Karl Luke
This document discusses two student partnership projects at Cardiff University that explored student use of lecture recordings. Student partners conducted research including surveys and interviews that provided insights into how students use lecture capture. Key findings indicated that lecture recordings enhanced learning for many students and supported inclusivity. The partnerships helped advance understanding of lecture capture and provided practical advice on implementing learning technologies through collaboration with students.
Australian university teachers' engagement with learning analytics: Still ea... – Blackboard APAC
This session reports the results of a recent OLT-funded national exploratory study of the factors that matter, and their impact, when implementing learning analytics for student retention purposes. The project used a mixed-method research design and yielded a series of outputs, including a non-technical overview of learning analytics linking the fields of student retention and learning analytics, and an institution-level survey on sector readiness and decision making around using learning analytics for retention purposes. An academic-level survey was administered to academic staff exploring their progress, aspirations and support needs relating to learning analytics; follow-up interviews expanded on their experiences with learning analytics to date. An evidence-based framework was developed, mapping the important factors affecting learning analytics decision making and implementation. This was illustrated by a suite of five case studies, developed by each of the research partner institutions, detailing their experiences with learning analytics and demonstrating why elements in the framework are important. These findings were shared and tested at a National Forum in April 2015.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard. 24 -27 August 2015 in Adelaide, Australia.
By Liu Qizhang.
The flipped classroom is an emerging pedagogical model in which the typical lecture and homework elements of a course are reversed. It blends educational technology and active learning to enhance students’ learning. We are among the pioneers in the School of Business to flip part of our course.
In this talk, we will share our experience of flipping four lessons in Semester I 2013/2014. In particular, we will answer some of the questions related to the flipped classroom: Why flip the classroom? What should be flipped and what should not? How can the flipped classroom be made more efficient? What do students think about the flipped classroom?
This document discusses embedding learning analytics across an institution and the challenges of implementing an institutional learning analytics solution. It outlines five key challenges: defining strategic goals and effective governance; interacting with institutional tools and users while addressing data ethics concerns; exposing assumptions and designing appropriate tools; effective communication; and managing ongoing implementation and change. It then provides examples of NTU's student dashboard which aims to improve student success, staff-student relationships, and students' self-management of learning through metrics on student engagement. Evaluation found the dashboard was useful for most students and correlated with improved progression and study habits. The dashboard is being expanded and its impact further evaluated.
Using intelligent tutoring systems, virtual laboratories, simulations, and frequent opportunities for assessment and feedback, The Open Learning Initiative (OLI) builds open learning environments that support continuous improvement in teaching and learning.
One of the most powerful features of web-based learning environments is that we can embed assessment into virtually all instructional activities. As students interact with OLI environments, we collect real-time data on student work. We use this data to create four positive feedback loops:
• feedback to students
• feedback to instructors
• feedback to course designers
• feedback to learning science researchers
In this JumpStart Session, we demonstrate how OLI uses the web to deliver online instruction that instantiates course designs based on research and how the learning environments, in turn, support ongoing research. We will discuss the Community College Open Learning Initiative (CC-OLI) and how faculty and colleges across the country can participate in CC-OLI and the connection between CC-OLI and Washington State’s Open Course Library project.
Analysing analytics: what is learning analytics? – Moodlerooms
The document discusses learning analytics, which is defined as the measurement, collection, analysis and reporting of learner data to optimize learning. It describes how data from student profiles, activities, course content and results can be collected and analyzed descriptively, diagnostically, predictively and prescriptively. The document also addresses ethical concerns regarding data privacy, transparency and ensuring analytics are used to benefit students. It provides examples of how different stakeholders may use analytics and discusses the Open University's principles of applying analytics in an ethical manner that respects student consent and privacy.
Developing Health Sciences students’ information skills through online self-p... – Sarah Gallagher
Initial feedback on a cross-cohort evaluation of an online self-paced information skills programme in three second-year health sciences programmes at the University of Otago: Medicine, Pharmacy and Physiotherapy. Presented at Spotlight on Teaching 2013, University of Otago.
This document discusses Newcastle University's approach to module evaluation as a tool to improve teaching quality. It provides an overview of the university and the context for module evaluations. It describes the process of implementing online module evaluations across the university using a centralized system with common questions. It discusses challenges engaging both staff and students, and strategies used. It concludes by reflecting on lessons learned and plans for continued improvements.
The document summarizes a pilot project that tested using a pre-arrival induction task through the university's student dashboard. The task involved students answering 6 questions before arriving on campus. The pilot found that students who completed the task had higher engagement with the dashboard, better progression to the second year, and higher average grades. It is an effective early predictor of students who may need additional support. The document recommends more fully integrating the task into course activities and providing follow-up interventions for at-risk students identified through the task.
The document discusses scaffolding problem-based learning (PBL) through module length problems at the University of Leicester's Interdisciplinary Science programme. It found that initially, PBL delivery led to surface learning and poor exam results. Interventions like pre-session preparation materials, feedback sessions, and subject-specific teaching fellows improved student marks and engagement. A student focus group indicated the changes, especially use of teaching fellows, benefited their learning. While limited by a small cohort, the results suggest scaffolding can help students, particularly those with strong or weak first year performance.
The document discusses learning analytics, which is defined as the measurement, collection, analysis and reporting of data about learners and their learning environments. It aims to understand and optimize learning. The document outlines the types of data that is collected on students, including profiles, activities, content accessed, and results. It also discusses the goals of improving student success, retention, and experience. Key topics covered include descriptive, diagnostic, predictive and prescriptive analytics. The document raises important ethical concerns around data access, ownership, transparency and privacy when applying learning analytics and discusses approaches taken by organizations like the Open University.
The document summarizes a study evaluating the usability of a virtual learning environment (VLE) from the perspective of teachers at King Saud University. The study involved having teachers complete tasks in the VLE and provide feedback. Results showed that teachers were generally interested in using a VLE, found it easy to use, and thought students could learn it quickly. However, teachers also felt the VLE was overly complex, needed technical support, and was inconsistent. The study provided insights into both the positives and negatives of the VLE according to teacher users.
Talis Insight Europe 2017 - Careful now: Reading List management in Irish uni... – Talis
This document discusses reading list management in Irish universities. It provides context on university expansion in Ireland and current reading list practices, which vary across institutions. It then presents a case study of NUI Galway's implementation of a Reading List Management System (RLMS) to improve student satisfaction, engagement, and collection management. Key aspects included getting academic staff to submit reading lists, ensuring adequate copies based on class size, and integrating the lists into the learning management system. Since launching in 2016, over 450 lists have been added. Student and faculty feedback has been positive, and the library's student satisfaction ratings have continued to improve. However, ongoing advocacy is still needed to gain full academic buy-in for the system.
Developing Health Sciences students’ information skills through online self-p... – Sarah Gallagher
The document summarizes a study on developing health sciences students' information literacy skills through an online self-paced course called StudySmart. It describes StudySmart as an online library course with topics, quizzes and videos designed by the university health sciences library staff. The study evaluated StudySmart across different student cohorts and found high completion rates. Student feedback indicated that videos and database searching skills were most useful, while some videos could be shortened. The study concluded academic support and tying the course to assessments improves learning, and constant improvement is needed.
Empirical studies of adaptive annotation in the educational context have demonstrated that it can help students to acquire knowledge faster, improve learning outcomes, reduce navigational overhead, and encourage non-sequential navigation. Over the last 8 years we have explored a lesser known effect of adaptive annotation – its ability to significantly increase student engagement in working with non-mandatory educational content. In the presence of adaptive link annotation, students tend to access significantly more learning content; they stay with it longer, return to it more often and explore a wider variety of learning resources. This talk will present an overview of our exploration of the addictive links effect in many course-long studies, which we ran in several domains (C, SQL and Java programming), for several types of learning content (quizzes, problems, interactive examples). The first part of the talk will review our exploration of a more traditional knowledge-based personalization approach and the second part will focus on more recent studies of social navigation and open social student modeling
Offshore lecture 1, 12th September 2015, IPGKSAH.pptx – WanFadh1
This document provides an overview of a PLG 501 class on research methods in education taught by Prof Munirah Ghazali. It includes information about the course such as administrative details, lecture topics, required textbook, and support systems. The course will cover an introduction to educational research including the scientific method, different types of research classified by purpose (e.g. basic, applied) and method (e.g. quantitative, qualitative). It will also provide examples of various research designs like descriptive research, experimental research, and ethnography.
1. Student Data: Data is knowledge – putting the knowledge back in the students’ hands
Owen Corrigan, Mark Glynn, Aisling McKenna, Alan F. Smeaton, Sinéad Smyth
@glynnmark
2. Outline
• Motivation and goals
• Selecting the modules
• Study by numbers
• The interventions – what the student sees
• What the students said
• The results
4. Study by numbers
• 17 modules across the University (first year, high failure rate, use Loop, periodicity, stability of content, lecturer on board)
• Offered to students who opt in or opt out, over-18s only
• 76% of students opted in; 377 opted out; no difference among cohorts
• 10,245 emails sent to the 1,184 students who opted in, over 13 weekly email alerts
7. Modules which work well …
• Have periodicity (repeatability) in Moodle access
• Confidence of predictor increases over time
• Don't have high pass rates (< 0.95)
• Have a large number of students, early-stage
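The selection criteria on this slide amount to a simple filter over candidate modules. A small sketch follows, with made-up thresholds and an invented module code (XY999) alongside two module codes taken from later slides; it is an illustration, not the project's selection code.

```python
# Sketch of applying the module-selection criteria above to a table of candidates.
# Thresholds, the XY999 module and all the values are assumptions for illustration.
import pandas as pd

modules = pd.DataFrame({
    "module_code": ["BE101", "CA103", "XY999"],
    "pass_rate":   [0.72,    0.81,    0.97],
    "enrolment":   [310,     280,     45],
    "year_of_study": [1,     1,       3],
    "weekly_moodle_periodicity": [True, True, False],  # regular week-to-week access pattern
})

candidates = modules[
    (modules["pass_rate"] < 0.95)            # exclude almost-universally-passed modules
    & (modules["enrolment"] >= 100)          # large cohorts (threshold assumed)
    & (modules["year_of_study"] == 1)        # first-year, early-stage students
    & modules["weekly_moodle_periodicity"]   # repeatable Moodle access pattern
]
print(candidates["module_code"].tolist())
```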
8. PredictED Participant Profile: no significant difference in the entry profiles of participants vs. non-participants overall
11. What did the students say?
Students who took part were asked to complete a short survey at the start of Semester 2 - N=133 (11% response rate)
• % of respondents who opted out of PredictED during the course of the semester: Group 1 (more detailed email) 4.5%, Group 2 4.5%
• % who changed their Loop usage as a result of the weekly emails: Group 1 43.3%, Group 2 28.9%
• % who would take part again / are offered and are taking part again: Group 1 72.2% (45.6% / 26.6%), Group 2 76.6% (46% / 30.6%)
12. 33% said they changed how they used Loop. We asked them how?
• Studied more
– “More study”
– “Read some other articles online”
– “Wrote more notes”
– “I tried to apply myself much more, however yielded no results”
– “It proved useful for getting tutorial work done”
• Used Loop more
– “I tried harder to engage with my modules on loop”
– “I think as it is recorded I did not hesitate to go on loop. And loop as become my first support of study.”
– “I logged on more”
– “I read most of the extra files under each topic, I usually would just look at the lecture notes.”
– “I looked at more of the links on the course nes pages, which helped me to further my understanding of the topics”
– “I learnt how often I need to log on to stay caught up.”
13. Did you change Loop usage for other modules?
• Most who commented used Loop more often for other modules
– “More often”
– “More efficient”
– “Used loop more for other modules when i was logging onto loop for the module linked to PredictED”
– “Felt more motivated to increase my Loop usage in general for all subjects”
• One realised that Lecturers could see their Loop activity:
“I realised that since teachers knew how much i was using loop, i had to try to mantain pages long on so it looked as if i used it a lot”
14. Module Average Performance: Participants vs. Non-Participants
Subject | Description | Non-Participant | Participant
BE101 | Introduction to Cell Biology and Biochemistry | 58.89 | 62.05
CA103 | Computer Systems | 70.28 | 71.34
CA168 | Digital World | 63.81 | 65.26
ES125 | Social & Personal Dev with Communication Skills | 67.00 | 66.46
HR101 | Psychology in Organisations | 59.43 | 63.32
LG101 | Introduction to Law | 53.33 | 54.85
LG116 | Introduction to Politics | 45.68 | 44.85
LG127 | Business Law | 60.57 | 61.82
MS136 | Mathematics for Economics and Business | 60.78 | 69.35
SS103 | Physiology for Health Sciences | 55.27 | 57.03
Overall (all modules) | | 58.36 | 61.22
Average scores for participants are higher in 8 of the 10 modules analysed, and significantly higher in BE101, CA103 and MS136.
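The comparison in this table is a difference of mean module marks between participants and non-participants. Below is a minimal sketch of how such a per-module comparison could be tested for significance; the per-student marks are simulated stand-ins, only the method (a two-sample Welch t-test) is the point.

```python
# Sketch of a participant vs. non-participant comparison of mean module marks.
# The marks below are simulated for illustration; they are not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
participant_marks = rng.normal(62, 12, size=180)       # hypothetical participants in one module
non_participant_marks = rng.normal(59, 12, size=120)   # hypothetical non-participants

t_stat, p_value = stats.ttest_ind(participant_marks, non_participant_marks,
                                  equal_var=False)     # Welch's t-test
print(f"mean diff = {participant_marks.mean() - non_participant_marks.mean():.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f}")
```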
21. Importance of Ethics
• Ethics are important to ensure safety of participants and researchers
• Educational Data Analytics is a new area of research
– Not much previous research to highlight possible ethical issues
– Requires extensive ethical consideration
• We have spent a lot of time this Summer preparing a DCU REC submission
– We’ve submitted and had approval for a test case
– We’ve met with the REC chair to brief him
• We are following the 8 Principles set out by the Open University, who are at EXACTLY the same stage as us
22. So much student data we could use
Demographics
• Age, home/term address, commuting distance, socio-economic status, family composition, school attended, census information, home property value, sibling activities
Academic Performance
• CAO and Leaving Cert, University exams, course preferences, performance relative to peers in school
Physical Behaviour
• Library access, sports centre, clubs and societies, eduroam access yielding co-location with others and peer groupings, lecture/lab attendance
Online Behaviour
• Mood and emotional analysis of Facebook, Twitter, Instagram activities, friends and their actual social network, access to VLE (Moodle)
24. Notes on model confidence
• Y axis is confidence in AUC ROC (not probability)
• X axis is time in weeks
• 0.5 or below is a poor result
• Most modules start at 0.5 when we don't have much information
• 0.6 is acceptable, 0.7 is really good (for this task)
• The model should increase in confidence over time
• Even if confidence overall increases, due to randomness the confidence may go up and down
• It should trend upwards to be a valid model and viable module choice
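The confidence-over-time behaviour described on this slide can be reproduced in outline by retraining a classifier each week on the activity observed so far and tracking its AUC-ROC on held-out students. The sketch below uses simulated data and an assumed model; it is not the PredictED implementation.

```python
# Hedged sketch: retrain a simple classifier each week on cumulative activity and
# track its AUC-ROC, which should trend upwards from ~0.5 as more weeks are seen.
# All data here is simulated; the model choice is an assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_students, n_weeks = 600, 13
weekly_activity = rng.poisson(4, size=(n_students, n_weeks)).astype(float)
passed = (weekly_activity.sum(axis=1) + rng.normal(0, 8, n_students) > 52).astype(int)

for week in range(1, n_weeks + 1):
    X = weekly_activity[:, :week]                     # only the weeks seen so far
    X_tr, X_te, y_tr, y_te = train_test_split(X, passed, test_size=0.3, random_state=0)
    auc = roc_auc_score(y_te, LogisticRegression().fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(f"week {week:2d}: AUC-ROC = {auc:.2f}")
```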
34. Timescale for Rollout
• Still some issues on Moodle access log data transfer to be resolved
• Still have to resolve student name / email address / Moodle ID / student number
• Still to resolve timing of when we can get new registration data, updates to registrations (late registrations, change of module, change of course, etc.) …
• Should we get new, “clean” data each week?
35. Why did you take part?
• The majority of students wanted to learn about/monitor their performance
• Many others were curious
• Some were interested in the research aspect
• Some were just following advice
• Others were indifferent
36. How easy was it to understand the information in the emails? (1 = not at all easy, 5 = extremely easy)
• Average 3.97 (SD = 1.07)
• Very few had comments to make (19/133)
– Most who commented wanted more detail.