Jisc provides a learning analytics service to help higher education institutions use student data to improve outcomes. The service includes predictive models, dashboards, and an app to provide insights to staff and students. It integrates various institutional data sources. Over 30 institutions are using the service to enhance student success, curriculum, employability, and digital apprenticeships through analytics.
Jisc learning analytics service core slides (Paul Bailey)
The document summarizes Jisc's learning analytics service, which aims to help higher education institutions use student data and analytics to improve student outcomes. The service provides tools for predictive modeling, dashboards, and an app for students. It also offers guidance on legal and ethical issues, workshops on implementation, and connects institutions with analytics solution providers. The goal is to support 40 institutions by 2018 through the free core service and additional fee-based products and services.
This document discusses Jisc's learning analytics service, which aims to help higher education institutions in the UK implement learning analytics through three core strands: a toolkit, community, and service. The service will provide dashboards, apps, and tools to collect, analyze, and report on student data in order to improve retention, attainment, and personalize learning. It outlines the onboarding process, technical architecture, and implementation approach for institutions to get started with learning analytics.
The document discusses national learning analytics in the UK and Jisc's role in providing learning analytics services. It describes Jisc's learning analytics tools and products like the Data Explorer dashboards, Study Goal app, and Learning Data Hub. It outlines Jisc's onboarding process for institutions and examples of how they are working with universities and colleges to implement learning analytics.
The document summarizes Paul Bailey's presentation on Jisc's learning analytics service. The service aims to help higher education institutions in the UK use learning analytics to improve student retention, attainment, employability and personalized learning. It provides tools like dashboards, a student app, and an alert and intervention system. The service also offers an onboarding process to help institutions prepare, and a community for sharing knowledge and experiences with learning analytics.
The document provides the programme for the 11th UK Learning Analytics Network Meeting at Aston University on 5 September 2017. The programme includes updates on Jisc's Effective Learning Analytics project and interactive sessions on planning student-facing interventions and curriculum enhancements. It also outlines Jisc's learning analytics service and products like the Jisc Learning Records Warehouse and Study Goal student app.
The document outlines the agenda for the 8th UK Learning Analytics Network Meeting at the Open University on November 2nd, 2016. The agenda includes updates on Jisc's learning analytics program, sessions on learning design and analytics, legal issues, and the Learning Analytics Community Exchange.
Learning analytics as a national initiative (Paul Bailey)
Jisc is developing a national learning analytics initiative in the UK that includes three core components: a learning analytics service, toolkit and consultancy, and a community network. The initiative aims to help institutions implement learning analytics through an onboarding process, code of practice, and open architecture service. The service includes predictive models, intervention tools, dashboards, and a student app to improve retention, outcomes, and support through analysis of educational data.
This document discusses the development of a learning analytics app to engage students. It provides an overview of learning analytics and Jisc's learning analytics project. Student consultations found they want to see assessments, engagement levels, reading lists, and find study partners. Proposed app principles include being comparative, social, gamified, and private by default. Wireframes show a timeline, stats on engagement/attainment, logging activities, and setting targets. The app will have an initial release in April 2016 for smartphones, with a second release in September 2016 adding more features informed by user testing.
Jisc provides digital services and solutions to UK higher education. It is developing a Learning Analytics service with 3 strands: a service, toolkit, and community. The service involves dashboards, a student app, and intervention tools. A timeline was outlined ending with the full service in 2017. Technical trials were discussed to integrate data. Considerations around ethics, consent and supporting staff/students were also mentioned.
The document provides an overview of Jisc's learning analytics service. It discusses learning analytics concepts, evidence of the benefits of learning analytics from various institutions, the components of Jisc's service including the learning analytics toolkit, community, and architecture. It outlines the on-boarding process, discovery readiness assessment, products in the beta service, how suppliers will be worked with, and the pricing formula beginning in 2018-19.
Jisc Learning Analytics intro for digital leaders (Paul Bailey)
This document discusses learning analytics and Jisc's learning analytics project. It defines learning analytics as the measurement, collection, analysis and reporting of learner data to understand and optimize learning. The goals of learning analytics include improving retention, achievement, employability and personalized learning. Jisc's learning analytics project has three strands: a learning analytics service, toolkit and community. It outlines the phases of the project from 2015-2017 and encourages involvement. The document prompts attendees to discuss goals and challenges of implementing learning analytics at their institutions. It provides contact information for further resources.
A Pulse of Predictive Analytics In Higher Education │ Civitas Learning (Civitas Learning)
Civitas Learning presents the findings of a survey conducted during the September 2014 Civitas Learning Summit, where more than 100 leaders representing 40 Pioneer Partner institutions gathered to share their work. The survey, distributed to all participants, drew 74 responses highlighting how this cross-section of higher education institutions is using advanced analytics to power student success initiatives.
Bb Education on Tour | Blackboard Learning Analytics | Chris Eske, Platform S... (Blackboard APAC)
This document discusses using analytics to drive continual improvement and catalyze change in higher education. It provides examples of how analytics can be used to measure learning outcomes, analyze the costs and effectiveness of instruction, and inform decisions to improve student performance and success. The document advocates asking questions of the data, digging deeper into analyses, and using metrics and insights to take action and measure progress.
Data visualisation with predictive learning analytics (Chris Ballard)
The document discusses using predictive analytics and data visualization in education. It outlines an objective to build predictive models for student success and map them to retention themes. Examples of visualization include monitoring courses and modules, and identifying at-risk students. Guidelines recommend visualizations be simple to interpret, adapt to the user, indicate how predictions are built, bridge predictive and historical data, enable user response and monitoring of actions. The goal is to identify at-risk students earlier and understand factors influencing student success.
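The predictive modelling described above can be illustrated with a minimal sketch. This is not the model from the presentation: the features (weekly VLE logins, assessments submitted), the toy cohort, and the plain gradient-descent logistic regression are all invented for illustration of how engagement data might feed an at-risk score.

```python
# Hypothetical sketch of an at-risk predictor trained on engagement counts.
# Feature names and data are illustrative, not any institution's real model.
import math

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit a logistic-regression model with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of risk
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def risk_score(w, b, x):
    """Probability that a student with engagement profile x is at risk."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy cohort: [VLE logins per week, assessments submitted]; 1 = withdrew.
X = [[9, 4], [8, 3], [7, 4], [1, 0], [2, 1], [0, 0]]
y = [0, 0, 0, 1, 1, 1]

w, b = train_logistic(X, y)
print(round(risk_score(w, b, [8, 4]), 3), round(risk_score(w, b, [1, 0]), 3))
```

A dashboard following the guidelines above would surface this score alongside the raw engagement history, so staff can see how the prediction was built and act on it.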
Jisc technology to tutoring new and emerging developments (Paul Bailey)
Presentation on 27 January at the Centre for Recording Achievement seminar on Technology to Support 21st Century Tutoring: New and Emerging Developments. Paul Bailey, Lisa Grey and Ruth Drysdale.
Online Educa Berlin conference: Big Data in Education - theory and practice (Mike Moore)
This document discusses how analyzing big data can provide valuable insights for education. It explains that big data is characterized by the 3 Vs: volume, velocity, and variety. Analyzing student data can provide insights into trends, transparency, and actionable information to improve areas like grades, outcomes, and personalized learning. It also discusses challenges in higher education like student retention and time to degree completion that big data analytics may help address. Examples of analytics applications that can help institutions understand students, instructors, programs and provide real-time dashboards and predictive modeling are presented.
This document discusses learning analytics and the potential uses of various institutional data sources. It describes how combining data from student records, the virtual learning environment (VLE), library usage, attendance records, and other sources through a learning analytics service could help improve student retention and attainment, enhance teaching quality, enable personalized learning, and support student health and well-being. Specific opportunities mentioned include predicting at-risk students, analyzing factors related to student success and employability, and using activity data to support timely interventions. Engaging various stakeholders like students, teachers, and campus planners is presented as important for effective use of learning analytics.
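The data integration described above can be sketched in a few lines. All identifiers and field names here are invented for the example; a real service would draw on the institution's student record system, VLE logs, and library systems.

```python
# Illustrative sketch only: combining institutional data sources into a
# single per-student view. Records and field names are invented.
from collections import defaultdict

student_records = {"s1": {"name": "Asha", "programme": "History"},
                   "s2": {"name": "Ben", "programme": "Physics"}}
vle_events = [("s1", "login"), ("s1", "forum_post"), ("s2", "login")]
library_loans = [("s1", "2016-10-03"), ("s2", "2016-10-04"), ("s2", "2016-10-10")]

def build_profiles(records, events, loans):
    """Merge sources keyed by student id into one engagement profile."""
    vle_counts = defaultdict(int)
    for sid, _ in events:
        vle_counts[sid] += 1
    loan_counts = defaultdict(int)
    for sid, _ in loans:
        loan_counts[sid] += 1
    return {sid: {**rec,
                  "vle_events": vle_counts[sid],
                  "library_loans": loan_counts[sid]}
            for sid, rec in records.items()}

profiles = build_profiles(student_records, vle_events, library_loans)
print(profiles["s1"])
```

Joining on a stable student identifier is the key design choice: once every source shares that key, new sources (attendance, assessment) can be folded into the same profile without reworking the consumers downstream.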
Implementing analytics - Rob Wyn Jones, Shri Footring and Rebecca Davies (Jisc)
Led by Rob Wyn Jones, consultant, and Shri Footring, senior co-design manager - enterprise, both of Jisc.
With contribution from Rebecca Davies, pro vice-chancellor and chief operating officer, Aberystwyth University.
Connect more in Wales, 7 July 2016
Big data in education is an important topic, with key challenges around data integration, skills, and privacy protections. Huawei provides big data storage solutions to help address these issues, offering a robust, high-performance infrastructure that can integrate, store, process, model, and visualize large amounts of educational data. This assists with increasing educator effectiveness, delivering personalized education, and equipping students with relevant skills, but clear policies are still needed regarding privacy protections for learners.
Opening/Framing Comments: John Behrens, Vice President, Center for Digital Data, Analytics, & Adaptive Learning Pearson
Discussion of how the field of educational measurement is changing: how long-held assumptions may no longer be taken for granted, and how new terminology and language are coming into use.
Panel 1: Beyond the Construct: New Forms of Measurement
This panel presents new views of what assessment can be, and new species of big data that push our understanding of what can be used in evidentiary arguments.
Marcia Linn and Lydia Liu, from UC Berkeley and ETS, discuss continuous assessment of science and new kinds of constructs that relate to collaboration and student reasoning.
John Byrnes from SRI International discusses text and other semi-structured data sources and different methods of analysis.
Kristen DiCerbo from Pearson discusses hidden assessments and the different student interactions and events that can be used in inferential processes.
Panel 2: The Test is Just the Beginning: Assessments Meet Systems Context
This panel looks at how assessments are not the end game, but often the first step in larger big-data practices at districts/state/national levels.
Gerald Tindal from the University of Oregon discusses state data systems and special education, including curriculum-based measurement across geographic settings.
Jack Buckley, Commissioner of the National Center for Education Statistics, discusses national datasets where tests and other data connect.
Lindsay Page and Will Marinell from the Strategic Data Project at Harvard discuss state and district datasets used for evaluating teachers, colleges of education, and student progress.
Panel 3: Connecting the Dots: Research Agendas to Integrate Different Worlds
This panel looks at how research organizations view the connections between the perspectives presented in Panels 1 and 2: what is known, and what is yet to be discovered in order to achieve the promise of big connected data in education.
Andrea Conklin Bueschel, Program Director at the Spencer Foundation
Ed Dieterle, Senior Program Officer at the Bill and Melinda Gates Foundation
Edith Gummer, Program Manager at the National Science Foundation
This document discusses using data and learning analytics to inform blended learning design and delivery. It describes learning design and curriculum design, as well as learning analytics and how they interact. Potential uses of data discussed include understanding what learning materials and activities students engage with the most, what students are learning, and tailoring learning design based on the characteristics of incoming student cohorts. Challenges around legal/ethical issues, data wrangling, and institutional culture are also addressed. The document encourages discussion of current and potential uses of curriculum analytics within institutions.
Ellen Wagner, Executive Director, WCET.
Putting Data to Work
This session explores changing data sensibilities at US post-secondary institutions with particular attention paid to how predictive analytics are changing expectations for institutional accountability and student success. Results from the Predictive Analytics Reporting Framework show that predictive modeling can identify students at risk and that linking behavioral predictions of risk with interventions to mitigate those risks at the point of need is a powerful strategy for increasing rates of student retention, academic progress and completion.
Presentation at the 15th annual SLN SOLsummit, February 27, 2014
http://slnsolsummit2014.edublogs.org/
Are you really ready to roll out learning analytics across your entire instit... (Jisc)
Speaker: Steve Hoole, senior analytics consultant.
This workshop will enable delegates to consider a process to help them find a suitable solution for learning analytics implementation. It will discuss how to move from small pilots to institution-wide implementation of learning analytics, considering issues such as legal and ethical requirements (for example, GDPR compliance), planning for intervention management, and ensuring staff and student engagement and support.
Can you predict a degree from five weeks' worth of VLE data? Learning analyti... (Jisc)
Speakers:
Jason Bailey, learning technologies adviser, University of Brighton
Katie Piatt, e-learning manager, University of Brighton
The University of Brighton are in their second year of student dashboard delivery and are also modelling historical data as part of their learning analytics work in order to predict student outcomes.
The document discusses learning analytics and the Jisc learning analytics service. It provides an overview of what learning analytics is, the goals of the Jisc service which include helping institutions get started with learning analytics and providing standard tools, and the components of the Jisc service including a code of practice, community resources, data collection and products like Data Explorer and Study Goal. It also discusses working with institutions, engagement activities, the on-boarding process, and engaging with solution providers.
Jisc learning analytics service oct 2016 (Paul Bailey)
This document summarizes Paul Bailey's presentation on Jisc's learning analytics service. It discusses what learning analytics is, how it can be used to improve student performance, teaching quality, and institutional strategy. The service will provide dashboards, a student app, and an alert system to help identify at-risk students. It will initially focus on student engagement and attainment data to improve retention and achievement. An on-boarding process and readiness assessment are also outlined to help institutions implement learning analytics. The goal is to launch the service in 2017 to measure its impact on key performance indicators.
The document discusses using learning analytics and data from apprenticeships to improve the apprentice experience. It describes an initiative to collect and analyze data from all aspects of the apprenticeship journey to enhance and improve the journey. The initiative is developing a learning analytics service including a learning data hub to gather this apprenticeship data and provide analytics dashboards for employers.
Jisc learning analytics MASHEIN Jan 2017 (Paul Bailey)
Jisc Learning Analytics presentation at the Leading Digital Learning: Key Issues for Small and Specialist Institutions event organised by MASHEIN (Management of Small Higher Education Institutions Network).
Tableau together with analytics: an introduction to simple examples of data visualisation, and how to bridge the gap in using data for education across strategies, data, and analytics.
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with an opportunity to discuss issues and possibilities that the use of learning analytics may create.
Intro to Jisc Learning Analytics March 16 (Paul Bailey)
Jisc provides digital services and infrastructure for UK higher education. This document discusses Jisc's learning analytics services, which aim to improve student retention, achievement, and employability through collecting and analyzing student data. The offering comprises a learning analytics toolkit, community, and service. The service will collect student activity and performance data from sources like Moodle and Blackboard and analyze it using tools from Tribal and Unicon/Marist. Initial pilot institutions will explore readiness and available products. Next steps include addressing legal/ethical issues, technical trials of data collection and analysis tools, and full implementations starting in 2016. A student app is also being developed to display personalized analytics and allow students to log activities and set targets.
Jisc learning analytics service sept 2016 (Paul Bailey)
The Jisc learning analytics service aims to help institutions improve student retention, attainment, employability and personalised learning through the application of learning analytics techniques. The service will provide dashboards and apps for students, as well as an on-boarding process and community for sharing knowledge. It will launch in September 2017 after testing tools and metrics in 2015-16 and transitioning to a service in 2016-17. The service architecture includes data collection, visualization dashboards for staff, a student app, and an alert system to enable interventions.
Implementing analytics part 1 - Niall Sclater (Jisc)
The document provides an overview of Jisc's Effective Learning Analytics programme, which aims to help higher education institutions implement learning analytics to improve student retention and attainment. Key points include:
- The programme tested and developed learning analytics tools from 2015-2017 and launched a service in September 2017 to measure the impact on retention and achievement.
- Learning analytics can help improve retention, attainment, employability and personalised learning by applying techniques like machine learning and data mining.
- The programme provides institutions with a toolkit, community support, and case studies on implementing learning analytics for outcomes like improving grades and identifying at-risk students earlier.
How learning analytics can influence the digital strategy in an institution (Jisc)
Speakers:
Dr Nick Moore, director of library, technology and information services, University of Gloucestershire
Dr Christine Couper, director of strategic planning, University of Greenwich
As the field of learning analytics matures, the strategic approaches that result in successful implementation are beginning to emerge. This talk will explore the pathways some institutions are taking to establish learning analytics as an integral part of their curriculum enhancement and student success paradigms. The session will showcase experience, examples, and lessons learnt from two universities that have been implementing the Jisc learning analytics service.
The document provides an overview of Jisc's Learning Analytics project which aims to help higher education institutions in the UK improve student retention, achievement, and employability through the application of learning analytics techniques. The project involves three core strands: a learning analytics service, toolkit, and community. It also discusses the architecture, data structures, how institutions can get involved, and provides examples of analytics activities at different universities.
Overview of Effective Learning Analytics: Using data and analytics to support ... (Bart Rienties)
Begona Nunez-Herran and Kevin Mayles (Data and Student Analytics), Rebecca Ward (Data Strategy and Governance)
-Move towards centralised LA data infrastructure
-Data governance and lessons learned
Prof Bart Rienties & PhD students (Institute of Educational Technology)
-What is the latest “blue sky” learning analytics research from the OU?
-Rogers Kalissa: Social Learning Analytics to support teaching (University of Oslo)
-Saman Rizvi: Cultural impact of MOOC learning (IET)
-Shi Min Chua: Why does no one reply to my posts (IET/WELS)
-Maina Korir: Ethics and LA (IET)
-Anna Gillespie: Predictive Learning Analytics and role of tutors (EdD)
Prof John Domingue (Knowledge Media Institute) & Dr Thea Herodotou (IET)
-What have we learned from 5 years of large scale implementation of OU Analyse?
-Where is LA/AI going?
The document discusses learning analytics and outlines an agenda for an OER policy roundtable. It defines learning analytics and academic analytics. It discusses benefits like reducing attrition and personalizing learning. Examples from universities in Australia are provided. The document outlines what teaching and learning data is needed, strategies for acquiring this data, and potential actionable insights around retention, at-risk students, and learning effectiveness. It concludes with recommendations for next steps like funding research projects, establishing an open analytics platform, and pilot studies.
SoLAR Flare 2015 - Turning Learning Analytics Research into Practice at Tribal (Chris Ballard)
Speaking engagement at LACE SoLAR Flare hosted by the Open University. Turning Learning Analytics Research into Practice at Tribal. A video of my talk can be found at http://stadium.open.ac.uk/stadia/preview.php?whichevent=2606&s=1&schedule=3411&option=&record=0#
Data Driven Instructional Decision Making: A framework (whittemorelucilla)
Data-Driven Instructional Decision Making: A Framework

Data-Driven Instruction
Data-driven instruction is characterized by cycles that provide a feedback loop in which teachers plan and deliver instruction, assess student understanding through the collection of data, analyze the data, and then pivot instruction based on insights from their analysis.
From: Teachers Know Best: Making Data Work for Teachers and Students, Bill & Melinda Gates Foundation
https://s3.amazonaws.com/edtech-production/reports/Gates-TeachersKnowBest-MakingDataWork.pdf
Data-Driven Decision Making Process Cycle
• Data Planning and Production
• Data Analysis
• Developing an Action Plan
• Implementing the Action Plan
• Monitoring Progress
• Measuring Success
Data is used at every stage of the cycle.
Data-Driven Instruction Feedback Loop
• Data Planning and Production
• Data Analysis
• Developing an Action Plan
• Implementing the Action Plan
• Monitoring Progress
• Measuring Success
Data-Driven Instruction Feedback Loop
Instructors need to facilitate this data-driven instruction decision loop in a timely and smooth fashion, and on an ongoing basis:
• Per student
• Per class
• Per group
Roles Inherent in the Data-Driven Instruction
Decision Making Loop
• Planner
• Data Producer
• Data Analyst
• Monitor
• Reporter
• Data End User
• IT
• Operations and Logistics
Data Planning and Production Questions
• What questions are to be addressed in future data-informed conversations? Which questions are more important?
• What information (metrics) is needed to answer these questions?
• Is the information available and feasibly attainable?
• Are the necessary technology and resources available?
• How can current non-data-based instructional decision making be mapped to a data-based instructional decision-making process?
• What are the costs associated with this endeavor?
• What are the timelines?
• How and when will the data be collected and stored?
Data Analysis Questions
• What relations exist between the metrics? What patterns do the data reveal?
• How many levels of each metric are needed to answer the questions?
• Do the original questions need to be revised or expanded?
• Do the original metrics need to be redefined or expanded?
• What analytical tools are currently available? What tools need to be designed to support the analysis?
• What method of analysis or evaluation will be used?
• What are the data's limitations, strengths, challenges, and context?
Monitor Questions
• How are the metrics evolving as the learning and instructional processes evolve?
Research into Practice: Building and implementing learning analytics at Tribal (LACE Project)
Keynote by Chris Ballard, Data Scientist, Tribal, given at the LACE SoLAR Flare event held at The Open University, Milton Keynes, UK on 9 October 2015. #LACEflare
This document discusses learning analytics and how it can be used in Moodle. It defines learning analytics as the measurement, collection, analysis and reporting of data about learners and their contexts in order to understand and optimize learning. It describes how learner interactions in Moodle leave behind data that can be analyzed. It provides examples of how learning analytics can be used in Moodle to identify at-risk students, adapt teaching styles, and make curricular changes based on where students struggle. Finally, it discusses some native and third-party tools that can be used to implement learning analytics in Moodle like Inspire Analytics, Intelliboard, and Zoola.
ALT-C 2019 Jisc curriculum analytics - full set of slides (Paul Bailey)
A deep dive into student data to discover curriculum insights
Authors: Paul Bailey, Niall Sclater, Michael Webb, Alan Paull, and Scott Wilson
A full set of slides around curriculum analytics.
The document discusses learning analytics at the University of Greenwich. It provides an overview of how the university uses student data from various sources like grades, library usage, and attendance to monitor student engagement and outcomes. Interventions are put in place if engagement drops, such as meetings with personal tutors. Apps have been created for students and tutors to view analytics data. Considerations around data privacy and transparency are also discussed. Finally, the document considers the potential role of strategic planners in interpreting learning analytics data patterns and evidence to support activities like the Teaching Excellence Framework.
The document announces a challenge to develop a virtual learning environment that does not require a computer, phone, or tablet screen. It seeks ideas for how learning could take place without traditional screens by using emerging technologies like AI, voice tools, augmented reality, and wearable devices. The competition offers a £1000 prize for the best idea submitted. Participants must be 18+, based in the UK, and agree that submissions will be publicly shared.
The document summarizes the programme for the 9th UK Learning Analytics Network Meeting hosted by the University of Exeter and Jisc. The programme included presentations on learning analytics projects and tools from Exeter University, King's College London, Oxford Brookes, and Blackboard. It also provided updates on Jisc's learning analytics service, which includes a learning analytics toolkit, community, and analytics labs to help institutions adopt learning analytics.
The presentation summarizes Jisc's learning analytics service project, which aims to help higher education institutions improve student retention and achievement through the application of learning analytics techniques. The project involves developing a learning analytics service, toolkit, and community. The service will provide dashboards and alert systems to help identify at-risk students. An initial pilot launched in 2015-2016, with the full service planned to launch in September 2017. The project is currently working with over 30 institutions in its various phases.
Jisc is developing a learning analytics service to help higher education institutions in the UK improve student retention and attainment. The service will provide institutions access to standard analytic tools and technologies, and will include a code of practice on legal and ethical issues. A pilot program is underway from 2015-2017 to test the tools and metrics before a full service launch in September 2017. The goals are to help students through personalized learning and improved employability. The service will include an analytic toolkit, online community, and data standards to enable institutions to participate.
Learning Analytics Connect More Belfast (Paul Bailey)
The Jisc learning analytics initiative aims to help organizations implement learning analytics through developing standard tools and sharing knowledge. The initiative provides a learning analytics toolkit and community resources. It is developing a cloud-based learning analytics architecture that institutions can customize. Currently 35 institutions are engaged in the initiative through a discovery process. Going forward, the initiative will provide a readiness toolkit to help institutions prepare for learning analytics implementation. The presentation also discusses considerations around data used, stakeholder involvement, and conducting a readiness assessment.
Jisc is developing a national learning analytics service in the UK to help higher education institutions improve student retention, attainment, and experiences. The service will include a learning analytics toolkit, community, and centralized data and analytics service. The goals are to provide institutions with standardized tools and analytics to help identify at-risk students and improve teaching and support based on aggregated student data. A phased rollout is planned over two years to develop dashboards, alerts, and apps to visualize analytics and enable interventions.
This document outlines the agenda and objectives for a learning analytics conference for further education and skills. The objectives are to:
1) Prioritize user stories for predicting student failure or underachievement and identify other areas of interest.
2) Identify and prioritize the data sources and systems that need to be integrated, such as student information, the VLE, attendance data, and library information. This includes systems like BKSB.
3) Establish a reference group.
The agenda includes presentations on defining learning analytics and related projects. It also outlines the types of student data available, such as demographics, prior attainment, progress, and attendance. External data sources like labor market information and outcomes
The Bedford College Moodle grade tracker analytics (Paul Bailey)
The document discusses Bedford College's development of modules for the Moodle learning management system including the GradeTracker and ePLP. The GradeTracker allows for management of grades for qualifications like BTECs and A-levels. The ePLP tracks student attendance, targets, tutorials and destinations. The modules provide integrated tools for managing all aspects of teaching and learning. Bedford College has made the GradeTracker available to over 80 other organizations and aims to provide ongoing support and release other modules to the wider community.
3. “learning analytics is the measurement,
collection, analysis and reporting of data
about learners and their contexts, for
purposes of understanding and
optimising learning and the
environments in which it occurs”
SoLAR – Society for Learning Analytics Research
Learning Analytics Service
4. Effective Learning Analytics Challenge
Rationale
»Organisations wanted help to get started and have access to standard tools and technologies to monitor and intervene
Priorities identified
»Code of Practice on legal and ethical issues
»Develop a core learning analytics service with an app for students
»Provide a network to share knowledge and experience
Timescale
»2015-17 Development
»2017-18 Beta Service
»Aug 2018 Full Service
5. Agenda
• Rich data on student activity and attainment
• Predictive models identify students at risk
• Timely intervention by teaching or support staff, leading to increased retention
• Better understanding of the effectiveness of interventions
• Data shared with students, prompting them to change their own behaviour, leading to better student outcomes
• Data can be explored to understand patterns of behaviour, giving a better understanding of the behaviours linked to differential outcomes
6. Paul Bailey, Senior Codesign Manager, Research and Development
Jisc learning analytics service
https://docs.analytics.alpha.jisc.ac.uk/docs/learning-analytics/Home
7. Jisc’s Learning Analytics Project
Three core strands:
• Learning Analytics Service
• Toolkit, Consultancy, Framework
• Community, Network, Events
8. Community: Project Blog, mailing list and network events
Blog: http://analytics.jiscinvolve.org
Docs: http://docs.analytics.alpha.jisc.ac.uk/
Mailing list: analytics@jiscmail.ac.uk
9. On-boarding Process
Stage 1: Orientation – get more info
Stage 2: Discovery – DIY and/or paid-for consultancy
Stage 3: Culture and Organisation Setup – sign up for Jisc service and/or supplier products
Stage 4: Data Integration – push data to the learning data hub
Stage 5: Implementation Planning
https://analytics.jiscinvolve.org/wp/on-boarding/
10. Discovery readiness
A supported review of institutional readiness. Each statement is scored 0 (hardly or not at all), 1 (to some extent) or 2 (to a great extent), with a commentary on each response where appropriate:
• Leadership (1): The institutional senior management team is committed to using data to make decisions. (0/1/2)
• Leadership (2): Our vice-chancellor / principal has encouraged the institution to investigate the potential of learning analytics. (0/1/2)
• Leadership (3): There is a named institutional champion / lead for learning analytics. (0 = No, 2 = Yes)
• Vision (4): We have identified the key performance indicators that we wish to improve with the use of data. (0/1/2)
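The readiness questions above reduce to a simple numeric profile. As an illustration only (the response list and helper function here are hypothetical, not part of the Jisc toolkit), the scores could be tallied per topic like this:

```python
# Hypothetical sketch of tallying Jisc-style readiness scores by topic.
# The questions and 0/1/2 scale mirror the slide above; the code itself
# is illustrative and not part of the Jisc service.
from collections import defaultdict

# (topic, question, score) – 0 hardly/not at all, 1 to some extent,
# 2 to a great extent (question 3 is scored 0 or 2 only)
responses = [
    ("Leadership", "Senior management committed to data-driven decisions", 2),
    ("Leadership", "VC/principal has encouraged learning analytics", 1),
    ("Leadership", "Named institutional champion for learning analytics", 0),
    ("Vision", "KPIs to improve with data have been identified", 2),
]

def readiness_by_topic(responses):
    """Sum scores per topic, e.g. to spot weak areas before onboarding."""
    totals = defaultdict(int)
    for topic, _question, score in responses:
        totals[topic] += score
    return dict(totals)

print(readiness_by_topic(responses))  # {'Leadership': 3, 'Vision': 2}
```

Per-topic subtotals like these make it easy to see which area (leadership, vision, etc.) needs attention before moving to the next onboarding stage.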
11. Lessons Learned: implementation of learning analytics
1. Governance – senior management buy-in, wide engagement, dedicated project manager
2. Agreed goal – managing expectations
3. Clear strategic aims – see case studies
4. The main benefits/challenges
» It is more than the “product”
» Data cleaning and business processes (assessment data, student status, etc)
» Improving student support process – managing interventions
» Good communication to staff and students
12. Toolkit: Code of Practice
Code of Practice
http://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics
Literature Review
http://repository.jisc.ac.uk/5661/1/Learning_Analytics_A-_Literature_Review.pdf
Template Learning Analytics Policy
https://analytics.jiscinvolve.org/wp/2016/11/29/developing-an-institutional-learning-analytics-policy/
Guidance on consent for learning analytics
https://analytics.jiscinvolve.org/wp/2017/02/16/consent-for-learning-analytics-some-practical-guidance-for-institutions/
13. Legal and ethical: consent and GDPR
Advice is:
• Make sure your collection notice covers the use of data to support student learning and wellbeing
• Do not ask for consent for the use of non-sensitive data for analytics (our current understanding is that this can be considered as legitimate interest or public interest)
• Ask for consent for the use of sensitive data (which, under the GDPR, is called “special category data”)
• Ask for consent to take interventions directly with students on the basis of the analytics
https://analytics.jiscinvolve.org/wp/
14. Take-up of Jisc service
• 30 institutions signed up
• 8 institutions with institution-wide roll-out from September
• 14 HEIs in the data integration/pilot stage
• 8 colleges in service development
15. Jisc Learning Analytics open architecture: core
Data Collection: Student Records, VLE, Library, and Self-Declared Data (attendance, presence, equipment use etc.), collected via the Data Aggregator (UDD Transformation Toolkit Plugins and/or Universal xAPI Translator)
Data Storage and Analysis: Learning Data Hub, Jisc Learning Analytics Predictor
Presentation and Action: staff dashboards in Data Explorer, other staff dashboards, the Study Goal student app, the Alert and Intervention system, and a Consent Service (tbc)
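Activity data reaching the Learning Data Hub via an xAPI translator travels as xAPI statements. As a sketch, a minimal statement for a VLE event might look like the following; the student, course, and helper-function names are invented for illustration, though the actor/verb/object shape and the `launched` verb URI follow the standard xAPI vocabulary:

```python
# A minimal xAPI statement of the kind a data aggregator might emit
# for a VLE event. All identifiers here are made up for illustration;
# only the statement structure follows the xAPI specification.
import json

def vle_access_statement(student_email, course_url, timestamp):
    """Build a minimal 'launched a course' xAPI statement."""
    return {
        "actor": {
            "objectType": "Agent",
            "mbox": f"mailto:{student_email}",
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/launched",
            "display": {"en": "launched"},
        },
        "object": {
            "objectType": "Activity",
            "id": course_url,
            "definition": {"name": {"en": "Example VLE course"}},
        },
        "timestamp": timestamp,
    }

stmt = vle_access_statement(
    "student@example.ac.uk",
    "https://vle.example.ac.uk/course/101",
    "2018-08-01T09:00:00Z",
)
print(json.dumps(stmt, indent=2))
```

Because every source system emits the same actor/verb/object shape, the hub can store and query semi-structured activity data from the VLE, library, and attendance systems in one consistent way.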
16. Products and dashboards
Data Explorer: learning analytics dashboards for staff, focussing on showing learning analytics data to staff based on their role.
Study Goal: an app for students, allowing them to view their learning analytics data and set measurable actions to support their success.
Learning Analytics Predictor: a predictive model designed to do one thing well – predict success at course level. Output can be viewed in Data Explorer or any other system that can integrate with the Learning Data Hub.
Traffic Lights Calculator: a straightforward rules-based engine, allowing RAG status to be calculated for online activity, attendance and achievement at module level. Output from the calculator can be viewed in Data Explorer or any other system that can integrate with the Learning Data Hub.
Learning Data Hub: the core of Jisc's learning analytics service. It holds data about students and works in conjunction with an institution's data warehouse, rather than replacing it, to share data between applications in a standard way; it is also a collection point for semi-structured learning data such as student activity.
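The Traffic Lights Calculator is only described at this level of detail here, but a rules-based RAG engine of the kind described can be sketched as follows; the thresholds and metric names are invented for illustration and are not Jisc's actual rules:

```python
# Illustrative rules-based RAG (red/amber/green) calculator in the
# spirit of the Traffic Lights Calculator described above. Thresholds
# and metric names are invented for this sketch.
def rag_status(value, amber_threshold, green_threshold):
    """Map a 0-100 metric to a RAG status using two thresholds."""
    if value >= green_threshold:
        return "green"
    if value >= amber_threshold:
        return "amber"
    return "red"

def module_rag(metrics):
    """Per-metric RAG for online activity, attendance and achievement."""
    rules = {
        "online_activity": (40, 70),   # (amber, green) thresholds
        "attendance": (60, 85),
        "achievement": (40, 60),
    }
    return {name: rag_status(metrics[name], *rules[name]) for name in rules}

print(module_rag({"online_activity": 75, "attendance": 62, "achievement": 35}))
# {'online_activity': 'green', 'attendance': 'amber', 'achievement': 'red'}
```

A rules engine like this is deliberately transparent: staff can see exactly why a student is flagged, which is harder with an opaque predictive model.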
17. Data Explorer
Data Explorer Release 2.0 – Aug 18
• View data in the learning records warehouse
• Site Overview – overview of all data
• My Students and My Modules
• Notes (interventions) on students
• RAG status and predictive models
• User guide and videos
https://docs.analytics.alpha.jisc.ac.uk/docs/data-explorer/Home
Jisc Learning Analytics 2017
19. Study Goal
Study Goal aims:
• Social learning app with gamification
• Setting targets and logging self-declared activity (Fitbit model)
• View activity and attainment data
• Attendance check-in
• Guides and videos
https://docs.analytics.alpha.jisc.ac.uk/docs/study-goal/Home
21. Learning Analytics Service
From learning analytics now to cognitive analytics and AI in the future:
• Learning analytics (now): VLE data + student record system + attendance data + library data, for better retention and attainment
• Institutional analytics: buildings data + learning space data + location data, for a more efficient campus
• Educational analytics: teaching quality data + assessment data + curriculum design data, for improved teaching & curricula
• Cognitive analytics and AI (future): content data + learning pathways data, for personalised and adaptive learning
22. Health and well-being
• Can we use activity data to support health and well-being?
• Timely interventions identify students earlier
• Patterns of behaviour
• Improved student support processes
• Developing coping strategies
• Additional data
• Student sentiment analysis
• Long-term data study
• Sensitive data
• Build AI models to predict at-risk students, also beyond graduation
23. Student Success
• Behavioural patterns that lead to success (attendance, engagement, attainment, submission date/time of assignments)
• Predictive models that look at success, i.e. a first or 2:1 – these will model the behaviours
• Grouping of behaviours that lead to success (e.g. accessing a wider range of resources, time on task, linking intended with actual behaviours)
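As a toy illustration of the predictive-model idea above (the data and the single "engagement" feature are synthetic inventions for this sketch; the actual Jisc predictor is not described in this deck), a minimal logistic model linking behaviour to outcomes:

```python
# Toy logistic-regression sketch linking a behavioural feature to a
# "good outcome" (e.g. a first or 2:1). The data and the single
# engagement feature are synthetic; real predictors use many features.
import math

def train_logistic(rows, epochs=2000, lr=0.1):
    """Fit w, b so that P(success) = sigmoid(w * engagement + b)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for engagement, success in rows:
            p = 1 / (1 + math.exp(-(w * engagement + b)))
            err = p - success          # gradient of the log loss
            w -= lr * err * engagement
            b -= lr * err
    return w, b

# (weekly engagement score scaled 0-1, achieved a "good" outcome 0/1)
history = [(0.9, 1), (0.8, 1), (0.7, 1), (0.4, 0), (0.3, 0), (0.2, 0)]
w, b = train_logistic(history)

def p_success(engagement):
    return 1 / (1 + math.exp(-(w * engagement + b)))

# Higher engagement should yield a higher predicted chance of success.
print(round(p_success(0.85), 2), round(p_success(0.25), 2))
```

In practice such a model would be trained on many behavioural features (attendance, VLE activity, submission timing) and validated against historical cohorts before being used to flag students.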
25. Employability
• Analyse data to find indicators that lead to employability
• Baseline data on employability
• Activity data, e.g.:
• Careers entry profiles
• Careers engagement activity
• Employability skills in modules
• Work experience
HEPI report: Employability: Degrees of Value