The document discusses learning analytics at The Open University. It describes how the university uses predictive analytics and indicators to identify at-risk students early and trigger interventions. A key focus is developing early alert indicators using statistical modeling of student data to predict assignment submissions and module completion probabilities. The university also implemented a policy on ethical use of student data, developed an evaluation framework for analytics-driven actions, and examined the impact of learning design on student outcomes and satisfaction. The overall aim is to retain more students and help them achieve their study goals through strategic and targeted use of analytics.
Presentations, morning session, 22 January 2018 HEFCE open event "Using data to..." (Bart Rienties)
With the Teaching Excellence Framework being implemented across England, a lot of higher education institutions have started to ask questions about what it means to be “excellent” in teaching. In particular, with the rich and complex data that all educational institutions gather that could potentially capture learning gains, what do we actually know about our students’ learning journeys? What kinds of data could be used to infer whether our students are actually making affective (e.g., motivation), behavioural (e.g., engagement), and/or cognitive learning gains? Please join us on 22 January 2018 in lovely Milton Keynes at a free OU- and HEFCE-supported event on Using data to increase learning gains and teaching excellence.
10.30-11.00 Welcome and Coffee
11.00-11.30 Lightning presentations by participants, outlining insights about learning gains
11.30-13.00 Insights from the ABC-Learning Gains project
Dr Jekaterina Rogaten (OU): Reviewing affective, behavioural and cognitive learning gains in higher education: a review of 54 learning gains studies
Prof Bart Rienties & Dr Jekaterina Rogaten (OU): Are assessment scores good proxies for estimating learning gains? A large-scale study amongst humanities and science students
Prof Rhona Sharpe (University of Surrey) & Dr Simon Cross (OU): Insights from 45 qualitative interviews with different learning gain paths of high and low achievers
Dr Ian Scott (Oxford Brookes) & Dr Simon Lygo-Baker (OU): Making sense of learning trajectories: a qualitative perspective
22 January 2018 HEFCE open event "Using data to increase learning gains and t..." (Bart Rienties)
14.00-15.00 Measuring learning gains with (psychometric) questionnaires
Dr Sonia Ilie, Prof Jan Vermunt, Prof Anna Vignoles (University of Cambridge, UK): Learning gain: from concept to measurement
Dr Fabio Arico (University of East Anglia): Learning Gain and Confidence Gain Through Peer-instruction: the role of pedagogical design
Dr Paul McDermott & Dr Robert Jenkins (University of East Anglia): A Methodology that Makes Self-Assessment an Implicit Part of the Answering Process
15.00-15.45 Measuring employability learning gains
Dr Heike Behle (University of Warwick): Measuring employability gain in Higher Education. A case study using R2 Strengths
Fiona Cobb, Dr Bob Gilworth, David Winter (University of London): Careers Registration Learning Gain project
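Pre/post questionnaire designs like those above often quantify learning gain as the fraction of available headroom a student achieves, commonly known as Hake's normalized gain. A minimal sketch, using entirely hypothetical scores:

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake's normalized gain: improvement as a fraction of the room to improve."""
    if pre >= max_score:
        raise ValueError("pre-test already at ceiling; gain undefined")
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post scores for three students
scores = [(40.0, 70.0), (60.0, 75.0), (20.0, 20.0)]
gains = [normalized_gain(pre, post) for pre, post in scores]
print(gains)  # [0.5, 0.375, 0.0]
```

Note that the first two students show different raw gains (30 vs 15 points) but the normalization rewards closing a larger share of the remaining gap, which is why the measure is popular for comparing students who start from different baselines.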
Talk by Rebecca Ferguson (Open University, UK, and the LACE project).
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to move from small-scale research within our disciplines towards large-scale implementation across our institutions. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In her keynote, Rebecca introduced tools, resources, organisations and case studies that can be used to support the deployment of learning analytics at scale.
Data Driven College Counseling by SchooLinks (Katie Fang)
This workshop will expose school counselors and administrators to a framework for data-driven college planning and accountability. Attendees will learn about data collection, pattern analysis, and translating insight into intervention to best support students in their college planning process. No special statistical knowledge is required for this session, just enthusiasm to understand how using data unlocks better student outcomes.
Associate Professor Tracey Bretag: Contract cheating implications for Teachin... (Studiosity.com)
"Contract cheating is a symptom, not a problem." Associate Professor Bretag provides an overview of the research on contract cheating and how students deal with it in the higher education landscape, at the 2018 Studiosity Symposium.
Watch the video of Tracey's presentation at https://youtu.be/6rS2mTIr1U4 [41mins]
Learning analytics: the state of the art and the future (Rebecca Ferguson)
Presentation given by Rebecca Ferguson at 'Nuevas métricas y enfoques para la evaluación e innovación en el aprendizaje' in Montevideo, Uruguay, on Wednesday 13 April 2016.
The talk deals with the state of the art in learning analytics, and with actions for taking this work forward at a national level.
2019 Midwest Scholarship of Teaching & Learning (SOTL) conference presentation. The goal of this presentation is to share our data-informed approach to re-engineer the exam design, delivery, grading, and item analysis process in order to construct better exams that maximize all students' potential to flourish. Can we make the use of exam analytics so easy and time-efficient that faculty clearly see the benefit? For more info see our blog at https://kaneb.nd.edu/real/
Presentation given by Rebecca Ferguson at the ORT University Institute of Education, Montevideo, Uruguay on 12 April 2016. It deals with the Innovating Pedagogy reports produced annually since 2012 by the Institute of Educational Technology (IET) at The Open University (OU).
Entrepreneurship and Mentorship in Online Courses (Greg Bybee)
Research by Stanford Professor Chuck Eesley. Research conducted on NovoEd's experiential learning platform.
Mentorship programs are increasingly on the agenda for policymakers and universities interested in fostering entrepreneurship. Few studies examine causal effects of mentorship. We investigate the impact of the type of mentorship on the likelihood that university students will become entrepreneurs. We use a longitudinal field experiment with a pre-test/post-test design where students in an entrepreneurship class were randomly assigned to receive mentorship from either entrepreneur or non-entrepreneur mentors. To our knowledge, this is the first randomized trial of a mentoring program in entrepreneurship. We find significant positive effects of mentorship, particularly by certain types of mentors.
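The effect estimate at the heart of such a pre-test/post-test randomized design is the difference in mean gain scores between the two mentorship arms. A minimal sketch of that comparison; the scores below are hypothetical, not data from the study:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for the difference in means of two independent samples."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical (pre, post) entrepreneurial-intention scores per student
entrepreneur_mentored = [(3.0, 4.5), (2.5, 4.0), (3.5, 4.0), (2.0, 3.5)]
non_entrepreneur_mentored = [(3.0, 3.5), (2.5, 2.5), (3.5, 4.0), (2.0, 2.5)]

# Gain score per student, then the treatment-control difference in means
gains_a = [post - pre for pre, post in entrepreneur_mentored]
gains_b = [post - pre for pre, post in non_entrepreneur_mentored]
print("mean effect:", mean(gains_a) - mean(gains_b))
print("Welch t:", welch_t(gains_a, gains_b))
```

Because assignment to mentor type is random, the difference in mean gains can be read as a causal effect; the Welch statistic then gauges whether it is distinguishable from noise without assuming equal variances across arms.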
Throughput, cost and standardization: Does a serious game in healthcare work for teaching parents and clinician neuro assessment in Children with VP Shunt? (INSPIRE_Network)
Closing the Gap With STEM Education: Why, What, and How
Participants will learn why there is a growing need for STEM education in the United States, what STEM education is, how STEM education at the middle school level contributes to closing the gap, and how to successfully plan and implement a middle school program.
Ken Verburg Project Lead the Way - Lexington, SC
Presentation by Rebecca Ferguson to the FutureLearn Academic Network (FLAN) meeting held at Universitat Pompeu Fabra in Barcelona on 27 January 2017. ‘What does the UK FLAN research tell us’ looks at 167 papers published by UK universities that are partnered with the FutureLearn MOOC platform. It focuses on priority areas for research, and the pressing research questions that emerge from the current research.
Designing Systemic Learning Analytics at the Open University
Belinda TynanPro-Vice-Chancellor Learning & TeachingThe Open University, UK
Simon Buckingham Shum Knowledge Media InstituteThe Open University, UK
Replay from today's webinar in the SoLAR online open course Strategy & Policy for Systemic Learning Analytics. Thanks to the Australian Office for Learning and Technology for sponsoring this, and to George Siemens for convening.
Abstract: The OU has been analysing student data and feeding this back to faculties since its doors opened 40 years ago. However, the emergence of learning analytics technologies opens new possibilities for engaging in more effective sensemaking of richer learner data, and more timely interventions. We will introduce the framework we are developing to orchestrate the rollout of a systemic organisational analytics infrastructure (both human and technical), and discuss some of the issues that arise. We will also describe how strategic research efforts will key into this design, should they prove effective.
The Open University (OU) is a global leader in quality online, open and distance education with more than 180,000 students and 8,000 faculty and staff. Like many organizations, the OU is embracing data and learning analytics as an increasingly important approach for understanding learner behaviors. During this Fischer Speaker Series event, Dr. Tynan explores the vagaries of leading an institutional strategy at scale, specifically focusing on faculty, student and institutional engagement with analytics to support student success- detailing wins, pitfalls and unexpected twists resulting in unintended but delightful outcomes.
Professor Belinda Tynan is the Pro-Vice-Chancellor (Learning Innovation) and Professor of Higher Education at the Open University, UK. Reporting to the Vice-Chancellor, the Pro-Vice-Chancellor for Learning Innovation contributes to the strategic vision and mission of the University and has a focus on supporting student success by providing executive leadership in the areas of innovation, strategy and policy development, production, informal learning and research and scholarship in technology enhanced learning.
The video of this presentation can be viewed at https://goo.gl/W8qpi6
Speakers:
David Lewis, senior analytics consultant, Jisc
Mike Hughes, IT director, City University, London
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with a chance to discuss the issues and possibilities that the use of learning analytics may create.
Overview of Effective Learning Analytics: Using data and analytics to support ... (Bart Rienties)
Begona Nunez-Herran and Kevin Mayles (Data and Student Analytics), Rebecca Ward (Data Strategy and Governance)
-Move towards centralised LA data infrastructure
-Data governance and lessons learned
Prof Bart Rienties & PhD students (Institute of Educational Technology)
-What is the latest “blue sky” learning analytics research from the OU?
-Rogers Kaliisa: Social Learning Analytics to support teaching (University of Oslo)
-Saman Rizvi: Cultural impact of MOOC learning (IET)
-Shi Min Chua: Why does no one reply to my posts? (IET/WELS)
-Maina Korir: Ethics and LA (IET)
-Anna Gillespie: Predictive Learning Analytics and role of tutors (EdD)
Prof John Domingue (Knowledge Media Institute) & Dr Thea Herodotou (IET)
-What have we learned from 5 years of large scale implementation of OU Analyse?
-Where is LA/AI going?
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with a chance to discuss the issues and possibilities that the use of learning analytics may create.
Learning analytics futures: a teaching perspective (Rebecca Ferguson)
Talk given by Rebecca Ferguson on 22 November 2018 at Università Ca' Foscari Venezia, at the event Nuovi orizzonti della ricerca pedagogica: evidence-based learning e learning analytics.
Supporting Higher Education to Integrate Learning Analytics (Yi-Shan Tsai)
This talk summarised the SHEILA project and its preliminary findings. It was presented at the EUNIS (European University Information Systems) workshop on 7 November 2017.
In this study, the effect of combining variables from the different data sources for student academic performance prediction was examined using three state-of-the-art classifiers: Decision Tree (DT), Artificial Neural Network (ANN) and Support Vector Machine (SVM). The study examined the use of heterogeneous multi-model ensemble techniques to predict student academic performance based on the combination of these classifiers and three different data sources. A quantitative approach was used to develop the various base classifier models, while the ensemble models were developed using the stacked generalisation ensemble method in order to overcome the individual weaknesses of the different models. Variables were extracted from the institution's Student Record System and Learning Management System (Moodle) and from a structured student questionnaire. At present, negligible work has been done using this integrated approach and ensemble techniques, especially with aggregated learner data in performance prediction in HE. The empirical results obtained show that the ensemble models…
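A stacked-generalisation setup of the kind described above can be sketched with scikit-learn. This is an illustrative reconstruction, not the study's code: the data is synthetic, standing in for the combined Student Record System, Moodle, and questionnaire variables.

```python
# Sketch of stacked generalisation over DT, ANN and SVM base learners.
# Synthetic data stands in for the SRS/LMS/questionnaire features.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the merged data sources: 12 features, binary pass/fail label
X, y = make_classification(n_samples=400, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_learners = [
    ("dt", DecisionTreeClassifier(random_state=0)),
    ("ann", make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=0))),
    ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))),
]
# Stacked generalisation: a meta-learner is trained on the base models'
# cross-validated predictions, compensating for individual weaknesses
stack = StackingClassifier(estimators=base_learners, final_estimator=LogisticRegression())
stack.fit(X_train, y_train)
print(f"hold-out accuracy: {stack.score(X_test, y_test):.2f}")
```

The scaling pipelines matter here: the ANN and SVM are sensitive to feature ranges in a way the decision tree is not, which is precisely the kind of heterogeneity stacking is meant to exploit.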
A whole of institution approach to improving student experience using usage d... (Blackboard APAC)
The University of Adelaide's Beacon of Enlightenment Strategic and Operational Plans outline a number of key targets for online learning. The 2013-14 Benchmarking eLearning Projects provided a baseline of the University's performance in regard to these online learning targets and a defined set of benchmarks. This presentation will provide a background to the Benchmarking eLearning Project, the data collected and how the data is leveraged in the faculty to improve the online learning experience of students. The newly established Guidelines for Minimum use of MyUni and a targeted approach to enhancing online course design in line with the targets will be discussed. Future plans to automate reporting using an expanded source of data to inform support activities and ensure the university is improving over time will be outlined.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard. 24 -27 August 2015 in Adelaide, Australia.
A Strategic Approach: GenAI in Education (Peter Windle)
Artificial Intelligence (AI) technologies such as Generative AI, Image Generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to Academic Integrity, with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, and policies were put in place. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessments, leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
1. Learning Analytics @ The Open University
JISC Networking Event 11th May 2016
Kevin Mayles, Head of Analytics, The Open University
2. kevin.mayles@open.ac.uk | @kevinmayles
Where are you from?
● PVC Learning & Teaching
● CIO / IT
● Planning Office
● Student Support
● Faculty
● Learning and Teaching Centre
● Institute of Educational Technology
● Faculties and Schools
● Learning and Teaching Solutions
● Academic Professional Services
● Information Technology
● Strategy and Information Office
● Academic Services
● Marketing
● Student Registration and Fees
● Business Performance Improvement
● Library Services
4.
OU Context
2014/15
174k students
● The average age of our new undergraduate students is 29
● 40% of new undergraduates have 1 A-Level or lower on entry
● Over 21,000 OU students have disabilities
● 868k assessments submitted; 395k phone calls and 176k emails received from students
5.
A clear vision statement was developed to galvanise effort across the
institution on the focused use of analytics
Analytics for student success vision
Vision
To use and apply information strategically (through specified indicators) to retain
students and progress them to complete their study goals
Mission
This needs to be achieved at:
● a macro level: aggregating information about the student learning experience at an institutional level to inform strategic priorities that will improve student retention and progression
● a micro level: using analytics to drive short-, medium- and long-term interventions
7.
The OU recognises that three equally important strengths are required
for the effective deployment of analytics
Analytics enhancement strategy
Adapted from Barton and Court (2012)
8.
Analytics enhancement strategy
Early alert indicators using
predictive analytics
Policy on the ethical use of
student data for learning analytics
Analytics for action evaluation
framework
Impact of learning design on
outcomes
11.
Open University: data + analysis
[Diagram: statistical modelling pipeline. Factors for the 2015 cohort form a ‘training’ dataset; logistic regression then produces an output dataset of predictions for the 2016 cohort.]
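The cohort-to-cohort modelling described above can be sketched in a few lines of scikit-learn. The three "factors" below are random stand-ins, not the OU's actual variables or pipeline: fit a logistic regression on one cohort's factors and outcomes, then score the next cohort.

```python
# Illustrative cohort-to-cohort prediction with logistic regression.
# The three 'factors' are random stand-ins, NOT the OU's real variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 'Training' dataset: factors and pass/fail outcomes for the 2015 cohort.
X_2015 = rng.normal(size=(1000, 3))
passed_2015 = (X_2015 @ np.array([1.2, 0.8, -0.3]) + rng.normal(size=1000)) > 0

model = LogisticRegression().fit(X_2015, passed_2015)

# Output dataset: predicted probability of completing and passing
# the module for each student in the 2016 cohort.
X_2016 = rng.normal(size=(200, 3))
probs = model.predict_proba(X_2016)[:, 1]
```

In practice the factors would be the variables identified by Calvert (2014), and the resulting probabilities would feed tools such as the Student Support Intervention Tool described later.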
12.
Development of early alert indicators
The 30 variables identified as associated with success vary in their importance at each milestone. They group into: student demographics; previous study and motivation; progress in previous OU study; student–module factors; and the qualification or module of study.
Calvert (2014)
13.
Current indicators
Module probabilities: integrated into the Student Support Intervention Tool, this indicator predicts the probability of a student completing and passing the module.
17.
Outcomes of current pilots
Summary of the interim evaluation of piloting as at March 2016
● The quantitative analysis gives a mixed picture of the impact on withdrawal rates and assignment submissions in the pilot tutor groups (note that tutors were self-selected and the expectations to intervene were not consistent across the piloting modules)
● It is a useful tool for understanding students and their participation
● Predictions generally agree with tutors’ experience and intuitions about which students might be at risk
● A (potential) USP of OU Analyse is the information it provides between assignment submissions on students’ engagement with learning materials
● Overall, all tutors interviewed were positive about the affordances of OUA and are keen to use it again (for a range of reasons) on their next module
18.
Case studies and vignettes
“I love it it’s brilliant. It brings together things I already do
[…] it’s an easy way to find information without researching
around such as in the forums and look for students to see
what they do when I have no contact with them […] if they
do not answer emails or phones there is not much I can do.
OUA tells me whether they are engaged and gives me an
early indicator rather than waiting for the day they submit”
19.
Analytics enhancement strategy
Early alert indicators using
predictive analytics
Policy on the ethical use of
student data for learning analytics
Analytics for action evaluation
framework
Impact of learning design on
outcomes
28.
Supporting module teams
● Module teams work with support staff to identify actions that can be taken for current and future presentations
● The Analytics project has developed a ‘costed’ menu of response actions that can be taken ‘in-presentation’ or during the next presentation
● Budgetary considerations
● Resource considerations
Removing ‘blockers’
[Diagram: the Module Team at the centre, enabled by LTS, the SST, ALs and Library Services]
Supporting evaluation methods
29.
Analytics enhancement strategy
Early alert indicators using
predictive analytics
Policy on the ethical use of
student data for learning analytics
Analytics for action evaluation
framework
Impact of learning design on
outcomes
33.
Learning design
[Diagram: across 150+ modules, four learning design types (constructivist, assessment, balanced-variety, socio-constructivist) are linked to weekly VLE engagement (week 1 to week 30+), communication, student satisfaction and student retention]
Rienties, B. and Toetenel, L. (2016)
34.
Analytics enhancement strategy
Early alert indicators using
predictive analytics
Policy on the ethical use of
student data for learning analytics
Analytics for action evaluation
framework
Impact of learning design on
outcomes
35.
Are there any questions?
For further details please contact:
● Kevin Mayles – kevin.mayles@open.ac.uk
● @kevinmayles
● Slideshare: http://www.slideshare.net/KevinMayles
● OU Analyse: https://analyse.kmi.open.ac.uk/
References:
BARTON, D. and COURT, D., 2012. Making advanced analytics work for you. Harvard Business Review, 90(10), pp. 78–83.
CALVERT, C.E., 2014. Developing a model and applications for probabilities of student success: a case study of predictive analytics. Open Learning: The Journal of Open, Distance and e-Learning.
KUZILEK, J., HLOSTA, M., HERRMANNOVA, D., ZDRAHAL, Z. and WOLFF, A., 2015. OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, no. LAK15-1, March 2015. ISSN 2057-7494.
RIENTIES, B. and TOETENEL, L., 2016. The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, pp. 333–341.
SCHÖN, D.A., 1987. Educating the reflective practitioner: toward a new design for teaching and learning in the professions. San Francisco, CA: Jossey-Bass.
Editor's Notes
Belinda
This encapsulates our strategy, which is moving forward on all fronts.
Kevin will now demonstrate an operational tool available at scale and one of our latest experimental prototypes.
Belinda
Analytics is at the heart of the university’s strategic priority to deliver an outstanding student experience.
We’ve developed this vision that drives our development of the use of analytics for both short term action and long term strategic decision making.
Belinda
Our strategy is based around the 3 key underpinning strengths we need to develop as an institution. Each equally important.
Needing to improve our longer-term forecasting of student numbers, we are now able to predict a student's chances of success by mapping their situation and characteristics to 30 of the most significant factors associated with student success:
These include student characteristics or factors such as
previous educational attainment
occupational status
previous study or reasons for study
the number of credits already achieved
the total number of previous passes or withdrawals
the number of credits being studied in the year
latest assignment scores
what modules or qualification they are studying.
We then map this data to historical records. Essentially, we model one cohort of students based on the activities and outcomes of previous cohorts. This enables us to correlate those student characteristics with likely student outcomes. This aggregated data is very accurate, giving us a 2% tolerance.
This data can be used to predict student progression and therefore income.
As well as being applied to a cohort these predictions can also be used to trigger interventions with individual students who may have a low probability of success.
Mention Zdenek Zdrahal and how session…
In the second approach we have developed a tool called OU Analyse, which enables us to undertake a week-by-week analysis of student engagement, including engagement with their learning activities.
From this data we can analyse previous student cohorts and correlate learning activities against outcome. This enables us to identify, on a module by module basis, which activities carry the greatest value in terms of student success.
To note: the predictive model assesses a student using four different algorithms; each algorithm takes into account different factors and the historic data we hold about previously successful students. The model gives the student a vote for each algorithm that flags them as at risk, so the more votes a student has, the higher the likelihood they are at risk.
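The voting scheme described in the note above can be sketched as follows. The four classifiers, the features and the labels are illustrative placeholders, not the algorithms actually used by OU Analyse (Kuzilek et al., 2015 describe the real system).

```python
# Illustrative 'voting' risk model: four classifiers each cast a vote
# that a student is at risk; the vote total is the risk signal.
# Features, labels and model choices are placeholders only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))  # e.g. four weekly VLE activity counts
y_at_risk = (X.sum(axis=1) + rng.normal(size=500)) < 0

models = [LogisticRegression(), DecisionTreeClassifier(max_depth=3),
          GaussianNB(), KNeighborsClassifier()]
for m in models:
    m.fit(X, y_at_risk)

# Each model votes 0/1; totals range from 0 (low risk) to 4 (high risk).
X_new = rng.normal(size=(10, 4))
votes = sum(m.predict(X_new).astype(int) for m in models)
```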
On this slide, the diagram represents one of the algorithms, which takes into account the learning activities, represented by the circles. Blue circles are activities that combine different actions, such as forum use, interaction with or downloading of resources, or online content. The pink circle shows a week of non-engagement with the VLE.
Here we see a stylised pattern of weekly activity of a student who successfully achieved a pass for this module.
Contrast this with the activity of a student who didn’t submit. You can see that the pattern is very different. In this example, the student who didn’t submit spent periods not engaging with the VLE.
By modelling the patterns of behaviour of students who pass, fail or don’t submit, we are able to predict what successful and unsuccessful VLE engagement looks like, and derive a ‘fingerprint’ for each module.
Armed with this knowledge we are then able to monitor activity within the current cohort and quickly identify individual students who are not engaging with those high value activities. These students are flagged as ‘at risk’ and can be offered targeted interventions.
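One simple way to realise the "fingerprint" idea described above is to compare a student's weekly VLE engagement vector with the average weekly pattern of previous passing students. The click counts and the 0.8 similarity threshold below are purely illustrative, not taken from OU Analyse.

```python
# Toy module 'fingerprint' check: flag students whose weekly VLE
# engagement diverges from the average pattern of past passers.
# All numbers and the threshold are illustrative only.
import numpy as np

# Average clicks per week for students who previously passed the module.
pass_fingerprint = np.array([12, 10, 11, 9, 10, 8, 9, 10], dtype=float)

def cosine(a, b):
    # Cosine similarity between two engagement vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def at_risk(weekly_clicks, threshold=0.8):
    # A student whose pattern is dissimilar to the fingerprint is flagged.
    v = np.asarray(weekly_clicks, dtype=float)
    return cosine(v, pass_fingerprint) < threshold

engaged = [11, 9, 12, 10, 9, 8, 10, 9]    # tracks the fingerprint closely
disengaged = [14, 2, 0, 0, 1, 0, 0, 0]    # long run of non-engagement
```

Here `at_risk(engaged)` is false while `at_risk(disengaged)` is true, mirroring the contrast between the two stylised activity patterns described above.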
Finally, we have also applied analytics to the exploration of the impact of learning design on learning performance.
Takes demographic data + presentation-related data (aggregated VLE data available weekly)
For each module, identifying important VLE actions, e.g., access forum, access or download a resource, access OU content. Learning activities will include a combination of these actions
Identify students at risk of failing the module as early as possible so that OU intervention is efficient and meaningful.
Based on demographic data
Based on VLE activities
Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment-driven, balanced, or socio-constructivist.
The LAK paper by Rienties and colleagues indicates that learning design and learning design activities in particular strongly influence how students are engaging in our LMS. VLE engagement is higher in modules with socio-constructivist or balanced variety learning designs, and lower for constructivist designs.
In terms of learning outcomes, learning design seems to have an impact on learning performance. Students rate constructivist modules higher, and socio-constructivist modules lower. However, in terms of student retention (% of students passed) constructivist modules have lower retention, while socio-constructivist have higher. In particular, modules with a heavy reliance on content and cognition (assimilative activities) seemed to lead to lower completion and pass rates.
Thus, learning design strongly influences behaviour, experience and performance. (and we believe we are the first to have mapped this with such a large cohort).
Red lines represent –ve effect
Green lines represent +ve effect
Blue lines represent no significant effect.
Eventually with the availability of yet more data, we may soon be better able to understand the complex relationship between learning design and learning processes and outcomes with resultant sharing of best practice evidenced by such analyses. If this could then be mapped to student characteristics, there is the potential to personalise learning designs.
(Learning performance was calculated by the number of learners who completed and passed the module relative to the number of learners who registered for each module.)
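The definition in the parenthetical note above reduces to a simple ratio; a minimal sketch with hypothetical numbers:

```python
# Learning performance as defined in the note above: the proportion of
# registered learners who completed and passed the module.
# The example figures are hypothetical.
def learning_performance(completed_and_passed: int, registered: int) -> float:
    return completed_and_passed / registered

example = learning_performance(620, 1000)  # 620 of 1000 registrants passed
```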