The document discusses developing evidence-based institutional learning analytics policies. It describes the methodology used in the SHEILA project which included surveys and interviews with academic staff, students, senior managers and experts. The surveys and interviews aimed to understand stakeholders' views on learning analytics, adoption challenges, and expectations. Key findings included strong interest in using learning analytics to improve student and teaching performance, but also barriers around skills, culture, technology, ethics and privacy. Academic staff surveys found the highest expectations were around accurate data collection and presentation, while lowest were around obligations to act on at-risk students. Focus groups with staff further explored opportunities and concerns around learning analytics.
Higher Education Teachers' Experiences of Learning Analytics in Relation to S... (David Heath)
This document summarizes a study on higher education teachers' experiences with learning analytics in relation to student retention. The study surveyed 276 teachers across Australia and New Zealand about their discussions and involvement with learning analytics, interest in potential applications, and views on institutional support. Key findings include that while teachers are interested in using analytics for retention efforts like identifying at-risk students, they need more training, access to data, and guidance on how to interpret and respond to data. The document calls for institutions to provide more support to help teachers effectively use learning analytics.
Curriculum, by Jeremy Kilpatrick and John Dossey (Glaiden Rufino)
This document discusses key aspects of developing a coherent mathematics curriculum. It emphasizes that a curriculum must clearly define its purpose and intended outcomes. It recommends focusing content domains and cognitive processes concisely while ensuring connections. A philosophy of pedagogy should value reasoning, problem-solving, multiple perspectives and mathematical autonomy. Finally, developing coherence across all elements and attending to challenges of implementation are vital.
Teaching and Learning Analytics to Support the Classroom Teacher Inquiry
Invited Tutorial
IEEE Global Engineering Education Conference (EDUCON2017), University of Piraeus, Greece
26-28 April 2017
Invited Tutorial
ΕΤΠΕ2017 Conference, ΑΙΣΠΕΤΕ, Greece
21-23 April 2017
Invited Tutorial
8th IEEE International Conference on Technology for Education (T4E 2016), IIT Bombay, Mumbai, India
1 December 2016
This document summarizes a 3-year HEFCE pilot programme called LEGACY aimed at measuring learning gain and employability across 18 Russell Group universities. The programme has 4 work packages focused on (1) measuring learning gain, (2) student strengths and career development, (3) career adaptability, and (4) the impact of international experiences on employability. It will develop tools and methodologies for assessing learning gain longitudinally and cross-sectionally, identify core learning dimensions, and produce recommendations based on findings from student surveys, interviews and data analysis. The goal is to better understand factors influencing learning gain.
The document summarizes the impact and initiatives of the African Institute for Mathematical Sciences (AIMS). AIMS seeks to contribute to Africa's socio-economic transformation through scientific training, technical advances, and innovative policy design. It operates 6 centers of excellence that provide master's programs in mathematical sciences, drawing world-class lecturers. AIMS also has initiatives in industry partnerships, teacher training, research, and the Next Einstein Forum to promote science in Africa. Upcoming plans include expanding training programs, research chairs, and convenings to further AIMS' mission.
Realizeit is an adaptive learning platform that aims to provide individualized learning experiences for students. It uses analytics to guide students and instructors. Analytics help improve courses over time, provide insights into student and faculty engagement, and measure the effectiveness of different instructional approaches. Case studies at Colorado Technical University and University of Central Florida found increased pass rates, retention, and student satisfaction when using Realizeit. Institutions can also collaborate with Realizeit on research to continuously improve courses.
This document summarizes several case studies highlighting the use of TurningPoint response technology by educators. It describes how TurningPoint allows educators to gauge student understanding in real-time, stimulate discussion, and use reporting features to produce learning analytics and research. Case studies describe its use to provide feedback and self-assessment for students in large classes, integrate interactive theater to enhance learning, and prove teaching excellence.
Jenni Hayman EdMedia brief paper presentation (Jenni Hayman)
The document summarizes a literature review and preliminary instrument developed for a Delphi study on essential practices for online instruction. The literature review analyzed 18 sources and identified 70 recommended practices across categories like facilitation, assessment, and instructional design. These practices will be used to develop a survey to determine which ones expert online instructors agree are essential at their institution. The goal is to establish guidelines to improve online instructor training and evaluation.
Learning Dashboards for Feedback at Scale (Tinne De Laet)
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practice. Can learning dashboards deployed at scale create new learning traces? The talk shares experiences from a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Invited Talk
"Educational Data Analytics in School Practice"
5th Panhellenic Scientific Conference
"Integration and Use of ICT in the Educational Process"
21-23 April 2017
This document summarizes the key findings and recommendations from the US National Mathematics Advisory Panel's 2008 report on modernizing mathematics curriculum and instruction in the United States. The summary highlights that the Panel recommended streamlining the K-8 mathematics curriculum to focus on mastery of key topics like fractions that are critical foundations for algebra. It also recommended ensuring all students have access to an authentic algebra course by 8th grade and that teachers need to have strong content knowledge in algebra topics. The Panel found limited evidence that calculators improve math skills and called for more high-quality research on effective instructional practices.
Open Education Research: Overview, Benefits and Challenges (Robert Farrow)
Open education research has grown since 2012 through projects led by the OER Research Hub exploring topics like student performance, access, and educator practice. The hub conducts global surveys, publishes reports, and builds research capacity through networks. Current projects examine business models for open education and how teachers reuse OER through online courses. The presentation reviews the hub's work and encourages collaboration to further open scholarship.
The document discusses the New Mathways Project (NMP), which promotes math pathways and redesigned developmental math courses to improve student outcomes. It covers:
1) What math pathways and the NMP are, including defining math pathways as course sequences aligned to programs of study and the NMP's four-pronged redesign model.
2) The positive impact of the NMP on student outcomes like developmental education completion rates doubling in half the time.
3) Best practices for institutions implementing math pathways, such as defining back-to-back math and regional transfer agreements.
Curriculum analytics: Using data from student learning analytics (Jisc)
This document discusses curriculum analytics, which is the use of data to understand and enhance the curriculum. It provides examples of how curriculum analytics can be used, such as identifying modules that result in better learning, understanding how curriculum sequencing affects performance, and informing real-time teaching adjustments. Potential users are identified as lecturers, course directors, learning technologists, and senior management. Next steps proposed include a curriculum data gathering pilot to define useful metrics and analyze use cases. The document advocates developing curriculum objects that describe curriculum aspects and associated analytics to enhance learning.
Linda Darling-Hammond puts American school reform in the context of what other nations are doing to prepare their young for a global knowledge economy. See the best-practices recommendations.
Big Data Analysis on Student Learning & Course Evaluation in Waseda Universit... (CHES_waseda_univ)
This document summarizes a presentation about big data analysis on student learning and course evaluation at Waseda University in Japan. It discusses how accountability in higher education has increased in Japan due to globalization and demographic changes. Waseda University established the Center for Higher Education Studies to strengthen institutional research functions and use data analytics to support decision making. As a case study, the presentation analyzes data from an integrated data warehouse on student time spent studying and course grades. It argues this type of analysis can open a dialogue about benchmarks for student learning and theories of education within higher education.
1. The document discusses learning analytics (LA), including what it is, examples of LA tools and projects, and stakeholder viewpoints.
2. Stakeholders like managers, teachers, and students have different views on how LA could be used to improve learning, teaching, and student outcomes.
3. Key concerns about LA include issues around resources, skills, privacy, and ensuring LA adds value and doesn't negatively stereotype or limit students.
Development of educational tools that enable large-scale ethical empirical re... (Hassan Khosravi)
The value of students developing the capacity to make accurate judgements about the quality of their work and that of others has been widely studied and recognised in the higher education literature. To date, much of the research and commentary on evaluative judgement has been theoretical in nature, focusing on perceived benefits and proposing strategies seen to hold the potential to foster evaluative judgement; the efficacy of these strategies remains largely untested. The rise of educational tools and technologies which generate data on learning activities at an unprecedented scale, alongside insights from the learning analytics and educational data mining communities, provides new opportunities for fostering and supporting empirical research on evaluative judgement. Accordingly, this paper offers a conceptual framework, and an instantiation of the framework in the form of an educational tool called RiPPLE, for data-driven approaches to investigating the enhancement of evaluative judgement. Two case studies, demonstrating how RiPPLE can foster and support empirical research on evaluative judgement, are presented.
Role of International Testing and Assessment Agencies (Mamoona Shahzad)
International testing and assessment agencies are responsible for constructing and administering tests at the international level to evaluate and compare educational systems among countries. Some major agencies discussed include the OECD's PISA, IEA's TIMSS and PIRLS, ETS, ACER, IAEA, and AEA. These agencies seek to compare student achievement across nations and inform education policy through cross-national assessments in important subject areas.
Introduction to Learning Analytics for High School Teachers and Managers (Vitomir Kovanovic)
Presentation at the first Learning Analytics Learning Network (LALN) Event in Adelaide, Australia on Oct 22, 2019.
Abstract:
With the increased adoption of technology, institutions have unprecedented opportunities to continuously improve the quality of their services through data collection and analysis. Schools and universities now have data about learners and their contexts that can provide valuable insight into how they learn. Early attempts were directed towards mining educational data to identify students-at-risk and develop interventions. Recently, more sophisticated approaches are being deployed by researchers and practitioners. These include analysis of learner behaviour that leads to various learning outcomes, social networks and teams, employability, creativity, and critical thinking. Analysing digital traces generated through learning processes requires a broad suite of methods from data science, statistics, psychometrics, social and learning sciences.
This workshop aims to introduce teachers and educators to the fast-growing and promising field of learning analytics, exploring how digital data can be used for the analysis and improvement of student learning. First, we will provide an overview of learning analytics, its key methods and approaches, as well as the problems for which it can be used. Second, attendees will engage in group learning activities to explore ways in which learning analytics could be used within their institutions. The focus will be on identifying learning-related challenges that are relevant to their particular context and exploring how learning analytics can be used practically and effectively.
Outcome-based education (OBE) is an educational theory that bases each part of an educational system around goals (outcomes). By the end of the educational experience, each student should have achieved those goals.
High-Quality "Arts Integration" Programs Can Benefit Learning in Core Subjects (José Moreno)
“Arts integration” is a mouthful of a term for a simple idea: using the arts to help students learn about other subjects. Now, a study by the American Institutes for Research (AIR) quantifies the effects. It finds that high-quality programs that incorporate music, theater or other arts into core subjects such as English and math can make a difference in learning.
The document outlines a three-tier model for promoting institutional adoption of learning analytics at universities.
Tier 1 involves small-scale pilot projects using various learning analytics tools to provide insights. Tier 2 establishes a community of interest to share practices. Tier 3 develops learning analytics principles, frameworks and governance models for institutional implementation.
The model was applied at Victoria University of Wellington, resulting in learning analytics principles and framework documents, and progress towards an institutional governance model to bring analytics to scale safely while respecting data ethics. Various pilot projects provided lessons about the need for staff capability development and coordination across the university.
Learning analytics research informed institutional practice (Yi-Shan Tsai)
The document summarizes learning analytics research and initiatives at the University of Edinburgh. It discusses early MOOC and VLE analytics projects that aimed to understand student behaviors and identify patterns. It also describes the Learning Analytics Map of Activities, Research and Roll-out (LAMARR) and efforts to build institutional capacity for learning analytics. Challenges discussed include the effort required to analyze raw data and involve stakeholders. The document advocates developing critical and participatory approaches to educational data analysis.
Talk by Rebecca Ferguson (Open University, UK, and LACE project).
The promise of learning analytics is that it will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to move from small-scale research within our disciplines towards large-scale implementation across our institutions. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire 'TEL technology complex'. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In her keynote, Rebecca introduced tools, resources, organisations and case studies that can be used to support the deployment of learning analytics at scale.
The document discusses the SpeakApps project which aims to develop tools and tasks for oral production and interaction using a learning analytics approach. It provides an overview of learning analytics and references a learning analytics reference model. The model describes analyzing data from the SpeakApps platform to evaluate claims about task design, specifically regarding time limitations for recordings. Data sources would include behavioral logs from the platform and user generated content to assess the engagement and experiences of students, teachers, and instructional designers.
Jenni hayman ed media brief paper presentationJenni Hayman
The document summarizes a literature review and preliminary instrument developed for a Delphi study on essential practices for online instruction. The literature review analyzed 18 sources and identified 70 recommended practices across categories like facilitation, assessment, and instructional design. These practices will be used to develop a survey to determine which ones expert online instructors agree are essential at their institution. The goal is to establish guidelines to improve online instructor training and evaluation.
Learning Dashboards for Feedback at ScaleTinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboard be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA ) that focuses on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards deployed at scale, create new learning traces? This talk shares experiences of a large scale deployment of learning dashboards with more than 12.000 students. Presented at laffas.eu.
Προσκεκλημένη Ομιλία
"Αναλυτική Εκπαιδευτικών Δεδομένων στην Σχολική Πράξη"
5ο Πανελλήνιο Επιστημονικό Συνέδριο
«Ένταξη και Χρήση των ΤΠΕ στην Εκπαιδευτική Διαδικασία»
21-23 Απριλίου 2017
This document summarizes the key findings and recommendations from the US National Mathematics Advisory Panel's 2008 report on modernizing mathematics curriculum and instruction in the United States. The summary highlights that the Panel recommended streamlining the K-8 mathematics curriculum to focus on mastery of key topics like fractions that are critical foundations for algebra. It also recommended ensuring all students have access to an authentic algebra course by 8th grade and that teachers need to have strong content knowledge in algebra topics. The Panel found limited evidence that calculators improve math skills and called for more high-quality research on effective instructional practices.
Open Education Research : Overview, Benefits and Challenges Robert Farrow
Open education research has grown since 2012 through projects led by the OER Research Hub exploring topics like student performance, access, and educator practice. The hub conducts global surveys, publishes reports, and builds research capacity through networks. Current projects examine business models for open education and how teachers reuse OER through online courses. The presentation reviews the hub's work and encourages collaboration to further open scholarship.
The document discusses the New Mathways Project (NMP), which promotes math pathways and redesigned developmental math courses to improve student outcomes. It covers:
1) What math pathways and the NMP are, including defining math pathways as course sequences aligned to programs of study and the NMP's four-pronged redesign model.
2) The positive impact of the NMP on student outcomes like developmental education completion rates doubling in half the time.
3) Best practices for institutions implementing math pathways, such as defining back-to-back math and regional transfer agreements.
Curriculum analytics: Using data from student learning analyticsJisc
This document discusses curriculum analytics, which is the use of data to understand and enhance the curriculum. It provides examples of how curriculum analytics can be used, such as identifying modules that result in better learning, understanding how curriculum sequencing affects performance, and informing real-time teaching adjustments. Potential users are identified as lecturers, course directors, learning technologists, and senior management. Next steps proposed include a curriculum data gathering pilot to define useful metrics and analyze use cases. The document advocates developing curriculum objects that describe curriculum aspects and associated analytics to enhance learning.
Linda Darling-Hammond puts American school reform in the context of what other nations are doing to prepare their young for a global knowledge economy. See the best-practices recommendations.
Big Data Analysis on Student Learning & Course Evaluation in Waseda Universit...CHES_waseda_univ
This document summarizes a presentation about big data analysis on student learning and course evaluation at Waseda University in Japan. It discusses how accountability in higher education has increased in Japan due to globalization and demographic changes. Waseda University established the Center for Higher Education Studies to strengthen institutional research functions and use data analytics to support decision making. As a case study, the presentation analyzes data from an integrated data warehouse on student time spent studying and course grades. It argues this type of analysis can open a dialogue about benchmarks for student learning and theories of education within higher education.
1. The document discusses learning analytics (LA), including what it is, examples of LA tools and projects, and stakeholder viewpoints.
2. Stakeholders like managers, teachers, and students have different views on how LA could be used to improve learning, teaching, and student outcomes.
3. Key concerns about LA include issues around resources, skills, privacy, and ensuring LA adds value and doesn't negatively stereotype or limit students.
Development of educational tools that enable large-scale ethical empirical re...Hassan Khosravi
The value of students developing the capacity to make accurate judgements about the quality of their work and that of others has been widely studied and recognised in higher education literature. To date, much of the research and commentary on evaluative judgement has been theoretical in nature, focusing on perceived benefits and proposing strategies seen to hold the potential to foster evaluative judgement. Their efficacy remains largely untested. The rise of educational tools and technologies which generate data on learning activities at an unprecedented scale, alongside insights from the learning analytics and educational data mining communities, provide new opportunities for fostering and supporting empirical research on evaluative judgement. Accordingly, this paper offers a conceptual framework and an instantiation of the framework in the form of an educational tool called RiPPLE for data-driven approaches to investigate the enhancement of evaluative judgement. Two case studies, demonstrating how RiPPLE can foster and support empirical research on evaluative judgement are presented.
ROLE OF INTERNATIONAL TESTING AND ASSESSMENT AGENCIES Mamoona Shahzad
International testing and assessment agencies are responsible for constructing and administering tests at the international level to evaluate and compare educational systems among countries. Some major agencies discussed include the OECD's PISA, IEA's TIMSS and PIRLS, ETS, ACER, IAEA, and AEA. These agencies seek to compare student achievement across nations and inform education policy through cross-national assessments in important subject areas.
Introduction to Learning Analytics for High School Teachers and ManagersVitomir Kovanovic
Presentation at the first Learning Analytics Learning Network (LALN) Event in Adelaide, Australia on Oct 22, 2019.
Abstract:
With the increased adoption of technology, institutions have unprecedented opportunities to continuously improve the quality of their services through data collection and analysis. Schools and universities now have data about learners and their contexts that can provide valuable insight into how they learn. Early attempts were directed towards mining educational data to identify students-at-risk and develop interventions. Recently, more sophisticated approaches are being deployed by researchers and practitioners. These include analysis of learner behaviour that leads to various learning outcomes, social networks and teams, employability, creativity, and critical thinking. Analysing digital traces generated through learning processes requires a broad suite of methods from data science, statistics, psychometrics, social and learning sciences.
This workshop aims to introduce teachers and educators to the fast growing and promising field of learning analytics. How digital data can be used for the analysis and improvement of student learning will be explored. First, we will provide an overview of learning analytics, its key methods and approaches, as well as problems for which it can be used. Secondly, attendees will engage in group learning activities to explore ways in which learning analytics could be used within their institutions. The focus will be on identifying learning-related challenges that are relevant to their particular context and exploring how learning analytics can be used to practically and effectively.
Outcome-based education (OBE) is an educational theory that bases each part of an educational system around goals (outcomes). By the end of the educational experience, each student should have achieved the goal.
High-Quality “Arts Integration” Programs Can Benefit Learning in Core Subjects José Moreno
“Arts integration” is a mouthful of a term for a simple idea: using the arts to help students learn about other subjects. Now, a study by the American Institutes for Research (AIR) quantifies the effects. It finds that high-quality programs that incorporate music, theater or other arts into core subjects such as English and math can make a difference in learning.
The document outlines a three tier model for promoting institutional adoption of learning analytics at universities.
Tier 1 involves small scale pilot projects using various learning analytics tools to provide insights. Tier 2 establishes a community of interest to share practices. Tier 3 develops learning analytics principles, frameworks and governance models for institutional implementation.
The model was applied at Victoria University of Wellington, resulting in learning analytics principles and framework documents, and progress towards an institutional governance model to bring analytics to scale safely while respecting data ethics. Various pilot projects provided lessons about the need for staff capability development and coordination across the university.
Learning analytics research informed institutional practice – Yi-Shan Tsai
The document summarizes learning analytics research and initiatives at the University of Edinburgh. It discusses early MOOC and VLE analytics projects that aimed to understand student behaviors and identify patterns. It also describes the Learning Analytics Map of Activities, Research and Roll-out (LAMARR) and efforts to build institutional capacity for learning analytics. Challenges discussed include the effort required to analyze raw data and involve stakeholders. The document advocates developing critical and participatory approaches to educational data analysis.
Talk by Rebecca Ferguson (Open University, UK, and LACE project).
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to move from small-scale research within our disciplines towards large-scale implementation across our institutions. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In her keynote, Rebecca introduced tools, resources, organisations and case studies that can be used to support the deployment of learning analytics at scale.
The document discusses the SpeakApps project which aims to develop tools and tasks for oral production and interaction using a learning analytics approach. It provides an overview of learning analytics and references a learning analytics reference model. The model describes analyzing data from the SpeakApps platform to evaluate claims about task design, specifically regarding time limitations for recordings. Data sources would include behavioral logs from the platform and user generated content to assess the engagement and experiences of students, teachers, and instructional designers.
Learning analytics futures: a teaching perspective – Rebecca Ferguson
Talk given by Rebecca Ferguson on 22 November 2018 at Università Ca' Foscari Venezia, at the event Nuovi orizzonti della ricerca pedagogica: evidence-based learning e learning analytics (New horizons of pedagogical research: evidence-based learning and learning analytics).
This document discusses the potential for establishing a Center for Data Governance and Innovation to help facilitate learning analytics projects at UvA universities. It outlines some roles the center could play, such as approving learning analytics projects to ensure ethical and legal compliance, managing knowledge about data policies, and facilitating communication between stakeholders. The center could help address issues around data ownership, gatekeeper resistance, and complex infrastructure challenges. Establishing good data governance is important to enable learning analytics and other big data initiatives while protecting student privacy and ethical use of data.
The document summarizes a case study on using data analysis and learning analytics in higher education. It describes how data was collected through student surveys to understand attitudes towards university services quality. The data was analyzed using SPSS and most students had positive attitudes. Recommendations included using additional quality models and awareness campaigns for services. Data scientists can help universities make data-driven decisions to improve student outcomes and resource allocation.
This document discusses measuring teaching excellence and learning gain in higher education. It defines learning gain as the improvement in students' skills, knowledge, and development between two points in time. Teaching excellence is defined broadly as high-quality teaching, learning environments, and student outcomes. The document examines potential metrics to measure these concepts and challenges around benchmarking standards across institutions. It also summarizes early findings from a national study measuring learning gain at multiple universities.
The FACT2 Learning Analytics Task Group report summarizes their work in 2013-2014. Their goals were to develop professional learning for faculty on learning analytics and identify best practices. In fall 2013, they presented findings at several SUNY conferences on using data to enhance student experience. A spring 2014 pilot at SUNY Oswego used a retention system called Starfish for nearly 1,000 at-risk students. The task group defined learning analytics as software collecting multiple data sets to predict and impact student success. They recommended establishing an ongoing working group to develop learning analytic practices and tools across SUNY.
Slides prepared mainly for LASI-Asia 2016 (#lasiasia16) and also presented in the conference program at e-Learning Korea 2016. They cover ISO/IEC TR 20748-1 Learning Analytics Interoperability – Part 1: Reference model, as well as curriculum standards.
ABLE - the NTU Student Dashboard - University of Derby – Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
Learning analytics involves analyzing educational data to understand students and improve teaching and learning. It can be performed at different scales from individual courses to institutions. Examples include using VLE data to track online discussion or predict student needs, and MOOC data to inform course design. Learning analytics can benefit students by personalizing support, teachers by informing instruction, and institutions by improving programs. Challenges include integrating diverse data sources and sharing insights appropriately.
Australian university teachers’ engagement with learning analytics: Still ea... – Blackboard APAC
This session reports the results of a recent OLT-funded national exploratory study addressing the relevant factors, and their impact, when implementing learning analytics for student retention purposes. The project utilised a mixed-method research design and yielded a series of outputs, including a non-technical overview of learning analytics linking the fields of student retention and learning analytics; an institution-level survey focusing on sector readiness and decision making relating to utilising learning analytics for retention purposes; and an academic-level survey administered to academic staff exploring their progress, aspirations and support needs relating to learning analytics. Follow-up interviews expanded on their experiences with learning analytics to date. An evidence-based framework was developed, mapping important factors affecting learning analytics decision making and implementation. It was illustrated by a suite of five case studies, developed by each of the research partner institutions, detailing their experiences with learning analytics and demonstrating why elements in the framework are important. These findings were shared and tested at a National Forum in April 2015.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard. 24 -27 August 2015 in Adelaide, Australia.
This document summarizes the results of a group concept mapping study on organizational challenges and opportunities for open online education in the Netherlands. The study identified 8 clusters of challenges and opportunities: 1) online teaching, 2) supporting mechanisms, 3) assessment, 4) external target groups, 5) educational flexibility, 6) quality of education, 7) institutional reputation, and 8) educational efficiency. Educational flexibility was identified as the largest opportunity area, while assessment was seen as the biggest challenge. Overall, opportunities for open online education were recognized, but implementation challenges remain, particularly regarding teacher skills and organizational support.
The document outlines the Teaching Learning Critical Pathway (TLCP) model, which is used to organize teaching and learning actions. It involves setting a SMART goal focused on an area of student need, gathering and analyzing data, planning instruction aligned to curriculum expectations, establishing baseline student data, and conducting assessments and reflection. The TLCP is intended to set high expectations for students and provide effective feedback to guide instruction and elicit evidence of student learning.
The document discusses requirements for learning analytics based on a lecture and workshop at East China Normal University. It begins with introductions and then outlines the day's plan to discuss definitions of analytics, actors in learning analytics, framework models, and requirements. It emphasizes starting with pedagogy and poses questions about what data is available and how to build trust. Ethical challenges are noted around data protection, privacy, transparency, and purpose. The goal is to use analytics to facilitate learning while avoiding instructivist approaches and stress for learners.
The document summarizes key points about international assessments of student learning and 21st century skills. It discusses how assessments are shifting from multiple choice tests of facts to performance tasks that require problem-solving, critical thinking, and applying knowledge in new contexts. High-achieving countries increasingly use methods like projects, essays, and portfolios to evaluate applied learning alongside subject knowledge. The document advocates for assessment systems that align standards, curriculum, and instruction with clear learning goals like 21st century skills; use multiple sources of evidence including teacher-developed and externally-developed measures; and embody high expectations for all students.
OLC 2017 April: How to promote large-scale adoption of adaptive courseware – Karen Vignare
This document summarizes a presentation on promoting large-scale adoption of adaptive courseware. It discusses eight universities that are accelerating adoption over three years and sharing their experiences. The presentation covers defining adaptive courseware, approaches to achieving widespread usage, selecting courseware, and challenges in implementation. Key strategies discussed include cross-institution collaboration, incentivizing faculty, and taking a data-driven approach through pilots and iterations. The goal is to personalize learning, improve student outcomes, and reduce costs through embracing new educational models.
The document discusses developing an evidence-based institutional learning analytics policy. It summarizes discussions from focus groups with students and staff at two universities. Students were interested in personalized support but concerned about privacy and bias. Staff saw potential to improve teaching and support students, but had concerns about privacy, profiling, and interpreting learning data. Developing an inclusive policy requires addressing stakeholders' varied expectations and concerns.
SHEILA Project - Workshop Slides, Online Educa Berlin 2016 – LACE Project
Learning Analytics (LA) is currently a very active topic in education, but its implementation is beset with potential pitfalls for an organisation wishing to develop extensive use of it. Building upon international experience and local knowledge, the EC-funded SHEILA Project (Jan 2016-June 2018) is creating a policy framework for higher education institutions to enable them to design and enact an LA policy for themselves, using an innovative concept mapping approach (ROMA) combined with interviews of key stakeholders in several European countries. It is a partnership of the Universities of Edinburgh (coordinator), Tallinn University, Open University NL and Carlos III Madrid, with Brussels Education Services, Erasmus Student Network and European Quality Assurance Network (ENQA).
In this workshop we discussed with participants our interim data from:
- interviews from senior HEI leaders charged with the implementation of learning analytics to understand the current processes, barriers, and opportunities;
- group concept mapping by international expert panel to identify critical concerns for learning analytics policy;
- benchmark of the learning analytics sophistication in the European HE sector by administering a survey to members of the EUA.
Overview of the LAEP learning analytics project – LACE Project
Overview of the LAEP project - implications and opportunities of learning analytics for European educational policy. Presented at the LAEP / LACE workshop held in Amsterdam, 15-16 March 2016.
Presentations given by representatives of EU-funded projects at the learning analytics expert workshop held in Amsterdam 15-16 March 2016, organised by the LAEP and LACE projects.
Lightning (four-minute) presentations given by invited experts at the Amsterdam learning analytics workshop organised by the LAEP and LACE projects in Amsterdam on 15-16 March 2016. #laepanalytics
Research into Practice: Building and implementing learning analytics at Tribal – LACE Project
Keynote by Chris Ballard, Data Scientist, Tribal, given at the LACE SoLAR Flare event held at The Open University, Milton Keynes, UK on 9 October 2015. #LACEflare
Learning analytics LACE SoLAR Flare 2015 – LACE Project
These slides accompanied 15 'lightning presentations' on learning analytics given by a range of speakers from countries across Europe at the LACE SoLAR Flare 2015 networking event. This event took place at The Open University, UK, on 9 October 2015. It formed part of an international series of learning analytics networking events under the auspices of the Society for Learning Analytics Research (SoLAR). This was the second Flare to be run by the Learning Analytics Community Exchange (LACE) project.
Scalable Learning Analytics and Interoperability – an assessment of potential... – LACE Project
A presentation given at the 2015 EUNIS Congress, held at Abertay University in Dundee, June 2015.
Learning analytics is now moving from being a research interest to a topic for adoption. As this happens, the challenge of efficiently and reliably moving data between systems becomes of vital practical importance. In this context, “scalable learning analytics” is not intended to refer to infrastructural throughput, but to the feasibility of a combination of: a) pervasive system integration, and b) efficient analytical and data management practices. There are a number of considerations that are of particular relevance to learning analytics, in addition to elements that are generic to analytics. This contribution to EUNIS 2015 seeks to clarify, by argument and through evidence, both where there are potential benefits and limitations to applying interoperability specifications (and standards) in the service of scalable learning analytics.
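The presentation does not single out one specification here, but a widely used interoperability specification in learning analytics is the Experience API (xAPI), in which learning events are exchanged as actor-verb-object statements. A minimal sketch, with illustrative placeholder URIs and names:

```python
import json

# A minimal xAPI-style statement: learning events are exchanged as
# actor-verb-object JSON documents, so any compliant system can store
# or analyse them. The actor and object URIs below are placeholders;
# the verb id is a standard ADL verb URI.
statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "A. Student"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {"id": "http://example.edu/activities/module-1"},
}

# Serialise for transfer between, e.g., a VLE and a learning record store.
payload = json.dumps(statement)
print(payload)
```

Standardising on a format like this is what makes the "pervasive system integration" above feasible: every system reads and writes the same statement shape.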
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop – LACE Project
This presentation is to introduce a discussion session at the 2015 EUNIS Congress workshop session of the E-Learning Task Force. The LACE Project is very briefly introduced, followed by an explanation of the presenter's view of learning analytics and a critique of some common themes. Assessment Analytics is presented as an antithesis to these themes and an assessment lifecycle model (used in the Jisc Electronic Management of Assessment Programme) is used to outline some ways in which assessment analytics can be realised, as stimulus for discussion.
LACE Spring Briefing - WatchMe project overview – LACE Project
The document discusses the WatchMe project, which aims to develop a mobile electronic portfolio system to improve feedback and assessment for competency-based higher education. The system will integrate workplace-based feedback and assessment using modules for feedback, visualization, and e-assessment of entrustable professional activities (EPAs). It will combine the EPA approach with learning analytics to provide tailored feedback based on assessments. The portfolio system, called ePASS, will offer just-in-time feedback to learners before assessments and aggregate assessment data in visualizations.
LACE Spring Briefing - Lea's Box Project overview – LACE Project
Presentation of the Lea's Box project by Michael Kickmeier-Rust at the "Policies for Educational Data Mining and Learning Analytics" Briefing held on April 15 in Brussels.
LACE Spring Briefing - PELARS project overview – LACE Project
Manolis Mavrikis presented on the potentials and pitfalls of learning analytics for exploratory and experiential learning. He discussed how learning analytics can support hands-on, inquiry-based learning through simulations, games, and coding activities. He provided examples from EU projects of using learning analytics to gain insights into classroom dynamics, student goal achievement, and support self-reflection. However, barriers remain in achieving the full potential of learning analytics due to issues around teacher and student literacy in making decisions based on analytics data.
LACE Spring Briefing - the LACE project – LACE Project
The LACE project aims to build bridges between research, policy, and practice to realize the potential of learning analytics in Europe. The project has over 47 associated partners and is working on issues like data standards, interoperability, privacy and ethics, and developing an evidence base. The project is active in K-12, higher education, and the workplace through knowledge transfer workshops and events. Sectors differ in their use of learning analytics, from more awareness raising in K-12 to more systematic use in higher education. The project is developing an evidence hub to provide evidence on how learning analytics can improve outcomes, support, teaching and retention. However, the evidence base for learning analytics is currently weak. The project aims to inform policy through
Themes in Learning Analytics - A Critical View – LACE Project
The document summarizes Adam Cooper's presentation at the Talis Insight conference in 2015 on the themes of learning analytics. It identifies four main themes: data warehouses, dashboards, predictive analytics, and retention and intervention systems. For each theme, it provides a brief critical view. The presentation encourages moving the field forward through open discussion, sharing, and participatory design to socially construct knowledge about learning analytics in a way that considers people and current practices.
European Perspectives on Learning Analytics: LAK15 LACE panel – LACE Project
The document provides an overview of learning analytics work in Europe from several countries including Estonia, the Netherlands, and France. In Estonia, key initiatives discussed include the educational cloud and eDidaktikum teacher learning environment which incorporate learning analytics dashboards. In the Netherlands, the focus is on collaboration between universities, Apereo, and SURF. In France, recent and current national projects exploring learning analytics are highlighted such as Péricles and Hubble. The document concludes with an overview of the goals of the LACE project which aims to build bridges between learning analytics research, policy, and practice across Europe.
Five short presentations from a panel session at the Learning Analytics and Knowledge Conference 2015, on the topic of "Learning Analytics - European Perspectives", held at Marist College, Poughkeepsie on March 18th 2015. The speakers are: Rebecca Ferguson, Alejandra Martínez-Monés, Kairit Tammets, Alan Berg, Anne Boyer, and Adam Cooper.
5. Objectives
• The state of the art
• Direct engagement with key stakeholders
• A comprehensive policy framework
http://sheilaproject.eu/
6. Inclusive adoption process
Macfadyen, L., Dawson, S., Pardo, A., & Gašević, D. (2014). The learning analytics imperative and the sociotechnical challenge: Policy for complex systems. Research & Practice in Assessment, 9(Winter 2014), 17-28.
7. Methodology
- Literature review: policy, adoption
- Academic staff: surveys, focus groups
- Students: surveys, focus groups
- Senior managers: surveys, interviews
- Experts: group concept mapping
- Other stakeholders: workshops, committees
These inputs feed a policy framework, which in turn informs institutional policy/strategy.
8. Methodology
9. Adoption challenge: Leadership for strategic implementation & monitoring
Tsai, Y.-S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
10. Adoption challenge: Equal engagement with different stakeholders
11. Adoption challenge: Training to cultivate data literacy among primary stakeholders
12. Adoption challenge: Policies for learning analytics practice
13. Methodology
14. What is the state of the art?
What are the drivers?
What are the challenges?
15. Survey
• 22 countries, 46 institutions
• November 2016
[Chart: the adoption of LA – implemented: 15 (2 institution-wide, 13 small scale); no plans and in preparation: 15 and 16 institutions]
16. Interviews
• 16 countries, 51 HEIs, 64 interviews, 78 participants
• August 2016 - January 2017
[Chart: the adoption of learning analytics (interviews) – implemented: 21 (9 institution-wide, 7 partial/pilots, 5 data exploration/cleaning); no plans and in preparation: 12 and 18 HEIs]
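Read as stacked charts, the category counts on the two slides above account for every institution in each sample. A quick consistency check, as a sketch using the figures recovered from the slides:

```python
# Adoption stages reported in the SHEILA survey (46 institutions) and
# interviews (51 HEIs). The category counts should cover each sample exactly.
survey = {
    "no plans": 15,
    "in preparation": 16,
    "implemented, institution-wide": 2,
    "implemented, small scale": 13,
}
interviews = {
    "no plans / in preparation": 12 + 18,
    "implemented, institution-wide": 9,
    "implemented, partial/pilots": 7,
    "implemented, data exploration/cleaning": 5,
}

print(sum(survey.values()), sum(interviews.values()))  # 46 51
```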
17. Motivations to adopt learning analytics
• To improve student learning performance – 40 (87%)
• To improve student satisfaction – 33 (72%)
• To improve teaching excellence – 33 (72%)
• To improve student retention – 26 (57%)
• To explore what learning analytics can do for our institution/staff/students – 25 (54%)
(46 institutions)
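The percentages are simple proportions of the 46 responding institutions; a minimal sketch reproducing them:

```python
# Motivations reported by the 46 surveyed institutions: number of
# institutions naming each motivation, and the share that represents.
N = 46
motivations = {
    "improve student learning performance": 40,
    "improve student satisfaction": 33,
    "improve teaching excellence": 33,
    "improve student retention": 26,
    "explore what LA can do": 25,
}

shares = {name: round(100 * count / N) for name, count in motivations.items()}
print(shares)  # e.g. 40/46 -> 87%, 26/46 -> 57%
```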
21. “People are thinking about learning analytics as a way to try and personalise education and enhance education. And actually make our education more inclusive both by understanding how different students engage with different bits of educational processes, but also about through developing curricula to make them more flexible and inclusive as a standard.”
22. “I think what we would be looking at is how do we evolve the way we teach to provide better learning outcomes for the students, greater mastery of the subject.”
23. “We’re trying to understand better the curriculum that needs to be offered for the students in our region. And…I think importantly how our pedagogical model fits that and deliver the best experience for our students.”
24. Barriers to the success of learning analytics
• Analytics expertise – 34 (76%)
• A data-driven culture at the institution – 30 (67%)
• Teaching staff/tutor buy-in – 29 (64%)
• The affordances of current learning analytics technology – 29 (64%)
26. Implications
• Interest was high, but experience was still limited.
• There was strong motivation to increase institutional performance by improving teaching quality.
• Key barriers concerned skills, institutional culture, technology, ethics and privacy.
27. Methodology
29. Goal of the survey
With regard to learning analytics …
… what do academic staff ideally expect to happen?
… what do academic staff predict will happen in reality?
30. Four academic institutions, surveyed from spring to fall 2017:
• University of Edinburgh – n = 81
• Universidad Carlos III de Madrid – n = 26
• Open Universiteit – n = 54
• Tallinn University – n = 49
31. 16 items, some examples:
• The university will provide me with guidance on how to access LA about my students.
• The LA service will show how a student’s learning progress compares to their learning goals/the course objectives.
• The teaching staff will have an obligation to act if the analytics show that a student is at risk of failing, underperforming, or that they could improve their learning.
34. Highest expectation values
University of Edinburgh:
• Ideal: LA will collect and present data that is accurate (M = 5.91, Q9)
• Predicted: Providing guidance to access LA about students (M = 5.05, Q1)
Universidad Carlos III de Madrid:
• Ideal: LA presented in a format that is understandable and easy to read (M = 6.31, Q11)
• Predicted: LA will present students with a complete profile of their learning across every course (M = 5.27, Q12)
35. Highest expectation values
Open Universiteit Nederland:
• Ideal: LA will collect and present data that is accurate (M = 6.60, Q9)
• Predicted: Able to access data about students’ progress in a course that I am teaching (M = 5.17, Q4)
Tallinn University:
• Ideal: Able to access data about students’ progress in a course that I am teaching (M = 6.04, Q4)
• Predicted: Able to access data about students’ progress in a course that I am teaching (M = 5.49, Q4)
37. Lowest expectation values
University of Edinburgh:
• Ideal: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 3.65) Q14
• Predicted: Teaching staff will be competent in incorporating analytics into the feedback and support they provide to students (M = 3.49) Q13
Carlos III de Madrid:
• Ideal: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 4.42) Q14
• Predicted: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 3.77) Q14
38. Lowest expectation values
Open Universiteit Nederland:
• Ideal: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 4.44) Q14
• Predicted: Feedback from analytics will be used to promote students’ academic and professional skill development for future employability (M = 3.24) Q15
University of Tallinn:
• Ideal: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 4.80) Q14
• Predicted: Teaching staff will have an obligation to act if students are found to be at-risk of failing or underperforming (M = 3.82) Q14
40. Goal
To better understand the viewpoints of academic staff on:
• Learning analytics opportunities in HEIs from the
perspective of students, teachers and programs;
• Concerns related to adopting learning analytics;
• Steps needed to adopt learning analytics at HEIs
41. Study participants
• University of Edinburgh: 5 focus groups, 18 teaching staff
• Universidad Carlos III de Madrid: 4 focus groups, 16
teaching staff
• Open Universiteit Nederland: 2 focus groups, 5 teaching
staff
• Tallinn University: 5 focus groups, 20 teaching staff
42. Results: Expectations & LA opportunities
Student level:
• Take responsibility for their learning and enhance their SRL skills
• Assess the degree of success to prevent students from being overly worried or optimistic about their performance
Teacher level:
• A method to identify students’ weaknesses and know where students are with their progress
• Understand how students engage with learning content
• Improve the design and provision of learning materials, courses, curriculum and support to students
Program level:
• Understand how a program is working (strengths and bottlenecks)
• Improve educational quality (e.g. content level)
46. Results: concerns – program level
• Interpretation of learning:
• Was the right data collected?
• Were accurate algorithms developed?
• Was an appropriate message given to the students?
• Connecting LA to real learning – is this a meaningful picture of
the learning that happens in online environments?
47. What should we consider?
• LA should be just one component of many for collecting
feedback and enhancing decision-making
• Involve stakeholders:
• Academic staff in developing and setting up LA
• Pedagogy experts to ensure the data makes sense for
improving learning
• Provide training and communication!
48. What should we consider?
•Design tools that:
•Are easy to use
•Provide visualizations of data
•Do not require mathematical/statistical skills
•Do not take a lot of time
•Consider ethical and privacy aspects
49. Student Views
Pedro Manuel Moreno Marcos
Department of Telematics Engineering
Universidad Carlos III de Madrid
pemoreno@it.uc3m.es
http://sheilaproject.eu/
50. Methodology
52. Background
• 12-item survey
• Two Subscales:
• Ethical and Privacy Expectations
• Service Expectations
• 6 Distributions:
• Edinburgh (N = 884)
• Liverpool (N = 191)
• Tallinn (N = 161)
• Madrid (N = 543)
• Netherlands (N = 1247)
• Blanchardstown (N = 237)
http://sheilaproject.eu/
53. Ethical and Privacy Expectations
[Figure: average responses (scale 1–7) to the five ethical and privacy items – Alternative Purpose, Consent to Collect, Identifiable Data, Keep Data Secure, Third Party – on the Ideal and Predicted Expectation Scales, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
54. Keep Data Secure – Predicted Expectation Scale
[Figure: percentage of responses, Strongly Disagree to Strongly Agree, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
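A hedged sketch of how the response-distribution figures on these slides can be tabulated: count each sample's answers per Likert category and convert the counts to percentages. The answers below are invented for illustration.

```python
# Sketch of tabulating a Likert response distribution; the sample answers
# here are invented, not taken from the SHEILA survey data.
from collections import Counter

LIKERT = ["Strongly Disagree", "Disagree", "Somewhat Disagree",
          "Neither Agree nor Disagree", "Somewhat Agree", "Agree",
          "Strongly Agree"]

def distribution(answers):
    """Percentage of respondents choosing each Likert category."""
    counts = Counter(answers)  # missing categories count as 0
    return {cat: 100 * counts[cat] / len(answers) for cat in LIKERT}

sample = ["Agree", "Strongly Agree", "Agree", "Somewhat Agree"]
dist = distribution(sample)
print({cat: pct for cat, pct in dist.items() if pct > 0})
```

Repeating this per institution and per item yields the percentage grids shown in the figures.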
55. Consent to Collect – Predicted Expectation Scale
[Figure: percentage of responses, Strongly Disagree to Strongly Agree, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
56. Service Expectations
[Figure: average responses (scale 1–7) to the seven service items – Obligation to Act, Integrate into Feedback, Skill Development, Regularly Update, Complete Profile, Student Decision Making, Course Goals – on the Ideal and Predicted Expectation Scales, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
57. Course Goals – Predicted Expectation Scale
[Figure: percentage of responses, Strongly Disagree to Strongly Agree, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
58. Obligation to Act – Predicted Expectation Scale
[Figure: percentage of responses, Strongly Disagree to Strongly Agree, by location: Blanchardstown, Edinburgh, Liverpool, Madrid, Open University of the Netherlands, Tallinn]
http://sheilaproject.eu/
59. Summary
• Beliefs towards learning analytics are not consistent.
• Emphasis on data security and improving learning.
http://sheilaproject.eu/
61. Background
• 18 focus groups
• 4 partner institutions
• 74 students
• Interviews: around 1 hour each
http://sheilaproject.eu/
62. Interests and expectations
• Improve the quality of teaching
• Better student-teacher feedback
• Better academic resources and academic tools to improve learning
• Personalized support
• Recommendation of learning resources
• Feedback from a system, via a dashboard
• Provide an overview of the tasks to be done in a semester → improve
curriculum design
http://sheilaproject.eu/
63. Awareness
• Students do not know what LA is, but they recognise its importance if
it can solve their problems
• Students are generally not aware of the data collected → Transparency
• Students have not checked the data conditions they have accepted
http://sheilaproject.eu/
67. Group Concept Mapping
Steps: 1. Brainstorm → 2. Sort → 3. Rate
Brainstormed statements:
• innovations in the way the network is delivered
• (investigate) corporate/structural alignment
• assist in the development of non-traditional partnerships (Rehab with the Medicine Community)
• expand investigation and knowledge of PSN's/PSO's
• continue STHCS sponsored forums on public health issues (medicine managed care forum)
• inventory assets of all participating agencies (providers, Venn Diagrams)
• access additional funds for telemedicine expansion
• better utilization of the current technological bridge
• continued support by STHCS to member facilities
• expand and encourage utilization of interface programs to strengthen the viability and to improve the health care delivery system (i.e. teleconference)
• discussion with CCHN
Example sorted and rated statements (statement IDs in parentheses): "Work quickly and effectively under pressure" (49); "Organize the work when directions are not specific" (39); "Decide how to manage multiple tasks" (20); "Manage resources effectively" (4)
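The sort step above can be sketched as aggregating each participant's piles into a statement-by-statement co-occurrence matrix, the input from which cluster maps are later derived. The participant sorts below are invented for illustration; only the statement IDs echo the slide.

```python
# Sketch of the "sort" aggregation in group concept mapping: each participant
# groups the statements into piles, and pile co-membership is tallied into a
# co-occurrence matrix. The sorts below are invented for illustration.
from itertools import combinations

sorts = [  # one list of piles per participant; items are statement ids
    [[49, 39], [20, 4]],
    [[49, 39, 20], [4]],
]

ids = [4, 20, 39, 49]
index = {sid: i for i, sid in enumerate(ids)}
cooc = [[0] * len(ids) for _ in ids]

for piles in sorts:
    for pile in piles:
        for a, b in combinations(pile, 2):  # every pair sorted together
            cooc[index[a]][index[b]] += 1
            cooc[index[b]][index[a]] += 1

# Statements 49 and 39 were sorted together by both participants.
print(cooc[index[49]][index[39]])
```

Multidimensional scaling and hierarchical clustering of this matrix produce the cluster maps shown on the later slides.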
68. @HDrachsler, 27 March 2014
An essential feature of a higher education institution’s
learning analytics policy should be …
Group Concept Mapping
77. Cluster Map
1. privacy & transparency
2. roles & responsibilities (of all stakeholders)
3. objectives of LA (learner and teacher support)
4. risks & challenges
5. data management
6. research & data analysis
78. Rating Map – Importance
1. privacy & transparency
2. roles & responsibilities (of all stakeholders)
3. objectives of LA (learner and teacher support)
4. risks & challenges
5. data management
6. research & data analysis
[Cluster legend: five layers spanning average importance ratings from 5.08 to 6.03]
79. Rating Map – Ease
1. privacy & transparency
2. roles & responsibilities (of all stakeholders)
3. objectives of LA (learner and teacher support)
4. risks & challenges
5. data management
6. research & data analysis
[Cluster legend: five layers spanning average ease ratings from 3.79 to 5.44]
80. Rating Ladder Graph
[Figure: ladder graph linking each cluster's importance rating to its ease rating on a shared 3.79–6.03 scale, for the six clusters: privacy & transparency, risks & challenges, roles & responsibilities (of all stakeholders), objectives of LA (learner and teacher support), data management, research & data analysis; r = 0.66]
81. Go Zone – Roles & Responsibilities
[Figure: go-zone plot of the cluster's statements by importance (3.83 to 6.59) vs ease (3.12 to 6.08); r = 0.26]
Highlighted statements:
55. being clear about the purpose of learning analytics
61. a clear articulation of responsibilities when it comes to the use of institutional data
82. Yi-Shan Tsai, Pedro Manuel Moreno-Marcos, Ioana Jivet, Maren Scheffel, Kairit Tammets, Kaire Kollom, and Dragan Gašević. (to appear). The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. Journal of Learning Analytics.
83. Methodology
86. Methodology
Editor's Notes
With senior managers, we were
EADTU (European Association of Distance Teaching Universities)
EUA (European University Association)
HeLF (Heads of e-Learning Forum)
EUNIS (European University Information Systems)
SNOLA (Spanish Network of Learning Analytics)
eMadrid
A survey question (multiple choices) provided 11 options for motivations specific to learning and teaching.
All related to institutional performance: league ranking, satisfaction survey, teaching excellence framework
But also dependent on teaching quality
Early stage - exploration
Most institutions seem to have incorporated all levels of goals into their planning or implementation of LA
Enhance self-regulation skills: provide data-based information to guide students
Improve learning support: curriculum, feedback, personalized support, pastoral care, timely support
Increase institutional performance: retention rate, student satisfaction, league ranking
13 options
Moderately-sized, large, critical
Three most mentioned issues regarding ethics and privacy
Interest is strong
Institutions were exploring what LA can do
Using LA to enhance teaching so as to increase institutional performance is the biggest motivation among managers
Barriers – skills, culture, technology, ethics and privacy
Qualitative data to capture, in addition to LA data, students’ perceptions and understandings about teaching and learning processes
Staff were worried that profiling students as, e.g., low-performing might end up in loss of motivation and anxiety.
Staff were wondering: shall I be objective?
How can I be objective?
Hi – I’m Alex Wainwright… and I’m going to give an overview of the student survey results… this is going to cover response rates and some general insights obtained…
The student survey is composed of 12 items… and responses are made on two scales that correspond to a desired service… and what students expect in reality… so they reflect two levels of expectation…
Through the development and validation process we have identified two subscales… these refers to ethical and privacy expectations… such as whether students expect to provide consent for the collection of their educational data…
And the other subscales refers to service expectations… so this covers things such as whether students expect to receive updates on how their learning progress compares to a set goal….
As you can see… we have distributed the instrument at six different higher education institutions… with the highest response rate being at the open university of the netherlands…
All distributions have shown the scales to be valid and to also show excellent measurement quality….
Firstly… I am going to go over the ethical and privacy expectation items…
On this figure you can see the average responses to these items by expectation scale and location…. The x axis provides an indication of what the items refer to…
So we have beliefs about providing consent when data is used for an alternative purpose… or whether consent should be sought before distributing data to third party companies
What can be seen is that students ideal expectations are generally higher than predicted expectations – this is anticipated as it is a desired level of service…
Across both scales… however… we can see that the expectation that all collected data remain secure receives the highest average response… whereas… the expectation to provide consent before educational data is collected and analysed receives the lowest average response across these five items… and whilst students agree with this belief… it verges on indifference on the predicted expectation scale for the Spanish student sample….
It may be that students are open to universities collecting and analysing educational data… particularly as it is used for attendance purposes, for example…
Whereas… they have stronger beliefs toward universities abiding by data handling policies that will ensure that all data remains secure…
We can also look at these two particular ethical and privacy expectation items in more detail…
This figure shows the percentage of students responding in a certain way to the data security expectation… with darker colours reflecting a higher percentage of students responding that way…
And what is shown is that… between 60 and 80% of students across all universities either agreed or strongly agreed with the expectation that universities will ensure data is kept secure…
For the consent to collect expectation… this figure shows that there is more variation in the responses…
For those students from Edinburgh, Liverpool, the Netherlands, and Blanchardstown… the largest response of around 30% is for strongly agree to this belief…
Whereas… the largest percentage of responses for Madrid and Tallinn… which was around 25%... Was for somewhat agree…
Looking at the service expectation items… we can see that the average responses tend to be similar across locations…
Of particular note… the obligation to act is the item with the lowest response on average… with students in Madrid, the Netherlands, and Tallinn generally showing indifference to this belief on the predicted expectation scale...
The higher average responses… on the other hand… seem to be around aspects of self-regulated learning such as students expecting to receive a complete profile of their learning…. Making their own decisions on the analytics that they receive… and knowing how their progress compares to a set learning goal….
Looking into what are the highest and lowest average response items… we can also understand differences within each sample…
For knowing how progress compares to a set learning goal… between 20 to 35% of students across each sample agreed with this expectation… with around 4% disagreeing….
As for the obligation to act… the highest response rates are variable…
Around 20% of students in the Tallinn and Madrid samples somewhat disagreed with this expectation… For the Dutch students 24% expressed indifference to this belief… whereas in Liverpool and Blanchardstown around 28% showed agreement…
The output from the student survey shows that the expectations of students towards learning analytics are not consistent across each sample… with students generally showing variations in what they want from such services…
On the other hand… we can generally see that students expect a learning analytics service that emphasises data security… and provides tools that support learning as opposed to those that emphasise early interventions