1. The document discusses learning analytics (LA), including what it is, examples of LA tools and projects, and stakeholder viewpoints.
2. Stakeholders like managers, teachers, and students have different views on how LA could be used to improve learning, teaching, and student outcomes.
3. Key concerns about LA include issues around resources, skills, privacy, and ensuring LA adds value and doesn't negatively stereotype or limit students.
Nurturing the Connections: The Role of Quantitative Ethnography in Learning A... – Dragan Gasevic
This talk will explore connections between two emerging fields focused on harnessing the potential of data – learning analytics and quantitative ethnography. Learning analytics is focused on the analysis of data collected from user interactions with technology with the goal of advancing our understanding of and enhancing human learning. Despite some early success stories and widespread interest, producing meaningful and actionable results is still a top open research challenge for learning analytics. The talk will first explore how quantitative ethnography can offer promising approaches that can address this open challenge in learning analytics. The talk will next discuss how progress in learning analytics can be used to accelerate the development of the field of quantitative ethnography. The talk will finally outline promising directions for future research at the intersection of learning analytics and quantitative ethnography.
State and Directions of Learning Analytics Adoption (Second edition) – Dragan Gasevic
The analysis of data collected from user interactions with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new field of learning analytics and mobilized the education sector to embrace the use of data for decision-making. This talk will first introduce the field of learning analytics and touch on lessons learned from some well-known case studies. The talk will then identify critical challenges that require immediate attention in order for learning analytics to make a sustainable impact on learning, teaching, and decision making. The talk will conclude by discussing a set of milestones selected as critical for the maturation of the field of learning analytics. The most important takeaway from the talk will be that
- systemic approaches to the development and adoption of learning analytics are critical,
- multidisciplinary teams are necessary to unlock the full potential of learning analytics, and
- capacity development at institutional levels through the inclusion of diverse stakeholders is essential for full learning analytics adoption.
This is the second edition of a talk previously given under the same title on several occasions. The second edition reflects many developments that have happened in the field of learning analytics, especially those in the following two projects: http://he-analytics.com and http://sheilaproject.eu.
OAAI: Deploying an Open Ecosystem for Learner Analytics – Joshua
The Open Academic Analytics Initiative (OAAI), an NGLC grant recipient, has developed a predictive model for learner analytics using open-source tools, which we are releasing under an open-source license. We will share project outcomes along with research into effective OER-based intervention strategies and other critical learner analytics scaling factors.
Learning analytics: An opportunity for higher education? – Dragan Gasevic
Slides used in my keynote at the Annual Conference of the European Association of Distance Teaching Universities - The open, online, flexible higher education conference - #OOFHEC2015
Towards Strengthening Links between Learning Analytics and Assessment – Dragan Gasevic
The emergence of learning analytics afforded the analysis of digital traces of user interaction with technology. This analysis offers many opportunities to advance understanding and enhance learning and the environments in which learning occurs. Existing research has shown how learning analytics can contribute to different areas of education, such as prediction of student success, uncovering learning strategies, understanding affective states, and unpacking the role of social networks in learning. While these results have shown much promise, one critical challenge remains unclear – how learning analytics can help track learning progression and inform assessment, especially from the perspective of 21st-century skills. This talk will explore opportunities and challenges for integrating methods commonly used in learning analytics to analyze different digital traces with methods commonly used in assessment and psychometric research. The talk particularly focuses on open learning environments, where analytics-based assessment is rather underexplored, in contrast to assessment in specialized (intelligent tutoring) systems, where the combined use of data mining and psychometric techniques has been established for some time.
Evaluation for Impact and Learning, Asia Value Advisors, Nov 6 2014 – Victor Kuo
The workshop will overview intermediate and advanced concepts of evaluating the impact of philanthropic foundations as well as the organizational systems that support impact evaluation and learning within foundations. Main topics include: prioritizing evaluation audiences and purposes, selecting among a range of evaluation designs (randomized controlled trials, quasi-experimental designs, correlational studies, descriptive studies); organizational readiness for evaluation and learning; and organizational learning. A range of practical tools for developing evaluation projects and for building organizational practices in evaluation and learning will also be shared. Current debates, criticisms, and possible ways forward will be presented using select cases and illustrations. Participants will be encouraged to bring their own examples, offer honest appraisals, and identify ways to advance their own philanthropic work. (This workshop is at an intermediate level; basic concepts of evaluation will be reviewed briefly in the context of more advanced topics.)
Can medical education take advantage of learning analytics techniques? How? Where? This presentation analyses a study that pinpoints three areas in which medical education needs to invest, all three related to learning analytics.
Closing the Gap With STEM Education: Why, What, and How
Participants will learn why there is a growing need for STEM education in the United States, what STEM education is, how STEM education at the middle school level contributes to closing the gap, and how to successfully plan and implement a middle school program.
Ken Verburg Project Lead the Way - Lexington, SC
This slide deck was presented at the 2015 International Conference on Education Research.
I aggregated several of my other partial slides and reports to describe an adaptive learning model pertaining to the concept of learning analytics, as well as LOD for curriculum standards and digital resources. There is a short introduction to the project ISO/IEC 20748 Learning analytics interoperability – Part 1: Reference model.
Open Learning Analytics panel at Open Education Conference 2014 – Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures, and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large-scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Using learning analytics to support learners and teachers at the Open University – Bart Rienties
In this seminar Prof Bart Rienties will reflect on how the Open University UK has become a leading institution in implementing learning analytics at scale amongst its 170K students and 5K staff. Furthermore, he will discuss how learning analytics is being adopted at other UK institutions, and what the implications for higher education might be in these Covid-19 times.
https://www.kent.ac.uk/cshe/news-events.html
From institutional policy to individual practice: Using learning technologies... – Sarra_Saffron_Powell
Charting the development and rationale of a student learning skills project in Higher Education as an integrated semi automated system that uses learner diagnostics to provide automated learning plans for students. Looks at using Policy as institutional leverage and technology to assess student learning skills development.
Learning Analytics for online and on-campus education: experience and research – Tinne De Laet
This presentation was given by Tinne De Laet, KU Leuven, as a keynote during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation covers the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
Using learning analytics to support formative assessment, OLN, 2017-11-11 – Yi-Shan Tsai
This talk covers ideas about using learning analytics to enhance formative assessment, with an introduction of two learning analytics tools developed in Australia - Loop and OnTask.
Supporting Higher Education to Integrate Learning Analytics, EUNIS, 2017-11-07 – Yi-Shan Tsai
This talk summarised the SHEILA project and its preliminary findings. It was presented at the EUNIS (European University Information Systems) workshop on 7 November 2017.
1. What can learning analytics
do for us?
Yi-Shan Tsai
yi-shan.tsai@ed.ac.uk
@yi_shan_tsai
http://sheilaproject.eu/
RTEN seminar series
30 January 2019
2. What to expect…
• What is learning analytics (LA)
• Project overview
• Multi-stakeholder viewpoints on the pros and cons
• Implications for a learning analytics (LA) strategy
• SHEILA framework
Learning analytics (LA) SHEILA project Stakeholder views Implications for LA strategy SHEILA framework
5. Learning analytics is…
“the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Long et al., 2011).
Long, P. D., Siemens, G., Conole, G., & Gašević, D. (Eds.). (2011). Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK’11). Banff, AB, Canada: ACM.
8. Learning analytics examples
Arnold, K. E., & Pistilli, M. D. (2012, April). Course Signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 267-270).
Goal: address retention
Predictive algorithm:
• Performance
• Effort
• Prior academic history
• Student characteristics
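The four inputs above can be combined into a "traffic light" risk indicator of the kind Course Signals shows to students. The sketch below is purely illustrative: the feature names match the slide, but the weights and thresholds are hypothetical assumptions, not Purdue's actual algorithm.

```python
# Illustrative Course Signals-style risk indicator.
# Weights and thresholds are hypothetical, for demonstration only.

def risk_score(performance, effort, prior_history, characteristics):
    """Each input is normalised to [0, 1], where 1 = favourable."""
    weights = {"performance": 0.4, "effort": 0.25,
               "prior_history": 0.2, "characteristics": 0.15}
    favourable = (weights["performance"] * performance
                  + weights["effort"] * effort
                  + weights["prior_history"] * prior_history
                  + weights["characteristics"] * characteristics)
    return 1.0 - favourable  # higher score = more at risk

def signal(score):
    # Map the risk score to the red/yellow/green light shown to students.
    if score >= 0.6:
        return "red"
    if score >= 0.3:
        return "yellow"
    return "green"

print(signal(risk_score(0.9, 0.8, 0.7, 0.6)))  # engaged student -> green
print(signal(risk_score(0.2, 0.3, 0.4, 0.5)))  # struggling student -> red
```

In practice, systems such as Course Signals fit the weights from historical data (e.g. with a logistic regression) rather than setting them by hand.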
11. Learning analytics examples
Learning Analytics Report Card (LARC)
http://larc-project.com
Involve students in critical conversations around the use of their data for computational analysis in education.
Knox, J. (2017). Data power in education: Exploring critical awareness with the ‘Learning Analytics Report Card’ (LARC). Special Issue: Data Power in Material Contexts, Television & New Media. http://journals.sagepub.com/doi/full/10.1177/1527476417690029
Isard, A., & Knox, J. (2016). Automatic generation of student report cards. In Proceedings of the 9th International Natural Language Generation Conference, Edinburgh, September 5-8. http://www.macs.hw.ac.uk/InteractionLab/INLG2016/proceedings/pdf/INLG33.pdf
13. Learning analytics is…
Clow, D. (2012, April). The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 134-138). ACM.
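Clow's cycle moves from learners to data to metrics to interventions, and "closes the loop" by feeding interventions back to learners. A minimal sketch of that flow, with hypothetical event and metric names:

```python
# A minimal sketch of Clow's learning analytics cycle:
# learners -> data -> metrics -> interventions (closing the loop).
# Event and metric names here are hypothetical.

def collect_data(learners):
    # Step 1 -> 2: learners generate digital traces (here: login counts).
    return [{"learner": name, "logins": logins}
            for name, logins in learners.items()]

def compute_metrics(events):
    # Step 2 -> 3: turn raw traces into a metric (a simple engagement flag).
    return {e["learner"]: e["logins"] >= 3 for e in events}

def intervene(metrics):
    # Step 3 -> 4: act on the metric -- flag disengaged learners for support.
    return [name for name, engaged in metrics.items() if not engaged]

learners = {"alice": 5, "bob": 1}
at_risk = intervene(compute_metrics(collect_data(learners)))
print(at_risk)  # -> ['bob']
```

Clow's point is that analytics only has impact when the final step reaches the learners who generated the data in the first place.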
15. Objectives
• Understand the state of the art
• Engage key stakeholders directly
• Develop a policy framework
17. Challenges in institutional adoption of LA
Literature review
Tsai, Y. S., & Gašević, D. (2017). Learning analytics in higher education – challenges and policies: A review of eight learning analytics policies. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 233-242).
19. 2. Unequal engagement with primary stakeholders
20. 3. Gaps in data literacy across stakeholders
21. 4. Weak grounding of learning theories
22. 5. Little evidence demonstrating the impacts of LA-based interventions
23. 6. Lack of institution-based policies for learning analytics practices
26. Managers would like learning analytics to…
• To improve student learning performance – 40 (87%)
• To improve student satisfaction – 33 (72%)
• To improve teaching excellence – 33 (72%)
• To improve student retention – 26 (57%)
• To explore what learning analytics can do for our institution/staff/students – 25 (54%)
Institutional survey (n=46)
Tsai, Y.-S., & Gašević, D. (2017). The State of Learning Analytics in Europe – Executive Summary – SHEILA. Retrieved from http://sheilaproject.eu/2017/04/18/the-state-of-learning-analytics-in-europe-executive-summary/
27. Managers would like learning analytics to…
• To provide personalised learning support (39 %)
• To increase learning motivations (37 %)
• To inform curriculum (35 %)
• To encourage self-regulated learning (30 %)
• To improve student-teacher communication (26 %)
• To improve student recruitment (24 %)
• Other (2 %)
Institutional survey (n=46)
28. Institutional goals & approaches
• Interview data (27 institutions)
• Epistemic Network Analysis (ENA)
• The co-occurrence of codes implies the strength of connection
• The frequency of code co-occurrence is expressed by the weight (thickness) of the lines
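The co-occurrence counting that produces these edge weights can be sketched in a few lines. This is a simplified illustration only: full ENA adds normalisation and dimensional reduction (see Shaffer et al., 2016), and the code labels below are hypothetical, not the SHEILA codebook.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_weights(segments):
    """Count how often each pair of codes co-occurs in a segment.

    segments: iterable of sets of codes, one set per coded interview
    segment. Returns a Counter mapping sorted code pairs to counts;
    in an ENA-style graph, these counts set the edge thickness.
    """
    weights = Counter()
    for codes in segments:
        for pair in combinations(sorted(codes), 2):
            weights[pair] += 1
    return weights

# Hypothetical coded segments (codes are illustrative only)
segments = [
    {"institutional_goal", "retention"},
    {"institutional_goal", "retention", "problem_driven"},
    {"teaching_goal", "exploratory"},
]
print(cooccurrence_weights(segments))
```

A thicker line between, say, "institutional_goal" and "retention" would simply reflect a higher count for that pair.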
Shaffer, D. W., Collier, W., & Ruis, A. R. (2016). A tutorial on epistemic network analysis: Analyzing the structure of connections in cognitive, social, and interaction data. Journal of Learning Analytics, 3(3), 9–45.
29. Comparison by adoption experience
Less than one year vs. one year or more
30. Teachers would like learning analytics to…
At the student, teacher, and program levels:
• Take responsibility for their learning and enhance their SRL skills
• Assess the degree of success to prevent students from being worried or optimistic about their performance
• Identify students’ weaknesses and know where students are with their progress
• Understand how students engage with learning content
• Improve the design and provision of learning materials, courses, curriculum and support to students
• Understand how the program is working (strengths and bottlenecks)
• Improve educational quality (e.g. content level)
Focus groups:
• 16 groups
• 4 universities
• 59 participants
31. Teachers would like learning analytics to…
Focus groups:
• 16 groups
• 4 universities
• 59 participants
• Know the ‘usefulness’ of resources and the preferences of students towards learning materials (UK & Spanish FGs)
• Enable personalised support to second-language speakers (Estonian FGs)
• Evaluate the workload of students who were mostly part-time learners (Dutch FGs)
32. Students would like learning analytics to…
• Personalised support
• Feedback
• Academic resources
• Teaching quality
Focus groups:
• 18 groups
• 4 universities
• 74 participants
33. Desires for self-regulated learning
Survey:
• 6 institutions
• 3,053 returns
SRL:
• Receiving a complete profile of their learning
• Making their own decisions based on the analytics results
• Knowing how their progress compares to a set learning goal
34. High expectations of self-regulated learning
36. Managers are concerned about…
• Returns on investment
• Resources
• Culture
• Skills
37. Returns on investment…
“We could have had a little black box instead of an expensive piece of software. The wrapping around the project, the energy, the commitment, the targeted actions, how much of that would have just delivered some change anyway?” – Manager
38. Returns on investment…
“Learning analytics is at an early stage and so over time it may be able to do a lot more things than we can do now. In fact, it will, but there is this problem that if things don’t deliver what people expect quickly, they then say, ‘oh there’s no point in doing that’, and it gets a bad name.” – Manager
39. Teachers are concerned about…
• Students
• Teachers
• Learning analytics
40. Teachers are concerned about…
• Student-level
43. Teachers are concerned about…
• Teacher-level
45. Teachers are concerned about…
• Learning analytics-level: Can individual differences be captured?
46. Teachers are concerned about…
• Learning analytics-level: Interpretations of learning vary
47. Teachers are concerned about…
“I don’t want it [LA] to make all of the students behave in the exact same way to satisfy an algorithm. I want it to enable students to have the best experience in whatever that experience is. You know, you can be totally different from everyone else and still do perfectly fine. I want it [LA] to…enable students to do better and not make them all mini ‘me’s.” – Teacher
48. Students are concerned about…
• Privacy
• Stereotypes
• Learning being quantified
• Losing human contact
49. Privacy paradox
“Although consumers seem to be concerned about their privacy as reflected in their intentions to disclose (e.g., measured via ‘willingness to provide information’), anecdotal evidence suggests their behaviors diverge from their intentions to disclose personal details.” (Norberg, Horne, & Horne, 2007, p. 107)
50. 1. Perceived benefits outweigh perceived risks
“I haven’t been in University for so long, so for me to get back to the school was challenging, and so for me with the personal tutor I don’t mind sharing my data ‘cause she will help me to develop myself further.” – Student
51. 2. Power imbalance
“You have to agree to share this data otherwise you wouldn’t enrol, so you are not probably thinking that much about consequences of every single piece of data that you provide to the university. It’s just because it’s a part of the process of application.” – Student
52. 3. Trust exacerbates information asymmetry
“That [Data policy] is something I don’t think I would ever focus on or look for, so I honestly don’t know. It could be out there and I could maybe Google it, and it would be on a page somewhere if I wanted to find it.” – Student
53. Implications for LA strategy
54. 1. A sound policy and effective communication
Purpose, access, anonymity, and security
55. 2. Increase the observability and trialability of LA to attract buy-in
An incremental approach
56. 3. Instil data literacy and reflective skills among key stakeholders
Moving from data to action
57. 4. Incorporate the views of different stakeholders to develop a common vision and a sense of ownership
A dialogic approach
58. 5. Clarify the value of LA within its limitations
Expectation management
59. SHEILA framework
Action, challenges, and policy
60. http://sheilaproject.eu/sheila-framework/
Tsai, Y.-S., Moreno-Marcos, P. M., Jivet, I., Scheffel, M., Tammets, K., Kollom, K., & Gašević, D. (2018). The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. Journal of Learning Analytics, 5(3), 5–20.
Editor's Notes
Learning today often takes place in a hybrid environment – a combination of online and offline settings.
In a traditional classroom, we observe how students are doing with their studies through direct interactions in the class and evaluations of their assignments or exams. Today, the shift of learning to the online setting means classroom observations are no longer sufficient. Meanwhile, the advancement of technology means it is possible for us to collect a wide range of digital data from students’ interactions with the learning environments, e.g., log-ins, access time, length of time spent. Learning analytics analyses the data by looking for patterns.
A 3-year pilot project with Civitas Learning started in 2016 to investigate student engagement in the digital learning environments. The project serves a strategic purpose to gain experience of developing learning analytics models and develop a Learning Analytics Policy.
Signals is based on predictive models. The purpose is to determine in real time which students might be at risk, partially indicated by their effort within a course.
Interventions may be: Posting of a traffic signal indicator on a student’s LMS home page; • E-mail messages or reminders; • Text messages; • Referral to academic advisor or academic resource centers; or, • Face to face meetings with the instructor.
A 3-year pilot project with Civitas Learning started in 2016 to investigate student engagement in the digital learning environments. The project serves a strategic purpose to gain experience of developing learning analytics models and develop a Learning Analytics Policy.
To close the feedback loop
The project does not deal with technical issues but socio-cultural issues around LA.
Following the literature review, we carried out a series of research activities to get a better understanding of the identified challenges and other factors that influence the adoption of LA.
A survey question (multiple choice) provided 11 options for motivations specific to learning and teaching. 46 institutions from 22 countries responded (response rate: 15%).
The top 4 are strongly associated with institutional KPIs
By contrast, only the ‘improving student recruitment’ option is more directly linked to institution KPIs. Perhaps managers do not see a close link between this and LA.
Institutional goal: Goals set for LA are to improve institutional performance or management. LA will influence decisions made by senior managers.
Teaching goal: Goals set for LA are to inform teaching and support. LA will influence decisions made by teachers.
Learning goal: Goals set for LA are to improve learning and the student experience. LA will influence decisions made by students.
These connections suggest that institutions or projects that set out to improve institutional performance through LA tended to have identified clear problems to solve, e.g., student retention issues. By contrast, institutions or projects that aimed to use LA to inform teaching and student support had a stronger focus on exploring a phenomenon, e.g., how students engage with learning resources.
For novice institutions, LA was often adopted as a measuring tool for institutional performance, e.g., student retention rate. By contrast, more experienced institutions showed strong connections between teaching-level goals and exploratory approaches. Though institutional goals dominate, there is a shift of focus towards teaching and learning.
Provision of timely feedback, easy access to digital resources, and personalised learning support
Even though the average responses tended to be similar across locations, the sample of students from the Open University of the Netherlands were found to have lower ideal expectations towards receiving complete profiles across modules based on LA, compared to the other samples. This highlights that there is no one-size-fits-all LA solution, and further investigation into the preference of students towards the access to their learning data at this particular institution is needed.
U19
U1
How data is collected, analysed and interpreted affects our interpretations of learning.
Students expressed protective attitudes towards personal data, but the actions they described taking to protect their data showed the contrary.
Trust plays a role here.
Five attributes of innovations: relative advantage, compatibility, complexity, trialability, observability.
– Rogers, E. M. (2010). Diffusion of innovations. Simon and Schuster.