Keynote presentation at Edmedia 2018 conference: https://www.aace.org/conf/edmedia/speakers/.
Results of Erasmus+ projects ABLE (www.ableproject.eu) and STELA (www.stela-project.eu) on learning dashboards for supporting first-year students.
Learning dashboards for actionable feedback: the (non)sense of chances of success - Tinne De Laet
Presentation at Leuven Learning Lab’s first annual Educational Technology conference day on Learning Analytics
(https://www.kuleuven.be/english/education/learning-lab).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Using learning analytics to improve student transition into and support throughout the 1st year - Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year", delivered at the EFYE 2016 conference in Ghent, Belgium.
Learning Dashboards for Feedback at Scale - Tinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards deployed at scale create new learning traces? This talk shares experiences of a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Jurgen Schulte is an award-winning academic at UTS who has been using an adaptive learning platform (WileyPLUS Orion) in combination with post-processing of the data. In this talk he shares some of his experiences.
Fighting level 3: From the LA framework to LA practice on the micro-level - Hendrik Drachsler
This presentation explores shortcomings of learning analytics that hinder its wide adoption in educational organisations. It is NOT about ethics and privacy; rather, it focuses on shortcomings of learning analytics for teachers and students in the classroom (micro-level). We investigated if, and to what extent, learning analytics dashboards address educational concepts. We map opportunities and challenges for the use of Learning Analytics dashboards in the design of courses, and present an evaluation instrument for the effects of Learning Analytics called EFLA. EFLA can be used to measure the effects of LA tools on the teacher and student side. It is a robust but lightweight (8-item) instrument for quickly investigating the level of adoption of learning analytics in a course (micro-level). The presentation concludes that Learning Analytics is still too much a computer science discipline and does not fulfil the often-claimed position of the middle space between educational and computer science research.
Learning analytics: Threats and opportunities - Martin Hawksey
Slides used at ALT's White Rose Learning Technologist's SIG to introduce threats and opportunities for using Learning Analytics. Links related to this presentation are at http://bit.ly/LAWhiteRose
Presentation on learning analytics given by Rebecca Ferguson at the Nordic Learning Analytics Summer Institute (Nordic LASI), organised by the SLATE Centre, in Bergen, Norway, 29 September 2017.
Learning Analytics for online and on-campus education: experience and research - Tinne De Laet
This presentation was used by Tinne De Laet, KU Leuven, for a keynote presentation during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
ABLE - the NTU Student Dashboard - University of Derby - Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
This master class covers the latest developments and possibilities of learning analytics and addresses the issue of visualising data for teachers using current examples.
This class is organised in the context of the LACE (Learning Analytics Community Exchange) project which brings together existing key European players in the field of learning analytics & Educational Data Mining in order to support development of communities of practice and share emerging best practices.
Who has the crystal ball for moving forward with Digital Assessment? - Denise Whitelock
Digital assessment is an evolving construct used in education to enrich, inform and complement the teaching process. Automatic feedback, however, has been under-utilised and under-valued throughout this process, a gap further highlighted by the introduction of electronic teaching and assessment.
This presentation will discuss the issues raised by teachers and students in this arena. It will provide exemplars of how their concerns are currently being addressed by both researchers and software developers in order to support educator feedback to students. Finally, the issue of potential disrupters will be raised, which moves us into the realm of crystal ball gazing.
Keynote address presented at the WISEflow Conference, Brunel University.
Five short presentations from a panel session at the Learning Analytics and Knowledge Conference 2015, on the topic of "Learning Analytics - European Perspectives", held at Marist College, Poughkeepsie on March 18th 2015. The speakers are Rebecca Ferguson, Alejandra Martínez-Monés, Kairit Tammets, Alan Berg, Anne Boyer, and Adam Cooper.
Higher Education & Game Principles: Context, Theory & Application - Daniel La... - Blackboard APAC
This presentation reports on the efficacy of a mobile learning intervention that combined ‘push notifications’ and game principles within a timed quiz app. An institutional interdisciplinary case study was conducted which compared rates of student retention and academic performance with students’ usage of a purpose-designed learning app. Leading up to lectures, the app pushed daily quizzes to students’ personal mobile devices and then rewarded them with feedback, points, badges and a position on a leaderboard. During this session, the findings of this study will be discussed and conclusions drawn regarding what the findings mean for future research into higher education learning enabled via mobile app technologies.
The Open University (OU) is a global leader in quality online, open and distance education with more than 180,000 students and 8,000 faculty and staff. Like many organizations, the OU is embracing data and learning analytics as an increasingly important approach for understanding learner behaviors. During this Fischer Speaker Series event, Dr. Tynan explores the vagaries of leading an institutional strategy at scale, specifically focusing on faculty, student and institutional engagement with analytics to support student success, detailing wins, pitfalls and unexpected twists resulting in unintended but delightful outcomes.
Professor Belinda Tynan is the Pro-Vice-Chancellor (Learning Innovation) and Professor of Higher Education at the Open University, UK. Reporting to the Vice-Chancellor, the Pro-Vice-Chancellor for Learning Innovation contributes to the strategic vision and mission of the University and has a focus on supporting student success by providing executive leadership in the areas of innovation, strategy and policy development, production, informal learning, and research and scholarship in technology enhanced learning.
The video of this presentation can be viewed at https://goo.gl/W8qpi6
Workshop run at the European Conference for e-Learning 2015 (ECEL 2015) at the University of Hertfordshire, UK. The workshop included an introduction of both learning analytics and learning design, as well as an exploration of how these could be employed in MOOCs. Some of the group work was focused on the Agincourt MOOC run by the University of Southampton on the FutureLearn platform.
Should feedback be at the centre of Personalised Learning? - Denise Whitelock
The advent of e-Learning has prompted the development of web-based learning systems, recognising that there is no fixed learning pathway appropriate for all learners. However, most learning platforms with personalised learning sequencing rely on a learner’s preferences.
If we want students to learn to make reliable judgements about their learning, and to identify any further support they require to meet their learning goals, then personalised automatic feedback should play an important role. This presentation explores the role that technology-enhanced feedback can play in the pursuit of a personalised learning agenda.
References
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015). Feedback on academic essay writing through pre-emptive hints: Moving towards ‘advice for action’. Winner of Best Research Paper Award. Special Issue of European Journal of Open, Distance and E-Learning, Best of EDEN RW8, 8th EDEN Research Workshop (eds. U. Bernath and A. Szucs). Published by European Distance and E-Learning Network, 1-15. ISSN 1027 5207
Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D. & Pulman, S. (2015). OpenEssayist: A supply and demand learning analytics tool for drafting academic essays. The 5th International Learning Analytics and Knowledge (LAK) Conference, Poughkeepsie, New York, USA. 16-20 March 2015. ISBN 978-1-4503-3417-4
Presentation at The Digital University, a symposium in celebration of CHEC’s 20th anniversary, co-hosted by the University of the Western Cape, 30 October 2013.
The power of learning analytics to unpack learning and teaching: a critical p... - Bart Rienties
Across the globe, many educational institutions are collecting vast amounts of small and big data about students and their learning behaviour, such as their class attendance, online activities, or assessment scores. As a result, the emerging field of Learning Analytics (LA) is exploring how data can be used to empower teachers and institutions to effectively support learners. In the recent Innovative Pedagogy Report, Ferguson et al. (2017) encourage researchers and practitioners to move towards a new form of learning analytics called student-led learning analytics, which enables learners to specify their own goals and ambitions and supports them in reaching these goals. This is particularly helpful for individuals who have little time to spare for study. In this ESRC session, based upon six years of experience with LA data and large-scale implementations amongst 450,000+ students in a range of contexts, I will use an interactive format to discuss and debate three major questions: 1) To what extent is learning analytics the new holy grail of learning and teaching? 2) How can instructional design be optimised using the principles of learning analytics? 3) With the introduction of student-led analytics, to what extent can learning analytics promote ‘personalisation’ or ‘generalisation’ for diverse populations of students?
Using Data to Drive Personalized Math Learning Needs - DreamBox Learning
Technologies to support data-driven decision-making hold great promise for increasing the effectiveness of teaching and learning activities, accelerating student achievement, and improving organizational performance. To assess what students are learning and how they are progressing, educators can now use a continuous improvement framework for data-driven decision-making to organize people and processes to reach education objectives.
Join us for this webinar and discuss topics including:
• Building a sustainable data analysis framework
• Common challenges involved in establishing data-driven practices
• Incorporating blended learning environments to meet school goals
Using learning analytics to support formative assessment oln 20171111 - Yi-Shan Tsai
This talk covers ideas about using learning analytics to enhance formative assessment, with an introduction of two learning analytics tools developed in Australia - Loop and OnTask.
Supporting Higher Education to Integrate Learning Analytics_EUNIS20171107 - Yi-Shan Tsai
This talk summarised the SHEILA project and its preliminary findings. It was presented at the EUNIS (European University Information Systems) workshop on 7 November 2017.
Learning dashboards for actionable feedback: the (non)sense of chances of success - Tinne De Laet
Presentation at the HUMANE event on digital transformation in higher education (http://www.humane.eu/events/seminars-and-conferences/2018/aveiro-042018/).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Data-based feedback through learning dashboards: does it support the first-year experience? - Tinne De Laet
Presentation supporting the EFYE 2018 pre-conference workshop "Data-based feedback through learning dashboards: does it support the first-year experience" - https://efye2018.nl/programme/parallel-sessions/
Presentation of the learning dashboard developed by KU Leuven within the ABLE project (http://www.ableproject.eu/).
A learning dashboard supported by learning analytics, showcasing the use of technology for learning in higher education, in particular for the transition from secondary to higher education. The dashboard is developed for the interaction between study advisor and student. More information in our journal paper: http://ieeexplore.ieee.org/document/7959628/
Slides from Keynote presentation at the University of Southern California's 2015 Teaching with Technology annual conference.
"9:15 am – ANN Auditorium
Key Note: What Do We Mean by Learning Analytics?
Leah Macfadyen, Director for Evaluation and Learning Analytics, University of British Columbia
Executive Board, SoLAR (Society for Learning Analytics Research)
Leah Macfadyen will define and explore the emerging and interdisciplinary field of learning analytics in the context of quantified and personalized learning. Leah will use actual examples and case studies to illustrate the range of stakeholders learning analytics may serve, the diverse array of questions they may be used to address, and the potential impact of learning analytics in higher education."
Presentation Slides from ISSOTL 2015.
Bronnimann, J., West, D., Heath, D. & Huijser, H. (2015) Leveraging learning analytics for future pedagogies and scholarship. Paper presented at Leading learning and the scholarship of change: 12th annual ISSOTL conference, Melbourne, Australia.
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytics - EUmoocs
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, that took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
WCOL2019: Learning analytics for learning design or learning design for learning analytics - Marko Teräs
Presentation at the 28th ICDE World Conference on Online Learning on the relationship between learning design and learning analytics. Part of a national-level learning analytics research and development project funded by the Finnish Ministry of Education and Culture.
Ethnobotany and Ethnopharmacology:
Ethnobotany in herbal drug evaluation,
Impact of Ethnobotany in traditional medicine,
New development in herbals,
Bio-prospecting tools for drug discovery,
Role of Ethnopharmacology in drug evaluation,
Reverse Pharmacology.
Operation “Blue Star” is the only event in the history of Independent India where the state went into war with its own people. Even after about 40 years it is not clear if it was culmination of states anger over people of the region, a political game of power or start of dictatorial chapter in the democratic setup.
The people of Punjab felt alienated from main stream due to denial of their just demands during a long democratic struggle since independence. As it happen all over the word, it led to militant struggle with great loss of lives of military, police and civilian personnel. Killing of Indira Gandhi and massacre of innocent Sikhs in Delhi and other India cities was also associated with this movement.
5. Learning Analytics?
“Learning analytics is about collecting traces that learners leave behind and using those traces to improve learning.”
- Erik Duval
Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012, https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/
6. Learning Dashboards?
“A dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.”
- Stephen Few
Dashboard Confusion, Stephen Few, Intelligent Enterprise, March 20, 2004
7. STELA: Successful Transition from secondary to higher Education using Learning Analytics
• enhance a successful transition from secondary to higher education by means of learning analytics
• design and build learning analytics dashboards that go beyond identifying at-risk students, allowing actionable feedback for all students on a large scale
www.stela-project.eu | @STELA_project | 2015-1-UK01-KA203-013767
ABLE: Achieving Benefits from Learning Analytics
• research strategies and practices for using learning analytics to support students during their first year at university
• develop the technological aspects of learning analytics, focusing on how learning analytics can be used to support students
www.ableproject.eu | @ABLE_project_eu | 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD
8. STELA ♥ ABLE
actionable feedback | student-centered | program level | inclusive | first-year experience | institution-wide | Learning Analytics | actual implementation
9. [!] Feedback must be “actionable”.
Non-actionable example: “Warning! Male students have a 10% lower probability of being successful. You are male.” (What action can the student take?)
Non-actionable example: “Warning! Your online activity is lagging behind.” (What action can the student take?)
12. [!] Start with the available data.
Lots of data may eventually become available in the future ... but already start with what is available.
(*) Zarraonandia, T., Aedo, I., Díaz, P., & Montero, A. (2013). An augmented lecture feedback system to support learner and teacher communication. British Journal of Educational Technology, 44(4), 616-628.
14. Study advisor – student conversations
Student questions: Should I consider another program? Can I still finish the bachelor in 3 years? How should I compose my program for next year?
Advisor questions: What is the personal situation? How can I help? What is the best next step?
15. [!] Use all available expertise.
• visualization experts
• practitioners / end-users
• learning analytics researchers
• researchers on first-year study success
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Transactions on Learning Technologies (http://ieeexplore.ieee.org/document/7959628/).
17. [!] Wording matters.
Instead of “73% chance of success”, say: “73% of students of earlier cohorts with the same study efficiency obtained the bachelor degree.”
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
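The reworded message is a plain descriptive statistic over earlier cohorts rather than a model prediction. A minimal sketch of how such a figure could be computed; the function name, the similarity tolerance, and the records below are illustrative assumptions, not the actual LISSA implementation:

```python
def cohort_outcome_rate(cohort, study_efficiency, tolerance=0.05):
    """Share of earlier-cohort students with a similar study efficiency
    who went on to obtain the bachelor degree.

    `cohort` is a list of (study_efficiency, obtained_degree) tuples,
    with efficiencies as fractions in [0, 1]."""
    similar = [obtained for eff, obtained in cohort
               if abs(eff - study_efficiency) <= tolerance]
    if not similar:
        return None  # no comparable students: say so, don't extrapolate
    return sum(similar) / len(similar)

# Hypothetical historical records: (study efficiency, degree obtained?)
history = [(0.95, True), (0.90, True), (0.92, True), (0.60, False),
           (0.91, False), (0.55, False), (0.93, True), (0.30, False)]
rate = cohort_outcome_rate(history, 0.92)  # 4 of 5 similar students: 0.8
```

Because the statement only summarizes what happened to comparable students, it stays honest about uncertainty instead of presenting a personal "chance of success".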
18. LISSA: status
26 programs, >4500 students, 114 student advisors
training of study advisors
dashboards for three examination periods
observations, interviews, questionnaires
• Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Transactions on Learning Technologies.
• http://blog.associatie.kuleuven.be/tinnedelaet/lissa-learning-dashboard-supporting-student-advisers-in-traditional-higher-education/
• Millecamp M., Gutiérrez F., Charleer S., Verbert K., De Laet T. (2018). A qualitative evaluation of a learning dashboard to support advisor-student dialogues. Proceedings of the 8th International Learning Analytics & Knowledge Conference (LAK). Sydney, 5-9 March 2018 (pp. 1-5). ACM.
19. LISSA: evaluation – observations
15 observations
Insights: (-) factual, (+) interpretative, (!) reflective
Charleer S., Vande Moere A., Klerkx J., Verbert K., De Laet T. (2017). Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Transactions on Learning Technologies.
20. Evaluation – interviews
LISSA supports a personal dialogue. The level of usage depends on the experience and style of the study advisors: fact-based evidence at the side, a narrative thread, and key moments and the student path that help to reconstruct the personal track.
Advisor quotes:
• “When students see the numbers, they are surprised, but now they believe me. Before, I used my gut feeling; now I feel more certain of what I say as well.”
• “It’s like a main thread guiding the conversation.”
• “I can talk about what to do with the results, instead of each time looking for the data and puzzling it together.”
• “Students don’t know where to look during the conversation, and avoid eye contact. The dashboard provides them a point of focus.”
• “A student changed her study method in June and could now see it paid off.”
• “I can focus on the student’s personal path, rather than on the facts.”
• “Now, I can blame the dashboard and focus on collaboratively looking for the next step to take.”
21. LISSA: evaluation – student questionnaires
26 programs @ KU Leuven, 291 student questionnaires, first examination period
Student quotes:
• “Confronting, but useful.”
• “I want to use this dashboard at home.”
• “Also show the sub-grades for labs, ...”
• “How can I know the data is trustworthy?”
• “Can’t these visualizations be sent to students?”
• “Crisp and clear.”
22. Student questionnaire January 2018 (N=291)
Responses per statement, counts for Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree:
1. The dashboard is clarifying and surveyable: 0 / 4 / 29 / 176 / 80
2. The shown information regarding my study situation is correct: 0 / 2 / 21 / 112 / 155
3. The shown position with respect to my fellow students (histograms per exam and global…): 1 / 1 / 36 / 156 / 93
4. A conversation with my student advisors helped me to gain insight in my study trajectory: 1 / 4 / 37 / 132 / 116
5. The visualisation is of added value to the conversation with the student advisor: 1 / 4 / 49 / 141 / 92
6. The shown information provides me insight in my current situation: 1 / 3 / 42 / 169 / 72
23. [!] Do not oversimplify. Show uncertainty.
• reality is complex
• measurement is limited
• individual circumstances
• need for nuance
• trigger reflection
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
24. [!] Be careful with predictive algorithms.
• reality is complex
• measurement is limited
• individual circumstances
• need for nuance
• trigger reflection
http://blog.associatie.kuleuven.be/tinnedelaet/the-nonsense-of-chances-of-success-and-predictive-models/
26. [!] Start with the available data.
Data already available?
• administrative (examples): student records, course grades
• systems (examples): LMS access logs, advisor meetings
Broos T., Verbert K., Van Soom C., Langie G., De Laet T. (2018). Small data as a conversation starter for learning analytics: exam results dashboard for first-year students in higher education. Journal of Research in Innovative Teaching & Learning, 1-14.
demo: https://learninganalytics.set.kuleuven.be/static-demo-rex/ (en) or https://learninganalytics.set.kuleuven.be/demo/rex-1718jan-ir (nl)
27. [!] Think beyond the obvious data.
• Don’t think too traditionally.
• Many institutions are collecting survey data for educational research.
28. [!] Not all data is usable.
Example data from a traditional course with the “VLE as a file system”: test scores vs. activity per week (# days) over the weeks of the year.
29. [!] Not all data is usable.
Example data from a course with flipped classroom & blended learning: exam scores vs. activity (# of modules used).
Not a single student using less than 10 modules passed the course. Most of the successful students used 15 modules or more.
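Observations like these follow from a simple threshold split of the activity data. A sketch with hypothetical course records; the function and the data are illustrative, not the actual analysis:

```python
def pass_rate_by_usage(records, threshold):
    """Split (modules_used, passed) records at a usage threshold and
    return the pass rate below and at-or-above the threshold."""
    below = [passed for used, passed in records if used < threshold]
    above = [passed for used, passed in records if used >= threshold]
    rate = lambda xs: sum(xs) / len(xs) if xs else None
    return rate(below), rate(above)

# Hypothetical records: (modules used, passed the exam?)
records = [(3, False), (8, False), (9, False), (12, True), (14, False),
           (16, True), (18, True), (20, True), (22, True)]
low, high = pass_rate_by_usage(records, 10)
# low == 0.0 in this toy data: no student under 10 modules passed
```

Returning `None` for an empty group avoids reporting a rate where no comparable students exist.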
30. [!] Keep Learning Analytics in mind when designing learning activities.
Learning Analytics ↔ Learning Design: INFORM / ENABLE
If LA indeed contributes to improved learning design ... don’t make it an afterthought.
32. [!] Think beyond the obvious data.
Data already available?
• administrative (examples): student records, course grades
• systems (examples): LMS access logs, advisor meetings
• surveys (examples): quality assurance, LASSI
33. ~30 LASSI questions (shortened version) on “Learning Skills” (metacognitive abilities)
Example items:
• “When preparing for an exam, I create questions that I think might be included.”
• “I find it difficult to maintain my concentration while doing my coursework.”
• “I find it hard to stick to a study schedule.”
Raw scores on 5 scales (selected out of 10): CONCENTRATION, MOTIVATION, FAILURE ANXIETY, TEST STRATEGY, TIME MANAGEMENT.
Norm scores (in the Flemish HE context), for example: STRONG, AVERAGE, LOW, VERY STRONG, VERY WEAK.
Pinxten, M., Van Soom, C., Peeters, C., De Laet, T., Langie, G. (2017). At-risk at the gate: prediction of study success of first-year science and engineering students in an open-admission university in Flanders: any incremental validity of study strategies? European Journal of Psychology of Education.
readySTEMgo Erasmus+ project: https://iiw.kuleuven.be/english/readystemgo
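Converting raw scale scores to norm labels amounts to placing a student's score within a reference distribution. A sketch with hypothetical percentile cut-offs; the real LASSI norming uses a Flemish higher-education reference population and its own cut-offs:

```python
# Hypothetical percentile cut-offs for the norm bands (upper bounds).
BANDS = [(10, "VERY WEAK"), (25, "LOW"), (75, "AVERAGE"),
         (90, "STRONG"), (100, "VERY STRONG")]

def norm_band(raw_score, reference_scores):
    """Place a raw scale score in a norm band by its percentile
    position within the reference group."""
    below = sum(1 for s in reference_scores if s < raw_score)
    percentile = 100 * below / len(reference_scores)
    for cutoff, label in BANDS:
        if percentile <= cutoff:
            return label
```

Reporting the band instead of the raw score gives students a message that is already contextualized against their peers.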
34. Dashboard learning skills
Students complete the LASSI questionnaire and receive a personalized email with an invitation to the dashboard.
4367 students in 26 programs in 9 faculties @ KU Leuven; 2 programs @ TU Delft.
demo: https://learninganalytics.set.kuleuven.be/static-demo-lassi/ (en) or https://learninganalytics.set.kuleuven.be/demo/lassi-1718 (nl)
35. Feedback model
1. What is this about?
2. How am I doing?
3. How does this relate to others?
4. Why is this relevant?
5. What can I do about it?
36. Feedback model in the dashboard: (1) What is this about? (2) How am I doing? (3) How does this relate to others? Placeholders such as @studyProgram@ and @yourScore@ personalize the text per student.
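The @name@ placeholders shown on the slide suggest simple template substitution for the personalized emails and dashboard text. A hypothetical sketch; `fill_template` and the message below are illustrative, not the project's actual code:

```python
import re

def fill_template(template, values):
    """Fill @name@ placeholders with per-student values;
    fail loudly if a placeholder has no value."""
    def repl(match):
        key = match.group(1)
        if key not in values:
            raise KeyError(f"no value for placeholder @{key}@")
        return str(values[key])
    return re.sub(r"@(\w+)@", repl, template)

message = fill_template(
    "As a student of @studyProgram@, your score was @yourScore@/20.",
    {"studyProgram": "Engineering Science", "yourScore": 14})
```

Raising on a missing value prevents sending a student a mail with a raw @placeholder@ in it.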
37. Feedback model in the dashboard, continued: (4) Why is this relevant? (5) What can I do about it?
41. Students that click through have better learning skills.
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness. In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
42. More intense users have worse learning skills.
Broos, T., Peeters, L., Verbert, K., Van Soom, C., Langie, G., & De Laet, T. (2017, July). Dashboard for Actionable Feedback on Learning Skills: Scalability and Usefulness. In International Conference on Learning and Collaboration Technologies (pp. 229-241). Springer, Cham.
43. What can we learn from dashboard usage?
Group differences are significant: p < 1e-5 and p < 1e-9 (p-test).
Broos T., Verbert K., Van Soom C., Langie G., De Laet T. (2018). Low-investment, Realistic-Return Business Cases for Learning Analytics Dashboards: Leveraging Usage Data and Microinteractions. Accepted for EC-TEL 2018.
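Such p-values come from testing whether the usage groups differ significantly. As one illustrative way to test a difference between two groups' proportions (not necessarily the test used in the paper), a stdlib-only two-proportion z-test:

```python
from math import erf, sqrt

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions,
    e.g. dashboard users vs non-users passing the first year."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With usage groups of this study's size, even modest differences in pass rates yield very small p-values, which matches the magnitudes on the slide.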
44. [!] Give students “the key”.
• The student has the key to their own data.
• The student takes the initiative to share/discuss their own data.
• GDPR as an opportunity!
45. [!] Acceptance precedes impact.
• Involve stakeholders from the start and value their input: COmmunication + COoperation (COCO).
• Demonstrate usefulness.
• Take care of ethics and privacy.
• Best scenario: students & study advisors as ambassadors.
46. Impact?
Survey before intervention: 2nd-year students 2016-2017, on their experiences with first-year feedback; 41 questions, 5-point Likert scale, pen & paper.
Dashboards: LISSA, LASSI (learning skills), 3 x REX (grades).
Survey after intervention: 2nd-year students 2017-2018.
47. Impact?
During the first year I received sufficient information regarding my academic achievements.
Engineering Science (p<0.001)
48. Impact?
The information I received helped to position myself with respect to my peers.
Engineering Science (p<0.001)
50. [!] Context matters!
• available data
• national and institutional regulations and culture
• educational vision
• educational system, size of population ...
• ...
Don’t just copy existing LA solutions!
51. Summary
Case studies: 11 findings/recommendations
[!] Use all available expertise.
[!] Start with the available data.
[!] Look beyond the obvious data.
[!] Not all data is usable.
[!] Wording matters.
[!] Don’t oversimplify. Show uncertainty.
[!] Beware of predictive algorithms.
[!] Keep Learning Analytics in mind when designing learning activities.
[!] Give students “the key” to their data.
[!] Acceptance precedes impact.
[!] Context matters!
Approach: humble, small data, involvement of stakeholders (especially practitioners), actionable feedback, scalability, traditional university settings.
Is this Learning Analytics?
53. Project team
Sven Charleer
AugmentHCI, Computer Science department
PhD researcher ABLE
Katrien Verbert
AugmentHCI, Computer Science department
Copromotor of STELA & ABLE
Carolien Van Soom
Leuven Engineering and Science Education Center
Head of Tutorial Services of Science
Copromotor of STELA & ABLE
Greet Langie
Leuven Engineering and Science Education Center
Vicedean (education) faculty of Engineering Technology
Copromotor of STELA & ABLE
Tinne De Laet
Leuven Engineering and Science Education Center
Head of Tutorial Services of Engineering Science
Coordinator of STELA
KU Leuven coordinator of ABLE
Francisco Gutiérrez
AugmentHCI, Computer Science department
PhD researcher ABLE
Tom Broos
Leuven Engineering and Science Education Center
AugmentHCI, Computer Science department
PhD researcher STELA
Martijn Millecamp
AugmentHCI, Computer Science department
PhD researcher ABLE
Special thanks to study advisors for their cooperation, advice, feedback, and support!
Jasper, Bart, Riet, Hilde, An, Katrien, …
♥
56. Micro-interactions - REX
default / alternative
Broos T., Verbert K., Van Soom C., Langie G., De Laet T. (2018). Low-investment, Realistic-Return Business Cases for Learning Analytics Dashboards: Leveraging Usage Data and Microinteractions. Accepted for EC-TEL 2018.
57. Micro-interactions
Broos T., Verbert K., Van Soom C., Langie G., De Laet T. (2018). Low-investment, Realistic-Return Business Cases for Learning Analytics Dashboards: Leveraging Usage Data and Microinteractions. Accepted for EC-TEL 2018.
58. Student questionnaire January 2018 (N=291)
Responses per statement, counts for Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree:
7. The dashboard makes me more aware on my current study situation: 2 / 10 / 64 / 150 / 61
8. The dashboard makes me forecast the different possibilities in my future study trajectory: 3 / 44 / 97 / 115 / 30
9. The dashboard helps me to reflect on my past and current study behaviour or study trajectory: 1 / 23 / 81 / 126 / 56
10. The dashboard stimulates me to adapt my approach in my studies for the future (study behaviour or study…): 7 / 36 / 74 / 110 / 59
11. If I will have a new conversation after one of the next examination periods, I hope that the visualisation will be…: 2 / 3 / 36 / 119 / 128
12. I would like to consult the information on my own: 1 / 11 / 29 / 91 / 157
59. LISSA dashboard: how to determine thresholds for different groups?
• upper and lower group: clear message
• middle group as small as possible
• Do not overfit! (nuance)
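Choosing the two cut-offs can be sketched as picking percentile thresholds on a score, with the cut-offs placed so the middle band stays narrow. The percentiles and grouping below are illustrative assumptions, not the LISSA thresholds:

```python
def assign_groups(scores, lower_pct=25, upper_pct=75):
    """Assign each student to a lower / middle / upper feedback group
    using two percentile cut-offs on a score (e.g. study efficiency).
    Lower and upper groups get a clear message; the middle group
    gets the nuanced one."""
    ordered = sorted(scores)
    def cut(p):  # score value at percentile p (simple index method)
        return ordered[min(len(ordered) - 1, int(p / 100 * len(ordered)))]
    lo, hi = cut(lower_pct), cut(upper_pct)
    return ["lower" if s < lo else "upper" if s >= hi else "middle"
            for s in scores]
```

Widening or narrowing the percentile band is the knob: coarse cut-offs resist overfitting, while the middle band absorbs the cases where no clear message is warranted.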
More continuous feedback.
Focus on well-being.
Timely spotting of students who struggle with the adjustment, giving them proactive feedback but also contacting them.