Presentation of the learning dashboard developed by KU Leuven within the ABLE project (http://www.ableproject.eu/).
Learning dashboard supported by learning analytics, showcasing the use of technology for learning in higher education, in particular for the transition from secondary to higher education. The dashboard is designed for the interaction between study advisor and student. More information is available in our journal paper: http://ieeexplore.ieee.org/document/7959628/
Learning and study strategies: a learning analytics approach for feedback - Tinne De Laet
Presentation of a learning dashboard developed by KU Leuven within the STELA project (http://stela-project.eu//).
Learning dashboard supported by learning analytics, showcasing the use of technology for learning in higher education, in particular for the transition from secondary to higher education. The dashboard provides feedback on learning and study strategies, as measured by the LASSI questionnaire.
Australian university teacher’s engagement with learning analytics: Still ea... - Blackboard APAC
This session reports the results of a recent OLT-funded national exploratory study addressing the relevant factors, and their impact, when implementing learning analytics for student retention purposes. The project used a mixed-method research design and yielded a series of outputs, including a non-technical overview of learning analytics linking the fields of student retention and learning analytics, and an institution-level survey focusing on sector readiness and decision making related to using learning analytics for retention purposes. An academic-level survey was administered to academic staff, exploring their progress, aspirations, and support needs relating to learning analytics; follow-up interviews expanded on their experiences with learning analytics to date. An evidence-based framework was developed, mapping the important factors affecting learning analytics decision making and implementation. This was illustrated by a suite of five case studies, one developed by each of the research partner institutions, detailing their experiences with learning analytics and demonstrating why elements of the framework are important. These findings were shared and tested at a National Forum in April 2015.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard, 24-27 August 2015, Adelaide, Australia.
Moving Forward on Learning Analytics - A/Professor Deborah West, Charles Darw... - Blackboard APAC
Learning analytics is a 'hot topic' in education with many institutions seeking to make better use of the data available via various systems. One of the key challenges in this process is to understand the business questions that people working in various roles in institutions would like to be able to answer. However, it is also important that these questions are appropriately structured and specific in order to gather the relevant data. This session builds on the workshop run at last year's Blackboard Learning and Teaching conference where participants explored business questions and use cases for learning analytics from a range of perspectives.
Delivered at Innovate and Educate: Teaching and Learning Conference by Blackboard. 24 -27 August 2015 in Adelaide, Australia.
Five short presentations from a panel session at the Learning Analytics and Knowledge Conference 2015, on the topic of "Learning Analytics - European Perspectives", held at Marist College, Poughkeepsie on March 18th 2015. The speakers are: Rebecca Ferguson, Alejandra Martínez-Monés, Kairit Tammets, Alan Berg, Anne Boyer, and Adam Cooper.
Developing Health Sciences students’ information skills through online self-p... - Sarah Gallagher
Initial feedback on a cross-cohort evaluation of an online self-paced information skills programme in three second-year health sciences programmes at the University of Otago: Medicine, Pharmacy and Physiotherapy. Presented at Spotlight on Teaching 2013, University of Otago.
Workshop run at the European Conference for e-Learning 2015 (ECEL 2015) at the University of Hertfordshire, UK. The workshop included an introduction of both learning analytics and learning design, as well as an exploration of how these could be employed in MOOCs. Some of the group work was focused on the Agincourt MOOC run by the University of Southampton on the FutureLearn platform.
Talk by Rebecca Ferguson (Open University, UK, and LACE project).
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to move from small-scale research within our disciplines towards large-scale implementation across our institutions. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building. In her keynote, Rebecca introduced tools, resources, organisations and case studies that can be used to support the deployment of learning analytics at scale.
Personalised and Adaptive Mentoring in Medical Education - The myPAL project - Nicolas Van Labeke
This position paper describes a long-term Technology-Enhanced Learning initiative at the Leeds Institute of Medical Education in which a personalised adaptive learning mentor will be deployed for all MBChB students enrolled in the course. The system, myPAL, enriches the existing TEL programs embedded in the curriculum and will leverage recent advances in Learning Analytics and Open Learner Modelling. The paper presents the context of the project and the opportunities that the deployment settings will offer, and highlights the research and development strands that will underpin it.
Presentation given at SCONUL 2014, the summer conference of The Society of College, National and University Libraries, Glasgow, June 2014. The presentation focuses on frequently asked questions (FAQs) about learning analytics, with the emphasis on the role and perspective of libraries in this area.
ASCA's Mindsets and Behaviors competencies are excellent benchmarks to ensure student success in academics, careers, and social/emotional pursuits, but can be a lot for counselors and educators to take on. Learn how AchieveWORKS can personalize the ASCA competencies for students. AchieveWORKS assessments can make learning personalized by identifying focus areas so that students take ownership of the competencies they need most.
Research output in Irish H.E. academic libraries 2000-2015 - Terry O'Brien
Presentation given by Terry O'Brien & Kieran Cronin at CONUL (Consortium of National & University Libraries) 2017 Annual Conference - Inspiring and Supporting Research (Athlone, Ireland, May 2017)
This master class covers the latest developments and possibilities of learning analytics and addresses the issue of visualising data for teachers using current examples.
This class is organised in the context of the LACE (Learning Analytics Community Exchange) project which brings together existing key European players in the field of learning analytics & Educational Data Mining in order to support development of communities of practice and share emerging best practices.
Presentation by Mats Cullhed, University of Uppsala, Coimbra Group at the 2018 European Distance Learning Week's fourth day webinar on "New Learning Spaces to Support the EU 2020 Learning-Intensive Society" - 8 November 2018
Recording of the discussion is available: https://eden-online.adobeconnect.com/prkoy4q728k1/
Using learning analytics to support formative assessment oln 20171111 - Yi-Shan Tsai
This talk covers ideas about using learning analytics to enhance formative assessment, with an introduction of two learning analytics tools developed in Australia - Loop and OnTask.
ABLE - the NTU Student Dashboard - University of Derby - Ed Foster
Implementing a university-wide learning analytics system.
Presentation Overview:
- Introduction
- Developing the NTU Student Dashboard
- Transitioning from pilot phase to whole institution roll-out
- Embedding the resource into working practices
- Future development
Framing Blended learning, teaching, and education - EADTU
Framing Blended learning, teaching, and education, by Stephan Poelmans (KU Leuven), during the EMBED event 'Implementing the European Maturity Model for Blended Education', 22 January 2020.
Analysing analytics, what is learning analytics? - Moodlerooms
Take a look at analytics in the learning ecosystem, including what sort of data is being analysed, who needs the data, and what they are going to do with it. This session also looked at the data that can come from Moodle and what questions it can help you and your institution answer.
ABLE - UKAT - Using Learning Analytics to Boost Personal Tutoring - Ed Foster
Session aims:
• Introduce learning analytics
• Describe the development of the NTU Student Dashboard
• Discuss potential benefits of learning analytics for personal tutors
• Raise some challenges of converting student information to actionable intelligence
Assessment Analytics - EUNIS 2015 E-Learning Task Force Workshop - LACE Project
This presentation introduces a discussion session at the E-Learning Task Force workshop at the 2015 EUNIS Congress. The LACE Project is very briefly introduced, followed by an explanation of the presenter's view of learning analytics and a critique of some common themes. Assessment Analytics is presented as an antithesis to these themes, and an assessment lifecycle model (used in the Jisc Electronic Management of Assessment Programme) is used to outline some ways in which assessment analytics can be realised, as a stimulus for discussion.
Scalable, Actionable, and Ethical Learning Dashboards: a reality check - Tinne De Laet
Keynote presentation at Edmedia 2018 conference: https://www.aace.org/conf/edmedia/speakers/.
Results of the Erasmus+ projects ABLE (www.ableproject.eu) and STELA (www.stela-project.eu) on learning dashboards for supporting first-year students.
Using learning analytics to improve student transition into and support throu... - Tinne De Laet
Presentation supporting the ABLE and STELA workshop titled "Using learning analytics to improve student transition into and support throughout the 1st year", delivered at the EFYE 2016 conference in Gent, Belgium.
Effective Creation, Mediation and Use of Knowledge in and about Education - EduSkills OECD
This presentation was given by Philippa Cordingley from the Centre for the Use of Research and Evidence in Education (CUREE) at the CERI Conference on Innovation, Governance and Reform in Education on 3 November 2014 during session 3.a: Knowledge-intensive Governance, Innovation and Change.
Learning Analytics for online and on-campus education: experience and research - Tinne De Laet
This presentation was used by Tinne De Laet, KU Leuven, for a keynote presentation during the event http://www.educationandlearning.nl/agenda/2017-10-13-cel-innovation-room-10-learning-and-academic-analytics, organised by Leiden University, Erasmus University Rotterdam, and Delft University of Technology.
The presentation presents the results of two case studies from the Erasmus+ projects ABLE and STELA, and provides nine recommendations regarding learning analytics.
Learning dashboards for actionable feedback: the (non)sense of chances of suc... - Tinne De Laet
Presentation at Leuven Learning Lab’s first annual Educational Technology conference day on Learning Analytics
(https://www.kuleuven.be/english/education/learning-lab).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Data-based feedback through learning dashboards: does it support the first-ye... - Tinne De Laet
Presentation supporting the EFYE 2018 pre-conference workshop "Data-based feedback through learning dashboards: does it support the first-year experience" - https://efye2018.nl/programme/parallel-sessions/
Learning analytics: the state of the art and the future - Rebecca Ferguson
Presentation given by Rebecca Ferguson at 'Nuevas métricas y enfoques para la evaluación e innovación en el aprendizaje' in Montevideo, Uruguay, on Wednesday 13 April 2016.
The talk deals with the state of the art in learning analytics, and with actions for taking this work forward at a national level.
Learning Dashboards for Feedback at Scale - Tinne De Laet
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practices. Can learning dashboards deployed at scale create new learning traces? This talk shares experiences from a large-scale deployment of learning dashboards with more than 12,000 students. Presented at laffas.eu.
Improving education by learning analytics - Tinne De Laet
These are the slides of the invited talk "Improving education by learning analytics" for the LAW studiedag 2017: https://www.maastrichtuniversity.nl/nl/events/studiedag-2017.
Learning dashboards for actionable feedback: the (non)sense of chances of suc... - Tinne De Laet
Presentation at humane event on digital transformation in higher education (http://www.humane.eu/events/seminars-and-conferences/2018/aveiro-042018/).
Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Is learning analytics applicable in more traditional higher education settings? This talk shares experiences and lessons learned from two European projects (ABLE and STELA) that aimed at developing learning dashboards for more traditional higher education institutions and integrating them within actual educational practices. The talk will challenge your beliefs regarding “chances of success” and predictive models in higher education.
Learning analytics between dream and deed: honest experiences from concrete imp... - Tinne De Laet
Presentation given during https://www.surf.nl/agenda/2018/05/seminar-aan-de-slag-met-learning-analytics/index.html.
Experiences with learning dashboards gained at KU Leuven in the context of two Erasmus+ projects, ABLE and STELA.
Learning dashboards for feedback on learning and study skills and academ... - Tinne De Laet
Keynote presentation at the LESEC annual event 2018 (https://set.kuleuven.be/LESEC/news-events/annual-event-2018) on learning dashboards. Specifically, the presentation contains findings and recommendations from large-scale pilots at KU Leuven in the context of two European Erasmus+ projects: ABLE and STELA.
Confidence in and beliefs about first year engineering student success - Tinne De Laet
This paper explores the confidence freshman engineering students have in being successful in the first study year and which study-related behaviour they believe to be important to this end. Additionally, this paper studies which feedback these students would like to receive and compares it with the experiences of second-year students regarding feedback. To this end, two questionnaires were administered: one with freshman engineering students to measure their expectations regarding study success and expected feedback and one with second-year engineering students to evaluate their first year feedback experience.
The results show that starting first-year engineering students are confident regarding their study success. This confidence is, however, higher than the observed first-year student success. Not surprisingly, first-year students have good intentions and believe that most academic activities are important for student success. When second-year students look back on their first year, their belief in the importance of these activities has strongly decreased, especially regarding the importance of preparing for classes and following communication through email and the virtual learning environment. First-year students expect feedback regarding their academic performance and engagement. They expect this feedback to focus primarily on the impact on their future study pathway rather than on comparison with peer students. Second-year students indicate that the amount of feedback they receive could be improved, but agree with the first-year students that comparative feedback is less important.
Conference Key Areas: Engineering Education Research, Attractiveness of Engineering Education, Gender and Diversity
Keywords: academic self-confidence, feedback, reasons for students success, student beliefs
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
- See how to accelerate model training and optimize model performance with active learning
- Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
- Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
UiPath Test Automation using UiPath Test Suite series, part 3 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Smart TV Buyer Insights Survey 2024 by 91mobiles - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
- Create a campaign using Mailchimp with merge tags/fields
- Send an interactive Slack channel message (using buttons)
- Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
- Your campaign sent to target colleagues for approval
- If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
- But, if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
UiPath Test Automation using UiPath Test Suite series, part 4 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
- Execution from the test manager
- Orchestrator execution result
- Defect reporting
- SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
4. WHO AM I? WHY I AM HERE?
Tinne De Laet
• woman, engineer
• associate professor
• Head of Tutorial Services, Engineering Science, KU Leuven, Belgium
5. WHO AM I? WHY I AM HERE?
Head of Tutorial Services of Engineering Science, KU Leuven
• daily experience of the challenges in the transition from secondary to higher education
• looking for opportunities for cross-fertilization between “first-year experience” and “engineering”
KU Leuven promotor of ABLE
• Erasmus+ strategic partnership project: 2015-1-UK01-KA203-013767
• Achieving Benefits from Learning Analytics
• Nottingham Trent University (UK), KU Leuven (Belgium), Leiden University (Netherlands)
• http://www.ableproject.eu/
7. WHAT IS LEARNING ANALYTICS?
no universally agreed definition
“the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” [1]
“the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data” [2]
[1] Learning and Academic Analytics, Siemens, G., 5 August 2011, http://www.learninganalytics.net/?p=131
[2] What is Analytics? Definition and Essential Characteristics, Vol. 1, No. 5, CETIS Analytics Series, Cooper, A., http://publications.cetis.ac.uk/2012/521
8. WHAT IS LEARNING ANALYTICS?
no universally agreed definition
“learning analytics is about collecting traces that learners leave behind and using those traces to improve learning” [Erik Duval, 3]
† 12 March 2016
[3] Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012, https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/
9. IS IT ABOUT INSTITUTIONAL DATA?
• high-level figures: provide an overview for internal and external reports; used for organisational planning purposes.
• academic analytics: figures on retention and success, used by the institution to assess performance.
• educational data mining: searching for patterns in the data.
• learning analytics: use of data, which may include ‘big data’, to provide actionable intelligence for learners and teachers.
[4] Learning analytics FAQs, Rebecca Ferguson, Slideshare, http://www.slideshare.net/R3beccaF/learning-analytics-fa-qs
11. SPECIFIC CONTEXT
• open admission in the Flemish (Belgian) higher education system
→ a substantial part of first-year students enters without the right qualifications
→ bachelor drop-out rate of around 40% in the Faculties of Science & Technology at KU Leuven
• the university invests in advising students before and throughout the first year
• study career guidance by study advisors assigned to a particular programme
• study advisors are professionals, trained in coaching students
• BUT study advisors too often “drive blind”: it is hard to gather the data that could support the advising session
13. METHODOLOGY
1. Development
• needs analysis with five study advisors and brainstorm sessions
• user-centred, rapid-prototyping design approach
• six iterations: four digital mock-ups and two functional dashboards
• two visualisation experts & the head of the Tutorial Services of Engineering Science
2. Result = LISSA
→ Learning dashboard for Insights and Support during Study Advice
3. Evaluation with students & study advisors
22. THE EVALUATION
Presentations to 17 study advisors for preliminary feedback
• data-supported evidence
• LISSA should not be used without a study advisor’s guidance
• LISSA should remain secondary to the experience and expertise of the study advisor
• decrease in workload: no need to “look for data” and do manual preparations
• data transparency: open and honest approach
• some concern around students protesting against the difficulty of a course
“with the facts visualised it is easier to convince the student”
“certain myths exist with students, like how some people manage to successfully complete seven exams during re-sits”
23. THE EVALUATION
97 sessions, interviews with study advisors, 15 sessions observed
LISSA supporting a personal dialogue
• the level of usage depends on the experience of the study advisor
• fact-based evidence at the side
• narrative thread
• key moments and the student’s path help to reconstruct the personal track
“When I show them the number of students that succeed in seven or eight exams, they are surprised, but now they believe me. Before, I used my gut feeling; now I feel more certain of what I say as well.”
“It’s like a main thread guiding the conversation”
“I can talk about what to do with the results, instead of each time looking for the data and puzzling it together”
“Students don’t know where to look during the conversation, and avoid eye contact. The dashboard provides them a point of focus.”
“I have changed my study method in June and now see it paid off.”
26. DISCUSSION AND LESSONS LEARNT
• Learning analytics data
• only data that is available at any higher education institute is used
• now easily available to study advisors
• Visualisation
• minimal design without animations, as LISSA is a mere supportive tool, allowing focus on the contextualisation and not on the dashboard features themselves
• visual encoding of information provides a fast and easy overview (e.g. colour coding of failed courses)
• Study advisor
• the role of the study advisor is key: critical and reflective interpretation, coaching
• LISSA leaves room for personal opinion and tacit experience
• Transparency
• LISSA uses data transparently
• not always desired for use with students with really bad results (demotivation), but can also motivate students
• comparison to peers is not considered desirable by all study advisors
• sensitive nature of histograms of courses
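The colour coding of failed courses and the per-course histograms mentioned above are simple encodings to compute. A minimal sketch in plain Python, assuming a /20 grading scale with 10 as the pass mark; the data, function names, and thresholds are hypothetical illustrations, not taken from LISSA itself:

```python
# Minimal sketch of two dashboard encodings (hypothetical data and thresholds):
# colour-coding course results and binning peers' grades for a histogram.

PASS_MARK = 10  # assumed pass mark on a /20 scale, for illustration only

def colour_code(results):
    """Map each course result to a display colour: green = passed, red = failed."""
    return {course: ("green" if grade >= PASS_MARK else "red")
            for course, grade in results.items()}

def histogram_bins(grades, bin_size=2, max_grade=20):
    """Count grades per bin of width bin_size: [0-2), [2-4), ..., with 20 in the last bin."""
    bins = [0] * (max_grade // bin_size + 1)
    for g in grades:
        bins[min(g // bin_size, len(bins) - 1)] += 1
    return bins

# One student's results, colour-coded for the overview view
student = {"Calculus": 8, "Mechanics": 13, "Chemistry": 10}
print(colour_code(student))

# Peers' grades for one course, binned for the (sensitive) course histogram
print(histogram_bins([4, 8, 9, 10, 11, 11, 14, 17]))
```

The point of the sketch is that the encoding layer is trivial; the sensitive part, as the slide notes, is deciding when showing such a histogram to a struggling student helps rather than demotivates.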
27. CONCLUSION AND REFLECTIONS
• study advisors and students “like” LISSA
• LISSA is a mere supportive tool
• expertise of study advisors is key
Future
• repeat interventions and extend (KU Leuven and Leiden University)
• extend the dashboard with other student data: looking at how student background data can be integrated in an ethical manner
• qualitative analysis using focus groups and structured interviews with students