Workshop run at the Institutional Web Management Workshop (IWMW) 2017 at the University of Kent, Tuesday 11th July 2017. Facilitated with Jon Rathmill, University of Kent.
Introducing the new learning technologist - Marieke Guy
Marieke Guy introduced herself as the new learning technologist at the university and explained that her role involves managing, researching, and enabling learning through technology. She outlined several technology initiatives already in progress, such as a digital strategy and new classroom equipment. Guy sought feedback from staff on what is and isn't working well currently and where her efforts could be most effective in improving the learning experience through technology solutions such as a redesign of the university's learning management system and online assessment tools.
Making your mind up: Formalising the evaluation of learning technologies - Marieke Guy
The document discusses the need for institutions to take a more formal approach to evaluating learning technologies. It introduces some existing evaluation frameworks like the Educause rubric and SECTIONS model. It then outlines UCEM's approach, which involved thoroughly investigating requirements, identifying systems to evaluate, developing a testing plan based on the Educause rubric, testing functionality and data flows, and involving stakeholders before selecting a new assessment platform. Attendees at the talk were asked to provide ideas on evaluation processes and challenges through a Mural board.
The document summarizes recent and upcoming work from Jisc Data Analytics to support higher education providers. Recent work includes dashboards on topics like international student impacts and postgraduate recruitment. Upcoming products include dashboards tracking Welsh HE performance and graduate outcomes. The document also summarizes findings from Jisc's 2020 student digital experience survey, including requests for more online content, technology support, and consistency in teaching methods during the pandemic. Finally, it previews Jisc's work to help universities address challenges from the pandemic like building digital skills and embracing blended learning models.
Learning analytics research and development work at University of Oslo, Norway - Jisc
The document summarizes the work of HuLAR, the Hub for Learning Analytics Research at the University of Oslo in Norway. HuLAR coordinates resources and infrastructure to support learning analytics research and development across 10 nodes/departments at the university. It oversees 16 projects exploring topics like learning design, discourse analysis, and legal frameworks. A key focus is establishing technical infrastructure for consent handling, data storage, and analysis tools while complying with privacy regulations. However, challenges include limited resources for data scientists, difficulties obtaining student consent, and complex legal aspects that restrict the use of learning analytics for quality improvement.
Transforming the student experience using learning analytics - Jisc
This document discusses using learning analytics (LA) to improve the student experience. It argues that students should be interested in LA because of its potential benefits, but that risks must be addressed. A good student-staff partnership requires clarity of purpose, advocacy, student representation, governance protecting student interests, transparency, and regular communication. The effectiveness of LA can be monitored through feedback loops, reviewed interventions, improved student outcomes, and greater understanding of teaching and learning. The document uses Greenwich University as a case study, highlighting the purpose of improved engagement and outcomes, advocacy, partnership between the university and student union, governance, an ethical framework, transparency, and student engagement. What works includes staff finding value in the data and positive student feedback on attendance monitoring.
Digital expectations and the student lifecycle: is engaging with students on ... - Jisc
Speaker: Jack Tattersall, senior account manager, Guidebook.
Students now expect their institutions to offer a full mobile experience. This 60-minute session will map out the student lifecycle in detail and demonstrate how a mobile app can drive engagement at every stage. We'll discuss the challenges that face universities as they attempt to engage with students during the prospective, onboarding and support stages of the student lifecycle.
Attendees will walk away from this session with ideas on how to drive engagement and improve support through mobile. We'll offer a self-assessment of the university's current engagement performance and an action plan for how it could be boosted through mobile technology.
FE digital student findings and recommendations - Jisc
Findings and recommendations from the FE digital student project. Presented by Sarah Knight and Paul Bailey at the Learning and teaching practice experts group on 22 April 2015
Jisc learning analytics network meeting - why are we here? - Jisc
The document summarizes the 19th Jisc Learning Analytics Networking Event that took place in October 2020. The event discussed how learning analytics has become more important for monitoring student engagement and intervention during remote learning due to COVID-19. It provided an opportunity for participants to learn about different learning analytics tools and systems, hear vendor success stories, and get guidance on governance, legal, and ethical issues. Jisc also provided updates on expanding their student success tools and dashboards, integrating with other systems, and new initiatives in curriculum and wellbeing analytics.
This document provides an agenda for the 20th Jisc Learning Analytics Network Meeting on April 21st. The meeting will include welcome and updates from Jisc staff between 09:30-10:30, followed by two presentations on learning analytics research and development from 11:00-11:45. From 11:45-12:15 the discussion will focus on how machine learning and AI can improve research in the cloud. The meeting aims to share updates and insights on learning analytics initiatives.
Transforming assessment and feedback with technology - Jisc Digifest 2016 - Jisc
Students expect their assessment experiences to be effectively supported by technology, but this can be difficult to achieve with current assessment processes, practices and systems.
This demonstration shows how our new resources, developed in collaboration with universities, colleges, and partner bodies, can help. Using the outcomes of our self-assessment tool you can develop a tailored action plan supported by proven guidance and resources to maximise the benefits that technology can offer.
The document summarizes a meeting to discuss supporting staff to teach effectively online. It introduces Jisc's digital capability service and discovery tool, which includes a self-assessment quiz to evaluate digital skills. Feedback from the tool includes next steps and resources. A new question set on effective online teaching was developed through a review process. Key areas covered include knowledge acquisition, critical engagement, knowledge application, dialogue, collaboration, content creation, and supporting online learners. Challenges discussed include accessibility, non-institutional tools, assessing collaboration, specialist practices, and developing student online learning skills. Updates provided new case studies and information on digital capability events.
The document discusses responsible disclosure in higher education. It surveys policies at universities regarding cyber issues and outlines additional approaches used in industry, like bug bounties. There were complications in directly applying industrial practices to universities. Outcomes of consulting key stakeholders included utilizing interested student groups to test low-risk systems during off-hours. Current work involves selecting initial systems for students to penetration test, with the goal of establishing a formal responsible disclosure policy.
Understanding learning gain and why this might matter to you - Jisc
The document discusses learning gain and why measuring it is important. It outlines the session which will clarify what learning gain means, consider drivers for interest in measuring it like the Teaching Excellence Framework, introduce types of learning gain measures, and discuss how learning technologies could provide data. Challenges of developing robust learning gain measures are also examined. Examples of UK universities measuring skills, engagement, and attributes are provided.
Meeting the RDM challenge - exercise - Jisc Digital Festival 2014 - Jisc
The document discusses research data management (RDM) challenges and components needed for research data services. It notes that libraries are primarily responsible for providing support and leadership, while research offices and IT departments also play roles. The Digital Curation Centre (DCC) aims to build RDM capacity and skills across UK higher education through needs assessments, policy development, guidance, training, and advocacy to support institutions in implementing RDM.
Making a difference with technology-enhanced learning - Esther Barrett, Debbi... - Jisc
Led by Esther Barrett, subject specialist - teaching, learning and assessment, Jisc.
With contributions from:
Debbie Baff, senior academic developer, Swansea University
Richard Speight, Digiskills Cymru Project Manager, Unison Cymru
There will be a focus on how technology can support learning and teaching for a better student experience. Local providers will be sharing how their technology-based approaches have made a difference for learners and teachers.
Connect more in Wales, Thursday 7 July 2016
The document discusses the role of the Chief Information Security Officer (CISO) at the University of Edinburgh. It outlines that the CISO was appointed to provide central leadership on information security risks across the university. The CISO's main responsibilities include leading the information security strategy, managing information security risks from internal and external threats, advising on security threats, and developing security policies and governance. Initial priorities for the CISO included recruiting a security team, focusing on users, overhauling risk governance, and supporting strategic projects. Keys to success are aligning with the university's digital transformation strategy, gaining buy-in from colleges, ensuring business areas own their risks, and providing supporting services through collaboration.
This presentation was given by Peter Karlberg of the National Agency for Education (Skolverket) of Sweden at the GCES Conference on Education Governance: The Role of Data in Tallinn on 13 February during the afternoon session workshop on Learning Analytics.
Jisc established standard job roles, grades and salaries after merging multiple organizations. It sought to define clear technical career pathways in addition to traditional management routes. Using the Skills Framework for the Information Age (SFIA), Jisc mapped job descriptions and skills to its grades and created pathways across roles. This allows staff to advance based on technical skills rather than management. It also establishes people management as a technical skill with its own career path.
Better Predictions, More Graduates: Data Determines Shortest Paths to Degree - Amazon Web Services
Learn how the California Community College System leverages Amazon SageMaker to understand course-taking patterns of graduates and use that data to inform discussions around current students’ shortest pathways to completion.
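The core idea here - mining the course-taking patterns of students who graduated to suggest a current student's likely next step - can be sketched without any cloud tooling. The following is a minimal, hypothetical illustration in plain Python; the real system uses Amazon SageMaker and institutional transcript data, and the course codes and prefix-matching approach below are assumptions for demonstration only:

```python
from collections import Counter

# Hypothetical transcript data: ordered course sequences of graduates.
graduate_paths = [
    ("MATH101", "MATH102", "STAT201"),
    ("MATH101", "MATH102", "CS150"),
    ("MATH101", "MATH102", "STAT201"),
]

def next_course_candidates(completed, paths):
    """Rank the courses graduates most often took next, given the
    courses a current student has already completed (prefix match)."""
    counts = Counter()
    for path in paths:
        if path[: len(completed)] == tuple(completed) and len(path) > len(completed):
            counts[path[len(completed)]] += 1
    return counts.most_common()

print(next_course_candidates(["MATH101", "MATH102"], graduate_paths))
# -> [('STAT201', 2), ('CS150', 1)]
```

A production system would replace the frequency count with a trained model and weight paths by time-to-completion, but the data shape - graduate sequences in, ranked next courses out - is the same.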
This presentation demonstrates a research to practice activity undertaken by the National Center for Technology Innovation in collaboration with an industry leader. This activity explores and addresses administrators’ needs related to assistive technology purchases, infrastructure, and integration with instructional technology plans and federal regulations such as IDEA and NCLB.
Implementing analytics - Rob Wyn Jones, Shri Footring and Rebecca Davies - Jisc
Led by Rob Wyn Jones, consultant and Shri Footring, senior co-design manager - enterprise, both Jisc.
With contribution from Rebecca Davies, pro vice-chancellor and chief operating officer, Aberystwyth University.
Connect more in Wales, 7 July 2016
Student experience experts meet up - introduction and update - Jisc
This document summarizes the agenda for a meeting of the Jisc student experience experts group. The meeting will include presentations and discussions on Jisc's recent research into student technology use during the pandemic, sharing examples of effective practices to support digital learning experiences, and short member spotlight presentations on interactive simulations and personalized teaching tools. Attendees are asked to provide feedback and discussions will aim to help guide Jisc's future student experience work.
Governing Complex Education Systems: The Use of Data, Tracey Burns, OECD - EduSkills OECD
This presentation was given by Tracey Burns of the OECD at the GCES Conference on Education Governance: The Role of Data in Tallinn on 12 February during the opening session on OECD and Governing Complex Education Systems. It looks at trends in governance and provides a detailed overview of the GCES project, explaining its main research questions, analytical model, main findings and outputs.
The design of data systems within education can be challenging due to a lack of easily accessible information and a large variety of stakeholders with differing needs. Architecting Academic Intelligence is the process of centralizing student administrative information and making it accessible to every member of the administration, faculty and staff of the City Colleges of Chicago, so as to more efficiently promote student success.
Widening Access and Participation Dashboards for Data Informed Decision Makin... - SEDA
This document summarizes a discussion paper presented at SEDA on using data dashboards to inform decisions about widening access and participation at universities. It discusses how Ulster University collects and analyzes student data to guide educational interventions and support students. Examples of data sources and visualization dashboards are provided at the university, faculty, school, and student levels. The session promoted sharing practices for making evidence-based, data-informed decisions to improve access, participation, and student outcomes.
SGCI Science Gateways: Software sustainability via on-campus teams - Webinar ... - Sandra Gesing
Achieve software sustainability via on-campus teams. SGCI can support you with a roadmap to use free resources on campus and/or build your own on-campus team.
Mobilising a nation: RDM education and training in South Africa - heila1
Big data; small data; case study; SKA; research data management; university libraries; NeDICC; NRF announcement; UCT, UP, Wits; training intervention; DCC; Carnegie
On November 21st 2014 at the Tufts University Medford campus and November 25th 2014 at the campus of the University of Massachusetts Medical School in Worcester, the BLC and Digital Science hosted a workshop focused on better understanding the research information management landscape.
Kevin Gardner, Director of Strategic Initiatives, Office of the Senior Vice Provost for Research, University of New Hampshire, described UNH's decision to implement a research information management system and the lessons learned.
This document outlines a university's journey to implement a research information management system. It discusses problems with the lack of ability to measure and understand research outputs. A working group was formed in 2011-2012 to select a system. The goals are to capture research data, integrate with other systems, and analyze research strengths in a global context. Implementation will require stakeholder buy-in and integration with various IT systems. The benefits include positioning the university as a research leader, attracting faculty and students, and complying with federal mandates.
This document provides an overview of analytics for learning and discusses implementation at GRCC. It begins with definitions of analytics, business intelligence, academic analytics, and learning analytics. It then discusses GRCC's strategic needs in areas like access to data, early alert systems, and measuring outcomes. The document outlines GRCC's analytics implementation, including hiring a data warehouse architect and campus training. It shows sample student and instructor reports in Blackboard Analytics and discusses next steps like dedicating time, building capacity, and engaging culture. It provides additional analytics resources.
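The early-alert systems mentioned among GRCC's strategic needs typically boil down to simple threshold rules over engagement data. As a hypothetical sketch only - the document does not specify GRCC's actual rules, and the field names and thresholds below are assumptions - the logic might look like this:

```python
# Hypothetical early-alert check: flag students whose recent VLE logins
# or assignment submission rate fall below illustrative thresholds.
students = [
    {"name": "A", "logins_last_14d": 1, "submitted": 2, "due": 4},
    {"name": "B", "logins_last_14d": 9, "submitted": 4, "due": 4},
]

def early_alerts(records, min_logins=3, min_submission_rate=0.75):
    """Return the names of students matching at least one risk rule."""
    flagged = []
    for s in records:
        low_logins = s["logins_last_14d"] < min_logins
        low_submissions = s["submitted"] / s["due"] < min_submission_rate
        if low_logins or low_submissions:
            flagged.append(s["name"])
    return flagged

print(early_alerts(students))  # -> ['A']
```

Real implementations (e.g. via Blackboard Analytics reports) add predictive models and human review, but threshold rules like these are the common starting point.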
1) Rensselaer Polytechnic Institute aims to incorporate data science education across its curriculum to develop "data dexterity" in every student.
2) A proposed core curriculum includes data-intensive courses in science and the major, as well as collaborative projects through a new Data INCITE laboratory.
3) The goal is for data management and analysis to become as fundamental as calculus, with open data sharing and verification of results.
Digital Capability: How digitally capable are we? - BlackboardEMEA
Is the implementation of Blackboard, Collaborate and similar tools hampered by staff and students' IT skills? Users struggle to make the most of Blackboard without basic digital capabilities such as file management and knowing what a browser is, not to mention wider capabilities such as creating and uploading videos, understanding file size issues, or utilising their mobiles - and the list goes on.
During 2014 the UCISA User Skills Group undertook their inaugural Digital Capabilities Survey and followed up with several case studies. Their research shows:
• What strategic approaches universities are taking to support staff and students with their digital capabilities
• What universities are doing to address these skills for their staff and students
• How the sector is defining digital capabilities
• What universities are doing with BYOD (bring your own device)
Speakers:
David Lewis, senior analytics consultant, Jisc
Martin Lynch, learning systems manager, University of South Wales
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, with a chance to discuss the issues and possibilities that the use of learning analytics may create.
Rachel Bruce: UK research and data management – where are we now – Jisc
The document discusses the state of research data management in UK universities. It finds that while areas like data cataloguing and access/storage systems are progressing, governance of data access/reuse and digital preservation/planning are lagging. Barriers to progress include low researcher priority, funding availability, and lack of staff/infrastructure. Gaps include defining responsibilities, standards, costs, and tools. Coordination and sharing resources across institutions is needed to help universities advance research data management.
Learning analytics: research-informed institutional practice – Yi-Shan Tsai
The document summarizes learning analytics research and initiatives at the University of Edinburgh. It discusses early MOOC and VLE analytics projects that aimed to understand student behaviors and identify patterns. It also describes the Learning Analytics Map of Activities, Research and Roll-out (LAMARR) and efforts to build institutional capacity for learning analytics. Challenges discussed include the effort required to analyze raw data and involve stakeholders. The document advocates developing critical and participatory approaches to educational data analysis.
Mol, S.T. (2014, November). Learning Analytics: The good, the bad, the ugly. Presentation delivered as part of the UvA Faculty of Economics and Business Educational Innovation Seminar Series. University of Amsterdam, the Netherlands.
The document discusses learning analytics at the University of Greenwich. It provides an overview of how the university uses student data from various sources like grades, library usage, and attendance to monitor student engagement and outcomes. Interventions are put in place if engagement drops, such as meetings with personal tutors. Apps have been created for students and tutors to view analytics data. Considerations around data privacy and transparency are also discussed. Finally, the document considers the potential role of strategic planners in interpreting learning analytics data patterns and evidence to support activities like the Teaching Excellence Framework.
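The Greenwich-style monitoring described above can be sketched as a simple threshold rule: combine engagement signals into a score and flag students for a personal-tutor meeting when it drops too low. This is an illustrative sketch only; the signal names, weights, and cut-off are invented, not the university's actual model.

```python
# Hypothetical early-alert sketch: weighted engagement signals, flag below a cut-off.
# All field names, weights and the threshold are invented for illustration.

def engagement_score(record, weights=None):
    """Combine normalised signals (each 0-1) into a single weighted score."""
    weights = weights or {"attendance": 0.4, "vle_logins": 0.3, "library_visits": 0.3}
    return sum(record[k] * w for k, w in weights.items())

def flag_for_intervention(students, threshold=0.4):
    """Return IDs of students whose engagement falls below the threshold."""
    return [s["id"] for s in students if engagement_score(s) < threshold]

students = [
    {"id": "s001", "attendance": 0.9, "vle_logins": 0.8, "library_visits": 0.5},
    {"id": "s002", "attendance": 0.2, "vle_logins": 0.1, "library_visits": 0.3},
]
print(flag_for_intervention(students))  # only s002 falls below the cut-off
```

In practice the weights and threshold would be tuned against historical outcome data, and any flag would trigger a human conversation rather than an automatic action.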
The document discusses national learning analytics in the UK and Jisc's role in providing learning analytics services. It describes Jisc's learning analytics tools and products like the Data Explorer dashboards, Study Goal app, and Learning Data Hub. It outlines Jisc's onboarding process for institutions and examples of how they are working with universities and colleges to implement learning analytics.
Blackboard TLC presentation on UCISA Digital Capabilities v4 – gillianfielding
Slides on the UCISA Digital Capabilities Survey undertaken in 2014 across the UK Higher education sector. Shows the state of the nation on digital capabilities.
This document provides an overview of a webinar on digital curation and research data management for universities. The webinar covers an introduction to digital curation, the benefits and drivers for research data management, current initiatives in UK universities, and the role of libraries in supporting research data management. Libraries are increasingly involved in developing institutional policies, providing training, and advising researchers on writing data management plans and sharing data. The webinar highlights training opportunities for librarians to develop skills in research data management and digital curation.
Ways to ensure “buy in” from the academics in the transition to digitised assessments – Marieke Guy
Ways to ensure “buy in” from the academics in the transition to digitised assessments
Marieke Guy (Head of Digital Assessment) & Claudia Cox (Digital Assessment Advisor)
Uniwise partner meeting
2nd November 2023
Assessing for a World Beyond Assessment – Marieke Guy
Marieke Guy from University College London discussed challenges with assessment and ways institutions are innovating. Assessment is a complex problem with many stakeholders. UCL is exploring new approaches like integrating artificial intelligence, offering students optionality in assessments, and designing authentic assessments that mirror real-world problems. This involves case studies of modules using videos, collaborative projects, and virtual simulations. UCL also aims to make assessment more relevant and innovative, enable technology, improve feedback, and foster student enjoyment of learning.
‘The blandness is its formulaic style’: insights to help understand the impact... – Marieke Guy
This document announces a lunch and learn session on the impact of AI on assessments. It provides six small changes that can be made now to current assessments, such as discussing academic integrity with students and revising exam questions. Larger changes are presented in an assessment menu inspired by a card game. The session will discuss issues around ubiquitous AI tools enabling easy cheating, the purpose of assessment, and moving forward with generative AI. References are provided on related topics such as AI detecting cheating, a student using ChatGPT to cheat, and universities rejecting anti-plagiarism technology.
Redesigning assessments for a world with artificial intelligence – Marieke Guy
Redesigning assessments for a world with artificial intelligence presentation By Marieke Guy, Head of Digital Assessment, UCL
QAA Annual Conference, The Future of Quality: What’s Next?
Wednesday 13 September 2023
Closing remarks: Assessment with Phill Dawson – Marieke Guy
Marieke Guy gave the closing remarks for the assessment conference at UCL. She highlighted several themes from the conference including cross-team, cross-institution, and cross-sector collaboration on digital assessment. Two talks focused on using feedback to improve student learning and preparing students for their future through valid assessments not tied to the past. The conference organizers and host King's College London were thanked for their work in bringing people together to discuss advancing assessment practices.
This document summarizes a presentation given by Simon Walker and Marieke Guy about University College London's (UCL) journey towards digital transformation of assessment and feedback.
Some key points:
- UCL implemented a secure digital assessment platform, AssessmentUCL, in response to the COVID-19 pandemic to deliver over 1,000 assessments remotely.
- Since then UCL has expanded usage of AssessmentUCL, with over 1,600 exams and 65,000 students using it in year two.
- Student and staff surveys showed mostly positive feedback but also areas for improvement like assessment weightings, duration, and content representation.
- UCL is piloting lockdown browsers, improving academic integrity, and partnering with
The document summarizes UCL's pilot of using a lockdown browser for digital assessments. It describes the rationale for using a lockdown browser, details four pilot programs conducted or planned at UCL involving different locations, devices and numbers of students, and key areas of interest being evaluated including device type, online management and invigilation, and student and staff perspectives. The goal is to assess the viability and scalability of using lockdown browsers to help ensure academic integrity for digital assessments conducted in-person.
Digital Assessment Team 2022 - a day in the life.pptx – Marieke Guy
The Digital Assessment Team at UCL provides support for digital assessment across all faculties. The team consists of specialists in different subject areas as well as learning technologists. They provide training to staff and departments on UCL's digital assessment platform AssessmentUCL. Additionally, the team works on improvements to the platform, investigates new assessment tools, and supports the use of other tools like Turnitin and Moodle. The team's workload is consistent throughout the year with no downtime between project sprints and ongoing support requests.
This document discusses various approaches to assessment using AssessmentUCL. It describes using dynamic questions and variables in multiple choice assessments. It also discusses allocating different papers or versions to students, using videos for assessments, group activities, mock scenarios, and providing improved feedback including audio/video. Other approaches mentioned include industry case studies, portfolios, infographics, rethinking coursework, and online marking.
Designing alternative assessments requires analyzing how technology tools can help or hinder learning goals, getting student feedback on new approaches, and adapting processes based on data. Assessment should be integrated into course and program design from the start and linked to learning outcomes, and attending workshops or speaking with a Digital Assessment Advisor can provide support on effective strategies.
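The dynamic-question idea mentioned above can be illustrated with a small sketch: numeric variables are drawn at random so each student sees a different variant, and distractors are built from predictable mistakes. This is a hypothetical illustration, not AssessmentUCL's actual variable mechanism; all names are invented.

```python
# Illustrative "dynamic question" generator: a parameterised percentage question
# with distractors derived from common slips. Not AssessmentUCL's real mechanism.
import random

def make_percentage_question(rng):
    whole = rng.randrange(200, 1000, 10)       # variable 1: the base number
    pct = rng.choice([10, 20, 25, 50])         # variable 2: the percentage
    correct = whole * pct / 100
    # Distractors model predictable errors: forgetting /100, halving, subtracting.
    distractors = [whole * pct, correct / 2, whole - pct]
    options = [correct] + distractors
    rng.shuffle(options)
    return {
        "stem": f"What is {pct}% of {whole}?",
        "options": options,
        "answer": options.index(correct),
    }

q = make_percentage_question(random.Random(42))
print(q["stem"], "->", q["options"][q["answer"]])
```

Seeding the generator per student gives each a reproducible variant, which makes queries about marking easy to investigate after the fact.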
MCQs_ The joys of making your mind up.pdf – Marieke Guy
Explore the benefits and challenges of using MCQs in both formative and summative assessment, and get practical guidance on designing good MCQs in AssessmentUCL.
4 March, 10.30am-11.30am. Online event.
Multiple choice questions have often had a bad rap in education, sometimes seen as assessing only lower level skills such as factual recall. However, with good question design this assessment approach can allow for testing of more complex cognitive processes. Add in the increasing sophistication of options offered by digital assessment platforms, which allow automatic grading and statistical analysis, and you can begin to significantly streamline your marking processes.
This workshop will explore the benefits and challenges of using MCQs in both formative and summative assessment and provide practical guidance on:
Constructing good MCQs
The range of MCQs available on digital platforms, focussing on AssessmentUCL.
There will be time for discussion and questions.
After attending this session, you will be able to:
Create worthwhile MCQs that test a range of learning outcomes.
Understand the range of MCQs available on digital platforms and how they can be used, focussing on AssessmentUCL.
Who should attend this session
All those engaged in teaching, assessment and the support of learning (academics, administrators, professional service colleagues).
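The automatic grading and statistical analysis that the session description mentions can be sketched in a few lines. Two classic item statistics are facility (the proportion answering correctly) and a simple discrimination index (correct rate in the strongest third minus the weakest third); the response data below is invented for illustration.

```python
# Minimal sketch of post-exam MCQ item statistics: facility and discrimination.
# Response data is invented; 1 = correct on this item.

def facility(item_scores):
    """Proportion of students who got the item right (near 0 = too hard, near 1 = too easy)."""
    return sum(item_scores) / len(item_scores)

def discrimination(item_scores, total_scores, frac=1/3):
    """Item correct rate in the top group minus the bottom group, ranked by total score."""
    ranked = [s for _, s in sorted(zip(total_scores, item_scores))]
    n = max(1, int(len(ranked) * frac))
    return sum(ranked[-n:]) / n - sum(ranked[:n]) / n

item = [1, 1, 0, 1, 0, 0, 1, 1, 0]     # per-student result on one question
totals = [9, 8, 3, 7, 2, 4, 8, 6, 1]   # each student's overall test score
print(facility(item))                  # 5/9, about 0.56
print(discrimination(item, totals))    # positive values mean the item separates strong from weak
```

Platforms report these figures automatically; the value is in acting on them, e.g. reviewing items with very low facility or near-zero discrimination.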
Rubrics_ removing the glitch in the assessment matrix (1).pdf – Marieke Guy
Rubrics bring together criteria, grades and feedback into a single scoring matrix. This session will explore how to design a good rubric and the benefits and potential challenges of using rubrics in assessments.
Would you like to increase reliability and consistency in marking, ensure alignment with intended learning outcomes and provide an efficient feedback mechanism for students? If so, this session on rubrics is for you.
Rubrics are a useful way of bringing together criteria, grades and feedback into a single scoring matrix to help streamline marking, provide transparency and support learners to understand how their performance will be judged.
This workshop will focus on the benefits and potential challenges of using rubrics in assessment within your subject area and provide practical guidance on:
How to design a good rubric
Creating and marking with rubrics in Assessment UCL
There will be opportunities for discussion and questions.
After attending this session, you will be able to:
Understand the benefits and potential challenges of using rubrics in assessment
Design an appropriate rubric for your assessments
Understand how to create and mark with rubrics in Assessment UCL
Who should attend this session
All those engaged in teaching, assessment and the support of learning (academics, administrators, professional service colleagues).
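The idea of a rubric as a single scoring matrix, criteria down the side and levels of achievement across the top, can be sketched as a small data structure. The criteria, levels, points and comments below are invented examples, not UCL's rubrics.

```python
# Sketch of a rubric as a matrix: each cell holds points plus canned feedback,
# so marking a script is just selecting one level per criterion.
# All criteria and wording here are invented for illustration.

RUBRIC = {
    "Argument":  {"excellent": (10, "Clear, well-evidenced line of argument."),
                  "good":      (7,  "Sound argument; evidence could be fuller."),
                  "weak":      (3,  "Argument unclear or unsupported.")},
    "Structure": {"excellent": (5,  "Logical flow throughout."),
                  "good":      (3,  "Mostly well organised."),
                  "weak":      (1,  "Hard to follow.")},
}

def mark_with_rubric(rubric, selections):
    """Given a chosen level per criterion, return the total score and collected feedback."""
    total, feedback = 0, []
    for criterion, level in selections.items():
        points, comment = rubric[criterion][level]
        total += points
        feedback.append(f"{criterion}: {comment}")
    return total, feedback

score, comments = mark_with_rubric(RUBRIC, {"Argument": "good", "Structure": "excellent"})
print(score)  # 12 out of a possible 15
```

Because every marker selects from the same cells, the rubric delivers the consistency and transparency the session describes, and the comments double as instant feedback to students.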
The document describes several video assessment techniques that can be used for students:
1. Students record or upload a video presentation on a topic and receive automatic feedback to improve their presentation skills.
2. Students upload a video demonstrating a skill and receive feedback to enhance their competency.
3. In a virtual classroom, students collaborate to record a group presentation and provide peer assessment on each other's contributions.
4. Students record video responses to pre-recorded questions to practice interview techniques through a standardized question/answer approach.
This document discusses alternative assessment methods and provides rationales and ideas for various approaches. It proposes using video assessments to allow students to practice real-world skills, eportfolios to enable continuous assessment and reflection, and industry case studies/scenarios to provide authentic assessments aligned with industry. Other suggestions include mini-quizzes for varied engagement and assessment, and balancing new approaches with resources. The goal is to better prepare students through assessment practices that mimic the real world.
The Transnational Online Pivot: A Case Study Exploring Online Delivery in China – Marieke Guy
This document summarizes a case study exploring the transition to online delivery of teaching content in China due to the COVID-19 pandemic. Academics from the Royal Agricultural University normally travel to China to teach students in-person, but had to shift to delivering pre-recorded lectures and holding interactive Zoom sessions. While online teaching can overcome geographical barriers, it also presents challenges like language differences and student engagement. Feedback from students indicated interactive sessions worked best when broken into shorter segments. Academics found recording lectures technically straightforward but time-consuming and felt isolated from students. Mixed methods were argued to provide multiple views for understanding the student experience in this transnational online environment.
1) The document discusses using multimedia like video and audio to enhance teaching. It provides reasons for using multimedia, such as positive student feedback about supplemental materials.
2) Options for creating multimedia at RAU are reviewed, including available hardware, Panopto for uploading and sharing videos, and tips for storyboarding and producing content.
3) Various types of multimedia that could be used are suggested, such as lecture recordings, how-to guides, interviews, and virtual open days. Accessibility and interactive options are also covered.
Who are you online? Or how to build an academic online identity… – Marieke Guy
The document discusses how to build an online academic identity by establishing profiles on websites like LinkedIn, Twitter, blogs and research profiles to promote your work, build networks, and stay informed. It provides tips on customizing profiles, sharing research and content online, engaging with other academics, and using tools to curate an online brand that establishes yourself as an expert in your field while maintaining appropriate conduct. Maintaining an up-to-date online presence can help promote the university and one's research, teaching, and career.
1. #IWMW17 #A7
The Sixty Minute (Data Dashboard)
Makeover – in 1 hour 30 minutes!
Marieke Guy, QAA
Jon Rathmill, University of Kent
Tuesday 11th July 2017
16:00 – 17:30
3.
• Data analyst at the Quality Assurance Agency for Higher
Education (QAA)
• QAA's mission is to safeguard standards and improve
the quality of UK higher education, wherever it is
delivered in the world
• Does this by:
delivering elements of revised operation model for quality assessment
managing assessment process for TEF
regulating Access to HE qualification
maintaining UK Quality Code
advising on degree awarding powers
carrying out review of Alternative Providers
Strategic international work (TNE)
Marieke Guy - QAA
4.
Jon Rathmill - University of Kent
• Planning Analyst in the Planning and Business
Information Office (PBIO)
• Previously worked in FE college MI department
• PBIO is responsible for the provision of management
information, both external and internal, on the
complete range of student academic activity.
• Migrating to use of Qlikview to display data and
statistics.
Allows users to filter and shape reports to meet their needs
Quicker to disseminate data
Visualisations improve understanding
Standardised design across dashboards to help users
Allows greater data control – only certain users see certain things
5.
Overview of the Workshop
Time Session
Introduction session
16:00 – 16:30 Data in Higher Education (MG & JR)
Practical session: Sixty second dashboards
16:30 – 16:45 User stories (All)
16:45 – 17:00 Data sources (All)
17:00 – 17:20 Designing a dashboard (All)
Show and tell and feedback
17:20 – 17:30 Delegates present their dashboard (All)
8.
• DLHE – Destinations of Leavers from HE
• NSS – National Student Survey
• LEO – Longitudinal Education Outcomes
• POLAR – Participation of Local Areas
• PRES – Postgraduate Research Experience Survey
• KIS – Key Information Sets
• HE-BCI – HE Business and Community Interaction
• JACS – Joint Academic Coding System
• HECoS – Higher Education Classification of Subjects
• CAH – Common Aggregation Hierarchy
• HESA – Higher Education Statistics Agency
• TEF – Teaching Excellence Framework
• REF – Research Excellence Framework
Acronyms…
9.
“The HE sector has always been a data-rich
sector, and universities generate and use
enormous volumes of data each day.
However, the sector has not yet capitalised
on the enormous opportunities presented by
the data revolution, and is lagging behind
other sectors in this area.”
From Bricks to Clicks report, Higher Education Commission
10.
• Data everywhere in the HE sector:
Collection by HESA and through surveys
Use in TEF, REF, league tables
• Data collection/use is (to some extent) in hand
(HEDIIP/Data Futures) but data analysis isn't
• Data ownership and management is slowly evolving
• Data seeping in to all aspects of HE provision and
decision making: programme design, retention, WP,
learning analytics
• Many staff concerned about their data capabilities
• Agencies need to work together (Bell review)
Data in HE
11.
“A business intelligence dashboard (BI
dashboard) is a software interface that
provides preconfigured or customer-defined
metrics, statistics, insights and visualisations
into current data.
It allows users to view instant results into
the live performance state of business or
data analytics.”
Techopedia
12.
Where the Work is - IPPR
http://wheretheworkis.org/ - Institute for Public Policy Research
15.
• HESA data:
Summary data from the HESA Student, Staff and Finance submissions
HE Business and Community Interaction (HE-BCI), Estates Management
and Destinations collections
Performance Indicators, Student Staff Ratios, Aggregate Offshore Record
(AOR)
• Heidi - web-based management information
service developed for accessing, extracting and
manipulating data
• Heidi plus – New BI service, more granularity,
more visualisation opportunities
HESA, Heidi and Heidi Plus
16.
• National analytics experimentation project led by
Jisc and HESA
• Aim to refresh Heidi Plus content with insights from
a wide range of alternative data sources
• Teams comprise staff from multiple institutions
• Teams themed – agile approach
• Working towards ‘proof of concept’ dashboards
• These dashboards will be further developed by
HESA
• 130 people, 70 institutions, on 6th round
BI Analytics Labs
18.
Team member Institution
Tom Wale University of Oxford (Product Owner)
Jon Rathmill University of Kent
Carolyn Deeming Plymouth University
Marieke Guy QAA
Elena Hristozova University of Nottingham
Myles Danson Jisc (Scrum Master)
Kris Popat Cetis (Data Wrangler)
Neil Richards HESA/Jisc (Data viz)
Team Tom
19.
• Employability (including TEF planner)
• Staff
• Market insights
• Library resources
• Finance
• FE – in particular manufacturing and how it links to
local FE colleges
• Research
Theme areas
20.
Research Student Experience
I want to: Understand the progress and
completion of my research students,
and the issues that they face
So that I can: Ensure that my institution
can improve its research student
experience and the proportion
graduating, and understand how this
compares with others
Brexit and International Exposure
I want to: Understand my exposure to the EU
by subject for research in terms of research
students, staff and research income
So that I can: Understand what gaps I might
have should European students, staff and
income dry up, and whether this is particularly
different from other institutions
Preparation for REF
I want to: Understand how my
institution’s activity since REF14
contributes to REF21
So that I can: Understand how my
institution’s performance compares
with others operating in the same
subject areas
User stories
21.
• Low shelf data
Publicly available & openly licensed
Vast, distributed, no common vocabulary, complex
May be patchy
Not designed to be combined with other data
Examples include demographic, geo-spatial, international, census
• High shelf data
Closed licensing - available by subscription or is locked to third
party organisations
Examples include funding and regulatory, local councils, Government
bodies, fees and admissions, careers and trajectory, current study data,
staff, research, financial, estates or even institutions themselves
Low shelf vs high shelf data
22.
• HESA Student
• HESA Staff
• HESA Finance
• HEA – PRES data
• Eurostat
http://ec.europa.eu/eurostat/web/education-and-training/data/database
• EU Community Research and Development
Information Service (CORDIS)
• RCUK – funding awarded – Gateway to Research
Data sources used
27.
• Brainstorm possible themes
• Decide on a theme to be explored
• Craft up to three user stories for your dashboard
using the user stories template
In your groups
30.
• Brainstorm HESA data fields that may be useful
(student, staff, finance, other)
• Brainstorm possible external data sources to use
• Have a look at: http://heidi-ckan.dev.jisc-betas.net/
• Think about connections between your data sets
(unique IDs)
• Think about the possible challenges your choices
may pose:
High shelf vs low shelf data
Data quality, availability, licence
Time, cost, date
In your groups
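The "connections between your data sets (unique IDs)" step above boils down to a key-based join. UKPRN (the UK Provider Reference Number) is a real shared identifier across HESA collections, but the records and field names below are invented, and plain dictionaries stand in for whatever BI tool you actually use.

```python
# Sketch of linking an institutional data set to an external one on a shared
# unique ID (UKPRN). All figures and field names are invented for illustration.

hesa_student = [
    {"ukprn": "10007150", "provider": "Example University", "fte": 18500},
    {"ukprn": "10004930", "provider": "Sample College", "fte": 6200},
]

# e.g. an openly licensed "low shelf" data set, keyed on the same UKPRN
external = {
    "10007150": {"region": "South East", "grad_vacancies": 4200},
}

def left_join(rows, lookup, key):
    """Attach matching lookup fields to each row; rows with no match keep only their own fields."""
    return [{**row, **lookup.get(row[key], {})} for row in rows]

for row in left_join(hesa_student, external, "ukprn"):
    print(row)
```

The unmatched second row illustrates the data-quality challenge the slide flags: external sources are often patchy, so a dashboard built on the join must handle the gaps honestly rather than hide them.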
32.
• Pick an artist
• Agree on one user story to develop
• Create some sketches of what your dashboard
could look like
Think about potential users
Consider usability, layout, colour
Consider filters, searches, titles, legends, navigation
Sense check that it tells an honest story!
In small groups
33.
“It’s not good enough for there to
be a looming fear in the sector; we
should have an open forum for
debate about the detail of data, and
the best ways to use it.”
Ant Bagshaw, Wonkhe
34.
• Make friends:
Find out who deals with data in your organisation
Find out what tools they use
Build up links with them
• Think about data:
Start thinking about data, visualising data and the complexities of data
What data do you have? Google analytics, other?
Are there opportunities for embedding it on your website?
• Watch out for the M5 (Jisc, HESA, QAA) data
conference (3rd November – London)
• Get your staff involved in the BI Analytics Labs
work (Myles.Danson@jisc.ac.uk)
Future activities