The document outlines the agenda for the 8th UK Learning Analytics Network Meeting at the Open University on November 2nd, 2016. The agenda includes updates on Jisc's learning analytics program, sessions on learning design and analytics, legal issues, and the Learning Analytics Community Exchange.
Learning Analytics: Seeking new insights from educational data – Andrew Deacon
CPUT Fundani TWT - 22 May 2014
Analytics is a buzzword that encompasses the analysis and visualisation of big data. Current interest results from the growing access to data and the many software tools now available to analyse this data in Higher Education, through platforms such as Learning Management Systems. This seminar provides an overview of current applications and uses of learning analytics and how it can help institutions of learning better support their learners. The illustrative examples look at institutional and social media data that together provide rich insights into institutional, teaching and learning issues. A few simple ways to perform such analytics in the context of Higher Education will be introduced.
Open Learning Analytics panel at Open Education Conference 2014 – Stian Håklev
The past five years have seen a dramatic growth in interest in the emerging field of Learning Analytics (LA), and particularly in the potential the field holds to address major challenges facing education. However, much of the work in the learning analytics landscape today is closed in nature, small in scale, tool- or software-centric, and relatively disconnected from other LA initiatives. This lack of collaboration, openness, and system integration often leads to fragmentation where learning data cannot be aggregated across different sources, institutions only have the option to implement "closed" systems, and cross-disciplinary research opportunities are limited. Beyond the immediate concerns this fragmentation creates for educators and learners, a closed approach dramatically limits our ability to build upon successes, learn from failures and move beyond the "pockets of excellence (and failures)" approach that typifies much of the educational technology landscape.
The potential benefits of openness as a core value within the learning analytics community are numerous. Learning initiatives could be informed by large scale research projects. Open-source software, such as dashboards and analytics engines, could be available free of licensing costs and easily enhanced by others, and OERs could become more personalized to match learners' needs. Open data sets and reproducible papers could rapidly spread understanding of analytical approaches, enabling secondary analysis and comparison across research projects. To realize this future, leaders within the learning analytics, open technologies (software, standards, etc.), open research (open data, open predictive models, etc.) and open learning (OER, MOOCs, etc.) fields have established a "network of practice" aimed at connecting subject matter experts, projects, organizations and companies working in these domains. As an initial organizing event, these leaders organized an Open Learning Analytics (OLA) Summit directly following the 2014 Learning Analytics and Knowledge (LAK) conference this past March as a means to further the goal of establishing "openness" as a core value of the larger learning analytics movement. Additional details on the Summit and those involved can be found at: http://www.prweb.com/releases/2014/04/prweb11754343.htm.
This panel session will bring together several thought leaders from the Open Learning Analytics community who participated in the Summit to facilitate an interactive dialog with attendees on the intersection of learning analytics and open learning, open technologies, open data, and open research. The presenters represent a broad range of experience with institutional analytics projects, an open source development consortium, the sharing of open learner data, and academic research on open learning environments.
Speakers:
David Lewis, senior analytics consultant, Jisc
An opportunity to find out how an institution has been implementing learning analytics to support the student journey, and to discuss the issues and possibilities that the use of learning analytics may create.
Co-developing bespoke, enterprise-scale analytics systems with teaching staff – Danny Liu
Presentation at the NSW Learning Analytics Working Group meeting, 3 February 2016, at the University of Technology, Sydney. Covering projects from Macquarie University and the University of Sydney.
A Pulse of Predictive Analytics In Higher Education – Civitas Learning
Civitas Learning presents the findings of our survey conducted during the September 2014 Civitas Learning Summit, where more than 100 leaders representing 40 Pioneer Partner institutions gathered to share more on their work. The survey, distributed to all participants, resulted in 74 responses highlighting how this cross-section of higher education institutions is using advanced analytics to power student success initiatives.
Learning Analytics: What is it? Why do it? And how? – Timothy Harfield
Presentation delivered to graduate students at Emory University as part of a TATTO (Teaching Assistant Training and Teaching Opportunity) brown bag session.
ABSTRACT
Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Data-driven approaches to teaching and learning are rapidly being adopted within educational environments, but there is still much confusion about what learning analytics is, what it can do, and how it is best employed.
This talk will provide a general overview of the field of learning analytics, its terminology and methods, as well as contemporary ethical debates. It will also introduce several open source and Emory-supported analytics tools available to students and instructors to facilitate the achievement of various learning outcomes.
Online Educa Berlin conference: Big Data in Education - theory and practice – Mike Moore
Online Educa Berlin Conference Presentation
Big Data in Education - Theory and Practice
Presented December 6, 2013 by Mike Moore, Sr. Advisory Consultant – Analytics, Desire2Learn, Inc.
Learning design meets learning analytics – Dr Bart Rienties, Open University
8th UK Learning Analytics Network Meeting, The Open University, 2nd November 2016
1) The power of 151 Learning Designs on 113K+ students at the OU?
2) How can we use learning design to empower teachers?
3) How can Early Alert Systems improve Student Engagement and Academic Success? (Amara Atif, Macquarie University)
4) What evidence is there that learning design makes a difference over time to how students engage?
Learning analytics: Threats and opportunities – Martin Hawksey
Slides used at ALT's White Rose Learning Technologist's SIG to introduce threats and opportunities for using Learning Analytics. Links related to this presentation are at http://bit.ly/LAWhiteRose
Blackboard’s data science team conducts large-scale analysis of the relationship between the use of our academic technologies and student impact, in order to inform product design, disseminate effective practices, and advance the base of empirical research in educational technologies.
In this presentation, John Whitmer, Director of Analytics & Research, will discuss findings from 2016. Some findings challenge our conventional knowledge, while others confirm what we believed to be true.
Archived presentation made to the Jisc Learning Analytics workgroup on Feb 22, 2017.
Learning Analytics (or: The Data Tsunami Hits Higher Education) – Simon Buckingham Shum
Keynote address to "The Impact of Higher Education: Addressing the Challenges of the 21st Century", European Association for Institutional Research (EAIR) 35th Annual Forum 2013, Erasmus University, Rotterdam, the Netherlands, 28-31 August 2013. http://www.eair.nl/forum/rotterdam
Brouns, Firssova, Kalz - Is there value of learning analytics in MOOCs? - Da... – EUmoocs
This presentation by Francis Brouns, Olga Firssova, Marco Kalz was given at the Data Science & Social Research International Conference in Naples on 19 February 2016. To find out more visit http://project.europeanmoocs.eu/project/publications/
Jisc learning analytics MASHEIN Jan 2017 – Paul Bailey
Jisc Learning Analytics presentation at the Leading Digital Learning: Key Issues for Small and Specialist Institutions event organised by MASHEIN (Management of Small Higher Education Institutions Network).
What data from 3 million learners can tell us about effective course design – John Whitmer, Ed.D.
Presentation of research findings and implications from a large-scale analysis of LMS activity and grade data from across 927 institutions, 70,000 courses, and 3.3 million students. This webinar will speak to the promise (and potential pitfalls) of large-scale learning analytics research to promote student success.
This slide deck was used at the ISO/IEC JTC1 SC36 Plenary Meeting on June 22, 2015.
Its title is 'Proof of Concept for Learning Analytics Interoperability' and its subtitle is 'Reference Model based on open source SW'.
Talk given by Rebecca Ferguson at the iLife event 'Health, Education and Lifestyle in the Digital Era' organised by Maastricht University at the Bonbonniere, Maastricht on 24 November 2015.
The talk focuses on the 'Visions of the Future' Policy Delphi study carried out by the Learning Analytics Community Exchange (LACE) project.
The study focuses on eight possible visions of the future of learning analytics:
1. In 2025, classrooms monitor the physical environment to support learning and teaching
2. In 2025, personal data tracking supports learning
3. In 2025, analytics are rarely used in education
4. In 2025, individuals control their own data
5. In 2025, open systems for learning analytics are widely adopted
6. In 2025, learning analytics systems are essential tools of educational management
7. In 2025, analytics support self-directed autonomous learning
8. In 2025, most teaching is delegated to computers
More details of the study are available at laceproject.eu
What are wikis? What are their basic principles? How can they be used in the educational process? What has research shown? What is an appropriate teaching approach? etc.
Materials for an introduction to adaptive learning and learning analytics, as well as interoperability standardization efforts. These slides briefly cover the concept of adaptive learning, a reference model for learning analytics, data APIs for learning analytics, and the topic list of the standardization community (ISO/IEC JTC1 SC36).
Being FAIR: FAIR data and model management, SSBSS 2017 Summer School – Carole Goble
Lecture 1:
Being FAIR: FAIR data and model management
In recent years we have seen a change in expectations for the management of all the outcomes of research – that is the "assets" of data, models, codes, SOPs, workflows. The "FAIR" (Findable, Accessible, Interoperable, Reusable) Guiding Principles for scientific data management and stewardship [1] have proved to be an effective rallying-cry. Funding agencies expect data (and increasingly software) management, retention and access plans. Journals are raising their expectations of the availability of data and codes for pre- and post-publication. The multi-component, multi-disciplinary nature of Systems and Synthetic Biology demands the interlinking and exchange of assets and the systematic recording of metadata for their interpretation.
Our FAIRDOM project (http://www.fair-dom.org) supports Systems Biology research projects with their research data, methods and model management, with an emphasis on standards smuggled in by stealth and sensitivity to asset sharing and credit anxiety. The FAIRDOM Platform has been installed by over 30 labs or projects. Our public, centrally hosted Asset Commons, the FAIRDOMHub.org, supports the outcomes of 50+ projects.
Now established as a grassroots association, FAIRDOM has over 8 years of experience of practical asset sharing and data infrastructure at the researcher coal-face ranging across European programmes (SysMO and ERASysAPP ERANets), national initiatives (Germany's de.NBI and Systems Medicine of the Liver; Norway's Digital Life) and European Research Infrastructures (ISBE) as well as in PI's labs and Centres such as the SynBioChem Centre at Manchester.
In this talk I will explore how FAIRDOM has been designed to support Systems Biology projects and show examples of its configuration and use. I will also explore the technical and social challenges we face.
I will also refer to European efforts to support public archives for the life sciences. ELIXIR (http://www.elixir-europe.org/) is the European Research Infrastructure of 21 national nodes and a hub, funded by national agreements to coordinate and sustain key data repositories and archives for the Life Science community, improve access to them and related tools, support training and create a platform for dataset interoperability. As Head of the ELIXIR-UK Node and co-lead of the ELIXIR Interoperability Platform, I will show how this work relates to your projects.
[1] Wilkinson et al., "The FAIR Guiding Principles for scientific data management and stewardship", Scientific Data 3 (2016), doi:10.1038/sdata.2016.18
EMMA Summer School - Rebecca Ferguson - Learning design and learning analytic... – EUmoocs
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs. Participants will find it helpful to have registered for FutureLearn and explored the platform for a short time in advance of the workshop.
This presentation was given during the EMMA Summer School, which took place in Ischia (Italy) on 4-11 July 2015.
More info on the website: http://project.europeanmoocs.eu/project/get-involved/summer-school/
Follow our MOOCs: http://platform.europeanmoocs.eu/MOOCs
Design and deliver your MOOC with EMMA: http://project.europeanmoocs.eu/project/get-involved/become-an-emma-mooc-provider/
Aligning Learning Analytics with Classroom Practices & Needs – Simon Knight
The Learning Analytics Research Network (LEARN) invites you to join us for a talk about the exciting ways in which the University of Technology Sydney is using participatory design to augment existing classroom practices with learning analytics. Simon Knight, a LEARN Visiting Scholar from the University of Technology Sydney, will introduce a variety of projects, including their work developing analytics to support student writing.
Come meet others at NYU interested in learning analytics while learning from the examples of leading work in Australia. A light lunch will be served and the talk will be followed by a short Q&A. RSVP is required.
About Simon Knight
Simon Knight is a lecturer at the University of Technology Sydney in the Faculty of Transdisciplinary Innovation. His research investigates how people find and evaluate evidence, particularly in the context of learning and educator practices. Dr Knight received his Bachelor’s degree in Philosophy and Psychology from the University of Leeds before completing a teacher education program and a Philosophy of Education MA at the UCL Institute of Education. After teaching high school social sciences, Dr Knight completed an MPhil in Educational Research Methods at Cambridge and a PhD in Learning Analytics at the UK Open University.
About Simon’s Talk
How do we make use of data about our students to support their learning, and where does learning analytics fit into that? Educators are increasingly asked to work with data and technologies such as learning analytics to support and provide evidence of student learning. However, what learning analytics developers should design for, and how educators will implement analytics, is unclear. Learning analytics risks the same low uptake and implementation as many other educational technologies if it does not align with educator practice and needs. How then do we tackle this gap, to support and develop technologies that are implemented in practice, for impact on learning?
At the University of Technology Sydney, we have taken a participatory, design-based approach to designing and implementing learning analytics in practice, and to understanding their impact. In our work we have identified existing practices with which learning analytics may be aligned in order to augment them. This talk introduces some of these projects, particularly drawing on our work in developing analytics to support student writing (writing analytics), giving examples of how analytics were aligned with existing pedagogic practices to support learning. Through this augmentation, supported by design-based approaches, we argue we can develop research and practice in tandem.
The following is a presentation given at the Open Apereo conference 2015. It provides updates on the Apereo Learning Analytics initiative and the work that has been implemented in the year since its inception in June 2014.
Joining it all up: developing research-practice linkages in the UK – Hazel Hall
Seminar presentation on efforts to strengthen research-practice linkages in librarianship and information science in the UK since 2009 presented to the School of Business and Economics, Åbo Akademi University, Finland on Thursday 13th March 2014. There is a fuller report of my work visit to Finland at http://hazelhall.org/2014/03/17/social-media-and-public-libraries-a-doctoral-defence-in-finland/.
Similar to Jisc learning analytics update - Nov 2016
ALT-C 2019 Jisc curriculum analytics - full set of slides – Paul Bailey
A deep dive into student data to discover curriculum insights
Authors: Paul Bailey, Niall Sclater, Michael Webb, Alan Paull, and Scott Wilson
A full set of slides around curriculum analytics.
Jisc technology to tutoring new and emerging developments – Paul Bailey
Presentation on 27 January at the Centre for Recording Achievement seminar on Technology to support 21st Century Tutoring: new and emerging developments. Paul Bailey, Lisa Grey and Ruth Drysdale.
2. Programme
10:25 – 11:15 Update on Jisc’s learning analytics programme
11:15 – 11:30 Tea / coffee
11:30 – 12:30 Learning design meets learning analytics, Dr Bart Rienties, Open University
12:30 – 13:30 Lunch
13:30 – 14:15 Parallel session 1: Legal issues for learning analytics, Andrew Cormack, Jisc; Parallel session 2: Addressing the challenges, Il-Hyun Jo, Ewha Womans University
14:15 – 15:00 Parallel session 1: The potential of blockchain, Prof John Domingue, Knowledge Media Institute, OU; Parallel session 2: The design and deployment of a learning analytics dashboard, David Evans, North Warwickshire & Hinckley College
15:00 – 15:15 Tea / coffee – Juniper/Medlar Room, The Hub
15:15 – 15:55 The Learning Analytics Community Exchange, Dr Doug Clow, Institute for Educational Technology, OU
3. Paul Bailey, Senior Codesign Manager, Research and Development
Jisc learning analytics service
http://www.slideshare.net/paul.bailey/
6. Effective Learning Analytics Challenge
Rationale
» Organisations wanted help to get started and have access to standard tools and technologies to monitor and intervene
Priorities identified
» Code of Practice on legal and ethical issues
» Develop basic learning analytics service with app for students
» Provide a network to share knowledge and experience
Timescale
» 2015-16 – test and develop the tools and metrics
» 2016-17 – transition to service
» Sep 2017 – launch, measure impact: retention and achievement
7. Jisc’s Learning Analytics Project
Three core strands: Learning Analytics Service, Toolkit, and Community.
11. Analytics without a national approach (diagram)
Maturity of analytics, from raw data upwards: Data; Ordered Data; Descriptive Analytics (what happened? how do I compare?); Diagnostic Analytics (why did it happen?); Predictive Analytics (what will happen?); Prescriptive Analytics (what should I do?); Automated (it’s done).
Stages of sector adoption: Awareness; Experimentation; Organisation support; Organisational transformation; Sector transformation.
13. Analytics with a national approach (diagram)
The same maturity levels, now resting on standardised data, with supporting tools at each level:
- Data: data warehouse, data stores; data connectors
- Descriptive Analytics (what happened? how do I compare?): dashboards, benchmarking etc.
- Diagnostic Analytics (why did it happen?): data exploration tools, processes etc.
- Predictive Analytics (what will happen?): predictive models, intervention management etc.
- Prescriptive Analytics (what should I do?): recommendation engines etc.
- Automated (it’s done): adaptive learning etc.
Stages of sector adoption: Awareness; Experimentation; Organisation support; Organisational transformation; Sector transformation.
15. Sector data used in mashups:
- NSS
- SCONUL
- LiDP
- HESA
- Open Access Reporting/Deposit
- JUSP / IRUS
- IMD
- Altmetrics
- H-index
- Impact Factor
- REF metrics
- Jisc Collections bands & subscription data
Library Labs: 6 teams, 33 participants drawn from libraries.
16. Library Analytics
Library Labs – but also analytics on institutional data:
- e-resource usage by type & department
- e-resource cost benchmarking
- EZProxy logs
- Loans
- Gate entries
- Acquisitions
- COUNTER reports
- Capita Decisions
- Journal Citation Reports
17. Library Analytics
Library Labs participants:
Birkbeck, University of London
Sheffield Hallam University
University of Edinburgh
University of Warwick
The University of Manchester
University of Salford
Liverpool John Moores University
Newcastle University
Southampton Solent University
Anglia Ruskin University Library
University of South Wales
University of Nottingham
Brunel University London
Kingston University
Teesside University
Bodleian Libraries, University of Oxford
University of Wolverhampton
University of Leicester
University of Reading
Manchester Metropolitan University
University of Bath
De Montfort University
18. Library Analytics
- Mashing up library data was difficult – SCONUL is not HESA
- Many different internal systems make comparative analytics difficult
- Proof-of-concept dashboards are stimulating institutions (traffic lights)
- More interest in, and contributions to, recipes at http://github.com/jiscdev/xapi-lib
- New verbs! Eduroam, presence
- Data sharing agreements and an experimental area in the Heidi Lab
- Scope for more librarians alongside planners on Jisc’s beta BI project
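To make the recipe idea concrete, here is a minimal Python sketch of an xAPI statement for a library gate entry. It is illustrative only: the verb and activity IRIs, the homePage, and the student identifier are placeholders, not the identifiers actually defined in the jiscdev/xapi-lib recipes.

```python
import json
import uuid
from datetime import datetime, timezone

def gate_entry_statement(student_id: str, library: str) -> dict:
    """Build a minimal xAPI statement recording a library gate entry.

    The IRIs below are illustrative placeholders, not the official
    identifiers from the jiscdev/xapi-lib recipes.
    """
    return {
        "id": str(uuid.uuid4()),
        "actor": {
            "objectType": "Agent",
            "account": {"homePage": "https://example.ac.uk", "name": student_id},
        },
        "verb": {
            "id": "https://example.org/xapi/verbs/entered",  # placeholder IRI
            "display": {"en": "entered"},
        },
        "object": {
            "objectType": "Activity",
            "id": f"https://example.ac.uk/library/{library}",
            "definition": {"name": {"en": f"{library} library gate"}},
        },
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

print(json.dumps(gate_entry_statement("s1234567", "main"), indent=2))
```

A "new verb" such as eduroam presence would follow exactly this shape, just with a different verb IRI and activity.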
23. Learning analytics products and tools
- Learning records warehouse – active
- Data Explorer – basic visualisations
- Student Unified Data Definition – version 1.2.7 with examples, major SRS support and a validation tool
- VLE – xAPI recipe and plugins for Blackboard and Moodle
- Attendance tracking – xAPI recipe (being piloted soon)
- Student App – release 1, Dec 2016
- Tribal Student Insights (10)
- Open Learning Analytics Processor (4)
- Further learning analytics product pilots (tbc)
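Statements produced by the VLE and attendance recipes reach the learning records warehouse over the standard xAPI HTTP interface. Below is a minimal sketch of a batch upload in Python; the endpoint URL and credentials are placeholders, since the real LRW connection details are issued per institution and are not part of these slides.

```python
import requests  # third-party: pip install requests

# Placeholder endpoint and credentials: a real deployment would use the
# LRW URL and keys issued to the institution, which these slides do not give.
LRS_URL = "https://lrw.example.ac.uk/data/xAPI/statements"
AUTH = ("client_key", "client_secret")

def post_statements(statements: list[dict]) -> list[str]:
    """POST a batch of xAPI statements; the LRS returns the stored ids."""
    resp = requests.post(
        LRS_URL,
        json=statements,
        auth=AUTH,
        headers={"X-Experience-API-Version": "1.0.3"},  # required by the xAPI spec
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```

Sending statements as a batch in one request is standard xAPI behaviour and avoids a round trip per event.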
24. UDD Validator Tool
• Customer-side UDD validation (web-based, secure access)
• UDD data preparation tool for institutions
• Jisc will load the historical data (once validated)
• Covers current & future UDD – 1.2.7, 1.2.x, 1.3.0 etc.
• Links directly to the UDD GitHub site (dynamic updates)
• Agile approach to software functionality/release
• V1.0 – hard validation (UDD structure, optional/mandatory fields, field contents)
• Relational entities – integrity checks
• Soft validation – data quality and concentration/coverage (working with Tribal / Unicon Marist)
• Focus on key fields for predictive modelling purposes and the student app
• Gives control & flexibility to our members – rapid data validation (Azure Cloud)
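The "hard validation" and relational integrity checks described above can be pictured with a small sketch. The field names below are invented for illustration; the real UDD 1.2.7 schema defines its own entities and mandatory fields.

```python
# A minimal sketch of hard validation (mandatory fields) plus a relational
# integrity check. Field names are illustrative, not the real UDD schema.
MANDATORY = {"STUDENT_ID", "COURSE_ID", "START_DATE"}

def validate_students(students: list[dict], known_courses: set[str]) -> list[str]:
    """Return a list of human-readable validation errors."""
    errors = []
    for i, row in enumerate(students):
        missing = MANDATORY - row.keys()
        if missing:
            errors.append(f"row {i}: missing mandatory fields {sorted(missing)}")
        # Relational integrity: every student row must reference a known course
        course = row.get("COURSE_ID")
        if course is not None and course not in known_courses:
            errors.append(f"row {i}: COURSE_ID {course!r} not found in course table")
    return errors

rows = [
    {"STUDENT_ID": "s1", "COURSE_ID": "C100", "START_DATE": "2016-09-01"},
    {"STUDENT_ID": "s2", "COURSE_ID": "C999"},
]
print(validate_students(rows, known_courses={"C100"}))
```

Running this reports the missing START_DATE and the unknown COURSE_ID on the second row, which is the kind of feedback the web-based tool gives institutions before Jisc loads their historical data.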
25. Implementations
Profile | Aims | Tools | No. | Data sources
Teaching and research led universities | Student retention and success | Tribal Student Insight / data warehouse | 7 | VLE (Moodle and Blackboard), student records and attendance
Teaching and research led universities | Success and engagement | Student app | 4 | VLE (Moodle and Blackboard), student records
Teaching led universities | Student retention | Open source processors / data warehouse | 4 | VLE (Moodle and Blackboard), student records and attendance
FE colleges | Student retention | Tribal Student Insight | 2 | VLE (Moodle), student records and attendance
27. On-boarding Process
Stage 1: Orientation
Stage 2: Discovery
Stage 3: Culture and Organisation Setup
Stage 4: Data Integration
Stage 5: Implementation Planning
https://analytics.jiscinvolve.org/wp/on-boarding/
28. Stage 1: Orientation
1. Sign up to the analytics mailing list
Evidence required: a list of people in your institution signed up to the mailing list
2. Review the learning analytics blog posts and relevant reports
Evidence required: notes on useful articles and posts you have found
3. Attend a Jisc webinar, network meeting or workshop
Evidence required: notes from attending a recent event
29. Stage 2: Discovery
4. Decide on institutional aims for learning analytics
Evidence required: a prioritised list of your aims for learning analytics
5. Strategic alignment, senior management approval and a nominated project lead
Evidence required: named sponsor from the senior management team; named project lead and contact details; named technical lead and contact details; a list of members of your working/management group
6. Undertake the readiness assessment
Evidence required: a completed readiness assessment questionnaire with your commentary on the answers
7. Arrange a verification meeting with Jisc to discuss the outcomes and possible next steps
Evidence required: date of meeting, documentation to share and a list of people attending
30. Discovery readiness
A supported review of institutional readiness: https://analytics.jiscinvolve.org/wp/on-boarding/step-6-readiness-assessment/
Please provide a commentary on your response to each question where appropriate.
Topic | ID | Question | Response score
Leadership | 1 | The institutional senior management team is committed to using data to make decisions | 0 - hardly or not at all; 1 - to some extent; 2 - to a great extent
Leadership | 2 | Our vice-chancellor / principal has encouraged the institution to investigate the potential of learning analytics | 0 - hardly or not at all; 1 - to some extent; 2 - to a great extent
Leadership | 3 | There is a named institutional champion / lead for learning analytics | 0 - no; 2 - yes
Vision | 4 | We have identified the key performance indicators that we wish to improve with the use of data | 0 - hardly or not at all; 1 - to some extent; 2 - to a great extent
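The questionnaire lends itself to simple aggregate scoring. The sketch below encodes the four questions shown and sums responses into a percentage; the aggregation rule is an assumption for illustration, not Jisc's actual scoring method.

```python
# Question set and 0/1/2 scales come from the slide above; the percentage
# aggregation is an illustrative assumption, not Jisc's scoring rule.
QUESTIONS = {
    1: ("Leadership", "Senior management committed to using data", (0, 1, 2)),
    2: ("Leadership", "VC/principal encouraged investigating LA", (0, 1, 2)),
    3: ("Leadership", "Named institutional champion for LA", (0, 2)),
    4: ("Vision", "KPIs to improve with data identified", (0, 1, 2)),
}

def readiness_score(responses: dict[int, int]) -> float:
    """Return the total score as a percentage of the maximum possible."""
    total, maximum = 0, 0
    for qid, (_topic, _text, scale) in QUESTIONS.items():
        score = responses.get(qid, 0)
        if score not in scale:
            raise ValueError(f"Q{qid}: {score} not in allowed scale {scale}")
        total += score
        maximum += max(scale)
    return 100 * total / maximum

print(readiness_score({1: 2, 2: 1, 3: 2, 4: 1}))  # -> 75.0
```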
31. Stage 3: Culture and Organisation Setup
8. Start to address readiness recommendations
Evidence required: action plan to address readiness recommendations
9. Legal and ethical policy considerations in hand
Evidence required: list of institutional policies relevant to learning analytics; plan to update/create policies to cover learning analytics
10. Decision on learning analytics products to pilot
Evidence required: a documented list of products with an agreed rationale for choices
11. Data processing agreement signed
Evidence required: signed data processing agreement
12. Select student groups for the pilot and engage staff/students
Evidence required: list of student groups/cohorts and numbers of students involved
32. Stage 4: Data Integration
13. Undertake a data and systems audit
14. Contact Jisc to start data integration
15. Install and evaluate the VLE data plugin(s) on a test system at your institution
16. Extract student data, transform to UDD and validate
17. Extract historical VLE (or other activity) data
18. Install VLE (or other activity) data plugin(s) on the live system and activate live data upload to the LRW
19. View uploaded LRW data using Data Explorer to check quality
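Step 16 is essentially a small ETL job. Here is a minimal sketch, assuming the student records system can export CSV; the source column names and the UDD-style field names in the mapping are illustrative, not the real UDD 1.2.7 schema.

```python
import csv

# Illustrative mapping from a hypothetical SRS export to UDD-style fields.
FIELD_MAP = {"id": "STUDENT_ID", "course": "COURSE_ID", "enrolled": "START_DATE"}

def extract_transform(path: str) -> list[dict]:
    """Read an SRS CSV export and rename columns to UDD-style field names."""
    with open(path, newline="") as f:
        return [
            {udd: row[src].strip() for src, udd in FIELD_MAP.items()}
            for row in csv.DictReader(f)
        ]

# The transformed rows would then go through validation (the UDD Validator
# Tool step) before being loaded into the learning records warehouse.
```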
33. Stage 4: Data collection (diagram)
Data about the student reaches the warehouse via ETL; activity data arrives as Tin Can (xAPI) statements.
34. Stage 5: Implementation Planning
20. Move to implementation stage
Evidence required: an implementation plan with agreed timescales
35. On-boarding process (diagram)
The on-boarding process, via data visualisation and dashboards, leads to "ready to implement".
36. On-boarding – get started
Stage 1: Orientation – review/done
Stage 2: Discovery – mostly self-support
Stage 3: Culture and Organisation Setup – Jan 2017
Stage 4: Data Integration – slots from early 2017
Stage 5: Implementation Planning – slots from early 2017
38. Co-design challenges 2017
Explore our co-design challenges: help steer our innovation work by exploring the next big ideas for technology in education and research.
39. Co-design challenge areas (diagram)
- Data driven learning gains – Can we make better use of data to improve learning, teaching and student outcomes?
- The intelligent campus – Should we gather more data on students, staff and buildings that would allow us to deliver better experiences?
- Next generation learning environment – We think it is time for a new type of learning environment, but what would this look like?
- The digital apprentice – What would a truly digital apprenticeship look like?
- Digital skills for research – How do we equip researchers and related staff with the skills they need for the future of research?
- Next generation research environment
40. Co-design timeline (diagram)
Identify ideas, then:
1. Discuss emerging challenges (31st Oct – 24th Nov): release 6 challenge areas and invite Jisc members and other experts to discuss. Audience: managers, consumers, some leaders, other experts.
2. Prioritise ideas (4th Jan – 30th Jan): present ideas for activities Jisc could do and ask members which they support. Audience: managers, consumers, some leaders.
3. Announce successful ideas (6th Feb). Audience: everyone who followed the challenge.
4. Report progress (Apr/May). Audience: everyone who followed the challenge.