A contribution to INTERNATIONAL SEMINAR ON OPEN AND DISTANCE EDUCATION (SIEAD-BR 2018) 22nd October 2018.
"Contributions from Open and Distance Education to Higher Education Quality: present and future"
"Contribuições da Educação Aberta e à Distância para uma Educação Superior de Qualidade: presente e futuro"
In this presentation I will suggest using a quality framework to help you think about and improve the quality of e-learning. I start with some general observations about quality and the need for quality frameworks. I then discuss two specific frameworks: the well-established E-xcellence benchmarks for e-learning, and the OpenupEd framework, which has been tailored specifically to MOOCs. Finally I return to some more practical advice, particularly about thinking about the learning design of a course at an early stage.
Quality assurance of MOOCs: The OpenupEd quality label
Jon Rosewell
The OpenupEd quality label is a quality enhancement approach to e-learning, tailored specifically to MOOCs. I will briefly introduce the OpenupEd quality label, show how it relates to other e-learning quality frameworks, and outline the ways in which it can be used, ranging from informal self-assessment to a full external review. Which of the benchmarks could contribute to enhanced design of MOOCs? Are the benchmarks sufficiently detailed? Do they capture all important aspects?
A presentation on 'MOOCs and Quality Issues' given at a workshop organised by the QA-QE special interest group of the UK Higher Education Academy (HEA) [http://qaqe-sig.net/?page_id=8]
Presentation by Esther Huertas Hidalgo, AQU Catalunya, Spain, The European Association for Quality Assurance in Higher Education (ENQA) at the 2018 European Distance Learning Week's fourth day webinar on "Considerations for Quality Assurance of e-Learning Provision" - 8 November 2018
Recording of the discussion is available: https://eden-online.adobeconnect.com/p4nuxa1r3qiv/
Next steps for excellence in the quality of e-learning (EADTU Paris masterclass)
Jon Rosewell
Overview of Excellence NEXT project for quality assurance in e-learning, presented as part of masterclass at EADTU conference, Paris, 2013. [http://conference.eadtu.eu/]
Give them what they want: Participatory approaches to developing anonymous as...
Simon Davis
Presented at ALT-C 2015; https://altc.alt.ac.uk/2015/sessions/give-them-what-they-want-developing-a-flexible-anonymous-assignment-workflow-to-meet-diverse-needs-895/
Governing Quality Of Online Content Through Threshold Standards: Facilitating...
Charles Darwin University
A presentation outlining different approaches to ensuring quality of technology enhanced learning and teaching in higher education. Please cite: Sankey. M. (2017). Governing Quality Of Online Content Through Threshold Standards: Facilitating A Consistent Learning Experience. Online e-Learning Summit 2017. Sydney, 20-21 June.
About the VISCED Project:
The VISCED project carried out an inventory of innovative ICT-enhanced learning initiatives and major ‘e-mature’ secondary and post-secondary education providers for the 14-21 age group in Europe. This entailed a systematic review at international and national levels including a study into operational examples of fully virtual schools and colleges. The outputs of this work have been analysed and compared to identify relevant parameters and success factors for classifying and comparing these initiatives.
See http://www.virtualschoolsandcolleges.info/
About this presentation:
EFQUEL Innovation Forum
14-16 September 2011
Oeiras, Portugal
The EFQUEL Innovation Forum 2011 was called “Certify the future…?! Accreditation, Certification and Internationalisation”. This annual international forum by EFQUEL provides an opportunity to discuss future and innovative practices, research and policy developments in the various sectors of education.
http://eif.efquel.org/files/2012/03/Booklet_EIF2011_20110902_webversion.pdf
http://www.virtualschoolsandcolleges.info/news/first-meeting-visced-international-advisory-committee-taking-place-portugal
Paul Bacsich from SERO led a workshop at this event entitled “Critical success factors and quality aspects for virtual schools”. The presentation given by Paul to launch this workshop is entitled “Benchmarking and critical success factors for Virtual Schools”. This event was also linked to the first meeting of the International Advisory Committee of VISCED, and so participation in this forum provided several opportunities for the VISCED team to extend their network.
On urgent needs for a revised quality agenda: improving the quality of teaching in educational institutions through the introduction of new educational programmes, modern pedagogy and smart technologies. Technical Assistance mission, MHSSE, NEO; HERE and YTIT, Uzbekistan, 18-19 November 2019.
The general aim of this work has been to define some guidelines and recommendations for the implementation of OCW by institutions in a context of student mobility. The approach taken is to determine a set of controls as part of a quality model for the implementation of OCW in virtual mobility. This quality model therefore takes into account acknowledged quality aspects in e-learning, the production and reuse of OERs and, finally, the implementation of mobility programmes.
The present work is an output of the project 'Open Course Ware in the European HE context', a European project funded by the Lifelong Learning Programme of the European Union. The focus of the project is the creation of preconditions for a strong European OCW framework and, as a consequence, a reduction of the obstacles to collaboration between European institutions, and therefore an increase in real student mobility.
Quality in online, open and flexible education - a global perspective
icdeslides
A presentation from International Council for Open and Distance Education - ICDE at the VI Cread Andes Convention and VI Virtual Educa Ecuador Conference in Ecuador, 29 May - 1 June 2018
Presentation of Sandra Kucina Softic, EDEN Vice-President, SRCE at the Digital Skills Gap PLA (Peer Learning Activity) hosted by SRCE in Zagreb, Croatia
Slides used during webinar on strategies of higher education institutions on open education.
Held on 11 March 2015 during Masterclass "Towards open educational processes and practices"
http://portal.ou.nl/en/web/masterclass-ow-050216/introduction/-/wiki/Main/Programme
Darco Jansen gave a presentation on 20 May 2016 about HE institutions' strategies on Open Education at the International Conference on Smart Learning Ecosystems and Regional Development. Based on several surveys he demonstrated that Europe is strongly involved in MOOCs and Open Education compared to the US. Darco elaborated on the role of regional support centres for Open Education in stimulating smart learning ecosystems and smart cities. The development of these support centres is presently stimulated by the SCORE2020 project.
As befits its title, Technologies in practice (TM129) takes a practical focus to learning, with up to 50% of study time having a practical aspect. The tutorial programme should support this, and in the past some tutors have found innovative ways of bringing practical demonstrations or exercises into their face-to-face sessions, for example demonstrating a robot vacuum cleaner or setting up an ad-hoc network of students’ laptops.
Producing online tutorials with an equivalent practical focus is a challenge. For TM129 we have developed a set of labcasts which deliver practical-focused synchronous tutorial events to all students, with one demonstration for each of the three blocks of the course: Robotics, networking and Linux. These labcasts are practical demonstrations which explore equipment and techniques which extend the coverage of the module. They move beyond video by the use of ‘widgets’ and a chat window which provide opportunities for students to engage actively with the demonstration. We will briefly outline these activities and present some student evaluation results.
We discuss how we plan to extend these activities into remote practical activities using OpenSTEM lab facilities. These will allow students to undertake further practical work where the student directly controls the practical activity.
We will present a framework of possible use-cases for remote practical activities, considering group size, synchronicity and locus of control; discuss some of the technological and pedagogical implications; and review progress towards delivering engaging practical activities at a distance.
A talk delivered at The Open University STEM Teaching Conference 6 Feb 2020
OpenStudio and Digital Photography: creating and sharing better images
Jon Rosewell
OpenStudio was created for the Open University course 'T189 Digital Photography: creating and sharing better images', and continues to be used in the current version TG089 run in partnership with the Royal Photographic Society. I will discuss the pedagogy of the course, the role of OpenStudio within it, and how OpenStudio is perceived by students.
The Open University, eSTEeM Conference, 25 April 2017
Summary
Find out how the OpenSTEM lab can be used to support remote access to tutor-led practical work in robotics and other technologies.
Abstract
As befits its title, Technologies in practice (TM129) takes a practical focus to learning, with up to 50% of study time having a practical aspect. The tutorial programme should support this, and in the past some tutors have found innovative ways of bringing practical demonstrations or exercises into their face-to-face sessions, for example demonstrating a robot vacuum cleaner or setting up an ad-hoc network of students’ laptops.
Producing online tutorials with an equivalent practical focus is a challenge. The creation of the OpenSTEM lab provides an opportunity to meet this challenge. Part of the HEFCE and OU funding for the OpenSTEM lab has provided five large ‘Baxter’ robots which will be accessible remotely as well as two which will be used at residential school. The lab also provides racked equipment bays for smaller remote access experiments, such as those being developed for the electronics curriculum. For a large population module such as TM129, this infrastructure provides an opportunity to roll-out practical-focused synchronous tutorial events to all students, provided the activities are well designed and scripted so that they can be delivered by a number of tutors.
In this presentation I will review the possible use-cases for remote practical activities, discuss some of the technological and pedagogical challenges, and review progress towards delivering engaging practical activities at a distance.
Assessing with confidence
Jon Rosewell, The Open University
Confidence-based marking (CBM) is an assessment method which asks the student not only to provide the answer to a question, but also to report their level of confidence (or certainty) in the correctness of their answer. They need to consider this carefully because it affects the marks they are awarded: a student scores full marks for knowing that they know the correct answer, and some credit for a tentative correct answer, but is penalised if they believe they know the answer but get it wrong. There are several motivations for using CBM: it rewards care and effort, so engendering greater engagement; it encourages reflective learning; and it promises accuracy and reliability.
CBM has had niche success in the past in the context of medical training, and may recently have found a new niche in the context of regulatory compliance; these are both areas where assessment of competency and mastery is expected. However, CBM has not been widely adopted in other areas of education.
In this talk I will review the CBM landscape and ask why CBM is not used more widely. What are the benefits claimed and how robust is the evidence? How should CBM be presented to the students? Do they need training to understand how the system works? Is it a fair method of assessment? Does it disadvantage any category of student? How does it fit with ideas around ‘assessment for learning’ and ‘reflective learning’?
Confidence-based marking could offer both the student and teacher greater insight into a student’s understanding than the standard fare of e-assessment, the multiple-choice quiz. It is a technique that we should therefore keep under consideration.
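The reward-and-penalty structure described above can be sketched as a small scoring function. This is an illustrative sketch only: the mark values follow one commonly cited three-level CBM scheme and are placeholders, not necessarily the values any particular system uses.

```python
# Illustrative confidence-based marking (CBM) scorer. The mark values
# (correct: 1/2/3, wrong: 0/-2/-6) are placeholders drawn from one
# commonly cited three-level scheme.

MARKS = {
    1: (1, 0),    # low confidence: small reward, no penalty when wrong
    2: (2, -2),   # medium confidence
    3: (3, -6),   # high confidence: full reward, heavy penalty when wrong
}

def cbm_score(correct: bool, confidence: int) -> int:
    """Return the mark for one answer at the given confidence level (1-3)."""
    reward, penalty = MARKS[confidence]
    return reward if correct else penalty
```

Under a scheme like this, declaring high confidence only pays off if the student is usually right; a student who is only half sure does better declaring a lower confidence level, which is what makes the reported confidence informative.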
Robot explorers: Gender and group attitudes to STEM: a pilot evaluation of an...
Jon Rosewell
Gender and group attitudes to STEM: a pilot evaluation of an outreach robotics activity.
Alice Peasgood, Jon Rosewell, Tony Hirst
Abstract
Women are underrepresented in Science, Technology, Engineering and Mathematics (STEM) subjects in higher education (HE), although attitudes and participation in STEM are less polarised at younger ages. Outreach activities that aim to inspire and enthuse school-age students may help girls to consider study and careers in STEM subjects.
The Royal Institution run extra-curricular ‘masterclasses’ that aim to inspire school students in mathematics. Our session in a series of secondary maths masterclasses uses a hands-on robotics activity based on the theme of ‘robot explorers’. Students work in small groups to solve the challenge of programming a small mobile robot to navigate by applying their maths and programming skills. This pilot study looked at the possible influence of gender and friendship groups on attitudes to STEM in the context of that activity.
Those attending the masterclass series were Year 9 students nominated by East London schools. Students completed a short evaluation sheet for the session and reported whether they knew others in their group. An observer noted whether boys or girls used the computer, held the robot, and similar measures. All data was collected anonymously and the study was approved by the OU Human Research Ethics Committee (HREC/2016/2238/Rosewell/1).
Preliminary results suggest that girls enjoyed the class more than boys. Girls also showed a greater increase in level of interest in robotics, although from a lower level than boys. There is a suggestion that individuals who found themselves in a group in which they had no friends reported a lower score for enjoyment.
The importance of friendship to the enjoyment and learning experienced in small group activity should be considered in the design of extra-curricular activities if they are to meet their stated aim of enthusing young students.
Opening up multiple choice - assessing with confidence
Jon Rosewell
This presentation presents a new online question style, Open CBM (Certainty/Confidence Based Marking).
This achieves an open style of question (similar to a free-text or numeric question) where the student doesn't pick from possible answers, but retains the robust and easy implementation of a multiple choice (MCQ) question.
It achieves this by appropriating the technique of certainty/confidence-based marking (CBM). In CBM, a student selects both an answer and a level of confidence in it: they score full marks for knowing that they know the correct answer, and some credit for a tentative correct answer, but are penalised if they believe they know the answer but get it wrong.
An Open CBM question is presented in two stages. Initially, the question is presented with no answer options visible; instead the student must set their confidence level that they know the answer. Only then are the possible answers revealed, and the student answers as a normal MCQ. The marking scheme follows standard CBM practice. Mechanically the question remains a simple MCQ: answer matching is trivial and robust, questions are easy to implement, and existing question banks can be reused. However, to the student, the question is effectively transformed from a closed MCQ to an open question. They need to formulate an answer before they can decide their confidence in it, so they must decide their answer in the absence of any positive or negative clues, reducing the chance of misconceptions or of working backwards from the options.
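The two-stage flow can be sketched as a single function over the student's interactions. All names here are invented for illustration, and the marking rule is injected rather than fixed, since the text only says it follows standard CBM practice.

```python
# Hypothetical sketch of the two-stage Open CBM question flow.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class OpenCbmResult:
    confidence: int   # committed before any options were visible
    chosen: int       # option index picked once options were revealed
    mark: int

def run_open_cbm(stem: str,
                 options: Sequence[str],
                 correct: int,
                 get_confidence: Callable[[str], int],
                 get_choice: Callable[[str, Sequence[str]], int],
                 cbm_mark: Callable[[bool, int], int]) -> OpenCbmResult:
    # Stage 1: only the question stem is shown; the student must commit
    # to a confidence level (1-3) with no answer options in sight.
    confidence = get_confidence(stem)
    # Stage 2: the options are revealed and the question is answered as
    # an ordinary MCQ; the mark then follows the supplied CBM rule.
    chosen = get_choice(stem, options)
    return OpenCbmResult(confidence, chosen,
                         cbm_mark(chosen == correct, confidence))
```

The key design point is the ordering: because `get_confidence` runs before the options exist in the interface, the confidence judgement necessarily applies to the student's own formulated answer, not to a guess among the displayed choices.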
A speculation on the possible use of badges for learning at the UK Open Unive...
Jon Rosewell
There has recently been a flurry of interest in the idea of using ‘badges’ to recognise learning, particularly due to the Mozilla Open Badges project (http://openbadges.org/) and the funding channelled through the 2012 Digital Media and Learning Competition (http://www.dmlcompetition.net/). Badges offer the potential of rewarding informal learning and reaching non-traditional learners.
This paper speculates on ways in which badges for learning could fit into the offering of the UK Open University, and exposes some of the tensions that badges raise.
[Paper presented at European Association of Distance Teaching Universities (EADTU) conference, Cyprus, 27-28 Sept 2012]
Badges for Nature (HASTAC/DML proposal)
Jon Rosewell
‘Badges for Natural History’ will recognize and reward the knowledge and skills of the new generation of naturalists that are making a great contribution to our understanding of the world’s biodiversity. These badges will be issued first by a group of eight projects from across the globe. Badge earners will be able to move their badges between sites as they share their knowledge and experience of natural history across the world.
Next Steps for Excellence in the Quality of e-Learning
Jon Rosewell
The development of e-learning has progressed to a stage where it is becoming part of mainstream provision in higher education. Therefore the issue of assessing and sustaining the quality of e-learning must now come to the fore. Quality assessment in higher education is well-established in relation to learning and teaching generally, but what methods can be used to establish quality in the domain of e-learning?
The E-xcellence methodology for assessing quality in e-learning (EADTU 2009) is securing recognition by European and international learning organisations. It was designed to be applied to the design and delivery of e-learning in both distance learning and blended learning contexts. It supports a range of uses, from accreditation by external agencies to process improvement through internal review.
The methodology presents principles of good practice in six domains of e-learning: strategic management; curriculum design; course design; course delivery; student support; and staff support. A total of 33 benchmark statements cover these domains, and are supported by a handbook for practitioners and guidance for assessors. The handbook includes principles for quality e-learning and exemplars of good practice. Amongst the tools is an online ‘QuickScan’ self-evaluation questionnaire based on the E-xcellence benchmarks which is highly valued as a focus for collaborative review of e-learning programmes.
The e-learning landscape has changed since the E-xcellence methodology was first developed. In particular, the use of Open Educational Resources (OECD 2007) and the application of social networking tools (Mason & Rennie 2008) were not explicitly considered in the original benchmarks. Accordingly, the E-xcellence NEXT project was instigated to produce and evaluate a revision of the benchmark criteria, associated handbook and exemplars. This paper describes the project process and initial recommendations.
A consultation exercise was carried out among E-xcellence participants. Feedback from this was brought to participatory workshops at a European Seminar on QA in e-learning in June 2011. Following this exercise, the benchmark statements were revised and are now available in beta version.
The project resources (Quickscan and manual) are being used for a series of self-evaluation and assessment seminars held at European higher education institutions. Feedback from these assessment seminars will be used to finalise materials for publication late in 2012. At that point the E-xcellence Next project will offer to the higher education community a set of self-evaluation and quality assessment tools which are fully updated to encompass social networking, Open Educational Resources and other recent developments in e-learning.
Can computer-marked final assessment improve retention?
Jon Rosewell
Distance learning modules (particularly low-cost introductory and enrichment modules) may show poor retention compared to traditional campus courses. The perceived difficulty of exams and end-of-module assessments (EMA) appears to deter some students from submitting. In contrast, interactive computer-marked assignments (iCMA) are typically attempted by most students.
Can retention therefore be improved by changing the format of part of the final assessment to an iCMA?
Robotics and the meaning of life is a 10-point, 10-week general-interest Open University module. The assessment comprised a mid-module iCMA and a final written EMA. The iCMA (a Moodle quiz) provided detailed feedback only after the submission deadline. The EMA included short-answer questions, a programming question and an essay. The EMA was script-marked and feedback limited to overall score and performance profile provided well after the end of the course.
The intervention simply replaced the script-marked short-answer questions by a second iCMA covering the same content with similar questions. The programming and essay questions were retained unchanged as a written, script-marked EMA.
The hypothesis to be tested was that retention would increase: students would be more likely to submit the final iCMA, their confidence would increase, and they would be motivated to submit the written EMA.
Quantitative data were gathered for patterns of submission, course completion and pass rates for two presentations (124 and 220 students); data were also available for thirteen previous presentations (1814 students). Structured interviews were carried out to probe student preferences, confidence and engagement.
More students submitted the iCMA (86%) than the EMA (81%). Although they had the same deadline, 91% of students submitted the iCMA before the EMA. They submitted the iCMA well in advance of the deadline (median 4 days 15 hrs) but kept the EMA open as long as possible (median 18 hrs before deadline; 11% submitted in the final hour). These patterns strongly suggest that students were more confident with the iCMA than the EMA. Completion rates were the highest recorded: 88% and 89% compared to 79% for pre-intervention presentations. Overall pass rates were also improved (83% and 85% cf. 76%). This can be ascribed to improved submission rates alone: the pass rate and mean scores among those who submitted were unchanged, giving confidence that the assessment difficulty was unaltered.
Student interviews suggested that students did attempt the final iCMA before the EMA and had greater confidence in obtaining a good mark for the iCMA than the EMA. Students valued the mix of assessment methods and felt it produced a robust result; although some expressed concern over the correctness of computer marking, they appreciated the detailed feedback it provided.
This intervention suggests that a change of assessment format can improve student engagement and pass rates without compromising rigour.
QA in e-Learning and Open Educational Resources (OER)
Jon Rosewell
Introductory slides for a workshop on updating the e-learning quality assurance benchmarks of the E-xcellence NEXT project http://www.eadtu.nl/e-xcellencelabel
Exploring Web 2.0 to support online learning communities: where technology me...
Jon Rosewell
A presentation to kick off a workshop at ICL2009 conference, given by Giselle Ferreira, Wendy Fisher, Jon Rosewell & Karen Kear, The Open University. http://www.open.ac.uk/blogs/terg/
Equitability and Dominance in Online Forums: An Ecological Approach
Jon Rosewell
Participation in online forums varies greatly: a few students post many messages, some post a few, and many only read. A rough ‘rule of thirds’ has been suggested (e.g. Mason 1989), but it is possible that this rule of thumb hides interesting structure.
However, similar patterns can be seen when analysing the abundance of species in ecological communities, so maybe indices of ecological diversity could also provide a useful characterisation of an online community. Such indices can unpick both ‘species richness’ (here number of participants) and equitability / dominance.
To explore this, 36 forums containing 27,000 messages were analysed to see if an ecological approach to online communities could offer useful insights.
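As a concrete illustration of the kind of index involved (the abstract does not name the specific measures, so this choice is an assumption), the Shannon diversity index and Pielou's evenness can be computed from per-participant message counts:

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over per-participant message counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(counts):
    """Evenness J = H / ln(S), where S ('species richness') is the number of
    active posters; J is 1.0 when all participants post equally."""
    s = sum(1 for c in counts if c > 0)
    if s <= 1:
        return 0.0
    return shannon_diversity(counts) / math.log(s)
```

Two forums with the same number of active posters can then be distinguished: a forum dominated by a single prolific poster (say counts [50, 2, 1, 1]) scores a much lower evenness than one where posting is shared (say [14, 14, 13, 13]), which is exactly the structure a raw participant count hides.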
Members of the OU Robotics Outreach Group have been running hands-on school and community workshops using the Lego Mindstorms robot invention system. Typically, these have been based around remote-control activities using prebuilt robots, programming workshops using prebuilt robots, or hybrid workshops involving simple robot construction and programming tasks.
In this presentation, we describe a new activity format - a robot construction activity using a preprogrammed robot controller capable of solving a situated task based on the popular RoboCupJunior robot rescue challenge.
Quality frameworks for e-learning (SIEAD 2018, Brazil)
1. Quality frameworks for e-learning
Jon Rosewell, The Open University (UK)
INTERNATIONAL SEMINAR ON OPEN AND DISTANCE EDUCATION (SIEAD-BR 2018)
Contributions from Open and Distance Education to Higher Education Quality: present and future
22nd October 2018
2. 174,000 students
Formal, paid-for, degrees
Supported distance learning
Open access
50 years (nearly!)
60 million visitors
Non-formal, free
OER, open courseware
Since 2006 (open2.net 1999)
8 million participants
MOOC platform
Many HE partners
Open courseware platform
CC-BY-SA-NC
Broadcast TV partner
OU/BBC audience:
265 million views
Informal learning
3. Rise of e-learning?
• Global demand for HE
• MOOCs
• Blended learning on campus
• Society needs life-long learning – upskilling, reskilling
• Short learning programmes?
UK undergraduate numbers
Universities UK, Patterns and trends in UK higher education 2018
5. What do we mean by ‘quality’ in HE?
• Compliance & consumer protection
– Accreditation
– Guarantee of uniform standards
• Reputation
– Recruit good students, produce good graduates
• Quality enhancement / Process improvement
– Institutional mission
– Stakeholder engagement
– Measures of added value (‘learning gain’)
6. Approaches to QA in e-learning
• Compliance or enhancement?
• Process or product?
• Input elements?
• Pedagogical models?
• Outcome measures?
• Self-assessment or external review?
• Scorecard? Benchmarking against others?
Holistic: emphasis on process & context as well as product
7. Quality frameworks
ICDE report on quality models 2015
• Review of quality frameworks
• Ideally:
• Multifaceted – many measures, holistic
• Dynamic – flexible when technology changes
• Mainstreamed – improvement through individuals
• Representative – balance stakeholders
• Multifunctional – e.g. external recognition, plan for improvement, create quality culture
Ebba Ossiannilsson, Keith Williams, Anthony F. Camilleri and Mark Brown (2015) Quality models in online and open education around the globe: State of the art and recommendations, ICDE Report. http://www.icde.org/quality
8. A generic framework for QA in HE
Ebba Ossiannilsson, Keith Williams, Anthony F. Camilleri and Mark Brown (2015) Quality models in online and open education around the globe: State of the art and recommendations, ICDE Report. http://www.icde.org/quality
9. European Standards & Guidelines (ESG)
and e-learning
1.1 Policy for QA
1.2 Design and approval of programme
1.3 Student-centred learning, teaching & assessment
1.4 Student admission, progression, recognition & certification
1.5 Teaching staff
1.6 Learning resources and student support
1.7 Information management
1.8 Public information
1.9 Ongoing monitoring and periodic review
1.10 Cyclical external quality assurance
10. ENQA: Considerations for QA of e-learning
• Recently published!
• Supplement to ‘European Standards and Guidelines’ 2015
• Additional guidance and indicators
Huertas et al (2018) ENQA Occasional Papers, No. 26
https://enqa.eu/index.php/publications/papers-reports/occasional-papers/
13. Organisation of resources
Strategic Management – a high-level view of how the institution plans its e-learning
Curriculum Design – how e-learning is used across a whole programme of study
Course Design – how e-learning is used in the design of individual courses
Course Delivery – the technical and practical aspects of e-learning delivery
Staff Support – the support and training provided to staff
Student Support – the support, information and guidance provided to students
14. Sample benchmark
Curriculum design
8. …
9. Curricula are designed to enable participation in academic communities via social media tools. These online communities provide opportunities for collaborative learning, contact with external professionals and involvement in research and professional activities.
10.…
15. Sample indicators
• There are institutional policies relating to the provision of online community spaces for student-student and student-teacher interactions.
• Curriculum designers specify clearly the educational role that student-student interaction plays in their programmes.
• Criteria for the assessment of student online collaboration exist and are applied consistently across programmes and courses.
At excellence level
• Teaching staff are supported by formal and informal staff development activity in the use of online tools for community building.
• The institution works closely with professional bodies in the development of online professional communities.
• Innovative assessment approaches, such as online collaborative work, peer assessment and self-assessment, form a part of the institution’s practice in this area.
16. Benchmarking as quality enhancement tool
• Statement of best practice
– Suggested indicators
• Collecting evidence
– Can be specific to each university
• Identification of weaknesses & strengths
• …leading to roadmap of actions for improvement
17. Different ways to use E-xcellence
• Informal self-assessment using QuickScan
– Identify ‘hot’ and ‘cold’ spots
• Full internal self-assessment
– Stakeholders collect evidence
– Prepare roadmap of improvement actions
• Integrate with institutional process
– Embed selected benchmarks in internal process
• EADTU E-xcellence Associates Label
– Self-assessment, roadmap, external review → recognition by EADTU
NB: Resources such as
manual and benchmarks
are freely available!
19. Participant feedback
• Framework
– Quickscan is valuable to structure discussion
– Completeness of the framework is appreciated
• Team working / stakeholders
– People exchange perspectives with other departments
• External perspective
– Exchange of experience between the evaluators and staff was valuable
– New ideas surfaced for course design
• Reflection
– A valued ‘moment of reflection’ on quality
– People become aware of choices and implementations
– Gives insight into strengths and weaknesses
• Analysis
– Opportunity to formulate e-learning policy
– Provides foundations for decision making
21. Detailed issues
• Workload management (of staff)
• E-learning strategy
• Academic communities / social media
• Interactivity / e-learning tools
Non-issues:
• Reliability / performance (of VLE)
• Student support generally
– Not a ‘problem’ but much activity
23. Why worry about MOOC quality?
Students – know what they are committing to
Employers – recognition of content and skills
Authors – personal reputation, 'glow' of success
Universities / providers – brand reputation
Funders – philanthropists, government, investors
Quality agencies – on behalf of all above
24. Are MOOCs different from e-learning?
• MOOC vs Higher Education e-learning
– Short, free, no entry requirements
• MOOC participants
– Motivations differ from degree students
– Completion may not be their goal
But a MOOC is a Course so maybe it should be judged like any
other HE course?
25. OpenupEd Quality Label
• Derived from E-xcellence
– Lightweight process
• Self-assessment
• Formal label
– External review
www.openuped.eu/quality-label
26. OpenupEd MOOC features
• Openness to learners
• Digital openness
• Learner-centred approach
• Independent learning
• Media-supported interaction
• Recognition options
• Quality focus
• Spectrum of diversity
27. OpenupEd MOOC benchmarks
• Derived from E-xcellence benchmarks
• For the institution:
– To be checked every 3-5 years
– 21 benchmark statements, in six groups:
Strategic management, Curriculum design, Course design, Course delivery, Staff support,
Student support
• For the course:
– To be checked for each MOOC
– 11 benchmark statements
28. Benchmarks – course level
22 A clear statement of learning outcomes for both knowledge and skills is provided.
23 There is reasoned coherence between learning outcomes, course content, teaching and learning strategy (including use of media), and assessment methods.
24 Course activities aid participants to construct their own learning and to communicate it to others.
25 The course content is relevant, accurate, and current.
26 Staff who write and deliver the course have the skills and experience to do so successfully.
27 Course components have an open licence and are correctly attributed. Reuse of material is supported by the appropriate choice of formats and standards.
28 The course conforms to guidelines for layout, presentation and accessibility.
31. OL: Openness to learners
DO: Digital openness
LC: Learner-centred approach
IL: Independent learning
MI: Media-supported interaction
RO: Recognition options
QF: Quality focus
SD: Spectrum of diversity
Quick scan
32. MOOC case study: OU + FutureLearn
A representative Open University MOOC … published on FutureLearn
• Evidence for OpenupEd features and benchmarks
• Quality emerges from joint efforts of OU (university) &
FutureLearn (platform provider)
• Holistic approach:
• Institutional and course level
• Process as well as product
• Structures and processes embed a concern for quality
throughout development, delivery and evaluation
Jansen, D., Rosewell, J., & Kear, K. (2017). ‘Quality Frameworks for MOOCs.’ In: M. Jemni, Kinshuk, & M. K. Khribi (Eds.), Open
Education: from OERs to MOOCs, 261–281. Springer http://oro.open.ac.uk/47595/
37. Learning design – in practice
Toetenel, Lisette and Rienties, Bart (2016). Learning Design – creative design to visualise learning activities. Open Learning, 31(3) pp. 233–244.
38. Learning design – in practice
Toetenel, Lisette and Rienties, Bart (2016). Learning Design – creative design to visualise learning activities. Open Learning, 31(3) pp. 233–244.
39. What students want – and what they need
“Student satisfaction is ‘unrelated’ to learning behaviour and academic performance, a study has found.
[…] while students dislike collaborative learning, they are more likely to pass if they take part in it” (Times Higher Education, 12 Feb 2018)
From an analysis of 100,000 students on 151 modules
More at Bart Rienties, OU Inaugural Lecture
40. How does student satisfaction relate to module performance?
[Scatter plot: student satisfaction (horizontal) vs. students who successfully completed the module (vertical)]
Slide from Bart Rienties Inaugural lecture
42. In summary…
• A quality framework should underpin e-learning provision
– to help create a quality culture
– that is more likely to produce quality e-learning
– and quality enhancement
• There is no simple recipe, but…
– Work in a module team
– Think about learning design
– Think about student support
Hello, I am Jon Rosewell, a lecturer at the Open University, UK.
I’m not going to give you a recipe for how to create quality e-learning.
Instead, I will suggest using a framework to help you improve quality – we can always do better!
The learning landscape is very diverse – MOOCs are the new kid on the block, but not the only way of delivering learning at scale
The OU itself – open access degree courses, well-supported but relatively costly.
Futurelearn is MOOC platform – OU and other universities are partners
OpenLearn has had OER and open courseware for years before MOOCs – nearly 20 if you include open2.net
OpenLearn Create is a sister site which allows anyone to create open (freely licensed) courseware
And other channels for informal learning eg partnership with BBC – broadcasts are supported by material on OpenLearn.
The amount of e-learning will increase for all these reasons
So it matters that we get quality right!
Figure on the right shows UK undergraduates – not e-learning, all students
Even in mature economy, demand for first degree is rising – top line.
Demand for HE will be rising much faster globally – and e-learning can scale to pick up the demand
But note the other lines – this includes the UK part-time sector which has collapsed as a result of introducing very high fees
Loans are available but not if you already hold a degree – see the fall in orange line which represents reskilling/upskilling at university.
So universities now have a problem offering life-long learning by conventional courses – MOOCs or short learning programmes may fill the gap
I’m going to start with some general discussion of quality
Quality is a difficult term to pin down!
At a minimum – is the course/qualification good enough to be recognised / accepted?
But universities also want to improve their reputation – a good brand attracts good students.
Teachers want to teach, to teach well, and to teach better, so quality enhancement
My university has a particular mission for students who would not otherwise go to university
-- disadvantaged backgrounds, low previous qualifications, disabilities
Improving for them is very important to the university, but may not be visible in rankings.
If we are concerned about quality, how do we check it?
Can we check the quality of the course by looking in detail at all its parts?
Can we predict the quality of a course by looking at the pedagogical model it uses, ie how it is taught?
Can we judge by what happens? How many students completed? Passed the course?
We need to take a holistic view, quality emerges when there is a good process
There are existing quality frameworks that can be used.
ICDE did a global review of a number of quality models
They suggest choosing one which is multifaceted (many measures, holistic), dynamic (flexible to change in tech), mainstreamed (high-level improvement through individual practices), representative (balance stakeholders), multifunctional (instil quality culture, roadmap of actions, external recognition)
The ICDE report found a good degree of consensus across commonly used frameworks
They include broad issues: Strategic planning and development, curriculum design as well as course design and delivery, and support available, both to students and staff.
So a very wide ranging view necessary to assure quality, not just scorecard of product
That is echoed in European standards and guidelines which apply to QA across Europe
There are 10 standards in the ESG to do with internal QA
Apply to all modes of delivery – face to face and distance/online
Bold shows where additional guidance and indicators for e-learning might be needed
-- they align roughly with the areas picked out in ICDE study
Output from ENQA working group has just been published
It supplements the ESG with some additional guidance for e-learning but standards themselves are not changed.
I’m now going to focus on one specific framework, E-xcellence
E-xcellence is a project about quality in e-learning in Higher Education that has been around for over 10 years.
Provides a well-tested framework for thinking about quality in e-learning.
There are resources on the website. There is a set of benchmarks which sets out what good e-learning looks like.
These are captured in a manual which has a lot of useful background.
There are six chapters which reflect broad areas of concern seen in ICDE report
35 benchmarks in total
Here is a sample benchmark
Benchmark = statement of best practice in most institutions
Note they are very general, which allows each institution to do things their own way.
The institution needs to provide evidence to show how they measure up to each benchmark.
More detailed indicators for benchmarks.
Examples of good and excellent practice
Suggest the kinds of evidence that would support achieving a benchmark
But each university may approach things differently, so other evidence is ok
Not a scorecard!
Benchmarking as quality enhancement tool
Statements of good practice for comparison
Identification of weaknesses & strengths by collecting evidence, leading to a roadmap for improvement
E-xcellence is very flexible.
Full process (full self-assessment, external review, roadmap for actions) leads to an E-xcellence label
But can be used informally and resources freely available, so you don’t have to commit to full process.
I want to mention some experiences from E-xcellence
Firstly, the process is valued by those who have done it.
This is a summary of feedback from participants in E-xcellence reviews
It is interesting to look back at reviews to see where universities have faced issues
Looking at broad topic areas (chapters)
Most issues in strategic management, curriculum design and staff support.
Fewer in Course design and delivery,
Fewest in Student support
– need to come back to that because good student support is essential for student success
Workload management (Staff support) highest
E-learning strategy (Strategic management)
Development of academic communities (Curriculum design) / Social media (Student support)
Course design & delivery generally low but provision of good interactive tools is a concern
Some things aren’t seen as problems
-- that includes Student support, which isn’t seen as problematic but involves lots of activity
– understood to be important, but under control
I want to move on specifically to MOOCs
MOOCs are not part of ‘normal’ university teaching and they are free,
so should we pay much attention to quality?
Does it matter if a MOOC isn’t good?
Yes – several stakeholders involved, all have an interest
MOOCs are different from ‘normal’ HE and maybe from ‘normal’ e-learning.
MOOCs are free, open (needing no prior qualifications), typically short – unlike degree course
Recent work says that MOOC participants’ motivations are very different from those of a ‘normal’ student.
They may not be interested in completing, just dip in to find something they need or that interests them, skip the rest
They may not see non-completion as a failure
But we design a MOOC as a ‘course’ (not a book, not a ‘resource’)
It has a beginning and end, assessment, so ‘completion’ must be the teacher’s intention
So maybe should judge similarly to other courses.
OpenupEd is a European portal for MOOCs. Not a platform, but a way to gather MOOCs which offer a good quality experience
OpenupEd quality label is derived from E-xcellence so it provides a framework for thinking about quality of MOOCs in an organised way.
The materials are freely available for use in self-assessment
OpenupEd expects MOOCs to support these distinctive features or values.
They are felt to be important for a good educational experience
The OpenupEd benchmarks are derived from E-xcellence.
So they are well-tested.
Many apply to the whole institution – they can be checked once and then just revisited every few years
So for each new MOOC, a much smaller number and less effort required.
Here are some benchmarks at the course level
These are the ones that need checking for each MOOC.
Mainly straightforward to judge.
To help an initial quick self-assessment, there is a table to fill in.
This is the course level – fits on to a single sheet of paper!
List of benchmarks
Scale – is the benchmark not achieved, partially achieved, largely achieved, fully achieved?
This is only for a quick self-assessment – will need to document evidence more fully
Not a scorecard! – this is to prompt roadmap for improvement
Mapping to OpenupEd features – evidence for a benchmark is often also evidence for an OpenupEd feature
No extra work needed to check
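The QuickScan logic is simple enough to mimic in a few lines. This is an illustrative sketch only: the four-point scale comes from the slides, but the data structure and the abbreviated benchmark wordings are my own, not part of the official OpenupEd label.

```python
# Minimal sketch of a QuickScan-style self-assessment. The four-point
# scale is from the OpenupEd QuickScan; the dict layout and abbreviated
# benchmark wordings here are illustrative only.

LEVELS = ["not achieved", "partially achieved", "largely achieved", "fully achieved"]

def quick_scan(ratings):
    """Return (benchmark, level, cold_spot) rows.

    A 'cold spot' (not / partially achieved) is a candidate for the
    improvement roadmap -- a prompt for discussion, not a score.
    """
    return [(b, level, LEVELS.index(level) <= 1) for b, level in ratings.items()]

ratings = {
    "22 Clear statement of learning outcomes": "largely achieved",
    "24 Activities help participants construct learning": "partially achieved",
    "27 Open licence, correct attribution": "fully achieved",
}
for benchmark, level, cold in quick_scan(ratings):
    print(f"{benchmark}: {level}" + ("  <-- roadmap" if cold else ""))
```

The point of keeping it this light is the same as the paper version: the output is a discussion prompt for the course team, not a league-table score.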
Something different about MOOCs is that there is often a split between a university and a platform provider.
For example, a MOOC may be written at the Open University (university) and published on FutureLearn (platform provider)
-- different people, different systems.
So can OpenupEd work in that situation?
Yes – quality emerges from joint efforts so evidence has to come from both partners.
Again we see that a concern for quality is deeply embedded.
I said I wouldn’t give a recipe but…
It helps to think about the design of a course as a whole.
This is a tool used as part of a learning design process at the OU – but there are other tools out there
It helps you build up an overall picture of what a student will experience
As a teacher you can construct a course from many different types of activity.
It helps to see them in broad classes
This view lets you plan activities over time
You can see at this stage the course is maybe a little out of balance.
-- there is a lot of time spent doing assimilative activity, but almost nothing in communication & collaboration
-- on the right the weekly workload is shown and that looks uneven
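The kind of overview such a tool gives can be approximated with a simple tally. The activity categories below echo the OU Learning Design taxonomy mentioned in the slides; the course plan and study hours are invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical course plan as (week, activity type, study hours).
# Categories echo the OU Learning Design taxonomy; hours are invented.
plan = [
    (1, "assimilative", 6.0), (1, "communication & collaboration", 0.5),
    (2, "assimilative", 7.5), (2, "finding & handling information", 1.0),
    (3, "assimilative", 4.0), (3, "assessment", 3.0),
]

by_type, by_week = defaultdict(float), defaultdict(float)
for week, activity, hours in plan:
    by_type[activity] += hours
    by_week[week] += hours

total = sum(hours for _, _, hours in plan)
for activity, hours in sorted(by_type.items(), key=lambda kv: -kv[1]):
    print(f"{activity:32s} {hours:5.1f} h  {100 * hours / total:3.0f}%")
# A dominant assimilative share, or very uneven weekly totals, is the
# signal to rebalance the design at this early stage.
```

Even this crude summary surfaces the same warning as the visual tool: most of the invented plan's time sits in assimilative activity, with almost none in communication and collaboration.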
This shows an example of how one course design changed.
Blue shows the shape of the course at a very early stage of planning.
Then there was a workshop where the team got together to look at the overall learning design.
The orange shows what it looks like as a result of that
You can see that the course team decided to reduce the time spent on assimilative activities
And increase the time spent on finding and handling information and on communication and collaboration (other changes also)
-- encourage the student to be more active in their learning
A good reason for encouraging collaborative learning – student success is higher if courses are designed with communicative activities.
Students love receiving lots of ‘stuff’ which they work through alone and they dislike collaborating with other students.
So courses with high proportion of assimilation are popular, but students engage less well over time and may not succeed.
So be careful of using surveys which ask students about satisfaction!
Many modules, which vary in student success (vertical) and student satisfaction (horizontal).
But satisfaction is not correlated to success
There are some courses (on the left) which get high satisfaction scores but low completion
And others (on the right) where students are very successful – but which they hate!
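The "satisfaction is unrelated to performance" finding is at heart a correlation claim, so it can be checked on any set of module-level figures. A sketch using a plain Pearson coefficient; the satisfaction and pass-rate numbers below are invented purely to show the calculation, not data from the lecture.

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented module-level figures (satisfaction %, pass rate %) -- not real
# OU data, just enough to show how the claim would be tested.
satisfaction = [92, 85, 78, 90, 70, 88]
pass_rate = [55, 80, 75, 60, 85, 58]
r = pearson(satisfaction, pass_rate)
print(f"r = {r:.2f}")
```

An r near zero (or negative, as in this made-up sample) is exactly why satisfaction surveys make a poor proxy for learning gain.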
So a summary overall
I believe it really helps to have a quality framework to work with
It helps to create a quality culture – and that will help to improve quality
There is no single recipe for good quality courses – lots of scope for innovative ideas!
Just keep these points in mind which are common practice in ODL universities
-- work in a team of people with different skills
-- think about learning design at an early stage
-- make sure there are good mechanisms for student support