The OpenupEd quality label is a quality enhancement approach to e-learning, tailored specifically to MOOCs. I will briefly introduce the OpenupEd quality label, show how it relates to other e-learning quality frameworks, and outline the ways in which it can be used, ranging from informal self-assessment to a full external review. Which of the benchmarks could contribute to enhanced design of MOOCs? Are the benchmarks sufficiently detailed? Do they capture all important aspects?
3. What do we mean by ‘quality’ in HE?
• Compliance & consumer protection
– Accreditation
– Guarantee of uniform standards
• Reputation
– Recruit good students, produce good graduates
• Quality enhancement / Process improvement
– Institutional mission
– Stakeholder engagement
– Measures of added value (‘learning gain’)
4. Approaches to QA in e-learning
• Compliance or enhancement?
• Process or product?
• Input elements?
• Pedagogical models?
• Outcome measures?
• Self-assessment or external review?
• Scorecard? Benchmarking against others?
Holistic: emphasis on process & context as well as product
5. A generic framework for QA in HE
Ebba Ossiannilsson, Keith Williams, Anthony F. Camilleri, and Mark Brown (2015)
Quality models in online and open education around the globe: State of the art and
recommendations, ICDE Report http://www.icde.org/quality
6. European Standards & Guidelines (ESG)
and e-learning
1.1 Policy for QA
1.2 Design and approval of programme
1.3 Student-centred learning, teaching & assessment
1.4 Student admission, progression, recognition & certification
1.5 Teaching staff
1.6 Learning resources and student support
1.7 Information management
1.8 Public information
1.9 Ongoing monitoring and periodic review
1.10 Cyclical external quality assurance
7. ENQA: Considerations for QA of e-learning
• Published 2018
• Supplement to ‘European Standards and Guidelines’ 2015
• Additional guidance and indicators
Huertas et al. (2018) ENQA Occasional Papers, No. 26
https://enqa.eu/index.php/publications/papers-reports/occasional-papers/
8. Poll – do you use a QA process/framework?
• No
• Yes, internally defined
• Yes, defined by QA / Govt agency
9. Poll – why would you do more QA?
I want to improve my teaching
My boss tells me to
It is my job!
The QA agency / Ministry make me
I want a promotion
Other reasons
Check as many as apply
12. Organisation of resources
Strategic Management – a high-level view of how the institution plans its e-learning
Curriculum Design – how e-learning is used across a whole programme of study
Course Design – how e-learning is used in the design of individual courses
Course Delivery – the technical and practical aspects of e-learning delivery
Staff Support – the support and training provided to staff
Student Support – the support, information and guidance provided to students
13. Sample benchmark
Course design
10. …
11. Learning outcomes determine the use of methods and
course contents. In a blended-learning context there is an
explicit rationale for the use of each element in the blend.
12. …
14. Sample indicators
Indicators
• Fitness for purpose drives decisions on the selection of teaching and
learning activities. The blending is such that different methods and
media are well chosen within and between courses, both in distribution
over time and extent of use.
At excellence level
• There is extensive institutional experience of delivery using blended
learning and this experience is widely shared through the organisation.
• Well informed decisions on the use of teaching and learning activities
are made routinely and reflect institutional policies regarding the
development of learner knowledge and skills.
15. Benchmarking as quality enhancement tool
• Statement of best practice
– Suggested indicators
• Collecting evidence
– Can be specific to each university
• Identification of weaknesses & strengths
• …leading to roadmap of actions for improvement
16. Poll – who should collect evidence?
• Course author
• Administrator
• Students
• External reviewer
• Team of stakeholders
17. Different ways to use E-xcellence
• Informal self-assessment using QuickScan
– Identify ‘hot’ and ‘cold’ spots
• Full internal self-assessment
– Stakeholders collect evidence
– Prepare roadmap of improvement actions
• Integrate with institutional process
– Embed selected benchmarks in internal process
• EADTU E-xcellence Associates Label
– Self-assessment, roadmap, external review
– Recognition by EADTU
NB: Resources such as the manual and benchmarks are freely available!
19. Why worry about MOOC quality?
Students – know what they are committing to
Employers – recognition of content and skills
Authors – personal reputation, 'glow' of success
Universities / providers – brand reputation
Funders – philanthropists, government, investors
Quality agencies – on behalf of all above
20. Are MOOCs different from e-learning?
• MOOC vs Higher Education e-learning
– Short, free, no entry requirements
– Not accredited
– Reputational risk
• MOOC participants
– Motivations differ from degree students
– Completion may not be their goal
But a MOOC is a Course so maybe it should be judged like any other
HE course?
21. OpenupEd Quality Label
• Derived from E-xcellence
– Lightweight process
• Self-assessment
• Formal label
– External review
www.openuped.eu/quality-label
22. OpenupEd MOOC features
• Openness to learners
• Digital openness
• Learner-centred approach
• Independent learning
• Media-supported interaction
• Recognition options
• Quality focus
• Spectrum of diversity
23. OpenupEd MOOC benchmarks
• Derived from E-xcellence benchmarks
• For the institution:
– To be checked every 3-5 years
– 21 benchmark statements, in six groups:
Strategic management, Curriculum design, Course design, Course delivery, Staff support,
Student support
• For the course:
– To be checked for each MOOC
– 11 benchmark statements
24. Benchmarks – course level
22. A clear statement of learning outcomes for both knowledge and skills is provided.
23. There is reasoned coherence between learning outcomes, course content,
teaching and learning strategy (including use of media), and assessment
methods.
24. Course activities aid participants to construct their own learning and to
communicate it to others.
25. The course content is relevant, accurate, and current.
26. Staff who write and deliver the course have the skills and experience to do so
successfully.
27. Course components have an open licence and are correctly attributed. Reuse of
material is supported by the appropriate choice of formats and standards.
28. The course conforms to guidelines for layout, presentation and accessibility.
25. Benchmarks – course level
29. The course contains sufficient interactivity (student-to-content or student-to-
student) to encourage active engagement. The course provides learners with
regular feedback through self-assessment activities, tests or peer feedback.
30. Learning outcomes are assessed using a balance of formative and summative
assessment appropriate to the level of certification.
31. Assessment is explicit, fair, valid and reliable. Measures appropriate to the
level of certification are in place to counter impersonation and plagiarism.
32. Course materials are reviewed, updated and improved using feedback from
stakeholders.
26. Additional notes – example
31. Assessment is explicit, fair, valid and reliable. Measures appropriate to the
level of certification are in place to counter impersonation and plagiarism.
See comments to OpenupEd benchmark 29 above.
The advent of digital badges (for example Mozilla Open Badges) provides a method of
rewarding achievement that may be appropriate for MOOCs. The award of digital badges
can be linked to automated or peer assessment. Digital badges have an infrastructure that
verifies the identity of the holder and provides a link back to the issuer and the criteria and
evidence for which it was awarded. Badges thus may provide a validated award that can be
kept distinct from the HEI’s normal qualifications.
See also:
E-xcellence benchmark #17
Chapter 3 Course design
§ 2.3.1 Transferable skills
§ 2.4 Assessment procedures
§ 3.4 Assessment
§ 4.2.5 Online assessment
30. OL: Openness to learners
DO: Digital openness
LC: Learner-centred approach
IL: Independent learning
MI: Media-supported interaction
RO: Recognition options
QF: Quality focus
SD: Spectrum of diversity
Quick scan
31. Whiteboard – your MOOC experience
Not achieved Fully achieved
22. Clear learning outcomes
23. Aligned LOs, content, assessment
24. Activities construct learning
25. Relevant, accurate, current
29. Interactivity, active learning, self-assessment
30. Formative & summative assessment
32. Whiteboard – MOOC features
• Openness to learners
• Digital openness
• Learner-centred approach
• Independent learning
• Media-supported interaction
• Recognition options
• Quality focus
• Spectrum of diversity
33. New checklists (OpenupEd, SCORE2020)
Checklist 1: Is it a MOOC or not?
– 14 items
Checklist 2: Quality of the design of MOOC
– 26 items
Checklist 3: Accessibility
– 6 items
Checklist 4: Technical platform and support for staff and
participants
– 12 items
34. Checklist 2: Quality of design
Dimension Criteria Level
Target group MOOCs are accessible to all people and as such various target
groups are identified
For each target group the needs, challenges and prior
knowledge are described
The description of each target group is supported by references to
different studies
Overall goal The overall objective of the course is described in a few
sentences
Learning ob. The course describes a limited number of learning objectives
There is a reasoned coherence between learning outcomes,
course content, teaching and learning strategy (including use of
media), and assessment methods
Levels: Not achieved, Partially achieved, Largely achieved, Fully achieved
35. Checklist 2: Quality of design
Dimension Criteria Level
Learning
activities
Activities aid participants to construct their own learning and to
communicate it to others
The ‘pathways’ (activities, tasks and routes) are designed in
such a way that they can be performed at different levels of
difficulty or complexity, to account for the broad spectrum of
participants’ knowledge and skills that are expected
Various activities are proposed with different formats, for
example: quizzes, peer evaluation, video conferences, activities
in forums or external social networks.
The course contains sufficient interactivity (learner-to-content,
learner-to-learner, or learner-to-teacher) to encourage active
engagement.
Levels: Not achieved, Partially achieved, Largely achieved, Fully achieved
36. Checklist 2: Quality of design
Dimension Criteria Level
Feedback
mechanism
Feedback by an academic tutor is limited and scalable
(characteristic of a MOOC).
The course provides learners with regular feedback through self-
assessment activities, tests or peer feedback.
The frequency of monitoring has been planned (forum, group,
post)
A weekly announcement or mass mailing with orientations for the
following week is planned.
In each weekly session, the pedagogical team makes a
synthesis of artefacts from the previous week’s session.
Some live events (Hangout, Tweetchat) are scheduled
Levels: Not achieved, Partially achieved, Largely achieved, Fully achieved
44. What is learning design?
• A way of documenting the design of a course
• A way of thinking about & discussing design
– before it is too late!
• A way to think about appropriate use of technologies
• A framework for evaluating courses
• A framework for evaluating designs of courses
45. The 7Cs of Learning Design (Gráinne Conole)
Many activities to help teams
consider each of these stages
– Course tweet (elevator pitch)
– Personas
– Resource audit
– Interactive > Constructive >
Active > Passive
– Constructive alignment (of
learning outcomes, activities,
assessment)
http://www2.le.ac.uk/projects/oer/oers/beyond-distance-research-alliance/7Cs-toolkit
46. UCL Learning Designer (Diana Laurillard)
• Acquisition
• Inquiry
• Practice
• Production
• Discussion
• Collaboration
https://www.ucl.ac.uk/learning-designer/
Laurillard, D. (2002). Rethinking university teaching: A conversational
framework for the effective use of learning technologies. Routledge.
53. OU Learning design – in practice
Toetenel, Lisette and Rienties, Bart (2016). Learning Design – creative design to visualise learning activities. Open Learning, 31(3) pp. 233–244.
54. OU Learning design – in practice
Toetenel, Lisette and Rienties, Bart (2016). Learning Design – creative design to visualise learning activities. Open Learning, 31(3) pp. 233–244.
55. In summary…
• A quality framework should underpin e-learning provision
– to help create a quality culture
– that is more likely to produce quality e-learning
– and quality enhancement
• And that also applies to MOOCs
• There is no simple recipe, but…
– Work in a course team
– Think about learning design at an early stage
– Don’t let QA procedures get in the way of the day job!
58. What students want – and what they need
“Student satisfaction is ‘unrelated’ to learning behaviour and
academic performance, a study has found.
[…] while students dislike collaborative learning, they are
more likely to pass if they take part in it”
(Times Higher Education, Feb 12th 2018)
From an analysis of 100,000 students
on 151 modules
More at Bart Rienties, OU Inaugural Lecture
59. How does student satisfaction relate to module performance?
[Scatter plot: students who successfully completed module (horizontal) vs. satisfaction (vertical)]
Slide from Bart Rienties’ Inaugural lecture
60. MOOC case study: OU + FutureLearn
A representative Open University MOOC … published on FutureLearn
• Evidence for OpenupEd features and benchmarks
• Quality emerges from joint efforts of OU (university) &
FutureLearn (platform provider)
• Holistic approach:
• Institutional and course level
• Process as well as product
• Structures and processes embed a concern for quality
throughout development, delivery and evaluation
Jansen, D., Rosewell, J., & Kear, K. (2017). ‘Quality Frameworks for MOOCs.’ In: M. Jemni, Kinshuk, & M. K. Khribi (Eds.), Open
Education: from OERs to MOOCs, 261–281. Springer http://oro.open.ac.uk/47595/
62. In summary…
• A quality framework should underpin e-learning provision
– to help create a quality culture
– that is more likely to produce quality e-learning
– and quality enhancement
• There is no simple recipe, but…
– Work in a module team
– Think about learning design
– Think about student support
63. Checklist 1: Is it a MOOC or not?
Dimension Criteria Level
Massive Pedagogical model means effort doesn’t increase
significantly as the number of participants increases
Open Course accessible to (almost) all people without limitation
Full course experience available without cost
Online All aspects are delivered online
Course At least 1 ECTS (25-30 hours of study)
Participants receive some feedback (e.g. automated
quizzes, peers, general feedback from staff)
At least some recognition like badge or certificate of
completion.
Levels: Not achieved, Partially achieved, Largely achieved, Fully achieved
Editor's Notes
Hello, I am Jon Rosewell, I am a lecturer at the Open University, UK.
I’m not going to give you a recipe for how to create quality e-learning.
Instead, I will suggest using a framework to help you think about and improve quality
I’m going to start with some general discussion of quality
Quality a difficult term to pin down!
At a minimum – is the course/qualification good enough to be recognised / accepted?
But different universities want to improve their reputation – good brand attracts good students.
Teachers want to teach, to teach well, and to teach better, so quality enhancement
My university has a particular mission for students who would not otherwise go to university
-- disadvantaged backgrounds, low previous qualifications, disabilities
Improving for them is very important to the university, but may not be visible in rankings.
If we are concerned about quality, how do we check it?
Can we check the quality of the course by looking in detail at all its parts?
Can we predict the quality of a course by looking at the pedagogical model it uses, ie how it is taught?
Can we judge by what happens? How many students completed? Passed the course?
We need to take a holistic view, quality emerges when there is a good process
The 2015 ICDE report reviews a number of existing frameworks
found a good degree of consensus across commonly used frameworks
Took a holistic view
They include broad issues: Strategic planning and development, curriculum design as well as course design and delivery, and support available, both to students and staff.
So a very wide ranging view necessary to assure quality, not just scorecard of product
That is echoed in European standards and guidelines which apply to QA across Europe
These are 10 standards in ESG to do with internal QA
Apply to all modes of delivery – face to face and distance/online
Bold shows where additional guidance and indicators for e-learning might be needed
-- they align roughly with the areas picked out in ICDE study
Output from ENQA working group has just been published
It supplements the ESG with some additional guidance for e-learning but standards themselves are not changed.
I’m now going to focus on one specific framework, E-xcellence
E-xcellence is a project about quality in e-learning in Higher Education that has been around for over 10 years.
Provides a well-tested framework for thinking about quality in e-learning.
There are resources on the website. There is set of benchmarks which set out what good e-learning looks like.
These are captured in a manual which has a lot of useful background.
There are six chapters which reflect broad areas of concern seen in ICDE report
35 benchmarks in total
Here is a sample benchmark
Benchmark = statement of best practice in most institutions
Note they are very general, which allows each institution to do things their own way.
The institution needs to provide evidence to show how they measure up to each benchmark.
More detailed indicators for benchmarks.
Examples of good and excellent practice
Suggest the kinds of evidence that would support achieving a benchmark
But each university may approach things differently, so other evidence is ok
Not a scorecard!
Benchmarking as quality enhancement tool
Statements of good practice for comparison
Identification of weaknesses & strengths by collecting evidence → roadmap for improvement
E-xcellence is very flexible.
Full process (full self-assessment, external review, roadmap for actions) leads to an E-xcellence label
But can be used informally and resources freely available, so you don’t have to commit to full process.
I want to move on specifically to MOOCs
MOOCs are not part of ‘normal’ university teaching and they are free,
so should we pay much attention to quality?
Does it matter if a MOOC isn’t good?
Yes – several stakeholders involved, all have an interest
MOOCs are different from ‘normal’ HE and maybe from ‘normal’ e-learning.
MOOCs are free, open (needing no prior qualifications), typically short – unlike degree course
Recent work says that MOOC participants’ motivations are very different from ‘normal’ student.
They may not be interested in completing, just dip in to find something they need or that interests them, and skip the rest
They may not regard non-completion as a failure
But we design a MOOC as a ‘course’ (not a book, not a ‘resource’)
It has a beginning and end, assessment, so ‘completion’ must be the teacher’s intention
So maybe should judge similarly to other courses.
OpenupEd is a European portal for MOOCs. Not a platform, but a way to gather MOOCs which offer a good quality experience
OpenupEd quality label is derived from E-xcellence so it provides a framework for thinking about quality of MOOCs in an organised way.
The materials are freely available for use in self-assessment
OpenupEd expects MOOCs to support these distinctive features or values.
They are felt to be important for a good educational experience
The OpenupEd benchmarks are derived from E-xcellence.
So they are well-tested.
Many apply to the whole institution – they can be checked once and then just revisited every few years
So for each new MOOC, a much smaller number and less effort required.
Here are some benchmarks at the course level
These are the ones that need checking for each MOOC.
Mainly straightforward to judge.
Here are some benchmarks at the course level
These are the ones that need checking for each MOOC.
Mainly straightforward to judge.
Additional notes for each benchmark show:
-- link back to closest e-xcellence benchmark
-- refs to relevant e-xcellence manual sections
-- considerations specific to MOOCs
Template document provided in which to collect evidence for benchmark
Will vary according to local context
Can judge how well the benchmark is achieved overall on scale: not achieved to fully achieved
Evidence will also support one or more OpenupEd features – tick them off
To help an initial quick self-assessment, there is a table to fill in.
This is the course level – fits on to a single sheet of paper!
List of benchmarks
Scale – is the benchmark not achieved, partially achieved, largely achieved, fully achieved?
This is only for a quick self-assessment – will need to document evidence more fully
Not a scorecard! – this is to prompt roadmap for improvement
Mapping to OpenupEd features – evidence for a benchmark often is also evidence for an OpenupEd feature
No extra work needed to check
Here are some benchmarks at the course level
These are the ones that need checking for each MOOC.
Mainly straightforward to judge.
OpenupEd expects MOOCs to support these distinctive features or values.
They are felt to be important for a good educational experience
More detailed checklists now provided, expanded using items from SCORE2020 project
Divided up somewhat
Some wording taken directly from OpenupEd / E-xcellence benchmarks, others from SCORE2020
These are examples – there are more
Some are more detailed, prescriptive. Are they always essential? Appropriate?
QRF takes a more complex approach
Draws attention to different phases involved in MOOC creation
And that there are different roles and perspectives
So multidimensional context
Here is just one of the 5 phases, showing it includes a number of different processes
The participation of the three roles can vary
– actually designer responsible for most of these since I picked design phase
But can be much more variable
And taking just one of these processes, can unpick a number of criteria
Again with different roles and perspectives
Another way of getting to info is to ask leading questions
Helpful way of thinking through issues before starting work!
Altogether comprehensive but complex
I said I wouldn’t give a recipe but…
Thinking about learning design works for any context – not necessarily MOOCs or e-learning
But will be useful for moocs
A bit contrived because of need for everything to start with C!
Phases as in Quality reference framework
Careful – learning design means designing activities for students to carry out
But there are also some activities to help the team with the design process
Based firmly on Laurillard’s conversational framework.
Learning from:
-- acquisition
-- inquiry
-- practice
-- production
-- discussion
-- collaboration
Based firmly on Laurillard’s conversational framework.
It helps to think about the design of a course as a whole.
This is a tool used as part of a learning design process at the OU – but there are other tools out there
It helps you build up an overall picture of what a student will experience
NB not for MOOCs – any OU course, online or blended
As a teacher you can construct a course from many different types of activity.
It helps to see them in broad classes
This view lets you plan activities over time
You can see at this stage the course is maybe a little out of balance.
-- there is a lot of time spent doing assimilative activity, but almost nothing in communication & collaboration
-- on the right the weekly workload is shown and that looks uneven
This shows an example of how one course design changed.
Blue shows the shape of the course at a very early stage of planning.
Then there was a workshop where the team got together to look at the overall learning design.
The orange shows what it looks like as a result of that
You can see that the course team decided to reduce the time spent on assimilative activities
And increase the time spent on finding and handling information and on communication and collaboration (other changes also)
-- encourage the student to be more active in their learning
So a summary overall
I believe it really helps to have a quality framework to work with
It helps to create a quality culture – and that will help to improve quality
There is no single recipe for good quality courses – lots of scope for innovative ideas!
Just keep these points in mind which are common practice in ODL universities
-- work in a team of people with different skills
-- think about learning design at an early stage
-- don’t let QA become burdensome
Many thanks for your attention
A good reason for encouraging collaborative learning – student success is higher if courses are designed with communicative activities.
Students love receiving lots of ‘stuff’ which they work through alone and they dislike collaborating with other students.
So courses with high proportion of assimilation are popular, but students engage less well over time and may not succeed.
So be careful of using surveys which ask students about satisfaction!
Many modules, which vary in student success (horizontal) and student satisfaction (vertical).
But satisfaction is not correlated to success
There are some courses (on the left) which get high satisfaction scores but low completion
And others (on the right) where students are very successful – but which they hate!
Something different about MOOCs is there is often a split between a university and platform provider.
For example, a MOOC may be written at the Open University (university) and published on FutureLearn (platform provider)
-- different people, different systems.
So can OpenupEd work in that situation?
Yes – quality emerges from joint efforts so evidence has to come from both partners.
Again we see that a concern for quality is deeply embedded.
So a summary overall
I believe it really helps to have a quality framework to work with
It helps to create a quality culture – and that will help to improve quality
There is no single recipe for good quality courses – lots of scope for innovative ideas!
Just keep these points in mind which are common practice in ODL universities
-- work in a team of people with different skills
-- think about learning design at an early stage
-- make sure there are good mechanisms for student support