How to benchmark quality in MOOCs:
The OpenupEd label
Jon Rosewell
The Open University, UK

EADTU Masterclass, 23rd Oct 2013, Paris
Why bother with quality?
• One Coursera course:
– ‘The worse course I’ve taken’ (158 posts)
– ‘Thanks Dr. Peng for a great course’ (177 posts)
• ‘Angry about first optional programming assignment?’
– ‘"Angry" implies entitlement to something better which
is, for a free course, probably not the way to go.’
• Coursera: Computing for Data Analysis (Oct 2013)
• Coursera: Social Network Analysis (Oct 2013)
Why bother with quality?
• Students – know what they are committing to
• Employers – recognition of content and skills
• Authors – personal reputation, 'glow' of success
• Institutions – brand reputation
• Funders – philanthropic, venture capital, governments

• Quality agencies – on behalf of above
UK Quality Assurance Agency
“Our job is to safeguard quality and
standards in UK universities and colleges,
so that students have the best possible
learning experience”
“Factors which apply to all learning
opportunities regardless of location, mode of
delivery, academic subject; MOOCs are no
exception to that”
Anthony McClaran, QAA event (July 2013)
Quality and learners
“What are MOOCs actually aiming at?
“Can the quality of MOOCs be assessed in the same way
as any defined university course with traditional degree
awarding processes?
“Or do we have to take into account a different type of
objective with MOOC learners? Are the learners mostly
interested in only small sequences of learning, tailored to
their own individual purpose, and then sign off and move to
other MOOCs because their own learning objective was
fulfilled?”
Ulf-Daniel Ehlers, Ebba Ossiannilsson, Alastair Creelman
http://mooc.efquel.org/
Why bother with quality?
• Reported completion may be very low (1-10%)
• Does that matter?
– With very large starting numbers, there are still many
learners completing (see the worked example below)
– Maybe learners achieve personal goals even if they
don’t complete
• Can MOOCs encourage access to HE if >90% have an
experience which is a ‘failure’?
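To put numbers on the scale argument, here is a minimal sketch; the enrolment and completion figures are hypothetical, chosen only to illustrate the arithmetic, not data from any particular course.

```python
# Illustrative arithmetic only: hypothetical figures, not data from a real MOOC.
enrolled = 100_000        # very large starting number
completion_rate = 0.05    # 5%, within the reported 1-10% range

completers = int(enrolled * completion_rate)
print(f"{completers} learners complete")                      # 5000 - a large cohort
print(f"{enrolled - completers} ({1 - completion_rate:.0%}) do not complete")
```

Even at 5% completion, 100,000 registrants yield 5,000 completers; the flip side is that 95,000 learners have an experience that, by that measure, counts as 'failure'.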
Quality points
[Diagram: quality points across the MOOC lifecycle, from creation to use – provenance, reputation and brand as quality cues; checking during MOOC creation, peer review, and user recommendation during use]
OpenupEd Label
• Partners will be HEIs
– meeting national QA & accreditation requirements
• Internal QA process for MOOC approval
• OpenupEd MOOC quality label gained initially
– self-assessment & review
– institutional and course level (first 2 courses)
• Label to be renewed periodically
– additional MOOCs reviewed at course level only
• HEI evaluates and monitors its MOOCs
OpenupEd features
• Openness to learners
• Digital openness
• Learner-centred approach
• Independent learning
• Media-supported interaction
• Recognition options
• Quality focus
• Spectrum of diversity
E-xcellence project 2005–present
Funded by EU Lifelong Learning programme
Managed by EADTU
• E-xcellence 2005-06
– Development and trialling of criteria, handbooks and
methodology
• E-xcellence plus 2008-09
– Dissemination to institutions and to QA agencies in 9 European
countries
• E-xcellence NEXT 2011-12
– Continuing dissemination and updating of criteria and
resources
http://e-xcellencelabel.eadtu.eu
E-xcellence: modes of use
• Informal self-evaluation
– Use Quickscan (see the sketch after this list)
• Local seminar
– Local use of Quickscan with justification for rating
– Meeting: institution, project team, national QA agency
– Improvement roadmap
• Full assessment
– As above but part of formal accreditation
– Evidence provided for benchmarks
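As a rough sketch of the Quickscan idea – rate a course against benchmark statements, then look for hot and cold spots – the following Python fragment is hypothetical: the statements, ratings and improvement threshold are invented for illustration, the four-point labels are borrowed from the notes below, and the real Quickscan uses 35 benchmark statements with online feedback.

```python
# Hypothetical sketch of a Quickscan-style self-assessment.
# Statements and ratings are invented examples; the real Quickscan has
# 35 benchmark statements and provides its feedback online.
SCALE = ["not achieved", "partially achieved", "largely achieved", "fully achieved"]

ratings = {
    "Course design supports independent learning": 3,
    "Assessment matches stated learning outcomes": 1,
    "Staff receive support for e-learning delivery": 0,
}

# Low-rated benchmarks are the "cold spots", i.e. elements to be improved.
for statement, score in sorted(ratings.items(), key=lambda kv: kv[1]):
    flag = "  <- improve" if score <= 1 else ""
    print(f"{SCALE[score]:20} | {statement}{flag}")
```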
Issues
• How tightly are MOOCs linked to HEI core offering?
– extracts from existing courses
– ‘skunk works’ projects
• Terminology
– Are they ‘students’ or ‘participants’?
• not registered
• different legal / pastoral relationships
– duty of disclosure?
– responsibility?
Draft benchmarks – activity
• Think of a MOOC you teach or study
• Which benchmarks are appropriate?
• Are there benchmarks missing?
• In practice:
– Which would be useable?
– Which would be discriminating?
• Is the mapping to OpenupEd features correct?
• All comments welcomed!
Comments and feedback?
Email: Jon.Rosewell@open.ac.uk

Thank you for your attention

Editor's Notes

  • #8 Have used a similar slide to explain how the perceived quality of OER material emerges; here the same framework is used to consider MOOCs. Currently, much perceived quality is associated with brand and repository: where is the MOOC found? Which big-name university produced it? But this could be misleading: big-name universities gain their reputation from research, so there is no guarantee that an individual MOOC is well-designed teaching. However, probably all MOOCs derive from taught courses, which receive some measure of checking while created. Also, presume there is some review / selection process by the institution before making a public offering (if there isn't, there is reputational risk to the host). Increasingly there will also be rating / voting / recommendation by users, possibly also alt-metrics.
  • #9 The overall quality process is intended to encourage quality enhancement through self-assessment and review. The OpenupEd MOOC benchmarks are themselves provisional and open to revision.
  • #10 Distinctive OpenupEd features
  • #11 What is the problem? There are established HE QA procedures in Europe, but these were designed for conventional universities and don't necessarily fit e-learning. Solution: provide resources and processes for QA of e-learning that can be adapted for local/national purposes.
  • #12 Resources: manual including benchmarks and indicators. Structure: Strategic Management, Curriculum Design, Course Design, Course Delivery, Student Support, Staff Support.
  • #13 (Same manual structure as #12.)
  • #14 Online Quickscan: 35 benchmark statements; a quick self-assessment of e-learning performance. Rate the programme/course on the most relevant aspects; this identifies hot / cold spots of the e-learning programme/course. The online version provides feedback: to identify elements to be improved, to guide the internal discussion, and to learn whether a full quality assessment procedure would be useful.
  • #15 Expect that the OpenupEd process will be a light-touch version of the above.
  • #16 Note: benchmarks are split into those that apply at course level and those that apply to the institution. Each refers back to the closest E-xcellence benchmark – look this up in the manual (http://e-xcellencelabel.eadtu.eu) for more background and detailed indicators – and maps to the OpenupEd features. A simple Likert scale is used: not achieved, partially achieved, largely achieved, fully achieved. Expect that this will give initial thoughts on positioning relative to the benchmarks. Expect that the full review process will allow customising benchmarks to those relevant to the institution, then collection of evidence (no room on this form!) and an agreed level of performance. Following review, expect a roadmap of improvement actions, to be followed up at periodic review.
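One way to picture the form described in this note is as a record per benchmark. The sketch below is hypothetical – the field names and the example statement are invented – but it follows the structure the note describes: a course/institution split, a cross-reference to the nearest E-xcellence benchmark, a mapping to OpenupEd features, and the four-point scale.

```python
from dataclasses import dataclass

LIKERT = ("not achieved", "partially achieved", "largely achieved", "fully achieved")

@dataclass
class Benchmark:
    """One row of an OpenupEd-style self-assessment form (hypothetical model)."""
    statement: str
    level: str            # "course" or "institution"
    excellence_ref: str   # closest E-xcellence benchmark, for background reading
    features: tuple       # mapped OpenupEd features
    rating: int = 0       # index into LIKERT
    evidence: str = ""    # collected during a full review

    def describe(self) -> str:
        return f"[{self.level}] {self.statement}: {LIKERT[self.rating]}"

# Invented example entry, for illustration only.
b = Benchmark(
    statement="Learners can participate without formal entry requirements",
    level="course",
    excellence_ref="E-xcellence: Course Design",
    features=("Openness to learners",),
    rating=2,
)
print(b.describe())  # [course] Learners can participate ...: largely achieved
```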