How to benchmark quality in MOOCs - the OpenupEd label


An introduction to the draft QA benchmarks for MOOCs. These benchmarks are derived from the E-xcellence NEXT benchmarks for e-learning.

Notes for slides
  • Have used a similar slide to explain how perceived quality of OER material emerges – here the same framework is used to consider MOOCs
    Currently, much perceived quality is associated with brand and repository – where is the MOOC found? Which big-name university produced it?
    But this could be misleading – big-name universities gain their reputation for research, so there is no guarantee that an individual MOOC is well-designed teaching
    However, most MOOCs probably derive from taught courses, which undergo some measure of checking while being created. Also, presume there is some review / selection process by the institution before making a public offering. (If there isn't, there is reputational risk to the host.)
    Increasingly there will also be rating / voting / recommendation by users, and possibly also alt-metrics.
  • The overall quality process is intended to encourage quality enhancement through self-assessment and review.
    The OpenupEd MOOC benchmarks are themselves provisional and open to revision.
  • Distinctive OpenupEd features
  • What is the problem?
    There are established HE QA procedures in Europe
    These were designed for conventional universities
    They don't necessarily fit e-learning
    E-xcellence responds by providing resources and processes for QA of e-learning
    These can be adapted for local/national purposes
  • Resources: manual including benchmarks and indicators
    Strategic Management
    Curriculum Design
    Course Design
    Course Delivery
    Student Support
    Staff Support
  • Online Quickscan
    35 Benchmark statements
    Quick self-assessment of e-learning performance
    Rate programme/course on the most relevant aspects
    Identifies hot / cold spots of e-learning programme/course
    Online version provides feedback:
    To identify elements to be improved
    To guide the internal discussion
    To learn if a full quality assessment procedure is useful
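A Quickscan-style self-assessment of this kind could be sketched in a few lines of Python: rate each benchmark on the four-level scale from the draft form, then flag the weakest areas (cold spots) and strongest areas (hot spots). This is an illustrative sketch only, not the actual Quickscan tool; the benchmark names and thresholds below are invented for the example.

```python
# Illustrative sketch of a Quickscan-style self-assessment.
# The four-level scale matches the draft benchmark form; the
# benchmark names and thresholds are hypothetical examples.

SCALE = {"not achieved": 0, "partially achieved": 1,
         "largely achieved": 2, "fully achieved": 3}

def spot_scan(ratings, cold_max=1, hot_min=2):
    """Split benchmarks into cold spots (needing improvement) and hot spots."""
    scores = {name: SCALE[level] for name, level in ratings.items()}
    cold = sorted(name for name, s in scores.items() if s <= cold_max)
    hot = sorted(name for name, s in scores.items() if s >= hot_min)
    return cold, hot

# Hypothetical self-assessment of one course
ratings = {
    "Course design": "largely achieved",
    "Course delivery": "partially achieved",
    "Student support": "not achieved",
    "Staff support": "fully achieved",
}
cold, hot = spot_scan(ratings)
print("Cold spots:", cold)   # candidates for the improvement roadmap
print("Hot spots:", hot)
```

The cold-spot list is the kind of output that would guide internal discussion and feed an improvement roadmap.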
  • Expect that OpenupEd process will be light-touch version of above.
  • Note:
    Benchmarks – split into those that apply at course level and those that apply at institution level
    Reference back to the closest E-xcellence benchmark – look this up in the manual for more background and detailed indicators
    Mapping to OpenupEd features
    Simple Likert scale – not achieved, partially achieved, largely achieved, fully achieved
    Expect that this will give initial thoughts on positioning relative to the benchmarks. Expect that the full review process will allow customising benchmarks to those relevant to the institution, then collection of evidence (no room on this form!) and an agreed level of performance. Following review, expect a roadmap of improvement actions, to be followed up at periodic review.

    1. How to benchmark quality in MOOCs: The OpenupEd label
       Jon Rosewell, The Open University, UK
       EADTU Masterclass, 23rd Oct 2013, Paris
    2. Why bother with quality?
       • One Coursera course:
         – 'The worse course I've taken' (158 posts)
         – 'Thanks Dr. Peng for a great course' (177 posts)
       • 'Angry about first optional programming assignment?'
         – '"Angry" implies entitlement to something better which is, for a free course, probably not the way to go.'
       Sources: Coursera, Computing for Data Analysis (Oct 2013); Coursera, Social Network Analysis (Oct 2013)
    3. Why bother with quality?
       • Students – know what they are committing to
       • Employers – recognition of content and skills
       • Authors – personal reputation, 'glow' of success
       • Institutions – brand reputation
       • Funders – philanthropic, venture capital, governments
       • Quality agencies – on behalf of the above
    4. UK Quality Assurance Agency
       "Our job is to safeguard quality and standards in UK universities and colleges, so that students have the best possible learning experience"
       "Factors which apply to all learning opportunities regardless of location, mode of delivery, academic subject; MOOCs are no exception to that"
       Anthony McClaran, QAA event (July 2013)
    5. Quality and learners
       "What are MOOCs actually aiming at?
       "Can the quality of MOOCs be assessed in the same way as any defined university course with traditional degree awarding processes?
       "Or do we have to take into account a different type of objective with MOOC learners? Are the learners mostly interested in only small sequences of learning, tailored to their own individual purpose, and then sign off and move to other MOOCs because their own learning objective was fulfilled?"
       Ulf-Daniel Ehlers, Ebba Ossiannilsson, Alastair Creelman
    6. Why bother with quality?
       • Reported completion may be very low (1–10%)
       • Does that matter?
         – With very large starting numbers, there are still many learners completing
         – Maybe learners achieve personal goals even if they don't complete
       • Can MOOCs encourage access to HE if >90% have an experience which is a 'failure'?
    7. Quality points
       [Diagram: provenance, reputation and brand as quality points before a MOOC is created; checking and peer review during creation; user recommendation in use]
    8. OpenupEd Label
       • Partners will be HEIs – meet national QA & accreditation
       • Internal QA process for MOOC approval
       • OpenupEd MOOC quality label gained initially
         – self-assessment & review
         – institutional and course level (first 2 courses)
       • Label to be renewed periodically
         – additional MOOCs reviewed at course level only
       • HEI evaluates and monitors its MOOCs
    9. OpenupEd features
       • Openness to learners
       • Digital openness
       • Learner-centred approach
       • Independent learning
       • Media-supported interaction
       • Recognition options
       • Quality focus
       • Spectrum of diversity
    10. E-xcellence project
        2005–present, funded by the EU Lifelong Learning programme, managed by EADTU
        • E-xcellence 2005–06 – development and trialling of criteria, handbooks and methodology
        • E-xcellence plus 2008–09 – dissemination to institutions and to QA agencies in 9 European countries
        • E-xcellence NEXT 2011–12 – continuing dissemination and updating of criteria and resources
    11. [Image-only slide]
    12. E-xcellence: modes of use
        • Informal self-evaluation
          – use Quickscan
        • Local seminar
          – local use of Quickscan with justification for rating
          – meeting: institution, project team, national QA agency
          – improvement roadmap
        • Full assessment
          – as above but part of formal accreditation
          – evidence provided for benchmarks
    13. Issues
        • How tightly are MOOCs linked to the HEI core offering?
          – extracts from existing courses
          – 'skunk works' projects
        • Terminology – are they 'students' or 'participants'?
          – not registered
          – different legal / pastoral relationships: duty of disclosure? responsibility?
    14. Draft benchmarks – activity
        • Think of a MOOC you teach or study
        • Which benchmarks are appropriate?
        • Are there benchmarks missing?
        • In practice:
          – Which would be useable?
          – Which would be discriminating?
        • Is the mapping to OpenupEd features correct?
        • All comments welcomed!
    15. Comments and feedback?
        Email:
        Thank you for your attention