Blended Learning
Workshop
STRATEGIES FOR ASSURING THE QUALITY
OF A BLENDED COURSE
Richard Walker
E-Learning Development Team
University of York
Key questions for this workshop
1. What steps can we take to ensure that a
blended course really meets its objectives in
supporting student learning?
2. How can we support a continuous process of
improvement in the way that we design and
support student learning activities?
Underpinning these questions:
3. How can we engage course instructors in QA
& continuous improvement processes?
Workshop Outline
1. ‘Designing in’ good practice
- QA & pre-testing frameworks
2. Developing an evaluation plan
- Principles & practical considerations
3. Course delivery & the development cycle
- Building a continuous process of improvement
Virtuous development cycle
 Pedagogic aims for online delivery
 Design models: what’s possible?
 E-tools: best fit for pedagogic objectives
 Develop site: reflecting guidelines & standards
 Test: peer review & student testing
 Induction
 Supporting / sustaining student activity
 Evaluating student learning experience
 Lessons learned, informing course design, task design & instructional responsibilities
‘Designing in’ good practice
Design phase: clear objectives for student learning
 Learning outcomes informing activity design; technology; structure of the blend
Development phase: embedding QA principles
 Implement in development of learning space
 Informing design and presentation of learning resources, tasks & activities
 Supported through training & quality frameworks
Pre-testing phase: review of learning space
 Fitness for purpose
Quality Matters Rubric
http://www.qmprogram.org/rubric
8 general standards:
1. Course overview & introduction
2. Learning objectives
3. Assessment & measurement
4. Resources & materials
5. Learner engagement
6. Course technology
7. Learner support
8. Accessibility
Blended Module Checklist
(An online interactive version of the checklist is available.)
A. Course Overview & Introduction
 Statement of purpose; objectives; orientation
B. Course Design
 Course structure, usability, guidance & support
C. Presentation of Resources
 Layout, format, instructions
D. Site Interaction
 Communication channels, standards for participation
Pre-testing your site
 Self-directed testing
 Peer review
Health check template
Completed per module, with general comments against each area:

Site structure
i. Top level
ii. Folder structure
iii. Empty content areas
iv. Unused tools

Usability / accessibility
i. Menu length
ii. Welcome message
iii. Filenames
iv. Flexible formats

Multimedia
i. Plug-ins / players
ii. Launch OK

CMS usage
i. Permissions

Example comments:
“There is no real guidance within the course site on the resources which students have been asked to use. It is good practice to describe the resource and explain any plug-in requirements to view each resource, as well as tips on how to engage with it.”
“Once underway in the course, each link to a file resource should have a description (this is a PDF file; file size; what to do), so that students know what media they are going to engage with.”
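The health-check template is essentially structured data, so reviewers can record results consistently across modules. A minimal sketch, assuming nothing beyond the check names listed above (the `report` helper and the result format are invented for illustration, not part of any real VLE tooling):

```python
# The areas and checks mirror the health-check template above.
HEALTH_CHECKS = {
    "Site structure": ["Top level", "Folder structure", "Empty content areas", "Unused tools"],
    "Usability / accessibility": ["Menu length", "Welcome message", "Filenames", "Flexible formats"],
    "Multimedia": ["Plug-ins / players", "Launch OK"],
    "CMS usage": ["Permissions"],
}

def report(results):
    """Return the checks that were not passed, grouped by area.

    `results` maps (area, check) -> bool; missing entries count as failures.
    """
    failed = {}
    for area, checks in HEALTH_CHECKS.items():
        misses = [c for c in checks if not results.get((area, c), False)]
        if misses:
            failed[area] = misses
    return failed

# Example: a reviewer passes everything except file naming.
results = {(a, c): True for a, cs in HEALTH_CHECKS.items() for c in cs}
results[("Usability / accessibility", "Filenames")] = False
print(report(results))  # {'Usability / accessibility': ['Filenames']}
```

The payoff of treating the template as data is that outstanding issues can be summarised automatically across a whole programme of modules.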
Pre-testing your site
 Self-directed testing
 Peer review
 Student testing
Reflection point
How can you be sure that the learning objectives for the blended course are realised?
How will you track student participation and engagement?
How will you evaluate the overall effectiveness of the course design & delivery processes?
Developing your evaluation plan
 Plan before the course starts
 Embed in the overall design of the course (reflecting learning objectives)
 Inform students about the evaluation (if participation is required)
Your plan should consider:
i. Aims & focus of evaluation
ii. Key questions
iii. Stakeholders
iv. Time scales & dependencies
v. Instruments & methods
Adapted from Jara et al. (2008) Evaluation of E-Learning Courses
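The five plan elements above can be captured as one structured record, which makes it easy to review the plan for completeness before the course starts. A minimal sketch; the field names follow the list above, and all example values are invented:

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """One record per course, mirroring elements i–v of the plan."""
    aims: str
    key_questions: list
    stakeholders: list
    timescales: dict   # milestone -> when it happens
    instruments: list

plan = EvaluationPlan(
    aims="Assess whether the blend supports the stated learning outcomes",
    key_questions=["Were the course objectives met?",
                   "How did students use the online tools?"],
    stakeholders=["students", "instructors", "e-learning team"],
    timescales={"entry survey": "week 1",
                "exit survey": "final week",
                "focus group": "post-course"},
    instruments=["entry & exit surveys", "contribution statistics",
                 "focus group interviews"],
)

# Completeness check: every element must be filled in before delivery begins.
assert all([plan.aims, plan.key_questions, plan.stakeholders,
            plan.timescales, plan.instruments])
```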
Principles for course evaluation
 Outcome-based: focusing on measurable & objective standards
– Were the course objectives met (e.g. levels of engagement & patterns of use of online resources)?
– Did learners reach the targeted learning outcomes (e.g. approaches to learning; levels of understanding)?
 Interpretive: focusing on context
(perceptions of the learning experience)
– What were the students’ affective and attitudinal responses to
the blended course experience?
– How were the e-learning tools used by students to support their
learning in formal & informal study activities?
– How did the lecturer/tutors perceive students’ learning relative
to previous performance? (What actions should be taken for
future course development?)
Data collection methods
 Entry & exit surveys
 Contribution statistics
 Course statistics
 Focus group interviews
 Tools for reflection
 (Informal progress checks)
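Contribution statistics usually come from the VLE's activity log. A hedged sketch of the kind of per-student summary involved, assuming a hypothetical export of (student_id, action) rows — real systems export different formats, so this is illustrative only:

```python
from collections import Counter

# Hypothetical activity-log export: one (student_id, action) row per event.
log = [
    ("s1", "forum_post"), ("s2", "forum_post"), ("s1", "file_view"),
    ("s1", "forum_post"), ("s3", "file_view"), ("s2", "file_view"),
]

# Count each action type per student; Counter returns 0 for missing keys.
posts = Counter(sid for sid, action in log if action == "forum_post")
views = Counter(sid for sid, action in log if action == "file_view")

for sid in sorted({sid for sid, _ in log}):
    print(f"{sid}: {posts[sid]} posts, {views[sid]} views")
# s1: 2 posts, 1 views
# s2: 1 posts, 1 views
# s3: 0 posts, 1 views
```

Note that such counts show activity, not learning: a student like s3 who only views files may still be engaging productively, which is why the deck pairs these statistics with surveys and focus groups.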
Evaluation Pathway
[Timeline: class sessions alternate with online activities; each online activity is followed by feedback on performance.]

Role       | Start        | Course Delivery                            | End         | Post Course
Instructor | Entry survey | Feedback on performance                    | Exit survey |
Students   |              | Task performance and self-reflection       |             |
System     |              | Course statistics & contribution histories |             |
Researcher |              | Content analysis                           |             | Focus group
Challenges in interpreting your data
Student engagement
  Survey fatigue
Reliability
  Halo/horns effect
Validity
  Visibility of student learning
  Context of student learning
Reflection on action: defining next steps
Was the course design fit for purpose?
 Usefulness / engagement patterns for online components of the module
 Complementary nature of class-based & online activities
 Relevance of assessment plan
 Sequencing of tasks
Were the course materials suited to the online tasks?
 Levels of learning / differentiation & accessibility
Was instructional support adequate, enabling & timely?
 Instructions, feedback and support
Summary
Course delivery as a development cycle
Design: Pedagogic aims; design model; course testing; delivery & evaluation plans
Deliver: Socialise; support; sustain; sum up student learning. Evidence collection as a feature of course delivery
Evaluate: Establish a holistic view of student learning, employing outcome-focused & interpretive research methods
Review: Reflection on action – defining next steps
References and recommended reading
Fox, S. and MacKeogh, K. (2003). ‘Can eLearning Promote Higher-order Learning Without Tutor Overload?’ Open Learning: The Journal of Open and Distance Learning, 18(2), 121–134.
Gunawardena, C., Lowe, C. & Carabajal, K. (2000). Evaluating Online Learning: models and methods. In D. Willis et al. (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2000 (pp. 1677–1684). Chesapeake, VA: AACE.
Jara, M., Mohamad, F., & Cranmer, S. (2008). Evaluation of E-Learning Courses. WLE Centre Occasional Paper 4. Institute of Education, University of London. http://www.wlecentre.ac.uk/cms/files/occasionalpapers/evaluation_of_online_courses_25th.pdf
Quality Matters Program Rubric. http://www.qmprogram.org/rubric
Thank You
Richard Walker
The University of York
richard.walker@york.ac.uk

BdEurope2011_BlendedLearningWorkshop.pptx
