What stakeholders really think of eLearning: Quality through self-evaluation
Deborah Arnold, Vidéoscop-Université Nancy 2, France
This project has been funded with support from the European Commission. This communication reflects the views only of the author, and the Commission cannot be held responsible for any use which may be made of the information contained therein.
What are we going to cover?
Theoretical background
How to design sound self-evaluation questionnaires
Interpreting results
SEVAQ+ user testing and survey
Scenarios for implementing SEVAQ+
What is SEVAQ+?
A combined tool & approach for the self-evaluation of quality in technology-enhanced learning
A Lifelong Learning Programme KA4 Project (2009-2010) “Dissemination and exploitation of results”
Support for improving the quality and attractiveness of Vocational Education and Training (VET)
Support for the HE modernisation agenda: governance, curricular reform

What can we do with SEVAQ+?
Create questionnaires based on recognised quality approaches.
Analyse results with easy-to-interpret data.
Improve the quality of our courses.
Teachers and trainers can gather feedback on what learners really think of their learning experience.
Training managers can get the full picture by comparing responses from the different stakeholders involved.
Organisations can use the results of SEVAQ+ to benchmark against others.
Learners too have a voice in the process.

1st conceptual model: the Kirkpatrick levels
Level 5: ROI. Financial impact of the training on overall costs and incentives.
Level 4: Business results. Indicators to be determined: productivity increase, sales increase, drop in staff turnover, etc.
Level 3: Transfer of learning outcomes to the workplace. Measured weeks later: what has been changed by learning (surveys of learners, peers, line managers, witness groups).
Level 2: Learning assessment, by tests and exams. Measured immediately afterwards: what has been achieved through learning (skills, knowledge and/or competences).
Level 1: Reactions. Satisfaction survey completed by the learner.
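To make these five levels concrete, here is a minimal Python sketch representing the model as a simple data structure. The class, field names and timing values are illustrative assumptions of mine, not part of the Kirkpatrick model or of SEVAQ+ itself.

```python
from dataclasses import dataclass

@dataclass
class KirkpatrickLevel:
    """One level of the Kirkpatrick evaluation model (field names are illustrative)."""
    level: int
    focus: str          # what is evaluated
    measured_by: str    # typical instrument, paraphrasing the slide
    timing: str         # when the measurement takes place (assumed for levels 4-5)

KIRKPATRICK_MODEL = [
    KirkpatrickLevel(1, "Reactions", "satisfaction survey completed by the learner", "end of course"),
    KirkpatrickLevel(2, "Learning", "tests and exams on skills, knowledge, competences", "just after the course"),
    KirkpatrickLevel(3, "Transfer to the workplace", "surveys of learners, peers, line managers", "weeks after the course"),
    KirkpatrickLevel(4, "Business results", "indicators such as productivity or sales increase", "months after the course"),
    KirkpatrickLevel(5, "ROI", "financial impact of training on overall costs", "months after the course"),
]

for lvl in KIRKPATRICK_MODEL:
    print(f"Level {lvl.level}: {lvl.focus} -- {lvl.measured_by} ({lvl.timing})")
```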
2nd conceptual model: the EFQM model (© EFQM, European Foundation for Quality Management)
A management model for total quality (all processes, all actors). Each domain interacts with all the others.
The main stakeholder in SEVAQ v1.0: the learner
Partially EFQM and partially Kirkpatrick, centred on the learner. Three domains of evaluation:
The resources used by the learner during the learning experience
The processes (activities) proposed to the learner during the delivery of the course
The results: learning objectives achieved, effects of the experience on the learner, and some measure of transfer to the workplace
SEVAQ+: extending the approach
A wider scope of processes: the learning processes + all the design processes of the provider
A wider target of users: the learner + the other stakeholders interested in the delivery of the offer
Two distinct contexts: e-learning in VET (as before) + e-learning in HE
[Diagram: Object (e-learning design, learning process) × User (teacher/trainer, manager, TGs) × Context (VET, HE)]
The extension of SEVAQ+
+ stakeholders: managers, trainers, providers, learners
+ contexts (HE and VET)
+ EFQM domains
+ Kirkpatrick levels
The structure of the SEVAQ+ questionnaires
Questionnaires are built as a hierarchy: criteria (grouped into the Resources, Activities and Results domains) contain sub-criteria, which in turn contain the statements respondents rate.
SEVAQ 1: 17 criteria, 38 sub-criteria, 208 statements
SEVAQ+: 22 criteria, 99 sub-criteria, 723 statements
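A minimal Python sketch of this hierarchy, assuming a simple nested representation; the class and field names are hypothetical, and the sample content is for illustration only (the statement is borrowed from the histogram slide further below).

```python
from dataclasses import dataclass, field

# Hypothetical model of the questionnaire hierarchy described above:
# criteria (grouped into Resources / Activities / Results) contain
# sub-criteria, which in turn contain the statements respondents rate.

@dataclass
class SubCriterion:
    name: str
    statements: list[str] = field(default_factory=list)

@dataclass
class Criterion:
    name: str
    domain: str  # "Resources", "Activities" or "Results"
    sub_criteria: list[SubCriterion] = field(default_factory=list)

# SEVAQ 1: 17 criteria, 38 sub-criteria, 208 statements
# SEVAQ+:  22 criteria, 99 sub-criteria, 723 statements
questionnaire = [
    Criterion("Criterion 1", "Resources", [
        SubCriterion("Sub-criterion a",
                     ["You knew every week (or day) what you were expected to do."]),
    ]),
    # ... remaining criteria omitted
]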
EFQM & Kirkpatrick applied
EFQM combined with Kirkpatrick levels 1 & 2; EFQM combined with Kirkpatrick level 3.
Testing SEVAQ+: Step 1 - getting started
SEVAQ+ demos: http://www.sevaq-plus.preau.ccip.fr/demo/demo_1/sevaqplus_demo.html
Step 2: design and implement questionnaires
Demo 1, Demo 2a, Demo 2b, Demo 3, Demo 4
Interpreting results: histograms
Question 247: You knew every week (or day) what you were expected to do.
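As a rough illustration of reading such a histogram, the sketch below plots the distribution of answers to a single statement. The 1-4 rating scale and the sample responses are assumptions for demonstration only.

```python
import matplotlib.pyplot as plt

# Hypothetical answers to one statement on an assumed 1-4 rating scale
# (1 = strongly disagree ... 4 = strongly agree).
responses = [1, 2, 2, 3, 3, 3, 4, 4, 3, 2, 4, 3]

plt.hist(responses, bins=range(1, 6), align="left", rwidth=0.8)
plt.xticks([1, 2, 3, 4])
plt.xlabel("Rating")
plt.ylabel("Number of respondents")
plt.title("Q247: You knew every week what you were expected to do.")
plt.show()
```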
Interpreting results: radar graphs
Mean value for the evaluation = 2.7
Improvement needed (sub-criteria with results under the mean):
37: Time management [2.49]
41: Blended approach [2.59]
79: Levels of overall knowledge outcomes [2.62]
86: Learning management [2.41]
88: Self-motivation [2.55]
Improvement less or not needed (sub-criteria with results above the mean):
18: Pedagogical aspects of learning content [2.76]
20: Technical requirements [3.04]
22: Instructional design [2.77]
38: Navigation and resource options [2.84]
84: Awareness of learning preferences [2.72]
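The split above is a straightforward comparison of each sub-criterion's mean score against the overall mean. A minimal sketch, assuming the scores are held in a plain dict; averaging just the ten sub-criteria shown happens to reproduce the 2.7 quoted on the slide, though the tool presumably averages over all evaluated sub-criteria.

```python
# Sub-criterion means from the radar-graph slide, keyed by id and name.
scores = {
    "37 Time management": 2.49,
    "41 Blended approach": 2.59,
    "79 Levels of overall knowledge outcomes": 2.62,
    "86 Learning management": 2.41,
    "88 Self-motivation": 2.55,
    "18 Pedagogical aspects of learning content": 2.76,
    "20 Technical requirements": 3.04,
    "22 Instructional design": 2.77,
    "38 Navigation and resource options": 2.84,
    "84 Awareness of learning preferences": 2.72,
}

mean = sum(scores.values()) / len(scores)  # ~2.7, as on the slide
needs_improvement = sorted(k for k, v in scores.items() if v < mean)
print(f"Mean = {mean:.2f}")
print("Improvement needed:", needs_improvement)
```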
Interpreting results: tables
A summary table lists critical questions, giving the percentage of respondents who rate a question as important yet give it a bad evaluation. A question is considered critical when this score is above 40%, and very critical when it is 60% or above.
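In other words, a question's criticality score is the share of respondents who both mark the question as important and give it a bad rating. A short sketch of that calculation follows; the data representation is my assumption, not SEVAQ+'s actual model.

```python
def criticality(responses):
    """Share of respondents marking the question important AND rating it badly.

    `responses` is a list of (important: bool, bad_rating: bool) pairs --
    an assumed representation for illustration only.
    """
    hits = sum(1 for important, bad in responses if important and bad)
    return hits / len(responses)

def classify(score):
    """Apply the thresholds from the slide: >40% critical, >=60% very critical."""
    if score >= 0.60:
        return "very critical"
    if score > 0.40:
        return "critical"
    return "not critical"

# Example: 13 of 20 respondents (65%) flag the question as important and rate it badly.
sample = [(True, True)] * 13 + [(True, False)] * 4 + [(False, True)] * 3
score = criticality(sample)
print(f"{score:.0%} -> {classify(score)}")  # 65% -> very critical
```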
SEVAQ+ user testing (key figures)
136 direct responses to the call for testers
Leading to over 16,000 potential testers (58% HE / 42% VET)
7 local and national seminars in FR, PL, LT and IT
3 international workshops (Valencia, Oeiras & Budapest) with EDEN & EFQUEL
1,500 current testers (more expected at the end of January, with end-of-semester evaluations)
European-wide user survey (key figures)
2,158 questionnaires completed (response rate = 70%)
Scenarios of use
Tell us how YOU want to use SEVAQ+!
Join us!
Facebook: SEVAQ+
www.sevaq.eu
LinkedIn: Evaluation of training and courses
YouTube channel: Quality in eLearning
Test the tool and have your say!
User evaluation – ongoing until the end of January 2011
Join the Delphi expert group: March – June 2011
Invitation to validation workshop, June 2011
Thank you for listening!
Deborah.Arnold@univ-nancy2.fr
www.eden-online.org/nap_elgg/pg/profile/deborah.arnold
With the indispensable contribution of: Anne-Marie Husson (CCIP-Le Préau)
Rolf Reinhardt (EFQUEL)
And all the SEVAQ+ partners!
www.sevaq.eu