What stakeholders really think of eLearning. Quality through self-evaluation.
Deborah Arnold, Vidéoscop-Université Nancy 2, France<br />This project has been funded with support from the European Commission. This communication reflects the views only of the author, and the Commission cannot be held responsible for any use which may be made of the information contained therein.<br />
Teachers and trainers can gather feedback on what learners really think of their learning experience.
Training managers can get the full picture by comparing responses from the different stakeholders involved.
Organisations can use the results of SEVAQ+ to benchmark against others.
Learners too have a voice in the process.<br />1st conceptual model: the Kirkpatrick levels<br />Level 5: ROI: financial impact of the training on overall costs and incentives<br />Level 4: Business results: indicators to be determined: productivity increase, sales increase, drop in turnover …<br />Level 3: Transfer of learning outcomes to the workplace: measuring, weeks afterwards, what has been changed by learning (surveys of learners, peers, line managers… control groups)<br />Level 2: Learning assessment: by tests and exams, measuring just after the course what has been achieved through learning (skills, knowledge and/or competences)<br />Level 1: Reactions: satisfaction survey completed by the learner<br />
The main stakeholder in SEVAQ v1.0: the learner<br />Partially EFQM and partially Kirkpatrick<br />3 domains of the evaluation:<br />The resources used by the learner during their learning experience<br />The processes (activities) proposed to the learner during the delivery of the course<br />The results: learning objectives achieved, effects of the experience on the learner, some measure of the transfer in the workplace<br />
SEVAQ+: extending the approach<br />A wider scope of processes: the learning processes + all the design processes of the provider<br />A wider range of users: the learner + the other stakeholders interested in the delivery of the offer<br />Two distinct contexts: e-learning in VET (as before) + e-learning in HE<br />[Diagram: object (e-learning design, learning process) × user (teacher/trainer, manager, TGs) × context (VET, HE)]<br />
The extension of SEVAQ+<br />Managers<br />Trainers<br />Providers<br />Learners<br />+ stakeholders + contexts (HE and VET) + EFQM domains + Kirkpatrick levels<br />
Interpreting results: histograms<br />Question 247: You knew every week (or day) what you were expected to do.<br />
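The response distribution behind such a histogram can be tallied in a few lines. The response values and four-point agreement scale below are illustrative assumptions, not SEVAQ+ data:

```python
from collections import Counter

# Hypothetical responses on a 4-point agreement scale
# (1 = strongly disagree … 4 = strongly agree) to a question such as
# "You knew every week (or day) what you were expected to do."
responses = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]

counts = Counter(responses)
for level in sorted(counts):
    # Print a simple text histogram: one '#' per response
    print(level, "#" * counts[level])
```

In the SEVAQ+ tool itself the histograms are produced automatically for each question; this sketch only shows the underlying counting step.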
Interpreting results: radar graphs<br />Improvement needed: sub-criteria with results below the mean<br />37: Time management [2.49]<br />41: Blended approach [2.59]<br />79: Levels of overall knowledge outcomes [2.62]<br />86: Learning management [2.41]<br />88: Self-motivation [2.55]<br />Improvement less needed or not needed: sub-criteria with results above the mean<br />18: Pedagogical aspects of learning content [2.76]<br />20: Technical requirements [3.04]<br />22: Instructional design [2.77]<br />38: Navigation and resource options [2.84]<br />84: Awareness of learning preferences [2.72]<br />Mean value for the evaluation = 2.7<br />
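The split shown on the radar graph (sub-criteria below the overall mean flagged for improvement, those above it not) can be sketched as follows. The function name is an assumption, and note that the reference mean (2.7 here) is computed over the whole evaluation, not just the sub-criteria being compared:

```python
def split_by_mean(scores, mean=None):
    """Split sub-criterion scores around a reference mean.

    scores: dict mapping sub-criterion label -> average score.
    mean: reference value; if None, the mean of the given scores is used.
    Returns (mean, improvement_needed, improvement_less_needed).
    """
    if mean is None:
        mean = sum(scores.values()) / len(scores)
    needed = {k: v for k, v in scores.items() if v < mean}
    less_needed = {k: v for k, v in scores.items() if v >= mean}
    return mean, needed, less_needed


# A few of the sub-criteria from the slide, with the evaluation mean of 2.7
mean, needed, less_needed = split_by_mean(
    {"37: Time management": 2.49,
     "20: Technical requirements": 3.04,
     "84: Awareness of learning preferences": 2.72},
    mean=2.7)
print(needed)       # only "37: Time management" falls below 2.7
print(less_needed)  # the two sub-criteria at or above 2.7
```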
Interpreting results: table<br />Summary table of critical questions, giving the percentage of respondents who rate the question as important and give it a bad evaluation.<br />A question is considered critical when its score is above 40%, and very critical when its score is 60% or above.<br />
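The criticality rule described above can be expressed as a small classifier. The function name, the input shape, and the "not critical" label for scores of 40% or below are assumptions for illustration:

```python
def classify_question(responses):
    """Classify a question from per-respondent flags.

    responses: list of (important, bad_evaluation) boolean pairs, one per respondent.
    The score is the percentage of respondents who both rate the question as
    important and give it a bad evaluation.
    """
    score = 100.0 * sum(1 for imp, bad in responses if imp and bad) / len(responses)
    if score >= 60:
        return score, "very critical"
    if score > 40:
        return score, "critical"
    return score, "not critical"


# 7 of 10 respondents find the question important and rate it badly -> 70%
print(classify_question([(True, True)] * 7 + [(True, False)] * 3))
```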
Join the Delphi expert group: March – June 2011
Invitation to validation workshop: June 2011<br />Deborah.Arnold@univ-nancy2.fr<br />www.eden-online.org/nap_elgg/pg/profile/deborah.arnold<br />Thank you for listening!<br />With the indispensable contribution of:<br />Anne-Marie Husson (CCIP-Le Préau)