CDE Conference 09/02/2009. C Daly: Embedding evaluation in distance courses – using narrative methods


Research in Distance Education conference. Evaluation and Assessment strand presentation.
Dr Caroline Daly
Centre for Excellence: Work-based Learning for Education Professionals, Institute of Education


    1. Embedding evaluation in distance courses – using narrative methods
       9 February 2009
       Caroline Daly, Norbert Pachler, Jon Pickering, Jill Russell and Jon Wardle
    2. Rationale for the research projects
       - We need course development to be informed by e-learners’ experiences
       - The ‘primacy of improving the student learning experience’ (HEA Strategic Plan 2005–2015) for overseas and home students
       - The inadequacies of evaluation based on ‘satisfaction’ and ‘exit’ models
       - We perceive coherence in our learning designs where the students may not
       - We need a sustainable way of understanding learners’ experiences in distance e-learning contexts
       - We need to avoid the ‘Sisyphus effect’
    3. E-learners’ experiences – four types of ‘newness’ which affect participation for all (CDE 1)
       - The experience of managing work, life and learning with technology
       - The experience of writing with fellow students in an online discussion in order to learn
       - The experience of collaborative learning in an online tutor group
       - The experience of social relationships with other e-learners
    4. Narrative evaluation methods are based on…
       - a ‘way of knowing’ or ‘mode of thought’ (Bruner, 1985)
       - a way of filtering experience in order to make sense of it
       - giving practitioners access to informal as well as formal experiences of students’ learning
       - supporting respondents in engaging with knowledge they have, which is often tacit and unarticulated
       - a qualitative way of understanding something for both the learner and the practitioner – an interpretation of reality
         - the learner organises their experiences by ‘narrating’
         - the practitioner constructs the meanings to be coherent within a view of what has happened
    5. Why narrative methods?
       - A way of capturing the learner voice as a prime research or evaluation tool in qualitative approaches
       - A way of developing situated accounts of respondents’ experiences
       - To provide ongoing information about our practice as providers while it is happening
       - To impact on learners’ abilities to adapt to new learning contexts (Levy, 2006)
       - A way of better understanding the realities of ‘being a learner’ in complex contexts
       - To inform effective education practitioner-development
    6. Narrative interviews
       - Loosely structured – “without overspecifying the substance or the perspective of the talk” (McCracken, 1988)
       - Naïve interviewer stance
       - Invitational, not interventional
       - Long
       - ‘Confessional’
    7. What is going on in a narrative interview?
       - Emplotment
       - Chronology
       - Opportunities for individual sense-making (e.g. metaphor-building in this example)
       - Opportunities for deep levels of reflection by the respondent can elicit rich data
       - The agenda becomes the respondent’s
    8. Contemporary contexts for narrative evaluation
       Patterns of student participation are changing. A range of new contexts raises challenges for distance education and training, in terms of how we can evaluate the learner experience:
       - E-learning
       - Distance education
       - Mass education
       - Work-based learning
       - Professional accreditation
       - Retention, isolation, dilution, serial change, multiple pressures…
    9. CDE 2 project 2006–7
       Main aims:
       - to develop methods to collect evaluation information from distance learners
       - to design an embedded approach to evaluation in fully online contexts

       Context:
       - 2 online masters degrees, UCL & Bournemouth University
       - Participants (n=18): new students – 11 in primary healthcare (UCL), 7 in creative media (BU)
       - E-learning at a distance: online discussions, electronic text-based resources, wikis (BU), blogs (BU), podcasts (BU)
       - International: UCL cohort includes a high proportion of South African students; CEMP cohort is mostly UK based

       Design:
       - Longitudinal – 9 months
       - Quantitative data: pre-course questionnaire
       - Qualitative data – learner narratives: online discussion (course-based), narrative interviews by telephone, online commentaries, individual email correspondence, focus groups with tutors
       - Analytical approach based on systematic reading and open-coding of a range of narratives to establish the effectiveness of methods
    10. Narrative evaluation methods for distance learners
        - Group narratives aimed at collective sense-making of experiences – the UCL ‘intensive introduction to e-learning’ as an embedded evaluation activity
        - Telephone interviews – adapting individual, loosely-structured narrative interviews using ‘postmodern’ techniques (Gubrium, 2003) from face-to-face contexts
        - Virtual focus groups using online forums to elicit reflective group discussion on key issues
        - Individual email correspondence (post-interviews and post-online forums) to invite reflective review of earlier narratives
    11. Narrative interviewing at a distance
        - The power relations between tutor/evaluator and learner changed
        - The learner knew more than the interviewer about the context of the interview
        - The learner controlled the context of the interview
        - The learner questioned the interviewer
        - The interviewer exercised measured tentativeness
        - The interview became a dialogue where prompts must be given, but control over the agenda is handed over to the learner as far as possible
    12. Tutors’ responses
        - The content of narrative interviews did not ‘surprise’ the tutors much
        - The narratives provided a coherent picture – told the ‘whole story’ of the learners’ experiences in a way not normally available
        - The data did expose experiences not anticipated by tutors (e.g. ‘feeling sick’ about posting to the forum; the scale of the struggle to balance work/life/study; the fact that students participate in the middle of the working day, e.g. ‘between patients’)
        - Highlighted the need for closer engagement with students’ experiences
        - Highlighted students’ desire to ‘own’ evaluation processes, e.g. to decide when and how it is convenient to discuss their experiences
    13. Dialogue – the future for meaningful evaluation?
        The value of narrative evaluation is much more than viewing it as an effective means of eliciting rich evaluation data.
        “An alternative view of evaluation redefines it as a social practice in itself and so the emphasis becomes as much on the value of the substantive processes of evaluation as on the product (data) elicited by the evaluation”
        Russell (2008)
    14. e-VALU-ation
        “…involves not only collecting descriptive information… but also using something called “values” to (a) determine what information should be collected and (b) draw explicitly evaluative inferences from the data…
        …research can tell us “what’s so” but only evaluation can tell us “so what”.”
        Davidson (2005)
    15. Project publications
        - Daly, C. (2008) ‘Evaluation for new learning contexts – how can it be ‘fit for purpose’?’ Reflecting Education 4 (1) 127–138.
        - Daly, C., Pachler, N., Pickering, J. and Bezemer, J. (2007) ‘Teachers as e-learners: exploring the experiences of teachers in an online professional master’s programme’ Journal of In-service Education 33 (4) 443–462.
        - Daly, C., Pachler, N., Pickering, J. and Bezemer, J. (2006) ‘A study of e-learners’ experiences in the mixed-mode professional master’s programme, the Master of Teaching’ WLE Centre, Institute of Education,