Researching learners' digital experiences


Presentation for LIDU Seminar at the OU, 14 October 2010

  • The recognition that learners use numerous personal and social technologies beyond those recommended by tutors was termed the 'underworld' by Linda Creanor and colleagues in their report of interviews with 52 adult learners (Creanor et al., 2006). Their choice of terminology exemplified the extent to which their research approach had revealed aspects of learners' lives which had not previously been observed in the course evaluations typical of the time. The unexpected uses of technology included turning to the Internet as the first port of call for information and the pervasive use of social networking tools. Such findings have since been replicated in larger samples (the JISC Student Expectations and Great Expectations surveys; annual surveys at Edinburgh and Oxford; Melville, 2009). This has led to a desire to find out exactly how learners are making use of the technology available to them, if not in the ways their teachers and tutors expect. Creanor et al. used interviewing with a very open set of questions, analysed carefully by a team of three using IPA. Phase 2 used these tools. Nothing surprising here, except something that might seem obvious but needs mentioning: learner experience research needs methods which capture and retain student voices. But it is the approach that underpins those methods that is more interesting...
  • Well, first up, we found that context is incredibly important. Everyone has been saying that for a while, but 'context' used to be seen as the course (when research = course evaluation) or, more recently, as the VLE. If you do holistic research, you find the different learning environments learners are operating in and, for some (agile adopters), those they are creating. But even for the agile adopter, that is context dependent: they might be expert in one environment but not in the next. The only way to see this is by close examination of a person. You can't get at it from a survey or even from a single interview. THIS IS IMPORTANT when you are concerned, as we are, with learner development and teaching practice. This careful understanding is so important when faced daily with this kind of *&?! (Charles Wankel's new book)
  • Also see STROLL’s guidelines for induction
  • So, in summary, the benefits of conducting holistic, participatory research are (while remembering LexDis's warning that not all learner experience research is participatory): 1. It uncovers hidden practices: we found situated learning practices and a few examples of creative appropriation (not many).
  • Second, it improves impact. We saw student-produced outputs, e.g. LexDis, which help us to work out not just what the experience is but how we should respond as well. The SLiDA case studies are examples of impact. And we are starting to produce conceptual accounts which arise from the actual learner experience, but these have some way to go, I think.
  • Sampling. The classic problem with sampling in online research is that you don't know quite who you are talking to. Our research shows that this doesn't always have to be a problem. Most of our projects used purposive sampling: you want the articulate, reflective, skilled learners, but there are issues about representativeness. E.g. Dujardin talks about the need to have a 'key informant' willing to engage in reflective conversations with tutors, to help us all understand participation patterns in an online learning community. Email interviewing, for example, has already been noted to have problems with providing data only from IT-literate individuals who have a preference for the written word (Hunt & McHale, 2007). Do note, though, that attrition tends to be better with participatory approaches: Thema, LexDis and LEaD all exceeded their expected numbers.
  • Elicitation. In qualitative research, it is time-consuming and disheartening to spend time conducting and transcribing interviews and not get the data you needed. The methods described here help students to feel part of the research project, to understand what is wanted from them, and to have opportunities to clarify, question and validate their responses. The researchers' personalised requests for information, and their conversational style and replies, were important in eliciting the data needed. Elicitation is also helped by having artefacts to prompt conversation, e.g. interview plus and card sorts, although Towle & Draffan (2009, Greenwich presentation) note that interview plus takes time for the researcher: they advise seeing the artefacts in advance, and researchers need to understand the context in which the artefact was produced in order to be able to talk about it. Also popular is using students as researchers (Ainley, 2009).
  • Dealing with the data. The powerful nature of learners' voices: avoid the temptation to over-emphasise selected verbatim quotes at the expense of analysis and synthesis of qualitative data. Remember that storytelling still needs to come from a research methodology, e.g. Thema's good explanation of what case studies (after Yin, 2003) can and can't do. There is also the challenge of handling large amounts of multimedia data, e.g. Stroll's mind maps.
  • Representing learners' voices. Producing easily digestible and readable findings from large amounts of data, e.g. personalised, vivid case studies of individual learners (e.g. Thema).
  • Ethical issues. Thema: how to deal with students who let the researcher know that they were having problems. Informed consent. Thema: the difference between being anonymous and being impossible to identify; our participant information sheets have to acknowledge this danger up front. And explaining to students just how widely their middle-of-the-night study bedroom video diary will be disseminated at national conferences.

    1. Researching learners' digital experiences
       Rhona Sharpe, JISC Learner Experience Support & Synthesis projects; JISC SLiDA project; Chair, ELESIG; Oxford Centre for Staff and Learning Development, Oxford Brookes University
       Greg Benfield, JISC Learner Experience Support & Synthesis projects; JISC SLiDA project; Oxford Centre for Staff and Learning Development, Oxford Brookes University
       Helen Beetham, Consultant to JISC; JISC LLiDA project
    2. Why research learners' experiences of e-learning?
       Learners' experiences are a key measure of our success
       Technology is pervasive in learners' lives
       Learners' perspectives are surprising, demanding, innovative
       Learners have new expectations of education, thanks to technology
       Learners need new skills and strategies for the digital age
    3. (image slide, no text)
    4. Data collection methods
       Interviewing:
       • Interview plus (E4L, LexDis)
       • Email interviews, 'pen-pal' (Thema)
       • Card sorts (E4L)
       • Telephone interviews (PB-LXP)
       Diary keeping:
       • Video logs (Stroll)
       • Diaries in a format of the learner's choice (LEaD)

       Holistic
       "Liling's case study gives an insight into the challenges faced – and overcome – by an overseas student for whom English is a second language and who has had to adapt to a different educational culture from the one with which she was familiar. It also shows the importance of technology for keeping in touch with her family and friends and, conversely, the barriers that it can raise in this respect."
    9. Sustained engagement
       Students feel involved, feel part of the project (logo, contact)
       Using a flexible and responsive approach to data collection
       High involvement from academic staff in each of the disciplines
       Providing incentives and benefits
       Students get something out of the project, including the benefits of reflection on their learning
    10. Participatory approach
        "Nothing about me, without me"
        Involving learners as consultants and partners
        Early and continued participation
        Meaningful and useful outcomes
    11. What if you can engage learners over time, in holistic, participatory research?
    12. Context
        Technology use was prompted by the course, the tutor, peers, work and/or specific learning requirements.
        Learners use technology to create their own environments which meet their needs and the demands of their context.
        We noted the agility of some learners at finding and using tools, skills and social networks to support their study in creative ways.
    13. Creative appropriation
    14. Creative appropriation
        Driven by contextual or individual need, not provided by tutors, e.g.
        • 'Had a phone tutorial with my supervisor referring to a support document he emailed to me – I digitally recorded the tutorial and saved it as a digital file on my laptop. This has then been playing while I make the adjustments to the document' (Clarke 2009: 12)
        • "One of the group members was not able to make it today so what we did we were connected by using MSN Messenger so we were discussing notes. We were feeding back to the other person." (Jefferies et al. 2009: 16)

        Creative appropriation
        Blending social and academic...
        • 'Chun-Tao also blended the academic side of her life with social technology by using Facebook to find out about software and sites that would be useful for her work, like Zotero and ClickUni, which "looks something like iGoogle but it has things like Facebook [… and] College News."' (Thema case study)

        Learner-produced outputs
    16. (image slide, no text)
    17. What if you can engage learners over time, in holistic, participatory research?
        Uncover hidden practices
        • Creative appropriation
        • Agile adopters
        • Situated practices
    18. What if you can engage learners over time, in holistic, participatory research?
        Improves impact
        • Persuasive evidence
        • Learner-created outputs
        • Conceptual accounts: learner development model

        Challenges with this approach
        • Sampling
        • Elicitation
        • Dealing with the data
        • Representing learners' voices
        • Ethical issues