Thank you for inviting me to interview. As requested, I am going to present on what I believe are the main challenges in undertaking educational research and training evaluation. This presentation will last no more than ten minutes; however, I have more to say on the topic, drawn from my wide experience in research, so I hope we can explore that further in follow-up questions.
To outline the presentation: I am going to speak about what I believe are the three central challenges in undertaking educational research and explain how each of these can be overcome, or at least well managed. First, I will explain the challenge posed by lacking clarity on the aims and central questions of the research. Secondly, I will address the challenge of using appropriate research methods to actually answer your central research question. Finally, I will speak about the challenges of effective data analysis and presentation.
A critical challenge for any research project is setting a clear and achievable aim, which can then be fulfilled by specific and answerable questions. This is particularly important for educational research and training evaluation, where determining which aspects to assess or evaluate will very likely mean that assessing other aspects is not possible. Even simple projects will have multiple stakeholders and can easily suffer from too many or over-ambitious research questions, loading the project with conflicting expectations.

The second aspect to consider is how achievable and answerable each research question is. For example, a project might try to determine the impact of a training programme on learners. However, the impacts of a training course are very difficult to determine in isolation and to disentangle from other effects away from the course. Within training evaluations it would be very unusual to have a comparison group who did not attend the training, or large-scale quantitative data to determine differences. Therefore some compromise on what can be credibly ascertained, and how, is to be expected. For example, within my prior research work I was involved in a project to determine the impact of a soft skills training course for workplace learners. The project accepted that, given the elusive nature of this impact, it would have to be evaluated via the views of the learners themselves and their line managers, rather than any objective measure.

Challenges around central project aims can be managed by a few strategies. Firstly, to set a central research question that can actually be answered, it is best to remain limited and specific: attempting to answer large questions, or too many questions, stretches credibility. This focussed approach requires discussions with stakeholders to agree not only what you are going to do but also what you are not going to explore in the project.
This dovetails into a careful consideration of the time, methods and resources available to you. For example, within my most recent research project evaluating the soft skills course, it was possible to conduct interviews with learners after their course, with a follow-up interview at six months, but not at 12 months due to the timescale of the project.
The second challenge I want to highlight is the appropriate selection of methods. Fundamentally, the methods selected should be those that answer the research questions. However, many researchers fall into the trap of using methods they have used before and are comfortable with, and therefore struggle to adapt the data they gather to be relevant to their research project. Some methods may discourage participation in the project due to the time required, or may fail to capture the data from those who do participate because they are inappropriate for a particular topic. For example, long-form one-on-one interviews are suitable for personal and sensitive topics. However, they may not be the best match for evaluating a training course, which is a more everyday experience and one people may not have reflected upon. In my last project I found some interview participants could speak about their biography and career at length but did not have much to say about their recent training course. For these participants I would have liked to organise a focus group or a group interview where participants could share, compare and contrast experiences, generating discussion between themselves which they could not in isolation. To manage these problems, researchers should ask whether the methods can truly answer the question posed. The design should come out of the study, rather than being imposed on the study. Matching methods to aims may require imagination, the creativity to use a new method, or the flexibility to adapt an existing one. Researchers should also consider the practical aspects of participation: what the experience of participating would be, and what data it will generate.
The final challenge I present here is the analysis and presentation of findings. Within quantitative methods there is often a danger of using overly complex analysis for which the collected data is unsuitable. With qualitative data, researchers might find that they have too much data to review and summarise into concise findings. There is also the possibility that the presented research does not communicate to its intended audience, by being too long, too confusing or inaccessible. To move quantitative analysis away from overly complex methods, the project may need to refocus upon its aims and perform only the analysis that will meet them. Remember, too, that complex analysis performed poorly is weaker than simple analysis performed well. To manage large amounts of qualitative data, researchers should analyse data as it is collected, so that analysis is a continuing and much more manageable process. To ensure successful dissemination, researchers should present findings through different avenues, supplementing long reports with shorter versions, videos posted on YouTube, presentations and slideshows, and short leaflets that speak simply to targeted audiences.
To conclude, challenges in educational research can be overcome by:
Clear aims, matched with appropriate research methods
A practical and focussed approach to methods, including consideration of what participation involves
Analysis continually centred on the research questions, with findings presented to communicate
Classic trap of researchers using the methods they are most comfortable with and have used before.
NES Patient Safety initiative; measure reduction in adverse events
Significant event analysis training or patient data training
Person-centred care, persuasion and negotiating skills
Checklists
To outline the presentation: I am going to speak about what I believe are the three central challenges in undertaking educational research and explain how each of these can be overcome, or at least well managed. First, I will explain the challenge posed by lacking clarity on the aims and central questions of the research. Second, I will argue that matching the appropriate research method to your central question is critical to success, particularly in enabling the best type of participation and data capture, and explain why this is particularly important for educational research. Finally, I will speak about the challenges of making the most of research by effective data analysis and presentation.
What are the main challenges in undertaking Educational Research and Training Evaluation and how can these be overcome? Matt McGovern, 20th August 2012
Key challenges
Three challenges:
Aims and central questions: lacking clarity
Methods: appropriate to answer questions
Data analysis and presentation: errors and communication
1) Aims and central questions
Unclear or conflicting project aims and questions
Expectations of different stakeholders
How achievable is the project? How observable are results/impacts?
The managed solution
A question that can be answered: limited and specific
Aims widely discussed and agreed upon
Time, methods and resources available
2) Research methods
Methods inappropriate for research aims
Methods used are those most comfortable with
Methods not appropriate for the topic under study and capturing relevant data
The managed solution
Careful consideration of methods suitable for the question posed
Imagination, creativity and flexibility
Practical considerations of participation
3) Data analysis and presentation
Analysis and presentation challenges: using overly complex analysis, repurposing data, or too much data
Findings do not communicate to intended audience
The managed solution
Refocus analysis upon agreed aims
Flexible approach to dissemination
Conclusion
Clear aims, matched with appropriate research methods
Practical and focussed approach to methods, including consideration of what participation involves
Analysis continually centred on research questions, with findings presented to communicate
Challenges
Structured and focussed but flexible
Quality: capture experiences of the course, or actually definitively state something; honestly, it might be a tick-box exercise, or participants instructed to be positive. Get feedback into the report
Methods matched to answer the question; qualitative methods to explore relationships, but must start with a specific question.
Purpose of educational research/training evaluation
Answering a question
Evaluation: determine the level of merit
Should always be a way to capture experiences, views, incidents
Time and resources
Time of your research team
Time of your research participants
Quick questionnaire
Classic project management
Identify all stakeholders and their objectives
Define the questions/the scope of the study
Identify the methods that will actually answer the questions/meet the objectives of stakeholders
Plan, consider risks and alternative plans
Implement and review
Quantitative challenges
Asking what you don’t know you didn’t know
Biased by participation: online, email, not comfortable
Include at end of course, but less time for reflection
Pressure to go too complicated and into inferential statistics
Qualitative challenges
Participation: needs time investment
Validity: sum of your questions actually captures
Challenges
What you don’t know you don’t know
Participants: biases. Those who are really passionate, either positively or negatively. Difficult to research those who are indifferent, didn’t complete the course, or those who found it a waste of time
Overcome
What you don’t know you didn’t know: hard, contrast and black-and-white questions, not exploring relationships, are valuable. How many people completed the course?
Participant observation: see group dynamics, the actual course
Not intensely personal: use a focus group/group interview; one-on-one interviews for the personal and biographical
Looking for effects: longitudinal aspect
Overcome: focussed but flexible
Initial interviews with key informants to inform
Questionnaire: short and focussed, answering simple questions
Focus group/group interviews about the course experience, allowing the group to stimulate discussion
Altering experiences
Participants produced stronger views than they otherwise would have
Tell the interviewer what they want to hear
Participation
Too busy to take part
How is your sample biased?
Those who had a bad experience at a training may not want to take part in any research about it: “a waste of time, I stopped attending”
Those who had a good or terrible experience respond; the mildly mediocre viewpoint gets lost
Quantitative
Questionnaire too long
What you don’t know you didn’t know: missing the
Too focussed upon narrow indicators
Disentangling effects
Graduates from a training programme may go on to achieve an objective
Control group which is comparable
Overcome: small and specific
Specific about what you are evaluating
More than just “did they like the course”
Flexible to overcome
Initial qualitative interviews with key informers; questionnaire to gain hard data, even from those who …, so you are getting a broad, inclusive picture; practical questions such as “did you complete the course”, or questions answered in a black-and-white way, such as “did you continue with the study afterwards”; use questionnaire findings to go deeper and more specific, with wider but more aware questions than before, with participants and with key informers (those who designed and delivered the course)
Key challenges
Three challenges:
Aims and central questions: clear, specific and achievable
Methods: appropriate participation and data capture
Data analysis and presentation: errors and communication