Dr. Myfanwy McDonald, Centre for Community Child Health, Murdoch Children’s Research Institute, The Royal Children's Hospital, Melbourne - The meaning of 'evidence' in the child and family services field


Dr. Myfanwy McDonald, Project Officer, Centre for Community Child Health, Murdoch Children’s Research Institute, The Royal Children's Hospital, Melbourne, delivered the presentation "The meaning of ‘evidence’ in the child and family services field" at the Child Protection Forum 2013.

She talked about the concept of "quality" evidence, considering effective programs and processes, the tension between the concept of 'gold standard' evidence and continuous quality improvement methodologies, and the way forward for evaluation and research in the child and family services field.

Find out more at http://www.informa.com.au/childprotectionforum2013


Transcript

  • 1. Community Child Health research group | Murdoch Childrens Research Institute located at The Royal Children’s Hospital Centre for Community Child Health The meaning of ‘evidence’ in the child and family services field Myfanwy McDonald & Tim Moore
  • 2. The elephant in the room: CRFAE The conference-related Friday afternoon effect
  • 3. Posing questions rather than providing answers “Questions are more transformative than answers… Questions create the space for something new to emerge… Answers… while satisfying, shut down the discussion.” Peter Block
  • 4. The focus of our work Early intervention Universal prevention Source: Protecting Children is Everyone’s Business (2009)
  • 5. Why is the question of evidence important? • Programs and initiatives designed to improve outcomes for children often have only modest effects - or they don’t work at all so • We need to have a better understanding about what works in order that our resources are more effectively and efficiently utilised and for this • We need evidence to demonstrate what works and • We need evidence derived from rigorous, valid, reliable research designs but • There are inherent challenges in this
  • 6. Questions guiding this presentation • What are the challenges? • Is there a new way (or ways) of thinking about the issues pertaining to evidence? • Where might we go from here?
  • 7. Outline • Highlight 3 issues: 1. getting the evidence we need 2. implementation 3. ‘program-centred’ mindset • Highlight some questions for each • Some concluding considerations
  • 8. Issue 1: Getting the evidence we need
  • 9. Why are Cochrane reviews so boring? “Five thousand (mostly) high-quality Cochrane reviews notwithstanding, the troubling aspect of this enterprise is not the few narrow questions that the reviews answer but the many broad ones they leave unanswered...
  • 10. “... The reason why Cochrane reviews are boring — and sometimes unimplementable in practice — is that the technical process of stripping away all but the bare bones of a focused experimental question removes what practitioners and policymakers most need to engage with: the messy context in which people get ill, seek health care (or not), receive and take treatment (or not), and change their behaviour (or not).” Greenhalgh, 2013
  • 11. What is the real problem? • The people doing the research aren’t asking the right questions? • The evidence isn’t translated in a way that can be utilised in practice? • We’re trying to bring order to something that is fundamentally chaotic / disorganised (i.e. the ‘messy’ context)?
  • 12. Issue 2: Implementing the findings of research
  • 13. The rise of Implementation Science • Deliberate, purposeful attempt to implement evidence-based interventions into practice • Typically involves a purveyor who oversees the process of implementation • Balancing fidelity with flexibility – delivering it as intended, adapting it to the unique context
  • 14. Challenges of Implementation • In organisations with no capacity for ongoing implementation, whose responsibility is it to keep up to date with the evidence? • Practitioners? • Managers? • Organisational level? • Where does the boundary between fidelity and flexibility lie? Who makes the decision?
  • 15. Alternatives to Implementation Science • Continuous Quality Improvement (CQI): • The point at which practitioners are engaged is at the point a problem is identified • Working through the problem together • Using ‘localised’ data • Rapid implementation of a practice change • Review outcomes • Adjust as required
  • 16. So what? • Is one approach superior? • Challenges with CQI, e.g.: • may encourage a focus on small practice issues rather than child and family outcomes • Is one more suited to the ‘messy’ context of practice? • Which one should be used when?
  • 17. Issue 3: The ‘program-centred’ mindset
  • 18. The ‘program-centred’ mindset • Focusing on programs as the ‘answer’ to poor outcomes amongst children and families • Program centred mindset as opposed to a focus on: - Process: how programs are delivered (rather than what program is delivered) - Surrounding contextual factors (e.g. service system structure, community environments, government policies)
  • 19. Why do we like programs? • Programs are easier to evaluate using gold standard methodologies • The evidence therefore is easy to interpret and compare • We don’t have the resources / time / capacity to focus on the bigger issues (e.g. government policies that impact negatively upon families) • It’s not our role to focus on the bigger issues • Touching upon the bigger issues is risky – who will we get ‘offside’?
  • 20. Questions • Could it be that programs have typically moderate effects because of the impact of surrounding contextual factors? • If we continue to focus on programs rather than processes and broader contextual factors will they only ever have moderate effects? • Whose role is it to address the broader contextual issues that impact upon children and families? Child and family services? NGOs? Policy-makers? Government? Advocacy groups? • How much is the nature of the evidence (and our views about ‘gold standard’ evidence) driving our interest in programs?
  • 21. Where might we go from here?
  • 22. For research synthesis projects • A ‘realist approach’ to research synthesis: • systematic reviews of RCTs and • broad based review of research, theory, practice-based evidence from a range of different disciplines • Doesn’t discount the importance of gold standard evidence but also takes into account a broader range of ‘evidence’
  • 23. Questions for consideration • How else can we make research more relevant to practitioners and policy-makers? How to bridge the gap between science and the ‘messy context’? • What role should practitioners play in: • identifying the problems? • coming up with ‘localised’ solutions? • How do we keep processes and broader contextual factors ‘on the table’? Whose responsibility is that?
  • 24. Contact details Myfanwy McDonald Senior Project Officer The Royal Children’s Hospital Centre for Community Child Health P: (03) 9345 4463 E: myfanwy.mcdonald@mcri.edu.au For CCCH Research & Policy papers see: http://www.rch.org.au/ccch/resources_and_publications/