How can we effectively present evidence for policymakers?
Gai Moore
HARC Scholarship, December 2013
Using evidence from research: what’s the problem?
• Evidence can inform decision making and improve health care
• Policy makers and program managers want to use evidence
• Some challenges:
– Finding research that is relevant
– Assessing the quality of research
– Applying it to their policy context
– Interpreting research ‘language’
• Clear, timely and relevant research
What do people think would help?
• Easier access to research
• Relevant, useful, applicable
• Summaries and syntheses of research
• Engaging users in prioritising research
• What has been tested to date?
– Access to online registries of research
– Tailored, targeted messages
– Summaries of systematic reviews
– So … what methods are people using?
My HARC program
• Purpose: to identify alternative approaches to presenting evidence from research
• Explore the policy context, the methods used, and their effectiveness
• Consider how we might maximise the benefits of our approach
• Program:
– EUPHA Conference and workshop (Brussels)
– RURU (University of St Andrews) workshop
– NICE (London)
– Health Evidence (McMaster University)
– Health Systems Evidence (McMaster University)
NICE Guidance
• System wide focus
• Systematic, transparent, authoritative
• Stakeholders engaged throughout
• Syntheses and summaries of reviews
• 2-3 year timeframe
• Researchers: in house
Role:
• Lead and coordinate
Products:
• Guidance
• Evidence summaries
Dissemination:
• Web publication
• Email alerts
“It’s all about transparency. Stakeholders have to see how we arrived at the recommendations.”
Health Evidence
• Systematic reviews (SRs) of effectiveness
• Transparent, consistent guidance
• Targets policy makers and program managers
• Automated system
• Summaries of reviews
• Tools to support implementation
• Monitors site navigation
• Researchers: In house
Role:
• Broker knowledge
Products:
• Registry of SRs
• Tools, resources, links
• Webinars
• Capacity building
Dissemination:
• Web interface
• Targeted email notifications
• Social media
“We broker knowledge”
Health Systems Evidence
• SRs of health systems research
• Systematic, action-oriented, context-specific
• Syntheses and summaries of reviews and local data
• All stakeholders: scoping and ‘dialogues’
• 12-24 month timeframe
• Online tools to support action
• Researchers: In house
Role:
• Drive and coordinate action
Products:
• Systematic reviews
• Evidence briefs
• Deliberative dialogues
• Capacity building
• Rapid response
Dissemination:
• Web publication
• Facilitated forums
• ‘Evidence prompts’
“It’s all about the evidence … we mean the best available evidence”
Knowledge Exchange
• Rapid reviews
• Responsive, rapid, targeted
• Specific policy or program teams or agencies
• Engaged from commissioning to final product
• Formats are tailored to the agency
• 3-month timeframe
• Researchers: External
Role:
• Knowledge brokering
Products:
• ECheck rapid reviews
• EMake evaluation
• HARC e-bulletin
• Web CIPHER
• PHRP journal
Dissemination:
• Web publication
• Targeted emails
• Confidential reports
“Support decision makers to access research findings and expertise, and to use research”
Key learnings
• Product, strategy, purpose
• Responsive to emerging evidence needs
– Launch of webCIPHER
– Site navigation, content and use
– Skills and capacity building in agencies
– Introducing social media
– Building on our KE website
• Lens through which to analyse agencies’ evidence needs
– Knowledge translation strategy
Questions?

Gai Moore, Sax Institute

Editor's Notes

  • #3 Questions: Where is the evidence when I need it? When I do find it, how does it relate to the problem I want to address? How do I evaluate the quality of papers in a way that makes sense? How do I pull the findings together when the evidence is conflicting? How do I work out its applicability in my context?
  • #4 There is a lot of opinion-based literature on what people think would work: easier access to relevant research, preferably online, including information on feasibility and context; syntheses of research that pull together the findings from multiple studies or sources of evidence (TIME and COMPLEXITY); and involving users in choosing priorities for research. BUT there is little research actually measuring what works. So, in the absence of strong evidence, what are people doing? Lavis: three models of summaries tested; likes and dislikes of the structure of the summary; assessing the quality of the evidence; areas for improvement; relevance assessment tool.
  • #5 Pre-conference title: “Successful knowledge transfer: operating at multiple levels of the science-policy interface”. Organised by the EAHC (Executive Agency for Health and Consumers: programs, conferences, joint actions and grants) in cooperation with DG SANCO, WHO Europe and EUPHA. RURU (Research Unit for Research Utilisation): Sandra Nutley and Huw Davies; co-production and partnership research. National Collaborating Centre for Methods and Tools.
  • #6 National Institute for Health and Care Excellence. Guidance pulls together several systematic reviews. The principles that drive their process are a systematic process and method, and transparency, so people can trace a decision back to the evidence sources and consultations. Stakeholders, who are organisations that commission or provide healthcare, public health or social care services, include doctors, public health facilities, government decision makers, charities and local authorities. 2-3 years from question generation to end product. Clear process, defined strategy. Implementation: No. Impact assessment: No. The priority is on getting the process and the method right.
  • #7 To identify, assess, and provide access to SRs of the effectiveness of public health interventions. Automated system: a stakeholder survey identifies relevant reviews, notifies two researchers who independently quality-assess them, and sends email notifications of new and updated reviews. The principles which govern their product development are a systematic process, transparent quality assessment, and credibility. The target audience for their evidence products is public health policy makers and program managers. Systematic reviews, summaries of reviews. Timeframe: five years in the development of the system. A coding system and a search engine provide a maximum of 20-40 articles. They monitor site navigation, content use and social media. Social media: they pick up any use of their content and retweet it, thank the tweeter, or point to a review, a YouTube video or a resource on their site. NICE invests in the process and method; HE invests in the data repository and web infrastructure.
  • #8 Stakeholders: district health administrators, NGOs, professional associations. Implementation: No. Impact assessment: No. Their focus is less on how an individual agency or team uses an SR and more on high-level multi-stakeholder dialogue.
  • #9 They bring their question to the table. Face-to-face meetings as the review process begins and again at the draft report. Maximise the contribution of each to a particular policy problem. Knowledge brokers. Building skills, systems and xxx.
  • #10 It’s less about the product and more about the strategy: the right set of products and processes, so the right, relevant evidence reaches the right decision makers and managers in the most effective way for them. The strategy is built around the agency’s purpose. Are our products and services consistent with our primary purpose? Do they build on and complement our existing suite of services? Do they continue to add value for policy makers and program managers? What are their emerging evidence needs? (How conscious are we of OUR key principles of responsiveness?) The last year has seen the development of products, systems and processes that reflect some of the learning from my HARC trip. Led the development of an organisation-wide KT strategy; the primary role was exactly that: to understand the types of products and services, construct a strategy that matched their context, their stakeholders and partnerships, and be explicit about how their choices contributed to their purpose. At a personal level, the HARC scholarship has produced a fundamental shift in the way I see our work, the way we position ourselves, and the types of partnership approaches that work for us. It has helped structure my thinking around our service development, given me links to a network of researchers in my field, an understanding of why and how they have constructed their current research practice, and a sense of where my own research practice might lie going forward.