Studying implementation
Implementation science workshop
Hawassa University, August 21-25, 2017
E-mail: anna.bergstrom@kbh.uu.se
Today
 PROCESS EVALUATION
 THE PeriKIP PROJECT
PROCESS EVALUATION
What is process evaluation?
To understand the effects of interventions
 Allows for exploration of implementation and
change processes and the factors associated
with variations in effectiveness.
 Can examine the utility of theories underpinning
intervention design and generate questions or
hypotheses for future research.
Process evaluation is needed for us to move from
“what works” to “what works where and why”.
What is process evaluation? cont.
 Aims to understand why certain implementation strategies bring about improvement (while others don't) – illuminates mechanisms of change
 Provides an understanding of the determinants of success/failure
 Provides an understanding of the planned vs delivered intervention vs 'exposure', etc.
 Can ensure in-depth understanding from smaller studies, which can inform scale-up
 Can assist in understanding outcome heterogeneity in large-scale projects
 Commonly requires mixed methods
Process evaluation steps
 Planning: methodological expertise, appropriate
interdisciplinary mix, degree of separation
between teams and means of merging findings
 Design and conduct: describe the intervention and
causal assumptions, identify key uncertainties,
select research questions and methods, balance data collection across all sites vs selected sites, and the timing of data collection
 Analysis: descriptive quantitative information on
fidelity, dose, and reach. More detailed modelling
of variations between participants or sites.
UK Medical Research Council, 2015; Moore et al., 2015
Process evaluation steps cont.
 Describe the implementation strategy (theory)
 Clarify causal assumptions (logic model)
 Identify key uncertainties
 Identify and prioritise research questions
 Select a combination of research methods
UK Medical Research Council, 2015; Moore et al., 2015
MRC guidance for process
evaluations
UK Medical Research Council, 2015; Moore et al., 2015
MRC guidance for process
evaluations cont.
 Clearly defining the components of the
intervention (implementation strategy)
- Naming
- Defining
- Operationalizing (Actor, Action, Action targets, Temporality, Dose, Implementation outcome addressed, Theoretical justification); a specification sketch follows below
Proctor et al., 2013
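As a hedged illustration of the Proctor et al. headings above, here is a minimal sketch of how a single strategy could be specified in that format. The strategy is modelled on the facilitated stakeholder meetings described later in these slides, but every field value below is illustrative rather than quoted from the project.

```python
# Hypothetical specification of one implementation strategy, structured along
# the reporting headings recommended by Proctor et al. (2013).
strategy_specification = {
    "name": "Facilitated local stakeholder group meetings",
    "definition": "Monthly facilitator-led meetings in which local staff identify "
                  "perinatal care problems and plan actions using PDSA cycles.",
    "actor": "Trained local facilitator",
    "action": "Convene and facilitate a structured problem-solving meeting",
    "action_target": "Commune health centre staff and village health workers",
    "temporality": "Monthly, throughout the intervention period",
    "dose": "One meeting per site per month (illustrative)",
    "implementation_outcome": "Adoption and fidelity of evidence-based perinatal care",
    "theoretical_justification": "Facilitation as conceptualized in i-PARIHS",
}

for heading, value in strategy_specification.items():
    print(f"{heading}: {value}")
```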
MRC guidance for process evaluations cont.
 Fidelity: whether the intervention was delivered as intended
 Dose: the quantity of the intervention (strategy) implemented
 Reach: whether the intended audience came into contact with the intervention, and how
 Adaptations made
MRC guidance for process evaluations cont.
 Mechanisms through which interventions bring about/trigger change
– How the effects of the specific intervention occurred
– How these effects might be replicated by similar future studies
MRC guidance for process evaluations cont.
 Anything external to the study that may act as a barrier or facilitator to the implementation of the strategy, or to its effects
 Understanding context is critical for interpreting the findings and for understanding their generalizability
 Interaction with context can be complex even with 'simple' interventions
Key terms
 Reach: Characteristic of the target audience.
 Dose delivered: A function of efforts of the
providers of the intervention.
 Dose received: The extent to which participants engage with the intervention.
 Fidelity: The extent to which the intervention was delivered as planned.
 Recruitment: Procedures used to approach and attract participants.
Linnan and Steckler, 2002
Key terms cont.
 Implementation: A composite score that indicates the extent to which the intervention (/implementation strategy) has been implemented and received by the intended audience (see the sketch below).
 Context: Aspects of the larger social, political, and economic environment that may influence the implementation of the strategy.
Linnan and Steckler, 2002
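Linnan and Steckler treat implementation as a composite of the other process measures. Below is a minimal sketch of how such a site-level composite might be computed, assuming each component has already been scored on a 0–1 scale; the equal weights and example values are illustrative choices, not part of the framework.

```python
# Illustrative composite implementation score for one site, assuming reach,
# dose delivered, dose received and fidelity are each already scored 0-1.
def implementation_score(reach, dose_delivered, dose_received, fidelity,
                         weights=(0.25, 0.25, 0.25, 0.25)):
    components = (reach, dose_delivered, dose_received, fidelity)
    return sum(w * c for w, c in zip(weights, components))

# Hypothetical monitoring data for one commune health centre.
print(implementation_score(reach=0.80, dose_delivered=0.90,
                           dose_received=0.70, fidelity=0.85))  # -> 0.8125
```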
For whom and how are process
evaluations helpful?
For researchers and policy makers
 Explaining success (Will outcomes be similar in
other contexts? How can the effects be
replicated?)
 Explaining failure (Is it due to the intervention or
to poor implementation?)
 Does the intervention have different effects on
subgroups?
For systematic reviewers
 Understanding the nature of intervention and
implementation heterogeneity
Methods: Observations
 Person, camera or audiotape
 Not intrusive, i.e. not altering behaviour
 Can capture performance
 Takes time to learn
 Can provide data difficult to analyze
 The less structured the observation and the more complex the change, the harder the analysis
 Important to train observers
Methods: Self-reports and
documentation
 Interviews and questionnaires
 Periodic or retrospective (periodic collection likely yields more reliable data)
 Collect data shortly after the implementation has
occurred
 Program records and documentation
Cost evaluation
 The set-up cost
 Running costs, e.g. time used by the people involved, materials, and any purchased equipment
 Changes because of the intervention, e.g. more use of health care -> changed costs for health providers and health consumers
 Relate the cost of the intervention to, for example, life years saved or some other outcome (see the sketch below)
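As a minimal sketch of the last bullet, the example below computes an incremental cost-effectiveness ratio (extra cost per extra life year saved); all figures are invented for illustration.

```python
# Illustrative incremental cost-effectiveness ratio (ICER).
def icer(cost_intervention, cost_control, effect_intervention, effect_control):
    """Extra cost per extra unit of effect (e.g. per life year saved)."""
    delta_cost = cost_intervention - cost_control
    delta_effect = effect_intervention - effect_control
    if delta_effect == 0:
        raise ValueError("No difference in effect; the ICER is undefined.")
    return delta_cost / delta_effect

# Hypothetical totals per district: set-up plus running costs, and life years saved.
print(icer(cost_intervention=120_000, cost_control=80_000,
           effect_intervention=950, effect_control=930))  # -> 2000.0 per life year saved
```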
The PeriKIP project
The Perinatal Knowledge Into Practice
(PeriKIP) project
Objective: To test the
feasibility of a multilevel
health system intervention
applying participatory
groups to improve perinatal
health.
Setting: Nguyen Binh, Ha Quang and Phuc Hoa districts in Cao Bang province, Vietnam
PeriKIP cont.
[i-PARIHS framework diagram: Innovation, Recipients, Inner context, Outer context, Facilitation]
Initiation of monthly facilitated local stakeholder group meetings at:
 48 commune health centers
 3 district hospitals
 1 provincial hospital
The Plan-Do-Study-Act
 Plan: make a clear plan for
actions targeting a certain
problem including assigning:
o who is responsible
o a deadline for
undertaking the
planned action
o expected outcome of
the action
[Plan-Do-Study-Act cycle diagram]
The Plan-Do-Study-Act cont.
 Do: carry out the planned action.
The Plan-Do-Study-Act cont.
 Study: compare the expected
outcome of action with the result
of the action.
The Plan-Do-Study-Act cont.
 Act: discuss the lessons learnt
relating to the action and its
outcome including taking
decisions on refinements,
adjustments or how to support
the action to become routine.
PeriKIP cont.
 Before and after study applying process evaluation
 Knowledge survey:
- Focusing on antenatal care and postnatal care
- Client cases using vignettes on monitoring of labour, management of pre-eclampsia and haemorrhage
 Observation of:
- Antenatal care visits in a sub-sample of units
- Labour and immediate postnatal care
 Village Health Workers will collect data on pregnancies and birth outcomes using mobile phones
PeriKIP cont. Logic Model - Identify
key uncertainties
PeriKIP cont.
 Qualitative data collection
- Heterogeneous groups
- Homogeneous groups
- Facilitators
- Key informants
 Context assessment using the COACH tool
 Facilitators' guide (content)
 Supervisors' guide (content)
 Facilitators' diaries
 Supervisors' diaries
 Meeting summary (attendance, meeting time,
etc)
 Problem identification list
Assignment (only a few examples!)
 Could your study include aspects around how a certain
intervention was implemented?
 Would it make sense to investigate the fidelity with which an intervention was implemented across your study setting? The reach? The recruitment?
 Is there already a logic model for your piece of the larger
program? If not – would that be helpful to have?
 Are recipients' perceptions of the intervention known?
 Are there reasons to think that the intervention is taken up
to different degrees at different sites – and if so would it be
interesting to try to find out why?
References process evaluation
1. Saunders et al. Developing a Process-Evaluation Plan for Assessing Health Promotion
Program Implementation: A How-To Guide. Health Promot Pract 2005 6: 134
2. Linnan L, Steckler A. Process evaluation for Public Health Interventions and Research.
San Francisco: Jossey-Bass; 2002.
3. Grant et al.: Process evaluations for cluster-randomised trials of complex interventions:
a proposed framework for design and reporting. Trials 2013 14:15.
4. Medical Research Council: Developing and Evaluating Complex Interventions: New
Guidance. London: Medical Research Council; 2008.
5. Hawe P, Shiell A, Riley T, Gold L: Methods for exploring implementation variation and
local context within a cluster randomised community intervention trial. J Epidemiol
Community Health 2004, 58:788–793.
6. Durlak J: Why programme implementation is so important. J Prev Interv Community 1998,
17:5–18.
7. Lewin S, Glenton C, Oxman AD: Use of qualitative methods alongside randomised
controlled trials of complex healthcare interventions: methodological study. Br Med J
2009, 339:b3496.
8. Hulscher M, Laurant M, Grol R: Process evaluation of quality improvement interventions.
In Quality Improvement Research; Understanding The Science of Change in Healthcare.
Edited by Grol R, Baker R, Moss F. London: BMJ Publishing Group; 2004:165–183.
9. Cook T: Emergent principles for the design, implementation, and analysis of cluster-
based experiments in social science. Ann Am Acad Pol Soc Sci 2005, 599:176–198.
10. Bonell C, Oakley A, Hargreaves J, Strange V, Rees R: Assessment of generalisability in trials of health interventions: suggested framework and systematic review. Br Med J 2006, 333:346.
Types of evaluations
 Effect evaluation
– Design
– Primary outcome, secondary outcomes
 Cost evaluation
 Process evaluation
Studying implementation
 Specific strategy (or a set of strategies)
– e.g. reminders
 Being used in a specific domain of healthcare
– e.g. rational prescription of antibiotics
 In a specific setting
– e.g. amongst physicians in primary health care
OTHER PROCESS EVALUATION
FRAMEWORKS
RE-AIM (re-aim.org)
 Reach is the absolute number, proportion, and representativeness
of individuals who are willing to participate in a given initiative.
 Effectiveness is the impact of an intervention on outcomes,
including potential negative effects, quality of life, and economic
outcomes.
 Adoption is the absolute number, proportion, and
representativeness of settings and intervention agents who are
willing to initiate a program.
 Implementation refers to the intervention agents’ fidelity to the
various elements of an intervention’s protocol. This includes
consistency of delivery as intended and the time and cost of the
intervention.
 Maintenance is the extent to which a program or policy becomes institutionalized or part of routine organizational practices and policies. Maintenance also has referents at the individual level.
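A minimal arithmetic sketch of the reach and adoption proportions defined above; all counts are hypothetical.

```python
# Illustrative RE-AIM reach and adoption proportions computed from simple counts.
eligible_individuals = 1200   # target population (hypothetical)
participants = 780            # individuals who took part

settings_approached = 52      # e.g. facilities invited to run the program (hypothetical)
settings_adopting = 48        # facilities that initiated the program

reach = participants / eligible_individuals          # 0.65
adoption = settings_adopting / settings_approached   # ~0.92
print(f"Reach: {reach:.0%}, Adoption: {adoption:.0%}")
```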
DESIGNS
Quantitative evaluation designs
 Randomized controlled trials, randomized at the level of individual patients
 Cluster randomized controlled trials, randomized at the level of clusters of patients/persons, e.g. professionals, hospitals or communities
 Uncontrolled before and after studies
 Controlled before and after studies
 Time series designs
Cluster randomized controlled trials
Types of analysis
 Cluster level analysis – the cluster is the level of
randomization and analysis
 Alternative models which can incorporate hierarchical data (see the sketch below):
– Pregnant women (level 1) – covariates such as age
– Cared for by midwives (level 2) – covariates such as working experience
– Nested within practices (level 3) – covariates such as size of the hospital
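A minimal sketch of the hierarchical analysis outlined above, assuming individual-level data with a cluster identifier. The file and column names are hypothetical, and for simplicity the model uses a single random intercept for the unit of randomization rather than a full three-level specification.

```python
# Minimal sketch of a mixed-effects analysis for a cluster randomized trial.
import pandas as pd
import statsmodels.formula.api as smf

# One row per pregnant woman; columns: outcome, intervention (0/1), age,
# midwife_experience, practice_id (hypothetical names).
df = pd.read_csv("perinatal_outcomes.csv")

# A random intercept per practice (the unit of randomization) accounts for clustering;
# individual- and cluster-level covariates enter as fixed effects.
model = smf.mixedlm("outcome ~ intervention + age + midwife_experience",
                    data=df, groups="practice_id")
result = model.fit()
print(result.summary())
```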
Cluster randomized controlled trials
cont.
 Two-armed trials: control vs intervention
(/implementation strategy)
 Multiple arm trials: control vs intervention A (/implementation strategy A) vs intervention B (/implementation strategy B)
Cluster randomized controlled trials
cont.
 Two-armed trials
 Multiple arm trials
 Factorial designs: allow two randomized trials to be conducted with the same sample size as a two-arm trial (see the sketch below)
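A small sketch of the 2×2 factorial logic referenced above: every cluster is randomized on both factors, so the comparison for each strategy uses the full sample. The strategy names are hypothetical.

```python
# Illustrative 2x2 factorial allocation: each cluster receives one level of each factor.
from itertools import product

factor_a = ["no reminders", "reminders"]               # hypothetical strategy A
factor_b = ["no outreach visits", "outreach visits"]   # hypothetical strategy B

for arm in product(factor_a, factor_b):
    print(arm)

# Strategy A is evaluated by comparing all 'reminders' arms with all 'no reminders'
# arms (and likewise for strategy B), so both trials use the whole sample.
```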
Cluster randomized controlled trials
cont.
 Two-armed trials
 Multiple arm trials
 Factorial designs
 Stepped-wedge design: all clusters receive the intervention – the time at which each cluster initiates the intervention (/implementation strategy) is randomized (a schedule sketch follows the diagram below).
[Stepped-wedge schedule diagrams: 20 clusters cross from baseline to implementation to sustainability in four wedges, with monthly time points over 2017–2021]
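A minimal sketch of how the randomized intervention start times in a stepped-wedge design might be generated; the 20 clusters and 4 wedges mirror the schedule above, but the code itself is illustrative.

```python
# Illustrative stepped-wedge allocation: randomize which wedge (start time)
# each cluster belongs to.
import random

clusters = [f"cluster_{i:02d}" for i in range(1, 21)]
n_wedges = 4

random.seed(42)           # fixed seed so the allocation is reproducible and auditable
random.shuffle(clusters)  # random order determines each cluster's wedge

wedge_size = len(clusters) // n_wedges
schedule = {w + 1: clusters[w * wedge_size:(w + 1) * wedge_size] for w in range(n_wedges)}

for wedge, members in schedule.items():
    print(f"Wedge {wedge} (crosses to the intervention at step {wedge}): {members}")
```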
Quasi-experimental designs
 Quasi-experimental studies often are conducted
where there are practical and ethical barriers to
conducting randomized controlled trials.
 The three most commonly used designs in
guideline implementation studies:
 uncontrolled before and after studies
 time series designs
 controlled before and after studies
Uncontrolled before and after
studies
 Measure provider performance before and after
the introduction of an intervention (e.g.
dissemination of guidelines) in the same study
site(s)
 Any observed differences in performance are
assumed to be due to the intervention.
 Relatively simple to conduct, but intrinsically weak evaluative designs (secular trends or sudden changes make it difficult to attribute observed changes to the intervention)
 NB. Risk of Hawthorne effect
Time series designs
 Attempt to detect whether an intervention has
had an effect significantly greater than the
underlying trend.
 Useful in guideline implementation research for
evaluating the effects of interventions when it is
difficult to randomize or identify an appropriate
control group (e.g. dissemination of national
guidelines or mass media campaigns).
 Increase the confidence with which the estimate
of effect can be attributed to the intervention.
Time series designs cont.
 Data are collected at multiple time points before and after the intervention: the multiple time points before the intervention allow the underlying trend to be estimated, and the multiple time points after the intervention allow the intervention effect to be estimated while accounting for that trend.
Time series designs cont.
 The most important determinant of the analysis technique is the number of data points prior to the intervention (providing a stable estimate of the underlying trend)
 Rule of thumb: 20 data points before and 20 data points after the intervention, to allow full time series modelling
 Often difficult to collect sufficient data points unless routine data sources are available (a regression sketch follows)
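A minimal sketch of a segmented (interrupted) time series regression along the lines described in these bullets; the data file, column names and intervention month are hypothetical.

```python
# Minimal sketch of a segmented regression for an interrupted time series.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_indicator.csv")   # columns: month (1..N), rate (hypothetical)
intervention_month = 21                     # first month after the intervention (illustrative)

df["time"] = df["month"]                                          # underlying trend
df["post"] = (df["month"] >= intervention_month).astype(int)      # level change after intervention
df["time_after"] = (df["month"] - intervention_month + 1).clip(lower=0)  # change in trend

# 'post' estimates the immediate level change and 'time_after' the change in slope,
# both relative to the pre-intervention trend captured by 'time'.
model = smf.ols("rate ~ time + post + time_after", data=df).fit()
print(model.summary())
```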
Controlled before and after studies
 A control population with similar characteristics and performance to the study population (and thus expected to experience secular trends or sudden changes similar to those in the study population) is identified
 Data are collected in both populations, using similar methods, before and after the intervention is introduced in the study population
 A 'between group' analysis comparing performance in the study and control groups following the intervention is undertaken (see the sketch below)
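One common way to run the 'between group' comparison while also using the baseline data is a difference-in-differences model; the sketch below assumes individual-level performance data with group and period indicators, and all names are hypothetical.

```python
# Minimal sketch of a difference-in-differences analysis for a controlled
# before and after study.
import pandas as pd
import statsmodels.formula.api as smf

# Columns: performance, study_group (1 = study, 0 = control), post (1 = after intervention).
df = pd.read_csv("provider_performance.csv")

# The interaction term estimates the intervention effect over and above both the
# baseline difference between groups and the shared secular trend.
# (In practice standard errors would also be adjusted for clustering.)
model = smf.ols("performance ~ study_group * post", data=df).fit()
print(model.params["study_group:post"])
```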
Controlled before and after studies cont.
 Well designed controlled before and after studies should protect against secular trends and sudden changes
 Often difficult to identify a comparable control group
 Even in well-matched groups, baseline differences are common