Implementing Intervention Research into Public
Policy—the "I3-Approach"
Christiane Spiel1 & Barbara Schober1 & Dagmar Strohmeier2
Published online: 27 February 2016
© The Author(s) 2016. This article is published with open access
at Springerlink.com
Abstract Evidence-based intervention programs have be-
come highly important in recent years, especially in educa-
tional contexts. However, transferring these programs into
practice and into the wider field of public policy often fails.
As a consequence, the field of implementation research has
emerged, several implementation frameworks have been de-
veloped, and implementation studies conducted. However, in-
tervention research and implementation research have not yet
been connected systematically and different traditions and re-
search groups are involved. Implementation researchers are
mostly given mandates by politicians to take on the imple-
mentation of already existing interventions. This might be
one of the key reasons why there are still many problems in
translating programs into widespread community practice. In
this paper, we argue for a systematic integration of interven-
tion and implementation research ("I3-Approach") and recom-
mend a six-step procedure (PASCIT). This requires re-
searchers to design and develop intervention programs using
a field-oriented and participative approach. In particular, the
perspective of policymakers has to be included as well as an
analysis of which factors support or hinder evidence-based
policy in contrast to opinion-based policy. How this system-
atic connection between intervention and implementation re-
search can be realized is illustrated by means of the develop-
ment and implementation of the ViSC school program, which
intends to reduce aggressive behavior and bullying and to
foster social and intercultural competencies.
Keywords Intervention research · Implementation science ·
Public policy · Integrative approach
Implementing Intervention Research into Public
Policy
Evidence-based intervention programs in educational contexts
have become highly important in recent years. However,
transferring these programs into practice and into the wider
field of public policy often fails (Fixsen et al. 2013). As a
consequence, the field of implementation research has
emerged (Rossi and Wright 1984; Ogden and Fixsen 2014).
In recent years, a growing body of implementation research
has indicated that an active, long-term, multilevel implemen-
tation approach is far more effective than passive forms of
dissemination (Ogden and Fixsen 2014). Within the field of
implementation research, several theoretical bases and
models—implementation frameworks—have been developed
(Meyers et al. 2012).
However, intervention research and implementation re-
search have not yet been systematically connected and differ-
ent traditions and research groups are involved.
Implementation researchers are mostly given mandates by
politicians to take on the implementation of already existing
interventions. Moreover, implementation research remains
rather isolated and is sometimes considered to be less scien-
tifically valuable than research that develops new interven-
tions (Fixsen et al. 2011). This might be one of the key reasons
why there are still many problems in translating programs into
widespread community practice (Spoth et al. 2013).
* Christiane Spiel
[email protected]
1 Faculty of Psychology, University of Vienna, Universitaetsstraße 7, 1010 Vienna, Austria
2 School of Applied Health and Social Sciences, University of Applied Sciences Upper Austria, Linz, Austria
Prev Sci (2018) 19:337–346
DOI 10.1007/s11121-016-0638-3
In this paper, we argue for a systematic Integration of
Intervention and Implementation research ("I3-Approach").
That means researchers design and develop intervention pro-
grams based on a field-oriented and participative approach
from the very beginning (according to the concept of use-
inspired basic research, Stokes 1997; see also Spiel 2009a).
This is not only a matter of transferring a program to practi-
tioners at the end of the research process; the whole concep-
tualization of an intervention as well as its evaluation and
implementation should systematically consider the needs of
the field (Spiel et al. 2011b) in an integrated way (Beelmann
and Karing 2014). Consequently, the perspective of all
stakeholders should be included (Shonkoff 2000). Based
on theoretical considerations we drew from the literature
and our experiences with intervention and implementation
research, we summarized the most relevant actions to be
taken and issues to be considered on the part of researchers,
into a systematic six-step procedure (PASCIT) in order to
propose such a systematic connection between intervention
research and implementation research. We expect that such
a connection would increase the probability of sustainably
implementing evidence-based intervention programs into
public policy.
How this systematic connection between intervention and
implementation research can be realized is illustrated by
means of the ViSC Social Competence Program. The main
goal of the ViSC program is to reduce aggression and bullying
and to foster social and intercultural competencies. It was
embedded in a national strategy for violence prevention. For
sustainable implementation, a cascaded train-the-trainer (re-
searcher-multipliers-teachers-students) model was developed
and applied. Advantages and limitations of the six-step proce-
dure are also discussed for the ViSC program as well as from a
general point of view at the end of the article.
Theoretical Background
In recent decades, the evidence-based movement has gained
greatly in impact, especially in Anglo-American contexts
(Kratochwill and Shernoff 2003). Efforts have been made
to make better use of research-based prevention and inter-
vention programs in human service areas such as medicine,
employment, child welfare, health, and juvenile justice
(Fixsen et al. 2009; Spiel 2009a). As part of this
evidence-based movement, various efforts have been made
to define standards of evidence. For example, the Society
for Prevention Research (Flay et al. 2005) has provided
standards to assist practitioners, policy makers, and admin-
istrators in determining which interventions are efficacious,
which are effective, and which are ready for dissemination
(for details, see Flay et al. 2005). Other standards are pro-
vided by, for instance, the What Works Clearinghouse (see
www.whatworks.ed.gov), the Best Evidence Encyclopedia
(see www.bestevidence.org), the Campbell Collaboration
(see www.campbellcollaboration.org), and the UK-Based
Evidence for Policy and Practice Information and Co-
ordinating Centre (see www.eppi.ioe.ac.uk). Common to
these standards is the fact that evidence-based programs
are defined by the research methodology used to evaluate
them, and randomized trials are defined as the gold stan-
dard for defining evidence (Fixsen et al. 2009).
However, there are considerable differences in the uptake
of research findings among public service areas and scientific
disciplines (Nutley et al. 2007). Particularly in the field of
education, there has not been extensive implementation of
the body of evidence-based research, and the adoption of pre-
vention and intervention programs is driven more by ideology
than by evidence (Forman et al. 2013; Slavin 2008; Spiel
2009a, b).
Despite differences among public service fields and coun-
tries, it has become obvious that the evidence-based move-
ment has not provided the intended benefits to consumers
and communities (Fixsen et al. 2009), and implementing these
programs into practice and in the wider range of public policy
has often failed (Fixsen et al. 2009, 2013). There is large-scale
agreement about one of the central reasons for this disappoint-
ment: Program evaluation has not historically included any
mention or systematic study of implementation (Meyers
et al. 2012). As a consequence, the field of implementation
research has emerged (Rossi and Wright 1984). In recent
years, a growing body of implementation research has indi-
cated that an active, long-term, multilevel implementation ap-
proach (= mission-driven focus) is far more effective than
passive forms of dissemination without any active involve-
ment of practitioners (Ogden and Fixsen 2014; Spiel et al.
2012).
Fixsen et al. (2005) defined implementation as the "specific
set of activities designed to put into practice an activity or
program of known dimensions" (p. 5; a similar definition is
provided by Forman et al. 2013). Consequently, implementa-
tion science has been defined as "the scientific study of
methods to promote the systemic uptake of research findings
and evidence-based practices into professional practice and
public policy" (Forman et al. 2013, p. 80; see also Eccles and
Mittman 2006).
In the last decade, many implementation studies have
been conducted and several conceptual models and
implementation frameworks have been presented. The
review provided by Meyers et al. (2012) examines 25
frameworks. They found 14 dimensions that were com-
mon to many of these frameworks and grouped them into
six areas: (a) assessment strategies, (b) decisions about
adaptation, (c) capacity-building strategies, (d) creating a
structure for implementation, (e) ongoing implementation
support strategies, and (f) improving future applications.
According to their synthesis, the implementation process
consists of a temporal series of these interrelated steps,
which are critical to quality implementation.
Despite these efforts within the field of implementation
science, there is agreement among researchers that the empir-
ical support for evidence-based implementation is insufficient
(Ogden and Fixsen 2014). Although there is a large body of
empirical evidence on the importance of implementation and
growing knowledge of the contextual factors that can influ-
ence implementation, knowledge of how to increase the like-
lihood of quality implementation is still needed (Meyers et al.
2012).
Moreover, intervention research and implementation
research have not yet been systematically connected.
Forman et al. (2013) explicitly pointed out the differences
between intervention and implementation activities. While
intervention activity refers to the provision of a prevention
or intervention program to clients and consists (in the field
of school) of a group leader conducting the program with
targeted students, implementation activity refers to actions
taken in the organizational setting to ensure that the inter-
vention delivered to clients is complete and appropriate.
Consequently, different research groups with different re-
search traditions are usually involved in the two tasks.
Implementation researchers are mostly given mandates by
politicians to take on the implementation of already
existing interventions. Moreover, implementation research
is seen as less exciting than research that develops new
interventions (Fixsen et al. 2011). Furthermore, implemen-
tation research is very difficult to do within the constraints
of university research environments (e.g., due to time or
financial constraints) and is sometimes even considered to
be less scientifically valuable (Fixsen et al. 2011; Spiel
2009b).
Therefore, we suggest a systematic integration of interven-
tion and implementation research. Consequently, researchers
should design and develop intervention programs using a
field-oriented and participative approach (according to the
concept of use-inspired basic research, Stokes 1997; see also
Spiel 2009a). The whole conceptualization of an intervention
as well as its evaluation and implementation should systemat-
ically consider the needs of the field (e.g., Spiel et al. 2011a, b,
c), and intervention, evaluation, and implementation should
be developed in an integrative way. In order to realize this
and to avoid as many presumable risks as possible, the per -
spective of all stakeholders should be included (Shonkoff
2000). In particular, the perspective of policymakers has to
be included (Spiel et al. 2011a, b, c; Davies 2004), as well
as analyses of what factors support or hinder evidence-based
policy (Davies 2004, 2012).
In the next section, we propose an approach for the goal-
oriented integration of intervention and implementation
research.
PASCIT—Six Steps of an "I3-Research" (Integrative
Intervention and Implementation Research)
Approach
Combining theoretical and empirical knowledge from prior
research (e.g., Glasgow et al. 1999; Greenhalgh et al. 2004)
with our own experience in intervention and implementation
research and with the arguments and desiderata described
above, we consider at least six steps as constitutive compo-
nents of an integrative approach to intervention and imple-
mentation research. Although these steps mostly occur in suc-
cession, some of them might be performed simultaneously.
The six steps together must be considered as parts of a dy-
namic process with many sub-processes, feedback loops, and
interdependencies.
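The ordered-but-interdependent character of the six steps can be sketched as a minimal checklist structure. This is purely an illustrative sketch; the class and function names are our own invention, not part of the PASCIT framework:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PascitStep:
    letter: str   # the letter contributed to the acronym
    name: str     # short label of the step

# The six steps in their (mostly) sequential order.
PASCIT = (
    PascitStep("P", "Mission-driven Problem recognition"),
    PascitStep("A", "Availability of robust knowledge"),
    PascitStep("S", "Starting points for action"),
    PascitStep("C", "Cooperation process with policymakers"),
    PascitStep("I", "Coordinated development of Intervention and Implementation"),
    PascitStep("T", "Transfer of program implementation"),
)

def acronym(steps):
    """Join the step letters; for the full sequence this yields 'PASCIT'."""
    return "".join(step.letter for step in steps)
```

The tuple is ordered deliberately: although some steps may run simultaneously, the sequence reflects the typical progression described below.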
Step 1: Mission-driven Problem recognition (P)
Researchers are used to identifying scientific problems and
desiderata for new insights. However, in the case of an integra-
tive approach to intervention/prevention and implementation
research, the focus is not primarily on problems arising in
basic research but on (mostly social) problems in society.
Consequently, in order to identify such problems and in order
to be generally willing to take action, researchers must not
only be curiosity-driven but also mission-driven, combining
the quest for fundamental understanding with a consideration
of practical use (Stokes 1997). In other words, if scientists
intend to contribute to this field of research, the first step
requires socio-political responsibility as a basic mindset.
Step 2: Ensuring Availability of robust knowledge on how
to handle the problem (A)
The availability of robust and sound scientific knowledge and
evidence is a fundamental precondition for working on an
identified problem. Moreover, it is a prerequisite for any kind
of transfer (Spiel et al. 2009). Consequently, researchers have
to be experts in the relevant field with excellent knowledge of
theory, methods, empirical findings, and limitations. This also
includes the political dimension of research in the sense of
defining and financing corresponding research topics.
Step 3: Identification of reasonable Starting points
for action (S)
The identification of a problem and the availability of relevant
insights for initiating changes are not enough if one does not
succeed in identifying promising starting points for interven-
tions and their implementation. This must be emphasized, as a
wide body of research has made it clear that many intervention
programs and measures do not work everywhere and at all
times (Meyers et al. 2012). Here again, a necessary condition
is high expertise in the relevant scientific field. However, this
alone will not do. It must be combined with a differentiated
view of prevailing cultural and political conditions.
Researchers need knowledge and experience in the relevant
practical field and its contextual conditions. This also includes
knowledge about potential problems and limitations.
Step 4: Establishment of a Cooperation process
with policymakers (C)
This step is crucial for several reasons. Successful
development and implementation of evidence-based interven-
tion in practical settings involves various stakeholders and
requires cooperation, persistence, time, and money. In order
to conduct integrative intervention and implementation re-
search, stable alliances with the relevant policymakers are
necessary, which many researchers traditionally have not
established. However, as research often follows its own
intrinsic logic, which clearly differs from political thinking, a
very deliberate process of establishing cooperation and build-
ing alliances is necessary. Among other things, this includes
more awareness of policymakers’ scope of action. Researchers
have to consider that there are other influences on government
and policy, beyond evidence. These include values, beliefs,
and ideology, which are the driving forces of many political
processes (Davies 2004, 2012). Habits and traditions are also
important and cannot be changed quickly. Furthermore,
researchers’ experience, expertise, and judgment influence
policymaking, and the media, lobbyists, and pressure groups
also play a role. Researchers have to keep in mind that
policymaking is highly embedded in a bureaucratic culture
and is forced to respond quickly to everyday contingencies.
Last but not least, as resources are limited, policymaking is
always a matter of what works at what costs and with what
outcomes (Davies 2004, 2012). Consequently, researchers
have to find ways to integrate evidence with all these factors.
However, again, this step also addresses a certain basic atti-
tude of researchers: It requires that researchers make their
voice heard, and sometimes, they have to be very insistent.
Step 5: Coordinated development of Intervention
and Implementation (I)
This step is the centerpiece and a long process rather than a
step. The coordinated development has to be performed in a
theory-driven, ecological, collaborative, and participatory
way. This means that researchers have to include the perspec-
tives of all relevant stakeholders (practitioners, policymakers,
government officials, public servants, and communities) in
this development process, communicate in the language of
these diverse stakeholders, and meet them as equals.
Therefore, researchers again have to consider parameters for
their research work that differ from many traditional
approaches: working together right from the beginning is not
common in many fields and also requires new conceptions of,
e.g., research planning (regarding things like the duration of
project phases; see Meyers et al. 2012). Here, the big chal -
lenge is to find a balance between wide participation and the
maintenance of scientific criteria and standards of evidence, as
well as between the freedom of science and research on de-
mand. Consequently, researchers must have theoretical
knowledge and practical experience in their very specific field
of expertise, but the required profile for a successful
"integrative intervention and implementation researcher" is
much wider.
Step 6: Transfer of program implementation (T)
For this final scale-up step, several models and guidelines
have been proposed by implementation science (see
"Theoretical Background"). Therefore, we will only make a
reference to them (e.g., Fixsen et al. 2013; Meyers et al. 2012;
Spoth et al. 2013).
In sum, none of the PASCIT steps refers to a completely new
consideration or demand (see also Glasgow et al. 1999, who
developed the RE-AIM framework for the evaluation of pub-
lic health interventions), but they have to be seen as one co-
herent approach. The sound, consistent integration of inter -
vention and implementation research with the goal of intro-
ducing changes to policy also requires a (re)differentiation of
our scientific identity and the creation of a new, wider job
description for researchers in this field.
To illustrate the application of the PASCIT steps of the "I3-
Approach", the following sections introduce the ViSC school
program, which seeks to reduce aggressive behavior and bul -
lying and to foster social and intercultural competencies.
The ViSC School Program—Fostering Social
and Intercultural Competencies
Step 1: Mission-driven Problem recognition (P)
Since 1997, violence in schools has gained widespread public
attention in Austria, and a number of reports in the media have
addressed this topic. As a consequence, a joint statement is-
sued by four federal ministers declared the government’s in-
tention to set up initiatives to prevent violence in several social
domains. The government provided financial support for a
number of individual intervention and prevention projects in
schools. However, as shown in a national report (Atria and
Spiel 2003), most of the initiatives taken to prevent violence in
schools were organized by individual teachers; researchers
were not involved in the planning and organization of these
projects. Therefore, these projects and programs were not the-
oretically based, project goals were imprecisely formulated,
and programs were rarely documented and evaluated. In sum,
they were a far cry from aligning with the standards of
evidence.
Step 2: Ensuring Availability of robust knowledge on how
to handle the problem (A)
Many prevention and intervention programs have been deve-
loped by researchers to take on violence at school. They have
been evaluated in numerous efficacy and effectiveness trials
(e.g., Ferguson et al. 2007; Fox et al. 2012; Smith et al. 2004;
Ttofi and Farrington 2009, 2011), and many resources of na-
tional institutions have been invested into the implementation
of research-based programs in several countries (e.g.,
Kennedy 1997; Nutley et al. 2007; Petrosino et al. 2001;
Shonkoff 2000). It has become apparent that there are at least
two key features necessary for program success: (a) to ap-
proach the school as a whole and to incorporate a systemic
perspective and (b) to conduct activities at school level, class
level, and individual level (Smith et al. 2004; Ttofi and
Farrington 2009, 2011). However, the deployment of research
findings in applied settings has remained slow and incomplete
(e.g., Slavin 2002, 2008). It turns out that national strategies
actively supported by governments are needed for sustainable
violence prevention (Olweus 2004; Roland 2000; Spiel et al.
2011a, b, c).
Step 3: Identification of reasonable Starting points
for action (S)
Consequently, the best means for dealing with violence in
Austrian schools was the development of a national strategy
with policy and advocacy as important pillars (Roland 2000).
Violence prevention programs that take into account the key
factors identified for success (Smith et al. 2004; Ttofi and
Farrington 2011), comply with the standards of evidence
(e.g., Flay et al. 2005) and consider cultural and situational
conditions (Datnow 2002, 2005; Shonkoff 2000) should be
conceptualized as central parts of this national strategy.
Step 4: Establishment of a Cooperation process
with policymakers (C)
Therefore, we tried to convince officials at the Federal
Ministry for Education and the Minister herself of the need
for a strategy at a national level by advocating for evidence -
based programs and explaining the advantages of a strategi c
plan as opposed to individual initiatives. Based on several
previous projects with the Federal Ministry for Education,
we had established an open and honest line of communication.
At the beginning of 2007, in the wake of a quick succession
of dramatic, well-publicized incidents in schools and the pub-
lic discussion of the high rates of bullying and victimization in
Austria that followed reports of the results of the Health
Behaviours in School-aged Children (HBSC) survey (Craig
and Harel 2004), we received a mandate from the Federal
Ministry for Education to develop a national strategy for vio-
lence prevention in the Austrian public school system. In pre-
paring the national strategy, we had to cope with the challenge
that there was no existing system of collaboration among var-
ious stakeholders actively involved in violence prevention and
intervention (e.g., school psychologists, social workers, teach-
er unions) and with a lack of knowledge concerning scientific
standards. It was therefore our intention to systematically in-
tegrate the perspectives of these stakeholder groups in strategy
development and to specifically consider the application of
theory-based and empirically evaluated prevention and inter-
vention programs (Spiel and Strohmeier 2011).
We developed the strategy between January and November
2007 in a continuous dialogue with officials responsible for
this issue at the Federal Ministry of Education and in intensive
exchange with international colleagues who had been in-
volved in similar national strategies in their own countries
(for details, see Spiel and Strohmeier 2011). In December
2007, the Federal Minister decided to implement the strategy
and presented this decision and the strategy plan in a major
press conference. For strategy management and implementa-
tion, a steering committee was established at the Federal
Ministry, with Christiane Spiel as an external member respon-
sible for research issues. In 2008, the national strategy became
part of the coalition agreement between the two governing
parties and was designed to last until the end of the legislative
period in September 2013. As a consequence, money was
devoted to the national strategy and the activities within the
strategy. The officers of the Federal Ministry and the Federal
Minister herself were very much committed to the national
strategy and very keen on getting positive results. The imple-
mentation of the strategy continues during the legislative pe-
riod from 2013 to 2018 (for details about the national strategy
and its development, see Spiel and Strohmeier 2011, 2012;
Spiel et al. 2012). As one part of the national strategy, it
was possible to expand the so-called ViSC class project
(Atria and Spiel 2007), which had previously been developed
and applied in rather controlled contexts, into a school-wide
program, to develop an implementation strategy for Austrian
schools, and to conduct a large-scale evaluation study.
Step 5: Coordinated development of Intervention
and Implementation (I)
In accordance with the national strategy, the main goals of the
ViSC program are to reduce aggressive behavior and bullying
as well as to foster social and intercultural competencies in
schools (Strohmeier et al. 2012). The ViSC program was de-
signed to focus on the school as a whole and to incorporate a
systemic perspective. The prevention of aggression and
bullying was defined as a comprehensive school development
project over the duration of an entire school year. Activities
were designed to operate on three different levels: the school
as a whole, the classrooms, and the individual (for details, see
Spiel and Strohmeier 2011; Strohmeier et al. 2012). In an in-
school training for all teachers, basic knowledge on bully-
victim behavior and its development was presented. Based
on this shared knowledge, the school principal and all of the
school’s teachers were encouraged to jointly develop (a) a
shared definition of aggression and bullying, (b) shared prin-
ciples on how to handle aggression and bullying, and (c) com-
monly agreed-upon measures to sustainably reduce aggres-
sion and bullying on the school level. Furthermore, teachers
were trained to conduct talks with bullies, victims and their
parents in accordance with shared, standardized procedures, in
reaction to critical incidents.
One unique feature of the ViSC program concept is that it
includes a theory-based project at the class level, which was
developed in recognition of the importance of the class con-
text for the prevalence of bullying and victimization. The
ViSC class project consists of 13 training units (for details,
see Atria and Spiel 2007; Spiel and Strohmeier 2011;
Strohmeier et al. 2012). Taking the culture of Austrian schools
into account, the ViSC class project is well structured, but
open for adaptation in terms of materials and activities used.
Before expanding the ViSC class project to the ViSC school
program, it had been implemented four times by different
researchers in Austrian and German schools. Summative and
formative program evaluations confirmed promising results
(Atria and Spiel 2007; Gollwitzer et al. 2006, 2007). In par-
ticular, implementation quality was high. Out of a
total of 52 training units (4 classes × 13 units), only one train-
ing unit was deemed not to be in accordance with the project
goals. Twenty-nine units were structured exactly according to
the training manual and 22 units were in accordance with the
project goals, but used adapted materials (Gollwitzer et al.
2006).
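The unit counts reported above can be tallied in a few lines. This is a minimal sketch of the arithmetic; the variable names and the derived share are ours, not figures reported by Gollwitzer et al. (2006):

```python
# Fidelity categories for the ViSC class project pilot
# (4 classes × 13 units = 52 training units in total).
units_exact = 29      # structured exactly according to the training manual
units_adapted = 22    # in accordance with project goals, adapted materials
units_off_goal = 1    # deemed not in accordance with the project goals

total_units = units_exact + units_adapted + units_off_goal       # 52
share_on_goal = (units_exact + units_adapted) / total_units      # ≈ 0.98
```

In other words, roughly 98 % of the delivered units were consistent with the project goals, which is the arithmetic behind the authors' conclusion of high implementation quality.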
The ViSC program’s implementation model was devel-
oped concurrently by the same researchers who developed
the ViSC program. It needed to take the context and culture
of the Austrian school system as well as the concrete situ-
ation at each specific school into account (Datnow 2002,
2005). Therefore, and to avoid overburdening teachers and
principals, we developed a cascaded train-the-trainer model
for the implementation of the ViSC program: researchers
train multipliers, multipliers train teachers, and teachers
train their students. To train multipliers—known as ViSC
coaches—ViSC courses were offered by the program de-
velopers. The ViSC courses consisted of three work-
shops—mostly held at the University of Vienna—and the
implementation of the ViSC program in one school. ViSC
coaches were required to hold in-school trainings for the
entire staff at their assigned school and to supervise and
coach them throughout the implementation process. The
ViSC coaches also held an additional in-school training
for those teachers who planned to conduct the ViSC class
project and offered them three supervision units during the
implementation of the class project.
Step 6: Transfer of program implementation (T)
The primary target groups for recruiting ViSC coaches are
teachers at educational universities and psychologists. From
2008 to 2014, 55 coaches were trained. To support schools in
implementing the ViSC program, many materials were pro-
vided on a website, which was presented and explained to
teachers by the ViSC coaches. Furthermore, a manual was
created to serve as a guide for teachers.
In 2009/10, the ViSC program was intensively evaluated in
terms of implementation quality and effectiveness. All sec-
ondary schools located in the capital city of Austria were
invited to participate in the ViSC program. Out of 155 second-
ary schools in Vienna, 34 schools applied for participation, out
of which 26 schools fulfilled the necessary requirements
(participation of the whole school staff in the program,
providing time resources for all ViSC trainings, taking part
in the evaluation study; Gradinger et al. 2014; Strohmeier
et al. 2012). Applying a randomized intervention-control
group design, 13 schools were randomly assigned to the inter -
vention group. Out of the remaining 13 schools, only five
agreed to serve as control schools. Data from 2042 students
(1377 in the intervention group, 665 in the control group)
from grades 5 to 7 (47.3 % girls; mean age 11.7 years,
SD = 0.9), attending 105 classes, were collected at four
points in time. In addition, 338 teachers participated in a sur-
vey at the pre-testing and post-testing points.
Our first analyses focused on the short-term effectiveness
of the program with respect to aggressive behavior and vic-
timization (Strohmeier et al. 2012). A multiple group latent
change score model (LCS) to compare the control and inter -
vention group was applied, with gender and age as covariates.
The multiple group LCS model, imposing strict measurement
invariance, fit the data well. Results indicated a decline in
aggressive behavior (the latent mean of the aggression change
score in the intervention group, M = −0.23, differed signifi-
cantly from 0; p = 0.13), but no change in victimization.
Boys scored higher on aggression at time 1 and showed
smaller decreases over time. Age did not have any effects (for
details, see Strohmeier et al. 2012). A further analysis using
the Handling Bullying Questionnaire (HBQ) showed that
teachers in the intervention group used more non-punitive
strategies to work with bullies and more strategies to support
victims compared to teachers in the control group (Strohmeier
et al. 2012). We also investigated the short-term effect of the
ViSC program on cyberbullying and cybervictimization.
Results of a multiple group bivariate LCS model, controlling
342 Prev Sci (2018) 19:337–346
for traditional aggression, traditional victimization, and age
showed effectiveness for both cyberbullying (latent d=0.39)
and cybervictimization (latent d = 0.29; for details, see
Gradinger et al. 2014).
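In generic form (an illustrative sketch, not the authors' exact parameterization), a univariate latent change score model defines the change between two measurement occasions as a latent variable whose mean is compared across groups:

```latex
% Generic univariate latent change score (LCS) sketch; symbols are
% illustrative, not the authors' exact model specification.
\Delta\eta_i = \eta_{i,T_2} - \eta_{i,T_1}
\quad\Longleftrightarrow\quad
\eta_{i,T_2} = \eta_{i,T_1} + \Delta\eta_i
% Structural part with the covariates named in the text:
\Delta\eta_i = \mu_{\Delta}^{(g)} + \gamma_1\,\mathrm{gender}_i
             + \gamma_2\,\mathrm{age}_i + \zeta_i
% The group comparison tests whether the latent change mean of the
% intervention group, \mu_{\Delta}^{(\mathrm{int})}, differs from zero,
% under strict measurement invariance (equal loadings, intercepts,
% and residual variances across groups and occasions).
```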
The evaluation of the implementation quality of the
ViSC program focused on implementation fidelity (the
number of conducted units documented by the ViSC
coaches) and participant responsiveness (participation
rates of the teaching staff; Schultes et al. 2014). There
was a high variability for both scores: implementation fidelity ranged from 0.4 (conduction of 40 % of the prescribed training units) to 2.0 (conduction of twice as many
training units as prescribed in the curriculum); the range of
participation rates lay between 30 and 100 %. Multilevel
analyses showed that teachers’ self-efficacy was signifi-
cantly more enhanced in schools where the ViSC program
had been implemented with high fidelity and that only
teachers with high participant responsiveness significantly
changed their behavior in bullying situations (for details,
see Schultes et al. 2014). We used these results to adapt
the training of the ViSC coaches. In a participatory ap-
proach together with coaches and school principals, we
worked out what conditions are necessary for implementing
the ViSC program with high fidelity and high participant
responsiveness. In further analyses, implementation fidelity
and participation rates will be considered. Currently, the
ViSC program is being implemented in Romania and
Cyprus by local researchers. Initial evaluations show prom-
ising results.
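The two implementation-quality indicators described above are simple ratios; a minimal sketch with hypothetical numbers (not the study's data):

```python
def implementation_fidelity(units_conducted: float, units_prescribed: float) -> float:
    """Ratio of conducted to prescribed training units.

    Values above 1.0 mean a school held more units than the curriculum
    prescribed (values up to 2.0 were observed in the ViSC evaluation).
    """
    return units_conducted / units_prescribed


def participant_responsiveness(teachers_participating: int, teaching_staff: int) -> float:
    """Share of the teaching staff taking part in the trainings."""
    return teachers_participating / teaching_staff


# Hypothetical school: 8 of 10 prescribed units held, 18 of 30 teachers attending
print(implementation_fidelity(8, 10))        # 0.8
print(participant_responsiveness(18, 30))    # 0.6
```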
Our intention was not only to implement a program to
prevent violence, but also to enable principals and teachers
to assess and interpret violence rates in their schools and class-
rooms as well as to evaluate the effectiveness of interventions
against violence without depending on the presence of re-
searchers for data collection, analysis, and interpretation.
Therefore, we developed the self-evaluation tool AVEO
(Austrian Violence Evaluation Online Tool), which provides
information about violence rates from the perspectives of both students and teachers (Schultes et al. 2011; Spiel et al. 2011a, b, c; see also Spiel et al. 2012). The teacher and school perspectives were systematically integrated into the development
of the self-evaluation tool and its implementation was careful-
ly evaluated.
Discussion
Although there is a large amount of empirical evidence for
the importance of implementation, there is still not enough
knowledge available on how to increase the likelihood of
quality implementation (Meyers et al. 2012). From our per-
spective, one reason for this state of insufficiency might be
that intervention and implementation research have not
been systematically connected up until now. In this paper,
we proposed the systematic integration of intervention and
implementation research. We presented an integrative inter-
vention and implementation approach (I3-Approach) in-
cluding a procedure with six consecutive steps (PASCIT)
from (1) problem recognition to (6) transfer of program
implementation. To illustrate the integration of intervention
and implementation, we presented a program from our own
research, focusing on violence prevention in schools. In this
section, we discuss the strengths and limitations of this
example. Finally, problems and challenges of the I3-
Approach and the PASCIT procedure are examined on a
general level.
For the development and implementation of the ViSC
school program, it was very helpful that it was part of a na-
tional strategy on violence prevention in the public school
system, which we ourselves had also developed. Moreover,
from our perspective, the establishment of a national strategy
was a prerequisite for the upscaling of the ViSC program, as the necessary implementation capacity and respective organizational structures (Fixsen et al. 2005) had not been established before. The public discussion of the high rates of bullying in
Austria and several incisive events in schools raised
policymakers’ awareness of the issue and gave us a window
of opportunity for action. A further important step was that the
national strategy became part of the coalition agreement of the
governmental parties. This solved the budget problem. The
national strategy also supported the implementation of sound
measures for sustainability (for details, see Spiel and
Strohmeier 2012; Spiel et al. 2012), and the realization of a randomized controlled trial was defined as a scientific standard within the strategy. To our knowledge, it was the first time
that this gold standard was applied in a program financed by
the Austrian Federal Ministry for Education. Further strengths
of the ViSC program include the fact that adaptation within a
defined range is an explicit part of the program (for details, see
Strohmeier et al. 2012); the building of organizational capac-
ity through collaboration with, e.g., school psychology, the
ViSC coaches training, the implementation concept and its
evaluation, as well as the AVEO self-assessment as a feedback
mechanism.
However, there are also some limitations. In the ViSC program, we did not work directly with schools, but rather trained ViSC coaches. This resulted in lower commitment of the schools and lower implementation quality in the evaluation study (Schultes et al. 2014), but offered advantages for long-term implementation in the school system. Nevertheless, we recommend convincing
politicians and government officials that the initial im-
plementation of such programs should be done under
the supervision of researchers and program developers.
A further limitation has to do with the cultural context.
In Austria, responsibility and commitment are not well
Prev Sci (2018) 19:337–346 343
established in the educational system (see Spiel and
Strohmeier 2012). Out of the 155 secondary schools in
Vienna invited to participate in the ViSC program, only
34 applied for participation. Considering the high bully-
ing rates in Austria (Craig and Harel 2004), which in-
dicate a need for intervention, the participation rate was
low. The low engagement is also reflected in the fact that out of the 13 schools we asked to serve as control schools, only five agreed to do so. In other countries, e.g.,
in Finland (Salmivalli et al. 2013), a greater degree of
responsibility in the educational system means that par-
ticipation rates in such programs are usually much
higher.
We tried hard to fulfil all the critical steps identified by
Meyers et al. (2012) and were successful concerning most of
them (see above), but it was not possible to realize our inten-
tion in all cases. To prepare the schools for the intervention,
we defined criteria for program participation, such as partici -
pation of the entire school staff in the program. The school
principals were asked to get consent in advance. However,
after the start of the program, it became apparent that this
consent had not been obtained in several cases. We further
recommended that no other programs should be conducted
simultaneously, which was also disregarded by some schools.
Finally, and importantly, the school principals’ leadership was
not strong enough. This can be attributed to the Austrian
school law and cannot be changed easily. School autonomy
has been a topic of political discussion in Austria for many
years.
Overall, the ViSC program illustrates that intervention and
implementation can be systematically combined in the sense
of PASCIT. However, it also illustrates that detailed planning
of such projects by researchers is difficult and that limitations
on different levels—regarding, e.g., cultural contexts—have
to be kept in mind.
But why is the systematic combination of intervention
and implementation as proposed by PASCIT so difficult?
On the surface, the steps seem self-evident. And what is
really new in contrast to existing implementation frame-
works and transfer concepts? Obviously, most, if not all
components (both within and across the six steps) of
PASCIT, are already known and have long been consid-
ered in intervention and implementation research.
However, the new and demanding challenge is our postulation of bringing them together in an integrative and coordinated way, in order to achieve success. The I3-Approach represents a very basic but also a very systematic research concept and is more than purely the sum of its steps; ignoring one aspect changes the whole dynamic.
RE-AIM (Glasgow et al. 1999) has recommended such a
systematic view for the evaluation of public health mea-
sures. However, so far, it has not been implemented com-
prehensively either. Nevertheless, the validity and
convenience of PASCIT have to be proven by future research
and programs in different fields and cultural contexts.
Obviously, combining intervention and implementa-
tion research is very demanding. Therefore, the appro-
priate acknowledgement in the scientific community is
essential. Consequently, individual researchers should
not be the only ones engaging in this kind of research;
universities also have to include it in their missions. We
therefore strongly recommend a discussion of success
criteria in academia (Fixsen et al. 2011) and that the social responsibility of academics and universities, respectively, be considered more deeply. The current
gratification system in science is more oriented to basic
than to applied research. Within the applied field, it is
predominantly technology and medicine that are finan-
cially supported and acknowledged by the public.
Mission-driven research picking up problems in society
is less financed and noticed. Consequently, the number
of researchers engaged in this field is limited. However,
in the last few years some advances could be observed.
A further problem lies in the availability of knowledge. In
the social sciences, particularly in the educational field, it is
not easy to get robust scientific knowledge. Reasons for this
include that replication studies are rare and only probability
conclusions can be drawn. Here, the development of standards
of evidence was of high importance (e.g., Flay et al. 2005).
However, the requirements defined in these standards are not
as comprehensive as demanded by the I3-Approach. For ex-
ample, the evidence standards defined by the Society for
Prevention Research (Flay et al. 2005) proposed criteria for
efficacious and effective interventions and interventions rec-
ognized as ready for broad dissemination. However, they did
not combine them with the affordance of implementation.
As mentioned earlier, the commitment of policymakers is
crucial. Researchers need to have a great deal of persistence
and knowledge about policymakers’ scope of action.
However, in most cases this is not enough. A window of
opportunity is also needed and researchers have to catch it.
Here, the media can be supportive (Spiel and Strohmeier
2012).
To sum up, from our perspective, it is a continuous chal-
lenge to introduce the criteria for and the advantages of
evidence-based practice to politicians, public officials, and
practitioners on the one hand, and to promote the recognition
of intervention and implementation research in the scientific
community on the other hand. The I3-Approach and its
PASCIT steps offer one possible procedure for realization.
Obviously, other procedures, such as bottom-up approaches,
might be possible, especially if implementation capacity (e.g.,
in the sense of sufficient competent manpower) and respective
organizational structures are already established. However, we
argue for a systematic, strategic procedure instead of an incidental one.
344 Prev Sci (2018) 19:337–346
Acknowledgments Open access funding provided by University of Vienna. This study was funded by grants from the Austrian Federal Ministry for Education, Arts, and Culture.
The development of the national strategy for violence prevention and the evaluation of its implementation as well as the implementation of the ViSC program were financially supported by the Austrian Federal Ministry for Education, Arts, and Culture. We want to thank the Federal Ministry for their support and the participating teachers for their engagement.
References

Atria, M., & Spiel, C. (2003). The Austrian situation: Many initiatives, few evaluations. In P. Smith (Ed.), Violence in schools: The response in Europe (pp. 83–99). London: RoutledgeFalmer.

Atria, M., & Spiel, C. (2007). Viennese Social Competence (ViSC) Training for Students: Program and evaluation. In J. E. Zins, M. J. Elias, & C. A. Maher (Eds.), Bullying, victimization, and peer harassment: A handbook of prevention and intervention (pp. 179–197). New York: Haworth Press.

Beelmann, A., & Karing, C. (2014). Implementationsfaktoren und -prozesse in der Präventionsforschung: Strategien, Probleme, Ergebnisse, Perspektiven [Implementation factors and processes in prevention research: Strategies, problems, findings, prospects]. Psychologische Rundschau, 65, 129–139. doi:10.1026/0033-3042/a000215.

Craig, W., & Harel, Y. (2004). Bullying, physical fighting and victimization. In C. Currie (Ed.), Health behaviour in school-aged children: A WHO cross national study (pp. 133–144). Geneva: World Health Organization.

Datnow, A. (2002). Can we transplant educational reform, and does it last? Journal of Educational Change, 3, 215–239. doi:10.1023/A:1021221627854.

Datnow, A. (2005). The sustainability of comprehensive school reform models in changing district and state contexts. Educational Administration Quarterly, 41, 121–153. doi:10.1177/0013161X04269578.

Davies, P. (2004). Is evidence-based government possible? Jerry Lee Lecture 2004. Paper presented at the 4th Annual Campbell Collaboration Colloquium, Washington, DC.

Davies, P. (2012). The state of evidence-based policy evaluation and its role in policy formation. National Institute Economic Review, 219, 41–52. doi:10.1177/002795011221900105.

Eccles, M. P., & Mittman, B. S. (2006). Welcome to implementation science. Implementation Science, 1, 1. doi:10.1186/1748-5908-1-1.

Ferguson, C. J., San Miguel, C., Kilburn, J. C., & Sanchez, P. (2007). The effectiveness of school-based anti-bullying programs. Criminal Justice Review, 32, 401–414.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network (FMHI Publication No. 231). Retrieved from http://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature

Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19, 531. doi:10.1177/1049731509335549.

Fixsen, D. L., Blase, K., & Van Dyke, M. (2011). Mobilizing communities for implementing evidence-based youth violence prevention programming: A commentary. American Journal of Community Psychology (Special Issue), 48, 133–137.

Fixsen, D. L., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children (Special Issue), 79, 213–230.

Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., . . . Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6, 151–175.

Forman, S. G., Shapiro, E. S., Codding, R. S., Gonzales, J. E., Reddy, L. A., Rosenfield, . . . Stoiber, K. C. (2013). Implementation science and school psychology. School Psychology Quarterly, 28, 77–100. doi:10.1037/spq0000019.

Fox, B. H., Farrington, D. P., & Ttofi, M. M. (2012). Successful bullying prevention programs: Influence of research design, implementation features, and program components. International Journal of Conflict and Violence, 6, 273–282.

Glasgow, R. E., Vogt, T. M., & Boles, S. M. (1999). Evaluating the public health impact of health promotion interventions: The RE-AIM framework. American Journal of Public Health, 89, 1322–1327. doi:10.2105/AJPH.89.9.1322.

Gollwitzer, M., Eisenbach, K., Atria, M., Strohmeier, D., & Banse, R. (2006). Evaluation of aggression-reducing effects of the “Viennese Social Competence Training”. Swiss Journal of Psychology, 65, 125–135.

Gollwitzer, M., Banse, R., Eisenbach, K., & Naumann, A. (2007). Effectiveness of the Vienna Social Competence Training on explicit and implicit aggression. European Journal of Psychological Assessment, 23, 150–156.

Gradinger, P., Yanagida, T., Strohmeier, D., & Spiel, C. (2014). Prevention of cyberbullying and cybervictimization: Evaluation of the ViSC Social Competence Program. Journal of School Violence, 14, 87–110. doi:10.1080/15388220.2014.963231.

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82, 581–629. doi:10.1111/j.0887-378X.2004.00325.x.

Kennedy, M. M. (1997). The connection between research and practice. Educational Researcher, 26, 4–12. doi:10.3102/0013189X026007004.

Kratochwill, T. R., & Shernoff, E. S. (2003). Evidence-based practice: Promoting evidence-based interventions in school psychology. School Psychology Quarterly, 18, 389–408.

Meyers, D. C., Durlak, J. A., & Wandersmann, A. (2012). The quality implementation framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50, 462–480. doi:10.1007/s10464-012-9522-x.

Nutley, S. M., Walter, I., & Davies, H. T. O. (2007). Using evidence: How research can inform public services. Bristol: The Policy Press.
Compliance with Ethical Standards

Conflict of Interest The authors declare that they have no conflict of interest.

Ethical Approval All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent Informed consent was obtained from all individual participants included in the study.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Ogden, T., & Fixsen, D. L. (2014). Implementation science: A brief overview and a look ahead. Zeitschrift für Psychologie, 222, 4–11. doi:10.1027/2151-2604/a000160.

Olweus, D. (2004). The Olweus Bullying Prevention Programme: Design and implementation issues and a new national initiative in Norway. In P. K. Smith, D. J. Pepler, & K. Rigby (Eds.), Bullying in schools: How successful can interventions be? (pp. 13–36). Cambridge: Cambridge University Press.

Petrosino, A., Boruch, R. F., Soydan, H., Duggan, L., & Sanchez-Meca, J. (2001). Meeting the challenges of evidence-based policy: The Campbell Collaboration. Annals of the American Academy of Political and Social Science, 578, 14–34.

Roland, E. (2000). Bullying in schools: Three national innovations in Norwegian schools in 15 years. Aggressive Behavior, 26, 135–143.

Rossi, P. H., & Wright, J. D. (1984). Evaluation research: An assessment. Annual Review of Sociology, 10, 331–352.

Salmivalli, C., Poskiparta, E., Ahtola, A., & Haataja, A. (2013). The implementation and effectiveness of the KiVa Antibullying Program in Finland. European Psychologist, 18, 79–88. doi:10.1027/1016-9040/a000140.

Schultes, M.-T., Strohmeier, D., Burger, C., & Spiel, C. (2011). Fostering evidence-based prevention in schools through self-evaluation: The Austrian Violence Evaluation Online-Tool (AVEO). Poster presented at the Evidence-Based Prevention of Bullying and Youth Violence Conference in Cambridge, Great Britain.

Schultes, M.-T., Stefanek, E., van de Schoot, R., Strohmeier, D., & Spiel, C. (2014). Measuring implementation of a school-based violence prevention program: Fidelity and teachers’ responsiveness as predictors of proximal outcomes. Zeitschrift für Psychologie, 222, 49–57. doi:10.1027/2151-2604/a000165.

Shonkoff, J. P. (2000). Science, policy, and practice: Three cultures in search of a shared mission. Child Development, 71, 181–187. doi:10.1111/1467-8624.00132.

Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31, 15–21.

Slavin, R. E. (2008). Evidence-based reform in education: Which evidence counts? Educational Researcher, 37, 47–50.

Smith, P. K., Pepler, D., & Rigby, K. (2004). Bullying in schools: How successful can interventions be? Cambridge: Cambridge University Press.

Spiel, C. (2009a). Evidence-based practice: A challenge for European developmental psychology. European Journal of Developmental Psychology, 6, 11–33.

Spiel, C. (2009b). Evidenzbasierte Bildungspolitik und Bildungspraxis – eine Fiktion? Problemaufriss, Thesen, Anregungen [Evidence-based education policy and educational practice – a fiction? Problem, theses, suggestions]. Psychologische Rundschau, 60, 255–256.

Spiel, C., & Strohmeier, D. (2011). National strategy for violence prevention in the Austrian public school system: Development and implementation. International Journal of Behavioral Development, 35, 412–418. doi:10.1177/0165025411407458.

Spiel, C., & Strohmeier, D. (2012). Evidence-based practice and policy: When researchers, policy makers, and practitioners learn how to work together. European Journal of Developmental Psychology, 9, 150–162.

Spiel, C., Lösel, F., & Wittmann, W. W. (2009). Transfer psychologischer Erkenntnisse – eine notwendige, jedoch schwierige Aufgabe [Transfer of psychological findings – a necessary but difficult task]. Psychologische Rundschau, 60, 257–258.

Spiel, C., Salmivalli, C., & Smith, P. K. (2011a). Translational research: National strategies for violence prevention in school. International Journal of Behavioral Development, 35, 381–382. doi:10.1177/0165025411407556.

Spiel, C., Schober, B., Strohmeier, D., & Finsterwald, M. (2011b). Cooperation among researchers, policy makers, administrators, and practitioners: Challenges and recommendations. ISSBD Bulletin 2011, 2, 11–14.

Spiel, C., Strohmeier, D., Schultes, M.-T., & Burger, C. (2011c). Nachhaltigkeit von Gewaltprävention in Schulen: Erstellung und Erprobung eines Selbstevaluationsinstruments [Sustainable violence prevention in schools: Development and evaluation of a self-evaluation tool for schools]. Vienna: University of Vienna.

Spiel, C., Wagner, P., & Strohmeier, D. (2012). Violence prevention in Austrian schools: Implementation and evaluation of a national strategy. International Journal of Conflict and Violence, 6, 176–186.

Spoth, R., Rohrbach, L. A., Greenberg, M., Leaf, P., Brown, C. H., Fagan, A., . . . Hawkins, J. D. (2013). Addressing core challenges for the next generation of type 2 translation research and systems: The Translation Science to Population Impact (TSci Impact) Framework. Prevention Science, 14, 319–351. doi:10.1007/s11121-012-0362-6.

Stokes, D. E. (1997). Pasteur’s quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.

Strohmeier, D., Hoffmann, C., Schiller, E.-M., Stefanek, E., & Spiel, C. (2012). ViSC Social Competence Program. New Directions for Youth Development, 133, 71–80. doi:10.1002/yd.20008.

Ttofi, M. M., & Farrington, D. P. (2009). What works in preventing bullying: Effective elements of anti-bullying programmes. Journal of Aggression, Conflict and Peace Research, 1, 13–24.

Ttofi, M. M., & Farrington, D. P. (2011). Effectiveness of school-based programs to reduce bullying: A systematic and meta-analytic review. Journal of Experimental Criminology, 7, 27–56. doi:10.1007/s11292-010-9109-1.
DEBATE Open Access
“There is nothing so practical as a good theory”: a pragmatic guide for selecting theoretical approaches for implementation projects
Elizabeth A. Lynch1*, Alison Mudge2, Sarah Knowles3, Alison L. Kitson4, Sarah C. Hunter4 and Gill Harvey1
Abstract
Background: A multitude of theories, models and frameworks
relating to implementing evidence-based practice in
health care exist, which can be overwhelming for clinicians and
clinical researchers new to the field of implementation
science. Clinicians often bear responsibility for implementation,
but may be unfamiliar with theoretical approaches
designed to inform or understand implementation.
Main text: In this article, a multidisciplinary group of clinicians
and health service researchers present a pragmatic
guide to help clinicians and clinical researchers understand
what implementation theories, models and frameworks
are; how a theoretical approach to implementation might be
used; and some prompts to consider when selecting a
theoretical approach for an implementation project. Ten
commonly used and highly cited theoretical approaches are
presented, none of which have been utilised to their full
potential in the literature to date. Specifically, theoretical
approaches tend to be applied retrospectively to evaluate or
interpret findings from a completed implementation
project, rather than being used to plan and design theory-
informed implementation strategies which would intuitively
have a greater likelihood of success. We emphasise that there is
no right or wrong way of selecting a theoretical
approach, but encourage clinicians to carefully consider the
project’s purpose, scope and available data and resources
to allow them to select an approach that is most likely to
“value-add” to the implementation project.
Conclusion: By assisting clinicians and clinical researchers to
become confident in selecting and applying theoretical
approaches to implementation, we anticipate an increase in
theory-informed implementation projects. This then will
contribute to more nuanced advice on how to address evidence-
practice gaps and ultimately to contribute to better
health outcomes.
Keywords: Evidence-based practice, Implementation,
Knowledge translation, Theory-informed
Background
Clinicians and clinical researchers usually have expert
knowledge about evidence-based interventions for differ-
ent clinical conditions. While some health professionals
may have experience of implementing evidence-based
interventions in their own practice or overseeing a
change in practice by health professionals directly under
their supervision, many are not familiar or confident
with the current evidence regarding how to effectively,
efficiently and sustainably implement evidence-based in-
terventions into routine clinical practice.
Implementation science has been defined as “the scien-
tific study of methods to promote the systematic uptake of
research findings and other evidence-based practices into
routine practice, and, hence, to improve the quality and
effectiveness of health services” [1], and recognises that
strong evidence alone is not sufficient to change practice.
Researchers from a multitude of backgrounds have pro-
posed different approaches to predicting, guiding and
explaining how evidence is implemented, drawing on
* Correspondence: [email protected]
1Adelaide Nursing School, Faculty of Health and Medical
Sciences, The
University of Adelaide, Adelaide, South Australia 5000,
Australia
Full list of author information is available at the end of the
article
© The Author(s). 2018 Open Access This article is distributed
under the terms of the Creative Commons Attribution 4.0
International License
(http://creativecommons.org/licenses/by/4.0/), which permits
unrestricted use, distribution, and
reproduction in any medium, provided you give appropriate
credit to the original author(s) and the source, provide a link to
the Creative Commons license, and indicate if changes were
made. The Creative Commons Public Domain Dedication waiver
(http://creativecommons.org/publicdomain/zero/1.0/) applies to
the data made available in this article, unless otherwise stated.
Lynch et al. BMC Health Services Research (2018) 18:857. https://doi.org/10.1186/s12913-018-3671-z
practical experience, observation, empirical study and the
development and synthesis of an eclectic range of theories
about individual, group and organisational change. This
has resulted in a plethora of implementation frameworks,
models, and theories; recent data suggest more than 100
theoretical approaches are being used by implementation
researchers [2].
Using established implementation theories, models or frameworks can help implementation researchers by contributing new and more nuanced knowledge about how and why implementation succeeds or fails [3]. For clinicians, these theories, models and frameworks can be applied so that new initiatives are planned, implemented and evaluated more systematically, which may enhance the success, sustainability and scalability of a project. When we move from implicit assumptions about how implementation works to thinking that is made explicit and structured through an established theoretical approach, we may become more objective and more creative in planning, guiding and evaluating.
Despite the growing recognition of the need to use
theory to inform implementation programs [3], many
clinicians and clinical researchers are unfamiliar with
theories of implementation and behaviour change. For
instance, in Australia in 2012, the majority of medical,
nursing and allied health professionals who had success-
fully applied for National Health and Medical Research
Council (NHMRC) Translating Research Into Practice
(TRIP) fellowships had no previous experience using im-
plementation theories or frameworks [4].
While previous manuscripts about implementation theories, models and frameworks help researchers with a good background in implementation science to make sense of the different theoretical approaches available (for example, by providing a taxonomy to distinguish between different categories of implementation theories, models and frameworks) [3], it is important that information about how and why to select and apply theoretical approaches is also made accessible to frontline health professionals who strive to provide the best quality, evidence-based care to their patients.
Throughout this manuscript, we will caution the
reader that this article cannot provide a simple “paint by
numbers” approach to implementation. It would be
counter-productive to try to create an algorithm that
would capture planning and implementing behaviour
change across multiple innovations in the complex adap-
tive systems of modern healthcare. Rather, we encourage
thoughtful assessment and responsiveness to the particu-
lar circumstances of each implementation—the change
to be introduced, the proposed way to make the change,
the people who need to be involved, and the setting in
which the change happens. Our intention is to provide
accessible guidance based on our collective clinical and
research experience, to assist clinicians and clinical re-
searchers in their endeavours to design and conduct
more effective implementation projects to improve clin-
ical care, and to design more robust evaluations to ad-
vance the empirical evidence for this emerging science.
Therefore, the aims of this paper are to:
• Demystify some of the jargon through defining the nature and role of frameworks, models and theories
• Describe and compare commonly used theoretical approaches, and how they are applied in practice
• Suggest how to select a theoretical approach based on the purpose, scope and context of the individual project
The suggestions made in this debate paper are derived
from our experiential knowledge as a multidisciplinary
group of clinicians and health service researchers with
interest and experience in implementation science in different international settings. EAL was a hospital-based physiotherapist for 14 years and is now an early-career researcher; AM is a practising general physician and mid-career researcher; SK has a psychology background and is a researcher in a National Institute of Health Research Collaboration for Leadership in Applied Health Research and Care; SCH (psychology background) is an early-career researcher; and GH and ALK have backgrounds in nursing and are senior researchers with extensive experience in improvement and implementation
research. We work collaboratively with front-line clini-
cians and run workshops to help clinicians and clinical
researchers apply theory to improve the success and sus-
tainability of implementation projects. Through this
work we recognise the need to assist clinicians and clin-
ical researchers to navigate this complex landscape.
We encourage the reader to consider our recommen-
dations as prompts or guides rather than definitive pre-
scriptions. We have found that a more systematic
understanding of implementation processes tends to be
generated when different people (with different back-
grounds and different approaches to implementation)
share their experiences. Therefore, we have framed this
paper using the experiences of author AM to illustrate
the ways that clinicians can engage with utilising imple-
mentation theories, models and frameworks.
Case study part 1 (experience of author AM): I am a
physician and I plan to implement a delirium
prevention program. I have some implementation
experience and know that it won’t be easy. I have
heard about implementation science, so I hope there
may be tools to help me.
I understand a bit about Knowledge to Action (KTA), so I will use it to guide my planning. I have strong evidence of
effectiveness and cost-effectiveness [knowledge cre-
ation & synthesis], and there are established clinical
practice guidelines [knowledge tools/products]. There
is an effective model to implement delirium prevention developed in the USA (http://www.hospitalelderlifeprogram.org), but it used skilled geriatric nurses
and large numbers of trained volunteers, which is not
feasible in my hospital. None of the strategies in the
guidelines are “hard” but they just don’t seem to get
done consistently. I need to find out from staff and
patients why this is the case, and then try to find ways
to support them. Perhaps they need more education
or reminders, or maybe we can reallocate the tasks to
make it easier? Or are there strategies I am not familiar with? Whatever I do, I want to measure better care
in some way to keep my boss happy and the staff in-
terested. And my previous projects have tended to fiz-
zle out over time… KTA gives me part of a plan but I
need some more tools to know how to take the next
steps.
Main text
Defining frameworks, models and theories
Some researchers have delineated between frameworks,
models and theories, whereas other researchers use
these terms interchangeably. In general, implementation
frameworks, models and theories are cognitive tools that
can assist a researcher or implementer to plan, predict,
guide or evaluate the process of implementing evidence
into practice.
Generally (for more detail, refer to the cited reference) [5]:
• A framework lists the basic structure and components underlying a system or concept. Examples of typical frameworks are the Consolidated Framework for Implementation Research (CFIR) [6], the Theoretical Domains Framework (TDF) [7, 8], RE-AIM [14, 17] and Promoting Action on Research Implementation in Health Services (PARIHS) [9, 10].
• A model is a simplified representation of a system or concept with specified assumptions. An example of a model is the Knowledge to Action (KTA) cycle [11].
• A theory may be explanatory or predictive, and underpins hypotheses and assumptions about how implementation activities should occur. An example of a theory is Normalization Process Theory (NPT) [12].
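The three-way distinction above can be sketched as a small data model. This is purely illustrative: the class names (`ApproachKind`, `TheoreticalApproach`) are ours, and the categorisation simply restates the examples given in the bullets; it is not part of any published taxonomy.

```python
from dataclasses import dataclass
from enum import Enum

class ApproachKind(Enum):
    """The three categories distinguished in the bullets above."""
    FRAMEWORK = "lists the structure and components of a system or concept"
    MODEL = "a simplified representation with specified assumptions"
    THEORY = "explanatory or predictive; underpins hypotheses"

@dataclass
class TheoreticalApproach:
    name: str
    kind: ApproachKind

# The examples named in the text above, tagged with their category
examples = [
    TheoreticalApproach("CFIR", ApproachKind.FRAMEWORK),
    TheoreticalApproach("TDF", ApproachKind.FRAMEWORK),
    TheoreticalApproach("RE-AIM", ApproachKind.FRAMEWORK),
    TheoreticalApproach("PARIHS", ApproachKind.FRAMEWORK),
    TheoreticalApproach("KTA", ApproachKind.MODEL),
    TheoreticalApproach("NPT", ApproachKind.THEORY),
]

# Filtering by category recovers the framework examples
frameworks = [a.name for a in examples if a.kind is ApproachKind.FRAMEWORK]
print(frameworks)  # ['CFIR', 'TDF', 'RE-AIM', 'PARIHS']
```

The point of the model is only that each named approach falls into exactly one of the three categories, even though, as the next paragraph notes, many authors use the terms interchangeably.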
In our experience, clinicians and clinical researchers want to know which implementation approach will help them and their project best. For many clinicians and clinical researchers we talk to, navigating the rapidly expanding number of implementation theories, frameworks and models is completely daunting, made worse by unfamiliarity with the language used and by inconsistencies in nomenclature. To avoid compounding this problem, we will refer to frameworks, models and theories collectively as “theoretical approaches”, and help our readers focus on which theoretical approach best suits the purpose, scope and context of their implementation project.
Theoretical approaches help to shape how we think,
which is why they are important. However, most of the
time we are not aware of the underlying theories or frame-
works we use. In implementation science, theoretical ap-
proaches have been developed for different purposes, with
different intended users, and are often underpinned by different philosophical perspectives (see Table 1). For instance, some have been designed to assist implementation researchers and enhance the quality of implementation research [6], to support improvement and practice development in clinical settings [9], to understand factors influencing the implementation of evidence in particular health service settings [13], or to ensure comprehensive evaluation or reporting of an implementation program [14]. Some are based on the underlying assumption that implementation is rational and predictable when relevant factors are accounted for [7, 8]; in contrast, others are built on the assumption that implementation is unpredictable and that ongoing monitoring, flexibility and adaptation are required [9, 10]. Some have been designed iteratively, based on the developers' experience of implementing evidence in real-world settings [9, 15, 16], whereas others have been developed systematically through reviewing and synthesising published literature [6, 7, 11]. And finally, some but not all theoretical approaches have been tested, validated and/or adapted over time [8, 10, 17].
Commonly used theoretical approaches
Two articles were published in 2017 which presented the
most commonly used dissemination and implementation
research frameworks cited in academic publications [18]
and the theories most commonly used by implementation
scientists [2]. For pragmatic reasons (acknowledging the
systematic approach taken by authors of both manu-
scripts), we used these two articles to guide the selection
of theoretical approaches for discussion in this paper. We
included the ten theoretical approaches that were within
the top 15 on both lists (i.e. both highly cited in the
literature and commonly used in implementation practice)
[6–9, 11–16, 19]. These are presented in Table 1. We do
not infer that these are the best or only theoretical ap-
proaches that should be used in implementation projects;
simply that they are the most commonly used.
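The inclusion rule described above (an approach qualifies only if it sits in the top 15 of both published lists) is, in effect, a set intersection. The sketch below illustrates the idea; the list contents are hypothetical placeholders, not the actual rankings from the two 2017 articles.

```python
# Hypothetical excerpts from the two 2017 lists; the real rankings
# appear in the cited articles, and these entries are placeholders.
most_cited_in_literature = {
    "KTA", "RE-AIM", "CFIR", "PARIHS", "TDF", "NPT",
    "Diffusion of Innovations",  # placeholder: on this list only
}
most_used_by_researchers = {
    "KTA", "RE-AIM", "CFIR", "PARIHS", "TDF", "NPT",
    "COM-B",                     # placeholder: on this list only
}

# An approach is selected only if it appears on BOTH lists
selected = sorted(most_cited_in_literature & most_used_by_researchers)
print(selected)  # ['CFIR', 'KTA', 'NPT', 'PARIHS', 'RE-AIM', 'TDF']
```

Approaches appearing on only one list (here the two placeholder entries) are excluded, which is exactly how the ten approaches in Table 1 were chosen from the fifteen on each list.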
Table 1 Summary of ten commonly applied theoretical
approaches to implementation
Knowledge to Action [11]
Purpose (as described by authors) A framework to conceptualise
the process of knowledge translation which integrates the roles
of knowledge
creation and knowledge application. Provides conceptual clarity
by offering a framework to elucidate the key elements of the
knowledge
translation process
Brief description: This approach provides an overview to help
guide and understand how knowledge is created and
synthesised, and tools (like
clinical guidelines) are developed, then how these tools are
applied in clinical settings through tailoring and adaptation,
implementation,
monitoring and sustaining. Assumes that action plans will be
realised (underpinned by assumption that actions are rational).
Takes a systems
approach – recognises that knowledge producers and users are
situated within a larger social system
How developed: Developed by reviewing literature of > 30
planned action theories, identified common elements. Added to
planned action model
a knowledge creation process and labelled the combined models
the knowledge to action cycle.
Changes/developments over time: No
Ease of use: clear and easy to understand, intuitive. No specific
guidance on how to do each step of the action cycle but
provides some guidance
on important elements to consider.
Additional resources: no specific resources currently available
on how to action each step of cycle
Theoretical Domains Framework (TDF) [7, 8]
Purpose (as described by authors): An integrative theoretical
framework, developed for cross-disciplinary implementation
and other behaviour
change research to assess implementation and other behavioural
problems and to inform intervention design.
Brief description: provides a holistic list of factors that
influence behaviour – application of TDF can give researcher
confidence that factors
influencing an individual’s behaviour will be identified, which
in turn can identify factors that need to be addressed in order
for behaviour change
to occur (i.e. can be used to inform behaviour change strategy
development/selection). Can be used in conjunction with
Behaviour Change Wheel
to develop and deliver behaviour change strategy
How developed: through an expert consensus process and
synthesis of 33 theories and 128 key theoretical constructs
related to behaviour
change.
Changes/developments over time: Validity was investigated by
behavioural experts sorting theoretical constructs using closed
and open sort tasks.
Validation study demonstrated good support for the basic
structure of the TDF and led to refinements, leading to
publication of new iteration of
framework in 2012
Ease of use: Quite straightforward to apply, can be time
consuming to use for analysis – potential to overwhelm novice
researcher given the 14
domains and 84 component constructs. COM-B and Behaviour
Change Wheel work together with TDF.
Additional resources: interview guides provided in publications
[7, 30] assist ease of data collection and illustrate domains.
Subject of thematic
series in Implementation Science journal, guide to use of TDF
published 2017 [31].
RE-AIM framework [14, 17]
Purpose (as described by authors): Originally developed as a
framework to guide consistent reporting of evaluations
regarding the public health
impact of health promotion interventions, thereby providing a
framework for determining what programs are worth sustained
investment and for
identifying those that work in real-world environments.
Brief description: Reporting checklist for public health
interventions (what patient groups are receiving intervention,
have patient outcomes
changed, what health professionals/ health professional groups
are providing intervention, are they delivering intervention as
intended, will the
program be sustained in the long term) to evaluate real world
impact. Can be used when designing or evaluating a public
health intervention.
How developed: Through inductive thinking building on results
of previous research
Changes/developments over time: “E” was initially efficacy
[14], then effectiveness [17]
Ease of use: Easy, interventions can be rated on the five
dimensions, providing a score. Some of the reporting points (in
particular Reach and
Adoption) are not being interpreted and reported as developers
intended
Additional resources: dedicated website with online tools,
examples [33]
Consolidated Framework for Implementation Research [6]
Purpose (as described by authors): Framework to promote
implementation theory development and verification about what
works, where and
why.
Brief description: list of factors (5 domains and 37 constructs)
that can influence an implementation project, can be used in
planning or in
evaluation stages (does not guide how to implement). Research
focus in contrast to doing/practitioner focus
How developed: Published theories which sought to facilitate
translation of research findings into practice in the healthcare
sector were reviewed.
Team identified constructs that had evidence that they
influenced implementation and could be measured. Some
constructs were streamlined
and combined, whereas other constructs were separated and
delineated.
Changes/developments over time: No
Ease of use: Clear, but may be difficult to digest language if
new to area of implementation science
Additional resources: dedicated website that provides examples,
templates and tools to assist in developing and evaluating
implementation
projects, collecting and analysing data [28]
Conceptual model of evidence-based practice implementation in
public service sectors [15]
Purpose (as described by authors): A multi-level, four phase
model of the implementation process that can be used in public
service sectors.
Brief description: Conceptual model of factors that can
influence implementation in the unique context of public sector
services (focus on role of
service delivery organisations and the services in which they
operate) at each of the 4 implementation stages: Exploration,
Adoption/Preparation,
Implementation, Sustainment (EPIS). Explicitly recognises that
different variables play crucial roles at different points in the
implementation process.
Does not provide guidance on how to move through different
stages of implementation.
How developed: based on literature and authors’ experience of
public service sectors, funded by the National Institute of
Mental Health
Changes/developments over time: No
Ease of use: Little clarity on how to operationalise different
factors, potential to be confusing for those unfamiliar with
implementation
Additional resources: California Evidence-Based Clearinghouse for Child Welfare have developed webinars regarding use of the EPIS framework. Freely available from http://www.cebc4cw.org/implementing-programs/tools/epis/
Conceptual model of implementation research [19]
Purpose (as described by authors) a heuristic skeleton model for
the study of implementation processes in mental health services,
identifying the
implications for research and training.
Brief description: Guides how implementation research can be
organised, how it fits/aligns with evidence-based practices. May
be useful for
complete novice who needs clarity between clinical
interventions, implementation strategies, and working through
how to measure clinical and
implementation effectiveness. Various theories can be placed
upon the model to help explain aspects of the broader
phenomena.
How developed: drawn from 3 extant frameworks: stage
pipeline model, multi-level models of change and models of
health service use.
Changes/developments over time: No
Ease of use: Clear and easy to understand
Additional resources: No
Implementation effectiveness model [16]
Purpose (as described by authors): an integrative model to
capture and clarify the multidetermined, multilevel phenomenon
of innovation
implementation
Brief description: A list of constructs that can influence
implementation effectiveness, based on the premise that
implementation effectiveness is a
function of an organisation’s climate for implementing a given
innovation and the targeted organisational members’
perceptions of the fit of the
innovation to their values. Does not provide specific guidance
for how to implement, was not designed specifically for the
context of health care.
Likely to be most useful for projects with a clear organisational
approach.
How developed: from authors’ personal experience with
reference to literature
Changes/developments over time: No
Ease of use: Main manuscript very wordy (text-based).
Concepts are clear.
Additional resources: No
Promoting Action on Research Implementation in Health
Services (PARIHS) [9, 10]
Purpose (as described by authors): Organisational or conceptual
framework to help explain and predict successful
implementation of evidence into
practice and to understand the complexities involved.
Brief description: Conceptualises how evidence can be
successfully implemented in health care settings using the
process of facilitation.
Underlying premise is that facilitation will enable people to
apply evidence in their local setting, which is situated within a
broader organisational
and societal context. Framework strives to capture the
complexities involved in implementation, so most useful in
more complex projects.
How developed: from authors’ experience working as
facilitators and researchers on quality improvement activities
and health service research
projects.
Changes/developments over time: Has had several iterations
since first publication in 1998 in response to findings from
empirical testing. Revised
to integrated or i-PARIHS framework in 2015 [10]
Ease of use: Does not operationalise its constructs, so may be
difficult for novice to understand and apply, particularly when
not being supported
by expert facilitator. Facilitator’s toolkit easy to apply to
conduct pre- and post-implementation evaluation. For people
experienced in implementa-
tion, framework provides guidance on all of the things to
consider when implementation is complex.
Additional resources: Facilitator’s Toolkit in book associated
with 2015 iteration of PARIHS guides user through how to
assess, facilitate and
evaluate [32]
Interactive Systems Framework [13]
Purpose (as described by authors): Heuristic to help clarify the issues related to how to move what is known about prevention (particularly prevention of youth violence and child maltreatment) into more widespread use.
Brief description: Framework regarding translating findings from prevention research to clinical practice. The framework comprises three systems: the Innovation Synthesis and Translation System (which distils information about innovations and translates it into user-friendly formats); the Innovation Support System (which provides training, technical assistance or other support to users in the field); and the Innovation Delivery System (which implements innovations in the world of practice).
How developed: Collaborative development of the framework by Division of Violence Prevention staff members, university faculty and graduate students, with input from practitioners, researchers, and funders.
Changes/developments over time: No
Ease of use: Easy to understand; no clear guidance available regarding how to apply the framework
Additional resources: No
Normalization Process Model, Normalization Process Theory (NPT) [12]
Purpose (as described by authors): provides a conceptual framework for understanding and evaluating the processes by which new health technologies and other complex interventions are routinely operationalized in everyday work, and sustained in practice.
Brief description: NPT is an Action Theory, which means that it is concerned with explaining what people do rather than their attitudes or beliefs. It proposes that for successful sustained use: individuals and groups must work collectively to implement the intervention; the work of implementation occurs via four particular processes; and continuous investment, carried forward in space and time, is required. Can be helpful to understand and evaluate how new health technologies and complex interventions are routinely operationalised and sustained in practice. Not designed to guide implementation.
How developed: In iterations, based on the experiences of the authors. Initially, the developers mapped the elements of embedding processes and developed the concept of normalization. Next, a robust applied theoretical model of Collective Action was produced and applied to trials, government processes and healthcare systems. The final stage focused on building a middle-range theory that explains how material practices become routinely embedded in their social contexts.
Changes/developments over time: Through its focus on being a theory, the authors continually refine and test NPT to ensure its validity. More recently, NPT has been extended towards a more general theory of implementation.
Ease of use: Easy to apply with use of specifically developed resources
Additional resources: Dedicated website with toolkit and examples. Interactive toolkit can be used to plan a project or analyse data [29].
Of note, there are similarities across the theoretical approaches. All consider the ‘new evidence’ to be implemented; the ‘context’ where the evidence will be introduced; the ‘agents or actors’ who will use or apply the new evidence; and the ‘mechanisms’ or processes that actually make the changes happen. Mechanisms can be people, such as change champions, knowledge brokers, opinion leaders, project managers or facilitators; processes, such as new protocols, education sessions or audit and feedback cycles; or a combination of both.
It is important to acknowledge that there is no universally agreed-upon theory of successful implementation, nor empirical evidence about the relative advantages of one theoretical approach over another. While this may be frustrating to people new to the area of implementation science, the number of viable theoretical approaches offers clinicians an opportunity to “think outside the box”, and highlights the importance of clarifying what they are seeking to know or change through their project, and then being strategic in selecting a suitable theoretical approach.
Case study part 2 (experience of author AM): So it is clear that I will need to adapt principles and protocols from successful programs in the USA to my local context. But how do I know what the context is? Top picks on Google Scholar for “context assessment implementation science” seem to be the Consolidated Framework for Implementation Research (CFIR) and Promoting Action on Research Implementation in Health Services (PARIHS). Both have nice guides that suggest useful questions to ask. There seems to be quite a lot of overlap, although I am drawn to PARIHS because from my experience I know that someone will need to spend time on the ward and build trust before we start to ask questions and introduce change. I suspect this ‘facilitator’ will be a critical role for the complex intervention because there are several behaviours to change.
When I look up “behaviour change implementation science”, the Theoretical Domains Framework (TDF) dominates. Like CFIR and PARIHS, there are a lot of elements, but I can see that they would be helpful for planning or analysing surveys and interviews with patients and staff to clarify what motivates, helps and hinders them. I do feel worried about how I will collect and analyse so much data across the several different groups involved in my project.
And once we have a thorough understanding of the context, staff and patients, how will I select strategies? And if it does work, how long will it take until the “new” becomes “normal” so that I can move on to the next problem? My colleague tells me that Normalization Process Theory (NPT) is a useful way to think about this, and I am impressed with the Normalisation of Complex Interventions-Measure Development (NoMAD) tool I find on their website; I can see how I could adapt it to find out whether staff feel the changes are embedded.
So where do I go from here? Do I frame the whole
project with KTA, assess context with CFIR, assess
the patient and staff views with TDF, adopt
facilitation as the central element from PARIHS,
and then look at how well it has gone using NPT?
Am I being thorough or theoretically sloppy? They
all look sensible, but I am not sure how to use any
of them and I am worried it is going to add a
whole lot of work to a complicated project.
How the theoretical approaches have been used
Recent work investigating how theoretical approaches
are selected suggests that theories are not selected to
suit the purpose of the implementation research; rather
theoretical approaches that are familiar to researchers
tend to be used, regardless of the aim of the project [2].
To explore how theoretical approaches have been ap-
plied in the literature to date, we searched for and iden-
tified review papers for 6 of our 10 theoretical
approaches: KTA [20], the Reach, Effectiveness, Adop-
tion, Implementation, and Maintenance Framework
(RE-AIM) [21], CFIR [22], NPT [23], PARIHS [24] and
TDF [25]. (For details of these review papers, see Add-
itional file 1: Table S1).
The overall message from these reviews is that theor-
etical approaches are not being utilised to their full po-
tential. Despite the fact that many approaches have been
developed to prospectively design and plan implementa-
tion strategies, they are almost overwhelmingly applied
retrospectively to evaluate or interpret findings from a
completed implementation project [22–24]. Further, the
components of the theoretical approaches (such as cod-
ing systems or reporting dimensions) tend not to be ap-
plied consistently, with some users selecting to apply
only particular theoretical components [21, 22], or ap-
plying components in different ways than the developers
intended [21].
These findings again suggest that there is no agreed “best” – or even “easiest” – theory to apply, and that even implementation researchers may need to take a pragmatic approach to the use of theory in complex real-world projects. This reflects the relative immaturity of the field, but it also provides opportunities for clinical and research partners to contribute to advancing our knowledge of how theory is selected and adapted for practical use. To support thoughtful use of theory, we offer some practical guidance on how to choose which theory to use, and how to use it effectively to support the implementation project, based on our experience and interactions with clinical and academic staff new to implementation.
How do you select the theoretical approach for your
implementation project?
Research reporting guidelines for implementation stud-
ies, including Standards for Reporting Implementation
Studies (STaRI) [26] and Template for Intervention De-
scription and Replication (TIDieR) [27] specify that the
rationale and theory underpinning implementation strat-
egies should be reported. Some researchers also recom-
mend that the reason for selecting a particular
theoretical approach should be justified [2, 24].
We acknowledge that there will always be multiple
questions that could be posed for each implementation
project, each of which could be approached from differ-
ent theoretical perspectives. Below we provide some
prompts to assist clinicians and clinical researchers to
select a theoretical approach that can value-add to dif-
ferent implementation projects, rather than simply citing
a theoretical approach in order to meet a reporting
guideline. Clinicians tend to be pragmatic: they are generally not motivated by concepts like theoretical purity; they just want things to work. We emphasise that there
is an “art” to selecting and applying theoretical ap-
proaches – these prompts need to be applied and con-
sidered alongside a clinician or researcher’s experience
and skill and the nuances of the implementation project.
In our experience, clinicians are often anxious that
they will select the ‘wrong’ theory. We reiterate that
there is no precise formula for choosing a theoretical approach. One important thing to consider in theory selection is the goodness-of-fit, which is determined by each study's needs and aims, rather than there being a ‘wrong’ choice. The following are suggested questions that could be considered to identify which theoretical approach is particularly appropriate or useful for different implementation projects (see Fig. 1).
Fig. 1 Five questions to help select a theoretical approach
Who are you working with?

Are you working with individuals who have complete autonomy, are you working with a team, or are you working with an entire health service? Almost all implementation studies will inevitably touch on different organisational levels (micro-, meso- and macro-level implementation), so consider the fit of the theoretical approaches to the organisational level where your project is positioned, and whether more than one approach is required to guide implementation at different levels. Some approaches are particularly concerned with individual experiences or behaviours (for example, TDF), others with group interaction or collective working (for example, NPT; Klein), and others encompass the broader contextual factors impacting across a wider setting or service (for example, PARIHS).
When in the process are you going to use theory?

The stage of the implementation project may be another factor guiding the selection of a theoretical approach. Some approaches lend themselves particularly to the design and planning of an implementation strategy (for example, Exploration, Preparation, Implementation and Sustainment (EPIS); Proctor; Interconnected Systems Framework (ISF); TDF in conjunction with the Behaviour Change Wheel), others to tracking the development of a project (for example, KTA), and others to planning an evaluation and defining outcome measures to assess implementation success (for example, RE-AIM).
Why are you applying a theory?

The aims and intended outcomes of each study should be considered, as different theoretical approaches provide different 'pay-offs' in terms of the understanding gained. Different theoretical approaches can be used to measure achievement of a specific change (for example, TDF; ISF), to generate a better understanding of barriers and facilitators to inform implementation approaches (for example, PARIHS; CFIR), to develop knowledge about an ongoing implementation process (for example, KTA; Proctor), or to provide a framework of relevant implementation outcomes (for example, CFIR; RE-AIM).
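The pairings suggested in the three questions above can be summarised as a simple lookup. The framework names are those cited in the text; the mapping itself and the `shortlist` helper are an illustrative simplification for discussion, not part of the authors' guidance, and real selection would weigh project nuances the way the prompts describe.

```python
from collections import Counter

# Candidate theoretical approaches per question, as paired in the text.
# The answer keys ("individual", "planning", ...) are our own shorthand.
CANDIDATES = {
    # Who are you working with? (organisational level)
    "who": {
        "individual": ["TDF"],
        "team": ["NPT", "Klein"],
        "service": ["PARIHS"],
    },
    # When in the process are you going to use theory?
    "when": {
        "planning": ["EPIS", "Proctor", "ISF", "TDF + Behaviour Change Wheel"],
        "tracking": ["KTA"],
        "evaluation": ["RE-AIM"],
    },
    # Why are you applying a theory?
    "why": {
        "measure change": ["TDF", "ISF"],
        "understand barriers": ["PARIHS", "CFIR"],
        "understand process": ["KTA", "Proctor"],
        "define outcomes": ["CFIR", "RE-AIM"],
    },
}

def shortlist(who, when, why):
    """Return approaches suggested by more than one answer; if none
    overlap, return every candidate, sorted, for manual review."""
    counts = Counter()
    for question, answer in (("who", who), ("when", when), ("why", why)):
        for name in CANDIDATES[question][answer]:
            counts[name] += 1
    overlapping = [name for name, n in counts.most_common() if n > 1]
    return overlapping or sorted(counts)

print(shortlist("individual", "planning", "measure change"))  # ['TDF', 'ISF']
```

For instance, an individual-level project at the planning stage that aims to measure a specific change points twice to TDF and twice to ISF, which matches the text's pairing of TDF with individual behaviour and of TDF/ISF with measuring change.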
How will you collect data?

Choice of theoretical approach may also be informed by what data will be available for analysis. Although ideally data collection would be designed with a particular approach in mind, so that data are collected to answer the questions of interest, we are aware that in practice clinicians need to work with the resources that are available to them. For example, clinicians may have access to routinely collected outcome data which could be evaluated using the constructs in RE-AIM, but these same

More Related Content

Similar to Implementing Intervention Research into PublicPolicy—the BI3

Adding New Dimensions To Case Study Evaluations The Case Of Evaluating Compr...
Adding New Dimensions To Case Study Evaluations  The Case Of Evaluating Compr...Adding New Dimensions To Case Study Evaluations  The Case Of Evaluating Compr...
Adding New Dimensions To Case Study Evaluations The Case Of Evaluating Compr...Scott Faria
 
Respond to the colleagues post detailed below, with a discussio.docx
Respond to the colleagues post detailed below, with a discussio.docxRespond to the colleagues post detailed below, with a discussio.docx
Respond to the colleagues post detailed below, with a discussio.docxpeggyd2
 
Open educational practice dimensions
Open educational practice dimensionsOpen educational practice dimensions
Open educational practice dimensionsgrainne
 
A virtual environment for formulation of policy packages
A virtual environment for formulation of policy packagesA virtual environment for formulation of policy packages
A virtual environment for formulation of policy packagesAraz Taeihagh
 
SOCW 6311 WK 1 responses Respond to at least two colleagues .docx
SOCW 6311 WK 1 responses Respond to at least two colleagues .docxSOCW 6311 WK 1 responses Respond to at least two colleagues .docx
SOCW 6311 WK 1 responses Respond to at least two colleagues .docxsamuel699872
 
External Validity and Policy AdaptationFrom Impact Evalua.docx
External Validity and Policy AdaptationFrom Impact Evalua.docxExternal Validity and Policy AdaptationFrom Impact Evalua.docx
External Validity and Policy AdaptationFrom Impact Evalua.docxmecklenburgstrelitzh
 
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...INTRODUCTION - Global Health Workshop: Methods for implementation science in ...
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...valéry ridde
 
COMM141 Essential English Skills.docx
COMM141 Essential English Skills.docxCOMM141 Essential English Skills.docx
COMM141 Essential English Skills.docxwrite31
 
Effective Strategy in Local & Regional Development
Effective Strategy in Local & Regional DevelopmentEffective Strategy in Local & Regional Development
Effective Strategy in Local & Regional DevelopmentScott Hutcheson, Ph.D.
 
Action Research A Remedy To Overcome The Gap Between Theory And Practice
Action Research  A Remedy To Overcome The Gap Between  Theory  And  PracticeAction Research  A Remedy To Overcome The Gap Between  Theory  And  Practice
Action Research A Remedy To Overcome The Gap Between Theory And PracticeAngie Miller
 
In 1,250-1,500 words, discuss the implementation plan for your evi
In 1,250-1,500 words, discuss the implementation plan for your eviIn 1,250-1,500 words, discuss the implementation plan for your evi
In 1,250-1,500 words, discuss the implementation plan for your eviLizbethQuinonez813
 
Professional inquiry is one of the most important aspects.pdf
Professional inquiry is one of the most important aspects.pdfProfessional inquiry is one of the most important aspects.pdf
Professional inquiry is one of the most important aspects.pdfsdfghj21
 
10 1108 jwam-09-2019-0027
10 1108 jwam-09-2019-002710 1108 jwam-09-2019-0027
10 1108 jwam-09-2019-0027kamilHussain15
 
Siklus Kebijakan Publik Implementasi
Siklus Kebijakan Publik ImplementasiSiklus Kebijakan Publik Implementasi
Siklus Kebijakan Publik Implementasikuliaaah
 
The role of monitoring and evaluation in six South African reading programmes...
The role of monitoring and evaluation in six South African reading programmes...The role of monitoring and evaluation in six South African reading programmes...
The role of monitoring and evaluation in six South African reading programmes...Mlungisi Zuma
 
Studying implementation 2017
Studying implementation 2017Studying implementation 2017
Studying implementation 2017Martha Seife
 

Similar to Implementing Intervention Research into PublicPolicy—the BI3 (20)

Adding New Dimensions To Case Study Evaluations The Case Of Evaluating Compr...
Adding New Dimensions To Case Study Evaluations  The Case Of Evaluating Compr...Adding New Dimensions To Case Study Evaluations  The Case Of Evaluating Compr...
Adding New Dimensions To Case Study Evaluations The Case Of Evaluating Compr...
 
Respond to the colleagues post detailed below, with a discussio.docx
Respond to the colleagues post detailed below, with a discussio.docxRespond to the colleagues post detailed below, with a discussio.docx
Respond to the colleagues post detailed below, with a discussio.docx
 
Open educational practice dimensions
Open educational practice dimensionsOpen educational practice dimensions
Open educational practice dimensions
 
A virtual environment for formulation of policy packages
A virtual environment for formulation of policy packagesA virtual environment for formulation of policy packages
A virtual environment for formulation of policy packages
 
SOCW 6311 WK 1 responses Respond to at least two colleagues .docx
SOCW 6311 WK 1 responses Respond to at least two colleagues .docxSOCW 6311 WK 1 responses Respond to at least two colleagues .docx
SOCW 6311 WK 1 responses Respond to at least two colleagues .docx
 
External Validity and Policy AdaptationFrom Impact Evalua.docx
External Validity and Policy AdaptationFrom Impact Evalua.docxExternal Validity and Policy AdaptationFrom Impact Evalua.docx
External Validity and Policy AdaptationFrom Impact Evalua.docx
 
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...INTRODUCTION - Global Health Workshop: Methods for implementation science in ...
INTRODUCTION - Global Health Workshop: Methods for implementation science in ...
 
COMM141 Essential English Skills.docx
COMM141 Essential English Skills.docxCOMM141 Essential English Skills.docx
COMM141 Essential English Skills.docx
 
art-RintoP
art-RintoPart-RintoP
art-RintoP
 
Effective Strategy in Local & Regional Development
Effective Strategy in Local & Regional DevelopmentEffective Strategy in Local & Regional Development
Effective Strategy in Local & Regional Development
 
Action Research A Remedy To Overcome The Gap Between Theory And Practice
Action Research  A Remedy To Overcome The Gap Between  Theory  And  PracticeAction Research  A Remedy To Overcome The Gap Between  Theory  And  Practice
Action Research A Remedy To Overcome The Gap Between Theory And Practice
 
In 1,250-1,500 words, discuss the implementation plan for your evi
In 1,250-1,500 words, discuss the implementation plan for your eviIn 1,250-1,500 words, discuss the implementation plan for your evi
In 1,250-1,500 words, discuss the implementation plan for your evi
 
Policy Capacity
Policy CapacityPolicy Capacity
Policy Capacity
 
Research uptake and impact — Project brief
Research uptake and impact — Project briefResearch uptake and impact — Project brief
Research uptake and impact — Project brief
 
Professional inquiry is one of the most important aspects.pdf
Professional inquiry is one of the most important aspects.pdfProfessional inquiry is one of the most important aspects.pdf
Professional inquiry is one of the most important aspects.pdf
 
10 1108 jwam-09-2019-0027
10 1108 jwam-09-2019-002710 1108 jwam-09-2019-0027
10 1108 jwam-09-2019-0027
 
Siklus Kebijakan Publik Implementasi
Siklus Kebijakan Publik ImplementasiSiklus Kebijakan Publik Implementasi
Siklus Kebijakan Publik Implementasi
 
The role of monitoring and evaluation in six South African reading programmes...
The role of monitoring and evaluation in six South African reading programmes...The role of monitoring and evaluation in six South African reading programmes...
The role of monitoring and evaluation in six South African reading programmes...
 
Hefcereport gb
Hefcereport gbHefcereport gb
Hefcereport gb
 
Studying implementation 2017
Studying implementation 2017Studying implementation 2017
Studying implementation 2017
 

More from MalikPinckney86

Find a recent merger or acquisition that has been announced in the.docx
Find a recent merger or acquisition that has been announced in the.docxFind a recent merger or acquisition that has been announced in the.docx
Find a recent merger or acquisition that has been announced in the.docxMalikPinckney86
 
Find an example of a document that misuses graphics. This can be a d.docx
Find an example of a document that misuses graphics. This can be a d.docxFind an example of a document that misuses graphics. This can be a d.docx
Find an example of a document that misuses graphics. This can be a d.docxMalikPinckney86
 
Find a scholarly research study from the Ashford University Library .docx
Find a scholarly research study from the Ashford University Library .docxFind a scholarly research study from the Ashford University Library .docx
Find a scholarly research study from the Ashford University Library .docxMalikPinckney86
 
Find a work of visual art, architecture, or literature from either A.docx
Find a work of visual art, architecture, or literature from either A.docxFind a work of visual art, architecture, or literature from either A.docx
Find a work of visual art, architecture, or literature from either A.docxMalikPinckney86
 
Find a real-life” example of one of the following institutions. Exa.docx
Find a real-life” example of one of the following institutions. Exa.docxFind a real-life” example of one of the following institutions. Exa.docx
Find a real-life” example of one of the following institutions. Exa.docxMalikPinckney86
 
Find a listing of expenses by diagnosis or by procedure. The source .docx
Find a listing of expenses by diagnosis or by procedure. The source .docxFind a listing of expenses by diagnosis or by procedure. The source .docx
Find a listing of expenses by diagnosis or by procedure. The source .docxMalikPinckney86
 
Financial Reporting Problem  and spreedsheet exercise.This is an.docx
Financial Reporting Problem  and spreedsheet exercise.This is an.docxFinancial Reporting Problem  and spreedsheet exercise.This is an.docx
Financial Reporting Problem  and spreedsheet exercise.This is an.docxMalikPinckney86
 
Find a Cybersecurity-related current event that happned THIS WEEK, a.docx
Find a Cybersecurity-related current event that happned THIS WEEK, a.docxFind a Cybersecurity-related current event that happned THIS WEEK, a.docx
Find a Cybersecurity-related current event that happned THIS WEEK, a.docxMalikPinckney86
 
Financing Health Care in a Time of Insurance Restructuring Pleas.docx
Financing Health Care in a Time of Insurance Restructuring Pleas.docxFinancing Health Care in a Time of Insurance Restructuring Pleas.docx
Financing Health Care in a Time of Insurance Restructuring Pleas.docxMalikPinckney86
 
Financing International Trade Please respond to the followingCom.docx
Financing International Trade Please respond to the followingCom.docxFinancing International Trade Please respond to the followingCom.docx
Financing International Trade Please respond to the followingCom.docxMalikPinckney86
 
Financial Statement Analysis and DisclosuresDiscuss the import.docx
Financial Statement Analysis and DisclosuresDiscuss the import.docxFinancial Statement Analysis and DisclosuresDiscuss the import.docx
Financial Statement Analysis and DisclosuresDiscuss the import.docxMalikPinckney86
 
Financial Ratios what are the limitations of financial ratios  .docx
Financial Ratios what are the limitations of financial ratios  .docxFinancial Ratios what are the limitations of financial ratios  .docx
Financial Ratios what are the limitations of financial ratios  .docxMalikPinckney86
 
Financial mangers make decisions today that will affect the firm i.docx
Financial mangers make decisions today that will affect the firm i.docxFinancial mangers make decisions today that will affect the firm i.docx
Financial mangers make decisions today that will affect the firm i.docxMalikPinckney86
 
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docx
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docxFinancial Laws and RegulationsComplete an APA formatted 2 page pap.docx
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docxMalikPinckney86
 
Financial Management DiscussionWhen reviewing the financial st.docx
Financial Management DiscussionWhen reviewing the financial st.docxFinancial Management DiscussionWhen reviewing the financial st.docx
Financial Management DiscussionWhen reviewing the financial st.docxMalikPinckney86
 
Final Written Art Project (500 words) carefully and creatively wri.docx
Final Written Art Project (500 words) carefully and creatively wri.docxFinal Written Art Project (500 words) carefully and creatively wri.docx
Final Written Art Project (500 words) carefully and creatively wri.docxMalikPinckney86
 
Final Research Paper Research the responsibility of a critical t.docx
Final Research Paper Research the responsibility of a critical t.docxFinal Research Paper Research the responsibility of a critical t.docx
Final Research Paper Research the responsibility of a critical t.docxMalikPinckney86
 
Financial management homeworkUnit III Financial Planning, .docx
Financial management homeworkUnit III Financial Planning, .docxFinancial management homeworkUnit III Financial Planning, .docx
Financial management homeworkUnit III Financial Planning, .docxMalikPinckney86
 
Final ProjectThe Final Project should demonstrate an understanding.docx
Final ProjectThe Final Project should demonstrate an understanding.docxFinal ProjectThe Final Project should demonstrate an understanding.docx
Final ProjectThe Final Project should demonstrate an understanding.docxMalikPinckney86
 
Final ProjectImagine that you work for a health department and hav.docx
Final ProjectImagine that you work for a health department and hav.docxFinal ProjectImagine that you work for a health department and hav.docx
Final ProjectImagine that you work for a health department and hav.docxMalikPinckney86
 

More from MalikPinckney86 (20)

Find a recent merger or acquisition that has been announced in the.docx
Find a recent merger or acquisition that has been announced in the.docxFind a recent merger or acquisition that has been announced in the.docx
Find a recent merger or acquisition that has been announced in the.docx
 
Find an example of a document that misuses graphics. This can be a d.docx
Find an example of a document that misuses graphics. This can be a d.docxFind an example of a document that misuses graphics. This can be a d.docx
Find an example of a document that misuses graphics. This can be a d.docx
 
Find a scholarly research study from the Ashford University Library .docx
Find a scholarly research study from the Ashford University Library .docxFind a scholarly research study from the Ashford University Library .docx
Find a scholarly research study from the Ashford University Library .docx
 
Find a work of visual art, architecture, or literature from either A.docx
Find a work of visual art, architecture, or literature from either A.docxFind a work of visual art, architecture, or literature from either A.docx
Find a work of visual art, architecture, or literature from either A.docx
 
Find a real-life” example of one of the following institutions. Exa.docx
Find a real-life” example of one of the following institutions. Exa.docxFind a real-life” example of one of the following institutions. Exa.docx
Find a real-life” example of one of the following institutions. Exa.docx
 
Find a listing of expenses by diagnosis or by procedure. The source .docx
Find a listing of expenses by diagnosis or by procedure. The source .docxFind a listing of expenses by diagnosis or by procedure. The source .docx
Find a listing of expenses by diagnosis or by procedure. The source .docx
 
Financial Reporting Problem  and spreedsheet exercise.This is an.docx
Financial Reporting Problem  and spreedsheet exercise.This is an.docxFinancial Reporting Problem  and spreedsheet exercise.This is an.docx
Financial Reporting Problem  and spreedsheet exercise.This is an.docx
 
Find a Cybersecurity-related current event that happned THIS WEEK, a.docx
Find a Cybersecurity-related current event that happned THIS WEEK, a.docxFind a Cybersecurity-related current event that happned THIS WEEK, a.docx
Find a Cybersecurity-related current event that happned THIS WEEK, a.docx
 
Financing Health Care in a Time of Insurance Restructuring Pleas.docx
Financing Health Care in a Time of Insurance Restructuring Pleas.docxFinancing Health Care in a Time of Insurance Restructuring Pleas.docx
Financing Health Care in a Time of Insurance Restructuring Pleas.docx
 
Financing International Trade Please respond to the followingCom.docx
Financing International Trade Please respond to the followingCom.docxFinancing International Trade Please respond to the followingCom.docx
Financing International Trade Please respond to the followingCom.docx
 
Financial Statement Analysis and DisclosuresDiscuss the import.docx
Financial Statement Analysis and DisclosuresDiscuss the import.docxFinancial Statement Analysis and DisclosuresDiscuss the import.docx
Financial Statement Analysis and DisclosuresDiscuss the import.docx
 
Financial Ratios what are the limitations of financial ratios  .docx
Financial Ratios what are the limitations of financial ratios  .docxFinancial Ratios what are the limitations of financial ratios  .docx
Financial Ratios what are the limitations of financial ratios  .docx
 
Financial mangers make decisions today that will affect the firm i.docx
Financial mangers make decisions today that will affect the firm i.docxFinancial mangers make decisions today that will affect the firm i.docx
Financial mangers make decisions today that will affect the firm i.docx
 
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docx
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docxFinancial Laws and RegulationsComplete an APA formatted 2 page pap.docx
Financial Laws and RegulationsComplete an APA formatted 2 page pap.docx
 
Financial Management DiscussionWhen reviewing the financial st.docx
Financial Management DiscussionWhen reviewing the financial st.docxFinancial Management DiscussionWhen reviewing the financial st.docx
Financial Management DiscussionWhen reviewing the financial st.docx
 
Final Written Art Project (500 words) carefully and creatively wri.docx
Final Written Art Project (500 words) carefully and creatively wri.docxFinal Written Art Project (500 words) carefully and creatively wri.docx
Final Written Art Project (500 words) carefully and creatively wri.docx
 
Final Research Paper Research the responsibility of a critical t.docx
Final Research Paper Research the responsibility of a critical t.docxFinal Research Paper Research the responsibility of a critical t.docx
Final Research Paper Research the responsibility of a critical t.docx
 
Financial management homeworkUnit III Financial Planning, .docx
Financial management homeworkUnit III Financial Planning, .docxFinancial management homeworkUnit III Financial Planning, .docx
Financial management homeworkUnit III Financial Planning, .docx
 
Final ProjectThe Final Project should demonstrate an understanding.docx
Final ProjectThe Final Project should demonstrate an understanding.docxFinal ProjectThe Final Project should demonstrate an understanding.docx
Final ProjectThe Final Project should demonstrate an understanding.docx
 
Final ProjectImagine that you work for a health department and hav.docx
Final ProjectImagine that you work for a health department and hav.docxFinal ProjectImagine that you work for a health department and hav.docx
Final ProjectImagine that you work for a health department and hav.docx
 

Recently uploaded

Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxChelloAnnAsuncion2
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.arsicmarija21
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........Atmosphere science 7 quarter 4 .........
Atmosphere science 7 quarter 4 .........LeaCamillePacle
 
Planning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxPlanning a health career 4th Quarter.pptx
Planning a health career 4th Quarter.pptxLigayaBacuel1
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 

Recently uploaded (20)

Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.AmericanHighSchoolsprezentacijaoskolama.
AmericanHighSchoolsprezentacijaoskolama.
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
Judging the Relevance and worth of ideas part 2.pptx
Implementing Intervention Research into Public Policy—the "I3-Approach"
Policy

Evidence-based intervention programs in educational contexts have become highly important in recent years. However, transferring these programs into practice and into the wider field of public policy often fails (Fixsen et al. 2013). As a consequence, the field of implementation research has emerged (Rossi and Wright 1984; Ogden and Fixsen 2014). In recent years, a growing body of implementation research has indicated that an active, long-term, multilevel implementation approach is far more effective than passive forms of dissemination (Ogden and Fixsen 2014). Within the field of implementation research, several theoretical bases and models—implementation frameworks—have been developed (Meyers et al. 2012).

However, intervention research and implementation research have not yet been systematically connected, and different traditions and research groups are involved. Implementation researchers are mostly given mandates by politicians to take on the implementation of already existing interventions. Moreover, implementation research remains rather isolated and is sometimes considered to be less scientifically valuable than research that develops new interventions (Fixsen et al. 2011). This might be one of the key reasons
why there are still many problems in translating programs into widespread community practice (Spoth et al. 2013).

* Christiane Spiel, [email protected]
1 Faculty of Psychology, University of Vienna, Universitaetsstraße 7, 1010 Vienna, Austria
2 School of Applied Health and Social Sciences, University of Applied Sciences Upper Austria, Linz, Austria

Prev Sci (2018) 19:337–346. DOI 10.1007/s11121-016-0638-3

In this paper, we argue for a systematic Integration of Intervention and Implementation research ("I3-Approach"). That means researchers design and develop intervention programs based on a field-oriented and participative approach from the very beginning (according to the concept of use-inspired basic research, Stokes 1997; see also Spiel 2009a). This is not only a matter of transferring a program to practitioners at the end of the research process; the whole conceptualization of an intervention as well as its evaluation and implementation should systematically consider the needs of the field (Spiel et al. 2011b) in an integrated way (Beelmann and Karing 2014). Consequently, the perspective of all stakeholders should be included (Shonkoff 2000). Based on theoretical considerations we drew from the literature and our experiences with intervention and implementation research, we summarized the most relevant actions to be taken and issues to be considered on the part of researchers into a systematic six-step procedure (PASCIT) in order to propose such a systematic connection between intervention research and implementation research. We expect that such a connection would increase the probability of sustainably implementing evidence-based intervention programs into public policy.

How this systematic connection between intervention and implementation research can be realized is illustrated by means of the ViSC Social Competence Program. The main goal of the ViSC program is to reduce aggression and bullying and to foster social and intercultural competencies. It was embedded in a national strategy for violence prevention. For sustainable implementation, a cascaded train-the-trainer (researcher-multipliers-teachers-students) model was developed and applied. Advantages and limitations of the six-step procedure are also discussed for the ViSC program as well as from a general point of view at the end of the article.

Theoretical Background

In recent decades, the evidence-based movement has gained greatly in impact, especially in Anglo-American contexts (Kratochwill and Shernoff 2003). Efforts have been made to make better use of research-based prevention and intervention programs in human service areas such as medicine, employment, child welfare, health, and juvenile justice (Fixsen et al. 2009; Spiel 2009a). As part of this evidence-based movement, various efforts have been made to define standards of evidence. For example, the Society for Prevention Research (Flay et al. 2005) has provided standards to assist practitioners, policy makers, and administrators in determining which interventions are efficacious, which are effective, and which are ready for dissemination (for details, see Flay et al. 2005). Other standards are provided by, for instance, the What Works Clearinghouse (see www.whatworks.ed.gov), the Best Evidence Encyclopedia (see www.bestevidence.org), the Campbell Collaboration (see www.campbellcollaboration.org), and the UK-based Evidence for Policy and Practice Information and Co-ordinating Centre (see www.eppi.ioe.ac.uk). Common to these standards is the fact that evidence-based programs are defined by the research methodology used to evaluate them, and randomized trials are defined as the gold standard for defining evidence (Fixsen et al. 2009).

However, there are considerable differences in the uptake of research findings among public service areas and scientific disciplines (Nutley et al. 2007). Particularly in the field of education, there has not been extensive implementation of the body of evidence-based research, and the adoption of prevention and intervention programs is driven more by ideology than by evidence (Forman et al. 2013; Slavin 2008; Spiel 2009a, b). Despite differences among public service fields and countries, it has become obvious that the evidence-based movement has not provided the intended benefits to consumers and communities (Fixsen et al. 2009), and implementing these programs into practice and in the wider range of public policy has often failed (Fixsen et al. 2009, 2013). There is large-scale agreement about one of the central reasons for this disappointment: program evaluation has not historically included any mention or systematic study of implementation (Meyers et al. 2012). As a consequence, the field of implementation research has emerged (Rossi and Wright 1984). In recent years, a growing body of implementation research has indicated that an active, long-term, multilevel implementation approach (= mission-driven focus) is far more effective than passive forms of dissemination without any active involvement of practitioners (Ogden and Fixsen 2014; Spiel et al. 2012).

Fixsen et al. (2005) defined implementation as the "specific set of activities designed to put into practice an activity or program of known dimensions" (p. 5; a similar definition is provided by Forman et al. 2013). Consequently, implementation science has been defined as "the scientific study of methods to promote the systemic uptake of research findings and evidence-based practices into professional practice and public policy" (Forman et al. 2013, p. 80; see also Eccles and Mittman 2006). In the last decade, many implementation studies have been conducted and several conceptual models and implementation frameworks have been presented. The review provided by Meyers et al. (2012) covers 25 frameworks. They found 14 dimensions that were common to many of these frameworks and grouped them into six areas: (a) assessment strategies, (b) decisions about adaptation, (c) capacity-building strategies, (d) creating a structure for implementation, (e) ongoing implementation support strategies, and (f) improving future applications. According to their synthesis, the implementation process consists of a temporal series of these interrelated steps, which are critical to quality implementation.
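The temporal ordering of the six areas can be sketched as a simple checklist. This is purely illustrative: the area names follow the Meyers et al. (2012) synthesis as quoted above, but the data structure and helper function are our own illustration, not part of any published framework.

```python
# The six framework areas synthesized by Meyers et al. (2012),
# listed in their temporal order. Illustrative sketch only.
QUALITY_IMPLEMENTATION_AREAS = [
    "assessment strategies",
    "decisions about adaptation",
    "capacity-building strategies",
    "creating a structure for implementation",
    "ongoing implementation support strategies",
    "improving future applications",
]

def next_area(completed):
    """Return the first area not yet addressed, or None once all
    six areas have been worked through."""
    for area in QUALITY_IMPLEMENTATION_AREAS:
        if area not in completed:
            return area
    return None
```

For example, a team that has finished its assessment and adaptation decisions would be pointed to capacity building as the next step in the series.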
Despite these efforts within the field of implementation science, there is agreement among researchers that the empirical support for evidence-based implementation is insufficient (Ogden and Fixsen 2014). Although there is a large body of empirical evidence on the importance of implementation and growing knowledge of the contextual factors that can influence implementation, knowledge of how to increase the likelihood of quality implementation is still needed (Meyers et al. 2012). Moreover, intervention research and implementation research have not yet been systematically connected. Forman et al. (2013) explicitly pointed out the differences between intervention and implementation activities. While intervention activity refers to the provision of a prevention or intervention program to clients and consists (in the field of school) of a group leader conducting the program with targeted students, implementation activity refers to actions taken in the organizational setting to ensure that the intervention delivered to clients is complete and appropriate. Consequently, different research groups with different research traditions are usually involved in the two tasks. Implementation researchers are mostly given mandates by politicians to take on the implementation of already existing interventions. Moreover, implementation research is seen as less exciting than research that develops new interventions (Fixsen et al. 2011). Furthermore, implementation research is very difficult to do within the constraints of university research environments (e.g., due to time or financial constraints) and is sometimes even considered to be less scientifically valuable (Fixsen et al. 2011; Spiel 2009b).

Therefore, we suggest a systematic integration of intervention and implementation research. Consequently, researchers should design and develop intervention programs using a field-oriented and participative approach (according to the concept of use-inspired basic research, Stokes 1997; see also Spiel 2009a). The whole conceptualization of an intervention as well as its evaluation and implementation should systematically consider the needs of the field (e.g., Spiel et al. 2011a, b, c), and intervention, evaluation, and implementation should be developed in an integrative way. In order to realize this and to avoid as many presumable risks as possible, the perspective of all stakeholders should be included (Shonkoff 2000). In particular, the perspective of policymakers has to be included (Spiel et al. 2011a, b, c; Davies 2004), as well as analyses of what factors support or hinder evidence-based policy (Davies 2004, 2012). In the next section, we propose an approach for the goal-oriented integration of intervention and implementation research.

PASCIT—Six Steps of an "I3-Research" (Integrative Intervention and Implementation Research) Approach

Combining theoretical and empirical knowledge from prior research (e.g., Glasgow et al. 1999; Greenhalgh et al. 2004) with our own experience in intervention and implementation research and with the arguments and desiderates described above, we consider at least six steps as constitutive components of an integrative approach to intervention and implementation research. Although these steps mostly occur in succession, some of them might be performed simultaneously. The six steps together must be considered as parts of a dynamic process with many sub-processes, feedback loops, and interdependencies.

Step 1: Mission-driven Problem recognition (P)

Researchers are used to identifying scientific problems and desiderates for new insights. However, in the case of an integrative approach to intervention/prevention and implementation research, the focus is not primarily on problems arising in basic research but on (mostly social) problems in society. Consequently, in order to identify such problems and in order to be generally willing to take action, researchers must not only be curiosity-driven but also mission-driven, combining the quest for fundamental understanding with a consideration of practical use (Stokes 1997). In other words, if scientists intend to contribute to this field of research, the first step requires socio-political responsibility as a basic mindset.

Step 2: Ensuring Availability of robust knowledge on how to handle the problem (A)

The availability of robust and sound scientific knowledge and evidence is a fundamental precondition for working on an identified problem. Moreover, it is a prerequisite for any kind of transfer (Spiel et al. 2009). Consequently, researchers have to be experts in the relevant field with excellent knowledge of theory, methods, empirical findings, and limitations. This also includes the political dimension of research in the sense of defining and financing corresponding research topics.

Step 3: Identification of reasonable Starting points for action (S)

The identification of a problem and the availability of relevant insights for initiating changes are not enough if one does not succeed in identifying promising starting points for interventions and their implementation. This must be emphasized, as a wide body of research has made it clear that many intervention programs and measures do not work everywhere and at all times (Meyers et al. 2012). Here again, a necessary condition is high expertise in the relevant scientific field. However, this alone will not do. It must be combined with a differentiated view of prevailing cultural and political conditions. Researchers need knowledge and experience in the relevant practical field and its contextual conditions. This also includes knowledge about potential problems and limitations.

Step 4: Establishment of a Cooperation process with policymakers (C)

This step is a very crucial one for several reasons. Successful development and implementation of evidence-based intervention in practical settings involves various stakeholders and requires cooperation, persistence, time, and money. In order to conduct integrative intervention and implementation research, stable alliances with the relevant policymakers are necessary, which many researchers traditionally have not established. However, as research often follows its own, very intrinsic logic, which clearly differs from political thinking, a very deliberate process of establishing cooperation and building alliances is necessary. Among other things, this includes more awareness of policymakers' scope of action. Researchers have to consider that there are other influences on government and policy, beyond evidence. These include values, beliefs, and ideology, which are the driving forces of many political processes (Davies 2004, 2012). Habits and traditions are also important and cannot be changed quickly. Furthermore, researchers' experience, expertise, and judgment influence policymaking; media, lobbyists, and pressure groups also play a role. Researchers have to keep in mind that policymaking is highly embedded in a bureaucratic culture and is forced to respond quickly to everyday contingencies. Last but not least, as resources are limited, policymaking is always a matter of what works at what costs and with what outcomes (Davies 2004, 2012). Consequently, researchers have to find ways to integrate evidence with all these factors. However, again, this step also addresses a certain basic attitude of researchers: it requires that researchers make their voice heard, and sometimes they have to be very insistent.

Step 5: Coordinated development of Intervention and Implementation (I)

This step is the centerpiece and a long process rather than a step. The coordinated development has to be performed in a theory-driven, ecological, collaborative, and participatory way. This means that researchers have to include the perspectives of all relevant stakeholders (practitioners, policymakers, government officials, public servants, and communities) in this development process, communicate in the language of these diverse stakeholders, and meet them as equals. Therefore, researchers again have to consider parameters for their research work that differ from many traditional approaches: working together right from the beginning is not common in many fields and also requires new conceptions of, e.g., research planning (regarding things like the duration of project phases; see Meyers et al. 2012). Here, the big challenge is to find a balance between wide participation and the maintenance of scientific criteria and standards of evidence, as well as between the freedom of science and research on demand. Consequently, researchers must have theoretical knowledge and practical experience in their very specific field of expertise, but the required profile for a successful "integrative intervention and implementation researcher" is much wider.

Step 6: Transfer of program implementation (T)
For this final scale-up step, several models and guidelines have been proposed by implementation science (see "Theoretical Background"). Therefore, we will only make a reference to them (e.g., Fixsen et al. 2013; Meyers et al. 2012; Spoth et al. 2013).

In sum, none of the PASCIT steps refer to a completely new consideration or demand (see also Glasgow et al. 1999, who developed the RE-AIM framework for the evaluation of public health interventions), but they have to be seen as one coherent approach. The sound, consistent integration of intervention and implementation research with the goal of introducing changes to policy also requires a (re)differentiation of our scientific identity and the creation of a new, wider job description for researchers in this field.

To illustrate the application of the PASCIT steps of the "I3-Approach", the following sections introduce the ViSC school program, which seeks to reduce aggressive behavior and bullying and to foster social and intercultural competencies.

The ViSC School Program—Fostering Social and Intercultural Competencies

Step 1: Mission-driven Problem recognition (P)

Since 1997, violence in schools has gained widespread public attention in Austria, and a number of reports in the media have addressed this topic. As a consequence, a joint statement issued by four federal ministers declared the government's intention to set up initiatives to prevent violence in several social domains. The government provided financial support for a number of individual intervention and prevention projects in schools. However, as shown in a national report (Atria and Spiel 2003), most of the initiatives taken to prevent violence in schools were organized by individual teachers; researchers were not involved in the planning and organization of these projects. Therefore, these projects and programs were not theoretically based, project goals were imprecisely formulated, and programs were rarely documented and evaluated. In sum, they were a far cry from aligning with the standards of evidence.

Step 2: Ensuring Availability of robust knowledge on how to handle the problem (A)

Many prevention and intervention programs have been developed by researchers to take on violence at school. They have been evaluated in numerous efficacy and effectiveness trials (e.g., Ferguson et al. 2007; Fox et al. 2012; Smith et al. 2004; Ttofi and Farrington 2009, 2011), and many resources of national institutions have been invested into the implementation of research-based programs in several countries (e.g., Kennedy 1997; Nutley et al. 2007; Petrosino et al. 2001; Shonkoff 2000). It has become apparent that there are at least two key features necessary for program success: (a) to approach the school as a whole and to incorporate a systemic perspective and (b) to conduct activities at school level, class level, and individual level (Smith et al. 2004; Ttofi and Farrington 2009, 2011). However, the deployment of research findings in applied settings has remained slow and incomplete (e.g., Slavin 2002, 2008). It turns out that national strategies actively supported by governments are needed for sustainable violence prevention (Olweus 2004; Roland 2000; Spiel et al. 2011a, b, c).
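The second key feature noted above, conducting activities at the school, class, and individual levels, amounts to a simple coverage requirement. The sketch below is purely illustrative; the activity names are hypothetical examples we invented for the illustration, not items taken from any specific program.

```python
# Illustrative sketch: check that a planned set of activities spans
# the three levels identified for program success (school, class,
# individual). Activity names below are hypothetical examples.
REQUIRED_LEVELS = {"school", "class", "individual"}

def is_whole_school(activities):
    """True if the (name, level) pairs cover all three required levels."""
    return REQUIRED_LEVELS <= {level for _, level in activities}

plan = [
    ("staff training on bullying dynamics", "school"),
    ("class project units", "class"),
    ("structured talks with bullies and victims", "individual"),
]
```

A plan missing any one of the three levels would fail the check, mirroring the finding that programs addressing only some levels are less likely to succeed.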
Step 3: Identification of reasonable Starting points for action (S)

Consequently, the best means for dealing with violence in Austrian schools was the development of a national strategy with policy and advocacy as important pillars (Roland 2000). Violence prevention programs which take into account the key factors identified for success (Smith et al. 2004; Ttofi and Farrington 2011), comply with the standards of evidence (e.g., Flay et al. 2005), and consider cultural and situational conditions (Datnow 2002, 2005; Shonkoff 2000) should be conceptualized as central parts of this national strategy.

Step 4: Establishment of a Cooperation process with policymakers (C)

Therefore, we tried to convince officials at the Federal Ministry for Education and the Minister herself of the need for a strategy at a national level by advocating for evidence-based programs and explaining the advantages of a strategic plan as opposed to individual initiatives. Based on several previous projects with the Federal Ministry for Education, we had established an open and honest line of communication.

At the beginning of 2007, in the wake of a quick succession of dramatic, well-publicized incidents in schools and the public discussion of the high rates of bullying and victimization in Austria that followed reports of the results of the Health Behaviours in School-aged Children (HBSC) survey (Craig and Harel 2004), we received a mandate from the Federal Ministry for Education to develop a national strategy for violence prevention in the Austrian public school system. In preparing the national strategy, we had to cope with the challenge that there was no existing system of collaboration among various stakeholders actively involved in violence prevention and intervention (e.g., school psychologists, social workers, teacher unions) and the lack of knowledge concerning scientific standards. It was therefore our intention to systematically integrate the perspectives of these stakeholder groups in strategy development and to specifically consider the application of theory-based and empirically evaluated prevention and intervention programs (Spiel and Strohmeier 2011).

We developed the strategy between January and November 2007 in a continuous dialogue with officials responsible for this issue at the Federal Ministry of Education and in intensive exchange with international colleagues who had been involved in similar national strategies in their own countries (for details, see Spiel and Strohmeier 2011). In December 2007, the Federal Minister decided to implement the strategy and presented this decision and the strategy plan in a major press conference. For strategy management and implementation, a steering committee was established at the Federal Ministry, with Christiane Spiel as an external member responsible for research issues. In 2008, the national strategy became part of the coalition agreement between the two governing parties and was designed to last up to the end of the legislation period in September 2013. As a consequence, money was devoted to the national strategy and the activities within the strategy. The officers of the Federal Ministry and the Federal Minister herself were very much committed to the national strategy and very keen on getting positive results. The implementation of the strategy continues during the legislation period from 2013 to 2018 (for details about the national strategy and its development, see Spiel and Strohmeier 2011, 2012; Spiel et al. 2012).
As one part of the national strategy, it was possible to expand the so-called ViSC class project (Atria and Spiel 2007), which had previously been developed and applied in rather controlled contexts, into a school-wide program, to develop an implementation strategy for Austrian schools, and to conduct a large-scale evaluation study.
Step 5: Coordinated development of Intervention and Implementation (I)

In accordance with the national strategy, the main goals of the ViSC program are to reduce aggressive behavior and bullying as well as to foster social and intercultural competencies in schools (Strohmeier et al. 2012). The ViSC program was designed to focus on the school as a whole and to incorporate a systemic perspective. The prevention of aggression and bullying was defined as a comprehensive school development project over the duration of an entire school year. Activities were designed to operate on three different levels: the school as a whole, the classrooms, and the individual (for details, see Spiel and Strohmeier 2011; Strohmeier et al. 2012). In an in-school training for all teachers, basic knowledge on bully-victim behavior and its development was presented. Based on this shared knowledge, the school principal and all of the school's teachers were encouraged to jointly develop (a) a shared definition of aggression and bullying, (b) shared principles on how to handle aggression and bullying, and (c) commonly agreed-upon measures to sustainably reduce aggression and bullying on the school level. Furthermore, teachers were trained to conduct talks with bullies, victims, and their parents in accordance with shared, standardized procedures, in reaction to critical incidents.

One unique feature of the ViSC program concept is that it includes a theory-based project at the class level, which was developed in recognition of the importance of the class context for the prevalence of bullying and victimization. The ViSC class project consists of 13 training units (for details, see Atria and Spiel 2007; Spiel and Strohmeier 2011; Strohmeier et al. 2012). Taking the culture of Austrian schools into account, the ViSC class project is well structured but open for adaptation in terms of materials and activities used. Before expanding the ViSC class project to the ViSC school program, it had been implemented four times by different researchers in Austrian and German schools. Summative and formative program evaluations confirmed promising results (Atria and Spiel 2007; Gollwitzer et al. 2006, 2007). In particular, a high implementation quality could be found. Out of a total of 52 training units (4 classes × 13 units), only one training unit was deemed not to be in accordance with the project goals. Twenty-nine units were structured exactly according to the training manual, and 22 units were in accordance with the project goals but used adapted materials (Gollwitzer et al. 2006).

The ViSC program's implementation model was developed concurrently by the same researchers who developed the ViSC program. It needed to take the context and culture of the Austrian school system as well as the concrete situation at each specific school into account (Datnow 2002, 2005). Therefore, and to avoid overburdening teachers and principals, we developed a cascaded train-the-trainer model for the implementation of the ViSC program: researchers train multipliers, multipliers train teachers, and teachers train their students. To train multipliers—known as ViSC coaches—ViSC courses were offered by the program developers. The ViSC courses consisted of three workshops—mostly held at the University of Vienna—and the implementation of the ViSC program in one school. ViSC coaches were required to hold in-school trainings for the entire staff at their assigned school and to supervise and coach them throughout the implementation process. The
  • 18. ViSC coaches also held an additional in-school training for those teachers who planned to conduct the ViSC class project and offered them three supervision units during the implementation of the class project. Step 6: Transfer of program implementation (T) The primary target groups for recruiting ViSC coaches are teachers at educational universities and psychologists. From 2008 to 2014, 55 coaches were trained. To support schools in implementing the ViSC program, many materials were pro- vided on a website, which was presented and explained to teachers by the ViSC coaches. Furthermore, a manual was created to serve as a guide for teachers. In 2009/10, the ViSC program was intensively evaluated in terms of implementation quality and effectiveness. All sec- ondary schools located in the capital city of Austria were invited to participate in the ViSC program. Out of 155 second- ary schools in Vienna, 34 schools applied for participation, out of which 26 schools fulfilled the necessary requirements (participation of the whole school staff in the program, providing time resources for all ViSC trainings, taking part in the evaluation study; Gradinger et al. 2014; Strohmeier et al. 2012). Applying a randomized intervention-control group design, 13 schools were randomly assigned to the inter - vention group. Out of the remaining 13 schools, only five agreed to serve as control schools. Data from 2042 students (1377 in the intervention group, 665 in the control group) from grades 5 to 7 (47.3 % girls), mean age 11.7 years (SD=0.9), and attending 105 classes were collected at four points in time. In addition, 338 teachers participated in a sur- vey at the pre-testing and post-testing points. Our first analyses focused on the short-term effectiveness of the program with respect to aggressive behavior and vic-
  • 19. timization (Strohmeier et al. 2012). A multiple group latent change score model (LCS) to compare the control and inter - vention group was applied, with gender and age as covariates. The multiple group LCS model, imposing strict measurement invariance, fit the data well. Results indicated a decline in aggressive behavior (the latent mean of the aggression change score in the intervention group, M=−0.23, differed signifi - cantly from 0; p= 0.13), but no change in victimization. Boys scored higher on aggression at time 1 and had lower decreases over time. Age did not have any effects (for details, see Strohmeier et al. 2012). A further analysis using the Handling Bullying Questionnaire (HBQ) showed that teachers in the intervention group used more non-punitive strategies to work with bullies and more strategies to support victims compared to teachers in the control group (Strohmeier et al. 2012). We also investigated the short-term effect of the ViSC program on cyberbullying and cybervictimization. Results of a multiple group bivariate LCS model, controlling 342 Prev Sci (2018) 19:337–346 for traditional aggression, traditional victimization, and age showed effectiveness for both cyberbullying (latent d=0.39) and cybervictimization (latent d = 0.29; for details, see Gradinger et al. 2014). The evaluation of the implementation quality of the ViSC program focused on implementation fidelity (the number of conducted units documented by the ViSC coaches) and participant responsiveness (participation rates of the teaching staff; Schultes et al. 2014). There was a high variability for both scores: implementation fi - delity ranged from 0.4 (conduction of 40 % of the pre- scribed training units) to 2.0 (conduction of twice as many
training units as prescribed in the curriculum); participation rates ranged between 30 and 100 %. Multilevel analyses showed that teachers' self-efficacy was significantly more enhanced in schools where the ViSC program had been implemented with high fidelity, and that only teachers with high participant responsiveness significantly changed their behavior in bullying situations (for details, see Schultes et al. 2014). We used these results to adapt the training of the ViSC coaches. In a participatory approach together with coaches and school principals, we worked out which conditions are necessary for implementing the ViSC program with high fidelity and high participant responsiveness. In further analyses, implementation fidelity and participation rates will be considered. Currently, the ViSC program is being implemented in Romania and Cyprus by local researchers. Initial evaluations show promising results.

Our intention was not only to implement a program to prevent violence, but also to enable principals and teachers to assess and interpret violence rates in their schools and classrooms, as well as to evaluate the effectiveness of interventions against violence, without depending on the presence of researchers for data collection, analysis, and interpretation. Therefore, we developed the self-evaluation tool AVEO (Austrian Violence Evaluation Online Tool), which provides information about violence rates from the perspective of both students and teachers (Schultes et al. 2011; Spiel et al. 2011a, b, c; see also Spiel et al. 2012). The teacher and school perspectives were systematically integrated into the development of the self-evaluation tool, and its implementation was carefully evaluated.

Discussion

Although there is a large amount of empirical evidence for the importance of implementation, there is still not enough knowledge available on how to increase the likelihood of quality implementation (Meyers et al. 2012). From our perspective, one reason for this state of insufficiency might be that intervention and implementation research have not been systematically connected up until now. In this paper, we proposed the systematic integration of intervention and implementation research. We presented an integrative intervention and implementation approach (I3-Approach) including a procedure with six consecutive steps (PASCIT), from (1) problem recognition to (6) transfer of program implementation. To illustrate the integration of intervention and implementation, we presented a program from our own research, focusing on violence prevention in schools. In this section, we discuss the strengths and limitations of this example. Finally, problems and challenges of the I3-Approach and the PASCIT procedure are examined on a general level.

For the development and implementation of the ViSC school program, it was very helpful that it was part of a national strategy on violence prevention in the public school system, which we ourselves had also developed. Moreover, from our perspective, the establishment of a national strategy was a prerequisite for the upscaling of the ViSC program, as the necessary implementation capacity and respective organizational structures (Fixsen et al. 2005) had not been established before. The public discussion of the high rates of bullying in Austria and several incisive events in schools raised policymakers' awareness of the issue and gave us a window of opportunity for action. A further important step was that the national strategy became part of the coalition agreement of the governmental parties. This solved the budget problem. The national strategy also supported the implementation of sound measures for sustainability (for details, see Spiel and
Strohmeier 2012; Spiel et al. 2012), and the realization of a randomized controlled trial as a scientific standard was defined within the strategy. To our knowledge, it was the first time that this gold standard was applied in a program financed by the Austrian Federal Ministry for Education. Further strengths of the ViSC program include the fact that adaptation within a defined range is an explicit part of the program (for details, see Strohmeier et al. 2012); the building of organizational capacity through collaboration with, e.g., school psychology; the ViSC coaches training; the implementation concept and its evaluation; as well as the AVEO self-assessment as a feedback mechanism.

However, there are also some limitations. In the ViSC program, we did not work directly with schools, but rather trained ViSC coaches. This resulted in lower commitment of the schools and lower implementation quality in the evaluation study (Schultes et al. 2014), but advantages for long-term implementation in the school system. Nevertheless, we recommend convincing politicians and government officials that the initial implementation of such programs should be done under the supervision of researchers and program developers. A further limitation has to do with the cultural context. In Austria, responsibility and commitment are not well established in the educational system (see Spiel and Strohmeier 2012). Out of the 155 secondary schools in Vienna invited to participate in the ViSC program, only 34 applied for participation. Considering the high bullying rates in Austria (Craig and Harel 2004), which indicate a need for intervention, the participation rate was
low. The low engagement is also evident in the fact that out of the 13 schools we asked to serve as control schools, only 5 agreed to do so. In other countries, e.g., in Finland (Salmivalli et al. 2013), a greater degree of responsibility in the educational system means that participation rates in such programs are usually much higher.

We tried hard to fulfil all the critical steps identified by Meyers et al. (2012) and were successful concerning most of them (see above), but it was not possible to realize our intention in all cases. To prepare the schools for the intervention, we defined criteria for program participation, such as participation of the entire school staff in the program. The school principals were asked to get consent in advance. However, after the start of the program, it became apparent that this consent had not been obtained in several cases. We further recommended that no other programs should be conducted simultaneously, which was also disregarded by some schools. Last but not least, the school principals' leadership was not strong enough. This can be attributed to the Austrian school law and cannot be changed easily. School autonomy has been a topic of political discussion in Austria for many years.

Overall, the ViSC program illustrates that intervention and implementation can be systematically combined in the sense of PASCIT. However, it also illustrates that detailed planning of such projects by researchers is difficult and that limitations on different levels—regarding, e.g., cultural contexts—have to be kept in mind.

But why is the systematic combination of intervention and implementation as proposed by PASCIT so difficult? On the surface, the steps seem self-evident. And what is really new in contrast to existing implementation frameworks and transfer concepts? Obviously, most, if not all, components (both within and across the six steps) of PASCIT are already known and have long been considered in intervention and implementation research. However, the new and demanding challenge is our postulation of bringing them together in an integrative and coordinated way in order to achieve success. The I3-Approach represents a very basic but also a very systematic research concept and is more than merely the sum of its steps; ignoring one aspect changes the whole dynamic. RE-AIM (Glasgow et al. 1999) has recommended such a systematic view for the evaluation of public health measures. However, so far, it has not been implemented comprehensively either. Nevertheless, the validity and convenience of PASCIT have to be proven by future research and programs in different fields and cultural contexts.

Obviously, combining intervention and implementation research is very demanding. Therefore, appropriate acknowledgement in the scientific community is essential. Consequently, individual researchers should not be the only ones engaging in this kind of research; universities also have to include it in their missions. We therefore strongly recommend a discussion of success criteria in academia (Fixsen et al. 2011) and that the social responsibility of academics and universities, respectively, be considered more deeply. The current gratification system in science is more oriented to basic than to applied research. Within the applied field, it is predominantly technology and medicine that are financially supported and acknowledged by the public. Mission-driven research picking up problems in society is less financed and noticed. Consequently, the number of researchers engaged in this field is limited. However, in the last few years, some advances could be observed.
A further problem lies in the availability of knowledge. In the social sciences, particularly in the educational field, it is not easy to obtain robust scientific knowledge; replication studies are rare, and only probabilistic conclusions can be drawn. Here, the development of standards of evidence was of high importance (e.g., Flay et al. 2005). However, the requirements defined in these standards are not as comprehensive as demanded by the I3-Approach. For example, the evidence standards defined by the Society for Prevention Research (Flay et al. 2005) proposed criteria for efficacious and effective interventions and for interventions recognized as ready for broad dissemination. However, they did not combine them with the demands of implementation.

As mentioned earlier, the commitment of policymakers is crucial. Researchers need to have a great deal of persistence and knowledge about policymakers' scope of action. However, in most cases this is not enough. A window of opportunity is also needed, and researchers have to catch it. Here, the media can be supportive (Spiel and Strohmeier 2012).

To sum up, from our perspective, it is a continuous challenge to introduce the criteria for and the advantages of evidence-based practice to politicians, public officials, and practitioners on the one hand, and to promote the recognition of intervention and implementation research in the scientific community on the other hand. The I3-Approach and its PASCIT steps offer one possible procedure for realization. Obviously, other procedures, such as bottom-up approaches, might be possible, especially if implementation capacity (e.g., in the sense of sufficient competent manpower) and respective organizational structures are already established. However, we argue for a systematic, strategic procedure instead of an incidental one.
Acknowledgments Open access funding provided by University of Vienna. This study was funded by grants from the Austrian Federal Ministry for Education, Arts, and Culture. The development of the national strategy for violence prevention and the evaluation of its implementation, as well as the implementation of the ViSC program, were financially supported by the Austrian Federal Ministry for Education, Arts, and Culture. We want to thank the Federal Ministry for their support and the participating teachers for their engagement.

References

Atria, M., & Spiel, C. (2003). The Austrian situation: Many initiatives, few evaluations. In P. Smith (Ed.), Violence in schools: The response in Europe (pp. 83–99). London: RoutledgeFalmer.

Atria, M., & Spiel, C. (2007). Viennese Social Competence (ViSC) training for students: Program and evaluation. In J. E. Zins, M. J. Elias, & C. A. Maher (Eds.), Bullying, victimization, and peer harassment: A handbook of prevention and intervention (pp. 179–197). New York: Haworth Press.

Beelmann, A., & Karing, C. (2014). Implementationsfaktoren und -prozesse in der Präventionsforschung: Strategien, Probleme, Ergebnisse, Perspektiven [Implementation factors and processes in prevention research: Strategies, problems, findings, prospects]. Psychologische Rundschau, 65, 129–139. doi:10.1026/0033-3042/a000215.

Craig, W., & Harel, Y. (2004). Bullying, physical fighting and victimization. In C. Currie (Ed.), Health behaviour in school-aged children: A WHO cross national study (pp. 133–144). Geneva: World Health Organization.

Datnow, A. (2002). Can we transplant educational reform, and does it last? Journal of Educational Change, 3, 215–239. doi:10.1023/A:1021221627854.

Datnow, A. (2005). The sustainability of comprehensive school reform models in changing district and state contexts. Educational Administration Quarterly, 41, 121–153. doi:10.1177/0013161X04269578.

Davies, P. (2004). Is evidence-based government possible? Jerry Lee Lecture 2004. Paper presented at the 4th Annual Campbell Collaboration Colloquium, Washington, DC.

Davies, P. (2012). The state of evidence-based policy evaluation and its role in policy formation. National Institute Economic Review, 219, 41–52. doi:10.1177/002795011221900105.

Eccles, M. P., & Mittman, B. S. (2006). Welcome to implementation science. Implementation Science, 1, 1. doi:10.1186/1748-5908-1-1.

Ferguson, C. J., San Miguel, C., Kilburn, J. C., & Sanchez, P. (2007). The effectiveness of school-based anti-bullying programs. Criminal Justice Review, 32, 401–414.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network (FMHI Publication No. 231). Retrieved from http://nirn.fpg.unc.edu/resources/implementation-research-synthesis-literature

Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19, 531. doi:10.1177/1049731509335549.

Fixsen, D. L., Blase, K., & Van Dyke, M. (2011). Mobilizing communities for implementing evidence-based youth violence prevention programming: A commentary. American Journal of Community Psychology (Special Issue), 48, 133–137.

Fixsen, D. L., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children (Special Issue), 79, 213–230.

Flay, B. R., Biglan, A., Boruch, R. F., Castro, F. G., Gottfredson, D., Kellam, S., . . . Ji, P. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6, 151–175.

Forman, S. G., Shapiro, E. S., Codding, R. S., Gonzales, J. E., Reddy, L. A., Rosenfield, . . . Stoiber, K. C. (2013). Implementation science and school psychology. School Psychology Quarterly, 28, 77–100. doi:10.1037/spq0000019

Fox, B. H., Farrington, D. P., & Ttofi, M. M. (2012). Successful bullying prevention programs: Influence of research design, implementation features, and program components. International Journal of Conflict and Violence, 6, 273–282.

Glasgow, R. E., Vogt, T. M., & Boles, S. M. (1999). Evaluating the public health impact of health promotion interventions: The RE-AIM framework. American Journal of Public Health, 89, 1322–1327. doi:10.2105/AJPH.89.9.1322.

Gollwitzer, M., Eisenbach, K., Atria, M., Strohmeier, D., & Banse, R.
(2006). Evaluation of aggression-reducing effects of the "Viennese Social Competence Training". Swiss Journal of Psychology, 65, 125–135.

Gollwitzer, M., Banse, R., Eisenbach, K., & Naumann, A. (2007). Effectiveness of the Vienna Social Competence Training on explicit and implicit aggression. European Journal of Psychological Assessment, 23, 150–156.

Gradinger, P., Yanagida, T., Strohmeier, D., & Spiel, C. (2014). Prevention of cyberbullying and cybervictimization: Evaluation of the ViSC Social Competence Program. Journal of School Violence, 14, 87–110. doi:10.1080/15388220.2014.963231.

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82, 581–629. doi:10.1111/j.0887-378X.2004.00325.x.

Kennedy, M. M. (1997). The connection between research and practice. Educational Researcher, 26, 4–12. doi:10.3102/0013189X026007004.

Kratochwill, T. R., & Shernoff, E. S. (2003). Evidence-based practice: Promoting evidence-based interventions in school psychology. School Psychology Quarterly, 18, 389–408.

Meyers, D. C., Durlak, J. A., & Wandersmann, A. (2012). The quality implementation framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50, 462–480. doi:10.1007/s10464-012-9522-x.

Nutley, S. M., Walter, I., & Davies, H. T. O. (2007). Using evidence: How research can inform public service. Bristol: The Policy Press.

Compliance with Ethical Standards

Conflict of Interest The authors declare that they have no conflict of interest.

Ethical Approval All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent Informed consent was obtained from all individual participants included in the study.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give
appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Ogden, T., & Fixsen, D. L. (2014). Implementation science: A brief overview and a look ahead. Zeitschrift für Psychologie, 222, 4–11. doi:10.1027/2151-2604/a000160.

Olweus, D. (2004). The Olweus Bullying Prevention Programme: Design and implementation issues and a new national initiative in Norway. In P. K. Smith, D. J. Pepler, & K. Rigby (Eds.), Bullying in schools: How successful can interventions be? (pp. 13–36). Cambridge: Cambridge University Press.

Petrosino, A., Boruch, R. F., Soydan, H., Duggan, L., & Sanchez-Meca, J. (2001). Meeting the challenges of evidence-based policy: The Campbell Collaboration. Annals of the American Academy of Political and Social Science, 578, 14–34.

Roland, E. (2000). Bullying in schools: Three national innovations in Norwegian schools in 15 years. Aggressive Behavior, 26, 135–143.

Rossi, P. H., & Wright, J. D. (1984). Evaluation research: An assessment. Annual Review of Sociology, 10, 331–352.

Salmivalli, C., Poskiparta, E., Ahtola, A., & Haataja, A. (2013). The implementation and effectiveness of the KiVa Antibullying Program in Finland. European Psychologist, 18, 79–88. doi:10.1027/1016-9040/a000140.

Schultes, M.-T., Strohmeier, D., Burger, C., & Spiel, C. (2011). Fostering evidence-based prevention in schools through self-evaluation: The Austrian Violence Evaluation Online-Tool (AVEO). Poster presented at the Evidence-Based Prevention of Bullying and Youth Violence Conference in Cambridge, Great Britain.

Schultes, M.-T., Stefanek, E., van de Schoot, R., Strohmeier, D., & Spiel, C. (2014). Measuring implementation of a school-based violence prevention program: Fidelity and teachers' responsiveness as predictors of proximal outcomes. Zeitschrift für Psychologie, 222, 49–57. doi:10.1027/2151-2604/a000165.

Shonkoff, J. P. (2000). Science, policy, and practice: Three cultures in search of a shared mission. Child Development, 71, 181–187. doi:10.1111/1467-8624.00132.

Slavin, R. E. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31, 15–21.

Slavin, R. E. (2008). Evidence-based reform in education: Which evidence counts? Educational Researcher, 37, 47–50.

Smith, P. K., Pepler, D., & Rigby, K. (2004). Bullying in schools: How successful can interventions be? Cambridge: Cambridge University Press.

Spiel, C. (2009a). Evidence-based practice: A challenge for European developmental psychology. European Journal of Developmental Psychology, 6, 11–33.

Spiel, C. (2009b). Evidenzbasierte Bildungspolitik und Bildungspraxis – eine Fiktion? Problemaufriss, Thesen, Anregungen [Evidence-based education policy and educational practice – a fiction? Problem outline, theses, suggestions]. Psychologische Rundschau, 60, 255–256.

Spiel, C., & Strohmeier, D. (2011). National strategy for violence prevention in the Austrian public school system: Development and implementation. International Journal of Behavioral Development, 35, 412–418. doi:10.1177/0165025411407458.

Spiel, C., & Strohmeier, D. (2012). Evidence-based practice and policy: When researchers, policy makers, and practitioners learn how to work together. European Journal of Developmental Psychology, 9, 150–162.

Spiel, C., Lösel, F., & Wittmann, W. W. (2009). Transfer psychologischer Erkenntnisse – eine notwendige, jedoch schwierige Aufgabe [Transfer of psychological findings – a necessary but difficult task]. Psychologische Rundschau, 60, 257–258.

Spiel, C., Salmivalli, C., & Smith, P. K. (2011a). Translational research: National strategies for violence prevention in school. International Journal of Behavioral Development, 35, 381–382. doi:10.1177/0165025411407556.

Spiel, C., Schober, B., Strohmeier, D., & Finsterwald, M. (2011b). Cooperation among researchers, policy makers, administrators, and practitioners: Challenges and recommendations. ISSBD Bulletin 2011, 2, 11–14.

Spiel, C., Strohmeier, D., Schultes, M.-T., & Burger, C. (2011c). Nachhaltigkeit von Gewaltprävention in Schulen: Erstellung und Erprobung eines Selbstevaluationsinstruments [Sustainable violence prevention in schools: Development and evaluation of a self-evaluation tool for schools]. Vienna: University of Vienna.

Spiel, C., Wagner, P., & Strohmeier, D. (2012). Violence prevention in Austrian schools: Implementation and evaluation of a national strategy. International Journal of Conflict and Violence, 6, 176–186.

Spoth, R., Rohrbach, L. A., Greenberg, M., Leaf, P., Brown, C. H., Fagan, A., . . . Hawkins, J. D. (2013). Addressing core challenges for the next generation of type 2 translation research and systems: The Translation Science to Population Impact (TSci Impact) framework. Prevention Science, 14, 319–351. doi:10.1007/s11121-012-0362-6

Stokes, D. E. (1997). Pasteur's quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.
Strohmeier, D., Hoffmann, C., Schiller, E.-M., Stefanek, E., & Spiel, C. (2012). ViSC Social Competence Program. New Directions for Youth Development, 133, 71–80. doi:10.1002/yd.20008.

Ttofi, M. M., & Farrington, D. P. (2009). What works in preventing bullying: Effective elements of anti-bullying programmes. Journal of Aggression, Conflict and Peace Research, 1, 13–24.

Ttofi, M. M., & Farrington, D. P. (2011). Effectiveness of school-based programs to reduce bullying: A systematic and meta-analytic review. Journal of Experimental Criminology, 7, 27–56. doi:10.1007/s11292-010-9109-1.

DEBATE Open Access

"There is nothing so practical as a good theory": a pragmatic guide for selecting theoretical approaches for implementation projects

Elizabeth A. Lynch1*, Alison Mudge2, Sarah Knowles3, Alison L. Kitson4, Sarah C. Hunter4 and Gill Harvey1

Abstract

Background: A multitude of theories, models and frameworks relating to implementing evidence-based practice in health care exist, which can be overwhelming for clinicians and clinical researchers new to the field of implementation science. Clinicians often bear responsibility for implementation, but may be unfamiliar with theoretical approaches
designed to inform or understand implementation.

Main text: In this article, a multidisciplinary group of clinicians and health service researchers present a pragmatic guide to help clinicians and clinical researchers understand what implementation theories, models and frameworks are; how a theoretical approach to implementation might be used; and some prompts to consider when selecting a theoretical approach for an implementation project. Ten commonly used and highly cited theoretical approaches are presented, none of which have been utilised to their full potential in the literature to date. Specifically, theoretical approaches tend to be applied retrospectively to evaluate or interpret findings from a completed implementation project, rather than being used to plan and design theory-informed implementation strategies, which would intuitively have a greater likelihood of success. We emphasise that there is no right or wrong way of selecting a theoretical approach, but encourage clinicians to carefully consider the project's purpose, scope and available data and resources to allow them to select an approach that is most likely to "value-add" to the implementation project.

Conclusion: By assisting clinicians and clinical researchers to become confident in selecting and applying theoretical approaches to implementation, we anticipate an increase in theory-informed implementation projects. This then will contribute to more nuanced advice on how to address evidence-practice gaps and ultimately to contribute to better health outcomes.

Keywords: Evidence-based practice, Implementation, Knowledge translation, Theory-informed

Background

Clinicians and clinical researchers usually have expert
knowledge about evidence-based interventions for different clinical conditions. While some health professionals may have experience of implementing evidence-based interventions in their own practice or overseeing a change in practice by health professionals directly under their supervision, many are not familiar or confident with the current evidence regarding how to effectively, efficiently and sustainably implement evidence-based interventions into routine clinical practice.

* Correspondence: [email protected]
1Adelaide Nursing School, Faculty of Health and Medical Sciences, The University of Adelaide, Adelaide, South Australia 5000, Australia
Full list of author information is available at the end of the article

Lynch et al. BMC Health Services Research (2018) 18:857 https://doi.org/10.1186/s12913-018-3671-z

© The Author(s). 2018 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Implementation science has been defined as "the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services" [1], and recognises that strong evidence alone is not sufficient to change practice. Researchers from a multitude of backgrounds have proposed different approaches to predicting, guiding and explaining how evidence is implemented, drawing on practical experience, observation, empirical study and the development and synthesis of an eclectic range of theories about individual, group and organisational change. This has resulted in a plethora of implementation frameworks, models, and theories; recent data suggest more than 100 theoretical approaches are being used by implementation researchers [2].

Use of established implementation theories, models or frameworks can help implementation researchers through contributing to new and more nuanced knowledge about how and why implementation succeeds or fails [3]. For clinicians, implementation theories, models and frameworks can be applied so new initiatives are planned, implemented and evaluated more systematically, which may enhance the success, sustainability and scalability of the project. When we consciously move our thinking from implicit assumptions about how we think implementation works to making our thinking more explicit and structured through the application of
more explicit and structured through the application of an established theoretical approach, then we might be able to be more objective and more creative in our approach to planning, guiding and evaluating.

Despite the growing recognition of the need to use theory to inform implementation programs [3], many clinicians and clinical researchers are unfamiliar with theories of implementation and behaviour change. For instance, in Australia in 2012, the majority of medical, nursing and allied health professionals who had successfully applied for National Health and Medical Research Council (NHMRC) Translating Research Into Practice (TRIP) fellowships had no previous experience using implementation theories or frameworks [4].

While previous manuscripts about implementation theories, models and frameworks are helpful for researchers with good background knowledge in implementation science to make sense of the different theoretical approaches available (for example, by providing a taxonomy to distinguish between different categories of implementation theories, models and frameworks) [3], it is important that information about how and why to select and apply theoretical approaches is made accessible to frontline health professionals who strive to provide the best quality, evidence-based care to their patients.

Throughout this manuscript, we caution the reader that this article cannot provide a simple "paint by numbers" approach to implementation. It would be counter-productive to try to create an algorithm that would capture planning and implementing behaviour change across multiple innovations in the complex adaptive systems of modern healthcare. Rather, we encourage thoughtful assessment of, and responsiveness to, the particular circumstances of each implementation: the change to be introduced, the proposed way to make the change, the people who need to be involved, and the setting in which the change happens. Our intention is to provide accessible guidance based on our collective clinical and research experience, to assist clinicians and clinical researchers in their endeavours to design and conduct more effective implementation projects to improve clinical care, and to design more robust evaluations to advance the empirical evidence for this emerging science. Therefore, the aims of this paper are to:

• Demystify some of the jargon by defining the nature and role of frameworks, models and theories
• Describe and compare commonly used theoretical approaches, and how they are applied in practice
• Suggest how to select a theoretical approach based on the purpose, scope and context of the individual project

The suggestions made in this debate paper are derived from our experiential knowledge as a multidisciplinary group of clinicians and health service researchers with interest and experience in implementation science in different international settings. EAL was a hospital-based physiotherapist for 14 years and is now an early-career researcher, AM is a practicing general physician and mid-career researcher, SK has a psychology background and is a researcher in a National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care, SCH (psychology background) is an early-career researcher, and GH and ALK have backgrounds in nursing and are senior researchers with extensive experience in improvement and implementation research. We work collaboratively with front-line clinicians and run workshops to help clinicians and clinical researchers apply theory to improve the success and sustainability of implementation projects. Through this work we recognise the need to assist clinicians and clinical researchers to navigate this complex landscape.

We encourage the reader to consider our recommendations as prompts or guides rather than definitive prescriptions. We have found that a more systematic understanding of implementation processes tends to be generated when different people (with different backgrounds and different approaches to implementation) share their experiences. Therefore, we have framed this paper using the experiences of author AM to illustrate the ways that clinicians can engage with utilising implementation theories, models and frameworks.

Case study part 1 (experience of author AM): I am a physician and I plan to implement a delirium prevention program. I have some implementation experience and know that it won't be easy. I have heard about implementation science, so I hope there may be tools to help me. I understand a bit about Knowledge to Action (KTA) and use it to guide my planning. I have strong evidence of effectiveness and cost-effectiveness [knowledge creation & synthesis], and there are established clinical practice guidelines [knowledge tools/products]. There
is an effective model to implement delirium prevention developed in the USA (http://www.hospitalelderlifeprogram.org), but it used skilled geriatric nurses and large numbers of trained volunteers, which is not feasible in my hospital. None of the strategies in the guidelines are "hard", but they just don't seem to get done consistently. I need to find out from staff and patients why this is the case, and then try to find ways to support them. Perhaps they need more education or reminders, or maybe we can reallocate the tasks to make it easier? Or are there strategies I am not familiar with? Whatever I do, I want to measure better care in some way to keep my boss happy and the staff interested. And my previous projects have tended to fizzle out over time… KTA gives me part of a plan but I need some more tools to know how to take the next steps.

Main text

Defining frameworks, models and theories

Some researchers have delineated between frameworks, models and theories, whereas other researchers use these terms interchangeably. In general, implementation frameworks, models and theories are cognitive tools that can assist a researcher or implementer to plan, predict, guide or evaluate the process of implementing evidence into practice. Generally (for more detail, refer to the cited reference) [5]:

• A framework lists the basic structure and components underlying a system or concept. Examples of typical frameworks are the Consolidated Framework for Implementation Research (CFIR) [6], the Theoretical Domains Framework (TDF) [7, 8], RE-AIM [8–10] and Promoting Action on Research Implementation in Health Services (PARIHS) [9, 10].
• A model is a simplified representation of a system or concept with specified assumptions. An example of a model is the Knowledge to Action (KTA) cycle [11].
• A theory may be explanatory or predictive, and underpins hypotheses and assumptions about how implementation activities should occur. An example of a theory is the Normalization Process Theory (NPT) [12].

In our experience, clinicians and clinical researchers want to know which implementation approach will help them and their project best; for many clinicians and clinical researchers that we talk to, navigating the rapidly expanding number of implementation theories, frameworks and models is completely daunting, and is made worse by unfamiliarity with the language used and inconsistencies in nomenclature. To avoid compounding this problem, we will refer to frameworks, models and theories collectively as "theoretical approaches", and support our readers to focus on which theoretical approach best suits the purpose, scope and context of the implementation project.

Theoretical approaches help to shape how we think, which is why they are important. However, most of the time we are not aware of the underlying theories or frameworks we use. In implementation science, theoretical approaches have been developed for different purposes, with different intended users, and are often underpinned by different philosophical perspectives (see Table 1). For instance, some have been designed to assist implementation researchers and enhance the quality of implementation research [6], to support improvement and practice development in clinical settings [9], to understand factors influencing the implementation of evidence in particular health service settings [13], or to ensure comprehensive evaluation or reporting of an implementation program [14]. Some are based on the underlying assumption that implementation is rational and predictable when relevant factors are accounted for [7, 8]; in contrast, others are built on the assumption that implementation is unpredictable, and that ongoing monitoring, flexibility and adaptation are required [9, 10]. Some have been designed iteratively, based on the developers' experience in implementing evidence in real-world settings [9, 15, 16], whereas others have been developed systematically through reviewing and synthesising published literature [6, 7, 11]. And finally, some but not all theoretical approaches have been tested, validated and/or adapted over time [8, 10, 17].

Commonly used theoretical approaches

Two articles were published in 2017 which presented the dissemination and implementation research frameworks most commonly cited in academic publications [18] and the theories most commonly used by implementation scientists [2]. For pragmatic reasons (acknowledging the systematic approach taken by the authors of both manuscripts), we used these two articles to guide the selection of theoretical approaches for discussion in this paper. We included the ten theoretical approaches that were within the top 15 on both lists (i.e. both highly cited in the literature and commonly used in implementation practice) [6–9, 11–16, 19]. These are presented in Table 1. We do not infer that these are the best or only theoretical approaches that should be used in implementation projects; simply that they are the most commonly used.

Table 1 Summary of ten commonly applied theoretical approaches to implementation

Knowledge to Action [11]
Purpose (as described by authors): A framework to conceptualise the process of knowledge translation which integrates the roles of knowledge creation and knowledge application. Provides conceptual clarity by offering a framework to elucidate the key elements of the knowledge translation process.
Brief description: This approach provides an overview to help guide and understand how knowledge is created and synthesised and how tools (like clinical guidelines) are developed, then how these tools are applied in clinical settings through tailoring and adaptation, implementation, monitoring and sustaining. Assumes that action plans will be realised (underpinned by the assumption that actions are rational). Takes a systems approach: recognises that knowledge producers and users are situated within a larger social system.
How developed: Developed by reviewing the literature on more than 30 planned action theories and identifying common elements. The authors added a knowledge creation process to the planned action model and labelled the combined model the knowledge to action cycle.
Changes/developments over time: No
Ease of use: Clear and easy to understand; intuitive. No specific guidance on how to do each step of the action cycle, but provides some guidance on important elements to consider.
Additional resources: No specific resources currently available on how to action each step of the cycle.

Theoretical Domains Framework (TDF) [7, 8]
Purpose (as described by authors): An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, to assess implementation and other behavioural problems and to inform intervention design.
Brief description: Provides a holistic list of factors that influence behaviour. Application of the TDF can give the researcher confidence that factors influencing an individual's behaviour will be identified, which in turn can identify the factors that need to be addressed in order for behaviour change to occur (i.e. it can be used to inform behaviour change strategy development/selection). Can be used in conjunction with the Behaviour Change Wheel to develop and deliver a behaviour change strategy.
How developed: Through an expert consensus process and synthesis of 33 theories and 128 key theoretical constructs related to behaviour change.
Changes/developments over time: Validity was investigated by behavioural experts sorting theoretical constructs using closed and open sort tasks. The validation study demonstrated good support for the basic structure of the TDF and led to refinements, leading to publication of a new iteration of the framework in 2012.
Ease of use: Quite straightforward to apply, but can be time consuming to use for analysis, with the potential to overwhelm the novice researcher given the 14 domains and 84 component constructs. COM-B and the Behaviour Change Wheel work together with the TDF.
Additional resources: Interview guides provided in publications [7, 30] assist ease of data collection and illustrate the domains. Subject of a thematic series in the Implementation Science journal; a guide to use of the TDF was published in 2017 [31].

RE-AIM framework [14, 17]
Purpose (as described by authors): Originally developed as a framework to guide consistent reporting of evaluations regarding the public health impact of health promotion interventions, thereby providing a framework for determining which programs are worth sustained investment and for identifying those that work in real-world environments.
Brief description: Reporting checklist for public health interventions (what patient groups are receiving the intervention, have patient outcomes changed, which health professionals/health professional groups are providing the intervention, are they delivering the intervention as intended, will the program be sustained in the long term) to evaluate real-world impact. Can be used when designing or evaluating a public health intervention.
How developed: Through inductive thinking building on the results of previous research.
Changes/developments over time: The "E" was initially efficacy [14], then effectiveness [17].
Ease of use: Easy; interventions can be rated on the five dimensions, providing a score. Some of the reporting points (in particular Reach and Adoption) are not being interpreted and reported as the developers intended.
Additional resources: Dedicated website with online tools and examples [33].

Consolidated Framework for Implementation Research [6]
Purpose (as described by authors): Framework to promote implementation theory development and verification about what works, where and why.
Brief description: List of factors (5 domains and 37 constructs) that can influence an implementation project; can be used in the planning or evaluation stages (does not guide how to implement). Research focus, in contrast to a doing/practitioner focus.
How developed: Published theories which sought to facilitate translation of research findings into practice in the healthcare sector were reviewed. The team identified constructs that had evidence that they influenced implementation and could be measured. Some
constructs were streamlined and combined, whereas other constructs were separated and delineated.
Changes/developments over time: No
Ease of use: Clear, but the language may be difficult to digest if new to the area of implementation science.
Additional resources: Dedicated website that provides examples, templates and tools to assist in developing and evaluating implementation projects and in collecting and analysing data [28].

Conceptual model of evidence-based practice implementation in public service sectors [15]
Purpose (as described by authors): A multi-level, four-phase model of the implementation process that can be used in public service sectors.
Brief description: Conceptual model of factors that can influence implementation in the unique context of public sector services (focus on the role of service delivery organisations and the services in which they operate) at each of the four implementation stages: Exploration, Adoption/Preparation, Implementation, Sustainment (EPIS). Explicitly recognises that different variables play crucial roles at different points in the implementation process. Does not provide guidance on how to move through the different stages of implementation.
How developed: Based on the literature and the authors' experience of public service sectors; funded by the National Institute of Mental Health.
Changes/developments over time: No
Ease of use: Little clarity on how to operationalise the different factors; potential to be confusing for those unfamiliar with implementation.
Additional resources: The California Evidence-Based Clearinghouse for Child Welfare has developed webinars regarding use of the EPIS framework, freely available from http://www.cebc4cw.org/implementing-programs/tools/epis/

Conceptual model of implementation research [19]
Purpose (as described by authors): A heuristic skeleton model for the study of implementation processes in mental health services, identifying the implications for research and training.
Brief description: Guides how implementation research can be organised and how it fits/aligns with evidence-based practices. May be useful for the complete novice who needs clarity on the distinction between clinical interventions and implementation strategies, and on how to measure clinical and implementation effectiveness. Various theories can be placed upon the model to help explain aspects of the broader phenomena.
How developed: Drawn from three extant frameworks: the stage pipeline model, multi-level models of change and models of health service use.
Changes/developments over time: No
Ease of use: Clear and easy to understand.
Additional resources: No

Implementation effectiveness model [16]
Purpose (as described by authors): An integrative model to capture and clarify the multidetermined, multilevel phenomenon of innovation implementation.
Brief description: A list of constructs that can influence implementation effectiveness, based on the premise that implementation effectiveness is a function of an organisation's climate for implementing a given innovation and the targeted organisational members' perceptions of the fit of the innovation to their values. Does not provide specific guidance for how to implement, and was not designed specifically for the context of health care. Likely to be most useful for projects with a clear organisational approach.
How developed: From the authors' personal experience, with reference to the literature.
Changes/developments over time: No
Ease of use: The main manuscript is very wordy (text-based), but the concepts are clear.
Additional resources: No

Promoting Action on Research Implementation in Health Services (PARIHS) [9, 10]
Purpose (as described by authors): Organisational or conceptual framework to help explain and predict successful implementation of evidence into practice and to understand the complexities involved.
Brief description: Conceptualises how evidence can be successfully implemented in health care settings using the process of facilitation. The underlying premise is that facilitation will enable people to apply evidence in their local setting, which is situated within a broader organisational and societal context. The framework strives to capture the complexities involved in implementation, so it is most useful in more complex projects.
How developed: From the authors' experience working as facilitators and researchers on quality improvement activities and health service research projects.
Changes/developments over time: Has had several iterations since first publication in 1998, in response to findings from empirical testing. Revised to the integrated or i-PARIHS framework in 2015 [10].
Ease of use: Does not operationalise its constructs, so may be difficult for a novice to understand and apply, particularly when not being supported
by an expert facilitator. The Facilitator's Toolkit is easy to apply for conducting pre- and post-implementation evaluation. For people experienced in implementation, the framework provides guidance on all of the things to consider when implementation is complex.
Additional resources: The Facilitator's Toolkit in the book associated with the 2015 iteration of PARIHS guides the user through how to assess, facilitate and evaluate [32].

Interactive Systems Framework [13]
Purpose (as described by authors): Heuristic to help clarify the issues related to how to move what is known about prevention (particularly prevention of youth violence and child maltreatment) into more widespread use.
Brief description: Framework regarding translating findings from prevention research to clinical practice. The framework comprises three systems: the Innovation Synthesis and Translation System (which distils information about innovations and translates it into user-friendly formats); the Innovation Support System (which provides training, technical assistance or other support to users in the field); and the Innovation Delivery System (which implements innovations in the world of practice).
How developed: Collaborative development of the framework by Division of Violence Prevention staff members, university faculty and graduate students, with input from practitioners, researchers, and funders.
Changes/developments over time: No
Ease of use: Easy to understand; no clear guidance available regarding how to apply the framework.
Additional resources: No

Normalization Process Model, Normalization Process Theory (NPT) [12]
Purpose (as described by authors): Provides a conceptual framework for understanding and evaluating the processes by which new health technologies and other complex interventions are routinely operationalized in everyday work, and sustained in practice.
Brief description: NPT is an action theory, which means that it is concerned with explaining what people do rather than their attitudes or beliefs. It proposes that for successful sustained use: individuals and groups must work collectively to implement the intervention; the work of implementation occurs via four particular processes; and continuous investment carrying forward in space and time is required. Can be helpful to understand and evaluate how new health technologies/complex interventions are routinely operationalised and sustained in practice. Not designed to guide implementation.
How developed: In iterations, based on the experiences of the authors. Initially, the developers mapped the elements of embedding processes and developed the concept of normalization. Next, a robust applied theoretical model of Collective Action was produced, and applied to trials, government processes and healthcare systems. The final stage focused on building a middle-range theory that explains how material practices become routinely embedded in their social contexts.
Changes/developments over time: Through its focus on being a theory, the authors continually refine and test NPT to ensure its validity. More recently, NPT has been extended towards a more general theory of implementation.
Ease of use: Easy to apply with the use of specifically developed resources.
Additional resources: Dedicated website with toolkit and examples. The interactive toolkit can be used to plan a project or analyse data [29].

Of note, there are similarities across the theoretical approaches. All consider the 'new evidence' to be implemented; the 'context' where the evidence will be introduced; the 'agents or actors' who will use or apply the new evidence; and the 'mechanisms' or processes that actually make the changes happen. Mechanisms can either be people, such as change champions, knowledge brokers, opinion leaders, project managers or facilitators, or they can be processes, such as new protocols, education sessions or audit and feedback cycles, or a combination of both.

It is important to acknowledge that there is no universally agreed-upon theory of successful implementation, nor empirical evidence about the relative advantages of one theoretical approach over another. While this may be frustrating to people new to the area of implementation science, the number of viable theoretical approaches offers clinicians an opportunity to "think outside the box", and highlights the importance of clarifying what they are seeking to know or change through their project, and then being strategic in selecting a suitable theoretical approach.

Case study part 2 (experience of author AM): So it is clear that I will need to adapt principles and protocols from successful programs in the USA to my local context. But how do I know what the context is? Top picks on Google Scholar for "context assessment implementation science" seem to be the Consolidated Framework for Implementation Research (CFIR) and Promoting Action on Research Implementation in Health Services (PARIHS). Both have nice guides that suggest useful questions to ask. There seems to be quite a lot of overlap, although I am drawn to PARIHS because from my experience I know that someone will need to spend time on the ward and build trust before we start to ask questions and introduce change. I suspect this 'facilitator' will be a critical role for the complex intervention because there are several behaviours to change. When I look up "behaviour change implementation science", the Theoretical Domains Framework (TDF) dominates. Like CFIR and PARIHS, there are a lot of elements, but I can see that they would be helpful for planning or analysing surveys and interviews with patients and staff to clarify what motivates, helps and hinders them. I do feel worried about how I will collect and analyse so much data across the several different groups involved in my project. And once we have a thorough understanding of the context, staff and patients, how will I select strategies? And if it does work, how long will it take until the "new" becomes "normal" so that I can move on to the next problem? My colleague tells me that Normalization Process Theory (NPT) is a useful way to think about this, and I am impressed with the Normalisation of Complex Interventions-Measure Development (NoMAD) tool I find on their website; I can see how I could adapt it to find out whether staff feel the changes are embedded.

So where do I go from here? Do I frame the whole project with KTA, assess context with CFIR, assess the patient and staff views with TDF, adopt facilitation as the central element from PARIHS, and then look at how well it has gone using NPT? Am I being thorough or theoretically sloppy? They all look sensible, but I am not sure how to use any of them and I am worried it is going to add a whole lot of work to a complicated project.

How the theoretical approaches have been used
Recent work investigating how theoretical approaches are selected suggests that theories are not selected to suit the purpose of the implementation research; rather, theoretical approaches that are familiar to researchers tend to be used, regardless of the aim of the project [2]. To explore how theoretical approaches have been applied in the literature to date, we searched for and identified review papers for six of our ten theoretical approaches: KTA [20], the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework (RE-AIM) [21], CFIR [22], NPT [23], PARIHS [24] and TDF [25]. (For details of these review papers, see Additional file 1: Table S1.)

The overall message from these reviews is that theoretical approaches are not being utilised to their full potential. Despite the fact that many approaches have been developed to prospectively design and plan implementation strategies, they are almost overwhelmingly applied retrospectively to evaluate or interpret findings from a completed implementation project [22–24]. Further, the components of the theoretical approaches (such as coding systems or reporting dimensions) tend not to be applied consistently, with some users choosing to apply only particular theoretical components [21, 22], or applying components in different ways than the developers intended [21].

These findings again suggest that there is no agreed "best" – or even "easiest" – theory to apply, and that even implementation researchers may need to take a pragmatic approach to the use of theory in complex real-world projects. This reflects the relative immaturity of the field, but it provides opportunities for clinical and research partners to contribute to advancing our knowledge of how theory is selected and adapted for practical use. To support thoughtful use of theory, we suggest some practical guidance on how to choose which theory to use, and on how to use the theory effectively to support the implementation project, based on our experience and interactions with clinical and academic staff new to implementation.

How do you select the theoretical approach for your implementation project?

Research reporting guidelines for implementation studies, including the Standards for Reporting Implementation Studies (StaRI) [26] and the Template for Intervention Description and Replication (TIDieR) [27], specify that the rationale and theory underpinning implementation strategies should be reported. Some researchers also recommend that the reason for selecting a particular theoretical approach should be justified [2, 24].

We acknowledge that there will always be multiple questions that could be posed for each implementation project, each of which could be approached from different theoretical perspectives. Below we provide some prompts to assist clinicians and clinical researchers to select a theoretical approach that can add value to different implementation projects, rather than simply citing a theoretical approach in order to meet a reporting guideline. Clinicians tend to be pragmatic; they generally are not motivated by concepts like theoretical purity, they just want things to work. We emphasise that there is an "art" to selecting and applying theoretical approaches: these prompts need to be applied and considered alongside a clinician's or researcher's experience and skill and the nuances of the implementation project.

In our experience, clinicians are often anxious that
they will select the 'wrong' theory. We reiterate that there is no precise formula for choosing a theoretical approach. One important thing to consider in theory selection is goodness-of-fit, which is determined by each study's needs and aims, rather than there being a 'wrong' choice. The following are suggested questions that could be considered to identify which theoretical approach is particularly appropriate or useful for different implementation projects (see Fig. 1).

Fig. 1 Five questions to help select a theoretical approach

Who are you working with?

Are you working with individuals who have complete autonomy, are you working with a team, or are you working with an entire health service? Almost all implementation studies will inevitably touch on different organisational levels (micro-, meso- and macro-level implementation), so consider the fit of the theoretical approaches to the organisational level where your project is positioned, and whether more than one approach is required to guide implementation at different levels. Some approaches are particularly concerned with individual experiences or behaviours (for example, TDF), others with group interaction or collective working (for example, NPT; Klein), and others encompass the broader contextual factors impacting across a wider setting or service (for example, PARIHS).

When in the process are you going to use theory?
The point in time of the implementation project may be another factor guiding selection of a theoretical approach. Some approaches lend themselves particularly to the design and planning of an implementation strategy (for example, Exploration, Preparation, Implementation and Sustainment (EPIS); Proctor; the Interactive Systems Framework (ISF); TDF in conjunction with the Behaviour Change Wheel), others to tracking the development of a project (for example, KTA), and others to planning an evaluation and defining outcome measures to assess implementation success (for example, RE-AIM).

Why are you applying a theory?

The aims and intended outcomes of each study should be considered, as different theoretical approaches provide different 'pay-offs' in terms of the understanding gained. Different theoretical approaches can be used to measure achievement of a specific change (for example, TDF; ISF), to generate a better understanding of barriers and facilitators to inform implementation approaches (for example, PARIHS; CFIR), to develop knowledge about an ongoing implementation process (for example, KTA; Proctor), or to provide a framework of relevant implementation outcomes (for example, CFIR; RE-AIM).

How will you collect data?

Choice of theoretical approach may also be informed by what data will be available for analysis. Although ideally data collection would be designed with a particular approach in mind, so that data are collected to answer the questions of interest, we are aware that in practice clinicians need to work with the resources that are available to them. For example, clinicians may have access to routinely collected outcome data which could be evaluated using the constructs in RE-AIM, but these same