Evidence-Based Practice in Psychology:
Implications for Research and Research Training

Russell M. Bauer
University of Florida
In this article, the author discusses the implications of evidence-based practice (EBP) for research and research training in clinical psychology. It is argued that EBP provides a useful framework for addressing some heretofore ignored problems in clinical research. Advancing evidence-based psychological practice will require educators to inject significant new content into research, design, and methodology courses and to further integrate research and practicum training. The author believes this to be an exciting opportunity for the field, not only because it will further psychologists' integration into the interdisciplinary health care and research environment, but also because it will provide new tools to educate students for capable, not just competent, professional activity. © 2007 Wiley Periodicals, Inc. J Clin Psychol 63: 685–694, 2007.

Keywords: education and training; research
In recent years, the notion that psychologists deliver "health care" rather than just "mental health care" has taken hold in our field. Along with this identification as a health care discipline comes a set of responsibilities to provide patients with clinical services that have been shown through research to be effective for addressing patient problems. The fundamental goal of the evidence-based practice (EBP) movement is to effect a cultural change within health care whereby practitioners make "conscious, explicit, and judicious" use of current best evidence in clinical practice with individual patients (Mayer, 2004; Straus, Richardson, Glasziou, & Haynes, 2005). The contemporary emphasis on EBP is quite strong within other health care disciplines, where it has permeated the culture of education, practice, and research, and where it is seen as furnishing at least a partial answer to a fundamental call for accountability and continuous quality improvement in the overall system of health care delivery in the United States (Institute of Medicine, 2001).

Correspondence concerning this article should be addressed to: Russell M. Bauer, Department of Clinical and Health Psychology, University of Florida, P.O. Box 100165 HSC, Gainesville, FL 32610-0165; e-mail: [email protected]

JOURNAL OF CLINICAL PSYCHOLOGY, Vol. 63(7), 685–694 (2007). © 2007 Wiley Periodicals, Inc. Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/jclp.20374
Most psychologists understand that EBP refers to a process by which best evidence is used intentionally in making decisions about patient care. Psychologists are most familiar with the construct of best evidence in the context of the empirically supported treatment (EST) movement, but some may mistakenly believe that EBP and EST are synonymous. As other articles in this series make clear, they are not; EBP is a much broader concept that refers to knowledge and action in the three essential elements of patient encounters: (a) the best evidence guiding a clinical decision (the best evidence domain), (b) the clinical expertise of the health care professional to diagnose and treat the patient's problems (the clinical expertise domain), and (c) the unique preferences, concerns, and expectations that the patient brings to the health care setting (the client domain). These three elements are often referred to as the three pillars of EBP. Even a brief consideration of the many variables and mechanisms involved in the three pillars leads the clinical psychologist to an obvious conclusion: EBP not only provides a framework for conceptualizing clinical problems, but also suggests a research agenda whereby patterns of wellness and illness are investigated with an eye toward how best practices are potentially mediated by unique aspects of practitioner expertise, how key patient characteristics influence treatment acceptability, and what role the patient plays in the health care relationship.
This is not a new agenda, but is quite similar to the agenda set forth by Gordon Paul in 1969 in his now-famous ultimate clinical question: "What treatment, by whom, is most effective for this individual, with that specific problem, under which set of circumstances, and how does it come about?" (Paul, 1967, 1969, p. 44). In asking this question, Paul's goal was to draw attention to variables that needed to be described, measured, or controlled for firm evidence to accumulate across studies of psychotherapy. The agenda for evidence-based psychological practice is similar, though broader, encompassing assessment as well as treatment, psychological health care policy as well as clinical procedure, and populations as well as individuals. As such, expanding the scope of evidence-based psychological practice provides an opportunity for psychologists to build conceptual and methodological bridges with their colleagues in medicine, nursing, pharmacy, the health professions, and public health.
Although its status as a health care delivery process is typically emphasized, EBP, initially referred to as evidence-based medicine, evolved at McMaster University as a pedagogical strategy for teaching students and practitioners how to incorporate research results into the process of patient care (McCabe, 2006; Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). As professional psychology begins to seriously consider the relevance of EBP for broad aspects of practice (Davidson & Spring, 2006), we will have to grapple with some obvious implications for (a) how we conduct practice-based research, and (b) how we educate and train our students, the next cadre of clinical researchers, to develop the knowledge, skills, and expertise to contribute to the evidence base. In this article, I discuss some of these implications with an eye toward viewing the EBP movement as an opportunity to begin to answer some of our most difficult research questions, and to begin to address some of our most vexing and persistent problems in education and training.
Practice-Based Research
From a research perspective, EBP provides a framework for investigating heretofore neglected aspects of "rubber-meets-the-road" practice. That is, confronting gaps in the evidence base from an EBP perspective draws attention to key client variables (e.g., preferences for one treatment over another, ability and willingness to adhere to treatment, credibility of treatment rationales, demographic and socioeconomic variables that enhance or impede health care access or that contribute to attitudes about treatment acceptability) and dimensions of clinical expertise (e.g., the ability to deliver the appropriate EST for the patient's problem, the ability to adapt treatments to unique clients, the ability to deliver assessments appropriate to decision making, the ability to communicate effectively with the patient) that deserve empirical study. Practitioners face these gaps because our dominant research paradigms tend to yield data about homogeneous majority groups receiving standard treatment in optimal settings.
Thus far, most of what constitutes evidence-based psychological practice is in the area of empirically supported treatment (Chambless, 1995; Chambless et al., 1998). Currently, several psychological therapies have well-established efficacy for the treatment of a variety of psychological problems (American Psychological Association [APA] Division 12 Dissemination Subcommittee of the Committee on Science and Practice). Work continues apace to expand this list to include new therapies and clinical problems, and to demonstrate the portability of treatments validated in well-controlled efficacy studies to real-world problems (effectiveness; Chambless & Ollendick, 2001).
A parallel expansion of the evidence base for psychological assessment procedures is needed. More research is needed on the diagnostic utility of assessment tools in predicting at-risk status, in helping select which treatment is indicated, or in predicting treatment response. Even in areas where the evidence for the clinical utility of assessment procedures is strong (e.g., in epilepsy surgery, where the results of presurgical evaluation of verbal memory strongly predict which patients will develop postsurgical neuropsychological morbidity; Chelune, 1995), the best available evidence has not yet caused the majority of clinicians to modify their assessment approach accordingly.
A full instantiation of EBP in psychology will require an expansion of systematic research efforts that provide more information about the clinical expertise and patient domains. This represents a real opportunity to broaden the scope of EBP in psychology. How do psychological practitioners with varying levels of expertise decide which of a number of alternative treatments to utilize in the clinic? What factors make clinically efficacious treatments acceptable to patients? How does cultural diversity interact with treatment acceptability? To apply best evidence to individual clinical problems seamlessly, we need to develop a research agenda that allows us to retrieve and analyze answers to these kinds of questions. This is a daunting task, and one that seems intractable so long as we rely exclusively on quantitative research methods and controlled experiments. Perhaps this is an area in which increased knowledge of qualitative research methods (see below) would benefit the field. It is also an area to which practicing scientist–practitioners can contribute critical information by adopting a data-driven approach to practice that incorporates measurement and reporting of assessment and treatment outcomes for purposes of further addressing effectiveness questions.
Implications for Education and Training
In a recent survey on training in ESTs, Woody, Weisz, and McLean (2005) reported that, although many doctoral training programs provided didactic dissemination of EST-related information, actual supervised training in ESTs had declined compared to a similar survey conducted in 1993. The overall conclusion was that the field had a long way to go in ensuring that our students have sufficient skill and experience to practice ESTs in their professional lives. The authors cited several obstacles to training in ESTs, including (a) uncertainty about what it means to train students in EBP; (b) insufficient time to provide specific training in multiple ESTs given other training priorities, including research; (c) within-program shortages of trained supervisors needed to provide a truly broad EST training experience; and (d) philosophic opposition to what some perceive as an overly rigid, manualized approach to treatment that reduces professional psychological practice to technician status.

It seems obvious to me that most of these barriers imply a method of training in which competency in ESTs is built one treatment at a time, thus requiring large investments of time and faculty effort. Although it is true that students need practical training in a variety of clinical methods, one key issue is whether the goal of graduate education is to train students to competency in a critical number of ESTs, or whether it is to train them in broader principles of evidence-based practice that will enable them to adapt easily to demands for new competencies after attaining their PhD (educating for capability rather than competency; Fraser & Greenhalgh, 2001).
There is evidence that clinical psychology training directors are ready for this development. In the Woody et al. (2005) survey, some clinical training directors indicated that current practice reflects an underemphasis on broad principles of evidence-based practice in favor of learning particular procedures on a treatment-by-treatment basis. Some of the issues related to the ability of programs to provide appropriate training would be addressed if we adopted a general-principles approach. Although not directly on point in the context of this article, it is my view that developing competencies in ESTs for research and professional practice is the joint and cumulative responsibility of doctoral programs, internships, and postdoctoral programs that work together to provide a continuum of training in the knowledge and skills of evidence-based psychological practice (EBPP).
Training in EBPP will require graduate training programs to include new content in research training curricula so that students are ready to understand and apply basic principles of EBPP in their everyday professional lives. Primary needs include training in (a) epidemiology, (b) clinical trials methodology, (c) qualitative research methods and measurement, (d) conducting and appraising systematic reviews and meta-analyses, and (e) the informatics and electronic database-searching skills necessary to find the best available evidence relevant to the problems that students will encounter in their research and clinical work. Such content could be introduced in a basic research methods course, could be taught separately in a course on EBPP, or could be infused in the curriculum through a combination of didactic, practicum, and research experiences (for additional ideas on infusion of EBPP into the curriculum, see DiLillo & McChargue, this issue). Achieving true infusion and integration will require that all program faculty be committed to the concept of EBPP, that all have received some basic education in EBPP themselves, and that EBPP concepts be represented throughout the curriculum. The faculty development implications of advancing EBPP are not trivial. In the short run, an effective strategy may be to partner with colleagues in medicine, the health professions, nursing, and public health to provide interdisciplinary instruction and mentoring in basic principles of EBP.
Epidemiology
Many problems important to psychologists (e.g., whether a clinical assessment tool is effective in identifying at-risk patients, whether a treatment protocol is effective in reducing psychological distress or disability in a defined population) can be conceptualized and described in epidemiological terms. For example, the strength of a treatment effect can be described with reference to the concept of "number needed to treat" (the number of patients who would need to be treated to produce one additional favorable outcome), or "number needed to harm" (the number of patients who would need to be treated for one additional patient to experience a harmful outcome), or, more generally, in terms of relative or absolute risk reduction. Knowledge of basic aspects of diagnostic test performance (e.g., sensitivity, specificity, positive and negative predictive value), so critical to psychological practice, can also be enhanced by forging links between these concepts and corresponding concepts in epidemiology (e.g., positive and negative likelihood ratios). A broad grounding in epidemiological methods will promote further ways of understanding and inferring causality from observational and experimental data, will further an appreciation for preventive methods, and will provide much-needed appreciation for community- and population-based methods that complement psychology's traditional emphasis on individuals and small groups.
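To make these quantities concrete, here is a minimal sketch in Python; all event rates and test characteristics are invented for illustration, not taken from any study. It computes absolute and relative risk reduction, number needed to treat, and the positive and negative likelihood ratios of a diagnostic test:

```python
# Illustrative epidemiological quantities; the event rates and test
# characteristics below are hypothetical, not drawn from real studies.

def risk_metrics(control_event_rate: float, treated_event_rate: float):
    """Risk-reduction metrics for an unfavorable outcome (e.g., relapse)."""
    arr = control_event_rate - treated_event_rate  # absolute risk reduction
    rrr = arr / control_event_rate                 # relative risk reduction
    nnt = 1.0 / arr                                # number needed to treat
    return arr, rrr, nnt

def likelihood_ratios(sensitivity: float, specificity: float):
    """Positive/negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Hypothetical trial: 40% of controls relapse vs. 25% of treated patients.
arr, rrr, nnt = risk_metrics(0.40, 0.25)
print(f"ARR = {arr:.2f}, RRR = {rrr:.2f}, NNT = {nnt:.1f}")  # NNT ~ 6.7

# Hypothetical screening tool with 85% sensitivity and 90% specificity.
lr_pos, lr_neg = likelihood_ratios(0.85, 0.90)
print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")  # LR+ = 8.5, LR- = 0.17
```

As a rough interpretive guide, an NNT close to 1 signals a powerful treatment, and likelihood ratios above about 10 or below about 0.1 shift diagnostic probabilities substantially.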
Clinical Trials Methodology
Although many graduate statistics and methodology courses cover such topics as case-control designs, cohort designs, and elements of randomized clinical trials (RCTs), classical methodology education in the Campbell and Stanley (1963) tradition needs to be supplemented with contemporary information relevant to clinical trials methodology. For example, training in standards for designing, conducting, and reporting clinical trials consistent with the CONSORT statement (Begg et al., 1996; Moher, Schulz, & Altman, 2001) is important so that reports of psychological clinical trials have appropriate consistency and transparency. Training in methods for reporting the size of treatment effects (going beyond statistical significance), allocating samples, specifying outcomes (relative and absolute risk reduction, number needed to treat, and number needed to harm), and addressing the ethical issues of clinical trials is also critically needed if psychology is to develop a truly evidence-based practice. Building the ability to evaluate the results of extant trials critically is likewise crucial if psychological practitioners are to meaningfully apply the best-evidence standard to their own clinical work and research.
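To illustrate what "going beyond statistical significance" can look like in a trial report, the following sketch computes a standardized mean difference (Cohen's d) with an approximate large-sample 95% confidence interval from two-arm summary statistics; all numbers are hypothetical:

```python
import math

# Hypothetical two-arm trial summary statistics (invented for illustration):
# symptom scores at outcome, lower = better.
n1, mean1, sd1 = 60, 12.3, 6.1  # treatment arm
n2, mean2, sd2 = 58, 16.8, 6.9  # control arm

# Pooled standard deviation and Cohen's d (standardized mean difference).
s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
d = (mean2 - mean1) / s_pooled  # positive d favors treatment here

# Large-sample standard error of d and an approximate 95% CI.
se_d = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
lo, hi = d - 1.96 * se_d, d + 1.96 * se_d
print(f"d = {d:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

Reporting the effect this way (point estimate plus interval) conveys both the magnitude and the precision of a treatment effect, which a bare p value cannot.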
Qualitative Research Methods and Measurement
Clinical psychologists trained in the scientist–practitioner tradition are almost exclusively focused on quantitative research methods, with an attendant emphasis on measurement precision, quantitative statistical analysis, and tightly controlled experimental design. This scientific tradition links us with our colleagues in the natural and social sciences, and represents our preferred "way of knowing" the world. In contrast, qualitative approaches to research seek to evaluate the quality, or essence, of human experience using a fundamentally different methodological and analytic framework (Mays & Pope, 1995, 2000; Pope, Ziebland, & Mays, 2000). Many psychologists are familiar with at least some qualitative research methods, exemplified, for example, in ethnography, sociometry, participant observation, or content analysis of discourse. However, methods such as convergent interviewing, focus groups, and personal histories are generally foreign to most students in scientist–practitioner programs. As applied to health care, qualitative researchers may seek to evaluate the experiences of brain-injured patients in rehabilitative settings as a way of enhancing the design of the rehabilitation environment for purposes of maximizing recovery. They may investigate case dispositions in a child neurosurgery clinic by evaluating commonalities among physicians' notes and clinical decisions. They may evaluate treatment acceptability by interviewing patients about their experiences in treatment. It is important for psychologists to become more familiar with these methods because many systematic reviews in the EBP literature contain the results of qualitative studies (Thomas et al., 2004). Although qualitative research is generally incapable of establishing causal relationships among variables, qualitative studies may be the only (and therefore the best) source of evidence for rare conditions, and they may suggest associations worthy of future research. Reviews of this area as applied to health care can be found in Greenhalgh and Taylor (1997), Grypdonck (2006), Holloway (1997), and Leininger (1994).
Conducting Systematic Reviews and Meta-Analyses
The explosion of relevant medical and psychological literature has made it difficult for scientist–practitioners to keep up with the best evidence at the single-study level while attending to multiple simultaneous demands on their time. For this reason, systematic reviews of the literature are becoming increasingly important sources of state-of-the-art information. Most graduate courses in research methodology and statistics devote little attention to conducting reviews or meta-analyses, although many programs now appear to offer grant-writing courses or seminars. In these courses, an emphasis on the design and critique of individual studies is commonplace, whereas development of skills in evaluating systematic reviews or meta-analyses is rare. If psychology is to become a key player in evidence-based practice, the next cadre of scientist–practitioners will have to develop skills in conducting and evaluating these kinds of reviews. In planning the needed education and training, it is important to distinguish between narrative reviews (the kind of review seen, for example, in Psychological Bulletin) and systematic reviews. Narrative reviews are conducted by knowledgeable persons who often conduct the review to advance a particular theoretical conclusion. They therefore yield potentially biased conclusions because there is no consensually agreed-upon method for combining and weighting results from different studies. In contrast, systematic reviews and meta-analyses proceed according to specified methodological conventions in which the search method, the procedure for including and excluding studies, and the method for eventually calculating effect sizes or odds ratios are specified beforehand (e.g., fixed-effects vs. random-effects models), as are methods for determining statistical and clinical significance (Cook, Mulrow, & Haynes, 1997; Cook, Sackett, & Spitzer, 1995; Quintana & Minami, 2006). Meta-analysis is a specific form of quantitative systematic review that aggregates the results of similar studies for purposes of generating more stable conclusions from pooled data than is possible at the individual-study level (Egger, Smith, & Phillips, 1997; Rosenthal & DiMatteo, 2001; Wolf, 1986). Recent techniques also allow bias in the published literature to be assessed, allowing the reader to appraise whether the results of the analysis reflect an undistorted view of effect size (Sterne, Egger, & Smith, 2001). Clinical psychologists need to know these basic concepts so that they can evaluate the relevance and quality of available evidence.
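As a minimal sketch of how a quantitative synthesis differs from a narrative summary, the code below pools hypothetical effect sizes under a fixed-effect, inverse-variance model, computes Cochran's Q as a heterogeneity check, and runs a bare-bones version of Egger's regression test for funnel-plot asymmetry (one of the publication-bias techniques alluded to above). All study values are invented; a real analysis would use dedicated tooling, and a random-effects model where heterogeneity warrants it:

```python
import math

# Hypothetical per-study effect sizes (e.g., Cohen's d) and their variances;
# all numbers are invented for illustration, not drawn from real trials.
effects   = [0.45, 0.62, 0.30, 0.55, 0.41]
variances = [0.040, 0.055, 0.030, 0.070, 0.045]

# Fixed-effect pooling: weight each study by the inverse of its variance.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))
print(f"pooled effect = {pooled:.2f} (SE = {se_pooled:.3f})")

# Cochran's Q: a simple test statistic for between-study heterogeneity.
# A large Q (relative to k - 1 df) argues for a random-effects model.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
print(f"Q = {q:.2f} on {len(effects) - 1} df")

# Egger's regression asymmetry test, a common publication-bias check:
# regress the standardized effect (e / se) on precision (1 / se); an
# intercept far from zero suggests funnel-plot asymmetry.
ses = [math.sqrt(v) for v in variances]
x = [1.0 / s for s in ses]                 # precision
y = [e / s for e, s in zip(effects, ses)]  # standardized effect
k = len(x)
mean_x, mean_y = sum(x) / k, sum(y) / k
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x
print(f"Egger intercept = {intercept:.2f}")
```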
Informatics and Database Searching Skills
If a tree falls in the woods and no one is there to hear it, does it make a sound? This classic conundrum about the nature of reality seems relevant to the key issue of information access in evidence-based practice. If useful information about best evidence exists, but we do not or cannot access it, it cannot be brought to bear on clinical decision making (Slawson & Shaughnessy, 2005). For this reason, developing expertise in informatics and database searching is a critical step in making EBPP a reality. In my experience, most psychologists, and students of psychology, search a limited number of databases (PubMed, U.S. National Library of Medicine, 1971; PsycINFO, APA, 1967) with a single search term and (at most) a single Boolean operator. It is not uncommon for the supervisor of a fledgling student to hear that "there's nothing in the literature" about a topic the student is interested in researching. Most use very little of what is available, and many are completely unaware of many of the most important and useful resources available for EBPP.
A detailed discussion of these resources is beyond my scope (see Hunt & McKibbon, 1997); nevertheless, it seems critical that some effort be devoted (either in faculty development seminars or in graduate education) to addressing database availability explicitly, including access strategies, search methodology, and approaches to information management (managing search results). A key first step may be to establish a close relationship with a librarian or library informatics specialist who can help translate educational and research needs into strategies for accessing needed information, and who can provide access to needed databases and other resources. It is not uncommon, particularly in larger institutions, for at least one member of the library staff to be particularly skilled in evidence-based medicine. A number of databases are of particular relevance to EBPP, including CINAHL (nursing and allied health; Cinahl Information Systems, 1984), EMBASE (1974), the Cochrane Library (including the Cochrane Database of Systematic Reviews [CDSR]; Cochrane Library, 1999a), the Database of Abstracts of Reviews of Effects (DARE; Cochrane Library, 1999b), the Cochrane Central Register of Controlled Trials (CENTRAL; Cochrane Library, 1999c), and the ACP Journal Club (American College of Physicians, 1994), available on the Ovid (Ovid Technologies, New York, NY) search engine (for a more in-depth discussion, see Walker & London, this issue).
Obtaining access to these databases is only part of the story; the development of strategic searching skills designed to yield a manageable number of relevant search results is a key outcome goal of educational efforts, and it will be achieved only through actual practice in problem-based learning situations. Finally, development of a local or profession-wide resource that contains the answers to evidence-based queries (so-called critically appraised topics, or CATs) will enable students and their mentors to benefit from the evidence-based practice efforts of their colleagues. Other authors in this series have suggested ways of incorporating skill-building activities into practicum and other parts of the psychology curriculum (see Collins, Leffingwell, & Belar, this issue; DiLillo & McChargue, this issue).
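To suggest what a more strategic search might look like in practice, here is a sketch that submits a multi-term Boolean query to PubMed through Biopython's Entrez utilities; the package (biopython), the contact e-mail, and the query terms are all illustrative assumptions rather than a prescribed strategy:

```python
# A sketch of a strategic, multi-term Boolean search against PubMed,
# using Biopython's Entrez utilities (pip install biopython).
# The query combines a population, an intervention, and a design filter,
# rather than a single keyword; all terms are illustrative only.
from Bio import Entrez

Entrez.email = "your.name@example.edu"  # NCBI asks for a contact address

query = (
    '("panic disorder"[MeSH Terms] OR "panic disorder"[Title/Abstract]) '
    'AND ("cognitive behavioral therapy"[Title/Abstract] OR CBT[Title/Abstract]) '
    'AND (randomized controlled trial[Publication Type] '
    'OR "systematic review"[Title/Abstract])'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=20)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records matched; first IDs: {record['IdList'][:5]}")
```

The point of the design filter is exactly the one made above: a well-constructed Boolean strategy returns a manageable, decision-relevant result set rather than "nothing in the literature" or thousands of unusable hits.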
The Way Forward
In this article, I have tried to highlight ways that the interdisciplinary trend toward evidence-based practice offers real opportunities to address some difficult research problems and to revitalize certain aspects of our graduate curricula. This brief analysis has likely raised more questions (e.g., How? When? By whom? In what way?) as far as the training implications are concerned, and it has not dealt at all with criticisms that have been thoughtfully levied against the EBP approach to research and research training. One key issue in advancing EBP within psychology will be to pay attention to the key stage of the process by which knowledge (best evidence) is transformed into action and application. This, in my view, is the stage of the process that is least understood from a psychological viewpoint. What are the principles by which best evidence can be modified to fit the individual case? What evidence is "good enough" to drive a clinical decision? What about those aspects of psychological health care (e.g., relationship, trust, identification, and modeling) that are implicitly important in the delivery of services but that do not themselves have large-scale independent empirical support? These (and others) are key questions we will need to grapple with as we implement an evidence base for clinical psychology and teach students how to access and use it.
With regard to pedagogy, I am convinced that the only way forward is to incorporate problem-based, real-time experiences throughout the curriculum in which students can learn to walk the EBPP walk. This is a significant undertaking with profound implications as far as faculty development is concerned. I am as skeptical of an Evidence-Based Practice Course as a way to develop the needed skills and capacities of our students as I am that a Cultural Diversity Course will somehow build multicultural competencies. We will need to figure out how to incorporate the content, the concepts, and the techniques of evidence-based psychological practice at all levels of research and clinical training if we are to be truly successful in assimilating the EBPP way of thinking. We cannot do it all at once; faculty are generally not up to speed with all that is needed, and, for the practicing clinician, health care events proceed at a rapid pace. But we can begin the process by equipping tomorrow's psychological practitioners with the tools necessary to implement EBPP in their everyday clinical practice. In addition, we can capitalize on the obvious opportunities to expand our multidisciplinary interdependence with other health professionals in nursing, medicine, pharmacy, and public health who are further down the EBP road than we are. Providing faculty with needed support, and developing methods for educating and training tomorrow's psychologists in EBPP, is critically needed to establish an evidence base equal to the task of providing quality psychological health care for those who depend on us.
References
American College of Physicians. (1994). ACP Journal Club homepage. Retrieved February 15, 2007, from http://www.acpjc.org
American Psychological Association. (1967). PsycINFO homepage. Retrieved February 15, 2007, from http://www.apa.org/psycinfo/products/psycinfo.html
Begg, C., Cho, M., Eastwood, S., Horton, R., Moher, D., Olkin, I., et al. (1996). Improving the quality of reporting of randomized controlled trials: The CONSORT statement. Journal of the American Medical Association, 276, 637–639.
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally College Publishing.
Chambless, D. L. (1995). Training and dissemination of empirically validated psychological treatments: Report and recommendations. The Clinical Psychologist, 48, 3–23.
Chambless, D. L., Baker, M. J., Baucom, D. H., et al. (1998). Update on empirically validated therapies, II. The Clinical Psychologist, 51, 3–16.
Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716.
Chelune, G. (1995). Hippocampal adequacy versus functional reserve: Predicting memory functions following temporal lobectomy. Archives of Clinical Neuropsychology, 10, 413–432.
Cinahl Information Systems. (1984). Homepage. Retrieved February 15, 2007, from http://www.cinahl.com
Cochrane Library. (1999a). Homepage. Retrieved February 15, 2007, from http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME
Cochrane Library. (1999b). DARE homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_cldare_articles_fs.html
Cochrane Library. (1999c). Cochrane Central Register of Controlled Trials homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_clcentral_articles_fs.html
Collins, F. L., Leffingwell, T. R., & Belar, C. D. (2007). Teaching evidence-based practice: Implications for psychology. Journal of Clinical Psychology, 63, 657–670.
Cook, D. J., Mulrow, C. D., & Haynes, R. B. (1997). Systematic reviews: Synthesis of best evidence for clinical decisions. Annals of Internal Medicine, 126, 376–380.
Cook, D. J., Sackett, D. L., & Spitzer, W. O. (1995). Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam Consultation on Meta-Analysis. Journal of Clinical Epidemiology, 48, 167–171.
Davidson, K. W., & Spring, B. (2006). Developing an evidence base in clinical psychology. Journal of Clinical Psychology, 62, 259–271.
DiLillo, D., & McChargue, D. (2007). Implementing evidence-based practice training in a scientist–practitioner program. Journal of Clinical Psychology, 63, 671–684.
Egger, M., Smith, G. D., & Phillips, A. N. (1997). Meta-analysis: Principles and procedures. British Medical Journal, 315, 1533–1537.
EMBASE. (1974). Homepage. Retrieved February 15, 2007, from http://www.embase.com
Fraser, S. W., & Greenhalgh, T. (2001). Coping with complexity: Educating for capability. British Medical Journal, 323, 799–803.
Greenhalgh, T., & Taylor, R. (1997). How to read a paper: Papers that go beyond numbers (qualitative research). British Medical Journal, 315, 740–743.
Grypdonck, M. H. (2006). Qualitative health research in the era of evidence-based practice. Qualitative Health Research, 16, 1371–1385.
Holloway, I. (1997). Basic concepts for qualitative research. Oxford: Blackwell Science.
Hunt, D. L., & McKibbon, K. A. (1997). Locating and appraising systematic reviews. Annals of Internal Medicine, 126, 532–538.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press.
Leininger, M. (1994). Evaluation criteria and critique of qualitative research studies. In J. M. Morse (Ed.), Critical issues in qualitative research methods. Thousand Oaks, CA: Sage.
Mayer, D. (2004). Essential evidence-based medicine. New York: Cambridge University Press.
Mays, N., & Pope, C. (1995). Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. British Medical Journal, 311, 42–45.
Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. British Medical Journal, 320, 50–52.
McCabe, O. L. (2006). Evidence-based practice in mental health: Accessing, appraising, and adopting research data. International Journal of Mental Health, 35, 50–69.
Moher, D., Schulz, K. F., & Altman, D. G. (2001). The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomized trials. Lancet, 357, 1191–1194.
Paul, G. L. (1967). Outcome research in psychotherapy. Journal of Consulting Psychology, 31, 109–118.
Paul, G. L. (1969). Behavior modification research: Design and tactics. In C. M. Franks (Ed.), Behavior therapy: Appraisal and status (pp. 29–62). New York: McGraw-Hill.
Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analyzing qualitative data. British Medical Journal, 320, 114–116.
Quintana, S. M., & Minami, T. (2006). Guidelines for meta-analyses of counseling psychology research. The Counseling Psychologist, 34, 839–877.
Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52, 59–82.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn't. British Medical Journal, 312, 71–72.
Slawson, D. C., & Shaughnessy, A. F. (2005). Teaching evidence based medicine: Should we be teaching information management instead? Academic Medicine, 80, 685–689.
Sterne, J. A. C., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. British Medical Journal, 323, 101–105.
Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence based medicine: How to practice and teach EBM. Edinburgh: Elsevier/Churchill Livingstone.
Thomas, J., Harden, A., Oakley, A., Oliver, S., Sutcliffe, K., Rees, R., et al. (2004). Integrating qualitative research with trials in systematic reviews. British Medical Journal, 328, 1010–1012.
U.S. National Library of Medicine. (1971). Medline/PubMed homepage. Retrieved February 15, 2007, from http://www.ncbi.nlm.nih.gov/entrez/query.fcgi
Walker, B. W., & London, S. (2007). Novel tools and resources for evidence-based practice in psychology. Journal of Clinical Psychology, 63, 633–642.
Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. Beverly Hills, CA: Sage.
Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically supported treatments: 10 years later. The Clinical Psychologist, 58, 5–11.
From the Editors

Practice-Based Evidence: Back to the Future

Larry K. Brendtro, Martin L. Mitchell, & James Doncaster

Researchers are shifting from the medical model of studying treatments to a practice-based model focusing on the nature and needs of a person in a therapeutic relationship. As seen from the articles in this special issue, this has been a central tenet of Re-ED since it was founded by Nicholas Hobbs fifty years ago.
Confusion abounds about what qualifies as "evidence" of effective interventions. The president of the American Psychological Association (APA) notes that "much of the research that guides evidence-based practice is too inaccessible, overwhelming, and removed from practice" (Goodheart, 2010, p. 9). Yet lists of evidence-based treatments are being used to control funding in treatment, human services, and education. Stated simply, such policies are based on shaky science. Certainly there is no shortage of evidence that some methods are destructive, like withholding treatment or placing traumatized kids in toxic environments. But a wide variety of therapeutic interventions can have a positive impact if conducted within a trusting alliance.
There are two very different views of what evidence is most important. Research in the traditional medical model compares a proposed treatment with alternates or a placebo. If a prescribed number of published studies give a statistical edge, the treatment is anointed as "evidence-based." This is followed by endorsements from the National Institutes of Health, the Department of Education, or other authoritative bodies.

Providing lists of curative treatments may work for medicine, but this is not how to find what works in complex therapeutic relationships. Mental health research has shown that the process of enshrining specific treatment models as evidence-based rests on flawed science (Chan, Hróbjartsson, Haahr, Gøtzsche, & Altman, 2004). Dennis Gorman (2008) of Texas A&M University documents similar problems with school-based substance abuse and violence prevention research, which he calls scientific nonsense.

Julia Littell (2010) of the Campbell Collaboration documents dozens of ways that sloppy science is being used to elevate specific treatments to evidence-based status. Here are just a few of these research flaws:

Allegiance Effect: Studies produced by advocates of a particular method are positively biased.

File Cabinet Effect: Studies showing failure or no effects are tucked away and not submitted for publication.

Pollyanna Publishing Effect: Professional journals are much more likely to publish studies that show positive effects and to reject those that do not.

Replication by Repetition Effect: Reviewers rely heavily on recycling findings cited by others, confusing rumor and repetition with replication.

Silence the Messenger Effect: Those who raise questions about the scientific base of studies are met with hostility and ad hominem attacks.
When researchers account for such biases, a clear pattern emerges. Widely touted evidence-based treatments turn out to be no better or no worse than other approaches. Solid science speaks: success does not lie in the specific method but in common factors, the most important being the helping relationship.
Our field is in ferment as the focus of research is shifting. Instead of the study of treatments, the child now takes center stage. The practice-based model focuses on the nature and needs of an individual in an ecology (Brendtro & Mitchell, 2010). Effective interventions use research and practice expertise to target client characteristics including problems, strengths, culture, and motivation (APA, 2006). Research and evaluation measure progress and provide feedback on the quality of the therapeutic alliance (Duncan, Miller, Wampold, & Hubble, 2010).
Re-ED is rooted in practice-based evidence. It taps a rich tradition of research, provides tools for direct work with youth, and tailors interventions to the individual child in an ecosystem (Cantrell & Cantrell, 2007; Freado, 2010). Fifty years after they were developed by Nicholas Hobbs and colleagues, the Re-ED principles offer a still-current map for meeting modern challenges. Re-ED does not impose a narrowly prescribed regimen of treatment, but uses human relationships to change the world one child at a time.
Larry K. Brendtro, PhD, is Dean of the Starr Institute for Training and co-editor of this journal with Martin L. Mitchell, EdD, President and CEO of Starr Commonwealth, Albion, Michigan. They can be contacted via email at [email protected]

James Doncaster, MA, is the senior director of organizational development at Pressley Ridge in Pittsburgh, Pennsylvania, and is guest editor of this special issue on the fiftieth anniversary of the founding of Re-ED. He may be contacted at [email protected]
References

APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271–285.
Brendtro, L., & Mitchell, M. (2010). Weighing the evidence: From chaos to consilience. Reclaiming Children and Youth, 19(2), 3–9.
Cantrell, R., & Cantrell, M. (2007). Helping troubled children and youth. Memphis, TN: American Re-Education Association.
Chan, A., Hróbjartsson, A., Haahr, M., Gøtzsche, P., & Altman, D. (2004). Empirical evidence for selective reporting of outcomes in randomized trials: Comparison of protocols to published articles. JAMA, 291, 2457–2465.
Duncan, B., Miller, S., Wampold, B., & Hubble, M. (Eds.). (2010). The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.
Freado, M. (2010). Measuring the impact of Re-ED. Reclaiming Children and Youth, 19(2), 28–31.
Goodheart, C. (2010). The education you need to know. Monitor on Psychology, 41(7), 9.
Gorman, D. (2008). Science, pseudoscience, and the need for practical knowledge. Addiction, 103, 1752–1753.
Littell, J. (2010). Evidence-based practice: Evidence or orthodoxy. In B. Duncan, S. Miller, B. Wampold, & M. Hubble (Eds.), The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.
PRINCIPLES OF RE-ED

Trust between a child and adult is essential, the foundation on which all other principles rest.
Life is to be lived now, not in the past, and lived in the future only as a present challenge.
Competence makes a difference, and children should be good at something, especially at school.
Time is an ally, working on the side of growth in a period of development.
Self-control can be taught, and children and adolescents can be helped to manage their behavior.
Intelligence can be taught to cope with challenges of family, school, and community.
Feelings should be nurtured, controlled when necessary, and explored with trusted others.
The group is very important to young people, and it can be a major source of instruction in growing up.
Ceremony and ritual give order, stability, and confidence to troubled children and adolescents.
The body is the armature of the self, around which the psychological self is constructed.
Communities are important so youth can participate and learn to serve.
A child should know some joy in each day.

Hobbs, N. (1982). The troubled and troubling child. San Francisco, CA: Jossey-Bass.
Presidential Address – 2012

Psychological Treatments: Putting Evidence Into Practice and Practice Into Evidence

DAVID J. A. DOZOIS
University of Western Ontario
Abstract

In June 2011, the Canadian Psychological Association (CPA) Board of Directors launched a task force on the evidence-based practice of psychological treatments. The purpose of this task force was to operationalize what constitutes evidence-based practice in psychological treatment, to make recommendations about how psychologists can best integrate evidence into practice, and to disseminate information to consumers about evidence-based interventions. An important impetus for this task force was the continuing and widening scientist–practitioner gap. There are both barriers and opportunities when it comes to promoting greater reliance on the scientific literature and greater uptake of empirically supported treatments among practitioners. Two main factors prevail. For one, there is considerable controversy over what constitutes best evidence. The second is that researchers often do not communicate their findings in a manner that effectively translates their results from the laboratory to the clinic. It is crucial that we not only make practice evidence-based but also make evidence practice-based. In this article, I focus on current issues and opportunities with respect to evidence-based practice and identify strategies for closing the gap between research and practice.

Keywords: evidence-based practice, evidence-based treatment, empirically supported treatment, bridging research and practice, psychotherapy
A number of years ago, as I was heading out of the house to attend my undergraduate classes, my father said to me, "What do you have today, David?" I told him, "I have personality and motivation." "Good for you!" he said. I am fortunate to have had, and continue to have, a great relationship with my parents. We have a lot of fun together, and my parents have always been an incredible encouragement to me. In preparing for my address, my dad, a retired minister, also provided me with some good advice: "If you don't strike oil in the first 20 minutes, stop boring."
As President of the Canadian Psychological Association (CPA), I have the special honour of providing an address to the membership. I intend to use this platform to share with Canadian psychologists some ideas related to evidence-based practice. Part of my presidential mandate was for CPA to develop its own position on the evidence-based practice of psychological treatments to support and guide practice as well as to inform stakeholders. Psychological health and disorders are clearly a priority for many of Canada's stakeholder groups (e.g., Mental Health Commission of Canada, Treasury Board, Public Health Agency of Canada), and their effective treatment needs to become a priority for CPA as well. When I first brought this idea to the CPA Board of Directors in March 2011, Dr. Lorne Sexton, who was on the board in the portfolio of Professional Affairs, and who had just chaired a task force on prescriptive authority for psychologists, said, "And I thought prescription privileges was controversial."
To be sure, this is a sensitive topic, and I hope that I will deal with it appropriately and at least do it some justice. In his classic monograph, "Why I Do Not Attend Case Conferences," Paul Meehl (1973) began by stating, "The first portion of the paper will be highly critical and aggressively polemic (If you want to shake people up, you have to raise a little hell). The second part, while not claiming grandiosely to offer a definitive solution to the problem, proposes some directions of thinking and 'experimenting' that might lead to a significant improvement over current conditions" (p. 227). Although I have no intention of raising a little hell, I would similarly like to highlight the problem and then move toward some potential, not grandiose or definitive, but potential solutions.
After briefly highlighting some of the outcome data that support the idea that psychological treatments are effective for a variety of mental health problems, I would like to address the difficult fact that the empirical research is often not utilized by practitioners. There are various reasons why clinicians may not read the literature or apply it to their practices, and I will focus on some of these concerns. Following this brief review, I will provide a quick update on the work of the CPA Task Force on Evidence-Based Practice of Psychological Treatments because I think it helps to address the issues of "What is evidence-based practice?" and "How should evidence be used?", both of which have been cited as barriers to promoting greater reliance on the scientific literature among practitioners. I will conclude with some recommendations, both for the practitioner and the scientist, for bridging the gap between science and practice.

Correspondence concerning this article should be addressed to David J. A. Dozois, Department of Psychology, Westminster Hall, Room 313E, University of Western Ontario, London, Ontario N6A 3K7 Canada. E-mail: [email protected]

Canadian Psychology / Psychologie canadienne, 2013, Vol. 54, No. 1, 1–11. © 2013 Canadian Psychological Association. DOI: 10.1037/a0031125
Efficacy of Psychological Treatments
Psychological treatments are efficacious for a number of different disorders (e.g., Australian Psychological Society, 2010; Beck & Dozois, 2011; Butler, Chapman, Forman, & Beck, 2006; Chambless & Ollendick, 2001; Epp & Dobson, 2010; Hofmann, Asnaani, Vonk, Sawyer, & Fang, 2012; Nathan & Gorman, 1998; Ruscio & Holohan, 2006). Although space restrictions preclude a fulsome review of this literature, I will give a couple of examples. The Australian Psychological Society (2010) published a comprehensive review of the best evidence available on the efficacy of psychological interventions for a broad range of mental disorders. The research was evaluated according to its evidentiary level, quality, relevance, and strength. Included in this document were systematic reviews and meta-analyses, randomized controlled trials, nonrandomized controlled trials, comparative studies, and case series.
I will just focus on the findings for the treatment of adults for illustration purposes (see Table 1). For depression, the highest level of empirical support was for cognitive–behaviour therapy (CBT), interpersonal psychotherapy (IPT), brief psychodynamic psychotherapy, and CBT-oriented self-help interventions. The highest level of support for bipolar disorder was obtained for CBT, IPT, family therapy, mindfulness-based cognitive therapy, and psychoeducation as treatments adjunctive to pharmacotherapy. Across the anxiety disorders (including generalised anxiety disorder, panic disorder, specific phobia, social anxiety, obsessive–compulsive disorder, and posttraumatic stress disorder [PTSD]), the highest level of evidence obtained was for CBT. Both CBT and motivational interviewing were deemed effective for substance-use disorders. Whereas CBT was the most consistently supported treatment for bulimia nervosa and binge-eating disorder, family therapy and psychodynamic therapy obtained the most support for anorexia nervosa. CBT also had the most support for sleep disorders, sexual disorders, pain, chronic fatigue, somatization, hypochondriasis, and body dysmorphic disorder. CBT and family therapy were considered the most effective interventions for psychotic disorders. Finally, dialectical behaviour therapy received the most empirical support for borderline personality disorder (Australian Psychological Society, 2010). I should note that there was some support for other types of interventions as well, although they did not have the highest degree of research support.
This is positive news. Many psychological treatments are not only effective for treating mental health problems but also demonstrate durable effects. In the case of depression, for example, CBT is as effective as medication for the treatment of an acute episode (DeRubeis, Gelfand, Tang, & Simons, 1999; DeRubeis et al., 2005; DeRubeis, Webb, Tang, & Beck, 2010) but significantly reduces the risk of relapse relative to pharmacotherapy (Hollon et al., 2005). In fact, the average risk of relapse following antidepressant medication is more than double the rate following CBT (i.e., 60% compared with 25%, based on follow-up periods of 1 to 2 years; see Gloaguen, Cottraux, Cucherat, & Blackburn, 1998).
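As a back-of-the-envelope translation of these relapse rates into the risk language of evidence-based practice (treating relapse as the unfavorable outcome):

$$\text{ARR} = 0.60 - 0.25 = 0.35, \qquad \text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.35} \approx 2.9$$

In other words, on these figures, roughly one additional relapse is averted for every three patients who receive CBT rather than medication alone.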
In addition to the efficacy of psychological interventions, a strong economic case can also be made for their cost recovery. David M. Clark (CPA's 2011 to 2012 Honorary President) and his colleagues (D. M. Clark et al., 2009), for example, argued that psychological treatments would largely pay for themselves by reducing the costs associated with disability and increasing revenue related to return to work and increased productivity (also see Centre for Economic Performance's Mental Health Policy Group, 2012; D. M. Clark, 2012; Layard, Clark, Knapp, & Mayraz, 2007; Myhr & Payne, 2006). The cost-effectiveness of these interventions, and the importance of evidence-based practice, was also recently highlighted in a report of the Mental Health Commission of Canada (2012).

Table 1
Psychological Treatments With the Highest Level of Support (Adults)

Mood disorders
  Depression: cognitive–behavior therapy; interpersonal psychotherapy; psychodynamic psychotherapy; self-help (cognitive–behavior therapy)
  Bipolar disorder (as adjunct to medication): cognitive–behavior therapy; interpersonal psychotherapy; family therapy; mindfulness-based cognitive therapy; psychoeducation

Anxiety disorders (generalized anxiety disorder, panic disorder, specific phobia, social anxiety, obsessive–compulsive disorder, posttraumatic stress disorder): cognitive–behavior therapy

Substance-use disorders: cognitive–behavior therapy; motivational interviewing

Sleep disorders: cognitive–behavior therapy

Eating disorders
  Anorexia nervosa: family therapy; psychodynamic psychotherapy
  Bulimia nervosa: cognitive–behavior therapy
  Binge-eating disorder: cognitive–behavior therapy

Somatoform disorders (pain, chronic fatigue, somatization, hypochondriasis, body dysmorphic disorder): cognitive–behavior therapy

Borderline personality disorder: dialectical behavior therapy

Psychotic disorders: cognitive–behavior therapy; family therapy

Dissociative disorders: cognitive–behavior therapy (few studies have investigated the effectiveness of treatments for dissociative disorders)

Note. Source: Australian Psychological Society (2010).
The Scientist–Practitioner Gap
Notwithstanding compelling data on their efficacy and effectiveness, few practitioners utilize the treatments that have garnered the strongest scientific support. Do not get me wrong: many psychologists do keep up with the literature and practice in an evidence-based manner (Beutler, Williams, Wakefield, & Entwistle, 1995; Sternberg, 2006). Yet there is considerable evidence of a scientist–practitioner gap (Babione, 2010; Lilienfeld, 2010; Meehl, 1987; Ruscio & Holohan, 2006; Stewart & Chambless, 2007). For instance, few clients with depression and panic disorder receive scientifically supported treatments (Lilienfeld, 2010). Although the majority of psychologists (88%) surveyed reported using CBT techniques to treat anxiety, most did not use exposure or response prevention in the treatment of obsessive–compulsive disorder, and 76% indicated that they rarely or never used interoceptive exposure in the treatment of panic disorder (Freiheit, Vye, Swan, & Cady, 2004).
Roz Shafran and her colleagues (Shafran et al., 2009) reported that, in 1996, psychodynamic psychotherapy was the most common psychological treatment offered for generalised anxiety disorder, panic disorder, and social phobia. Supportive counselling was the most common treatment for PTSD in the United Kingdom, despite treatment guidelines (National Institute for Health and Clinical Excellence, 2005) that recommend trauma-focused psychological interventions as the treatments of choice. Sadly, many practitioners remain uninformed of relevant research, believe that it is not relevant for their practices, and neglect to evaluate outcome in their own clinical work (Lehman, 2010; Parrish & Rubin, 2011; Stewart & Chambless, 2007).
This issue came to light a few years ago in an article written by Baker, McFall, and Shoham (2008) and published in the journal Psychological Science in the Public Interest. The Washington Post picked up this story under the title "Is Your Therapist a Little Behind the Times?" Baker et al. (2009) wrote:

A young woman enters a physician's office seeking help for diabetes. She assumes that the physician has been trained to understand, value and use the latest science related to her disorder. Down the hall, a young man enters a clinical psychologist's office seeking help for depression. He similarly assumes that the psychologist has been trained to understand, value and use current research on his disorder. The first patient would be justified in her beliefs; the second, often, would not. This is the overarching conclusion of a 2-year analysis that [was] published on the views and practices of hundreds of clinical psychologists.
Barriers to Promoting Greater Reliance on the Scientific Literature
Well, what are some of the barriers to promoting greater reliance on the scientific literature? Pagoto et al. (2007) posed questions to members of various professional Listservs in clinical psychology, health psychology, and behavioural medicine to identify an initial (rather than representative) list of barriers and facilitators regarding evidence-based practice. Respondents were asked to submit their top one to two barriers and facilitators. The top barrier pertained to attitudes toward evidence-based practice. For example, there is the perception that "EBP forces psychology to become a hard science, thereby dampening the discipline's humanity" (Pagoto et al., 2007, p. 700). Concern was also expressed that clinical evidence is viewed as more valuable than scientific evidence. This finding concurs with Stewart and Chambless (2007), who sampled 519 psychologists in independent practice. Practitioners mildly agreed that psychotherapy outcome research has much meaning for their practices; they moderately to strongly agreed that past clinical experience affects their treatment decisions, whereas there was only mild agreement that treatment outcome research influences usual practice (also see Shafran et al., 2009).
This issue is extraordinarily complex. I do not pretend to have the answers, nor could I adequately describe in this article all of the arguments surrounding this debate (for review, see Hunsley, 2007a; Norcross, Beutler, & Levant, 2005; Westen, Novotny, & Thompson-Brenner, 2004). In a nutshell, we have a diversity of perspectives on the "truth" and on what is important in therapy. At one end of the spectrum are researchers who work tirelessly to develop and disseminate the results of randomized controlled trials. These individuals may caricature some psychotherapists as flying by the seat of their pants rather than grounding their work in evidence. At the other end, we have front-line clinicians who work tirelessly to help their patients with complex comorbid problems. These practitioners may caricature researchers as ivory-tower academics who do not understand the clinical realities of day-to-day practice and who study unrepresentative patients in highly controlled environments (Fertuck, 2007).
A number of arguments are cited in the literature as to why clinicians may not use or value the scientific literature (see Hunsley, 2007a; Kazdin, 2008; Shafran et al., 2009; Westen et al., 2004). For example, arguments have been advanced that research trials have limited applicability to actual clinical practice. Patients treated in psychotherapy outcome trials, for example, are believed to be less severe and less complex (e.g., with fewer comorbid conditions) than are individuals seen in actual practice. In contrast to this idea, however, patients in regular clinical practice are often excluded from clinical trials because they do not meet the trials' severity or duration criteria (e.g., Stirman, DeRubeis, Crits-Christoph, & Brody, 2003). In addition, many therapy trials permit most types of comorbidity (e.g., DeRubeis et al., 2005; Hollon et al., 2005; Stirman, DeRubeis, Crits-Christoph, & Rothman, 2005).
Another related criticism pertains to the idea that research findings may not generalise to clinical practice (Margison et al., 2000; Ruscio & Holohan, 2006). In other words, there may be a difference between efficacy (i.e., that the intervention works under highly controlled conditions) and effectiveness (i.e., that the intervention also works under normal circumstances). In a review of the treatment effectiveness literature, however, Hunsley and Lee (2007) concluded that the majority of effectiveness studies show completion rates and outcomes comparable with the results typically obtained in randomized controlled trials (also see Teachman, Drabick, Hershenberg, Vivian, & Wolfe, 2012).
Others have reacted against the randomized controlled trial (RCT) as the "gold standard" of research. RCTs may be optimal for research in medicine, some claim, but are not necessarily the most appropriate way to investigate psychotherapy outcome (Bohart, 2005; Westen & Morrison, 2001). In the realm of psychotherapy, this reactivity to RCTs has been further reinforced by the development of lists of empirically supported treatments. Commissioned by Division 12 (Clinical Psychology) of the American Psychological Association (APA), the Task Force on Promotion and Dissemination of Psychological Procedures published its 1995 report, which listed treatments considered to be either well established or probably efficacious according to a standard set of criteria (e.g., Chambless et al., 1996). These criteria were also adopted by the Clinical Section of CPA in its task force report, Empirically Supported Treatments in Psychology: Implications for Canadian Professional Psychology (Hunsley, Dobson, Johnston, & Mikail, 1999a, 1999b).
The APA’s criteria for empirically supported treatments elicited
both enthusiasm and controversy. Although there was
excitement
about the recognition of “effective” psychological treatments
there
were also myriad concerns. For example, some psychologists
expressed resistance to this top-down approach and perceived
the
criteria to be overly rigid and restrictive, arguing that the type
of
research deemed necessary to produce supportive evidence for a
treatment is incompatible with schools of psychotherapy outside
of
the cognitive and behavioural framework (see Bryceland &
Stam,
2005; Stuart & Lilienfeld, 2007). Although I believe the
movement
toward empirically-supported treatments is well intentioned, I
agree that there are issues with defining evidence in this limited
manner.
The reality, though, is that we need rigorous controlled research to evaluate the impact of our interventions. Tight experimental control, operational definitions, random assignment, precise measurement, and statistical significance—all of which make us concerned about external and ecological validity—are at the crux of experimental design (Kazdin, 2008; Lilienfeld, 2010). Obviously, RCTs do not answer all of our questions and the findings need to be applied to the real world, but we do need controlled research.
You see, science sets up safeguards against biases. I may see a depressed individual improve in therapy and conclude that my intervention worked. In addition to my own clinical observations, there may also be self-report data available (e.g., the Beck Depression Inventory-II; Beck, Steer, & Brown, 1996) that indicate significant improvement. Yet my conclusion may be erroneous because rival explanations could account for this change (e.g., regression to the mean due to repeated measurement, spontaneous remission; see Lilienfeld, 2010).
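To make the regression-to-the-mean rival explanation concrete, here is a short simulation (my illustration, not part of the original address; all numbers are arbitrary). Clients tend to enter treatment when observed symptoms are at a peak, so a second measurement drifts back toward their average even when no treatment occurs.

```python
import random

random.seed(1)

# Each person's observed symptom score = stable true severity + measurement noise.
def observed(true_severity):
    return true_severity + random.gauss(0, 5)

population = [random.gauss(20, 6) for _ in range(10_000)]  # true severities

# Intake selects people whose *observed* score is extreme (>= 30),
# mimicking clients who present for treatment when symptoms peak.
intake = [(t, observed(t)) for t in population]
selected = [(t, s1) for (t, s1) in intake if s1 >= 30]

# Re-test the same people later with NO intervention at all.
retest = [observed(t) for (t, _) in selected]

mean_intake = sum(s for _, s in selected) / len(selected)
mean_retest = sum(retest) / len(retest)
print(f"Mean score at intake: {mean_intake:.1f}")
print(f"Mean score at retest: {mean_retest:.1f}  (no treatment given)")
```

Running this shows the retest mean falling several points below the intake mean purely through selection on noisy measurements, which is exactly the pattern a clinician could mistake for treatment response.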
It is tempting for us as clinicians (and I note here that I have a small independent practice as well) to conclude that the research does not apply to our individual cases—that somehow applying a particular evidence-based treatment is akin to the Procrustean dilemma. Procrustes was a mythological character who boasted that every guest invited to his house would fit the guest room bed, irrespective of his or her size. This claim attracted considerable attention. What Procrustes failed to mention, however, was how he could make this happen—either by cutting off his guests' legs or stretching them to make them fit the bed (see Kuyken, Padesky, & Dudley, 2009). As therapists, we obviously do not want to cut off or distort a client's experience to fit our preexisting theories and present a "one size fits all" type of intervention (Kuyken et al., 2009). However, it would also be erroneous to conclude that, because a patient does not map perfectly onto those studied in an RCT, I should not pay attention to this research. As Meehl (1973) pointed out, doing so involves "failing to understand probability logic as applied to a single case" (p. 234). Incidentally, when I was in graduate school at the University of Calgary, the writings of Paul Meehl were pivotal to our training. I hope that this is still the case, and I encourage students, researchers, and clinicians to make Meehl's work a staple in their academic diet.
We might be tempted to state that we are not dealing with groups or the nomothetic; we are dealing with an individual, with the idiographic. However, decades of research have demonstrated that if we depart from actuarial decision making, we will get it wrong more times than we will get it right (Dawes, Faust, & Meehl, 1989; Grove & Lloyd, 2006; Meehl, 1954). As humans, we are prone to a range of biases that include confirmation bias, illusory correlations, neglect of base rates, and availability heuristics, to name a few (Chapman & Chapman, 1969, 1975; Chwalisz, 2003; Paley, 2006; Turk & Salovey, 1985; Tversky & Kahneman, 1973).
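Meehl's probability logic, and the base-rate neglect just noted, can be illustrated with a few lines of arithmetic (the numbers below are hypothetical, chosen only to show the trap). Even a clinical sign with impressive sensitivity and specificity is usually wrong when the condition it signals is rare:

```python
# Hypothetical numbers: a clinical "sign" with 90% sensitivity and
# 90% specificity for a condition whose base rate is only 2%.
sensitivity = 0.90
specificity = 0.90
base_rate = 0.02

# Bayes' rule: P(condition | sign) = P(sign | condition) * P(condition) / P(sign)
p_sign = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
ppv = sensitivity * base_rate / p_sign

print(f"P(condition | positive sign) = {ppv:.2f}")  # about 0.16
```

Despite 90% sensitivity and specificity, roughly five of every six positive "signs" here are false alarms, which is precisely the kind of single-case probability judgment that unaided intuition gets wrong.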
As Lilienfeld (2010) pointed out, scientific thinking is not natural for many of us; it is, in many ways, "uncommon sense, because it requires us to set aside our gut hunches and intuitions in lieu of convincing data. . . . Science requires us to override more automatic, effortless, and intuitive modes of thinking with more controlled, effortful, and reflective modes of thinking" (p. 282). Science helps to reduce human error. As Meehl (1987) stated, we need a "general scientific commitment not to be fooled and not to fool anybody else" (p. 9). The desire not to be fooled and not to fool anybody else needs to be fundamental to our fabric as psychologists, which is why evidence-based practice is so crucial.
Evidence-Based Practice
There is growing recognition in the field that the practice of professional psychology should be based on valid evidence regarding which approaches to intervention are most likely to be successful. In 2006, the APA established a task force on evidence-based practice in psychology that attempted to acknowledge multiple types of research evidence (American Psychological Association Presidential Task Force on Evidence-Based Practice, 2006, p. 273): "Evidence-based practice in psychology is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences" (also see Spring, 2007). Unfortunately, the APA task force identified evidence on a continuum "from uncorroborated clinical observations through meta-analyses of the results of RCTs" (Stuart & Lilienfeld, 2007, p. 615). The task force also said little about the need for ongoing idiographic evaluation of one's clinical cases. In addition, at the heart of the three circles is "clinical decision making"—yet, as I discussed earlier, clinical decision making is heavily prone to error.
CPA Task Force on Evidence-Based Psychological Treatments
As one of my presidential initiatives, the CPA Board of Directors launched the Task Force on the Evidence-Based Practice of Psychological Treatments in June 2011. The purpose of this task force was to operationalize what constitutes evidence-based practice in psychological treatments, to make recommendations about how psychologists can best integrate evidence into practice, and to disseminate information to consumers about evidence-based interventions. An important impetus for this task force was the continuing and widening scientist–practitioner gap.

The task force (which I co-chaired with Dr. Sam Mikail) was populated last summer and began its work in September 2011. Task force members were chosen to represent a variety of research, practice, knowledge-translation, consumer, and community perspectives. There is also good representation from different theoretical orientations, including interpersonal, emotion-focused, cognitive–behavioural, and psychodynamic perspectives.
We produced a document that operationalizes what constitutes evidence-based practice of psychological treatment. The members of the task force were interested in a definition of evidence-based practice that was complex enough to incorporate the following ideas: (a) peer-reviewed research evidence is central; (b) one should be evidence-based not only in one's general fund of knowledge but also in session-by-session work; and (c) the process is one of collaboration with the client/patient (rather than a top-down process). The Task Force on Evidence-Based Practice of Psychological Treatments will soon be releasing its final document, which will be posted on the website of the Canadian Psychological Association (see http://www.cpa.ca/aboutcpa/committees/cpataskforces/).
The next step involved establishing a hierarchy of evidence that was respectful of diverse research methodologies, palatable to different groups of individuals, and yet comprehensive and compelling (see Figure 1). For example, we stated that

although all research methodologies have some potential to provide relevant evidence, psychologists should first consider findings that are replicated across studies and that have utilized methodologies that address threats to the validity of obtained results (e.g., internal validity, external validity, generalizability, transferability). Thus, psychologists should consider the best available evidence, highest on the hierarchy of research evidence. Evidence lower on the hierarchy should be considered only to the extent that better research evidence does not exist, or if there are clear factors that mitigate against using the best evidence (Canadian Psychological Association, 2012, p. 8).

As shown in Figure 1, the psychologist is to use the hierarchy of evidence to make initial treatment decisions, and then monitor change over time, feeding back to the hierarchy again when necessary.

Figure 1. The hierarchy of research evidence related to clinical practice.
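Read procedurally, the process just described is a loop: begin with the highest tier of available evidence, monitor the individual client, and return to the hierarchy when progress stalls. The sketch below is my own illustration of that loop; the tier labels are hypothetical stand-ins rather than the wording of the CPA figure.

```python
# A minimal sketch (mine, not the task force's) of the decide-monitor-feedback
# loop described above. Tier labels are hypothetical stand-ins.

EVIDENCE_HIERARCHY = [
    "replicated meta-analyses / systematic reviews",
    "individual randomized controlled trials",
    "quasi-experimental and naturalistic studies",
    "clinical observation and case studies",
]

def choose_plan(evidence_by_tier, ruled_out):
    """Return the highest-tier treatment option not already ruled out
    (e.g., by clear mitigating factors or by a failed trial of the plan)."""
    for tier in EVIDENCE_HIERARCHY:
        for plan in evidence_by_tier.get(tier, []):
            if plan not in ruled_out:
                return plan
    return None  # nothing left on the hierarchy

def treat(evidence_by_tier, is_improving, review_points=6):
    """Pick the best-supported plan, monitor at each review point, and
    feed back to the hierarchy whenever the client is not improving."""
    ruled_out = set()
    plan = choose_plan(evidence_by_tier, ruled_out)
    for _ in range(review_points):
        if plan is not None and not is_improving(plan):
            ruled_out.add(plan)  # feed back to the hierarchy
            plan = choose_plan(evidence_by_tier, ruled_out)
    return plan
```

Here `is_improving` stands in for the ongoing, session-by-session measurement discussed later in this address.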
In March and April of 2012, the task force sought feedback on these core elements. Our next steps involved developing vignette examples to illustrate the process of being evidence-based in one's practice and making specific recommendations for the CPA Board for further development and dissemination. We have also developed an annotated resource list that will direct practitioners to where they can find the necessary information on evidence-based practice. A guide was also developed to highlight, for the general public, the added value that psychologists bring relative to other practitioners (e.g., research base, evidence-based focus).
It is important to point out that evidence-based practice is a process by which the best evidence available is used to make optimal clinical decisions. Some psychologists equate evidence-based practice with empirically supported therapies, but the two are not synonymous. There are, in fact, many ways to provide evidence-based treatment without employing techniques that are explicitly empirically supported (e.g., by focusing on effectiveness trials and naturalistic studies or by emphasising evidence-based procedures and principles of practice). Clinical practice should be evidence-informed, but it does not need to be evidence-driven (Bohart, 2005).
Closing the Gap Between Science and Practice
Although there is controversy regarding what constitutes "evidence," the vast majority of psychologists do support the idea that they should practice in a manner that is evidence-based. So what can scientists and practitioners do to close the gap? I think that the work of the CPA task force has been important in terms of providing a palatable definition of evidence-based practice that is neither too restrictive nor too diluted. We have also derived a hierarchy of evidence that is open to diverse methodologies but that focuses on the need to balance internal and external validity. Yet we need to do more to close this gap. What follows are some suggestions for the scientist and for the practitioner about how we can work together to improve evidence-based practice and practice-based evidence.
Recommendations for Scientists
Better translation of science. First, we need better strategies for communicating and translating research into practice. Beutler, Williams, Wakefield, and Entwistle (1995) conducted a survey of practitioners and clinical academic psychologists. Of the practitioners, 47% reported reading research articles at least monthly, 21% less than monthly, and 32% never. Beutler et al. argued, however, that practitioners do generally value research but need strategies to help them translate scientific findings into clinical practice. Generally speaking, we do not do a particularly good job of this.
We do not translate our findings from science to practice well. I remember an old cell phone commercial that highlighted the idea that you have fewer dropped calls and less interference if you use this particular service. The ad started with a man at the airport calling his partner, "Honey . . . I'm . . . leaving . . . you." Of course, with the right cell phone service, the message would have been accurately received: "Honey, I'm not leaving without you." We need to make sure that our results—our messages—are received clearly and accurately. Academic research articles may not even be the best venue for communicating research findings to clinicians (Beutler et al., 1995). In addition, the sheer number of research articles makes "keeping up" virtually impossible. As Spring (2011) noted, there are over 8,000 research articles published every day, which is why clinical practice guidelines and systematic reviews are so important.
Perhaps we will get better at translating science over time. In the spring of 2012, when the CPA Board met with local psychologists in university, hospital, and private-practice settings in Halifax, I had the privilege of speaking with Dr. Michelle Eskritt, an Associate Professor at Mount Saint Vincent University. Michelle informed me about an innovative new 4-year bachelor of science program in science communication, which intends to train individuals who can be good communicators of science. There is a related program at Laurentian University. We must create infrastructure for more efficient and effective translation of clinical research from the laboratory to the practice arena (King, 2006).
Researchers need to make evidence practice-based. To quote Lawrence Green (2007), professor of epidemiology and biostatistics at the University of California, San Francisco, "if we want more evidence-based practice, we need more practice-based evidence." We need to do more to make research useful to the clinician.
More effectiveness trials and better communication with practitioners. Second, as mentioned previously, we must demonstrate not only efficacy (that the intervention works under highly controlled conditions) but also effectiveness (that the intervention also works under normal circumstances). Earlier I noted the review by Hunsley and Lee (2007), which demonstrated that efficacy and effectiveness trials are comparable in terms of completion rates and outcome; however, there are only a small number of effectiveness trials in the literature.
Related to the need for more effectiveness trials is the need for better communication between scientists and clinicians (Teachman et al., 2012). Communication is two-way, not one-way, and practitioners understandably do not want to be disseminated upon (Wilson, 2011). Scientists also need to hear the important voice of practitioners about what works in the real world. One way the Society of Clinical Psychology (APA Division 12) is attempting to close the gap between science and practice is by providing clinicians with a voice in the research process. In various surveys, clinicians are afforded the opportunity to provide feedback on their use of empirically supported treatments in real-world practice. It is hoped that by fostering two-way rather than one-way communication, clinicians will be more likely to make use of research findings and that greater collaboration will take place (Goldfried, 2010).
Increased research on mechanisms of change. Third, we need more research on mechanisms of change. Numerous studies have shown that psychological interventions are effective for a host of conditions. What we do not understand well is why. Increased research on mechanisms of change is important and could help clinicians to determine which therapeutic ingredients to emphasise (D. A. Clark, in press; Kazdin, 2008). Demonstration of a link does not necessarily inform us about why such a relation exists. For example, knowing that gender is a risk factor in depression (with females twice as likely to become depressed as males) does not help me to understand why this is the case (Ingram, Miranda, & Segal, 1998; Ingram & Price, 2010). Similarly, just because a treatment works does not mean that we understand why or can capitalize on the mechanism of change.
In some of my own research, my colleagues and I have demonstrated that a well-organized negative representation of self (i.e., the organisation of the self-schema) meets sensitivity, specificity, and stability criteria as a vulnerability factor for depression (Dozois, 2007; Dozois & Dobson, 2001a, 2001b; Dozois, Eichstedt, Collins, Phoenix, & Harris, 2012; Lumley, Dozois, Hennig, & Marsh, 2012; Seeds & Dozois, 2010). In previous research, we have shown that negative cognitive organisation remains stable even after people improve from an episode of depression. In one randomized clinical trial, we examined the effects of cognitive therapy (CT) plus pharmacotherapy (PT) compared with medication alone on depressive symptoms, surface-level cognitions, and deeper-level cognitions (i.e., cognitive organisation; Dozois, Bieling, et al., 2009). Symptom reduction was equivalent for CT + PT and PT alone. Group differences were also not significant on more surface-level cognition (i.e., automatic thoughts, dysfunctional attitudes). Individuals in CT + PT, however, showed greater cognitive organisation for positive content and less interconnectedness of interpersonal negative content than did those treated with pharmacotherapy alone (this is illustrated in Figure 2). Obviously this finding needs to be replicated and examined in CT alone compared with PT alone, and I am working on that now with Dr. Lena Quilty and colleagues at the Centre for Addiction and Mental Health in Toronto. Nonetheless, this is the first evidence to suggest that the trait-like vulnerability of a highly interconnected negative self-structure can be modified by CT + PT. This finding may help to explain why CT reduces the risk of relapse or recurrence—it seems to change deeper-level cognition. Of course, an alternative explanation may be that relapse prevention has more to do with the accessibility of the schema (e.g., cognitive reactivity) than its organisation per se (cf. Segal, Gemar, & Williams, 1999; Segal et al., 2006). The flood of negative thoughts that occurs once the schema is activated, and what a patient does with such thoughts (e.g., ruminating on them vs. acceptance; Wells, in press), may be the most important predictor of relapse. Nonetheless, if these findings are replicated and a shift in the organisation of self-representation is an important mechanism of long-term treatment change, then treatments can target this explicitly.

Figure 2. Changes in cognitive organisation as a function of cognitive therapy.

By understanding how treatment works, we will be in a better position to capitalize on and match patients to variables that are critical to outcome (Kazdin, 2008). We will also be able to deliver treatment "doses" to specific patients in a manner that will maximize resources (cf. Day, Eyer, & Thorn, in press).
Related to mechanisms of change is the movement toward evidence-based procedures (e.g., core procedures that are important to use in the treatment of different problems and conditions, such as behavioural activation, cognitive restructuring, exposure, acceptance-based strategies, and so on). For example, transdiagnostic protocols (Dozois, Seeds, & Collins, 2009; Mansell, Harvey, Watkins, & Shafran, 2009; McHugh, Murray, & Barlow, 2009)—treatments that target pathological mechanisms that are common across disorders—may enhance the relevance of research to practice and circumvent many issues related to comorbidity (Shafran et al., 2009).
Training in evidence-based thinking. Fourth, we need to shift our graduate education so that we go beyond helping students learn the content of how to administer empirically supported treatments to also training psychologists in evidence-based practice (Babione, 2010; Bauer, 2007; Hershenberg, Drabick, & Vivian, 2012; Hunsley, 2007b; Lee, 2007; Leffler, Jackson, West, McCarty, & Atkins, in press). In other words, we need to train students to think critically, to respect and understand scientific knowledge and empirical methodologies, and to integrate this information to make scientifically informed clinical decisions within the context of a patient's needs and background. As Babione (2010) pointed out, students "need to be knowledgeable of when it is beneficial to adhere to a particular modality, when to modify it, or when to abandon it and place heavier focus on the other components of the evidence-based framework" (p. 447). We need to teach our students how to think in an evidence-based manner so that they can adapt to novelty and integrate new research into their practices.

Perhaps it is time for clinical programs to evaluate their curricula not only for the content of knowledge but also for the process of learning. We need to ensure that we are modelling evidence-based practice, providing the best training, and asking the right questions (see Lee, 2007; Leffler et al., in press).
Recommendations for Practitioners

Clinicians, too, can take steps to narrow the research–practice gap. Next, I outline some considerations for practitioners.
Measure treatment progress systematically. By routinely administering reliable and valid indices of patient functioning, practitioners may better determine whether a particular intervention is effective (see Fitzpatrick, 2012; Overington & Ionita, 2012; Sales & Alves, 2012) and make informed treatment decisions that are less clouded by confirmation biases and other heuristics (Dozois & Dobson, 2010; Kazdin, 2008). As illustrated in the hierarchy (see Figure 1), we need to determine how things are going through ongoing evaluation and then refer back to the hierarchy if necessary.
I use a variety of psychometric indices in my own independent practice. In addition to determining efficacy, there are other important advantages to monitoring change over time. For example, collecting data in therapy demonstrates to clients that the therapist is confident in his or her ability to help, is credible, and respects accountability. Data can also be used to examine the stability of the treatment response (e.g., to ensure that a patient's change does not simply reflect a flight into health). For instance, Jarrett, Vittengl, and Clark (2008) demonstrated that additional treatment may be indicated to prevent relapse when a patient's depression scores are in the mild range or higher during any of the last 6 weeks of therapy. Psychometric data also provide a clear indication of when treatment is successful and can be safely terminated. Finally, data gathered over time can be tabulated across different cases, allowing therapists to evaluate their own effectiveness across patients with different diagnoses and client characteristics (see Dozois & Dobson, 2010).
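The address does not prescribe a particular statistic for this kind of monitoring, but one widely used option is Jacobson and Truax's (1991) reliable change index (RCI), which asks whether an observed pre-to-post difference exceeds what measurement error alone would produce. Below is a minimal sketch in Python; the BDI-II score, standard deviation, and reliability values are hypothetical placeholders, not published norms.

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson & Truax (1991) reliable change index: is the pre-to-post
    difference larger than expected from measurement error alone?"""
    sem = sd_pre * math.sqrt(1 - reliability)  # standard error of measurement
    s_diff = math.sqrt(2 * sem ** 2)           # SE of the difference score
    return (post - pre) / s_diff

# Hypothetical BDI-II scores for one client; norms are illustrative only.
rci = reliable_change_index(pre=32, post=18, sd_pre=10.0, reliability=0.92)
print(f"RCI = {rci:.2f}")  # |RCI| > 1.96 suggests reliable change (p < .05)
```

A decision rule such as Jarrett et al.'s (additional treatment when scores remain in the mild range or higher during the last 6 weeks of therapy) can then be layered on top of the same session-by-session data.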
Capitalize on clinicians' knowledge and experiences. We also need to capitalize on clinicians' knowledge and experiences. As Kazdin (2008) contends, we often consider research to be the contribution to knowledge, and practice to be the application of that knowledge. However, this is an unfortunate way of viewing the contributions that scientists and practitioners make, and it only reifies the scientist–practitioner gap. Clinical work can and does contribute importantly to science. By systematically coding their experiences, clinicians can contribute to the existing body of knowledge and transfer important information to the next generation of psychologists. We need direct collaborations between those who identify themselves primarily as scientists and those whose primary identification is as clinicians. Our discipline needs the experience and expertise of practitioners (Kazdin, 2008).
One exciting development has been the establishment of practice research networks, which are designed to foster collaboration among researchers and clinicians by conducting naturalistic studies in psychotherapy. These networks provide the infrastructure for practice-based evidence to complement evidence-based practice (Audin et al., 2001; Castonguay et al., 2010; Castonguay, Locke, & Hayes, 2011; Norquist, 2001). Castonguay and colleagues (2010) note that typical evidence-based strategies (e.g., RCTs), although important, have reflected a "top-down" approach that may have contributed to "empirical imperialism" (p. 328)—scientists who treat few patients tell clinicians who rarely conduct research what variables should be studied to improve outcome. In contrast, practice research networks involve clinical practitioners in the community collaborating with researchers to decide on the research questions, design the methodology, and implement the studies, with the goal of increasing effectiveness research while also maintaining scientific rigor. The Pennsylvania Psychological Association Practice Research Network was the first psychotherapy network devoted specifically to this type of collaborative research (Castonguay et al., 2010). Tasca (2012a, 2012b) and his colleagues have recently received a Canadian Institutes of Health Research Planning and Meeting Grant to launch a psychotherapy practice research network in Canada—and there are others as well (e.g., a practice research network being developed at York University).
Conclusion
The gap between science and practice needs to be filled both by the scientist and by the practitioner. As Kazdin (2008) cogently argues,

the researcher is not likely to say, 'There is no solid evidence for any treatment, so I am going to withhold best guesses by experienced professionals.' Similarly, practicing clinicians, in need of help for their relatives, are likely to search the Web, read extensively, and make phone calls to medical centers and experts to identify what the evidence is for the various surgical, pharmacological, and other alternatives for their parents or children with significant medical problems. The clinician is not likely to say, 'My relative is different and unique and the evidence really has not been tested with people like her, so I am going to forgo that treatment.' (p. 151)
We need science so that opinion does not prevail (Nathan & Gorman, 1998). We must not forget that human judgment and memory are fallible. We need more science in practice. We need to train psychologists to think in an evidence-based manner and to make conscious, explicit, and judicious use of evidence in their day-to-day practices. We also need more practice in science—to rely on the strength and expertise of our clinicians to improve science. For the good of our profession and for the health and well-being of Canadians, we must work together to study, to practice, to foster, to develop, and to disseminate evidence-based practice and practice-based evidence.
Résumé

In June 2011, the Board of Directors of the Canadian Psychological Association (CPA) created a task force on evidence-based psychological treatments. Specifically, the purpose of the task force was to operationalize what constitutes evidence-based practice in psychological treatments, to formulate recommendations on how best to integrate research evidence into the professional practice of psychology, and to disseminate information about evidence-based treatments to consumers. The widening gap between scientist and practitioner was an important impetus for the creation of the task force. There are both barriers and opportunities when it comes to promoting, in professional practice, greater reliance on the scientific literature and greater use of empirically supported treatments. Two main factors stand out in this regard: first, the definition of "best evidence" raises considerable controversy; second, researchers often fail to communicate their findings in a way that ensures their transition from the laboratory to the clinic. It is therefore very important not only to base treatment on research evidence, but also to ground research in the treatments as they are actually delivered. In this article, the author examines current problems and opportunities related to evidence-based treatments and proposes strategies to reduce the gap between research and practice.

Keywords: evidence-based practice, evidence-based treatment, empirically supported treatment, bridging research and practice, psychotherapy.
References

American Psychological Association Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271–285. doi:10.1037/0003-066X.61.4.271

Audin, K., Mellor-Clark, J., Barkham, M., Margison, F., McGrath, G., Lewis, S., . . . Parry, G. (2001). Practice research networks for effective psychological therapies. Journal of Mental Health, 10, 241–251.

Australian Psychological Society. (2010). Evidence-based psychological interventions: A literature review (3rd ed.). Melbourne, Australia: Author.

Babione, J. M. (2010). Evidence-based practice in psychology: An ethical framework for graduate education, clinical training, and maintaining professional competence. Ethics & Behavior, 20, 443–453. doi:10.1080/10508422.2010.521446

Baker, T. B., McFall, R. M., & Shoham, V. (2008). Current status and future prospects of clinical psychology: Toward a scientifically principled approach to mental and behavioral health care. Psychological Science in the Public Interest, 9, 67–103.

Baker, T. B., McFall, R. M., & Shoham, V. (2009, November 15). Is your therapist a little behind the times? Washington Post. Retrieved from http://www.washingtonpost.com/wp-dyn/content/article/2009/11/13/AR2009111302221.html

Bauer, R. M. (2007). Evidence-based practice in psychology: Implications for research and research training. Journal of Clinical Psychology, 63, 685–694. doi:10.1002/jclp.20374

Beck, A. T., & Dozois, D. J. A. (2011). Cognitive therapy: Current status and future directions. Annual Review of Medicine, 62, 397–409. doi:10.1146/annurev-med-052209-100032

Beck, A. T., Steer, R. A., & Brown, G. K. (1996). Beck Depression Inventory manual (2nd ed.). San Antonio, TX: Psychological Corporation.

Beutler, L. E., Williams, R. E., Wakefield, P. J., & Entwistle, S. R. (1995). Bridging scientist and practitioner perspectives in clinical psychology. American Psychologist, 50, 984–994. doi:10.1037/0003-066X.50.12.984

Bohart, A. C. (2005). Evidence-based psychotherapy means evidence-informed, not evidence-driven. Journal of Contemporary Psychotherapy, 35, 39–53. doi:10.1007/s10879-005-0802-8

Bryceland, C., & Stam, H. (2005). Empirical validation and professional codes of ethics: Description or prescription? Journal of Constructivist Psychology, 18, 131–155. doi:10.1080/10720530590914770

Butler, A. C., Chapman, J. E., Forman, E. M., & Beck, A. T. (2006). The empirical status of cognitive-behavioral therapy: A review of meta-analyses. Clinical Psychology Review, 26, 17–31. doi:10.1016/j.cpr.2005.07.003

Canadian Psychological Association. (2012). Evidence-based practice of psychological treatments: A Canadian perspective (Report of the CPA Task Force on Evidence-Based Practice of Psychological Treatments). Ottawa, Ontario: Author.

Castonguay, L. G., Boswell, J. F., Zack, S. E., Baker, S., Boutselis, M. A., Chiswick, N. R., . . . Holtforth, M. G. (2010). Helpful and hindering events in psychotherapy: A practice research network study. Psychotherapy (Chicago, Ill.), 47, 327–344. doi:10.1037/a0021164

Castonguay, L. G., Locke, B. D., & Hayes, J. A. (2011). The Center for Collegiate Mental Health: An example of a practice-research network in university counseling centers. Journal of College Student Psychotherapy, 25, 105–119. doi:10.1080/87568225.2011.556929

Centre for Economic Performance's Mental Health Policy Group. (2012). How mental illness loses out in the NHS. London, UK: London School of Economics and Political Science.

Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716. doi:10.1146/annurev.psych.52.1.685

Chambless, D. L., Sanderson, W. C., Shoham, V., Bennett Johnson, S., Pope, K. S., Crits-Christoph, P., . . . McCurry, S. (1996). An update on empirically validated therapies. The Clinical Psychologist, 49, 5–18.

Chapman, L. J., & Chapman, J. P. (1969). Illusory correlation as an obstacle to the use of valid psychodiagnostic signs. Journal of Abnormal Psychology, 74, 271–280. doi:10.1037/h0027592

Chapman, L. J., & Chapman, J. P. (1975). The basis of illusory correlation. Journal of Abnormal Psychology, 84, 574–575. doi:10.1037/h0077112

Chwalisz, K. (2003). Evidence-based practice: A framework for twenty-first-century scientist-practitioner training. The Counseling Psychologist, 31, 497–528. doi:10.1177/0011000003256347

Clark, D. A. (in press). Cognitive restructuring: A major contribution of cognitive therapy. In D. J. A. Dozois (Ed.), CBT: General strategies (Vol. 1). In S. G. Hofmann (Series Ed.), Cognitive-behavioral therapy: A complete reference guide. Oxford, UK: Wiley-Blackwell.

Clark, D. M. (2012, June 18). It is inexcusable that mental health treatments are still underfunded. The Guardian. Retrieved from http://www.guardian.co.uk/commentisfree/2012/jun/18/inexcusable-mental-health-treatments-underfunded

Clark, D. M., Layard, R., Smithies, R., Richards, D. A., Suckling, R., & Wright, B. (2009). Improving access to psychological therapy: Initial evaluation of two UK demonstration sites. Behaviour Research and Therapy, 47, 910–920. doi:10.1016/j.brat.2009.07.010

Dawes, R. M., Faust, D., & Meehl, P. E. (1989). Clinical versus actuarial judgment. Science, 243, 1668–1674. doi:10.1126/science.2648573

Day, M. A., Eyer, J. C., & Thorn, B. E. (in press). Therapeutic relaxation. In D. J. A. Dozois (Ed.), CBT: General strategies (Vol. 1). In S. G. Hofmann (Series Ed.), Cognitive-behavioral therapy: A complete reference guide. Oxford, UK: Wiley-Blackwell.

DeRubeis, R. J., Gelfand, L. A., Tang, T. Z., & Simons, A. D. (1999). Medications versus cognitive behavior therapy for severely depressed outpatients: Mega-analysis of four randomized comparisons. The American Journal of Psychiatry, 156, 1007–1013.

DeRubeis, R. J., Hollon, S. D., Amsterdam, J. D., Shelton, R. C., Young, P. R., Salomon, R. M., . . . Gallop, R. (2005). Cognitive therapy vs medications in the treatment of moderate to severe depression. Archives of General Psychiatry, 62, 409–416. doi:10.1001/archpsyc.62.4.409

DeRubeis, R. J., Webb, C. A., Tang, T. Z., & Beck, A. T. (2010). Cognitive therapy. In K. S. Dobson (Ed.), Handbook of cognitive-behavioral therapies (3rd ed., pp. 277–316). New York, NY: Guilford.

Dozois, D. J. A. (2007). Stability of negative self-structures: A longitudinal comparison of depressed, remitted, and nonpsychiatric controls. Journal of Clinical Psychology, 63, 319–338. doi:10.1002/jclp.20349
The agreement is for the tutor to write a Microsoft word doc of a .docx
 
The abstract is a 150-250 word summary of your Research Paper, and i.docx
The abstract is a 150-250 word summary of your Research Paper, and i.docxThe abstract is a 150-250 word summary of your Research Paper, and i.docx
The abstract is a 150-250 word summary of your Research Paper, and i.docx
 

Recently uploaded

Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppCeline George
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 

Recently uploaded (20)

Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
URLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website AppURLs and Routing in the Odoo 17 Website App
URLs and Routing in the Odoo 17 Website App
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Staff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSDStaff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSD
 

Evidence-Based Practice Research and Training

The variables and mechanisms involved in the three pillars lead the clinical psychologist to an obvious conclusion: EBP not only provides a framework for conceptualizing clinical problems, but also suggests a research agenda whereby patterns of wellness and illness are investigated with an eye toward how best practices are potentially mediated by unique aspects of practitioner expertise. In addition, it highlights how key patient characteristics influence treatment acceptability and help define the role the patient plays in the health care relationship. This is not a new agenda, but is quite similar to the agenda set forth by Gordon Paul in 1969 in his now-famous ultimate clinical question, "What treatment, by whom, is most effective for this individual, with that specific problem, under which set of circumstances, and how does it come about?" (Paul, 1967, 1969, p. 44). In asking this question, Paul's goal was to draw attention to variables that needed to be described, measured, or controlled for firm evidence to accumulate across studies of psychotherapy. The agenda for evidence-based psychological practice is similar, though broader, encompassing assessment as well as treatment, psychological healthcare policy as well as clinical procedure, and populations as well as individuals. As such, expanding the scope of evidence-based psychological practice provides an opportunity for psychologists to build conceptual and methodological bridges with their colleagues in medicine, nursing, pharmacy, health professions, and public health.

Although its status as a health care delivery process is typically emphasized, EBP, initially referred to as evidence-based medicine, evolved at McMaster University as a pedagogical strategy for teaching students and practitioners how to incorporate research results into the process of patient care (McCabe, 2006; Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). As professional psychology begins to seriously consider the relevance of EBP for broad aspects of practice (Davidson & Spring, 2006), we will have to grapple with some obvious implications for (a) how we conduct practice-based research, and (b) how we educate and train our students, the next cadre of clinical researchers, to develop the knowledge, skills, and expertise to contribute to the evidence base. In this article, I discuss some of these implications with an eye toward viewing the EBP movement as an opportunity to begin to answer some of our most difficult research questions, and to begin to address some of our most vexing and persistent problems in education and training.

Practice-Based Research

From a research perspective, EBP provides a framework for investigating heretofore neglected aspects of "rubber-meets-the-road" practice.
That is, confronting gaps in the evidence base from an EBP perspective draws attention to key client variables (e.g., preferences for one treatment over another, ability/willingness to adhere to treatment, credibility of treatment rationales, demographic and socioeconomic variables that enhance or impede health care access or that contribute to attitudes about treatment acceptability) and dimensions of clinical expertise (e.g., the ability to deliver the appropriate EST for the patient's problem, the ability to adapt treatments to unique clients, the ability to deliver assessments appropriate to decision-making, the ability to communicate effectively with the patient) that deserve empirical study. Practitioners face these gaps because our dominant research paradigms tend to yield data about homogeneous majority groups receiving standard treatment in optimal settings.

Thus far, most of what constitutes evidence-based psychological practice is in the area of empirically supported treatment (Chambless, 1995; Chambless et al., 1998). Currently, there are several psychological therapies with well-established efficacy for treatment of a variety of psychological problems (American Psychological Association [APA] Division 12 Dissemination Subcommittee of the Committee on Science and Practice). Continued expansion of this list to include new therapies and clinical problems, and demonstration of the portability of well-controlled efficacy studies to real-world problems (effectiveness), is continuing apace (Chambless & Ollendick, 2001).

A parallel expansion of the evidence base for psychological assessment procedures is needed. More research is needed regarding the diagnostic utility of assessment tools in predicting at-risk status, in helping select which treatment is indicated, or in predicting treatment response. Even in areas where the evidence for the clinical utility of assessment procedures is strong (e.g., in surgical epilepsy, where the results of presurgical evaluation of verbal memory strongly predict which patients will develop postsurgical neuropsychological morbidity; Chelune, 1995), the best available evidence has not yet caused the majority of clinicians to modify their assessment approach accordingly.

A full instantiation of EBP in psychology will require an expansion of systematic research efforts that will provide us with more information about the clinical expertise and patient domains. This represents a real opportunity to broaden the scope of EBP in psychology. How do psychological practitioners with varying levels of expertise decide which of a number of alternative treatments to utilize in the clinic? What factors make clinically efficacious treatments acceptable to patients? How does cultural diversity interact with treatment acceptability? To apply best evidence to individual clinical problems seamlessly, we need to develop a research agenda that allows us to retrieve and analyze answers to these kinds of questions. This is a daunting task, and one that seems intractable from the point of view of an exclusive reliance on quantitative research methods and controlled experiments. Perhaps this is an area in which increased knowledge of qualitative research methods (see below) would be beneficial for the field. This is also an area to which practicing scientist–practitioners can provide critical information by adopting a data-driven approach to practice that incorporates measurement and reporting of assessment and treatment outcomes for purposes of further addressing effectiveness questions.

Implications for Education and Training

In a recent survey on training in ESTs, Woody, Weisz, and McLean (2005) reported that, although many doctoral training programs provided didactic dissemination of EST-related information, actual supervised training in ESTs had declined compared to a similar survey conducted in 1993. The overall conclusion was that the field had a long way to go in ensuring that our students have sufficient skill and experience to practice ESTs in their professional lives.
  • 9. priorities, including research; (c) within-program shortages of trained supervisors needed to provide a truly broad EST training experience; and (d) philosophic opposition to what some perceive as an overly rigid, manualized approach to treatment that reduces professional psychological practice to technician status. It seems obvious to me that most of these barriers imply a method of training in which competency in ESTs is built one treatment at a time, thus requiring large investments of time and faculty effort to the cause. Although it is true that students need practical training in a variety of clinical methods, one key issue is whether a goal of graduate education is to train students to competency in a critical number of ESTs, or whether the goal is to train them in broader principles of evidence-based practice that will enable them to easily adapt to novel demands for new competencies after attaining their PhD (educating for capability rather than competency; Fraser & Greenhalgh, 2001). There is evidence that clinical psychology training directors are ready for this devel- opment. In the Woody et al. (2005) survey, some clinical training directors indicated that current practice reflects an underemphasis on broad principles of evidence-based practice in favor of learning particular procedures on a treatment-by- treatment basis. Some of the issues related to the ability of programs to provide appropriate training would be addressed if we adopted a more general principles approach. Although not particularly on point in
  • 10. the context of this article, it is my view that developing competencies in ESTs for research and professional practice is the joint and cumulative responsibility of doctoral programs, internships, and postdoctoral programs that work together to provide a continuum of training in knowledge and skills in EBPP. Training in EBPP will require graduate training programs to include new content in research training curricula so that students are ready to understand and apply basic prin- ciples of EBPP in their everyday professional lives. Primary needs include training in (a) epidemiology, (b) clinical trials methodology, (c) qualitative research methods and mea- surement, (d) how to conduct and appraise systematic reviews and meta-analyses, and (e) in building skills in informatics and electronic database searching necessary to find best available evidence relevant to the problems that students will encounter in their research and clinical work. Such content could be introduced in a basic research methods course, could be taught separately in a course on EBPP, or could be infused in the curriculum through a combination of didactic, practicum, and research experiences (for additional ideas on infusion of EBPP into the curriculum, see Dillillo & McChargue, this issue). Achieving true infusion and integration will require that all program faculty is committed to the concept of EBPP, that all will have received some basic education in EBPP them- selves, and that EBPP concepts are represented throughout the curriculum. The faculty
The faculty development implications of advancing EBPP are not trivial. In the short run, an effective strategy may be to partner with colleagues in medicine, health professions, nursing, and public health to provide interdisciplinary instruction and mentoring in basic principles of EBP.

Epidemiology

Many problems important to psychologists (e.g., whether a clinical assessment tool is effective in identifying at-risk patients, whether a treatment protocol is effective in reducing psychological distress or disability in a defined population) can be conceptualized and described in epidemiological terms. For example, the strength of a treatment effect can be described with reference to the concept of "number needed to treat" (the number of patients who would need to be treated to produce one additional favorable outcome) or "number needed to harm" (the number of patients who would need to be treated for one additional unfavorable outcome to occur), or, more generally, in terms of relative or absolute risk reduction. Knowledge of the basic aspects of diagnostic test performance (e.g., sensitivity, specificity, positive and negative predictive value) so critical to psychological practice can also be enhanced by forging links between these concepts and corresponding concepts in epidemiology (e.g., positive and negative likelihood ratios). A broad grounding in epidemiological methods will promote further ways of understanding and inferring causality from observational and experimental data, will further an appreciation for preventive methods, and will provide much-needed appreciation for community- and population-based methods that will complement psychology's traditional emphasis on individuals and small groups.
Clinical Trials Methodology

Although many graduate statistics and methodology courses cover such topics as case-control designs, cohort designs, and elements of randomized clinical trials (RCTs), classical methodology education in the Campbell and Stanley (1963) tradition needs to be supplemented with contemporary information relevant to clinical trials methodology. For example, training in standards for designing, conducting, and reporting clinical trials consistent with the CONSORT statement (Begg et al., 1996; Moher, Schulz, & Altman, 2001) is important so that reports of psychological clinical trials have appropriate consistency and transparency. Training in methods for reporting the size of treatment effects (going beyond statistical significance), allocating samples, specifying outcomes (relative and absolute risk reduction, number needed to treat and number needed to harm), and addressing the ethical issues of clinical trials is critically needed if psychology is to develop a truly evidence-based practice. Building the ability to evaluate the results of extant trials critically is also crucial if psychological practitioners are to meaningfully apply the best-evidence standard to their own clinical work and research.
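As one hedged illustration of reporting effect size alongside significance, the sketch below computes Cohen's d for two independent groups, using a common large-sample approximation to its standard error; the summary statistics (change scores on a depression measure) are hypothetical.

```python
import math

# Sketch only: Cohen's d with an approximate large-sample standard error.
# The summary statistics below are invented for illustration.
def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference between two independent groups,
    with an approximate 95% confidence interval."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical trial: symptom reduction of 9.5 (SD 6.0, n = 60) in treatment
# vs. 5.0 (SD 6.5, n = 58) in control.
d, ci = cohens_d(9.5, 6.0, 60, 5.0, 6.5, 58)
print(f"d = {d:.2f}, approximate 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

Reporting the interval (here, d = 0.72, CI roughly 0.35 to 1.09) conveys both the magnitude and the precision of a treatment effect in a way that a bare p value cannot.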
Qualitative Research Methods and Measurement

Clinical psychologists trained in the scientist–practitioner tradition are almost exclusively focused on quantitative research methods, with an attendant emphasis on measurement precision, quantitative statistical analysis, and tightly controlled experimental design. This scientific tradition links us with our colleagues in the natural and social sciences, and represents our preferred "way of knowing" the world. In contrast, qualitative approaches to research seek to evaluate the quality, or essence, of human experience using a fundamentally different methodological and analytic framework (Mays & Pope, 1995, 2000; Pope, Ziebland, & Mays, 2000). Many psychologists are familiar with at least some qualitative research methods exemplified, for example, in ethnography, sociometry, participant observation, or content analysis of discourse. However, methods such as convergent interviewing, focus groups, and personal histories are generally foreign to most students in scientist–practitioner programs. As applied to health care, qualitative researchers may seek to evaluate the experiences of brain-injured patients in rehabilitative settings as a way of enhancing the design of the rehabilitation environment for purposes of maximizing recovery. They may investigate case dispositions in a child neurosurgery clinic by evaluating commonalities among physicians' notes and clinical decisions. They may evaluate treatment acceptability by interviewing patients about their experiences in treatment.
  • 15. scientist–practitioners to have access to the best evidence at the single-study level while attending to multiple simultaneous demands for their time. For this reason, systematic reviews of the literature are becoming increasingly important as sources for state of the art information. Most graduate courses in research methodology and statistics devote little attention to conducting reviews or meta-analyses, although many programs now appear to be offering grant-writing courses or seminars. In these courses, an emphasis on design and critique of individual studies is commonplace, whereas development of skills in evaluating systematic reviews or meta-analyses is rare. If psychology is to become a key player in evidence-based-practice, the next cadre of scientist–practitioners will have to develop skills in conducting and evaluating these kinds of reviews. In programming needed education and training, it is important to distinguish between narrative reviews (the kind of review that is seen, for example, in Psychological Bulletin) and systematic reviews. Narrative reviews are conducted by knowledgeable persons who often conduct the review for advancing a particular theoretical conclusion. They therefore yield poten- tially biased conclusions because there is no consensually agreed-upon method for com- bining and weighting results from different studies. In contrast, systematic reviews and meta-analyses proceed according to specified methodological conventions in which the search method, the procedure for including and excluding studies, and the method for
Conducting Systematic Reviews and Meta-Analyses

The explosion of relevant medical and psychological literature has made it difficult for scientist–practitioners to have access to the best evidence at the single-study level while attending to multiple simultaneous demands on their time. For this reason, systematic reviews of the literature are becoming increasingly important as sources of state-of-the-art information. Most graduate courses in research methodology and statistics devote little attention to conducting reviews or meta-analyses, although many programs now appear to be offering grant-writing courses or seminars. In these courses, an emphasis on design and critique of individual studies is commonplace, whereas development of skills in evaluating systematic reviews or meta-analyses is rare. If psychology is to become a key player in evidence-based practice, the next cadre of scientist–practitioners will have to develop skills in conducting and evaluating these kinds of reviews. In programming needed education and training, it is important to distinguish between narrative reviews (the kind of review seen, for example, in Psychological Bulletin) and systematic reviews. Narrative reviews are conducted by knowledgeable persons who often conduct the review to advance a particular theoretical conclusion. They therefore yield potentially biased conclusions because there is no consensually agreed-upon method for combining and weighting results from different studies. In contrast, systematic reviews and meta-analyses proceed according to specified methodological conventions in which the search method, the procedure for including and excluding studies, and the method for eventually calculating effect sizes or odds ratios are specified beforehand (e.g., fixed-effects vs. random-effects models), as are methods for determining statistical and clinical significance (Cook, Mulrow, & Haynes, 1997; Cook, Sackett, & Spitzer, 1995; Quintana & Minami, 2006). Meta-analysis is a specific form of quantitative systematic review that aggregates the results of similar studies for purposes of generating more stable conclusions from pooled data than is possible at the individual-study level (Egger, Smith, & Phillips, 1997; Rosenthal & DiMatteo, 2001; Wolf, 1986). Recent techniques allow for the calculation of bias in published studies, allowing the reader to appraise whether the results of the analysis reflect an undistorted view of effect size (Stern, Egger, & Smith, 2001). Clinical psychologists need to know these basic concepts so that they can evaluate the relevance and quality of available evidence.
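The core arithmetic behind the quantitative pooling step can be sketched in a few lines. The fixed-effect, inverse-variance model below is a minimal illustration; the effect sizes and variances are invented, and a real meta-analysis would also consider random-effects models, clinical significance, and the bias diagnostics mentioned above.

```python
import math

# Minimal fixed-effect (inverse-variance) pooling sketch.
# The effect sizes and variances below are invented for illustration.
def fixed_effect_pool(effects, variances):
    """Pool study effect sizes with inverse-variance weights; also return
    Cochran's Q as a rough check on between-study heterogeneity."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, effects))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), q

effects = [0.55, 0.40, 0.72, 0.30]    # e.g., standardized mean differences
variances = [0.04, 0.02, 0.09, 0.03]  # squared standard errors
pooled, ci, q = fixed_effect_pool(effects, variances)
print(f"pooled = {pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], Q = {q:.2f}")
```

A large Q relative to its degrees of freedom (number of studies minus one) would suggest that the studies do not share a single true effect, in which case a random-effects model, which adds a between-study variance component, is usually preferred.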
Informatics and Database Searching Skills

If a tree falls in the woods and there is no one there to hear it, does it make a sound? This classic conundrum about the nature of reality seems relevant to the key issue of information access in evidence-based practice. If useful information about best evidence exists but we do not or cannot access it, it cannot be brought to bear on clinical decision making (Slawson & Shaughnessy, 2005). For this reason, developing expertise in informatics and database searching is a critical step in making EBPP a reality. In my experience, most psychologists, and students of psychology, search a limited number of databases (PubMed, U.S. National Library of Medicine, 1971; PsycINFO, APA, 1967) with a single search term and (at most) a single Boolean operator. It is not uncommon for the supervisor of a fledgling student to hear that "there's nothing in the literature" about a topic the student is interested in researching. Most use very little of what is available, and many are completely unaware of many of the most important and useful resources available for EBPP. A detailed discussion of these resources is beyond my scope (see Hunt & McKibbon, 1997); nevertheless, it seems critical that some effort be devoted (either in faculty development seminars or in graduate education) to addressing database availability explicitly, including access strategies, search methodology, and approaches to information management (managing search results). A key first step may be to establish a close relationship with a librarian or library informatics specialist who can help translate educational and research needs into strategies for accessing needed information, and who can provide access to needed databases and other resources. It is not uncommon, particularly in larger institutions, for at least one member of the library staff to be particularly skilled in evidence-based medicine.
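To illustrate the difference between a bare one-term search and a structured Boolean strategy, here is a hypothetical sketch against NCBI's public E-utilities interface. The endpoint and PubMed field tags are standard conventions, but the queries themselves are only examples, and the script assumes network access and the third-party requests library.

```python
import requests  # third-party: pip install requests

# Hypothetical comparison of a naive query with a structured Boolean strategy,
# using NCBI's public E-utilities search endpoint (hit counts only).
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

naive = "depression"
structured = (
    '("cognitive behavioral therapy"[Title/Abstract] OR CBT[Title/Abstract]) '
    'AND depression[MeSH Terms] AND randomized controlled trial[Publication Type]'
)

for term in (naive, structured):
    reply = requests.get(
        BASE,
        params={"db": "pubmed", "term": term, "retmode": "json", "retmax": 0},
        timeout=30,
    )
    count = reply.json()["esearchresult"]["count"]
    print(f"{count:>9} hits  <-  {term[:60]}...")
```

The point of the exercise is pedagogical: combining field tags, controlled vocabulary, and Boolean operators turns an unmanageable flood of citations into a result set small enough to appraise, which is exactly the strategic searching skill described above.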
There are a number of databases of particular relevance to EBPP, including CINAHL (nursing and allied health; Cinahl Information Systems, 1984), EMBASE (1974), The Cochrane Library (including the Cochrane Database of Systematic Reviews [CDSR]; Cochrane Library, 1999a), the Database of Abstracts of Reviews of Effects (DARE; Cochrane Library, 1999b), the Cochrane Central Register of Controlled Trials (CENTRAL; Cochrane Library, 1999c), and the ACP Journal Club (American College of Physicians, 1994), available on the Ovid (Ovid Technologies, New York, NY) search engine (for a more in-depth discussion, see Walker & London, this issue). Obtaining access to these databases is only part of the story; the development of strategic searching skills designed to yield a manageable number of relevant search results is a key outcome goal of educational efforts, and it will be achieved only through actual practice in problem-based learning situations. Finally, development of a local or profession-wide resource that contains the answers to evidence-based queries (so-called critically appraised topics, or CATs) will enable students and their mentors to benefit from the evidence-based practice efforts of their colleagues. Other authors in this series have suggested ways of incorporating skill-building activities into practicum and other parts of the psychology curriculum (see Collins, Belar, & Leffingwell, this issue; DiLillo & McChargue, this issue).

The Way Forward

In this article, I have tried to highlight ways that the interdisciplinary trend toward evidence-based practice offers real opportunities to address some difficult research problems and to revitalize certain aspects of our graduate curricula. This brief analysis has likely raised more questions (e.g., How? When? By whom? In what way?) as far as the training implications are concerned, and it has not dealt at all with criticisms that have been thoughtfully levied against the EBP approach to research and research training. One key issue in advancing EBP within psychology will be to pay attention to the key stage of the process by which knowledge (best evidence) is transformed into action and application. This, in my view, is the stage of the process that is least understood from a psychological viewpoint. What are the principles by which best evidence can be modified to fit the individual case? What evidence is "good enough" to drive a clinical decision? What about those aspects of psychological health care (e.g., relationship, trust, identification, and modeling) that are implicitly important in the delivery of services, but that do not themselves have large-scale independent empirical support?
These (and others) are key questions we will need to grapple with as we implement an evidence base for clinical psychology and teach students how to access and use it.

With regard to pedagogy, I am convinced that the only way forward is to incorporate problem-based, real-time experiences throughout the curriculum in which students can learn to walk the EBPP walk. This is a significant undertaking with profound implications for faculty development. I am as skeptical of an Evidence-Based Practice Course as a way to develop the needed skills and capacities of our students as I am that a Cultural Diversity Course will somehow build multicultural competencies. We will need to figure out how to incorporate the content, the concepts, and the techniques of evidence-based psychological practice at all levels of research and clinical training if we are to be truly successful in assimilating the EBPP way of thinking. We cannot do it all; faculty are generally not up to speed with all that is needed, and, for the practicing clinician, health care events proceed at a rapid pace. But we can begin the process by equipping tomorrow's psychological practitioners with the tools necessary to implement EBPP in their everyday clinical practice. In addition, we can capitalize on the obvious opportunities to expand our multidisciplinary interdependence with other health professionals in nursing, medicine, pharmacy, and public health who are further down the EBP road than we are. Providing faculty with needed support, and developing methods for educating and training tomorrow's psychologists in EBPP, is critical to establishing an evidence base equal to the task of providing quality psychological health care for those who depend on us.

References

American College of Physicians. (1994). ACP Journal Club homepage. Retrieved February 15, 2007, from http://www.acpjc.org

American Psychological Association. (1967). PsycINFO homepage. Retrieved February 15, 2007, from http://www.apa.org/psycinfo/products/psycinfo.html

Begg, C., Cho, M., Eastwood, S., Horton, R., Moher, D., Olkin, I., et al. (1996). Improving the quality of reporting of randomized controlled trials: The CONSORT statement. Journal of the American Medical Association, 276, 637–639.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally College Publishing.

Chambless, D. L. (1995). Training and dissemination of empirically validated psychological treatments: Report and recommendations. The Clinical Psychologist, 48, 3–23.

Chambless, D. L., Baker, M. J., Baucom, D. H., et al. (1998). Update on empirically validated therapies, II. The Clinical Psychologist, 51, 3–16.
Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716.

Chelune, G. (1995). Hippocampal adequacy versus functional reserve: Predicting memory functions following temporal lobectomy. Archives of Clinical Neuropsychology, 10, 413–432.

Cinahl Information Systems. (1984). Homepage. Retrieved February 15, 2007, from http://www.cinahl.com

Cochrane Library. (1999a). Homepage. Retrieved February 15, 2007, from http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME

Cochrane Library. (1999b). DARE homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_cldare_articles_fs.html

Cochrane Library. (1999c). Cochrane Central Register of Controlled Trials homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_clcentral_articles_fs.html

Collins, F. L., Leffingwell, T. R., & Belar, C. D. (2007). Teaching evidence-based practice: Implications for psychology. Journal of Clinical Psychology, 63, 657–670.

Cook, D. J., Mulrow, C. D., & Haynes, R. B. (1997). Systematic reviews: Synthesis of best evidence for clinical decisions. Annals of Internal Medicine, 126, 376–380.

Cook, D. J., Sackett, D. L., & Spitzer, W. O. (1995). Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam Consultation on Meta-Analysis. Journal of Clinical Epidemiology, 48, 167–171.

Davidson, K. W., & Spring, B. (2006). Developing an evidence base in clinical psychology. Journal of Clinical Psychology, 62, 259–271.

DiLillo, D., & McChargue, D. (2007). Implementing evidence-based practice training in a scientist–practitioner program. Journal of Clinical Psychology, 63, 671–684.

Egger, M., Smith, G. D., & Phillips, A. N. (1997). Meta-analysis: Principles and procedures. British Medical Journal, 315, 1533–1537.

EMBASE. (1974). Homepage. Retrieved February 15, 2007, from http://www.embase.com

Fraser, S. W., & Greenhalgh, T. (2001). Coping with complexity: Educating for capability. British Medical Journal, 323, 799–803.

Greenhalgh, T., & Taylor, R. (1997). How to read a paper: Papers that go beyond numbers (qualitative research). British Medical Journal, 315, 740–743.

Grypdonck, M. H. (2006). Qualitative health research in the era of evidence-based practice. Qualitative Health Research, 16, 1371–1385.

Holloway, I. (1997). Basic concepts for qualitative research. Oxford: Blackwell Science.

Hunt, D. L., & McKibbon, K. A. (1997). Locating and appraising systematic reviews. Annals of Internal Medicine, 126, 532–538.

Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press.

Leininger, M. (1994). Evaluation criteria and critique of qualitative research studies. In J. M. Morse (Ed.), Critical issues in qualitative research methods. Thousand Oaks, CA: Sage.

Mayer, D. (2004). Essential evidence-based medicine. New York: Cambridge University Press.

Mays, N., & Pope, C. (1995). Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. British Medical Journal, 311, 42–45.

Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. British Medical Journal, 320, 50–52.

McCabe, O. L. (2006). Evidence-based practice in mental health: Accessing, appraising, and adopting research data. International Journal of Mental Health, 35, 50–69.

Moher, D., Schulz, K. F., & Altman, D. G. (2001). The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomized trials. Lancet, 357, 1191–1194.

Paul, G. L. (1967). Outcome research in psychotherapy. Journal of Consulting Psychology, 31, 109–118.

Paul, G. L. (1969). Behavior modification research: Design and tactics. In C. M. Franks (Ed.), Behavior therapy: Appraisal and status (pp. 29–62). New York: McGraw-Hill.

Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analyzing qualitative data. British Medical Journal, 320, 114–116.

Quintana, S. M., & Minami, T. (2006). Guidelines for meta-analyses of counseling psychology research. The Counseling Psychologist, 34, 839–877.

Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52, 59–82.

Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn't. British Medical Journal, 312, 71–72.

Slawson, D. C., & Shaughnessy, A. F. (2005). Teaching evidence-based medicine: Should we be teaching information management instead? Academic Medicine, 80, 685–689.

Stern, J. A. C., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. British Medical Journal, 323, 101–105.

Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence based medicine: How to practice and teach EBM. Edinburgh: Elsevier/Churchill Livingstone.

Thomas, J., Harden, A., Oakley, A., Oliver, S., Sutcliffe, K., Rees, R., et al. (2004). Integrating qualitative research with trials in systematic reviews. British Medical Journal, 328, 1010–1012.

U.S. National Library of Medicine. (1971). MEDLINE/PubMed homepage. Retrieved February 15, 2007, from http://www.ncbi.nlm.nih.gov/entrez/query.fcgi
Walker, B. W., & London, S. (2007). Novel tools and resources for evidence-based practice in psychology. Journal of Clinical Psychology, 63, 633–642.

Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. Beverly Hills, CA: Sage.

Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically supported treatments: 10 years later. The Clinical Psychologist, 58, 5–11.

From the Editors

Practice-Based Evidence: Back to the Future

Larry K. Brendtro, Martin L. Mitchell, & James Doncaster

Researchers are shifting from the medical model of studying treatments to a practice-based model focusing on the nature and needs of a person in a therapeutic relationship. As the articles in this special issue show, this has been a central tenet of Re-ED since it was founded by Nicholas Hobbs fifty years ago.

Confusion abounds about what qualifies as "evidence" of effective interventions. The
president of the American Psychological Association [APA] notes that "much of the research that guides evidence-based practice is too inaccessible, overwhelming, and removed from practice" (Goodheart, 2010, p. 9). Yet lists of evidence-based treatments are being used to control funding in treatment, human services, and education. Stated simply, such policies are based on shaky science. Certainly there is no shortage of evidence that some methods are destructive, like withholding treatment or placing traumatized kids in toxic environments. But a wide variety of therapeutic interventions can have a positive impact if conducted within a trusting alliance.

There are two very different views of what evidence is most important. Research in the traditional medical model compares a proposed treatment with alternates or a placebo. If a prescribed number of published studies give a statistical edge, the treatment is anointed as "evidence-based." This is followed by endorsements from the National Institutes of Health, the Department of Education, or other authoritative bodies. Providing lists of curative treatments may work for medicine, but this is not how to find what works in complex therapeutic relationships. Mental health research has shown that the process of enshrining specific treatment models as evidence-based is based on flawed science (Chan, Hróbjartsson, Haahr, Gotzsche, & Altman, 2004). Dennis Gorman (2008) of Texas A & M University documents similar problems with school-based substance abuse and violence prevention research, which he calls scientific nonsense. Julia Littell (2010) of the Campbell Collaboration documents dozens of ways that sloppy science is being used to elevate specific treatments to evidence-based status. Here are just a few of these research flaws:

Allegiance Effect: Studies produced by advocates of a particular method are positively biased.

File Cabinet Effect: Studies showing failure or no effects are tucked away and not submitted for publication.

Pollyanna Publishing Effect: Professional journals are much more likely to publish studies that show positive effects and reject those that do not.

Replication by Repetition Effect: Reviewers rely heavily on recycling findings cited by others, confusing rumor and repetition with replication.

Silence the Messenger Effect: Those who raise questions about the scientific base of studies are met with hostility and ad hominem attacks.

When researchers account for such biases, a clear pattern emerges. Widely touted evidence-based treatments turn out to be no better or no worse than other approaches. Solid science speaks: success does not lie in the specific method but in common factors, the most important being the helping relationship.

Our field is in ferment as the focus of research is shifting. Instead of the study of treatments, the child now takes center stage. The practice-based model focuses on the nature and needs of an individual in an ecology (Brendtro & Mitchell, 2010). Effective interventions use research and practice expertise to target client characteristics including problems, strengths, culture, and motivation (APA, 2006). Research and evaluation measure progress and provide feedback on the quality of the therapeutic alliance (Duncan, Miller, Wampold, & Hubble, 2010).

Re-ED is rooted in practice-based evidence. It taps a rich tradition of research, provides tools for direct work with youth, and tailors interventions to the individual child in an ecosystem (Cantrell & Cantrell, 2007; Freado, 2010). Fifty years after they were developed by Nicholas Hobbs and colleagues, the Re-ED principles offer a still-current map for meeting modern challenges. Re-ED does not impose a narrowly prescribed regimen of treatment, but uses human relationships to change the world one child at a time.

Larry K. Brendtro, PhD, is Dean of the Starr Institute for Training and co-editor of this journal with Martin L. Mitchell, EdD, President and CEO of Starr Commonwealth, Albion, Michigan. They can be contacted via email at [email protected] James Doncaster, MA, is the senior director of organizational development at Pressley Ridge in Pittsburgh, Pennsylvania, and is guest editor of this special issue on the fiftieth anniversary of the founding of Re-ED. He may be contacted at [email protected]

References
APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271–285.

Brendtro, L., & Mitchell, M. (2010). Weighing the evidence: From chaos to consilience. Reclaiming Children and Youth, 19(2), 3–9.

Cantrell, R., & Cantrell, M. (2007). Helping troubled children and youth. Memphis, TN: American Re-Education Association.

Chan, A., Hróbjartsson, A., Haahr, M., Gotzsche, P., & Altman, D. (2004). Empirical evidence for selective reporting of outcomes in randomized trials: Comparison of protocols to published articles. JAMA, 291, 2457–2465.

Duncan, B., Miller, S., Wampold, B., & Hubble, M. (Eds.). (2010). The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.

Freado, M. (2010). Measuring the impact of Re-ED. Reclaiming Children and Youth, 19(2), 28–31.

Goodheart, C. (2010). The education you need to know. Monitor on Psychology, 41(7), 9.

Gorman, D. (2008). Science, pseudoscience, and the need for practical knowledge. Addiction, 103, 1752–1753.

Littell, J. (2010). Evidence-based practice: Evidence or orthodoxy. In B. Duncan, S. Miller, B. Wampold, & M. Hubble (Eds.), The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.
PRINCIPLES OF RE-ED

Trust between a child and adult is essential, the foundation on which all other principles rest.
Life is to be lived now, not in the past, and lived in the future only as a present challenge.
Competence makes a difference, and children should be good at something, especially at school.
Time is an ally, working on the side of growth in a period of development.
Self-control can be taught, and children and adolescents can be helped to manage their behavior.
Intelligence can be taught to cope with challenges of family, school, and community.
Feelings should be nurtured, controlled when necessary, and explored with trusted others.
The group is very important to young people, and it can be a major source of instruction in growing up.
Ceremony and ritual give order, stability, and confidence to troubled children and adolescents.
The body is the armature of the self, around which the psychological self is constructed.
Communities are important so youth can participate and learn to serve.
A child should know some joy in each day.

Hobbs, N. (1982). The troubled and troubling child. San Francisco, CA: Jossey-Bass.
  • 34. A child should know some joy in each day. Hobbs, N. (1982). The troubled and troubling child. San Francisco, CA: Jossey-Bass. winter 2011 volume 19, number 4 | 7 Copyright of Reclaiming Children & Youth is the property of Reclaiming Children & Youth and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. Presidential Address – 2012 / Message annuel du Président – 2012 Psychological Treatments: Putting Evidence Into Practice and Practice Into Evidence DAVID J. A. DOZOIS University of Western Ontario Abstract In June 2011, the Canadian Psychological Association (CPA) Board of Directors launched a task force on the evidence- based practice of psychological treatments. The purpose of this task force was to operationalize what constitutes
  • 35. evidence-based practice in psychological treatment, to make recommendations about how psychologists can best integrate evidence into practice, and to disseminate information to con- sumers about evidence-based interventions. An important im- petus for this task force was the continuing and widening scientist–practitioner gap. There are both barriers and oppor- tunities when it comes to promoting greater reliance on the scientific literature and greater uptake of empirically sup- ported treatments among practitioners. Two main factors pre- vail. For one, there is considerable controversy over what constitutes best evidence. The second is that researchers often do not communicate their findings in a manner that effec- tively translates their results from the laboratory to the clinic. It is crucial that we not only make practice evidence-based but also make evidence practice-based. In this article, I focus on current issues and opportunities with respect to evidence- based practice and identify strategies for closing the gap be- tween research and practice. Keywords: evidence-based practice, evidence-based treatment, empiri- cally supported treatment, bridging research and practice, psychotherapy A number of years ago, as I was heading out of the house to attend my undergraduate classes, my father said to me, “What do you have today, David?” I told him, “I have personality and motivation.” “Good for you!” he said. I am fortunate to have had and continue to have a great relationship with my parents. We have a lot of fun together and my parents have always been an incredible encouragement to me. In preparing for my ad- dress, my dad—a retired minister—also provided me with some good advice: “If you don’t strike oil in the first 20 minutes, stop boring.”
As President of the Canadian Psychological Association (CPA), I have the special honour of providing an address to the membership. I intend to use this platform to share with Canadian psychologists some ideas related to evidence-based practice. Part of my presidential mandate was for CPA to develop its own position on the evidence-based practice of psychological treatments to support and guide practice as well as to inform stakeholders. Psychological health and disorders are clearly a priority for many of Canada's stakeholder groups (e.g., Mental Health Commission of Canada, Treasury Board, Public Health Agency of Canada), and their effective treatment needs to become a priority for CPA as well. When I first brought this idea to the CPA Board of Directors in March 2011, Dr. Lorne Sexton, who was on the board in the portfolio of Professional Affairs, and who had just chaired a task force on prescriptive authority for psychologists, said, "And I thought prescription privileges was controversial." To be sure, this is a sensitive topic, and I hope that I will deal with it appropriately and at least do it some justice.

In his classic monograph, "Why I Don't Attend Case Conferences," Paul Meehl (1973) began by stating, "The first portion of the paper will be highly critical and aggressively polemic (If you want to shake people up, you have to raise a little hell). The second part, while not claiming grandiosely to offer a definitive solution to the problem, proposes some directions of thinking and 'experimenting' that might lead to a significant improvement over current conditions" (p. 227). Although I have no intention of raising a little hell, I would similarly like to highlight the problem and then move toward some potential—not grandiose or definitive—but potential solutions.

After briefly highlighting some of the outcome data that support the idea that psychological treatments are effective for a variety of mental health problems, I would like to address the difficult fact that the empirical research is often not utilized by
practitioners. There are various reasons why clinicians may not read the literature or apply it to their practices, and I will focus on some of these concerns. Following this brief review, I will provide a quick update on the work of the CPA Task Force on Evidence-Based Practice of Psychological Treatments because I think it helps to address the issue of "What is evidence-based practice?" and "How should evidence be used?"—both of which have been cited as barriers to promoting greater reliance on the scientific literature among practitioners. I will conclude with some recommendations—both for the practitioner and scientist—for bridging the gap between science and practice.

Correspondence concerning this article should be addressed to David J. A. Dozois, Department of Psychology, Westminster Hall, Room 313E, University of Western Ontario, London, Ontario N6A 3K7 Canada. E-mail: [email protected]

Canadian Psychology / Psychologie canadienne, 2013, Vol. 54, No. 1, 1–11. © 2013 Canadian Psychological Association. DOI: 10.1037/a0031125

Efficacy of Psychological Treatments

Psychological treatments are efficacious for a number of different disorders (e.g., Australian Psychological Society, 2010; Beck
& Dozois, 2011; Butler, Chapman, Forman, & Beck, 2006; Chambless & Ollendick, 2001; Epp & Dobson, 2010; Hofmann, Asnaani, Vonk, Sawyer, & Fang, 2012; Nathan & Gorman, 1998; Ruscio & Holohan, 2006). Although space restrictions preclude a fulsome review of this literature, I will give a couple of examples. The Australian Psychological Society (2010) published a comprehensive review of the best evidence available on the efficacy of psychological interventions for a broad range of mental disorders. The research was evaluated according to its evidentiary level, quality, relevance, and strength. Included in this document were systematic reviews and meta-analyses, randomized controlled trials, nonrandomized controlled trials, comparative studies, and case series. I will just focus on the findings for the treatment of adults for illustration purposes (see Table 1). For depression, the highest level of empirical support was for cognitive–behaviour therapy (CBT), interpersonal psychotherapy (IPT), brief psychodynamic psychotherapy, and CBT-oriented self-help interventions. The highest level of support for bipolar disorder was obtained for CBT, IPT, family therapy, mindfulness-based cognitive therapy, and psychoeducation as treatments adjunctive to pharmacotherapy. Across the anxiety disorders (including generalised anxiety disorder, panic disorder, specific phobia, social anxiety, obsessive–compulsive disorder, and posttraumatic stress disorder [PTSD]), the highest level of evidence obtained was for CBT. Both CBT and Motivational Interviewing were deemed effective for substance-use disorders. Whereas CBT was the most consistently supported treatment for bulimia nervosa
and binge eating disorder, family therapy and psychodynamic therapy obtained the most support for anorexia nervosa. CBT also had the most support for sleep disorders, sexual disorders, pain, chronic fatigue, somatization, hypochondriasis, and body dysmorphic disorder. CBT and family therapy were considered the most effective interventions for psychotic disorders. Finally, dialectical behaviour therapy received the most empirical support for borderline personality disorder (Australian Psychological Society, 2010). I should note that there was some support noted for other types of interventions as well, although they did not have the highest degree of research support.

This is positive news. Many psychological treatments are not only effective for treating mental health problems, but also demonstrate longevity. In the case of depression, for example, CBT is equally effective as medication for the treatment of an acute episode (DeRubeis, Gelfand, Tang, & Simons, 1999; DeRubeis et al., 2005; DeRubeis, Webb, Tang, & Beck, 2010) but significantly reduces the risk of relapse relative to pharmacotherapy (Hollon et al., 2005). In fact, the average risk of relapse following antidepressant medication is more than double the rate following CBT (i.e., 60% compared with 25%, based on follow-up periods of 1 to 2 years; see Gloaguen, Cottraux, Cucherat, & Blackburn, 1998). In addition to the efficacy of psychological interventions, a strong economic case can also be made for their cost recovery.
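To put those relapse figures in the metric language commonly used in evidence-based medicine, the quoted percentages can be restated as a relative risk, an absolute risk reduction, and a number needed to treat. The arithmetic below is simply a worked restatement of the numbers cited above, not an additional analysis from Gloaguen et al. (1998):

\[
\mathrm{RR} = \frac{P(\text{relapse}\mid\text{medication})}{P(\text{relapse}\mid\text{CBT})} = \frac{0.60}{0.25} = 2.4, \qquad
\mathrm{ARR} = 0.60 - 0.25 = 0.35, \qquad
\mathrm{NNT} = \frac{1}{0.35} \approx 2.9.
\]

On these figures, roughly one relapse is averted for every three patients who receive CBT rather than medication alone.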
Table 1
Psychological Treatments With the Highest Level of Support (Adults)

Mood disorders
  Depression: Cognitive–behavior therapy; Interpersonal psychotherapy; Psychodynamic psychotherapy; Self-help (cognitive–behavior therapy)
  Bipolar disorder (as adjunct to medication): Cognitive–behavior therapy; Interpersonal psychotherapy; Family therapy; Mindfulness-based cognitive therapy; Psychoeducation

Anxiety disorders
  Generalized anxiety disorder: Cognitive–behavior therapy
  Panic disorder: Cognitive–behavior therapy
  Specific phobia: Cognitive–behavior therapy
  Social anxiety: Cognitive–behavior therapy
  Obsessive–compulsive disorder: Cognitive–behavior therapy
  Posttraumatic stress disorder: Cognitive–behavior therapy

Substance-use disorders: Cognitive–behavior therapy; Motivational interviewing

Sleep disorders: Cognitive–behavior therapy

Eating disorders
  Anorexia nervosa: Family therapy; Psychodynamic psychotherapy
  Bulimia nervosa: Cognitive–behavior therapy
  Binge-eating disorder: Cognitive–behavior therapy

Somatoform disorders
  Pain: Cognitive–behavior therapy
  Chronic fatigue: Cognitive–behavior therapy
  Somatization: Cognitive–behavior therapy
  Hypochondriasis: Cognitive–behavior therapy
  Body dysmorphic disorder: Cognitive–behavior therapy

Borderline personality disorder: Dialectical behavior therapy

Psychotic disorders: Cognitive–behavior therapy; Family therapy

Dissociative disorders: Cognitive–behavior therapy (few studies have investigated the effectiveness of treatments for dissociative disorders)

Note. Source: Australian Psychological Society (2010).

David M. Clark (CPA's 2011 to 2012 Honorary President) and his colleagues (D. M. Clark et al., 2009), for example, argued that psychological treatments would largely pay for themselves by reducing the costs associated with disability and increasing revenue related to return to work and increased productivity (also see Centre for Economic Performance's Mental Health Policy Group, 2012; D. M. Clark, 2012; Layard, Clark, Knapp, & Mayraz, 2007; Myhr & Payne, 2006). The cost-effectiveness of these interventions, and the importance of evidence-based practice, was also recently highlighted in a report of the Mental Health Commission of Canada (2012).

The Scientist–Practitioner Gap

Notwithstanding compelling data on their efficacy and effectiveness, few practitioners utilize the treatments that have garnered the strongest scientific support. Do not get me wrong—many psychologists do keep up with the literature and practice in an evidence-based manner (Beutler, Williams, Wakefield, &
Entwistle, 1995; Sternberg, 2006). Yet there is considerable evidence of a scientist–practitioner gap (Babione, 2010; Lilienfeld, 2010; Meehl, 1987; Ruscio & Holohan, 2006; Stewart & Chambless, 2007). For instance, few clients with depression and panic disorder receive scientifically supported treatments (Lilienfeld, 2010). Although the majority of psychologists (88%) surveyed reported using CBT techniques to treat anxiety, most did not use exposure or response prevention in the treatment of obsessive–compulsive disorder, and 76% indicated that they rarely or never used interoceptive exposure in the treatment of panic disorder (Freiheit, Vye, Swan, & Cady, 2004).

Roz Shafran and her colleagues (Shafran et al., 2009) reported that, in 1996, psychodynamic psychotherapy was the most common psychological treatment offered for generalised anxiety disorder, panic disorder, and social phobia. Supportive counselling was the most common treatment for PTSD in the United Kingdom, despite treatment guidelines (National Institute for Health and Clinical Excellence, 2005) that recommend trauma-focused psychological interventions as the treatments of choice. Sadly, many practitioners remain uninformed of relevant research, believe that it is not relevant for their practices, and neglect to evaluate outcome in their own clinical work (Lehman, 2010; Parrish & Rubin, 2011; Stewart & Chambless, 2007).
This issue came to light a few years ago in an article written by Baker, McFall, and Shoham (2008) and published in the journal Psychological Science in the Public Interest. The Washington Post picked up this story, titled "Is Your Therapist a Little Behind the Times?" Baker et al. (2009) wrote,

A young woman enters a physician's office seeking help for diabetes. She assumes that the physician has been trained to understand, value and use the latest science related to her disorder. Down the hall, a young man enters a clinical psychologist's office seeking help for depression. He similarly assumes that the psychologist has been trained to understand, value and use current research on his disorder. The first patient would be justified in her beliefs; the second, often, would not. This is the overarching conclusion of a 2-year analysis that [was] published on the views and practices of hundreds of clinical psychologists.

Barriers to Promoting Greater Reliance on the Scientific Literature

Well, what are some of the barriers to promoting greater reliance on the scientific literature? Pagoto et al. (2007) posed questions to members of various professional Listservs in clinical psychology, health psychology, and behavioural medicine to identify an
initial (rather than representative) list of barriers and facilitators regarding evidence-based practice. Respondents were asked to submit their top one to two barriers and facilitators. The top barrier pertained to attitudes toward evidence-based practice. For example, there is the perception that "EBP forces psychology to become a hard science, thereby dampening the discipline's humanity" (Pagoto et al., 2007, p. 700). Concern was also expressed that clinical evidence is more valuable than scientific evidence. This finding concurs with Stewart and Chambless (2007), who sampled 519 psychologists in independent practice. Practitioners mildly agreed that psychotherapy outcome research has much meaning for their practices; they moderately to strongly agreed that past clinical experience affects their treatment decisions, whereas there was only mild agreement that treatment outcome research influences usual practice (also see Shafran et al., 2009).

This issue is extraordinarily complex. I do not pretend to have the answers, nor could I adequately describe in this article all of the arguments surrounding this debate (for review, see Hunsley, 2007a; Norcross, Beutler, & Levant, 2005; Westen, Novotny, & Thompson-Brenner, 2004). In a nutshell, we have diversity of perspectives on the "truth" and what is important in therapy. At one end of the spectrum are researchers who work tirelessly to develop and disseminate the results from randomized controlled trials. These individuals may caricature some psychotherapists as flying by the seat of their pants rather than grounding their work in evidence. At the other end, we have front-line clinicians who
work tirelessly to help their patients with complex comorbid problems. These practitioners may caricature researchers as ivory-tower academics who do not understand the clinical realities of day-to-day practice and study unrepresentative patients in highly controlled environments (Fertuck, 2007).

A number of arguments are cited in the literature as to why clinicians may not use or value the scientific literature (see Hunsley, 2007a; Kazdin, 2008; Shafran et al., 2009; Westen et al., 2004). For example, arguments have been advanced that research trials have limited applicability to actual clinical practice. Patients treated in psychotherapy outcome trials, for example, are believed to be less severe and less complex (e.g., with fewer comorbid conditions) than are individuals seen in actual practice. In contrast to this idea, however, patients in regular clinical practices are often excluded from clinical trials because they do not meet their severity or duration criteria (e.g., Stirman, DeRubeis, Crits-Christoph, & Brody, 2003). In addition, many therapy trials permit most types of comorbidity (e.g., DeRubeis et al., 2005; Hollon et al., 2005; Stirman, DeRubeis, Crits-Christoph, & Rothman, 2005).

Another related criticism pertains to the idea that research findings may not generalise to clinical practice (Margison et al.,
2000; Ruscio & Holohan, 2006). In other words, there may be a difference between efficacy (i.e., that the intervention works under highly controlled conditions) and effectiveness (i.e., that the intervention also works under normal circumstances). In a review of the treatment effectiveness literature, however, Hunsley and Lee (2007) concluded that the majority of the effectiveness studies show completion rates and outcomes comparable with the results typically obtained in randomized controlled trials (also see Teachman, Drabick, Hershenberg, Vivian, & Wolfe, 2012).

Others have reacted to the randomized controlled trial (RCT) as the "gold standard" of research. RCTs may be optimal for research in medicine, some claim, but are not necessarily the most appropriate way to investigate psychotherapy outcome (Bohart, 2005; Westen & Morrison, 2001). In the realm of psychotherapy, this reactivity to RCTs has been further reinforced by the development of lists of empirically supported treatments. Commissioned by Division 12 (Clinical Psychology) of the American Psychological Association (APA), the Task Force on Promotion and Dissemination of Psychological Procedures published its 1995 report, which listed treatments considered to be either well-established or prob-
ably efficacious according to a standard set of criteria (e.g., Chambless et al., 1996). These criteria were also adopted by the Clinical Section of CPA in their task force report, Empirically Supported Treatments in Psychology: Implications for Canadian Professional Psychology (Hunsley, Dobson, Johnston, & Mikail, 1999a, 1999b).

The APA's criteria for empirically supported treatments elicited both enthusiasm and controversy. Although there was excitement about the recognition of "effective" psychological treatments, there were also myriad concerns. For example, some psychologists expressed resistance to this top-down approach and perceived the criteria to be overly rigid and restrictive, arguing that the type of research deemed necessary to produce supportive evidence for a treatment is incompatible with schools of psychotherapy outside of the cognitive and behavioural framework (see Bryceland & Stam, 2005; Stuart & Lilienfeld, 2007). Although I believe the movement toward empirically supported treatments is well intentioned, I agree that there are issues with defining evidence in this limited manner.

The reality, though, is that we need rigorous controlled research to evaluate the impact of our interventions. Tight experimental control, operational definitions, random assignment, precise measurement, and statistical significance—all of which make us concerned about external and ecological validity—are at the crux of the experimental design (Kazdin, 2008; Lilienfeld, 2010). Obviously,
RCTs do not answer all of our questions and the findings need to be applied to the real world, but we do need controlled research.

You see, science sets up safeguards against biases. I may see a depressed individual improve in therapy and conclude that my intervention worked. In addition to my own clinical observations, there may also be self-report data available (e.g., the Beck Depression Inventory-II; Beck, Steer, & Brown, 1996) that indicate significant improvement. Yet my conclusion may be erroneous because rival explanations could account for this change (e.g., regression to the mean due to repeated measurement, spontaneous remission; see Lilienfeld, 2010).
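Regression to the mean is easy to underestimate in before/after clinical observation, so a small simulation may help make the point concrete. The sketch below is mine, not the article's: the score distribution, noise level, and intake cutoff are arbitrary assumptions chosen only to make the artifact visible.

import numpy as np

# Illustrative only: patients selected for extreme intake scores "improve"
# at retest with NO intervention at all, purely via regression to the mean.
rng = np.random.default_rng(2013)

n = 100_000
true_severity = rng.normal(20, 5, n)           # stable underlying symptom level
intake = true_severity + rng.normal(0, 5, n)   # observed intake = truth + noise
retest = true_severity + rng.normal(0, 5, n)   # later score, independent noise

treated = intake >= 28                         # only high scorers enter "therapy"
print(f"intake mean (selected group): {intake[treated].mean():.1f}")  # about 31.5
print(f"retest mean (selected group): {retest[treated].mean():.1f}")  # about 25.8

# The selected group scores several points "better" at retest even though
# nothing was done -- exactly the rival explanation that a clinician's
# before/after observation cannot rule out, but a control group can.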
It is tempting for us, as clinicians (and I note here that I do have a small independent practice as well), to conclude that the research does not apply to my individual case—that somehow applying a particular evidence-based treatment is akin to the Procrustean dilemma. Procrustes was a mythological character who boasted that every guest invited to his house would fit the guest room bed, irrespective of his or her size. Such a claim attracted considerable attention. What Procrustes failed to mention, however, was how he could make this happen—either by cutting off his guests' legs or stretching them to make them fit the bed (see Kuyken, Padesky, & Dudley, 2009). As therapists, we obviously do not want to cut off or distort a client's experience to fit our preexisting theories and present a "one size fits all" type of intervention (Kuyken et al., 2009). However, it would also be erroneous to conclude that, because a patient does not map perfectly onto the RCT, I should not pay attention to this research. As Meehl (1973) pointed out, doing so involves "failing to understand probability logic as applied to a single case" (p. 234). Incidentally, when I was in graduate school at the University of Calgary, the writings of Paul Meehl were pivotal to our training. I hope that this is still the case and encourage students, researchers, and clinicians to make Meehl's work a staple in their academic diet.

We might be tempted to state that we are not dealing with groups or the nomothetic; we are dealing with an individual, with the idiographic. However, decades of research have demonstrated that if we depart from actuarial decision making, we will get it wrong more times than we will get it right (Dawes, Faust, & Meehl, 1989; Grove & Lloyd, 2006; Meehl, 1954). As humans, we are prone to a range of biases that include confirmation bias, illusory correlations, neglect of base rates, and availability heuristics, to name a few (Chapman & Chapman, 1969, 1975; Chwalisz, 2003; Paley, 2006; Turk & Salovey, 1985; Tversky & Kahneman, 1973). As Lilienfeld (2010) pointed out, scientific thinking is not natural for many of us; it is, in many ways, "uncommon sense, because it requires us to set aside our gut hunches and intuitions in lieu of convincing data. . . . Science requires us to override more automatic, effortless, and intuitive modes of thinking with more
  • 51. con- trolled, effortful, and reflective modes of thinking” (p. 282). Sci- ence helps to reduce human error. As Meehl (1987) stated, we need a “general scientific commitment not to be fooled and not to fool anybody else” (p. 9). The desire to not to be fooled and not to fool anybody else needs to be fundamental to our fabric as psy- chologists, which is why evidence-based practice is so crucial. Evidence-Based Practice There is growing recognition in the field that the practice of professional psychology should be based on valid evidence regard- ing which approaches to invention are most likely to be successful. In 2006, the APA established a task force on evidence-based practice in psychology that attempted to acknowledge multiple types of research evidence (American Psychological Association Presidential Task Force on Evidence-Based Practice, 2006, p. 273): “Evidence-based practice in psychology is the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences” (also see Spring, 2007). Unfortunately, the APA task force identified evidence on a continuum “from uncorroborated clinical observations through meta-analyses of the results of RCTs” (Stuart & Lilienfeld, 2007, p. 615). The task force also said little about the need for ongoing idiographic evaluation of one’s clinical cases. In addition, at the
heart of the three circles is "clinical decision making"—yet, as I discussed earlier, clinical decision making is heavily prone to error.

CPA Task Force on Evidence-Based Psychological Treatments

As one of my presidential initiatives, the CPA Board of Directors launched the Task Force on the Evidence-Based Practice of Psychological Treatments in June 2011. The purpose of this task force was to operationalize what constitutes evidence-based practice in psychological treatments, to make recommendations about how psychologists can best integrate evidence into practice, and to disseminate information to consumers about evidence-based interventions. An important impetus for this task force was the continuing and widening scientist–practitioner gap.

The task force (which I co-chaired with Dr. Sam Mikail) was populated last summer and began its work in September 2011. Task force members were chosen to represent a variety of research, practice, knowledge-translation, consumer, and community perspectives. There is also good representation from different theoretical orientations, including interpersonal, emotion-focused, cognitive–behavioural, and psychodynamic perspectives.
We produced a document that operationalizes what constitutes evidence-based practice of psychological treatment. The members of the task force were interested in a definition of evidence-based practice that was complex enough to incorporate the following ideas: (a) peer-reviewed research evidence is central; (b) one should be evidence-based not only in his or her general fund of knowledge but also in session-by-session work; and (c) the process is one of collaboration with a client/patient (rather than a top-down process). The Task Force on Evidence-Based Practice of Psychological Treatments will soon be releasing its final document, which will be posted on the website of the Canadian Psychological Association (see http://www.cpa.ca/aboutcpa/committees/cpataskforces/).

The next step involved establishing a hierarchy of evidence that was respectful of diverse research methodologies, palatable to different groups of individuals, and yet comprehensive and compelling (see Figure 1). For example, we stated that although all research methodologies have some potential to provide relevant evidence, psychologists should first consider findings that are replicated across studies and that have utilized methodologies that address threats to the validity of obtained results (e.g., internal validity, external validity, generalizability, transferability). Thus, psychologists should consider the best available evidence, highest on
the hierarchy of research evidence. Evidence lower on the hierarchy should be considered only to the extent that better research evidence does not exist, or if there are clear factors that mitigate against using the best evidence (Canadian Psychological Association, 2012, p. 8). As shown in Figure 1, the psychologist is to use the hierarchy of evidence to make initial treatment decisions, and then monitor change over time, feeding back to the hierarchy again when necessary.

Figure 1. The hierarchy of research evidence related to clinical practice.
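Read as an algorithm, the Figure 1 process is a simple loop: pick the treatment supported by the best available tier of evidence, deliver it while measuring outcomes, and return to the hierarchy when progress stalls. The sketch below is my own schematic rendering under stated assumptions—the tier labels are paraphrased, and the evidence table, function names, and progress threshold are invented for illustration; the CPA document specifies this process in prose, not code.

# Schematic sketch only: tiers paraphrased, data and thresholds invented.
HIERARCHY = [
    "replicated meta-analyses and systematic reviews",
    "randomized controlled trials",
    "quasi-experimental and observational studies",
    "case studies and clinical observation",
]

# Hypothetical lookup of best-supported treatments by evidence tier.
EVIDENCE = {
    "panic disorder": {
        "replicated meta-analyses and systematic reviews":
            "CBT including interoceptive exposure",
    },
}

def choose_treatment(problem, mitigated_tiers):
    """Take the highest tier available; drop to a lower tier only when
    better evidence is absent or clear factors mitigate against it."""
    for tier in HIERARCHY:
        treatment = EVIDENCE.get(problem, {}).get(tier)
        if treatment and tier not in mitigated_tiers:
            return treatment
    return "collaborative formulation with ongoing idiographic evaluation"

def progressing(weekly_scores, minimal_change=5.0):
    """Crude progress check on routinely collected outcome scores."""
    return (len(weekly_scores) >= 2
            and weekly_scores[0] - weekly_scores[-1] >= minimal_change)

plan = choose_treatment("panic disorder", mitigated_tiers=set())
print("initial plan:", plan)
if not progressing([31, 30, 29]):   # little change: feed back to the hierarchy
    plan = choose_treatment("panic disorder", mitigated_tiers={HIERARCHY[0]})
    print("revised plan:", plan)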
In March and April of 2012, the task force sought feedback on these core elements. Our next steps involved developing vignette examples to illustrate the process of being evidence-based in one's practice and making specific recommendations to the CPA Board for further development and dissemination. We have also developed an annotated resource list that will direct practitioners to where they can find the necessary information on evidence-based practice. A guide was also developed to highlight, for the general public, the added value that psychologists bring relative to other practitioners (e.g., research base, evidence-based focus).

It is important to point out that evidence-based practice is a process by which the best evidence available is used to make optimal clinical decisions. Some psychologists equate evidence-based practice with empirically supported therapies, but the two are not synonymous. There are, in fact, many ways to provide evidence-based treatment without employing techniques that are explicitly empirically supported (e.g., by focusing on effectiveness trials and naturalistic studies or by emphasising evidence-based procedures and principles of practice). Clinical practice should be evidence-informed, but it does not need to be evidence-driven (Bohart, 2005).

Closing the Gap Between Science and Practice

Although there is controversy regarding what constitutes "evidence," the vast majority of psychologists do support the idea that they should practice in a manner that is evidence-based. So what can scientists and practitioners do to close the gap? I think that the work of the CPA task force has been important in terms of providing a palatable definition of evidence-based practice that is neither too restrictive nor too diluted. We have also derived a hierarchy of evidence that is open to diverse methodologies but that focuses on the need to balance internal and external validity. Yet we need to do more to close this gap. What follows are
some suggestions for the scientist and for the practitioner about how we can work together to improve evidence-based practice and practice-based evidence.

Recommendations for Scientists

Better translation of science. First, we need better strategies for communicating and translating research into practice. Beutler, Williams, Wakefield, and Entwistle (1995) conducted a survey of practitioners and clinical academic psychologists. Of the practitioners, 47% reported reading research articles at least monthly, 21% less than monthly, and 32% never. Beutler et al. argued, however, that practitioners do generally value research but need strategies to help them translate scientific findings into clinical practice. Generally speaking, we do not do a particularly good job of this. We do not translate our findings well from science to practice.

I remember an old cell phone commercial that highlighted the idea that you have fewer dropped calls and less interference if you use this particular service. The ad started with a man at the airport calling his partner: "Honey . . . I'm . . . leaving . . . you." Of course, with the right cell phone service, the message would have been accurately received: "Honey, I'm not leaving without you."
We need to make sure that our results—our messages—are received clearly and accurately. Academic research articles may not even be the best venue for communicating research findings to clinicians (Beutler et al., 1995). In addition, the sheer number of research articles makes "keeping up" virtually impossible. As Spring (2011) noted, there are over 8,000 research articles published every day, which is why clinical practice guidelines and systematic reviews are so important.

Perhaps we will get better at translating science over time. In the spring of 2012, when the CPA Board met with local psychologists in university, hospital, and private-practice settings in Halifax, I had the privilege of speaking with Dr. Michelle Eskritt, who is an Associate Professor at Mount Saint Vincent University. Michelle informed me about an innovative new 4-year bachelor of science program in science communication. The program intends to train individuals who can be good communicators of science. There is a related program at Laurentian University. We must create infrastructure for more efficient and effective translation of clinical research from the laboratory to the practice arena (King, 2006).

Researchers need to make evidence practice-based. To quote Lawrence Green (2007), professor of epidemiology and biostatistics at University of California, San Francisco, "if we want more
evidence-based practice, we need more practice-based evidence." We need to do more to make research useful to the clinician.

More effectiveness trials and better communication with practitioners. Second, as mentioned previously, we must demonstrate not only efficacy (that the intervention works under highly controlled conditions) but also effectiveness (that the intervention also works under normal circumstances). Earlier I noted the review by Hunsley and Lee (2007), which demonstrated that efficacy and effectiveness trials are comparable in terms of completion rates and outcome; however, there are only a small number of effectiveness trials in the literature.

Related to the need for more effectiveness trials is the need for better communication between scientists and clinicians (Teachman et al., 2012). Communication is two-way, not one-way, and practitioners understandably do not want to be disseminated upon (Wilson, 2011). Scientists also need to hear the important voice of practitioners about what works in the real world. One way the Society of Clinical Psychology (APA Division 12) is attempting to close the gap between science and practice is by providing clinicians with a voice in the research process. In various surveys, clinicians are afforded the opportunity to provide feedback on their use of empirically supported treatments in real-world practice. It is hoped that by fostering two-way rather than one-way
communication, clinicians will be more likely to make use of research findings and that greater collaboration will take place (Goldfried, 2010).

Increased research on mechanisms of change. Third, we need more research on mechanisms of change. Numerous studies have shown that psychological interventions are effective for a host of conditions. What we do not understand well is why. Increased research on mechanisms of change is important and could help clinicians to determine which therapeutic ingredients to emphasise (D. A. Clark, in press; Kazdin, 2008). Demonstration of a link does not necessarily inform us about why such a relation exists. For example, knowing that gender is a risk factor in depression (with females twice as likely to become depressed as males) does not help me to understand why this is the case (Ingram, Miranda, & Segal, 1998; Ingram & Price, 2010). Similarly, just because a treatment works does not mean that we understand why or can capitalize on the mechanism of change.

In some of my own research, my colleagues and I have demonstrated that a well-organized negative representation of self (i.e., the organisation of the self-schema) meets sensitivity, specificity, and stability criteria as a vulnerability factor for depression (Dozois, 2007; Dozois & Dobson, 2001a, 2001b; Dozois, Eichstedt, Collins, Phoenix, & Harris, 2012; Lumley, Dozois, Hennig, & Marsh, 2012; Seeds & Dozois, 2010). In previous research, we have shown that negative cognitive organisation remains stable even
though people improve from an episode of depression. In one randomized clinical trial, we examined the effects of cognitive therapy (CT) plus pharmacotherapy (PT) compared with medication alone on depressive symptoms, surface-level cognitions, and deeper-level cognitions (i.e., cognitive organisation; Dozois,
  • 61. to do with the accessibility of the schema (e.g., cognitive reactivity) than its organisation per se (cf. Segal, Gemar, & Williams, 1999, Segal et al., 2006). The flood of negative thoughts that occur once the schema is activated and what a patient does with such thoughts (e.g., ruminating on them vs. acceptance; Wells, in press) may be the most important predictor of relapse. Nonetheless, if these findings are replicated and a shift in the organisation of self- representation is an important mechanism of long-term treatment change, then treatments can target this explicitly. By understanding how treatment works, we will be in a better position to capitalize on and match patients to variables that are critical to outcome (Kazdin, 2008). We will also be able to deliver treatment “doses” to specific patients in a manner that will max- imize resources (cf. Day, Eyer, & Thorn, in press). Related to mechanisms of change is the movement toward evidence-based procedures (e.g., core procedures that are impor- tant to use in the treatment of different problems and conditions, such as behavioural activation, cognitive restructuring, exposure, acceptance-based strategies, and so on). For example, transdiagnos- tic protocols (Dozois, Seeds, & Collins, 2009; Mansell, Harvey, Watkins, & Shafran, 2009; McHugh, Murray, & Barlow,
  • 62. 2009)—treatments that target pathological mechanisms that are com- mon across disorders—may enhance the relevance of the research to practice and circumvent many issues related to comorbidity (Shafran et al., 2009). Training in evidence-based thinking. Fourth, we need to shift our graduate education so that we go beyond helping students learn the content of how to administer empirically supported treatments to also training psychologists in evidence-based prac- tice (Babione, 2010; Bauer, 2007; Hershenberg, Drabick, & Vivian, 2012; Hunsley, 2007b; Lee, 2007; Leffler, Jackson, West, McCarty, & Atkins, in press). In other words, we need to train students how to think critically, respect, and understand scientific knowledge and empirical methodologies, and integrate this infor- mation to make scientifically informed clinical decisions within the context of a patient’s needs and background. As Babione (2010) pointed out, students “need to be knowledgeable of when it is beneficial to adhere to a particular modality, when to modify it, or when to abandon it and place heavier focus on the other components of the evidence-based framework” (p. 447). We need to teach our students how to think in an evidence-based manner so that they can adapt to novelty and integrate new research into their
  • 63. practices. Perhaps it is time for clinical programs to evaluate their curric- ulum not only for the content of knowledge but also for the process of learning. We need to ensure that we are modelling evidence- based practice, providing the best training and asking the right questions (see Lee, 2007; Leffler, et al., in press). Recommendations for Practitioners Clinicians, too, can take steps to narrow the research-practice gap. Next, I outline some considerations for practitioners. Measure treatment progress systematically. By routinely administering reliable and valid indices of patient functioning, practitioners may better determine whether a particular interven- tion is effective (see Fitzpatrick, 2012; Overington & Ionita, 2012; Sales & Alves, 2012) and make informed treatment decisions that are less clouded with confirmation biases and other heuristics (Dozois & Dobson, 2010; Kazdin, 2008). As illustrated in the hierarchy (see Figure 1), we need to determine how things are going through ongoing evaluation and then refer back to hierarchy if necessary. I use a variety of psychometric indices in my own independent practice. In addition to determining efficacy, there are other im- portant advantages to monitoring change over time. For example, collecting data in therapy demonstrates to clients that the therapist is confident in his or her ability to help, is credible, and
  • 64. respects accountability. Data can also be used to examine the stability of the treatment response (e.g., to ensure that a patient’s change does not Figure 2. Changes in cognitive organisation as a function of cognitive therapy. 7PRESIDENTIAL ADDRESS simply reflect a flight into health). For instance, Jarrett, Vittengl, and Clark (2008) demonstrated that additional treatment may be indicated to prevent relapse when a patient’s depression scores are in the mild range or higher during any of the last 6 weeks of therapy. Psychometric data also provides a clear indication of when treatment is successful and can be safely terminated. Finally, data gathered over time can be tabulated across different cases and can allow therapists to evaluate their own efficacy among patients with different diagnoses and client characteristics (see Dozois & Dobson, 2010). Capitalize on clinician’s knowledge and experiences. We also need to capitalize on clinician’s knowledge and experiences. As Kazdin (2008) contends, we often consider research to be the contribution to knowledge and practice as the application of that knowledge. However, this is an unfortunate way of viewing the
  • 65. contributions that scientists and practitioners make and only reifies the scientist–practitioner gap. Clinical work can and does contrib- ute importantly to science. By systematically coding their experi- ences, clinicians can contribute to the existing body of knowledge and transfer important information to the next generation of psy- chologists. We need direct collaborations between those who iden- tify themselves as primarily scientists and those whose primary identification is as a clinician. Our discipline needs the experience and expertise of practitioners (Kazdin, 2008). One exciting development has been the establishment of prac- tice research networks, which are designed to foster collaboration among researchers and clinicians by conducting naturalistic stud- ies in psychotherapy. These networks provide the infrastructure for practice-based evidence to complement evidence-based practice (Audin et al., 2001; Castonguay et al., 2010; Castonguay, Locke, & Hayes, 2011; Norquist, 2001). Castonguay and colleagues (2010) note that typical evidence-based strategies (e.g., RCTs), although important, have reflected a “top-down” approach that may have contributed to “empirical imperialism” (p. 328)— scien- tists who treat few patients tell clinicians who rarely conduct research what variables should be studied to improve outcome. In contrast, practice research networks involve clinical
  • 66. practitioners in the community collaborating with researchers to decide on the research questions, design the methodology, and implement the studies with the goal of increasing effectiveness research while also maintaining scientific rigor. The Pennsylvania Psychological Association Practice Research Network was the first psychother- apy network devoted specifically to this type of collaborative research (Castonguay et al., 2010). Tasca (2012a, 2012b) and his colleagues have recently received a Canadian Institutes of Health Research Planning and Meeting Grant to launch a psychotherapy practice research network in Canada—and there are others as well (e.g., a Practice Research Network being developed at York Uni- versity). Conclusion The gap between science and practice needs to be filled both by the scientist and by the practitioner. As Kazdin (2008) cogently argues, the researcher is not likely to say, ‘There is no solid evidence for any treatment, so I am going to withhold best guesses by experienced professionals.’ Similarly, practicing clinicians, in need of help for their relatives, are likely to search the Web, read extensively, and
make phone calls to medical centers and experts to identify what the evidence is for the various surgical, pharmacological, and other alternatives for their parents or children with significant medical problems. The clinician is not likely to say, 'My relative is different and unique and the evidence really has not been tested with people like her, so I am going to forgo that treatment.' (p. 151)

We need science so that opinion does not prevail (Nathan & Gorman, 1998). We must not forget that human judgment and memory are fallible. We need more science in practice. We need to train psychologists so that they think in an evidence-based manner and make conscious, explicit, and judicious use of evidence in their day-to-day practices. We also need more practice in science—to rely on the strength and expertise of our clinicians to improve science. For the good of our profession and for the health and well-being of Canadians, we must work together to study, to practice, to foster, to develop, and to disseminate evidence-based practice and practice-based evidence.
References

American Psychological Association Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271–285. doi:10.1037/0003-066X.61.4.271
  • 70. Baker, T. B., McFall, R. M., & Shoham, V. (2009, November 15). Is your therapist a little behind the times?. Washington Post. Retrieved from http://www.washingtonpost.com/wp- dyn/content/article/2009/11/13/ AR2009111302221.html Bauer, R. M. (2007). Evidence-based practice in psychology: Implications for research and research training. Journal of Clinical Psychology, 63, 685– 694. doi:10.1002/jclp.20374 Beck, A. T., & Dozois, D. J. A. (2011). Cognitive therapy: Current status and future directions. Annual Review of Medicine, 62, 397– 409. doi: 10.1146/annurev-med-052209-100032 Beck, A. T., Steer, R. A., & Brown, G. K. (1996). Beck Depression Inventory Manual (2nd. ed.). San Antonio, TX: Psychological Corpora- tion. Beutler, L. E., Williams, R. E., Wakefield, P. J., & Entwistle, S. R. (1995). Bridging scientist and practitioner perspectives in clinical psychology. American Psychologist, 50, 984 –994. doi:10.1037/0003- 066X.50.12 .984 Bohart, A. C. (2005). Evidence-based psychotherapy means evidence-
  • 71. informed, not evidence-driven. Journal of Contemporary Psychother- apy, 35, 39 –53. doi:10.1007/s10879-005-0802-8 Bryceland, C., & Stam, H. (2005). Empirical validation and professional codes of ethics: Description or prescription? Journal of Constructivist Psychology, 18, 131–155. doi:10.1080/10720530590914770 Butler, A. C., Chapman, J. E., Forman, E. M., & Beck, A. T. (2006). The empirical status of cognitive-behavioral therapy: A review of meta- analyses. Clinical Psychology Review, 26, 17–31. doi:10.1016/j.cpr .2005.07.003 Canadian Psychological Association. (2012). Evidence-based practice of psychological treatments: A Canadian perspective (Report of the CPA Task Force on Evidence-Based Practice of Psychological Treatments). Ottawa, Ontario: Author. Castonguay, L. G., Boswell, J. F., Zack, S. E., Baker, S., Boutselis, M. A., Chiswick, N. R., . . . Holtforth, M. G. (2010). Helpful and hindering events in psychotherapy: A practice research network study. Psycho- therapy (Chicago, Ill.), 47, 327–344. doi:10.1037/a0021164 Castonguay, L. G., Locke, B. D., & Hayes, J. A. (2011). The Center for
  • 72. Collegiate Mental Health: An example of a practice-research network in university counseling centers. Journal of College Student Psychother- apy, 25, 105–119. doi:10.1080/87568225.2011.556929 Centre for Economic Performance’s Mental Health Policy Group. (2012). How mental illness loses out in the NHS. London School of Economics and Political Science. London, UK. Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716. doi:10.1146/annurev.psych.52.1.685 Chambless, D. L., Sanderson, W. C., Shoham, V., Bennett Johnson, S., Pope, K. S., Crits-Christoph, P., . . . McCurry, S. (1996). An update on empirically validated therapies. The Clinical Psychologist, 49, 5–18. Chapman, L. J., & Chapman, J. P. (1969). Illusory correlation as an obstacle to the use of valid psychodiagnostic signs. Journal of Abnormal Psychology, 74, 271–280. doi:10.1037/h0027592 Chapman, L. J., & Chapman, J. P. (1975). The basis of illusory correlation. Journal of Abnormal Psychology, 84, 574 –575. doi:10.1037/h0077112
  • 73. Chwalisz, K. (2003). Evidence-based practice: A framework for twenty- first-century scientist-practitioner training. The Counseling Psycholo- gist, 31, 497–528. doi:10.1177/0011000003256347 Clark, D. A. (in press). Cognitive restructuring: A major contribution of cognitive therapy. In D. J. A. Dozois (Ed.), CBT: General Strategies. Volume 1. In S. G. Hofmann (Series Ed.), Cognitive-behavioral therapy: A complete reference guide. Oxford, UK: Wiley-Blackwell. Clark, D. M. (2012, June 18). It is inexcusable that mental health treat- ments are still underfunded. The Guardian. Retrieved from http://www .guardian.co.uk/commentisfree/2012/jun/18/inexcusable-mental- health- treatments-underfunded Clark, D. M., Layard, R., Smithies, R., Richards, D. A., Suckling, R., & Wright, B. (2009). Improving access to psychological therapy: Initial evaluation of two UK demonstration sites. Behaviour Research and Therapy, 47, 910 –920. doi:10.1016/j.brat.2009.07.010 Dawes, R. M., Faust, D., & Meehl, P. E. (1989). Clinical versus actuarial judgment. Science, 243, 1668 –1674. doi:10.1126/science.2648573
  • 74. Day, M. A., Eyer, J. C., & Thorn, B. E. (in press). Therapeutic relaxation. In D. J. A. Dozois (Ed.), CBT: General Strategies. Volume 1. In S. G. Hofmann (Series Ed.), Cognitive-behavioral therapy: A complete refer- ence guide. Oxford, UK: Wiley-Blackwell. DeRubeis, R. J., Gelfand, L. A., Tang, T. Z., & Simons, A. D. (1999). Medications versus cognitive behavior therapy for severely depressed outpatients: Mega-analysis of four randomized comparisons. The Amer- ican Journal of Psychiatry, 156, 1007–1013. DeRubeis, R. J., Hollon, S. D., Amsterdam, J. D., Shelton, R. C., Young, P. R., Salomon, R. M., . . . Gallop, R. (2005). Cognitive therapy vs medications in the treatment of moderate to severe depression. Archives of General Psychiatry, 62, 409 – 416. doi:10.1001/archpsyc.62.4.409 DeRubeis, R. J., Webb, C. A., Tang, T. Z., & Beck, A. T. (2010). Cognitive therapy. In K. S. Dobson (Ed.), Handbook of cognitive- behavioral therapies (3rd ed., pp. 277–316). New York, NY: Guilford. Dozois, D. J. A. (2007). Stability of negative self-structures: A longitudinal comparison of depressed, remitted, and nonpsychiatric controls. Journal of Clinical Psychology, 63, 319 –338. doi:10.1002/jclp.20349