Evidence-Based Practice in Psychology:
Implications for Research and Research Training
Russell M. Bauer
University of Florida
In this article, the author discusses the implications of evidence-based practice (EBP) for research and research training in clinical psychology. It is argued that EBP provides a useful framework for addressing some heretofore ignored problems in clinical research. Advancing evidence-based psychological practice will require educators to inject significant new content into research, design, and methodology courses and to further integrate research and practicum training. The author believes this to be an exciting opportunity for the field, not only because it will further psychologists’ integration into the interdisciplinary health care and research environment, but also because it will provide new tools to educate students for capable, not just competent, professional activity. © 2007 Wiley Periodicals, Inc. J Clin Psychol 63: 685–694, 2007.

Keywords: education and training; research
In recent years, the notion that psychologists deliver “health care” rather than just “mental health care” has taken hold in our field. Along with this identification as a health care discipline comes a set of responsibilities to provide patients with clinical services that have been shown through research to be effective for addressing patient problems. The fundamental goal of the evidence-based practice (EBP) movement is to effect a cultural change within health care whereby practitioners will make “conscious, explicit, and judicious” use of current best evidence in clinical practice with individual patients (Mayer, 2004; Straus, Richardson, Glasziou, & Haynes, 2005). The contemporary emphasis on EBP is quite strong within other health care disciplines, where it has permeated the culture of education, practice, and research, and where it is seen as furnishing at least a partial answer to a fundamental call for accountability and continuous quality improvement in the overall system of health care delivery in the United States (Institute of Medicine, 2001).

Correspondence concerning this article should be addressed to: Russell M. Bauer, Department of Clinical and Health Psychology, University of Florida, P.O. Box 100165 HSC, Gainesville, FL 32610-0165; e-mail: [email protected]
Most psychologists understand that EBP refers to a process by which best evidence is used intentionally in making decisions about patient care. Psychologists are most familiar with the construct of best evidence in the context of the empirically supported treatment movement, but some may mistakenly believe that EBP and empirically supported treatment (EST) are synonymous. As other articles in this series make clear, they are not; EBP is a much broader concept that refers to knowledge and action in the three essential elements of patient encounters: (a) the best evidence guiding a clinical decision (the best evidence domain), (b) the clinical expertise of the health care professional to diagnose and treat the patient’s problems (the clinical expertise domain), and (c) the unique preferences, concerns, and expectations that the patient brings to the health care setting (the client domain). These three elements are often referred to as the three pillars of EBP. Even a brief consideration of the many variables and mechanisms involved in the three pillars will lead the clinical psychologist to an obvious conclusion: EBP not only provides a framework for conceptualizing clinical problems, but also suggests a research agenda whereby patterns of wellness and illness are investigated with an eye toward how best practices are potentially mediated by unique aspects of practitioner expertise. It also highlights how key patient characteristics influence treatment acceptability and help define the role the patient plays in the health care relationship.
This is not a new agenda, but is quite similar to the agenda set forth by Gordon Paul in 1969 in his now-famous ultimate clinical question, “What treatment, by whom, is most effective for this individual, with that specific problem, under which set of circumstances, and how does it come about?” (Paul, 1967, 1969, p. 44). In asking this question, Paul’s goal was to draw attention to variables that needed to be described, measured, or controlled for firm evidence to accumulate across studies of psychotherapy. The agenda for evidence-based psychological practice is similar, though broader, encompassing assessment as well as treatment, psychological health care policy as well as clinical procedure, and populations as well as individuals. As such, expanding the scope of evidence-based psychological practice provides an opportunity for psychologists to build conceptual and methodological bridges with their colleagues in medicine, nursing, pharmacy, health professions, and public health.
Although its status as a health care delivery process is typically emphasized, EBP, initially referred to as evidence-based medicine, evolved at McMaster University as a pedagogical strategy for teaching students and practitioners how to incorporate research results into the process of patient care (McCabe, 2006; Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). As professional psychology begins to seriously consider the relevance of EBP for broad aspects of practice (Davidson & Spring, 2006), we will have to grapple with some obvious implications for (a) how we conduct practice-based research, and (b) how we educate and train our students, the next cadre of clinical researchers, to develop the knowledge, skills, and expertise to contribute to the evidence base. In this article, I discuss some of these implications with an eye toward viewing the EBP movement as an opportunity to begin to answer some of our most difficult research questions, and to begin to address some of our most vexing and persistent problems in education and training.
Practice-Based Research
From a research perspective, EBP provides a framework for investigating heretofore neglected aspects of “rubber-meets-the-road” practice. That is, confronting gaps in the evidence base from an EBP perspective draws attention to key client variables (e.g., preferences for one treatment over another, ability/willingness to adhere to treatment, credibility of treatment rationales, demographic and socioeconomic variables that enhance or impede health care access or that contribute to attitudes about treatment acceptability) and dimensions of clinical expertise (e.g., the ability to deliver the appropriate EST for the patient’s problem, the ability to adapt treatments to unique clients, the ability to deliver assessments appropriate to decision making, the ability to communicate effectively with the patient) that deserve empirical study. Practitioners face these gaps because our dominant research paradigms tend to yield data about homogeneous majority groups receiving standard treatment in optimal settings.
Thus far, most of what constitutes evidence-based psychological practice is in the area of empirically supported treatment (Chambless, 1995; Chambless et al., 1998). Currently, there are several psychological therapies with well-established efficacy for the treatment of a variety of psychological problems (American Psychological Association [APA] Division 12 Dissemination Subcommittee of the Committee on Science and Practice). Expansion of this list to include new therapies and clinical problems, and demonstration of the portability of well-controlled efficacy studies to real-world problems (effectiveness), is continuing apace (Chambless & Ollendick, 2001).
A parallel expansion of the evidence base for psychological assessment procedures is needed. More research is needed regarding the diagnostic utility of assessment tools in predicting at-risk status, in helping select which treatment is indicated, or in predicting treatment response. Even in areas where the evidence for the clinical utility of assessment procedures is strong (e.g., in surgical epilepsy, where the results of presurgical evaluation of verbal memory strongly predict which patients will develop postsurgical neuropsychological morbidity; Chelune, 1995), the best available evidence has not yet caused the majority of clinicians to modify their assessment approach accordingly.
A full instantiation of EBP in psychology will require an expansion of systematic research efforts that will provide us with more information about the clinical expertise and patient domains. This represents a real opportunity to broaden the scope of EBP in psychology. How do psychological practitioners with varying levels of expertise decide which of a number of alternative treatments to utilize in the clinic? What factors make clinically efficacious treatments acceptable to patients? How does cultural diversity interact with treatment acceptability? To apply best evidence to individual clinical problems seamlessly, we need to develop a research agenda that allows us to retrieve and analyze answers to these kinds of questions. This is a daunting task, and one that seems intractable from the point of view of our exclusive reliance on quantitative research methods and controlled experiments. Perhaps this is an area in which increased knowledge of qualitative research methods (see below) would be beneficial for the field. This is an area to which practicing scientist–practitioners can provide critical information by adopting a data-driven approach to practice that incorporates measurement and reporting of assessment and treatment outcomes for purposes of further addressing effectiveness questions.
Implications for Education and Training
In a recent survey on training in ESTs, Woody, Weisz, and McLean (2005) reported that, although many doctoral training programs provided didactic dissemination of EST-related information, actual supervised training in ESTs had declined compared to a similar survey conducted in 1993. The overall conclusion was that the field had a long way to go in ensuring that our students have sufficient skill and experience to practice ESTs in their professional lives. The authors cited several obstacles to training in ESTs, including (a) uncertainty about what it means to train students in EBP; (b) insufficient time to provide specific training in multiple ESTs given other training priorities, including research; (c) within-program shortages of trained supervisors needed to provide a truly broad EST training experience; and (d) philosophic opposition to what some perceive as an overly rigid, manualized approach to treatment that reduces professional psychological practice to technician status. It seems obvious to me that most of these barriers imply a method of training in which competency in ESTs is built one treatment at a time, thus requiring large investments of time and faculty effort. Although it is true that students need practical training in a variety of clinical methods, one key issue is whether a goal of graduate education is to train students to competency in a critical number of ESTs, or whether the goal is to train them in broader principles of evidence-based practice that will enable them to easily adapt to novel demands for new competencies after attaining their PhD (educating for capability rather than competency; Fraser & Greenhalgh, 2001).
There is evidence that clinical psychology training directors are ready for this development. In the Woody et al. (2005) survey, some clinical training directors indicated that current practice reflects an underemphasis on broad principles of evidence-based practice in favor of learning particular procedures on a treatment-by-treatment basis. Some of the issues related to the ability of programs to provide appropriate training would be addressed if we adopted a more general-principles approach. Although not particularly on point in the context of this article, it is my view that developing competencies in ESTs for research and professional practice is the joint and cumulative responsibility of doctoral programs, internships, and postdoctoral programs that work together to provide a continuum of training in the knowledge and skills of evidence-based psychological practice (EBPP).
Training in EBPP will require graduate training programs to include new content in research training curricula so that students are ready to understand and apply basic principles of EBPP in their everyday professional lives. Primary needs include training in (a) epidemiology, (b) clinical trials methodology, (c) qualitative research methods and measurement, (d) conducting and appraising systematic reviews and meta-analyses, and (e) the informatics and electronic database-searching skills necessary to find the best available evidence relevant to the problems that students will encounter in their research and clinical work. Such content could be introduced in a basic research methods course, could be taught separately in a course on EBPP, or could be infused in the curriculum through a combination of didactic, practicum, and research experiences (for additional ideas on infusion of EBPP into the curriculum, see DiLillo & McChargue, this issue). Achieving true infusion and integration will require that all program faculty are committed to the concept of EBPP, that all have received some basic education in EBPP themselves, and that EBPP concepts are represented throughout the curriculum. The faculty development implications of advancing EBPP are not trivial. In the short run, an effective strategy may be to partner with colleagues in medicine, the health professions, nursing, and public health to provide interdisciplinary instruction and mentoring in basic principles of EBP.
Epidemiology
Many problems important to psychologists (e.g., whether a clinical assessment tool is effective in identifying at-risk patients, whether a treatment protocol is effective in reducing psychological distress or disability in a defined population) can be conceptualized and described in epidemiological terms. For example, the strength of a treatment effect can be described with reference to the concept of “number needed to treat” (the number of patients who would need to be treated to produce one additional favorable outcome), “number needed to harm” (the number of patients who would need to be treated to produce one additional unfavorable outcome), or, more generally, in terms of relative or absolute risk reduction. Knowledge of basic aspects of diagnostic test performance (e.g., sensitivity, specificity, positive and negative predictive value) so critical to psychological practice can also be enhanced by forging links between these concepts and corresponding concepts in epidemiology (e.g., positive and negative likelihood ratios). A broad grounding in epidemiological methods will promote further ways of understanding and inferring causality from observational and experimental data, will further an appreciation for preventative methods, and will provide much-needed appreciation for community- and population-based methods that will complement psychology’s traditional emphasis on individuals and small groups.
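To make these quantities concrete, the following minimal sketch (in Python, using hypothetical counts that are not drawn from any study cited here) computes the trial-level measures named above (absolute and relative risk reduction, number needed to treat or harm) and the diagnostic-test measures (sensitivity, specificity, predictive values, likelihood ratios) from simple two-arm and 2 × 2 counts.

```python
def risk_measures(events_control, n_control, events_treated, n_treated):
    """Absolute risk reduction (ARR), relative risk reduction (RRR),
    and number needed to treat (NNT) from two-arm event counts."""
    risk_c = events_control / n_control        # event risk under control
    risk_t = events_treated / n_treated        # event risk under treatment
    arr = risk_c - risk_t                      # absolute risk reduction
    rrr = arr / risk_c                         # relative risk reduction
    nnt = 1 / arr                              # patients treated per extra good outcome
    return arr, rrr, nnt

def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values, and likelihood ratios
    from a 2 x 2 table of test results against a reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)    # positive likelihood ratio
    lr_neg = (1 - sens) / spec    # negative likelihood ratio
    return sens, spec, ppv, npv, lr_pos, lr_neg

# Hypothetical trial: 40/100 relapses under usual care vs. 25/100 under treatment
arr, rrr, nnt = risk_measures(40, 100, 25, 100)
print(f"ARR = {arr:.2f}, RRR = {rrr:.2f}, NNT = {nnt:.1f}")

# Hypothetical screening test: 45 true positives, 15 false positives,
# 5 false negatives, 135 true negatives
print(diagnostic_measures(45, 15, 5, 135))
```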
Clinical Trials Methodology
Although many graduate statistics and methodology courses cover such topics as case-control designs, cohort designs, and elements of randomized clinical trials (RCTs), classical methodology education in the Campbell and Stanley (1963) tradition needs to be supplemented with contemporary information relevant to clinical trials methodology. For example, training in standards for designing, conducting, and reporting clinical trials consistent with the CONSORT statement (Begg et al., 1996; Moher, Schulz, & Altman, 2001) is important so that reports of psychological clinical trials have appropriate consistency and transparency. Training in methods for reporting the size of treatment effects (going beyond statistical significance), allocating samples, specifying outcomes (relative and absolute risk reduction, number needed to treat and number needed to harm), and addressing the ethical issues of clinical trials is critically needed if psychology is to develop a truly evidence-based practice. Building the ability to evaluate the results of extant trials critically is also crucial if psychological practitioners are to meaningfully apply the best-evidence standard to their own clinical work and research.
Qualitative Research Methods and Measurement
Clinical psychologists trained in the scientist–practitioner tradition are almost exclusively focused on quantitative research methods, with an attendant emphasis on measurement precision, quantitative statistical analysis, and tightly controlled experimental design. This scientific tradition links us with our colleagues in the natural and social sciences, and represents our preferred “way of knowing” the world. In contrast, qualitative approaches to research seek to evaluate the quality, or essence, of human experience using a fundamentally different methodological and analytic framework (Mays & Pope, 1995, 2000; Pope, Ziebland, & Mays, 2000). Many psychologists are familiar with at least some qualitative research methods, exemplified, for example, in ethnography, sociometry, participant observation, or content analysis of discourse. However, methods such as convergent interviewing, focus groups, and personal histories are generally foreign to most students in scientist–practitioner programs. As applied to health care, qualitative researchers may seek to evaluate the experiences of brain-injured patients in rehabilitative settings as a way of enhancing the design of the rehabilitation environment for purposes of maximizing recovery. They may investigate case dispositions in a child neurosurgery clinic by evaluating commonalities among physicians’ notes and clinical decisions. They may evaluate treatment acceptability by interviewing patients about their experiences in treatment. It is important for psychologists to become more familiar with these methods because many systematic reviews in the EBP literature contain the results of qualitative studies (Thomas et al., 2004). Although qualitative research is generally incapable of establishing causative relationships among variables, qualitative studies may be the only (and therefore the best) source of evidence for rare conditions, and they may suggest associations worthy of future research. Reviews of this area as applied to health care can be found in Greenhalgh and Taylor (1997), Grypdonck (2006), Holloway (1997), and Leininger (1994).
Conducting Systematic Reviews and Meta-Analyses
The explosion of relevant medical and psychological literature has made it difficult for scientist–practitioners to have access to the best evidence at the single-study level while attending to multiple simultaneous demands on their time. For this reason, systematic reviews of the literature are becoming increasingly important as sources for state-of-the-art information. Most graduate courses in research methodology and statistics devote little attention to conducting reviews or meta-analyses, although many programs now appear to be offering grant-writing courses or seminars. In these courses, an emphasis on design and critique of individual studies is commonplace, whereas development of skills in evaluating systematic reviews or meta-analyses is rare. If psychology is to become a key player in evidence-based practice, the next cadre of scientist–practitioners will have to develop skills in conducting and evaluating these kinds of reviews. In programming needed education and training, it is important to distinguish between narrative reviews (the kind of review that is seen, for example, in Psychological Bulletin) and systematic reviews. Narrative reviews are conducted by knowledgeable persons who often conduct the review to advance a particular theoretical conclusion. They therefore yield potentially biased conclusions because there is no consensually agreed-upon method for combining and weighting results from different studies. In contrast, systematic reviews and meta-analyses proceed according to specified methodological conventions in which the search method, the procedure for including and excluding studies, and the method for eventually calculating effect sizes or odds ratios are specified beforehand (e.g., fixed-effects vs. random-effects models), as are methods for determining statistical and clinical significance (Cook, Mulrow, & Haynes, 1997; Cook, Sackett, & Spitzer, 1995; Quintana & Minami, 2006). Meta-analysis is a specific form of quantitative systematic review that aggregates the results of similar studies for purposes of generating more stable conclusions from pooled data than is possible at the individual-study level (Egger, Smith, & Phillips, 1997; Rosenthal & DiMatteo, 2001; Wolf, 1986). Recent techniques allow for the calculation of bias in published studies, allowing the reader to appraise whether the results of the analysis reflect an undistorted view of effect size (Sterne, Egger, & Smith, 2001). Clinical psychologists need to know these basic concepts so that they can evaluate the relevance and quality of available evidence.
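To illustrate the prespecified pooling step that distinguishes a meta-analysis from a narrative review, here is a minimal fixed-effect (inverse-variance) sketch over standardized effect sizes. The three study results are hypothetical; a real review would also prespecify its search strategy, inclusion criteria, random-effects alternatives, and bias assessments.

```python
import math

def fixed_effect_pool(effects, variances):
    """Fixed-effect (inverse-variance) pooled effect, its 95% CI,
    and Cochran's Q as a simple heterogeneity check."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    return pooled, ci, q

# Hypothetical per-study standardized mean differences and their variances
effects = [0.45, 0.30, 0.60]
variances = [0.04, 0.02, 0.09]
pooled, ci, q = fixed_effect_pool(effects, variances)
print(f"pooled d = {pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], Q = {q:.2f}")
```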
Informatics and Database Searching Skills
If a tree falls in the woods, and there is no one there to hear it, does it make a sound? This classical conundrum about the nature of reality seems relevant to the key issue of information access in evidence-based practice. If useful information about best evidence exists, but we do not or cannot access it, it cannot be brought to bear on clinical decision making (Slawson & Shaughnessy, 2005). For this reason, developing expertise in informatics and database searching is a critical step in making EBPP a reality. In my experience, most psychologists, and students of psychology, search a limited number of databases (PubMed, U.S. National Library of Medicine, 1971; PsycINFO, APA, 1967) with a single search term and (at most) a single Boolean operator. It is not uncommon for a supervisor of a fledgling student to hear that “there’s nothing in the literature” about a topic the student is interested in researching. Most use very little of what is available, and many are completely unaware of many of the most important and useful resources available for EBPP. A detailed discussion of these resources is beyond my scope (see Hunt & McKibbon, 1997); nevertheless, it seems critical that some effort be devoted (either in faculty development seminars or in graduate education) to addressing database availability explicitly, including access strategies, search methodology, and approaches to information management (managing search results). A key first step in getting this accomplished may be to establish a close relationship with a librarian or library informatics specialist who can help translate educational and research needs into strategies for accessing needed information, and who can provide access to needed databases and other resources. It is not uncommon, particularly in larger institutions, for at least one member of the library staff to be particularly skilled in evidence-based medicine. There are a number of databases that are of particular relevance to EBPP, including CINAHL (nursing and allied health; Cinahl Information Systems, 1984), EMBASE (1974), the Cochrane Library (including the Cochrane Database of Systematic Reviews [CDSR]; Cochrane Library, 1999a), the Database of Abstracts of Reviews of Effects (DARE; Cochrane Library, 1999b), the Cochrane Central Register of Controlled Trials (CENTRAL; Cochrane Library, 1999c), and the ACP Journal Club (American College of Physicians, 1994), available on the Ovid (Ovid Technologies, New York, NY) search engine (for a more in-depth discussion, see Walker & London, this issue).
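As a small illustration of moving beyond a single search term, the sketch below sends a Boolean query to PubMed through NCBI's public E-utilities esearch endpoint and retrieves matching PubMed IDs. The endpoint and parameters are NCBI's documented ones, but the query string itself is only an illustrative example, and in practice an API key and rate limits may apply.

```python
import json
import urllib.parse
import urllib.request

# NCBI E-utilities "esearch" endpoint for PubMed
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# Illustrative Boolean query combining problem, intervention, and design terms
query = ('"panic disorder"[Title/Abstract] AND '
         '"cognitive behavioral therapy"[Title/Abstract] AND '
         'randomized controlled trial[Publication Type]')

params = urllib.parse.urlencode({
    "db": "pubmed",
    "term": query,
    "retmax": 20,        # cap the number of returned PubMed IDs
    "retmode": "json",
})

with urllib.request.urlopen(f"{ESEARCH}?{params}") as resp:
    result = json.load(resp)["esearchresult"]

print("Hits:", result["count"])
print("First PMIDs:", result["idlist"])
```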
Obtaining access to these databases is only part of the story; the development of strategic searching skills designed to yield a manageable number of relevant search results is a key outcome goal of educational efforts that will be achieved only through actual practice in problem-based learning situations. Finally, development of a local or profession-wide resource that contains the answers to evidence-based queries (so-called critically appraised topics, or CATs) will enable students and their mentors to benefit from the evidence-based practice efforts of their colleagues. Other authors in this series have suggested ways of incorporating skill-building activities into practicum and other parts of the psychology curriculum (see Collins, Belar, & Leffingwell, this issue; DiLillo & McChargue, this issue).
The Way Forward
In this article, I have tried to highlight ways that the interdisciplinary trend toward evidence-based practice offers real opportunities to address some difficult research problems and to revitalize certain aspects of our graduate curricula. This brief analysis has likely raised more questions (e.g., How? When? By whom? In what way?) as far as the training implications are concerned, and has not dealt at all with criticisms that have been thoughtfully leveled against the EBP approach to research and research training. One key issue in advancing EBP within psychology will be to pay attention to the key stage of the process by which knowledge (best evidence) is transformed into action and application. This, in my view, is the stage of the process that is least understood from a psychological viewpoint. What are the principles by which best evidence can be modified to fit the individual case? What evidence is “good enough” to drive a clinical decision? What about those aspects of psychological health care (e.g., relationship, trust, identification, and modeling) that are implicitly important in the delivery of services, but that don’t themselves have large-scale independent empirical support? These (and others) are key questions we will need to grapple with as we implement an evidence base for clinical psychology and teach students how to access and use it.
With regard to pedagogy, I am convinced that the only way to go is to incorporate problem-based, real-time experiences throughout the curriculum in which students can learn to walk the EBPP walk. This is a significant undertaking with profound implications as far as faculty development is concerned. I am as skeptical of an Evidence-Based Practice course as a way to develop the needed skills and capacities of our students as I am that a Cultural Diversity course will somehow help build multicultural competencies. We will need to figure out how to incorporate the content, the concepts, and the techniques of evidence-based psychological practice at all levels of research and clinical training if we are to be truly successful in assimilating the EBPP way of thinking. We cannot do it all; faculty are generally not up to speed with all that is needed, and, for the practicing clinician, health care events proceed at a rapid pace. We can begin the process by equipping tomorrow’s psychological practitioners with the tools necessary to implement EBPP in their everyday clinical practice. In addition, we can capitalize on the obvious opportunities to expand our multidisciplinary interdependence with other health professionals in nursing, medicine, pharmacy, and public health who are further down the EBP road than we are. Providing faculty with needed support, and developing methods for educating and training tomorrow’s psychologists in EBPP, are critically needed steps in establishing an evidence base equal to the task of providing quality psychological health care for those who depend on us.
References
American College of Physicians. (1994). ACP Journal Club homepage. Retrieved February 15, 2007, from http://www.acpjc.org
American Psychological Association. (1967). PsycINFO homepage. Retrieved February 15, 2007, from http://www.apa.org/psycinfo/products/psycinfo.html
Begg, C., Cho, M., Eastwood, S., Horton, R., Moher, D., Olkin, I., et al. (1996). Improving the quality of reporting of randomized controlled trials: The CONSORT statement. Journal of the American Medical Association, 276, 637–639.
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally College Publishing.
Chambless, D. L. (1995). Training and dissemination of empirically validated psychological treatments: Report and recommendations. The Clinical Psychologist, 48, 3–23.
Chambless, D. L., Baker, M. J., Baucom, D. H., et al. (1998). Update on empirically validated therapies, II. The Clinical Psychologist, 51, 3–16.
Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716.
Chelune, G. (1995). Hippocampal adequacy versus functional reserve: Predicting memory functions following temporal lobectomy. Archives of Clinical Neuropsychology, 10, 413–432.
Cinahl Information Systems. (1984). Homepage. Retrieved February 15, 2007, from http://www.cinahl.com
Cochrane Library. (1999a). Homepage. Retrieved February 15, 2007, from http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME
Cochrane Library. (1999b). DARE homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_cldare_articles_fs.html
Cochrane Library. (1999c). Cochrane Central Register of Controlled Trials homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_clcentral_articles_fs.html
Collins, F. L., Leffingwell, T. R., & Belar, C. D. (2007). Teaching evidence-based practice: Implications for psychology. Journal of Clinical Psychology, 63, 657–670.
Cook, D. J., Mulrow, C. D., & Haynes, R. B. (1997). Systematic reviews: Synthesis of best evidence for clinical decisions. Annals of Internal Medicine, 126, 376–380.
Cook, D. J., Sackett, D. L., & Spitzer, W. O. (1995). Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam Consultation on Meta-Analysis. Journal of Clinical Epidemiology, 48, 167–171.
Davidson, K. W., & Spring, B. (2006). Developing an evidence base in clinical psychology. Journal of Clinical Psychology, 62, 259–271.
DiLillo, D., & McChargue, D. (2007). Implementing evidence-based practice training in a scientist–practitioner program. Journal of Clinical Psychology, 63, 671–684.
Egger, M., Smith, G. D., & Phillips, A. N. (1997). Meta-analysis: Principles and procedures. British Medical Journal, 315, 1533–1537.
EMBASE. (1974). Homepage. Retrieved February 15, 2007, from http://www.embase.com
Fraser, S. W., & Greenhalgh, T. (2001). Coping with complexity: Educating for capability. British Medical Journal, 323, 799–803.
Greenhalgh, T., & Taylor, R. (1997). How to read a paper: Papers that go beyond numbers (qualitative research). British Medical Journal, 315, 740–743.
Grypdonck, M. H. (2006). Qualitative health research in the era of evidence-based practice. Qualitative Health Research, 16, 1371–1385.
Holloway, I. (1997). Basic concepts for qualitative research. Oxford: Blackwell Science.
Hunt, D. L., & McKibbon, K. A. (1997). Locating and appraising systematic reviews. Annals of Internal Medicine, 126, 532–538.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press.
Leininger, M. (1994). Evaluation criteria and critique of qualitative research studies. In J. M. Morse (Ed.), Critical issues in qualitative research methods. Thousand Oaks, CA: Sage.
Mayer, D. (2004). Essential evidence-based medicine. New York: Cambridge University Press.
Mays, N., & Pope, C. (1995). Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. British Medical Journal, 311, 42–45.
Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. British Medical Journal, 320, 50–52.
McCabe, O. L. (2006). Evidence-based practice in mental health: Accessing, appraising, and adopting research data. International Journal of Mental Health, 35, 50–69.
Moher, D., Schulz, K. F., & Altman, D. G. (2001). The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomized trials. Lancet, 357, 1191–1194.
Paul, G. L. (1967). Outcome research in psychotherapy. Journal of Consulting Psychology, 31, 109–118.
Paul, G. L. (1969). Behavior modification research: Design and tactics. In C. M. Franks (Ed.), Behavior therapy: Appraisal and status (pp. 29–62). New York: McGraw-Hill.
Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analyzing qualitative data. British Medical Journal, 320, 114–116.
Quintana, S. M., & Minami, T. (2006). Guidelines for meta-analyses of counseling psychology research. The Counseling Psychologist, 34, 839–877.
Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52, 59–82.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn’t. British Medical Journal, 312, 71–72.
Slawson, D. C., & Shaughnessy, A. F. (2005). Teaching evidence based medicine: Should we be teaching information management instead? Academic Medicine, 80, 685–689.
Sterne, J. A. C., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. British Medical Journal, 323, 101–105.
Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence based medicine: How to practice and teach EBM. Edinburgh: Elsevier/Churchill Livingstone.
Thomas, J., Harden, A., Oakley, A., Oliver, S., Sutcliffe, K., Rees, R., et al. (2004). Integrating qualitative research with trials in systematic reviews. British Medical Journal, 328, 1010–1012.
U.S. National Library of Medicine. (1971). Medline/PubMed homepage. Retrieved February 15, 2007, from http://www.ncbi.nlm.nih.gov/entrez/query.fcgi
Walker, B. W., & London, S. (2007). Novel tools and resources for evidence-based practice in psychology. Journal of Clinical Psychology, 63, 633–642.
Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. Beverly Hills, CA: Sage.
Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically supported treatments: 10 years later. The Clinical Psychologist, 58, 5–11.
From the Editors

Practice-Based Evidence: Back to the Future
Larry K. Brendtro, Martin L. Mitchell, & James Doncaster

Researchers are shifting from the medical model of studying treatments to a practice-based model focusing on the nature and needs of a person in a therapeutic relationship. As seen from the articles in this special issue, this has been a central tenet of Re-ED since it was founded by Nicholas Hobbs fifty years ago.
James Doncaster, guest editor

Confusion abounds about what qualifies as “evidence” of effective interventions. The president of the American Psychological Association (APA) notes that “much of the research that guides evidence-based practice is too inaccessible, overwhelming, and removed from practice” (Goodheart, 2010, p. 9). Yet lists of evidence-based treatments are being used to control funding in treatment, human services, and education. Stated simply, such policies are based on shaky science. Certainly there is no shortage of evidence that some methods are destructive, like withholding treatment or placing traumatized kids in toxic environments. But a wide variety of therapeutic interventions can have a positive impact if conducted within a trusting alliance.

There are two very different views of what evidence is most important. Research in the traditional medical model compares a proposed treatment with alternates or a placebo. If a prescribed number of published studies give a statistical edge, the treatment is anointed as “evidence-based.” This is followed by endorsements from the National Institutes of Health, the Department of Education, or other authoritative bodies.

Providing lists of curative treatments may work for medicine, but this is not how to find what works in complex therapeutic relationships. Mental health research has shown that the process of enshrining specific treatment models as evidence-based is based on flawed science (Chan, Hróbjartsson, Haahr, Gøtzsche, & Altman, 2004). Dennis Gorman (2008) of Texas A&M University documents similar problems with school-based substance abuse and violence prevention research, which he calls scientific nonsense.
Julia Littell (2010) of the Campbell Collaboration documents dozens of ways that sloppy science is being used to elevate specific treatments to evidence-based status. Here are just a few of these research flaws:

Allegiance Effect: Studies produced by advocates of a particular method are positively biased.

File Cabinet Effect: Studies showing failure or no effects are tucked away and not submitted for publication.

Pollyanna Publishing Effect: Professional journals are much more likely to publish studies that show positive effects and reject those that do not.

Replication by Repetition Effect: Reviewers rely heavily on recycling findings cited by others, confusing rumor and repetition with replication.

Silence the Messenger Effect: Those who raise questions about the scientific base of studies are met with hostility and ad hominem attacks.
When researchers account for such biases, a clear pattern emerges. Widely touted evidence-based treatments turn out to be no better or no worse than other approaches. Solid science speaks—success does not lie in the specific method but in common factors, the most important being the helping relationship.
Our field is in ferment as the focus of research is shifting. Instead of the study of treatments, the child now takes center stage. The practice-based model focuses on the nature and needs of an individual in an ecology (Brendtro & Mitchell, 2010). Effective interventions use research and practice expertise to target client characteristics including problems, strengths, culture, and motivation (APA, 2006). Research and evaluation measure progress and provide feedback on the quality of the therapeutic alliance (Duncan, Miller, Wampold, & Hubble, 2010).
Re-ED is rooted in practice-based evidence. It taps a rich tradition of research, provides tools for direct work with youth, and tailors interventions to the individual child in an ecosystem (Cantrell & Cantrell, 2007; Freado, 2010). Fifty years after they were developed by Nicholas Hobbs and colleagues, the Re-ED principles offer a still-current map for meeting modern challenges. Re-ED does not impose a narrowly prescribed regimen of treatment, but uses human relationships to change the world one child at a time.
Larry K. Brendtro, PhD, is Dean of the Starr Institute for Training and co-editor of this journal with Martin L. Mitchell, EdD, President and CEO of Starr Commonwealth, Albion, Michigan. They can be contacted via email at [email protected]

James Doncaster, MA, is the senior director of organizational development at Pressley Ridge in Pittsburgh, Pennsylvania, and is guest editor of this special issue on the fiftieth anniversary of the founding of Re-ED. He may be contacted at [email protected]
References
APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271–285.
Brendtro, L., & Mitchell, M. (2010). Weighing the evidence: From chaos to consilience. Reclaiming Children and Youth, 19(2), 3–9.
Cantrell, R., & Cantrell, M. (2007). Helping troubled children and youth. Memphis, TN: American Re-Education Association.
Chan, A., Hróbjartsson, A., Haahr, M., Gøtzsche, P., & Altman, D. (2004). Empirical evidence for selective reporting of outcomes in randomized trials: Comparison of protocols to published articles. JAMA, 291, 2457–2465.
Duncan, B., Miller, S., Wampold, B., & Hubble, M. (Eds.). (2010). The heart and soul of change: Delivering what works in therapy (2nd ed.). Washington, DC: American Psychological Association.
Freado, M. (2010). Measuring the impact of Re-ED. Reclaiming Children and Youth, 19(2), 28–31.
Goodheart, C. (2010). The education you need to know. Monitor on Psychology, 41(7), 9.
Gorman, D. (2008). Science, pseudoscience, and the need for practical knowledge. Addiction, 103, 1752–1753.
Littell, J. (2010). Evidence-based practice: Evidence or orthodoxy. In B. Duncan, S. Miller, B. Wampold, & M. Hubble (Eds.), The heart and soul of change: Delivering what works in therapy (2nd ed.). Washington, DC: American Psychological Association.
Principles of Re-ED

Trust between a child and adult is essential, the foundation on which all other principles rest.
Life is to be lived now, not in the past, and lived in the future only as a present challenge.
Competence makes a difference, and children should be good at something, especially at school.
Time is an ally, working on the side of growth in a period of development.
Self-control can be taught, and children and adolescents can be helped to manage their behavior.
Intelligence can be taught to cope with the challenges of family, school, and community.
Feelings should be nurtured, controlled when necessary, and explored with trusted others.
The group is very important to young people, and it can be a major source of instruction in growing up.
Ceremony and ritual give order, stability, and confidence to troubled children and adolescents.
The body is the armature of the self, around which the psychological self is constructed.
Communities are important so youth can participate and learn to serve.
A child should know some joy in each day.

Hobbs, N. (1982). The troubled and troubling child. San Francisco, CA: Jossey-Bass.
 
Experts PresentationStudentPSY 496Instructor.docx
Experts PresentationStudentPSY 496Instructor.docxExperts PresentationStudentPSY 496Instructor.docx
Experts PresentationStudentPSY 496Instructor.docx
 
Explain whether Okonkwo was remaining truthful to himself by killi.docx
Explain whether Okonkwo was remaining truthful to himself by killi.docxExplain whether Okonkwo was remaining truthful to himself by killi.docx
Explain whether Okonkwo was remaining truthful to himself by killi.docx
 
Explain How these Aspects Work Together to Perform the Primary Fun.docx
Explain How these Aspects Work Together to Perform the Primary Fun.docxExplain How these Aspects Work Together to Perform the Primary Fun.docx
Explain How these Aspects Work Together to Perform the Primary Fun.docx
 
Explain the 3 elements of every negotiation. Why is WinWin used m.docx
Explain the 3 elements of every negotiation. Why is WinWin used m.docxExplain the 3 elements of every negotiation. Why is WinWin used m.docx
Explain the 3 elements of every negotiation. Why is WinWin used m.docx
 
Explain how the Kluckhohn–Strodtbeck and the Hofstede framework ca.docx
Explain how the Kluckhohn–Strodtbeck and the Hofstede framework ca.docxExplain how the Kluckhohn–Strodtbeck and the Hofstede framework ca.docx
Explain how the Kluckhohn–Strodtbeck and the Hofstede framework ca.docx
 
Exploration 8 – Shifting and Stretching Rational Functions .docx
Exploration 8 – Shifting and Stretching Rational Functions .docxExploration 8 – Shifting and Stretching Rational Functions .docx
Exploration 8 – Shifting and Stretching Rational Functions .docx
 
Exploring Innovation in Action Power to the People – Lifeline Ene.docx
Exploring Innovation in Action Power to the People – Lifeline Ene.docxExploring Innovation in Action Power to the People – Lifeline Ene.docx
Exploring Innovation in Action Power to the People – Lifeline Ene.docx
 
Experiment 8 - Resistance and Ohm’s Law 8.1 Introduction .docx
Experiment 8 - Resistance and Ohm’s Law 8.1 Introduction .docxExperiment 8 - Resistance and Ohm’s Law 8.1 Introduction .docx
Experiment 8 - Resistance and Ohm’s Law 8.1 Introduction .docx
 
Experimental Essay The DialecticThe purpose of this paper is to.docx
Experimental Essay The DialecticThe purpose of this paper is to.docxExperimental Essay The DialecticThe purpose of this paper is to.docx
Experimental Essay The DialecticThe purpose of this paper is to.docx
 

mechanisms involved in the three pillars will lead the clinical psychologist to an obvious conclusion: EBP not only provides a framework for conceptualizing clinical problems, but also suggests a research agenda whereby patterns of wellness and illness are investigated with an eye toward how best practices are potentially mediated by unique aspects of practitioner expertise. In addition, how key patient characteristics influence treatment acceptability and help define what role the patient plays in the health care relationship are highlighted. This is not a new agenda, but is quite similar to the agenda set forth by Gordon Paul in 1969 in his now-famous ultimate clinical question, "What treatment, by whom, is most effective for this individual, with that specific problem, under which set of circumstances, and how does it come about?" (Paul, 1967, 1969, p. 44). In asking this question, Paul's goal was to draw attention to variables that needed to be described, measured, or controlled for firm evidence to accumulate across studies of psychotherapy. The agenda for evidence-based psychological practice is similar, though broader, encompassing assessment as well as treatment, psychological healthcare policy as well as clinical procedure, and populations as well as individuals. As such, expanding the scope of evidence-based psychological practice provides an opportunity for psychologists to build conceptual and methodological bridges with their colleagues in medicine, nursing, pharmacy, health professions, and public health.
Although its status as a health care delivery process is typically emphasized, EBP, initially referred to as evidence-based medicine, evolved at McMaster University as a pedagogical strategy for teaching students and practitioners how to incorporate research results into the process of patient care (McCabe, 2006; Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). As professional psychology begins to seriously consider the relevance of EBP for broad aspects of practice (Davidson & Spring, 2006), we will have to grapple with some obvious implications for (a) how we conduct practice-based research, and (b) how we educate and train our students, the next cadre of clinical researchers, to develop the knowledge, skills, and expertise to contribute to the evidence base. In this article, I discuss some of these implications with an eye toward viewing the EBP movement as an opportunity to begin to answer some of our most difficult research questions, and to begin to address some of our most vexing and persistent problems in education and training.

Practice-Based Research

From a research perspective, EBP provides a framework for investigating heretofore neglected aspects of "rubber-meets-the-road" practice. That is, confronting gaps in the evidence base from an EBP perspective draws attention to key client variables (e.g., preferences for one treatment over another, ability/willingness to adhere to treatment, credibility of treatment rationales, demographic and socioeconomic variables that enhance or impede health care access or that contribute to attitudes about treatment acceptability) and dimensions of clinical expertise (e.g., the ability to deliver the appropriate EST for the patient's problem, the ability to adapt treatments to unique clients, the ability to deliver assessments appropriate to decision-making, the ability to communicate effectively with the patient) that deserve empirical study. Practitioners face these gaps because our dominant research paradigms tend to yield data about homogeneous majority groups receiving standard treatment in optimal settings.

Thus far, most of what constitutes evidence-based psychological practice is in the area of empirically supported treatment (Chambless, 1995; Chambless et al., 1998). Currently, there are several psychological therapies with well-established efficacy for treatment of a variety of psychological problems (American Psychological Association [APA] Division 12 Dissemination Subcommittee of the Committee on Science and Practice). Continued expansion of this list to include new therapies and clinical problems, and demonstrating the portability of well-controlled efficacy studies to real world problems (effectiveness), is continuing apace (Chambless & Ollendick, 2001). A parallel expansion of the evidence base for psychological assessment procedures is needed. More research is needed regarding the diagnostic utility of assessment tools in predicting at-risk status, in helping select which treatment is indicated, or in predicting treatment response. Even in areas where the evidence for the clinical utility of assessment procedures is strong (e.g., in surgical epilepsy, where the results of presurgical evaluation of verbal memory strongly predict which patients will develop postsurgical neuropsychological morbidity; Chelune, 1995), the best available evidence has not yet caused the majority of clinicians to modify their assessment approach accordingly.

A full instantiation of EBP in psychology will require an expansion of systematic research efforts that will provide us with more information about the clinical expertise and patient domains. This represents a real opportunity to broaden the scope of EBP in psychology. How do psychological practitioners with varying levels of expertise decide which of a number of alternative treatments to utilize in the clinic? What factors make clinically efficacious treatments acceptable to patients? How does cultural diversity interact with treatment acceptability? To apply best evidence to individual clinical problems seamlessly, we need to develop a research agenda that allows us to retrieve and analyze answers to these kinds of questions. This is a daunting task, and one that seems intractable from the point of view of our exclusive reliance on quantitative research methods and controlled experiments. Perhaps this is an area in which increased knowledge of qualitative research methods (see below) would be beneficial for the field. This is an area to which practicing scientist–practitioners can provide critical information by adopting a data-driven approach to practice that incorporates measurement and reporting of assessment and treatment outcomes for purposes of further addressing effectiveness questions.

Implications for Education and Training

In a recent survey on training in ESTs, Woody, Weisz, and McLean (2005) reported that, although many doctoral training programs provided didactic dissemination of EST-related information, actual supervised training in ESTs had declined compared to a similar survey conducted in 1993. The overall conclusion was that the field had a long way to go in ensuring that our students have sufficient skill and experience to practice EST in their professional lives. The authors cited several obstacles to training in ESTs, including (a) uncertainty about what it means to train students in EBP; (b) insufficient time to provide specific training in multiple ESTs given other training priorities, including research; (c) within-program shortages of trained supervisors needed to provide a truly broad EST training experience; and (d) philosophic opposition to what some perceive as an overly rigid, manualized approach to treatment that reduces professional psychological practice to technician status. It seems obvious to me that most of these barriers imply a method of training in which competency in ESTs is built one treatment at a time, thus requiring large investments of time and faculty effort to the cause. Although it is true that students need practical training in a variety of clinical methods, one key issue is whether a goal of graduate education is to train students to competency in a critical number of ESTs, or whether the goal is to train them in broader principles of evidence-based practice that will enable them to easily adapt to novel demands for new competencies after attaining their PhD (educating for capability rather than competency; Fraser & Greenhalgh, 2001).

There is evidence that clinical psychology training directors are ready for this development. In the Woody et al. (2005) survey, some clinical training directors indicated that current practice reflects an underemphasis on broad principles of evidence-based practice in favor of learning particular procedures on a treatment-by-treatment basis. Some of the issues related to the ability of programs to provide appropriate training would be addressed if we adopted a more general principles approach. Although not particularly on point in the context of this article, it is my view that developing competencies in ESTs for research and professional practice is the joint and cumulative responsibility of doctoral programs, internships, and postdoctoral programs that work together to provide a continuum of training in knowledge and skills in EBPP.

Training in EBPP will require graduate training programs to include new content in research training curricula so that students are ready to understand and apply basic principles of EBPP in their everyday professional lives. Primary needs include training in (a) epidemiology, (b) clinical trials methodology, (c) qualitative research methods and measurement, (d) how to conduct and appraise systematic reviews and meta-analyses, and (e) building skills in informatics and electronic database searching necessary to find best available evidence relevant to the problems that students will encounter in their research and clinical work. Such content could be introduced in a basic research methods course, could be taught separately in a course on EBPP, or could be infused in the curriculum through a combination of didactic, practicum, and research experiences (for additional ideas on infusion of EBPP into the curriculum, see DiLillo & McChargue, this issue). Achieving true infusion and integration will require that all program faculty are committed to the concept of EBPP, that all will have received some basic education in EBPP themselves, and that EBPP concepts are represented throughout the curriculum.
The faculty development implications of advancing EBPP are not trivial. In the short run, an effective strategy may be to partner with colleagues in medicine, health professions, nursing, and public health to provide interdisciplinary instruction and mentoring in basic principles of EBP.

Epidemiology

Many problems important to psychologists (e.g., whether a clinical assessment tool is effective in identifying at-risk patients, whether a treatment protocol is effective in reducing psychological distress or disability in a defined population) can be conceptualized and described in epidemiological terms. For example, the strength of a treatment effect can be described with reference to the concept of "number needed to treat" (the number of patients who would need to be treated to produce one additional favorable outcome), or "number needed to harm" (the number of patients who would need to be treated for one additional unfavorable outcome to occur), or, more generally, in terms of relative or absolute risk reduction. Knowledge of basic aspects of diagnostic test performance (e.g., sensitivity, specificity, positive and negative predictive value) so critical to psychological practice can also be enhanced by forging links between these concepts and corresponding concepts in epidemiology (e.g., positive and negative likelihood ratios). A broad grounding in epidemiological methods will promote further ways of understanding and inferring causality from observational and experimental data, will further an appreciation for preventative methods, and will provide much-needed appreciation for community- and population-based methods that will complement psychology's traditional emphasis on individuals and small groups.
Clinical Trials Methodology

Although many graduate statistics and methodology courses cover such topics as case-control designs, cohort designs, and elements of randomized clinical trials (RCTs), classical methodology education in the Campbell and Stanley (1963) tradition needs to be supplemented with contemporary information relevant to clinical trials methodology. For example, training in standards for designing, conducting, and reporting clinical trials consistent with the CONSORT statement (Begg et al., 1996; Moher, Schulz, & Altman, 2001) is important so that reports of psychological clinical trials have appropriate consistency and transparency. Training in methods for reporting the size of treatment effects (going beyond statistical significance), allocating samples, specifying outcomes (relative and absolute risk reduction, number needed to treat and number needed to harm), and addressing the ethical issues of clinical trials is critically needed if psychology is to develop a truly evidence-based practice. Building the ability to evaluate the results of extant trials critically is also crucial if psychological practitioners are to meaningfully apply the best evidence standard to their own clinical work and research.
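As one illustration of reporting treatment effects in the terms just discussed, the sketch below computes absolute and relative risk reduction, the number needed to treat, and a standardized mean difference (Cohen's d) from two-arm trial data. The figures are hypothetical and exist only to show the arithmetic; they do not come from any trial cited here.

```python
import math

# Illustrative sketch (hypothetical data): effect-size style reporting
# for a two-arm trial, going beyond a bare p value.

def risk_metrics(events_tx, n_tx, events_ctrl, n_ctrl):
    """Absolute/relative risk reduction and number needed to treat."""
    risk_tx = events_tx / n_tx        # proportion with the bad outcome, treatment arm
    risk_ctrl = events_ctrl / n_ctrl  # proportion with the bad outcome, control arm
    arr = risk_ctrl - risk_tx         # absolute risk reduction
    rrr = arr / risk_ctrl             # relative risk reduction
    nnt = 1 / arr                     # number needed to treat
    return {"ARR": arr, "RRR": rrr, "NNT": nnt}

def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Standardized mean difference (group A minus group B, pooled SD)."""
    pooled_var = ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    return (mean_a - mean_b) / math.sqrt(pooled_var)

if __name__ == "__main__":
    # Hypothetical: 18 of 60 treated patients relapse vs. 30 of 60 controls.
    print(risk_metrics(events_tx=18, n_tx=60, events_ctrl=30, n_ctrl=60))
    # Hypothetical follow-up symptom scores (lower is better); a negative d
    # here means the treatment arm scored lower than the control arm.
    print(f"Cohen's d: {cohens_d(12.0, 6.0, 60, 16.5, 7.0, 60):.2f}")
```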
Qualitative Research Methods and Measurement

Clinical psychologists trained in the scientist–practitioner tradition are almost exclusively focused on quantitative research methods, with an attendant emphasis on measurement precision, quantitative statistical analysis, and tightly controlled experimental design. This scientific tradition links us with our colleagues in the natural and social sciences, and represents our preferred "way of knowing" the world. In contrast, qualitative approaches to research seek to evaluate the quality, or essence, of human experience using a fundamentally different methodological and analytic framework (Mays & Pope, 1995, 2000; Pope, Ziebland, & Mays, 2000). Many psychologists are familiar with at least some qualitative research methods exemplified, for example, in ethnography, sociometry, participant-observation, or content analysis of discourse. However, methods such as convergent interviewing, focus groups, and personal histories are generally foreign to most students in scientist–practitioner programs. As applied to health care, qualitative researchers may seek to evaluate the experiences of brain-injured patients in rehabilitative settings as a way of enhancing the design of the rehabilitation environment for purposes of maximizing recovery. They may investigate case dispositions in a child neurosurgery clinic by evaluating commonalities among physicians' notes and clinical decisions. They may evaluate treatment acceptability by interviewing patients about their experiences in treatment. It is important for psychologists to become more familiar with these methods because many systematic reviews in the EBP literature contain the results of qualitative studies (Thomas et al., 2004). Although qualitative research is generally incapable of establishing causative relationships among variables, qualitative studies may be the only (and therefore the best) source of evidence for rare conditions, and they may suggest associations worthy of future research. Reviews of this area as applied to healthcare can be found in Greenhalgh & Taylor (1997), Grypdonck (2006), Holloway (1997), and Leininger (1994).
Conducting Systematic Reviews and Meta-Analyses

The explosion of relevant medical and psychological literature has made it difficult for scientist–practitioners to have access to the best evidence at the single-study level while attending to multiple simultaneous demands for their time. For this reason, systematic reviews of the literature are becoming increasingly important as sources for state of the art information. Most graduate courses in research methodology and statistics devote little attention to conducting reviews or meta-analyses, although many programs now appear to be offering grant-writing courses or seminars. In these courses, an emphasis on design and critique of individual studies is commonplace, whereas development of skills in evaluating systematic reviews or meta-analyses is rare. If psychology is to become a key player in evidence-based practice, the next cadre of scientist–practitioners will have to develop skills in conducting and evaluating these kinds of reviews. In programming needed education and training, it is important to distinguish between narrative reviews (the kind of review that is seen, for example, in Psychological Bulletin) and systematic reviews. Narrative reviews are conducted by knowledgeable persons who often conduct the review to advance a particular theoretical conclusion. They therefore yield potentially biased conclusions because there is no consensually agreed-upon method for combining and weighting results from different studies. In contrast, systematic reviews and meta-analyses proceed according to specified methodological conventions in which the search method, the procedure for including and excluding studies, and the method for eventually calculating effect sizes or odds ratios are specified beforehand (e.g., fixed effects vs. random effects models), as are methods for determining statistical and clinical significance (Cook, Mulrow, & Haynes, 1997; Cook, Sackett, & Spitzer, 1995; Quintana & Minami, 2006). Meta-analysis is a specific form of quantitative systematic review that aggregates the results of similar studies for purposes of generating more stable conclusions from pooled data than is possible at the individual-study level (Egger, Smith, & Phillips, 1997; Rosenthal & DiMatteo, 2001; Wolf, 1986). Recent techniques allow for the calculation of bias in published studies, allowing the reader to appraise whether the results of the analysis reflect an undistorted view of effect size (Stern, Egger, & Smith, 2001). Clinical psychologists need to know these basic concepts so that they can evaluate the relevance and quality of available evidence.
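For readers who want to see the mechanics behind a quantitative synthesis, the sketch below pools study-level effect sizes with a fixed-effect, inverse-variance model and reports Cochran's Q as a rough heterogeneity check. The studies and numbers are invented for the example; a real review would also weigh random-effects models and bias diagnostics such as funnel plots or the regression approaches discussed by Stern, Egger, and Smith (2001).

```python
import math

# Illustrative sketch (hypothetical data): fixed-effect, inverse-variance
# pooling of standardized mean differences, with Cochran's Q for heterogeneity.

def fixed_effect_pool(effects, variances):
    """Pool effect sizes, weighting each study by the inverse of its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    # Cochran's Q: weighted squared deviations of study effects from the pooled effect.
    q = sum(w * (d - pooled) ** 2 for w, d in zip(weights, effects))
    return pooled, ci, q

if __name__ == "__main__":
    # Five hypothetical trials: standardized mean differences and their variances.
    effects = [0.45, 0.30, 0.62, 0.15, 0.38]
    variances = [0.04, 0.09, 0.06, 0.05, 0.08]
    pooled, ci, q = fixed_effect_pool(effects, variances)
    print(f"pooled d = {pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
    print(f"Cochran's Q = {q:.2f} on {len(effects) - 1} df")
```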
Informatics and Database Searching Skills

If a tree falls in the woods, and there is no one there to hear it, does it make a sound? This classical conundrum about the nature of reality seems relevant to the key issue of information access in evidence-based practice. If useful information about best evidence exists, but we do not or cannot access it, it cannot be brought to bear on clinical decision making (Slawson & Shaughnessy, 2005). For this reason, developing expertise in informatics and database searching is a critical step in making EBPP a reality. In my experience, most psychologists, and students of psychology, search a limited number of databases (PubMed, U.S. National Library of Medicine, 1971; PsycLIT, APA, 1967) with a single search term and (at most) a single Boolean operator. It is not uncommon for a supervisor of a fledgling student to hear that "there's nothing in the literature" about a topic the student is interested in researching. Most use very little of what is available, and many are completely unaware of many of the most important and useful resources available for EBPP. A detailed discussion of these resources is beyond my scope (see Hunt & McKibbon, 1997); nevertheless, it seems critical that some effort be devoted (either in faculty development seminars or in graduate education) to addressing database availability explicitly, including access strategies, search methodology, and approaches to information management (managing search results). A key first step in getting this accomplished may be to establish a close relationship with a librarian or library informatics specialist who can help translate educational and research needs into strategies for accessing needed information, and who can provide access to needed databases and other resources. It is not uncommon, particularly in larger institutions, for at least one member of the library staff to be particularly skilled in evidence-based medicine.

There are a number of databases that are of particular relevance to EBPP, including CINAHL (nursing and allied health; Cinahl Information Systems, 1984), EMBASE (1974), The Cochrane Library (including the Cochrane Database of Systematic Reviews [CDSR]; Cochrane Library, 1999a), the Database of Abstracts of Reviews of Effects (DARE; Cochrane Library, 1999b), the Cochrane Central Register of Controlled Trials (CENTRAL; Cochrane Library, 1999c), and the ACP Journal Club (American College of Physicians, 1994), available on the Ovid (Ovid Technologies, New York, NY) search engine (for a more in-depth discussion, see Walker & London, this issue). Obtaining access to these databases is only part of the story; the development of strategic searching skills designed to yield a manageable number of relevant search results is a key outcome goal of educational efforts that will be achieved only through actual practice in problem-based learning situations. Finally, development of a local or profession-wide resource that contains the answers to evidence-based queries (so-called critically appraised topics, or CATs) will enable students and their mentors to benefit from the evidence-based practice efforts of their colleagues. Other authors in this series have suggested ways of incorporating skill-building activities into practicum and other parts of the psychology curriculum (see Collins, Belar, & Leffingwell, this issue; DiLillo & McChargue, this issue).
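As one small example of moving beyond a single search term, the sketch below sends a Boolean query to PubMed through the NCBI E-utilities web service and prints the hit count and the first few record IDs. The endpoint, parameters, and field tags reflect the public E-utilities interface as I understand it and should be checked against current NCBI documentation; licensed resources such as PsycINFO or the Cochrane Library would be searched through a library's own interfaces instead.

```python
import requests  # third-party HTTP library (pip install requests)

# Illustrative sketch: querying PubMed via the NCBI E-utilities "esearch"
# endpoint with a Boolean search string. Endpoint and parameter names are
# assumptions based on the public E-utilities documentation.
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_search(term, retmax=10):
    """Return the total hit count and the first `retmax` PubMed IDs for a query."""
    params = {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"}
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    result = response.json()["esearchresult"]
    return int(result["count"]), result["idlist"]

if __name__ == "__main__":
    # A Boolean query combining intervention, population, and design terms.
    query = ('"cognitive behavioral therapy"[Title/Abstract] '
             'AND adolescent[MeSH Terms] '
             'AND randomized controlled trial[Publication Type]')
    count, ids = pubmed_search(query, retmax=5)
    print(f"{count} records found; first IDs: {ids}")
```

Even a toy exercise like this, done in a problem-based learning session with a librarian, demonstrates how combining field tags and Boolean operators shrinks an unmanageable result set to a reviewable one.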
The Way Forward

In this article, I have tried to highlight ways that the interdisciplinary trend toward evidence-based practice offers real opportunities to address some difficult research problems and to revitalize certain aspects of our graduate curricula. This brief analysis has likely raised more questions (e.g., How? When? By whom? In what way?) as far as the training implications are concerned, and has not dealt at all with criticisms that have been thoughtfully levied against the EBP approach to research and research training. One key issue in advancing EBP within psychology will be to pay attention to the key stage of the process by which knowledge (best evidence) is transformed into action and application. This, in my view, is the stage of the process that is least understood from a psychological viewpoint. What are the principles by which best evidence can be modified to fit the individual case? What evidence is "good enough" to drive a clinical decision? What about those aspects of psychological health care (e.g., relationship, trust, identification, and modeling) that are implicitly important in the delivery of services, but that don't themselves have large-scale independent empirical support? These (and others) are key questions we will need to grapple with as we implement an evidence base for clinical psychology and teach students how to access and use it.

With regard to pedagogy, I am convinced that the only way to go is to incorporate problem-based, real-time experiences throughout the curriculum in which students can learn to walk the EBPP walk. This is a significant undertaking with profound implications as far as faculty development is concerned. I am as skeptical of an Evidence-Based Practice Course as a way to develop the needed skills and capacities of our students as I am that a Cultural Diversity Course will somehow help build multicultural competencies. We will need to figure out how to incorporate the content, the concepts, and the techniques of evidence-based psychological practice at all levels of research and clinical training if we are to be truly successful in assimilating the EBPP way of thinking. We cannot do it all; faculty are generally not up to speed with all that is needed, and, for the practicing clinician, health care events proceed at a rapid pace. We can begin the process by equipping tomorrow's psychological practitioners with the tools necessary to implement EBPP in their everyday clinical practice. In addition, we can capitalize on the obvious opportunities to expand our multidisciplinary interdependence on other health professionals in nursing, medicine, pharmacy, and public health who are further down the EBP road than we are. Providing faculty with needed support, and developing methods for educating and training tomorrow's psychologists in EBPP, is critically needed in establishing an evidence base equal to the task of providing quality psychological health care for those that depend on us.
References

American College of Physicians. (1994). ACP Journal Club homepage. Retrieved February 15, 2007, from http://www.acpjc.org
American Psychological Association. (1967). PsycINFO homepage. Retrieved February 15, 2007, from http://www.apa.org/psycinfo/products/psycinfo.html
Begg, C., Cho, M., Eastwood, S., Horton, R., Moher, D., Olkin, I., et al. (1996). Improving the quality of reporting of randomized controlled trials: The CONSORT statement. Journal of the American Medical Association, 276, 637–639.
Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally College Publishing.
Chambless, D. L. (1995). Training and dissemination of empirically validated psychological treatments: Report and recommendations. The Clinical Psychologist, 48, 3–23.
Chambless, D. L., Baker, M. J., Baucom, D. H., et al. (1998). Update on empirically validated therapies, II. The Clinical Psychologist, 51, 3–16.
Chambless, D. L., & Ollendick, T. H. (2001). Empirically supported psychological interventions: Controversies and evidence. Annual Review of Psychology, 52, 685–716.
Chelune, G. (1995). Hippocampal adequacy versus functional reserve: Predicting memory functions following temporal lobectomy. Archives of Clinical Neuropsychology, 10, 413–432.
Cinahl Information Systems. (1984). Homepage. Retrieved February 15, 2007, from http://www.cinahl.com
Cochrane Library. (1999a). Homepage. Retrieved February 15, 2007, from http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME
Cochrane Library. (1999b). DARE homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_cldare_articles_fs.html
Cochrane Library. (1999c). Cochrane Central Register of Controlled Trials homepage. Retrieved February 15, 2007, from http://www.mrw.interscience.wiley.com/cochrane/cochrane_clcentral_articles_fs.html
Collins, F. L., Leffingwell, T. R., & Belar, C. D. (2007). Teaching evidence-based practice: Implications for psychology. Journal of Clinical Psychology, 63, 657–670.
Cook, D. J., Mulrow, C. D., & Haynes, R. B. (1997). Systematic reviews: Synthesis of best evidence for clinical decisions. Annals of Internal Medicine, 126, 376–380.
Cook, D. J., Sackett, D. L., & Spitzer, W. O. (1995). Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam Consultation on Meta-Analysis. Journal of Clinical Epidemiology, 48, 167–171.
Davidson, K. W., & Spring, B. (2006). Developing an evidence base in clinical psychology. Journal of Clinical Psychology, 62, 259–271.
DiLillo, D., & McChargue, D. (2007). Implementing evidence-based practice training in a scientist–practitioner program. Journal of Clinical Psychology, 63, 671–684.
Egger, M., Smith, G. D., & Phillips, A. N. (1997). Meta-analysis: Principles and procedures. British Medical Journal, 315, 1533–1537.
EMBASE. (1974). Homepage. Retrieved February 15, 2007, from http://www.embase.com
Fraser, S. W., & Greenhalgh, T. (2001). Coping with complexity: Educating for capability. British Medical Journal, 323, 799–803.
Greenhalgh, T., & Taylor, R. (1997). How to read a paper: Papers that go beyond numbers (qualitative research). British Medical Journal, 315, 740–743.
Grypdonck, M. H. (2006). Qualitative health research in the era of evidence-based practice. Qualitative Health Research, 16, 1371–1385.
Holloway, I. (1997). Basic concepts for qualitative research. Oxford: Blackwell Science.
Hunt, D. L., & McKibbon, K. A. (1997). Locating and appraising systematic reviews. Annals of Internal Medicine, 126, 532–538.
Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st century. Washington, DC: National Academies Press.
Leininger, M. (1994). Evaluation criteria and critique of qualitative research studies. In J. M. Morse (Ed.), Critical issues in qualitative research methods. Thousand Oaks, CA: Sage.
Mayer, D. (2004). Essential evidence-based medicine. New York: Cambridge University Press.
Mays, N., & Pope, C. (1995). Reaching the parts other methods cannot reach: An introduction to qualitative methods in health and health services research. British Medical Journal, 311, 42–45.
Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. British Medical Journal, 320, 50–52.
McCabe, O. L. (2006). Evidence-based practice in mental health: Accessing, appraising, and adopting research data. International Journal of Mental Health, 35, 50–69.
Moher, D., Schulz, K. F., & Altman, D. G. (2001). The CONSORT statement: Revised recommendations for improving the quality of reports of parallel-group randomized trials. Lancet, 357, 1191–1194.
Paul, G. L. (1967). Outcome research in psychotherapy. Journal of Consulting Psychology, 31, 109–118.
Paul, G. L. (1969). Behavior modification research: Design and tactics. In C. M. Franks (Ed.), Behavior therapy: Appraisal and status (pp. 29–62). New York: McGraw-Hill.
Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analyzing qualitative data. British Medical Journal, 320, 114–116.
Quintana, S. M., & Minami, T. (2006). Guidelines for meta-analyses of counseling psychology research. The Counseling Psychologist, 34, 839–877.
Rosenthal, R., & DiMatteo, M. R. (2001). Meta-analysis: Recent developments in quantitative methods for literature reviews. Annual Review of Psychology, 52, 59–82.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn't. British Medical Journal, 312, 71–72.
Slawson, D. C., & Shaughnessy, A. F. (2005). Teaching evidence based medicine: Should we be teaching information management instead? Academic Medicine, 80, 685–689.
Stern, J. A. C., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. British Medical Journal, 323, 101–105.
Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence based medicine: How to practice and teach EBM. Edinburgh: Elsevier/Churchill Livingstone.
Thomas, J., Harden, A., Oakley, A., Oliver, S., Sutcliffe, K., Rees, R., et al. (2004). Integrating qualitative research with trials in systematic reviews. British Medical Journal, 328, 1010–1012.
U.S. National Library of Medicine. (1971). Medline/PubMed homepage. Retrieved February 15, 2007, from http://www.ncbi.nlm.nih.gov/entrez/query.fcgi
Walker, B. W., & London, S. (2007). Novel tools and resources for evidence-based practice in psychology. Journal of Clinical Psychology, 63, 633–642.
Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. Beverly Hills, CA: Sage.
Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically supported treatments: 10 years later. The Clinical Psychologist, 58, 5–11.
  • 27. for evidence-based practice in psychology. Journal of Clinical Psychology, 63, 633– 642. Wolf, F. M. (1986). Meta-analysis: Quantitative methods for research synthesis. Beverly Hills, CA: Sage. Woody, S. R., Weisz, J., & McLean, C. (2005). Empirically supported treatments: 10 years later. The Clinical Psychologist, 58, 5–11. 694 Journal of Clinical Psychology, July 2007 Journal of Clinical Psychology DOI 10.1002/jclp from the editors Confusion abounds about what qualifies as “evidence” of effective interventions. The president of the American Psychology Associa- tion [APA] notes that “much of the research that guides evidence-based practice is too inacces- sible, overwhelming, and removed from practice” (Goodheart, 2010, p. 9). Yet lists of evidence-based treatments are being used to control funding in treatment, human services, and education. Stated simply, such policies are based on shaky science. Certainly there is no short- age of evidence that some
  • 28. methods are destructive, like withholding treatment or placing traumatized kids in toxic environments. But a wide variety of therapeu- tic interventions can have a positive impact if con- ducted within a trusting alliance. There are two very differ- ent views of what evidence is most important. Re- search in the traditional medical model compares a proposed treatment with alternates or a placebo. If a prescribed number of pub- lished studies give a statis- tical edge, the treatment is anointed as “evidence-based.” This is followed by endorsements from the National Institute of Health, the Department of Education, or other authoritative bodies. Providing lists of curative treatments may work for medicine, but this is not how to find what works in complex therapeutic relationships. Mental health research has shown that the process of enshrining Practice-Based Evidence: Back to the Future Larry K. Brendtro, Martin L. Mitchell, & James Doncaster Researchers are shifting from the medical model of studying
  • 29. treatments, to a practice- based model focusing on the nature and needs of a person in a therapeutic relationship. As seen from the articles in this special issue, this has been a central tenet of Re-ED since founded by Nicholas Hobbs fifty years ago. James Doncaster, guest editor winter 2011 volume 19, number 4 | 5 specific treatment models as evidence-based is based on flawed science (Chan, Hróbjartsson, Haahr, Gøtzsche, & Altman, 2004). Dennis Gorman (2008) of Texas A & M University documents simi- lar problems with school-based substance abuse and violence prevention research which he calls scientific nonsense. Julia Littell (2010) of the Campbell Coalition documents dozens of ways that sloppy science is being used to elevate specific treatments to evi- dence based status. Here are just a few of these research flaws: Allegiance Effect: Studies produced by advocates of a particular method are positively biased. File Cabinet Effect: Studies showing failure or no effects are tucked away and not submitted for publication. Pollyanna Publishing Effect:
  • 30. Professional journals are much more likely to publish studies that show positive effects and reject those that do not. Replication by Repetition Effect: Reviewers rely heavily on recycling findings cited by others, confusing rumor and repetition with replication. Silence the Messenger Effect: Those who raise questions about the scientific base of studies are met with hostility and ad hominem attacks. When researchers account for such biases, a clear pattern emerges. Widely touted evidence-based treat- ments turn out to be no better or no worse than other ap- proaches. Solid science speaks—success does not lie in the specific method but in common factors, the most important being the helping relationship. Re-ED uses human relationships to change the world one child at a time. Our field is in ferment as the focus of research is shifting. Instead of the study of treatments, the child now takes center stage. The practice-based model focuses on the nature and needs of an indi- vidual in an ecology (Brendtro & Mitchell, 2010). Effective interventions use research and practice expertise to target client characteristics including problems, strengths, culture, and motivation (APA, 2006). Research and evaluation measure progress and provide feedback on the quality of the
Re-ED is rooted in practice-based evidence. It taps a rich tradition of research, provides tools for direct work with youth, and tailors interventions to the individual child in an ecosystem (Cantrell & Cantrell, 2007; Freado, 2010). Fifty years after they were developed by Nicholas Hobbs and colleagues, the Re-ED principles offer a still-current map for meeting modern challenges. Re-ED does not impose a narrowly prescribed regimen of treatment, but uses human relationships to change the world one child at a time.

Larry K. Brendtro, PhD, is Dean of the Starr Institute for Training and co-editor of this journal with Martin L. Mitchell, EdD, President and CEO of Starr Commonwealth, Albion, Michigan. They can be contacted via email at [email protected]

James Doncaster, MA, is the senior director of organizational development at Pressley Ridge in Pittsburgh, Pennsylvania, and is guest editor of this special issue on the fiftieth anniversary of the founding of Re-ED. He may be contacted at [email protected]

References

APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61(4), 271–285.
Brendtro, L., & Mitchell, M. (2010). Weighing the evidence: From chaos to consilience. Reclaiming Children and Youth, 19(2), 3–9.
Cantrell, R., & Cantrell, M. (2007). Helping troubled children and youth. Memphis, TN: American Re-Education Association.
Chan, A., Hróbjartsson, A., Haahr, M., Gøtzsche, P., & Altman, D. (2004). Empirical evidence for selective reporting of outcomes in randomized trials: Comparison of protocols to published articles. JAMA, 291, 2457–2465.
Duncan, B., Miller, S., Wampold, B., & Hubble, M. (Eds.). (2010). The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.
Freado, M. (2010). Measuring the impact of Re-ED. Reclaiming Children and Youth, 19(2), 28–31.
Goodheart, C. (2010). The education you need to know. Monitor on Psychology, 41(7), 9.
Gorman, D. (2008). Science, pseudoscience, and the need for practical knowledge. Addiction, 103, 1752–1753.
Littell, J. (2010). Evidence-based practice: Evidence or orthodoxy? In B. Duncan, S. Miller, B. Wampold, & M. Hubble (Eds.), The heart and soul of change, second edition: Delivering what works in therapy. Washington, DC: American Psychological Association.
Principles of Re-ED

Trust between a child and adult is essential, the foundation on which all other principles rest.
Life is to be lived now, not in the past, and lived in the future only as a present challenge.
Competence makes a difference, and children should be good at something, especially at school.
Time is an ally, working on the side of growth in a period of development.
Self-control can be taught, and children and adolescents helped to manage their behavior.
Intelligence can be taught to cope with challenges of family, school, and community.
Feelings should be nurtured, controlled when necessary, explored with trusted others.
The group is very important to young people, and it can be a major source of instruction in growing up.
Ceremony and ritual give order, stability, and confidence to troubled children and adolescents.
The body is the armature of the self, around which the psychological self is constructed.
Communities are important so youth can participate and learn to serve.
A child should know some joy in each day.

Hobbs, N. (1982). The troubled and troubling child. San Francisco, CA: Jossey-Bass.
Dozois, D. J. A. (2013). Psychological treatments: Putting evidence into practice and practice into evidence. Canadian Psychology, 54(1), 1.