Copyright © 2020 e-Service Journal. All rights reserved. No copies of this work may be distributed in print or electronically without express written permission from Indiana University Press.
doi: 10.2979/eservicej.12.1.02
An Evaluation of Medicare’s Hospital Compare as a Decision-Making Tool for Patients and Hospitals
Sagnika Sen
Pennsylvania State University, USA
ABSTRACT
Medicare’s Hospital Compare aims to help patients make informed decisions about their choice of care through its star rating system and side-by-side comparison of hospitals. Despite the use of the rating system by hospitals as an endorsement of quality, it is not clear whether the information helps consumers make choices specific to certain diseases. Moreover, the system does not provide any guidance to hospitals as to which quality improvements lead to better outcomes and why. Using data from 4,793 hospitals, this research explores these relationships using the triad of structure, process, and outcome. Our results show that the star rating system is inadequate for making disease-specific decisions. More importantly, there is little evidence linking the structure- and process-related variables with disease-specific clinical quality outcomes.
Keywords: Medicare, Triple Aim Performance, Hospital Performance, Clinical Quality, Efficiency
INTRODUCTION
To bring transparency and efficiency to health services, the Centers for Medicare and Medicaid Services (CMS) provides consumers with a tool to assess the quality of hospitals and other health care providers in their vicinity through its Hospital Compare website (Medicare, n.d.). The data in Hospital Compare originate from
different quality and cost-effectiveness initiatives undertaken by CMS, in which individual hospitals report on various outcome and process measures regarding mortality, safety, readmissions, patient experience, and timeliness and effectiveness of care (Kaye et al., 2017). Using a complex methodology, CMS assigns a star rating on a scale of 1 through 5 (1: worst, 5: best) to individual hospitals (Hospital Compare Overall Ratings Resources, n.d.). Upon entering a zip code or a hospital name on the Hospital Compare website, a summary of nearby hospitals along with their star ratings is displayed. Up to three hospitals can then be selected for detailed side-by-side comparisons related to heart attack, heart failure, pneumonia, surgery, and other conditions. These comparisons are organized by patient satisfaction, timeliness and effectiveness of care, readmissions, and deaths, among others.
While the star rating system is widely used by patients, care providers, insurance companies, and policymakers (Mehta et al., 2020), there is also considerable debate regarding its deviance from other quality rankings (Austin et al., 2015). Furthermore, there is often little information explaining the relationship between star ratings and a specific disease outcome (e.g., acute myocardial infarction, commonly known as heart attack) due to methodological constraints of standardization and the inability to use data from low-volume hospitals (George et al., 2017). Often, there is no underlying pattern of correlation among different outcome measures, raising the concern of whether consumer decisions should rely on global ranking systems (Hu et al., 2017).
Furthermore, the Hospital Compare database is limited in its ability to provide direction to hospitals as to which quality improvement and efficiency initiatives yield better outcomes (MacLean & Shapiro, 2016). Only a handful of hospitals achieve “triple aim performance,” i.e., scoring high on all three outcome dimensions measured by CMS (clinical quality, patient experience, and efficiency) (Roth et al., 2019). Despite the vast amount of data collected by CMS regarding hospitals’ technology capabilities, quality, and cost-effectiveness initiatives, there is a lack of comprehensive studies assessing how these relate to different outcome measures.
In this regard, the current study explores the relationship of different classes of outcome variables with technology capabilities and process-related variables. The objective of the study is twofold. First, we examine whether the CMS star rating system provides sufficient information to consumers for choosing a hospital for a disease-specific condition. Second, we assess how and to what extent structural and process initiatives affect the
different outcome dimensions, such as patient satisfaction, cost efficiency, and quality. Using data from 4,793 hospitals included in the 2018 Hospital Compare database, we focus on general outcomes such as the patient survey of hospitals and spending per beneficiary, as well as readmission rates and excess days spent in care specific to acute myocardial infarction (AMI), commonly known as heart attack.
While the CMS star rating is used by hospitals as an endorsement of quality, there is a lack of understanding as to whether these ratings really help patients and family members in their choice of care. More importantly, to the best of our knowledge there are no studies exploring the causal relationship between structural and process variables and hospital performance.
The rest of the paper is organized as follows. We present a brief review of the literature in the next section, followed by a description of our data and methodology. Analysis and discussion of the results are presented next. Finally, we discuss the limitations of the study and offer concluding remarks.
LITERATURE REVIEW
In a seminal article, Donabedian (1966) proposed using the triad of structure, process, and outcome to evaluate the quality of health care. Ever since its introduction, the Donabedian framework has been among the most cited in health services research, especially regarding the theory and practice of quality assurance in healthcare (Ayanian & Markel, 2016).
According to the Donabedian framework, structure is defined as the settings where healthcare takes place and includes provider qualifications and organizational characteristics. Process includes the functions surrounding the delivery of care, such as diagnosis, treatment, and prevention. Finally, outcome relates to the effect of healthcare services on the patient and population. These concepts were further extended to identify different dimensions of quality (Donabedian, 1990) and still constitute the foundation of quality assessment. In the following, we briefly describe the extant literature on each of the three dimensions of structure, process, and outcome as it relates to healthcare research.
Structural Measures
One of the most important structural measures arguably revolves around a hospital’s technology capabilities. While the Donabedian framework includes provider
qualification, we note that hospitals participating in CMS programs such as Medicare and Medicaid have standard qualification rules for their doctors and nurses; as such, this dimension would have a similar effect across hospitals. However, since the introduction of the HITECH (Health Information Technology for Economic and Clinical Health) Act in 2009, considerable emphasis has been placed on hospital capabilities regarding electronic health records (EHR), especially the ability to collect, receive, and transmit patient healthcare records in a standardized format. Hospitals were incentivized to achieve “meaningful use” of EHR with respect to healthcare quality (Gholami et al., 2015).
A significant body of academic research has explored the relationship between technology and healthcare quality (Chaudhry et al., 2006). A longitudinal study of hospitals in the US has shown that healthcare technology usage is associated not only with increases in healthcare quality but also with reduced operating costs (Bardhan & Thouin, 2013). Also, investments in technology lead hospitals to disclose quality measures voluntarily (Angst et al., 2014).
While the extant literature has predominantly shown positive effects of healthcare technology (Buntin et al., 2011), a recent article also cites the existence of the “productivity paradox” seen earlier in the manufacturing sector (Bui et al., 2018). That study of hospitals in the state of New York shows only mixed outcomes after considerable investment in technology; in particular, the research found no evidence of a relationship between technology use and patient satisfaction, mortality, or readmission rates. The authors call for further research to explore the causal linkage between technology use and specific outcomes such as patient satisfaction, spending, mortality, and readmission rates.
Process Measures
The quality improvement literature has long recognized the role of process management in impacting outcomes. Quality initiatives such as Six Sigma aim to improve quality through a rational modularization and streamlining of workflows, followed by the implementation of standardized best practices (McCormack et al., 2009). Healthcare organizations have embraced various process improvement initiatives to improve hospital efficiency, clinical outcomes, and patient experience (Roth et al., 2019). In general, these programs have resulted in improved outcomes (Zheng et al., 2018).
In order to reduce the number of preventable medical errors, CMS developed a set of best practices to improve care delivery. These processes are specifically aimed at improving care for acute myocardial infarction (heart attack), heart failure, and pneumonia, as well as surgical processes and infections. It has been shown that participating in process improvement initiatives for heart attack resulted in improved clinical outcomes, i.e., reduced mortality and readmission rates (Ding, 2015). However, other studies have shown that hospitals’ emphasis on process management leads to increases in clinical quality but reductions in patient satisfaction (Chandrasekaran et al., 2012).
Measuring Healthcare Service Outcomes
Effectiveness and efficiency are inherent indicators of process performance and have been captured in the literature as quality and efficiency (Melville et al., 2004). Quality can be measured in terms of process results and is determined by how well a process meets the customer’s needs. In the context of healthcare, quality can be measured by customer perceptions and/or rankings and ratings provided by insurance agencies (e.g., Medicare) and independent third parties (e.g., US News and World Report).
Efficiency, on the other hand, is a simple ratio of output to input and represents how well the results are achieved. Recent literature in healthcare services has emphasized triple aim performance: clinical quality, patient satisfaction, and reduction in cost (Roth et al., 2019; Zheng et al., 2018). We adopt all three outcome measures in the analysis described below.
DATA AND METHODOLOGY
This research utilizes data from CMS Hospital Compare (Medicare, n.d.) for the year 2018. A total of 4,793 acute care hospitals registered with Medicare are included in the database. Hospital Compare reports information collected from the hospitals on various performance metrics such as spending, quality and efficiency of care, HIT implementation, and customer satisfaction. In addition, CMS provides ranking and benchmarking for each of the hospitals. Information regarding Veterans Administration hospitals, children’s hospitals, and critical access hospitals is also included in Hospital Compare but was not part of the current study.
Details of the variables used in this study are provided in Table 1. As previously mentioned, the triple aim performance goals are used. For patient satisfaction, we use the aggregate scores from the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) patient experience survey. In addition, the CMS overall star rating is also used. For cost reduction/efficiency, the Medicare Spending Per Beneficiary (MSPB) is used. MSPB is a price-standardized, risk-adjusted measure of spending efficiency (Trzeciak et al., 2017). It assesses the cost of services performed by hospitals and other healthcare providers during the period immediately prior to, during, and following a beneficiary’s hospital stay, compared to a national median hospital. The measure adjusts for geographic differences, patient severity, and age (Medicare Spending Per Beneficiary (MSPB) Measure Methodology, n.d.).
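To make the ratio concrete, the following toy illustration shows how an MSPB-style score is read; the dollar figures are hypothetical, and this is a simplification, not CMS’s actual price-standardization and risk-adjustment methodology:

```python
# Toy reading of an MSPB-style score: episode spending for one hospital
# divided by the national median, so values above 1 indicate spending more
# than the national benchmark. Both figures below are hypothetical.
national_median_episode_cost = 20_000.0
hospital_avg_episode_cost = 19_600.0

mspb_score = hospital_avg_episode_cost / national_median_episode_cost
print(mspb_score)  # -> 0.98, i.e., slightly below the national benchmark
```

Note that 0.98 is also the mean MSPB reported in Table 2, meaning the average hospital in the sample spends just under the national benchmark.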
For clinical quality, the heart attack measures are chosen. Since hospital performance varies across different disease and treatment conditions, we chose to focus on one disease (Hu et al., 2017). In the past, disease-specific mortality and readmission rates were used as standard clinical quality outcomes. However, these measures sometimes created skewed incentives for hospitals (Psotka et al., 2020). Consequently, more recent measures by CMS include Excess Days in Care instead of mortality, which captures unplanned patient encounters such as observation stays and emergency department visits within 30 days post discharge (Horwitz et al., 2018).
The process variables are a combination of heart attack specific measures (e.g., the percentage of patients who were admitted with complaints of chest pain and received aspirin) and general emergency department (ED) throughput measures (e.g., time spent in the ED). We have also included emergency department volume as one of the control variables.
Structural measures included health information technology (HIT) related measures, as well as safety measures. Descriptive statistics of all variables are provided in Table 2.
Table 1: Structure, Process, and Outcome Measures for Hospitals

Structure
  Health Information Technology (HIT)
  • OP_12: Ability to receive lab results electronically
  • OP_17: Ability to track lab results, tests, referrals between visits electronically
  Safe Surgery Checklist use
  • SM_SS_CHECK: Inpatient
  • OP_25: Outpatient
  Safety
  • SM_HS_PATIENT_SAF: Use of hospital survey on Patient Safety culture

Process
  Timely and Effective care: Heart Attack*
  • OP_4: Outpatients with chest pain or possible heart attack who received aspirin within 24 hours of arrival or before transferring from the emergency department
  • OP_5: Average number of minutes before outpatients with chest pain or possible heart attack got an ECG
  Timely and Effective care: Emergency Department Throughput
  • ED_1b: Average time patients spent in the emergency department before they were admitted to the hospital as an inpatient
  • OP_18b: Average time patients spent in the emergency department before leaving from the visit
  • OP_20: Average time patients spent in the emergency department before they were seen by a healthcare professional
  • OP_22: % of patients who left without being seen at the emergency department
  Other
  • EDV: Emergency department volume

Outcome
  Rating
  • H_HSP_RATING_LINEAR_SCORE: Overall hospital rating from patient survey
  • Hospital overall rating: Star rating by CMS (1–5; 1: worst, 5: best)
  Efficiency
  • MSPB: Medicare Spending per Beneficiary (>1: spending more than national average; <1: spending less than national average)
  Quality: Heart Attack
  • EDAC_30_AMI: Excess days in acute care, the number of days a patient spends in an emergency department, hospital observation unit, or hospital inpatient unit within 30 days after the date of discharge from hospitalization for heart attack
  • READM_30_AMI: Rate of readmission for heart attack patients

*Two other Heart Attack measures, related to treatment for breaking up blood clots, had a very low reporting rate and are not used in the analysis.
Table 2: Descriptive Statistics

Structural measures (hospital counts)                       Yes    No    Not Available
OP_12: Ability to receive lab results electronically        3213   290   1290
OP_17: Ability to track lab results, tests,
  referrals between visits electronically                   3127   372   1294
SM_SS_CHECK: Inpatient Safe Surgery checklist               3458   129   1206
OP_25: Outpatient Safe Surgery checklist                    3444    97   1252
SM_HS_PATIENT_SAF: Use of hospital survey on
  Patient Safety culture                                    2680   864   1249

EDV: Emergency Department Volume                            Frequency
Low (0–19,999 patients annually)                            1303
Medium (20,000–39,999 patients annually)                     962
High (40,000–59,999 patients annually)                       603
Very High (60,000+ patients annually)                        699
Not Available                                               1226

Hospital overall rating: Star rating by CMS                 Frequency
1                                                            259
2                                                            750
3                                                           1178
4                                                           1153
5                                                            335
Not Available                                               1118

Continuous measures                                         N      Mean     Std. Deviation
OP_4: Aspirin at Arrival                                    2601    94.6      6.54
OP_5: Median Time to ECG                                    2649     8.27     6.19
ED_1b: Average time in ED, arrival to departure
  as inpatient                                              3919   273.4    111.12
OP_18b: Average time patients spent in the ED               3812   141.41    41.7
OP_20: Door to diagnostic eval                              3816    22.18    13.52
OP_22: Left before being seen                               3560     1.72     1.76
MSPB: Medicare Spending per Beneficiary                     3134     0.98     0.09
EDAC_30_AMI: Excess days in acute care (heart attack)       2123     7.07    22.67
READM_30_AMI: Readmission rate (heart attack)               2123    16.02     1.08
Table 3: Regression Results*

                                                    Survey    Rating    MSPB      Excess    Readm.
(Constant)                                          92.141     3.472    1.051    -19.635    15.53
OP_4_Score: Aspirin at Arrival                                 0.006   -0.001      0.07
OP_5_Score: Median Time to ECG                      -0.023
ED_1b_Score: Average time in ED, arrival to
  departure as inpatient                            -0.015    -0.004    0.001                0.001
OP_18b_Score: Average time patients spent
  in the ED                                          0.015     0.004
OP_20_Score: Door to diagnostic eval                -0.011    -0.001
OP_22_Score: Left before being seen                 -0.256    -0.095
OP_25: Outpatient Safe Surgery checklist = Yes      -1.317
OP_25: Outpatient Safe Surgery checklist = No                                   -82.579
SM_HS_PATIENT_SAF = Yes                                                 0.01       3.311     0.175
SM_SS_CHECK = No                                                                  95.633
EDV = 1: Low                                                   0.939   -0.017
EDV = 2: Medium                                               -0.317
EDV = 3: High                                                 -0.473    0.02
EDV = 4: Very High                                   0.497    -0.398    0.022      3.199
adjusted R2                                          0.211     0.177    0.084      0.088     0.014

*All coefficients shown are significant at p = 0.01. Survey = Patient Survey Score; Rating = Hospital Rating by CMS; MSPB = Medicare Spending per Beneficiary; Excess = Excess Days: Heart Attack; Readm. = Readmission Rate: Heart Attack.
These different pieces of data reside in separate reports within Hospital Compare, indexed by each hospital. Once data from these different sources were combined, separate regression models were run for each outcome variable. For categorical variables, the “Not Available” group was used as the baseline. Results of the regression are provided in Table 3.
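As a minimal sketch of this pipeline, the snippet below merges two synthetic stand-ins for the separate Hospital Compare reports on a hospital identifier, dummy-codes a categorical predictor with “Not Available” as the dropped baseline, and fits one ordinary-least-squares model. The column names, data values, and exact model specification are hypothetical, not the paper’s actual files:

```python
# Sketch: merge per-hospital reports, dummy-code with "Not Available" as the
# baseline, and fit OLS for one outcome (synthetic data, assumed names).
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500
process = pd.DataFrame({
    "provider_id": range(n),
    "OP_4": rng.normal(94.6, 6.5, n),   # aspirin at arrival (%), scale per Table 2
    "EDV": rng.choice(["Low", "Medium", "High", "Very High", "Not Available"], n),
})
outcomes = pd.DataFrame({
    "provider_id": range(n),
    "patient_survey": rng.normal(70.0, 8.0, n),
})
df = process.merge(outcomes, on="provider_id")   # combine the separate reports

# Drop the "Not Available" dummy so that group serves as the reference level.
dummies = pd.get_dummies(df["EDV"]).drop(columns="Not Available")
X = np.column_stack([np.ones(n), df["OP_4"], dummies.to_numpy(dtype=float)])
y = df["patient_survey"].to_numpy()

beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS coefficient estimates
print(len(beta))                                  # -> 6: intercept + 5 predictors
```

The same fit would be repeated once per outcome column (patient survey, CMS rating, MSPB, excess days, readmission rate) to reproduce the five models of Table 3.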
RESULTS
A first glance at the results reveals that not all outcome variables are equally impacted by the structure and process variables. Survey-based patient satisfaction and the CMS-computed star rating are the outcomes best explained, as evidenced by their adjusted R2 values of 21.1% and 17.7%, respectively. The efficiency measure, Medicare Spending Per Beneficiary (MSPB), and one of the heart attack related measures (excess days of care) have moderate adjusted R2 values, whereas heart attack readmission rates are barely impacted by the structure and process related variables. In the following sections, the structure-outcome and process-outcome relationships are discussed in detail.
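Since the model comparison above rests on adjusted R2, it may help to recall that the statistic penalizes each added predictor. The `n` and `p` in the worked check below are hypothetical; the paper does not report per-model degrees of freedom:

```python
# Adjusted R^2: R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1),
# where n is the number of observations and p the number of predictors.
def adjusted_r2(r2: float, n: int, p: int) -> float:
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# With a hypothetical n = 3000 hospitals and p = 14 predictors, a raw R^2
# of 0.215 shrinks only slightly:
print(round(adjusted_r2(0.215, 3000, 14), 3))  # -> 0.211
```

The penalty matters here because the models mix continuous scores with multi-level categorical dummies, so raw R2 alone would flatter the larger specifications.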
Structure-Outcome Relationships
Interestingly, the two HIT variables did not have any effect on any of the five outcomes, despite about two-thirds of the hospitals reporting both capabilities. While this seems counterintuitive, recent research suggests that electronic healthcare capabilities cannot be fully harnessed unless the organization’s capabilities are built to exploit those technologies (Jena et al., 2020). Not having an inpatient safe surgery checklist (compared to not reporting on this measure) increased excess days of care considerably; however, this measure did not have any effect on the other four outcomes. The outpatient safe surgery checklist, on the other hand, was associated with reduced patient satisfaction (compared to hospitals that did not report on the checklist). A possible explanation may be that it increased the time taken for outpatient procedures. Also, hospitals that did not have an outpatient safe surgery checklist had reduced excess days. Finally, hospitals that used a survey of patient safety culture saw increases in both spending and excess days of care.
Process-Outcome Relationships
For the process variables specific to heart attack care, administering aspirin has a positive effect on the CMS hospital rating as well as on reducing spending per beneficiary. Surprisingly, it also slightly increases excess days in acute care. The average time it takes for a probable heart attack patient to get an ECG reduces patient satisfaction but does not have any effect on the other outcome variables.
The average time spent in the emergency department (ED) by patients who were ultimately admitted as inpatients reduces both the patient and CMS ratings, increases spending per beneficiary, and increases readmission rates. Overall time spent in the ED for all patients increases both patient satisfaction and CMS ratings. The percentage of people who left the ED before being seen reduces both patient satisfaction and the CMS rating.
A hospital’s emergency department volume seems to play a significant role for most outcomes. In general, higher-volume hospitals had lower satisfaction, lower ratings, more spending, and more excess days. Not all volume categories have the same impact on the outcome variables, though. Only the very high-volume hospitals had more excess days. For both spending per beneficiary and CMS rating, ED volume, which can serve as a proxy for hospital size, was associated with increased spending and lower ratings.
DISCUSSION
One of the key findings from our analysis is that the CMS overall rating provides a broad overview of hospital performance. All outcomes show an improving trend towards the higher star-rated hospitals. However, while the structure and process variables explain quite a bit about patient satisfaction and CMS-computed hospital ratings, they provide less information regarding spending efficiency, and even less for disease-specific clinical outcomes. In other words, while the current structure and process-related variables demonstrably improve patient satisfaction, their impact on reducing unplanned visits and readmission rates is not evident. A closer look at the distribution of excess days and readmission rates shows a significant overlap of these measures across hospital ratings (Figure 1), implying that even hospitals in the high-star-rating category may have less-than-standard outcomes for heart attack patients. Interestingly, hospitals that were not assigned a star rating by CMS had worse performance than those that received
star ratings of 4 and 5, but on par with or slightly better than those with ratings 1–3. It should be noted that consumers do not have ready access to the clinical quality scores through the Hospital Compare website and are only shown the performance of the hospital compared to the national median. In order to access the actual scores, patients have to look through the enormous number of data files in the archives.
Figure 1: Heart Attack Readmission Rates and Excess Days In Care Across Hospital Rating
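The kind of overlap Figure 1 depicts can also be checked numerically. The sketch below uses synthetic data and assumed column names, comparing interquartile ranges of heart attack readmission rates across star-rating groups; heavily overlapping IQRs mean the star rating alone cannot separate hospitals on this disease-specific outcome:

```python
# Overlap check behind a Figure-1-style comparison (synthetic data with a
# weak rating effect, roughly on the READM_30_AMI scale from Table 2).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
ratings = rng.choice([1, 2, 3, 4, 5], size=2000)
readm = rng.normal(16.2 - 0.1 * ratings, 1.0)   # slight improvement with rating
df = pd.DataFrame({"rating": ratings, "READM_30_AMI": readm})

# Per-rating 25th and 75th percentiles of the readmission rate.
iqr = df.groupby("rating")["READM_30_AMI"].quantile([0.25, 0.75]).unstack()
overlap = iqr.loc[1, 0.25] < iqr.loc[5, 0.75]   # 1-star Q1 below 5-star Q3?
print(iqr.round(2))
print("IQRs overlap:", bool(overlap))
```

If the IQRs of the worst- and best-rated groups overlap, as they do here by construction, a patient cannot infer a hospital’s heart attack readmission performance from its star rating alone.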
In summary, the CMS star ratings, while providing a general overview of a hospital’s performance, may not be the best way to choose care for a specific disease. More importantly, the structure and process variables currently captured by CMS fail to provide hospitals with any insights as to which initiatives result in better clinical and spending outcomes.
CONCLUSIONS AND FUTURE RESEARCH
In this study, we assess the utility of the Hospital Compare star rating service in helping patients make informed decisions about their choice of care. We also explore which structure and process variables impact different dimensions of hospital performance, and how. Our analysis highlights the shortcomings of the current service for both patients and providers.
At this point, the limitations of our study should be recognized. This is a cross-sectional study of hospitals reporting on many of their process and quality related initiatives. Since CMS does not report any data where the number of cases is very small, some methodological issues arise regarding the underestimation of quality risks at low-volume hospitals (George et al., 2017). More information regarding variation in patient demographics as well as hospital characteristics (size, urban/rural location) should be included in future studies to appropriately assess clinical quality. Apart from a low volume of cases, some hospitals did not report performance data on excess days and quality of care, although they reported other process and structural measures. Further longitudinal studies may investigate whether the proportion of hospitals reporting these measures increases over time, and whether such changes explain the causal relationship between process initiatives and quality measures.
REFERENCES
Angst, C., Agarwal, R., Gordon, G., Khuntia, J., & McCullough, J. S. (2014). Information technology and voluntary quality disclosure by hospitals. Decision Support Systems, 57.
Austin, M. M., Jha, A. K., Romano, P. S., Singer, S. J., Vogus, T. J., Wachter, R. M., & Pronovost, P. J. (2015). National hospital ratings systems share few common scores and may generate confusion instead of clarity. Health Affairs, 34(3), 423–430. https://doi.org/10.1377/hlthaff.2014.0201
Ayanian, J. Z., & Markel, H. (2016). Donabedian’s lasting framework for health care quality. New England Journal of Medicine, 375(3), 205–207. https://doi.org/10.1056/NEJMp1605101
Bardhan, I., & Thouin, M. F. (2013). Health information technology and its impact on the quality and cost of healthcare delivery. Decision Support Systems, 55(2), 438–449. https://doi.org/10.1016/j.dss.2012.10.003
Bui, Q. “Neo,” Hansen, S., Liu, M., & Tu, Q. (John). (2018). The productivity paradox in health information technology. Communications of the ACM, 61(10), 78–85. https://doi.org/10.1145/3183583
Buntin, M. B., Burke, M. F., Hoaglin, M. C., & Blumenthal, D. (2011). The benefits of health information technology: A review of the recent literature shows predominantly positive results. Health Affairs, 30(3), 464–471. https://doi.org/10.1377/hlthaff.2011.0178
Chandrasekaran, A., Senot, C., & Boyer, K. K. (2012). Process management impact on clinical and experiential quality: Managing tensions between safe and patient-centered healthcare. Manufacturing and Service Operations Management, 14(4), 548–566. https://doi.org/10.1287/msom.1110.0374
Chaudhry, B., Wang, J., Wu, S., Maglione, M., Mojica, W., Roth, E., Shekelle, P. G. (2006). Improving patient care. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Annals of Internal Medicine, 144(10), 742–752. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=cin20&AN=2009195180&site=ehost-live
Ding, X. (2015). The impact of service design and process management on clinical quality: An exploration of synergetic effects. Journal of Operations Management, 36, 103–114. https://doi.org/10.1016/j.jom.2015.03.006
Donabedian, A. (1966). Evaluating the quality of medical care. Milbank Memorial Fund Quarterly, 44(3), 166–206. Reprinted in Milbank Memorial Fund Quarterly, 2005, 83(4), 691–729.
Donabedian, A. (1990). The seven pillars of quality. Archives of Pathology and Laboratory Medicine, 114, 1115–1118.
George, E. I., Ročková, V., Rosenbaum, P. R., Satopää, V. A., & Silber, J. H. (2017). Mortality rate estimation and standardization for public reporting: Medicare’s Hospital Compare. Journal of the American Statistical Association, 112(519), 933–947. https://doi.org/10.1080/01621459.2016.1276021
Gholami, R., Añón Higón, D., & Emrouznejad, A. (2015). Hospital performance: Efficiency or quality? Can we have both with IT? Expert Systems with Applications, 42(12), 5390–5400. https://doi.org/10.1016/j.eswa.2014.12.019
Horwitz, L. I., Wang, Y., Altaf, F. K., Wang, C., Lin, Z., Liu, S., Herrin, J. (2018). Hospital characteristics associated with postdischarge hospital readmission, observation, and emergency department utilization. Medical Care, 56(4), 281–289. https://doi.org/10.1097/MLR.0000000000000882
Medicare. (n.d.). Hospital Compare. https://www.medicare.gov/hospitalcompare/search.html
Hospital Compare Overall Ratings Resources. (n.d.). QualityNet. Retrieved July 31, 2020, from https://www.qualitynet.org/inpatient/public-reporting/overall-ratings/resources
Hu, J., Jordan, J., Rubinfeld, I., Schreiber, M., Waterman, B., & Nerenz, D. (2017). Correlations among hospital quality measures: What “Hospital Compare” data tell us. American Journal of Medical Quality, 32(6), 605–610. https://doi.org/10.1177/1062860616684012
Jena, R., Rudramuniyaiah, P. S., & Shah, V. (2020). A framework for reconciling care coordination efficiency and effectiveness using e-service implementation ambidexterity. E-Service Journal, 11(3). https://doi.org/10.2979/eservicej.11.3.03
Kaye, D. R., Norton, E. C., Ellimoottil, C., Ye, Z., Dupree, J. M., Herrel, L. A., & Miller, D. C. (2017). Understanding the relationship between the Centers for Medicare and Medicaid Services’ Hospital Compare star rating, surgical case volume, and short-term outcomes after major cancer surgery. Cancer, 123(21), 4259–4267. https://doi.org/10.1002/cncr.30866
MacLean, C., & Shapiro, L. (2016). Does the Hospital Compare 5-star rating promote public health? https://doi.org/10.1377/hblog20160908.056393
McCormack, K., Willems, J., van den Bergh, J., Deschoolmeester, D., Willaert, P., Indihar Štemberger, M., Vlahovic, N. (2009). A global investigation of key turning points in business process maturity. Business Process Management Journal, 15(5), 792–815. https://doi.org/10.1108/14637150910987946
Medicare Spending Per Beneficiary (MSPB) Measure Methodology. (n.d.). QualityNet. Retrieved August 3, 2020, from https://www.qualitynet.org/inpatient/measures/mspb/methodology
Mehta, R., Paredes, A. Z., Tsilimigras, D. I., Farooq, A., Sahara, K., Merath, K., Pawlik, T. M. (2020). CMS Hospital Compare system of star ratings and surgical outcomes among patients undergoing surgery for cancer: Do the ratings matter? Annals of Surgical Oncology, 27, 3138–3146. https://doi.org/10.1245/s10434-019-08088-y
Melville, N., Kraemer, K. L., & Gurbaxani, V. (2004). Review: Information technology and organizational performance: An integrative model of IT business value. MIS Quarterly, 28(2), 283–322.
Psotka, M. A., Fonarow, G. C., Allen, L. A., Joynt Maddox, K. E., Fiuzat, M., Heidenreich, P., O’Connor, C. M. (2020). The hospital readmissions reduction program: Nationwide perspectives and recommendations. JACC: Heart Failure, 8(1), 1–11. https://doi.org/10.1016/j.jchf.2019.07.012
Roth, A., Tucker, A. L., Venkataraman, S., & Chilingerian, J. (2019). Being on the productivity frontier: Identifying “triple aim performance” hospitals. Production and Operations Management, 28(9), 2165–2183. https://doi.org/10.1111/poms.13019
Trzeciak, S., Gaughan, J. P., Bosire, J., Angelo, M., Holzberg, A. S., & Mazzarelli, A. J. (2017). Association between Medicare star ratings for patient experience and Medicare spending per beneficiary for US hospitals. Journal of Patient Experience, 4(1), 17–21. https://doi.org/10.1177/2374373516685938
Zheng, Z. (Eric), Bardhan, I., & Ayabakan, S. (2018). Did the hospital readmission reduction program achieve triple aim goals? Evidence from healthcare data analytics. In Pacific Asia Conference on Information Systems (PACIS). PACIS. Retrieved
from https://aisel.aisnet.org
/pacis2018/207
Nidhi Singh is Assistant Professor and Dean (Students Affairs) at Jaipuria Institute of Management, Noida. She is an active researcher enrolled with IP University, Delhi, and has also qualified the UGC NET. She has presented many papers at various seminars and conferences, including IIMR, IICA, and NLSIU, and has published in journals of national and international repute such as the International Journal of Information Management (Elsevier), Journal of Retailing and Consumer Services (Elsevier), International Journal of Bank Marketing (Emerald), Decision (Springer), Management and Labour Studies (Sage), and the International Journal of Sustainable Strategic Management (Inderscience), as well as FIIM, SERD, and GSCCR.
Dr. Sagnika Sen is an Associate Professor of Information Systems in the School of Graduate Professional Studies at Pennsylvania State University. She received her Ph.D. from Arizona State University. Her research focuses on process performance, metrics, and incentive design in organizations, mainly the design of effective decision-making frameworks and the use of data-driven decision models to obtain analytical insights on processes and performance measures. She has published in top academic journals in the field of Information Systems, such as Information Systems Research and Journal of Management Information Systems. Her work has also appeared in other prestigious academic outlets such as Decision Support Systems, Information and Management, Communications of the ACM, Human Resources Management, Service Sciences, and Journal of Managerial Psychology.
6/25/22, 12:42 AM Library OneSearch
https://eds.p.ebscohost.com/eds/delivery?sid=4d430855-5c95-
4660-9974-
129b72c95a55%40redis&vid=1&ReturnUrl=https%3a%2f%2fed
s.p.ebscoh… 1/15
References
Saunders, H., Gallagher-Ford, L., Kvist, T., & Vehviläinen-Julkunen, K. (2019). Practicing healthcare professionals' evidence-based practice competencies: An overview of systematic reviews. Worldviews on Evidence-Based Nursing, 16(3), 176. https://doi.org/10.1111/wvn.12363
Practicing Healthcare Professionals' Evidence-Based Practice Competencies: An Overview of Systematic Reviews
Background: Evidence-based practice (EBP) competencies are essential for all practicing healthcare professionals to provide evidence-based, quality care and improved patient outcomes. The multistep EBP implementation process requires multifaceted competencies to successfully integrate best evidence into daily healthcare delivery. Aims: To summarize and synthesize the current research literature on practicing health professionals' EBP competencies (i.e., their knowledge, skills, attitudes, beliefs, and implementation) related to employing EBP in clinical decision-making. Design: An overview of systematic reviews. Methods: PubMed/MEDLINE, CINAHL, Scopus, and Cochrane Library were systematically searched for publications on practicing healthcare professionals' EBP competencies published between January 2012 and July 2017. A total of 3,947 publications were retrieved, of which 11 systematic reviews were eligible for a critical appraisal of methodological quality. Three independent reviewers conducted the critical appraisal using the Rapid Critical Appraisal tools developed by the Helene Fuld National Institute for Evidence-Based Practice in Nursing & Healthcare. Results: Practicing healthcare professionals' self-reported EBP knowledge, skills, attitudes, and beliefs were at a moderate to high level, but they did not translate into EBP implementation. Considerable overlap existed in the source studies across the included reviews. Few reviews reported any impact of EBP competencies on changes in care processes or patient outcomes. Most reviews were methodologically of moderate quality. Significant variation in study designs, settings, interventions, and outcome measures in the source studies precluded any comparisons of EBP competencies across healthcare disciplines. Linking Evidence to Action: As EBP is a shared competency, the development, adoption, and use of an EBP competency set for all healthcare professionals are a priority, along with using actual (i.e., performance-based), validated outcome measures. The widespread misconceptions and misunderstandings that still exist among large proportions of practicing healthcare professionals about the basic concepts of EBP should urgently be addressed to increase engagement in EBP implementation and attain improved care quality and patient outcomes.
Keywords: evidence-based practice; knowledge; competence; systematic review; healthcare professional
Knowledge of the principles of evidence-based practice (EBP) and skills to perform the steps of the EBP implementation process are essential competencies for all practicing healthcare professionals (Melnyk, Gallagher-Ford, & Fineout-Overholt, [16]). In nursing, competence has been defined as the "ability to perform the task with desirable outcomes under the varied circumstances of the real world" (Benner, [3], p. 304), referring to the expected knowledge, attitudes, beliefs, skills, and abilities (i.e., competencies) for successful performance of critical work functions. In health care, "core competencies offer a common shared language for all health professions for defining what all are expected to be able to do to work optimally" (Albarqouni et al., [1], p. 2). However, defining core competencies in EBP (i.e., outlining the expected EBP knowledge, skills, attitudes, beliefs, and implementation, which are crucially important for improving care quality and patient outcomes because they enable healthcare professionals to make clinical decisions grounded on best available evidence and integrate the evidence into their daily practice; Melnyk et al., [18]; Wallen et al., [34]) has been a relatively recent development both in nursing (Melnyk et al., [16]; Stevens, [28]) and in health care (Albarqouni et al., [1]). Moreover, the uptake and use of the EBP core competencies in daily practice have been slow, which hinders healthcare organizations from delivering the highest quality, evidence-based health care via consistent, broad-based EBP implementation. Furthermore, systematic integration of best evidence into practice is challenging due to the complexity of the EBP implementation process consisting of multiple sequential steps, the mastery of which requires multifaceted interventions, such as developing individual readiness for EBP, translating and ensuring availability of best evidence in usable forms for clinical practice, and building organizational readiness, culture, and structures supportive of EBP (Melnyk, Gallagher-Ford, & Fineout-Overholt, [17]; Saunders, Vehviläinen-Julkunen, & Stevens, [25]).
Similar to the idea of EBP itself (DiCenso, Cullum, & Ciliska, [6]; Sackett, Rosenberg, Gray, Haynes, & Richardson, [22]), the realization of the importance for all healthcare professionals to develop a sufficient level of EBP competence is not new, as the first Sicily statement (Dawes et al., [5]) outlined that it is a minimum requirement for all healthcare professionals to understand and implement the principles and process of EBP. To this end, two sets of nurses' EBP competencies have been developed through separate national consensus processes in the USA to evaluate practicing nurses' abilities to employ EBP (Melnyk et al., [16]) and to guide EBP professional development and education programs in nursing (Stevens, [28]). However, the EBP competencies published thus far in nursing have been self-reported and discipline-specific (i.e., they have focused on measuring the perceived EBP competencies of nurses). Although a few actual (i.e., performance-based) evaluation tools have been developed in the last 10 years for more objective measurement of EBP competencies, they have also been discipline-specific and undertaken primarily in the fields of medicine, occupational therapy, physical therapy, and, most recently, nursing (Halm, [8]; Ilic, Nordin, Glasziou, Tilson, & Villanueva, [10]; Laibhen-Parkes, Kimble, Melnyk, Sudia, & Codone, [11]; McCluskey & Bishop, [12]; Spurlock & Wonder, [27]; Tilson, [29]). However, as EBP is a shared competency (i.e., the key principles and steps of the EBP process are universal and applicable to all healthcare disciplines), a unique opportunity exists to jointly develop interprofessional core competencies in EBP that objectively measure the actual EBP performance of all healthcare professionals.
The Current State of Practicing Healthcare Professionals' EBP Competencies
A recent integrative review on the EBP readiness of nurses (Saunders & Vehviläinen-Julkunen, [24]) concluded that the EBP competencies of nurses internationally are at a low to moderate level, particularly in terms of their EBP knowledge, EBP skills, and their confidence in employing EBP. These results are consistent with the findings from other recent reviews of EBP competencies across other healthcare disciplines (Mota da Silva, da Cunha Menezes Costa, Narciso Garcia, & Oliveira Pena Costa, [20]; Scurlock-Evans, Upton, & Upton, [26]; Upton, Stephens, Williams, & Scurlock-Evans, [32]). Therefore, instead of setting high performance expectations for EBP, it is essential to first focus on advancing practicing healthcare professionals' EBP competencies before they will be capable of consistently implementing EBP and integrating best evidence into their daily care delivery. Once healthcare professionals are competent in EBP, they will be more likely to engage in EBP in their daily work, and patient care delivery in most healthcare organizations will likely become more evidence-based. This substantial chasm between the EBP implementation goals of healthcare organizations and the current EBP implementation capabilities of large numbers of healthcare professionals, due to their low level of EBP competence, is precisely the gap that urgently requires attention and immediate action in healthcare organizations worldwide.
Aims
The aim of this overview of systematic reviews was to summarize and synthesize the current international research literature on practicing healthcare professionals' EBP competencies (i.e., their knowledge, skills, attitudes, beliefs, and implementation of EBP) related to employing EBP in clinical decision-making. This overview addresses the following research question: What do systematic reviews published in international peer-reviewed journals state about practicing healthcare professionals' EBP competencies?
Design
Published systematic reviews on the EBP competencies of all practicing healthcare professionals, including nurses, physicians, physical therapists, occupational therapists, and other allied health professionals, were considered for inclusion in this overview of systematic reviews. The relevant data in the reviews were systematically extracted, summarized, and synthesized according to the guidelines provided by the Cochrane Collaboration (Becker & Oxman, [2]). The review process is presented according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement for reporting study methods and results (Moher, Liberati, Tetzlaff, Altman, & The PRISMA Group, [19]).
Methods
Systematic literature search methods were used to conduct electronic database searches in PubMed/MEDLINE, Cumulative Index for Nursing and Allied Health Literature (CINAHL), Scopus, and Cochrane Library for primary empirical studies and reviews published between January 1, 2012, and July 31, 2017 (i.e., approximately the last 5 years), without any language restrictions. With the expert assistance of a university librarian, keywords and search terms related to the various healthcare disciplines, EBP, and competencies were first searched independently and then in combination, with appropriate modifications made for the various databases (e.g., MeSH terms in PubMed). The term "research utilization" was not used, as the aim of this overview of systematic reviews was to focus on healthcare professionals' EBP competencies (i.e., their EBP knowledge, skills, attitudes, beliefs, and implementation). Moreover, research utilization focuses on the retrieval, critique, and use of the research results from a single primary study, whereas EBP is commonly considered to be a much broader concept including research utilization and the integration of summarized and translated best evidence from several well-defined studies into clinical practice (Melnyk & Fineout-Overholt, [14]). In addition to the searched databases, authors of the included reviews were contacted for any missing key information, the reviews were reference-chased, and the tables of contents of the following peer-reviewed journals from 2012 to 2017 were hand-searched: Worldviews on Evidence-Based Nursing, Journal of Advanced Nursing, BMC Health Services Research, BMC Medical Education, BMJ Open, Physiotherapy, and British Journal of Occupational Therapy. These journals were selected because they had published the majority of the reviews focusing on the topic of healthcare professionals' EBP competencies yielded by the systematic literature searches conducted for this overview.
Inclusion and Exclusion Criteria
The inclusion and exclusion criteria for systematic reviews are listed in Table S1. Systematic reviews were defined as reviews that had clearly stated aims or objectives, predetermined inclusion criteria, searched at least three databases, performed data extraction, provided a synthesis of data, and performed a quality appraisal of the included studies. To be eligible for inclusion in this overview, reviews were required to (a) focus on one or more of the outcomes of interest (i.e., EBP competencies of healthcare professionals), (b) fulfill the definition of a systematic review, (c) meet the inclusion and exclusion criteria, and (d) meet the benchmark set for the methodological quality of the reviews. Before undertaking this overview of systematic reviews, the Cochrane Library and the Joanna Briggs Institute Library of Systematic Reviews were searched. No published or in-progress systematic reviews or overviews of systematic reviews on this topic were found.
Search Results and Data Evaluation
The database searches yielded a total of 3,932 publications, and 15 additional publications were identified through other sources. Titles were screened, and duplicates as well as those not clearly indicating a focus on practicing healthcare professionals' EBP competencies were excluded. All remaining abstracts (n = 407) were screened against the purpose and inclusion criteria before being selected for further appraisal. After eliminating a total of 392 records that did not meet one or more inclusion criteria, the second screening resulted in 12 reviews. Three reviews were added through reference-chasing and hand-searching the tables of contents of the selected peer-reviewed journals, resulting in a total of 15 full-text reviews, which were assessed for eligibility. Four full-text reviews were excluded from the overview, as they contained no critical appraisal of methodological quality and therefore did not meet the definition of a systematic review outlined for this overview. As a result, data were extracted from 11 systematic reviews. Figure S1 details the stages of searching and selecting reviews for inclusion or exclusion using the PRISMA flow diagram (Moher et al., [19]).
Data Extraction
The following data were extracted for each of the 11 reviews and organized in a data matrix, using a standardized data extraction form developed according to the guidance from the PRISMA statement (Moher et al., [19]): author(s), country, year of publication, types of participants, settings, study design(s) included, EBP aspects reviewed, quality appraisal(s) performed, main findings, and authors' conclusions. The data were extracted by one reviewer and independently checked for accuracy and consistency by two other reviewers to ensure rigor and reproducibility. Any differences in opinion between the three researchers were discussed until a mutual agreement was formed. All 11 reviews were included in the critical appraisal of methodological quality.
Critical Appraisal of Methodological Quality
The overall quality and differences in quality between the included reviews were compared and contrasted in order to help interpret the results of the reviews synthesized in this overview. The overall quality of the reviews was not used as a criterion for inclusion, as the reviews included in this overview were required to meet the definition of a systematic review and specific inclusion criteria, and to pass a critical appraisal of methodological quality, the main purpose of which was to ensure that the included reviews conformed to usual research norms.

The criteria used by the three independent reviewers for evaluating the methodological quality were those in the Rapid Critical Appraisal (RCA) tool for systematic reviews and meta-analyses of quantitative studies developed by the Helene Fuld National Institute for Evidence-Based Practice in Nursing & Healthcare of the Ohio State University College of Nursing ([OSUCN] 2017). The reviewers used the tool to critically appraise validity, reliability, and applicability and generalizability through independently answering a series of 15 appraisal questions and subquestions. In addition, an evaluation quantifying the strength of evidence (i.e., quality + level of evidence) in the included reviews was added to the standardized form for conducting the critical appraisal of methodological quality. The three independent reviewers critically appraised the strength of evidence as being low, moderate, or high, based on the percentage of critical appraisal criteria fulfilled (0–33%, 34–66%, and 67% and over). Any discrepancies and differences in opinion in the critical appraisals of methodological quality related to the included reviews were discussed among the three researchers until consensus was reached. The benchmark of methodological quality for the reviews included in this overview was set at a total minimum score of at least five out of a total of 15 appraisal criteria on the RCA tool fulfilled (i.e., 34%), indicating acceptable scientific rigor.
Data Synthesis
To answer the primary research question of this overview, the data from the 11 included reviews on practicing healthcare professionals' EBP competencies were summarized, analyzed, and synthesized using guidance from the Cochrane Collaboration (Becker & Oxman, [2]). A narrative synthesis is presented, as a meta-analysis was not possible due to the heterogeneity of the source studies contained in the reviews, including substantial variation in outcomes and educational interventions, as well as the poor quality of reporting of the results in some of the included reviews.
Findings
Characteristics of the Systematic Reviews Included in the Overview
The 11 included reviews originated from all around the globe: though the majority (n = 6, 55%) were from Europe, another two were from Australia, and one each were from Asia, South America, and North America. As expected, almost one-half (n = 5, 45%) of the included reviews originated from English-speaking countries, which traditionally comprise the nations leading the international EBP movement. Unexpectedly, the majority (n = 6, 55%) of the reviews originated from smaller countries, such as Ireland, Greece, Finland, and the Netherlands, many of which are non-English-speaking and have embarked on the EBP journey more recently. The number of source studies in the 11 included systematic reviews ranged from n = 6 to n = 32, with a total of 204 source studies from 24 different countries on six continents.
Seven (64%) of the 11 reviews included source studies using a cross-sectional survey design, another seven (64%) included randomized controlled trials (RCTs) or cluster RCTs, six (55%) included source studies using a pretest–posttest intervention or a cluster nonrandomized study design, four (36%) included qualitative study designs, two each included mixed-methods study designs and longitudinal observational designs, and one each included prospective cohort designs or reviews. Although the majority (n = 7, 64%) of the 11 included systematic reviews contained one or more source studies using an experimental design (i.e., used a second group for comparison), the vast majority of the source studies were nonrandomized, one-group quasi-experimental study designs, cross-sectional surveys, or qualitative study designs. Similarly, although the vast majority of the total number of source studies used a nonrandom sample (e.g., a convenience or purposive sample), seven of the 11 (64%) systematic reviews included at least one source study that used a random sample.

Only five of the 11 included reviews discussed or displayed (e.g., in their extracted data tables) the response rates of their source studies, and even when they were reported, they frequently were not reported for all source studies in the reviews. Overall, the reported response rates were relatively low, with wide variability from 9% to 100%. Furthermore, healthcare professionals' EBP competencies were measured using a wide variety of published and unpublished instruments, some of which were general instruments measuring several EBP competencies, such as the EBP Questionnaire (Upton & Upton, [33]), whereas other instruments measured one specific EBP competency, such as the EBP Beliefs Scale (Melnyk & Fineout-Overholt, [13]). Selected characteristics of the included reviews (n = 11) are presented in Table S2.
Participants and Practice Settings in the Systematic Reviews
A total of 59,382 healthcare professionals participated in the source studies of the 11 included reviews published between January 2012 and July 2017. Healthcare disciplines represented in the reviews were primarily nursing, medicine, physical therapy, and occupational therapy, but participants from at least 10 additional allied health disciplines, as listed in the Turnbull et al. ([30]) model for allied health professionals, were included in the source studies of the reviews. In almost one-half (n = 5, 45%) of the systematic reviews, the source studies focused on only one healthcare discipline (e.g., nurses). However, six of the 11 included systematic reviews contained source studies with multidisciplinary samples, which included health professionals other than nurses, doctors, physical therapists, and occupational therapists. All 11 included systematic reviews focused on practicing healthcare professionals, but four of the 11 (36%) also contained small subsamples of healthcare students in some of their source studies. The clinical settings of the source studies were poorly identified, with only general statements such as "various settings" or "any clinical setting," or the settings were not described at all in the majority (n = 7, 64%) of the included reviews. However, some of the included reviews did disclose containing source studies from hospital, primary care, and community care settings.
Outcomes Measured and Overlap Between the Included Reviews
Outcomes measured in the included reviews varied considerably, with several reviews containing other outcomes in addition to those related to healthcare professionals' EBP competencies. Moreover, the instruments used to measure the outcomes also varied considerably. Healthcare professionals' EBP competencies were measured using self-report assessments in the source studies of all 11 included reviews (i.e., perceived EBP competencies were measured, instead of using more objective measures of actual performance, such as EBP knowledge tests). A total of 204 source studies were contained in the 11 reviews included in this overview. There was substantial overlap across the included reviews in terms of their source studies, as the 11 included reviews with a total of 204 source studies referred to a total of 133 separate studies, of which 48 were included in more than one review. An effort was made to avoid double counting, which might lend extra weight to those study results that had been included in more than one review. A summary of the main findings from the source studies can be found in the fuller version of this overview published online. Table S3 summarizes the EBP competency outcomes of healthcare professionals from the included reviews.
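The overlap accounting above (204 source-study entries across the 11 reviews resolving to 133 distinct studies, 48 of them cited by more than one review) is simple multiset bookkeeping. The toy sketch below uses invented review and study IDs, not the actual corpus, to show the three counts being computed:

```python
# Toy illustration (invented data) of the overlap accounting described above:
# each review lists its source studies; we count total entries, distinct
# studies, and studies appearing in more than one review.
from collections import Counter

reviews = {                       # hypothetical review -> source-study IDs
    "review_A": ["s1", "s2", "s3"],
    "review_B": ["s2", "s3", "s4"],
    "review_C": ["s4", "s5"],
}

# How many reviews cite each study.
citations = Counter(sid for studies in reviews.values() for sid in studies)

total_entries = sum(citations.values())               # analogous to the 204 entries
distinct_studies = len(citations)                     # analogous to the 133 studies
shared = sum(1 for n in citations.values() if n > 1)  # analogous to the 48 overlaps

print(total_entries, distinct_studies, shared)  # 8 5 3
```

Deduplicating on the distinct-study level, as the overview did, prevents studies cited in several reviews from being weighted more than once in the synthesis.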
Overall Quality and Completeness of Reporting in the Included Systematic Reviews
The overall quality of the included reviews was appraised using guidance from the Cochrane Collaboration (Becker & Oxman, [2]). All of the reviews met the definition of systematic reviews as outlined for this overview. Interestingly, although two of the 11 included reviews were characterized as a "scoping review" or a "systematic scoping review," they nevertheless included a critical appraisal of the methodological quality of their source studies, which reflects the wide variety of terms that are used, sometimes inconsistently, to describe the various types of reviews published in the international literature.

The critical appraisal of methodological quality conducted by the three reviewers with the RCA tool (OSUCN, [21]) revealed a broad range of strength of evidence among the included reviews. The benchmark for the strength of evidence indicating acceptable methodological quality was set at 34% (i.e., a total minimum score of at least 5 out of a total of 15 critical appraisal criteria fulfilled). All 11 included reviews met this minimum standard for acceptable scientific rigor, with 10 of the 11 reviews appraised at moderate quality. The median score (0–15) was 8 (moderate), with scores ranging from 5 to 10 (out of 15). Only one of the 11 included reviews barely attained a high score (i.e., a score of at least 10 out of 15 appraisal criteria fulfilled).

The pronounced heterogeneity in the source studies of the included reviews in terms of their study designs, practice settings, outcome measures, outcomes of interest, and educational interventions, combined with poor and inconsistent reporting quality (e.g., not reporting source study settings) and missing or incomplete data (e.g., only one of the 11 reviews reported effect sizes for the source studies, and few reported p-values or confidence intervals), prompted the results of this overview to be narratively summarized. This also precluded any comparisons of EBP competencies across healthcare disciplines. In particular, there was considerable variation in the outcome measures used in the source studies of the reviews, including unpublished, not theoretically based, and not psychometrically tested instruments, which were inconsistently or incompletely described. Moreover, many assertions were made in the reporting of the source studies, but few assertions were backed up by actual data in the reviews. Furthermore, although the educational interventions may have had a positive effect on EBP competencies, the impact of the improved EBP competencies on patient outcomes or practice changes remains unclear, as healthcare professionals' improved EBP competencies may not necessarily have influenced practice in any way.
On the other hand, although the vast majority of the source studies in the included reviews used nonprobability sampling methods and cross-sectional survey, pretest–posttest intervention, or qualitative study designs, it is important to acknowledge that seven (64%) of the 11 reviews contained at least one RCT or cluster RCT as a source study. In total, the 11 reviews contained 33 RCTs or cluster RCTs as source studies, some of which were included in more than one review. These results are consistent with the findings of Young, Rohwer, Volmink, and Clarke ([36]), who found that despite the commonly held perception of relatively rare use of experimental study designs such as RCTs in some healthcare disciplines, the reviews included in their overview nevertheless included a total of 25 RCTs. In summary, the overall quality and completeness of evidence in the included reviews of this overview was low to moderate at best, as the majority of the reviews did not contain a comprehensive literature search, report on both included and excluded studies, or discuss the potential biases of the reviews. Lastly, some of the reviews did not report the response rates or the number of participants in their source studies, or match the stated objectives of the review with what was actually discussed in the review.
Discussion
The first Sicily statement (Dawes et al., [5]) outlined that knowledge and understanding of the principles of EBP and skills to implement the steps of the EBP process are essential competencies for all practicing healthcare professionals. To that end, this overview of systematic reviews summarized and synthesized evidence from 11 systematic reviews containing 204 source studies that assessed the current state of the EBP competencies of practicing healthcare professionals, provided critical appraisals of their ability to implement the steps of the EBP process, and evaluated the effectiveness of various educational interventions for advancing their EBP competencies, using a wide variety of study designs, outcome measures, and outcomes of interest.
Although the majority of healthcare professionals across disciplines indicated familiarity with both the concept of "evidence-based practice" and the discipline-specific terms (e.g., "evidence-based nursing" or "evidence-based medicine"), widespread confusion appeared to exist among large proportions of healthcare professionals about the commonly accepted definitions of EBP and the meanings of the basic concepts related to EBP (Condon, McGrane, Mockler, & Stokes, [4]; Scurlock-Evans et al., [26]; Ubbink, Guyatt, & Vermeulen, [31]; Upton et al., [32]), which is consistent with the results of other reviews (Saunders & Vehviläinen-Julkunen, [24]). This is disconcerting because the lack of clarity about even the most basic definitions and concepts of EBP among large proportions of healthcare professionals impedes healthcare organizations from delivering the highest-quality, evidence-based health care. It may also contribute to a perception among healthcare professionals and organizations that EBP is being implemented when, in reality, clinical care delivery is still more closely associated with the traditions, routines, and customs of opinion-based practice (Saunders & Vehviläinen-Julkunen, [24]; Wonder, Spurlock, Lancaster, & Gainey, [35]). Furthermore, large proportions of healthcare professionals across disciplines appear to hold a variety of misconceptions, misinterpretations, and misunderstandings of what actually constitutes EBP (Saunders, Stevens, & Vehviläinen-Julkunen, [23]; Scurlock-Evans et al., [26]; Upton et al., [32]). For example, Scurlock-Evans et al. ([26]) contended that physical therapists may not only be confused as to the meaning of the term "evidence," but may also be confused about how they should go about integrating evidence and about what type of evidence they should be implementing in practice.
Practicing healthcare professionals' self-reported attitudes toward and beliefs in the importance and value of EBP for improving care quality and patient outcomes were mainly positive across health disciplines, and generally at a higher level than their self-reported EBP knowledge and skills. Unfortunately, however, these EBP competencies did not translate into EBP behaviors, as EBP implementation in daily practice was generally at a low level across disciplines (Saunders & Vehviläinen-Julkunen, [24]; Scurlock-Evans et al., [26]; Ubbink et al., [31]; Upton et al., [32]). Furthermore, although healthcare professionals' self-rated EBP knowledge and skills were higher than their EBP implementation, healthcare professionals across disciplines rated their EBP knowledge and skills to be at an insufficient level for integrating best evidence into daily practice. Perhaps for this reason, large proportions of healthcare professionals across disciplines did not use best available evidence or implement EBP in daily care delivery. This is consistent with the findings of previous studies indicating that the majority of clinicians do not consistently engage in EBP (Melnyk, Fineout-Overholt, Gallagher-Ford, & Kaplan, [15]; Melnyk et al., [17]; Wallen et al., [34]).
Another concern related to the included reviews was the failure to measure the impact of healthcare professionals' EBP competencies on patient outcomes, even when doing so was explicitly stated as one of the objectives of the review. Although four of the 11 included reviews reported measuring the impact of healthcare professionals' EBP competencies on practice changes or patient outcomes as a stated objective, only one review actually discussed any results related to patient outcomes. This failure to measure the patient-outcome impact of healthcare professionals' EBP competencies, and of educational interventions promoting those competencies, is consistent with the results of other reviews (Hecht, Buhse, & Meyer, [9]; Häggman-Laitila, Mattila, & Melender, [7]) and overviews (Young et al., [36]).
Limitations in the Overview
The main limitation of this overview of systematic reviews is the potential for various biases, including selection, publication, and indexing biases. To reduce the potential for bias, we followed guidance from the Cochrane Collaboration and PRISMA on the methodology for conducting rigorous systematic reviews and reporting their results, followed a prespecified review protocol, and systematically searched multiple electronic databases in collaboration with a university librarian, using keywords and search terms modified appropriately for the various databases. In addition, we searched for ongoing systematic reviews prior to undertaking this overview, reference-chased the systematic reviews included in this overview, and hand-searched the tables of contents of the peer-reviewed scientific journals in which the majority of the systematic reviews on healthcare professionals' EBP competencies had been published. As hand-searching the tables of contents did not yield any additional reviews, we believe that our search strategy effectively captured most of the relevant systematic reviews published on this topic between January 2012 and July 2017. However, as in any review, it is possible that some relevant systematic reviews were not identified.
Second, three reviewers independently used a study design-specific critical appraisal tool to evaluate the methodological quality of each included review, with any discrepancies and differences discussed to reach mutual agreement, which increased the reliability of the data. In addition, all of the included reviews, originating from 10 different countries worldwide, had passed international peer review and had been published in high-quality scientific journals. As the majority (n = 6, 55%) of the included reviews originated from non-English-speaking countries representing six different languages, publication and language biases, although possible, are unlikely.
Third, self-reported assessments were used to measure healthcare professionals' EBP competencies in all of the 11 included reviews (i.e., perceived EBP competencies were assessed instead of more objective measures of actual performance, such as EBP knowledge tests). Because of a lack of congruence between self-reported and more objectively measured knowledge and ability, especially when measuring complex tasks such as EBP implementation (Saunders, Vehviläinen-Julkunen, et al., [25]; Scurlock-Evans et al., [26]; Wonder et al., [35]), using self-reports may result in bias (e.g., through participants giving more socially acceptable responses) and in overestimation of some EBP competencies, such as EBP knowledge, for which more objective measures are available. Fourth, the search term "research utilization" was not used for our overview of systematic reviews, as the aim was to focus on the EBP competencies that practicing healthcare professionals need to successfully integrate translated best evidence into daily clinical practice. However, we acknowledge that it is not uncommon for research utilization to be used in studies as if it were an alternative term for EBP, and therefore, we are aware that some published systematic reviews may have been missed by our search. Fifth, the modest methodological quality of the identified systematic reviews and the relatively low quality of reporting of their results may have affected the results of this overview. Finally, effect sizes were reported in only one of the included systematic reviews. Therefore, the generalizability of the results is limited, and the results of this overview should be extrapolated with caution.
Implications for Practice and Research
Evidence-based practice competencies are essential for all practicing healthcare professionals in guiding their integration of best evidence into their clinical decision-making, thus enabling them to provide higher-quality care and produce better patient outcomes. However, as EBP is a shared competency and the steps of EBP implementation are universal, there is an urgent need for the collaborative development, implementation, and evaluation of an EBP competency set for all healthcare professionals (i.e., an interprofessional set of EBP competencies that can be used by practicing healthcare professionals from any healthcare discipline). Recently, a first set of such interprofessional core competencies in EBP for all healthcare professionals was published as a consensus statement based on a systematic review and Delphi survey (Albarqouni et al., [1]); it contains 68 core competencies in EBP applicable to all healthcare professionals. These interprofessional core competencies should be the focus of future research studies, as they will guide the development of interprofessional EBP competency measures (via self-ratings or actual performance) as well as joint EBP curricula for practicing healthcare professionals; thus, their subsequent uptake, adoption, and use in clinical practice should be a high priority. In addition, addressing the widespread misconceptions and misunderstandings currently existing among large proportions of healthcare professionals about the basic concepts of EBP is crucially important for increasing their engagement in EBP implementation and for attaining improved care quality and patient outcomes.
Nursing and some allied health disciplines, such as physical therapy and occupational therapy, have traditionally relied on measuring competencies through self-report assessments even when the constructs of interest, such as EBP knowledge, ability, or competence, could be assessed through more objective measures. Therefore, future studies should focus on developing and using actual (i.e., performance-based), validated outcome measures for EBP competencies, using rigorous study and review methodologies and robust reporting practices. Although EBP is a shared competency, implementation of EBP is a complex process requiring multifaceted educational interventions that contain interacting components; thus, it should be investigated whether differences in healthcare professionals' primary roles and educational backgrounds across disciplines, and in contextual factors, may influence the effects of EBP educational interventions.
Conclusions
The findings of this overview of systematic reviews suggest that, irrespective of their healthcare discipline, large proportions of practicing healthcare professionals perceive their EBP competencies to be insufficient for employing EBP in daily care delivery. These perceptions, as well as widespread confusion, misconceptions, and misunderstandings about the meanings of the most basic concepts of EBP among healthcare professionals across disciplines, contribute to their low levels of EBP implementation, both in terms of the principles and in terms of the process of EBP (i.e., healthcare professionals neither use translated best evidence as the basis for clinical decision-making in daily practice nor implement all the steps of the EBP process). As EBP is a shared competency, practicing healthcare professionals should actively participate in the uptake, adoption, and use of the interprofessional core competencies in EBP for all healthcare professionals, as well as collaboratively advance EBP implementation through the development and evaluation of the effectiveness of research-based EBP interventions, strategies, and tools.
Linking Evidence to Action
EBP competencies are essential for all practicing healthcare professionals, as they guide healthcare professionals' integration of best evidence into their clinical decision-making and thus enable them to provide higher-quality care to patients, resulting in better patient outcomes.
It is important to recognize that EBP is a shared competency; that is, the key principles and steps of the EBP implementation process are universal and applicable to all healthcare disciplines.
There is an urgent need for research studies on the applicability in practice, as well as the uptake, adoption, and evaluation, of the interprofessional core competencies in EBP for all healthcare professionals recently published as a consensus statement based on a systematic review and Delphi survey (Albarqouni et al., [1]).
Future research studies should also focus on developing and using actual (i.e., performance-based), validated outcome measures for assessing nurses' EBP competencies, instead of continuing to evaluate perceived (i.e., self-rated) competencies via self-assessments, even when the constructs of interest, such as EBP knowledge and ability, could be assessed through more objective, performance-based measures.
Figure S1. The modified PRISMA flow diagram (Moher et al., [19]): identification, screening, and selection of systematic reviews for inclusion in the overview.
Table S1. Inclusion and Exclusion Criteria for the Overview of Systematic Reviews. Table S2. Characteristics of Included Systematic Reviews in the Overview. Table S3. Summary Table of EBP Outcomes in the Systematic Reviews Included in the Overview.
Footnotes
1 This research was supported by grants awarded to Dr.
Saunders from the Finnish Work Environment
Fund, which are gratefully acknowledged.
References
1 Albarqouni, L., Hoffmann, T., Straus, S., Olsen, N. R., Young, T., Ilic, D., ... Glasziou, P. (2018). Core competencies in evidence-based practice for health professionals: Consensus statement based on a systematic review and Delphi survey. JAMA Network Open, 1(2), e180281. https://doi.org/10.1001/jamanetworkopen.2018.0281
2 Becker, L., & Oxman, A. (2011). Overviews of reviews. In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (Version 5.1.0). Retrieved from The Cochrane Collaboration website: http://www.cochrane-handbook.org
3 Benner, P. (1984). From novice to expert: Excellence and power in clinical nursing practice. Menlo Park, CA: Addison Wesley.
4 Condon, C., McGrane, N., Mockler, D., & Stokes, E. (2016). Ability of physiotherapists to undertake evidence-based practice steps: A scoping review. Physiotherapy, 102, 10–19. https://doi.org/10.1016/j.physio.2015.06.003
5 Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., ... Osborne, J. (2005). Second International Conference of Evidence-Based Health Care Teachers and Developers: Sicily statement on evidence-based practice. BMC Medical Education, 5, 1. https://doi.org/10.1186/1472-6920-5-1
6 DiCenso, A., Cullum, N., & Ciliska, D. (1998). Implementing evidence-based nursing: Some misconceptions. Evidence-Based Nursing, 1, 38–40. https://doi.org/10.1136/ebn.1.2.38
7 Häggman-Laitila, A., Mattila, L.-R., & Melender, H.-L. (2016). Educational interventions on evidence-based nursing in clinical practice: A systematic review with qualitative analysis. Nurse Education Today, 43, 50–59. https://doi.org/10.1016/j.nedt.2016.04.023
8 Halm, M. A. (2018). Evaluating the impact of EBP education: Development of a modified Fresno test for acute care nursing. Worldviews on Evidence-Based Nursing, 15, 272–280. https://doi.org/10.1111/wvn.12291
9 Hecht, L., Buhse, S., & Meyer, G. (2016). Effectiveness of training in evidence-based medicine skills for healthcare professionals: A systematic review. BMC Medical Education, 16, 103. https://doi.org/10.1186/s12909-016-0616-2
10 Ilic, D., Nordin, R. B., Glasziou, P., Tilson, J. K., & Villanueva, E. (2013). Implementation of a blended learning approach to teach evidence-based practice: A protocol for a mixed-method study. BMC Medical Education, 13, 170. https://doi.org/10.1186/1472-6920-13-170
11 Laibhen-Parkes, N., Kimble, L. P., Melnyk, B. M., Sudia, T., & Codone, S. (2018). An adaptation of the original Fresno test to measure evidence-based competence of pediatric bedside nurses. Worldviews on Evidence-Based Nursing, 15(3), 230–240. https://doi.org/10.1111/wvn.12289
12 McCluskey, A., & Bishop, B. (2009). The adapted Fresno test of competence in evidence-based practice. Journal of Continuing Education in the Health Professions, 29(2), 119–126.
13 Melnyk, B. M., & Fineout-Overholt, E. (2007). The evidence-based practice beliefs scale. Gilbert, AZ: ARCC IIc Publishing.
14 Melnyk, B. M., & Fineout-Overholt, E. (Eds.). (2011). Evidence-based practice in nursing and healthcare: A guide to best practice (2nd ed.). Philadelphia, PA: Lippincott, Williams & Wilkins.
15 Melnyk, B., Fineout-Overholt, E., Gallagher-Ford, L., & Kaplan, L. (2012). The state of evidence-based practice among U.S. nurses. Journal of Nursing Administration, 42, 410–417. https://doi.org/10.1097/NNA.0b013e3182664e0a
16 Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11(1), 5–15. https://doi.org/10.1111/wvn.12021
17 Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2016). Implementing the EBP competencies in healthcare: A practical guide for improving quality, safety, & outcomes. Indianapolis, IN: Sigma Theta Tau International.
18 Melnyk, B. M., Gallagher-Ford, L., Zellefrow, C., Tucker, S., Thomas, B., Sinnott, L. T., & Tan, A. (2018). The first U.S. study on nurses' evidence-based practice competencies indicates major deficits that threaten healthcare quality, safety, and patient outcomes. Worldviews on Evidence-Based Nursing, 15(1), 16–25. https://doi.org/10.1111/wvn.12269
19 Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(6), e1000097. https://doi.org/10.1371/journal.pmed.1000097
20 Mota da Silva, T., da Cunha Menezes Costa, L., Narciso Garcia, A., & Oliveira Pena Costa, L. (2015). What do physical therapists think about evidence-based practice? A systematic review. Manual Therapy, 20, 388–401. https://doi.org/10.1016/j.math.2014.10.009
21 Ohio State University College of Nursing (2017). Rapid Critical Assessment (RCA) tools for systematic reviews & meta-analyses of quantitative studies and literature reviews. Columbus, OH: The Helene Fuld National Institute for Evidence-Based Nursing & Healthcare.
22 Sackett, D., Rosenberg, W., Gray, J., Haynes, R., & Richardson, W. (1996). Evidence-based medicine: What it is and what it isn't. BMJ, 312, 71–72. https://doi.org/10.1136/bmj.312.7023.71
23 Saunders, H., Stevens, K. R., & Vehviläinen-Julkunen, K. (2016). Nurses' readiness for evidence-based practice at Finnish university hospitals: A national survey. Journal of Advanced Nursing, 72, 1863–1874. https://doi.org/10.1111/jan.12963
24 Saunders, H., & Vehviläinen-Julkunen, K. (2015). The state of readiness for evidence-based practice among nurses: An integrative review. International Journal of Nursing Studies, 56, 128–140.
25 Saunders, H., Vehviläinen-Julkunen, K., & Stevens, K. R. (2016). Effectiveness of an education intervention to strengthen nurses' readiness for evidence-based practice: A single-blind randomized controlled study. Applied Nursing Research, 31, 175–185. https://doi.org/10.1016/j.apnr.2016.03.004
26 Scurlock-Evans, L., Upton, P., & Upton, D. (2014). Evidence-based practice in physiotherapy: A systematic review of barriers, enablers and interventions. Physiotherapy, 100, 208–219. https://doi.org/10.1016/j.physio.2014.03.001
27 Spurlock, D., & Wonder, A. H. (2015). Validity and reliability evidence for a new measure: The evidence-based practice knowledge assessment in nursing. Journal of Nursing Education, 54, 605–613. https://doi.org/10.3928/01484834-20151016-01
28 Stevens, K. R. (2009). Essential evidence-based practice competencies in nursing (2nd ed.). San Antonio, TX: San Antonio Academic Center for Evidence-Based Practice, University of Texas Health Science Center.
29 Tilson, J. K. (2010). Validation of the modified Fresno test: Assessing physical therapists' evidence-based practice knowledge and skills. BMC Medical Education, 10, 38. https://doi.org/10.1186/1472-6920-10-38
30 Turnbull, C., Grimmer-Somers, K., Kumar, S., May, E., Law, D., & Ashworth, E. (2009). Allied, scientific and complementary health professionals: A new model for Australian allied health. Australian Health Review, 33(1), 27–37. https://doi.org/10.1071/AH090027
31 Ubbink, D. T., Guyatt, G. H., & Vermeulen, H. (2013). Framework of policy recommendations for implementation of evidence-based practice: A systematic scoping review. BMJ Open, 3(1), e001881. https://doi.org/10.1136/bmjopen-2012-001881
32 Upton, D., Stephens, D., Williams, B., & Scurlock-Evans, L. (2014). Occupational therapists' attitudes, knowledge, and implementation of evidence-based practice: A systematic review of published research. British Journal of Occupational Therapy, 77(1), 24–38. https://doi.org/10.4276/030802214X13887685335544
33 Upton, D., & Upton, P. (2005). Knowledge and use of evidence-based practice by allied health and health science professionals in the United Kingdom. Journal of Allied Health, 35, 127–133.
34 Wallen, G., Mitchell, S., Melnyk, B., Fineout-Overholt, E., Miller-Davis, C., Yates, J., & Hastings, C. (2010). Implementing evidence-based practice: Effectiveness of a structured multifaceted mentorship programme. Journal of Advanced Nursing, 66, 2761–2771. https://doi.org/10.1111/j.1365-2648.2010.05442.x
35 Wonder, A. H., Spurlock, D., Lancaster, S., & Gainey, M. (2017). Comparison of nurses' self-reported and objectively measured evidence-based practice knowledge. Journal of Continuing Education in Nursing, 48, 65–70. https://doi.org/10.3928/00220124-20170119-06
36 Young, T., Rohwer, A., Volmink, J., & Clarke, M. (2014). What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS ONE, 9(1), e86706. https://doi.org/10.1371/journal.pone.0086706
By Hannele Saunders, Lynn Gallagher-Ford, Tarja Kvist, and Katri Vehviläinen-Julkunen
Source: Worldviews on Evidence-Based Nursing
Journal of the American Psychiatric Nurses Association, 2020, Vol. 26(3), 288–292
© The Author(s) 2019
DOI: 10.1177/1078390319889673
Brief Report

The Relationship Between Evidence-Based Practices and Emergency Department Managers' Perceptions on Quality of Care for Self-Harm Patients

Amaya H. Diana1, Mark Olfson2, Sara Wiesel Cullen3, and Steven C. Marcus4

1 Amaya H. Diana, The University of Pennsylvania, Philadelphia, PA, USA
2 Mark Olfson, MD, MPH, Columbia University, New York, NY, USA
3 Sara Wiesel Cullen, PhD, MSW, The University of Pennsylvania, Philadelphia, PA, USA
4 Steven C. Marcus, PhD, The University of Pennsylvania, Philadelphia, PA, USA

Corresponding Author: Sara Wiesel Cullen, School of Social Policy and Practice, the University of Pennsylvania, 3701 Locust Walk, Philadelphia, PA 19104-6243, USA. Email: [email protected]

Abstract
OBJECTIVE: To understand the extent to which implementation of evidence-based practices affects emergency department (ED) nurse managers' perceptions of quality of care provided to deliberate self-harm patients. METHODS: ED nursing leadership from a nationally representative sample of 513 hospitals completed a survey on the ED management of deliberate self-harm patients, including rating the quality of care for deliberate self-harm patients on a 1-to-5-point Likert-type scale. Unadjusted and adjusted analyses, controlling for relevant hospital characteristics, examined associations between the provision of evidence-based practices and quality of care. RESULTS: The overall mean quality rating was 3.09. Adjusted quality ratings were higher for EDs that routinely engaged in discharge planning (β = 0.488) and safety planning (β = 0.736) processes. Ratings were also higher for hospitals with higher levels of mental health staff (β = 0.368) and for teaching hospitals (β = 0.319). CONCLUSION: Preliminary findings suggest a national institutional readiness for further implementation of evidence-based practices for deliberate self-harm patients.

Keywords: emergency department, deliberate self-harm, suicide prevention, evidence-based practices, quality of care

Introduction
More than 45,000 Americans over age 10 years died by suicide in 2016, making it the 10th leading cause of death in the United States (Centers for Disease Control and Prevention, 2018). Among adults who complete suicide, approximately 20% to 25% have an emergency department (ED) visit in the year prior to their death for deliberate self-harm (Ahmedani et al., 2014; Ahmedani et al., 2015) and associated suicidality (hereafter referred to as DSH), which can include nonsuicidal self-injury. Because EDs are providing front-line suicide prevention services, improving the overall quality of ED mental health care for patients who present with DSH represents an opportunity to intervene with these high-risk patients. While previous studies have attempted to determine the quality of care in EDs for DSH patients, the literature is lacking in recent, U.S.-based research.
ED nursing managers were surveyed because of their broad knowledge of typical unit policies, practices, and staffing structure. In addition to providing management of frontline nurses, they also oversee the organizational structure of nursing treatment for DSH patients and are therefore well positioned to shape the processes of care for these patients. Essential to improving this process is understanding the aspects of care that managers perceive as important to providing quality care, as well as the extent to which evidence-based practices (EBPs) have translated
to the ED. For instance, after a DSH event, the provision of appropriate assessment and safety planning reduces risk for repeat DSH and suicide attempts (Boudreaux et al., 2016; Stanley et al., 2018). Safety planning is a brief behavioral intervention that can be performed by nurses in the ED that involves restricting access to lethal means, teaching coping skills, identifying a social and emergency network, and building motivation for continuing mental health treatment (Stanley et al., 2018). Despite the evidence supporting the efficacy of assessment and safety planning, it remains unknown how often these strategies are actually employed in EDs or the extent to which they improve the quality of care for DSH patients.
In order to assess the gap between research and practice in this area, a national survey of over 500 ED managers collected data on the extent to which EDs provide assessments, the elements of safety planning practices identified above, and mental health referrals on discharge. We then examined the extent to which implementation of these practices influenced ED nurse managers' perceptions of the quality of care provided to DSH patients.
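The adjusted analysis described above, regressing managers' quality ratings on EBP indicators while controlling for hospital characteristics, can be sketched with ordinary least squares. The variable names, simulated data, and effect sizes below are illustrative assumptions that loosely mirror the magnitudes reported in the abstract; they are not the study's actual data or code.

```python
import numpy as np

# Hypothetical sketch of an adjusted OLS analysis: quality ratings
# regressed on EBP indicators plus hospital-level covariates.
rng = np.random.default_rng(0)
n = 513  # sample size matching the number of surveyed hospitals

safety_planning = rng.integers(0, 2, n)     # routinely performed? (0/1)
discharge_planning = rng.integers(0, 2, n)  # routinely performed? (0/1)
teaching_hospital = rng.integers(0, 2, n)   # covariate: teaching status (0/1)
mh_staffing = rng.normal(0.0, 1.0, n)       # covariate: standardized MH staff level

# Simulated 1-5 style ratings built from assumed effects similar in
# size to those reported, plus noise.
quality = (3.0 + 0.74 * safety_planning + 0.49 * discharge_planning
           + 0.32 * teaching_hospital + 0.37 * mh_staffing
           + rng.normal(0.0, 0.5, n))

# Adjusted model: least squares on a design matrix with an intercept.
X = np.column_stack([np.ones(n), safety_planning, discharge_planning,
                     teaching_hospital, mh_staffing])
beta, *_ = np.linalg.lstsq(X, quality, rcond=None)

labels = ["intercept", "safety_planning", "discharge_planning",
          "teaching_hospital", "mh_staffing"]
for name, b in zip(labels, beta):
    print(f"{name:>18}: {b:+.2f}")
```

With n = 513 hospitals, the fitted coefficients recover the assumed effects closely; the same design extends naturally to additional hospital covariates by appending columns to the design matrix.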
Methods
Between May 2017 and January 2018, we mailed an ED
management of DSH survey to a random sample of 665
Copyright © 2020 e-Service Journal. All rights reserved. No co
IGNOU MSCCFT and PGDCFT Exam Question Pattern: MCFT003 Counselling and Family...
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 

Copyright © 2020 e-Service Journal. All rights reserved. No co

care (Kaye et al., 2017). Using a complex methodology, CMS assigns a star rating on a scale of 1 through 5 (1: worst, 5: best) to individual hospitals (Hospital Compare Overall Ratings Resources, n.d.). Upon entering a zip code or a hospital name in the Hospital Compare website, a summary of nearby hospitals along with their star ratings is displayed. Up to three hospitals can then be selected for detailed side-by-side comparisons related to heart attack, heart failure, pneumonia, surgery, and other conditions. These comparisons are organized by patient satisfaction, timeliness and effectiveness of care, readmissions and
deaths, among others. While the star rating system is widely used by patients, care providers, insurance companies, and policymakers (Mehta et al., 2020), there is considerable debate regarding its divergence from other quality rankings (Austin et al., 2015). Furthermore, there is often little information explaining the relationship between star ratings and a specific disease outcome (e.g., acute myocardial infarction, commonly known as heart attack) due to methodological issues of standardization and the inability to use data from low-volume hospitals (George et al., 2017). Often, there is no underlying pattern of correlation among different outcome measures, raising the concern of whether consumers' decisions should rely on global ranking systems (Hu et al., 2017). Furthermore, the Hospital Compare database is limited in its ability to provide direction to hospitals as to which quality
improvement and efficiency initiatives yield better outcomes (MacLean & Shapiro, 2016). Only a handful of hospitals achieve "triple aim performance," i.e., scoring high on all three outcome dimensions measured by CMS: clinical quality, patient experience, and efficiency (Roth et al., 2019). Despite the vast amount of data collected by CMS regarding hospitals' technology capabilities, quality, and cost-effectiveness initiatives, there is a lack of comprehensive studies assessing how these relate to different outcome measures. In this regard, the current study explores the relationship of different classes of outcome variables with technology capabilities and process-related variables. The objective of the study is twofold: first, whether the CMS star rating system provides sufficient information to consumers for choosing a hospital for a disease-specific condition; second, how and to what extent structural and process initiatives affect the
different outcome dimensions, such as patient satisfaction, cost efficiency, and quality. Using data from the 4,793 hospitals included in the 2018 Hospital Compare database, we focus on general outcomes such as the patient survey of hospitals and spending per beneficiary, as well as readmission rates and excess days spent in care specific to acute myocardial infarction (AMI), commonly known as heart attack. While the CMS star rating is used by hospitals as an endorsement of quality, there is a lack of understanding as to whether these ratings really help patients and family members in their choice of care. More importantly, to the best of our knowledge there are no studies exploring the causal relationship between structural and process variables and hospital performance. The rest of the paper is organized as follows. We present a brief
review of the literature in the next section, followed by a description of our data and methodology. Analysis and discussion of the results are presented next. Finally, we discuss the limitations of the study and offer concluding remarks.

LITERATURE REVIEW

In a seminal article, Donabedian (1966) proposed using the triad of structure, process, and outcome to evaluate the quality of health care. Ever since its introduction, the Donabedian framework has been the most cited in health services research, especially regarding the theory and practice of quality assurance in healthcare (Ayanian & Markel, 2016). According to the Donabedian framework, structure is defined as the settings where healthcare takes place and includes provider qualifications and organizational characteristics. Process includes the functions surrounding the delivery of care, such as diagnosis, treatment, and prevention. Finally,
outcome relates to the effect of the healthcare service on the patient and population. These concepts were further extended to identify different dimensions of quality (Donabedian, 1990) and still constitute the foundation of quality assessment. In the following, we briefly describe the extant literature on each of the three dimensions of structure, process, and outcome as it relates to healthcare research.

Structural Measures

One of the most important structural measures arguably revolves around a hospital's technology capabilities. While the Donabedian framework includes provider qualification, we feel that hospitals participating in CMS programs such as Medicare and Medicaid have standard qualification rules for their doctors and
nurses, and as such these rules would have a similar effect on all hospitals. However, since the introduction of the HITECH (Health Information Technology for Economic and Clinical Health) Act in 2009, considerable emphasis has been placed on hospital capabilities regarding electronic healthcare records (EHR), especially the ability to collect, receive, and transmit patient healthcare records in standardized formats. Hospitals were incentivized to achieve "meaningful use" of EHR with respect to healthcare quality (Gholami et al., 2015). A significant body of academic research has explored the relationship between technology and healthcare quality (Chaudhry et al., 2006). A longitudinal study of hospitals in the US has shown that healthcare technology usage is associated not only with increased healthcare quality but also with reduced operating costs (Bardhan & Thouin, 2013). Also, investments in technology lead hospitals to disclose quality measures voluntarily (Angst et al., 2014).
While the extant literature has predominantly shown positive effects of healthcare technology (Buntin et al., 2011), a recent article also cites the existence of a "productivity paradox" seen earlier in the manufacturing sector (Bui et al., 2018). Their study of hospitals in the state of New York shows only mixed outcomes after considerable investment in technology, especially since their research found no evidence of a relationship between technology use and patient satisfaction, mortality, and readmission rates. The authors call for further research to explore the causal linkage between technology use and specific outcomes such as patient satisfaction, spending, mortality, and readmission rates.

Process Measures

The quality improvement literature has long recognized the role of process management in impacting outcomes. Quality initiatives such as Six Sigma aim to improve quality through a rational modularization and
streamlining of workflows, followed by the implementation of standardized best practices (McCormack et al., 2009). Healthcare organizations have embraced various process improvement initiatives toward improving hospital efficiency, clinical outcomes, and patient experience (Roth et al., 2019). In general, these programs have resulted in improved outcomes (Zheng et al., 2018). In order to reduce the number of preventable medical errors, CMS developed a set of best practices to improve care delivery. These processes are specifically aimed at improving care for acute myocardial infarction (heart attack), heart failure, and pneumonia, as well as surgical processes and infections. It has been shown that participating in process improvement initiatives for heart attack
resulted in reduced mortality and readmission rates (Ding, 2015). However, other studies have shown that hospitals' emphasis on process management leads to increases in clinical quality but a reduction in patient satisfaction (Chandrasekaran et al., 2012).

Measuring Healthcare Service Outcomes

Effectiveness and efficiency are inherent indicators of process performance and have been captured in the literature as quality and efficiency (Melville et al., 2004). Quality can be measured in terms of process results and is determined by how well a process meets the customer's needs. In the context of healthcare, quality can be measured by customer perceptions and/or rankings and ratings provided by insurance agencies (e.g., Medicare) and independent third parties (e.g., US News and World Report). Efficiency, on the other hand, is a simple ratio of output to input and is
representative of how well the results are achieved. Recent literature in healthcare services has emphasized triple aim performance: clinical quality, patient satisfaction, and reduction in cost (Roth et al., 2019; Zheng et al., 2018). We adopt all three outcome measures in our analysis, as described below.

DATA AND METHODOLOGY

This research utilizes data from CMS Hospital Compare (Medicare, n.d.) for the year 2018. A total of 4,793 acute care hospitals registered with Medicare are included in the database. Hospital Compare reports information on various performance metrics such as spending, quality and efficiency of care, HIT implementation, and customer satisfaction collected from the hospitals. In addition, CMS also provides rankings and benchmarks for each of the hospitals. Information regarding Veterans Administration hospitals, children's hospitals,
and critical access hospitals is also included in Hospital Compare but was not part of the current study. Details of the variables used in this study are provided in Table 1. As previously mentioned, the triple aim performance goals are used. For patient satisfaction, we use the aggregate scores from the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) patient experience survey. In addition, the CMS overall star rating is also used. For cost reduction/efficiency, the Medicare Spending Per Beneficiary (MSPB) is used. MSPB is a price-standardized, risk-adjusted measure of spending efficiency (Trzeciak et al., 2017). It assesses the cost of services performed by hospitals and other healthcare providers during the period immediately prior to,
during, and following a beneficiary's hospital stay, compared to a national median hospital. The measure adjusts for geographic differences, patient severity, and age (Medicare Spending Per Beneficiary (MSPB) Measure Methodology, n.d.). For clinical quality, the heart attack measures are chosen. Since hospital performance varies across different disease and treatment conditions, we chose to focus on one disease (Hu et al., 2017). In the past, disease-specific mortality and readmission rates were used as standard clinical quality outcomes. However, these measures sometimes created skewed incentives for hospitals (Psotka et al., 2020). Consequently, more recent measures by CMS include Excess Days in Care instead of mortality, which measures unplanned patient encounters such as observation stays and emergency department visits within 30 days post discharge (Horwitz et al., 2018).
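To make the "relative spending" reading of MSPB concrete, the following is a hedged toy sketch only: the actual CMS measure applies price standardization and risk adjustment that this illustrative function omits, and the function name and dollar figures are hypothetical.

```python
def mspb_score(episode_costs, national_benchmark):
    """Toy MSPB-style ratio: a hospital's mean episode spending divided
    by a national benchmark figure. Read as 1.0 = typical spending,
    above 1.0 = costlier than typical. (The real CMS measure is
    price-standardized and risk-adjusted; this sketch is not.)"""
    mean_cost = sum(episode_costs) / len(episode_costs)
    return mean_cost / national_benchmark

# A hospital averaging $20,000 per episode against a $20,000 national
# benchmark scores exactly 1.0; higher average spending pushes it above 1.0.
score = mspb_score([21000, 19000, 20000], national_benchmark=20000)
```

The single-ratio form is the useful intuition here: it is why MSPB can be compared across hospitals of very different sizes, unlike raw spending totals.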
The process variables are a combination of heart attack-specific measures (e.g., the percentage of patients who were admitted with complaints of chest pain and received aspirin) and general emergency department (ED) throughput measures (e.g., time spent in the ED). We have also included emergency department volume as one of the control variables. Structural measures included health information technology (HIT) related measures, as well as safety measures. Descriptive statistics of all variables are provided in Table 2.

[Table 1: Variables used in the analysis; table content not recoverable from source]

[Table 2: Descriptive statistics; table content not recoverable from source]

[Table 3: Regression results, significance reported at p = 0.01; table content not recoverable from source]

These different pieces of data reside in separate reports within Hospital Compare, indexed by hospital. Once data from these different sources were combined, separate regression models were run for each outcome variable. For categorical variables, the "not available" group was used as the baseline. Results of the regression are provided in Table 3.
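The two modeling mechanics the methodology relies on can be sketched in a few lines of Python. This is an illustrative sketch, not the study's actual code: the function names and the "Yes"/"No"/"Not Available" category levels are hypothetical stand-ins for the reported structure variables.

```python
def dummy_code(values, baseline="Not Available"):
    """Encode a categorical variable as 0/1 indicator columns, omitting
    `baseline` as the reference level, so each regression coefficient is
    read relative to the non-reporting ("Not Available") group."""
    levels = sorted(set(values) - {baseline})
    return [{level: int(v == level) for level in levels} for v in values]

def adjusted_r2(r2, n_obs, n_predictors):
    """Adjusted R^2: penalizes fit for the number of predictors, so a
    model is not rewarded for adding noisy structure/process variables."""
    return 1 - (1 - r2) * (n_obs - 1) / (n_obs - n_predictors - 1)

# Hypothetical safe-surgery-checklist variable for three hospitals:
rows = dummy_code(["Yes", "No", "Not Available"])
# rows[2] is all zeros -- the non-reporting baseline hospital.
```

With this coding, a "did not have the checklist" coefficient is interpreted exactly as in the Results section: the change in the outcome relative to hospitals that did not report the measure.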
RESULTS

A first glance at the results reveals that not all outcome variables are equally impacted by the structure and process variables. Survey-based patient satisfaction and the CMS-computed star rating are the outcomes best explained, as evidenced by their adjusted R2 values of 21.1% and 17.7%, respectively. The efficiency measure, Medicare Spending Per Beneficiary (MSPB), and one of the heart attack related measures (excess days of care) have moderate adjusted R2 values, whereas heart attack readmission rates are not at all impacted by the structure and process related variables. In the following section, the structure-outcome and process-outcome relationships are discussed in detail.

Structure-Outcome Relationships

Interestingly, the two HIT variables did not have any effect on any of the five outcomes, despite about two-thirds of the hospitals reporting both
capabilities. While this seems counterintuitive, recent research suggests that electronic healthcare capabilities cannot be fully harnessed unless the organization's capabilities are built to exploit those technologies (Jena et al., 2020). The absence of an inpatient safe surgery checklist (compared to hospitals that did not report on this measure) had a strong effect on excess days of care: not having a safe surgery checklist increased excess days of care considerably. However, this measure did not have any effect on the other four outcomes. The outpatient safe surgery checklist, on the other hand, resulted in reduced patient satisfaction (compared to hospitals that did not report on the surgery checklist). A possible explanation may be that it increased the time taken for outpatient procedures. Also, hospitals that did not have an outpatient safe surgery checklist had reduced excess days. Finally, hospitals
that used a survey of patient safety culture saw increases in both spending and excess days of care.

Process-Outcome Relationships

For the process variables specific to heart attack care, administering aspirin has a positive effect both on the CMS hospital rating and on reducing spending per beneficiary. Surprisingly, it also slightly increases excess days in acute care. The average time it takes for a probable heart attack patient to get an ECG reduces patient satisfaction but does not have any effect on the other outcome variables. The average time spent in the emergency department (ED) by patients who were ultimately admitted as inpatients reduces patient satisfaction and the CMS rating, increases spending per beneficiary, and increases readmission
rates. Overall time spent in the ED for all patients increases both patient satisfaction and CMS ratings. The percentage of people who left the ED before being seen reduces both patient satisfaction and the CMS rating. A hospital's emergency department volume seems to play a significant role for most outcomes. In general, higher-volume hospitals had lower satisfaction, lower ratings, more spending, and more excess days. Not all volume categories have the same impact on the outcome variables, though; only the very high-volume hospitals saw more excess days. For both spending per beneficiary and CMS rating, ED volume, which can serve as a proxy for hospital size, was associated with increased spending and a lower rating.

DISCUSSION

One of the key findings from our analysis is that the CMS overall rating provides a broad overview of hospital performance. All outcomes
show an improving trend toward the higher star-rated hospitals. However, while the structure and process variables explain quite a bit about patient satisfaction and CMS-computed hospital ratings, they provide less information regarding spending efficiency, and even less for disease-specific clinical outcomes. In other words, while the current structure and process-related variables demonstrably improve patient-experience performance, their impact on reducing unplanned visits and readmission rates is not evident. A closer look at the distributions of excess days and readmission rates shows significant overlap of these measures across hospital ratings (Figure 1), implying that even hospitals in the high-star-rating category may have less-than-standard outcomes for heart attack patients. Interestingly, hospitals that were not assigned a star rating by CMS had worse performance than those that received
star ratings of 4 and 5, but on par with or slightly better than those with ratings 1–3. It should be noted here that consumers do not have ready access to the clinical quality scores through the Hospital Compare website and are shown only the hospital's performance compared to the national median. In order to access the actual scores, patients have to look through the enormous number of data files in the archives.

Figure 1: Heart Attack Readmission Rates and Excess Days in Care Across Hospital Ratings

In summary, the CMS star ratings, while providing a general overview of a hospital's performance, may not be the best way to choose care for a specific
disease. More importantly, the structure and process variables currently captured by CMS fail to provide hospitals with any insight as to which initiatives result in better clinical and spending outcomes.

CONCLUSIONS AND FUTURE RESEARCH

In this study, we assess the utility of the Hospital Compare star rating service in helping patients make informed decisions about their choice of care. We also explore which structure and process variables impact different dimensions of hospital performance, and how. Our analysis highlights the shortcomings of the current service for both patients and providers. At this point, the limitations of our study should be recognized. This is a cross-sectional study of hospitals reporting on many of their process and quality related initiatives. Since CMS does not report any data where the number of cases is very small, some methodological issues arise regarding the underestimation of quality risks at low-volume hospitals (George et al., 2017). More information regarding variation in patient demographics as well as hospital characteristics (size, urban/rural location) should be included in future studies to appropriately assess clinical quality. Apart from a low volume of cases, some hospitals did not report performance data on excess days and quality of care, although they reported other process and structural measures. Further longitudinal studies may investigate whether the proportion of hospitals reporting these measures increases over time, and whether such changes explain the causal relationship between process initiatives and quality measures.

REFERENCES

Angst, C., Agarwal, R., Gordon, G., Khuntia, J., & McCullough, J. S. (2014). Information technology and voluntary quality disclosure by hospitals. Decision Support Systems, 57.

Austin, M. M., Jha, A. K., Romano, P. S., Singer, S. J., Vogus, T. J., Wachter, R. M., & Pronovost,
P. J. (2015). National hospital ratings systems share few common scores and may generate confusion instead of clarity. Health Affairs, 34(3), 423–430. https://doi.org/10.1377/hlthaff.2014.0201

Ayanian, J. Z., & Markel, H. (2016). Donabedian's lasting framework for health care quality. New England Journal of Medicine, 375(3), 205–207. https://doi.org/10.1056/NEJMp1605101

Bardhan, I., & Thouin, M. F. (2013). Health information technology and its impact on the quality and cost of healthcare delivery. Decision Support Systems, 55(2), 438–449. https://doi.org/10.1016/j.dss.2012.10.003

Bui, Q. "Neo," Hansen, S., Liu, M., & Tu, Q. (John). (2018). The productivity paradox in health information technology. Communications of the ACM, 61(10), 78–85. https://doi.org/10.1145/3183583

Buntin, M. B., Burke, M. F., Hoaglin, M. C., & Blumenthal, D. (2011). The benefits of health information technology: A review of the recent literature shows predominantly positive results. Health Affairs, 30(3), 464–471. https://doi.org/10.1377/hlthaff.2011.0178

Chandrasekaran, A., Senot, C., & Boyer, K. K. (2012). Process management impact on clinical and experiential quality: Managing tensions between safe and patient-centered healthcare. Manufacturing and Service Operations Management, 14(4), 548–566. https://doi.org/10.1287/msom.1110.0374

Chaudhry, B., Wang, J., Wu, S., Maglione, M., Mojica, W., Roth, E., Shekelle, P. G. (2006). Improving patient care. Systematic review: Impact of health information technology on quality, efficiency, and costs of medical care. Annals of Internal Medicine, 144(10), 742–752. Retrieved from http://search.ebscohost.com/login.aspx?direct=true&db=cin20&AN=2009195180&site=ehost-live

Ding, X. (2015). The impact of service design and process management on clinical quality: An exploration of synergetic effects. Journal of Operations Management, 36, 103–114. https://doi.org/10.1016/j.jom.2015.03.006

Donabedian, A. (1966). Evaluating the quality of medical care. Milbank Memorial Fund Quarterly, 44(3), 166–206. Reprinted in Milbank Memorial Fund Quarterly, 2005, 83(4), 691–729.

Donabedian, A. (1990). The seven pillars of quality. Archives of Pathology and Laboratory Medicine, 114, 1115–1118.

George, E. I., Ročková, V., Rosenbaum, P. R., Satopää, V. A., & Silber, J. H. (2017). Mortality rate estimation and standardization for public reporting: Medicare's Hospital Compare. Journal of the American Statistical Association, 112(519), 933–947. https://doi.org/10.1080/01621459.2016.1276021

Gholami, R., Añón Higón, D., & Emrouznejad, A. (2015). Hospital performance: Efficiency or quality? Can we have both with IT? Expert Systems with Applications, 42(12), 5390–5400. https://doi.org/10.1016/j.eswa.2014.12.019

Horwitz, L. I., Wang, Y., Altaf, F. K., Wang, C., Lin, Z., Liu, S., Herrin, J. (2018). Hospital characteristics associated with postdischarge hospital readmission, observation, and emergency department utilization. Medical Care, 56(4), 281–289. https://doi.org/10.1097/MLR.0000000000000882

Medicare. (n.d.). Hospital Compare. https://www.medicare.gov/hospitalcompare/search.html

Hospital Compare Overall Ratings Resources. (n.d.). QualityNet. Retrieved July 31, 2020, from https://www.qualitynet.org/inpatient/public-reporting/overall-ratings/resources

Hu, J., Jordan, J., Rubinfeld, I., Schreiber, M., Waterman, B., & Nerenz, D. (2017). Correlations among hospital quality measures: What "Hospital Compare" data tell us. American Journal of Medical Quality, 32(6), 605–610. https://doi.org/10.1177/1062860616684012

Jena, R., Rudramuniyaiah, P. S., & Shah, V. (2020). A framework for reconciling care coordination efficiency and effectiveness using e-service implementation ambidexterity. E-Service Journal, 11(3). https://doi.org/10.2979/eservicej.11.3.03

Kaye, D. R., Norton, E. C., Ellimoottil, C., Ye, Z., Dupree, J. M., Herrel, L. A., & Miller, D. C. (2017). Understanding the relationship between the Centers for Medicare and Medicaid Services' Hospital Compare star rating, surgical case volume, and short-term outcomes after major cancer surgery. Cancer, 123(21), 4259–4267. https://doi.org/10.1002/cncr.30866

MacLean, C., & Shapiro, L. (2016). Does the Hospital Compare 5-star rating promote public health? https://doi.org/10.1377/hblog20160908.056393

McCormack, K., Willems, J., van den Bergh, J., Deschoolmeester, D., Willaert, P., Indihar Štemberger, M., Vlahovic, N. (2009). A global investigation of key turning points in business process maturity. Business Process Management Journal, 15(5), 792–815. https://doi.org/10.1108/14637150910987946

Medicare Spending Per Beneficiary (MSPB) Measure Methodology. (n.d.). QualityNet. Retrieved August 3, 2020, from https://www.qualitynet.org/inpatient/measures/mspb/methodology

Mehta, R., Paredes, A. Z., Tsilimigras, D. I., Farooq, A., Sahara, K., Merath, K., Pawlik, T. M. (2020). CMS Hospital Compare system of star ratings and surgical outcomes among patients undergoing surgery for cancer: Do the ratings matter? Annals of Surgical Oncology, 27, 3138–3146. https://doi.org/10.1245/s10434-019-08088-y

Melville, N., Kraemer, K. L., & Gurbaxani, V. (2004). Review: Information technology and organizational performance: An integrative model of IT business value. MIS Quarterly, 28(2), 283–322.

Psotka, M. A., Fonarow, G. C., Allen, L. A., Joynt Maddox, K. E., Fiuzat, M., Heidenreich, P., O'Connor, C. M. (2020). The hospital readmissions reduction program: Nationwide perspectives and recommendations. JACC: Heart Failure, 8(1), 1–11. https://doi.org/10.1016/j.jchf.2019.07.012

Roth, A., Tucker, A. L., Venkataraman, S., & Chilingerian, J. (2019). Being on the productivity frontier: Identifying "triple aim performance" hospitals. Production and Operations Management, 28(9), 2165–2183. https://doi.org/10.1111/poms.13019

Trzeciak, S., Gaughan, J. P., Bosire, J., Angelo, M., Holzberg, A. S., & Mazzarelli, A. J. (2017). Association between Medicare star ratings for patient experience and Medicare spending per beneficiary for US hospitals. Journal of Patient Experience, 4(1), 17–21. https://doi.org/10.1177/2374373516685938

Zheng, Z. (Eric), Bardhan, I., & Ayabakan, S. (2018). Did the hospital readmission reduction program achieve triple aim goals? Evidence from healthcare data analytics. In Pacific Asia Conference on Information Systems (PACIS). Retrieved from https://aisel.aisnet.org
/pacis2018/207

Dr. Sagnika Sen is an Associate Professor of Information
Systems in the School of Graduate Professional Studies at Pennsylvania State University. She received her Ph.D. from Arizona State University. Her research focuses on process performance, metrics, and incentive design in organizations, mainly the design of effective decision-making frameworks and the use of data-driven decision models to obtain analytical insights on processes and performance measures. She has published in top academic journals in the field of Information Systems such as Information Systems Research and Journal of Management Information Systems. Her work has also appeared in other prestigious academic outlets such as Decision Support Systems, Information and Management, Communications of the ACM, Human Resources Management, Service Sciences, and Journal of Managerial Psychology.
  • 131. Reproduced with permission of copyright owner. Further reproduction prohibited without permission. 6/25/22, 12:42 AM Library OneSearch https://eds.p.ebscohost.com/eds/delivery?sid=4d430855-5c95-4660-9974-129b72c95a55%40redis&vid=1&ReturnUrl=https%3a%2f%2feds.p.ebscoh… EBSCO Publishing Citation Format: APA 7th Edition (American Psychological Assoc.). References Saunders, H., Gallagher-Ford, L., Kvist, T., & Vehvilainen-Julkunen, K. (2019). Practicing healthcare professionals’ evidence-based practice competencies: An overview of systematic reviews. Worldviews on Evidence-Based Nursing, 16(3), 176. https://doi.org/10.1111/wvn.12363 Persistent link to this record (Permalink): https://search.ebscohost.com/login.aspx?direct=true&AuthType=shib&db=edsgao&AN=edsgcl.587785034&site=eds-live&scope=site&custid=s8856897
  • 132. Practicing Healthcare Professionals' Evidence-Based Practice Competencies: An Overview of Systematic Reviews Background: Evidence-based practice (EBP) competencies are essential for all practicing healthcare professionals to provide evidence-based, quality care and improved patient outcomes. The multistep EBP implementation process requires multifaceted competencies to successfully integrate best evidence into daily healthcare delivery. Aims: To summarize and synthesize the current research literature on practicing health professionals' EBP competencies (i.e., their knowledge, skills, attitudes, beliefs, and implementation) related to employing EBP in clinical decision-making. Design: An overview of systematic reviews. Methods: PubMed/MEDLINE, CINAHL, Scopus, and Cochrane Library were systematically searched for work on practicing healthcare professionals' EBP competencies published in January 2012–July 2017. A total of 3,947 publications were retrieved, of which 11 systematic reviews were eligible for a critical appraisal of methodological quality. Three independent reviewers conducted the critical appraisal using the Rapid Critical Appraisal tools developed by the Helene Fuld National Institute for Evidence-Based Practice in Nursing & Healthcare. Results: Practicing healthcare professionals' self-reported EBP knowledge, skills, attitudes, and beliefs were at a moderate to high level, but they did not translate into EBP implementation. Considerable overlap existed in the source studies across the included reviews. Few reviews reported any impact of EBP
  • 133. competencies on changes in care processes or patient outcomes. Most reviews were methodologically of moderate quality. Significant variation in study designs, settings, interventions, and outcome measures in the source studies precluded any comparisons of EBP competencies across healthcare disciplines. Linking Evidence to Action: As EBP is a shared competency, the development, adoption, and use of an EBP competency set for all healthcare professionals are a priority, along with using actual (i.e., performance-based), validated outcome measures. The widespread misconceptions and misunderstandings that still exist among large proportions of practicing healthcare professionals about the basic concepts of EBP should urgently be addressed to increase engagement in EBP implementation and attain improved care quality and patient outcomes. Keywords: evidence-based practice; knowledge; competence; systematic review; healthcare professional
  • 134. Knowledge of the principles of evidence-based practice (EBP) and skills to perform the steps of the EBP implementation process are essential competencies for all practicing healthcare professionals (Melnyk, Gallagher-Ford, & Fineout-Overholt, [16]). In nursing, competence has been defined as the "ability to perform the task with desirable outcomes under the varied circumstances of the real world" (Benner, [3], p. 304), referring to the expected knowledge, attitudes, beliefs, skills, and abilities (i.e., competencies) for successful performance of critical work functions. In health care, "core competencies offer a common shared language for all health professions for defining what all are expected to be able to do to work optimally" (Albarqouni et al., [1], p. 2). However, defining core competencies in EBP (i.e., outlining the expected EBP knowledge, skills, attitudes, beliefs, and implementation, which are crucially important for improving care quality and patient outcomes because they enable healthcare professionals to make clinical decisions grounded on best available evidence and integrate the evidence into their daily practice; Melnyk et al., [18]; Wallen et al., [34]) has been a relatively recent development both in nursing (Melnyk et al., [16]; Stevens, [28]) and in health care (Albarqouni et al., [1]). Moreover, the uptake and use of the EBP core competencies in daily practice have been slow, which hinders healthcare organizations from delivering highest quality, evidence-based health care via consistent, broad-based EBP implementation. Furthermore, systematic integration of best evidence into practice is challenging due to the complexity of the EBP implementation process consisting of multiple sequential steps, the mastery of which requires multifaceted interventions,
  • 135. such as developing individual readiness for EBP, translating and ensuring availability of best evidence in usable forms for clinical practice, and building organizational readiness, culture, and structures supportive of EBP (Melnyk, Gallagher-Ford, & Fineout-Overholt, [17]; Saunders, Vehviläinen-Julkunen, & Stevens, [25]). Similar to the idea of EBP itself (DiCenso, Cullum, & Ciliska, [6]; Sackett, Rosenberg, Gray, Haynes, & Richardson, [22]), the realization about the importance for all healthcare professionals to develop a sufficient level of EBP competence is not new, as the first Sicily statement (Dawes et al., [5]) outlined that it is a minimum requirement for all healthcare professionals to understand and implement the principles and process of EBP. To this end, two sets of nurses' EBP competencies have been developed through separate national consensus processes in the USA to evaluate practicing nurses' abilities to employ EBP (Melnyk et al., [16]) and to guide EBP professional development and education programs in nursing (Stevens, [28]). However, the EBP competencies published thus far in nursing have been self-reported and discipline-specific (i.e., they have focused on measuring the perceived EBP competencies of nurses). Although there have been a few actual (i.e., performance-based) evaluation tools developed in the last 10 years for more objective measurement of EBP competencies, they have also been discipline-specific and undertaken primarily in the fields of medicine, occupational therapy, physical therapy, and most recently, in nursing (Halm, [8]; Ilic, Nordin, Glasziou, Tilson, & Villanueva, [10]; Laibhen-Parkes, Kimble, Melnyk, Sudia, & Codone, [11]; McCluskey & Bishop, [12]; Spurlock &
  • 136. Wonder, [27]; Tilson, [29]). However, as EBP is a shared competency (i.e., the key principles and steps of the EBP process are universal and applicable to all healthcare disciplines), a unique opportunity exists to jointly develop interprofessional core competencies in EBP that objectively measure the actual EBP performance of all healthcare professionals. The Current State of Practicing Healthcare Professionals' EBP Competencies A recent integrative review on EBP readiness of nurses (Saunders & Vehviläinen-Julkunen, [24]) concluded that EBP competencies of nurses internationally are at a low to moderate level, particularly in terms of their EBP knowledge, EBP skills, and their confidence in employing EBP. These results are consistent with the findings from other recent reviews of EBP competencies across other healthcare disciplines (Mota da Silva, da Cunha Menezes Costa, Narciso Garcia, & Oliveira Pena Costa, [20]; Scurlock-Evans, Upton, & Upton, [26]; Upton, Stephens, Williams, & Scurlock-Evans, [32]). Therefore, instead of setting high performance expectations for EBP, it is essential to first focus on advancing practicing healthcare professionals' EBP competencies, before they will be capable of consistently
  • 137. implementing EBP and integrating best evidence into their daily care delivery. Once healthcare professionals are competent in EBP, they will be more likely to engage in EBP in their daily work, and patient care delivery in most healthcare organizations will likely become more evidence-based. This substantial chasm between the EBP implementation goals of healthcare organizations and the current EBP implementation capabilities of large numbers of healthcare professionals due to their low level of EBP competence is precisely the gap that urgently requires attention and immediate action in healthcare organizations worldwide. Aims The aim of this overview of systematic reviews was to summarize and synthesize the current international research literature on practicing healthcare professionals' EBP competencies (i.e., their knowledge, skills, attitudes, beliefs, and implementation of EBP) related to employing EBP in clinical decision-making. This overview addresses the following research question: What do systematic reviews published in international peer-reviewed journals state about practicing healthcare professionals' EBP competencies? Design Published systematic reviews on the EBP competencies of all practicing healthcare professionals, including nurses, physicians, physical therapists, occupational therapists, and other allied health professionals, were considered for inclusion in this overview of systematic reviews. The relevant data in the reviews were systematically extracted, summarized, and synthesized according to the guidelines
  • 138. provided by the Cochrane Collaboration (Becker & Oxman, [2]). The review process is presented according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement or guideline for reporting study methods and results (Moher, Liberati, Tetzlaff, Altman, & The PRISMA Group, [19]). Methods Systematic literature search methods were used to conduct electronic database searches in PubMed/MEDLINE, Cumulative Index for Nursing and Allied Health Literature (CINAHL), Scopus, and Cochrane Library for primary empirical studies and reviews published between January 1, 2012, and July 31, 2017 (i.e., for a period of approximately the last 5 years), without any language restrictions. With the expert assistance of a university librarian, keywords and search terms related to the various healthcare disciplines, EBP, and competencies were first searched independently and then in combination, with appropriate modifications made for the various databases (e.g., MeSH terms in PubMed). The term "research utilization" was not used as the aim of this overview of systematic reviews was to focus on healthcare professionals' EBP competencies (i.e., their EBP knowledge, skills, attitudes, beliefs, and implementation). Moreover, research utilization focuses on the retrieval, critique, and use of
  • 139. the research results from a single primary study, whereas EBP is commonly considered to be a much broader concept including research utilization and the integration of summarized and translated best evidence from several well-defined studies into clinical practice (Melnyk & Fineout-Overholt, [14]). In addition to the searched databases, authors of the included reviews were contacted for any missing key information, the reviews were reference-chased, and the lists of contents of the following peer-reviewed journals between the years of 2012–2017 were hand-searched: Worldviews on Evidence-Based Nursing, Journal of Advanced Nursing, BMC Health Services Research, BMC Medical Education, BMJ Open, Physiotherapy, and British Journal of Occupational Therapy. These journals were selected because they had published the majority of the reviews focusing on the topic of healthcare professionals' EBP competencies yielded by the systematic literature searches conducted for this overview. Inclusion and Exclusion Criteria The inclusion and exclusion criteria for systematic reviews are listed in Table S1. Systematic reviews were defined as reviews that had clearly stated aims or objectives, predetermined inclusion criteria, searched at least three databases, performed data extraction, provided a synthesis of data, and performed a quality appraisal of the included studies. To be eligible for inclusion in this overview, reviews were required to (a) focus on one or more of the outcomes of interest (i.e., EBP competencies
  • 140. of healthcare professionals), (b) fulfill the definition of a systematic review, (c) meet the inclusion and exclusion criteria, and (d) meet the benchmark set for the methodological quality of the reviews. Before undertaking this overview of systematic reviews, the Cochrane Library and the Joanna Briggs Institute Library of Systematic Reviews were searched. No published or in-progress systematic reviews or overviews of systematic reviews on this topic were found. Search Results and Data Evaluation The database searches yielded a total of 3,932 publications, and 15 additional publications were identified through other sources. Titles were screened, and duplicates as well as those not clearly indicating a focus on practicing healthcare professionals' EBP competencies were excluded. All remaining abstracts (n = 407) were screened against the purpose and inclusion criteria before being selected for further appraisal. After eliminating a total of 392 records that did not meet one or more inclusion criteria, the second screening resulted in 12 reviews. Three reviews were added through reference-chasing and hand-searching tables of content of the selected peer-reviewed journals, resulting in a total of 15 full-text reviews, which were assessed for eligibility. Four full-text reviews were excluded from the overview, as they contained no critical appraisal of methodological quality and therefore did not meet the definition of a systematic review outlined for this overview. As a result, data were extracted from 11 systematic reviews. Figure S1 details the stages of searching and selecting reviews for inclusion or exclusion using the PRISMA flow diagram (Moher et al., [19]).
  • 141. Data Extraction The following data were extracted for each of the 11 reviews and organized in a data matrix, using a standardized data extraction form developed according to the guidance from the PRISMA statement (Moher et al., [19]): author(s), country, year of publication, types of participants, settings, study design(s) included, EBP aspects reviewed, quality appraisal(s) performed, main findings, and authors' conclusions. The data were extracted by one reviewer and independently checked for accuracy and consistency by two other reviewers to ensure rigor and reproducibility. Any differences in opinion between the three researchers were discussed until a mutual agreement was formed. All 11 reviews were included in the critical appraisal of methodological quality. Critical Appraisal of Methodological Quality The overall quality and differences in quality between the included reviews were compared and contrasted, in order to help interpret the results of the reviews synthesized in this overview. The overall quality of the reviews was not used as a criterion for inclusion, as the reviews included in this overview were required to meet the definition of a systematic review,
  • 142. specific inclusion criteria, and to pass a critical appraisal of methodological quality, the main purpose of which was to ensure that the included reviews conformed to usual research norms. The criteria used by the three independent reviewers for evaluating the methodological quality were those in the Rapid Critical Appraisal (RCA) tool for systematic reviews and meta-analyses of quantitative studies developed by the Helene Fuld National Institute for Evidence-Based Practice in Nursing & Healthcare of the Ohio State University College of Nursing ([OSUCN] 2017). The reviewers used the tool to critically appraise the validity, reliability, and applicability and generalizability through independently answering a series of 15 appraisal questions and subquestions. In addition, an evaluation quantifying the strength of evidence (i.e., quality + level of evidence) in the included reviews was added to the standardized form for conducting the critical appraisal of methodological quality. The three independent reviewers critically appraised the strength of evidence as being low, moderate, or high, based on the percentage of critical appraisal criteria fulfilled (0–33%, 34–66%, and 67% and over). Any discrepancies and differences in opinion in the critical appraisals of methodological quality related to the included reviews were discussed among the three researchers until consensus was reached. The benchmark of methodological quality for the reviews included in this overview was set at a total minimum score of at least five out of a total of 15 appraisal criteria on the RCA tool fulfilled (i.e., 34%), indicating acceptable scientific rigor. Data Synthesis
  • 143. To answer the primary research question of this overview, the data from the 11 included reviews on practicing healthcare professionals' EBP competencies were summarized, analyzed, and synthesized by using guidance from the Cochrane Collaboration (Becker & Oxman, [2]). A narrative synthesis is presented, as a meta-analysis was not possible due to the heterogeneity of the source studies contained in the reviews, including substantial variation in outcomes and educational interventions, as well as the poor quality of reporting of the results in some of the included reviews. Findings Characteristics of the Systematic Reviews Included in the Overview The 11 included reviews originated from all around the globe: though the majority (n = 6, 55%) were from Europe, another two were from Australia, and one each was from Asia, South America, and North America. As expected, almost one-half (n = 5, 45%) of the included reviews originated from English-speaking countries, which traditionally comprise the nations leading the international EBP movement. Unexpectedly, the majority (n = 6, 55%) of the reviews originated from smaller countries, such as Ireland, Greece, Finland, and the Netherlands, many of which are non-English-speaking and have embarked on the EBP journey more recently. The number of source studies in the 11 included
  • 144. systematic reviews ranged from n = 6 to n = 32, with a total of 204 source studies from 24 different countries on six continents of the world. Seven (64%) of the 11 reviews included source studies using a cross-sectional survey design, another seven (64%) included randomized controlled trials (RCTs) or cluster RCTs, six (55%) included source studies using a pretest–posttest intervention or a cluster nonrandomized study design, four (36%) included qualitative study designs, two each of the 11 systematic reviews included mixed-methods study designs and longitudinal observational designs, and one each of the 11 reviews included prospective cohort designs or reviews. Although the majority (n = 7, 64%) of the 11 included systematic reviews contained one or more source studies using an experimental design (i.e., used a second group for comparison), the vast majority of the source studies were nonrandomized, one-group quasi-experimental study designs, cross-sectional surveys, or qualitative study designs. Similarly, although the vast majority of the total number of source studies used a nonrandom sample (e.g., a convenience or purposive sample), seven of the 11 (64%) systematic reviews included at least one source study that used a random sample. Only five of the 11 included reviews discussed or displayed (e.g., in their extracted data tables) the response rates of their source studies, and even when they were
  • 145. actually reported, they frequently were not reported for all source studies in the reviews. Overall, the reported response rates were relatively low, and there was wide variability in the response rates from 9% to 100%. Furthermore, healthcare professionals' EBP competencies were measured using a wide variety of published and unpublished instruments, some of which were general instruments measuring several EBP competencies, such as the EBP Questionnaire (Upton & Upton, [33]), whereas other instruments measured one specific EBP competency, such as the EBP Beliefs Scale (Melnyk & Fineout-Overholt, [13]). Selected characteristics of the included reviews (n = 11) are presented in Table S2. Participants and Practice Settings in the Systematic Reviews A total of 59,382 healthcare professionals participated in the source studies of the 11 included reviews published between January 2012 and July 2017. Healthcare disciplines represented in the reviews were primarily nursing, medicine, physical therapy, and occupational therapy, but participants from at least 10 additional allied health disciplines were included in the source studies of the reviews, as listed in the Turnbull et al. ([30]) model for allied health professionals. In almost one-half (n = 5, 45%) of the systematic reviews, the source studies focused on only one healthcare discipline (e.g., nurses). However, six of the 11 included systematic reviews contained source studies with multidisciplinary samples, which included health professionals other than nurses, doctors, physical therapists, and occupational therapists. All 11 included systematic reviews focused on practicing healthcare professionals, but four of the 11 (36%) systematic reviews also contained small subsamples of
  • 146. healthcare students in some of their source studies. The clinical settings of the source studies were poorly identified, with only general statements such as "various settings" or "any clinical setting," or the settings were not described at all in the majority (n = 7, 64%) of the included reviews. However, some of the included reviews did disclose containing source studies from hospital, primary care, and community care settings. Outcomes Measured and Overlap Between the Included Reviews
  • 147. reviews with a total of 204 source studies referred to a total of 133 separate studies, of which 48 were included in more than one review. An effort was made to avoid double counting which might lend extra weight to those study results that had been included in more than one review. A summary of the main findings from the source studies can be found in the fuller version of this overview published online. Table S3 summarizes the EBP competency outcomes of healthcare professionals from the included reviews. Overall Quality and Completeness of Reporting in the Included Systematic Reviews The overall quality of the included reviews was appraised using guidance from the Cochrane Collaboration (Becker & Oxman, [ 2]). All of the reviews met the definition of systematic reviews as outlined for this overview. Interestingly, although two of the 11 included reviews were characterized as a "scoping review" or a "systematic scoping review," they nevertheless included a critical appraisal of methodological quality of their source studies, which reflects the wide variety of terms that are used, sometimes inconsistently, to describe the various types of reviews published in the international literature. The critical appraisal of methodological quality conducted by the three reviewers with the RCA tool (OSUCN, [21]) revealed a broad range of strength of evidence among the included reviews. The benchmark for the strength of evidence indicating acceptable methodological quality was set at 34% (i.e., a total minimum score of at least 5 out of a total of 15 critical appraisal criteria fulfilled). All 11 included reviews met this minimum standard for acceptable
  • 148. scientific rigor, with 10 out of the 11 reviews appraised at moderate quality. The median score (0–15) was 8 (moderate), with the scores ranging from 5 to 10 (out of 15). Only one of the 11 included reviews barely attained a high score (i.e., a score of at least 10 out of 15 appraisal criteria fulfilled). The pronounced heterogeneity in the source studies of the included reviews in terms of their study designs, practice settings, outcome measures, outcomes of interest, and educational interventions, combined with poor and inconsistent reporting quality (e.g., not reporting source study settings) and missing or incomplete data (e.g., only one of the 11 reviews reported effect sizes for the source studies and few reported p-values or confidence intervals), prompted the results of this overview to be narratively summarized. This also precluded any comparisons of EBP competencies across healthcare disciplines. In particular, there was considerable variation in the outcome measures used in the source studies of the reviews, including unpublished, not theoretically based, and not psychometrically tested instruments, which were inconsistently or incompletely described. Moreover, many assertions were made in the reporting of the source studies, but few assertions were backed up by actual data in the reviews. Furthermore, although the educational interventions may have had a positive effect on EBP competencies, the impact of the improved EBP competencies on patient outcomes or practice changes
  • 149. remains unclear, as healthcare professionals' improved EBP competencies may not necessarily have influenced practice in any way. On the other hand, although the vast majority of the source studies in the included reviews used nonprobability sampling methods and cross-sectional survey, pretest–posttest intervention, or qualitative study designs, it is important to acknowledge that seven (64%) of the 11 reviews contained at least one RCT or cluster RCT as a source study. In total, the 11 reviews contained 33 RCTs or cluster RCTs as source studies, some of which were included in more than one review. These results are consistent with the findings of Young, Rohwer, Volmink, and Clarke ([36]), who found that despite the commonly held perception of relatively rare use of experimental study designs such as RCTs in some healthcare disciplines, the reviews included in their overview nevertheless included a total of 25 RCTs. In summary, the overall quality and completeness of evidence in the included reviews of this overview was low to moderate at best, as the majority of the reviews did not contain a comprehensive literature search, report on both included and excluded studies, or discuss the potential biases of the reviews. Lastly, some of the reviews did not report on the response rates or the number of participants in their source studies, or match the stated objectives of the review with what was actually discussed in the review.
  • 150. Discussion The first Sicily statement (Dawes et al., [5]) outlined that knowledge and understanding of the principles of EBP and skills to implement the steps of the EBP process are essential competencies for all practicing healthcare professionals. To that end, this overview of systematic reviews summarized and synthesized evidence from 11 systematic reviews containing 204 source studies that assessed the current state of the EBP competencies of practicing healthcare professionals, provided critical appraisals of their ability to implement the steps of the EBP process, and evaluated the effectiveness of various educational interventions for advancing their EBP competencies using a wide variety of study designs, outcome measures, and outcomes of interest. Although the majority of healthcare professionals across disciplines indicated familiarity with both the concept of "evidence-based practice" and the discipline-specific terms (e.g., "evidence-based nursing" or "evidence-based medicine"), widespread confusion appeared to exist among large proportions of healthcare professionals about the commonly accepted definitions of EBP and the meanings of the basic concepts related to EBP (Condon, McGrane, Mockler, & Stokes, [4]; Scurlock-Evans et al., [26]; Ubbink, Guyatt, & Vermeulen, [31]; Upton et al., [32]), which was consistent with the results of other reviews (Saunders & Vehviläinen-Julkunen, [24]). This is disconcerting because the lack of clarity about even the most basic definitions and concepts of EBP among large proportions of healthcare professionals impedes healthcare organizations from delivering the highest quality, evidence-based health care. It also may contribute to a perception among
  • 151. healthcare professionals and organizations that EBP is being implemented, when in reality, clinical care delivery is still more closely associated with the traditions, routines, and customs of opinion-based practice (Saunders & Vehviläinen-Julkunen, [24]; Wonder, Spurlock, Lancaster, & Gainey, [35]). Furthermore, large proportions of healthcare professionals across disciplines appear to hold a variety of misconceptions, misinterpretations, and misunderstandings of what actually constitutes EBP (Saunders, Stevens, & Vehviläinen-Julkunen, [23]; Scurlock-Evans et al., [26]; Upton et al., [32]). For example, Scurlock-Evans et al. ([26]) contended that physical therapists may not only be confused as to the meaning of the term
translate into EBP behaviors, as EBP implementation in daily practice was generally at a low level across disciplines (Saunders & Vehviläinen-Julkunen, [24]; Scurlock-Evans et al., [26]; Ubbink et al., [31]; Upton et al., [32]). Furthermore, although healthcare professionals' self-rated EBP knowledge and skills were higher than their EBP implementation, healthcare professionals across disciplines rated their EBP knowledge and skills to be at an insufficient level for integrating best evidence into daily practice. Perhaps for this reason, large proportions of healthcare professionals across disciplines did not use best available evidence or implement EBP in daily care delivery. This is consistent with the findings of previous studies indicating that the majority of clinicians do not consistently engage in EBP (Melnyk, Fineout-Overholt, Gallagher-Ford, & Kaplan, [15]; Melnyk et al., [17]; Wallen et al., [34]). Another concern related to the included reviews was their failure to measure the impact of healthcare professionals' EBP competencies on patient outcomes, even when doing so was explicitly stated as one of the objectives of the review. Although four of the 11 included reviews reported measuring the impact of healthcare professionals' EBP competencies on practice changes or patient outcomes as a stated objective, only one review actually discussed any results related to patient outcomes. The lack of measurement of the impact on patient outcomes of healthcare professionals' EBP competencies, and of educational interventions promoting those competencies, is consistent with the results of other reviews (Hecht, Buhse, & Meyer, [9]; Häggman-Laitila, Mattila, & Melender, [7]) and overviews (Young et al., [36]).
Limitations in the Overview

The main limitation of this overview of systematic reviews is the potential for various biases, including selection, publication, and indexing biases. To reduce the potential for bias, we followed guidance from the Cochrane Collaboration and PRISMA on the methodology for conducting rigorous systematic reviews and reporting their results, followed a prespecified review protocol, and systematically searched multiple electronic databases in collaboration with a university librarian, using keywords and search terms modified appropriately for the various databases. In addition, we searched for ongoing systematic reviews prior to undertaking this overview, reference-chased the systematic reviews included in this overview, and hand-searched the tables of contents of the peer-reviewed scientific journals in which the majority of the systematic reviews on healthcare professionals' EBP competencies had been published. As hand-searching the tables of contents did not result in additional searches, we believe that our search strategy would effectively capture most of the relevant systematic reviews published on this topic between January 2012 and July 2017. However, as in any review, it is possible that some relevant systematic reviews were not identified. Second, three reviewers independently used a study design-specific critical appraisal tool to evaluate the methodological quality of each included review, with any discrepancies and differences discussed to form a mutual agreement, which increased the reliability of the data. In addition, all of the included
reviews, originating from 10 different countries worldwide, had passed an international peer review and had been published in high-quality scientific journals. As the majority (n = 6, 55%) of the included reviews originated from non-English-speaking countries representing six different languages, publication and language biases, although possible, are unlikely. Third, self-reported assessments were used to measure healthcare professionals' EBP competencies in all of the 11 included reviews (i.e., perceived EBP competencies were assessed, instead of using more objective measures of actual performance, such as EBP knowledge tests). Because of a lack of congruence between self-reported and more objectively measured knowledge and ability, especially when measuring complex tasks such as EBP implementation (Saunders, Vehviläinen-Julkunen, et al., [25]; Scurlock-Evans et al., [26]; Wonder et al., [35]), using self-reports may result in bias (through participants giving more socially acceptable responses than nonrespondents) and in overestimation of some EBP competencies, such as EBP knowledge, for which more objective measures are available. Fourth, the search term "research utilization" was not used for our overview of systematic reviews, as the aim was to focus on the EBP competencies that practicing
healthcare professionals need to successfully integrate translated best evidence into daily clinical practice. However, we acknowledge that it is not uncommon for research utilization to be used in studies as if it were an alternative term for EBP, and therefore, we are aware that some of the published systematic reviews may have been missed by our search. Fifth, the modest methodological quality of the identified systematic reviews and the relatively low quality of reporting of the results in the systematic reviews may have affected the results of this overview. Finally, effect sizes were not reported in all but one of the included systematic reviews. Therefore, generalizability of the results is limited, and the results of this overview should be extrapolated with caution.

Implications for Practice and Research

Evidence-based practice competencies are essential for all practicing healthcare professionals in guiding their integration of best evidence into their clinical decision-making and thus enabling them to provide higher-quality care and produce better patient outcomes. However, as EBP is a shared competency and the steps of EBP implementation are universal, there is an urgent need for the collaborative development, implementation, and evaluation of an EBP competency set for all healthcare professionals (i.e., an interprofessional set of EBP competencies that can be used by all practicing healthcare professionals from any healthcare discipline). Recently, the development of a first set of such interprofessional core competencies in EBP for all healthcare professionals was published as a consensus statement based on a systematic review and Delphi survey (Albarqouni et al., [1]), which
contained 68 core competencies in EBP applicable to all healthcare professionals. This type of interprofessional core competencies in EBP for all healthcare professionals should be the focus of future research studies, as the EBP competencies will guide the development of interprofessional EBP competency measures (via self-ratings or actual performance) as well as joint EBP curricula for practicing healthcare professionals, and thus, their subsequent uptake, adoption, and use in clinical practice should be a high priority for all practicing healthcare professionals. In addition, addressing the widespread misconceptions and misunderstandings currently existing among large proportions of healthcare professionals about the basic concepts of EBP is crucially important for increasing their engagement in EBP implementation and for attaining improved care quality and patient outcomes. Nursing and some allied health disciplines, such as physical therapy and occupational therapy, have traditionally relied on measuring competencies through self-report assessments even when the constructs of interest, such as EBP knowledge, ability, or competence, could be assessed through more objective measures. Therefore, future studies should focus on developing and using actual, that is,
performance-based, validated outcome measures for EBP competencies, using rigorous study and review methodologies and robust reporting practices. Although EBP is a shared competency, implementation of EBP is a complex process requiring multifaceted educational interventions that contain interacting components, and thus, it should be investigated whether differences in healthcare professionals' primary roles and educational backgrounds across disciplines, and in contextual factors, may influence the effects of EBP educational interventions.

Conclusions

The findings of this overview of systematic reviews suggest that, irrespective of their healthcare discipline, large proportions of practicing healthcare professionals perceive their EBP competencies to be insufficient for employing EBP in daily care delivery. These perceptions, as well as widespread confusion, misconceptions, and misunderstandings about the meanings of the most basic concepts of EBP among healthcare professionals across disciplines, contribute to their low levels of EBP implementation, both in terms of the principles and in terms of the process of EBP (i.e., healthcare professionals neither using translated best evidence as the basis for clinical decision-making in daily practice nor implementing all the steps of the EBP process). As EBP is a shared competency, practicing healthcare professionals should actively participate in the uptake, adoption, and use of the interprofessional core competencies in EBP for all healthcare professionals, as well as collaboratively advance EBP implementation through the development and evaluation of the effectiveness of research-
based EBP interventions, strategies, and tools. EBP competencies are essential for all practicing healthcare professionals, as they guide healthcare professionals' integration of best evidence into their clinical decision-making and thus enable them to provide higher-quality care to patients, resulting in better patient outcomes. It is important to recognize that EBP is a shared competency; that is, the key principles and steps of the EBP implementation process are universal and applicable to all healthcare disciplines. There is an urgent need for research studies on the applicability in practice, as well as the uptake, adoption, and evaluation, of the interprofessional core competencies in EBP for all healthcare professionals recently published as a consensus statement based on a systematic review and Delphi survey (Albarqouni et al., [1]). Future research studies should also focus on developing and using actual, that is, performance-based, validated outcome measures for assessing nurses' EBP competencies, instead of continuing to evaluate perceived (i.e., self-rated) competencies via self-assessments, even when the constructs of interest, such as EBP knowledge and ability, could be assessed through more objective, performance-based measures.

Linking Evidence to Action

Figure S1. The modified PRISMA flow diagram (Moher et al., [19]): Identification, screening, and selection of systematic reviews for inclusion in the
overview.

Table S1. Inclusion and Exclusion Criteria for the Overview of Systematic Reviews. Table S2. Characteristics of Included Systematic Reviews in the Overview. Table S3. Summary Table of EBP Outcomes in the Systematic Reviews Included in the Overview.

Footnotes

1 This research was supported by grants awarded to Dr. Saunders from the Finnish Work Environment Fund, which are gratefully acknowledged.

References

1 Albarqouni, L., Hoffmann, T., Straus, S., Olsen, N. R., Young, T., Ilic, D., ... Glasziou, P. (2018). Core competencies in evidence-based practice for health professionals: Consensus statement based on a systematic review and Delphi survey. JAMA Network Open, 1(2), e180281. https://doi.org/10.1001/jamanetworkopen.2018.0281

2 Becker, L., & Oxman, A. (2011). Overviews of reviews. In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (Version 5.1.0). Retrieved from The Cochrane Collaboration website: http://www.cochrane-handbook.org
3 Benner, P. (1984). From novice to expert: Excellence and power in clinical nursing practice. Menlo Park, CA: Addison-Wesley.

4 Condon, C., McGrane, N., Mockler, D., & Stokes, E. (2016). Ability of physiotherapists to undertake evidence-based practice steps: A scoping review. Physiotherapy, 102, 10–19. https://doi.org/10.1016/j.physio.2015.06.003

5 Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., ... Osborne, J. (2005). Second International Conference of Evidence-Based Health Care Teachers and Developers: Sicily statement on evidence-based practice. BMC Medical Education, 5, 1. https://doi.org/10.1186/1472-6920-5-1

6 DiCenso, A., Cullum, N., & Ciliska, D. (1998). Implementing evidence-based nursing: Some misconceptions. Evidence-Based Nursing, 1, 38–40. https://doi.org/10.1136/ebn.1.2.38

7 Häggman-Laitila, A., Mattila, L.-R., & Melender, H.-L. (2016). Educational interventions on evidence-based nursing in clinical practice: A systematic review with qualitative analysis. Nurse Education Today, 43, 50–59. https://doi.org/10.1016/j.nedt.2016.04.023

8 Halm, M. A. (2018). Evaluating the impact of EBP education: Development of a modified Fresno test for acute care nursing. Worldviews on Evidence-Based Nursing, 15, 272–280. https://doi.org/10.1111/wvn.12291
9 Hecht, L., Buhse, S., & Meyer, G. (2016). Effectiveness of training in evidence-based medicine skills for healthcare professionals: A systematic review. BMC Medical Education, 16, 103. https://doi.org/10.1186/s12909-016-0616-2

Ilic, D., Nordin, R. B., Glasziou, P., Tilson, J. K., & Villanueva, E. (2013). Implementation of a blended learning approach to teach evidence-based practice: A protocol for a mixed-method study. BMC Medical Education, 13(170). https://doi.org/10.1186/1472-6920-13-170

Laibhen-Parkes, N., Kimble, L. P., Melnyk, B. M., Sudia, T., & Codone, S. (2018). An adaptation of the original Fresno test to measure evidence-based competence of pediatric bedside nurses. Worldviews on Evidence-Based Nursing, 15(3), 230–240. https://doi.org/10.1111/wvn.12289

McCluskey, A., & Bishop, B. (2009). The adapted Fresno test of competence in evidence-based practice. Journal of Continuing Education in the Health Professions, 29(2), 119–126.

Melnyk, B. M., & Fineout-Overholt, E. (2007). The
evidence-based practice beliefs scale. Gilbert, AZ: ARCC LLC Publishing.

Melnyk, B. M., & Fineout-Overholt, E. (Eds.). (2011). Evidence-based practice in nursing and healthcare: A guide to best practice (2nd ed.). Philadelphia, PA: Lippincott Williams & Wilkins.

Melnyk, B., Fineout-Overholt, E., Gallagher-Ford, L., & Kaplan, L. (2012). The state of evidence-based practice among U.S. nurses. Journal of Nursing Administration, 42, 410–417. https://doi.org/10.1097/NNA.0b013e3182664e0a

Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11(1), 5–15. https://doi.org/10.1111/wvn.12021

Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2016). Implementing the EBP competencies in healthcare: A practical guide for improving quality, safety, & outcomes. Indianapolis, IN: Sigma Theta Tau International.

Melnyk, B. M., Gallagher-Ford, L., Zellefrow, C., Tucker, S., Thomas, B., Sinnott, L. T., & Tan, A. (2018). The first U.S. study on nurses' evidence-based practice competencies indicates major deficits that threaten healthcare quality, safety, and patient outcomes. Worldviews on Evidence-Based Nursing, 15(1), 16–25. https://doi.org/10.1111/wvn.12269
Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(6), e1000097. https://doi.org/10.1371/journal.pmed.1000097

Mota da Silva, T., da Cunha Menezes Costa, L., Narciso Garcia, A., & Oliveira Pena Costa, L. (2015). What do physical therapists think about evidence-based practice? A systematic review. Manual Therapy, 20, 388–401. https://doi.org/10.1016/j.math.2014.10.009

Ohio State University College of Nursing. (2017). Rapid Critical Assessment (RCA) tools for systematic reviews & meta-analyses of quantitative studies and literature reviews. Columbus, OH: The Helene Fuld National Institute for Evidence-Based Nursing & Healthcare.

Sackett, D., Rosenberg, W., Gray, J., Haynes, R., & Richardson, W. (1996). Evidence-based medicine: What it is and what it isn't. British Medical Journal, 312, 71–72. https://doi.org/10.1136/bmj.312.7023.71
Saunders, H., Stevens, K. R., & Vehviläinen-Julkunen, K. (2016). Nurses' readiness for evidence-based practice at Finnish university hospitals: A national survey. Journal of Advanced Nursing, 72, 1863–1874. https://doi.org/10.1111/jan.12963

Saunders, H., & Vehviläinen-Julkunen, K. (2015). The state of readiness for evidence-based practice among nurses: An integrative review. International Journal of Nursing Studies, 56, 128–140.

Saunders, H., Vehviläinen-Julkunen, K., & Stevens, K. R. (2016). Effectiveness of an education intervention to strengthen nurses' readiness for evidence-based practice: A single-blind randomized controlled study. Applied Nursing Research, 31, 175–185. https://doi.org/10.1016/j.apnr.2016.03.004

Scurlock-Evans, L., Upton, P., & Upton, D. (2014). Evidence-based practice in physiotherapy: A systematic review of barriers, enablers and interventions. Physiotherapy, 100, 208–219. https://doi.org/10.1016/j.physio.2014.03.001

Spurlock, D., & Wonder, A. H. (2015). Validity and reliability evidence for a new measure: The evidence-based practice knowledge assessment in nursing. Journal of Nursing Education, 54, 605–613. https://doi.org/10.3928/01484834-20151016-01

Stevens, K. R. (2009). Essential evidence-based practice competencies in nursing (2nd ed.). San Antonio, TX: San Antonio Academic Center for Evidence-Based Practice, University of Texas Health Science Center.
Tilson, J. K. (2010). Validation of the modified Fresno test: Assessing physical therapists' evidence-based practice knowledge and skills. BMC Medical Education, 10(38), 38. https://doi.org/10.1186/1472-6920-10-38

Turnbull, C., Grimmer-Somers, K., Kumar, S., May, E., Law, D., & Ashworth, E. (2009). Allied, scientific and complementary health professionals: A new model for Australian allied health. Australian Health Review, 33(1), 27–37. https://doi.org/10.1071/AH090027

Ubbink, D. T., Guyatt, G. H., & Vermeulen, H. (2013). Framework of policy recommendations for implementation of evidence-based practice: A systematic scoping review. BMJ Open, 3(1), e001881. https://doi.org/10.1136/bmjopen-2012-001881

Upton, D., Stephens, D., Williams, B., & Scurlock-Evans, L. (2014). Occupational therapists' attitudes, knowledge, and implementation of evidence-based practice: A systematic review of published research. British Journal of Occupational Therapy, 77(1), 24–38. https://doi.org/10.4276/030802214X13887685335544

Upton, D., & Upton, P. (2005). Knowledge and use of evidence-based practice by allied health and health science professionals in the United Kingdom. Journal of Allied Health, 35, 127–133.

Wallen, G., Mitchell, S., Melnyk, B., Fineout-Overholt, E., Miller-Davis, C., Yates, J., & Hastings, C. (2010). Implementing evidence-based practice: Effectiveness of a structured multifaceted mentorship
programme. Journal of Advanced Nursing, 66, 2761–2771. https://doi.org/10.1111/j.1365-2648.2010.05442.x

Wonder, A. H., Spurlock, D., Lancaster, S., & Gainey, M. (2017). Comparison of nurses' self-reported and objectively measured evidence-based practice knowledge. Journal of Continuing Education in Nursing, 48, 65–70. https://doi.org/10.3928/00220124-20170119-06

Young, T., Rohwer, A., Volmink, J., & Clarke, M. (2014). What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS ONE, 9(1), e86706. https://doi.org/10.1371/journal.pone.0086706

By Hannele Saunders, Lynn Gallagher-Ford, Tarja Kvist, and Katri Vehviläinen-Julkunen. Source: Worldviews on Evidence-Based Nursing.
Journal of the American Psychiatric Nurses Association, 2020, Vol. 26(3), 288–292. © The Author(s) 2019. DOI: 10.1177/1078390319889673

Brief Report

Introduction

More than 45,000 Americans over age 10 years died by suicide in 2016, making it the 10th leading cause of death in the United States (Centers for Disease Control and Prevention, 2018). Among adults who complete suicide, in the year prior to their death, approximately 20% to 25% have an emergency department (ED) visit for deliberate self-harm (Ahmedani et al., 2014; Ahmedani et al., 2015) and associated suicidality (hereafter referred to as DSH), which can include nonsuicidal self-injury. Because EDs are providing front-line suicide prevention services, improving the overall quality of ED mental health care for patients who present with DSH represents an opportunity to intervene with these high-risk patients. While previous studies have attempted to determine the quality of care in EDs for DSH patients, the literature is lacking in recent, U.S.-based research. ED nursing managers were surveyed because of their broad knowledge of typical unit policies, practices, and
staffing structure. In addition to providing management of frontline nurses, they also oversee the organizational structure of nursing treatment for DSH patients and are therefore well-positioned to shape the processes of care for these patients. Essential to improving this process is understanding the aspects of care that managers perceive as important to providing quality care, as well as the extent to which evidence-based practices (EBPs) have translated

1 Amaya H. Diana, The University of Pennsylvania, Philadelphia, PA, USA
2 Mark Olfson, MD, MPH, Columbia University, New York, NY, USA
3 Sara Wiesel Cullen, PhD, MSW, The University of Pennsylvania, Philadelphia, PA, USA
4 Steven C. Marcus, PhD, The University of Pennsylvania, Philadelphia, PA, USA

Corresponding Author: Sara Wiesel Cullen, School of Social Policy and Practice, The University of Pennsylvania, 3701 Locust Walk, Philadelphia, PA 19104-6243, USA. Email: [email protected]

The Relationship Between Evidence-Based Practices and Emergency Department Managers' Perceptions on Quality of
Care for Self-Harm Patients

Amaya H. Diana1, Mark Olfson2, Sara Wiesel Cullen3, and Steven C. Marcus4

Abstract

OBJECTIVE: To understand the extent to which implementation of evidence-based practices affects emergency department (ED) nurse managers' perceptions of quality of care provided to deliberate self-harm patients. METHODS: ED nursing leadership from a nationally representative sample of 513 hospitals completed a survey on the ED management of deliberate self-harm patients, including the quality of care for deliberate self-harm patients on a 1 to 5 point Likert-type scale. Unadjusted and adjusted analyses, controlling for relevant hospital characteristics, examined associations between the provision of evidence-based practices and quality of care. RESULTS: The overall mean quality rating was 3.09. Adjusted quality ratings were higher for EDs that routinely engaged in discharge planning (β = 0.488) and safety planning (β = 0.736) processes. Ratings were also higher for hospitals with higher levels of mental health staff (β = 0.368) and for teaching hospitals (β = 0.319). CONCLUSION: Preliminary findings suggest a national institutional readiness for further implementation of evidence-based practices for deliberate self-harm patients.

Keywords: emergency department, deliberate self-harm, suicide prevention, evidence-based practices, quality of care
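The abstract's "adjusted" β coefficients come from regression models of the quality rating on binary practice indicators, controlling for hospital characteristics. As an illustration only (the study's actual data and model specification are not reproduced here; the dataset and coefficient values below are invented), a minimal sketch of how such coefficients can be recovered with ordinary least squares via the normal equations:

```python
# Illustrative sketch, NOT the study's analysis: fit quality rating on two
# hypothetical binary predictors (safety planning, teaching hospital) by
# solving the OLS normal equations X'Xb = X'y with plain Python.

def ols(X, y):
    """Return OLS coefficients by solving X'Xb = X'y (Gaussian elimination)."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Augmented matrix [X'X | X'y], eliminated with partial pivoting.
    A = [row[:] + [rhs] for row, rhs in zip(XtX, Xty)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (A[r][k] - sum(A[r][c] * b[c] for c in range(r + 1, k))) / A[r][r]
    return b

# Toy rows: [intercept, safety_planning, teaching_hospital] -> quality rating.
X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1]]
y = [3.0, 3.7, 3.3, 4.0]
intercept, b_safety, b_teaching = ols(X, y)
```

A real analysis of survey data like this would also apply sampling weights and report standard errors from a statistical package; the sketch only shows the mechanics by which an adjusted coefficient (here 0.7 for the toy safety-planning indicator) is obtained.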
to the ED. For instance, after a DSH event, the provision of appropriate assessment and safety planning reduces risk for repeat DSH and suicide attempts (Boudreaux et al., 2016; Stanley et al., 2018). Safety planning is a brief behavioral intervention that can be performed by nurses in the ED that involves restricting access to lethal means, teaching coping skills, identifying a social and emergency network, and building motivation for continuing mental health treatment (Stanley et al., 2018). Despite the evidence supporting the efficacy of assessment and safety planning, it remains unknown how often these strategies are actually employed in EDs or the extent to which they improve the quality of care for DSH patients. In order to assess the gap between research and practice in this area, a national survey of over 500 ED managers collected data on the extent to which EDs provide assessments, the elements of safety planning practices identified above, and mental health referrals on discharge. We then examined the extent to which implementation of these practices influenced ED nurse managers' perceptions of the quality of care provided to DSH patients.

Methods

Between May 2017 and January 2018, we mailed an ED management of DSH survey to a random sample of 665