Assessing the Utility of Consumer Surveys
for Improving the Quality of Behavioral
Health Care Services
J. Randy Koch, PhD
Alison B. Breland, PhD
Mary Nash, PhD
Karen Cropsey, PsyD
Abstract
The development and implementation of provider performance
and consumer outcome measures
for behavioral health care have been growing over the last
decade, presumably because they are
useful tools for improving service quality. However, the extent
to which providers have successfully
used performance measurement results has not been adequately
determined. To this end, two
methods were used to better understand the use of data obtained
from an annual survey of
behavioral health care consumers: a cross-sectional survey of
executive directors, clinical program
directors, and quality improvement directors and follow-up
interviews with a subsample of survey
respondents. Results revealed information about the use of
consumer survey data, factors that
facilitate and hinder the use of results, as well as respondents’
opinions about consumer survey
administration procedures. These findings provide valuable
information for the application of
performance measures and, ultimately, improving consumer
outcomes.
Address correspondence to Alison B. Breland, PhD, Institute for Drug and Alcohol Studies, Virginia Commonwealth University, McGuire Hall, Rm. B08, 1112 East Clay Street, P.O. Box 980310, Richmond, VA 23298, USA. Phone: +1-804-628-2300; Fax: +1-804-828-7862; E-mail: [email protected]
J. Randy Koch, PhD, Institute for Drug and Alcohol Studies, Virginia Commonwealth University, P.O. Box 980310, Richmond, VA, USA. Phone: +1-804-828-8633; Fax: +1-804-828-7862; E-mail: [email protected]
Mary Nash, PhD, School of Human and Organization Development, Fielding Graduate University, Santa Barbara, CA, USA. Phone: +1-757-435-6589; Fax: +1-757-435-6589; E-mail: [email protected]
Karen Cropsey, PsyD, Department of Psychiatry and Behavioral Neurobiology, University of Alabama School of Medicine, Birmingham, AL, USA. Phone: +1-205-916-0135; Fax: +1-205-940-9258; E-mail: [email protected]
This research was performed at the Virginia Commonwealth University, Institute for Drug and Alcohol Studies, 1112 East Clay Street, Suite B-08, Richmond, VA 23298.
Journal of Behavioral Health Services & Research, 2010. © 2010 National Council for Community Behavioral Healthcare.
Introduction
Over the past decade, there has been significant growth in the
development and implementation
of provider performance and consumer outcome measures for
the behavioral health care field. The
Federal Substance Abuse and Mental Health Services
Administration has been at the forefront in
the development of performance measures for the public
behavioral health care system and has
sponsored several initiatives that have facilitated the acceptance
of performance measurement as an
essential business practice, including the Mental Health
Statistics Improvement Program (MHSIP)
Consumer-Oriented Report Card,1 the Outcomes Roundtable for
Children and Families, and the
Forum on Performance Measures. The private sector has also
been actively promoting the use of
performance measures through accreditation organizations that
may require or strongly encourage
provider organizations and health plans to establish
performance measurement systems (e.g., the
ORYX program operated by The Joint Commission, the Health
Care Effectiveness Data and
Information Set sponsored by the National Committee for
Quality Assurance (NCQA), and the
uSPEQ system of the Council on Accreditation of Rehabilitation
Facilities (CARF)).
A key rationale for requiring the implementation of performance
measures is their presumed
value to providers for improving the quality of services they
deliver to consumers; importantly,
increased consumer satisfaction has been associated with better
drug use outcomes after treatment.2
However, although there is anecdotal evidence that some
providers have successfully used
performance measurement results to improve the quality of
services, the extent to which this has
occurred has not been determined. In addition, how the usefulness of performance data can be improved, and what barriers impede the use of performance measurement data for quality improvement (QI), remain to be explored. Given the
widespread use of performance
measures in both the public and private sectors and the cost
associated with their implementation, it
is critical that these and related questions be addressed.
Performance measurement systems are diverse and often
complicated. In particular, these
systems may include a variety of different types of measures
(e.g., standardized clinical instruments
and measures generated from administrative data). However, for
a variety of reasons (e.g., cost,
ease of data collection, and the desire to obtain input directly
from consumers), consumer surveys
have become popular, common components of existing
measurement systems. For example,
consumer surveys are now widely used to assess consumers’
perceptions of service accessibility,
cultural sensitivity, and treatment outcomes.3,4 In fact, states are now requested to provide
some key performance indicators, including the results of a
consumer survey, as part of the Federal
mental health (MH) block grant.5 Further, consumer surveys
can be used along with other sources
of data (e.g., administrative data) to more fully understand a
particular issue. As an example,
administrative data might show that cost for a service is
relatively low, but consumer survey data
could show that consumers actually perceive the cost to be
burdensome. Using multiple sources of
data in this way can improve consumer services and highlights
the importance of consumer
surveys. Thus, determining the usefulness of consumer surveys
for improving the quality of
behavioral health care services is particularly important.
Consumer surveys have been used in behavioral health care for
over a decade and are an
established tool for assessing the quality of care.3,6–10 Studies
have shown that higher levels of
satisfaction are significantly associated with appropriate
technical quality of care11 and with longer
lengths of stay, which in turn is associated with more positive
clinical outcomes.12 Satisfaction with
access and satisfaction with effectiveness of substance use
disorder treatment have been found to
be significantly associated with abstinence from substance use
at 1 year.13 Additionally, higher
satisfaction with provider interpersonal relationships has also
been found to be associated with
quality care.14
Despite being considered a key measure of health care quality,
the actual utility of consumer
survey outcomes for quality improvement remains largely
unexamined and unrealized.15–18 The
literature indicates several possible reasons for the lack of
integration of quality improvement
principles and efforts into behavioral health care, including a
lack of consensus on meaningful and
feasible measures of care,19 an inability to specifically define
the elements of high quality care,20
and a scarcity of comparative results.21 Also, potential self-report bias,22 the formidable scope of the challenge,17,23 and organizational or system characteristics
such as change management and
readiness for change, culture, knowledge management and
information dissemination, support, and
infrastructure are often cited.22–26
To date, there have been very few studies published in peer-
reviewed journals examining the use of
consumer survey data for quality improvement in behavioral
health care programs. In one recent study,
staff members were supported in a quality improvement
intervention using data from a consumer
survey.27 In another, interviews were conducted with senior
health professionals to determine barriers
to using patient survey data in quality improvement.28
However, no studies have measured actual use
of consumer survey data for quality improvement. This study
had two main goals:
1. To better understand the extent to which community
treatment programs (CTPs) use
consumer survey data, as well as which factors facilitated and
hindered use
2. To explore which consumer survey and organizational
characteristics are related to the use of
consumer survey data.
Method
Project site
This study was conducted with CTPs located in Virginia. The
behavioral health care system in
Virginia is particularly well suited as a site for a study on the
use of consumer surveys for several
reasons. First, the public sector in Virginia uses the MHSIP
Consumer Survey29 and the Youth
Services Survey for Families (YSSF30), the two most widely
used consumer surveys for public
behavioral health care services. Second, Virginia has a long
history of using these surveys. The
MHSIP Consumer Survey has been conducted annually since
1996, and the YSSF has been
conducted annually since 2000. Third, Virginia has developed
relatively sophisticated reports of
consumer survey results that include case-mix adjusted results
and individual comparisons to
“similar” CTPs identified through cluster analyses.
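The article does not say which clustering algorithm Virginia used to identify "similar" CTPs; the sketch below is one plausible approach, k-means on standardized organizational characteristics. The feature values, feature names, and the choice of four clusters are invented for illustration.

```python
# A minimal sketch (not the state's actual method) of grouping CTPs into
# "similar" clusters for benchmarking: k-means on standardized features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features per CTP, e.g., caseload, staff count, % rural clients
X = rng.normal(size=(40, 3))  # 40 CSBs, as in Virginia's public system

X_std = StandardScaler().fit_transform(X)  # put features on a common scale
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_std)

# Each CTP's survey results would then be compared only with CTPs that
# share its cluster label.
for c in range(4):
    print(f"Cluster {c}: CTPs {np.flatnonzero(labels == c).tolist()}")
```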
Procedures
The study had two components: (1) development, field testing,
and administration of a cross-
sectional web-based survey, based on issues/themes identified
through discussion groups with key
stakeholders and a review of the literature; and (2)
semistructured follow-up interviews conducted
with selected survey respondents to examine in greater depth
issues identified through the web-
based survey.
Development of the cross-sectional survey
Discussion groups were convened in order to generate items for
the CTP survey. These groups
were conducted separately with a panel of national experts and
agency staff (leaders/managers)
from a sample of public and private CTPs. The national expert
discussion group was conducted via
conference call with seven persons recognized as leaders in the
field of behavioral health care
performance measurement and QI, who have specific experience
with consumer surveys.
Participants in this discussion group were identified by
soliciting nominations from national
organizations with a particular interest in this area such as
MHSIP, The Joint Commission, NCQA,
CARF, and the Outcomes Roundtable for Children and Families.
Consent was obtained through e-
mail. Participants in the discussion group for national experts
did not receive compensation.
Four other discussion groups were also conducted—two with
community services board (CSB;
agencies of local government responsible for providing
community-based behavioral health care
services in Virginia) staff and two with private-sector groups.
Potential participants were identified
from contact lists maintained by the Virginia Department of
Behavioral Health and Developmental
Services (DBHDS), which funds and licenses all 40 of
Virginia’s CSBs, and by a private-sector
behavioral health care management services organization.
Prospective participants were selected
using a stratified random sampling procedure in which staff
were stratified by type of position
(executive director, mental health program director, substance
abuse (SA) program director, child/
adolescent program director, and QI/program evaluation
director) and geographic area (rural vs.
urban). Written informed consent was obtained from
participants immediately prior to the initiation
of each discussion group. These participants (or their
organizations) were each paid a $50.00
stipend to cover transportation costs and were provided lunch.
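As a concrete illustration of the sampling step just described, here is a minimal sketch, assuming a hypothetical staff roster, that stratifies staff by position and geographic area and draws randomly within each stratum.

```python
# A minimal sketch of stratified random sampling by (position, area).
# The roster records are hypothetical.
import random
from collections import defaultdict

roster = [
    {"name": "A", "position": "executive director", "area": "rural"},
    {"name": "B", "position": "QI/program evaluation director", "area": "urban"},
    {"name": "C", "position": "SA program director", "area": "urban"},
    {"name": "D", "position": "MH program director", "area": "rural"},
    # ... one record per eligible staff member
]

def stratified_sample(staff, per_stratum, seed=0):
    """Draw up to `per_stratum` people from every (position, area) stratum."""
    random.seed(seed)
    strata = defaultdict(list)
    for person in staff:
        strata[(person["position"], person["area"])].append(person)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, min(per_stratum, len(members))))
    return sample

print(stratified_sample(roster, per_stratum=1))
```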
For the discussion groups with public sector staff, a total of 17
persons from both rural and urban
CSBs participated in two groups. For the discussion groups with
private-sector staff, a total of six
persons participated in two groups. Both groups were made up
of a variety of executive directors,
clinical program directors, and QI directors.
A set of standardized questions was used to guide each
discussion, focusing on the following
topics: how consumer survey data are used; factors that
facilitate and hinder use; and
organizational, staff, and clinical factors related to use. These
group discussions were digitally
recorded and transcribed, and transcripts were reviewed to
identify particular issues and
themes.
Outcomes from the discussion groups indicated that CTPs may use consumer survey data for quality assurance, staff development/supervision, public relations, meeting accreditation requirements, and supporting funding requests, among other purposes. In
addition, several factors were
identified that may facilitate the effective use of these data,
including survey items that are
actionable (e.g., specific items facilitate use while items that
are too broad/general do not provide
useful information), data analysis and reporting that provide
information at the program and
clinician levels with practical interpretation and
recommendations, wide dissemination of data/
reports, and the active involvement of different stakeholders,
including consumers. Factors that
may hinder the use of these data were also identified, including
lack of CTP expertise in the
analysis and interpretation of data, lack of technical support
from funding agencies, and the lack of
timely reporting of results. Other themes that emerged included
concern about staff burden, the
cost–benefit of consumer surveys, how funding agencies or
other stakeholders will use the data, the
need to have multiple consumer surveys to address the unique
needs of individual programs, and
concern about the validity of the data due in part to survey
items written at too high a grade level or
containing professional jargon. Participants were also interested
in flexible consumer surveys that
could address local issues and concerns (e.g., by being able to
add items and to identify the results
for particular programs within a larger agency).
To assess the themes identified through the discussion groups,
as well as a review of the
literature, the CTP Survey was developed. Specific issues
addressed through the survey included
the extent to which respondents had read the most recent report
describing the consumer survey
data and the extent to which they used data from the consumer
survey (for quality assurance, for
quality improvement, to provide feedback to consumers, to
provide information to community
organizations, to provide feedback as a part of staff
supervision/staff performance, to enhance staff
morale, to demonstrate accountability, to meet accreditation
requirements, to support budget
requests, and to identify training/technical assistance needs; 24
items). Other questions asked about
respondents’ perceptions regarding what factors support or
hinder the use of consumer survey data
(e.g., the timeliness of reports, the literacy level of items, the
clarity of actions to be taken based on
survey results; 21 items), what factors they think influence the
usefulness of this data (e.g., being
able to customize items, providing practical interpretation of
results, conducting training on the use
of data for QI, having local consumers participate in the
interpretation and use of survey results; ten
items), and their satisfaction with the procedures for
administering the survey (five items).
Additional items were included in the survey to obtain
information about each CTP (e.g., staffing,
training, funding, and the general use of data in decision
making) that could be used to examine the
relationship between CTP organizational characteristics and
their use of consumer surveys (12
items). Finally, ten items were included that captured
demographic data on survey respondents.
Administration of the CTP survey
The CTP Survey was web-based and administered to selected
staff at all 40 CSBs in Virginia
and to selected staff at eight private facilities that operate a
variety of community treatment
programs. Instructions indicated that the questionnaire should
be completed by each CSB’s
executive director and the directors of mental health services,
substance abuse services, and
children’s mental health services, as well as the person who
directs each CSB’s QI/program
evaluation activities where such a position exists (resulting in a
maximum of five persons per
CSB). For facilities in the private sector, instructions indicated
that the survey should be completed
by the executive director, program directors for outpatient
mental health, substance abuse, and
adolescent services, as well as the QI director in facilities that
have such a position.
The survey was sent (via a web link) to individuals whose
names and e-mail addresses were
obtained from databases maintained at the DBHDS, Virginia
Association of Community Services
Boards (VACSB), and from a private-sector behavioral health
care management services
organization. The survey was conducted using a modified
Dillman method.31 Thus, a total of
three electronic mailings were conducted. The first mailing
included a cover letter with a link to the
web-based questionnaire. Approximately 1 week later, a
reminder e-mail was sent to all
nonrespondents along with the link to the questionnaire.
Finally, 2 weeks after the first reminder
message was sent, a third reminder was e-mailed to
nonrespondents. Individuals who participated
in the survey were entered into a drawing in which six randomly
selected respondents would
receive $100 in cash to be used for professional development.
Unfortunately, the private-sector
consumer satisfaction survey was discontinued after the
discussion groups and just prior to
administration of the online survey. Although the participants
were encouraged to complete this
survey based on their most recent experience with their
satisfaction survey, there were only six
completed surveys from the private-sector group. This was an
insufficient number of cases for
analysis, and this group was dropped from analysis of the CTP
data as well as the final phase of the
study (i.e., the follow-up interviews described below).
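The three-mailing schedule described above is simple enough to express directly; the sketch below illustrates only the timing logic. The send_email function and the recipient address are hypothetical stand-ins, not part of the study's materials.

```python
# A minimal sketch of the modified Dillman schedule: an initial e-mail with
# the survey link, a reminder about 1 week later to nonrespondents, and a
# final reminder 2 weeks after that.
from datetime import date, timedelta

def send_email(address: str, message: str) -> None:
    print(f"to {address}: {message}")  # hypothetical transport

recipients = {"director@csb.example": False}  # address -> has responded?

first = date(2009, 5, 1)  # illustrative start date
reminder_dates = [first + timedelta(weeks=1), first + timedelta(weeks=3)]

for addr in recipients:
    send_email(addr, f"{first}: cover letter with survey link")
for when in reminder_dates:
    for addr, responded in recipients.items():
        if not responded:
            send_email(addr, f"{when}: reminder with survey link")
```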
Follow-up interviews
In order to provide a more in-depth examination about how
consumer survey data are used, follow-
up telephone interviews were conducted with 16 CSB staff,
drawn from each respondent group (i.e.,
executive directors, quality managers, and clinical program
directors). Participants who reported the
highest (n=9) and lowest use (n=7) of consumer survey data
were selected based on their answers to
items on the web-based survey in which they reported on their
use of the consumer survey for each of
11 specific purposes (e.g., QI, staff supervision, and
accreditation). All 11 items were dichotomously
scored, with a score of 1 indicating that the respondent had used
the consumer survey for that purpose.
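A minimal sketch of this scoring scheme, with made-up respondent answers, shows how the dichotomous items sum to a use score and how respondents would be ranked for interview selection.

```python
# A minimal sketch of the 11-item dichotomous use score: 1 point per purpose
# the respondent endorsed; respondents with the highest and lowest totals
# would be invited to follow-up interviews. Answers here are illustrative.
PURPOSES = [
    "quality improvement", "quality assurance", "demonstrate accountability",
    "feedback to individual staff", "meet accreditation requirements",
    "enhance staff morale", "community education",
    "identify training/tech assistance needs", "staff supervision",
    "support budget requests", "individual staff performance",
]

def use_score(answers):
    """Sum of dichotomous items (True -> 1) across the 11 purposes."""
    return sum(1 for p in PURPOSES if answers.get(p, False))

respondents = {
    "resp_01": {"quality improvement": True, "quality assurance": True},
    "resp_02": {"support budget requests": True},
}
ranked = sorted(respondents, key=lambda r: use_score(respondents[r]), reverse=True)
print([(r, use_score(respondents[r])) for r in ranked])
```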
Potential interviewees were notified by e-mail that they had
been selected for the follow-up
interview. If an individual declined to participate in the follow-
up interview, he/she was replaced by
the survey respondent who was the next lowest/highest user of
consumer survey data. Each
interview was approximately 20 to 30 min in duration, and all
participants or their organizations
were paid $25 for the interview. A semistructured interview
format was used to provide a more in-
depth examination about how consumer survey data are used to
improve service quality, perceived
utility of these data, facilitating factors, obstacles, and
strategies for overcoming obstacles.
Results
Respondent characteristics
CTP questionnaire
Given staff vacancies and persons serving in more than one
position (e.g., the same staff person
serving as both the SA and MH director), there were a total of
150 potential respondents. Of that
number, 77 completed the survey. Specifically, 64.1% (N=25)
of the executive directors and 86.2%
(N=25) of the quality managers completed the survey. The
response rates for the MH, SA, and child/
family directors were 31%, 32%, and 25%, respectively. Given
the small number of respondents from
staff in these position categories (N=27), respondents from
these three groups were combined into a
“clinical program directors” group. The response rate for this
group was 32.9%.
Of the 77 respondents who completed the survey, 46.8% were
male, most were Caucasian
(93.5%; 6.5% were African American), and most were between
the ages of 40 and 49 (29%) or 50
and 59 (45.2%). Most (64%) reported having a master’s degree,
9% reported having a doctorate,
6% reported having a bachelor’s degree, and 21% did not
indicate a degree. Most respondents
indicated that their discipline was in social work (30%) or
psychology (30%), and fewer indicated
medicine/nursing/rehab counseling/other (13%), business (9%),
marriage and family therapy (5%),
and education (3%). Ten percent did not indicate a discipline.
Respondents reported several certifications, with most
indicating an LCSW (26%), LPC
(14.3%), or “other” (26%). In addition, 26% of respondents
indicated no certification. Overall,
respondents had worked in behavioral health for an average of
24 years (SD=7.7) and had worked
at their current CSB for 15.3 years (SD=8.3).
Respondent characteristics—follow-up interviews
As described earlier, participants were divided into high and
low users by using a scale
computed by adding positive responses from 11 dichotomously
scored items. Scores ranged from 1
to 9. Participants identified as high users of the consumer
survey included three executive directors,
three clinical program directors, and three QI directors, and
they came from both urban and rural
CSBs. Their mean use score was 7.8 (SD=1.2).
Participants identified as low users of the consumer survey
included one executive director, three
clinical program directors, and three QI directors, and they
came from both urban and rural CSBs.
Their mean use score was 1.0 (SD=0.0). Notably, only one executive director who indicated very low use agreed to participate in this portion of the study. Overall,
most executive directors indicated at
least some use of consumer survey results (72%).
CTP survey results
Use of consumer surveys
Most staff reported having read either part of the most recent
consumer survey report (61%) or
the entire report (13%). When asked if they had used the
consumer survey for each of 11 different
purposes, the largest percentages of staff indicated that they
used the data for quality improvement
and for quality assurance, while the smallest percentages of
staff indicated that they used consumer
survey data to support budget requests and to evaluate
individual staff performance (see Fig. 1).
Factors that support use of consumer surveys
CTP staff members were asked to rate 21 factors identified
through the literature and discussion
groups that support the active use of the consumer surveys.
Staff rated each factor on a five-point,
Likert-type scale (strongly disagree to strongly agree) on the
extent to which the factor was true for
the consumer survey and how it is implemented at their CTP. As
shown in Table 1, some items
were highly rated, while smaller percentages of staff agreed
with other items. The most highly
rated items included those about adequate staff training and
staff support for conducting the
consumer survey, as well as items concerning the usefulness of
comparing results with other
organizations, agency leadership discussing the survey results,
and sharing the results throughout
the organization. Few respondents indicated that their
organization involves consumers in using the
survey results or that it is clear how to improve services based
on the results.
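As a concrete illustration of how the Table 1 statistics are produced, here is a minimal sketch, with made-up ratings, that computes the percent answering "agree" or "strongly agree" plus the mean and SD for one five-point item.

```python
# A minimal sketch of summarizing one five-point Likert item
# (1 = strongly disagree ... 5 = strongly agree); ratings are illustrative.
from statistics import mean, stdev

responses = [5, 4, 4, 3, 2, 4, 5, 3, 4, 1]

pct_agree = 100 * sum(1 for r in responses if r >= 4) / len(responses)
print(f"agree/strongly agree: {pct_agree:.1f}%")
print(f"mean: {mean(responses):.2f}, SD: {stdev(responses):.2f}")
```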
Factors that influence the usefulness of consumer surveys
Ten factors were identified in the literature and through the
discussion groups that may be related
to increasing the usefulness of consumer surveys. CTP staff
rated each of these in terms of how
“important you think each factor is/would be to your
organization in facilitating the use of
consumer surveys to improve service quality.” Each factor was
rated on a four-point scale of “not
at all important,” “low importance,” “medium importance,” and
“high importance.” As shown in
Table 2, factors rated the most important concerned providing
information on the practical
interpretation of results and data on individual programs and the
ability to customize items as needed.
Figure 1
Percent of respondents who answered "yes" to questions about their use of Consumer Survey data during the past year (N=68–75, depending on question): quality improvement, 59.7%; quality assurance, 56%; demonstrate accountability, 53.5%; feedback to individual staff, 53.5%; meet accreditation requirements, 49.3%; enhance staff morale, 46.5%; community education, 38%; identify training/tech assistance needs, 33.8%; staff supervision, 30%; support budget requests, 22.9%; individual staff performance, 11.3%.
Table 1
Factors that support use of consumer surveys (items in order of most highly rated to least highly rated)

Item | Percent "agree" or "strongly agree" | Mean^a | SD
Staff are adequately trained to conduct the consumer survey | 79.3 | 3.84 | 0.88
Comparing survey results with other organizations is useful | 78.5 | 3.82 | 0.85
Staff support conducting the consumer survey | 77.2 | 3.80 | 0.64
The leadership discusses the results of the consumer survey | 72.7 | 3.73 | 1.05
Consumer survey results are shared throughout the organization | 69.2 | 3.54 | 0.97
There are other organizations that are similar enough to us that our consumer survey results can be compared | 62.1 | 3.50 | 0.85
The consumers who complete the consumer survey at our organization are representative of the consumers we serve | 54.7 | 3.38 | 0.88
The consumer survey report provides information my organization needs to improve the quality of services we provide | 47.7 | 3.25 | 0.88
My organization has a designated team responsible for ensuring that the results of the consumer survey are used | 44.6 | 3.12 | 1.14
My organization shares our consumer survey results with stakeholders outside of our organization | 44.6 | 3.05 | 0.99
Survey items give detailed information about the quality of services | 40.9 | 3.08 | 0.93
The consumer survey accurately assesses the perceptions of consumers served by my organization | 40.0 | 3.25 | 0.75
The survey items use too much professional jargon | 33.8 | 2.94 | 0.90
The consumer survey report is provided to this organization in a timely manner | 33.8 | 2.69 | 1.20
My organization has an established process to use the results of the consumer survey | 30.3 | 2.73 | 1.06
Consumers served by my organization do not like to complete the consumer survey | 29.2 | 3.23 | 0.70
The literacy level required to understand the survey items is appropriate for the consumers we serve | 28.8 | 2.91 | 0.91
Consumer culture encourages positive responses to the survey items | 24.6 | 3.11 | 0.75
My organization shares the results of the consumer survey with consumers | 22.7 | 2.71 | 0.97
It is clear what actions should be taken to improve services based on survey results | 12.1 | 2.56 | 0.79
Consumers and/or family members actively participate in using the results of the consumer survey | 3.0 | 2.17 | 0.71

^a Scores ranged from 1 (strongly disagree) to 5 (strongly agree)
Interestingly, many respondents indicated that it is
important for local consumers to
participate in the interpretation and use of survey results,
although other results indicated that few
organizations share consumer survey data with consumers and
few actively involve consumers in
using the survey results.
Procedures for administering consumer surveys
Each year, all CSBs in Virginia are asked to administer the
Adult Consumer Survey to all
consumers receiving services during a 1-week period, and a sample drawn from all parents/guardians of youth served is mailed the Youth Services Survey for Families.
Detailed procedures are provided
to guide the CSBs in administering the surveys. These
procedures were also assessed in the CTP
Questionnaire; respondents answered questions about either the
Adult Consumer Survey or the
Youth Services Survey for Families (results collapsed across
survey type). Most respondents said
they were satisfied with the procedures (61.9%). Fewer
respondents were satisfied with the analysis
and reporting of the results (49.2%). Also, most respondents
indicated a preference that the survey
be administered twice per year (73%), as opposed to
quarterly/monthly (14.3%), once per year
(6.3%), less than once per year (3.2%), or not at all (3.2%).
Table 2
Factors that influence the usefulness of consumer surveys

Item | Percent indicating "medium" or "high" importance | Mean^a | SD
Data analysis and reporting include practical interpretation of results | 92.0 | 3.35 | 0.77
Results are reported for individual programs | 90.4 | 3.43 | 0.80
Survey items can be customized to meet the needs of specific programs or to address current issues | 85.7 | 3.22 | 0.89
Survey items can be added to the survey to meet the needs of specific programs or to address current issues | 84.1 | 3.30 | 0.85
Comparisons are provided to similar provider organizations | 77.4 | 3.10 | 0.90
There is local information technology capacity to analyze and report our consumer survey data | 76.2 | 3.10 | 0.98
There is local QI/evaluation staff available to conduct analyses of our consumer survey data | 73.0 | 3.03 | 0.95
It is important for local consumers to participate in the interpretation and use of survey results | 70.9 | 2.89 | 0.96
Results are provided for individual survey items | 67.8 | 2.95 | 0.90
Training is provided on how to interpret/use results | 61.9 | 2.83 | 0.98

^a Scores ranged from 1 (not important) to 4 (high importance)
Finally, most participants indicated that the benefits of
conducting the consumer survey were
equal to or greater than the cost/burden of conducting the
survey (62.3%).
Provider organizational characteristics
Several survey questions asked respondents to rate their
organizations on issues such as staffing,
training, funding, physical space, coordination/collaboration,
and the availability, credibility, and
relevance of data used in decision making. As shown in Table 3, few respondents agreed or strongly agreed with items regarding the availability of adequate staffing, funding, and physical space. More respondents agreed with items regarding the integration of new knowledge, techniques, and new services and programs; the existence of productive collaborations/partnerships; employees' understanding of how their work relates to the goals and mission of the organization; and the high quality of training. Results on data needed for decision making were mixed: while most agreed that these data are available and relevant, fewer indicated that they are credible.

Table 3
Provider organizational characteristics (items in order of most highly rated to least highly rated)

Item | Percent "agree" or "strongly agree" | Mean^a | SD
Our staff integrate new knowledge and techniques into their work to improve the way in which services are provided | 88.5 | 4.05 | 0.69
We regularly integrate new services, programs, and/or initiatives if they are needed | 85.0 | 3.95 | 0.77
We have collaborations/partnerships with external groups that facilitate important priorities, new programs, and/or initiatives for consumers | 83.7 | 3.90 | 0.77
Employees understand how their work is related to the goals or mission of our organization | 83.6 | 3.97 | 0.66
The training and development programs for staff are of high quality | 80.3 | 3.90 | 0.65
We have a high level of coordination across units and/or departments when it comes to delivering services and programs to consumers | 60.6 | 3.36 | 1.07
We have the necessary physical space for the services and programs we run | 19.6 | 2.25 | 1.14
We have few difficulties in adequately staffing our organization | 16.0 | 2.00 | 1.07
We have funding available to introduce new programs and/or initiatives if they are needed | 11.5 | 2.11 | 0.99

Questions regarding data
Data needed for decision making are available | 60.7 | 3.44 | 0.87
Data needed for decision making are relevant | 58.3 | 3.50 | 0.78
Data needed for decision making are credible | 50.8 | 3.38 | 0.82

^a Scores ranged from 1 (strongly disagree) to 5 (strongly agree)
Exploratory analyses
Additional correlational analyses were conducted to better
understand the relationship between
respondents’ actual use of consumer survey data and items
about the usefulness of the survey,
factors that support survey use, preferred frequency of survey
administration, and organizational
characteristics. To accomplish this analysis, the mean use score
variable was used (as described
earlier, this variable was computed by adding positive answers
from 11 dichotomously scored
items; total mean=4.25, SD=2.75). This variable was then
correlated with items from the above
categories, and the Benjamini-Hochberg procedure32 was used
to control for false positives. In
total, 47 correlations were conducted, and 11 were significant
after the procedure was applied.
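For readers unfamiliar with the cited procedure,32 a minimal sketch of the Benjamini-Hochberg step-up rule follows; the p-values are illustrative, not the study's actual values.

```python
# A minimal sketch of the Benjamini-Hochberg procedure: sort the m p-values,
# find the largest rank i with p_(i) <= (i/m)*q, and reject all hypotheses
# with rank up to i. This controls the false discovery rate at level q.
def benjamini_hochberg(pvalues, q=0.05):
    """Return booleans: True where the null hypothesis is rejected."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    cutoff = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / m * q:
            cutoff = rank  # keep the largest qualifying rank
    rejected = [False] * m
    for rank, idx in enumerate(order, start=1):
        rejected[idx] = rank <= cutoff
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
print(benjamini_hochberg(pvals, q=0.05))  # illustrative input, not study data
```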
Results indicated significant correlations between higher use of the consumer survey data and the following items: having read the Consumer Survey Report (r=0.29; p<.05), agreement that survey items give detailed information (r=0.31; p<.05), agreement that the annual report provides information needed to improve the quality of services (r=0.46; p<.01), and agreement that respondents' organizations share results throughout the organization (r=0.40; p<.01), with consumers (r=0.50; p<.01), and with stakeholders (r=0.54; p<.01). In addition, respondents' actual use was correlated with the likelihood that respondents' organizations had an established process to use the results of the consumer survey (r=0.57; p<.01), that consumers and/or family members actively participate in using the results (r=0.34; p<.01), and that their organization has a designated person/team responsible for ensuring that the results are used (r=0.45; p<.01). Actual use was also correlated with respondents indicating that it is important for results to be provided for each individual survey item (r=0.32; p<.05) and that funding is available to introduce new programs/initiatives if needed (r=0.35; p<.01).
Results from the follow-up interviews
Follow-up interviews allowed for a more in-depth examination
into how consumer survey data
are used to improve service quality, as well as participants’
perceived utility of these data,
facilitating factors, obstacles, and strategies for overcoming
obstacles. Participants in the follow-up
interviews named similar factors to those in the CTP survey,
such as the lack of timely data, the
lack of specific data (i.e., by program), and the lack of
resources (time, money, staff) and ability to
interpret the data and implement related QI initiatives. In
addition, participants felt that consumer
surveys may have questionable validity (consumers may not
understand the questions, response
rates may be poor, sample may not be representative) and that
the consumer survey could be more
useful for QI purposes if the results were provided for each
item, specific to program, location, or
clinician rather than by organization; if the results were more
timely (within 6 months or, best case,
real-time); and if training was provided on how to interpret and
utilize the data for quality
improvement processes.
Participants reported several features of the current consumer
survey report that facilitate its use
by their CTPs, including the ability to benchmark with state
averages; the ability to examine trends
across years; the inclusion of written comments by consumers;
the inclusion of analyses that they
do not have the capacity to conduct; and the inclusion of
graphs, visual aids, and summaries that
facilitate understanding of reported data.
The follow-up interview participants who reported using the
consumer survey reports to inform
quality improvement initiatives indicated that such initiatives
focused on access to services (n=4),
intake processes (n=3), and wait lists (n=3). For example, one
high-use participant described:
. . . rearranging our scheduling of emergency services so people will have easier contact and . . . even a quicker response from our staff . . . added teleconferencing . . . videoconferencing at two locations . . . and that has helped a great deal.
Another high-use participant stated:
. . . by changing the way we do business. For example, having
walk-ins – where people can just walk in and get
services without having an appointment – that's just one way
that we changed.
Similarly, another participant stated:
We identified that there were problems with returning phone
calls . . . we implemented some changes in program
practices and that brought those scores up.
In addition, most of the high-use interviewees (five out of nine)
reported that they had quality
improvement processes in place at their CSBs and were able to
articulate these systems and
processes. More specifically, one participant stated:
It's part of our continuous quality improvement . . . our city is becoming very involved in management by results and in many ways, we are the model for that . . . we are used to collecting outcome data and using it to guide our programs.
Fewer of the low-use interviewees (three out of seven) reported
that they had quality
improvement processes in place in their CSBs. The three low-use participants who did report
having quality improvement processes in place reported being
unable to use the consumer survey
data for this purpose, again due to the inhibiting factors
previously described. Interestingly, high
users reported being able to overcome these inhibiting factors
primarily through conducting in-
house data analyses.
Conclusions
The results of this study indicate that there are a variety of
factors related to survey content, data
analysis/reporting, and technical support/resources that should
be addressed in order to improve the
likelihood that consumer surveys provide useful data and that
these data are actually used to
improve treatment services.
Overall, results from the CTP survey and the follow-up
interviews revealed general satisfaction with
the consumer survey and the protocol used for its
administration. Quite interestingly, the major
dissatisfaction was with the frequency with which the survey
has been conducted; the majority of
respondents wanted to conduct the survey more frequently,
although only if results were quickly
available. The only other major area of dissatisfaction was the
long delay between conducting the
survey and receiving a report on the results. In addition, study
participants indicated that it is important
to have practical interpretation of the results, that local
consumers participate in the interpretation, and
that items can be customized to meet the needs of specific
programs or to address current issues.
Few participants indicated that their organization has a
designated team responsible for ensuring that
the results of the consumer survey are used or that they have an
established process for using the results
of the consumer survey. Similar barriers to using consumer
surveys have been reported in one other
study, such as the lack of an effective quality improvement
infrastructure, a lack of expertise with
survey data, and a lack of timely and specific results.28 Clearly,
these barriers need to be addressed.
Further, analyses conducted looking at the relationship between
respondents’ actual use of the
consumer surveys and other variables indicated that use was
significantly correlated with survey
results giving detailed information and recommendations,
sharing information with a broad range
of stakeholders, the likelihood of having an established process
and team to use survey results, and
having consumers/family members participate in this process.
Potentially, increasing use of
consumer survey results might be accomplished by addressing
these factors, such as having a
supportive environment and a quality improvement process in
place.
Given the findings of the study, consumer survey use could be
improved by addressing a variety
of concerns about survey content and administration procedures,
as well as program-level and/or
systems-level issues. Survey content could be improved by
giving individual CTPs the ability to
add items that address issues of local concern and by reporting
survey results by individual
programs within a given CTP. Survey administration could be
improved by conducting the surveys
every 6 months rather than annually and by decreasing the
amount of time from the administration
of the survey to the reporting of results.
At a program level, the use of consumer survey data could be
improved by conducting a series of
regional workshops coincident with the release of the consumer
survey reports to discuss the findings
and their implications and to assist CTPs in developing action
plans to address weaknesses identified by
the survey. In addition, further training could be provided to
CTP staff, especially quality managers, on
data analysis/interpretation and QI technology. CTPs could be
encouraged to establish standing QI
committees to review, respond to, and disseminate consumer
survey results to CTP staff, their boards of
directors, and their consumers and family members. Further,
CTPs could also involve consumers in
quality assurance or consumer advisory committees and
empower these groups by tasking them with
making recommendations. Finally, CTPs could post results in
waiting rooms and provide results to
consumers at intake. Some of these techniques are potentially
free or low cost, an essential component
for organizations such as CTPs that often function with limited
resources.
At a systems level, incentives could be provided using
performance contracting, similar to
previous work in this area33 but using the results of the
consumer surveys as the performance
measurement instead of or in addition to other measures. Also,
using other health care
organizations’ work on QI interventions as a model,27 a
statewide QI team could review consumer
survey data and related information, make recommendations,
oversee the implementation of
changes, and monitor outcomes.
A limitation of this study is the low response rate for program
directors, despite the use of the
Dillman method (three electronic mailings) and the potential to
receive $100 for professional
development. Thus, the results’ generalizability to all types of
behavioral health care program
directors may be limited. This is of particular concern since
program directors would typically play
the lead role in initiating and implementing program changes.
As recent policy and research initiatives have made apparent, improving the quality of behavioral health care is an important priority, due in part to the contribution of behavioral health conditions to the total global burden of illness and their impact on all aspects of life.17,24 Several studies
have provided recommendations for
improving the quality of behavioral health care, including ways
to use satisfaction data to guide quality
improvement.34–36 While information is becoming more
readily available regarding quality improve-
ment in health care in general37,38 as well as for behavioral
health care,23,39–41 historically, behavioral
health has lagged behind general medical services in integrating
quality improvement principles into
practice.42 Better understanding and then addressing issues
related to the use of performance measures
can help to improve the impact of these measures and,
ultimately, improve services for clients being
treated in the behavioral health care system.
Implications for Behavioral Health
Consumer surveys have been used in behavioral health care for
decades, but their actual utility
for quality improvement remains largely unexamined. By asking
CTPs about their use of consumer
survey data, the strengths and weaknesses of consumer surveys
can be addressed. Findings from
this study indicate that, among other results, CTP staff prefer a
rapid turnaround of results, multiple
administrations each year, and the opportunity to customize
items. CTP staff also indicated that
having adequate staff training, staff and leadership support, and
dissemination of results helped
support the use of consumer survey data.
Community treatment providers should be urged to actively
incorporate results of consumer
surveys into their organizational planning. Suggestions for
improving the use of consumer survey
data include addressing administration preferences,
disseminating findings throughout CTPs and
regions, training CTP staff on data interpretation, and
establishing QI committees to address these
issues. Consumers themselves should be encouraged to
participate in this process. Performance
contracting, using consumer surveys for measurement, is
another possibility.
These steps have the potential to improve the actual use of
consumer survey data, which in turn
can help improve services for clients being treated in the
behavioral health care system.
Acknowledgements
This work was funded by grants from the Commonwealth Health
Research Board and the
Virginia Department of Behavioral Health and Developmental
Services. We would also like to
thank Rehana Kader for her assistance with this project.
References
1. Mental Health Statistics Improvement Program. Mental Health Statistics Report Card. 1996. Available at: http://www.mhsip.org/reportcard/index.html. Accessed May 5, 2009.
2. Zhang Z, Gerstein DR, Friedmann P. Patient satisfaction and sustained outcomes of drug abuse treatment. Journal of Health Psychology 2008; 13(3):388–400.
3. Eisen SV, Shaul JA, Leff HS, et al. Toward a national consumer survey: Evaluation of the CABHS and MHSIP instruments. The Journal of Behavioral Health Services and Research 2001; 28(3):347–369.
4. Perreault M, Katerelos TE, Tardif H, et al. Patients' perspectives on information received in outpatient psychiatric treatment. Journal of Psychiatric and Mental Health Nursing 2006; 13(1):110–116.
5. Center for Mental Health Services. Community Mental Health Services Block Grant Application Guidance and Instructions FY 2008: Transforming Mental Healthcare in America. Washington, DC: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, 2007.
6. Bjorkman T, Hansson L, Svensson B, et al. What is important in psychiatric outpatient care? Quality of care from the patient's perspective. International Journal for Quality in Health Care 1995; 7:355–362.
7. Campbell J, Einspahr K. Building partnerships in accountability: Consumer satisfaction. In Dickey B, Sederer LI (eds), Improving Mental Health Care: Commitment to Quality. Washington, DC: American Psychiatric, 2001, pp. 101–113.
8. Druss BG, Rosenheck RA, Stolar M. Patient satisfaction and administrative measures as indicators of the quality of mental health care. Psychiatric Services 1999; 50(8):1053–1058.
9. Pellegrin KL, Stuart GW, Frueh BC, et al. A brief scale for assessing patients' satisfaction with care in outpatient psychiatric services. Psychiatric Services 2001; 52:816–819.
10. Young MP. An analysis of the concept 'patient satisfaction' as it relates to contemporary nursing care. Journal of Advanced Nursing 1996; 24(6):1241–1248.
11. Edlund MJ, Young AS, Kung FY, et al. Does satisfaction reflect the technical quality of mental health care? Health Services Research 2003; 38(2):631–645.
12. Sanders LM, Trinh C, Sherman BR, et al. Assessment of client satisfaction in a peer counseling substance abuse treatment program for pregnant and postpartum women. Evaluation & Program Planning 1998; 21(3):287–296.
13. Carlson MJ, Gabriel RM. Patient satisfaction, use of services, and one-year outcomes in publicly funded substance abuse treatment. Psychiatric Services 2001; 52(9):1230–1236.
14. Meredith LS, Orlando M, Humphrey N, et al. Are better ratings of the patient–provider relationship associated with quality care for depression? Medical Care 2001; 39(4):349–360.
15. Chilvers R, Harrison G, Sipos A, et al. Evidence into practice: Application of psychological models of change in evidence-based implementation. British Journal of Psychiatry 2002; 181:99–101.
16. Garland AF, Kruse M, Aarons GA. Clinicians and outcomes measurement: What's the use? Journal of Behavioral Health Services & Research 2003; 30(4):393–405.
17. Manderscheid RW. Don't let the future repeat the past. Behavioral Healthcare 2006; 26(5):54–56.
18. Rawson RA, Marinelli-Casey P, Ling W. Dancing with strangers: Will the U.S. substance abuse practice and research organizations build mutually productive relationships? Addictive Behaviors 2002; 27(6):941–949.
19. Hermann RC, Palmer H, Leff S, et al. Achieving consensus across diverse stakeholders on quality measures for mental healthcare. Medical Care 2004; 42(12):1246–1250.
20. Lennox RD, Mansfield AJ. A latent variable model of evidence-based quality improvement for substance abuse treatment. The Journal of Behavioral Health Services & Research 2001; 28(2):164–177.
21. Hermann RC, Provost MM. Interpreting measurement data for quality improvement: Standards, means, norms, and benchmarks. Psychiatric Services 2003; 54(5):655–657.
22. Beutler LE. Comparisons among quality assurance systems: From outcome assessment to clinical utility. Journal of Consulting and Clinical Psychology 2001; 69(2):197–204.
23. Proctor EK. Leverage points for the implementation of evidence-based practice. Brief Treatment and Crisis Intervention 2004; 4(3):227–242.
24. Patel KK, Butler B, Wells KB. What is necessary to transform the quality of mental health care? Health Affairs 2006; 25(3):681–694.
25. Rosenheck RA. Organizational process: A missing link between research and practice. Psychiatric Services 2001; 52(12):1607–1612.
26. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment 2002; 22(4):171–182.
27. Davies E, Shaller D, Edgman-Levitan S, et al. Evaluating the use of a modified CAHPS® survey to support improvements in patient-centered care: Lessons from a quality improvement collaborative. Health Expectations 2008; 11:160–176.
28. Davies E, Cleary PD. Hearing the patient's voice? Factors affecting the use of patient survey data in quality improvement. Quality and Safety in Health Care 2005; 14:428–432.
29. Mental Health Statistics Improvement Program (MHSIP). The MHSIP Consumer Satisfaction Survey and the Youth Services Survey for Families. 2000. Available at: http://www.mhsip.org/surveylink.htm. Accessed May 5, 2009.
30. Brunk M, Koch JR. Assessing the outcomes of children's mental health services: Youth Services Survey for Families. Poster presented at: The 14th Annual Research Conference, A System of Care for Children's Mental Health: Expanding the Research Base; February 2001; Tampa, FL.
31. Dillman DA. Mail and Telephone Surveys: The Total Design Method. New York: Wiley, 1978.
32. Thissen D. Quick and easy implementation of the Benjamini-Hochberg procedure for controlling the false positive rate in multiple comparisons. Journal of Educational and Behavioral Statistics 2002; 27(1):77–83.
33. McLellan AT. Improving public addiction treatment through performance contracting: The Delaware experiment. Health Policy 2008; 87:276–308.
34. Beaudin CL, Kramer TL. Evaluation and treatment of depression in managed care (part II): Quality improvement programs. Disease Management & Health Outcomes 1997; 13(5):307–316.
35. Eisen SV. Patient satisfaction and perceptions of care. In IsHak WW, Burt T, Sederer LI (eds), Outcome Measurement in Psychiatry: A Critical Review. Washington, DC: American Psychiatric, 2002, pp. 303–320.
36. Hermann RC. Linking outcome measurement with process measurement for quality improvement. In IsHak WW, Burt T, Sederer LI (eds), Outcome Measurement in Psychiatry: A Critical Review. Washington, DC: American Psychiatric, 2002, pp. 23–34.
37. Pelletier L, Beaudin C (eds). Q Solutions: Essential Resources for the Healthcare Quality Professional. National Association for Healthcare Quality, 2006.
38. Teeley KH, Lowe JM, Beal J, Knapp ML. Incorporating quality improvement concepts and practice into a community health nursing course. Journal of Nursing Education 2006; 45(2):86–90.
39. McGilloway S, Donnelly M, Mangan B, et al. Qualitative assessment and improvement in mental health services: A qualitative study. Journal of Mental Health 1999; 8(5):489–499.
40. Rago WV, Reid WH. Total quality management strategies in mental health systems. Journal of Mental Health Administration 1991; 18(3):253–264.
41. Roman PM, Johnson JA. Adoption and implementation of new technologies in substance abuse treatment. Journal of Substance Abuse Treatment 2002; 22(4):211–218.
42. Beaudin CL. The face of quality. Quality Progress 2006; 39(2):15.

Assessing the Utility of Consumer Surveysfor Improving the Q.docx

  • 1. Assessing the Utility of Consumer Surveys for Improving the Quality of Behavioral Health Care Services J. Randy Koch, PhD Alison B. Breland, PhD Mary Nash, PhD Karen Cropsey, PsyD Abstract The development and implementation of provider performance and consumer outcome measures for behavioral health care have been growing over the last decade, presumably because they are useful tools for improving service quality. However, the extent to which providers have successfully used performance measurement results has not been adequately determined. To this end, two methods were used to better understand the use of data obtained from an annual survey of behavioral health care consumers: a cross-sectional survey of executive directors, clinical program directors, and quality improvement directors and follow-up interviews with a subsample of survey respondents. Results revealed information about the use of consumer survey data, factors that facilitate and hinder the use of results, as well as respondents’ opinions about consumer survey administration procedures. These findings provide valuable information for the application of performance measures and, ultimately, improving consumer
  • 2. outcomes. Address correspondence to Alison B. Breland, PhD, Institute for Drug and Alcohol Studies, Virginia Commonwealth University, McGuire Hall, Rm. B08, 1112 East Clay Street( P.O. Box 980310, Richmond, VA 23298, USA. Phone: +1-804- 6282300; Fax: +1-804-8287862; E-mail: [email protected] J. Randy Koch, PhD, Institute for Drug and Alcohol Studies, Virginia Commonwealth University, P.O. Box 980310, Richmond, VA, USA. Phone: +1-804-8288633; Fax: +1-804- 8287862; E-mail: [email protected] Mary Nash, PhD, School of Human and Organization Development, Fielding Graduate University, Santa Barbara, CA, USA. Phone: +1-757-4356589; Fax: +1-757-4356589; E-mail: [email protected] Karen Cropsey, PsyD, Department of Psychiatry and Behavioral Neurobiology, University of Alabama School of Medicine, Birmingham, AL, USA. Phone: +1-205-9160135; Fax: +1-205-9409258; E-mail: [email protected] This research was performed at the Virginia Commonwealth University, Institute for Drug and Alcohol Studies, 1112 East Clay Street, Suite B-08, Richmond, VA 23298. Journal of Behavioral Health Services & Research, 2010. c) 2010 National Council for Community Behavioral Healthcare. 234 The Journal of Behavioral Health Services & Research 38:2 April 2011
  • 3. Introduction Over the past decade, there has been significant growth in the development and implementation of provider performance and consumer outcome measures for the behavioral health care field. The Federal Substance Abuse and Mental Health Services Administration has been at the forefront in the development of performance measures for the public behavioral health care system and has sponsored several initiatives that have facilitated the acceptance of performance measurement as an essential business practice, including the Mental Health Statistics Improvement Program (MHSIP) Consumer-Oriented Report Card,1 the Outcomes Roundtable for Children and Families, and the Forum on Performance Measures. The private sector has also been actively promoting the use of performance measures through accreditation organizations that may require or strongly encourage provider organizations and health plans to establish performance measurement systems (e.g., the ORYX program operated by The Joint Commission, the Health Care Effectiveness Data and Information Set sponsored by the National Committee for Quality Assurance (NCQA), and the uSPEQ system of the Council on Accreditation of Rehabilitation Facilities (CARF)). A key rationale for requiring the implementation of performance measures is their presumed value to providers for improving the quality of services they deliver to consumers; importantly, increased consumer satisfaction has been associated with better drug use outcomes after treatment.2
  • 4. However, although there is anecdotal evidence that some providers have successfully used performance measurement results to improve the quality of services, the extent to which this has occurred has not been determined. In addition, understanding how the usefulness of performance data can be improved, as well as identifying barriers to using performance measurement data for quality improvement (QI), needs to be explored. Given the widespread use of performance measures in both the public and private sectors and the cost associated with their implementation, it is critical that these and related questions be addressed. Performance measurement systems are diverse and often complicated. In particular, these systems may include a variety of different types of measures (e.g., standardized clinical instruments and measures generated from administrative data). However, for a variety of reasons (e.g., cost, ease of data collection, and the desire to obtain input directly from consumers), consumer surveys have become popular, common components of existing measurement systems. For example, consumer surveys are now widely used to assess consumers’ perceptions of service accessibility, cultural sensitivity, and treatment outcomes.3,4 In fact, the States are now requested to provide some key performance indicators, including the results of a consumer survey, as part of the Federal mental health (MH) block grant.5 Further, consumer surveys can be used along with other sources of data (e.g., administrative data) to more fully understand a particular issue. As an example, administrative data might show that cost for a service is relatively low, but consumer survey data
  • 5. could show that consumers actually perceive the cost to be burdensome. Using multiple sources of data in this way can improve consumer services and highlights the importance of consumer surveys. Thus, determining the usefulness of consumer surveys for improving the quality of behavioral health care services is particularly important. Consumer surveys have been used in behavioral health care for over a decade and are an established tool for assessing the quality of care.3,6–10 Studies have shown that higher levels of satisfaction are significantly associated with appropriate technical quality of care11 and with longer lengths of stay, which in turn is associated with more positive clinical outcomes.12 Satisfaction with access and satisfaction with effectiveness of substance use disorder treatment have been found to be significantly associated with abstinence from substance use at 1 year.13 Additionally, higher satisfaction with provider interpersonal relationships has also been found to be associated with quality care.14 Despite being considered a key measure of health care quality, the actual utility of consumer survey outcomes for quality improvement remains largely unexamined and unrealized.15–18 The Assessing the Utility of Consumer Surveys KOCH et al. 235 literature indicates several possible reasons for the lack of integration of quality improvement principles and efforts into behavioral health care, including a
  • 6. lack of consensus on meaningful and feasible measures of care,19 an inability to specifically define the elements of high quality care,20 and a scarcity of comparative results.21 Also, potential self- report bias,22 the formidableness of the challenge,17,23 and organizational or system characteristics such as change management and readiness for change, culture, knowledge management and information dissemination, support, and infrastructure are often cited.22–26 To date, there have been very few studies published in peer- reviewed journals examining the use of consumer survey data for quality improvement in behavioral health care programs. In one recent study, staff members were supported in a quality improvement intervention using data from a consumer survey.27 In another, interviews were conducted with senior health professionals to determine barriers to using patient survey data in quality improvement.28 However, no studies have measured actual use of consumer survey data for quality improvement. This study had two main goals: 1. To better understand the extent to which community treatment programs (CTPs) use consumer survey data, as well as which factors facilitated and hindered use 2. To explore which consumer survey and organizational characteristics are related to the use of consumer survey data. Method
  • 7. Project site This study was conducted with CTPs located in Virginia. The behavioral health care system in Virginia is particularly well suited as a site for a study on the use of consumer surveys for several reasons. First, the public sector in Virginia uses the MHSIP Consumer Survey29 and the Youth Services Survey for Families (YSSF30), the two most widely used consumer surveys for public behavioral health care services. Second, Virginia has a long history of using these surveys. The MHSIP Consumer Survey has been conducted annually since 1996, and the YSSF has been conducted annually since 2000. Third, Virginia has developed relatively sophisticated reports of consumer survey results that include case-mix adjusted results and individual comparisons to “similar” CTPs identified through cluster analyses. Procedures The study had two components: (1) development, field testing, and administration of a cross- sectional web-based survey, based on issues/themes identified through discussion groups with key stakeholders and a review of the literature; and (2) semistructured follow-up interviews conducted with selected survey respondents to examine in greater depth issues identified through the web- based survey. Development of the cross-sectional survey Discussion groups were convened in order to generate items for the CTP survey. These groups
  • 8. were conducted separately with a panel of national experts and agency staff (leaders/managers) from a sample of public and private CTPs. The national expert discussion group was conducted via conference call with seven persons recognized as leaders in the field of behavioral health care performance measurement and QI, who have specific experience with consumer surveys. Participants in this discussion group were identified by soliciting nominations from national organizations with a particular interest in this area such as MHSIP, The Joint Commission, NCQA, 236 The Journal of Behavioral Health Services & Research 38:2 April 2011 CARF, and the Outcomes Roundtable for Children and Families. Consent was obtained through e- mail. Participants in the discussion group for national experts did not receive compensation. Four other discussion groups were also conducted—two with community services board (CSB; agencies of local government responsible for providing community-based behavioral health care services in Virginia) staff and two with private-sector groups. Potential participants were identified from contact lists maintained by the Virginia Department of Behavioral Health and Developmental Services (DBHDS), which funds and licenses all 40 of Virginia’s CSBs, and by a private-sector behavioral health care management services organization. Prospective participants were selected using a stratified random sampling procedure in which staff
  • 9. were stratified by type of position (executive director, mental health program director, substance abuse (SA) program director, child/ adolescent program director, and QI/program evaluation director) and geographic area (rural vs. urban). Written informed consent was obtained from participants immediately prior to the initiation of each discussion group. These participants (or their organizations) were each paid a $50.00 stipend to cover transportation costs and were provided lunch. For the discussion groups with public sector staff, a total of 17 persons from both rural and urban CSBs participated in two groups. For the discussion groups with private-sector staff, a total of six persons participated in two groups. Both groups were made up of a variety of executive directors, clinical program directors, and QI directors. A set of standardized questions was used to guide each discussion, focusing on the following topics: how consumer survey data are used; factors that facilitate and hinder use; and organizational, staff, and clinical factors related to use. These group discussions were digitally recorded and transcribed, and transcripts were reviewed to identify particular issues and themes. Outcomes from the discussion groups indicated that CTPs may use consumer survey data for quality assurance, staff development/supervision, public relations, to meet accreditation require- ments, and to support funding requests, among other reasons. In addition, several factors were identified that may facilitate the effective use of these data,
  • 10. including survey items that are actionable (e.g., specific items facilitate use while items that are too broad/general do not provide useful information), data analysis and reporting that provide information at the program and clinician levels with practical interpretation and recommendations, wide dissemination of data/ reports, and the active involvement of different stakeholders, including consumers. Factors that may hinder the use of these data were also identified, including lack of CTP expertise in the analysis and interpretation of data, lack of technical support from funding agencies, and the lack of timely reporting of results. Other themes that emerged included concern about staff burden, the cost–benefit of consumer surveys, how funding agencies or other stakeholders will use the data, the need to have multiple consumer surveys to address the unique needs of individual programs, and concern about the validity of the data due in part to survey items written at too high a grade level or containing professional jargon. Participants were also interested in flexible consumer surveys that could address local issues and concerns (e.g., by being able to add items and to identify the results for particular programs within a larger agency). To assess the themes identified through the discussion groups, as well as a review of the literature, the CTP Survey was developed. Specific issues addressed through the survey included the extent to which respondents had read the most recent report describing the consumer survey data and the extent to which they used data from the consumer survey (for quality assurance, for quality improvement, to provide feedback to consumers, to
  • 11. provide information to community organizations, to provide feedback as a part of staff supervision/staff performance, to enhance staff morale, to demonstrate accountability, to meet accreditation requirements, to support budget requests, and to identify training/technical assistance needs; 24 items). Other questions asked about respondents’ perceptions regarding what factors support or hinder the use of consumer survey data (e.g., the timeliness of reports, the literacy level of items, the clarity of actions to be taken based on Assessing the Utility of Consumer Surveys KOCH et al. 237 survey results; 21 items), what factors they think influence the usefulness of this data (e.g., being able to customize items, providing practical interpretation of results, conducting training on the use of data for QI, having local consumers participate in the interpretation and use of survey results; ten items), and their satisfaction with the procedures for administering the survey (five items). Additional items were included in the survey to obtain information about each CTP (e.g., staffing, training, funding, and the general use of data in decision making) that could be used to examine the relationship between CTP organizational characteristics and their use of consumer surveys (12 items). Finally, ten items were included that captured demographic data on survey respondents. Administration of the CTP survey The CTP Survey was web-based and administered to selected
  • 12. staff at all 40 CSBs in Virginia and to selected staff at eight private facilities that operate a variety of community treatment programs. Instructions indicated that the questionnaire should be completed by each CSB’s executive director and the directors of mental health services, substance abuse services, and children’s mental health services, as well as the person who directs each CSB’s QI/program evaluation activities where such a position exists (resulting in a maximum of five persons per CSB). For facilities in the private sector, instructions indicated that the survey should be completed by the executive director, program directors for outpatient mental health, substance abuse, and adolescent services, as well as the QI director in facilities that have such a position. The survey was sent (via a web link) to individuals whose names and e-mail addresses were obtained from databases maintained at the DBHDS, Virginia Association of Community Services Boards (VACSB), and from a private-sector behavioral health care management services organization. The survey was conducted using a modified Dillman method.31 Thus, a total of three electronic mailings were conducted. The first mailing included a cover letter with a link to the web-based questionnaire. Approximately 1 week later, a reminder e-mail was sent to all nonrespondents along with the link to the questionnaire. Finally, 2 weeks after the first reminder message was sent, a third reminder was e-mailed to nonrespondents. Individuals who participated in the survey were entered into a drawing in which six randomly selected respondents would
  • 13. receive $100 in cash to be used for professional development. Unfortunately, the private-sector consumer satisfaction survey was discontinued after the discussion groups and just prior to administration of the online survey. Although the participants were encouraged to complete this survey based on their most recent experience with their satisfaction survey, there were only six completed surveys from the private-sector group. This was an insufficient number of cases for analysis, and this group was dropped from analysis of the CTP data as well as the final phase of the study (i.e., the follow-up interviews described below). Follow-up interviews In order to provide a more in-depth examination about how consumer survey data are used, follow- up telephone interviews were conducted with 16 CSB staff, drawn from each respondent group (i.e., executive directors, quality managers, and clinical program directors). Participants who reported the highest (n=9) and lowest use (n=7) of consumer survey data were selected based on their answers to items on the web-based survey in which they reported on their use of the consumer survey for each of 11 specific purposes (e.g., QI, staff supervision, and accreditation). All 11 items were dichotomously scored, with a score of 1 indicating that the respondent had used the consumer survey for that purpose. Potential interviewees were notified by e-mail that they had been selected for the follow-up interview. If an individual declined to participate in the follow- up interview, he/she was replaced by the survey respondent who was the next lowest/highest user of
  • 14. consumer survey data. Each interview was approximately 20 to 30 min in duration, and all participants or their organizations 238 The Journal of Behavioral Health Services & Research 38:2 April 2011 were paid $25 for the interview. A semistructured interview format was used to provide a more in- depth examination about how consumer survey data are used to improve service quality, perceived utility of these data, facilitating factors, obstacles, and strategies for overcoming obstacles. Results Respondent characteristics CTP questionnaire Given staff vacancies and persons serving in more than one position (e.g., the same staff person serving as both the SA and MH director), there were a total of 150 potential respondents. Of that number, 77 completed the survey. Specifically, 64.1% (N=25) of the executive directors and 86.2% (N=25) of the quality managers completed the survey. The response rates for the MH, SA, and child/ family directors were 31%, 32%, and 25%, respectively. Given the small number of respondents from staff in these position categories (N=27), respondents from these three groups were combined into a “clinical program directors” group. The response rate for this group was 32.9%.
  • 15. Of the 77 respondents who completed the survey, 46.8% were male, most were Caucasian (93.5%; 6.5% were African American), and most were between the ages of 40 and 49 (29%) or 50 and 59 (45.2%). Most (64%) reported having a master’s degree, 9% reported having a doctorate, 6% reported having a bachelor’s degree, and 21% did not indicate a degree. Most respondents indicated that their discipline was in social work (30%) or psychology (30%), and fewer indicated medicine/nursing/rehab counseling/other (13%), business (9%), marriage and family therapy (5%), and education (3%). Ten percent did not indicate a discipline. Respondents reported several certifications, with most indicating an LCSW (26%), LPC (14.3%), or “other” (26%). In addition, 26% of respondents indicated no certification. Overall, respondents had worked in behavioral health for an average of 24 years (SD=7.7) and had worked at their current CSB for 15.3 years (SD=8.3). Respondent characteristics—follow-up interviews As described earlier, participants were divided into high and low users by using a scale computed by adding positive responses from 11 dichotomously scored items. Scores ranged from 1 to 9. Participants identified as high users of the consumer survey included three executive directors, three clinical program directors, and three QI directors, and they came from both urban and rural CSBs. Their mean use score was 7.8 (SD=1.2). Participants identified as low users of the consumer survey
  • 16. included one executive director, three clinical program directors, and three QI directors, and they came from both urban and rural CSBs. Their mean use score was 1.0 (SD=0.0). Notably, only one executive director indicating very low use agreed to participate in this portion of this study. Overall, most executive directors indicated at least some use of consumer survey results (72%). CTP survey results Use of consumer surveys Most staff reported having read either part of the most recent consumer survey report (61%) or the entire report (13%). When asked if they had used the consumer survey for each of 11 different purposes, the largest percentages of staff indicated that they used the data for quality improvement and for quality assurance, while the smallest percentages of staff indicated that they used consumer survey data to support budget requests and to evaluate individual staff performance (see Fig. 1). Assessing the Utility of Consumer Surveys KOCH et al. 239 Factors that support use of consumer surveys CTP staff members were asked to rate 21 factors identified through the literature and discussion groups that support the active use of the consumer surveys. Staff rated each factor on a five-point, Likert-type scale (strongly disagree to strongly agree) on the extent to which the factor was true for
  • 17. the consumer survey and how it is implemented at their CTP. As shown in Table 1, some items were highly rated, while smaller percentages of staff agreed with other items. The most highly rated items included those about adequate staff training and staff support for conducting the consumer survey, as well as items concerning the usefulness of comparing results with other organizations, agency leadership discussing the survey results, and sharing the results throughout the organization. Few respondents indicated that their organization involves consumers in using the survey results or that it is clear how to improve services based on the results. Factors that influence the usefulness of consumer surveys Ten factors were identified in the literature and through the discussion groups that may be related to increasing the usefulness of consumer surveys. CTP staff rated each of these in terms of how “important you think each factor is/would be to your organization in facilitating the use of consumer surveys to improve service quality.” Each factor was rated on a four-point scale of “not at all important,” “low importance,” “medium importance,” and “high importance.” As shown in Table 2, factors rated the most important concerned providing information on the practical interpretation of results and data on individual programs and the ability to customize items as Figure 1 Percent of respondents who answered “yes” to questions about their use of Consumer Survey data during the past year (N=68–75, depending on question)
  • 18. 11.3 22.9 30 33.8 38 46.5 49.3 53.5 53.5 56 59.7 0 10 20 30 40 50 60 70 Individual staff performance Support budget requests Staff supervision Identify training/tech assistance needs Community education Enhance staff morale
  • 19. Meet accreditation requirements Feedback to individual staff Demonstrate accountability Quality assurance Quality improvement Percent (%) 240 The Journal of Behavioral Health Services & Research 38:2 April 2011 T a b le 1 F ac to rs th at su p p o
  • 59. ly d is ag re e) to 5 (s tr o n g ly ag re e) Assessing the Utility of Consumer Surveys KOCH et al. 241 needed. Interestingly, many respondents indicated that it is important for local consumers to participate in the interpretation and use of survey results, although other results indicated that few organizations share consumer survey data with consumers and few actively involve consumers in using the survey results. Procedures for administering consumer surveys Each year, all CSBs in Virginia are asked to administer the
  • 60. Adult Consumer Survey to all consumers receiving services during a 1-week period, and a sample of all parents/guardians of youth served are mailed the Youth Services Survey for Families. Detailed procedures are provided to guide the CSBs in administering the surveys. These procedures were also assessed in the CTP Questionnaire; respondents answered questions about either the Adult Consumer Survey or the Youth Services Survey for Families (results collapsed across survey type). Most respondents said they were satisfied with the procedures (61.9%). Fewer respondents were satisfied with the analysis and reporting of the results (49.2%). Also, most respondents indicated a preference that the survey be administered twice per year (73%), as opposed to quarterly/monthly (14.3%), once per year (6.3%), less than once per year (3.2%), or not at all (3.2%). Table 2 Factors that influence the usefulness of consumer surveys Item Percent indicating “medium” or “high” importance Meana SD Data analysis and reporting include practical interpretation of results 92.0% 3.35 0.77 Results are reported for individual programs 90.4 3.43 0.80 Survey items can be customized to meet the
  • 61. needs of specific programs or to address current issues 85.7 3.22 0.89 Survey items can be added to the survey to meet the needs of specific programs or to address current issues 84.1 3.30 0.85 Comparisons are provided to similar provider organizations 77.4 3.10 0.90 There is local information technology capacity to analyze and report our consumer survey data 76.2 3.10 0.98 There is local QI/evaluation staff available to conduct analyses of our consumer survey data 73.0 3.03 0.95 It is important for local consumers to participate in the interpretation and use of survey results 70.9 2.89 0.96 Results are provided for individual survey items 67.8 2.95 0.90 Training is provided on how to interpret/use results 61.9 2.83 0.98
Finally, most participants indicated that the benefits of conducting the consumer survey were equal to or greater than the cost/burden of conducting it (62.3%).

Provider organizational characteristics

Several survey questions asked respondents to rate their organizations on issues such as staffing, training, funding, physical space, coordination/collaboration, and the availability, credibility, and relevance of data used in decision making. As shown in Table 3, few respondents agreed or strongly agreed with items regarding the availability of adequate staffing, funding, and physical space. More respondents agreed with items regarding the integration of new knowledge, techniques, and new services and programs; the existence of productive collaborations/partnerships; employees' understanding of how their work relates to the goals and mission of the organization; and the high quality of training. Results on data needed for decision making were mixed: while most agreed that such data are available and relevant, fewer indicated that they are credible.

Table 3
Provider organizational characteristics

Item | Percent "agree" or "strongly agree" | Mean(a) | SD
Items in order of most highly rated to least highly rated:
Our staff integrate new knowledge and techniques into their work to improve the way in which services are provided | 88.5 | 4.05 | 0.69
We regularly integrate new services, programs, and/or initiatives if they are needed | 85.0 | 3.95 | 0.77
We have collaborations/partnerships with external groups that facilitate important priorities, new programs, and/or initiatives for consumers | 83.7 | 3.90 | 0.77
Employees understand how their work is related to the goals or mission of our organization | 83.6 | 3.97 | 0.66
The training and development programs for staff are of high quality | 80.3 | 3.90 | 0.65
We have a high level of coordination across units and/or departments when it comes to delivering services and programs to consumers | 60.6 | 3.36 | 1.07
We have the necessary physical space for the services and programs we run | 19.6 | 2.25 | 1.14
We have few difficulties in adequately staffing our organization | 16.0 | 2.00 | 1.07
We have funding available to introduce new programs and/or initiatives if they are needed | 11.5 | 2.11 | 0.99
Questions regarding data:
Data needed for decision making are available | 60.7 | 3.44 | 0.87
Data needed for decision making are relevant | 58.3 | 3.50 | 0.78
Data needed for decision making are credible | 50.8 | 3.38 | 0.82

(a) Scores ranged from 1 (strongly disagree) to 5 (strongly agree)
Exploratory analyses

Additional correlational analyses were conducted to better understand the relationship between respondents' actual use of consumer survey data and items about the usefulness of the survey, factors that support survey use, preferred frequency of survey administration, and organizational characteristics. For these analyses, the mean use score variable was used (as described earlier, this variable was computed by adding positive answers from 11 dichotomously scored items; total mean=4.25, SD=2.75). This variable was then correlated with items from the above categories, and the Benjamini-Hochberg procedure32 was used to control for false positives. In total, 47 correlations were conducted, and 11 remained significant after the procedure was applied. Results indicated significant correlations between higher use of the consumer survey data and the following items: having read the Consumer Survey Report (r=0.29; p<.05), agreement that survey items give detailed information (r=0.31; p<.05), agreement that the annual report provides information needed to improve the quality of services (r=0.46; p<.01), and agreement that respondents' organizations share results throughout the organization (r=0.40; p<.01), with consumers (r=0.50; p<.01), and with stakeholders (r=0.54; p<.01). In addition, actual use was correlated with the likelihood that respondents' organizations had an established process to use the results of the consumer survey (r=0.57; p<.01), that consumers and/or family members actively participate in using the results (r=0.34; p<.01), and that their organization has a designated person/team responsible for ensuring that the results are used (r=0.45; p<.01). Actual use was also correlated with respondents indicating that it is important for results to be provided for each individual survey item (r=0.32; p<.05) and that funding is available to introduce new programs/initiatives if needed (r=0.35; p<.01).
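To make the multiple-comparison step concrete, the sketch below shows one way such an analysis can be implemented: summing the dichotomous use items into a total score, correlating that score with each predictor item, and applying the Benjamini-Hochberg step-up rule to the resulting p-values. This is a minimal illustration on simulated data, not the authors' code; the variable names, the false discovery rate of .05, and the use of Pearson correlations are assumptions.

    # A minimal sketch of the exploratory analysis described above, using
    # simulated data: the 11 dichotomous use items are summed into a total
    # use score, the score is correlated with each of 47 predictor items,
    # and the Benjamini-Hochberg step-up rule is applied to the p-values.
    # Variable names, the data, and the FDR level of .05 are assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_respondents = 63

    # Hypothetical data: 11 yes/no (1/0) use items and 47 Likert-type items.
    use_items = rng.integers(0, 2, size=(n_respondents, 11))
    predictors = rng.integers(1, 5, size=(n_respondents, 47))

    # Total use score: count of positive answers across the 11 items.
    use_score = use_items.sum(axis=1)

    # Inject a weak association into the first few items so the example
    # has something to detect; real data would carry its own signal.
    predictors[:, :3] += (use_score[:, None] > use_score.mean()).astype(int)

    # One correlation test per predictor item (47 tests, as in the study).
    p_values = np.array(
        [stats.pearsonr(use_score, predictors[:, j])[1] for j in range(47)]
    )

    def benjamini_hochberg(p, q=0.05):
        """Return a boolean mask of tests rejected at false discovery rate q.

        Step-up rule: sort the m p-values, find the largest rank i with
        p_(i) <= (i / m) * q, and reject the hypotheses with ranks 1..i.
        """
        m = len(p)
        order = np.argsort(p)
        below = p[order] <= (np.arange(1, m + 1) / m) * q
        rejected = np.zeros(m, dtype=bool)
        if below.any():
            cutoff = np.nonzero(below)[0].max()  # largest rank meeting the rule
            rejected[order[: cutoff + 1]] = True
        return rejected

    significant = benjamini_hochberg(p_values)
    print(f"{significant.sum()} of {p_values.size} correlations significant after FDR control")

Unlike a Bonferroni correction, this procedure controls the expected proportion of false positives among the significant results rather than the probability of any false positive, which preserves statistical power when many tests are run at once.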
Results from the follow-up interviews

Follow-up interviews allowed for a more in-depth examination of how consumer survey data are used to improve service quality, as well as participants' perceived utility of these data, facilitating factors, obstacles, and strategies for overcoming obstacles. Participants in the follow-up interviews named factors similar to those in the CTP survey, such as the lack of timely data, the lack of specific data (i.e., by program), and the lack of resources (time, money, staff) and ability to interpret the data and implement related QI initiatives. In addition, participants felt that consumer surveys may have questionable validity (consumers may not understand the questions, response rates may be poor, the sample may not be representative) and that the consumer survey could be more useful for QI purposes if results were provided for each item and reported by program, location, or clinician rather than by organization; if results were more timely (within 6 months or, ideally, in real time); and if training were provided on how to interpret and use the data for quality improvement.
Participants reported several features of the current consumer survey report that facilitate its use by their CTPs, including the ability to benchmark against state averages; the ability to examine trends across years; the inclusion of written comments by consumers; the inclusion of analyses that they do not have the capacity to conduct themselves; and the inclusion of graphs, visual aids, and summaries that facilitate understanding of the reported data.

The follow-up interview participants who reported using the consumer survey reports to inform quality improvement initiatives indicated that such initiatives focused on access to services (n=4), intake processes (n=3), and wait lists (n=3). For example, one high-use participant described:

. . . rearranging our scheduling of emergency services so people will have easier contact and . . . even a quicker response from our staff . . . added teleconferencing . . . videoconferencing at two locations . . . and that has helped a great deal.

Another high-use participant stated:

. . . by changing the way we do business. For example, having walk-ins – where people can just walk in and get services without having an appointment – that's just one way that we changed.

Similarly, another participant stated:
We identified that there were problems with returning phone calls . . . we implemented some changes in program practices and that brought those scores up.

In addition, most of the high-use interviewees (five out of nine) reported that they had quality improvement processes in place at their CSBs and were able to articulate these systems and processes. More specifically, one participant stated:

It's part of our continuous quality improvement . . . our city is becoming very involved in management by results and in many ways, we are the model for that . . . we are used to collecting outcome data and using it to guide our programs.

Fewer of the low-use interviewees (three out of seven) reported that they had quality improvement processes in place at their CSBs. The three low-use participants who did report having quality improvement processes in place reported being unable to use the consumer survey data for this purpose, again due to the inhibiting factors previously described. Interestingly, high users reported being able to overcome these inhibiting factors primarily by conducting in-house data analyses.

Conclusions

The results of this study indicate that a variety of factors related to survey content, data analysis/reporting, and technical support/resources should be addressed in order to improve the likelihood that consumer surveys provide useful data and that
these data are actually used to improve treatment services. Overall, results from the CTP survey and the follow-up interviews revealed general satisfaction with the consumer survey and the protocol used for its administration. Interestingly, the major dissatisfaction was with the frequency with which the survey has been conducted; the majority of respondents wanted to conduct the survey more frequently, although only if results were quickly available. The only other major area of dissatisfaction was the long delay between conducting the survey and receiving a report on the results. In addition, study participants indicated that it is important to have practical interpretation of the results, that local consumers participate in the interpretation, and that items can be customized to meet the needs of specific programs or to address current issues. Few participants indicated that their organization has a designated team responsible for ensuring that the results of the consumer survey are used or an established process for using them. Similar barriers to using consumer surveys, including the lack of an effective quality improvement infrastructure, a lack of expertise with survey data, and a lack of timely and specific results, have been reported in one other study.28 Clearly, these barriers need to be addressed.

Further, analyses of the relationship between respondents' actual use of the consumer surveys and other variables indicated that use was significantly correlated with survey
results giving detailed information and recommendations, sharing information with a broad range of stakeholders, the likelihood of having an established process and team to use survey results, and having consumers/family members participate in this process. Use of consumer survey results might therefore be increased by addressing these factors, for example by fostering a supportive environment and putting a quality improvement process in place.

Given these findings, consumer survey use could be improved by addressing a variety of concerns about survey content and administration procedures, as well as program-level and/or systems-level issues. Survey content could be improved by giving individual CTPs the ability to add items that address issues of local concern and by reporting survey results by individual programs within a given CTP. Survey administration could be improved by conducting the surveys every 6 months rather than annually and by decreasing the amount of time from the administration of the survey to the reporting of results. At the program level, the use of consumer survey data could be improved by conducting a series of regional workshops, coincident with the release of the consumer survey reports, to discuss the findings and their implications and to assist CTPs in developing action plans to address weaknesses identified by the survey. In addition, further training could be provided to
CTP staff, especially quality managers, on data analysis/interpretation and QI technology. CTPs could be encouraged to establish standing QI committees to review, respond to, and disseminate consumer survey results to CTP staff, their boards of directors, and their consumers and family members. Further, CTPs could involve consumers in quality assurance or consumer advisory committees and empower these groups by tasking them with making recommendations. Finally, CTPs could post results in waiting rooms and provide results to consumers at intake. Some of these techniques are potentially free or low cost, an essential consideration for organizations such as CTPs that often function with limited resources.

At the systems level, incentives could be provided through performance contracting, similar to previous work in this area33 but using the results of the consumer surveys as the performance measure instead of, or in addition to, other measures. Also, using other health care organizations' work on QI interventions as a model,27 a statewide QI team could review consumer survey data and related information, make recommendations, oversee the implementation of changes, and monitor outcomes.

A limitation of this study is the low response rate for program directors, despite the use of the Dillman method (three electronic mailings) and the potential to receive $100 for professional development. Thus, the results' generalizability to all types of behavioral health care program directors may be limited. This is of particular concern since program directors would typically play
the lead role in initiating and implementing program changes.

As recent policy and research initiatives have made apparent, improving the quality of behavioral health care is an important priority, due in part to its contribution to the total global burden of illness and its impact on all aspects of life.17,24 Several studies have provided recommendations for improving the quality of behavioral health care, including ways to use satisfaction data to guide quality improvement.34–36 While information is becoming more readily available regarding quality improvement in health care in general37,38 as well as for behavioral health care,23,39–41 historically, behavioral health has lagged behind general medical services in integrating quality improvement principles into practice.42 Better understanding, and then addressing, issues related to the use of performance measures can help to improve the impact of these measures and, ultimately, improve services for clients being treated in the behavioral health care system.

Implications for Behavioral Health

Consumer surveys have been used in behavioral health care for decades, but their actual utility for quality improvement remains largely unexamined. By asking CTPs about their use of consumer survey data, the strengths and weaknesses of consumer surveys can be addressed. Findings from
this study indicate that, among other results, CTP staff prefer a rapid turnaround of results, multiple administrations each year, and the opportunity to customize items. CTP staff also indicated that adequate staff training, staff and leadership support, and dissemination of results helped support the use of consumer survey data. Community treatment providers should be urged to actively incorporate results of consumer surveys into their organizational planning. Suggestions for improving the use of consumer survey data include addressing administration preferences, disseminating findings throughout CTPs and regions, training CTP staff on data interpretation, and establishing QI committees to address these issues. Consumers themselves should be encouraged to participate in this process. Performance contracting, with consumer surveys as the performance measure, is another possibility. These steps have the potential to improve the actual use of consumer survey data, which in turn can help improve services for clients being treated in the behavioral health care system.

Acknowledgements

This work was funded by grants from the Commonwealth Health Research Board and the Virginia Department of Behavioral Health and Developmental Services. We would also like to thank Rehana Kader for her assistance with this project.

References
1. Mental Health Statistics Improvement Program. Mental Health Statistics Report Card. 1996. Available at: http://www.mhsip.org/reportcard/index.html. Accessed May 5, 2009.
2. Zhang Z, Gerstein DR, Friedmann P. Patient satisfaction and sustained outcomes of drug abuse treatment. Journal of Health Psychology 2008; 13(3):388–400.
3. Eisen SV, Shaul JA, Leff HS, et al. Toward a national consumer survey: Evaluation of the CABHS and MHSIP instruments. The Journal of Behavioral Health Services and Research 2001; 28(3):347–369.
4. Perreault M, Katerelos TE, Tardif H, et al. Patients' perspectives on information received in outpatient psychiatric treatment. Journal of Psychiatric and Mental Health Nursing 2006; 13(1):110–116.
5. Center for Mental Health Services. Community Mental Health Services Block Grant Application Guidance and Instructions FY 2008: Transforming Mental Healthcare in America. Washington, DC: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, 2007.
6. Bjorkman T, Hansson L, Svensson B, et al. What is important in psychiatric outpatient care? Quality of care from the patient's perspective. International Journal for Quality in Health Care 1995; 7:355–362.
7. Campbell J, Einspahr K. Building partnerships in accountability: Consumer satisfaction. In Dickey B, Sederer LI (eds), Improving Mental Health Care: Commitment to Quality. Washington, DC: American Psychiatric, 2001, pp. 101–113.
8. Druss BG, Rosenheck RA, Stolar M. Patient satisfaction and administrative measures as indicators of the quality of mental health care. Psychiatric Services 1999; 50(8):1053–1058.
9. Pellegrin KL, Stuart GW, Frueh BC, et al. A brief scale for assessing patients' satisfaction with care in outpatient psychiatric services. Psychiatric Services 2001; 52:816–819.
10. Young MP. An analysis of the concept 'patient satisfaction' as it relates to contemporary nursing care. Journal of Advanced Nursing 1996; 24(6):1241–1248.
11. Edlund MJ, Young AS, Kung FY, et al. Does satisfaction reflect the technical quality of mental health care? Health Services Research 2003; 38(2):631–645.
12. Sanders LM, Trinh C, Sherman BR, et al. Assessment of client satisfaction in a peer counseling substance abuse treatment program for pregnant and postpartum women. Evaluation & Program Planning 1998; 21(3):287–296.
13. Carlson MJ, Gabriel RM. Patient satisfaction, use of services, and one-year outcomes in publicly funded substance abuse treatment. Psychiatric Services 2001; 52(9):1230–1236.
14. Meredith LS, Orlando M, Humphrey N, et al. Are better ratings of the patient–provider relationship associated with quality care for depression? Medical Care 2001; 39(4):349–360.
15. Chilvers R, Harrison G, Sipos A, et al. Evidence into practice: Application of psychological models of change in evidence-based implementation. British Journal of Psychiatry 2002; 181:99–101.
16. Garland AF, Kruse M, Aarons GA. Clinicians and outcomes measurement: What's the use? Journal of Behavioral Health Services & Research 2003; 30(4):393–405.
17. Manderscheid RW. Don't let the future repeat the past. Behavioral Healthcare 2006; 26(5):54–56.
18. Rawson RA, Marinelli-Casey P, Ling W. Dancing with strangers: Will the U.S. substance abuse practice and research organizations build mutually productive relationships? Addictive Behaviors 2002; 27(6):941–949.
19. Hermann RC, Palmer H, Leff S, et al. Achieving consensus across diverse stakeholders on quality measures for mental healthcare. Medical Care 2004; 42(12):1246–1250.
20. Lennox RD, Mansfield AJ. A latent variable model of evidence-based quality improvement for substance abuse treatment. The Journal of Behavioral Health Services & Research 2001; 28(2):164–177.
21. Hermann RC, Provost MM. Interpreting measurement data for quality improvement: Standards, means, norms, and benchmarks. Psychiatric Services 2003; 54(5):655–657.
22. Beutler LE. Comparisons among quality assurance systems: From outcome assessment to clinical utility. Journal of Consulting and Clinical Psychology 2001; 69(2):197–204.
23. Proctor EK. Leverage points for the implementation of evidence-based practice. Brief Treatment and Crisis Intervention 2004; 4(3):227–242.
24. Patel KK, Butler B, Wells KB. What is necessary to transform the quality of mental health care? Health Affairs 2006; 25(3):681–694.
25. Rosenheck RA. Organizational process: A missing link between research and practice. Psychiatric Services 2001; 52(12):1607–1612.
26. Simpson DD. A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment 2002; 22(4):171–182.
27. Davies E, Shaller D, Edgman-Levitan S, et al. Evaluating the use of a modified CAHPS® survey to support improvements in patient-centered care: Lessons from a quality improvement collaborative. Health Expectations 2008; 11:160–176.
28. Davies E, Cleary PD. Hearing the patient's voice? Factors affecting the use of patient survey data in quality improvement. Quality and Safety in Health Care 2005; 14:428–432.
29. Mental Health Statistics Improvement Program (MHSIP). The MHSIP Consumer Satisfaction Survey and the Youth Services Survey for Families. 2000. Available at: http://www.mhsip.org/surveylink.htm. Accessed May 5, 2009.
30. Brunk M, Koch JR. Assessing the outcomes of children's mental health services: Youth Services Survey for Families. Poster presented at: The 14th Annual Research Conference, A System of Care for Children's Mental Health: Expanding the Research Base; February 2001; Tampa, FL.
31. Dillman DA. Mail and Telephone Surveys: The Total Design Method. New York: Wiley, 1978.
32. Thissen D. Quick and easy implementation of the Benjamini-Hochberg procedure for controlling the false positive rate in multiple comparisons. Journal of Educational and Behavioral Statistics 2002; 27(1):77–83.
33. McLellan AT. Improving public addiction treatment through performance contracting: The Delaware experiment. Health Policy 2008; 87:276–308.
34. Beaudin CL, Kramer TL. Evaluation and treatment of depression in managed care (part II): Quality improvement programs. Disease Management & Health Outcomes 1997; 13(5):307–316.
35. Eisen SV. Patient satisfaction and perceptions of care. In IsHak WW, Burt T, Sederer LI (eds), Outcome Measurement in Psychiatry: A Critical Review. Washington, DC: American Psychiatric, 2002, pp. 303–320.
36. Hermann RC. Linking outcome measurement with process measurement for quality improvement. In IsHak WW, Burt T, Sederer LI (eds), Outcome Measurement in Psychiatry: A Critical Review. Washington, DC: American Psychiatric, 2002, pp. 23–34.
37. Pelletier L, Beaudin C (eds). Q Solutions: Essential Resources for the Healthcare Quality Professional. National Association for Healthcare Quality, 2006.
38. Teeley KH, Lowe JM, Beal J, et al. Incorporating quality improvement concepts and practice into a community health nursing course. Journal of Nursing Education 2006; 45(2):86–90.
39. McGilloway S, Donnelly M, Mangan B, et al. Qualitative assessment and improvement in mental health services: A qualitative study. Journal of Mental Health 1999; 8(5):489–499.
40. Rago WV, Reid WH. Total quality management strategies in mental health systems. Journal of Mental Health Administration 1991; 18(34):253–264.
41. Roman PM, Johnson JA. Adoption and implementation of new technologies in substance abuse treatment. Journal of Substance Abuse Treatment 2002; 22(4):211–218.
42. Beaudin CL. The face of quality. Quality Progress 2006; 39(2):15.