Selected Findings from the Cross-Site Evaluation of the Federal
Healthy Start Program
Vonna Lou Caleb Drayton • Deborah Klein Walker •
Sarah W. Ball • Sara M. A. Donahue •
Rebecca V. Fink
Published online: 28 November 2014
© Springer Science+Business Media New York 2014
Matern Child Health J (2015) 19:1292–1305. DOI 10.1007/s10995-014-1635-4
Abstract Initiated in 1991, the Federal Healthy Start Program includes 105 community-based projects in 39 states, the District of Columbia and Puerto Rico. Healthy Start projects work collaboratively with stakeholders to ensure participants' continuity of care during pregnancy through 2 years postpartum. This evaluation of Healthy Start projects examined relationships between implementation of nine core service and system program components and improvements in birth and project outcomes. Program components and outcomes were examined using data from a 2010 Healthy Start project director (PD) survey (N = 104 projects) and 2009 performance measure data from the Maternal and Child Health Bureau Discretionary Grant Information System (N = 98 projects). We explored bivariate relationships between the nine core program components and (a) intermediate and long-term project outcomes and (b) birth outcomes. We assessed independent associations of implementation of all core program components with birth outcomes, adjusting for project characteristics and activities. In 2010, 57 projects implemented all nine core program components: 104 implemented all five core service components and 69 implemented all four core systems components. Implementation of all core program components was significantly associated with several PD-reported intermediate and long-term project outcomes, but was not associated with singleton low birth weight or infant mortality among participants' infants. This evaluation revealed a mixed set of relationships between Healthy Start projects' implementation of the core program components and achievement of project outcomes. Although the findings demonstrated a positive impact of Healthy Start projects on birth outcomes, only a few associations were statistically significant.
Keywords Maternal and child health · Healthy Start Program · Cross-site evaluation · Program evaluation
Introduction
The Federal Healthy Start Program began in 1991 as a
response to high infant mortality rates (IMR) in the United
States as well as the large gap in these rates between white
and non-white infants. The first Healthy Start projects were
funded as demonstration sites in 15 communities with IMR
1.5–2.5 times the national average. By 2012, the program
had expanded in size and mission to include 105 projects in
39 states, the District of Columbia and Puerto Rico,
including projects in both urban and rural areas. As specified by Health Resources and Services Administration (HRSA) guidance documents [1–3], the core Program goals
include: (1) a reduction of racial and ethnic disparities in
access to and utilization of health services, (2) an improved
local health care system, and (3) an increased consumer or
community voice in health care decisions.
V. L. C. Drayton (corresponding author)
Booz Allen Hamilton, One Preserve Parkway, Rockville, MD 20852, USA
e-mail: [email protected]

D. K. Walker · S. W. Ball · S. M. A. Donahue · R. V. Fink
Abt Associates, 55 Wheeler Street, Cambridge, MA 02138-1168, USA
e-mail: [email protected]

S. W. Ball
e-mail: [email protected]

S. M. A. Donahue
e-mail: [email protected]

R. V. Fink
e-mail: [email protected]
The Federal Healthy Start Program focuses on improving the health and well-being of women, infants, children
and their families through the implementation of evidence-
based practices and innovative community interventions. In
2010, Healthy Start projects served almost 30,000 pregnant
women, many of whom were black or African American,
34 years and younger, with incomes below 100 percent of
the federal poverty level [4].
Healthy Start projects work collaboratively with com-
munity stakeholders and consumers to leverage existing
service and system resources so that women at risk for
adverse birth outcomes are assured continuity of care
during pregnancy through 2 years postpartum. Since 2001,
all Healthy Start projects have been required to implement
nine ‘‘core’’ program components: five service components
(outreach and recruitment, case management, health edu-
cation, interconception care (ICC), perinatal depression
screening) and four systems-building components (con-
sortia, local health systems action plan (LHSAP), coordi-
nation and collaboration with Title V, and a sustainability
plan). Healthy Start projects may also implement other
support services needed in their local communities, such as
breastfeeding support and education, screening for
domestic/intimate partner violence and child abuse, initiatives to improve family and/or male involvement, healthy
weight interventions, home visiting, and smoking cessation
[1–3].
National Evaluations of the Federal Healthy Start
Program
The Federal Healthy Start Program has been evaluated
from its inception in the early 1990s. The first national evaluation, conducted from 1997 through 1999, examined the implementation of the 15 demonstration project activities during fiscal years 1992 and 1996 and assessed whether these projects achieved the Healthy Start Program
goals of reducing infant mortality and improving maternal
and infant health. The second national evaluation was
conducted in two phases from 2002 through 2007 and
sought to obtain information about the implementation of
program components and to identify program features
associated with improved perinatal outcomes. Findings
from this evaluation were summarized in a profile report
presenting the characteristics of all Healthy Start projects
[5] and in case studies that documented the context and
implementation of the Healthy Start Program in eight sites
[6]. The evaluation also collected information on program
implementation and outcomes through a participant survey
that was conducted in four sites [7]. The third national
evaluation is the cross-site evaluation summarized in this
article. It was conducted from 2009 through 2012 to
examine relationships between the core program
components and long-term program and birth outcomes, in
addition to factors that influence these relationships. The
primary objective of the evaluation was to assess the effect
of implementation of all nine core program components on
long-term maternal and child health outcomes.
Methods
The evaluation was guided by a logic model (Fig. 1) that
outlined the hypothesized relationships between Healthy
Start project context, implementation of core service and
system program components, and four long-term outcomes
relevant to the Healthy Start Program goals: (1) improved
birth outcomes, (2) improved maternal health, (3)
improved child health, and (4) sustained community
capacity to reduce disparities in health status in the target
community. A cross-sectional design was used to assess the
associations of implementation of the nine core program
components with (1) project characteristics, (2) achievement of intermediate project outcomes, (3) service and
system activities conducted by the Healthy Start project
that made a primary or major contribution to reducing
disparities in maternal and infant health outcomes, and (4)
achievement of long-term birth outcomes.

Fig. 1 Logic model for the cross-site evaluation of Healthy Start
Data Sources
Self-reported data from the 2010 project director survey
(PD survey) and performance measure (PM) data for 2009
reported to the Maternal and Child Health Bureau (MCHB)
Discretionary Grant Information System (DGIS) were used
in all analyses. The 2010 PD survey was administered via
web to Healthy Start project staff between July and September 2011 and was completed for all 104 projects
(100 % response rate). The survey was designed to collect
information on implementation and features of the nine
core program components as well as additional support
services offered by each Healthy Start project and project
achievements. The DGIS is a Web-based system that
MCHB grantees use to report their data online to MCHB
through HRSA’s Electronic Handbook as a part of the
grant application and performance reporting processes; it is
the repository of PM data for all MCHB programs. During
the time period of this evaluation, the MCHB utilized 15
PMs to monitor the progress of all Healthy Start projects
towards the achievement of Program objectives. A list of
current MCHB Healthy Start Program PMs is available
from: https://mchdata.hrsa.gov/DGISReports/PerfMeasure/default.aspx. Performance measure data for 2009 were
available for 98 projects. After a thorough examination of
the available PM data from the DGIS [8], four PMs (two birth outcome PMs, one service outcome PM, and one system
outcome PM; Table 1) were selected for this evaluation
based on the quality and consistency of data as well as the
relevance of the PM to the evaluation objectives. Project
characteristic data that were consistently reported in the
DGIS were also used in our analyses. State Title V birth
outcome PMs (singleton LBW and IMR) and Healthy
People 2010 and 2020 objective targets (LBW and IMR)
[9, 10] were used as benchmarks for comparison.
Measurement of Variables
Variable selection was informed by program components
and expected outcomes, the logic model, and previous
studies of birth outcomes [11]. The primary exposure of
interest was the implementation of all nine core program
components: outreach and recruitment, case management,
health education, interconception care (ICC), perinatal
depression screening, consortia, local health systems action
plan (LHSAP), coordination and collaboration with Title
V, and a sustainability plan. Implementation was determined using data from the 2010 PD survey (yes/no
response for each component). The birth outcomes of
interest were measured using two PMs reported in the
DGIS in 2009: percent singleton low birth weight (PM 51)
and infant mortality rate (PM 52).
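To make the two outcome measures concrete, the minimal sketch below shows the arithmetic behind PM 51 and PM 52 as defined in Table 1. It is illustrative only: the function and argument names are ours, and the DGIS collects these measures as aggregate counts reported by each project rather than through code of this kind.

```python
# Illustrative only: the DGIS collects these measures as aggregate counts reported by
# each project; the function and argument names below are ours, not a DGIS schema.

def percent_singleton_lbw(lbw_singleton_births: int, singleton_births: int) -> float:
    """PM 51: percent of live singleton births to participants weighing less than 2,500 g."""
    return 100.0 * lbw_singleton_births / singleton_births

def infant_mortality_rate(infant_deaths: int, live_births: int) -> float:
    """PM 52: deaths from birth through 364 days of age per 1,000 live births to participants."""
    return 1000.0 * infant_deaths / live_births

# Hypothetical project reporting 9 singleton LBW births among 120 singleton live births
# and 1 infant death among 130 live births in the calendar year.
print(percent_singleton_lbw(9, 120))   # 7.5 (%)
print(infant_mortality_rate(1, 130))   # about 7.7 per 1,000 live births
```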
We examined characteristics hypothesized to influence
the association of implementation of program components
with birth outcomes. We obtained information on these
characteristics from the 2010 PD survey and the DGIS.
Maternal demographic characteristics were not available
for this analysis. Project characteristics (Table 2) that were
examined were length of funding (initial project funding
received in Phase 1 [1991–1996], 2 [1997–2000], 3
[2001–2004], or 4 [2005–2010]), geographic location
(urban, not urban), and organization type (government agency, community-based non-governmental agency or other organization type). Project director report of achievement of intermediate outcomes (eleven outcomes; see Table 2) (yes/no), service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes (fourteen activities; see Table 2) (yes/no), and achievement of long-term maternal and child health and community capacity outcomes (five outcomes; see Table 2) (yes/no) were examined in descriptive analyses and included as covariates in multivariable analyses. One service outcome PM
(PM 20, the percent of women participants who have an
ongoing source of primary and preventive care services for
women) and one system outcome PM (PM 22, a score
between 0 and 64 representing the degree to which the
project facilitated health providers’ screening of women
participants for eight risk factors) were examined in
descriptive analyses and included as covariates in multivariable analyses (see Table 1).
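As a minimal sketch of this data-preparation step, assuming a hypothetical project-level analysis file (the column names below are illustrative and do not correspond to the actual PD survey or DGIS variable names), the covariates could be constructed along these lines, including the dichotomization of PM 22 at the mean of all projects used in the regression models described later:

```python
import pandas as pd

# Hypothetical project-level analysis file; the column names are illustrative and
# do not correspond to the actual PD survey or DGIS variable names.
projects = pd.DataFrame({
    "pm22_score": [48, 22, 60, 35],                        # provider-screening facilitation, 0-64
    "pm20_pct": [82.0, 65.5, 90.0, 71.0],                  # % women with an ongoing source of care
    "improved_birth_spacing": ["yes", "no", "yes", "no"],  # PD-reported long-term outcome (2010)
})

# Dichotomize PM 22 at the mean score of all projects, as used in the multivariable models.
projects["pm22_above_mean"] = (
    projects["pm22_score"] > projects["pm22_score"].mean()
).astype(int)

# Recode PD-reported yes/no items as 0/1 indicators for use as covariates.
projects["improved_birth_spacing"] = projects["improved_birth_spacing"].eq("yes").astype(int)

print(projects)
```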
Analysis
We calculated descriptive statistics for all variables across
all Healthy Start projects. We then performed bivariate
analyses using Pearson’s Chi square test and Fisher’s exact
test to (1) describe implementation of the nine core Healthy
Start Program components by project characteristics; (2)
examine the association of implementation of all core
components with each of (a) intermediate outcomes,
(b) service and systems activities that made a primary or
major contribution to reducing disparities in maternal and
infant health outcomes, and (c) long-term maternal and
child health and community capacity outcomes; and (3)
examine the association of intermediate outcomes and
service and systems activities that made a primary or major
contribution to reducing disparities in maternal and infant
health outcomes with (a) long-term maternal and child
health and community capacity outcomes and with (b) birth
outcome PMs. We also compared the birth outcome PM
rates among Healthy Start projects with their state’s Title V
Program rates and with achievement of national Healthy
People (HP) 2010 and 2020 objective targets.
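As an illustration of these bivariate tests, the sketch below applies Pearson's Chi square and Fisher's exact tests to the 2 x 2 table for increased birth spacing reported in Table 3. It mirrors the approach described here but is not the evaluation's actual analysis code, and the default continuity correction may yield a slightly different p value than the published one.

```python
from scipy.stats import chi2_contingency, fisher_exact

# 2 x 2 table taken from Table 3: project director-reported "increased birth spacing"
# (rows: yes/no) by implementation of all nine core components (columns: all / not all).
table = [[16, 4],
         [41, 43]]

chi2, p_chi2, dof, expected = chi2_contingency(table)   # Yates-corrected by default for 2 x 2
odds_ratio, p_fisher = fisher_exact(table)

print(f"Pearson chi-square p = {p_chi2:.3f}")
print(f"Fisher's exact p = {p_fisher:.3f} (odds ratio = {odds_ratio:.2f})")
```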
We developed multivariable linear and logistic regression models to examine the independent associations of
implementation of all core program components with birth
outcomes, adjusting for project characteristics, project
director-reported intermediate outcomes, and service and
system PMs. We developed linear regression models to
examine continuous outcomes (singleton LBW, IMR) and
logistic regression models to examine achievement (yes/no) of state Title V rates or national HP objectives.
Table 1 MCHB performance measures (PM) used in multivariate analyses

Birth outcomes
  PM 51: Percent of live singleton births weighing less than 2,500 g.
    Numerator: Number of live singleton births less than 2,500 g in the calendar year to program participants.
    Denominator: Live singleton births in the calendar year among program participants.
  PM 52: The infant mortality rate per 1,000 live births.
    Numerator: Number of deaths to infants from birth through 364 days of age to program participants.
    Denominator: Number of live births in the calendar year among program participants.

Service outcomes
  PM 20: The percent of women participating in MCHB supported programs who have an ongoing source of primary and preventive care services for women.
    Numerator: The number of women participating in MCHB-funded projects who have an ongoing source of primary and preventive care services during the reporting period.
    Denominator: The number of women participating in MCHB-funded projects during the reporting period.

Systems outcomes
  PM 22: The degree to which MCHB supported programs facilitate health providers' screening of women participants for risk factors.
    Total possible score: 0–64.
    Scoring instructions: Using a scale of 0–2, indicate the degree to which your grant has performed each activity to facilitate screening for each risk factor by health providers in your program.
    Scale definitions: 0 = Grantee does not provide this function or assure that this function is completed; 1 = Grantee sometimes provides or assures the provision of this function but not on a consistent basis; 2 = Grantee regularly provides or assures the provision of this function.
    Risk factors: (1) Smoking, (2) Alcohol, (3) Illicit drugs, (4) Eating disorders, (5) Depression, (6) Hypertension, (7) Diabetes, (8) Domestic violence.

A list of all current MCHB Healthy Start Program PMs is available from: https://mchdata.hrsa.gov/DGISReports/PerfMeasure/default.aspx
We calculated betas or odds ratios with 95 % confidence
intervals. Variables that were included in the models were
those found to be associated with the birth outcomes of
interest in previous studies or in the bivariate analyses as
well as any other characteristics of a priori interest
according to the evaluation logic model (Fig. 1). The
multivariate models to examine birth outcomes included
only those projects with PM data.
The model to examine the association of implementation of all core components with singleton LBW (PM 51)
included the following covariates: initial funding (Phase 1
versus all other phases), urban geographic location, not
urban geographic location, grantee organization type,
Healthy Start project facilitation of provider screening for
risk factors (PM 22, score greater than mean of all projects), percent of women participants with ongoing source
of primary and preventive care (PM 20), self-reported
improved birth spacing in 2010 (yes/no), self-reported
increased cultural competence of providers (yes/no), and
self-reported increased participant involvement in Healthy
Start decision-making (yes/no). The model to examine the
IMR outcome (PM 52) included many of the same covariates, in addition to percent singleton LBW (PM 51), an
independent risk factor for infant mortality.
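To make this modeling step concrete, the sketch below fits one linear and one logistic model of the kind described above to simulated project-level data. The variable names, the simulated values, and the exact covariate set are illustrative assumptions, not the evaluation's actual dataset or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated project-level data standing in for the merged PD survey / DGIS file;
# variable names, values, and the covariate set are illustrative assumptions.
rng = np.random.default_rng(0)
n = 98
df = pd.DataFrame({
    "pct_singleton_lbw": rng.normal(9.0, 2.5, n),   # PM 51
    "all_core": rng.integers(0, 2, n),              # implemented all nine core components
    "phase1": rng.integers(0, 2, n),                # initial funding in Phase 1
    "urban": rng.integers(0, 2, n),
    "gov_agency": rng.integers(0, 2, n),
    "pm20_pct": rng.uniform(50, 100, n),            # PM 20
    "pm22_above_mean": rng.integers(0, 2, n),       # PM 22 dichotomized at the mean
    "birth_spacing": rng.integers(0, 2, n),         # PD-reported increased birth spacing
})
df["met_hp2020_lbw"] = (df["pct_singleton_lbw"] < 7.8).astype(int)

rhs = ("all_core + phase1 + urban + gov_agency + pm20_pct + "
       "pm22_above_mean + birth_spacing")

# Linear model for the continuous outcome (percent singleton LBW): betas and 95 % CIs.
linear = smf.ols(f"pct_singleton_lbw ~ {rhs}", data=df).fit()
print(linear.params.round(2), linear.conf_int().round(2), sep="\n")

# Logistic model for meeting the HP2020 LBW target: odds ratios and 95 % CIs.
logit = smf.logit(f"met_hp2020_lbw ~ {rhs}", data=df).fit(disp=False)
print(np.exp(logit.params).round(2), np.exp(logit.conf_int()).round(2), sep="\n")
```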
This evaluation was determined exempt from IRB
review by the Abt Associates Institutional Review Board
on September 1, 2010 (Abt IRB # 0499).
Results
Descriptive Characteristics
Table 2 presents the distribution of project characteristics
as well as project director-reported implementation of the
core components, intermediate project outcomes, service
and systems activities that made a primary or major contribution to reducing disparities in maternal and infant
health outcomes, and long-term maternal and child health
and community capacity outcomes. All 104 Healthy Start
projects implemented all five core service components.
Over two-thirds of projects implemented the four core
systems-building components: 99 % implemented one or more consortia, 91 % implemented a LHSAP, 87 %
collaborated with Title V, and 66 % had a sustainability
plan. Overall, 57 (55 %) projects implemented all nine core
program components; this group includes 10 of the 18
projects that were first funded during Phase 1 (1991-1996)
of the Healthy Start Program. Most projects had been in operation for at least 10 years at the time the PD survey was administered; 17 % were first funded in Phase 1 and
61 % in Phase 2. Approximately 75 % of projects were
located in urban areas, including cities and metropolitan
areas; and 40 % of grantee organizations were state or local
government agencies.
Approximately two-thirds of all projects reported that in
2010 the project had accomplished a number of intermediate outcomes including increased awareness of the
importance of interconception care and of disparities in
birth outcomes as a community priority, increased positive
health behaviors among participants, increased access to
available services for participants, and increased number of
participants with a medical home.
More than two-thirds of all projects reported that case
management, enabling services such as transportation and
translation, and interconception care activities conducted
by the project made a primary or major contribution to
reducing disparities in maternal and infant health outcomes. Less than two-thirds of projects reported that other
service and systems activities conducted by the project,
such as collaboration with consumers, community-based
organizations, and public and private agencies, made a
similar contribution to reducing disparities in maternal and
infant health outcomes.
Sixty-eight percent of project directors reported that the
project had achieved improvements in birth outcomes in
2010 and 39 % reported achieving improvements in
maternal health. Less than one-third of project directors
reported that the Healthy Start project had achieved sustained capacity to reduce disparities in health status in the
community (32 % of projects); improvements in child
health (31 %); and increased birth spacing (19 %). A small
proportion (12 %) of project directors reported that the
Healthy Start project had not achieved any long-term outcomes in 2010.
Bivariate Analyses: Core Program Components
Table 3 presents the results of bivariate analyses examin-
ing the relationship between implementation of the nine
core program components and project characteristics, as
well as relationships between implementation of program
components and three categories of project director-
reported outcomes: (1) intermediate outcomes, (2) activities that contributed to reducing disparities in maternal and
infant health outcomes, and (3) long-term maternal and
child health and community capacity outcomes. The 57
projects that implemented all core components were used
as the reference group. Only results that were statistically
significant (p ≤ 0.05) are reported in the table.
Healthy Start projects whose grantee organizations were
state or local government agencies were significantly
(p ≤ 0.05) less likely to implement all core components
compared with projects whose grantee organizations were
a community-based non-governmental organization or
other type of organization.
Table 2 Distribution of Healthy Start project characteristics and project director-reported implementation of program components, intermediate outcomes, service and systems activities that contributed to reducing disparities in maternal and infant health outcomes, and long-term maternal and child health and community capacity outcomes, among all Healthy Start projects (N = 104 projects). All values are n (%).

Project characteristics^a
  Length of funding
    Initial Funding Phase 1 (1991–1996): 18 (17)
    Initial Funding Phase 2 (1997–2000): 63 (61)
    Initial Funding Phase 3 (2001–2004): 10 (10)
    Initial Funding Phase 4 (2005–2010): 13 (12)
  Geographic location: Urban [urban/central city, metropolitan area (city and suburbs)]: Yes 78 (75), No 26 (25)
  Geographic location: Not urban (suburban, border US-Mexico, rural): Yes 28 (27), No 76 (73)
  HS grantee organization type
    Government agency (state agency, community government agency such as a local health department): 42 (40)
    Community-based non-governmental organization (health care or non-health care) or Other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center): 62 (60)

Implementation of all nine core program components^b: Yes 57 (55), No 47 (45)

Intermediate outcomes^c
  Increased awareness of the importance of interconception care: Yes 80 (77), No 24 (23)
  Increased awareness of disparities in birth outcomes as community priority: Yes 76 (73), No 28 (27)
  Increased positive health behaviors among our participants: Yes 74 (71), No 30 (29)
  Increased access to the services available for our participants: Yes 71 (68), No 33 (32)
  Increased number of participants with a medical home: Yes 70 (67), No 34 (33)
  Increased screening for perinatal depression among providers in the community: Yes 51 (49), No 53 (51)
  Increased participant involvement in Healthy Start decision-making: Yes 50 (48), No 54 (52)
  Increased integration of prenatal, primary care, and mental health services: Yes 47 (45), No 57 (55)
  Increased cultural competence of providers in our community: Yes 43 (41), No 61 (59)
  Increased participant involvement in other community activities addressing systems change: Yes 39 (37), No 65 (63)
  Increased participant involvement in decision-making among partner agencies: Yes 22 (21), No 82 (79)

Service and systems activities that contributed to reducing disparities in maternal and infant health outcomes^d
  Case management: Yes 90 (87), No 14 (13)
  Enabling services: Yes 73 (70), No 31 (30)
  Interconception care: Yes 70 (67), No 34 (33)
  Perinatal depression screening: Yes 66 (63), No 38 (37)
  Outreach and client recruitment: Yes 64 (62), No 40 (39)
  Collaboration with consumers: Yes 60 (58), No 44 (42)
  Collaboration with community-based organizations: Yes 53 (51), No 51 (49)
  Collaboration with public agencies: Yes 49 (47), No 55 (53)
  Collaboration with private agencies: Yes 46 (44), No 58 (56)
  Consortium: Yes 45 (43), No 59 (57)
  Local Health System Action Plan: Yes 43 (41), No 61 (59)
  Collaboration with local Title V: Yes 34 (33), No 70 (67)
  Collaboration with State Title V: Yes 31 (30), No 73 (70)
  Provider education: Yes 39 (38), No 65 (62)

Long-term maternal and child health and community capacity outcomes^e
  Improved birth outcomes: Yes 71 (68), No 33 (32)
  Improved maternal health: Yes 41 (39), No 63 (61)
  Sustained community capacity to reduce disparities in health status in the community: Yes 33 (32), No 71 (68)
  Improved child health: Yes 32 (31), No 72 (69)
  Increased birth spacing: Yes 20 (19), No 84 (81)
  No long term outcomes were achieved in 2010: Yes 13 (12), No 91 (88)

a Data source: Maternal and Child Health Bureau Discretionary Grant Information System
b Data source: 2010 Project Director survey. To determine implementation of core service components, project directors were asked, "Which of the following services does your Healthy Start project offer?" (response options: "Outreach and participant recruitment," "Case management," "Health education," "Perinatal depression screening," and "Interconceptional services"). To determine implementation of the core systems-building component of having a consortium, project directors were asked "Does your Healthy Start project have at least one active consortium that addresses maternal and child health issues" (response options: Yes/No). To determine implementation of the core systems-building component of having a Local Health System Action Plan, project directors were asked "Does your Healthy Start project have a Local Health System Action Plan (LHSAP)?" (response options: Yes/No; a follow up question was asked to determine if the LHSAP was specific to the Healthy Start project). To determine implementation of the core systems-building component of coordination and collaboration with Title V, project directors were asked to specify the types of collaborative activities that their Healthy Start project established with the State Title V agency. Projects were classified with a "yes" response if the project director indicated that the State Title V agency "is a member of the Healthy Start consortium," "has a written memorandum of understanding or agreement with Healthy Start," "provides contracted services to Healthy Start," "hosts out-stationed Healthy Start staff," "participates in joint training with Healthy Start," "has a shared staffing arrangement with Healthy Start," "coordinates case management or is planning with Healthy Start for shared participants," "shares protocols with Healthy Start," "is involved in Healthy Start sustainability planning," "has a data-sharing arrangement with Healthy Start," "contributes to pooled funding streams to support joint services," "has a Healthy Start employee on their board," "works with Healthy Start to develop consistent health messages for participants," and/or "receives cultural competence training from Healthy Start." To determine implementation of the core systems-building component of having a sustainability plan, project directors were asked "Does your Healthy Start project have a sustainability plan, that is, a plan to maintain services to the target population after federal Healthy Start funding ends?" (response options: Yes/No)
c Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
d Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?". Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. Primary contribution and Major contribution were classified as "Yes."
e Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
Table 3 Association of implementation of Healthy Start Program components with project characteristics and project director-reported intermediate outcomes, service and systems activities that contributed to reducing disparities in maternal and infant health outcomes, and long-term maternal and child health and community capacity outcomes (N = 104 projects). Values are n (%); each row compares projects that implemented all required core program components (Yes, n = 57) with projects that did not (No, n = 47).

Project characteristics^a
  HS grantee organization type (p = 0.04*)
    Government agency (state agency, community government agency such as a local health department): 18 (32) vs 24 (51)
    Community-based non-governmental organization (health care or non-health care) or Other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center): 39 (68) vs 23 (49)

Intermediate outcomes^b
  Increased access to the services available for our participants (p = 0.00*): Yes 46 (80) vs 25 (53); No 11 (20) vs 22 (47)
  Increased screening for perinatal depression among providers in the community (p = 0.04*): Yes 33 (58) vs 18 (38); No 24 (42) vs 29 (62)
  Increased integration of prenatal, primary care, and mental health services (p = 0.03*): Yes 31 (54) vs 16 (34); No 26 (46) vs 31 (66)

Service and systems activities that contributed to reducing disparities in maternal and child health outcomes^c
  Enabling services (p = 0.01*): Yes 46 (81) vs 27 (58); No 11 (19) vs 20 (42)
  Interconception care (p = 0.01*): Yes 44 (77) vs 26 (55); No 13 (23) vs 21 (45)

Long-term maternal and child health and community capacity outcomes^d
  Improved child health (p = 0.05*): Yes 22 (39) vs 10 (21); No 35 (61) vs 37 (79)
  Increased birth spacing (p = 0.01*): Yes 16 (28) vs 4 (9); No 41 (72) vs 43 (91)

* Pearson's Chi square or Fisher's exact test
a Data source: Maternal and Child Health Bureau Discretionary Grant Information System
b Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed. Only outcomes with statistically significant (p ≤ 0.05) relationships with implementation of all core program components are reported
c Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?". Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. Primary contribution and Major contribution were classified as "Contributed." Only activities with statistically significant (p ≤ 0.05) relationships with implementation of all core program components are reported
d Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed. Only outcomes with statistically significant (p ≤ 0.05) relationships with implementation of all core program components are reported
Although projects implementing all core components more frequently reported achievement of the majority of intermediate outcomes than projects that did not implement all core components, the only intermediate outcomes for which the relationship with implementation of all core components was statistically significant were (1) increased access to services available for participants, (2) increased integration of prenatal, primary care and mental health services, and (3) increased screening for perinatal depression.
Projects implementing all core components were significantly more likely to report that enabling and interconception care services conducted by the project made a primary or major contribution to reducing disparities in maternal and infant health, when compared with projects that did not implement all required core components. Additionally, projects implementing all core components were significantly more likely to report that their project had achieved increased birth spacing and improved child health in 2010, compared with projects that did not implement all core components.
Bivariate Analyses: Intermediate Outcomes, Service
and Systems Activities that Contributed to Reducing
Disparities in Maternal and Infant Health Outcomes,
and Long-Term Maternal and Child Health
and Community Capacity Outcomes
Results of the bivariate analyses examining the relationship between project director-reported intermediate outcomes, service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes and long-term outcomes revealed many significant associations (data not shown). Intermediate outcomes that were significantly associated (p ≤ 0.05) with project director-reported improvements in birth outcomes and/or maternal health included: increased cultural competence of providers in the community; increased number of participants with a medical home; increased awareness of the importance of interconception care; increased screening for perinatal depression; and increased participant involvement in community activities addressing systems change. Healthy Start project activities, such as interconception care, perinatal depression screening, enabling services, collaboration with consumers, and LHSAP, that made a primary or major contribution to reducing disparities in maternal and infant health outcomes were each significantly (p ≤ 0.05) associated with project director-reported improvement in birth, maternal, and/or child health outcomes (data not shown).
Descriptive and Comparative Analyses: Birth Outcome
Performance Measures
In 2009, 20 % of Healthy Start projects had singleton LBW rates and 59 % had IMR that were less than or equal to the Healthy People 2010 (HP2010) targets of 5 % (LBW) and 4.5 per 1,000 live births (IMR) [9], respectively. The Healthy
People 2020 (HP2020) targets were revised to 7.8 % (LBW
rate) and 6 per 1,000 live births (IMR) [10], and a higher
proportion of Healthy Start projects achieved these targets
than achieved the HP2010 targets (33 % achieved the
LBW target and 60 % achieved the IMR target) (data not
shown). Compared with Healthy Start projects that did not meet the HP2020 LBW target, projects that achieved the HP2020 target were significantly (p ≤ 0.05) more likely to report achieving increased access to services available for participants and increased integration of prenatal, primary care, and mental health services. Similarly, these projects were significantly more likely to report that their outreach and client recruitment, collaboration with community-based organizations, collaboration with private and public agencies, and/or collaboration with local Title V activities made a primary or major contribution to reducing disparities in maternal and infant health outcomes. Achieving the HP2020 target for IMR was not significantly associated with project director-report of achieving intermediate outcomes or of conducting service or system activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes (Table 4).
Similar results were observed when comparing Healthy
Start project PM rates with state birth outcome rates. In
2009, over one quarter (27 %) of all Healthy Start projects
had a singleton LBW rate less than the rate in their state,
and 62 % had an IMR that was less than the rate in their
state. Healthy Start projects that had a lower singleton
LBW rate in 2009 than the rate reported for their state were
significantly (p ≤ 0.05) more likely to report achieving
increased positive health behaviors among participants and
increased number of participants with a medical home in
2010 (data not shown).
Multivariate Analyses
The results of the multivariate analyses are presented in
Tables 5 and 6. After controlling for project characteristics,
project director-reported intermediate outcomes and other
covariates consistent with the logic model, there were no
significant associations of implementation of all core program components with singleton LBW and/or infant mortality rates. Urban project setting and state/local government agency grantee organization were significantly associated with higher rates of LBW, and non-urban project setting was significantly associated with higher IMR.
As expected, LBW rates were significantly associated with
higher IMR. Intermediate and long-term program outcomes
reported in the 2010 PD survey were not significantly
associated with either singleton LBW or infant mortality.
Table 4 Association of percent singleton low birth weight (LBW) and infant mortality rates (IMR) among project participants' infants meeting HP2010 and HP2020 objective targets with Healthy Start project director-reported achievement of intermediate outcomes and conduct of service and systems activities that contributed to reducing disparities in maternal and infant health outcomes (N = 104).

For each row, values are % Yes / % No among projects in each of four groups, in order: (1) PM 51 (% singleton LBW) less than the HP2010 LBW target of 5 % (n = 20 projects); (2) PM 51 less than the HP2020 LBW target of 7.8 % (n = 32 projects); (3) PM 52 (IMR) less than the HP2010 IMR target of 4.5 deaths per 1,000 live births (n = 58 projects); (4) PM 52 less than the HP2020 IMR target of 6 deaths per 1,000 live births (n = 59 projects).

Intermediate outcomes^a
  Increased awareness of the importance of interconception care: 85/15, 84/16, 79/21, 80/20
  Increased awareness of disparities in birth outcomes as community priority: 75/25, 75/25, 71/29, 71/29
  Increased positive health behaviors among our participants: 85/15, 84/16, 71/29, 71/29
  Increased access to the services available for our participants: 85*/15, 81*/19, 69/31, 69/31
  Increased number of participants with a medical home: 85/15, 75/25, 76/24, 76/24
  Increased screening for perinatal depression among providers in the community: 60/40, 59/41, 48/52, 49/51
  Increased participant involvement in Healthy Start decision-making: 45/55, 50/50, 47/53, 47/53
  Increased integration of prenatal, primary care, and mental health services: 60/40, 66*/34, 40/60, 41/59
  Increased cultural competence of providers in our community: 55/45, 53/47, 36/64, 37/63
  Increased participant involvement in other community activities addressing systems change: 20/80, 31/69, 34/66, 36/64
  Increased participant involvement in decision-making among partner agencies: 10/90, 16/84, 24/76, 25/75

Service and systems activities that contributed to reducing disparities in maternal and child health outcomes^b
  Case management: 90/10, 88/12, 93/7, 93/7
  Enabling services: 75/25, 69/31, 78/22, 78/22
  Interconception care: 65/35, 63/37, 66/34, 66/34
  Perinatal depression screening: 60/40, 56/44, 69/31, 69/31
  Outreach and client recruitment: 50/50, 47**/53, 62/38, 63/37
  Collaboration with consumers: 60/40, 50/50, 57/43, 58/42
  Collaboration with community-based organizations: 30*/70, 28**/72, 55/45, 56/44
  Collaboration with public agencies: 35/65, 31*/69, 50/50, 51/49
  Collaboration with private agencies: 30/70, 25**/75, 48/52, 49/51
  Consortium: 35/65, 37/63, 41/59, 42/58
  Local Health System Action Plan: 30/70, 31/69, 40/60, 41/59
  Collaboration with local Title V: 15/85, 19*/81, 40/60, 39/61
  Collaboration with state Title V: 30/70, 25/75, 36/64, 36/64
  Provider education: 35/65, 31/69, 41/59, 42/58

Note that Healthy People (HP) LBW targets are for LBW among all live births, whereas Healthy Start PM 51 and State Title V HSI 01B measures the singleton LBW rate
* Pearson's Chi square or Fisher's exact test p value ≤ 0.05
** Pearson's Chi square or Fisher's exact test p value ≤ 0.01
a Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed. A "yes" response indicates that the project director reported that the project achieved the intermediate outcome. A "no" response indicates that the project director did not report that the project achieved the intermediate outcome
b Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?". Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. A "yes" response indicates that the project director reported that the service or system activity made a primary or major contribution to reducing disparities in maternal and infant health outcomes. A "no" response indicates that the project director reported that the service or system activity did not make a primary or major contribution to reducing disparities in maternal and infant health outcomes
Table 5 Adjusted associations of implementation of Healthy Start Program components with singleton low birth weight (LBW) among Healthy Start project participants' infants (N = 98 projects).

For each project characteristic, values are given for four outcomes, in order: % singleton LBW^a; % singleton LBW less than the State Title V rate^b; % singleton LBW less than the HP2010 LBW target of 5 %^b; % singleton LBW less than the HP2020 LBW target of 7.8 %^b.

  Implemented all 5 core service components and all 4 core systems components versus did not implement all core components^c: 0.4, 0.5, 0.4, 0.4
  Initial funding received in Phase 1 (1991–1996) versus initial funding received in Phase 2, 3, or 4^d: 1.2, 1.3, 0.6, 0.4
  Urban geographic location [urban/central city, metropolitan area (city and suburbs)] versus not urban^d: 2.9, 0.4, 0.8, 0.4
  Not urban geographic location (suburban, border US-Mexico, rural) versus not not urban^d: 1.6, 0.6, 1.7, 0.4
  State or local government agency grantee organization versus community-based non-governmental organization (health care or non-health care) or other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center)^d: 1.5, 0.1, 0.1, 0.4
  PM 20 (% women participants with an ongoing source of primary and preventive care for women) (%, 2009)^d: 0.0, 1.0, 1.0, 1.0
  PM 22 (degree to which Healthy Start project facilitates health providers' screening of women participants for risk factors) (score greater than mean of all projects, 2009)^d: 0.8, 1.3, 0.5, 1.1
  Achieved increased birth spacing^e: 0.5, 0.4, 0.8, 2.1
  Achieved increased cultural competence of providers in the community^f: -1.3, 2.1, 2.4, 1.9
  Achieved increased participant involvement in Healthy Start decision-making^f: 0.9, 0.9, 0.8, 0.6

Results based on multivariable linear or logistic regression models (separate models for each outcome), with each model adjusted for the other variables in the table. Bold font indicates effect estimate was significant at p < 0.10 or 95 % confidence interval > 1
a Linear model: values are b coefficients. The effect estimate represents the effect per percent increase of LBW
b Logistic model: values are odds ratios. The effect estimate represents the effect of having a rate less than the state Title V rate or less than the Healthy People (HP) target. Note that HP2010 and HP2020 LBW targets are for LBW among all live births, whereas Healthy Start PM 51 and State Title V HSI 01B measures the singleton LBW rate
c Data source: 2010 Project Director survey. To determine implementation of core service components, project directors were asked, "Which of the following services does your Healthy Start project offer?" (response options: "Outreach & participant recruitment," "Case management," "Health education," "Perinatal depression screening," and "Interconceptional services"). To determine implementation of the core systems-building component of having a consortium, project directors were asked "Does your Healthy Start project have at least one active consortium that addresses maternal and child health issues" (response options: Yes/No). To determine implementation of the core systems-building component of having a Local Health System Action Plan, project directors were asked "Does your Healthy Start project have a Local Health System Action Plan (LHSAP)?" (response options: Yes/No; a follow up question was asked to determine if the LHSAP was specific to the Healthy Start project). To determine implementation of the core systems-building component of coordination and collaboration with Title V, project directors were asked to specify the types of collaborative activities that their Healthy Start project established with the State Title V agency. Projects were classified with a "yes" response if the project director indicated that the State Title V agency "is a member of the Healthy Start consortium," "has a written memorandum of understanding or agreement with Healthy Start," "provides contracted services to Healthy Start," "hosts out-stationed Healthy Start staff," "participates in joint training with Healthy Start," "has a shared staffing arrangement with Healthy Start," "coordinates case management or is planning with Healthy Start for shared participants," "shares protocols with Healthy Start," "is involved in Healthy Start sustainability planning," "has a data-sharing arrangement with Healthy Start," "contributes to pooled funding streams to support joint services," "has a Healthy Start employee on their board," "works with Healthy Start to develop consistent health messages for participants," and/or "receives cultural competence training from Healthy Start." To determine implementation of the core systems-building component of having a sustainability plan, project directors were asked "Does your Healthy Start project have a sustainability plan, that is, a plan to maintain services to the target population after federal Healthy Start funding ends?" (response options: Yes/No)
d Data source: Maternal and Child Health Bureau Discretionary Grant Information System
e Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
f Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
Table 6 Adjusted associations of implementation of Healthy Start Program components with infant mortality rate (IMR) among Healthy Start project participants' infants (N = 98 projects).

For each project characteristic^a, values are given for four outcomes, in order: infant mortality rate^b; infant mortality rate less than the State Title V IMR^c; infant mortality rate less than the HP2010 IMR target of 4.5 deaths per 1,000 live births^c; infant mortality rate less than the HP2020 IMR target of 6 deaths per 1,000 live births^c.

  Implemented all 5 core service components and all 4 core systems components versus did not implement all core components^d: -0.7, 1.2, 1.1, 1.1
  Initial funding received in Phase 1 (1991–1996) vs. initial funding received in Phase 2, 3, or 4^e: 4.9, 0.4, 0.5, 0.4
  Urban geographic location (urban/central city, metropolitan area [city and suburbs]) versus not urban^e: -4.1, 1.6, 1.5, 1.3
  Not urban geographic location (suburban, border US-Mexico, rural) versus not urban^e: 7.4, 0.5, 0.6, 0.5
  State or local government agency grantee organization versus community-based non-governmental organization (health care or non-health care) or other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center)^e: 0.7, 1.0, 0.9, 1.1
  PM 51 (% low birth weight) (%, 2009): 0.5, 0.9, 0.9, 0.9
  PM 20 (% women participants with an ongoing source of primary and preventive care for women) (%, 2009): 0.0, 1.0, 1.0, 1.0
  PM 22 (degree to which Healthy Start project facilitates health providers' screening of women participants for risk factors) (score greater than mean of all projects, 2009)^e: -3.0, 0.7, 0.8, 0.6
  Achieved increased birth spacing^f: 3.8, 0.6, 0.3, 0.5

a Results based on multivariable linear or logistic regression models (separate models for each outcome), with each model adjusted for the other variables in the table. Bold font indicates effect estimate was significant at p < 0.10 or 95 % confidence interval > 1
b Linear model: values are b coefficients. The effect estimate represents the effect per increase in the infant mortality rate (deaths per 1,000 live births)
c Logistic model: values are odds ratios. The effect estimate represents the effect of having a rate less than the state Title V rate or less than the Healthy People (HP) target
d Data source: 2010 Project Director survey. To determine implementation of core service components, project directors were asked, "Which of the following services does your Healthy Start project offer?" (response options: "Outreach and participant recruitment," "Case management," "Health education," "Perinatal depression screening," and "Interconceptional services"). To determine implementation of the core systems-building component of having a consortium, project directors were asked "Does your Healthy Start project have at least one active consortium that addresses maternal and child health issues" (response options: Yes/No). To determine implementation of the core systems-building component of having a Local Health System Action Plan, project directors were asked "Does your Healthy Start project have a Local Health System Action Plan (LHSAP)?" (response options: Yes/No; a follow up question was asked to determine if the LHSAP was specific to the Healthy Start project). To determine implementation of the core systems-building component of coordination and collaboration with Title V, project directors were asked to specify the types of collaborative activities that their Healthy Start project established with the State Title V agency. Projects were classified with a "yes" response if the project director indicated that the State Title V agency "is a member of the Healthy Start consortium," "has a written memorandum of understanding or agreement with Healthy Start," "provides contracted services to Healthy Start," "hosts out-stationed Healthy Start staff," "participates in joint training with Healthy Start," "has a shared staffing arrangement with Healthy Start," "coordinates case management or is planning with Healthy Start for shared participants," "shares protocols with Healthy Start," "is involved in Healthy Start sustainability planning," "has a data-sharing arrangement with Healthy Start," "contributes to pooled funding streams to support joint services," "has a Healthy Start employee on their board," "works with Healthy Start to develop consistent health messages for participants," and/or "receives cultural competence training from Healthy Start." To determine implementation of the core systems-building component of having a sustainability plan, project directors were asked "Does your Healthy Start project have a sustainability plan, that is, a plan to maintain services to the target population after federal Healthy Start funding ends?" (response options: Yes/No)
e Data source: Maternal and Child Health Bureau Discretionary Grant Information System
f Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
Discussion
This evaluation of the Federal Healthy Start Program, using both data from a survey of project directors and Healthy Start project birth, service, and system outcome performance measure data, revealed a mixed set of relationships between implementation of core program components and long-term maternal and child health outcomes. Analyses of the 2010 PD survey data indicate that implementation of all core components was associated with better project director-reported intermediate and long-term project outcomes.
This is the first analysis to use MCHB performance measure data in a national evaluation to assess Healthy Start projects' progress toward achieving outcomes that are expected to occur if program elements are successfully and completely implemented. Results from this evaluation are consistent with our hypothesis (illustrated in the logic model, Fig. 1) of a progression of achievement of intermediate outcomes leading to long-term outcomes. For
example, increased screening for perinatal depression, case
management and interconception care services may have
led to PD-reported improvement in maternal health. In
addition, we found that Healthy Start projects that reported
an increase in the number of participants with a medical
home in 2010 and an increase in positive behaviors among
participants had a significantly better (lower) singleton
LBW rate among project participants’ infants than the rate
in their state.
Our analyses used state and national benchmarks, and
our findings are reinforced by the results of previously
published evaluations that were conducted by Healthy Start
projects using vital records, clinical services and program
data. Site-specific evaluations conducted by individual
Healthy Start projects have identified components of the
program that show a positive effect on birth outcomes of
participants’ infants when compared with demographically
similar women who did not participate in the program. For
example, evaluations of individual Healthy Start projects
found that services provided to high risk participants
resulted in improved birth outcomes such as reduced rates
of LBW, preterm birth, and infant mortality [12–14] in
addition to lower rates of sexually transmitted diseases
[15].
Although previous national evaluations of the Federal
Healthy Start Program helped to establish the importance
of the Healthy Start program components for achieving
Program goals, these evaluations relied solely on grantees’
perspectives because objective performance measure data
were not adequate for use in national evaluations. A thorough examination of the PM data reported by Healthy Start projects revealed that the quality of reported data is sufficient for evaluation activities but also identified several key challenges to using these data for program evaluation [8]. Our review of the notes and detailed explanations that accompanied the PM data that grantees submitted to the DGIS revealed data quality issues, including: (1) inconsistency between the definition of the measure used by the project and the definition specified by MCHB; (2) lack of verification of some measures, e.g. PM 52, due to the timing of the completion of birth–death linked files prepared by the state vital records department; and (3) missing and incomplete data. These data limitations may introduce bias if the projects that had missing data or provided incomplete data are different from those that provided accurate and complete data, or if the under-reporting or erroneous reporting is related to the performance measures used as the outcomes for this analysis (PM 51 and PM 52).
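As a minimal sketch of the kind of consistency checks such a review might involve, assuming a hypothetical PM submission extract (the column names and values below are illustrative, not the actual DGIS export layout):

```python
import pandas as pd

# Hypothetical PM submission extract; column names and values are illustrative only,
# not the actual DGIS export layout.
pm = pd.DataFrame({
    "project_id": ["A", "B", "C"],
    "pm51_numerator": [9, None, 140],      # singleton LBW births reported
    "pm51_denominator": [120, 110, 130],   # singleton live births reported
})

checks = pd.DataFrame({
    "missing_fields": pm[["pm51_numerator", "pm51_denominator"]].isna().any(axis=1),
    "numerator_exceeds_denominator": pm["pm51_numerator"] > pm["pm51_denominator"],
})

# Projects flagged for follow-up with the grantee before the measure is used in analysis.
print(pm.loc[checks.any(axis=1), "project_id"].tolist())
```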
A potential limitation of these analyses was the possible
variation in the information source(s) used to complete the
PD survey. Healthy Start project staff, including the project
director and other project staff, were asked to complete the
survey, and the staff member(s) who provided responses
could have varied by project. The survey was pilot-tested
with representatives of different Healthy Start project staff
roles, but allowing survey completion by more than one
type of respondent can increase the potential for variation
in the interpretation of the survey questions and lead to
variation in responses. Responses may also have varied
based on the length of time the respondent had been with
the project, in addition to the length of time that the project
had been in operation and the program components that
were implemented. We did not have access to complete,
reliable information about other project characteristics and
program components needed to perform a comprehensive
evaluation of project implementation in a variety of community settings and to conduct analyses that adequately
addressed all of the relationships outlined in the logic
model. For example, participant demographic data captured by the MCHB DGIS were not available for use in
these analyses. The eligibility criteria for participation in
Healthy Start lead to some demographic similarities across
project sites; however, other important differences in the
populations served by sites may exist. More detailed
information about program implementation and outcomes
achieved by individual Healthy Start projects is needed to
improve the specificity of future evaluations.
Healthy Start projects provide services to high-risk women in the most vulnerable communities in our country.
Improving birth outcomes for project participants requires
intensive and focused services and policies that will assure
quality services within communities. Ongoing monitoring
and assessment of the implementation of these programs
and routine, standardized collection of essential birth out-
come and project implementation data will provide critical
information for evaluating what is and is not working in
individual Healthy Start projects and the Program as a
whole. MCHB could provide Healthy Start Program staff
with online tools and training to improve the reliability of
data collection and reporting. Future Healthy Start Program
evaluations should build on more robust local evaluations
at the project level as well as employ a set of focused
questions for the national evaluation that specifically
address the major issues of interest to state and national
policy-makers. Improved capacity for data collection and
documentation by individual projects would help assure
that comprehensive cross-site evaluations could be con-
ducted in the future. Resources should be provided to
assure that the systems required to conduct this type of
evaluation are in place.
Based on our experience conducting national evalua-
tions of the Federal Healthy Start Program, we recommend
that future evaluations explicitly connect to local, state, and
national frameworks and agendas for improving birth
outcomes and reducing health inequities. The evaluation
plan should incorporate analyses at multiple levels to
provide a robust and comprehensive examination of
Healthy Start Program activities and achievements. Most
importantly, monitoring and evaluation activities con-
ducted by individual Healthy Start projects must be
strengthened to help ensure systematic and standardized
annual reporting to MCHB of performance measure data,
program activities and accomplishments, and other data
needed for evaluation.
Acknowledgments Financial support for this study was provided by the Health Resources and Services Administration, Maternal and Child Health Bureau under Contract No. HHSH250200646015I, Task Order HHSH25034002T: An Evaluation of the Core Components of the Federal Healthy Start Program: A Cross-site Examination. The authors would like to acknowledge the contributions of the Healthy Start Grantees who participated in this evaluation, the staff of the National Healthy Start Association, the Healthy Start Project Officers at MCHB, especially Dr. David de la Cruz and Dr. Keisher Highsmith, and the Healthy Start project team at Abt, including Dr. Chanza Baytop, Ms. Meredith Eastman, Ms. Carolyn Robinson, and Dr. Meghan Woo.
References
1. Health Resources and Services Administration. (2001). Healthy Start Guidance, 2001. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
2. Health Resources and Services Administration. (2005). Healthy Start Guidance, 2005. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
3. Health Resources and Services Administration. (2009). Healthy Start Guidance, 2010. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
4. Maternal and Child Health Bureau. (2013). DGIS Reports: Program Data Reports. Retrieved from https://perf-data.hrsa.gov/MCHB/DGISReports/PerfMeasure/PerfMeasureReports.aspx?Report=ProgramPerfMeasures&Archived=0.
5. Health Resources and Services Administration. (2006). A Profile of Healthy Start: Findings From Phase I of the Evaluation, 2006. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
6. Brand, A., Walker, D. K., Hargreaves, M., & Rosenbach, M. (2010). Intermediate outcomes, strategies, and challenges of eight Healthy Start projects. Maternal and Child Health Journal, 14(5), 654–665.
7. Rosenbach, M., O'Neil, S., Cook, B., Trebino, L., & Walker, D. K. (2010). Characteristics, access, utilization, satisfaction, and outcomes of Healthy Start participants in eight sites. Maternal and Child Health Journal, 14(5), 666–679.
8. Abt Associates. (2012). Methodology for Analysis of Health Resources Service Administration, Maternal and Child Health Bureau Healthy Start Performance Measures, September 2012. Cambridge, MA: Abt Associates.
9. Centers for Disease Control and Prevention, National Center for Health Statistics. Healthy People Objective Targets: Maternal, Infant, and Child Health. Retrieved from http://www.cdc.gov/nchs/data/hpdata2010/hp2010_final_review_focus_area_16.pdf.
10. Centers for Disease Control and Prevention, National Center for Health Statistics. Healthy People Objective Targets: Maternal, Infant, and Child Health. Retrieved from http://www.healthypeople.gov/2020/topicsobjectives2020/overview.aspx?topicid=26.
11. Taylor, Y. J., & Nies, M. A. (2012). Measuring the impact and outcomes of maternal child health federal programs. Maternal and Child Health Journal, 17(5), 886–896. doi:10.1007/s10995-012-1067-y.
12. Will, J. A., Hall, I., Cheney, T., & Driscoll, M. (2005). Flower power: Assessing the impact of the Magnolia Project on reducing poor birth outcomes in an at-risk neighborhood. Journal of Applied Sociology/Sociological Practice, 22.2/7(2), 74–90.
13. Salihu, H. M., Mbah, A. K., Jeffers, D., Alio, A. P., & Berry, L. (2009). Healthy Start program and feto-infant morbidity outcomes: Evaluation of program effectiveness. Maternal and Child Health Journal, 13(1), 56–65. doi:10.1007/s10995-008-0400-y.
14. Kothari, C. L., Wendt, A., Oemeeka, L., Overton, J., & Sweezy, L. C. (2011). Assessing maternal risk for fetal-infant mortality: A population-based study to prioritize risk reduction in a Healthy Start community. Maternal and Child Health Journal, 15(1), 68–76. doi:10.1007/s10995-009-0561-3.
15. Livingood, W. C., Brady, C., Pierce, K., Atrash, H., Hou, T., & Bryant, T., 3rd. (2010). Impact of pre-conception health care: Evaluation of a social determinants focused intervention. Maternal and Child Health Journal, 14(3), 382–391.
INNOVATION NETWORK, INC.
www.innonet.org • [email protected]

Logic Model Workbook
Table of Contents

Introduction - How to Use this Workbook
Before You Begin
Developing a Logic Model
Purposes of a Logic Model
The Logic Model's Role in Evaluation
Logic Model Components – Step by Step
   Problem Statement: What problem does your program address?
   Goal: What is the overall purpose of your program?
   Rationale and Assumptions: What are some implicit underlying dynamics?
   Resources: What do you have to work with?
   Activities: What will you do with your resources?
   Outputs: What are the tangible products of your activities?
   Outcomes: What changes do you expect to occur as a result of your work?
      Outcomes Chain
      Outcomes vs. Outputs
Logic Model Review
Appendix A: Logic Model Template
Appendix B: Worksheet: Developing an Outcomes Chain
Introduction - How to Use this Workbook
Welcome to Innovation Network’s Logic Model Workbook. A
logic model is a commonly-used
tool to clarify and depict a program within an organization. You
may have heard it described as
a logical framework, theory of change, or program matrix—but
the purpose is usually the same:
to graphically depict your program, initiative, project or even
the sum total of all of your
organization’s work. It also serves as a
foundation for program planning and
evaluation.
This workbook is a do-it-yourself guide to
the concepts and use of the logic model. It
describes the steps necessary for you to
create logic models for your own
programs. This process may take
anywhere from an hour to several hours or
even days, depending on the complexity of
the program.
We hope you will use this workbook in the way that works best
for you:
• As a stand-alone guide to help create a logic model for a
program in an organization,
• As an additional resource for users of the Point K Learning
Center, and/or
• As a supplement to a logic model training conducted by
Innovation Network.
You can create your logic model online using the Logic Model
Builder in Innovation Network’s
Point K Learning Center, our suite of online planning and
evaluation tools and resources at
www.innonet.org. This online tool walks you through the logic
model development process;
allows you to save your work and come back to it later; share
work with colleagues to review
and critique; and print your logic model in an attractive, one-
page presentation view for sharing
with stakeholders. Free registration is required.
For those of you who prefer to work on paper or who don’t have
reliable Internet access, a logic
model template is located in Appendix A of this workbook. You
may want to make several
copies of this template, to allow for adjustments and updates to
your logic model over time.
This checklist icon appears at points in the workbook at which
you should record
something – either write something in your template, or enter it
into your online Logic Model
Builder.
Why evaluate?
Evaluation serves many purposes:
• Supports program and strategic
planning
• Helps communicate your goals and
progress
• Serves as a basis for ongoing learning
to make your work stronger and
more effective.
Ongoing Learning Cycle
Evaluation is an ongoing learning cycle; a process that starts
with planning, leads into data
collection, analysis and reflection, and then to action and
improvement. Logic models are the
foundation of planning and the core of any evaluation process.
As you make strategic decisions
based on evaluation findings, you move right back into the
planning stage.
Before You Begin
In preparing to create a logic model, you may want to consider:
What stakeholders should I involve?
The development of a logic model offers an opportunity to
engage your program’s stakeholders
in a discussion about the program. Stakeholders might include
program staff, clients/service
recipients, partners, funders, board members, community
representatives, and volunteers.
Their perspectives can enrich your program logic model by
clarifying expectations for the
program.
What is the scope of this logic model?
• Identify a timeframe for the logic model you are about to
create. It will help you frame
short-, intermediate-, and long-term outcomes and make better
decisions about resources
and activities. Many groups design logic models for a funding
or program cycle, a fiscal
year, or a timeframe in which they believe they can achieve
some meaningful results.
• This logic model structure is intended for program planning.
Define the parameters of
your program clearly. If your organization is small and only
has one program, you can
also use this structure for small-scale strategic planning.
Developing a Logic Model
Many different logic model formats exist, but they all contain
the same core concepts. The
format we use in this workbook and in our online tools has
proven useful and manageable for
the nonprofit partners we have worked with, and is the result of
more than fifteen years of
program planning and evaluation experience in the field.
It’s not necessary to create your logic model all in one sitting.
It will almost certainly be useful
to talk to other program stakeholders and get their input along
the way. You can work through
the process as we have it laid out here – starting with the
problem your program is meant to
solve, and ending with your intended outcomes – or, if it’s
easier for you, you can work in
reverse, starting with outcomes and working your way
backwards.
Similarly, the names of key components may also vary among
different logic models used in the
field, but the underlying concepts are the same. In this
workbook, we identify other terms used
in the field for similar concepts. As you develop your logic
model, we encourage you to find a
common language to use among key stakeholders, whether that
language reflects the terms
used here or elsewhere. The important thing is that everyone
involved uses the same terms.
The components of the logic model used by Innovation Network are: problem statement, goal, rationales and assumptions, resources, activities, outputs, and outcomes.
A series of “if-then” relationships connect the components of
the logic model: if resources are
available to the program, then program activities can be
implemented; if program activities are
implemented successfully, then certain outputs and outcomes
can be expected.
As you draft each component of the logic model, consider the
if-then relationship between the
components. If you cannot make a connection between each
component of the logic model, you
should identify the gaps and adjust your work. This may mean
revising some of your activities
to ensure that you are able to achieve your outcomes, or
revising intended outcomes to be
feasible with available resources.
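The if-then chain can also be expressed as a simple data structure so that gaps stand out during planning. The sketch below is purely illustrative and not an Innovation Network tool; the class name, the chain_gaps check, and the sample entries (drawn loosely from the home-buying example) are our own assumptions.

```python
# Minimal sketch: a logic model whose components can be checked for missing links.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    problem_statement: str
    goal: str
    resources: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)

    def chain_gaps(self) -> List[str]:
        """Return the names of any empty components: each 'if' needs a 'then'."""
        return [name for name in ("resources", "activities", "outputs", "outcomes")
                if not getattr(self, name)]

model = LogicModel(
    problem_statement="I do not own my own home.",
    goal="Increase my financial independence and security through home ownership.",
    resources=["Clear financial records"],
    activities=["Research local neighborhoods", "Hire a real-estate agent"],
    outputs=["Number of neighborhoods researched"],
    outcomes=[],  # left empty on purpose so the check flags a broken chain
)
print(model.chain_gaps())  # -> ['outcomes']
```

An empty list from chain_gaps means every link in the if-then chain has something behind it; a non-empty list points to the component that still needs work.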
Purposes of a Logic Model
The logic model is a versatile tool that can support many
management activities, such as:
• Program Planning. The logic model is a valuable tool for
program planning and
development. The logic model structure helps you think through
your program
strategy—to help clarify where you are and where you want to
be.
• Program Management. Because it "connects the dots" between
resources, activities, and
outcomes, a logic model can be the basis for developing a more
detailed management
plan. Using data collection and an evaluation plan, the logic
model helps you track and
monitor operations to better manage results. It can serve as the
foundation for creating
budgets and work plans.
• Communication. A well-built logic model is a powerful
communications tool. It can
show stakeholders at a glance what a program is doing
(activities) and what it is
achieving (outcomes), emphasizing the link between the two.
• Consensus-Building. Developing a logic model builds common
understanding and
promotes buy-in among both internal and external stakeholders
about what a program
is, how it works, and what it is trying to achieve.
• Fundraising. A sound logic model demonstrates to funders that
you have purposefully
identified what your program will do, what it hopes to achieve,
and what resources you
will need to accomplish your work. It can also help structure
and streamline grant
writing.
The logic model you create with this workbook can be used for
any or all of the above purposes
– any time you need to show or refer to a clear and succinct
picture of your program.
[Figure: If resources, then activities; if activities, then outputs; if outputs, then outcomes.]
The Logic Model's Role in Evaluation
The cornerstone of effective evaluation is a thorough
understanding of the program you are
trying to evaluate: What resources it has to work with, what it is
doing, what it hopes to
achieve, for whom, and when. In conducting an evaluation, it is
tempting to focus most of your
attention on data collection. However, your evaluation efforts
will be more effective if you start
with a logic model. Going through the logic model process will
help ensure that your
evaluation will yield relevant, useful information.
The figure below illustrates how the logic model you will build
can serve as the foundation for
future evaluation plans. (Our Evaluation Plan Workbook and
online Evaluation Plan Builder offer
guidance for creating evaluation plans.)
[Figure: The logic model (resources, activities, outputs, outcomes) feeds two evaluation plans: a process evaluation plan (activities, outputs, data collection) and an outcomes evaluation plan (outcomes, indicators, data collection).]
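To make the figure concrete, here is a small, purely illustrative sketch of how logic model components might feed the two evaluation plans. The dictionary names, the indicator, and the data-collection methods are our own hypothetical examples, not prescriptions from the workbook or its online tools.

```python
# Illustrative mapping of logic model components into evaluation plans.
logic_model = {
    "resources": ["Clear financial records"],
    "activities": ["Attend homebuyer education workshops"],
    "outputs": ["Number of workshops attended"],
    "outcomes": ["Increased understanding of the home buying process"],
}

# Process evaluation tracks what the program does and produces.
process_plan = {
    "activities": logic_model["activities"],
    "outputs": logic_model["outputs"],
    "data_collection": "attendance logs",  # hypothetical method
}

# Outcome evaluation tracks the change the program creates, via indicators.
outcome_plan = {
    "outcomes": logic_model["outcomes"],
    "indicators": ["% of participants passing a home-buying knowledge quiz"],  # hypothetical
    "data_collection": "pre/post survey",  # hypothetical method
}

print(process_plan)
print(outcome_plan)
```

The point of the split is simply that activities and outputs answer "did we do what we planned?", while outcomes and their indicators answer "did it make the difference we intended?".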
Logic Model Components – Step by Step
A note about our “Home Buying” example: People often ask for
examples that relate directly to
their program area—but examples for one programmatic area
can be difficult to “translate” to
another programmatic area. We use the example of becoming a
homeowner to give a more
general conceptual framework.
Problem Statement
The first step in creating a logic model is to clearly articulate the problem your work is trying to solve—that is, to frame a particular challenge for the population you serve.
Other Terms for
“Problem Statement”
You might also hear a problem statement
called an "issue statement" or "situation."
Your problem statement should briefly explain what needs to
change: why is there a need for
an intervention? Your problem statement answers the question,
“What problem are we working
to solve?” Include “who, what, why, where, when, and how” in
your statement.
Sample problem statements:
I do not own my own home, so I do not experience the many
financial and emotional
benefits of home ownership.
A growing number of women in Highland Falls lack the
confidence and know-how to
obtain employment and be self-sufficient due to low literacy in
our region.
In Townsville, low-income residents with bad or no credit do
not have resources available
to help them improve their current living situations.
Build Your Logic Model: When you have identified your
problem statement, insert it into
the Problem Statement box in your logic model template, or on
the “Problem/Goals” tab of the
online Logic Model Builder.
Goal
Next, think about the overall purpose of what you are
trying to measure (your program, intervention, etc).
What are you trying to accomplish? The answer to this
question is the solution to your problem statement, and
will serve as your goal.
Goals serve as a frame for all elements of the logic model that
follow. They reflect
organizational priorities and help you steer a clear direction for
future action.
Goals should:
• Include the intended results—in general terms—of the
program or initiative.
• Specify the target population you intend to serve.
Examples of goal statements include:
To increase my financial independence and security through
home ownership.
Significantly increase the literacy rates among children with
reading difficulties at Yisser
Elementary School by implementing a teen-tutored reading
program.
Assist clients in their effort to become economically self-
sufficient.
Improve the health status of children, ages birth to 8 years, in
Harrison County.
Other Terms for
“Goal”
You might also hear a goal called an
"objective" or a "long-term outcome."
Goal Tips:
• All logic model components should be connected to your goal.
Having a clear goal helps
fight the temptation to implement an interesting program that
doesn’t really “fit.”
• It’s tempting to have more than one goal, but we recommend
that you articulate one clear
solution to your problem statement. Other goals of your
program may be long-term
outcomes, rather than goals.
• Phrase your goal in terms of the change you want to achieve
over the life of your
intervention, rather than a summary of the services you are
going to provide.
• Don’t make your statement so broad and general that it
provides no guidance for your
project.
Build Your Logic Model: Insert your goal statement(s) into the
Goal box in your logic
model template, or on the “Problem/Goals” tab of the online
Logic Model Builder.
Rationales
A program’s rationales are the beliefs about how change occurs
in your field and with your
specific clients (or audience), based on research, experience, or
best practices. For example:
Home ownership increases a person’s options for financial
stability and wealth-building.
Current research on women leaving public income support
systems indicates that targeted job
training, partnered with a menu of support and coaching
services, can help women get and keep
living wage jobs.
Success in moving into higher-paying jobs and achieving
economic self-sufficiency is closely
related to the availability of opportunities for training and
education.
These rationales all demonstrate a core set of beliefs based on
knowledge about how changes
occur in the field.
Build Your Logic Model: If you choose to include Rationales
in your logic model, record
them in the “Rationales” box on the template, or on the
“Rationale/Assumptions” tab in the
online Logic Model Builder.
Assumptions
The assumptions that underlie a program’s theory are conditions
that are necessary for success,
and you believe are true. Your program needs these conditions
in order to succeed, but you
believe these conditions already exist – they are not something
you need to bring about with
your program activities. In fact, they are not within your
control.
These assumptions can refer to facts or special circumstances in
your community, region, and/or
field. Examples of program assumptions are:
There are houses for sale for which potential homebuyers will
qualify.
There are living wage jobs available within a reasonable
distance of this neighborhood, with
adequate public transportation to reach those jobs.
Two counselors can serve a client population of approximately
40.
The first assumption demonstrates that there is a circumstance
within the community that will
enable a homebuyer to successfully purchase a home. The third
example shows that the
program manager has clearly thought out how many counselors
are needed to support the
number of participants the program will serve.
Build Your Logic Model: If you choose to include the
Assumptions behind your
program choices in your logic model, record them in the
“Assumptions” box on the template, or
on the “Rationale/Assumptions” tab in the online Logic Model
Builder.
Resources
Identify the available resources for your program. This
helps you determine the extent to which you will be
able to implement the program and achieve your
intended goals and outcomes.
List the resources that you currently have to support
your program. (If you intend to raise additional resources for
the program during this program
timeframe, account for them under "Activities.")
An exception: If you’re building your logic model as part of a
proposal or to justify a funding
request, list all the resources you will need for a successful
program, whether or not you have
them in hand. (You may wish to separate resources under
headings for “need” and “have.”)
Other Terms for
“Resources”
You might also hear resources called
“inputs” or “program investments”.
Common types of resources include:
• Human resources: Full- and part-time staff, consultants (e.g., fundraising, technical support, strategic planning, communications), pro bono staff services, and volunteers
• Financial resources: Restricted grants, operating budget, and other monetary resources
• Technology: Communications infrastructure (email, website)
• Equipment: Office equipment (printers, copiers) and equipment specific to the program
• Other: Supplies (materials), insurance, etc.
Resource Tips:
• Identify the major resource categories for your program.
• Be specific about these resources, but do not spend a lot of
time developing a detailed
list of all actual or anticipated program expenditures.
Not specific enough: Home-buying resources
Just right: Clear financial records
Too specific: W2 forms, 1099s, tax returns, bank statements, pay stubs, utilities bills, credit report

Not specific enough: Staff
Just right: 3 full-time staff, 1 part-time
Too specific: 1 project lead @ 40 hrs/wk, 2 project associates @ 40 hrs/wk, 1 part-time support person @ 20 hrs/wk

Not specific enough: Supplies
Just right: Art supplies
Too specific: 25 paintbrushes, 50 bottles of paint, 250 sheets of paper, 25 coffee cans, dishwashing liquid
• Remember to include resources such as technology, materials,
and space: these are often
overlooked at the program planning stage, which can cause
trouble later.
• You can use your resource list as the foundation for
developing a program budget.
• Do you receive in-kind contributions? List those among your
resources.
Build Your Logic Model: List your resources statement(s) in
the Resources box in your
logic model template, or on the “Timeframe/Resources” tab of
the online Logic Model Builder.
Activities
Activities are the actions that are needed to implement
your program—what you will do with program
resources in order to achieve program outcomes and,
ultimately, your goal(s).
Common activities are:
• Developing products (e.g., promotional materials and
educational curricula),
• Providing services (e.g., education and training, counseling or
health screening),
• Engaging in policy advocacy (e.g., issuing policy statements,
conducting public
testimony), or
• Building infrastructure (e.g., strengthening governance and
management structures,
relationships, and capacity).
It is often helpful to group related activities together. The
number of activity groups depends on
your program’s size and how you administer it. For a large
program, there might be numerous
activity groups; smaller programs may consist of just one or
two. Each activity group will have
more specific activities under it—but remember, this isn’t a to-
do list. Getting too specific can
overwhelm your audience.
Examples:
For our homebuying example, we use the activity groups of
preliminary research, financial
preparation, homebuyer’s education, identify a neighborhood,
secure mortgage loan, choose a
house, and make the purchase.
A program with the goal of reducing the teen pregnancy rate in
its city might have the following
activity groups: family planning education, mentoring, and
providing individual and group
counseling.
A program with a goal of increasing organizational capacity
through strategic use of technology
might have the following activity categories: technology
planning, selecting and implementing
technology infrastructure, staff assessment and training, and
network support.
Other Terms for
“Activities”
You might also hear activities called
“processes,” “strategies,” “methods,” or
“action steps.”
Activities Tips:
• You can use the activities you identify here as an outline for a
work plan. Use the
activities as headings in a more comprehensive work plan that
includes staff
assignments and a timeline.
• Providing a complete list of activities helps people who are not familiar with your program understand what it really takes to implement it—but getting too specific can overwhelm them. The chart below gives some examples of what level of specificity is useful.
Activity Group: Identify a neighborhood
ACTIVITIES:
• Hire real-estate agent
• Drive around the city
This set of activities is not detailed enough. It omits a number of key steps needed to identify a neighborhood.
Activity Group: Identify a neighborhood
ACTIVITIES:
• Conduct Google search
• Interview friends and family
• Choose three books from the local library
about neighborhoods
• Read three books
• Hire a driver to tour neighborhoods
• Try neighborhood restaurants
• Set up review meeting
• Take friends and family on neighborhood
tours
o Send out Invitations
o Arrange transportation
This is too detailed. It would more appropriately
belong in a work plan.
Activity Group: Identify a neighborhood
ACTIVITIES:
• Research local neighborhoods--amenities
and prices
• Hire a real-estate agent
• Tour priority neighborhoods
This is just about the right level of detail for a logic
model.
Build Your Logic Model: List all activities required to
implement your program, and
group related activities together. Record them in your template
or on the “Activities/Outputs”
tab of the online Logic Model Builder.
Outputs
Outputs are the measurable, tangible, and direct products
or results of program activities. They lead to desired
outcomes—benefits for participants, families,
communities, or organizations—but are not themselves the
changes you expect the program will produce. They do
help you assess how well you are implementing the
program.
Whenever possible, express outputs in terms of the size and/or
scope of services and products
delivered or produced by the program. They frequently include
quantities or reflect the
existence of something new.
Examples of program outputs include numbers and descriptions
of:
• Number of home buying workshops attended
• Number of neighborhoods researched
• Number of program participants served
• Hours of service provided
• Number of partnerships or coalitions formed
• Focus groups held
• Policy briefings conducted
An output statement doesn’t reveal anything about quality. You
will assess the quality of your
outputs in your evaluation.
Outputs Tips:
• Make sure your outputs have activities and resources
associated with them. This is one
way a logic model is useful—to check whether a program has
planned how it will create
a product or deliver a service.
• Many people identify specific numbers for their outputs.
Begin with an estimate, based
on experience, desired impact, and resources available. Don’t
get stuck on exact
numbers; you can adjust them later.
Build Your Logic Model: List all the outputs you expect your
program’s activities will
produce. Place these in the Outputs box of the logic model
template or on the
“Activities/Outputs” tab of the online Logic Model Builder.
Other Terms for
“Outputs”
You might also hear outputs called
“deliverables,” “units of service,” or
“products.”
Outcomes
Outcomes express the results that your program intends to
achieve if implemented as planned. Outcomes are the
changes that occur or the difference that is made for
individuals, groups, families, organizations, systems, or
communities during or after the program.
Outcomes answer the questions: “What difference does the
program make? What does success
look like?” They reflect the core achievements you hope for
your program.
Outcomes should:
• Represent the results or impacts that occur because of program
activities and services
• Be within the scope of the program’s control or sphere of
reasonable influence, as well as
the timeframe you have chosen for your logic model
• Be generally accepted as valid by various stakeholders of the
program
• Be phrased in terms of change
• Be measurable. (It may take work to translate them into
measurable indicators.)
Types of Change: Organizations with diverse missions and
services share common categories
of outcomes, because outcomes are about change: changes in
learning, changes in action, or
changes in condition.
Changes in Learning:
o New knowledge
o Increased skills
o Changed attitudes, opinions, or values
o Changed motivation or aspirations
For example:
• Potential homeowners increase their understanding of the
home buying process
• Teens ages 15-18 increase their commitment to community
service.
Changes in Action:
o Modified behavior or practice
o Changed decisions
o Changed policies
For example:
• Potential homeowners have purchased their first home.
• Teens ages 15-18 participate in community service.
Other Terms for
“Outcomes”
You might also hear outcomes called
“results”, “impacts”, or “objectives”.
Changes in Condition:
o Human (e.g., from oppression to freedom; from
malnourishment to food
security)
o Economic (e.g., from unemployed to employed)
o Civic (e.g., from disenfranchised to empowered)
o Environmental (e.g., from polluted to clean)
For example:
• Potential homeowners have purchased their first home.
• Teens ages 15-18 have improved employment prospects
because of community
service.
Focus of Outcomes: Clarify who or what will experience the
intended changes.
1. Individual, Client-Focused Outcomes: These reflect the
difference the program will make in the
lives of those directly served by the program. Examples
include:
• Potential homebuyer has purchased a home (change in
status/condition)
• Parents use alternative discipline approaches (behavior)
• Participants are better able to organize and advocate for their
rights (skills)
• Children are better prepared to enter school (changed
status/condition)
2. Family or Community Outcomes: Some programs intend to
create change for families,
neighborhoods, or whole communities. Examples include:
• Higher percentage of homeowners as opposed to renters in a
low-income community
• Improved communication among family members
• Increased parent-child-school interactions
• Decreased neighborhood violence
• Community group has an inclusive membership policy, work
group practices, and
democratic governance
3. Systemic Outcomes: These illustrate changes to overall
systems and might include cases where
agencies, departments, or complex organizations work in new
ways, behave differently, share
resources, and provide services in a coordinated fashion.
Examples include:
• Integrated system of services or interagency resource sharing
• Greater coordination among partners in a system
4. Organizational Outcomes: Some programs lead to internal
outcomes—both individual and
institutional—that affect how well a program can achieve
external outcomes. These produce
improvements in program management and organizational
effectiveness. Examples of
organizational outcomes include:
• Increased efficiency
• Increased staff motivation
• Increased collaboration with other organizations
Chain of Outcomes. Not all outcomes can occur at the same
time. Some outcomes must occur
before others become possible. This is referred to as the “chain
of outcomes.” (See Appendix B
for a worksheet.)
• Short-term Outcomes: What change do you expect to occur either immediately or in the near future? Short-term outcomes are those that are the most direct result of a program's activities and outputs. They are typically not ends in themselves, but are necessary steps toward desired ends (intermediate or long-term outcomes or goals).
• Intermediate Outcomes: What change do you want to occur after that? Intermediate outcomes are those outcomes that link a program's short-term outcomes to long-term outcomes.
• Long-term Outcomes: What change do you hope will occur over time? Long-term outcomes are those that result from the achievement of your short- and intermediate-term outcomes. They are also generally outcomes over which your program has a less direct influence. Often long-term outcomes will occur beyond the timeframe you identified for your logic model. (A brief illustrative sketch of an outcomes chain follows this list.)
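As a quick illustration, the sketch below lays the three stages out as an ordered structure. The stage names follow the workbook; the short- and long-term entries reuse the home-buying example from earlier, and the intermediate entry is a hypothetical link added only for demonstration.

```python
# Illustrative sketch of an outcomes chain: stages are ordered because each
# stage depends on the one before it being achieved.
from collections import OrderedDict

outcomes_chain = OrderedDict([
    ("short_term", ["Potential homeowners increase their understanding of the home buying process"]),
    ("intermediate", ["Potential homeowners qualify for a mortgage loan"]),  # hypothetical link
    ("long_term", ["Potential homeowners have purchased their first home"]),
])

# Earlier stages are closer in time, easier to measure, and more attributable
# to the program; later stages depend on the earlier ones.
for stage, changes in outcomes_chain.items():
    print(f"{stage}: {changes}")
```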
Outcomes Chain Example
Good Health for Kids is an advocacy organization that educates
parents and guardians about
the importance of immunizing children. The staff has identified
the following program
activities:
• Develop educational literature
• Disseminate literature to social service agencies
• Develop public service announcements (PSAs)
• Identify and work with radio stations to air radio spots
The outcomes associated with these activities fall into three
categories:
Short-Term — LEARNING: The knowledge parents and guardians gain from the literature & PSAs.
• Increased understanding among targeted parents of the importance of childhood immunization
• Increased knowledge among targeted parents of where to go to have their children immunized

Intermediate — BEHAVIOR: The actions parents & guardians take as a result of that knowledge.
• Increased number of targeted parents who take their children to be immunized

(Short-term outcomes are closer in time, easier to measure, and more attributable to the program; intermediate and long-term outcomes are less so.)

Long-Term — CONDITION: The conditions that change as a result of those actions.
• Increased number of
Write a page of personal reflection of your present leadership compe.docx
 
Write a page of compare and contrast for the Big Five Personalit.docx
Write a page of compare and contrast for the Big Five Personalit.docxWrite a page of compare and contrast for the Big Five Personalit.docx
Write a page of compare and contrast for the Big Five Personalit.docx
 
Write a page of research and discuss an innovation that includes mul.docx
Write a page of research and discuss an innovation that includes mul.docxWrite a page of research and discuss an innovation that includes mul.docx
Write a page of research and discuss an innovation that includes mul.docx
 
Write a page answering the questions below.Sometimes projects .docx
Write a page answering the questions below.Sometimes projects .docxWrite a page answering the questions below.Sometimes projects .docx
Write a page answering the questions below.Sometimes projects .docx
 
Write a one-paragraph summary of one of the reading assignments from.docx
Write a one-paragraph summary of one of the reading assignments from.docxWrite a one-paragraph summary of one of the reading assignments from.docx
Write a one-paragraph summary of one of the reading assignments from.docx
 
Write a one-paragraph summary of this article.Riordan, B. C..docx
Write a one-paragraph summary of this article.Riordan, B. C..docxWrite a one-paragraph summary of this article.Riordan, B. C..docx
Write a one-paragraph summary of this article.Riordan, B. C..docx
 
Write a one-paragraph response to the following topic. Use the MLA f.docx
Write a one-paragraph response to the following topic. Use the MLA f.docxWrite a one-paragraph response to the following topic. Use the MLA f.docx
Write a one-paragraph response to the following topic. Use the MLA f.docx
 
Write a one-page rhetorical analysis in which you analyze the argume.docx
Write a one-page rhetorical analysis in which you analyze the argume.docxWrite a one-page rhetorical analysis in which you analyze the argume.docx
Write a one-page rhetorical analysis in which you analyze the argume.docx
 
Write a one pageliterature review of your figure( FIGURE A.docx
Write a one pageliterature review of your figure( FIGURE A.docxWrite a one pageliterature review of your figure( FIGURE A.docx
Write a one pageliterature review of your figure( FIGURE A.docx
 
Write a one page-paper documenting the problemneed you wish to .docx
Write a one page-paper documenting the problemneed you wish to .docxWrite a one page-paper documenting the problemneed you wish to .docx
Write a one page-paper documenting the problemneed you wish to .docx
 
Write a one page report on Chapter 1 and 2 with the same style of mo.docx
Write a one page report on Chapter 1 and 2 with the same style of mo.docxWrite a one page report on Chapter 1 and 2 with the same style of mo.docx
Write a one page report on Chapter 1 and 2 with the same style of mo.docx
 
Write a one page reflection about the following1) Identify .docx
Write a one page reflection about the following1) Identify .docxWrite a one page reflection about the following1) Identify .docx
Write a one page reflection about the following1) Identify .docx
 
Write a one page paper on the question belowSome of the current.docx
Write a one page paper on the question belowSome of the current.docxWrite a one page paper on the question belowSome of the current.docx
Write a one page paper on the question belowSome of the current.docx
 
Write a one page paper (double spaced) describing and discussing the.docx
Write a one page paper (double spaced) describing and discussing the.docxWrite a one page paper (double spaced) describing and discussing the.docx
Write a one page paper (double spaced) describing and discussing the.docx
 
write a one page about this topic and provide a reference.Will.docx
write a one page about this topic and provide a reference.Will.docxwrite a one page about this topic and provide a reference.Will.docx
write a one page about this topic and provide a reference.Will.docx
 
Write a one or more paragraph on the following question below.docx
Write a one or more paragraph on the following question below.docxWrite a one or more paragraph on the following question below.docx
Write a one or more paragraph on the following question below.docx
 
Write a one or more page paper on the following belowWhy are .docx
Write a one or more page paper on the following belowWhy are .docxWrite a one or more page paper on the following belowWhy are .docx
Write a one or more page paper on the following belowWhy are .docx
 
Write a one page dialogue in which two characters are arguing but .docx
Write a one page dialogue in which two characters are arguing but .docxWrite a one page dialogue in which two characters are arguing but .docx
Write a one page dialogue in which two characters are arguing but .docx
 

Recently uploaded

9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room servicediscovermytutordmt
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfAyushMahapatra5
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajanpragatimahajan3
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpinRaunakKeshri1
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhikauryashika82
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Disha Kariya
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024Janet Corral
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationnomboosow
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 

Recently uploaded (20)

Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
9548086042 for call girls in Indira Nagar with room service
9548086042  for call girls in Indira Nagar  with room service9548086042  for call girls in Indira Nagar  with room service
9548086042 for call girls in Indira Nagar with room service
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajan
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Student login on Anyboli platform.helpin
Student login on Anyboli platform.helpinStudent login on Anyboli platform.helpin
Student login on Anyboli platform.helpin
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..Sports & Fitness Value Added Course FY..
Sports & Fitness Value Added Course FY..
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 

The Federal Healthy Start Program focuses on improving the health and well-being of women, infants, children and their families through the implementation of evidence-based practices and innovative community interventions. In 2010, Healthy Start projects served almost 30,000 pregnant women, many of whom were black or African American, 34 years and younger, with incomes below 100 percent of the federal poverty level [4].

Healthy Start projects work collaboratively with community stakeholders and consumers to leverage existing service and system resources so that women at risk for adverse birth outcomes are assured continuity of care during pregnancy through 2 years postpartum. Since 2001, all Healthy Start projects have been required to implement nine "core" program components: five service components (outreach and recruitment, case management, health education, interconception care (ICC), perinatal depression screening) and four systems-building components (consortia, local health systems action plan (LHSAP), coordination and collaboration with Title V, and a sustainability plan). Healthy Start projects may also implement other support services needed in their local communities, such as breastfeeding support and education, screening for domestic/intimate partner violence and child abuse, initiatives to improve family and/or male involvement, healthy weight interventions, home visiting, and smoking cessation [1-3].

National Evaluations of the Federal Healthy Start Program

The Federal Healthy Start Program has been evaluated from its inception in the early 1990s. The first national evaluation, conducted from 1997 through 1999, examined the implementation of the 15 demonstration project activities during fiscal years 1992 and 1996 and assessed whether these projects achieved the Healthy Start Program goals of reducing infant mortality and improving maternal and infant health. The second national evaluation was conducted in two phases from 2002 through 2007 and sought to obtain information about the implementation of program components and to identify program features associated with improved perinatal outcomes. Findings from this evaluation were summarized in a profile report presenting the characteristics of all Healthy Start projects [5] and in case studies that documented the context and implementation of the Healthy Start Program in eight sites [6]. The evaluation also collected information on program implementation and outcomes through a participant survey that was conducted in four sites [7]. The third national evaluation is the cross-site evaluation summarized in this article. It was conducted from 2009 through 2012 to examine relationships between the core program components and long-term program and birth outcomes, in addition to factors that influence these relationships. The primary objective of the evaluation was to assess the effect of implementation of all nine core program components on long-term maternal and child health outcomes.
Methods

The evaluation was guided by a logic model (Fig. 1) that outlined the hypothesized relationships between Healthy Start project context, implementation of core service and system program components, and four long-term outcomes relevant to the Healthy Start Program goals: (1) improved birth outcomes, (2) improved maternal health, (3) improved child health, and (4) sustained community capacity to reduce disparities in health status in the target community.

Fig. 1 Logic model for the cross-site evaluation of Healthy Start

A cross-sectional design was used to assess the associations of implementation of the nine core program components with (1) project characteristics, (2) achievement of intermediate project outcomes, (3) service and system activities conducted by the Healthy Start project that made a primary or major contribution to reducing disparities in maternal and infant health outcomes, and (4) achievement of long-term birth outcomes.

Data Sources

Self-reported data from the 2010 project director survey (PD survey) and performance measure (PM) data for 2009 reported to the Maternal and Child Health Bureau (MCHB) Discretionary Grant Information System (DGIS) were used in all analyses. The 2010 PD survey was administered via web to Healthy Start project staff between July and September 2011 and was completed for all 104 projects (100 % response rate). The survey was designed to collect information on implementation and features of the nine core program components as well as additional support services offered by each Healthy Start project and project achievements.

The DGIS is a Web-based system that MCHB grantees use to report their data online to MCHB through HRSA's Electronic Handbook as a part of the grant application and performance reporting processes; it is the repository of PM data for all MCHB programs. During the time period of this evaluation, the MCHB utilized 15 PMs to monitor the progress of all Healthy Start projects towards the achievement of Program objectives. A list of current MCHB Healthy Start Program PMs is available from: https://mchdata.hrsa.gov/DGISReports/PerfMeasure/default.aspx. Performance measure data for 2009 were available for 98 projects. After a thorough examination of the available PM data from the DGIS [8], four PMs (two birth outcome PMs, one service outcome PM and one system outcome PM; Table 1) were selected for this evaluation based on the quality and consistency of data as well as the relevance of the PM to the evaluation objectives. Project characteristic data that were consistently reported in the DGIS were also used in our analyses. State Title V birth outcome PMs (singleton LBW and IMR) and Healthy People 2010 and 2020 objective targets (LBW and IMR) [9, 10] were used as benchmarks for comparison.

Measurement of Variables

Variable selection was informed by program components and expected outcomes, the logic model, and previous studies of birth outcomes [11]. The primary exposure of interest was the implementation of all nine core program components: outreach and recruitment, case management, health education, interconception care (ICC), perinatal depression screening, consortia, local health systems action plan (LHSAP), coordination and collaboration with Title V, and a sustainability plan. Implementation was determined using data from the 2010 PD survey (yes/no response for each component). The birth outcomes of interest were measured using two PMs reported in the DGIS in 2009: percent singleton low birth weight (PM 51) and infant mortality rate (PM 52).

We examined characteristics hypothesized to influence the association of implementation of program components with birth outcomes. We obtained information on these characteristics from the 2010 PD survey and the DGIS. Maternal demographic characteristics were not available for this analysis. Project characteristics (Table 2) that were examined were length of funding (initial project funding received in Phase 1 [1991-1996], 2 [1997-2000], 3 [2001-2004], or 4 [2005-2010]), geographic location (urban, not urban), and organization type (government agency, community-based non-governmental agency or other organization type). Project director report of achievement of intermediate outcomes (eleven outcomes; see Table 2) (yes/no), service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes (fourteen activities; see Table 2) (yes/no), and achievement of long-term maternal and child health and community capacity outcomes (five outcomes; see Table 2) (yes/no) were examined in descriptive analyses and included as covariates in multivariable analyses. One service outcome PM (PM 20, the percent of women participants who have an ongoing source of primary and preventive care services for women) and one system outcome PM (PM 22, a score between 0 and 64 representing the degree to which the project facilitated health providers' screening of women participants for eight risk factors) were examined in descriptive analyses and included as covariates in multivariable analyses (see Table 1).

Table 1 MCHB performance measures (PM) used in multivariate analyses

Birth outcomes
  PM 51: Percent of live singleton births weighing less than 2,500 g. Numerator: number of live singleton births less than 2,500 g in the calendar year to program participants. Denominator: live singleton births in the calendar year among program participants.
  PM 52: The infant mortality rate per 1,000 live births. Numerator: number of deaths to infants from birth through 364 days of age to program participants. Denominator: number of live births in the calendar year among program participants.

Service outcomes
  PM 20: The percent of women participating in MCHB supported programs who have an ongoing source of primary and preventive care services for women. Numerator: the number of women participating in MCHB-funded projects who have an ongoing source of primary and preventive care services during the reporting period. Denominator: the number of women participating in MCHB-funded projects during the reporting period.

Systems outcomes
  PM 22: The degree to which MCHB supported programs facilitate health providers' screening of women participants for risk factors. Total possible score: 0-64. Scoring instructions: using a scale of 0-2, indicate the degree to which your grant has performed each activity to facilitate screening for each risk factor by health providers in your program. Scale definitions: 0 = grantee does not provide this function or assure that this function is completed; 1 = grantee sometimes provides or assures the provision of this function but not on a consistent basis; 2 = grantee regularly provides or assures the provision of this function. Risk factors: (1) smoking, (2) alcohol, (3) illicit drugs, (4) eating disorders, (5) depression, (6) hypertension, (7) diabetes, (8) domestic violence.

A list of all current MCHB Healthy Start Program PMs is available from: https://mchdata.hrsa.gov/DGISReports/PerfMeasure/default.aspx
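To make the two birth outcome measures concrete, the short sketch below computes PM 51 and PM 52 values from the numerator and denominator counts defined in Table 1. It is an illustrative Python sketch with hypothetical counts, not code used in the evaluation.

# Illustrative sketch (Python): computing the two birth outcome PMs from the
# numerator/denominator counts defined in Table 1. The counts are hypothetical.

def pm51_singleton_lbw_percent(lbw_singleton_births, singleton_live_births):
    """PM 51: percent of live singleton births weighing less than 2,500 g."""
    return 100.0 * lbw_singleton_births / singleton_live_births

def pm52_infant_mortality_rate(infant_deaths, live_births):
    """PM 52: infant deaths (birth through 364 days) per 1,000 live births."""
    return 1000.0 * infant_deaths / live_births

if __name__ == "__main__":
    # Hypothetical project-year counts
    print(pm51_singleton_lbw_percent(32, 400))   # 8.0 (% singleton LBW)
    print(pm52_infant_mortality_rate(3, 410))    # about 7.3 deaths per 1,000 live births

A project-level value computed this way is what gets compared against the state Title V rates and the Healthy People targets used as benchmarks in this evaluation.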
Analysis

We calculated descriptive statistics for all variables across all Healthy Start projects. We then performed bivariate analyses using Pearson's Chi square test and Fisher's exact test to (1) describe implementation of the nine core Healthy Start Program components by project characteristics; (2) examine the association of implementation of all core components with each of (a) intermediate outcomes, (b) service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes, and (c) long-term maternal and child health and community capacity outcomes; and (3) examine the association of intermediate outcomes and service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes with (a) long-term maternal and child health and community capacity outcomes and (b) birth outcome PMs. We also compared the birth outcome PM rates among Healthy Start projects with their state's Title V Program rates and with achievement of national Healthy People (HP) 2010 and 2020 objective targets.
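For readers who wish to reproduce this style of bivariate test, the sketch below applies Pearson's Chi square test and Fisher's exact test to a hypothetical 2 x 2 table of component implementation by outcome achievement. It assumes SciPy is available; the counts and layout are invented for illustration and are not evaluation data.

# Illustrative sketch (Python/SciPy): bivariate tests of the kind described
# above. The 2 x 2 counts are hypothetical, not evaluation data.
from scipy.stats import chi2_contingency, fisher_exact

#            outcome achieved   not achieved
table = [[40, 17],   # implemented all nine core components
         [22, 25]]   # did not implement all core components

chi2, p_chi2, dof, expected = chi2_contingency(table)  # Yates-corrected by default for 2 x 2 tables
odds_ratio, p_fisher = fisher_exact(table)             # exact test, useful when expected cell counts are small

print(f"Chi square p = {p_chi2:.3f}; Fisher exact p = {p_fisher:.3f}; OR = {odds_ratio:.2f}")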
We developed multivariable linear and logistic regression models to examine the independent associations of implementation of all core program components with birth outcomes, adjusting for project characteristics, project director-reported intermediate outcomes, and service and system PMs. We developed linear regression models to examine continuous outcomes (singleton LBW, IMR) and logistic regression models to examine achievement (yes/no) of state Title V rates or national HP objectives. We calculated betas or odds ratios with 95 % confidence intervals. Variables that were included in the models were those found to be associated with the birth outcomes of interest in previous studies or in the bivariate analyses, as well as any other characteristics of a priori interest according to the evaluation logic model (Fig. 1). The multivariate models to examine birth outcomes included only those projects with PM data.

The model to examine the association of implementation of all core components with singleton LBW (PM 51) included the following covariates: initial funding (Phase 1 versus all other phases), urban geographic location, not urban geographic location, grantee organization type, Healthy Start project facilitation of provider screening for risk factors (PM 22, score greater than mean of all projects), percent of women participants with ongoing source of primary and preventive care (PM 20), self-reported improved birth spacing in 2010 (yes/no), self-reported increased cultural competence of providers (yes/no), and self-reported increased participant involvement in Healthy Start decision-making (yes/no). The model to examine the IMR outcome (PM 52) included many of the same covariates, in addition to percent singleton LBW (PM 51), an independent risk factor for infant mortality.

This evaluation was determined exempt from IRB review by the Abt Associates Institutional Review Board on September 1, 2010 (Abt IRB # 0499).
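As a rough illustration of the modeling step, and not the evaluation's own code, the sketch below fits a logistic regression for a binary birth outcome benchmark on an implementation indicator plus a few covariates, then converts the coefficients to odds ratios with 95 % confidence intervals. The data frame, variable names, and simulated values are hypothetical, and statsmodels is assumed to be available; a linear model for the continuous % singleton LBW outcome would be fit analogously with ordinary least squares (sm.OLS).

# Illustrative sketch (Python/statsmodels): a logistic model of the kind
# described above, with hypothetical project-level data. Column names are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 98  # projects with PM data
df = pd.DataFrame({
    "met_state_lbw_rate": rng.integers(0, 2, n),    # outcome: 1 = project LBW below state rate
    "all_core_components": rng.integers(0, 2, n),   # exposure of interest
    "urban": rng.integers(0, 2, n),
    "gov_agency_grantee": rng.integers(0, 2, n),
    "pm20_pct_ongoing_care": rng.uniform(50, 100, n),
})

X = sm.add_constant(df[["all_core_components", "urban",
                        "gov_agency_grantee", "pm20_pct_ongoing_care"]])
fit = sm.Logit(df["met_state_lbw_rate"], X).fit(disp=False)

odds_ratios = np.exp(fit.params)          # exponentiated coefficients = odds ratios
ci = np.exp(fit.conf_int())               # 95 % confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))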
Results

Descriptive Characteristics

Table 2 presents the distribution of project characteristics as well as project director-reported implementation of the core components, intermediate project outcomes, service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes, and long-term maternal and child health and community capacity outcomes. All 104 Healthy Start projects implemented all five core service components. Over two-thirds of projects implemented the four core systems-building components: 99 % implemented one or more consortium, 91 % implemented a LHSAP, 87 % collaborated with Title V, and 66 % had a sustainability plan. Overall, 57 (55 %) projects implemented all nine core program components; this group includes 10 of the 18 projects that were first funded during Phase 1 (1991-1996) of the Healthy Start Program.

Most projects were in operation for at least 10 years at the time the PD survey was administered; 17 % were first funded in Phase 1 and 61 % in Phase 2. Approximately 75 % of projects were located in urban areas, including cities and metropolitan areas, and 40 % of grantee organizations were state or local government agencies.

Approximately two-thirds of all projects reported that in 2010 the project had accomplished a number of intermediate outcomes, including increased awareness of the importance of interconception care and of disparities in birth outcomes as a community priority, increased positive health behaviors among participants, increased access to available services for participants, and increased number of participants with a medical home.

More than two-thirds of all projects reported that case management, enabling services such as transportation and translation, and interconception care activities conducted by the project made a primary or major contribution to reducing disparities in maternal and infant health outcomes. Less than two-thirds of projects reported that other service and systems activities conducted by the project, such as collaboration with consumers, community-based organizations, and public and private agencies, made a similar contribution to reducing disparities in maternal and infant health outcomes.

Sixty-eight percent of project directors reported that the project had achieved improvements in birth outcomes in 2010, and 39 % reported achieving improvements in maternal health. Less than one-third of project directors reported that the Healthy Start project had achieved sustained capacity to reduce disparities in health status in the community (32 % of projects), improvements in child health (31 %), or increased birth spacing (19 %). A small proportion (12 %) of project directors reported that the Healthy Start project had not achieved any long-term outcomes in 2010.

Bivariate Analyses: Core Program Components

Table 3 presents the results of bivariate analyses examining the relationship between implementation of the nine core program components and project characteristics, as well as relationships between implementation of program components and three categories of project director-reported outcomes: (1) intermediate outcomes, (2) activities that contributed to reducing disparities in maternal and infant health outcomes, and (3) long-term maternal and child health and community capacity outcomes. The 57 projects that implemented all core components were used as the reference group. Only results that were statistically significant (p <= 0.05) are reported in the table.

Healthy Start projects whose grantee organizations were state or local government agencies were significantly (p <= 0.05) less likely to implement all core components compared with projects whose grantee organizations were a community-based non-governmental organization or other type of organization.
Table 2 Distribution of Healthy Start project characteristics and project director-reported implementation of program components, intermediate outcomes, service and systems activities that contributed to reducing disparities in maternal and infant health outcomes, and long-term maternal and child health and community capacity outcomes, among all Healthy Start projects (N = 104 projects). Values are n (%) of all 104 projects.

Project characteristics [a]
  Length of funding: initial funding Phase 1 (1991-1996) 18 (17); Phase 2 (1997-2000) 63 (61); Phase 3 (2001-2004) 10 (10); Phase 4 (2005-2010) 13 (12)
  Geographic location, urban [urban/central city, metropolitan area (city and suburbs)]: Yes 78 (75); No 26 (25)
  Geographic location, not urban (suburban, border US-Mexico, rural): Yes 28 (27); No 76 (73)
  HS grantee organization type: Government agency (state agency, community government agency such as a local health department) 42 (40); Community-based non-governmental organization (health care or non-health care) or Other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center) 62 (60)

Implementation of all nine core program components [b]: Yes 57 (55); No 47 (45)

Intermediate outcomes [c]
  Increased awareness of the importance of interconception care: Yes 80 (77); No 24 (23)
  Increased awareness of disparities in birth outcomes as community priority: Yes 76 (73); No 28 (27)
  Increased positive health behaviors among our participants: Yes 74 (71); No 30 (29)
  Increased access to the services available for our participants: Yes 71 (68); No 33 (32)
  Increased number of participants with a medical home: Yes 70 (67); No 34 (33)
  Increased screening for perinatal depression among providers in the community: Yes 51 (49); No 53 (51)
  Increased participant involvement in Healthy Start decision-making: Yes 50 (48); No 54 (52)
  Increased integration of prenatal, primary care, and mental health services: Yes 47 (45); No 57 (55)
  Increased cultural competence of providers in our community: Yes 43 (41); No 61 (59)
  Increased participant involvement in other community activities addressing systems change: Yes 39 (37); No 65 (63)
  Increased participant involvement in decision-making among partner agencies: Yes 22 (21); No 82 (79)

Service and systems activities that contributed to reducing disparities in maternal and infant health outcomes [d]
  Case management: Yes 90 (87); No 14 (13)
  Enabling services: Yes 73 (70); No 31 (30)
  Interconception care: Yes 70 (67); No 34 (33)
  Perinatal depression screening: Yes 66 (63); No 38 (37)
  Outreach and client recruitment: Yes 64 (62); No 40 (39)
  Collaboration with consumers: Yes 60 (58); No 44 (42)
  Collaboration with community-based organizations: Yes 53 (51); No 51 (49)
  Collaboration with public agencies: Yes 49 (47); No 55 (53)
  Collaboration with private agencies: Yes 46 (44); No 58 (56)
  Consortium: Yes 45 (43); No 59 (57)
  Local Health System Action Plan: Yes 43 (41); No 61 (59)
  Collaboration with local Title V: Yes 34 (33); No 70 (67)
  Collaboration with State Title V: Yes 31 (30); No 73 (70)
  Provider education: Yes 39 (38); No 65 (62)

Long-term maternal and child health and community capacity outcomes [e]
  Improved birth outcomes: Yes 71 (68); No 33 (32)
  Improved maternal health: Yes 41 (39); No 63 (61)
  Sustained community capacity to reduce disparities in health status in the community: Yes 33 (32); No 71 (68)
  Improved child health: Yes 32 (31); No 72 (69)
  Increased birth spacing: Yes 20 (19); No 84 (81)
  No long term outcomes were achieved in 2010: Yes 13 (12); No 91 (88)

a Data source: Maternal and Child Health Bureau Discretionary Grant Information System
b Data source: 2010 Project Director survey. To determine implementation of core service components, project directors were asked, "Which of the following services does your Healthy Start project offer?" (response options: "Outreach and participant recruitment," "Case management," "Health education," "Perinatal depression screening," and "Interconceptional services"). To determine implementation of the core systems-building component of having a consortium, project directors were asked "Does your Healthy Start project have at least one active consortium that addresses maternal and child health issues" (response options: Yes/No). To determine implementation of the core systems-building component of having a Local Health System Action Plan, project directors were asked "Does your Healthy Start project have a Local Health System Action Plan (LHSAP)?" (response options: Yes/No; a follow up question was asked to determine if the LHSAP was specific to the Healthy Start project). To determine implementation of the core systems-building component of coordination and collaboration with Title V, project directors were asked to specify the types of collaborative activities that their Healthy Start project established with the State Title V agency. Projects were classified with a "yes" response if the project director indicated that the State Title V agency "is a member of the Healthy Start consortium," "has a written memorandum of understanding or agreement with Healthy Start," "provides contracted services to Healthy Start," "hosts out-stationed Healthy Start staff," "participates in joint training with Healthy Start," "has a shared staffing arrangement with Healthy Start," "coordinates case management or is planning with Healthy Start for shared participants," "shares protocols with Healthy Start," "is involved in Healthy Start sustainability planning," "has a data-sharing arrangement with Healthy Start," "contributes to pooled funding streams to support joint services," "has a Healthy Start employee on their board," "works with Healthy Start to develop consistent health messages for participants," and/or "receives cultural competence training from Healthy Start." To determine implementation of the core systems-building component of having a sustainability plan, project directors were asked "Does your Healthy Start project have a sustainability plan, that is, a plan to maintain services to the target population after federal Healthy Start funding ends?" (response options: Yes/No)
c Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
d Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?". Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. Primary contribution and Major contribution were classified as "Yes."
e Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed
Table 3 Association of implementation of Healthy Start Program components with project characteristics and project director-reported intermediate outcomes, service and systems activities that contributed to reducing disparities in maternal and infant health outcomes, and long-term maternal and child health and community capacity outcomes (N = 104 projects). Values are n (%) among projects that implemented all required core program components (Yes, n = 57) versus projects that did not (No, n = 47); p values are from Pearson's Chi square or Fisher's exact test [*].

Project characteristics [a]
  HS grantee organization type, p = 0.04:
    Government agency (state agency, community government agency such as a local health department): 18 (32) versus 24 (51)
    Community-based non-governmental organization (health care or non-health care) or Other organization (including academic medical center, non-profit organization, tribal organization, Federally Qualified Health Center): 39 (68) versus 23 (49)

Intermediate outcomes [b]
  Increased access to the services available for our participants, p = 0.00: Yes 46 (80) versus 25 (53); No 11 (20) versus 22 (47)
  Increased screening for perinatal depression among providers in the community, p = 0.04: Yes 33 (58) versus 18 (38); No 24 (42) versus 29 (62)
  Increased integration of prenatal, primary care, and mental health services, p = 0.03: Yes 31 (54) versus 16 (34); No 26 (46) versus 31 (66)

Service and systems activities that contributed to reducing disparities in maternal and child health outcomes [c]
  Enabling services, p = 0.01: Yes 46 (81) versus 27 (58); No 11 (19) versus 20 (42)
  Interconception care, p = 0.01: Yes 44 (77) versus 26 (55); No 13 (23) versus 21 (45)

Long-term maternal and child health and community capacity outcomes [d]
  Improved child health, p = 0.05: Yes 22 (39) versus 10 (21); No 35 (61) versus 37 (79)
  Increased birth spacing, p = 0.01: Yes 16 (28) versus 4 (9); No 41 (72) versus 43 (91)

* Pearson's Chi square or Fisher's exact test
a Data source: Maternal and Child Health Bureau Discretionary Grant Information System
b Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed. Only outcomes with statistically significant (p <= 0.05) relationships with implementation of all core program components are reported
c Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?". Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. Primary contribution and Major contribution were classified as "Contributed." Only activities with statistically significant (p <= 0.05) relationships with implementation of all core program components are reported
d Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?". Multiple responses were allowed. Only outcomes with statistically significant (p <= 0.05) relationships with implementation of all core program components are reported
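As a sanity check on how the p values in Table 3 arise, the grantee organization row (18 of the 57 implementing projects versus 24 of the 47 non-implementing projects had a government agency grantee) can be re-derived with an uncorrected Pearson Chi square test, which gives a p value of roughly 0.04. This assumes the published test was run without a continuity correction, which the article does not state; the sketch below is illustrative only.

# Re-deriving the grantee-organization comparison in Table 3 (assumes, as a
# working hypothesis, that no continuity correction was applied in the published analysis).
from scipy.stats import chi2_contingency

#            implemented all   did not implement
table = [[18, 24],   # government agency grantee
         [39, 23]]   # community-based / other grantee

chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # roughly chi2 = 4.1, p = 0.04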
Although projects implementing all core components more frequently reported achievement of the majority of intermediate outcomes than projects that did not implement all core components, the intermediate outcomes for which the relationship between implementation of all core components and the outcome were statistically significant were (1) increased access to services available for participants, (2) increased integration of prenatal, primary care and mental health services and (3) increased screening for perinatal depression. Projects implementing all core components were significantly more likely to report that enabling and interconception care services conducted by the project made a primary or major contribution to reducing disparities in maternal and infant health, when compared with projects that did not implement all required core components. Additionally, projects implementing all core components were significantly more likely to report that their project had achieved increased birth spacing and improved child health in 2010, compared with projects that did not implement all core components.

Bivariate Analyses: Intermediate Outcomes, Service and Systems Activities that Contributed to Reducing Disparities in Maternal and Infant Health Outcomes, and Long-Term Maternal and Child Health and Community Capacity Outcomes

Results of the bivariate analyses examining the relationship between project director-reported intermediate outcomes, service and systems activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes and long-term outcomes revealed many significant associations (data not shown). Intermediate outcomes that were significantly associated (p <= 0.05) with project director-reported improvements in birth outcomes and/or maternal health included: increased cultural competence of providers in the community; increased number of participants with a medical home; increased awareness of the importance of interconception care; increased screening for perinatal depression; and increased participant involvement in community activities addressing systems change. Healthy Start project activities, such as interconception care, perinatal depression screening, enabling services, collaboration with consumers, and LHSAP, that made a primary or major contribution to reducing disparities in maternal and infant health outcomes were each significantly (p <= 0.05) associated with project director-reported improvement in birth, maternal, and/or child health outcomes (data not shown).

Descriptive and Comparative Analyses: Birth Outcome Performance Measures

In 2009, 20 % of Healthy Start projects had singleton LBW rates and 59 % had IMR that were less than or equal to the Healthy People 2010 (HP2010) targets of 5 % and 4.5 per 1,000 live births [9], respectively. The Healthy People 2020 (HP2020) targets were revised to 7.8 % (LBW rate) and 6 per 1,000 live births (IMR) [10], and a higher proportion of Healthy Start projects achieved these targets than achieved the HP2010 targets (33 % achieved the LBW target and 60 % achieved the IMR target) (data not shown).

Compared with Healthy Start projects that did not meet the HP2020 LBW target, projects that achieved the HP2020 target were significantly (p <= 0.05) more likely to report achieving increased access to services available for participants and increased integration of prenatal, primary care, and mental health services. Similarly, these projects were significantly more likely to report that their outreach and client recruitment, collaboration with community-based organizations, collaboration with private and public agencies, and/or collaboration with local Title V activities made a primary or major contribution to reducing disparities in maternal and infant health outcomes. Achieving the HP2020 target for IMR was not significantly associated with project director report of achieving intermediate outcomes or of conducting service or system activities that made a primary or major contribution to reducing disparities in maternal and infant health outcomes (Table 4).

Similar results were observed when comparing Healthy Start project PM rates with state birth outcome rates. In 2009, over one quarter (27 %) of all Healthy Start projects had a singleton LBW rate less than the rate in their state, and 62 % had an IMR that was less than the rate in their state. Healthy Start projects that had a lower singleton LBW rate in 2009 than the rate reported for their state were significantly (p <= 0.05) more likely to report achieving increased positive health behaviors among participants and increased number of participants with a medical home in 2010 (data not shown).
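The benchmark comparisons described above reduce to classifying each project's 2009 PM values against fixed targets. A minimal sketch is shown below, using hypothetical project values and the HP2020 targets cited in the text; it is illustrative and not drawn from the evaluation data set.

# Minimal sketch: classifying hypothetical project PM values against the
# HP2020 targets cited above (7.8 % singleton LBW; 6 infant deaths per 1,000 live births).
HP2020_LBW_TARGET = 7.8   # percent singleton LBW
HP2020_IMR_TARGET = 6.0   # deaths per 1,000 live births

projects = [  # (project id, PM 51 % singleton LBW, PM 52 IMR) - hypothetical values
    ("A", 6.9, 4.2),
    ("B", 9.4, 7.5),
    ("C", 7.1, 6.3),
]

for pid, lbw, imr in projects:
    met_lbw = lbw <= HP2020_LBW_TARGET
    met_imr = imr <= HP2020_IMR_TARGET
    print(f"Project {pid}: meets HP2020 LBW target: {met_lbw}; meets HP2020 IMR target: {met_imr}")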
  • 48. reported in the 2010 PD survey were not significantly associated with either singleton LBW or infant mortality. 1300 Matern Child Health J (2015) 19:1292–1305 123 Table 4 Association of percent singleton low birth weight (LBW) and infant mortality rates (IMR) among project participants’ infants meeting HP2010 and HP2020 objective targets with Healthy Start project director-reported achievement of intermediate outcomes and conduct of service and systems activities that contributed to reducing disparities in maternal and infant health outcomes (N = 104) PM 51 (% singleton LBW) PM 52 (IMR) Less than HP2010 LBW target of 5 % (n = 20 projects) Less than HP2020 LBW
  • 49. target of 7.8 % (n = 32 projects) Less than HP2010 IMR target of 4.5 deaths per 1,000 live births (n = 58 projects) Less than HP2020 IMR target of 6 deaths per 1,000 live births
  • 51. Intermediate outcomes a Increased awareness of the importance of interconception care 85 15 84 16 79 21 80 20 Increased awareness of disparities in birth outcomes as community priority 75 25 75 25 71 29 71 29 Increased positive health behaviors among our participants 85 15 84 16 71 29 71 29 Increased access to the services available for our participants 85* 15 81* 19 69 31 69 31 Increased number of participants with a medical home 85 15 75 25 76 24 76 24 Increased screening for perinatal depression among providers in the community 60 40 59 41 48 52 49 51 Increased participant involvement in Healthy Start decision- making 45 55 50 50 47 53 47 53 Increased integration of prenatal, primary care, and mental health services 60 40 66* 34 40 60 41 59 Increased cultural competence of providers in our community 55 45 53 47 36 64 37 63 Increased participant involvement in other community activities addressing systems change 20 80 31 69 34 66 36 64
  • 52. Increased participant involvement in decision-making among partner agencies 10 90 16 84 24 76 25 75 Service and systems activities that contributed to reducing disparities in maternal and child health outcomes b Case management 90 10 88 12 93 7 93 7 Enabling services 75 25 69 31 78 22 78 22 Interconception care 65 35 63 37 66 34 66 34 Perinatal depression screening 60 40 56 44 69 31 69 31 Outreach and client recruitment 50 50 47** 53 62 38 63 37 Collaboration with consumers 60 40 50 50 57 43 58 42 Collaboration with community-based organizations 30* 70 28** 72 55 45 56 44 Collaboration with public agencies 35 65 31* 69 50 50 51 49 Collaboration with private agencies 30 70 25** 75 48 52 49 51 Consortium 35 65 37 63 41 59 42 58 Local Health System Action Plan 30 70 31 69 40 60 41 59 Collaboration with local Title V 15 85 19* 81 40 60 39 61 Collaboration with state Title V 30 70 25 75 36 64 36 64 Provider education 35 65 31 69 41 59 42 58
Note that Healthy People (HP) LBW targets are for LBW among all live births, whereas Healthy Start PM 51 and State Title V HSI 01B measure the singleton LBW rate.
* Pearson's Chi square or Fisher's exact test p value ≤ 0.05; ** Pearson's Chi square or Fisher's exact test p value ≤ 0.01
(a) Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?" Multiple responses were allowed. A "yes" response indicates that the project director reported that the project achieved the intermediate outcome. A "no" response indicates that the project director did not report that the project achieved the intermediate outcome.
(b) Data source: 2010 Project Director survey. Project directors were asked, "To what extent did the following activities conducted by your Healthy Start project contribute to reducing disparities in maternal and infant health outcomes?" Response options included Primary contribution, Major contribution, Moderate contribution, Minor contribution, and No contribution or N/A. A "yes" response indicates that the project director reported that the service or system activity made a primary or major contribution to reducing disparities in maternal and infant health outcomes. A "no" response indicates that the project director reported that the service or system activity did not make a primary or major contribution to reducing disparities in maternal and infant health outcomes.
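The bivariate comparisons summarized in Table 4 rest on Pearson's chi-square or Fisher's exact tests applied to 2 × 2 cross-tabulations of projects. As a purely illustrative sketch (not the evaluation's actual analysis code), the Python snippet below shows how such a test might be run; the DataFrame and column names are hypothetical stand-ins for the project-level indicators.

# Illustrative sketch only: a bivariate test of the kind reported in Table 4.
# The data frame and column names below are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical project-level data: one row per Healthy Start project.
projects = pd.DataFrame({
    # Did the PD report achieving this intermediate outcome? (1 = yes, 0 = no)
    "reported_medical_home_increase": [1, 0, 1, 1, 0, 1, 0, 1],
    # Was the project's singleton LBW rate below the HP2010 target of 5 %?
    "lbw_below_hp2010_target": [1, 0, 1, 0, 0, 1, 0, 1],
})

# 2 x 2 cross-tabulation of reported achievement by target attainment.
table = pd.crosstab(projects["reported_medical_home_increase"],
                    projects["lbw_below_hp2010_target"])

# Pearson's chi-square test (expected counts returned for inspection).
chi2, p_chi2, dof, expected = chi2_contingency(table)

# Fisher's exact test is preferred when expected cell counts are small.
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square p = {p_chi2:.3f}, Fisher exact p = {p_fisher:.3f}")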
Table 5 Adjusted associations of implementation of Healthy Start Program components with singleton low birth weight (LBW) among Healthy Start project participants' infants (N = 98 projects)

Columns:
  1 = % singleton LBW (a)
  2 = % singleton LBW less than State Title V rate (b)
  3 = % singleton LBW less than HP2010 LBW target of 5 % (b)
  4 = % singleton LBW less than HP2020 LBW target of 7.8 % (b)

Project characteristic                                                                   1       2       3       4
  Implemented all 5 core service components and all 4 core systems components
    versus did not implement all core components (c)                                     0.4     0.5     0.4     0.4
  Initial funding received in Phase 1 (1991–1996) versus initial funding received
    in Phase 2, 3, or 4 (d)                                                              1.2     1.3     0.6     0.4
  Urban geographic location [urban/central city, metropolitan area (city and
    suburbs)] versus not urban (d)                                                       2.9     0.4     0.8     0.4
  Not urban geographic location (suburban, border US-Mexico, rural) versus
    not urban (d)                                                                        1.6     0.6     1.7     0.4
  State or local government agency grantee organization versus community-based
    non-governmental organization (health care or non-health care) or other
    organization (including academic medical center, non-profit organization,
    tribal organization, Federally Qualified Health Center) (d)                          1.5     0.1     0.1     0.4
  PM 20 (% women participants with an ongoing source of primary and preventive
    care for women) (%, 2009) (d)                                                        0.0     1.0     1.0     1.0
  PM 22 (degree to which Healthy Start project facilitates health providers'
    screening of women participants for risk factors) (score greater than mean
    of all projects, 2009) (d)                                                           0.8     1.3     0.5     1.1
  Achieved increased birth spacing (e)                                                   0.5     0.4     0.8     2.1
  Achieved increased cultural competence of providers in the community (f)              -1.3     2.1     2.4     1.9
  Achieved increased participant involvement in Healthy Start decision-making (f)        0.9     0.9     0.8     0.6

Results are based on multivariable linear or logistic regression models (separate models for each outcome), with each model adjusted for the other variables in the table. Bold font in the published table indicates an effect estimate significant at p < 0.10 or a 95 % confidence interval > 1.
(a) Linear model: values are β coefficients. The effect estimate represents the effect per percent increase of LBW.
(b) Logistic model: values are odds ratios. The effect estimate represents the effect of having a rate less than the state Title V rate or less than the Healthy People (HP) target. Note that HP2010 and HP2020 LBW targets are for LBW among all live births, whereas Healthy Start PM 51 and State Title V HSI 01B measure the singleton LBW rate.
(c) Data source: 2010 Project Director survey. To determine implementation of core service components, project directors were asked, "Which of the following services does your Healthy Start project offer?" (response options: "Outreach & participant recruitment," "Case management," "Health education," "Perinatal depression screening," and "Interconceptional services"). To determine implementation of the core systems-building component of having a consortium, project directors were asked "Does your Healthy Start project have at least one active consortium that addresses maternal and child health issues" (response options: Yes/No). To determine implementation of the core systems-building component of having a Local Health System Action Plan, project directors were asked "Does your Healthy Start project have a Local Health System Action Plan (LHSAP)?" (response options: Yes/No; a follow-up question was asked to determine if the LHSAP was specific to the Healthy Start project). To determine implementation of the core systems-building component of coordination and collaboration with Title V, project directors were asked to specify the types of collaborative activities that their Healthy Start project established with the State Title V agency. Projects were classified with a "yes" response if the project director indicated that the State Title V agency "is a member of the Healthy Start consortium," "has a written memorandum of understanding or agreement with Healthy Start," "provides contracted services to Healthy Start," "hosts out-stationed Healthy Start staff," "participates in joint training with Healthy Start," "has a shared staffing arrangement with Healthy Start," "coordinates case management or is planning with Healthy Start for shared participants," "shares protocols with Healthy Start," "is involved in Healthy Start sustainability planning," "has a data-sharing arrangement with Healthy Start," "contributes to pooled funding streams to support joint services," "has a Healthy Start employee on their board," "works with Healthy Start to develop consistent health messages for participants," and/or "receives cultural competence training from Healthy Start." To determine implementation of the core systems-building component of having a sustainability plan, project directors were asked "Does your Healthy Start project have a sustainability plan, that is, a plan to maintain services to the target population after federal Healthy Start funding ends?" (response options: Yes/No).
(d) Data source: Maternal and Child Health Bureau Discretionary Grant Information System.
(e) Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?" Multiple responses were allowed.
(f) Data source: 2010 Project Director survey. Project directors were asked, "Which of the following intermediate outcomes did your Healthy Start project achieve in 2010?" Multiple responses were allowed.
Table 6 Adjusted associations of implementation of Healthy Start Program components with infant mortality rate (IMR) among Healthy Start project participants' infants (N = 98 projects)

Columns:
  1 = Infant mortality rate (b)
  2 = Infant mortality rate less than State Title V IMR (c)
  3 = Infant mortality rate less than HP2010 IMR target of 4.5 deaths per 1,000 live births (c)
  4 = Infant mortality rate less than HP2020 IMR target of 6 deaths per 1,000 live births (c)

Project characteristic (a)                                                               1       2       3       4
  Implemented all 5 core service components and all 4 core systems components
    versus did not implement all core components (d)                                    -0.7     1.2     1.1     1.1
  Initial funding received in Phase 1 (1991–1996) versus initial funding received
    in Phase 2, 3, or 4 (e)                                                              4.9     0.4     0.5     0.4
  Urban geographic location (urban/central city, metropolitan area [city and
    suburbs]) versus not urban (e)                                                      -4.1     1.6     1.5     1.3
  Not urban geographic location (suburban, border US-Mexico, rural) versus
    not urban (e)                                                                        7.4     0.5     0.6     0.5
  State or local government agency grantee organization versus community-based
    non-governmental organization (health care or non-health care) or other
    organization (including academic medical center, non-profit organization,
    tribal organization, Federally Qualified Health Center) (e)                          0.7     1.0     0.9     1.1
  PM 51 (% low birth weight) (%, 2009)                                                   0.5     0.9     0.9     0.9
  PM 20 (% women participants with an ongoing source of primary and preventive
    care for women) (%, 2009)                                                            0.0     1.0     1.0     1.0
  PM 22 (degree to which Healthy Start project facilitates health providers'
    screening of women participants for risk factors) (score greater than mean
    of all projects, 2009) (e)                                                          -3.0     0.7     0.8     0.6
  Achieved increased birth spacing (f)                                                   3.8     0.6     0.3     0.5

(a) Results are based on multivariable linear or logistic regression models (separate models for each outcome), with each model adjusted for the other variables in the table. Bold font in the published table indicates an effect estimate significant at p < 0.10 or a 95 % confidence interval > 1.
(b) Linear model: values are β coefficients. The effect estimate represents the effect per increase in the infant mortality rate (deaths per 1,000 live births).
(c) Logistic model: values are odds ratios. The effect estimate represents the effect of having a rate less than the state Title V rate or less than the Healthy People (HP) target.
(d) Data source: 2010 Project Director survey. Implementation of the five core service components and the four core systems-building components (consortium, Local Health System Action Plan, coordination and collaboration with Title V, and sustainability plan) was determined from the same survey items described in footnote (c) of Table 5.
(e) Data source: Maternal and Child Health Bureau Discretionary Grant Information System.
(f) Data source: 2010 Project Director survey. Project directors were asked, "Which of the following long term outcomes did your Healthy Start project achieve in 2010?" Multiple responses were allowed.
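The adjusted estimates in Tables 5 and 6 come from project-level multivariable linear and logistic regression models. For illustration only, the sketch below shows how one such logistic model might be specified in Python with statsmodels; the data frame, variable names, and simulated values are hypothetical stand-ins for the PD survey and DGIS measures, not the authors' actual analysis code.

# Illustrative sketch only: an adjusted logistic model of the kind summarized
# in Tables 5 and 6, using hypothetical project-level variable names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

np.random.seed(0)

# Hypothetical project-level analysis file (one row per project; N = 98 in the paper).
df = pd.DataFrame({
    "lbw_below_hp2010": np.random.binomial(1, 0.2, 98),     # outcome: LBW below 5 % target
    "all_core_components": np.random.binomial(1, 0.6, 98),  # implemented all 9 components
    "phase1_funding": np.random.binomial(1, 0.15, 98),
    "urban": np.random.binomial(1, 0.7, 98),
    "govt_grantee": np.random.binomial(1, 0.3, 98),
    "pm20_ongoing_care_pct": np.random.uniform(50, 100, 98),
    "pm22_above_mean": np.random.binomial(1, 0.5, 98),
})

# Logistic regression: odds of the project's singleton LBW rate falling below the
# HP2010 target, adjusted for the other project characteristics.
logit_model = smf.logit(
    "lbw_below_hp2010 ~ all_core_components + phase1_funding + urban"
    " + govt_grantee + pm20_ongoing_care_pct + pm22_above_mean",
    data=df,
).fit(disp=False)

# Adjusted odds ratios with 95 % confidence intervals, as reported in the tables.
odds_ratios = np.exp(logit_model.params)
conf_int = np.exp(logit_model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))

# A continuous outcome (e.g., the % singleton LBW itself) would instead use a
# linear model, e.g. smf.ols("pct_singleton_lbw ~ all_core_components + ...", data=df).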
Discussion

This evaluation of the Federal Healthy Start Program, using both data from a survey of project directors and Healthy Start project birth, service, and system outcome performance measure data, revealed a mixed set of relationships between implementation of core program components and long-term maternal and child health outcomes. Analyses of the 2010 PD survey data indicate that implementation of all core components was associated with better project director-reported intermediate and long-term project outcomes. This is the first analysis to use MCHB performance measure data in a national evaluation to assess Healthy Start projects' progress toward achieving outcomes that are expected to occur if program elements are successfully and completely implemented. Results from this evaluation are consistent with our hypothesis (illustrated in the logic model, Fig. 1) of a progression of achievement of intermediate outcomes leading to long-term outcomes. For example, increased screening for perinatal depression, case management, and interconception care services may have led to PD-reported improvement in maternal health. In addition, we found that Healthy Start projects that reported an increase in the number of participants with a medical home in 2010 and an increase in positive behaviors among participants had a significantly better (lower) singleton LBW rate among project participants' infants than the rate in their state.

Our analyses used state and national benchmarks, and our findings are reinforced by the results of previously published evaluations that were conducted by Healthy Start projects using vital records, clinical services, and program data. Site-specific evaluations conducted by individual Healthy Start projects have identified components of the program that show a positive effect on birth outcomes of participants' infants when compared with demographically similar women who did not participate in the program. For example, evaluations of individual Healthy Start projects found that services provided to high risk participants resulted in improved birth outcomes such as reduced rates of LBW, preterm birth, and infant mortality [12–14], in addition to lower rates of sexually transmitted diseases [15].

Although previous national evaluations of the Federal Healthy Start Program helped to establish the importance of the Healthy Start program components for achieving Program goals, these evaluations relied solely on grantees' perspectives because objective performance measure data were not adequate for use in national evaluations. A thorough examination of the PM data reported by Healthy Start projects revealed that the quality of reported data is sufficient for evaluation activities but also identified several key challenges to using these data for program evaluation [8]. Our review of the notes and detailed explanations that accompanied the PM data that grantees submitted to the DGIS revealed data quality issues, including: (1) inconsistency between the definition of the measure used by the project and the definition specified by MCHB; (2) lack of verification of some measures, e.g. PM 52, due to the timing of the completion of birth–death linked files prepared by the state vital records department; and (3) missing and incomplete data. These data limitations may introduce bias if the projects that had missing data or provided incomplete data are different from those who provided accurate and complete data, or if the under-reporting or erroneous reporting is related to the performance measures used as the outcomes for this analysis (PM 51 and PM 52).
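As a purely illustrative aside, the kind of completeness review described above can be scripted against the grantee-submitted performance-measure file. The sketch below assumes a hypothetical CSV layout and column names, not the actual DGIS export or the evaluators' procedures.

# Illustrative sketch only: flag projects with missing or implausible
# performance-measure values before analysis. File name and columns are hypothetical.
import pandas as pd

pm = pd.read_csv("healthy_start_pm_2009.csv")  # hypothetical grantee-level PM file

required = ["project_id", "pm51_pct_singleton_lbw", "pm52_imr"]

# Projects missing any required measure.
missing = pm[pm[required].isna().any(axis=1)]

# Simple range checks: percentages should fall in [0, 100]; rates should be non-negative.
out_of_range = pm[
    ~pm["pm51_pct_singleton_lbw"].between(0, 100)
    | (pm["pm52_imr"] < 0)
]

print(f"{len(missing)} projects with missing PM 51/PM 52 values")
print(f"{len(out_of_range)} projects with implausible values")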
A potential limitation of these analyses was the possible variation in the information source(s) used to complete the PD survey. Healthy Start project staff, including the project director and other project staff, were asked to complete the survey, and the staff member(s) who provided responses could have varied by project. The survey was pilot-tested with representatives of different Healthy Start project staff roles, but allowing survey completion by more than one type of respondent can increase the potential for variation in the interpretation of the survey questions and lead to variation in responses. Responses may also have varied based on the length of time the respondent had been with the project, in addition to the length of time that the project had been in operation and the program components that were implemented. We did not have access to complete, reliable information about other project characteristics and program components needed to perform a comprehensive evaluation of project implementation in a variety of community settings and to conduct analyses that adequately addressed all of the relationships outlined in the logic model. For example, participant demographic data captured by the MCHB DGIS were not available for use in these analyses. The eligibility criteria for participation in Healthy Start lead to some demographic similarities across project sites; however, other important differences in the populations served by sites may exist. More detailed information about program implementation and outcomes achieved by individual Healthy Start projects is needed to improve the specificity of future evaluations.

Healthy Start projects provide services to high risk women in the most vulnerable communities in our country. Improving birth outcomes for project participants requires intensive and focused services and policies that will assure quality services within communities. Ongoing monitoring and assessment of the implementation of these programs and routine, standardized collection of essential birth outcome and project implementation data will provide critical information for evaluating what is and is not working in individual Healthy Start projects and the Program as a whole. MCHB could provide Healthy Start Program staff with online tools and training to improve the reliability of data collection and reporting. Future Healthy Start Program evaluations should build on more robust local evaluations at the project level as well as employ a set of focused questions for the national evaluation that specifically address the major issues of interest to state and national policy-makers. Improved capacity for data collection and documentation by individual projects would help assure that comprehensive cross-site evaluations could be conducted in the future. Resources should be provided to assure that the systems required to conduct this type of evaluation are in place.

Based on our experience conducting national evaluations of the Federal Healthy Start Program, we recommend that future evaluations explicitly connect to local, state, and national frameworks and agendas for improving birth outcomes and reducing health inequities. The evaluation plan should incorporate analyses at multiple levels to provide a robust and comprehensive examination of Healthy Start Program activities and achievements. Most importantly, monitoring and evaluation activities conducted by individual Healthy Start projects must be strengthened to help ensure systematic and standardized annual reporting to MCHB of performance measure data, program activities and accomplishments, and other data needed for evaluation.

Acknowledgments Financial support for this study was provided by the Health Resources and Services Administration, Maternal and Child Health Bureau under Contract No. HHSH250200646015I Task Order HHSH25034002T: An Evaluation of the Core Components of
the Federal Healthy Start Program: A Cross-site Examination. The authors would like to acknowledge the contributions of the Healthy Start Grantees who participated in this evaluation, the staff of the National Healthy Start Association, the Healthy Start Project Officers at MCHB, especially Dr. David de la Cruz and Dr. Keisher Highsmith, and the Healthy Start project team at Abt, including Dr. Chanza Baytop, Ms. Meredith Eastman, Ms. Carolyn Robinson, and Dr. Meghan Woo.

References

1. Health Resources and Services Administration. (2001). Healthy Start Guidance, 2001. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
2. Health Resources and Services Administration. (2005). Healthy Start Guidance, 2005. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
3. Health Resources and Services Administration. (2009). Healthy Start Guidance, 2010. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
4. Maternal and Child Health Bureau. (2013). DGIS Reports: Program Data Reports. Retrieved from https://perf-data.hrsa.gov/MCHB/DGISReports/PerfMeasure/PerfMeasureReports.aspx?Report=ProgramPerfMeasures&Archived=0.
5. Health Resources and Services Administration. (2006). A Profile of Healthy Start: Findings From Phase I of the Evaluation, 2006. Rockville, MD: Health Resources and Services Administration, U.S. Department of Health and Human Services.
6. Brand, A., Walker, D. K., Hargreaves, M., & Rosenbach, M. (2010). Intermediate outcomes, strategies, and challenges of eight Healthy Start Projects. Maternal and Child Health Journal, 14(5), 654–665.
7. Rosenbach, M., O'Neil, S., Cook, B., Trebino, L., & Walker, D. K. (2010). Characteristics, access, utilization, satisfaction, and outcomes of Healthy Start participants in eight sites. Maternal and Child Health Journal, 14(5), 666–679.
8. Abt Associates. (2012). Methodology for Analysis of Health Resources Service Administration, Maternal and Child Health Bureau Healthy Start Performance Measures, September 2012. Cambridge, MA: Abt Associates.
9. Centers for Disease Control and Prevention, National Center for Health Statistics. Healthy People Objective Targets: Maternal, Infant, and Child Health. Retrieved from http://www.cdc.gov/nchs/data/hpdata2010/hp2010_final_review_focus_area_16.pdf.
10. Centers for Disease Control and Prevention, National Center for Health Statistics. Healthy People Objective Targets: Maternal, Infant, and Child Health. Retrieved from http://www.healthypeople.gov/2020/topicsobjectives2020/overview.aspx?topicid=26.
11. Taylor, Y. J., & Nies, M. A. (2012). Measuring the impact and outcomes of maternal child health federal programs. Maternal Child Health Journal, 17(5), 886–896. doi:10.1007/s10995-012-1067-y.
12. Will, J. A., Hall, I., Cheney, T., & Driscoll, M. (2005). Flower Power: Assessing the impact of the Magnolia Project on reducing poor birth outcomes in an at-risk neighborhood. Journal of Applied Sociology/Sociological Practice, 22.2/7(2), 74–90.
13. Salihu, H. M., Mbah, A. K., Jeffers, D., Alio, A. P., & Berry, L. (2009). Healthy Start program and feto-infant morbidity outcomes: Evaluation of program effectiveness. Maternal and Child Health Journal, 13(1), 56–65. doi:10.1007/s10995-008-0400-y.
14. Kothari, C. L., Wendt, A., Oemeeka, L., Overton, J., & Sweezy, L. C. (2011). Assessing maternal risk for fetal-infant mortality: A population-based study to prioritize risk reduction in a Healthy Start community. Maternal and Child Health Journal, 15(1), 68–76. doi:10.1007/s10995-009-0561-3.
15. Livingood, W. C., Brady, C., Pierce, K., Atrash, H., Hou, T., & Bryant, T., 3rd. (2010). Impact of pre-conception health care: Evaluation of a social determinants focused intervention. Maternal and Child Health Journal, 14(3), 382–391.
INNOVATION NETWORK, INC.
www.innonet.org • [email protected]

Logic Model Workbook

Table of Contents

Introduction - How to Use this Workbook
Before You Begin
Developing a Logic Model
Purposes of a Logic Model
The Logic Model's Role in Evaluation
Logic Model Components – Step by Step
  Problem Statement: What problem does your program address?
  Goal: What is the overall purpose of your program?
  Rationale and Assumptions: What are some implicit underlying dynamics?
  Resources: What do you have to work with?
  Activities: What will you do with your resources?
  Outputs: What are the tangible products of your activities?
  Outcomes: What changes do you expect to occur as a result of your work?
    Outcomes Chain
    Outcomes vs. Outputs
Logic Model Review
Appendix A: Logic Model Template
Appendix B: Worksheet: Developing an Outcomes Chain

Introduction - How to Use this Workbook

Welcome to Innovation Network's Logic Model Workbook. A logic model is a commonly-used tool to clarify and depict a program within an organization. You may have heard it described as a logical framework, theory of change, or program matrix—but the purpose is usually the same: to graphically depict your program, initiative, project or even the sum total of all of your organization's work. It also serves as a foundation for program planning and evaluation.

This workbook is a do-it-yourself guide to the concepts and use of the logic model. It describes the steps necessary for you to create logic models for your own programs. This process may take anywhere from an hour to several hours or even days, depending on the complexity of the program.

We hope you will use this workbook in the way that works best
for you:
• As a stand-alone guide to help create a logic model for a program in an organization,
• As an additional resource for users of the Point K Learning Center, and/or
• As a supplement to a logic model training conducted by Innovation Network.

You can create your logic model online using the Logic Model Builder in Innovation Network's Point K Learning Center, our suite of online planning and evaluation tools and resources at www.innonet.org. This online tool walks you through the logic model development process; allows you to save your work and come back to it later; share work with colleagues to review and critique; and print your logic model in an attractive, one-page presentation view for sharing with stakeholders. Free registration is required.

For those of you who prefer to work on paper or who don't have reliable Internet access, a logic model template is located in Appendix A of this workbook. You may want to make several copies of this template, to allow for adjustments and updates to your logic model over time.

This checklist icon appears at points in the workbook at which you should record something – either write something in your template, or enter it into your online Logic Model Builder.

Why evaluate? Evaluation serves many purposes:
• Supports program and strategic planning
• Helps communicate your goals and progress
• Serves as a basis for ongoing learning to make your work stronger and more effective.

Ongoing Learning Cycle

Evaluation is an ongoing learning cycle; a process that starts with planning, leads into data collection, analysis and reflection, and then to action and improvement. Logic models are the foundation of planning and the core of any evaluation process. As you make strategic decisions based on evaluation findings, you move right back into the planning stage.
Before You Begin

In preparing to create a logic model, you may want to consider:

What stakeholders should I involve? The development of a logic model offers an opportunity to engage your program's stakeholders in a discussion about the program. Stakeholders might include program staff, clients/service recipients, partners, funders, board members, community representatives, and volunteers. Their perspectives can enrich your program logic model by clarifying expectations for the program.

What is the scope of this logic model?
• Identify a timeframe for the logic model you are about to create. It will help you frame short-, intermediate, and long-term outcomes and make better decisions about resources and activities. Many groups design logic models for a funding or program cycle, a fiscal year, or a timeframe in which they believe they can achieve some meaningful results.
• This logic model structure is intended for program planning. Define the parameters of your program clearly. If your organization is small and only has one program, you can also use this structure for small-scale strategic planning.

Developing a Logic Model

Many different logic model formats exist, but they all contain the same core concepts. The format we use in this workbook and in our online tools has proven useful and manageable for the nonprofit partners we have worked with, and is the result of more than fifteen years of program planning and evaluation experience in the field.

It's not necessary to create your logic model all in one sitting. It will almost certainly be useful to talk to other program stakeholders and get their input along the way. You can work through the process as we have it laid out here – starting with the problem your program is meant to solve, and ending with your intended outcomes – or, if it's easier for you, you can work in reverse, starting with outcomes and working your way backwards.

Similarly, the names of key components may also vary among different logic models used in the field, but the underlying concepts are the same. In this workbook, we identify other terms used in the field for similar concepts. As you develop your logic model, we encourage you to find a
common language to use among key stakeholders, whether that language reflects the terms used here or elsewhere. The important thing is that everyone involved uses the same terms.

The components of the logic model used by Innovation Network are: problem statement, goal, rationales and assumptions, resources, activities, outputs, and outcomes.

A series of "if-then" relationships connect the components of the logic model: if resources are available to the program, then program activities can be implemented; if program activities are implemented successfully, then certain outputs and outcomes can be expected.

[Diagram: If resources → then activities → then outputs → then outcomes]

As you draft each component of the logic model, consider the if-then relationship between the components. If you cannot make a connection between each component of the logic model, you should identify the gaps and adjust your work. This may mean revising some of your activities to ensure that you are able to achieve your outcomes, or revising intended outcomes to be feasible with available resources. (A small illustrative sketch of this chain as a data structure appears after the list of purposes below.)

Purposes of a Logic Model

The logic model is a versatile tool that can support many management activities, such as:
• Program Planning. The logic model is a valuable tool for program planning and development. The logic model structure helps you think through your program strategy—to help clarify where you are and where you want to be.
• Program Management. Because it "connects the dots" between resources, activities, and outcomes, a logic model can be the basis for developing a more detailed management plan. Using data collection and an evaluation plan, the logic model helps you track and monitor operations to better manage results. It can serve as the foundation for creating budgets and work plans.
• Communication. A well-built logic model is a powerful communications tool. It can show stakeholders at a glance what a program is doing (activities) and what it is achieving (outcomes), emphasizing the link between the two.
• Consensus-Building. Developing a logic model builds common understanding and promotes buy-in among both internal and external stakeholders about what a program is, how it works, and what it is trying to achieve.
• Fundraising. A sound logic model demonstrates to funders that you have purposefully identified what your program will do, what it hopes to achieve, and what resources you will need to accomplish your work. It can also help structure and streamline grant writing.

The logic model you create with this workbook can be used for any or all of the above purposes – any time you need to show or refer to a clear and succinct picture of your program.
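For readers who keep program information electronically, here is a minimal, illustrative sketch of the if-then chain held as a small data structure, with a check that every component has at least one entry. The class and field names are our own invention and are not part of the workbook or the Point K Logic Model Builder; the example entries are abbreviated from the home-buying example.

# Illustrative sketch only: a logic model as a simple data structure with an
# "if-then" completeness check. Names are hypothetical, not from the workbook.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    problem_statement: str
    goal: str
    resources: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)

    def gaps(self) -> List[str]:
        """Return the links in the if-then chain that have no entries yet."""
        chain = [
            ("resources", self.resources),
            ("activities", self.activities),
            ("outputs", self.outputs),
            ("outcomes", self.outcomes),
        ]
        return [name for name, items in chain if not items]

# Abbreviated home-buying example from this workbook.
model = LogicModel(
    problem_statement="I do not own my own home.",
    goal="Increase my financial independence and security through home ownership.",
    resources=["Clear financial records"],
    activities=["Financial preparation", "Identify a neighborhood"],
    outputs=["Number of home buying workshops attended"],
    outcomes=[],  # still to be drafted
)

missing = model.gaps()
if missing:
    print("If-then chain is incomplete; missing:", ", ".join(missing))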
The Logic Model's Role in Evaluation

The cornerstone of effective evaluation is a thorough understanding of the program you are trying to evaluate: what resources it has to work with, what it is doing, what it hopes to achieve, for whom, and when. In conducting an evaluation, it is tempting to focus most of your attention on data collection. However, your evaluation efforts will be more effective if you start with a logic model. Going through the logic model process will help ensure that your evaluation will yield relevant, useful information.

The figure below illustrates how the logic model you will build can serve as the foundation for future evaluation plans. (Our Evaluation Plan Workbook and online Evaluation Plan Builder offer guidance for creating evaluation plans.)

[Figure: the logic model (resources, activities, outputs, outcomes) feeding an evaluation plan for process (activities, outputs, data collection) and an evaluation plan for outcomes (outcomes, indicators, data collection)]

Components – Step by Step

A note about our "Home Buying" example: People often ask for examples that relate directly to their program area—but examples for one programmatic area can be difficult to "translate" to another programmatic area. We use the example of becoming a homeowner to give a more general conceptual framework.

Problem Statement

The first step in creating a logic model is to clearly articulate the problem your work is trying to solve—that is, to frame a particular challenge for the population you serve.

Other Terms for "Problem Statement": You might also hear a problem statement called an "issue statement" or "situation."
  • 93. I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Your problem statement should briefly explain what needs to change: why is there is a need for an intervention? Your problem statement answers the question, “What problem are we working to solve?” Include “who, what, why, where, when, and how” in your statement. Sample problem statements: I do not own my own home, so I do not experience the many financial and emotional benefits of home ownership. A growing number of women in Highland Falls lack the confidence and know-how to obtain employment and be self-sufficient due to low literacy in our region. In Townsville, low-income residents with bad or no credit do not have resources available to help them improve their current living situations. Build Your Logic Model: When you have identified your problem statement, insert it into the Problem Statement box in your logic model template, or on the “Problem/Goals” tab of the online Logic Model Builder. Goal Next, think about the overall purpose of what you are trying to measure (your program, intervention, etc). What are you trying to accomplish? The answer to this
  • 94. question is the solution to your problem statement, and will serve as your goal. Goals serve as a frame for all elements of the logic model that follow. They reflect organizational priorities and help you steer a clear direction for future action. Goals should: • Include the intended results—in general terms—of the program or initiative. • Specify the target population you intend to serve. Examples of goal statements include: To increase my financial independence and security through home ownership. Significantly increase the literacy rates among children with reading difficulties at Yisser Elementary School by implementing a teen-tutored reading program. Assist clients in their effort to become economically self- sufficient. Improve the health status of children, ages birth to 8 years, in Harrison County. Other Terms for “Goal”
  • 95. You might also hear a goal called an "objective" or a "long-term outcome." Logic Model Workbook Page 8 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Goal Tips: • All logic model components should be connected to your goal. Having a clear goal helps fight the temptation to implement an interesting program that doesn’t really “fit.” • It’s tempting to have more than one goal, but we recommend that you articulate one clear solution to your problem statement. Other goals of your program may be long-term outcomes, rather than goals. • Phrase your goal in terms of the change you want to achieve over the life of your intervention, rather than a summary of the services you are going to provide. • Don’t make your statement so broad and general that it provides no guidance for your project.
  • 96. Build Your Logic Model: Insert your goal statement(s) into the Goal box in your logic model template, or on the “Problem/Goals” tab of the online Logic Model Builder. Rationales A program’s rationales are the beliefs about how change occurs in your field and with your specific clients (or audience), based on research, experience, or best practices. For example: Home ownership increases a person’s options for financial stability and wealth-building. Current research on women leaving public income support systems indicates that targeted job training, partnered with a menu of support and coaching services, can help women get and keep living wage jobs Success in moving into higher-paying jobs and achieving economic self-sufficiency is closely related to the availability of opportunities for training and education. These rationales all demonstrate a core set of beliefs based on knowledge about how changes occur in the field. Build Your Logic Model: If you choose to include Rationales in your logic model, record them in the “Rationales” box on the template, or on the
"Rationale/Assumptions" tab in the online Logic Model Builder.

Assumptions

The assumptions that underlie a program's theory are conditions that are necessary for success, and you believe are true. Your program needs these conditions in order to succeed, but you believe these conditions already exist – they are not something you need to bring about with your program activities. In fact, they are not within your control. These assumptions can refer to facts or special circumstances in your community, region, and/or field. Examples of program assumptions are:
  • 98. There are houses for sale for which potential homebuyers will qualify. There are living wage jobs available within a reasonable distance of this neighborhood, with adequate public transportation to reach those jobs. Two counselors can serve a client population of approximately 40. The first assumption demonstrates that there is a circumstance within the community that will enable a homebuyer to successfully purchase a home. The third example shows that the program manager has clearly thought out how many counselors are needed to support the number of participants the program will serve. Build Your Logic Model: If you choose to include the Assumptions behind your program choices in your logic model, record them in the “Assumptions” box on the template, or on the “Rationale/Assumptions” tab in the online Logic Model Builder. Resources Identify the available resources for your program. This helps you determine the extent to which you will be able to implement the program and achieve your intended goals and outcomes. List the resources that you currently have to support
  • 99. your program. (If you intend to raise additional resources for the program during this program timeframe, account for them under "Activities.") An exception: If you’re building your logic model as part of a proposal or to justify a funding request, list all the resources you will need for a successful program, whether or not you have them in hand. (You may wish to separate resources under headings for “need” and “have.”) Other Terms for “Resources” You might also hear resources called “inputs” or “program investments”. Logic Model Workbook Page 10 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Common types of resources include: - and part-time staff, consultants (e.g., fundraising, technical support, strategic planning, communications), pro bono staff services, and volunteers cial resources: Restricted grants, operating budget, and other monetary resources
Common types of resources include:
• Full- and part-time staff, consultants (e.g., fundraising, technical support, strategic planning, communications), pro bono staff services, and volunteers
• Financial resources: restricted grants, operating budget, and other monetary resources
• Technology infrastructure (email, website)
• Office equipment (printers, copiers) and equipment specific to the program
• Materials and supplies (e.g., program materials), insurance, etc.

Resource Tips:
• Identify the major resource categories for your program.
• Be specific about these resources, but do not spend a lot of time developing a detailed list of all actual or anticipated program expenditures.

  Not specific enough      Just right                       Too specific
  Home-buying resources    Clear financial records          W2 forms, 1099s, tax returns, bank statements, pay stubs, utilities bills, credit report
  Staff                    3 full-time staff, 1 part-time   1 project lead @ 40 hrs/wk, 2 project associates @ 40 hrs/wk, 1 part-time support person @ 20 hrs/wk
  Supplies                 Art supplies                     25 paintbrushes, 50 bottles of paint, 250 sheets of paper, 25 coffee cans, dishwashing liquid

• Remember to include resources such as technology, materials, and space: these are often overlooked at the program planning stage, which can cause trouble later.
• You can use your resource list as the foundation for developing a program budget.
• Do you receive in-kind contributions? List those among your resources.

Build Your Logic Model: List your resources in the Resources box in your logic model template, or on the "Timeframe/Resources" tab of the online Logic Model Builder.
  • 102. Logic Model Workbook Page 11 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Activities Activities are the actions that are needed to implement your program—what you will do with program resources in order to achieve program outcomes and, ultimately, your goal(s). Common activities are: • Developing products (e.g., promotional materials and educational curricula), • Providing services (e.g., education and training, counseling or health screening), • Engaging in policy advocacy (e.g., issuing policy statements, conducting public testimony), or • Building infrastructure (e.g., strengthening governance and management structures, relationships, and capacity). It is often helpful to group related activities together. The number of activity groups depends on your program’s size and how you administer it. For a large program, there might be numerous activity groups; smaller programs may consist of just one or two. Each activity group will have more specific activities under it—but remember, this isn’t a to- do list. Getting too specific can
  • 103. overwhelm your audience. Examples: For our homebuying example, we use the activity groups of preliminary research, financial preparation, homebuyer’s education, identify a neighborhood, secure mortgage loan, choose a house, and make the purchase. A program with the goal of reducing the teen pregnancy rate in its city might have the following activity groups: family planning education, mentoring, and providing individual and group counseling. A program with a goal of increasing organizational capacity through strategic use of technology might have the following activity categories: technology planning, selecting and implementing technology infrastructure, staff assessment and training, and network support. Other Terms for “Activities” You might also hear activities called “processes,” “strategies,” “methods,” or “action steps.” Logic Model Workbook
  • 104. Page 12 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Activities Tips: • You can use the activities you identify here as an outline for a work plan. Use the activities as headings in a more comprehensive work plan that includes staff assignments and a timeline. • Providing a complete list of activities helps people who are not familiar with your understand what it really takes to implement it—but getting too specific can overwhelm them. The chart below gives some examples of what level of specificity is useful. Activity Group: Identify a neighborhood ACTIVITIES: • Hire real-estate agent • Drive around the city This set of activities is not detailed enough. It omits a number of key steps needed to implement mentor training. Activity Group: Identify a neighborhood ACTIVITIES:
  • 105. • Conduct Google search • Interview friends and family • Choose three books from the local library about neighborhoods • Read three books • Hire a driver to tour neighborhoods • Try neighborhood restaurants • Set up review meeting • Take friends and family on neighborhood tours o Send out Invitations o Arrange transportation This is too detailed. It would more appropriately belong in a work plan. Activity Group: Identify a neighborhood ACTIVITIES: • Research local neighborhoods--amenities and prices • Hire a real-estate agent • Tour priority neighborhoods This is just about the right level of detail for a logic model. Build Your Logic Model: List all activities required to implement your program, and group related activities together. Record them in your template
  • 106. or on the “Activities/Outputs” tab of the online Logic Model Builder. Logic Model Workbook Page 13 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Outputs Outputs are the measurable, tangible, and direct products or results of program activities. They lead to desired outcomes—benefits for participants, families, communities, or organizations—but are not themselves the changes you expect the program will produce. They do help you assess how well you are implementing the program. Whenever possible, express outputs in terms of the size and/or scope of services and products delivered or produced by the program. They frequently include quantities or reflect the existence of something new. Examples of program outputs include numbers and descriptions of: • Number of home buying workshops attended • Number of neighborhoods researched • Number of program participants served • Hours of service provided
  • 107. • Number of partnerships or coalitions formed • Focus groups held • Policy briefings conducted An output statement doesn’t reveal anything about quality. You will assess the quality of your outputs in your evaluation. Outputs Tips: • Make sure your outputs have activities and resources associated with them. This is one way a logic model is useful—to check whether a program has planned how it will create a product or deliver a service. • Many people identify specific numbers for their outputs. Begin with an estimate, based on experience, desired impact, and resources available. Don’t get stuck on exact numbers; you can adjust them later. Build Your Logic Model: List all the outputs you expect your program’s activities will produce. Place these in the Outputs box of the logic model template or on the “Activities/Outputs” tab of the online Logic Model Builder. Other Terms for “Outputs”
  • 108. You might also hear outputs called “deliverables,” “units of service,” or “products.” Logic Model Workbook Page 14 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Outcomes Outcomes express the results that your program intends to achieve if implemented as planned. Outcomes are the changes that occur or the difference that is made for individuals, groups, families, organizations, systems, or communities during or after the program. Outcomes answer the questions: “What difference does the program make? What does success look like?” They reflect the core achievements you hope for your program. Outcomes should: • Represent the results or impacts that occur because of program activities and services • Be within the scope of the program’s control or sphere of reasonable influence, as well as the timeframe you have chosen for your logic model • Be generally accepted as valid by various stakeholders of the program • Be phrased in terms of change
  • 109. • Be measurable. (It may take work to translate them into measurable indicators.) Types of Change: Organizations with diverse missions and services share common categories of outcomes, because outcomes are about change: changes in learning, changes in action, or changes in condition. Changes in Learning: o New knowledge o Increased skills o Changed attitudes, opinions, or values o Changed motivation or aspirations For example: • Potential homeowners increase their understanding of the home buying process • Teens ages 15-18 increase their commitment to community service. Changes in Action: o Modified behavior or practice o Changed decisions o Changed policies For example: • Potential homeowners have purchased their first home. • Teens ages 15-18 participate in community service.
  • 110. Other Terms for “Outcomes” You might also hear outcomes called “results”, “impacts”, or “objectives”. Logic Model Workbook Page 15 I N N O V A T I O N N E T W O R K , I N C . www.innonet.org • [email protected] Changes in Condition: o Human (e.g., from oppression to freedom; from malnourishment to food security) o Economic (e.g., from unemployed to employed) o Civic (e.g., from disenfranchised to empowered) o Environmental (e.g., from polluted to clean) For example: • Potential homeowners have purchased their first home. • Teens ages 15-18 have improved employment prospects because of community service. Focus of Outcomes: Clarify who or what will experience the intended changes. 1. Individual, Client-Focused Outcomes: These reflect the
  • 111. difference the program will make in the lives of those directly served by the program. Examples include: • Potential homebuyer has purchased a home (change in status/condition) • Parents use alternative discipline approaches (behavior) • Participants are better able to organize and advocate for their rights (skills) • Children are better prepared to enter school (changed status/condition) 2. Family or Community Outcomes: Some programs intend to create change for families, neighborhoods, or whole communities. Examples include: • Higher percentage of homeowners as opposed to renters in a low-income community • Improved communication among family members • Increased parent-child-school interactions • Decreased neighborhood violence • Community group has an inclusive membership policy, work group practices, and democratic governance 3. Systemic Outcomes: These illustrate changes to overall systems and might include cases where agencies, departments, or complex organizations work in new ways, behave differently, share resources, and provide services in a coordinated fashion. Examples include: • Integrated system of services or interagency resource sharing • Greater coordination among partners in a system
4. Organizational Outcomes: Some programs lead to internal outcomes—both individual and institutional—that affect how well a program can achieve external outcomes. These produce improvements in program management and organizational effectiveness. Examples of organizational outcomes include:
• Increased efficiency
• Increased staff motivation
• Increased collaboration with other organizations

Chain of Outcomes. Not all outcomes can occur at the same time. Some outcomes must occur before others become possible. This is referred to as the "chain of outcomes." (See Appendix B for a worksheet.)

Short-term Outcomes: What change do you expect to occur either immediately or in the near future? Short-term outcomes are those that are the most direct result of a program's activities and outputs. They are typically not ends in themselves, but are necessary steps toward desired ends (intermediate or long-term outcomes or goals).

Intermediate Outcomes: What change do you want to occur after that? Intermediate outcomes are those outcomes that link a program's short-term outcomes to long-term outcomes.

Long-term Outcomes: What change do you hope will occur over time? Long-term outcomes are those that result from the achievement of your short- and intermediate-term outcomes. They are also generally outcomes over which your program has a less direct influence. Often long-term outcomes will occur beyond the timeframe you identified for your logic model.

Outcomes Chain Example

Good Health for Kids is an advocacy organization that educates parents and guardians about the importance of immunizing children. The staff has identified the following program activities:
• Develop educational literature
• Disseminate literature to social service agencies
• Develop public service announcements (PSAs)
• Identify and work with radio stations to air radio spots

The outcomes associated with these activities fall into three categories (in the original figure, outcomes closer in time are easier to measure and more attributable to the program):

Short-Term LEARNING: The knowledge parents and guardians gain from the literature & PSAs.
• Increased understanding among targeted parents of the importance of childhood immunization
• Increased knowledge among targeted parents of where to go to have their children immunized

Intermediate BEHAVIOR: The actions parents & guardians take as a result of that knowledge.
• Increased number of targeted parents who take their children to be immunized

Long-Term CONDITION: The conditions that change as a result of those actions.
• Increased number of