In February 2015, Jeanne Century and Amy Cassata presented a one-hour webinar to more than 150 education researchers, evaluators, faculty, and other members of the MSPnet (Math and Science Partnership Network) online community. MSPnet serves recipients of and participants in NSF's "STEM + Computing Partnerships" and "Math and Science Partnerships" research programs. The webinar provided a high-level overview of key issues in implementation measurement, including definitions, theory, and study design and measurement approaches at different stages of research.
1. Making Sense of Measuring Implementation in Educational Research
Amy Cassata & Jeanne Century
MSPnet Webinar
February 25, 2015
2. Today's Agenda
• Definitions, Theory and Background
• Implementation Measurement and the Common Guidelines
• Design and Measurement Approaches
• Analysis Strategies
3. Previous and Current Work
• Study #1: Exploratory – Nine School Districts – Science, 1998–2002
• Study #2: Literature Synthesis, 2006–2010
• Study #3: FOI of Mathematics and Science Materials, 2007–2010
• Study #4: Implementation Study – STEM Schools (2), 2010–2013
• Study #5: Implementation Study – Elementary Mathematics, 2011–2015
• Study #6: Instrument Development/Validation Study – IES, 2011–2015
• Study #7: Implementation Study – Computer Science, 2013–2016
4. Definitions
• What is an innovation? An intervention?
• What is fidelity of implementation? How is it different from "implementation" (use)?
• What is the implementation process?
5.
6. Intervention Component Framework
Categories of Critical Components:
• Structural: Procedural, Educative
• Interactional: Interactional (Leader), Interactional (Participant) (engagement)
This work was funded in part by the National Science Foundation.
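The categories above can be sketched as a simple data structure. This is an illustrative sketch only: the category and subcategory names come from the slide, while the intervention and its components are hypothetical examples, not part of the webinar.

```python
# Sketch of the Intervention Component Framework: an intervention is a
# set of critical components, each tagged with a category/subcategory.
from dataclasses import dataclass, field


@dataclass
class Component:
    name: str
    category: str     # "Structural" or "Interactional"
    subcategory: str  # e.g., "Procedural", "Educative", "Leader", "Participant"


@dataclass
class Intervention:
    name: str
    components: list = field(default_factory=list)

    def by_category(self, category):
        """Return all critical components in the given category."""
        return [c for c in self.components if c.category == category]


# Hypothetical curriculum with two components per category.
curriculum = Intervention("Hypothetical math curriculum")
curriculum.components += [
    Component("Follow the lesson sequence", "Structural", "Procedural"),
    Component("Read teacher background notes", "Structural", "Educative"),
    Component("Teacher poses open-ended questions", "Interactional", "Leader"),
    Component("Students discuss in small groups", "Interactional", "Participant"),
]

print([c.name for c in curriculum.by_category("Structural")])
```

Organizing components this way makes the later measurement steps concrete: each component becomes an observable unit whose enactment can be rated.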
7. What is the "intervention?" What is the "IT?"
[Diagram: over time, an intervention and its outcomes (A, B, C, A', B') evolve into INTERVENTION' and INTERVENTION'', with factors influencing each stage.]
8.
9. Factors: Characteristics of the Users
In the context of the intervention:
• self-efficacy
• understanding of the intervention
• attitude toward the intervention
• intrinsic motivation
• extrinsic motivation
Descriptive characteristics:
• demographic
• education
• experience
10. Factors: Characteristics of the Users
NOT in the context of the intervention:
• innovativeness
• resourcefulness and coping
• networked-ness
• time management and organizational skills
Perceptions:
• perceived adaptability
• perceived visibility
• ease of use
• perceived effectiveness
11. Factors: Characteristics of the Organization
Descriptive characteristics:
• organizational structures
• physical environment
• population characteristics
• stakeholder community support
• presence of opportunities for learning inside the organization
People in the organization:
• information sufficiency
• information sharing
• shared beliefs and values
• resources
• locus of decision making
• utility of learning opportunities
• organizational efficacy
• collaboration
• leadership
Organizational strategies:
• ongoing improvement structures
• leveraging
• implementation strategy
• family/community communication
12. Factors: Characteristics of the Environment
• political environment
• community beliefs and values
• descriptive community characteristics
• opportunities for learning
• network structures
• extraneous events or initiatives
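The slides present these factors as measurable constructs. As a hedged illustration of what "measurable" can mean in practice, the sketch below scores a few user-level factors from Likert-type survey items; the items, ratings, and the simple mean-based scoring are invented for illustration and are not the presenters' instruments.

```python
# Illustrative factor scoring: each hypothesized factor is measured by
# several 1-5 Likert items, and the factor score is the item mean.
from statistics import mean

# Invented responses from one respondent (factor -> item ratings).
responses = {
    "self-efficacy": [4, 5, 4],
    "understanding of the intervention": [3, 3, 4],
    "attitude toward the intervention": [5, 4],
    "perceived adaptability": [2, 3, 3],
}

# Score each factor as the rounded mean of its items.
factor_scores = {factor: round(mean(items), 2)
                 for factor, items in responses.items()}

print(factor_scores)
```

Scores like these can then be examined alongside enactment data to ask which factors support or inhibit implementation.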
13. Areas of Convergence
• Definition of fidelity of implementation
• Measuring "dimensions" of implementation: structure and process
• Need to identify the intervention model and the critical elements tied to intended impacts
14. Common Guidelines Study Types

Design and Development
Purpose: Draw on existing theory and evidence to design and iteratively develop interventions or strategies.
Questions: Are users able to enact the intervention as intended (feasibility and fidelity) and, if not, why (factors)?

Efficacy Study
Purpose: Test a strategy or intervention under "ideal" circumstances.
Questions: Does it work? (fidelity)

Effectiveness Study
Purpose: Examine a strategy or intervention under "typical" circumstances or conditions of routine practice.
Questions: How does it work in natural settings, and what factors support and inhibit implementation? (use, factors)

Scale-up Study
Purpose: Examine a strategy or intervention in a wide range of populations, contexts, and circumstances.
Questions: Does it work on a large scale? (fidelity, use, factors)
15. Measuring Implementation in Common Guidelines Studies
Implementation measurement challenges:
• Identifying essential components
• Organizing components
• Disentangling intervention components from general quality indicators
• Measures of appropriate grain size
• Instruments for both treatment and control conditions
• Affordable designs that will yield sufficient data to differentiate treatments
• Creating scales and indices representing implementation that can be linked to outcomes
• Appropriate analytic approaches
• Relationships between factors and implementation
• Relationships between implementation and outcomes
• Describing implementation over time
• Optimal implementation for particular populations
16. Takeaways
• Implementation measurement is more than just "fidelity of implementation."
• All initiatives (programs, interventions, reforms, innovations) are composed of components.
• Factors influence enactment of components.
• Components and factors are measurable.
• Be clear about what the "it" is.
Contact Information:
acassata@uchicago.edu
jcentury@uchicago.edu