Chapter 13 Interpreting Data module version 1
© 2018 Cengage Learning. All Rights Reserved.
Learning Objectives
• Summarize evaluation research and problem analysis as
examples of applied research in criminal justice
• Describe how different types of evaluation activities
correspond to different stages in the policy process
• Explain the role of an evaluability assessment
• Understand why a careful formulation of the problem,
relevant measurements, and criteria of success or failure
are essential in evaluation research
• Describe the parallels between evaluation research designs
and other designs
• Explain the advantages, requirements, and limits of
randomized field experiments
Learning Objectives, cont.
• Summarize the importance of process evaluations conducted
independently or in connection with an impact assessment
• Describe the role of problem analysis as a planning technique
that draws on the same social science research methods used
in program evaluation
• Explain how the scientific realist approach focuses on
mechanisms in context, rather than generalizable causal
processes
• Present an example of how criminal justice agencies are
increasingly using problem analysis tools, crime mapping, and
other space-based procedures
• Explain how evaluation research entails special logistical,
ethical, and political problems
Introduction
• Evaluation Research: Refers to a research purpose rather than a specific method; seeks to evaluate the impact of interventions, that is, whether some intended result was produced
• Problem Analysis: Designed to help public officials
choose from alternative future actions
• Policy Intervention: An action taken for the
purpose of producing some intended result
• Evidence-Based Policy: The actions of justice
agencies are linked to evidence used for planning
and evaluation
The Policy Process
• Begins with a demand supporting a new course
of action or opposition to existing policy
• Policymakers consider ultimate goals and actions
to achieve those goals
• Outputs: The means used to achieve desired goals
• Impacts: Basic questions about what a policy seeks to achieve
• If some policy action is taken, then we expect some result
Discussion Question 1
What if you studied policy implementation?
Do you think you’d be more interested in
impacts or outputs? Why?
Linking the Process to Evaluation
• Are policies being implemented as planned?
• Are policies achieving their intended goals?
• Evaluation seeks to link intended actions and
goals of policy to empirical evidence that:
• Policies are being carried out as planned (process evaluation)
• Policies are having the desired effects (impact assessment)
• Often conducted together
Getting Started
• Learning policy goals is a key first step in doing
evaluation research
• Evaluability Assessment: A “pre-evaluation” in which the researcher determines whether the conditions required for an evaluation are present, including:
• Support from relevant organizations
• What goals and objectives are; how they are translated into
program components
• What kinds of records or data are available
• Who has a direct or indirect stake in the program
Discussion Question 2
How might you convince a public official that
an evaluation is both a necessary thing and
a positive thing?
Problem Formulation and Measurement
• Different stakeholders often have different goals and
views as to how a program should actually operate
• Must clearly specify desired outcomes: program goals and objectives
• State objectives as operationalized, measurable versions of program goals
• Definition and measurement: target/beneficiary
population; decide between using current measures
or creating new ones
• Measure program contexts, outcomes, and delivery
Designs for Program Evaluation
• Randomized Evaluation Designs: Avoid selection bias; allow the assumption that groups created by random assignment are statistically equivalent; may not be suitable when agency staff make exceptions to randomization
• Case Flow: Represents process through which
subjects are accumulated into experimental and
control groups
• Treatment Integrity: Whether an experimental intervention is delivered as intended; analogous to reliability
• Threatened by midstream changes in program
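The case-flow idea can be sketched in a few lines. This is an illustrative sketch only (the function name `assign_cases` and the 50/50 split are assumptions, not from the text); it shows why random assignment avoids selection bias: no agency or staff judgment influences which group a case enters.

```python
import random

def assign_cases(case_ids, seed=42):
    """Randomly assign eligible cases to experimental and control groups.

    Because each case has an equal chance of receiving the intervention,
    the two groups are statistically equivalent in expectation.
    (Hypothetical helper for illustration; fixed seed for reproducibility.)
    """
    rng = random.Random(seed)
    experimental, control = [], []
    for case_id in case_ids:
        # A fair coin flip decides group membership for every case
        (experimental if rng.random() < 0.5 else control).append(case_id)
    return experimental, control

exp_group, ctl_group = assign_cases(range(100))
```

Exceptions made by staff (e.g., overriding the flip for "deserving" cases) would reintroduce exactly the selection bias this procedure is meant to rule out.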
Discussion Question 3
What if local researchers wanted to
experiment on the criminal justice system in
your area? What would be your concerns as
far as ethics and integrity?
Conditions for Randomized Experiments
• Staff must accept random assignment and agree to
minimize exceptions to randomization
• Case flow must be adequate to produce enough
subjects in each group so that statistical tests will be
able to detect significant differences in outcome
measures
• Experimental interventions must be consistently
applied to treatment groups and withheld from
control groups
• Need equivalence prior to intervention, and ability to
detect differences in outcome measures after
intervention
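The "adequate case flow" condition can be made concrete with a back-of-the-envelope power calculation. The sketch below assumes a two-sided comparison of two proportions with a standard normal approximation; the function name and the 40%-vs-30% rearrest figures are invented for illustration.

```python
import math
from statistics import NormalDist

def per_group_n(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-group sample size for detecting a difference
    between two outcome proportions (normal approximation).

    Used here to check whether projected case flow will give
    statistical tests enough power to detect a program effect.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = z.inv_cdf(power)            # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# e.g., to detect a drop in rearrest from 40% to 30% (hypothetical figures):
n_needed = per_group_n(0.40, 0.30)
```

Smaller expected effects require sharply more cases per group, which is why case flow must be assessed before committing to a randomized design.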
Home Detention: Two Randomized Studies
• Both studies combined home detention with electronic monitoring (ELMO)
• The juvenile program paid less attention than the adult program to delivering program elements and using ELMO information
• Difficult to maintain desired level of control over experimental
conditions
• Also difficult when more than one organization is involved
• Randomization does not control for variation in
treatment integrity and program delivery; utilize
other methods
Quasi-Experimental Designs
• No random assignment to experimental and control groups
• Often “nested” in experimental designs as backups
• Lack built-in controls for selection and other internal validity threats
• Researchers must construct experimental and control groups that are as similar as possible
Quasi-Experimental Designs, cont.
• Ex Post Evaluation: Conducted after the experimental program has gone into effect
• Full-Coverage Programs: e.g., sentencing guidelines
• Larger Treatment Units: e.g., a neighborhood crime prevention program
• Interrupted Time-Series Designs: Require attention to different issues because researchers cannot normally control how reliably the experimental treatment is actually implemented
• Key threats: instrumentation, history, construct validity
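The logic of an interrupted time-series design can be sketched as a regression with a time trend plus a post-intervention level shift. The monthly counts below are invented for illustration, and this minimal model omits the seasonality and autocorrelation adjustments a real analysis would need.

```python
import numpy as np

# Hypothetical monthly burglary counts: 12 months before and
# 12 months after an intervention (values invented for illustration).
before = [50, 52, 49, 51, 53, 50, 48, 52, 51, 49, 50, 52]
after = [44, 42, 45, 41, 43, 40, 42, 41, 39, 42, 40, 41]
y = np.array(before + after, dtype=float)

t = np.arange(len(y))                       # underlying time trend
post = (t >= len(before)).astype(float)     # 1 after the intervention
X = np.column_stack([np.ones_like(t, dtype=float), t, post])

# Ordinary least squares: y = b0 + b1*t + b2*post
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change = coef[2]   # estimated shift in level at the intervention
```

A negative `level_change` estimate is consistent with the program reducing the series, but history and instrumentation threats (something else changed at the same time, or counting practices changed) remain plausible rival explanations.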
Problem Analysis and Scientific Realism
• Problem analysis, coupled with scientific realism, helps
public officials use research to select and assess
alternative courses of action
• Realists suggest that similar interventions will have
different outcomes in different contexts
• Evaluators should search for mechanisms (independent variables) acting in context (assorted intervening variables) to explain outcomes (dependent variables)
• Appropriate in small-scale evaluations directed toward
solving a particular problem in a specific context
Problem Analysis
• Problem-Oriented Policing
• Problem solving: A fundamental tool in problem-oriented
policing
• How-to-Do-It Guides: A general guide to
crime analysis to support problem-oriented
policing
• Problem & Response Guides: Describe how to analyze very specific types of problems and which responses are known to be effective or ineffective
Auto Theft in Chula Vista
• Nanci Plouffe and Rana Sampson (2004)
began their analysis of vehicle theft by
comparing Chula Vista to other southern
California cities
• Theft rates tended to be higher for cities closer to the
border
• Ten parking lots accounted for 25 percent of thefts and
20 percent of break-ins in the city
• Six of the ten lots were among the top ten calls-for-
service locations in Chula Vista
• Auto theft hot spots also tended to be hot spots for other
kinds of incidents
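The Chula Vista finding that a few lots accounted for a large share of thefts is a classic concentration analysis, sketched below with invented location names and counts (not the actual Chula Vista data).

```python
from collections import Counter

# Hypothetical incident log: one entry per recorded vehicle theft,
# keyed by location (names and counts invented for illustration).
thefts = (["lot_A"] * 30 + ["lot_B"] * 22 + ["lot_C"] * 15
          + ["mall"] * 8 + ["street_1"] * 5 + ["street_2"] * 5
          + ["misc"] * 15)

counts = Counter(thefts)
total = sum(counts.values())

# Share of all thefts accounted for by the top 3 locations
top3 = counts.most_common(3)
top3_share = sum(n for _, n in top3) / total
```

When a handful of places drive most of the problem, responses can be targeted at those places rather than spread citywide, which is the core logic of problem analysis.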
Other Applications of Policy Analysis
• Space- and Time-Based Analysis: Increasingly prevalent due to technological advances
• Crime maps usually represent at least four
different things:
• (1) one or more crime types; (2) space or area; (3) some
time period; and (4) some dimension of land use, usually
streets
• Problem-solving tools and processes
• Strategic Approaches to Community Safety Initiatives
(SACSI)
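The spatial layer of a crime map can be sketched by binning incident coordinates into grid cells and finding the hottest cell; the coordinates and cell size below are invented for illustration, and a real map would also layer crime type, time period, and land use.

```python
from collections import Counter

# Hypothetical incident coordinates (e.g., projected x/y in km; invented).
incidents = [(1.2, 3.4), (1.3, 3.5), (5.1, 0.2), (1.1, 3.3), (4.9, 0.4)]

def cell(x, y, size=1.0):
    """Map a coordinate to its grid cell (illustrative binning helper)."""
    return (int(x // size), int(y // size))

# Count incidents per grid cell, then pick the hot spot
grid = Counter(cell(x, y) for x, y in incidents)
hot_cell, hot_count = grid.most_common(1)[0]
```

Layering these counts over streets and land-use data, by crime type and time period, yields the four elements a crime map usually represents.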
Political Context of Applied Research
• Different stakeholder interests can produce
conflicting perspectives on evaluations
• Researcher must identify stakeholders and
perspectives
• Educate stakeholders on why evaluation should
be conducted
• Explain that applied research is used to
determine what works and what does not
• Political concerns and ideology may color how evaluation findings are interpreted and used; researchers should anticipate this