# 11: Problem Solving with SARA


### RESEARCHING PROBLEMS AND ASSESSING RESPONSES

CJS380 – Crime Science, J. A. Gilmer ©
### SARA: THE PROBLEM-SOLVING PROCESS

- SCAN
- ANALYZE
- RESPOND
- ASSESS
## SCANNING
### IDENTIFY & PRIORITIZE THE PROBLEM

- Identify recurring problems of concern
- Identify consequences of the problem
- Prioritize identified problems
- Develop broad goals
- Confirm that the problem exists
- Determine how frequently the problem occurs and how long it has been taking place
- Select a problem for closer examination
### The CHEERS Test

- Community – must experience the harmful events
- Harmful – property loss/damage, injury/death, mental anguish, undermining of police (illegality is not a defining characteristic of problems)
- Expectation – community members expect police to act (not necessarily a majority)
- Events – problems are made up of discrete events
- Recurring – acute or chronic
- Similarity – the recurring events must have something in common

http://www.popcenter.org/learning/60steps/index.cfm?stepNum=14
### UNDERSTAND YOUR PROBLEM: 5 W + 1 H = Hypothesis

- Who is involved?
- What exactly do they do?
- Why do they do this?
- Where do they do this?
- When do they do this?
- How do they carry out the crime?

Hypothesis – a statement that explains why the problem is occurring
## ANALYSIS
### RESEARCH THE PROBLEM

- Identify/understand the events and conditions that precede and accompany the problem
- Identify relevant data to be collected
- Research what is known about the problem type
- Take inventory of how the problem is currently addressed, and the strengths and limitations of the current response
- Narrow the scope of the problem as specifically as possible
- Identify a variety of resources that may assist in developing a deeper understanding of the problem
- Develop a working hypothesis about why the problem is occurring
### The Five Most Useful Websites

- Center for Problem-Oriented Policing (www.popcenter.org)
- National Criminal Justice Reference Service (NCJRS) Abstracts Database (http://www.ncjrs.gov/abstractdb/search.asp)
- The Home Office | Crime, United Kingdom (http://www.homeoffice.gov.uk/crime/)
- Australian Institute of Criminology (www.aic.gov.au)
### DATABASES at Hellman Library via Blackboard

- EBSCO Host (search single or multiple databases)
  - Academic Search Premier
  - Criminal Justice Abstracts
  - ERIC
  - SocINDEX
- Google Scholar
- JSTOR (historical)
- LexisNexis Academic
- ProQuest Academic
- SAGE journals online
- Interlibrary Loan

Log in and try it.
## RESPONSE
### INTERVENTION

- Brainstorm for new interventions
- Search for what other communities with similar problems have done
- Choose among the alternative interventions
- Outline a response plan and identify responsible parties
- State specific objectives for the response plan
- Carry out the planned activities
### POLICE-SPECIFIC PROJECTS

- Goldstein Awards (http://www.popcenter.org/goldstein/)
  - Recognize outstanding police officers and police agencies, both in the United States and around the world, that engage in innovative and effective problem-solving efforts and achieve measurable success in reducing specific crime, disorder, and public safety problems.
- Tilley Awards (http://www.homeoffice.gov.uk/crime/partnerships/tilley-awards/)
  - Set up by the U.K. Home Office Policing and Reducing Crime Unit (now the Crime and Policing Group) in 1999 to encourage and recognize good practice in implementing problem-oriented policing (POP).
### IDENTIFY RESPONSES

- Keep a summary record of responses
- Note the primary source
- Explain how the response works
- Note under what conditions it works best
- Note any special considerations (costs, legal requirements, etc.)
## ASSESSMENT
### EVALUATE AND ASSESS

KEY QUESTION: DID THE PROBLEM DECLINE ENOUGH TO END THE EFFORT?

- Determine whether the plan was implemented (process evaluation)
- Collect pre- and post-response data (qualitative & quantitative)
- Determine whether broad goals and specific objectives were attained
- Identify any new strategies needed to augment the original plan
- Conduct ongoing assessment to ensure continued effectiveness
### EVALUATION VS. ASSESSMENT

- EVALUATION – a scientific process for determining whether a problem declined and whether the solution caused the decline
  - Begins the moment the problem-solving process begins and continues through the completion of the effort
- ASSESSMENT – the final stage of both evaluation and problem solving
  - Answers the following questions: Did the response occur as planned? Did the problem decline? If so, are there good reasons to believe the decline resulted from the response?
### Evaluation throughout the problem-solving process

[Figure: Fig. 1 in Tool Guide No. 1 (2002)]
### TYPES OF EVALUATIONS

- Process Evaluation
  - Did the response occur as planned? Did all response components work?
  - Involves comparing the planned response with what actually occurred
- Impact Evaluation
  - Did the problem decline? If so, did the response cause the decline?
  - To be able to use the response reliably again, it is important to determine whether it caused the decline in the problem
### Interpreting Results of Process and Impact Evaluations

[Figure: Tool Guide No. 1 (2002)]
### CONDUCTING IMPACT EVALUATIONS

- Part 1: Measure the problem
  - Quantitative – counts and numerical estimates; adds comparability
  - Qualitative – e.g., photos, maps, interviews; allows comparisons, but not precision; reinforces quantitative information
- Part 2: Evaluation design
  - Compare measures systematically
### MEASURING THE PROBLEM

- Take the most direct measure of the problem
  - The more indirect the measure, the less valid
- Use multiple measures, where possible
  - Arrest, as a measure of impact, may be affected by citizen complaint activity and/or police practice
- Whether a measure is direct or indirect depends on how the problem is defined
  - Is the focus on “behavior” or “perception of behavior”?
- Measure the problem systematically and use the same measures throughout
### DID THE RESPONSE CAUSE THE CHANGE?

- Is there a Plausible Explanation that the response changed the level of the problem?
  - Based on detailed problem analysis, backed by research
- Is there an Association between the presence of the response and a change in the level of the problem?
- Did the response Precede a change in the problem?
  - Have measures before and after the response begins
- Are there No Plausible Alternative Explanations?
  - Could ‘something else’ have caused the results found?
### EVALUATION DESIGNS

- Pre-post designs: simplest
  - Can establish ‘association’ and ‘temporal order’
  - Weak at ruling out alternative explanations
  - Can’t assess fluctuations between measurements

Tool Guide No. 1 (2002)
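The pre-post logic above can be sketched in a few lines. The counts below are hypothetical monthly incident totals, invented purely for illustration; a real evaluation would use the agency’s own before/after data.

```python
# Minimal pre-post comparison sketch (hypothetical data):
# compare average monthly incident counts before and after a response.
pre_counts = [42, 38, 45, 40, 44, 41]   # six months before the response
post_counts = [30, 28, 33, 29, 31, 27]  # six months after the response

pre_avg = sum(pre_counts) / len(pre_counts)
post_avg = sum(post_counts) / len(post_counts)
pct_change = (post_avg - pre_avg) / pre_avg * 100

print(f"Before: {pre_avg:.1f} incidents/month")
print(f"After:  {post_avg:.1f} incidents/month")
print(f"Change: {pct_change:+.1f}%")
```

As the slide warns, a drop in this single before/after comparison establishes association and temporal order, but not causation: the design cannot rule out ‘something else’ having produced the decline.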
### EVALUATION DESIGNS

- Interrupted time series designs: superior
  - Repeated measures assess the problem’s trajectory before and after the response
  - Require time intervals of sufficient duration to derive meaningful conclusions
  - Easy to use with routine data
  - Stability of impact after the response controls for fluctuation

Tool Guide No. 1 (2002)
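The core idea of an interrupted time series can be sketched by fitting a trend to the pre-response periods and comparing its projection against what was actually observed afterward. The counts below are hypothetical, and a real analysis would use segmented regression on many more intervals; this is only an illustration of the logic.

```python
# Interrupted-time-series sketch (pure Python, hypothetical data):
# fit a least-squares trend line to the pre-response counts, project it
# over the post-response periods, and measure the gap from what occurred.
pre = [50, 48, 51, 49, 52, 50]   # periods 1..6, before the response
post = [40, 38, 37, 36, 35, 34]  # periods 7..12, after the response

n = len(pre)
xs = range(1, n + 1)
x_mean = sum(xs) / n
y_mean = sum(pre) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, pre))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

# Counts expected in periods 7..12 had the pre-response trend continued.
projected = [intercept + slope * t for t in range(n + 1, n + len(post) + 1)]
avg_gap = sum(p - a for p, a in zip(projected, post)) / len(post)
print(f"Observed counts fall {avg_gap:.1f} incidents/period below the projection")
```

Even here, the gap only shows that the problem fell below its prior trajectory; ruling out alternative explanations still requires the plausibility checks from the earlier slides.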
### EVALUATION DESIGNS

- Interrupted time series designs are often not practical
  - Measurement can be expensive or difficult (e.g., surveys)
  - Data may be unavailable for many periods before the response
  - Decision-makers may not want to wait the time required to establish the results of the response
  - If data recording practices change, inter-period comparisons become invalid
  - Hard to interpret when problem events are rare in a time period, forcing the use of fewer intervals of longer duration
  - Cannot account for ‘something else’ that occurred which caused the level of the problem to change