Surveys


Some background information:
- The basic component of surveys consists of QUESTIONS THAT PEOPLE ANSWER; however, sampling and analysis are also fundamental parts of survey research.
- Arguably the most popular data collection strategy in the social sciences.
- Victim surveys and offender surveys represent major data collection efforts in criminology and criminal justice.

BASIC MODEL (** test: describe these steps, provide answers):
ASKING QUESTION -> UNDERSTANDING QUESTION -> RETRIEVING INFORMATION -> REPORTING ANSWER

Answering survey questions

Understanding of questions
- Interpretation and meaning.
  Example: Are the 2010 Winter Olympics good for Vancouver?
  - "Good" in what sense?
  - "Olympics": does this include the Paralympics?
  - "Vancouver": Vancouver residents? Greater Vancouver? British Columbia? Canada?
  - Is the meaning/interpretation affected by or dependent on previous questions? (i.e. if all the questions before this one dealt with the homelessness problem in Vancouver, would that affect the answer?)

Information processing
- Is the respondent being asked to think about actual experiences or hypothetical situations?

Retrieving/constructing responses
- Memory
- Mental capacity
- Qualified to answer **

Memory
- If the time frame is Jan. 1 - Dec. 31 ("what crime have you been a victim of?"):
  - Forward telescoping - reporting events that actually occurred before Jan. 1 as if they happened after Jan. 1.
  - Keep the time period as small as possible for accuracy, or relate it to other events ("it occurred before my birthday, after my anniversary").

Mental capacity
- You wouldn't ask a kindergartener about the economic status of Canada.
- Drug users who are high won't have the capacity to answer accurately.

Qualified to answer **
- "How old were you when you started to walk?" "How old were you when you no longer needed diapers?" You wouldn't ask the person who experienced these events, since they can't remember them.

Reporting answer
- Social and conversational norms (e.g. social desirability - many will not disclose things they think would make them look bad).
- Confidentiality and anonymity (help get a more accurate answer without the respondent worrying about social desirability; assurances such as a consent form).

TYPES OF SURVEYS
Distinction based on the mode of surveying/how it is carried out.
Self-administered - you are given the survey and complete it yourself.
Interviews - the surveyor asks you questions and writes down your answers.
Half - the interviewer asks the questions but you write down the answers.

Self-administered |-----------------------|-------------------------| Interviews
      Mailed                   Half                    Face to face
      Emailed                                          Telephone
      Drop-off
QUESTIONNAIRES
Mailed surveys
- At one time a common type of self-administered survey, especially in consumer/marketing research.
- Can be used effectively with targeted groups (e.g. membership lists for a golf club).

Some basic considerations:
- Cover letter
  - Describes the study.
  - Has to get people interested in participating in the study.
  - Enlists the respondent's cooperation/participation:
    - Stresses the importance of the study.
    - Stresses the importance of the respondent.
- Instructions
  - How to complete the questionnaire.
  - How/where/when to return the completed questionnaire.
- Typically LOW response rates.
  - Non-respondents could be distinct in some way.

Factors affecting participation/response rates:
- Who's doing it.
- Inducements to participate (awards, tokens of appreciation, being entered into draws).
- Characteristics of respondents (group member/non-member, age, gender, ethnicity, social class).
- Mailing date.
- Follow-up procedures (reminder letter, additional letter and a fresh questionnaire, telephone call!!!) - you have to do a lot of work to get the most and best responses possible.
- Confidentiality/anonymity.
INTERVIEW SCHEDULES
- Role of the interviewer (can include):
  - Locating and enlisting the cooperation of respondents.
  - Conducting a "good" interview.
  - Motivating the respondent.
  - Answering questions/clarifying confusion.
  - Using PROBES to elicit additional information.
  - Accurately recording information (not easy).
- Training of interviewers (must include):
  - Knowing key aspects of the study (objectives, sponsors, sampling strategy, etc.).
  - Interviewing basics (how to ask questions and record answers).

Telephone Surveys
Two reasons behind their widespread use:
- Almost all households had at least ONE phone.
- Use of computers (CATI - computer-assisted telephone interviewing):
  - Random digit dialling.
  - Standardized script (presents the interviewer with everything he or she will say, and they read the script). This lets you build an enormous amount of complexity into the script that not even the interviewer can screw up. For example, if you ask whether someone has been a victim of a crime, the answer sends the interviewer to a different part of the script (see the sketch below).
  - Direct data entry.

NOTE: this is the methodology used by Statistics Canada to conduct victimization surveys.
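A minimal sketch of how a CATI-style standardized script with built-in skip logic and direct data entry might be represented; the question wording, question IDs, and routing rules below are illustrative assumptions, not the actual Statistics Canada instrument.

```python
# Sketch of a CATI-style standardized script (illustrative only).
# Each question carries its exact wording plus routing rules, so the
# interviewer simply reads what is on screen and keys in the answer
# (direct data entry); the branching cannot be "screwed up" by the interviewer.

SCRIPT = {
    "Q1": {
        "text": "In the past 12 months, were you the victim of a crime? (yes/no)",
        "route": {"yes": "Q2", "no": "Q5"},          # skip logic based on the answer
    },
    "Q2": {
        "text": "Was the incident reported to the police? (yes/no)",
        "route": {"yes": "Q5", "no": "Q5"},
    },
    "Q5": {
        "text": "What is your age in years?",
        "route": {},                                  # end of this module
    },
}

def run_interview():
    """Walk the script, reading each question aloud and recording answers."""
    responses = {}
    qid = "Q1"
    while qid is not None:
        question = SCRIPT[qid]
        answer = input(question["text"] + " ").strip().lower()
        responses[qid] = answer                       # direct data entry
        qid = question["route"].get(answer)           # branch on the answer
    return responses

if __name__ == "__main__":
    print(run_interview())
```

In a real CATI system the routing table would be far larger and responses would go straight into a database, but the branching principle is the same.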
Face-to-face interviews
- A direct meeting between interviewer and interviewee.
- Some see this as the GOLD standard for survey research because of the high-quality data that can be generated.
- But it has disadvantages.

Comparing face-to-face interviews, phone surveys, and self-administered questionnaires:

                            Face to face   Phone      Questionnaire
Cost                        High           Low/med    Low/med
Response rates              High           Low        Low/med
Time to answer              Med/high       Low/med    High
Complex questions           High           High       Low/med
Educational bias            Low            Low        High
Safety (of interviewer)     High           Low        Low
Interviewer bias ***        High           High       Low

Cost
- Includes travel, expenses, etc.

Response rates
- Generally higher face-to-face; it is easier to get an answer out of someone in person.

Time to answer
- Phone interviews have to use questions that don't take long to answer, which is tied to response rates: very concise questions, because it is easier for the respondent to terminate the interview.
- Face-to-face interviews and questionnaires allow a lengthier amount of time to answer, and different kinds of questions.

Complex questions
- Handled better face-to-face and by phone because of the way the questions can be presented, as opposed to a questionnaire, where the format is very rigid.

Educational bias
- The quality and completeness of answers is closely tied to the respondent's level of education.
- With questionnaires, people who can read, write, and understand the questions correctly will give much better responses.
- Face-to-face or by phone you can get help, whereas with questionnaires you really can't.

Safety
- Whether the interviewer is at risk of harm.
*** Ways in which the interviewer affects the data
- During the interview:
  - Changing the word order of questions ("HAVE YOU EVER COMMITTED SUICIDE").
  - Adding words to questions ("you haven't committed suicide, have you?").
  - Forcing answers (calling someone a liar).
  - Recording errors (e.g. the interviewer making up answers).
- Interviewer characteristics:
  - Race/ethnicity (same race as the interviewee or different).
  - Gender (a male interviewing females about sexual assault).
  - Age (a young person asking an old person certain questions).
  - Physical appearance (clothing, tattoos).
Types of surveys based on the level of standardization/uniformity:

|-----------------------------------------|-----------------------------------------|
Unstructured                       Semi-structured                         Structured

Question structure
- Closed-ended questions ask respondents to choose among a fixed or predetermined set of answers.

  Have you used marijuana in the past year?
  a) No    b) Yes

  How often have you used marijuana in the past year?
  Never ............ 1
  1 or 2 times ..... 2
  3 or 4 times ..... 3
  5 or more times .. 4

  "5 or more times" might not give enough precision (some might use it 50-100 times a year).

- Open-ended questions ask respondents to supply their own responses and, in some cases, to record those responses themselves.

  e.g. How often have you used marijuana in the past year?
  ____________________________________________________________

Comparing closed-ended and open-ended questions:

                                      Closed   Open
Facilitates completion                Yes      No
Responses must be known in advance    Yes      No
Complex and subtle responses          No       Yes
Response set or bias                  Yes      No
Add new information (exploration)     No       Yes
Education bias                        No       Yes
Classifying/coding responses          No       Yes

Responses must be known in advance
- You don't get responses you don't anticipate.

Complex and subtle responses
- A/B/C/D choices on whether you smoked pot vs. an open-ended question where you can talk about it.

Response set or bias
- You can see from the responses that the answers don't make sense, because the respondent is getting lazy and creating a pattern.

Add new information (exploration)
- Ask the question and let respondents answer in their own words, as opposed to closed-ended a/b/c/d.

Education bias
- A person's level of education affects their answers.

Classifying/coding responses
- If you ask someone to describe in an open-ended question what work they do for pay, you then have to code all the responses into numerical values to analyze them. The responses can vary greatly by job, which can make everything very hard to code.

Contingency questions
- Questions that apply to a subgroup of respondents based on their answers to "filter" or "screener" questions.
- Filter/screener questions direct respondents to the next relevant question.

  Have you used marijuana in the past year?
  No  - skip to Q5
  Yes - go to the next question

  How many times?
  1 or 2 times ..... 1
  3 or 4 times ..... 2
  5 or more times .. 3

Matrix questions
- A set of closed-ended questions that have the same response categories.

  How afraid are you that someone will... (1 not at all, 2 moderately, 3 extremely)
  Break into your house
  Break into your car
  Steal your car
  Etc.

  Problem - response sets (see the sketch below).
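One common response-set problem in matrix questions is "straight-lining", where a respondent gives the identical answer to every item. A minimal sketch of how such cases could be flagged during analysis; the item names, example data, and flagging rule are made up for illustration.

```python
# Flag likely response sets ("straight-lining") in a matrix question.
# Item names and the flagging rule are illustrative assumptions.

FEAR_ITEMS = ["break_into_house", "break_into_car", "steal_car"]

def is_straight_liner(respondent: dict) -> bool:
    """True if the respondent gave the identical answer to every matrix item."""
    answers = [respondent[item] for item in FEAR_ITEMS]
    return len(set(answers)) == 1

respondents = [
    {"id": 1, "break_into_house": 2, "break_into_car": 1, "steal_car": 3},
    {"id": 2, "break_into_house": 2, "break_into_car": 2, "steal_car": 2},  # suspect
]

flagged = [r["id"] for r in respondents if is_straight_liner(r)]
print("Possible response sets:", flagged)  # -> Possible response sets: [2]
```

A real check would also consider how quickly the respondent answered and whether the pattern repeats across several matrices, since a uniform answer can of course be genuine.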
General guidelines for writing questions:
- Keep the wording as simple as possible (if necessary, provide definitions).
- Keep the questions short.
- Avoid double-barrelled questions - asking two questions as one (e.g. "Do you think the police do a good job of informing the public and enforcing the law?").
- Avoid hypothetical questions (poor predictors of actual behaviour/experiences).
- Don't push the limits of what people can remember ("How did you feel over the past year? Happy, sad, etc.").
- Avoid negative questions (e.g. "The police should not carry guns. Agree/disagree?").
- Response categories must be mutually exclusive.
- Response categories must be exhaustive.
- Avoid leading questions - wording that encourages the respondent to answer in a particular way (e.g. "You don't agree that the police should carry a gun, do you?").

Question placement (depends on the mode of data collection):
- Begin with non-sensitive demographic questions.
- Questions of major research interest next.
- "Sensitive" items should be last (although if what you are studying is highly sensitive, it should be moved up).
- Get as much information as you can in the major research section before people start bailing.

Question patterning:
- Funnelling sequence - moving from general to narrow.
- Inverted funnelling sequence - moving from narrow to general.

Example:
Would you say that the police in Canada do a good job, an average job, or a poor job? (scope is all of Canada)
Would you say that the police in your city do a good job? (scope is the city)
Would you say that the police in your neighbourhood do a good job? (scope is the neighbourhood)
SELF-REPORT SURVEYS (offenders)
- One of the three (3) major ways of measuring delinquent and criminal involvement.
- The basic approach is to ask individuals about disreputable behaviour.
- The first findings from self-report surveys were published in the mid-1940s.
- James Short and Ivan Nye (1957) carried out the first methodologically sophisticated survey:
  - Used scale construction, assessed measurement reliability and validity, probability sampling.
  - Multi-level scales.

Early studies focused on [two distinct but related issues]:
- Measuring the "dark figure" of crime.
- The relationship between SES and offending.

*** Early studies show that almost everyone, at some time as an adult, has committed a very serious act.
*** The early studies also indicated that the class/crime relationship doesn't exist.

Limitations:
- The "domain problem" (narrow scope).
- Inability to measure "chronic" and "career" offending (scale problems).

Studies from the 1970s onwards have measured other areas of an individual's life (etiological factors - the study of causal connections), which permits theory testing and development.

Assessing the reliability and validity of self-reports

Measurement reliability: a measurement strategy yields the same results on repeated trials.
- Internal consistency of items (a more complex version of splitting techniques).
- But should we expect the items to correlate?
- The test-retest method involves collecting panel data by re-asking the same questions at some later time.
- Test results indicate that the major self-report measures have acceptable reliability.

Example items: Q1 drink alcohol, Q2 smoke cigarettes, Q3 drop acid (LSD), Q4 use inhalants.
- Split-half technique: split the items into two halves to see if they are consistent (if respondents score high on one half, they should score high on the other).
- Internal consistency: see how each question relates to every other question. You expect high correlations, but you don't always get them (see the sketch below).

  Inter-item correlation matrix (to be filled in):
        Q1    Q2    Q3    Q4
  Q1    ---   ---   ---   ---
  Q2    ---   ---   ---   ---
  Q3    ---   ---   ---   ---
  Q4    ---   ---   ---   ---
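A minimal sketch of what the split-half check and the inter-item correlation matrix look like in practice; the response data below are invented for illustration, and numpy is assumed to be available.

```python
# Split-half reliability and inter-item correlations for four substance-use items.
# Each row is a respondent, each column an item (Q1 alcohol, Q2 cigarettes,
# Q3 LSD, Q4 inhalants), scored here on an invented 1-4 frequency scale.
import numpy as np

responses = np.array([
    [4, 4, 2, 1],
    [1, 1, 1, 1],
    [3, 4, 1, 2],
    [2, 2, 1, 1],
    [4, 3, 3, 2],
    [1, 2, 1, 1],
])

# Split-half technique: score half the items (Q1, Q3) against the other half
# (Q2, Q4); if the scale is reliable, the two half-scores should correlate highly.
half_a = responses[:, [0, 2]].sum(axis=1)
half_b = responses[:, [1, 3]].sum(axis=1)
split_half_r = np.corrcoef(half_a, half_b)[0, 1]
print(f"Split-half correlation: {split_half_r:.2f}")

# Internal consistency: the full inter-item correlation matrix
# (how each question relates to every other question).
print(np.round(np.corrcoef(responses, rowvar=False), 2))
```

Test-retest reliability would be checked the same way, except that each respondent's scale score at time 1 is correlated with their score at time 2, rather than correlating two halves of a single administration.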
Measurement validity: a measurement strategy accurately measures what it is intended to measure.
- Recall that you need a nominal definition (of delinquency/crime/deviance).
- Content validity - less of an issue with more recent measures.
- Construct validity - measures are shown to correlate with socio-demographic variables in theoretically expected ways.
- Criterion validity - comparing a respondent's self-reported delinquency against external criteria:
  - self-report measures vs. official records
  - self-report measures vs. reports by friends/classmates
  - self-reported substance use vs. blood/urine/saliva tests
- *** There appears to be substantial underreporting; people don't always tell the truth.
- Test results indicate that the major self-report measures have acceptable validity/accuracy.

Major concerns with self-report surveys:
- PEOPLE UNWILLING TO REPORT CRIMINAL BEHAVIOUR.
- Are there method effects?
  - Self-administered / face-to-face / telephone.
  - Anonymous / non-anonymous.
  - Generally the available research findings are unclear.

Special methods for collecting sensitive (illegal and embarrassing) information:
- Randomized response technique (see the sketch below).
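The randomized response technique protects respondents by letting a chance device decide whether they answer the sensitive question truthfully or give a forced answer, so no individual "yes" is incriminating, yet the true prevalence can still be recovered in aggregate. A minimal sketch of the forced-response variant; the design probabilities and simulated prevalence are illustrative assumptions.

```python
# Forced-response randomized response technique (illustrative sketch).
# Each respondent privately rolls a die: on 1-4 they answer the sensitive
# question truthfully, on 5 they say "yes" regardless, on 6 they say "no".
# Then P(yes) = p_truth * pi + p_forced_yes, so pi is estimable in aggregate.
import random

P_TRUTH, P_FORCED_YES = 4 / 6, 1 / 6   # design probabilities (assumed)
TRUE_PREVALENCE = 0.30                  # unknown in practice; used only to simulate

def simulate_answers(n: int) -> float:
    """Simulate n respondents and return the observed 'yes' rate."""
    yes = 0
    for _ in range(n):
        roll = random.randint(1, 6)
        if roll <= 4:                              # answer truthfully
            yes += random.random() < TRUE_PREVALENCE
        elif roll == 5:                            # forced "yes"
            yes += 1
        # roll == 6: forced "no", contributes nothing
    return yes / n

observed_yes = simulate_answers(20_000)
estimated_prevalence = (observed_yes - P_FORCED_YES) / P_TRUTH
print(f"Observed yes rate: {observed_yes:.3f}")
print(f"Estimated prevalence of the sensitive behaviour: {estimated_prevalence:.3f}")
```

Because only the aggregate "yes" rate is used, no one can tell whether any particular respondent's "yes" was truthful or forced, which is what makes people more willing to answer honestly.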
Computer-assisted self-administered interviewing (CASI)
- Essentially, the respondent reads the questions and enters responses into the computer.
- You have to convince them that it will be completely confidential.

Victimization surveys
- The first major victim surveys were undertaken in the late 1960s in the US; they are now carried out in many countries, including Canada.
  - 1966-1967: large-scale victim surveys begin in the US.
  - Canadian Urban Victimization Survey, 1981 (an URBAN survey; all participants lived in 1 of 8 cities in Canada).
  - General Social Survey: 1988, 1993, 1999, 2004 (5-year increments).
  - Violence Against Women Survey, 1993.
  - ** All done by Statistics Canada, by telephone (CATI); the territories are excluded.
- Like self-report offender surveys, victim surveys attempt to measure crime events both reported and not reported to the police (the dark figure).
- Historically, they have provided more detailed information (about the victim, the offender, and the circumstances surrounding the event) than UCR data - but this has changed since the late 1980s.
  - UCR data are criticized for giving very limited information about crime (numbers of homicides, charges, males/females, etc.) and are monthly totals (aggregate data); you can't tell whether the same people committed more than one offence. Very little information relating to the offender.
  - Victims are now asked more (age of the offender, whether the people involved were drinking, circumstances of the situation, etc.).

Victim surveys have led to the creation of new victim-centred theories (routine activities, lifestyle, opportunity, rational choice, repeat victimization).
