Example: Are the 2010 Winter Olympics good for Vancouver?
“Good” in what sense?
“Olympics” – does this include the Paralympics?
“Vancouver” – the people of Vancouver? Greater Vancouver? British Columbia? Canada?
Is the meaning/interpretation affected by or dependent upon previous questions? (i.e. if all the questions before this one dealt with the homelessness problem in Vancouver, would they affect the answer?)
Social and conversational norms (e.g. social desirability: many people will not disclose things that they think would make them look bad)
Confidentiality and anonymity (help you get a more accurate answer without the respondent worrying about social desirability; assurances such as a consent form)
TYPES OF SURVEYS
Distinction based on the mode of surveying (how it is carried out):
Self-administered – you are given the survey and complete it yourself
Interviews – the surveyor asks you questions and writes down your answers
Half – the interviewer asks the questions but you write down the answers

Self-administered |-----------------------|-------------------------| Interviews
A standardized script (the interviewer is given everything he or she will say and reads the script verbatim) lets you build an enormous amount of complexity into the script that not even the interviewer can screw up. For example, if you’re asking whether someone has been a victim of a crime, the answer routes the interviewer to a different part of the script.
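The routing described above can be sketched in code. This is a minimal, hypothetical example (the question IDs and wording are invented): a screener answer determines which branch of the script is read next.

```python
# Hypothetical sketch of standardized-script skip logic: the screener
# answer determines which branch of the script the interviewer reads next.

def next_question(was_victim: bool) -> str:
    """Return the ID of the next question given the screener answer."""
    if was_victim:
        return "V1"   # follow-up branch about the victimization
    return "D1"       # skip ahead to the demographics branch

# Invented script fragment keyed by question ID.
QUESTIONS = {
    "V1": "What type of crime was it?",
    "D1": "What is your year of birth?",
}

print(QUESTIONS[next_question(True)])  # reads out the victimization follow-up
```

Because the branching lives in the script rather than in the interviewer's judgment, every respondent who answers the screener the same way gets exactly the same follow-up.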
Comparing face-to-face interviews, phone surveys, and self-administered questionnaires

                        Face to face   Phone      Questionnaire
Cost                    High           Low/med    Low/med
Response rates          High           Low        Low/med
Time to answer          Med/high       Low/med    High
Complex questions       High           High       Low/med
Educational bias        Low            Low        High
Safety (interviewer)    High           Low        Low
Interviewer bias ****   High           High       Low

Cost
Includes travelling, interviewer experience, etc.
Response rates
Generally higher face to face; it’s easier to get an answer out of someone in person
Time to answer
Phone interviews have to ask questions that shouldn’t take a lot of time; this is tied to response rates. Questions must be very concise (because it’s easier for the respondent to terminate the interview)
Face to face and questionnaire formats allow a lengthier amount of time to answer, and different kinds of questions
Complex questions
Higher face to face and by phone because of the way the questions can be formatted, as opposed to a questionnaire where the format is very strict
Educational bias
Quality and completeness of answers is closely tied to the level of education the person has
With a questionnaire, the people who can read, write, and understand correctly will give much better responses
Face to face or by phone you can get help, whereas with questionnaires you really can’t
Safety
Whether the interviewer is at risk of harm
*** Ways in which the interviewer affects the data
During the interview
Changing the word order of questions (HAVE YOU EVER COMMITTED SUICIDE)
Adding words to questions (“You haven’t committed suicide, have you?”)
Types of surveys based on level of standardization/uniformity

Unstructured |-----------------------------------------|-----------------------------------------| Structured
(semi-structured in the middle)

Question structure
Closed-ended questions ask respondents to choose among a fixed or predetermined set of answers
Have you used marijuana in the past year?
How often have you used marijuana in the past year?
Never ............... 1
1 or 2 times ........ 2
3 or 4 times ........ 3
5 or more times ..... 4
“5 or more” might not give enough precision (some might use it 50-100 times a year)
Open-ended questions ask respondents to supply their own responses, and in some cases to record the responses themselves
e.g. How often have you used marijuana in the past year?
____________________________________________________________

Comparing closed-ended and open-ended questions

                                      Closed   Open
Facilitates completion                Yes      No
Responses must be known in advance    Yes      No
Complex and subtle responses          No       Yes
Response set or bias                  Yes      No
Add new information (exploration)     No       Yes
Educational bias                      No       Yes
Classifying/coding responses needed   No       Yes

Responses must be known in advance
You don’t get responses you didn’t anticipate
Complex and subtle responses
Choosing A, B, C, or D on whether you smoked pot vs. an open-ended question where you can talk about it
Response set or bias
Based on the responses you can see that the answers don’t make sense, because respondents are getting lazy and answering in a pattern
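One way such patterned answers can be caught is mechanically: flag respondents who give the identical answer to every item in a battery (“straight-lining”). A minimal sketch, with an invented function name:

```python
# Flag a possible response set: every answer in the battery is identical.
def is_straight_lined(answers):
    """True if the respondent gave the same answer to every item."""
    return len(set(answers)) == 1

print(is_straight_lined([3, 3, 3, 3]))  # True: likely a lazy pattern
print(is_straight_lined([1, 3, 2, 3]))  # False: varied answers
```

A flagged case is not proof of laziness, only a signal that the respondent's answers deserve a closer look.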
Add new information (exploration)
Ask the question and let respondents answer in their own words, as opposed to closed-ended A/B/C/D options
A person’s level of education affects answers
If you ask someone to describe in an open-ended question what work they do for pay, you are then required to code all the responses into numerical values in order to analyze the numbers. The responses can vary greatly by job, which can make it really hard to code everything
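The coding step can be sketched as mapping each free-text answer to a numeric code, with a residual code for anything unanticipated. The codebook below is invented for illustration; real occupational codebooks are far larger.

```python
# Hypothetical codebook mapping open-ended occupation answers to numbers.
CODEBOOK = {
    "teacher": 1,
    "nurse": 2,
    "carpenter": 3,
}
OTHER = 99  # residual code for answers not covered by the codebook

def code_response(text: str) -> int:
    """Normalize an open-ended answer and look up its numeric code."""
    return CODEBOOK.get(text.strip().lower(), OTHER)

print([code_response(r) for r in ["Teacher", " nurse ", "stunt double"]])
# [1, 2, 99]
```

The residual code is exactly where open-ended coding gets hard: every unexpected answer either needs a new category or gets lumped into "other", losing information either way.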
Questions that apply to a subgroup of respondents based on answers to “filter or screener” questions
Filter/screener questions direct respondents to the next relevant question
A set of closed-ended questions that share the same response categories
How afraid are you that someone will...
(1 = not at all, 2 = moderately, 3 = extremely)
Break into your house
Break into your car
Steal your car
Etc.
Problem: response sets

General guidelines for writing questions
Keep the wording as simple as possible (if necessary provide definitions)
“Sensitive” items should come last (although if what you’re studying is itself highly sensitive, they should be moved up)
Get as much information as you can in the major research section before people start bailing

Question patterning
Funnelling sequence: moving from general to narrow
Inverted funnelling sequence: moving from narrow to general
Example (funnelling):
Would you say that the police in Canada do a good job, an average job, or a poor job? (scope: all of Canada)
Would you say that police in your city do a good job? (scope: the city)
Would you say that police in your neighbourhood do a good job? (scope: the neighbourhood)

SELF REPORT SURVEYS (offenders)
One of the three (3) major ways of measuring delinquent and criminal involvement
Basic approach is to ask individuals about disreputable behaviour
The first findings of self report surveys were published in the mid-1940s
James Short and Ivan Nye (1957) carried out the first methodologically sophisticated survey
Used scale construction, assessed measurement reliability and validity, probability sampling
***Early studies show that almost everyone has, as an adult, at some time committed a very serious act
***The early studies also suggested that the assumed class/crime relationship doesn’t exist

Limitations
The “domain problem” (narrows scope)
Inability to measure ‘chronic and career’ offending (scale problems)
Studies from the 1970s onwards have measured other areas of an individual’s life (etiological factors, i.e. the study of causal connections), which permits theory testing/development

Assessing reliability and validity of self reports
Measurement reliability : measurement strategy yielding the same results on repeated trials
Internal consistency of items (more complex version of splitting techniques)
Test-retest method involves collecting panel data but re-asking the same questions at some later time
Test results indicate the major self report measures have acceptable reliability

Example items: Q1 drink alcohol, Q2 smoke cigarettes, Q3 drop acid (LSD), Q4 use inhalants
Split-half technique: split the items into two halves to see if they are consistent (if a respondent scores high on one half, they should score high on the other half)
Internal consistency: see how each question relates to every other question; expect high correlations, but that doesn’t always hold

        Q1    Q2    Q3    Q4
Q1      --    --    --    --
Q2      --    --    --    --
Q3      --    --    --    --
Q4      --    --    --    --
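The split-half technique can be sketched numerically: sum each respondent's odd items and even items, then correlate the two half-scores. The respondent data below are invented for illustration; a high correlation between the halves suggests the items measure the same construct.

```python
# Split-half reliability sketch: correlate two halves of a 4-item scale.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Each row: one respondent's answers to Q1..Q4 (invented frequency codes).
data = [
    [1, 1, 0, 0],
    [3, 2, 2, 3],
    [0, 1, 1, 0],
    [4, 4, 3, 4],
]
half_a = [row[0] + row[2] for row in data]  # odd items:  Q1 + Q3
half_b = [row[1] + row[3] for row in data]  # even items: Q2 + Q4
print(round(pearson_r(half_a, half_b), 2))  # high, near 1: consistent halves
```

The same `pearson_r` helper applied to pairs of individual items would fill in the internal-consistency matrix sketched above, one correlation per cell.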
Measurement validity : a measurement strategy that accurately measures what it is intended to measure
Recall you need a nominal definition (delinquency/crime/deviance)
Content Validity – less of an issue with more recent measures
Construct validity – measures shown to correlate with socio-demographic variables in theoretically expected ways
Criterion validity – comparing the measure against an external criterion (e.g. a respondent’s delinquency measured another way)
self report delinquency measures – compared with reports by friends/classmates
self report substance use – compared with blood/urine/saliva tests
***There appears to be substantial underreporting; people don’t always tell the truth
Test results indicate the major self report measures have acceptable validity/accuracy

Major concerns with self report surveys
PEOPLE UNWILLING TO REPORT CRIMINAL BEHAVIOUR
Are there method effects?
**All done by Statistics Canada, by telephone (CATI, computer-assisted telephone interviewing); excludes the territories
Like self report offender surveys, victim surveys attempt to measure crime events reported and not reported to the police (dark figure)
Historically, have provided more detailed info (info about victim, offender, circumstances surrounding event) than UCR data – but this has changed since late 1980s
UCR data are criticized for giving very limited information about crime (# of homicides, charges, males/females, etc.) as monthly totals (aggregate data); you don’t know whether the same people committed more than one offence. Very little information relates to the offender
Victim surveys now ask more (age of the offender, whether the people involved were drinking, the circumstances of the situation, etc.)
Victim surveys have led to creation of new victim-centered theories (routine activities, lifestyle, opportunity, rational choice, repeat victimization)