Before we move on to today’s topic of “interviewing techniques,” I want to digress to last month’s topic of “questionnaire design.” I hope that most of you heard last month’s session. If you did not, I strongly recommend reviewing the September slides, since questionnaire design and the success of your data collection efforts are inextricably linked. The staff in your Office of Epidemiology have asked me to let you know about a relatively new CDC software application that will be applied for the first time in a multi-state outbreak field exercise later this month. The application is called the “Outbreak Management System,” or “OMS.” You can use the email address or toll free number posted at the bottom of this slide to request training or technical assistance. Let’s quickly look at how OMS will be relevant to your outbreak investigation questionnaire development efforts. . .
The broad context of this seven-part series is outbreak investigation methods. The Outbreak Management System was designed to facilitate the outbreak investigation process, and a significant element of the system is core question sets for collecting multiple categories of data such as: demographics; risk factors; clinical, treatment, and laboratory information; contact tracing; and traceback information. You can also create household, social, or occupational relationships among records. And you will be able to run OMS on a desktop or laptop, so you will have the flexibility to take an electronic questionnaire into the field with you. [You will learn about Computer Assisted Personal Interviews – or “CAPI” – in today’s lecture.]
I want to very quickly show you two images of what the OMS user interface looks like. In this first image, I have placed a red box around the user-friendly tabs that let you navigate among the diverse types of information that the OMS allows you to store and relate.
Here you can see a directory of question sets from which you can choose, and the specific questions appear in the bottom right corner of the screen so you can preview them. In case the information on this slide is illegible, the question set categories include: Risk Exposure; Clinical Information; Food and Beverage; and Recreation & Vectors. This comprehensive question set library component is only one of many features of the OMS. For example, the system will also allow you to perform data analysis and generate reports.
If you want to inquire about how OMS will be incorporated into Virginia’s Emergency Preparedness and Response and Epidemiology operations, please contact Michael Coletta, your state Bioterrorism Surveillance Coordinator. Thank you. And now, let’s get back to the topic of the day. . .
Upon completion of this session, you will:
- Recognize the interrelatedness of interview techniques and questionnaire design
- Understand key survey research terms
- Understand the advantages and disadvantages of face-to-face, telephone, and computer assisted interview methods
- Understand the advantages and disadvantages of mail and web based survey implementation
- Know what to cover in interviewer training
- Recognize good interview techniques
- Understand confidentiality concerns from the perspective of the respondent and the outbreak investigator
Before beginning this session, let’s review the basic steps of an outbreak investigation. Remember that in reality, some of these steps may occur simultaneously or in a different order. First, we must verify the diagnosis and confirm that there is actually an outbreak occurring. Next we need to create a preliminary case definition and conduct active case finding. Once we have information from some cases we need to compile and review it. Then we implement preliminary control measures. After that we formulate and test a hypothesis, and plan and execute additional studies based on the preliminary results. Finally, we implement and evaluate control measures and communicate our findings. Today we will focus on steps 2, 3, and 5 above. The formulation and testing of hypotheses are actually two distinct steps. We formulate a hypothesis based on the clinical, laboratory and epidemiological data that is available to us. Hypothesis generating interviews tend to be loosely structured information gathering interviews with persons who are ill. In the hypothesis testing phase of the outbreak investigation, we conduct structured interviews with predefined questionnaires talking to both persons who are ill and those who are not ill. This PHTIN session will focus on interviewing techniques used primarily for hypothesis testing during an outbreak investigation.
Before we dive into the details of how to conduct interviews during outbreak investigations, I want to begin by first covering some broader yet related concepts. They are:
- The role of interviews in outbreak investigations
- Types of interviewing methods
- The interrelatedness of interview method and questionnaire design
- Key survey research concepts such as sampling and response rates
The primary purpose of an interview is to collect data. In an outbreak investigation, interviews can identify cases and risk factors and assist with generating hypotheses.
To review, these are the various methods for conducting an interview. During an outbreak investigation, you would probably use an interviewer administered method such as a face-to-face or telephone interview. However, you may find situations where one of the self-administered methods is more appropriate. In this presentation we will focus mainly on “interviewer administered” methods: face-to-face and telephone interviews.
Your choice of interviewing method, be it a face-to-face or telephone survey, will also have an impact on the design of your questionnaire or survey instrument. The two components of interviews are interrelated. The length and format of your questionnaire, the types of questions you use, and cost considerations can impact the interview method. In last month’s session titled, “Questionnaire Design,” we discussed questionnaire and question types and formats in detail. If you missed last month’s session, you can access the archived slides at the same Virginia Department of Health Training Web site on which today’s slides are posted.
Before we begin discussing the advantages and disadvantages of various interviewing methods, you will need to be familiar with some survey research terms that will be used in today’s session.
Sampling is the systematic selection of a portion or part of the larger source population. One important feature of a sample is that it should be representative of the larger source population.
The source population is the larger population of interest. This diagram shows the relationship between the source population, 12,000 students, and the randomly sampled population of 150 students. Each dot is supposed to represent 1 student, but unfortunately we could not fit 150 dots on the circle!
One question you may be asking yourself is why should we sample? The answer is simply because it is more efficient. It can save time and money! For large outbreak investigations, we generally want to use sampling. That is, we will interview only a sample, rather than all, of our cases to conduct a case-control study. For smaller outbreaks, we could potentially use all or most of the cases in a case-control study; thus we would not need to sample a portion of the cases. However, even in small case-control studies you will need to sample controls from the source population. You may remember from the “Study Design” session in this series that in a cohort study, we compare the disease experience of persons who were exposed to a risk factor, such as a contaminated food item, to persons who were not exposed to that risk factor. To conduct a cohort study of an outbreak you must be able to identify every person in the cohort. This is possible if the group of people at risk of disease is small and well-defined, like at a wedding reception, on a cruise ship, in a school, or in prison. You generally have the option of interviewing all members of the cohort in such circumstances. But if the cohort is prohibitively large, then you would interview only a representative sample of the cohort.
The question of how many cases and controls to interview comes up often. Besides resource availability concerns, another important factor in deciding your sample size is the purpose of your study. In large outbreaks, you can interview a surprisingly small number of cases to identify risk factors. For example, an outbreak of Salmonella in Illinois ultimately yielded 16,000 culture-confirmed cases. Can you guess how many of these 16,000 cases were interviewed? Only 32 cases and 32 controls were used to conduct a small case-control study, which correctly identified milk as the source of the outbreak. If your purpose is to determine the attack rate of infection as opposed to identifying the source of the outbreak, you need a much larger sample. An attack rate is defined as the number of persons who become ill divided by the number of persons at risk during a specific period of time, usually due to a specific exposure. In this situation, you might consider a cohort study (as opposed to a case-control study). When the population is large, the rule of thumb is to sample about 10% of the population at risk.
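To make the arithmetic concrete, here is a minimal sketch in Python of the attack rate calculation and the 10% rule of thumb just described. The counts are hypothetical, not from a real outbreak:

```python
def attack_rate(n_ill, n_at_risk):
    """Attack rate = persons who became ill / persons at risk."""
    return n_ill / n_at_risk

# Hypothetical example: 120 of 400 wedding guests became ill.
print(f"Attack rate: {attack_rate(120, 400):.0%}")                 # 30%

# Rule of thumb for a large population at risk: sample about 10%.
population_at_risk = 12_000
print(f"Suggested sample size: {int(population_at_risk * 0.10)}")  # 1200
```

The same two-line calculation applies whatever the real counts turn out to be in your investigation.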
There are many ways to select a sample. Most of you have probably heard of Simple Random Sampling. It is exactly what the name implies. You select persons at random to participate in a study. Imagine you are conducting a case-control study to investigate an outbreak on a cruise ship. There are 250 passengers on the ship who did not become ill with Salmonella. How would you go about selecting your controls? You could put the names of your 250 persons into a hat and choose 30 (or however many controls you need) at random. There are many variations on the theme of SRS. One could do a systematic sample, a stratified sample, or a cluster sample. Each of these methods describes a different way of selecting your sample, but each is to some degree a random selection. Another type of sampling is a convenience sample. A convenience sample can be simple and inexpensive, but for many reasons unadvisable when conducting an outbreak investigation.
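The hat-drawing approach to simple random sampling can be sketched in a few lines of Python. The passenger labels below are made up for illustration:

```python
import random

# Hypothetical list of the 250 passengers who did not become ill.
passengers = [f"Passenger {i}" for i in range(1, 251)]

# Draw 30 controls at random, without replacement -- the electronic
# equivalent of pulling 30 names out of a hat.
rng = random.Random(42)   # fixed seed so the draw is reproducible
controls = rng.sample(passengers, k=30)

print(len(controls))        # 30
print(len(set(controls)))   # 30 -- no passenger is selected twice
```

Systematic, stratified, and cluster sampling differ in how the draw is organized, but each ultimately relies on the same kind of random selection.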
Convenience samples are based on subjective judgment; the investigator decides whom to include in the study. Therefore, respondents may not be representative of the population of interest. In addition, selecting persons because they are easily accessible may end up biasing the sample toward some demographic characteristic or exposure, which would ultimately bias your study results. Let’s go back to the cruise ship example. If you select your controls from persons lounging at the pool (they are probably not sick if they are lounging at the pool), you may find that your controls are younger than the rest of your source population. That means they are not representative of the rest of the population. The fact that the pool-goers are an easily accessible group all in one place is not a good reason to select them. Rather, you should choose your sample randomly.
If you want to learn more about the topic of “Sampling”: Here are three 1-hour, online, self-instructional trainings available through the North Carolina Center for Public Health Preparedness Training web site. Each training is free of charge, and you can also apply for free continuing education credits.
A response rate measures the percentage of your sample that has participated in your survey. For example, using the campus directory, you email a survey to a random sample of 100 freshmen. Forty of those students complete the survey and return it electronically. Your response rate is 40%.
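The calculation is simply completed surveys divided by the number sampled. As a quick sketch in Python:

```python
def response_rate(completed, sampled):
    """Percentage of the sample that completed the survey."""
    return completed / sampled * 100

# The freshman survey example: 100 students emailed, 40 responded.
print(f"Response rate: {response_rate(40, 100):.0f}%")   # 40%
```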
If your sample is representative of the source population, then a high response rate helps ensure that survey data are representative of that population, and so your results will be valid. Survey data are valid when they measure what the survey sets out to measure. Those who do not respond to your survey, non-respondents, may have very different characteristics or health status compared to those who do respond. This means a low response rate can make the interpretation of results difficult. That is, we would be unsure if the results from a survey with a low response rate are valid. If, on the other hand, you get a high response rate, you can be confident that your data reflect the distribution of the health outcome or demographic characteristics within the source population. Again returning to the cruise ship example, suppose you randomly select 50 people but only 30 respond. The first question on your mind is: why didn’t those 20 people respond? If their non-response is associated with either disease status or exposure status, your results will be biased. That is, if those 20 people all ate the contaminated food and were sick, your results would be an underestimate of the true association between exposure and disease.
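The cruise-ship numbers above can be turned into a small worked example. In the sketch below, each sampled passenger is a hypothetical (exposed, ill) record, and the 20 non-respondents are assumed to all be exposed and ill, as in the scenario just described:

```python
# Hypothetical records for the 30 respondents: (exposed, ill).
respondents = [(True, True)] * 10 + [(True, False)] * 5 + [(False, False)] * 15

# The 20 non-respondents all ate the contaminated food and were sick.
non_respondents = [(True, True)] * 20

def ill_fraction(records):
    """Fraction of the records marked ill."""
    return sum(1 for _, ill in records if ill) / len(records)

print(f"Ill among respondents only: {ill_fraction(respondents):.0%}")                     # 33%
print(f"Ill in the full sample: {ill_fraction(respondents + non_respondents):.0%}")       # 60%
```

Looking only at respondents would cut the apparent illness burden nearly in half, which is exactly the kind of bias a low response rate can introduce.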
Let’s look at some reasons why people do not respond to surveys. There are three types of non-response: non-contact, refusal, and inability to participate. Each type of non-response potentially biases the outcome of your survey in a different way. People who are not at home might have different demographic or health characteristics from those who refuse to participate and from those who are unable to participate.
Survey methodology research shows that most face-to-face surveys get response rates between 65 and 75%. Most telephone survey response rates are between 60 and 70%, while mailed surveys get the lowest response rates of between 56% and 68%. Keep in mind that these are for studies that do not have the same time pressure that an outbreak investigation would impose. In addition, most of these studies attempt to minimize non-response through a variety of time consuming and expensive methods such as sending letters to alert respondents of the survey, providing incentives for participation, and calling respondents back many times over the course of several weeks. The rates cited above may have been reached after follow up efforts and repeated contacts; the initial response rate for many surveys may be less than 50%.
Determining a response rate can get complicated in some situations. For example, if you are doing a telephone survey and you reach a business instead of a home, do you include that business as a non-respondent, or should it be excluded from the calculation’s denominator altogether? AAPOR, the American Association of Public Opinion Research, answers this question and many more in a document published to provide standard definitions of how to calculate response rates. Here is the web address if you are interested in downloading the publication.
In this next portion of the session, I will be discussing the advantages and disadvantages of face-to-face and telephone interviews. I will also talk about the use of new technology in the data collection process: this is referred to as “Computer Assisted Interviewing”.
Face-to-face interviews tend to have higher response rates because an interviewer standing in your home is more difficult to ignore than a piece of mail or someone on the telephone. Face-to-face interviews can be longer with more complex skip patterns because a trained interviewer will be administering the questionnaire. Because a trained interviewer conducts face-to-face interviews, responses are often recorded more accurately, and, more specifically, fewer questions are left without a response (this is known as item non-response). Finally, face-to-face surveys are appropriate for populations that are hard to reach by phone or mail, such as people who are illiterate, homeless, institutionalized, or non-English speaking.
However, face-to-face surveys can be costly. And there is potential for interviewer error. For example, since the interviewer usually knows the disease status of the participant, he or she may use a certain tone of voice or specific body language to indicate what he/she wants the respondent to say. In addition, there are generally multiple interviewers conducting interviews during an outbreak investigation, each with a different way of interacting with respondents. Face-to-face surveys are less anonymous than self-administered questionnaires. This lack of anonymity may motivate persons to under-report socially undesirable behavior and over-report socially desirable behavior. For example, in a face-to-face interview a participant may be embarrassed to admit his daily alcohol intake, or be afraid to admit using illicit drugs. On the other hand, if a question is about diet, a person may over-report his vegetable consumption to appear to have a healthier diet. Please note that this lack of anonymity can apply to almost every other type of data collection method when personal identifiers are obtained, but the perception is often greater when the interviewer and participant meet face-to-face.
The advantages of phone interviews include: they are less costly than face-to-face interviews and have higher response rates than self-administered, mailed surveys. Access to participants is usually faster for phone interviews, and interviewers can be supervised to reduce interviewer error. More sensitive information can also be collected, and the design of the survey may end up being more efficient.
The disadvantages of telephone interviewing are lower response rates than face-to-face interviewing; questionnaires have to be shorter; and the interviewer is unable to capture important visual information such as a rash on a respondent or poor working conditions. In addition, there may be under-coverage among certain populations. According to the 2000 U.S. Census, 2.1% of households in Virginia (57,656 households) do not have telephones in their homes.
Technological advances have allowed for improvements in data collection. Computer assisted interviewing is used in most large social science surveys. Interviewers, either in person or on the telephone, directly input responses into a laptop or desktop computer. With CAI, the interviewer’s computer screen is programmed to show questions in the correct order; interviewers cannot inadvertently omit questions (e.g., questions that may be part of a skip pattern), or ask questions out of sequence. Therefore, the use of computers eliminates much data entry error and speeds up survey processing. Audio CASI (computer assisted self-interviewing) is used for sensitive questions: a respondent listens to questions through headphones and keys his or her responses directly into the computer. It is unlikely that you will use computer assisted interviewing in an outbreak investigation because it requires up-front time to program computers with the survey instrument, and to test the system to ensure that there are no mistakes. However, I DO want to highlight a successful example of the use of the CAPI method combined with Epi Info database management and analysis in the state of North Carolina.
The North Carolina Division of Public Health’s Office of Public Health Preparedness and Response has seven “Public Health Regional Surveillance Teams” – pronounced “First” Teams—across the state. Through a special project funding grant from the Centers for Disease Control and Prevention, each of the seven regions now has a laptop and up to seven PDAs—or handheld computers—loaded with Geographic Information Systems and relational database software.
Health department personnel can use the PDAs to conduct Computer Assisted Personal Interviews in the event of an outbreak, natural disaster, or other emergency. Data saved in the PDAs are then downloaded into a master database on a docking cart. That database is then analyzed in Epi Info software, where maps, graphs, and reports can be generated. Over the past two years, North Carolina public health officials successfully used this technology to conduct door-to-door rapid needs assessments after hurricanes damaged the state’s coastal communities.
You might be wondering how PHRST Team staff determine whom, along a hurricane-stricken coastline, to interview. The sample selection process is very scientific. It is based on CDC and WHO methodology, and involves two-stage random sampling:
- Selecting a sample area that is identified based on the storm path, damage reports, and service areas;
- Randomly selecting population-weighted geographic clusters in the designated sample area; and finally,
- Randomly selecting households within each geographic cluster.
In this image, the “census sample clusters” are shaded in grey. Each red dot represents a predetermined interview site.
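A rough sketch of the two-stage idea is shown below, with made-up cluster names, populations, and household lists. This is a simplification for illustration, not the PHRST teams' actual software or the full CDC/WHO protocol:

```python
import random

# Hypothetical clusters in the designated sample area, with populations.
cluster_populations = {"Cluster A": 1200, "Cluster B": 300,
                       "Cluster C": 2500, "Cluster D": 800}

# Toy household lists (20 per cluster) standing in for real address data.
households = {c: [f"{c}, household {i}" for i in range(1, 21)]
              for c in cluster_populations}

rng = random.Random(7)

# Stage 1: select clusters with probability proportional to population.
names = list(cluster_populations)
weights = [cluster_populations[c] for c in names]
chosen = rng.choices(names, weights=weights, k=2)

# Stage 2: simple random sample of households within each chosen cluster.
interview_sites = {c: rng.sample(households[c], k=5) for c in set(chosen)}
for cluster, sites in interview_sites.items():
    print(cluster, "->", sites)
```

Weighting stage 1 by population gives households in large and small clusters a more comparable chance of selection than picking clusters uniformly would.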
Here is an image of the iPAQ PDAs used in North Carolina. You can see two separate elements of the software capabilities pictured on this slide: 1) on the left, the GIS element that directs interview teams to their interview sites; and 2) on the right, the standardized survey instrument, with pre-programmed fields for data entry, that will later be saved in a master database for analysis. Dr. Mark Smith, Director of the Health Surveillance and Analysis Unit in the Guilford County Health Department, is pictured here. Those of you who attended the May session titled, “Epidemiology Specialties Applied” may recognize him.
This new technology has proven to be an asset to North Carolina public health professionals. But, like so many things, it has its “pros” and “cons.” So let’s quickly consider the “pros” and “cons” of not only this CAPI technology for face-to-face interviews in the field, but also the traditional “clipboard and pencil” mode of recording interview data.

The traditional, paper-based mode of recording interview data is certainly inexpensive and requires no special skills, beyond reading and writing, on the part of the interviewer. Conversely, this CAPI technology is expensive (although PDAs are inexpensive relative to tablet PCs or laptops), and not everyone is technologically savvy. So there is an inherent training element for the use of the PDAs. For example, the interviewer using a PDA must not only be able to access questionnaire screens and enter and save data; he or she must also be able to activate the Global Positioning System receiver and troubleshoot software glitches in the field.

The small size and portability of PDAs make them a desirable tool to carry into the field. However, the very small screen size can sometimes pose a challenge to interviewers, who may have to scroll through several screens to read all possible response options for one question to a respondent.

One of the primary “cons” of the paper-based mode of recording interview data is that it requires double data entry: first, manual recording in the field; then, keyboard entry of data into a database. This process increases the likelihood of data errors. The CAPI method not only decreases the likelihood of data errors by eliminating double data entry; one-time data entry also saves valuable staff time.

In terms of the future of this interviewing technology in North Carolina, its application has been expanded to use in community assessments.
After you hear our guest lecturer Erin Rothney speak, you will have a better understanding of community assessments and how CAPI interviewing could be beneficial.
If you want to learn more about how this technology has or is being used in North Carolina, please contact Steve Ramsey at the Guilford County Health Department in North Carolina. You can access the complete set of slides after today’s session to copy this contact information.
In this next portion of the session, I will be discussing the following self-administered data collection considerations: Advantages and disadvantages of mailed questionnaires Advantages and disadvantages of Web-based questionnaires
Advantages of mail questionnaires are that they are more anonymous, may collect more honest responses, and there is no interviewer error. Mailed questionnaires also tend to be less expensive, and respondents have more time to think about responses to the questions versus when they are in a face-to-face interview setting. Mailed surveys may work best when they target specific groups such as a membership list for a professional organization or all clients seen in one medical facility over the past calendar year. But because of the obvious delay in time for mail to be delivered to respondents and then returned to you, you will probably not use this interviewing method for an outbreak investigation. Let’s briefly look at some other disadvantages of mailed surveys on the next slide. . .
One disadvantage of mailed surveys is that questionnaires must be simple. There also tends to be higher item non-response; that is, people will skip questions inadvertently. Oftentimes even a well organized and easy to follow questionnaire can have pitfalls. For example, if you ask respondents to give only one answer to a question, many people will choose multiple responses. In addition, response rates are lower for mailed surveys, data collection takes much more time than with telephone or face-to-face interviews, the sample population must be literate, and there are coverage and frame deficiencies. Coverage and frame deficiencies refer to the lack of a comprehensive list of addresses for households in the U.S. For this reason, mail surveys are not used for general population surveys.
While mailed surveys may be too time consuming, Web-based questionnaires may prove very useful for outbreak investigations in certain circumstances. Among some populations, such as students or employees in an office, a large percentage of people will have access to both the World Wide Web and email. Web and email access will only improve among the general population over time. Web based surveys can be inexpensive and fast. There is no data entry required of your health department staff when you implement Web based questionnaires because all responses are saved in a back-end database. This process has the potential to improve the quality of the data markedly. Many vendors offer survey capacity via the Web. These vendors can send data to your health department in a variety of formats for data analysis. You could import a Microsoft Access, Excel, FoxPro, dBASE, SAS, SPSS, or other database file into Epi Info software for analysis if you implement a Web-based questionnaire. Some vendors allow you to develop the survey instrument on-line, so there is no need to download any software. Some vendors will keep track of all the surveys you have done, so you can reuse a questionnaire in the future.
The most important limitation to using web-based questionnaires is limited access to the Internet among the general population. Other disadvantages include: the respondent must have sufficient experience with the Internet, and the speed of the respondent’s Internet connection and the computer’s hardware and software capacity may be important for successful survey completion. With Web-based questionnaires, you also run the risk of getting multiple responses from a single individual or of having non-sampled persons complete the survey. In addition, you would need some way to contact the sampled individuals to alert them of the questionnaires. Sending a cover letter and instructions via email is a common approach, but it requires knowing everyone’s email address.
An example of an arguably successful web based questionnaire comes from Dartmouth College, where a 2002 outbreak of conjunctivitis affected almost 14% of students. The vast majority of the cases were among undergraduates, so this was the source population for the study. The number of undergraduates was fairly large, so successfully reaching everyone in the cohort by phone would have been difficult. Instead, the investigators emailed the undergraduates and asked them to complete an on-line questionnaire. Since presumably all of the students regularly checked their email and were computer-savvy, the investigators were able to obtain risk factor information through a web based questionnaire. Investigators achieved a response rate of approximately 50%, about the same response rate as regular mail-out questionnaires.
If you are interested in exploring the use of web-based questionnaires, you can visit this Web site and compare the services of many survey vendors. Please note that while this web site can serve as a resource for you, neither the NC Center for Public Health Preparedness nor the Virginia Department of Health endorses any of these vendors.
What questions do you have regarding Interviewer Administered or Self Administered Interviewing Methods?
We are going to take a 5 minute break. When we reconvene, I will discuss some practical techniques for standardizing interviewer administered questionnaires.
Now I am going to discuss some practical techniques for improving interviewing skills for face-to-face and telephone interviewer administered questionnaire settings.
The goal of standardization is to help minimize error, thereby yielding better data quality. By standardizing both the survey instrument and the actions of the interviewers, interviewer error can be minimized. To improve the quality of our data, we must have standardization among and between interviewers.
One important factor to be aware of when you conduct a survey during an outbreak investigation is interviewer error . Interviewer error is simply a deviation from the expected answer due to the effects of interviewers. Interviewer error can be prevented with adequate interviewer training and the standardization of survey instruments. I will discuss both of these approaches in this section of the presentation.
The two components of error are bias and variance. Here are examples of interviewer bias and variance within the same outbreak investigation. In interviews regarding gonorrhea exposure and infection, interviewers ask probing questions in the sexual history section of non-white respondents but do not ask as many probing questions of white respondents. As you can imagine, this would lead to differential data quality by race and may hamper investigators’ ability to identify all potentially infected persons, especially if race is associated with gonorrhea exposure, infection, or both. Interviewer bias is often the result of incomplete or no interviewer training. In the same gonorrhea outbreak, suppose we have both a male and a female interviewer. The male interviewer may elicit different responses from a female respondent than the female interviewer would; a female respondent may not feel comfortable telling a man about her sexual history. Interviewer variance occurs because different interviewers have different effects on respondents. Generally speaking, this is not a widespread phenomenon; in this example we might see that male interviewers affect only a very small number of female respondents.
If you are interested in learning more about the different sources of error, Chapter 4 in this text is a good reference. Please remember that you can access this session’s complete set of slides on the Virginia Department of Health’s Training Web site.
Remember, the goal of standardization is to reduce error. There are many areas where we can try to standardize our survey processes:
- Question wording
- Interviewer selection
- Interviewer training
- Interviewing procedures
- Supervision of interviewers
These five areas usually occur in sequential order.
If you participated in the September “Designing Questionnaires” session, these concepts will be familiar to you! Questions on a questionnaire should meet these standards: they must be fully scripted, and they must mean the same thing to every respondent. Without standardized questions, an interviewer faces an uphill battle. To make sure that questions mean the same thing to every respondent, it is advisable to field test or “pilot” your survey instrument prior to beginning data collection. Try to develop a standardized survey instrument that can be used as a ‘template’ for future outbreak investigations, and field test that instrument when your county conducts field or table top exercises.
There are a few criteria to be aware of when selecting interviewers. For telephone interviews specifically, all interviewers must be able to read questions fluently; they should possess a clear and pleasant telephone voice; they should be able to respond quickly to respondents’ questions without becoming flustered; and they should be reliable.
For face-to-face interviews, the interviewer must have some logistical skills, such as reading maps or correctly using the data collection instrument (for example, a PDA if data are being collected electronically). Interviewers must also have good interpersonal skills, and must be independent and reliable. In some circumstances, the demographic characteristics of the interviewer may be important. For example, in an STD outbreak investigation, both the gender and the age of the interviewer may influence the way a respondent answers the questions.
Obviously, outbreak investigations require a rapid response, but this does not mean that interviewer training is optional. Even if time allows only a short training, some training must be done. Ideally, trainings should be interactive. Practicing reading the questions out loud allows interviewers to become familiar with the survey instrument and can help in predicting problems with questions on the survey. In addition, some sort of support documentation, known as an interviewer manual, must be provided to all interviewers. We will discuss this more in a few minutes. You may want to consider scheduling a general interviewer training session or adding a short interviewing techniques module to another upcoming health department training. Having trained interviewers in advance of an outbreak investigation or disaster situation can save time and improve the quality of your data when the next ‘real life’ opportunity to conduct an interview presents itself.
There are several topics to cover in interviewer training: the purpose of the survey; how to use the data collection instrument; the respondent selection process; how to record or code responses; administering the questionnaire, including the intent and meaning of each question; addressing respondents’ questions; methods for improving the response rate; tracking calls or completed surveys, including the number of call-backs to make; and confidentiality. We will discuss specifics about the respondent selection process, administering the questionnaire, and logistics in the following slides.
When instructing interviewers about selecting a person in the household to interview, you must address the issue of proxy respondents. A proxy is someone who responds to the survey in lieu of the actual case. Clearly, for children, you will have to speak to a parent or guardian, but you should instruct interviewers on the age at which children can respond for themselves. If the case or control is not at home, someone else in the household might volunteer to respond to the survey. Rules about adult proxy responding must be established and communicated to the interviewer during the training process. Proxy responding rules impact both data quality and sampling.
One important part of administering the questionnaire is securing the respondent’s cooperation. This may be more of an issue for controls, who are probably not familiar with the outbreak investigation. In order to establish the legitimacy of the survey, the interviewer must tell the respondent: who is calling; what is requested; why the respondent should cooperate; and how he or she was chosen. Without this preliminary information, a potential respondent may not be inclined to participate. In last month’s “Designing Questionnaires” session, we presented an example of an interviewer script that North Carolina public health professionals use for door-to-door Rapid Needs Assessment data collection interviews.
Interviewers must also be trained or competent in certain logistical details to perform effectively, such as reading maps, getting to respondents’ homes, reimbursement, dress code, and scheduling call-backs. Telephone interviewers must be trained on how to operate the equipment and the CATI software, if applicable. And, as I noted when discussing the advantages and disadvantages of the North Carolina PDA data collection, there is also a technology skills element in interviewer training for computer-assisted personal interviews.
There are a few other considerations in training interviewers. It is important to stress to interviewers that each question must have some response or resolution. If all questions are not fully answered, it may be difficult to tell whether a question was skipped because of a skip pattern or was simply overlooked. This distinction becomes important in the analysis phase of the investigation. Ideally, the interviewer will review each completed interview to pick up any missing or illegible responses.
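To make the “response or resolution” idea concrete, here is a minimal sketch of how a review of completed interviews might flag overlooked questions. All field names and the “N/A” skip code are illustrative assumptions for this example, not conventions from OMS, BRFSS, or any real questionnaire.

```python
# Hypothetical sketch: telling apart questions skipped by design (the
# interviewer recorded a skip code) from questions that were simply
# overlooked (no response or resolution at all).

SKIP_CODE = "N/A"   # assumed code the interviewer writes when a skip pattern applies
MISSING = None      # a blank on the form: no resolution recorded

def audit_interview(responses):
    """Return the IDs of questions left with no response or resolution."""
    return [qid for qid, answer in responses.items() if answer is MISSING]

interview = {
    "Q1_ate_eggs": "yes",
    "Q2_egg_style": "scrambled",
    "Q3_egg_dishes": SKIP_CODE,  # resolved: skip pattern applied on purpose
    "Q4_review": None,           # overlooked: needs follow-up before analysis
}

print(audit_interview(interview))  # -> ['Q4_review']
```

Run during the data collection phase, a check like this lets the interviewer or supervisor resolve gaps while a call-back is still possible, rather than discovering them during analysis.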
An interviewer manual has two purposes: it serves as a reference to interviewers during the interview, and it documents the outbreak investigation. The interviewer manual is a place to bring together all of the survey materials, but it does not have to be a large compilation of documentation in a 3-ring binder. It can simply be one page attached to the end of a questionnaire, or a folder with a few materials included for reference. Some materials that one would want to include in an interviewer manual can be prepared ahead of time. Reference sheets with commonly used interviewing techniques, probes, and feedback phrases may be helpful in any outbreak investigation. I will provide you with examples of probes and feedback comments later in today’s session.
Other items that may be helpful to include in an interviewer manual are: background information, such as the purpose of the survey, the topics it covers, how the data will be used, funding, and sample design; issues related to fieldwork, such as locating respondents, making the first contact, what to do with completed work, and record keeping (in the event of a phone survey, interviewers should know what to do with answering machines and how many times a household should be called back if there is no answer); and interviewing techniques, such as how to read questions, probing, handling refusals, entering responses, and making notes. In addition, terms and definitions and any other detailed information about the questionnaire can be included.
One example of a well-established interviewer training program offered by the Centers for Disease Control and Prevention is the Behavioral Risk Factor Surveillance System (BRFSS). The BRFSS is supported by the National Center for Chronic Disease Prevention and Health Promotion, CDC. You may be thinking “Why can’t they use an outbreak investigation model here?” The reason is that it’s the big, ongoing research projects like BRFSS that have put the resources into developing training programs for their interviewers, not those who investigate outbreaks. Therefore, we have a lot to learn from those who do it right. (The outbreak investigation teams that have documented a need for better interviewer training have in fact turned to BRFSS for this training.) In 1984, the CDC established the BRFSS to monitor state-level prevalence of the major behavioral risks and chronic conditions associated with premature morbidity and mortality among adults. Data collected are useful for planning, initiating, supporting, and evaluating health promotion and disease prevention programs. The BRFSS is conducted by state health departments in every state and US territory.
The BRFSS training is an online training that covers four main topics: why BRFSS data are important and how they are used; interviewer responsibilities; the nuts and bolts of the interviewing process; and interviewing techniques. Some of the information in the training is applicable to all surveys. It might be useful to browse through the training on your own time.
The Web addresses for both general information about the BRFSS and access to the online training materials are listed here.
Let’s now consider some important interviewing procedures that will enhance your data collection efforts. . .
All interviewers must follow some basic rules when conducting an interview. They must: read questions exactly as worded; probe inadequate answers, if necessary; record answers without exercising their own discretion; maintain rapport with respondents; and maintain an even pace throughout interviews.
In reading the questions exactly as worded, an interviewer should read the entire question before accepting an answer. This may seem time consuming and unnecessary in some situations, but it is important in setting the pace of the interview and giving the respondent time to think about the question being asked. The interviewer should also clarify questions if needed. Since it may be difficult to write a survey question that means the same thing to every respondent, clarification is necessary from time to time. But clarification may become dangerous if the interviewer begins to interpret the question as he or she perceives it. The supporting documentation in the interviewer manual that we discussed can facilitate standardized question clarification.
Here are a few ways to keep the interviewer from interpreting survey questions when providing clarification to respondents. Interviewers should use the standard definitions provided to them, for example in the manual documentation. Making up definitions for unfamiliar words is not a good interviewing technique. The respondent may ask what is meant by a subjective term or phrase in a question. If no standard definition is available, the interviewer can use the phrase “whatever x means to you” or “whatever you think of as x.” Again, the interviewer should avoid interpreting the question. And finally, when asked to repeat one of several response options, it may be best to repeat all of them. This gives the respondent more time to ponder the question.
A common practice in interviewing is to probe a respondent for additional information. It is important to recognize that probing should be done in a standardized way so as not to influence the response. A probe is generally used when a respondent’s answer is unclear or irrelevant.
Let’s look at an example. During an outbreak investigation, the interviewer asks, “In the past two weeks, have you been swimming in a public pool?” The respondent replies, “I swam in a lake at a national park last month.” This is an irrelevant response: the question asks about public pools in the past two weeks. A probe would be necessary here. Another respondent replies, “I stayed in a hotel with a pool when I was on vacation last week.” This is an unclear answer. The interviewer should not assume it means that the respondent actually swam in the pool without probing for further clarification.
Oftentimes the best probe is simply to repeat the question; it is a neutral probe that an interviewer can consistently rely on. An interviewer may also ask the respondent to retrieve various items, such as receipts from a restaurant or a calendar or date book. These items help respondents better recall exact dates and food items eaten. There are other neutral phrases you can use, such as “What do you mean?” and “How do you mean?” And if the respondent has narrowed the answer down to a few choices, an interviewer could say, “Which would be closer?” or “If you had to choose, which would you pick?”
When recording an answer, an interviewer should avoid the following temptations: directing the respondent toward an answer, also called leading; assuming that something a respondent said in passing is the correct answer; skipping questions even if the answer was already given; and reminding the respondent of an earlier remark if an answer differs from what is expected. At this point I would like to share a specific example where proper interviewer training could have helped alleviate a problem I encountered when interviewing controls during a Salmonella outbreak. (Some of you may remember this from last month.) There was a series of questions about egg consumption, something like: “Did you eat any eggs last week?” “Did you eat any eggs on Monday?” “Were they fried, scrambled, etc.?” “Did you eat any dishes prepared with eggs, such as casseroles?” The last question was, “Just to review, did you eat any eggs last week?” Every single participant disliked being asked this review question. They felt as though they were being treated as either idiots or dishonest. I didn’t understand the purpose of this review question either, so even though I understood the principle of not skipping questions, I skipped it anyway. I suspect that with proper interviewer training we could have either modified the question so as not to elicit a negative response, or trained me to ask it in a way that would have performed better. Or at the very least, if I had understood the purpose of the question, I would have understood that the information was important enough to justify frustrating respondents.
Let’s look at another example. The interviewer asks, “In the last 7 days, how many times did you eat prepared food at the dorm cafeteria? Would you say none, once, twice, 3 times, or more than 3 times?” The respondent replies, “Oh gee, I didn’t go very often, maybe a few times.”
One way the interviewer could respond is to use a probe. She could say, “Which would be closer: none, once, twice, 3 times, or more than 3 times?” This probe reiterates every possible response category and allows the respondent a bit more time to consider what she ate in the past 7 days. In this situation, the interviewer may succumb to the temptation of inadvertently leading the respondent. The interviewer could respond, “So would you say twice, or three times?” Similarly, she could say, “Do you mean twice, or three times?” These responses are problematic. The respondent may be thinking that “not very often” means once, or she may be thinking it means four times. The interviewer cannot assume the respondent’s frame of reference. By narrowing down the answers for the respondent, the interviewer is leading her to a response.
Maintaining rapport with the respondent means remaining neutral. Neutrality helps the respondent feel comfortable answering the questions truthfully and completely. Changes in body language and tone of voice may suggest that the interviewer approves or disapproves of a respondent’s answers. An interviewer must remain aware of what he or she says and how he or she says it, and should strive to be nonjudgmental, noncommittal, and objective.
Interviewers can put respondents at ease by: reading the questions in a friendly, natural manner; speaking at a moderate rate of speed; sounding interested; and striving for a low-pitched voice.
One tactic for maintaining rapport with the respondent is to provide feedback. Feedback is a statement or action that indicates to the respondent that he or she is doing a good job. Inexperienced interviewers may use feedback inappropriately, so it is important to keep these tips in mind. Give feedback only for acceptable performance, not for “good” content. For example, if the respondent is answering questions about smoking and tells the interviewer that he quit a few months ago, the interviewer may be tempted to provide a bit of encouragement; that is exactly what should be avoided. Give short feedback phrases for short responses, and longer feedback for longer responses. Specific study information and interviewer task-related comments also act as feedback that motivates respondents; I will provide examples of this on the next slide. Lastly, since the respondent cannot pick up on body language in a telephone interview, telephone interviewers should give feedback for acceptable performance 30 to 50 percent of the time.
Here are some examples of feedback: “I see…”; “Uh-huh”; “Thank you”; “That is useful information”; “I see, that is helpful to know”; “That is useful for our research”; “Let me get that down”; “I want to make sure I have that right (REPEAT ANSWER)”; and “We have touched on this before, but I need to ask every question in the order that it appears in the questionnaire.”
Pace refers to the rate of progression of the survey. In general, you want to maintain an even pace throughout the interview, and interviewers should let respondents set that pace. Older people may prefer a slower pace, while younger persons a faster one. If the respondent seems impatient, try to speed up the pace of the survey. Conversely, if the respondent is having trouble recalling answers, allow him / her more time to think.
What questions do you have about interview procedures in the context of an outbreak investigation?
We are now going to work through a 5 minute activity about interviewing techniques. You will need only a piece of paper and a pen or pencil; there was no activity form to download from the Web site for today’s session. After you complete the activity, I will talk through some suggested “correct” answers.
Here is a survey question with 3 possible responses from 3 different hypothetical respondents. For this activity, think about how the interviewer should respond to each of the three answers. Provide an example of either a clarification, a probe or feedback that could be used. Try to think of one correct use of each technique.
So, the question is: “Are you still experiencing diarrhea?” Respondent 1 replied, “I’m not sure.” In this situation you could clarify the question for the respondent. For example, you could say, “For the purposes of this survey, we consider diarrhea to be 3 or more loose bowel movements in a 24-hour period.” If this does not work, you can give the respondent more time to think about it or, as a last resort, move to the next question.
Here the respondent has given an irrelevant answer. Try to redirect him/her with the most neutral probe, repeating the question.
This is a straightforward response. You could simply move on to the next question, or try giving some neutral feedback such as “I see.” Let’s pretend for a moment that the interviewer knows that the respondent is a control and believes that the respondent should not be having diarrhea. If the interviewer said something like “Are you sure?”, that would be an example of very poor interviewer technique. Not only is it leading the respondent, but it also implies that you don’t believe him or her. Does anyone else have an example of a response that would work in this situation?
I have one final topic that I want to cover today: supervising your trained interviewers.
Supervising interviewers may seem like a luxury of time in an outbreak investigation. It is, however, important in ensuring standardization in interviews, and as we know, standardization results in better data quality. Ideally, a supervisor would monitor, evaluate, and provide feedback to interviewers about the way they handle the question-and-answer process. This type of supervising only applies to telephone interviews.
Beyond monitoring, there are other tasks a supervisor must perform. First, a supervisor can schedule interviewers; that includes estimating the number of interviewers needed and figuring out the time of day to call or visit. Second, if necessary, a supervisor must set up the physical space in which interviews will be conducted. Third, a supervisor should track who has been called and who has not; when numerous interviewers are involved, this may require more attention than one would think. Fourth, beyond logistical concerns, supervisors should also review completed interviews. This is a good way to catch mistakes and correct them for the remainder of the interviews. Furthermore, the data collection phase is the best time to catch errors and even conduct call-backs to respondents if necessary. You do not want to discover errors or missing data during the data analysis phase of your research!
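The call-tracking task above can be sketched very simply. This is an illustrative example only: the record layout and the three-attempt limit are my own assumptions for the sketch, not a CDC or BRFSS rule, and a real investigation would set its own call-back policy during interviewer training.

```python
# Illustrative sketch of a supervisor's call log: who has been reached,
# and who still needs a call-back before the attempt limit is hit.

MAX_ATTEMPTS = 3  # assumed call-back limit for this example

def needs_callback(log):
    """Return respondent IDs not yet completed and still under the attempt limit."""
    return sorted(rid for rid, rec in log.items()
                  if not rec["completed"] and rec["attempts"] < MAX_ATTEMPTS)

call_log = {
    "case-01":    {"attempts": 1, "completed": True},
    "control-07": {"attempts": 2, "completed": False},  # call again
    "control-09": {"attempts": 3, "completed": False},  # limit reached
}

print(needs_callback(call_log))  # -> ['control-07']
```

Even a one-file log like this makes it obvious at a glance which households are outstanding, which is exactly the bookkeeping that becomes hard to hold in one's head once numerous interviewers are involved.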
While confidentiality is the last aspect of interviewing techniques that I will discuss today, it is certainly not the least important one. Confidentiality is an important consideration in all the work we do – particularly within the context of the Health Insurance Portability and Accountability Act, or HIPAA.
In most social science research studies, informed consent is required. However, since outbreak investigations are considered a public health emergency, with the purpose of identifying and controlling a health problem, neither informed consent nor IRB clearance is required.
Keep in mind, however, that if further analysis of outbreak investigation data is conducted for the purpose of research, IRB approval should be obtained. That is, if you plan to publish a paper about your outbreak investigation, you should inquire as to whether IRB approval is needed. Do not read this as saying that you can publish using data collected during an outbreak investigation without getting IRB approval. The distinction is “planned” research. Both hypothesis-generating and hypothesis-testing questionnaires can be administered without IRB approval because they are considered part of an investigation to control the outbreak; disease control is where the law is usually on the side of public health practitioners. Without testing a hypothesis, one would not know, for example, which restaurant to shut down, which product to recall, or which farm to fine.
Although you will be exempt from filing Institutional Review Board paperwork and obtaining official “informed consent” for outbreak investigation or rapid needs assessment interviews, you should not overlook the issue of confidentiality from the respondent’s perspective. The opening statement of every interview should indicate that all information collected will be kept confidential. This is important to help assure the respondent that the medical or demographic information they give you will be secure. This may be especially important in certain circumstances. For example, in 1982, there was a multi-state Salmonella outbreak linked to marijuana. Clearly, this would be a sensitive area where confidentiality concerns are of the utmost importance.
An interviewer should not give out details regarding the outbreak, except for providing a brief description of why you are calling at the beginning of an interview. It may be helpful to provide interviewers with a preset script of what can be disclosed about the outbreak to avoid any unnecessary disclosure. If the respondent wants more information about the study, an interviewer should have a telephone number for him / her to call. This telephone number is yet another item that could be included in the supporting documentation in an interviewer’s manual.
This concludes my portion of today’s session. Are there any questions for me before we take a short break and move on to our guest lecturer’s presentation?
We are going to take a 5 minute break . . . We will reconvene at ____ o’clock.
Hello everyone. I’m Erin Rothney and I’m a Research Associate with the North Carolina Center for Public Health Preparedness.
The objectives for my talk today are to provide you with a couple of real-life examples of situations in which I applied the interviewing techniques that you have learned about today. I’ll discuss both face-to-face and telephone interviewing. Then I will compare the two methods and discuss what I found to be the challenges and the solutions to those challenges in each of the given situations. I will also accept questions at the end of my talk.
First I will describe my experience with face-to-face interviews.
I was part of a 5-person team that completed a community assessment of the Bragtown neighborhood in the city of Durham, NC over a five-month period in 2002 and 2003. This community assessment was part of the requirements for completing my master of public health degree. The goal of a community assessment is to identify the needs and strengths of a particular community from numerous stakeholder perspectives. Information collected during an assessment can include any predominant health issues as well as the barriers to and resources available for improving health conditions. That information is collected by interviewing community members and other community stakeholders, and by observing both the environment and the people within the community. A rapid needs assessment, such as the post-hurricane one that you heard about in last month’s “Designing Questionnaires” session, is similar to a community assessment in many ways, although it tends to happen in a shorter time frame and with a more immediate health need or threat as the reason for conducting the assessment.
The objective of the Durham, North Carolina community assessment was to identify the strengths and weaknesses that existed in the community, primarily as they concerned the public health infrastructure, and to present the information back to community members to use as they wished. Our five-person team interviewed 30 community members and other stakeholders, including the fire chief and business owners. We started with one point of contact, our preceptor, who was a member of the community and gave us referrals to other community members. Our method of selecting additional people to interview was not very systematic: we essentially visited businesses, churches, and other community organizations and inquired about interviewing someone who was part of the organization. We did ensure that we interviewed people working in governmental organizations, including staff members at the county health department. From there we started to establish a network of people in the community as resources. The questionnaire that we used was 4 pages in length and took approximately one hour to administer.
We had basic interviewer training in conjunction with our coursework to prepare us for conducting face-to-face interviews. On the next couple of slides, I will tell you about the interviewer training components and how they related to the questionnaire format and design and to the face-to-face interviewing method. The tasks I was involved with in the community assessment included questionnaire design, interviewing and conducting focus groups in the community, analyzing the data collected, and presenting the data back to the community through an open forum.
Here are several examples of questions we asked in the community assessment. These questions were intended to elicit information on life in the community and community assets, while other categories of questions included individual and family information, services offered within the community, and the needs or problems within the community. You can see here that we included some probing questions as a set part of the interview guide. For example, for the question under community assets, “What are some organizations within your community that positively affect you or your community?”, we probed by asking, “What about political groups, environmental groups, church groups?” in case those groups were not the first organizations people mentioned. Questions such as “What do you like about Bragtown?” served as a good starting point by getting at the positive aspects of the community.
Now I will briefly discuss what our interviewer training included. It covered active listening skills, such as maintaining eye contact with the interviewee, nodding your head and giving other physical indications that you are paying attention, and appearing open to the interviewee, meaning that you are sitting comfortably and casually while you are listening. Empathy can be defined in many ways, but generally it is “a feeling of concern and understanding for another’s situation or feelings.” Showing empathy while interviewing is important to making interviewees feel comfortable talking with you and allowing them to feel you are a “safe” person to talk to. For our interviews, we came up with standard probes to use while interviewing, as you saw on the previous slide. This was intended to reduce the interviewer bias and error that Aaron talked about as a disadvantage, and it standardized the way that each of us asked the questions. As Aaron also discussed earlier, practicing interviewing in our training was a definite must. When others watched us practice the interview, they gave us suggestions on ways to make our body posture and demeanor as welcoming as possible. We also read through the questions to ensure we had the intricate skip patterns down and didn’t stumble over our words when we went out to do the real interviews.
We ran into a few challenges when we conducted the face-to-face interviews. First, it’s hard to find people at home, especially during the typical working day. One remedy is making appointments with people ahead of time, although that involves calling ahead or stopping by in person. Plus, if the community where you are conducting interviews is a low-income area, it may be essential to make personal visits, since some households may not have phone service. As Aaron said, face-to-face interviewing is an appropriate method for hard-to-reach populations, such as those without telephones. Also, some people just don’t feel comfortable inviting a stranger into their home. Interviewers should be kind and considerate of people’s preferences and use the good listening and empathy skills learned in interviewer training to try to gain their trust. You could also agree to meet at a public location, depending on how sensitive the interview content is, and complete the interview there. This can be tricky if you are asking for information on socially unacceptable or sensitive behaviors, but you can often find a private place within a public place, such as a conference room in the public library. Or you might suggest sitting out on the person’s porch or in another space outside the home, which may feel less invasive. Lastly, face-to-face interviewing is the most costly method, since it involves training volunteers or staff; travel to the interviews; and the time spent training, traveling, and conducting the interviews. This can be remedied by having someone on staff who is well versed in interviewing techniques train the interviewers, by having multiple interviewers in the community at the same time to get the most bang for your buck, and by training people in interviewing before an immediate need presents itself. Although face-to-face interviewing is the most costly method, it can also yield the most complete data, so it may be worth the time and effort.
Before I move on to the next type of interviewing method with which I have experience, telephone interviewing, I want to share a few of the lessons that I learned while conducting face-to-face interviews in Durham. 1. Assess the environment before starting your interviewing to better understand the context of the community and prepare more thoroughly for the investigation. If possible, learn about the community where you will be interviewing before you go there. You can do this by looking at census data, the town’s Web site, or simply by going into the community to assess the environment. This will help you know whether you need to make any special considerations before embarking on your interviews with community members. When we drove around the neighborhood in Durham, we saw a large number of storefronts with Spanish names and determined that we needed to interview members of the Latino population, and might need to translate our questionnaire into Spanish and borrow Spanish-speaking interviewers from another team. Face-to-face interviewing can also be helpful in assessing the environment while you are interviewing. You will be able to visually assess living conditions and even some aspects of residents’ health status. Is there standing water? Is there garbage floating in the water? How does this all relate to the outbreak you are investigating? What are the important factors to make note of? 2. I also recommend having interviewer training occur well before people need to use their interviewing skills. You may want to hold an annual interviewer training, since turnover in health departments can be an issue. With all that needs to occur during an outbreak investigation, it could be very helpful to have everyone already trained in interviewing and just provide a brief refresher course before heading into the field.
As you heard in last month’s “Designing Questionnaires” session, it is equally helpful to have standardized outbreak investigation questionnaires already developed and field tested; that field testing can include interviewer training and practice. 3. People like to tell you their stories, so let them if you have time. It also helps you gain their trust by showing your good listening skills and appearing interested in what they have to say. Meeting someone face-to-face helps them feel more comfortable talking with you. This rapport is not only essential to successful data collection; it may also help you get referrals to others in the community with whom you may want to speak. In the context of our assessment, stories were a great resource for better understanding the community and the interactions of the people within it. Hearing stories and learning information you did not specifically ask about in the interview can also be very helpful when you are trying to generate a hypothesis about the source of an outbreak. And if a disaster has occurred, taking time to listen to grieving people is a compassionate thing to do.
Next I will talk about telephone interviews and how they were used in the investigation of an E. coli outbreak that originated at the North Carolina State Fair in 2004. This outbreak affected mostly children and included 43 confirmed cases, 59 suspect cases, and 6 probable cases.
The NC State Fair took place from October 15th through October 24th. Illness onset occurred from mid-October into early November. The people who attended the fair came from across North Carolina and from other states, so it was a geographically dispersed group. Investigators implemented a case-control study design to investigate the E. coli outbreak, and controls were matched on age. As a Research Associate at the North Carolina Center for Public Health Preparedness, I assisted with Team Epi-Aid activities to support state health department efforts during the investigation. Those of us involved at UNC called only the controls; the state health department staff and a team of visiting CDC staff took care of interviewing the cases. This could have introduced some interviewer variance, since CDC interviewed one group and UNC the other, although there is no way now to tell whether that was an issue in this study. Training was required of all interviewers. We essentially had a train-the-trainer session: a North Carolina Public Health Regional Surveillance Team member trained a group of 4 potential interviewers from UNC, and then we trained those who could not attend that initial training. All interviewers were required to practice reading through the questions and the intricate skip patterns to ensure comfort with the survey instrument before starting the interviews with controls. Since I both received the training and then delivered it to others, I felt very well prepared to dive into the interviewing.
We conducted interviews over 4 evenings, with anywhere from 3 to 6 interviewers making calls. We placed calls between 3 and 8 p.m., which was a good time to catch adults at home from work. We called whatever number people had given when they bought pre-sale state fair tickets. The numbers were sometimes cell phone numbers; some people asked that we call them back at a home number, while others happily talked to us while driving. We worked from lists of phone numbers, each with between 100 and 300 numbers. We were required to complete 3 interviews per list, each with a person in a pre-assigned age group: 1-5 years, 6-17 years, or 18 years and older (for those under 18, we spoke to a parent, who answered the questions based on their child’s behavior at the fair). We did not determine the response rate, since some phone calls were not answered, and other people were willing to participate but did not have a household member in the age group we were interviewing. But I personally had only one person refuse to participate, so anecdotally the response rate seemed high. It was helpful to have all of the interviewers working in the same building. One person could supervise, check in with interviewers periodically, and review completed interviews, which is one of the advantages of telephone interviewing that you heard about from Aaron. If there were problems with inconsistency or missing information, the supervisor could ask the interviewers directly about the issue before they continued with additional interviews. Overall, people were very willing to talk with the interviewers. Most had heard about the outbreak through news sources and were curious and interested in knowing more about it (many had their own ideas of what caused the outbreak!).
Now, I want to briefly share the lessons that I learned when conducting telephone interviews with controls during the E. coli outbreak investigation: 1. Familiarize yourself with the instrument and skip patterns. Interviewing can be complicated by skip patterns, and it is hard to figure out the flow of the questionnaire without reading through the questions and actually conducting a mock interview. It can be hard to sound conversational instead of just reading from a script, so I think it is really important to practice interviewing before launching into the real thing. You do not want to sound stiff, as if you are reading the questions for the first time. 2. I also learned that the media can be your friend. They may get information wrong sometimes, but the media can raise awareness and help people recognize what you are calling to interview them about. Remember, too, from the July “Risk Communication” session in this series, that you can use the media not only to raise awareness, but to educate the public about the status of an outbreak situation, where community members can find additional information, and what they can do to protect themselves from infection. 3. Finally, it is important to relate the purpose of the phone call to the individual. Be sure to tell the respondent as soon as possible why you are calling; for example, “Because you live in such and such a neighborhood” or “Because you attended so and so’s wedding.” This may help them trust that your call is legitimate and get them interested in what you have to say. You may recall that in last month’s “Designing Questionnaires” session, we reviewed a sample introductory script for conducting interviews. Your introductory script should: identify and legitimize the interviewer; immediately state the reasons for conducting the survey; assure the respondent that whatever information is given will be confidential; and possibly serve additional purposes.
In closing, I want to briefly review the scenarios in which I conducted interviews and discuss why the respective interview methods were most appropriate for the settings in which they were used. Let’s first consider the community assessment interviews. Face-to-face is the only way to conduct a thorough community assessment. Meeting face-to-face makes it easy to establish rapport and gain the trust of your interviewee. People may be more likely to share information if you are sitting in front of them rather than on the phone with them; you are a real person, not just a faceless voice. It may also be easier to find people in a certain geographic area by going directly to the area, instead of searching for phone numbers and addresses in the phone book or on the Internet. Combining census data with Geographic Information Systems software can also help you determine in advance where to locate people to be interviewed. Finally, the face-to-face method facilitates a complete assessment of both the environment and the person you are interviewing. Assessing the environment in which the person lives, whether their home or their community, can be important for outbreak investigations. You may see clues about how an outbreak has spread, or about the physical condition of the person you are interviewing or the place where they are living. During the E. coli outbreak investigation, the news coverage of the outbreak actually helped those of us who were interviewing. People would answer the phone, we would begin by stating why we were calling, and many times they would interject comments about how they had heard about the outbreak and ask what we could tell them. Of course, we would say that we did not know exactly what happened, and that is why we were interviewing people who went to the fair, both those who got sick and those who did not. Most people were very willing to talk to us about what they ate and saw at the fair; it seemed to be a fun recall game for some people.
It was easy for the people who designed the E. coli outbreak study to access the control group with telephone interviews. The control group consisted of people who had gone to the fair but had not gotten sick (and had bought pre-sale tickets, which is how we had their phone numbers). Since people had come from all over the state and even from out of state, it would have been very difficult, costly, and time consuming to track down a sample and perform face-to-face interviews. So in the end, I think that the interviewing methods used in both situations were appropriate given the research goals and the resources available.
What questions do you have for me today?
Let’s review the main points of today’s session on interviewing techniques.
Questionnaire design and interview methods are interrelated in the overall process of an outbreak investigation. The primary purpose of interviews in outbreak investigations is to collect data for case identification, risk factor identification, or hypothesis generation.
Interview methods can be interviewer administered (face-to-face or telephone) or self administered (mailed, emailed, or Web-based). There are advantages and disadvantages to employing either method. Sampling is the systematic selection of a representative portion of the larger source population to be interviewed. If the purpose of your study is to determine the point source of infection, you may be able to interview a smaller sample; if the purpose of your study is to calculate an attack rate, you may need to interview a larger sample.
Survey response rates measure the percentage of your sample that has participated in your survey. Average response rates vary from as little as 56% for mailed surveys to 75% for face-to-face surveys. Non-response to surveys can be a result of no one being home, refusal to participate, or individual inability to participate (e.g., because of a language barrier or physical or mental condition).
Survey data collection error is a result of both bias and variance in the interview process. Interviewer error can be prevented with adequate interviewer training and the standardization of survey instruments.
Develop and distribute an interviewer manual to provide interviewer support. Such documentation reduces error and enhances the quality of data collected. Sound interviewing procedures include: reading questions exactly as they are worded; probing inadequate answers; recording answers without interviewer discretion; and maintaining rapport with respondents.
Next month we will have Dr. Amy Nelson with us again to talk to you about the next step in the outbreak investigation process: analyzing the data that you collected during interviews and coming one step closer to solving the outbreak mystery. . .
Public Health Information Network (PHIN) Series II
Outbreak Investigation Methods: From Mystery to Mastery
Site Sign-in Sheet
Please submit your site sign-in sheet and session evaluation forms to:
Suzi Silverstein
Director, Education and Training
Emergency Preparedness & Response Programs
FAX: (804) 225 - 3888
Series II Sessions
“Recognizing an Outbreak” – June 2
“Risk Communication” – July 7
“Study Design” – August 4
“Designing Questionnaires” – September 1
“Interviewing Techniques” – October 6
“Data Analysis” – November 3
“Writing and Reviewing Epidemiological Literature” – December 1
CDC Outbreak Management System
Software Support: National Center for Public Health Informatics
email@example.com / (800) 532-9929, option 6
OMS Applications
• Track demographics, case investigations, and exposure contact relationships for persons, animals, events, travel events, vehicles, objects, organizations, other organisms, and locations
• Create household, social, or occupational relationships among records
• Run OMS on desktops or laptops [CAPI]
OMS User Interface
Source: http://www.cdc.gov/phin/software-solutions/oms/index.html
OMS User Interface
Source: http://www.cdc.gov/phin/software-solutions/oms/index.html
OMS in Virginia
Contact: Michael A. Coletta, MPH
Bioterrorism Surveillance Coordinator
Division of Surveillance and Investigation
Office of Epidemiology
Telephone: (804) 864-8099
Email: firstname.lastname@example.org
Today’s Presenters
Aaron Wendelboe, MSPH
Doctoral Candidate and Graduate Research Assistant, NC Center for Public Health Preparedness
Erin Rothney, MPH
Research Associate, NC Center for Public Health Preparedness
“Interviewing Techniques” Learning Objectives
Upon completion of this session, you will:
• Recognize the interrelatedness of interview techniques and questionnaire design
• Understand key survey research terms
• Understand the advantages and disadvantages of face-to-face, telephone, and computer assisted interview methods
Learning Objectives (cont’d.)
• Understand the advantages and disadvantages of mail and Web-based survey implementation
• Know what to address in interviewer training
• Recognize good interview techniques
• Understand confidentiality concerns from the perspectives of both the respondent and the outbreak investigator
Lecturer
Aaron Wendelboe, MSPH
Doctoral Candidate and Graduate Research Assistant, NC Center for Public Health Preparedness
Basic Steps of an Outbreak Investigation
1. Verify the diagnosis and confirm the outbreak
2. Define a case and conduct case finding
3. Tabulate and orient data: time, place, person
4. Take immediate control measures
5. Formulate and test hypothesis
6. Plan and execute additional studies
7. Implement and evaluate control measures
8. Communicate findings
Introduction
• The role of interviews in outbreak investigations
• Types of interviewing methods
• Interrelatedness of interview method and questionnaire design
• Key survey research concepts
  – Sampling
  – Response rates
Role of Interviews in Outbreak Investigations
Primary purpose: data collection
• Case identification
• Risk factor identification
• Hypothesis generation
Sampling
Sampling is the systematic selection of a portion of the larger source population. A sample should be representative of the larger source population.
Sampling
Source population: Students (12,000)
Sampled population: 150 students
Sampling – Why Sample?
Because it is more efficient – saves time and money!
Sampling – Sample Size
Is the purpose of the study to determine the source of the outbreak?
– A small number of cases and controls can reveal risk factors for infection.
Is the purpose of the study to determine the number of persons who become sick over a specific period of time [attack rate]?
– A cohort study would require a larger sample.
Sampling – Types of Sampling
Simple Random Sample (SRS): randomly select persons to participate in the study. There are many variations of SRS.
Convenience Sample: choose those individuals who are easily accessible.
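The simple random sample described on this slide can be sketched in a few lines of Python. This is an illustrative sketch only, using the 12,000-student / 150-student numbers from the Sampling example; the integer student IDs are hypothetical stand-ins for a real sampling frame.

```python
import random

# Hypothetical sampling frame: one ID per student in the source population.
source_population = list(range(1, 12001))  # 12,000 student IDs

random.seed(42)  # fixed seed so the draw is reproducible for training purposes

# random.sample draws without replacement; every student has an equal
# chance of selection, which is the defining property of an SRS.
sample = random.sample(source_population, k=150)

print(len(sample))       # 150
print(len(set(sample)))  # 150 -- no student appears twice
```

In practice the frame would be a roster or directory rather than a range of integers, but the selection step is the same.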
Sampling – Problems with Convenience Sampling
• Based on subjective judgment
• Cases may or may not be representative of the total population
• May lead to biased results
Sampling – Additional Resources
http://www.sph.unc.edu/nccphp/training/all_trainings/at_sampl.htm
1. “Sampling Case Studies”
2. “Survey Sampling: Precision, Sample Size, and Conducting a Survey”
3. “Survey Sampling Terminology and Methods”
Response Rates
Response rates measure the percentage of your sample that has participated in your survey.
Example: Using the campus directory, you email a survey to a random sample of 100 freshmen. 40 of those students complete the survey and return it electronically. Your response rate is 40%.
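The arithmetic in this example can be written as a one-line function. This is a minimal sketch; the function name is illustrative, not part of the session materials, and it computes only the simple completed-over-sampled rate, not the more detailed AAPOR definitions referenced later.

```python
def response_rate(completed: int, sampled: int) -> float:
    """Percentage of the sample that participated in the survey."""
    return 100.0 * completed / sampled

# The slide's example: 100 freshmen emailed, 40 surveys returned.
print(response_rate(40, 100))  # 40.0
```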
Response Rates
High response rates help ensure that survey data are representative of the source population, and that results will be valid.
Response Rates – Types of Non-response
• Non-contact: no one at home
• Refusal to participate
• Inability to participate (due to language barrier or physical or mental condition)
Response Rates
What is an average response rate?
Response Rates – Determining Response Rates
Refer to the American Association for Public Opinion Research website: www.aapor.org
– Link to the document titled “Standard Definitions” from the home page.
Interviewer Administered Data Collection
• Advantages and disadvantages of face-to-face interviews
• Advantages and disadvantages of telephone interviews
• Advantages and disadvantages of computer assisted interviews
Face-to-Face Interview – Advantages:
• Higher response rate
• Longer survey instrument
• Can have more complex skip patterns
• More accurate recording of responses
  – Less item non-response
• Appropriate for hard to reach populations (e.g., illiterate, institutionalized)
Face-to-Face Interview – Disadvantages:
• Costly
• Potential for interviewer error
• Less anonymous than self-administered
  – Potential for dishonesty
Telephone Interview – Advantages:
• Less costly than face-to-face
• Higher response rates than mailed
• Quicker access to participants
• Supervision of interviewers feasible
• Can collect more sensitive information
• Survey design can be more efficient
Telephone Interview – Disadvantages:
• Lower response rates than face-to-face
• Shorter questionnaires used
• Unable to capture important visual information (e.g., rash, working conditions)
• Under-coverage (e.g., population without phones)
  – 2.1% of total population in Virginia
CAPI Example: NC PHRST Teams
NC PHRST Team public health professionals use PDAs* for rapid needs assessment face-to-face interviews.
* PDA: Personal Digital Assistant, also sometimes called hand-held computers, palmtops, and pocket computers
Field Data Collection
[Diagram: Field Teams 1–5 collect data on IPAQ Pocket PCs equipped with GPS, GIS software, and data collection forms, transmitting to Epi Info and GIS via wireless links (WiFi 802.11 or Bluetooth).]
Paper-based data collection
Pros:
• Inexpensive
• No special skills required for data recording
Cons:
• Requires double data-entry
  – Greater risk of data errors
• Clipboard and paper more cumbersome in the field
PDA-based data collection
Pros:
• Eliminates double data entry
• Provides routing and direction-finding for field teams
• Improved randomization through GIS
• Ability to quickly analyze and map data
Cons:
• Technology is expensive
• Learning curve / required training for data entry
• Small screen size requires scrolling through many questionnaire pages
For More Information. . .
Steven Ramsey, RS
Team Leader / Industrial Hygienist
PHRST-5
Guilford County Health Department, NC
(336) 641-8192
Self-administered Data Collection
• Advantages and disadvantages of mailed questionnaires
• Advantages and disadvantages of Web-based questionnaires
Mailed Questionnaire – Advantages:
• More anonymous
• May collect more honest responses
• No interviewer error
• Less expensive
• Respondent has more time to think about questions
Mailed Questionnaire – Disadvantages:
• Questionnaire must be simple
• Higher item non-response
• Lower response rate
• Data collection takes more time
• Sample population must be literate
• Coverage / frame deficiencies
Web-based Questionnaire – Advantages:
• Among some populations, most people may have access to the Web / email
• Inexpensive and fast
• No data entry required
  – Improves data quality
• Many vendors send data in a variety of formats
Web-based Questionnaire – Disadvantages:
• Mandatory access to and experience with the Internet
• Potential connection speed and hardware / software capacity limitations
• Potential for multiple responses from one individual
• Potential for responses from non-sampled respondents
• Need email address list to contact sample
Web-based Questionnaire Example
Dartmouth College: 698 (13.8%) of 5,060 students had conjunctivitis in spring 2002.
• To identify risk factors:
  – Web-based questionnaire set up
  – E-mail sent to 3,682 undergraduates
  – No data entry – rapid analysis
• 1,832 responded (50% response rate)
Source: An outbreak of conjunctivitis due to atypical Streptococcus pneumoniae. N Engl J Med. 2003;348(12):1112-21.
Web-based Questionnaire
For a list of vendors that provide Web-based survey tools, please visit:
http://www.surveymonkey.com/Pricing.asp
Standardizing Interviews
• The goal of standardization is to help minimize error, thereby yielding better data quality
• Interviewer error is minimized by making surveys more standard and consistent
Error
Interviewer Error: deviation from the expected answer due to the effects of interviewers.
Interviewer Error Example: Gonorrhea outbreak
Bias: Interviewers probe on the sexual history section more among non-whites than whites.
Variance: A male interviewer may elicit different responses from a female respondent than a female interviewer.
Error – Additional Resource
Schwarz, N., Groves, R., and Schuman, H., “Survey Methods.” Chapter 4 in Gilbert, D. et al. (Eds.) (1998). The Handbook of Social Psychology. Boston: McGraw-Hill; pp. 143–179.
Interviewer Selection
Criteria for Telephone Interviewer Selection
• Ability to read questions fluently
• Clear and pleasant telephone voice
• Responds quickly to respondent’s questions
• Reliability
Criteria for Face-to-Face Interviewer Selection
• Logistical skills (reading maps)
• Good interpersonal skills
• Independent workers
• Reliability
• In certain circumstances, parallel demographic characteristics among interviewers and interviewees
Interviewer Training
• Training is NOT optional!
• Trainings must be interactive
• Interviewers must practice reading questions out loud
• Provide support documentation (manual)
Interviewer Training Elements
• Purpose of survey
• How to use the data collection instrument
• Respondent selection process
• Intent and meaning of each question
• How to record/code responses
• Administering the questionnaire
• Addressing participants’ questions
• Methods for improving response rate
• Tracking calls / completed surveys / call-backs
• Confidentiality
Interviewer Training – Respondent Selection Process
Provide proxy respondent rules for adults and children, because proxy response impacts:
– Data quality
– Sampling
Interviewer Training – Questionnaire Administration
To establish legitimacy of the survey upon first contact, tell the respondent:
• Who is calling
• What is requested
• Why the respondent should cooperate
• How the respondent was chosen
Interviewer Training – Logistics
Face-to-Face:
• Reading maps
• Getting to respondents’ homes
• Reimbursement
• Dress code
• Scheduling callbacks
Telephone:
• Operation of equipment
• Operation of CATI software (if applicable)
Interviewer Training – Other Considerations
• Record some resolution to each question
  – Are missing responses due to skip patterns or errors?
• Review the interview after completion
  – Missing responses
  – Illegible responses
Interviewer Training – Interviewer Manual
An interviewer manual serves as a reference to interviewers during interviews and as survey documentation.
Interviewer Training – Suggested Interviewer Manual Contents
• Background information
• Fieldwork
• Interviewing techniques
• Survey instrument terms and definitions
Interviewer Training Program Example
Behavioral Risk Factor Surveillance System (BRFSS)
BRFSS Interviewer Training
On-line training covers:
• Why BRFSS data are important, how data are used
• Interviewer responsibilities
• Nuts and bolts of the interviewing process
• Interviewing techniques
BRFSS Interviewer Training
On-line interviewer training available at:
http://apps.nccd.cdc.gov/BRFSS_Training_Int/
General information about BRFSS:
http://www.cdc.gov/brfss/
Interviewing Procedures – Rules
• Read questions exactly as worded
• Probe inadequate answers, if necessary
• Record answers without interviewer discretion
• Maintain rapport with respondents
• Maintain an even pace
Interviewing Procedures – Read questions exactly
• Read the entire question before accepting an answer
• Clarify questions if necessary
Interviewing Procedures – Read questions exactly
• Use only the standard definitions / clarifications provided
• Use the phrase “Whatever x means to you,” or “Whatever you think of as x.”
• When asked to repeat only one of several response options, repeat ALL options given for a question
Interviewing Procedures – Probe
A probe is a standardized way to obtain additional information from a respondent. Use probes when a respondent’s answer is unclear or irrelevant.
Probe
Examples of responses requiring a probe:
Interviewer: “In the past two weeks, have you been swimming in a public pool?”
Irrelevant response: “I swam in a lake at a national park last month.”
Unclear response: “I stayed in a hotel with a pool when I was on vacation last week.”
Interviewing Procedures – Standard Probe Examples
• Repeat the question
• Retrieve receipts / calendars
• “What do you mean?” “How do you mean?”
• If the respondent has narrowed down the answer:
  – “Which would be closer?”
  – “If you had to choose, which would you pick?”
Interviewing Procedures – Recording Answers
• Do not direct the respondent toward an answer (leading)
• Do not assume that an “answer” received in passing is correct
• Do not skip questions, even if an “answer” was given earlier
• Do not remind the respondent of an earlier remark if an answer differs from what you expect
Probing versus Leading
Example:
Interviewer: “In the last 7 days, how many times did you eat prepared food at the dorm cafeteria? Would you say:
a. None
b. Once
c. Twice
d. 3 times
e. More than 3 times”
Respondent: “Oh, gee, I didn’t go very often . . . maybe a few times.”
Probing versus Leading
Example:
Interviewer probe (correct): “Which would be closer: none, once, twice, 3 times, or more than 3 times?”
Interviewer leading (incorrect):
a. “So, would you say twice, or three times?”
b. “Do you mean twice, or three times?”
Maintain Rapport
“Any line can be said a thousand ways.” – BRFSS interviewer training
Interviewers can put respondents at ease by doing the following:
• Read the questions in a friendly, natural manner
• Speak at a moderate rate of speed
• Sound interested
• Strive for a low-pitched voice
Feedback Helps Maintain Rapport
Feedback is a statement or action that indicates to the respondent that s/he is doing a good job.
– Give feedback only for acceptable performance, not “good” content.
– Give short feedback phrases for short responses, longer feedback for longer responses.
– Specific study information and interviewer task-related comments can serve as feedback.
– Telephone interviewers should give feedback for acceptable respondent performance 30-50% of the time.
Feedback Examples
• “I see…”
• “Uh-huh”
• “Thank you / Thanks”
• “That is useful / helpful information”
• “I see, that is helpful to know”
• “That is useful for our research”
• “Let me get that down”
• “I want to make sure I have that right (REPEAT ANSWER)”
• “We have touched on this before, but I need to ask every question in the order that it appears in the questionnaire”
Interviewing Procedures – Maintain an Even Pace
• Pace refers to the rate of progression of the interview.
• Pace can vary by question type.
• Let the respondent set the pace.
Activity: Correct Interview Procedures
Probing vs. Leading vs. Feedback
Completion time: 5 minutes
Activity
Interviewer: “Are you still experiencing diarrhea?”
Respondent 1: “I’m not sure”
Respondent 2: “I definitely had diarrhea last Tuesday”
Respondent 3: “Yes”
Activity Instructions: How should the interviewer respond to these 3 answers? Provide an example of a clarification, probe, or feedback that the interviewer could use. Try to think of one correct use of each technique.
Activity – Suggested Answer
Respondent 1: “I’m not sure”
Try a clarification: “For the purposes of this survey, we consider diarrhea to be 3 or more loose bowel movements in a 24-hour period.”
Activity – Suggested Answer
Respondent 2: “I definitely had diarrhea last Tuesday”
Try a probe: “OK, but are you still experiencing diarrhea?”
Supervising Interviewers
Monitoring, evaluation, and feedback given to interviewers should focus on the way interviewers handle the question-answer process.
Other Supervision Tasks
• Scheduling interviewers
  – Number of interviewers needed
  – Time calls / visits will be made
• Setting up interview space
• Tracking who has been called and who has not
• Reviewing data from completed interviews
Confidentiality – Human Subjects & Informed Consent
Outbreak investigations are considered a public health emergency, with the purpose of identifying and controlling a health problem. Neither informed consent nor Institutional Review Board (IRB) clearance is required.
Confidentiality – Human Subjects & Informed Consent
If further analysis of outbreak investigation data is conducted for the purpose of research, IRB approval should be obtained.
Confidentiality – Respondent Perspective
The opening statement of every interview should indicate that all information collected will be kept confidential.
Confidentiality – Outbreak Investigation Perspective
• Do not discuss details about the outbreak
• Provide only a brief description of the purpose of the survey at first contact
Guest Lecturer
Erin Rothney, MPH
Research Associate, NC Center for Public Health Preparedness
Overview
• Provide real-life examples of situations where you will use interviewing techniques
  – Face-to-face interviewing
  – Telephone interviewing
• Discuss advantages and disadvantages
• Compare interviewing methods
Community Assessments
• Identify the needs and strengths of a particular community from several stakeholder perspectives
• Include interviewing community members and observing environmental and individual characteristics and community infrastructure
• Similar to rapid needs assessments, but completed within a longer time frame
Durham, NC Community Assessment
• Fall 2002 – Spring 2003
• Bragtown neighborhood, Durham, NC
• 5-person team
• Interviewed residents and other stakeholders in Bragtown
• 4-page survey, 60 minutes in length
Durham, NC Community Assessment
Tasks:
• Questionnaire design
• Interviewer training
• Interviewing
• Facilitating focus groups
• Analyzing data
• Presenting data to the community
Survey Instrument Question Examples
Life in the Community
• What do people in Bragtown do for recreation?
• What types of religion are practiced in Bragtown?
• What do people in Bragtown do for a living?
• What political or government organizations exist in Bragtown?
• What different cultural and ethnic groups live in Bragtown?
• How do these different groups interact? Do they get along?
Community Assets
• What do you like about Bragtown?
• What are some organizations within your community that positively affect you or your community?
  – Probe: What about political groups, environmental groups, church groups?
• Who are the individuals within your community that you feel are positive leaders or role models?
  – Probe: Any others?
Interviewer Training
• Active listening skills
• Showing empathy
• Using probes
• Practice interviewing, not just reading questions
Face-to-face Interviews: Challenges and Solutions
• Challenge: Hard to find people at home
  Solution: Schedule a time ahead by phone, or stop by and schedule a more convenient time
• Challenge: People may not want to invite a stranger into their home
  Solution: Use the skills you learned in interviewer training to gain trust
• Challenge: Costly and time-intensive method of interviewing
  Solution: Have someone on staff train others on interviewing techniques; carpool; set time limits
Lessons Learned
1. Study community demographics and characteristics before you interview
2. Train interviewers before an immediate need arises
3. People like to tell you their stories; this could lead to relevant information
Telephone Interviews
2004 E. coli Outbreak Investigation
E. coli Outbreak Investigation – Telephone Interviews
• Illness onset October – November 2004
• Geographically dispersed cases in multiple states
• Case-control study
• Train-the-trainer interviewer training
E. coli Outbreak Investigation Telephone Interviews
• Three to six interviewers
• Calls made between 3 p.m. and 8 p.m.
• Quality control through one central interviewing location
• News coverage piqued people’s interest in the outbreak investigation
Lessons Learned
1. Practice reading through the questions and conducting an interview.
2. The media can be your friend.
3. Use an introductory script to convey the purpose of the phone call quickly:
   – Identify and legitimize the interviewer
   – State the reasons for conducting the survey
   – Assure respondents that their answers will be confidential
Summary

Face-to-face (Community Assessment)
• Establish rapport
• Identify people in a small geographic area
• Assess the environment of the area

Telephone (Outbreak Investigation)
• News coverage helped in recruiting people to participate
• Widely distributed sample
• We had the phone numbers of all the people who pre-bought tickets
Session Summary
• Questionnaire design and interview methods are interrelated in the overall process of an outbreak investigation.
• The primary purpose of interviews in outbreak investigations is to collect data for case identification, risk factor identification, or hypothesis generation.
Session Summary
• Interview methods can be interviewer administered (face-to-face or telephone) or self administered (mailed, emailed, or Web-based). There are advantages and disadvantages to each method.
• Sampling is the systematic selection of a representative portion of the larger source population to be interviewed. If the purpose of your study is to determine the point source of infection, you may be able to interview a smaller sample; if the purpose of your study is to calculate an attack rate, you may need to interview a larger sample.
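To make the attack-rate point above concrete, here is a minimal sketch in Python. All counts are hypothetical numbers invented for illustration, not data from any actual investigation:

```python
# Hypothetical interview tallies, invented for illustration only.
ill = 42       # interviewed attendees meeting the case definition
exposed = 120  # interviewed attendees who reported the exposure

# Attack rate = ill / exposed, usually reported as a percentage
attack_rate = ill / exposed
print(f"Attack rate: {attack_rate:.1%}")  # prints "Attack rate: 35.0%"
```

Because the attack rate is estimated from the interviewed sample, a larger sample narrows the uncertainty around this percentage, which is why attack-rate studies typically require more interviews than point-source investigations.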
Session Summary
• Survey response rates measure the percentage of your sample that participated in your survey. Average response rates vary from as little as 56% for mailed surveys to 75% for face-to-face surveys.
• Non-response to surveys can result from no one being home, refusal to participate, or an individual’s inability to participate (e.g., because of a language barrier or a physical or mental condition).
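The response-rate bookkeeping described above can be sketched as a short Python calculation; every number here is a hypothetical example, not data from the session:

```python
# Hypothetical survey tallies, for illustration only.
sample_size = 200   # people selected for interview
completed = 130     # completed interviews
not_home = 40       # no one home after repeated attempts
refused = 20        # declined to participate
unable = 10         # could not participate (e.g., language barrier)

# Every sampled person should fall into exactly one disposition category.
assert completed + not_home + refused + unable == sample_size

response_rate = completed / sample_size
print(f"Response rate: {response_rate:.0%}")  # prints "Response rate: 65%"
```

Tracking non-response by category (not home, refused, unable) also tells you where follow-up effort, such as callbacks or translated materials, would raise the rate most.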
Session Summary
• Survey data collection error is a result of both bias and variance in the interview process.
• Interviewer error can be prevented with adequate interviewer training and the standardization of survey instruments.
Session Summary
• Develop and distribute an interviewer manual to provide interviewer support. Such documentation reduces error and enhances the quality of the data collected.
• Sound interviewing procedures include: reading questions exactly as they are worded; probing inadequate answers; recording answers without interviewer discretion; and maintaining rapport with respondents.
Next Session
November 3rd, 1:00 p.m. - 3:00 p.m.
Topic: “Analyzing Data”
References and Resources
1. American Statistical Association (1997). What Is a Survey? More About Mail Surveys. Alexandria, VA: Section on Survey Research Methods, American Statistical Association.
2. American Statistical Association (1997). What Is a Survey? How to Collect Survey Data. Alexandria, VA: Section on Survey Research Methods, American Statistical Association.
3. Centers for Disease Control and Prevention (2005). Outbreak Management System Demonstration Web site. http://www.cdc.gov/phin/software-solutions/oms/index.html
References and Resources
4. Fowler, F. and Mangione, T. (1990). Standardizing Survey Interviewing. Newbury Park, CA: Sage Publications.
5. Gregg, M. (ed.) (1996). Field Epidemiology. New York: Oxford University Press.
6. Holstein, J.A. and Gubrium, J.F. (1997). Active Interviewing. In Silverman, D. (ed.), Qualitative Research: Theory, Method, and Practice. London: Sage Publications, pp. 113-129.
7. Last, J.M. (2001). A Dictionary of Epidemiology, 4th Edition. New York: Oxford University Press.
References and Resources
8. Levy, P. and Lemeshow, S. (1991). Sampling of Populations. John Wiley & Sons.
9. Ramsey, S. et al. (2005). Using GIS and GPS to Improve Public Health Response. Guilford County, NC Health Department Public Health Regional Surveillance Team 5.
10. Rubin, H.J. and Rubin, I.S. (1995). Interviews as Guided Conversations. In Qualitative Interviewing: The Art of Hearing Data. Sage Publications, pp. 1-16, 122-144.
11. Salant, P. and Dillman, D. (1994). How to Conduct Your Own Survey. John Wiley & Sons.
References and Resources
12. Stehr-Green, J.K. (2002). Gastroenteritis at a University in Texas: Case Study Instructor’s Guide. Atlanta, GA: U.S. Department of Health and Human Services, Public Health Service, Centers for Disease Control and Prevention.
13. U.S. Census Bureau (2005). Profile of Selected Housing Characteristics by State: Census 2000 Summary File 3. http://factfinder.census.gov/servlet/QTTable?_bm=n&_lang=en&qr_name=D
14. Weiss, R.S. (1994). Learning from Strangers: The Art and Method of Qualitative Interview Studies. New York: The Free Press.
15. Wiggins, B. and Deeb-Sossa, N. (2000). Conducting Telephone Surveys. Chapel Hill, NC: Odum Institute for Research in Social Science.