UNDERSTANDING DATA AND WAYS
TO SYSTEMATICALLY COLLECT DATA
Practical Research 2
DESCRIPTIVE RESEARCH DESIGN

The purpose of this design is to describe the status of an identified variable, such as events, people, or subjects, as they exist. It usually involves some type of comparison, contrast, or correlation, and sometimes, in carefully planned and orchestrated descriptive studies, cause-and-effect relationships may be established to some extent.
SAMPLING
Sampling is the process of getting information from a proper subset of a population. The fundamental purpose of all sampling plans is to describe the population characteristics through the values obtained from a sample as accurately as possible. It is therefore evident that if one were to draw conclusions from a small sample, the sample must imitate the behavior or characteristics of the original population as closely as possible.
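As a minimal sketch of drawing such a subset (the population size and sample size here are hypothetical, not from this lesson), simple random sampling in Python gives every member an equal chance of selection:

    import random

    # Hypothetical sampling frame: 500 students identified by ID number.
    population = list(range(1, 501))

    random.seed(42)                           # fix the seed so the draw is reproducible
    sample = random.sample(population, k=50)  # a proper subset: 50 of 500, no repeats

    print(sorted(sample)[:10])                # first few selected IDs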
SAMPLING PLAN

A sampling plan is a detailed outline of which measurements will be taken, at what times, on which material, in what manner, and by whom, in support of the purpose of an analysis. Sampling plans should be designed in such a way that the resulting data contain a representative sample of the parameters of interest and allow all questions stated in the research objectives to be answered.
Steps in Developing a Sampling Plan
1. Identify the parameters to be measured, the range of possible values, and the required resolution.
2. Design a sampling scheme that details how and when samples will be taken.
3. Select sample sizes (one common way of doing this is sketched after this list).
4. Design data storage formats.
5. Assign roles and responsibilities.
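For step 3, sample sizes are often chosen with a standard formula. As an illustration only (this formula is not part of the original outline), Cochran's formula for estimating a proportion, with a finite population correction:

    import math

    def cochran_sample_size(N, margin_of_error=0.05, z=1.96, p=0.5):
        """Sample size for estimating a proportion at 95% confidence.

        N is the population size; p = 0.5 is the most conservative guess.
        """
        n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population size
        n = n0 / (1 + (n0 - 1) / N)                         # finite population correction
        return math.ceil(n)

    print(cochran_sample_size(N=500))  # about 218 respondents for a population of 500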
INSTRUMENTS

Instruments are the data-gathering devices that will be used in the study. An instrument is a testing device for measuring a given phenomenon, such as a paper-and-pencil test, a questionnaire, an interview, a research tool, or a set of guidelines for observation.
CATEGORIES OF INSTRUMENTS

Researcher-Completed Instruments: rating scales, interview schedules/guides, tally sheets, flowcharts, performance checklists, time-and-motion logs, observation forms

Subject-Completed Instruments: questionnaires, self-checklists, attitude scales, personality inventories, achievement/aptitude tests, projective devices, sociometric devices
VALIDITY

Validity refers to the extent to which the instrument measures what it intends to measure and performs as it is designed to perform.

Types of Validity:
1. Face Validity – the simplest form of validity: just by looking at the test or measure, it appears to measure what it is supposed to measure.
2. Content Validity – the extent to which a research instrument accurately measures all aspects of a construct.
3. Construct Validity – the extent to which a research instrument or tool measures the intended construct.
4. Criterion Validity – the extent to which a research instrument is related to other instruments that measure the same variables.
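Criterion validity, for instance, is often checked by correlating scores on the new instrument with scores on an established instrument measuring the same variable. A minimal sketch with hypothetical scores (not data from this lesson):

    from scipy.stats import pearsonr

    # Hypothetical scores of the same 8 respondents on a new anxiety scale
    # and on an established anxiety instrument used as the criterion.
    new_scale = [22, 35, 28, 40, 19, 31, 26, 37]
    criterion = [25, 38, 27, 42, 21, 30, 28, 36]

    r, p_value = pearsonr(new_scale, criterion)
    print(f"criterion validity r = {r:.2f} (p = {p_value:.3f})")  # high r supports validity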
RELIABILITY

Reliability tells how consistently a method measures something.

Attributes of Reliability:
1. Test-Retest Correlation – the consistency of results when you repeat the same test on the same sample at a different point in time.
2. Interrater – the consistency of results when the same test is conducted by different people.
3. Parallel Forms – the correlation between two equivalent versions of a test. You use it when you have two different assessment tools or sets of questions designed to measure the same thing.
4. Internal Consistency – the extent to which all the items on a scale measure one construct.
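Two of these attributes can be computed directly. The sketch below uses hypothetical item scores (not data from this lesson); Cronbach's alpha, shown here, is one standard coefficient of internal consistency:

    import numpy as np

    def test_retest_correlation(first, second):
        # Pearson correlation between total scores from two administrations.
        return np.corrcoef(first, second)[0, 1]

    def cronbach_alpha(items):
        # Internal consistency: items is an (n_respondents, n_items) array.
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical scores: 5 respondents answering a 4-item scale twice.
    time1 = np.array([[4, 5, 4, 4], [2, 3, 2, 3], [5, 5, 4, 5], [3, 3, 3, 2], [4, 4, 5, 4]])
    time2 = np.array([[4, 4, 4, 5], [3, 3, 2, 2], [5, 4, 5, 5], [2, 3, 3, 3], [4, 5, 4, 4]])

    print("Test-retest r:", test_retest_correlation(time1.sum(axis=1), time2.sum(axis=1)))
    print("Cronbach's alpha:", cronbach_alpha(time1))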
SOURCES OF DATA

Primary Sources – also known as primary or raw data. These are data obtained from your own research, such as surveys, observations, and interviews.

Secondary Sources – also known as secondary data. These are data obtained from secondary sources such as reports, books, journals, documents, magazines, the internet, and more.
DATA COLLECTION METHODS

1. INTERVIEWS
Kinds of Interview:
a. Structured Interview – the researcher asks a standard set of questions and
nothing more. The interview follows a specific format with the same line of
questioning. The aim of this approach is to ensure that each interview is
presented with exactly the same questions in the same order.
b. Face-to-Face Interview – the most frequently used. It can be conducted in the respondent’s home or workplace, in halls, or even simply in the street.
c. Telephone Interview – less time-consuming and less expensive. The researcher has ready access to anyone who has a telephone.
d. Computer-Assisted Personal Interviewing – is a form of personal
interview but instead of completing a questionnaire, the interviewer brings
along a laptop or handheld computer to enter the information directly into
the database.
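To make computer-assisted personal interviewing concrete, here is a minimal sketch (the table, question, and database file are made up for illustration) of an interviewer's laptop writing an answer directly into a local database:

    import sqlite3

    # Hypothetical CAPI entry: the answer goes straight into a database
    # instead of onto a paper questionnaire.
    conn = sqlite3.connect("survey.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS responses (
                        respondent_id INTEGER,
                        question TEXT,
                        answer TEXT)""")

    answer = input("Q1. How many hours do you study per day? ")
    conn.execute("INSERT INTO responses VALUES (?, ?, ?)", (1, "Q1", answer))
    conn.commit()
    conn.close()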
2. QUESTIONNAIRES
Five Sections:
a. Respondent’s Identification Data – includes the respondent’s name, address, the date of the interview, and the name of the interviewer.
b. Introduction – the interviewer’s request for help. It is normally scripted and lays out the credentials of the market research company, the purpose of the study, and any aspects of confidentiality.
c. Instructions – directions for the interviewer and the respondent on how to move through the questionnaire, such as which questions to skip and where to move if certain answers are given.
d. Information – the main body of the document, made up of the many questions and response codes.
e. Classification Data and Information – establishes the important characteristics of the respondent, particularly their demographics, which are placed sometimes at the front of the questionnaire and sometimes at the end.
TYPES OF QUESTIONNAIRES:
a. Paper-and-Pencil Questionnaire – can be sent to a large number of people and saves the researcher time and money.
b. Web-Based Questionnaire – a new and steadily growing methodology that uses internet-based research.
c. Self-Administered Questionnaire – generally distributed through mail, filled out and completed by the respondents themselves, and returned via email to the researcher.
3. OBSERVATIONS – a way of gathering data by watching behavior and events, or by noting physical characteristics, in their natural setting.
Kinds of Observations:
a. Overt – when everyone knows they are being observed.
b. Covert – when no one knows they are being observed and the observer is concealed.
4. TESTS – provide a way to assess subjects’ knowledge and their capacity to apply this knowledge to new situations.
Kinds of Tests:
a. Norm-referenced tests – provide information on how the target performs against a reference group
or normative population.
b. Criterion-referenced tests – constructed to determine whether or not the respondents/subjects
have attained mastery of a skill or knowledge area.
c. Proficiency test – provides an assessment against a level of skill attainment, but includes standards
for performance at varying levels of proficiency.
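The difference between the first two kinds can be shown in a few lines. In this sketch, the scores and the mastery cutoff of 75 are hypothetical: a norm-referenced result is a percentile rank against the reference group, while a criterion-referenced result is mastery against a fixed standard.

    # Hypothetical reference-group scores and a hypothetical mastery cutoff of 75.
    reference_group = [55, 60, 62, 68, 70, 74, 78, 80, 85, 90]
    student_score = 78

    # Norm-referenced: where does the student stand relative to the group?
    percentile = 100 * sum(s < student_score for s in reference_group) / len(reference_group)
    print(f"Norm-referenced: {percentile:.0f}th percentile of the reference group")

    # Criterion-referenced: did the student reach the fixed mastery standard?
    print("Criterion-referenced:", "mastery" if student_score >= 75 else "not yet mastery")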
5. SECONDARY DATA – a type of quantitative data that has already
been collected by someone else for a purpose different from yours.
These data are collected by researchers, government and private agencies, institutions, organizations, or companies, and they provide important information for government planning, policy recommendations, and theory generation.
a. Paper-based sources – are those from books, journals,
periodicals, abstracts, indexes, directories, research reports,
conference papers, market reports, annual reports, internal
records of organizations, newspapers and magazines.
b. Electronic sources – are those from CD-ROMs, on-line databases,
internet, videos and broadcasts.
Pointers to Remember in Reporting the Results:
• Explain the data you have collected, the statistical treatment and all
relevant results in relation to the research problem that you are
investigating.
• Describe unexpected events that occurred during your data
collection. Explain how the actual analysis differs from the planned
analysis. Explain how you handled the missing data and why any
missing data did not undermine the validity of your analysis.
• Explain the techniques you used to “clean” your data set.
• Choose a statistical tool, discuss its use, and provide a reference for it. Specify any computer programs or software used in the study.
• Describe well the assumptions for each procedure and the steps you
took to ensure that they were not violated.
• Provide the descriptive statistics, confidence intervals, and sample sizes for each variable (see the sketch after this list).
• Avoid inferring causality, particularly in non-randomized designs or without further experimentation.
• Use tables to provide exact values, and use figures to convey global effects. Keep figures small in size and include graphic presentations of confidence intervals whenever possible.
• Inform the reader what to look for in tables and figures.
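As noted in the pointer above, here is a minimal sketch, with hypothetical scores, of reporting descriptive statistics, 95% confidence intervals, and sample sizes for each variable:

    import numpy as np
    from scipy import stats

    def describe(name, values):
        # Report n, mean, SD, and a 95% confidence interval for one variable.
        values = np.asarray(values, dtype=float)
        values = values[~np.isnan(values)]      # drop missing data (report this step)
        n = values.size
        mean, sd = values.mean(), values.std(ddof=1)
        margin = stats.t.ppf(0.975, df=n - 1) * sd / np.sqrt(n)
        print(f"{name}: n={n}, M={mean:.2f}, SD={sd:.2f}, "
              f"95% CI [{mean - margin:.2f}, {mean + margin:.2f}]")

    # Hypothetical pretest and posttest scores for eight respondents.
    describe("Pretest", [12, 15, 11, 14, 13, 16, 10, 14])
    describe("Posttest", [15, 18, 14, 17, 16, 19, 13, 17])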
WRITING OF METHODOLOGY

Participants – describe the participants in your research study, including who they are, how many there are, and how they were selected. Explain how the samples were gathered, any randomization techniques used, and how the samples were prepared.
Example: The researchers randomly selected 100 children from elementary schools in Cebu City.
Materials – describe the materials, measures, equipment, or stimuli
used in your research study. This may include testing instruments,
technical equipment, books, images or other materials used in the
course of your study.
Example: Two stories from Sullivan et al.’s (1994) second-order false
belief attribution tasks were used to assess children’s understanding of
second-order beliefs.
Design – describe the research design used in your research study. Specify the variables as well as the levels and measurement of these variables. Explain whether your research study uses a within-groups or between-groups design. Discuss how the measurements were made and what calculations were performed on the raw data. Describe the statistical techniques applied to the data.
Example: The experiment used a 3x2 between-subjects design. The independent
variables were age and understanding of second-order beliefs.
Procedure – the details of the research procedures used in your research study should be properly explained. Explain what your participants/respondents did, how you collected the data, and the order in which the steps occurred. Observe ethical standards in gathering your data.
Example:
A researcher interviewed children individually in their school in one session that lasted 20
minutes on average. The researcher explained to each child that he or she would be told two
short stories and that some questions would be asked after each story. All sessions were
videotaped so the data could later be coded.
TIPS IN WRITING OF METHODOLOGY

• Always write the method section in the past tense. (Use the future tense if it is a research proposal.)
• Provide enough details that another researcher could replicate your experiment,
but focus on brevity. Avoid unnecessary detail that is not relevant to the outcome
of the experiment.
• Remember to use proper APA format.
• Go over a rough draft of your method section with your teacher or research adviser for additional assistance.
• Proofread your paper for typos, grammar problems, and
spelling errors. Do not just rely on computer spell checkers.
Always read through each section of your paper for agreement
with other sections. If you mention steps and procedures in the
method section, these elements should also be present in the
results and discussion sections.
