Methods of Education Research
Most Common Methods, Controversies, and Ethical Considerations Related to Methods and Research Instruments
Olatokunbo S. Fashola, Ph.D., Adjunct Research Scientist, Faculty Associate, Johns Hopkins University
Different Types of Research
• Basic Research
• Applied Research
• Evaluation Research
• Action Research
Additional types of research
• Historical Research
• Descriptive Research
• Correlational Research
• Causal Comparative Research / Quasi-Experimental Research
• Experimental Research
Who are your "people?"
• What is your population of interest?
  – To whom would you like the results to be generalized?
  – How do you select your sample in a way that enables you to generalize the results to this population?
  – What do you wish to generalize from this sample?
  – What are some ways of selecting an appropriate population?
How and why are you sampling?
• Random Sampling
• Stratified Sampling
• Cluster Sampling
• Systematic Sampling
• Some challenges to sampling include:
  – Sampling bias
  – Sample size
  – Population definition
  – Self-selection
  – Snowballing
  – Use of available (convenience) groups
• How will you inform your funders that you have addressed these challenges/threats?
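The four sampling schemes above can be sketched with the standard library. The roster, strata, and sample sizes below are all hypothetical, invented only to show the mechanics of each draw:

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical sampling frame: 200 students tagged with one of four grade levels.
population = [{"id": i, "grade": i % 4} for i in range(200)]

# Simple random sample: every student has an equal chance of selection.
simple = random.sample(population, 20)

# Stratified sample: draw 5 students from each grade so every stratum is represented.
stratified = []
for grade in range(4):
    stratum = [s for s in population if s["grade"] == grade]
    stratified.extend(random.sample(stratum, 5))

# Cluster sample: randomly pick one intact grade and take everyone in it.
cluster_grade = random.choice(range(4))
cluster = [s for s in population if s["grade"] == cluster_grade]

# Systematic sample: every k-th student from a random starting point.
k = len(population) // 20
start = random.randrange(k)
systematic = population[start::k]

print(len(simple), len(stratified), len(cluster), len(systematic))
```

Note how the designs trade convenience for representativeness: the cluster draw is cheapest to field (one intact group) but leaves three grades unobserved, which is exactly the kind of sampling bias the slide warns about.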
Research Instruments
• Assessments to be administered to determine effectiveness or impact?
• Developer-created versus externally developed?
• Standardized Tests
  – Things to look for include:
  – Validity (define validity)
    » Construct, content, item, concurrent, sampling, face, predictive
  – Reliability (define reliability)
    » Inter-rater, test-retest, equivalent forms, split-half
• Observation
  – Questionnaires
  – Surveys
• Focus Groups
Ethical considerations
• Right to refuse to be involved
• Right to stop being involved
• Strategies for achieving and maintaining support from participants (schools, universities)
• Training others to implement the treatment
• No harming of students (minimal risk)
• Subject's right to privacy
• Parental consent
• Collecting data without permission
• Sharing of data
Ethical Considerations
• Refusal of treatment to participants who may need it
• Advocacy versus research
• Gould Experiment
• Tuskegee (syphilis study)
• AIDS study (overseas)
• Aspirin and Heart Attack study
• When is enough enough?
  – Zimbardo Prison Study
  – Milgram Study
Two pieces of legislation on ethics
• National Research Act of 1974
  – Approval of the study by an external organized group (IRB) prior to implementation of the study
    » No harm
    » Informed consent
    » Parental or guardian permission (signatures)
• Family Educational Rights and Privacy Act (Buckley Amendment)
  – Privacy of students' educational records
    » e.g., recoding student ID numbers
    » Masking direct access to student records
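Recoding student ID numbers, as the FERPA bullet suggests, amounts to replacing direct identifiers with arbitrary study codes and keeping the linking key out of the research file. The roster, ID format, and code scheme below are all hypothetical:

```python
import random

# Hypothetical roster of direct identifiers that must not leave the school.
roster = ["JH-1042", "JH-1187", "JH-2010"]

random.seed(42)  # fixed seed for a reproducible illustration

# Assign each real ID a random, non-reversible study code.
codes = random.sample(range(10_000, 100_000), len(roster))
key = dict(zip(roster, (f"S{c}" for c in codes)))  # linking key stays at the school

# The dataset the researcher receives carries only the study codes.
deidentified = [key[sid] for sid in roster]
print(deidentified)
```

The essential design choice is that the `key` dictionary is stored separately from the analysis file, so the researcher can work with the data while direct access to student records stays masked.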
Descriptive Research
• Why conduct descriptive research?
  – How do some of these procedures differ from, or how are they similar to, those of quantitative researchers?
  – Test hypotheses and answer questions
  – Develop an appropriate instrument for gathering information
  – Self-reports may include:
    » Questionnaires, interviews, standardized attitude scales (Likert)
Qualitative Research
• Observation
  – Naturalistic
  – Role playing
  – Case studies
  – Content analyses (e.g., portfolios)
• Participant Observation
  – Researcher is embedded in the study
    » Uri Treisman
    » Slim's Cafe
Ethnography
• Classroom-based
• School-based
• Enclosed-group-based
• Any of the above
Correlational Studies
• Why conduct correlational studies?
  – Positive
  – Negative
  – Zero
• When would it be good to have a positive correlation?
• When would it be good to have a negative correlation?
• When would it be good to have a zero correlation?
Causal Comparative or Quasi-Experimental Designs
• Experimental Group
• Control Group
• Independent Variable (manipulated?)
• Dependent Variable
  – Controlling using matching
  – Comparing subgroups
  – Co-varying
Randomization
• Both groups are equal at the outset of the study
• Causal relationship between two variables
• Treatment and comparison/control group
• Treatment versus no treatment
• Treatment A versus Treatment B
• Exposure for a predetermined amount of time
• Post-test
• So what sets this apart from quasi-experimental designs?
• Stay tuned
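Random assignment, the step that makes the two groups equal in expectation at the outset, can be sketched in a few lines. The pool of consented students is hypothetical:

```python
import random

# Hypothetical pool of 40 consented students.
students = [f"student_{i:02d}" for i in range(40)]

random.seed(7)  # fixed seed for a reproducible illustration

# Shuffle the pool, then split it in half: each student has the same
# chance of landing in either condition, so any pre-existing differences
# are spread evenly across the two groups in expectation.
random.shuffle(students)
treatment = students[:20]
control = students[20:]

print(len(treatment), len(control))
```

This coin-flip assignment is exactly what quasi-experimental designs lack: there, groups are pre-formed (intact classrooms, volunteers), so matching or covarying must substitute for the equivalence that randomization buys.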
Direct power over independent variables
• Direct Manipulation
  – Manipulation of at least one independent variable; direct and intentional
• Direct Control
  – Direct control over what is provided to each group
• Control of subject variables
  – Pretest scores, pretest performance
• Control of environmental variables
  – Curriculum materials, length of exposure, etc.
What's the buzz?
• Effect Sizes
• Evidence of Effectiveness
• Impact
• Power Analysis
• Type I and Type II Errors
• Minimum Detectable Effects
• Intraclass Correlations (ICC)
• Counterfactual (what would have happened had we not provided this intervention or treatment?)
• Value-Added Models
• Stable Unit Treatment Value Assumption (SUTVA)
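Of the terms above, the effect size is the easiest to make concrete. Cohen's d divides the treatment-control mean difference by the pooled standard deviation; the post-test scores below are invented:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical post-test scores for treatment and control groups.
treatment = [78, 85, 74, 90, 82, 88, 76, 84]
control = [70, 75, 68, 80, 72, 77, 66, 74]

# Pooled standard deviation across the two groups.
n1, n2 = len(treatment), len(control)
s1, s2 = stdev(treatment), stdev(control)
pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))

# Cohen's d: mean difference expressed in pooled-SD units.
d = (mean(treatment) - mean(control)) / pooled_sd
print(round(d, 2))
```

Because d is in standard-deviation units rather than raw points, it lets funders compare impact across studies that used different tests; a power analysis then asks how large a sample is needed to detect an effect of that size.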
Any problems with Experimental Designs?
• WHAT? Problems? What problems?
• Threats!!!
• You threatening me?
• Er, no, threats to internal and external validity
• Internal Validity
  – History, maturation, testing, instrumentation, statistical regression, differential selection, mortality
• External Validity
  – Pretest-treatment interaction, multiple-treatment interference, experimenter effects, reactive arrangements (Hawthorne effect)