This document provides an introduction to research methods. It discusses why understanding research methods is important for interdisciplinary researchers and outlines different types of research such as quantitative, qualitative, and mixed methods. It also discusses how to experimentally measure learning, including within and between subjects designs. The document provides examples of how to design studies to obtain desired results and addresses important statistical concepts like independent and dependent variables. It raises considerations for survey design and cautions about assumptions of parametric statistics.
2. Research Methods
Why is it important to understand research methods for interdisciplinary researchers?
Types of research
How do you measure learning experimentally?
Within subjects design
Between subjects design
What are you measuring?
Important statistical terms
[other methods]
3. Research methods as boundary object in interdisciplinary research
Study of cross-disciplinary research collaboration (Mercier, Penuel, Remold, Villalba, Kuhl & Romo, under review)
The Bilingual Baby Project
4-year longitudinal study
Neuroscience, cognitive science, sociology
How does growing up in a bilingual environment influence cognitive development & school readiness?
Biggest issue was sample selection (methods)
4. … [a child] was brought back for me to test, and I was putting [an ERP] cap on and then his mom said, well, he’s used to this from going to the neurologist … and then she told me that he has epilepsy. And I thought to myself, well, okay, I’m wasting the next two hours because I’m not going to be able to use the data… A quantitative researcher would never include a child that—you wouldn’t even waste the time and resources.
5. It’s difficult because you do form close relationships with the families… So it was then much harder for our group to eliminate them from the pool, which we didn’t. We have followed up, we’ve gone and interviewed her, and it would be very interesting to see what kind of school readiness skills that child has and what kind of problems that mother has encountered. I know that they wanted children who had no other disabilities in order to focus on their language acquisition, but not all children are of this type.
6. Two Types of Research
Quantitative
Experimental
Surveys (usually)
Qualitative
Biography, phenomenology, grounded theory, ethnography & case study
Mixed methods
“You can’t account for context with numbers”
“The plural of anecdote is not data”
7. What do you want your data & results to look like?
Do you want to show learning, engagement, the process of the activity?
Do you want to show that your tool works, or that it is better than an alternative?
Do you want to describe, code, run statistics, present a case study?
How will you design the study to get the type of results you want to present?
8. How do you measure learning experimentally?
Ecology lesson
- Aim: to teach 5-year-olds about complex systems
- Ten 1-hour sessions over 5 weeks
- “Embodied curriculum”
- Technology; dancing; drawing
[Chart: score on test (0–10), pre vs post]
No way to know whether it was the curriculum, or just being taught, that led to learning
CLAIM: Embodied curriculum is a good way to teach complex systems
9. How do you measure learning experimentally?
(and know that it’s because of what you did…)
Pre/post test design (within subjects)
Sequestered problem solving (SPS)
Preparation for future learning (PFL)
Free-write/free recall
Delayed post-test
Multiple baseline/single case design
10. How do you measure learning experimentally?
(and know that it’s because of what you did…)
Control design (between subjects)
Only some of the participants receive the intervention; compare post-test scores between control & experimental groups
Compare ‘experimental group’ with previous groups who did not receive the intervention
Randomized control trials (the medical model)
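The two designs above differ in what gets compared: each participant’s own pre/post change (within subjects) versus post-test scores across separate groups (between subjects). A minimal sketch with invented scores, just to make the two comparisons concrete:

```python
from statistics import mean

# Hypothetical test scores (0-10); all numbers are invented for illustration
pre  = [3, 4, 2, 5, 3]   # same children before the lessons
post = [7, 8, 6, 9, 7]   # same children after the lessons

# Within-subjects: each child serves as their own baseline
gains = [b - a for a, b in zip(pre, post)]
print("mean pre/post gain:", mean(gains))

# Between-subjects: compare the intervention group's post-test scores
# with a control group that was taught without the intervention
control_post = [4, 5, 3, 6, 4]
print("group difference:", mean(post) - mean(control_post))
```

Note that the within-subjects gain still cannot rule out “just being taught” as the cause; that is exactly what the control group in the between-subjects comparison is for.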
11. 2x2 design

                          Recall condition
                          On land    Underwater
Learning    On land          20          20
condition   Underwater       20          20

Godden & Baddeley (1975)
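In a 2x2 design like Godden & Baddeley’s, the interesting result is the interaction: recall was best when the learning and recall environments matched. A sketch of computing cell means and a simple interaction contrast; the scores below are invented, not the study’s actual data:

```python
from statistics import mean

# Hypothetical words-recalled scores per (learning, recall) cell
cells = {
    ("land", "land"):   [12, 13, 11, 14],
    ("land", "water"):  [8, 7, 9, 8],
    ("water", "land"):  [7, 8, 6, 9],
    ("water", "water"): [11, 12, 13, 12],
}

cell_means = {k: mean(v) for k, v in cells.items()}
for (learn, recall), m in cell_means.items():
    print(f"learned {learn}, recalled {recall}: mean = {m}")

# Interaction contrast: average of matching cells minus mismatching cells
match    = cell_means[("land", "land")] + cell_means[("water", "water")]
mismatch = cell_means[("land", "water")] + cell_means[("water", "land")]
print("context-match advantage:", (match - mismatch) / 2)
```

A formal analysis would use a two-way ANOVA to test the main effects and the interaction; the contrast above just shows what the interaction is measuring.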
12. But what are you measuring?
Prerequisites for maths course
- Is maths101 necessary to pass stats202?
- Half of students had taken maths101
- All students take stats202
- Contrast outcomes
[Chart: mean percent on post-test (0–100), Maths101 vs No Maths101]
Floor effect: either the post-test didn’t measure the content or very little was learned from stats202.
CLAIM: No need to take maths101 before taking stats202
13. But what are you measuring… Is it valid?
Internal validity (is the effect caused by the IV?)
External validity (would it replicate beyond the sample?)
Is it reliable?
Test-retest reliability
Split-half reliability
[piloting]
15. Important statistical ideas
Independent variables
The thing you manipulate/control for
Main effects & interaction effects between IVs
Dependent variables
The outcome measure
Floor effects & ceiling effects
Statistical significance
Usually p < .05 in social science
The probability of seeing an effect this large by chance alone, if there were no true effect
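One simple way to see what a p-value means is a permutation test: if the group labels were irrelevant (the null hypothesis), shuffling them should produce differences as large as the observed one fairly often. A sketch with invented data:

```python
import random
from statistics import mean

# Hypothetical post-test scores for two groups (invented data)
control      = [10, 11, 12, 13, 14]
experimental = [20, 21, 22, 23, 24]
observed = abs(mean(experimental) - mean(control))

random.seed(0)                 # fixed seed so the estimate is reproducible
pooled = control + experimental
n_extreme = 0
n_perms = 5000
for _ in range(n_perms):
    random.shuffle(pooled)     # pretend the labels are arbitrary
    diff = abs(mean(pooled[:5]) - mean(pooled[5:]))
    if diff >= observed:
        n_extreme += 1

p_value = (n_extreme + 1) / (n_perms + 1)  # small-sample correction
print("estimated p =", p_value)
```

With groups this well separated almost no shuffle reproduces the observed gap, so the estimated p-value is well below .05. T-tests and ANOVA reach the same kind of conclusion analytically rather than by shuffling.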
16. Important statistical ideas
Level of measurement
Nominal
Ordinal
Interval (& ratio)
Population & sample
Normal distribution
Descriptive statistics
Means, standard deviations, standard errors
Parametric and non-parametric statistics
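The descriptive statistics above are one-liners in Python’s standard library; the only one worth spelling out is the standard error, which is the standard deviation divided by the square root of the sample size. A sketch with invented scores:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical sample of test scores
sample = [4, 6, 5, 7, 8, 6, 5, 7]

m  = mean(sample)
sd = stdev(sample)           # sample standard deviation (n - 1 denominator)
se = sd / sqrt(len(sample))  # standard error of the mean
print(f"mean = {m}, sd = {sd:.3f}, se = {se:.3f}")
```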
17. Assumptions for parametric statistics
Level of measurement must be at least interval
Sample is drawn from a normally distributed population
Homogeneity of variance
The variance of the two samples is not significantly different
Independence of scores
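Homogeneity of variance is easy to eyeball before running a parametric test. A minimal sketch, using a common rule of thumb rather than a formal test (the data and the 4x cutoff are illustrative; Levene’s test is the standard formal check):

```python
from statistics import variance

# Hypothetical scores from two conditions (invented data)
group_a = [5, 6, 7, 8, 9]
group_b = [4, 6, 8, 10, 11]

va, vb = variance(group_a), variance(group_b)
ratio = max(va, vb) / min(va, vb)
print(f"variances: {va}, {vb}; ratio = {ratio:.2f}")

# Rule of thumb: if the larger variance is more than about 4x the smaller,
# the homogeneity assumption is doubtful and a formal check or a
# non-parametric test is the safer route.
print("homogeneity plausible:", ratio < 4)
```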
19. Key things to look for:
Are the differences between conditions significant?
T-tests, ANOVA, Chi-square
Is there a relationship between variables?
Correlations (note: you can’t infer causation from these)
Pay attention to r values (between -1 and 1).
Which of the IVs cause the DV?
Regression analysis (note: needs a very large sample size; controversial technique)
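The r value mentioned above is the Pearson correlation coefficient, which can be computed directly from its definition. A sketch with invented paired measurements:

```python
from statistics import mean, stdev

# Hypothetical paired measurements (invented data)
hours_studied = [1, 2, 3, 4, 5, 6]
exam_score    = [52, 55, 61, 60, 68, 70]

def pearson_r(x, y):
    """Pearson correlation: normalized sum of cross-deviations (n - 1 form)."""
    mx, my = mean(x), mean(y)
    sx, sy = stdev(x), stdev(y)
    n = len(x)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((n - 1) * sx * sy)

r = pearson_r(hours_studied, exam_score)
print(f"r = {r:.3f}")  # always between -1 and 1; says nothing about causation
```

An r near 1 here only shows the two variables move together; it cannot tell you whether studying causes higher scores, or whether a third variable drives both.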
20. Survey design
In the past 24 hours, did you watch more than an hour of television programming, or not? Yes/No
In the past 24 hours, did you read a daily newspaper, or not? Yes/No
On a scale of 1 to 7, please rate: How satisfied were you with what you learned and the usability of the software? (1) Agree strongly … (7) Disagree strongly
21. Survey design (things to remember)
Is there only one question in each item?
Pilot with a number of people – do they read the question the way it was intended?
Are all your scales in the same direction? (If not, reverse them before analysis.)
Do the answers match the questions?
How will you make sense of the answers?
What sort of analysis can you do on rating, frequency, and open-ended items?
Are particular answers ‘socially desirable’?
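Reversing negatively-worded items before analysis, as the checklist above suggests, is a one-line transformation on a 1-to-k rating scale. A minimal sketch, assuming a 7-point scale (the responses are invented):

```python
# Reverse-score a negatively-worded item so all scales run the same direction
SCALE_MAX = 7

def reverse(score: int) -> int:
    """Map 1 <-> 7, 2 <-> 6, ... on a 1..SCALE_MAX rating scale."""
    return SCALE_MAX + 1 - score

responses = [1, 7, 4, 2]  # hypothetical raw answers to a reversed item
print([reverse(s) for s in responses])
```

The midpoint maps to itself, so reversing twice returns the original score, which is an easy sanity check when cleaning survey data.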