The document provides guidance on writing the methodology chapter of a research paper. It discusses the main components including the introduction, philosophical assumptions, research design, methodology, sampling, data collection, ethics, and validity and reliability. The introduction should establish the purpose and provide an outline. The philosophical assumptions section explains the research paradigm including ontology and epistemology. The research design discusses the approach, methodology, and research methods. Sampling, data collection, and analysis are also important to include. The methodology chapter must address ethics and ensure the validity and reliability of the research.
1. Writing the Methodology Chapter
Mengjie JIANG mjj16@le.ac.uk
School of Education Workshop, 8th of August 2017
2. Purpose of the methodology chapter
Explain your research approach (research questions/objectives/decisions)
Explain your rationale and why you took those decisions and actions
Explain the research procedure
3. Main Components of the Research Methodology Chapter
1. Introduction
2. Paradigm/philosophical assumptions
3. Research design/Methodology
4. Data collection procedure
5. Ethics
6. Validity and reliability
4. 1. Introduction
• Link to previous chapter
• Establish the purpose of this chapter
• Provide an outline
5. 2. Paradigm
Ontology (nature of social reality) → Epistemology (nature of social knowledge) → Paradigm → Research design/Methodology → Methods
Carey (2011, pp. 67-69); Bryman (2012, p. 6); Cohen et al. (2011, pp. 5-6)
6. Within social research, ontology is directly linked to epistemology because it helps generate questions to ask and therefore helps us to construct a research methodology (Holloway, 1997, p. 113, cited in Carey, 2011, p. 67).
Assumptions about the nature of social science:
The subjectivist approach to social science:
• Ontology: Constructionism/Nominalism
• Epistemology: Interpretivism/Anti-positivism
The objectivist approach to social science:
• Ontology: Objectivism/Realism
• Epistemology: Positivism/Normative
Bryman (2012, pp. 27-35); Cohen et al. (2011, p. 7); Crotty (1998, p. 64)
7. Philosophical Assumptions/Paradigm
Qualitative research:
• Ontological stance (nature of reality): multiple realities; subjectivity of reality; socially constructed reality
• Epistemological stance (nature of knowledge): subjective meaning of social action; close interaction between the knower and the known
• Paradigm: Interpretivism
• Strategy: Inductive
Quantitative research:
• Ontological stance (nature of reality): singular reality; objectivity of reality; external facts beyond our reach of influence
• Epistemological stance (nature of knowledge): objective and value free; the purpose of theory is to generate hypotheses that can be tested
• Paradigm: Positivism
• Strategy: Deductive
Bryman (2012, pp. 27-35)
8. Different worldviews and implications for practice
Postpositivism:
• Ontological stance (nature of reality): singular reality (e.g., researchers reject or fail to reject hypotheses)
• Epistemological stance (nature of knowledge): distance and impartiality (e.g., researchers objectively collect data on instruments)
• Methodology/Approach: deductive (e.g., researchers test an a priori theory)
Constructivism:
• Ontological stance: participants and researcher develop multiple realities through interaction
• Epistemological stance: closeness (e.g., researchers visit participants at their sites to collect data)
• Methodology/Approach: inductive (e.g., researchers start with participants' views and build up patterns, theories and generalisations)
Pragmatism:
• Ontological stance: reality is what is useful, practical and works; singular and multiple realities
• Epistemological stance: practicality (e.g., researchers collect data by "what works" to address research questions)
• Methodology/Approach: combining both quantitative and qualitative data
Participatory:
• Ontological stance: political reality (e.g., findings are negotiated with participants)
• Epistemological stance: collaboration (e.g., active involvement of participants in the study in constructing realities)
• Methodology/Approach: participatory (e.g., researchers involve participants in all stages of the research)
Creswell (2011, p. 42)
9. 3. Research design
'A research design is a logical plan for getting from here to there, where here may be defined as the initial set of questions to be answered, and there is some set of conclusions about these questions' (Yin, 2009, p. 26).
'Another way of thinking about a research design is as a "blueprint" for your research, dealing with at least four problems: what questions to study, what data are relevant, what data to collect, and how to analyse the results' (Philliber, Schwab & Samsloss, 1980, cited in Yin, 2009, p. 26).
10. 4. Research design
The approach: Qualitative, Quantitative, or Mixed Methods
With/Without a conceptual framework
Methodology
11. 4. Methodology
Ethnographic research:
• Main features: collects the ideas, beliefs and everyday behaviours of the members of a particular cultural group, by living with the people of that group
• Data collection methods: participant observation
Survey:
• Main features: seeks to gather large-scale data from as representative a sample of the population as possible, so that claims can be made with a measure of statistical confidence
• Data collection methods: questionnaire
Case studies:
• Main features: research a phenomenon by studying it in depth, reporting the real-life, complex, dynamic and unfolding interaction of events, human relationships and other factors
• Data collection methods: interviews; observation; documents; diaries
Action research:
• Main features: takes action to improve practice and studies the effect of the action that was taken
• Data collection methods: document analysis (e.g., student achievement data, diagnostic assessment, pre-test and post-test); observation; interviews
Cohen et al. (2011)
12. 4. Research design
Sampling:
• Population
• Sample size
• Access to sample
• The representativeness of the sample
• Sampling strategy (Cohen et al., 2011, p. 143)
Data collection methods:
• Discussion of data collection methods
• Discussion of how you developed your research instrument(s)
Research context
Pilot study: test and improve the instruments; revise them as needed
Procedure
Data analysis method:
• Quantitative data (e.g., questionnaire – descriptive statistics; inferential statistics)
• Qualitative data (e.g., interview – thematic analysis; narrative analysis)
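The quantitative strand of the data-analysis step above can be sketched in a few lines. This is a hypothetical illustration, not from the slides: the questionnaire scores, group labels and the choice of Welch's t statistic are my own assumptions, used only to show the descriptive/inferential distinction the slide draws.

```python
import math
import statistics

# Hypothetical Likert-scale questionnaire scores (1-5) for two groups
group_a = [4, 5, 3, 4, 4, 5, 3, 4]
group_b = [2, 3, 3, 2, 4, 3, 2, 3]

# Descriptive statistics: summarise each group's centre and spread
for name, scores in (("A", group_a), ("B", group_b)):
    print(f"Group {name}: mean={statistics.mean(scores):.2f}, "
          f"sd={statistics.stdev(scores):.2f}, n={len(scores)}")

# Inferential statistics: Welch's t statistic for the difference in means
mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
standard_error = math.sqrt(var_a / len(group_a) + var_b / len(group_b))
t = (mean_a - mean_b) / standard_error
print(f"Welch's t = {t:.2f}")
```

In practice a statistics package would also report degrees of freedom and a p-value; the sketch only shows that descriptive statistics summarise the data, while the inferential step asks whether the observed difference is larger than chance variation would suggest.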
13. 5. Ethics
The University of Leicester Ethical Appraisal Framework developed by Alison Fox and Hillary
(YouTube link: https://www.youtube.com/watch?v=j5U7kVr7bUQ&feature=youtu.be)
14. Ethical grid to analyse ethical considerations (adapted from Kris & Alison, 2009)
Four ethical frameworks:
• External/ecological: cultural sensitivity; codes of practice; efficiency/use of resources
• Consequential/utilitarian: quality of evidence; risk; benefits for individuals/particular groups/organisations/society
• Deontological: honesty; reciprocity; tell the truth
• Relational/individual: establishing trust/collaboration; confirmation of findings
Considerations at each stage:
• Recruitment: respect the values and norms in the environment; permission of a gatekeeper; informed consent; reciprocity; project information sheet (being transparent about the procedure; free to withdraw); collaboration; building rapport and a constructive relationship
• Field work: avoidance of detachment; gather enough evidence to back up conclusions; work in line with University/British Educational Research Association guidelines; avoidance of harm (respect dignity and privacy; avoid potential discomfort and stress); justify the choice of methods; treat all participants fairly; avoidance of imposition; respect autonomy
• Reporting: responsive communication; data protection; avoidance of plagiarism; confidentiality and anonymity; fairness; confirmation of findings (How did I ensure the validity and reliability of the findings?)
15. Trustworthiness of qualitative research
• Credibility – does the data reflect participants' experience in a factually accurate account?
• Transferability – can the findings be transferred to similar contexts?
• Dependability – would we arrive at similar results if the procedures were followed again?
• Confirmability – can the findings be confirmed by other methods, participants or researchers?
Strategies for establishing trustworthiness:
• 'Respondent validation' – member checking and informant feedback
• 'Triangulation' – cross-checking findings using multiple methods or data sources
• 'Thick description' – providing a rich account of the research context and participants, and details to support findings
• 'Audit trail' – keeping records of the entire research process
• Checking researcher effects and clarifying researcher bias
Bryman (2012, p. 392)
16. Validity and Reliability of quantitative research
Reliability:
• Meaning: internal reliability asks whether the indicators that make up the scale are consistent
• How it is accomplished: a test of internal reliability known as Cronbach's alpha
Content validity:
• Meaning: whether the test covers a representative sample of the domains to be measured
• How it is accomplished: ask experts to assess the test to establish that the items are representative of the outcome
Criterion validity – concurrent validity:
• Meaning: the ability of the test to estimate present performance
• How it is accomplished: correlate performance on the test with a concurrent behaviour
Criterion validity – predictive validity:
• Meaning: the ability of the test to predict future performance
• How it is accomplished: correlate performance on the test with a behaviour in the future
Construct validity:
• Meaning: the extent to which the instrument measures a theoretical construct
• How it is accomplished: correlate performance on the instrument with performance on an established instrument (e.g., assess it with follow-up semi-structured interviews)
Bryman (2012, pp. 168-173)
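The "test of internal reliability known as Cronbach's alpha" can be made concrete with a short sketch. The item scores below are invented for illustration and the helper-function name is my own; the computation follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of total scores).

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a set of scale items.

    items: one list per questionnaire item, each holding
    one score per respondent (all the same length).
    """
    k = len(items)
    # Variance of each individual item across respondents
    item_variances = [statistics.variance(column) for column in items]
    # Variance of each respondent's total score across all items
    totals = [sum(scores) for scores in zip(*items)]
    total_variance = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Hypothetical scores: three scale items answered by five respondents
item1 = [4, 5, 3, 4, 2]
item2 = [4, 4, 3, 5, 2]
item3 = [5, 4, 2, 4, 3]
print(f"Cronbach's alpha = {cronbach_alpha([item1, item2, item3]):.2f}")
```

Alpha rises when the items vary together, i.e. when the variance of the total score is large relative to the sum of the individual item variances; values above roughly 0.7 are conventionally read as acceptable internal consistency.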