Name __________________________________ Writing for Thinking Section 01 Instructor: Cagle EDF 3413 Spring 2019
Research Paper Rubric Writing for Thinking Section 03
Point values for the criteria below: 2 / 1.5 / 1 / .5
APA Format
2: Running head and page number are placed correctly on all pages. Title page gives title, your name, University name, course information, instructor's name, date.
1.5: Includes 5 or 6 of the requirements.
1: Includes 3 or 4 of the requirements.
.5: Includes 1 or 2 of the requirements.
APA Style Abstract
2: The overall purpose of the abstract is clear; the abstract expresses only the main idea and major points of the original selection; word choice is consistently efficient and concise.
1.5: The purpose is generally clear; the abstract expresses only the main idea and most major points of the original selection; word choice is fairly concise.
1: The purpose wavers; the abstract does not accurately express the main idea or most major points of the original selection; word choice is vague or repetitive.
.5: The purpose is unclear; the abstract does not convey the main idea or major points of the original selection; word choice is confusing or misleading.
Citations
2: Includes and cites all data obtained from other sources (at least 5 sources). APA citation style is used correctly for all citations.
1.5: Cites at least 4 sources, correctly using APA style.
1: Cites at least 3 sources, correctly using APA style.
.5: Cites 2 or fewer sources, correctly using APA style.
Reference Page
2: Written in APA format with no errors.
1.5: Written in APA format with 1-3 errors.
1: Written in APA format with 4-5 errors.
.5: Written in APA format with more than 5 errors.
Point values for the criteria below: 4 / 3 / 2 / 1
Thesis Statement
4: Clearly and concisely states the paper's purpose in a single sentence that is engaging and thought-provoking.
3: Clearly states the paper's purpose in a single sentence.
2: States the paper's purpose in a single sentence.
1: Incomplete and/or unfocused.
Introductory Paragraphs
4: Well-developed introductory paragraph(s) that contain a clear explanation or definition of the problem, a thesis statement, and detailed background information, and that give the reader an idea of how the essay is organized.
3: Introductory paragraph states the problem but does not explain it using details. Thesis is stated. Some background information is included. Structure of the paper is previewed.
2: Introduction states the thesis but does not adequately explain the background or the problem. No structure of the paper is previewed.
1: Thesis and/or problem is vague or unclear. Explanations and background information are unclear, not present, or not related to the topic. No preview of structure is given.
Body I
4: Three or more points that are in agreement with the claim are well developed with supporting details.
3: Three or more main points that are in agreement with the claim are present, but one or two lack detail and development.
2: Three main points that are in agreement with the claim are present, but all lack detail and development.
1: There are fewer than three main points in agreement with the claim, with poor details and development of ideas.
Body II
4: At least two opposing views or comparisons/contrasts are acknowledged. Refutation of each opposing point is strong and well developed.
3: Two or more opposing views or comparisons/contrasts are present, but refutation of each opposing view is weak and not well developed.
2: Only one opposing view or comparison/contrast is present. Refutation is missing and/or vague.
1: Opposing views or comparisons/contrasts and refutations are not present.
Conclusion
4: Conclusion is engaging and effective and restates the thesis.
3: The conclusion is adequate and restates the thesis.
2: The conclusion is not engaging or effective and does not adequately restate the thesis.
1: Conclusion is incomplete or unfocused and does not restate the thesis.
Organizational Structure
4: Logical progression of ideas with a clear argumentative structure. Transitions are fluid.
3: Logical progression of ideas; argumentative structure can be followed although it is not clearly developed. Transitions are present.
2: Some level of disorganization disrupts progression of ideas. Argumentative structure not easy to follow. Transitions are not used effectively or are rarely used.
1: High level of disorganization. More than one transition is missing.
Mechanics and Usage
4: 2 or fewer errors in punctuation, capitalization, spelling, sentence structure, and word usage.
3: 3 to 5 errors in punctuation, capitalization, spelling, sentence structure, and word usage.
2: Numerous errors in punctuation, capitalization, spelling, sentence structure, and word usage.
1: So many errors in punctuation, capitalization, spelling, sentence structure, and word usage that comprehension of the essay is difficult.
Sources
4: Includes at least 5 credible references.
3: Includes at least 4 credible references.
2: Includes at least 3 credible references.
1: Includes fewer than 3 credible references.
Score per column
Student's Name: __________________________ Total Score: __________ Percentage Score: _____________
Date: ___________________
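The rubric's arithmetic can be made explicit. Assuming the point values shown above (four criteria scored out of 2 and eight scored out of 4, for 40 possible points), a minimal sketch of the Total Score to Percentage Score conversion:

```python
# Rubric scoring sketch. Assumption: four criteria use the 2/1.5/1/.5 scale
# and eight criteria use the 4/3/2/1 scale, as laid out in the rubric above.
two_point_max = 4 * 2    # APA Format, APA Style Abstract, Citations, Reference Page
four_point_max = 8 * 4   # Thesis Statement through Sources
max_score = two_point_max + four_point_max  # 40 points possible

def percentage(total_score: float) -> float:
    """Convert a rubric total into a percentage of the points possible."""
    return round(100 * total_score / max_score, 1)

# Example: 1.5 on every 2-point criterion and 3 on every 4-point criterion
example_total = 4 * 1.5 + 8 * 3  # 6 + 24 = 30 points
print(percentage(example_total))  # prints 75.0
```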
Week 5
Research Methods and Data Types
Health research is a systematic (it follows a sequential process) and principled (it is carried out according to explicit rules) way to gather information to investigate health issues and solve health-related problems. These guidelines are what constitute a research method. In health research, the term method refers to a strict set of rules governing (a) how knowledge should be acquired, (b) the form in which knowledge should be stated, and (c) how the truth or validity of knowledge should be established (Polgar & Thomas, 2019). This is very different from your project's methodology, which consists of the research design, sample size, data collection procedures, instrumentation, and paradigm (Glicken, 2003).
There are three overarching conceptual foundations of health research methods: quantitative, qualitative, and mixed methods.
Quantitative research involves the use of mathematics in the discovery of relationships among a variety of variables. This research method uses a positivist paradigm, experimental designs, and the scientific method to generate results that may reveal relationships that are generalizable to other people, places, and events (Polgar & Thomas, 2019).
Qualitative research is an examination of phenomena within the cultural and social environment in which they take place. It is not designed to determine cause-effect relationships or to generalize to other people, places, and events (as quantitative research is); rather, it incorporates participant observations, in-depth interviews, focus group discussions, and textual (nonnumeric) data to identify themes and patterns and eventually generate new theories (Jacobsen, 2017).
A mixed-methods approach combines quantitative and qualitative methods. Here, the researcher has the advantage that the data produced include both objective facts (statistical analysis) and subjective experiences described by the study participants (Polgar & Thomas, 2019).
Books and Resources for this Week
Bazeley, P. (2015). Mixed methods in
management research: Implications for
the field. Electronic Journal of Business
Research Methods, 13(1), 27-35.
Data can be derived from a primary study, secondary sources, or a systematic review of the literature. A primary study involves the collection of firsthand, often never-before-seen information by the person who actually conducted the research being reported. The analysis of an existing (historical) dataset that was collected by someone other than the person conducting the research constitutes a secondary source. A systematic review uses a predetermined, comprehensive, and transparent search and screening process to identify, collect, critique, and synthesize all of the relevant studies on a particular topic. If the review combines data from a number of studies into one calculated outcome, then it is referred to as a meta-analysis (Forister & Blessing, 2016).
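The "one calculated outcome" of a meta-analysis is typically a weighted average of the individual study estimates. Below is a minimal fixed-effect, inverse-variance sketch; the three (estimate, standard error) pairs are invented purely for illustration and do not come from any study cited here:

```python
# Fixed-effect meta-analysis sketch: pool study estimates using
# inverse-variance weights, so more precise studies (smaller standard
# errors) contribute more. The study values below are hypothetical.
studies = [(0.30, 0.10), (0.10, 0.05), (0.20, 0.08)]  # (estimate, SE) pairs

weights = [1 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5  # SE of the pooled estimate

print(round(pooled, 3), round(pooled_se, 3))  # prints 0.154 0.039
```

Note that the most precise study (SE = 0.05) pulls the pooled estimate toward its own value, which is the defining behavior of inverse-variance weighting.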
Be sure to review this week's resources carefully. You are
expected to apply the information
from these resources when you prepare your assignments.
References
Forister, J. G., & Blessing, J. D. (2016). Introduction to research and medical literature for health professionals (4th ed.). Burlington, MA: Jones & Bartlett Learning.
Glicken, M. D. (2003). Social research: A simple guide. Boston, MA: Pearson Education, Inc.
Jacobsen, K. H. (2017). Introduction to health research methods: A practical guide (2nd ed.). Burlington, MA: Jones & Bartlett Learning.
Polgar, S., & Thomas, S. A. (2019). Introduction to research in the health sciences (7th ed.). Edinburgh: Elsevier, Ltd.
Goff, W. M., & Getenet, S. (2017).
Design-based research in doctoral
studies: Adding a new dimension to
doctoral research. International
Journal...
Humphrey, R., & Simpson, B. (2012). Writes of passage: Writing up qualitative data as a threshold concept in doctoral research. Teaching in Higher Education.
Landrum, B., & Garza, G. (2015).
Mending fences: Defining the domains
and approaches of quantitative and
qualitative research. Qualitative
Psychology,
Mligo, E. S. (2016). Introduction to
research methods and report writing: A
practical guide for students and
researchers in social sciences and...
Wellington, J., & Szczerbinski, M.
(2007). Research methods for the social
sciences. London, UK: Continuum
International Publishing Group.
Yoshikawa, H., Weisner, T. S., Kalil, A., & Way, N. (2013). Mixing qualitative and quantitative research in developmental science: Uses and...
Quantitative: Link
Qualitative: Link
Mixed Methods: Link
DHA-7008 Week 5 Assignment (Word document)
Use of evidence.... (PDF document)
Week 5 - Assignment 1: Evaluate Research Methods
Assignment
Due November 10 at 11:59 PM
For this assignment, you will review the quantitative,
qualitative, and mixed methods
articles found in this week’s resources under Articles for
Review.
For each article, you will need to do the following:
Provide an APA-formatted reference list entry
Identify the problem under study
Identify the purpose statement
Emphasize key points of the study
Explain why the methodology was chosen
Discuss how the researchers addressed ethical considerations in the study
Length: 4-6 pages, not including the title page. A reference
page is not needed for this
assignment.
References: 3 references are provided
Your annotated bibliography should demonstrate thoughtful consideration of the ideas and concepts that are presented in the course and provide new thoughts and insights relating directly to this topic. Your response should reflect graduate-level writing and APA standards.

Week 5 - Assignment 2: Determine Secondary Data Sources
Assignment
Due November 10 at 11:59 PM
For this assignment, you will use the table provided and identify
at least two possible
secondary data sources for each project type listed in the first
column. You will need to
describe more specifically what your possible project may be
for each project type in the
second column. This helps frame the data sources and your
rationale for choosing that
source. Make sure you include the URL for those sources
retrievable from the Web.
Table columns: Project Type | Project Description | Secondary Data Source | Rationale | Complete APA Reference
Improvement/Performance
Management Project
This type of project leads to
measurable improvement in
healthcare systems, services,
and/or health status of
targeted populations.
Healthcare Policy Analysis/Policy Development
This type of project is broad and may include the analysis of policy process or policy content, and links to health outcomes. It is used to help influence stakeholders' decisions.
Evaluation of the
Effectiveness of a Project,
Program, Intervention,
Services, etc.
This type of project is used to
gain insight, improve practice,
assess effects, and/or build
capacity.
Table 1. Project Type Table
Length: 1-2 pages, not including the title page. A reference
page is not needed for this
assignment.
References: Include a minimum of 6 scholarly and/or
professional data sources
Your table should demonstrate thoughtful consideration of the
ideas and concepts that are
presented in the course and provide new thoughts and insights
relating directly to this
topic. Your response should reflect graduate-level writing and
APA standards.
Writes of passage: Writing up qualitative data as a threshold concept in doctoral research
Robin Humphrey (a)* and Bob Simpson (b)
(a) Faculty of Humanities and Social Sciences, Newcastle University, Daysh Building, Newcastle upon Tyne, UK; (b) Department of Anthropology, Durham University, Dawson Building, South Road, Durham, UK
(Received 4 July 2011; final version received 14 March 2012)
Effective writing is an essential skill for all doctoral students,
yet it is one that
receives relatively little attention in training and supervision.
This article explores
extensive feedback from participants in a series of workshops
for doctoral
candidates engaged with writing up qualitative data. The themes
arising from the
data analysis are discussed in terms of the affective domain of
writing, and the main
claim is that writing up qualitative data has been identified as
what Meyer and
Land would call a threshold concept for doctoral candidates
employing qualitative
analysis. Drawing on Turner’s notion of liminality, the article
concludes that
interdisciplinary workshops can be instrumental in helping
doctoral candidates
understand the role of writing, and of writing up qualitative
data in particular, in
their development into independent, autonomous researchers.
Keywords: doctoral education; writing groups; qualitative
research methods;
threshold concepts; interdisciplinary study programme
Introduction
The impetus for the Writing Across Boundaries (WAB) project, about which we write
about which we write
in this article, was the observation that writing poses particular
challenges for
doctoral students in general and for those using qualitative data
in particular. The
problem is most acute at the point where data collection has
ended, writing begins in
earnest and the deadline for completion begins to loom. Both
authors had this
experience when writing their own theses, have dealt with it as
supervisors and
recognise it in the experience of others. When devising the
WAB project, we were
both immersed in doctoral matters through our formal Faculty-
level duties in our
respective universities, which allowed us to take a broad look at
writing support
across the social sciences. We realised that the difficulties were
being experienced not
just by doctoral candidates from our own disciplines of
Anthropology and Sociology,
but by researchers from every social science discipline, and
indeed from disciplines in
the natural and medical sciences and the humanities.
Whilst much research training goes into preparing doctoral students in the early stages of their Ph.D. careers, relatively little attention has been paid to writing post-fieldwork, which, for reasons we will outline later, we continue to call the 'writing-up' stage despite much recent criticism of the term (Badley 2009; Thomson and Kamler 2010), and to how doctoral researchers might be helped in this endeavour. The WAB project was an attempt to address this issue in practical ways by offering doctoral students help in negotiating what can often appear a very 'scary gap' in their doctoral training and one which has hitherto been self-negotiated (Simpson and Humphrey 2008, 2010).

*Corresponding author. Email: [email protected]
Teaching in Higher Education, Vol. 17, No. 6, December 2012, 735-746
ISSN 1356-2517 print/ISSN 1470-1294 online
© 2012 Taylor & Francis
http://dx.doi.org/10.1080/13562517.2012.678328
http://www.tandfonline.com
Our interest in writing and analysis contributes to the
development of the post-
Roberts agenda for UK doctoral training, and in particular to the
newly launched
Researcher Development Framework (RDF) which seeks to articulate 'the knowledge, behaviours and attributes of effective and highly skilled researchers' (Vitae 2011). The RDF is structured in four domains, within each of
which are three sub-
domains and associated descriptors. Domain A encompasses the
knowledge,
intellectual abilities and techniques required to produce work of
a professional
academic standard, and the three sub-domains are Knowledge
Base (A1), Cognitive
Abilities (A2) and Creativity (A3). We were keen to understand
writing up qualitative
data as a distinctive synthesis of these three sub-domains, and
to situate our analysis
within the broad context of doctoral training. In short, we were
not so much
interested in the ‘how to’ approach to writing, but were
concerned rather to cultivate
a reflexive awareness of what happens for a doctoral student
when they begin to write
up qualitative data and why this happens.
In the account that follows we provide an analysis of a body of
data that was
collected at three of the annual WAB workshops. This data
provides insight into
the kinds of impediments that doctoral students encounter when
writing up
qualitative data for incorporation into a thesis. Furthermore, the
very positive
response that the workshops received gave us some important
clues as to the place
of writing in the doctoral process and how the impediments
might be more
effectively addressed.
Analysis of the data revealed very significant cross cutting
themes which are
pertinent when it comes to understanding the process of
doctoral study. The two
themes which we discuss here relate to debates about the
affective domain in the
writing process and the identification of writing up qualitative
data as a threshold
concept in doctoral research. This last theme highlights what we
see as an important
point in academic pedagogy and one which is critical for the
development of
autonomous, professional researchers.
Situating our analysis in the literature
The themes outlined above were generated from our data, rather
than derived prior
to the data analysis from the substantial body of literature that
is now available on
the writing of doctoral theses. However, once the themes were
identified, we then
sought to link them to concepts and discourses prevalent in the
contemporary
literature.
As our workshop participants came from so many different
disciplines, we turned
first to the work of Bernstein (1990, 2000), and in particular his
typology of
knowledge structures in which the vertical knowledge structures
of the natural
sciences are distinguished from the radically different
horizontal knowledge
structures found in the social sciences. Social science
disciplines, Bernstein
concluded, share a common conceptual core although the
boundaries between
them are far less rigid than are the boundaries between
disciplines in the natural
sciences. As we will show, this differentiation between the
natural and the social
sciences helped us to appreciate the workings of our
interdisciplinary workshops in a
new light.
We then searched the burgeoning literature which addresses
academic writing at
doctoral level. Some of these texts concentrate on publishing
pedagogy (Aitchison,
Kamler, and Lee 2010; Belcher 2009), while others focus on
particular aspects of the
doctoral writing process, such as dissertation proposals
(Kratwohl and Smith 2005)
and literature reviews (Kamler and Thomson 2006a; Machi and
McEvoy 2008).
There is also an established research tradition which explores
the ways in which
graduate students learn writing conventions in disciplinary
settings (McAlpine, Paré,
and Starke-Meyerring 2008; Prior 1998). There were two bodies
of work, however,
that had direct relevance to our themes: work addressing the
relationship between
text work and identity construction in doctoral research (Kamler
and Thomson
2006a, 2006b); and work addressing the incorporation of the
notion of threshold
concepts into the realm of doctoral pedagogy (Kiley 2009;
Meyer and Land 2006).
Through their treatment of doctoral writing as a complex,
institutionally
constrained social practice, rather than simply a set of skills and
competences,
Kamler and Thomson (2006b, 2) employ the notion of ‘research
as writing’, and seek
to remedy the situation where the development of scholarly
writing has become a
major site of anxiety for doctoral candidates, and their
supervisors. Their call for
universities to address more seriously the question of research
writing and to
establish ‘institutional writing cultures’ (144) will be addressed
elsewhere. The aspects
of their work that we draw on here are the connections they
make between academic
writing practices and the formation of the ‘doctoral researcher’.
Recognising the relationship explored by Kamler and Thomson
between
successful doctoral writing and the development of the doctoral
candidate’s identity
as an academic researcher allowed us to make the link between
our findings and the
emerging literature on threshold concepts in doctoral research.
Kiley (2009) argues
that doctoral candidates undertake a series of rites of passage
during their
candidature, and that there are times during their research
education when they
demonstrate that they have undergone a change in the way they
see themselves and
their research work. These changes, she argues, are the result of
the candidate first
encountering, and then successfully crossing, a threshold which
is critical for the
furtherance of the doctoral research process. The identification
of discipline-specific
threshold concepts has been developed as a way of
differentiating between core
learning outcomes that represent ‘seeing things in a transformed
way’ and those that
do not (Kiley and Wisker 2009; Meyer and Land 2006). A
threshold concept is seen
as distinct from other core learning outcomes because ‘once
grasped, [it] leads to a
qualitatively different view of the subject matter and/or learning
experience and of
oneself as a learner’ (Kiley and Wisker 2009, 432).
We shall explore these concepts further when we discuss our
findings, but first we
will outline briefly the content of our workshops and our
methodological approaches to generating feedback from the participants.
The workshops
The WAB workshops were the centrepiece of a project funded by the UK's Economic and Social Research Council as a part of its Researcher Development Initiative.[1] Each two-day workshop was residential and comprised five participative sessions:
i. An introduction by the organisers, including a panel
conducted by recently
successful doctoral researchers who reflected on their strategies
for writing up
qualitative data in their theses.
ii. Ordering text - delivered by a psychologist who is also a creative writer.
iii. Analysing the relationship between text and representation - delivered variously by a sociologist with an interest in narrative, and an anthropologist who is also a poet.
iv. Rhetoric and narrative in qualitative writing - delivered by a social anthropologist.
v. Data and theory - delivered by experienced and widely published qualitative researchers in Sociology and Education.[2]
The structure of the workshops did not change from their
inception, largely
owing to the positive feedback and reinforcement gained after
the first, and each
subsequent, workshop. The first two workshops were regional,
and open to
applicants from the five universities in the North-East of
England. The third
workshop was opened up to applicants from any university in
the UK, and the
fourth workshop was advertised throughout Europe.
Information about the workshop and details about applications were disseminated via email distribution lists and via the WAB project website.[3]
The criteria used
for selection were that the doctoral candidate should have
completed their fieldwork
and should be writing a thesis based in part or entirely on
qualitative data. The
application form had to be submitted by the candidate’s
supervisor, who was asked
to make a case explaining why the candidate would benefit from
the workshop. Out
of a total of 237 applicants for the four workshops, 156
participants were drawn from
26 UK universities and, in the fourth workshop, from
universities in the Netherlands,
Poland, Belgium, the Irish Republic and the Czech Republic.
The workshop participants were drawn from all but four of the
19 ESRC social
science disciplines. Although the majority of the participants
were social scientists,
the workshops attracted some who had been trained and were
located in the natural
sciences (particularly environmental science), the medical
sciences (including health
services research, midwifery, physiotherapy, general medical
practice, nursing and
pharmacy) and the humanities (including modern languages,
history and design).
Participants brought with them experience of a wide range of qualitative methodologies, the most common being interviews (70%) and participant-observation (42%). Most researchers (71%) were employing a combination of qualitative methods, and some (12%) were combining qualitative analysis with that of quantitative data.
Methodology
The strategy adopted for the formal evaluation of the workshops
had three stages,
with each stage employing a different methodological approach
in order to generate
multiple perspectives on the workshops. For stage one, two
Ph.D. students attended
the workshop as participant observers. They took notes, and
discussed what was
going on with participants both during the workshop sessions
and in the less formal
periods during the residential weekend. Short reports were
produced by the participant
observers following the workshops, which provided the
foundation for debriefing
sessions where impressions and reflections were discussed with
the project leaders.
For stage two, all the participants were asked to fill in a short
questionnaire soon
after the workshop. This was filled in remotely on Durham
University’s online
learning support platform, Blackboard. The questionnaire was
designed to capture
immediate impressions of the workshop, and included a series
of closed questions
asking participants to rank their responses on five-point Likert
scales, and some
open-ended questions asking what they liked best about the workshop and how they
workshop and how they
thought it could be improved. The overall response rate to the
post-workshop online
questionnaire across the three years was 82%, and the responses
generated both
quantitative and qualitative data. Significantly, the qualitative
data was unusually
detailed and thoughtful by the standards of online feedback, and
went beyond the
initial expectations of the project organisers.
For stage three, semi-structured telephone interviews were
carried out six months
after the workshop. These were conducted not only with
participants but also with
their supervisors to assess whether there had been any longer
term impacts arising
from the workshop, rather than just a short-term ‘glow’. The
response rates for stage
three were 53% for students and 50% for their supervisors.
These three exercises were repeated for each of the first three
workshops, and
produced a longitudinal data set comprising a rich mix of
quantitative and qualitative
data, the latter produced via ethnographic, self-completion and
interview-based
approaches. All the data were processed and stored
electronically on the software
package NVivo, in preparation for analysis. The breadth and
depth of the data set
allowed emergent themes to be traced across the three cohorts
of participants, and for
conclusions to be drawn with stronger claims to rigour and
generalisability than is
often the case with qualitative studies based on single context
or cohort studies.
From feedback to analysis
In terms of feedback on the success of the workshops, the data collected was overwhelmingly positive. In the online quantitative feedback collected immediately after the workshops, once the results for the first three years were combined, participants indicated, through their rating on the five-point Likert scales, not only
that they had found the workshops very enjoyable (90%) but
also that they had
greatly increased their confidence in their ability to write up
their Ph.D. (90%) and
felt that the workshops had been very useful in helping them to develop
strategies for writing
up qualitative data (91%). These results reassured the organisers
that the workshops
were, at least in the short term, effective and that the project
was largely meeting its
original aims. These results were further reinforced by a great
many comments from
workshop participants pointing to how their attitude to writing
had been changed
positively:
The impact [of the workshop] has been phenomenal. I was losing sleep before but when I came back I got straight on to it and wrote reams and reams, so it was like opening a floodgate – it gave me the opportunity to move on as a writer. (Doctoral Candidate in Education, Workshop 2)
Teaching in Higher Education 739
Such responses were enormously gratifying, but made us
curious as to why the
workshops were so successful. We would like to think that they
were well organised
and presented, but there was a sense that we had touched on
something that was
much more fundamental and dynamic among the groups that we
had convened.
Beyond specific organisational factors, we first turned to
Bernstein’s typology of
knowledge structures to help us deepen our understanding of
what had transpired in
the workshops. Bernstein (1990, 2000) contrasts the vertical
knowledge structures of
the natural sciences with the horizontal knowledge structures
found in the social
sciences. In the latter, disciplines share a common conceptual
core and the
boundaries between them are weak and porous. Thus, bringing together 30–40
doctoral students created a significant opportunity for lateral
communication to take
place and, as we go on to illustrate, for participants to use these
encounters to
generate positive momentum for themselves; in many respects
we were merely
providing the crucible in which certain kinds of reactions could
take place.
A key catalyst in these reactions was the sharing of broad
methodological
approaches. These overlaps facilitated moves out of the vertical,
disciplinary
knowledge domains which some participants brought to the
workshop. What we
inferred from these moves was that focusing on a common
processual problem for
doctoral students, in this case the writing up of qualitative data,
can be profitably
undertaken where a mix of disciplines and methodologies is
brought together.
An important feature of the workshop in this respect was the
academic level of
the doctoral participants, all of whom had already acquired, at undergraduate level, threshold concepts from a wide range of academic disciplines (Cousin 2006). The
workshops were therefore characterised by a high level of cross-disciplinary
exploration and boundary crossing (Engeström, Engeström, and
Kärkkäinen 1995)
which, as we shall see below, constituted a rich resource from
which participants
could draw in deepening their learning experience.
The participants who were crossing the most difficult
boundaries were those who
had been educated in the natural sciences, for whom cross-disciplinary moves of the
kind we discuss here were unfamiliar. For them, the intellectual
task of writing up
qualitative data was particularly challenging:
As someone with a natural science background, qualitative data
is still new to me and
analysing and writing up ‘words’ rather than numbers is a
daunting process. (Doctoral
candidate in Environmental Science, Workshop 3)
Observing interactions at the workshop, gathering post-workshop impressions
and subsequently interviewing students and their supervisors
gave us important
insights into the experience of writing at this critical phase in
the doctoral process
and enabled us to illuminate some of the still largely uncharted
areas of doctoral
research training.
Acquiring confidence and self-belief: the affective domain in
the writing process
Issues of confidence are evidently key when it comes to writing
up qualitative data.
This message was the clearest to emerge from the analysis of
the feedback data.
There were 196 explicit references to acquiring confidence in
writing from the
workshops in the anonymous online feedback and in the
transcripts of the 63 phone
interviews subsequently conducted with participants, and many
more comments where it was strongly implied. The context in which this message was expressed, however, took many forms.
740 R. Humphrey and B. Simpson
As we have seen earlier, research using qualitative methods is
now carried out by
doctoral candidates in disciplines outside of the social sciences. Many participants
Many participants
from such disciplines referred to changes in their confidence
levels in writing about
approaches which may well be deemed marginal in their
academic environments:
Interestingly, one of the organisers said my approach was
similar to an anthropologist’s
approach. It was useful to find this out – finding out that it was
acceptable to do what I
was doing. That was really good. And good to speak to people
with diverse
backgrounds and find out that there’s lots of ways of doing it
and you do what you
need to, to fit the purpose. (Doctoral candidate in Design,
Workshop 1)
Confidence through increased knowledge and understanding of
the qualitative
research process was keenly felt by participants from
disciplines where quantitative
research is dominant and follows scientific models for enquiry
and data presentation:
I can’t explain very well but I knew the way I was writing in
this rigid scientific structure
wasn’t right for my data. Obviously I’m still doing active
experimental research as well
as writing up. Although I have one supervisor pushing it a lot
more experimentally,
I have the confidence now to say 'no I have this [qualitative] data' and it is an important part of the research too . . . I think the difficulty I've had is
with my setting, I’m in the
medical setting. I feel more constrained, more reined in I think
because you have this
idea that it has to be rigorous and that means to write like this,
only in this certain
‘scientific’ way. But seeing how other people were doing this, I
thought ‘no, this works,
this could work for me’. (Doctoral candidate in Physiotherapy,
Workshop 3)
This greater appreciation of the nature of qualitative research
was expressed well
by the General Medical Practitioner quoted below, who prior to
the workshop had
clearly struggled with the differences between qualitative and
quantitative data and,
more specifically, with what this meant in a context
increasingly dominated by
evidence-based paradigms:
I am committed to qualitative research, but come from a
discipline very closely allied
with biomedicine, and this workshop . . . has given me the
confidence to trust my data.
There are many decisions which researchers have to make at all
stages of the process.
The writing stage is no different. I have the confidence to know
that I don’t have to
include everything in the thesis, I have to make judgments. It
was helpful to consider the
words ‘illustration’ and ‘explanation’, rather than ‘evidence’.
As a health care
professional, this word comes back to haunt us on a daily basis.
(Doctoral candidate
in General Medical Practice, Workshop 3)
This anxiety about writing up qualitative data was not
dependent on the
innovative adoption of qualitative research methods by
participants in disciplines
dominated by other methodological approaches, since there
were comments from
doctoral candidates in Sociology, Anthropology and Human
Geography showing
that familiarity with and acceptance of qualitative research
within a discipline can
produce its own pressures:
As you know Anthropology has been talking representation for
the last 20 years, so I
worry more about how to write, how do I do this? But that was a
good thing about the
workshop; it was very good at bringing confidence to me. I was
anxious about writing.
(Doctoral candidate in Anthropology, Workshop 2)
The emergence of confidence as a strong theme in our data
analysis provides
powerful corroboration for claims that writing about qualitative
data and analysis is
more than simply a matter of technical ability: it is a process
in which the writer’s
attitudes and feelings about writing also play a significant part.
In our view, paying
attention to this relationship is key to the development of skills
in analysis and
communication for the aspiring social scientist.
Following Bloom’s taxonomy of goals within education systems
(Bloom 1956),
Wellington (2010) argues that the cognitive domain of skill and
knowledge
development has dominated thinking on developing writing
ability to the detriment
of understanding the role of affect in the writing process.
Considering the affective
domain causes us to reflect on the role of feelings and emotions
in learning and
teaching, yet it is a domain that has tended to be neglected in
postgraduate
education, where academics have, perhaps, underestimated the
extent to which
doctoral candidates need help with confidence, motivation and
inspiration (Lillis and
Turner 2001; Wellington 2010).
Our evidence suggests strongly that academic writing is not a
skill that we can
assume is inbuilt and simply develops autonomously and
individually (Nightingale
1988; Wellington 2010). Difficulties with writing are magnified
when coupled with
the analysis of qualitative data. This distinctive post-fieldwork
activity for
researchers working with qualitative data has its own
complexities and difficulties
(Silverman 2010; Wolcott 2009), as the interplay of writing and
analysis requires the
bringing together of rigorous data analysis and the nuanced and
rhetorical use of
language.
The anxieties provoked when trying to engage with this aspect
of the research
process in a doctoral thesis are likely to be exacerbated by
current thinking that
writing should start on ‘day one’ of a thesis, and that the notion
of ‘writing up’ is
outmoded and potentially dangerous as it implies that writing
only happens at the
end of the doctoral cycle (Badley 2009; Kamler and Thomson
2006b). While we
agree with Badley that the term ‘writing up’ is problematic
since it can fail to convey
the nature of good academic writing (‘a problematical and
tentative exercise in
critical reflective thinking’, Badley 2009), our evidence
suggests that this conflation
of ‘writing’ with ‘writing up’ can mask some important
distinctions between the
mechanics of writing (literature reviews, accounts of
methodology, contextualisation,
etc.), reflexive and reflective writing (in the form of diaries and
fieldwork logs) and
‘writing up’ (the final synthesis of information and experience
in the form of a
thesis). All of these stages of writing are important, and
particularly so for
qualitative researchers, who typically produce words that
describe words, rather
than words that describe numbers. Enabling students to be
clearer about how the
varieties of writing relate to each other was an important
outcome of the workshop
and one which led us to think more analytically about writing as
a threshold concept.
Writing up qualitative data as a threshold concept in doctoral
research
Just as we were able to identify some of the affective problems
associated with
writing, we were also interested to note some of the affective
solutions that the
workshop generated. There were many comments that referred
to a new feeling of
being able to overcome the intellectual and emotional
challenges of post-fieldwork/
data-collection writing, and especially of having glimpsed new
perspectives that
might impart the confidence to try new approaches and pursue
more creative
directions:
I was scared before I got there, I felt challenged about things I
didn’t know about, that I
wouldn’t know enough. But when I got there I wasn’t
intimidated at all, everyone was
very willing to share. It gave me permission to let my creative
intuition take me forward.
To stop being worried that it’s not scientific or academic
enough. But it just comes. They
say that in the text books, you know, just write, but the
workshop let me do that. I feel
like I’m on a race course and all these hurdles keep popping up
in front of me and I
jump this one, and then that one, but the last one’s in sight and
I’m heading for the
finish line. (Doctoral candidate in Gerontology, Workshop 3)
The reference above to being given ‘permission’ is echoed in
many comments, and
there were also references to ‘feeling liberated’ and ‘freed up’
after the workshop:
Definitely [had positive benefits from the workshop], because
now I'm writing up – my thinking has changed – I feel freed up to write up differently
than perhaps I might have
before. I felt liberated after the workshops as I could write up
more like myself. I think
the workshop liberated me to write up as me. (Doctoral
candidate in Health Care,
Workshop 1)
The idea that participants were somehow deriving a sense of
liberation and
perhaps even feeling that a kind of ‘permission’ was being
given was as puzzling as it
was gratifying. We were curious as to where the authority for
this licence to ‘write up
as me' issued from, as it was certainly not what we had in any
way planned or
intended and a deeper reflection on what is happening here is
instructive. In
particular, repeated reference to the idea of ‘voice’ gives some
clue as to where
impediments may lie:
I am much more concerned about using my own voice, much
more confident that I can
write in my own voice, that it is distinctive. I’m more confident
about doing that. My
colleagues are always talking about making an original
contribution. A big part of your
original contribution is the way you communicate . . . I have
more sense that I can put
my own stamp on this now, put in my own voice. (Doctoral
candidate in Nursing,
Workshop 3)
The interviewer of this participant noted that he had used the
phrase ‘finding my
own voice’ several times, and commented that for many
doctoral candidates she had
spoken to ‘there seems to be a sense of coming to the final year
of the Ph.D. having
spent so much time with other people’s words that they are
unsure of how to find
their own words, or what standing or role they take in the final
script’.
All of this is strong evidence that the workshops have been
working around what
Kiley refers to as a threshold concept in doctoral research, in
this case research that
employs qualitative methodology. Following Meyer and Land
(2006), Kiley (2009)
adapts Turner’s notion of liminality (1979) and suggests that
prior to crossing a
threshold of understanding doctoral candidates can enter a
liminal space, in which
some can experience being ‘stuck’ for some time. Whether there
is a single moment in
which one discovers one’s own voice and in a single epiphany
acquires belief in one’s
own potential, thereby becoming ‘unstuck’, is questionable.
However, it did seem
that through the medium of the workshops we had been able to
create the space for
this kind of liminality. Drawing on the work of the
anthropologist Victor Turner
(1967, 1969, 1974), we understand liminality as a place and a
time which is outside of
the conventional structures of process and in which there is the
opportunity to
engage in play and experimentation in relation to values and
assumptions that might
otherwise be constrained by the structure and conventions that
prevail in other
contexts. As Turner famously put it, the liminal is culture in the
subjunctive mood,
that is ‘the mood of maybe, might be, as if, hypothesis, fantasy,
conjecture, desire –
depending on which of the trinity of cognition, affect, and
conation is situationally
dominant’ (Turner 1986, 42).
In the face of the challenges of writing up data collected using
qualitative
methodologies, the workshop appeared to provide a crack or
interstice within which
a deeper reflection on self, writing and the doctoral process
became possible.
Crucially, what doctoral students were able to create within this
space was a boost to
confidence and self-belief which would enable them to successfully cross a significant
threshold in writing up their qualitative data and take an
important step towards
becoming academic researchers in their own right.
Conclusion
Analysis of participant feedback has not been presented simply
to impress with the
evident success of the WAB workshops – the scale of which came as a surprise to the organisers and, hopefully, offered reassurance to the project's funders,
the ESRC. Rather, the feedback analysis has provided insights
into a crucial stage of
the doctoral cycle.
We believe that our data have provided evidence for the claim
that the writing up
of qualitative data is a threshold concept in this form of
doctoral research, and that
achieving this is challenging for most doctoral researchers. We
would contend that, in
enabling this to happen, training of the kind reported on here
can play a crucial role.
As we have seen, confidence is key to taking control of the
thesis as a textual
synthesis of data, theory and experience, and requires the
bringing together of the
skills and expertise underpinning all of the three sub-domains of
Domain A of the
RDF, Knowledge and Intellectual Abilities: sound academic
knowledge, cognitive
abilities and creativity (Vitae 2011). Acquiring the confidence
to achieve this appears
to be significantly helped by removing the doctoral candidate
for a short, but intense
time from their established environments of supervisors,
immediate peers and
disciplinary arrangements. Our evidence confirms the
pedagogical benefits for
doctoral candidates of breaking out of their disciplinary and
institutional homes
in order to spend focused time with their peers and fellow
travellers.
The fact that the workshops were purposefully multi-disciplinary, focusing more
on the nature of writing up qualitative data than on disciplinary
perspectives or
processes, was undoubtedly important for the experience of the
workshop
participants. However, it was also important for the potential
generalisability of
the analytic conclusions presented here. In those disciplines that
employ qualitative
methodologies, at least, the moment when the doctoral
candidate begins to analyse
and write up their data is often a defining one in the move from
novice to
independent social researcher.
Acknowledgements
The authors would like to thank all the participants in the WAB
workshops, both for their
participation and for the time they gave us in providing the
feedback which gave us both the
data and the inspiration for this article. We would also like to
thank their supervisors, who
initially nominated them for a place on the workshops and then
allowed us to interview them
over the phone six months later. We are indebted to Carolyn
McAlhone and Clare Hardy,
from the Graduate School at Durham University, for
magnificent administrative help with the
workshops over the years. We thank, and acknowledge the work
of, the doctoral students who
helped with the workshops and data processing, Victoria Wood,
Mwenza Blell, Alison Jobe,
Sally Atkinson and Rachel Douglas Jones, and acknowledge
with thanks the help with NVivo
given by Dr Jane Wilcockson, and the insightful advice given
by Dr Stan Taylor, Director of
the Centre for Research and Academic Practice at Durham
University, regarding an early
draft of this article.
Notes
1. The Writing across Boundaries project was funded by the
ESRC Research Development
Initiative, grant number RES 035 25 0013. Further details of the
RDI can be found at:
http://www.rdi.ac.uk/
2. Further elaboration and reflections on the content of the
workshops can be found in earlier
publications (Simpson and Humphrey 2008, 2010).
3. The project website can be found at
http://www.dur.ac.uk/writingacrossboundaries/. Since
its inception in June 2008, according to Google Analytics the
211 pages have been viewed
70,941 times, and its home page has recorded 13,026 unique
page views, of which 7,049
have originated from outside the UK.
References
Aitchison, C., B. Kamler, and A. Lee, eds. 2010. Publishing pedagogies for the doctorate and beyond. London: Routledge.
Badley, G. 2009. Academic writing as shaping and reshaping. Teaching in Higher Education 14, no. 2: 209–19.
Belcher, W.L. 2009. Writing your journal article in 12 weeks: A guide to academic publishing success. Thousand Oaks, CA: Sage.
Bernstein, B. 1990. The structuring of pedagogic practice. London: Routledge.
Bernstein, B. 2000. Pedagogy, symbolic control and identity. Oxford: Rowman and Littlefield.
Bloom, B.S., ed. 1956. Taxonomy of educational objectives, the classification of educational goals – Handbook 1: Cognitive domain. New York: McKay.
Cousin, G. 2006. Threshold concepts, troublesome knowledge and emotional capital: An exploration into learning about others. In Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge, ed. J.H.F. Meyer and R. Land, 134–47. London: Routledge.
Engeström, Y., R. Engeström, and M. Kärkkäinen. 1995. Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction 5: 319–36.
Kamler, B., and P. Thomson. 2006a. Doctoral writing: Pedagogies for work with literatures. Paper presented at AERA annual meeting, 7–11 April, in San Francisco, CA.
Kamler, B., and P. Thomson. 2006b. Helping doctoral students write: Pedagogies for supervision. London: Routledge.
Kiley, M. 2009. Identifying threshold concepts and proposing strategies to support doctoral candidates. Innovations in Education and Teaching International 46, no. 3: 293–304.
Kiley, M., and G. Wisker. 2009. Threshold concepts in research education and evidence of threshold crossing. Higher Education Research and Development 28, no. 4: 431–41.
Krathwohl, D.R., and N.L. Smith. 2005. How to prepare a dissertation proposal: Suggestions for students in the social and behavioural sciences. Syracuse, NY: Syracuse Univ. Press.
Lillis, T., and J. Turner. 2001. Student writing in higher education: Contemporary confusion, traditional concerns. Teaching in Higher Education 6, no. 1: 57–68.
Machi, L.A., and B.T. McEvoy. 2008. The literature review: Six steps to success. Thousand Oaks, CA: Sage.
McAlpine, L., A. Paré, and D. Starke-Meyerring. 2008. Disciplinary voices: A shifting landscape for English doctoral education in the 21st century. In Changing practices in doctoral education, ed. D. Boud and A. Lee, 42–53. London: Routledge.
Meyer, J.H.F., and R. Land, eds. 2006. Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge. London and New York: Routledge.
Nightingale, P. 1988. Understanding processes and problems in student writing. Studies in Higher Education 13, no. 3: 262–80.
Prior, P. 1998. Writing/disciplinarity: A sociohistoric account of literate activity in the academy. Mahwah, NJ: Erlbaum.
Silverman, D. 2010. Doing qualitative research. 3rd edn. London: Sage.
Simpson, R., and R. Humphrey. 2008. Writing across boundaries: Explorations in research, writing and rhetoric in qualitative research. Qualiti 8: 10–2.
Simpson, R., and R. Humphrey. 2010. Writing across boundaries: Reflections on the place of writing in doctoral research training for social scientists. Learning and Teaching: The International Journal of Higher Education in the Social Sciences 3, no. 1: 69–91.
Thomson, P., and B. Kamler. 2010. It's been said before and we'll say it again – research is writing. In The Routledge doctoral student's companion: Getting to grips with research in education and the social sciences, ed. P. Thomson and M. Walker, 149–60. London: Routledge.
Thomson, P., and M. Walker, eds. 2010. The Routledge doctoral student's companion: Getting to grips with research in education and the social sciences. London: Routledge.
Turner, V. 1967. The forest of symbols: Aspects of Ndembu ritual. Ithaca: Cornell Univ. Press.
Turner, V. 1969. The ritual process: Structure and anti-structure. Ithaca: Cornell Univ. Press.
Turner, V. 1974. Dramas, fields, and metaphors: Symbolic action in human society. Ithaca: Cornell Univ. Press.
Turner, V. 1979. Betwixt and between: The liminal period in rites of passage. In Reader in comparative religion, ed. W. Lessa and E. Vogt, 234–53. New York: Harper and Row.
Turner, V. 1986. Dewey, Dilthey and drama: An essay in the anthropology of experience. In The anthropology of experience, ed. V. Turner and E. Bruner, 33–44. Urbana and Chicago: Univ. of Illinois Press.
Vitae. 2011. Researcher development statement. http://www.vitae.ac.uk/CMS/files/upload/Researcher development statement.pdf (accessed June 27, 2011).
Wellington, J. 2010. More than a matter of cognition: An exploration of affective writing problems of post-graduate students and their possible solutions. Teaching in Higher Education 15, no. 2: 135–50.
Wolcott, H.F. 2009. Writing up qualitative research. 3rd edn. Thousand Oaks, CA: Sage.
Mending Fences: Defining the Domains and Approaches of
Quantitative and Qualitative Research
Brittany Landrum and Gilbert Garza
University of Dallas
In view of the increasing ubiquity of qualitative research,
particularly mixed method
designs, it is important to examine whether qualitative and
quantitative models of
research can be integrated and how this integration should take
place. The recent
adoption of best practices for mixed methods research by the
NIH seems an opportune
starting point for discussion of these questions. This article
explores the notion that
qualitative and quantitative research, while stemming from
fundamentally different
“approaches,” might yet find an appropriate complementary
relationship. We argue,
however, that such a complementary relationship depends on an
understanding of the
notion of approach and an insight into the fundamentally
different guiding questions
and domains of these 2 research models. Holding that “good
fences make good
neighbors,” this article explores the frontier between
quantitative and qualitative
research and the challenges attendant to designing and
conducting mixed methods
research.
Keywords: best practices, methodology, mixed methods
research, qualitative research, quantitative
research
Good fences make good neighbors.
—Robert Frost (1919), “Mending Wall”
With the increasing ubiquity of qualitative research (Wertz, 2011) and the emergence of mixed methods research that utilizes both qualitative and quantitative analysis (Creswell, Klassen, Plano Clark, & Smith, 2011; see also Creswell, 2009; Creswell & Clark, 2007; Tashakkori & Teddlie, 2003; Tashakkori, Teddlie, & Sines, 2013), there is a growing need to address the boundaries and differences between these two types of research. Both types of research have a set of usually implicit philosophical suppositions (see Churchill & Wertz, 2002; Garza, 2004, 2007, 2011; Giorgi, 2009; von Eckartsberg, 1998; Wertz, 1985). Among others, Garza (2006) and Giorgi (2009) suggest that important differences exist between these two approaches to research. Following Giorgi, such differences would define different domains of research motivated by fundamentally different questions and producing fundamentally different knowledge claims. These different knowledge claims can "create a terrible mess" without an understanding of the philosophical foundations of both types of research (Greener, 2011, p. 3). Thus, this article seeks to delineate the domains of both approaches and discuss the combined use of quantitative and qualitative data and approaches in mixed methods research. An understanding of these differences, with mutual respect for each domain, will provide the necessary framework for discussing issues related to mixing both types of research. Finally, we will discuss the complementarity of strengths of both approaches, arguing for the necessity of methodological pluralism.
Defining Quantitative and Qualitative
Domains and Approaches
Qualitative and quantitative research comprise two different (but not opposed) interpretative frameworks. At a fundamental level, what distinguishes the domains of qualitative and quantitative research are the implicit interpretative frames of reference that are brought to bear on their subject matter and methods (Giorgi,

Brittany Landrum and Gilbert Garza, Department of Psychology, University of Dallas.
Correspondence concerning this article should be addressed to Brittany Landrum, Department of Psychology, University of Dallas, 1845 East Northgate Drive, Irving, TX 75062. E-mail: [email protected]
claims to light. At one end of a continuum describing the interface of knowledge and frame of reference are 'purely' quantitative studies. Such research examines relations of magnitude between variables measuring quantities1 (e.g., height, weight, number of behaviors, hippocampal volume, etc.) and uses the numeric analysis of data to test and verify these relations. At the other end of this continuum are 'purely' qualitative studies. This sort of research makes descriptive knowledge claims about meaning using 'descriptive' data, typically expressing these findings in linguistic narratives.
However, all these definitions meant to distinguish the two approaches are not mutually exclusive. Qualitative research does count and explore dimensions of magnitude (Sandelowski, 2001) and likewise quantitative research includes non-numeric data (e.g., categorical data2) and makes inferences about meaning based on dimensions of magnitude (Teddlie & Tashakkori, 2009). Furthermore, all scientific inquiry draws on both inductive and deductive frameworks (see Merleau-Ponty, 1961/1964), and we would argue that the interplay between data and the interpretative frame of reference is not always mutually exclusively quantitative or qualitative. The boundaries between these two approaches, more often than not, are not a clearly defined fence but rather a mixing of both types of data and approaches. Indeed, in both 'pure' cases described above, the kind of knowledge claimed fits well with the frame of reference used to establish and communicate its findings. Verification or confirmation of such studies can be achieved in terms of replication within the same analytic model. However, it is with regard to the middle regions on the continuum that epistemological clarity and explicitness are needed to interpret research findings and the light they shed on the topic under investigation (see Figure 1).
One of these middle positions is called quan-
titizing and occurs when research claims knowl-
edge of an order of magnitude but uses a qual-
itative interpretive framework as the basis of
such claims (e.g., performing numerical analy-
ses based on frequency of themes, or “ratings of
strength or intensity” Teddlie & Tashakkori,
2009, p. 269; see also Sandelowski, Voils, &
Knafl, 2009). The other ‘middle position’ is
called qualitizing and occurs when research
claims qualitative knowledge but uses a quan-
titative interpretive framework as the basis of
such claims (e.g., categories based on range in
magnitude, frequency count taken as a dimen-
sion of importance; Hesse-Biber, 2010; Sand-
elowski et al., 2009; Teddlie & Tashakkori,
2009). Because the knowledge claims of such
research and interpretive frames of reference
used to establish and test them do not match,
special care and epistemological knowledge
must be used when interpreting such findings.
For instance, Johnson and den Heyer (1980)
emphasize the distinction between a statistical
question and a psychometric question, pointing
to the necessity of understanding the rubric of
measurement when interpreting IQ scores.
An example contrasting a ‘purely’ quantita-
tive relationship with instances where data and
approach are mixed will help illustrate
these concerns and the special care we are ad-
vocating. A regression coefficient of 1 between
number of friends on Facebook and number of
photos on one’s profile means that an increase
of one friend predicts an increase of one photo
posted; both of these variables are measured
using ratio data whereby 1 friend on Facebook
and 1 photo are quantities and thus fall under
the ‘purely’ quantitative approach.
1 We have deliberately chosen examples of measures whose relation to the scales which produce them is not under debate. There is widespread agreement that height and weight represent quantities on a ratio scale, for example. This is not always the case with scales such as the Likert type, which is discussed below.
2 Categorical data are often called qualitative or nominal data but are analyzed using specialized statistical methods within quantitative research (see Agresti, 2002). In this article, this scale-of-measurement classification is distinct from qualitative data and research, which describe non-numeric data to be used with methodologies developed by qualitative researchers.
200 LANDRUM AND GARZA
When the variable in question is on a Likert scale,
relationship is an increase or decrease in agree-
ment based on the number people circle on
average, not necessarily or directly with the
construct it is taken to operationalize. Concerns
have been raised against Likert type data con-
cerning the appropriate use of parametric or
nonparametric statistics resting upon whether it
is interval or ordinal data, respectively (see,
e.g., Carifio & Perla, 2008; Norman, 2010). For
such data to be considered interval, one would
have to be able to answer the question pointedly
posed by Knapp (1990), “3 what?” in relation to
a 3 circled on a Likert-type scale. This type of
data is not quite a quantity like the number of
friends or photos are; it is neither clear whether
the steps on the scale are indeed equidistant
from each other (see, e.g., Jamieson, 2004) nor
whether the ‘degree’ of agreement is measuring
a quantity of something and if this quantity is
the same for everyone who completes it. The
answers on a Likert-type scale cannot escape
the subjective understanding of the participant.
We are not saying that Likert type data should
not be used in this way; rather we are advocat-
ing for appropriately understanding the knowl-
edge claims they make. Likert-type data fit
somewhere between the two end points on the
spectrum of the interface of knowledge and
appear to be an example of quantitizing
whereby a dimension of agreement (qualitative)
is rendered in terms of quantity (quantitative).
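The ‘purely’ quantitative end of the continuum can be sketched in a few lines of Python. This is an illustrative sketch only, with invented data: both variables are ratio-scale counts, so the least-squares slope is itself a quantity and carries the interpretation given above (a slope of 1 means one additional friend predicts one additional photo).

```python
# Illustrative sketch of the 'purely' quantitative case: both variables
# are ratio-scale counts, so the regression slope is itself a quantity.
# The data below are invented for illustration.

def slope_intercept(x, y):
    """Ordinary least-squares slope and intercept for paired data."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    return b, a

friends = [10, 20, 30, 40, 50]  # number of Facebook friends (invented)
photos = [12, 22, 32, 42, 52]   # number of profile photos (invented)

b, a = slope_intercept(friends, photos)
# b == 1.0 here: each additional friend predicts one additional photo.
print(b, a)
```

No comparable interpretation is available for a Likert scale, where it is unclear what quantity a one-unit step represents.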
While a psychometric question can be dis-
tinct from a statistical question, Merenda (n.d.)
points to a more troubling example of quantitiz-
ing concerning a case when the question of what
one is measuring cannot be separated from the
statistical problems that it raises. The case in
point Merenda highlights is when data repre-
senting dichotomous categories, such as male
and female, are included with other continuous
predictor variables through ‘dummy coding’ in
a regression analysis. To be used in statistical
analyses that require continuous variables, these
dichotomous variables are treated as though
they were continuous, as though there were
values somewhere between male and female.
This violates the assumptions governing contin-
uous and discrete predictor variables in a regres-
sion analysis, thus yielding a questionable sta-
tistical result. He further adds that there is no
substitute for conducting a separate analysis
between males and females.
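The practice at issue can be sketched with a minimal example (all data invented). A dichotomy is ‘dummy coded’ as 0/1 and entered into a least-squares fit as if it were continuous; the resulting ‘slope’ is simply the difference between group means, yet the model formally treats values between 0 and 1 as meaningful.

```python
# Sketch of 'dummy coding': a dichotomous category is recoded 0/1 and
# entered into a regression as though it were continuous, the practice
# Merenda cautions against. All data are invented for illustration.

def ols_slope(x, y):
    """Ordinary least-squares slope for paired data."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

group = [0, 0, 0, 1, 1, 1]        # dichotomy coded as if it were numeric
score = [10, 12, 14, 20, 22, 24]  # invented outcome values

slope = ols_slope(group, score)
# The 'slope' equals the difference between group means (22 - 12 = 10),
# even though no value between the two categories exists.
print(slope)
```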
In an example of qualitizing, Cialdini et al.
(1976) calculated the frequency of ‘we’ and
‘non-we’ statements used to describe team and
personal outcomes for players on a sport team.
A dimension of quantity (counts/frequency) is
rendered in terms of subjective ownership of
instrumentality in a sport team’s victory or de-
feat. Similarly, in an example of quantitizing,
Pollard, Nievar, Nathans, and Riggs (2014)
counted the frequency of occurrences of various
themes from qualitative narratives and con-
cluded, based on nonsignificant chi-squared
analyses, that the experiences of Hispanic and
Caucasian mothers did not differ thematically.
In this example, a quantitative rubric is utilized
to make claims regarding dimensions of expe-
rience. In these examples, we see the need to
take special care when interpreting the mean-
ings of the statistical analysis and the operation-
alization of the constructs given that the data
(quantitative or qualitative) and interpretation
(qualitative or quantitative) do not coincide.
Figure 1. The possible configurations of data and interpretative frames of reference, represented as a continuum: ‘pure’ quantitative (numerical analysis of data that are quantities), the middle ground of qualitizing and quantitizing, and ‘pure’ qualitative (descriptive analysis of data that are non-numeric). The middle ground is of special concern regarding the practice of mixed methods.
201 QUANTITATIVE AND QUALITATIVE DOMAINS
Although we argue that neither method holds
a privileged perspective on the world, these two
modes of description are distinguished, for the
most part, by their respective approaches. We
hold that no inquiry can be undertaken from a
perspective-less position (Merleau-Ponty, 1945/
1962) and thus even natural science is not value
free (see Kendler, 2005, who asserts this, and
Garza, 2006, who refutes this position). Indeed,
we would hold that an explicit acknowledgment
of approach is necessary to assess the validity of
any inquiry (Churchill, Lowery, McNally, &
Rao, 1998; Garza, 2004).3 Specifically in qual-
itative research, validity comprises a coherence
between the researcher’s frame of reference, the
research question, the data, and the findings.
Next, we will turn to some specific concerns
with mixed methods.
Concerns Regarding the Intersection of
Quantitative and Qualitative Frameworks
The Question of Hegemony of Approach
In a qualitative research training meeting,
conducted for researchers who were for the
most part both well-versed in quantitative re-
search models and inexperienced in qualitative
research, one individual expressed a concern
with the notion of ‘interrater reliability’ and a
desire to make sure all the ‘coders’ were naming
themes the same way. This individual felt that if
one coder named a theme ‘reluctance’ and an-
other named it ‘resistance,’ the analysis would
not be reliable, that is, the same. This individual
proposed providing a list of themes that all
coders would share before conducting the anal-
ysis. In quantitative research, interrater reliabil-
ity is commonly computed as a correlation
coefficient describing the degree of overlap
between two coders’ ratings (regardless of what
scale of measurement the code comprises). The
concern with the two words being the ‘same’
was a quantitative concern posed to a qualitative
question. Here a qualitative claim is based on a
quantitative rubric: the meanings are themati-
cally related but the rubric is a numeric one of
the codes used in reliability analysis. The
knowledge claim here is one of corresponding
magnitudes of evaluations. Judging the reliabil-
ity of the responses based on their thematic
coherence instead allows us to recognize their
‘sameness’ while preserving the subtle and nu-
anced differences captured in different ways of
expressing it, highlighting the different perspec-
tives that are brought to bear when analyzing
qualitative data. The potential of qualitative re-
search to discern a complexity of meaning
should not be hampered by the quantitative con-
cern with reliability as correlation. Reliability in
quantitative analysis rests on sameness, repeti-
tion; in qualitative research it rests on related-
ness (further discussed in Churchill & Wertz,
2002; Garza, 2004, 2007, 2011; Giorgi, 2009).
This example presents an opportunity to illumi-
nate the challenges that arise when the approach
of one research model is applied to the practices
of the other. Answering the concern raised here
necessitates that we understand the differences
in approaches that could be illustrative for prac-
titioners in this area to avoid some of the com-
mon pitfalls we are addressing.
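The workshop example can be made concrete with a small sketch (the passages and labels are invented). A purely numeric rubric scores ‘reluctance’ versus ‘resistance’ as sheer disagreement, while a thematic rubric, here a toy lookup of related labels, registers their kinship.

```python
# Two coders label the same five passages; labels are invented examples.
coder_a = ["reluctance", "joy", "fear", "reluctance", "hope"]
coder_b = ["resistance", "joy", "fear", "resistance", "hope"]

# A numeric rubric: exact-match percent agreement. 'reluctance' versus
# 'resistance' counts as disagreement, regardless of thematic kinship.
exact = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

# A toy thematic rubric: label pairs listed as related also count as
# agreement, preserving nuance while recognizing 'sameness' of theme.
related = {frozenset({"reluctance", "resistance"})}
thematic = sum(
    a == b or frozenset({a, b}) in related
    for a, b in zip(coder_a, coder_b)
) / len(coder_a)

print(exact, thematic)  # 0.6 1.0
```

The same data yield 60% agreement under the numeric rubric and full coherence under the thematic one, which is precisely the gap between reliability as repetition and reliability as relatedness.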
In a particularly illustrative example,
Fredrickson and Losada (2005) adopted formu-
las created and suitable for fluid dynamics in
physics to explain changes in attitudes over
time. Resting on the presumption that attitudes
are not only similar to but follow the same laws
of nature as fluid, these researchers have not
taken into account the differing philosophical
approaches that shape both of these phenomena.
Quite apart from whether attitudes are a physi-
cal ‘thing’ like water for instance, the use and
application of these mathematical formulas
again highlights the hegemony of quantitative
frameworks. Following the critiques raised by
Brown, Sokal, and Friedman (2013), the utili-
zation of these models can raise serious episte-
mological and conceptual concerns.
Another example of hegemony of perspective
is raised by Giorgi regarding the practice of
some qualitative researchers to ‘verify’ their
qualitative interpretative analyses by their par-
ticipants or other ‘judges’ (Giorgi, 2008; Pollio,
Henley, & Thompson, 1997). Giorgi (2008) as-
tutely points out that participants are not versed
in either the approach or procedures used for the
analysis and thus could not assess its validity.
3 Creswell and Clark (2007) also point to the importance
of laying out the philosophical underpinnings of research.
However, in the literature, neither quantitative nor qualita-
tive research uniformly does this.
Similarly, we would add that statistical results
could not be verified by the participants be-
cause we cannot presume sufficient statistical
sophistication to make such a judgment. Al-
though it might seem that we are singling out
incursions of quantitative into qualitative prac-
tice, we suspect this is because the highly spe-
cialized language of statistics makes incursions
in the other direction less likely; everyone
speaks in narratives but not everyone speaks in
statistical narratives. In either case, instances of
either incursion point to the need for method-
ological pluralism.
Counts
Our next concern is the use of counts in
qualitative research (see Leech & Onwueg-
buzie, 2011; Miles & Huberman, 1994; Sand-
elowski, 2001). A number of ‘qualitative’
articles include frequency counts of themes or
engage in ‘quantitizing,’ assigning a numerical
value to qualitative data that is then subjected
to quantitative analysis (see Dutton & Win-
stead, 2011; Sandelowski et al., 2009). It would
be a mistake to equate frequency with impor-
tance or worse yet to conduct statistical analysis
with these counts as in what Sandelowski
(2001) calls “acontextual counting.” Examples
could include counting the number of times a
particular word is said, with the count taken to
imply a greater importance of that dimension of
meaning in that person’s life. Often what an individual does not
say is just as revealing and important as what
they do say and when counting something, this
‘absence’ is not taken into account. In a thesis
workshop for senior undergraduates conducting
phenomenological research, a participant pro-
vided a description of losing her virginity and
the most striking part was that she never men-
tioned the partner once in the entire description
(Garza, 2004, Spring). Here, the lack of any
mention of the other party involved reveals
much about this phenomenon as meaningfully
lived by the participant. We argue that as soon
as one begins to count themes, one is no longer
conducting qualitative research and not really
conducting quantitative research either. This, in
our minds, fails to respect the proper domains
for both types of research.
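What ‘quantitizing’ of themes looks like in practice can be sketched in a few lines (the coded themes are invented). The count is trivial to produce; the caution above is that the resulting frequencies measure magnitude, not importance, and a meaningful absence never appears in them at all.

```python
from collections import Counter

# Invented theme codes assigned to segments of a qualitative narrative.
coded_segments = [
    "loss", "belonging", "loss", "identity",
    "loss", "belonging", "identity", "loss",
]

# Quantitizing: non-numeric codes become frequency counts, which are
# then available to statistical analysis. Note what the count cannot
# show: a theme that is never voiced simply does not appear here.
freq = Counter(coded_segments)
print(freq.most_common())  # [('loss', 4), ('belonging', 2), ('identity', 2)]
```

Nothing in `freq` records the absent partner of the virginity-loss description; the most revealing datum leaves no trace in the tally.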
Another example of a heightened concern
with numbers in qualitative research is what
Sandelowski (2001) refers to as “analytic over-
counting.” This refers to the tendency by some
qualitative researchers to count everything that
could possibly be counted to the detriment of
clear presentation of the qualitative findings.
Examples of this include a focus on the precise
number of themes identified whereby the actual
count is given greater emphasis than a descrip-
tion of the themes themselves. Sometimes even
when patterns of meanings comprise the results,
Sandelowski reports that researchers become
preoccupied with the number of participants
who exhibit the themes where the focus is on
frequency and less so on the meaning of the
themes or patterns. All of these examples point
to the need for researchers to be mindful of the
type of data being gathered and the analytic
approach undertaken paying particular attention
to the appropriate knowledge claims.
Confirmation and Validation
The practice of ‘(dis)confirming’ and
‘(non-)validating’ one set of findings with
another set when the data and interpretive
frameworks are not matched is widespread
(see Ellis, Marsh, & Craven, 2009; Hastings,
2012; Riegel, Dickson, Kuhn, Page, & Wor-
rall-Carter, 2010; Sechrist, Suitor, Riffin,
Taylor-Watson, & Pillemer, 2011 for exam-
ples). In all of these examples, quantitative
and qualitative data are used to explicitly
‘confirm’ and ‘verify’ each other and to as-
sess ‘concordance’ of findings.
Wagner et al. (2012) argue against ‘confir-
mation’ of findings rooted in one approach by
research rooted in the other because conflicting
results might initially appear problematic. If we
examine the hippocampus from a neurophysio-
logical point of view and find there are differ-
ences between those (including animal species)
who hoard and those who do not hoard (see,
e.g., Brodin & Lundborg, 2003; Hampton,
Sherry, Shettleworth, Khurgel, & Ivy, 1995;
Volman, Grubb, & Schuett, 1997), it would not
be appropriate to use qualitative data to confirm
differences in the hippocampus. On the other
hand, would the hippocampal findings confirm
differences in memory found in qualitative
data? Although these two sets of findings from
two approaches shed light on each other, we do
not believe one can confirm the other without
implicitly holding that one type of data is more
valid and thus the basis for such confirmation. If
But it would no more ‘confirm’ the findings
related to hippocampal volume than a German
translation of Shakespeare could confirm the
Chinese translation; the point here is that a
researcher must understand the differences in
the languages used. The increase in hippocam-
pal volume and the importance of memory are
two complementary findings: they neither con-
firm each other nor disaffirm each other. To-
gether, they expand our understanding of the
role of memory in those who hoard.
Another example of this practice of ‘confir-
mation’ and ‘validation’ arises when attempting
to interpret percentage of concordance between
qualitative and quantitative findings. In a mixed
methods study examining self care behaviors
among patients with heart failure, Riegel et al.
(2010) computed the percentage of agreement
between identification of a self care theme in the
participants’ narratives with a cutoff score on a
quantitative survey. Although the two research-
ers independently analyzed the two types of
data respectively, the operationalization of self
care has already been defined in advance with
the use of a quantitative survey and the calcu-
lation of ‘concordance’ rates presumes that the
lived experiences provided through the narra-
tives will touch on the same points raised by the
survey items and vice versa. Furthermore, the
concordance rates are taken to be an indication
that the quantitative and qualitative methods are
more valid and thus more trustworthy if a higher
rate of concordance is reached. However, it is
not immediately clear what this percentage of
agreement means; for instance, if self care main-
tenance reached 75% agreement but self care
confidence reached 95% agreement, what does
the 20% difference mean? Assuming 100 par-
ticipants in the sample, this difference would
precisely mean that 20 more people provided
evidence of this theme in their narratives and
circled a higher number on the survey. Can this
increase indicate that one piece of data is more
valid? We suggest not, because the validity of
either quantitative or qualitative methods rests
upon the respective philosophical approach un-
dergirding each type of method; using
one method cannot ‘confirm’ or ‘validate’ find-
ings in the other. This practice renders the qual-
itative data into a dimension of magnitude again
marking implicit adherence to a quantitative
frame of reference. Additionally, this practice
rests on the presumption that the number one
circles for a group of items operationalized to
measure a phenomenon will coincide with a
description or narrative provided by the partic-
ipants. How one narrates one’s experiences may
or may not match with a list of items on this
topic and one of the benefits of conducting
mixed methods would be to examine this pos-
sibility. However, holding this presumption of
similarity across two types of data collected
shuts down the possibility of examining this
dimension when the goal is to assess ‘concor-
dance.’ What these researchers have rightly dis-
cerned is that there are similarities here as well
as a relationship between these two methods;
however, similarity has both qualitative and
quantitative dimensions, and a change in one
does not necessarily map onto a manifestation
in both types. As Rollo May points out when
describing the differences between memory ca-
pacities in humans and sheep, a difference in
terms of length of time or other quantitative
distinctions also imply quality differences but
given the distinct interpretative frames of refer-
ence, these two changes cannot be assumed to
be ‘the same’ (May, 1979). When these two
methods are used to validate each other or when
one usurps the other, the strength of multiple
perspectives is diminished, eradicating the
possibility of exploring amplification, differ-
ences, similarities, and so forth, because both types
of data are viewed from one perspective and
thus conflated.
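The concordance arithmetic discussed above can be sketched directly (all values invented to mirror the worked numbers in the text): percent agreement between a theme’s presence in a narrative and a survey score passing its cutoff.

```python
# Invented data: for each of 100 participants, whether agreement was
# observed between the narrative theme and the survey cutoff.
n = 100

# Toy agreement patterns matching the worked numbers in the text:
# 75 participants agree on 'maintenance', 95 agree on 'confidence'.
maintenance_agree = [True] * 75 + [False] * 25
confidence_agree = [True] * 95 + [False] * 5

pct_maintenance = 100 * sum(maintenance_agree) / n  # 75.0
pct_confidence = 100 * sum(confidence_agree) / n    # 95.0

# The 20-point gap means exactly this and no more: 20 additional people
# both voiced the theme and scored past the cutoff. Whether that makes
# either finding 'more valid' is the epistemological question at issue.
print(pct_confidence - pct_maintenance)  # 20.0
```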
Likewise, we contend that neither method
can be used to confirm or disconfirm the other.
Instead, we suggest that the frame of reference
here is ‘augmentation.’ Consider the ‘mountain’
task used to assess developmental egocentrism
as an analogy here; a child sits at a square table
with a three-dimensional mountain and is asked
to describe the mountain from various view-
points. The egocentric child cannot discern how
a viewer sitting on the other three sides of the
table would see anything different from what he
or she sees from his or her own perspective. He
or she might even be perplexed by the fact that
such an observer could see something ‘at odds’
with what he or she sees. Similarly an ‘ap-
proach-centric’ researcher might seek ‘confir-
mation’ of his or her own perspective when
conducting mixed methods. We argue that a
methodologically pluralistic researcher would
see that the complementary perspectives of
other approaches need not ‘confirm’ their own
perspectival view but augment it, providing a
more complex and full description of the phe-
nomenon being investigated. Rather than have
convergence or agreement as a goal of mixed
methods, we advocate for complementarity in
mixed methods.
Mixing Methods as Complementarity of
Strengths: The Case for Methodological
Pluralism
Both quantitative and qualitative research
methods are limited in scope when it comes to
assessing dimensions of meaning and magnitude,
respectively (Giorgi, 2009). But it is perhaps more fruitful to
think of these limits as domains of strength.
These domains describe the frontiers of the two
research models and set the stage for a comple-
mentarity of strengths whereby our understand-
ing of the phenomena we research is more com-
plete in view of the differences than that
proffered by ‘verification’ or ‘confirmation.’
Complementarity requires that research con-
ducted from any point along the continuum (a)
acknowledge the differences between the ap-
proaches, (b) show respect for these differences,
and (c) possess a mindfulness that the ‘middle
ground’ we have described comprises complex
intersections of knowledge claims, epistemo-
logical assumptions, and approach. We advo-
cate that no position on this continuum is priv-
ileged and that methodological plurality allows
researchers to more fully describe a phenome-
non across this full continuum generating a
wide array of knowledge.
In the recent Best Practices for Mixed Meth-
ods Research in the Health Sciences, Creswell
et al. (2011) describe three ways of integrating
qualitative and quantitative data. Two of the
three types, connecting and embedding data,
respect the boundaries of the two domains
whereby one type of inquiry informs the other
type of inquiry at a subsequent or concurrent
time, respectively. The other type of integration,
merging data, mixes up the ‘messy middle’ by
using one type of data to compare and/or con-
firm the findings from the other type.
When connecting data, one type of data anal-
ysis is used to inform the collection of a second
type of data at a subsequent time point. In this
way, the data gathered are analyzed using the
methods appropriate to the type of data gath-
ered. In our own mixed methods research ex-
ample below, our qualitative analysis illumi-
nated a transformed meaning of home that
suggests an additional variable to examine in
future quantitative research. The connecting
process does not violate the boundaries as the
type of data gathered (numeric vs. non-numeric)
is appropriately analyzed (quantitative vs. qual-
itative, respectively), enabling the two ap-
proaches to mutually shed light on each other
while neither confirming nor validating one ap-
proach over the other.
Likewise in the embedding data method, one
type of data analysis is deemed primary and the
other as secondary. The primary method is cho-
sen appropriately given the type of data being
collected while the secondary method is chosen
for supplemental and illuminating purposes.
Like the connecting process above, the embed-
ded process does not violate the boundaries
between approaches.
However, the merging process can violate the
boundaries we have outlined above. In this pro-
cess, a researcher can transform a piece of qual-
itative data into counts that are then subject to
quantitative analyses. In our view, this violates
a fundamental difference in the two approaches;
namely the non-numeric qualitative data when
transformed into number of times a theme is
mentioned departs from a dimension of mean-
ing (i.e., importance) and transforms it into the
currency of magnitude (i.e., counts). This merg-
ing process calls into question the boundaries
that divide these two approaches and the differ-
ent currencies that each trade in. As argued
above, the act of using one type of approach to
validate or confirm the other neglects how each
has its own language, understanding, and phil-
osophical foundations. However, this does not
mean that the two approaches cannot be used
concurrently in one research project. Rather
than confirming or validating, where one ap-
proach is more highly valued, we feel that when
both approaches and domains are respected, the
two types of results can shed light and illumi-
nate the subject matter as well as provide a
greater understanding than either approach
could on its own.
Kendler (2005) argues that methodological
plurality would create confusion and contradic-
tion and argues for a strict natural science ap-
proach to psychological research utilizing the
methods of quantitative research. To our minds
this is akin to saying that a meal could be
accurately described either by a list of its ingre-
dients or by the subjective experience of its
deliciousness but not both; reporting both
would be ‘confusing’ or ‘contradictory.’ De-
spite these claims that the two types of research
are incompatible (Kendler, 2005), we have il-
lustrated that the goal of both types of research
is to gain a more complete understanding of the
phenomenon under investigation. Rather than
rely on a ‘monomethod,’ we suggest that meth-
odological plurality allows researchers to draw
on the strengths of both quantitative and quali-
tative research.
As a case in point, Trend (1979, as cited in
Teddlie & Tashakkori, 2009) explored program
implementation and found discrepant results in
the quantitative and qualitative analyses. Spe-
cifically, when examining the quantitative data,
the program was rated positively across sites
and it appeared successful. However the quali-
tative data provided the researchers with a qual-
itative impression that the implementation of
the program was not successful and problems
were encountered at the various sites. When
attempting to reconcile these apparent differ-
ences, the researchers discovered that a contex-
tual variable, the site’s urban versus rural loca-
tion, could account for the discrepancy:
dimensions of meaning associated with this
distinction, in terms of costs, income of
families, ethnicity, and ease of recruitment,
among others, revealed further nuances in the
quantitative findings. By examining the qualita-
tive data gathered for implementation of the
program at each site rather than collapsing
across sites as the initial quantitative analysis
did, the researchers used both types of data
gathered to augment each other and to illumi-
nate the contextual factors specific to each pro-
gram. Only when both data and thus both anal-
yses were incorporated and examined together
could the findings give a more comprehensive
picture of implementation. This example illumi-
nates how posing a qualitative question could
lead to a reevaluation of a quantitative analysis
providing further insights that would not have
been possible if only one method had been
applied.
Another example of the appropriately com-
plementary relationship of quantitative and
qualitative analysis in mixed methods research
is a study we conducted on Facebook usage and
its relationship to satisfaction with college life
(Landrum & Garza, 2011). We began with a
quantitative study by asking our participants to
report on their Facebook (FB) usage and gath-
ered measures of social capital among other
measures of demographic and college experi-
ence. We tested a Structural Equation Model
(SEM) and found that when heavy users of FB
were connecting with friends from high school,
they reported less satisfaction with college life
when compared to students who were connect-
ing with fellow students and classmates at col-
lege, illuminating dimensions of magnitude.
This quantitative finding suggested a fruitful
avenue to explore dimensions of meaning, what
FB means to them. Our structured interview
focus group analysis revealed a theme that
could not emerge from the quantitative analysis
as we had conceived it. Our spontaneous inter-
action with participants and open-ended analy-
sis allowed us to discern that for some students
the meaning of home had transformed from
their parents’ home to their college residence.
This qualitative finding shed new light on our
interpretation of the SEM model suggesting that
it was not so much how often students used FB
but rather how they were using FB, whether
they were connecting with those in their current
milieu and past milieu and which of these mi-
lieus was understood by participants as their
home. This opens a whole new avenue of re-
search of both kinds. The benefits of truly col-
laborative mixed methods cannot occur when
each or either model is corrupted to the pur-
poses of the other. In both of these examples,
the relationship between the two approaches is
not one of confirmation or validation but of
augmentation. Just as describing the mountain
scene from two sides of the table yields a more
comprehensive description, the full potential of
mixed methods research becomes possible
when the boundaries are respected, the strengths
honored, and the two models are thus mutually
and truly complementary across the entire con-
tinuum of research approaches.
Like all who sojourn beyond their homes,
methodological adventurers would be well ad-
vised to learn the language and customs of the
domains they visit. The necessity of this only
comes to light when one recognizes that a fron-
tier has been crossed. To achieve a truly appro-
priate balance between quantitative and qualita-
tive research methods as well as mixing the two
approaches, we recommend methodological
pluralism. Envisioning a sort of methodolog-
ical multiculturalism, we call for other
researchers in the field to join this discussion
and engage in dialogue with each other. We
argue that together, quantitative and qualitative
approaches are stronger and provide more
knowledge and insights about a research topic
than either approach alone. While both ap-
proaches shed unique light on a particular re-
search topic, we suggest that methodologically
pluralistic researchers would be able to ap-
proach their interests in such a way as to reveal
new insights that neither method nor approach
could reveal alone. When both quantitative and
qualitative researchers reach out to each other
across the fence, learn the language, and respect
the boundaries outlined above, we can start to
make great strides in the emerging field. Only
when both sides understand and respect the
domains can the differences and uniqueness of
both approaches be appreciated.
References
Agresti, A. (2002). Categorical data analysis (2nd
ed.). Hoboken, NJ: John Wiley & Sons. http://
dx.doi.org/10.1002/0471249688
Brodin, A., & Lundborg, K. (2003). Is hippocampal
volume affected by specialization for food hoard-
ing in birds? Proceedings of the Royal Society of
London B: Biological Sciences, 270, 1555–1563.
http://dx.doi.org/10.1098/rspb.2003.2413
Brown, N. J. L., Sokal, A. D., & Friedman, H. L.
(2013). The complex dynamics of wishful think-
ing: The critical positivity ratio. American Psy-
chologist, 68, 801–813. http://dx.doi.org/10.1037/
a0032850
Carifio, J., & Perla, R. (2008). Resolving the 50-year
debate around using and misusing Likert scales.
Medical Education, 42, 1150–1152. http://dx.doi
.org/10.1111/j.1365-2923.2008.03172.x
Churchill, S. D., Lowery, J. E., McNally, O., & Rao,
A. (1998). The question of reliability in interpre-
tive psychological research: A comparison of three
phenomenologically based protocol analyses. In R.
Valle (Ed.), Phenomenological inquiry in psychol-
ogy: Existential and transpersonal dimensions (pp.
63–85). New York, NY: Plenum Press. http://dx
.doi.org/10.1007/978-1-4899-0125-5_3
Churchill, S. D., & Wertz, F. J. (2002). An introduc-
tion to phenomenological research psychology:
Historical, conceptual, and methodological foun-
dations. In K. J. Schneider, J. F. T. Bugental, &
J. F. Pierson (Eds.), The handbook of humanistic
psychology: Leading edges in theory, research,
and practice (pp. 247–262). Thousand Oaks, CA:
Sage.
Cialdini, R. B., Borden, R. J., Thorne, A., Walker,
M., Freeman, S., & Sloan, L. (1976). Basking in
reflected glory: Three (football) field studies. Jour-
nal of Personality and Social Psychology, 34,
366–375. http://dx.doi.org/10.1037/0022-3514.34
.3.366
Creswell, J. W. (2009). Research design: Qualitative,
quantitative, and mixed methods approaches (3rd
ed.). Thousand Oaks, CA: Sage.
Creswell, J. W., & Clark, V. (2007). Designing and
conducting mixed methods research. Thousand
Oaks, CA: Sage.
Creswell, J. W., Klassen, A. C., Plano Clark, V. L., &
Smith, K. C. (2011). Best practices for mixed meth-
ods research in the health sciences. Bethesda, MD:
National Institutes of Health, Office of Behavioral
and Social Sciences Research. Retrieved from http://
obssr.od.nih.gov/mixed_methods_research
Dutton, L. B., & Winstead, B. A. (2011). Types,
frequency, and effectiveness of responses to un-
wanted pursuit and stalking. Journal of Interper-
sonal Violence, 26, 1129–1156. http://dx.doi.org/
10.1177/0886260510368153
Ellis, L. A., Marsh, H. W., & Craven, R. G. (2009).
Addressing the challenges faced by early adoles-
cents: A mixed-method evaluation of the benefits
of peer support. American Journal of Community
Psychology, 44(1–2), 54 –75. http://dx.doi.org/