© 2016 DiLoreto and Gaines and IJLTER.ORG. All rights reserved.
International Journal of Learning, Teaching and Educational Research
Vol. 15, No. 12, pp. 145-154, November 2016
An Investigation of Discrepancies between
Qualitative and Quantitative Findings in Survey
Research
Melanie DiLoreto and Trudi Gaines
University of West Florida
Pensacola, Florida, USA
Abstract. The purpose of this paper is to bring attention to an important
aspect of mixed methods research design which occurs when the
qualitative findings do not match the quantitative findings within a
particular study. Different perspectives on this phenomenon attempt to
understand and then to resolve such differences. The authors present
these various perspectives as well as an example of a study which
illustrates the phenomenon in question. Recommendations for
resolving differences within the study are given based on the
perspectives presented and conclusions are drawn relevant to an
analysis founded in the literature.
Keywords: mixed methods; research methods; survey research;
discrepant findings
Introduction
While mixed methods research design seems to represent a desirable and
legitimate alternative to purely quantitative or qualitative designs, it can bring
with it a result that requires attention and reconciliation; that is, the presence of
differences between the qualitative and quantitative findings within a mixed
methods study. Incongruent mixed methods results were first recognized by
Campbell and Fiske (1959), who established that multiple methods are used to
validate findings and to ensure that observed variance reflects the underlying
trait rather than the method of measurement (multiple
operationalism). While the theoretical differences that point to incompatibility of
qualitative and quantitative research methods may have been largely put to rest
by theorists in the field (Howe, 2012), actual findings can and do express
discrepancies between the two data sets when conducted either simultaneously
or in tandem.
We turn first to a discussion of triangulation in the context of mixed-
methods studies, insofar as it provides a vehicle for synthesizing findings into
an overarching view of the issue at hand. There are mixed views of the
definition and use of triangulation in research (Hussein, 2009). Triangulation
was first promoted in the literature by Campbell and Fiske (1959) as an
important, if not necessary, means of ensuring that findings withstand scrutiny
by the greater research community. The initial purpose of triangulation
was that the various methodologies’ findings would support or complement one
another, thereby providing the overall conclusion greater validity. Mathison
(1988) introduced a different concept of triangulation which addressed the
occurrence of divergent findings among various methodologies and presented
three possible triangulation outcomes: convergence,
inconsistency, or contradiction. She also proposed the importance of a holistic
approach to understanding the research study itself especially in the presence of
contradictory findings. This holistic approach involves an understanding and
explanation of context in as broad a way as possible. Such an approach was
supported by Howe’s (2012) rationale for triangulation as a valid and important
pursuit regardless of particular elements of a mixed-methods approach being
either conjunctive or disjunctive, that is, addressing the same or different
research questions, respectively.
From a different perspective, triangulation can be between-method,
combining methodologies in one study of the same phenomenon, or
within-method, using multiple techniques within a single method to collect
and interpret data (Denzin, 1978). In the case of survey research, multiple scales or various indices
are often used to cross-check for internal consistency (Jick, 1979). Denzin (1978),
however, indicated that triangulation should not be confused with mixed
methods; instead, these are two distinct ways to conceptualize
interpretations and findings. The process, in a final perspective, includes an
assumption that data analyses can come from a variety of designs and that the
data analyses are not dependent upon the design that is employed
(Onwuegbuzie & Teddlie, 2003). By using triangulation, conclusions about the
research questions can be presented as the result of a higher level of synthesis
(Lee & Rowlands, 2015). Triangulation therefore, is in support of the
complementary theorist who carefully considers the outcome in logical sequence
and evaluates whether differences in conclusions can co-occur. The model of
triangulation is not proposed in this context to cross-validate data, but rather to
capture an assortment of aspects relative to a similar phenomenon. Thus,
triangulation is offered as a potential ontological explanation for discrepant
findings in the present mixed-methods research design.
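To make the within-method cross-check described above concrete, the following minimal sketch (in Python, with fabricated Likert-scale responses and an illustrative five-item subscale) shows one common internal-consistency check, Cronbach's alpha; the data and item counts are assumptions, not those of any study cited here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

# Fabricated six-point Likert responses (1-6) for a five-item subscale:
# each respondent's items vary around a common "trait" level, so the
# items should be positively correlated.
rng = np.random.default_rng(0)
base = rng.integers(1, 7, size=(100, 1))
items = np.clip(base + rng.integers(-1, 2, size=(100, 5)), 1, 6)
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```

An alpha in the neighborhood of .80 or higher would typically be read as acceptable internal consistency for a survey subscale of this kind.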
With respect to typology of mixed methods design, a meta-analysis
(Bryman, 2006) showed that two of the possible methods (both quantitative and
qualitative) dominated the studies examined: self-administered questionnaires
and semi-structured interviews. A coded survey instrument was used in 82.4%
of the studies included. Bryman focused on the issue of intention or
rationale for each of the components that comprised the studies. That is, do the
quantitative and qualitative components target different or like concepts, opinions, or perceptions?
Another motivation for using mixed-methods design can be triangulation, but
this can be a strategy after the fact when findings from the different methods are
incongruent. Bryman posited a possible “lack of certainty” among researchers
employing mixed-methods and that increased certainty about motivation might
reduce redundancy among the data obtained.
We turn next to an examination of multiple views about how to proceed
when discrepant results are found in mixed-methods studies. In cases of
contradiction, the researchers may simply juxtapose the findings, reporting them
as a recommendation for future research rather than seeking to explain or to
reconcile them (Brannen, 2005). Another possibility is that one set of findings
may be described as being “better” than the other. Wagner et al. (2012) reported
that addressing the differences between findings is an area of investigation unto
itself and that there are two theoretical foundations for integrating findings:
positivist views and socially relative views. Discrepant findings should be
interpreted as being in a reciprocal relationship rather than in an oppositional
one (Smith, Cannata, & Haynes, 2016). Wagner et al. further posited that the
tensions and dynamics around the various positions should be viewed as both
intellectually and scholastically healthy. Discrepant findings can point the
researchers to potential flaws in the construction of measuring instruments such
as unintended ambiguity or a deficit in the depth of participants’ responses.
Fielding (2009) held that an overarching benefit of examining differing results is
the prevention of “analytic tunnel vision” by way of achieving “analytic
density.”
As an alternative to explanation or reconciliation, researchers should
review the methodological issues and possibly reject one set of findings outright
(Slonim-Nevo & Nevo, 2009). Such methodological issues would include
problems with the sample, with the measuring instrument itself, and/or with
procedures followed in either the quantitative or qualitative portion of the
study. Such an analysis could lead to the conclusion that one data set is
superior to the other, in which case the weaker set should be rejected. When no such methodological
issues are present, researchers should then determine if the contradictions
between findings are logical or not, with logical differences potentially leading
to further investigation. Finally, it is important to determine whether the findings are
discrepant (conflicting) or contradictory. If they are discrepant, then
often the discrepancies can help lead the researchers to a conclusion that will
likely be viewed as logical. If the results, however, are contradictory and there is
no logical way to resolve the differences, then the efforts should be abandoned, a
recommendation that aligns with the previous one suggesting a simple
reporting of differences to be left for further investigation.
Another perspective is to approach differences as being either
complementary or noncomplementary. In the complementary approach,
findings are not contradictory and thus can be logically reconciled. The
noncomplementary approach necessitates abandoning one set of findings in
favor of the other, as the two cannot be logically understood to coexist. The
noncomplementary view therefore assumes that conflicting findings force a
choice about which set to keep and report and which to eliminate. The
complementary theorist, by contrast, holds that discrepant findings can coexist
in harmony; rather than simply accepting results that seem not to make sense,
the complementary theorist attempts to rectify the conflict by making logical
connections. For instance, in
reality and in nature, conflicts are always present. Consider the following
example. An individual might believe that the universe is the result of a massive
explosion dating back approximately 14 billion years. Another may
disagree, arguing that God created the universe. The noncomplementary
theorist would not accept this disagreement; both views cannot be true, and one
must be wrong. Alternatively, the complementary theorist attempts to make
logical sense of both positions in their full complexity, recognizing the
multifaceted nature of human understanding. Contradictions, on the other hand,
are logically impossible; there do exist absolutes that cannot be debated. For
example, it cannot be both raining and not raining in the same place at the same time.
Therefore, it is imperative for the researcher to search for potential and logical
explanations for discrepant results and to accept the contradiction if none exist.
Conversely, complementary findings necessitate consistency across
methodologies (i.e., qualitative and quantitative). If both are in accordance with one another, results
are considered to be complementary.
DeLisle (2011) went so far as to assert that discrepant findings are
necessary in order to portray all aspects of a particular issue, especially when
the findings are used for policy-making decisions. Issues and questions
about education and educational policy can often best be addressed by
eliciting more nuanced responses than can be obtained through strictly
quantitative methodologies. Complementary, discrepant findings are necessary
to illustrate the contextual aspects of an issue that are not apparent in
quantitative data alone (DeLisle, 2011; Slonim-Nevo & Nevo, 2009). Positivist
views support one sole, measurable “truth” whereas socially relative views
support the notion that context influences facts, expanding the possibilities for
disparate findings and justifying their importance.
Moffatt, White, Mackintosh, and Howel (2006) proposed six ways to
further explore differences in the data. The first is to treat the two
methodologies as fundamentally different. This leads to treating the two
resulting datasets as findings to complement each other rather than to be
integrated into each other. The second is to explore the methodological rigor of
each component. The recommendation is to use the findings from one
methodology as a benchmark to more closely examine the rigor of the other.
The third is to explore the dataset comparability. That is, in studies where a
different or modified sample population is used, likeness of the participants
between the methodological groups should be examined. The fourth is to collect
additional data and make more comparisons. For instance, in studies where a
smaller subset of participants is interviewed in addition to the larger sample
which participated in the survey, the recommendation is to interview additional
participants. The fifth is to explore if the intervention under study worked as
expected. It may be that initial assumptions made about participants were not
entirely accurate with such inaccuracies potentially contributing to skewed or
invalid findings. The sixth is to explore whether the outcomes of the
quantitative and qualitative components match insofar as they address the same
constructs or domains. It can sometimes be the case that quantitative findings
do not result from sufficiently explicit or individualized interrogations by way
of a survey whereas qualitative, open-ended questions provide the “room”
needed by participants to sufficiently express or explain their responses. The
first four of these could be applied to survey research and, particularly, to the
study discussed later in this article, which investigates the issue of assessment
in education. The fifth proposed way would not apply here, as there is no
intervention involved in the study, and the sixth way would not apply because
the construct investigated is the same in both the qualitative and quantitative
components.
Education seems an especially appropriate discipline for mixed methods
research as educators navigate their way through difficult policy issues using
data-driven decisions to inform their practices. Postpositivist researchers believe
that an independent reality exists and can be studied. At the same time,
postpositivists reject the objectivity of theoretical notions since mankind is
inherently biased and largely influenced by cultural nuances. Thus, theoretical
constructs can be measured to some degree but never fully grasped
(Onwuegbuzie, Johnson, & Collins, 2009). Such measurement is possible only if
the researcher takes a neutral and objective stance toward the research and
remains as detached from the results as possible. In a constructivist paradigm, researchers believe
that contradictory results in a mixed method design are equally valid measures
of the same phenomenon (Onwuegbuzie et al., 2009). From the perspective of a
postpositivist critical realism, answers to the many and varied questions and
challenges that educators and policymakers face today will likely be required to
serve and satisfy the different points of view of all stakeholders. As educators
and researchers proceed to resolve challenging issues such as
assessment at numerous levels, it is clear that research findings should inform
the direction toward solutions and strategies that are the most representative
and valid. Toward that end, mixed methods research is an ideal methodology as
it bridges the longstanding divide, at times a hostile one, between the two
polarities of qualitative and quantitative research methods (Johnson &
Onwuegbuzie, 2004).
Results of Recent Study
A recent survey investigating conceptions of assessment among faculty
and students in higher education contained both quantitative and qualitative
items (DiLoreto, 2013). According to Bryman’s (2006) meta-analysis on
typologies in mixed methods research, this study’s quantitative and qualitative
components addressed the same concepts with respect to participants'
conceptions, which serves both as the rationale for their inclusion and as
evidence that there was no lack of certainty about why they were included. An
analysis of the results showed that student responses were consistent between
the quantitative and qualitative data; however, there were considerable
discrepancies between the nature of the responses from faculty on the
quantitative items when compared to the qualitative items. This discrepancy
captured the interest of the authors and led to the investigation of this
phenomenon.
All undergraduate students and all full- and part-time faculty members
holding at least a bachelor's degree who teach at Level V institutions of higher
education located within the accreditation region of the Southern Association of
Colleges and Schools (SACS) were asked to participate in this study. Level V
doctoral-degree granting institutions are defined by SACS as institutions that
offer three or fewer doctoral degrees as their highest degrees. For the purposes
of this study, faculty members were identified as university employees whose
primary duty is classroom teaching or research, as well as department chairpersons,
academic deans, and program coordinators. Students were identified as
undergraduate students attending one of the institutions within this region.
The primary purposes of the study were to confirm a model of the
conceptions of assessment based on the previous research and to investigate the
deeper meaning of the term assessment and the activities that are associated
with the term. The researcher used Huang’s (2012) explanation of formative
assessment versus summative assessment as the lens by which to view the
responses of students and faculty about the meaning of assessment. Huang
(2012) defined formative and summative assessments in terms of their intended
outcome. By definition, formative assessments elicit evidence that forms the
basis for improvement and help ensure that students understand the goals of
learning. Furthermore, formative assessments provide opportunities for
students to receive feedback. They are primarily used as a diagnostic tool by the
instructor to provide informal feedback during the progression of the learning
itself, and they can help the instructor evaluate whether students comprehend
the material without attached punitive repercussions. Conversely, summative
assessments are often high-stakes, standardized, and evaluative.
The cross-sectional design using survey methodology provided a one-
time snapshot of information from both faculty and students of higher education
within the southeastern United States. Quantitative and qualitative data were
collected simultaneously. Participants responded to 27 items using a six-point
Likert scale, ranging from strongly disagree to strongly agree. Data were
analyzed using a structural equation modeling framework in order to determine
structural validity of the model before analyzing where the differences, if any,
existed between faculty members’ and students’ conceptions of assessment.
After determining the best-fitting model, invariance analysis began. To this end,
the best-fitting model was tested for invariance across the two groups, faculty and
students.
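As a rough illustration of the modeling workflow described above, the sketch below assumes the Python semopy package, placeholder item and column names, and a hypothetical data file; it is not the actual CoA-III model, and a full invariance analysis would add cross-group equality constraints rather than simply fitting each group separately.

```python
import pandas as pd
import semopy  # Python SEM package (assumed available); R's lavaan is a common alternative

# Hypothetical one-factor measurement model with placeholder item names,
# written in semopy's lavaan-style syntax (not the actual CoA-III structure).
MODEL_DESC = """
Improvement =~ item1 + item2 + item3 + item4
"""

def fit_group(df: pd.DataFrame) -> pd.DataFrame:
    """Fit the CFA to one group's responses and return its fit statistics."""
    model = semopy.Model(MODEL_DESC)
    model.fit(df)
    return semopy.calc_stats(model)

survey = pd.read_csv("coa_responses.csv")      # hypothetical file of Likert responses
faculty = survey[survey["role"] == "faculty"]
students = survey[survey["role"] == "student"]

# Configural-style check: the same structure fit separately in each group.
# A formal invariance analysis would additionally constrain loadings and
# intercepts to equality across groups and compare the nested models.
for name, group in [("faculty", faculty), ("students", students)]:
    print(name)
    print(fit_group(group)[["CFI", "RMSEA"]])
```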
In order to identify trends and to further explore faculty members’ and
undergraduate students’ beliefs about the definition of assessment, an open-
ended question developed by the researcher was added to the modified
abridged version of the CoA-III developed by Fletcher, Meyer, Anderson,
Johnston, and Rees (2011): “What does the term assessment mean to you?”
Participants were also asked to select from a list of possible responses about
what types of activities come to mind when they think of the term assessment.
These additional questions were used to gain further insight into faculty
members’ and undergraduate students’ conceptions of assessment within level
V institutions of higher education in the SACS accreditation region and were
analyzed to determine whether there were any trends in the responses as well as
in the types of activities that came to mind when participants thought of the term
assessment.
Data were collected from a total of 563 participants. Separate analyses of
the quantitative and qualitative data were completed. Upon completion of these
separate analyses, it became evident that a discrepancy existed in the way
faculty conceptualized assessment when asked open-ended items versus closed-
ended Likert scale items.
The quantitative data suggested that faculty believe the primary purpose
of assessment is to improve student learning. That is, the idea of using
assessment to improve student learning – a formative, not summative, approach
– was evident. When asked to respond to "What does the term assessment mean
to you," 9% of faculty responses included the words test, testing, quiz, and/or
exam. Conversely, when asked to identify activities associated with assessment,
faculty selected standardized tests most often, followed by program
evaluation and student evaluation. Arguably, none of these are typically
associated with student learning in terms of a formative approach. It is
interesting to note that 77% of the faculty marked standardized testing as an
activity that comes to mind when they think of the term assessment. Although
faculty members did not necessarily use terms associated with testing in their
responses to the open-ended question, they did use terms associated with testing
in their responses about activities associated with assessment. This is possibly an
indication that although faculty use various ways to describe assessment(s), the
high-stakes testing culture apparent in the United States today nonetheless
influences the deep-rooted meaning of the activities associated with the term
assessment.
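As an illustration of how such open-ended definitions and activity checklists might be tallied, the following sketch uses pandas with fabricated example rows; the column names, keyword pattern, and activity labels are assumptions rather than the study's actual coding scheme.

```python
import re
import pandas as pd

# Fabricated faculty responses: an open-ended definition of "assessment"
# and the activities each respondent checked off (illustrative only).
faculty = pd.DataFrame({
    "definition": [
        "A way to measure student learning and give feedback",
        "Testing what students know at the end of a course",
        "Ongoing evaluation used to improve instruction",
    ],
    "activities": [
        ["standardized tests", "program evaluation"],
        ["standardized tests", "student evaluation"],
        ["program evaluation"],
    ],
})

TEST_WORDS = re.compile(r"\b(test|testing|quiz|exam)\b", re.IGNORECASE)

# Share of open-ended definitions that mention testing vocabulary
open_ended_rate = faculty["definition"].str.contains(TEST_WORDS).mean()

# Share of respondents who checked "standardized tests" as an activity
checklist_rate = faculty["activities"].apply(lambda a: "standardized tests" in a).mean()

print(f"Open-ended definitions mentioning testing: {open_ended_rate:.0%}")
print(f"Checklist selections of standardized tests: {checklist_rate:.0%}")
```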
Consistent with past research, this study supports the notion that faculty
often espouse certain values about assessment that are frequently contradicted
by actual practice (Fletcher et al., 2011). The findings from the open-ended question
related to the meaning of the term assessment represent this discrepancy:
faculty indicated that assessment is a form of testing and/or evaluation of either
students or programs. In contrast to the open-ended question, faculty reported
improvement purposes of assessment on the closed-ended items of the
questionnaire.
The introductory section of this paper explored the various ways in
which discrepant findings have been conceptualized and the recommendations
for how researchers should proceed when confronted with such findings.
Beginning with the initial view as reported by Campbell and Fiske (1959), the
findings of the study discussed here can be seen as providing an expansion of
the concept of assessment in education and pointing to its considerable variance
rather than to an incompatibility among the particular views or understandings
expressed by the survey participants. Indeed, the findings can be understood as
providing a holistic view of assessment as Mathison (1988) described, especially
since we know that the divergence in question actually describes the two major
categories of assessment in education – the formative and the summative. When
considering Lee and Rowlands's (2015) “higher level of synthesis” resulting from
the process of triangulation, the current findings especially point to an issue
beyond merely identifying perceptions and definitions of assessment from
different methodologies; that is, how shall stakeholders and all participants in
the formulation of educational policy reconcile the practice of educational
assessment with its theoretical framework? It is not the opinion of the
authors that one set of findings is “better” than the other, which would lead
to the dismissal of that set. Following Wagner et al.’s (2012) recommendation to
further examine the measuring instrument itself, the authors here support its
reliability and validity with respect to the Likert scale items. As there were only
two open-ended items that contributed to the qualitative findings, there is
clearly room for an expansion of this component, either by adding more such
items, by wording them differently, or both.
From among the six ways to further explore the data as recommended
by Moffatt et al. (2006), we identified the four which could be applied to a
further investigation of survey research in general: 1) treat the methods as
fundamentally different, 2) explore the methodological rigor of each component,
3) explore the dataset comparability, and 4) collect additional data and make
more comparisons. When each of these is applied to the survey study discussed
in this article, the researchers could pursue any one of the four recommended
approaches. Specifically, we could consider treating the methods as
fundamentally different while exploring the methodological rigor of each
component. For example, the quantitative analysis using structural equation
modeling produced an acceptably fitting model; therefore, it is possible to report
those results alone. It is also plausible, however, to explore the methodological
rigor of each component as a means to ensure that the depth and precision
warrant the conclusions. In the case of the study described in this paper, it could
be argued that rigor was lacking in the qualitative method and that further
investigation using an expanded, open-ended questionnaire is warranted.
Finally, another approach to settle this contradiction might be to collect
additional data instead of attempting any reconciliation of the discrepancies
(Brannen, 2005). Those additional data may be collected using the same
procedures from the original study or the researchers may add to the qualitative
rigor by interviewing a sub-set from the original sample. Any one of these ways
to reconcile discrepant findings could be used alone or in combination with one
another in order to potentially resolve the discrepancies described in this study.
Conclusion
There are many possible explanations for differences between
quantitative and qualitative findings. This paper presents various
understandings of these differences and approaches to reconciling them. With
respect to the study at hand about conceptions of assessment among higher
education faculty and students, we conclude the following.
• The difference between the findings is a discrepancy between the
two demographics of student versus faculty and not a
contradiction (Slonim-Nevo & Nevo, 2009). Therefore, the
difference can be viewed as a logical one that should lead to
further investigation.
• Should these findings be used for policy-making decisions, they
should be viewed as complementary and as portraying different
aspects of the issue of assessment. The qualitative results provide
important information about the context of the issue that might
otherwise go unnoted. Consequently, neither set of findings
should be rejected; in fact, the authors’ view is that both sets are
necessary to understand all aspects of the issue (DeLisle,
2011; Ruark & Fielding-Miller, 2016).
• The findings here are viewed from the socially relativist view that
the forces of context hold important sway over facts on the
ground, and that there is not one measurable truth about this
issue.
• Further investigation is warranted based on these findings to
determine how the qualitative results versus the quantitative
shall be weighted in relationship to one another. In fact, as
indicated by Fielding (2009), a revisiting of the research question
may be warranted.
The authors also conclude that an unforeseen area of scholarly and
intellectually stimulating investigation has been triggered by the current
findings and by the existing literature on this topic. The topic is complex, and
lends itself to a more realistic understanding of most of the issues that educators
at all levels seek to resolve in today’s climate of distilling answers into test
scores. Arguably, assessment has developed into an integral and inextricable
aspect of today’s educational systems and policies. In fact, it can be seen as a
driver of those policies and, therefore, highlighting a discrepancy between
assessment theory and practice could not be more essential as educators and
stakeholders go forward to develop improved instructional delivery systems.
As evidenced in the literature, resolving discrepant and contradictory
findings can pose significant challenges to researchers. Furthermore, it is even
possible for the issue under study to impact the best approach for resolving
discrepant findings. In fact, DeLisle (2011) suggested that the nuanced context
of educational issues is best understood when complementary, discrepant
findings are present. Thus, prior to resolving any conflicting or discrepant
findings, the researchers recommend determining both the context and purpose
of the initial investigation. Then, concluding whether the findings are
complementary and discrepant or noncomplementary and contradictory is
required. Based on those conclusions, future researchers can then take
appropriate steps in resolving the results of the research findings.
As has already been noted, the presence of discrepancies is a viable area of
investigation unto itself that should receive greater focus as the popularity of
mixed-methods research increases (Wagner et al., 2012). As illustrated by
applying these concepts and recommendations about discrepancies to the study
at hand, a deeper understanding and new questions leading to future research
can and should be the outcome.
References
Brannen, J. (2005). Mixed methods research: A discussion paper. ESRC National Centre for
Research Methods.
Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done?
Qualitative Research, 6(1), 97-113.
Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the
multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81-105.
De Lisle, J. (2011). The benefits and challenges of mixing methods and methodologies:
Lessons learnt from implementing qualitatively led mixed methods designs in
Trinidad and Tobago. Caribbean Curriculum, 18, 87-120.
Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods. New
York: McGraw-Hill.
DiLoreto, M. A. (2013). Multi-group invariance of the conceptions of assessment scale among
university faculty and students. Retrieved from ProQuest Digital Dissertations.
(AAT 3577612).
Fielding, N. G. (2009). Going out on a limb: Postmodernism and multiple method
research. Current Sociology, 57(3), 427-447.
Fletcher, R. B., Meyer, L. H., Anderson, H., Johnston, P., & Rees, M. (2011). Faculty and
students’ conceptions of assessment in higher education. Higher Education, 64,
119-133. doi: 10.1007/s10734-011-9484-1
Howe, K. R. (2012). Mixed methods, triangulation, and causal explanation. Journal of
Mixed Methods Research, 6(2), 89-96.
Huang, S. (2012). Like a bell responding to a striker: Instruction contingent on
assessment. English Teaching: Practice and Critique, 11(4), 99-119.
Hussein, A. (2009). The use of triangulation in social sciences research: Can qualitative
and quantitative methods be combined? Journal of Comparative Social Work, 4(1), 1-12.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action.
Administrative Science Quarterly, 24(4), 602-611.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research
paradigm whose time has come. Educational Researcher, 33(7), 14-26. doi:
10.3102/0013189X033007014
Lee, C., & Rowlands, I. J. (2015). When mixed methods produce mixed results:
Integrating disparate findings about miscarriage and women’s wellbeing. British
Journal of Health Psychology, 20, 36-44.
Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13-17.
Moffatt, S., White, M., Mackintosh, J., & Howel, D. (2006). Using quantitative and
qualitative data in health services research – what happens when mixed methods
findings conflict? BMC Health Services Research, 6(28). doi: 10.1186/1472-6963-6-
28
Onwuegbuzie, A. J., Johnson, R. B., & Collins, K. M. T. (2009). Call for mixed analysis: A
philosophical framework for combining qualitative and quantitative approaches.
International Journal of Multiple Research Approaches, 3, 114-139.
Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed
methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods
in social and behavioral research (pp. 351-383). Thousand Oaks, CA: Sage
Publications.
Ruark, A., & Fielding-Miller, R. (2016). Using qualitative methods to validate and
contextualize quantitative findings: A case study of research on sexual behavior
and gender-based violence among young Swazi women. Global Health: Science
and Practice, 4(3), 373-383.
Slonim-Nevo, V., & Nevo, I. (2009). Conflicting findings in mixed methods research: An
illustration from an Israeli study on immigration. Journal of Mixed Methods
Research, 3(2), 109-128.
Smith, T. M., Cannata, M., & Haynes, K. T. (2016). Reconciling data from different
sources: Practical realities of using mixed methods to identify effective high
school practices. Teachers College Record, 118(17), 1-34.
Wagner, K. D., Davidson, P. J., Pollini, R. A., Strathdee, S. A., Washburn, R., & Palinkas,
L. A. (2012). Reconciling incongruous qualitative and quantitative findings in
mixed methods research: Exemplars from research with drug using populations.
International Journal of Drug Policy, 23(1), 54-61.

More Related Content

Similar to An Investigation Of Discrepancies Between Qualitative And Quantitative Findings In Survey Research

Mixed Method Research by bangladesh-.ppt
Mixed Method Research by bangladesh-.pptMixed Method Research by bangladesh-.ppt
Mixed Method Research by bangladesh-.pptsudiptasarker6
 
Mixed Method Research of bangladesh-.ppt
Mixed Method Research of bangladesh-.pptMixed Method Research of bangladesh-.ppt
Mixed Method Research of bangladesh-.pptsudiptasarker6
 
Quantitative and qualitative_methodologies
Quantitative and qualitative_methodologiesQuantitative and qualitative_methodologies
Quantitative and qualitative_methodologiesAyesha Yaqoob
 
Applying A Mixed Methods For Choosing Text And Data...
Applying A Mixed Methods For Choosing Text And Data...Applying A Mixed Methods For Choosing Text And Data...
Applying A Mixed Methods For Choosing Text And Data...Jennifer Reither
 
Reading Material: Qualitative Interview
Reading Material: Qualitative InterviewReading Material: Qualitative Interview
Reading Material: Qualitative Interviewfirdausabdmunir85
 
Sale mixed methods
Sale mixed methodsSale mixed methods
Sale mixed methodspsdeeren
 
Research methodologies increasing understanding of the world
Research methodologies increasing understanding of the worldResearch methodologies increasing understanding of the world
Research methodologies increasing understanding of the worldDr. Mary Jane Coy, PhD
 
Comparison and complimentary between qualitative and quantitative approaches
Comparison and complimentary between qualitative and quantitative approachesComparison and complimentary between qualitative and quantitative approaches
Comparison and complimentary between qualitative and quantitative approachesDr. Rania Al- Jilani
 
Mixed methods research. newpptx
Mixed methods research. newpptxMixed methods research. newpptx
Mixed methods research. newpptxmuhammad abu bakr
 
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)leomutazu
 
Advanced Research Methodology.docx
Advanced Research Methodology.docxAdvanced Research Methodology.docx
Advanced Research Methodology.docxAyeshaAyyaz3
 
A Guide to Conducting a Meta-Analysis.pdf
A Guide to Conducting a Meta-Analysis.pdfA Guide to Conducting a Meta-Analysis.pdf
A Guide to Conducting a Meta-Analysis.pdfTina Gabel
 
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approac
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approacCHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approac
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approacEstelaJeffery653
 
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docx
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docxpeer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docx
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docxbartholomeocoombs
 
introtoresearch.pdf
introtoresearch.pdfintrotoresearch.pdf
introtoresearch.pdfobedcudjoe1
 
Mistakes in comparative research method
Mistakes in comparative research methodMistakes in comparative research method
Mistakes in comparative research methodiouellet
 
EXT 501 - Triangulation Presentation
EXT 501 - Triangulation Presentation EXT 501 - Triangulation Presentation
EXT 501 - Triangulation Presentation Joanna Wiebe
 
Quantitative Methods of Research
Quantitative Methods of ResearchQuantitative Methods of Research
Quantitative Methods of ResearchJan Ine
 
Cress well mix method design chapter 1
Cress well mix method design chapter 1Cress well mix method design chapter 1
Cress well mix method design chapter 1Rehmanzaib1
 

Similar to An Investigation Of Discrepancies Between Qualitative And Quantitative Findings In Survey Research (20)

Mixed Method Research by bangladesh-.ppt
Mixed Method Research by bangladesh-.pptMixed Method Research by bangladesh-.ppt
Mixed Method Research by bangladesh-.ppt
 
Mixed Method Research of bangladesh-.ppt
Mixed Method Research of bangladesh-.pptMixed Method Research of bangladesh-.ppt
Mixed Method Research of bangladesh-.ppt
 
Quantitative and qualitative_methodologies
Quantitative and qualitative_methodologiesQuantitative and qualitative_methodologies
Quantitative and qualitative_methodologies
 
Applying A Mixed Methods For Choosing Text And Data...
Applying A Mixed Methods For Choosing Text And Data...Applying A Mixed Methods For Choosing Text And Data...
Applying A Mixed Methods For Choosing Text And Data...
 
Reading Material: Qualitative Interview
Reading Material: Qualitative InterviewReading Material: Qualitative Interview
Reading Material: Qualitative Interview
 
Sale mixed methods
Sale mixed methodsSale mixed methods
Sale mixed methods
 
Research methodologies increasing understanding of the world
Research methodologies increasing understanding of the worldResearch methodologies increasing understanding of the world
Research methodologies increasing understanding of the world
 
Comparison and complimentary between qualitative and quantitative approaches
Comparison and complimentary between qualitative and quantitative approachesComparison and complimentary between qualitative and quantitative approaches
Comparison and complimentary between qualitative and quantitative approaches
 
Triangulation
TriangulationTriangulation
Triangulation
 
Mixed methods research. newpptx
Mixed methods research. newpptxMixed methods research. newpptx
Mixed methods research. newpptx
 
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)
MIXED METHOD APPROACH BY LEOPTRA MUTAZU GREAT ZIMBABWE UNIVERSITY(2017)
 
Advanced Research Methodology.docx
Advanced Research Methodology.docxAdvanced Research Methodology.docx
Advanced Research Methodology.docx
 
A Guide to Conducting a Meta-Analysis.pdf
A Guide to Conducting a Meta-Analysis.pdfA Guide to Conducting a Meta-Analysis.pdf
A Guide to Conducting a Meta-Analysis.pdf
 
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approac
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approacCHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approac
CHAPTER 1 THE SELECTION OF A RESEARCH APPROACHResearch approac
 
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docx
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docxpeer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docx
peer1  Qualitative, Quantitative and Mixed MethodThe qualitati.docx
 
introtoresearch.pdf
introtoresearch.pdfintrotoresearch.pdf
introtoresearch.pdf
 
Mistakes in comparative research method
Mistakes in comparative research methodMistakes in comparative research method
Mistakes in comparative research method
 
EXT 501 - Triangulation Presentation
EXT 501 - Triangulation Presentation EXT 501 - Triangulation Presentation
EXT 501 - Triangulation Presentation
 
Quantitative Methods of Research
Quantitative Methods of ResearchQuantitative Methods of Research
Quantitative Methods of Research
 
Cress well mix method design chapter 1
Cress well mix method design chapter 1Cress well mix method design chapter 1
Cress well mix method design chapter 1
 

More from Kelly Lipiec

Buy The Essay Buy Essay Online An
Buy The Essay Buy Essay Online AnBuy The Essay Buy Essay Online An
Buy The Essay Buy Essay Online AnKelly Lipiec
 
Writing A Research Paper For Scholarly Journals
Writing A Research Paper For Scholarly JournalsWriting A Research Paper For Scholarly Journals
Writing A Research Paper For Scholarly JournalsKelly Lipiec
 
Buy College Admission Essa
Buy College Admission EssaBuy College Admission Essa
Buy College Admission EssaKelly Lipiec
 
009 Essay Example Yale Supplement Pa Why Nyu New Yo
009 Essay Example Yale Supplement Pa Why Nyu New Yo009 Essay Example Yale Supplement Pa Why Nyu New Yo
009 Essay Example Yale Supplement Pa Why Nyu New YoKelly Lipiec
 
Thesis Statements And Controlling Ideas - Writing
Thesis Statements And Controlling Ideas - WritingThesis Statements And Controlling Ideas - Writing
Thesis Statements And Controlling Ideas - WritingKelly Lipiec
 
How To Teach A Child To Write
How To Teach A Child To WriteHow To Teach A Child To Write
How To Teach A Child To WriteKelly Lipiec
 
Writing An Expository Essay
Writing An Expository EssayWriting An Expository Essay
Writing An Expository EssayKelly Lipiec
 
Introduction Of Education Essay. Inclusive Education
Introduction Of Education Essay. Inclusive EducationIntroduction Of Education Essay. Inclusive Education
Introduction Of Education Essay. Inclusive EducationKelly Lipiec
 
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1Kelly Lipiec
 
Pin On Daily Inspiration William Hannah
Pin On Daily Inspiration William HannahPin On Daily Inspiration William Hannah
Pin On Daily Inspiration William HannahKelly Lipiec
 
Comparison Essay Sample. Compare And Contrast Es
Comparison Essay Sample. Compare And Contrast EsComparison Essay Sample. Compare And Contrast Es
Comparison Essay Sample. Compare And Contrast EsKelly Lipiec
 
Fundations Writing Paper With Picture S
Fundations Writing Paper With Picture SFundations Writing Paper With Picture S
Fundations Writing Paper With Picture SKelly Lipiec
 
003 Essay Example Why I Deserve This Scholarship
003 Essay Example Why I Deserve This Scholarship003 Essay Example Why I Deserve This Scholarship
003 Essay Example Why I Deserve This ScholarshipKelly Lipiec
 
Examples Of How To Conclude An Essay Evadon.Co.Uk
Examples Of How To Conclude An Essay Evadon.Co.UkExamples Of How To Conclude An Essay Evadon.Co.Uk
Examples Of How To Conclude An Essay Evadon.Co.UkKelly Lipiec
 
A Strong Outline For An Argumentative Essay Should Inclu
A Strong Outline For An Argumentative Essay Should IncluA Strong Outline For An Argumentative Essay Should Inclu
A Strong Outline For An Argumentative Essay Should IncluKelly Lipiec
 
Printable Writing Pages You Can Choose Traditional O
Printable Writing Pages You Can Choose Traditional OPrintable Writing Pages You Can Choose Traditional O
Printable Writing Pages You Can Choose Traditional OKelly Lipiec
 
The 8 Essentials Of Writing An Essay In Under 24 Hours
The 8 Essentials Of Writing An Essay In Under 24 HoursThe 8 Essentials Of Writing An Essay In Under 24 Hours
The 8 Essentials Of Writing An Essay In Under 24 HoursKelly Lipiec
 
Example Of A Qualitative Research Article Critique
Example Of A Qualitative Research Article CritiqueExample Of A Qualitative Research Article Critique
Example Of A Qualitative Research Article CritiqueKelly Lipiec
 
How To Write A Process Paper For Science Fair
How To Write A Process Paper For Science FairHow To Write A Process Paper For Science Fair
How To Write A Process Paper For Science FairKelly Lipiec
 
Rite In The Rain All Weather Writing Paper - Expedition Journal ...
Rite In The Rain All Weather Writing Paper - Expedition Journal ...Rite In The Rain All Weather Writing Paper - Expedition Journal ...
Rite In The Rain All Weather Writing Paper - Expedition Journal ...Kelly Lipiec
 

More from Kelly Lipiec (20)

Buy The Essay Buy Essay Online An
Buy The Essay Buy Essay Online AnBuy The Essay Buy Essay Online An
Buy The Essay Buy Essay Online An
 
Writing A Research Paper For Scholarly Journals
Writing A Research Paper For Scholarly JournalsWriting A Research Paper For Scholarly Journals
Writing A Research Paper For Scholarly Journals
 
Buy College Admission Essa
Buy College Admission EssaBuy College Admission Essa
Buy College Admission Essa
 
009 Essay Example Yale Supplement Pa Why Nyu New Yo
009 Essay Example Yale Supplement Pa Why Nyu New Yo009 Essay Example Yale Supplement Pa Why Nyu New Yo
009 Essay Example Yale Supplement Pa Why Nyu New Yo
 
Thesis Statements And Controlling Ideas - Writing
Thesis Statements And Controlling Ideas - WritingThesis Statements And Controlling Ideas - Writing
Thesis Statements And Controlling Ideas - Writing
 
How To Teach A Child To Write
How To Teach A Child To WriteHow To Teach A Child To Write
How To Teach A Child To Write
 
Writing An Expository Essay
Writing An Expository EssayWriting An Expository Essay
Writing An Expository Essay
 
Introduction Of Education Essay. Inclusive Education
Introduction Of Education Essay. Inclusive EducationIntroduction Of Education Essay. Inclusive Education
Introduction Of Education Essay. Inclusive Education
 
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1
Clouds Alphabet Stock Vector. Illustration Of Font, Letter - 1
 
Pin On Daily Inspiration William Hannah
Pin On Daily Inspiration William HannahPin On Daily Inspiration William Hannah
Pin On Daily Inspiration William Hannah
 
Comparison Essay Sample. Compare And Contrast Es
Comparison Essay Sample. Compare And Contrast EsComparison Essay Sample. Compare And Contrast Es
Comparison Essay Sample. Compare And Contrast Es
 
Fundations Writing Paper With Picture S
Fundations Writing Paper With Picture SFundations Writing Paper With Picture S
Fundations Writing Paper With Picture S
 
003 Essay Example Why I Deserve This Scholarship
003 Essay Example Why I Deserve This Scholarship003 Essay Example Why I Deserve This Scholarship
003 Essay Example Why I Deserve This Scholarship
 
Examples Of How To Conclude An Essay Evadon.Co.Uk
Examples Of How To Conclude An Essay Evadon.Co.UkExamples Of How To Conclude An Essay Evadon.Co.Uk
Examples Of How To Conclude An Essay Evadon.Co.Uk
 
A Strong Outline For An Argumentative Essay Should Inclu
A Strong Outline For An Argumentative Essay Should IncluA Strong Outline For An Argumentative Essay Should Inclu
A Strong Outline For An Argumentative Essay Should Inclu
 
Printable Writing Pages You Can Choose Traditional O
Printable Writing Pages You Can Choose Traditional OPrintable Writing Pages You Can Choose Traditional O
Printable Writing Pages You Can Choose Traditional O
 
The 8 Essentials Of Writing An Essay In Under 24 Hours
The 8 Essentials Of Writing An Essay In Under 24 HoursThe 8 Essentials Of Writing An Essay In Under 24 Hours
The 8 Essentials Of Writing An Essay In Under 24 Hours
 
Example Of A Qualitative Research Article Critique
Example Of A Qualitative Research Article CritiqueExample Of A Qualitative Research Article Critique
Example Of A Qualitative Research Article Critique
 
How To Write A Process Paper For Science Fair
How To Write A Process Paper For Science FairHow To Write A Process Paper For Science Fair
How To Write A Process Paper For Science Fair
 
Rite In The Rain All Weather Writing Paper - Expedition Journal ...
Rite In The Rain All Weather Writing Paper - Expedition Journal ...Rite In The Rain All Weather Writing Paper - Expedition Journal ...
Rite In The Rain All Weather Writing Paper - Expedition Journal ...
 

Recently uploaded

POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
EPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxEPANDING THE CONTENT OF AN OUTLINE using notes.pptx
EPANDING THE CONTENT OF AN OUTLINE using notes.pptxRaymartEstabillo3
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
à€­à€Ÿà€°à€€-à€°à„‹à€ź à€”à„à€Żà€Ÿà€Șà€Ÿà€°.pptx, Indo-Roman Trade,
à€­à€Ÿà€°à€€-à€°à„‹à€ź à€”à„à€Żà€Ÿà€Șà€Ÿà€°.pptx, Indo-Roman Trade,à€­à€Ÿà€°à€€-à€°à„‹à€ź à€”à„à€Żà€Ÿà€Șà€Ÿà€°.pptx, Indo-Roman Trade,
à€­à€Ÿà€°à€€-à€°à„‹à€ź à€”à„à€Żà€Ÿà€Șà€Ÿà€°.pptx, Indo-Roman Trade,Virag Sontakke
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfadityarao40181
 
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptx
ENGLISH5 QUARTER4 MODULE1 WEEK1-3 How Visual and Multimedia Elements.pptxAnaBeatriceAblay2
 
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAĐĄY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxAvyJaneVismanos
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfakmcokerachita
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 

Recently uploaded (20)

9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Crayon Activity Handout For the Crayon A
questions, respectively.

From a different perspective, triangulation can combine methodologies in one study of the same phenomenon, or it can be within-method, using multiple techniques within a single method to collect and interpret data (Denzin, 1978). In the case of survey research, multiple scales or various indices are often used to cross-check for internal consistency (Jick, 1979); a brief illustrative sketch of such a cross-check appears at the end of this discussion. Denzin (1978), however, cautioned that triangulation should not be confused with mixed methods; the two are distinct ways of conceptualizing interpretations and findings. A final perspective assumes that data analyses can come from a variety of designs and are not dependent upon the design that is employed (Onwuegbuzie & Teddlie, 2003). By using triangulation, conclusions about the research questions can be presented as the result of a higher level of synthesis (Lee & Rowlands, 2015). Triangulation, therefore, supports the complementary theorist, who carefully considers the outcome in logical sequence and evaluates whether differing conclusions can co-occur. The model of triangulation is not proposed in this context to cross-validate data, but rather to capture an assortment of aspects relative to a similar phenomenon. Thus, triangulation is offered as a potential ontological explanation for discrepant findings in the present mixed-methods research design.

With respect to the typology of mixed methods designs, a meta-analysis (Bryman, 2006) showed that two of the possible methods (one quantitative and one qualitative) dominated the studies examined: self-administered questionnaires and semi-structured interviews. A coded survey instrument was used in 82.4% of the studies included. Bryman focused on the issue of intention, or rationale, for each of the components that comprised the studies; that is, do the quantitative and qualitative questions target different or like concepts, opinions, or perceptions? Another motivation for using a mixed-methods design can be triangulation, but this can be a strategy adopted after the fact when findings from the different methods are incongruent.
Bryman posited a possible “lack of certainty” among researchers employing mixed methods and suggested that increased certainty about motivation might reduce redundancy among the data obtained.
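To make concrete the kind of within-method cross-check for internal consistency that Jick (1979) describes, the following is a minimal sketch, not drawn from any study cited here, of how a researcher might compute Cronbach's alpha for one multi-item survey scale; the item names and responses are hypothetical.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # items: one column per scale item, one row per respondent
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Hypothetical responses to a four-item, six-point Likert scale
    responses = pd.DataFrame({
        "item1": [5, 4, 6, 3, 5, 4],
        "item2": [5, 4, 5, 3, 6, 4],
        "item3": [4, 5, 6, 2, 5, 3],
        "item4": [5, 3, 6, 3, 5, 4],
    })
    print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")

In a within-method triangulation, such an index might be computed for each sub-scale and compared across scales before divergent findings are interpreted.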
We turn next to an examination of multiple views about how to proceed when discrepant results are found in mixed-methods studies. In cases of contradiction, the researchers may simply juxtapose the findings, reporting them as a recommendation for future research rather than seeking to explain or to reconcile them (Brannen, 2005). Another possibility is that one set of findings may be described as being “better” than the other. Wagner et al. (2012) reported that addressing the differences between findings is an area of investigation unto itself and that there are two theoretical foundations for integrating findings: positivist views and socially relative views. Discrepant findings should be interpreted as being in a reciprocal relationship rather than in an oppositional one (Smith, Cannata, & Haynes, 2016). Wagner et al. further posited that the tensions and dynamics around the various positions should be viewed as both intellectually and scholastically healthy. Discrepant findings can point researchers to potential flaws in the construction of measuring instruments, such as unintended ambiguity or a deficit in the depth of participants’ responses. Fielding (2009) held that an overarching benefit of examining differing results is the prevention of “analytic tunnel vision” by way of achieving “analytic density.”

As an alternative to explanation or reconciliation, researchers should review the methodological issues and possibly reject one set of findings outright (Slonim-Nevo & Nevo, 2009). Such methodological issues include problems with the sample, with the measuring instrument itself, and/or with the procedures followed in either the quantitative or the qualitative portion of the study. Such an analysis could lead to the conclusion that one data set is superior to the other, in which case the weaker set would be rejected. When no such methodological issues are present, researchers should then determine whether the contradictions between findings are logical, with logical differences potentially leading to further investigation. Finally, it is important to determine whether the findings are discrepant (in conflict) or truly contradictory. If they are discrepant, the discrepancies can often lead the researchers to a conclusion that will likely be viewed as logical. If the results are contradictory, however, and there is no logical way to resolve the differences, then the effort should be abandoned, a recommendation that aligns with the earlier one suggesting a simple reporting of differences to be left for further investigation.

Another perspective is to approach differences as being either complementary or noncomplementary. In the complementary approach, findings are not contradictory and thus can be logically reconciled. The noncomplementary approach necessitates abandoning one set of findings in favor of the other, as the two cannot be logically understood to coexist; conflicting findings are taken to force a choice about which set to keep and report and which to eliminate. The complementary theorist, in contrast, does not accept that discrepant findings cannot coexist, nor does the complementary theorist simply accept results that seem not to make sense; rather, the attempt is made to resolve the conflict through logical connections. In reality and in nature, after all, conflicts are always present. Consider the following example.
An individual might believe that the universe is the result of a massive explosion roughly 13.8 billion years ago. Another may disagree, arguing that God created the universe.
The noncomplementary theorist would not accept this disagreement; both beliefs cannot be true, and one must be wrong. Alternatively, the complementary theorist attempts to make logical sense of both positions in their full complexity, acknowledging the multifaceted nature of human understanding. Contradictions, on the other hand, are logically impossible; there do exist absolutes that cannot be debated. For example, it cannot be both raining and not raining in the same place at the same time. Therefore, it is imperative for the researcher to search for potential and logical explanations for discrepant results and to accept the situation if there are none. Conversely, complementary findings necessitate consistency across methodologies (i.e., qualitative and quantitative); if both are in accordance with one another, the results are considered to be complementary. De Lisle (2011) went so far as to assert that discrepant findings are necessary in order to portray all aspects of a particular issue; this is especially true when the findings are used for policy-making decisions. Issues and questions about education and educational policy can often best be addressed by eliciting more nuanced responses than can be obtained through strictly quantitative methodologies. Complementary, discrepant findings are necessary to illustrate the contextual aspects of an issue that are not apparent in quantitative data alone (De Lisle, 2011; Slonim-Nevo & Nevo, 2009). Positivist views support one sole, measurable “truth,” whereas socially relative views support the notion that context influences facts, expanding the possibilities for disparate findings and justifying their importance.

Moffatt, White, Mackintosh, and Howel (2006) proposed six ways to further explore differences in the data. The first is to treat the two methodologies as fundamentally different; this leads to treating the two resulting datasets as findings that complement each other rather than being integrated into each other. The second is to explore the methodological rigor of each component; the recommendation is to use the findings from one methodology as a benchmark against which to examine the rigor of the other more closely. The third is to explore dataset comparability; that is, in studies where a different or modified sample population is used, the likeness of the participants across the methodological groups should be examined. The fourth is to collect additional data and make more comparisons; for instance, in studies where a smaller subset of participants is interviewed in addition to the larger sample that completed the survey, the recommendation is to interview additional participants. The fifth is to explore whether the intervention under study worked as expected; it may be that initial assumptions made about participants were not entirely accurate, with such inaccuracies potentially contributing to skewed or invalid findings. The sixth is to explore whether the outcomes of the quantitative and qualitative components match insofar as they address the same constructs or domains; it can sometimes be the case that quantitative findings do not result from sufficiently explicit or individualized questioning by way of a survey, whereas qualitative, open-ended questions provide the “room” needed by participants to sufficiently express or explain their responses.
The first four of these can be applied to survey research and, in particular, to the study discussed later in this article, which investigates the issue of assessment in education. The fifth proposed way does not apply here, as there is no intervention involved in the study, and the sixth does not apply because
the construct investigated is the same in both the qualitative and quantitative components.

Education seems an especially appropriate discipline for mixed methods research, as educators navigate their way through difficult policy issues using data-driven decisions to inform their practices. Postpositivist researchers believe that an independent reality exists and can be studied. At the same time, postpositivists reject the objectivity of theoretical notions, since humans are inherently biased and largely influenced by cultural nuances. Thus, theoretical constructs can be measured to some degree but never fully grasped (Onwuegbuzie, Johnson, & Collins, 2009); such measurement is possible only if the researcher takes a neutral and objective stance toward the research and remains as detached from the results as possible. In a constructivist paradigm, by contrast, researchers believe that contradictory results in a mixed methods design are equally valid measures of the same phenomenon (Onwuegbuzie et al., 2009). From the perspective of postpositivist critical realism, answers to the many and varied questions and challenges that educators and policymakers face today will likely be required to serve and satisfy the different points of view of all stakeholders. As educators and researchers proceed to resolve challenging issues such as assessment at numerous levels, it is clear that research findings should inform the direction toward solutions and strategies that are the most representative and valid. Toward that end, mixed methods research is an ideal methodology, as it bridges the longstanding, and at times hostile, divide between the two polarities of qualitative and quantitative research methods (Johnson & Onwuegbuzie, 2004).

Results of Recent Study
A recent survey investigating conceptions of assessment among faculty and students in higher education contained both quantitative and qualitative items (DiLoreto, 2013). According to Bryman’s (2006) meta-analysis on typologies in mixed methods research, this study’s quantitative and qualitative components addressed the same concepts with respect to participants’ conceptions, which can be seen both as the rationale for their inclusion and as evidence that there was no “lack of certainty” about that inclusion. An analysis of the results showed that student responses were consistent between the quantitative and qualitative data; however, there were considerable discrepancies in the nature of the responses from faculty on the quantitative items when compared with the qualitative items. This discrepancy captured the interest of the authors and led to the investigation of this phenomenon.

All undergraduate students and all full- and part-time faculty members holding at least a bachelor’s degree who teach at Level V institutions of higher education located within the accreditation region of the Southern Association of Colleges and Schools (SACS) were asked to participate in this study. Level V doctoral-degree-granting institutions are defined by SACS as institutions that offer three or fewer doctoral degrees as their highest degrees. For the purposes of this study, faculty members were identified as university employees whose primary duty is classroom teaching or research, as well as department chairpersons,
academic deans, and program coordinators. Students were identified as undergraduate students attending one of the institutions within this region.

The primary purposes of the study were to confirm a model of the conceptions of assessment based on previous research and to investigate the deeper meaning of the term assessment and the activities associated with the term. The researcher used Huang’s (2012) explanation of formative versus summative assessment as the lens through which to view the responses of students and faculty about the meaning of assessment. Huang (2012) defined formative and summative assessments in terms of their intended outcome. By definition, formative assessments elicit evidence that forms the basis of improvement, and these assessments ensure that students understand the goals of learning. Furthermore, formative assessments provide opportunities for students to receive feedback; they are primarily used as a diagnostic tool by the instructor to provide informal feedback during the progression of the learning itself. Formative assessments can be used to help the instructor evaluate whether students comprehend the material, without attached punitive repercussions. Conversely, summative assessments are often high-stakes, standardized, and evaluative.

The cross-sectional design using survey methodology provided a one-time snapshot of information from both faculty and students of higher education within the southeastern United States. Quantitative and qualitative data were collected simultaneously. Participants responded to 27 items using a six-point Likert scale ranging from strongly disagree to strongly agree. Data were analyzed using a structural equation modeling framework in order to determine the structural validity of the model before analyzing where differences, if any, existed between faculty members’ and students’ conceptions of assessment. After the best-fitting model was determined, invariance analysis began; to this end, the best-fitting model was tested for invariance across the two groups, faculty and students.

In order to identify trends and to further explore faculty members’ and undergraduate students’ beliefs about the definition of assessment, an open-ended question developed by the researcher was added to the modified, abridged version of the CoA-III developed by Fletcher, Meyer, Anderson, Johnston, and Rees (2011): “What does the term assessment mean to you?” Participants were also asked to select, from a list of possible responses, the types of activities that come to mind when they think of the term assessment. These additional questions were used to gain further insight into faculty members’ and undergraduate students’ conceptions of assessment within Level V institutions of higher education in the SACS accreditation region and were analyzed to determine whether there were any trends in the responses as well as in the types of activities that came to mind when participants thought of the term assessment. Data were collected from a total of 563 participants. Separate analyses of the quantitative and qualitative data were completed. Upon completion of these separate analyses, it became evident that a discrepancy existed in the way faculty conceptualized assessment when asked open-ended items versus closed-ended Likert scale items.
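To give a sense of the shape of the quantitative component, the following is a minimal Python sketch with entirely hypothetical data: it scores a block of six-point Likert items into a simple scale score and contrasts faculty and student groups. This is an illustrative stand-in only; the study itself used structural equation modeling and formal multi-group invariance testing rather than the scale-score comparison shown here.

    import numpy as np
    import pandas as pd
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical data: 27 six-point Likert items plus a group label
    n_faculty, n_students, n_items = 60, 200, 27
    data = pd.DataFrame(
        rng.integers(1, 7, size=(n_faculty + n_students, n_items)),
        columns=[f"item{i + 1}" for i in range(n_items)],
    )
    data["group"] = ["faculty"] * n_faculty + ["student"] * n_students

    # Score each respondent as the mean of their item responses
    item_cols = [c for c in data.columns if c.startswith("item")]
    data["scale_score"] = data[item_cols].mean(axis=1)

    # Contrast the two groups (a crude stand-in for multi-group invariance testing)
    faculty_scores = data.loc[data["group"] == "faculty", "scale_score"]
    student_scores = data.loc[data["group"] == "student", "scale_score"]
    result = stats.ttest_ind(faculty_scores, student_scores, equal_var=False)
    print(f"faculty mean = {faculty_scores.mean():.2f}, "
          f"student mean = {student_scores.mean():.2f}, "
          f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

A genuine invariance analysis would compare the fit of the measurement model with and without cross-group equality constraints; the point here is only to show how the scored Likert data and the faculty/student contrast are organized.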
The quantitative data suggested that faculty believe the primary purpose of assessment is to improve student learning; the notion of using assessment to improve student learning, a formative rather than summative approach, was often evident. When asked to respond to the question “What does the term assessment mean to you?”, 9% of faculty responses included the words test, testing, quiz, and/or exam (a simplified sketch of this kind of keyword coding appears at the end of this discussion). Conversely, when asked to identify activities associated with assessment, faculty selected standardized tests most often, followed by program evaluation and student evaluation; arguably, none of these are typically associated with student learning in terms of a formative approach. It is interesting to note that 77% of the faculty marked standardized testing as an activity that comes to mind when they think of the term assessment. Although faculty members did not necessarily use terms associated with testing in their responses to the open-ended question, they did use terms associated with testing in their responses about activities associated with assessment. This is possibly an indication that, although faculty describe assessment in various ways, the high-stakes testing culture apparent in the United States today nonetheless influences the deep-rooted meaning of the activities associated with the term assessment.

As evidenced in past research, this study supports the notion that faculty often espouse certain values about assessment that are frequently contradicted by actual practice (Fletcher et al., 2011). The findings from the open-ended question about the meaning of the term assessment represent this discrepancy: faculty indicated that assessment is a form of testing and/or evaluation of either students or programs. Unlike their answers to the open-ended question, faculty reported improvement purposes of assessment on the closed-ended items of the questionnaire.

The introductory section of this paper explored the various ways in which discrepant findings have been conceptualized and the recommendations for how researchers should proceed when confronted with such findings. Beginning with the initial view as reported by Campbell and Fiske (1959), the findings of the study discussed here can be seen as providing an expansion of the concept of assessment in education and as pointing to its considerable variance rather than to an incompatibility among the particular views or understandings expressed by the survey participants. Indeed, the findings can be understood as providing a holistic view of assessment as Mathison (1988) described, especially since we know that the divergence in question actually describes the two major categories of assessment in education, the formative and the summative. When considering Lee and Rowlands’ (2015) “higher level of synthesis” resulting from the process of triangulation, the current findings point to an issue beyond merely identifying perceptions and definitions of assessment from different methodologies; that is, how shall stakeholders and all participants in formulating educational policy reconcile the practice of educational assessment with its theoretical framework? It is not the opinion of the authors that one set of findings is “better” than the other, a view that would lead to the dismissal of the weaker set.
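The percentage reported above for testing-related vocabulary suggests a straightforward coding step for the open-ended responses. The following is a minimal, hypothetical sketch of such keyword coding; the example responses and the term list are invented for illustration and are not the study's actual data or coding scheme.

    import re

    # Hypothetical open-ended answers to "What does the term assessment mean to you?"
    responses = [
        "A way to measure student learning and give feedback",
        "Testing what students know at the end of a unit",
        "An exam or quiz that produces a grade",
        "Gathering evidence to improve my teaching",
    ]

    # Regular expression matching testing-related vocabulary
    TEST_TERMS = re.compile(r"\b(test|testing|quiz|quizzes|exam|exams)\b", re.IGNORECASE)

    flagged = [r for r in responses if TEST_TERMS.search(r)]
    share = len(flagged) / len(responses)
    print(f"{share:.0%} of responses mention testing-related terms")
    for r in flagged:
        print(" -", r)

In practice, such automated screening would only supplement, not replace, qualitative coding of the full responses.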
Following Wagner et al.’s (2012) recommendation to examine the measuring instrument itself more closely, the authors stand behind its reliability and validity with respect to the Likert scale items. As there were only two open-ended items that contributed to the qualitative findings, there is
clearly room for an expansion of this component, either by including more such items, by wording them differently, or both.

From among the six ways to further explore the data recommended by Moffatt et al. (2006), we identified the four that could be applied to a further investigation of survey research in general: 1) treat the methods as fundamentally different, 2) explore the methodological rigor of each component, 3) explore dataset comparability, and 4) collect additional data and make more comparisons. Applied to the survey study discussed in this article, any one of the four recommended approaches could be pursued. Specifically, we could consider treating the methods as fundamentally different while exploring the methodological rigor of each component. For example, the quantitative analysis using structural equation modeling produced an acceptably fitting model; therefore, it is possible to report those results alone. It is also plausible, however, to explore the methodological rigor of each component as a means of ensuring that the depth and precision warrant the conclusions. In the case of the study described in this paper, it could be argued that rigor was lacking in the qualitative method and that further investigation using an expanded, open-ended questionnaire is warranted. Finally, another approach to settling this discrepancy might be to collect additional data instead of attempting any reconciliation of the discrepancies (Brannen, 2005). Those additional data may be collected using the same procedures as the original study, or the researchers may add to the qualitative rigor by interviewing a subset of the original sample. Any one of these ways to reconcile discrepant findings could be used alone or in combination with the others to potentially resolve the discrepancies described in this study.

Conclusion
There are many possible explanations for differences between quantitative and qualitative findings. This paper presents various understandings of these differences and approaches to reconciling them. With respect to the study at hand about conceptions of assessment among higher education faculty and students, we conclude the following.
• The difference between the findings is a discrepancy between the two demographics of students versus faculty and not a contradiction (Slonim-Nevo & Nevo, 2009). Therefore, the difference can be viewed as a logical one that should lead to further investigation.
• Should these findings be used for policy-making decisions, they should be viewed as complementary and as portraying different aspects of the issue of assessment. The qualitative results provide important information about the context of the issue that might otherwise go unnoted. Consequently, neither set of findings should be rejected; in fact, the authors’ view is that both are necessary to understand all aspects of the issue (De Lisle, 2011; Ruark & Fielding-Miller, 2016).
• The findings here are viewed from the socially relativist view that the forces of context hold important sway over facts on the
ground, and that there is not one measurable truth about this issue.
• Further investigation is warranted based on these findings to determine how the qualitative results and the quantitative results should be weighted in relation to one another. In fact, as indicated by Fielding (2009), a revisiting of the research question may be warranted.

The authors also conclude that an unforeseen area of scholarly and intellectually stimulating investigation has been triggered by the current findings and by the existing literature on this topic. The topic is complex and lends itself to a more realistic understanding of most of the issues that educators at all levels seek to resolve in today’s climate of distilling answers into test scores. Arguably, assessment has developed into an integral and inextricable aspect of today’s educational systems and policies. In fact, it can be seen as a driver of those policies; therefore, highlighting a discrepancy between assessment theory and practice could not be more essential as educators and stakeholders go forward to develop improved instructional delivery systems.

As evidenced in the literature, resolving discrepant and contradictory findings can pose significant challenges to researchers. Furthermore, it is even possible for the issue under study to affect the best approach for resolving discrepant findings. In fact, De Lisle (2011) suggested that the nuanced context of educational issues is best understood when complementary, discrepant findings are present. Thus, prior to resolving any conflicting or discrepant findings, the researchers recommend determining both the context and the purpose of the initial investigation. Then, it is necessary to conclude whether the findings are complementary and discrepant or noncomplementary and contradictory. Based on those conclusions, future researchers can take appropriate steps to resolve the research findings. As has already been noted, the fact of discrepancies is a viable area of investigation unto itself that should become a greater focus as the popularity of mixed-methods research increases (Wagner et al., 2012). As has been illustrated by applying concepts and recommendations with respect to discrepancies to the study at hand, a depth of understanding and a raising of questions leading to future research can and should be the outcome.

References
Brannen, J. (2005). Mixed methods research: A discussion paper. ESRC Centre for Research Methods.
Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done? Qualitative Research, 6(1), 97-113.
Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56(2), 81-105.
De Lisle, J. (2011). The benefits and challenges of mixing methods and methodologies: Lessons learnt from implementing qualitatively led mixed methods designs in Trinidad and Tobago. Caribbean Curriculum, 18, 87-120.
Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods. New York: McGraw-Hill.
DiLoreto, M. A. (2013). Multi-group invariance of the conceptions of assessment scale among university faculty and students. Retrieved from ProQuest Digital Dissertations. (AAT 3577612)
Fielding, N. G. (2009). Going out on a limb: Postmodernism and multiple method research. Current Sociology, 57(3), 427-447.
Fletcher, R. B., Meyer, L. H., Anderson, H., Johnston, P., & Rees, M. (2011). Faculty and students’ conceptions of assessment in higher education. Higher Education, 64, 119-133. doi: 10.1007/s10734-011-9484-1
Howe, K. R. (2012). Mixed methods, triangulation, and causal explanation. Journal of Mixed Methods Research, 6(2), 89-96.
Huang, S. (2012). Like a bell responding to a striker: Instruction contingent on assessment. English Teaching: Practice and Critique, 11(4), 99-119.
Hussein, A. (2009). The use of triangulation in social sciences research: Can qualitative and quantitative methods be combined? Journal of Comparative Social Work, 4(1), 1-12.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in action. Administrative Science Quarterly, 24(4), 602-611.
Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26. doi: 10.3102/0013189X033007014
Lee, C., & Rowlands, I. J. (2015). When mixed methods produce mixed results: Integrating disparate findings about miscarriage and women’s wellbeing. British Journal of Health Psychology, 20, 36-44.
Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13-17.
Moffatt, S., White, M., Mackintosh, J., & Howel, D. (2006). Using quantitative and qualitative data in health services research – what happens when mixed method findings conflict? BMC Health Services Research, 6(28). doi: 10.1186/1472-6963-6-28
Onwuegbuzie, A. J., Johnson, R. B., & Collins, K. M. T. (2009). Call for mixed analysis: A philosophical framework for combining qualitative and quantitative approaches. International Journal of Multiple Research Approaches, 3, 114-139.
Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351-383). Thousand Oaks, CA: Sage Publications.
Ruark, A., & Fielding-Miller, R. (2016). Using qualitative methods to validate and contextualize quantitative findings: A case study of research on sexual behavior and gender-based violence among young Swazi women. Global Health: Science and Practice, 4(3), 373-383.
Slonim-Nevo, V., & Nevo, I. (2009). Conflicting findings in mixed methods research: An illustration from an Israeli study on immigration. Journal of Mixed Methods Research, 3(2), 109-128.
Smith, T. M., Cannata, M., & Haynes, K. T. (2016). Reconciling data from different sources: Practical realities of using mixed methods to identify effective high school practices. Teachers College Record, 118(17), 1-34.
Wagner, K. D., Davidson, P. J., Pollini, R. A., Strathdee, S. A., Washburn, R., & Palinkas, L. A. (2012). Reconciling incongruous qualitative and quantitative findings in mixed methods research: Exemplars from research with drug using populations. International Journal of Drug Policy, 23(1), 54-61.