This document discusses triangulation in qualitative research. It defines triangulation as collecting data from different sources to provide a more comprehensive understanding. There are various forms of triangulation: data triangulation, which collects data across different times, locations, or people; methodological triangulation, which uses multiple research methods; and theory triangulation, which uses multiple perspectives or theories. The rationale for triangulation is that it reduces bias and enriches findings by providing different viewpoints on the topic.
A comprehensive presentation based on the qualitative research methodology 'Grounded Theory', presented at Government College University Lahore, Pakistan.
By the end of this presentation you should be able to:
Describe what qualitative research is
Demonstrate the differences between Qualitative & Quantitative research
Understand the basic concepts of Qualitative studies:
Characteristics of qualitative research
Bias
Triangulation
Trustworthiness
2. Presentation outline
Introduction: Exploring qualitative research
Background for triangulation
Triangulation defined
Rationale for triangulation
Exploring forms of triangulation and discussion
Conclusion
3. Introduction: Exploring qualitative research
According to Moon et al. (2016), qualitative research is defined by the philosophical nature of the inquiry, that is, the ontologies, epistemologies, and methodologies that researchers adopt during the design of their research projects and the associated assumptions they make when collecting, analysing and interpreting their data (citing Khagram et al. 2010).
The growth of qualitative social science research (herein referred to as qualitative research) can be attributed to an increasing recognition of its value in seeking to define and understand complexity rather than to reduce it (Creswell 2009), expanding the range of research questions that can be asked (Prokopy 2011), and providing an in-depth understanding of phenomena.
Qualitative research engages the target audience in an open-ended, exploratory discussion using tools like focus groups or in-depth interviews. It explores the "what, why and how" questions and provides directional data about the target audience. It is commonly used to explore the perceptions and values that influence behaviour, identify unmet needs, understand how people perceive a marketing message, or inform a subsequent phase of quantitative research.
Researchers must provide sufficient information on their research design to enable end-users to determine its quality, namely its dependability, credibility, confirmability and transferability.
4. Background for triangulation
Triangulation has been used in quantitative surveying from at least the 1600s to describe a method that calculates a distance that is difficult (or impossible) to measure from two or more easier-to-measure distances (Lawlor 2017).
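To make the surveying sense of triangulation concrete, here is a minimal sketch (not from the presentation; the function name and the numbers are invented for illustration) of how a hard-to-reach distance falls out of one measured baseline and two measured angles, via the law of sines:

```python
import math

def distance_to_target(baseline, angle_a_deg, angle_b_deg):
    """Distance from point A to an unreachable target C, given the
    measured baseline A-B and the angles sighted at A and B toward C.
    Law of sines: AC = baseline * sin(B) / sin(C)."""
    a = math.radians(angle_a_deg)
    b = math.radians(angle_b_deg)
    c = math.pi - a - b            # the three angles of a triangle sum to pi
    return baseline * math.sin(b) / math.sin(c)

# A 100 m baseline with 60-degree sightings at each end puts the target
# at the apex of an equilateral triangle, so the distance is also 100 m.
print(round(distance_to_target(100.0, 60.0, 60.0), 6))  # 100.0
```

The surveyor never measures the distance to the target directly; two easy measurements from different standpoints pin it down, which is exactly the metaphor the social science usage borrows.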
In the social sciences, by contrast, triangulation is a relatively new concept, dating back to a paper published by Campbell and Fiske in 1959. In that paper they discussed establishing the validity of measures through the application of a multitrait-multimethod matrix, a procedure that examines both convergent and discriminant validation of measures of traits (Mathison 1988). While the procedure was presented in a mathematically elegant fashion, the basic idea was that, in developing measures of psychological traits, several methods should be employed to measure each trait.
The metaphor is a good one: a phenomenon under study in a qualitative research project is much like a ship at sea whose exact position is unknown, and just as a navigator fixes a ship's position from bearings on multiple landmarks, the researcher fixes an unclear phenomenon by viewing it from multiple standpoints.
Webb et al. (1966) coined the term "triangulation" in their published paper in the social sciences.
5. Triangulation defined
According to Creswell (2013, p. 252), triangulation is collecting data over different times or from different sources. The process involves corroborating evidence from different sources to shed light on a theme or perspective.
Triangulation is the practice of obtaining more reliable answers to research questions by integrating results from several different approaches, where each approach has different key sources of potential bias that are unrelated to each other (Lawlor 2017).
Cohen and Manion (2000) define triangulation as an attempt to map out, or explain more fully, the richness and complexity of human behaviour by studying it from more than one standpoint.
O'Donoghue and Punch (2003) describe triangulation as a method of cross-checking data from multiple sources to search for regularities in the research data.
6. Triangulation defined
(Cont.)
Triangulation involves using multiple data sources in an investigation to
produce understanding.
Some see triangulation as a method for corroborating findings and as a test
for validity of research through the use of a variety of methods to collect data
on the same topic.
Rather than seeing triangulation as a method for validation or verification,
qualitative researchers generally use this technique to ensure that an account
is rich, robust, comprehensive and well-developed.
Triangulation is an approach to research that uses a combination of more
than one research strategy in a single investigation. Denzin (1978), however,
indicated that triangulation should not be confused with mixed methods; but
instead, these are two distinct ways to conceptualize interpretations and
findings.
7. Rationale for
triangulation
Denzin proposes four reasons to triangulate:
1. Enriching – outputs of different informal and formal instruments add value to
each other by explaining different aspects of an issue, thus reducing
sources of error.
2. Refuting – where one set of options disproves a hypothesis generated by
another set of options.
3. Confirming – where one set of options confirms a hypothesis generated by
another set of options.
4. Explaining – where one set of options sheds light on unexpected findings
derived from another set of options.
It minimises bias and helps to balance out any of the potential weaknesses in
each data collection method
The goal in choosing different strategies in the same study is to balance them
so each counterbalances the margin of error in the other.
8. Rationale for triangulation (Cont.)
Qualitative investigators may choose triangulation as a research strategy to assure
completeness of findings or to confirm findings.
Assure completeness: as in the parable of the blind men and the elephant, the
most accurate description of the elephant comes from a combination of all the
individuals' descriptions.
Confirm findings: Researchers might also choose triangulation to confirm
findings and conclusions. Any single qualitative research strategy has its
limitations. By combining different strategies, researchers confirm findings by
overcoming the limitations of a single strategy.
Uncovering the same information from more than one vantage point helps
researchers describe how the findings occurred under different circumstances and
assists them to confirm the validity of the findings.
9. Rationale (Cont.)
Triangulation does not only ensure validity but places the
responsibility with the researcher for the construction of
plausible explanations about the phenomena being studied.
Mathison (1988) mentions three outcomes that might result
from a triangulation strategy:
Convergence: data from different sources, methods, investigators, and so on
will provide evidence that will result in a single proposition about some social
phenomenon.
Inconsistency: When multiple sources, methods, and so on are employed we
frequently are faced with a range of perspectives or data that do not confirm
a single proposition about a social phenomenon.
Contradiction: When we have employed several methods we are sometimes
left with a data bank that results in opposing views of the social phenomenon
being studied.
Triangulation, therefore, supports the complementary
theorist, who carefully considers the outcomes in logical
sequence and evaluates whether differences in conclusions can
be explained.
10. Rationale (Cont.)
From a subjectivist scientific perspective, triangulation is seen as a way of
exploring the data and creating different readings of it.
Triangulation is a validity procedure where researchers search for
convergence among multiple and different sources of information to
form themes or categories in a study.
Triangulation has some features in common with Austin Bradford
Hill’s concept of ‘consistency’ which he defines in his considerations
on causality as ‘[results that have] been repeatedly observed by
different persons, in different places, circumstances and times’
(Lawlor, 2017).
11. Exploring forms of triangulation
In his explication of how to use triangulation as a research strategy,
Denzin outlines four types of triangulation, and Potter (1999) identifies a
fifth: the combination of the other four:
Forms of Triangulation:
Data
Method
Investigator
Theory
5th Type: Multiple Triangulation
12. Data Triangulation
Data triangulation refers simply to using several data sources, the obvious
example being the inclusion of more than one individual as a source of data.
However, Denzin expands the notion of data triangulation to include time and
space based on the assumption that understanding a social phenomenon requires
its examination under a variety of conditions.
Denzin (1989) described three types of data triangulation: (1) time, (2) space, and
(3) person.
Time triangulation: researchers collect data about a phenomenon at different
points in time. Studies based on longitudinal designs are not considered examples
of data triangulation for time because they are intended to document changes over
time.
Space triangulation: consists of collecting data at more than one site. At the
outset, the researcher must identify how time or space relate to the study and
make an argument supporting the use of different time or space collection points in
the study.
13. Data Triangulation (Cont.)
Person triangulation: researchers collect data from more than one level of
person, that is, a set of individuals (aggregate analysis), groups (interactive
analysis), or collectives (collectivity level).
Researchers might also discover data that are dissimilar among levels. In such a
case, researchers would collect additional data to resolve the incongruence.
Example: to study the effect of an in-service program on teachers, one should
observe teachers at different times of the school day or year and in different
settings, such as the classroom and the teachers' lounge.
14. Methodological Triangulation
Smith and Kleine (1986) suggest that the use of multi-methods results in "different
images of understanding" thus increasing the "potency" of evaluation findings.
Methodological triangulation is the most discussed type of triangulation and refers
to the use of multiple methods in the examination of a social phenomenon.
Methods triangulation can occur at the level of design or of data collection.
Methods triangulation at the design level has also been called between-method
triangulation, and methods triangulation at the data collection level has been
called within-method triangulation.
Denzin suggests that the within-method approach has limited value, because
essentially only one method is being used, and finds the between-method
strategy more satisfying.
"The rationale for this strategy is that the flaws of one method are often the
strengths of another: and by combining methods, observers can achieve the best
of each while overcoming their unique deficiencies" (Denzin, 1978, p. 302).
This approach is potentially the most powerful because the biases of methods
from one paradigm can be counterbalanced by the methods from the other
(Gray, 2014).
15. Methodological Triangulation (Cont.)
a. Design/Between-method
Design methods triangulation most often uses quantitative methods combined with qualitative
methods in the study design.
simultaneous implementation
sequential implementation
Theory should emerge from the qualitative findings and should not be forced by researchers
into the theory they are using for the quantitative portion of the study.
The blending of qualitative and quantitative approaches does not occur during either data
generation or analysis. Rather, researchers blend these approaches at the level of
interpretation, merging findings from each technique to derive a consistent outcome.
The process of merging findings "is an informed thought process, involving judgment,
wisdom, creativity, and insight and includes the privilege of creating or modifying theory".
If contradictory findings emerge or researchers find negative cases, the
investigators will most likely need to study the phenomenon further.
Sometimes the triangulation design might use two different qualitative research methods.
When researchers combine methods at the design level, they should consider the purpose of
the research and make a logical argument for using each method.
16. Methodological Triangulation (Cont.)
b. Data collection/Within method
Using methods triangulation at the level of data collection, researchers use two
different techniques of data collection, but each technique is within the same
research tradition.
Within-method triangulation "is given when different approaches in one method
are used systematically and are theoretically well founded" (Flick, 2007, p. 73).
It refers to different ways of finding data contained within one method.
The purpose of combining the data collection methods is to provide a more holistic
and better understanding of the phenomenon under study.
Example: within a survey, various subscales can be used in one questionnaire,
assessing different aspects of a phenomenon, or some items can be included in
order to check up on other items.
17. Investigator Triangulation
Investigator triangulation occurs when two or more researchers with divergent
backgrounds and expertise work together on the same study.
To achieve investigator triangulation, multiple investigators each must have
prominent roles in the study and their areas of expertise must be complementary.
Involving more than one investigator in the research process is also considered
good practice. Perhaps more than other types of triangulation, this is usually
built into the research process, because most studies simply require more than
one individual to accomplish the necessary data collection. However, the
decision about who these multiple researchers should be, and what their roles
should be in the research process, is problematic (Denzin, 1978; Miles, 1982).
How much hands-on data collection the principal investigator needs to do in
order to analyze the data, and how much analysis is relegated to field workers
(because much of the analysis occurs as data are collected), are relevant and
not easily answered questions.
18. Investigator Triangulation (Cont.)
Use of methods triangulation usually requires investigator triangulation because
few investigators are expert in more than one research method. This involvement in
data collection and analysis allows verification of findings from a range of
perspectives.
This can provide a check on selective perception and illuminate blind spots in
an interpretive analysis (Blanche et al., 2006).
The goal is not to seek consensus, but to understand multiple ways of seeing the
data.
Observer bias can be reduced and inter-judge reliability improved.
However, observers should be taught to keep an open mind and not become
obsessed with their hypotheses. They should not jump to solutions to a
problem, as this will tend to make them ignore facts that do not confirm their
expectations (Creswell, 2013).
19. Theory Triangulation
Theory triangulation incorporates the use of more than one lens or theory in the
analysis of the same data set.
It is described as "approaching data with multiple perspectives and hypotheses
in mind" (Denzin, 1978, p. 297). Thus, when explaining empirical data, rather
than using a well-known, suitable (and favourite) theory, or simply letting the
data speak for themselves, Denzin advocates a strategy that applies different
theoretical analyses to the same set of data. By testing and discussing the
findings in different lights, new theories may also emerge.
It means using multiple theoretical perspectives to examine and interpret data.
In qualitative research, more than one theoretical explanation emerges from the
data.
Researchers investigate the utility and power of these emerging theories by
cycling between data generation and data analysis until they reach a conclusion.
20. Multiple Triangulation
Multiple triangulation uses a combination of two or more triangulation
techniques in one study. According to Denzin, it incorporates multiple methods
of data collection, multiple sources of data, and multiple investigators with
multiple areas of expertise.
Denzin states that multiple research methods are desirable because each method
reveals a different aspect of reality. This idea has since been developed to include
triangulation as a metaphor for strength, trustworthiness, and comprehensiveness.
Multiple methods, data types, observers and theories are thus combined in the
same investigation.
Guba argues that trustworthiness through triangulation enhances the credibility,
dependability and ‘confirmability’ in qualitative studies.
22. Conclusion
Triangulation is a strategy that enhances the quality of the research
thereby ensuring that the findings are reliable, dependable and valid.
The idea of using different sources to verify the authenticity of the
information is important, since a single perspective is never enough.
It helps to unveil the complexities of phenomena under study and
understand them in depth rather than generalising the findings.
Quality is not something that happens by itself; it needs effort, strategizing and planning, from collection through to analysis and interpretation. Hence the need to triangulate.
Margin of error: the possible range of values above and below the response obtained from a given sample. The margin of error can be interpreted using ideas from the laws of probability, or the "laws of chance" as they are sometimes called.
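The margin of error for a sample proportion can be made concrete with a short Python sketch. This is an illustrative calculation under standard assumptions (simple random sampling, normal approximation, 95% confidence); the function name is our own, not from the source.

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Margin of error for a sample proportion p_hat from a simple
    random sample of size n, using the normal approximation.
    z = 1.96 corresponds to a 95% confidence level."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

# A survey of 1000 respondents with 50% answering "yes" has a margin
# of error of about +/- 3.1 percentage points.
print(round(margin_of_error(0.5, 1000), 3))  # -> 0.031
```

So the "range of values above and below the response" for that survey runs from roughly 46.9% to 53.1%.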
From an objectivist scientific perspective, triangulation may be justified as a means of validation, making the findings more well-founded and convincing (Nøkleby:****).