3. RESEARCH METHODOLOGY
3.1 Introduction
This chapter defines and details the research methodology used to ascertain the feasibility of
implementation at Festive Chicken poultry processing factory in Gauteng Province, South
Africa. This section discusses the study's theory, research strategy, and target population. The
instruments' validity and reliability and the interpretation of the data collected and ethical
considerations will be evaluated. The word "methodology" refers to a thorough understanding of the research process as a whole, while "methods" denotes the collection of particular procedures used in a study to select cases, measure and monitor social life, collect and refine data, analyze data, and present results (Neuman & Guggenheim, 2011). Methodology thus describes the overall approach taken to the data and its analysis. Two distinct approaches will be used in this case study: quantitative and qualitative.
A research methodology is a systematic approach to solving a research problem; it can be thought of as the science of how research is conducted (Patel & Patel, 2019). This chapter discusses the research methods used to develop a strategy for addressing the research problem. It describes the study's methodology in detail, including the specific data collection and analysis strategies used in the case study, and investigates what is actually occurring in the commercial performance of the Festive Chicken poultry processing plant in Gauteng Province, South Africa. The research methodologies, procedures, and approaches required are therefore significant in this study.
This study will employ a mixed research approach to data collection, collecting both
quantitative and qualitative data. Thamhain (2014) and Fassinger and Morrow (2013)
proposed a hybrid strategy for supporting research projects that combines qualitative and
quantitative methodologies. The mixed technique research is a methodological approach that
entails collecting, analyzing, and combining quantitative and qualitative data throughout the
research process to understand complex research problems better. According to Creswell and
Clark (2017), Due to their capacity for integration, a mixed approach adds several dimensions
and rigor to more traditional single-stranded research methods. On a similar scale, Yin
(2006). Asserted that mixed methods research allows for the presentation of a variety of
different perspectives on the case under study and the engagement of vulnerable populations.
According to Tashakkori et al. (2010), the mixed research method enables new ways which
2. explain the dynamics of the business, which is particularly applicable to the performance of
the Festive Chicken poultry processing plant in Gauteng Province, South Africa, which
serves as the research's case study. The methods of quantitative and qualitative research will
be used so as investigate and explore the various claims to knowledge in this case study, and
each method is intended to address a particular type of research question. While the
quantitative method quantifies reality objectively, the qualitative method enables the
researcher to delve deeper into and better understand the study's central question. This
case study aims to define quantitative and qualitative research designs and summarize the
techniques used to conduct studies utilizing both approaches. Additionally, this case study
will discuss what it means to conduct research using a mixed-methods research approach.
While both approaches seek to establish the veracity of sensory knowledge, neither is
absolute in its application.
3.2. Qualitative
Qualitative research is multi-method in nature and approaches its subject through an interpretative, naturalistic perspective; it offers a comprehensive analysis of why a culture is the way it is. Combining multiple data collection techniques, such as interviews, field notes, writing samples, and other materials, helps in determining a group's cultural phenomena, as in the case of the Festive Chicken poultry processing plant in Gauteng Province, South Africa. Qualitative research will be applied to study problems in their natural settings and to make sense of, or interpret, phenomena in terms of the meanings people assign to them, and thus to assess the performance of this particular industry. Qualitative research examines and collects various empirical materials, such as personal experience, case studies, reflections, life stories, interviews, observations, and historical, interactive, and visual texts, to define routine and problematic situations and meanings in employees' lives (Denzin & Lincoln, 2005).
The analysis of qualitative data, such as text data from interview transcripts, is known as qualitative analysis. Unlike quantitative analysis, which is statistical in nature, qualitative analysis depends largely on the researchers' analytical and interpretive capabilities and on an intimate understanding of the social context in which the data is collected. Qualitative analysis emphasizes "sense-making," or comprehension of a phenomenon, rather than prediction or explanation. It requires a creative and inquisitive mentality, a repertoire of analytical methods, and an ethically informed and engaged attitude toward the research context. This chapter provides a brief overview of several of these qualitative analytical methods; interested readers are referred to more comprehensive sources such as Miles and Huberman (1984).
Marshall and Rossman (1999) describe data analysis as the act of bringing order, structure, and meaning to a large amount of data. It is described as a messy, perplexing, and time-consuming process, as well as a fascinating and creative one. The process of comprehending, interpreting, and conceptualizing data generally entails, albeit not in a linear way, a search for general statements across categories of data (Schwandt, 2007).
Qualitative research techniques are significant in this study because the study attempts to gain a deeper understanding of a situation and process, in this case the improvement of implementation at the Festive Chicken poultry processing plant in Gauteng Province, South Africa, given that the topic has not been recently researched or has been only partially researched. Several techniques will be used during the data collection and analysis stages of the research. At the data collection stage, these will include individual depth interviews, focus groups, case studies, grounded theory, and ethnography. Qualitative research requires the use of all relevant tools to elicit recollection, which helps in problem-solving. To extract information from participants in their natural settings, qualitative data collection methods such as open-ended questions, observation, in-depth interviews (including audio and video recordings), and field notes are utilized. These data collection techniques provide a comprehensive account of the study in terms of its participants. Observing participants and structuring focus groups in qualitative research help the researcher better understand behaviour. As a consequence, qualitative research yields a wealth of knowledge about actual people and circumstances. The method for gathering data in qualitative research is considered distinctive: because it relies on the researcher acting as an instrument collecting non-numerical primary data such as words and pictures, qualitative research is well-suited to delivering factual and descriptive information.
Additionally, qualitative research places human thought and behavior within a social context and examines a diverse range of phenomena in order to comprehend and appreciate them fully, through in-depth examination of phenomena such as interaction, thought, reasoning, composition, and norms. The researcher's close relationship with the participants allows the participants to contribute to shaping the research. This accounts for a substantial depth of experiential understanding, as participants comprehend both themselves and their experience as a whole.
3.4. Quantitative
Quantitative research techniques depend on statistics to reach broad conclusions about a topic (Regoniel, 2015). They focus on the use of computer-aided techniques to examine numerical data. Statistics are produced by applying objective measuring scales to the units of analysis, known as variables; nominal, ordinal, interval, and ratio scales are the four types of measuring scales. Quantitative research focuses on objective measurements and the statistical, mathematical, or numerical analysis of data gathered through polls, questionnaires, and surveys, or through the computational manipulation of pre-existing statistical data.
As a consequence, surveys will be conducted as part of this research to gather information that may be used to explain the phenomenon. The surveys will rely on numerical inputs and actual measurements of the factors that characterize the study topic (Regoniel, 2015). These data will be statistically examined to establish whether there are any notable relationships or differences between variables, and the results will be used to form the study's conclusions and generalizations. The data collection tools in this case study will be built to capture measurable demographic characteristics; educational attainment, age, number of children, and economic position are all measurable variables in the research. Questionnaires, polls, and surveys are examples of the data gathering instruments to be used. As Regoniel (2015) notes, the data collection procedure in such research should be guided by standardized, pre-tested techniques and instruments, helping to ensure the data's reliability, precision, and validity and that respondents' answers address the study's goals.
The case study approach is especially effective when a deep understanding of an event or phenomenon is required in its natural, real-world context. In this study, a quantitative case study methodology will be used, permitting an in-depth examination of complex phenomena within the context of a specific implementation at the Festive Chicken poultry processing factory. A quantitative technique will therefore be the most appropriate methodology here. As other scholars have demonstrated, the case study method is among the most frequently used for research of this kind (Baskarada, 2014).
According to Yin (2009), case studies can describe, explain, and investigate natural occurrences and phenomena. They can be used to decode and explain the causal relationships and trails that emerge from a new policy initiative or service development. The case study method can elucidate any gaps in delivery or the rationale behind selecting one implementation option over another; in this scenario, it can aid in the development or refinement of the Festive Chicken manufacturing factory.
A case study can be approached in various ways depending on the researcher's perspective: critical, interpretivist, or positivist, the last based on natural-science criteria such as an emphasis on generalisability. While such a schema may be conceptually useful, it may be appropriate to employ multiple strategies in any case study, especially given the context of this one. For example, Doolin (1998) emphasized the importance of adopting a critical, reflective approach when conducting interpretative case studies, in order to consider the larger qualitative context that influenced the case.
In a real-world context, the case study approach allows for a more in-depth analysis of key events, interventions, policy changes, and program-based service improvements, among other things. When a study technique is found to be inadequate to address the research phenomenon, or to be difficult to carry out, it should be reconsidered (Crowe, 2011). The case study approach is deemed acceptable for this study because of the frequency with which the Festive Chicken poultry processing factory now implements innovations and the ease with which the method lends itself to in-depth, qualitative research.
3.5. Target population
A target population is a group of individuals or elements about which a researcher wishes to gain additional knowledge; it is defined as "all respondents who meet a specified set of criteria" (Burns & Grove, 1997). The target population is the totality of the units from which conclusions are to be drawn, and thus denotes the units to which the survey's findings are to be generalized. It must be precisely defined, as its definition determines which individuals are included in or excluded from the survey; the target population's geographical and temporal characteristics, and the types of elements incorporated into it, must all be specified. In some situations, the target population is restricted to exclude members who are impossible to interview, or to avoid redundancy and wasted effort when time and resources are scarce. When performing a cost-effectiveness analysis, it is especially critical to define the target population and any subgroups clearly. The target population, the group on whom the research will be conducted and about whom conclusions will be drawn, is in this case the employees of the Festive Chicken poultry processing plant in Gauteng Province, South Africa. The target population for this case study will be the plant's 2500 employees, and convenience and judgmental sampling techniques will be used to select participants.
3.6. Sample and sample procedure
The quantitative section will employ a sample size of at least 100 company employees. The first section of the questionnaire will be used to collect demographic data. The second section will employ a nominal scale with three response options: 1 (yes), 2 (no), and 3 (not sure/uncertain). It will consist of three sub-sections: management support, training, and recognition and rewards. Employees, supervisors, and managers will be interviewed regarding their prior experience of implementing a continuous improvement initiative at Festive Chicken, using a non-probability sampling technique, specifically convenience and judgmental sampling.
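As an illustration of how responses on this nominal scale could be tallied for quantitative analysis, the following sketch counts hypothetical coded answers; the response data and labels are invented for illustration, not actual study results.

```python
from collections import Counter

# Hypothetical coded answers to one questionnaire item:
# 1 = yes, 2 = no, 3 = not sure/uncertain (illustrative data only).
responses = [1, 1, 2, 3, 1, 2, 1, 3, 1, 2]

labels = {1: "yes", 2: "no", 3: "uncertain"}
counts = Counter(responses)

# Report the count and share of each response option.
for code, label in labels.items():
    share = counts[code] / len(responses)
    print(f"{label}: {counts[code]} ({share:.0%})")
```

Because the scale is nominal, only frequencies and proportions (not means or rankings) are meaningful summaries of these codes.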
This study will employ both techniques. Judgmental sampling is a strategy in which specific settings, individuals, or events are purposefully chosen to obtain critical information that cannot be obtained through other means: the researcher selects participants based on his or her assessment of who will provide the most information to accomplish the study's objectives (Etikan et al., 2017). It occurs when a researcher includes cases or participants in the sample solely because they are judged to belong there; to obtain the necessary information, the researcher focuses on those best placed to provide it.
Convenience sampling is the most frequently used and most applicable non-probability sampling technique in case studies (Etikan et al., 2017). Investigators employ this method to select subjects based on their availability and accessibility, which makes the procedure rapid, inexpensive, and convenient: the researcher chooses sample elements based on their accessibility and proximity, selecting participants who are readily available. Convenience sampling is generally preferred for case studies because of its low cost and ease of use compared with other sampling methods, which is why it is appropriate for this study, and it frequently helps to circumvent a number of the practical limitations of research; for example, sampling friends or family members is more convenient than randomly selecting individuals.
Convenience sampling will be used in this study because of its practicality. Researchers choose this method of data collection when time is limited: simple random sampling, stratified sampling, and systematic sampling have more complicated sample selection procedures, whereas the simplicity of convenience sampling makes data gathering fast. It is also a low-cost technique; probability sampling methods require a significant investment of money and time, while convenience sampling enables researchers to generate samples at little or no cost and in a short period. Because the researcher has direct access to the elements, recruiting sample participants is straightforward, and the researcher is not required to move around excessively during data collection. Quickly filled quotas enable data collection to begin within a few hours, which will help the researcher minimize wasted time in this particular case study.
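The practical difference between the two non-probability techniques can be sketched as follows. The roster, roles, availability flags, and the cap of 100 respondents are all illustrative assumptions, not actual plant data.

```python
import random

random.seed(0)  # fixed seed so the illustration is repeatable

# Hypothetical roster of the plant's 2500 employees; roles and
# availability are invented for illustration only.
roster = [
    {"id": i,
     "role": random.choice(["operator", "supervisor", "manager"]),
     "available": random.random() < 0.3}
    for i in range(2500)
]

# Convenience sampling: take whoever happens to be available,
# up to the target sample size of 100.
convenience = [e for e in roster if e["available"]][:100]

# Judgmental sampling: purposefully include supervisors and managers,
# who can speak to prior continuous-improvement initiatives.
judgmental = [e for e in roster if e["role"] in ("supervisor", "manager")]

print(len(convenience), "convenience;", len(judgmental), "judgmental")
```

The sketch makes the trade-off visible: the convenience sample is cheap to assemble but reflects only who was reachable, while the judgmental sample is chosen deliberately for what its members know.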
3.7. Recruitment strategy
Typically, recruitment strategies include identifying eligible populations, obtaining enough
and representative samples, retaining respondents throughout the study's duration, and
minimizing the cost-benefit ratio, all while adhering to ethical standards (Blanton et al.,
2007). Recruitment strategies such as fliers, newspaper advertisements, emails, and letters are
just some of the ones that will be used in this study. The researcher will acquire a mailing list
and use the information contained therein to locate and contact individuals and send letters
and emails. Similarly, job announcements can be distributed via established listservs.
However, it is prudent first to contact the list owner to ascertain whether these types of
notices are permitted on a listserv.
3.8. Instrumentation
The research instrument will consist of an online questionnaire, distributed to participants via an emailed link, together with interview questions. The study will collect primary data using a semi-structured questionnaire, as recommended by Creswell and Clark (2017). The questionnaires will be closed-ended, with multiple-choice responses to facilitate quantitative analysis of the data. Interviews with key informants will be used to elicit qualitative data. Because the target population is concentrated at a single site, collecting data via personal interviews and observations will be relatively appropriate.
3.9. Validity and Reliability of the Instruments
The validity test verifies that the data in the study accurately represent the variables they stand for; validity refers to an instrument's capability to measure the variables for which it was designed. Criterion-related validity, also termed predictive validity, concerns the degree to which accurate predictions can be made from the measure (Saunders et al., 2009). This type of validity reflects the degree to which the methods used in the estimation process are relevant, free of bias, and designed and selected objectively; they should also be dependable and readily available to the researcher (Kothari, 2004). Additionally, Rothman et al. (2008) defined validity in terms of the amount of systematic, built-in error in a questionnaire as a tool.
Validity is the measure of how well an operational questionnaire captures the concept of a theoretical construct; this is referred to as representational or translational validity, whose two subcategories are face validity and content validity. Questionnaire validity, on the other hand, will be determined via a secondary survey in the form of a field test, which will evaluate how well a particular measure corresponds with one or more external criteria. Validity of this kind is classified into two types: criterion-related and construct validity (Bolarinwa, 2015). While some authors consider criterion-related and construct validity to be synonymous, others believe they are distinct concepts. Predictive and concurrent validity are subtypes of criterion-related validity, whereas convergent, discriminant, known-group, and factorial validity are subtypes of construct validity. Furthermore, Bolarinwa (2015) defined construct validity as the capacity of the underlying hypotheses to be tested.
As defined by Saunders et al. (2009), construct validity is the extent to which a measurement instrument correctly measures the presence of the constructs being quantified. In this study, the instrument's construct validity will be determined by comparing its measurements against a set of theoretical propositions, as recommended by Kothari (2004), while correlation analysis will be used to establish criterion-related validity.
The expert panel will determine the validity of a questionnaire in this study. Their feedback
will be used to make necessary revisions to the questionnaire before starting primary data
collection. Thus, the questionnaire's accuracy and consistency will be critical components of
this research methodology, as they will contribute to the validity and reliability of the study.
In contrast, reliability refers to a measurement's ability to provide consistent readings regardless of the environment in which it is taken, that is, the ability of a measurement or procedure to replicate its results (Rothman et al., 2008). According to Rothman et al. (2008), a lack of reliability can arise from deviation between observers or measurement tools, such as a questionnaire, or from instability in the attribute being measured, which invariably affects the questionnaire's validity. Reliability comprises three components: equivalence, stability, and internal consistency. Stability exists when the same or similar scores are obtained repeatedly from the same group of respondents; in other words, reliability ratings should remain consistent over time. The instrument's reliability will be determined by a test-retest method, administering the same measurement instrument, the questionnaire, to the same respondents under the same circumstances two weeks later. This is the most frequently used method for determining the reliability of questionnaires in surveys and is thus appropriate for this type of study.
3.10. Data collection
Respondents will sign a consent letter, will have additional time to consider their responses,
and will be able to respond to questions at their leisure. Most importantly, interviews will
occur in natural settings to allow for a complete understanding of a single individual. The
data collection process will be carried out via email distribution of online questionnaires.
Participants will be given one week to complete the survey before submitting it for
verification and analysis. Participants will be contacted before the interview to schedule the
session and prepare. Ten to twenty minutes will be allotted for each interview. Following
consent from the participants, an audio recorder will be used to capture the interview.
3.11. Data Analysis
Data analysis is the procedure of evaluating data by applying rational and logical reasoning to every element of the data that has been obtained, and it is one of the numerous phases involved in undertaking research. In this case study, data will be gathered from a variety of sources; once gathered, it will be reviewed and analyzed in order to reach a conclusion about the case under study. As is customary in qualitative research, data collection and analysis will take place simultaneously. The nature of the analysis will depend on the kind of case study. Yin (2006) outlines analytical techniques including pattern matching, linking data to propositions, explanation building, time-series analysis, logic models, and cross-case synthesis; other case study research classifies categorical aggregation and direct interpretation as further types of analysis.
After coding text content for analysis, the researcher must classify each code into an appropriate category of a cross-reference matrix. Using computer tools simply to calculate frequency or word count can be misleading: "In this scenario, the researcher may get a precise count of that word's use and frequency, but not an accurate accounting of the meaning included in each use" (Gottschalk, 1995). Additional analysis will be required to determine the data set's dimensionality and possibly identify new meaningful underlying variables.
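Gottschalk's caution can be illustrated with a short sketch: a raw frequency count is trivial to compute but carries no information about meaning. The transcript excerpt below is invented for illustration.

```python
import re
from collections import Counter

# Invented interview excerpt (not actual study data).
transcript = ("Training was useful. The training sessions helped, "
              "but training alone cannot fix recognition problems.")

# Easy to compute: how often each word occurs.
words = re.findall(r"[a-z]+", transcript.lower())
freq = Counter(words)
print(freq["training"])  # 3 occurrences in this excerpt

# The count alone cannot say whether "training" is praised or criticized
# in each use; that still requires manual interpretation of each passage.
```

This is why the word count serves only as a starting point for the cross-reference matrix, not as an analysis in itself.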
Regardless of whether statistical or non-statistical methods of analysis are used, researchers
will be aware of the risk of data integrity being compromised. While statistical analysis is
most commonly used on quantitative data, there are also analytic processes created expressly
for qualitative data, such as content, thematic, and ethnographic analysis. Researchers utilize
a range of techniques to evaluate data in order to test hypotheses, discover patterns of
behaviour and eventually answer research questions, regardless of whether they do
quantitative or qualitative research. Data integrity will be jeopardized if the challenges with
data analysis are not understood and acknowledged.
Grounded theory is an inductive method for interpreting collected data on social phenomena in order to develop hypotheses about those phenomena. Glaser and Strauss (1967) developed and refined the technique through their constant comparative analysis, providing an illustration of specific coding techniques: procedures for categorizing and classifying segments of text data into a set of codes (concepts), categories (constructs), and relationships. The name derives from the fact that the interpretations are "grounded in," or empirically based on, the data. The grounded theory method asks researchers to suspend theoretical assumptions or the biases of pre-existing theory before data analysis, allowing the data to determine the theory's development; this helps ensure that the resulting theory is grounded entirely in what is observed.
For the analysis of text data, Strauss and Corbin (1998) offer three coding methods: open, axial, and selective. Open coding is a method for detecting thoughts or ideas hidden in textual data that may be related to the phenomenon of interest. The researcher searches through the raw textual data line by line for discrete events, occurrences, thoughts, acts, perceptions, and interactions to classify as concepts (hence called in vivo codes). For later confirmation, each concept is linked to the specific passages of text (coding units) that support it. Some concepts are straightforward, clear, and unambiguous, whereas others are more complicated, perplexing, and subject to different interpretations. Depending on the idea to be captured, the coding unit may vary: a simple concept like "organizational size" may need just a few words of text, while a more complicated concept like "organizational purpose" may require many pages. Concepts may be named using the researchers' own conventions or standardized labels from the literature. Once a basic collection of concepts has been identified, the remainder of the data can be coded while simultaneously looking for new concepts and refining current ones. It is also essential to identify the properties of each concept, such as size, colour, or level (e.g., high or low), so that similar concepts can later be grouped together. Because the researcher remains free to take in, and actively investigate, new ideas related to the subject of interest, this coding style is called "open."
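The mechanics of open coding, recording each concept together with the coding units that support it, can be sketched as follows. The transcript lines, code names, and helper function are hypothetical illustrations, not part of any actual coding scheme.

```python
# Hypothetical raw transcript, one coding unit (line) per entry.
transcript = [
    "My supervisor never explains why the line layout changed.",
    "We got a bonus last month when the team hit its target.",
    "Nobody tells us why the targets keep moving.",
]

# Each concept (code) is linked back to the coding units that support it.
codes: dict[str, list[int]] = {}

def assign(code: str, unit: int) -> None:
    """Attach a coding unit to a concept, creating the code on first use."""
    codes.setdefault(code, []).append(unit)

# The researcher reads line by line and records discrete ideas as codes.
assign("lack of communication", 0)
assign("recognition and rewards", 1)
assign("lack of communication", 2)

for code, units in codes.items():
    print(code, "->", units)  # each code keeps its supporting passages
```

Keeping the link from code to passage is what makes later confirmation, and the axial-stage search for relationships between categories, possible.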
Similar concepts are then grouped into higher-level categories. Whereas concepts are context-specific, categories are more generalizable; grouping reduces the number of concepts the researcher must deal with in order to create a "big picture" of the issues crucial to understanding the social phenomenon. Concepts can be organized in stages, into subcategories and then into higher-level categories. Constructs from the literature may be used to establish these categories, especially if the study's goal is to build on previously held ideas; existing constructs should nevertheless be used with care, since they may embed commonly held assumptions and prejudices that must be taken into account. As well as identifying the characteristics (or properties) of each category, it is also important to identify the dimensions of each characteristic, a dimension being the value of a characteristic along a continuum.
In axial coding, the second stage of grounded theory, categories and subcategories are connected to form causal relationships or hypotheses that may tentatively explain the events of interest. Axial and open coding can be performed at the same time. Relationships between categories may be obvious in the data, or they may be more subtle and implicit; in the latter case, researchers may employ a framework known as a "coding paradigm," categorizing the conditions under which events occur, individuals' actions/interactions in response, and the consequences (the results of those actions/interactions). Once conditions, actions/interactions, and consequences are identified, researchers can begin to explain why a phenomenon happens, under what circumstances, and with what effects.
Finally, in the last step of grounded theory, selective coding is used to connect the core category or core variables to the other categories. The core category may be derived from existing categories, or it may be a higher-order category encompassing previously coded ones. Fresh data is then carefully sampled to validate the tentative theory; as a result, the scope of the inquiry narrows and the process accelerates. It is also important to watch for new categories that arise from fresh data and are linked to the phenomenon of interest (open coding), which may prompt further refinement of the original hypothesis. It is thus possible to code in all three ways, open, axial, and selective, at the same time. Coding of new data and refinement of the theory continue until theoretical saturation is achieved, that is, until additional data produces no marginal change in the fundamental categories or relationships.
For many reasons, Yin (2003) stresses the significance of returning to the propositions
throughout the analytical phase of any case study. First, this approach results in a more
focused examination rather than a temptation to examine data that is unrelated to the research
objectives. Second, evaluating alternative hypotheses involves trying to explain the
phenomenon in a different way. Third, as further hypotheses and competing concepts are
addressed and accepted or rejected in this iterative process, confidence in the results will
increase.
Qualitative data analysis generally is distinct from quantitative data analysis. Qualitative data
will be gathered through interviews in this study. Qualitative data will be analyzed for themes
and patterns using content analysis. Data analysis will entail identifying common patterns
among responses and conducting a critical analysis to accomplish the research's goals and
objectives. On the other hand, data analysis for quantitative methods will entail the deep
examination and explanation of figures and statistics to ascertain the basis for the emergence
of major conclusions. The findings and conclusions of primary research will be compared to
those of the literature review, a comparison critical for both the qualitative and quantitative
strands. The survey data will be exported from the online survey tool to Excel format.
Following that, the data will be analyzed and reliability tests will be conducted using the
Statistical Package for the Social Sciences (SPSS), version 25.0. All SPSS or Excel tables,
graphs, and charts will be discussed and analyzed in detail.
Types of Data Analysis and Process
Hermeneutic Content Analysis
In business marketing, content analysis is often needed for market decision-making and
assessment, for example of reports, research papers, and journal articles. This has led to the
development of a qualitative content-analysis technique that can be applied to a wide range
of sources: Hermeneutic Content Analysis. By combining content-analysis concepts such as
coding, classification, and systematization with understanding and reflection, this method
blends hermeneutics with qualitative content analysis.
There are two types of content analysis: Hermeneutical and Qualitative. Interpreting a text in
a circular motion that incorporates both subjective and objective elements is called
hermeneutics. By classifying and categorizing qualitative data, qualitative content analysis
(QCA) provides the meaning of the data in a systematic way. It covers elements such as
material description and interpretation. Some consider Hermeneutic Content Analysis a
hybrid technique combining hermeneutic analysis with qualitative content analysis; HCA is
therefore composed of two processes: systematization, coding, and classification on the one
hand, and interpretation, understanding, and reflection on the other.
Hermeneutic Content Analysis (HCA) is a kind of qualitative content analysis that goes
beyond the traditional definition. However, the focus of the hermeneutic content analysis is
on understanding and reflection. To understand the meaning of the material that has been
examined, Hermeneutic Content Analysis is required.
Hermeneutics enables us to comprehend a text's sense as well as its deepest meaning, while
qualitative content analysis is a method for systematically describing qualitative data.
Combining the two methodologies brings together data reduction, systematization, and
adaptability with comprehension, the objective spirit, and the hermeneutic circle. These
aspects of Hermeneutic Content Analysis open up new possibilities for the study of theory,
action, reflection, and the lifeworld.
Content Analysis
Content analysis examines who says what, to whom, why, to what degree, and with what
impact. A typical content-analysis procedure runs as follows. When analyzing a large number
of texts, such as newspaper articles, financial reports, blog entries, and online reviews,
researchers begin by selecting a subset of the full corpus to study. Choosing the texts that
provide the most important information is not a random procedure. In the second step, the
researcher establishes and applies criteria for dividing each text into smaller "chunks" for
independent analysis, a step called "unitizing." Examples of such units include assumptions,
repercussions, facilitators, and obstacles. In a process known as coding, the researcher develops and
applies one or more concepts to each unitized text segment. The researcher uses a coding
method to code the information. Last but not least, the coded data is quantitatively and
qualitatively examined to see which themes appear most often, under what conditions, and
how they relate to one another.
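The unitizing, coding, and counting steps above can be sketched in a few lines. This is an illustrative sketch only: the codebook, its keywords, and the sample text are hypothetical, and real coding relies on human judgment rather than simple keyword matching.

```python
import re
from collections import Counter

# Hypothetical codebook: each code is triggered by keywords (an assumption
# made for illustration; a real codebook is developed from the data).
CODEBOOK = {
    "facilitator": ["support", "training", "resources"],
    "obstacle": ["delay", "shortage", "breakdown"],
}

def unitize(text):
    """Divide a text into sentence-level units for independent coding."""
    return [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]

def code_units(units):
    """Assign each matching code to a unit and tally code frequencies."""
    tally = Counter()
    for unit in units:
        lowered = unit.lower()
        for code, keywords in CODEBOOK.items():
            if any(kw in lowered for kw in keywords):
                tally[code] += 1
    return tally

units = unitize("Training helped a lot. The machine breakdown caused a delay.")
print(code_units(units))  # Counter({'facilitator': 1, 'obstacle': 1})
```

The resulting frequency table is what the final qualitative and quantitative examination of themes would then work from.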
Conversation analysis
Social researchers use conversation analysis to study how participants comprehend natural
forms of social interaction and how they collaborate to organize them. Using video or audio
recording technology, conversation analysis offers a distinct methodological component.
Detailed transcriptions of recordings allow for a fine-grained study of the design, exchange,
and coordination of social actions; this section introduces its philosophical underpinnings and
fundamental concepts. A conversation involves at least two individuals, both of whom are
expected to participate, whether by speaking and responding or simply by listening.
It is important to note that formal institutional contact is limited to just a few institutions.
Medical, psychiatric, social service and business contexts are much more likely to use the so-
called non-formal modes of communication. In these situations, behavioural patterns are
much less stable. The aim of the engagement is more or less self-evident: to carry out
institutional responsibilities, such as diagnosing illness or making decisions about a client's
health or welfare needs. These duties and actions are usually carried out under turn-
taking frameworks that allow for a great deal of flexibility, improvisation, and negotiation in
terms of the participation status accepted at any given time by both lay and professional
participants. In conversation analysis, these non-formal forms of institutional engagement are
accordingly treated as quasi-conversational and are examined as departures from ordinary
conversation. In medical consultations, for example, the acts of asking and responding, as
well as the types of information sought and given, are asymmetrically divided between
doctors and patients within an overall framework (Maynard and Heritage, 2004).
Discourse analysis
Discourse is language in use (Brown and Yule, 1983; Cook, 1989), and discourse analysis is
therefore the study of language in use. "Language in use" refers to the norms, preferences,
and expectations that connect language to context; discourse analysis is also described as the
study of language structure above the sentence level, with the term "text" used where
"discourse" is not appropriate. In addition to studying formal language characteristics,
discourse analysis examines how language, written or spoken, including conversation and
institutionalized forms of discourse, is used in social and cultural settings, together with the
circumstances in which it is used. What matters is whether a text is perceived as coherent.
Discourse is "language in use or language used to communicate something perceived to be
coherent, whether or not it correlates with a proper phrase or collection of right words,"
according to Guy Cook (1989:6-7). Discourse analysis, in his view, is the quest for what
creates coherence in discourse. Discourse, he argues, does not have to be grammatically
flawless; it may be anything from a grunt or a simple curse, to brief conversations and
scribbled notes, to a book or a lengthy legal argument.
Cognitive Mapping
Cognitive maps are defined as "internally represented schemas or mental models for certain
problem-solving domains" as a result of an individual's contact with their environment
(Swan, 1997, p. 188). According to semantic theory, information is stored as a network of
interconnected ideas (Katz & Fodor, 1963), and the more connected an individual's
knowledge is, the better he or she can recall it. According to constructivist theory, the learner
develops new knowledge by linking new information to existing knowledge structures.
Concept mapping therefore externalizes this mental integration of information by creating a
network map of concepts and their relationships; such maps are also called mental
externalizations or cognitive maps (Turns, Atman, & Adams, 2000; Wheeldon & Faubert,
2009; Wycoff, 1991).
Cognitive mapping has proved to be a helpful analytical method in both qualitative and
quantitative research. Pearson and Somekh (2003) scored each map against a set of criteria:
the number of connections and nodes on each map was tallied to calculate a score, the
number of important items appearing on each map was counted, and a final quantitative
analysis summed all of a map's content categories to estimate its richness. In another study,
Turns, Atman, and Adams (2000) used concept maps to evaluate an introductory human
factors engineering course at both the development and program levels; these factors,
together with the number of hierarchical levels and the intricacy of connections, were taken
into account when rating the maps.
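A node-and-link tallying scheme of the kind Pearson and Somekh describe can be sketched as follows. The map content and the exact scoring rule (nodes plus links) are assumptions for illustration, not the published criteria.

```python
def score_cognitive_map(links):
    """Score a cognitive map recorded as a list of (concept, concept) links.

    Under the assumed scheme, the score is the number of distinct
    concepts (nodes) plus the number of links drawn between them.
    """
    nodes = {concept for pair in links for concept in pair}
    return {"nodes": len(nodes), "links": len(links),
            "score": len(nodes) + len(links)}

# Hypothetical map of a participant's ICT concepts.
ict_map = [("computer", "email"), ("computer", "games"),
           ("email", "internet")]
print(score_cognitive_map(ict_map))  # 4 nodes, 3 links, score 7
```

Richer maps, with more concepts and denser linking, score higher, which is the intuition behind using such counts as a proxy for the richness of a participant's mental model.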
Narrative analysis
Narrative analysis is a collection of analytical methods for comprehending storied texts and
visual data. A common understanding of narrative methods is that people create stories to
organize their lives and make sense of them, and that their stories are useful and meaningful.
Thematic and structural approaches are distinguished by whether they focus on narrative
content or narrative structure: the thematic version asks what a story is about, while the
structural version asks how a story is constructed to accomplish particular communication
objectives. According to Kohler Riessman (2008), this fundamental typology may be
supplemented by dialogical/performance narrative analysis, which focuses on the context and
perspective of narratives as multi-voiced, and by visual analysis, which connects word and
picture into a consistent story.
Repertory Grid Technique
The repertory grid (RepGrid) is a technique for eliciting personal constructs, that is, what
individuals believe about a problem. According to the psychology of personal constructs,
people's perceptions of the things they interact with comprise a number of linked
similarity–difference dimensions called personal constructs. The technique may be used in a
variety of fundamental and applied studies of human construing. A benefit of the repertory
grid method is that impressions may be obtained without influence or prejudice from the
researchers (Bytheway and Whyte, 1996).
The repertory grid may be utilized as a stand-alone method, in early investigations preceding
a qualitative or quantitative study, or as a complement to verify or deepen findings acquired
using other techniques. Beyond academia, repertory grid analysis is used in areas such as
advisory work and marketing. Many versions of the general approach exist today, some more
complex than others. According to Slater (1976), as cited in Dillon (1994:76), using it as an
analytical tool does not require adopting Kelly's posited model of the person. Mainstream
RGT also offers a number of elicitation methods for extracting and analyzing constructs.
Grounded Theory
Grounded theory is a widely recognized research approach utilized in many research projects,
and both qualitative and quantitative data-generation methods may be used in grounded
theory research. Its objective is to discover or construct theory from data that has been
systematically gathered and evaluated through comparative analysis. Grounded theory is a
complex and essentially flexible approach, so new researchers must work to understand the
discourse around its theoretical ideas and processes and how to apply them in practice. The
investigator must be well aware of the research process before any research project begins: a
well-developed study outline and knowledge of the key problems in designing and
performing a grounded theory (GT) study are necessary for achieving the research objectives.
Although it is essential to understand how the technique has developed, a beginner may align
himself or herself with a well-founded theorist and take a sound approach to research.
For grounded theory research, the full use of its specific methods and processes is necessary.
Methods are specified as "systematic approaches, processes or data gathering and analysis
tools" (Mackenzie and Knipe, 2006). While GT studies may start with a range of sampling
methods, many begin with purposive sampling, followed by concurrent data generation
and/or collection, data analysis, coding, and theoretical sampling at different levels.
Theoretical sampling is employed until theoretical saturation is achieved. The methods and
processes of GT form a continuously developing, iterative sequence of actions and
interactions, with the methods linked together and informing repeated parts of the research
process. The fundamental stages and methods for creating an integrated theory were
presented above, and the approach may be adapted to educate and assist beginner researchers
in designing a GT study. This framework is a useful tool for visualizing the connections
between GT methods and processes. Research that is carried out ethically and carefully using
the technique will provide high-quality results that are helpful in practice.
Qualitative data analysis
The qualitative method is frequently used to assist the researcher in developing a very
comprehensive knowledge of a particular phenomenon. The results of this type of research
include generating outcomes that can be used to inform practice (Lochmiller, 2016), giving
thorough descriptions of a specific practice problem, providing insights into professional
practices in a given context, and addressing problems interrelated to the subjective manner of
qualitative research (Cho et al., 2016). The potential of the qualitative method is significant,
but it is contingent upon researchers being capable of carrying out grounded, rigorous
analyses and, more broadly, understanding what qualitative analysis entails.
A content analysis will be conducted to elicit detailed information about the respondent's
experience with continuous improvement at Festive Chicken. After data collection from
respondents, the case study's analysis will begin. The steps involved in the content analysis
will be as follows:
Formulate a topic and research question: the analysis will begin with the formulation of a
topic and a research question. Content analysis is appropriate when the variable in question is
a message or a symbol.
Determine the unit of analysis: the specific segment of content to be classified, whether a
word, word sense, phrase, sentence, paragraph, or document, as well as themes and
individual persons.
Develop a sampling strategy: the research will consider all of the items relevant to this field
of study. A census inquiry is a comprehensive enumeration of all the constituents of the
'population,' where a population is a collection of individuals, objects, or items from which
measurements are taken.
Coding and recording of content: coding will be used to organize the data. A 'code' may be a
single word or a short phrase that encapsulates a concept or notion. Coding is the process of
conceptualizing and categorizing research data in order to facilitate data analysis and
interpretation. Each code will have a meaningful title, and numerous non-quantifiable
elements will be given titles, including events, behaviors, activities, and meanings.
Coding, Validity, and Reliability Check: This stage will involve the researcher determining
the validity and reliability of the established coding categories. In this case, as in others, a
research instrument is said to have validity if it measures what it claims to measure and
reliability if it consistently produces the same result.
Data Collection and Analysis: This is the technique and procedure where data will be
collected necessary for the research. The data analysis method will be determined by the type
of research conducted.
Recognize recurring themes, patterns, and relationships: unlike quantitative analysis,
qualitative data analysis has no universally applicable techniques, so the researcher's
analytical and critical thinking abilities are critical at this stage. This phase will identify
themes and patterns relevant to the research objectives; common patterns, themes, and
relationships will be identified within the sample group members' responses with respect to
the codes assigned in the preceding stage.
Concisely summarize the data: the researcher will then connect the research outcomes to the
research purpose and objectives. In the data analysis chapter, important quotations from the
records will be used to emphasize major themes and possible contradictions within the
outcomes.
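The theme-identification and summarizing steps above can be sketched as follows. The coded interview excerpts are invented for illustration, and the rule that a theme must recur across at least two respondents is an assumption, not a fixed standard.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (respondent_id, code, quote) triples.
coded = [
    (1, "training gaps", "We never got refresher courses."),
    (2, "training gaps", "New staff learn on the job only."),
    (2, "management support", "Supervisors back our suggestions."),
    (3, "training gaps", "Courses stopped two years ago."),
]

def recurring_themes(coded, min_respondents=2):
    """Keep codes mentioned by at least min_respondents distinct people,
    returning each surviving theme with its supporting quotes."""
    respondents = defaultdict(set)
    quotes = defaultdict(list)
    for rid, code, quote in coded:
        respondents[code].add(rid)
        quotes[code].append(quote)
    return {code: quotes[code] for code, rids in respondents.items()
            if len(rids) >= min_respondents}

print(recurring_themes(coded))  # only 'training gaps' recurs across respondents
```

Keeping the supporting quotes alongside each theme makes it straightforward to quote them when writing up the analysis, as the step above describes.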
Quantitative analysis
Quantitative data analysis is the method of analyzing numerical data, which can be
categorical or continuous, using various statistical techniques. Statistics fall into two broad
categories: descriptive statistics and inferential statistics. The descriptive approach will
describe the researcher's sample, whereas the inferential approach will be used to make
predictions about the characteristics of the population. Quantitative data is defined as data
expressed in counts or figures, with every data set having its own unique numerical value; it
is any computable information that lends itself to mathematical calculation and numerical
analysis, permitting real-world conclusions to be drawn from those derivations. Lutabingwa
and Auriacombe (2007) recommend examining the data directly, using the mean, percentage,
or ratio, before attempting to find relationships using regression and chi-square tests. Both
descriptive and inferential quantitative data analysis will be conducted using the Statistical
Package for the Social Sciences (SPSS) version 25.
The researcher will employ quantitative data analysis with the aid of statistical software in
this study, which will entail the following stages, as articulated by Saunders et al. (2012).
Data will be compiled and verified. This will involve entering data into a computer,
determining the most appropriate tables and diagrams to use based on the research objectives,
defining the most appropriate figures to describe the data, and defining the most appropriate
statistics to examine the data relationships and trends.
In this case, the researcher will collect raw data that will be analyzed and presented in a
meaningful manner. Quantitative data will be analyzed to discover evidence that will assist in
the research procedure. The researcher will then match the variables to measurement scales:
nominal, ordinal, interval, and ratio. This step is necessary to ensure that the data is properly
organized. After that, the data will be entered into an Excel sheet to be organized in a
predetermined format. The researcher will then select a scale of measurement.
It is critical to choose a measurement scale for the variable in order to generate descriptive
statistics. For example, because a nominal variable score never has a mean or median, the
descriptive statistics will vary accordingly. In circumstances where the outcome cannot be
generalized to the population, descriptive statistics will suffice. Finally, the researcher will
choose appropriate tables to represent and analyze obtained data: After determining a proper
measurement scale, researchers will represent data in a tabular format. Numerous techniques,
such as cross-tabulation, will be used to analyze this data.
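The cross-tabulation step just described can be sketched without SPSS. The survey records below are hypothetical, and in practice SPSS would also report degrees of freedom and a p-value alongside the chi-square statistic.

```python
from collections import Counter

# Hypothetical survey records: (department, supports_improvement) pairs.
records = [("processing", "yes"), ("processing", "no"), ("processing", "yes"),
           ("packing", "yes"), ("packing", "yes"), ("packing", "yes"),
           ("processing", "no"), ("packing", "no")]

def cross_tabulate(records):
    """Build a row-by-column frequency table from paired observations."""
    counts = Counter(records)
    rows = sorted({r for r, _ in records})
    cols = sorted({c for _, c in records})
    return {r: {c: counts[(r, c)] for c in cols} for r in rows}

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    rows = list(table)
    cols = list(next(iter(table.values())))
    row_tot = {r: sum(table[r].values()) for r in rows}
    col_tot = {c: sum(table[r][c] for r in rows) for c in cols}
    n = sum(row_tot.values())
    stat = 0.0
    for r in rows:
        for c in cols:
            expected = row_tot[r] * col_tot[c] / n
            stat += (table[r][c] - expected) ** 2 / expected
    return stat

table = cross_tabulate(records)
print(table)              # e.g. processing: 2 yes / 2 no, packing: 3 yes / 1 no
print(chi_square(table))  # ~0.53 for this small hypothetical sample
```

A larger chi-square statistic indicates a greater departure of the observed counts from what independence between the two variables would predict, which is what the inferential step of the analysis tests.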
While quantitative and qualitative approaches to research are distinct, both will be used in
this study to answer the research question adequately. When both methods are used, the
researcher will first conduct a focus group to assist in developing a survey-type instrument.
On the other hand, after completing a quantitative analysis, the researcher may wish to delve
deeper into a particular trend or phenomenon identified during the data analysis and
interpretation phases. Additionally, researchers may employ techniques from both traditions
concurrently. For instance, a researcher may choose to conduct a content analysis of an online
forum and a quantitative analysis of survey data. Using mixed methods in this study is an
effective way to employ triangulation, specifically "methodological triangulation." It will be
up to the researcher and their team to determine which methods will work best for the
research questions and objectives. The researcher must understand that he is not obligated to
write this dissertation using one tradition over another, as both are valuable.
References
Baškarada, S. (2014). Qualitative case study guidelines. The Qualitative Report, 19(40), 1-25.
Blanton, S., Morris, D. M., Prettyman, M. G., McCulloch, K., Redmond, S., Light, K. E., &
Wolf, S. L. (2006). Lessons learned in participant recruitment and retention: the
EXCITE trial. Physical therapy, 86(11), 1520-1533.
Bolarinwa, O. A. (2015). Principles and methods of validity and reliability testing of
questionnaires used in social and health science researches. Nigerian Postgraduate
Medical Journal, 22(4), 195.
Burns, N., & Groves, K. (1997). Practice of nursing research. Retrieved from
http://www.just.edu.jo/coursesandlabs/seminar_nur%20793/syllabus_793.doc
Birks, M., & Mills, J. (2015). Grounded theory: A practical guide. Sage.
Brown, G., & Yule, G. (1983). Discourse analysis. Cambridge University Press.
Cho, Y., Park, J., Ju, B., Han, S., Moon, H., Park, S., Ju, A., Park, E. (2016). Women leaders'
work-life imbalance in South Korean companies: A collaborative qualitative study.
Human Resource Development Quarterly, 27(4), 461–487.
Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods
research. Sage publications.
Crowe, S., Cresswell, K., Robertson, A., Huby, G., Avery, A., & Sheikh, A. (2011). The case
study approach. BMC Medical Research Methodology, 11(1), 1-9.
Denzin, N. K., & Lincoln, Y. S. (Eds.). (2011). The Sage handbook of qualitative research.
Sage.
Dillon, A. (2002). Designing usable electronic text: Ergonomic aspects of human information usage.
CRC press.
Doolin, B. (1998). Information technology as disciplinary technology: being critical in
interpretive research on information systems. Journal of Information
Technology, 13(4), 301-311.
Etikan, I., & Bala, K. (2017). Sampling and sampling methods. Biometrics & Biostatistics
International Journal, 5(6), 00149.
Gottschalk, L. A. (2014). Content analysis of verbal behavior: New findings and clinical applications.
Routledge.
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for
qualitative research. Chicago: Aldine Publishing.
Heritage, J., & Maynard, D. W. (2006). Problems and prospects in the study of physician-patient
interaction: 30 years of research. Annu. Rev. Sociol., 32, 351-374.
Levitt, H. M., Bamberg, M., Creswell, J. W., Frost, D. M., Josselson, R., & Suárez-Orozco,
C. (2018). Journal article reporting standards for qualitative primary, qualitative meta-
analytic, and mixed methods research in psychology: The APA Publications and
Communications Board task force report. American Psychologist, 73(1), 26.
Lochmiller, C. R. (2016). Examining administrators' instructional feedback to high school
math and science teachers. Educational Administration Quarterly, 52(1), 75–109.
Lutabingwa, J., & Auriacombe, C. J. (2007). Data analysis in quantitative research. Journal
of Public Administration, 42(6), 528-548
Mackenzie, N., & Knipe, S. (2006). Research dilemmas: Paradigms, methods and
methodology. Issues in educational research, 16(2), 193-205.
Neuman, W. R., & Guggenheim, L. (2011). The evolution of media effects theory: A six-
stage model of cumulative research. Communication Theory, 21(2), 169-196.
Patel, M., & Patel, N. (2019). Exploring Research Methodology: Review
Article. International Journal of Research and Review, 6(3), 48-55.
Pearson, M., & Somekh, B. (2003). Concept-mapping as a research tool: A study of primary children's
representations of information and communication technologies (ICT). Education and
Information Technologies, 8(1), 5-22.
Regoniel, P. A. (2015). Quantitative methods: Meaning and characteristics. Retrieved from
https://simplyeducate.me/2015/01/03/quantitative-methods-meaning-and-
characteristics/
Rothman, K. J., Greenland, S., & Lash, T. L. (2008). Modern epidemiology (Vol. 3).
Philadelphia: Wolters Kluwer Health/Lippincott.
Saunders, M., Lewis, P., & Thornhill, A. (2012). Research methods for business students
(6th ed.). Pearson Education Limited.
Tashakkori, A., & Teddlie, C. (2010). Sage handbook of mixed methods in social and
behavioral research. SAGE publications.
Turns, J., Atman, C. J., & Adams, R. (2000). Concept maps for engineering education: A cognitively
motivated tool supporting varied assessment functions. IEEE Transactions on
Education, 43(2), 164-173.
Whyte, G., & Bytheway, A. (1996). Factors affecting information systems' success.
International Journal of Service Industry Management.
Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA:
Sage.
Yin, R. K. (2006). Mixed methods research: Are the methods genuinely integrated or merely
parallel? Research in the Schools, 13(1), 41-47.
Yin, R. K. (2009). Case study research: Design and methods (vol. 5). Sage Publishers.