This document discusses methods and tools for data collection in research. It describes primary and secondary data sources, as well as quantitative and qualitative data. Several data collection methods are outlined, including observation, interviews, questionnaires, and library or laboratory research. The document emphasizes that there is no single best method: researchers must consider their purpose, respondents, resources, and the type of data needed. Both structured and semi-structured approaches are described. The importance of sampling techniques and of combining multiple data collection methods (triangulation) is also highlighted.
Dear viewers, check out my other works at https://healthkura.com
Data Collection (Methods/ Tools/ Techniques), Primary & Secondary Data, Assessment of Qualitative Data, Qualitative & Quantitative Data, Data Processing
Presentation Contents:
- Introduction to data
- Classification of data
- Collection of data
- Methods of data collection
- Assessment of qualitative data
- Processing of data
- Editing
- Coding
- Tabulation
- Graphical representation
For anyone interested in research-related topics, particularly data collection, this presentation is a useful reference.
For Further Reading
- Biostatistics by Prem P. Panta
- Fundamentals of Research Methodology and Statistics by Yogesh K. Singh
- Research Design by J. W. Creswell
- Internet
Methods of Data Collection in Quantitative Research (Biostatistik), by AKak Long
DEFINITION: Quantitative research is defined as the systematic investigation of phenomena by gathering quantifiable data and applying statistical, mathematical, or computational techniques.
Quantitative research gathers information from existing and potential customers using sampling methods and instruments such as online surveys, online polls, and questionnaires, the results of which can be expressed numerically.
These numbers can then be interpreted to predict the future of a product or service and to make changes accordingly.
Data collection is the process of gathering and measuring information on variables of interest in an established, systematic fashion that enables one to answer research questions, test hypotheses, and evaluate outcomes.
Importance of data collection:
- Helps us search for answers and resolutions
- Facilitates and improves decision-making processes and the quality of the decisions made
Types of quantitative research:
1. Survey research
Data are collected by asking individuals questions in person, on paper, by phone, or online.
2. Correlational research
Measures two variables and assesses the statistical relationship between them, with no influence from extraneous variables.
3. Causal-comparative research
Seeks the relationship between independent and dependent variables after an action or event has already occurred.
4. Experimental research
The researcher manipulates one variable and controls/randomizes the rest.
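As an illustration of correlational research, here is a minimal Python sketch that computes the Pearson correlation coefficient between two measured variables. The function name and the paired data are invented for the example, not taken from the presentation:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired measurements, e.g. hours studied vs. test score
hours = [1, 2, 3, 4, 5]
score = [52, 55, 61, 70, 74]
print(round(pearson_r(hours, score), 3))  # → 0.987, a strong positive correlation
```

A value near +1 or −1 indicates a strong linear relationship; note that correlation alone does not establish causation.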
- Concept of sampling
- Purpose of sampling
- Stages of the sampling process
- Types of sampling: probability and non-probability
- Sampling error and bias
- Determination of sample size
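The last item, sample-size determination, is commonly done with Cochran's formula for estimating a proportion, n = z²·p(1−p)/e². A minimal Python sketch follows; the confidence level, expected proportion, and margin of error are illustrative choices, not values from the presentation:

```python
import math

def cochran_n(z, p, e):
    """Cochran's sample size for estimating a proportion.

    z: z-score for the desired confidence level (1.96 for 95%)
    p: expected proportion (0.5 is the most conservative choice)
    e: acceptable margin of error
    """
    return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

# 95% confidence, maximum variability, 5% margin of error
print(cochran_n(1.96, 0.5, 0.05))  # → 385
```

Tightening the margin of error grows the required sample quadratically: halving e roughly quadruples n.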
Introduction
Study design in qualitative research
Method of data collection
Handling qualitative data
Analyzing qualitative data
Presenting the results of qualitative research
2. The research process:
- Define research problem
- Literature review (review concepts & theories; review previous research findings)
- Formulate hypothesis
- Research design (including sample design)
- Data collection (using data collection tools: observation, interview, questionnaire, etc.)
- Data analysis
- Interpret and report
3. Data
According to the source:
1. Primary sources: primary data
2. Secondary sources: secondary data
According to the nature:
1. Quantitative data
2. Qualitative data
4. Primary Data
• Data never gathered before
• Advantage: you can collect exactly the data you need for your purpose
• Disadvantage: usually more costly and time-consuming than collecting secondary data
• Collected after secondary data has been collected
6. Secondary Data
• Data gathered by other sources
• Secondary data is gathered BEFORE primary data. Why? Because you want to find out what is already known about a subject/problem (through the literature review) before you dive into your own investigation.
• Why? Because some of your questions may already have been answered by other investigators or authors. Why "reinvent the wheel"?
8. Data Collection Strategies
• No one best way: the decision depends on:
– What you need to know: numbers or stories
– Where the data reside: environment, files, people
– Resources and time available
– Complexity of the data to be collected
– Frequency of data collection
– Intended forms of data analysis
18. When choosing methods, consider…
• The purpose of your research: will the method allow you to gather information that can be analyzed and presented in a way that will be credible and useful to you and others?
• The respondents: what is the most appropriate method, considering how the respondents can best be reached, how they might best respond, literacy, cultural considerations, etc.?
19. Consider…
• What kind of data your stakeholders will find most credible and useful
• Resources available: time, money, and staff to design, implement, and analyze the information. What can you afford?
• Type of information you need: numbers, percentages, comparisons, stories, etc.
20. Consider…
• Interruptions to the program or participants: which method is likely to be least intrusive?
• Advantages and disadvantages of each method
• The importance of ensuring cultural appropriateness
21. Sampling Techniques
Population: the total group of respondents that the researcher wants to study. Populations are usually too costly and time-consuming to study in their entirety.
Sample: the respondents (research participants) selected and surveyed from the population.
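The population/sample idea above can be sketched in Python using simple random sampling, the basic probability sampling design in which every member of the sampling frame has an equal chance of selection. The frame of 500 numbered respondents is a made-up example:

```python
import random

random.seed(42)  # fixed seed so the draw is reproducible

# Hypothetical sampling frame: 500 numbered respondents
population = list(range(1, 501))

# Simple random sample of 20, drawn without replacement:
# every respondent has an equal probability of selection
sample = random.sample(population, 20)
print(len(sample), len(set(sample)))  # prints: 20 20
```

Because `random.sample` draws without replacement, no respondent appears twice; other probability designs (systematic, stratified, cluster) build on this same equal-chance principle.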
22. METHODS
The various methods of data gathering involve the use of appropriate recording forms. These are called the tools or instruments of data collection. They consist of:
Library research
1. Analysis of historical records
2. Analysis of documents
Laboratory research
1. Small-group study of random behavior, play and role analysis
23. METHODS
Field research
1. Non-participant direct observation
2. Participant observation
3. Mass observation
4. Questionnaire
5. Personal interview
6. Group interview
7. Focused interview
8. Telephone survey
9. Case study & life history, etc.
24. Often, it is better to use more than one data collection method: triangulation.
25. The use of triangulation in qualitative research
Triangulation refers to the use of multiple methods or data sources in qualitative research to develop a comprehensive understanding of phenomena (Patton, 1999).
Triangulation has also been viewed as a qualitative research strategy to test validity through the convergence of information from different sources.
Denzin (1978) and Patton (1999) identified four types of triangulation:
(a) method triangulation,
(b) investigator triangulation,
(c) theory triangulation, and
(d) data source triangulation.