Qualitative sampling design is a key step in qualitative research, especially for rural development researchers; this document details the procedures to follow. Sampling is the procedure or process of selecting some units that share common characteristics from the population, and is primarily concerned with collecting data from those selected units.
2. What is research?
Research is an organized and systematic way of identifying problems and finding answers to questions.
3. SAMPLING
• A sample is “a smaller (but hopefully representative) collection of units from a population used to determine truths about that population” (Field, 2005).
• Why sample?
  • Resources (time, money) and workload.
  • Gives results with known accuracy that can be calculated mathematically.
• The sampling frame is the list from which the potential respondents are drawn:
  • Registrar’s office
  • Class rosters
4. SAMPLING……
• What is your population of interest? To whom do you want to generalize your results?
  • All doctors
  • School children
  • Indians
  • Women aged 15-45 years
  • Other
• Can you sample the entire population?
7. Process
The sampling process comprises several stages:
• Defining the population of concern
• Specifying a sampling frame, a set of items or events possible to measure
• Specifying a sampling method for selecting items or events from the frame
• Determining the sample size
• Implementing the sampling plan
• Sampling and data collecting
9. PROBABILITY SAMPLING
• A probability sampling scheme is one in which every unit in the population has a chance (greater than zero) of being selected in the sample, and this probability can be accurately determined.
• When every element in the population has the same probability of selection, this is known as an 'equal probability of selection' (EPS) design. Such designs are also referred to as 'self-weighting' because all sampled units are given the same weight.
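The EPS idea can be illustrated with the simplest probability method, a simple random sample drawn without replacement, in which every unit has the same selection probability n/N. This is a minimal sketch; the population of numbered unit IDs is purely illustrative.

```python
import random

def simple_random_sample(population, n, seed=None):
    """Draw n units without replacement; every unit has probability n/N."""
    rng = random.Random(seed)
    return rng.sample(population, n)

# Illustrative frame of 1,000 unit IDs; draw an EPS sample of 50.
population = list(range(1, 1001))
sample = simple_random_sample(population, 50, seed=42)
print(len(sample))        # 50
print(len(set(sample)))   # 50 distinct units (without replacement)
```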
10. PROBABILITY SAMPLING…….
PROBABILITY SAMPLING INCLUDES:
• Simple Random Sampling
• Systematic Sampling
• Stratified Random Sampling
• Cluster Sampling
• Multistage Sampling
• Multiphase Sampling
11. NON-PROBABILITY SAMPLING
• Any sampling method where some elements of the population have no chance of selection (these are sometimes referred to as 'out of coverage'/'undercovered'), or where the probability of selection cannot be accurately determined. It involves the selection of elements based on assumptions regarding the population of interest, which form the criteria for selection. Because the selection of elements is non-random, non-probability sampling does not allow the estimation of sampling errors.
13. SYSTEMATIC SAMPLING
• Systematic sampling relies on arranging the target population according to some ordering scheme and then selecting elements at regular intervals through that ordered list.
• Systematic sampling involves a random start and then proceeds with the selection of every kth element from then onwards. In this case, k = (population size / sample size).
• It is important that the starting point is not automatically the first in the list, but is instead randomly chosen from within the first to the kth element in the list.
• A simple example would be to select every 10th name from the telephone directory (an 'every 10th' sample, also referred to as 'sampling with a skip of 10').
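The random-start-then-every-kth rule above can be sketched in a few lines of Python; the ordered frame of 100 unit IDs is hypothetical.

```python
import random

def systematic_sample(frame, sample_size, seed=None):
    """Random start in [0, k), then every k-th element, k = N // sample_size."""
    k = len(frame) // sample_size              # sampling interval ("skip")
    start = random.Random(seed).randrange(k)   # random start, NOT always the first
    return frame[start::k][:sample_size]

frame = list(range(1, 101))   # hypothetical ordered frame of 100 units
sample = systematic_sample(frame, 10, seed=1)
print(len(sample))            # 10 units, consecutive picks exactly k=10 apart
```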
14. SYSTEMATIC SAMPLING……
As described above, systematic sampling is an EPS method, because all elements have the same probability of selection (in the example given, one in ten). It is not 'simple random sampling' because different subsets of the same size have different selection probabilities - e.g. the set {4,14,24,...,994} has a one-in-ten probability of selection, but the set {4,13,24,34,...} has zero probability of selection.
15. SYSTEMATIC SAMPLING……
• ADVANTAGES:
  • The sample is easy to select.
  • A suitable sampling frame can be identified easily.
  • The sample is evenly spread over the entire reference population.
• DISADVANTAGES:
  • The sample may be biased if a hidden periodicity in the population coincides with that of the selection.
  • It is difficult to assess the precision of the estimate from one survey.
16. STRATIFIED SAMPLING
Where the population embraces a number of distinct categories, the frame can be organized into separate "strata." Each stratum is then sampled as an independent sub-population, out of which individual elements can be randomly selected.
• Every unit in a stratum has the same chance of being selected.
• Using the same sampling fraction for all strata ensures proportionate representation in the sample.
• Adequate representation of minority subgroups of interest can be ensured by stratification and by varying the sampling fraction between strata as required.
17. STRATIFIED SAMPLING……
• Finally, since each stratum is treated as an independent population, different sampling approaches can be applied to different strata.
• Drawbacks to using stratified sampling:
  • First, a sampling frame of the entire population has to be prepared separately for each stratum.
  • Second, when examining multiple criteria, stratifying variables may be related to some, but not to others, further complicating the design and potentially reducing the utility of the strata.
  • Finally, in some cases (such as designs with a large number of strata, or those with a specified minimum sample size per group), stratified sampling can potentially require a larger sample than would other methods.
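Proportionate stratified sampling (the same sampling fraction in every stratum) can be sketched as follows; the urban/rural household frame is hypothetical.

```python
import random
from collections import defaultdict

def stratified_sample(units, stratum_of, fraction, seed=None):
    """Proportionate stratified sampling: draw a simple random sample
    with the SAME sampling fraction within every stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for u in units:                       # organize the frame into strata
        strata[stratum_of(u)].append(u)
    sample = []
    for members in strata.values():       # each stratum sampled independently
        n = round(len(members) * fraction)
        sample.extend(rng.sample(members, n))
    return sample

# Hypothetical frame: 600 urban and 400 rural households, 10% fraction.
units = [("urban", i) for i in range(600)] + [("rural", i) for i in range(400)]
sample = stratified_sample(units, stratum_of=lambda u: u[0], fraction=0.1, seed=7)
print(len(sample))   # 60 urban + 40 rural = 100, proportionate to the strata
```

Varying `fraction` per stratum instead would oversample a minority subgroup, as the slide notes.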
19. CLUSTER SAMPLING
• The population is divided into clusters of homogeneous units, usually based on geographical contiguity.
• Sampling units are groups rather than individuals.
• A sample of such clusters is then selected.
• All units from the selected clusters are studied.
20. CLUSTER SAMPLING…….
There are two types of cluster sampling methods:
• One-stage sampling: all of the elements within selected clusters are included in the sample.
• Two-stage sampling: a subset of elements within selected clusters is randomly selected for inclusion in the sample.
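The one-stage/two-stage distinction can be sketched directly; the villages-of-households frame is hypothetical.

```python
import random

def one_stage_cluster_sample(clusters, n_clusters, seed=None):
    """One-stage: select clusters at random and study ALL units in them."""
    rng = random.Random(seed)
    chosen = rng.sample(list(clusters), n_clusters)
    return [unit for c in chosen for unit in clusters[c]]

def two_stage_cluster_sample(clusters, n_clusters, n_units, seed=None):
    """Two-stage: select clusters, then randomly subsample units within each."""
    rng = random.Random(seed)
    chosen = rng.sample(list(clusters), n_clusters)
    return [u for c in chosen for u in rng.sample(clusters[c], n_units)]

# Hypothetical clusters: 10 villages of 20 households each.
villages = {f"village_{v}": [f"v{v}_house_{h}" for h in range(20)]
            for v in range(10)}
print(len(one_stage_cluster_sample(villages, 3, seed=3)))     # 3 * 20 = 60
print(len(two_stage_cluster_sample(villages, 3, 5, seed=3)))  # 3 * 5 = 15
```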
21. MULTISTAGE SAMPLING
• In statistics, multistage sampling is the taking of samples in stages, using smaller and smaller sampling units at each stage. Multistage sampling can be a complex form of cluster sampling because it is a type of sampling which involves dividing the population into groups.
• First stage: a random number of districts is chosen in all states.
• This is followed by a random number of villages.
• Then the third-stage units will be houses.
• All ultimate units (houses, for instance) selected at the last step are surveyed.
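The states → districts → villages → houses example above can be sketched as nested random draws; the frame sizes are illustrative.

```python
import random

def multistage_sample(states, n_districts, n_villages, seed=None):
    """Three stages: districts within each state, villages within each
    chosen district, then ALL houses in the chosen villages."""
    rng = random.Random(seed)
    houses = []
    for districts in states.values():
        for d in rng.sample(list(districts), n_districts):        # stage 1
            for v in rng.sample(list(districts[d]), n_villages):  # stage 2
                houses.extend(districts[d][v])   # stage 3: ultimate units
    return houses

# Hypothetical frame: 2 states x 4 districts x 5 villages x 10 houses.
states = {
    s: {f"d{d}": {f"v{v}": [f"{s}-d{d}-v{v}-h{h}" for h in range(10)]
                  for v in range(5)}
        for d in range(4)}
    for s in ("A", "B")
}
sample = multistage_sample(states, n_districts=2, n_villages=2, seed=5)
print(len(sample))   # 2 states * 2 districts * 2 villages * 10 houses = 80
```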
22. QUOTA SAMPLING
• The population is first segmented into mutually exclusive sub-groups, just as in stratified sampling.
• Then judgment is used to select subjects or units from each segment based on a specified proportion.
• For example, an interviewer may be told to sample 200 females and 300 males between the ages of 45 and 60.
• It is this second step which makes the technique one of non-probability sampling.
• In quota sampling the selection of the sample is non-random. For example, interviewers might be tempted to interview those who look most helpful. The problem is that these samples may be biased because not everyone gets a chance of selection. This non-random element is its greatest weakness, and quota versus probability has been a matter of controversy for many years.
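The 200-female/300-male example can be sketched as quota filling: respondents are taken as they arrive (non-randomly) until each group's quota is met. The stream of respondents below is hypothetical.

```python
def quota_sample(stream, group_of, quotas):
    """Non-probability quota fill: accept arriving respondents until each
    group's quota is met; selection within groups is NOT random."""
    counts = {g: 0 for g in quotas}
    sample = []
    for person in stream:
        g = group_of(person)
        if g in counts and counts[g] < quotas[g]:
            sample.append(person)
            counts[g] += 1
        if counts == quotas:   # every quota filled
            break
    return sample

# Hypothetical stream of respondents, alternating female/male.
stream = [("female", i) if i % 2 == 0 else ("male", i) for i in range(1000)]
sample = quota_sample(stream, group_of=lambda p: p[0],
                      quotas={"female": 200, "male": 300})
print(len(sample))   # 500 once both quotas are filled
```

Note that whoever happens to arrive first is selected, which is exactly the bias the slide warns about.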
23. CONVENIENCE SAMPLING
• Sometimes known as grab, opportunity, or accidental sampling.
• A type of non-probability sampling which involves the sample being drawn from that part of the population which is close to hand, that is, readily available and convenient.
24. Snowball Sampling (Chain Referral)
In social science research, snowball sampling is a similar technique, where existing study subjects are used to recruit more subjects into the sample.
25. Judgmental or Purposive Sampling
• The researcher chooses the sample based on who they think would be appropriate for the study. This is used primarily when there is a limited number of people who have expertise in the area being researched.
27. Reliability
• Reliability is about the replicability of your research and the accuracy of the procedures and research techniques.
• Will the same results be repeated if the research is repeated?
• Are the measurements of the research methods accurate and consistent?
• Could they be used in other similar contexts with equivalent results?
• Would the same results be achieved by another researcher using the same instruments? Is the research free from error or bias on the part of the researcher, or the participants?
28. Validity
• How successfully has the research actually achieved what it set out to achieve?
• Can the results of the study be transferred to other situations?
• Does x really cause y? In other words, is the researcher correct in maintaining a causal link between these two variables?
• Have the findings really been accurately interpreted?
• Have other events intervened which might impact on the study?
29. Generalizability
• Are the findings applicable in other research settings?
• Can a theory be developed that can apply to other populations?
For example, can a particular study about dissatisfaction amongst lecturers in a particular university be applied generally? This is particularly applicable to research which has a relatively wide sample, as in a questionnaire, or which adopts a scientific technique, as with the experiment.