This document discusses statistical analysis and errors in measurement. It defines statistical analysis as dealing with numerical data using probability theory. Measurement errors can be divided into determinate (systematic) errors and indeterminate (random) errors. Determinate errors can be avoided or corrected, while indeterminate errors cannot be determined precisely but their probability can be estimated using statistical distributions like the Gaussian curve. The document also discusses concepts like significant figures, rounding off data, measures of central tendency (mean, median, mode), standard deviation, tests like F-test and T-test, quality control/quality assurance, good laboratory practices, validation of analytical methods and their parameters.
2. What is statistical analysis?
The science that deals with the collection, analysis, and interpretation of numerical data, often using probability theory.
The term is also used for the data themselves.
Data always contain errors, both human and instrumental. Some of these errors can be corrected or avoided, but others cannot, because they are indeterminate.
4. Measurement errors can be divided into two components:
Random errors
Systematic errors
5. Systematic (non-random) error
These errors have some order; they follow a pattern.
A systematic error is a repeatable error associated with faulty equipment or a flawed experiment design. These errors are usually caused by measuring instruments that are incorrectly calibrated or are used incorrectly.
6. Determinate errors (systematic errors) are those that, as the name implies, are determinable, meaning they can be corrected or avoided.
(1) Mis-calibration of apparatus. This can be removed by checking the apparatus against a standard.
(2) Faulty observation. This is avoidable, and therefore should not be cited as a source of error in any well-performed experiment.
7. The error can be proportional to sample size or may change in a more complex manner.
Variations are usually unidirectional, as in the case of loss of precipitate due to its solubility.
Another example is the change in solution volume and concentration occurring with changes in temperature; this can be corrected for by measuring the solution temperature.
Such measurable determinate errors are classed as systematic errors.
8. Indeterminate errors (random errors)
Indeterminate errors are always random and cannot be avoided.
Random (indeterminate) error is caused by inherently unpredictable fluctuations in the readings of a measurement apparatus or in the experimenter's interpretation of the instrumental reading.
9. Random errors are often called accidental errors.
Random errors are indeterminate, which means they cannot be determined or calculated individually. However, we can draw conclusions about these random errors by applying mathematical, more precisely statistical, rules.
10. Gaussian curve
The statistical rule applied to draw conclusions about random errors is the normal distribution curve, often called the Gaussian distribution curve.
[Figure: Gaussian curve; y-axis: probability density function (pdf), x-axis: deviation of the variable from the mean, in standard deviations]
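As a rough illustration (a minimal sketch in Python assuming SciPy is available; none of this is in the original slides), the Gaussian distribution tells us what fraction of random errors is expected to fall within a given number of standard deviations of the mean:

from scipy.stats import norm

# Fraction of random (indeterminate) errors expected within +/- k standard
# deviations of the mean, under a Gaussian (normal) distribution.
for k in (1, 2, 3):
    fraction = norm.cdf(k) - norm.cdf(-k)   # area under the pdf between -k and +k sigma
    print(f"within +/-{k} sigma: {fraction:.1%}")
# prints approximately 68.3%, 95.4% and 99.7%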
11. Significant figures
Definition
The number of digits necessary to express the result of a measurement consistent with the measured precision,
since there is an uncertainty of at least ±1 in the last significant figure of any measurement.
Rules
Non-zero digits are always significant.
Any zeros between two significant digits are significant.
Leading zeros are never significant, while trailing zeros to the right of the decimal point are significant.
12. For example
0.216 Three significant figures
90.7 Three significant figures
800.0 Four significant figures
0.0670 Three significant figures
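A minimal sketch in Python (the string-based function below is my own illustration, not part of the slides) that applies the rules above to these examples:

def count_sig_figs(value: str) -> int:
    # Apply the rules above: non-zero digits and captive zeros are significant,
    # leading zeros are not, trailing zeros after a decimal point are significant.
    digits = value.lstrip("+-")        # drop any sign
    has_decimal_point = "." in digits
    digits = digits.replace(".", "")   # keep only the digit characters
    digits = digits.lstrip("0")        # leading zeros are never significant
    if not has_decimal_point:
        digits = digits.rstrip("0")    # trailing zeros in a bare integer are treated as not significant
    return len(digits)

for example in ("0.216", "90.7", "800.0", "0.0670"):
    print(example, count_sig_figs(example))   # 3, 3, 4, 3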
13. Rounding off data
Definition
Rounding off is a kind of estimating. To round off decimals: find the place value you want (the "rounding digit") and look at the digit just to the right of it. If that digit is less than 5, do not change the rounding digit but drop all digits to the right of it; if it is 5 or greater, increase the rounding digit by one and drop the digits to the right.
For example
4.7892 rounded to one decimal place (two significant figures) = 4.8
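In Python (a small illustration, not part of the slides), the built-in round() gives the same result for this example:

# 4.7892 to one decimal place: the digit to the right of the rounding digit is 8,
# so the rounding digit 7 is increased by one.
print(round(4.7892, 1))   # 4.8
print(round(4.7432, 1))   # 4.7 (the digit to the right is 4, so the rounding digit is unchanged)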
14. Mean, Median, Mode
Mean
The sum of a group of measurements divided by the number of measurements; the average.
Mode
The most frequently occurring value in the group of measurements.
Example: 2, 3, 4, 2, 5, 2, 7, 5, 2, so the mode is 2.
Median
The middle value of an ordered data set; if the data set has an even number of values, the median is the average of the two middle values.
Example: 2, 4, 6, 8, 10, so the median is 6.
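A minimal sketch using Python's standard statistics module (an illustration added here, not in the slides), applied to the mode example above:

import statistics

data = [2, 3, 4, 2, 5, 2, 7, 5, 2]
print(statistics.mean(data))     # arithmetic mean, about 3.56
print(statistics.median(data))   # middle value of the sorted data -> 3
print(statistics.mode(data))     # most frequently occurring value -> 2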
15. What is standard deviation?
For a series of n measurements of the same measurand, the standard deviation s characterizes the dispersion of the results:
s = √( Σ(xᵢ − x̄)² / (n − 1) )
where xᵢ are the individual measurements, x̄ is the mean, and n is the number of measurements.
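A minimal sketch in Python (with hypothetical replicate values chosen only for illustration) of computing the sample standard deviation defined above:

import statistics

replicates = [4.28, 4.21, 4.30, 4.36, 4.26]   # hypothetical replicate measurements
mean = statistics.mean(replicates)
s = statistics.stdev(replicates)              # sample standard deviation, divides by n - 1
print(f"mean = {mean:.3f}, s = {s:.3f}")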
16. Tests
F-test
Used to determine whether two variances are significantly different from each other:
F = s₁² / s₂² (with the larger variance in the numerator)
t-test
Used to determine whether two sets of results are significantly different.
Q-test
Used to determine whether an outlier is due to a determinate error. If it is not, the value falls within the expected random error and should be retained.
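The sketch below (Python with NumPy and SciPy, on two hypothetical sets of replicate results; nothing here comes from the slides) illustrates the three tests: the F-test as a ratio of sample variances, the t-test via scipy.stats, and Dixon's Q computed directly as gap divided by range.

import numpy as np
from scipy import stats

set_a = np.array([10.2, 10.5, 10.3, 10.4])   # hypothetical replicates, method A
set_b = np.array([10.0, 10.6, 10.1, 10.7])   # hypothetical replicates, method B

# F-test: ratio of the two sample variances, larger variance in the numerator
var_a, var_b = set_a.var(ddof=1), set_b.var(ddof=1)
F = max(var_a, var_b) / min(var_a, var_b)

# t-test: are the two means significantly different?
res = stats.ttest_ind(set_a, set_b)

# Q-test: is the largest value of set_b an outlier?  Q = gap / range
sorted_b = np.sort(set_b)
Q = (sorted_b[-1] - sorted_b[-2]) / (sorted_b[-1] - sorted_b[0])

print(f"F = {F:.2f}, t = {res.statistic:.2f} (p = {res.pvalue:.3f}), Q = {Q:.2f}")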
17. QC/QA
What constitutes a good laboratory practice (GLP) depends on who is defining it and for what purpose. A good laboratory should have:
Management
Personnel
Facilities
Equipment
Operation
Method validation
Quality assurance
18. Good laboratory practices have been established by
worldwide bodies such as the Organization for
Economic Cooperation and Development (OECD)
and the International Organization for
Standardization (ISO). Government agencies have
adopted them for their purposes as rules that must
be followed for laboratories involved in analyzing
substances that require regulation. Examples are
pharmaceutical formulations, foods, and
environmentally important samples.
19. GLPs
GLPs (good laboratory practices) are a body of rules, operating procedures, and practices established by a given organization that are considered mandatory, with a view to ensuring quality and correctness in the results produced by a laboratory.
GLP ensures that correct results are reported.
20. The laboratory should have two things:
1. SOPs (standard operating procedures)
2. QAU (quality assurance unit)
21. SOPs
Standard operating procedures provide detailed descriptions of activities performed by the laboratory:
1) sample custody chain
2) sample handling and preparation
3) the analytical method
4) instrument maintenance
5) record keeping
22. QAU (quality assurance unit)
The QAU is responsible for assuring good
laboratory practices are implemented.
Everyone in the lab is responsible for following
them.
24. 6) Range
7) Limit of detection (LOD)
8) Limit of quantitation
9) Ruggedness or robustness
25. Selectivity
Selectivity is the extent to which the method can measure the analyte of interest in the matrix of the sample being analyzed, without interference from the matrix. The matrix effect may be either positive or negative.
Matrix
The matrix is everything in a sample except the actual analyte.
29. The sensitivity is determined by the slope of the calibration curve and generally reflects the ability to distinguish between two different concentrations.
It can be assessed by measuring the slope, or by measuring samples of closely related concentrations at high, intermediate, and low levels.
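A minimal sketch (Python/NumPy with hypothetical calibration data) of estimating sensitivity as the slope of a calibration curve fitted by least squares:

import numpy as np

conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])          # standard concentrations, hypothetical units
signal = np.array([0.01, 0.20, 0.41, 0.59, 0.80])   # measured instrument responses, hypothetical

slope, intercept = np.polyfit(conc, signal, 1)      # first-degree fit: signal = slope*conc + intercept
print(f"sensitivity (calibration slope) = {slope:.4f} signal units per concentration unit")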
31. Range
Range is the interval between the upper and lower concentrations of analyte in the sample.