Absence of a gold standard in diagnostic test accuracy research
1. Absence of a gold standard in diagnostic test accuracy research
with application in the context of childhood TB
Maarten van Smeden, PhD
Post-doctoral researcher Julius Center for Health Sciences and Primary Care
WEON 2017 Pre-conference Accounting for Measurement Error in Epidemiology
Antwerp, June 7, 2017
2. Outline
• Diagnostic test accuracy
• The problem: absence of a gold standard
• Possible solution: latent class analysis in context of TB
6. Diagnostic testing
• “New test better than the existing test(s)?”
• “(Where to) add new test to diagnostic pathway?”
• “Recommend new test in practice guidelines?”
Fig from: Bossuyt, BMJ, 2006
7. Diagnostic test accuracy studies (DTA)
• Evaluation of “new” diagnostic tests (= index test) by comparison to a “gold standard”
• Misclassification probabilities of the index test: sensitivity, specificity, negative/positive predictive values, etc.
12. All that glitters is not gold
• Commonly the best available reference standard has Se < 1 and Sp < 1: not a “gold standard”, because of detection limits (e.g. culture), tests that are infeasible or unethical to perform in some patients (e.g. biopsy), observer errors (e.g. MRI), etc.
13. All that glitters is not gold
• Commonly the best available reference standard has Se < 1 and Sp < 1: not a “gold standard”
-> misclassifications of the target condition by the reference standard (= measurement error)
14. When using imperfect reference standard
Assuming: reference standard Se = 1, index test Sp = Se = 0.7, conditional independence of reference standard and index test
[Figure: expected sensitivity of the index test (y-axis, 0.3 to 0.7) against the specificity of the reference standard (x-axis, 0.5 to 1.0), for disease prevalence 0.05, 0.25, and 0.50: the lower the reference standard's specificity and the lower the prevalence, the further the apparent sensitivity falls below the true value of 0.7.]
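The relationship shown on this slide follows from a short calculation. A minimal sketch (plain Python; the function name is my own) of the expected apparent sensitivity of the index test when the reference standard is imperfect, under the slide's assumption of conditional independence:

```python
def apparent_sensitivity(prev, se_index, sp_index, se_ref, sp_ref):
    """Expected P(index test + | reference standard +), assuming the index
    test and reference standard are conditionally independent given disease."""
    # Reference-standard positives are a mix of diseased subjects (rate se_ref)
    # and non-diseased subjects falsely labelled positive (rate 1 - sp_ref):
    num = prev * se_index * se_ref + (1 - prev) * (1 - sp_index) * (1 - sp_ref)
    den = prev * se_ref + (1 - prev) * (1 - sp_ref)
    return num / den

# With a perfect reference standard, the true sensitivity (0.7) is recovered:
print(apparent_sensitivity(0.05, 0.7, 0.7, 1.0, 1.0))
# With reference specificity 0.5 at prevalence 0.05, it drops to about 0.34:
print(apparent_sensitivity(0.05, 0.7, 0.7, 1.0, 0.5))
```

Evaluating this over a grid of reference-standard specificities reproduces the figure: the bias grows as specificity and prevalence fall.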
15. When using imperfect reference standard
• Bias, sometimes called “reference standard bias”. Not necessarily a lower bound on Se/Sp
• Philosophical problems when the index test is believed to be more accurate than the best available reference standard
16. When using imperfect reference standard
Absence of a gold standard
Misclassifications by the reference standard -> no straightforward, valid approaches to estimating the misclassification probabilities of index tests
18. Tuberculosis (TB)
Paulsen, Nature, 2013
[Figure 2.16a from the WHO Global TB report 2015: top causes of death worldwide in 2012, in millions, with TB among the leading causes alongside ischaemic heart disease, stroke, lower respiratory infections, chronic obstructive pulmonary disease, lung cancers, diarrheal diseases, diabetes mellitus, HIV/AIDS, and road injury; deaths from TB among HIV-positive people shown separately.]
19. Data
• 749 hospitalised children with suspected pulmonary TB in Cape Town, South Africa
• Study procedures included a number of tests for TB for each subject:
• Microscopy
• Culture
• Xpert (NAAT)
• TST (skin test)
• Radiography
28. Heuristic model for TB data
• Conditional independence between all tests is unlikely
• Conditional dependence between Xpert, culture, microscopy, and TST among the TB diseased due to “bacterial load”
• Bacterial load modelled by a random effect
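The heuristic above can be made concrete numerically. A minimal sketch (not the model fitted in the talk; all parameter values are invented for illustration) of a two-class latent class model in which a shared subject-level random effect induces conditional dependence among the tests' sensitivities in the diseased class, with the random effect integrated out by Gauss-Hermite quadrature:

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def lik_pattern(y, prev, alpha, sp, sigma, n_quad=30):
    """Marginal probability of a test-result pattern y (0/1 per test) under a
    two-class latent class model where, among the diseased, sensitivities
    share a subject-level random effect b ~ N(0, sigma^2):
        Se_j(b) = logistic(alpha_j + b)   (conditional dependence)
    Among the non-diseased, tests are conditionally independent with
    specificities sp_j."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_quad)
    b = np.sqrt(2.0) * sigma * nodes               # transform to N(0, sigma^2)
    # P(pattern | diseased), integrating out the random effect:
    se = logistic(alpha[None, :] + b[:, None])     # n_quad x n_tests
    p_dis_b = np.prod(np.where(y[None, :] == 1, se, 1 - se), axis=1)
    p_dis = np.sum(weights * p_dis_b) / np.sqrt(np.pi)
    # P(pattern | non-diseased): plain conditional independence
    p_non = np.prod(np.where(y == 1, 1 - sp, sp))
    return prev * p_dis + (1 - prev) * p_non

# Illustrative values for four tests (e.g. Xpert, culture, microscopy, TST):
alpha = np.array([0.5, 1.0, -0.5, 0.8])
sp = np.array([0.99, 0.98, 0.97, 0.80])
print(lik_pattern(np.array([1, 1, 0, 1]), prev=0.3, alpha=alpha, sp=sp, sigma=1.5))
```

With sigma = 0 this reduces to the usual conditional-independence latent class model; a quick sanity check is that the probabilities over all 2^4 result patterns sum to one.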
32. Is latent class analysis useful?
• In the TB example, I believe: yes
• More realistic than assuming the reference standard (culture) has Se = Sp = 1
• Results ‘robust’ to changing prior distributions and conditional dependence structure
• Lack of robust alternative approaches for DTA in the absence of a gold standard
33. Is latent class analysis useful?
• But:
• Latent class analysis for DTA is still rare
34. Latent class analysis in diagnostic research
Systematic review from 2014
• 69 theoretical papers
• 64 applied papers in human research + 47 in veterinary sciences
• Applications of LCA are still not common in human diagnostic research
van Smeden, AJE, 2014
35. Is latent class analysis useful?
• But:
• Latent class analysis for DTA is still rare
• Robustness to misspecification of the conditional dependence structure is a concern
37. Is latent class analysis useful?
• But:
• Latent class analysis for DTA is still rare
• Robustness to misspecification of the conditional dependence structure is a concern
• Identifiability requirements
38. Why Bayesian?
• Practical arguments:
• Model specifications in non-commercial software packages (e.g. randomLCA vs rjags in R)
• (Weakly) informative prior distributions can solve non-identifiability problems
• Additional calculations (e.g. positive/negative predictive values with CrI)
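The last bullet follows directly from Bayes' rule. A minimal sketch (function name is my own; values illustrative); in a Bayesian fit this would be evaluated per posterior draw to obtain credible intervals:

```python
def predictive_values(prev, se, sp):
    """Positive and negative predictive values from prevalence,
    sensitivity, and specificity, via Bayes' rule."""
    ppv = prev * se / (prev * se + (1 - prev) * (1 - sp))
    npv = (1 - prev) * sp / ((1 - prev) * sp + prev * (1 - se))
    return ppv, npv

# Illustrative: Se = 0.90, Sp = 0.95 at 10% prevalence:
ppv, npv = predictive_values(0.10, 0.90, 0.95)
print(round(ppv, 3), round(npv, 3))  # -> 0.667 0.988
```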
39. Final remarks
• Misclassification in DTA studies is often both the primary topic of study (for the index test) and the problem (when occurring in the reference standard)
• Model-based estimation of index test accuracy by latent class analysis can be useful
• There is some evidence that robustness of the latent class model can be improved when disease status can be verified with certainty in a subset
• While the focus of this talk was on DTA, other studies such as “incremental value” studies suffer from the same problems
40. Acknowledgements
Thanks to all co-authors in:
Supported by a grant from the Canadian Institutes of Health Research (MOP #89857)