This year’s Nobel prize in physics: homage to John Bell.
Richard Gill
Mathematical Institute, Leiden University.
Focussing on statistical issues, I will first sketch the history initiated by John Bell’s landmark 1964 paper “On the Einstein Podolsky Rosen paradox”, which led to the 2022 Nobel prize awarded to John Clauser, Alain Aspect and Anton Zeilinger,
https://www.nobelprize.org/prizes/physics/2022/press-release/
A breakthrough in this history was the four successful “loophole-free” Bell experiments of 2015 and 2016 in Delft, Munich, NIST and Vienna. These experiments pushed quantum technology to the limit and paved the way for DIQKD (“Device Independent Quantum Key Distribution”) and a quantum internet. They were the first successful implementations of the ideal experimental protocol described by John Bell in his 1981 masterpiece "Bertlmann's socks and the nature of reality", and depended on brilliant later innovations: Eberhard’s discovery that less entanglement can allow a stronger manifestation of quantum non-locality, and Zeilinger’s discovery of quantum teleportation, which allows entanglement between photons to be transferred to entanglement between ions or atoms, and ultimately to components of manufactured semiconductors.
I will also discuss reanalyses of the 2015+ experiments, which could have allowed the experimenters to claim even smaller p-values than the ones they published,
https://arxiv.org/abs/2209.00702 "Optimal statistical analyses of Bell experiments"
Better slides: https://www.slideshare.net/gill1109/nobelpdf-253673329
In this talk I want to explain to you Bell’s inequality and Bell’s theorem. John Bell (1964) literally added a new twist — of 45 degrees, actually — to the famous EPR argument (Einstein, Podolsky, Rosen, 1935), which was supposed to show the incompleteness of quantum mechanics and vanquish Bohr. Bell’s twist apparently showed that either quantum mechanics is wrong, or nature is shockingly non-local. His findings have apparently been vindicated, in favour of quantum non-locality, by experiment; the most famous experiment being that of Alain Aspect et al. (1982), who tested the so-called Bell-CHSH inequality (Clauser, Horne, Shimony and Holt’s 1969 variant).
However, that experiment (and, actually, all other experiments to date!) has slight defects, so-called loopholes. They actually did the wrong experiment, because if they had done the right experiment, they wouldn’t have got the results they were looking for! Only now, 50 years on, are experimenters on the threshold of definitively proving quantum non-locality … as far as experiment ever proves anything. In particular, Delft is in the race, and there are even plans to perform the experiment with Alice and Bob (it’s always about Alice and Bob) located in Leiden and Delft.
Enter: probability and statistics. Rutherford famously said “if you need statistics, you did the wrong experiment” (Rutherford was wrong, several times in fact).
Richard's aventures in two entangled wonderlandsRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
A tale of two Lucys - Delft lecture, March 4, 2024
Richard Gill
TU Delft Seminar Probability & Statistics, 4 March 2024
15:45-16:45, Lecture Hall D@TA
Lucia de Berk, a Dutch nurse, was arrested in 2001, and tried and convicted of serial murder of patients in her care. At a lower court the only hard evidence against her was the result of a probability calculation: the chance that she was present at so many suspicious deaths and collapses in the hospitals where she had worked was 1 in 342 million. During appeal proceedings at a higher court, the prosecution shifted gears and gave the impression that there was now hard evidence that she had killed one baby. Having established that she was a killer and a liar (she claimed innocence), it was not difficult to pin another 9 deaths and collapses on her. No statistics were needed any more. In 2005 the conviction was confirmed by the supreme court. But at the same time, some whistleblowers started getting attention from the media. A long fight for the hearts and minds of the public, and a long fight to have the case reopened (without any new evidence, only new scientific interpretation of existing evidence), began, and ended in 2010 with Lucia’s complete exoneration. A number of statisticians played a big role in that fight. The idea that the conviction was purely based on objective scientific evidence was actually an illusion. This needed to be explained to journalists and to the public, and the judiciary needed to be convinced that something had to be done about it.

Lucy Letby, an English nurse, was arrested in 2020 for the murder of a large number of babies at a hospital in Chester, UK, in January 2015 to June 2016. Her trial started in 2022 and took 10 months. She was convicted and given a whole life sentence in 2023. In my opinion, the similarities between the two cases are horrific. Again there is statistical evidence: a cluster of unexplained bad events, and Lucy was there every time; there is apparently irrefutable scientific evidence for two babies; and just like with Lucia de Berk, there are some weird personal and private writings which can be construed as a confession. For many reasons, the chances of a fair retrial for Lucy Letby are very thin indeed, but I am convinced she is innocent and that her trial was grossly unfair.
Optimal statistical analysis of Bell experiments
Richard D Gill (Mathematical Institute, Leiden University)
The 2022 Nobel prize in physics went to Clauser, Aspect and Zeilinger “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”. I played a modest part in the process which led up to that prize by contributing statistical methodology used in the four decisive “loophole-free” experiments of 2015 and 2016. What I contributed was quite simply the idea of using randomisation in order to get guaranteed statistical validity, and martingale methods which allowed the experimenters to rule out the notion that an apparent violation of Bell’s inequality could simply be due to time trends in physical parameters over the course of an experiment which takes days to complete (confounding of treatment with time). Most recently I have studied some simple methods to reduce noise in the usual ad hoc estimators of the four correlations which figure in Bell’s inequality. Do not fear: the statistical model is very simple; no knowledge of quantum mechanics is needed to understand the statistical issues. The talk is about the statistical analysis of four 2×2 tables (a sketch follows below).
https://www.mdpi.com/2673-9909/3/2/23
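As a small illustration of that last sentence, here is a sketch (my own illustration of the simple ad hoc estimator the abstract alludes to, with hypothetical counts; it is not the optimised analysis of the paper linked above) of how four 2×2 tables of outcome counts become an estimate of S with a standard error:

```python
import numpy as np

# Each setting pair (a, b) yields a 2x2 table of outcome counts N[x][y],
# rows x = +1/-1, columns y = +1/-1. Estimate the correlation E(XY | a, b)
# and its (binomial) variance, then combine into the CHSH statistic S.
def corr(N):
    N = np.asarray(N, dtype=float)
    n = N.sum()
    E = (N[0, 0] + N[1, 1] - N[0, 1] - N[1, 0]) / n
    return E, (1 - E**2) / n        # variance of a mean of +/-1 products

# Hypothetical counts, roughly matching the quantum predictions:
tables = {(1, 1): [[43, 7], [7, 43]], (1, 2): [[43, 7], [7, 43]],
          (2, 1): [[43, 7], [7, 43]], (2, 2): [[7, 43], [43, 7]]}
E, V = {}, {}
for ab, N in tables.items():
    E[ab], V[ab] = corr(N)
S = E[1, 1] + E[1, 2] + E[2, 1] - E[2, 2]
se = np.sqrt(sum(V.values()))
print(f"S = {S:.2f} +/- {se:.2f}; (S - 2)/se = {(S - 2) / se:.1f}")
```

The optimal analyses of the paper improve on exactly this estimator, for instance by exploiting the no-signalling constraints to reduce its variance.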
Subtitle: "Statistical issues in the investigation of a suspected serial killer nurse"
Abstract:
Investigating a cluster of deaths on a hospital ward is a difficult task for medical investigators, police, and courts. Patients do die in hospitals (the three most common causes of death in a hospital are, in order: cancer, heart disease, medical errors). Often such cases come to the attention of police investigators for two reasons: gossip about a particular nurse is circulating just as a couple of unexpected and disturbing events occur. Hospital investigators see a pattern and call in the police.
I will discuss two such cases with which I have been intensively involved. The first one is the case of the Dutch nurse Lucia de Berk. Arrested in 2001, convicted by a succession of courts up to the supreme court by 2005, after which a long fight started to get her a re-trial. She was completely exonerated in 2010. The second case is that of the English nurse Lucy Letby. Arrested in 2018, 2019 and 2020 for murders taking place in 2015 and 2016. Her trial started in 2022 and concluded with a “full life” sentence a couple of months ago.
There are many similarities between the two cases, but also a couple of disturbing differences. One difference being that Lucy Letby’s lawyers seem to have made no attempt whatsoever to defend her. Another difference is that statistics was used against Lucia de Berk but not, apparently, against Lucy Letby. But appearances are not always what they seem.
Report published by Royal Statistical Society on statistical issues in these cases
https://rss.org.uk/news-publication/news-publications/2022/section-group-reports/rss-publishes-report-on-dealing-with-uncertainty-i/
News feature in “Science” about me and my work
https://www.science.org/content/article/unlucky-numbers-fighting-murder-convictions-rest-shoddy-stats
https://www.maths.lu.se/kalendarium/?evenemang=statistics-seminar-statistical-issues-investigation-suspected-serial-killer-nurse-richard-gill
Video: https://www.youtube.com/watch?v=RxmFLKTlim8
The RSS has published a report tackling statistical bias in criminal trials where healthcare professionals are accused of murdering patients. Following several high-profile cases where statistical evidence has been misused, the Society calls for all parties in such cases to consult with professional statisticians and use only expert witnesses who are appropriately qualified.
The report, ‘Healthcare serial killer or coincidence?’, is produced by the RSS’s Statistics and the Law Section. The group evolved from a working group of the same name set up in the early 2000s, after the Society wrote to the Lord Chancellor and made a statement setting out concerns around the use of statistical evidence in the case of Sally Clark.
According to the report, suspicions about medical murder often arise due to a surprising or unexpected series of events, such as an unusual number of deaths among patients under the care of a particular professional.
The RSS has major concerns about use of this kind of evidence in a criminal investigation: first, over the analysis and interpretation of such data, and secondly over whether it can be guaranteed that the data have been compiled in an objective and unbiased manner.
I discuss various statistical analyses of the recent Bell experiment of Storz et al. (2023, Nature) at ETH Zurich. Both standard and novel analyses under different assumptions result in almost identical conclusions. This suggests strongly that those assumptions are actually satisfied.
Statistics, causality, and the 2022 Nobel prizes in physics.
Richard Gill
Leiden University
The 2022 Nobel prize in physics was awarded to John Clauser, Alain Aspect and Anton Zeilinger "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”. I will explain each of these three gentlemen’s contributions and point out connections to classical statistical causality and probabilistic coupling. It seems that the first commercial application of this work will be a technology called DIQKD: "device independent quantum key distribution". Alice and Bob are far apart and need to establish a shared cryptographic key so as to send one another some securely encrypted messages over public communication channels. How can they create a suitable key while far apart from one another, and only able to communicate using classical means and over public channels?
Healthcare serial killer or coincidence?
Richard Gill
Mathematical Institute, Leiden University
Abstract: The UK’s Royal Statistical Society recently published a report tackling statistical bias in criminal trials where healthcare professionals are accused of murdering patients. Following several high-profile cases where statistical evidence has been misused, the Society calls for all parties in such cases to consult with professional statisticians and use only expert witnesses who are appropriately qualified.
The RSS report came out just two weeks before the start of the trial in Manchester of a nurse called Lucy Letby. The trial is still ongoing. So far, neither side has called for evidence from experts in statistics. The core of the prosecution case is that so many odd events connected to nurse LL cannot be a coincidence.
I will discuss the challenges, both procedural and conceptual, which arise when presenting statistical thinking as evidence in criminal trials.
https://rss.org.uk/news-publication/news-publications/2022/section-group-reports/rss-publishes-report-on-dealing-with-uncertainty-i/
We analyse data from the final two years of a long-running and influential annual Dutch survey of the quality of Dutch New Herring served in large samples of consumer outlets. The data was compiled and analysed by a university econometrician whose findings were publicized in national and international media. This led to the cessation of the survey amid allegations of bias due to a conflict of interest on the part of the leader of the herring-tasting team. The survey organizers responded with accusations of failure of scientific integrity. The econometrician was acquitted of wrongdoing by the Dutch authority, whose inquiry nonetheless concluded that further research was needed. We reconstitute the data and uncover important features which throw new light on the econometrician’s findings, focussing on the issue of correlation versus causality: the sample is definitely not a random sample. Taking account both of the newly discovered data features and of the sampling mechanism, we conclude that there is no evidence of biased evaluation, despite the econometrician’s renewed insistence on his claim.
A solution to the measurement problem based on Belavkin’s theory of Eventum Mechanics. There is only Schrödinger’s equation and a unitary evolution of the wave function of the universe, but we must add a Heisenberg cut to separate the past from the future (to separate particles from waves): Belavkin’s eventum mechanics. The past is a commuting sub-algebra A of the algebra of all observables B and, in the Heisenberg picture, the past history of any observable in A is also in A. Particles have definite trajectories back into the past; eventum mechanics defines the probability distributions of the future given the past. https://arxiv.org/abs/0905.2723 “Schrödinger's cat meets Occam's razor” (version 3: 10 Aug 2022); to appear in Entropy
The successful loophole-free Bell experiments of 2015 and 2016 in Delft, Munich, NIST and Vienna were milestone achievements. They pushed quantum technology to the limit and paved the way for DIQKD and a quantum internet. They were the first successful implementations of the ideal experimental protocol described by John Bell in his 1981 masterpiece "Bertlmann's socks and the nature of reality" (https://cds.cern.ch/record/142461/files/198009299.pdf). Still, there is a vociferous but small community of proponents of local realism, who continue to grasp at the straws offered by some shortcomings of the 2015+ experiments. I'll discuss the loopholes in the loophole-free experiments, explain how the experimenters could have claimed much smaller p-values "for free" by using standard likelihood-based inference, but also relativise the meaning of a 25-standard-deviation violation of local realism. I suggest that particle physicists have wisely settled on 5 standard deviations because of Chebyshev's inequality and the fact that 1/5² = 0.04 is just significant at the 5% level (spelled out below).
https://arxiv.org/abs/2209.00702 "Optimal statistical analyses of Bell experiments"
https://arxiv.org/abs/2208.09930 "Kupczynski's contextual setting-dependent parameters offer no escape from Bell-CHSH"
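The closing arithmetic can be spelled out (my gloss on the abstract's remark): for any random variable X with mean μ and finite standard deviation σ, Chebyshev's inequality gives

\[
P\bigl(|X - \mu| \ge 5\sigma\bigr) \;\le\; \frac{\sigma^2}{(5\sigma)^2} \;=\; \frac{1}{25} \;=\; 0.04 \;<\; 0.05,
\]

so a 5-standard-deviation result is significant at the 5% level without any normality assumption.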
Epidemiology Meets Quantum: Statistics, Causality, and Bell's Theorem
1. Statistical thinking in “the experiment of the century”
• “If you need statistics, you did the wrong experiment” – Ernest Rutherford
Ernest Rutherford, 1st Baron Rutherford of Nelson, OM, FRS (30 August 1871 – 19 October 1937) was a New Zealand-born British physicist who became known as the father of nuclear physics. Encyclopædia Britannica considers him to be the greatest experimentalist since Michael Faraday.
2. Towards a definitive and successful Bell-type experiment
Bell (1964): quantum mechanics is incompatible with local hidden variables, aka local realism.
Experimental evidence: Aspect (1982), Weihs (1998), Giustina (2013), Christensen (2013) … but not good enough!
Why? Technological limitations → loopholes → statistical issues familiar in epidemiology!
3. [Slide figure: spacetime diagram of one trial. Alice and Bob are separated by about 1200 m (4000 ft); on Alice's side setting A goes in and outcome X comes out, on Bob's side setting B goes in and outcome Y comes out, all within a 4 μs time window. At the speed of light (1 foot per nanosecond), nothing can travel from one side to the other within a single trial.]
4. What is this QM?
• QM models the probability distribution of measurement outcomes; it does not tell us what actually happens
• Measurement outcomes – orthogonal subspaces of complex Hilbert space
• Quantum states – unit vectors in Hilbert space
• Composite systems, measurements, modelled by tensor product
• The probability of a particular outcome = squared length of the projection of the state vector onto the corresponding subspace. Pythagoras!
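The “squared length of projection” rule can be checked numerically. Below is a minimal sketch (my own illustration, not from the slides): the two-qubit singlet state measured with spin observables at angles chosen to reproduce the 0.85/0.15 probabilities promised later, on slide 8. The specific angle choice is an assumption made for the illustration.

```python
import numpy as np

# Singlet state of two qubits, (|01> - |10>) / sqrt(2),
# in the basis |00>, |01>, |10>, |11>.
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)

def spin_projector(theta, outcome):
    # Projector onto the eigenvector of cos(theta) Z + sin(theta) X
    # belonging to eigenvalue `outcome` (+1 or -1).
    v = (np.array([np.cos(theta / 2), np.sin(theta / 2)]) if outcome == +1
         else np.array([-np.sin(theta / 2), np.cos(theta / 2)]))
    return np.outer(v, v)

def prob(alpha, beta, x, y):
    # Born rule: P(X = x, Y = y) = squared length of the projection of psi.
    P = np.kron(spin_projector(alpha, x), spin_projector(beta, y))
    return np.linalg.norm(P @ psi) ** 2

# Assumed CHSH-type angles: Alice measures at 0 or pi/2,
# Bob at -pi/4 or -3pi/4.
alphas, betas = [0.0, np.pi / 2], [-np.pi / 4, -3 * np.pi / 4]
for a, alpha in enumerate(alphas, start=1):
    for b, beta in enumerate(betas, start=1):
        p_neq = prob(alpha, beta, +1, -1) + prob(alpha, beta, -1, +1)
        print(f"P{a}{b}(X != Y) = {p_neq:.3f}")
# Prints 0.854 for settings (1, 1) and 0.146 for the other three pairs.
```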
5. Bell’s inequality, Bell’s theorem
• Settings A, B take values in {1, 2}
• Counterfactual outcomes X1, X2, Y1, Y2 take values in {–1, +1}
• Actual outcomes X = XA, Y = YB
• (A, B) ⫫ (X1, X2, Y1, Y2)
6. Locality, realism, freedom
• Realism = existence of counterfactual outcomes
• Locality = Alice’s outcomes don’t depend on Bob’s settings and vice versa
• Freedom = statistical independence of actual settings from counterfactual outcomes
7. Bell’s inequality, Bell’s theorem
X1 = Y2 & Y2 = X2 & X2 = Y1 ⇒ X1 = Y1
∴ X1 ≠ Y1 ⇒ X1 ≠ Y2 or Y2 ≠ X2 or X2 ≠ Y1
∴ P(X1 ≠ Y1) ≤ P(X1 ≠ Y2) + P(Y2 ≠ X2) + P(X2 ≠ Y1)
∴ P11(X ≠ Y) ≤ P12(X ≠ Y) + P22(X ≠ Y) + P21(X ≠ Y)
where Pab(…) = P(… | A = a, B = b)
NB: the (probabilistic) Bell inequality is actually just a simple corollary of a logical implication
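The logical step can be verified by brute force; a minimal check (my illustration):

```python
from itertools import product

# For every assignment of +/-1 values to X1, X2, Y1, Y2:
# X1 != Y1 implies (X1 != Y2) or (Y2 != X2) or (X2 != Y1).
for x1, x2, y1, y2 in product([-1, +1], repeat=4):
    if x1 != y1:
        assert (x1 != y2) or (y2 != x2) or (x2 != y1)
print("implication holds in all 16 cases")

# The probability version then follows from the union bound:
# P(X1 != Y1) <= P(X1 != Y2) + P(Y2 != X2) + P(X2 != Y1).
```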
8. Bell’s inequality, Bell’s theorem
P11(X ≠ Y) ≤ P12(X ≠ Y) + P22(X ≠ Y) + P21(X ≠ Y)
For instance: 0.75 ≤ 0.25 + 0.25 + 0.25
But QM promises we can get (approx) 0.85, 0.15, 0.15, 0.15
Notice Eab(XY) = Pab(X = Y) – Pab(X ≠ Y) = 1 – 2 Pab(X ≠ Y)
Define S = E12(XY) + E22(XY) + E21(XY) – E11(XY)
Equivalently: S ≤ 2 (CHSH: Clauser-Horne-Shimony-Holt, 1969)
QM promises we can (maximally) get 2√2 (why not 4??)
Bell, 1964; Tsirelson, 1980; Pawlowski, 2009
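Both bounds on S can be reproduced in a few lines (my illustration; the measurement angles are an assumed optimal choice for the singlet state, for which Eab = −cos(αa − βb)):

```python
from itertools import product
import numpy as np

# Local realism: every local hidden variable model is a mixture of the 16
# deterministic strategies (fixed values of X1, X2, Y1, Y2), so checking
# S = E12 + E22 + E21 - E11 on these vertices bounds all of them.
best = max(abs(x1 * y2 + x2 * y2 + x2 * y1 - x1 * y1)
           for x1, x2, y1, y2 in product([-1, +1], repeat=4))
print("local-realistic maximum |S| =", best)          # 2

# Quantum mechanics, singlet state: Eab = -cos(alpha_a - beta_b).
alphas, betas = [0.0, np.pi / 2], [-np.pi / 4, -3 * np.pi / 4]
E = lambda a, b: -np.cos(alphas[a - 1] - betas[b - 1])
S = E(1, 2) + E(2, 2) + E(2, 1) - E(1, 1)
print("quantum S =", S, "; Tsirelson bound =", 2 * np.sqrt(2))
```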
9. John Bell explains: “I cannot say that action at a distance is required in physics. But I cannot say that you can get away with no action at a distance. You cannot separate off what happens in one place with what happens at another”
https://www.youtube.com/watch?v=V8CCfOD1iu8
Bell (1981) Bertlmann’s socks and the nature of reality
10. The local polytope, quantum convex body, no-signalling polytope
Definitions:
px,y,a,b = p(x, y | a, b) = P(X = x, Y = y | A = a, B = b) = P(Xa = x, Yb = y) under locality + realism + freedom
p = ( p(x, y | a, b) : x, y = ±1; a, b = 1, 2 ) ∈ ℝ^16
[Slide figure: the plane of two CHSH combinations S and S′. Deterministic local-realistic models sit at the corners of the square |S|, |S′| ≤ 2, whose faces are generalized Bell inequalities; the quantum convex body reaches out to Tsirelson’s bound, |S| = 2√2; the no-signalling polytope extends to |S| = 4; the completely random distribution lies at the centre.]
11. The local-realism polytope, quantum convex body, no-signalling polytope
px,y,a,b = p(x, y | a, b) = P(X = x, Y = y | A = a, B = b)
p = ( px,y,a,b : x, y = ±1; a, b = 1, 2 ) ∈ ℝ^16
Nonnegativity inequalities: ∀ x, y, a, b: px,y,a,b ≥ 0
Normalisation equalities: ∀ a, b: ∑x,y px,y,a,b = 1
No-signalling equalities: ∀ a, x: ∑y px,y,a,b is the same for all b; ∀ b, y: ∑x px,y,a,b is the same for all a
NB: no-signalling is a property of all decent physical models, not just LHV theories …
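These three families of constraints are easy to check mechanically. A small sketch (my illustration; the array layout p[a, b, x, y], with indices 0 and 1 standing for settings 1, 2 and outcomes +1, −1, is an assumption of the snippet), applied to the no-signalling extreme point, often called the "PR box", that reappears on slide 13:

```python
import numpy as np

def check(p, tol=1e-9):
    p = np.asarray(p, dtype=float)      # shape (2, 2, 2, 2): a, b, x, y
    nonneg = bool((p >= -tol).all())
    normal = bool(np.allclose(p.sum(axis=(2, 3)), 1.0, atol=tol))
    alice = p.sum(axis=3)               # P(X = x | A = a, B = b)
    bob = p.sum(axis=2)                 # P(Y = y | A = a, B = b)
    no_sig = bool(np.allclose(alice[:, 0, :], alice[:, 1, :], atol=tol) and
                  np.allclose(bob[0, :, :], bob[1, :, :], atol=tol))
    return nonneg, normal, no_sig

# The PR box: perfectly correlated except for setting pair (2, 2),
# where it is perfectly anticorrelated. It satisfies all three families
# of constraints, yet reaches S = 4.
pr = np.zeros((2, 2, 2, 2))
for a, b, x, y in np.ndindex(2, 2, 2, 2):
    anti = (a == 1 and b == 1)          # 0-indexed: the (2, 2) setting pair
    if (x == y) != anti:
        pr[a, b, x, y] = 0.5
print(check(pr))                        # (True, True, True)
```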
12. Caricatures of the local-realism, quantum and no-signalling convex bodies: L ⊊ Q ⊊ NS
[Slide figure: the same (S, S′) picture as on slide 10, now viewed as a 2-dimensional slice of the 8-dimensional space ℝ^8 in which the normalised, no-signalling distributions live.]
Generalised Bell inequalities exist for p parties, q settings, r outcomes. Left: the 2 × 2 × 2 case; right: the general p × q × r case.
13. The 2 × 2 × 2 case. Each 4 × 4 table shows p(x, y | a, b): rows indexed by (a, x), columns by (b, y), outcomes ordered +1, −1 within each 2 × 2 block.
Quantum extreme point, S = ±2√2:
0.43 0.07 0.43 0.07
0.07 0.43 0.07 0.43
0.43 0.07 0.07 0.43
0.07 0.43 0.43 0.07
No-signalling extreme point, S = ±4:
0.5 0.0 0.5 0.0
0.0 0.5 0.0 0.5
0.5 0.0 0.0 0.5
0.0 0.5 0.5 0.0
Local-realism extreme point, S = ±2:
1 0 0 1
0 0 0 0
1 0 0 1
0 0 0 0
Local-realism point, S = ±2, nearest to the QM best:
0.375 0.125 0.375 0.125
0.125 0.375 0.125 0.375
0.375 0.125 0.125 0.375
0.125 0.375 0.375 0.125
[Same (S, S′) diagram as on the previous slides.]
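The quoted values of S can be recomputed from the tables. Since the slide does not say which of the eight CHSH sign combinations each S refers to, the sketch below (my illustration) reports the largest; every such combination has the form E11 + E12 + E21 + E22 − 2Eab, up to sign.

```python
import numpy as np

def chsh_max(table):
    # Correlations Eab from the four 2x2 blocks of a 4x4 table, then the
    # largest |S| over the CHSH combinations with one flipped sign.
    t = np.asarray(table, dtype=float)
    E = {}
    for a in (1, 2):
        for b in (1, 2):
            blk = t[2 * a - 2: 2 * a, 2 * b - 2: 2 * b]
            E[a, b] = blk[0, 0] + blk[1, 1] - blk[0, 1] - blk[1, 0]
    total = sum(E.values())
    return max(abs(total - 2 * E[m]) for m in E)

quantum       = [[0.43, 0.07, 0.43, 0.07], [0.07, 0.43, 0.07, 0.43],
                 [0.43, 0.07, 0.07, 0.43], [0.07, 0.43, 0.43, 0.07]]
no_signalling = [[0.5, 0.0, 0.5, 0.0], [0.0, 0.5, 0.0, 0.5],
                 [0.5, 0.0, 0.0, 0.5], [0.0, 0.5, 0.5, 0.0]]
local_extreme = [[1, 0, 0, 1], [0, 0, 0, 0], [1, 0, 0, 1], [0, 0, 0, 0]]
local_nearest = [[0.375, 0.125, 0.375, 0.125], [0.125, 0.375, 0.125, 0.375],
                 [0.375, 0.125, 0.125, 0.375], [0.125, 0.375, 0.375, 0.125]]
for name, t in [("quantum extreme", quantum),
                ("no-signalling extreme", no_signalling),
                ("local-realism extreme", local_extreme),
                ("nearest local", local_nearest)]:
    print(f"{name}: |S| = {chsh_max(t):.2f}")
# quantum extreme: 2.88 (2*sqrt(2), up to the slide's rounding);
# no-signalling extreme: 4.00; local-realism extreme: 2.00; nearest local: 2.00
```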
14. Experiment
• The definitive experiment still has not been done
• They are doing it right now, in Delft (and in about 4 other places)
16. Diamonds
Delft: the Hanson diamond group; entanglement swapping
Hanson’s research group produces qubits using electrons in diamonds. ‘We use diamonds because “mini prisons” for electrons are formed in this material whenever a nitrogen atom is located in the position of one of the carbon atoms. The fact that we’re able to view these miniature prisons individually makes it possible for us to study and verify an individual electron and even a single atomic nucleus. We’re able to set the spin (rotational direction) of these particles in a predetermined state, verify this spin and subsequently read out the data. We do all this in a material that can be used to make chips out of. This is important as many believe that only chip-based systems can be scaled up to a practical technology,’ explains Hanson.
Holy Grail
Hanson is planning to repeat the experiment this summer over a distance of 1300 metres, with chips located in various buildings on TU Delft’s campus. This experiment could be the first that meets the criteria of the ‘loophole-free Bell test’, and could provide the ultimate evidence to disprove Einstein’s rejection of entanglement. Various research groups, including Hanson’s, are currently striving to be the first to realise a loophole-free Bell test, which is considered the ‘Holy Grail’ within quantum mechanics.
http://www.tudelft.nl/en/current/latest-news/article/detail/beam-me-up-data/
29 May 2014
18. Statistical issues
• Detection loophole – intention-to-treat principle!
• Memory loophole – randomisation and martingale theory! (see the sketch after this slide)
• Time variation – randomisation and martingale theory!
• Statistical significance: based on randomisation of settings, not on an assumed physical model (no “iid” assumption)
• Time variation – turn a bug into a feature by dynamic optimisation of the test statistic
• Statistical optimisation – from “S ≤ 2” to “S + ∑n.s.e. cn.s.e. · n.s.e. ≤ 2”: adding arbitrary multiples of the no-signalling equalities (expressions that are identically zero) leaves the bound valid but can reduce the variance of the test statistic
http://rpubs.com/gill1109
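A hedged sketch of how the randomisation-plus-martingale reasoning turns into a p-value (my illustration of the general idea, not the exact analysis used in any particular experiment). Recast CHSH as a game: a trial counts as a win if X ≠ Y under settings (1, 1), and if X = Y under the other three setting pairs. By slide 19's inequality, local realism wins each trial with probability at most 3/4, conditionally on the whole past, as long as the settings are freshly and uniformly randomised; so the number of wins is stochastically dominated by a Binomial(n, 3/4) count, and no iid assumption about the physics is needed.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(2015)
n = 10_000
a = rng.integers(1, 3, size=n)      # Alice's random setting, 1 or 2
b = rng.integers(1, 3, size=n)      # Bob's random setting, 1 or 2

# Simulate the quantum predictions of slide 8:
# P(X != Y) = 0.85 for settings (1, 1), and 0.15 otherwise.
p_neq = np.where((a == 1) & (b == 1), 0.85, 0.15)
neq = rng.random(n) < p_neq
win = np.where((a == 1) & (b == 1), neq, ~neq)

k = int(win.sum())                                    # about 0.85 * n wins
p_binomial = binom.sf(k - 1, n, 0.75)                 # P(Bin(n, 3/4) >= k)
p_hoeffding = np.exp(-2 * n * max(k / n - 0.75, 0) ** 2)
print(f"win rate {k / n:.3f}; p <= {p_binomial:.2e} (binomial), "
      f"{p_hoeffding:.2e} (Hoeffding)")
```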
19. Turning around the randomness: Bell inequalities are logical inequalities
• Suppose A, B are independent fair Bernoulli variables
• Define Iab = I(A = a, B = b)
• Condition on the values of X1, X2, Y1, Y2 and define δab = I(xa ≠ yb)
• Observe that E(δ11 I11 – δ12 I12 – δ22 I22 – δ21 I21) ≤ 0, because x1 ≠ y1 ⇒ x1 ≠ y2 or y2 ≠ x2 or x2 ≠ y1 (checked by enumeration below)
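A quick enumeration check of the last bullet (my illustration): since E(Iab) = 1/4 for each setting pair, the expectation equals (δ11 − δ12 − δ22 − δ21)/4, which is ≤ 0 for every one of the 16 possible counterfactual assignments.

```python
from itertools import product

for x1, x2, y1, y2 in product([-1, +1], repeat=4):
    d = {(1, 1): x1 != y1, (1, 2): x1 != y2,
         (2, 2): x2 != y2, (2, 1): x2 != y1}
    expectation = (d[1, 1] - d[1, 2] - d[2, 2] - d[2, 1]) / 4
    assert expectation <= 0
print("E(d11 I11 - d12 I12 - d22 I22 - d21 I21) <= 0 in all 16 cases")
```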
20. Want to know more?
• http://www.slideshare.net/gill1109/epidemiology-meets-quantum-statistics-causality-and-bells-theorem
• http://www.math.leidenuniv.nl/~gill
• Survey paper in Statistical Science:
Statistics, Causality and Bell’s Theorem
Richard D. Gill
Statistical Science 2014, Vol. 29, No. 4, 512–528. DOI: 10.1214/14-STS490. © Institute of Mathematical Statistics, 2014
Abstract. Bell’s [Physics 1 (1964) 195–200] theorem is popularly supposed to establish the nonlocality of quantum physics. Violation of Bell’s inequality in experiments such as that of Aspect, Dalibard and Roger [Phys. Rev. Lett. 49 (1982) 1804–1807] provides empirical proof of nonlocality in the real world. This paper reviews recent work on Bell’s theorem, linking it to issues in causality as understood by statisticians. The paper starts with a proof of a strong, finite sample, version of Bell’s inequality and thereby also of Bell’s theorem, which states that quantum theory is incompatible with the conjunction of three formerly uncontroversial physical principles, here referred to as locality, realism and freedom.
21. Postscript
“I cannot say that action at a distance is required in physics. But I cannot say that you can get away with no action at a distance. You cannot separate off what happens in one place with what happens at another” – John Bell
https://www.youtube.com/watch?v=V8CCfOD1iu8
“Nature produces chance events (irreducibly chance-like!) which can occur at widely removed spatial locations without anything propagating from point to point along any path joining those locations. … The chance-like character of these effects prevents any possibility of using this form of non-locality to communicate, thereby saving from contradiction one of the fundamental principles of relativity theory according to which no communication can travel faster than the speed of light” – Nicolas Gisin
Quantum Chance: Nonlocality, Teleportation and Other Quantum Marvels. Springer, 2014