I discuss various statistical analyses of the recent Bell experiment of Storz et al. (2023, Nature) at ETH Zurich. Both standard and novel analyses, under different assumptions, lead to almost identical conclusions. This strongly suggests that those assumptions are actually satisfied.
Optimal statistical analysis of Bell experiments
Richard D Gill (Mathematical Institute, Leiden University)
The 2022 Nobel prize in physics went to Aspect, Clauser and Zeilinger “for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”. I played a modest part in the process which led up to that prize by contributing statistical methodology used in four decisive “loophole-free” experiments of 2020. What I contributed was quite simply the idea of using randomisation in order to get guaranteed statistical validity, and martingale methods which allowed the experimenters to rule out the notion that an apparent violation of Bell’s inequality could simply be due to time trends in physical parameters over the course of an experiment which takes days to complete (confounding of treatment with time). Most recently I have studied some simple methods to reduce noise in the usual ad hoc estimators of the four correlations which figure in Bell’s inequality. Do not fear: the statistical model is very simple, and no knowledge of quantum mechanics is needed to understand the statistical issues. The talk is about the statistical analysis of four 2x2 tables.
https://www.mdpi.com/2673-9909/3/2/23
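To make the object of that talk concrete, here is a minimal sketch, with invented counts, of the usual plug-in estimates of the four correlations and the CHSH quantity S from four 2x2 tables; it is not the improved estimator discussed in the paper.

```python
import numpy as np

def correlation(table):
    """Plug-in ("ad hoc") estimate of E = P(outcomes equal) - P(outcomes unequal)
    from a 2x2 table of counts; rows are Alice's outcome (+1, -1), columns Bob's."""
    n = table.sum()
    return (table[0, 0] + table[1, 1] - table[0, 1] - table[1, 0]) / n

# One invented 2x2 table of outcome counts for each of the four setting pairs (a, b)
tables = {
    ("a1", "b1"): np.array([[430, 70], [75, 425]]),
    ("a1", "b2"): np.array([[420, 80], [85, 415]]),
    ("a2", "b1"): np.array([[425, 75], [80, 420]]),
    ("a2", "b2"): np.array([[90, 410], [415, 85]]),
}

E = {pair: correlation(t) for pair, t in tables.items()}
# CHSH combination: any local hidden variable model forces |S| <= 2
S = E[("a1", "b1")] + E[("a1", "b2")] + E[("a2", "b1")] - E[("a2", "b2")]
print({pair: round(e, 3) for pair, e in E.items()}, "S =", round(S, 3))
```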
The experimenters performed a loophole-free Bell test using superconducting qubits separated by 30 meters. They entangled pairs of qubits and measured them in randomly chosen bases over 1 million trials. They found an average S value of 2.0747 ± 0.0033, violating Bell's inequality with a p-value smaller than 10^-108, demonstrating non-local quantum correlations. This establishes superconducting circuits as a viable platform for foundational tests of quantum mechanics and applications in quantum information processing.
- The authors performed a loophole-free Bell test experiment using superconducting circuits to violate Bell's inequality. They entangled pairs of qubits over a 30 meter distance and measured them in randomly chosen bases, accumulating over 1 million trials.
- The average S value obtained was 2.0747 ± 0.0033, violating Bell's inequality with a p-value smaller than 10^-108, demonstrating non-local quantum correlations between the spatially separated qubits. This establishes superconducting circuits as a viable platform for foundational tests of quantum physics and applications in quantum information.
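As a rough plausibility check of the quoted significance (the published p-value comes from a martingale-based analysis, not from a Gaussian approximation, so this is only an order-of-magnitude illustration):

```python
from math import log, log10, sqrt, pi

S, se = 2.0747, 0.0033
z = (S - 2) / se  # standard errors above the local-realist bound S = 2

# Gaussian upper tail for large z: P(Z > z) ~ exp(-z**2 / 2) / (z * sqrt(2 * pi)),
# worked in log10 because the probability underflows double precision.
log10_p = -z**2 / (2 * log(10)) - log10(z * sqrt(2 * pi))
print(f"z = {z:.1f}, log10(p) ~ {log10_p:.0f}")  # roughly -113, consistent with p < 10^-108
```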
Datasalon6 2011 - "Rise of the robo scientists": where is data coming from? - Pieter Pauwels
Presentation given at Datasalon #6 in Brussels (2011). It presents a review on the article by R.D. King "Rise of the Robo Scientists" and some afterthoughts on the nature of data.
The article by R.D. King appeared in Scientific American: Vol. 304 (2011) pp. 72-77. DOI: 10.1038/scientificamerican0111-72
This document summarizes a study of new edgeless silicon pixel detector devices intended for use in particle physics experiments. The study tested prototypes of these devices, which have a nearly full active area compared to conventional detectors that have inactive "guard ring" regions. Tests in a particle beam found that the edgeless devices performed as expected, with a spatial resolution on the order of micrometers, demonstrating their potential to maximize the useful detection area for tracking particles. The new technology fulfills technical needs and could benefit particle physics as well as medical and other applications.
It has long been realized that the mathematical core of Bell's theorem is essentially a classical probabilistic proof that a certain distributed computing task is impossible: namely, the Monte Carlo simulation of certain iconic quantum correlations. I will present a new and simple proof of the theorem using Fourier methods (time series analysis) which should appeal to probabilists and statisticians. I call it Gull's theorem since it was sketched in a conference talk many years ago by astrophysicist Steve Gull, but never published. Indeed, there was a gap in the proof.
The connection with the topic of this session is the following: though a useful quantum computer is perhaps still a dream, many believe that a useful quantum internet is very close indeed. The first application will be: creating shared secret random cryptographic keys which, due to the laws of physics, cannot possibly be known to any other agent. So-called loophole-free Bell experiments have already been used for this purpose.
Like other proofs of Bell's theorem, the proof concerns a thought experiment, and the thought experiment could also in principle be carried out in the lab. This connects to the concept of functional Bell inequalities, whose application in the quantum research lab has not yet been explored. This is again a task for classical statisticians to explore.
R.D. Gill (2022) Gull's theorem revisited, Entropy 2022, 24(5), 679 (11pp.)
https://www.mdpi.com/1099-4300/24/5/679
https://arxiv.org/abs/2012.00719
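The "impossible distributed computing task" can be made concrete in a few lines: two simulated computers each receive only their own setting plus a shared random variable, and whatever local strategy they use, the empirical CHSH value stays at or below 2 up to sampling noise. The strategy below is one arbitrary choice, meant only to illustrate the task, not Gull's Fourier-analytic argument.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Shared hidden variable ("the shared random seed"): an angle sent to both computers
lam = rng.uniform(0, 2 * np.pi, n)

# Measurement angles for the two settings on each side
a = {1: 0.0, 2: np.pi / 2}
b = {1: np.pi / 4, 2: -np.pi / 4}

def respond(angle, lam):
    """A local deterministic strategy: the +/-1 output depends only on the
    local setting angle and the shared hidden variable."""
    return np.sign(np.cos(angle - lam))

setting_a = rng.integers(1, 3, n)  # Alice's setting, chosen at random each trial
setting_b = rng.integers(1, 3, n)  # Bob's setting, chosen at random each trial
A = respond(np.where(setting_a == 1, a[1], a[2]), lam)
B = respond(np.where(setting_b == 1, b[1], b[2]), lam)

def E(i, j):
    mask = (setting_a == i) & (setting_b == j)
    return np.mean(A[mask] * B[mask])

S = E(1, 1) + E(1, 2) + E(2, 1) - E(2, 2)
print(f"S = {S:.3f}")  # stays at or below 2 up to noise; quantum mechanics reaches 2*sqrt(2)
```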
[DSC Europe 23] Mila Pandurovic - Data science in high energy physics - DataScienceConferenc1
High energy physics experiments, such as the currently running Large Hadron Collider (LHC) or the future collider experiments (CEPC, CLIC, ILC, FCC), rely strongly on data science. From the four LHC experiments alone, the CERN Data Centre stores more than thirty petabytes of data per year, and over a hundred petabytes of data are archived permanently. The collider experiments are characterized not only by the vast amount of data, but also by the necessity for high-precision measurements and an unfavorable ratio of signal to background, where tiny signals are buried under a huge pile of background events, at a ratio of one per million or less. In Higgs physics, a special challenge is presented by studies with purely hadronic final states (jets), where the lack of sharp tagging variables leads to strenuous signal and background separation. The presentation will give an overview of the use of data science in Higgs boson physics at the future Circular Electron Positron Collider (CEPC) in China.
Combining density functional theory calculations, supercomputing, and data-dr... - Anubhav Jain
The document summarizes how computational materials science using density functional theory (DFT) calculations, supercomputing, and data-driven methods can help design new materials faster than traditional experimental approaches. It describes how high-throughput DFT calculations are run on supercomputers to screen large numbers of potential materials. The results are compiled in open databases like the Materials Project to be shared and reused by researchers. While computational limitations remain, combining computation and data is helping accelerate the discovery of new materials with improved properties for applications like batteries, thermoelectrics, and carbon capture.
This document provides the program for a two-day conference on computational modeling of advanced materials. The program includes four sessions each day with speakers presenting on topics like titania in cement industry, multiscale modeling of energy storage materials, modeling electron transport in low-dimensional systems, modeling heterogeneous catalysts for Fischer-Tropsch synthesis and Ziegler-Natta systems, and modeling organic compounds on metal oxide surfaces. Coffee breaks are scheduled between sessions and a lunch and poster session are included each day. Chairpersons are designated for each session.
This document discusses axion dark matter research being conducted at the Center for Axion and Precision Physics Research (CAPP) in South Korea. It outlines several axion haloscope experiments underway at CAPP using microwave cavities inside superconducting magnets, including CAPP-PACE (8T magnet), CAPP-8TB (larger 8T magnet), and CAPP-9T MC (9T magnet with multiple cavities). It also discusses plans for future experiments using a 25T high-temperature superconducting magnet and 12T low-temperature superconducting magnet. The document provides details on related experimental activities and technologies as well as CAPP's involvement in broader global axion and dark matter projects.
The general theory of space time, mass, energy, quantum gravity - Alexander Decker
The document discusses the relationships between various concepts in physics including general unified theory (GUT), space-time, mass-energy, quantum gravity, vacuum energy, and quantum fields. It explores how quantum computation may be possible using quantum discord rather than entanglement. Experiments showed that noisy, mixed quantum states could still enable computation through discord rather than requiring pristine entangled states. Theoretical work is ongoing to better understand how and when discord enables computation compared to entanglement.
A SPectroscopic Survey of Biased Halos in the Reionization Era (ASPIRE): JWST... - Sérgio Sacani
The JWST ASPIRE program discovered a filamentary structure of galaxies around a z=6.61 quasar through spectroscopic observations. Ten [O III] emitters were found clustered along a filament spanning a volume of 637 cMpc³, indicating one of the most overdense early structures known. Additional observations revealed a complex environment with both UV-bright and dusty galaxies present, showing that early galaxy evolution was not simultaneous around the quasar. In total 41 [O III] emitters were discovered between 5.3 < z < 6.7, with half clustered at z ~ 5.4 and 6.2, indicating high-redshift star-forming galaxies are generally clustered.
Intracluster light is already abundant at redshift beyond unity - Sérgio Sacani
Intracluster light (ICL) is diffuse light from stars that are gravitationally bound not to individual member galaxies, but to the halo of galaxy clusters. Leading theories predict that the ICL fraction, defined by the ratio of the ICL to the total light, rapidly decreases with increasing redshift, to the level of a few per cent at z > 1. However, observational studies have remained inconclusive about the fraction beyond redshift unity because, to date, only two clusters in this redshift regime have been investigated. One shows a much lower fraction than the mean value at low redshift, whereas the other possesses a fraction similar to the low-redshift value. Here we report an ICL study of ten galaxy clusters at 1 ≲ z ≲ 2 based on deep infrared imaging data. Contrary to the leading theories, our study finds that ICL is already abundant at z ≳ 1, with a mean ICL fraction of approximately 17%. Moreover, no significant correlation between cluster mass and ICL fraction or between ICL colour and cluster-centric radius is observed. Our findings suggest that gradual stripping can no longer be the dominant mechanism of ICL formation. Instead, our study supports the scenario wherein the dominant ICL production occurs in tandem with the formation and growth of the brightest cluster galaxies and/or through the accretion of preprocessed stray stars.
This document discusses a thesis project that aims to evaluate the radiation hardness of sensor materials for use in the proposed International Linear Collider beamline calorimeter (BeamCal). The author performs Monte Carlo simulations to estimate the shower conversion factor α, which quantifies the mean radiation fluence at a sensor per incident electron, as a function of electron energy. Analysis of the simulation data provides fluence distribution profiles that decrease radially from the center of the irradiated sensor area. The author accounts for sensor rastering across the electron beam, which provides even illumination over a 2 cm area. Observations from the simulations indicate the radiation fluence is linearly dependent on the incident electron energy.
Measuring the Hubble constant with kilonovae using the expanding photosphere ... - Sérgio Sacani
While gravitational wave (GW) standard sirens from neutron star (NS) mergers have been proposed to offer good measurements of the Hubble constant, we show in this paper how a variation of the expanding photosphere method (EPM), or spectral-fitting expanding atmosphere method, applied to the kilonovae (KNe) associated with the mergers, can provide an independent distance measurement to individual mergers that is potentially accurate to within a few percent. There are four reasons why the KN-EPM overcomes the major uncertainties commonly associated with this method in supernovae: (1) the early continuum is very well reproduced by a blackbody spectrum, (2) the dilution effect from electron scattering opacity is likely negligible, (3) the explosion times are exactly known due to the GW detection, and (4) the ejecta geometry is, at least in some cases, highly spherical and can be constrained from line-shape analysis. We provide an analysis of the early VLT/X-shooter spectra of AT2017gfo showing how the luminosity distance can be determined, and find a luminosity distance of D_L = 44.5 ± 0.8 Mpc, in agreement with, but more precise than, previous methods. We investigate the dominant systematic uncertainties, but our simple framework, which assumes a blackbody photosphere, does not account for the full time-dependent three-dimensional radiative transfer effects, so this distance should be treated as preliminary. The luminosity distance corresponds to an estimated Hubble constant of H0 = 67.0 ± 3.6 km s^-1 Mpc^-1, where the dominant uncertainty is due to the modelling of the host peculiar velocity. We also estimate the expected constraints on H0 from future KN-EPM analysis with the upcoming O4 and O5 runs of the LIGO collaboration GW detectors, where five to ten similar KNe would yield 1% precision cosmological constraints.
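As a back-of-the-envelope illustration of only the very last step: for a nearby source the Hubble constant is the peculiar-velocity-corrected recession velocity of the host divided by the distance. The distance below is the one quoted in the abstract; the velocity and its uncertainty are assumed numbers for this sketch, not the paper's inputs.

```python
# Minimal sketch of H0 = v / D for a nearby source (z << 1).
D_L, sigma_D = 44.5, 0.8         # Mpc, kilonova EPM luminosity distance quoted above
v_flow, sigma_v = 2980.0, 150.0  # km/s, assumed Hubble-flow velocity of the host and its uncertainty

H0 = v_flow / D_L
# First-order error propagation; the velocity term dominates, as the abstract notes
sigma_H0 = H0 * ((sigma_v / v_flow) ** 2 + (sigma_D / D_L) ** 2) ** 0.5
print(f"H0 = {H0:.1f} +/- {sigma_H0:.1f} km/s/Mpc")
```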
This document summarizes research using matrix product states (MPS) to simulate the dynamics of atoms in an optical lattice. MPS allows modeling of larger systems than conventional exact calculations by only keeping the most relevant quantum mechanical combinations. The researcher investigated MPS accuracy by comparing hopping predictions to exact calculations, finding convergence up to a certain precision. Future work will apply MPS to more complex lattice systems and geometries to replicate experiments.
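The compression idea behind MPS, keeping only the most relevant combinations, is easiest to see at a single cut of the chain, where it amounts to truncating the Schmidt (singular value) decomposition. A minimal sketch on a random toy state follows; it is not the code used in the summarized work.

```python
import numpy as np

def truncate_at_cut(psi, dim_left, dim_right, chi):
    """Keep only the chi largest Schmidt components across one cut of the chain.
    This SVD truncation is the elementary compression step behind matrix product states."""
    theta = psi.reshape(dim_left, dim_right)
    U, s, Vh = np.linalg.svd(theta, full_matrices=False)
    kept = s[:chi]
    discarded_weight = 1.0 - np.sum(kept**2) / np.sum(s**2)
    psi_approx = (U[:, :chi] * kept) @ Vh[:chi, :]
    return psi_approx.ravel(), discarded_weight

# Toy example: a random normalized state of 6 spin-1/2 sites, cut into 3 | 3 sites
rng = np.random.default_rng(1)
psi = rng.normal(size=2**6) + 1j * rng.normal(size=2**6)
psi /= np.linalg.norm(psi)

for chi in (1, 2, 4, 8):
    _, err = truncate_at_cut(psi, 2**3, 2**3, chi)
    print(f"bond dimension chi = {chi}: discarded weight = {err:.3e}")
```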
The Algorithms of Life - Scientific Computing for Systems Biology - inside-BigData.com
In this deck from ISC 2019, Ivo Sbalzarini from TU Dresden presents: The Algorithms of Life - Scientific Computing for Systems Biology. In his talk, Sbalzarini mainly discussed the rapidly growing importance and influence in the life sciences for scientific high-performance computing.
"Scientific high-performance computing is of rapidly growing importance and influence in the life sciences. Thanks to the increasing knowledge about the molecular foundations of life, recent advances in biomedical data science, and the availability of predictive biophysical theories that can be numerically simulated, mechanistic understanding of the emergence of life comes within reach. Computing is playing a pivotal and catalytic role in this scientific revolution, both as a tool of investigation and hypothesis testing, but also as a school of thought and systems model. This is because a developing tissue, embryo, or organ can itself be seen as a massively parallel distributed computing system that collectively self-organizes to bring about behavior we call life. In any multicellular organism, every cell constantly takes decisions about growth, division, and migration based on local information, with cells communicating with each other via chemical, mechanical, and electrical signals across length scales from nanometers to meters. Each cell can therefore be understood as a mechano-chemical processing element in a complexly interconnected million- or billion-core computing system. Mechanistically understanding and reprogramming this system is a grand challenge. While the “hardware” (proteins, lipids, etc.) and the “source code” (genetic code) are increasingly known, we known virtually nothing about the algorithms that this code implements on this hardware. Our vision is to contribute to this challenge by developing computational methods and software systems for high-performance data analysis, inference, and numerical simulation of computer models of biological tissues, incorporating the known biochemistry and biophysics in 3D-space and time, in order to understand biological processes on an algorithmic basis. This ranges from real-time approaches to biomedical image analysis, to novel simulation languages for parallel high-performance computing, to virtual reality and machine learning for 3D microscopy and numerical simulations of coupled biochemical-biomechanical models. The cooperative, interdisciplinary effort to develop and advance our understanding of life using computational approaches not only places high-performance computing center stage, but also provides stimulating impulses for the future development of this field."
Watch the video: https://wp.me/p3RLHQ-kBB
Learn more: https://www.isc-hpc.com/
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
This document provides an overview of the Center for Scientific and Technological Research (CCTVAL) in Chile. It summarizes that CCTVAL conducts research in theoretical and experimental physics, engineering, and their applications to technology and society. It has over 150 members working across various research groups and laboratories. CCTVAL has achieved success in scientific publications, international collaborations, technology development, and training students and young researchers. It aims to further develop physics and engineering in Chile and enhance their societal impacts.
This document reports on evidence for spatial variation in the fine structure constant α from observations of quasar absorption spectra. A sample of 153 measurements from the ESO Very Large Telescope (VLT) probing a different region of the universe suggests α was larger in the past, opposite to previous findings from the Keck telescope. Combining the two datasets reveals a significant spatial dipole in α, with the maximum variation in the direction of right ascension 17.3 hours, declination -61 degrees. Detailed analysis found no systematic effects that could mimic this result.
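For readers unfamiliar with the dipole model, a sketch of how such a fit works: each absorber sightline contributes a measurement of Δα/α, modelled as a monopole plus a dipole term along some axis, and the axis and amplitude are obtained by least squares. The data below are synthetic, with the true axis planted at the reported direction purely to show the recovery; none of the numbers come from the Keck or VLT samples.

```python
import numpy as np

def radec_to_unit(ra_hours, dec_deg):
    """Unit vector on the sky from right ascension (hours) and declination (degrees)."""
    ra = np.deg2rad(np.asarray(ra_hours, dtype=float) * 15.0)
    dec = np.deg2rad(np.asarray(dec_deg, dtype=float))
    return np.stack([np.cos(dec) * np.cos(ra), np.cos(dec) * np.sin(ra), np.sin(dec)], axis=-1)

def fit_dipole(ra_hours, dec_deg, dalpha):
    """Least-squares fit of da/a = monopole + d . n_hat over the sightlines;
    returns the monopole, the dipole amplitude |d| and its direction (RA hours, dec degrees)."""
    n_hat = radec_to_unit(ra_hours, dec_deg)
    X = np.hstack([np.ones((len(dalpha), 1)), n_hat])
    coef, *_ = np.linalg.lstsq(X, dalpha, rcond=None)
    m, d = coef[0], coef[1:]
    amp = np.linalg.norm(d)
    ra_fit = (np.rad2deg(np.arctan2(d[1], d[0])) / 15.0) % 24
    dec_fit = np.rad2deg(np.arcsin(d[2] / amp))
    return m, amp, ra_fit, dec_fit

# Synthetic sightlines drawn around an assumed dipole, to show the fit recovers it
rng = np.random.default_rng(2)
ra, dec = rng.uniform(0, 24, 300), rng.uniform(-90, 90, 300)
true_axis = radec_to_unit(17.3, -61.0)
dalpha = 1e-6 + 1e-5 * radec_to_unit(ra, dec) @ true_axis + rng.normal(0, 1e-5, 300)
print(fit_dipole(ra, dec, dalpha))  # direction should come back near RA 17.3 h, dec -61 deg
```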
Advancing Science through Coordinated Cyberinfrastructure - Daniel S. Katz
How local, regional, and national cyberinfrastructure can be coordinated and linked to advance science and engineering, based on experiences and lessons from the Center for Computation & Technology at LSU (ideas, funding, implementation), plus some thoughts on what might be done differently if we were starting today. Presented at First Workshop - Center for Computational Engineering & Sciences, Unicamp, Campinas, Brazil 10 APR 2014
Nightside clouds and disequilibrium chemistry on the hot Jupiter WASP-43b - Sérgio Sacani
Hot Jupiters are among the best-studied exoplanets, but it is still poorly understood how their chemical composition and cloud properties vary with longitude. Theoretical models predict that clouds may condense on the nightside and that molecular abundances can be driven out of equilibrium by zonal winds. Here we report a phase-resolved emission spectrum of the hot Jupiter WASP-43b measured from 5–12 µm with JWST's Mid-Infrared Instrument (MIRI). The spectra reveal a large day–night temperature contrast (with average brightness temperatures of 1524 ± 35 and 863 ± 23 Kelvin, respectively) and evidence for water absorption at all orbital phases. Comparisons with three-dimensional atmospheric models show that both the phase curve shape and emission spectra strongly suggest the presence of nightside clouds which become optically thick to thermal emission at pressures greater than ∼100 mbar. The dayside is consistent with a cloudless atmosphere above the mid-infrared photosphere. Contrary to expectations from equilibrium chemistry but consistent with disequilibrium kinetics models, methane is not detected on the nightside (2σ upper limit of 1–6 parts per million, depending on model assumptions).
This document provides information about the Engineering Physics course syllabus including:
- 5 units that will be covered related to electromagnetism, fiber optics, dielectric and magnetic materials, semiconductor physics, and semiconductor devices.
- 12 required experiments in the laboratory portion including determining magnetic fields, fiber optic properties, susceptibility, Hall effect, semiconductor properties, and characteristics of diodes and solar cells.
- Safety precautions and instructions for students conducting experiments in the physics laboratory, including following all safety guidelines, being careful with electrical and heating equipment, and not leaving open flames unattended.
This document discusses a new machine learning method for differentiating between quark and gluon jets using data from the ALICE experiment at CERN. Key points:
- The method uses features of jet substructure to construct discriminant variables to classify jets as initiated by quarks or gluons. Hundreds of features are explored.
- Data preprocessing steps are described, including removing unusable features, addressing class noise in jet labeling, and ranking features by information gain (a toy sketch of such a ranking follows this list).
- Feature ranking identified both previously proposed discriminating features as well as new intriguing variables for better quark/gluon jet discrimination.
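As mentioned in the list above, here is a toy sketch of ranking features by information gain; the feature names and jet labels are invented and do not come from the ALICE analysis.

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, threshold):
    """Entropy reduction obtained by splitting the sample at `threshold`."""
    left, right = labels[feature <= threshold], labels[feature > threshold]
    h_split = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - h_split

# Toy jets: label 0 = quark, 1 = gluon; two invented "substructure" features
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 5000)
features = {
    "n_constituents": rng.poisson(20 + 8 * labels),  # gluon jets tend to have more constituents
    "random_feature": rng.normal(size=5000),         # pure noise, carries no information
}

# Rank each feature by its best information gain over a few candidate thresholds
ranking = {
    name: max(information_gain(x, labels, t) for t in np.quantile(x, [0.25, 0.5, 0.75]))
    for name, x in features.items()
}
for name, ig in sorted(ranking.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} information gain = {ig:.3f}")
```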
Presentation of ECOSTBio Action CM1305 at APC Keflavik (Iceland) - Marcel Swart
This document summarizes the ECOSTBio CM1305 Action, which aims to establish a European network to study spin states of transition metal complexes. It will set up a SPINSTATE database, develop new computational methods, and facilitate collaboration between experimental and theoretical groups. The Action has 4 working groups focused on the database, enzymatic spin states, spin crossover materials, and biomimetic spin states. It involves 75 parties from 19 countries and over 75 participants in the first year, with equal representation of experimentalists and theoreticians. Future plans include populating the database, surveying spin states in enzymes and spin crossover materials, and synthesizing complexes to study through spectroscopy and reactivity experiments.
NB: This is a preliminary version, superseded by my next upload. It has long been realized that the mathematical core of Bell's theorem is essentially a classical probabilistic proof that a certain distributed computing task is impossible: namely, the Monte Carlo simulation of certain iconic quantum correlations. I will present a new and simple proof of the theorem using Fourier methods (time series analysis) which should appeal to probabilists and statisticians. I call it Gull's theorem since it was sketched in a conference talk many years ago by astrophysicist Steve Gull, but never published. Indeed, there was a gap in the proof.
The connection with the topic of this session is the following: though a useful quantum computer is perhaps still a dream, many believe that a useful quantum internet is very close indeed. The first application will be: creating shared secret random cryptographic keys which, due to the laws of physics, cannot possibly be known to any other agent. So-called loophole-free Bell experiments have already been used for this purpose.
Like other proofs of Bell's theorem, the proof concerns a thought experiment, and the thought experiment could also in principle be carried out in the lab. This connects to the concept of functional Bell inequalities, whose application in the quantum research lab has not yet been explored. This is again a task for classical statisticians to explore.
Richard's aventures in two entangled wonderlands - Richard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
Richard's entangled aventures in wonderland - Richard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
A tale of two Lucys - Delft lecture - March 4, 2024 - Richard Gill
TUDelft Seminar Probability & Statistics, 4 March 2024
15:45 to 16:45 - LOCATION: LECTURE HALL D@TA
Lucia de Berk, a Dutch nurse, was arrested in 2001, and tried and convicted of serial murder of patients in her care. At the lower court, the only hard evidence against her was the result of a probability calculation: the chance that she was present at so many suspicious deaths and collapses in the hospitals where she had worked was 1 in 342 million. During appeal proceedings at a higher court, the prosecution shifted gears and gave the impression that there was now hard evidence that she had killed one baby. Having established that she was a killer and a liar (she claimed innocence), it was not difficult to pin another nine deaths and collapses on her. No statistics were needed any more. In 2005 the conviction was confirmed by the supreme court. But at the same time, some whistleblowers started getting attention from the media. A long fight for the hearts and minds of the public, and a long fight to have the case reopened (without any new evidence, only new scientific interpretation of existing evidence), began and ended in 2010 with Lucia's complete exoneration. A number of statisticians played a big role in that fight. The idea that the conviction was purely based on objective scientific evidence was actually an illusion. This needed to be explained to journalists and to the public, and the judiciary needed to be convinced that something had to be done about it.
Lucy Letby, an English nurse, was arrested in 2020 for the murder of a large number of babies at a hospital in Chester, UK, between January 2015 and June 2016. Her trial started in 2022 and took 10 months. She was convicted and given a whole-life sentence in 2023. In my opinion, the similarities between the two cases are horrific. Again there is statistical evidence: a cluster of unexplained bad events, and Lucy was there every time; there is apparently irrefutable scientific evidence for two babies; and just as with Lucia de Berk, there are some weird personal and private writings which can be construed as a confession. For many reasons, the chances of a fair retrial for Lucy Letby are very thin indeed, but I am convinced she is innocent and that her trial was grossly unfair.
Subtitle: "Statistical issues in the investigation of a suspected serial killer nurse"
Abstract:
Investigating a cluster of deaths on a hospital ward is a difficult task for medical investigators, police, and courts. Patients do die in hospitals (the three most common causes of death in a hospital are, in order: cancer, heart disease, and medical errors). Often such cases come to the attention of police investigators for a combination of two reasons: gossip about a particular nurse is circulating just as a couple of unexpected and disturbing events occur. Hospital investigators see a pattern and call in the police.
I will discuss two such cases with which I have been intensively involved. The first one is the case of the Dutch nurse Lucia de Berk. Arrested in 2001, convicted by a succession of courts up to the supreme court by 2005, after which a long fight started to get her a re-trial. She was completely exonerated in 2010. The second case is that of the English nurse Lucy Letby. Arrested in 2018, 2019 and 2020 for murders taking place in 2015 and 2016. Her trial started in 2022 and concluded with a “full life” sentence a couple of months ago.
There are many similarities between the two cases, but also a couple of disturbing differences. One difference being that Lucy Letby’s lawyers seem to have made no attempt whatsoever to defend her. Another difference is that statistics was used against Lucia de Berk but not, apparently, against Lucy Letby. But appearances are not always what they seem.
Report published by Royal Statistical Society on statistical issues in these cases
https://rss.org.uk/news-publication/news-publications/2022/section-group-reports/rss-publishes-report-on-dealing-with-uncertainty-i/
News feature in “Science” about myself and my work
https://www.science.org/content/article/unlucky-numbers-fighting-murder-convictions-rest-shoddy-stats
https://www.maths.lu.se/kalendarium/?evenemang=statistics-seminar-statistical-issues-investigation-suspected-serial-killer-nurse-richard-gill
Video: https://www.youtube.com/watch?v=RxmFLKTlim8
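To see why statisticians worry about figures like the 1 in 342 million quoted above, here is a toy calculation under a naive null model in which incidents fall on shifts at random; the roster and incident counts are invented and have nothing to do with either real case. Adding or removing a handful of incidents that happened when the nurse was off duty shifts the answer by orders of magnitude, which is exactly the data-compilation concern raised in the Royal Statistical Society report linked above.

```python
from math import comb

def p_present_at_least(k, incident_shifts, nurse_shifts, total_shifts):
    """Probability that a nurse working `nurse_shifts` of `total_shifts` shifts is present
    at >= k of the `incident_shifts` shifts with an incident, under the naive null model
    that incidents fall on shifts completely at random, independently of the roster."""
    def pmf(j):
        return (comb(incident_shifts, j)
                * comb(total_shifts - incident_shifts, nurse_shifts - j)
                / comb(total_shifts, nurse_shifts))
    return sum(pmf(j) for j in range(k, min(incident_shifts, nurse_shifts) + 1))

# Invented roster numbers, purely to show how sensitive the answer is to how the
# incident list is compiled (which events are counted, and over which period).
total_shifts, nurse_shifts = 1000, 250
print(p_present_at_least(8, incident_shifts=8, nurse_shifts=nurse_shifts, total_shifts=total_shifts))
print(p_present_at_least(8, incident_shifts=12, nurse_shifts=nurse_shifts, total_shifts=total_shifts))
```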
The RSS has published a report tackling statistical bias in criminal trials where healthcare professionals are accused of murdering patients. Following several high-profile cases where statistical evidence has been misused, the Society calls for all parties in such cases to consult with professional statisticians and use only expert witnesses who are appropriately qualified.
The report, ‘Healthcare serial killer or coincidence?’, is produced by the RSS’s Statistics and the Law Section. The group evolved from a working group of the same name set up in the early 2000s, after the Society wrote to the Lord Chancellor and made a statement setting out concerns around the use of statistical evidence in the case of Sally Clark.
According to the report, suspicions about medical murder often arise due to a surprising or unexpected series of events, such as an unusual number of deaths among patients under the care of a particular professional.
The RSS has major concerns about use of this kind of evidence in a criminal investigation: first, over the analysis and interpretation of such data, and secondly over whether it can be guaranteed that the data have been compiled in an objective and unbiased manner.
Statistics, causality, and the 2022 Nobel prizes in physics.
Richard Gill
Leiden University
The 2022 Nobel prize in physics was awarded to John Clauser, Alain Aspect and Anton Zeilinger "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”. I will explain each of these three gentlemen’s contributions and point out connections to classical statistical causality and probabilistic coupling. It seems that the first commercial application of this work will be a technology called DIQKD: "device independent quantum key distribution". Alice and Bob are far apart and need to establish a shared cryptographic key so as to send one another some securely encrypted messages over public communication channels. How can they create a suitable key while far apart from one another, and only able to communicate using classical means and over public channels?
Healthcare serial killer or coincidence?
Richard Gill
Mathematical Institute, Leiden University
Abstract: The UK’s *Royal Statistical Society* recently published a report tackling statistical bias in criminal trials where healthcare professionals are accused of murdering patients. Following several high-profile cases where statistical evidence has been misused, the Society calls for all parties in such cases to consult with professional statisticians and use only expert witnesses who are appropriately qualified.
The RSS report came out just two weeks before the start of the trial in Manchester of a nurse called Lucy Letby. The trial is still ongoing. So far, neither side has called for evidence from experts in statistics. The core of the prosecution case is that so many odd events connected to nurse LL cannot be a coincidence.
I will discuss the challenges both procedural and conceptual which arise when presenting statistical thinking as evidence in criminal trials. https://rss.org.uk/news-publication/news-publications/2022/section-group-reports/rss-publishes-report-on-dealing-with-uncertainty-i/
The Utrecht University veterinary school was commissioned by the Dutch government to provide objective criteria for breeding short-muzzled dogs. Utrecht proposed 6 external characteristics rated on a traffic light system related to risks of Brachycephalic Obstructive Airway Syndrome and Brachycephalic Ocular Syndrome. These included abnormal breathing, nostril shape, relative muzzle length, nasal folds, eye exposure and eyelid closure. Utrecht determined standards for each characteristic and concluded that dogs meeting certain standards could be used for breeding while those exceeding the standards should not due to increased health risks. Utrecht based their recommendations on scientific studies and expertise in companion animal genetics. However, their criteria are still debated by other scientists and
1) The document discusses Marian Kupczynski's paper on whether John Bell would choose contextuality or nonlocality today based on graphical models representing random variables.
2) It presents a graphical model showing source hidden variables, context dependent instrument hidden variables, and Alice and Bob's outcome variables that may be correlated.
3) It notes that assuming the instrument hidden variables are uncorrelated leads to the CHSH inequality holding, while allowing them to be correlated allows any four joint probability distributions, and discusses Kupczynski's consideration of the detection loophole.
We analyse data from the final two years of a long-running and influential annual Dutch survey of the quality of Dutch New Herring served in large samples of consumer outlets. The data was compiled and analysed by a university econometrician whose findings were publicized in national and international media. This led to the cessation of the survey amid allegations of bias due to a conflict of interest on the part of the leader of the herring tasting team. The survey organizers responded with accusations of failure of scientific integrity. The econometrician was acquitted of wrong-doing by the Dutch authority, whose inquiry nonetheless concluded that further research was needed. We reconstitute the data and uncover its important features which throw new light on the econometrician's findings, focussing on the issue of correlation versus causality: the sample is definitely not a random sample. Taking account both of newly discovered data features and of the sampling mechanism, we conclude that there is no evidence of biased evaluation, despite the econometrician's renewed insistence on his claim.
This year’s Nobel prize in physics: homage to John Bell.
Richard Gill
Mathematical Institute, Leiden University.
Focussing on statistical issues, I will first sketch the history initiated by John Bell’s landmark 1964 paper “On the Einstein Podolsky Rosen paradox”, which led to the 2022 Nobel prize awarded to John Clauser, Alain Aspect and Anton Zeilinger,
https://www.nobelprize.org/prizes/physics/2022/press-release/
A breakthrough in the history was the four successful “loophole-free” Bell experiments of 2015 and 2016 in Delft, Munich, NIST and Vienna. These experiments pushed quantum technology to the limit and paved the way for DIQKD (“Device Independent Quantum Key Distribution”) and a quantum internet. They were the first successful implementations of the ideal experimental protocol described by John Bell in his 1981 masterpiece "Bertlmann's socks and the nature of reality", and depended on brilliant later innovations: Eberhard’s discovery that less entanglement could allow stronger manifestation of quantum non-locality, and Zeilinger’s discovery of quantum teleportation, allowing entanglement between photons to be transferred to entanglement between ions or atoms and ultimately to components of manufactured semi-conductors.
I will also discuss reanalyses of the 2015+ experiments, which could have allowed the experimenters to claim even smaller p-values than the ones they published,
https://arxiv.org/abs/2209.00702 "Optimal statistical analyses of Bell experiments"
Better slides: https://www.slideshare.net/gill1109/nobelpdf-253673329
Solution to the measurement problem based on Belavkin's theory of Eventum Mechanics. There is only Schrödinger’s equation and a unitary evolution of the wave function of the universe, but we must add a Heisenberg cut to separate the past from the future (to separate particles from waves): Belavkin’s eventum mechanics. The past is a commuting sub-algebra A of the algebra of all observables B, and in the Heisenberg picture, the past history of any observable in A is also in A. Particles have definite trajectories back into the past; Eventum Mechanics defines the probability distributions of future given past. https://arxiv.org/abs/0905.2723 Schrödinger's cat meets Occam's razor (version 3: 10 Aug 2022); to appear in Entropy
The successful loophole-free Bell experiments of 2015 and 2016 in Delft, Munich, NIST and Vienna were milestone achievements. They pushed quantum technology to the limit and paved the way for DIQKD and quantum internet. They were the first successful implementations of the ideal experimental protocol described by John Bell in his 1981 masterpiece "Bertlmann's socks and the nature of reality" (https://cds.cern.ch/record/142461/files/198009299.pdf). Still, there is a vociferous but small community of proponents of local realism, who continue to grasp at the straws offered by some shortcomings of the 2015+ experiments. I'll discuss the loopholes in the loophole-free experiments, explain how the experimenters could have claimed much smaller p-values "for free" by using standard likelihood-based inference, but also relativise the meaning of a 25 standard deviation violation of local realism. I suggest that particle physicists have wisely settled on 5 standard deviations because of Chebyshev's inequality and the fact that 1/5² = 0.04 is just significant at the 5% level (see the short calculation below).
https://arxiv.org/abs/2209.00702 "Optimal statistical analyses of Bell experiments"
https://arxiv.org/abs/2208.09930 "Kupczynski's contextual setting-dependent parameters offer no escape from Bell-CHSH"
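Spelled out, Chebyshev’s inequality for any random variable X with mean μ and standard deviation σ gives
\[
\Pr\big(|X-\mu| \ge 5\sigma\big) \;\le\; \tfrac{1}{5^{2}} \;=\; 0.04 \;<\; 0.05 ,
\]
so a 5-sigma deviation is (just) significant at the 5% level without any distributional assumptions at all.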
Statistical issues in Serial Killer Nurse cases
Richard Gill
- In serial killer nurse cases, clusters of suspicious deaths or incidents are often associated with a particular nurse on duty. However, alternative explanations for such clusters are difficult to rule out given the low base rate of nurses committing murder.
- Statistical evidence plays a key role in these cases but can be misleading if not interpreted carefully. Characteristics of the hospital system and processes of gathering evidence can inadvertently influence statistical analyses.
- Close examination of data in one case found that statistics were selectively reported in ways that exaggerated the nurse's involvement, such as restricting time periods analyzed. Complete data sets have sometimes contradicted initial statistical impressions.
The article discusses the d'Alembert betting system, one of the most popular systems used in casinos in the 19th century. While the systems appear to guarantee success by equalizing wins and losses over time, they fail to account for the risk of running out of money before wins and losses balance out. However, the systems can provide surprisingly high returns on investment when wins do occur, obscuring the overall negative expected value. The article analyzes how systems like the d'Alembert are seductive due to the potential for large gains despite the inevitability of overall losses in the long run.
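A small simulation sketch can make that seduction concrete. The rule below (start with a stake of 1 unit, go up one unit after a loss, down one after a win) is the standard d'Alembert; the bet is a slightly sub-fair even-money one (p = 18/37, as for red at European roulette), and the bankroll and session length are illustrative choices, not taken from the article.

dalembert <- function(p = 18/37, bankroll = 100, rounds = 100) {
  stake <- 1
  for (i in seq_len(rounds)) {
    stake <- min(stake, bankroll)          # never bet more than we have left
    if (runif(1) < p) {                    # win: collect the stake, step the stake down
      bankroll <- bankroll + stake
      stake <- max(1, stake - 1)
    } else {                               # loss: pay the stake, step the stake up
      bankroll <- bankroll - stake
      stake <- stake + 1
    }
    if (bankroll <= 0) break               # ruined before wins and losses could balance
  }
  bankroll
}

set.seed(42)
final <- replicate(10000, dalembert())
mean(final - 100)   # average profit is negative: the house edge wins on average
mean(final > 100)   # ...yet a clear majority of sessions end with a modest profit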
When I was asked to give a companion lecture in support of ‘The Philosophy of Science’ (https://shorturl.at/4pUXz) I decided not to walk through the detail of the many methodologies in order of use. Instead, I chose to employ a long-standing, and ongoing, scientific development as an exemplar. And so, I chose the ever-evolving story of Thermodynamics as a scientific investigation at its best.
Conducted over a period of more than 200 years, Thermodynamics R&D and application benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at both the micro and the macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science, engineering and technology, spanning micro-tech to aerospace and cosmology. I can think of no better story to illustrate the breadth of scientific methodologies and applications at their best.
vaxjo2023rdg.pdf
1. Richard D. Gill (Leiden University), 13 June 2023, QIP Växjö
Statistical analysis of [the] recent Bell experiments
“If your experiment needs statistics, you ought to have done a better experiment”
https://www.slideshare.net/gill1109/presentations
2. • RD Gill, Optimal Statistical Analyses of Bell Experiments, AppliedMath 2023, 3(2), 446-460; https://doi.org/10.3390/appliedmath3020023
• Storz, S., Schär, J., Kulikov, A. et al. Loophole-free Bell inequality violation with superconducting circuits. Nature 617, 265–270 (2023). https://doi.org/10.1038/s41586-023-05885-0
• Giustina M. Superconducting qubits cover new distances. Nature 617 (7960), 254-256. https://doi.org/10.1038/d41586-023-01488-x
• I have promised Marian Kupczynski not to talk about: RD Gill & JP Lambare, Kupczynski’s Contextual Locally Causal Probabilistic Models Are Constrained by Bell’s Theorem, Quantum Rep. 2023, 5(2), 481-495; https://doi.org/10.3390/quantum5020032
Storz et al. – ETH Zürich – Nature 617, 265–270 (2023)
Loophole-free Bell inequality violation with superconducting circuits
3. Richard D. Gill, 13 June 2023
On: “Loophole-free Bell inequality violation with superconducting circuits”
4.–5. Storz et al., Nature 617, 265–270 (2023), first page:
Loophole-free Bell inequality violation with superconducting circuits
Simon Storz, Josua Schär, Anatoly Kulikov, Paul Magnard, Philipp Kurpiers, Janis Lütolf, Theo Walter, Adrian Copetudo, Kevin Reuer, Abdulkadir Akin, Jean-Claude Besse, Mihai Gabureac, Graham J. Norris, Andrés Rosario, Ferran Martin, José Martinez, Waldimar Amaya, Morgan W. Mitchell, Carlos Abellan, Jean-Daniel Bancal, Nicolas Sangouard, Baptiste Royer, Alexandre Blais & Andreas Wallraff
Affiliations: ETH Zurich; Quside Technologies S.L., Castelldefels; ICFO – Institut de Ciencies Fotoniques, The Barcelona Institute of Science and Technology, Castelldefels; ICREA – Institució Catalana de Recerca i Estudis Avançats, Barcelona; Institute of Theoretical Physics, University of Paris-Saclay, CEA, CNRS, Gif-sur-Yvette; Department of Physics, Yale University; Institut quantique and Département de Physique, Université de Sherbrooke; Canadian Institute for Advanced Research; Quantum Center, ETH Zurich. e-mail: simon.storz@phys.ethz.ch; andreas.wallraff@phys.ethz.ch
https://doi.org/10.1038/s41586-023-05885-0 | Received: 22 August 2022 | Accepted: 24 February 2023 | Published online: 10 May 2023 | Open access | Nature, Vol 617, 11 May 2023, p. 265
Superposition, entanglement and non-locality constitute fundamental features of quantum physics. The fact that quantum physics does not follow the principle of local causality [1–3] can be experimentally demonstrated in Bell tests [4] performed on pairs of spatially separated, entangled quantum systems. Although Bell tests, which are widely regarded as a litmus test of quantum physics, have been explored using a broad range of quantum systems over the past 50 years, only relatively recently have experiments free of so-called loopholes [5] succeeded. Such experiments have been performed with spins in nitrogen–vacancy centres [6], optical photons [7–9] and neutral atoms [10]. Here we demonstrate a loophole-free violation of Bell’s inequality with superconducting circuits, which are a prime contender for realizing quantum computing technology [11]. To evaluate a Clauser–Horne–Shimony–Holt-type Bell inequality [4], we deterministically entangle a pair of qubits [12] and perform fast and high-fidelity measurements [13] along randomly chosen bases on the qubits connected through a cryogenic link [14] spanning a distance of 30 metres. Evaluating more than 1 million experimental trials, we find an average S value of 2.0747 ± 0.0033, violating Bell’s inequality with a P value smaller than 10^−108. Our work demonstrates that non-locality is a viable new resource in quantum information technology realized with superconducting circuits with potential applications in quantum communication, quantum computing and fundamental physics [15].
One of the astounding features of quantum physics is that it contradicts our common intuitive understanding of nature following the principle of local causality [1]. This concept derives from the expectation that the causes of an event are to be found in its neighbourhood (see Supplementary Information section I for a discussion). In 1964, John Stewart Bell proposed an experiment, now known as a Bell test, to empirically demonstrate that theories satisfying the principle of local causality do not describe the properties of a pair of entangled quantum systems [2,3].
In a Bell test [4], two distinct parties A and B each hold one part of an entangled quantum system, for example, one of two qubits. Each party then chooses one of two possible measurements to perform on their qubit, and records the binary measurement outcome. The parties repeat the process many times to accumulate statistics, and evaluate a Bell inequality [2,4] using the measurement choices and recorded results. Systems governed by local hidden variable models are expected to obey the inequality whereas quantum systems can violate it. The two underlying assumptions in the derivation of Bell’s inequality are locality, the concept that the measurement outcome at the location of party A cannot depend on information available at the location of party B and vice versa, and measurement independence, the idea that the choice between the two possible measurements is statistically independent from any hidden variables.
A decade after Bell’s proposal, the first pioneering experimental Bell tests were successful [16,17]. However, these early experiments relied on additional assumptions [18], creating loopholes in the conclusions drawn from the experiments. In the following decades, experiments relying on fewer and fewer assumptions were performed [19–21], until loophole-free Bell inequality violations, which close all major loopholes simultaneously, were demonstrated in 2015 and the following years [6–10]; see ref. 22 for a discussion.
In the development of quantum information science, it became clear that Bell tests relying on a minimum number of assumptions are not only of interest for testing fundamental physics but also serve as a key resource in quantum information processing protocols. Observing a violation of Bell’s inequality indicates that the system possesses non-classical correlations, and asserts that the potentially unknown …
6. Two log10 p-values
• Evaluating more than 1 million experimental trials, we find an average S value of 2.0747 ± 0.0033, violating Bell’s inequality with a p-value smaller than 10^−108
• For the final Bell test with an optimal angle θ (see main text), we performed n = 2^20 Bell trials and obtained c = 796228 wins in the Bell game. With these values we find p ≤ 10^−108
(Note that 747/33 = (2.0747 − 2)/0.0033, the excess of S over the local-realist bound 2 in units of its standard error.)
> pnorm(747/33, lower.tail = FALSE, log.p = TRUE) / log(10)
[1] -113.0221
> pbinom(796228 - 1, 2^20, lower.tail = FALSE, prob = 3/4,
+        log.p = TRUE) / log(10)
[1] -108.6195
2^20 = 1,048,576
Notice how close they are …
7. Counts N(a, b, x, y)
               x = +1, y = +1   x = +1, y = -1   x = -1, y = +1   x = -1, y = -1
a = 0, b = 0          100,529           31,780           29,926           99,965
a = 0, b = 1           30,638          101,342           96,592           33,131
a = 1, b = 0           94,661           30,018           35,565          102,060
a = 1, b = 1           96,291           29,186           32,104          104,788
TABLE SV. Raw counts of the individual occurrences for the final Bell test for fixed offset angle θ = π/4 with the maximal statistics (2^20 trials), presented in the main text.
The correlators; the counts in tens of thousands, rounded:
10  3  3 10
 3 10 10  3
10  3  3 10
10  3  3 10
9.
                a = 0, b = 0   a = 0, b = 1   a = 1, b = 0   a = 1, b = 1
x = +1, y = +1        100529          30638          94661          96291
x = +1, y = –1         31780         101342          30018          29186
x = –1, y = +1         29926          96592          35565          32104
x = –1, y = –1         99965          33131         102060         104788
Zürich, transposed
                a’ = 0, b = 0   a’ = 0, b = 1   a’ = 1, b = 0   a’ = 1, b = 1
x = +1, y = +1          94661           96291          100529          30638
x = +1, y = –1          30018           29186           31780         101342
x = –1, y = +1          35565           32104           29926          96592
x = –1, y = –1         102060          104788           99965          33131
Zürich, transposed & Alice’s setting flipped
11. [Spacetime diagram; labels: Inputs (binary), Outputs (binary), Time]
Distance (left to right) is so large that a signal travelling from one side to the other at the speed of light takes longer than the time interval between input and output on each side.
One “go = yes” trial has binary inputs and outputs; model as random variables A, B, X, Y.
Image: figure 7 from J.S. Bell (1981), “Bertlmann’s socks and the nature of reality”
18. Statistical analysis (Classical method)
Reduce data to 16 counts N(a, b, x, y)
• Suppose multinomial distribution, condition on 4 counts N(a, b)
• Compute the CHSH statistic Ŝ and estimate its standard deviation in the usual way (four independent binomial counts with variance estimated by plug-in)
• Compare the z-value (Ŝ − 2) / s.e.(Ŝ) with N(0, 1) (a short R sketch follows below)
• In some circumstances one would prefer Eberhard’s J
• Are there more possibilities? Yes: a 4-dimensional continuum of alternatives. Why not just pick the best???
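For concreteness, here is a minimal R sketch of that classical recipe applied to the counts on slide 7. The outcome-column order (++, +-, -+, --) is carried over from that table, and the CHSH sign pattern E(0,0) − E(0,1) + E(1,0) + E(1,1) is the combination that reproduces the published S; the plug-in variance is the standard binomial one the slide refers to.

counts <- rbind("00" = c(100529,  31780,  29926,  99965),   # settings (a,b); columns ++, +-, -+, --
                "01" = c( 30638, 101342,  96592,  33131),
                "10" = c( 94661,  30018,  35565, 102060),
                "11" = c( 96291,  29186,  32104, 104788))
Nab  <- rowSums(counts)                                      # the four totals N(a, b)
Ehat <- (counts[, 1] + counts[, 4] - counts[, 2] - counts[, 3]) / Nab   # correlators E(a, b)
S    <- Ehat["00"] - Ehat["01"] + Ehat["10"] + Ehat["11"]               # CHSH statistic
se   <- sqrt(sum((1 - Ehat^2) / Nab))   # plug-in s.e.: each table's equal/unequal count is binomial
z    <- (S - 2) / se                    # excess over the local-realist bound, in standard errors
log10p <- pnorm(z, lower.tail = FALSE, log.p = TRUE) / log(10)
c(S = unname(S), se = se, z = unname(z), log10p = unname(log10p))
## S and se come out close to the published 2.0747 +/- 0.0033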
19. Statistical analysis (New method 1)
Reduce data to 16 counts N(a, b, x, y)
• Suppose multinomial distribution, condition on 4 counts N(a, b)
• We expect p(x | a, b) does not depend on b, and p(y | a, b) does not depend on a
• If so, 4 empirical “deviations from no-signalling” are noise (mean = 0). That noise is generally correlated with the noise in the CHSH statistic Ŝ
• Reduce noise in Ŝ by subtracting the prediction of its statistical error, given the statistical errors in the no-signalling equalities: 2SLS with plug-in estimates of variances and covariances (a rough sketch follows below)
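The following rough sketch illustrates the idea, not the exact estimator of the paper: it regresses the noise in the CHSH statistic on the four empirical no-signalling deviations, but estimates the needed variances and covariances by a parametric bootstrap from the four observed tables rather than by the plug-in formulas mentioned on the slide.

counts <- rbind("00" = c(100529,  31780,  29926,  99965),   # as in the previous sketch
                "01" = c( 30638, 101342,  96592,  33131),
                "10" = c( 94661,  30018,  35565, 102060),
                "11" = c( 96291,  29186,  32104, 104788))
stats <- function(tabs) {
  p  <- tabs / rowSums(tabs)                 # p(x, y | a, b)
  E  <- p[, 1] - p[, 2] - p[, 3] + p[, 4]    # correlators E(a, b)
  pA <- p[, 1] + p[, 2]                      # p(x = +1 | a, b)
  pB <- p[, 1] + p[, 3]                      # p(y = +1 | a, b)
  c(S   = unname(E["00"] - E["01"] + E["10"] + E["11"]),
    dA0 = unname(pA["00"] - pA["01"]), dA1 = unname(pA["10"] - pA["11"]),
    dB0 = unname(pB["00"] - pB["10"]), dB1 = unname(pB["01"] - pB["11"]))
}
obs <- stats(counts)
set.seed(1)
boot <- t(replicate(2000, {                  # resample the four tables as multinomials
  sim <- t(sapply(1:4, function(i) rmultinom(1, sum(counts[i, ]), counts[i, ])))
  rownames(sim) <- rownames(counts)
  stats(sim)
}))
V    <- cov(boot)
beta <- solve(V[-1, -1], V[-1, 1])               # regress the noise in S-hat on the four deviations
S_adj  <- obs["S"] - sum(beta * obs[-1])         # noise-reduced CHSH estimate
se_adj <- sqrt(V[1, 1] - sum(beta * V[-1, 1]))   # its (slightly smaller) standard error
c(S_adj = unname(S_adj), se_adj = se_adj)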
20. Statistical analysis (New method 2)
Reduce data to 16 counts N(a, b, x, y)
• Suppose multinomial distribution, condition on 4 counts N(a, b)
• Estimate the 4 sets of 4 tetranomial probabilities p(x, y | a, b) by maximum likelihood assuming no-signalling (4 linear equalities):
(a) also assuming local realism (8 linear inequalities), and
(b) without also assuming local realism
• Test the null hypothesis H0: local realism against H1 = ¬H0 using Wilks’ log likelihood ratio test. Under H0, asymptotically,
−2 ( max{ log lik(p) : p ∈ H0 } − max{ log lik(p) : p ∈ H0 ∪ H1 } ) ~ ½ χ²(1) + ½ χ²(0)
(the translation of this mixture into a p-value is given below)
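As referenced above, the boundary mixture translates into a one-line p-value rule; the helper below is a generic illustration of that rule, not code taken from the paper.

## p-value for the Wilks statistic lambda under the null distribution 1/2 chi^2(0) + 1/2 chi^2(1)
pval_chibar <- function(lambda) {
  ifelse(lambda > 0, 0.5 * pchisq(lambda, df = 1, lower.tail = FALSE), 1)
}
pval_chibar(3.84)   # for example, a log likelihood ratio statistic of 3.84 gives p of about 0.025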
21. Statistical analysis (Bell game)
Reduce data to 16 counts N(a, b, x, y)
• Suppose 16-nomial, i.e., only the grand total N = 2^20 is fixed
• Suppose all p(a, b) = 0.25
• Use the martingale test (“Bell game”)
• Compute N( = | 1, 1) + N( = | 1, 2) + N( = | 2, 1) + N( ≠ | 2, 2); compare to Bin(N, 3/4) (a short R sketch follows below)
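A minimal sketch of the win count for the same data. With the labelling used in the tables above, the anti-correlated setting pair is (a, b) = (0, 1) (the minus sign in the CHSH combination), so a trial is a win when the outcomes are equal for the other three setting pairs and unequal for (0, 1); this is the same combination evaluated by the pbinom call on slide 6.

counts <- rbind("00" = c(100529,  31780,  29926,  99965),   # columns ++, +-, -+, --
                "01" = c( 30638, 101342,  96592,  33131),
                "10" = c( 94661,  30018,  35565, 102060),
                "11" = c( 96291,  29186,  32104, 104788))
equal   <- counts[, 1] + counts[, 4]         # trials with x == y
unequal <- counts[, 2] + counts[, 3]         # trials with x != y
wins <- equal["00"] + unequal["01"] + equal["10"] + equal["11"]
N    <- sum(counts)                          # 2^20 = 1,048,576 trials
c(wins = unname(wins), N = N)                # wins = 796228, as quoted on slide 6
pbinom(wins - 1, N, prob = 3/4, lower.tail = FALSE, log.p = TRUE) / log(10)   # log10 p-value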
22. Comparison of the 4 p-values
• The 4 tests are asymptotically equivalent if their model assumptions are satisfied and the true probabilities p(x, y | a, b), p(a, b) have the nice symmetries
Theory: [tables of the ideal probabilities p(x, y | a, b): entries e f / f e for three of the setting pairs, f e / e f for the fourth, and a row of equal entries g]
24. The QM stuff
Actually, a tripartite system: qutritA ⨂ channel ⨂ qutritB
• Qutrits’ bases: | gA ⟩, | eA ⟩, | fA ⟩ ; | gB ⟩, | eB ⟩, | fB ⟩
• Ground state g, and first two excited states e, f
• We get qutrits AB into state ( | eA, gB ⟩ + | gA, fB ⟩ ) / √2 + noise
• Channel has states | 0 ⟩, | 1 ⟩, | 2 ⟩, …
• First create a Schrödinger cat in A, a superposition of two excited states
• One of the excited cats ( f ) emits a photon (microwave pulse) into the channel and returns to the ground state
• The photon interacts with qutrit B and puts it into excited state f
• Attenuation of microwave plus interaction with environment -> noise
25. The QM stuff
A tripartite system: qutritA ⨂ channel ⨂ qutritB
Create Schrödinger cat in qutrit A (superposition of two excited states e, f)
Actually done in two steps: g ➝ e ➝ (e + f) / √2
Cat “f” moves from qutrit A into channel
Cat “f” moves from channel into qutrit B
| gA, 0, gB ⟩ ➝ ( | eA, 0, gB ⟩ + | fA, 0, gB ⟩ ) / √2 ➝ ( | eA, 0, gB ⟩ + | gA, 1, gB ⟩ ) / √2 ➝ ( | eA, 0, gB ⟩ + | gA, 0, fB ⟩ ) / √2
26. Lost in math
Two over-complex solutions to the Bell theorem quandary
• Tim Maudlin, Sabine Hossenfelder, Jonte Hance; Gerard ’t Hooft
• Restrict QM to a countable dense subset of states such that the Bell experiment is impossible: at most three of the four experiments (αi, βj) “exist”
• Paul Raymond-Robichaud
• Careful definitions of QM, distinguishing two levels (maths, empirical observations), make it both local and realistic
• The catch: probability distributions of measurement outcomes are in R-R’s model, individual outcomes are not
27. Physics fatwas
Argumentum ab auctoritate
• David Oaknin: physicists are forbidden to look at α − β
• Karl Hess, Hans de Raedt: physicists must condition (post-select) on both photons being observed (the “photon identification loophole”)
• Marian Kupczynski: physicists must use 4 disjoint probability spaces for the 4 sub-experiments
• I am not a physicist! As a mathematician, I do what I like.
• In physics there are no moral constraints on thought experiments
• Counterfactual reasoning is essential to (statistical) science, to law, to morality
• As a statistician, I know that “all models are wrong, some are useful”
28. Belavkin’s “eventum mechanics”
A simple solution
• Convert a bug into a feature
• The “Heisenberg cut” is for real; where it should be placed is up to the user of the QM framework
• But it must be compatible with the underlying unitary evolution of the world: the past consists of particles with definite trajectories, the future consists of waves of possibilities. Born’s rule gives us the probability law of the resulting stochastic process by compounding the conditional probabilities of the “collapse” in each next infinitesimal time step, given the history so far
• A simple solution to what?
• It’s a mathematical solution to the measurement problem; a mathematical resolution of the Schrödinger cat paradox