The document discusses statistical issues that arise in global fits to determine parton distribution functions (PDFs) from experimental data. It notes that PDF fits must combine data from different collision types and experiments, which can have inconsistent measurements. Traditional PDF fitting methods make restrictive assumptions that introduce bias, while the NNPDF approach uses neural networks and Monte Carlo replicas to avoid biases and faithfully represent uncertainties, including in regions with limited data. Inconsistent data poses challenges and requires delicate handling in global fits to obtain statistically sound PDF results.
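The Monte Carlo replica method mentioned above can be illustrated with a minimal sketch, in which the data, model, and covariance matrix are all invented for illustration: pseudodata replicas are sampled according to the experimental covariance, each replica is fitted independently, and the spread of the fitted results provides the uncertainty estimate without any linear approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "measurements" of a linear law y = a*x + b with correlated errors.
x = np.linspace(0.1, 1.0, 10)
y_true = 2.0 * x + 0.5
cov = 0.01 * (np.eye(10) + 0.5 * np.ones((10, 10)))  # toy correlated covariance
y_data = rng.multivariate_normal(y_true, cov)

# Monte Carlo replicas: fluctuate the data according to its covariance,
# then fit each replica independently (here: simple least squares).
n_rep = 500
A = np.vstack([x, np.ones_like(x)]).T
fits = []
for _ in range(n_rep):
    y_rep = rng.multivariate_normal(y_data, cov)
    coef, *_ = np.linalg.lstsq(A, y_rep, rcond=None)
    fits.append(coef)
fits = np.array(fits)

# The replica ensemble gives central values and uncertainties directly,
# with no Hessian (linear) approximation.
a_mean, b_mean = fits.mean(axis=0)
a_std, b_std = fits.std(axis=0)
print(f"a = {a_mean:.2f} +- {a_std:.2f}, b = {b_mean:.2f} +- {b_std:.2f}")
```

In a real PDF fit the least-squares line is replaced by the full neural-network fitting machinery, but the replica logic is the same.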
The impact of new collider data into the NNPDF global analysis (Juan Rojo)
The document summarizes Juan Rojo's presentation on the impact of new collider data in the NNPDF global analysis. It discusses updates and improvements to the NNPDF methodology, including adopting the public code APFEL, adding new LHC datasets like LHCb and top quark pair differential distributions, and analyzing the impact on parton distributions from including precise Tevatron and LHC Z boson data. Preliminary results from NNPDF3.1 indicate good stability compared to the previous NNPDF3.0 analysis, with reduced uncertainties and improved flavor separation from new experimental inputs.
Parton distributions in the LHC precision erajuanrojochacon
This document summarizes Juan Rojo's presentation on parton distribution functions (PDFs) at the Zurich Phenomenology Workshop 2018. The key points are:
1) PDF fits require combining perturbative cross-sections calculated using the Standard Model Lagrangian with non-perturbative PDFs extracted from a global analysis of experimental data.
2) More precise PDFs are needed to reduce uncertainties on calculations of processes like Higgs production and measurements of its couplings at the LHC.
3) Recent PDF analyses have included new data like differential top quark production from LHC and NNLO calculations, improving determinations of the gluon PDF over a wide range of x values.
4) Small-x data
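Point 1) above refers to QCD factorization: a hadronic cross-section is the convolution of a perturbative partonic cross-section with the non-perturbative PDFs. A schematic numerical version, with toy functions standing in for both ingredients, might look like:

```python
import numpy as np

# Toy stand-ins: a "PDF" f(x) and a partonic cross-section sigma_hat(x).
# Neither is a real QCD object; they only illustrate the structure
#   sigma = sum_a integral dx f_a(x) * sigma_hat_a(x).
def toy_pdf(x):
    return x**-0.2 * (1 - x)**3       # rises at small x, vanishes as x -> 1

def toy_sigma_hat(x):
    return 1.0 / (1.0 + 100.0 * x)    # peaked toward small x

# Numerical convolution on a logarithmic x grid, since PDFs vary over
# many decades in x.
x = np.logspace(-4, 0, 2000)
y = toy_pdf(x) * toy_sigma_hat(x)
sigma = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))  # trapezoid rule
print(f"toy hadronic cross-section: {sigma:.4f}")
```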
The Structure of the Proton in the Higgs Boson Era (juanrojochacon)
Juan Rojo gave a seminar at NIKHEF in Amsterdam on January 22, 2015 about the structure of the proton in the Higgs boson era. He discussed how the discovery of the Higgs boson completed the Standard Model but also opened new questions. He explained that the Large Hadron Collider will play a key role in exploring these questions over the next 20 years. Accurately determining the parton distribution functions of the proton is vital for phenomenology at the LHC.
Neural Network Fits of Parton Distributions (Juan Rojo)
The document discusses neural network fits of parton distribution functions (PDFs). It describes the NNPDF approach, which uses artificial neural networks as universal interpolators to parametrize PDFs without theoretical bias, in contrast to traditional approaches. The NNPDF method also uses a Monte Carlo replica method to estimate PDF uncertainties without linear approximations. Recent NNPDF results include determinations of PDFs including quantum electrodynamics corrections and an investigation of the intrinsic charm content of the proton.
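The parametrization described above can be sketched in a few lines. The network architecture and the preprocessing exponents below are illustrative choices, not the actual NNPDF settings: a small feed-forward network multiplies a factor x^(-alpha) (1-x)^beta that enforces the expected small-x and large-x behaviour.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy one-hidden-layer network with random (untrained) weights.
W1, b1 = rng.normal(size=(5, 1)), rng.normal(size=(5, 1))
W2, b2 = rng.normal(size=(1, 5)), rng.normal(size=(1, 1))

def nn(x):
    h = np.tanh(W1 @ x.reshape(1, -1) + b1)   # hidden layer, tanh activation
    return (W2 @ h + b2).ravel()

def pdf(x, alpha=0.5, beta=3.0):
    # Preprocessing fixes the expected endpoint behaviour, while the
    # network provides a flexible, unbiased interpolation in between.
    return x**(-alpha) * (1 - x)**beta * nn(x)

x = np.logspace(-3, -0.001, 50)
print(pdf(x)[:3])
```

In an actual fit the weights would be trained on each Monte Carlo data replica; here they are random, so only the functional form is meaningful.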
The document summarizes progress in the NNPDF global analysis of parton distribution functions. It discusses past NNPDF analyses from 2012-2015 and plans for future analyses, including NNPDF3.1 which will include new LHC data and improve the determination of intrinsic charm. It also presents results from fits allowing the charm content of the proton to be determined from data rather than assumed from perturbation theory, finding the data support a small intrinsic charm component within uncertainties.
NNLO PDF fits with top-quark pair differential distributions (Juan Rojo)
Juan Rojo presented a study on including top-quark pair differential distributions in NNLO global PDF fits. The distributions provide stringent constraints on the large-x gluon, comparable to inclusive jet data. Fitting normalized distributions, and including one distribution each from ATLAS and CMS, improves the description of the data and reduces PDF uncertainties, particularly at high masses important for BSM searches. Some tension is seen between the ATLAS and CMS measurements, which can be reduced by fitting the experiments separately. Differential top data will be essential for future global PDF fits.
Parton Distributions and the search for New Physics at the LHC (juanrojochacon)
Juan Rojo gave a seminar at King's College London on September 23, 2015 about parton distribution functions (PDFs) and their importance for precision physics at the Large Hadron Collider (LHC). PDFs describe the momentum distributions of quarks and gluons within protons and are crucial for determining cross sections and uncertainties for many LHC processes. The accurate determination of PDFs requires global analyses of experimental data using flexible parametrizations like neural networks to avoid biases. PDF uncertainties now limit characterization of the Higgs boson and searches for new physics at the LHC.
Parton Distributions: future needs, and the role of the High-Luminosity LHC (juanrojochacon)
1) Improved PDFs are needed to match the accuracy of higher-order calculations of cross sections and characterize properties of the Higgs boson and search for new physics.
2) The HL-LHC could significantly reduce PDF uncertainties through high-statistics measurements, especially in processes sensitive to large-x gluons and quarks like top quark pair production and Drell-Yan.
3) Preliminary studies generating HL-LHC pseudo-data show that PDF uncertainties on the gluon-gluon and quark-antiquark luminosities could be reduced by 10-30%, from top-quark pair and Drell-Yan measurements respectively.
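The logic of such pseudo-data projections can be shown in miniature. The numbers below are arbitrary, not those of the HL-LHC study: adding projected high-precision measurements to a combination shrinks the uncertainty on the extracted quantity, and the before/after ratio quantifies the expected reduction.

```python
import numpy as np

# Current measurements of some observable: uncertainties (arbitrary units).
current_errs = np.array([0.08, 0.10, 0.12])
# Projected HL-LHC pseudo-data: more points, higher precision (made-up values).
hllhc_errs = np.array([0.04, 0.04, 0.05, 0.05])

def weighted_mean_err(errs):
    # Uncertainty of the 1/sigma^2-weighted combination.
    return 1.0 / np.sqrt(np.sum(1.0 / errs**2))

before = weighted_mean_err(current_errs)
after = weighted_mean_err(np.concatenate([current_errs, hllhc_errs]))
reduction = 100 * (1 - after / before)
print(f"uncertainty reduced by {reduction:.0f}%")
```

Real PDF projections profile the full fit rather than a weighted mean, but the qualitative effect, new precise data shrinking the uncertainty band, is the same.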
Probes of small-x QCD from HERA to the LHC (Juan Rojo)
This document discusses probes of small-x QCD physics from HERA to the LHC. It begins with an overview of parton distribution functions and the global analysis used to determine them. It then discusses various processes that are sensitive to the low-x gluon, such as direct photon and top quark production at the LHC. Forward charm production measured by LHCb is also highlighted as providing valuable constraints on the small-x gluon. The implications for modeling soft QCD processes and predicting neutrino fluxes are briefly mentioned.
News from NNPDF: new data and fits with intrinsic charm (juanrojochacon)
Juan Rojo presented recent work by the NNPDF collaboration including: 1) inclusion of the final HERA legacy dataset which provides a moderate reduction in PDF uncertainties, 2) inclusion of new LHC data which constrains the large-x gluon PDF, and 3) ongoing work to perform fits with intrinsic charm and investigate implications for LHC phenomenology.
NNPDF3.0: Next Generation Parton Distributions for the LHC Run II (juanrojochacon)
NNPDF3.0 is a new PDF release from the NNPDF collaboration that incorporates recent experimental data from HERA and the LHC, improved theory calculations, and methodological advances. Key aspects of NNPDF3.0 include the inclusion of new data like HERA structure functions, LHC jets and electroweak data, and top quark production data. It also utilizes approximate NNLO calculations for jets and NLO electroweak corrections for Drell-Yan production. The fitting methodology has been improved with a C++ rewrite of the code and validation on closure tests. Preliminary results show good agreement with NNPDF2.3 and reduced uncertainties for some PDFs from the new data and methodology.
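The closure tests mentioned above validate a fitting methodology on pseudo-data generated from a known "truth": if the method is unbiased, the known input is recovered within the quoted uncertainties. A toy version, with a linear model standing in for the full PDF machinery, could look like:

```python
import numpy as np

rng = np.random.default_rng(2)

# Closure test in miniature: generate pseudo-data from a KNOWN law, run the
# fitting machinery on it, and check that the truth is recovered within the
# quoted uncertainty. Here the "machinery" is just a least-squares line fit.
a_true, b_true = 1.5, -0.3
x = np.linspace(0, 1, 20)
sigma = 0.05
y = a_true * x + b_true + rng.normal(0, sigma, size=x.size)

A = np.vstack([x, np.ones_like(x)]).T
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
cov = sigma**2 * np.linalg.inv(A.T @ A)   # parameter covariance
pulls = (coef - np.array([a_true, b_true])) / np.sqrt(np.diag(cov))

# For an unbiased method the pulls should be O(1).
print("pulls:", pulls)
```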
Parton distributions from high-precision collider data (juanrojochacon)
Juan Rojo gave a seminar at the Technical University of Munich on July 13, 2017 about parton distributions from high-precision collider data. He discussed how parton distribution functions are essential for calculating cross sections at hadron colliders like the LHC, since they describe the probability of finding quarks and gluons within protons. Rojo explained that global analyses fit PDFs to diverse experimental data using statistical techniques like neural networks, and the PDFs can then provide predictions for new processes. He highlighted recent updates from the NNPDF collaboration in version 3.1 to include more precise LHC data and the option to fit the charm quark distribution.
This document discusses areas where lattice QCD calculations could provide input to help constrain parton distribution functions (PDFs) in regions where they are currently not well known. It identifies "benchmarks" where PDFs are already well determined, as well as "opportunities" where lattice calculations could have more impact. These include PDFs at large values of Bjorken x, the strange quark content of the proton, and charm content. The document also discusses how lattice data could be included in PDF global fits to help reduce PDF uncertainties.
This document discusses parton distribution functions (PDFs) with intrinsic charm. It presents the motivation for fitting a charm PDF in global analyses, including stabilizing charm mass dependence, quantifying the non-perturbative charm component, and exploring implications for LHC phenomenology. The document summarizes previous fits allowing for intrinsic charm and issues they faced. It then presents new NNPDF fits that allow the charm PDF to be fitted, finding these fits describe EMC charm data and have stable gluon and charm PDFs under charm mass variations, with implications for precision LHC calculations.
PDFs at the LHC: Lessons from Run I and preparation for Run II (juanrojochacon)
This document summarizes parton distribution functions (PDFs) at the start of LHC Run II. It discusses the status of recent PDF sets from NNPDF, MMHT, CT, ABM, and HERAPDF. It notes some differences between these sets and the importance of PDF uncertainties for LHC measurements. The document also discusses benchmarks between PDF sets, comparisons using the APFEL online tool, and the prospects for including more ATLAS data in global PDF fits to further constrain PDFs.
Constraints on the gluon PDF from top quark differential distributions at NNLO (juanrojochacon)
- The document discusses constraints on the gluon PDF from top quark production at hadron colliders.
- It describes using the inclusive top quark pair production cross section to reduce uncertainties in the gluon PDF, especially in the large-x region between 0.1 and 0.5.
- Cross section ratios between different beam energies, such as 8 TeV/7 TeV, are highlighted as powerful precision tests that can discriminate between PDFs and probe BSM physics.
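The power of cross-section ratios in the last bullet comes from the cancellation of correlated uncertainties (luminosity and many systematics) between the two beam energies. A sketch with made-up numbers, using standard error propagation for a ratio of correlated quantities:

```python
import numpy as np

# Illustrative values only: cross-sections and uncertainties at two energies.
s8, e8 = 250.0, 10.0   # "8 TeV" cross-section and its uncertainty
s7, e7 = 170.0, 7.0    # "7 TeV" cross-section and its uncertainty

def ratio_err(rho):
    # Standard propagation for R = s8/s7 with correlation rho between
    # the two uncertainties: correlated parts cancel in the ratio.
    r = s8 / s7
    return r * np.sqrt((e8 / s8)**2 + (e7 / s7)**2
                       - 2 * rho * (e8 / s8) * (e7 / s7))

print(f"uncorrelated: {ratio_err(0.0):.3f}")
print(f"correlated:   {ratio_err(0.9):.3f}")  # markedly smaller
```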
Parton Distributions at a 100 TeV Hadron Collider (juanrojochacon)
Modern PDF sets accessed through LHAPDF6 (v6.1.5) are suitable for FCC studies and simulations. At a 100 TeV hadron collider, PDFs would need to be evaluated in more extreme regions of small-x, large-x, and large invariant masses than at the LHC. Photon-initiated processes could contribute significantly at the FCC due to large uncertainties in the photon PDF. Heavy quark PDFs, including for the top quark, should be included in matched calculations for FCC simulations rather than using a purely massless scheme.
aMCfast: Automation of Fast NLO Computations for PDF fits (juanrojochacon)
MadGraph5_aMCatNLO provides NLO calculations for arbitrary processes and their matching to parton showers, but existing fast interfaces are limited. A new tool called aMCfast provides a fast interface to MadGraph5_aMCatNLO, allowing its predictions to be used in global PDF fits. It precomputes matrix elements and interpolates them using grids, then reconstructs distributions for any PDFs or scales. This will increase the number and accuracy of processes in PDF fits, including electroweak corrections and photon-initiated effects, improving determination of PDFs from LHC data.
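The grid technique behind such fast interfaces can be caricatured as follows; the grid, weights, and toy PDFs below are invented for illustration. The expensive matrix elements are precomputed once as weights on an x grid, after which the prediction for any PDF is a cheap sum, so refitting PDFs never re-runs the NLO calculation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Precompute expensive weights w_i on an x grid ONCE; then the prediction
# for ANY pdf is the cheap sum  sigma(pdf) = sum_i w_i * pdf(x_i).
x_grid = np.logspace(-4, 0, 50)
weights = rng.random(50)          # stand-in for precomputed matrix elements

def prediction(pdf):
    return np.sum(weights * pdf(x_grid))

# Two toy PDFs: changing the PDF changes the prediction instantly,
# with no recomputation of the weights.
print(prediction(lambda x: x**-0.1 * (1 - x)**4))
print(prediction(lambda x: x**-0.3 * (1 - x)**5))
```

Real grids also interpolate in the factorization and renormalization scales and sum over parton flavours; only the one-dimensional core of the idea is shown here.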
Parton Distributions and Standard Model Physics at the LHC (juanrojochacon)
This document discusses parton distribution functions (PDFs) and recent developments. It notes that NNLO calculations are essential to reduce uncertainties in PDF analysis. Several key processes like inclusive jet production and top quark production are now available at NNLO. The document also discusses the inclusion of LHC data like W+charm, top quark, and jet data in global PDF fits. It highlights updates to various PDF fitting groups and the upcoming NNPDF3.0 release.
NNPDF3.0: parton distributions for the LHC Run II (juanrojochacon)
NNPDF3.0 is a new PDF determination that includes updated data and theory improvements compared to NNPDF2.3. It includes all HERA-II data and new LHC measurements. The fitting code was rewritten in C++ and validated using closure tests. NNPDF3.0 shows reasonable agreement with NNPDF2.3 while improving descriptions of data and reducing uncertainties in some regions. It provides PDFs for use at the LHC Run II.
Precision determination of the small-x gluon from charm production at LHCb (juanrojochacon)
This document discusses using LHCb data on charm production to constrain the small-x gluon and improve predictions for neutrino fluxes. LHCb data at 7 TeV, 5 TeV and 13 TeV provides stringent constraints on the small-x gluon beyond HERA. This improved gluon allows more accurate predictions for signals and backgrounds at neutrino telescopes. At a 100 TeV collider, inclusive cross sections depend directly on small-x PDFs, but using LHCb data leads to stabilized predictions with reduced uncertainties.
The structure of the proton in the Higgs Boson Era (juanrojochacon)
This document summarizes Juan Rojo's presentation on the structure of the proton in the Higgs boson era. The key points are:
1) The discovery of the Higgs boson marks a new era for particle physics and new discoveries may be found at the LHC in the next years.
2) While the Standard Model is successful, it is incomplete and new physics is needed to explain phenomena like dark matter. The LHC aims to further study the Higgs and search for new physics.
3) Determining the parton distribution functions that describe the proton's constituents is crucial for LHC analyses and requires a global analysis of experimental data using advanced theoretical techniques like neural networks.
Recent progress in proton and nuclear PDFs at small-x (juanrojochacon)
1) The document discusses recent progress in proton and nuclear parton distribution functions (PDFs) at small values of x. PDFs describe the momentum distribution of quarks and gluons inside protons and nuclei.
2) Global analyses of experimental data from various processes are used to determine PDFs at hadronic scales, which are then evolved perturbatively to higher scales relevant for LHC predictions. Recent analyses include data from the LHC.
3) Probing PDFs at small x requires processes dominated by gluons at leading order, produced in the forward region with low invariant masses. Examples discussed are direct photon and charm production. LHCb and future forward calorimeter data provide constraints on the small-x gluon.
The document discusses the NNPDF3.1 global analysis of parton distribution functions (PDFs). It provides an update to NNPDF3.0 motivated by new high-precision collider data and progress in NNLO calculations. NNPDF3.1 fits both a perturbative and fitted charm PDF and finds slightly better fit quality for the fitted charm case. Comparisons to NNPDF3.0 show agreement within uncertainties and reduced PDF errors in NNPDF3.1 due to the new LHC data.
PDF uncertainties and the W mass: Report from the Workshop “Parton Distributi... (juanrojochacon)
The document summarizes discussions from a workshop about reducing uncertainties in parton distribution functions (PDFs) that affect measurements of the W boson mass. Key topics discussed included issues modeling the Z boson's transverse momentum, inconsistencies between PDF sets, and the need for additional LHC measurements, such as of W/Z production, to reduce PDF uncertainties in future W mass measurements to below 10 MeV. Participants agreed more work is needed to understand differences between PDF sets and include relevant LHC data in global fits.
NNPDF3.0: Next generation parton distributions for the LHC Run II (juanrojochacon)
The document provides an overview of the forthcoming NNPDF3.0 PDF release from the NNPDF Collaboration. Key points include:
1) NNPDF3.0 includes over 1000 new data points from HERA and LHC experiments like ATLAS and CMS, improving constraints on PDFs.
2) Improved theory calculations are used, including approximate NNLO corrections for jet data and full NLO electroweak corrections.
3) The NNPDF methodology has been upgraded with a C++ code rewrite, validation on closure tests, and improvements to the fitting strategy and basis choices.
PDF uncertainties the LHC made easy: a compression algorithm for the combinat... (juanrojochacon)
This document summarizes Juan Rojo's presentation on a new method for combining PDF sets called compressed Monte Carlo PDFs (CMC-PDFs). The method involves combining Monte Carlo replicas from different PDF sets, then compressing the large combined set into a smaller set that still accurately reproduces properties like uncertainties. Validation shows CMC-PDFs with only 25 replicas can reproduce uncertainties for a variety of LHC processes, providing a computationally efficient implementation of the PDF4LHC recommendation.
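The compression idea can be sketched as a greedy subset selection. The real CMC-PDF algorithm matches a much richer set of statistical estimators (moments, correlations, and more) across many observables; this toy version matches only the mean and standard deviation of an invented replica ensemble.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy ensemble: 1000 "replicas" evaluated on 3 "observables".
full = rng.normal(1.0, 0.2, size=(1000, 3))
target_mean, target_std = full.mean(0), full.std(0)

def loss(idx):
    # Distance between the subset's estimators and the full ensemble's.
    sub = full[list(idx)]
    return (np.sum((sub.mean(0) - target_mean)**2)
            + np.sum((sub.std(0) - target_std)**2))

# Greedily grow a 25-replica compressed set, one best addition at a time.
chosen = [0, 1]                                # need >= 2 replicas for a std
for _ in range(23):
    best = min((i for i in range(1000) if i not in chosen),
               key=lambda i: loss(chosen + [i]))
    chosen.append(best)

print(f"{len(chosen)} replicas, loss {loss(chosen):.5f}")
```

The 25-replica figure mirrors the validation result quoted above; with a good estimator-matching criterion, a small subset reproduces the full ensemble's statistics at a fraction of the computational cost.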
Similar to Statistical issues in global fits: Lessons from PDF determinations
3) The NNPDF methodology has been upgraded with a C++ code rewrite, validation on closure tests, and improvements to the fitting strategy and basis choices.
PDF uncertainties the LHC made easy: a compression algorithm for the combinat...juanrojochacon
This document summarizes Juan Rojo's presentation on a new method for combining PDF sets called compressed Monte Carlo PDFs (CMC-PDFs). The method involves combining Monte Carlo replicas from different PDF sets, then compressing the large combined set into a smaller set that still accurately reproduces properties like uncertainties. Validation shows CMC-PDFs with only 25 replicas can reproduce uncertainties for a variety of LHC processes, providing a computationally efficient implementation of the PDF4LHC recommendation.
This document discusses three approaches for modeling the power consumption of an RF subsystem: multivariate polynomial fitting, neural networks, and homotopy continuation. It presents a case study where measurements of an RF subsystem's power consumption were taken under varying conditions of transmitter power, signal bandwidth, carrier frequency, and ambient temperature. The document aims to compare the accuracy of the three modeling approaches for capturing the power consumption across the varying conditions while accounting for measurement uncertainty. Polynomial fitting and neural networks are described as existing approaches, while homotopy continuation is proposed as a new technique for this application.
Similar to Statistical issues in global fits: Lessons from PDF determinations (20)
Communicating effectively and consistently with students can help them feel at ease during their learning experience and provide the instructor with a communication trail to track the course's progress. This workshop will take you through constructing an engaging course container to facilitate effective communication.
Beyond Degrees - Empowering the Workforce in the Context of Skills-First.pptxEduSkills OECD
Iván Bornacelly, Policy Analyst at the OECD Centre for Skills, OECD, presents at the webinar 'Tackling job market gaps with a skills-first approach' on 12 June 2024
Temple of Asclepius in Thrace. Excavation resultsKrassimira Luka
The temple and the sanctuary around were dedicated to Asklepios Zmidrenus. This name has been known since 1875 when an inscription dedicated to him was discovered in Rome. The inscription is dated in 227 AD and was left by soldiers originating from the city of Philippopolis (modern Plovdiv).
Strategies for Effective Upskilling is a presentation by Chinwendu Peace in a Your Skill Boost Masterclass organisation by the Excellence Foundation for South Sudan on 08th and 09th June 2024 from 1 PM to 3 PM on each day.
it describes the bony anatomy including the femoral head , acetabulum, labrum . also discusses the capsule , ligaments . muscle that act on the hip joint and the range of motion are outlined. factors affecting hip joint stability and weight transmission through the joint are summarized.
हिंदी वर्णमाला पीपीटी, hindi alphabet PPT presentation, hindi varnamala PPT, Hindi Varnamala pdf, हिंदी स्वर, हिंदी व्यंजन, sikhiye hindi varnmala, dr. mulla adam ali, hindi language and literature, hindi alphabet with drawing, hindi alphabet pdf, hindi varnamala for childrens, hindi language, hindi varnamala practice for kids, https://www.drmullaadamali.com
Statistical issues in global fits: Lessons from PDF determinations
1.
Statistical issues in global fits:
Lessons from PDF determinations
Juan Rojo
Rudolf Peierls Centre for Theoretical Physics
University of Oxford

International Workshop on Global Fits to
Neutrino Scattering Data and Generator Tuning (NuTune2016)
Liverpool, 11/06/2016
Juan Rojo, NuTune2016, Liverpool, 12/06/2016
2. The inner life of protons:
Parton Distribution Functions
3. Lepton vs Hadron Colliders
In high-energy lepton colliders, such as the Large Electron-Positron Collider (LEP) at CERN, the collisions involve elementary particles without substructure.

Cross-sections in lepton colliders can be computed in perturbation theory using the Feynman rules of the Standard Model Lagrangian.
4. Lepton vs Hadron Colliders
In high-energy hadron colliders, such as the LHC, the collisions involve composite particles (protons) with internal structure (quarks and gluons).

Calculations of cross-sections in hadron collisions require combining perturbative information (the quark/gluon-initiated hard processes) with non-perturbative information (the parton distributions).
Parton Distributions: non-perturbative, from the global analysis.
Quark/gluon collisions: perturbative, from the SM Lagrangian.
5. Parton Distributions
The distribution of the energy that quarks and gluons carry inside the proton is quantified by the Parton Distribution Functions (PDFs):
x: fraction of the proton's momentum.
Q: energy of the quark/gluon collision, the inverse of the resolution length.
g(x,Q): probability of finding a gluon inside a proton, carrying a fraction x of the proton momentum, when probed with energy Q.
PDFs are determined by non-perturbative QCD dynamics: they cannot be computed from first principles, and need to be extracted from experimental data with a global analysis.
The PDFs satisfy energy (momentum) conservation, and their dependence on the quark/gluon collision energy Q is determined in perturbation theory.
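The energy-conservation constraint mentioned above is the momentum sum rule, which in standard notation (not specific to any one fitting group) reads:

```latex
\sum_i \int_0^1 dx \, x \, f_i(x, Q^2) = 1 \quad \text{for all } Q^2 ,
```

where the sum runs over all quark, antiquark, and gluon distributions $f_i$.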
6. The Factorization Theorem
The QCD Factorization Theorem guarantees PDF universality: extract the PDFs from a subset of processes and use them to provide predictions for new processes.
Determine the PDFs in lepton-proton collisions, and use them to compute cross-sections in proton-proton collisions at the LHC.
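Schematically, factorization for an inclusive hadron-collider cross-section takes the standard form:

```latex
\sigma_{pp \to X} = \sum_{a,b} \int_0^1 dx_1 \, dx_2 \;
  f_a(x_1, \mu_F^2) \, f_b(x_2, \mu_F^2) \;
  \hat{\sigma}_{ab \to X}(x_1, x_2, \mu_F^2) ,
```

with $f_{a,b}$ the universal PDFs and $\hat{\sigma}_{ab \to X}$ the perturbatively computable partonic cross-section, both evaluated at a common factorization scale $\mu_F$.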
7. The global PDF analysis
[Diagram: global PDF fit results at the hadronic scale, connected by perturbative evolution to the high scales used as input to the LHC]
Combine state-of-the-art theory calculations, the constraints from PDF-sensitive measurements from different processes and colliders, and a statistically robust fitting methodology.
Extract parton distributions at hadronic scales of a few GeV, where non-perturbative QCD sets in.
Use perturbative evolution to compute PDFs at high scales as input to LHC predictions.
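The perturbative evolution referred to here is governed by the standard DGLAP equations:

```latex
\frac{\partial f_i(x, Q^2)}{\partial \ln Q^2} =
  \sum_j \int_x^1 \frac{dz}{z} \,
  P_{ij}\!\left(z, \alpha_s(Q^2)\right)
  f_j\!\left(\frac{x}{z}, Q^2\right) ,
```

where the splitting functions $P_{ij}$ are computable order by order in $\alpha_s$.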
8. The NNPDF approach
A novel approach to PDF determination, overcoming the limitations of the traditional PDF fitting methods with the use of advanced statistical techniques such as machine learning and multivariate analysis.
Non-perturbative PDF parametrization
Traditional approach: based on restrictive functional forms, leading to strong theoretical bias.
NNPDF solution: use Artificial Neural Networks as universal unbiased interpolants.
PDF uncertainties and propagation to LHC calculations
Traditional approach: Hessian method, limited to the Gaussian/linear approximation.
NNPDF solution: based on the Monte Carlo replica method, which constructs a probability distribution in the space of PDFs. Especially critical in extrapolation regions (e.g. high x) for New Physics searches.
Fitting technique
Traditional approach: deterministic minimization of the χ², with a flat-directions problem.
NNPDF solution: Genetic Algorithms to explore the vast parameter space efficiently, with cross-validation to avoid fitting statistical fluctuations.
9. The Monte Carlo replica method
There are two main approaches to estimating PDF uncertainties: the Hessian method and the Monte Carlo method.
In the Hessian method, the χ² is expanded quadratically in the fit parameters {a_n} around the best fit. The Hessian matrix is diagonalized, and PDF errors on cross-sections F follow from linear error propagation.
In the Monte Carlo replica method, pseudo-data replicas with the same fluctuations as the real data are generated, and then a PDF fit is performed on each individual replica.
This leads to a probability distribution in the space of PDFs, without linear/Gaussian approximations.
[Diagram: original data vs pseudo-data Monte Carlo replicas]
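Replica generation can be sketched in a few lines; this is an illustrative sketch (the function name and the use of a simple multivariate Gaussian are my own choices, not NNPDF code):

```python
import numpy as np

def generate_replicas(data, cov, n_rep, seed=0):
    """Generate Monte Carlo pseudo-data replicas carrying the same
    fluctuations (experimental covariance) as the real data.
    A separate PDF fit would then be run on each replica."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.asarray(data, dtype=float),
                                   np.asarray(cov, dtype=float),
                                   size=n_rep)
```

The ensemble of fitted replicas then provides the probability distribution: central values are replica means, and uncertainties are replica standard deviations.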
10. ANN for PDF parametrization
ANNs are routinely exploited in high-energy physics, in most cases as classifiers separating interesting events from more mundane ones.
ANNs also provide universal unbiased interpolants to parametrize the non-perturbative dynamics that determines the size and shape of the PDFs from experimental data.
[Diagram: traditional polynomial parametrization vs the NNPDF neural-network parametrization]
ANNs eliminate the theory bias introduced in PDF fits by the choice of ad-hoc functional forms.
NNPDF fits use O(400) free parameters, to be compared with O(10-20) in traditional PDF fits. Results are stable if O(4000) parameters are used!
Faithful extrapolation: PDF uncertainties blow up in regions with scarce experimental data.
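As a toy illustration of a neural-network PDF parametrization (the architecture, activation, and preprocessing exponents below are illustrative choices, not the actual NNPDF setup), a PDF can be written as a preprocessing factor times a small feed-forward network evaluated on (x, ln x):

```python
import numpy as np

def init_weights(layers=(2, 5, 3, 1), seed=0):
    """Random weights and biases for a small feed-forward network (toy)."""
    rng = np.random.default_rng(seed)
    return [(rng.standard_normal((m, n)), rng.standard_normal(m))
            for n, m in zip(layers[:-1], layers[1:])]

def nn_pdf(x, weights, alpha=1.1, beta=3.0):
    """Toy parametrization f(x) = x^(-alpha) (1-x)^beta * NN(x, ln x).
    alpha, beta are illustrative preprocessing exponents; the network
    replaces the ad-hoc polynomial form of traditional fits."""
    a = np.array([x, np.log(x)])      # two inputs: x and ln x
    for W, b in weights[:-1]:
        a = np.tanh(W @ a + b)        # hidden layers
    W, b = weights[-1]
    return x**(-alpha) * (1.0 - x)**beta * (W @ a + b).item()
```

Summing the weights and biases of one such network per independently parametrized PDF flavor gives the hundreds of free parameters quoted above.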
11. Artificial Neural Networks vs Polynomials
Compare a benchmark PDF analysis where the same dataset is fitted with Artificial Neural Networks and with standard polynomials, all other settings identical.
ANNs avoid biasing the PDFs and give faithful extrapolation at small x (very few data, thus the error blows up).
[Figure: PDF error for polynomials vs neural networks, with the no-data region marked]
HERA/LHC workshop proceedings 09
12. Combining Inconsistent Experiments in the PDF fit
[Figure: MSTW 2008 NLO PDF fit — tolerance T = (Δχ²_global)^(1/2) for each of the 20 eigenvector directions, compared with the fixed MRST (Δχ² = 50) and CTEQ (Δχ² = 100) choices, with the data set fixing the tolerance labelled for each direction]
Figure 10: Tolerance for each eigenvector direction determined dynamically from the criteria
that each data set must be described within its 90% C.L. (Eq. (58)) (outer error bars) or 68%
C.L. limit (inner error bars). The labels give the name of the data set which sets the 90%
C.L. tolerance for each eigenvector direction.
6.3 Uncertainties on input PDFs
We use the values of the dynamic tolerance shown in Fig. 10 to generate the PDF eigenvector sets according to (49), which can be written as

a_i(S_k^±) = a_i^0 ± t_k^± e_ik,   (59)

where t_k^± is adjusted to give T_k^±, with T_k^± the values shown in Fig. 10. We provide two different sets for each fit, corresponding to either a 90% or 68% C.L. limit. Note that the ratio of the PDF uncertainties calculated using these two sets is not simply an overall factor of √2.71 = 1.64, as it would be if the tolerance were chosen according to the usual parameter-fitting criterion. Even in the simplest case, where the data set fixing the tolerance is the same for the 90% and 68% C.L. limits, and assuming linear error propagation, the ratio of the T_k^± values would be (ξ_90 − ξ_50)/(ξ_68 − ξ_50), which is a function of the number of data points N in the data set which fixes the tolerance, and takes a value around 1.7 for typical N ∼ 10–1000.
13. Experimental data
The global QCD analysis requires combining different experiments with disparate characteristics:
Type of high-energy collision (lepton-proton, proton-proton) and center-of-mass energy of the collision.
Whether or not experimental correlated systematics are available, and if so, in which format.
Mutually inconsistent datasets, and datasets with few points but large constraining power vs datasets with many points but moderate constraining power.
[Figure: the NNPDF3.0 dataset — lepton-hadron and hadron-hadron collisions]
14. Experimental data
The kinematical coverage of the experiments included in NNPDF3.0 spans several orders of magnitude in both x and Q².
15. Inconsistent data
What is usually meant by inconsistent data?
There is no unique definition. Typically: an experiment that, when added to a global fit, leads to χ² >> N_dat.
There are many possible reasons for this:
Underestimated systematic uncertainties?
Incomplete/partial theory calculations?
Methodological limitations, i.e., too restrictive PDF fitting functional forms?
A genuine statistical pull between different experiments in the global fit? (This is not inconsistency!)
Dealing with potentially inconsistent experiments in the global PDF fit is very delicate. On the one hand, it is not advisable to bias the fit a priori with a subjective selection of which experiments are more reliable. On the other hand, one wants to achieve statistically sound results, and in particular PDF uncertainties that truly quantify our genuine lack of information. So there are two complementary avenues:
Attempt to understand how the inconsistencies arise, and fix them when possible (for example, using better theory).
Devise a fitting methodology that can deal with inconsistent experiments, regardless of the origin of the inconsistency.
Note that some of the older fixed-target DIS experiments do not provide the full breakdown of systematics, but these now carry a small weight in the global fit.
16. Dealing with inconsistent data
In the global PDF fit, different experiments will prefer different solutions, not always compatible with one another.
Also, the number of data points (statistical weight) of each experiment can be quite different, and one wants to describe both experiments with many data points and those with few.
Using textbook statistics, 68% CL uncertainties on the PDF fit parameters should be determined from the Δχ² = 1 criterion.
However, it has been shown that this criterion is not adequate in global fits with many experiments.
So global Hessian fits effectively use an increased tolerance Δχ² >> 1, to ensure that all fitted experiments are reasonably described.
Collins and Pumplin 01
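In the Hessian language, the tolerance T enters the standard symmetric master formula for the uncertainty on an observable F:

```latex
\Delta\chi^2_{\rm global} \le T^2 , \qquad
\Delta F = \frac{1}{2}
  \sqrt{\sum_{k} \left[ F(S_k^+) - F(S_k^-) \right]^2 } ,
```

where $S_k^\pm$ are the eigenvector PDF sets displaced by $\Delta\chi^2 = T^2$ along the $k$-th Hessian eigendirection; textbook statistics would correspond to $T = 1$.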
17. Dealing with inconsistent data
[Figure: MSTW 2008 NLO PDF fit — dynamical tolerance for each of the 20 eigenvector directions, with the data set fixing the tolerance labelled for each direction (same as Fig. 10 above)]
MSTW08
In the MSTW approach, a dynamical tolerance criterion is used, where different individual experiments determine the allowed upper and lower variations of each eigenvector.
This also indicates which datasets are most sensitive to each eigenvector.
18. Dealing with inconsistent data
In the NNPDF approach, there is no need to modify the fitting methodology in the presence of inconsistencies.
When new (compatible) experiments are added, the PDF errors decrease. If inconsistent experiments are included, the fit is essentially unaffected and the PDF errors are not modified (since no new information is added).
The fitting methodology is also unchanged even for large variations of the fitted dataset.
20. Reweighting as an alternative to fitting
When a new dataset becomes available, instead of updating the global fit, it is possible to include this new information on a prior PDF set using Bayes' Theorem.
The weight (likelihood) of each Monte Carlo replica k in the presence of the new experiment is given in terms of the χ²_k between the data and the theory computed with this replica.
The weights ω_k(α) are the weights ω_k with the χ² rescaled as χ²/α; that is, they correspond to the case where the new experimental dataset has its uncertainties rescaled by a factor α^(1/2).
The Bayesian reweighting technique also allows one to quantify the overall consistency of the new experiment with those already included in the global fit, by defining the distribution of the rescaling parameter α.
Any inconsistent experiment can be brought into agreement with the global fit by a suitable rescaling of its uncertainties (though this is not necessary in the NNPDF framework).
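The replica weights in the NNPDF reweighting papers take the form $w_k \propto (\chi^2_k)^{(N_{\rm dat}-1)/2} e^{-\chi^2_k/2}$, and the number of replicas that remain statistically effective after reweighting is measured by a Shannon entropy. A minimal numpy sketch (function names are my own):

```python
import numpy as np

def reweight(chi2, ndat):
    """Bayesian reweighting: weight of each replica, given the chi2 of the
    new dataset (ndat points) computed with that replica.
    w_k ~ chi2_k^((ndat-1)/2) * exp(-chi2_k/2), normalized to sum to n_rep."""
    chi2 = np.asarray(chi2, dtype=float)
    logw = 0.5 * (ndat - 1) * np.log(chi2) - 0.5 * chi2  # log for stability
    logw -= logw.max()                                   # avoid overflow
    w = np.exp(logw)
    return len(chi2) * w / w.sum()

def n_eff(w):
    """Effective number of replicas after reweighting (Shannon entropy)."""
    nrep = len(w)
    w = np.asarray(w, dtype=float)
    w = w[w > 0]
    return np.exp(np.sum(w * np.log(nrep / w)) / nrep)
```

For a perfectly compatible dataset the weights stay near 1 and n_eff ≈ n_rep; a strongly inconsistent dataset concentrates the weight on a few replicas, signalling that refitting (or error rescaling) is needed.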
21. Reweighting as an alternative to fitting
Adding new experiments by reweighting, in this case the Tevatron inclusive jet data, leads to results that are statistically consistent with refitting.
The main benefit of reweighting is that it can be performed using only public tools (PDF sets and codes for cross-section calculation), without any input from the PDF fitters.
22. Reweighting as an alternative to fitting
The distribution of the χ² rescaling parameter α allows one to quantify the level of (in)consistency of a new experiment with those already included in the global fit.
[Figure: P(α) for a consistent experiment vs a (mildly) inconsistent experiment, whose experimental errors need to be rescaled to agree with the global fit]
NNPDF 10
23. Conservative Partons
To study the robustness of the global fit results, it is possible to define parton distributions based on a maximally consistent dataset: the conservative partons.
Include in the conservative fit only those experiments which in the global fit have their P(α) distribution peaked at α < α_max.
Modifying this threshold allows one to tune the PDF fit to be more or less conservative.
This quantifies the impact of known dataset inconsistencies on the global-fit PDFs.
This is not merely a conceptual detail: assessing the robustness of PDF errors on LHC cross-sections is central, e.g., for the characterisation of the Higgs boson.
24. Conservative Partons
At the level of LHC cross-sections, the conservative PDFs are consistent with the global fit PDFs within uncertainties.
The conservative PDFs are affected by larger uncertainties due to the reduced dataset.
This is a non-trivial validation of the robustness of the global fit results.
26. Closure Testing of Parton Distributions
PDF uncertainties have often been criticised for a potential lack of statistical interpretation.
Within NNPDF, we performed a systematic closure-test analysis based on pseudo-data, and verified that PDF uncertainties exhibit statistically robust behaviour.
27. Closure Testing of Parton Distributions
For instance, if the pseudo-data are generated without statistical fluctuations (that is, identical to the input theory), then by construction the agreement with the theory should become arbitrarily good.
[Figure: a measure of PDF uncertainties in units of the data uncertainties]
And indeed it does: as the minimization advances, the χ² decreases monotonically, and the PDF uncertainties are reduced as well, as the fitted theory collapses onto the underlying law.
NNPDF 14
28. Closure Testing of Parton Distributions
Another important advantage of closure testing the global PDF analysis is that it allows one to disentangle the various components of the total PDF uncertainty:
Level 0: fit pseudo-data without fluctuations (limit: χ² -> 0) => extrapolation uncertainty.
Level 1: fit pseudo-data with fluctuations (limit: χ² -> 1) => functional uncertainty.
Level 2: fit Monte Carlo replicas of the fluctuated pseudo-data (limit: χ² -> 2) => data uncertainty.
NNPDF 14
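The three levels of pseudo-data generation can be sketched as follows (an illustrative sketch of the idea, not the NNPDF implementation; at level 2, each call with a different seed would produce one Monte Carlo replica):

```python
import numpy as np

def closure_pseudodata(theory, cov, level, seed=0):
    """Closure-test pseudo-data generated from a known 'true' theory.
    level 0: the theory itself, no fluctuations;
    level 1: theory plus one noise shift drawn from the data covariance;
    level 2: the level-1 pseudo-data plus an extra replica fluctuation."""
    rng = np.random.default_rng(seed)
    data = np.asarray(theory, dtype=float)
    zero = np.zeros(len(data))
    if level >= 1:
        data = data + rng.multivariate_normal(zero, cov)  # level-1 shift
    if level == 2:
        data = data + rng.multivariate_normal(zero, cov)  # replica noise
    return data
```

Fitting each level and comparing the resulting PDF error bands isolates, in turn, the extrapolation, functional, and data components of the total uncertainty.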
29. Closure Testing of Parton Distributions
In closure tests, it is possible to validate new techniques, such as Bayesian reweighting, in a clean environment where everything is under control (in particular, free of potential data inconsistencies).
Closure testing the global fit allows one to disentangle methodological issues of principle (in an ideal world with perfectly consistent datasets, does my fitting methodology give the result it should?) from issues of practice (how to deal with inconsistent experiments or with incomplete theory?).
NNPDF 14
30. Adding artificial inconsistencies
To test the fitting methodology in a realistic situation, it is possible to generate pseudo-data with artificial inconsistencies added, and study how the resulting PDFs are modified.
In the MMHT approach, adding artificial inconsistencies in a closure test leads to modified PDFs that in most cases remain in agreement with the global-fit PDF uncertainties.
This is only the case if their dynamical tolerance criterion Δχ² >> 1 is used, as opposed to Δχ² = 1.
[Figure: closure-test results with artificial inconsistencies, comparing uncertainties from Δχ² = 1 vs Δχ² >> 1]
Watt and Thorne 12
31. Summary and outlook
Parton Distributions are an essential ingredient for LHC phenomenology.
At the LHC, precision PDFs are required for many analyses, from the characterisation of the Higgs sector to BSM searches and Monte Carlo event generators.
The global QCD analysis aims to extract parton distributions from a diverse experimental dataset using state-of-the-art theory and methodology.
This involves dealing with several non-trivial statistical issues, in particular with potential inconsistencies between fitted datasets, which can arise from various sources: incomplete theory, limitations of the fitting methodology, or underestimated systematic uncertainties.
To deal with these problems, a number of techniques have been developed which allow us to validate the robustness of our PDF uncertainty estimates for high-precision LHC phenomenology.