The document describes a machine learning approach for primary vertex reconstruction in high-energy physics experiments. A hybrid method is proposed that uses a 1D convolutional neural network to analyze histograms produced from tracking data. The network is able to find primary vertices with high efficiency and tunable false positive rates, demonstrating the potential of machine learning for this task. Future work involves adding more tracking information and iterating between track association and vertex finding to improve performance.
ACAT 2019: A hybrid deep learning approach to vertexing - Henry Schreiner
This document presents a hybrid deep learning approach for vertex finding in high-energy physics experiments. It uses a 1D convolutional neural network to analyze kernel density estimates of track information in order to identify primary vertex positions. The approach achieves primary vertex finding efficiencies of 88-94% with low false positive rates comparable to traditional algorithms. The authors demonstrate tuning of the efficiency-false positive rate tradeoff and discuss plans to improve performance by incorporating additional track information and iterative refinement.
2019 CtD: A hybrid deep learning approach to vertexing - Henry Schreiner
This document presents a hybrid deep learning approach for vertex finding using 1D convolutional neural networks. It describes generating 1D kernel densities from tracking information, building target distributions, and using a CNN architecture with an adjustable cost function to optimize the false positive rate versus efficiency. The approach achieves 93.87% efficiency with a 0.251 false positive rate on test data. Future work includes incorporating additional xy information and exploring full 2D kernel densities.
2019 IML workshop: A hybrid deep learning approach to vertexing - Henry Schreiner
A hybrid deep learning approach is proposed for vertex finding using 1D convolutional neural networks on kernel density estimates from tracking data. The approach generates 1D histograms from 3D tracking data and uses a CNN to classify primary vertex positions. In a proof-of-concept on simulated data, it achieves primary vertex finding efficiencies and false positive rates comparable to traditional algorithms, with tunable efficiency-false positive tradeoffs. Future work includes incorporating additional tracking features, associating tracks to vertices, and deploying the inference engine for the LHCb trigger.
LHCb Computing Workshop 2018: PV finding with CNNs - Henry Schreiner
The document discusses using a convolutional neural network (CNN) to quickly find primary vertices (PVs) in high-energy physics events recorded by the LHCb experiment. A prototype tracking algorithm is used to generate a 1D kernel density estimate (KDE) histogram from hit triplets. This histogram is then used to train a CNN to predict the locations of PVs. Initial results show the CNN approach can find PVs with 70-75% efficiency and a false positive rate of 0.08-0.13, outperforming current algorithms. Further work aims to improve resolution, find secondary vertices, and integrate the approach into iterative tracking.
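The pipeline sketched in the abstracts above (a 1D kernel density estimate built from track positions, then a network that locates primary vertices along the beamline) can be illustrated with a minimal numpy sketch. The bandwidth, grid, and threshold values here are hypothetical, and simple peak finding stands in for the trained CNN; this is an illustration of the data representation, not the authors' implementation.

```python
import numpy as np

def kde_histogram(track_z, grid, bandwidth=0.1):
    """1D Gaussian kernel density estimate of track z-positions.

    track_z: track positions along the beamline (mm); bandwidth is a
    hypothetical choice, not the value used in the papers.
    """
    diffs = grid[:, None] - track_z[None, :]
    return np.exp(-0.5 * (diffs / bandwidth) ** 2).sum(axis=1)

def find_peaks(density, grid, threshold):
    """Local maxima above a threshold stand in for the CNN's PV predictions."""
    mid = density[1:-1]
    is_peak = (mid > density[:-2]) & (mid > density[2:]) & (mid > threshold)
    return grid[1:-1][is_peak]

# Two simulated primary vertices at z = -10 mm and z = +5 mm, 20 tracks each
rng = np.random.default_rng(0)
tracks = np.concatenate([rng.normal(-10, 0.05, 20), rng.normal(5, 0.05, 20)])
grid = np.linspace(-20, 20, 2001)
density = kde_histogram(tracks, grid)
peaks = find_peaks(density, grid, threshold=5.0)
```

The KDE collapses the 3D tracking data to a single axis, which is what makes a cheap 1D convolutional network viable for this task.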
The document outlines new features in the PVsyst software. Version 6.39 includes improvements to meteorological input data, system definitions to support optimizers and multi-MPPT inverters, 3D editing tools, and batch simulations. Version 6.40 will introduce a text-based file format, faster 3D rendering, and expanded battery system modeling. Future versions will provide more detailed battery and hybrid system simulations.
Using Very High Resolution Satellite Images for Planning Activities in Mining - Argongra Gis
Pleiades satellite imagery can be used to generate high-resolution digital elevation models, contour lines, and 3D models for mining sector planning activities. A case study acquired 240 sq km of Pleiades stereo imagery with less than 5% cloud cover to generate a 5m DEM and detailed contour lines and 3D visualizations. The Pleiades data provided more accurate topographic information than existing SRTM data for environmental studies, monitoring mining volume changes, and other planning purposes in the mining sector.
VR4PV is a virtual reality software that enables fast visualization and simulation of photovoltaic (PV) systems. It allows users to render shades on PV systems from surrounding objects or self-shading during the design process. VR4PV has been used for cases involving PV integration in the built environment, designing movable PV boats, shadow analysis of a PV-powered street lamp, and allocating PV and renewable energy technologies on small islands. Future work may involve extending VR4PV's capabilities and transitioning to more performant gaming engines like Unity to improve processing speed while separating animation from energy simulations.
This document discusses vertical axis wind turbine (VAWT) performance prediction theories and a single stream tube model for VAWTs. It presents the angle of attack equation for the single stream tube model assuming zero tilt angle. It also provides the iterative equation used to determine the induction factor and the power equation that is integrated over a full rotation to predict average power. The document indicates that understanding and integrating various engineering disciplines is required to make meaningful contributions to wind turbine research and development.
This document provides information about power patterns in Colorado during winter, wind turbine performance prediction theories, and the fields required to contribute to wind energy technology. It summarizes:
1) A typical winter day in Colorado sees power demand peak in the morning and evening with lower demand during the day, while wind power generation is highest during the night and midday.
2) Common theories for predicting vertical axis wind turbine performance include single stream tube, multiple stream tube, and fixed wake models.
3) Developing contributions to wind energy requires expertise in many disciplines including fluid dynamics, materials science, engineering, programming, and business/policy areas.
This document discusses different sampling techniques used in population surveys including simple random sampling, systematic sampling, stratified sampling, cluster sampling, convenience sampling, and quota sampling. It provides examples of how to calculate the sample size for each technique and notes the advantages and disadvantages of each.
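The sample-size calculations such surveys rely on typically start from Cochran's formula for estimating a proportion, with a finite-population correction for small populations. A minimal sketch (the 95% z-value and example numbers are illustrative):

```python
import math

def cochran_sample_size(p, margin, z=1.96, population=None):
    """Cochran's formula: n0 = z^2 * p * (1 - p) / e^2, optionally corrected
    for a finite population of size N: n = n0 / (1 + (n0 - 1) / N)."""
    n0 = z ** 2 * p * (1 - p) / margin ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# Worst-case variability (p = 0.5), 5% margin of error, 95% confidence
print(cochran_sample_size(0.5, 0.05))                   # 385
print(cochran_sample_size(0.5, 0.05, population=2000))  # finite-population corrected
```

Using p = 0.5 maximizes p(1 - p), so it gives a conservative sample size when the true proportion is unknown.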
This document summarizes a street lighting project including:
1) Planning data for a single roadway with details on the luminaire, arrangement, and mounting specifications.
2) A luminaire parts list specifying the model number, luminous flux, and wattage.
3) Renderings showing the false color lighting distribution.
4) Valuation field details for the roadway including isolines, average lux, minimum lux, maximum lux, and uniformity ratios.
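The uniformity ratios reported in such valuation fields are simple functions of the computed illuminance grid. A minimal sketch, with made-up lux values for illustration:

```python
def uniformity_ratios(lux_values):
    """Overall uniformity U0 = Emin / Eavg and extreme ratio Ud = Emin / Emax."""
    e_min, e_max = min(lux_values), max(lux_values)
    e_avg = sum(lux_values) / len(lux_values)
    return e_min / e_avg, e_min / e_max

# Hypothetical illuminance samples across the roadway (lux)
grid = [12.0, 18.0, 25.0, 30.0, 22.0, 15.0]
u0, ud = uniformity_ratios(grid)
```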
Adaptive Channel Prediction, Beamforming and Scheduling Design for 5G V2I Net... - T. E. BOGALE
The document proposes and evaluates an adaptive channel prediction, beamforming, and scheduling design for 5G vehicle-to-infrastructure networks. It presents an RLS-based algorithm to predict time-varying channel impulse responses and jointly optimizes beamforming vectors and vehicle scheduling to maximize throughput. Simulation results show the proposed design outperforms alternatives when scheduling a single vehicle, but performance degrades with increasing numbers of scheduled vehicles due to accumulated prediction errors.
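A recursive least squares (RLS) predictor of the general kind the paper builds on can be sketched in a few lines. The filter order, forgetting factor, and toy signal below are illustrative choices, not the paper's configuration, which predicts vector channel impulse responses.

```python
import numpy as np

def rls_predict(signal, order=4, lam=0.99, delta=100.0):
    """One-step-ahead RLS prediction of a scalar time series.

    w: filter weights, P: inverse-correlation estimate, lam: forgetting factor.
    Returns predictions aligned with signal[order:].
    """
    w = np.zeros(order)
    P = delta * np.eye(order)
    preds = []
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]       # most recent sample first
        preds.append(w @ x)
        e = signal[n] - w @ x               # a-priori prediction error
        k = P @ x / (lam + x @ P @ x)       # gain vector
        w = w + k * e                       # weight update
        P = (P - np.outer(k, x @ P)) / lam  # inverse-correlation update
    return np.array(preds)

# A slowly varying sinusoid: the predictor should track it closely
t = np.arange(300)
sig = np.sin(2 * np.pi * t / 50)
preds = rls_predict(sig)
```

The accumulating-error behavior the paper reports is visible in such predictors too: any residual error in the weights compounds when predictions are chained several steps ahead.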
This document summarizes a method for quantifying plant and city CO2 fluxes from XCO2 imagery using Gaussian plume modeling. It presents a full processing approach including: 1) modeling single/multiple point sources and extended sources with Gaussian plumes, 2) using an optimal estimation method to retrieve plume parameters and flux, 3) incorporating effective wind speed from ancillary data for emission estimates. The method is validated using synthetic XCO2 images generated from model simulations for various sites. Preliminary results show the Gaussian model can accurately characterize realistic plumes and the method can quantify CO2 emissions to within 10-30% for different instrument configurations and noise levels.
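The Gaussian plume model at the heart of the method has a standard closed form; for column quantities like XCO2, the crosswind profile is a 1D Gaussian whose integral is conserved. A sketch for a single point source (the dispersion parameters here are hypothetical constants, whereas real retrievals use stability-dependent parameterizations):

```python
import math

def plume_column_density(x, y, Q, u, sigma_y0=50.0, growth=0.1):
    """Column-integrated Gaussian plume from a point source at the origin.

    Q: emission rate (kg/s), u: effective wind speed (m/s) along +x.
    sigma_y(x) = sigma_y0 + growth * x is a hypothetical spread model.
    Returns column density in kg/m^2; zero upwind of the source.
    """
    if x <= 0:
        return 0.0
    sigma_y = sigma_y0 + growth * x
    return (Q / (math.sqrt(2 * math.pi) * sigma_y * u)
            * math.exp(-y * y / (2 * sigma_y * sigma_y)))

# Centerline value 1 km downwind of a 100 kg/s source in a 5 m/s wind
c0 = plume_column_density(1000.0, 0.0, Q=100.0, u=5.0)
```

Because the crosswind integral of this profile equals Q/u, fitting the plume amplitude together with an effective wind speed from ancillary data yields the flux Q, which is the inversion the method performs.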
Object segmentation in satellite imagery (Kaggle DSTL) / Artur Kuzin (Avito) - Ontico
RIT++ 2017, ML + IoT + security track
Belo Horizonte hall, June 6, 16:00
Abstract:
http://ritfest.ru/2017/abstracts/2802.html
In this talk I will present a solution to the object segmentation task on satellite imagery posed in the Kaggle competition Dstl Satellite Imagery Feature Detection, in which I took 2nd place in a team with Roman Solovyov.
I will briefly describe how a neural network for object segmentation works, then show examples of network modifications tailored to the specifics of the task, along with training techniques that significantly improve the final accuracy. All of the top-5 solutions will be covered.
As a bonus: the story of how the leaderboard could be broken a couple of days before the end of the competition.
This document discusses direction of arrival (DOA) estimation using a two-element antenna array. It describes simulating different radiation patterns from varying the phase between antennas. Randomly located nodes are generated and their received signal strength calculated using a two-ray model for different radiation patterns. The pattern with the highest RSS value for a node indicates the most likely region it is located in, allowing estimation of each node's DOA. While results show this method can determine DOA, more research is needed to narrow estimates.
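The switched radiation patterns described here follow from the two-element array factor, whose shape is controlled entirely by the inter-element phase. A sketch assuming half-wavelength spacing (the spacing and angles are illustrative, not taken from the document):

```python
import math

def array_factor(theta, phase_offset, spacing_wl=0.5):
    """Magnitude of the two-element array factor.

    theta: angle from the array axis (radians),
    phase_offset: electrical phase between elements (radians),
    spacing_wl: element spacing in wavelengths (0.5 assumed here).
    AF = |1 + exp(j * psi)| with psi = 2*pi*spacing_wl*cos(theta) + phase_offset.
    """
    psi = 2 * math.pi * spacing_wl * math.cos(theta) + phase_offset
    return abs(2 * math.cos(psi / 2))  # |1 + e^{j psi}| = 2|cos(psi/2)|

# With zero offset the pattern peaks broadside (theta = 90 degrees)
broadside = array_factor(math.pi / 2, 0.0)  # 2.0
endfire = array_factor(0.0, 0.0)            # ~0.0 for half-wavelength spacing
```

Stepping phase_offset through a set of values generates the family of patterns whose per-node RSS comparison drives the DOA estimate.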
NovelSat is a private company founded in 2008 that focuses on satellite communication technology. Their patented NS3 technology provides 30-60% higher capacity than DVB-S2 and was successfully tested at major events like the Olympics and World Cup. Key benefits include maximizing satellite capacity, fast data rates, robustness to interference, and fast return on investment. Major customers like EBU and FIFA chose NovelSat for its efficiency in providing more bandwidth for less cost, flexibility to grow over time, and reliability in all conditions.
Compute "Closeness" in Graphs using Apache Giraph - Robert Metzger
The document describes validating different implementations for measuring closeness in graphs using the Apache Giraph framework. Statistical tests show the HyperLogLogSketch implementation exhibits the highest correlation with the baseline "bitfield" implementation on two datasets and outperforms other approaches in approximating closeness values while using less memory. Next steps involve benchmarking the implementations on larger datasets stored out-of-core in Giraph to see if the HyperLogLogSketch performance and accuracy results hold for bigger graphs.
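The closeness measure being approximated has a simple exact definition, which the bitfield and HyperLogLog implementations estimate at scale. A plain BFS version for a tiny, made-up graph (not the Giraph API):

```python
from collections import deque

def closeness(adj, source):
    """Closeness centrality: (n - 1) / sum of shortest-path distances.

    adj: dict mapping node -> iterable of neighbors (unweighted, connected).
    """
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    total = sum(dist.values())
    return (len(adj) - 1) / total if total else 0.0

# Path graph a - b - c: the middle node is "closest" to everything
path = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
```

Exact computation needs one BFS per vertex, which is why sketch-based approximations such as HyperLogLog become attractive on large graphs.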
This document outlines a project to visually inspect wind turbine blades using drones and artificial intelligence. It defines the problem of creating composite images from drone photos of blades on land and offshore. The proposed solution is to use a cross-correlation algorithm to combine images with 2500px and 3500px overlaps for on-land and offshore blades respectively. The initial results from this algorithm are promising, and future work involves expanding the algorithm to handle vertical shifts and using deep learning on an image database of offshore wind turbines.
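The cross-correlation step for estimating the overlap between consecutive blade photos can be sketched in 1D; the signals and shift below are synthetic, whereas the real pipeline works on image strips with roughly 2500-3500 px overlaps.

```python
import numpy as np

def overlap_delay(a, b):
    """Estimate how many samples b lags a by maximizing cross-correlation."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")
    return (len(b) - 1) - int(np.argmax(corr))

# b is a copy of a delayed by 7 samples
rng = np.random.default_rng(1)
a = rng.normal(size=200)
b = np.concatenate([np.zeros(7), a[:-7]])
delay = overlap_delay(a, b)
```

Extending this to 2D image patches adds a second lag axis, which is where the planned handling of vertical shifts comes in.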
CMOS (complementary metal oxide semiconductor) technology continues to be the
dominant technology for fabricating integrated circuits (ICs or chips). This dominance
will likely continue for the next 25 years and perhaps even longer. Why? CMOS
technology is reliable, manufacturable, low power, low cost, and, perhaps most
importantly, scalable. The fact that silicon integrated circuit technology is scalable was
observed and described in 1965 by Intel founder Gordon Moore. His observations are
now referred to as Moore's law and state that the number of devices on a chip will double
every 18 to 24 months. While originally not specific to CMOS, Moore's law has been
fulfilled over the years by scaling down the feature size in CMOS technology. Whereas
the gate lengths of early CMOS transistors were in the micrometer range (long-channel
devices), the feature sizes of current CMOS devices are in the nanometer range
(short-channel devices).
To encompass both the long- and short-channel CMOS technologies in this book,
a two-path approach to custom CMOS integrated circuit design is adopted. Design
techniques are developed for both and then compared. This comparison gives readers
deep insight into the circuit design process. While the square-law equations used to
describe MOSFET operation that students learn in an introductory course in
microelectronics can be used for analog design in a long-channel CMOS process, they are
not useful when designing in short-channel, or nanometer, CMOS technology. The
behavior of the devices in a nanometer CMOS process is quite complex; simple
equations to describe the devices' behavior are not possible. Rather, electrical plots are
used to estimate biasing points and operating behavior. It is still useful, however, for the
student to use mathematical rigor when learning circuit analysis and design and, hence,
the reason for the two-path approach. Hand calculations can be performed using a
long-channel CMOS technology with the results then used to describe how to design in a
nano-CMOS process.
A deep learning model using convolutional neural networks is proposed for lithography hotspot detection. The model takes layout clip images as input and outputs a prediction of hotspot or non-hotspot. It uses several convolutional and pooling layers to automatically learn features from the images without manual feature engineering. Evaluation shows the deep learning model achieves higher accuracy than previous shallow learning methods that rely on manually designed features.
This document presents an example of slab analysis and design using ETABS. The example examines a simple single-story building that is regular in plan and elevation, and compares the ultimate moments calculated by CSI ETABS and SAFE with hand calculations. Moment coefficients were used to calculate the ultimate moment; it is good practice to use such hand analysis methods to verify the output of more sophisticated tools.
The document also contains a simple step-by-step procedure for designing a solid slab according to Eurocode 2. The process of designing elements will not be revolutionised by using Eurocode 2. Due to time and knowledge constraints, not every issue could be addressed.
Ph.D. course notes:
INTRODUCTION TO STRUCTURAL OPTIMIZATION
Part I
Lecture of the Ph.D. Course on STRUCTURAL OPTIMIZATION
May 28, 2014
Corso di dottorato in Ottimizzazione Strutturale: applicazione mensola strall... - StroNGER2012
Ph.D. course notes:
INTRODUCTION TO STRUCTURAL OPTIMIZATION
Part II
Lecture of the Ph.D. Course on STRUCTURAL OPTIMIZATION, 2nd part
May 28, 2014
Corso di dottorato in Ottimizzazione Strutturale: applicazione mensola strall... - Franco Bontempi
VLSI stands for Very Large Scale Integration and refers to integrated circuits with over 100,000 transistors. The document discusses the history and progression of integration levels from SSI to VLSI to ULSI. It also describes the photolithography process used to etch circuit designs onto silicon wafers at the microscopic level needed for modern integrated circuits.
A basic tutorial on using Wannier90 with the VASP code. Includes a brief overview of Wannier functions, tips on how to build VASP with Wannier90 support, and how to use the VASP/Wannier90 interface to compute an HSE06 band structure and perform some other Wannier90 post processing.
Metallization techniques for high efficiency solar cells - Mehul Raval
The document discusses several metallization techniques for high-efficiency solar cells, including:
1) Advanced screen printing techniques that can reduce line widths and increase short circuit current.
2) Two-layer metallization approaches using a seed layer and light-induced plating to thicken the contact layer and improve aspect ratios.
3) Emerging techniques like pad printing, metal aerosol jetting, and laser sintering of metal powders that can print even narrower line widths compared to screen printing.
Efficient Finite Element Computation of Circulating Currents in Thin Parallel... - Antti Lehikoinen
My poster for the International Conference on the Computation of Electromagnetic Fields (Compumag 2015).
I developed a non-conforming meshing approach for stranded conductors, resulting in a significant reduction in the degrees of freedom and computation times for loss calculation.
The document discusses the economics and challenges of scaling in VLSI chip design. It describes how Moore's Law has led to an exponential increase in the number of transistors that can fit on a chip over time. However, continued scaling is facing challenges related to power consumption, leakage currents, and interconnect delays as features sizes shrink further. Design costs are also rising as more engineers and advanced tools are needed to design increasingly large and complex chips.
The document discusses the economics and challenges of scaling in VLSI chip design. It describes how Moore's Law has led to an exponential increase in the number of transistors that can fit on a chip over time. However, continued scaling is facing challenges related to power consumption, leakage currents, and interconnect delays as features sizes shrink further. Design costs are also rising as more engineers and advanced tools are needed to design increasingly large and complex chips.
This document provides an introduction and overview of stochastic frontier analysis, which models a production frontier as a stochastic function to account for noise in production. It discusses estimating the parameters of a stochastic frontier model using maximum likelihood, predicting technical efficiency at the firm and industry level, and hypothesis testing using likelihood ratio tests. The key steps are estimating the stochastic frontier model, predicting technical efficiencies based on the estimates, and testing hypotheses about inefficiency effects.
This document provides structural calculations for the main canopy of a building located in Mumbai. It includes STAAD analysis of the steel structure, material properties, load assumptions, and results of the analysis. Key sections analyzed include the outer MS frame, inner MS frame, supporting MS pipes and tubes. Loads considered are self-weight, wind load, and live load. The analysis checks the steel structure for deflection under these loads.
This document summarizes Ashok Prabhu Masilamani's Ph.D. presentation on advanced silicon microring resonator devices for optical signal processing. It introduces microring resonators and their use in optical filters. It outlines Masilamani's research goals to explore new coupled microring topologies that can realize complex transfer functions. The document demonstrates experimental fabrication and testing of microring filters in silicon-on-insulator material. It also shows thermal tuning of microring resonances using integrated microheaters. The research contributes new coupled microring architectures and synthesis techniques for advanced optical signal processing.
The document provides details about the acceptance testing and commissioning of a new TrueBeam linear accelerator installed at the facility. Some key details include:
- The machine was installed in an existing bunker previously occupied by a Siemens Primus Plus with additional shielding added.
- Acceptance testing verifies a small subset of beam data based on manufacturer guidelines to check specifications, while commissioning involves comprehensive beam measurements and treatment planning system configuration.
- Beam data measurements included depth doses, profiles, output, symmetry, flatness, and other dosimetric parameters which were analyzed and entered into the treatment planning system.
- Electron and photon beam energies and characteristics were evaluated to ensure they met tolerance limits. Other
This document discusses load flow analysis and loss allocation methods for unbalanced radial power distribution systems. The objectives are to develop a fast three-phase load flow method and an active loss allocation scheme for unbalanced distribution networks. It presents a proposed load flow method based on a forward/backward sweep approach with a new bus identification and multiphase data handling scheme. Test results on sample systems show the proposed method has fewer iterations and faster computation time compared to other established methods.
IEEE Student Branch Chittagong University arranged a webinar titled "From APECE to ASML A Semiconductor Journey". Shawn Millat shared his working experience in Semiconductor industry and also shared tips about studying in Germany.
This document discusses bifacial solar cells and modules. It provides the following key points:
- Bifacial solar modules can provide a power gain of 7-9% compared to standard modules, or up to 30% when used with a special tracking system.
- A test of bifacial modules on a TRAXEL system in the Czech Republic showed a significant advantage over standard modules of similar capacity.
- Multi-crystalline silicon is suitable for bifacial applications despite lower minority carrier lifetimes. Measurements of diffusion lengths on multi-crystalline bifacial cells showed lengths over 300 microns.
- The "light trapping" effect can impact short circuit current
Recent developments in the field of reduced order modeling - and in particular, active subspace construction - have made it possible to efficiently approximate complex models by constructing low-order response surfaces based upon a small subspace of the original high dimensional parameter space. These methods rely upon the fact that the response tends to vary more prominently in a few dominant directions defined by linear combinations of the original inputs, allowing for a rotation of the coordinate axis and a consequent transformation of the parameters. In this talk, we discuss a gradient free active subspace algorithm that is feasible for high dimensional parameter spaces where finite-difference techniques are impractical. We illustrate an initialized gradient-free active subspace algorithm for a neutronics example implemented with SCALE6.1.
The document describes a regression analysis of housing prices in Boston suburbs using 14 variables. A full model found some variables like industrial proportion and age to be insignificant. A reduced model and transforming the dependent variable to its natural log improved the model fit. Stepwise selection confirmed one more variable as insignificant. The final model with 9 significant variables had high predictive power.
Similar to HOW 2019: Machine Learning for the Primary Vertex Reconstruction (20)
Modern binary build systems have made shipping binary packages for Python much easier than ever before. This talk discusses three of the most popular build systems for Python packages using the new standards developed for packaging.
This document discusses software quality assurance tooling, focusing on pre-commit. It introduces pre-commit as a tool for running code quality checks before code is committed. Pre-commit allows configuring hooks that run checks and fixers on files matching certain patterns. Hooks can be installed from repositories and support many languages including Python. The document provides examples of pre-commit checks such as disallowing improper capitalization in code comments and files. It also discusses how to configure, run, update and install pre-commit hooks.
The document summarizes Henry Schreiner's work on several Python and C++ scientific computing projects. It describes a scientific Python development guide built from the Scikit-HEP summit. It also outlines Henry's work on pybind11 for C++ bindings, scikit-build for building extensions, cibuildwheel for building wheels on CI, and several other related projects.
Flake8 is a Python linter that is fast, simple, and extensible. It can be configured through setup.cfg or .flake8 files to ignore certain checks or select others. The summary recommends using the flake8-bugbear plugin and avoiding all print statements with flake8-print. Linters like Flake8 help find errors, improve code quality, and avoid historical baggage, but one does not need every check and it is okay to build a long ignore list.
The document describes various productivity tools for Python development, including:
- Pre-commit hooks to run checks before committing code
- Hot code reloading in Jupyter notebooks using the %load_ext and %autoreload magic commands
- Cookiecutter for generating project templates
- SSH configuration files and escape sequences for easier remote access
- Autojump to quickly navigate frequently visited directories
- Terminal tips like command history search and referencing the last argument
- Options for tracking Jupyter notebooks with git like stripping outputs or synchronizing notebooks and Python files.
SciPy22 - Building binary extensions with pybind11, scikit build, and cibuild...Henry Schreiner
Building binary extensions is easier than ever thanks to several key libraries. Pybind11 provides a natural C++ language for extensions without requiring pre-processing or special dependencies. Scikit-build ties the premier C++ build system, CMake, into the Python extension build process. And cibuildwheel makes it easy to build highly compatible wheels for over 80 different platforms using CI or on your local machine. We will look at advancements to all three libraries over the last year, as well as future plans.
This document discusses the history and development of Python packages for high energy physics (HEP) analysis. It describes how experiments initially used ROOT and C++, but Python gained popularity for configuration and analysis. This led to the creation of packages like Scikit-HEP, Uproot, and Awkward Array to bridge the gap between ROOT files and the Python data science stack. Scikit-HEP grew to include many related packages and provides best practices through its developer pages. The future may include adopting Scikit-build for building Python packages with C/C++ extensions and running packages in the browser via WebAssembly.
PyCon 2022 -Scikit-HEP Developer Pages: Guidelines for modern packagingHenry Schreiner
This was a PyCon 2022 lightning talk over the Scikit-HEP developer pages. It highlights best practices and guides shown there, and the quick package creation cookiecutter. And finally it demos the Pyodide WebAssembly app embedded into the Scikit-HEP developer pages!
Talk at PyCon2022 over building binary packages for Python. Covers an overview and an in-depth look into pybind11 for binding, scikit-build for creating the build, and build & cibuildwheel for making the binaries that can be distributed on PyPI.
Digital RSE: automated code quality checks - RSE group meetingHenry Schreiner
Given at a local RSE group meeting. Covers code quality practices, focusing on Python but over multiple languages, with useful tools highlighted throughout.
This document provides best practices for using CMake, including:
- Set the cmake_minimum_required version to ensure modern features while maintaining backward compatibility.
- Use targets to define executables and libraries, their properties, and dependencies.
- Fetch remote dependencies at configure time using FetchContent or integrate with package managers like Conan.
- Import library targets rather than reimplementing Find modules when possible.
- Treat CUDA as a first-class language in CMake projects.
HOW 2019: A complete reproducible ROOT environment in under 5 minutesHenry Schreiner
The document discusses setting up a ROOT environment using Conda in under 5 minutes. It describes downloading and installing Miniconda and then using Conda commands to create a new environment and install ROOT and its dependencies from the conda-forge channel. The ROOT package provides full ROOT functionality, including compilation and graphics, and supports Linux, macOS, and multiple Python versions.
2019 IRIS-HEP AS workshop: Boost-histogram and histHenry Schreiner
The document discusses the current state of histograms in Python and the need for a new histogramming library. It introduces boost-histogram, a C++ histogramming library, and its new Python bindings. The bindings aim to provide a fast, flexible and easily distributable histogram object for Python. Key features discussed include histogram design that treats it as a first-class object, fast filling via multi-threading, a variety of axis and storage types, and performance benchmarks showing it can be over 10x faster than NumPy for filling histograms. Distribution is focused on providing binary wheels for many platforms via continuous integration.
The document discusses the current state of histograms in Python and the need for a new library. It introduces boost-histogram, a C++ histogram library, and its new Python bindings. The bindings aim to provide a fast, flexible, and easily distributable histogram object for Python with support for multiple axis types and storage options. It also discusses plans for an additional wrapper library called hist for easy plotting and interfacing with other tools.
2019 IRIS-HEP AS workshop: Particles and decaysHenry Schreiner
The Scikit-HEP project aims to create an ecosystem for particle physics data analysis in Python. It includes packages like Particle and DecayLanguage that provide tools for working with particle data and decay descriptions. Particle allows users to easily access and search particle property data from sources like the PDG. DecayLanguage allows parsing decay file formats, representing and manipulating decay chains, and converting between decay model representations. Future work includes expanding particle ID support and improving visualization of decay trees.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
HOW 2019: Machine Learning for the Primary Vertex Reconstruction
1. Machine Learning for Primary Vertex Reconstruction
Rui Fang [1], Henry Schreiner [1,2], Mike Sokoloff [1], Constantin Weisser [3], Mike Williams [3]
[1] The University of Cincinnati, [2] Princeton University, [3] Massachusetts Institute of Technology
HOW 2019, March 20, 2019
Supported by: (sponsor logos)
2. [Figure: PV-finding efficiency vs. number of LHCb long tracks, with the distribution of PVs per track count below]
• Asymmetric cost function: found 103002 of 109733 PVs (eff 93.87%), false positive rate = 0.251 per event
• Symmetric cost function: found 96616 of 109733 PVs (eff 88.05%), false positive rate = 0.0485 per event
• Events in test sample = 20K; training sample = 240K
1/14 | Fang, Schreiner, Sokoloff, Weisser, Williams | ML for PV Reconstruction | March 20, 2019
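The headline numbers on the summary slide follow directly from the quoted counts; a quick arithmetic check (all counts taken from the slide):

```python
# Counts quoted on the summary slide (test sample of 20K events).
total_pvs = 109733

found_asym = 103002          # asymmetric cost function
found_symm = 96616           # symmetric cost function

eff_asym = 100 * found_asym / total_pvs
eff_symm = 100 * found_symm / total_pvs

print(f"asymmetric: eff = {eff_asym:.2f}%")   # 93.87%, FP rate 0.251/event
print(f"symmetric:  eff = {eff_symm:.2f}%")   # 88.05%, FP rate 0.0485/event
```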
3. Tracking in the LHCb upgrade (Introduction)
The changes
• 30 MHz software trigger
• 7.6 PVs per event (Poisson distribution)
• Roughly 5.5 visible PVs per event
The problem
• Much higher pileup
• Very little time to do the tracking
• Current algorithms too slow
We need to rethink our algorithms from the ground up...
4. Vertices and tracks (Introduction)
Vertices
• Events contain ≈ 7 primary vertices (≈ 5 visible PVs); a PV should contain 5+ long tracks
• Multiple secondary vertices (SVs) per event as well; an SV should contain 2+ tracks
[Diagram: colliding beams producing a PV with tracks and a downstream SV]
Adapt to machine learning?
• Sparse 3D data (41M pixels) → rich 1D data
• 1D convolutional neural nets
• Highly parallelizable, GPU friendly
• Opportunities to visualize the learning process
5. A hybrid ML approach (Introduction)
[Pipeline diagram: Tracking → Kernel generation → CNNs make predictions → Interpret results, with truth information used for training and validation]
Machine learning features (so far)
• Prototracking converts a sparse 3D dataset to a feature-rich 1D dataset
• Easy and effective visualization due to the 1D nature
• Even simple networks can provide interesting results
What follows is a proof-of-principle implementation for finding PVs.
6-11. Kernel generation (Design)
Tracking procedure
• Hits lie on the 26 planes (for simplicity, only 3 tracks shown)
• Make a 3D grid of voxels (2D shown); note: only z will be fully calculated and stored
• Run tracking (full or partial)
• Fill in each voxel center with a Gaussian PDF
• Combine the PDFs for each (proto)track
• Fill the z "histogram" with the maximum KDE value in xy
[Diagram: z axis (along the beam) vs. x, showing tracks converging on a PV and the resulting 1D kernel]
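The kernel-generation procedure on these slides can be sketched in a few lines. This is a pure-Python illustration under simplifying assumptions (straight-line tracks, a single isotropic Gaussian width, a coarse xy grid); the function and variable names are illustrative, not from the authors' code:

```python
from math import exp

def z_kde(tracks, z_centers, xy_grid, sigma=0.05):
    """Collapse 3D track information into the 1D z 'histogram'
    described on the slides: at each z, sum a Gaussian kernel
    around every track's xy intercept, then keep the maximum
    summed density over the xy grid."""
    hist = []
    for z in z_centers:
        best = 0.0
        for gx, gy in xy_grid:
            density = 0.0
            for (px, py, pz), (dx, dy, dz) in tracks:
                # propagate the straight-line track to this z plane
                t = (z - pz) / dz
                x, y = px + t * dx, py + t * dy
                d2 = (gx - x) ** 2 + (gy - y) ** 2
                density += exp(-0.5 * d2 / sigma ** 2)
            best = max(best, density)
        hist.append(best)
    return hist
```

Tracks that converge on a common vertex pile up at a single z, so the histogram peaks there; taking the maximum (rather than the sum) over xy keeps vertices displaced from the beamline visible.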
12. Example of a z KDE histogram (Design)
[Figure: kernel density vs. z (mm), with LHCb PVs, other PVs, LHCb SVs, and other SVs marked]
Note: all events are from a toy detector simulation.
Human learning
• Peaks generally correspond to PVs and SVs
Challenges
• A vertex may be offset from its peak
• Vertices interact
13. Target distribution (Design)
Build the target distribution:
• True PV position as the mean of a Gaussian
• σ (standard deviation) is 100 µm (a simplification)
• Fill bins with the integrated PDF within ±3 bins (±300 µm)
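The target-building recipe above maps directly to code. A minimal sketch (bin edges in mm; names illustrative), using the Gaussian CDF to integrate the PDF over each bin:

```python
from math import erf, sqrt

def gauss_cdf(x, mu, sigma):
    """Cumulative distribution of a Gaussian."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def target_hist(pv_z, edges, sigma=0.1, window=3):
    """Target distribution from the slide: a Gaussian centered on the
    true PV position with sigma = 100 um (0.1 mm), integrated over
    each bin, and filled only within +/-3 bins (+/-300 um)."""
    n = len(edges) - 1
    target = [0.0] * n
    # bin containing the true PV
    center = next(i for i in range(n) if edges[i] <= pv_z < edges[i + 1])
    for i in range(max(0, center - window), min(n, center + window + 1)):
        target[i] = (gauss_cdf(edges[i + 1], pv_z, sigma)
                     - gauss_cdf(edges[i], pv_z, sigma))
    return target
```

Because the window spans ±3 bins of 100 µm around a 100 µm Gaussian, almost all of the unit probability mass lands inside it.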
15. Cost function (Design)
[Figure: per-bin cost vs. predicted value ŷ, comparing symmetric and asymmetric cost curves for targets y = 1e-5, 0.10, and 0.30]
Approach
• Symmetric cost function: low FP rate but low efficiency
• Adding an asymmetry term controls the trade-off between FP rate and efficiency
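The slide shows the symmetric and asymmetric cost curves but not their formulas. As a purely illustrative sketch of the idea (this is a hypothetical cost, not the authors' exact formula): take a standard cross-entropy term and add an asymmetry term that extra-penalizes under-predicting a nonzero target, so raising the asymmetry parameter pushes the network toward higher efficiency at the price of more false positives:

```python
from math import log

def bin_cost(y, yhat, asym=0.0, eps=1e-9):
    """Hypothetical per-bin cost (NOT the authors' exact formula).
    The symmetric part is a binary cross-entropy; the asymmetry term
    penalizes predicting ~0 where the target is nonzero, trading
    false positives for efficiency as `asym` grows."""
    symm = -(y * log(yhat + eps) + (1 - y) * log(1 - yhat + eps))
    miss = asym * y * (1 - yhat) ** 2
    return symm + miss
```

The asymmetry term vanishes for y = 0, so only bins containing a true vertex are affected; that is what lets one knob trade FP rate against efficiency.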
16. False positive and efficiency rates (Results)
[Figure: FP per event vs. efficiency (%), on linear and log scales, spanning the symmetric cost to the most asymmetric cost]
Search for PVs
• Search ±5 bins (±500 µm) around a true PV
• Require at least 3 bins with predicted probability > 1% and an integrated probability > 20%
Tunable efficiency vs. FP
• The asymmetry parameter controls the FP rate vs. efficiency trade-off
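The search criterion in the bullets is concrete enough to write down. A sketch (names illustrative), taking a per-bin predicted probability array and the bin index of a true PV:

```python
def pv_found(pred, true_bin, window=5, min_bins=3,
             bin_thresh=0.01, int_thresh=0.20):
    """Matching criterion from the slide: within +/-`window` bins of
    the true PV bin, require at least `min_bins` bins with predicted
    probability > 1% AND an integrated probability > 20%."""
    lo = max(0, true_bin - window)
    hi = min(len(pred), true_bin + window + 1)
    nearby = pred[lo:hi]
    n_hot = sum(1 for p in nearby if p > bin_thresh)
    return n_hot >= min_bins and sum(nearby) > int_thresh
```

Requiring several hot bins as well as integrated probability guards against counting a single noisy bin as a found vertex.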
17. Compare predictions with targets: when it works (Results)
[Figure, PV found: kernel density, xy maxima, and target vs. predicted probability around z = 48.9 mm (Event 0). True: 48.904 mm, predicted: 48.954 mm, Δ: 50 µm]
[Figure, masked example: the same panels around z = 1.0 mm (Event 0). The prediction at 0.976 mm is masked because the vertex has fewer than 5 tracks]
18. Compare predictions with targets: when it fails (Results)
[Figure, false positive: kernel density, xy maxima, and target vs. predicted probability around z = 65.7 mm (Event 2). A vertex is predicted at 65.696 mm where no true PV exists]
[Figure, PV not found: the same panels around z = 51.9 mm (Event 3). The true PV at 51.898 mm is missed]
19. Conclusions and plans Future plans
0 5 10 15 20 25 30 35 40 45 50 55 60
# LHCb long tracks
0.0
0.1
0.2
0.3
0.4
0.5
0.6
0.7
0.8
0.9
1.0
Efficiency
• Proof-of-principle established: a hybrid ML algorithm using a 1-dimensional KDE processed by a 5-layer CNN finds primary vertices with efficiencies and false positive rates similar to traditional algorithms.
• Efficiency is tunable; increasing the efficiency also increases the false positive rate.
• Adding information should improve performance:
  • add KDE (x, y) information to the algorithm;
  • associate tracks to PV candidates, then iterate.
• Next steps: train with full LHCb MC and deploy the inference engine in the LHCb Hlt1 framework.
• Beyond LHCb:
  • the approach might work for ATLAS and CMS (in 2D?);
  • the algorithm is an interesting ML laboratory.
20. Final words (Future plans)
Questions?
Source code:
• https://gitlab.cern.ch/LHCb-Reco-Dev/pv-finder
• Runnable with Conda on macOS and Linux
Run: conda env create -f environment-gpu.yml
Python 3.6+ and PyTorch are used for the machine learning code.
Event generation is now available too, using the new Conda-Forge ROOT and Pythia8 packages.
Supported by:
• NSF OAC-1836650: IRIS-HEP
• NSF OAC-1740102: SI2:SSE
• NSF OAC-1739772: SI2:SSE
21. More predictions with targets (1) (Backup)
[Figure: PV found (Event 2 @ 114.6 mm). True PV at 114.622 mm, predicted at 114.597 mm (Δ = -26 µm).]
[Figure: PV found (Event 5 @ 197.4 mm). True PV at 197.461 mm, predicted at 197.396 mm (Δ = -65 µm).]
22. More predictions with targets (2) (Backup)
[Figure: PV found (Event 5 @ 221.5 mm). True PV at 221.595 mm, predicted at 221.546 mm (Δ = -49 µm).]
[Figure: PV found (Event 6 @ 36.1 mm). True PV at 36.068 mm, predicted at 36.400 mm (Δ = 332 µm).]
23. More predictions with targets (3) (Backup)
[Figure: PV found (Event 6 @ 129.3 mm). True PV at 129.336 mm, predicted at 129.337 mm (Δ = 1 µm).]
[Figure: PV found (Event 6 @ 143.2 mm). True PV at 143.224 mm, predicted at 143.199 mm (Δ = -25 µm).]
24. More predictions with targets (4) (Backup)
[Figure: PV found (Event 6 @ 150.4 mm). True PV at 150.650 mm, predicted at 150.416 mm (Δ = -234 µm).]
[Figure: PV found (Event 6 @ 179.6 mm). True PV at 179.560 mm, predicted at 179.591 mm (Δ = 31 µm).]
25. The VELO (Backup)
Tracks
• Originate from vertices (not shown)
• Hits originate from tracks
• We only know the true track in simulation
• Nearly straight, but tracks may scatter in material
The VELO
• A set of 26 planes that detect tracks
• Tracks should hit one or more pixels per plane
• Sparse 3D dataset (41M pixels)
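The 1D kernel density that feeds the CNN can be sketched from these tracks as follows; the input names and shapes are assumptions for illustration, not the project's actual kernel generation code:

```python
import numpy as np

def z_kernel_density(track_z, track_sigma, z_edges):
    """Sum per-track Gaussians into a 1D kernel density in z.

    Illustrative sketch: each reconstructed track contributes a
    Gaussian centred at its z position at the point of closest
    approach to the beamline, with width given by its estimated
    uncertainty.  Summing over tracks gives a 1D KDE of the kind
    the CNN consumes.
    """
    centers = 0.5 * (z_edges[:-1] + z_edges[1:])  # bin centres in mm
    kde = np.zeros_like(centers)
    for z0, sigma in zip(track_z, track_sigma):
        kde += np.exp(-0.5 * ((centers - z0) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return centers, kde
```

Tracks that share a vertex pile their Gaussians into a sharp peak, which is what the network learns to recognise.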
26. Questions for other experiments (Backup)
• Beam width (x, y): 40 µm for LHCb; what is yours?
• Transverse resolution: 5–15 µm for LHCb, depending on the number of tracks; what is yours?
• Longitudinal resolution: 40–100 µm for LHCb, depending on the number of tracks; what is yours?
• Cleaning up prototracks based on IP could simplify the kernel
• Can prototracking be done in the triggers?