This document discusses uncertainty in dispersion models used for air quality predictions. It notes that uncertainties should be routinely tracked for policy decisions, as in climate models. However, uncertainties are not commonly assessed for atmospheric pollution dispersion models. It recommends propagating uncertainties through parametric sampling and sensitivity analysis to determine influential parameters. Global sensitivity methods can evaluate model complexity and parameter importance. Closing the modeling loop through refinement informed by sensitivity analysis could help reduce prediction uncertainties.
1. The treatment of uncertainty in dispersion models
Alison S. Tomlin
Energy and Resources Research Institute
SPEME, Faculty of Engineering
2. Why track uncertainties?
• In climate modelling the current expectation is that
uncertainties will be tracked within model predictions.
• Phrases such as “high confidence”, “likely”, “more likely than not” etc. are presented alongside mean predictions and confidence limits.
• This is critical since such predictions are to be used in
decision making and policy formulation.
• Urban air quality models are also used to help form policy
on pollution mitigation strategies.
• So why are uncertainties in such model predictions not routinely tracked?
3. Hierarchy of model complexity
[Diagram: model types ordered from most computationally expensive and most detailed in their chemical and physical representations to least]
• Direct numerical simulation with detailed chemistry
• Large Eddy Simulation
• Reynolds Averaged CFD + Lagrangian
• Semi-empirical, e.g. Gaussian or network models
4. Types of model uncertainty
Structural uncertainty:
➢ Missing physical/chemical processes within the model.
➢ Over-simplification of processes to save compute power.
e.g. reduced chemistry, averaged turbulence representations.
➢ Influence of grid resolution.
Parametric uncertainty:
➢ Complex models contain large numbers of parameters
that are all uncertain.
e.g. kinetic reaction rates, roughness lengths, mixing
timescales, emissions, flux rates etc.
5. Propagating Uncertainties
➢ Parametric uncertainties can be propagated through
sampling methods using multiple model runs.
➢ Structural uncertainty more difficult to assess:
the “Unknown unknowns” dilemma.
➢ Neither approach commonly used in atmospheric pollution
dispersion models!
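A minimal sketch of this sampling approach, assuming a hypothetical toy function in place of a real dispersion model (toy_model and its inputs emission_rate, wind_speed and sigma_z are illustrative placeholders, not parameters from this work):

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in for a dispersion model: a crude Gaussian-plume-like
    # expression giving a concentration from an emission rate, wind speed and a
    # vertical spread parameter (purely illustrative).
    def toy_model(emission_rate, wind_speed, sigma_z):
        return emission_rate / (np.pi * wind_speed * sigma_z**2)

    # Assumed uncertainty ranges for the inputs (illustrative values only).
    n = 1000
    emission_rate = rng.uniform(0.8, 1.2, n)   # g/s
    wind_speed    = rng.uniform(1.0, 5.0, n)   # m/s
    sigma_z       = rng.uniform(5.0, 15.0, n)  # m

    # One model run per sampled parameter set.
    concentration = toy_model(emission_rate, wind_speed, sigma_z)

    # Uncertainty analysis: summarise the predictive distribution.
    print(f"mean = {concentration.mean():.3e}")
    print(f"2.5-97.5 percentile interval = {np.percentile(concentration, [2.5, 97.5])}")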
6. Can we learn from other
communities?
Climate Change community tests for structural uncertainties
via the inter-comparison of models and observations.
9. Model inter-comparison for
dispersion models?
➢ A few examples of inter-comparisons between models of
the same type e.g. Gaussian based, different RANS
models.
➢ Few exercises that compare models of different structure to
assess their fitness for purpose.
What does fit for purpose mean?
• Within parametric uncertainties model overlaps with error
bars of measured data.
• Model can be extrapolated to new conditions (not easy for
over-fitted models) i.e. predictive.
• Other criteria? Doesn’t have to be physically realistic.
10. Evaluation of complex models
• Need to evaluate if models are fit for purpose.
• Comparison of model with experimental or field data for
simple to complex scenarios.
MUST INCLUDE
• how much confidence we can place in simulations.
• If lack of agreement then how do we find contributing
causes?
• Sensitivity and uncertainty analysis help to answer these
questions:
- need strong feedback loop between model
evaluation and methods for model improvement.
11. Sensitivity and uncertainty
analysis
• Uncertainty analysis (UA)
estimates the overall predictive
uncertainty of a model given the
state/or lack of knowledge about its
input parameters.
• UA puts error bars on predictions.
• Sensitivity analysis (SA) determines how much each input parameter contributes to the output uncertainty (usually variance).
Sensitivity indices Si at x = 2.2 m:
• Structure function coeff. C0: 0.7976
• Mixing time-scale coeff. α: 0.1853
• Σ Si: 0.9843
12. How can uncertainties be
traced?
• The most consistent ways to trace uncertainties are
1. To run several models with potentially different structures and
parameterisations.
2. To run each model using an ensemble or random sample of input
parameters in order to generate a distribution of predicted target
outputs.
[Figure: scatter of a random sample of Parameter 1 vs Parameter 2, spanning the estimated min and max of each parameter]
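The ensemble of input parameter sets sketched above could be generated, for example, with a quasi-random Sobol design scaled between the estimated min and max of each parameter. The two parameters, their ranges and the run_dispersion_model call below are hypothetical placeholders:

    import numpy as np
    from scipy.stats import qmc

    # Estimated min/max for two hypothetical inputs: wind direction (degrees)
    # and surface roughness length z0 (m). Values are illustrative only.
    l_bounds = [110.0, 0.01]
    u_bounds = [130.0, 1.00]

    # Quasi-random (Sobol) design of 64 parameter sets covering these ranges.
    sampler = qmc.Sobol(d=2, scramble=True, seed=1)
    unit_sample = sampler.random_base2(m=6)            # 2**6 = 64 points in [0, 1)^2
    param_sets = qmc.scale(unit_sample, l_bounds, u_bounds)

    # Each row is one model run; the model call is a hypothetical placeholder.
    outputs = []
    for wind_dir, z0 in param_sets:
        # outputs.append(run_dispersion_model(wind_dir=wind_dir, z0=z0))
        pass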
13. Scatter plots for urban RANS dispersion
model
The examples show possible nonlinearities and large scatter –
obscuring overall sensitivity to parameter.
[Scatter plots: vertical velocity in street and roof-level turbulence (outputs) against wind direction° (input)]
14. High Dimensional Model
Representations (HDMR)
• Developed to provide detailed mapping of the input variable space to
selected outputs – a meta-model.
• Output is expressed as a finite hierarchical function expansion:
• Meta-model built using a quasi-random sample and approximation of component functions by orthonormal polynomials.
• Used to generate partial variances and therefore sensitivity indices
• Si then ranked to give parameter importance.
f(\mathbf{x}) = f_0 + \sum_{i=1}^{n} f_i(x_i) + \sum_{1 \le i < j \le n} f_{ij}(x_i, x_j) + \dots + f_{12 \dots n}(x_1, x_2, \dots, x_n)
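A minimal sketch of the variance decomposition behind this expansion, assuming inputs rescaled to [0, 1]: first-order component functions are approximated by projection onto orthonormal (shifted Legendre) polynomials, and the resulting partial variances divided by the total variance give the first-order sensitivity indices Si. The test function and sample size are illustrative, not taken from this study:

    import numpy as np

    rng = np.random.default_rng(2)

    # Orthonormal shifted Legendre polynomials on [0, 1], orders 1-3.
    def phi(k, x):
        if k == 1:
            return np.sqrt(3.0) * (2.0 * x - 1.0)
        if k == 2:
            return np.sqrt(5.0) * (6.0 * x**2 - 6.0 * x + 1.0)
        return np.sqrt(7.0) * (20.0 * x**3 - 30.0 * x**2 + 12.0 * x - 1.0)

    def first_order_indices(X, y, order=3):
        """Estimate first-order sensitivity indices S_i from an HDMR-style
        expansion f_i(x_i) ~ sum_k a_ik * phi_k(x_i), with the coefficients
        a_ik obtained by Monte Carlo projection onto the orthonormal basis."""
        y_centred = y - y.mean()                    # remove f0
        total_variance = y.var()
        S = np.empty(X.shape[1])
        for i in range(X.shape[1]):
            coeffs = [np.mean(y_centred * phi(k, X[:, i])) for k in range(1, order + 1)]
            S[i] = np.sum(np.square(coeffs)) / total_variance   # partial / total variance
        return S

    # Toy test function: x0 and x1 influential, x2 inert.
    X = rng.random((4096, 3))                       # inputs scaled to [0, 1]
    y = 4.0 * (X[:, 0] - 0.5) ** 2 + X[:, 1]

    print(first_order_indices(X, y))                # roughly [0.5, 0.5, 0.0]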
15. Component functions for
previous example
[Figures: first-order component functions of vertical velocity in street and roof-level turbulence against wind direction°, and a component function over wind direction° and surface roughness length]
16. Closing the loop
• High ranked parameters could then be re-estimated using ab
initio modelling studies or simple experiments which help to
isolate their effects.
• If the parameter can be re-estimated with better certainty
then the error bars of the prediction are reduced.
• In some cases, even within the error bars, there is no overlap between experimental and modelled outputs: this suggests structural problems with the model.
• For this reason comparisons of model vs observations
should include error bars on BOTH!
17. Example: York street canyon simulation of [NO2]
[Figure: signal-controlled intersection; TKE and wind vectors for θref = 90° to the canyon axis]
18. Complex model couplings
[Diagram: vehicle characteristics and traffic network information feed a traffic micro-simulation model; the resulting vehicle speeds and accelerations drive an instantaneous emissions model; meteorology and building layouts feed a CFD flow model (MISKAM k-ε); the emissions and the flow and turbulence fields are combined in a dispersion model to give pollution concentrations]
19. Parameter set varied in global sensitivity study
Model parameters (26 in total), spanning model parametrisation, physical/meteorological and traffic/emissions inputs:
➢ velocity structure function coefficient C0
➢ mixing time-scale coefficient α
➢ surface roughness z0 for inlet, surface and wall
➢ temperature-dependent rate parameters for NO/NO2/O3 reactions, photolysis rate parameters for JO3 and JNO2
➢ wind direction θref
➢ temperature
➢ background [O3]
➢ NO:NOx ratio
➢ traffic demand
20. Sample sensitivity results (θref = 110-130°): average over 6 in-canyon road-side locations: mean [NO2]
[Bar chart: average sensitivity index (0 to 0.45) for each parameter, off-peak vs peak traffic]
21. Summary of findings
• Based on the assumed uncertainties in input
parameters the predicted roadside increments in
[NO2] can vary by around a factor of 2.
• Using the HDMR approach the influence of
traffic related parameters e.g. demand, primary
NO2 fraction, can be assessed within the overall
data scatter.
• For sites away from the traffic queue the wind
direction is the single most important parameter
affecting predicted NO2.
• Model parameters such as surface roughness lengths are influential.
• Chemical parameters (for NO+O3) are important for sites near the
junction.
• Background O3 appears not to be too influential for this site.
22. Conclusions
• Propagating uncertainties within pollution dispersion models needs to
become more routine.
➢ Doesn’t come without a computational cost.
• Uncertainties can be significant, e.g. causing a factor of 2 variability in predicted concentrations.
• Using global sensitivity/meta-modelling approaches it is still possible to trace the influence of controllable parameters, e.g. traffic demand, on target outputs using component functions.
• Wind speed and direction are key parameters causing variability and yet we have very few truly urban met stations.
We can learn from other communities!
23. HDMR software with graphical
user interface freely available
Main author: Tilo Ziehn
24. Example
• Need to be able to model formation of secondary pollutants such as NO2
in order to test the impact of emission reduction strategies.
• Computational fluid dynamics (CFD) models becoming more common in
pollution and emergency response management.
• Different turbulence closure schemes in use:
Reynolds Averaged Navier Stokes (RANS), Large Eddy Simulation
(LES).
• Also different dispersion schemes:
Lagrangian particle, stochastic fields, eddy diffusivity.
• Urgent need to evaluate where different approaches are “fit for purpose”
and the impacts of parametrisations of turbulence, mixing and kinetics.
25. Convergence with sample size
[Figure: mean and variance of predicted NO2 concentration (molecules cm-3, 0 to 9×10^11) against sample size in the Sobol sequence (0 to 120)]
64 samples are enough to give accurate estimates of the mean and variance of predicted [NO2].
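A sketch of this kind of convergence check, assuming a toy scalar output in place of the dispersion model; the function and sample sizes are illustrative:

    import numpy as np
    from scipy.stats import qmc

    # Toy stand-in for the model output as a function of two inputs in [0, 1).
    def toy_output(params):
        return params[:, 0] ** 2 + 0.5 * np.sin(np.pi * params[:, 1])

    sampler = qmc.Sobol(d=2, scramble=True, seed=3)
    sample = sampler.random_base2(m=7)       # 2**7 = 128 points
    y = toy_output(sample)

    # Running mean and variance at increasing sample sizes: if the values have
    # stabilised by n = 64, that sample size is sufficient for these summaries.
    for n in (8, 16, 32, 64, 128):
        print(f"n={n:4d}  mean={y[:n].mean():.4f}  variance={y[:n].var():.4f}")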
What we want to know is which of the 26 input
parameters contribute most to this predictive variance
due to their estimated uncertainties i.e.
what are the global sensitivities for each parameter?