This document summarizes a thesis on numerical methods for stochastic systems subject to generalized Lévy noise. It includes:
1) Motivation for studying such systems from both mathematical and applied perspectives, with examples from mathematical finance and chaotic flows.
2) An introduction to Lévy processes and the probabilistic collocation method (PCM) for uncertainty quantification (UQ).
3) Details on improving PCM through a multi-element approach and on constructing orthogonal polynomials for discrete measures.
This document summarizes Kristina Hettne's PhD thesis defense on applying next-generation text mining to toxicogenomics data analysis. The thesis investigated improving information coverage in biomedical and chemical thesauri used for text mining by developing new chemical concept identification methods. A next-generation text mining approach was developed to statistically relate chemical information to gene expression data, allowing identification of toxicity effects at an earlier stage than manual curation alone. The approach was shown to complement and sometimes outperform existing databases, with potential to reduce animal testing through early prediction of drug toxicity.
This document is the souvenir/abstract book for the 15th Annual Congress of the Indian Fertility Society titled "Fertivision 2019". It provides information on the conference including:
- Details of the conference theme, dates, and location in Gurugram, India.
- Descriptions of the workshops and main scientific program over three days covering the latest developments in reproductive medicine.
- Messages from the President and Secretary General of the Indian Fertility Society welcoming delegates.
- The full scientific program schedule, abstracts of talks, and list of international and national faculty speaking at the event.
Modern BioManufacturing: Single-Use Technologies in Configurable, Prefabricat... (Merck Life Sciences)
Merck KGaA provides end-to-end solutions for biomanufacturing facilities including process development, clinical supply, and GMP production. They are partnering with G-CON Manufacturing to offer a turnkey modular solution that integrates a single-use process train within a prefabricated and rapidly deployable cleanroom module. This allows clients to scale biomanufacturing capacity quickly and flexibly in a cost-effective and clonable manner to address challenges in emerging biopharma markets.
This document summarizes a review of randomized controlled trials (RCTs) and meta-analyses from 1990-2004 on factors affecting the success of embryo transfer (ET). The review identified 2 Cochrane reviews, 5 meta-analyses, and 34 RCTs. Key findings included that pregnancy and implantation rates were increased with trial transfers, ultrasound-guided ET, depositing embryos 2 cm below the fundus, and applying gentle pressure to the cervix during and after ET. Soft catheters and exposure to semen around the time of ET also improved outcomes.
The document provides information about biomanufacturing education and training programs. It discusses the Northeast Biomanufacturing Collaborative and Center's work since 2003 to define the skills, knowledge, and abilities needed by biomanufacturing technicians. It highlights the development of curriculum and instructional materials to educate technicians for careers supporting the bioeconomy. Examples are given of various biomanufacturing processes and the equipment involved in areas like biopharmaceuticals, biofuels, and industrial biotechnology.
[Paper Review] Data Augmentation for 1D Time-Series Data (Donghyeon Kim)
* A brief review of research papers applying data augmentation to 1D time-series signals.
* Covers three papers on wearable accelerometer signals and electroencephalography (EEG) signals.
* Presented at A-GIST, an artificial intelligence study group at the Gwangju Institute of Science and Technology.
* Presentation video (Korean, YouTube): https://youtu.be/NpUMFKaDCU4
The document discusses the importance of conversations in developing relationships. It notes that while some advocate "selling the sizzle, not the steak", engaging in meaningful conversations where common ground is found works better. Interviews with people about their dating experiences and favorite companies suggest that conversations matter because that is how relationships are formed. People are more inclined to connect with companies or products that fit their personality or lifestyle.
Continuous representations of words and documents, now commonly referred to as word embeddings, have recently driven large advances in many natural language processing tasks.
In this presentation we will introduce the most common methods of learning these representations, as well as earlier approaches that predate the recent advances in deep learning, such as dimensionality reduction of the word co-occurrence matrix.
Moreover, we will present the continuous bag-of-words (CBOW) model, one of the most successful models for word embeddings and one of the core models in word2vec, and briefly survey models that build representations for other tasks, such as knowledge base embeddings.
Finally, we will motivate the potential of such embeddings for tasks relevant to the group, such as semantic similarity, document clustering, and retrieval.
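The pre-deep-learning approach mentioned above, dimensionality reduction of the word co-occurrence matrix, can be sketched in a few lines. The toy corpus, window size, and `similarity` helper below are illustrative assumptions, not material from the presentation.

```python
import numpy as np

# Toy corpus; in practice this would be a large document collection.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "the cat chased the dog",
]

# Build vocabulary and a symmetric word co-occurrence matrix
# using a sliding window of +/- 2 tokens.
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}
C = np.zeros((len(vocab), len(vocab)))
window = 2
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                C[index[w], index[sent[j]]] += 1.0

# Dimensionality reduction: keep the top-k left singular vectors,
# scaled by the singular values, as k-dimensional word embeddings.
U, S, _ = np.linalg.svd(C, full_matrices=False)
k = 2
embeddings = U[:, :k] * S[:k]

def similarity(a, b):
    """Cosine similarity between two word vectors."""
    va, vb = embeddings[index[a]], embeddings[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
```

Word2vec-style models learn such vectors predictively rather than by factorizing counts, but both yield dense vectors usable for semantic similarity and clustering.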
This document discusses Latent Dirichlet Allocation (LDA), a probabilistic topic modeling technique. It begins with an introduction to topic models and their use in understanding large collections of documents. It then describes LDA's generative process using Dirichlet distributions to represent document-topic and topic-term distributions. Approximate inference methods for LDA like Gibbs sampling are also summarized. The document concludes by outlining the implementation of an LDA model, including preprocessing of documents and collapsed Gibbs sampling.
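As a concrete sketch of the collapsed Gibbs sampler summarized above, the following toy implementation resamples each token's topic from its full conditional; the corpus, hyperparameters, and variable names are illustrative assumptions, not code from the document.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus: documents as lists of word ids from a small vocabulary.
docs = [[0, 1, 2, 0, 1], [3, 4, 5, 3, 4], [0, 1, 5, 3, 2]]
V, K = 6, 2              # vocabulary size, number of topics
alpha, beta = 0.1, 0.01  # symmetric Dirichlet hyperparameters

# Count tables and random initial topic assignments.
ndk = np.zeros((len(docs), K))   # topic counts per document
nkw = np.zeros((K, V))           # word counts per topic
nk = np.zeros(K)                 # total words per topic
z = []
for d, doc in enumerate(docs):
    zd = []
    for w in doc:
        k = rng.integers(K)
        zd.append(k)
        ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    z.append(zd)

# Collapsed Gibbs sampling: resample each token's topic from its
# full conditional, with the token's own counts removed first.
for _ in range(200):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
            p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
            k = rng.choice(K, p=p / p.sum())
            z[d][i] = k
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

# Estimated topic-term distributions.
phi = (nkw + beta) / (nk[:, None] + V * beta)
```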
This document discusses new developments in controlled ovarian stimulation (COS) protocols. It outlines several new forms of fertility drugs including long acting FSH, FSH biosimilars, and subcutaneous progestagens. It also describes new COS protocols such as those using fewer injections, flexibility in start dates, dual stimulation, and individualizing FSH dosing to prevent ovarian hyperstimulation syndrome. The document concludes that while further research is still needed, these new drugs and protocols provide valuable options for increasing flexibility and optimizing outcomes in ART treatment.
This presentation discusses the role of recombinant human chorionic gonadotropin (hCG) in in vitro fertilization (IVF). It begins by outlining the learning objectives and providing background on hCG and its native and recombinant forms. The document then compares urinary and recombinant hCG, noting the higher purity and consistency of recombinant hCG. Several studies are summarized showing similar or improved outcomes with recombinant hCG compared to urinary hCG in terms of fertility measures like fertilization rates and pregnancy rates, as well as reduced risk of ovarian hyperstimulation syndrome.
This document provides an overview of the clinical management of nonobstructive azoospermia (NOA). It begins by defining NOA and explaining its challenges. It then discusses the diagnostic evaluation and differentiates between obstructive and nonobstructive causes. For NOA due to spermatogenic failure, the document outlines that the condition is irreversible and reviews sperm retrieval techniques and their success rates depending on the underlying etiology. It also notes that while biomarkers can reflect testicular function, they cannot definitively predict whether sperm will be found for retrieval.
Recent advances in assisted reproductive technology include:
1. The 1978 birth of Louise Brown, the first "test-tube baby", using in vitro fertilization without ovarian stimulation.
2. Developments like intracytoplasmic sperm injection (ICSI) and preimplantation genetic diagnosis (PGD) that have improved treatment options for male factor infertility and genetic disorders.
3. Continued research on techniques such as cryopreservation of eggs/embryos, stem cell therapy, and cloning that could further advance reproductive medicine if proven successful and safe.
The precise dosage and duration of progesterone administration for luteal support in IVF are still topics of ongoing research; if not correctly managed, this could lead to suboptimal outcomes such as miscarriage or preterm birth. Evidence on the efficacy of progesterone in reducing miscarriage and preterm birth rates in IVF remains uncertain, leaving no consensus on the best approach to luteal phase support.
This document provides an overview of Latent Dirichlet Allocation (LDA), a generative probabilistic model for collections of discrete data such as text corpora. It defines key terminology for LDA including documents, words, topics, and distributions. The document then explains LDA's graphical model and generative process, which represents documents as mixtures over latent topics and generates words probabilistically from topics. Variational inference is introduced as an approach for approximating the intractable posterior distribution over topics and learning model parameters.
Presented at the Master Class, Mumbai. Thanks to the many who have contributed to this presentation. Feel free to use it, but please acknowledge the efforts of all. (jaideep-narendra)
An Empirical Study on Faith-based Microfinance as an Alternative Tool of Poverty Alleviation. The doctoral study discussed the role of faith-based organizations (FBOs) in microfinance.
The document discusses future developments in IVF labs in three main areas: 1) Embryo culture techniques aim to better mimic the reproductive tract using microfluidics and dynamic culture systems. 2) Automation seeks to robotically assist processes like ICSI. 3) Non-invasive embryo selection techniques study the chemical fingerprints embryos leave in culture media using spectroscopy and proteomics to predict viability without biopsy. The goal is improving IVF success rates through more physiological culture conditions, precision of techniques, and selecting the highest quality embryos.
Improving Success by Tailoring Ovarian Stimulation (Sandro Esteves)
This document summarizes a presentation given by Dr. Sandro Esteves on improving IVF success through tailored ovarian stimulation. The presentation covered factors that determine ovarian response, strategies for high and poor responders, and evidence for different stimulation protocols. For high responders, low starting doses of rFSH, GnRH antagonists, and GnRH agonist triggering were recommended based on evidence from randomized controlled trials and observational studies. For poor responders, GnRH antagonists were suggested to potentially improve outcomes based on data from 14 RCTs.
Enabling reuse of arguments and opinions in open collaboration systems PhD vi... (jodischneider)
This document summarizes a PhD thesis on enabling the reuse of arguments and opinions in open collaboration systems. It discusses three research questions: 1) opportunities and requirements for argumentation support, 2) common arguments used in these systems, and 3) structuring arguments to support reuse. The methodology involved analyzing discussions from Wikipedia and open collaboration projects using argumentation theories like Walton's schemes and factors analysis. The goal is to develop semantic structures and visualizations to help people understand diverse opinions and make collaborative decisions. A prototype system tested with users found structuring discussions by key factors helped people evaluate arguments more effectively.
The document presents a research proposal submitted for a Doctor of Computer Science degree focusing on developing a hiring framework to facilitate the transition from military to civilian careers in program management. It outlines the dissertation which will use a mixed methods approach including quantitative data collection and qualitative interviews. The preliminary results suggest military candidates for civilian program management roles often have graduate management/business education, intense military training, and display traits of the Army's leadership model.
- The thesis studies numerical methods for stochastic partial differential equations (SPDEs) subject to generalized Lévy noise.
- It develops both deterministic methods using the Fokker-Planck equation and probabilistic methods like polynomial chaos.
- Key contributions include developing adaptive multi-element polynomial chaos for discrete measures, comparing approaches to construct orthogonal polynomials over discrete measures, and improving efficiency and accuracy through adaptive integration meshes and sparse grids.
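One standard way to construct orthogonal polynomials for a discrete measure, as studied in the thesis, is the Stieltjes procedure. The sketch below uses a discrete uniform measure on equispaced points as an illustrative assumption, not one of the thesis's test cases.

```python
import numpy as np

def stieltjes(x, w, n):
    """Evaluate the first n+1 monic orthogonal polynomials of the
    discrete measure sum_i w[i] * delta(x - x[i]) at the support
    points, via the Stieltjes three-term recurrence
    p_{k+1}(x) = (x - a_k) p_k(x) - b_k p_{k-1}(x)."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    P = np.zeros((n + 1, len(x)))
    P[0] = 1.0
    norms = [w @ (P[0] * P[0])]
    for k in range(n):
        a = (w @ (x * P[k] * P[k])) / norms[k]   # a_k = <x p_k, p_k> / <p_k, p_k>
        pk1 = (x - a) * P[k]
        if k > 0:
            b = norms[k] / norms[k - 1]          # b_k = <p_k, p_k> / <p_{k-1}, p_{k-1}>
            pk1 -= b * P[k - 1]
        P[k + 1] = pk1
        norms.append(w @ (P[k + 1] * P[k + 1]))
    return P

# Example: discrete uniform measure on 9 equispaced points in [-1, 1].
# The number of polynomials must stay below the number of support points.
x = np.linspace(-1, 1, 9)
w = np.full(9, 1.0 / 9)
P = stieltjes(x, w, 3)
```

The resulting polynomials are orthogonal in the weighted discrete inner product, which is exactly the property the multi-element polynomial chaos construction needs.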
Patch Matching with Polynomial Exponential Families and Projective DivergencesFrank Nielsen
This document presents a method called Polynomial Exponential Family-Patch Matching (PEF-PM) to solve the patch matching problem. PEF-PM models patch colors using polynomial exponential families (PEFs), which are universal smooth positive densities. It estimates PEFs using a Score Matching Estimator and accelerates batch estimation using Summed Area Tables. Patch similarity is measured using a statistical projective divergence called the symmetrized γ-divergence. Experiments show that PEF-PM handles noise and symmetries robustly and outperforms baseline methods.
The document discusses error analysis for quasi-Monte Carlo methods used for numerical integration. It introduces the concepts of reproducing kernel Hilbert spaces and mean square discrepancy to analyze integration error. Specifically, it shows that the mean square discrepancy of randomized low-discrepancy point sets can be computed in O(n) operations, whereas the standard discrepancy requires O(n^2) operations, making randomized quasi-Monte Carlo methods more efficient for high-dimensional integration problems.
Multidimensional integrals may be approximated by weighted averages of integrand values. Quasi-Monte Carlo (QMC) methods are more accurate than simple Monte Carlo methods because they carefully choose where to evaluate the integrand. This tutorial focuses on how quickly QMC methods converge to the correct answer as the number of integrand values increases. The answer may depend on the smoothness of the integrand and the sophistication of the QMC method. QMC error analysis may assume that the integrand belongs to a reproducing kernel Hilbert space, or it may assume that the integrand is an instance of a stochastic process with known covariance structure. These two approaches have interesting parallels. This tutorial also explores how the computational cost of achieving a good approximation to the integral depends on the dimension of the domain of the integrand. Finally, it explores methods for determining how many integrand values are needed to satisfy the error tolerance. Relevant software is described.
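A minimal sketch of the QMC-versus-Monte-Carlo comparison described above, using a hand-rolled 2-D Halton point set and the smooth test integrand f(x, y) = xy, whose exact integral over the unit square is 1/4; both choices are illustrative, not taken from the tutorial.

```python
import numpy as np

def van_der_corput(n, base):
    """First n terms of the van der Corput sequence in the given base
    (indices 1..n, so the point 0 is skipped)."""
    seq = np.zeros(n)
    for i in range(n):
        f, x, k = 1.0, 0.0, i + 1
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        seq[i] = x
    return seq

def halton_2d(n):
    """2-D Halton points: van der Corput in coprime bases 2 and 3."""
    return np.column_stack([van_der_corput(n, 2), van_der_corput(n, 3)])

# Integrate f(x, y) = x * y over [0, 1]^2; the exact value is 1/4.
f = lambda p: p[:, 0] * p[:, 1]
n = 4096
qmc_estimate = f(halton_2d(n)).mean()       # low-discrepancy points
rng = np.random.default_rng(0)
mc_estimate = f(rng.random((n, 2))).mean()  # i.i.d. uniform points
```

For a smooth integrand like this, the QMC error decays close to O(n^{-1}) (up to log factors) versus O(n^{-1/2}) for plain Monte Carlo.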
This document discusses Bayesian inference on mixture models. It covers several key topics:
1. Density approximation and consistency results for mixtures as a way to approximate unknown distributions.
2. The "scarcity phenomenon" where the posterior probabilities of most component allocations in mixture models are zero, concentrating on just a few high probability allocations.
3. Challenges with Bayesian inference for mixtures, including identifiability issues, label switching, and complex combinatorial calculations required to integrate over all possible component allocations.
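The label-switching problem mentioned above stems from the fact that the mixture likelihood is invariant under permutation of component labels, so the posterior has K! symmetric modes. A small numerical check, with made-up data and parameters, makes this concrete:

```python
import numpy as np
from itertools import permutations

def mixture_loglik(x, weights, means, sds):
    """Log-likelihood of data x under a Gaussian mixture model."""
    x = np.asarray(x)[:, None]
    dens = weights * np.exp(-0.5 * ((x - means) / sds) ** 2) / (sds * np.sqrt(2 * np.pi))
    return float(np.log(dens.sum(axis=1)).sum())

# Synthetic two-component data and an arbitrary parameter setting.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 50), rng.normal(3, 1, 50)])
weights = np.array([0.5, 0.5])
means = np.array([-2.0, 3.0])
sds = np.array([1.0, 1.0])

# Every relabelling of the components yields the same likelihood.
base = mixture_loglik(x, weights, means, sds)
for perm in permutations(range(2)):
    p = list(perm)
    assert np.isclose(mixture_loglik(x, weights[p], means[p], sds[p]), base)
```

This symmetry is why naive posterior summaries (e.g. the marginal mean of one component's parameters) are meaningless without a relabelling or identifiability constraint.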
This document contains information about data structures and algorithms taught at KTH Royal Institute of Technology. It includes code templates for a contest, descriptions and implementations of common data structures like an order statistic tree and hash map, as well as summaries of mathematical and algorithmic concepts like trigonometry, probability theory, and Markov chains.
This document summarizes research on computing stochastic partial differential equations (SPDEs) using an adaptive multi-element polynomial chaos method (MEPCM) with discrete measures. Key points include:
1) MEPCM uses polynomial chaos expansions and numerical integration to compute SPDEs with parametric uncertainty.
2) Orthogonal polynomials are generated for discrete measures using various methods like Vandermonde, Stieltjes, and Lanczos.
3) Numerical integration is tested on discrete measures using Genz functions in 1D and sparse grids in higher dimensions.
4) The method is demonstrated on the KdV equation with random initial conditions. Future work includes applying these techniques to SPDEs driven by generalized Lévy noise.
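As a sketch of the kind of 1-D integration test described in point 3, the following builds a Gauss-Legendre rule from recurrence coefficients via the Golub-Welsch eigenvalue method and applies it to a Genz "oscillatory" test function. The continuous Legendre measure stands in here for the discrete measures used in the research, and the parameters c and w are illustrative assumptions.

```python
import numpy as np

def gauss_legendre_golub_welsch(n):
    """Gauss-Legendre nodes/weights on [-1, 1] via the Golub-Welsch
    method: eigen-decompose the symmetric Jacobi matrix built from
    the three-term recurrence coefficients of Legendre polynomials."""
    k = np.arange(1, n)
    b = k / np.sqrt(4 * k**2 - 1)   # off-diagonal entries sqrt(beta_k)
    J = np.diag(b, 1) + np.diag(b, -1)
    nodes, vecs = np.linalg.eigh(J)
    weights = 2 * vecs[0] ** 2      # mu_0 = 2 times squared first components
    return nodes, weights

def genz_oscillatory(x, c, w):
    """1-D Genz 'oscillatory' test function cos(2*pi*w + c*x)."""
    return np.cos(2 * np.pi * w + c * x)

c, w = 3.0, 0.5
nodes, weights = gauss_legendre_golub_welsch(12)
approx = weights @ genz_oscillatory(nodes, c, w)
# Exact integral over [-1, 1]: (sin(2*pi*w + c) - sin(2*pi*w - c)) / c.
exact = (np.sin(2 * np.pi * w + c) - np.sin(2 * np.pi * w - c)) / c
```

For a discrete measure, the same Golub-Welsch step applied to the Stieltjes or Lanczos recurrence coefficients yields the corresponding Gauss rule.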
MVPA with SpaceNet: sparse structured priors (Elvis DOHMATOB)
The GraphNet (aka S-Lasso), as well as other "sparsity + structure" priors like TV (Total-Variation), TV-L1, etc., are not easily applicable to brain data because of technical problems relating to the selection of the regularization parameters. Also, in their own right, such models lead to challenging high-dimensional optimization problems. In this manuscript, we present some heuristics for speeding up the overall optimization process: (a) early stopping, whereby one halts the optimization process when the test score (performance on left-out data) for the internal cross-validation for model selection stops improving, and (b) univariate feature screening, whereby irrelevant (non-predictive) voxels are detected and eliminated before the optimization problem is entered, thus reducing the size of the problem. Empirical results with GraphNet on real MRI (Magnetic Resonance Imaging) datasets indicate that these heuristics are a win-win strategy, as they add speed without sacrificing the quality of the predictions. We expect the proposed heuristics to work on other models like TV-L1, etc.
Stochastic differential equations (SDEs) describe systems with random components. Common methods to solve SDEs include spectral and perturbation methods. The spectral method represents variables and parameters as mean values plus fluctuations. Taking the expected value of the SDE yields equations for the mean and fluctuations that can be solved. The perturbation method expresses variables and parameters as power series expansions. Introducing these into the SDE allows analytical or numerical solution. SDEs are used to model systems with uncertain parameters like groundwater flow with random hydraulic conductivity.
The document discusses statistical representation of random inputs in continuum models. It provides examples of representing random fields using the Karhunen-Loeve expansion, which expresses a random field as the sum of orthogonal deterministic basis functions and random variables. Common choices for the covariance function in the expansion include the radial basis function and limiting cases of fully correlated and uncorrelated fields. The covariance function can be approximated from samples of the random field to enable representation in applications.
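On a grid, the Karhunen-Loeve construction described above reduces to an eigendecomposition of the covariance matrix. A minimal sketch, assuming an RBF (squared-exponential) covariance with an illustrative correlation length of 0.2:

```python
import numpy as np

# Discrete Karhunen-Loeve expansion of a 1D random field on a grid, assuming
# a squared-exponential (RBF) covariance C(x,y) = exp(-(x-y)^2 / (2 l^2))
# with illustrative correlation length l = 0.2 on [0, 1].
n, ell = 200, 0.2
x = np.linspace(0.0, 1.0, n)
C = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)

# Eigen-decomposition of the symmetric covariance matrix, sorted by
# decreasing eigenvalue; the eigenvectors are the discrete KL basis.
lam, phi = np.linalg.eigh(C)
lam, phi = lam[::-1], phi[:, ::-1]

# The eigenvalues decay rapidly for a smooth covariance, so a short
# truncation captures almost all of the variance.
m = 10
captured = lam[:m].sum() / lam.sum()

# Draw one sample field: u ~= sum_k sqrt(lam_k) xi_k phi_k, xi_k iid N(0,1).
rng = np.random.default_rng(1)
xi = rng.standard_normal(m)
sample = phi[:, :m] @ (np.sqrt(np.clip(lam[:m], 0.0, None)) * xi)
print(f"variance captured by {m} modes: {captured:.6f}")
```

The rapid eigenvalue decay is what makes the truncated expansion useful: a handful of deterministic modes paired with independent random coefficients represents the field to high accuracy.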
Wang-Landau Monte Carlo simulation is a method for calculating the density of states function which can then be used to calculate thermodynamic properties like the mean value of variables. It improves on traditional Monte Carlo methods which struggle at low temperatures due to complicated energy landscapes with many local minima separated by large barriers. The Wang-Landau algorithm calculates the density of states function directly rather than relying on sampling configurations, allowing it to overcome barriers and fully explore the configuration space even at low temperatures.
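The algorithm sketched above can be demonstrated on a toy system whose density of states is known exactly. A hedged sketch, assuming L = 12 binary spins with "energy" equal to the number of up spins, so g(E) = C(L, E); the sweep count and final modification factor are illustrative choices, and the usual flat-histogram check is omitted for brevity:

```python
import math
import random

# Wang-Landau estimate of the density of states g(E) for a toy system:
# L binary spins with energy E = number of up spins, so exactly
# g(E) = C(L, E). The walk is biased by 1/g(E), which lets it cross
# between rare and common energy levels instead of getting trapped,
# while g is refined on the fly.
random.seed(0)
L = 12
spins = [0] * L
E = 0
ln_g = [0.0] * (L + 1)   # running estimate of ln g(E)
ln_f = 1.0               # modification factor, halved each stage

while ln_f > 1e-5:
    for _ in range(20000):
        i = random.randrange(L)
        E_new = E + (1 - 2 * spins[i])   # energy after flipping spin i
        # Accept with min(1, g(E)/g(E_new)): favors rarely-visited energies.
        if random.random() < math.exp(min(0.0, ln_g[E] - ln_g[E_new])):
            spins[i] ^= 1
            E = E_new
        ln_g[E] += ln_f
    ln_f /= 2.0

# ln g is defined only up to an additive constant; anchor it at E = 0
# (exact ln C(L, 0) = 0) and compare with the exact binomial values.
shift = ln_g[0]
max_err = max(abs(ln_g[e] - shift - math.log(math.comb(L, e)))
              for e in range(L + 1))
print("max error in ln g:", max_err)
```

Because the estimate targets g(E) directly rather than sampling from the Boltzmann distribution, the same run yields thermodynamic averages at any temperature afterwards.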
We study an elliptic eigenvalue problem with a random coefficient that can be parametrised by infinitely many stochastic parameters. The physical motivation is the criticality problem for a nuclear reactor: in steady state the fission reaction can be modeled by an elliptic eigenvalue problem, and the smallest eigenvalue provides a measure of how close the reaction is to equilibrium in terms of production/absorption of neutrons. The coefficients are allowed to be random to model the uncertainty of the composition of materials inside the reactor, e.g., the control rods, reactor structure, fuel rods, etc.
The randomness in the coefficient also results in randomness in the eigenvalues and corresponding eigenfunctions. As such, our quantity of interest is the expected value, with respect to the stochastic parameters, of the smallest eigenvalue, which we formulate as an integral over the infinite-dimensional parameter domain. Our approximation involves three steps: truncating the stochastic dimension, discretizing the spatial domain using finite elements, and approximating the now finite- but still high-dimensional integral.
To approximate the high-dimensional integral we use quasi-Monte Carlo (QMC) methods. These are deterministic or quasi-random quadrature rules that can be proven to be very efficient for the numerical integration of certain classes of high-dimensional functions. QMC methods have previously been applied to linear functionals of the solution of a similar elliptic source problem; however, because of the nonlinearity of eigenvalues, the existing analysis of the integration error does not hold in our case.
We show that the minimal eigenvalue belongs to the spaces required for QMC theory, outline the approximation algorithm, and provide numerical results.
The document discusses uncertainty quantification and robust design approaches for aircraft design. It compares using a polynomial chaos expansion with an adaptive sparse grid to represent input uncertainties and the objective function. This allows solving the robust optimization problem with reduced computational cost compared to evaluating on a full tensor grid. The methodology is demonstrated on a transonic airfoil design test case with geometrical uncertainties, comparing different robust measures of performance.
Using blurred images to assess damage in bridge structures? (Alessandro Palmeri)
Faster trains and augmented traffic have significantly increased the number and amplitude of loading cycles experienced on a daily basis by composite steel-concrete bridges. This higher demand accelerates the occurrence of damage in the shear connectors between the two materials, which in turn can severely affect performance and reliability of these structures. The aim of this talk is to present the preliminary results of theoretical and experimental investigations undertaken to assess the feasibility of using the envelope of deflections and rotations induced by moving loads as a practical and cost-effective alternative to traditional methods of health monitoring for composite bridges. Both analytical and numerical formulations for this dynamic problem are presented and the results of a parametric study are discussed. A novel photogrammetric approach is also introduced, which allows identifying vibration patterns in civil engineering structures by analysing blurred targets in long-exposure digital images. The initial experimental validation of this approach is presented and further challenges are highlighted.
This document summarizes research on quantum chaos, including the principle of uniform semiclassical condensation of Wigner functions, spectral statistics in mixed systems, and dynamical localization of chaotic eigenstates. It discusses how in the semiclassical limit, Wigner functions condense uniformly on classical invariant components. For mixed systems, the spectrum can be seen as a superposition of regular and chaotic level sequences. Localization effects can be observed if the Heisenberg time is shorter than the classical diffusion time. The document presents an analytical formula called BRB that describes the transition between Poisson and random matrix statistics. An example is given of applying this to analyze the level spacing distribution for a billiard system.
Response Surface in Tensor Train format for Uncertainty Quantification (Alexander Litvinenko)
We apply the low-rank Tensor Train (TT) format to solve PDEs with uncertain coefficients. First, we approximate the uncertain permeability coefficient in TT format, then the operator, and then apply iterations to solve the stochastic Galerkin system.
Polynomial matrices can help to elegantly formulate many broadband multi-sensor / multi-channel processing problems, and represent a direct extension of well-established narrowband techniques which typically involve eigen- (EVD) and singular value decompositions (SVD) for optimisation. Polynomial matrix decompositions extend the utility of the EVD to polynomial parahermitian matrices, and this talk presents a brief overview of such polynomial matrices, characteristics of the polynomial EVD (PEVD) and iterative algorithms for its solution. The presentation concludes with some surprising results when applying the PEVD to subband coding and broadband beamforming.
PhD Thesis Mengdi Zheng (Summer), Brown Applied Maths (Zheng Mengdi)
This document is the dissertation of Mengdi Zheng submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Applied Mathematics at Brown University in May 2015. The dissertation focuses on developing numerical methods for uncertainty quantification of stochastic partial differential equations driven by Lévy jump processes. Specifically, it presents work on applying polynomial chaos expansions, Wick-Malliavin approximations, and generalized Fokker-Planck equations to simulate stochastic systems subject to generalized Lévy noise and analyze their moment statistics. The dissertation contains publications by the author in peer-reviewed journals on these topics.
Mengdi Zheng is a Chinese national working as a KTP Associate at University College London since June 2015. She holds a PhD in Applied Mathematics from Brown University (2011-2015) and has experience modeling natural catastrophes, including earthquakes and tsunamis, and the financial losses they cause. Her research interests include stochastic partial differential equations, uncertainty quantification, and scientific computing methods.
Mengdi Zheng is a catastrophe risk research analyst at University College London. She has over 10 years of experience in applied mathematics and scientific computing. Her responsibilities include numerical simulation of tsunamis from earthquakes and estimating financial loss from tsunamis for an insurance company. She holds a PhD in Applied Mathematics from Brown University.
This document is Mengdi Zheng's dissertation for the degree of Doctor of Philosophy in Applied Mathematics from Brown University. The dissertation focuses on developing numerical methods for stochastic partial differential equations (SPDEs) driven by Lévy jump processes. Chapter 1 introduces the motivation and challenges in uncertainty quantification of nonlinear SPDEs driven by Lévy noise. The subsequent chapters develop simulation methods for Lévy jump processes, adaptive stochastic collocation methods, and Wick-Malliavin approximations to solve SPDEs with discrete and tempered stable Lévy noise in multiple dimensions.
This document summarizes numerical methods for solving stochastic partial differential equations (SPDEs) driven by Lévy jump processes. It discusses both probabilistic methods like Monte Carlo (MC) and probabilistic collocation method (PCM), as well as deterministic methods based on solving the generalized Fokker-Planck equation. Specific examples discussed include an overdamped Langevin equation driven by a 1D tempered alpha-stable process, and diffusion equations driven by multi-dimensional jump processes using different dependence structures. The document compares the accuracy and efficiency of MC/PCM versus solving the tempered fractional Fokker-Planck equation directly. It also discusses how to represent SPDEs with additive multi-dimensional Lévy
This document summarizes research on numerical methods for solving stochastic partial differential equations (SPDEs) driven by Lévy jump processes. It discusses both probabilistic methods like Monte Carlo simulation and polynomial chaos methods, as well as deterministic methods based on generalized Fokker-Planck equations. Specific examples presented include the overdamped Langevin equation driven by a tempered α-stable Lévy process, and heat equations with jumps modeled by multi-dimensional Lévy processes using either Lévy copulas or Lévy measure representations. Comparisons are made between probabilistic and deterministic methods in terms of accuracy and computational efficiency for moment statistics.
The document is a teaching statement from an applicant named Mengdi Zheng. It discusses their teaching philosophy and experience. Some key points made include:
- College students are able to think independently but class time is not enough to fully cover material, so the goal is to inspire interest and emphasize key points.
- Teaching and research are related activities that both involve collecting information, asking questions, and problem solving. Teaching enhances research skills.
- It is important to engage all students, not just active ones, and ensure quieter students also understand before moving to new topics. Flexibility is needed to reach learning goals.
Summer Zheng will present the paper "Fractional dynamics on networks: Emergence of anomalous diffusion and Levy flights" which discusses fractional diffusion processes and long-range dynamics on networks. The paper introduces a fractional formalism to describe diffusion on networks and shows how this leads to anomalous diffusion and Levy flights. It provides examples of fractional diffusion on tree and ring networks and analyzes how network structure, such as being a tree, ring, or scale-free, influences properties like the fractional return probability and global exploration time.
This document outlines the author's dissertation project on numerical methods for stochastic systems subject to generalized Lévy noise. The dissertation will include 6 chapters covering: 1) simulation of Lévy jump processes, 2) adaptive multi-element polynomial chaos for stochastic PDEs with discrete measures, 3) Wick-Malliavin approximation of nonlinear SPDEs with discrete random variables, 4) methods for SPDEs with tempered α-stable processes, 5) methods for SPDEs with additive multidimensional Lévy jump processes, and 6) application of fractional dynamics on networks. Each chapter will apply the numerical methods to examples such as stochastic reaction-diffusion equations, Burgers equations, and Navier-Stokes flow past
Uncertainty quantification of SPDEs with multi-dimensional Levy processes (Zheng Mengdi)
The document discusses using Fokker-Planck equations to model stochastic differential equations driven by additive pure jump processes. It compares using the Fokker-Planck approach to a Monte Carlo simulation or probability density approach. It also examines using Fokker-Planck equations to model heat equations with two-dimensional jump processes, comparing the Fokker-Planck solution to exact solutions and those obtained from copula models or using LePage series.
1) The document discusses five methods for generating orthogonal polynomials for discrete measures: Nowak's method, Fischer's method, Stieltjes method, modified Chebyshev method, and Lanczos method.
2) It compares the orthogonality, computational cost, and minimum polynomial order where the Stieltjes method starts to fail for these five methods.
3) It proposes an adaptive multi-element polynomial chaos method (ME-PCM) for stochastic partial differential equations driven by discrete random variables and demonstrates its accuracy on example problems.
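Of the five constructions, the Stieltjes method is the simplest to sketch. A minimal illustration, assuming a Bino(40, 1/2) measure and polynomial orders up to 9 (illustrative values, not the document's test cases), followed by a direct check of orthogonality, the quantity on which the five methods are compared:

```python
import numpy as np
from math import comb

# Stieltjes procedure (one of the five constructions compared) for monic
# polynomials orthogonal w.r.t. a discrete measure, here Bino(40, 1/2).
N, p, n = 40, 0.5, 10
x = np.arange(N + 1, dtype=float)
w = np.array([comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)])

# Three-term recurrence p_{k+1} = (x - a_k) p_k - b_k p_{k-1}, with the
# coefficients computed from inner products against the discrete measure.
polys = [np.ones_like(x)]
pkm1 = np.zeros_like(x)
for k in range(n - 1):
    pk = polys[-1]
    nk = np.sum(w * pk * pk)
    a_k = np.sum(w * x * pk * pk) / nk
    b_k = nk / np.sum(w * pkm1 * pkm1) if k > 0 else 0.0
    polys.append((x - a_k) * pk - b_k * pkm1)
    pkm1 = pk

# Normalize and measure the worst off-diagonal inner product.
polys = [P / np.sqrt(np.sum(w * P * P)) for P in polys]
G = np.array([[np.sum(w * Pi * Pj) for Pj in polys] for Pi in polys])
orth_err = np.max(np.abs(G - np.eye(n)))
print("max orthogonality error:", orth_err)
```

At low orders Stieltjes keeps the Gram matrix close to the identity; the interesting regime for the comparison in the document is the higher orders where the methods start to lose orthogonality at different rates.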
2014 spring crunch seminar (SDE/Levy/fractional/spectral method) (Zheng Mengdi)
This document summarizes numerical methods for simulating stochastic partial differential equations (SPDEs) with tempered alpha-stable (TαS) processes. It discusses two main methods:
1) The compound Poisson (CP) approximation method, which simulates large jumps as a CP process and replaces small jumps with their expected drift term.
2) The series representation method, which represents the TαS process as an infinite series involving i.i.d. random variables.
It also provides algorithms for implementing these two methods and applies them to simulate specific examples like reaction-diffusion equations with TαS noise. Numerical results demonstrate that both methods can accurately capture the statistics of the underlying TαS processes.
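The large-jump part of the CP approximation described above can be sketched briefly. A hedged illustration, assuming a one-sided tempered stable Lévy measure ν(dx) = c e^(−λx) x^(−α−1) dx on (0, ∞) with illustrative parameters; the small-jump drift correction that the method would add is omitted here:

```python
import numpy as np

# Sketch of the compound Poisson (CP) approximation for a one-sided tempered
# alpha-stable process: jumps larger than delta form a CP process (small
# jumps, which the method replaces by their expected drift, are omitted).
alpha, c, lam, delta, T = 0.5, 1.0, 1.0, 0.01, 1.0
rng = np.random.default_rng(2)

# Intensity of jumps larger than delta: Lam = int_delta^inf nu(dx)
# (simple Riemann sum; the integrand decays fast thanks to the tempering).
xs = np.linspace(delta, 50.0, 400001)
dx = xs[1] - xs[0]
Lam = np.sum(c * np.exp(-lam * xs) * xs ** (-alpha - 1.0)) * dx

def sample_jump():
    """Rejection sampling from nu restricted to [delta, inf): propose from
    the Pareto density ~ x^(-alpha-1), accept with prob exp(-lam (x - delta))."""
    while True:
        x = delta * (1.0 - rng.random()) ** (-1.0 / alpha)  # inverse CDF
        if rng.random() < np.exp(-lam * (x - delta)):
            return x

# One path of the CP part on [0, T]: Poisson jump count, uniform jump times.
n_jumps = rng.poisson(Lam * T)
jump_times = np.sort(rng.random(n_jumps) * T)
jump_sizes = np.array([sample_jump() for _ in range(n_jumps)])
value_at_T = jump_sizes.sum()
print(n_jumps, value_at_T)
```

The cutoff δ trades accuracy for cost: shrinking it raises the jump intensity Λ but shrinks the part of the process that must be replaced by a drift.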
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a... (Ana Luísa Pinho)
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich on features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and quality to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization. 
To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
The use of nauplii and metanauplii of Artemia (brine shrimp) in aquaculture (MAGOTI ERNEST)
Although Artemia has been known to man for centuries, its use as a food for the culture of larval organisms apparently began only in the 1930s, when several investigators found that it made an excellent food for newly hatched fish larvae (Litvinenko et al., 2023). As aquaculture developed in the 1960s and '70s, the use of Artemia also became more widespread, due both to its convenience and to its nutritional value for larval organisms (Arenas-Pardo et al., 2024). The fact that Artemia dormant cysts can be stored for long periods in cans, and then used as an off-the-shelf food requiring only 24 h of incubation, makes them the most convenient, least labor-intensive, live food available for aquaculture (Sorgeloos & Roubach, 2021). The nutritional value of Artemia, especially for marine organisms, is not constant, but varies both geographically and temporally. During the last decade, however, both the causes of Artemia nutritional variability and methods to improve poor-quality Artemia have been identified (Loufi et al., 2024).
Brine shrimp (Artemia spp.) are used in marine aquaculture worldwide. Annually, more than 2,000 metric tons of dry cysts are used for cultivation of fish, crustacean, and shellfish larva. Brine shrimp are important to aquaculture because newly hatched brine shrimp nauplii (larvae) provide a food source for many fish fry (Mozanzadeh et al., 2021). Culture and harvesting of brine shrimp eggs represents another aspect of the aquaculture industry. Nauplii and metanauplii of Artemia, commonly known as brine shrimp, play a crucial role in aquaculture due to their nutritional value and suitability as live feed for many aquatic species, particularly in larval stages (Sorgeloos & Roubach, 2021).
Thematic appreciation test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills.
ESPP presentation to EU Waste Water Network, 4th June 2024 “EU policies driving nutrient removal and recycling
and the revised UWWTD (Urban Waste Water Treatment Directive)”
This presentation explores a brief idea about the structural and functional attributes of nucleotides, the structure and function of genetic materials along with the impact of UV rays and pH upon them.
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr... (Travis Hills MN)
Travis Hills of Minnesota developed a method to convert waste into high-value dry fertilizer, significantly enriching soil quality. By providing farmers with a valuable resource derived from waste, Travis Hills helps enhance farm profitability while promoting environmental stewardship. Travis Hills' sustainable practices lead to cost savings and increased revenue for farmers by improving resource efficiency and reducing waste.
The binding of cosmological structures by massless topological defects (Sérgio Sacani)
Assuming spherical symmetry and weak field, it is shown that if one solves the Poisson equation or the Einstein field equations sourced by a topological defect, i.e. a singularity of a very specific form, the result is a localized gravitational field capable of driving flat rotation (i.e. Keplerian circular orbits at a constant speed for all radii) of test masses on a thin spherical shell without any underlying mass. Moreover, a large-scale structure which exploits this solution by assembling concentrically a number of such topological defects can establish a flat stellar or galactic rotation curve, and can also deflect light in the same manner as an equipotential (isothermal) sphere. Thus, the need for dark matter or modified gravity theory is mitigated, at least in part.
Remote Sensing and Computational, Evolutionary, Supercomputing, and Intellige... (University of Maribor)
Slides from talk:
Aleš Zamuda: Remote Sensing and Computational, Evolutionary, Supercomputing, and Intelligent Systems.
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Inter-Society Networking Panel GRSS/MTT-S/CIS Panel Session: Promoting Connection and Cooperation
https://www.etran.rs/2024/en/home-english/
BREEDING METHODS FOR DISEASE RESISTANCE (RASHMI M G)
Plant breeding for disease resistance is a strategy to reduce crop losses caused by disease. Plants have an innate immune system that allows them to recognize pathogens and provide resistance. However, breeding for long-lasting resistance often involves combining multiple resistance genes.
1. Numerical methods for stochastic systems subject to generalized Levy noise
by Mengdi Zheng
Thesis committee: George Em Karniadakis (Ph.D., advisor); Hui Wang (Ph.D., reader, APMA, Brown); Xiaoliang Wan (Ph.D., reader, Mathematics, LSU)
4. Probability collocation method (PCM) in UQ
Represent the randomness through finitely many random variables: X_t(\omega) \approx X_t(\xi_1, \xi_2, ..., \xi_n), \omega \in \Omega, so that the moments satisfy E[u^m(x,t;\omega)] \approx E[u^m(x,t;\xi_1, \xi_2, ..., \xi_n)].
[Diagrams: PCM collocates on the whole parameter space \Omega (shown for n = 2); MEPCM first decomposes \Omega into elements B_1, B_2, B_3, B_4.]
Gauss integration (n = 1): with \{P_i(x)\} orthogonal with respect to \Gamma(x) and h_i the Lagrange interpolants at the zeros \{x_i, i = 1, ..., d\} of P_d(x),
I = \int_a^b d\Gamma(x) f(x) \approx \int_a^b d\Gamma(x) \sum_{i=1}^d f(x_i) h_i(x) = \sum_{i=1}^d f(x_i) \int_a^b d\Gamma(x) h_i(x),
so moments are computed as \sum_{i=1}^d w_i u(x,t;\xi_1^i) with weights w_i = \int_a^b d\Gamma(x) h_i(x).
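The collocation step on this slide, evaluating the solution at the Gauss nodes of the input measure and forming moments as weighted sums, can be sketched in one dimension. A minimal illustration with assumptions not taken from the slides: a standard Gaussian input measure and the toy "solution" u(ξ) = e^ξ, for which E[u] = e^(1/2) exactly:

```python
import numpy as np

# One-dimensional PCM: evaluate the "solution" at Gauss nodes of the input
# measure and form moments as weighted sums. Illustrative setup: xi ~ N(0,1)
# and u(xi) = exp(xi), so E[u] = exp(1/2) exactly.
d = 10
nodes, weights = np.polynomial.hermite_e.hermegauss(d)  # weight e^{-x^2/2}
weights = weights / np.sqrt(2.0 * np.pi)                # normalize to a pdf

u = np.exp(nodes)          # d deterministic "solves", one per collocation node
mean_pcm = weights @ u
err = abs(mean_pcm - np.exp(0.5))
print("PCM mean error with", d, "nodes:", err)
```

Each node corresponds to one deterministic solve of the underlying equation, which is why the number and placement of quadrature points dominates the cost of PCM.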
5. Introduction of Levy processes
[Figure: sample path of a Poisson process J_t versus t.]
6. Data-driven UQ for stochastic KdV equations
Workflow: from the measure \mu_j(\xi_j) of each random input, generate orthogonal polynomials \{P_k^j(\xi_j)\} (5 ways); from these obtain Gauss quadrature points and weights; combine the 1D rules by tensor product or sparse grid; compute the moment statistics E[u^m(x,t;\omega)].
[Figure: histogram of the discrete measure of \xi_i.] What if \xi_i is given by a set of experimental data? Must we make a subjective assumption about the distribution shape?
u_t + 6u u_x + u_{xxx} = \sum_{i=1}^n \sigma_i \xi_i, \quad x \in \mathbb{R}, \qquad u(x,0) = \frac{a}{2} \mathrm{sech}^2\!\left(\frac{\sqrt{a}}{2}(x - x_0)\right)
M. Zheng, X. Wan, G.E. Karniadakis, Adaptive multi-element polynomial chaos with discrete measure: Algorithms and application to SPDEs, Applied Numerical Mathematics, 90 (2015), pp. 91–110.
Sparse grids in the Smolyak algorithm (level k, dimension n):
A(k+n, n) = \sum_{k+1 \le |\mathbf{i}| \le k+n} (-1)^{k+n-|\mathbf{i}|} \binom{n-1}{k+n-|\mathbf{i}|} (U^{i_1} \otimes \cdots \otimes U^{i_n})
[Figure: sparse grid vs. tensor product grid.]
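The cost advantage of the Smolyak construction can be seen by counting quadrature points. Assuming, for illustration, that the 1D rule U^i uses i points, summing the tensor-term sizes over the admissible multi-indices reproduces the point counts 17, 153, 969, 4845 quoted later in the talk for the n = 8 example:

```python
import itertools
import math

# Point count for the Smolyak formula A(k+n, n): each term is a tensor
# product U^{i_1} x ... x U^{i_n} over multi-indices with k+1 <= |i| <= k+n.
# Assuming (for illustration) the 1D rule U^i has i points, a term costs
# i_1 * ... * i_n evaluations. Since every i_j >= 1 and |i| <= k+n, each
# component satisfies i_j <= k+1, which bounds the search space.
def smolyak_points(k, n):
    total = 0
    for i in itertools.product(range(1, k + 2), repeat=n):
        if k + 1 <= sum(i) <= k + n:
            total += math.prod(i)
    return total

n = 8
counts = [smolyak_points(k, n) for k in (1, 2, 3, 4)]
tensor = [(k + 1) ** n for k in (1, 2, 3, 4)]
print(counts)   # [17, 153, 969, 4845]
print(tensor)   # tensor grids of matching 1D resolution: 256, ..., 390625
```

At level k = 3 the sparse grid needs 969 evaluations where the full tensor grid of the same 1D resolution needs 65536, which is the gap the convergence plots exploit.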
7. Construct orthogonal polynomials to discrete measures
1. (Nowak) S. Oladyshkin, W. Nowak, Data-driven uncertainty quantification using the arbitrary polynomial chaos expansion, Reliability Engineering & System Safety, 106 (2012), pp. 179–190.
2. (Stieltjes, Modified Chebyshev) W. Gautschi, On generating orthogonal polynomials, SIAM J. Sci. Stat. Comp., 3 (1982), no. 3, pp. 289–317.
3. (Lanczos) D. Boley, G. H. Golub, A survey of matrix inverse eigenvalue problems, Inverse Problems, 3 (1987), pp. 595–622.
4. (Fischer) H. J. Fischer, On generating orthogonal polynomials for discrete measures, Z. Anal. Anwendungen, 17 (1998), pp. 183–205.
Test measure: the binomial distribution Bino(N, p),
f(k; N, p) = \frac{N!}{k!(N-k)!} p^k (1-p)^{N-k}, \quad k = 0, 1, ..., N.
Orthogonality? Cost?
[Figures, for Bino(100, 1/2): CPU time to evaluate orth(i) versus polynomial order i, with a reference slope C i^2, and the orthogonality error orth(i) versus polynomial order i, for the Nowak, Stieltjes, Fischer, Modified Chebyshev, and Lanczos methods.]
8. Multi-element Gauss integration over discrete measures
Error bound:
\left| \int_\Gamma f(\xi)\mu(\xi) - \sum_{i=1}^{N_{es}} Q_m^{B_i} f \right| \le C h^{m+1} \| E_\Gamma \|_{m+1,\infty,\Gamma} \, | f |_{m+1,\infty,\Gamma}
where \{B_i\}_{i=1}^{N_{es}} are the elements, N_{es} is the number of elements, \Gamma is the parameter domain, \mu is the discrete measure, Q_m^{B_i} is Gauss quadrature plus tensor product with exactness m = 2d - 1, h is the maximum size of B_i, and f is a test function in W^{m+1,\infty}(\Gamma).
(when the measure is continuous) J. Foo, X. Wan, G. E. Karniadakis, A multi-element probabilistic collocation method for PDEs with parametric uncertainty: error analysis and applications, Journal of Computational Physics, 227 (2008), pp. 9572–9595.
[Figures: absolute error versus N_{es} for the GENZ1 (oscillatory) and GENZ4 (Gaussian) test functions, with c = 0.1, w = 1, d = 2, m = 3, and measure Bino(120, 1/2); plots of the GENZ1 and GENZ4 functions for c = 0.01, 0.1, 1 with w = 1.]
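A minimal sketch of multi-element Gauss integration over a discrete measure, with illustrative assumptions echoing the slide's setting: Bino(120, 1/2) split into four equal-width elements, a 3-point local Gauss rule built by the Stieltjes recurrence plus Golub-Welsch on each element, and the oscillatory Genz-type integrand cos(2πw + cx) with c = 0.1, w = 1:

```python
import numpy as np
from math import comb

# Multi-element Gauss integration of a smooth function against a discrete
# measure: Bino(120, 1/2) is split into 4 equal-width elements, and on each
# element a 3-point Gauss rule is built from the local Jacobi matrix.
N, p, Nes, d = 120, 0.5, 4, 3
c_genz, w_genz = 0.1, 1.0
x_all = np.arange(N + 1, dtype=float)
w_all = np.array([comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)])
f = lambda x: np.cos(2.0 * np.pi * w_genz + c_genz * x)
exact = np.sum(w_all * f(x_all))     # the discrete integral, summed exactly

def gauss_rule(x, w, d):
    """d-point Gauss rule for the local discrete measure sum_j w_j delta(x - x_j)."""
    a, b = np.zeros(d), np.zeros(d)
    pk, pkm1 = np.ones_like(x), np.zeros_like(x)
    for k in range(d):
        nk = np.sum(w * pk * pk)
        a[k] = np.sum(w * x * pk * pk) / nk
        b[k] = nk / np.sum(w * pkm1 * pkm1) if k > 0 else nk
        pk, pkm1 = (x - a[k]) * pk - (b[k] if k > 0 else 0.0) * pkm1, pk
    # Golub-Welsch: nodes are eigenvalues of the symmetric Jacobi matrix,
    # weights come from the first components of the eigenvectors.
    J = np.diag(a) + np.diag(np.sqrt(b[1:]), 1) + np.diag(np.sqrt(b[1:]), -1)
    nodes, V = np.linalg.eigh(J)
    return nodes, b[0] * V[0, :] ** 2

approx, wsum = 0.0, 0.0
for e in range(Nes):
    lo, hi = e * (N + 1) // Nes, (e + 1) * (N + 1) // Nes
    nodes, wts = gauss_rule(x_all[lo:hi], w_all[lo:hi], d)
    approx += np.sum(wts * f(nodes))
    wsum += wts.sum()
err = abs(approx - exact)
print("multi-element Gauss error:", err)
```

Only d function evaluations per element are needed, versus summing over the full support; the element-local rules keep the exactness m = 2d - 1 that the error bound above relies on.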
9. UQ of stochastic KdV equation with 1 RV
u_t + 6u u_x + u_{xxx} = \sum_{i=1}^n \sigma_i \xi_i, \quad x \in \mathbb{R}, \qquad u(x,0) = \frac{a}{2} \mathrm{sech}^2\!\left(\frac{\sqrt{a}}{2}(x - x_0)\right)
[Figures: p-convergence of PCM (l2 errors l2u1, l2u2 versus d, with aPC) and h-convergence of MEPCM (errors versus N_{es}, with reference slope C N_{el}^{-4}), for the Poisson and binomial distributions.]
Adaptive integration mesh: elements are refined according to the local variance \sigma_i^2 with respect to the scaled measure \mu(d\xi) / \int_{B_i} \mu(d\xi).
[Figure: errors of E[u^2] versus the number of PCM points on each element, for adaptive (uneven) vs. non-adaptive (even) MEPCM meshes with 2, 4, and 5 elements; the adaptive meshes improve the moment statistics.]
10. UQ of stochastic KdV equation with multiple RVs
u_t + 6u u_x + u_{xxx} = \sum_{i=1}^n \sigma_i \xi_i, \quad x \in \mathbb{R}, \qquad u(x,0) = \frac{a}{2} \mathrm{sech}^2\!\left(\frac{\sqrt{a}}{2}(x - x_0)\right)
(sparse grid) D. Xiu, J. S. Hesthaven, High-order collocation methods for differential equations with random inputs, SIAM J. Scientific Computing, 27(3) (2005), pp. 1118–1139.
[Figure: l2 errors l2u1, l2u2 versus the number of points r(k) (17, 153, 256, 969, 4,845) for the sparse grid in the Smolyak algorithm vs. the tensor product grid; binomial distribution, n = 8; the sparse grid improves the cost. Inset: 2D sparse grid in the Smolyak algorithm.]
11. Summary of contributions (1)
✰ convergence study of multi-element integration over discrete measures
✰ comparison of 5 ways to construct orthogonal polynomials w.r.t. discrete measures
✰ improvement of moment statistics by adaptive integration mesh (on discrete measures)
✰ improvement of moment statistics by sparse grid (on discrete measures)
12. gPC for 1D stochastic Burgers equation
M. Zheng, B. Rozovsky, G.E. Karniadakis, Adaptive Wick-Malliavin Approximation to Nonlinear SPDEs with Discrete Random Variables, SIAM J. Sci. Comput., revised. (multiple discrete RVs)
D. Venturi, X. Wan, R. Mikulevicius, B.L. Rozovskii, G.E. Karniadakis, Wick-Malliavin approximation to nonlinear stochastic PDEs: analysis and simulations, Proceedings of the Royal Society, vol. 469, no. 2158, (2013). (multiple Gaussian RVs)
u_t + u u_x = \nu u_{xx} + \sigma c_1(\xi;\lambda), \quad x \in [-\pi, \pi], \qquad \xi \sim \mathrm{Pois}(\lambda), \quad c_k: \text{Charlier polynomial}
Expand the solution: u(x,t;\xi) \approx \sum_{k=0}^P \hat{u}_k(x,t) c_k(\xi;\lambda).
By Galerkin projection \langle u \, c_k \rangle, the general Polynomial Chaos (gPC) propagator is
\frac{\partial \hat{u}_k}{\partial t} + \sum_{m,n=0}^P \hat{u}_m \frac{\partial \hat{u}_n}{\partial x} \langle c_m c_n c_k \rangle = \nu \frac{\partial^2 \hat{u}_k}{\partial x^2} + \sigma \delta_{1k}, \quad k = 0, 1, ..., P,
using the Charlier orthogonality \sum_{k \in \mathbb{N}} \frac{e^{-\lambda} \lambda^k}{k!} c_m(k;\lambda) c_n(k;\lambda) = \langle c_m c_n \rangle = n! \lambda^n \delta_{mn}.
(motivation) The nonlinear term is the expensive part: how many terms \hat{u}_m \frac{\partial \hat{u}_n}{\partial x} \langle c_m c_n c_k \rangle are there? (P+1)^3. Let us simplify the gPC propagator!
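The Charlier orthogonality relation ⟨c_m c_n⟩ = n! λ^n δ_mn that powers the Galerkin projection can be checked numerically. A minimal sketch, assuming the standard monic Charlier three-term recurrence and a truncated Poisson(λ = 2) weight (the value of λ and the truncation point are illustrative):

```python
import math
import numpy as np

# Numerical check of the Charlier orthogonality <c_m c_n> = n! lam^n delta_mn
# (monic normalization) against a truncated Poisson(lam) weight. Standard
# monic Charlier recurrence:
# c_{n+1}(x) = (x - n - lam) c_n(x) - n lam c_{n-1}(x).
lam, K, nmax = 2.0, 80, 6
k = np.arange(K, dtype=float)
log_w = -lam + k * math.log(lam) - np.array([math.lgamma(j + 1.0) for j in k])
w = np.exp(log_w)                     # truncated Poisson pmf, via lgamma

polys = [np.ones_like(k)]
prev = np.zeros_like(k)
for n in range(nmax):
    polys.append((k - (n + lam)) * polys[-1] - n * lam * prev)
    prev = polys[-2]

norms = np.array([np.sum(w * P * P) for P in polys])
expected = np.array([math.factorial(n) * lam ** n for n in range(nmax + 1)])
print(norms / expected)   # close to 1 at every order
```

The rapid growth of the norms n! λ^n is also why normalization choices matter when such expansions are pushed to high order.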
13. Wick-Malliavin (WM) approximation
G.C. Wick, The evaluation of the collision matrix, Phys. Rev., 80(2) (1950), pp. 268–272.
✰ consider \xi \sim \mathrm{Pois}(\lambda) with measure \Gamma(x) = \sum_{k \in \mathbb{N}} \frac{e^{-\lambda} \lambda^k}{k!} \delta(x - k)
✰ define the Wick product \diamond as: c_m(x;\lambda) \diamond c_n(x;\lambda) = c_{m+n}(x;\lambda)
✰ define the Malliavin derivative D as: D^p c_i(x;\lambda) = \frac{i!}{(i-p)!} c_{i-p}(x;\lambda)
✰ the product of two polynomials expands as: c_m(x) c_n(x) = \sum_{k=0}^{m+n} a(k,m,n) c_k(x) = \sum_{p=0}^{\lfloor (m+n)/2 \rfloor} K_{mnp} \, c_{m+n-2p}(x;\lambda), where K_{mnp} = a(m+n-2p, m, n)
✰ define the weighted Wick product \diamond_p: c_m \diamond_p c_n = \frac{p!\, m!\, n!}{(m+p)!\, (n+p)!} K_{m+p,n+p,p} \; c_m \diamond c_n
✰ rewrite the product of two polynomials: c_m c_n = \sum_{p=0}^{\lfloor (m+n)/2 \rfloor} \frac{D^p c_m \diamond_p D^p c_n}{p!}
✰ approximate the product u v as: u v = \sum_{p=0}^{\infty} \frac{D^p u \diamond_p D^p v}{p!} \approx \sum_{p=0}^{Q} \frac{D^p u \diamond_p D^p v}{p!}
14. WM approximation simplifies the gPC propagator!
u_t + u u_x = \nu u_{xx} + \sigma c_1(\xi;\lambda), \quad x \in [-\pi, \pi]
gPC propagator:
\frac{\partial \hat{u}_k}{\partial t} + \sum_{m,n=0}^P \hat{u}_m \frac{\partial \hat{u}_n}{\partial x} \langle c_m c_n c_k \rangle = \nu \frac{\partial^2 \hat{u}_k}{\partial x^2} + \sigma \delta_{1k}, \quad k = 0, 1, ..., P.
WM propagator:
\frac{\partial \hat{u}_k}{\partial t} + \sum_{p=0}^{Q} \sum_{i=0}^{P} \hat{u}_i \frac{\partial \hat{u}_{k+2p-i}}{\partial x} K_{i,k+2p-i,Q} = \nu \frac{\partial^2 \hat{u}_k}{\partial x^2} + \sigma \delta_{1k}, \quad k = 0, 1, ..., P.
How much less? Let us count the dots! [Figure: sparsity pattern of the propagator terms for k = 0, 1, 2, 3, 4 with P = 4, Q = 1/2.]
15. Spectral convergence when Q \ge P - 1
u_t + u u_x = \nu u_{xx} + \sigma c_1(\xi;\lambda), \quad u(x,0) = 1 - \sin(x), \qquad \xi \sim \mathrm{Pois}(\lambda), \quad x \in [-\pi, \pi], \quad \text{periodic B.C.}
16. Concept of P-Q refinement
u_t + u u_x = \nu u_{xx} + \sigma c_1(\xi;\lambda), \quad u(x,0) = 1 - \sin(x), \qquad \xi \sim \mathrm{Pois}(\lambda), \quad x \in [-\pi, \pi], \quad \text{periodic B.C.}
17. WM for stochastic Burgers equation w/ multiple RVs
u_t + u u_x = \nu u_{xx} + \sigma \sum_{j=1}^{3} c_1(\xi_j;\lambda) \cos(0.1 j t), \quad u(x,0) = 1 - \sin(x), \qquad \xi_{1,2,3} \sim \mathrm{Pois}(\lambda), \quad x \in [-\pi, \pi], \quad \text{periodic B.C.}
[Figure: l2 error of the second moment, l2u2(T), versus final time T for Q_1 = Q_2 = Q_3 = 0; Q_1 = 1, Q_2 = Q_3 = 0; Q_1 = Q_2 = 1, Q_3 = 0; and Q_1 = Q_2 = Q_3 = 1.]
How about 3 discrete RVs? How about the cost in d dimensions? Let us find the ratio C(P,Q)_d / (P+1)^{3d}, where C(P,Q)_d is the number of terms \hat{u}_i \frac{\partial \hat{u}_j}{\partial x} in the WM propagator and (P+1)^{3d} is the number of terms \hat{u}_m \frac{\partial \hat{u}_n}{\partial x} \langle c_m c_n c_k \rangle in the gPC propagator.
Cost, WM vs. gPC (ratio C(P,Q)_d / (P+1)^{3d}):
        P=3, Q=2    P=4, Q=3
d=2     61.0%       65.3%
d=3     47.7%       52.8%
d=4     0.000436%   0.0023%
18. Summary of contributions (2)
✰ Extend the numerical work on WM approximation for SPDEs driven by Gaussian RVs to discrete RVs with arbitrary distribution w/ finite moments
✰ Discover spectral convergence when Q \ge P - 1 for stochastic Burgers equations
✰ Error control with P-Q refinements
✰ Computational complexity comparison of gPC and WM in d dimensions
References
D. Bell, The Malliavin calculus, Dover, (2007).
S. Kaligotla and S.V. Lototsky, Wick product in the stochastic Burgers equation: a curse or a cure? Asymptotic Analysis, 75 (2011), pp. 145–168.
S.V. Lototsky, B.L. Rozovskii, and D. Selesi, On generalized Malliavin calculus, Stochastic Processes and their Applications, 122(3) (2012), pp. 808–843.
19. M. Zheng, G.E. Karniadakis, Numerical methods for SPDEs with tempered stable processes, SIAM J. Sci. Comput., accepted.
N. Hilber, O. Reichmann, Ch. Schwab, Ch. Winter, Computational Methods for Quantitative Finance: Finite Element Methods for Derivative Pricing, Springer Finance, 2013.
S.I. Denisov, W. Horsthemke, P. Hänggi, Generalized Fokker-Planck equation: derivation and exact solutions, Eur. Phys. J. B 68 (2009), pp. 567–575.
Generalized Fokker-Planck Equation for Overdamped Langevin Equation
✰ Overdamped Langevin equation (1D SODE, in the Itô sense)
✰ The density satisfies a tempered fractional PDE (by Itô's formula)
✰ The 1D tempered stable (TS) pure jump process has this Lévy measure
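The measure itself appeared as an image on the slide; for reference, the standard one-sided tempered stable Lévy measure, consistent with the parameters (α, c, λ) quoted in the captions below, is (an assumption on the exact variant, since the slide image is not recoverable):

```latex
\nu(dx) = c \, \frac{e^{-\lambda x}}{x^{1+\alpha}} \, dx, \qquad x > 0, \quad \alpha \in (0,2), \ c > 0, \ \lambda > 0 .
```

A symmetric two-sided version replaces x by |x|.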
20. Generalized FP Equation for Overdamped Langevin Equation driven by TS white noise

✰ Left Riemann-Liouville tempered fractional derivatives (as an example)
✰ Fully implicit scheme in time; Grünwald-Letnikov discretization for the fractional derivatives

Three approaches are compared:
✰ MC for the overdamped Langevin equation driven by TS white noise, via the compound Poisson (CP) approximation (probabilistic)
✰ PCM for the overdamped Langevin equation driven by TS white noise (probabilistic)
✰ TFPDE (deterministic)
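To make the CP approximation concrete, here is a minimal stdlib-only sketch (an illustration, not the thesis code): jumps of a one-sided TS process smaller than a cutoff δ are discarded, the remaining jump intensity is λ_δ = c ∫_δ^∞ e^{−λx} x^{−1−α} dx, and jump sizes are drawn by proposing from the untempered Pareto tail and accepting with probability e^{−λ(x−δ)}. All function names are illustrative.

```python
import math
import random

def ts_intensity(alpha, c, lam_t, delta, xmax=50.0, n=20000):
    """lambda_delta = c * integral_delta^xmax exp(-lam_t*x) x^{-1-alpha} dx,
    midpoint rule on a log-spaced grid (the tail beyond xmax is negligible here)."""
    lo, hi = math.log(delta), math.log(xmax)
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = math.exp(lo + (i + 0.5) * h)
        total += c * math.exp(-lam_t * x) * x ** (-alpha) * h   # x^{-1-a} dx = x^{-a} dlog
    return total

def sample_poisson(mean, rng):
    """Knuth's product-of-uniforms Poisson sampler (fine for moderate means)."""
    limit, k, prod = math.exp(-mean), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

def ts_jump(alpha, lam_t, delta, rng):
    """Rejection sampling: propose from the untempered Pareto tail on (delta, inf),
    accept with probability exp(-lam_t*(x - delta))."""
    while True:
        x = delta * rng.random() ** (-1.0 / alpha)
        if rng.random() < math.exp(-lam_t * (x - delta)):
            return x

def cp_skeleton(alpha, c, lam_t, delta, T, rng):
    """Jump times and sizes of the compound-Poisson approximation on [0, T]."""
    lam_delta = ts_intensity(alpha, c, lam_t, delta)
    n_jumps = sample_poisson(lam_delta * T, rng)
    times = sorted(rng.random() * T for _ in range(n_jumps))
    sizes = [ts_jump(alpha, lam_t, delta, rng) for _ in range(n_jumps)]
    return times, sizes

rng = random.Random(0)
lam_delta = ts_intensity(0.5, 1.0, 1.0, 0.01)
times, sizes = cp_skeleton(0.5, 1.0, 1.0, 0.01, 1.0, rng)
```

With α = 0.5, c = 1, λ = 1, δ = 0.01 (the parameters of the density comparison below), λ_δ ≈ 16.7, so a unit time interval carries on the order of 17 retained jumps.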
21. Histogram from MC vs. Density from TFPDEs

Zoomed-in plots of P(x, T) by TFPDEs and MC/CP at T = 0.5 (left) and T = 1 (right): α = 0.5, c = 1, λ = 1, x_0 = 1 and σ = 0.01 (left and right). In MC/CP: s = 10^5, δ = 0.01, Δt = 10^{-3} (left and right). In the TFPDEs: Δt = 10^{-5}, and N_x = 2000 points on [−12, 12] in space (left and right).
22. Moment Statistics from PCM/CP vs. TFPDE

TFPDE vs. PCM/CP: error of the 2nd moment of the solution versus time with λ = 10 (left) and λ = 1 (right); α = 0.5, c = 2, σ = 0.1, x_0 = 1 (left and right). For the TFPDE: finite difference scheme with Δt = 2.5 × 10^{-5}, N_x equidistant points on [−12, 12], initial condition given by δ_D (left and right).

The TFPDE costs much less computational time and is more accurate than PCM/CP.
23. M. Zheng, G.E. Karniadakis, Numerical methods for SPDEs with additive multi-dimensional Lévy jump processes, in preparation.

24. How to describe the dependence structure among components of a multi-dimensional Lévy jump process?

J. Kallsen, P. Tankov, Characterization of dependence of multidimensional Lévy processes using Lévy copulas, Journal of Multivariate Analysis 97 (2006), pp. 1551–1572.
✰ LePage's representation of the Lévy measure (series representation)
[Figure: simulated jump samples for τ = 1 and τ = 100.]

Lévy copula + marginal Lévy measures = Lévy measure
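The copula formula itself is not recoverable from this extract; as a point of reference (an assumption on my part, not necessarily the variant used in the thesis), the Clayton Lévy copula from the Kallsen-Tankov line of work, with dependence parameter here written τ, is

```latex
F_{\tau}(u, v) = \left( u^{-\tau} + v^{-\tau} \right)^{-1/\tau}, \qquad u, v \in (0, \infty),
```

which interpolates between independence as τ → 0 and complete dependence as τ → ∞.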
25. Analysis of variance (ANOVA) + FP = marginal distribution

✰ FP equation
✰ ANOVA decomposition
✰ ANOVA terms are related to marginal distributions
✰ 1D-ANOVA-FP for marginal distributions
✰ 2D-ANOVA-FP for marginal distributions
✰ LePage's representation → TFPDEs
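As background on the truncation that the 1D/2D-ANOVA-FP names refer to, here is a minimal sketch of an anchored ANOVA decomposition truncated at order 2 (an illustration under my own conventions, not the thesis construction; all names are hypothetical):

```python
from itertools import combinations

def anova_truncated(f, c, x, order=2):
    """Anchored ANOVA approximation of f(x), anchored at point c.
    Order-1 terms: f_i(x_i) = f(c_1,..,x_i,..,c_d) - f(c).
    Order-2 terms: f_ij    = f(..,x_i,..,x_j,..) - f_i - f_j - f(c)."""
    d = len(x)

    def f_sub(idx):
        # evaluate f with coordinates in idx taken from x, the rest from the anchor c
        z = list(c)
        for i in idx:
            z[i] = x[i]
        return f(z)

    f0 = f(list(c))
    total = f0
    f1 = {}
    for i in range(d):
        f1[i] = f_sub((i,)) - f0
        total += f1[i]
    if order >= 2:
        for i, j in combinations(range(d), 2):
            total += f_sub((i, j)) - f1[i] - f1[j] - f0
    return total

# order-2 anchored ANOVA is exact for a function with at most pairwise interactions
f = lambda z: 1 + z[0] + z[1] * z[2] + 3 * z[2]
c = [0.5] * 3
val = anova_truncated(f, c, [0.1, 0.2, 0.3], order=2)   # equals f([0.1, 0.2, 0.3])
```

Order-k truncation reproduces exactly any function built from at-most-k-variable terms, which is why low-order ANOVA-FP can already capture moments well.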
26. Moments: 1D-ANOVA-FP is accurate for E[u] in 10D

[Figure, left: E[u(x, T=1)] for x ∈ [0, 1] computed by PCM, 1D-ANOVA-FP, and 2D-ANOVA-FP. Right: relative differences ||E[u_{1D-ANOVA-FP}] − E[u_{PCM}]||_{L²([0,1])} / ||E[u_{PCM}]||_{L²([0,1])} and the 2D-ANOVA-FP analogue vs. T ∈ [0.6, 1], both on the order of 10^{-4}.]

Noise-to-signal ratio NSR ≈ 18.24%.
27. Moments: 1D-ANOVA-FP is not accurate for E[u²] in 10D

[Figure, left: E[u²(x, T=1)] for x ∈ [0, 1] computed by PCM, 1D-ANOVA-FP, and 2D-ANOVA-FP. Right: relative differences ||E[u²_{1D-ANOVA-FP}] − E[u²_{PCM}]||_{L²([0,1])} / ||E[u²_{PCM}]||_{L²([0,1])} and the 2D-ANOVA-FP analogue vs. T ∈ [0.6, 1], on a scale up to ≈ 0.4.]

NSR ≈ 18.24%.
28. Moments: PCM vs. FP (TFPDE)

The initial condition of the FP equation introduces error.

[Figure: l_2 error of u² vs. t ∈ [0.2, 1] (log scale, 10^{-10} to 10^{-2}) for PCM/S with Q = 5, q = 2; PCM/S with Q = 10, q = 2; and the TFPDE. NSR ≈ 4.8%.]
Moments: PCM vs. MC

[Figure: l_2 error of u²(t = 1) vs. number of samples s (log-log, s from 10^0 to 10^6, error from 10^{-4} to 10^{-1}) for PCM/S with q = 1, PCM/S with q = 2, and MC/S with Q = 40, under LePage's representation (2D).]

PCM costs less than MC.

Q — truncation level of the series representation
q — # of quadrature points in each dimension
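In a nutshell, PCM solves the deterministic problem at quadrature nodes and combines the outputs with the quadrature weights. A minimal sketch on a toy problem, du/dt = −ξu with ξ ∼ Uniform(0, 1) and 3-point Gauss-Legendre collocation (an illustration, not the thesis setup):

```python
import math

# 3-point Gauss-Legendre rule on [-1, 1]: nodes +-sqrt(3/5), 0 with weights 5/9, 8/9, 5/9
GL3 = [(-math.sqrt(3.0 / 5.0), 5.0 / 9.0), (0.0, 8.0 / 9.0), (math.sqrt(3.0 / 5.0), 5.0 / 9.0)]

def pcm_second_moment(T=1.0):
    """Collocation estimate of E[u(T)^2] for du/dt = -xi*u, u(0) = 1,
    xi ~ Uniform(0, 1): solve the deterministic problem at each mapped node,
    then combine with the quadrature weights."""
    total = 0.0
    for node, weight in GL3:
        xi = 0.5 * (node + 1.0)    # map node from [-1, 1] to [0, 1]
        u_T = math.exp(-xi * T)    # exact deterministic solve at this node
        total += 0.5 * weight * u_T ** 2
    return total

exact = (1.0 - math.exp(-2.0)) / 2.0   # E[e^{-2 xi}] = integral_0^1 e^{-2x} dx
approx = pcm_second_moment()
```

For this smooth toy problem three nodes already give a few correct digits; the thesis applies the same idea with sparse grids (PCM/S) over the truncated series representation.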
29. Density: MC vs. FP equation (2D Lévy)

[Figure: density comparison; labels: LePage's representation, 2D — MC, 3D — FP/TFPDE, Lévy copula.]
31. Summary of contributions (3, 4)

✰ Established a framework for UQ of SPDEs w/ multi-dimensional Lévy jump processes by probabilistic (MC, PCM) and deterministic (FP) methods
✰ Combined ANOVA & FP to simulate moments of the solution at lower orders
✰ Improved the traditional MC method's efficiency and accuracy
✰ Linked the areas of fractional PDEs & UQ for SPDEs w/ Lévy jump processes
32. Future work

✰ Simulate SPDEs driven by higher-dimensional Lévy jump processes with ANOVA-FP
✰ Consider jump processes other than TS processes
✰ Consider nonlinear SPDEs w/ multiplicative multi-dimensional Lévy jump processes
✰ Application to the Energy Balance Model in climate modeling
✰ Application to Mathematical Finance
33. Acknowledgements

✰ Thanks to Prof. George Em Karniadakis for his advice and support
✰ Thanks to Prof. Xiaoliang Wan and Prof. Hui Wang for serving on my committee
✰ Thanks to Prof. Xiaoliang Wan and Prof. Boris Rozovskii for their innovative ideas and collaboration
✰ Thanks for the support from the NSF/DMS (grant DMS-0915077) and the Air Force MURI (grant FA9550-09-1-0613)