Introduction to Monte Carlo Methods
Sarath Seneviratne
(B.Sc (Hons), M.Sc, PG.Dip.Comp, M.Med.Phy)
Contact email: Sarath@alumi.rpi.edu
Monte Carlo Method
Monte Carlo Method: any technique of statistical sampling employed to approximate solutions to quantitative problems.
Stanislaw Ulam (Polish-American mathematician), who worked with John von Neumann and Nicholas Metropolis on the "Manhattan Project", is recognised as the first scientist to use computers to transform non-random problems into random forms that would facilitate their solution via statistical sampling.
Nicholas Metropolis and Stanislaw Ulam published the first paper on the Monte Carlo method in 1949.
MC Simulation plays a major role in various phases of experimental physics projects:
• design of the experimental set-up
• evaluation and definition of the potential physics output of the project
• evaluation of potential risks to the project
• assessment of the performance of the experiment
• development, test and optimization of reconstruction, physics analysis and data acquisition software
• contribution to the calculation and validation of physics results
• estimation of non-measurable events
Introduction
Deterministic Model
Produces the same output for given starting conditions.
For example, the return on an investment:
F = P \left(1 + \frac{r}{m}\right)^{m y}
P – initial investment
r – annual interest rate
m – number of compounding periods per year
y – number of years
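As a minimal illustration of what "deterministic" means here, the sketch below evaluates this compound-interest formula in Python; the numerical inputs are hypothetical.

```python
# Deterministic model: F = P * (1 + r/m)**(m*y)
# The same inputs always produce exactly the same output (no randomness involved).

def future_value(P: float, r: float, m: int, y: int) -> float:
    """Value of an investment P after y years at annual rate r, compounded m times per year."""
    return P * (1.0 + r / m) ** (m * y)

if __name__ == "__main__":
    # Hypothetical example: 1000 at 5% annual interest, compounded monthly, for 10 years
    print(f"F = {future_value(1000.0, 0.05, 12, 10):.2f}")  # identical on every run
```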
Stochastic Model
(Figure: output shown as an expectation plus a random error component)
Examples:
• Stock market
• Particle interaction with matter
Random Process
Described in probabilistic terms
All the properties are given as averages
• Ensemble averaging:
  Properties of the process are obtained by averaging over a collection or 'ensemble' of sample records, using values at corresponding times
• Time averaging:
  Properties are obtained by averaging over a single record in time
Basic properties of a single random process:
  mean, standard deviation, auto-correlation, spectral density
Joint properties of two or more random processes:
  correlation, covariance, cross spectral density, simple input-output relations
Stationary random process:
  Ensemble averages do not vary with time
Random Process
The probability density function describes the general distribution of the magnitude of the random process, but it gives no information on the time or frequency content of the process.
(Figure: a sample record x(t) versus time t, and its probability density f(x) versus x)
Probability Density Function
P\{a < x < b\} = \int_a^b f_x(x)\,dx
If f_x(x) is a continuous normalised probability density function:
\int_{-\infty}^{+\infty} f_x(x)\,dx = 1  and  f_x(x) > 0 for all x
The probability that x lies between the values a and b is the area under the graph of f_x(x) between x = a and x = b.
(Figures: f_x(x) versus x with the shaded area P(a < x < b) between x = a and x = b; a second plot marks the mode, the mean and the width σ_x of the distribution)
Cumulative distribution function
The cumulative distribution function (c.d.f.) is the integral between -∞ and x of f_x(x):
F_x(x) = \int_{-\infty}^{x} f_x(x')\,dx'
(Figure: f_x(x) versus x; F(a) is the area under the curve up to x = a)
Central Limit Theorem
Sample averages from a random process with finite mean and variance always approximate a normal distribution.
Another interpretation: the sum of independent samples from any distribution with finite mean and variance converges to the normal distribution as the sample size goes to infinity.
(Figure: Gaussian with µ = 0, σ = 1 and normal with µ = 4, σ = 2)
f(x \mid \mu, \sigma) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
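A small sketch of the theorem in action: means of samples drawn from a distinctly non-normal (exponential) distribution pile up in an approximately Gaussian shape around the true mean. The sample size of 30 and the 10,000 repetitions are arbitrary illustrative choices.

```python
import random
import statistics

# Central Limit Theorem demo: averages of exponential samples look Gaussian.
random.seed(1)

def sample_mean(n: int) -> float:
    """Mean of n draws from an exponential distribution with mean 1."""
    return sum(random.expovariate(1.0) for _ in range(n)) / n

means = [sample_mean(30) for _ in range(10_000)]

# For exponential(1), mu = 1 and sigma = 1, so the means should be
# approximately Normal(1, 1/sqrt(30)).
print("mean of means:", statistics.mean(means))   # ~1.0
print("std of means :", statistics.stdev(means))  # ~1/sqrt(30) ~ 0.18
```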
Mean and Variance
The expected value (mean) of discrete variables:
If x is a discrete random variable with possible values x_1, x_2, x_3, ..., x_N,
\bar{x} = \frac{1}{N} \sum_{i=1}^{N} x_i
If x is a continuous random variable with probability density function f(x), then the mean value is
\bar{x} = \int_{-\infty}^{+\infty} x\,f(x)\,dx
Sample variance:
\sigma_x^2 \equiv \frac{1}{N-1} \sum_{i=1}^{N} (x_i - \bar{x})^2
Variance of a continuous random variable:
\sigma_x^2 = \int_{-\infty}^{+\infty} (x - \bar{x})^2 f(x)\,dx
Random Sampling
Uniform distribution in sampling space:
p(r) = 1 for 0 ≤ r < 1;  p(r) = 0 otherwise
Probability distribution from which samples are being drawn:
p(x) ≥ 0 for all x
(Figure: the uniform density p(r) on [0, 1) and an arbitrary target density p(x))
Random Numbers
Pseudorandom numbers
• Generated by a deterministic computer algorithm
• Uniform distribution over a predefined range
• Can repeat a calculation with the same sequence of numbers, which makes the simulations easy to debug
• Should produce a large number of unique numbers before repeating the cycle
• For example, if we are studying the sensitivity of a calculation to variation in a selected parameter, we can reduce the variance of the difference between results calculated with two trial values of the parameter by using the same random number sequence (see the sketch below).
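A minimal sketch of that variance-reduction idea with Python's standard pseudorandom generator: re-seeding reproduces the same sequence, so two runs that differ only in the studied parameter share their statistical noise. The toy model and parameter values are made up for illustration.

```python
import random
import statistics

def noisy_model(param: float, seed: int, n: int = 100_000) -> float:
    """Toy Monte Carlo estimate: mean of param * r over n pseudorandom draws."""
    rng = random.Random(seed)          # deterministic, repeatable sequence
    return statistics.mean(param * rng.random() for _ in range(n))

# Same seed -> same random number sequence -> the difference between the two
# runs reflects only the change in the parameter, not statistical noise.
a = noisy_model(1.00, seed=42)
b = noisy_model(1.01, seed=42)
print("difference with common random numbers:", b - a)   # very close to 0.005

# Different seeds -> the difference is contaminated by independent noise.
c = noisy_model(1.01, seed=7)
print("difference with independent sequences:", c - a)
```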
Random Numbers from Probability Distributions
Direct Method
• Use the probability distribution directly
• For example: binomial random numbers
  Assume a probability p of a head on any single toss of a coin.
  If you generate N uniform random numbers on the interval (0,1) and count the numbers less than p, the count is a binomial random number with parameters N and p (a sketch follows below).
(Figure: histogram of binomial counts for N = 100, p = 0.3)
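A short sketch of this direct method; the figures N = 100 and p = 0.3 are the ones quoted above.

```python
import random
import statistics

random.seed(0)

def binomial_deviate(n: int, p: float) -> int:
    """Direct method: count how many of n uniform deviates on (0,1) fall below p."""
    return sum(1 for _ in range(n) if random.random() < p)

# 10,000 binomial deviates with N = 100, p = 0.3
counts = [binomial_deviate(100, 0.3) for _ in range(10_000)]
print("mean:", statistics.mean(counts))    # ~ N*p = 30
print("std :", statistics.stdev(counts))   # ~ sqrt(N*p*(1-p)) ~ 4.6
```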
Random Numbers from Probability Distributions
Transformation Method (Inverse transform method)
Numbers are drawn from a specific probability distribution P(x).
• Let us define uniform deviates p(r) drawn from a standard probability density distribution that is uniform between r = 0 and r = 1:
p(r) = 1 for 0 ≤ r < 1;  p(r) = 0 otherwise
Cumulative distribution function:
F_x(x) = \int_{-\infty}^{x} P(x')\,dx'
This method is used to generate normal, exponential and other types of distributions where the inverse is well defined and easy to obtain.
Transformation Method
f(x) = \frac{1}{4\sqrt{x}} for 0 < x ≤ 4
Cumulative function: F(x) = \frac{\sqrt{x}}{2} \in [0, 1]
X = F^{-1}(r) = (2r)^2
(Figure: uniform deviates r mapped through F^{-1} onto x)

Transformation Method
Simulation of 10,000 events drawn from f(x) = \frac{1}{4\sqrt{x}}, 0 < x ≤ 4
(Figure: histogram of the sampled events compared with f(x))
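A minimal inverse-transform sketch, assuming the reconstruction f(x) = 1/(4√x) on (0, 4] with F(x) = √x/2, so that x = F⁻¹(r) = (2r)².

```python
import random
import statistics

random.seed(0)

def sample_x() -> float:
    """Inverse transform sampling for f(x) = 1/(4*sqrt(x)), 0 < x <= 4.
    The CDF is F(x) = sqrt(x)/2, so x = F^{-1}(r) = (2*r)**2."""
    r = random.random()        # uniform deviate on [0, 1)
    return (2.0 * r) ** 2

events = [sample_x() for _ in range(10_000)]

# Analytic mean of this distribution: E[x] = 4*E[r^2] = 4/3 ~ 1.33
print("sample mean:", statistics.mean(events))
```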
Random Numbers from Probability Distribution
Acceptance-Rejection Method
• Very easy to use
• Successively generates random values from the uniform distribution
• Evaluates the probability function for each generated random event
• If the event falls within the probability distribution function – "Hit", otherwise "reject"
• In a complex Monte Carlo simulation only a small fraction of the events may survive.

Acceptance-Rejection Method
The aim is to generate a random number from a continuous distribution with PDF f(x), where f(x) ≤ c g(x) for some c and all x.
1. Choose a density function g(x)
2. Find a constant c such that f(x) ≤ c g(x) for all x
3. Generate a uniform random number u between 0 & 1
4. Generate a random number v from g(x)
5. If c*u ≤ f(v)/g(v), accept and return v
6. Otherwise reject and go to step 3
Acceptance-Rejection Method
Example:
f(x) = x\,e^{-x^2/2}
g(x) = e^{-x}
c = 2.2, so that f(x) ≤ c g(x)
(Figure: f(x) and the envelope c·g(x))

Acceptance-Rejection Method
f(x) = x\,e^{-x^2/2}
(Figure: histogram of accepted samples compared with f(x))
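A sketch of the rejection sampler for this example, assuming the reconstructed target f(x) = x·exp(−x²/2) with the exponential envelope g(x) = exp(−x); the constant is taken slightly above the quoted c = 2.2, since the maximum of f/g is about 2.204.

```python
import math
import random

random.seed(0)
C = 2.25   # envelope constant; max of f(x)/g(x) is ~2.204 (the slide quotes 2.2)

def f(x: float) -> float:
    """Target density: f(x) = x * exp(-x**2 / 2) for x >= 0 (a Rayleigh shape)."""
    return x * math.exp(-x * x / 2.0)

def sample_f() -> float:
    """Acceptance-rejection with exponential envelope g(x) = exp(-x)."""
    while True:
        u = random.random()                # uniform deviate on [0, 1)
        v = random.expovariate(1.0)        # proposal drawn from g(x)
        if C * u <= f(v) / math.exp(-v):   # accept if c*u <= f(v)/g(v)
            return v

samples = [sample_f() for _ in range(10_000)]
print("sample mean:", sum(samples) / len(samples))   # analytic mean = sqrt(pi/2) ~ 1.25
```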
Error in Monte Carlo Method
The variance in a Monte Carlo simulation can be calculated for N measurements as
s^2 = \frac{1}{N}\sum_{i=1}^{N} x_i^2 - \left(\frac{1}{N}\sum_{i=1}^{N} x_i\right)^2
Standard deviation: \sigma = \sqrt{s^2}
This is the standard deviation of a single measurement; it will be several orders of magnitude higher than the actual error of the result.
If the simulation is divided into m sets of measurements, each with n trials (measurements), then the error will be
\sigma_{err} = \frac{\sigma}{\sqrt{n-1}} \approx \frac{\sigma}{\sqrt{n}}
Calculate value of π
(Figure: random points in a square of side 2, with x, y ∈ [-1, 1], and the inscribed unit circle)
\frac{\text{Area of a circle}}{\text{Area of enclosing square}} = \frac{\pi R^2}{(2R)^2} = \frac{\pi}{4}
\pi = 4 \times \frac{\text{score within circle}}{\text{Total events generated}}
Result = 3.1417 ± 0.001 for N = 10^7 events
On a Gaussian distribution, 68.3% and 95% of estimates fall within ±σ and ±2σ respectively.
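A compact hit-or-miss sketch of this estimate, including the statistical error of the result; the event count is reduced to 10^6 so that it runs quickly.

```python
import math
import random

random.seed(0)

def estimate_pi(n: int):
    """Hit-or-miss Monte Carlo estimate of pi with its statistical error."""
    hits = 0
    for _ in range(n):
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:      # point falls inside the unit circle
            hits += 1
    p = hits / n                      # fraction inside ~ pi/4
    pi_hat = 4.0 * p
    err = 4.0 * math.sqrt(p * (1.0 - p) / n)   # binomial error propagated to pi
    return pi_hat, err

pi_hat, err = estimate_pi(1_000_000)
print(f"pi ~ {pi_hat:.4f} +/- {err:.4f}")      # compare with math.pi = 3.14159...
```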
Gaussian Distribution
Almost any MC that simulates experimental measurements will require the generation of deviates drawn from a Gaussian distribution.
The normal distribution with mean 0 and standard deviation 1:
P(z)\,dz = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{z^2}{2}\right) dz
To scale to a different mean µ and standard deviation σ:
x = \sigma z + \mu
Gaussian Distribution
• An interesting method for generating Gaussian deviates is based on the fact that if we repeatedly calculate the mean of a group of numbers drawn randomly from any distribution, the distribution of those means tends to a Gaussian as the number of means increases.
• If we calculate many times the sums of N uniform deviates drawn from the uniform distribution, we should expect the sums to fall into a truncated Gaussian distribution (bounded by 0 & N) with mean value N/2.
• If we generate N values of r from a uniform distribution p(r) = 1 for 0 ≤ r < 1 (0 otherwise), then
r_G = \sum_{i=1}^{N} r_i - \frac{N}{2}, with \mu = 0 and \sigma = \sqrt{N/12}
When N = 12, the variance and standard deviation become 1 (a sketch follows below).
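A minimal sketch of this sum-of-uniforms generator with N = 12, where no rescaling of the width is needed.

```python
import random
import statistics

random.seed(0)

def gaussian_from_uniforms(n: int = 12) -> float:
    """Approximate standard normal deviate: sum of n uniforms minus n/2.
    With n = 12 the variance of the sum is 12 * (1/12) = 1."""
    return sum(random.random() for _ in range(n)) - n / 2.0

z = [gaussian_from_uniforms() for _ in range(100_000)]
print("mean:", statistics.mean(z))    # ~0
print("std :", statistics.stdev(z))   # ~1 (deviates are truncated to [-6, 6])
```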
Gaussian Distribution
Box and Muller (1958) suggested generating two Gaussian deviates z_1 & z_2 from two uniform deviates r_1 & r_2:
z_1 = \sqrt{-2\ln r_1}\,\cos(2\pi r_2)
z_2 = \sqrt{-2\ln r_1}\,\sin(2\pi r_2)
(Figure: histogram of 100,000 deviates)
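A short sketch of the Box-Muller transform as written above; the only liberty taken is shifting r1 away from zero so the logarithm is defined.

```python
import math
import random
import statistics

random.seed(0)

def box_muller():
    """Return two independent standard normal deviates from two uniform deviates."""
    r1 = 1.0 - random.random()          # uniform on (0, 1], keeps log(r1) finite
    r2 = random.random()
    radius = math.sqrt(-2.0 * math.log(r1))
    return radius * math.cos(2.0 * math.pi * r2), radius * math.sin(2.0 * math.pi * r2)

deviates = []
for _ in range(50_000):                 # 50,000 pairs -> 100,000 deviates
    z1, z2 = box_muller()
    deviates.extend((z1, z2))

print("mean:", statistics.mean(deviates))   # ~0
print("std :", statistics.stdev(deviates))  # ~1
```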
Poisson Distribution
Does not have the convenient scaling properties of the Gaussian function.
To find an integer x drawn from the Poisson distribution with mean µ:
• Generate a random variable r from the uniform distribution and find the smallest integer x such that
r \le \sum_{k=0}^{x} \frac{\mu^k e^{-\mu}}{k!}
At larger values of µ, the Poisson distribution becomes indistinguishable from the Gaussian function.
Build the Poisson cumulative distribution table and look up the generated random number in the table.
For example: a Geiger counter experiment with an assumed mean counting rate of 8.4 counts per 20 s interval. Build a table for the Poisson distribution with µ = 8.4.
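A sketch of this table look-up, using the Geiger-counter example with µ = 8.4; the cut-off tolerance for the table length is an arbitrary choice.

```python
import math
import random

random.seed(0)

def poisson_cdf_table(mu: float, eps: float = 1e-12):
    """Cumulative Poisson probabilities P(X <= k), built until the tail is negligible."""
    table, term, total, k = [], math.exp(-mu), 0.0, 0
    while total < 1.0 - eps:
        total += term
        table.append(total)
        k += 1
        term *= mu / k                 # p(k) = p(k-1) * mu / k
    return table

def poisson_deviate(table) -> int:
    """Look up a uniform deviate in the cumulative table."""
    r = random.random()
    for k, cumulative in enumerate(table):
        if r <= cumulative:
            return k
    return len(table)                  # extremely rare tail overflow

table = poisson_cdf_table(8.4)         # mean counting rate: 8.4 counts per 20 s
counts = [poisson_deviate(table) for _ in range(100_000)]
print("sample mean:", sum(counts) / len(counts))   # ~8.4
```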
Exponential Distribution
Used to simulate unstable states such as particle decay.
The probability density function for obtaining a count at time t from an exponential distribution with mean life τ:
P(t; \tau) = \frac{e^{-t/\tau}}{\tau},  with P(t; \tau) = 0 for t < 0
A random sample t_i from the distribution can be obtained from
t_i = -\tau \ln r_i
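A minimal decay-time sampler following t_i = −τ ln r_i; the mean-life value used here is purely illustrative.

```python
import math
import random
import statistics

random.seed(0)

def decay_time(tau: float) -> float:
    """Sample a decay time t from P(t; tau) = exp(-t/tau)/tau via t = -tau*ln(r)."""
    r = 1.0 - random.random()          # uniform on (0, 1], avoids ln(0)
    return -tau * math.log(r)

tau = 2.2e-6                           # hypothetical mean life in seconds
times = [decay_time(tau) for _ in range(100_000)]
print("sample mean life:", statistics.mean(times))   # ~tau
```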
Composition + Rejection
If the probability density f(x) can be written as
f(x) = \sum_{i=1}^{n} \alpha_i f_i(x)\,g_i(x)
where 0 \le g_i(x) \le 1, \alpha_i > 0, f_i(x) \ge 0 and \int f_i(x)\,dx = 1,
then the f_i(x) must be easily sampled and the g_i(x) should not be much smaller than unity, to keep rejections low.
First choose an integer i from 1 to n, with probability proportional to α_i.
A sample x is drawn from the distribution with frequency function f_i(x), and the value is accepted or rejected by computing g_i(x) and comparing it with a random number r:
if r > g_i(x), reject x
if r ≤ g_i(x), accept x
so that the probability of accepting x is g_i(x).
For example, Compton scattering.
Compton Scattering
Klein-Nishina DCS:
\frac{d\sigma}{d\varepsilon} = \pi r_e^2 \,\frac{m_e c^2}{E_0}\, Z \left[\frac{1}{\varepsilon} + \varepsilon\right]\left[1 - \frac{\varepsilon \sin^2\theta}{1 + \varepsilon^2}\right]
\varepsilon = \frac{E_1}{E_0} = \frac{m_e c^2}{m_e c^2 + E_0(1 - \cos\theta)}
where
r_e – classical electron radius
m_e c^2 – electron rest mass energy
E_0, E_1 – energy of the incident and scattered photons
Compton Scattering
(Figure slide)

Compton Scattering
Minimum photon energy (backward scattering): \varepsilon_0 = \frac{m_e c^2}{m_e c^2 + 2E_0}
\Phi(\varepsilon) \cong \left[\frac{1}{\varepsilon} + \varepsilon\right]\left[1 - \frac{\varepsilon \sin^2\theta}{1 + \varepsilon^2}\right], \quad \varepsilon \in [\varepsilon_0, 1]
\Phi(\varepsilon) = f(\varepsilon)\cdot g(\varepsilon) = \left[\alpha_1 f_1(\varepsilon) + \alpha_2 f_2(\varepsilon)\right]\cdot g(\varepsilon)
where \alpha_1 = \ln(1/\varepsilon_0), f_1(\varepsilon) = \frac{1}{\alpha_1 \varepsilon}; \quad \alpha_2 = \frac{1 - \varepsilon_0^2}{2}, f_2(\varepsilon) = \frac{\varepsilon}{\alpha_2}
f_1 and f_2 are probability density functions on [\varepsilon_0, 1]:
f(\varepsilon) = \sum_{i=1}^{2} \alpha_i f_i(\varepsilon) = \frac{1}{\varepsilon} + \varepsilon > 0 for \varepsilon \in [\varepsilon_0, 1]
Rejection function:
g(\varepsilon) = 1 - \frac{\varepsilon \sin^2\theta}{1 + \varepsilon^2}, \quad 0 < g(\varepsilon) \le 1 for all \varepsilon \in [\varepsilon_0, 1]
Sampling Photon Energy
f_1 and f_2 – normalised probability density functions on [ε_0, 1]
Random numbers (r) are drawn from the uniform distribution [0, 1]
Step 1
• If r < α_1/(α_1 + α_2), select f_1; otherwise f_2
Step 2
• Sample ε from the distribution corresponding to f_1 or f_2
  For f_1: ε = exp(-rα_1)
  For f_2: ε^2 = ε_0^2 + (1 - ε_0^2) r
Step 3
• Use the value of ε from step 2 to calculate sin^2θ = t(2 - t) and then g(ε),
  where t = (1 - cosθ) = m_e c^2 (1 - ε)/(E_0 ε)
Step 4
• If r ≤ g(ε), accept ε; otherwise go to step 1 (a code sketch follows below)
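The sketch below strings these four steps together to sample ε for a given incident energy E0, under the Klein-Nishina reconstruction above (no atomic form factor). It draws a fresh uniform deviate at each step; the electron rest energy constant and the 50 keV test energy (matching the Au, E0 = 50 keV example on the next slide) are the only numerical inputs, and none of this is the GEANT4 implementation itself.

```python
import math
import random

random.seed(0)
ME_C2 = 510.998950  # electron rest energy in keV

def sample_epsilon(e0_kev: float) -> float:
    """Sample epsilon = E1/E0 from the (reconstructed) Klein-Nishina distribution
    by composition (f1, f2) plus rejection on g(eps). Fresh uniforms at each step."""
    eps0 = ME_C2 / (ME_C2 + 2.0 * e0_kev)        # backward-scattering minimum
    alpha1 = math.log(1.0 / eps0)
    alpha2 = (1.0 - eps0 * eps0) / 2.0
    while True:
        # Step 1: choose f1 or f2 in proportion to alpha1, alpha2
        if random.random() < alpha1 / (alpha1 + alpha2):
            eps = math.exp(-alpha1 * random.random())             # f1: 1/(alpha1*eps)
        else:
            eps2 = eps0 * eps0 + (1.0 - eps0 * eps0) * random.random()
            eps = math.sqrt(eps2)                                 # f2: eps/alpha2
        # Step 3: rejection function g(eps) = 1 - eps*sin^2(theta)/(1 + eps^2)
        t = ME_C2 * (1.0 - eps) / (e0_kev * eps)                  # t = 1 - cos(theta)
        sin2_theta = t * (2.0 - t)
        g = 1.0 - eps * sin2_theta / (1.0 + eps * eps)
        # Step 4: accept, or go back to step 1
        if random.random() <= g:
            return eps

# Example: 50 keV incident photons
samples = [sample_epsilon(50.0) for _ in range(100_000)]
print("mean scattered-energy fraction:", sum(samples) / len(samples))
```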
Compton Scattering without atomic form factor
Klein-Nishina DCS:
\frac{d\sigma}{d\varepsilon} = \pi r_e^2 \,\frac{m_e c^2}{E_0}\, Z \left[\frac{1}{\varepsilon} + \varepsilon\right]\left[1 - \frac{\varepsilon \sin^2\theta}{1 + \varepsilon^2}\right], \quad \varepsilon = \frac{E_1}{E_0} = \frac{m_e c^2}{m_e c^2 + E_0(1 - \cos\theta)}
r_e – classical electron radius; m_e c^2 – electron rest mass energy; E_0, E_1 – energy of the incident and scattered photons
(Figure: simulated dσ/dε for 1,000,000 events, Au, E_0 = 50 keV)
Low Energy Compton Scattering in GEANT4
Hubbel's atomic form factor (scattering function), SF:
P(ε, q) = Φ(ε) × SF(q)
Momentum transfer q = E sin^2(θ/2)
• Energy distribution of the scattered photon according to the Klein-Nishina formula, multiplied by the scattering function F(q) (Hubbel's atomic factor) from the EPDL97 data library
• The effect of the scattering function becomes significant at low energies in suppressing forward scattering
• Angular distribution of the scattered photon and the recoil electron is also based on EPDL97
More Related Content

What's hot

Spatial Point Processes and Their Applications in Epidemiology
Spatial Point Processes and Their Applications in EpidemiologySpatial Point Processes and Their Applications in Epidemiology
Spatial Point Processes and Their Applications in EpidemiologyLilac Liu Xu
 
Runtime Analysis of Population-based Evolutionary Algorithms
Runtime Analysis of Population-based Evolutionary AlgorithmsRuntime Analysis of Population-based Evolutionary Algorithms
Runtime Analysis of Population-based Evolutionary AlgorithmsPK Lehre
 
Batch mode reinforcement learning based on the synthesis of artificial trajec...
Batch mode reinforcement learning based on the synthesis of artificial trajec...Batch mode reinforcement learning based on the synthesis of artificial trajec...
Batch mode reinforcement learning based on the synthesis of artificial trajec...Université de Liège (ULg)
 
Beyond function approximators for batch mode reinforcement learning: rebuildi...
Beyond function approximators for batch mode reinforcement learning: rebuildi...Beyond function approximators for batch mode reinforcement learning: rebuildi...
Beyond function approximators for batch mode reinforcement learning: rebuildi...Université de Liège (ULg)
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical MethodsChristian Robert
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsChristian Robert
 
Expectation propagation
Expectation propagationExpectation propagation
Expectation propagationDong Guo
 
Digital Signal Processing and Control System under MATLAB Environment
Digital Signal Processing and Control System under MATLAB EnvironmentDigital Signal Processing and Control System under MATLAB Environment
Digital Signal Processing and Control System under MATLAB EnvironmentParamjeet Singh Jamwal
 
Approximation in Stochastic Integer Programming
Approximation in Stochastic Integer ProgrammingApproximation in Stochastic Integer Programming
Approximation in Stochastic Integer ProgrammingSSA KPI
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerChristian Robert
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?Christian Robert
 

What's hot (19)

A bit about мcmc
A bit about мcmcA bit about мcmc
A bit about мcmc
 
Spatial Point Processes and Their Applications in Epidemiology
Spatial Point Processes and Their Applications in EpidemiologySpatial Point Processes and Their Applications in Epidemiology
Spatial Point Processes and Their Applications in Epidemiology
 
Dsp2
Dsp2Dsp2
Dsp2
 
Runtime Analysis of Population-based Evolutionary Algorithms
Runtime Analysis of Population-based Evolutionary AlgorithmsRuntime Analysis of Population-based Evolutionary Algorithms
Runtime Analysis of Population-based Evolutionary Algorithms
 
Batch mode reinforcement learning based on the synthesis of artificial trajec...
Batch mode reinforcement learning based on the synthesis of artificial trajec...Batch mode reinforcement learning based on the synthesis of artificial trajec...
Batch mode reinforcement learning based on the synthesis of artificial trajec...
 
Beyond function approximators for batch mode reinforcement learning: rebuildi...
Beyond function approximators for batch mode reinforcement learning: rebuildi...Beyond function approximators for batch mode reinforcement learning: rebuildi...
Beyond function approximators for batch mode reinforcement learning: rebuildi...
 
Monte Carlo Statistical Methods
Monte Carlo Statistical MethodsMonte Carlo Statistical Methods
Monte Carlo Statistical Methods
 
Approximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forestsApproximate Bayesian model choice via random forests
Approximate Bayesian model choice via random forests
 
Big model, big data
Big model, big dataBig model, big data
Big model, big data
 
ABC-Gibbs
ABC-GibbsABC-Gibbs
ABC-Gibbs
 
Expectation propagation
Expectation propagationExpectation propagation
Expectation propagation
 
Digital Signal Processing and Control System under MATLAB Environment
Digital Signal Processing and Control System under MATLAB EnvironmentDigital Signal Processing and Control System under MATLAB Environment
Digital Signal Processing and Control System under MATLAB Environment
 
Bioalgo 2012-04-hmm
Bioalgo 2012-04-hmmBioalgo 2012-04-hmm
Bioalgo 2012-04-hmm
 
Approximation in Stochastic Integer Programming
Approximation in Stochastic Integer ProgrammingApproximation in Stochastic Integer Programming
Approximation in Stochastic Integer Programming
 
Estimating Space-Time Covariance from Finite Sample Sets
Estimating Space-Time Covariance from Finite Sample SetsEstimating Space-Time Covariance from Finite Sample Sets
Estimating Space-Time Covariance from Finite Sample Sets
 
Coordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like samplerCoordinate sampler : A non-reversible Gibbs-like sampler
Coordinate sampler : A non-reversible Gibbs-like sampler
 
Can we estimate a constant?
Can we estimate a constant?Can we estimate a constant?
Can we estimate a constant?
 
ma112011id535
ma112011id535ma112011id535
ma112011id535
 
Channel coding
Channel codingChannel coding
Channel coding
 

Viewers also liked

Types of stages and drapes - Theatre 1
Types of stages and drapes - Theatre 1Types of stages and drapes - Theatre 1
Types of stages and drapes - Theatre 1brandonjsnyder
 
How to design a film set
How  to design  a film setHow  to design  a film set
How to design a film setcouerdeglam
 
Mock presentation2
Mock presentation2Mock presentation2
Mock presentation2fuad Mwalad
 
Tech i set design
Tech i set designTech i set design
Tech i set designvprahl
 
Set design Ashley Sanchez
Set design Ashley SanchezSet design Ashley Sanchez
Set design Ashley SanchezSanchex1
 
Scenic Design Chapter 15
Scenic Design Chapter 15Scenic Design Chapter 15
Scenic Design Chapter 15georgeyesdccd
 
Theatre stages and_terms
Theatre stages and_termsTheatre stages and_terms
Theatre stages and_termsandy motilal
 
Film Production Workflow
Film Production WorkflowFilm Production Workflow
Film Production WorkflowJohn Grace
 
Introduction to drama, theater, and culture
Introduction to drama, theater, and cultureIntroduction to drama, theater, and culture
Introduction to drama, theater, and cultureanalizaamurao
 

Viewers also liked (13)

Theater stage types
Theater stage typesTheater stage types
Theater stage types
 
Types of stages and drapes - Theatre 1
Types of stages and drapes - Theatre 1Types of stages and drapes - Theatre 1
Types of stages and drapes - Theatre 1
 
How to design a film set
How  to design  a film setHow  to design  a film set
How to design a film set
 
Mock presentation2
Mock presentation2Mock presentation2
Mock presentation2
 
Types of stages
Types of stagesTypes of stages
Types of stages
 
Tech i set design
Tech i set designTech i set design
Tech i set design
 
Stage Design
Stage DesignStage Design
Stage Design
 
Set design Ashley Sanchez
Set design Ashley SanchezSet design Ashley Sanchez
Set design Ashley Sanchez
 
Scenic Design Chapter 15
Scenic Design Chapter 15Scenic Design Chapter 15
Scenic Design Chapter 15
 
Theatre stages and_terms
Theatre stages and_termsTheatre stages and_terms
Theatre stages and_terms
 
Film Production Workflow
Film Production WorkflowFilm Production Workflow
Film Production Workflow
 
Introduction to drama, theater, and culture
Introduction to drama, theater, and cultureIntroduction to drama, theater, and culture
Introduction to drama, theater, and culture
 
Theatre
TheatreTheatre
Theatre
 

Similar to PhysicsSIG2008-01-Seneviratne

Lecture: Monte Carlo Methods
Lecture: Monte Carlo MethodsLecture: Monte Carlo Methods
Lecture: Monte Carlo MethodsFrank Kienle
 
Monte Carlo Methods
Monte Carlo MethodsMonte Carlo Methods
Monte Carlo MethodsJames Bell
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docxbissacr
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docxmccormicknadine86
 
Probability distribution
Probability distributionProbability distribution
Probability distributionRanjan Kumar
 
pptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacespptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacesbutest
 
pptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacespptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacesbutest
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)IJERD Editor
 
ORMR_Monte Carlo Method.pdf
ORMR_Monte Carlo Method.pdfORMR_Monte Carlo Method.pdf
ORMR_Monte Carlo Method.pdfSanjayBalu7
 
B02110105012
B02110105012B02110105012
B02110105012theijes
 
The International Journal of Engineering and Science (The IJES)
 The International Journal of Engineering and Science (The IJES) The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
Applied machine learning for search engine relevance 3
Applied machine learning for search engine relevance 3Applied machine learning for search engine relevance 3
Applied machine learning for search engine relevance 3Charles Martin
 
Randomized Algorithm- Advanced Algorithm
Randomized Algorithm- Advanced AlgorithmRandomized Algorithm- Advanced Algorithm
Randomized Algorithm- Advanced AlgorithmMahbubur Rahman
 
Newton raphson method
Newton raphson methodNewton raphson method
Newton raphson methodBijay Mishra
 
Probability Formula sheet
Probability Formula sheetProbability Formula sheet
Probability Formula sheetHaris Hassan
 
Chapter-4 combined.pptx
Chapter-4 combined.pptxChapter-4 combined.pptx
Chapter-4 combined.pptxHamzaHaji6
 

Similar to PhysicsSIG2008-01-Seneviratne (20)

Lecture: Monte Carlo Methods
Lecture: Monte Carlo MethodsLecture: Monte Carlo Methods
Lecture: Monte Carlo Methods
 
Monte Carlo Methods
Monte Carlo MethodsMonte Carlo Methods
Monte Carlo Methods
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docx
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docx
 
Probability distribution
Probability distributionProbability distribution
Probability distribution
 
pptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacespptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspaces
 
pptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspacespptx - Psuedo Random Generator for Halfspaces
pptx - Psuedo Random Generator for Halfspaces
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)
 
ORMR_Monte Carlo Method.pdf
ORMR_Monte Carlo Method.pdfORMR_Monte Carlo Method.pdf
ORMR_Monte Carlo Method.pdf
 
Optimization tutorial
Optimization tutorialOptimization tutorial
Optimization tutorial
 
B02110105012
B02110105012B02110105012
B02110105012
 
The International Journal of Engineering and Science (The IJES)
 The International Journal of Engineering and Science (The IJES) The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
Applied machine learning for search engine relevance 3
Applied machine learning for search engine relevance 3Applied machine learning for search engine relevance 3
Applied machine learning for search engine relevance 3
 
Randomized Algorithm- Advanced Algorithm
Randomized Algorithm- Advanced AlgorithmRandomized Algorithm- Advanced Algorithm
Randomized Algorithm- Advanced Algorithm
 
Newton raphson method
Newton raphson methodNewton raphson method
Newton raphson method
 
PTSP PPT.pdf
PTSP PPT.pdfPTSP PPT.pdf
PTSP PPT.pdf
 
Montecarlophd
MontecarlophdMontecarlophd
Montecarlophd
 
Probability Formula sheet
Probability Formula sheetProbability Formula sheet
Probability Formula sheet
 
Week08.pdf
Week08.pdfWeek08.pdf
Week08.pdf
 
Chapter-4 combined.pptx
Chapter-4 combined.pptxChapter-4 combined.pptx
Chapter-4 combined.pptx
 

PhysicsSIG2008-01-Seneviratne

  • 1. Introduction to MonteIntroduction to Monte Carlo MethodsCarlo Methods SarathSarath SeneviratneSeneviratne ((((((((B.ScB.Sc (Hons),(Hons), M.ScM.Sc,, PG.Dip.CompPG.Dip.Comp,, M.Med.PhyM.Med.Phy)) Contact email: Sarath@alumi.rpi.edu
  • 2. Monte Carlo MethodMonte Carlo Method Monte Carlo Method: any technique of statisticalMonte Carlo Method: any technique of statistical sampling employed to approximate solutions tosampling employed to approximate solutions to quantitative problemsquantitative problems StanislawStanislaw UlamUlam (Polish(Polish--American mathematician) whoAmerican mathematician) who worked with John Neumann and Nicholas Metropolisworked with John Neumann and Nicholas Metropolis on theon the ““Manhattan ProjectManhattan Project”” had been recognised ashad been recognised as the first scientist used the computers to transformthe first scientist used the computers to transform nonnon--random problems into random forms that wouldrandom problems into random forms that would facilitate their solution via statistical sampling.facilitate their solution via statistical sampling. Nicholas Metropolis and StanislawNicholas Metropolis and Stanislaw UlamUlam publishedpublished first paper on the Monte Carlo method in 1949first paper on the Monte Carlo method in 1949
  • 3. MC Simulation plays a major role in variousMC Simulation plays a major role in various phases of experimental physics projectsphases of experimental physics projects •• design of the experimental setdesign of the experimental set--upup •• evaluation and definition of the potential physicsevaluation and definition of the potential physics output of the projectoutput of the project •• evaluation of potential risks to the projectevaluation of potential risks to the project •• assessment of the performance of the experimentassessment of the performance of the experiment •• development, test and optimization ofdevelopment, test and optimization of reconstruction and physics analysis and datareconstruction and physics analysis and data acquisition softwareacquisition software •• contribution to the calculation and validation ofcontribution to the calculation and validation of physics resultsphysics results •• Estimation of nonEstimation of non--measurable eventsmeasurable events
  • 4. IntroductionIntroduction Deterministic ModelDeterministic Model Produce the same output for a give stating conditions For example, return on an investment y m r PF     += 1 P – Initial investment r – annual interest rate m – compounding period y – number of years
  • 6. Random ProcessRandom Process Described in probabilistic termsDescribed in probabilistic terms All the pAll the properties are given as averagesroperties are given as averages •• Ensemble averaging:Ensemble averaging: Properties of the process are obtained by averaging over aProperties of the process are obtained by averaging over a collection orcollection or ‘‘ensembleensemble’’ of sample records using values atof sample records using values at corresponding timescorresponding times •• Time averaging:Time averaging: Properties are obtained by averaging over a singleProperties are obtained by averaging over a single record in timerecord in time Basic properties of a single random processBasic properties of a single random process mean, standard deviation, automean, standard deviation, auto--correlation, spectral densitycorrelation, spectral density Joint properties of two or more random processesJoint properties of two or more random processes correlation, covariance, cross spectral density, simple inputcorrelation, covariance, cross spectral density, simple input-- output relationsoutput relations Stationary random processStationary random process :: Ensemble averages do not vary with timeEnsemble averages do not vary with time
  • 7. Random ProcessRandom Process The probability density function describes the generalThe probability density function describes the general distribution of the magnitude of the random process, but itdistribution of the magnitude of the random process, but it gives no information on the time or frequency content of thegives no information on the time or frequency content of the processprocess x(t) Time, t f(x) x
  • 8. Probability Density FunctionProbability Density Function }{ dxxfbxaP b a x∫=<< )( xallforxf dxxf i i 0)( 1)( > =∫ +∞= −∞= x fx(x) P(a<x<b) Probability that x lies between the values a and b is the area under the graph of fX(x) defined by x=a and x=b x=a b x IfIf ffxx(x(x)) is a continuousis a continuous normalised probabilitynormalised probability density functiondensity function xσ fx(x)fx(x)fx(x)fx(x) MeanMode
  • 9. Cumulative distribution functionCumulative distribution function x=a fx(x) x The cumulative distribution function (c.d.f.) is the integral between -∞ and x of fX(x) F(a) dxxfxF x x xx ∫ −∞= = )()(
  • 10. Central Limit TheoremCentral Limit Theorem Sample averages from a random process with finite mean andSample averages from a random process with finite mean and variance always approximate a normal distributionvariance always approximate a normal distribution Another interpretation, the sum of independent samples fromAnother interpretation, the sum of independent samples from any distribution with finite mean and variance converges to theany distribution with finite mean and variance converges to the normal distribution as the sample size goes to infinity.normal distribution as the sample size goes to infinity. µ = 0 σ = 1 Gaussian µ = 4 σ = 2 Normal 2 2 2 )( 2 1 ),|( σ µ πσ σµ −− = x exf
  • 11. Mean and VarianceMean and Variance The expected value (mean) of discrete variablesThe expected value (mean) of discrete variables IfIf xx is a discrete random variable with possible valuesis a discrete random variable with possible values xx11,x,x22,x,x33,, ……....xxNN ∑= = N i ix N x 1 1 If x is a continuous random variable with probability density function f(x), then the mean value is ∫ +∞= −∞= = i i dxxxfx )( Sample VarianceSample Variance ∑= − − ≡ N i ix xx N 1 22 )( 1 1 σ Sample VarianceSample Variance ( ) dxxfxx xx ∫ ∞ ∞− −= )( 22 σ
  • 12. Random SamplingRandom Sampling otherwise rforrp 0 101)( = <≤= 0 1 p(r) r p(x) x 0 Uniform distribution in sampling space Probability distribution in which samples are being drawn xallforxp 0)( ≥
  • 13. Random NumbersRandom Numbers Pseudorandom numbersPseudorandom numbers •• Generated by deterministic computer algorithmGenerated by deterministic computer algorithm •• Uniform distribution over a predefined rangeUniform distribution over a predefined range •• Can repeat a calculation with the same sequence of numbersCan repeat a calculation with the same sequence of numbers Easy to debug the simulationsEasy to debug the simulations •• Should produce a large number of unique numbers beforeShould produce a large number of unique numbers before repeating the cycle.repeating the cycle. •• For example, if we are studying the sensitivity of aFor example, if we are studying the sensitivity of a calculation to variation in a selected parameter, we cancalculation to variation in a selected parameter, we can reduce the variance of the difference between resultsreduce the variance of the difference between results calculated with two trail values of the parameter by usingcalculated with two trail values of the parameter by using the same random number sequencethe same random number sequence..
  • 14. Direct MethodDirect Method •• Use the probability distribution directlyUse the probability distribution directly •• For example:For example: Binominal random numbersBinominal random numbers Let assume a probabilityLet assume a probability pp of a head on any single toss of aof a head on any single toss of a coincoin If you generate N uniform random numbers on the intervalIf you generate N uniform random numbers on the interval (0,1), the counts the numbers less than p(0,1), the counts the numbers less than p The count is a binomial random numbers withThe count is a binomial random numbers with parametersparameters NN andand pp N=100 P=0.3 Random Numbers from Probability DistributionsRandom Numbers from Probability Distributions
  • 15. Random Numbers from Probability DistributionsRandom Numbers from Probability Distributions Transformation Method (Inverse transform method)Transformation Method (Inverse transform method) Numbers drawn from a specific probability distributionNumbers drawn from a specific probability distribution P(xP(x)) •• Let us define uniform deviatesLet us define uniform deviates p(rp(r)) drawn from a standarddrawn from a standard probability density distribution that is uniform betweenprobability density distribution that is uniform between r=0r=0 andand r=1r=1 ∫−∞= = x x x dxxPxF fuctionondistributiComulative )()( This method is used to generate normal, exponential and other types of distributions where the inverse is well defined and easy to obtain otherwise rforrp 0 101)( = <≤=
  • 16. Transformation MethodTransformation Method 40 4 1 )( ≤<= xfor x xf X = F -1(r) = (2*r) 2 r 40 4 1 )( ≤<= xfor x xf )( ]1,0[ 2 1 )( functioneCummulativ forxxF =
  • 17. Transformation MethodTransformation Method Simulation of 10,000 events 40 4 1 )( ≤<= xfor x xf
  • 18. Random Numbers from Probability DistributionRandom Numbers from Probability Distribution AcceptanceAcceptance--Rejection MethodRejection Method •• Very easy to useVery easy to use •• Successively generates random values from theSuccessively generates random values from the uniform distributionuniform distribution •• Calculate the probability function to estimates theCalculate the probability function to estimates the random eventrandom event •• If the event falls within the probabilityIf the event falls within the probability distribution functiondistribution function –– ““HitHit””, otherwise, otherwise ““rejectreject”” •• In a complex Monte Carlo simulation only a smallIn a complex Monte Carlo simulation only a small fraction of the events may survivefraction of the events may survive..
  • 19. AcceptanceAcceptance--Rejection MethodRejection Method Aim is to generate a random number from aAim is to generate a random number from a continuous distribution with PDFcontinuous distribution with PDF f(xf(x),), xallandcsomeforxcgxf )()( ≤ 1. Choose a density function g(x) 2. Find a constant c such that 3. Generate uniform random number u between 0 & 1 4. Generate a random number v from g(x) 5. If c*u ≤ f(v)/g(v), accept and return v 6. Otherwise reject and go to step 3
  • 22. Error in Monte Carlo MethodError in Monte Carlo Method Variance in Monte Carlo Simulation can beVariance in Monte Carlo Simulation can be calculated for N number of measurements ascalculated for N number of measurements as               −= ∑∑ == 2 11 22 11 N i i N i i x N x N s Standard deviation 2 s=σ If the simulation divided into m sets of measurements each with n trials (measurements), then the error will be nn err σσ ≈ − = 1 This is the standard deviation in a single measurement. It will be several orders higher than the actual error
  • 23. Calculate valueCalculate value ofof π 1-1 1 -1 generatedeventsTotal circlewithinscore squareenclosingofArea circleaofArea R R *4 4 )2( 4 2 2 = × = × = π π π resplywithinfallandondistributiGaussianOn eventsNfor %95%,3.682 10001.01417.3Result 7 σσ ±± =±=
  • 24. Gaussian DistributionGaussian Distribution Almost any MC that simulates experimental measurements will requAlmost any MC that simulates experimental measurements will require theire the generation of deviates drawn from Gaussian distributiongeneration of deviates drawn from Gaussian distribution The normal distribution with mean 0 and the standard deviation 1The normal distribution with mean 0 and the standard deviation 1 dz z dzzP       −= 2 exp 2 1 )( 2 π To scale to different means µ and standard deviation δ by µσ += zx
  • 25. Gaussian DistributionGaussian Distribution • An interesting method for generating Gaussian deviates is based on the fact that if we reputedly calculate the men of group of numbers drawn randomly from any distribution, the distribution of those means tends to a Gaussian as the number of means increases • If we calculate many times the sums of N uniform deviates, drawn from the uniform distribution, we should expect the sums to fall into a truncated Gaussian distribution (bounded by 0 & N) with mean value N/2 • If we generate N values of r from a uniform distribution 12/,02/ 0 101)( 0 NwhereNrr otherwise rforrp N i iG ==−= = <≤= ∑= σµ When N=12, the variance and standard deviation becomes 1
  • 26. Gaussian DistributionGaussian Distribution Box and Muller (1958) suggested to generate two Gaussian deviateBox and Muller (1958) suggested to generate two Gaussian deviatess zz11 && zz22 fromfrom two uniform deviatestwo uniform deviates rr11 & r& r22 212 211 2sinln2 2cosln2 rrz rrz π π −= −= 100,000 deviates
  • 27. Poisson DistributionPoisson Distribution Does not have the convenient scaling properties of the GaussianDoes not have the convenient scaling properties of the Gaussian functionfunction To find an integerTo find an integer xx drawn from the Poisson distribution with meandrawn from the Poisson distribution with mean µµ,, •• Generate a random variable r from the uniform distribution asGenerate a random variable r from the uniform distribution as ∑= − = x x x e x r 0 ! µµ At larger values of µ, the Poisson distribution becomes indistinguishable from the Gaussian function Build the Poisson distribution table and look up from the table for the generated random number. For example: Geiger counter experiment with an assumed mean counting rate of 8.4 counts per 20s interval. Build a table for Poisson distribution with µ=8.4
  • 28. Exponential DistributionExponential Distribution To simulate unstable state such as particle decayTo simulate unstable state such as particle decay The probability density function for obtaining a count at timeThe probability density function for obtaining a count at time tt from anfrom an exponential distribution with mean lifeexponential distribution with mean life ττ τ τ τ t e tfortP − = <= 00):( Random sample ti from the distribution can be obtained from ii rt lnτ−=
  • 29. Composition + RejectionComposition + Rejection If the probability densityIf the probability density f(xf(x)) can be written ascan be written as ∫ ∑ =≥>≤≤ = = 1)(,0)(,0,1)(0 )()()( 1 dxxfxfxgwhere xgxfxf iiii n i iii α α fi(x) must be easily sampled & gi(x) should be not much smaller than unit to keep rejections low First choose an integer from 1 to n with probability proportional to αi of being i A sample x is drawn from the distribution function with frequency function fi(x) and the value is accepted or rejected by computing the value of gi(x) and comparing with a random number r if r > gi(x), reject x r ≤ gi(x), accept x So that the probability of accepting or rejection x is gi(x) For example, Compton scattering
  • 32. Compton Scattering Minimum photon energyMinimum photon energy –– Backward scatteringBackward scattering ε0 [ ] 22 2 02 1101 22112 2 0 0 2 2 0 /)(2/)1( )/(1)()/1ln( )()()()()( 1 sin 1 1 )( ]1,[ 2 αεεεα εαεεα εεαεαεε ε θε ε ε εφ εεε =−= == •+=•=      + −    +≅ ∈ + = f fwhere gffgf Ecm cm e e f1 and f2 are probability density functions [ε0,1] 1)(0]1,[ sin 1 1)( 0 2 2 ≤<⇒     + −= εε θ ε ε ε gallfor g Rejection function     −+ θ ε ε 2 sin 1 1)()( 2 1 0 >≥= ∑=i i forff εεεαε
  • 33. Sampling Photon EnergySampling Photon Energy f1 and f2 - normalised probability density functions [ε0,1] Random numbers (r) are drawn from uniform distribution [0 1] Step 1Step 1 •• IfIf rr << αα11/(/(αα11++αα22) select f) select f11, otherwise f, otherwise f22 Step 2Step 2 •• SampleSample εε from the distribution corresponding tofrom the distribution corresponding to f1 or f2 For f1 : ε = exp(-rα1 ) For f2 : ε2 = ε0 2 + (1 - ε0 2)r Step 3 • Use the value of ε in stp 2 to calculate sin2θ = (2-t) and then g(ε) Where t = (1-cosθ) = mec2(1-ε)/(E0ε) Step 4 • If r ≤ g(ε) accept ε, otherwise goto step 1
  • 34. Compton Scattering without atomic form factorCompton Scattering without atomic form factor )cos1( &, 1 sin 1 1 0 2 2 01 0 1 10 2 2 2 0 2 2 θ ε ε θε ε ε π ε σ −+ = = − − −       − −    += Ecm cm EE E E photonsscatteredincidentofEnergyEE masselectroncm radiuselectronclassiclr Z E cm r d d e e e e e e 0 1 E E =ε 1,000,000 Events keVEAu 50, 0 = KleinKlein--NishinaNishina DCSDCS ε σ d d
  • 35. Low Energy Compton Scattering in GEANT4 HubbelHubbel’’ss atomic form factor, SFatomic form factor, SF P(P(εε,q)=,q)=ФФ((εε) x) x SF(qSF(q)) Momentum transfer q = E sinMomentum transfer q = E sin22((θθ/2)/2) • Energy distribution of the scattered photon according to the Klein- Nishina formula, multiplied by scattering function F(q) (Hubbel’s atomic factor) from EPDL97 data library • The effect of scattering function becomes significant at low energies in suppressing forward scattering • Angular distribution of the scattered photon and the recoil electron also based on EPDL97