IOSR Journal of Applied Physics (IOSR-JAP)
e-ISSN: 2278-4861. Volume 4, Issue 4 (Sep. - Oct. 2013), PP 15-23
www.iosrjournals.org
The Monte Carlo Method of Random Sampling in Statistical
Physics
¹Akabuogu E. U., ²Chiemeka I. U. and ³Dike C. O.
Department of Physics, Abia State University, Uturu, Nigeria.
Abstract: The Monte Carlo technique of random sampling was reviewed in this work. It plays an important role in Statistical Mechanics as well as in scientific computation, especially when problems have a vast phase space. The purpose of this paper is to review a general method, suited to fast electronic computing machines, for calculating the properties of any system which may be considered as composed of interacting particles. Concepts such as phase transition, the Ising model, ergodicity, simple sampling, the Metropolis algorithm, quantum Monte Carlo and non-Boltzmann sampling were discussed. The applications of the Monte Carlo method in areas of study aside from Statistical Physics were also mentioned.
Keywords: Ising Model, Phase Transition, Metropolis Algorithm, Quantum Monte Carlo, Observables, Non-Boltzmann Sampling.
I. Introduction
In Statistical Mechanics, many real systems can be considered. These systems can be broadly divided into two, namely:
• Systems composed of particles that do not interact with each other and are therefore uncorrelated. These models are called ideal gases.
• Systems consisting of particles that interact with each other, such that the interparticle interactions cause correlations between the many particles. Such systems therefore exhibit phase transitions. A typical example of such a system is the Ising model.
For such many-particle systems, systematic exploration of all possible fluctuations is often a very complicated task, due to the huge number of microscopic states that must be considered and the cumbersome detail needed to characterize these states. However, a number of practical methods have been formulated for sampling the relevant fluctuations. For non-interacting systems, the fluctuations can be handled by the factorization and the Born-Oppenheimer approximations. For interacting systems, which involve phase transitions (e.g. the Ising model), the Monte Carlo Method (MCM) and the Molecular Dynamics (MD) method can be employed to estimate the fluctuations (Chandler, 1987).
The goal of statistical mechanics is to understand and interpret the measurable properties of macroscopic systems in terms of the properties of their constituent particles and the interactions between them. This is done by connecting thermodynamic functions to quantum-mechanical equations; the two central quantities are the Boltzmann factor and the partition function. The ability to make macroscopic predictions based on microscopic properties is the main advantage of statistical mechanics. For problems where the phase space dimension is very large (this is especially the case when the dimension of phase space depends on the number of degrees of freedom), the Monte Carlo method outperforms any other integration scheme. The difficulty lies in smartly choosing the random samples to minimize the numerical effort.
The Monte Carlo method is possibly the most important numerical approach in computational physics, used to study problems spanning virtually every scientific discipline: the physical sciences, engineering, computational biology, telecommunications, finance and banking, etc. The idea is seemingly simple: randomly sample a volume in d-dimensional space to obtain an estimate of an integral at the expense of a statistical error (Helmut, 2011).
In this work, the focus will be on the application of the Monte Carlo method to canonical systems with coupled degrees of freedom such as the Ising model. The theory of phase transitions will be quickly revisited, while the Ising model will be used as a representation of our interacting many-particle system. Simple sampling and Metropolis sampling will be treated as Monte Carlo methods. In addition, quantum extensions of the Monte Carlo method, such as quantum Monte Carlo and the path integral method, will be discussed. Non-Boltzmann sampling and the application of the Monte Carlo method in areas of study aside from statistical physics will also be outlined.
II. Theory of Phase Transition
The Monte Carlo method is widely employed in systems with inter-particle interactions, and phase transition is a clear manifestation of such interactions. Many processes in life exhibit phase transitions. A good example is
the solid-to-liquid and liquid-to-gas changes, and the reverse changes, due to the effect of temperature and/or pressure. A phase change of great importance, which is frequently used in statistical mechanics, is that of magnetic systems. Ferromagnets are a clear example of such magnetic systems. A ferromagnet is a material that exhibits spontaneous magnetization. In a ferromagnet, the magnetic ions in the material align to create a non-zero magnetization. Each ion exerts a force on the surrounding ions to align with it. An aligned state is a low-energy state, while increasing entropy at higher temperature destroys the alignment. When these individual interactions build up to a macroscopic scale, the alignments produce a net magnetism. If there is a sudden change in magnetism at a certain temperature, we say that there is a phase transition (Eva, 2010).
III. The Ising Model
Once the electron's spin was discovered, it was clear that magnetism should be due to a large number of electrons spinning in the same direction. A spinning charged particle has a magnetic moment associated with it. It was natural to ask how the electrons all know which direction to spin, because the electrons on one side of a magnet do not directly interact with the electrons on the other side; they can only influence their neighbours. The Ising model was designed to investigate whether a large fraction of the electrons could be made to spin in the same direction using only local forces (Wikipedia).
The idealized simple model of a ferromagnet, based upon the concept of interacting spins on a fixed lattice, is called the Ising model. It is named after the physicist Ernst Ising. The model consists of discrete variables S_i representing the magnetic dipole moments of atomic spins, each of which can be in one of two states (+1 or -1):
S_i = +1 represents "spin up" ↑,  S_i = -1 represents "spin down" ↓.
The spins are arranged in a graph, usually a lattice, allowing each spin to interact with its neighbours. The model allows the identification of phase transitions as a simplified model of reality. The one-dimensional model has no phase transition and was solved by Ising in 1925; the two-dimensional model was solved by Lars Onsager (Binder et al, 1992).
In the Ising model, we consider a system of N spins arranged on a lattice as illustrated in fig 1 below.
Fig 1: An Ising model: a lattice of spins, each with magnetic moment ±μ, in an external field H.
In the presence of a magnetic field H, the energy of the system in a particular state ν is
E_ν = −Hμ Σ_{i=1}^{N} s_i − J Σ_{⟨ij⟩} s_i s_j    (1)
where the first term is the external magnetic energy, the second term is the energy due to the interaction of the spins, and J is the coupling constant.
The spin at each lattice site i interacts with its four nearest-neighbour sites, such that the overall Hamiltonian for the system is
H = −J Σ_{⟨ij⟩} s_i s_j    (2)
where the sum over ⟨ij⟩ represents a sum over the nearest neighbours of all the lattice sites.
By setting up a system obeying this Hamiltonian, we can find the mean energy, ⟨E⟩, and the specific heat, C, per spin for the Ising ferromagnet from
⟨E⟩ = (1/(N Z)) Σ_i E_i e^{−βE_i}    (3)
and
C = σ_E² / (N k_B T²)    (4)
We can also estimate the mean magnetization, ⟨M⟩, from the following ensemble average
⟨M⟩ = (1/N) Σ_i ⟨s_i⟩    (5)
As also noted by Frenkel et al (1996), we are interested in the variations of these quantities with temperature, as this system will undergo a phase change and so discontinuities should appear in its macroscopic properties at a
particular value of T. It should be possible to compare the behaviour of ⟨E⟩, C, ⟨M⟩ and also the partition function Z(β) with the results in the literature to determine whether the simulation is giving reasonable results.
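To make the Hamiltonian concrete before discussing its behaviour, the sketch below evaluates the energy of a single spin configuration on a small square lattice with periodic boundary conditions. It is only an illustration: the function name ising_energy, the use of Python/NumPy, and the convention J = μ = 1 are our own choices, not part of the original text.

import numpy as np

def ising_energy(spins, J=1.0, H=0.0, mu=1.0):
    """Energy of one spin configuration: Eq. (1), whose interaction term is Eq. (2).
    spins is a 2-D array of +1/-1 values; periodic boundary conditions are assumed."""
    # Pair every site with its right and down neighbour so each bond is counted once.
    right = np.roll(spins, -1, axis=1)
    down = np.roll(spins, -1, axis=0)
    interaction = -J * np.sum(spins * right + spins * down)
    field = -H * mu * np.sum(spins)
    return interaction + field

# A 4x4 lattice of aligned spins has two bonds per site, so E = -2*J*16 = -32 at H = 0.
print(ising_energy(np.ones((4, 4), dtype=int)))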
IV. Phase Change Behavior of the Ising Model
The difference between the available thermal energy, k_B T, and the exchange interaction energy, J, defines the behaviour of the Ising model. At high temperature, T > T_c, when the thermal energy is much greater than the interaction energy, the interaction energy is drowned out by the randomizing effect of the heat bath, so the pattern of spins on the mesh is random and there is no overall magnetization. However, as the temperature is lowered, T < T_c, with J > 0, the correlations between the spins begin to take effect. Lower temperature allows clusters of like spins to form, the size of which defines the correlation length ε of the system. It may therefore be anticipated that, for low enough temperature, this stabilization will lead to a cooperative phenomenon called spontaneous magnetization: through interactions between nearest neighbours, a given magnetic moment can influence the alignment of spins that are separated from it by a macroscopic distance. These long-ranged correlations between spins are associated with a long-ranged order in which the lattice has a net magnetization even in the absence of a magnetic field. The magnetization
M = μ Σ_{i=1}^{N} s_i    (6)
in the absence of the external magnetic field H is called the spontaneous magnetization (Chandler, 1987).
Fig 2: Plot of the magnetization |M| against temperature T for an Ising model, showing the critical temperature T_c separating T < T_c from T > T_c.
As shown in Fig. 2, T_c (the Curie temperature or critical temperature) denotes the highest temperature at which there can be non-zero magnetization. We expect that below the critical point the system must decide which direction is favoured, until at T = 0 all spins are pointing either upwards or downwards. Both choices are energetically equivalent and there is no way to predict in advance which one the system will choose. The breaking of symmetry, such as having to choose the favoured direction for the spins, and the divergence of system parameters such as ε and C, are characteristics of the phase change (Binder et al, 1992).
Chandler (1987) opined that the condition in which all spins align in either the upward or downward direction (at T < T_c) and that in which the spins are randomly aligned (at T > T_c) can be modelled as a phase transition between the ferromagnetic and paramagnetic states of the magnetic system. The 1- and 2-dimensional Ising models are shown in Fig. 3.
Fig 3: The 1-dimensional Ising model (solved by Ising in 1925) and the 2-dimensional Ising model (solved by Onsager in 1944), each shown at low and high temperature.
As the temperature increases, the entropy increases but the net magnetization decreases. This alignment and misalignment of spins constitutes a phase change in an Ising model.
V. Variables describing the system
According to Laetitia (2004), consider a lattice of N² spins, each of which has two possible states, up and down. In order to avoid end effects, periodic boundary conditions are imposed. The equilibrium of the system can be described by the following quantities:
• Magnetization: M = (1/N²) ⟨S⟩    (7)
(with S = Σ_i s_i the total spin)
• Heat capacity: C = (1/N²) (1/(k_B T))² (⟨E²⟩ − ⟨E⟩²)    (8)
It is linked to the variance of the energy.
• Susceptibility: χ = (1/N²) (1/(k_B T)) (⟨S²⟩ − ⟨S⟩²)    (9)
It is linked to the variance of the magnetization.
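As an illustration of Eqs. (7)-(9), the sketch below estimates these quantities from Monte Carlo samples of the total energy E and total spin S of an N×N lattice, such as those produced by the simulations described later. The function name, the use of Python/NumPy, and the convention k_B = 1 are our own assumptions.

import numpy as np

def observables(E_samples, S_samples, N, T, kB=1.0):
    """Per-spin magnetization, heat capacity and susceptibility, Eqs. (7)-(9),
    estimated from samples of the total energy E and the total spin S = sum_i s_i."""
    E = np.asarray(E_samples, dtype=float)
    S = np.asarray(S_samples, dtype=float)
    beta = 1.0 / (kB * T)
    M = np.mean(S) / N**2                                     # Eq. (7)
    C = (beta**2 / N**2) * (np.mean(E**2) - np.mean(E)**2)    # Eq. (8): energy variance
    chi = (beta / N**2) * (np.mean(S**2) - np.mean(S)**2)     # Eq. (9): spin variance
    return M, C, chi

Near the critical temperature, C and χ computed this way should show the divergence anticipated in Section VI.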
VI. Expected behaviour of the system
The free energy satisfies
F = −k_B T ln Z = ⟨E⟩ − TS    (10)
where Z = Σ_j e^{−βE_j}.
The system approaches equilibrium by minimizing F (the free energy). At low temperature, the interaction between the spins is strong and the spins tend to align with one another. In this case, the magnetization reaches its maximal value |M| = 1 according to its formula; the magnetization exists even if there is no external magnetic field. At high temperature, the interaction is weak and the spins are randomly up or down, so the magnetization is close to the value |M| = 0. Several configurations are equally suitable; the system is metastable.
The magnetization disappears at a given temperature; there thus exists a phase transition. In zero external magnetic field, the critical temperature is the Curie temperature T_c = 2/ln(1 + √2) ≈ 2.269 (in units of J/k_B), obtained from Onsager's theory.
According to the theory of phase transitions, the second-order derivatives of the free energy with respect to β and T are discontinuous at the transition; since the susceptibility and the heat capacity are expressed through these derivatives, they should diverge at the critical temperature.
VII. The Monte Carlo Method
With the advent and wide availability of powerful computers, the methodology of computer simulation has become a ubiquitous tool in the study of many-body systems. The basic idea in these methods is that with a computer, one may follow the trajectory of a system involving 10² or even 10³ degrees of freedom. If the system is appropriately constructed (that is, if physically meaningful boundary conditions and interparticle interactions are employed), the trajectory will serve to simulate the behaviour of a real assembly of particles, and the statistical analysis of the trajectory will determine meaningful predictions for the properties of the assembly. The importance of these methods is that, in principle, they provide exact results for the Hamiltonian under investigation. These simulations therefore provide indispensable benchmarks for approximate treatments of non-trivial systems composed of interacting particles (Chandler, 1987).
There are two general classes of simulations. One is called the Molecular Dynamics (MD) method; here one considers a classical dynamical model for atoms and molecules, and the trajectory is formed by integrating Newton's equations of motion. The other class is called the Monte Carlo method (MCM). This procedure is more generally applicable than molecular dynamics in that it can be used to study quantum systems and lattice models as well as classical assemblies of molecules. As a representative of a non-trivial system of interacting particles, we choose the Ising magnet to show how the Monte Carlo method can be applied.
The term Monte Carlo method was coined in the 1940s by physicists S. Ulam, E. Fermi, J. von Neumann and N. Metropolis (amongst others) working on the nuclear weapons project at Los Alamos National Laboratory. Because random numbers (similar to processes occurring in a casino, such as the Monte Carlo casino in Monaco) are needed, this is believed to be the source of the name. For problems where the phase space dimension is very large (this is especially the case when the dimension of phase space depends on the number of degrees of freedom), the Monte Carlo method outperforms any other integration scheme. The difficulty lies in smartly choosing the random samples to minimize the numerical effort (Helmut, 2011). The Ising model can often be difficult to evaluate numerically if there are many states in the system. Consider an Ising model with L lattice sites. Let
• L = |Λ|: the total number of sites on the lattice;
• σ_j ∈ {−1, +1}: an individual spin site on the lattice, j = 1, ....., L;
• S ∈ {−1, +1}^L: a state of the system.
Since every spin can take the values ±1, there are 2^L different possible states. This motivates simulating the Ising model using the Monte Carlo method. As an example, for a small 2-dimensional Ising magnet with only 20×20 = 400 spins, the total number of configurations is 2^400. Yet Monte Carlo procedures routinely treat such systems successfully while sampling only 10^6 configurations. The reason is that Monte Carlo schemes are
devised in such a way that the trajectory probes primarily those states that are statistically most important. The vast majority of the 2^400 states are of such high energy that they have negligible weight in the Boltzmann
distribution (Wikipedia). The Monte Carlo technique works by taking a random sample of points from phase space and then using that data to find the overall behaviour of the system. This is analogous to the idea of an opinion poll. When trying to work out the opinion of the general public on a particular issue, holding a referendum on the subject is costly and time consuming. Instead, we ask a randomly sampled proportion of the population and use the average of their opinions as representative of the opinion of the whole country. This is much easier than asking everybody, but is not as reliable. We must ensure that the people asked are chosen truly at random (to avoid correlations between the opinions), and be aware of how accurate we can expect our results to be given that we are only asking a relatively small number of people. Issues such as these are all analogous to the implementation of the Monte Carlo technique (Frenkel, 1997). To carry out a Monte Carlo calculation on an Ising model, an N² array is initialized either at a low temperature with aligned spins (this is called a cold start) or at a high temperature with random values +1 or −1 (this is called a hot start). The Monte Carlo method consists of randomly choosing a spin to flip; the chosen spin is flipped if the flip is favourable for the energy. When the energy becomes stable, the system has reached thermalisation, and configurations are then generated by the Metropolis algorithm in order to determine the properties of interest, such as the heat capacity and the susceptibility.
In statistical physics, expectation values of quantities such as the energy, magnetisation, specific heat, etc. (generally called 'observables') are computed by performing a trace over the partition function Z. Within the canonical ensemble (where the temperature T is fixed), the expectation value or thermal average of an observable A is given by
⟨A⟩ = Σ_s P_eq(s) A(s)    (11)
where s denotes a state, A(s) is the value of A in the state s, and P_eq(s) is the equilibrium Boltzmann distribution:
⟨A⟩ = (1/Z) Σ_s A(s) e^{−βE(s)}    (12)
where the sum runs over all states s of the system and β = 1/(k_B T), with k_B the Boltzmann constant. Z = Σ_s exp(−βE(s)) is the partition function, which normalises the equilibrium Boltzmann distribution. It is good to note that the Boltzmann factor P_i ∝ e^{−βE_i} is not a probability until it has been normalised by the partition function, such that the true form of the equilibrium Boltzmann distribution P_eq(s) is
P_eq(s) = e^{−βE(s)} / Σ_s e^{−βE(s)}    (13)
One can show that the free energy, F, is given by
F = −k_B T ln Z    (14)
All thermodynamic quantities can be computed directly from the partition function and expressed as derivatives of the free energy. Because the partition function is closely related to the Boltzmann distribution, it follows that we can 'sample' observables with states generated according to the corresponding Boltzmann distribution. A simple Monte Carlo algorithm can be used to produce such estimates (Newman et al, 1999).
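For a system small enough that all states can be enumerated, the trace in Eqs. (11)-(14) can be performed exactly, which is useful for checking a Monte Carlo code. The sketch below does this for a short 1-D Ising chain with periodic boundaries; the chain length, the choice J = k_B = 1, and the function name are illustrative assumptions.

import itertools
import math

def exact_ising_chain(L=8, J=1.0, T=2.0):
    """Exact canonical averages for a 1-D Ising chain of L spins (periodic
    boundaries), obtained by summing over all 2**L states as in Eqs. (11)-(14)."""
    beta = 1.0 / T                       # k_B = 1
    Z, E_weighted = 0.0, 0.0
    for state in itertools.product((-1, 1), repeat=L):
        E = -J * sum(state[i] * state[(i + 1) % L] for i in range(L))
        w = math.exp(-beta * E)          # unnormalised Boltzmann factor
        Z += w
        E_weighted += E * w
    E_avg = E_weighted / Z               # Eq. (12) with A = E
    F = -T * math.log(Z)                 # Eq. (14) with k_B = 1
    return Z, E_avg, F

print(exact_ising_chain())               # reference values to compare a sampler against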
VIII. Simple Sampling
The most basic sampling technique consists of just randomly choosing points from anywhere within the configuration space. A large number of spin patterns are generated at random (for the whole mesh), and the average energy and magnetization are calculated from the data. However, this technique suffers from exactly the same problems as the quadrature approach, often sampling from unimportant areas of the phase space. The chance of a randomly created array of spins producing an all-up/all-down spin pattern is remote (~2^{−N}), while a high-temperature random spin array is much more likely. The most common way to avoid this problem is by using Metropolis importance sampling, which works by applying weights to the microstates.
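A sketch of the simple-sampling estimator just described is given below: spin patterns are drawn uniformly at random and each is weighted by its Boltzmann factor. The lattice size, temperature, and the choice J = k_B = 1 are illustrative assumptions; at low temperature the estimate is dominated by a handful of lucky samples, which is exactly the weakness that motivates importance sampling.

import numpy as np

def simple_sampling_energy(N=8, T=1.5, n_samples=100_000, J=1.0, seed=0):
    """Estimate the mean energy per spin by drawing spin patterns uniformly at
    random and weighting each with its Boltzmann factor (simple sampling)."""
    rng = np.random.default_rng(seed)
    beta = 1.0 / T
    energies, log_weights = [], []
    for _ in range(n_samples):
        s = rng.choice((-1, 1), size=(N, N))
        E = -J * np.sum(s * np.roll(s, -1, 0) + s * np.roll(s, -1, 1))
        energies.append(E)
        log_weights.append(-beta * E)                 # log of the Boltzmann factor
    w = np.exp(np.array(log_weights) - np.max(log_weights))   # shift to avoid overflow
    return np.sum(w * np.array(energies)) / (np.sum(w) * N**2)

print(simple_sampling_energy())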
IX. The Metropolis Sampling
The algorithm first chooses selection probabilities g(μ, ν), which represent the probability that state ν is selected by the algorithm out of all states, given that we are in state μ. It then uses acceptance probabilities A(μ, ν) so that detailed balance is satisfied. If the new state ν is accepted, then we move to that state and repeat, selecting a new state and deciding whether to accept it. If ν is not accepted, then we stay in μ. The process is repeated until some stopping criterion is met, which for the Ising model is often when the lattice becomes ferromagnetic, meaning all of the sites point in the same direction (Frenkel et al, 1996).
When implementing the algorithm, we must ensure that g(μ, ν) is selected such that ergodicity is met (by this we mean that the orbit of the representative point in phase space goes through all points on the relevant surface). In thermal equilibrium, a system's energy only fluctuates within a small range. This motivates the concept of single-spin-flip dynamics, which states that in each transition we change only one of the spin sites on the lattice. Furthermore, by using single-spin-flip dynamics we can get from one state to any other state by
flipping each site that differs between the two states, one at a time. The random choices of spins are performed with the aid of a 'pseudo-random number generator' (an algorithm that generates a long sequence of random numbers uniformly distributed in the interval between 0 and 1), commonly available on digital computers.
Having picked out a spin at random, we next consider the new configuration ν, generated from the old configuration μ by flipping the randomly identified spin. The change in configuration changes the energy of the system by the amount
ΔE_{μν} = E_ν − E_μ    (15)
Chandler (1987) opined that this energy difference governs the relative probability of configurations through the Boltzmann distribution, and we can build this probability into the Monte Carlo trajectory through a criterion for accepting and rejecting moves to new configurations. The maximum change between the energy of the present state, E_μ, and any possible new state's energy, E_ν, is 2J per bond between the spin we choose to flip and each of that spin's neighbours.
Specifically for the Ising model, implementing single-spin-flip dynamics, we can establish the following:
- Since there are L total sites on the lattice, and a single spin flip is the only way we transition to another state, there are a total of L new states ν accessible from our present state μ. Detailed balance tells us that the following equation must hold:
A(μ, ν) P_β(μ) = A(ν, μ) P_β(ν)    (16)
Such that
P(μ → ν)/P(ν → μ) = [g(μ, ν) A(μ, ν)] / [g(ν, μ) A(ν, μ)] = A(μ, ν)/A(ν, μ) = P_β(ν)/P_β(μ) = (e^{−βE_ν}/Z) / (e^{−βE_μ}/Z) = e^{−β(E_ν − E_μ)}    (17)
where P(μ → ν) is the transition probability to the next state ν, which depends only on the present state μ, and P_β(ν) is the Boltzmann probability of state ν. Thus we want to select the acceptance probability for our algorithm to satisfy
A(μ, ν)/A(ν, μ) = e^{−β(E_ν − E_μ)}    (18)
If E_ν > E_μ, then A(ν, μ) > A(μ, ν). By this reasoning, the acceptance probability is
A(μ, ν) = e^{−β(E_ν − E_μ)} if E_ν > E_μ;  A(μ, ν) = 1 if E_ν ≤ E_μ.
Conclusively, the basis for the algorithm is as follows:
- Pick a spin site using the selection probability g(μ, ν) and calculate the contribution to the energy involving that spin.
- Flip the value of the spin and calculate the new contribution.
- If the new energy is less, keep the flipped value; i.e. if E_ν ≤ E_μ, we accept the move.
- If the new energy is more, keep the flipped value only with probability e^{−β(E_ν − E_μ)}.
- Repeat: this procedure is repeated for millions of steps, thus forming a long trajectory through the configuration space.
This process is repeated until the stopping criterion is met, which for the Ising model is often when the lattice has become ferromagnetic (at low temperatures, T < T_c), meaning that all the sites point in the same direction (Wikipedia).
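The acceptance rule A(μ, ν) can be written as a very small function. The sketch below is an illustration only; the function name and the convention k_B = 1 are our own choices.

import math
import random

def metropolis_accept(E_old, E_new, T):
    """Metropolis criterion: always accept a move that lowers the energy,
    otherwise accept with probability exp(-(E_new - E_old)/T), taking k_B = 1."""
    dE = E_new - E_old
    return dE <= 0 or random.random() < math.exp(-dE / T)

# A move that raises the energy by 2J (J = 1) at T = 2.27 should be accepted
# roughly exp(-2/2.27), i.e. about 41% of the time.
accepted = sum(metropolis_accept(0.0, 2.0, 2.27) for _ in range(10_000))
print(accepted / 10_000)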
X. Practical Implementation of the Metropolis Algorithm
A pseudo-code Monte Carlo program to compute an observable A for the Ising model is the following:
1 algorithm ising_metropolis(T, steps)
2 initialize starting configuration μ
3 initialize A_sum = 0
4
5 for (counter = 1 ... steps) do
6 generate trial state ν
7 compute p(μ → ν, T)
8 x = rand(0, 1)
9 if (p > x) then
10 accept ν
11 fi
12
13 A_sum += A(current configuration)
14 done
15
16 return A_sum / steps
After initialization, in line 6 a proposed state is generated, e.g. by flipping a spin. The energy of the new state is computed, and hence the transition probability between states, p = T(μ → ν). A uniform random number x ∈ [0, 1] is generated. If the probability is larger than this number, the move is accepted. If the energy is lowered (ΔE < 0), the spin is always flipped; otherwise the spin is flipped with probability p. Once the new state is accepted, we measure a given observable and record its value in order to perform the thermal average at the given temperature. For steps → ∞ the average of the observable converges to the exact value, again with an error inversely proportional to the square root of the number of steps. In order to obtain a correct estimate of an observable, it is imperative to ensure that one is actually sampling an equilibrium state. Because, in general, the initial configuration of the simulation can be chosen at random (popular choices being a random or a polarized configuration), the system will have to evolve for several Monte Carlo steps before an equilibrium state at a given temperature is obtained. The time τ_eq until the system is in thermal equilibrium is called the 'equilibration time'; it depends directly on the system size and it increases with decreasing temperature. In general, it is measured in 'Monte Carlo sweeps' (MCS), i.e. 1 MCS = N spin updates. In practice, all measured observables should be monitored as a function of MCS to ensure that the system is in thermal equilibrium. Some observables, such as the energy, equilibrate faster than others (e.g. the magnetization), and thus the equilibration times of all measured observables need to be considered. It is expected that a plot of these variables, averaged over the trajectory, shows discontinuities at a particular temperature (Helmut, 2011).
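Putting the pieces together, the sketch below is a compact Python version of the pseudo-code above for a 2-D Ising lattice: it performs single-spin-flip Metropolis updates, discards an equilibration period measured in Monte Carlo sweeps, and then accumulates the energy and magnetization per spin. The lattice size, sweep counts, and the conventions J = k_B = 1 with periodic boundaries are illustrative assumptions, not prescriptions from the original text.

import numpy as np

def run_ising(N=16, T=2.0, sweeps=2000, equil_sweeps=500, J=1.0, seed=0):
    """Single-spin-flip Metropolis simulation of an N x N Ising lattice.
    Discards `equil_sweeps` Monte Carlo sweeps (1 MCS = N*N spin updates)
    before measuring; returns the mean energy and magnetization per spin."""
    rng = np.random.default_rng(seed)
    spins = rng.choice((-1, 1), size=(N, N))          # 'hot' start
    beta = 1.0 / T
    E_samples, M_samples = [], []
    for sweep in range(sweeps):
        for _ in range(N * N):                        # one Monte Carlo sweep
            i, j = rng.integers(N), rng.integers(N)
            nn = (spins[(i + 1) % N, j] + spins[(i - 1) % N, j]
                  + spins[i, (j + 1) % N] + spins[i, (j - 1) % N])
            dE = 2.0 * J * spins[i, j] * nn           # energy cost of flipping (i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1                     # accept the flip
        if sweep >= equil_sweeps:                     # measure only after equilibration
            E = -J * np.sum(spins * np.roll(spins, -1, 0) + spins * np.roll(spins, -1, 1))
            E_samples.append(E / N**2)
            M_samples.append(np.sum(spins) / N**2)
    return np.mean(E_samples), np.mean(M_samples)

print(run_ising(T=1.5))   # below T_c the magnetization per spin approaches +1 or -1

Monitoring E_samples and M_samples as a function of the sweep number, as recommended above, is the practical way to check that the chosen equilibration period is long enough.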
XI. The Non-Boltzmann Sampling
According to Mezei (1987), the success of Monte Carlo is based on its ability to restrict the sampling of configuration space to the extremely small fraction that contributes significantly to most properties of interest. However, for calculating free energy differences, adequate sampling of a much larger subspace is required. Also, for the related problem of potential-of-mean-force calculations, the structural parameter along which the potential of mean force is calculated has to be sampled uniformly. These goals can be achieved by performing the calculation using an appropriately modified potential energy function V or E_ν^0. Specifically, non-Boltzmann sampling is used in cases where one needs to:
• sample the part of phase space relevant for a particular property;
• complete a simulation when sampling of a specific region of configuration space is required that may not be sampled adequately in direct Boltzmann sampling (the technique calls for sampling with a modified potential);
• sample phase space more efficiently.
Suppose it is convenient to generate a Monte Carlo trajectory for a system with energy E_ν^0, but you are interested in the averages for a system with a different energetics,
E_ν = E_ν^0 + ΔE_ν    (19)
For example, suppose we wanted to analyse a generalized Ising model with couplings between certain non-nearest neighbours as well as nearest neighbours. The convenient trajectory might then correspond to that of the simple Ising magnet, and ΔE_ν would then be the sum of all the non-nearest-neighbour interactions. How might we proceed to evaluate the average of interest from the convenient trajectories?
Chandler (1987) suggested that the answer is obtained from a factorization of the Boltzmann factor:
e^{−βE_ν} = e^{−βE_ν^0} e^{−βΔE_ν}    (20)
With this simple property, we have
Q = Σ_ν e^{−βE_ν} = Q_0 [Σ_ν e^{−βE_ν^0} e^{−βΔE_ν} / Q_0]    (21)
= Q_0 ⟨e^{−βΔE_ν}⟩_0    (22)
where ⟨...⟩_0 indicates the canonical ensemble average taken with the energetics E_ν^0. A similar result due to the factorization of the Boltzmann factor is
⟨G⟩ = Q^{−1} Σ_ν G_ν e^{−βE_ν}    (23)
= (Q_0/Q) ⟨G_ν e^{−βΔE_ν}⟩_0    (24)
= ⟨G_ν e^{−βΔE_ν}⟩_0 / ⟨e^{−βΔE_ν}⟩_0    (25)
These formulas provide the basis for the Monte Carlo procedures called 'non-Boltzmann sampling' and 'umbrella sampling'. In particular, since a Metropolis Monte Carlo trajectory with energy E_ν^0 can be used to
compute the averages denoted by ⟨...⟩_0, we can use the factorization formulas to compute ⟨G⟩ and Q/Q_0 even though the sampling employing E_ν^0 is not consistent with the Boltzmann distribution exp(−βE_ν).
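In code, the reweighting of Eq. (25) amounts to a single weighted average over the reference trajectory. The sketch below assumes we already have arrays of the observable G and of the perturbation ΔE_ν recorded along a trajectory generated with E_ν^0; the function name and the log-weight shift are our own choices.

import numpy as np

def reweighted_average(G_samples, dE_samples, beta):
    """Non-Boltzmann estimate of <G> for the full energetics E = E0 + dE, using
    samples from the reference system E0: <G> = <G e^{-beta dE}>_0 / <e^{-beta dE}>_0."""
    G = np.asarray(G_samples, dtype=float)
    dE = np.asarray(dE_samples, dtype=float)
    log_w = -beta * dE
    w = np.exp(log_w - np.max(log_w))     # shift for numerical stability
    return np.sum(G * w) / np.sum(w)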
Non-Boltzmann sampling can also be useful in removing bottlenecks that create quasi-ergodic problems and in focusing attention on rare events. Consider the former first: suppose a large activation barrier separates one region of configuration space from another, and suppose this barrier can be identified and located at a configuration or set of configurations. We can then form a reference system in which the barrier has been removed; that is, we take
E_ν^0 = E_ν − V_ν    (26)
where V_ν is large in the region where E_ν has the barrier and is zero otherwise. Trajectories based upon the energy E_ν^0 will indeed waste time in the barrier region, but they will also pass from one side of the barrier to the other, thus providing a scheme for solving problems of quasi-ergodicity.
An example of the latter occurs when we are interested in a relatively rare event. For example, suppose we want to analyze the behaviour of the surrounding spins given that an n × n block of spins is perfectly aligned. While such blocks do form spontaneously, and their existence might catalyze something of interest, the natural occurrence of this perfectly aligned n × n block might be very infrequent during the course of the Monte Carlo trajectory. To obtain meaningful statistics for these rare events, without wasting time with irrelevant though accessible configurations, we have to perform a non-Boltzmann Monte Carlo trajectory with an energy like
E_ν^0 = E_ν + W_ν    (27)
where W_ν is zero for the interesting class of configurations and is large for all others. The potential W_ν is then called an umbrella potential. It biases the Monte Carlo trajectory to sample only the rare configurations of interest. Non-Boltzmann sampling performed in this way is called umbrella sampling (Chandler, 1987).
XII. Quantum Monte Carlo
In addition to the aforementioned Monte Carlo methods that treat classical problems, quantum extensions such as variational Monte Carlo, path integral Monte Carlo, etc. have been developed for quantum systems. In quantum Monte Carlo simulation, the goal is to avoid considering the full Hilbert space, and instead to randomly sample the most relevant degrees of freedom and extract the quantities of interest, such as the magnetization, by averaging over a stochastic process.
According to Chandler (1987), to pursue this goal we first need to establish a framework in which we can interpret quantum-mechanical observables in terms of classical probabilities. The process is called 'quantum-classical' mapping, and it allows us to reformulate quantum many-body problems in terms of models from classical statistical mechanics, albeit in higher dimensions. To achieve this quantum-classical mapping and carry out a quantum Monte Carlo calculation, the concept of 'discretized quantum paths' will be followed. In this representation, quantum mechanics is isomorphic with the Boltzmann-weighted sampling of fluctuations in a discretized classical system with many degrees of freedom. Such sampling can be done by Monte Carlo, and thus this mapping provides a basis for performing quantum Monte Carlo calculations.
Many Monte Carlo sampling schemes exist that can be used to generate numerical solutions to the Schrödinger equation. To illustrate quantum Monte Carlo simulation, we will focus on one, 'path integral Monte Carlo', which is popularly used to study quantum systems at non-zero temperatures.
As an example, one can look at a simple model: the two-state quantum system coupled to a Gaussian fluctuating field. It is a system that can be analyzed analytically, and comparison of the exact analytical treatment with the quantum Monte Carlo procedure serves as a useful illustration of the convergence of Monte Carlo sampling.
The quantal fluctuations of the model are to be sampled from the distribution
W(u_1, ....., u_P; ξ) ∝ exp[S(u_1, ....., u_P; ξ)]    (28)
where
S = −βξ²/(2σ) + (βμξ/P) Σ_{i=1}^{P} u_i    (29)
+ (1/2) ln(βΔ/P) Σ_{i=1}^{P} u_i u_{i+1}    (30)
Here u_i = ±1    (31)
specifies the state of the quantum system at the i-th point on the quantum path; there are P such points. The parameters μ and Δ correspond to the magnitude of the dipole and half the tunnel splitting of the two-state system, respectively. The electric field, ξ, is a continuous variable that fluctuates between −∞ and +∞, but due to the finiteness of σ, values of ξ close to √(σ/β) in magnitude are quite common.
The quantity S is often called the action. It is a function of the P points on the quantum path (u_1, ...., u_P). The discretized representation of the quantum path becomes exact in the continuum limit of an infinite number of points on the path, that is, P → ∞. In that limit, the path changes from a set of P variables, u_1 to u_P, to a function of a continuous variable. The action is then a quantity whose value depends upon a function, and such a quantity is called a functional.
Consequently, a computer program can be written (in the original treatment, in BASIC, to run on an IBM PC). Such a program samples the discretized weight function W(u_1, ....., u_P; ξ) by the following procedure:
- Begin with a configuration.
- One of the P + 1 variables (u_1 through u_P, and ξ) is identified at random, and the identified variable is changed. For instance, if the variable is u_i, the change corresponds to u_i → −u_i. If, on the other hand, the identified variable is the field, the change is ξ → ξ + Δξ, where the size of Δξ is taken at random from a continuous set of numbers generated with the aid of the pseudorandom-number generator.
- This change in one of the variables causes a change in the action, ΔS. For example, if u_i → −u_i, then ΔS = S(u_1, ...., −u_i, ....; ξ) − S(u_1, ...., u_i, ....; ξ).
- The Boltzmann factor associated with this change, exp(ΔS), is then compared with a random number x taken from the uniform distribution between 0 and 1.
- If exp(ΔS) > x, the move is accepted.
- If, however, exp(ΔS) < x, the move is rejected.
An accepted move means that the new configuration of the system is the changed configuration. A rejected
move means that the new configuration is unchanged from the previous one.
As with the Ising model, this standard Metropolis procedure for a Monte Carlo step is repeated over and over again, producing a trajectory that samples configuration space according to the statistical weight W(u_1, ....., u_P; ξ). Typical results are then obtained by averaging properties over these trajectories. A particular example of such a result is the 'solvation energy', corresponding to the average of the coupling between the electric field and the quantal dipole, that is
E_solv = (1/P) Σ_{i=1}^{P} μ ⟨ξ u_i⟩    (32)
XIII. Conclusion
The Monte Carlo method of random sampling was reviewed in this work. It works by taking a random sample of points from phase space and then using the data to find the overall behaviour of the system. To carry out this method on interacting systems, an array is initialized at a low or high temperature, and spins are chosen randomly to flip; the chosen spin is flipped if the move is favourable to the energy or is otherwise accepted by the Metropolis criterion. Expectation values of observables such as the energy, magnetization, specific heat, etc. are computed by performing a trace over the partition function Z. The Monte Carlo method applied to the Ising model, which describes the properties of magnetic materials, allows one to obtain the variation of the thermodynamic quantities. For such a model, when the Monte Carlo method is carried out, it is expected that:
• Without a magnetic field, the phase transition at the critical temperature is strongly marked. This transition separates T < T_c, where the magnetization is maximal and the spins are aligned, from T > T_c, where the magnetization is of order zero and the spins are randomly oriented.
• In the presence of a magnetic field, the phase transition is not so marked. At very high temperature, the field has no effect because of the thermal agitation. In a general way, the spins align with the magnetic field, but at T < T_c changes of direction happen only if the field is above a critical value.
The use of the Monte Carlo method stretches beyond statistical mechanics, as it is applied in engineering, telecommunications, biology, etc.
References
[1] Binder K & Heermann D W (1992), 'Monte Carlo Simulation in Statistical Physics', 2nd edition, Springer-Verlag, U.K.
[2] Chandler D (1987), 'Introduction to Modern Statistical Mechanics', Oxford University Press, New York.
[3] Eva M (2011), 'Phase Transition on the Ising Model', Vol. I (pp. 183-187), Terre Haute.
[4] Frenkel D & Smit B (1996), 'Understanding Molecular Simulation', Academic Press, U.K.
[5] Helmut G (2011), 'Introduction to Monte Carlo Methods', Department of Physics and Astronomy, Texas A&M University, USA.
[6] Ising Model; http://en.wikipedia.org/wiki.Ising-model.
[7] Laetitia G (2004), 'Monte Carlo Method Applied to the Ising Model', Department of Physics, Oxford University.
[8] Mezei M (1987), 'Adaptive Umbrella Sampling: Self-consistent Determination of the Non-Boltzmann Bias', Department of Chemistry, Hunter College, C.U.N.Y., New York.
[9] Newman M E J & Barkema G T (1999), 'Monte Carlo Methods in Statistical Physics', Oxford University Press.
Pests of cotton_Sucking_Pests_Dr.UPR.pdfPests of cotton_Sucking_Pests_Dr.UPR.pdf
Pests of cotton_Sucking_Pests_Dr.UPR.pdfPirithiRaju
ย 
Chromatin Structure | EUCHROMATIN | HETEROCHROMATIN
Chromatin Structure | EUCHROMATIN | HETEROCHROMATINChromatin Structure | EUCHROMATIN | HETEROCHROMATIN
Chromatin Structure | EUCHROMATIN | HETEROCHROMATINsankalpkumarsahoo174
ย 
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls Agency
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls AgencyHire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls Agency
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls AgencySheetal Arora
ย 
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxSOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxkessiyaTpeter
ย 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)PraveenaKalaiselvan1
ย 
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCR
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCRStunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCR
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCRDelhi Call girls
ย 
Formation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksFormation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksSรฉrgio Sacani
ย 
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...anilsa9823
ย 

Recently uploaded (20)

Disentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOSTDisentangling the origin of chemical differences using GHOST
Disentangling the origin of chemical differences using GHOST
ย 
Orientation, design and principles of polyhouse
Orientation, design and principles of polyhouseOrientation, design and principles of polyhouse
Orientation, design and principles of polyhouse
ย 
Botany 4th semester file By Sumit Kumar yadav.pdf
Botany 4th semester file By Sumit Kumar yadav.pdfBotany 4th semester file By Sumit Kumar yadav.pdf
Botany 4th semester file By Sumit Kumar yadav.pdf
ย 
Hubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroidsHubble Asteroid Hunter III. Physical properties of newly found asteroids
Hubble Asteroid Hunter III. Physical properties of newly found asteroids
ย 
Isotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on IoIsotopic evidence of long-lived volcanism on Io
Isotopic evidence of long-lived volcanism on Io
ย 
The Philosophy of Science
The Philosophy of ScienceThe Philosophy of Science
The Philosophy of Science
ย 
Zoology 4th semester series (krishna).pdf
Zoology 4th semester series (krishna).pdfZoology 4th semester series (krishna).pdf
Zoology 4th semester series (krishna).pdf
ย 
Botany 4th semester series (krishna).pdf
Botany 4th semester series (krishna).pdfBotany 4th semester series (krishna).pdf
Botany 4th semester series (krishna).pdf
ย 
GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)GBSN - Microbiology (Unit 1)
GBSN - Microbiology (Unit 1)
ย 
All-domain Anomaly Resolution Office U.S. Department of Defense (U) Case: โ€œEg...
All-domain Anomaly Resolution Office U.S. Department of Defense (U) Case: โ€œEg...All-domain Anomaly Resolution Office U.S. Department of Defense (U) Case: โ€œEg...
All-domain Anomaly Resolution Office U.S. Department of Defense (U) Case: โ€œEg...
ย 
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptxBroad bean, Lima Bean, Jack bean, Ullucus.pptx
Broad bean, Lima Bean, Jack bean, Ullucus.pptx
ย 
Pests of cotton_Sucking_Pests_Dr.UPR.pdf
Pests of cotton_Sucking_Pests_Dr.UPR.pdfPests of cotton_Sucking_Pests_Dr.UPR.pdf
Pests of cotton_Sucking_Pests_Dr.UPR.pdf
ย 
Chromatin Structure | EUCHROMATIN | HETEROCHROMATIN
Chromatin Structure | EUCHROMATIN | HETEROCHROMATINChromatin Structure | EUCHROMATIN | HETEROCHROMATIN
Chromatin Structure | EUCHROMATIN | HETEROCHROMATIN
ย 
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls Agency
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls AgencyHire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls Agency
Hire ๐Ÿ’• 9907093804 Hooghly Call Girls Service Call Girls Agency
ย 
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptxSOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
SOLUBLE PATTERN RECOGNITION RECEPTORS.pptx
ย 
Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)Recombinant DNA technology (Immunological screening)
Recombinant DNA technology (Immunological screening)
ย 
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCR
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCRStunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCR
Stunning โžฅ8448380779โ–ป Call Girls In Panchshil Enclave Delhi NCR
ย 
Formation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disksFormation of low mass protostars and their circumstellar disks
Formation of low mass protostars and their circumstellar disks
ย 
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...
Lucknow ๐Ÿ’‹ Russian Call Girls Lucknow Finest Escorts Service 8923113531 Availa...
ย 
CELL -Structural and Functional unit of life.pdf
CELL -Structural and Functional unit of life.pdfCELL -Structural and Functional unit of life.pdf
CELL -Structural and Functional unit of life.pdf
ย 

The idea is seemingly simple: randomly sample a volume in d-dimensional space to obtain an estimate of an integral at the expense of a statistical error (Helmut, 2011).
In this work, focus will be on the application of the Monte Carlo Method to canonical systems with coupled degrees of freedom such as the Ising model. The theory of phase transitions will be quickly revisited, while the Ising model will be used as a representation of our interacting many-particle system. Simple sampling and Metropolis sampling will be treated as Monte Carlo methods. In addition, quantum extensions of the Monte Carlo Method, such as quantum Monte Carlo and the path-integral method, will be discussed. Non-Boltzmann sampling and the application of the Monte Carlo Method in areas of study aside from statistical physics will also be outlined.
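To make the basic idea concrete, the following is a minimal sketch (not taken from the paper) of Monte Carlo integration: the volume of the d-dimensional unit ball is estimated by sampling the enclosing hypercube uniformly at random, and the statistical error decreases as the inverse square root of the number of samples.

import random, math

def mc_ball_volume(d, n_samples, seed=0):
    # Estimate the volume of the d-dimensional unit ball by uniform sampling
    # of the enclosing hypercube [-1, 1]^d: the fraction of points that falls
    # inside the ball, times the cube volume 2^d, estimates the integral.
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        point = [rng.uniform(-1.0, 1.0) for _ in range(d)]
        if sum(x * x for x in point) <= 1.0:
            hits += 1
    return (2.0 ** d) * hits / n_samples

d = 5
exact = math.pi ** (d / 2) / math.gamma(d / 2 + 1)   # exact volume, for comparison
print(mc_ball_volume(d, 100000), exact)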
II. Theory of Phase Transition
The Monte Carlo method is heavily employed in systems with inter-particle interactions, and phase transition is a clear manifestation of such interactions. Many processes in life exhibit phase transitions; good examples are the solid-to-liquid and liquid-to-gas changes, and their reverses, driven by temperature and/or pressure. A phase change of great importance, which is frequently used in statistical mechanics, is that of magnetic systems. Ferromagnets are a clear example of such magnetic systems. A ferromagnet is a material that exhibits spontaneous magnetization. In a ferromagnet, the magnetic ions in the material tend to align: each ion exerts a force on the surrounding ions to align with it. An aligned state is a low-energy state, while at higher temperature entropy drives substantial misalignment. When these individual interactions build up to a macroscopic scale, the alignments produce a net magnetism. If there is a sudden change in magnetism at a certain temperature, we say that there is a phase transition (Eva, 2010).

III. The Ising Model
Once the electron spin was discovered, it was clear that magnetism should be due to a large number of electrons spinning in the same direction: a spinning charged particle has a magnetic moment associated with it. It was natural to ask how the electrons all know which direction to spin, because the electrons on one side of a magnet do not interact directly with the electrons on the other side; they can only influence their neighbours. The Ising model was designed to investigate whether a large fraction of the electrons could be made to spin in the same direction using only local forces (Wikipedia).
The idealized simple model of a ferromagnet, based upon the concept of interacting spins on a fixed lattice, is called the Ising model. It is named after the physicist Ernst Ising. The model consists of discrete variables $s_i$ that represent the magnetic dipole moments of atomic spins and can take one of two values: $s_i = +1$ represents "spin up" and $s_i = -1$ represents "spin down". The spins are arranged in a graph, usually a lattice, allowing each spin to interact with its neighbours. The model allows the identification of phase transitions as a simplified model of reality. The one-dimensional model has no phase transition and was solved by Ising in 1925; the two-dimensional model was solved by Lars Onsager (Binder et al., 1992). In the Ising model, we consider a system of N spins arranged on a lattice as illustrated in Fig. 1.
Fig. 1: An Ising model: a lattice of spins, each carrying a magnetic moment $\pm\mu$, in an external magnetic field $H$.
In the presence of a magnetic field H, the energy of the system in a particular state $\nu$ is
$E_\nu = -H\mu\sum_{i=1}^{N} s_i - J\sum_{\langle ij\rangle} s_i s_j$   (1)
where the first term is the external magnetic energy, the second term is the energy due to the interaction of the spins, and $J$ is the coupling constant. The spin at each lattice site $i$ interacts with its four nearest-neighbour sites, so that the overall Hamiltonian for the system is
$H = -J\sum_{\langle ij\rangle} s_i s_j$   (2)
where the sum over $\langle ij\rangle$ represents a sum over the nearest neighbours of all the lattice sites.
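As a concrete illustration of Eq. (1), the following is a minimal sketch (an assumption of this review, not code from the paper) that evaluates the energy of a given spin configuration on a square lattice with periodic boundaries; J, H and mu are the coupling constant, external field and magnetic moment.

def ising_energy(spins, J=1.0, H=0.0, mu=1.0):
    # spins: N x N nested list of +1/-1 values, periodic boundary conditions.
    # Returns -H*mu*sum(s_i) - J*sum over nearest-neighbour pairs of s_i*s_j.
    n = len(spins)
    field_term = -H * mu * sum(sum(row) for row in spins)
    interaction = 0.0
    for i in range(n):
        for j in range(n):
            s = spins[i][j]
            # right and down neighbours only, so each pair <ij> is counted once
            interaction += s * spins[i][(j + 1) % n]
            interaction += s * spins[(i + 1) % n][j]
    return field_term - J * interaction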
By setting up a system obeying this Hamiltonian, we can find the mean energy $\langle E\rangle$ and the specific heat $C$ per spin for the Ising ferromagnet from
$\langle E\rangle = \frac{1}{NZ}\sum_i E_i\, e^{-\beta E_i}$   (3)
and
$C = \frac{\sigma_E^2}{N k_B T^2}$   (4)
We can also estimate the mean magnetization $\langle M\rangle$ from the ensemble average
$\langle M\rangle = \frac{1}{N}\left\langle \sum_i s_i\right\rangle$   (5)
As also noted by Frenkel et al. (1996), we are interested in the variations of these quantities with temperature, since the system undergoes a phase change and discontinuities should therefore appear in its macroscopic properties at a particular value of T.
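A sketch of how Eqs. (3) to (5) might be estimated in practice is given below, assuming one already has lists of total energies and magnetizations recorded along an equilibrium trajectory at temperature T (with k_B = 1); the Boltzmann weighting of Eq. (3) is then implicit in how the samples were generated, so simple averages and the energy variance suffice. These conventions are assumptions of this illustration, not prescriptions from the paper.

def thermal_averages(energies, magnetizations, T, N):
    # energies, magnetizations: total E and M recorded along an equilibrium
    # trajectory; T: temperature (k_B = 1); N: number of spins.
    n = len(energies)
    mean_E = sum(energies) / n
    mean_E2 = sum(e * e for e in energies) / n
    # |M| is averaged because in a finite system M itself averages to zero
    mean_absM = sum(abs(m) for m in magnetizations) / n
    specific_heat = (mean_E2 - mean_E ** 2) / (N * T ** 2)   # cf. Eq. (4)
    return mean_E / N, specific_heat, mean_absM / N          # per-spin values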
It should be possible to compare the behaviour of $\langle E\rangle$, $C$, $\langle M\rangle$ and also the partition function $Z(\beta)$ with results in the literature, to determine whether the simulation is giving reasonable results.

IV. Phase Change Behavior of the Ising Model
The difference between the available thermal energy, $k_B T$, and the interaction (exchange) energy $J$ defines the behaviour of the Ising model. At high temperature, $T > T_c$, when the thermal energy is much greater than the interaction energy, the interaction energy is drowned out by the randomizing effect of the heat bath, so the pattern of spins on the mesh is random and there is no overall magnetization. However, as the temperature is lowered, $T < T_c$, and for $J > 0$, the correlations between the spins begin to take effect. Lower temperature allows clusters of like spins to form, the size of which defines the correlation length $\varepsilon$ of the system. It may therefore be anticipated that, for low enough temperature, this stabilization leads to a cooperative phenomenon called spontaneous magnetization: through interactions between nearest neighbours, a given magnetic moment can influence the alignment of spins that are separated from it by a macroscopic distance. These long-ranged correlations between spins are associated with a long-ranged order in which the lattice has a net magnetization even in the absence of a magnetic field. The magnetization
$M = \mu\sum_{i=1}^{N} s_i$   (6)
in the absence of the external magnetic field H is called the spontaneous magnetization (Chandler, 1987).
Fig. 2: Plot of the magnetization |M| against temperature for an Ising model; |M| is non-zero for $T < T_c$ and vanishes for $T > T_c$.
As shown in Fig. 2, $T_c$ (the Curie temperature or critical temperature) denotes the highest temperature at which there can be a non-zero magnetization. Below the critical point the system must make a decision as to which direction is favoured, until at T = 0 all spins are pointing either upwards or downwards. Both choices are energetically equivalent and there is no way to guess in advance which one the system will choose. The breaking of symmetry, such as having to choose the favoured direction for the spins, and the divergence of system parameters such as $\varepsilon$ and $C$, are characteristics of the phase change (Binder et al., 1992). Chandler (1987) opined that the condition in which all spins align in either the upward or the downward direction (at $T < T_c$) and the condition in which the spins are randomly aligned (at $T > T_c$) can be modelled as a phase transition between the ferromagnetic and paramagnetic states of the magnetic system. The one- and two-dimensional Ising models are shown in Fig. 3.
Fig. 3: One- and two-dimensional Ising models at low and high temperature (the 1D model solved by Ising in 1925; the 2D model solved by Onsager in 1944).
As the temperature increases, the entropy increases but the net magnetization decreases. This alignment and misalignment of the spins constitutes the phase change in an Ising model.

V. Variables describing the system
According to Laetitia (2004), we consider a lattice of $N^2$ spins, each with two possible states, up and down. In order to avoid end effects, periodic boundary conditions are imposed. The equilibrium state of the system can be characterized by the following quantities:
• Magnetization: $M = \frac{1}{N^2}\langle S\rangle$   (7)
• Heat capacity: $C = \frac{1}{N^2}\,\frac{1}{k_B T^2}\left(\langle E^2\rangle - \langle E\rangle^2\right)$   (8)
It is linked to the variance of the energy.
• Susceptibility: $\chi = \frac{1}{N^2}\,\frac{1}{k_B T}\left(\langle S^2\rangle - \langle S\rangle^2\right)$   (9)
It is linked to the variance of the magnetization.

VI. Expected behaviour of the system
The free energy satisfies
$F = -k_B T\ln Z = \langle E\rangle - TS$   (10)
where $Z = \sum_j e^{-\beta E_j}$. The system approaches equilibrium by minimizing F (the free energy). At low temperature the interaction between the spins is strong and the spins tend to align with one another. In this case the magnetization reaches its maximal value |M| = 1; the magnetization exists even if there is no external magnetic field. At high temperature the interaction is weak, the spins are randomly up or down, and the magnetization is close to |M| = 0. Several configurations are equally suitable; the system is metastable. The magnetization disappears at a given temperature: there is thus a phase transition. In zero external magnetic field, the critical temperature is the Curie temperature, $T_c = 2/\ln(1+\sqrt{2}) \approx 2.269$ in units of $J/k_B$ (obtained from Onsager's theory). According to the theory of phase transitions, the second-order derivatives of the free energy with respect to $\beta$ and $T$ are discontinuous at the transition; since the susceptibility and the heat capacity are expressed in terms of these derivatives, they should diverge at the critical temperature.

VII. The Monte Carlo Method
With the advent and wide availability of powerful computers, the methodology of computer simulation has become a ubiquitous tool in the study of many-body systems. The basic idea of these methods is that, with a computer, one may follow the trajectory of a system involving $10^2$ or even $10^3$ degrees of freedom. If the system is appropriately constructed (that is, if physically meaningful boundary conditions and interparticle interactions are employed), the trajectory will serve to simulate the behaviour of a real assembly of particles, and the statistical analysis of the trajectory will yield meaningful predictions for the properties of the assembly. The importance of these methods is that, in principle, they provide exact results for the Hamiltonian under investigation. These simulations provide indispensable benchmarks for approximate treatments of non-trivial systems composed of interacting particles (Chandler, 1987). There are two general classes of simulation. One is the Molecular Dynamics (MD) method: here one considers a classical dynamical model for atoms and molecules, and the trajectory is formed by integrating Newton's equations of motion. The other class is the Monte Carlo method (MCM). This procedure is more generally applicable than molecular dynamics in that it can be used to study quantum systems and lattice models as well as classical assemblies of molecules. As a representative of a non-trivial system of interacting particles, we choose the Ising magnet to show how the Monte Carlo method can be applied. The term Monte Carlo method was coined in the 1940s by physicists S. Ulam, E. Fermi, J. von Neumann and N. Metropolis (among others) working on the nuclear weapon project at Los Alamos National Laboratory.
Because random numbers are needed (similar to the processes occurring in a casino, such as the Monte Carlo casino in Monaco), it is believed that this was the source of the name. For problems where the dimension of phase space is very large, which is especially the case when it depends on the number of degrees of freedom, the Monte Carlo method outperforms any other integration scheme; the difficulty lies in smartly choosing the random samples so as to minimize the numerical effort (Helmut, 2011). The Ising model can be difficult to evaluate numerically when there are many states in the system. Consider an Ising model with L lattice sites and let
• $L = |\Lambda|$: the total number of sites on the lattice;
• $\sigma_j \in \{-1, +1\}$: an individual spin site on the lattice, $j = 1, \ldots, L$;
• $S \in \{-1, +1\}^L$: a state of the system.
Since every spin can take two values, there are $2^L$ different possible states. This motivates simulating the Ising model with the Monte Carlo method. As an example, for a small two-dimensional Ising magnet with only 20 x 20 = 400 spins, the total number of configurations is $2^{400}$. Yet Monte Carlo procedures routinely treat such systems successfully while sampling only about $10^6$ configurations. The reason is that Monte Carlo schemes are devised in such a way that the trajectory probes primarily those states that are statistically most important.
The vast majority of the $2^{400}$ states are of such high energy that they have negligible weight in the Boltzmann distribution (Wikipedia). The Monte Carlo technique works by taking a random sample of points from phase space and then using those data to find the overall behaviour of the system. This is analogous to an opinion poll. When trying to learn the opinion of the general public on a particular issue, holding a referendum on the subject is costly and time consuming. Instead, we ask a randomly sampled proportion of the population and use the average of their opinions as representative of the opinion of the whole country. This is much easier than asking everybody, but it is not as reliable. We must ensure that the people asked are chosen truly at random (to avoid correlations between the opinions), and we must be aware of how accurate we can expect the results to be, given that we are only asking a relatively small number of people. Issues such as these are all analogous to those arising in the implementation of the Monte Carlo technique (Frenkel, 1997).
To carry out a Monte Carlo calculation on an Ising model, an $N^2$ array is initialized either at low temperature with aligned spins (a cold start) or at high temperature with random values +1 or -1 (a hot start). The Monte Carlo method then consists of choosing a spin at random and flipping it if the flip is favourable for the energy. When the energy becomes stable the system has reached thermalisation, and configurations generated by the Metropolis algorithm are then used to determine the properties of interest, such as the heat capacity and the susceptibility.
In statistical physics, expectation values of quantities such as the energy, magnetization, specific heat, etc., generally called 'observables', are computed by performing a trace over the partition function Z. Within the canonical ensemble (where the temperature T is fixed), the expectation value, or thermal average, of an observable A is given by
$\langle A\rangle = \sum_s P_{eq}(s)\, A(s)$   (11)
where s denotes a state, A(s) is the value of A in the state s, and $P_{eq}(s)$ is the equilibrium Boltzmann distribution, so that
$\langle A\rangle = \frac{1}{Z}\sum_s A(s)\, e^{-\beta E_s}$   (12)
where the sum runs over all states s of the system and $\beta = 1/k_B T$, with $k_B$ the Boltzmann constant. $Z = \sum_s \exp(-\beta E_s)$ is the partition function, which normalises the equilibrium Boltzmann distribution. It is worth noting that the Boltzmann factor $P_i \propto e^{-\beta E_i}$ is not a probability until it has been normalised by the partition function, so that the equilibrium Boltzmann distribution is
$P_{eq}(s) = \frac{e^{-\beta E_s}}{\sum_s e^{-\beta E_s}}$   (13)
One can show that the free energy F is given by
$F = -k_B T\ln Z$   (14)
All thermodynamic quantities can be computed directly from the partition function and expressed as derivatives of the free energy. Because the partition function is closely related to the Boltzmann distribution, it follows that we can 'sample' observables with states generated according to the corresponding Boltzmann distribution. A simple Monte Carlo algorithm can be used to produce such an estimate (Newman et al., 1999).
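Before turning to sampling, it may help to see Eqs. (11) to (14) evaluated exactly for a lattice small enough to enumerate. The sketch below (an illustration under the same assumptions as the earlier ones: square lattice, periodic boundaries, J = 1, H = 0, k_B = 1) performs the brute-force trace over all 2^L states of a 3 x 3 lattice, exactly the computation that becomes impossible for 20 x 20 spins.

import itertools, math

def exact_averages(n, T, J=1.0):
    # Exact <E> and Z for an n x n Ising lattice by enumerating all 2^(n*n) states.
    beta = 1.0 / T
    Z = 0.0
    E_sum = 0.0
    for bits in itertools.product((-1, 1), repeat=n * n):
        # energy of this configuration: each nearest-neighbour bond counted once
        E = 0.0
        for i in range(n):
            for j in range(n):
                s = bits[i * n + j]
                E -= J * s * bits[i * n + (j + 1) % n]
                E -= J * s * bits[((i + 1) % n) * n + j]
        w = math.exp(-beta * E)     # Boltzmann factor, numerator of Eq. (13)
        Z += w                      # partition function normalising Eq. (13)
        E_sum += E * w
    return E_sum / Z, Z             # thermal average <E> as in Eq. (12)

print(exact_averages(3, T=2.0))     # a 3 x 3 lattice has only 512 states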
VIII. Simple Sampling
The most basic sampling technique consists of just randomly choosing points from anywhere within the configuration space. A large number of spin patterns are generated at random (for the whole mesh), and the average energy and magnetization are calculated from the data. However, this technique suffers from exactly the same problems as the quadrature approach, often sampling from unimportant areas of the phase space. The chance of a randomly created array of spins producing an all-up or all-down spin pattern is remote (about $2^{-N}$), whereas a high-temperature random spin array is much more likely. The most common way to avoid this problem is to use Metropolis importance sampling, which works by applying weights to the microstates.
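A minimal sketch of simple sampling, under the same assumed conventions as above (square lattice, J = 1, H = 0, k_B = 1), is given below: configurations are drawn uniformly at random and the thermal average of Eq. (12) is estimated with explicit Boltzmann weights. For any lattice of realistic size almost every sampled state carries negligible weight, which is exactly the failure mode described above.

import random, math

def simple_sampling(n, T, n_samples, J=1.0, seed=0):
    # Naive estimate of <E> per spin: draw spin patterns uniformly at random
    # and weight each by its Boltzmann factor (a crude estimate of Eq. (12)).
    rng = random.Random(seed)
    beta = 1.0 / T
    num = 0.0
    den = 0.0
    for _ in range(n_samples):
        spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
        E = 0.0
        for i in range(n):
            for j in range(n):
                s = spins[i][j]
                E -= J * s * spins[i][(j + 1) % n]
                E -= J * s * spins[(i + 1) % n][j]
        w = math.exp(-beta * E)
        num += E * w
        den += w
    return num / den / (n * n)

# At low T the estimate is dominated by a handful of samples and is unreliable:
print(simple_sampling(10, T=1.0, n_samples=10000))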
IX. The Metropolis Sampling
The algorithm first chooses selection probabilities $g(\mu,\nu)$, which represent the probability that state $\nu$ is selected by the algorithm out of all states, given that we are in state $\mu$. It then uses acceptance probabilities $A(\mu,\nu)$ chosen so that detailed balance is satisfied. If the new state $\nu$ is accepted, we move to that state and repeat by selecting a new state and deciding whether to accept it; if $\nu$ is not accepted, we stay in $\mu$. The process is repeated until some stopping criterion is met, which for the Ising model is often when the lattice has become ferromagnetic, meaning all of the sites point in the same direction (Frenkel et al., 1996). In implementing the algorithm, we must ensure that $g(\mu,\nu)$ is selected such that ergodicity is met (by this we mean that the orbit of the representative point in phase space goes through all points on the surface). In thermal equilibrium a system's energy only fluctuates within a small range. This motivates the concept of single-spin-flip dynamics, in which each transition changes only one of the spin sites on the lattice. Furthermore, using single-spin-flip dynamics we can get from one state to any other state by flipping, one at a time, each site that differs between the two states. The random choices of the spins are performed with the aid of a 'pseudo-random number generator', an algorithm that generates a long sequence of random numbers uniformly distributed in the interval between 0 and 1, commonly available on digital computers. Having picked out a spin at random, we next consider the new configuration $\nu$, generated from the old configuration $\mu$ by flipping the randomly identified spin. The change in configuration changes the energy of the system by the amount
$\Delta E_{\mu\nu} = E_\nu - E_\mu$   (15)
Chandler (1987) opined that this energy difference governs the relative probability of configurations through the Boltzmann distribution, and we can build this probability into the Monte Carlo trajectory by a criterion for accepting and rejecting moves to new configurations. The maximum change between the energy of the present state $E_\mu$ and any possible new state's energy $E_\nu$ is 2J per bond between the spin we choose to flip and each of that spin's neighbours. Specifically for the Ising model, implementing single-spin-flip dynamics, we can establish the following. Since there are L total sites on the lattice and a single spin flip is the only way we transition to another state, there are a total of L new states $\nu$ reachable from the present state $\mu$. Detailed balance tells us that the following equation must hold:
$A(\mu,\nu)\,P_\beta(\mu) = A(\nu,\mu)\,P_\beta(\nu)$   (16)
such that
$\frac{P(\mu,\nu)}{P(\nu,\mu)} = \frac{g(\mu,\nu)A(\mu,\nu)}{g(\nu,\mu)A(\nu,\mu)} = \frac{A(\mu,\nu)}{A(\nu,\mu)} = \frac{P_\beta(\nu)}{P_\beta(\mu)} = \frac{e^{-\beta E_\nu}/Z}{e^{-\beta E_\mu}/Z} = e^{-\beta(E_\nu - E_\mu)}$   (17)
where $P_\beta(\nu) = e^{-\beta E_\nu}/Z$ is the equilibrium Boltzmann probability of state $\nu$, and the transition probability to the next state depends only on the present state $\mu$. Thus we want to select the acceptance probability for our algorithm to satisfy
$\frac{A(\mu,\nu)}{A(\nu,\mu)} = e^{-\beta(E_\nu - E_\mu)}$   (18)
If $E_\nu > E_\mu$, then $A(\nu,\mu) > A(\mu,\nu)$. By this reasoning, the acceptance rule is
$A(\mu,\nu) = e^{-\beta(E_\nu - E_\mu)}$ if $E_\nu > E_\mu$, and $A(\mu,\nu) = 1$ if $E_\nu \le E_\mu$.
Conclusively, the basis of the algorithm is as follows:
- Pick a spin site using the selection probability $g(\mu,\nu)$ and calculate the contribution to the energy involving that spin.
- Flip the value of the spin and calculate the new contribution.
- If the new energy is less, keep the flipped value, i.e. if $E_\nu \le E_\mu$ we accept the move.
- If the new energy is more, only keep the flipped value with probability $e^{-\beta(E_\nu - E_\mu)}$.
- Repeat: this procedure is repeated for millions of steps, forming a long trajectory through the configuration space.
The process is repeated until the stopping criterion is met, which for the Ising model is often when the lattice has become ferromagnetic (at low temperatures, $T < T_c$), meaning that all the sites point in the same direction (Wikipedia).
X. Practical Implementation of the Metropolis Algorithm
A pseudo-code Monte Carlo program to compute an observable rho for the Ising model is the following:
1  algorithm ising_metropolis(T, steps)
2    initialize starting configuration mu
3    initialize rho = 0
4
5    for (counter = 1 ... steps) do
6      generate trial state nu
7      compute p(mu -> nu, T)
8      x = rand(0,1)
9      if (p > x) then
10       accept nu
11     fi
12
13     rho += rho(nu)
14   done
15
16   return rho/steps
After initialization, in line 6 a proposed state is generated, e.g. by flipping a spin. The energy of the new state is computed and, from it, the transition probability between the states, $p = T(\mu \to \nu)$. A uniform random number $x \in [0,1]$ is generated; if the probability is larger than this number, the move is accepted. If the energy is lowered ($\Delta E \le 0$), the spin is always flipped; otherwise the spin is flipped with probability p. Once the new state is accepted, we measure the given observable and record its value in order to perform the thermal average at the given temperature. For steps $\to \infty$ the average of the observable converges to the exact value, again with an error inversely proportional to the square root of the number of steps.
In order to obtain a correct estimate of an observable, it is imperative to ensure that one is actually sampling an equilibrium state. Because, in general, the initial configuration of the simulation is chosen arbitrarily (popular choices being a random or a fully polarized configuration), the system has to evolve for several Monte Carlo steps before an equilibrium state at the given temperature is obtained. The time $\tau_{eq}$ until the system is in thermal equilibrium is called the equilibration time; it depends directly on the system size and it increases with decreasing temperature. In general it is measured in 'Monte Carlo sweeps' (MCS), where 1 MCS = N spin updates. In practice, all measured observables should be monitored as a function of MCS to ensure that the system is in thermal equilibrium. Some observables, such as the energy, equilibrate faster than others (e.g. the magnetization), and thus the equilibration times of all measured observables need to be considered. It is expected that a plot of these variables, averaged over the partition function, shows discontinuities at a particular temperature (Helmut, 2011).
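The pseudo-code above might be realized along the following lines. This is a sketch only, assuming a square lattice with J = 1, H = 0 and k_B = 1, a hot start, one Monte Carlo sweep defined as N = n^2 attempted single-spin flips, and an equilibration period that is simply discarded; none of these choices are prescribed by the paper.

import random, math

def metropolis_ising(n, T, sweeps, equil_sweeps, J=1.0, seed=0):
    # Single-spin-flip Metropolis sampling of an n x n Ising model.
    # Returns |M| per spin averaged over the sweeps after equilibration.
    rng = random.Random(seed)
    beta = 1.0 / T
    spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]  # hot start
    magnetizations = []
    for sweep in range(sweeps):
        for _ in range(n * n):                        # one sweep = N attempted flips
            i = rng.randrange(n)
            j = rng.randrange(n)
            neighbours = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                          + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
            dE = 2.0 * J * spins[i][j] * neighbours   # energy change if flipped
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                spins[i][j] = -spins[i][j]            # accept the flip
        if sweep >= equil_sweeps:
            M = sum(sum(row) for row in spins)
            magnetizations.append(abs(M) / (n * n))
    return sum(magnetizations) / len(magnetizations)

# Below and above the critical temperature (about 2.269 in units of J/k_B):
print(metropolis_ising(16, T=1.5, sweeps=2000, equil_sweeps=500))
print(metropolis_ising(16, T=3.5, sweeps=2000, equil_sweeps=500))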
XI. The Non-Boltzmann Sampling
According to Mezei (1987), the success of Monte Carlo is based on its ability to restrict the sampling of the configuration space to the extremely small fraction that contributes significantly to most properties of interest. However, for calculating free energy differences, adequate sampling of a much larger subspace is required. Also, for the related problem of potential-of-mean-force calculations, the structural parameter along which the potential of mean force is calculated has to be sampled uniformly. These goals can be achieved by performing the calculation with an appropriately modified potential energy function V or $E_\nu^0$. Specifically, non-Boltzmann sampling is used in cases where one needs to
• sample the part of phase space relevant for a particular property;
• complete a simulation when sampling of a specific region of configuration space is required that may not be sampled adequately in a direct Boltzmann sampling (the technique calls for sampling with a modified potential);
• sample phase space more efficiently.
Suppose it is convenient to generate a Monte Carlo trajectory for a system with energy $E_\nu^0$, but we are interested in the averages for a system with a different energetics,
$E_\nu = E_\nu^0 + \Delta E_\nu$   (19)
For example, suppose we wanted to analyse a generalized Ising model with couplings between certain non-nearest neighbours as well as nearest neighbours. The convenient trajectory might then correspond to that of the simple Ising magnet, and $\Delta E_\nu$ would then be the sum of all the non-nearest-neighbour interactions. How might we proceed to evaluate the averages of interest from the convenient trajectory? Chandler (1987) suggested that the answer is obtained from a factorization of the Boltzmann factor:
$e^{-\beta E_\nu} = e^{-\beta E_\nu^0}\, e^{-\beta\Delta E_\nu}$   (20)
With this simple property, we have
$Q = \sum_\nu e^{-\beta E_\nu} = Q_0\,\frac{\sum_\nu e^{-\beta E_\nu^0}\, e^{-\beta\Delta E_\nu}}{Q_0}$   (21)
$\quad = Q_0\,\langle e^{-\beta\Delta E_\nu}\rangle_0$   (22)
where $\langle\cdots\rangle_0$ indicates the canonical ensemble average taken with the energetics $E_\nu^0$. A similar result due to the factorization of the Boltzmann factor is
$\langle G\rangle = Q^{-1}\sum_\nu G_\nu\, e^{-\beta E_\nu}$   (23)
$\quad = \frac{Q_0}{Q}\,\langle G_\nu\, e^{-\beta\Delta E_\nu}\rangle_0$   (24)
$\quad = \langle G_\nu\, e^{-\beta\Delta E_\nu}\rangle_0 \,/\, \langle e^{-\beta\Delta E_\nu}\rangle_0$   (25)
These formulas provide the basis for the Monte Carlo procedures called non-Boltzmann sampling and umbrella sampling. In particular, since a Metropolis Monte Carlo trajectory with energy $E_\nu^0$ can be used to compute the averages denoted by $\langle\cdots\rangle_0$, we can use the factorization formulas to compute $\langle G\rangle$ and $Q/Q_0$ even though the sampling employing $E_\nu^0$ is not consistent with the Boltzmann distribution $\exp(-\beta E_\nu)$.
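A sketch of how Eqs. (22) and (25) might be used in practice is given below, assuming one already has a trajectory generated with the reference energy E^0 (for instance by the Metropolis sketch above) and, for every sampled state, the perturbation dE and the observable G of interest; k_B = 1 is again assumed.

import math

def reweighted_average(G_values, dE_values, T):
    # Estimate <G> for the full energetics E = E0 + dE from a trajectory
    # sampled with E0 alone, using the factorization of the Boltzmann factor:
    #   <G>  = <G exp(-beta dE)>_0 / <exp(-beta dE)>_0     (Eq. 25)
    #   Q/Q0 = <exp(-beta dE)>_0                           (Eq. 22)
    beta = 1.0 / T
    weights = [math.exp(-beta * dE) for dE in dE_values]
    Q_ratio = sum(weights) / len(weights)
    G_avg = sum(g * w for g, w in zip(G_values, weights)) / sum(weights)
    return G_avg, Q_ratio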
Non-Boltzmann sampling can also be useful in removing bottlenecks that create quasi-ergodic problems and in focusing attention on rare events. Consider the former first. Suppose a large activation barrier separates one region of configuration space from another, and suppose this barrier can be identified and located at a configuration or set of configurations. We can then form a reference system in which the barrier has been removed; that is, we take
$E_\nu^0 = E_\nu - V_\nu$   (26)
where $V_\nu$ is large in the region where $E_\nu$ has the barrier and is zero otherwise. Trajectories based upon the energy $E_\nu^0$ will indeed waste time in the barrier region, but they will also pass from one side of the barrier to the other, thus providing a scheme for solving problems of quasi-ergodicity.
An example of the latter occurs when we are interested in a relatively rare event. For example, suppose we want to analyze the behaviour of the surrounding spins given that an n x n block is perfectly aligned. While such blocks do form spontaneously, and their existence might catalyze something of interest, the natural occurrence of this perfectly aligned n x n block might be very infrequent during the course of the Monte Carlo trajectory. To obtain meaningful statistics for these rare events, without wasting time on irrelevant though accessible configurations, we can perform a non-Boltzmann Monte Carlo trajectory with an energy like
$E_\nu^0 = E_\nu + W_\nu$   (27)
where $W_\nu$ is zero for the interesting class of configurations and large for all others. The energy $W_\nu$ is then called an umbrella potential. It biases the Monte Carlo trajectory to sample only the rare configurations of interest. Non-Boltzmann sampling performed in this way is called umbrella sampling (Chandler, 1987).
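One way the umbrella potential of Eq. (27) might look for the rare-event example above is sketched below; it assumes (purely for illustration) that the block of interest sits in the top-left corner of the lattice and that "large" means a fixed penalty constant. Running a trajectory with this biased energy confines the sampling to the rare configurations, and unbiased averages can be recovered with the reweighting function above, since $\Delta E_\nu = -W_\nu$ in the notation of Eq. (19).

def umbrella_energy(spins, block, J=1.0, penalty=1000.0):
    # Biased energy E0 = E + W of Eq. (27) for a square Ising lattice.
    # W is 0 when the top-left block x block spins are perfectly aligned and
    # a large penalty otherwise.
    n = len(spins)
    E = 0.0
    for i in range(n):
        for j in range(n):
            s = spins[i][j]
            E -= J * s * spins[i][(j + 1) % n]      # each bond counted once
            E -= J * s * spins[(i + 1) % n][j]
    block_spins = [spins[i][j] for i in range(block) for j in range(block)]
    aligned = all(s == block_spins[0] for s in block_spins)
    W = 0.0 if aligned else penalty
    return E + W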
XII. Quantum Monte Carlo
In addition to the aforementioned Monte Carlo methods, which treat classical problems, quantum extensions such as variational Monte Carlo, path-integral Monte Carlo, etc., have been developed for quantum systems. In a quantum Monte Carlo simulation, the goal is to avoid considering the full Hilbert space, and instead to randomly sample the most relevant degrees of freedom and extract the quantities of interest, such as the magnetization, by averaging over a stochastic process. According to Chandler (1987), to pursue this goal we first need to establish a framework in which we can interpret quantum-mechanical observables in terms of classical probabilities. This process is called 'quantum-classical' mapping, and it allows us to reformulate quantum many-body problems in terms of models from classical statistical mechanics, albeit in higher dimensions. To achieve this quantum-classical mapping and carry out a quantum Monte Carlo calculation, the concept of 'discretized quantum paths' is followed. In this representation, quantum mechanics is isomorphic with the Boltzmann-weighted sampling of fluctuations in a discretized classical system with many degrees of freedom. Such sampling can be done by Monte Carlo, and thus this mapping provides a basis for performing quantum Monte Carlo calculations.
Many Monte Carlo sampling schemes exist that can be used to generate numerical solutions to the Schrodinger equation. To illustrate quantum Monte Carlo simulation we will focus on one, path-integral Monte Carlo, which is popularly used to study quantum systems at non-zero temperatures. As an example, one can look at a simple model: the two-state quantum system coupled to a Gaussian fluctuating field. It is a system that can be analyzed analytically, and comparison of the exact analytical treatment with the quantum Monte Carlo procedure serves as a useful illustration of the convergence of Monte Carlo sampling. The quantal fluctuations of the model are to be sampled from the distribution
$W(u_1, \ldots, u_P; \xi) \propto \exp\left[\mathcal{S}(u_1, \ldots, u_P; \xi)\right]$   (28)
where
$\mathcal{S} = -\frac{\beta\xi^2}{2\sigma} + \frac{\beta\mu\xi}{P}\sum_{i=1}^{P} u_i$   (29)
$\qquad\; + \frac{1}{2}\ln\!\left(\frac{\beta\Delta}{P}\right)\sum_{i=1}^{P} u_i u_{i+1}$   (30)
Here
$u_i = \pm 1$   (31)
specifies the state of the quantum system at the i-th point on the quantum path; there are P such points, and the parameters $\mu$ and $\Delta$ correspond to the magnitude of the dipole and half the tunnel splitting of the two-state system, respectively. The electric field $\xi$ is a continuous variable that fluctuates between $-\infty$ and $+\infty$, but due to the finiteness of $\sigma$, values of $\xi$ of order $\sqrt{\sigma/\beta}$ in magnitude are the most common.
The quantity $\mathcal{S}$ is often called the action. It is a function of the P points on the quantum path $(u_1, \ldots, u_P)$. The discretized representation of the quantum path becomes exact in the continuum limit of an infinite number of points on the path, that is, $P \to \infty$. In that limit the path changes from a set of P variables, $u_1$ to $u_P$, to a function of a continuous variable. The action is then a quantity whose value depends upon a function, and such a quantity is called a functional. Consequently, a computer program can be written, for instance in BASIC to run on an IBM PC, that samples the discretized weight function $W(u_1, \ldots, u_P; \xi)$ by the following procedure:
- Begin with a configuration.
- One of the P+1 variables, $u_1$ through $u_P$ and $\xi$, is identified at random and the identified variable is changed. If the variable is one of the $u_i$, the change corresponds to $u_i \to -u_i$; if, on the other hand, the identified variable is the field, then $\xi \to \xi + \Delta\xi$, where the size of $\Delta\xi$ is taken at random from a continuous set of numbers generated with the aid of the pseudorandom-number generator.
- This change in one of the variables causes a change in the action, $\Delta\mathcal{S}$. For example, if $u_i \to -u_i$, then $\Delta\mathcal{S} = \mathcal{S}(u_1, \ldots, -u_i, \ldots; \xi) - \mathcal{S}(u_1, \ldots, u_i, \ldots; \xi)$.
- The Boltzmann factor associated with this change, $\exp(\Delta\mathcal{S})$, is then compared with a random number x taken from the uniform distribution between 0 and 1.
- If $\exp(\Delta\mathcal{S}) > x$, the move is accepted.
- If, however, $\exp(\Delta\mathcal{S}) < x$, the move is rejected.
An accepted move means that the new configuration of the system is the changed configuration; a rejected move means that the new configuration is unchanged from the previous one. As with the Ising model, this standard Metropolis procedure for a Monte Carlo step is repeated over and over again, producing a trajectory that samples configuration space according to the statistical weight $W(u_1, \ldots, u_P; \xi)$. Typical results are then obtained by averaging properties over these trajectories. A particular example of such a result is the 'solvation energy' corresponding to the average of the coupling between the electric field and the quantal dipole, that is,
$E_{solv} = \frac{1}{P}\sum_{i=1}^{P}\langle \mu\,\xi\, u_i\rangle$   (32)
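The BASIC program referred to above is not reproduced in the paper; the following is a stand-in sketch in Python that follows the stated procedure literally, taking the action exactly as written in Eqs. (29) to (31) and assuming, for illustration only, a cyclic path (u_{P+1} = u_1) and a maximum field step of 0.5.

import random, math

def sample_two_state(beta, sigma, mu, delta, P, steps, seed=0):
    # Metropolis sampling of the discretized path (u_1,...,u_P) and field xi
    # with weight proportional to exp(S). Returns the estimate of Eq. (32).
    rng = random.Random(seed)
    K = 0.5 * math.log(beta * delta / P)        # coupling along the path, Eq. (30)
    u = [rng.choice((-1, 1)) for _ in range(P)]
    xi = 0.0

    def action(u, xi):
        field = -beta * xi * xi / (2.0 * sigma) + (beta * mu * xi / P) * sum(u)
        bonds = K * sum(u[i] * u[(i + 1) % P] for i in range(P))
        return field + bonds

    S = action(u, xi)
    solv = 0.0
    for _ in range(steps):
        if rng.randrange(P + 1) < P:            # pick one of the P+1 variables
            i = rng.randrange(P)
            u_new, xi_new = u[:], xi
            u_new[i] = -u_new[i]                # flip one point on the path
        else:
            u_new, xi_new = u[:], xi + rng.uniform(-0.5, 0.5)   # shift the field
        S_new = action(u_new, xi_new)
        if math.exp(S_new - S) > rng.random():  # accept if exp(dS) > x
            u, xi, S = u_new, xi_new, S_new
        solv += (mu * xi / P) * sum(u)          # running sum for Eq. (32)
    return solv / steps

print(sample_two_state(beta=1.0, sigma=1.0, mu=1.0, delta=1.0, P=8, steps=50000))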
XIII. Conclusion
The Monte Carlo method of random sampling was reviewed in this work. It works by taking a random sample of points from phase space and then using the data to find the overall behaviour of the system. To carry out this method on interacting systems, an array is initialized at a low or a high temperature and spins are chosen at random to flip; the chosen spin is flipped if the move is favourable for the energy. Expectation values of observables such as the energy, magnetization, specific heat, etc., are computed by performing a trace over the partition function Z. The Monte Carlo method applied to the Ising model, which describes the properties of magnetic materials, allows the variations of the thermodynamic quantities to be obtained. For such a model, when the Monte Carlo method is carried out, it is expected that:
• Without a magnetic field, a phase transition at the critical temperature is strongly marked. This transition separates $T < T_c$, where the magnetization is maximal and the spins are aligned, from $T > T_c$, where the magnetization is of order zero and the spins are randomly oriented.
• In the presence of a magnetic field, the phase transition is not so marked. At very high temperature the field has no effect because of the thermal agitation. In a general way the spins align with the magnetic field, but at $T < T_c$ changes of direction happen only if the field is above a critical value.
The use of the Monte Carlo method stretches beyond statistical mechanics, as it is also used in engineering, telecommunication, biology, etc.

References
[1] Binder K & Heermann D W (1992), 'Monte Carlo Simulation in Statistical Physics', 2nd edition, Springer-Verlag, U.K.
[2] Chandler D (1987), 'Introduction to Modern Statistical Mechanics', Oxford University Press, New York.
[3] Eva M (2011), 'Phase Transition on the Ising Model', Vol. I, pp. 183-187, Terre Haute.
[4] Frenkel D & Smit B (1996), 'Understanding Molecular Simulation', Academic Press, UK.
[5] Helmut G (2011), 'Introduction to Monte Carlo Methods', Department of Physics and Astronomy, Texas A&M University, USA.
[6] Ising Model; http://en.wikipedia.org/wiki.Ising-model.
[7] Laetitia G (2004), 'Monte Carlo Method Applied to the Ising Model', Department of Physics, Oxford University.
[8] Mezei M (1987), 'Adaptive Umbrella Sampling: Self-Consistent Determination of the Non-Boltzmann Bias', Department of Chemistry, Hunter College, C.U.N.Y., New York.
[9] Newman M E J & Barkema G T (1999), 'Monte Carlo Methods in Statistical Physics', Oxford University Press.