Wang-Landau Monte Carlo simulation is a method for calculating the density-of-states function, which can then be used to calculate thermodynamic properties such as mean values of observables. It improves on traditional Monte Carlo methods, which struggle at low temperatures because complicated energy landscapes contain many local minima separated by large barriers. The Wang-Landau algorithm calculates the density-of-states function directly rather than relying on Boltzmann-weighted sampling of configurations, allowing it to overcome barriers and fully explore the configuration space even at low temperatures.
3. Outline
1. Statistical thermodynamics, ensembles
2. Numerical evaluation of integrals, crude Monte Carlo (MC)
3. MC methods for statistical thermodynamics, Markov chains
4. Variants of MC methods
8. Statistical thermodynamics
ensemble averages → Monte Carlo methods

\[
\langle B\rangle(A_1,\ldots,A_n) \;=\;
\frac{\int b(r_K, p_K)\,\rho(r_K, p_K; A_1,\ldots,A_n)\,\mathrm d\Gamma}
     {\int \rho(r_K, p_K; A_1,\ldots,A_n)\,\mathrm d\Gamma}
\]
9. Statistical thermodynamics
Statistical (Gibbs) ensembles
microstate (μ-state): positions and momenta
macrostate (M-state): macroscopic variables (A1, …, An)
1 M-state = many μ-states → Gibbs ensemble
probability distribution in the system’s phase space, \( \rho(r_K, p_K; A_1, \ldots, A_n) \)
10. Statistical thermodynamics
Examples of statistical (Gibbs) ensembles
micro-canonical: N, V, E
canonical: N, V, T
grand-canonical: μ, V, T
isobaric-isothermal: N, P, T
etc.
11. Statistical thermodynamics
Canonical Gibbs ensemble

\[
\rho(r_K, p_K; N, V, T) \;\propto\; \exp\!\left(-\frac{H_N(r_K, p_K; V)}{k_B T}\right)
\]

\[
H_N(r_K, p_K; V) \;=\; \sum_{K=1}^{N} \frac{p_K^2}{2M_K} \;+\; W(r_K; V)
\]
12. Statistical thermodynamics
Statistical thermodynamics as a mathematical problem

\[
\langle B\rangle(A_1,\ldots,A_n) \;=\;
\frac{\int b(r_K, p_K)\,\rho(r_K, p_K; A_1,\ldots,A_n)\,\mathrm d\Gamma}
     {\int \rho(r_K, p_K; A_1,\ldots,A_n)\,\mathrm d\Gamma}
\]

where the phase-space measure is

\[
\mathrm d\Gamma \;=\; \frac{1}{N!\,h^{3N}}\; \mathrm d^3 r_1 \cdots \mathrm d^3 r_N\; \mathrm d^3 p_1 \cdots \mathrm d^3 p_N
\]
13. Statistical thermodynamics
Statistical thermodynamics as a mathematical problem

\[
\langle B\rangle(A_1,\ldots,A_n) \;=\; B_{\mathrm{kin}} \;+\;
\frac{\int b_{\mathrm{con}}(r_K)\,\rho_{\mathrm{con}}(r_K; A_1,\ldots,A_n)\,\mathrm d r_1 \cdots \mathrm d r_N}
     {\int \rho_{\mathrm{con}}(r_K; A_1,\ldots,A_n)\,\mathrm d r_1 \cdots \mathrm d r_N}
\]

Canonical ensemble

\[
\rho_{\mathrm{con}}(r_K; N, V, T) \;\propto\; \exp\!\left(-\frac{W(r_K; V)}{k_B T}\right)
\]
14. Numerical evaluation of integrals
\[
\int_a^b f(x)\,\mathrm dx \;\approx\; \frac{b-a}{n}\sum_{j=1}^{n} f\!\left(a + j\,\frac{b-a}{n}\right) \;=\; \frac{b-a}{n}\sum_{j=1}^{n} f(x_j)
\]

1D – rectangular, trapezoidal, Simpson’s rule, …
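As a quick illustration of the quadrature formula above, here is a minimal right-endpoint rectangular rule in Python; the integrand and interval below are arbitrary examples, not part of the lecture:

```python
def rectangle_rule(f, a, b, n):
    """Approximate the 1D integral of f over [a, b] as
    (b-a)/n * sum of f(a + j*(b-a)/n), j = 1..n (right-endpoint rectangles)."""
    h = (b - a) / n
    return h * sum(f(a + j * h) for j in range(1, n + 1))

# Example: integral of x^2 over [0, 1] is exactly 1/3; the error shrinks as 1/n.
area = rectangle_rule(lambda x: x * x, 0.0, 1.0, 1_000)
```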
15. Numerical evaluation of integrals
\[
\int_a^b f(x)\,\mathrm dx \;\approx\; \frac{b-a}{n}\sum_{j=1}^{n} f\!\left(a + j\,\frac{b-a}{n}\right) \;=\; \frac{b-a}{n}\sum_{j=1}^{n} f(x_j)
\]

1D – rectangular, trapezoidal, Simpson’s rule, …

1D – crude Monte Carlo

\[
\int_a^b f(x)\,\mathrm dx \;\approx\; \frac{b-a}{n}\sum_{j=1}^{n} f(x_j)
\]

xj randomly distributed over [a, b]
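The crude Monte Carlo estimator fits in a few lines; the integrand, sample size, and seed below are illustrative choices:

```python
import random

def crude_mc(f, a, b, n, seed=1):
    """Crude Monte Carlo: (b-a)/n * sum f(x_j), with x_j uniform on [a, b]."""
    rng = random.Random(seed)
    return (b - a) * sum(f(a + (b - a) * rng.random()) for _ in range(n)) / n

# Example: integral of x^2 over [0, 1] is exactly 1/3.
# The statistical error decreases as 1/sqrt(n) regardless of dimension,
# which is why MC beats quadrature grids for high-dimensional integrals.
estimate = crude_mc(lambda x: x * x, 0.0, 1.0, 100_000)
```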
18. Numerical evaluation of integrals
Localized functions
\[
\int_a^b f(x)\,\rho(x)\,\mathrm dx
\]

Solution: Sampling from ρ(x).
19. Numerical evaluation of integrals
Localized functions
\[
\int_a^b f(x)\,\rho(x)\,\mathrm dx
\]

Solution: Sampling from ρ(x).
How? Markov chains.
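When ρ(x) can be drawn from directly, the integral reduces to a plain average of f over the samples. A minimal sketch, using an exponential density that Python's `random` module can sample exactly (the choice of ρ and f is illustrative); for Boltzmann-like densities known only up to normalization, no such direct sampler exists, which is exactly where Markov chains come in:

```python
import random

def average_over_rho(f, draw, n, seed=1):
    """Estimate the integral of f(x) * rho(x) dx as the mean of f
    over n points drawn from the density rho."""
    rng = random.Random(seed)
    return sum(f(draw(rng)) for _ in range(n)) / n

# rho(x) = e^{-x} on [0, inf) is directly sampleable (inverse transform);
# the exact value of the integral of x * e^{-x} over [0, inf) is 1.
mean_x = average_over_rho(lambda x: x, lambda rng: rng.expovariate(1.0), 200_000)
```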
20. Markov chains
Simplified example – finite number of microstates

microstates: s1, …, sn
probability distribution: π1 = π(s1), …, πn = π(sn)
equilibrium PD: \( \pi_1^{(e)}, \ldots, \pi_n^{(e)} \)
21. Markov chains
Simplified example – finite number of microstates
Markov chain: s(1) s(2) … s(N) …
s(K) is from {s1, …, sn}
transition probabilities: \( \pi_{ik} = \pi(s_i \to s_k) \)
(stochastic matrix)

\[
0 \le \pi_{ik} \le 1, \qquad \sum_{k=1}^{n} \pi_{ik} = 1
\]
22. Markov chains
Simplified example – finite number of microstates

PD function evolution:

\[
\pi_k^{(N+1)} \;=\; \sum_{i=1}^{n} \pi_i^{(N)}\,\pi_{ik},
\qquad \pi^{(N+1)} = \pi^{(N)}\,\Pi
\]

equilibrium PD function:

\[
\pi_k^{(e)} \;=\; \sum_{i=1}^{n} \pi_i^{(e)}\,\pi_{ik},
\qquad \pi^{(e)} = \pi^{(e)}\,\Pi
\]
23. Markov chains
Simplified example – finite number of microstates
Theorem:
Let s(1), s(2), …, s(N), … be an infinite Markov chain of microstates generated using a
stochastic matrix πik. Then the generated microstates are distributed over the state
space of the system according to the equilibrium PD function of that stochastic matrix.
Remarks:
The Markov chain does not represent any physical time evolution.
From the statistical thermodynamics point of view, only the equilibrium PD function
is physically relevant (not, e.g., the stochastic matrix itself).
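The theorem can be checked numerically on a small stochastic matrix (the matrix entries below are arbitrary examples): visit frequencies along a generated chain converge to the equilibrium PD obtained by iterating the evolution equation of the previous slide:

```python
import random

# An arbitrary 3-state stochastic matrix (each row sums to 1).
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]

def visit_frequencies(P, steps, seed=1):
    """Generate a Markov chain from P and record how often each state is visited."""
    rng = random.Random(seed)
    n, state = len(P), 0
    counts = [0] * n
    for _ in range(steps):
        state = rng.choices(range(n), weights=P[state])[0]
        counts[state] += 1
    return [c / steps for c in counts]

def equilibrium_pd(P, iters=500):
    """Equilibrium PD by iterating pi <- pi . P from a uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][k] for i in range(n)) for k in range(n)]
    return pi

freq = visit_frequencies(P, 200_000)
pi_e = equilibrium_pd(P)
# freq[k] approaches pi_e[k] as the chain grows, illustrating the theorem.
```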
24. Markov chains
Construction of an appropriate stochastic matrix
\[
\sum_{j=1}^{n} \pi_j^{(e)}\,\pi_{jk} \;=\; \pi_k^{(e)}, \qquad \sum_{k=1}^{n} \pi_{jk} \;=\; 1
\]
Mathematically
2n linear algebraic equations for n2 variables
many (infinite number of) solutions
25. Markov chains
Construction of an appropriate stochastic matrix
\[
\sum_{j=1}^{n} \pi_j^{(e)}\,\pi_{jk} \;=\; \pi_k^{(e)}, \qquad \sum_{k=1}^{n} \pi_{jk} \;=\; 1
\]
Mathematically
2n linear algebraic equations for n2 variables
many (infinite number of) solutions
Detailed balance
\[
\pi_k^{(e)}\,\pi_{kj} \;=\; \pi_j^{(e)}\,\pi_{jk}
\]
26. Markov chains
Decomposition of the stochastic matrix
\[
\pi_{ik} \;=\; p_{ik}\, a_{ik}
\]
pik … probability of proposing a particular transition
aik … probability of accepting the proposed transition
Metropolis solution [1]
\[
a_{ik} \;=\; \min\!\left(1,\; \frac{\pi_k^{(e)}\, p_{ki}}{\pi_i^{(e)}\, p_{ik}}\right)
\;\overset{p_{ki} = p_{ik}}{=}\; \min\!\left(1,\; \frac{\pi_k^{(e)}}{\pi_i^{(e)}}\right)
\]
[1] N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, E. Teller, J. Chem. Phys. 21 (1953) 1087.
27. Markov chains
Decomposition of the stochastic matrix
\[
\pi_{ik} \;=\; p_{ik}\, a_{ik}
\]
pik … probability of proposing a particular transition
aik … probability of accepting the proposed transition
Metropolis solution [1]
\[
a_{ik} \;=\; \min\!\left(1,\; \frac{\pi_k^{(e)}\, p_{ki}}{\pi_i^{(e)}\, p_{ik}}\right)
\;\overset{p_{ki} = p_{ik}}{=}\; \min\!\left(1,\; \frac{\pi_k^{(e)}}{\pi_i^{(e)}}\right)
\]
[1] N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, E. Teller, J. Chem. Phys. 21 (1953) 1087.
Notice that the PD function need not be normalized!
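The remark about normalization can be demonstrated directly: the discrete Metropolis sampler below draws states with probability proportional to arbitrary positive weights without ever computing their sum (the weight values are made-up numbers):

```python
import random

def metropolis_discrete(weights, steps, seed=1):
    """Metropolis sampling of states with probability proportional to
    unnormalized weights, using a symmetric uniform proposal."""
    rng = random.Random(seed)
    n = len(weights)
    state = 0
    counts = [0] * n
    for _ in range(steps):
        proposal = rng.randrange(n)  # p_ki = p_ik = 1/n (symmetric)
        # accept with min(1, w_k / w_i); the normalization cancels in the ratio
        if rng.random() < min(1.0, weights[proposal] / weights[state]):
            state = proposal
        counts[state] += 1       # rejected moves re-count the old state
    return [c / steps for c in counts]

w = [3.0, 1.0, 6.0]  # unnormalized; the normalized PD is [0.3, 0.1, 0.6]
freq = metropolis_discrete(w, 200_000)
```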
28. Metropolis algorithm
… for evaluating configuration integrals of statistical
thermodynamics (canonical ensemble)
\[
\langle B\rangle_{\mathrm{con}} \;=\;
\frac{\int b_{\mathrm{con}}(r_K)\, e^{-W(r_K)/k_B T}\,\mathrm dr_1 \cdots \mathrm dr_N}
     {\int e^{-W(r_K)/k_B T}\,\mathrm dr_1 \cdots \mathrm dr_N}
\]

and, consequently,

\[
\pi_i^{(e)} \;\propto\; e^{-W^{(i)}/k_B T},
\qquad
\frac{\pi_k^{(e)}}{\pi_i^{(e)}} \;=\; \frac{e^{-W^{(k)}/k_B T}}{e^{-W^{(i)}/k_B T}} \;=\; e^{-\left(W^{(k)} - W^{(i)}\right)/k_B T}
\]

with \( W^{(i)} = W(r^{(i)}) \).
29. Metropolis algorithm
from the configuration generated in the preceding step,
r(i), generate a new one, r(new), using an algorithm for which
the probability of old → new is the same as that of
new → old (usually isotropic translation and/or
rotation)
if W(new) ≤ W(i), then r(i+1) = r(new)
if W(new) > W(i), then
r(i+1) = r(new) with probability \( e^{-\left(W^{(\mathrm{new})} - W^{(i)}\right)/k_B T} \)
r(i+1) = r(i) with probability \( 1 - e^{-\left(W^{(\mathrm{new})} - W^{(i)}\right)/k_B T} \)
repeat until “infinity” (convergence)
31. Metropolis algorithm
Then

\[
\langle B\rangle_{\mathrm{con}} \;=\;
\frac{\int b_{\mathrm{con}}(r_K)\, e^{-W(r_K)/k_B T}\,\mathrm dr_1 \cdots \mathrm dr_N}
     {\int e^{-W(r_K)/k_B T}\,\mathrm dr_1 \cdots \mathrm dr_N}
\;=\; \lim_{n\to\infty} \frac{1}{n} \sum_{i=1}^{n} b_{\mathrm{con}}\!\left(r^{(i)}\right)
\]

Canonical (or NVT or isochoric-isothermal)
Monte Carlo
32. A general sketch of an MC simulation
propose an initial configuration
employ the Metropolis algorithm to generate as many
further configurations as possible
collect necessary data along the MC path to evaluate
ensemble averages
calculate the averages as simple arithmetic means of the
collected data
34. Literature
M. LEWERENZ, Monte Carlo Methods: Overview and Basics, in NIC Series Volume 10:
Quantum Simulations of Complex Many-Body Systems: From Theory to Algorithms, NIC Juelich 2002
(http://www.fz-juelich.de/nic-series/volume10/volume10.html)
M. P. ALLEN, D. J. TILDESLEY, Computer Simulation of Liquids, Oxford University Press,
Oxford 1987.
D. P. LANDAU, K. BINDER, A Guide to Monte Carlo Simulations in Statistical
Physics, Cambridge University Press, Cambridge 2000 and 2005.
35. Appendix
Simulated annealing [1] and basin hopping [2]
MC simulation at a high temperature
simulated annealing: T → 0 K → equilibrium structure for T = 0 K
basin hopping: MC steps combined with gradient optimization
[1] S. Kirkpatrick, C. D. Gelatt, M. P. Vecchi, Science 220 (1983) 671; [2] D. J. Wales, H. A. Scheraga, Science 285
(1999) 1368.
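A minimal simulated-annealing sketch: Metropolis MC while the temperature is driven toward 0 K. The rugged test landscape, cooling schedule, and all parameters are made-up illustrations, not from the cited papers:

```python
import math
import random

def simulated_annealing(W, x0, T0=5.0, cooling=0.999, steps=20_000, step=0.5, seed=1):
    """Metropolis MC with a geometric cooling schedule T -> 0 K;
    returns the lowest-energy configuration encountered along the way."""
    rng = random.Random(seed)
    x, w, T = x0, W(x0), T0
    best_x, best_w = x, w
    for _ in range(steps):
        x_new = x + rng.uniform(-step, step)
        w_new = W(x_new)
        if w_new <= w or rng.random() < math.exp(-(w_new - w) / T):
            x, w = x_new, w_new
            if w < best_w:
                best_x, best_w = x, w
        T *= cooling  # slowly lower the temperature toward T = 0 K
    return best_x, best_w

# A rugged 1D landscape: many local minima, global minimum W = 0 at x = 0.
W = lambda x: x * x + 2.0 * math.sin(5.0 * x) ** 2
x_best, w_best = simulated_annealing(W, 4.0)
```

With a slow enough schedule the walker escapes the high-lying local minima at high T and settles near the global minimum; too-fast cooling leaves it trapped, which is what basin hopping mitigates with explicit gradient optimization.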
38. Problem: low temperatures
Simulation at low temperatures – combination of
integration and optimization
Molecular clusters, biomolecules:
complicated energy landscapes
many local minima separated by large energy
barriers
39. Mean value of variable B

\[
\langle B\rangle(A_1,\ldots,A_n) \;=\; B_{\mathrm{kin}} \;+\;
\frac{\int b_{\mathrm{con}}(r_K)\,\rho_{\mathrm{con}}(r_K; A_1,\ldots,A_n)\,\mathrm d r_1 \cdots \mathrm d r_N}
     {\int \rho_{\mathrm{con}}(r_K; A_1,\ldots,A_n)\,\mathrm d r_1 \cdots \mathrm d r_N}
\]

If \( b_{\mathrm{con}}(r_K) = b\big(W(r_K)\big) \), then the mean value of B can be expressed as
a 1D integral

\[
\langle B\rangle(A_1,\ldots,A_n) \;=\; B_{\mathrm{kin}} \;+\;
\frac{\int_{W_{\min}}^{\infty} b_{\mathrm{con}}(W)\,\rho_{\mathrm{con}}(W; A_1,\ldots,A_n)\,\Omega(W)\,\mathrm dW}
     {\int_{W_{\min}}^{\infty} \rho_{\mathrm{con}}(W; A_1,\ldots,A_n)\,\Omega(W)\,\mathrm dW}
\]

where \( \Omega(W) \) is the classical density of states,

\[
\Omega(W) \;=\; \lim_{\Delta W \to 0} \frac{1}{\Delta W}\,
\frac{\int_{\{r_1,\ldots,r_N:\; W \le W(r_1,\ldots,r_N) \le W + \Delta W\}} \mathrm d r_1 \cdots \mathrm d r_N}
     {\int \mathrm d r_1 \cdots \mathrm d r_N}
\]

How to calculate \( \Omega(W) \)? The Wang-Landau algorithm.
40. Calculation of Ω Wang-Landau algorithm – basic idea
- A completely random walk in configuration space gives an energy
histogram h(W) ~ Ω(W), but low-energy configurations will (practically)
never be visited.
- Metropolis algorithm: configurations are sampled according to ρ(W(r)), so the
energy histogram is h(W) ~ Ω(W) × ρ(W(r)).
- If we use ρ(W(r)) = 1/Ω(W), then the energy histogram becomes a
constant function.
41. Wang-Landau algorithm – calculation of density of states
- Start of the simulation: Ω(W) is unknown; we start with a constant Ω(W) = 1
and a random initial configuration r.
- The next configuration r_new, created by a small random change of the preceding
configuration r_old, is accepted with probability

\[
p(r_{\mathrm{old}} \to r_{\mathrm{new}}) \;=\;
\min\!\left(1,\; \frac{\Omega\big(W(r_{\mathrm{old}})\big)}{\Omega\big(W(r_{\mathrm{new}})\big)}\right)
\]

- Each visit to energy Wi: update Ω(Wi) → Ω(Wi) × k0, where the modification
factor k0 > 1.
- e.g., k0 = 2; a large k gives large statistical errors, a small k gives a long simulation.
- After reaching a flat energy histogram h(W), the modification factor is
reduced, e.g. by the rule \( k_{j+1} = \sqrt{k_j} \); the energy histogram is reset (the density of
states is not reset!) and a new iteration is started.
- The computation is finished when k reaches a predetermined value (e.g.
1.000001).
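A minimal sketch of the algorithm on a toy system where Ω(W) is known exactly: N independent two-state spins with W = number of up spins, so Ω(W) = C(N, W). The 80 % flatness criterion and the square-root k-reduction rule are the standard choices quoted above; the system, sweep length, and all other parameters are illustrative assumptions:

```python
import math
import random

def wang_landau(N=10, flatness=0.8, ln_k_final=1e-4, seed=1):
    """Wang-Landau estimate of ln Omega(W) for N independent spins,
    where W = number of up spins (exact result: Omega(W) = C(N, W))."""
    rng = random.Random(seed)
    spins = [rng.randint(0, 1) for _ in range(N)]
    W = sum(spins)
    ln_g = [0.0] * (N + 1)      # store ln Omega(W) to avoid overflow
    hist = [0] * (N + 1)
    ln_k = math.log(2.0)        # initial modification factor k0 = 2
    while ln_k > ln_k_final:
        for _ in range(10_000):
            i = rng.randrange(N)               # small random change: flip one spin
            W_new = W + (1 - 2 * spins[i])
            # accept with probability min(1, Omega(W_old) / Omega(W_new))
            if rng.random() < math.exp(min(0.0, ln_g[W] - ln_g[W_new])):
                spins[i] = 1 - spins[i]
                W = W_new
            ln_g[W] += ln_k                    # Omega(W) -> Omega(W) * k
            hist[W] += 1
        if min(hist) > flatness * sum(hist) / len(hist):
            hist = [0] * (N + 1)               # reset the histogram ...
            ln_k *= 0.5                        # ... halve ln k, i.e. k -> sqrt(k)
    return ln_g

ln_g = wang_landau()
# ln Omega is determined only up to an additive constant; ratios are physical:
# Omega(5)/Omega(0) should approach C(10,5)/C(10,0) = 252.
```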
42. Wang-Landau algorithm – accuracy of results depends on:
- Final value of the modification factor kfinal
- Fineness of the energy discretization (bin width)
- Flatness criterion for the histogram
- Complexity of the simulated system
- Details of the implemented algorithm