3. Stochastic Process
• Probability models are more realistic than
deterministic models
• Observations are taken at different time points
rather than at a fixed time point
–This led to the new concept of indeterminism
(indeterminism in dynamic studies, or
dynamic indeterminism)
4. Stochastic Process
• Many phenomena (in the physical sciences, life
sciences, social sciences, engineering,
management, and so on) are studied not only
as random phenomena but also as ones
changing with time.
5. Stochastic Process
• The application of random variables which are
functions of time has been increasing.
• Definition: Families of random variables which
are functions of time are known as stochastic
processes.
6. Examples
• Example 1: A simple experiment like throwing a die.
– X_n: outcome of the nth throw, n ≥ 1
– {X_n, n ≥ 1} is a family of random variables
• Example 2:
– r cells and an infinitely large number of identical balls
– Balls are thrown randomly into the cells
– X_n is the number of occupied cells after the nth throw
– {X_n, n ≥ 1} constitutes a family of random variables
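Example 2 can be sketched as a short simulation (a minimal illustration; the choice of r = 5 cells, 20 throws, and a uniform cell choice per throw are assumptions for the sketch):

```python
import random

def occupied_cells(r, n_throws, seed=0):
    """Simulate the occupancy process: throw balls into r cells at random
    and record X_n, the number of occupied cells after the nth throw."""
    rng = random.Random(seed)
    occupied = set()
    history = []
    for _ in range(n_throws):
        occupied.add(rng.randrange(r))  # each throw lands in a uniformly chosen cell
        history.append(len(occupied))   # X_n after this throw
    return history

xs = occupied_cells(r=5, n_throws=20)
print(xs)  # a non-decreasing sequence bounded above by r
```

Each run of the simulation produces one realization of the family {X_n, n ≥ 1}.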
7. Examples
• Example 3:
– X(t) is the random variable which represents the
number of incoming calls in an interval (0, t) of
duration t units at a switchboard.
– {X(t), t ∈ T} constitutes a stochastic process
(T = [0, ∞))
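Example 3 can be simulated if we assume a model for the arrivals; call arrivals are commonly modeled as a Poisson process (an assumption not stated on the slide), in which gaps between successive calls are independent exponential variables:

```python
import random

def calls_by_time(t, rate=2.0, seed=0):
    """Count incoming calls in (0, t], assuming a Poisson arrival process:
    independent exponential waiting times between successive calls."""
    rng = random.Random(seed)
    clock, count = 0.0, 0
    while True:
        clock += rng.expovariate(rate)  # waiting time until the next call
        if clock > t:
            return count
        count += 1

print(calls_by_time(10.0))
```

Here time t varies continuously while the value of X(t) is a count, which previews the continuous-time, discrete-state type discussed below.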
8. Specification of stochastic process
• The set of possible values of a single random
variable X_n of a stochastic process {X_n, n ≥ 1}
is known as its state space.
• It may be discrete or continuous.
9. Specification of stochastic process
• Evidence 1: X_n is the total number of sixes in the
first n throws of a die. Values of X_n are 0, 1, …, n
– discrete time and discrete state space
• Evidence 2: X_n = z_1 + z_2 + ⋯ + z_n, where z_i is a
continuous r.v. assuming values in [0, ∞)
– discrete time and continuous state space
• Evidence 3: X(t) gives the number of incoming
calls in an interval (0, t)
– continuous time and discrete state space
10. Specification of stochastic process
• Evidence 4: X(t) represents the maximum
temperature at a particular place in [0, ∞)
– continuous time and continuous state space
Hence a stochastic process has four types:
I. Discrete time, discrete state space
II. Discrete time, continuous state space
III. Continuous time, discrete state space
IV. Continuous time, continuous state space
11. Specification of stochastic process
• Discrete parameter (time) family
– stochastic sequence
• Continuous parameter (time) family
– stochastic process
• Usual notation:
{X(t), t ∈ T} covers both cases of discrete and
continuous parameters
12. Nature of {X_n, n ≥ 1}
• In some cases the members of {X_n, n ≥ 1} are
– mutually independent
• But more often they are not, i.e., they are dependent.
• We generally come across processes whose
members are dependent.
• The relationship is often of great importance.
• The nature of the dependence is infinitely varied.
• Special cases are considered.
13. Some processes
• Markov process (Markov chain)
• Stationary process
• Gaussian process
• Martingale process, and so on
14. Pioneer of the Markov process (Markov chain):
Andrei Andreyevich Markov
(Russian mathematician)
15. Markov process
• Markov process: If {X(t), t ∈ T} is a stochastic
process such that, given X(s), the values of
X(t), t > s, do not depend on the values of
X(u), u < s, then the process is said to be a
Markov process.
• If, for t_1 < t_2 < ⋯ < t_n < t,
Pr{a ≤ X(t) ≤ b | X(t_1) = x_1, …, X(t_n) = x_n}
= Pr{a ≤ X(t) ≤ b | X(t_n) = x_n},
then the process {X(t), t ∈ T} is a Markov process.
• A discrete parameter Markov process is
known as a Markov chain.
16. Markov process
• A stochastic model describing a sequence of
possible events in which the probability of
each event depends only on the state attained
in the previous event.
• Visualization of Markov Chain
http://setosa.io/ev/markov-chains/
17. Markov chain
• The stochastic process {X_n, n = 0, 1, 2, …} is
called a Markov chain if, for j, k, j_1, …, j_{n−1} ∈ N,
Pr{X_n = k | X_{n−1} = j, X_{n−2} = j_1, …, X_0 = j_{n−1}}
= Pr{X_n = k | X_{n−1} = j} = p_jk
– The outcomes are called the states of the Markov
chain; if X_n has the outcome j, the process is
said to be at state j at the nth trial.
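The definition above can be sketched as a simulation where the next state is drawn using only the current state's row of transition probabilities p_jk (the two-state matrix below is a hypothetical example, not from the slides):

```python
import random

# A hypothetical two-state chain: state 0 and state 1.
P = [[0.9, 0.1],   # p_0k: transition probabilities out of state 0
     [0.5, 0.5]]   # p_1k: transition probabilities out of state 1

def simulate(P, start, n_steps, seed=0):
    """Next state depends only on the current one (the Markov property)."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

path = simulate(P, start=0, n_steps=20)
print(path)
```

Note that the full history is recorded only for inspection; the sampling step itself consults nothing but the current state.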
19. Polya’s urn model
• Let’s say you had an urn with black and red
balls. You choose one ball at random, note the
color, and replace the ball in the urn along
with another ball of the same color. The
resulting model is called Polya’s urn model.
20. Polya’s urn model
• An urn contains b black and r red balls. A ball
is drawn at random and is replaced after
drawing (with replacement). The outcome of the nth
drawing is either a black ball or a red ball. Let the
random variable X_n be defined as
X_n = 1, if the nth drawing results in a black ball, and
X_n = 0, if it results in a red ball.
There are two possible outcomes of X_n, with
21. Polya’s urn model
• Pr{X_n = 1} = b/(b+r), Pr{X_n = 0} = r/(b+r)
for all n ≥ 1
We have
Pr{X_1 = j, …, X_n = k} = Pr{X_1 = j} ⋯ Pr{X_n = k}, j, k = 0, 1,
because of the independence of X_1, …, X_n.
Polya's urn model is such that after each drawing, not only is
the ball drawn replaced, but c (c > 0) balls of the colour drawn
are added to the urn, so that the number of balls of the colour
drawn increases, while the number of balls of the other colour
remains the same as at the drawing.
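The Polya scheme can be sketched directly (a minimal simulation; the parameter values b = 2, r = 3, c = 1 are illustrative):

```python
import random

def polya_draws(b, r, c, n_draws, seed=0):
    """Polya's urn: after each draw, return the ball plus c more of its colour."""
    rng = random.Random(seed)
    black, red = b, r
    outcomes = []
    for _ in range(n_draws):
        drew_black = rng.random() < black / (black + red)
        outcomes.append(1 if drew_black else 0)  # X_n = 1 for black, 0 for red
        if drew_black:
            black += c   # reinforce the colour just drawn
        else:
            red += c
    return outcomes, black, red

outcomes, black, red = polya_draws(b=2, r=3, c=1, n_draws=10)
print(outcomes, black, red)
```

Unlike the with-replacement case, the draws are no longer independent: each draw shifts the composition of the urn in favour of the colour just seen.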
24. Application of Polya’s urn model
• We usually use Polya's urn model to describe
the epidemic model. According to our
assumptions (B_0 black balls and R_0 red balls in the
initial state), in the epidemic model R_0
represents the number of healthy people and B_0
represents the number of sick people. In
each step, we add c balls of the same colour to the urn
when the ball drawn is black. If c = 0, the state
does not change. Moreover, a describes the
infection rate, which means that the average number
of people infected is a when a sick person comes
into contact with healthy people.
25. Transition matrix
• The transition matrix records all the data about
transitions from one state to another. The
form of a general transition matrix is
28. Transition matrix
• The transition probabilities p_ij satisfy
p_ij ≥ 0, Σ_j p_ij = 1 for all i.
These probabilities may be written in the matrix
form
P = [ p_11 p_12 p_13 …
      p_21 p_22 p_23 …
      ⋮    ⋮    ⋮     ]
This is called the transition probability matrix of the
Markov chain. P is a stochastic matrix with non-negative
elements and unit row sums.
29. Transition matrix
• A transition matrix is any square matrix that
satisfies the following two properties:
– All entries are greater than or equal to 0;
– The sum of the entries in each row is 1.
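The two defining properties can be checked directly (a small sketch; the example matrices are hypothetical):

```python
def is_transition_matrix(P, tol=1e-9):
    """Check the two defining properties: non-negative entries, unit row sums."""
    n = len(P)
    square = all(len(row) == n for row in P)
    nonneg = all(p >= 0 for row in P for p in row)
    rows_sum_1 = all(abs(sum(row) - 1.0) <= tol for row in P)
    return square and nonneg and rows_sum_1

print(is_transition_matrix([[0.9, 0.1], [0.5, 0.5]]))  # True
print(is_transition_matrix([[0.9, 0.2], [0.5, 0.5]]))  # False: a row sums to 1.1
```

A small tolerance is used on the row sums because floating-point entries rarely sum to exactly 1.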
38. Classification of States and Chain:
Communication relation
• If p_ij^(n) > 0 for some n ≥ 1, then we say that state j
can be reached, or state j is accessible, from i.
• Denoted by i → j
• If p_ij^(n) = 0 for all n, then j is not accessible from i.
• Denoted by i ↛ j
• If two states i and j are such that each is
accessible from the other, then the two states
communicate.
• Denoted by i ↔ j
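Accessibility can be tested without computing matrix powers: p_ij^(n) > 0 for some n ≥ 1 exactly when there is a positive-probability path from i to j in the transition graph, so a graph search suffices (a sketch; the three-state matrix is a hypothetical example):

```python
def accessible(P, i, j):
    """j is accessible from i if p_ij^(n) > 0 for some n >= 1,
    i.e. a positive-probability path leads from i to j."""
    frontier, seen = [i], set()
    while frontier:
        s = frontier.pop()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                frontier.append(t)
    return j in seen

def communicate(P, i, j):
    """i <-> j when each state is accessible from the other."""
    return accessible(P, i, j) and accessible(P, j, i)

# State 2 is absorbing here, so 0 -> 2 but not 2 -> 0.
P = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.0, 0.0, 1.0]]
print(accessible(P, 0, 2), accessible(P, 2, 0))  # True False
print(communicate(P, 0, 1))                      # True
```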
40. Classification of States and Chain:
Periodicity
• State i is a return state if p_ii^(n) > 0 for some
n ≥ 1. The period d_i of a return state i is
defined as the greatest common divisor of all
m such that p_ii^(m) > 0:
d_i = G.C.D.{m : p_ii^(m) > 0}
Thus state i is said to be aperiodic if d_i = 1
and periodic if d_i > 1.
• State i is aperiodic if p_ii ≠ 0
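The period can be computed from the definition by taking the gcd of the return times found up to some horizon (a finite-horizon sketch, so it is an approximation of the gcd over all m; the 2-cycle example is hypothetical):

```python
from math import gcd

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, horizon=50):
    """d_i = gcd of all m <= horizon with p_ii^(m) > 0."""
    d, Pm = 0, P
    for m in range(1, horizon + 1):
        if Pm[i][i] > 0:    # a return to state i is possible in m steps
            d = gcd(d, m)
        Pm = matmul(Pm, P)  # advance to P^(m+1)
    return d

# A deterministic 2-cycle: 0 -> 1 -> 0 -> ..., so returns occur only at even steps.
P = [[0.0, 1.0],
     [1.0, 0.0]]
print(period(P, 0))  # 2: the state is periodic
```

Note how the slide's last remark shows up here: if p_ii > 0, then m = 1 enters the gcd and d_i = 1 immediately.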
41. Classification of States and Chain:
Closed Set
• If C is a set of states such that no state outside
C can be reached from any state in C, then C is
said to be a closed set. If C is closed, j ∈ C
and k ∉ C, then p_jk^(n) = 0 for all n; i.e., C is
closed iff Σ_{j∈C} p_ij = 1 for every i ∈ C.
• A closed set contains one or more states. If a
closed set contains only one state j, then state j
is said to be absorbing: j is absorbing iff p_jj = 1,
p_jk = 0, k ≠ j.
43. Classification of States and Chain:
Irreducible and Reducible
• Every finite Markov chain contains at least one
closed set. If the chain does not contain any
proper closed subset other than the
state space, then the chain is called
irreducible; the t.p.m. of an irreducible chain is
an irreducible matrix. In an irreducible
Markov chain every state can be reached from
every other state.
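The characterization in the last sentence gives a direct test: a chain is irreducible iff every state reaches every other state through positive-probability transitions (a sketch; the example matrices are hypothetical):

```python
def is_irreducible(P):
    """Irreducible: every state can be reached from every other state."""
    n = len(P)

    def reachable_from(i):
        frontier, seen = [i], {i}
        while frontier:
            s = frontier.pop()
            for t, p in enumerate(P[s]):
                if p > 0 and t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return seen

    return all(len(reachable_from(i)) == n for i in range(n))

print(is_irreducible([[0.0, 1.0], [1.0, 0.0]]))  # True: 0 <-> 1
print(is_irreducible([[1.0, 0.0], [0.5, 0.5]]))  # False: {0} is a proper closed set
```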
44. Classification of States and Chain:
Irreducible and Reducible
• The following chain is irreducible
45. Classification of States and Chain:
Irreducible and Reducible
• A chain which is not irreducible is called
reducible or non-irreducible; the t.p.m. is
reducible.
• It has two classifications:
– Primitive (aperiodic): iff the t.p.m. is primitive
– Imprimitive (periodic): iff the t.p.m. is imprimitive
• In an irreducible chain all states belong to the
same class.
46. Classification of States and Chain:
Transient and Persistent
• f_jk^(n): the probability that, starting from state j,
the process reaches state k for the first time at the
nth step.
• p_jk^(n): the probability that the process reaches
state k (not necessarily for the first time) at the
nth step.
• The relation between f_jk^(n) and p_jk^(n) can be
expressed as
47. Classification of States and Chain:
Transient and Persistent
• p_jk^(n) = Σ_{r=0}^{n} f_jk^(r) p_kk^(n−r)
• p_jk^(0) = δ_jk, f_jk^(0) = 0, f_jk^(1) = p_jk
• Expand…
48. Classification of States and Chain:
Transient and Persistent
• F_jk: denotes the probability that, starting with
state j, the system will ever reach state k:
F_jk = Σ_{n=1}^{∞} f_jk^(n)
F_jj = 1: state j is persistent.
F_jj < 1: state j is transient.
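The relation p_jk^(n) = Σ_r f_jk^(r) p_kk^(n−r) can be inverted to compute the first-passage probabilities numerically, and summing them estimates F_jk up to a chosen horizon (a sketch; the symmetric two-state chain is a hypothetical example):

```python
def first_passage(P, j, k, n_max):
    """Compute f_jk^(1..n_max) from the relation
    f_jk^(n) = p_jk^(n) - sum_{r=1}^{n-1} f_jk^(r) p_kk^(n-r)."""
    n = len(P)
    # p_powers[m] holds the matrix P^m; start with P^0 = I.
    p_powers = [[[float(i == t) for t in range(n)] for i in range(n)]]
    for _ in range(n_max):
        last = p_powers[-1]
        p_powers.append([[sum(last[a][b] * P[b][c] for b in range(n))
                          for c in range(n)] for a in range(n)])
    f = [0.0]  # f_jk^(0) = 0 by convention
    for m in range(1, n_max + 1):
        val = p_powers[m][j][k] - sum(f[r] * p_powers[m - r][k][k]
                                      for r in range(1, m))
        f.append(val)
    return f

P = [[0.5, 0.5],
     [0.5, 0.5]]
f = first_passage(P, 0, 0, 30)
print(sum(f))  # partial sum of F_00: close to 1, so state 0 is persistent
```

For this chain f_00^(n) = (1/2)^n, so the partial sums approach F_00 = 1 geometrically fast.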
49. Classification of States and Chain:
Transient and Persistent
• The mean time from state j to k is given by
μ_jk = Σ_{n=1}^{∞} n f_jk^(n)
In particular, when k = j, {f_jj^(n), n = 1, 2, 3, …}
gives the distribution of the recurrence
times of j. The mean recurrence time for
state j is given by
μ_jj = Σ_{n=1}^{∞} n f_jj^(n)
(This is like an expectation, where f gives the probability.)
50. Classification of States and Chain:
Transient and Persistent
• A persistent state j is said to be null persistent if
μ_jj = ∞
• A persistent state j is said to be non-null
persistent if
μ_jj < ∞
• A persistent, non-null, and aperiodic state of a
Markov chain is said to be ergodic.
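The mean recurrence time μ_jj can also be estimated by simulating return times and averaging them (a Monte Carlo sketch; the chain below is a hypothetical example whose true value is μ_00 = 2, since its return times are geometric with p = 1/2):

```python
import random

def mean_recurrence_time(P, j, n_trials=20000, seed=0):
    """Estimate mu_jj by averaging simulated return times to state j."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        state, steps = j, 0
        while True:
            state = rng.choices(range(len(P)), weights=P[state])[0]
            steps += 1
            if state == j:   # first return to j: record the recurrence time
                break
        total += steps
    return total / n_trials

P = [[0.5, 0.5],
     [0.5, 0.5]]
print(mean_recurrence_time(P, 0))  # close to 2
```

A finite estimate like this one indicates a non-null persistent state; since p_00 > 0 the state is also aperiodic, hence ergodic. (The loop assumes returns are certain, i.e. a persistent state; for a transient state it could fail to terminate.)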