Hello,
This is Tahsin Ahmed Nasim, a student of Civil Engineering. This is my presentation on Markov chains, prepared for the Probability and Statistics course.
2. INTRODUCTION
The Markov process was proposed by the Russian mathematician Andrey Markov.
In probability theory, a Markov model is a stochastic model of a randomly changing system.
If the future states of a process are independent of the past and depend only on the present, the process is called a Markov process.
A Markov chain is a random process with the property that the next state depends only on the current state.
3. There are four common Markov models, distinguished by whether the system state is fully or partially observable and by whether the system is autonomous or controlled:
• System is autonomous, state fully observable: Markov chain
• System is autonomous, state partially observable: Hidden Markov model
• System is controlled, state fully observable: Markov decision process
• System is controlled, state partially observable: Partially observable Markov decision process
4. MARKOV CHAINS
It is the simplest Markov Model.
It models the state of a system with a random variable that
changes through time.
A Markov chain can be described by a transition matrix.
A Markov chain is "a stochastic model describing a sequence
of possible events in which the probability of each event
depends only on the state attained in the previous event."
5. TRANSITION MATRIX
It is also called the probability matrix, Markov matrix, or substitution matrix.
It is a square matrix used to describe the transitions of a Markov chain; each row lists the probabilities of moving from one state to every state, so each row sums to 1.
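As a quick sketch, a transition matrix for a three-state chain can be written as a square array whose rows each sum to 1. The probability values below are illustrative assumptions, not data from these slides:

```python
# Hypothetical transition matrix for a 3-state Markov chain.
# Entry P[i][j] is the probability of moving from state i to state j.
P = [
    [0.8, 0.1, 0.1],  # transitions out of state 0
    [0.3, 0.4, 0.3],  # transitions out of state 1
    [0.2, 0.3, 0.5],  # transitions out of state 2
]

# A valid transition matrix is square, and every row sums to 1.
assert all(len(row) == len(P) for row in P)
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)
print("valid transition matrix")
```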
7. EXAMPLE OF MARKOV CHAIN
• Design a Markov Chain to predict the
weather of tomorrow using previous
information of the past days.
• Our model has only 3 states: 𝑆 = {𝑆1, 𝑆2, 𝑆3}, where 𝑆1 = Sunny, 𝑆2 = Rainy, and 𝑆3 = Cloudy.
• To establish the transition
probabilities relationship between
states we will need to collect data.
9.
Let’s say we have a sequence: Sunny, Rainy, Cloudy,
Cloudy, Sunny, Sunny, Sunny, Rainy, ….;
so, we can be in any of the three states in a day.
We can use the following state-sequence notation: 𝑞1, 𝑞2, 𝑞3, 𝑞4, 𝑞5, …, where 𝑞𝑖 ∈ {Sunny, Rainy, Cloudy}.
To compute the probability of tomorrow's weather we can use the Markov property, which lets the joint probability factor into one-step transitions:
𝑃(𝑞1, …, 𝑞𝑛) = ∏𝑖=1..𝑛 𝑃(𝑞𝑖 | 𝑞𝑖−1), where 𝑃(𝑞1 | 𝑞0) is read as the initial probability 𝑃(𝑞1).
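This factorisation can be sketched in code for the weather example. The initial distribution and transition probabilities below are made-up illustrative values, not data from the slides:

```python
# Illustrative weather chain over the three states from the example.
states = ["Sunny", "Rainy", "Cloudy"]
idx = {s: i for i, s in enumerate(states)}

init = [0.5, 0.2, 0.3]   # assumed initial distribution P(q1)
T = [                    # assumed transition matrix P(q_i | q_{i-1})
    [0.6, 0.2, 0.2],     # from Sunny
    [0.3, 0.4, 0.3],     # from Rainy
    [0.4, 0.3, 0.3],     # from Cloudy
]

def sequence_probability(seq):
    # P(q1, ..., qn) = P(q1) * product of P(q_i | q_{i-1}) for i >= 2
    p = init[idx[seq[0]]]
    for prev, cur in zip(seq, seq[1:]):
        p *= T[idx[prev]][idx[cur]]
    return p

# e.g. P(Sunny, Rainy, Cloudy) = 0.5 * 0.2 * 0.3
print(sequence_probability(["Sunny", "Rainy", "Cloudy"]))
```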
10. APPLICATION OF MARKOV CHAIN
It is used in various fields of study, such as physics, chemistry, medicine,
and music.
It is used in thermodynamics and statistical mechanics.
It can be used for data analysis.
Markov chain methods have also become very important for
generating sequences of random numbers via a process called Markov
chain Monte Carlo (MCMC).
It is used in mathematical biology, especially in population processes.
Markov chains can be used in population genetics.
It is used to model weather conditions.
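As a minimal MCMC sketch, a Metropolis-style chain can draw samples from a discrete target distribution: the chain's long-run visit frequencies approach the target. The target values and step count below are illustrative assumptions:

```python
import random

random.seed(0)

# Hypothetical target distribution we want to sample from.
target = [0.2, 0.5, 0.3]

def metropolis_step(state):
    # Propose a uniformly random state and accept it with
    # probability min(1, target[proposal] / target[state]).
    proposal = random.randrange(len(target))
    if random.random() < min(1.0, target[proposal] / target[state]):
        return proposal
    return state

# Run the chain; visit frequencies approach the target distribution.
counts = [0] * len(target)
state = 0
for _ in range(100_000):
    state = metropolis_step(state)
    counts[state] += 1
freqs = [c / sum(counts) for c in counts]
print(freqs)
```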
11. MARKOV DECISION PROCESS
A Markov decision process is a discrete-time stochastic
control process.
Markov decision processes (MDPs) provide a mathematical
framework for modeling decision-making in situations where
outcomes are partly random and partly under the decision
maker's control.
It is an extension of the Markov chain model.
It is applied when the system is controlled and the
system state is fully observable.
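A small value-iteration sketch shows how an MDP adds actions and rewards on top of a Markov chain. The two-state model and all the numbers below are illustrative assumptions:

```python
# Hypothetical MDP: transitions[s][a] = list of (prob, next_state, reward).
transitions = {
    0: {"stay": [(1.0, 0, 0.0)],
        "go":   [(0.9, 1, 1.0), (0.1, 0, 0.0)]},
    1: {"stay": [(1.0, 1, 2.0)],
        "go":   [(1.0, 0, 0.0)]},
}
gamma = 0.9  # discount factor

# Value iteration: repeatedly apply the Bellman optimality update
# V(s) = max_a sum_{s'} P(s'|s,a) * (r + gamma * V(s')).
V = {s: 0.0 for s in transitions}
for _ in range(300):
    V = {
        s: max(
            sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
            for outcomes in actions.values()
        )
        for s, actions in transitions.items()
    }
print(V)
```

In this toy model the best policy in state 1 is to "stay" and collect reward 2 forever, so V[1] converges to 2 / (1 − 0.9) = 20.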
12. APPLICATIONS OF MARKOV DECISION PROCESS
It is used in various fields such as robotics, automatic control, economics, and manufacturing.
It is used in network (World Wide Web) processes.
13.
Tahsin Ahmed Nasim
ID: 2019-2-22-026
Department: Civil Engineering
Course Title: Probability and Statistics
Phone: 01852742703
Email: tahsin.ahmed.Nasim@gmail.com
Presenter