Stochastic Process and Markov Chain
Discrete Markov Chain
Actuarial Mathematics II
Lecture 4
Dr. Shaiful Anuar
Institute Of Mathematical Sciences
University of Malaya
Lecture 4 Actuarial Mathematics II
Table of contents
1 Stochastic Process and Markov Chain
2 Discrete Markov Chain
Examples 1–4
Stochastic Process and Markov Chain
A collection of random variables of a stochastic process is
normally represented by {Yt, t ∈ T}, where t is often referred
to as time and Yt is the state of the process at time t, with
Yt ∈ S. S is known as the state space, the set of all possible
values of the random variables.
The stochastic process can be categorized as follows
depending on T:
discrete-time stochastic process : when T is of
discrete/countable type
continuous-time stochastic process : when T is of
continuous/interval type
Similarly, a Markov chain can be of discrete or continuous
type.
A Markov chain is a stochastic process with the following
properties:
(i) P(Yt+1 ∈ S | Yt = i) = 1
(ii) P(Yt+1 = j | Yt = i, Yt−1 = it−1, . . . , Y0 = i0)
= P(Yt+1 = j | Yt = i)
An important characteristic of a Markov chain, as expressed
in property (ii) above, is the memoryless property: the
transition probabilities depend only on the current state.
In insurance, we apply this concept in multi-state models,
which use the Markov chain to specify the probabilities of
moving between the various states in the model.
These probabilities are known as transition probabilities.
Several popular multi-state models are given below:
(i) The alive-dead model:
    Alive (1) → Dead (2)
(ii) The double indemnity/accidental death model:
    Alive (1) → Dead – Accident (2)
    Alive (1) → Dead – Other causes (3)
(iii) The permanent disability model:
    Healthy (1) → Disabled (2), Healthy (1) → Dead (3),
    Disabled (2) → Dead (3)
(iv) The disability income model:
    Healthy (1) ⇄ Sick (2), Healthy (1) → Dead (3),
    Sick (2) → Dead (3)
The most common model that we have come across is the
single decrement model with a two-state transition.
The alive-dead model: Alive (1) → Dead (2)
It is common to represent the transition probabilities in the
form of a matrix. For the above example, assume that the
probability of moving from state "Alive" to state "Dead" equals
0.4. Then the transition matrix is:

        1     2
  1    0.6   0.4
  2    0     1
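This matrix can be written down and sanity-checked in a few lines of Python (a sketch; states 1 = Alive and 2 = Dead are indexed 0 and 1 here):

```python
# Transition matrix for the alive-dead model.
# Row = current state, column = next state.
# Index 0 = Alive, index 1 = Dead.
P = [
    [0.6, 0.4],  # from Alive: survive 0.6, die 0.4
    [0.0, 1.0],  # from Dead: absorbing state
]

# Property (i): each row must sum to 1, since the chain
# must move to some state in S.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
print("rows sum to 1")
```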
The one-step transition probability is the probability of moving
from one state to another in a single step. It is denoted by:

    Q_t^(i,j) = P(Yt+1 = j | Yt = i)

The k-step transition probability is the probability of moving
from state i at time t to state j at time t + k. It satisfies the
Chapman–Kolmogorov recursion:

    kQ_t^(i,j) = P(Yt+k = j | Yt = i) = Σm Q_t^(i,m) · k−1Q_t+1^(m,j)

where the sum runs over all intermediate states m.
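The step-chaining idea can be sketched in Python. A minimal illustration, assuming the one-step matrices are supplied as a list Qs indexed by time t (the numbers below are hypothetical):

```python
def compose(A, B):
    # (A·B)[i][j] = sum over intermediate states m of A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

def k_step(Qs, t, k):
    # k-step transition matrix starting at time t, built by chaining
    # the one-step matrices Qs[t], Qs[t+1], ..., Qs[t+k-1].
    M = Qs[t]
    for s in range(t + 1, t + k):
        M = compose(M, Qs[s])
    return M

# Two hypothetical time-varying one-step matrices for a 2-state chain:
Qs = [
    [[0.6, 0.4], [0.0, 1.0]],  # Q at time t = 0
    [[0.5, 0.5], [0.0, 1.0]],  # Q at time t = 1
]
M = k_step(Qs, 0, 2)
print(M[0][1])  # 2-step probability of ending in state 2 from state 1: 0.6*0.5 + 0.4*1 = 0.7
```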
As mentioned earlier, the transition probabilities are collected
in a matrix known as the transition matrix.
If the transition probabilities do not vary with time, the chain
is referred to as a homogeneous Markov chain.
If the transition probabilities vary with time, it is referred to
as a non-homogeneous Markov chain.
For a homogeneous Markov chain the transition probabilities are
independent of t:

    kQ^(i,j) = Σm Q^(i,m) · k−1Q^(m,j)

Therefore, if Q is the one-step transition matrix, the k-step
transition matrix is simply the matrix power:

    kQ = Q^k
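A sketch in Python, using the alive-dead matrix from earlier as the homogeneous one-step matrix Q:

```python
def mat_mul(A, B):
    # Matrix product: (A·B)[i][j] = sum_m A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(Q, k):
    # k-step transition matrix of a homogeneous chain is simply Q^k.
    M = Q
    for _ in range(k - 1):
        M = mat_mul(M, Q)
    return M

Q = [[0.6, 0.4],
     [0.0, 1.0]]
Q2 = mat_pow(Q, 2)
print(Q2[0][0])  # probability of surviving two steps: 0.6 * 0.6 = 0.36
```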
Example 1
For a homogeneous Markov chain, the transition probability matrix
is given by
        1     2
  1    0.4   0.6
  2    0.8   0.2

Calculate 3Q^(2,1).
Solution
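A numerical check of the answer (a sketch in Python; states 1 and 2 are indexed 0 and 1):

```python
def mat_mul(A, B):
    # Matrix product: (A·B)[i][j] = sum_m A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

Q = [[0.4, 0.6],
     [0.8, 0.2]]

# Homogeneous chain: 3-step matrix is Q^3.
Q3 = mat_mul(mat_mul(Q, Q), Q)
print(round(Q3[1][0], 4))  # 3Q^(2,1) = 0.608
```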
Example 2
An auto insured who was claim-free during a policy period will be
claim-free during the next policy period with probability 0.9. An
auto insured who was not claim-free during a policy period will be
claim-free during the next policy period with probability 0.7. What
is the probability that an insured who was claim-free during the
policy period 0 will incur a claim during period 3?
Solution
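A numerical sketch in Python, under the reading that state 0 = claim-free in a period, state 1 = at least one claim, and "incur a claim during period 3" means being in state 1 at period 3:

```python
def mat_mul(A, B):
    # Matrix product: (A·B)[i][j] = sum_m A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

# State 0 = claim-free during a period, state 1 = incurs a claim.
P = [[0.9, 0.1],
     [0.7, 0.3]]

# Claim-free at period 0, so take row 0 of P^3.
P3 = mat_mul(mat_mul(P, P), P)
print(round(P3[0][1], 4))  # probability of a claim in period 3: 0.124
```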
Example 3
Consider a homogeneous Markov model with three states, Healthy
(0), Disabled (1) and Dead (2).
(i) The annual transition matrix is given by

        0      1      2
  0    0.70   0.20   0.10
  1    0.10   0.65   0.25
  2    0      0      1
(ii) There are 100 lives at the start, all Healthy. Their future
states are independent.
(iii) Assume that all lives have the same age at start.
Calculate the variance of the number of the original lives who die
within the first two years.
Solution
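A numerical sketch in Python. Each life dies within two years with probability p, the Healthy→Dead entry of Q²; since the 100 lives are independent, the number of deaths is Binomial(100, p), with variance 100·p·(1−p):

```python
def mat_mul(A, B):
    # Matrix product: (A·B)[i][j] = sum_m A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

# States: 0 = Healthy, 1 = Disabled, 2 = Dead.
Q = [[0.70, 0.20, 0.10],
     [0.10, 0.65, 0.25],
     [0.00, 0.00, 1.00]]

Q2 = mat_mul(Q, Q)
p = Q2[0][2]                 # P(dead within 2 years | healthy now)
var = 100 * p * (1 - p)      # binomial variance for 100 independent lives
print(round(p, 4), round(var, 4))  # 0.22 and 17.16
```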
Example 4
A life insurance policy waives premium upon disability of the
insured. The policy is modeled as a homogeneous Markov chain
with the three states: active, disabled, gone. The annual transition
matrix is
        1      2      3
  1    0.75   0.15   0.10
  2    0.50   0.30   0.20
  3    0      0      1
Currently 90% of the insureds are active and 10% are disabled.
(a) Calculate the percentage of the current population of insureds
that are (1) active, (2) disabled and (3) gone at the end of
three years.
(b) Calculate the probability that a currently disabled life will be
active at the end of three years.
Solution
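Both parts can be checked numerically (a sketch in Python; states 1, 2, 3 are indexed 0, 1, 2):

```python
def mat_mul(A, B):
    # Matrix product: (A·B)[i][j] = sum_m A[i][m] * B[m][j]
    n = len(A)
    return [[sum(A[i][m] * B[m][j] for m in range(n)) for j in range(n)]
            for i in range(n)]

# States: 0 = active, 1 = disabled, 2 = gone.
P = [[0.75, 0.15, 0.10],
     [0.50, 0.30, 0.20],
     [0.00, 0.00, 1.00]]

P3 = mat_mul(mat_mul(P, P), P)

# (a) Start distribution: 90% active, 10% disabled, 0% gone.
start = [0.9, 0.1, 0.0]
dist = [sum(start[i] * P3[i][j] for i in range(3)) for j in range(3)]
print([round(x, 4) for x in dist])  # [0.5488, 0.1414, 0.3098]

# (b) A currently disabled life is active after three years
#     with probability P^3[disabled][active].
print(round(P3[1][0], 3))  # 0.476
```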
Solution