The document describes a discrete memoryless channel and provides examples of how to model such a channel. It defines a discrete memoryless channel as having a discrete input and output alphabet where the current output only depends on the current input. It presents how to describe the channel using forward and backward conditional probability matrices and provides examples of calculating various probabilities based on these matrices. Examples of applying these concepts to binary and ternary channels are also given.
Information Theory and Coding - Lecture 4
1. Mustaqbal University
College of Engineering & Computer Sciences
Electronics and Communication Engineering Department
Course: EE301: Probability Theory and Applications
Part B
Prerequisite: Stat 219
Text Book: B.P. Lathi, “Modern Digital and Analog Communication Systems”, 3rd edition, Oxford University Press, Inc., 1998
Reference: A. Papoulis, Probability, Random Variables, and Stochastic Processes, Mc-Graw Hill, 2005
Dr. Aref Hassan Kurdali
2. Discrete Memoryless Channels
The issue of information transmission is now considered, with particular emphasis on reliability (error-free transmission). Consider a discrete memoryless channel: a statistical model with an input A and an output B that is a noisy version of A; both A and B are discrete random variables. At every unit of time, the channel accepts an input symbol ai selected from an input alphabet A {a1, a2, ..., ai, ..., ar} and, in response, emits an output symbol bj from an output alphabet B {b1, b2, ..., bj, ..., bs}. The channel is said to be "discrete" when both alphabets A and B have finite sizes, and "memoryless" when the current output symbol depends only on the current input symbol and not on any of the previous ones. The input alphabet A and the output alphabet B need not have the same size.
3. A convenient way of describing a discrete memoryless channel is to arrange the various conditional probabilities of the channel at the transmitter in the form of a forward conditional channel matrix FM as follows:

FM = [ p(b1/a1)  p(b2/a1)  ...  p(bs/a1) ]
     [ p(b1/a2)  p(b2/a2)  ...  p(bs/a2) ]
     [    ...       ...    ...     ...   ]
     [ p(b1/ar)  p(b2/ar)  ...  p(bs/ar) ]

The ith row represents the output probability distribution given that the input symbol ai is transmitted. Thus, the sum over any row has to equal one.
Similarly, at the receiver, the backward conditional channel matrix can be calculated and arranged as follows:

BM = [ p(a1/b1)  p(a2/b1)  ...  p(ar/b1) ]
     [ p(a1/b2)  p(a2/b2)  ...  p(ar/b2) ]
     [    ...       ...    ...     ...   ]
     [ p(a1/bs)  p(a2/bs)  ...  p(ar/bs) ]

The jth row represents the input probability distribution given that the output symbol bj is received. Thus, the sum over any row also has to equal one.
The joint probability p(ai, bj) = p(bj/ai) p(ai) = p(ai/bj) p(bj) = p(bj, ai)
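These relations can be checked numerically. The sketch below uses an assumed 2x2 forward matrix and assumed priors (illustrative values, not from the lecture) to build the joint matrix, the output distribution, and the backward matrix via p(ai/bj) = p(ai, bj) / p(bj):

```python
# Assumed illustrative values: a 2x2 forward matrix and input priors.
FM = [[0.9, 0.1],
      [0.2, 0.8]]          # p(bj/ai), rows indexed by input ai
A = [0.6, 0.4]             # p(ai), assumed priors

# Joint probabilities p(ai, bj) = p(bj/ai) p(ai)
joint = [[FM[i][j] * A[i] for j in range(2)] for i in range(2)]

# Output distribution p(bj) = sum_i p(ai, bj)
B = [sum(joint[i][j] for i in range(2)) for j in range(2)]

# Backward matrix p(ai/bj) = p(ai, bj) / p(bj), rows indexed by output bj
BM = [[joint[i][j] / B[j] for i in range(2)] for j in range(2)]

# Each backward row is a conditional distribution, so it sums to one
for row in BM:
    assert abs(sum(row) - 1.0) < 1e-9
```

The same three steps (joint, marginal, Bayes) work for any alphabet sizes r and s.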
5. Forward Conditional Channel Matrix: conditional output probability distribution at the transmitter

                                  Ideal        Practical
P(b1/a1) P(b2/a1) P(b3/a1)       1  0  0      0.9   0.07  0.03
P(b1/a2) P(b2/a2) P(b3/a2)       0  1  0      0.1   0.6   0.3
P(b1/a3) P(b2/a3) P(b3/a3)       0  0  1      0.01  0.03  0.96

The output probabilities can be calculated as:
P(b1) = P(b1/a1)P(a1) + P(b1/a2)P(a2) + P(b1/a3)P(a3) = P(a1,b1) + P(a2,b1) + P(a3,b1)
P(b2) and P(b3) are obtained in the same way.

Backward Conditional Channel Matrix: conditional input probability distribution at the receiver

P(a1/b1) P(a2/b1) P(a3/b1)       0.8   0.15  0.05
P(a1/b2) P(a2/b2) P(a3/b2)       0.07  0.9   0.03
P(a1/b3) P(a2/b3) P(a3/b3)       0.01  0.29  0.7
6. The marginal probability distribution of the output random variable B {p(b1), p(b2), ..., p(bj), ..., p(bs)} can be calculated from the a priori probability distribution of the input random variable A {p(a1), p(a2), ..., p(ai), ..., p(ar)} and the forward conditional channel matrix, where

P(bj) = sum over i of p(bj/ai) p(ai),  i = 1, 2, ..., r

i.e., B = FM^T A

Note that the input probability distribution A and the forward channel matrix FM are independent of each other; in practice, both must be measured (or given) separately.
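As a sketch, the product B = FM^T A can be computed directly. The forward matrix below is the "practical" example from the earlier slide; the uniform input distribution is an assumption for illustration:

```python
# "Practical" forward matrix from the earlier slide (rows indexed by input ai)
FM = [[0.9, 0.07, 0.03],
      [0.1, 0.6, 0.3],
      [0.01, 0.03, 0.96]]
A = [1/3, 1/3, 1/3]   # assumed uniform input distribution (illustrative)

# p(bj) = sum_i p(bj/ai) p(ai), i.e. B = FM^T A
B = [sum(FM[i][j] * A[i] for i in range(3)) for j in range(3)]
```

Since every row of FM sums to one and A sums to one, B is automatically a valid probability distribution.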
7. The binary symmetric channel (BSC) is of great theoretical interest and practical importance. It is a special case of the discrete memoryless channel with r = s = 2. The channel has two input symbols (x0 = 0, x1 = 1) and two output symbols (y0 = 0, y1 = 1). The channel is symmetric because the probability of receiving a 1 if a 0 is sent is the same as the probability of receiving a 0 if a 1 is sent. This conditional probability of error is denoted by p. The forward channel matrix and diagram of a binary symmetric channel are shown above in Figure 9.8.

BSC forward transition matrix:

[ 1 - p    p   ]
[   p    1 - p ]
8. Problem 1
Consider the transition probability diagram of a binary symmetric channel shown in the figure below. The input binary symbols 0 and 1 occur with equal probability.
a) Find the probabilities of the binary symbols 0 and 1 appearing at the channel output.
b) Repeat the calculation in (a), assuming that the input binary symbols 0 and 1 occur with probabilities 1/4 and 3/4, respectively.
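A worked sketch of Problem 1: the figure's crossover probability is not reproduced here, so p = 0.1 is assumed purely for illustration. With equal priors the BSC output is always equiprobable, whatever p is; with priors 1/4 and 3/4 it is not:

```python
p = 0.1                        # assumed crossover probability (not from the figure)
FM = [[1 - p, p],
      [p, 1 - p]]              # BSC forward matrix

def output_dist(A):
    # p(yj) = sum_i p(yj/xi) p(xi)
    return [sum(FM[i][j] * A[i] for i in range(2)) for j in range(2)]

B_equal = output_dist([0.5, 0.5])    # part (a): [0.5, 0.5] for any p
B_skew = output_dist([0.25, 0.75])   # part (b): [1/4 + p/2, 3/4 - p/2]
```

For part (b) the closed form P(y = 0) = (1 - p)/4 + 3p/4 = 1/4 + p/2 shows how the noise pulls the output distribution toward uniform.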
14. Receiver Decision Rules
Suppose the symbol bj has been received; which input symbol ai should the receiver decide was sent? d(bj) denotes the receiver decision rule whenever bj is received. Using the backward transition matrix and taking the maximum over each row (the maximum a posteriori, MAP, rule):

d(bj) = a*, where p(a*/bj) ≥ p(ai/bj) for all i

and in terms of the joint probabilities:

p(a*, bj) ≥ p(ai, bj) for all i

or, in terms of the forward conditional probabilities, p(bj/a*) p(a*) ≥ p(bj/ai) p(ai) for all i.
If the input probability distribution is uniform, this reduces to the maximum likelihood rule:

d(bj) = a*, where p(bj/a*) ≥ p(bj/ai) for all i
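The rule above can be sketched as a function of the forward matrix and the priors; the BSC used in the usage line is an assumed example with p = 0.1:

```python
def decision_rule(FM, A):
    """MAP rule: d(bj) = argmax_i p(bj/ai) p(ai).
    Reduces to maximum likelihood when A is uniform."""
    s = len(FM[0])
    return [max(range(len(A)), key=lambda i: FM[i][j] * A[i]) for j in range(s)]

# Assumed BSC example with crossover probability 0.1 and uniform priors:
# the rule simply decodes each output symbol as itself.
d = decision_rule([[0.9, 0.1], [0.1, 0.9]], [0.5, 0.5])
```

Working with p(bj/ai) p(ai) avoids computing the backward matrix explicitly, since dividing every entry of a column by p(bj) does not change which entry is largest.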
15. Probability of Error Calculations
For a given channel matrix and a given receiver decision rule, the conditional probability of error is p(E/bj) = 1 - p(d(bj)/bj), and the statistical average of error PE can be calculated as:

PE = sum over j of p(E/bj) p(bj),  j = 1, 2, ..., s
   = 1 - sum over j of p(d(bj)/bj) p(bj)
   = 1 - sum over j of p(a*, bj)
   = 1 - sum over j of p(bj/a*) p(a*),  where a* = d(bj)

If the input probability distribution is uniform, then

PE = 1 - (1/r) (sum over j of p(bj/a*))
16. Problem 3
Consider the following backward 4-ary channel matrix.
What would the receiver decision rule d(b1), d(b2), d(b3), d(b4) be?
17. Problem 3 - Solution
Consider the following backward 4-ary channel matrix.
The receiver decision rule would be:
d(b1) = a1
d(b2) = a1
d(b3) = a3
d(b4) = a2
18. Problem 4
A 4-symbol memoryless source S {a, b, c, d} is encoded using the following binary Shannon-Fano encoder:
a → 1100
b → 10
c → 0110
d → 00
The source probability distribution is: p(a) = p(c) = 0.0625, p(b) = p(d) = 0.4375.
If this encoder's output binary stream is applied to a binary channel, calculate the channel input probability distribution: P(0) and P(1).
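In a long encoded stream, P(0) is the expected number of 0s per source symbol divided by the expected codeword length; a sketch of the calculation for Problem 4:

```python
codes = {"a": "1100", "b": "10", "c": "0110", "d": "00"}
probs = {"a": 0.0625, "b": 0.4375, "c": 0.0625, "d": 0.4375}

# Expected number of 0s per source symbol, and expected codeword length
zeros = sum(probs[s] * cw.count("0") for s, cw in codes.items())
avg_len = sum(probs[s] * len(cw) for s, cw in codes.items())

P0 = zeros / avg_len   # fraction of 0s in the channel input stream
P1 = 1 - P0
```

Here the expected count of 0s is 1.5625 and the expected length is 2.25 bits, so P(0) = 25/36 and P(1) = 11/36.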
20. Problem 5

T       Prob    Ternary Huffman Code
t1      0.25    1
t2      0.2     2
t4      0.2     00
t5      0.16    02
t3      0.05    011
t7      0.05    012
t6      0.04    0100
t8      0.04    0101
t9      0.01    0102

If the shown ternary stream is applied to a ternary channel with the following forward channel matrix:

0.9   0.05  0.05
0.15  0.7   0.15
0.1   0.1   0.8

a) Calculate the channel input probability distribution.
b) Calculate the channel output probability distribution.
c) Find the receiver decision rule and its probability of error.
d) Calculate the following conditional probabilities: p(a2/b1), p(a1/b3)
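A sketch for parts (a) and (b): the digit frequencies of the Huffman stream give the channel input distribution, and B = FM^T A then gives the output distribution:

```python
codes = {"t1": "1", "t2": "2", "t4": "00", "t5": "02", "t3": "011",
         "t7": "012", "t6": "0100", "t8": "0101", "t9": "0102"}
probs = {"t1": 0.25, "t2": 0.2, "t4": 0.2, "t5": 0.16, "t3": 0.05,
         "t7": 0.05, "t6": 0.04, "t8": 0.04, "t9": 0.01}

# Channel input distribution: expected count of each digit over expected length
avg_len = sum(probs[t] * len(cw) for t, cw in codes.items())
A = [sum(probs[t] * cw.count(d) for t, cw in codes.items()) / avg_len
     for d in "012"]

FM = [[0.9, 0.05, 0.05],
      [0.15, 0.7, 0.15],
      [0.1, 0.1, 0.8]]

# Output distribution: p(bj) = sum_i p(bj/ai) p(ai)
B = [sum(FM[i][j] * A[i] for i in range(3)) for j in range(3)]
```

The average codeword length works out to 1.83 ternary digits, giving A = [0.88, 0.53, 0.42] / 1.83.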
23. Problem 6
Given the following forward ternary channel matrix:

0.7   0.13  0.17
0.08  0.9   0.02
0.2   0.3   0.5

The channel input probability distribution is [0.4, 0.5, 0.1].
1. Draw the forward channel diagram.
2. Calculate the output probability distribution.
3. Calculate the joint channel matrix.
4. Calculate the backward channel matrix.
5. Find the receiver decision rule.
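Steps 2-5 can be sketched directly from the given matrix and priors:

```python
FM = [[0.7, 0.13, 0.17],
      [0.08, 0.9, 0.02],
      [0.2, 0.3, 0.5]]
A = [0.4, 0.5, 0.1]

# Joint channel matrix p(ai, bj) = p(bj/ai) p(ai)
joint = [[FM[i][j] * A[i] for j in range(3)] for i in range(3)]

# Output distribution p(bj) = sum_i p(ai, bj)
B = [sum(joint[i][j] for i in range(3)) for j in range(3)]

# Backward channel matrix p(ai/bj) = p(ai, bj) / p(bj), rows indexed by bj
BM = [[joint[i][j] / B[j] for i in range(3)] for j in range(3)]

# MAP receiver decision rule: d(bj) = argmax_i p(ai, bj)
d = [max(range(3), key=lambda i: joint[i][j]) for j in range(3)]
```

With these numbers B = [0.34, 0.532, 0.128], and the rule decodes b1 and b3 as a1 and b2 as a2.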
25. Problem 7
A source S has six symbols with probability distribution [0.55, 0.1, 0.15, 0.13, 0.03, 0.04].
a) Construct a ternary Huffman code.
b) If the encoder output stream is transmitted via a ternary symmetric channel with probability of error p = 0.1, calculate the input probability distribution of the channel.
c) Write the forward ternary channel matrix.
d) Calculate the output probability distribution.
e) Calculate the joint channel matrix.
f) Calculate the backward channel matrix.
g) Find the receiver decision rule and calculate its probability of error PE.
h) How could you reduce the probability of error PE?
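For part (a), a ternary Huffman construction must pad the source with dummy zero-probability symbols so that (n - 1) mod (r - 1) = 0 for r = 3; otherwise the final merge would not use all three branches of the root. A sketch (the function name and tie-breaking are illustrative choices, so other optimal codes with the same lengths also exist):

```python
import heapq
from itertools import count

def ternary_huffman(probs):
    """Sketch of a ternary Huffman coder.

    Pads with dummy zero-probability symbols so that every merge
    combines exactly three nodes; dummies receive no codeword."""
    tick = count()
    # Heap entries: (probability, tiebreak, symbol index or children list)
    heap = [(p, next(tick), i) for i, p in enumerate(probs)]
    while (len(heap) - 1) % 2 != 0:          # need (n - 1) mod 2 == 0
        heap.append((0.0, next(tick), None))  # dummy symbol
    heapq.heapify(heap)
    while len(heap) > 1:
        children = [heapq.heappop(heap) for _ in range(3)]
        heapq.heappush(heap, (sum(c[0] for c in children), next(tick), children))
    codes = {}
    def walk(node, prefix):
        payload = node[2]
        if isinstance(payload, list):
            for digit, child in enumerate(payload):
                walk(child, prefix + str(digit))
        elif payload is not None:             # skip dummy leaves
            codes[payload] = prefix or "0"
    walk(heap[0], "")
    return codes

codes = ternary_huffman([0.55, 0.1, 0.15, 0.13, 0.03, 0.04])
```

For this source one dummy symbol is added (6 symbols → 7 nodes), and the resulting codeword lengths are [1, 2, 1, 2, 3, 3], for an average length of 1.37 ternary digits per symbol.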
26. Problem 7 - Solution
Ternary symmetric channel forward diagram:
28. Problem 8
Given the following forward ternary channel matrix:

0.9   0.03  0.07
0.08  0.7   0.22
0.1   0.4   0.5

Find the receiver decision rule and its probability of error for each input probability distribution [p(a1), p(a2), p(a3)]:
a) [0.8, 0.15 & 0.05]
b) [0.15, 0.8 & 0.05]
c) [0.05, 0.15 & 0.8]
d) Comment
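A sketch evaluating the MAP rule for each prior. The distribution in (c) is taken as [0.05, 0.15, 0.8] so that it sums to one (an assumption about the intended values):

```python
FM = [[0.9, 0.03, 0.07],
      [0.08, 0.7, 0.22],
      [0.1, 0.4, 0.5]]

def map_rule(A):
    # d(bj) = argmax_i p(bj/ai) p(ai);  PE = 1 - sum_j p(bj/d(bj)) p(d(bj))
    d = [max(range(3), key=lambda i: FM[i][j] * A[i]) for j in range(3)]
    PE = 1 - sum(FM[d[j]][j] * A[d[j]] for j in range(3))
    return d, PE

results = [map_rule(A) for A in ([0.8, 0.15, 0.05],
                                 [0.15, 0.8, 0.05],
                                 [0.05, 0.15, 0.8])]   # (c) assumed
```

Note how the decision rule shifts toward whichever input symbol dominates the prior: in case (c) every output is decoded as a3.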