Information Theory and Coding - Lecture 5
Mustaqbal University
College of Engineering &Computer Sciences
Electronics and Communication Engineering Department
Course: EE301: Probability Theory and Applications
Part C
Prerequisite: Stat 219
Text Book: B.P. Lathi, "Modern Digital and Analog Communication Systems", 3rd edition, Oxford University Press, Inc., 1998
Reference: A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 2005
Dr. Aref Hassan Kurdali
Communications Channel Entropies
(calculated in base 2)
Could A & B be statistically independent (SI)?
Tx        Rx        Notes
H(A)      H(B)      The entropy H(A) represents our uncertainty about the channel input before observing the channel output.
H(B/ai)   H(A/bj)
H(B/A)    H(A/B)    The conditional entropy H(A/B) represents our uncertainty about the channel input after observing the channel output.
H(A,B) = H(A) + H(B/A) = H(B) + H(A/B)    H(A,B) is the joint channel entropy.
Could A & B be SI?
Conditional Channel Entropies
The Joint Channel Entropy
Mutual Information
p(ai) is the a priori probability of the input symbol ai.
p(ai/bj) is the posterior probability of the input symbol ai upon reception of bj.
The change in this probability (due to channel noise) measures how much the receiver learned from the reception of bj. The receiver can never be absolutely sure what exactly was sent. The difference between the information uncertainty of ai before and after reception of bj is called the mutual information, which is the actual amount of information transmitted via the channel:
I(ai;bj) = I(ai) – I(ai/bj)
For an ideal error-free channel, p(ai/bj) = 1 if i = j, and p(ai/bj) = 0 if i ≠ j.
Therefore, I(ai;bj) = I(ai). [Note that I(ai;bj) = 0 if ai and bj were SI.]
I(ai;bj) = logr(1/p(ai)) – logr(1/p(ai/bj)) = logr(p(ai/bj)/p(ai)) = logr(p(ai,bj)/(p(ai) p(bj))).
Similarly, I(bj;ai) = I(bj) – I(bj/ai) = logr(1/p(bj)) – logr(1/p(bj/ai)) = I(ai;bj).
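As a quick numerical check, the per-symbol mutual information can be evaluated directly from the last expression. A minimal Python sketch (the probability values are illustrative, not from the text, and the function name is my own):

```python
import math

def symbol_mutual_info(p_ab, p_a, p_b, base=2):
    """I(ai;bj) = log(p(ai,bj) / (p(ai) p(bj))) in the given base."""
    return math.log(p_ab / (p_a * p_b), base)

# Observing b1 makes a1 more likely than its prior, so I > 0:
print(symbol_mutual_info(0.45, 0.5, 0.5))   # ~0.848 bits
# If ai and bj are SI, p(ai,bj) = p(ai)p(bj) and I = 0:
print(symbol_mutual_info(0.25, 0.5, 0.5))   # 0.0 bits
```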
Channel Mutual Information
The channel mutual information is the average of I(ai;bj) over all input-output symbol pairs:
I(A;B) = Σi Σj p(ai,bj) I(ai;bj)
Therefore, I(A;B) = H(A) – H(A/B). Similarly, I(A;B) = H(B) – H(B/A).
Since the entropy H(A) represents our uncertainty about the channel
input before observing the channel output, and the conditional entropy
H(A/B) represents the average amount of uncertainty remaining about
the channel input after the channel output has been observed, it follows
that the difference H(A) - H(A/B) must represent our uncertainty about
the channel input that is resolved by observing the channel output. This
important quantity is called the channel mutual information
I(A;B) = H(A) - H(A/B)
The channel mutual information represents the actual average amount of
information transmitted through the channel.
Similarly, I(A;B) = H(B) - H(B/A)
Or I(A;B) = H(A) + H(B) - H(A,B)
Note that I(A;B) = I(B;A) β‰₯ 0
Channel Mutual Information
The interpretation in Figure 9.9 is as follows: the entropy of the channel input, H(A), is represented by the circle on the left, while the entropy of the channel output, H(B), is represented by the circle on the right. The mutual information of the channel is represented by the intersection of these two circles.
I(A;B) = H(B) - H(B/A) = H(A) - H(A/B) = H(A) + H(B) - H(A,B)
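These identities are easy to verify numerically. A minimal sketch, assuming base-2 logarithms and a joint matrix supplied row by row (the helper names are my own); it uses the joint matrix of Problem 1 below:

```python
import math

def H(probs):
    """Entropy in bits, with the convention 0*log2(1/0) = 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

def channel_quantities(jm):
    """jm[i][j] = p(ai,bj). Returns H(A), H(B), H(A,B), I(A;B)."""
    p_a = [sum(row) for row in jm]            # row sums -> p(ai)
    p_b = [sum(col) for col in zip(*jm)]      # column sums -> p(bj)
    h_a, h_b = H(p_a), H(p_b)
    h_ab = H([p for row in jm for p in row])
    return h_a, h_b, h_ab, h_a + h_b - h_ab

ha, hb, hab, i_ab = channel_quantities([[1/3, 1/3], [0, 1/3]])
print(ha, hb, hab, i_ab)   # ~0.918, ~0.918, ~1.585, ~0.252
# The three forms agree, since H(A/B) = H(A,B) - H(B) and
# H(B/A) = H(A,B) - H(A).
```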
Problem 1
Let p(a1,b1) = 1/3, p(a1,b2) = 1/3, p(a2,b1) = 0, p(a2,b2) = 1/3.
Find H(A), H(B), H(A,B), H(A|B), H(B|A), and I(A;B).
Problem 1 - Solution
P(A,B) =
p(a1,b1)  p(a1,b2)
p(a2,b1)  p(a2,b2)
=
1/3  1/3
0    1/3
→ p(a1) = 1/3 + 1/3 = 2/3
→ p(a2) = 0 + 1/3 = 1/3
p(b1) = 1/3 + 0 = 1/3
p(b2) = 1/3 + 1/3 = 2/3
So
H(A) = (2/3) log2(3/2) + (1/3) log2(3) = 0.918 bit/symbol
H(B) = (1/3) log2(3) + (2/3) log2(3/2) = 0.918 bit/symbol
Problem 1 - Solution
P(A,B) =
1/3  1/3
0    1/3
H(A,B) = Σi=1..2 Σj=1..2 p(ai,bj) log2(1/p(ai,bj))
H(A,B) = (1/3) log2(3) + (1/3) log2(3) + 0 + (1/3) log2(3) = log2(3) = 1.585 bit/symbol
I(A;B) = H(A) + H(B) − H(A,B) = 0.9183 + 0.9183 − 1.5850 = 0.2516 ≈ 0.252 bit/symbol
I 𝐴, 𝐡 = H 𝐴 + H 𝐡 βˆ’ H 𝐴, 𝐡 = 0.918 + 0.918 βˆ’ 1.584 = 𝟎. πŸπŸ“πŸ bit/symbol
Problem 1 - Solution
H(A|B) = H(A,B) − H(B) = 1.585 − 0.918 = 0.667 bit/symbol
H(B|A) = H(A,B) − H(A) = 1.585 − 0.918 = 0.667 bit/symbol
Problem 1 - Tips
Note that P(B|A) is the forward transition matrix (FM), and P(A|B) is the backward transition matrix (BM).
Form the joint matrix:
P(A,B) = JM =
p(a1,b1)  p(a1,b2)
p(a2,b1)  p(a2,b2)
=
1/3  1/3
0    1/3
We can deduce the FM by dividing each row i of the JM by p(ai):
FM = P(B|A) =
(1/3)/p(a1)  (1/3)/p(a1)
0/p(a2)      (1/3)/p(a2)
=
0.5  0.5
0    1
Now, we can find H(B|A) as follows:
H(B|A) = [0.5 log2(1/0.5) + 0.5 log2(1/0.5)] × p(a1) + [0 log2(1/0) + 1 log2(1/1)] × p(a2)
(using the convention 0 log2(1/0) = 0)
H(B|A) = 1 × (2/3) + 0 × (1/3) = 0.667 bit/symbol
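The row-normalization procedure above is easy to script. A minimal sketch for Problem 1 (variable names are my own):

```python
import math

def H(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

jm = [[1/3, 1/3], [0, 1/3]]        # joint matrix P(A,B)
p_a = [sum(row) for row in jm]     # marginals p(ai)

# Forward matrix: divide each row of the JM by its row sum p(ai).
fm = [[p / p_a[i] for p in row] for i, row in enumerate(jm)]
print(fm)                          # [[0.5, 0.5], [0.0, 1.0]]

# H(B|A) = sum_i p(ai) * H(row i of the FM)
print(sum(p_a[i] * H(fm[i]) for i in range(len(fm))))   # ~0.667
```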
Problem 2
Let p(a1,b1) = 0.5, p(a1,b2) = 0.1, p(a2,b1) = 0.3, p(a2,b2) = 0.1.
Find I(A;B).
Problem 2 - Solution
P(A,B) =
p(a1,b1)  p(a1,b2)
p(a2,b1)  p(a2,b2)
=
0.5  0.1
0.3  0.1
→ p(a1) = 0.5 + 0.1 = 0.6
→ p(a2) = 0.3 + 0.1 = 0.4
p(b1) = 0.5 + 0.3 = 0.8
p(b2) = 0.1 + 0.1 = 0.2
So
H(A) = 0.6 log2(1/0.6) + 0.4 log2(1/0.4) = 0.971 bit/symbol
H(B) = 0.8 log2(1/0.8) + 0.2 log2(1/0.2) = 0.722 bit/symbol
H(A,B) = 0.5 log2(1/0.5) + 0.1 log2(1/0.1) + 0.3 log2(1/0.3) + 0.1 log2(1/0.1) = 1.685 bit/symbol
I(A;B) = H(A) + H(B) − H(A,B) = 0.9710 + 0.7219 − 1.6855 = 0.0074 bit/symbol
(the very small value reflects how noisy this channel is)
Problem 2 - Tips
P(A,B) = JM =
0.5  0.1
0.3  0.1
We can deduce the FM as follows:
FM = P(B|A) =
0.5/p(a1)  0.1/p(a1)
0.3/p(a2)  0.1/p(a2)
=
0.8333  0.1667
0.75    0.25
Now, we can find H(B|A) as follows:
H(B|A) = [0.8333 log2(1/0.8333) + 0.1667 log2(1/0.1667)] × p(a1) + [0.75 log2(1/0.75) + 0.25 log2(1/0.25)] × p(a2)
H(B|A) = 0.650 × 0.6 + 0.811 × 0.4 = 0.715 bit/symbol
Or: H(B|A) = H(A,B) − H(A) = 1.6855 − 0.9710 = 0.7145 bit/symbol
Channel Capacity
The channel capacity of a discrete memoryless channel is defined as the
maximum mutual information I(A; B) in any single use of the channel (i.e., signaling
interval), where the maximization is over all possible input probability distributions
{p(ai)} on A. The channel capacity is commonly denoted by C and is written as
C = Max I(A;B)
{p(ai)}
The channel capacity C is measured in bits per channel use, or bits per transmission.
Note that the channel capacity C is a function only of the transition probabilities
p(bj/ai), which define the channel. The calculation of C involves maximization of the
mutual information I(A; B) over r variables [i.e., the input probabilities p(a1), . . . ,p(ar)]
subject to two constraints:
p(ai) ≥ 0 for all i, and Σi p(ai) = 1.
In general, this variational problem of finding the channel capacity C is a challenging task.
Uniform Channel
The channel is called uniform when each row of its forward matrix is a permutation of the first row. The mutual information can then be written as I(A;B) = H(B) – H(B/A) = H(B) – W,
where W is a constant equal to the entropy of any row of the uniform matrix.
Example 1: the binary symmetric channel (BSC) is a uniform channel:
1-p  p
p    1-p
where p is the probability of error.
Here W = h(p) = p log(1/p) + (1-p) log(1/(1-p)).
h(p) is called the entropy function.
Therefore, the channel capacity is the maximization of [H(B) – h(p)], which gives H(B) = 1 at a uniform input probability distribution, i.e.,
C = 1 - h(p) bits/transmission at p(a1) = p(a2) = ½
Let p = 0.01: h(0.01) = 0.0808, so C = 1 – 0.0808 = 0.919 bits/transmission.
The channel capacity curve as a function of the bit error probability p (shown on the original slide) falls from C = 1 at p = 0 to C = 0 at p = ½.
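A short sketch that reproduces these numbers and can be used to trace the capacity curve for any p:

```python
import math

def h(p):
    """Binary entropy function h(p) in bits."""
    if p in (0, 1):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

def bsc_capacity(p):
    """C = 1 - h(p) bits/transmission, achieved at p(a1) = p(a2) = 1/2."""
    return 1 - h(p)

print(bsc_capacity(0.01))   # ~0.919
print(bsc_capacity(0.5))    # 0.0: the output says nothing about the input
```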
Problem 3
Figure 1 shows the forward channel diagram of a binary symmetric channel (BSC). Given that I(a1) = 2 bits, find:
1- Channel Capacity (C).
2- Channel efficiency (πœ‚).
3- Channel Redundancy.
Problem 3 - Solution
1) From the forward channel diagram we deduce that the forward transition matrix (of size r×s) is:
FM = P(B|A) =
1-p  p
p    1-p
=
0.7  0.3
0.3  0.7
C = log2(s) − h(p) = log2(2) − h(p) = 1 − h(p) (where s is the number of FM columns, i.e., the number of receiver nodes)
but: h(p) = 0.7 log2(1/0.7) + 0.3 log2(1/0.3) = 0.881 bit/symbol
C = 1 − h(p) = 1 − 0.881 = 0.119 bit/transmission
2) η = I(A;B)/C × 100%, so we first need to find I(A;B):
I(A;B) = H(A) + H(B) − H(A,B)
Given that I(a1) = 2 = log2(1/p(a1)) → p(a1) = 0.25 ⟹ p(a2) = 0.75. So the joint matrix is:
JM =
0.7 × 0.25  0.3 × 0.25
0.3 × 0.75  0.7 × 0.75
=
0.175  0.075
0.225  0.525
From the JM:
p(b1) = 0.175 + 0.225 = 0.4
p(b2) = 0.075 + 0.525 = 0.6
Problem 3 - Solution
p(a1) = 0.25, p(a2) = 0.75
H(A) = 0.25 log2(1/0.25) + 0.75 log2(1/0.75) = 0.811 bit/symbol
p(b1) = 0.4, p(b2) = 0.6
H(B) = 0.4 log2(1/0.4) + 0.6 log2(1/0.6) = 0.971 bit/symbol
JM =
0.175  0.075
0.225  0.525
H(A,B) = 0.175 log2(1/0.175) + 0.075 log2(1/0.075) + 0.225 log2(1/0.225) + 0.525 log2(1/0.525) = 1.693 bit/symbol
I(A;B) = H(A) + H(B) − H(A,B) = 0.811 + 0.971 − 1.693 = 0.089 bit/symbol
[or I(A;B) = H(B) − h(p) = 0.9710 − 0.8813 = 0.0897 ≈ 0.089]
η = I(A;B)/C × 100% = 0.089/0.119 × 100% = 75%
3) Redundancy: γ = 1 − η = 25%
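As a cross-check, a sketch that reproduces Problem 3 end to end (helper and variable names are my own):

```python
import math

def H(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

fm = [[0.7, 0.3], [0.3, 0.7]]      # forward matrix P(B|A)
p_a = [0.25, 0.75]                 # from I(a1) = 2 bits

C = math.log2(len(fm[0])) - H(fm[0])   # uniform channel: log2(s) - h(p)

jm = [[fm[i][j] * p_a[i] for j in range(2)] for i in range(2)]
p_b = [sum(col) for col in zip(*jm)]
I = H(p_a) + H(p_b) - H([p for row in jm for p in row])

print(C, I)          # ~0.119, ~0.090
print(I / C)         # efficiency ~0.75
print(1 - I / C)     # redundancy ~0.25
```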
Problem 4
Given the forward transition matrix (FM):
P(B|A) = FM =
0.7  0.2  0.1
0.1  0.7  0.2
0.2  0.1  0.7
Given that p(a1) = p(a2) = 0.25 (so p(a3) = 0.5), find:
1- Channel Capacity (C).
2- Channel efficiency (πœ‚).
3- Channel Redundancy (𝜸).
4- Draw the channel.
Problem 4 - Solution
P(B|A) = FM =
0.7  0.2  0.1
0.1  0.7  0.2
0.2  0.1  0.7
1) C = log2(3) − h(p)
h(p) = 0.7 log2(1/0.7) + 0.2 log2(1/0.2) + 0.1 log2(1/0.1) = 1.157 bit/symbol
C = log2(3) − 1.157 = 0.428 bit/transmission (or bit/sample)
2) η = I(A;B)/C × 100%
I(A;B) = H(B) − h(p)
p(b1) = 0.7 × p(a1) + 0.1 × p(a2) + 0.2 × p(a3) = 0.175 + 0.025 + 0.100 = 0.3
p(b2) = 0.2 × p(a1) + 0.7 × p(a2) + 0.1 × p(a3) = 0.050 + 0.175 + 0.050 = 0.275
p(b3) = 0.1 × p(a1) + 0.2 × p(a2) + 0.7 × p(a3) = 0.025 + 0.050 + 0.350 = 0.425
H(B) = 0.3 log2(1/0.3) + 0.275 log2(1/0.275) + 0.425 log2(1/0.425) = 1.558 bit/symbol
I(A;B) = H(B) − h(p) = 1.558 − 1.157 = 0.401 bit/symbol
η = 0.401/0.428 × 100% = 94%
Problem 4 - Solution
P(B|A) = FM =
0.7  0.2  0.1
0.1  0.7  0.2
0.2  0.1  0.7
3) Redundancy: γ = 1 − η = 6%
4) Channel diagram: three input nodes a1, a2, a3 on the left and three output nodes b1, b2, b3 on the right, with a branch from each ai to each bj labeled by the corresponding FM entry (0.7 on each ai → bi branch, 0.2 and 0.1 on the cross branches).
Example 2: Cascaded BSCs
Two cascaded BSCs, each with probability of error p, have an equivalent forward diagram that is itself a BSC with probability of error 2p(1−p), i.e., with equivalent forward matrix:
(1-p)² + p²   2p(1-p)
2p(1-p)       (1-p)² + p²
Therefore, C = 1 − h(2p(1−p)) bits/transmission at p(a1) = p(a2) = ½.
If p = 0.01, h(2 × 0.01 × 0.99) = h(0.0198) = 0.1403, then
C = 1 − 0.1403 = 0.8597 bits/transmission, which is less than that of Example 1.
This means that Tx and Rx should always be connected by a single channel rather than a cascade.
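A sketch to verify the equivalent error probability and the capacity loss from cascading:

```python
import math

def h(p):
    if p in (0, 1):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

p = 0.01
p_eq = 2 * p * (1 - p)     # an error in exactly one of the two hops
print(p_eq)                # 0.0198
print(1 - h(p))            # single BSC:    ~0.919 bits/transmission
print(1 - h(p_eq))         # cascaded BSCs: ~0.860 bits/transmission
```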
M-ary baseband Transmission
Binary data is usually transmitted using a binary code extension. Instead of sending one binary bit (0 or 1) per transmission, M-ary transmission uses M different symbols, each with duration log2(M)·Tb, to transmit log2(M) binary bits per symbol.
For example, 4-ary transmission uses the second extension of the binary code. Four distinct symbols, with duration 2Tb each, such as four pulses with amplitudes +3A, +A, −A & −3A, may be used to represent 11, 10, 00 & 01 (Gray code) respectively.
Example: 4-ary transmission with symbol duration = 2Tb
11 → +3
10 → +1
00 → −1
01 → −3
Binary bit stream: 1 1 0 1 0 0 0 1 1 0 1 1 1 0
4-ary symbol stream: 3, −3, −1, −3, 1, 3, 1
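The mapping step is a one-liner to script. A minimal sketch using the Gray-code table from the example above:

```python
# Gray-coded 4-ary (pulse-amplitude) mapping from the example above.
GRAY4 = {"11": 3, "10": 1, "00": -1, "01": -3}

def to_4ary(bits):
    """Group the bit string into pairs and map each pair to a pulse level."""
    return [GRAY4[bits[i:i + 2]] for i in range(0, len(bits), 2)]

print(to_4ary("11010001101110"))   # [3, -3, -1, -3, 1, 3, 1]
```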
Similarly, 8-ary transmission uses the third extension of the binary code: eight distinct symbols, with duration of 3Tb each.
Example: 8-ary transmission with symbol duration = 3Tb
Code → Pulse amplitude (receiver decision thresholds lie midway between adjacent levels, at 6, 4, 2, 0, −2, −4, −6):
111 → +7
110 → +5
100 → +3
101 → +1
001 → −1
000 → −3
010 → −5
011 → −7
Binary bit stream: 1 1 0 1 0 0 0 1 1 0 1 1 1 0 1
8-ary symbol stream: 5, 3, −7, −7, 1
r-ary Symmetric Channel with Pe = p
The forward matrix has 1-p on the diagonal and p/(r-1) everywhere else:
1-p       p/(r-1)   p/(r-1)   ........   p/(r-1)
p/(r-1)   1-p       p/(r-1)   ........   p/(r-1)
..
p/(r-1)   p/(r-1)   p/(r-1)   ........   1-p
Example: 4-ary symmetric channel with Pe = 0.1:
0.9    0.1/3  0.1/3  0.1/3
0.1/3  0.9    0.1/3  0.1/3
0.1/3  0.1/3  0.9    0.1/3
0.1/3  0.1/3  0.1/3  0.9
The forward symmetric matrix is uniform, of size (r, r).
Therefore, W = (1-p) log(1/(1-p)) + (r-1)[(p/(r-1)) log((r-1)/p)] = h(p) + p log(r-1).
C = Max over {p(ai)} of [H(B) - h(p) - p log(r-1)]
  = log r - h(p) - p log(r-1) bits/transmission,
achieved at a uniform input probability distribution.
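A sketch evaluating this closed form, shown for the 4-ary example above and for the BSC as the r = 2 special case:

```python
import math

def h(p):
    if p in (0, 1):
        return 0.0
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

def rary_symmetric_capacity(r, p):
    """C = log2(r) - h(p) - p*log2(r-1) bits/transmission."""
    return math.log2(r) - h(p) - p * math.log2(r - 1)

print(rary_symmetric_capacity(2, 0.1))   # BSC: 1 - h(0.1) ~0.531
print(rary_symmetric_capacity(4, 0.1))   # ~1.372 bits/transmission
```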
Statistics of Continuous Random Variable
Joint and conditional probability density functions can also be defined for continuous random variables X & Y:
fX,Y(x, y) is the joint probability density function of X and Y, and
fX(x/y) is the conditional probability density function of X, given Y.
Differential Entropy and Mutual Information for Continuous Ensembles
Some of the previously discussed information theory concepts are now extended to continuous random variables and random vectors. The motivation is to pave the way for the description of another fundamental limit in information theory.
Consider a continuous random variable X with probability density function fX(x). By analogy with the discrete entropy H = Σi=1..q pi log(1/pi) = Σi=1..q pi Ii, the following definition is introduced:
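The differential entropy of X is
h(X) = ∫ fX(x) log2(1/fX(x)) dx,
with the integral taken over all x. Unlike the discrete entropy, h(X) can be negative; it is a relative rather than an absolute measure of uncertainty.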
Example 9.8 Gaussian Distribution
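The result of this example is the standard one: a Gaussian random variable X with variance σ² has differential entropy h(X) = ½ log2(2πeσ²) bits, and among all random variables with the same variance, the Gaussian has the largest differential entropy. This is the property invoked below when maximizing h(Yk).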
Mutual Information
The mutual information between two continuous random variables X and Y is defined as follows:
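I(X;Y) = ∫∫ fX,Y(x, y) log2[fX(x/y)/fX(x)] dx dy = h(X) − h(X/Y)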
where f X,Y(x, y) is the joint probability density function of X and Y, and
fx(x/y) is the conditional probability density function of X, given that Y = y.
h(X/Y) is the conditional differential entropy of X, given Y.
Information Capacity Theorem
The information capacity theorem for band-limited, power-limited
Gaussian channels is to be considered. Consider a zero-mean stationary
process X(t) that is band-limited to B hertz. Let Xk, k = 1 , 2 , . . . , K,
denote the continuous random variables obtained by uniform sampling of
the process X(t) at the Nyquist rate of 2B samples per second. These
samples are transmitted in T seconds over a noisy channel, also band-limited to B hertz. Hence, the number of samples, K = 2BT.
Xk is a sample of the transmitted signal. The channel output is disturbed
by additive white Gaussian noise (AWGN) of zero mean and power
spectral density No/2. The noise is band-limited to B hertz. Let the
continuous random variables Yk, k = 1,2, . . . , K denote samples of the
received signal, as shown by
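Yk = Xk + Nk, k = 1, 2, . . . , K (9.84)
where Nk are the corresponding samples of the noise.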
The information capacity of a power-limited Gaussian channel is defined
as the maximum of the mutual information between the channel input Xk
and the channel output Yk over all distributions on the input Xk that
satisfy the power constraint of Equation (9.86), where P is the average transmitted power.
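That is (denoting the per-sample capacity by C*, as used below):
E[Xk²] = P, k = 1, 2, . . . , K (9.86)
C* = max I(Xk; Yk), the maximum taken over all input densities fXk(x) satisfying (9.86) (9.87)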
Since Xk and Nk are independent random variables, and their sum equals Yk, as in
equation (9.84), the conditional differential entropy of Yk, given Xk, is equal to the
differential entropy of Nk: h(Yk/Xk) = h(Nk).
Since h(Nk) is independent of the distribution of Xk ,
maximizing I(Xk; Yk) in accordance with Equation (9.87) requires
maximizing h(Yk), the differential entropy of sample Yk of the received
signal. For h(Yk) to be maximum, Yk has to be a Gaussian random
variable (see Example 9.8). That is, the samples of the received signal represent a noise-like process. Next, note that since Nk is Gaussian by assumption, the sample Xk of the transmitted signal must be Gaussian too. Therefore, the maximization specified in Equation (9.87) is achieved by choosing the samples of the transmitted signal from a noise-like process of average power P.
Correspondingly, Equation (9.87) may be reformulated as
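C* = ½ log2(1 + P/σ²) bits per transmission,
where σ² = NoB is the noise variance.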
With 2B transmissions per second, the channel information capacity becomes
C = 2B·C* bits/sec
Information Capacity Theorem
Shannon's third Theorem
The information capacity of a continuous channel of bandwidth B hertz,
disturbed by additive white Gaussian noise of power spectral density No/2
and limited in bandwidth to B, is given by
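C = B log2(1 + P/(NoB)) bits per second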
where P is the average transmitted power and the noise variance σ² = NoB.
It highlights most intensely the interplay among three key system
parameters: channel bandwidth B, average received signal power P, and
noise power spectral density at the channel output No /2. The dependence
of information capacity C on channel bandwidth B is linear, whereas its
dependence on signal-to-noise ratio P/(NoB) is logarithmic. Accordingly,
it is easier to increase the information capacity of a communication channel by expanding its bandwidth than by increasing the transmitted power for a prescribed noise variance.
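A quick numerical sketch of the formula (the bandwidth, power, and noise values below are hypothetical, chosen only for illustration):

```python
import math

def shannon_capacity(B, P, No):
    """C = B log2(1 + P/(No B)) bits/s for a band-limited AWGN channel."""
    return B * math.log2(1 + P / (No * B))

B, P, No = 3000.0, 1e-6, 1e-11    # ~15 dB SNR over 3 kHz
print(shannon_capacity(B, P, No))       # ~15300 bits/s
# Doubling the bandwidth does not double the capacity, because the
# same noise density No then admits twice the noise power No*B:
print(shannon_capacity(2 * B, P, No))   # ~24900 bits/s
```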
Channel Coding Theorem
The theorem implies that, for a given average transmitted power P and channel bandwidth B:
β€’ Information can be transmitted at the rate of C bits per second, as
defined in the previous Equation, with arbitrarily small probability of
error by employing sufficiently complex encoding systems.
β€’ It is not possible to transmit at a rate higher than C bits per second by
any encoding system without a definite probability of error.
Hence, the channel capacity theorem defines the fundamental limit on
the rate of error-free transmission for a power limited, band-limited
Gaussian channel.
To approach this limit, however, the transmitted signal must have
statistical properties approximating those of white Gaussian noise.