VTU CBCS E&C 5th sem Information Theory and Coding (15EC54) Module-1 notes
INFORMATION THEORY AND CODING
5th SEM E&C
JAYANTHDWIJESH H P, M.Tech (DECS)
Assistant Professor, Dept. of E&C
B.G.S INSTITUTE OF TECHNOLOGY (B.G.S.I.T)
B.G Nagara, Nagamangala Tq, Mandya District - 571448
FORMULAS FOR REFERENCE
MODULE-1 (INFORMATION THEORY)
Amount of information or self-information.
$I_K = \log\left(\frac{1}{P_K}\right)$ or $I_K = \log_2\left(\frac{1}{P_K}\right)$ or $I(m_K) = \log\left(\frac{1}{P_K}\right)$
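As a quick numeric check: a symbol emitted with probability $P_K = 1/4$ carries $I_K = \log_2 4 = 2$ bits; the less probable a symbol, the more information it carries.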
Entropy of source or average information content of the source.
$H = \sum_{i=1}^{M} P_i \log_2\left(\frac{1}{P_i}\right)$ bits/symbol or $H = \sum_{K=1}^{M} P_K \log_2\left(\frac{1}{P_K}\right)$ bits/symbol or
$H(S) = \sum_{i=1}^{q} P_i \log_2\left(\frac{1}{P_i}\right)$ bits/symbol or $H(S) = \sum_{i=1}^{q} P_i \log\left(\frac{1}{P_i}\right)$ bits/symbol or
$H(S) = \sum_{K=1}^{N} P_K \log_2\left(\frac{1}{P_K}\right)$ bits/symbol
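A minimal numeric check of the entropy formula (a Python sketch; the helper name entropy is ours, not from the notes):

from math import log2

def entropy(probs):
    # H = sum of p * log2(1/p) over all symbol probabilities, in bits/symbol
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))  # -> 1.5 bits/symbol

For the source with probabilities 1/2, 1/4, 1/4 this gives $H(S) = 0.5(1) + 0.25(2) + 0.25(2) = 1.5$ bits/symbol.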
Information rate or average information rate.
$R_S = r_s H(S)$ bits/sec or $R = r_s H$ bits/sec or $R = rH$ bits/sec
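For example, if the source above ($H(S) = 1.5$ bits/symbol) emits $r_s = 1000$ symbols/sec, the information rate is $R = 1000 \times 1.5 = 1500$ bits/sec.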
Bits.
$I_K = \log_2\left(\frac{1}{P_K}\right)$ bits
Hartleys or decits.
$I_K = \log_{10}\left(\frac{1}{P_K}\right)$ hartleys or decits
Nats or nepers.
$I_K = \log_e\left(\frac{1}{P_K}\right)$ nats or nepers
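The three units differ only in the base of the logarithm: 1 hartley (decit) = $\log_2 10 \approx 3.322$ bits and 1 nat (neper) = $\log_2 e \approx 1.443$ bits, by the change-of-base rule listed at the end of these formulas.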
Extremal or upper bound or maximum entropy.
$H(S)_{max} = \log_2 q$ bits/message-symbol or $H(S)_{max} = \log_2 N$ bits/message-symbol
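The bound is attained only when all symbols are equiprobable; e.g. a source with $q = 4$ equally likely symbols has $H(S)_{max} = \log_2 4 = 2$ bits/message-symbol.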
Source efficiency
$\eta_S = \frac{H(S)}{H(S)_{max}}$ or $\eta_S = \frac{H(S)}{H(S)_{max}} \times 100\%$
Source redundancy
$R_{\eta_S} = 1 - \eta_S = \left(1 - \frac{H(S)}{H(S)_{max}}\right) \times 100\%$
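Continuing the running example: with $H(S) = 1.5$ bits/symbol and $q = 3$, $H(S)_{max} = \log_2 3 \approx 1.585$, so $\eta_S \approx 94.6\%$ and the redundancy $R_{\eta_S} \approx 5.4\%$.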
The average information content of the symbols emitted from the i-th state.
$H_i = \sum_{j=1}^{n} P_{ij} \log_2\left(\frac{1}{P_{ij}}\right)$ bits/symbol
The average information content of the symbols emitted from the k-th state.
$H_K = \sum_{l=1}^{M} P_{lK} \log_2\left(\frac{1}{P_{lK}}\right)$ bits/symbol
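Here $P_{ij}$ (or $P_{lK}$) are the transition probabilities of a Markov source, so each state has its own entropy. For instance, a state that emits its two symbols with probabilities 0.8 and 0.2 has $H_i = 0.8\log_2(1/0.8) + 0.2\log_2(1/0.2) \approx 0.722$ bits/symbol.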
The average information content per symbol in a message of length N.
$G_N = \frac{1}{N} \sum_{i} P(m_i) \log\left(\frac{1}{P(m_i)}\right)$ or $G_N = -\frac{1}{N} \sum_{i} P(m_i) \log P(m_i) = \frac{1}{N} H(S^N)$
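$G_N$ is computed over all messages $m_i$ of length $N$; it is non-increasing in $N$ and approaches the source entropy $H(S)$ as $N \to \infty$.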
The entropy of the second-order symbols.
$G_N = \frac{1}{N} H(S^N)$ with N = 2, i.e. $G_2 = \frac{1}{2} H(S^2)$
The entropy of the third-order symbols.
$G_N = \frac{1}{N} H(S^N)$ with N = 3, i.e. $G_3 = \frac{1}{3} H(S^3)$
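A short Python sketch of these extension entropies, assuming a memoryless source (so block probabilities multiply; the helper names are ours):

from itertools import product
from math import log2, prod

def entropy(probs):
    # H = sum of p * log2(1/p), in bits
    return sum(p * log2(1 / p) for p in probs if p > 0)

P = [0.5, 0.25, 0.25]  # symbol probabilities of the source S
for N in (2, 3):
    # The N-th extension S^N emits all length-N blocks; for a memoryless
    # source a block's probability is the product of its symbol probabilities.
    block_probs = [prod(c) for c in product(P, repeat=N)]
    print(N, entropy(block_probs) / N)  # G_2 = G_3 = 1.5 bits/symbol

For a memoryless source $G_N = H(S)$ for every $N$, which the printout confirms.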
Log properties
1. $\log_a b = \frac{1}{\log_b a}$
2. $\frac{\log_x b}{\log_x a} = \log_a b$
3. $\log_e 10 = \ln(10)$
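Property 2 is the change-of-base rule used throughout these problems to evaluate base-2 logarithms on a calculator: $\log_2 x = \frac{\log_{10} x}{\log_{10} 2}$; e.g. $\log_2 5 = 0.6990/0.3010 \approx 2.322$.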