Conditional probability and mutual information
Brief summary about conditional probability
Aisha Alhammadi · June 2016
DOI: 10.13140/RG.2.1.4398.8087
Conditional probability:
"It is defined as the probability of one event being influenced by the outcome of another event. In other words, the probability of event B thus depends on whether event A occurs" [2].
"The probability of an event is conditional on another event" [3].
It is denoted as
$$P(B \mid A) = \frac{P(B \cap A)}{P(A)}, \qquad \text{where } P(A) > 0$$
Because it depends on the intersection of the two events, it indicates how strongly the occurrence of the first event depends on the occurrence of the second event.
"The probability of event A when it is known that event B has occurred. It is read as the probability of A given B" [2].
"The relative frequency of event A among the trials that produce an outcome in event B".
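As a concrete illustration, here is a minimal Python sketch of the definition; the die and the events A and B are hypothetical choices, not from the source:

```python
from fractions import Fraction

# Fair six-sided die; events chosen purely for illustration:
# A = "the roll is even", B = "the roll is greater than 3".
outcomes = set(range(1, 7))
A = {o for o in outcomes if o % 2 == 0}   # {2, 4, 6}
B = {o for o in outcomes if o > 3}        # {4, 5, 6}

P = lambda event: Fraction(len(event), len(outcomes))  # uniform measure

# P(B | A) = P(B ∩ A) / P(A)
print(P(B & A) / P(A))   # 2/3: two of the three even faces exceed 3
```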
Bayes' rule: one conditional probability is expressed in terms of the reversed conditional probability.
$$P(A \mid B) = \frac{P(A)\,P(B \mid A)}{P(B)}$$
in which $P(A \cap B) = P(B \mid A)\,P(A)$.
"Bayes' theorem isolates and finds the relative likelihood of each possible cause of an event of interest" [2].
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B \mid A)\,P(A)}{P(B)}$$
From the total probability rule:
$$P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{P(B)} = \frac{P(B \mid A_i)\,P(A_i)}{\sum_i P(B \cap A_i)} = \frac{P(B \mid A_i)\,P(A_i)}{\sum_i P(B \mid A_i)\,P(A_i)}$$
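A quick numeric sketch of Bayes' rule with the total probability rule in the denominator; the two-cause scenario and all numbers are made up for illustration:

```python
# Hypothetical two-cause example: A1 and A2 partition the sample space,
# B is an observed event (e.g., a positive test result).
p_A = [0.01, 0.99]             # priors P(A1), P(A2)
p_B_given_A = [0.95, 0.05]     # likelihoods P(B | A1), P(B | A2)

# Total probability rule: P(B) = sum_i P(B | Ai) P(Ai)
p_B = sum(pb * pa for pb, pa in zip(p_B_given_A, p_A))

# Bayes' rule: P(A1 | B) = P(B | A1) P(A1) / P(B)
print(p_B_given_A[0] * p_A[0] / p_B)   # ~0.161
```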
• For independent events:
$$P(B \mid A) = \frac{P(B \cap A)}{P(A)} = \frac{P(B)\,P(A)}{P(A)} = P(B)$$
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{P(A)\,P(B)}{P(B)} = P(A)$$
The probability of B does not depend on whether event A occurs or not.
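This can be checked in Python; the events below (chosen to be independent on a fair die) are illustrative:

```python
from fractions import Fraction

# On a fair die, A = "even roll" and C = "roll at most 4" are independent.
outcomes = set(range(1, 7))
A, C = {2, 4, 6}, {1, 2, 3, 4}
P = lambda e: Fraction(len(e), len(outcomes))

assert P(C & A) == P(C) * P(A)    # P(C ∩ A) = P(C) P(A)
assert P(C & A) / P(A) == P(C)    # hence P(C | A) = P(C)
```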
• For mutually exclusive events:
$$P(B \mid A) = \frac{P(B \cap A)}{P(A)} = \frac{0}{P(A)} = 0$$
$$P(A \mid B) = \frac{P(A \cap B)}{P(B)} = \frac{0}{P(B)} = 0$$
- The probability of B intersect A is zero.
- The probability of B is not determined by knowing the probability of A.
- The probability of event B, given that the outcome will be in event A, is 0 (see the sketch below).
- Event B and event A can't occur at the same time.
- The probability is zero → impossible event.
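The same die makes the point concrete (a tiny sketch with illustrative events):

```python
from fractions import Fraction

# A = "even roll" and B = "odd roll" are mutually exclusive: B ∩ A is empty.
A, B = {2, 4, 6}, {1, 3, 5}
P = lambda e: Fraction(len(e), 6)

print(P(B & A))          # 0: the intersection is empty
print(P(B & A) / P(A))   # P(B | A) = 0 / P(A) = 0
```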
• For event B a subset of A (A contains B):
$$P(B \mid A) = \frac{P(B \cap A)}{P(A)} = \frac{P(B)}{P(A)}$$
or for A a subset of B (B contains A):
$$P(B \mid A) = \frac{P(B \cap A)}{P(A)} = \frac{P(A)}{P(A)} = 1$$
- The probability of B, given the knowledge that the outcome will be in event A, is 1.
- B contains A.
- A is contained in B.
- A ⊂ B.
[Venn diagrams: B contained in A, and A contained in B]
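The two containment cases, checked numerically (the die and the events are illustrative):

```python
from fractions import Fraction

# B = {6} is a subset of A = {4, 5, 6} on a fair die.
A, B = {4, 5, 6}, {6}
P = lambda e: Fraction(len(e), 6)

print(P(B & A) / P(A))   # B ⊂ A: P(B | A) = P(B)/P(A) = 1/3
print(P(A & B) / P(B))   # conditioning on the subset: P(A | B) = P(B)/P(B) = 1
```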
Mutual information:
It is defined as the information content provided by the occurrence of the event $Y = y_j$ about the event $X = x_i$.
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{P(x_i, y_j)}{P(x_i)\,P(y_j)}$$
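A sketch of this definition on a made-up joint distribution for two binary variables (base-2 logarithms, so information is in bits):

```python
from math import log2

# Hypothetical joint distribution of binary X and Y.
P_xy = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}
P_x = {x: sum(p for (xi, _), p in P_xy.items() if xi == x) for x in (0, 1)}
P_y = {y: sum(p for (_, yi), p in P_xy.items() if yi == y) for y in (0, 1)}

# I(x; y) = log2[ P(x, y) / (P(x) P(y)) ] = log2[ P(x | y) / P(x) ]
def mutual_info(x, y):
    return log2(P_xy[(x, y)] / (P_x[x] * P_y[y]))

print(mutual_info(1, 1))   # ≈ 0.415 bits: observing Y=1 makes X=1 more likely
```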
• For independent events:
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{P(x_i)\,P(y_j)}{P(x_i)\,P(y_j)} = \log 1 = 0$$
• For mutually exclusive events (probability is zero → impossible event → maximum information):
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{0}{P(x_i)} = \log 0 = -\infty$$
• For subset events:
For x contains y ($y_j \subset x_i$), then
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{1}{P(x_i)} = -\log P(x_i) = I(x_i)$$
For y contains x ($x_i \subset y_j$), then
$$I(x_i; y_j) = \log \frac{P(y_j \mid x_i)}{P(y_j)} = \log \frac{1}{P(y_j)} = -\log P(y_j) = I(y_j)$$
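A numeric check of the subset case (the events on a fair die are chosen for illustration):

```python
from math import log2

# x = "roll > 3" contains y = "roll = 6", so P(x | y) = 1.
P_x = 3 / 6                      # P(roll > 3)
P_x_given_y = 1.0                # y = {6} forces x to occur

I_xy = log2(P_x_given_y / P_x)   # log2(1 / P(x))
print(I_xy, -log2(P_x))          # both 1.0: I(x; y) = I(x) = 1 bit
```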
Remark from J. Proakis's textbook:
"When the occurrence of the event $Y = y_j$ uniquely determines the occurrence of the event $X = x_i$, the conditional probability in the numerator is unity as given below" [4].
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{1}{P(x_i)} = I(x_i)$$
That the event $Y = y_j$ uniquely determines the occurrence of the event $X = x_i$ means that event $X$ contains $Y$ ($Y$ is a subset of $X$); thus the conditional probability in the numerator is unity. This is called the self-information of the event $X = x_i$.
• Properties of mutual information [1]:
1. Symmetry:
$$I(x_i; y_j) = I(y_j; x_i)$$
Proof:
$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)} = \log \frac{P(x_i, y_j)}{P(x_i)\,P(y_j)} = \log \frac{P(y_j \mid x_i)}{P(y_j)} = I(y_j; x_i)$$
2. $I(x_i; y_j) > 0$ when $P(x_i \mid y_j) > P(x_i)$
3. $I(x_i; y_j) < 0$ when $P(x_i \mid y_j) < P(x_i)$
4. $I(x_i; y_j) = 0$ when $P(x_i \mid y_j) = P(x_i)$ → x and y are independent.
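The symmetry property can be verified numerically on the illustrative joint distribution used earlier:

```python
from math import log2

P_xy = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}
P_x = {0: 0.50, 1: 0.50}   # marginals of the illustrative distribution
P_y = {0: 0.40, 1: 0.60}

for (x, y), p in P_xy.items():
    lhs = log2((p / P_y[y]) / P_x[x])   # log2 P(x | y) / P(x)
    rhs = log2((p / P_x[x]) / P_y[y])   # log2 P(y | x) / P(y)
    assert abs(lhs - rhs) < 1e-12       # I(x; y) = I(y; x)
```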
• Properties of self-information [1]:
1. $I(x_i) \ge 0$ for $0 \le P(x_i) \le 1$
2. $I(x_i) > I(y_j)$ when $P(x_i) < P(y_j)$ → this means that x is a subset of y.
3. $I(x_i) = 0$ when $P(x_i) = 1$ → no information.
4. $I(x_i) = \infty$ when $P(x_i) = 0$ → maximum information.
5. $I(x_i) = \log \dfrac{1}{P(x_i)} = -\log P(x_i)$.
6. $I(x_i, y_j) = I(x_i) + I(y_j)$ for statistically independent $x$ and $y$.
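A short sketch of these behaviours of $I(x) = \log 1/P(x)$; the probabilities are arbitrary powers of two so the float arithmetic is exact:

```python
from math import log2

I = lambda p: log2(1 / p)   # self-information in bits

print(I(1.0))             # 0.0: a certain event carries no information
print(I(0.5), I(0.25))    # 1.0 2.0: the rarer event carries more information
# Additivity for independent events: P(x, y) = P(x) P(y)
assert I(0.5 * 0.25) == I(0.5) + I(0.25)
```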
Joint entropy:
It is defined as a measure of how much uncertainty there is in a pair of discrete random variables.
$$H(X, Y) = -\sum_x \sum_y P(x, y) \log P(x, y) = \sum_x \sum_y P(x, y) \log \frac{1}{P(x, y)}$$
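The joint entropy of the illustrative distribution from above, in bits:

```python
from math import log2

P_xy = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}

# H(X, Y) = -sum over (x, y) of P(x, y) log2 P(x, y)
H_XY = -sum(p * log2(p) for p in P_xy.values())
print(round(H_XY, 4))   # ≈ 1.8464 bits
```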
Average Mutual Information:
"The mutual information between random variables X and Y is the average amount of information provided about X by observing Y, which is also the average amount of uncertainty resolved about X by observing Y" [1].
$$I(X; Y) = \sum_x \sum_y P(x, y)\, I(x; y) = \sum_x \sum_y P(x, y) \log \frac{P(x \mid y)}{P(x)}$$
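And the corresponding average mutual information for that same illustrative distribution:

```python
from math import log2

P_xy = {(0, 0): 0.30, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.40}
P_x = {0: 0.50, 1: 0.50}
P_y = {0: 0.40, 1: 0.60}

# I(X; Y) = sum over (x, y) of P(x, y) log2[ P(x | y) / P(x) ]
I_XY = sum(p * log2((p / P_y[y]) / P_x[x]) for (x, y), p in P_xy.items())
print(round(I_XY, 4))   # ≈ 0.1245 bits of uncertainty about X resolved by Y
```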
References:
1. Entropy and mutual information [Online]. Available: http://ee.tamu.edu/~georghiades/courses/ftp647/Chapter2.pdf
2. B. P. Lathi and Z. Ding, Modern Digital and Analog Communication Systems, 4th ed., Oxford University Press, 2010.
3. R. Johnson, e-Study Guide for: Miller and Freund's Probability and Statistics for Engineers, ISBN 9780131437456.
4. J. G. Proakis, Digital Communications, 4th ed., Prentice Hall, 2001, ISBN 0-13-061793-8.