b. Consider a binary symmetric communication channel whose input source is the alphabet X = {0, 1} with probabilities {0.39, 0.61}. The channel matrix is given below

$$\begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}$$

where $p$ is the probability of transmission error. Find:

i. the marginal entropies H(X) and H(Y)
ii. the joint entropy H(X, Y)
iii. the conditional entropies H(X|Y) and H(Y|X)
iv. the mutual information I(X; Y)
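As a rough numerical check of parts (i)-(iv), the quantities can be computed by forming the joint distribution P(X, Y) from the source probabilities and the channel matrix. The sketch below assumes an illustrative error probability p = 0.1, since the numeric value is not given in the extracted text; substitute the value from the original problem.

```python
import numpy as np

# Input (source) distribution over X = {0, 1}, from the problem statement.
p_x = np.array([0.39, 0.61])

# Error probability of the binary symmetric channel.
# NOTE: 0.1 is an assumed placeholder; the actual value is not given above.
p = 0.1

# Channel (transition) matrix P(Y|X) for a BSC with error probability p.
P_y_given_x = np.array([[1 - p, p],
                        [p, 1 - p]])

# Joint distribution P(X, Y) = P(X) * P(Y|X), and output marginal P(Y).
p_xy = p_x[:, None] * P_y_given_x
p_y = p_xy.sum(axis=0)

def H(dist):
    """Shannon entropy (in bits) of a probability vector or matrix."""
    dist = dist[dist > 0]
    return -np.sum(dist * np.log2(dist))

H_X = H(p_x)                 # marginal entropy H(X)
H_Y = H(p_y)                 # marginal entropy H(Y)
H_XY = H(p_xy)               # joint entropy H(X, Y)
H_Y_given_X = H_XY - H_X     # conditional entropy H(Y|X)
H_X_given_Y = H_XY - H_Y     # conditional entropy H(X|Y)
I_XY = H_X + H_Y - H_XY      # mutual information I(X; Y)

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(Y)   = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits")
print(f"H(Y|X) = {H_Y_given_X:.4f} bits")
print(f"H(X|Y) = {H_X_given_Y:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")
```

The identities used are H(Y|X) = H(X, Y) - H(X), H(X|Y) = H(X, Y) - H(Y), and I(X; Y) = H(X) + H(Y) - H(X, Y), which follow directly from the definitions of joint and conditional entropy.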