Questions:

- Is the ability of the human ear better or worse than the sonic capabilities of a CD-quality audio system? What can a digital engineer take advantage of because of this fact?
- Name the four classes of coding methods.
- Identify whether each compression method is lossy or lossless: Morse Code _____ Run-Length Encoding _____ JPEG _____
- (Q10) Say the relative frequencies of four symbols in an alphabet are 1/2, 1/4, 1/8, and 1/8. What is the entropy (numerical value) of the alphabet?
- Suppose we want to compress two sound recordings, A and B, that are in .WAV format and equally long. If the entropy of A is twice that of B, which recording has a greater amount of randomness? Which recording should we expect to be shorter in compressed form?

Solution

Since short answers have already been provided for the other questions, I am answering only Q10.

10) The relative frequencies of the four symbols in the alphabet are 1/2, 1/4, 1/8, and 1/8, so:

P(1) = 1/2, P(2) = 1/4, P(3) = 1/8, P(4) = 1/8

The entropy of the alphabet, using base-2 logarithms so the result is in bits per symbol, is:

H(P) = -[ P(1) log2 P(1) + P(2) log2 P(2) + P(3) log2 P(3) + P(4) log2 P(4) ]
     = -[ (1/2)(-1) + (1/4)(-2) + (1/8)(-3) + (1/8)(-3) ]
     = 0.5 + 0.5 + 0.375 + 0.375
     = 1.75 bits per symbol

(Computed with base-10 logarithms instead, the same sum gives 0.526802492; dividing by log10(2) ≈ 0.30103 converts it to the same 1.75 bits.)
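The entropy calculation above can be sanity-checked with a short script. This is a minimal sketch; the helper name `entropy` is my own choice, not from the original problem:

```python
import math

def entropy(probs, base=2.0):
    # Shannon entropy: H = -sum(p * log_base(p)),
    # skipping zero-probability symbols (0 * log 0 is taken as 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

probs = [1/2, 1/4, 1/8, 1/8]

print(entropy(probs))           # ~1.75 bits per symbol (base 2)
print(entropy(probs, base=10))  # ~0.5268, the same quantity in base-10 units
```

Dividing the base-10 value by log10(2) recovers the base-2 value, which is why both forms of the answer agree.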