# Mathematical theory of information


### Transcript

• 1. Annenberg School for Communication, Departmental Papers (ASC), University of Pennsylvania, 2009. Klaus Krippendorff, University of Pennsylvania (kkrippendorff@asc.upenn.edu), *Mathematical Theory of Communication*. This paper is posted at ScholarlyCommons: http://repository.upenn.edu/asc_papers/169
• 3. **Entropies and Communication.** The additivity of H gives rise to a calculus of communication. For a sender S and a receiver R, one can measure three basic entropies:

  1. The uncertainty of messages $s$ at sender $S$, occurring with probability $p_s$:

     $$H(S) = -\sum_{s \in S} p_s \log_2 p_s$$

  2. The uncertainty of messages $r$ at receiver $R$, occurring with probability $p_r$:

     $$H(R) = -\sum_{r \in R} p_r \log_2 p_r$$

  3. The total uncertainty of $s$-$r$ pairs of messages in an $S \times R$ table, occurring with probability $p_{sr}$:

     $$H(SR) = -\sum_{s \in S} \sum_{r \in R} p_{sr} \log_2 p_{sr}$$

  These lead to the sender-unrelated uncertainty entering a communication channel, colloquially called noise:

  $$H_S(R) = H(SR) - H(S),$$

  the uncertainty in the sender lost due to simplifications or omissions during communication, called equivocation:

  $$H_R(S) = H(SR) - H(R),$$

  and the amount of information transmitted between sender and receiver, which can be obtained in at least four ways:

  $$\begin{aligned}
  T(S{:}R) &= H(R) - H_S(R), && \text{the uncertainty at receiver } R \text{ minus noise} \\
           &= H(S) - H_R(S), && \text{the uncertainty at sender } S \text{ minus equivocation} \\
           &= H(S) + H(R) - H(SR), && \text{the sum of the uncertainties at } S \text{ and } R \text{ minus the total} \\
           &= H(SR) - H_S(R) - H_R(S), && \text{the total uncertainty minus noise and equivocation}
  \end{aligned}$$

  The algebraic relationships between these quantities are visualized in the center of Figure 1. Accordingly:

  - Communication is the extent to which a sender is able to limit the receiver's choices, and
  - Information is the extent to which a receiver knows the sender's choices.

  Both are interpretations of the same quantity, T(S:R), the amount of information transmitted. Both express differences between two uncertainties, with and without knowing the choices made at the other point of a communication channel. They increase with the number of choices, or the improbability of the alternatives, reduced in the course of transmission. T(S:R) quantifies a symmetrical relationship, not the property of a message.
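The entropy calculus above can be checked numerically. The sketch below is an illustration, not part of the paper: the joint distribution `p_sr` is made up for the example. It computes the three basic entropies from a small S×R table, derives noise and equivocation, and verifies that all four expressions for T(S:R) agree.

```python
import math

# Hypothetical joint distribution p(s, r) over sender messages s and
# receiver messages r (illustrative values; they sum to 1).
p_sr = {
    ("a", "x"): 0.3,
    ("a", "y"): 0.1,
    ("b", "x"): 0.2,
    ("b", "y"): 0.4,
}

def entropy(probs):
    """H = -sum p * log2(p) over the nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginals p_s and p_r from the joint S x R table.
senders = {s for s, _ in p_sr}
receivers = {r for _, r in p_sr}
p_s = {s: sum(p for (s2, _), p in p_sr.items() if s2 == s) for s in senders}
p_r = {r: sum(p for (_, r2), p in p_sr.items() if r2 == r) for r in receivers}

H_S = entropy(p_s.values())    # uncertainty at the sender, H(S)
H_R = entropy(p_r.values())    # uncertainty at the receiver, H(R)
H_SR = entropy(p_sr.values())  # total uncertainty of s-r pairs, H(SR)

noise = H_SR - H_S             # H_S(R): sender-unrelated uncertainty
equivocation = H_SR - H_R      # H_R(S): uncertainty lost in transmission

# The four equivalent expressions for the transmitted information T(S:R).
T1 = H_R - noise
T2 = H_S - equivocation
T3 = H_S + H_R - H_SR
T4 = H_SR - noise - equivocation

assert abs(T1 - T2) < 1e-12
assert abs(T2 - T3) < 1e-12
assert abs(T3 - T4) < 1e-12
print(f"T(S:R) = {T1:.4f} bits")
```

Because noise and equivocation are themselves defined as differences of H(S), H(R), and H(SR), the four expressions are algebraically identical; the assertions only confirm the bookkeeping.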