Claude Shannon quantified information by the surprise or uncertainty a recipient experiences when receiving a message. Information is inversely related to probability: the less probable a message is, the more surprising it is, and the more information it conveys.
1. In 1948 Claude Shannon quantified information as:
Solution
Description: A message carries information, but how much depends on probability. If the recipient
already expects a message, there is no surprise, and the message carries little information. If the
message is unexpected, the recipient is surprised, which means the recipient had assigned it a low
probability. Surprise and probability are therefore inversely related: more surprise indicates lower
probability.
Beyond information: This goes beyond a verbal definition of information; it is a mathematical
treatment of information in terms of probability.
Quantify: Information is quantified by the recipient's surprise: the more surprising a message, the
more information it conveys.
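The inverse relation between surprise and probability is captured by Shannon's self-information, I(p) = -log2(p), measured in bits. A minimal Python sketch of this idea (the function name here is illustrative, not standard):

```python
import math

def self_information(p):
    """Shannon self-information in bits: I(p) = -log2(p).
    Less probable (more surprising) events carry more information."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A certain event (p = 1) carries no information; rarer events carry more.
for p in (1.0, 0.5, 0.25, 0.01):
    print(f"p = {p:<4} -> {self_information(p):.2f} bits")
```

Running this shows, for example, that an event with probability 1 yields 0 bits (no surprise at all), while halving the probability of an event adds one bit of information each time.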