Information theory originated in Claude Shannon's pioneering work of 1948. It examines how information is encoded, transmitted, received, and decoded in communication systems: a source encodes a message, a channel carries it, and a receiver decodes it at the destination. Information is defined as that which reduces uncertainty; the less predictable a message, the more information it conveys. The average amount of information, or uncertainty, in a source is called entropy. Redundancy increases predictability and helps ensure messages are transmitted accurately despite noise. Gestalt principles of perception, such as continuity, similarity, and closure, are also relevant to information theory, since they describe how perceivers fill in the predictable, redundant parts of a message. Later, theorists such as Abraham Moles and Max Bense applied information theory concepts to aesthetics and the communication of art.
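The relationship between entropy and redundancy can be made concrete with a small sketch. This is an illustrative implementation, not from the source: it estimates Shannon entropy from per-symbol frequencies and defines redundancy relative to the maximum entropy of the observed alphabet (the function names `entropy` and `redundancy` are my own).

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, estimated from symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

def redundancy(message: str) -> float:
    """Relative redundancy: 1 - H / H_max, where H_max = log2(alphabet size)."""
    h_max = log2(len(set(message)))
    return 1 - entropy(message) / h_max if h_max > 0 else 1.0

# Four equally likely symbols: maximal unpredictability, zero redundancy.
print(entropy("abcd"))       # 2.0 bits per symbol
print(redundancy("abcd"))    # 0.0

# A repetitive message is predictable: lower entropy, positive redundancy.
print(redundancy("aaab"))    # about 0.19
```

The repetitive message illustrates the point above: repetition makes the message more predictable, so each symbol carries less information, and the surplus predictability is exactly what helps a receiver correct transmission errors.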