The ‘glue’ that binds life and the universe into a coherent matrix - there is far more to bytes than just bits, communication, storage and perception!
Big Data, small data, information, storage and transmission: the terms immediately conjure a picture of potential confusion. But Information Theory is here to help us, despite upsetting the ‘purists’ of other disciplines; it ‘steals’ the ideas and concepts of fundamental physics and applies them in new and novel ways that some would consider ‘fuzzy and sloppy’.
“What passes as information theory today is not communication at all, but merely transportation. ... Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.”
Nonetheless, as a practical theory it played a key role in empowering the telecoms, coding and computing revolutions by defining the limits of what is possible - what can and cannot be done. Without this theory we would be engaged in blind engineering: trial and error, rules of thumb, and guesswork!
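The most celebrated of those limits is the Shannon–Hartley channel capacity, C = B·log₂(1 + S/N), which bounds the error-free bit rate of any channel of bandwidth B with signal-to-noise ratio S/N. A minimal sketch (the function name and the telephone-channel figures are illustrative assumptions, not taken from this text):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second.

    snr_linear is the plain power ratio S/N, not decibels.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a ~3.1 kHz voice channel at 30 dB SNR (S/N = 1000)
# has a hard ceiling of roughly 30.9 kbit/s - no coding scheme can beat it.
print(round(shannon_capacity(3100, 1000)))
```

This single inequality is what separates principled link design from the ‘blind engineering’ the text warns against: it tells you in advance whether a target bit rate is achievable at all.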
The prime contention is the use of ‘Entropy’ as a practical measure of order and disorder outside the confines of thermodynamics. Thankfully, Shannon adopted the edict: just because something is not pure and perfect doesn’t mean we can’t exploit it!
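Shannon’s entropy is easy to compute in practice: for a source with symbol probabilities pᵢ, H = −Σ pᵢ log₂ pᵢ bits per symbol. A minimal sketch (the function name and sample strings are assumptions for illustration):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(message)
    n = len(message)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# Four equiprobable symbols carry the maximum 2 bits/symbol;
# a constant stream carries no information at all.
print(shannon_entropy("abcd"))  # 2.0
print(shannon_entropy("aaaa"))  # 0.0
```

The same number that thermodynamicists guard as a measure of disorder thus becomes, quite directly, a count of the bits needed to describe a message - which is exactly the ‘impure’ exploitation the text defends.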
This tutorial therefore details the thinking and justifies the principles so that students may utilise the many facets of the theory in the design and practice of information-system engineering. Specifically: digital transmission over copper, fibre and wireless; data storage in all media; image processing and display; signal coding; information encryption; and security.