This document provides an introduction to information channels. It defines an information channel in terms of an input alphabet, an output alphabet, and the conditional probabilities relating input and output symbols. It shows how to represent a channel in matrix form and how to calculate the associated probabilities. It also covers zero-memory channels, extensions of a channel to multiple inputs and outputs, entropy, and mutual information, and uses the binary symmetric channel as an example.
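
As an illustrative sketch (not taken from the document itself), the snippet below builds the 2x2 channel matrix of a binary symmetric channel with crossover probability p and computes the mutual information I(A;B) for a given input distribution. The function names and the particular values (p = 0.1, an equiprobable input) are hypothetical choices for illustration.

```python
# Illustrative sketch: a binary symmetric channel (BSC) as a 2x2 channel
# matrix P(b|a), and the mutual information I(A;B) for an input distribution.
# The names and parameter values here are hypothetical, not from the document.
import numpy as np

def bsc_matrix(p):
    """Channel matrix P(b|a) of a binary symmetric channel with crossover probability p."""
    return np.array([[1 - p, p],
                     [p, 1 - p]])

def mutual_information(p_a, channel):
    """I(A;B) in bits, given input distribution p_a and channel matrix P(b|a)."""
    p_ab = p_a[:, None] * channel          # joint distribution P(a, b)
    p_b = p_ab.sum(axis=0)                 # output distribution P(b)
    mask = p_ab > 0                        # avoid log(0) terms
    return np.sum(p_ab[mask] * np.log2(p_ab[mask] / (p_a[:, None] * p_b)[mask]))

p_a = np.array([0.5, 0.5])                 # equiprobable binary input
channel = bsc_matrix(0.1)                  # crossover probability p = 0.1
print(mutual_information(p_a, channel))    # about 0.531 bits, i.e. 1 - H(0.1)
```

For the binary symmetric channel with equiprobable inputs, this value agrees with the closed-form expression 1 - H(p), where H is the binary entropy function.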