The document describes discrete Markov chains. A discrete Markov chain is a stochastic process that takes values in a countable state space, where the probability of moving to the next state depends only on the current state, not on the sequence of states that preceded it; this is known as the Markov property. The document gives a mathematical definition of discrete Markov chains, derives some basic properties, and presents examples showing how a chain can be fully specified by its transition probabilities between states.
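
A chain specified by its transition probabilities can be sketched as follows. This is a minimal illustration, not taken from the document: the two-state "weather" chain, its state names, and the particular probabilities are all hypothetical, chosen only to show that sampling the next state uses the current state alone.

```python
import random

# Hypothetical two-state chain. P[i][j] = Pr(next = j | current = i);
# each row must sum to 1 (a row-stochastic transition matrix).
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state given ONLY the current state (Markov property)."""
    targets = list(P[state])
    weights = [P[state][t] for t in targets]
    return rng.choices(targets, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Generate a trajectory of n_steps transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain
```

Note that `step` never looks at earlier history: the entire process is determined by the start state and the transition matrix, which is exactly what "specified by transition probabilities" means.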