Measurement errors arise from noise introduced during the readout of qubits. This can be demonstrated with a theoretical two-qubit system prepared in each of the four computational basis states, with noise applied to the measurement outcomes. Mitigation works by applying the inverse of the noise matrix to the noisy measurement results, so that the noise-free results can be inferred through linear algebra. The goal is to construct such a calibration matrix and apply it to noisy results from IBM hardware to mitigate the effects of readout noise.
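The inversion step can be sketched with plain linear algebra. Below is a minimal illustration using a hypothetical 4x4 noise matrix for a two-qubit system (the matrix values are made up for the example, not taken from real hardware): each column gives the probabilities of observing the outcomes 00, 01, 10, 11 when the corresponding basis state was prepared. Applying the inverse of this matrix to a noisy probability vector recovers the underlying noise-free distribution.

```python
import numpy as np

# Hypothetical calibration (noise) matrix M for a two-qubit system.
# Column j holds the probabilities of observing each outcome
# 00, 01, 10, 11 when basis state j was prepared on the noisy device.
M = np.array([
    [0.90, 0.05, 0.04, 0.01],
    [0.05, 0.88, 0.02, 0.05],
    [0.04, 0.02, 0.89, 0.05],
    [0.01, 0.05, 0.05, 0.89],
])

# A true (noise-free) distribution we would like to recover,
# e.g. from measuring a Bell state: equal weight on 00 and 11.
p_ideal = np.array([0.5, 0.0, 0.0, 0.5])

# What the noisy device would report: the noise matrix mixes outcomes.
p_noisy = M @ p_ideal

# Mitigation: apply the inverse of the noise matrix to the noisy results.
# np.linalg.solve is preferred over explicitly forming M^{-1}.
p_mitigated = np.linalg.solve(M, p_noisy)

print(np.round(p_mitigated, 3))  # recovers [0.5, 0., 0., 0.5]
```

In practice the calibration matrix is estimated by preparing each basis state many times on the device and recording the observed outcome frequencies; the inversion step is the same.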