Conditional probability plays a vital role in Bayesian inference. Bayes' rule states that the posterior probability of an unknown parameter θ given observed data y is proportional to the likelihood of the data given the parameter times the prior probability of the parameter. This lets Bayesian inference represent uncertainty about statistical models through probability distributions over model parameters, integrating over all plausible parameter values rather than committing to a single point estimate. The prior encodes beliefs held before observing the data, while the posterior updates those beliefs in light of the data.
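In symbols, writing p(θ) for the prior, p(y | θ) for the likelihood, and p(y) = ∫ p(y | θ) p(θ) dθ for the marginal likelihood that serves as the normalizing constant:

    p(θ | y) = p(y | θ) p(θ) / p(y) ∝ p(y | θ) p(θ)

As a concrete illustration, here is a minimal sketch of this update for the conjugate Beta-Binomial pair, where the posterior has a closed form. The prior parameters and observed counts are hypothetical values chosen for the example, not taken from the text.

```python
# A minimal sketch of a Bayesian update via Bayes' rule, using the
# conjugate Beta-Binomial pair so the posterior is available in closed
# form (no numerical integration needed). All numbers are illustrative.

def beta_binomial_update(a: float, b: float, k: int, n: int) -> tuple[float, float]:
    """Return the Beta posterior parameters after observing k successes in n trials.

    Prior:      θ ~ Beta(a, b)
    Likelihood: k ~ Binomial(n, θ)
    Posterior:  θ | k ~ Beta(a + k, b + n - k)   (by Bayes' rule)
    """
    return a + k, b + (n - k)

# Hypothetical prior belief: θ centered at 0.5, weakly held (Beta(2, 2)).
a, b = 2.0, 2.0

# Hypothetical data: 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(a, b, k=7, n=10)

prior_mean = a / (a + b)                      # 0.5
posterior_mean = a_post / (a_post + b_post)   # 9/14 ≈ 0.643
print(f"prior mean = {prior_mean:.3f}, posterior mean = {posterior_mean:.3f}")
```

The posterior mean (≈ 0.643) lands between the prior mean (0.5) and the observed frequency (0.7), showing how the posterior blends prior beliefs with the data.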