Maximum likelihood estimation finds the parameters that make the observed data most probable given a statistical model. For example, it can estimate the mean and variance of penguin heights from a sample, assuming heights are normally distributed.
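As a minimal sketch of this idea, the following Python snippet fits a normal model to a made-up sample of penguin heights (the data values are illustrative assumptions, not real measurements). For the normal distribution the maximum likelihood estimates have closed forms: the sample mean, and the variance computed with a 1/n factor rather than the 1/(n-1) of the unbiased sample variance.

```python
import numpy as np

# Hypothetical sample of penguin heights in cm (illustrative values).
heights = np.array([48.2, 51.5, 49.8, 50.1, 47.6, 52.3, 50.9, 49.4])

# For a normal model, the MLEs have closed forms:
# the sample mean, and the variance with a 1/n factor.
mu_hat = heights.mean()
sigma2_hat = ((heights - mu_hat) ** 2).mean()  # MLE variance uses 1/n

print(f"MLE mean: {mu_hat:.2f} cm")
print(f"MLE variance: {sigma2_hat:.2f} cm^2")
```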
Bayes estimation treats the parameter as a random variable: Bayes' theorem yields a posterior distribution over the parameter given the data, and a cost function assigns a penalty to the error between the estimated and actual parameter values. The Bayes estimate is the one that minimizes the expected cost under the posterior. For a squared-error cost function, the Bayes estimate is the mean of the posterior distribution, which minimizes the mean squared error between the estimate and the actual parameter value.
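To make the posterior-mean result concrete, here is a sketch under a standard conjugate assumption: a normal likelihood with known variance and a normal prior on the mean. All numeric values (the sample, sigma2, mu0, tau0_2) are hypothetical. In this setting the posterior is also normal, so the Bayes estimate under squared-error cost is simply the posterior mean, a precision-weighted average of the prior mean and the sample mean.

```python
import numpy as np

# Hypothetical setup: data ~ Normal(theta, sigma^2) with sigma^2 known,
# and a Normal(mu0, tau0^2) prior on theta. The posterior over theta is
# then normal, and its mean is the Bayes estimate under squared error.
heights = np.array([48.2, 51.5, 49.8, 50.1, 47.6, 52.3, 50.9, 49.4])
sigma2 = 4.0               # assumed known observation variance
mu0, tau0_2 = 45.0, 25.0   # assumed prior mean and prior variance

n = len(heights)
# Conjugate update: posterior precision is the sum of prior precision
# and data precision; posterior mean is the precision-weighted average.
post_var = 1.0 / (1.0 / tau0_2 + n / sigma2)
post_mean = post_var * (mu0 / tau0_2 + heights.sum() / sigma2)

print(f"Bayes estimate (posterior mean): {post_mean:.2f} cm")
print(f"Posterior variance: {post_var:.4f}")
```

With more data the data precision term n / sigma2 dominates, so the posterior mean approaches the sample mean and the Bayes estimate converges toward the maximum likelihood estimate.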