Monte Carlo methods estimate sums and integrals by rewriting them as expectations under a probability distribution: samples are drawn from that distribution and the function is averaged over them. The resulting estimate is unbiased, and its variance shrinks in proportion to 1/n as the number of samples n grows. Importance sampling instead draws the samples from a proposal distribution that places more mass on the regions that matter, then reweights each sample by the ratio of the target density to the proposal density; a well-chosen proposal can substantially reduce the variance of the estimate. Markov chain Monte Carlo methods such as Gibbs sampling handle distributions that cannot be sampled from directly, such as those defined by undirected graphical models, by repeatedly resampling each variable conditioned on the current values of the others. Both ideas are sketched below.
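As a minimal sketch of the first two ideas, the toy problem below (my own choice, not from the source) estimates E[x^2] under a standard normal, first with plain Monte Carlo and then with importance sampling from a wider normal proposal, reweighting by the density ratio p(x)/q(x):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000


def normal_pdf(x, mean, std):
    """Density of a univariate normal, written out to keep the sketch self-contained."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))


# Plain Monte Carlo: sample from the target p(x) = N(0, 1) and average f(x) = x**2.
# The true value of E[x**2] is 1.
x = rng.standard_normal(n)
plain_estimate = np.mean(x ** 2)

# Importance sampling: sample from a proposal q(x) = N(0, 2**2) instead, and
# reweight each sample by p(x)/q(x) so the estimate remains unbiased.
x_q = rng.normal(loc=0.0, scale=2.0, size=n)
weights = normal_pdf(x_q, 0.0, 1.0) / normal_pdf(x_q, 0.0, 2.0)
is_estimate = np.mean(weights * x_q ** 2)

print(plain_estimate, is_estimate)  # both should be close to 1
```

Whether importance sampling actually lowers the variance depends on how well the proposal matches the shape of |f(x)|p(x); the wider proposal here is only meant to illustrate the reweighting mechanics.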
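Gibbs sampling can be sketched the same way. The usual setting is an undirected model whose conditionals are tractable (e.g. an RBM); to keep the example short, the target here is instead a bivariate Gaussian with correlation rho, whose conditionals are known in closed form. This is an illustrative stand-in, not the source's model:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                       # correlation of the target bivariate Gaussian
n_samples, burn_in = 50_000, 1_000

x1, x2 = 0.0, 0.0               # arbitrary initial state of the Markov chain
samples = np.empty((n_samples, 2))

for t in range(n_samples + burn_in):
    # Resample each variable conditioned on the current value of the other:
    # x1 | x2 ~ N(rho * x2, 1 - rho**2), and symmetrically for x2.
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))
    if t >= burn_in:            # discard early samples so the chain can mix
        samples[t - burn_in] = (x1, x2)

print(np.corrcoef(samples.T))   # off-diagonal entries should approach rho
```

The same loop structure carries over to graphical models: each variable is resampled from its conditional given its neighbors, and after the chain mixes the states are approximately draws from the joint distribution.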