A teaser for an upcoming new course in Bayesian Statistics.
Based on the great book and materials by Allen Downey (Olin College), "Think Bayes".
https://sites.google.com/site/simplebayes/
The book (and presentation) is under a Creative Commons Noncommercial license, so classroom presentations are totally fine.
27. Chapter 2: Computational Statistics
• Distribution: set of values and their corresponding probabilities.
• Probability mass function (PMF): a way to represent a distribution mathematically.
• When working with probabilities, you need to normalize them (they must add up to 1).
• This distribution, which contains the priors for each hypothesis, is called
(wait for it) the prior distribution.
• To update the distribution based on new data (a vanilla cookie!), we
multiply each prior by the corresponding likelihood.
• After this multiplication the distribution is no longer normalized, so you need to renormalize it.
• The result is a distribution that contains the posterior probability for
each hypothesis, which is called (wait again!) the posterior distribution.
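The update steps above can be sketched in a few lines of Python. This is a minimal illustration using plain dicts rather than the Pmf class that Think Bayes builds (the bowl names and numbers follow the book's cookie problem: Bowl 1 holds 30 vanilla and 10 chocolate cookies, Bowl 2 holds 20 of each):

```python
def normalize(dist):
    """Rescale a dict of probabilities so they add up to 1."""
    total = sum(dist.values())
    return {h: p / total for h, p in dist.items()}

# Prior distribution: the two bowls are equally likely.
prior = normalize({'Bowl 1': 1, 'Bowl 2': 1})

# Likelihood of drawing a vanilla cookie from each bowl.
likelihood = {'Bowl 1': 30 / 40, 'Bowl 2': 20 / 40}

# Update: multiply each prior by the corresponding likelihood...
unnormalized = {h: prior[h] * likelihood[h] for h in prior}

# ...then renormalize to get the posterior distribution.
posterior = normalize(unnormalized)

print(posterior)  # Bowl 1 is now the more probable hypothesis (0.6 vs 0.4)
```

The same normalize/multiply/renormalize pattern works for any number of hypotheses, which is why the book wraps it in a reusable class.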
Terminology and design patterns of Python programs that you can use during the rest of the course.
43. Differences between Bayesians and Non-Bayesians
ACCP 37th Annual Meeting, Philadelphia, PA [2]
Differences Between Bayesians and Non-Bayesians
According to my friend Jeff Gill (Center for Applied Statistics, WashU)
Typical Bayesian | Typical Non-Bayesian