# 21 Inference

478 views

Published on

Published in: Technology, News & Politics
0 Likes
Statistics
Notes
• Full Name
Comment goes here.

Are you sure you want to Yes No
• Be the first to comment

• Be the first to like this

Views
Total views
478
On SlideShare
0
From Embeds
0
Number of Embeds
2
Actions
Shares
0
4
0
Likes
0
Embeds 0
No embeds

No notes for slide

### 21 Inference

1. Stat310 Inference. Hadley Wickham. Tuesday, 31 March 2009.
2. Outline: 1. Homework / take-home exam. 2. Recap. 3. Data vs. distributions. 4. Estimation (maximum likelihood; method of moments). 5. Feedback.
3. Assessment. Short homework this week (but you have to do some reading). The take-home test will be available online next Thursday. Both the take-home and the homework are due in class on Thursday, April 9. A study guide will be posted as soon as possible.
4. Recap. What are the 5 parameters of the bivariate normal? If X and Y are bivariate normal and their correlation is zero, what does that imply about X and Y? Is that true in general?
5. Data vs. distributions. Random experiments produce data. A repeatable random experiment has some underlying distribution. We want to go from the data to saying something about that underlying distribution.
6. Coin tossing. Half the class generates 100 heads and tails by flipping coins. The other half generates 100 heads and tails just by writing down what they think the sequence would be. Write both up on the board; I'll come in and guess which group was which.
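The slide doesn't say how the guessing works, but a standard heuristic (an assumption here, not stated on the slide) is to look at the longest run of identical outcomes: 100 genuine flips almost always contain a run of five or more, while people inventing sequences alternate too often. A minimal sketch:

```python
import random

def longest_run(seq):
    """Length of the longest run of identical consecutive symbols in seq."""
    best = cur = 1
    for prev, nxt in zip(seq, seq[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(1)  # arbitrary seed, for reproducibility
flips = [random.choice("HT") for _ in range(100)]
print(longest_run(flips))  # genuine sequences usually score 5 or more
```

A sequence whose longest run is only 3 or 4 is a good candidate for the "made up" group.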
7. Problem. We have some data and a probability model with unknown parameters. We want to estimate the values of those parameters.
8. Some definitions. Parameter space: the set of all possible parameter values. Estimator: a process/function that takes data and gives a best guess for a parameter (there are usually many possible estimators for a problem). Point estimate: an estimate consisting of a single value.
9. Example. Data: 5.7 3.0 5.7 4.5 6.0 6.3 4.9 5.8 4.4 5.8. Model: Normal(?, 1). What is the mean of the underlying distribution? (5.2?)
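The obvious point estimate is the sample mean of the ten values on the slide (a quick check in Python; the course itself presumably used R):

```python
data = [5.7, 3.0, 5.7, 4.5, 6.0, 6.3, 4.9, 5.8, 4.4, 5.8]
xbar = sum(data) / len(data)  # sample mean as the point estimate of the model mean
print(round(xbar, 2))  # 5.21
```

So the "(5.2?)" on the slide is the sample mean, rounded.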
10. Uncertainty. We also want to be able to quantify how certain/confident we are in our answer. How close is our estimate to the true mean?
11. Simulation. One approach to finding the answer is to use simulation, i.e., set up a case where we know the true answer and see what happens. X ~ Normal(5, 1). Draw 10 numbers from this distribution and calculate their average.
12. Some simulated draws from Normal(5, 1): 3.1 3.4 5.1 4.9 2.2 4.4 4.2 3.9 5.6 4.9 4.2 5.9 2.8 6.0 5.1 2.7 6.5 4.2 4.9 4.6 4.4 4.7 5.0 5.3 5.3 5.1 5.4 4.7 4.7 4.4 5.9 4.2 5.0 4.3 5.4 5.5 4.9 3.1 4.1 4.8 3.6 6.8 5.5 4.8 3.8 6.1 3.8 5.2 5.7 5.2 3.2 5.2 5.3 2.3 4.6 5.6 6.0 5.5 5.5 5.1 7.3 5.4 6.1 4.4 4.9 5.6
13. Repeat 1000 times. [Histogram of the 1000 sample averages; 95% of the values lie between 4.5 and 5.6.]
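The slide's simulation is easy to reproduce: draw 10 values from Normal(5, 1), average them, repeat 1000 times, and read off the empirical 2.5% and 97.5% quantiles. A sketch in Python (the seed is arbitrary):

```python
import random
import statistics

random.seed(310)  # arbitrary seed, for reproducibility
# 1000 replications of: average of 10 draws from Normal(5, 1)
means = [statistics.mean(random.gauss(5, 1) for _ in range(10))
         for _ in range(1000)]
means.sort()
lo, hi = means[25], means[974]  # empirical 2.5% and 97.5% quantiles
print(round(lo, 1), round(hi, 1))  # roughly 4.4-4.5 and 5.6, as on the slide
```

The exact endpoints vary with the seed, but they should be close to the slide's 4.5 and 5.6.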
14. Theory. From Tuesday, we know the distribution of the average. Write it down. Create a 95% confidence interval. How does it compare to the simulation?
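Working the exercise: the average of 10 independent Normal(5, 1) draws is Normal(5, 1/10), so the 95% interval is 5 ± 1.96/√10. A quick check:

```python
import math

mu, sigma, n = 5, 1, 10
se = sigma / math.sqrt(n)              # sd of the sample mean: 1/sqrt(10)
lo, hi = mu - 1.96 * se, mu + 1.96 * se
print(round(lo, 2), round(hi, 2))      # 4.38 5.62
```

This theoretical interval (4.38, 5.62) agrees closely with the simulated 4.5–5.6.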
15. Why the mean? Why is the mean of the data a good estimate of μ? Are there other estimators that might be as good or better? In general, how can we figure out an estimator for a parameter of a distribution?
16. Two general approaches: maximum likelihood and the method of moments.
17. Maximum likelihood. Write down the log-likelihood (i.e., given this data, how likely is it to have been generated by this parameter value?). Find the maximum (i.e., differentiate and set to zero).
18. Example. X ~ Binomial(10, p?). Here is some data drawn from that random experiment: 4 5 1 5 3 2 4 2 2 4. We know the joint pmf because the observations are independent. We can try out various values of p and see which is most likely.
19. Your turn. Write down the joint pmf for independent X1, X2, …, XN ~ Binomial(n, p). Try evaluating it for x = (4 5 1 5 3 2 4 2 2 4), n = 10, p = 0.1.
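By independence, the joint pmf is the product of the individual Binomial(n, p) pmfs; viewed as a function of p with the data fixed, this is the likelihood. A sketch of the evaluation (names like `likelihood` are mine, not from the slides):

```python
from math import comb

def binom_pmf(x, n, p):
    """Binomial(n, p) probability of exactly x successes."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def likelihood(p, xs, n=10):
    """Joint pmf of independent Binomial(n, p) observations, as a function of p."""
    L = 1.0
    for x in xs:
        L *= binom_pmf(x, n, p)
    return L

xs = [4, 5, 1, 5, 3, 2, 4, 2, 2, 4]
print(likelihood(0.1, xs))   # tiny: p = 0.1 explains the data poorly
print(likelihood(0.32, xs))  # much larger: near the maximum
```

Sweeping `p` over a grid of values reproduces the curves on the next two slides.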
20. Try 10 different values of p. [Plot of the joint probability against p for 10 values of p.]
21. Try 100 different values of p. [Plot of the joint probability against p for 100 values of p; the true p is 0.3.]
22. Calculus. We can do the same thing analytically with calculus: find the maximum of the pmf with respect to p. (How do we do this?) We normally call this the likelihood when we think of the x's as fixed and the parameters as varying. It is usually easier to work with the log pmf (why?).
23. Steps. Write out the log-likelihood. (Discard constants.) Differentiate and set to 0. (Check the second derivative is negative, so the critical point is a maximum.)
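For the binomial example, these steps work out as follows (a standard derivation; N is the number of observations, n the binomial size):

```latex
\ell(p) = \sum_{i=1}^{N}\left[\log\binom{n}{x_i} + x_i\log p + (n-x_i)\log(1-p)\right]
\]
Dropping the constant term, differentiating, and setting to zero:
\[
\ell'(p) = \frac{\sum_i x_i}{p} - \frac{Nn-\sum_i x_i}{1-p} = 0
\quad\Longrightarrow\quad
\hat{p} = \frac{\sum_i x_i}{Nn} = \frac{\bar{x}}{n}
```

The second derivative is negative on (0, 1), confirming a maximum; this gives the estimate on the next slide.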
24. Analytically. The mean of the x's is 3.2 and n = 10, so the maximum likelihood estimate of p for this example is 3.2/10 = 0.32.
25. Method of moments. We know how to calculate sample moments (e.g., the mean and variance of the data). We know what the moments of the distribution are in terms of the parameters. Why not just match them up?
26. Binomial: E(X) = np, Var(X) = np(1-p).
27. Matching the mean: p = mean / n = 3.2 / 10 = 0.32.
28. Matching the variance: p(1-p) = var / n = 2 / 10 = 0.2.
29. Equivalently, -p^2 + p - 0.2 = 0.
30. Solving: p = 0.276 or 0.724.
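Solving the variance equation is just the quadratic formula; note that neither root equals the mean-based estimate 0.32, which is expected when two moment equations constrain a single parameter. A quick check:

```python
import math

# p(1-p) = 0.2  =>  p^2 - p + 0.2 = 0; solve with the quadratic formula
a, b, c = 1, -1, 0.2
disc = math.sqrt(b * b - 4 * a * c)
roots = ((-b - disc) / (2 * a), (-b + disc) / (2 * a))
print(tuple(round(r, 3) for r in roots))  # (0.276, 0.724)
```

(The larger root rounds to 0.724, not 0.725.)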
31. Your turn. What are the method of moments estimators for the mean and variance of the normal distribution? What about the gamma distribution?
32. Feedback.