# 16 Sums



1. Stat310: Sums of independent rv's. Hadley Wickham. Tuesday, 10 March 2009
2. http://xkcd.com/552/
3. Outline:
   1. Exam
   2. Recap
   3. Motivation: connection to data
   4. Expectation, variance, mgf's
   5. Convergence in probability
4. Exam. How did it go? Too long? Too short? Sorry about question 1. Draft model answers are online. Will be back to you next week.
5. (image slide; no text)
6. Recap. What is a random experiment? What does it mean for two random variables to be independent? What is E(X + Y)? What is Var(X + Y)? What is the definition of the mgf?
7. Random experiment. An observation that is uncertain: we don't know ahead of time what the answer will be (pretty common!). Ideally, we want the experiment to be repeatable under exactly the same initial conditions (pretty rare!).
8. Why? We want to be able to work with (many) more variables than just two. Typically we need to make some assumptions to make this easier. It turns out that many of the things we are interested in can be expressed as sums, and we can design experiments to ensure independence.
9. Why? Most often these random variables will arise from repeated experiments: each random variable is the result of one repetition. We want to be able to make inferences about the distribution from the samples that we see. We will touch on this today, and cover it much more deeply in two weeks.
10. Mutual independence. If X1, X2, ..., Xn are mutually independent, what do you think the joint pdf will look like? f(x1, x2, ..., xn) = ?
11. f(x1, x2, ..., xn) = f1(x1) f2(x2) ··· fn(xn) = ∏_{i=1}^n fi(xi) (and when the Xi all have the same marginal f, this is f(x1) f(x2) ··· f(xn))
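The factorization can be illustrated with a quick Python sketch (not part of the slides; the Exponential(1) marginals are just an assumed example):

```python
import math

# Assumed example: two independent Exponential(1) random variables.
# Marginal pdf of Exponential(1) on x >= 0:
def f(x):
    return math.exp(-x)

# Under independence the joint pdf factors into the product of the
# marginals: f(x1, x2) = f(x1) * f(x2).
def joint(x1, x2):
    return f(x1) * f(x2)

print(joint(1.0, 2.0))   # equals e^{-1} * e^{-2} = e^{-3}
```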
12. Note. If X and Y are independent, then so are: X² and Y², 1/X and e^Y, and u(X) and v(Y) for any functions u and v that each involve only one variable.
13. Definition. iid = independent, identically distributed. E.g. X1, X2, ..., Xn iid implies that f(x1, x2, ..., xn) = ∏_{i=1}^n f(xi), and the f's are all the same!
14. Mean & variance. Let X1, X2, ..., Xn be independent random variables. Then E(∑ Xi) = ∑ E(Xi) and Var(∑ Xi) = ∑ Var(Xi).
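A hedged numeric check (Python, not part of the slides; the three Uniform(0, 1) variables are an assumed example) shows both sums adding up as claimed:

```python
import random
random.seed(1)

# Monte Carlo sketch (assumed example): three independent Uniform(0, 1)
# variables, each with mean 1/2 and variance 1/12.  Independence gives
# E(sum Xi) = sum E(Xi) and Var(sum Xi) = sum Var(Xi).
N = 200_000
sums = [random.random() + random.random() + random.random() for _ in range(N)]

mean = sum(sums) / N
var = sum((s - mean) ** 2 for s in sums) / N

print(mean)  # theory: 3 * (1/2)  = 1.5
print(var)   # theory: 3 * (1/12) = 0.25
```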
15. Y = a1 X1 + a2 X2 + ··· + an Xn. What are E(Y) and Var(Y)?
16. Y = a1 X1 + a2 X2 + ··· + an Xn. What are E(Y) and Var(Y)? E(Y) = ∑ ai E(Xi)
17. Y = a1 X1 + a2 X2 + ··· + an Xn. What are E(Y) and Var(Y)? E(Y) = ∑ ai E(Xi); Var(Y) = ∑ ai² Var(Xi)
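The linear-combination formulas can be evaluated directly; this Python sketch (not from the slides) uses made-up coefficients, means, and variances purely for illustration:

```python
# Sketch: E(Y) and Var(Y) for Y = a1*X1 + ... + an*Xn computed from
# the formulas above.  All numbers below are hypothetical examples;
# the Var formula requires the Xi to be independent.
a      = [2.0, -1.0, 3.0]   # assumed constants ai
means  = [1.0,  4.0, 0.5]   # assumed E(Xi)
varis  = [0.25, 1.0, 2.0]   # assumed Var(Xi)

E_Y   = sum(ai * mi for ai, mi in zip(a, means))
Var_Y = sum(ai**2 * vi for ai, vi in zip(a, varis))

print(E_Y)    # 2*1 - 1*4 + 3*0.5 = -0.5
print(Var_Y)  # 4*0.25 + 1*1 + 9*2 = 20.0
```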
18. Your turn. Let X1, X2, ..., Xn be iid random variables, and let X̄ = (∑ Xi)/n. What are the mean and variance of X̄?
19. X̄ = (1/n) ∑ Xi; E(X̄) = E(X); Var(X̄) = Var(X)/n
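A small simulation (Python sketch, not from the slides; iid Uniform(0, 1) with Var(X) = 1/12 is an assumed example) illustrates the Var(X̄) = Var(X)/n shrinkage:

```python
import random
random.seed(2)

# Assumed example: iid Uniform(0, 1) draws, so Var(X) = 1/12.
# The slide says Var(Xbar) = Var(X)/n; estimate it by simulation.
def var_of_sample_mean(n, reps=100_000):
    means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]
    m = sum(means) / reps
    return sum((x - m) ** 2 for x in means) / reps

v10 = var_of_sample_mean(10)
print(v10)   # theory: (1/12)/10 = 0.00833...
```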
20. Mgf. So that's great if all we want to know is the mean and variance. What if we want to know the actual distribution? This is where we use the second property of the mgf: if the mgfs of two random variables are equal, then their distributions (pdfs) are equal.
21. Your turn. If X and Y are independent, what is the mgf of X + Y? What is the mgf of aX?
22. M_{X+Y}(t) = E(e^{(X+Y)t}) = E(e^{Xt} e^{Yt}) = E(e^{Xt}) E(e^{Yt}) = M_X(t) M_Y(t). Also, M_{aX}(t) = M_X(at).
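One well-known consequence, sketched in Python (the Poisson example is an assumed illustration, not from the slides): the Poisson(λ) mgf is exp(λ(e^t − 1)), so multiplying the mgfs of independent Poisson(2) and Poisson(5) variables gives exactly the Poisson(7) mgf, identifying the distribution of the sum:

```python
import math

# The Poisson(lam) mgf is M(t) = exp(lam * (e^t - 1)).
def poisson_mgf(lam, t):
    return math.exp(lam * (math.exp(t) - 1))

# For independent X ~ Poisson(2) and Y ~ Poisson(5), the product rule
# M_{X+Y}(t) = M_X(t) M_Y(t) matches the Poisson(7) mgf at every t,
# so X + Y ~ Poisson(7).  Check at one assumed value of t:
t = 0.3
lhs = poisson_mgf(2, t) * poisson_mgf(5, t)
rhs = poisson_mgf(7, t)
print(abs(lhs - rhs) < 1e-9)
```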
23. Y = a1 X1 + a2 X2 + ··· + an Xn, so M_Y(t) = ∏_{i=1}^n M_{Xi}(ai t). If the X's are iid and all the a's equal the same constant a, then M_Y(t) = [M_X(at)]^n.
24. Your turn. Let X1, X2, ..., Xn be iid random variables, and let X̄ = (∑ Xi)/n. What is the mgf of X̄?
25. Convergence in P. Imagine you have a Bernoulli(p) process. You can repeat the process as many times as you like to generate X1, X2, ..., Xn. How could you use these X's to figure out what p is?
26. Convergence in P. It's an intuitive answer (and correct), but how can you check it? We want to say something like lim_{n→∞} Zn = p, where Zn = (1/n) ∑_{i=1}^n Xi. But this doesn't make sense. Why not?
27. lim_{n→∞} P(|Zn − p| ≤ ε) = 1 for all ε > 0
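This limit can be watched happening in a simulation (Python sketch, not part of the slides; p = 0.3 and ε = 0.05 are assumed example values):

```python
import random
random.seed(3)

# Sketch of the slide's limit: for iid Bernoulli(p) draws X1, ..., Xn
# with sample mean Zn, P(|Zn - p| <= eps) should climb toward 1 as n
# grows.  p and eps below are assumed example values.
p, eps, reps = 0.3, 0.05, 2000

def prob_within(n):
    """Estimate P(|Zn - p| <= eps) from `reps` simulated samples."""
    hits = 0
    for _ in range(reps):
        zn = sum(random.random() < p for _ in range(n)) / n
        if abs(zn - p) <= eps:
            hits += 1
    return hits / reps

results = [prob_within(n) for n in (10, 100, 1000)]
for n, pr in zip((10, 100, 1000), results):
    print(n, pr)   # the estimated probability increases with n
```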