
# 17 Normal Intro

## by Hadley Wickham, Rice University, Mar 12, 2009

## 17 Normal Intro: Presentation Transcript

• Stat310: The normal distribution. Hadley Wickham. Thursday, 12 March 2009
• Outline: 1. Exam. 2. Recap. 3. Finish off convergence example. 4. The normal distribution (reading?)
• Exam: graded, for you to pick up after class. Generally did OK on question one, despite the mistake. A point for question four just for attempting it; the question was fine too. Struggled with question three (which was supposed to be pretty straightforward, sorry!). "Carry through."
• Question 3
• [Scatterplot: exam2/30 (y-axis, 0 to 1) vs exam1/50 (x-axis, 0.2 to 1)]
• [Histogram: count (y-axis) of exam2/30 scores (x-axis, 0 to 1)]
• Recap: if X1, X2, …, Xn are iid, then: What does the joint pdf look like? What is the expected value of the sum? What is the variance of the sum? What is the mgf of the sum?
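The sum results recalled above can be checked empirically. A minimal simulation sketch (the Uniform(0, 1) distribution and all parameter values here are arbitrary choices for illustration, not from the slides): for n iid terms, the sum should have mean n·μ and variance n·σ².

```python
import random
import statistics

# Check E(sum of X_i) = n*mu and Var(sum of X_i) = n*sigma^2 by simulation.
# Illustrative distribution: Uniform(0, 1), so mu = 1/2 and sigma^2 = 1/12.
random.seed(1)
n = 10          # iid terms per sum
reps = 200_000  # number of simulated sums

sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]

mean_of_sums = statistics.fmean(sums)     # should be near n * 1/2 = 5
var_of_sums = statistics.pvariance(sums)  # should be near n * 1/12
print(mean_of_sums, var_of_sums)
```

The same check works for any distribution with finite variance; only the theoretical μ and σ² in the comparison change.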
• Convergence in probability: Imagine you have a Bernoulli(p) process. You can repeat the process as many times as you like to generate X1, X2, …, Xn. How could you use these X's to figure out what p is?
• lim_(n→∞) P(|Zn − p| ≤ ε) = 1, for all ε > 0
• Running estimate of p:

| Time | Event | Total | Estimate |
|------|-------|-------|----------|
| 1    | 1     | 1     | 1.00     |
| 2    | 0     | 1     | 0.50     |
| 3    | 1     | 2     | 0.67     |
| 4    | 0     | 2     | 0.50     |
| 5    | 0     | 2     | 0.40     |
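The running estimate in the table is just (successes so far) / (trials so far). A short sketch reproducing it, plus a longer run showing the estimate settling near the true p (the value p = 0.3 and the run length are arbitrary choices for illustration):

```python
import random

# Reproduce the slide's running-estimate table: after n trials,
# the estimate of p is (successes so far) / n.
outcomes = [1, 0, 1, 0, 0]
estimates, total = [], 0
for n, x in enumerate(outcomes, start=1):
    total += x
    estimates.append(round(total / n, 2))
print(estimates)  # [1.0, 0.5, 0.67, 0.5, 0.4]

# Longer run: with many trials the estimate Z_n lands close to the true p,
# which is what the convergence plots illustrate.
random.seed(0)
p = 0.3  # illustrative true parameter
draws = [1 if random.random() < p else 0 for _ in range(100_000)]
z_n = sum(draws) / len(draws)
print(z_n)
```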
• [Plots, three slides: running estimate (est, y-axis 0 to 1) vs n (x-axis, up to 1000) for simulated runs]
• 1000 runs
• [Plot: est vs n across 1000 runs, with 1%, 25%, 75%, and 99% quantile bands]
• [Plot: zoomed view of est (0.20 to 0.40) vs n, with 1%, 25%, 50%, 75%, and 99% quantile bands]
• [Plot: dist (y-axis, 0 to 0.7) vs n (x-axis, up to 1000)]
• [Plot: dist (0 to 0.10) vs n, with 1%, 25%, 50%, 75%, and 99% quantile bands]
• The normal distribution
• f(x) = (1 / (σ√(2π))) e^(−(x−µ)² / (2σ²)). Is this a valid pdf?
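One way to convince yourself the density is valid (non-negative and integrating to 1) is a quick numerical check. A sketch using the trapezoidal rule over µ ± 10σ, which contains essentially all the mass; the parameter values are arbitrary choices for the check:

```python
import math

# Normal pdf as written on the slide.
def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

mu, sigma = 2.0, 3.0  # arbitrary parameters for the check
lo, hi, steps = mu - 10 * sigma, mu + 10 * sigma, 100_000
h = (hi - lo) / steps

# Trapezoidal rule: sum of f at all grid points times h, with half weight
# at the two endpoints.
area = h * sum(normal_pdf(lo + i * h, mu, sigma) for i in range(steps + 1))
area -= h * (normal_pdf(lo, mu, sigma) + normal_pdf(hi, mu, sigma)) / 2
print(area)  # ≈ 1.0
```

This is only numerical evidence; the proper proof (squaring the integral and switching to polar coordinates) is the standard argument.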
• M(t) = e^(µt + σ²t²/2). See book for derivation: a few tricks plus lots of algebra.
• Your turn: If X ~ Normal(μ, σ²), what is the mean and variance of X? (Work it out; don't just write down what you know!)
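After working it out by hand, you can sanity-check the answer numerically: the mgf gives moments via derivatives at 0, so M′(0) should equal μ and M″(0) − M′(0)² should equal σ². A sketch using central differences (the parameter values are arbitrary choices for the check):

```python
import math

# mgf from the slide: M(t) = exp(mu*t + sigma^2 * t^2 / 2).
mu, sigma = 1.5, 2.0  # arbitrary parameters for the check

def M(t):
    return math.exp(mu * t + sigma ** 2 * t ** 2 / 2)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)              # central difference for M'(0)
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h ** 2  # central difference for M''(0)

print(m1, m2 - m1 ** 2)  # should be near mu and sigma^2
```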
• [Density plots f(x) on x from −10 to 10 for N(−2, 1), N(5, 1), N(0, 1), N(0, 4), and N(0, 16)]
• Transformations: If X ~ Normal(μ, σ²) and Y = a(X + b), then Y ~ Normal(a(b + μ), a²σ²). If a = 1/σ and b = −μ, we often write Z = (X − μ) / σ; Z ~ Normal(0, 1), the standard normal.
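The standardization can be checked by simulation: draws of X, transformed to Z = (X − μ)/σ, should have mean near 0 and variance near 1. A sketch (μ = 5 and σ = 3 are arbitrary choices for the check):

```python
import random
import statistics

# If X ~ Normal(mu, sigma^2), then Z = (X - mu) / sigma should be
# approximately standard normal: mean ~ 0, variance ~ 1.
random.seed(2)
mu, sigma = 5.0, 3.0  # arbitrary parameters for the check
xs = [random.gauss(mu, sigma) for _ in range(100_000)]
zs = [(x - mu) / sigma for x in xs]

z_mean = statistics.fmean(zs)
z_var = statistics.pvariance(zs)
print(z_mean, z_var)  # near 0 and 1
```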
• Example: Let X ~ Normal(5, 10). What is P(3 < X < 8)? Convert to the standard normal and look up the Z scores: P(−0.2 < Z < 0.3) = P(Z < 0.3) − P(Z < −0.2). (Google "z table".)
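Instead of a printed z table, the standard normal cdf can be computed from the error function, since Φ(z) = (1 + erf(z/√2)) / 2. A sketch applying this to the slide's example:

```python
import math

# Standard normal cdf via the error function:
# Phi(z) = (1 + erf(z / sqrt(2))) / 2.
def Phi(z):
    return (1 + math.erf(z / math.sqrt(2))) / 2

# The slide's example: P(-0.2 < Z < 0.3) = Phi(0.3) - Phi(-0.2).
prob = Phi(0.3) - Phi(-0.2)
print(round(prob, 4))  # 0.1972
```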
• P(Z < z) = Φ(z), and Φ(−z) = 1 − Φ(z). P(−1 < Z < 1) ≈ 0.68, P(−2 < Z < 2) ≈ 0.95, P(−3 < Z < 3) ≈ 0.998.
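The 68/95/99.8 rules of thumb follow from the symmetry identity: P(−k < Z < k) = Φ(k) − Φ(−k) = 2Φ(k) − 1. A quick check, again computing Φ from the error function:

```python
import math

def Phi(z):
    return (1 + math.erf(z / math.sqrt(2))) / 2

# P(-k < Z < k) = 2 * Phi(k) - 1; compare with 0.68, 0.95, 0.998.
probs = {k: 2 * Phi(k) - 1 for k in (1, 2, 3)}
for k, p in probs.items():
    print(k, round(p, 4))
```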
• Readings: 5.3, 5.4, 5.5