Elementary Statistics, 8th edition, Bluman — Important Formulas and Tables


Important Formulas

Chapter 3  Data Description

Mean for individual data: $\bar{X} = \frac{\sum X}{n}$
Mean for grouped data: $\bar{X} = \frac{\sum f \cdot X_m}{n}$
Standard deviation for a sample: $s = \sqrt{\frac{\sum (X - \bar{X})^2}{n-1}}$ or $s = \sqrt{\frac{n(\sum X^2) - (\sum X)^2}{n(n-1)}}$ (shortcut formula)
Standard deviation for grouped data: $s = \sqrt{\frac{n(\sum f \cdot X_m^2) - (\sum f \cdot X_m)^2}{n(n-1)}}$
Range rule of thumb: $s \approx \frac{\text{range}}{4}$

Chapter 4  Probability and Counting Rules

Addition rule 1 (mutually exclusive events): $P(A \text{ or } B) = P(A) + P(B)$
Addition rule 2 (events not mutually exclusive): $P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$
Multiplication rule 1 (independent events): $P(A \text{ and } B) = P(A) \cdot P(B)$
Multiplication rule 2 (dependent events): $P(A \text{ and } B) = P(A) \cdot P(B \mid A)$
Conditional probability: $P(B \mid A) = \frac{P(A \text{ and } B)}{P(A)}$
Complementary events: $P(\bar{E}) = 1 - P(E)$
Fundamental counting rule: total number of outcomes of a sequence when each event has a different number of possibilities: $k_1 \cdot k_2 \cdot k_3 \cdots k_n$
Permutation rule: number of permutations of $n$ objects taking $r$ at a time is ${}_{n}P_{r} = \frac{n!}{(n-r)!}$
Combination rule: number of combinations of $r$ objects selected from $n$ objects is ${}_{n}C_{r} = \frac{n!}{(n-r)!\,r!}$

Chapter 5  Discrete Probability Distributions

Mean for a probability distribution: $\mu = \sum [X \cdot P(X)]$
Variance and standard deviation for a probability distribution: $\sigma^2 = \sum [X^2 \cdot P(X)] - \mu^2$ and $\sigma = \sqrt{\sum [X^2 \cdot P(X)] - \mu^2}$
Expectation: $E(X) = \sum [X \cdot P(X)]$
Binomial probability: $P(X) = \frac{n!}{(n-X)!\,X!}\, p^X q^{\,n-X}$
Mean for the binomial distribution: $\mu = n \cdot p$
Variance and standard deviation for the binomial distribution: $\sigma^2 = n \cdot p \cdot q$ and $\sigma = \sqrt{n \cdot p \cdot q}$
Multinomial probability: $P(X) = \frac{n!}{X_1!\,X_2!\,X_3! \cdots X_k!}\, p_1^{X_1} p_2^{X_2} p_3^{X_3} \cdots p_k^{X_k}$
Poisson probability: $P(X; \lambda) = \frac{e^{-\lambda}\lambda^X}{X!}$, where $X = 0, 1, 2, \ldots$
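The binomial formulas above can be checked numerically. The sketch below uses hypothetical numbers (10 coin flips with p = 0.5); the function names are ours, not the textbook's.

```python
from math import comb, sqrt

def binomial_pmf(n, x, p):
    """Binomial probability: P(X) = nCx * p^x * q^(n-x)."""
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

def binomial_summary(n, p):
    """Mean, variance, and standard deviation of a binomial distribution."""
    q = 1 - p
    mean = n * p              # mu = n*p
    variance = n * p * q      # sigma^2 = n*p*q
    return mean, variance, sqrt(variance)

# Hypothetical example: probability of exactly 5 heads in 10 fair-coin flips
print(round(binomial_pmf(10, 5, 0.5), 4))   # 0.2461
print(binomial_summary(10, 0.5))            # mean 5.0, variance 2.5
```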
Hypergeometric probability: $P(X) = \frac{{}_{a}C_{X} \cdot {}_{b}C_{n-X}}{{}_{a+b}C_{n}}$

Chapter 6  The Normal Distribution

Standard score: $z = \frac{X - \mu}{\sigma}$ or $z = \frac{X - \bar{X}}{s}$
Mean of sample means: $\mu_{\bar{X}} = \mu$
Standard error of the mean: $\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}}$
Central limit theorem formula: $z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$

Chapter 7  Confidence Intervals and Sample Size

z confidence interval for means: $\bar{X} - z_{\alpha/2}\left(\frac{\sigma}{\sqrt{n}}\right) < \mu < \bar{X} + z_{\alpha/2}\left(\frac{\sigma}{\sqrt{n}}\right)$
t confidence interval for means: $\bar{X} - t_{\alpha/2}\left(\frac{s}{\sqrt{n}}\right) < \mu < \bar{X} + t_{\alpha/2}\left(\frac{s}{\sqrt{n}}\right)$
Sample size for means: $n = \left(\frac{z_{\alpha/2} \cdot \sigma}{E}\right)^2$, where $E$ is the maximum error of estimate
Confidence interval for a proportion: $\hat{p} - z_{\alpha/2}\sqrt{\frac{\hat{p}\hat{q}}{n}} < p < \hat{p} + z_{\alpha/2}\sqrt{\frac{\hat{p}\hat{q}}{n}}$
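The z confidence interval for means can be sketched as follows; the sample mean, sigma, and n here are hypothetical, and 1.96 is the familiar $z_{\alpha/2}$ for 95% confidence.

```python
from math import sqrt

def z_confidence_interval(xbar, sigma, n, z_alpha2=1.96):
    """z CI for a mean: xbar +/- z_{alpha/2} * sigma / sqrt(n)."""
    e = z_alpha2 * sigma / sqrt(n)   # E, the maximum error of estimate
    return xbar - e, xbar + e

# Hypothetical data: sample mean 23.2, known sigma 2, n = 50, 95% confidence
lo, hi = z_confidence_interval(23.2, 2, 50)
print(round(lo, 2), round(hi, 2))   # 22.65 23.75
```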
Sample size for a proportion: $n = \hat{p}\hat{q}\left(\frac{z_{\alpha/2}}{E}\right)^2$, where $\hat{p} = \frac{X}{n}$ and $\hat{q} = 1 - \hat{p}$
Confidence interval for a variance: $\frac{(n-1)s^2}{\chi^2_{\text{right}}} < \sigma^2 < \frac{(n-1)s^2}{\chi^2_{\text{left}}}$
Confidence interval for a standard deviation: $\sqrt{\frac{(n-1)s^2}{\chi^2_{\text{right}}}} < \sigma < \sqrt{\frac{(n-1)s^2}{\chi^2_{\text{left}}}}$

Chapter 8  Hypothesis Testing

z test: $z = \frac{\bar{X} - \mu}{\sigma/\sqrt{n}}$ for any value of $n$; if $n < 30$, the population must be normally distributed
t test: $t = \frac{\bar{X} - \mu}{s/\sqrt{n}}$ (d.f. = $n - 1$)
z test for proportions: $z = \frac{\hat{p} - p}{\sqrt{pq/n}}$
Chi-square test for a single variance: $\chi^2 = \frac{(n-1)s^2}{\sigma^2}$ (d.f. = $n - 1$)

Chapter 9  Testing the Difference Between Two Means, Two Proportions, and Two Variances

z test for comparing two means (independent samples): $z = \frac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}}$
Confidence interval for the difference of two means (large samples): $(\bar{X}_1 - \bar{X}_2) - z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}} < \mu_1 - \mu_2 < (\bar{X}_1 - \bar{X}_2) + z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n_1} + \frac{\sigma_2^2}{n_2}}$
t test for comparing two means (independent samples, variances not equal): $t = \frac{(\bar{X}_1 - \bar{X}_2) - (\mu_1 - \mu_2)}{\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}}$ (d.f. = the smaller of $n_1 - 1$ or $n_2 - 1$)
Confidence interval for the difference of two means (small independent samples, variances unequal): $(\bar{X}_1 - \bar{X}_2) - t_{\alpha/2}\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}} < \mu_1 - \mu_2 < (\bar{X}_1 - \bar{X}_2) + t_{\alpha/2}\sqrt{\frac{s_1^2}{n_1} + \frac{s_2^2}{n_2}}$ (d.f. = smaller of $n_1 - 1$ and $n_2 - 1$)
t test for comparing two means (dependent samples): $t = \frac{\bar{D} - \mu_D}{s_D/\sqrt{n}}$ (d.f. = $n - 1$), where $\bar{D} = \frac{\sum D}{n}$ and $s_D = \sqrt{\frac{n\sum D^2 - (\sum D)^2}{n(n-1)}}$
Confidence interval for the mean of the difference (dependent samples): $\bar{D} - t_{\alpha/2}\frac{s_D}{\sqrt{n}} < \mu_D < \bar{D} + t_{\alpha/2}\frac{s_D}{\sqrt{n}}$ (d.f. = $n - 1$)
z test for comparing two proportions: $z = \frac{(\hat{p}_1 - \hat{p}_2) - (p_1 - p_2)}{\sqrt{\bar{p}\bar{q}\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}$, where $\bar{p} = \frac{X_1 + X_2}{n_1 + n_2}$, $\bar{q} = 1 - \bar{p}$, $\hat{p}_1 = \frac{X_1}{n_1}$, and $\hat{p}_2 = \frac{X_2}{n_2}$
Confidence interval for the difference of two proportions: $(\hat{p}_1 - \hat{p}_2) - z_{\alpha/2}\sqrt{\frac{\hat{p}_1\hat{q}_1}{n_1} + \frac{\hat{p}_2\hat{q}_2}{n_2}} < p_1 - p_2 < (\hat{p}_1 - \hat{p}_2) + z_{\alpha/2}\sqrt{\frac{\hat{p}_1\hat{q}_1}{n_1} + \frac{\hat{p}_2\hat{q}_2}{n_2}}$
F test for comparing two variances: $F = \frac{s_1^2}{s_2^2}$, where $s_1^2$ is the larger variance; d.f.N. = $n_1 - 1$, d.f.D. = $n_2 - 1$
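The two-proportion z test, with its pooled estimate $\bar{p}$, can be sketched as below. The success counts and sample sizes are hypothetical, chosen only to exercise the formula.

```python
from math import sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z test for comparing two proportions, using the pooled estimate p-bar.

    Tests the hypothesized difference p1 - p2 = 0.
    """
    p1, p2 = x1 / n1, x2 / n2
    pbar = (x1 + x2) / (n1 + n2)                   # pooled proportion
    qbar = 1 - pbar
    se = sqrt(pbar * qbar * (1 / n1 + 1 / n2))     # standard error
    return (p1 - p2) / se

# Hypothetical counts: 24/100 successes vs. 12/100 successes
print(round(two_proportion_z(24, 100, 12, 100), 2))   # 2.21
```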
Chapter 10  Correlation and Regression

Correlation coefficient: $r = \frac{n(\sum xy) - (\sum x)(\sum y)}{\sqrt{[n(\sum x^2) - (\sum x)^2][n(\sum y^2) - (\sum y)^2]}}$
t test for the correlation coefficient: $t = r\sqrt{\frac{n-2}{1-r^2}}$ (d.f. = $n - 2$)
The regression line equation: $y' = a + bx$, where $a = \frac{(\sum y)(\sum x^2) - (\sum x)(\sum xy)}{n(\sum x^2) - (\sum x)^2}$ and $b = \frac{n(\sum xy) - (\sum x)(\sum y)}{n(\sum x^2) - (\sum x)^2}$
Coefficient of determination: $r^2 = \frac{\text{explained variation}}{\text{total variation}}$
Standard error of estimate: $s_{est} = \sqrt{\frac{\sum y^2 - a\sum y - b\sum xy}{n - 2}}$
Prediction interval for $y$: $y' - t_{\alpha/2}\,s_{est}\sqrt{1 + \frac{1}{n} + \frac{n(x - \bar{X})^2}{n\sum x^2 - (\sum x)^2}} < y < y' + t_{\alpha/2}\,s_{est}\sqrt{1 + \frac{1}{n} + \frac{n(x - \bar{X})^2}{n\sum x^2 - (\sum x)^2}}$ (d.f. = $n - 2$)
Formula for the multiple correlation coefficient: $R = \sqrt{\frac{r_{yx_1}^2 + r_{yx_2}^2 - 2r_{yx_1} \cdot r_{yx_2} \cdot r_{x_1x_2}}{1 - r_{x_1x_2}^2}}$
Formula for the F test for the multiple correlation coefficient: $F = \frac{R^2/k}{(1 - R^2)/(n - k - 1)}$ (d.f.N. = $n - k$ and d.f.D. = $n - k - 1$)
Formula for the adjusted $R^2$: $R^2_{\text{adj}} = 1 - \frac{(1 - R^2)(n - 1)}{n - k - 1}$

Chapter 11  Other Chi-Square Tests

Chi-square test for goodness-of-fit: $\chi^2 = \sum \frac{(O - E)^2}{E}$ (d.f. = number of categories − 1)
Chi-square test for independence and homogeneity of proportions: $\chi^2 = \sum \frac{(O - E)^2}{E}$ [d.f. = (rows − 1)(columns − 1)]

Chapter 12  Analysis of Variance

ANOVA test: $F = \frac{s_B^2}{s_W^2}$; d.f.N. = $k - 1$, d.f.D. = $N - k$, where $N = n_1 + n_2 + \cdots + n_k$ and $k$ = number of groups
$s_B^2 = \frac{\sum n_i(\bar{X}_i - \bar{X}_{GM})^2}{k - 1}$, where $\bar{X}_{GM} = \frac{\sum X}{N}$
$s_W^2 = \frac{\sum (n_i - 1)s_i^2}{\sum (n_i - 1)}$
Scheffé test: $F_S = \frac{(\bar{X}_i - \bar{X}_j)^2}{s_W^2(1/n_i + 1/n_j)}$ and $F' = (k - 1)(\text{C.V.})$
Tukey test: $q = \frac{\bar{X}_i - \bar{X}_j}{\sqrt{s_W^2/n}}$
Formulas for two-way ANOVA: $MS_A = \frac{SS_A}{a - 1}$, $MS_B = \frac{SS_B}{b - 1}$, $MS_{A \times B} = \frac{SS_{A \times B}}{(a - 1)(b - 1)}$, $MS_W = \frac{SS_W}{ab(n - 1)}$; $F_A = \frac{MS_A}{MS_W}$, $F_B = \frac{MS_B}{MS_W}$, $F_{A \times B} = \frac{MS_{A \times B}}{MS_W}$
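The raw-sum formulas for $r$, $a$, and $b$ above can be computed directly. A minimal sketch with hypothetical paired data (an exactly linear relationship, so the expected results are easy to verify by hand):

```python
from math import sqrt

def correlation_and_line(xs, ys):
    """Pearson r, intercept a, and slope b via the raw-sum formulas."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    syy = sum(y * y for y in ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    r = (n * sxy - sx * sy) / sqrt((n * sxx - sx**2) * (n * syy - sy**2))
    b = (n * sxy - sx * sy) / (n * sxx - sx**2)
    a = (sy * sxx - sx * sxy) / (n * sxx - sx**2)
    return r, a, b

# Hypothetical data: y = 2x exactly, so r = 1 and y' = 0 + 2x
r, a, b = correlation_and_line([1, 2, 3, 4], [2, 4, 6, 8])
print(r, a, b)   # 1.0 0.0 2.0
```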
Chapter 13  Nonparametric Statistics

z test value in the sign test: $z = \frac{(X + 0.5) - (n/2)}{\sqrt{n}/2}$, where $n$ = sample size (greater than or equal to 26) and $X$ = smaller number of + or − signs
Wilcoxon rank sum test: $z = \frac{R - \mu_R}{\sigma_R}$, where $\mu_R = \frac{n_1(n_1 + n_2 + 1)}{2}$, $\sigma_R = \sqrt{\frac{n_1 n_2(n_1 + n_2 + 1)}{12}}$, $R$ = sum of the ranks for the smaller sample size ($n_1$), $n_1$ = smaller of the sample sizes, $n_2$ = larger of the sample sizes, with $n_1 \ge 10$ and $n_2 \ge 10$
Wilcoxon signed-rank test: $z = \frac{w_s - \frac{n(n+1)}{4}}{\sqrt{\frac{n(n+1)(2n+1)}{24}}}$, where $n$ = number of pairs whose difference is not 0 and $w_s$ = smaller sum in absolute value of the signed ranks
Kruskal-Wallis test: $H = \frac{12}{N(N+1)}\left(\frac{R_1^2}{n_1} + \frac{R_2^2}{n_2} + \cdots + \frac{R_k^2}{n_k}\right) - 3(N + 1)$, where $R_i$ = sum of the ranks of sample $i$, $n_i$ = size of sample $i$, $N = n_1 + n_2 + \cdots + n_k$, and $k$ = number of samples
Spearman rank correlation coefficient: $r_S = 1 - \frac{6\sum d^2}{n(n^2 - 1)}$, where $d$ = difference in the ranks and $n$ = number of data pairs

Procedure Table: Solving Hypothesis-Testing Problems (Traditional Method)
Step 1  State the hypotheses and identify the claim.
Step 2  Find the critical value(s) from the appropriate table in Appendix C.
Step 3  Compute the test value.
Step 4  Make the decision to reject or not reject the null hypothesis.
Step 5  Summarize the results.

Procedure Table: Solving Hypothesis-Testing Problems (P-value Method)
Step 1  State the hypotheses and identify the claim.
Step 2  Compute the test value.
Step 3  Find the P-value.
Step 4  Make the decision.
Step 5  Summarize the results.

ISBN-13: 978-0-07-743861-6
ISBN-10: 0-07-743861-2
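The Spearman formula above is simple enough to verify in a few lines. This sketch takes the already-computed rank differences $d$ as input; the sample differences are hypothetical.

```python
def spearman_rs(rank_diffs):
    """Spearman rank correlation: r_S = 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(rank_diffs)
    return 1 - 6 * sum(d * d for d in rank_diffs) / (n * (n * n - 1))

# Hypothetical rank differences for n = 5 data pairs:
# sum(d^2) = 10, so r_S = 1 - 60/120 = 0.5
print(spearman_rs([1, -1, 0, 2, -2]))   # 0.5

# Identical rankings give d = 0 everywhere, hence r_S = 1
print(spearman_rs([0, 0, 0, 0]))        # 1.0
```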
Table E  The Standard Normal Distribution (Cumulative)

z      .00    .01    .02    .03    .04    .05    .06    .07    .08    .09
-3.4  .0003  .0003  .0003  .0003  .0003  .0003  .0003  .0003  .0003  .0002
-3.3  .0005  .0005  .0005  .0004  .0004  .0004  .0004  .0004  .0004  .0003
-3.2  .0007  .0007  .0006  .0006  .0006  .0006  .0006  .0005  .0005  .0005
-3.1  .0010  .0009  .0009  .0009  .0008  .0008  .0008  .0008  .0007  .0007
-3.0  .0013  .0013  .0013  .0012  .0012  .0011  .0011  .0011  .0010  .0010
-2.9  .0019  .0018  .0018  .0017  .0016  .0016  .0015  .0015  .0014  .0014
-2.8  .0026  .0025  .0024  .0023  .0023  .0022  .0021  .0021  .0020  .0019
-2.7  .0035  .0034  .0033  .0032  .0031  .0030  .0029  .0028  .0027  .0026
-2.6  .0047  .0045  .0044  .0043  .0041  .0040  .0039  .0038  .0037  .0036
-2.5  .0062  .0060  .0059  .0057  .0055  .0054  .0052  .0051  .0049  .0048
-2.4  .0082  .0080  .0078  .0075  .0073  .0071  .0069  .0068  .0066  .0064
-2.3  .0107  .0104  .0102  .0099  .0096  .0094  .0091  .0089  .0087  .0084
-2.2  .0139  .0136  .0132  .0129  .0125  .0122  .0119  .0116  .0113  .0110
-2.1  .0179  .0174  .0170  .0166  .0162  .0158  .0154  .0150  .0146  .0143
-2.0  .0228  .0222  .0217  .0212  .0207  .0202  .0197  .0192  .0188  .0183
-1.9  .0287  .0281  .0274  .0268  .0262  .0256  .0250  .0244  .0239  .0233
-1.8  .0359  .0351  .0344  .0336  .0329  .0322  .0314  .0307  .0301  .0294
-1.7  .0446  .0436  .0427  .0418  .0409  .0401  .0392  .0384  .0375  .0367
-1.6  .0548  .0537  .0526  .0516  .0505  .0495  .0485  .0475  .0465  .0455
-1.5  .0668  .0655  .0643  .0630  .0618  .0606  .0594  .0582  .0571  .0559
-1.4  .0808  .0793  .0778  .0764  .0749  .0735  .0721  .0708  .0694  .0681
-1.3  .0968  .0951  .0934  .0918  .0901  .0885  .0869  .0853  .0838  .0823
-1.2  .1151  .1131  .1112  .1093  .1075  .1056  .1038  .1020  .1003  .0985
-1.1  .1357  .1335  .1314  .1292  .1271  .1251  .1230  .1210  .1190  .1170
-1.0  .1587  .1562  .1539  .1515  .1492  .1469  .1446  .1423  .1401  .1379
-0.9  .1841  .1814  .1788  .1762  .1736  .1711  .1685  .1660  .1635  .1611
-0.8  .2119  .2090  .2061  .2033  .2005  .1977  .1949  .1922  .1894  .1867
-0.7  .2420  .2389  .2358  .2327  .2296  .2266  .2236  .2206  .2177  .2148
-0.6  .2743  .2709  .2676  .2643  .2611  .2578  .2546  .2514  .2483  .2451
-0.5  .3085  .3050  .3015  .2981  .2946  .2912  .2877  .2843  .2810  .2776
-0.4  .3446  .3409  .3372  .3336  .3300  .3264  .3228  .3192  .3156  .3121
-0.3  .3821  .3783  .3745  .3707  .3669  .3632  .3594  .3557  .3520  .3483
-0.2  .4207  .4168  .4129  .4090  .4052  .4013  .3974  .3936  .3897  .3859
-0.1  .4602  .4562  .4522  .4483  .4443  .4404  .4364  .4325  .4286  .4247
-0.0  .5000  .4960  .4920  .4880  .4840  .4801  .4761  .4721  .4681  .4641

For z values less than -3.49, use 0.0001.
Table E (continued)  Cumulative Standard Normal Distribution

z      .00    .01    .02    .03    .04    .05    .06    .07    .08    .09
0.0   .5000  .5040  .5080  .5120  .5160  .5199  .5239  .5279  .5319  .5359
0.1   .5398  .5438  .5478  .5517  .5557  .5596  .5636  .5675  .5714  .5753
0.2   .5793  .5832  .5871  .5910  .5948  .5987  .6026  .6064  .6103  .6141
0.3   .6179  .6217  .6255  .6293  .6331  .6368  .6406  .6443  .6480  .6517
0.4   .6554  .6591  .6628  .6664  .6700  .6736  .6772  .6808  .6844  .6879
0.5   .6915  .6950  .6985  .7019  .7054  .7088  .7123  .7157  .7190  .7224
0.6   .7257  .7291  .7324  .7357  .7389  .7422  .7454  .7486  .7517  .7549
0.7   .7580  .7611  .7642  .7673  .7704  .7734  .7764  .7794  .7823  .7852
0.8   .7881  .7910  .7939  .7967  .7995  .8023  .8051  .8078  .8106  .8133
0.9   .8159  .8186  .8212  .8238  .8264  .8289  .8315  .8340  .8365  .8389
1.0   .8413  .8438  .8461  .8485  .8508  .8531  .8554  .8577  .8599  .8621
1.1   .8643  .8665  .8686  .8708  .8729  .8749  .8770  .8790  .8810  .8830
1.2   .8849  .8869  .8888  .8907  .8925  .8944  .8962  .8980  .8997  .9015
1.3   .9032  .9049  .9066  .9082  .9099  .9115  .9131  .9147  .9162  .9177
1.4   .9192  .9207  .9222  .9236  .9251  .9265  .9279  .9292  .9306  .9319
1.5   .9332  .93
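Table E values can be cross-checked with the error function, since the cumulative standard normal area is $\Phi(z) = \frac{1}{2}\left[1 + \operatorname{erf}(z/\sqrt{2})\right]$. A minimal sketch using Python's standard library:

```python
from math import erf, sqrt

def phi(z):
    """Cumulative standard normal area, the quantity tabulated in Table E."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Cross-check a few table entries
print(round(phi(-1.96), 4))   # 0.025  (table row -1.9, column .06: .0250)
print(round(phi(1.00), 4))    # 0.8413 (table row 1.0, column .00: .8413)
```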