
- 1. Chapter 14 Multiple Regression Models
- 2. Multiple Regression Models
  A general additive multiple regression model, which relates a dependent variable y to k predictor variables x₁, x₂, …, xₖ, is given by the model equation
  y = α + β₁x₁ + β₂x₂ + … + βₖxₖ + e
- 3. Multiple Regression Models
  The random deviation e is assumed to be normally distributed with mean value 0 and variance σ² for any particular values of x₁, x₂, …, xₖ.
  This implies that for fixed x₁, x₂, …, xₖ values, y has a normal distribution with variance σ², and
  (mean y value for fixed x₁, x₂, …, xₖ values) = α + β₁x₁ + β₂x₂ + … + βₖxₖ
- 4. Multiple Regression Models
  The βᵢ's are called population regression coefficients; each βᵢ can be interpreted as the true average change in y when the predictor xᵢ increases by 1 unit and the values of all the other predictors remain fixed.
  The deterministic portion α + β₁x₁ + β₂x₂ + … + βₖxₖ is called the population regression function.
- 5. Polynomial Regression Models
  The kth-degree polynomial regression model
  y = α + β₁x + β₂x² + … + βₖxᵏ + e
  is a special case of the general multiple regression model with x₁ = x, x₂ = x², …, xₖ = xᵏ.
  The population regression function (mean value of y for fixed values of the predictors) is
  α + β₁x + β₂x² + … + βₖxᵏ
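The substitution x₁ = x, x₂ = x², …, xₖ = xᵏ can be sketched in a few lines of Python; `polynomial_features` is a hypothetical helper name, not part of any library.

```python
def polynomial_features(x_values, k):
    """Expand a single predictor x into the k columns x, x^2, ..., x^k,
    turning a degree-k polynomial model into a multiple regression model."""
    return [[x ** power for power in range(1, k + 1)] for x in x_values]

# Each row now holds (x, x^2, x^3) for one observation.
rows = polynomial_features([1.0, 2.0, 3.0], k=3)
```

The expanded rows can then be fed to any multiple-regression fitting routine as if they were k separate predictors.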
- 6. Polynomial Regression Models
  The most important special case other than simple linear regression (k = 1) is the quadratic regression model
  y = α + β₁x + β₂x² + e.
  This model replaces the line y = α + βx with a parabolic curve of mean values α + β₁x + β₂x².
  If β₂ > 0, the curve opens upward, whereas if β₂ < 0, the curve opens downward.
- 7. Interaction
  If the change in the mean y value associated with a 1-unit increase in one independent variable depends on the value of a second independent variable, there is interaction between these two variables. When the variables are denoted by x₁ and x₂, such interaction can be modeled by including x₁x₂, the product of the variables that interact, as a predictor variable.
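Adding the product column is mechanical; a minimal sketch (the helper name `add_interaction` is hypothetical):

```python
def add_interaction(rows):
    """Given rows of (x1, x2), append the product x1*x2 as a third
    predictor so the fitted model can include an interaction term."""
    return [[x1, x2, x1 * x2] for x1, x2 in rows]

expanded = add_interaction([(2.0, 3.0), (1.0, 4.0)])
```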
- 8. Qualitative Predictor Variables
  Up to now, we have only considered the inclusion of quantitative (numerical) predictor variables in a multiple regression model.
  Two other types are very common:
  - Dichotomous variable: one with just two possible categories, coded 0 and 1
    Examples:
    - Gender {male, female}
    - Marriage status {married, not married}
- 9. Qualitative Predictor Variables
  - Ordinal variables: categorical variables that have a natural ordering
    - Activity level {light, moderate, heavy}, coded respectively as 1, 2, and 3
    - Education level {none, elementary, secondary, college, graduate}, coded respectively 1, 2, 3, 4, 5 (or, for that matter, any 5 consecutive integers)
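The two coding schemes can be sketched as follows; both helper names are hypothetical, introduced only for illustration.

```python
def code_dichotomous(values, one_category):
    """0/1 indicator coding for a two-category variable."""
    return [1 if v == one_category else 0 for v in values]

def code_ordinal(values, ordering):
    """Consecutive-integer coding (1, 2, 3, ...) respecting a natural order."""
    return [ordering.index(v) + 1 for v in values]

genders = code_dichotomous(["male", "female", "female"], one_category="female")
activity = code_ordinal(["light", "heavy", "moderate"],
                        ordering=["light", "moderate", "heavy"])
```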
- 10. Least Squares Estimates
  According to the principle of least squares, the fit of a particular estimated regression function a + b₁x₁ + b₂x₂ + … + bₖxₖ to the observed data is measured by the sum of squared deviations between the observed y values and the y values predicted by the estimated function:
  Σ[y − (a + b₁x₁ + b₂x₂ + … + bₖxₖ)]²
  The least squares estimates of α, β₁, β₂, …, βₖ are those values of a, b₁, b₂, …, bₖ that make this sum of squared deviations as small as possible.
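The least squares criterion itself is easy to compute directly; a minimal sketch of the sum being minimized (toy data, hypothetical helper name):

```python
def sum_squared_deviations(a, b, rows, y_values):
    """Sum of [y - (a + b1*x1 + ... + bk*xk)]^2, the criterion that the
    least squares estimates minimize over all choices of a, b1, ..., bk."""
    total = 0.0
    for xs, y in zip(rows, y_values):
        predicted = a + sum(bi * xi for bi, xi in zip(b, xs))
        total += (y - predicted) ** 2
    return total

# Toy data generated by y = 1 + 2*x1 exactly, so these coefficients
# drive the sum of squared deviations to zero.
sse = sum_squared_deviations(1.0, [2.0], [[1.0], [2.0]], [3.0, 5.0])
```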
- 11. Predicted Values & Residuals
  The first predicted value is obtained by taking the values of the predictor variables x₁, x₂, …, xₖ for the first sample observation and substituting these values into the estimated regression function. Doing this successively for the remaining observations yields the predicted values (sometimes referred to as the fitted values or fits).
- 12. Predicted Values & Residuals The residuals are then the differences between the observed and predicted y values.
- 13. Sums of Squares
  The residual (or error) sum of squares, SSResid, and total sum of squares, SSTo, are given by
  SSResid = Σ(y − ŷ)²   SSTo = Σ(y − ȳ)²
  where ȳ is the mean of the y observations in the sample.
  The number of degrees of freedom associated with SSResid is n − (k + 1), because k + 1 df are lost in estimating the k + 1 coefficients α, β₁, β₂, …, βₖ.
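Both sums of squares follow directly from the definitions above; a minimal sketch with toy observed and fitted values (the helper name is hypothetical):

```python
def sums_of_squares(y_values, fitted):
    """Return (SSResid, SSTo): residual and total sums of squares."""
    y_bar = sum(y_values) / len(y_values)          # mean of the y observations
    ss_resid = sum((y - f) ** 2 for y, f in zip(y_values, fitted))
    ss_to = sum((y - y_bar) ** 2 for y in y_values)
    return ss_resid, ss_to

ss_resid, ss_to = sums_of_squares([1.0, 2.0, 3.0], [1.1, 1.9, 3.0])
```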
- 14. Estimate for σ²
  An estimate of the random deviation variance σ² is given by
  sₑ² = SSResid / (n − (k + 1))
  and sₑ is the estimate of σ.
- 15. Coefficient of Multiple Determination, R²
  The coefficient of multiple determination, R², interpreted as the proportion of variation in observed y values that is explained by the fitted model, is
  R² = 1 − SSResid/SSTo
- 16. Adjusted R²
  Generally, a model with a large R² and small sₑ is desirable. If a large number of variables (relative to the number of data points) is used, those conditions may be satisfied, but the model will be unrealistic and difficult to interpret.
- 17. Adjusted R²
  To sort out this problem, computer packages sometimes compute a quantity called the adjusted R²:
  adjusted R² = 1 − [(n − 1) / (n − (k + 1))] · (SSResid/SSTo)
  Notice that when a large number of variables are used to build the model, this value will be substantially lower than R² and give a better indication of the usability of the model.
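Both quantities can be checked against the worked example later in the chapter: the final model's ANOVA table (slide 33) reports SSResid = 7.0151 and SSTo = 41.8399 with n = 40 and k = 4, and Minitab prints R-Sq = 83.2% and R-Sq(adj) = 81.3%. A minimal sketch (hypothetical helper name):

```python
def r_squared(ss_resid, ss_to, n, k):
    """Return (R^2, adjusted R^2) computed from the sums of squares."""
    ratio = ss_resid / ss_to
    r2 = 1.0 - ratio
    adj_r2 = 1.0 - (n - 1) / (n - (k + 1)) * ratio
    return r2, adj_r2

# Values from the lung-capacity example's final model (slide 33).
r2, adj_r2 = r_squared(ss_resid=7.0151, ss_to=41.8399, n=40, k=4)
# r2 is about 0.832 and adj_r2 about 0.813, matching the Minitab output.
```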
- 18. F Distributions
  F distributions are similar to chi-square distributions, but have two parameters, df_num and df_den (the numerator and denominator degrees of freedom).
- 19. The F Test for Model Utility
  The regression sum of squares, denoted by SSReg, is defined by
  SSReg = SSTo − SSResid
- 20. The F Test for Model Utility
  When all k of the βᵢ's are zero in the model
  y = α + β₁x₁ + β₂x₂ + … + βₖxₖ + e
  and when the distribution of e is normal with mean 0 and variance σ² for any particular values of x₁, x₂, …, xₖ, the statistic
  F = (SSReg/k) / (SSResid/(n − (k + 1)))
  has an F probability distribution based on k numerator df and n − (k + 1) denominator df.
- 21. The F Test for Utility of the Model y = α + β₁x₁ + β₂x₂ + … + βₖxₖ + e
  Null hypothesis:
  H₀: β₁ = β₂ = … = βₖ = 0
  (There is no useful linear relationship between y and any of the predictors.)
  Alternate hypothesis:
  Hₐ: at least one among β₁, β₂, …, βₖ is not zero
  (There is a useful linear relationship between y and at least one of the predictors.)
- 22. The F Test for Utility of the Model y = α + β₁x₁ + β₂x₂ + … + βₖxₖ + e
  Test statistic:
  F = (SSReg/k) / (SSResid/(n − (k + 1)))
  An alternate formula:
  F = (R²/k) / ((1 − R²)/(n − (k + 1)))
- 23. The F Test for Utility of the Model y = α + β₁x₁ + β₂x₂ + … + βₖxₖ + e
  The test is upper-tailed, and the information in the table of values that captures specified upper-tail F curve areas is used to obtain a bound or bounds on the P-value, using numerator df = k and denominator df = n − (k + 1).
  Assumptions: for any particular combination of predictor variable values, the distribution of e, the random deviation, is normal with mean 0 and constant variance.
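The F statistic can likewise be verified against the worked example's ANOVA table (slide 33), which reports SSTo = 41.8399, SSResid = 7.0151, n = 40, k = 4, and F = 43.44. A minimal sketch (hypothetical helper name):

```python
def f_statistic(ss_to, ss_resid, n, k):
    """Model-utility F statistic: (SSReg/k) / (SSResid/(n - (k + 1)))."""
    ss_reg = ss_to - ss_resid
    return (ss_reg / k) / (ss_resid / (n - (k + 1)))

# Values from the lung-capacity example's ANOVA table (slide 33);
# the result is about 43.44, matching the Minitab output.
f = f_statistic(ss_to=41.8399, ss_resid=7.0151, n=40, k=4)
```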
- 24. Example
  A number of years ago, a group of college professors teaching statistics met at an NSF program and put together a sample student research project. They attempted to create a model to explain lung capacity in terms of a number of variables. Specifically,
  Numerical variables: height, age, weight, waist
  Categorical variables: gender, activity level, and smoking status
- 25. Example
  They managed to sample 41 subjects and obtain/measure the variables.
  There was some discussion, and many felt that the calculated variable (height)(waist)² would be useful since it would likely be proportional to the volume of the individual.
  The initial regression analysis performed with Minitab appears on the next slide.
- 26. Example: Linear Model with All Numerical Variables
  The regression equation is
  Capacity = - 13.0 - 0.0158 Age + 0.232 Height - 0.00064 Weight - 0.0029 Chest
             + 0.101 Waist - 0.000018 hw2
  40 cases used, 1 case contains missing values
  Predictor      Coef          SE Coef       T       P
  Constant     -13.016         2.865        -4.54    0.000
  Age           -0.015801      0.007847     -2.01    0.052
  Height         0.23215       0.02895       8.02    0.000
  Weight        -0.000639      0.006542     -0.10    0.923
  Chest         -0.00294       0.06491      -0.05    0.964
  Waist          0.10068       0.09427       1.07    0.293
  hw2           -0.00001814    0.00001761   -1.03    0.310
  S = 0.5260   R-Sq = 78.2%   R-Sq(adj) = 74.2%
- 27. Example
  The only coefficient that appeared to be significant at the 5% level was the height. Since the P-value for the coefficient on the age was very close to 5% (5.2%), it was decided that a linear model with the two independent variables height and age would be calculated.
  The resulting model is on the next slide.
- 28. Example: Linear Model with Variables Height & Age
  The regression equation is
  Capacity = - 10.2 + 0.215 Height - 0.0133 Age
  40 cases used, 1 case contains missing values
  Predictor      Coef         SE Coef      T       P
  Constant     -10.217        1.272       -8.03    0.000
  Height         0.21481      0.01921     11.18    0.000
  Age           -0.013322     0.005861    -2.27    0.029
  S = 0.5073   R-Sq = 77.2%   R-Sq(adj) = 76.0%
  Notice that even though the R² value decreases slightly, the adjusted R² value actually increases. Also note that the coefficient on Age is now significant at 5%.
- 29. Example
  In an attempt to determine whether incorporating the categorical variables into the model would significantly enhance it:
  - Gender was coded as an indicator variable (male = 0 and female = 1),
  - Smoking was coded as an indicator variable (no = 0 and yes = 1), and
  - Activity level (light, moderate, heavy) was coded respectively as 1, 2, and 3.
  The resulting Minitab output is given on the next slide.
- 30. Example: Linear Model with Categorical Variables Added
  The regression equation is
  Capacity = - 7.58 + 0.171 Height - 0.0113 Age - 0.383 C-Gender
             + 0.260 C-Activity - 0.289 C-Smoke
  37 cases used, 4 cases contain missing values
  Predictor      Coef         SE Coef      T       P
  Constant      -7.584        2.005       -3.78    0.001
  Height         0.17076      0.02919      5.85    0.000
  Age           -0.011261     0.005908    -1.91    0.066
  C-Gender      -0.3827       0.2505      -1.53    0.137
  C-Activi       0.2600       0.1210       2.15    0.040
  C-Smoke       -0.2885       0.2126      -1.36    0.185
  S = 0.4596   R-Sq = 84.2%   R-Sq(adj) = 81.7%
- 31. Example
  It was noted that the coefficients for the coded indicator variables gender and smoking were not significant, but after considerable discussion, the group felt that a number of the variables were highly related. This, the group felt, was confounding the study. Since the study was small, a stepwise regression was run in an attempt to determine a reasonably optimal subgroup of the variables to keep in the study. The variables Height, Age, Coded Activity, and Coded Gender were kept, and the following model was obtained.
- 32. Example: Linear Model with Height, Age & Coded Activity and Gender
  The regression equation is
  Capacity = - 6.93 + 0.161 Height - 0.0137 Age + 0.302 C-Activity - 0.466 C-Gender
  40 cases used, 1 case contains missing values
  Predictor      Coef         SE Coef      T       P
  Constant      -6.929        1.708       -4.06    0.000
  Height         0.16079      0.02454      6.55    0.000
  Age           -0.013744     0.005404    -2.54    0.016
  C-Activi       0.3025       0.1133       2.67    0.011
  C-Gender      -0.4658       0.2082      -2.24    0.032
  S = 0.4477   R-Sq = 83.2%   R-Sq(adj) = 81.3%
- 33. Example: Linear Model with Height, Age & Coded Activity and Gender
  The rest of the Minitab output is given below.
  Analysis of Variance
  Source           DF      SS        MS       F       P
  Regression        4     34.8249    8.7062   43.44   0.000
  Residual Error   35      7.0151    0.2004
  Total            39     41.8399
  Source     DF   Seq SS
  Height      1   30.9878
  Age         1    1.3296
  C-Activi    1    1.5041
  C-Gender    1    1.0034
  Unusual Observations
  Obs   Height   Capacity   Fit      SE Fit   Residual   St Resid
   4    66.0     2.2000     3.2039   0.1352   -1.0039    -2.35R
  23    74.0     5.7000     4.7635   0.2048    0.9365     2.35R
  39    70.0     5.4000     4.4228   0.1064    0.9772     2.25R
  R denotes an observation with a large standardized residual
- 34. Example: Linear Model with Height, Age & Coded Activity and Gender
  All of the coefficients in this model were significant at the 5% level, and the R² and adjusted R² were both fairly large.
  This appeared to be a reasonable model for describing lung capacity, even though the study was limited by sample size and by measurement limitations due to antique equipment.
  Minitab identified 3 outliers (because their standardized residuals were unusually large). Various plots of the standardized residuals are produced on the next few slides, with comments.
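Minitab's flagging rule can be mimicked directly: an observation is unusual when its standardized residual is large in absolute value (the 2.0 cutoff and helper name here are assumptions for illustration, not Minitab's API).

```python
def flag_large_standardized_residuals(st_residuals, cutoff=2.0):
    """Indices of observations whose standardized residual exceeds
    the cutoff in absolute value, mimicking Minitab's 'R' flag."""
    return [i for i, r in enumerate(st_residuals) if abs(r) > cutoff]

# The standardized residuals reported for observations 4, 23, and 39,
# plus one unremarkable value; only the first three are flagged.
flagged = flag_large_standardized_residuals([-2.35, 2.35, 2.25, 0.8])
```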
- 35. Example: Linear Model with Height, Age & Coded Activity and Gender
  The histogram of the residuals appears to be consistent with the assumption that the residuals are a sample from a normal distribution.
- 36. Example: Linear Model with Height, Age & Coded Activity and Gender
  The normality plot also tends to indicate that the residuals can reasonably be thought of as a sample from a normal distribution.
- 37. Example: Linear Model with Height, Age & Coded Activity and Gender
  The residual plot also tends to indicate that the model assumptions are not unreasonable, although there would be some concern that the residuals are predominantly positive for smaller fitted lung capacities.
