Regression Analysis



  1. 1. Regression Analysis <ul><li>What is regression? </li></ul><ul><li>Best-fit line </li></ul><ul><li>Least squares </li></ul>
  2. 2. What is regression? <ul><li>The study of how one variable behaves across the compartments (categories) defined by another variable. </li></ul><ul><li>Using the regression line or equation, we can predict scores on the dependent variable from scores on the independent variable. Several alternative names are in use for the independent and dependent variables. </li></ul>
  3. 3. COMPARTMENTS <ul><li>NON-CROSS-CLASSIFIED </li></ul><ul><li>AGE </li></ul><ul><li>EDUCATION </li></ul><ul><li>CROSS-CLASSIFIED </li></ul><ul><li>ELDER EDUCATED </li></ul><ul><li>ELDER UNEDUCATED </li></ul><ul><li>YOUNGER EDUCATED </li></ul><ul><li>YOUNGER UNEDUCATED </li></ul>
  4. 4. NOMENCLATURE
  5. 5. 2 - WAY TABLE
  6. 6. Regression lines (Best - fit line)
  7. 7. Change in best -fit line
  8. 8. Equation for a straight line <ul><ul><li>Y = a + bX (simple regression) </li></ul></ul><ul><ul><li>Y = a + b1X1 + b2X2 + … + bnXn (multiple regression) </li></ul></ul><ul><ul><li>Y = predicted score </li></ul></ul><ul><ul><li>a = intercept (origin) of the regression line </li></ul></ul><ul><ul><li>b = regression coefficient: the change in the dependent variable for a one-unit increase in the X variable </li></ul></ul>
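The two prediction equations above can be sketched in a few lines of code. This is a minimal illustration, not part of the original deck; the numeric values of a, b, and X are made up for the example.

```python
def predict_simple(a, b, x):
    """Simple regression: Y = a + bX."""
    return a + b * x

def predict_multiple(a, bs, xs):
    """Multiple regression: Y = a + b1*X1 + b2*X2 + ... + bn*Xn."""
    return a + sum(b * x for b, x in zip(bs, xs))

# Hypothetical coefficients and scores, purely for illustration:
print(predict_simple(2.0, 0.5, 10))               # 2 + 0.5*10 = 7.0
print(predict_multiple(1.0, [0.5, 2.0], [4, 3]))  # 1 + 2 + 6 = 9.0
```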
  9. 9. b coefficient estimation <ul><li>Sum XY - [(Sum X)(Sum Y)/N] </li></ul><ul><li>byx = --------------------------------------- </li></ul><ul><li>Sum X^2 - [(Sum X)^2/N] </li></ul><ul><li>Sum of deviation cross-products XY </li></ul><ul><li>byx = -------------------------------- </li></ul><ul><li>Sum of squared deviations of X </li></ul>
  10. 10. Estimation of ‘a’ <ul><li>a = Mean Y - byx (Mean X) </li></ul><ul><li>Predicting by graph </li></ul>
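The two estimation slides above (b from deviation cross-products, a from the means) can be checked numerically. A minimal sketch, assuming a small made-up data set:

```python
# Made-up example data (not from the original slides):
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)

mean_x = sum(X) / n
mean_y = sum(Y) / n

# b = sum of deviation cross-products / sum of squared X deviations
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
     / sum((x - mean_x) ** 2 for x in X))

# a = intercept: mean of Y minus b times mean of X
a = mean_y - b * mean_x

print(b, a)  # b = 0.6, a = 2.2 for this data
```

With these estimates the fitted line is Y hat = 2.2 + 0.6 X, which can then be used to predict a score for any X value.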
  11. 11. Least squares <ul><li>The goal of the linear regression procedure is to fit a line through the points. Specifically, the program computes the line that minimizes the squared deviations of the observed points from it. This procedure is called least squares. </li></ul><ul><li>A fundamental principle of the least-squares method is that variation in the dependent variable can be partitioned, or divided into parts, according to its sources. </li></ul>
  12. 12. Least squares estimation <ul><li>Σ(Y - Y mean)^2 = Σ(Y hat - Y mean)^2 + Σ(Y - Y hat)^2 </li></ul><ul><li>a) Total sum of squared deviations of the observed values (Y) from the mean of the dependent variable (Total SS) </li></ul><ul><li>b) Sum of squared deviations of the predicted values (Y hat) from the mean of the dependent variable (Model SS) </li></ul><ul><li>c) Sum of squared deviations of the observed values (Y) from the predicted values (Y hat), that is, the sum of the squared residuals (Error SS) </li></ul>
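The partition Total SS = Model SS + Error SS can be verified numerically for a least-squares fit. A sketch using the same kind of made-up data as before (the identity holds exactly only for the least-squares line):

```python
# Made-up example data (illustrative only):
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
mean_x = sum(X) / n
mean_y = sum(Y) / n

# Least-squares estimates of b and a (deviation form):
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
     / sum((x - mean_x) ** 2 for x in X))
a = mean_y - b * mean_x
Y_hat = [a + b * x for x in X]

# The three sums of squares from the slide:
total_ss = sum((y - mean_y) ** 2 for y in Y)                    # Σ(Y - Ȳ)²
model_ss = sum((yh - mean_y) ** 2 for yh in Y_hat)              # Σ(Ŷ - Ȳ)²
error_ss = sum((y - yh) ** 2 for y, yh in zip(Y, Y_hat))        # Σ(Y - Ŷ)²

print(total_ss, model_ss, error_ss)  # 6.0, 3.6, 2.4 for this data
# Partition check: Total SS = Model SS + Error SS
assert abs(total_ss - (model_ss + error_ss)) < 1e-9
```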
