Regression Analysis

Transcript

  • 1. Regression Analysis
    • What is regression?
    • Best-fit line
    • Least square
  • 2. What is regression?
    • The study of the behavior of one variable in relation to compartments (groups) induced by another variable.
    • Using the regression line or equation, we can predict scores on the dependent variable from scores on the independent variable. Different nomenclatures exist for the independent and dependent variables.
  • 3. COMPARTMENTS
    • NON CROSS CLASSIFIED
    • AGE
    • EDUCATION
    • CROSS CLASSIFIED
    • ELDER EDUCATED
    • ELDER UNEDUCATED
    • YOUNGER EDUCATED
    • YOUNGER UNEDUCATED
  • 4. NOMENCLATURE
  • 5. 2 - WAY TABLE
  • 6. Regression lines (best-fit line)
  • 7. Change in best-fit line
  • 8. Equation for a straight line
      • Y = a + bX (simple regression)
      • Y = a + b1X1 + b2X2 + … + bnXn (multiple regression)
      • Y = predicted score
      • a = intercept (origin) of the regression line
      • b = regression coefficient: the change in the dependent variable for a 1-unit increase in the X variable
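The straight-line equations above can be sketched directly in code. This is a minimal illustration of the simple and multiple regression forms; the intercept and coefficient values used below are made-up examples, not estimates from any data.

```python
# Sketch of the regression equations Y = a + bX and
# Y = a + b1*X1 + b2*X2 + ... + bn*Xn. All numbers are illustrative.

def predict_simple(a, b, x):
    """Simple regression: one predictor X."""
    return a + b * x

def predict_multiple(a, coeffs, xs):
    """Multiple regression: intercept plus one coefficient per predictor."""
    return a + sum(b * x for b, x in zip(coeffs, xs))

y1 = predict_simple(2.0, 0.5, 10)               # 2.0 + 0.5*10 = 7.0
y2 = predict_multiple(1.0, [0.5, 2.0], [4, 3])  # 1.0 + 0.5*4 + 2.0*3 = 9.0
```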
  • 9. b coefficient estimation
    • bxy = [Sum XY − (Sum X)(Sum Y)/N] / [Sum Y² − (Sum Y)²/N]
    • Equivalently: bxy = (sum of deviation cross-products of X and Y) / (sum of squared deviations of Y)
  • 10. Estimation of ‘a’
    • axy = Mean X − bxy (Mean Y)
    • Predicting by graph
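The estimation steps in slides 9 and 10 can be checked numerically. This sketch follows the slides' bxy convention (regression of X on Y, i.e. predicting X from Y scores); the data values are made up for illustration.

```python
# Estimating bxy and axy with the raw-sums formulas from the slides,
# then predicting X for a given Y. Data are illustrative only.
X = [2.0, 4.0, 6.0, 8.0]
Y = [1.0, 2.0, 3.0, 4.0]
N = len(X)

sum_x, sum_y = sum(X), sum(Y)
sum_xy = sum(x * y for x, y in zip(X, Y))
sum_y2 = sum(y * y for y in Y)

# bxy = [Sum XY - (Sum X)(Sum Y)/N] / [Sum Y^2 - (Sum Y)^2/N]
bxy = (sum_xy - sum_x * sum_y / N) / (sum_y2 - sum_y ** 2 / N)

# axy = Mean X - bxy * Mean Y
axy = sum_x / N - bxy * (sum_y / N)

# Predicted X score for an observed Y of 2.5
x_hat = axy + bxy * 2.5
```

Here X is exactly twice Y, so the estimate recovers bxy = 2 and axy = 0, and the prediction for Y = 2.5 is X = 5.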
  • 11. Least square
    • The goal of the linear regression procedure is to fit a line through the points. Specifically, the line is computed so that the squared deviations of the observed points from it are minimized; this criterion is called least squares.
    • A fundamental principle of the least-squares method is that variation on the dependent variable can be partitioned, or divided into parts, according to the sources of variation.
  • 12. Least square estimation
    • Sum (Y − Y mean)² = Sum (Y hat − Y mean)² + Sum (Y − Y hat)²
    • a) Total sum of squared deviations of the observed values (Y) on the dependent variable from the dependent variable mean (Total SS)
    • b) Sum of squared deviations of the predicted values (Y hat) for the dependent variable from the dependent variable mean (Model SS)
    • c) Sum of squared deviations of the observed values (Y) on the dependent variable from the predicted values (Y hat), that is, the sum of the squared residuals (Error SS)
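The partition Total SS = Model SS + Error SS can be verified numerically for any least-squares line with an intercept. This sketch fits Y on X by the ordinary least-squares formulas and checks the identity; the data are made up for illustration.

```python
# Verifying the least-squares partition of variation:
# Total SS = Model SS + Error SS. Data are illustrative only.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.0, 2.5, 3.5, 4.0, 6.0]
N = len(X)
mean_x, mean_y = sum(X) / N, sum(Y) / N

# Ordinary least-squares slope and intercept for Y on X
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
     / sum((x - mean_x) ** 2 for x in X))
a = mean_y - b * mean_x
Y_hat = [a + b * x for x in X]

total_ss = sum((y - mean_y) ** 2 for y in Y)            # Sum (Y - Y mean)^2
model_ss = sum((yh - mean_y) ** 2 for yh in Y_hat)      # Sum (Y hat - Y mean)^2
error_ss = sum((y - yh) ** 2 for y, yh in zip(Y, Y_hat))  # Sum (Y - Y hat)^2

# The decomposition holds (up to floating-point rounding)
assert abs(total_ss - (model_ss + error_ss)) < 1e-9
```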