Correlation and regression


  1. Correlation and Regression
  2. Correlation
  3. What is correlation? A statistical method used to determine whether a linear relationship exists between two variables.
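As a sketch of measuring such a linear relationship, the Pearson correlation coefficient can be computed with NumPy (the data below are made up for illustration):

```python
import numpy as np

# Hypothetical data with a roughly linear, increasing relationship
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# np.corrcoef returns the 2x2 correlation matrix;
# the off-diagonal entry is the Pearson coefficient r
r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))
```

A value of r near +1 indicates a strong positive linear relationship, near -1 a strong negative one, and near 0 little or no linear relationship.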
  4. Correlation and dependence. The two go hand in hand: dependence refers to any statistical relationship between two random variables or two sets of data, while correlation refers to any of a broad class of statistical relationships involving dependence.
  5. Correlation and causality. The conventional dictum that "correlation does not imply causation" means that correlation cannot be used to infer a causal relationship between the variables. This dictum should not be taken to mean that correlations cannot indicate the potential existence of causal relations.
  6. Correlation and causality example. A correlation between age and height in children is fairly causally transparent, but a correlation between mood and health in people is less so. Does improved mood lead to improved health, or does good health lead to good mood, or both? Or does some other factor underlie both? In other words, a correlation can be taken as evidence for a possible causal relationship, but cannot indicate what the causal relationship, if any, might be.
  7. Correlation and linearity. The Pearson correlation coefficient indicates the strength of a linear relationship between two variables, but its value generally does not completely characterize their relationship. In particular, if the conditional mean of Y given X, denoted E(Y|X), is not linear in X, the correlation coefficient will not fully determine the form of E(Y|X).
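A small illustration of this point with made-up data: here Y is completely determined by X, yet the Pearson coefficient is essentially zero, because E(Y|X) = X² is not linear in X.

```python
import numpy as np

x = np.linspace(-3, 3, 201)  # symmetric around zero
y = x ** 2                   # Y is a deterministic function of X, but not linear

# Perfect dependence, yet near-zero linear correlation
r = np.corrcoef(x, y)[0, 1]
print(round(r, 6))
```

This is why a near-zero r means "no linear relationship", not "no relationship".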
  8. (Image-only slide; no text content.)
  9. Regression
  10. Regression analysis. Regression analysis helps one understand how the typical value of the dependent variable changes when any one of the independent variables is varied while the other independent variables are held fixed.
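A minimal sketch of this "held fixed" interpretation, using NumPy's least-squares solver on made-up data with two independent variables (the variable names and true coefficients are assumptions for illustration):

```python
import numpy as np

# Hypothetical data: y depends on two independent variables, x1 and x2
rng = np.random.default_rng(0)
x1 = rng.uniform(0, 10, 100)
x2 = rng.uniform(0, 10, 100)
y = 3.0 + 2.0 * x1 - 0.5 * x2  # noiseless, so the fit recovers these exactly

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x1), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 3))
```

Each fitted slope is the change in y per unit change in that variable, with the other variable held fixed.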
  11. Regression models. A regression model has three components: the unknown parameters, denoted β, which may be a scalar or a vector; the independent variables, X; and the dependent variable, Y.
  12. Linear regression. The model specification is that the dependent variable, y, is a linear combination of the parameters (but need not be linear in the independent variables). For example, in simple linear regression for modeling n data points there is one independent variable, xi, and two parameters, β0 and β1, giving the straight line yi = β0 + β1xi.
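The two parameters of the straight line have a closed-form least-squares solution, sketched here on hypothetical data points:

```python
import numpy as np

# Hypothetical (x_i, y_i) data points, roughly on a line
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Closed-form least-squares estimates:
# slope beta1 = Sxy / Sxx, intercept beta0 = y_bar - beta1 * x_bar
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()
print(round(beta0, 3), round(beta1, 3))
```

Because the model is linear in β0 and β1, no iteration is needed; the estimates come directly from these formulas.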
  13. Nonlinear regression. When the model function is not linear in the parameters, the sum of squares must be minimized by an iterative procedure. This introduces many complications, which are summarized under the differences between linear and nonlinear least squares.
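As a sketch of such an iterative fit, SciPy's `curve_fit` can minimize the sum of squares for a model that is nonlinear in its parameters (the exponential model and its parameter values below are assumptions for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model y = a * exp(b * x); a and b enter non-linearly,
# so the sum of squares is minimized iteratively from a starting guess
def model(x, a, b):
    return a * np.exp(b * x)

x = np.linspace(0, 2, 50)
y = model(x, 2.0, 1.5)  # noiseless data generated from known parameters

params, _ = curve_fit(model, x, y, p0=(1.0, 1.0))
print(np.round(params, 3))
```

Unlike the linear case, the result can depend on the starting guess `p0`, and the iteration may converge to a local minimum, one of the complications mentioned above.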
