INTRODUCTION TO CORRELATION & REGRESSION
DR. PRASANT SARANGI
Introduction to Correlation Analysis
Linear Regression Analysis
Multiple Regression Analysis
• Positive Correlation
• Negative Correlation
• Linear Correlation and
• Non-linear Correlation
• Two variables are said to be positively correlated when the movement of
one variable leads to the movement of the other variable in the same
direction.
• There exists a direct relationship between the two variables.
• Correlation between two variables is said to be negative when the
movement of one variable leads to the movement in the other variable in
the opposite direction.
• Here there exists an inverse relationship between the two variables.
• The correlation between two variables is said to be linear when the
points, plotted on a graph, lie along a straight line.
• Non-linear Correlation
A relationship between two variables is said to be non-linear if a unit change
in one variable does not produce a constant change in the other variable.
If X is changed, the corresponding values of Y will not change in the same
ratio.
Methods of Measuring Correlation
• The Graphical Method
The correlation can be graphically shown by using scatter diagrams.
A scatter diagram reveals two useful pieces of information.
Firstly, through this diagram, one can observe the patterns between two
variables which indicate whether there exists some association between
the variables or not.
Secondly, if an association between the variables is found, then the nature
of the relationship between the two can be easily identified
(whether the two variables are linearly related or non-linearly related).
• Karl Pearson’s Coefficient of Correlation
Karl Pearson’s coefficient of correlation (developed in 1896) measures the
linear relationship between two variables under study.
Since the relationship is linear, the two variables change
in a fixed proportion.
This measure expresses the degree of relationship as a real
number, independent of the units in which the variables have been
expressed, and also indicates the direction of the correlation.
• Direct method
r = (n ∑XY − ∑X ∑Y) / [ √(n ∑X² − (∑X)²) · √(n ∑Y² − (∑Y)²) ]
• Assumed mean method (with deviations dx = X − A and dy = Y − B from assumed means A and B)
r = (n ∑dx dy − ∑dx ∑dy) / [ √(n ∑dx² − (∑dx)²) · √(n ∑dy² − (∑dy)²) ]
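As a concrete illustration, the direct method can be sketched in Python. The data values here are invented for illustration only, not taken from the text:

```python
# A minimal sketch of Karl Pearson's coefficient by the direct method.
# The X and Y values are illustrative, not from the lecture.
import math

X = [2, 4, 6, 8, 10]
Y = [3, 5, 4, 8, 9]
n = len(X)

sum_x, sum_y = sum(X), sum(Y)
sum_xy = sum(x * y for x, y in zip(X, Y))
sum_x2 = sum(x * x for x in X)
sum_y2 = sum(y * y for y in Y)

# r = (n SumXY - SumX SumY) / [ sqrt(n SumX^2 - (SumX)^2) * sqrt(n SumY^2 - (SumY)^2) ]
r = (n * sum_xy - sum_x * sum_y) / (
    math.sqrt(n * sum_x2 - sum_x ** 2) * math.sqrt(n * sum_y2 - sum_y ** 2)
)
print(round(r, 4))
```

A value of r close to +1, as here, indicates a strong positive (direct) relationship between the two variables.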
Properties of the Coefficient of Correlation
1. The value of the coefficient of correlation lies between −1 (minus
one) and +1 (plus one).
2. The value of the coefficient of correlation is independent of the change
of origin and change of scale of measurement.
With u = (X − A)/h and v = (Y − B)/k,
r = (n ∑uv − ∑u ∑v) / [ √(n ∑u² − (∑u)²) · √(n ∑v² − (∑v)²) ]
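The invariance property can be verified numerically; the sketch below (with invented data and arbitrarily chosen origins A, B and scales h, k) shows that r computed on the transformed series u, v equals r on the original series:

```python
# Numerical check that Pearson's r is unchanged by a change of origin and
# scale. Data, origins, and scales below are illustrative assumptions.
import math

def pearson_r(X, Y):
    n = len(X)
    sx, sy = sum(X), sum(Y)
    sxy = sum(x * y for x, y in zip(X, Y))
    sx2 = sum(x * x for x in X)
    sy2 = sum(y * y for y in Y)
    return (n * sxy - sx * sy) / (
        math.sqrt(n * sx2 - sx ** 2) * math.sqrt(n * sy2 - sy ** 2)
    )

X = [2, 4, 6, 8, 10]
Y = [3, 5, 4, 8, 9]
# u = (X - A)/h and v = (Y - B)/k with assumed origins A = 6, B = 5
# and scales h = 2, k = 1
U = [(x - 6) / 2 for x in X]
V = [(y - 5) / 1 for y in Y]
print(pearson_r(X, Y), pearson_r(U, V))  # the two values coincide
```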
Rank Correlation Coefficient
There are three different situations for applying Spearman’s rank
correlation coefficient:
• When ranks of both the variables are given
• When ranks of both the variables are not given and
• When ranks between two or more observations in a series are equal
• When Ranks of Both the Variables are Given
• When Ranks of Both the Variables are not Given
• In such cases, each observation in the series is to be ranked first.
• Whether the highest value or the lowest value is ranked 1 (one)
depends upon the decision of the researcher.
• When Ranks between Two or More Observations in a Series are Equal
• The ranks to be assigned to each observation are an average of the ranks
which these observations would have got, had they differed from each
other.
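The three cases above can be sketched in one short routine. This uses the basic Spearman formula rs = 1 − 6∑d²/(n(n² − 1)) with average ranks for ties (the tie-correction factor is omitted for simplicity); the data are invented for illustration:

```python
# Sketch of Spearman's rank correlation with average ranks for ties.
# The scores below are illustrative, not from the lecture.

def average_ranks(values):
    # Rank 1 goes to the highest value; tied observations share the
    # average of the ranks they would have received had they differed.
    order = sorted(values, reverse=True)
    return [order.index(v) + (order.count(v) + 1) / 2 for v in values]

def spearman(X, Y):
    rx, ry = average_ranks(X), average_ranks(Y)
    n = len(X)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))  # sum of squared rank differences
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

X = [85, 60, 73, 40, 90]
Y = [78, 65, 65, 50, 82]   # the two 65s are tied and share rank 3.5
print(spearman(X, Y))
```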
What do we use regression models for?
1. Estimate a relationship among economic variables, such as
y = f(x).
2. Test hypotheses
3. Forecast or predict the value of one variable, y, based
on the value of another variable, x.
Dependent and Independent Variables
Dependent variable - the variable we are trying to explain
Independent (or explanatory) variables - variables that we think cause
movements in the dependent variable
Simple Regression Model
Y = dependent variable
X = independent variable
Model is: Y = α + β X
α is the intercept or constant
β is the slope coefficient
Models that are linear in the variables and in the coefficients:
Y = α + β X
Models that are nonlinear in the variables but linear in the coefficients:
Y = α + β X²
Models that are nonlinear in the variables and in the coefficients:
Y = α + X^β
Some models that are nonlinear can be made linear in the coefficients:
Y = e^α X^β
ln Y = α + β ln X
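Taking logs turns the model Y = e^α X^β into an equation that is linear in α and β, so ordinary least squares on the logged data recovers both parameters. A sketch with synthetic, noiseless data (the true values α = 1, β = 2 are illustrative assumptions):

```python
# Linearizing Y = e^alpha * X^beta by taking logs:
# ln Y = alpha + beta * ln X, which is linear in the coefficients.
import math

alpha_true, beta_true = 1.0, 2.0        # assumed true parameters
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [math.exp(alpha_true) * x ** beta_true for x in X]

lx = [math.log(x) for x in X]
ly = [math.log(y) for y in Y]
n = len(lx)
mean_x, mean_y = sum(lx) / n, sum(ly) / n

# OLS slope and intercept on the logged data
beta_hat = sum((a - mean_x) * (b - mean_y) for a, b in zip(lx, ly)) / sum(
    (a - mean_x) ** 2 for a in lx
)
alpha_hat = mean_y - beta_hat * mean_x
print(alpha_hat, beta_hat)  # recovers the true alpha and beta
```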
E(Y|X) = α + βX
An Example showing income and average expenditure
Y is a random variable composed of two parts:
I. Systematic component: E(Y) = α+ βX
This is the mean of Y.
II. Random component: u = Y - E(Y | X)
= Y - α- βX
u is called the stochastic or random error.
Together E(Y) and u form the model:
Y = α+ βX + u
Sources of error term
• Dependent variable measured with error
• Model left out relevant variables
• Wrong functional form
• Inherent randomness of behaviour
E(Y) = α + βX
[Figure: the population regression line E(Y) = α + βX shown at values X1, X2, X3, X4]
The Estimated Model
We use the data on Y and X to come up with guesses for α and β. These
estimated parameters or coefficients are
α̂ and β̂ (“alpha hat” and “beta hat”).
Our estimated, or “fitted”, model gives the predicted value of Y for any given
Xi:
Ŷi = α̂ + β̂ Xi
The residual is the difference between the actual or observed value of Y and
the predicted value:
ûi = Yi − Ŷi = Yi − α̂ − β̂ Xi
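The whole estimation step can be sketched end to end: compute the OLS estimates α̂ and β̂, the fitted values Ŷi, and the residuals ûi. The income/expenditure-style data are invented for illustration:

```python
# Minimal sketch of the estimated simple regression model:
# OLS estimates, fitted values, and residuals. Data are illustrative.

X = [10, 20, 30, 40, 50]   # e.g. income
Y = [8, 15, 21, 31, 38]    # e.g. expenditure
n = len(X)
mean_x, mean_y = sum(X) / n, sum(Y) / n

# beta_hat = Sum (Xi - Xbar)(Yi - Ybar) / Sum (Xi - Xbar)^2
beta_hat = sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y)) / sum(
    (x - mean_x) ** 2 for x in X
)
alpha_hat = mean_y - beta_hat * mean_x

Y_hat = [alpha_hat + beta_hat * x for x in X]     # predicted values Y_hat_i
residuals = [y - yh for y, yh in zip(Y, Y_hat)]   # u_hat_i = Yi - Y_hat_i

print(alpha_hat, beta_hat)
print(sum(residuals))  # OLS residuals sum to (numerically) zero
```

The last line illustrates a standard property of least squares: the residuals from a model with an intercept always sum to zero.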