REGRESSION ANALYSIS
ECONOMETRICS
By
Muthama
JKUAT
REGRESSION ANALYSIS
PRESENTED BY
JAPHETH MUTINDA MUTHAMA
PRESENTED TO
PROFESSOR NAMUSONGE
JKUAT - KENYA
MEANING OF REGRESSION:
The dictionary meaning of the word Regression is ‘Stepping back’ or ‘Going back’.
Regression is the measure of the average relationship between two or more variables in
terms of the original units of the data.
It attempts to establish the functional relationship between the variables and thereby
provide a mechanism for prediction or forecasting.
It describes the relationship between two (or more) variables.
Regression analysis uses data to identify relationships among variables by applying
regression models.
The relationship between the variables, e.g. X and Y, can be used to make predictions
about them.
The ’independent’ variable X is usually called the regressor (there may be one
or more of these); the ’dependent’ variable Y is the response variable.
Regression
Regression is thus often used to investigate a presumed causal relationship, although strictly it establishes association rather than causation.
If the independent variable(s) sufficiently explain the variation in the dependent
variable, the model can be used for prediction.
[Figure: scatter plot of the dependent variable (Y) against the independent variable (X).]
APPLICATION OF REGRESSION ANALYSIS IN RESEARCH
i. It helps in the formulation and determination of functional
relationship between two or more variables.
ii. It helps in establishing a cause and effect relationship between
two variables in economics and business research.
iii. It helps in predicting and estimating the value of a dependent
variable such as price, production or sales.
iv. It helps to measure the variability or spread of values of a
dependent variable with respect to the regression line.
USE OF REGRESSION IN ORGANIZATIONS
In business, regression is widely used in:
•Predicting future production
•Investment analysis
•Forecasting sales, etc.
It is also used in sociological studies and economic planning to project
population, birth rates and death rates.
So the success of a business depends on the correctness of the
various estimates that it is required to make.
METHODS OF STUDYING REGRESSION:
Regression can be studied graphically (scatter diagram and regression line) or algebraically.
Algebraic methods
1. Least Squares Method:
The regression equation of X on Y is:
X = a + bY
where X is the dependent variable and Y is the independent variable.
The regression equation of Y on X is:
Y = a + bX
where Y is the dependent variable and X is the independent variable.
Simple Linear Regression
[Figure: scatter plot of the dependent variable (y) against the independent variable (x), with a fitted straight line.]
The output of a regression is a function that predicts the dependent
variable based upon values of the independent variables.
Simple regression fits a straight line to the data.
y = a + bX ± є
where a is the y-intercept, b is the slope (b = ∆y/∆x), and є is the error term.
The output of a simple regression is the coefficient β and the constant A.
The equation is then:
y = A + β * x + ε
where ε is the residual error.
β is the per unit change in the dependent variable for each unit change in
the independent variable. Mathematically:
β = ∆y / ∆x
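As a quick, non-authoritative illustration of this calculation (not part of the original slides), the sketch below estimates the slope as Σ(x − x̄)(y − ȳ) / Σ(x − x̄)² and the intercept as ȳ − β·x̄ using Python and NumPy. The data values are invented purely for illustration.

```python
# Minimal sketch of simple linear regression y = A + B*x + e
# using the closed-form least-squares estimates.
# The data below are made up purely for illustration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()
B = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # slope (change in y per unit x)
A = y_bar - B * x_bar                                             # intercept
residuals = y - (A + B * x)                                       # e for each case

print(f"y = {A:.3f} + {B:.3f} x, residual sum of squares = {np.sum(residuals**2):.3f}")
```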
Multiple Linear Regression
More than one independent variable can be used to explain variance in
the dependent variable, as long as they are not linearly related.
A multiple regression takes the form:
y = A + β1X1 + β2X2 + … + βkXk + ε
where k is the number of independent variables (parameters).
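A hedged sketch of how such a multiple regression might be estimated with NumPy's least-squares solver; the two predictors, their true coefficients and the noise level below are assumptions made up for the example.

```python
# Sketch of multiple linear regression y = A + b1*x1 + b2*x2 + e,
# estimated with numpy's least-squares solver on invented data.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

X = np.column_stack([np.ones(n), x1, x2])          # design matrix: intercept, x1, x2
coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)  # least-squares estimates
A, b1, b2 = coef
print(f"A = {A:.2f}, b1 = {b1:.2f}, b2 = {b2:.2f}")
```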
Multicollinearity
Multicollinearity is a condition in which at least 2 independent variables
are highly linearly correlated. It makes the individual coefficient estimates unstable and can cause numerical problems in estimation.
Example correlation table:
      Y      X1     X2
Y     1.000
X1    0.802  1.000
X2    0.848  0.578  1.000
A correlation table can suggest which independent variables may be
significant. Generally, an independent variable that has more than a 0.3 correlation
with the dependent variable and less than 0.7 with any other independent variable
can be included as a possible predictor.
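The screening rule above can be sketched in a few lines of Python; the variable names X1 and X2 and the 0.3/0.7 thresholds simply restate the rule of thumb, applied to invented data.

```python
# Sketch of the correlation screening rule: keep an independent variable
# if |corr with Y| > 0.3 and |corr with every already-kept predictor| < 0.7.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)      # moderately related to x1
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

corr = np.corrcoef(np.column_stack([y, x1, x2]), rowvar=False)  # 3x3 matrix: Y, X1, X2
print(np.round(corr, 3))

columns = {"X1": 1, "X2": 2}
kept = []
for name, j in columns.items():
    strong_with_y = abs(corr[0, j]) > 0.3
    collinear = any(abs(corr[j, columns[k]]) > 0.7 for k in kept)
    if strong_with_y and not collinear:
        kept.append(name)
print("Possible predictors:", kept)
```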
Nonlinear Regression
Nonlinear functions can also be fit as regressions. Common
choices include Power, Logarithmic, Exponential, and Logistic,
but any continuous function can be used.
Example 1: From the following data, obtain the regression equations using the method of least squares.
X: 3 2 7 4 8
Y: 6 1 8 5 9
Solution:
X     Y     XY    X²    Y²
3     6     18     9    36
2     1      2     4     1
7     8     56    49    64
4     5     20    16    25
8     9     72    64    81
ΣX = 24   ΣY = 29   ΣXY = 168   ΣX² = 142   ΣY² = 207
The normal equations for Y on X are:
ΣY = na + bΣX
ΣXY = aΣX + bΣX²
Substituting the values from the table we get:
29 = 5a + 24b …………………(i)
168 = 24a + 142b, which on dividing by 2 gives
84 = 12a + 71b ………………..(ii)
Multiplying equation (i) by 12 and equation (ii) by 5:
348 = 60a + 288b ………………(iii)
420 = 60a + 355b ………………(iv)
Solving equations (iii) and (iv) we get a = 0.66 and b = 1.07.
Putting the values of a and b in the regression equation of Y on X we get:
Y = 0.66 + 1.07X
Now, to find the regression equation of X on Y, the two normal equations are:
ΣX = na + bΣY
ΣXY = aΣY + bΣY²
Substituting the values in the equations we get:
24 = 5a + 29b ………………………(i)
168 = 29a + 207b …………………..(ii)
Multiplying equation (i) by 29 and equation (ii) by 5 and solving, we get a = 0.49 and b = 0.74.
Substituting the values of a and b in the regression equation of X on Y:
X = 0.49 + 0.74Y
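As an optional check (not part of the original slides), the two pairs of normal equations can be solved numerically with NumPy; small differences from the hand-calculated 0.66 are due to rounding b to 1.07 before computing a.

```python
# Solve the normal equations from Example 1 directly.
import numpy as np

X = np.array([3.0, 2.0, 7.0, 4.0, 8.0])
Y = np.array([6.0, 1.0, 8.0, 5.0, 9.0])
n = len(X)

# Y on X:  sum(Y) = n*a + b*sum(X),  sum(XY) = a*sum(X) + b*sum(X^2)
a_yx, b_yx = np.linalg.solve([[n, X.sum()], [X.sum(), (X**2).sum()]],
                             [Y.sum(), (X * Y).sum()])
# X on Y:  sum(X) = n*a + b*sum(Y),  sum(XY) = a*sum(Y) + b*sum(Y^2)
a_xy, b_xy = np.linalg.solve([[n, Y.sum()], [Y.sum(), (Y**2).sum()]],
                             [X.sum(), (X * Y).sum()])

print(f"Y = {a_yx:.2f} + {b_yx:.2f} X")   # compare with Y = 0.66 + 1.07X above
print(f"X = {a_xy:.2f} + {b_xy:.2f} Y")   # compare with X = 0.49 + 0.74Y above
```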
2. Deviation from the arithmetic mean method:
The calculations by the least squares method are quite cumbersome when the values of X and Y are large, so the work can be simplified by using this method.
The formulae for the calculation of the regression equations by this method are:
Regression equation of X on Y:  X − X̄ = bxy(Y − Ȳ)
Regression equation of Y on X:  Y − Ȳ = byx(X − X̄)
where bxy = Σxy / Σy² and byx = Σxy / Σx² are the regression coefficients,
with x = X − X̄ and y = Y − Ȳ.
Example 2: From the previous data, obtain the regression equations by taking deviations from the actual means of the X and Y series.
X: 3 2 7 4 8
Y: 6 1 8 5 9
Solution:
X    Y    x = X − X̄   y = Y − Ȳ     x²      y²      xy
3    6      −1.8         0.2        3.24    0.04   −0.36
2    1      −2.8        −4.8        7.84   23.04   13.44
7    8       2.2         2.2        4.84    4.84    4.84
4    5      −0.8        −0.8        0.64    0.64    0.64
8    9       3.2         3.2       10.24   10.24   10.24
ΣX = 24   ΣY = 29   Σx = 0   Σy = 0   Σx² = 26.8   Σy² = 38.8   Σxy = 28.8
The regression equation of X on Y is  X − X̄ = bxy(Y − Ȳ)  ………….(I)
bxy = Σxy / Σy² = 28.8 / 38.8 = 0.74
X − 4.8 = 0.74(Y − 5.8)
X = 0.74Y + 0.49
The regression equation of Y on X is  Y − Ȳ = byx(X − X̄)  ………….(II)
byx = Σxy / Σx² = 28.8 / 26.8 = 1.07
Y − 5.8 = 1.07(X − 4.8)
Y = 1.07X + 0.66
It will be observed that these regression equations are the same as
those obtained by the direct (least squares) method.
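For readers who want to verify the arithmetic, here is a short sketch (not from the slides) of the deviation-from-mean formulas bxy = Σxy/Σy² and byx = Σxy/Σx² applied to the same five observations.

```python
# Deviation-from-arithmetic-mean method on the data of Example 2.
import numpy as np

X = np.array([3.0, 2.0, 7.0, 4.0, 8.0])
Y = np.array([6.0, 1.0, 8.0, 5.0, 9.0])

x = X - X.mean()                      # deviations from X-bar = 4.8
y = Y - Y.mean()                      # deviations from Y-bar = 5.8

b_xy = np.sum(x * y) / np.sum(y**2)   # 28.8 / 38.8, about 0.74
b_yx = np.sum(x * y) / np.sum(x**2)   # 28.8 / 26.8, about 1.07

print(f"X on Y: X = {X.mean() - b_xy * Y.mean():.2f} + {b_xy:.2f} Y")
print(f"Y on X: Y = {Y.mean() - b_yx * X.mean():.2f} + {b_yx:.2f} X")
```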
3. Deviation from assumed mean method:
When the actual means of the X and Y variables are in fractions, the
calculations can be simplified by taking deviations from an assumed mean.
The regression equation of X on Y:  X − X̄ = bxy(Y − Ȳ)
The regression equation of Y on X:  Y − Ȳ = byx(X − X̄)
But here the values of bxy and byx are calculated from the deviations dx and dy from the assumed means, using the following formulae:
bxy = [NΣdxdy − (Σdx)(Σdy)] / [NΣdy² − (Σdy)²]
byx = [NΣdxdy − (Σdx)(Σdy)] / [NΣdx² − (Σdx)²]
Example 3: From the data given in the previous example, calculate the regression equations by assuming 7 as the mean of the X series and 6 as the mean of the Y series.
Solution:
X    Y    dx = X − 7   dx²   dy = Y − 6   dy²   dxdy
3    6       −4         16       0          0      0
2    1       −5         25      −5         25    +25
7    8        0          0       2          4      0
4    5       −3          9      −1          1     +3
8    9        1          1       3          9     +3
ΣX = 24   ΣY = 29   Σdx = −11   Σdx² = 51   Σdy = −1   Σdy² = 39   Σdxdy = 31
The regression coefficient of X on Y:
bxy = [NΣdxdy − (Σdx)(Σdy)] / [NΣdy² − (Σdy)²]
    = [5(31) − (−11)(−1)] / [5(39) − (−1)²]
    = (155 − 11) / (195 − 1)
    = 144 / 194 = 0.74
X̄ = ΣX / N = 24 / 5 = 4.8   and   Ȳ = ΣY / N = 29 / 5 = 5.8
The regression equation of X on Y:
X − X̄ = bxy(Y − Ȳ)
X − 4.8 = 0.74(Y − 5.8)
X = 0.74Y + 0.49
The regression coefficient of Y on X:
byx = [NΣdxdy − (Σdx)(Σdy)] / [NΣdx² − (Σdx)²]
    = [5(31) − (−11)(−1)] / [5(51) − (−11)²]
    = (155 − 11) / (255 − 121)
    = 144 / 134 = 1.07
The regression equation of Y on X:
Y − Ȳ = byx(X − X̄)
Y − 5.8 = 1.07(X − 4.8)
Y = 1.07X + 0.66
It will be observed that these regression equations are the same as those
obtained by the least squares method and by taking deviations from the arithmetic mean.
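Again purely as a check (not part of the original slides), the assumed-mean formulas can be evaluated directly:

```python
# Assumed-mean method with assumed means 7 (for X) and 6 (for Y).
import numpy as np

X = np.array([3, 2, 7, 4, 8], dtype=float)
Y = np.array([6, 1, 8, 5, 9], dtype=float)
N = len(X)

dx = X - 7                            # deviations from the assumed mean of X
dy = Y - 6                            # deviations from the assumed mean of Y

num = N * np.sum(dx * dy) - dx.sum() * dy.sum()        # 5*31 - (-11)(-1) = 144
b_xy = num / (N * np.sum(dy**2) - dy.sum()**2)         # 144 / 194, about 0.74
b_yx = num / (N * np.sum(dx**2) - dx.sum()**2)         # 144 / 134, about 1.07

print(f"bxy = {b_xy:.2f}, byx = {b_yx:.2f}")
```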
SIMPLE REGRESSION
This assumes the model y = β0 + β1x + ε.
Example:
Assume variables Y and X are explained by the following model:
Y = β0 + β1X + ε
where Y is called the dependent (or response) variable and X the
independent (or predictor, or explanatory) variable.
The relationship between the two variables can then be summarised by E(Y | X = x) = β0 + β1x (the “population line”).
Cont…..
The interpretation is as follows:
β0 is the (unknown) intercept and β1 is the (unknown) slope, i.e. the
incremental change in Y per unit change in X.
β0 and β1 are not known exactly, but are estimated from sample data;
their estimates are denoted b0 and b1.
Note that the actual value of the error standard deviation σ is usually not known.
The two regression coefficients are called the slope and the intercept.
Their actual values are also unknown and should always be estimated
using the empirical data at hand.
MULTIVARIATE (LINEAR) REGRESSION
This is a regression model with multiple independent variables.
Here, there are several independent (regressor) variables x1, x2, …, xn and only one
dependent (response) variable y.
The model therefore takes the following form:
yi = β0 + β1x1i + β2x2i + …… + βnxni + εi
where, in each xji, the first index labels the variable and the
second the observation.
NB: The exact values of β and ε are, and will always remain, unknown.
Polynomial Regression
This is a special case of multivariate regression, with only one independent
variable
x, but an x-y relationship which is clearly nonlinear (at the same time, there
is no ‘physical’ model to rely on).
y = β0 + β1x + β2x² + β3x³ + … + βnxⁿ + ε
Effectively, this is the same as having a multivariate model with x1 ≡ x,
x2 ≡ x², x3 ≡ x³, and so on.
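As a sketch (not from the slides), a polynomial regression can be fitted with numpy.polyfit; the cubic true curve and noise level below are illustrative assumptions.

```python
# Polynomial regression: fit y = c3*x^3 + c2*x^2 + c1*x + c0 by least squares.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-3, 3, 40)
y = 1.0 + 0.5 * x - 2.0 * x**2 + 0.3 * x**3 + rng.normal(scale=0.5, size=x.size)

coeffs = np.polyfit(x, y, deg=3)       # coefficients, highest power first
y_hat = np.polyval(coeffs, x)          # fitted values
print("Fitted coefficients (x^3, x^2, x, const):", np.round(coeffs, 2))
```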
NONLINEAR REGRESSION
This is a model with one independent variable (the results can be easily
extended to several) and ‘n’ unknown parameters, which we will call b1,
b2, ... bn:
y = f (x, b) + ε
where f (x, b) is a specific (given) function of the independent variable and
the ‘n’ parameters.
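A hedged sketch of nonlinear regression using SciPy's curve_fit with an assumed exponential form f(x, b) = b1·exp(b2·x); both the functional form and the data are illustrative choices, not taken from the slides.

```python
# Nonlinear regression y = f(x, b) + e for an assumed exponential f.
import numpy as np
from scipy.optimize import curve_fit

def f(x, b1, b2):
    return b1 * np.exp(b2 * x)

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0, 30)
y = f(x, 2.0, 1.5) + rng.normal(scale=0.2, size=x.size)

b_hat, b_cov = curve_fit(f, x, y, p0=[1.0, 1.0])   # p0 = starting guesses for b1, b2
print("Estimated parameters:", np.round(b_hat, 2))
```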
Types of Lines
Scatter plot
[Scatter plot: Personal Income Per Capita, current dollars, 1999 (y-axis) against Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates (x-axis). Title: Percent of Population with Bachelor's Degree by Personal Income Per Capita.]
•This is a linear relationship
•It is a positive relationship.
•As population with BA’s
increases so does the
personal income per capita.
Regression Line
[The same scatter plot with the fitted regression line; R Sq Linear = 0.542.]
•The regression line is the best straight-line description of the plotted points, and you can use it to describe the association between the variables.
•If all the points fall exactly on the line, the error is 0 and you have a perfect relationship.
Things to note
Regression focuses on association, not causation.
Association is a necessary prerequisite for inferring causation, but also:
1. The independent variable must precede the dependent variable.
2. The relationship between the two variables must be consistent with a given theory, and
3. Competing independent variables must be eliminated.
Regression Table
•The regression coefficient is
not a good indicator for the
strength of the relationship.
•Two scatter plots with very
different dispersions could
produce the same regression
line.
[Scatter plot 1: Personal Income Per Capita, current dollars, 1999 vs Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates; R Sq Linear = 0.542.]
[Scatter plot 2: Personal Income Per Capita, current dollars, 1999 vs Population Per Square Mile; R Sq Linear = 0.463.]
Regression coefficient
The regression coefficient is the slope of the regression line. It will tell you:
• What the nature of the relationship between the variables is.
• How much change in the dependent variable is associated with a unit
change in the independent variable.
• The larger the regression coefficient, the larger the change.
Pearson’s r
• To determine strength you look at how closely the dots are clustered
around the line. The more tightly the cases are clustered, the
stronger the relationship, while the more distant, the weaker.
• Pearson’s r ranges from -1 to +1, with 0 indicating no linear
relationship at all.
Reading the tables
Model Summary
Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       .736a   .542       .532                2760.003
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates
•When you run a regression analysis in SPSS you get three tables.
Each tells you something about the relationship.
•The first is the model summary.
•The R is the Pearson Product Moment Correlation Coefficient.
•In this case R is .736
•R is the square root of R-Squared and is the correlation between
the observed and predicted values of dependent variable.
R-Square
(Model Summary table as above: R = .736, R Square = .542, Adjusted R Square = .532, Std. Error of the Estimate = 2760.003.)
•R-Square is the proportion of variance in the dependent
variable (income per capita) which can be predicted from the
independent variable (level of education).
•This value indicates that 54.2% of the variance in income can be
predicted from the variable education. Note that this is an
overall measure of the strength of association, and does not
reflect the extent to which any particular independent variable
is associated with the dependent variable.
•R-Square is also called the coefficient of determination.
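The slides compute these quantities in SPSS; as an illustration of the definition only, the sketch below computes R-Square as 1 − SS_residual / SS_total on invented data, and recovers R as its square root (with the sign of the slope).

```python
# R and R-Square for a simple regression, from first principles.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=50)
y = 3.0 + 1.2 * x + rng.normal(scale=1.0, size=50)

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

ss_res = np.sum((y - y_hat)**2)            # residual sum of squares
ss_tot = np.sum((y - y.mean())**2)         # total sum of squares
r_square = 1 - ss_res / ss_tot
r = np.sign(b) * np.sqrt(r_square)         # equals the Pearson correlation of x and y
print(f"R = {r:.3f}, R-Square = {r_square:.3f}")
```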
Adjusted R-square
(Model Summary table as above: R = .736, R Square = .542, Adjusted R Square = .532, Std. Error of the Estimate = 2760.003.)
•As predictors are added to the model, each predictor will explain some of the
variance in the dependent variable simply due to chance.
•One could continue to add predictors to the model which would continue to
improve the ability of the predictors to explain the dependent variable, although
some of this increase in R-square would be simply due to chance variation in that
particular sample.
•The adjusted R-square attempts to yield a more honest estimate of the R-square
for the population. The value of R-square was .542, while the value of
Adjusted R-square was .532. There isn’t much difference because we are dealing
with only one predictor.
•When the number of observations is small and the number of predictors is large,
there will be a much greater difference between R-square and adjusted R-square.
•By contrast, when the number of observations is very large compared to the
number of predictors, the value of R-square and adjusted R-square will be much
closer.
ANOVA
ANOVA(b)
Model 1        Sum of Squares   df   Mean Square    F        Sig.
Regression     4.32E+08          1   432493775.8    56.775   .000a
Residual       3.66E+08         48   7617618.586
Total          7.98E+08         49
a. Predictors: (Constant), Percent of Population 25 years and Over with Bachelor's Degree or More, March 2000 estimates
b. Dependent Variable: Personal Income Per Capita, current dollars, 1999
•The p-value associated with this F value is very small (0.0000).
•These values are used to answer the question "Do the independent variables
reliably predict the dependent variable?".
•The p-value is compared to your alpha level (typically 0.05) and, if smaller,
you can conclude "Yes, the independent variables reliably predict the
dependent variable".
•If the p-value were greater than 0.05, you would say that the group of
independent variables does not show a statistically significant relationship
with the dependent variable, or that the group of independent variables does
not reliably predict the dependent variable.
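To show where these ANOVA numbers come from (a sketch on invented data, not the SPSS output above), the regression and residual sums of squares, the F ratio and its p-value for a simple regression with 50 cases and 1 predictor can be computed as follows.

```python
# Regression ANOVA arithmetic for a simple regression (n = 50, k = 1 predictor).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, k = 50, 1
x = rng.normal(size=n)
y = 3.0 + 1.2 * x + rng.normal(size=n)

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

ss_reg = np.sum((y_hat - y.mean())**2)        # "Regression" sum of squares
ss_res = np.sum((y - y_hat)**2)               # "Residual" sum of squares
F = (ss_reg / k) / (ss_res / (n - k - 1))     # ratio of mean squares
p = stats.f.sf(F, k, n - k - 1)               # "Sig." (p-value)
print(f"F({k}, {n - k - 1}) = {F:.2f}, p = {p:.4f}")
```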
Part of the Regression Equation
• b represents the slope of the line
• It is calculated by dividing the change in the dependent variable by the change in
the independent variable.
• The difference between the actual value of Y and the calculated amount is called
the residual.
• The error term represents how much error there is in the prediction of the regression
equation for the y value of any individual case as a function of X.