AMMARA AFTAB
ammara.aftab63@gmail.com
MSC (FINAL)
ECONOMETRICS
THE SHINING STAR FOR STATISTICS…
MY IDEALS…
REGRESSION seems like:
MSC (FINAL) OF UOK DEPENDS ON QUALIFIED
ECONOMIST MR. ZOHAIB AZIZ
SIR ZOHAIB AZIZ
By Ammara Aftab
INTRODUCTION
By Ammara Aftab
HISTORICAL ORIGIN OF THE TERM
REGRESSION
By Ammara Aftab
Historical ORIGIN BY FRANCIS GALTON
GALTON’S law of universal regression was
confirmed by his friend KARL PEARSON.
By Ammara Aftab
Galton's universal regression law
Galton found that, although there was a tendency for tall parents to
have tall children and for short parents to have short children, the
average height of children born of parents of a given height tended to
move, or "regress", toward the average height of the population as a whole.
KARL PEARSON:
Pearson speaks in the average sense: the average height of sons of tall
fathers (not the height of any single son, which may be above or below
his father's) is less than the fathers' height, i.e., it tends toward the middle.
Similarly, the average height of sons of short fathers (again, not the
height of any single son) is greater than the fathers' height, again
tending toward the middle.
Conclusion:
Tall parents tend to have tall children, but the average height of their
children is less than their own; the same holds, in reverse, for short
parents. This is why Karl Pearson's findings endorse Galton's theory.
By Ammara Aftab
MODERN
INTERPRETATION
OF REGRESSION
By Ammara Aftab
Reconsider Galton’s law of universal regression.
Galton was interested in finding out why there was
stability in the distribution of heights
in a population.
But in the modern view our concern is not with this
explanation but rather with finding out how the
average height of sons changes, given the fathers’
height. In other words, our concern is with predicting
the average height of sons knowing the height of
their fathers
By Ammara Aftab
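Since the modern view treats regression as a statement about the conditional mean E(Y | X), here is a minimal Python sketch of that idea: it computes the average height of sons at each observed fathers' height. All heights are invented, purely for illustration.

```python
# A sketch of regression as a conditional mean E(Y | X).
# All heights below are hypothetical, for illustration only.
import numpy as np

fathers = np.array([65, 65, 68, 68, 68, 71, 71, 74, 74])  # X: fathers' heights (inches)
sons    = np.array([66, 67, 68, 69, 70, 70, 72, 72, 73])  # Y: sons' heights (inches)

for x in np.unique(fathers):
    avg = sons[fathers == x].mean()
    print(f"fathers' height {x}: average sons' height {avg:.1f}")
```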
EXAMPLE
By Ammara Aftab
PUT THE REGRESSION DATA ON AN
EXCEL SHEET
By Ammara Aftab
The slope indicates the steepness of the line and the intercept indicates
where the line crosses the y-axis.
By Ammara Aftab
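As a companion to the Excel exercise, here is a minimal sketch of what the fit computes: the ordinary least squares intercept and slope of Yi = β1 + β2Xi. The X and Y values below are hypothetical, not the data from the slides.

```python
# Estimate intercept (b1) and slope (b2) of Yi = b1 + b2*Xi by OLS.
# X and Y below are hypothetical values, for illustration only.
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

b2 = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)  # slope
b1 = Y.mean() - b2 * X.mean()                                               # intercept
print(f"intercept b1 = {b1:.3f}, slope b2 = {b2:.3f}")
```

The slope printed here is the steepness of the fitted line and the intercept is where it crosses the y-axis, matching the interpretation on the slide above.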
REGRESSION
ASSUMPTIONS
By Ammara Aftab
WHY IS THE ASSUMPTION REQUIREMENT
NEEDED?
By Ammara Aftab
LOOK AT THE PRF (population regression function):
Yi = β1 + β2Xi + ui
It shows that Yi depends on both Xi and ui . Therefore, unless
we are specific about how Xi and ui are created or generated,
there is no way we can make any statistical inference about
the Yi and also, as we shall see, about β1 and β2. Thus, the
assumptions made about the Xi variable(s) and the error term
are extremely critical to the valid interpretation of the
regression estimates.
By Ammara Aftab
The Gaussian or
classical linear regression model (CLRM) ASSUMPTIONS
By Ammara Aftab
1
LINEAR REGRESSION MODEL:
The regression model is linear with
respect to the parameters.
1) Yi = β1 + β2Xi + ui
In this equation the model is linear with
respect to the parameters.
2) Yi = β1 + (β2^2)Xi + ui
In this equation the model is nonlinear with
respect to the parameters.
By Ammara Aftab
Linear with respect to the parameters:
• Yi = β1 + β2Xi + ui
• Yi = β1 + β2Xi^2 + ui
Nonlinear with respect to the parameters:
• Yi = β1 + (β2^2)Xi + ui
β is the parameter.
If β is raised to a power, the model is nonlinear with respect to the
parameters; if it is not, the model is linear.
By Ammara Aftab
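Because "linear" refers to the parameters, a model such as Yi = β1 + β2Xi^2 + ui can still be estimated by OLS once the regressor is transformed. A minimal simulated sketch, with invented true values β1 = 1.0 and β2 = 0.5:

```python
# Yi = b1 + b2*Xi^2 + ui is nonlinear in X but linear in the parameters,
# so OLS applies after replacing the regressor Xi with Zi = Xi^2.
# Data are simulated with illustrative true values b1 = 1.0, b2 = 0.5.
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(1, 10, 50)
u = rng.normal(0, 1, size=X.size)
Y = 1.0 + 0.5 * X**2 + u

Z = X**2                                                           # transformed regressor
b2 = np.sum((Z - Z.mean()) * (Y - Y.mean())) / np.sum((Z - Z.mean()) ** 2)
b1 = Y.mean() - b2 * Z.mean()
print(f"estimated b1 = {b1:.2f}, b2 = {b2:.2f}")                   # close to 1.0 and 0.5
```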
Simple linear regression describes the
linear relationship between a predictor
variable, plotted on the x-axis, and a
response variable, plotted on the y-axis
Independent Variable (X)
Dependent Variable (Y)
1
By Ammara Aftab
X values are fixed in repeated sampling:
Values taken by the regressor X
are considered fixed in repeated samples. More technically, X
is assumed to be nonstochastic.
2
By Ammara Aftab
X →        20$   40$   60$   80$   100$
Y:
JAPAN       18    23    47    78    89
CHINA       09    28    37    34    34
USA         16    35    48    45    67
RUSSIA      17    38    50    23    69
2
By Ammara Aftab
The X variable is measured without error: X is
fixed and Y is dependent.
(Figure: Y plotted against the fixed values of X)
2
By Ammara Aftab
Zero mean value of disturbance ui. Given the value of X, the mean, or expected, value of the random
disturbance term ui is zero. Technically, the conditional mean value of ui is zero. Symbolically, we have
E(ui | Xi) = 0
EXAMPLE:
If X = 3, 6, 9
then the mean = 6. Summing the deviations
(mean − X)
= (6 − 3) + (6 − 6) + (6 − 9)
= 0
so the first moment about the mean is always zero.
3
By Ammara Aftab
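A quick numerical check of the example above, using the same numbers from the slide, confirming that deviations from the mean sum to zero:

```python
# Deviations of X from its mean always sum to zero (first moment about the mean).
import numpy as np

X = np.array([3.0, 6.0, 9.0])
print((X.mean() - X).sum())   # (6-3) + (6-6) + (6-9) = 0.0
```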
3
Mean = 0, variance = constant
By Ammara Aftab
4
Homoscedasticity or equal variance of ui.
Given the value of X, the variance
of ui is the same for all observations. That is,
the conditional variances of ui are identical.
Symbolically, we have
var (ui | Xi) = E[ui − E(ui | Xi)]^2
= E(ui^2 | Xi) because of Assumption 3
= σ^2
By Ammara Aftab
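A minimal simulated illustration of Assumption 4 (the X values and disturbances below are generated artificially, not taken from the slides): homoscedastic disturbances have the same conditional variance at every X, while the heteroscedastic disturbances here have a variance that grows with X.

```python
# Compare var(u | X) for homoscedastic vs heteroscedastic disturbances.
import numpy as np

rng = np.random.default_rng(1)
X = np.repeat(np.array([1.0, 2.0, 3.0, 4.0]), 1000)   # fixed X values, repeated

u_homo   = rng.normal(0, 2.0, size=X.size)            # var(u|X) = 4 for every X
u_hetero = rng.normal(0, 1.0, size=X.size) * X        # var(u|X) = X^2, not constant

for x in np.unique(X):
    print(f"X={x}: var(homo)={u_homo[X == x].var():.2f}, "
          f"var(hetero)={u_hetero[X == x].var():.2f}")
```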
4
Homo = same
scedasticity = spread
By Ammara Aftab
5
No autocorrelation between the
disturbances. Given any two X values,
Xi and Xj (i ≠ j), the correlation between any two
ui and uj (i ≠ j) is zero.
Symbolically,
cov (ui, uj | Xi, Xj) = E{[ui − E(ui)] | Xi}{[uj − E(uj)] | Xj}
= E(ui | Xi) E(uj | Xj) (why?)
= 0
By Ammara Aftab
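A minimal simulated check of Assumption 5 (again with artificial data, not from the slides): the lag-1 sample correlation of independent disturbances is near zero, while for AR(1) disturbances u_t = 0.8·u_{t−1} + e_t it is close to 0.8.

```python
# Lag-1 correlation of independent vs autocorrelated disturbances.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
e = rng.normal(size=n)

u_iid = e.copy()                      # independent disturbances
u_ar1 = np.zeros(n)                   # AR(1) disturbances
for t in range(1, n):
    u_ar1[t] = 0.8 * u_ar1[t - 1] + e[t]

def lag1_corr(u):
    return np.corrcoef(u[:-1], u[1:])[0, 1]

print(f"lag-1 corr, independent: {lag1_corr(u_iid):.3f}")   # approximately 0
print(f"lag-1 corr, AR(1):       {lag1_corr(u_ar1):.3f}")   # approximately 0.8
```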
5
No Auto-Correlation
(Figure: disturbance patterns illustrating positive correlation, negative correlation, and no correlation)
By Ammara Aftab
6
Zero covariance between ui and
Xi, or E(uiXi) = 0.
We assume that X and u have separate effects on Y. But if X
and u are correlated, it is not possible to assess their
individual effects on Y; the two would move together (X ∝ u).
Formally,
cov (ui, Xi) = E[ui − E(ui)][Xi − E(Xi)]
= E[ui (Xi − E(Xi))] since E(ui) = 0
= E(uiXi) − E(Xi)E(ui) since E(Xi) is nonstochastic (non-random)
= E(uiXi) since E(ui) = 0
= 0
By Ammara Aftab
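A minimal simulated check of Assumption 6 (hypothetical data): when u is generated independently of X, the sample covariance cov(u, X) is close to zero; when u is built from X, it is clearly nonzero, and the separate effects of X and u on Y could not be disentangled.

```python
# Sample covariance between the disturbance and the regressor.
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(0, 10, size=10_000)

u_indep = rng.normal(size=X.size)                           # independent of X
u_dep   = 0.5 * (X - X.mean()) + rng.normal(size=X.size)    # correlated with X

print(f"cov(u_indep, X) = {np.cov(u_indep, X)[0, 1]:.3f}")  # approximately 0
print(f"cov(u_dep,   X) = {np.cov(u_dep,   X)[0, 1]:.3f}")  # clearly nonzero
```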
7
The number of observations n must be greater
than the number of parameters to be
estimated.
Example:
Yi = β1 + β2X1 + β3X2 + ui
As you can see, we have three parameters here, so the number
of observations must be greater than 3.
Alternatively,
the number of observations n must be greater than the
number of regressors. From a single observation there is
no way to estimate the two unknowns, β1 and β2. We need
at least two pairs of observations to estimate the two
unknowns.
By Ammara Aftab
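A small sketch of Assumption 7 with invented numbers: fitting three parameters to exactly three observations leaves zero residual degrees of freedom, so the fit passes through every point and nothing is left over to estimate the error variance; n must exceed the number of parameters.

```python
# With n = 3 observations and 3 parameters (b1, b2, b3) the fit is exact:
# residual degrees of freedom = n - k = 0.
import numpy as np

X2 = np.array([1.0, 2.0, 3.0])
X3 = np.array([2.0, 1.0, 4.0])
Y  = np.array([5.0, 6.0, 9.0])

A = np.column_stack([np.ones_like(X2), X2, X3])            # design matrix, n = k = 3
beta, residuals, rank, _ = np.linalg.lstsq(A, Y, rcond=None)
print("estimates:", beta)
print("residual degrees of freedom:", len(Y) - rank)       # 0
```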
8
Variability in X values.
The X values in a given sample must not all be
the same.
If the X values are all the same, then the variance of X is
zero and the regression cannot be run.
Technically,
var (X) must be a finite positive number.
By Ammara Aftab
8
If we have the same values of X:
X = 2, 2, 2
Then, according to the variance formula
σ^2 = Σ(x − µ)^2 / n
σ^2 = [(2 − 2)^2 + (2 − 2)^2 + (2 − 2)^2] / 3
σ^2 = 0
Since σ^2 = 0, the regression cannot be run:
variance = 0, regression cannot run.
"Positive finite" means:
the variance cannot be negative, because of the squaring in σ^2;
the variance cannot be 0, or the regression cannot be run;
so the variance must be positive and finite.
By Ammara Aftab
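A quick numerical check of the example above: if every X value is the same, var(X) is zero and the denominator of the OLS slope formula vanishes, so the regression cannot be run.

```python
# var(X) = 0 when all X values are identical (values from the slide's example).
import numpy as np

X = np.array([2.0, 2.0, 2.0])
print("var(X) =", X.var())                                         # 0.0
print("sum of squared deviations =", np.sum((X - X.mean()) ** 2))  # slope denominator = 0
```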
9
The regression model is correctly
specified.
Alternatively,
there is no specification bias or error in the
model used in empirical analysis. For example, the two competing
functional forms below cannot both be correct; choosing between
them is part of correct specification:
Yi = α1 + α2Xi + ui
Yi = β1 + β2(1/Xi) + ui
By Ammara Aftab
9
Yi = β1 + β2(1/Xi)+ ui
By Ammara Aftab
10
There is no perfect
multicollinearity.
Perfect multicollinearity = an exact linear relationship between
regressors.
That is, there must be no perfect linear
relationships between the regressors.
Yi = β1 + β2X2i + β3X3i + ui
where Y is the dependent variable and X2 and X3 are the
explanatory variables (or regressors), assumed nonrandom.
By Ammara Aftab
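A minimal sketch of Assumption 10 with hypothetical numbers: if X3 is an exact linear function of X2 (here X3 = 2·X2), the X'X matrix is singular, so the separate effects β2 and β3 cannot be estimated.

```python
# Perfect multicollinearity (X3 = 2*X2) makes X'X singular.
import numpy as np

X2 = np.array([1.0, 2.0, 3.0, 4.0])
X3 = 2.0 * X2                                        # exact linear relation
A = np.column_stack([np.ones_like(X2), X2, X3])      # [1, X2, X3] design matrix

XtX = A.T @ A
print("rank of X'X:", np.linalg.matrix_rank(XtX))    # 2, less than the 3 parameters
print("det of  X'X:", np.linalg.det(XtX))            # ~ 0 (singular)
```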