LECTURE 3
INTRODUCTION: Multiple Regression

CHAPTER 7
Multiple Regression: The Problem of Estimation
NOTATION

Y = β1 + β2X2 + β3X3 + … + βkXk + u

Interpretation of the partial coefficients

Suppose the true model is  Y = β1 + β2X2 + β3X3 + u.

∂Y/∂X2 = β2 : β2 measures the change in the mean value of Y per unit change in X2, holding X3 constant. This is the "direct" or "net" effect of a unit change in X2 on the mean value of Y.

∂Y/∂X3 = β3 : holding X2 constant, the direct effect of a unit change in X3 on the mean value of Y.

To assess the true contribution of X2 to the change in Y, we control for the influence of X3.
Deriving the OLS estimators of multiple regression

Y = β1 + β2X2 + β3X3 + u
û = Y - β̂1 - β̂2X2 - β̂3X3

OLS minimizes the residual sum of squares (RSS = Σû²):

min RSS = min Σû² = min Σ(Y - β̂1 - β̂2X2 - β̂3X3)²

First-order conditions:

∂RSS/∂β̂1 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-1)  = 0
∂RSS/∂β̂2 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-X2) = 0
∂RSS/∂β̂3 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-X3) = 0
Rearranging the three equations (the normal equations):

nβ̂1      + β̂2 ΣX2    + β̂3 ΣX3    = ΣY
β̂1 ΣX2   + β̂2 ΣX2²   + β̂3 ΣX2X3  = ΣX2Y
β̂1 ΣX3   + β̂2 ΣX2X3  + β̂3 ΣX3²   = ΣX3Y

Rewrite in matrix form (the 3-variable case of the 2-variable normal equations):

| n     ΣX2     ΣX3   | | β̂1 |   | ΣY   |
| ΣX2   ΣX2²    ΣX2X3 | | β̂2 | = | ΣX2Y |
| ΣX3   ΣX2X3   ΣX3²  | | β̂3 |   | ΣX3Y |

(X'X) β̂ = X'Y   (matrix notation)
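As a sketch of the matrix solution above, β̂ = (X'X)⁻¹X'Y can be computed directly with NumPy. The data here are hypothetical, just to make the block runnable:

```python
import numpy as np

# Hypothetical data: n = 5 observations of Y, X2, X3
Y  = np.array([10.0, 12.0, 15.0, 18.0, 21.0])
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

# Build X with a column of ones for the intercept
X = np.column_stack([np.ones_like(X2), X2, X3])

# Normal equations: (X'X) beta_hat = X'Y  ==>  beta_hat = (X'X)^{-1} X'Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)  # [b1_hat, b2_hat, b3_hat]
```

Note that `np.linalg.solve` is preferred to forming (X'X)⁻¹ explicitly, for numerical stability.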
1 = Y - 2X2 - 3X3
^ ^ ^
_ _ _
n X2 Y
X2 X2
2 X2Y
X3 X2X3 X3Y
n X2 X3
X2 X2
2 X2X3
X3 X2X3 X3
2
=
3
^
=
(yx3)(x2
2) - (yx2)(x2x3)
(x2
2)(x3
2) - (x2x3)2
n Y X3
X2 X2Y X2X3
X3 X3Y X3
2
n X2 X3
X2 X2
2 X2X3
X3 X2X3 X3
2
=
2
^
=
(yx2)(x3
2) - (yx3)(x2x3)
(x2
2)(x3
2) - (x2x3)2
Cramer’s rule:
8
In matrix form:

(X'X) β̂ = X'Y   ==>   β̂ = (X'X)⁻¹ (X'Y)
(3x3)(3x1) (3x1)

Variance-covariance matrix:

             | Var(β̂1)      Cov(β̂1,β̂2)  Cov(β̂1,β̂3) |
Var-cov(β̂) = | Cov(β̂2,β̂1)  Var(β̂2)      Cov(β̂2,β̂3) |  = σ̂u² (X'X)⁻¹
             | Cov(β̂3,β̂1)  Cov(β̂3,β̂2)  Var(β̂3)     |

where σ̂u² = Σû² / (n-3).
Explicitly,

                  | n     ΣX2     ΣX3   |⁻¹
Var-cov(β̂) = σ̂u² | ΣX2   ΣX2²    ΣX2X3 |
                  | ΣX3   ΣX3X2   ΣX3²  |

σ̂u² = Σû² / (n-3), and in general σ̂u² = Σû² / (n-k),

where k = 3 is the number of parameters (including the constant term).
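Continuing the hypothetical example, σ̂² = Σû²/(n-k) and the variance-covariance matrix σ̂²(X'X)⁻¹ can be sketched as:

```python
import numpy as np

# Same hypothetical data as before
Y  = np.array([10.0, 12.0, 15.0, 18.0, 21.0])
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
X = np.column_stack([np.ones_like(X2), X2, X3])
n, k = X.shape  # n = 5 observations, k = 3 parameters

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
u_hat = Y - X @ beta_hat                 # residuals
sigma2_hat = (u_hat @ u_hat) / (n - k)   # sigma_hat^2 = sum(u_hat^2)/(n-k)

vcov = sigma2_hat * np.linalg.inv(X.T @ X)  # variance-covariance matrix
se = np.sqrt(np.diag(vcov))                 # standard errors of b1, b2, b3
```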
Scalar forms of the Var and SE of the OLS estimators

var(β̂2) = σ² / [ Σx2i² (1 - r23²) ]          se(β̂2) = √var(β̂2)

var(β̂3) = σ² / [ Σx3i² (1 - r23²) ]          se(β̂3) = √var(β̂3)

cov(β̂2, β̂3) = -r23 σ² / [ (1 - r23²) √(Σx2i²) √(Σx3i²) ]

where r23 is the correlation coefficient between X2 and X3, and the x's are in deviation form.
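The scalar formula for var(β̂2) agrees with the corresponding diagonal element of σ̂²(X'X)⁻¹. A check on the same hypothetical data:

```python
import numpy as np

Y  = np.array([10.0, 12.0, 15.0, 18.0, 21.0])
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
X = np.column_stack([np.ones_like(X2), X2, X3])
n, k = X.shape

beta = np.linalg.solve(X.T @ X, X.T @ Y)
u = Y - X @ beta
s2 = (u @ u) / (n - k)

# Scalar formula: var(b2) = s2 / (sum(x2^2) * (1 - r23^2)), x in deviation form
x2 = X2 - X2.mean()
x3 = X3 - X3.mean()
r23 = (x2 @ x3) / np.sqrt((x2 @ x2) * (x3 @ x3))
var_b2_scalar = s2 / ((x2 @ x2) * (1 - r23**2))

# Matrix formula: the (2,2) element of s2 * (X'X)^{-1}
var_b2_matrix = (s2 * np.linalg.inv(X.T @ X))[1, 1]
```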
Properties of the multiple OLS estimators

1. The regression line (surface) passes through the means of Y, X2, X3:
   Ȳ = β̂1 + β̂2X̄2 + β̂3X̄3  ==>  β̂1 = Ȳ - β̂2X̄2 - β̂3X̄3
   (linear in parameters; regression through the means)

2. Ŷ = Ȳ + β̂2x2 + β̂3x3, or in deviation form ŷ = β̂2x2 + β̂3x3.

3. Σû = 0 (zero mean of the residuals).

4. Σûx2 = Σûx3 = 0 (in general ΣûXk = 0); random sample, constant Var(ui) = σ².

5. ΣûŶ = 0.

Unbiasedness: E(β̂i) = βi.
Properties of the multiple OLS estimators (cont.)

6. As X2 and X3 become closely related, var(β̂2) and var(β̂3) become large (approaching infinity), so the true values of β2 and β3 are difficult to pin down.

7. The greater the variation in the sample values of X2 or X3, the smaller the variances of β̂2 and β̂3, and the more precise the estimates.

8. The OLS estimators are BLUE (Gauss-Markov theorem).

All the normality assumptions of the two-variable regression also apply to multiple regression. One additional assumption is needed:

No exact linear relationship among the independent variables (no perfect collinearity, i.e., Xk ≠ λXj).
The adjusted R² (R̄²) as an indicator of overall fit

R² = ESS/TSS = 1 - RSS/TSS = 1 - Σû²/Σy²

R̄² = 1 - σ̂² / S_Y²

R̄² = 1 - [Σû²/(n-k)] / [Σy²/(n-1)] = 1 - (Σû²/Σy²) · (n-1)/(n-k)

R̄² = 1 - (1-R²) · (n-1)/(n-k)

k : number of parameters, including the constant term.
n : number of observations.

R̄² ≤ R²;  0 < R² < 1, but the adjusted R̄² can be negative.

Note: don't misuse the adjusted R²; see Gujarati (2003), pp. 222.
Some notes on R² and the adjusted R²

• R² always increases when we add independent variables, so we use R̄² to decide whether or not to add one more variable. The rule: add the variable if R̄² increases and the t-test for the new variable is significant.

• We can compare the R̄² of two models to select the better one only if the two models have the same sample size n and the same form of the dependent variable.

• The game of maximizing R̄²: in regression analysis we should be more concerned with the logical or theoretical relevance of X to Y and their significance. It is not necessary to get a very high R² or adjusted R²; high values are good, but low values do not by themselves mean the model is bad.
Applying elasticities

Linear model: Y = β1 + β2X

∂Y/∂X = β2 = ΔY/ΔX

Elasticity:

ε = (percentage change in Y) / (percentage change in X) = (Δy/y) / (Δx/x) = (Δy/Δx)(x/y)
Estimating elasticities

Ŷt = b1 + b2Xt = 4 + 1.5Xt

∂Y/∂X = b2 = ΔY/ΔX

X̄ = 8 = average number of years of experience
Ȳ = $10 = average wage rate

ε = b2 · X̄/Ȳ = 1.5 · (8/10) = 1.2
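The elasticity-at-the-means computation above, as a one-line check:

```python
# Elasticity at the means for a linear model: eps = b2 * (X_bar / Y_bar)
b2 = 1.5      # slope from Y_hat = 4 + 1.5 X
X_bar = 8.0   # average years of experience
Y_bar = 10.0  # average wage rate ($)

eps = b2 * X_bar / Y_bar
print(eps)  # 1.2
```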
Log-log models: nonlinearities

ln(Y) = β1 + β2 ln(X)

∂ln(Y)/∂X = β2 · ∂ln(X)/∂X  ==>  (1/Y) ∂Y/∂X = β2 (1/X)

Elasticity of Y with respect to X:

ε = (∂Y/∂X)(X/Y) = β2
Summary

Model     Equation              Slope (dY/dX)   Elasticity (dY/dX)(X/Y)
Linear    Y = β1 + β2X          β2              β2 (X/Y)
Log-log   lnY = β1 + β2 lnX     β2 (Y/X)        β2

For the log-log model: d lnY / d lnX = (dY/Y) / (dX/X) = β2  ==>  dY/dX = β2 (Y/X).
Summary (cont.)

Model       Equation              Slope (dY/dX)   Elasticity (dY/dX)(X/Y)
Reciprocal  Y = β1 + β2 (1/X)     -β2 (1/X²)      -β2 / (XY)
Lin-log     Y = β1 + β2 lnX       β2 (1/X)        β2 (1/Y)
Log-lin     lnY = β1 + β2X        β2 Y            β2 X

Derivations:
Reciprocal: dY/d(1/X) = β2  ==>  dY/dX = -β2 (1/X²)
Lin-log:    dY/d lnX = dY/(dX/X) = β2  ==>  dY/dX = β2 (1/X)
Log-lin:    d lnY/dX = (dY/Y)/dX = β2  ==>  dY/dX = β2 Y
Applications of functional-form regressions

1. Cobb-Douglas production function:

Y = β1 L^β2 K^β3 e^u

Transforming:

lnY = lnβ1 + β2 lnL + β3 lnK + u
    = β1' + β2 lnL + β3 lnK + u

d lnY / d lnL = β2 : elasticity of output w.r.t. the labor input
d lnY / d lnK = β3 : elasticity of output w.r.t. the capital input

β2 + β3 >, =, or < 1 gives information about the returns to scale.
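A minimal sketch of estimating the Cobb-Douglas elasticities from the log-log form, using synthetic data (the technology parameters 2.0, 0.7, 0.3 are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
L = rng.uniform(1, 10, n)    # labor input
K = rng.uniform(1, 10, n)    # capital input
u = rng.normal(0, 0.05, n)   # disturbance

# Assumed true technology: Y = 2 * L^0.7 * K^0.3 * e^u
Y = 2.0 * L**0.7 * K**0.3 * np.exp(u)

# Log-log regression: lnY = b1' + b2 lnL + b3 lnK + u
X = np.column_stack([np.ones(n), np.log(L), np.log(K)])
b = np.linalg.lstsq(X, np.log(Y), rcond=None)[0]

b2_hat, b3_hat = b[1], b[2]  # output elasticities w.r.t. L and K
print(b2_hat + b3_hat)       # close to 1 => roughly constant returns to scale
```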
2. Polynomial regression models: marginal cost (MC) or total cost (TC) functions, where Y is cost and X is output:

MC:  Y = β1 + β2X + β3X² + u
TC:  Y = β1 + β2X + β3X² + β4X³ + u
Chapter 8
Multiple Regression Analysis: The Problem of Inference

Hypothesis testing in multiple regression:

1. Testing an individual partial coefficient
2. Testing the overall significance of all coefficients
3. Testing restrictions on variables (add or drop): βk = 0?
4. Testing partial coefficients under some restrictions, such as β2 + β3 = 1, or β2 = β3 (i.e., β2 - β3 = 0), etc.
5. Testing the functional form of the regression model
6. Testing the stability of the estimated regression model
   -- over time
   -- across different cross-sections
1. Individual partial coefficient test

Holding X3 constant, does X2 have an effect on Y?

∂Y/∂X2 = β2 = 0?

H0 : β2 = 0
H1 : β2 ≠ 0

t = (β̂2 - 0) / se(β̂2) = 0.726 / 0.048 = 14.906

Compare with the critical value tc(0.025, 12) = 2.179.
Since t > tc ==> reject H0.
Answer: yes, β̂2 is statistically significant, i.e., significantly different from zero.
1. Individual partial coefficient test (cont.)

Holding X2 constant, does X3 have an effect on Y?

∂Y/∂X3 = β3 = 0?

H0 : β3 = 0
H1 : β3 ≠ 0

t = (β̂3 - 0) / se(β̂3) = (2.736 - 0) / 0.848 = 3.226

Critical value: tc(0.025, 12) = 2.179.
Since |t| > |tc| ==> reject H0.
Answer: yes, β̂3 is statistically significant, i.e., significantly different from zero.
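The t-test decision above, reproduced with the slide's numbers for β3:

```python
# Individual coefficient t test (numbers from the slide's example for beta3)
b3_hat = 2.736
se_b3 = 0.848
t = (b3_hat - 0) / se_b3
print(round(t, 3))  # 3.226

t_crit = 2.179  # tc(0.025, 12)
reject_H0 = abs(t) > t_crit  # True: beta3 is significantly different from zero
```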
2. Testing the overall significance of the multiple regression

3-variable case: Y = β1 + β2X2 + β3X3 + u

H0 : β2 = 0 and β3 = 0 (all slope coefficients have zero effect)
H1 : β2 ≠ 0 or β3 ≠ 0 (at least one variable has an effect)

1. Compute the F-statistic.
2. Look up the critical value Fc(α, k-1, n-k).
3. Compare F and Fc; if F > Fc ==> reject H0.
H0 : β2 = … = βk = 0
H1 : not all of β2, …, βk are zero

F = (MSS of ESS) / (MSS of RSS) = [ESS/(k-1)] / [RSS/(n-k)] = [Σŷ²/(k-1)] / [Σû²/(n-k)]

If F > Fc(α, k-1, n-k) ==> reject H0.

Analysis of variance (ANOVA table):

Source of variation        SS     df     MSS
Due to regression (ESS)    Σŷ²    k-1    Σŷ²/(k-1)
Due to residuals (RSS)     Σû²    n-k    Σû²/(n-k) = σ̂u²
Total variation (TSS)      Σy²    n-1

Note: k is the total number of parameters, including the intercept term.

Since y = ŷ + û  ==>  Σy² = Σŷ² + Σû², i.e., TSS = ESS + RSS.
Three-variable case:

ŷ = β̂2x2 + β̂3x3 + û
Σy² = β̂2 Σx2y + β̂3 Σx3y + Σû²
TSS = ESS + RSS

F = [ESS/(k-1)] / [RSS/(n-k)] = [(β̂2 Σx2y + β̂3 Σx3y)/(3-1)] / [Σû²/(n-3)]

ANOVA table (k = 3):

Source of variation   SS                      df     MSS
ESS                   β̂2 Σx2y + β̂3 Σx3y      3-1    ESS/(3-1)
RSS                   Σû²                     n-3    RSS/(n-3)
TSS                   Σy²                     n-1
An important relationship between R² and F:

F = [ESS/(k-1)] / [RSS/(n-k)]
  = (n-k)/(k-1) · ESS/(TSS - ESS)
  = (n-k)/(k-1) · (ESS/TSS) / (1 - ESS/TSS)
  = (n-k)/(k-1) · R²/(1 - R²)
  = [R²/(k-1)] / [(1-R²)/(n-k)]

and conversely R² = (k-1)F / [(k-1)F + (n-k)].

For the three-variable case:

F = [R²/2] / [(1-R²)/(n-3)]
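The R²-to-F conversion above, checked against the EViews ANOVA example that follows (R² = 0.971088, n = 20, k = 4):

```python
# Overall F from R^2: F = (R2/(k-1)) / ((1-R2)/(n-k))
R2, n, k = 0.971088, 20, 4
F = (R2 / (k - 1)) / ((1 - R2) / (n - k))
print(round(F, 2))  # ~179.13, matching the ANOVA-table F*
```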
Recall R² and the adjusted R² (R̄²):

R² = ESS/TSS = 1 - RSS/TSS = 1 - Σû²/Σy²

R̄² = 1 - [Σû²/(n-k)] / [Σy²/(n-1)] = 1 - (1-R²)(n-1)/(n-k)

with k the number of parameters including the constant term, n the number of observations; R̄² ≤ R², 0 < R² < 1, and R̄² can be negative.
Overall significance test (example):

H0 : β2 = β3 = β4 = 0
H1 : at least one coefficient is not zero (β2 ≠ 0, or β3 ≠ 0, or β4 ≠ 0)

F* = [R²/(k-1)] / [(1-R²)/(n-k)] = [0.9710/3] / [(1-0.9710)/16] = 179.13

Fc(0.05, 4-1, 20-4) = Fc(0.05, 3, 16) = 3.24

Since F* > Fc ==> reject H0.
Construct the ANOVA table (8.4) (information from EViews):

F* = (MSS of regression) / (MSS of residuals) = 5164.3903 / 28.8288 = 179.1339

Source of variation       SS                                        df        MSS
Due to regression (ESS)   R²(Σy²) = (0.971088)(28.97771)² x 19      k-1 = 3   R²(Σy²)/(k-1) = 5164.3903
                          = 15493.171
Due to residuals (RSS)    (1-R²)(Σy²) or Σû²                        n-k = 16  (1-R²)(Σy²)/(n-k) = 28.8288
                          = (0.028912)(28.97771)² x 19 = 461.2621
Total (TSS)               Σy² = (28.97771)² x 19 = 15954.446        n-1 = 19

Since (σy)² = Var(Y) = Σy²/(n-1)  ==>  (n-1)(σy)² = Σy².
Example: Gujarati (2003), Table 6.4, pp. 185

H0 : β2 = β3 = 0

F* = [ESS/(k-1)] / [RSS/(n-k)] = [R²/(k-1)] / [(1-R²)/(n-k)]
   = [0.707665/2] / [(1-0.707665)/61] = 73.832

Fc(0.05, 3-1, 64-3) = Fc(0.05, 2, 61) = 3.15

Since F* > Fc ==> reject H0.
Construct the ANOVA table (8.4) (information from EViews):

F* = (MSS of regression) / (MSS of residuals) = 130723.67 / 1770.547 = 73.832

Source of variation       SS                                        df        MSS
Due to regression (ESS)   R²(Σy²) = (0.707665)(75.97807)² x 64      k-1 = 2   R²(Σy²)/(k-1) = 130723.67
                          = 261447.33
Due to residuals (RSS)    (1-R²)(Σy²) or Σû²                        n-k = 61  (1-R²)(Σy²)/(n-k) = 1770.547
                          = (0.292335)(75.97807)² x 64 = 108003.37
Total (TSS)               Σy² = (75.97807)² x 64 = 369450.7         n-1 = 63

Since (σy)² = Var(Y) = Σy²/(n-1)  ==>  (n-1)(σy)² = Σy².
Decision rule:

Y = β1 + β2X2 + β3X3 + u
H0 : β2 = 0 and β3 = 0
H1 : β2 ≠ 0 or β3 ≠ 0

Compare F* with Fc from the F-table:
Fc(0.01, 2, 61) = 4.98
Fc(0.05, 2, 61) = 3.15

Since F* = 73.832 > Fc = 4.98 (and 3.15) ==> reject H0.
Answer: the estimated coefficients are jointly statistically significantly different from zero.
3. Testing an additional variable in the regression model

Old model: Y = β1 + β2X2 + u1; obtain R²old, RSSold and/or ESSold.

Now consider a new variable X3: is it relevant to add it or not?

New model: Y = β1 + β2X2 + β3X3 + u2; obtain R²new, RSSnew and ESSnew.

H0 : β3 = 0, adding X3 is not relevant
H1 : β3 ≠ 0, adding X3 is relevant
Steps for testing whether X3 makes an incremental contribution to the explanatory power of the new model:

1. Compute the F-statistic (here the number of added variables is J = 1):

   F* = [(RSSold - RSSnew) / (number of added variables)] / [RSSnew / (n - number of parameters in the new model)]

   (the denominator df here is n-3)

2. Compare F* with Fc(α, 1, n-3).

3. Decision rule: if F* > Fc ==> reject H0 : β3 = 0, i.e., X3 is a relevant variable to add to the model.

F* can also be calculated from R²:

F* = [(R²new - R²old) / (number of new regressors added or dropped)] / [(1 - R²new) / (n-k)],  with k from the new model.
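The R²-based incremental F test can be sketched as a helper. The call below uses the irrelevant-variable example on the next slide; n = 42 and k_new = 3 are assumptions chosen to reproduce that slide's denominator df of 39:

```python
# Incremental-contribution F test via R^2 (J added regressors)
def incremental_F(r2_new, r2_old, n, k_new, J=1):
    """F* = ((R2_new - R2_old)/J) / ((1 - R2_new)/(n - k_new))."""
    return ((r2_new - r2_old) / J) / ((1 - r2_new) / (n - k_new))

# Studenmund-style numbers; n and k_new assumed so that n - k_new = 39
F_star = incremental_F(0.9872, 0.9868, n=42, k_new=3)
print(F_star)  # ~1.22, below Fc(0.05, 1, 39) = 4.17 => do not reject H0
```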
Adding an irrelevant variable X3 (Studenmund, pp. 166):

Old model: RSSold = 160.5929, R²old = 0.986828

New model: H0 : adding X3 (R) is not suitable, β3 = 0

F* = [(R²new - R²old) / (number of new regressors)] / [(1 - R²new) / (n-k in the new model)]
   = [(0.9872 - 0.9868)/1] / [(1 - 0.9872)/39] = 1.218

Fc(0.05, 1, 39) = 4.17

Since F* < Fc ==> do not reject H0.
Adding a relevant variable X3 (YD) (Studenmund, pp. 166):

Old model: Y = β0 + β1X1 + β2X2 + u
New model: Y = β0 + β1X1 + β2X2 + β3X3 + u'

H0 : adding X3 (the YD variable) is not suitable, β3 = 0

F* = [(R²new - R²old) / df] / [(1 - R²new) / df]
   = [(0.9868 - 0.9203)/1] / [(1 - 0.9868)/(44-4)]
   = (0.0665 x 40) / 0.0132 = 201.5

Fc(0.05, 1, 40) = 4.08. Since F* > Fc ==> reject H0.
Adding variables: general discussion

Test whether two additional variables Xl and Xm are relevant:

old (restricted, βl = βm = 0):  Yi = β1 + β2X2 + β3X3 + … + βkXk + ui
new (unrestricted):             Yi = β1 + β2X2 + β3X3 + … + βkXk + βlXl + βmXm + ui

F-test: H0 : βl = 0 and βm = 0
        H1 : βl ≠ 0 or βm ≠ 0

F* = [(R²new - R²old) / (number of added regressors)] / [(1 - R²new) / (n-m)]

where m is the total number of regressors in the new model. If F* > Fc ==> reject H0.
Dropping variables: general discussion

Test whether dropping two variables Xl and Xm is warranted:

new (restricted, βl = βm = 0):  Yi = β1 + β2X2 + β3X3 + … + βkXk + ui
old (unrestricted):             Yi = β1 + β2X2 + β3X3 + … + βkXk + βlXl + βmXm + ui

F-test: H0 : βl = 0 and βm = 0
        H1 : βl ≠ 0 or βm ≠ 0

F* = [(R²old - R²new) / (number of dropped regressors)] / [(1 - R²old) / (n-k)]

If F* > Fc ==> reject H0.
Justifying whether to keep (or drop) Xl and Xm in the model:

1. Theory: check the signs of β̂l and β̂m. Do the variables theoretically explain the dependent variable?
2. Overall fit: does R̄² increase or not? Does the F*-statistic increase or not?
3. Check the t-statistics of Xl and Xm: are t*l and t*m > 1.96 (at the 5% level of significance)?
4. Bias: check the t-statistics of the other variables X2, …, Xk. Have they changed significantly or not?
Restriction test, using R² instead of ESS or RSS in the F-test:

F = [(R²UR - R²R) / J] / [(1 - R²UR) / (n-k)]
  = [(R²UR - R²R) / (dfR - dfUR)] / [(1 - R²UR) / dfUR]

J: number of restrictions
k: number of parameters (including the intercept) in the unrestricted model

Note: we can use R² for the restriction test only if the two regression models have the same form of the dependent variable.
Example 8.4: the demand for chicken (Gujarati (2003), p. 272)

Old model = restricted model; new model = unrestricted model.

H0: no joint effect of X4 and X5, i.e., β4 = β5 = 0
1. Adding variables: Wald test (likelihood-ratio-type F test)

H0: no joint effect of X4 and X5, i.e., β4 = β5 = 0

F* = [(R²new - R²old) / (number added)] / [(1 - R²new) / (n-k)]
   = [(0.9823 - 0.9801)/2] / [(1 - 0.9823)/(23-5)]
   = 0.0011 / 0.000983 = 1.119

Fc(0.05, 2, 18) = 3.55

Since F* < Fc ==> do not reject H0.
2. Dropping a variable (β5):

H0: no effect of X5, i.e., β5 = 0

F* = [(R²UR - R²R) / 1] / [(1 - R²UR) / (n-k)]
   = [(0.982313 - 0.981509)/1] / [(1 - 0.982313)/(23-4)] = 0.864

Since F* < Fc(0.05, 1, 19) = 4.38 ==> do not reject H0.
(Equivalently, since t* < tc ==> do not reject H0.)
4. Testing partial coefficients under some restrictions: restricted least squares

Cobb-Douglas: Y = β1 X2^β2 X3^β3 e^u

Unrestricted model: ln Y = β1 + β2 ln X2 + β3 ln X3 + u

Constant returns to scale: β2 + β3 = 1, i.e., β2 = 1 - β3 (or β3 = 1 - β2).

Substituting β2 = 1 - β3:
=> ln Y = β1 + (1 - β3) ln X2 + β3 ln X3 + u
=> ln Y = β1 + ln X2 + β3 (ln X3 - ln X2) + u
=> (ln Y - ln X2) = β1 + β3 (ln X3 - ln X2) + u
=> ln(Y/X2) = β'1 + β'3 ln(X3/X2) + u'

Restricted model: Y* = β'1 + β'3 X* + u'
Or, substituting β3 = 1 - β2:
=> ln Y = β1 + β2 ln X2 + (1 - β2) ln X3 + u
=> ln Y = β1 + β2 ln X2 + ln X3 - β2 ln X3 + u
=> (ln Y - ln X3) = β1 + β2 (ln X2 - ln X3) + u
=> ln(Y/X3) = β''1 + β''2 ln(X2/X3) + u''

Restricted model: Y** = β''1 + β''2 X** + u''
H0 : β2 + β3 = 1

F = [(RSSR - RSSUR) / m] / [RSSUR / (n-k)]

m: number of restrictions in the restricted model
k: number of parameters in the unrestricted model

Unrestricted equation: lnY = β1 + β2 lnX2 + β3 lnX3 + u
Restricted equation:   ln(Y/X2) = β'1 + β'3 ln(X3/X2)

RSSUR = 0.013604, RSSR = 0.016629
F* = 3.75;  Fc(0.05, 1, 17) = 4.45
The alternative restricted form gives the same test:

Unrestricted equation: lnY = β1 + β2 lnX2 + β3 lnX3 + u
Restricted equation:   ln(Y/X3) = β''1 + β''2 ln(X2/X3)

H0 : β2 + β3 = 1

F = [(RSSR - RSSUR) / m] / [RSSUR / (n-k)]

RSSUR = 0.013604, RSSR = 0.016629
F* = 3.75  ==>  do not reject H0.
Restriction: 2 = 3 (or 2 - 3 =  = 0 or as 2 =  + 3 = 0 )
Unrestricted model: Y = 1 + 2 X2 + 3X3 + u
Restricted model: Y = 1 +  X2 + 3 X*3 + u’
Test for Restriction on parameters: t test approach
Next rewrite the equation as :
=> Y = 1 + ( + 3 )X2 + 3 X3 + u’
=> Y = 1 + X2 + 3 X2 + 3 X3 + u’
=> Y = 1 + X2 + 3 (X2 + X3) + u’
Simply use the t-value to test whether  is zero or not
Or
use
F-test
Compute t = and compare to tc
and follow t-test decision rule
2 - 3
^
Se(2- 3)
^
^
^
)
ˆ
,
ˆ
cov(
2
)
ˆ
var(
)
ˆ
var(
)
ˆ
ˆ
( 3
2
3
2
3
2 




 



se
Example:

H0 : δ = 0,  H1 : δ ≠ 0;  tc(0.05, 931) = 1.96

H0 : β3 - β4 = 0

F* = [(R²R - R²UR) / m] / [(1 - R²UR) / (n-k)] = 0

Fc(0.05, 4, 931) = 2.37

Unrestricted model vs. restricted model: since F* < Fc ==> do not reject H0.
5. Test for functional form: the MWD test (MacKinnon, White, Davidson). (Optional reading.)
MWD test for the functional form (Gujarati (2003), pp. 280)

H0: linear model;  H1: log-linear (log-log) model

1. Run OLS on the linear model and obtain Ŷ:
   Ŷ = β̂1 + β̂2X2 + β̂3X3

2. Run OLS on the log-log model and obtain l̂nY:
   l̂nY = β̂1 + β̂2 lnX2 + β̂3 lnX3

3. Compute Z1 = ln(Ŷ) - l̂nY.

4. Run OLS on the linear model adding Z1:
   Y = β1' + β2'X2 + β3'X3 + β4'Z1
   and check the t-statistic of β̂4'.

If t*4 > tc ==> reject H0: linear model.
If t*4 < tc ==> do not reject H0: linear model.
MWD test for the functional form (cont.)

5. Compute Z2 = antilog(l̂nY) - Ŷ.

6. Run OLS on the log-log model adding Z2:
   lnY = β1' + β2' lnX2 + β3' lnX3 + β4'Z2
   and check the t-statistic of β̂4'.

If t*4 > tc ==> reject H0: log-log model.
If t*4 < tc ==> do not reject H0: log-log model.
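The construction of the MWD auxiliary variables (steps 1-3 and 5) can be sketched with plain least squares. This is a sketch, not Gujarati's code; it assumes Y, the regressors, and the linear model's fitted values are all strictly positive so the logs exist:

```python
import numpy as np

def mwd_z_terms(Y, Xmat):
    """Compute the MWD auxiliary regressors Z1 and Z2 (sketch).

    Xmat: design matrix WITH an intercept column first;
    Y, the regressors, and the linear fitted values must be > 0.
    """
    # Step 1: linear model, fitted Y_hat
    b_lin = np.linalg.lstsq(Xmat, Y, rcond=None)[0]
    Y_hat = Xmat @ b_lin

    # Step 2: log-log model, fitted lnY_hat
    Xlog = Xmat.copy()
    Xlog[:, 1:] = np.log(Xmat[:, 1:])
    b_log = np.linalg.lstsq(Xlog, np.log(Y), rcond=None)[0]
    lnY_hat = Xlog @ b_log

    # Step 3 and step 5: the auxiliary variables
    Z1 = np.log(Y_hat) - lnY_hat   # added to the linear model
    Z2 = np.exp(lnY_hat) - Y_hat   # antilog(lnY_hat) - Y_hat, added to the log-log model
    return Z1, Z2
```

Z1 and Z2 are then added as extra regressors and their t-statistics examined, as in steps 4 and 6.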
MWD test: testing the functional form of the regression (Example 8.5)

Step 1: run the linear model and obtain Ŷ.

CV1 = σ̂ / Ȳ = 1583.279 / 24735.33 = 0.064
Step 2: run the log-log model and obtain the fitted (estimated) l̂nY.

CV2 = σ̂ / Ȳ = 0.07481 / 10.09653 = 0.0074
Step 4: H0 : the true model is linear.

tc(0.05, 11) = 1.796;  tc(0.10, 11) = 1.363

t* < tc at 5%  => do not reject H0
t* > tc at 10% => reject H0
Step 6: H0 : the true model is the log-log model.

tc(0.025, 11) = 2.201;  tc(0.05, 11) = 1.796;  tc(0.10, 11) = 1.363

Since t* < tc => do not reject H0.

Comparing the coefficients of variation: C.V.1 / C.V.2 = 0.064 / 0.0074
Alternative criterion for comparing two different functional models: the coefficient of variation

C.V. = σ̂ / Ȳ

It measures the average error of the sample regression function relative to the mean of Y, so linear, log-linear, and log-log equations can be meaningfully compared: the smaller the C.V., the more preferred the equation (functional form).
Comparing two different functional-form models (Model 1: linear; Model 2: log-log):

C.V. ratio = (σ̂/Ȳ of model 1) / (σ̂/Ȳ of model 2)
           = (2.1225/89.612) / (0.0217/4.4891)
           = 0.0236 / 0.0048 = 4.916

A ratio of 4.916 means that model 2 is better.
6. The Chow test: a test of structural stability

H0 : no structural change
H1 : structural change

Procedure:
1. Divide the sample of N observations into two groups:
   - group 1, the first n1 observations;
   - group 2, the remaining n2 = N - n1 observations.
2. Run OLS on the two sub-samples separately and obtain RSS1 and RSS2.
3. Run OLS on the whole sample (N) and obtain the restricted RSSR.
4. Compute F* = [(RSSR - RSS1 - RSS2) / k] / [(RSS1 + RSS2) / (N-2k)].
5. Compare F* with Fc(α, k, N-2k); if F* > Fc ==> reject H0, meaning there is a structural change in the sample.
Structural stability: Chow test (scatter plot of income and savings)

H0: Var(u1) = Var(u2) = σ²

Whole sample:   Y = β1 + β2X + u   ==> RSSR
Sub-sample n1:  Y = λ1 + λ2X + u1  ==> RSS1
Sub-sample n2:  Y = γ1 + γ2X + u2  ==> RSS2
Empirical results:

Dep. variable   Constant         X (indep.)     R²      SEE     RSS       n
Y (70-95)       624226 (4.89)    0.0376 (8.89)  0.7672  31.12   23248.3   26
Y (70-81)       1.0161 (0.08)    0.0803 (9.60)  0.9021  13.36   1785.03   12
Y (82-95)       153.494 (4.69)   0.0148 (1.77)  0.2071  28.87   10005.2   14

F* = [(RSSR - RSS1 - RSS2)/k] / [(RSS1 + RSS2)/(N-2k)]
   = [(23248.3 - 1785.03 - 10005.2)/2] / [(1785.03 + 10005.2)/22]
   = 10.69

Fc(0.01) = 5.72;  Fc(0.05) = 3.44;  Fc(0.10) = 2.56

Conclusion: F* > Fc ==> reject H0.
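The Chow F above, reproduced from the table's RSS values:

```python
# Chow test F* from the savings-income results in the table
RSS_R = 23248.3                   # whole sample (restricted), N = 26
RSS_1, RSS_2 = 1785.03, 10005.2   # sub-samples n1 = 12, n2 = 14
k, N = 2, 26                      # 2 parameters per sub-sample regression

F_star = ((RSS_R - RSS_1 - RSS_2) / k) / ((RSS_1 + RSS_2) / (N - 2 * k))
print(round(F_star, 2))  # 10.69 > Fc(0.05) = 3.44 => reject H0 (structural change)
```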
Prediction in multiple regression

• The procedure is exactly the same as prediction in the two-variable model.
• Using matrix notation, the formula is given in Appendix C9, on page 861 of the book.
• We again distinguish "mean prediction" and "individual prediction".
• The software will do it for you.
THE END
Effects of Extreme Temperatures From Climate Change on the Medicare Populatio...
Congressional Budget Office
 
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdfPNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
ClaudioTebaldi2
 
PPT Item # 8 - Tuxedo Columbine 3way Stop
PPT Item # 8 - Tuxedo Columbine 3way StopPPT Item # 8 - Tuxedo Columbine 3way Stop
PPT Item # 8 - Tuxedo Columbine 3way Stop
ahcitycouncil
 

Recently uploaded (20)

What is the point of small housing associations.pptx
What is the point of small housing associations.pptxWhat is the point of small housing associations.pptx
What is the point of small housing associations.pptx
 
一比一原版(QUT毕业证)昆士兰科技大学毕业证成绩单
一比一原版(QUT毕业证)昆士兰科技大学毕业证成绩单一比一原版(QUT毕业证)昆士兰科技大学毕业证成绩单
一比一原版(QUT毕业证)昆士兰科技大学毕业证成绩单
 
2024: The FAR - Federal Acquisition Regulations, Part 36
2024: The FAR - Federal Acquisition Regulations, Part 362024: The FAR - Federal Acquisition Regulations, Part 36
2024: The FAR - Federal Acquisition Regulations, Part 36
 
一比一原版(WSU毕业证)西悉尼大学毕业证成绩单
一比一原版(WSU毕业证)西悉尼大学毕业证成绩单一比一原版(WSU毕业证)西悉尼大学毕业证成绩单
一比一原版(WSU毕业证)西悉尼大学毕业证成绩单
 
如何办理(uoit毕业证书)加拿大安大略理工大学毕业证文凭证书录取通知原版一模一样
如何办理(uoit毕业证书)加拿大安大略理工大学毕业证文凭证书录取通知原版一模一样如何办理(uoit毕业证书)加拿大安大略理工大学毕业证文凭证书录取通知原版一模一样
如何办理(uoit毕业证书)加拿大安大略理工大学毕业证文凭证书录取通知原版一模一样
 
快速制作(ocad毕业证书)加拿大安大略艺术设计学院毕业证本科学历雅思成绩单原版一模一样
快速制作(ocad毕业证书)加拿大安大略艺术设计学院毕业证本科学历雅思成绩单原版一模一样快速制作(ocad毕业证书)加拿大安大略艺术设计学院毕业证本科学历雅思成绩单原版一模一样
快速制作(ocad毕业证书)加拿大安大略艺术设计学院毕业证本科学历雅思成绩单原版一模一样
 
ZGB - The Role of Generative AI in Government transformation.pdf
ZGB - The Role of Generative AI in Government transformation.pdfZGB - The Role of Generative AI in Government transformation.pdf
ZGB - The Role of Generative AI in Government transformation.pdf
 
Get Government Grants and Assistance Program
Get Government Grants and Assistance ProgramGet Government Grants and Assistance Program
Get Government Grants and Assistance Program
 
一比一原版(UQ毕业证)昆士兰大学毕业证成绩单
一比一原版(UQ毕业证)昆士兰大学毕业证成绩单一比一原版(UQ毕业证)昆士兰大学毕业证成绩单
一比一原版(UQ毕业证)昆士兰大学毕业证成绩单
 
MHM Roundtable Slide Deck WHA Side-event May 28 2024.pptx
MHM Roundtable Slide Deck WHA Side-event May 28 2024.pptxMHM Roundtable Slide Deck WHA Side-event May 28 2024.pptx
MHM Roundtable Slide Deck WHA Side-event May 28 2024.pptx
 
The Role of a Process Server in real estate
The Role of a Process Server in real estateThe Role of a Process Server in real estate
The Role of a Process Server in real estate
 
一比一原版(UOW毕业证)伍伦贡大学毕业证成绩单
一比一原版(UOW毕业证)伍伦贡大学毕业证成绩单一比一原版(UOW毕业证)伍伦贡大学毕业证成绩单
一比一原版(UOW毕业证)伍伦贡大学毕业证成绩单
 
一比一原版(Adelaide毕业证)阿德莱德大学毕业证成绩单
一比一原版(Adelaide毕业证)阿德莱德大学毕业证成绩单一比一原版(Adelaide毕业证)阿德莱德大学毕业证成绩单
一比一原版(Adelaide毕业证)阿德莱德大学毕业证成绩单
 
一比一原版(ANU毕业证)澳大利亚国立大学毕业证成绩单
一比一原版(ANU毕业证)澳大利亚国立大学毕业证成绩单一比一原版(ANU毕业证)澳大利亚国立大学毕业证成绩单
一比一原版(ANU毕业证)澳大利亚国立大学毕业证成绩单
 
2024: The FAR - Federal Acquisition Regulations, Part 37
2024: The FAR - Federal Acquisition Regulations, Part 372024: The FAR - Federal Acquisition Regulations, Part 37
2024: The FAR - Federal Acquisition Regulations, Part 37
 
NHAI_Under_Implementation_01-05-2024.pdf
NHAI_Under_Implementation_01-05-2024.pdfNHAI_Under_Implementation_01-05-2024.pdf
NHAI_Under_Implementation_01-05-2024.pdf
 
PPT Item # 6 - 7001 Broadway ARB Case # 933F
PPT Item # 6 - 7001 Broadway ARB Case # 933FPPT Item # 6 - 7001 Broadway ARB Case # 933F
PPT Item # 6 - 7001 Broadway ARB Case # 933F
 
Effects of Extreme Temperatures From Climate Change on the Medicare Populatio...
Effects of Extreme Temperatures From Climate Change on the Medicare Populatio...Effects of Extreme Temperatures From Climate Change on the Medicare Populatio...
Effects of Extreme Temperatures From Climate Change on the Medicare Populatio...
 
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdfPNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
PNRR MADRID GREENTECH FOR BROWN NETWORKS NETWORKS MUR_MUSA_TEBALDI.pdf
 
PPT Item # 8 - Tuxedo Columbine 3way Stop
PPT Item # 8 - Tuxedo Columbine 3way StopPPT Item # 8 - Tuxedo Columbine 3way Stop
PPT Item # 8 - Tuxedo Columbine 3way Stop
 

Econometric lec3.ppt

  • 2. CHAPTER 7: Multiple Regression, The Problem of Estimation
  • 3. NOTATION: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_k X_k + u$
  • 4. Explanation of the partial coefficients. Suppose the true model is $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + u$. Then $\partial Y/\partial X_2 = \beta_2$: $\beta_2$ measures the change in the mean value of $Y$ per unit change in $X_2$, holding $X_3$ constant, i.e., the 'direct' or 'net' effect of a unit change in $X_2$ on the mean value of $Y$. Likewise $\partial Y/\partial X_3 = \beta_3$ is, holding $X_2$ constant, the direct effect of a unit change in $X_3$ on the mean value of $Y$. To assess the true contribution of $X_2$ to the change in $Y$, we control for the influence of $X_3$.
  • 5. Deriving the OLS estimators of the multiple regression. With $Y = \hat\beta_1 + \hat\beta_2 X_2 + \hat\beta_3 X_3 + \hat u$, the residual is $\hat u = Y - \hat\beta_1 - \hat\beta_2 X_2 - \hat\beta_3 X_3$. OLS minimizes the residual sum of squares: $\min \text{RSS} = \min \sum \hat u^2 = \min \sum (Y - \hat\beta_1 - \hat\beta_2 X_2 - \hat\beta_3 X_3)^2$. The first-order conditions are $\partial\text{RSS}/\partial\hat\beta_1 = 2\sum (Y - \hat\beta_1 - \hat\beta_2 X_2 - \hat\beta_3 X_3)(-1) = 0$, $\partial\text{RSS}/\partial\hat\beta_2 = 2\sum (Y - \hat\beta_1 - \hat\beta_2 X_2 - \hat\beta_3 X_3)(-X_2) = 0$, and $\partial\text{RSS}/\partial\hat\beta_3 = 2\sum (Y - \hat\beta_1 - \hat\beta_2 X_2 - \hat\beta_3 X_3)(-X_3) = 0$.
  • 6. Rearranging the three normal equations: $n\hat\beta_1 + \hat\beta_2\sum X_2 + \hat\beta_3\sum X_3 = \sum Y$; $\hat\beta_1\sum X_2 + \hat\beta_2\sum X_2^2 + \hat\beta_3\sum X_2 X_3 = \sum X_2 Y$; $\hat\beta_1\sum X_3 + \hat\beta_2\sum X_2 X_3 + \hat\beta_3\sum X_3^2 = \sum X_3 Y$. Rewritten in matrix form (generalizing the 2-variable case to the 3-variable case), the coefficient matrix has rows $(n, \sum X_2, \sum X_3)$, $(\sum X_2, \sum X_2^2, \sum X_2 X_3)$, $(\sum X_3, \sum X_2 X_3, \sum X_3^2)$, multiplying $(\hat\beta_1, \hat\beta_2, \hat\beta_3)'$ to give $(\sum Y, \sum X_2 Y, \sum X_3 Y)'$; in matrix notation, $(X'X)\hat\beta = X'Y$.
  • 7. 7 1 = Y - 2X2 - 3X3 ^ ^ ^ _ _ _ n X2 Y X2 X2 2 X2Y X3 X2X3 X3Y n X2 X3 X2 X2 2 X2X3 X3 X2X3 X3 2 = 3 ^ = (yx3)(x2 2) - (yx2)(x2x3) (x2 2)(x3 2) - (x2x3)2 n Y X3 X2 X2Y X2X3 X3 X3Y X3 2 n X2 X3 X2 X2 2 X2X3 X3 X2X3 X3 2 = 2 ^ = (yx2)(x3 2) - (yx3)(x2x3) (x2 2)(x3 2) - (x2x3)2 Cramer’s rule:
  • 8. In matrix form, $(X'X)\hat\beta = X'Y$ (3x3 times 3x1 equals 3x1) gives $\hat\beta = (X'X)^{-1}(X'Y)$. The variance-covariance matrix $\text{var-cov}(\hat\beta)$ has $\operatorname{var}(\hat\beta_1), \operatorname{var}(\hat\beta_2), \operatorname{var}(\hat\beta_3)$ on the diagonal and $\operatorname{cov}(\hat\beta_i, \hat\beta_j)$ off the diagonal, and equals $\hat\sigma_u^2 (X'X)^{-1}$, where $\hat\sigma_u^2 = \sum \hat u^2/(n-3)$.
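The matrix formulas on this slide can be sketched in a few lines of numpy. This is a minimal illustration with simulated data (the coefficients 2.0, 0.7, 1.5 and all sample sizes are made up, not from the lecture's examples):

```python
import numpy as np

# Simulated data for a 3-variable regression Y = b1 + b2*X2 + b3*X3 + u.
rng = np.random.default_rng(0)
n = 50
X2 = rng.normal(10, 2, n)
X3 = rng.normal(5, 1, n)
Y = 2.0 + 0.7 * X2 + 1.5 * X3 + rng.normal(0, 1, n)

# Design matrix with a constant column; beta_hat = (X'X)^{-1} X'Y.
X = np.column_stack([np.ones(n), X2, X3])
XtX = X.T @ X
beta_hat = np.linalg.solve(XtX, X.T @ Y)

# Residual variance sigma^2_hat = RSS/(n-3) and var-cov = sigma^2_hat (X'X)^{-1}.
resid = Y - X @ beta_hat
sigma2_hat = resid @ resid / (n - 3)
var_cov = sigma2_hat * np.linalg.inv(XtX)
se = np.sqrt(np.diag(var_cov))      # standard errors of beta_hat
```

Solving the normal equations with `np.linalg.solve` is numerically preferable to forming the inverse explicitly; the inverse is still needed here for the variance-covariance matrix.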
  • 9. Written out, $\text{var-cov}(\hat\beta) = \hat\sigma_u^2 \begin{pmatrix} n & \sum X_2 & \sum X_3 \\ \sum X_2 & \sum X_2^2 & \sum X_2 X_3 \\ \sum X_3 & \sum X_3 X_2 & \sum X_3^2 \end{pmatrix}^{-1}$, with $\hat\sigma_u^2 = \sum \hat u^2/(n-3)$; in general $\hat\sigma_u^2 = \sum \hat u^2/(n-k)$, where here $k = 3$ is the number of estimated parameters (including the constant term).
  • 10. Scalar forms of the variances and standard errors of the OLS estimators: $\operatorname{var}(\hat\beta_2) = \frac{\sigma^2}{\sum x_{2i}^2 (1 - r_{23}^2)}$ with $se(\hat\beta_2) = \sqrt{\operatorname{var}(\hat\beta_2)}$; $\operatorname{var}(\hat\beta_3) = \frac{\sigma^2}{\sum x_{3i}^2 (1 - r_{23}^2)}$ with $se(\hat\beta_3) = \sqrt{\operatorname{var}(\hat\beta_3)}$; $\operatorname{cov}(\hat\beta_2, \hat\beta_3) = \frac{-r_{23}\,\sigma^2}{(1 - r_{23}^2)\sqrt{\sum x_{2i}^2}\sqrt{\sum x_{3i}^2}}$, where $r_{23}^2 = \frac{(\sum x_{2i} x_{3i})^2}{\sum x_{2i}^2 \sum x_{3i}^2}$.
  • 11. Properties of the multiple OLS estimators: 1. Linear in parameters and unbiased, $E(\hat\beta_i) = \beta_i$. The regression line (surface) passes through the means of $Y, X_2, X_3$, i.e., $\bar Y = \hat\beta_1 + \hat\beta_2 \bar X_2 + \hat\beta_3 \bar X_3$, so $\hat\beta_1 = \bar Y - \hat\beta_2 \bar X_2 - \hat\beta_3 \bar X_3$ (regression through the mean). 2. $\hat Y = \bar Y + \hat\beta_2 x_2 + \hat\beta_3 x_3$, or in deviation form $\hat y = \hat\beta_2 x_2 + \hat\beta_3 x_3$. 3. $\sum \hat u = 0$ (zero mean of the residuals). 4. $\sum \hat u X_2 = \sum \hat u X_3 = 0$ (in general $\sum \hat u X_k = 0$); constant variance $\operatorname{var}(u_i) = \sigma^2$. 5. $\sum \hat u \hat Y = 0$; random sample.
  • 12. Properties of the multiple OLS estimators (cont.): 6. As $X_2$ and $X_3$ become closely related, $\operatorname{var}(\hat\beta_2)$ and $\operatorname{var}(\hat\beta_3)$ become large, tending to infinity, so the true values of $\beta_2$ and $\beta_3$ are difficult to pin down. 7. The greater the variation in the sample values of $X_2$ or $X_3$, the smaller the variances of $\hat\beta_2$ and $\hat\beta_3$, and the more precise the estimates. 8. BLUE (Gauss-Markov theorem). All the normality assumptions of the two-variable regression also apply to multiple regression, with one additional assumption: no exact linear relationship among the independent variables (no perfect collinearity, i.e., $X_k \neq \lambda X_j$).
  • 13. The adjusted $R^2$ ($\bar R^2$) as one indicator of overall fit. $R^2 = \text{ESS}/\text{TSS} = 1 - \text{RSS}/\text{TSS} = 1 - \sum \hat u^2/\sum y^2$. $\bar R^2 = 1 - \frac{\sum \hat u^2/(n-k)}{\sum y^2/(n-1)} = 1 - \frac{\hat\sigma^2}{S_Y^2} = 1 - (1 - R^2)\frac{n-1}{n-k}$, where $k$ is the number of independent variables plus the constant term and $n$ the number of observations. Note $\bar R^2 \le R^2$ and $0 < R^2 < 1$, but the adjusted $R^2$ can be negative. Note: don't misuse the adjusted $R^2$; see Gujarati (2003), p. 222.
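The adjustment formula on this slide is easy to sketch directly; the numbers below ($R^2 = 0.9$, $n = 20$, $k = 4$) are illustrative, not from the lecture's data:

```python
# Adjusted R^2: R_bar^2 = 1 - (1 - R^2) * (n - 1) / (n - k),
# where k counts all parameters including the constant term.
def adjusted_r2(r2: float, n: int, k: int) -> float:
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k)

print(adjusted_r2(0.9, 20, 4))   # 0.88125, slightly below the raw R^2
```

Because $(n-1)/(n-k) \ge 1$ whenever $k \ge 1$, the adjusted value never exceeds $R^2$, and for a poor fit with many regressors it can go negative.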
  • 14. Some notes on $R^2$ and adjusted $R^2$. $R^2$ always increases when we add independent variables, so we use $\bar R^2$ to decide whether to add one more variable; the rule: add the variable if $\bar R^2$ increases and the t-test for the new variable is significant. We compare the $\bar R^2$ of two models to select the better model only if the two models have the same sample size $n$ and the same form of dependent variable. On the game of maximizing $\bar R^2$: in regression analysis we should be more concerned with the logical or theoretical relevance of $X$ to $Y$ and its significance, so it is not necessary to get a very high $R^2$ or adjusted $R^2$. If they are high, good; but if they are low, it does not mean the model is bad.
  • 15. Applying elasticities. For $Y = \beta_1 + \beta_2 X$, $\partial Y/\partial X = \beta_2 = \Delta Y/\Delta X$. The elasticity is $\varepsilon = \frac{\text{percentage change in } y}{\text{percentage change in } x} = \frac{\Delta y/y}{\Delta x/x} = \frac{\Delta y}{\Delta x}\cdot\frac{x}{y}$.
  • 16. Estimating elasticities. $\hat Y_t = b_1 + b_2 X_t = 4 + 1.5 X_t$, so $\Delta Y/\Delta X = b_2 = 1.5$. With $\bar X = 8$ (average number of years of experience) and $\bar Y = \$10$ (average wage rate), $\hat\varepsilon = b_2 \cdot \bar X/\bar Y = 1.5 \times \frac{8}{10} = 1.2$.
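The slide's arithmetic, evaluated at the sample means, can be checked in one line:

```python
# Slide's numbers: Y_hat = 4 + 1.5*X, X_bar = 8 years, Y_bar = $10.
b2 = 1.5
x_bar, y_bar = 8.0, 10.0
elasticity = b2 * x_bar / y_bar
print(elasticity)   # 1.2
```

So at the means, a 1% increase in experience is associated with roughly a 1.2% increase in the wage.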
  • 18. Elasticity of $Y$ with respect to $X$: $\varepsilon = \frac{\partial Y}{\partial X}\cdot\frac{X}{Y} = \beta_2 \frac{X}{Y}$.
  • 19. Summary (model, equation, slope $dY/dX$, elasticity $(dY/dX)(X/Y)$). Linear: $Y = \beta_1 + \beta_2 X$, slope $\beta_2$, elasticity $\beta_2 (X/Y)$. Log-log: $\ln Y = \beta_1 + \beta_2 \ln X$, where $\frac{d\ln Y}{d\ln X} = \frac{dY/Y}{dX/X} = \beta_2$, so the slope is $\beta_2 (Y/X)$ and the elasticity is $\beta_2$.
  • 20. Summary (cont.). Reciprocal: $Y = \beta_1 + \beta_2 \frac{1}{X}$, slope $\frac{dY}{dX} = -\beta_2 \frac{1}{X^2}$, elasticity $-\beta_2 \frac{1}{XY}$. Lin-log: $Y = \beta_1 + \beta_2 \ln X$, since $\frac{dY}{d\ln X} = \beta_2$ the slope is $\frac{dY}{dX} = \beta_2 \frac{1}{X}$ and the elasticity is $\beta_2 \frac{1}{Y}$. Log-lin: $\ln Y = \beta_1 + \beta_2 X$, since $\frac{d\ln Y}{dX} = \beta_2$ the slope is $\frac{dY}{dX} = \beta_2 Y$ and the elasticity is $\beta_2 X$.
  • 21. Application of functional-form regressions. 1. Cobb-Douglas production function: $Y = \beta_1 L^{\beta_2} K^{\beta_3} e^u$. Transforming by logs: $\ln Y = \ln\beta_1 + \beta_2 \ln L + \beta_3 \ln K + u = \beta_1' + \beta_2 \ln L + \beta_3 \ln K + u$. Then $\frac{d\ln Y}{d\ln L} = \beta_2$ is the elasticity of output with respect to labor input, and $\frac{d\ln Y}{d\ln K} = \beta_3$ the elasticity of output with respect to capital input; whether $\beta_2 + \beta_3 \gtrless 1$ gives information about returns to scale.
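The log-log transformation can be exercised on simulated data. In this sketch the technology, the elasticities 0.6 and 0.3, and all distributional choices are made up for illustration; OLS on the log form recovers them:

```python
import numpy as np

# Simulate Y = A * L^0.6 * K^0.3 * e^u and estimate the log-log form.
rng = np.random.default_rng(42)
n = 500
L = rng.lognormal(2.0, 0.5, n)
K = rng.lognormal(3.0, 0.5, n)
lnY = 1.0 + 0.6 * np.log(L) + 0.3 * np.log(K) + rng.normal(0.0, 0.05, n)

X = np.column_stack([np.ones(n), np.log(L), np.log(K)])
b = np.linalg.lstsq(X, lnY, rcond=None)[0]
print(b[1], b[2])      # close to the true elasticities 0.6 and 0.3
print(b[1] + b[2])     # near 0.9 here: decreasing returns to scale
```

The sum $\hat\beta_2 + \hat\beta_3$ is exactly the returns-to-scale quantity the slide refers to.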
  • 22. 2. Polynomial regression models: the marginal cost function, $Y = \beta_1 + \beta_2 X + \beta_3 X^2 + u$ (MC, $Y$ = costs), or the total cost function, $Y = \beta_1 + \beta_2 X + \beta_3 X^2 + \beta_4 X^3 + u$ (TC, $Y$ = costs).
  • 23. Chapter 8: Multiple Regression Analysis, The Problem of Inference
  • 24. Hypothesis testing in multiple regression: 1. testing an individual partial coefficient; 2. testing the overall significance of all coefficients; 3. testing restrictions on variables (add or drop): $\beta_k = 0$?; 4. testing partial coefficients under restrictions such as $\beta_2 + \beta_3 = 1$, or $\beta_2 = \beta_3$ (i.e., $\beta_2 - \beta_3 = 0$), etc.; 5. testing the functional form of the regression model; 6. testing the stability of the estimated regression model, over time or across cross-sections.
  • 25. 1. Individual partial coefficient test. (1) Holding $X_3$ constant, does $X_2$ affect $Y$? Is $\partial Y/\partial X_2 = \beta_2 = 0$? $H_0: \beta_2 = 0$, $H_1: \beta_2 \neq 0$. $t = (\hat\beta_2 - 0)/se(\hat\beta_2) = 0.726/0.048 = 14.906$. Compare with the critical value $t_c(0.025, 12) = 2.179$. Since $t > t_c$, reject $H_0$. Answer: yes, $\hat\beta_2$ is statistically significant and significantly different from zero.
  • 26. 1. Individual partial coefficient test (cont.). (2) Holding $X_2$ constant, does $X_3$ affect $Y$? Is $\partial Y/\partial X_3 = \beta_3 = 0$? $H_0: \beta_3 = 0$, $H_1: \beta_3 \neq 0$. $t = (\hat\beta_3 - 0)/se(\hat\beta_3) = (2.736 - 0)/0.848 = 3.226$. Critical value: $t_c(0.025, 12) = 2.179$. Since $|t| > |t_c|$, reject $H_0$. Answer: yes, $\hat\beta_3$ is statistically significant and significantly different from zero.
  • 27. 2. Testing the overall significance of the multiple regression. Three-variable case: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + u$. $H_0: \beta_2 = 0$ and $\beta_3 = 0$ (all slope coefficients are zero); $H_1: \beta_2 \neq 0$ or $\beta_3 \neq 0$ (at least one variable has an effect). Steps: 1. compute the F-statistic; 2. look up the critical value $F_c(\alpha, k-1, n-k)$; 3. compare $F$ and $F_c$, and if $F > F_c$, reject $H_0$.
  • 28. $F = \frac{\text{MSS of ESS}}{\text{MSS of RSS}} = \frac{\text{ESS}/(k-1)}{\text{RSS}/(n-k)} = \frac{\sum \hat y^2/(k-1)}{\sum \hat u^2/(n-k)}$, with $H_0: \beta_2 = \cdots = \beta_k = 0$ against $H_1:$ not all zero; if $F > F_c(\alpha, k-1, n-k)$, reject $H_0$. Analysis of variance (ANOVA table): due to regression (ESS), SS $= \sum \hat y^2$, df $= k-1$, MSS $= \sum \hat y^2/(k-1)$; due to residuals (RSS), SS $= \sum \hat u^2$, df $= n-k$, MSS $= \sum \hat u^2/(n-k) = \hat\sigma_u^2$; total variation (TSS), SS $= \sum y^2$, df $= n-1$. Note: $k$ is the total number of parameters including the intercept term. Since $y = \hat y + \hat u$, $\sum y^2 = \sum \hat y^2 + \sum \hat u^2$, i.e., TSS = ESS + RSS.
  • 29. Three-variable case: $\hat y = \hat\beta_2 x_2 + \hat\beta_3 x_3 + \hat u$, so $\sum y^2 = \hat\beta_2 \sum x_2 y + \hat\beta_3 \sum x_3 y + \sum \hat u^2$ (TSS = ESS + RSS). F-statistic $= \frac{\text{ESS}/(k-1)}{\text{RSS}/(n-k)} = \frac{(\hat\beta_2 \sum x_2 y + \hat\beta_3 \sum x_3 y)/(3-1)}{\sum \hat u^2/(n-3)}$. ANOVA table ($k = 3$): ESS $= \hat\beta_2 \sum x_2 y + \hat\beta_3 \sum x_3 y$, df $= 3-1$, MSS $= \text{ESS}/(3-1)$; RSS $= \sum \hat u^2$, df $= n-3$, MSS $= \text{RSS}/(n-3)$; TSS $= \sum y^2$, df $= n-1$.
  • 30. An important relationship between $R^2$ and $F$: $F = \frac{\text{ESS}/(k-1)}{\text{RSS}/(n-k)} = \frac{\text{ESS}}{\text{TSS}-\text{ESS}}\cdot\frac{n-k}{k-1} = \frac{\text{ESS}/\text{TSS}}{1 - \text{ESS}/\text{TSS}}\cdot\frac{n-k}{k-1} = \frac{R^2/(k-1)}{(1-R^2)/(n-k)}$, and inversely $R^2 = \frac{(k-1)F}{(k-1)F + (n-k)}$. For the three-variable case: $F = \frac{R^2/2}{(1-R^2)/(n-3)}$.
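The algebraic link between $R^2$ and $F$ is a clean round trip; the values $R^2 = 0.8$, $n = 25$, $k = 3$ below are made-up illustrations:

```python
# F from R^2 and back, per the identities on the slide.
def f_from_r2(r2: float, n: int, k: int) -> float:
    return (r2 / (k - 1)) / ((1.0 - r2) / (n - k))

def r2_from_f(f: float, n: int, k: int) -> float:
    return (k - 1) * f / ((k - 1) * f + (n - k))

f = f_from_r2(0.8, 25, 3)    # three-variable case: (R^2/2)/((1-R^2)/(n-3))
print(f)                     # 44.0
print(r2_from_f(f, 25, 3))   # recovers 0.8
```

This is why a high $R^2$ with adequate degrees of freedom almost always produces a significant overall F-statistic.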
  • 31. $R^2$ and the adjusted $R^2$ ($\bar R^2$): $R^2 = \text{ESS}/\text{TSS} = 1 - \text{RSS}/\text{TSS} = 1 - \sum \hat u^2/\sum y^2$; $\bar R^2 = 1 - \frac{\sum \hat u^2/(n-k)}{\sum y^2/(n-1)} = 1 - \frac{\hat\sigma^2}{S_Y^2} = 1 - (1-R^2)\frac{n-1}{n-k}$, where $k$ is the number of independent variables including the constant term and $n$ the number of observations. $\bar R^2 \le R^2$, $0 < R^2 < 1$, and the adjusted $R^2$ can be negative.
  • 32. Overall significance test: $H_0: \beta_2 = \beta_3 = \beta_4 = 0$; $H_1:$ at least one coefficient is not zero ($\beta_2 \neq 0$, or $\beta_3 \neq 0$, or $\beta_4 \neq 0$). $F^* = \frac{R^2/(k-1)}{(1-R^2)/(n-k)} = \frac{0.9710/3}{(1-0.9710)/16} = 179.13$. $F_c(0.05, 4-1, 20-4) = 3.24$. Since $F^* > F_c$, reject $H_0$.
  • 33. Construct the ANOVA table (8.4) from the EViews output. Due to regression (ESS): SS $= R^2 \sum y^2 = (0.971088)(28.97771)^2 \times 19 = 15493.171$, df $= k-1 = 3$, MSS $= R^2 \sum y^2/(k-1) = 5164.3903$. Due to residuals (RSS): SS $= (1-R^2)\sum y^2 = (0.028912)(28.97771)^2 \times 19 = 461.2621$, df $= n-k = 16$, MSS $= 28.8288$. Total (TSS): SS $= (28.97771)^2 \times 19 = 15954.446$, df $= n-1 = 19$. $F^* = \frac{\text{MSS of regression}}{\text{MSS of residual}} = \frac{5164.3903}{28.8288} = 179.1339$. Since $S_Y^2 = \sum y^2/(n-1)$, we have $(n-1)S_Y^2 = \sum y^2$.
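The table's F-statistic can be reproduced directly from the slide's reported $R^2$ and degrees of freedom, using the $R^2$-to-$F$ identity:

```python
# Reproducing the slide's F* from its reported R^2 = 0.971088, n = 20, k = 4.
r2, n, k = 0.971088, 20, 4
f_star = (r2 / (k - 1)) / ((1 - r2) / (n - k))
print(round(f_star, 1))   # about 179.1, matching the ANOVA table
```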
  • 34. Example: Gujarati (2003), Table 6.4, p. 185. $H_0: \beta_2 = \beta_3 = 0$. $F^* = \frac{\text{ESS}/(k-1)}{\text{RSS}/(n-k)} = \frac{R^2/(k-1)}{(1-R^2)/(n-k)} = \frac{0.707665/2}{(1-0.707665)/61} = 73.832$. $F_c(0.05, 3-1, 64-3) = 3.15$. Since $F^* > F_c$, reject $H_0$.
  • 35. Construct the ANOVA table (8.4) from the EViews output. Due to regression (ESS): SS $= R^2 \sum y^2 = (0.707665)(75.97807)^2 \times 64 = 261447.33$, df $= k-1 = 2$, MSS $= 130723.67$. Due to residuals (RSS): SS $= (1-R^2)\sum y^2 = (0.292335)(75.97807)^2 \times 64 = 108003.37$, df $= n-k = 61$, MSS $= 1770.547$. Total (TSS): SS $= (75.97807)^2 \times 64 = 369450.7$, df $= n-1 = 63$. $F^* = \frac{\text{MSS of regression}}{\text{MSS of residual}} = \frac{130723.67}{1770.547} = 73.832$. Since $S_Y^2 = \sum y^2/(n-1)$, we have $(n-1)S_Y^2 = \sum y^2$.
  • 36. $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + u$; $H_0: \beta_2 = 0, \beta_3 = 0$; $H_1: \beta_2 \neq 0$ or $\beta_3 \neq 0$. Comparing $F^*$ and $F_c$ from the F-table: $F_c(0.01, 2, 61) = 4.98$, $F_c(0.05, 2, 61) = 3.15$. Decision rule: since $F^* = 73.832 > F_c = 4.98$ (and 3.15), reject $H_0$. Answer: the estimated coefficients are jointly, statistically significantly different from zero.
  • 38. 3. Testing an additional variable in the regression model. Old model: $Y = \beta_1 + \beta_2 X_2 + u_1$; obtain $R^2_{old}$ or $\text{RSS}_{old}$ and/or $\text{ESS}_{old}$. Now consider a new variable $X_3$: is it relevant to add it or not? New model: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + u_2$; obtain $R^2_{new}$ or $\text{RSS}_{new}$ and $\text{ESS}_{new}$. $H_0: \beta_3 = 0$, adding $X_3$ is not relevant; $H_1: \beta_3 \neq 0$, adding $X_3$ is relevant.
  • 39. Steps for testing whether $X_3$ makes an incremental contribution to the explanatory power of the new model. 1. Compute the F-statistic ($J = 1$): $F^* = \frac{(\text{RSS}_{old} - \text{RSS}_{new})/(\# \text{ of additional variables})}{\text{RSS}_{new}/(n - \# \text{ of parameters in the new model})}$, here with denominator df $n-3$. 2. Compare $F^*$ and $F_c(\alpha, 1, n-3)$. 3. Decision rule: if $F^* > F_c$, reject $H_0: \beta_3 = 0$, meaning $X_3$ is a relevant variable to add to the model. $F^*$ can also be calculated as $F = \frac{(R^2_{new} - R^2_{old})/(\# \text{ of new regressors added or dropped})}{(1 - R^2_{new})/(n-k)}$, with $k$ from the new model.
  • 40. Adding an irrelevant variable (Studenmund, p. 166). Old model: $\text{RSS}_{old} = 160.5929$, $R^2_{old} = 0.986828$.
  • 41. New model, with the added variable. $H_0:$ adding $X_3$ (R) is not suitable, $\beta_3 = 0$. $F^* = \frac{(R^2_{new} - R^2_{old})/(\# \text{ of new regressors})}{(1 - R^2_{new})/(n-k \text{ in the new model})} = \frac{(0.9872 - 0.9868)/1}{(1 - 0.9872)/39} = 1.218$. $F_c(0.05, 1, 39) = 4.17$. Since $F^* < F_c$, do not reject $H_0$.
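The incremental-contribution F-statistic from slides 39-41 can be checked against the reported figures:

```python
# Incremental F test from the slide's R^2 figures:
# R^2_new = 0.9872, R^2_old = 0.9868, one added regressor, 39 residual df.
def incremental_f(r2_new, r2_old, n_added, df_resid):
    return ((r2_new - r2_old) / n_added) / ((1.0 - r2_new) / df_resid)

f_star = incremental_f(0.9872, 0.9868, 1, 39)
print(round(f_star, 2))   # about 1.22, well below Fc(0.05, 1, 39) = 4.17
```

The tiny F confirms the slide's conclusion: the added variable contributes essentially nothing, so H0 is not rejected.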
  • 42. Adding a relevant variable $X_3$ (Studenmund, p. 166). Old model: $Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + u$. New model: $Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_3 + u'$. $H_0:$ adding $X_3$ (the YD variable) is not suitable, $\beta_3 = 0$.
  • 43. Adding the relevant variable $X_3$ (YD) (Studenmund, p. 166), new model: $F^* = \frac{(R^2_{new} - R^2_{old})/df}{(1 - R^2_{new})/df} = \frac{(0.9868 - 0.9203)/1}{(1 - 0.9868)/(44-4)} = \frac{0.0665 \times 40}{0.0132} = 201.5$. $F_c(0.05, 1, 40) = 4.08$. Since $F^* > F_c$, reject $H_0$.
  • 44. Adding variables, general discussion: testing whether two additional variables $X_l, X_m$ are relevant or not. Old (restricted, $\beta_l = \beta_m = 0$): $Y_i = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_k X_k + u_i$. New (unrestricted): $Y_i = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_k X_k + \beta_l X_l + \beta_m X_m + u_i$. F-test: $H_0: \beta_l = 0$ and $\beta_m = 0$; $H_1: \beta_l \neq 0$ or $\beta_m \neq 0$. $F^* = \frac{(R^2_{new} - R^2_{old})/(\# \text{ of added regressors})}{(1 - R^2_{new})/(n - \# \text{ of total regressors in the new model})}$. If $F^* > F_c$, reject $H_0$.
  • 45. Dropping variables, general discussion: testing whether two variables $X_l, X_m$ can be dropped. Old (unrestricted): $Y_i = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_k X_k + \beta_l X_l + \beta_m X_m + u_i$. New (restricted, $\beta_l = \beta_m = 0$): $Y_i = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_k X_k + u_i$. F-test: $H_0: \beta_l = 0$ and $\beta_m = 0$; $H_1: \beta_l \neq 0$ or $\beta_m \neq 0$. $F^* = \frac{(R^2_{old} - R^2_{new})/(\# \text{ of dropped regressors})}{(1 - R^2_{old})/(n-k)}$. If $F^* > F_c$, reject $H_0$.
  • 46. Justifying whether to keep (or drop) $X_l$ and $X_m$ in the model. 1. Theory: check the signs of $\hat\beta_l$ and $\hat\beta_m$; do the variables theoretically explain the dependent variable? 2. Overall fit: does $\bar R^2$ increase or not? Does the $F^*$-statistic increase or not? 3. Check the t-statistics of $X_l$ and $X_m$: are $|t_l^*|, |t_m^*| > 1.96$ (5% level of significance)? 4. Bias: check the t-statistics of the other variables $X_2, \ldots, X_k$; have they changed significantly or not?
  • 47. Using $R^2$ instead of ESS or RSS in the F-test. Restriction test: $F = \frac{(R^2_{UR} - R^2_{R})/J}{(1 - R^2_{UR})/(n-k)} = \frac{(R^2_{UR} - R^2_{R})/(df_R - df_{UR})}{(1 - R^2_{UR})/df_{UR}}$, where $J$ is the number of restrictions and $k$ the number of parameters (including the intercept) in the unrestricted model. Note: we may use $R^2$ for the restriction test only if the two regression models have the same form of dependent variable.
  • 48. Example 8.4: the demand for chicken (Gujarati (2003), p. 272). Old model = restricted model.
  • 49. New model = unrestricted model. $H_0:$ no joint effect of $X_4$ and $X_5$, i.e., $\beta_4 = \beta_5 = 0$.
  • 50. Wald test (likelihood-ratio form). (1) Adding variables: $H_0:$ no joint effect of $X_4$ and $X_5$, i.e., $\beta_4 = \beta_5 = 0$. $F^* = \frac{(R^2_{new} - R^2_{old})/(\# \text{ added})}{(1 - R^2_{new})/(n-k)} = \frac{(0.9823 - 0.9801)/2}{(1 - 0.9823)/(23-5)} = \frac{0.0011}{0.000983} = 1.119$. $F_c(0.05, 2, 18) = 3.55$. Since $F^* < F_c$, do not reject $H_0$.
  • 51. (2) Dropping a variable ($\beta_5$): $H_0:$ no effect of $X_5$, i.e., $\beta_5 = 0$. $F^* = \frac{(R^2_{UR} - R^2_{R})/1}{(1 - R^2_{UR})/(n-k)} = \frac{(0.982313 - 0.981509)/1}{(1 - 0.982313)/(23-4)} = 0.864 < F_c(0.05, 1, 19) = 4.38$, so do not reject $H_0$. Likewise, since $t^* < t_c$, do not reject $H_0$.
  • 52. 4. Testing partial coefficients under restrictions: restricted least squares, constant returns to scale. Cobb-Douglas: $Y = \beta_1 X_2^{\beta_2} X_3^{\beta_3} e^u$ with restriction $\beta_2 + \beta_3 = 1$ (so $\beta_2 = 1 - \beta_3$, or $\beta_3 = 1 - \beta_2$). Unrestricted model: $\ln Y = \beta_1 + \beta_2 \ln X_2 + \beta_3 \ln X_3 + u$. Imposing $\beta_2 = 1 - \beta_3$: $\ln Y = \beta_1 + (1 - \beta_3)\ln X_2 + \beta_3 \ln X_3 + u$ $\Rightarrow$ $\ln Y = \beta_1 + \ln X_2 + \beta_3(\ln X_3 - \ln X_2) + u$ $\Rightarrow$ $\ln Y - \ln X_2 = \beta_1 + \beta_3(\ln X_3 - \ln X_2) + u$ $\Rightarrow$ $\ln(Y/X_2) = \beta_1' + \beta_3' \ln(X_3/X_2) + u'$, the restricted model $Y^* = \beta_1' + \beta_3' X^* + u'$.
  • 53. OR, imposing $\beta_3 = 1 - \beta_2$: $\ln Y = \beta_1 + \beta_2 \ln X_2 + (1 - \beta_2)\ln X_3 + u$ $\Rightarrow$ $\ln Y = \beta_1 + \beta_2 \ln X_2 + \ln X_3 - \beta_2 \ln X_3 + u$ $\Rightarrow$ $\ln Y - \ln X_3 = \beta_1 + \beta_2(\ln X_2 - \ln X_3) + u$ $\Rightarrow$ $\ln(Y/X_3) = \beta_1'' + \beta_2'' \ln(X_2/X_3) + u''$, the restricted model $Y^{**} = \beta_1'' + \beta_2'' X^{**} + u''$.
  • 54. $H_0: \beta_2 + \beta_3 = 1$. $F = \frac{(\text{RSS}_R - \text{RSS}_{UR})/m}{\text{RSS}_{UR}/(n-k)}$, where $m$ is the number of restrictions in the restricted model and $k$ the number of parameters in the unrestricted model. Unrestricted equation: $\ln Y = \beta_1 + \beta_2 \ln X_2 + \beta_3 \ln X_3 + u$; restricted equation: $\ln(Y/X_2) = \beta_1' + \beta_3' \ln(X_3/X_2)$. $\text{RSS}_{UR} = 0.013604$, $\text{RSS}_R = 0.016629$, $F^* = 3.75$; $F_c(0.05, 1, 17) = 4.45$.
  • 55. Equivalently, with the other restricted form: unrestricted equation $\ln Y = \beta_1 + \beta_2 \ln X_2 + \beta_3 \ln X_3 + u$; restricted equation $\ln(Y/X_3) = \beta_1'' + \beta_2'' \ln(X_2/X_3)$. $H_0: \beta_2 + \beta_3 = 1$. $F = \frac{(\text{RSS}_R - \text{RSS}_{UR})/m}{\text{RSS}_{UR}/(n-k)}$ with $\text{RSS}_{UR} = 0.013604$ and $\text{RSS}_R = 0.016629$ gives $F^* = 3.75 < F_c$, so do not reject $H_0$.
  • 56. Test for a restriction on parameters: t-test approach. Restriction: $\beta_2 = \beta_3$ (i.e., $\beta_2 - \beta_3 = \lambda = 0$, or $\beta_2 = \lambda + \beta_3$). Unrestricted model: $Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + u$. Rewrite the equation: $Y = \beta_1 + (\lambda + \beta_3)X_2 + \beta_3 X_3 + u'$ $\Rightarrow$ $Y = \beta_1 + \lambda X_2 + \beta_3 X_2 + \beta_3 X_3 + u'$ $\Rightarrow$ $Y = \beta_1 + \lambda X_2 + \beta_3(X_2 + X_3) + u'$, the restricted model with regressor $X_3^* = X_2 + X_3$. Simply use the t-value to test whether $\lambda$ is zero or not, or use an F-test. Alternatively, compute $t = \frac{\hat\beta_2 - \hat\beta_3}{se(\hat\beta_2 - \hat\beta_3)}$ with $se(\hat\beta_2 - \hat\beta_3) = \sqrt{\operatorname{var}(\hat\beta_2) + \operatorname{var}(\hat\beta_3) - 2\operatorname{cov}(\hat\beta_2, \hat\beta_3)}$, compare to $t_c$, and follow the t-test decision rule.
  • 57. Example: unrestricted model vs. restricted model. $H_0: \lambda = 0$, $H_1: \lambda \neq 0$; $t_c(0.05, 931) = 1.96$. Equivalently $H_0: \beta_3 - \beta_4 = 0$: $F^* = \frac{(R^2_{UR} - R^2_{R})/m}{(1 - R^2_{UR})/(n-k)} \approx 0$, $F_c(0.05, 4, 931) = 2.37$, so do not reject $H_0$.
  • 58. 5. Test for functional form: the MWD test (MacKinnon, White, Davidson). Optional reading.
  • 59. The MWD test (MacKinnon, White, Davidson) for functional form (Gujarati (2003), p. 280). $H_0:$ linear model; $H_1:$ log-linear (log-log) model. 1. Run OLS on the linear model, $\hat Y = \hat\beta_1 + \hat\beta_2 X_2 + \hat\beta_3 X_3$, and obtain $\hat Y$. 2. Run OLS on the log-log model, $\widehat{\ln Y} = \hat\beta_1 + \hat\beta_2 \ln X_2 + \hat\beta_3 \ln X_3$, and obtain $\widehat{\ln Y}$. 3. Compute $Z_1 = \ln(\hat Y) - \widehat{\ln Y}$. 4. Run OLS on the linear model adding $Z_1$: $Y = \hat\beta_1' + \hat\beta_2' X_2 + \hat\beta_3' X_3 + \hat\beta_4' Z_1$, and check the t-statistic of $\hat\beta_4'$. If $|t_4^*| > t_c$, reject $H_0$ (the linear model); if $|t_4^*| < t_c$, do not reject $H_0$.
  • 60. MWD test for functional form (cont.). 5. Compute $Z_2 = \text{antilog}(\widehat{\ln Y}) - \hat Y$. 6. Run OLS on the log-log model adding $Z_2$: $\ln Y = \hat\beta_1' + \hat\beta_2' \ln X_2 + \hat\beta_3' \ln X_3 + \hat\beta_4' Z_2$, and check the t-statistic of $\hat\beta_4'$. If $|t_4^*| > t_c$, reject $H_0$ (the log-log model); if $|t_4^*| < t_c$, do not reject $H_0$.
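Steps 1-4 of the MWD procedure can be sketched on simulated data. Everything numeric here (the linear data-generating process, sample size, seeds) is made up for illustration; the point is the mechanics of building $Z_1$ and testing its coefficient:

```python
import numpy as np

# Simulated data whose true form is linear.
rng = np.random.default_rng(7)
n = 100
X2 = rng.uniform(1, 10, n)
X3 = rng.uniform(1, 10, n)
Y = 5 + 2 * X2 + 3 * X3 + rng.normal(0, 1, n)

def ols_fit(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return b, X @ b

Xlin = np.column_stack([np.ones(n), X2, X3])
Xlog = np.column_stack([np.ones(n), np.log(X2), np.log(X3)])

_, y_hat = ols_fit(Xlin, Y)              # step 1: fitted Y from the linear model
_, lny_hat = ols_fit(Xlog, np.log(Y))    # step 2: fitted lnY from the log-log model
z1 = np.log(y_hat) - lny_hat             # step 3: Z1 = ln(Y_hat) - lnY_hat

# Step 4: add Z1 to the linear model and inspect its t-statistic.
Xaug = np.column_stack([Xlin, z1])
b, fit = ols_fit(Xaug, Y)
resid = Y - fit
s2 = resid @ resid / (n - Xaug.shape[1])
se = np.sqrt(s2 * np.linalg.inv(Xaug.T @ Xaug).diagonal())
t_z1 = b[-1] / se[-1]                    # compare |t_z1| with t_c
```

Steps 5-6 mirror this with $Z_2 = \exp(\widehat{\ln Y}) - \hat Y$ added to the log-log regression.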
  • 61. MWD test example (Example 8.5). Step 1: run the linear model and obtain the fitted $\hat Y$. $CV_1 = \hat\sigma/\bar Y = 1583.279/24735.33 = 0.064$.
  • 62. Step 2: run the log-log model and obtain the fitted $\widehat{\ln Y}$. $CV_2 = \hat\sigma/\bar Y = 0.07481/10.09653 = 0.0074$.
  • 63. Step 4: $H_0:$ the true model is linear. $t_c(0.05, 11) = 1.796$; $t_c(0.10, 11) = 1.363$. $t^* < t_c$ at the 5% level $\Rightarrow$ do not reject $H_0$; $t^* > t_c$ at the 10% level $\Rightarrow$ reject $H_0$.
  • 64. Step 6: $H_0:$ the true model is log-log. $t_c(0.025, 11) = 2.201$, $t_c(0.05, 11) = 1.796$, $t_c(0.10, 11) = 1.363$. Since $t^* < t_c$, do not reject $H_0$. Comparing the coefficients of variation: $C.V._1 = 0.064$ vs. $C.V._2 = 0.0074$.
  • 65. An alternative criterion for comparing two different functional models: the coefficient of variation, $C.V. = \hat\sigma/\bar Y$. It measures the average error of the sample regression function relative to the mean of $Y$, so linear, log-linear, and log-log equations can be meaningfully compared. The smaller the C.V. of a model, the more preferred the equation (functional form).
  • 66. Comparing two functional-form models by the coefficient of variation. Model 1 (linear): $\hat\sigma/\bar Y = 2.1225/89.612 = 0.0236$; Model 2 (log-log): $\hat\sigma/\bar Y = 0.0217/4.4891 = 0.0048$. Ratio $= 0.0236/0.0048 = 4.916$, which means that model 2 is better.
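The comparison can be recomputed from the slide's reported $\hat\sigma$ and $\bar Y$ values (the slide's 4.916 comes from the rounded C.V. figures; the unrounded inputs give roughly 4.9):

```python
# C.V. comparison from the slide's reported sigma_hat and Y_bar.
cv_linear = 2.1225 / 89.612     # model 1 (linear)
cv_loglog = 0.0217 / 4.4891     # model 2 (log-log)
print(cv_linear / cv_loglog)    # roughly 4.9: the log-log fit is preferred
```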
  • 67. 6. The Chow test: a test of structural stability. $H_0:$ no structural change; $H_1:$ structural change. Procedure: 1. Divide the sample of $N$ observations into two groups: group 1 consisting of the first $n_1$ observations, and group 2 consisting of the remaining $n_2 = N - n_1$ observations.
  • 68. 2. Run OLS on the two subsample groups separately and obtain $\text{RSS}_1$ and $\text{RSS}_2$. 3. Run OLS on the whole sample ($N$) and obtain the restricted $\text{RSS}_R$. 4. Compute $F^* = \frac{(\text{RSS}_R - \text{RSS}_1 - \text{RSS}_2)/k}{(\text{RSS}_1 + \text{RSS}_2)/(N-2k)}$. 5. Compare $F^*$ with $F_c(\alpha, k, N-2k)$: if $F^* > F_c$, reject $H_0$, meaning there is a structural change in the sample.
  • 70. [Figure: scatter plot of income and savings.]
  • 71. Whole sample: the restricted regression, giving $\text{RSS}_R$. Structural stability requires $H_0: \operatorname{var}(u_1) = \operatorname{var}(u_2) = \sigma^2$.
  • 72. Subsample $n_1$: $Y = \beta_1 + \beta_2 X + u_1$, giving $\text{RSS}_1$.
  • 73. Subsample $n_2$: $Y = \lambda_1 + \lambda_2 X + u_2$, giving $\text{RSS}_2$.
  • 74. Empirical results (dependent variable $Y$; t-ratios in parentheses). Whole sample, 1970-95: constant 62.4226 (4.89), slope on $X$ 0.0376 (8.89), $\bar R^2 = 0.7672$, SEE = 31.12, RSS = 23248.3, $n = 26$. Subsample 1970-81: constant 1.0161 (0.08), slope 0.0803 (9.60), $\bar R^2 = 0.9021$, SEE = 13.36, RSS = 1785.03, $n = 12$. Subsample 1982-95: constant 153.494 (4.69), slope 0.0148 (1.77), $\bar R^2 = 0.2071$, SEE = 28.87, RSS = 10005.2, $n = 14$. $F^* = \frac{(\text{RSS}_R - \text{RSS}_1 - \text{RSS}_2)/k}{(\text{RSS}_1 + \text{RSS}_2)/(N-2k)} = \frac{(23248.3 - 1785.03 - 10005.2)/2}{(1785.03 + 10005.2)/22} = 10.69$. $F_c(0.01) = 5.72$, $F_c(0.05) = 3.44$, $F_c(0.10) = 2.56$. Conclusion: $F^* > F_c$, so reject $H_0$.
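The Chow F-statistic follows directly from the table's RSS figures ($k = 2$ parameters per regression, $N = 26$):

```python
# Chow-test F from the slide's RSS figures.
rss_r, rss1, rss2 = 23248.3, 1785.03, 10005.2
k, N = 2, 26
f_star = ((rss_r - rss1 - rss2) / k) / ((rss1 + rss2) / (N - 2 * k))
print(round(f_star, 2))   # about 10.69 > Fc(0.05, 2, 22) = 3.44: reject H0
```

The large drop from the pooled RSS to the sum of the subsample RSS is exactly what signals the structural break.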
  • 75. Prediction in multiple regression. The procedure is exactly the same as prediction in the two-variable model. In matrix notation, the formula is given in Appendix C9 on page 861 of the book. We again distinguish "mean prediction" from "individual prediction", and the software will compute both for you.