This document discusses multiple linear regression analysis. It begins by introducing multiple regression and defining the notation used. It then shows how to derive the OLS estimators for multiple regression by minimizing the residual sum of squares: the partial derivatives of the RSS with respect to each beta coefficient are set equal to zero and the resulting normal equations are solved. The document further discusses the properties of the OLS estimators, how to calculate variances and standard errors, R-squared and the adjusted R-squared, hypothesis testing (individual and overall significance, restrictions on coefficients, the MWD functional-form test, and the Chow stability test), and applications of functional-form regression including Cobb-Douglas production functions and polynomial cost models.
4. Explanation of the partial (regression) coefficients

Y = β1 + β2X2 + β3X3 + u   (suppose this is the true model)

∂Y/∂X2 = β2 : β2 measures the change in the mean value of Y per unit change in X2, holding X3 constant; this is the 'direct' or 'net' effect of a unit change in X2 on the mean value of Y.

∂Y/∂X3 = β3 : holding X2 constant, β3 is the direct effect of a unit change in X3 on the mean value of Y.

To assess the true contribution of X2 to the change in Y, we control for the influence of X3.
5. Deriving the OLS estimators of multiple regression

Model:     Y = β1 + β2X2 + β3X3 + u
Residual:  û = Y - β̂1 - β̂2X2 - β̂3X3

OLS minimizes the residual sum of squares (Σû²):

min RSS = min Σû² = min Σ(Y - β̂1 - β̂2X2 - β̂3X3)²

First-order conditions (set each partial derivative to zero):

∂RSS/∂β̂1 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-1)  = 0
∂RSS/∂β̂2 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-X2) = 0
∂RSS/∂β̂3 = 2 Σ(Y - β̂1 - β̂2X2 - β̂3X3)(-X3) = 0
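The three first-order conditions above are the normal equations; solving them is equivalent to solving (X'X)β̂ = X'Y. Below is a minimal NumPy sketch of that solution; the Y, X2 and X3 arrays are illustrative placeholders, not data from the slides.

```python
import numpy as np

# Illustrative data (placeholders, not the slides' data set)
Y  = np.array([10., 12., 15., 19., 22., 26., 31., 35.])
X2 = np.array([ 1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.])
X3 = np.array([ 2.,  1.,  4.,  3.,  6.,  5.,  8.,  7.])

# Design matrix with a constant column: columns are (1, X2, X3)
X = np.column_stack([np.ones_like(X2), X2, X3])

# Normal equations (X'X) beta_hat = X'Y, i.e. the three
# first-order conditions written in matrix form
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

u_hat = Y - X @ beta_hat          # residuals
rss   = float(u_hat @ u_hat)      # residual sum of squares

print("beta_hat (b1, b2, b3):", beta_hat)
print("RSS:", rss)
```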
9. Normal equations in matrix form and the estimator of σ²

In matrix form the normal equations involve the (X'X) matrix

  | n      ΣX2     ΣX3    |
  | ΣX2    ΣX2²    ΣX2X3  |
  | ΣX3    ΣX3X2   ΣX3²   |

and the inverse of this matrix, multiplied by σ̂², gives the variances and covariances of the OLS estimators (the scalar forms are on the next slide).

The estimator of the error variance is

σ̂² = Σû² / (n - 3),   and in general   σ̂² = Σû² / (n - k),

where k is the number of estimated parameters, i.e. the number of independent variables including the constant term (here k = 3).
10. Scalar forms of the Var and SE of the OLS estimators

var(β̂2) = σ² / [ Σx2i² (1 - r23²) ]        se(β̂2) = sqrt( var(β̂2) )

var(β̂3) = σ² / [ Σx3i² (1 - r23²) ]        se(β̂3) = sqrt( var(β̂3) )

cov(β̂2, β̂3) = -r23 σ² / [ (1 - r23²) sqrt(Σx2i²) sqrt(Σx3i²) ]

where x2i and x3i are deviations from the sample means, and r23 is the correlation coefficient between X2 and X3:

r23² = ( Σ x2i x3i )² / ( Σx2i² Σx3i² )
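A small sketch computing these scalar formulas, assuming NumPy and the same kind of illustrative Y, X2, X3 arrays as above; σ² is estimated by σ̂² = Σû²/(n - 3).

```python
import numpy as np

def ols_var_se(Y, X2, X3):
    """Scalar-form variances, covariance and SEs for the 3-variable model."""
    n = len(Y)
    X = np.column_stack([np.ones(n), X2, X3])
    b = np.linalg.solve(X.T @ X, X.T @ Y)
    u = Y - X @ b
    sigma2 = (u @ u) / (n - 3)            # sigma^2-hat = RSS/(n-k), k = 3

    x2 = X2 - X2.mean()                   # deviations from means
    x3 = X3 - X3.mean()
    r23 = (x2 @ x3) / np.sqrt((x2 @ x2) * (x3 @ x3))

    var_b2 = sigma2 / ((x2 @ x2) * (1 - r23**2))
    var_b3 = sigma2 / ((x3 @ x3) * (1 - r23**2))
    cov_b2_b3 = -r23 * sigma2 / (
        (1 - r23**2) * np.sqrt(x2 @ x2) * np.sqrt(x3 @ x3))
    return var_b2, var_b3, cov_b2_b3, np.sqrt(var_b2), np.sqrt(var_b3)
```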
11. Properties of multiple OLS estimators

1. The regression line (surface) passes through the means of Y, X2, X3 (regression through the means), i.e.,
   Ȳ = β̂1 + β̂2X̄2 + β̂3X̄3   ==>   β̂1 = Ȳ - β̂2X̄2 - β̂3X̄3
2. Ŷ = Ȳ + β̂2x2 + β̂3x3,  or in deviation form  ŷ = β̂2x2 + β̂3x3
3. Σû = 0 (zero mean of the residuals)
4. ΣûX2 = ΣûX3 = 0  (in general, ΣûXk = 0)
5. ΣûŶ = 0

These hold under the assumptions of a model that is linear in parameters, a random sample, and constant variance Var(ui) = σ²; the estimators are unbiased: E(β̂i) = βi.
12. Properties of multiple OLS estimators (cont.)

6. If X2 and X3 are closely related, var(β̂2) and var(β̂3) become large, tending to infinity; the true values of β2 and β3 are then difficult to pin down.

All the normality assumptions of the two-variable regression also apply to multiple regression. One additional assumption is needed: no exact linear relationship among the independent variables (no perfect collinearity, i.e., Xk ≠ λXj).

7. The greater the variation in the sample values of X2 or X3, the smaller the variances of β̂2 and β̂3, and the more precise the estimates.

8. The OLS estimators are BLUE (Gauss-Markov theorem).
13. The adjusted R² (R̄²) as one indicator of overall fit

R² = ESS/TSS = 1 - RSS/TSS = 1 - Σû²/Σy²,   with 0 ≤ R² ≤ 1

R̄² = 1 - σ̂²/S_Y²
    = 1 - [Σû²/(n-k)] / [Σy²/(n-1)]
    = 1 - (Σû²/Σy²) (n-1)/(n-k)
    = 1 - (1 - R²) (n-1)/(n-k)

k : number of independent variables plus the constant term
n : number of observations

R̄² ≤ R², and the adjusted R² can be negative.

Note: don't misuse the adjusted R²; see Gujarati (2003), pp. 222.
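A minimal sketch of these two formulas, assuming NumPy and a residual vector from a fitted model; names are illustrative.

```python
import numpy as np

def r2_and_adjusted_r2(Y, u_hat, k):
    """R^2 = 1 - RSS/TSS and adjusted R^2 = 1 - (1 - R^2)(n-1)/(n-k).

    k counts the estimated parameters, including the constant term."""
    n = len(Y)
    y = Y - Y.mean()                 # deviations of Y from its mean
    rss = u_hat @ u_hat              # residual sum of squares
    tss = y @ y                      # total sum of squares
    r2 = 1.0 - rss / tss
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - k)
    return r2, r2_adj
```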
14. Some notes on R² and adjusted R²

• R² always increases when we add independent variables, so we use R̄² to decide whether or not to add one more variable. The rule that can be applied is: add the variable if R̄² increases and the t-test for the new variable is significant.
• We compare the R̄² of two models to select the better model only if the two models have the same sample size n and the same form of the dependent variable.
• The game of maximizing R̄²: in regression analysis we should be more concerned with the logical or theoretical relevance of the X's to Y and their significance. It is therefore not necessary to obtain a very high R² or adjusted R². If they are high, that is good, but if they are low it does not mean the model is bad.
15. Applying elasticities

For the linear model  Y = β1 + β2X :

slope:       ∂Y/∂X = β2
elasticity:  (∂Y/∂X)(X/Y) = β2 (X/Y)

In general,

elasticity = (percentage change in Y) / (percentage change in X) = (Δy/y) / (Δx/x) = (Δy/Δx)(x/y)
19. Summary of functional forms

Model     Equation                 Slope (dY/dX)   Elasticity (dY/Y)/(dX/X)
Linear    Y = β1 + β2X             β2              β2 (X/Y)
Log-log   ln Y = β1 + β2 ln X      β2 (Y/X)        β2

For the log-log model,  d lnY / d lnX = (dY/Y)/(dX/X) = β2,  which implies  dY/dX = β2 (Y/X).
21. Application of functional-form regression

1. Cobb-Douglas production function:

Y = β1 L^β2 K^β3 e^u

Transforming (taking logs):

ln Y = ln β1 + β2 ln L + β3 ln K + u  =  β1' + β2 ln L + β3 ln K + u

d lnY / d lnL = β2 : elasticity of output w.r.t. the labor input
d lnY / d lnK = β3 : elasticity of output w.r.t. the capital input

β2 + β3 >, =, or < 1 gives information about returns to scale.
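A minimal sketch of estimating the log-transformed Cobb-Douglas function by OLS, assuming statsmodels is available; the output, labor and capital arrays are illustrative placeholders, not the slides' data.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative data (placeholders)
Y = np.array([ 90., 120., 150., 200., 260., 300., 350., 410.])
L = np.array([ 10.,  14.,  18.,  25.,  33.,  38.,  45.,  52.])
K = np.array([ 50.,  60.,  75.,  95., 120., 140., 160., 190.])

# Log-log (Cobb-Douglas) specification: lnY = b1' + b2 lnL + b3 lnK + u
X = sm.add_constant(np.column_stack([np.log(L), np.log(K)]))
res = sm.OLS(np.log(Y), X).fit()

b2, b3 = res.params[1], res.params[2]      # output elasticities of L and K
print(res.params)                          # [b1', b2, b3]
print("returns to scale (b2 + b3):", b2 + b3)
```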
22. 2. Polynomial regression model: marginal cost and total cost functions

Marginal cost (MC):  Y = β1 + β2X + β3X² + u
(figure: MC curve, costs plotted against output y)

Total cost (TC):  Y = β1 + β2X + β3X² + β4X³ + u
(figure: TC curve, costs plotted against output y)
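A small sketch of fitting the cubic total-cost polynomial by OLS, assuming NumPy; the output and cost arrays are illustrative placeholders.

```python
import numpy as np

# Illustrative output (X) and total cost (Y) data
X = np.array([ 1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9., 10.])
Y = np.array([24., 30., 34., 38., 43., 50., 60., 74., 93., 118.])

# Design matrix for Y = b1 + b2*X + b3*X^2 + b4*X^3 + u
Z = np.column_stack([np.ones_like(X), X, X**2, X**3])
b, *_ = np.linalg.lstsq(Z, Y, rcond=None)

print("TC coefficients (b1, b2, b3, b4):", b)
# Implied marginal cost: dY/dX = b2 + 2*b3*X + 3*b4*X^2
print("MC at X=5:", b[1] + 2*b[2]*5 + 3*b[3]*5**2)
```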
24. Hypothesis testing in multiple regression

1. Testing an individual partial coefficient
2. Testing the overall significance of all coefficients
3. Testing restrictions on variables (add or drop): βk = 0 ?
4. Testing partial coefficients under some restrictions, such as β2 + β3 = 1, or β2 = β3 (i.e., β2 - β3 = 0), etc.
5. Testing the functional form of the regression model
6. Testing the stability of the estimated regression model: over time, or across different cross-sections
25. 1. Individual partial coefficient test

Holding X3 constant: does X2 have an effect on Y?   ∂Y/∂X2 = β2 = 0 ?

H0 : β2 = 0
H1 : β2 ≠ 0

t = (β̂2 - 0) / se(β̂2) = 0.726 / 0.048 = 14.906

Compare with the critical value tc(0.025, 12) = 2.179.
Since t > tc ==> reject H0.
Answer: yes, β̂2 is statistically significant and significantly different from zero.
26. 1. Individual partial coefficient test (cont.)

Holding X2 constant: does X3 have an effect on Y?   ∂Y/∂X3 = β3 = 0 ?

H0 : β3 = 0
H1 : β3 ≠ 0

t = (β̂3 - 0) / se(β̂3) = (2.736 - 0) / 0.848 = 3.226

Critical value: tc(0.025, 12) = 2.179.
Since |t| > |tc| ==> reject H0.
Answer: yes, β̂3 is statistically significant and significantly different from zero.
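A minimal sketch of this individual t test, assuming SciPy for the critical value; the coefficient, standard error and degrees of freedom below are the slides' example numbers.

```python
from scipy import stats

def t_test(beta_hat, se, df, alpha=0.05):
    """Two-sided t test of H0: beta = 0 against H1: beta != 0."""
    t_stat = (beta_hat - 0.0) / se
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df)   # e.g. t(0.025, 12) = 2.179
    return t_stat, t_crit, abs(t_stat) > t_crit

# Example from the slides: beta3-hat = 2.736, se = 0.848, df = 12
print(t_test(2.736, 0.848, 12))    # (3.226..., 2.178..., True) -> reject H0
```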
27. 2. Testing the overall significance of the multiple regression

Three-variable case:  Y = β1 + β2X2 + β3X3 + u

H0 : β2 = 0 and β3 = 0 (no variable has any effect)
H1 : β2 ≠ 0 or β3 ≠ 0 (at least one variable has an effect)

1. Compute and obtain the F-statistic.
2. Look up the critical value Fc(α, k-1, n-k).
3. Compare F and Fc; if F > Fc ==> reject H0.
28.
H0 : β2 = ... = βk = 0
H1 : at least one of β2, ..., βk ≠ 0

F = (MSS of ESS) / (MSS of RSS) = [ESS/(k-1)] / [RSS/(n-k)] = [Σŷ²/(k-1)] / [Σû²/(n-k)]

If F > Fc(α, k-1, n-k) ==> reject H0.

Analysis of variance (ANOVA table):

Source of variation         Sum of squares (SS)   df     Mean sum of squares (MSS)
Due to regression (ESS)     Σŷ²                   k-1    Σŷ²/(k-1)
Due to residuals (RSS)      Σû²                   n-k    Σû²/(n-k) = σ̂²
Total variation (TSS)       Σy²                   n-1

Note: k is the total number of parameters, including the intercept term.

Since y = ŷ + û  ==>  Σy² = Σŷ² + Σû²,  i.e.,  TSS = ESS + RSS.
29. Three-variable case

y = β̂2 x2 + β̂3 x3 + û   (deviation form)

Σy² = β̂2 Σx2y + β̂3 Σx3y + Σû²,   i.e.,  TSS = ESS + RSS

F-statistic = [ESS/(k-1)] / [RSS/(n-k)] = [(β̂2 Σx2y + β̂3 Σx3y)/(3-1)] / [Σû²/(n-3)]

ANOVA table (k = 3):

Source of variation   SS                        df     MSS
ESS                   β̂2 Σx2y + β̂3 Σx3y        3-1    ESS/(3-1)
RSS                   Σû²                       n-3    RSS/(n-3)
TSS                   Σy²                       n-1
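A small sketch of the overall F test and the TSS = ESS + RSS decomposition for the three-variable case, assuming NumPy/SciPy and illustrative data arrays.

```python
import numpy as np
from scipy import stats

def overall_f_test(Y, X2, X3, alpha=0.05):
    """Overall significance test H0: beta2 = beta3 = 0 via TSS = ESS + RSS."""
    n, k = len(Y), 3
    X = np.column_stack([np.ones(n), X2, X3])
    b = np.linalg.solve(X.T @ X, X.T @ Y)
    u = Y - X @ b                         # residuals
    y = Y - Y.mean()                      # deviations of Y

    tss = y @ y
    rss = u @ u
    ess = tss - rss                       # = b2*sum(x2*y) + b3*sum(x3*y)

    F = (ess / (k - 1)) / (rss / (n - k))
    F_crit = stats.f.ppf(1 - alpha, k - 1, n - k)
    return F, F_crit, F > F_crit
```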
30. An important relationship between R² and F

F = [ESS/(k-1)] / [RSS/(n-k)]
  = [ESS/RSS] (n-k)/(k-1)
  = [ESS/(TSS - ESS)] (n-k)/(k-1)
  = [(ESS/TSS) / (1 - ESS/TSS)] (n-k)/(k-1)
  = [R²/(1 - R²)] (n-k)/(k-1)
  = [R²/(k-1)] / [(1 - R²)/(n-k)]

Equivalently,  R² = (k-1)F / [(k-1)F + (n-k)].

For the three-variable case:  F = [R²/2] / [(1 - R²)/(n-3)]
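A tiny sketch verifying this identity numerically (plain Python, using the slides' rounded example values).

```python
def f_from_r2(r2, n, k):
    """F = [R^2/(k-1)] / [(1-R^2)/(n-k)]"""
    return (r2 / (k - 1)) / ((1 - r2) / (n - k))

def r2_from_f(F, n, k):
    """R^2 = (k-1)F / [(k-1)F + (n-k)]"""
    return (k - 1) * F / ((k - 1) * F + (n - k))

# With the rounded R^2 = 0.9710, n = 20, k = 4 this gives about 178.6;
# the slides' 179.13 comes from the unrounded R^2 = 0.971088.
F = f_from_r2(0.9710, n=20, k=4)
print(F, r2_from_f(F, n=20, k=4))    # the second value recovers R^2 = 0.9710
```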
31. R² and the adjusted R² (R̄²)

R² = ESS/TSS = 1 - RSS/TSS = 1 - Σû²/Σy²,   with 0 ≤ R² ≤ 1

R̄² = 1 - σ̂²/S_Y²
    = 1 - [Σû²/(n-k)] / [Σy²/(n-1)]
    = 1 - (Σû²/Σy²) (n-1)/(n-k)
    = 1 - (1 - R²) (n-1)/(n-k)

k : number of independent variables, including the constant term
n : number of observations

R̄² ≤ R², and the adjusted R² can be negative.
32. Overall significance test (example)

H0 : β2 = β3 = β4 = 0
H1 : at least one coefficient is not zero (β2 ≠ 0, or β3 ≠ 0, or β4 ≠ 0)

F* = [R²/(k-1)] / [(1 - R²)/(n-k)] = [0.9710/3] / [(1 - 0.9710)/16] = 179.13

Fc(0.05, 4-1, 20-4) = 3.24

Since F* > Fc ==> reject H0.
33. Constructing the ANOVA table (Table 8.4, information from EViews)

F* = (MSS of regression) / (MSS of residual) = 5164.3903 / 28.8288 = 179.1339

Source of variation        SS                                         df         MSS
Due to regression (ESS)    R²(Σy²) = (0.971088)(28.97771)²×19         k-1 = 3    R²(Σy²)/(k-1) = 5164.3903
                           = 15493.171
Due to residuals (RSS)     (1-R²)(Σy²) or Σû²                         n-k = 16   (1-R²)(Σy²)/(n-k) = 28.8288
                           = (0.0289112)(28.97771)²×19 = 461.2621
Total (TSS)                Σy² = (28.97771)²×19 = 15954.446            n-1 = 19

Since (S_Y)² = Var(Y) = Σy²/(n-1)  ==>  (n-1)(S_Y)² = Σy²  (here S_Y = 28.97771).
38. 3. Testing the addition of a variable to the regression model

Old model:  Y = β1 + β2X2 + u1;  obtain R²_old, or RSS_old and/or ESS_old.

Now consider a new variable X3: is it relevant to add it or not?

New model:  Y = β1 + β2X2 + β3X3 + u2;  obtain R²_new, or RSS_new and ESS_new.

H0 : β3 = 0, adding X3 is not relevant
H1 : β3 ≠ 0, adding X3 is relevant
39. Steps for testing whether X3 makes an incremental contribution to the explanatory power of the new model

1. Compute the F-statistic (here J = 1 added variable):

   F* = [(RSS_old - RSS_new) / (number of additional variables)] / [RSS_new / (n - number of parameters in the new model)]
      = [(RSS_old - RSS_new) / 1] / [RSS_new / (n-3)]

2. Compare F* with Fc(α, 1, n-3).

3. Decision rule: if F* > Fc ==> reject H0 : β3 = 0, which means X3 is a relevant variable to add to the model.

F* can also be calculated from R²:

   F = [(R²_new - R²_old) / (number of new regressors added or dropped)] / [(1 - R²_new) / (n-k of the new model)]
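A minimal sketch of this incremental (nested-model) F test in the RSS form, assuming SciPy; the function and argument names are illustrative.

```python
from scipy import stats

def add_variable_f_test(rss_old, rss_new, n_added, n_obs, k_new, alpha=0.05):
    """F* = [(RSS_old - RSS_new)/J] / [RSS_new/(n - k_new)], J = # added variables."""
    F = ((rss_old - rss_new) / n_added) / (rss_new / (n_obs - k_new))
    F_crit = stats.f.ppf(1 - alpha, n_added, n_obs - k_new)
    return F, F_crit, F > F_crit    # True => reject H0: the added variable matters
```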
40. Old model: adding an irrelevant variable X2 (Studenmund, p. 166)

RSS_old = 160.5929,  R²_old = 0.986828
41. New model.  H0 : adding X3 (R) is not suitable, β3 = 0

F* = [(R²_new - R²_old) / (number of new regressors)] / [(1 - R²_new) / (n-k of the new model)]
   = [(0.9872 - 0.9868) / 1] / [(1 - 0.9872) / 39]
   = 1.218

Fc(0.05, 1, 39) = 4.17

Since F* < Fc ==> do not reject H0.
42. Adding a relevant variable X3 (Studenmund, p. 166)

Old model:  Y = β0 + β1X1 + β2X2 + u

Next, the new model:  Y = β0 + β1X1 + β2X2 + β3X3 + u'

H0 : adding X3 (the YD variable) is not suitable, β3 = 0
43. New model: adding a relevant variable X3 (YD) (Studenmund, p. 166)

F* = [(R²_new - R²_old) / df] / [(1 - R²_new) / df]
   = [(0.9868 - 0.9203) / 1] / [(1 - 0.9868) / (44 - 4)]
   = (0.0665 × 40) / 0.0132
   = 201.5

Fc(0.05, 1, 40) = 4.08.  Since F* > Fc ==> reject H0.
44. Adding variables: general discussion

Testing two additional variables Xl and Xm: are they relevant or not?

F-test:  H0 : βl = 0 and βm = 0
         H1 : βl ≠ 0 or βm ≠ 0

Old (restricted, βl = βm = 0):  Yi = β1 + β2X2 + β3X3 + ... + βkXk + ui
New (unrestricted):             Yi = β1 + β2X2 + β3X3 + ... + βkXk + βlXl + βmXm + ui

F* = [(R²_new - R²_old) / (number of added regressors)] / [(1 - R²_new) / (n - m)]

where m is the total number of regressors in the new model.  If F* > Fc ==> reject H0.
45. Dropping variables: general discussion

Testing whether two dropped variables Xl and Xm are relevant or not:

F-test:  H0 : βl = 0 and βm = 0
         H1 : βl ≠ 0 or βm ≠ 0

New (restricted, βl = βm = 0):  Yi = β1 + β2X2 + β3X3 + ... + βkXk + ui
Old (unrestricted):             Yi = β1 + β2X2 + β3X3 + ... + βkXk + βlXl + βmXm + ui

F* = [(R²_old - R²_new) / (number of dropped regressors)] / [(1 - R²_old) / (n - k)]

If F* > Fc ==> reject H0.
46. Justifying whether to keep (or drop) Xl and Xm in the model

1. Theory: check the signs of β̂l and β̂m. Do the variables theoretically explain the dependent variable?
2. Overall fit: does R̄² increase or not? Does the F*-statistic increase or not?
3. Check the t-statistics of Xl and Xm: are |t*l| and |t*m| > 1.96 (5% level of significance)?
4. Bias: check the t-statistics of the other variables, X2, ..., Xk. Have they changed significantly or not?
47. Restriction test: using R² instead of ESS or RSS in the F-test

F = [(R²_UR - R²_R) / J] / [(1 - R²_UR) / (n - k)]
  = [(R²_UR - R²_R) / (df_R - df_UR)] / [(1 - R²_UR) / df_UR]

J : number of restricted variables (restrictions)
k : number of parameters (including the intercept) in the unrestricted model

Note: we use the R² form of the restriction test only if the two regression models have the same form of the dependent variable.
48. Example 8.4: the demand for chicken (Gujarati (2003), p. 272)

Old model, or restricted model.
49. New model, or unrestricted model.

H0 : no joint effect of X4 and X5, i.e., β4 = β5 = 0
50. Wald test (likelihood ratio test)

(1) Adding variables:

H0 : no joint effect of X4 and X5, i.e., β4 = β5 = 0

F* = [(R²_new - R²_old) / (number added)] / [(1 - R²_new) / (n - k)]
   = [(0.9823 - 0.9801) / 2] / [(1 - 0.9823) / (23 - 5)]
   = 0.0011 / 0.000983
   = 1.119

Fc(0.05, 2, 18) = 3.55.  Since F* < Fc ==> do not reject H0.
51. (2) Dropping a variable (β5):

H0 : no effect of X5, i.e., β5 = 0

F* = [(R²_UR - R²_R) / (number of restrictions)] / [(1 - R²_UR) / (n - k)]
   = [(0.982313 - 0.981509) / 1] / [(1 - 0.982313) / (23 - 4)]
   = 0.864

F* < Fc(0.05, 1, 19) = 4.38 ==> do not reject H0.
(Equivalently, since t* < tc ==> do not reject H0.)
52. 4. Testing partial coefficients under some restrictions: restricted least squares

Cobb-Douglas:  Y = β1 X2^β2 X3^β3 e^u

Unrestricted model:  ln Y = β1 + β2 ln X2 + β3 ln X3 + u

Restriction (constant returns to scale):  β2 + β3 = 1,  i.e.,  β2 = 1 - β3  (or β3 = 1 - β2)

Substituting β2 = 1 - β3:

=> ln Y = β1 + (1 - β3) ln X2 + β3 ln X3 + u
=> ln Y = β1 + ln X2 + β3 (ln X3 - ln X2) + u
=> (ln Y - ln X2) = β1 + β3 (ln X3 - ln X2) + u
=> ln(Y/X2) = β1' + β3' ln(X3/X2) + u'     (restricted model)

i.e.,  Y* = β1' + β3' X* + u'
53. Or, substituting β3 = 1 - β2:

=> ln Y = β1 + β2 ln X2 + (1 - β2) ln X3 + u
=> ln Y = β1 + β2 ln X2 + ln X3 - β2 ln X3 + u
=> (ln Y - ln X3) = β1 + β2 (ln X2 - ln X3) + u
=> ln(Y/X3) = β1'' + β2'' ln(X2/X3) + u''     (restricted model)

i.e.,  Y** = β1'' + β2'' X** + u''
54. H0 : β2 + β3 = 1

F = [(RSS_R - RSS_UR) / m] / [RSS_UR / (n - k)]

m : number of restrictions in the restricted model
k : number of parameters in the unrestricted model

Unrestricted equation:  ln Y = β1 + β2 ln X2 + β3 ln X3 + u
Restricted equation:    ln(Y/X2) = β1' + β3' ln(X3/X2)

RSS_UR = 0.013604,  RSS_R = 0.016629

F* = 3.75,  Fc(0.05, 1, 17) = 4.45
55. H0 : β2 + β3 = 1

Unrestricted equation:  ln Y = β1 + β2 ln X2 + β3 ln X3 + u
Restricted equation:    ln(Y/X3) = β1'' + β2'' ln(X2/X3)

F = [(RSS_R - RSS_UR) / m] / [RSS_UR / (n - k)]

(m : number of restrictions in the restricted model; k : number of parameters in the unrestricted model)

RSS_UR = 0.013604,  RSS_R = 0.016629

F* = 3.75  ==>  do not reject H0
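A minimal sketch of the restricted least squares F test of H0: β2 + β3 = 1, assuming statsmodels; the Y, X2, X3 arrays passed in are illustrative placeholders.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

def crs_f_test(Y, X2, X3, alpha=0.05):
    """F = [(RSS_R - RSS_UR)/m] / [RSS_UR/(n-k)] for H0: beta2 + beta3 = 1."""
    n, k, m = len(Y), 3, 1
    lnY, lnX2, lnX3 = np.log(Y), np.log(X2), np.log(X3)

    # Unrestricted: lnY = b1 + b2 lnX2 + b3 lnX3 + u
    ur = sm.OLS(lnY, sm.add_constant(np.column_stack([lnX2, lnX3]))).fit()
    # Restricted (beta2 = 1 - beta3): ln(Y/X2) = b1' + b3' ln(X3/X2) + u'
    r = sm.OLS(lnY - lnX2, sm.add_constant(lnX3 - lnX2)).fit()

    F = ((r.ssr - ur.ssr) / m) / (ur.ssr / (n - k))
    return F, stats.f.ppf(1 - alpha, m, n - k)
```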
56. Testing a restriction on parameters: the t-test approach

Restriction: β2 = β3  (i.e., β2 - β3 = λ = 0, or β2 = λ + β3 with λ = 0)

Unrestricted model:  Y = β1 + β2X2 + β3X3 + u

Next, rewrite the equation as:

=> Y = β1 + (λ + β3)X2 + β3X3 + u'
=> Y = β1 + λX2 + β3X2 + β3X3 + u'
=> Y = β1 + λX2 + β3(X2 + X3) + u'

Restricted model:  Y = β1 + λX2 + β3 X3* + u',  where X3* = X2 + X3

Simply use the t-value to test whether λ is zero or not, or use an F-test.

Alternatively, compute

t = (β̂2 - β̂3) / se(β̂2 - β̂3),   where  se(β̂2 - β̂3) = sqrt[ var(β̂2) + var(β̂3) - 2 cov(β̂2, β̂3) ],

compare it with tc, and follow the usual t-test decision rule.
57. Example: unrestricted model vs. restricted model

t-test:  H0 : λ = 0,  H1 : λ ≠ 0;  tc(0.05, 931) = 1.96

F-test:  H0 : β3 - β4 = 0

F* = [(R²_UR - R²_R) / m] / [(1 - R²_UR) / (n - k)] = 0

Fc(0.05, 4, 931) = 2.37

==> do not reject H0
58. 5. Test for functional form: the MWD test (MacKinnon, White, Davidson)

Optional reading.
59. MWD test for the functional form (MacKinnon, White, Davidson; Gujarati (2003), pp. 280)

H0 : linear model;  H1 : log-linear (log-log) model

1. Run OLS on the linear model and obtain Ŷ:
   Ŷ = β̂1 + β̂2X2 + β̂3X3
2. Run OLS on the log-log model and obtain the fitted (ln Y)-hat:
   (ln Y)-hat = β̂1 + β̂2 ln X2 + β̂3 ln X3
3. Compute Z1 = ln(Ŷ) - (ln Y)-hat
4. Run OLS on the linear model adding Z1:
   Y = β̂1' + β̂2'X2 + β̂3'X3 + β̂4'Z1
   and check the t-statistic of β̂4'.
   If |t*4| > tc ==> reject H0 : linear model.
   If |t*4| < tc ==> do not reject H0 : linear model.
60. MWD test for the functional form (cont.)

5. Compute Z2 = antilog((ln Y)-hat) - Ŷ
6. Run OLS on the log-log model adding Z2:
   ln Y = β̂1' + β̂2' ln X2 + β̂3' ln X3 + β̂4' Z2
   and check the t-statistic of β̂4'.
   If |t*4| > tc ==> reject H0 : log-log model.
   If |t*4| < tc ==> do not reject H0 : log-log model.
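A minimal sketch of the six MWD steps above, assuming statsmodels; the data arrays passed in are illustrative placeholders.

```python
import numpy as np
import statsmodels.api as sm

def mwd_test(Y, X2, X3):
    """MWD test: H0 linear vs H1 log-log (and vice versa). Returns the two t-stats."""
    Xlin = sm.add_constant(np.column_stack([X2, X3]))
    Xlog = sm.add_constant(np.column_stack([np.log(X2), np.log(X3)]))

    lin = sm.OLS(Y, Xlin).fit()              # step 1: linear fit, Y-hat
    log = sm.OLS(np.log(Y), Xlog).fit()      # step 2: log-log fit, (lnY)-hat

    Z1 = np.log(lin.fittedvalues) - log.fittedvalues                 # step 3
    t1 = sm.OLS(Y, np.column_stack([Xlin, Z1])).fit().tvalues[-1]    # step 4

    Z2 = np.exp(log.fittedvalues) - lin.fittedvalues                 # step 5
    t2 = sm.OLS(np.log(Y), np.column_stack([Xlog, Z2])).fit().tvalues[-1]  # step 6
    return t1, t2   # |t1| large -> reject linear; |t2| large -> reject log-log
```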
61. MWD test: testing the functional form of the regression (Example 8.5)

Step 1: run the linear model and obtain

CV1 = σ̂ / Ȳ = 1583.279 / 24735.33 = 0.064
63. Step 4:  H0 : the true model is linear

tc(0.05, 11) = 1.796;  tc(0.10, 11) = 1.363

t* < tc at the 5% level ==> do not reject H0
t* > tc at the 10% level ==> reject H0
tc
0.025, 11 = 2.201
tc
0.05, 11 = 1.796
tc
0.10, 11 = 1.363
Since t* < tc
=> not reject H0 Comparing the C.V. =
C.V.1
C.V.2
=
0.064
0.0074
Step 6:
H0 : true model is
log-log model
65. Alternative criterion for comparing two different functional models: the coefficient of variation

C.V. = σ̂ / Ȳ

It measures the average error of the sample regression function relative to the mean of Y. Linear, log-linear, and log-log equations can then be meaningfully compared: the smaller the C.V. of a model, the more preferred that equation (functional form).
66. Comparing two different functional-form models

Coefficient of variation (C.V.):

(σ̂/Ȳ of model 1) / (σ̂/Ȳ of model 2) = (2.1225/89.612) / (0.0217/4.4891) = 0.0236 / 0.0048 = 4.916

Model 1: linear model;  model 2: log-log model.  The ratio of 4.916 means that model 2 is better.
67. 6. The Chow test: structural stability test

H0 : no structural change
H1 : there is a structural change

Procedure:
1. Divide the sample of N observations into two groups:
   - group 1 consisting of the first n1 observations,
   - group 2 consisting of the remaining n2 = N - n1 observations.
68.
2. Run OLS on the two sub-sample groups separately and obtain RSS1 and RSS2.
3. Run OLS on the whole sample (N) and obtain the restricted RSS_R.
4. Compute F* = [(RSS_R - RSS1 - RSS2) / k] / [(RSS1 + RSS2) / (N - 2k)]
5. Compare F* with Fc(α, k, N-2k).
   If F* > Fc ==> reject H0; it means that there is a structural change in the sample.
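A minimal sketch of the Chow test for a model with one regressor plus an intercept (k = 2), assuming NumPy/SciPy; `split` marks the first n1 observations and the arrays are illustrative placeholders.

```python
import numpy as np
from scipy import stats

def _rss(Y, X):
    Z = np.column_stack([np.ones(len(Y)), X])
    b, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    u = Y - Z @ b
    return u @ u

def chow_test(Y, X, split, alpha=0.05):
    """F* = [(RSS_R - RSS1 - RSS2)/k] / [(RSS1 + RSS2)/(N - 2k)]."""
    N, k = len(Y), 2                       # k parameters: intercept + one regressor
    rss_r = _rss(Y, X)                     # pooled (restricted) regression
    rss_1 = _rss(Y[:split], X[:split])     # first n1 observations
    rss_2 = _rss(Y[split:], X[split:])     # remaining n2 observations
    F = ((rss_r - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (N - 2 * k))
    return F, stats.f.ppf(1 - alpha, k, N - 2 * k)
```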
74. Empirical results:

Dep. variable   Constant         Indep. var. X    R²       SEE     RSS       n
Y (70-95)       624226 (4.89)    0.0376 (8.89)    0.7672   31.12   23248.3   26
Y (70-81)       1.0161 (0.08)    0.0803 (9.60)    0.9021   13.36   1785.03   12
Y (82-95)       153.494 (4.69)   0.0148 (1.77)    0.2071   28.87   10005.2   14

F* = [(RSS_R - RSS1 - RSS2) / k] / [(RSS1 + RSS2) / (N - 2k)]
   = [(23248.3 - 1785.03 - 10005.2) / 2] / [(1785.03 + 10005.2) / 22]
   = 10.69

Fc(0.01) = 5.72;  Fc(0.05) = 3.44;  Fc(0.10) = 2.56

Conclusion: F* > Fc ==> reject H0.
75. Prediction in multiple regression

• The procedure is exactly the same as prediction in the two-variable model.
• Using matrix notation, the formula is given in Appendix C9 on page 861 of the book.
• We also have "mean prediction" and "individual prediction".
• The software will do it for you.
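As a complement to "the software will do it for you", here is a minimal sketch of obtaining mean and individual predictions with statsmodels; the data and the new observation are illustrative placeholders.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative data (placeholders)
Y  = np.array([10., 12., 15., 19., 22., 26., 31., 35.])
X2 = np.array([ 1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.])
X3 = np.array([ 2.,  1.,  4.,  3.,  6.,  5.,  8.,  7.])

X   = sm.add_constant(np.column_stack([X2, X3]))
res = sm.OLS(Y, X).fit()

# Prediction at a hypothetical new point (X2 = 9, X3 = 8)
new_X = np.array([[1.0, 9.0, 8.0]])
pred  = res.get_prediction(new_X)
frame = pred.summary_frame(alpha=0.05)

print(frame[["mean", "mean_ci_lower", "mean_ci_upper"]])   # mean prediction interval
print(frame[["obs_ci_lower", "obs_ci_upper"]])             # individual prediction interval
```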