IOSR Journal of Mathematics (IOSR-JM)
e-ISSN: 2278-5728, p-ISSN: 2319-765X. Volume 11, Issue 1 Ver. III (Jan - Feb. 2015), PP 45-51
www.iosrjournals.org
DOI: 10.9790/5728-11134551 www.iosrjournals.org 45 |Page
A Bootstrap Approach to Error-Reduction of Nonlinear
Regression Parameters Estimation
Obiora-Ilouno H. O.¹ and Mbegbu J. I.²
¹Department of Statistics, Nnamdi Azikiwe University, Awka, Nigeria
²Department of Mathematics, University of Benin, Benin City, Edo State, Nigeria
Abstract: In this paper, we propose a bootstrap algorithm for estimating the parameters of a nonlinear regression model by incorporating the Gauss-Newton method. Bootstrapping is used to provide estimates of exponential regression coefficients. The computational difficulties that would have been encountered in applying the proposed method were resolved by developing a computer program in R to implement the algorithm. From the results obtained, the bootstrap algorithm yielded a smaller error sum of squares (SS) than the analytical method; with these results, we have greater confidence in the estimates obtained.
Keywords: bootstrap, algorithm, exponential, regression, parameters, non-linear, Gauss-Newton.
I. Introduction
Bootstrap resampling is a computer-based method for constructing inferential procedures in statistical data analysis. The bootstrap approach, introduced by Efron (1979), was used initially for the estimation of certain statistics, such as the mean and the variability of estimated characteristics of the probability distribution of a set of observations, by providing confidence intervals for the population parameters (Efron, 2003; Efron and Tibshirani, 1998). It is a technique in which samples of size n are drawn randomly with replacement from the original sample, each observation having a 1/n chance of being selected. In essence, the population is analyzed by replacing the unknown distribution function F with the empirical distribution function F̂ obtained from the sample (see Davison and Hinkley, 1997, 2003; Efron, 2003; Bickel and Freedman, 2007; Hall, 2003; Casella, 2003).
Exponential regression is a nonlinear regression model often used to measure the growth of a variable, such as population or GDP. An exponential relationship between X and Y exists whenever the dependent variable Y changes by a constant percentage as the variable X changes.
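The resampling scheme just described (each of the n observations drawn with replacement, with probability 1/n) can be sketched in a few lines. This is an illustration only; the data and the statistic (the sample mean) are made up and are not from the paper:

```python
import random

def bootstrap_means(sample, B, seed=0):
    """Draw B bootstrap resamples (with replacement, each point having
    probability 1/n of being picked) and return the mean of each resample."""
    rng = random.Random(seed)
    n = len(sample)
    means = []
    for _ in range(B):
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        means.append(sum(resample) / n)
    return means

data = [2.1, 3.4, 1.9, 4.0, 2.8]        # illustrative sample
means = bootstrap_means(data, B=1000)
# The spread of `means` approximates the sampling variability of the mean.
```

The empirical distribution of these resampled statistics is what plays the role of F̂ in the discussion above.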
II. Materials And Method
Let
$$Y = f(X_1, X_2, \ldots, X_k;\ \theta_1, \theta_2, \ldots, \theta_J) + \varepsilon \qquad (2.1)$$
be a nonlinear regression model, where the $\theta$'s are the parameters, the $X$'s are the predictor variables, and the error terms $\varepsilon \sim N(0, \sigma^2)$ are independently and identically distributed and uncorrelated.
Equation (2.1) is assumed to be intrinsically nonlinear. Suppose $n$ is the number of observations on $Y$ and the $X$'s; then
$$Y_i = f(X_{i1}, X_{i2}, \ldots, X_{ik};\ \theta_1, \theta_2, \ldots, \theta_J) + \varepsilon_i, \qquad i = 1, 2, \ldots, n \qquad (2.2)$$
The $n$ equations can be written compactly in matrix notation as
$$Y = f(X, \theta) + \varepsilon \qquad (2.3)$$
where
$$Y = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix}, \quad X = \begin{pmatrix} X_{11} & X_{12} & \cdots & X_{1k} \\ X_{21} & X_{22} & \cdots & X_{2k} \\ \vdots & \vdots & & \vdots \\ X_{n1} & X_{n2} & \cdots & X_{nk} \end{pmatrix}, \quad \theta = \begin{pmatrix} \theta_1 \\ \theta_2 \\ \vdots \\ \theta_J \end{pmatrix}, \quad \varepsilon = \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix},$$
and $E(\varepsilon) = 0$.
The error sum of squares for the nonlinear model is defined as
$$S(\theta) = \sum_{i=1}^{n} \left[ Y_i - f(X_i, \theta) \right]^2 \qquad (2.4)$$
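As a concrete check of (2.4), the error sum of squares for the exponential model fitted later in the paper, $f(X, \theta) = \theta_1 e^{\theta_2 X}$, can be evaluated directly. The data values below are made up purely for illustration:

```python
import math

def sum_of_squares(y, x, theta):
    """S(theta) = sum_i [Y_i - f(X_i, theta)]^2
    for the exponential model f(X, theta) = theta1 * exp(theta2 * X)."""
    t1, t2 = theta
    return sum((yi - t1 * math.exp(t2 * xi)) ** 2 for yi, xi in zip(y, x))

x = [0.0, 1.0, 2.0]
y = [1.0, 0.5, 0.25]                        # roughly 1 * exp(-0.69 x)
s_good = sum_of_squares(y, x, (1.0, -0.7))  # near the generating parameters
s_bad = sum_of_squares(y, x, (1.0, -0.1))   # poor fit, much larger S(theta)
```

Parameter values closer to the truth give a smaller $S(\theta)$, which is the quantity the least squares procedure below minimizes.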
Denote the least squares estimate of $\theta$ by $\hat{\theta}$; it minimizes $S(\theta)$. The least squares estimates of $\theta$ are obtained by differentiating (2.4) with respect to $\theta$, equating to zero, and solving for $\theta$, which results in $J$ normal equations:
$$\sum_{i=1}^{n} \left[ Y_i - f(X_i, \hat{\theta}) \right] \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \hat{\theta}} = 0, \qquad p = 1, 2, \ldots, J \qquad (2.5)$$
In estimating the parameters of the nonlinear regression model, we use the Gauss-Newton method, based on a Taylor series, to approximate equation (2.3). Consider the function $f(X_i, \theta)$, which is the deterministic component of
$$Y_i = f(X_i, \theta) + \varepsilon_i, \qquad i = 1, 2, \ldots, n \qquad (2.6)$$
Let $\theta^0$ be the initial approximate value of $\theta$. Adopting a Taylor series expansion of $f(X_i, \theta)$ about $\theta^0$, we have the linear approximation
$$f(X_i, \theta) \approx f(X_i, \theta^0) + \sum_{p=1}^{J} \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^0} \left( \theta_p - \theta_p^0 \right) \qquad (2.7)$$
Substituting expression (2.7) into (2.6), we obtain
$$Y_i = f(X_i, \theta^0) + \sum_{p=1}^{J} \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^0} \left( \theta_p - \theta_p^0 \right) + \varepsilon_i, \qquad i = 1, \ldots, n \qquad (2.8)$$
Equation (2.8) may be viewed as a linear approximation in a neighborhood of the starting value $\theta^0$. Let
$$f_i^0 = f(X_i, \theta^0), \qquad \beta_p^0 = \theta_p - \theta_p^0, \qquad Z_{pi}^0 = \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^0}, \qquad i = 1, \ldots, n;\ p = 1, 2, \ldots, J$$
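For the exponential model $f(X, \theta) = \theta_1 e^{\theta_2 X}$ fitted later in the paper, the entries $Z_{pi}^0$ have closed forms: $\partial f/\partial \theta_1 = e^{\theta_2 X_i}$ and $\partial f/\partial \theta_2 = \theta_1 X_i e^{\theta_2 X_i}$. A small Python sketch of this derivative matrix (mirroring, under the same model assumption, what the F() function in the R program below computes):

```python
import math

def jacobian(x, theta):
    """n x 2 matrix Z whose i-th row is (df/dtheta1, df/dtheta2)
    evaluated at theta, for f(X, theta) = theta1 * exp(theta2 * X)."""
    t1, t2 = theta
    return [[math.exp(t2 * xi), t1 * xi * math.exp(t2 * xi)] for xi in x]

Z = jacobian([0.0, 1.0, 2.0], (0.5, -0.4))
# First column: exp(-0.4 * x); second column: 0.5 * x * exp(-0.4 * x).
```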
Hence, equation (2.8) becomes
$$Y_i = f_i^0 + \sum_{p=1}^{J} Z_{pi}^0 \beta_p^0 + \varepsilon_i, \qquad i = 1, 2, \ldots, n \qquad (2.9)$$
that is,
$$Y_i - f_i^0 = \sum_{p=1}^{J} Z_{pi}^0 \beta_p^0 + \varepsilon_i, \qquad i = 1, 2, \ldots, n \qquad (2.10)$$
In matrix form, we have
$$\begin{pmatrix} Y_1 - f_1^0 \\ Y_2 - f_2^0 \\ \vdots \\ Y_n - f_n^0 \end{pmatrix} = \begin{pmatrix} Z_{11}^0 & Z_{12}^0 & \cdots & Z_{1J}^0 \\ Z_{21}^0 & Z_{22}^0 & \cdots & Z_{2J}^0 \\ \vdots & \vdots & & \vdots \\ Z_{n1}^0 & Z_{n2}^0 & \cdots & Z_{nJ}^0 \end{pmatrix} \begin{pmatrix} \beta_1^0 \\ \beta_2^0 \\ \vdots \\ \beta_J^0 \end{pmatrix} + \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix} \qquad (2.11)$$
Compactly, equation (2.11) becomes
$$Y - f^0 = Z^0 \beta^0 + \varepsilon \qquad (2.12)$$
where
$$Y - f^0 = \left( Y_1 - f_1^0,\ Y_2 - f_2^0,\ \ldots,\ Y_n - f_n^0 \right)', \qquad \beta^0 = \left( \beta_1^0,\ \beta_2^0,\ \ldots,\ \beta_J^0 \right)', \qquad Z^0 = \begin{pmatrix} Z_{11}^0 & \cdots & Z_{1J}^0 \\ \vdots & & \vdots \\ Z_{n1}^0 & \cdots & Z_{nJ}^0 \end{pmatrix}$$
We obtain the error sum of squares $SS(\beta^0)$:
$$SS(\beta^0) = \left( Y - f^0 - Z^0 \beta^0 \right)' \left( Y - f^0 - Z^0 \beta^0 \right)$$
Differentiating with respect to $\beta^0$ and setting the result to zero,
$$\frac{\partial SS}{\partial \beta^0} = -2 Z^{0\prime} \left( Y - f^0 \right) + 2 Z^{0\prime} Z^0 \hat{\beta}^0 = 0 \qquad (2.13)$$
Hence,
$$Z^{0\prime} Z^0 \hat{\beta}^0 = Z^{0\prime} \left( Y - f^0 \right) \qquad (2.14)$$
Therefore, the least squares estimate of $\beta^0$ is
$$\hat{\beta}^0 = \left( Z^{0\prime} Z^0 \right)^{-1} Z^{0\prime} \left( Y - f^0 \right) \qquad (2.15)$$
Thus, $\hat{\beta}^0 = \left( \hat{\beta}_1^0, \hat{\beta}_2^0, \ldots, \hat{\beta}_J^0 \right)'$ minimizes the error sum of squares
$$S^* = \sum_{i=1}^{n} \left[ Y_i - f_i^0 - \sum_{p=1}^{J} Z_{pi}^0 \hat{\beta}_p^0 \right]^2 \qquad (2.16)$$
Now, the estimates of the parameters $\theta_p$ of the nonlinear regression model (2.1) are
$$\theta_p^1 = \theta_p^0 + \hat{\beta}_p^0, \qquad p = 1, 2, \ldots, J \qquad (2.17)$$
Iteratively, equation (2.17) gives
$$\theta^1 = \theta^0 + \hat{\beta}^0, \qquad \ldots, \qquad \theta^{r+1} = \theta^r + \hat{\beta}^r$$
Thus
$$\theta^{r+1} = \theta^r + \left( Z^{r\prime} Z^r \right)^{-1} Z^{r\prime} \left( Y - f^r \right) \qquad (2.18)$$
where $\hat{\beta}^r = \left( Z^{r\prime} Z^r \right)^{-1} Z^{r\prime} \left( Y - f^r \right)$ are the least squares estimates of $\beta$ obtained at the $(r+1)$th iteration. The iterative process continues until
$$\left| \frac{\theta^{r+1} - \theta^r}{\theta^r} \right| < \delta$$
where $\delta = 10^{-5}$ is the error tolerance (see Smith and Draper, 1998; Nduka, 1999).
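The iteration (2.18) together with this stopping rule can be sketched compactly for the exponential model $f(X, \theta) = \theta_1 e^{\theta_2 X}$. This Python sketch is an independent illustration on synthetic, noise-free data, not the paper's R program; the data and starting values are made up:

```python
import numpy as np

def gauss_newton(y, x, theta0, tol=1e-5, max_iter=100):
    """Fit f(X, theta) = theta1 * exp(theta2 * X) by Gauss-Newton.
    Iterates theta^(r+1) = theta^r + (Z'Z)^{-1} Z'(y - f) until the
    relative change in each component of theta falls below tol."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    theta = np.asarray(theta0, float)
    for _ in range(max_iter):
        f = theta[0] * np.exp(theta[1] * x)
        Z = np.column_stack([np.exp(theta[1] * x),
                             theta[0] * x * np.exp(theta[1] * x)])
        beta = np.linalg.solve(Z.T @ Z, Z.T @ (y - f))  # increment (2.18)
        theta = theta + beta
        if np.all(np.abs(beta) <= tol * np.abs(theta)):  # stopping rule
            break
    return theta

# Recover known parameters from noise-free data generated by the model.
x = np.linspace(0.5, 60, 12)
y = 0.5 * np.exp(-0.006 * x)
theta_hat = gauss_newton(y, x, theta0=(0.5, -0.002))
```

On noise-free data the iteration converges to the generating parameters in a few steps; with real data it converges to the least squares estimates.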
 *
S is evaluated after each iteration to see if a reduction in its value has actually been achieved. At the end of
the  th
r 1 iteration, we have
    









n
i
r
p
p
j
r
pi
r
ii
r
ZfYS
1
2
1
* ˆ (2.19)
and the iteration is stopped once convergence is achieved. The final estimates of the parameters at the end of the $(r+1)$th iteration are
$$\hat{\theta} = \left( \theta_1^{r+1}, \theta_2^{r+1}, \ldots, \theta_J^{r+1} \right) \qquad (2.20)$$
The Bootstrap Algorithm Based on Resampling Observations for the Estimation of Nonlinear Regression Parameters
Let $W_i = \left( Y_i, Z_{ij} \right)$ be the original sample of size $n$ for the resampling; the $W_i$'s are drawn independently and identically from a distribution $F$. Here $Y = \left( y_1, y_2, \ldots, y_n \right)'$ is the column vector of the response variables and $Z_{ij} = \left( z_{1j}, z_{2j}, \ldots, z_{nj} \right)$ is the $n \times k$ matrix of the predictor variables, where $j = 1, 2, \ldots, k$ and $i = 1, 2, \ldots, n$. Let the $(k+1) \times 1$ vector $W_i = \left( Y_i, Z_{ij} \right)$ denote the values associated with the $i$th observation, giving the observation sets $\left( w_1, w_2, \ldots, w_n \right)$.
Step 1: Draw a bootstrap sample $\left( w_1^{(b)}, w_2^{(b)}, \ldots, w_n^{(b)} \right)$ of size $n$ with replacement from the observations, giving each $W_i$ value a $1/n$ probability of being sampled, and label the elements of each vector $W_i^{(b)} = \left( Y_i^{(b)}, X_{ij}^{(b)} \right)$, where $j = 1, 2, \ldots, k$ and $i = 1, 2, \ldots, n$. From these, form the vector of the response variables $Y^{(b)} = \left( y_1^{(b)}, y_2^{(b)}, \ldots, y_n^{(b)} \right)$ and the matrix of the predictor variables $Z^{(b)} = \left( z_{1j}^{(b)}, z_{2j}^{(b)}, \ldots, z_{nj}^{(b)} \right)$.
Step 2: Calculate the least squares estimates of the nonlinear regression coefficients from the bootstrap sample:
$$\hat{\beta}^0 = \left( Z'Z \right)^{-1} Z' \left( Y - f \right)$$
Step 3: Compute $\hat{\theta}^1 = \hat{\theta}^0 + \hat{\beta}^0$ using the Gauss-Newton method; the $\hat{\theta}^1$ value is treated as the initial value in the first approximated linear model.
Step 4: Return to Step 2 and again compute the $\hat{\beta}$'s. At each iteration, the new $\hat{\beta}$'s represent increments that are added to the estimates from the previous iteration according to Step 3, eventually giving $\hat{\theta}^2 = \hat{\theta}^1 + \hat{\beta}^1$, and so on up to $\hat{\theta}^{r+1} = \hat{\theta}^r + \hat{\beta}^r$.
Step 5 (stopping rule): The iteration process continues until $\left| \left( \theta^{r+1} - \theta^r \right) / \theta^r \right| < \delta$, where $\delta = 10^{-5}$, for the values of $\theta_0, \theta_1, \ldots, \theta_p$ from the first bootstrap sample, giving $\hat{\theta}^{(b_1)}$.
Step 6: Repeat Steps 1 to 5 for $b = 1, 2, \ldots, B$, where $B$ is the number of repetitions.
Step 7: Obtain the probability distribution $F\left( \hat{\theta}^{(b)} \right)$ of the bootstrap estimates $\hat{\theta}^{(b_1)}, \hat{\theta}^{(b_2)}, \ldots, \hat{\theta}^{(b_B)}$ and use $F\left( \hat{\theta}^{(b)} \right)$ to estimate the regression coefficients and variances. The bootstrap estimate of a regression coefficient is the mean of the distribution $F\left( \hat{\theta}^{(b)} \right)$ (see Sahinler and Topuz, 2007; Obiora-Ilouno and Mbegbu, 2012):
$$\hat{\theta}^{(b)} = \frac{1}{B} \sum_{r=1}^{B} \hat{\theta}^{(b_r)} \qquad (2.21)$$
The bootstrap standard deviation from the $F\left( \hat{\theta}^{(b)} \right)$ distribution is
$$\hat{\sigma}\left( \hat{\theta}^{(b)} \right) = \left[ \frac{ \sum_{r=1}^{B} \left( \hat{\theta}^{(b_r)} - \hat{\theta}^{(b)} \right)^2 }{B - 1} \right]^{1/2}, \qquad r = 1, 2, \ldots, B \qquad (2.22)$$
The Computer Program In R For Bootstrapping Non-Linear Regression:
#x is the vector of the independent variable
#theta is the vector of parameters of the model
#This function calculates the matrix of partial derivatives (the Jacobian)
#for the model y = theta[1]*exp(theta[2]*x)
F=function(x,theta)
{
output=matrix(0,ncol=2,nrow=length(x))
for(i in 1:length(x)) output[i,]=c(exp(theta[2]*x[i]),theta[1]*x[i]*exp(theta[2]*x[i]))
output
}
#This function calculates the regression coefficients using the Gauss-Newton method;
#it returns the estimates followed by the final error sum of squares
gaussnewton=function(y,x,initial,tol)
{
theta=initial
count=0
eps=y-(theta[1]*exp(theta[2]*x))
SS=sum(eps^2)
diff=1
while(tol<diff)
{
S=SS
ff=F(x,theta)
theta=c(theta+solve(t(ff)%*%ff)%*%t(ff)%*%eps)  #Gauss-Newton update (2.18)
eps=y-(theta[1]*exp(theta[2]*x))
SS=sum(eps^2)
diff=abs(SS-S)  #change in SS at each iteration
count=count+1
if(count==100) break  #guard against non-convergence
pp=c(theta,SS)
}
pp
}
#This part of the code does the bootstrap (resampling rows of data with replacement)
boot=function(data,p,b,initial)
{
n=length(data[,1])
z=matrix(0,ncol=p,nrow=n)
output=matrix(0,ncol=p+1,nrow=b)
for (i in 1:b)
{
u=sample(n,n,replace=TRUE)  #indices of the bootstrap sample
for (j in 1:n) z[j,]=data[u[j],]
y=z[,1]
x=z[,2:p]
coef=gaussnewton(y,x,initial,0.00001)
output[i,]=c(coef)
}
output
}
A Bootstrap Approach To Error-Reduction Of Nonlinear Regression Parameters Estimation
DOI: 10.9790/5728-11134551 www.iosrjournals.org 50 |Page
#Then, to run the code, supply the data and starting values as follows
y <- c(data)        #response values
x <- c(data)        #predictor values
data=cbind(y,x)
initial=c(initial)  #starting values for theta
expo=boot(data,p,B,initial)
#Run the following to view the bootstrap results
theta_0=mean(expo[,1])
theta_0
theta_1=mean(expo[,2])
theta_1
SSE=mean(expo[,3])
SSE
Problem (Gujarati and Porter, 2009)
The data below relate to the management fees that a leading mutual fund pays to its investment advisors to manage its assets. The fees paid depend on the net asset value of the fund. Develop a regression model for the management fees to the advisors.

Fee %  0.520  0.508  0.484  0.46  0.4398  0.4238  0.4115  0.402  0.3944  0.388  0.3825  0.3738
Asset  0.5    5.0    10.0   15    20      25      30      35     40      45     55      60
III. Results And Discussion
From the data, the higher the net asset value of the fund, the lower the advisory fees (see Gujarati and Porter, 2009).
The Analytical Result of R Program (Without Bootstrapping)
y <- c(0.520,0.508,0.484,0.46,0.4398,0.4238,0.4115,0.402,0.3944,0.388,0.3825,0.3738)
x <- c(0.5,5,10,15,20,25,30,35,40,45,55,60)
run=gaussnewton(y,x,c(0.5028,-0.002),0.00001)
[1] 0.505805756 -0.005487479 0.001934099
[1] 0.508724700 -0.005949598 0.001699222
[1] 0.508897316 -0.005964811 0.001699048
The estimated model from the result is
$$\hat{Y} = 0.50890\, e^{-0.00596 X}$$
where Y and X are the fee and the assets, respectively.
The Result of R Program Using Bootstrapping Algorithm
> # run the code use the following for the bootstrap algorithm
> y <- c(0.520,0.508,0.484,0.46,0.4398,0.4238,0.4115,0.402,0.3944,0.388,0.3825,0.3738)
[Figure: Relationship of advisory fees to fund assets (fee % plotted against Asset)]
> x <- c(0.5,5,10,15,20,25,30,35,40,45,55,60)
> data=cbind(y,x)
> initial=c(0.5028,-0.002)
> expo=boot(data,2,1000,initial)
>
> #Run the following to view the bootstrap results
> theta_0=mean(expo[,1])
> theta_0
[1] 0.5075696
> theta_1=mean(expo[,2])
> theta_1
[1] -0.005964939
> SSE=mean(expo[,3])
> SSE
[1] 0.001328289
IV. Discussion
$\hat{\theta}_0 = 0.50889$ and $\hat{\theta}_1 = -0.00596$ are the revised parameter estimates at the end of the last iteration. The least squares criterion measure SS for the starting values was reduced in the first iteration and further reduced in the second and third iterations. The third iteration led to no change in either the estimates of the coefficients or the least squares criterion SS; hence, convergence was achieved and the iteration ended.
The fitted regression function is $\hat{Y} = 0.50889 \exp(-0.00596 X)$.
The results of the analytical and the bootstrap computations are shown in Table 2.
Table 2: Analytical and Bootstrap Results
              θ0        θ1         SS
Analytical    0.50889   -0.00596   0.001699
Bootstrap     0.50757   -0.00596   0.001328
The fitted regression functions for the analytical and the bootstrap computations are $\hat{Y} = 0.50889 \exp(-0.00596 X)$ and $\hat{Y} = 0.50757 \exp(-0.00596 X)$, respectively.
V. Conclusion
From the results of our analysis, the bootstrap approach yields approximately the same inference as the analytical method. Moreover, the bootstrap algorithm yielded a smaller error sum of squares (SS) than the analytical method (see Table 2). With these results, we have greater confidence in the estimates obtained by the bootstrap approach.
References
[1]. Casella, G., 2003, Introduction to the Silver Anniversary of the Bootstrap, Statistical Science, Vol. 18, No. 2, pp. 133-134.
[2]. Efron, B., 1979, Bootstrap Methods: Another Look at the Jackknife, The Annals of Statistics, Vol. 7, 1-26.
[3]. Efron, B., 2003, Second Thoughts on the Bootstrap, Statistical Science, Vol. 18, No. 2, 135-140.
[4]. Efron, B. and Tibshirani, R. J., 1998, An Introduction to the Bootstrap, CRC Press, Boca Raton, Florida, 60-82.
[5]. Freedman, D. A., 1981, Bootstrapping Regression Models, The Annals of Statistics, Vol. 9, No. 2, 158-167.
[6]. Nduka, E. C., 1999, Principles of Applied Statistics 1: Regression and Correlation Analyses, Crystal Publishers, Imo State, Nigeria.
[7]. Obiora-Ilouno, H. O. and Mbegbu, J. I., 2012, Jackknife Algorithm for the Estimation of Logistic Regression Parameters, Journal of Natural Sciences Research, IISTE UK, Vol. 2, No. 4, 74-82.
[8]. Sahinler, S. and Topuz, D., 2007, Bootstrap and Jackknife Resampling Algorithms for Estimation of Regression Parameters, Journal of Applied Quantitative Methods, Vol. 2, No. 2, pp. 188-199.
[9]. Smith, H. and Draper, N. R., 1998, Applied Regression Analysis, Third Edition, Wiley-Interscience, New York.
[10]. Venables, W. N. and Smith, D. M., 2007, An Introduction to R: A Programming Environment for Data Analysis and Graphics, Version 2.6.1, pp. 1-100.
 
C011131925
C011131925C011131925
C011131925
IOSR Journals
 
B011130918
B011130918B011130918
B011130918
IOSR Journals
 
A011130108
A011130108A011130108
A011130108
IOSR Journals
 
I011125160
I011125160I011125160
I011125160
IOSR Journals
 
H011124050
H011124050H011124050
H011124050
IOSR Journals
 
G011123539
G011123539G011123539
G011123539
IOSR Journals
 
F011123134
F011123134F011123134
F011123134
IOSR Journals
 
E011122530
E011122530E011122530
E011122530
IOSR Journals
 
D011121524
D011121524D011121524
D011121524
IOSR Journals
 

More from IOSR Journals (20)

A011140104
A011140104A011140104
A011140104
 
M0111397100
M0111397100M0111397100
M0111397100
 
L011138596
L011138596L011138596
L011138596
 
K011138084
K011138084K011138084
K011138084
 
J011137479
J011137479J011137479
J011137479
 
I011136673
I011136673I011136673
I011136673
 
G011134454
G011134454G011134454
G011134454
 
H011135565
H011135565H011135565
H011135565
 
F011134043
F011134043F011134043
F011134043
 
E011133639
E011133639E011133639
E011133639
 
D011132635
D011132635D011132635
D011132635
 
C011131925
C011131925C011131925
C011131925
 
B011130918
B011130918B011130918
B011130918
 
A011130108
A011130108A011130108
A011130108
 
I011125160
I011125160I011125160
I011125160
 
H011124050
H011124050H011124050
H011124050
 
G011123539
G011123539G011123539
G011123539
 
F011123134
F011123134F011123134
F011123134
 
E011122530
E011122530E011122530
E011122530
 
D011121524
D011121524D011121524
D011121524
 

Recently uploaded

Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.
Aditi Bajpai
 
The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
Sérgio Sacani
 
SAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdfSAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdf
KrushnaDarade1
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
pablovgd
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
MAGOTI ERNEST
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
terusbelajar5
 
molar-distalization in orthodontics-seminar.pptx
molar-distalization in orthodontics-seminar.pptxmolar-distalization in orthodontics-seminar.pptx
molar-distalization in orthodontics-seminar.pptx
Anagha Prasad
 
Thornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdfThornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdf
European Sustainable Phosphorus Platform
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
TinyAnderson
 
Sharlene Leurig - Enabling Onsite Water Use with Net Zero Water
Sharlene Leurig - Enabling Onsite Water Use with Net Zero WaterSharlene Leurig - Enabling Onsite Water Use with Net Zero Water
Sharlene Leurig - Enabling Onsite Water Use with Net Zero Water
Texas Alliance of Groundwater Districts
 
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
yqqaatn0
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
muralinath2
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
by6843629
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
HongcNguyn6
 
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdfwaterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
LengamoLAppostilic
 
Cytokines and their role in immune regulation.pptx
Cytokines and their role in immune regulation.pptxCytokines and their role in immune regulation.pptx
Cytokines and their role in immune regulation.pptx
Hitesh Sikarwar
 
The debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically youngThe debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically young
Sérgio Sacani
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
RitabrataSarkar3
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
Sérgio Sacani
 
Basics of crystallography, crystal systems, classes and different forms
Basics of crystallography, crystal systems, classes and different formsBasics of crystallography, crystal systems, classes and different forms
Basics of crystallography, crystal systems, classes and different forms
MaheshaNanjegowda
 

Recently uploaded (20)

Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.Micronuclei test.M.sc.zoology.fisheries.
Micronuclei test.M.sc.zoology.fisheries.
 
The binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defectsThe binding of cosmological structures by massless topological defects
The binding of cosmological structures by massless topological defects
 
SAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdfSAR of Medicinal Chemistry 1st by dk.pdf
SAR of Medicinal Chemistry 1st by dk.pdf
 
NuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyerNuGOweek 2024 Ghent programme overview flyer
NuGOweek 2024 Ghent programme overview flyer
 
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptxThe use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
The use of Nauplii and metanauplii artemia in aquaculture (brine shrimp).pptx
 
Medical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptxMedical Orthopedic PowerPoint Templates.pptx
Medical Orthopedic PowerPoint Templates.pptx
 
molar-distalization in orthodontics-seminar.pptx
molar-distalization in orthodontics-seminar.pptxmolar-distalization in orthodontics-seminar.pptx
molar-distalization in orthodontics-seminar.pptx
 
Thornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdfThornton ESPP slides UK WW Network 4_6_24.pdf
Thornton ESPP slides UK WW Network 4_6_24.pdf
 
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdfTopic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
Topic: SICKLE CELL DISEASE IN CHILDREN-3.pdf
 
Sharlene Leurig - Enabling Onsite Water Use with Net Zero Water
Sharlene Leurig - Enabling Onsite Water Use with Net Zero WaterSharlene Leurig - Enabling Onsite Water Use with Net Zero Water
Sharlene Leurig - Enabling Onsite Water Use with Net Zero Water
 
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
如何办理(uvic毕业证书)维多利亚大学毕业证本科学位证书原版一模一样
 
Oedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptxOedema_types_causes_pathophysiology.pptx
Oedema_types_causes_pathophysiology.pptx
 
8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf8.Isolation of pure cultures and preservation of cultures.pdf
8.Isolation of pure cultures and preservation of cultures.pdf
 
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốtmô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
mô tả các thí nghiệm về đánh giá tác động dòng khí hóa sau đốt
 
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdfwaterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
waterlessdyeingtechnolgyusing carbon dioxide chemicalspdf
 
Cytokines and their role in immune regulation.pptx
Cytokines and their role in immune regulation.pptxCytokines and their role in immune regulation.pptx
Cytokines and their role in immune regulation.pptx
 
The debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically youngThe debris of the ‘last major merger’ is dynamically young
The debris of the ‘last major merger’ is dynamically young
 
Eukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptxEukaryotic Transcription Presentation.pptx
Eukaryotic Transcription Presentation.pptx
 
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...
 
Basics of crystallography, crystal systems, classes and different forms
Basics of crystallography, crystal systems, classes and different formsBasics of crystallography, crystal systems, classes and different forms
Basics of crystallography, crystal systems, classes and different forms
 

A Bootstrap Approach to Error-Reduction of Nonlinear Regression Parameters Estimation

This is basically to analyze the population by replacing the unknown distribution function F with the empirical distribution function F̂ obtained from the sample (see Davison and Hinkley, 1997, 2003; Efron, 2003; Bickel and Freedman, 2007; Hall, 2003; Casella, 2003). Exponential regression is a nonlinear regression model often used to measure the growth of a variable, such as population or GDP. An exponential relationship between X and Y exists whenever the dependent variable Y changes by a constant percentage as the variable X changes.

II. Materials And Method

Let

$$Y = f(X_1, X_2, \ldots, X_k;\; \theta_1, \theta_2, \ldots, \theta_J) + \varepsilon \qquad (2.1)$$

be a nonlinear regression model, where the $\theta$'s are the parameters, the $X$'s are the predictor variables, and the error terms $\varepsilon \sim N(0, \sigma^2)$ are independently and identically distributed and uncorrelated. Equation (2.1) is assumed to be intrinsically nonlinear. Suppose n is the number of observations on Y and the $X$'s; then

$$Y_i = f(X_{i1}, X_{i2}, \ldots, X_{ik};\; \theta_1, \theta_2, \ldots, \theta_J) + \varepsilon_i, \quad i = 1, 2, \ldots, n \qquad (2.2)$$

The n equations can be written compactly in matrix notation as

$$Y = f(X, \theta) + \varepsilon \qquad (2.3)$$

where $Y = (y_1, y_2, \ldots, y_n)'$, $X$ is the $n \times k$ matrix with entries $X_{ij}$, $\theta = (\theta_1, \theta_2, \ldots, \theta_J)'$, $\varepsilon = (\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_n)'$, and $E(\varepsilon) = 0$.

The error sum of squares for the nonlinear model is defined as

$$S(\theta) = \sum_{i=1}^{n} \left[ Y_i - f(X_i, \theta) \right]^2 \qquad (2.4)$$
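As a small illustration (not part of the original paper), the error sum of squares in (2.4) can be computed directly once a model form is fixed. The sketch below, in Python, uses the exponential form $f(X, \theta) = \theta_1 e^{\theta_2 X}$ that the paper later fits; the function name and the short data vectors are illustrative only:

```python
import math

def sum_of_squares(y, x, theta):
    """S(theta) = sum_i [y_i - f(x_i, theta)]^2 as in eq. (2.4),
    for the exponential model f(x, theta) = theta[0] * exp(theta[1] * x)."""
    return sum((yi - theta[0] * math.exp(theta[1] * xi)) ** 2
               for yi, xi in zip(y, x))

# Illustrative data: a response that falls slowly as the predictor grows.
y = [0.520, 0.508, 0.484, 0.460]
x = [0.5, 5.0, 10.0, 15.0]
print(sum_of_squares(y, x, (0.5028, -0.002)))
```

A perfect fit gives $S(\theta) = 0$; any misfit contributes a squared residual, which is what the Gauss-Newton iterations below drive down.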
Denote the least squares estimate of $\theta$ by $\hat{\theta}$; it minimizes $S(\theta)$. The least squares estimates are obtained by differentiating (2.4) with respect to $\theta$, equating to zero and solving, which yields the J normal equations

$$\sum_{i=1}^{n} \left[ Y_i - f(X_i, \hat{\theta}) \right] \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \hat{\theta}} = 0, \quad p = 1, 2, \ldots, J \qquad (2.5)$$

In estimating the parameters of the nonlinear regression model, we use the Gauss-Newton method, which is based on a Taylor series approximation of equation (2.3). Consider the function $f(X_i, \theta)$, the deterministic component of

$$Y_i = f(X_i, \theta) + \varepsilon_i, \quad i = 1, 2, \ldots, n \qquad (2.6)$$

Let $\theta^{(0)}$ be the initial approximate value of $\theta$. Adopting a Taylor series expansion of $f(X_i, \theta)$ about $\theta^{(0)}$, we have the linear approximation

$$f(X_i, \theta) \approx f(X_i, \theta^{(0)}) + \sum_{p=1}^{J} \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^{(0)}} \left( \theta_p - \theta_p^{(0)} \right) \qquad (2.7)$$

Substituting (2.7) into (2.6), we obtain

$$Y_i = f(X_i, \theta^{(0)}) + \sum_{p=1}^{J} \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^{(0)}} \left( \theta_p - \theta_p^{(0)} \right) + \varepsilon_i \qquad (2.8)$$

Equation (2.8) may be viewed as a linear approximation in a neighborhood of the starting value $\theta^{(0)}$. Let

$$f_i^{(0)} = f(X_i, \theta^{(0)}), \qquad \beta_p^{(0)} = \theta_p - \theta_p^{(0)}, \qquad Z_{ip}^{(0)} = \left[ \frac{\partial f(X_i, \theta)}{\partial \theta_p} \right]_{\theta = \theta^{(0)}}$$

for $i = 1, 2, \ldots, n$ and $p = 1, 2, \ldots, J$. Hence equation (2.8) becomes

$$Y_i - f_i^{(0)} = \sum_{p=1}^{J} \beta_p^{(0)} Z_{ip}^{(0)} + \varepsilon_i, \quad i = 1, 2, \ldots, n \qquad (2.9)$$

In matrix form, we have

$$\begin{pmatrix} Y_1 - f_1^{(0)} \\ Y_2 - f_2^{(0)} \\ \vdots \\ Y_n - f_n^{(0)} \end{pmatrix} = \begin{pmatrix} Z_{11}^{(0)} & Z_{12}^{(0)} & \cdots & Z_{1J}^{(0)} \\ Z_{21}^{(0)} & Z_{22}^{(0)} & \cdots & Z_{2J}^{(0)} \\ \vdots & \vdots & & \vdots \\ Z_{n1}^{(0)} & Z_{n2}^{(0)} & \cdots & Z_{nJ}^{(0)} \end{pmatrix} \begin{pmatrix} \beta_1^{(0)} \\ \beta_2^{(0)} \\ \vdots \\ \beta_J^{(0)} \end{pmatrix} + \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix} \qquad (2.11)$$

Compactly, equation (2.11) becomes

$$Y - f^{(0)} = Z^{(0)} \beta^{(0)} + \varepsilon \qquad (2.12)$$
We obtain the sum of squares error

$$SS\left(\beta^{(0)}\right) = \left( Y - f^{(0)} - Z^{(0)} \beta^{(0)} \right)' \left( Y - f^{(0)} - Z^{(0)} \beta^{(0)} \right)$$

Differentiating with respect to $\beta^{(0)}$ and equating to zero,

$$\frac{\partial SS}{\partial \beta^{(0)}} = -2 Z^{(0)\prime}\left(Y - f^{(0)}\right) + 2 Z^{(0)\prime} Z^{(0)} \hat{\beta}^{(0)} = 0$$

so that

$$Z^{(0)\prime} Z^{(0)} \hat{\beta}^{(0)} = Z^{(0)\prime} \left( Y - f^{(0)} \right) \qquad (2.13)$$

Hence the least squares estimate of $\beta^{(0)}$ is

$$\hat{\beta}^{(0)} = \left( Z^{(0)\prime} Z^{(0)} \right)^{-1} Z^{(0)\prime} \left( Y - f^{(0)} \right) \qquad (2.14)$$

Thus $\hat{\beta}^{(0)} = (\hat{\beta}_1^{(0)}, \hat{\beta}_2^{(0)}, \ldots, \hat{\beta}_J^{(0)})'$ minimizes the error sum of squares

$$S^{*(0)} = \sum_{i=1}^{n} \left[ Y_i - f_i^{(0)} - \sum_{p=1}^{J} \hat{\beta}_p^{(0)} Z_{ip}^{(0)} \right]^2 \qquad (2.16)$$

Now the revised estimates of the parameters $\theta_p$ of the nonlinear regression (2.1) are

$$\theta_p^{(1)} = \theta_p^{(0)} + \hat{\beta}_p^{(0)}, \quad p = 1, 2, \ldots, J \qquad (2.17)$$

Iterating, equation (2.17) generalizes to

$$\theta^{(r+1)} = \theta^{(r)} + \hat{\beta}^{(r)}, \qquad \hat{\beta}^{(r)} = \left( Z^{(r)\prime} Z^{(r)} \right)^{-1} Z^{(r)\prime} \left( Y - f^{(r)} \right) \qquad (2.18)$$

where $\hat{\beta}^{(r)}$ are the least squares estimates of $\beta$ obtained at the $(r+1)$th iteration. The iterative process continues until

$$\left| \frac{\theta^{(r+1)} - \theta^{(r)}}{\theta^{(r)}} \right| < \delta$$

where $\delta = 10^{-5}$ is the error tolerance (see Smith (1998) and Nduka (1999)). $S^{*}$ is evaluated after each iteration to see whether a reduction in its value has actually been achieved. At the end of the $(r+1)$th iteration, we have

$$S^{*(r+1)} = \sum_{i=1}^{n} \left[ Y_i - f_i^{(r)} - \sum_{p=1}^{J} \hat{\beta}_p^{(r)} Z_{ip}^{(r)} \right]^2 \qquad (2.19)$$
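The update (2.17)–(2.18) can be sketched concretely for the two-parameter exponential model $f(x, \theta) = \theta_1 e^{\theta_2 x}$: build the derivative matrix $Z$, solve the 2×2 normal equations $Z'Z\hat{\beta} = Z'(Y - f)$ by hand, and add the increment $\hat{\beta}$ to $\theta$. This is an illustrative Python re-implementation, not the authors' R program; the function name is our own:

```python
import math

def gauss_newton_expo(y, x, theta, tol=1e-5, max_iter=100):
    """Gauss-Newton for f(x) = a * exp(b * x), theta = (a, b).
    Returns (a, b, SS) once the change in SS falls below tol."""
    a, b = theta
    ss_old = float("inf")
    for _ in range(max_iter):
        f = [a * math.exp(b * xi) for xi in x]                # f_i^(r)
        resid = [yi - fi for yi, fi in zip(y, f)]             # Y - f^(r)
        # Z columns: partial derivatives of f w.r.t. a and b
        z1 = [math.exp(b * xi) for xi in x]
        z2 = [a * xi * math.exp(b * xi) for xi in x]
        # Normal equations Z'Z beta = Z'(Y - f), solved for the 2x2 case
        s11 = sum(v * v for v in z1)
        s12 = sum(u * v for u, v in zip(z1, z2))
        s22 = sum(v * v for v in z2)
        t1 = sum(u * v for u, v in zip(z1, resid))
        t2 = sum(u * v for u, v in zip(z2, resid))
        det = s11 * s22 - s12 * s12
        beta1 = (s22 * t1 - s12 * t2) / det
        beta2 = (s11 * t2 - s12 * t1) / det
        a, b = a + beta1, b + beta2                           # theta^(r+1)
        ss = sum((yi - a * math.exp(b * xi)) ** 2 for yi, xi in zip(y, x))
        if abs(ss_old - ss) < tol:
            break
        ss_old = ss
    return a, b, ss

# The paper's fee (%) vs. asset data, with the paper's starting values.
y = [0.520, 0.508, 0.484, 0.46, 0.4398, 0.4238, 0.4115, 0.402,
     0.3944, 0.388, 0.3825, 0.3738]
x = [0.5, 5, 10, 15, 20, 25, 30, 35, 40, 45, 55, 60]
a, b, ss = gauss_newton_expo(y, x, (0.5028, -0.002))
print(a, b, ss)
```

With the starting values $(0.5028, -0.002)$ used in Section III, a few iterations settle near the paper's analytical estimates.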
and the iteration is stopped once convergence is achieved. The final estimates of the parameters at the end of the $(r+1)$th iteration are

$$\theta^{(r+1)} = \left( \theta_1^{(r+1)}, \theta_2^{(r+1)}, \ldots, \theta_J^{(r+1)} \right) \qquad (2.20)$$

The Bootstrap Algorithm Based on Resampling Observations for the Estimation of Nonlinear Regression Parameters

Let $W_i = (Y_i, Z_{ij})$ be the original sample of size n for the resampling; the $W_i$'s are drawn independently and identically from a distribution F. Here $Y = (y_1, y_2, \ldots, y_n)'$ is the column vector of the response variables and $Z_{ij}$, with entries $(z_{1j}, z_{2j}, \ldots, z_{nj})$, is the $n \times k$ matrix of the predictor variables, where $j = 1, 2, \ldots, k$ and $i = 1, 2, \ldots, n$. The $(k+1) \times 1$ vector $W_i = (Y_i, Z_{ij})$ denotes the values associated with the ith of the observation sets $(w_1, w_2, \ldots, w_n)$.

Step 1: Draw a bootstrap sample $(w_1^{(b)}, w_2^{(b)}, \ldots, w_n^{(b)})$ of size n with replacement from the observations, giving each $W_i$ value probability $1/n$ of being sampled from the population, and label the elements of each vector $W_i^{(b)} = (Y_i^{(b)}, Z_{ij}^{(b)})$, $i = 1, 2, \ldots, n$ and $j = 1, 2, \ldots, k$. From these, form the vector of response variables $Y^{(b)} = (y_1^{(b)}, y_2^{(b)}, \ldots, y_n^{(b)})'$ and the matrix of predictor variables $Z^{(b)}$.

Step 2: Calculate the least squares estimates of the nonlinear regression coefficients from the bootstrap sample: $\hat{\beta}^{(0)} = (Z'Z)^{-1} Z' (Y - f)$.

Step 3: Compute $\hat{\theta}^{(1)} = \hat{\theta}^{(0)} + \hat{\beta}^{(0)}$ using the Gauss-Newton method; the $\hat{\theta}^{(1)}$ value is treated as the initial value of the next approximated linear model.

Step 4: Return to Step 2 and again compute the $\hat{\beta}$'s. At each iteration, the new $\hat{\beta}$'s are increments that are added to the estimates from the previous iteration according to Step 3, giving $\hat{\theta}^{(2)} = \hat{\theta}^{(1)} + \hat{\beta}^{(1)}$ and so on, up to $\hat{\theta}^{(r+1)} = \hat{\theta}^{(r)} + \hat{\beta}^{(r)}$.
Step 5: Stopping rule: the iteration process continues until $\left| (\theta^{(r+1)} - \theta^{(r)}) / \theta^{(r)} \right| < \delta$, where $\delta = 10^{-5}$, for the values of $\theta_0, \theta_1, \ldots, \theta_{p-1}$, giving the estimate $\hat{\theta}^{(b1)}$ from the first bootstrap sample.

Step 6: Repeat Steps 1 to 5 for $r = 1, 2, \ldots, B$, where B is the number of repetitions.

Step 7: Obtain the probability distribution $F(\hat{\theta}^{(b)})$ of the bootstrap estimates $\hat{\theta}^{(b1)}, \hat{\theta}^{(b2)}, \ldots, \hat{\theta}^{(bB)}$ and use $F(\hat{\theta}^{(b)})$ to estimate the regression coefficients and their variances. The bootstrap estimate of a regression coefficient is the mean of the distribution $F(\hat{\theta}^{(b)})$ (see Topuz and Sahinler, 2007; Obiora-Ilouno and Mbegbu, 2012):

$$\hat{\theta}^{(b)} = \frac{\sum_{r=1}^{B} \hat{\theta}^{(br)}}{B} \qquad (2.21)$$

The bootstrap standard deviation from the $F(\hat{\theta}^{(b)})$ distribution is
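The resampling structure of Steps 1–7 and the summaries (2.21)–(2.22) can be sketched as follows. This Python outline is illustrative only (it mirrors the structure, not the authors' R code), and the stand-in `fit` function used in the toy check simply returns the mean response rather than a Gauss-Newton fit:

```python
import random
import statistics

def bootstrap_estimates(data, fit, B=200, seed=1):
    """Steps 1-6: draw B resamples of the (y, x) pairs with replacement
    (each pair has probability 1/n) and refit the model on each resample."""
    rng = random.Random(seed)
    n = len(data)
    estimates = []
    for _ in range(B):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        estimates.append(fit(resample))
    return estimates

# Step 7 summaries: eq. (2.21) bootstrap mean and eq. (2.22) standard deviation.
def bootstrap_mean(estimates):
    return sum(estimates) / len(estimates)

def bootstrap_sd(estimates):
    return statistics.stdev(estimates)  # divides by B - 1, as in (2.22)

# Toy check with a stand-in "fit" returning the mean response of the resample:
data = [(0.52, 0.5), (0.508, 5.0), (0.484, 10.0), (0.46, 15.0)]
fit = lambda sample: sum(y for y, _ in sample) / len(sample)
est = bootstrap_estimates(data, fit)
print(bootstrap_mean(est), bootstrap_sd(est))
```

In the paper's setting, `fit` would be the Gauss-Newton routine of Section II applied to each resample, and the mean and standard deviation would be taken coefficient by coefficient.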
$$s\left( \hat{\theta}^{(b)} \right) = \sqrt{ \frac{ \sum_{r=1}^{B} \left( \hat{\theta}^{(br)} - \hat{\theta}^{(b)} \right)^2 }{B - 1} } \qquad (2.22)$$

The Computer Program in R for Bootstrapping Nonlinear Regression:

# x is the vector of the independent variable
# theta is the vector of parameters of the model
# This function calculates the matrix of partial derivatives
F = function(x, theta)
{
  output = matrix(0, ncol = 2, nrow = length(x))
  for (i in 1:length(x))
    output[i, ] = c(exp(theta[2] * x[i]), theta[1] * x[i] * exp(theta[2] * x[i]))
  output
}

# This function calculates the regression coefficients using the Gauss-Newton method
gaussnewton = function(y, x, initial, tol)
{
  theta = initial
  count = 0
  eps = y - (theta[1] * exp(theta[2] * x))
  SS = sum(eps**2)
  diff = 1
  while (tol < diff)
  {
    S = SS
    ff = F(x, theta)
    theta = c(theta + solve(t(ff) %*% ff) %*% t(ff) %*% eps)
    eps = y - (theta[1] * exp(theta[2] * x))
    SS = sum(eps**2)
    diff = abs(SS - S)
    count = count + 1
    if (count == 100) break
    pp = c(theta, SS)  # estimates and SS at each iteration
  }
  pp
}

# This part of the code does the bootstrap
boot = function(data, p, b, initial)
{
  n = length(data[, 1])
  z = matrix(0, ncol = p, nrow = n)
  output = matrix(0, ncol = p + 1, nrow = b)
  for (i in 1:b)
  {
    u = sample(n, n, replace = T)
    for (j in 1:n)
      z[j, ] = data[u[j], ]
    y = z[, 1]
    x = z[, 2:p]
    logreg = gaussnewton(y, x, initial, 0.00001)
    coef = logreg
    output[i, ] = c(coef)
  }
  output
}
# Then to run the code we use the following
y <- c(data)
x <- c(data)
data = cbind(y, x)
initial = c(initial)
expo = boot(data, p, B, initial)

# Run the following to view the bootstrap results
theta_0 = mean(expo[, 1])
theta_0
theta_1 = mean(expo[, 2])
theta_1
SSE = mean(expo[, 3])
SSE

Problem [Gujarati and Porter, 2009]: The data below relate to the management fees that a leading mutual fund pays to its investment advisors to manage its assets. The fees paid depend on the net asset value of the fund. Develop a regression model for the management fees to the advisors.

Fee %   0.520   0.508   0.484   0.46   0.4398   0.4238   0.4115   0.402   0.3944   0.388   0.3825   0.3738
Asset   0.5     5.0     10.0    15     20       25       30       35      40       45      55       60

III. Results And Discussion

From the data, the higher the net asset value of the fund, the lower the advisory fees (see Gujarati and Porter, 2009).

The Analytical Result of the R Program (Without Bootstrapping)

y <- c(0.520, 0.508, 0.484, 0.46, 0.4398, 0.4238, 0.4115, 0.402, 0.3944, 0.388, 0.3825, 0.3738)
x <- c(0.5, 5, 10, 15, 20, 25, 30, 35, 40, 45, 55, 60)
run = gaussnewton(y, x, c(0.5028, -0.002), 0.00001)
[1] 0.505805756 -0.005487479 0.001934099
[1] 0.508724700 -0.005949598 0.001699222
[1] 0.508897316 -0.005964811 0.001699048

The estimated model from the result is $\hat{Y} = 0.50890\, e^{-0.00596 X}$, where Y and X are the fee and the assets respectively.

[Figure: Relationship of advisory fees to fund assets — fee % plotted against asset value.]

The Result of the R Program Using the Bootstrapping Algorithm

> # to run the bootstrap algorithm use the following
> y <- c(0.520, 0.508, 0.484, 0.46, 0.4398, 0.4238, 0.4115, 0.402, 0.3944, 0.388, 0.3825, 0.3738)
> x <- c(0.5, 5, 10, 15, 20, 25, 30, 35, 40, 45, 55, 60)
> data = cbind(y, x)
> initial = c(0.5028, -0.002)
> expo = boot(data, 2, 1000, initial)
>
> # Run the following to view the bootstrap results
> theta_0 = mean(expo[, 1])
> theta_0
[1] 0.5075696
> theta_1 = mean(expo[, 2])
> theta_1
[1] -0.005964939
> SSE = mean(expo[, 3])
> SSE
[1] 0.001328289

IV. Discussion

$\hat{\theta}_0 = 0.50889$ and $\hat{\theta}_1 = -0.00596$ are the revised parameter estimates at the end of the last iteration. The least squares criterion measure SS for the starting values was reduced in the first iteration and further reduced in the second and third iterations. The third iteration led to no change in either the estimates of the coefficients or the least squares criterion SS; hence convergence is achieved and the iterations end. The fitted regression function is $\hat{Y} = 0.50889 \exp(-0.00596 X)$. The results of the analytical and the bootstrap computations are shown in Table 2.

Table 2: Analytical and Bootstrap Results

              θ̂₀        θ̂₁         SS
Analytical    0.50889   -0.00596   0.001699
Bootstrap     0.50757   -0.00596   0.001328

The fitted regression functions for the analytical and the bootstrap computations are $\hat{Y} = 0.50889 \exp(-0.00596 X)$ and $\hat{Y} = 0.50757 \exp(-0.00596 X)$ respectively.

V. Conclusion

From the results of our analysis, the bootstrap approach yields approximately the same inference as the analytical method. The bootstrap algorithm also yielded a smaller error sum of squares SS than the analytical method (see Table 2). With these results, we have greater confidence in the result obtained by the bootstrap approach.

References
[1]. Casella, G. (2003), Introduction to the Silver Anniversary of the Bootstrap, Statistical Science, Vol. 18, No. 2, pp. 133-134.
[2]. Efron, B. (1979), Bootstrap Methods: Another Look at the Jackknife, The Annals of Statistics, 7, 1-26.
[3]. Efron, B. (2003), Second Thoughts on Bootstrapping, Statistical Science, Vol. 18, No. 2, 135-140.
[4]. Efron, B. and Tibshirani, R. J. (1998), An Introduction to the Bootstrap, CRC Press, Boca Raton, Florida, 60-82.
[5]. Freedman, D. A. (1981), Bootstrapping Regression Models, Annals of Statistics, Vol. 9, No. 2, 158-167.
[6]. Nduka, E. C. (1999), Principles of Applied Statistics 1: Regression and Correlation Analyses, Crystal Publishers, Imo State, Nigeria.
[7]. Obiora-Ilouno, H. O. and Mbegbu, J. I. (2012), Jackknife Algorithm for the Estimation of Logistic Regression Parameters, Journal of Natural Sciences Research, IISTE UK, Vol. 2, No. 4, 74-82.
[8]. Sahinler, S. and Topuz, D. (2007), Bootstrap and Jackknife Resampling Algorithms for Estimation of Regression Parameters, Journal of Applied Quantitative Methods, Vol. 2, No. 2, pp. 188-199.
[9]. Smith, H. and Draper, N. R. (1998), Applied Regression Analysis, Third Edition, Wiley-Interscience, New York.
[10]. Venables, W. N. and Smith (2007), An Introduction to R: A Programming Environment for Data Analysis and Graphics, Version 2.6.1, pp. 1-100.