Transcript of "Model of robust regression with parametric and nonparametric methods"
1.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201327Model of Robust Regression with Parametric and NonparametricMethodsDr. Nadia H. AL – Noor* and Asmaa A. Mohammad*** Department of Mathematics College of Science Al-Mustansiriya University-Iraq** Department of Mathematics College of Science for Women, University of Baghdad-IraqCorresponding E-mail:nadialnoor@yahoo.comAbstractIn the present work, we evaluate the performance of the classical parametric estimation method "ordinary leastsquares" with the classical nonparametric estimation methods, some robust estimation methods and twosuggested methods for conditions in which varying degrees and directions of outliers are presented in theobserved data. The study addresses the problem via computer simulation methods. In order to cover the effectsof various situations of outliers on the simple linear regression model, samples were classified into four cases (nooutliers, outliers in the X-direction, outliers in the Y-direction and outliers in the XY-direction) and thepercentages of outliers are varied between 10%, 20% and 30%. The performances of estimators are evaluated inrespect to their mean squares error and relative mean squares error.Keywords: Simple Linear Regression model; Ordinary Least Squares Method; Nonparametric Regression;Robust Regression; Least Absolute Deviations Regression; M-Estimation Regression; Trimmed Least SquaresRegression.1. IntroductionThe simple linear regression model is expressed as:=+ + (1)Where: Y is called response variable or dependent variable; X is called predictor variable, regressor variable orindependent variable, and is called prediction error or residual. The symbols and are called intercept andslope respectively which they represents the linear regression unknown parameters or coefﬁcients.The process of estimating the parameters of regression model is still one of important subjects despiteof large number of papers and studies written in this subject which differ in techniques followed in the process ofestimation. The ordinary least squares (OLS) method is the most popular classical parametric regressiontechnique in statistics and it is often used to estimate the parameters of a model because of nice property andease of computation. According to Gauss-Marcov theorem, the OLS estimators, in the class of unbiased linearestimators, have minimum variance i.e. they are best linear unbiased estimator (BLUE)[10]. Nonetheless, theOLS estimates are easily affected by the presence of outliers, "outliers are observations which are markedlydifferent from the bulk of the data or from the pattern set by the majority of the observations. In a regressionproblem, observations corresponding to excessively large residuals are treated as outliers[18]", and will produceinaccurate estimates. The breakdown point of the OLS estimator is 0% which implies that it can be easilyaffected by a single outlier. So alternative methods such as nonparametric and robust methods should be putforward which are less affected by the outliers. However, most robust methods are relatively difficult andcomputationally complicated. As an alternative to OLS, least absolute deviations regression (LAD or L1) hasbeen proposed by Boscovich in 1757, then Edgeworth in 1887. LAD regression is the ﬁrst step toward a morerobust regression [22][26]. The next direct step to obtain robust regression was the use of M-estimators. Theclass of M-estimators was defined by Huber (1964, 1968) for the location model and extended by him to theregression model in (1973) [12] as an alternative robust regression estimator to the least squares. This methodbased on the idea of replacing the squared residual in OLS by another symmetric function, ρ, of the residuals[13]. Rousseeuw and Yohai (1984) [24] introduced the Trimmed Least Squares (TLS) regression which is ahighly robust method for fitting a linear regression model. The TLS estimator minimizes the sum of the (h)smallest squared residuals. Alma (2011) [1] compare some robust regression methods such that TLS and M-estimate against OLS regression estimation method in terms of the determination of coefficient. Bai (2012) [3]review various robust regression methods including "M-estimate and TLS estimate" and compare between thembased on their robustness and efficiency through a simulation study where n=20,100. In other side, Theil (1950)[27] introduced a nonparametric procedure which is expected to perform well without regard to the distributionof the error terms. This procedure is based on ranks and uses the median as robust measures rather than using themean as in OLS. Mood and Brown (1950) [19] proposed to estimate the intercept and slope simultaneously fromtwo equations depending upon divide the observations for two groups according to the median of the variable
2.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201328(X). Conover (1980) [5] calculate the estimate of the intercept by used the median of the response variables,estimated Thiels slope and the median of the explanatory variables. Hussain and Sprent (1983) [14] presented asimulation study in which they compared the OLS regression estimator against the Theil pairwise median andweighted Theil estimators in a study using 100 replications per condition. Hussain and Sprent characterized thedata modeled in their study as typical data patterns that might result from contamination due to outliers.Contaminated data sets were generated using a mixture model in which each error term is either a randomobservation from a unit normal distribution [N(0,1)] or an observation from a normal distribution with a largervariance [N(0, k2), k >1]. Jajo (1989) [15] carried a simulation study to compare the estimators that obtainedfrom (Thiel, Mood-Brown, M-estimation and Adaptive M-estimation) with the estimators that obtained fromleast squares of the simple linear regression model in the presence of outliers. Mutan (2004) [20] introduced aMonte Carlo simulation study to comparing regression techniques including (ordinary least squares, , leastabsolute deviations, trimmed least squares, Theil and weighted Theil) for the simple linear regression modelwhen the distribution of the error terms is Generalized Logistic. Meenai and Yasmeen (2008) [17] appliednonparametric regression methods to some real and simulated data.In the present work, we evaluate the performance of the classical nonparametric estimation methods,some robust estimation methods "least absolute deviations, M-estimation and trimmed least squares" and twosuggested methods "depending upon nonparametric and M-estimation" with the OLS estimation method forconditions in which varying degrees and directions of outliers are presented in the observed data. The studyaddresses the problem via computer simulation methods. In order to cover the effects of various situations ofoutliers on the simple linear regression model, samples were classified into four cases (no outliers, outliers in theX-direction, outliers in the Y-direction "error distributed as contaminated normal", and outliers in the XY-direction) and the percentages of outliers are varied between 10%, 20% and 30% . The performances ofestimators are evaluated in respect to their mean squares error and relative mean squares error.2. Classical Estimation Method for Regression Parameters [16]The most well-known classical parametric method of estimating the regression parameters is to use aleast square error (LSE) approach. The basic idea of ordinary least squares is to optimize the fit by minimizingthe total sum of the squares of the errors (deviations) between the observed values yi and the estimated values+ :∑ = ∑ ( − −1 )2 (2)where and are estimates of β0 and β1, respectively. The least squares estimators of β0 and β1, and are:=∑ − (∑ )(∑ )⁄∑ −(∑ ) ⁄=∑ ( − )( − )∑ ( − )(3)=− (4)Where: = (1 ) ∑⁄ and = (1 ) ∑⁄3. Alternative Estimation Methods for Regression Parameters3.1 Nonparametric Regression [5][11][14][15][21][27]The OLS regression method described above assume normally distributed error terms in the regressionmodel. In distinction, classical nonparametric methods to linear regression typically employ parameterestimation methods that are regarded as distribution free. Since nonparametric regression procedures aredeveloped without relying on the assumption of normality of error distributions, the only presupposition behindsuch procedures is that the errors of prediction are independently and identically distributed (i.i.d.). Manynonparametric procedures are based on using the ranks of the observed data rather than the observed datathemselves. The robust estimate of slope for nonparametric fitted line was first described by Theil (1950). Heproposed two methods, namely, the complete and the incomplete method. Assumed that all the s are distinct,and lose no generality that the s are arranged in ascending order. The complete Theil slope estimate iscomputed by comparing each data pair to all others in a pairwise fashion. A data set of n (X,Y) pairs will resultin " = # $ =( % )pairwise comparisons. For each of these comparisons a slope ∆Y/∆X is computed. Themedian of all possible pairwise slopes is taken as the nonparametric Thiels slope estimate,& (), Where:
3.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201329* + =∆∆=+ −+ −; ≠ + , 1 ≤ < 2≤ (5)& ()= 456 7 #* +$ ; 1≤ < 2 ≤ (6)For incomplete method, Theil suggested using only a subset of all * +, and took as estimator of themedian of the subset (* , 8 ∗); where:* , 8 ∗ =8 ∗ −8 ∗ −;= 1,2, … , ∗(7)If n is even then ∗= /2 . If n is odd, the observation with rank (n+1)/2 is not used. The incomplete Theilsslope estimator is:& ()∗= 456 7 #* , 8 ∗$;= 1,2, … , ∗(8)For estimation the intercept parameter, Thiels intercept estimate,&>, is defined as:&>= 456 7 ? −&>@ ; = 1,2, … , (9)Where βA BCis the estimate of β according to the complete or the incomplete Thiels slope estimator.Other estimators of intercept have been suggested. Conover suggested estimating by using the formula:D E= 456 7 ( ) −&>. 456 7 ( ) (10)This formula "Conovers estimator" assures that the fitted line goes through the point (Xmedian ,Ymedian). This isanalogous to OLS, where the fitted line always goes through the point ( , ).3.2 Robust RegressionAny robust method must be reasonably efficient when compared to the least squares estimators; if theunderlying distribution of errors are independent normal, and substantially more efficient than least squaresestimators, when there are outlying observations. There are various robust methods for estimation the regressionparameters. The main focus of this subsection is to least absolute deviations regression, M-estimation andtrimmed least squares regression which are the most popular robust regression coefficients with outliers.3.2.1 Least Absolute Deviations Regression [4][8][20][25]The least absolute deviations regression (LAD regression) is one of the principal alternatives to theordinary least squares method when one seeks to estimate regression parameters.The goal of the LAD regression is to provide a robust estimator which is minimized the sum of theabsolute residuals.4 H| | (11)The LAD procedure was developed to reduce the influence of Y-outliers in the OLS. The Y-outliershave less impact on the LAD results, because it does not square the residuals, and then the outliers are not givenas much weight as in OLS procedure. However, LAD regression estimator is just as vulnerable as least squaresestimates to high leverage outliers (X-outliers). In fact, LAD estimate have low breakdown point (BP is 1/n or0%). Although the concept of LAD is not more difficult than the concept of the OLS estimation, calculation ofthe LAD estimates is more troublesome. Since there are no exact formulas for LAD estimates, an algorithm isused. Birkes and Dodge (1993) explain this algorithm for the simple linear regression model. It is known thatLAD regression line passes through two of the data points. Therefore, the algorithm begins with one of the datapoints, denoted by ( , ), and tries to find the best line passing through it. The procedure for finding the bestline among all lines passing through a given data point ( , ) is describe below.For each data point ( , ), the slope of the line passing through the two points ( , ) and ( , ) is calculatedand it is equal to the ( - )/( - ). If = for some i, the slope is not defined. The data points are re-indexedin such a way that: ( − )/( − ) ≤ ( − )/( − ) ≤ ⋯ ≤ ( − )/( − )Now, the searched point ( +, +) is determined by the index j for which.K| − | + ⋯ + L +% − L < T| − | + ⋯ + L +% − L + L + − L > TO (12)Where T = ∑ | − |.This conditions guarantee that minimizes the quantity ∑ L( − ) − ( − )L
4.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201330Analogously to ∑| | for the regression lines passing through ( , ).The is computed in such a way that theregression line crosses ( , ). So, the best line passing through ( , ) is the line P = + where:QR=ST%SUVT%VU(13)QR=−QR(14)We can equally verify that it passes through the data point ( +, +).We just have to rename the point ( +, +) by( , ) and restart.3.2.2 M-Estimation Regression [1][3][10][12]The most common general method of robust regression is M-estimation, introduced by Huber (1973).The M in M-estimates stands for "maximum likelihood type". That is because M-estimation is a generalizationof maximum likelihood estimates (MLE). The goal of M-estimation is minimized a sum of less rapidlyincreasing functions of the residuals, ∑ W ?XYZ@ where s is an estimate of scale which can be estimated by usingthe formula:[ =456 7 | − 456 7 ( )|0.6745(15)A reasonable W should satisfy the following properties: W( ) ≥ 0; W( ) = W(− ); W(0) = 0; W( ) ≥W# +$ ^_ | | ≥ L +LM-estimators are robust to outliers in the response variable with high efficiency. However, M-estimators are justas vulnerable as least squares estimates to high leverage outliers. In fact, the BP (breakdown point) of M-estimates is 1/n or 0%. Suppose simple linear regression model, the M-estimator minimizes the objectivefunction:H W ?[@ = H W `− −[a= H W b( )[c = H W(d ) (16)Where d =XY(e)Zare called standardized residuals. Let f(d) = Ẃ(d)Differentiating (16) with respect to and setting the partial derivatives to zero, we get the normal equations:KH f b( )[c = 0H f b( )[c x = 0ijjjjk(17)To solve (17) we define the weight function l( ) =m(V)V; if ≠ 0 and l( ) = fp (0); if = 0. let wi = W(ui).Then equations (17) can be written asKH q ( − − ) = 0Hq ( − − ) = 0ijjjjk(18)Solving the estimating equations1(18) is a weighted least squares problem, minimizing ∑ q d . The weights,however, depend upon the residuals, the residuals depend upon the estimated coeﬃcients, and the estimatedcoeﬃcients depend upon the weights. An iterative solution (called iteratively reweighted least squares) istherefore required. So, the solution of (18) can be found by iterating between wi and :1. Select an initial estimates( )and( ), such as the least squares estimates.2. At each iteration t, calculate standardized residuals d(s% )and associated weights q(s% )= l(d(s% ))from the previous iteration.1 Newton-Raphson and Iteratively Reweighted Least Squares (IRLS) are the two methods to solve the M-estimates nonlinear normalequations. IRLS is the most widely used in practice and we considered for this study.
5.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 2013313. Solve for new weighted least squares estimates(s),(s).(s)=#∑ q (s% )$#∑ q (s% )$ − (∑ q (s% ))(∑ q (s% ))(∑ q (s% ))(∑ q (s% ) ) − (∑ q (s% ) )(19)(s)=#∑ q (s% )$#∑ q (s% )$ − (∑ q (s% ))(∑ q (s% ))(∑ q (s% ))(∑ q (s% ) ) − (∑ q (s% ) )(20)Also, we can find(s)7[:(s)=∑ q (s% )∑ q (s% )−(s) ∑ q (s% )∑ q (s% )(21)4. Repeat step 2 and step 3 until the estimated coefficients converge. The iteration process continues untilsome convergence criterion is satisfied, vβA(w)− βA(w% )v ≅ 0.Several choices of W have been proposed by various authors. Two of these are presented in table (1)together with the corresponding derivatives (f) and the resulting weights (w).Table (1): Different y functions, together with the corresponding derivatives z and the resulting weights wType y({|) z({|) }({|)Huber~12; | | ≤ •• `| | −12•a ; | | > •K €; | | ≤ •• [ • ( ); | | > •K ~1 ; | | ≤ ••| |; | | > •Kc = 1.345, 1.5, 1.7, 2.08Welsch•2b1 − 5%?XY‚@ƒc ; | | < ∞ 5%?XY‚@ƒ; | | < ∞ 5%?XY‚@ƒ; | | < ∞c = 2.4, 2.9853.2.3 Trimmed Least Squares Regression [3][23][24]Rousseeuw and Yohai (1984) proposed the trimmed least squares (TLS) estimator regression.Extending from the trimmed mean, TLS regression minimizes the h out of n ordered squared residuals. So, theobjective function is minimize the sum of the smallest h of the squared residuals and is defined as:4 H ( )(22)where ( ) represents the ithordered squared residuals ( ) ≤ ( ) ≤ ⋯ ≤ ( ) and h is called the trimmingconstant which has to satisfy < ℎ < . This constant, h , determines the breakdown point of the TLS estimator.Using h = [(n / 2) +1] ensures that the estimator has a breakdown point equal to 50%. When h = n, TLS isexactly equivalent to OLS estimator whose breakdown point is 0%. Rousseeuw and Leroy (1987) recommendedh = [n (1− α) +1] where α is the trimmed percentage. This estimator is attractive because can be selected toprevent some of the poor results other 50% breakdown estimator show. TLS can be fairly efficient if the numberof trimmed observations is close to the number of outliers because OLS is used to estimate parameters from theremaining h observations.4. Suggested Estimators4.1 First Suggested Estimator: in this estimator, we suggest to modifying Thiel estimator (complete andincomplete method). Thiel suggest using the median as a robust estimator of location instead of the mean in OLS.So, we suggest using the Gastwirths estimator instead of median in Thiel estimator in order to not exclude toomuch of the information from the regression. Gastwirths location estimator is a weighted sum of three orderstatistics. It is based on median with two ordered observations and therefore it contains information regarding thesample more than the median. The formula Gastwirths location estimator is [9]:GAS = 0.3 xˆ‰Š8 ‹+ 0.4 median + 0.3 x?‰%ˆ‰Š‹@(23)
6.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201332Where: ˆ‰Š+ 1‹ ∶ The integer part of the real number ?‰Š+ 1@ and ˆ‰Š‹ ∶ The integer part of the real number ?‰Š@.4.2 Second Suggested Estimator: in this estimator, we suggest to use the following function as M-estimatorwhich satisfies the proprieties of y function.W( ) =•18log b1 + `3•a c ; | | < ∞ , •= 9 (24)The z function will be as follow:f( ) =/•1 + ?3•@; | | < ∞ , •= 9 (25)5. Simulation StudyIn this section we introduced the simulation study which has been carried out to illustrate the robustnessof the estimators under different cases. Simulation was used to compare the mean squares error (MSE) andrelative mean squares error (RMSE) of the estimates of regression coefficients and model by using the ordinaryleast squares (OLS); least absolute deviation (LAD); nonparametric estimators contains "complete Thielsestimator (CTH) and incomplete Thiels estimator (ITH) with Conovers estimator for intercept"; suggestednonparametric estimator contains "complete Gastwirths estimator (CGAS) and incomplete Gastwirths estimator(IGAS) with Conovers estimator for intercept"; M-estimators "Hubers M-estimators with c=1.345 (H-M),Welschs M-estimators with c=2.4 (W-M) and suggested M-estimator (SU-M)" and trimmed least squares (TLS)with proportion of trimmed (α) equal to (10%, 20%, 30% and 40%). The data sets are generated from the simplelinear regression model as: = 1 + 3 + which means that the true value of regression parameters are= 1 and = 3. Since the parameters known, a detailed comparison can be made. The process was repeated1000 times to obtain 1000 independent samples of Y and X of size n. The sample sizes varied from small (10), tomedium (30) and large (50). In order to cover the effects of various situations on the regression coefficients andmodel, samples were classified into four cases, three of them where contaminated with outliers. In addition, threepercentages of outliers (δ) were considered, δ =10%, 20% and 30%. We treated with normal and contaminatednormal distribution. The simulation programs were written using Visual Basic6 programming language.Case (1) No-outliers "Normal Case":Generate errors, ~"(0,1); = 1,2, … , .Generate the values of independent variable, ~"(0,100); = 1,2, … , .Compute the yi values.Case (2) X-outliers:Generate errors, ~"(0,1); = 1,2, … , .Generate the values of independent variable with no X-outliers, ~"(0,100); = 1,2, … , (1 − δ).Generate (n δ) of X-outliers for the values of independent variable , ~"(100,100); =n (1- δ)+1, n (1-δ)+2,…, n.Compute the yi values.Case (3) Y-outliers:Generate the values with no Y-outliers using errors, ~"(0,1); = 1,2, … , (1 − δ).Generate the values with Y-outliers using errors, ~"(0,50); i= n (1- δ)+1, n (1- δ)+2,…, n.Generate the values of independent variable, ~"(0,100); = 1,2, … , .Compute the yi values.Case (4) XY-outliers:Generate the values with no Y-outliers using errors, ~"(0,1); = 1,2, … , (1 − δ).Generate the values with Y-outliers using errors, ~"(0,50); i= n (1- δ)+1, n (1- δ)+2,…, n.Generate the values of independent variable with no X-outliers, ~"(0,100); = 1,2, … , (1 − δ).Generate (n δ) of X-outliers for the values of independent variable , ~"(100,100); =n (1- δ)+1, n (1-δ)+2,…, n.Compute the yi values.For each case, random samples of size n were chosen and from each sample thus obtained, MSE andRMSE using OLS, LAD, CTH, ITH, CGAS, IGAS, H-M, W-M, SU-M, TLS10%, TLS 20%, TLS 30% and TLS40% were found and compared. MSE can be a useful measure of the quality of parameter estimation and iscomputed as:
7.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201333–*—# $ =˜7 # $ + ™š 7[# $› (26)š 7[ # $ = − ; ˜7 # $ =∑ # (œ) − $•žŸ − 1; =∑ (œ)•žŸ; Ÿ = 1000–*—# A$=∑ –*—#A(œ)$•žŸ(27)–*—# A(œ)$ =∑ ( − P )− 2A relative mean squares error has also been used as a measure of the quality of parameter estimation.We computed RMSE as:Ÿ–*—# $=–*—# $ − –*—# s(X ¡(s ¢$–*—# $(28)Ÿ–*—# A$=–*—# A $ − –*—#A s(X ¡(s ¢$–*—# A $(29)The formulation (28) is useful for comparing estimator performance and is interpreted as aproportionate (or percent) change from baseline, using the OLS estimator MSE within a given data condition asa baseline value [21]. Positive values of RMSE refer to the proportional reduction in the MSE of a givenestimator with respect to OLS estimation. Hence, RMSE is interpreted as a relative measure of performanceabove and beyond that of the OLS estimator.6. Conclusions and RecommendationsBased on simulation results that have been shown in tables (2)…(5), the following conclusions could bereached:Under ideal conditions (unit normal error distribution, no contamination) "normal case", table (2) notethe following:OLS indicates the best performance (as expected) for all sample sizes. The decline in the performanceof the rest of the estimation methods compare to the performance of ordinary least squares "which can be seenthrough the negative values for the RMSE" it is the only sacrifice paid by those methods in anticipation of theexistence of outliers.Proposed method "SU-M" provided the second best performance of the estimates for all sample sizes, aswell as provided a performance equal to the performance of OLS in estimating the slope with a sample size equalto 30 and 50, followed by the performance of both H-M and W-M respectively. Consequently, the method of M-estimations surpassed the performance of the alternative methods for OLS.In general, the MSE values of estimating the intercept are greater than the corresponding MSE values ofestimating the slope. So, the results for intercept estimator need more consideration.The use of GAS estimator instead of median in Thiel method reduced inflation in MSE values of modelas compared to OLS. From the value of RMSE we can see the reduction was between (22%-28%) for all samplesizes in complete method whereas was between (20%-26%) for = 30,50 in incomplete method.As the sample size increases, the value of MSE decreases.LAD introduced better performance comparing with nonparametric estimators in estimating interceptand model.Under contamination cases, tables (3), (4) and (5) note the following:Ordinary Least squares recorded a decline in performance when outlier exists while most of the otherestimation methods are recorded good performances depending on the percentage and direction ofcontaminations.In general, TLS indicates the best performance for all sample sizes depending on the proportion oftrimmed. TLS can be fairly efficient when the number of trimmed observations is close to the number of outliersbecause OLS is used to estimate parameters from the remaining h observations.
8.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201334The MSE values indicate that the degree of sensitivity of all methods, except the TLS in some situations,to the existence of outliers in Y–direction was small compared with the degree of sensitivity to the existence ofoutliers in the X–direction and XY–direction.LAD and M-estimators are very sensitive to the presence of outliers in X- direction and XY- direction. Inaddition, the negative values of RMSE of LAD and M-estimators in some results indicate that these methods aremore affected by outliers comparing with OLS. Also, LAD estimators are more sensitive to outliers comparingwith M-estimators especially for estimating intercept and model. So, LAD and M-estimators are not robustestimators against those directions, but they are robust estimators against outliers in Y-direction.Nonparametric estimators introduced better performance in the presence of outliers in X- direction andXY- direction comparing with OLS, LAD and M-estimators especially for estimating slope and model.Although the performance of nonparametric estimators are better than OLS in presence of outliers in X-direction and XY- direction, it seems less better in estimating intercept when we have no outliers, thus thoseestimators is not robust for estimation intercept according to criterion of a robust methods that is any robustmethod must be reasonably efficient when compared to the least squares estimators; if the underlyingdistribution of errors are independent normal, and substantially more efficient than least squares estimators,when there are outlying observations.The use of GAS estimator instead of median in Thiel method improves the performance of this methodwhen outliers appear in Y-direction. Also this estimator improves the performance of this method in some caseswhen outliers appear in X-direction and XY-direction and the most improvements get when it is used in anincomplete method especially for estimating intercept and model with 10% percentage of contamination and forestimating slop and model with 30% percentage of contamination.In general, the MSE values decrease when the sample sizes increase while the MSE values increase asthe proportion of contaminations "outliers" increases.Now, after pointing to the conclusions that were obtained in the present work, the followingRecommendations for future work are relevant:The poor performance of OLS estimators with the presence of outliers confirms our need for alternativemethods. Therefore, before analyzing the data, we should first check the presence of outliers and then constructthe necessary tests whether to see the underlying assumptions are satisfied. After that, we should conduct theappropriate estimation techniques.Choosing a nonparametric method, especially to estimate slope and model, or choosing a trimmedmethod when the outliers appear in X- direction or XY-direction.Choosing M-estimation and LAD method, or choosing a trimmed method when the outliers areappearing in Y-direction.When the outliers appear in X-direction or XY-direction, choose RMSE or mean absolute error (MAE)as criteria for comparing between methods to avoid dealing with the large values of MSE.References[1] Alma, Ö. G. (2011). "Comparison of Robust Regression Methods in Linear Regression".Int. J. Contemp.Math. Sciences, Vol. 6, No. 9, pp. 409- 421.[2] Amphanthong, P. (2009). "Estimation of Regression Coefficients with Outliers". A DissertationSubmitted to the School of Applied Statistics, National Institute of Development Administration in PartialFulfillment of the Requirements for the Degree of Doctor of Philosophy in Statistics.[3] Bai , X. (2012). "Robust Linear Regression". A report submitted in partial fulfillment of therequirements for the degree Master of Science Department of Statistics College of Arts and Sciences KansasState University, Manhattan, Kansas.[4] Birkes, D. and Dodge, Y. (1993). "Alternative Methods of Regression". John Wiley & Sons Publication,New York.[5] Conover, W.L. (1980) "Practical nonparametric statistics". Second edition, John Wiley & SonsPublication, New York.[6] Dietz, E.J. (1986). "On estimating a slope and intercept in a nonparametric statistics course". Institute ofStatistics Mimeograph Series No. 1689R, North Carolina State University.[7] Dietz, E. J. (1987) "A Comparison of Robust Estimators in Simple Linear Regression" Communicationsin Statistics - Simulation and Computation, Vol. 16, Issue 4, pp. 1209-1227.[8] Dodge, Y. (2008). "The Concise Encyclopedia of Statistics". Springer Science & Business Media.[9] Gastwirth, J. L. (1966). "On Robust Procedures". J. Amer. Statist. Assn., Vol. 61, pp. 929-948.
9.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201335[10] Gulasirima, R. (2005). "Robustifying Regression Models". A Dissertation Presented to the School ofApplied Statistics National Institute of Development Administration in Partial Fulfillment of the Requirementsfor the Degree of Doctor of Philosophy in Statistics.[11] Helsel D.R., and Hirsch, R.M. (2002). "Statistical methods in water resources - Hydrologic analysisand interpretation: Techniques of Water-Resources Investigations of the U.S. Geological Survey", chap. A3,book 4.[12] Huber, P. J. (1973). "Robust regression: Asymptotics, conjectures and Monte Carlo". Ann. Stat., Vol.1, pp. 799-821.[13] Huber, P. J. (1981)."Robust Statistics". John Wiley & Sons Publication, New York.[14] Hussain, S. S., and Sprent, P. (1983). Nonparametric Regression. Journal of the Royal StatisticalSociety, Series A, Vol. 146, pp. 182-191.[15] Jajo, N. K. (1989). "Robust estimators in linear regression model". A Thesis Submitted to the SecondEducation College Ibn Al-Haitham Baghdad University in partial fulfillment of the requirements for the degreeof Master of Science in Mathematics.[16] Marques de Sá, J. P. (2007). "Applied Statistics Using SPSS, STATISTICA, MATLAB and R".Second Edition, Springer-Verlag Berlin Heidelberg, New York.[17] Meenai, Y. A. and Yasmeen, F. (2008). "Nonparametric Regression Analysis". Proceedings of 8thIslamic Countries Conference of Statistical Sciences, Vol. 12, PP. 415-424, Lahore-Pakistan.[18] Midi, H., Uraibi , H. S. and Talib , B. A. (2009)." Dynamic Robust Bootstrap Method Based on LTSEstimators". European Journal of Scientific Research, Vol.32, No.3, pp. 277-287[19] Mood, A. M. (1950). "Introduction to the theory of statistics". McGraw-Hill, New York.[20] Mutan, O. C. (2004). "Comparison of Regression Techniques Via Monte Carlo Simulation". A ThesisSubmitted to the Graduate School of Natural and Applied Sciences of Middle East Technical University inpartial fulfillment of the requirements for the degree of Master of Science in Statistics.[21] Nevitt, J. and Tam, H.P. (1998). "A comparison of robust and nonparametric estimators under thesimple linear regression model: Multiple linear regression viewpoints, Vol. 25, pp. 54–69.[22] Ronchetti, E.M. (1987) " Statistical Data Analysis Based on the L1-Norm and Related Methods".North-Holland, Amsterdam.[23] Rousseeuw, P. J. and Leroy, A. M. (1987) "Robust Regression and Outlier Detection". John Wiley &Sons Publication, New York.[24] Rousseeuw, P.J. and Yohai, V. (1984). "Robust regression by means of S-estimators, Lecture Notes inStatistics No.26, pp. 256-272, Springer Verilog, New York.[25] Seber, G. A. and Lee, A. J. (2003). "Linear regression analysis". Second Edition, John Wiley & SonsPublication, New York.[26] Stigler, S. M. (1986). "The History of Statistics: The Measurement of Uncertainty before 1900".Harvard University Press, Cambridge.[27] Theil, H. (1950). "A rank - invariant method of linear and polynomial regression analysis".Indagationes Mathematicae, Vol. 12, pp. 85-91.
10.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201336Table (2) MSE and RMSE results for estimating intercept, slope and model in Normal case "No outliers"MethodsMSERMSEn=10 n=30 n=50 n=10 n=30 n=50£¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ ModelOLS0.109220.001421.010420.034150.000380.998860.020040.000221.005180.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD0.163020.002401.144290.054440.000571.039260.030850.000331.02849-0.49258-0.69014-0.13249-0.59414-0.50000-0.04045-0.53942-0.50000-0.02319CTH0.489790.001861.524830.441600.000421.439210.418830.000251.42273-3.48444-0.30986-0.50911-11.93119-0.10526-0.44085-19.89970-0.13636-0.41540CGAS0.263690.001931.237620.200780.000421.183330.199190.000251.18660-1.41430-0.35915-0.22486-4.87936-0.10526-0.18468-8.93962-0.13636-0.18049ITH0.626060.011482.533170.450460.002031.602310.422300.001171.51406-4.73210-7.08451-1.50705-12.19063-4.34211-0.60414-20.07285-4.31818-0.50626IGAS0.524060.045275.850650.206510.001991.342220.201730.001231.28272-3.79821-30.88028-4.79031-5.04714-4.23684-0.34375-9.06637-4.59091-0.27611H-M0.117820.001481.024970.036720.000391.002940.021300.000231.00743-0.07874-0.04225-0.01440-0.07526-0.02632-0.00408-0.06287-0.04545-0.00224W-M0.133280.001691.060970.040180.000411.008500.022720.000251.01018-0.22029-0.19014-0.05003-0.17657-0.07895-0.00965-0.13373-0.13636-0.00497SU-M0.114260.001461.020400.035780.000381.001190.020720.000221.00644-0.04615-0.02817-0.00988-0.047730.00000-0.00233-0.033930.00000-0.00125TLS10%0.109220.001421.010420.043920.000491.019090.026620.000281.017500.000000.000000.00000-0.28609-0.28947-0.02025-0.32834-0.27273-0.01226TLS20%0.153890.002071.114160.049790.000491.026780.027880.000291.01987-0.40899-0.45775-0.10267-0.45798-0.28947-0.02795-0.39122-0.31818-0.01461TLS30%0.154710.002091.115570.048440.000491.026560.028180.000301.02010-0.41650-0.47183-0.10407-0.41845-0.28947-0.02773-0.40619-0.36364-0.01484TLS40%0.158800.002151.127720.048550.000461.023240.026650.000281.01851-0.45395-0.51408-0.11609-0.42167-0.21053-0.02441-0.32984-0.27273-0.01326Table (3) MSE and RMSE results for estimating intercept, slope and model in X-outliersMethodsCont.MSERMSEn=10 n=30 n=50 n=10 n=30 n=50£¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦Model £¤¥ £¤¦Model £¤¥ £¤¦ModelOLS10%92.148527.77139881.1367033.859447.44083773.4853922.720307.36628761.897260.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD142.954728.61501988.3205847.034177.50494793.1788230.555257.15163751.07685-0.55135-0.10855-0.12164-0.38910-0.00862-0.02546-0.344840.029140.01420CTH41.713660.0036253.4432728.802700.0007632.0131024.518160.0002926.761180.547320.999530.939350.149350.999900.95861-0.079130.999960.96488CGAS34.396360.0067944.4651225.003870.0023728.0875223.385220.0016825.708600.626730.999130.949540.261540.999680.96369-0.029270.999770.96626ITH41.521510.0522956.8448428.752520.0059132.3477224.472100.0032826.930740.549410.993270.935490.150830.999210.95818-0.077100.999550.96465IGAS34.621790.1103153.4023124.946830.0090328.5606923.202930.0082226.021950.624280.985810.939390.263220.998790.96308-0.021240.998880.96585H-M100.847017.97210914.3056634.607817.40867770.8115122.472527.30531755.26305-0.09440-0.02583-0.03764-0.022100.004320.003460.010910.008280.00871W-M116.683108.18073955.1499035.442557.27480756.9050322.346067.14704736.82034-0.26625-0.05267-0.08400-0.046760.022310.021440.016470.029760.03291SU-M98.414617.96376908.7632334.126127.42166771.6956422.474337.32628757.40965-0.06800-0.02475-0.03135-0.007880.002580.002310.010830.005430.00589TLS10%92.148527.77139881.1367024.741885.64394575.4752312.663814.28680434.990550.000000.000000.000000.269280.241490.256000.442620.418050.42907TLS20%0.125230.001681.050610.047270.000521.026520.028890.000301.021350.998640.999780.998810.998600.999930.998670.998730.999960.99866TLS30%0.164570.002271.143910.051060.000521.032250.029420.000311.023390.998210.999710.998700.998490.999930.998670.998710.999960.99866TLS40%0.165770.002341.143900.049020.000511.029560.028420.000301.021490.998200.999700.998700.998550.999930.998670.998750.999960.99866OLS20%112.984258.25725962.9903042.345538.06222846.8812628.681018.03100837.857660.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD167.039477.961221007.7940362.166467.91205851.9457938.214838.06243849.51701-0.478430.03585-0.04653-0.468080.01863-0.00598-0.33241-0.00391-0.01392CTH140.530790.02580178.82824111.578930.01076121.59829105.583430.00999112.10313-0.243810.996880.81430-1.634960.998670.85642-2.681300.998760.86620CGAS82.027560.67273169.9204371.302000.54560130.8994873.777750.49494127.321470.273990.918530.82355-0.683810.932330.84543-1.572360.938370.84804ITH134.834170.47755205.57820102.058950.18581128.4347898.416540.08961112.28821-0.193390.942170.78652-1.410150.976950.84834-2.431420.988840.86598IGAS76.744071.05676199.4561763.022520.82290149.6499962.806470.80117147.041040.320750.872020.79288-0.488290.897930.82329-1.189830.900240.82450H-M 122.28 8.252 973.720 43.490 8.063 849.12 28.703 8.028 838.02 - 0.000 - - - - - 0.000 -
11.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201337656 56 75 99 59 523 91 02 271 0.0823357 0.011140.027050.000170.002650.0008037 0.00020W-M137.069798.20760989.3926945.274538.06265851.4489929.302858.02707838.74293-0.213180.00601-0.02742-0.06917-0.00005-0.00539-0.021680.00049-0.00106SU-M119.470438.24900970.3747242.902598.06210848.0798728.586468.02900837.81485-0.057410.00100-0.00767-0.013160.00001-0.001420.003300.000250.00005TLS10%112.984258.25725962.9903039.517437.79656815.5327125.607467.66649796.691470.000000.000000.000000.066790.032950.037020.107160.045390.04913TLS20%106.162587.93173915.0466328.845895.89851604.9407114.584874.60907469.985060.060380.039420.049790.318800.268380.285680.491480.426090.43906TLS30%0.145200.002091.110820.051040.000561.034010.032470.000321.026120.998710.999750.998850.998790.999930.998780.998870.999960.99878TLS40%0.178730.002481.180950.054620.000541.036430.031710.000321.026020.998420.999700.998770.998710.999930.998780.998890.999960.99878OLS30%127.688148.391481000.7112651.273808.27527879.6099936.798868.23236867.015510.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD195.771088.896891132.2160268.584648.35473907.4560349.591328.20029878.72903-0.53320-0.06023-0.13141-0.33762-0.00960-0.03166-0.347630.00390-0.01351CTH283.835800.76709420.08288287.386230.07950316.27291270.312730.06125288.92922-1.222880.908590.58022-4.604930.990390.64044-6.345680.992560.66675CGAS199.415621.29984375.48329202.537330.88383305.74674228.147370.85378325.27757-0.561740.845100.62478-2.950110.893200.65241-5.199850.896290.62483ITH105.861627.36218874.2776240.509647.11848753.4888824.895347.06094736.682980.170940.122660.126340.209930.139790.143380.323480.142290.15032IGAS95.352164.27891554.1597757.400004.52308510.7188358.608654.12733478.341220.253240.490090.44623-0.119480.453420.41938-0.592680.498650.44829H-M137.219618.365061009.5039653.616688.27498882.1746336.947058.23448867.75962-0.074650.00315-0.00879-0.045690.00004-0.00292-0.00403-0.00026-0.00086W-M156.629458.376591036.1806856.392958.27780885.5599937.803078.23498868.88008-0.226660.00177-0.03544-0.09984-0.00031-0.00676-0.02729-0.00032-0.00215SU-M134.858868.365031006.2790052.622608.27310880.9802136.748888.23220867.16309-0.056160.00315-0.00556-0.026310.00026-0.001560.001360.00002-0.00017TLS10%127.688148.391481000.7112648.957598.18363867.1367333.829418.10130850.131290.000000.000000.000000.045170.011070.014180.080690.015920.01947TLS20%123.568798.31673986.5632644.711637.87599830.4676729.763397.74000807.985110.032260.008910.014140.127980.048250.055870.191190.059810.06808TLS30%117.263988.01869938.4018633.481916.17508638.2221617.902324.93872506.775770.081640.044420.062270.347000.253790.274430.513510.400080.41549TLS40%0.166970.002381.171310.055240.000611.044230.035790.000351.032760.998690.999720.998830.998920.999930.998810.999030.999960.99881Table (4) MSE and RMSE results for estimating intercept, slope and model in Y-outliersMethodsCont.MSERMSEn=10 n=30 n=50 n=10 n=30 n=50£¤¥ £¤¦ Model £¤¥ £¤¦Model £¤¥ £¤¦Model £¤¥ £¤¦Model £¤¥ £¤¦Model £¤¥ £¤¦ModelOLS10%0.719390.008491.991940.189030.001401.245600.109630.000531.155270.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD0.196610.002331.192590.063650.000101.049780.036620.000411.033750.726700.725560.401290.663280.928570.157210.665970.226420.10519CTH2.100390.002233.454151.714850.000612.750491.203740.000352.22141-1.919680.73734-0.73406-8.071840.56429-1.20816-9.980020.33962-0.92285CGAS1.250160.002862.393090.798060.000691.781670.520110.000391.50252-0.737810.66313-0.20139-3.221870.50714-0.43037-3.744230.26415-0.30058ITH2.315570.020855.250971.727710.002533.017691.210250.000932.35407-2.21880-1.45583-1.63611-8.13987-0.80714-1.42268-10.03941-0.75472-1.03768IGAS1.964170.020924.565260.809730.002772.063920.523120.001131.65883-1.73033-1.46408-1.29187-3.28361-0.97857-0.65697-3.77169-1.13208-0.43588H-M0.169040.001841.098280.050090.000051.013430.028450.000321.012600.765020.783270.448640.735020.964290.186390.740490.396230.12349W-M0.158440.001661.135400.045700.000041.019470.026530.000291.016420.779760.804480.430000.758240.971430.181540.758000.452830.12019SU-M0.164010.001871.113550.049660.000051.014890.027850.000311.013210.772020.779740.440970.737290.964290.185220.745960.415090.12297TLS10%0.719390.008491.991940.051750.000551.027700.026230.000291.016720.000000.000000.000000.726230.607140.174940.760740.452830.11993TLS20%0.132410.001761.074080.048690.000521.028190.029210.000301.021520.815940.792700.460790.742420.628570.174540.733560.433960.11577TLS30%0.164290.002261.139730.050490.000531.031440.029780.000311.023100.771630.733800.427830.732900.621430.171930.728360.415090.11441TLS40%0.164690.002321.144460.048110.000501.028510.028260.000301.021100.771070.726740.425450.745490.642860.174290.742220.433960.11614OLS20%1.245610.016312.964210.346950.003211.489160.209340.001561.313110.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD0.277460.004021.340700.078980.000181.065090.044030.000021.042560.777250.753530.547700.772360.943930.284770.789670.987180.20604CTH4.384980.004046.186362.696780.000893.785311.945570.000522.97954-2.520350.75230-1.08702-6.772820.72274-1.54191-8.293830.66667-1.26907CGA2.419170.006453.878491.266580.001202.285430.837650.000711.82992-0.942160.60454-0.30844-2.650610.62617-0.53471-3.001390.54487-0.39358
12.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201338SITH4.775560.0424010.169902.745840.005154.320121.957330.002093.21397-2.83391-1.59963-2.43090-6.91422-0.60436-1.90104-8.35000-0.33974-1.44760IGAS2.270290.063057.333231.288660.006152.860910.845950.003252.15007-0.82263-2.86573-1.47392-2.71425-0.91589-0.92116-3.04103-1.08333-0.63739H-M0.310720.004361.335670.071000.000191.035750.040200.000031.025040.750550.732680.549400.795360.940810.304470.807970.980770.21938W-M0.214180.003071.278010.054100.000051.032670.032800.000011.025420.828050.811770.568850.844070.984420.306540.843320.993590.21909SU-M0.313390.004671.377900.071640.000221.042230.040620.000021.028570.748400.713670.535150.793510.931460.300120.805960.987180.21669TLS10%1.245610.016312.964210.131420.001391.156000.062980.000701.072900.000000.000000.000000.621210.566980.223720.699150.551280.18293TLS20%0.379740.005911.562660.050130.000571.032530.031160.000331.024860.695140.637650.472820.855510.822430.306640.851150.788460.21952TLS30%0.156050.002191.138570.053160.000551.035550.031720.000311.025470.874720.865730.615890.846780.828660.304610.848480.801280.21905TLS40%0.175100.002461.169080.052080.000561.036820.031330.000321.025630.859430.849170.605600.849890.825550.303760.850340.794870.21893OLS30%1.813330.022723.900770.523270.005371.779480.308080.002831.468870.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD0.441830.007841.708240.102590.000431.095780.056850.000081.056520.756340.654930.562080.803940.919930.384210.815470.971730.28073CTH6.515250.008398.891143.827120.001485.014382.476870.000793.53595-2.592980.63072-1.27933-6.313850.72439-1.81789-7.039700.72085-1.40726CGAS3.598570.012025.548861.783100.002352.888521.121330.001292.14221-0.984510.47095-0.42250-2.407610.56238-0.62324-2.639740.54417-0.45841ITH7.171270.0918718.021633.925550.011536.082802.483960.004903.99974-2.95475-3.04357-3.62002-6.50196-1.14711-2.41830-7.06271-0.73145-1.72300IGAS5.687090.1082116.065161.840110.013094.014771.138240.007732.82842-2.13627-3.76276-3.11846-2.51656-1.43762-1.25615-2.69462-1.73145-0.92558H-M0.577790.010201.877880.115200.000731.099620.060540.000161.052290.681370.551060.518590.779850.864060.382060.803490.943460.28361W-M0.378190.006171.609430.076390.000281.070390.044810.000051.043720.791440.728430.587410.854010.947860.398480.854550.982330.28944SU-M0.624720.010831.971870.123630.000881.122250.064450.000231.063830.655480.523330.494490.763740.836130.369340.790800.918730.27575TLS10%1.813330.022723.900770.253220.002681.349400.126630.001361.177320.000000.000000.000000.516080.500930.241690.588970.519430.19849TLS20%0.769720.010152.213490.092920.000991.102610.052100.000551.057000.575520.553260.432550.822420.815640.380380.830890.805650.28040TLS30%0.306470.004901.454680.058980.000631.045100.034980.000371.033410.830990.784330.627080.887290.882680.412690.886460.869260.29646TLS40%0.175690.002411.191410.055420.000601.044890.034430.000341.030920.903110.893930.694570.894090.888270.412810.888240.879860.29815Table (5) MSE and RMSE results for estimating intercept, slope and model in XY-outliersMethodsCont.MSERMSEn=10 n=30 n=50 n=10 n=30 n=50£¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦ Model £¤¥ £¤¦Model £¤¥ £¤¦Model £¤¥ £¤¦ModelOLS10%92.040497.77059880.9043333.938307.43896773.1632022.715357.36523761.848210.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD142.820028.61155987.6478146.869747.46566788.7539930.525187.11496747.23289-0.55171-0.10822-0.12117-0.38103-0.00359-0.02016-0.343810.033980.01918CTH41.465910.0036253.0309028.846110.0019532.0009324.419590.0013226.634420.549480.999530.939800.150040.999740.95861-0.075030.999820.96504CGAS34.567290.0067944.5394825.102130.0035528.1497223.181440.0031925.478310.624430.999130.949440.260360.999520.96359-0.020520.999570.96656ITH41.496380.0512456.5477928.793710.0059132.3396824.373050.0032826.799810.549150.993410.935810.151590.999210.95817-0.072980.999550.96482IGAS34.875390.1102553.5129525.055520.0090328.6242223.013550.0082225.794430.621090.985810.939250.261730.998790.96298-0.013130.998880.96614H-M100.636347.97149914.1509234.583137.38816768.3900622.392727.28738753.34853-0.09339-0.02585-0.03774-0.019000.006830.006170.014200.010570.01116W-M116.289968.17873954.5684635.278357.22388750.6374921.940217.03813724.91953-0.26347-0.05252-0.08362-0.039480.028910.029130.034120.044410.04847SU-M98.235767.96333908.6962334.136987.40587769.7987522.415537.31204755.90321-0.06731-0.02480-0.03155-0.005850.004450.004350.013200.007220.00780TLS10%92.040497.77059880.9043324.171955.45501556.6706212.064064.08871414.976930.000000.000000.000000.287770.266700.280010.468900.444860.45530TLS20%0.125230.001681.050610.047270.000521.026520.028890.000301.021350.998640.999780.998810.998610.999930.998670.998730.999960.99866
13.
Mathematical Theory and Modeling www.iiste.orgISSN 2224-5804 (Paper) ISSN 2225-0522 (Online)Vol.3, No.5, 201339TLS30%0.164570.002271.143910.051060.000521.029250.029420.000311.023390.998210.999710.998700.998500.999930.998670.998700.999960.99866TLS40%0.165770.002341.143900.049020.000511.029560.028420.000301.021490.998200.999700.998700.998560.999930.998670.998750.999960.99866OLS20%112.775038.25371963.0645442.377618.05852846.4398428.659758.02745837.545850.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD166.737517.946491006.7520461.781857.89547849.8340037.972398.04644847.72111-0.478500.03722-0.04536-0.457890.02023-0.00401-0.32494-0.00237-0.01215CTH139.995460.03106178.17154110.363480.01255120.36321104.178480.01123110.73622-0.241370.996240.81500-1.604290.998440.85780-2.635010.998600.86778CGAS82.700220.68038171.0490470.243120.56919132.2226971.056730.53716128.764330.266680.917570.82239-0.657550.929370.84379-1.479320.933080.84626ITH135.049570.47345205.95563101.353610.18561127.5951997.538200.08959111.32251-0.197510.942640.78615-1.391680.976970.84926-2.403320.988840.86708IGAS77.718101.05393200.1572363.156450.81908149.4247162.277480.79795146.099230.310860.872310.79217-0.490330.898360.82347-1.172990.900600.82556H-M122.013748.23766972.6340343.526788.04997847.6018028.575098.01483836.60179-0.081920.00194-0.00994-0.027120.00106-0.001370.002950.001570.00113W-M137.516938.18704988.0959045.384178.04112849.1321229.080518.00712836.54617-0.219390.00808-0.02599-0.070950.00216-0.00318-0.014680.002530.00119SU-M119.391888.23854969.9936342.884828.05106846.8489028.459658.01834836.67153-0.058670.00184-0.00719-0.011970.00093-0.000480.006980.001130.00104TLS10%112.775038.25371963.0645438.849237.64031799.0863125.093407.48600777.904940.000000.000000.000000.083260.051900.055940.124440.067450.07121TLS20%103.873997.73738892.6221227.173525.59538573.6177613.709424.33090441.479180.078930.062560.073140.358780.305660.322320.521650.460490.47289TLS30%0.145200.002091.110820.051040.000561.044010.031470.000321.026120.998710.999750.998850.998800.999930.998770.998900.999960.99877TLS40%0.178730.002481.180950.054620.000541.038640.031710.000321.026700.998420.999700.998770.998710.999930.998770.998890.999960.99877OLS30%127.581258.388631000.6853851.302348.27313879.2590636.796618.23023866.874730.000000.000000.000000.000000.000000.000000.000000.000000.00000LAD195.620928.889761131.6262968.102938.33614905.3327649.376288.18894877.42982-0.53330-0.05974-0.13085-0.32748-0.00762-0.02965-0.341870.00502-0.01218CTH242.433111.47948431.08093258.312110.19724295.95882245.552240.14272270.96148-0.900230.823630.56921-4.035090.976160.66340-5.673230.982660.68743CGAS180.338501.64453382.67620190.473541.01165305.01228215.185780.95815322.04720-0.413520.803960.61759-2.712770.877720.65310-4.847980.883580.62850ITH107.599327.33471873.7129840.773487.07379749.3883825.161287.01429732.014840.156620.125640.126890.205230.144970.147700.316210.147740.15557IGAS96.154354.26860553.1416057.401884.50734509.2473458.116054.10988476.239100.246330.491140.44724-0.118890.455180.42082-0.579390.500640.45063H-M136.880458.352751007.8241053.631458.26437880.7817436.842588.22601866.87072-0.072890.00428-0.00713-0.045400.00106-0.00173-0.001250.000510.00000W-M153.568528.359181029.6848656.394458.26344883.7642037.575748.22398867.57333-0.203690.00351-0.02898-0.099260.00117-0.00512-0.021170.00076-0.00081SU-M134.622258.356991004.9509552.682578.26538879.9593636.625268.22580866.46767-0.055190.00377-0.00426-0.026900.00094-0.000800.004660.000540.00047TLS10%127.581258.388631000.6853848.210968.06464854.2960333.242247.96301835.668200.000000.000000.000000.060260.025200.028390.096600.032470.03600TLS20%121.409888.18474972.6560943.286377.61905803.2891028.628557.44984777.708390.048370.024310.028010.156250.079060.086400.221980.094820.10286TLS30%114.207177.74069908.7195331.631375.84757604.4101416.734574.60937473.250860.104830.077240.091900.383430.293190.312590.545210.439950.45407TLS40%0.166970.002381.171310.055240.000611.044230.035790.000351.032760.998690.999720.998830.998920.999930.998810.999030.999960.99881
Be the first to comment