This study introduces and compares different methods for estimating the two parameters of
the generalized logarithmic series distribution. These methods are the cuckoo search
optimization, maximum likelihood estimation, and method of moments algorithms. All the
required derivations and basic steps of each algorithm are explained. The algorithms are
applied in simulations with different sample sizes (n = 15, 25, 50, 100), and the results are
compared using the mean square error (MSE).
76 Computer Science & Information Technology (CS & IT)
1. INTRODUCTION

The discrete random variable (x) follows the generalized logarithmic series distribution (GLSD)
with two parameters (α and β), where α is a scale parameter and β is a shape parameter. Let θ
be a function of α. The probability mass function (p.m.f.) of the GLSD is defined by Eq. (1) as
follows:
The distribution in Eq. (1) is obtained from the zero-truncated generalized negative binomial
distribution defined by Eq. (2):
Taking the limit (k → 0) in Eq. (2) yields the distribution studied in Eq. (1).
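The p.m.f. itself is not reproduced in the extracted text. As a hedged illustration only, the sketch below uses the Jain–Gupta parameterization of the GLSD commonly found in the literature, P(X = x) = α·Γ(βx)·θˣ·(1 − θ)^(βx−x) / (x!·Γ(βx − x + 1)) with α = −1/log(1 − θ), which is consistent with the statement that θ is a function of α; the paper's exact Eq. (1) may differ.

```python
from math import exp, lgamma, log

def glsd_pmf(x, theta, beta):
    """GLSD p.m.f. under the assumed Jain-Gupta parameterization
    (0 < theta < 1, 1 <= beta <= 1/theta); alpha = -1/log(1 - theta)
    is the scale parameter. This form is an assumption, not the
    paper's (unreproduced) Eq. (1)."""
    alpha = -1.0 / log(1.0 - theta)
    # log of Gamma(beta*x) / (x! * Gamma(beta*x - x + 1))
    log_term = lgamma(beta * x) - lgamma(x + 1) - lgamma(beta * x - x + 1)
    return alpha * exp(log_term + x * log(theta) + (beta - 1) * x * log(1.0 - theta))

# For beta = 1 the GLSD reduces to the ordinary logarithmic series
# distribution, so the probabilities must sum to 1.
total_ls = sum(glsd_pmf(x, 0.5, 1.0) for x in range(1, 200))
# For 1 < beta <= 1/theta the distribution remains proper.
total_glsd = sum(glsd_pmf(x, 0.3, 1.5) for x in range(1, 500))
```

For β = 1 the combinatorial term collapses to 1/x, recovering the classical logarithmic series p.m.f. αθˣ/x, which is why the first sum above serves as a sanity check.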
The mean of the GLSD is derived from Eq. (1), and the variance is obtained from the general
formula of the kth moment about the mean, as follows:
2. ESTIMATING PARAMETERS
We apply different methods to estimate the parameters of the p.m.f. in Eq. (1).
2.1 Maximum Likelihood Estimation (MLE)
The likelihood function corresponding to Eq. (1) [8], [9] is given by:
Differentiating this equation yields:
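The likelihood equations themselves are not reproduced in the extracted text. As a hedged sketch (not the paper's derivation), the log-likelihood under the assumed Jain–Gupta form of the p.m.f. can be maximized numerically; the grid-refinement search below, its bounds, and the toy sample are illustrative choices only.

```python
from math import lgamma, log

def glsd_logpmf(x, theta, beta):
    # assumed Jain-Gupta form of the GLSD p.m.f.; the paper's exact
    # Eq. (1) is not reproduced in the extracted text
    alpha = -1.0 / log(1.0 - theta)
    return (log(alpha) + lgamma(beta * x) - lgamma(x + 1)
            - lgamma(beta * x - x + 1) + x * log(theta)
            + (beta - 1) * x * log(1.0 - theta))

def loglik(sample, theta, beta):
    return sum(glsd_logpmf(x, theta, beta) for x in sample)

def mle_grid(sample, steps=40, rounds=4):
    """Crude MLE by iterative grid refinement over (theta, beta); the
    paper instead solves the score equations, so this is illustration."""
    lo_t, hi_t, lo_b, hi_b = 0.01, 0.99, 1.0, 3.0
    best = None
    for _ in range(rounds):
        grid = [(lo_t + (hi_t - lo_t) * i / steps,
                 lo_b + (hi_b - lo_b) * j / steps)
                for i in range(1, steps) for j in range(steps + 1)]
        # keep beta <= 1/theta so the support condition holds
        grid = [(t, b) for (t, b) in grid if 1.0 <= b <= 1.0 / t]
        best = max(grid, key=lambda p: loglik(sample, p[0], p[1]))
        t, b = best
        lo_t, hi_t = max(0.01, t - (hi_t - lo_t) / steps), min(0.99, t + (hi_t - lo_t) / steps)
        lo_b, hi_b = max(1.0, b - (hi_b - lo_b) / steps), min(3.0, b + (hi_b - lo_b) / steps)
    return best

sample = [1, 1, 2, 1, 3, 1, 1, 2, 4, 1, 2, 1]   # hypothetical data
theta_hat, beta_hat = mle_grid(sample)
```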
2.2 Method of Moments (MOM) Estimator for GLSD Parameters
Following [10], the MOM estimators are obtained by solving the following:
When r = 1,
When r = 2,
Given that:
and
then,
We have
and
which is simplified as follows:
We also obtain the following:
Then,
We derive the first three non-central moments from
Then,
Given that:
We obtain the following based on the preceding relation:
Eq. (14) can be simplified into
Given that
Eq. (15) can be written as follows:
which is an implicit function that can be solved numerically from the observations. The
remaining estimate is then obtained by substitution into Eq. (11) and solving.
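Since Eq. (15) is not reproduced in the extracted text, the sketch below illustrates the same idea on the β = 1 special case, where the GLSD reduces to the logarithmic series distribution with known mean −θ/((1 − θ)·log(1 − θ)); the implicit moment equation is then solved for θ by bisection. The target sample mean of 2 is an illustrative assumption, not the paper's data.

```python
from math import log

def logseries_mean(theta):
    # mean of the logarithmic series distribution (the beta = 1 special
    # case of the GLSD): alpha * theta / (1 - theta), alpha = -1/log(1-theta)
    return -theta / ((1.0 - theta) * log(1.0 - theta))

def solve_theta(sample_mean, lo=1e-9, hi=1.0 - 1e-9, tol=1e-12):
    """Bisection for the implicit moment equation mean(theta) = sample_mean;
    logseries_mean increases in theta from 1 (theta -> 0) upward."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if logseries_mean(mid) < sample_mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

theta_hat = solve_theta(2.0)   # hypothetical sample mean of 2
```

The same bisection (or any root finder) applies to the paper's Eq. (15) once its left-hand side is coded.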
2.3 Cuckoo Search Optimization (CSO) Algorithm
This algorithm is based on the breeding behaviour of the cuckoo bird. It follows three basic
optimization rules [11], [12].
1. Each bird lays one egg at a time and places it in a randomly selected nest.
2. The best nest, i.e., the one with the highest fitness value, is carried over to the next
generation.
3. The number of available host nests is fixed. A discovery probability represents the chance of
the host bird finding the foreign egg(s) in its nest. The host bird can then either throw away the
egg(s) or abandon the nest to build a new one [13], [14].
The latter scenario can be regarded as producing a new candidate solution.
Let one nest be the nest where the cuckoo bird initially lives, and let another be the new nest
with the highest fitness value.
When a random walk is used, the new solution generated for cuckoo (i) by a Lévy flight is
expressed as [5], [7]:

The Lévy flight is named after the French mathematician Paul Pierre Lévy, who first described it.
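The paper's Lévy-flight update equation is not reproduced in the extracted text. A common way to draw Lévy-distributed step lengths in cuckoo search implementations is Mantegna's algorithm, sketched below as an assumption about the intended update, with the usual exponent λ = 1.5.

```python
import math
import random

def levy_step(lam=1.5, rng=random):
    """One Levy-flight step length via Mantegna's algorithm (a standard
    choice in cuckoo search code; the paper's exact update rule is not
    reproduced in the text)."""
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.gauss(0.0, sigma)   # numerator ~ N(0, sigma^2)
    v = rng.gauss(0.0, 1.0)     # denominator ~ N(0, 1)
    return u / abs(v) ** (1 / lam)

rng = random.Random(1)
steps = [levy_step(rng=rng) for _ in range(1000)]
```

The heavy-tailed step distribution is what lets the search mix many small local moves with occasional long jumps.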
The discovery probability indicates that the egg(s) in a nest may be recognized as belonging to
another bird; in this case, the cuckoo bird may abandon this host nest and build another one.
A fraction of the n host nests may thus be replaced by new nests at random positions (the
probability of change). The objective function here is of the maximization type, so the
estimation problem must be cast in this form. The algorithm is often applied to nonlinear
equation problems or to neural networks, because such objectives allow the search to move
from state to state toward the optimal solution. Given that the GLSD has two parameters (θ and
β), the algorithm implements the following steps.
Each bird lays one egg at a time in a randomly selected nest. The number of selected nests is
equal to the number of parameters to be estimated, and the initial nests are generated from the
following equation:
Nest position = LB + (UB − LB) × random number in (0, 1).
Let one nest be the nest where the cuckoo bird initially lives; the new nest is the one with the
highest fitness value.
Each nest contains the parameters to be estimated.

Step (1): Generate the initial nests:
Nest position = LB + (UB − LB) × random number in (0, 1).

Step (2): Calculate the objective function for each nest as follows:

Step (3): The best parameter values determine the best nest with respect to its eggs.

Step (4): The iterations begin. Let one nest be the nest in which the cuckoo bird initially lives,
and let another be the new nest with the highest fitness value.

Step (5): Generate a new nest for the cuckoo, as follows:

Step (6): Compute the objective function for each new nest.

Step (7): Repeat Steps (4)–(6) until the stopping rule (the total number of iterations) is
reached. The best solution found is then printed.
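The steps above can be sketched as a minimal cuckoo search in the style of Yang and Deb [7]. This is a simplified stand-in, not the paper's code: the toy quadratic objective stands in for the (unreproduced) GLSD criterion, and the step-size factor 0.01, λ = 1.5, and the abandonment scheme are common defaults assumed here.

```python
import math
import random

def cuckoo_search(objective, lb, ub, n_nests=15, pa=0.25, iters=300, seed=0):
    """Minimal cuckoo search (after Yang & Deb [7]); minimizes `objective`
    over the box [lb, ub]. A sketch under the assumptions stated above."""
    rng = random.Random(seed)
    d = len(lb)
    clip = lambda x: [min(ub[k], max(lb[k], x[k])) for k in range(d)]
    # Step 1: initial nests, position = LB + (UB - LB) * rand(0, 1)
    nests = [[lb[k] + (ub[k] - lb[k]) * rng.random() for k in range(d)]
             for _ in range(n_nests)]
    fit = [objective(x) for x in nests]          # Step 2: objective per nest
    lam = 1.5
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    for _ in range(iters):
        # Step 5: a cuckoo generates a new nest by a Levy flight
        i = rng.randrange(n_nests)
        step = [0.01 * (ub[k] - lb[k]) * rng.gauss(0, sigma)
                / abs(rng.gauss(0, 1)) ** (1 / lam) for k in range(d)]
        new = clip([nests[i][k] + step[k] for k in range(d)])
        f_new = objective(new)                   # Step 6
        j = rng.randrange(n_nests)               # greedy replacement
        if f_new < fit[j]:
            nests[j], fit[j] = new, f_new
        # abandon a fraction pa of the worst nests (discovery by the host)
        order = sorted(range(n_nests), key=lambda k: fit[k], reverse=True)
        for j in order[:int(pa * n_nests)]:
            nests[j] = [lb[k] + (ub[k] - lb[k]) * rng.random() for k in range(d)]
            fit[j] = objective(nests[j])
    best = min(range(n_nests), key=lambda k: fit[k])
    return nests[best], fit[best]

# toy 2-parameter objective standing in for the GLSD fitting criterion
obj = lambda p: (p[0] - 0.3) ** 2 + (p[1] - 1.5) ** 2
best, f = cuckoo_search(obj, lb=[0.0, 1.0], ub=[1.0, 3.0])
```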
The CSO algorithm, a meta-heuristic, is adopted to estimate (θ̂, β̂); θ̂ then provides the
estimate of α. More details on this algorithm are given in [15].
3. SIMULATION
The three estimators of (α and β), i.e., the CSO, MLE, and MOM algorithms, are compared
using a MATLAB (A11) program. Different sample sizes (n = 15, 25, 50, 100) are considered,
and the results are compared using the mean square error (MSE) over R = 1000 runs of each
experiment.
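The MSE criterion can be illustrated with a small Monte Carlo harness. As a hedged, simplified stand-in for the paper's experiment, the sketch below samples from the β = 1 special case (the logarithmic series distribution), uses the sample mean as the estimator of the distribution mean, and averages the squared error over R = 1000 replications; the paper instead evaluates the CSO, MLE, and MOM estimators of α and β.

```python
import random
from math import log

def sample_logseries(theta, n, rng):
    """Inverse-CDF sampling from the logarithmic series distribution,
    the beta = 1 special case of the GLSD."""
    alpha = -1.0 / log(1.0 - theta)
    out = []
    for _ in range(n):
        u, x, cum = rng.random(), 1, 0.0
        while True:
            cum += alpha * theta ** x / x   # P(X = x) = alpha * theta^x / x
            if u <= cum or x > 10000:
                break
            x += 1
        out.append(x)
    return out

def mse(estimates, true_value):
    # MSE = average squared deviation of the estimates from the true value
    return sum((e - true_value) ** 2 for e in estimates) / len(estimates)

rng = random.Random(42)
theta, n, R = 0.3, 50, 1000
mu = -theta / ((1.0 - theta) * log(1.0 - theta))   # true distribution mean
ests = [sum(s) / n for s in (sample_logseries(theta, n, rng) for _ in range(R))]
m = mse(ests, mu)
```

The same `mse` function, applied to the α̂ and β̂ values returned by each estimator, yields the mse_* entries reported in the tables below.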
TABLE 1: Comparison of the Different Estimators When β = 1.5 and α = 0.3
(The α̂ and β̂ columns hold the estimates; the mse_* rows give the corresponding MSE values,
and the best row names the preferred estimator for each parameter.)

n    Skewness  Kurtosis  Method      α̂            β̂
15   0         1         mle         0.3056       1.0512
                         mse_mle     0.0019       0.6889
                         mom         0.7347       1.0961
                         mse_mom     0.1897       0.8226
                         cuckoo      0.3010       1.3836
                         mse_cuckoo  0.0014       0.0313
                         best        cuckoo       cuckoo
25   0         1.7500    mle         0.2991       1.3894
                         mse_mle     0.0018       0.0271
                         mom         0.5908       1.0121
                         mse_mom     0.0869       0.2558
                         cuckoo      0.3292       1.3689
                         mse_cuckoo  8.6762e-004  0.0364
                         best        cuckoo       cuckoo
50   0         1.7000    mle         0.3277       1.4254
                         mse_mle     0.0048226    0.0479
                         mom         0.7163       1.2055
                         mse_mom     0.1765       0.0937
                         cuckoo      0.2991       1.3983
                         mse_cuckoo  8.2262e-004  0.0357
                         best        mle          mle
100  0         1.7877    mle         0.3034       1.4910
                         mse_mle     7.4992e-004  0.0220
                         mom         0.6353       1.2194
                         mse_mom     0.1139       0.0788
                         cuckoo      0.2994       1.4032
                         mse_cuckoo  0.0022       0.0343
                         best        mle          mle
TABLE 2: Comparison of the Different Estimators When β = 2 and α = 0.2

n    Skewness  Kurtosis  Method      α̂            β̂
15   0         1         mle         0.2233       1.8177
                         mse_mle     0.0017       0.0673
                         mom         0.6648       1.0430
                         mse_mom     0.2182       0.9581
                         cuckoo      0.2107       1.9709
                         mse_cuckoo  0.0015       0.0422
                         best        cuckoo       cuckoo
25   0         1.7314    mle         0.2077       1.9701
                         mse_mle     8.8789e-004  0.0407
                         mom         0.7815       1.1209
                         mse_mom     0.3383       0.7887
                         cuckoo      0.2193       1.9801
                         mse_cuckoo  0.0016       0.0375
                         best        cuckoo       cuckoo
50   0         1.7982    mle         0.2036       1.9914
                         mse_mle     5.7535e-004  9.1436e-004
                         mom         0.7853       1.2229
                         mse_mom     0.3470       0.6041
                         cuckoo      0.2102       1.8305
                         mse_cuckoo  0.0014       0.0425
                         best        mle          mle
100  0         1.7997    mle         0.2198       2.0130
                         mse_mle     4.2334e-004  7.8137e-004
                         mom         0.6513       1.2277
                         mse_mom     0.2035       0.6014
                         cuckoo      0.1971       1.9860
                         mse_cuckoo  0.0012       0.0161
                         best        mle          mle
TABLE 3: Comparison of the Different Estimators When β = 2.2 and α = 0.4

n    Skewness  Kurtosis  Method      α̂            β̂
15   0         1.7832    mle         0.6294       1.9464
                         mse_mle     0.1129       0.1197
                         mom         0.7940       1.2001
                         mse_mom     0.1553       0.9998
                         cuckoo      0.3879       2.4615
                         mse_cuckoo  0.0031       0.8312
                         best        cuckoo       cuckoo
25   0         1.7955    mle         0.3929       1.9705
                         mse_mle     0.0030       0.0982
                         mom         0.8431       1.2136
                         mse_mom     0.1968       0.9731
                         cuckoo      0.3919       2.1559
                         mse_cuckoo  0.0029       0.0480
                         best        cuckoo       cuckoo
50   0         1.7990    mle         0.4325       2.0000
                         mse_mle     0.0022       0.0804
                         mom         0.8801       1.2003
                         mse_mom     0.2326       0.9432
                         cuckoo      0.3823       2.3042
                         mse_cuckoo  0.0050       0.0110
                         best        mle          cuckoo
100  0         1.7998    mle         0.4386       2.1386
                         mse_mle     0.0021       0.0063
                         mom         0.7435       1.5161
                         mse_mom     0.1184       0.4677
                         cuckoo      0.4441       2.0096
                         mse_cuckoo  0.0022       0.0071
                         best        mle          mle
TABLE 4: Comparison of the Different Estimators When β = 3 and α = 0.33

n    Skewness  Kurtosis  Method      α̂            β̂
15   0         1.7997    mle         0.8433       3.6985
                         mse_mle     0.2643       0.6205
                         mom         0.8854       2.1156
                         mse_mom     0.3097       0.7822
                         cuckoo      0.3326       2.6793
                         mse_cuckoo  0.0022       0.1882
                         best        cuckoo       cuckoo
25   0         1.7990    mle         0.6693       3.5721
                         mse_mle     0.1835       0.3273
                         mom         0.8443       2.1598
                         mse_mom     0.2646       0.7059
                         cuckoo      0.3326       2.7307
                         mse_cuckoo  0.0022       0.1482
                         best        cuckoo       cuckoo
50   0         1.7945    mle         0.3312       3.1131
                         mse_mle     1.4400e-006  0.0128
                         mom         0.6443       2.3598
                         mse_mom     0.0988       0.4099
                         cuckoo      0.3316       2.8307
                         mse_cuckoo  2.5600e-006  0.0287
                         best        mle          mle
100  0         1.7800    mle         0.3310       3.0030
                         mse_mle     1.0000e-006  9.0000e-006
                         mom         0.4443       2.5598
                         mse_mom     0.0131       0.1938
                         cuckoo      0.3313       2.9307
                         mse_cuckoo  1.6900e-006  0.0048
                         best        mle          mle
TABLE 5: Comparison of the Different Estimators When β = 1.8 and α = 0.5

n    Skewness  Kurtosis  Method      α̂            β̂
15   0         1.7998    mle         0.8870       2.4976
                         mse_mle     0.1498       0.4867
                         mom         0.8875       1.1168
                         mse_mom     0.1502       0.4668
                         cuckoo      0.4769       1.6294
                         mse_cuckoo  0.0046       0.0626
                         best        cuckoo       cuckoo
25   0         1.7990    mle         0.8095       2.2176
                         mse_mle     0.0959       0.1744
                         mom         0.8400       1.1642
                         mse_mom     0.1159       0.4045
                         cuckoo      0.4684       1.7142
                         mse_cuckoo  0.0052       0.0074
                         best        cuckoo       cuckoo
50   0         1.7958    mle         0.6278       1.8522
                         mse_mle     0.0311       0.0027
                         mom         0.7846       1.2056
                         mse_mom     0.0815       0.3534
                         cuckoo      0.4649       1.7278
                         mse_cuckoo  0.0057       0.0052
                         best        cuckoo       mle
100  0         1.7832    mle         0.5349       1.8022
                         mse_mle     0.0014       4.8400e-006
                         mom         0.7430       1.2154
                         mse_mom     0.0595       0.3418
                         cuckoo      0.4899       1.7663
                         mse_cuckoo  0.0039       0.0011
                         best        mle          mle
4. CONCLUSION
After estimating (α and β) using the three methods (MOM, CSO, and MLE) with different
sample sizes (n = 15, 25, 50, 100), we found that, in terms of MSE, the CSO estimator was the
best for small sample sizes (n = 15, 25), as shown in Tables 1 to 5, whereas MLE was the best
for large sample sizes (n = 50, 100). We attribute the advantage of CSO in small samples to the
fact that the algorithm depends on the number of eggs in the host nest, which is limited.
REFERENCES
[1] Azizah Binti Mohamad, Azlan Mohd Zain and Nor Erne Nazira Bazin, (2014), "Cuckoo Search
Algorithm for Optimization Problems—A Literature Review and its Applications", Applied Artificial
Intelligence: An International Journal, vol. 28, issue 5.
[2] Xin-She Yang and Suash Deb, "Nature & Biologically Inspired Computing," IEEE, University of
Cambridge, Trumpington Street, CB2 1PZ, UK, 2010.
[3] Ravi Kiran Varma P, Valli Kumari V, and Srinivas Kumar S, "A novel intelligent attribute reduction
technique based on Ant Colony Optimization," International Journal of Intelligent Systems
Technologies and Applications, vol. 1, no. 1, pp. 23-45, 2015.
[4] Ravi Kiran Varma P, Valli Kumari V, and Srinivas Kumar S, "Feature selection using relative fuzzy
entropy and ant colony optimization applied to real-time intrusion detection system," Procedia
Computer Science, vol. 85, pp. 503-510, 2016.
[5] Ravi Kiran Varma P, Valli Kumari V, and Srinivas Kumar S, "Application of Rough Sets and Ant
Colony Optimization in feature selection for Network Intrusion Detection," International Journal of
Applied Engineering Research, vol. 10, no. 22, pp. 43156-43163, 2015.
[6] Ravi Kiran Varma P, Valli Kumari V, and Srinivas Kumar S, "Ant Colony Optimization Based
Anomaly Mitigation Engine," SpringerPlus, vol. 5, no. 1, pp. 1-32, 2016.
[7] Xin-She Yang and Suash Deb, "Engineering optimisation by cuckoo search," International Journal of
Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.
[8] D. S. Bunch, "Maximum Likelihood Estimation (MLE) of probabilistic choice models," SIAM
Journal on Scientific and Statistical Computing, vol. 8, no. 1, pp. 56-70.
[9] M. S. Prasad Babu et al., (2012), "Development of Maize Expert System using Ada-Boost Algorithm
and Naive Bayesian Classifier," International Journal of Computer Applications Technology and
Research, vol. 1, issue 3, pp. 89-93.
[10] Persi Diaconis, "Application of the Method of Moments in Probability and Statistics," under the
auspices of National Science Foundation grant DMS86-00235, Nov. 1986.
[11] Xin-She Yang and Suash Deb, "A brief literature review: Cuckoo Search and Firefly Algorithm,"
Studies in Computational Intelligence, vol. 516, pp. 49-62, 2014.
[12] Hongqing Zheng and Yongquan Zhou, (2013), "A Cooperative Coevolutionary Cuckoo Search
Algorithm for Optimization Problem," Journal of Applied Mathematics, vol. 2013, Special Issue.
[13] Najla Akram AL-Saati and Marwa Abd-AlKareem, (2013), "The Use of Cuckoo Search in Estimating
the Parameters of Software Reliability Growth Models," International Journal of Computer Science
and Information Security, vol. 11, no. 6.
[14] Manjeet Kumar and Tarun Kumar Rawat, (2015), "Optimal design of FIR fractional order
differentiator using cuckoo search algorithm," Expert Systems with Applications, vol. 42, issue 7,
pp. 3433-3449.
[15] Prasad Babu and B. Jyothsna, (2015), "Implementation of Cuckoo Search Optimization Algorithm
using Semantic Web-Based Coconut Expert System," International Journal of Advanced Research in
Computer Science and Software Engineering, vol. 5, issue 9.