International Journal of Electrical and Computer Engineering (IJECE)
Vol. 8, No. 6, December 2018, pp. 4705~4712
ISSN: 2088-8708, DOI: 10.11591/ijece.v8i6.pp4705-4712
Journal homepage: http://iaescore.com/journals/index.php/IJECE
Ant System and Weighted Voting Method for Multiple
Classifier Systems
Abdullah Husin1, Ku Ruhana Ku-Mahamud2
1 Department of Information Systems, Universitas Islam Indragiri, Indonesia
2 School of Computing, Universiti Utara Malaysia, Malaysia
Article history: Received Jul 11, 2018; Revised Apr 16, 2018; Accepted Apr 30, 2018

ABSTRACT
Combining multiple classifiers is considered a general solution for classification tasks. However, there are two problems in combining multiple classifiers: constructing a diverse classifier ensemble, and constructing an appropriate combiner. In this study, an improved multiple classifier combination scheme is proposed. A diverse classifier ensemble is constructed by training the classifiers with different feature set partitions, where an ant system-based algorithm is used to form the optimal feature set partitions. Weighted voting is used to combine the classifiers’ outputs by considering the strength of the classifiers prior to voting. Experiments were carried out using k-NN ensembles on benchmark datasets from the University of California, Irvine, to evaluate the credibility of the proposed method. Experimental results showed that the proposed method successfully constructed better k-NN ensembles. Furthermore, the proposed method can be used to develop other multiple classifier systems.
Keywords:
Ant system
Classifier ensemble construction
Combiner construction
Feature set partitioning
Multiple classifier system

Copyright © 2018 Institute of Advanced Engineering and Science.
All rights reserved.
Corresponding Author:
Abdullah Husin,
Department of Information Systems, Universitas Islam Indragiri,
Jalan Provinsi Parit 1 Tembilahan Indragiri Hilir Riau, Indonesia.
Email: abdialam@yahoo.com
1. INTRODUCTION
Classification is an important function in data mining. One of the main issues in performing classification is selecting a classifier that achieves good classification accuracy. A single classifier provides minimal exploitation of complementary information from other classifiers, while a combination of multiple classifiers may provide such additional information [1]. The goal of multiple classifier combination is to obtain a comprehensive result by combining the outputs of several individual classifiers [2]. Such a combination consists of a set of classifiers, called the classifier ensemble, and a combination strategy for integrating the classifier outputs, called the combiner.
Multiple classifier combination has been widely used in many application domains such as speech recognition [3], human emotion recognition [4], video classification [5], face recognition [6], email classification [7], cancer classification [8], plant leaf identification [9], concept drift identification [10] and sukuk rating prediction [11]. Multiple classifier combination has been very useful in enhancing classification performance. However, there are two problems in developing multiple classifier combinations: constructing the classifier ensemble, and constructing the combiner. There are no standard guidelines on how to construct a set of diverse and accurate classifiers or how to combine the classifier outputs [12]. Most previous studies focus on classifier ensemble construction and apply a simple fixed combiner to combine the outputs [13]. This study addresses both problems; reviews were performed on feature set partitioning and the weighted voting combiner.
There are several approaches to construct a classifier ensemble. All such approaches attempt to
generate diversity by creating classifiers that make errors on different patterns, thus they can be combined
effectively. The diversity among classifiers in an ensemble is deemed a key success factor when constructing a classifier ensemble. It has been shown, both theoretically and empirically, that a good ensemble has both accuracy and diversity [14]. One approach to constructing a classifier ensemble is the feature decomposition method, which manipulates the input features to construct a diverse classifier ensemble. This method decomposes the input features while training the classifier ensemble; it is therefore appropriate for high-dimensionality data sets [15].
A special case of feature decomposition is feature set partitioning, in which the input features are partitioned into several disjoint subsets and each classifier is trained on a different subset. Feature set partitioning is appropriate for classification tasks with a large number of features [16], [17]. However, it is difficult to determine how to form the optimal feature set partition so that the trained classifiers achieve good performance.
Reviews of the set partitioning problem highlight that the ant system, a variant of ant colony optimization (ACO), is the most promising technique to apply [18]. The ACO algorithm was introduced by Marco Dorigo in the early 1990s. It is inspired by the behavior of ants finding the shortest path from the colony to a food source: ants deposit pheromone on the paths they travel, and shorter paths accumulate more pheromone and attract more ants. Ant-based algorithms have shown better performance than other popular heuristics such as simulated annealing and genetic algorithms [19]. The ant system (AS) is the original and most widely used ant-based algorithm for solving optimization problems [20]. The ant system has also been used to solve the set partitioning problem, a difficult and very complicated combinatorial problem [21], and has been applied to constructing a classifier ensemble [22].
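As an illustration only, the pheromone mechanism described above can be sketched for a feature-partitioning use case. The function `ant_system_partition`, the toy `quality` function and all parameter values below are hypothetical and are not the authors' algorithm; the sketch only shows the core ant system loop of probabilistic solution construction, pheromone evaporation and pheromone deposit.

```python
import random

def ant_system_partition(n_features, n_groups, quality,
                         n_ants=10, n_iters=50, rho=0.1, seed=0):
    """Minimal ant system sketch for assigning features to groups.

    tau[i][g] is the pheromone favouring feature i in group g; each ant
    builds a full assignment, the best-so-far assignment reinforces its
    choices, and evaporation (rate rho) gradually forgets old trails.
    `quality` maps an assignment (tuple of group ids) to a score.
    """
    rng = random.Random(seed)
    tau = [[1.0] * n_groups for _ in range(n_features)]
    best, best_q = None, float("-inf")
    for _ in range(n_iters):
        for _ant in range(n_ants):
            # each ant picks a group per feature, biased by pheromone
            assign = tuple(
                rng.choices(range(n_groups), weights=tau[i])[0]
                for i in range(n_features))
            q = quality(assign)
            if q > best_q:
                best, best_q = assign, q
        # evaporate, then let the best-so-far solution deposit pheromone
        for i in range(n_features):
            for g in range(n_groups):
                tau[i][g] *= (1.0 - rho)
            tau[i][best[i]] += best_q
    return best, best_q

# Toy quality: reward even-indexed features in group 0, odd ones in group 1.
best, q = ant_system_partition(
    6, 2,
    lambda a: sum(1 for i, g in enumerate(a) if (i % 2 == 0) == (g == 0)))
print(best, q)
```

In the paper's setting the quality function would instead be the validation accuracy of the ensemble trained on the candidate partition.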
The most popular, fundamental and straightforward combiner is majority voting [23]. Each individual classifier votes for one class label, and the class label that appears most frequently among the individual outputs becomes the final output. To avoid ties, the number of voting classifiers is usually odd. Majority voting is often used to combine multiple classifiers in order to solve classification problems [24]. Popular ensemble methods such as bagging, boosting and random forest use majority voting to combine classifier outputs. The advantages of majority voting include its simplicity and low computational cost, and it enables the outputs of classifiers to be combined regardless of which classifier is used. It is the optimal combiner in several ensemble settings [25]. However, its disadvantage is that it does not consider the strength of each classifier [26].
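A minimal sketch of the majority voting combiner described above; the function name and example labels are illustrative:

```python
from collections import Counter

def majority_vote(predictions):
    """Return the class label that most classifiers voted for.

    predictions: list of class labels, one per individual classifier.
    An odd number of classifiers avoids ties in two-class problems.
    """
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Three classifiers vote on one test instance:
print(majority_vote(["spam", "ham", "spam"]))  # -> spam
```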
Weighted voting is a trainable version of majority voting that, unlike majority voting, assigns a weight to each classifier before voting. To make an overall prediction, a weighted vote of the classifier predictions determines the class. There are several ways to determine the classifier weights [27]. The advantages of weighted voting include its flexibility and its potential to outperform majority voting. This combiner can make multiple classifier combinations more robust to the choice of the number of individual classifiers [28]. In addition, the accuracies of the classifiers can be reliably estimated, after which weighted voting may be applied [29]. Several studies have concentrated on weighted voting and have shown that it can solve real-world problems such as face and voice recognition [30] and listed companies’ financial distress prediction [31]. Therefore, in this study the weighted voting combiner is adopted as a combiner that considers the performance of each classifier.
2. RESEARCH METHOD
There are three steps in the research work: (1) classifier ensemble construction; (2) combiner construction; and (3) evaluation. In developing the multiple classifier system, an effective combination must address the first two steps, ensemble construction and combiner construction. The ant system feature set partitioning algorithm is applied to construct the classifier ensemble, while the weighted voting technique is applied as the combiner. Figure 1 shows the architecture of the proposed method, which consists of two components: the ant system feature set partitioning and the weighted voting combiner.
Figure 1. Architecture of the proposed method for multiple classifier system
2.1. Classifier Ensemble Construction
The classifier ensemble is built based on the feature set partitioning algorithm. A disjoint partition of the input feature set is formed by an algorithm based on the ant system. The number of formed feature subsets determines the number of individual classifiers. The required inputs are the feature set and the category labels of the original data set. The input feature set is partitioned into different feature subsets and no feature in the training set is removed; therefore, each individual classifier is trained on a different projection of the training set. The flowchart for feature decomposition is depicted in Figure 2.
Figure 2. Flowchart of the ant system-based feature set partitioning algorithm
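The partition-then-train scheme described in this subsection can be sketched as follows. This is a simplified stand-in, not the authors' implementation: the helper names are invented, and a toy 1-NN replaces the paper's k-NN base classifier.

```python
def train_ensemble(X, y, partition, make_classifier):
    """Train one classifier per feature subset of a disjoint partition.

    partition: e.g. [[0, 2], [1]] assigns features 0 and 2 to the first
    classifier and feature 1 to the second; no feature is discarded.
    make_classifier(Xsub, y) returns a fitted predict(xsub) callable.
    """
    ensemble = []
    for subset in partition:
        Xsub = [[row[i] for i in subset] for row in X]
        ensemble.append((subset, make_classifier(Xsub, y)))
    return ensemble

def predict_all(ensemble, x):
    """Each classifier sees only its own projection of instance x."""
    return [clf([x[i] for i in subset]) for subset, clf in ensemble]

# A toy 1-NN as the base classifier (stand-in for the paper's k-NN):
def one_nn(Xtr, ytr):
    def predict(x):
        dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in Xtr]
        return ytr[dists.index(min(dists))]
    return predict

X = [[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
y = ["A", "B"]
ens = train_ensemble(X, y, [[0, 2], [1]], one_nn)
print(predict_all(ens, [0.1, 0.9, 0.1]))  # -> ['A', 'A']
```

The list of per-classifier labels returned by `predict_all` is what the combiner of Section 2.2 aggregates.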
2.2. Combiner Construction
In this construction stage, the weighted voting method is used as the combiner. A learning process for each classifier on a different partition of the features is performed by the ant system algorithm. Weights are assigned according to the performance of each classifier, and the performance of each classifier depends on the feature set partition; therefore, the voting weight of each classifier is updated dynamically based on the feature set partition. The idea behind this approach is that classifiers trained on different feature set partitions will achieve different accuracies even though one type of classifier is used in the ensemble, and classifiers with high accuracy are more likely to classify patterns correctly. Let D = {D_1, ..., D_L} be a set of individual classifiers (an ensemble of classifiers), where L is the number of individual classifiers. Let Ω = {ω_1, ω_2, ..., ω_c} be a set of class labels, where c is the number of classes. Let T = {x_i, y_i}, i = 1, ..., N, be a training set (a labelled dataset), where N is the number of instances, x_i ∈ R^n is the n-dimensional feature vector of the i-th instance and y_i ∈ {ω_1, ..., ω_c} is the class label of the i-th instance. Each classifier D_j assigns an input feature vector to one of the predefined class labels, i.e., D_j: R^n → Ω. The output of a classifier ensemble is an L-dimensional class label vector [D_1(x), ..., D_L(x)]^T. The task is to combine the L individual classifier outputs to predict, from the set of possible class labels, the class label that best classifies the unknown pattern.
In formulating the weighted voting combiner, assume that only the class labels are available from the classifier outputs, and define the decision of the j-th classifier as d_{j,k} ∈ {0, 1}, j = 1, ..., L and k = 1, ..., c, where L is the number of classifiers and c is the number of classes. If the j-th classifier D_j chooses class ω_k, then d_{j,k} = 1, and d_{j,k} = 0 otherwise. The ensemble decision for the proposed weighted voting can be described as follows: choose class ω_{k*} if

    Σ_{j=1}^{L} acc_j d_{j,k*}(x) = max_k Σ_{j=1}^{L} acc_j d_{j,k}(x)    (1)

where acc_j is the accuracy (weight) of classifier D_j. The votes are multiplied by a weight before the actual voting; the weight is obtained by estimating the classification accuracy on a validation set.
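Equation (1) can be read as accuracy-weighted voting over crisp label outputs. A minimal sketch, with illustrative function and variable names:

```python
def weighted_vote(labels, weights, classes):
    """Weighted voting combiner in the spirit of Equation (1): each
    classifier j casts its accuracy acc_j as a vote for its predicted
    label, and the class with the largest total weighted vote wins."""
    totals = {c: 0.0 for c in classes}
    for label, acc in zip(labels, weights):
        totals[label] += acc
    return max(totals, key=totals.get)

# One strong classifier says "A", two weak classifiers say "B":
# 0.9 > 0.4 + 0.4, so the strong classifier's vote prevails.
print(weighted_vote(["A", "B", "B"], [0.9, 0.4, 0.4], ["A", "B"]))  # -> A
```

Note that plain majority voting would return "B" here; the weights are what let the combiner account for classifier strength.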
2.3. Evaluation
In this step, the performance of the multiple classifier system constructed by the proposed ant system and weighted voting (ASWV) method is measured and compared with several other ensemble methods. Experiments were conducted on nine benchmark datasets taken from the University of California, Irvine (UCI) repository. The k-Nearest Neighbour (k-NN) ensemble was used in the experiments. Table 1 shows a summary of the datasets used in the experiments.
Table 1. Summary of Datasets
No.  Dataset                    Instances  Classes  Features  Feature Types
1    Haberman                   306        2        3         Integer
2    Iris                       150        3        4         Real
3    Lenses                     24         3        4         Categorical
4    Liver                      345        2        6         Categorical, Integer, Real
5    Ecoli                      336        8        7         Real
6    Pima Indians Diabetes      768        2        8         Integer, Real
7    Tic-Tac-Toe                958        2        9         Categorical
8    Glass                      214        6        9         Real
9    Breast Cancer (Wisconsin)  699        2        9         Categorical
The k-fold cross-validation method was applied to obtain the classification accuracy [32]. A set of labeled samples is randomly partitioned into k disjoint folds of equal size. Then, one of the k folds is selected as the testing set and the remaining (k-1) folds are used as the training set, with the assumption that there is at least one sample per class. The classification accuracy (acc) is the ratio of the number of correctly classified instances to the total number of instances, as shown in Equation (2).

    acc = (number of correctly classified instances / total number of instances) × 100%    (2)
Finally, the estimate of the classification accuracy is obtained by dividing the sum of all fold classification accuracies by the number of folds, as shown in Equation (3).

    acc_CV = (1/k) Σ_{i=1}^{k} acc_i    (3)

where acc_i is the classification accuracy of round i and k is the number of folds. A common choice for k-fold cross-validation is k = 10; extensive experiments have shown that ten is the best choice for an accurate estimate [33]–[35]. To obtain reliable performance estimates and comparisons, a large number of estimates is always preferred. Therefore, in this research the experiments consist of ten repetitions of the 10-fold cross-validation method.
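The repeated k-fold protocol of Equations (2) and (3) can be sketched as below. The helper names and the majority-label baseline classifier are illustrative assumptions, not the paper's experimental setup.

```python
import random

def cross_val_accuracy(X, y, fit_predict, k=10, runs=10, seed=0):
    """Estimate accuracy by `runs` repetitions of k-fold cross-validation
    (Equations (2) and (3)); returns the mean fold accuracy in percent.
    fit_predict(Xtr, ytr, Xte) must return predicted labels for Xte."""
    rng = random.Random(seed)
    accs = []
    n = len(X)
    for _ in range(runs):
        idx = list(range(n))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]  # k near-equal folds
        for test_idx in folds:
            test_set = set(test_idx)
            train_idx = [i for i in idx if i not in test_set]
            preds = fit_predict([X[i] for i in train_idx],
                                [y[i] for i in train_idx],
                                [X[i] for i in test_idx])
            correct = sum(p == y[i] for p, i in zip(preds, test_idx))
            accs.append(100.0 * correct / len(test_idx))  # Equation (2)
    return sum(accs) / len(accs)                          # Equation (3)

# A trivial baseline that always predicts the majority training label:
def majority_baseline(Xtr, ytr, Xte):
    label = max(set(ytr), key=ytr.count)
    return [label] * len(Xte)

X = [[i] for i in range(30)]
y = ["A"] * 20 + ["B"] * 10
print(round(cross_val_accuracy(X, y, majority_baseline), 1))  # -> 66.7
```

With a 20:10 class split the majority baseline is always right on class "A" instances, so the averaged fold accuracy is exactly 20/30.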
3. RESULTS AND ANALYSIS
The ant system algorithm was used to partition the feature set and weighted voting was used to combine the classifier outputs. Experiments were carried out on nine data sets from the UCI repository. Ten repetitions of 10-fold cross-validation were carried out to validate the accuracy of the single k-NN and of the constructed k-NN ensembles. Table 2 shows the average and standard deviation of the classification accuracies of the single k-NN, the k-NN ensembles constructed with random subspace, and the k-NN ensembles constructed with ant system-based feature set partitioning. The small standard deviations obtained for all methods indicate that the experiments were stable. The average accuracy of the multiple k-NN constructed by the proposed method was compared with the average accuracies of the original single k-NN and of the k-NN ensembles constructed by the random subspace method. The proposed method provides better accuracy than the single classifier approach and the random subspace method in constructing k-NN ensembles, with improvements in accuracy obtained on all datasets. The comparison of accuracies is shown in Table 2.
Table 2. The Accuracy of Single k-NN, Random Subspace and the Proposed Method
                      Single k-NN        k-NN with          k-NN with Ant System
                                         Random Subspace    (Proposed Method)
No  Dataset           Avg     Std Dev    Avg     Std Dev    Avg     Std Dev
1   Haberman          68.83   1.37       67.91   1.96       68.53   0.79
2   Iris              95.67   0.47       93.40   0.47       96.34   0.35
3   Lenses            77.92   2.81       62.50   4.17       86.67   1.76
4   Liver             62.32   1.00       60.06   3.48       65.48   1.35
5   Ecoli             81.19   0.61       81.19   1.70       81.91   0.31
6   Pima              67.37   0.81       70.59   1.32       71.22   0.00
7   Tic-Tac-Toe       75.77   0.45       75.70   2.19       78.81   0.39
8   Glass             72.71   0.83       72.71   1.86       73.54   0.43
9   Breast Cancer     95.78   0.28       97.23   0.31       98.09   0.00
The proposed algorithm successfully formed the feature set partitions. Table 3 summarizes the results of implementing the proposed algorithm, presenting the obtained feature set partition and the resulting number of classifiers for each dataset.
Table 3. Obtained Feature Set Partition and Number of Classifiers
No Dataset Partition Number of Classifiers
1 Haberman [1 3][2] 2
2 Iris [1 2 3 4] 1
3 Lenses [1 2 3 4] 1
4 Liver [1 4 6][3 5][2] 3
5 Ecoli [1 2 3 4 5 6 7] 1
6 Pima [1 3 4 7][5 6 8][2] 3
7 Tic-Tac-Toe [1 2 3 4 5 6 7 8 9] 1
8 Glass [1 2 3 4 5 6 7 8 9] 1
9 Breast Cancer [1 2 4 7 9][3 5][6][8] 4
The accuracy of the proposed method was also compared to other common methods, as shown in Table 4. The proposed method was evaluated against: (1) the single classifier approach; (2) dynamic weighted voting [28]; (3) improved k-NN classification using a genetic algorithm (GA k-NN) [36]; (4) simultaneous metaheuristic feature selection (SMFS) [37]; (5) the weighted k-NN ensemble method [27]; (6) the direct boosting algorithm [38]; (7) the cluster-oriented ensemble classifier (COEC) [39]; and (8) the evidential neural network [40]. The k-NN classifier was used as the base classifier. The results show that the proposed method gives the best classification accuracies compared to the other methods on the Haberman and Breast Cancer datasets. In general, the proposed method gives good classification results and is comparable with the other methods.
Table 4. Comparison of Accuracies with Common Ensemble Methods
Dataset        1      2      3      4      5      6      7      8      9
Haberman       66.83  -      -      -      71.89  -      -      -      72.75
Iris           95.67  97.33  -      -      95.20  96.70  96.00  94.93  96.34
Ecoli          81.19  -      -      -      82.89  -      -      -      81.91
Glass          72.71  -      -      -      74.23  72.50  -      -      73.54
Pima           67.37  72.68  -      71.90  -      75.70  -      71.79  71.22
Breast Cancer  95.78  96.35  97.92  97.50  -      -      97.72  -      98.09

1. Single k-NN   2. Dynamic weighted voting   3. GA k-NN
4. SMFS   5. Weighted k-NN ensemble method   6. Direct boosting algorithm
7. COEC   8. Evidential neural network   9. Proposed ASWV method
4. CONCLUSION
A new method based on the integration of the ant system and weighted voting for multiple classifier systems has been presented. The ant system was applied to optimize the feature set partitioning, while weighted voting was used as the combiner. Experimental results show that applying this method to combine several k-NNs as base classifiers outperforms the single k-NN and is comparable with other ensemble methods. The results indicate that the proposed method can be applied to generate better k-NN ensembles. Furthermore, the method determines the number of combined classifiers from the number of formed partitions.
Future research will apply this method to other classifiers such as the Support Vector Machine, Neural Network and Decision Tree. A dynamic feature partition-selection approach can be considered to enhance the performance of this method. The method should be able to partition the feature set into several lower-dimensional feature sets, allowing a set of classifiers to process low-dimensional feature vectors simultaneously. Therefore, testing the ability of this method to handle high-dimensional data and small training samples can be considered.
ACKNOWLEDGEMENTS
This work is supported by the Higher Education Ministry of Malaysia under the Long Term
Research Grant Scheme (S/O code: 12490).
REFERENCES
[1] K. Kim, H. Lin, J. Y. Choi, and K. Choi, “A design framework for hierarchical ensemble of multiple feature extractors and multiple classifiers,” Pattern Recognition, 2015.
[2] G. Baron, “Analysis of Multiple Classifiers Performance for Discretized Data in Authorship Attribution,” Intelligent Decision Technologies, vol. 73, pp. 33–42, 2017.
[3] S. Hegde, K. . Achary, and S. Shetty, “A multiple classifier system for automatic speech recognition,” International Journal of Computer Applications, vol. 101, no. 9, pp. 38–43, 2014.
[4] C. H. Wu and W. Bin Liang, “Emotion recognition of affective speech based on multiple classifiers using acoustic-prosodic information and semantic labels,” IEEE Transactions on Affective Computing, vol. 2, no. 1, pp. 10–21, 2015.
[5] M. H. Sigari, S. A. Sureshjani, and H. Soltanian-Zadeh, “Sport video classification using an ensemble classifier,” in Proceedings of 7th Iranian Conference on Machine Vision and Image Processing, 2011, pp. 2–5.
[6] B. Zhang, “Reliable face recognition by random subspace support vector machine ensemble,” in Proceedings of the 2012 International Conference on Machine Learning and Cybernetics, 2012, vol. 1, pp. 415–420.
[7] A. Chharia and R. K. Gupta, “Email classifier: An ensemble using probability and rules,” in Proceedings of 6th International Conference on Contemporary Computing, 2013, pp. 130–136.
[8] A. Margoosian and J. Abouei, “Ensemble-based classifiers for cancer classification using human tumor microarray data,” in Proceedings of the 21st International Conference on Electrical Engineering (ICEE), 2013, pp. 1–6.
[9] R. P. A. Pramesti, Y. Herdiyeni, and A. S. Nugroho, “Weighted Ensemble Classifier for Plant Leaf Identification,” TELKOMNIKA, vol. 16, no. 3, pp. 1386–1393, 2018.
[10] L. Deshpande and M. N. Rao, “Concept Drift Identification using Classifier Ensemble Approach,” International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 1, pp. 19–25, 2018.
[11] M. Kartiwi, T. S. Gunawan, T. Arundina, and M. A. Omar, “Sukuk Rating Prediction using Voting Ensemble Strategy,” International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 1, pp. 299–303, 2018.
[12] D. Hernández-Lobato, G. Martínez-Muñoz, and A. Suárez, “How large should ensembles of classifiers be?,” Pattern Recognition, vol. 46, no. 5, pp. 1323–1336, 2013.
[13] P. Du, J. Xia, W. Zhang, K. Tan, Y. Liu, and S. Liu, “Multiple classifier system for remote sensing image classification: A review,” Sensors, vol. 12, no. 4, pp. 4764–4792, 2012.
[14] S. Li, Z. Zheng, Y. Wang, C. Chang, and Y. Yu, “A new hyperspectral band selection and classification framework based on combining multiple classifier,” Pattern Recognition Letters, pp. 1–8, 2016.
[15] L. Rokach, “Ensemble-based classifiers,” Artificial Intelligence Review, vol. 33, no. 1–2, pp. 1–39, 2010.
[16] O. Maimon and L. Rokach, Decomposition Methodology for Knowledge Discovery and Data Mining. Berlin Heidelberg: Springer-Verlag, 2005.
[17] H. Ahn, H. Moon, M. J. Fazzari, N. Lim, J. J. Chen, and R. L. Kodell, “Classification by ensembles from random partitions of high-dimensional data,” Computational Statistics and Data Analysis, vol. 51, no. 12, pp. 6166–6179, 2007.
[18] B. Crawford, R. Soto, E. Monfroy, C. Castro, W. Palma, and F. Paredes, “A hybrid soft computing approach for subset problems,” Mathematical Problems in Engineering, vol. 2013, 2013.
[19] G. Chicco, “Ant colony system-based applications to electrical distribution system optimization,” in Ant Colony Optimization - Methods and Applications, A. Otsfeld, Ed. Rijeka, Croatia: InTechOpen, 2011, pp. 237–262.
[20] R. Ribeiro and F. Enembreck, “A sociologically inspired heuristic for optimization algorithms: A case study on ant systems,” Expert Systems with Applications, vol. 40, no. 5, pp. 1814–1826, 2013.
[21] E. Cetin, “The solution of crew scheduling problem with set partitioning model,” Aviation and Space Technology, vol. 3, no. 4, pp. 47–54, 2008.
[22] Abdullah and K. R. Ku-Mahamud, “Ant system-based feature set partitioning algorithm for classifier ensemble construction,” International Journal of Soft Computing, vol. 11, no. 3, pp. 176–184, 2016.
[23] L. Hansen and P. Salamon, “Neural network ensembles,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993–1001, 1990.
[24] A. Hajdu, L. Hajdu, A. Jonas, L. Kovacs, and H. Toman, “Generalizing the majority voting scheme to spatially constrained voting,” IEEE Transactions on Image Processing, vol. 22, no. 11, pp. 4182–4194, 2013.
[25] M. P. Ponti, “Combining classifiers: From the creation of ensembles to the decision fusion,” in Proceedings of 24th SIBGRAPI Conference on Graphics, Patterns, and Images Tutorials (SIBGRAPI-T 2011), 2011, pp. 1–10.
[26] W. Wang, Y. Zhu, X. Huang, D. Lopresti, Z. Xue, R. Long, S. Antani, and G. Thoma, “A classifier ensemble based on performance level estimation,” in International Symposium on Biomedical Imaging: From Nano to Macro, 2009, pp. 342–345.
[27] S. Hamzeloo, H. Shahparast, and M. Zolghadri Jahromi, “A novel weighted nearest neighbor ensemble classifier,” in Proceedings of the 16th International Symposium on Artificial Intelligence and Signal Processing, 2012, pp. 413–416.
[28] R. M. Valdovinos and J. S. Sánchez, “Combining multiple classifiers with dynamic weighted voting,” in Proceedings of the 4th International Conference on Hybrid Artificial Intelligence Systems (HAIS ’09), 2009, pp. 510–516.
[29] R. Polikar, “Ensemble based systems in decision making,” IEEE Circuits and Systems Magazine, vol. 6, no. 3, pp. 21–45, 2006.
[30] X. Mu, J. Lu, P. Watta, and M. H. Hassoun, “Weighted voting-based ensemble classifier with application to human face recognition and voice recognition,” in Proceedings of International Joint Conference on Neural Networks, 2009, pp. 2168–2171.
[31] J. Sun and H. Li, “Listed companies’ financial distress prediction based on weighted majority voting combination of multiple classifiers,” Expert Systems with Applications, vol. 35, no. 3, pp. 818–827, 2008.
[32] P. M. Jena and S. R. Nayak, “Angular Symmetric Axis Constellation Model for Off-line Odia Handwritten Characters Recognition,” International Journal of Advances in Applied Sciences (IJAAS), vol. 7, no. 3, pp. 265–272, 2018.
[33] R. Kohavi, “A study of cross-validation and bootstrap for accuracy estimation and model selection,” in Proceedings of the 14th International Joint Conference on Artificial Intelligence, 1995, no. 2, pp. 1137–1145.
[34] M. Wozniak, “Classifier fusion based on weighted voting - Analytical and experimental results,” in 2008 Eighth International Conference on Intelligent Systems Design and Applications, 2008, pp. 687–692.
[35] M. Wozniak, “Evolutionary approach to produce classifier ensemble based on weighted voting,” in Proceedings of World Congress on Nature and Biologically Inspired Computing (NaBIC), 2009, pp. 648–653.
[36] N. Suguna and K. Thanushkodi, “An Improved k-Nearest Neighbor Classification Algorithm Using Shared Nearest Neighbor Similarity,” International Journal of Computer Science, vol. 7, no. 4, pp. 18–21, 2010.
[37] M. A. Tahir and J. Smith, “Creating diverse nearest-neighbour ensembles using simultaneous metaheuristic feature selection,” Pattern Recognition Letters, vol. 31, no. 11, pp. 1470–1480, 2010.
[38] T. Koon, C. Neo, and D. Ventura, “A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric,” Pattern Recognition Letters, vol. 33, no. 1, pp. 92–102, 2012.
[39] B. Verma and A. Rahman, “Cluster Oriented Ensemble Classifier: Impact of Multicluster Characterization on Ensemble Classifier Learning,” IEEE Transactions on Knowledge and Data Engineering, vol. 24, no. 4, pp. 605–618, 2012.
[40] H. He, D. Han, and Y. Yang, “Design of multiple classifier systems based on evidential neural network,” in Chinese Control Conference, 2017, pp. 5496–5501.
BIOGRAPHIES OF AUTHORS
Abdullah Husin: He received his Sarjana and Master's degrees in Computer Science from Gadjah Mada University, Yogyakarta, Indonesia, and his Ph.D. in Computer Science from Universiti Utara Malaysia (UUM) in 2015. He is currently with the Bachelor's program in Information Systems, Faculty of Engineering and Computer Science, Universitas Islam Indragiri, Indonesia. His research interests include data mining, optimization algorithms, multimedia databases, and pattern recognition and classification.
Ku Ruhana Ku-Mahamud: She holds a Bachelor's degree in Mathematical Sciences and a Master's degree in Computing, both from Bradford University, United Kingdom, in 1983 and 1986 respectively. She obtained her PhD in Computer Science from Universiti Pertanian Malaysia in 1994. Her research interests include ant colony optimization, pattern classification and the vehicle routing problem.
ย 
Correlation of artificial neural network classification and nfrs attribute fi...
Correlation of artificial neural network classification and nfrs attribute fi...Correlation of artificial neural network classification and nfrs attribute fi...
Correlation of artificial neural network classification and nfrs attribute fi...eSAT Journals
ย 
11.software modules clustering an effective approach for reusability
11.software modules clustering an effective approach for  reusability11.software modules clustering an effective approach for  reusability
11.software modules clustering an effective approach for reusabilityAlexander Decker
ย 

What's hot (20)

Improved Slicing Algorithm For Greater Utility In Privacy Preserving Data Pub...
Improved Slicing Algorithm For Greater Utility In Privacy Preserving Data Pub...Improved Slicing Algorithm For Greater Utility In Privacy Preserving Data Pub...
Improved Slicing Algorithm For Greater Utility In Privacy Preserving Data Pub...
ย 
IRJET- A Detailed Study on Classification Techniques for Data Mining
IRJET- A Detailed Study on Classification Techniques for Data MiningIRJET- A Detailed Study on Classification Techniques for Data Mining
IRJET- A Detailed Study on Classification Techniques for Data Mining
ย 
Comparative study of various supervisedclassification methodsforanalysing def...
Comparative study of various supervisedclassification methodsforanalysing def...Comparative study of various supervisedclassification methodsforanalysing def...
Comparative study of various supervisedclassification methodsforanalysing def...
ย 
An unsupervised feature selection algorithm with feature ranking for maximizi...
An unsupervised feature selection algorithm with feature ranking for maximizi...An unsupervised feature selection algorithm with feature ranking for maximizi...
An unsupervised feature selection algorithm with feature ranking for maximizi...
ย 
SURVEY ON CLASSIFICATION ALGORITHMS USING BIG DATASET
SURVEY ON CLASSIFICATION ALGORITHMS USING BIG DATASETSURVEY ON CLASSIFICATION ALGORITHMS USING BIG DATASET
SURVEY ON CLASSIFICATION ALGORITHMS USING BIG DATASET
ย 
Flower Grain Image Classification Using Supervised Classification Algorithm
Flower Grain Image Classification Using Supervised Classification AlgorithmFlower Grain Image Classification Using Supervised Classification Algorithm
Flower Grain Image Classification Using Supervised Classification Algorithm
ย 
IRJET- Ordinal based Classification Techniques: A Survey
IRJET-  	  Ordinal based Classification Techniques: A SurveyIRJET-  	  Ordinal based Classification Techniques: A Survey
IRJET- Ordinal based Classification Techniques: A Survey
ย 
Ijetr021251
Ijetr021251Ijetr021251
Ijetr021251
ย 
MULTI-PARAMETER BASED PERFORMANCE EVALUATION OF CLASSIFICATION ALGORITHMS
MULTI-PARAMETER BASED PERFORMANCE EVALUATION OF CLASSIFICATION ALGORITHMSMULTI-PARAMETER BASED PERFORMANCE EVALUATION OF CLASSIFICATION ALGORITHMS
MULTI-PARAMETER BASED PERFORMANCE EVALUATION OF CLASSIFICATION ALGORITHMS
ย 
Effective Feature Selection for Feature Possessing Group Structure
Effective Feature Selection for Feature Possessing Group StructureEffective Feature Selection for Feature Possessing Group Structure
Effective Feature Selection for Feature Possessing Group Structure
ย 
Feature selection in multimodal
Feature selection in multimodalFeature selection in multimodal
Feature selection in multimodal
ย 
Trust Enhanced Role Based Access Control Using Genetic Algorithm
Trust Enhanced Role Based Access Control Using Genetic Algorithm Trust Enhanced Role Based Access Control Using Genetic Algorithm
Trust Enhanced Role Based Access Control Using Genetic Algorithm
ย 
Knowledge extraction from numerical data an abc
Knowledge extraction from numerical data an abcKnowledge extraction from numerical data an abc
Knowledge extraction from numerical data an abc
ย 
The use of rough set theory in determining the preferences of the customers o...
The use of rough set theory in determining the preferences of the customers o...The use of rough set theory in determining the preferences of the customers o...
The use of rough set theory in determining the preferences of the customers o...
ย 
Multi-Cluster Based Approach for skewed Data in Data Mining
Multi-Cluster Based Approach for skewed Data in Data MiningMulti-Cluster Based Approach for skewed Data in Data Mining
Multi-Cluster Based Approach for skewed Data in Data Mining
ย 
Multi-label Classification Methods: A Comparative Study
Multi-label Classification Methods: A Comparative StudyMulti-label Classification Methods: A Comparative Study
Multi-label Classification Methods: A Comparative Study
ย 
Bx34452461
Bx34452461Bx34452461
Bx34452461
ย 
Performance Analysis of Selected Classifiers in User Profiling
Performance Analysis of Selected Classifiers in User ProfilingPerformance Analysis of Selected Classifiers in User Profiling
Performance Analysis of Selected Classifiers in User Profiling
ย 
Correlation of artificial neural network classification and nfrs attribute fi...
Correlation of artificial neural network classification and nfrs attribute fi...Correlation of artificial neural network classification and nfrs attribute fi...
Correlation of artificial neural network classification and nfrs attribute fi...
ย 
11.software modules clustering an effective approach for reusability
11.software modules clustering an effective approach for  reusability11.software modules clustering an effective approach for  reusability
11.software modules clustering an effective approach for reusability
ย 

Similar to Ant System and Weighted Voting Method for Multiple Classifier Systems

New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...
New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...
New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...ijaia
ย 
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMS
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMSPREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMS
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMSSamsung Electronics
ย 
Predicting performance of classification algorithms
Predicting performance of classification algorithmsPredicting performance of classification algorithms
Predicting performance of classification algorithmsIAEME Publication
ย 
Assessment of Cluster Tree Analysis based on Data Linkages
Assessment of Cluster Tree Analysis based on Data LinkagesAssessment of Cluster Tree Analysis based on Data Linkages
Assessment of Cluster Tree Analysis based on Data Linkagesjournal ijrtem
ย 
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...csandit
ย 
Lx3520322036
Lx3520322036Lx3520322036
Lx3520322036IJERA Editor
ย 
Recommendation based on Clustering and Association Rules
Recommendation based on Clustering and Association RulesRecommendation based on Clustering and Association Rules
Recommendation based on Clustering and Association RulesIJARIIE JOURNAL
ย 
Applying supervised and un supervised learning approaches for movie recommend...
Applying supervised and un supervised learning approaches for movie recommend...Applying supervised and un supervised learning approaches for movie recommend...
Applying supervised and un supervised learning approaches for movie recommend...IAEME Publication
ย 
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...IAEME Publication
ย 
Feature selection for classification
Feature selection for classificationFeature selection for classification
Feature selection for classificationefcastillo744
ย 
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary AlgorithmAutomatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithmaciijournal
ย 
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary AlgorithmAutomatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithmaciijournal
ย 
An Heterogeneous Population-Based Genetic Algorithm for Data Clustering
An Heterogeneous Population-Based Genetic Algorithm for Data ClusteringAn Heterogeneous Population-Based Genetic Algorithm for Data Clustering
An Heterogeneous Population-Based Genetic Algorithm for Data Clusteringijeei-iaes
ย 
Final Report
Final ReportFinal Report
Final Reportimu409
ย 
Network Based Intrusion Detection System using Filter Based Feature Selection...
Network Based Intrusion Detection System using Filter Based Feature Selection...Network Based Intrusion Detection System using Filter Based Feature Selection...
Network Based Intrusion Detection System using Filter Based Feature Selection...IRJET Journal
ย 
Introduction to Multi-Objective Clustering Ensemble
Introduction to Multi-Objective Clustering EnsembleIntroduction to Multi-Objective Clustering Ensemble
Introduction to Multi-Objective Clustering EnsembleIJSRD
ย 
IRJET-Clustering Techniques for Mushroom Dataset
IRJET-Clustering Techniques for Mushroom DatasetIRJET-Clustering Techniques for Mushroom Dataset
IRJET-Clustering Techniques for Mushroom DatasetIRJET Journal
ย 
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...A Brief Survey on Recommendation System for a Gradient Classifier based Inade...
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...Christo Ananth
ย 

Similar to Ant System and Weighted Voting Method for Multiple Classifier Systems (20)

New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...
New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...
New Feature Selection Model Based Ensemble Rule Classifiers Method for Datase...
ย 
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMS
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMSPREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMS
PREDICTING PERFORMANCE OF CLASSIFICATION ALGORITHMS
ย 
Predicting performance of classification algorithms
Predicting performance of classification algorithmsPredicting performance of classification algorithms
Predicting performance of classification algorithms
ย 
Assessment of Cluster Tree Analysis based on Data Linkages
Assessment of Cluster Tree Analysis based on Data LinkagesAssessment of Cluster Tree Analysis based on Data Linkages
Assessment of Cluster Tree Analysis based on Data Linkages
ย 
Ijmet 10 01_141
Ijmet 10 01_141Ijmet 10 01_141
Ijmet 10 01_141
ย 
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...
ATTRIBUTE REDUCTION-BASED ENSEMBLE RULE CLASSIFIERS METHOD FOR DATASET CLASSI...
ย 
Lx3520322036
Lx3520322036Lx3520322036
Lx3520322036
ย 
Recommendation based on Clustering and Association Rules
Recommendation based on Clustering and Association RulesRecommendation based on Clustering and Association Rules
Recommendation based on Clustering and Association Rules
ย 
Applying supervised and un supervised learning approaches for movie recommend...
Applying supervised and un supervised learning approaches for movie recommend...Applying supervised and un supervised learning approaches for movie recommend...
Applying supervised and un supervised learning approaches for movie recommend...
ย 
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...
APPLYING SUPERVISED AND UN-SUPERVISED LEARNING APPROACHES FOR MOVIE RECOMMEND...
ย 
Feature selection for classification
Feature selection for classificationFeature selection for classification
Feature selection for classification
ย 
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary AlgorithmAutomatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
ย 
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary AlgorithmAutomatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
Automatic Unsupervised Data Classification Using Jaya Evolutionary Algorithm
ย 
An Heterogeneous Population-Based Genetic Algorithm for Data Clustering
An Heterogeneous Population-Based Genetic Algorithm for Data ClusteringAn Heterogeneous Population-Based Genetic Algorithm for Data Clustering
An Heterogeneous Population-Based Genetic Algorithm for Data Clustering
ย 
Final Report
Final ReportFinal Report
Final Report
ย 
Network Based Intrusion Detection System using Filter Based Feature Selection...
Network Based Intrusion Detection System using Filter Based Feature Selection...Network Based Intrusion Detection System using Filter Based Feature Selection...
Network Based Intrusion Detection System using Filter Based Feature Selection...
ย 
Introduction to Multi-Objective Clustering Ensemble
Introduction to Multi-Objective Clustering EnsembleIntroduction to Multi-Objective Clustering Ensemble
Introduction to Multi-Objective Clustering Ensemble
ย 
IRJET-Clustering Techniques for Mushroom Dataset
IRJET-Clustering Techniques for Mushroom DatasetIRJET-Clustering Techniques for Mushroom Dataset
IRJET-Clustering Techniques for Mushroom Dataset
ย 
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...A Brief Survey on Recommendation System for a Gradient Classifier based Inade...
A Brief Survey on Recommendation System for a Gradient Classifier based Inade...
ย 
Ijmet 10 02_050
Ijmet 10 02_050Ijmet 10 02_050
Ijmet 10 02_050
ย 

More from IJECEIAES

Predicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learningPredicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learningIJECEIAES
ย 
Taxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca AirportTaxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca AirportIJECEIAES
ย 
Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...IJECEIAES
ย 
Ataxic person prediction using feature optimized based on machine learning model
Ataxic person prediction using feature optimized based on machine learning modelAtaxic person prediction using feature optimized based on machine learning model
Ataxic person prediction using feature optimized based on machine learning modelIJECEIAES
ย 
Situational judgment test measures administrator computational thinking with ...
Situational judgment test measures administrator computational thinking with ...Situational judgment test measures administrator computational thinking with ...
Situational judgment test measures administrator computational thinking with ...IJECEIAES
ย 
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...IJECEIAES
ย 
An overlapping conscious relief-based feature subset selection method
An overlapping conscious relief-based feature subset selection methodAn overlapping conscious relief-based feature subset selection method
An overlapping conscious relief-based feature subset selection methodIJECEIAES
ย 
Recognition of music symbol notation using convolutional neural network
Recognition of music symbol notation using convolutional neural networkRecognition of music symbol notation using convolutional neural network
Recognition of music symbol notation using convolutional neural networkIJECEIAES
ย 
A simplified classification computational model of opinion mining using deep ...
A simplified classification computational model of opinion mining using deep ...A simplified classification computational model of opinion mining using deep ...
A simplified classification computational model of opinion mining using deep ...IJECEIAES
ย 
An efficient convolutional neural network-extreme gradient boosting hybrid de...
An efficient convolutional neural network-extreme gradient boosting hybrid de...An efficient convolutional neural network-extreme gradient boosting hybrid de...
An efficient convolutional neural network-extreme gradient boosting hybrid de...IJECEIAES
ย 
Generating images using generative adversarial networks based on text descrip...
Generating images using generative adversarial networks based on text descrip...Generating images using generative adversarial networks based on text descrip...
Generating images using generative adversarial networks based on text descrip...IJECEIAES
ย 
Collusion-resistant multiparty data sharing in social networks
Collusion-resistant multiparty data sharing in social networksCollusion-resistant multiparty data sharing in social networks
Collusion-resistant multiparty data sharing in social networksIJECEIAES
ย 
Application of deep learning methods for automated analysis of retinal struct...
Application of deep learning methods for automated analysis of retinal struct...Application of deep learning methods for automated analysis of retinal struct...
Application of deep learning methods for automated analysis of retinal struct...IJECEIAES
ย 
Proposal of a similarity measure for unified modeling language class diagram ...
Proposal of a similarity measure for unified modeling language class diagram ...Proposal of a similarity measure for unified modeling language class diagram ...
Proposal of a similarity measure for unified modeling language class diagram ...IJECEIAES
ย 
Efficient criticality oriented service brokering policy in cloud datacenters
Efficient criticality oriented service brokering policy in cloud datacentersEfficient criticality oriented service brokering policy in cloud datacenters
Efficient criticality oriented service brokering policy in cloud datacentersIJECEIAES
ย 
System call frequency analysis-based generative adversarial network model for...
System call frequency analysis-based generative adversarial network model for...System call frequency analysis-based generative adversarial network model for...
System call frequency analysis-based generative adversarial network model for...IJECEIAES
ย 
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...IJECEIAES
ย 
Efficient intelligent crawler for hamming distance based on prioritization of...
Efficient intelligent crawler for hamming distance based on prioritization of...Efficient intelligent crawler for hamming distance based on prioritization of...
Efficient intelligent crawler for hamming distance based on prioritization of...IJECEIAES
ย 
Predictive modeling for breast cancer based on machine learning algorithms an...
Predictive modeling for breast cancer based on machine learning algorithms an...Predictive modeling for breast cancer based on machine learning algorithms an...
Predictive modeling for breast cancer based on machine learning algorithms an...IJECEIAES
ย 
An algorithm for decomposing variations of 3D model
An algorithm for decomposing variations of 3D modelAn algorithm for decomposing variations of 3D model
An algorithm for decomposing variations of 3D modelIJECEIAES
ย 

More from IJECEIAES (20)

Predicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learningPredicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learning
ย 
Taxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca AirportTaxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca Airport
ย 
Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...
ย 
Ataxic person prediction using feature optimized based on machine learning model
Ataxic person prediction using feature optimized based on machine learning modelAtaxic person prediction using feature optimized based on machine learning model
Ataxic person prediction using feature optimized based on machine learning model
ย 
Situational judgment test measures administrator computational thinking with ...
Situational judgment test measures administrator computational thinking with ...Situational judgment test measures administrator computational thinking with ...
Situational judgment test measures administrator computational thinking with ...
ย 
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...
Hand LightWeightNet: an optimized hand pose estimation for interactive mobile...
ย 
An overlapping conscious relief-based feature subset selection method
An overlapping conscious relief-based feature subset selection methodAn overlapping conscious relief-based feature subset selection method
An overlapping conscious relief-based feature subset selection method
ย 
Recognition of music symbol notation using convolutional neural network
Recognition of music symbol notation using convolutional neural networkRecognition of music symbol notation using convolutional neural network
Recognition of music symbol notation using convolutional neural network
ย 
A simplified classification computational model of opinion mining using deep ...
A simplified classification computational model of opinion mining using deep ...A simplified classification computational model of opinion mining using deep ...
A simplified classification computational model of opinion mining using deep ...
ย 
An efficient convolutional neural network-extreme gradient boosting hybrid de...
An efficient convolutional neural network-extreme gradient boosting hybrid de...An efficient convolutional neural network-extreme gradient boosting hybrid de...
An efficient convolutional neural network-extreme gradient boosting hybrid de...
ย 
Generating images using generative adversarial networks based on text descrip...
Generating images using generative adversarial networks based on text descrip...Generating images using generative adversarial networks based on text descrip...
Generating images using generative adversarial networks based on text descrip...
ย 
Collusion-resistant multiparty data sharing in social networks
Collusion-resistant multiparty data sharing in social networksCollusion-resistant multiparty data sharing in social networks
Collusion-resistant multiparty data sharing in social networks
ย 
Application of deep learning methods for automated analysis of retinal struct...
Application of deep learning methods for automated analysis of retinal struct...Application of deep learning methods for automated analysis of retinal struct...
Application of deep learning methods for automated analysis of retinal struct...
ย 
Proposal of a similarity measure for unified modeling language class diagram ...
Proposal of a similarity measure for unified modeling language class diagram ...Proposal of a similarity measure for unified modeling language class diagram ...
Proposal of a similarity measure for unified modeling language class diagram ...
ย 
Efficient criticality oriented service brokering policy in cloud datacenters
Efficient criticality oriented service brokering policy in cloud datacentersEfficient criticality oriented service brokering policy in cloud datacenters
Efficient criticality oriented service brokering policy in cloud datacenters
ย 
System call frequency analysis-based generative adversarial network model for...
System call frequency analysis-based generative adversarial network model for...System call frequency analysis-based generative adversarial network model for...
System call frequency analysis-based generative adversarial network model for...
ย 
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...
Comparison of Iris dataset classification with Gaussian naรฏve Bayes and decis...
ย 
Efficient intelligent crawler for hamming distance based on prioritization of...
Efficient intelligent crawler for hamming distance based on prioritization of...Efficient intelligent crawler for hamming distance based on prioritization of...
Efficient intelligent crawler for hamming distance based on prioritization of...
ย 
Predictive modeling for breast cancer based on machine learning algorithms an...
Predictive modeling for breast cancer based on machine learning algorithms an...Predictive modeling for breast cancer based on machine learning algorithms an...
Predictive modeling for breast cancer based on machine learning algorithms an...
ย 
An algorithm for decomposing variations of 3D model
An algorithm for decomposing variations of 3D modelAn algorithm for decomposing variations of 3D model
An algorithm for decomposing variations of 3D model
ย 

Recently uploaded

KubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlyKubeKraft presentation @CloudNativeHooghly
KubeKraft presentation @CloudNativeHooghlysanyuktamishra911
ย 
Vivazz, Mieres Social Housing Design Spain
Vivazz, Mieres Social Housing Design SpainVivazz, Mieres Social Housing Design Spain
Vivazz, Mieres Social Housing Design Spaintimesproduction05
ย 
result management system report for college project
result management system report for college projectresult management system report for college project
result management system report for college projectTonystark477637
ย 
Ant System and Weighted Voting Method for Multiple Classifier Systems

International Journal of Electrical and Computer Engineering (IJECE)
Vol. 8, No. 6, December 2018, pp. 4705-4712
ISSN: 2088-8708, DOI: 10.11591/ijece.v8i6.pp4705-4712
Journal homepage: http://iaescore.com/journals/index.php/IJECE

Abdullah Husin (1), Ku Ruhana Ku-Mahamud (2)
(1) Department of Information Systems, Universitas Islam Indragiri, Indonesia
(2) School of Computing, Universiti Utara Malaysia, Malaysia

Article history: Received Jul 11, 2018; Revised Apr 16, 2018; Accepted Apr 30, 2018

ABSTRACT
Combining multiple classifiers is considered a general solution for classification tasks. However, there are two problems in combining multiple classifiers: constructing a diverse classifier ensemble and constructing an appropriate combiner. In this study, an improved multiple classifier combination scheme is proposed. A diverse classifier ensemble is constructed by training the classifiers with different feature set partitions, and an ant system-based algorithm is used to form the optimal feature set partitions. Weighted voting is used to combine the classifiers' outputs by considering the strength of each classifier prior to voting. Experiments were carried out using k-NN ensembles on benchmark datasets from the University of California, Irvine, to evaluate the credibility of the proposed method. Experimental results showed that the proposed method successfully constructed better k-NN ensembles. Furthermore, the proposed method can be used to develop other multiple classifier systems.

Keywords: Ant system; Classifier ensemble construction; Combiner construction; Feature set partitioning; Multiple classifier system

Copyright © 2018 Institute of Advanced Engineering and Science. All rights reserved.

Corresponding Author: Abdullah Husin, Department of Information Systems, Universitas Islam Indragiri, Jalan Provinsi Parit 1 Tembilahan Indragiri Hilir Riau, Indonesia.
Email: abdialam@yahoo.com

1. INTRODUCTION
Classification is an important function in data mining. One of the main issues in performing classification is identifying a classifier that achieves good classification accuracy. A single classifier provides minimal exploitation of complementary information from other classifiers, while a combination of multiple classifiers can provide such additional information [1]. The goal of multiple classifier combination is to obtain a comprehensive result by combining the outputs of several individual classifiers [2]. A multiple classifier system consists of a set of classifiers, called the classifier ensemble, and a combination strategy for integrating the classifier outputs, called the combiner. Multiple classifier combination has been widely used in many application domains such as speech recognition [3], human emotion recognition [4], video classification [5], face recognition [6], email classification [7], cancer classification [8], plant leaf identification [9], concept drift identification [10] and sukuk rating prediction [11].

Multiple classifier combination has been very useful in enhancing classification performance. However, there are two problems in developing multiple classifier combinations: constructing the classifier ensemble and constructing the combiner. There are no standard guidelines on how to construct a set of diverse and accurate classifiers or how to combine their outputs [12]. Most previous studies focus on classifier ensemble construction and apply a simple fixed combiner to combine the outputs [13]. This study addresses both problems, and reviews were performed on feature set partitioning and the weighted voting combiner.

There are several approaches to constructing a classifier ensemble. All such approaches attempt to generate diversity by creating classifiers that make errors on different patterns, so that they can be combined
effectively. The diversity among the classifiers in an ensemble is deemed a key success factor when constructing a classifier ensemble. It has been shown, both theoretically and empirically, that a good ensemble has both accuracy and diversity [14].

One approach to constructing a classifier ensemble is the feature decomposition method, which manipulates the input features to construct a diverse classifier ensemble. This method decomposes the input features while training the classifier ensemble and is therefore appropriate for high-dimensionality datasets [15]. A special case of feature decomposition is feature set partitioning: the input features are partitioned into several disjoint subsets, and each classifier is trained on a different subset. Feature set partitioning is appropriate for classification tasks with a large number of features [16], [17]. However, it is difficult to determine how to form the optimal feature set partition so that the trained classifiers produce good performance. Reviews of the set partitioning problem highlight the ant system, a variant of ant colony optimization (ACO), as the most promising technique to apply [18].

The ACO algorithm was introduced by Marco Dorigo in the early 1990s. It is inspired by the behavior of ants finding the shortest path from the colony to a food source: ants deposit pheromone on the paths they travel, and the pheromone trail guides the colony toward the shortest route. Ant-based algorithms have shown better performance than other popular heuristics such as simulated annealing and genetic algorithms [19]. The ant system (AS) algorithm is the original and most widely used variant of the ant-based algorithm for solving many optimization problems [20]. The ant system has also been used to solve the set partitioning problem.
Set partitioning problems are difficult and very complicated combinatorial problems [21]. The use of the ant system for the set partitioning problem has been applied in constructing a classifier ensemble [22].

The most popular, fundamental and straightforward combiner is majority voting [23]. Every individual classifier votes for one class label, and the class label that appears most frequently among the individual classifier outputs becomes the final output. To avoid ties, the number of classifiers taking part in the vote is usually odd. Majority voting is often used to combine multiple classifiers in order to solve classification problems [24]; popular ensemble methods such as bagging, boosting and random forest have used majority voting to combine classifier outputs. The advantages of majority voting include its simplicity and low computational cost. Majority voting enables combination of classifier outputs regardless of which classifier is used, and it is an optimal combiner in several ensemble methods [25]. However, its disadvantage is that it does not consider the strength of each classifier [26].

Weighted voting is a trainable version of majority voting which, unlike majority voting, assigns a weight to each classifier before voting. To make an overall prediction, a weighted vote of the classifier predictions is performed to predict the class. There are several ways to determine the weights of the classifiers [27]. The advantages of weighted voting include its flexibility and its potential to outperform majority voting. This combiner can make multiple classifier combinations more robust to the choice of the number of individual classifiers [28]. In addition, the accuracies of the classifiers can be reliably estimated, after which weighted voting may be considered [29].
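As a concrete illustration, the plain majority voting rule described above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the paper; the function name and the vote data are hypothetical.

```python
from collections import Counter

def majority_vote(labels):
    """Return the class label chosen by the most individual classifiers."""
    return Counter(labels).most_common(1)[0][0]

# Three classifiers (an odd number, to avoid ties) vote on one instance;
# class "B" receives two of the three votes and wins.
prediction = majority_vote(["A", "B", "B"])
print(prediction)  # -> B
```

Note that the rule uses only the class labels, not how confident or accurate each voter is, which is exactly the limitation that motivates weighted voting.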
Several studies have concentrated on weighted voting, and it has been proven to solve real-world problems such as face and voice recognition [30] and listed companies' financial distress prediction [31]. Therefore, in this study the weighted voting combiner is adopted as a combiner that considers the performance of each classifier.

2. RESEARCH METHOD
There are three steps in the research work: (1) classifier ensemble construction; (2) combiner construction; and (3) evaluation. In developing the multiple classifier system, effective combination must address the first two steps, ensemble construction and combiner construction. The ant system feature set partitioning algorithm is applied to construct the classifier ensemble, while the weighted voting technique is applied as the combiner. Figure 1 shows the architecture of the proposed method, which consists of two components: the ant system feature set partitioning and the weighted voting combiner.
Figure 1. Architecture of the proposed method for multiple classifier system

2.1. Classifier Ensemble Construction
The classifier ensemble is built based on the feature set partitioning algorithm. A disjoint feature set partition is carried out based on the input feature set, and an algorithm based on the ant system is developed to perform the partitioning. The number of feature partitions determines the number of individual classifiers. The required inputs are the feature set and the category labels of the original dataset. The input feature set is partitioned into different feature subsets and no feature in the training set is removed; therefore, each individual classifier is trained on a different projection of the training set. The flowchart for feature decomposition is depicted in Figure 2.

Figure 2. Flowchart of the ant system-based feature set partitioning algorithm
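Training each classifier on a different projection of the training set can be made concrete with a small sketch. The partition, dataset and function below are hypothetical examples, not the paper's implementation; the subsets use 1-based feature indices in the same bracket notation the paper uses for its partitions.

```python
def project(instances, subset):
    """Keep only the features listed in `subset` (1-based indices)."""
    return [[x[i - 1] for i in subset] for x in instances]

# A hypothetical disjoint partition of a 3-feature dataset into two
# subsets, [1 3] and [2]; each subset becomes the training view of one
# individual classifier in the ensemble.
partition = [[1, 3], [2]]
data = [[5.0, 1.2, 0.7], [4.1, 0.9, 1.5]]
views = [project(data, subset) for subset in partition]
print(views[0])  # -> [[5.0, 0.7], [4.1, 1.5]]
print(views[1])  # -> [[1.2], [0.9]]
```

Because the subsets are disjoint and jointly cover all features, no feature is discarded; diversity comes from each classifier seeing a different slice of the same instances.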
2.2. Combiner Construction
In this construction stage, the weighted voting method is used as the combiner. A learning process for each classifier on a different partition of features is performed by the ant system algorithm, and weights are given according to the performance of each classifier. The performance of each classifier depends on the feature set partition; therefore, the voting weights of each classifier are updated dynamically based on the partition. The idea behind this approach is that classifiers trained on different feature set partitions will provide different accuracies even though only one type of classifier is used in the ensemble, and classifiers that provide high accuracy are more likely to classify patterns correctly.

Let D = {D_1, ..., D_L} be a set of individual classifiers (an ensemble of classifiers), where L is the number of individual classifiers. Let Omega = {w_1, w_2, ..., w_c} be a set of class labels, where c is the number of classes. Let T = {(x_i, y_i)}, i = 1, ..., N, be a training set (a labelled dataset), where N is the number of instances, x_i in R^n is the n-dimensional feature vector of the i-th instance and y_i in {w_1, ..., w_c} is the class label of the i-th instance. Each classifier D_j assigns an input feature vector to one of the predefined class labels, i.e., D_j : R^n -> Omega. The output of a classifier ensemble is an L-dimensional class label vector [D_1(x), ..., D_L(x)]^T. The task is to combine the L individual classifier outputs to predict, from the set of possible class labels, the class label that best classifies the unknown pattern.
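Given these definitions, an accuracy-weighted vote over the crisp classifier outputs can be sketched as follows. This is an illustrative sketch under the setup above; the validation accuracies used as weights are hypothetical values.

```python
def weighted_vote(decisions, weights, n_classes):
    """Combine crisp classifier decisions by accuracy-weighted voting.

    decisions[j] is the class index chosen by classifier j (its d_{j,k} = 1),
    and weights[j] is its estimated accuracy acc_j; the class with the
    largest accuracy-weighted vote sum is returned.
    """
    scores = [0.0] * n_classes
    for cls, acc in zip(decisions, weights):
        scores[cls] += acc
    return max(range(n_classes), key=scores.__getitem__)

# Two weaker classifiers pick class 0 while one stronger classifier picks
# class 1; the accuracy weights (0.40 + 0.45 < 0.95) overturn the simple
# majority.
print(weighted_vote([0, 0, 1], [0.40, 0.45, 0.95], n_classes=2))  # -> 1
```

With equal weights the rule reduces to plain majority voting, which is why weighted voting is described as its trainable generalization.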
In formulating the weighted voting combiner, assume that only the class labels are available from the classifier outputs, and define the decision of the j-th classifier as d_{j,k} in {0, 1}, j = 1, ..., L and k = 1, ..., c, where L is the number of classifiers and c is the number of classes. If the j-th classifier D_j chooses class w_k, then d_{j,k} = 1, and d_{j,k} = 0 otherwise. The ensemble decision for the proposed weighted voting can be described as follows: choose class w_{k*} if

  sum_{j=1}^{L} acc_j d_{j,k*}(x) = max_k sum_{j=1}^{L} acc_j d_{j,k}(x)    (1)

where acc_j is the accuracy (or weight) of classifier D_j. The votes are multiplied by a weight before the actual voting; the weight is obtained by estimating the classification accuracy on a validation set.

2.3. Evaluation
In this step, the performance of the multiple classifier system constructed by the proposed ant system and weighted voting (ASWV) method is measured and compared with several other ensemble methods. Experiments were conducted on nine benchmark datasets taken from the University of California, Irvine (UCI) repository, and the k-Nearest Neighbour (k-NN) ensemble was used in the experiments. Table 1 shows a summary of the datasets used in the experiments.

Table 1. Summary of Datasets
No.  Dataset                    Instances  Classes  Features  Feature Types
1    Haberman                   306        2        3         Integer
2    Iris                       150        3        4         Real
3    Lenses                     24         3        4         Categorical
4    Liver                      345        2        6         Categorical, Integer, Real
5    Ecoli                      336        8        7         Real
6    Pima Indians Diabetes      768        2        8         Integer, Real
7    Tic-Tac-Toe                958        2        9         Categorical
8    Glass                      214        6        9         Real
9    Breast Cancer (Wisconsin)  699        2        9         Categorical

The k-fold cross-validation method was applied to obtain the classification accuracy [32]. A set of labeled samples is randomly partitioned into k disjoint folds of equal size.
Then, one of the k folds is randomly selected as the testing set and the remaining (k-1) folds are used as the training set, with the assumption that there is at least one sample per class. The classification accuracy (acc) is the ratio of the number of correctly classified instances to the total number of instances, as shown in Equation (2):

  acc = (number of correctly classified instances / total number of instances) * 100%    (2)
Finally, the estimated classification accuracy is obtained by dividing the sum of the per-fold classification accuracies by the number of folds, as shown in Equation (3):

  acc_CV = (1/k) * sum_{i=1}^{k} acc_i    (3)

where acc_i is the classification accuracy of fold i and k is the number of folds. A common choice for k-fold cross-validation is k = 10; extensive experiments have shown that ten folds give an accurate estimate [33]-[35]. To obtain reliable performance estimates and comparisons, a large number of estimates is always preferred. Therefore, in this research the experiments were conducted using ten repetitions of the 10-fold cross-validation method.

3. RESULTS AND ANALYSIS
The ant system algorithm was used to partition the feature set and weighted voting was used to combine the classifier outputs. Experiments were carried out on nine datasets from the UCI repository. Ten experiments, each using the 10-fold cross-validation method, were carried out to validate the accuracy of the single k-NN and the constructed k-NN ensembles. Table 2 shows the average and standard deviation of the classification accuracies of the single k-NN, the k-NN ensembles constructed with the random subspace method, and the k-NN ensembles constructed with the ant system-based feature set partitioning. The small standard deviations obtained for all methods indicate that the experiments were stable. The average accuracy of the k-NN ensemble constructed by the proposed method was compared with the average accuracies of the original single k-NN and of the k-NN ensembles constructed by the random subspace method; the proposed method provides better accuracy than both the single-classifier approach and the random subspace method in constructing k-NN ensembles.
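The per-fold accuracy of Equation (2) and its cross-validated average in Equation (3) amount to the short sketch below. The predictions, labels and fold accuracies are illustrative values only, not results from the experiments.

```python
def accuracy(predicted, actual):
    """Classification accuracy in percent, as in Equation (2)."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return 100.0 * correct / len(actual)

def cv_accuracy(fold_accuracies):
    """Average accuracy over the k folds, as in Equation (3)."""
    return sum(fold_accuracies) / len(fold_accuracies)

print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 3 of 4 correct -> 75.0
print(cv_accuracy([70.0, 80.0, 75.0]))       # -> 75.0
```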
Improvements in accuracy are obtained on all datasets. The comparison of accuracies is shown in Table 2.

Table 2. The Accuracy of Single k-NN, Random Subspace and Proposed Method
(average / standard deviation of classification accuracy, %)
No  Dataset        Single k-NN     Random Subspace  Ant System (Proposed)
1   Haberman       68.83 / 1.37    67.91 / 1.96     68.53 / 0.79
2   Iris           95.67 / 0.47    93.40 / 0.47     96.34 / 0.35
3   Lenses         77.92 / 2.81    62.50 / 4.17     86.67 / 1.76
4   Liver          62.32 / 1.00    60.06 / 3.48     65.48 / 1.35
5   Ecoli          81.19 / 0.61    81.19 / 1.70     81.91 / 0.31
6   Pima           67.37 / 0.81    70.59 / 1.32     71.22 / 0.00
7   Tic-Tac-Toe    75.77 / 0.45    75.70 / 2.19     78.81 / 0.39
8   Glass          72.71 / 0.83    72.71 / 1.86     73.54 / 0.43
9   Breast Cancer  95.78 / 0.28    97.23 / 0.31     98.09 / 0.00

The proposed algorithm was successfully applied to form the feature set partitions. Table 3 summarizes the result of implementing the algorithm, presenting the obtained feature set partition and the number of classifiers for each dataset.

Table 3. Obtained Feature Set Partition and Number of Classifiers
No  Dataset        Partition               Number of Classifiers
1   Haberman       [1 3][2]                2
2   Iris           [1 2 3 4]               1
3   Lenses         [1 2 3 4]               1
4   Liver          [1 4 6][3 5][2]         3
5   Ecoli          [1 2 3 4 5 6 7]         1
6   Pima           [1 3 4 7][5 6 8][2]     3
7   Tic-Tac-Toe    [1 2 3 4 5 6 7 8 9]     1
8   Glass          [1 2 3 4 5 6 7 8 9]     1
9   Breast Cancer  [1 2 4 7 9][3 5][6][8]  4

The accuracy of the proposed method was also compared with other common methods, as shown in Table 4. The accuracy of the proposed method was evaluated by comparing the results to: (1) the single
classifier approach, (2) dynamic weighted voting [28], (3) improved k-NN classification using a genetic algorithm (GA k-NN) [36], (4) simultaneous metaheuristic feature selection (SMFS) [37], (5) the weighted k-NN ensemble method [27], (6) the direct boosting algorithm [38], (7) the cluster-oriented ensemble classifier (COEC) [39] and (8) the evidential neural network [40]. The k-NN classifier was used as the base classifier. Based on the results, the proposed method gives the best classification accuracies compared with the other methods on the Haberman and Breast Cancer datasets. In general, the proposed method gives good classification results that are comparable with the other methods.

Table 4. Comparison of Accuracies with Common Ensemble Methods
Dataset        (1)    (2)    (3)    (4)    (5)    (6)    (7)    (8)    (9)
Haberman       66.83  -      -      -      71.89  -      -      -      72.75
Iris           95.67  97.33  -      -      95.20  96.70  96.00  94.93  96.34
Ecoli          81.19  -      -      -      82.89  -      -      -      81.91
Glass          72.71  -      -      -      74.23  72.50  -      -      73.54
Pima           67.37  72.68  -      71.90  -      75.70  -      71.79  71.22
Breast Cancer  95.78  96.35  97.92  97.50  -      -      97.72  -      98.09
(1) Single k-NN; (2) Dynamic weighted voting; (3) GA k-NN; (4) SMFS; (5) Weighted k-NN ensemble method; (6) Direct boosting algorithm; (7) COEC; (8) Evidential neural network; (9) Proposed ASWV method

4. CONCLUSION
A new method based on the integration of the ant system and weighted voting for multiple classifier systems has been presented. The ant system was applied to optimize the feature set partition while weighted voting was used as the combiner. Experimental results show that applying this method to combine several k-NNs as base classifiers outperforms a single k-NN and is comparable with other ensemble methods. The results indicate that the proposed method can be applied to generate better k-NN ensembles. Furthermore, this method can determine the number of combined classifiers based on the number of formed partitions.
Future research will apply this method to other classifiers such as the Support Vector Machine, Neural Network and Decision Tree. A dynamic feature partition-selection approach can be considered to enhance the performance of the method. The method will, hopefully, be able to partition the feature set into several lower-dimensional feature sets, which would allow a set of classifiers to process low-dimensional feature vectors simultaneously. Therefore, testing the ability of this method to overcome high-dimensional data and small training sample problems can be considered.

ACKNOWLEDGEMENTS
This work is supported by the Higher Education Ministry of Malaysia under the Long Term Research Grant Scheme (S/O code: 12490).

REFERENCES
[1] K. Kim, H. Lin, J. Y. Choi, and K. Choi, "A design framework for hierarchical ensemble of multiple feature extractors and multiple classifiers," Pattern Recognition, 2015.
[2] G. Baron, "Analysis of multiple classifiers performance for discretized data in authorship attribution," Intelligent Decision Technologies, vol. 73, pp. 33-42, 2017.
[3] S. Hegde, K. Achary, and S. Shetty, "A multiple classifier system for automatic speech recognition," International Journal of Computer Applications, vol. 101, no. 9, pp. 38-43, 2014.
[4] C. H. Wu and W. Bin Liang, "Emotion recognition of affective speech based on multiple classifiers using acoustic-prosodic information and semantic labels," IEEE Transactions on Affective Computing, vol. 2, no. 1, pp. 10-21, 2015.
[5] M. H. Sigari, S. A. Sureshjani, and H. Soltanian-Zadeh, "Sport video classification using an ensemble classifier," in Proceedings of the 7th Iranian Conference on Machine Vision and Image Processing, 2011, pp. 2-5.
[6] B. Zhang, "Reliable face recognition by random subspace support vector machine ensemble," in Proceedings of the 2012 International Conference on Machine Learning and Cybernetics, 2012, vol. 1, pp. 415-420.
[7] A. Chharia and R. K. Gupta, "Email classifier: An ensemble using probability and rules," in Proceedings of the 6th International Conference on Contemporary Computing, 2013, pp. 130-136.
[8] A. Margoosian and J. Abouei, "Ensemble-based classifiers for cancer classification using human tumor microarray data," in Proceedings of the 21st International Conference on Electrical Engineering (ICEE), 2013, pp. 1-6.
[9] R. P. A. Pramesti, Y. Herdiyeni, and A. S. Nugroho, "Weighted ensemble classifier for plant leaf identification," TELKOMNIKA, vol. 16, no. 3, pp. 1386-1393, 2018.
[10] L. Deshpande and M. N. Rao, "Concept drift identification using classifier ensemble approach," International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 1, pp. 19-25, 2018.
[11] M. Kartiwi, T. S. Gunawan, T. Arundina, and M. A. Omar, "Sukuk rating prediction using voting ensemble strategy," International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 1, pp. 299-303, 2018.
[12] D. Hernández-Lobato, G. Martínez-Muñoz, and A. Suárez, "How large should ensembles of classifiers be?," Pattern Recognition, vol. 46, no. 5, pp. 1323-1336, 2013.
[13] P. Du, J. Xia, W. Zhang, K. Tan, Y. Liu, and S. Liu, "Multiple classifier system for remote sensing image classification: A review," Sensors, vol. 12, no. 4, pp. 4764-4792, 2012.
[14] S. Li, Z. Zheng, Y. Wang, C. Chang, and Y. Yu, "A new hyperspectral band selection and classification framework based on combining multiple classifiers," Pattern Recognition Letters, pp. 1-8, 2016.
[15] L. Rokach, "Ensemble-based classifiers," Artificial Intelligence Review, vol. 33, no. 1-2, pp. 1-39, 2010.
[16] O. Maimon and L. Rokach, Decomposition Methodology for Knowledge Discovery and Data Mining. Berlin Heidelberg: Springer-Verlag, 2005.
[17] H. Ahn, H. Moon, M. J. Fazzari, N. Lim, J. J. Chen, and R. L. Kodell, "Classification by ensembles from random partitions of high-dimensional data," Computational Statistics and Data Analysis, vol. 51, no. 12, pp. 6166-6179, 2007.
[18] B. Crawford, R. Soto, E. Monfroy, C. Castro, W. Palma, and F. Paredes, "A hybrid soft computing approach for subset problems," Mathematical Problems in Engineering, vol. 2013, 2013.
[19] G. Chicco, "Ant colony system-based applications to electrical distribution system optimization," in Ant Colony Optimization - Methods and Applications, A. Otsfeld, Ed. Rijeka, Croatia: InTechOpen, 2011, pp. 237-262.
[20] R. Ribeiro and F. Enembreck, "A sociologically inspired heuristic for optimization algorithms: A case study on ant systems," Expert Systems with Applications, vol. 40, no. 5, pp. 1814-1826, 2013.
[21] E. Cetin, "The solution of crew scheduling problem with set partitioning model," Aviation and Space Technology, vol. 3, no. 4, pp. 47-54, 2008.
[22] Abdullah and K. R. Ku-Mahamud, "Ant system-based feature set partitioning algorithm for classifier ensemble construction," International Journal of Soft Computing, vol. 11, no. 3, pp. 176-184, 2016.
[23] L. Hansen and P. Salamon, "Neural network ensembles," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 10, pp. 993-1001, 1990.
[24] A. Hajdu, L. Hajdu, A. Jonas, L. Kovacs, and H. Toman, "Generalizing the majority voting scheme to spatially constrained voting," IEEE Transactions on Image Processing, vol. 22, no. 11, pp. 4182-4194, 2013.
[25] M. P. Ponti, "Combining classifiers: From the creation of ensembles to the decision fusion," in Proceedings of the 24th SIBGRAPI Conference on Graphics, Patterns, and Images Tutorials (SIBGRAPI-T 2011), 2011, pp. 1-10.
[26] W. Wang, Y. Zhu, X. Huang, D. Lopresti, Z. Xue, R. Long, S. Antani, and G. Thoma, "A classifier ensemble based on performance level estimation," in Proceedings of the International Symposium on Biomedical Imaging: From Nano to Macro, 2009, pp. 342-345.
[27] S. Hamzeloo, H. Shahparast, and M. Zolghadri Jahromi, "A novel weighted nearest neighbor ensemble classifier," in Proceedings of the 16th International Symposium on Artificial Intelligence and Signal Processing, 2012, pp. 413-416.
[28] R. M. Valdovinos and J. S. Sánchez, "Combining multiple classifiers with dynamic weighted voting," in Proceedings of the 4th International Conference on Hybrid Artificial Intelligence Systems (HAIS '09), 2009, pp. 510-516.
[29] R. Polikar, "Ensemble based systems in decision making," IEEE Circuits and Systems Magazine, vol. 6, no. 3, pp. 21-45, 2006.
[30] X. Mu, J. Lu, P. Watta, and M. H. Hassoun, "Weighted voting-based ensemble classifier with application to human face recognition and voice recognition," in Proceedings of the International Joint Conference on Neural Networks, 2009, pp. 2168-2171.
[31] J. Sun and H. Li, "Listed companies' financial distress prediction based on weighted majority voting combination of multiple classifiers," Expert Systems with Applications, vol. 35, no. 3, pp. 818-827, 2008.
[32] P. M. Jena and S. R. Nayak, "Angular symmetric axis constellation model for off-line Odia handwritten characters recognition," International Journal of Advances in Applied Sciences (IJAAS), vol. 7, no. 3, pp. 265-272, 2018.
[33] R. Kohavi, "A study of cross-validation and bootstrap for accuracy estimation and model selection," in Proceedings of the 14th International Joint Conference on Artificial Intelligence, 1995, pp. 1137-1145.
[34] M. Wozniak, "Classifier fusion based on weighted voting - Analytical and experimental results," in Proceedings of the 8th International Conference on Intelligent Systems Design and Applications, 2008, pp. 687-692.
[35] M. Wozniak, "Evolutionary approach to produce classifier ensemble based on weighted voting," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC), 2009, pp. 648-653.
[36] N. Suguna and K. Thanushkodi, "An improved k-nearest neighbor classification algorithm using shared nearest neighbor similarity," International Journal of Computer Science, vol. 7, no. 4, pp. 18-21, 2010.
[37] M. A. Tahir and J. Smith, "Creating diverse nearest-neighbour ensembles using simultaneous metaheuristic feature selection," Pattern Recognition Letters, vol. 31, no. 11, pp. 1470-1480, 2010.
[38] T. Koon, C. Neo, and D. Ventura, "A direct boosting algorithm for the k-nearest neighbor classifier via local
  • 8. ๏ฒ ISSN: 2088-8708 Int J Elec & Comp Eng, Vol. 8, No. 6, December 2018 : 4705 - 4712 4712 warping of the distance metric,โ€ Pattern Recognition Letters, vol. 33, no. 1, pp. 92โ€“102, 2012. [39] B. Verma and A. Rahman, โ€œCluster Oriented Ensemble Classifier: Impact of Multicluster Characterization on Ensemble Classifier Learning,โ€ IEEE Transactions on Knowledge and Data Engineering, vol. 24, no. 4, pp. 605โ€“ 618, 2012. [40] H. He, D. Han, and Y. Yang, โ€œDesign of multiple classifier systems based on evidential neural network,โ€ in Chinese Control Conference, 2017, pp. 5496โ€“5501. BIOGRAPHIES OF AUTHORS Abdullah Husin: He recived his Sarjana and Master degrees in Computer Science from Gadjah Mada University Yogyakarta Indonesia. Now, he is working on the Bachelorโ€™s degree in Information System Program Study, Engineering and Computer Science Faculty, Universitas Islam Indragiri Indonesia. He received his Ph.D degree in Computer Science in 2015 from Universiti Utara Malaysia (UUM). His research interest includes Data Mining, Optimization Algorithm, Multimedia Database and Pattern Recognition and Classification. Ku Ruhana Ku-Mahamud: She holds a Bachelor in Mathematical Sciences and a Masters degree in Computing, both from Bradford University, United Kingdom in 1983 and 1986 respectively. Her PhD in Computer Science was obtained from Universiti Pertanian Malaysia in 1994. As an academic, her research interests include ant colony optimization, pattern classification and vehicle routing problem.