Short Term Load Forecasting (STLF), which predicts load from several minutes to a week ahead, plays a vital role in addressing challenges such as optimal generation, economic scheduling, dispatching and contingency analysis. This paper uses the Multi-Layer Perceptron (MLP) Artificial Neural Network (ANN) technique to perform STLF, but long training times and convergence issues caused by bias, variance and poor generalization ability prevent this algorithm from accurately predicting future loads. These issues can be mitigated by various Bootstrap Aggregating (Bagging) methods (disjoint partitions, small bags, replica small bags and disjoint bags), which reduce variance, increase the generalization ability of the ANN and thereby reduce the error in its learning process. Disjoint partition proves to be the most accurate Bagging method, and combining the outputs of this method by taking their mean improves overall performance further. This approach of combining several predictors, known as an Ensemble Artificial Neural Network (EANN), outperforms both the plain ANN and the Bagging methods by further increasing generalization ability and STLF accuracy.
2. 2 Journal Name, 2019, Vol. 0, No. 0 M.F. Tahir et al.
and Genetic Algorithm (GA) [20], among many others, have been used in the past to address this issue. ANN is preferred over other intelligence techniques due to its ability to self-learn and to perform well on complex non-linear problems. However, hybridization of various techniques with ANN to solve STLF is currently attracting more attention, and a few of these techniques are listed in Table I.
TABLE I
DIFFERENT HYBRID TECHNIQUES USED FOR STLF
Hybridization | Advantages over un-hybrid ANN | ANN type
ANN - Artificial Immune System (AIS) [21] | High accuracy, fast convergence and improved Mean Average Percentage Error (MAPE) | Feed Forward Back Propagation (FFBP) ANN
ANN - Fuzzy [22] | Improvement in prediction accuracy and reduction in forecasting error | Levenberg-Marquardt Back Propagation (LMBP) ANN
ANN - GA [23] | Better performance and good problem-solving ability | FFBP ANN
ANN - PSO [24] | More accurate | Radial Basis Function (RBF) ANN
ANN - CPSO [25] | Improves searching efficiency and quality | RBF ANN
ANN - firefly [26] | Improves both local and global searching ability | FFBP ANN
ANN - Support Vector Machine (SVM) [27] | Better forecasting accuracy and high speed | FFBP ANN
The aforementioned hybridization techniques achieve better results than ANN alone because ANN suffers from noise, bias, variance and inefficient generalization ability. However, if these problems can be resolved, ANN can achieve improved results in less computational time than the hybridization techniques.
The main contributions of this work are:
i) An ANN is trained on three years (2007-2009) of data, taken from the Australian market, to forecast the 2010 data. Humidity, system load, wet bulb, dew point and dry bulb temperature act as inputs, while the 2010 data acts as the target output of the ANN.
ii) The ANN's inability to accurately predict the 2010 data is improved by four Bootstrap Aggregating (Bagging) algorithms that resample the original data, which helps increase generalization ability and reduce variance.
iii) Disjoint partition proves superior to the other three Bagging methods, and an Ensemble Artificial Neural Network (EANN) combines the outputs of this method to increase accuracy further.
The rest of the paper is organized as follows: Section 2 briefly elaborates the ANN, Bagging and EANN techniques, while Section 3 discusses the methodology and the data used for ANN training. Section 4 illustrates the simulation results and Section 5 concludes the paper.
2. ANN, BAGGING AND EANN
2.1. Artificial neural network
The basic idea of ANN derives from the biological nervous system [28, 29]. The key element for processing information in a neural network is the neuron. A neuron has four main parts, and these elements form the basic building block of an ANN, as shown in Fig. 1.
Fig. 1. Biological and ANN architecture: the dendrites, soma, axon and synapses of the biological neuron correspond to the input function, weighting factors, transfer (activation) function and output function of the ANN
The ANN output from the output function is compared with the desired result; a mismatch between the two outputs indicates an error. Some architectures use this error directly, while others square or cube it, depending on the specific purpose. The error is propagated backwards to adjust the input weights so that the ANN output matches the desired output. This weight adjustment and backward propagation of error constitute the learning function, in which a specified algorithm is used to minimize the error. Four performance metrics, Mean Square Error (MSE), Root Mean Square Error (RMSE), Mean Absolute Percentage Error (MAPE) and Mean Absolute Deviation (MAD), are used in this work to quantify the error in the learning process, as indicated in Eqs. (1)-(4).

MSE = \frac{1}{n}\sum_{i=1}^{n} e(i)^{2} = \frac{1}{n}\sum_{i=1}^{n} (t_i - y_i)^{2}    (1)

RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n} e(i)^{2}} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (t_i - y_i)^{2}}    (2)

MAD = \frac{1}{n}\sum_{i=1}^{n} \left| t_i - y_i \right|    (3)

MAPE = \frac{100}{n}\sum_{i=1}^{n} \left| \frac{t_i - y_i}{t_i} \right|    (4)

where n is the number of examples, i represents the iteration, t_i is the desired target value and y_i is the ANN output value.
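As a minimal sketch (in Python with NumPy rather than the authors' MATLAB setup, and with two illustrative load values), the four metrics of Eqs. (1)-(4) can be computed as:

```python
import numpy as np

def forecast_metrics(t, y):
    """Compute MSE, RMSE, MAD and MAPE (Eqs. 1-4) for targets t and outputs y."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    e = t - y
    mse = np.mean(e ** 2)                     # Eq. (1)
    rmse = np.sqrt(mse)                       # Eq. (2): square root of MSE
    mad = np.mean(np.abs(e))                  # Eq. (3)
    mape = 100.0 * np.mean(np.abs(e) / t)     # Eq. (4), in percent
    return mse, rmse, mad, mape

# Two sample loads (MW) and hypothetical forecasts, for illustration only
mse, rmse, mad, mape = forecast_metrics([7228.86, 7062.49], [7414.53, 7016.19])
```

Note that RMSE is always the square root of MSE, so the two metrics rank methods identically; MAPE is the only scale-free metric of the four.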
ANN does not need to be explicitly programmed; it simply learns, which makes it work well with large data sets and complex non-linear problems. Moreover, it easily solves problems that are difficult to specify mathematically or for which no particular domain knowledge is available. However, sometimes it cannot produce the desired results even after trying different training algorithms, activation functions and structures. This difficulty can be due to error in the learning process caused by noise, bias and variance. Bias and variance cause underfitting and overfitting of the data, respectively, owing to the ANN's inability to learn the target function and to fluctuations in the training dataset.
2.2. Bootstrap aggregating
Bootstrap Aggregating, commonly known as Bagging, was presented by Breiman [30]; it helps minimize variance by reducing overfitting, which increases the precision of machine learning algorithms [31]. Disjoint partition, small bags, no-replication small bags and disjoint bags are common Bagging methods, which are elaborated below by considering the hypothetical dataset shown in Fig. 2(a).
Fig. 2(a). Hypothetical data set
The disjoint partition divides the data into small subsets such that the union of the subsets equals the hypothetical data set and each classifier is selected exactly once. In contrast, the subsets created by small bags may not cover the whole data set because some classifiers are repeated, while in the no-replication small bags method no repetition occurs within an independently generated subset, but the union of the subsets may still not equal the whole data set. Disjoint bags are trained in a similar fashion to the disjoint partition, but disjoint bags is the only method in which the subset size may exceed the original partition size, as depicted in Figs. 2(b-e).
Fig. 2(b). Disjoint partition
Fig. 2(c). Small bags
Fig. 2(d). No replication small bags
Fig. 2(e). Disjoint bags
Bagging not only minimizes variance; the random redistribution of data also increases the generalization ability of neural networks. Therefore, creating multiple bootstraps and training the ANN on them again improves the overall accuracy.
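The four subset-generation schemes can be sketched as follows; this is an illustrative Python sketch (the subset count and size of four each are assumptions matching the 16-element example of Fig. 2), not the authors' implementation:

```python
import random

# Hypothetical data set of Fig. 2(a): 16 "classifiers" A..P
data = list("ABCDEFGHIJKLMNOP")
k, size = 4, 4                      # four subsets of four elements each (assumed)
rng = random.Random(0)              # fixed seed for reproducibility

# Disjoint partition: shuffle once and split, so the union of the subsets
# equals the data set and each element is selected exactly once.
shuffled = data[:]
rng.shuffle(shuffled)
disjoint_partition = [shuffled[i * size:(i + 1) * size] for i in range(k)]

# Small bags: each bag is drawn WITH replacement, so elements may repeat
# and the union of the bags may not cover the whole data set.
small_bags = [[rng.choice(data) for _ in range(size)] for _ in range(k)]

# No-replication small bags: no repetition inside a bag, but bags are
# generated independently, so their union may still miss elements.
no_replication_small_bags = [rng.sample(data, size) for _ in range(k)]

# Disjoint bags: built like the disjoint partition, but the subset size may
# grow beyond the partition size (here each bag gains one extra element
# resampled from its own slice).
disjoint_bags = [bag + [rng.choice(bag)] for bag in disjoint_partition]
```

Each list of bags would then be used to train a separate ANN before the outputs are compared or combined.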
2.3. Ensemble artificial neural network
EANN is a method of combining different ANN outputs into one single output [32, 33]. This process can be summed up as sketched in Fig. 3.
Fig. 3. Ensemble artificial neural network
[Figs. 2(a)-(e) illustrate the hypothetical data set A-P and the subsets produced by each Bagging method. Fig. 3 sketches the EANN process: the ANN model alone suffers from variance and bias; creating multiple bootstraps B1, B2, ..., BN increases the ANN generalization ability; the bootstraps are trained again and the best bootstrap method is chosen for the EANN; combining the multiple outputs increases accuracy and yields the final EANN output.]
A combination of several predictors outperforms the prediction of an individual predictor [12]. Therefore, EANN, which combines multiple outputs as shown above, guarantees a reduction in error and an improvement in accuracy. Moreover, the generalization ability and performance of the whole system increase significantly, as shown in the results section.
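The mean-combination step itself is simple; as a hedged sketch, the predictors below are placeholder functions standing in for ANNs trained on different bootstraps (not the paper's trained networks):

```python
import numpy as np

def eann_predict(predictors, x):
    """EANN: combine the outputs of several trained predictors by taking the mean."""
    outputs = np.array([p(x) for p in predictors])
    return outputs.mean(axis=0)

# Three hypothetical predictors standing in for ANNs trained on different
# disjoint-partition bootstraps (illustrative only).
p1 = lambda x: 1.02 * x
p2 = lambda x: 0.99 * x
p3 = lambda x: 1.00 * x

load = np.array([7228.86, 7062.49])      # two sample loads (MW) from Table II
combined = eann_predict([p1, p2, p3], load)
```

Averaging cancels part of the individual predictors' variance, which is why the ensemble error in Table V falls below that of any single bootstrap method.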
3. METHODOLOGY AND DATA
COLLECTION
The data is taken from the Australian market as follows:
- Temperature data from the Bureau of Meteorology (BOM) [34].
- Load data from the Australian Energy Market Operator (AEMO) [35].
The data for the years 2007, 2008 and 2009 comprises the quantities mentioned in Table II; for the sake of simplicity, only the data of the first 12 hours of 1 January 2007 is depicted in this work.
TABLE II
DATA FOR ANN LOAD FORECASTING
Given Data
Date | Time (hour) | Dry Bulb (°C) | Dew Point (°C) | Wet Bulb (°C) | Humidity (g/kg) | System Load (MW)
1-Jan-2007 0.0 20.40 15.2 17.30 72.0 7228.86
1-Jan-2007 0.5 20.35 15.3 17.35 72.5 7062.49
1-Jan-2007 1.0 20.30 15.4 17.40 73.0 6843.66
1-Jan-2007 1.5 20.25 15.5 17.45 74.0 6552.34
1-Jan-2007 2.0 20.20 15.7 17.50 75.0 6296.34
1-Jan-2007 2.5 20.15 15.9 17.60 76.5 6079.49
1-Jan-2007 3.0 20.10 16.1 17.70 78.0 5957.18
1-Jan-2007 3.5 20.10 15.8 17.55 76.5 5913.07
1-Jan-2007 4.0 20.10 15.6 17.40 75.0 5855.45
1-Jan-2007 4.5 19.75 16.3 17.65 80.5 5884.93
1-Jan-2007 5.0 19.40 17.0 17.90 86.0 5904.63
1-Jan-2007 5.5 19.90 16.4 17.80 80.5 5953.51
1-Jan-2007 6.0 20.40 15.9 17.70 75.0 6040.14
1-Jan-2007 6.5 20.65 15.9 17.80 74.0 6150.36
1-Jan-2007 7.0 20.90 15.9 17.90 73.0 6332.48
1-Jan-2007 7.5 20.60 16.5 18.15 77.5 6577.33
1-Jan-2007 8.0 20.30 17.1 18.40 82.0 6796.30
1-Jan-2007 8.5 20.10 16.85 18.15 81.5 7015.00
1-Jan-2007 9.0 19.90 16.6 17.90 81.0 7250.31
1-Jan-2007 9.5 20.05 17.3 18.35 84.0 7470.74
1-Jan-2007 10 20.20 17.9 18.80 86.7 7574.95
1-Jan-2007 10.5 21.40 16.8 18.60 76.0 7666.11
1-Jan-2007 11.0 22.60 15.6 18.40 65.0 7762.30
1-Jan-2007 11.5 22.50 15.2 18.15 63.5 7758.87
1-Jan-2007 12.0 22.40 14.8 17.90 62.0 7750.38
3.1. Initialization, training and adaptation of ANN
The load of any electric unit comprises various consumption units (industrial, commercial and residential), and different factors (such as meteorological conditions, economic and demographic factors, time factors and other random factors) affect the electric load depending on the specific consumption unit. Generally, load forecasting is categorized into three periods: long, medium and short term. This research is focused on short term load forecasting, which is mostly based on climate conditions like dry bulb temperature, wet bulb temperature, humidity and dew point temperature [36]. Therefore, the above six quantities are used as inputs for modelling the neural network to determine the desired load.
The Multi-Layer Perceptron neural network model is chosen because it is comparatively simple to implement and has numerous applications involving nonlinear mapping between inputs and outputs, such as behavioural modelling, adaptive control and image recognition []. Moreover, Levenberg-Marquardt training is used because it is the fastest backpropagation algorithm in the nntool box and is recommended as a first choice for supervised learning [].
The above parameters of three years of data (2007-2009) act as inputs; after normalization, the input datasets are used for ANN training to forecast the 2010 data, which is then compared with the actual 2010 data serving as the target output. The ANN output and the target output do not match exactly, which shows room for improvement; this is accomplished by Bagging and EANN. MATLAB provides nntool for ANN creation; the input data, target data, network type, training function and other parameters required for ANN training are summarized in Table III.
TABLE III
ANN PARAMETERS DETAILS
Parameters Details
Number of input neurons 6 (time, dry bulb, dew point and wet bulb temperature, humidity and system load)
Number of output neurons 1 (forecasted data)
Number of hidden-layer neurons 20
Neural network model Multi-Layer Perceptron
Training function Levenberg-Marquardt Back Propagation
Adaptation learning function Gradient descent with momentum weight and bias
Number of layers 2
Activation function for layer 1 Tan sigmoid (tansig)
Activation function for layer 2 Pure linear
Performance function MAD, MSE, RMSE, MAPE
Percentage of using information Train (70%), test (15%), cross validation (15%)
Maximum number of epochs 1000
Learning rate 0.01
Maximum validation failures 6
Error threshold 0.001
Weight update method Batch
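The 6-20-1 structure of Table III, with a tan-sigmoid hidden layer and a linear output layer, can be sketched as a forward pass in NumPy; the weights and the input sample below are illustrative placeholders (training with Levenberg-Marquardt is omitted, and the input values are assumed normalized figures, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions from Table III: 6 input neurons, 20 hidden neurons, 1 output neuron
n_in, n_hidden, n_out = 6, 20, 1

# Randomly initialized weights and biases (the training step is omitted here)
W1 = rng.standard_normal((n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_out, n_hidden))
b2 = np.zeros(n_out)

tansig = np.tanh                  # layer-1 activation: hyperbolic tangent sigmoid
purelin = lambda a: a             # layer-2 activation: pure linear

def forward(x):
    """One forward pass of the 6-20-1 MLP described in Table III."""
    h = tansig(W1 @ x + b1)       # hidden layer
    return purelin(W2 @ h + b2)   # linear output layer

# One normalized input sample: time, dry bulb, dew point, wet bulb, humidity, load
x = np.array([0.0, 0.52, 0.41, 0.46, 0.72, 0.63])   # illustrative values
y = forward(x)
```

The linear output layer is what lets the network emit unbounded (denormalized) load values, while the bounded tansig hidden layer provides the nonlinear mapping.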
3.2. Bootstrap aggregating with different methods
and EANN
All the aforementioned bootstrap methods randomly redistribute the data, which increases the ANN's generalization ability to adapt to new data sets and helps achieve the desired accuracy in less computational time. Among the four bootstrap methods, the one with the least error is chosen and compared with the desired (target) output; if it is still not fully accurate, it is still suffering from variance and bias. This problem can be overcome, and the results further improved, by ensembling the trained neural networks. The complete flowchart of the repetitive ANN training procedure, the creation of bootstraps and the ensembling of these trained bootstraps is illustrated in Fig. 4.
Fig. 4. Flowchart of STLF using ANN, Bootstraps and EANN
[Fig. 4 flow: start; determine the network structure, activation functions, training algorithm, learning rate, gradient, MSE and number of epochs; load input and output data; separate the data into training and testing sets; train and test the network, adjusting the training parameters until the precision target is met; create the four bootstraps (disjoint partition, small bags, no replication small bags, disjoint bags) and train each; choose the one with the least MAD, MSE, RMSE and MAPE; if error remains in the learning phase, ensemble the most accurate trained bootstrap outputs by taking the mean (EANN) to obtain accurate results with tolerable error; end.]
4. RESULTS AND SIMULATIONS
4.1. ANN forecasted outputs
nntool in MATLAB is used for the creation of the ANN. The forecasted ANN load, the actual 2010 load, and the difference between the ANN output and the desired output, expressed as a percentage error, are portrayed in Table IV.
TABLE IV
STLF USING MULTI-LAYER PERCEPTRON LMBP ANN
Date Time (hour) Actual 2010 Load ANN forecasted Load %Error
1-Jan-2007 0.0 7228.86 7414.529126 -2.5684427
1-Jan-2007 0.5 7062.49 7016.189440 0.6555841
1-Jan-2007 1.0 6843.66 6685.957192 2.3043636
1-Jan-2007 1.5 6552.34 6447.714934 1.5967588
1-Jan-2007 2.0 6296.34 6255.132551 0.6544667
1-Jan-2007 2.5 6079.49 6080.790273 -0.0213879
1-Jan-2007 3.0 5957.18 5944.961677 0.2051025
1-Jan-2007 3.5 5913.07 5871.978270 0.6949306
1-Jan-2007 4.0 5855.45 5811.196357 0.7557684
1-Jan-2007 4.5 5884.93 5768.429280 1.9796450
1-Jan-2007 5.0 5904.63 5757.130604 2.4980295
1-Jan-2007 5.5 5953.51 5857.232854 1.6171493
1-Jan-2007 6.0 6040.14 5964.384373 1.2542032
1-Jan-2007 6.5 6150.36 6132.461517 0.2910152
1-Jan-2007 7.0 6332.48 6343.680201 -0.1768691
1-Jan-2007 7.5 6577.33 6608.768615 -0.4779845
1-Jan-2007 8.0 6796.30 6976.836093 -2.6563879
1-Jan-2007 8.5 7015.00 6943.014246 1.0261690
1-Jan-2007 9.0 7250.31 6910.429359 4.6878084
1-Jan-2007 9.5 7470.74 7541.104642 -0.9418698
1-Jan-2007 10 7574.95 8155.651981 -7.6660833
1-Jan-2007 10.5 7666.11 8004.098992 -4.4088722
1-Jan-2007 11.0 7762.30 7875.537612 -1.4588152
1-Jan-2007 11.5 7758.87 7778.655206 -0.2550011
1-Jan-2007 12 7750.38 7719.075881 0.4039043
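The %Error column of Table IV appears to follow the usual definition of error as a percentage of the actual load; a one-line check (reproducing the first row of the table) is sketched below:

```python
# First row of Table IV: actual load and ANN-forecasted load (MW)
actual, forecast = 7228.86, 7414.529126

# Percentage error: positive when the ANN under-forecasts,
# negative when it over-forecasts
pct_error = (actual - forecast) / actual * 100
```

Averaging the absolute values of this column over all rows gives the MAPE of Eq. (4).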
4.2. Bootstraps forecasted outputs
The four bootstrap methods, whose regression plots and error histograms are shown in Fig. 5, clearly indicate that disjoint partition is the best among the four. Therefore, the disjoint partition output is used for EANN to further increase prediction accuracy and reduce errors.
Fig. 5 (a) Disjoint partition regression plot Fig. 5 (b) Disjoint partition error histograms
Fig. 5 (c) Small bags regression plot Fig. 5 (d) small bags error histograms
Fig. 5 (e) Replica small bags regression plot Fig. 5 (f) Replica small bags error histogram
Fig. 5 (g) Disjoint bags regression plot Fig. 5 (h) Disjoint bags error histogram
The difference between the desired and output values, termed the error, is shown in the error histograms. The regression plots depict the relation between the ANN output and the desired output; R=1 means an exact match with the target results. Therefore, disjoint partition, with R=0.9972 and the least error, proves to be the most efficient Bagging method.
4.3. EANN forecasted outputs
Finally, the disjoint partition bootstraps are trained again using a neural network, and combining the outputs of all bootstraps by taking their mean gives the final forecasted EANN output. When this output is compared with the ANN and the best bootstrap method, it outperforms the rest in terms of accuracy. Table V shows the superiority of EANN, as all four evaluation measures are reduced in comparison to the other two techniques.
TABLE V
EANN FORECASTED LOAD, ITS COMPARISON WITH ANN AND VARIOUS BAGGING METHODS
Errors ANN Bootstrap Aggregating EANN
FFBP Disjoint bags Replica Small Bags Small bags Disjoint partition Disjoint partition
MAD 0.68 0.5333 0.4315 0.2562 0.1241 0.1021
MSE 0.48 0.2908 0.2011 0.0687 0.0174 0.0013
RMSE 0.69 0.5393 0.4484 0.2621 0.1319 0.1011
MAPE 0.0037 0.0022 0.0015 0.0005 0.0001 0.0001
4.4. Computational Complexity Analysis
A low RMSE shows the better predictive ability of a classifier. As far as computational complexity is concerned, 20 individual trials for 25 bags were run and the results aggregated; the EANN (disjoint partition) exhibits a low RMSE but is more computationally intense than the simple ANN when simulated on MATLAB R2018b, Windows 10, with a 7th Gen. Core i5 2.5 GHz quad-core processor and 8 GB of RAM.
The time complexity analysis is shown in Table VI below.
TABLE VI
TIME COMPLEXITY ANALYSIS OF ANN AND EANN
Model type | RMSE (training) | RMSE (testing) | RMSE (validation) | Prediction accuracy | Time (sec)
ANN (MLP) | 0.37 | 0.23 | 0.69 | 95.23% | 26.563
EANN | 0.1023 | 0.1001 | 0.1011 | 99.87% | 49.365
Replica small bags | 0.3254 | 0.2353 | 0.4484 | 98.58% | 68.235
Small bags | 0.2532 | 0.5641 | 0.2621 | 99.21% | 51.235
Disjoint partition | 0.6325 | 0.3622 | 0.1319 | 99.35% | 50.235
Table VI shows that the total time, including training, testing and validation, of EANN is greater than that of ANN, but the error is significantly improved, as indicated by the prediction accuracy. Therefore, there exists a tradeoff between ANN and EANN: the more time a model takes, the lower the forecast error it exhibits. The computational complexity also depends on the number of bags and is proportional to the training time. EANN comprises a more complex network architecture and greater dimensions due to the bootstrapping. However, the run time still allows the use of the model for online load forecasting applications.
5. CONCLUSION
In this research, a successful implementation of STLF has been carried out using ANN, four Bagging methods and EANN algorithms. The Australian market dataset of three years (2007-2009) is used to train the ANN and predict the 2010 load data. A significant error is observed in STLF when using the multi-layer perceptron ANN, which shows that this technique does not deal with the given problem efficiently, owing to overfitting or underfitting of the data caused by bias and variance. The disjoint partition, small bags, replica small bags and disjoint bags Bagging methods are employed to fill this gap. All four methods, when trained again, show reduced error, and a significant improvement is evidenced in comparison to ANN. The most accurate method, with the highest regression value (R=0.9972) and least error, is the disjoint partition Bagging method. However, there is still some scope for improvement, which is achieved by ensembling the trained disjoint partition Bagging method. Finally, with higher system accuracy, better generalization ability and reduced error, EANN proves to be more efficient than all the aforementioned algorithms.
REFERENCES
1. Outlook, B.E., 2019 edition.
2. Rehman, A. and Z. Deyuan, Investigating the
linkage between economic growth, electricity
access, energy use, and population growth in
Pakistan. Applied sciences, 2018. 8(12): p. 2442.
3. Vantuch, T., et al. Machine learning based electric
load forecasting for short and long-term period. in
Internet of Things (WF-IoT), 2018 IEEE 4th World
Forum on. 2018. IEEE.
4. Hong, T. and S. Fan, Probabilistic electric load
forecasting: A tutorial review. International Journal
of Forecasting, 2016. 32(3): p. 914-938.
5. Yang, A., W. Li, and X. Yang, Short-term
electricity load forecasting based on feature
selection and Least Squares Support Vector
Machines. Knowledge-Based Systems, 2019. 163:
p. 159-173.
6. Tahir, M.F. and M.A. Saqib, Optimal scheduling of
electrical power in energy-deficient scenarios using
artificial neural network and Bootstrap
aggregating. International Journal of Electrical
Power & Energy Systems, 2016. 83: p. 49-57.
7. Alani, A.Y. and I.O. Osunmakinde, Short-term
multiple forecasting of electric energy loads for
sustainable demand planning in smart grids for
smart homes. Sustainability, 2017. 9(11): p. 1972.
8. Singh, A.K., et al. Load forecasting techniques and
methodologies: A review. in 2012 2nd International
Conference on Power, Control and Embedded
Systems. 2012.
9. Mi, J., et al., Short-term power load forecasting
method based on improved exponential smoothing
grey model. Mathematical Problems in Engineering,
2018. 2018.
10. Srivastava, A.K., A.S. Pandey, and D. Singh. Short-
term load forecasting methods: A review. in 2016
International Conference on Emerging Trends in
Electrical Electronics & Sustainable Energy
Systems (ICETEESES). 2016.
11. Fallah, S.N., et al., Computational intelligence on
short-term load forecasting: A methodological
overview. Energies, 2019. 12(3): p. 393.
12. FaizanTahir, M., Optimal Load Shedding Using an
Ensemble of Artificial Neural Networks.
International journal of electrical and computer
engineering systems, 2016. 7(2.): p. 39-46.
13. Fahad, M.U. and N. Arbab, Factor affecting short
term load forecasting. Journal of Clean Energy
Technologies, 2014. 2(4): p. 305-309.
14. Rothe, M., D.A. Wadhwani, and D. Wadhwani,
Short term load forecasting using multi parameter
regression. arXiv preprint arXiv:0912.1015, 2009.
15. Charytoniuk, W., M.S. Chen, and P.V. Olinda,
Nonparametric regression based short-term load
forecasting. IEEE Transactions on Power Systems,
1998. 13(3): p. 725-730.
16. Amjady, N., Short-term hourly load forecasting
using time-series modeling with peak load
estimation capability. IEEE Transactions on Power
Systems, 2001. 16(3): p. 498-505.
17. Akay, D. and M. Atak, Grey prediction with rolling
mechanism for electricity demand forecasting of
Turkey. Energy, 2007. 32(9): p. 1670-1675.
18. Singh, S., S. Hussain, and M.A. Bazaz. Short term
load forecasting using artificial neural network. in
Image Information Processing (ICIIP), 2017 Fourth
International Conference on. 2017. IEEE.
19. Ozerdem, O.C., E.O. Olaniyi, and O.K. Oyedotun,
Short term load forecasting using particle swarm
optimization neural network. Procedia Computer
Science, 2017. 120: p. 382-393.
20. Ray, P., S.K. Panda, and D.P. Mishra, Short-term
load forecasting using genetic algorithm, in
Computational Intelligence in Data Mining. 2019,
Springer. p. 863-872.
21. Hamid, M.A. and T.A. Rahman. Short term load
forecasting using an artificial neural network
trained by artificial immune system learning
algorithm. in Computer Modelling and Simulation
(UKSim), 2010 12th International Conference on.
2010. IEEE.
22. Khosravi, A., S. Nahavandi, and D. Creighton.
Short term load forecasting using interval type-2
fuzzy logic systems. in Fuzzy Systems (FUZZ), 2011
IEEE International Conference on. 2011. IEEE.
23. Chaturvedi, D., R. Kumar, and P.K. Kalra, Artificial
neural network learning using improved genetic
algorithm. Vol. 82. 2002. 1-8.
24. Lu, N. and J. Zhou. Particle swarm optimization-
based RBF neural network load forecasting model.
in Power and Energy Engineering Conference,
2009. APPEEC 2009. Asia-Pacific. 2009. IEEE.
25. ShangDong, Y. and L. Xiang. A new ANN
optimized by improved PSO algorithm combined
with chaos and its application in short-term load
forecasting. in Computational Intelligence and
Security, 2006 International Conference on. 2006.
IEEE.
26. Kavousi-Fard, A., T. Niknam, and M. Golmaryami,
Short term load forecasting of distribution systems
by a new hybrid modified FA-backpropagation
method. Journal of Intelligent & Fuzzy Systems,
2014. 26(1): p. 517-522.
27. Dong-Xiao, N., W. Qiang, and L. Jin-Chao. Short
term load forecasting model using support vector
machine based on artificial neural network. in 2005
International Conference on Machine Learning and
Cybernetics. 2005.
28. Eluyode, O. and D.T. Akomolafe, Comparative
study of biological and artificial neural networks.
European Journal of Applied Engineering and
Scientific Research, 2013. 2(1): p. 36-46.
29. Jiang, J., P. Trundle, and J. Ren, Medical image
analysis with artificial neural networks.
Computerized Medical Imaging and Graphics,
2010. 34(8): p. 617-631.
30. Breiman, L., Bagging predictors. Machine learning,
1996. 24(2): p. 123-140.
31. Pardoe, D., M. Ryoo, and R. Miikkulainen.
Evolving neural network ensembles for control
problems. in Proceedings of the 7th annual
conference on Genetic and evolutionary
computation. 2005. ACM.
32. Li, H., X. Wang, and S. Ding, Research and
development of neural network ensembles: a survey.
Artificial Intelligence Review, 2018. 49(4): p. 455-
479.
33. Anifowose, F., J. Labadin, and A. Abdulraheem.
Ensemble model of Artificial Neural Networks with
randomized number of hidden neurons. in 2013 8th
International Conference on Information
Technology in Asia (CITA). 2013. IEEE.
34. Australian Government, B.o.M. Data Requests and
Enquiries. July 2010; Available from:
http://www.bom.gov.au/climate/data-services/data-
requests.shtml.
35. Operator, A.E.M. Load Data. December 2018;
Available from: http://www.aemo.com.au/.
36. Abbas, F., et al., Short term residential load
forecasting: An Improved optimal nonlinear auto
regressive (NARX) method with exponential weight
decay function. Electronics, 2018. 7(12): p. 432.