ARTIFICIAL NEURAL NETWORKS AND ITS
APPLICATIONS
Girish Kumar Jha
I.A.R.I., New Delhi-110 012
girish_iasri@rediffmail.com
Introduction
Neural networks have seen an explosion of interest over the last few years and are being
successfully applied across an extraordinary range of problem domains, in areas as diverse as
finance, medicine, engineering, geology, physics and biology. The excitement stems from the
fact that these networks are attempts to model the capabilities of the human brain. From a
statistical perspective neural networks are interesting because of their potential use in
prediction and classification problems.
Artificial neural networks (ANNs) are a non-linear, data-driven, self-adaptive approach, as
opposed to traditional model-based methods. They are powerful tools for modelling,
especially when the underlying data relationship is unknown. ANNs can identify and learn
correlated patterns between input data sets and corresponding target values. After training,
ANNs can be used to predict the outcome of new independent input data. ANNs imitate the
learning process of the human brain and can process problems involving non-linear and
complex data even if the data are imprecise and noisy. Thus they are ideally suited for the
modeling of agricultural data which are known to be complex and often non-linear.
A very important feature of these networks is their adaptive nature, where “learning by
example” replaces “programming” in solving problems. This feature makes such
computational models very appealing in application domains where one has little or
incomplete understanding of the problem to be solved but where training data is readily
available.
These networks are “neural” in the sense that they may have been inspired by neuroscience
but not necessarily because they are faithful models of biological neural or cognitive
phenomena. In fact, the majority of these networks are more closely related to traditional
mathematical and/or statistical models such as non-parametric pattern classifiers, clustering
algorithms, nonlinear filters, and statistical regression models than they are to neurobiology
models.
Neural networks (NNs) have been used for a wide variety of applications where statistical
methods are traditionally employed. They have been used in classification problems, such as
identifying underwater sonar targets, recognizing speech, and predicting the secondary
structure of globular proteins. In time-series applications, NNs have been used in predicting
stock market performance. Statisticians would normally solve these problems through
classical statistical methods, such as discriminant analysis, logistic regression,
Bayes analysis, multiple regression, and ARIMA time-series models. It is, therefore, time to
recognize neural networks as a powerful tool for data analysis.
Characteristics of neural networks
• The NNs exhibit mapping capabilities, that is, they can map input patterns to their
associated output patterns.
• The NNs learn by examples. Thus, NN architectures can be ‘trained’ with known
examples of a problem before they are tested for their ‘inference’ capability on
unknown instances of the problem. They can, therefore, identify previously
unseen objects.
• The NNs possess the capability to generalize. Thus, they can predict new outcomes
from past trends.
• The NNs are robust systems and are fault tolerant. They can, therefore, recall full
patterns from incomplete, partial or noisy patterns.
• The NNs can process information in parallel, at high speed, and in a distributed
manner.
Basics of artificial neural networks
The terminology of artificial neural networks has developed from a biological model of the
brain. A neural network consists of a set of connected cells: The neurons. The neurons receive
impulses from either input cells or other neurons and perform some kind of transformation of
the input and transmit the outcome to other neurons or to output cells. The neural networks
are built from layers of neurons connected so that one layer receives input from the preceding
layer of neurons and passes the output on to the subsequent layer.
A neuron is a real function of the input vector (y_1, ..., y_k). The output is obtained as
f_j(x) = f(α_j + Σ_{i=1}^k w_{ij} y_i),
where f is a function, typically the sigmoid (logistic or hyperbolic tangent) function. A
graphical presentation of a neuron is given in the figure below.
Mathematically, a multi-layer perceptron network is a function consisting of compositions of
weighted sums of the functions corresponding to the neurons.
Figure: A single neuron
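As a minimal sketch, the neuron computation above can be written in a few lines of Python (the weights, bias and inputs here are illustrative, not taken from the text):

```python
import math

def neuron(inputs, weights, bias):
    """Output of a single neuron: f(alpha + sum_i w_i * y_i),
    with f the logistic (sigmoid) function."""
    s = bias + sum(w * y for w, y in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

# With zero weights and zero bias the weighted sum is 0, so the output is f(0) = 0.5.
print(neuron([1.0, 2.0], [0.0, 0.0], 0.0))  # 0.5
```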
Neural networks architectures
An ANN is defined as a data processing system consisting of a large number of simple, highly
interconnected processing elements (artificial neurons) in an architecture inspired by the
structure of the cerebral cortex of the brain. There are several types of architecture of NNs.
However, the two most widely used NNs are discussed below:
Feed forward networks
In a feed forward network, information flows in one direction along connecting pathways,
from the input layer via the hidden layers to the final output layer. There is no feedback
(loops), i.e., the output of any layer does not affect that same or any preceding layer.
Figure: A multi-layer feed forward neural network
Recurrent networks
These networks differ from feed forward network architectures in the sense that there is at
least one feedback loop. Thus, in these networks, for example, there could exist one layer with
feedback connections as shown in figure below. There could also be neurons with self-
feedback links, i.e. the output of a neuron is fed back into itself as input.
Figure: A recurrent neural network
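A neuron with a self-feedback link can be sketched as follows (a hypothetical one-unit example, assuming a sigmoid activation and arbitrary weights):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def recurrent_neuron(inputs, w_in, w_self):
    """At each step the neuron's previous output (its state) is fed
    back into itself as an extra input -- a feedback loop."""
    state = 0.0
    outputs = []
    for x in inputs:
        state = sigmoid(w_in * x + w_self * state)
        outputs.append(state)
    return outputs

# The same input is presented three times, yet the output keeps changing
# because of the feedback connection.
outs = recurrent_neuron([1.0, 1.0, 1.0], w_in=0.5, w_self=0.8)
```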
Learning/Training methods
Learning methods in neural networks can be broadly classified into three basic types:
supervised, unsupervised and reinforced.
Supervised learning
In this, every input pattern that is used to train the network is associated with an output
pattern, which is the target or the desired pattern. A teacher is assumed to be present during
the learning process, when a comparison is made between the network’s computed output and
the correct expected output, to determine the error. The error can then be used to change
network parameters, which result in an improvement in performance.
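The comparison between the computed and expected outputs can be sketched as follows (the output values are illustrative):

```python
def output_errors(computed, target):
    """The 'teacher' compares the network's computed output with the
    desired target pattern; the errors drive the parameter updates."""
    return [t - c for c, t in zip(computed, target)]

# Computed outputs [0.8, 0.3] against targets [1.0, 0.0]:
errors = output_errors([0.8, 0.3], [1.0, 0.0])  # roughly [0.2, -0.3]
```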
Unsupervised learning
In this learning method, the target output is not presented to the network. It is as if there is no
teacher to present the desired patterns and hence, the system learns on its own by discovering
and adapting to structural features in the input patterns.
Reinforced learning
In this method, a teacher, though available, does not present the expected answer but only
indicates if the computed output is correct or incorrect. The information provided helps the
network in its learning process. A reward is given for a correct answer computed and a
penalty for a wrong answer. But, reinforced learning is not one of the popular forms of
learning.
Types of neural networks
The most important class of neural networks for real-world problem solving includes
• Multilayer Perceptron
• Radial Basis Function Networks
• Kohonen Self Organizing Feature Maps
Multilayer Perceptrons
The most popular form of neural network architecture is the multilayer perceptron (MLP). A
multilayer perceptron:
• has any number of inputs.
• has one or more hidden layers with any number of units.
• uses linear combination functions in the hidden and output layers.
• generally uses sigmoid activation functions in the hidden layers.
• has any number of outputs with any activation function.
• has connections between the input layer and the first hidden layer, between the hidden
layers, and between the last hidden layer and the output layer.
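A minimal forward pass through such a network might look like this (the weights are arbitrary; a sigmoid hidden layer feeds a single linear output, one common configuration):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(x, hidden_w, hidden_b, out_w, out_b):
    """Inputs -> sigmoid hidden layer -> single linear output."""
    hidden = [sigmoid(b + sum(w * xi for w, xi in zip(ws, x)))
              for ws, b in zip(hidden_w, hidden_b)]
    return out_b + sum(w * h for w, h in zip(out_w, hidden))

# 2 inputs -> 3 hidden units -> 1 output.
y = mlp_forward([0.5, -1.0],
                hidden_w=[[0.1, 0.4], [-0.2, 0.3], [0.5, -0.5]],
                hidden_b=[0.0, 0.1, -0.1],
                out_w=[1.0, -1.0, 0.5],
                out_b=0.2)
```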
Given enough data, enough hidden units, and enough training time, an MLP with just one
hidden layer can learn to approximate virtually any function to any degree of accuracy. (A
statistical analogy is approximating a function with nth order polynomials.) For this reason
MLPs are known as universal approximators and can be used when you have little prior
knowledge of the relationship between inputs and targets. Although one hidden layer is
always sufficient provided you have enough data, there are situations where a network with
two or more hidden layers may require fewer hidden units and weights than a network with
one hidden layer, so using extra hidden layers sometimes can improve generalization.
Radial Basis Function Networks
Radial basis function (RBF) networks are also feedforward, but have only one hidden layer.
An RBF network:
• has any number of inputs.
• typically has only one hidden layer with any number of units.
• uses radial combination functions in the hidden layer, based on the squared Euclidean
distance between the input vector and the weight vector.
• typically uses exponential or softmax activation functions in the hidden layer, in which
case the network is a Gaussian RBF network.
• has any number of outputs with any activation function.
• has connections between the input layer and the hidden layer, and between the hidden
layer and the output layer.
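The behaviour of a single Gaussian radial unit can be sketched as follows (the width parameter and input values are illustrative):

```python
import math

def rbf_unit(x, center, width):
    """Gaussian radial unit: activation depends on the squared Euclidean
    distance between the input vector and the unit's weight (center) vector."""
    sq_dist = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-sq_dist / (2.0 * width ** 2))

# Activation is 1 at the center vector and decays rapidly with distance,
# so each unit's effect is concentrated in a local area.
at_center = rbf_unit([1.0, 2.0], [1.0, 2.0], width=0.5)  # exactly 1.0
far_away  = rbf_unit([5.0, 5.0], [1.0, 2.0], width=0.5)  # practically 0
```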
MLPs are said to be distributed-processing networks because the effect of a hidden unit can
be distributed over the entire input space. On the other hand, Gaussian RBF networks are said
to be local-processing networks because the effect of a hidden unit is usually concentrated in a
local area centered at the weight vector.
Kohonen Neural Network
Self Organizing Feature Map (SOFM, or Kohonen) networks are used quite differently from the
other networks. Whereas all the other networks are designed for supervised learning tasks,
SOFM networks are designed primarily for unsupervised learning (Patterson, 1996).
At first glance this may seem strange. Without outputs, what can the network learn? The
answer is that the SOFM network attempts to learn the structure of the data. One possible use
is therefore in exploratory data analysis. A second possible use is in novelty detection. SOFM
networks can learn to recognize clusters in the training data and respond to them. If new data,
unlike previous cases, is encountered, the network fails to recognize it and this indicates
novelty. A SOFM network has only two layers: the input layer, and an output layer of radial
units (also known as the topological map layer).
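One unsupervised training step (find the winning unit, then move its weights toward the input) can be sketched as follows; the two-unit map and learning rate are illustrative, and the neighbourhood update over the topological map is omitted for brevity:

```python
def best_matching_unit(x, units):
    """Index of the radial unit whose weight vector is closest to the
    input (squared Euclidean distance)."""
    dists = [sum((xi - wi) ** 2 for xi, wi in zip(x, w)) for w in units]
    return dists.index(min(dists))

def sofm_update(x, units, winner, rate=0.5):
    """Move the winning unit's weight vector toward the input pattern."""
    units[winner] = [wi + rate * (xi - wi) for xi, wi in zip(x, units[winner])]
    return units

units = [[0.0, 0.0], [1.0, 1.0]]
w = best_matching_unit([0.9, 0.8], units)   # unit 1 is closest
units = sofm_update([0.9, 0.8], units, w)   # its weights move toward the input
```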
Figure: A Kohonen Neural Network

Applications
Neural networks have proven to be effective mapping tools for a wide variety of problems and
consequently they have been used extensively by practitioners in almost every application
domain ranging from agriculture to zoology. Since neural networks are best at identifying
patterns or trends in data, they are well suited for prediction or forecasting applications. A
very small sample of applications has been indicated in the next paragraph.
Kaastra and Boyd (1996) developed a neural network model for forecasting financial and
economic time series. Kohzadi et al. (1996) used a feedforward neural network to compare
ARIMA and neural network price forecasting performance. Dewolf et al. (1997, 2000)
demonstrated the applicability of neural network technology for plant diseases forecasting.
Zhang et al. (1998) provided a general summary of the work in ANN forecasting, covering
guidelines for neural network modeling, the general paradigm of ANNs, especially those
used for forecasting, modeling issues of ANNs in forecasting, and the relative performance of
ANNs over traditional statistical methods. Sanzogni et al. (2001) developed models for
predicting milk production from farm inputs using standard feed forward ANN. Kumar et al.
(2002) studied utility of neural networks for estimation of daily grass reference crop
evapotranspiration and compared the performance of ANNs with the conventional method
used to estimate evapotranspiration. Pal et al. (2002) developed MLP based forecasting model
for maximum and minimum temperatures for ground level at Dum Dum station, Kolkata on
the basis of daily data on several variables, such as mean sea level pressure, vapour pressure,
relative humidity, rainfall, and radiation for the period 1989-95. Gaudart et al. (2004)
compared the performance of MLP and that of linear regression for epidemiological data with
regard to quality of prediction and robustness to deviation from underlying assumptions of
normality, homoscedasticity and independence of errors. The details of some of these
applications in the field of agriculture will be discussed in the class.
The large number of parameters that must be selected to develop a neural network model for
any application indicates that the design process still involves much trial and error. The next
section provides a practical introductory guide for designing a neural network model.
Development of an ANN model
The various steps in developing a neural network model are:
A. Variable selection
The input variables important for modeling variable(s) under study are selected by suitable
variable selection procedures.
B. Formation of training, testing and validation sets
The data set is divided into three distinct sets called training, testing and validation sets. The
training set is the largest set and is used by the neural network to learn the patterns present in
the data. The testing set is used to evaluate the generalization ability of a supposedly trained
network. A final check on the performance of the trained network is made using the
validation set.
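The partition can be sketched as follows (the 60/20/20 fractions are illustrative, not prescribed by the text):

```python
def split_data(data, train_frac=0.6, test_frac=0.2):
    """Divide a data set into training, testing and validation sets;
    the training set is kept the largest."""
    n_train = int(len(data) * train_frac)
    n_test = int(len(data) * test_frac)
    train = data[:n_train]
    test = data[n_train:n_train + n_test]
    validation = data[n_train + n_test:]
    return train, test, validation

train, test, validation = split_data(list(range(100)))  # 60 / 20 / 20 records
```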
C. Neural network architecture
Neural network architecture defines its structure, including the number of hidden layers,
the number of hidden nodes and the number of output nodes.
• Number of hidden layers: The hidden layer(s) provide the network with its ability to
generalize. In theory, a neural network with one hidden layer with a sufficient number of
hidden neurons is capable of approximating any continuous function. In practice, neural
networks with one, and occasionally two, hidden layers are widely used and have been found
to perform very well.
• Number of hidden nodes: There is no magic formula for selecting the optimum number
of hidden neurons. However, some thumb rules are available for calculating number of
hidden neurons. A rough approximation can be obtained by the geometric pyramid rule
proposed by Masters (1993). For a three layer network with n input and m output
neurons, the hidden layer would have sqrt(n*m) neurons.
• Number of output nodes: Neural networks with multiple outputs, especially if these
outputs are widely spaced, will produce inferior results as compared to a network with a
single output.
• Activation function: Activation functions are mathematical formulae that determine the
output of a processing node. Each unit takes its net input and applies an activation
function to it. Non-linear functions, such as the logistic and tanh functions, are used as
activation functions. The purpose of the transfer function is to prevent outputs from reaching
very large values, which can ‘paralyze’ neural networks and thereby inhibit training.
Transfer functions such as the sigmoid are commonly used because they are nonlinear and
continuously differentiable, properties desirable for network learning.
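The geometric pyramid rule and the squashing behaviour of the activation functions mentioned above can be illustrated as follows (the 8-input, 2-output network is hypothetical):

```python
import math

def pyramid_rule(n_inputs, n_outputs):
    """Masters' geometric pyramid rule: roughly sqrt(n*m) hidden neurons
    for a three-layer network with n inputs and m outputs."""
    return round(math.sqrt(n_inputs * n_outputs))

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

print(pyramid_rule(8, 2))   # sqrt(8*2) = 4 hidden neurons suggested
# Both activations are bounded: even a huge net input cannot push
# the output toward an arbitrarily large value.
print(logistic(100.0))      # approaches 1.0
print(math.tanh(100.0))     # approaches 1.0
```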
D. Evaluation criteria
The most common error function minimized in neural networks is the sum of squared errors.
Other error functions offered by different software include least absolute deviations, least
fourth powers, asymmetric least squares and percentage differences.
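Two of the error functions named above, computed on the same illustrative predictions:

```python
def sum_squared_error(predicted, target):
    """The most common error function minimized during training."""
    return sum((p - t) ** 2 for p, t in zip(predicted, target))

def least_absolute_deviations(predicted, target):
    """An alternative that penalizes large errors less heavily."""
    return sum(abs(p - t) for p, t in zip(predicted, target))

pred, actual = [1.0, 2.0, 4.0], [1.0, 2.5, 3.0]
sse = sum_squared_error(pred, actual)          # 0 + 0.25 + 1 = 1.25
lad = least_absolute_deviations(pred, actual)  # 0 + 0.5 + 1 = 1.5
```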
E. Neural network training
Training a neural network to learn patterns in the data involves iteratively presenting it with
examples of the correct known answers. The objective of training is to find the set of weights
between the neurons that determines the global minimum of the error function. This involves
decisions regarding the number of iterations, i.e., when to stop training the network, and the
selection of the learning rate (a constant of proportionality which determines the size of the
weight adjustments made at each iteration) and momentum values (how past weight changes
affect current weight changes).
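A toy version of this iterative training, for a single weight with a learning rate and a momentum term (all constants and the example data are illustrative):

```python
def train(data, epochs=1000, rate=0.01, momentum=0.5):
    """Fit one weight w so that w*x approximates the target, by repeatedly
    presenting examples of the correct known answers. Each adjustment is
    the learning-rate-scaled error gradient plus a fraction (momentum)
    of the previous weight change."""
    w, prev_change = 0.0, 0.0
    for _ in range(epochs):
        for x, target in data:
            error = target - w * x
            change = rate * error * x + momentum * prev_change
            w += change
            prev_change = change
    return w

# The examples all satisfy target = 2*x, so training should drive w toward 2.
w = train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
```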
Software
Several software packages for neural networks have been developed; to cite a few:
Commercial Software : Statistica Neural Network, TNs2Server, DataEngine, Know Man
Basic Suite, Partek, Saxon, ECANSE - Environment for Computer Aided Neural Software
Engineering, Neuroshell, Neurogen, Matlab: Neural Network Toolbox.
Freeware Software: Net II, Spider Nets Neural Network Library, NeuDC, Binary Hopfield
Net with free Java source, Neural Shell, PlaNet, Valentino Computational Neuroscience
Workbench, Neural Simulation Language (NSL), Brain neural network Simulator.
Conclusion
The computing world has a lot to gain from neural networks. Their ability to learn by example
makes them very flexible and powerful. A large number of claims have been made about the
modeling capabilities of neural networks, some exaggerated and some justified. Hence, to best
utilize ANNs for different problems, it is essential to understand the potential as well as
limitations of neural networks. For some tasks, neural networks will never replace
conventional methods, but for a growing list of applications, the neural architecture will
provide either an alternative or a complement to these existing techniques. Finally, I would
like to state that even though neural networks have huge potential, we will only get the best
out of them when they are integrated with Artificial Intelligence, Fuzzy Logic and related
subjects.
References
Anderson, J. A. (2003). An Introduction to Neural Networks. Prentice Hall.
Cheng, B. and Titterington, D. M. (1994). Neural networks: A review from a statistical
perspective. Statistical Science, 9, 2-54.
Dewolf, E.D. and Francl, L.J. (1997). Neural networks that distinguish infection periods of
wheat tan spot in an outdoor environment. Phytopathology, 87, 83-87.
Dewolf, E.D. and Francl, L.J. (2000). Neural network classification of tan spot and
Stagonospora blotch infection periods in a wheat field environment. Phytopathology, 20,
108-113.
Gaudart, J., Giusiano, B. and Huiart, L. (2004). Comparison of the performance of multi-layer
perceptron and linear regression for epidemiological data. Comput. Statist. & Data
Anal., 44, 547-70.
Hassoun, M. H. (1995). Fundamentals of Artificial Neural Networks. Cambridge: MIT Press.
Hopfield, J.J. (1982). Neural networks and physical systems with emergent collective
computational abilities. Proceedings of the National Academy of Sciences (USA),
79, 2554-2558.
Kaastra, I. and Boyd, M.(1996). Designing a neural network for forecasting financial and
economic time series. Neurocomputing, 10, 215-236.
Kohzadi, N., Boyd, S.M., Kermanshahi, B. and Kaastra, I. (1996). A comparison of artificial
neural network and time series models for forecasting commodity prices.
Neurocomputing, 10, 169-181.
Kumar, M., Raghuwanshi, N. S., Singh, R., Wallender, W. W. and Pruitt, W. O. (2002).
Estimating evapotranspiration using artificial neural networks. Journal of Irrigation
and Drainage Engineering, 128, 224-233.
Masters, T. (1993). Practical Neural Network Recipes in C++. Academic Press, New York.
McCulloch, W.S. and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous
activity. Bull. Math. Biophy., 5, 115-133.
Pal, S., Das, J., Sengupta, P. and Banerjee, S. K. (2002). Short term prediction of atmospheric
temperature using neural networks. Mausam, 53, 471-80.
Patterson, D. (1996). Artificial Neural Networks. Singapore: Prentice Hall.
Rosenblatt, F. (1958). The perceptron: A probabilistic model for information storage and
organization in the brain. Psychological Review, 65, 386-408.
Rumelhart, D.E., Hinton, G.E. and Williams, R.J. (1986). “Learning internal representations
by error propagation”, in Parallel Distributed Processing: Explorations in the
Microstructure of Cognition, Vol. 1 (D.E. Rumelhart, J.L. McClelland and the PDP
research group, eds.). Cambridge, MA: MIT Press, 318-362.
Sanzogni, L. and Kerr, D. (2001). Milk production estimates using feed forward artificial
neural networks. Computers and Electronics in Agriculture, 32, 21-30.
Warner, B. and Misra, M. (1996). Understanding neural networks as statistical tools.
American Statistician, 50, 284-93.
Yegnanarayana, B. (1999). Artificial Neural Networks. Prentice Hall.
Zhang, G., Patuwo, B. E. and Hu, M. Y. (1998). Forecasting with artificial neural networks:
The state of the art. International Journal of Forecasting, 14, 35-62.
Neural networks are parallel computing devices.docx.pdfneelamsanjeevkumar
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Webguestecf0af
 
Artificial neural network for machine learning
Artificial neural network for machine learningArtificial neural network for machine learning
Artificial neural network for machine learninggrinu
 
Neural networks of artificial intelligence
Neural networks of artificial  intelligenceNeural networks of artificial  intelligence
Neural networks of artificial intelligencealldesign
 
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptxEXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptxJavier Daza
 
Intro to Deep learning - Autoencoders
Intro to Deep learning - Autoencoders Intro to Deep learning - Autoencoders
Intro to Deep learning - Autoencoders Akash Goel
 
Artificial Neural Networks ppt.pptx for final sem cse
Artificial Neural Networks  ppt.pptx for final sem cseArtificial Neural Networks  ppt.pptx for final sem cse
Artificial Neural Networks ppt.pptx for final sem cseNaveenBhajantri1
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisAdityendra Kumar Singh
 
Neural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxNeural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxRatuRumana3
 
Fuzzy Logic Final Report
Fuzzy Logic Final ReportFuzzy Logic Final Report
Fuzzy Logic Final ReportShikhar Agarwal
 

Similar to Artificial neural networks and its application (20)

CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdfCCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
 
Neural Networks
Neural NetworksNeural Networks
Neural Networks
 
Artificial Neural Networks.pdf
Artificial Neural Networks.pdfArtificial Neural Networks.pdf
Artificial Neural Networks.pdf
 
02 Fundamental Concepts of ANN
02 Fundamental Concepts of ANN02 Fundamental Concepts of ANN
02 Fundamental Concepts of ANN
 
Neural networks are parallel computing devices.docx.pdf
Neural networks are parallel computing devices.docx.pdfNeural networks are parallel computing devices.docx.pdf
Neural networks are parallel computing devices.docx.pdf
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Web
 
Artificial neural network for machine learning
Artificial neural network for machine learningArtificial neural network for machine learning
Artificial neural network for machine learning
 
Neural networks of artificial intelligence
Neural networks of artificial  intelligenceNeural networks of artificial  intelligence
Neural networks of artificial intelligence
 
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptxEXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
 
Neural network
Neural networkNeural network
Neural network
 
Neural network and fuzzy logic
Neural network and fuzzy logicNeural network and fuzzy logic
Neural network and fuzzy logic
 
MaLAI_Hyderabad presentation
MaLAI_Hyderabad presentationMaLAI_Hyderabad presentation
MaLAI_Hyderabad presentation
 
Intro to Deep learning - Autoencoders
Intro to Deep learning - Autoencoders Intro to Deep learning - Autoencoders
Intro to Deep learning - Autoencoders
 
A04401001013
A04401001013A04401001013
A04401001013
 
Lesson 37
Lesson 37Lesson 37
Lesson 37
 
AI Lesson 37
AI Lesson 37AI Lesson 37
AI Lesson 37
 
Artificial Neural Networks ppt.pptx for final sem cse
Artificial Neural Networks  ppt.pptx for final sem cseArtificial Neural Networks  ppt.pptx for final sem cse
Artificial Neural Networks ppt.pptx for final sem cse
 
Artificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical DiagnosisArtificial Neural Network in Medical Diagnosis
Artificial Neural Network in Medical Diagnosis
 
Neural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptxNeural Networks-introduction_with_prodecure.pptx
Neural Networks-introduction_with_prodecure.pptx
 
Fuzzy Logic Final Report
Fuzzy Logic Final ReportFuzzy Logic Final Report
Fuzzy Logic Final Report
 

More from Hưng Đặng

Tao cv-tieng-anhby pdfcv
Tao cv-tieng-anhby pdfcvTao cv-tieng-anhby pdfcv
Tao cv-tieng-anhby pdfcvHưng Đặng
 
Mlp trainning algorithm
Mlp trainning algorithmMlp trainning algorithm
Mlp trainning algorithmHưng Đặng
 
Lecture artificial neural networks and pattern recognition
Lecture   artificial neural networks and pattern recognitionLecture   artificial neural networks and pattern recognition
Lecture artificial neural networks and pattern recognitionHưng Đặng
 
Image compression and reconstruction using a new approach by artificial neura...
Image compression and reconstruction using a new approach by artificial neura...Image compression and reconstruction using a new approach by artificial neura...
Image compression and reconstruction using a new approach by artificial neura...Hưng Đặng
 
Lecture artificial neural networks and pattern recognition
Lecture   artificial neural networks and pattern recognitionLecture   artificial neural networks and pattern recognition
Lecture artificial neural networks and pattern recognitionHưng Đặng
 

More from Hưng Đặng (9)

Tao cv-tieng-anhby pdfcv
Tao cv-tieng-anhby pdfcvTao cv-tieng-anhby pdfcv
Tao cv-tieng-anhby pdfcv
 
Dhq-cv-tieng-anh
 Dhq-cv-tieng-anh Dhq-cv-tieng-anh
Dhq-cv-tieng-anh
 
Cv it developer
Cv it developerCv it developer
Cv it developer
 
Cach viết-cv
Cach viết-cvCach viết-cv
Cach viết-cv
 
Cach viet cv hay
Cach viet cv hayCach viet cv hay
Cach viet cv hay
 
Mlp trainning algorithm
Mlp trainning algorithmMlp trainning algorithm
Mlp trainning algorithm
 
Lecture artificial neural networks and pattern recognition
Lecture   artificial neural networks and pattern recognitionLecture   artificial neural networks and pattern recognition
Lecture artificial neural networks and pattern recognition
 
Image compression and reconstruction using a new approach by artificial neura...
Image compression and reconstruction using a new approach by artificial neura...Image compression and reconstruction using a new approach by artificial neura...
Image compression and reconstruction using a new approach by artificial neura...
 
Lecture artificial neural networks and pattern recognition
Lecture   artificial neural networks and pattern recognitionLecture   artificial neural networks and pattern recognition
Lecture artificial neural networks and pattern recognition
 

Recently uploaded

Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort serviceGurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort servicejennyeacort
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .Satyam Kumar
 
Introduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxIntroduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxk795866
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxKartikeyaDwivedi3
 
Work Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvWork Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvLewisJB
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfAsst.prof M.Gokilavani
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfAsst.prof M.Gokilavani
 
EduAI - E learning Platform integrated with AI
EduAI - E learning Platform integrated with AIEduAI - E learning Platform integrated with AI
EduAI - E learning Platform integrated with AIkoyaldeepu123
 
pipeline in computer architecture design
pipeline in computer architecture  designpipeline in computer architecture  design
pipeline in computer architecture designssuser87fa0c1
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AIabhishek36461
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEINFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEroselinkalist12
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)dollysharma2066
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 

Recently uploaded (20)

Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort serviceGurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
Gurgaon ✡️9711147426✨Call In girls Gurgaon Sector 51 escort service
 
Churning of Butter, Factors affecting .
Churning of Butter, Factors affecting  .Churning of Butter, Factors affecting  .
Churning of Butter, Factors affecting .
 
Introduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptxIntroduction-To-Agricultural-Surveillance-Rover.pptx
Introduction-To-Agricultural-Surveillance-Rover.pptx
 
Concrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptxConcrete Mix Design - IS 10262-2019 - .pptx
Concrete Mix Design - IS 10262-2019 - .pptx
 
young call girls in Green Park🔝 9953056974 🔝 escort Service
young call girls in Green Park🔝 9953056974 🔝 escort Serviceyoung call girls in Green Park🔝 9953056974 🔝 escort Service
young call girls in Green Park🔝 9953056974 🔝 escort Service
 
Work Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvvWork Experience-Dalton Park.pptxfvvvvvvv
Work Experience-Dalton Park.pptxfvvvvvvv
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
 
EduAI - E learning Platform integrated with AI
EduAI - E learning Platform integrated with AIEduAI - E learning Platform integrated with AI
EduAI - E learning Platform integrated with AI
 
pipeline in computer architecture design
pipeline in computer architecture  designpipeline in computer architecture  design
pipeline in computer architecture design
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AI
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETEINFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
INFLUENCE OF NANOSILICA ON THE PROPERTIES OF CONCRETE
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineering
 
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
Call Us ≽ 8377877756 ≼ Call Girls In Shastri Nagar (Delhi)
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 

Artificial neural networks and its application

In fact, the majority of these networks are more closely related to traditional mathematical and statistical models, such as non-parametric pattern classifiers, clustering algorithms, non-linear filters and statistical regression models, than they are to neurobiological models.

Neural networks (NNs) have been used for a wide variety of applications where statistical methods are traditionally employed. They have been used in classification problems, such as identifying underwater sonar targets, recognizing speech and predicting the secondary structure of globular proteins. In time-series applications, NNs have been used to predict stock market performance. Statisticians would normally solve these problems through classical methods such as discriminant analysis, logistic regression, Bayes analysis, multiple regression and ARIMA time-series models. It is therefore time to recognize neural networks as a powerful tool for data analysis.
Characteristics of neural networks
• NNs exhibit mapping capabilities: they can map input patterns to their associated output patterns.
• NNs learn by example. Thus NN architectures can be 'trained' with known examples of a problem before being tested for their 'inference' capability on unknown instances of the problem; they can therefore identify objects on which they were not trained.
• NNs possess the capability to generalize: they can predict new outcomes from past trends.
• NNs are robust, fault-tolerant systems. They can therefore recall full patterns from incomplete, partial or noisy inputs.
• NNs can process information in parallel, at high speed and in a distributed manner.

Basics of artificial neural networks
The terminology of artificial neural networks has developed from a biological model of the brain. A neural network consists of a set of connected cells: the neurons. The neurons receive impulses from either input cells or other neurons, perform some kind of transformation of the input, and transmit the outcome to other neurons or to output cells. Neural networks are built from layers of neurons connected so that one layer receives input from the preceding layer of neurons and passes its output on to the subsequent layer.

A neuron is a real function of the input vector (y_1, ..., y_k). The output of neuron j is obtained as

f_j(x) = f(α_j + Σ_{i=1}^{k} w_{ij} y_i),

where f is a function, typically the sigmoid (logistic or hyperbolic tangent) function, w_{ij} are the connection weights and α_j is a bias term. A graphical presentation of a neuron is given in the figure below. Mathematically, a multi-layer perceptron network is a function consisting of compositions of weighted sums of the functions corresponding to the neurons.

Figure: A single neuron
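As a minimal sketch of the single-neuron formula above, the output f(α_j + Σ w_ij y_i) can be computed directly; the weights, bias and inputs used here are illustrative values, not from the text:

```python
import numpy as np

def neuron_output(y, w, alpha):
    """Output of a single neuron: f(alpha + sum_i w_i * y_i),
    with f taken as the logistic sigmoid."""
    net = alpha + np.dot(w, y)          # weighted sum plus bias
    return 1.0 / (1.0 + np.exp(-net))   # sigmoid activation

# Example with k = 3 inputs (illustrative values)
y = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.3, -0.2])
print(neuron_output(y, w, alpha=0.1))   # a value in (0, 1)
```

Because the sigmoid is bounded in (0, 1), the neuron's output is a smooth, squashed version of the weighted input sum.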
Neural networks architectures
An ANN is defined as a data processing system consisting of a large number of simple, highly interconnected processing elements (artificial neurons) in an architecture inspired by the structure of the cerebral cortex of the brain. There are several types of NN architecture; the two most widely used are discussed below.

Feed forward networks
In a feed forward network, information flows in one direction along connecting pathways, from the input layer via the hidden layers to the final output layer. There is no feedback (loops): the output of any layer does not affect that same layer or preceding layers.

Figure: A multi-layer feed forward neural network (input layer, hidden layer, output layer)

Recurrent networks
These networks differ from feed forward architectures in that there is at least one feedback loop. Thus, in these networks there could, for example, exist one layer with feedback connections, as shown in the figure below. There could also be neurons with self-feedback links, i.e. the output of a neuron is fed back into itself as input.

Figure: A recurrent neural network (input layer, hidden layer, output layer)
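The one-directional flow of a feed forward network can be sketched as a single composed function; this is an illustrative one-hidden-layer forward pass, with shapes and values chosen for the example only:

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def mlp_forward(x, W1, b1, W2, b2):
    """One forward pass: input -> hidden -> output, with no feedback.

    W1 has shape (hidden, inputs) and W2 has shape (outputs, hidden);
    the output layer here is linear.
    """
    hidden = sigmoid(W1 @ x + b1)   # hidden-layer activations
    return W2 @ hidden + b2         # output-layer values

rng = np.random.default_rng(0)
x = rng.normal(size=3)                              # 3 inputs
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)       # 4 hidden units
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)       # 2 outputs
print(mlp_forward(x, W1, b1, W2, b2))
```

Note that nothing computed at the output is fed back to earlier layers; a recurrent network would additionally carry a state vector from one time step to the next.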
Learning/Training methods
Learning methods in neural networks can be broadly classified into three basic types: supervised, unsupervised and reinforced.

Supervised learning
Here, every input pattern used to train the network is associated with an output pattern, which is the target or desired pattern. A teacher is assumed to be present during the learning process, and a comparison is made between the network's computed output and the correct expected output to determine the error. The error can then be used to change network parameters, resulting in an improvement in performance.

Unsupervised learning
In this method, the target output is not presented to the network. It is as if there were no teacher to present the desired patterns; hence the system learns on its own by discovering and adapting to structural features in the input patterns.

Reinforced learning
In this method, a teacher, though available, does not present the expected answer but only indicates whether the computed output is correct or incorrect. The information provided helps the network in its learning process: a reward is given for a correct answer and a penalty for a wrong one. Reinforced learning is, however, not one of the more popular forms of learning.

Types of neural networks
The most important classes of neural networks for real-world problem solving include:
• Multilayer Perceptrons
• Radial Basis Function Networks
• Kohonen Self Organizing Feature Maps

Multilayer Perceptrons
The most popular form of neural network architecture is the multilayer perceptron (MLP). A multilayer perceptron:
• has any number of inputs.
• has one or more hidden layers with any number of units.
• uses linear combination functions in the input layers.
• generally uses sigmoid activation functions in the hidden layers.
• has any number of outputs with any activation function.
• has connections between the input layer and the first hidden layer, between the hidden layers, and between the last hidden layer and the output layer.

Given enough data, enough hidden units and enough training time, an MLP with just one hidden layer can learn to approximate virtually any function to any degree of accuracy (a statistical analogy is approximating a function with nth-order polynomials). For this reason MLPs are known as universal approximators and can be used when there is little prior knowledge of the relationship between inputs and targets. Although one hidden layer is always sufficient provided enough data are available, there are situations where a network with two or more hidden layers may require fewer hidden units and weights than a network with one hidden layer, so using extra hidden layers can sometimes improve generalization.
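The universal-approximation property above can be illustrated by training a small one-hidden-layer MLP, under supervised learning, to fit a non-linear function. This is a self-contained sketch using plain gradient descent on the mean squared error; the target function, network size and learning rate are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# Supervised training data: inputs paired with known target outputs.
X = rng.uniform(-np.pi, np.pi, size=(200, 1))
T = np.sin(X)                            # illustrative non-linear target

H = 16                                   # hidden units
W1 = rng.normal(scale=0.5, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1))
b2 = np.zeros(1)

lr = 0.05
for epoch in range(2000):
    # Forward pass through the one-hidden-layer MLP (tanh hidden units).
    hidden = np.tanh(X @ W1 + b1)        # (200, H)
    Y = hidden @ W2 + b2                 # (200, 1) linear output
    err = Y - T
    # Backward pass: gradients of mean squared error w.r.t. each weight.
    dY = 2.0 * err / len(X)
    dW2 = hidden.T @ dY
    db2 = dY.sum(axis=0)
    dH = (dY @ W2.T) * (1.0 - hidden ** 2)   # derivative of tanh
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)
    # Weight update: move downhill on the error surface.
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - T) ** 2))
print(f"final MSE: {mse:.4f}")
```

After training, the error is far below the variance of the target, showing the network has learned the input-output mapping from examples alone.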
Radial Basis Function Networks
Radial basis function (RBF) networks are also feedforward, but have only one hidden layer. An RBF network:
• has any number of inputs.
• typically has only one hidden layer with any number of units.
• uses radial combination functions in the hidden layer, based on the squared Euclidean distance between the input vector and the weight vector.
• typically uses exponential or softmax activation functions in the hidden layer, in which case the network is a Gaussian RBF network.
• has any number of outputs with any activation function.
• has connections between the input layer and the hidden layer, and between the hidden layer and the output layer.

MLPs are said to be distributed-processing networks because the effect of a hidden unit can be distributed over the entire input space. Gaussian RBF networks, on the other hand, are said to be local-processing networks because the effect of a hidden unit is usually concentrated in a local area centred at its weight vector.

Kohonen Neural Network
Self Organizing Feature Map (SOFM, or Kohonen) networks are used quite differently from the other networks. Whereas all the other networks are designed for supervised learning tasks, SOFM networks are designed primarily for unsupervised learning (Patterson, 1996). At first glance this may seem strange: without outputs, what can the network learn? The answer is that the SOFM network attempts to learn the structure of the data. One possible use is therefore exploratory data analysis. A second possible use is novelty detection: SOFM networks can learn to recognize clusters in the training data and respond to them; if new data unlike previous cases are encountered, the network fails to recognize them, which indicates novelty. A SOFM network has only two layers: the input layer, and an output layer of radial units (also known as the topological map layer).
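The "local processing" of a Gaussian RBF hidden layer can be seen directly in code: each unit's activation depends only on the squared Euclidean distance between the input and that unit's centre (weight) vector. The centres and width below are illustrative:

```python
import numpy as np

def rbf_hidden(x, centers, width):
    """Gaussian RBF hidden-layer activations.

    Each hidden unit responds to the squared Euclidean distance between
    the input vector x and its centre, so its effect is concentrated in
    a local region around that centre.
    """
    d2 = np.sum((centers - x) ** 2, axis=1)   # squared distances to centres
    return np.exp(-d2 / (2.0 * width ** 2))   # Gaussian activations

centers = np.array([[0.0, 0.0],
                    [1.0, 1.0]])
print(rbf_hidden(np.array([0.0, 0.0]), centers, width=0.5))
```

An input at the first centre drives that unit to its maximum activation of 1, while the distant second unit stays close to 0; an MLP sigmoid unit, by contrast, responds over an entire half-space.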
Figure: A Kohonen neural network

Applications
Neural networks have proven to be effective mapping tools for a wide variety of problems, and consequently they have been used extensively by practitioners in almost every application domain, ranging from agriculture to zoology. Since neural networks are best at identifying patterns or trends in data, they are well suited for prediction and forecasting applications. A very small sample of applications is indicated in the next paragraph.

Kaastra and Boyd (1996) developed a neural network model for forecasting financial and economic time series. Kohzadi et al. (1996) used a feedforward neural network to compare ARIMA and neural network price forecasting performance. Dewolf et al. (1997, 2000) demonstrated the applicability of neural network technology to plant disease forecasting. Zhang et al. (1998) provided a general summary of work in ANN forecasting, covering guidelines for neural network modelling, the general paradigm of ANNs (especially those used for forecasting), modelling issues of ANNs in forecasting, and the relative performance of ANNs over traditional statistical methods. Sanzogni et al. (2001) developed models for predicting milk production from farm inputs using a standard feed forward ANN. Kumar et al. (2002) studied the utility of neural networks for estimating daily grass reference crop evapotranspiration and compared the performance of ANNs with the conventional method used to estimate it. Pal et al. (2002) developed an MLP-based forecasting model for maximum and minimum ground-level temperatures at Dum Dum station, Kolkata, on the basis of daily data on several variables, such as mean sea level pressure, vapour pressure, relative humidity, rainfall and radiation for the period 1989-95. Gaudart et al.
(2004) compared the performance of the MLP with that of linear regression for epidemiological data, with regard to quality of prediction and robustness to deviations from the underlying assumptions of normality, homoscedasticity and independence of errors. Details of some of the applications in the field of agriculture will be discussed in the class.

The large number of parameters that must be selected to develop a neural network model for any application indicates that the design process still involves much trial and error. The next section provides a practical introductory guide to designing a neural network model.

Development of an ANN model
The various steps in developing a neural network model are:

A. Variable selection
The input variables important for modelling the variable(s) under study are selected by suitable variable selection procedures.

B. Formation of training, testing and validation sets
The data set is divided into three distinct sets, called the training, testing and validation sets. The training set is the largest and is used by the neural network to learn the patterns present in the data. The testing set is used to evaluate the generalization ability of a supposedly trained network. A final check on the performance of the trained network is made using the validation set.

C. Neural network architecture
Neural network architecture defines the network's structure, including the number of hidden layers, the number of hidden nodes and the number of output nodes.
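Step B above, the three-way split of the data, can be sketched as follows; the 60/20/20 proportions are an illustrative choice (the text only requires that the training set be the largest):

```python
import numpy as np

def split_data(X, y, train=0.6, test=0.2, seed=0):
    """Shuffle a data set and split it into training, testing and
    validation sets; the remainder after train + test goes to validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))            # shuffle before splitting
    n_train = int(train * len(X))
    n_test = int(test * len(X))
    tr, te, va = np.split(idx, [n_train, n_train + n_test])
    return (X[tr], y[tr]), (X[te], y[te]), (X[va], y[va])

X, y = np.arange(100).reshape(50, 2), np.arange(50)
train_set, test_set, val_set = split_data(X, y)
print(len(train_set[0]), len(test_set[0]), len(val_set[0]))  # 30 10 10
```

Shuffling before splitting matters: if the data are ordered (e.g. by date or by class), an unshuffled split would give the network an unrepresentative training set.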
• Number of hidden layers: The hidden layer(s) provide the network with its ability to generalize. In theory, a neural network with one hidden layer and a sufficient number of hidden neurons is capable of approximating any continuous function. In practice, neural networks with one, and occasionally two, hidden layers are widely used and tend to perform very well.
• Number of hidden nodes: There is no magic formula for selecting the optimum number of hidden neurons, but some rules of thumb are available. A rough approximation can be obtained by the geometric pyramid rule proposed by Masters (1993): for a three-layer network with n input and m output neurons, the hidden layer would have sqrt(n*m) neurons.
• Number of output nodes: Neural networks with multiple outputs, especially if these outputs are widely spaced, will produce inferior results compared with a network with a single output.
• Activation function: Activation functions are mathematical formulae that determine the output of a processing node. Each unit takes its net input and applies an activation function to it. Non-linear functions such as the logistic and tanh functions are used as activation functions. The purpose of the transfer function is to prevent outputs from reaching very large values, which can 'paralyze' neural networks and thereby inhibit training. Transfer functions such as the sigmoid are commonly used because they are non-linear and continuously differentiable, properties which are desirable for network learning.

D. Evaluation criteria
The most common error function minimized in neural networks is the sum of squared errors. Other error functions offered by different software packages include least absolute deviations, least fourth powers, asymmetric least squares and percentage differences.

E.
Neural network training Training a neural network to learn patterns in the data involves iteratively presenting it with examples of the correct known answers. The objective of training is to find the set of weights between the neurons that determine the global minimum of error function. This involves decision regarding the number of iteration i.e., when to stop training a neural network and the selection of learning rate (a constant of proportionality which determines the size of the weight adjustments made at each iteration) and momentum values (how past weight changes affect current weight changes). Software Several software on neural networks have been developed, to cite a few Commercial Software : Statistica Neural Network, TNs2Server, DataEngine, Know Man Basic Suite, Partek, Saxon, ECANSE - Environment for Computer Aided Neural Software Engineering, Neuroshell, Neurogen, Matlab:Neural Network Toolbar. Freeware Software: Net II, Spider Nets Neural Network Library, NeuDC, Binary Hopfeild Net with free Java source, Neural shell, PlaNet, Valentino Computational Neuroscience Work bench, Neural Simulation language version-NSL, Brain neural network Simulator. Conclusion The computing world has a lot to gain from neural networks. Their ability to learn by example
makes them very flexible and powerful. A large number of claims have been made about the modelling capabilities of neural networks, some exaggerated and some justified. Hence, to best utilize ANNs for different problems, it is essential to understand both the potential and the limitations of neural networks. For some tasks, neural networks will never replace conventional methods, but for a growing list of applications the neural architecture will provide either an alternative or a complement to these existing techniques. Finally, I would like to state that even though neural networks have huge potential, we will only get the best out of them when they are integrated with Artificial Intelligence, Fuzzy Logic and related subjects.

References
Anderson, J. A. (2003). An Introduction to Neural Networks. Prentice Hall.
Cheng, B. and Titterington, D. M. (1994). Neural networks: A review from a statistical perspective. Statistical Science, 9, 2-54.
Dewolf, E. D. and Francl, L. J. (1997). Neural networks that distinguish infection periods of wheat tan spot in an outdoor environment. Phytopathology, 87, 83-87.
Dewolf, E. D. and Francl, L. J. (2000). Neural network classification of tan spot and Stagonospora blotch infection periods in a wheat field environment. Phytopathology, 90, 108-113.
Gaudart, J., Giusiano, B. and Huiart, L. (2004). Comparison of the performance of multi-layer perceptron and linear regression for epidemiological data. Comput. Statist. & Data Anal., 44, 547-70.
Hassoun, M. H. (1995). Fundamentals of Artificial Neural Networks. Cambridge: MIT Press.
Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences (USA), 79, 2554-2558.
Kaastra, I. and Boyd, M. (1996). Designing a neural network for forecasting financial and economic time series. Neurocomputing, 10, 215-236.
Kohzadi, N., Boyd, S.M., Kermanshahi, B.
and Kaastra, I. (1996). A comparison of artificial neural network and time series models for forecasting commodity prices. Neurocomputing, 10, 169-181.
Kumar, M., Raghuwanshi, N. S., Singh, R., Wallender, W. W. and Pruitt, W. O. (2002). Estimating evapotranspiration using artificial neural network. Journal of Irrigation and Drainage Engineering, 128, 224-233.
Masters, T. (1993). Practical Neural Network Recipes in C++. Academic Press, New York.
McCulloch, W. S. and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophy., 5, 115-133.
Pal, S., Das, J., Sengupta, P. and Banerjee, S. K. (2002). Short term prediction of atmospheric temperature using neural networks. Mausam, 53, 471-80.
Patterson, D. (1996). Artificial Neural Networks. Singapore: Prentice Hall.
Rosenblatt, F. (1958). The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 65, 386-408.
Rumelhart, D. E., Hinton, G. E. and Williams, R. J. (1986). Learning internal representations by error propagation. In Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1 (D. E. Rumelhart, J. L. McClelland and the PDP Research Group, eds.). Cambridge, MA: MIT Press, 318-362.
Sanzogni, L. and Kerr, D. (2001). Milk production estimates using feed forward artificial neural networks. Computers and Electronics in Agriculture, 32, 21-30.
Warner, B. and Misra, M. (1996). Understanding neural networks as statistical tools. American Statistician, 50, 284-93.
Yegnanarayana, B. (1999). Artificial Neural Networks. Prentice Hall.
Zhang, G., Patuwo, B. E. and Hu, M. Y. (1998). Forecasting with artificial neural networks: The state of the art. International Journal of Forecasting, 14, 35-62.